Information processing methods, apparatus, equipment and storage media

Info

Publication number
CN119002734B
Authority
CN
China
Prior art keywords
window
interactive
digital assistant
page
message
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310947458.7A
Other languages
Chinese (zh)
Other versions
CN119002734A (en)
Inventor
谢欣
齐俊元
朱一冰
刘杨
赵博文
孙樱迪
徐颖逸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Lemon Inc Cayman Island
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Lemon Inc Cayman Island
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd and Lemon Inc Cayman Island
Priority to CN202310947458.7A
Priority to PCT/CN2024/107871
Publication of CN119002734A
Application granted
Publication of CN119002734B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to embodiments of the present disclosure, methods, apparatuses, devices, and storage media for information processing are provided. The method includes presenting a first interactive window with the digital assistant in response to invoking the digital assistant in a first page, presenting at least one first interactive message of the user with the digital assistant in the first interactive window, and displaying the at least one first interactive message in an aggregated form as a first message record in a main session window in which the user interacts with the digital assistant. In this way, the user's interactive messages with the digital assistant are recorded in a unified manner and can be traced back later, which helps provide an efficient assistance experience for the user.

Description

Information processing method, apparatus, device and storage medium
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, relate to methods, apparatuses, devices, and computer-readable storage media for information processing.
Background
With the development of information technology, terminal devices can provide a variety of services for work, daily life, and so on. For example, an application providing a service may be deployed on the terminal device. The terminal device or its application may provide assistant-like functionality to help the user complete tasks with the terminal device or application. How to make such assistant functions better serve users is a technical problem currently being explored.
Disclosure of Invention
In a first aspect of the present disclosure, an information processing method is provided. The method includes presenting a first interactive window with the digital assistant in response to an operation of invoking the digital assistant in the first page, presenting at least one first interactive message of the user with the digital assistant in the first interactive window, and displaying the at least one first interactive message in an aggregated form as a first message record in a main conversation window in which the user interacts with the digital assistant.
In a second aspect of the present disclosure, an information processing method is provided. The method includes opening a target page through user interaction with a digital assistant in a main session window, presenting an interaction window with the digital assistant in response to an operation of invoking the digital assistant in the target page, and displaying context information related to the target page in the main session window in the interaction window.
In a third aspect of the present disclosure, an apparatus for information processing is provided. The apparatus includes a first presentation module configured to present a first interactive window with the digital assistant in response to an operation of invoking the digital assistant in a first page, a second presentation module configured to present at least one first interactive message of the user with the digital assistant in the first interactive window, and a third presentation module configured to display the at least one first interactive message in an aggregated form as a first message record in a main conversation window in which the user interacts with the digital assistant.
In a fourth aspect of the present disclosure, an apparatus for information processing is provided. The apparatus includes an opening module configured to open a target page through interaction of a user with a digital assistant in a main session window, a fourth presentation module configured to present an interaction window with the digital assistant in response to an operation of invoking the digital assistant in the target page, and a fifth presentation module configured to display, in the interaction window, context information related to the target page in the main session window.
In a fifth aspect of the present disclosure, an electronic device is provided. The electronic device includes at least one processing unit, and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the electronic device to perform the method of the first or second aspect.
In a sixth aspect of the present disclosure, a computer readable storage medium is provided. The computer readable storage medium has stored thereon a computer program executable by a processor to implement the method of the first or second aspect.
It should be understood that what is described in this section of the disclosure is not intended to limit key features or essential features of the embodiments of the disclosure, nor is it intended to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, like or similar reference numerals denote like or similar elements, in which:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure may be implemented;
FIG. 2 illustrates a schematic diagram of an example interface of a main session window, according to some embodiments;
FIG. 3 illustrates a schematic diagram of an example interface that opens a page and invokes a sub-interactive window in the page, in accordance with some embodiments;
FIG. 4A illustrates an interface diagram of one example of opening a page through a main session window, in accordance with some embodiments;
FIG. 4B illustrates an interface diagram of one example of invoking an interactive window in the page opened by way of FIG. 4A, in accordance with some embodiments;
FIG. 5 illustrates an interface diagram of one example of unifying the interactive messages of an interactive window into a main session window, in accordance with some embodiments;
FIG. 6 illustrates a schematic diagram of an interface of one example of displaying the interactive message records of multiple sub-interactive windows in a main session window, in accordance with some embodiments;
FIG. 7 illustrates an interface diagram of one example of displaying, in a main session window, the interactive message records of the same page opened at different times, in accordance with some embodiments;
FIG. 8 illustrates an interface diagram of one example of displaying, in a main session window, an interactive message record of a sub-interactive window containing two topics, in accordance with some embodiments;
FIG. 9 illustrates an interface diagram of one example of displaying, in a main session window, an interactive message record of a floating-window-mode interactive window, in accordance with some embodiments;
FIG. 10 illustrates a flow diagram of a process for information processing according to some embodiments;
FIG. 11 illustrates another flow diagram of a process of information processing according to some embodiments;
FIG. 12 illustrates a block diagram of an apparatus for information processing, in accordance with some embodiments;
FIG. 13 illustrates a block diagram of another apparatus for information processing, in accordance with some embodiments;
FIG. 14 illustrates a block diagram of a device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
In describing embodiments of the present disclosure, the term "comprising" and its like should be taken to be open-ended, i.e., including, but not limited to. The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". Other explicit and implicit definitions are also possible below.
In this context, unless explicitly stated otherwise, performing a step "in response to a" does not mean that the step is performed immediately after "a", but may include one or more intermediate steps.
It will be appreciated that the data involved in the present technical solution (including but not limited to the data itself and its acquisition, use, storage, or deletion) should comply with the applicable laws, regulations, and related requirements.
It will be appreciated that, before the technical solutions disclosed in the embodiments of the present disclosure are used, the relevant users (which may include any type of rights subject, such as individuals, enterprises, or groups) should, in accordance with the relevant laws and regulations, be informed in an appropriate manner of the types of information involved in the present disclosure, the scope of use, the usage scenarios, and so on, and their authorization should be obtained.
For example, in response to receiving an active request from a user, prompt information is sent to the relevant user to explicitly indicate that the requested operation will need to obtain and use the relevant user's information, so that the relevant user may autonomously choose, according to the prompt information, whether to provide the information to the software or hardware, such as an electronic device, application program, server, or storage medium, that performs the operations of the technical solution of the present disclosure.
As an optional but non-limiting implementation, in response to receiving an active request from the relevant user, the prompt information may be sent to the relevant user, for example, in a popup window, in which the prompt information may be presented as text. In addition, the popup window may carry a selection control allowing the user to choose "agree" or "disagree" to providing the information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented. In this example environment 100, an office suite 120 is installed in a terminal device 110. The user 140 may interact with the office suite 120 via the terminal device 110 and/or an accessory device of the terminal device 110. The office suite 120 provides the user 140 with an integrated set of components. These components may be provided as component modules in the office suite 120. The components integrated in the office suite 120 are sometimes referred to as "office applications," "office components," "collaborative office platforms," and the like. By way of example, the components integrated in the office suite 120 may include, but are not limited to, one or more of a chat component (also known as an Instant Messaging (IM) component), a document component, an audio/video conferencing component, a mail component, a calendar component, a tasks component, and an objectives and key results (OKR) component.
In some embodiments, the office suite 120 may be downloaded and installed as an application on the terminal device 110. In some embodiments, the office suite 120 may also be accessed by other means, such as by web page access, or the like.
In the environment 100 of FIG. 1, if the office suite 120 is launched, the terminal device 110 may present the user 140 with an interface 150 of the office suite 120. Interface 150 is sometimes referred to as a client interface. Interface 150 may include various types of interfaces that the office suite 120 can provide, such as a conversation interface that presents chat content, a video conferencing interface, a file sharing interface, and so forth.
In some embodiments, the terminal device 110 communicates with the server 130 to enable provisioning of services for the office suite 120. The terminal device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, Personal Communication System (PCS) device, personal navigation device, Personal Digital Assistant (PDA), audio/video player, digital camera/camcorder, television receiver, radio broadcast receiver, electronic book device, game device, or any combination of the preceding, including accessories and peripherals for these devices, or any combination thereof. In some embodiments, terminal device 110 is also capable of supporting any type of interface to the user (such as "wearable" circuitry, etc.). Server 130 may be various types of computing systems/servers capable of providing computing power, including, but not limited to, mainframes, edge computing nodes, computing devices in a cloud environment, and so forth.
It should be understood that the structure and function of the various elements in environment 100 are described for illustrative purposes only and are not meant to suggest any limitation as to the scope of the disclosure. For example, embodiments of the present disclosure may be applied to any suitable application or applications, and are not limited to office suites.
As mentioned briefly above, a digital assistant may be provided to the user to assist the user in using the terminal device or application to accomplish a target task.
In an embodiment of the present disclosure, a digital assistant is configured for the user 140. The digital assistant may be implemented in any suitable form. In some embodiments, the digital assistant of the user 140 may be interacted with as a contact of the user 140. Such a digital assistant may also uniquely correspond to the user 140. In other words, the digital assistant of the user 140 may be specific or proprietary to the user 140. For example, in providing assistance or services to the user 140, the digital assistant may utilize its historical interaction information with the user 140, the data it has been authorized by the user 140 to access, its current dialog content with the user, and so on. Such a digital assistant may be considered a personal digital assistant if the user 140 is an individual. It will be appreciated that the digital assistant in embodiments of the present disclosure accesses data only within the scope authorized by the user 140.
It should be appreciated that "unique correspondence" and similar expressions in this disclosure do not preclude the digital assistant from being updated accordingly based on the interaction process between the user 140 and the digital assistant.
In some embodiments, the digital assistant may be implemented in a single component or application, such as an IM component. In some embodiments, the digital assistant may be implemented in multiple components. For example, these components may be different components of an office suite or a collaborative office platform. In some embodiments, the digital assistant may be implemented directly in the terminal device 110 without attaching to any application. In such an embodiment, the digital assistant provides assistance to the user 140 in the form of contacts.
In order for the digital assistant to provide an efficient assistance experience for the user, embodiments of the present disclosure propose an information processing scheme. According to various embodiments of the present disclosure, the interactive window presented after the digital assistant is woken up in a preset page is the main interactive window. In the disclosed embodiments, the main interactive window is typically in a session mode and thus may also be referred to as a main session window. After the digital assistant is woken up in a page other than the preset page, an interactive window corresponding to that page (which may also be described as a sub-interactive window in the embodiments of the present disclosure) is presented, and the interactive message record of the sub-interactive window may be displayed in the main session window. In this way, the user's interactive messages with the digital assistant are recorded in a unified manner and can be traced back, which helps provide an efficient assistance experience for the user.
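To make the relationship between the main session window and the sub-interactive windows concrete, the following TypeScript sketch models it as plain data. It is illustrative only; all type and field names (SubInteractiveWindow, MessageRecord, and so on) are assumptions of this description and are not defined in the present disclosure.

```typescript
// Illustrative sketch only: names and fields are assumed, not taken from the patent.
interface InteractiveMessage {
  sender: "user" | "assistant";
  text: string;
  sentAt: number; // epoch milliseconds
}

// A sub-interactive window is bound to the page in which the assistant was invoked.
interface SubInteractiveWindow {
  pageId: string; // e.g. a document, calendar or task page
  openedAt: number;
  messages: InteractiveMessage[];
}

// The aggregated form of one sub-window's messages inside the main session window.
interface MessageRecord {
  sourcePageId: string;
  messages: InteractiveMessage[];
}

interface MainSessionWindow {
  records: MessageRecord[]; // records from different sub-windows stay independent
}

// Aggregate a sub-window's messages into the main session window as one record.
function aggregateIntoMain(main: MainSessionWindow, sub: SubInteractiveWindow): void {
  main.records.push({ sourcePageId: sub.pageId, messages: [...sub.messages] });
}
```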
In order to more clearly describe embodiments of the present disclosure, some example embodiments of the present disclosure will be described below with continued reference to the accompanying drawings. Hereinafter, an office suite is mainly described as an example scenario, but this is merely exemplary and is not intended to limit the scope of the present disclosure. The digital assistant may be implemented in any other application or scenario, not limited to an office suite.
As mentioned above, in the embodiments of the present disclosure, the interactive window presented after the digital assistant is woken up in a preset page is the main session window. In some embodiments, the preset page may be a page of a preset component or application. As an example, the preset component or application may be an IM component in an office suite. Accordingly, the preset page may be a page of the IM component in the office suite. In other examples, the preset component or application may also be another application or component.
In some embodiments, the digital assistant may be invoked in the aforementioned preset page in a variety of ways to present the main session window. As an example, the digital assistant may be invoked through a search in order to interact with it. For example, the user 140 may enter keywords related to the digital assistant (such as the name of the digital assistant or a remark name for it) in a search box. In response, an option associated with the digital assistant may be presented as a search result. By clicking on this option, the user 140 invokes the digital assistant and opens an interactive window with the digital assistant. As another example, the digital assistant may be invoked through a contact list or address book. For example, a contact option corresponding to the digital assistant may be presented in a contact interface. In this way, a stable entry for the digital assistant can be provided, so that the user 140 can conveniently invoke the digital assistant when desired.
One example of a main session window in an office suite is described with reference to FIG. 2. The terminal device presents an interface 200 of the office suite. The interface 200 includes a navigation bar 230 in which entry controls for multiple components and access entries for multiple pages are displayed. By selecting these entry controls and access entries, the corresponding components or pages can be activated for browsing or operation. In this example, the chat component is activated. Accordingly, an information flow (feed) page 210 of the chat component, i.e., a list of conversation entries, is presented in the interface 200. A conversation entry item 211 for the user's conversation with the digital assistant ("XX assistant" in FIG. 2) is displayed in the information flow page 210. The user may click on the conversation entry item 211 of the "XX assistant", causing the interactive window 220 to be presented. In this example, the interactive window 220 is the user's main session window with the digital assistant.
In the present disclosure, the digital assistant may also be in a contact list, and the user may open a main conversation window with the digital assistant by triggering the digital assistant in the contact list.
In the examples above and in the examples below, the digital assistant is shown with the name "XX assistant", but it should be understood that this is merely exemplary and is not intended to limit the scope of the present disclosure. In embodiments of the present disclosure, the digital assistant may have any suitable name, and the user may customize the name of the digital assistant.
In the embodiments of the disclosure, after the digital assistant is woken up in a page other than the preset page, an interactive window corresponding to that page may be presented. Such an interactive window may also be referred to as a sub-interactive window in embodiments of the present disclosure.
In some embodiments, the component or application in which the main session window is located may be different from the component or application to which the page in which the sub-interactive window is located belongs. For example, the component in which the main session window is located may be an IM component, through which the user may invoke the digital assistant so that the main session window is presented. The page in which the sub-interactive window is located may be a document page; the user may call up the digital assistant through the document page, and the sub-interactive window is then presented in the document page.
In some embodiments, after the user wakes up the digital assistant on other pages, the presented sub-interaction window may not contain contextual information, i.e., no historical interaction messages of the user with the digital assistant. In other embodiments, the presented sub-interaction window may contain historical interaction messages of the user with the digital assistant after the user wakes up the digital assistant on other pages.
As an example, one example of a sub-interactive window in an office suite is described with reference to FIG. 3. The terminal device presents an interface 300 of the office suite. The interface 300 includes a navigation bar 330 in which entry controls for multiple components and access entries for multiple pages are displayed. By selecting these entry controls and access entries, the corresponding components or pages can be activated for browsing or operation. In this example, an opened document page 310 is presented in the interface 300. As shown in FIG. 3, the document page 310 may be opened based on a document tab in the lower half of the navigation bar 330, based on a document component entry in the upper half of the navigation bar 330, via a document link sent in a session window, via a search results page, and so on, which is not limited in this disclosure. In the document page 310, an option or control 311 associated with the digital assistant is presented. The user may click on the control 311 to wake up the digital assistant and present the interactive window 320. In this example, the interactive window 320 is a sub-interactive window. Alternatively or additionally, no contextual information may be displayed in the interactive window 320. For example, only the hint message "Start collaborating with you - based on AAAA" is displayed in the interactive window 320. Further, the user may interact with the digital assistant through the interactive window 320. Interactive messages may be presented in the interactive window 320 (as indicated by dashed box 321).
In some embodiments, contextual information may be included in the sub-interaction window. In some embodiments, other pages (such as pages of a document, a calendar, a task, a meeting, approval, project management, customer relationship management, etc.) can be opened through the main session window displayed in the preset page, so that the digital assistant is awakened in the other pages, and a corresponding sub-interaction window is presented. Alternatively or additionally, in such a trigger scenario, contextual information is displayed in the sub-interaction window. The context information may be, for example, context information associated with the page being opened in the main session window, such as a dialog message associated with the page.
As an example, referring to FIG. 4A and FIG. 4B, the terminal device presents an interface of an office suite. Included in this interface is a main session window 420 that is invoked in the chat component's information flow page 410. The main session window 420 contains a document 421; when the user clicks on the document 421, a document interface 430 corresponding to the document 421 can be displayed. In the document interface 430, an option or control 431 associated with the digital assistant is presented. Referring to FIG. 4B, the user may click on the control 431 in the document interface 430 to wake up the digital assistant and present an interactive window 440. Alternatively or additionally, context information is presented in the interactive window 440; for example, the messages presented above the prompt "Start collaborating with you - based on XX Domain investigation" in the interactive window 440 are the context information. Further, the user may interact with the digital assistant through the interactive window 440. Interactive messages may be presented in the interactive window 440 (as indicated by dashed box 441).
In some embodiments, the opening time of the main session window may be earlier or later than that of the sub-interactive window. Embodiments of the present disclosure do not limit the order of the opening times of the main session window and the sub-interactive window.
In some embodiments, the interactive messages in the sub-interactive window are displayed in an aggregated form in the main session window.
As an example, referring to FIG. 3 and FIG. 5, the interactive messages in the interactive window 320 presented in the interface 300 (as indicated by the dashed box 321) are presented in the main session window 220 presented in the interface 500 in the aggregated form of a message record (as indicated by the dashed box 510). The aggregated form here may mean that the messages from the interactive window 320 are presented together, e.g., at the location indicated by the dashed box 510, with no other messages inserted among them.
In some embodiments of the present disclosure, each of a plurality of different pages outside of the previously mentioned preset page may wake up the digital assistant and present a corresponding sub-interactive window. The sub-interactive windows awakened in the plurality of different pages are independent of each other. In this case, the interactive message records of the plurality of sub-interactive windows may each be displayed in an aggregated form in the main session window, and the plurality of interactive message records are independent of each other. In this way, when the user opens a page multiple times or opens multiple pages, the digital assistant can be invoked in each page and a corresponding interactive window presented for interaction, and the user's interactions with the digital assistant in the multiple pages do not interfere with one another. As an example, referring to FIG. 6, a terminal device presents an interface 600 of an office suite. The main session window 220 in the interface 600 presents, respectively, an interactive message record 610 corresponding to the user's interactive messages in the interactive window 440 (as indicated by the dashed box 441) and an interactive message record 620 corresponding to the user's interactive messages in the interactive window 320 (as indicated by the dashed box 321).
In some embodiments of the present disclosure, the display positions or display order of the plurality of mutually independent interactive message records may be determined according to the starting interaction time or the ending interaction time of the corresponding sub-interactive windows, for example, based on the time at which the corresponding sub-interactive window was invoked or the time at which its first interactive message was displayed. The presentation timing of interactive message records in the main session window will be described in detail below.
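A minimal sketch of how such an ordering could be computed, assuming each record carries a start time (when its sub-interactive window was invoked) and an end time (when its last interactive message was displayed); the names are illustrative only.

```typescript
// Illustrative sketch: choosing a display order for independent message records.
interface TimedRecord {
  sourcePageId: string;
  startedAt: number; // e.g. when the sub-interactive window was invoked
  endedAt: number;   // e.g. when the last interactive message was displayed
}

type OrderingKey = "start" | "end";

// Returns the records in the order they should appear in the main session window.
function orderRecords(records: TimedRecord[], key: OrderingKey): TimedRecord[] {
  return [...records].sort((a, b) =>
    key === "start" ? a.startedAt - b.startedAt : a.endedAt - b.endedAt
  );
}
```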
For ease of distinction, various other pages are referred to in the embodiments of the present disclosure as a first page and a second page, and the interaction message records in the first page and the second page are referred to as a first interaction message record and a second interaction message record. For example, document page 310 may be referred to as a first page and interactive window 320 may be referred to as a first interactive window. The document page 430 may be referred to as a second page and the interactive window 440 may be referred to as a second interactive window.
In some embodiments of the present disclosure, the first page and the second page may be pages belonging to different applications or components, respectively. By way of example, the first page may be a page of a document application and the second page may be a page of an application other than the document application (e.g., a calendar, task, meeting, approval, or project management application).
In some embodiments, the first page and the second page may belong to the same client or different clients.
In some embodiments, the first page and the second page may be different pages belonging to the same application or component. By way of example, the first page and the second page may both belong to a document application, e.g., page 310 in FIG. 3 and page 430 in FIGS. 4A and 4B both belong to a document application, but are pages of different documents.
In some embodiments, the first page and the second page may also correspond to the same page opened at different times. In this case, each time the page is opened, a corresponding interactive window may be awakened, and the interactive messages in the interactive windows presented for the page opened at different times can be displayed in the main session window independently of each other.
For example, the user may open the "XX Domain investigation" document multiple times and invoke the digital assistant to interact. After the user opens this document the 1st time, the user invokes the digital assistant in the document page and a corresponding interactive window is presented, and the interactive message record in that window may be unified into the main session window. After opening this document the 2nd time, the digital assistant is again invoked in the document page and a corresponding interactive window is presented, and the interactive message record in that window can likewise be unified into the main session window.
Illustratively, referring to FIG. 7, the interaction message records 710 and 720, which were generated by the user interacting through the corresponding interactive windows after opening the "XX Domain investigation" document at different times, are presented in the main session window 220 in the interface 700.
In some embodiments, taking the first message record corresponding to the first page as an example, the first message record includes two or more topics, and different topics are separated by a preset identifier.
For example, referring to FIG. 4B, a control 442 for creating a new topic is displayed in the interactive window 440. If the user triggers the control 442 in the interactive window 440, a new topic is created. Accordingly, referring to FIG. 8, the interaction message record 810 corresponding to the document page 430 shown in the main session window 220 includes the newly created topic 820, which is separated from the previous topic by the identifier 812 indicating that a new topic has been opened.
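As a rough illustration, a message record with multiple topics could be rendered by inserting the preset identifier between consecutive topics, as in the TypeScript sketch below; the separator wording and names are assumptions, not the identifier actually used in FIG. 8.

```typescript
// Illustrative sketch: topics inside one message record, separated by a preset identifier.
interface Topic {
  title: string;
  messages: string[];
}

// Preset identifier rendered between topics (assumed wording).
const NEW_TOPIC_SEPARATOR = "[ a new topic was opened ]";

// Flatten a record's topics into display lines, inserting the separator between topics.
function renderRecord(topics: Topic[]): string[] {
  const lines: string[] = [];
  topics.forEach((topic, index) => {
    if (index > 0) {
      lines.push(NEW_TOPIC_SEPARATOR);
    }
    lines.push(...topic.messages);
  });
  return lines;
}
```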
Some implementations of the first message record are described below taking the first message record corresponding to the first page as an example.
In some embodiments of the present disclosure, the first message record may include first indication information for indicating that the first message record is associated with the first page. In this way, different message records can be distinguished by different indication information. When the message records of the different sub-interactive windows are unified to the main session window, the user can conveniently distinguish the message records of the different sub-interactive windows.
Illustratively, the system messages 511, 512 in fig. 5, and the system messages 811, 814 in fig. 8 may each be considered as some examples of the first indication information.
In some embodiments, the first indication information may be a system message. The system message may be used to indicate a corresponding first message record. The system message may be in a form different from the interactive message sent by the session member.
Alternatively or additionally, the first indication information may be located at a starting position of the first message record. Illustratively, the first indication information 811 as in fig. 8 may be located at a starting position of the message record 810.
Alternatively or additionally, the first indication information includes a dividing line, which may be used to separate the first message record from the preceding message records. Illustratively, referring to FIG. 8, the message record 810 is separated from the message record 830 by the dividing line 813.
Alternatively or additionally, the first indication information may include an entry control for the first page. For example, control 816 or a document link, etc., as shown in fig. 8. In response to the entry control being triggered, the first page is opened. In some embodiments, the first page may be opened in the manner of a new tab page.
Alternatively or additionally, the first indication information may include identification information of the first page. For example, document identification 817 shown in FIG. 8.
Alternatively or additionally, a control for expanding and collapsing the first message record may be included in the first indication information. Illustratively, a "stow" control 815 is shown in FIG. 8.
In some embodiments of the present disclosure, the first message record can be expanded and collapsed. For example, the first message record is expanded or collapsed by triggering the control 815. Expanding and collapsing here may also be understood as showing and hiding. In the case where the first message record is collapsed, only the first indication information may be displayed, while the interactive messages contained in the first message record are hidden. In the case where the first message record is expanded, the first message record may be displayed in the form of a topic, a tiled expansion, or the like. In the case where the main session window contains a plurality of message records, the plurality of message records can be independently expanded or collapsed. For example, the first message record may be expanded while the second message record is hidden. In some embodiments, the first message record is shown in the main session window in a collapsed manner by default. The user may also customize whether the first message record is presented in a collapsed or expanded manner.
Illustratively, still referring to FIG. 8, after the "stow" control 815 is triggered, the interface 800 presented by the terminal device switches to an interface 840 in which the message record 810 is collapsed, and the "stow" control 815 switches to an "expand" control 841. Further, if the "expand" control 841 is triggered, the message record 810 is expanded and the presentation switches back to that of the interface 800, in which the message record 810 is shown.
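The expand/collapse behavior described above can be pictured with a small state sketch: when a record is collapsed only its indication information remains visible, and each record toggles independently. The TypeScript below is illustrative only and the names are assumed.

```typescript
// Illustrative sketch: each message record can be expanded or collapsed independently.
interface CollapsibleRecord {
  indication: string; // e.g. a system message naming the source page
  messages: string[];
  collapsed: boolean; // collapsed by default in the main session window
}

function toggle(record: CollapsibleRecord): void {
  record.collapsed = !record.collapsed;
}

// When collapsed, only the indication information is shown; otherwise the messages too.
function render(record: CollapsibleRecord): string[] {
  return record.collapsed ? [record.indication] : [record.indication, ...record.messages];
}
```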
In some embodiments of the present disclosure, a start prompt is displayed at a start location of the first message record and an end prompt is displayed at an end location of the first message record.
Illustratively, as shown in FIG. 5, a start prompt 511 of "Start collaborating with you - based on AAAA" is displayed at the start position of the message record 510, and an end prompt 512 of "End collaborating with you - based on AAAA" is displayed at the end position.
In some embodiments of the present disclosure, there may be a plurality of occasions when at least one interactive message in the sub-interactive window is displayed in an aggregated form as a message record in the main session window.
The display timing of the first message record in the main session window will be described below taking the first message record in the first interactive window shown in the first page as an example.
In some embodiments, a placeholder message may be displayed first in the main session window, and the at least one interactive message is then aggregated with the placeholder message for display.
In some embodiments, in response to detecting the operation of invoking the digital assistant in the first page, a placeholder message is displayed in the main session window. The at least one first interactive message is then aggregated and displayed with the placeholder message. The at least one first interactive message may be displayed at the position indicated by the placeholder message after the interaction in the first interactive window ends, or may be displayed at the position indicated by the placeholder message in real time.
In some embodiments, the placeholder message may be the first indication information mentioned previously.
In some embodiments, after the first interactive window is evoked, a placeholder message may be displayed in the main session window in response to the first message sent in the first interactive window, and the at least one first interactive message is then aggregated and displayed with the placeholder message.
In some embodiments, in response to the interaction in the first interactive window ending, the first message record may be displayed in the main session window after the interaction ends. For example, the end of the interaction in the first interactive window may be that the first interactive window is closed, or that the first page is closed.
The above is exemplified with the first message record of the first interactive window. It will be appreciated that the presentation occasions for different message records may be consistent with or independent of each other; that is, the presentation occasions for different message records may be the same or different. The user can configure this as needed.
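The occasions described above can be summarized as alternative display policies. The following TypeScript sketch is a simplified illustration under assumed names; it is not an implementation prescribed by the present disclosure.

```typescript
// Illustrative sketch: three possible occasions for showing a record in the main window.
type DisplayPolicy =
  | "placeholder-on-invoke"    // placeholder shown when the assistant is invoked in the page
  | "placeholder-on-first-msg" // placeholder shown when the first message is sent
  | "on-interaction-end";      // whole record shown only after the interaction ends

interface PendingRecord {
  placeholderShown: boolean;
  messages: string[];
  finished: boolean;
}

// Decide what the main session window should currently display for this record.
function visibleMessages(record: PendingRecord, policy: DisplayPolicy): string[] {
  if (policy === "on-interaction-end") {
    return record.finished ? record.messages : [];
  }
  // For the placeholder policies, messages are aggregated at the placeholder position.
  return record.placeholderShown ? record.messages : [];
}
```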
The above description takes the sub-interactive window as the session mode interactive window as an example. In some embodiments, the first interactive window may also be a floating window mode interactive window. When the first interactive window is a floating window mode interactive window, the strategy of synchronizing the interactive message to the main session window is the same as that described above, and will not be repeated here.
In some embodiments, where the first interactive window is a session-mode interactive window, a floating-window-mode interactive window with the digital assistant may further be invoked in the first page. The first interaction information of the user with the digital assistant in the floating-window-mode interactive window is then unified into the first interactive window. In some embodiments, the interaction information in the floating-window-mode interactive window may be synchronized into the first interactive window in real time. In some embodiments, the first interaction information of the user with the digital assistant in the floating-window-mode interactive window may be synchronized into the first interactive window in response to the floating-window-mode interactive window being closed. The interaction information of the first interactive window is then synchronized to the main session window.
In some embodiments, the first interaction information may include all interaction messages generated by a user interacting with the digital assistant in a floating window mode interaction window.
In some embodiments, the first interaction information may also include summary information of interactions that the user has with the digital assistant in the floating window mode interaction window, or result information of interactions that the user has with the digital assistant in the floating window mode interaction window.
As an example, referring to FIG. 9, the user may invoke a floating-window-mode interactive window 911 with the digital assistant in the first page 910. Further, the user may interact with the digital assistant through the floating-window-mode interactive window 911. For example, the user may input an interactive instruction in the floating-window-mode interactive window 911, or a preset shortcut instruction provided by the digital assistant, such as "summary", "insert", "edit", or "delete", to instruct the digital assistant to perform a target operation. In response to a shortcut instruction or an interactive instruction entered by the user, the digital assistant may perform the corresponding operation on the first page 910. The operation result and the like may not be displayed in the floating-window-mode interactive window 911, but are instead presented in a session-mode interactive window 920. For example, the summary information and result information of the interactions occurring in the floating-window-mode interactive window, such as "change the image", "continue to compose the paragraph", and "summarize the paragraph", may be presented in the session-mode interactive window 920.
Further, the first interaction information in the floating-window-mode interactive window is also displayed in the main session window. That is, the first message record in the main session window includes the first interaction information. Still referring to FIG. 9, the main session window 220 has first interaction information 921 presented therein. It can be appreciated that more interaction information may be included in the session-mode interactive window 920, and accordingly more interaction information may also be included in the main session window 220.
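A minimal sketch of the close-triggered synchronization path described above: the floating-window interaction information is unified into the session-mode interactive window and from there into the main session window. Names and structure are assumptions for illustration.

```typescript
// Illustrative sketch: synchronize floating-window interaction information on close.
interface FloatingInteraction {
  summary: string[]; // e.g. "change the image", "summarize the paragraph"
  closed: boolean;
}

interface WindowLog {
  messages: string[];
}

function onFloatingWindowClosed(
  floating: FloatingInteraction,
  sessionWindow: WindowLog,
  mainSessionWindow: WindowLog
): void {
  if (!floating.closed) return;
  // Unify the floating-window interaction information into the session-mode window...
  sessionWindow.messages.push(...floating.summary);
  // ...and then into the main session window as part of the page's message record.
  mainSessionWindow.messages.push(...floating.summary);
}
```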
In some embodiments of the present disclosure, different schemes may be combined with each other. For example, the interaction information 921 shown in FIG. 9 may be expanded or collapsed.
Example interfaces for setup and dialogue interactions of a digital assistant are described above with reference to the accompanying figures. It should be understood that the various interfaces described above and interface elements, text, etc. therein are exemplary and are not intended to limit the scope of the present disclosure. Furthermore, it should be understood that the various embodiments described above may be implemented alone or in combination.
Fig. 10 illustrates a flow chart of a method 1000 of information processing according to some embodiments. Method 1000 may be implemented at terminal device 110. The method 1000 is described below with reference to fig. 1.
At block 1010, terminal device 110 presents a first interactive window with the digital assistant in response to an operation that evokes the digital assistant in the first page.
At block 1020, terminal device 110 presents at least one first interactive message of the user with the digital assistant in a first interactive window.
At block 1030, the terminal device 110 displays at least one first interactive message in an aggregated form as a first message record in a main session window in which the user interacts with the digital assistant.
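The three blocks of method 1000 can be read as a single client-side handler, as in the illustrative TypeScript sketch below; the interface and function names are assumptions and the patent does not prescribe any particular API.

```typescript
// Illustrative sketch of the three steps of method 1000; all names are assumed.
interface Message { sender: "user" | "assistant"; text: string; }

interface UiPort {
  presentInteractiveWindow(pageId: string): void;                       // block 1010
  presentMessages(pageId: string, messages: Message[]): void;           // block 1020
  appendRecordToMainSession(pageId: string, messages: Message[]): void; // block 1030
}

function onAssistantInvoked(ui: UiPort, pageId: string, messages: Message[]): void {
  ui.presentInteractiveWindow(pageId);
  ui.presentMessages(pageId, messages);
  ui.appendRecordToMainSession(pageId, messages);
}
```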
In some embodiments of the present disclosure, the method 1000 further includes presenting a second interactive window with the digital assistant in response to an operation invoking the digital assistant in a second page, presenting at least one second interactive message of the user with the digital assistant in the second interactive window, and displaying the at least one second interactive message in an aggregated form as a second message record in the main session window, wherein the first message record and the second message record are displayed independently of each other in the main session window.
In some embodiments of the present disclosure, the first page and the second page are pages belonging to different applications, respectively, or the first page and the second page are different pages belonging to the same application, or the first page and the second page correspond to the same page opened at different times, respectively.
In some embodiments of the present disclosure, the first message record includes first indication information for indicating that the first message record is associated with the first page.
In some embodiments of the present disclosure, one or more of the following applies: the first indication information includes a system message; the first indication information is located at a starting position of the first message record; the first indication information includes a dividing line; the first indication information includes an entry control of the first page; the first indication information includes identification information of the first page; or the first indication information includes a control for expanding and collapsing the first message record.
In some embodiments of the present disclosure, the first message record can be expanded and collapsed.
In some embodiments of the present disclosure, the first message record includes two or more topics, and different topics are isolated by a preset identifier.
In some embodiments of the present disclosure, a start prompt is displayed at a start location of the first message record and an end prompt is displayed at an end location of the first message record.
In some embodiments of the present disclosure, displaying the at least one first interactive message in an aggregated form as the first message record in the main session window in which the user interacts with the digital assistant includes one of the following: in response to the operation of invoking the digital assistant in the first page, displaying a placeholder message in the main session window and aggregating and displaying the at least one first interactive message with the placeholder message; after the first interactive window is invoked, displaying a placeholder message in the main session window in response to a first message sent in the first interactive window, and aggregating and displaying the at least one first interactive message with the placeholder message; or, in response to the interaction in the first interactive window ending, displaying the first message record in the main session window.
In some embodiments of the present disclosure, the first interactive window is a session-mode interactive window or a floating-window-mode interactive window.
In some embodiments of the present disclosure, the first interactive window is a session-mode interactive window and the method 1000 further includes invoking a floating-window-mode interactive window with the digital assistant in the first page, synchronizing first interactive information of the user with the digital assistant in the floating-window-mode interactive window into the first interactive window, wherein the first message record of the main session window includes the first interactive information.
In some embodiments of the present disclosure, the first interaction information includes summary information of interactions that occur in the floating window mode interaction window by the user with the digital assistant, or result information of interactions that occur in the floating window mode interaction window by the user with the digital assistant.
In some embodiments of the present disclosure, synchronizing the first interactive information of the user with the digital assistant in the floating window mode interactive window into the first interactive window includes synchronizing the first interactive information of the user with the digital assistant in the floating window mode interactive window into the first interactive window in response to the floating window mode interactive window being closed.
In some embodiments of the present disclosure, the application in which the main session window is located and the application in which the first page is located are different applications.
In some embodiments of the present disclosure, the first page is opened by user interaction with the digital assistant in a main session window, and contextual information about the first page in the main session window is presented in the first interactive window.
Fig. 11 illustrates a flow diagram of a method 1100 of information processing according to some embodiments. Method 1100 may be implemented at terminal device 110. The method 1100 is described below with reference to fig. 1.
At block 1110, terminal device 110 opens the target page through user interaction with the digital assistant in the main session window.
At block 1120, the terminal device 110 presents an interactive window with the digital assistant in response to an operation of invoking the digital assistant in the target page.
At block 1130, the terminal device 110 displays, in the interactive window, the context information related to the target page in the main session window.
In some embodiments of the present disclosure, the interactive window is a session-mode interactive window.
In some embodiments of the present disclosure, the method 1100 further includes presenting at least one interactive message of the user with the digital assistant in the interactive window, and displaying the at least one interactive message in an aggregated form as a message record in the main session window.
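Similarly, method 1100 can be sketched as a handler that opens the target page, presents the interactive window, and carries over the related context from the main session window. The TypeScript below is illustrative only, with assumed names.

```typescript
// Illustrative sketch of method 1100: open a page from the main session window,
// invoke the assistant there, and display the related context information.
interface ContextualUiPort {
  openPage(pageId: string): void;                        // block 1110
  presentInteractiveWindow(pageId: string): void;        // block 1120
  showContext(pageId: string, context: string[]): void;  // block 1130
}

function onPageOpenedViaMainSession(
  ui: ContextualUiPort,
  pageId: string,
  contextFromMainSession: string[]
): void {
  ui.openPage(pageId);
  ui.presentInteractiveWindow(pageId);
  ui.showContext(pageId, contextFromMainSession);
}
```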
It should be appreciated that one or more of the steps, features, or characteristics described with reference to the example information processing process above, or combinations thereof, may also be incorporated into the process 1100 as appropriate without departing from the spirit of the process 1100. Details are not repeated here.
Fig. 12 illustrates a schematic block diagram of an apparatus 1200 for information processing according to some embodiments of the present disclosure. The apparatus 1200 may be implemented as or included in the terminal device 110. The various modules/components in apparatus 1200 may be implemented in hardware, software, firmware, or any combination thereof.
The apparatus 1200 includes a first presentation module 1210 configured to present a first interactive window with the digital assistant in response to an operation invoking the digital assistant in a first page.
The apparatus 1200 further includes a second presentation module 1220 configured to present at least one first interactive message of the user with the digital assistant in the first interactive window.
The apparatus 1200 also includes a third presentation module 1230 configured to display at least one first interactive message in an aggregated form as a first message record in a main session window in which the user interacts with the digital assistant.
In some embodiments of the present disclosure, the first presentation module 1210 is further configured to present a second interactive window with the digital assistant in response to an operation invoking the digital assistant in a second page. The second presentation module 1220 is further configured to present at least one second interactive message of the user with the digital assistant in the second interactive window. The third presentation module 1230 is further configured to display the at least one second interactive message in an aggregated form as a second message record in the main session window, the first message record and the second message record being displayed independently of each other in the main session window.
In some embodiments of the present disclosure, the first page and the second page are pages belonging to different applications, respectively, or the first page and the second page are different pages belonging to the same application, or the first page and the second page correspond to the same page opened at different times, respectively.
In some embodiments of the present disclosure, the first message record includes first indication information for indicating that the first message record is associated with the first page.
In some embodiments of the present disclosure, one or more of the following applies: the first indication information includes a system message; the first indication information is located at a starting position of the first message record; the first indication information includes a dividing line; the first indication information includes an entry control of the first page; the first indication information includes identification information of the first page; or the first indication information includes a control for expanding and collapsing the first message record.
In some embodiments of the present disclosure, the first message record can be expanded and collapsed.
In some embodiments of the present disclosure, the first message record includes two or more topics, and different topics are isolated by a preset identifier.
In some embodiments of the present disclosure, a start prompt is displayed at a start location of the first message record and an end prompt is displayed at an end location of the first message record.
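A minimal, hypothetical sketch of such a record layout follows; the separator string and prompt texts are illustrative placeholders, not prescribed by this disclosure.

```typescript
// Hypothetical sketch: a record containing several topics separated by a preset
// identifier, with start and end prompts around the record.

const TOPIC_SEPARATOR = "--- new topic ---"; // the preset identifier (illustrative)

function renderRecord(topics: string[][]): string {
  const body = topics
    .map(topicMessages => topicMessages.join("\n"))
    .join(`\n${TOPIC_SEPARATOR}\n`);
  return `[start of record]\n${body}\n[end of record]`;
}
```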
In some embodiments of the present disclosure, the third presentation module 1230 is further configured to: display a placeholder message in the main session window in response to invoking the digital assistant in the first page, and aggregate the at least one first interactive message with the placeholder message; or, after the first interactive window is invoked, display a placeholder message in the main session window in response to a first message sent in the first interactive window, and aggregate and display the at least one first interactive message with the placeholder message; or display the first message record in the main session window in response to the interaction in the first interactive window ending.
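The placeholder behavior may be illustrated, again only as a hypothetical sketch and not as the claimed implementation, along the following lines; the method names and types are invented for the example.

```typescript
// Hypothetical sketch: a placeholder is shown in the main session window when the
// assistant is invoked in a page (or when the first message is sent), and the
// interactive messages are later aggregated into it.

type SessionItem =
  | { kind: "placeholder"; pageId: string }
  | { kind: "record"; pageId: string; messages: string[] };

class PlaceholderAggregator {
  private items: SessionItem[] = [];

  onInvokedInPage(pageId: string): void {
    // Alternative 1: show the placeholder as soon as the assistant is invoked in the page.
    this.items.push({ kind: "placeholder", pageId });
  }

  onFirstMessageSent(pageId: string): void {
    // Alternative 2: show the placeholder only once the first message is sent in the window.
    this.onInvokedInPage(pageId);
  }

  onInteractionEnded(pageId: string, messages: string[]): void {
    // Replace the placeholder (if any) with the aggregated record once the interaction ends.
    const index = this.items.findIndex(
      item => item.kind === "placeholder" && item.pageId === pageId
    );
    const record: SessionItem = { kind: "record", pageId, messages };
    if (index >= 0) {
      this.items[index] = record;
    } else {
      // Alternative 3: no placeholder was shown; the finished record is simply displayed.
      this.items.push(record);
    }
  }
}
```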
In some embodiments of the present disclosure, the first interactive window is a session-mode interactive window or a floating-window-mode interactive window.
In some embodiments of the present disclosure, the first interactive window is a session-mode interactive window, and the apparatus 1200 further includes a synchronization module configured to invoke a floating-window-mode interactive window with the digital assistant in the first page, and to synchronize first interaction information of the user with the digital assistant in the floating-window-mode interactive window into the first interactive window. The first message record of the main session window includes the first interaction information.
In some embodiments of the present disclosure, the first interaction information includes summary information of interactions between the user and the digital assistant in the floating-window-mode interactive window, or result information of interactions between the user and the digital assistant in the floating-window-mode interactive window.
In some embodiments of the disclosure, the synchronization module is further configured to synchronize the first interaction information of the user with the digital assistant in the floating-window-mode interactive window into the first interactive window in response to the floating-window-mode interactive window being closed.
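One possible, purely illustrative way to model the close-triggered synchronization is sketched below; the listener-based design and all names are assumptions made for the example rather than requirements of this disclosure.

```typescript
// Hypothetical sketch: synchronizing floating-window interactions into the
// session-mode interactive window when the floating window is closed.

interface InteractionInfo {
  summary?: string; // summary of the interaction in the floating window
  result?: string;  // or the result of that interaction
}

class FloatingWindow {
  private listeners: Array<(info: InteractionInfo) => void> = [];

  constructor(private info: InteractionInfo) {}

  onClose(listener: (info: InteractionInfo) => void): void {
    this.listeners.push(listener);
  }

  close(): void {
    // Synchronization is triggered by the floating window being closed.
    this.listeners.forEach(listener => listener(this.info));
  }
}

class SessionModeWindow {
  readonly synced: InteractionInfo[] = [];

  attach(floating: FloatingWindow): void {
    // The synced information also ends up in the first message record of the main session window.
    floating.onClose(info => this.synced.push(info));
  }
}
```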
In some embodiments of the present disclosure, the application in which the main session window is located and the application in which the first page is located are different applications.
In some embodiments of the present disclosure, the first page is opened by interaction of the user with the digital assistant in the main session window, and contextual information in the main session window related to the first page is presented in the first interactive window.
Fig. 13 illustrates a schematic block diagram of an apparatus 1300 for information processing according to some embodiments of the present disclosure. The apparatus 1300 may be implemented as or included in the terminal device 110. The various modules/components in apparatus 1300 may be implemented in hardware, software, firmware, or any combination thereof.
The apparatus 1300 includes an opening module 1310 configured to open a target page through user interaction with the digital assistant in the main session window.
The apparatus 1300 also includes a fourth presentation module 1320 configured to present an interactive window with the digital assistant in response to an operation invoking the digital assistant in the target page.
The apparatus 1300 also includes a fifth presentation module 1330 configured to display, in the interactive window, contextual information related to the target page in the main session window.
In some embodiments, the interactive window is a session-mode interactive window.
In some embodiments, the fourth presentation module 1320 is further configured to present at least one interactive message of the user with the digital assistant in the interactive window, and the fifth presentation module 1330 is further configured to display the at least one interactive message in an aggregated form as a message record in the main session window.
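As a final illustrative sketch, with hypothetical names and a deliberately simplified data model, the context information related to the target page could be selected from the main session dialogue as follows.

```typescript
// Hypothetical sketch: carrying main-session context into the interactive window
// of a page that was opened from the main session conversation.

interface DialogueTurn {
  sender: "user" | "assistant";
  text: string;
  relatedPageId?: string; // set when a turn refers to (or opened) a page
}

function contextForPage(mainSession: DialogueTurn[], pageId: string): DialogueTurn[] {
  // Only the dialogue information related to the target page is surfaced
  // in that page's interactive window.
  return mainSession.filter(turn => turn.relatedPageId === pageId);
}
```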
Fig. 14 illustrates a block diagram of an electronic device 1400 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device 1400 illustrated in fig. 14 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein. The electronic device 1400 illustrated in fig. 14 may be used to implement the terminal device 110 or the server 130 of fig. 1.
As shown in fig. 14, the electronic device 1400 is in the form of a general-purpose electronic device. Components of electronic device 1400 may include, but are not limited to, one or more processors or processing units 1410, memory 1420, storage 1430, one or more communication units 1440, one or more input devices 1450, and one or more output devices 1460. The processing unit 1410 may be an actual or virtual processor and is capable of performing various processes according to programs stored in the memory 1420. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to improve the parallel processing capabilities of the electronic device 1400.
Electronic device 1400 typically includes a number of computer storage media. Such media may be any available media that are accessible by the electronic device 1400, including but not limited to volatile and non-volatile media, removable and non-removable media. The memory 1420 may be volatile memory (e.g., registers, cache, Random Access Memory (RAM)), non-volatile memory (e.g., Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory), or some combination thereof. The storage 1430 may be a removable or non-removable medium and may include machine-readable media such as flash drives, magnetic disks, or any other medium that can store information and/or data and that can be accessed within the electronic device 1400.
The electronic device 1400 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 14, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 1420 can include a computer program product 1425 having one or more program modules configured to perform the various methods or acts of the various embodiments of the present disclosure.
The communication unit 1440 enables communication with other electronic devices via a communication medium. Additionally, the functionality of the components of the electronic device 1400 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Thus, electronic device 1400 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 1450 may be one or more input devices such as a mouse, keyboard, trackball, or the like. The output device 1460 may be one or more output devices such as a display, speakers, printer, etc. The electronic device 1400 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., with one or more devices that enable a user to interact with the electronic device 1400, or with any device (e.g., network card, modem, etc.) that enables the electronic device 1400 to communicate with one or more other electronic devices, as desired, via the communication unit 1440. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, there is provided a computer-readable storage medium having computer-executable instructions stored thereon, wherein the computer-executable instructions are executed by a processor to implement the method described above. According to an exemplary implementation of the present disclosure, there is also provided a computer program product tangibly stored on a non-transitory computer-readable medium and comprising computer-executable instructions that are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices, and computer program products implemented according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.

Claims (21)

1. An information processing method, comprising:
responsive to an operation invoking the digital assistant in a first page, displaying a first interactive window with the digital assistant;
displaying at least one first interactive message of a user and the digital assistant in the first interactive window;
displaying the at least one first interactive message in an aggregated form as a first message record in a main session window in which the user interacts with the digital assistant;
responsive to an operation invoking the digital assistant in a second page, displaying a second interactive window with the digital assistant;
displaying at least one second interactive message of the user and the digital assistant in the second interactive window; and
displaying the at least one second interactive message in an aggregated form as a second message record in the main session window,
wherein the main session window is an interactive window presented by waking up the digital assistant on a preset page, the preset page is different from the first page and the second page, and the first message record and the second message record are displayed in the main session window independently of each other.
2. The method of claim 1, wherein,
the first page and the second page are pages belonging to different applications, respectively, or
the first page and the second page are different pages belonging to the same application, or
the first page and the second page respectively correspond to the same page opened at different times.
3. The method of claim 1, wherein,
The first message record includes first indication information for indicating that the first message record is associated with the first page.
4. A method according to claim 3, comprising one or more of the following:
the first indication information is a system message;
the first indication information is positioned at the starting position of the first message record;
the first indication information comprises a dividing line;
the first indication information comprises an entry control of the first page;
the first indication information includes identification information of the first page; or
the first indication information comprises a control used for expanding and collapsing the first message record.
5. The method of claim 1, wherein the first message record can be expanded and collapsed.
6. The method of claim 1, wherein the first message record includes two or more topics, and different topics are isolated by a preset identifier.
7. The method of claim 1, wherein,
start prompt information is displayed at the starting position of the first message record, and end prompt information is displayed at the ending position of the first message record.
8. The method of claim 1, wherein displaying the at least one first interactive message in an aggregated form as a first message record in a main conversation window in which the user interacts with the digital assistant comprises:
displaying a placeholder message in the main session window in response to invoking the digital assistant in the first page, and aggregating the at least one first interactive message with the placeholder message, or
after the first interactive window is invoked, displaying a placeholder message in the main session window in response to a first message sent in the first interactive window, and aggregating and displaying the at least one first interactive message with the placeholder message, or
displaying the first message record in the main session window in response to the interaction in the first interactive window ending.
9. The method of claim 1, wherein,
the first interactive window is a session mode interactive window or a floating window mode interactive window.
10. The method of claim 1, wherein the first interactive window is a session-mode interactive window, and the method further comprises:
invoking a floating window mode interactive window with the digital assistant in the first page;
synchronizing first interaction information of the user and the digital assistant in the floating window mode interactive window into the first interactive window;
wherein the first message record of the main session window includes the first interaction information.
11. The method of claim 10, wherein the first interaction information comprises:
summary information of interactions that the user has with the digital assistant in the floating window mode interactive window; or
result information of interactions that the user has with the digital assistant in the floating window mode interactive window.
12. The method of claim 10, wherein synchronizing the first interaction information of the user and the digital assistant in the floating window mode interactive window into the first interactive window comprises:
in response to the floating window mode interactive window being closed, synchronizing the first interaction information of the user and the digital assistant in the floating window mode interactive window into the first interactive window.
13. The method of claim 1, wherein,
the application where the main session window is located and the application where the first page is located are different applications.
14. The method of claim 1, wherein the first page is opened by interaction of the user with the digital assistant in the main session window, and the first interactive window presents contextual information in the main session window related to the first page.
15. An information processing method, comprising:
opening a target page through interaction of a user and a digital assistant in a main session window, wherein the main session window is an interactive window presented after the digital assistant is woken up on a preset page, and the preset page is different from the target page;
responsive to an operation of invoking the digital assistant in the target page, presenting an interactive window with the digital assistant; and
displaying, in the interactive window, context information related to the target page in the main session window, wherein the context information comprises dialogue information related to the target page.
16. The method of claim 15, wherein the interactive window is a session-mode interactive window.
17. The method of claim 16, further comprising:
presenting at least one interactive message of the user with the digital assistant in the interactive window; and
displaying, in the main session window, the at least one interactive message in an aggregated form as a message record.
18. An apparatus for information processing, comprising:
A first presentation module configured to present a first interactive window with the digital assistant in response to an operation of invoking the digital assistant in a first page;
a second presentation module configured to present at least one first interactive message of a user with the digital assistant in the first interactive window; and
a third presentation module configured to display the at least one first interactive message in an aggregated form as a first message record in a main session window in which the user interacts with the digital assistant;
wherein the first presentation module is further configured to present a second interactive window with the digital assistant in response to an operation of invoking the digital assistant in a second page;
the second presentation module is further configured to present at least one second interactive message of the user with the digital assistant in the second interactive window; and
the third presentation module is further configured to display the at least one second interactive message in an aggregated form as a second message record in the main session window,
wherein the main session window is an interactive window presented by waking up the digital assistant on a preset page, the preset page is different from the first page and the second page, and the first message record and the second message record are displayed in the main session window independently of each other.
19. An apparatus for information processing, comprising:
An opening module configured to open a target page through user interaction with a digital assistant in a main session window, wherein the main session window is an interactive window presented after the digital assistant is woken up on a preset page, and the preset page is different from the target page;
a fourth presentation module configured to present an interactive window with the digital assistant in response to an operation of invoking the digital assistant in the target page; and
a fifth presentation module configured to display, in the interactive window, context information related to the target page in the main session window, wherein the context information comprises dialogue information related to the target page.
20. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, wherein the instructions, when executed by the at least one processing unit, cause the electronic device to perform the method of any one of claims 1 to 14 or any one of claims 15 to 17.
21. A computer readable storage medium having stored thereon a computer program executable by a processor to implement the method of any one of claims 1 to 14 or any one of claims 15 to 17.
CN202310947458.7A 2023-07-28 2023-07-28 Information processing methods, apparatus, equipment and storage media Active CN119002734B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310947458.7A CN119002734B (en) 2023-07-28 2023-07-28 Information processing methods, apparatus, equipment and storage media
PCT/CN2024/107871 WO2025026227A1 (en) 2023-07-28 2024-07-26 Information processing method and apparatus, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310947458.7A CN119002734B (en) 2023-07-28 2023-07-28 Information processing methods, apparatus, equipment and storage media

Publications (2)

Publication Number Publication Date
CN119002734A CN119002734A (en) 2024-11-22
CN119002734B true CN119002734B (en) 2025-11-14

Family

ID=93479030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310947458.7A Active CN119002734B (en) 2023-07-28 2023-07-28 Information processing methods, apparatus, equipment and storage media

Country Status (2)

Country Link
CN (1) CN119002734B (en)
WO (1) WO2025026227A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110603545A (en) * 2017-04-26 2019-12-20 谷歌有限责任公司 Organizing messages exchanged in a human-machine conversation with an automated assistant
CN115686277A (en) * 2021-07-28 2023-02-03 阿瓦亚管理有限合伙公司 System and method for providing digital assistant related to communication session information

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11695711B2 (en) * 2017-04-06 2023-07-04 International Business Machines Corporation Adaptive communications display window
CN112311656B (en) * 2020-02-14 2022-10-11 北京字节跳动网络技术有限公司 Message aggregation and display method and device, electronic equipment and computer readable medium
CN115373698A (en) * 2021-05-19 2022-11-22 腾讯科技(深圳)有限公司 Message forwarding method and device, electronic equipment and storage medium
CN116541114A (en) * 2023-04-28 2023-08-04 北京字跳网络技术有限公司 Information display method, device, computer equipment and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110603545A (en) * 2017-04-26 2019-12-20 谷歌有限责任公司 Organizing messages exchanged in a human-machine conversation with an automated assistant
CN115686277A (en) * 2021-07-28 2023-02-03 阿瓦亚管理有限合伙公司 System and method for providing digital assistant related to communication session information

Also Published As

Publication number Publication date
CN119002734A (en) 2024-11-22
WO2025026227A1 (en) 2025-02-06
WO2025026227A9 (en) 2025-07-03

Similar Documents

Publication Publication Date Title
JP2015510175A (en) Notebook-driven accumulation of meeting documents and meeting notes
US11611519B1 (en) Event trigger visibility within a group-based communication system
US11068853B2 (en) Providing calendar utility to capture calendar event
US20240385860A1 (en) Method, device, and storage medium for information interaction
US20250013479A1 (en) Method, apparatus, device and storage medium for processing information
CN119002734B (en) Information processing methods, apparatus, equipment and storage media
US20250165127A1 (en) Information interaction
CN118520954B (en) Method, apparatus, device and storage medium for information interaction
US20260019390A1 (en) Method, apparatus, device and storage medium for information processing
US20250138708A1 (en) Conversation interaction
CN119002980B (en) Method, device, equipment and storage medium for model configuration
US20240402885A1 (en) Method, apparatus, device, and storage medium for information processing
CN119011514B (en) Information processing methods, apparatus, equipment and storage media
CN119003056B (en) Method, apparatus, device and storage medium for information processing
US20250165264A1 (en) Method, apparatus, device and storage medium for information interaction
US20250166634A1 (en) Method, apparatus, device, and storage medium for information interaction
US20250307555A1 (en) Information extraction
US20250168133A1 (en) Method, apparatus, device and storage medium for conversation interaction
US20250013478A1 (en) Method, apparatus, device, and storage medium for processing information
US20250150417A1 (en) Method, apparatus, device and storage medium for session interaction
CN117908715A (en) Method, device, equipment and storage medium for information interaction
CN119002740A (en) Method, apparatus, device and storage medium for information interaction
WO2025108133A1 (en) Method and apparatus for information interaction, device, and storage medium
CN119883447A (en) Method, apparatus, device and storage medium for information processing
WO2025108132A1 (en) Method and apparatus for task processing, device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant