WO2019220791A1 - Dialogue device - Google Patents
Dialogue device
- Publication number
- WO2019220791A1 (application PCT/JP2019/014088)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- concealment
- utterance
- level
- user
- secret information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/93—Document management systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/12—Use of codes for handling textual entities
Definitions
- One aspect of the present invention relates to a dialogue apparatus.
- Patent Document 1 describes a dialogue history management device that, upon receiving an instruction to delete information used for a predetermined process, deletes that information and specifies an identifier related to the predetermined process using the information to be deleted,
- and that then deletes from the dialogue history storage unit the content of the voice related to the response associated with the specified identifier.
- However, the device of Patent Document 1 does not execute the deletion process unless a deletion instruction is accepted.
- Secret information that should preferably not be recorded in the first place may be present in the utterance sentences constituting the dialogue. It is therefore desirable to ensure the security of confidential information that appears in the dialogue.
- An interactive apparatus includes an acquisition unit that acquires an utterance sentence constituting an automatic dialogue executed with a user,
- a concealing unit that executes, while the utterance sentence is not yet recorded in a log, concealment processing for concealing at least a part of secret information in the utterance sentence, and a recording unit that records, in the log, the utterance sentence in which at least a part of the secret information is concealed.
- the security of the secret information that appears in the dialogue can be more reliably ensured.
- the interactive device is a computer that executes an automatic dialogue with a user.
- a user is a person who uses an interactive device.
- An automatic dialogue is a conversation exchange between a user and a computer (this computer is generally referred to as a “bot”), and in this automatic dialogue, the computer automatically speaks in response to the utterance from the user.
- An utterance is a single statement spoken by a user or a computer.
- the user's utterance is also referred to as “user utterance”
- the computer's utterance is also referred to as “system utterance”.
- the usage purpose of the interactive device is not limited.
- the interactive device may be used in a FAQ (Frequently Asked Questions) system that answers a user's question, or may be used in a chat system that performs an arbitrary story that is not limited to FAQ.
- the interactive device may be incorporated in a virtual assistant function that performs an arbitrary task based on a voice operation.
- the interactive device may be a part of a client-server type system or a single device. In the present embodiment, it is assumed that the interactive device is a computer that functions as a server.
- FIG. 1 is a diagram illustrating an example of a functional configuration of the interactive apparatus 10 according to the embodiment.
- the interactive device 10 can be connected to at least one user terminal 90 via a communication network.
- the configuration of the communication network is not limited at all, and may be configured using, for example, at least one of the Internet and an intranet.
- the user terminal 90 is a computer (client terminal) operated by a user.
- the user terminal 90 has a function of transmitting an utterance (user utterance) input by the user to the dialog device 10 and a function of outputting an utterance (system utterance) received from the dialog device 10.
- the type of the user terminal 90 is not limited, and may be, for example, a stationary computer or device or a portable computer or device.
- Specific examples of the user terminal 90 include, but are not limited to, a smartphone, a tablet terminal, a wearable terminal, a personal computer, a smart speaker, a smart TV, and a smart home appliance.
- the dialogue apparatus 10 records the utterance text constituting the automatic dialogue executed with the user terminal 90 in the log 40.
- the utterance sentence is a sentence indicating the utterance of the user or the bot, and can be expressed by a character string.
- a log is a permanent record of the history of spoken sentences.
- the log 40 is realized by a logical file of an arbitrary format, and is stored in an arbitrary storage (storage device).
- One of the features of the dialogue apparatus 10 is the method by which it records utterance sentences in the log. This feature is described in detail below.
- the dialogue apparatus 10 includes, as functional elements, a front function 20 that comprehensively controls automatic dialogue and one or more bot functions 30 that output system utterances related to a specific topic.
- the front function 20 determines a bot function 30 for processing the user utterance received from the user terminal 90, and transfers the user utterance to the bot function 30. Thereafter, the front function 20 transmits the system utterance output from the bot function 30 to the user terminal 90 as a response.
- the front function 20 records the user utterance and system utterance in the log 40.
- Each bot function 30 refers to a scenario that is a rule of a dialog (a rule that defines what kind of system utterance is output when what kind of user utterance is accepted).
- the bot function 30 determines a system utterance corresponding to the user utterance based on the scenario, and outputs the determined system utterance to the front function 20.
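A scenario rule in this sense can be sketched as a simple pattern-to-reply table. This is a hypothetical format for illustration only; the description does not prescribe a concrete rule syntax.

```python
# Minimal sketch of scenario-based response selection: each rule maps
# a user-utterance pattern to a system utterance. The keyword matching
# and the rule contents are illustrative assumptions.

scenario = [
    ("weather", "It will be sunny tomorrow."),
    ("hello",   "Hello! How can I help you?"),
]

def respond(user_utterance):
    """Return the system utterance for the first matching rule."""
    for keyword, reply in scenario:
        if keyword in user_utterance.lower():
            return reply
    return "Sorry, I did not understand."

print(respond("Tell me the weather"))  # -> It will be sunny tomorrow.
```

A real bot function 30 would consult a richer scenario and also take the utterance attributes into account, as described below.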
- the front function 20 includes an acquisition unit 21, a concealment unit 22, and a recording unit 23.
- the acquisition unit 21 is a functional element that acquires an utterance sentence constituting an automatic dialogue executed with a user.
- the acquisition unit 21 acquires an utterance sentence indicating a user utterance or a system utterance.
- the concealing unit 22 is a functional element that executes concealment processing for concealing at least a part of secret information on the utterance sentence in a state where the utterance sentence is not yet recorded in the log 40.
- Confidential information is information that should preferably not be disclosed to a third party (at least, to a person other than the user). What information constitutes confidential information may be determined arbitrarily.
- the confidential information may include at least a part of personal information or may include at least a part of confidential company information.
- Concealment is a process for making it impossible or difficult to identify original information by a third party.
- the recording unit 23 is a functional element that records in the log 40 an utterance sentence in which at least a part of secret information is concealed.
- FIGS. 2 and 3 are sequence diagrams illustrating an example of the operation of the interactive apparatus 10.
- FIG. 4 is a diagram showing some examples of the log 40.
- In FIGS. 2 and 3, the front function 20 and the bot function 30 of the interactive apparatus 10 are shown individually, and one or more bot functions 30 are collectively shown as a single block.
- In step S11, the user terminal 90 transmits an automatic dialogue start request to the dialogue device 10.
- the dialog device 10 receives and processes the start request, so that the user can perform an automatic dialog using the user terminal 90.
- the start request is a data signal including an utterance attribute.
- the utterance attribute is information indicating a property or characteristic related to the user utterance, and can be used in an automatic dialogue.
- the utterance attribute is provided from the user terminal 90 to the dialogue apparatus 10 in the form included in the start request or together with the user utterance.
- the specific content of the speech attribute is not limited.
- the speech attributes include a user attribute indicating the property or characteristic of the user who intends to use the automatic dialogue and a terminal attribute indicating the property or characteristic of the user terminal 90.
- the type and number of data items indicating user attributes are not limited.
- the user attribute may be one selected from a name, gender, age, and address, or may be a combination of two or more items arbitrarily selected from the four items.
- the user attribute may include one or more other data items different from the four items.
- the type and number of data items indicating terminal attributes are not limited at all.
- the terminal attribute may be one selected from the terminal name, the operating system name, and the operation mode, or may be a combination of two or more items arbitrarily selected from the three items.
- the terminal attribute may include one or more data items different from the three items.
- the acquisition unit 21 receives the start request.
- the concealment unit 22 determines the concealment level based on the speech attribute of the start request.
- the concealment level is an index indicating at least one of a method for concealing secret information (concealment method) and a data item of secret information to be concealed.
- the concealment method may be deletion of at least part of secret information.
- the concealment level is a deletion level.
- the hidden secret information is excluded from the character string to be recorded in the log 40 and is not recorded in the log 40 from the beginning.
- the concealment method may be a process of replacing at least a part of secret information with one or more concept words.
- the concealment level is a replacement level.
- a concept word is a phrase that abstracts the original information. Therefore, in the case of the replacement level, the secret information to be concealed is recorded in the log 40 after being replaced with another word that is more ambiguous.
- the concept word may be a phrase indicating the category of the original information or a variable name corresponding to the original information.
- The concept word may be an ambiguous expression that gives no clue to the original information. For example, if the surname “Tanaka” is concealed at the replacement level, the character string “Tanaka” may be replaced with a concept word such as “%userInfo.surname%”, “surname”, “name”, or “%variable_1%”.
- the replacement level may be set in multiple stages. This means that a plurality of types of abstraction levels (abstraction levels) of certain secret information are prepared. Two types of replacement levels may be prepared, or three or more types of replacement levels may be prepared.
- The concealment level in this case includes a first replacement level that replaces at least a part of the secret information with a first concept word, and a second replacement level that replaces at least a part of the secret information with a second concept word whose abstraction level is lower than that of the first concept word.
- the first replacement level and the second replacement level are prepared in advance, and an example is shown in which two surnames “Tanaka” and “Suzuki” are concealed by the two types of replacement levels.
- both of the two character strings “Tanaka” and “Suzuki” may be replaced with the first concept word “% userInfo.surname%”.
- At the second replacement level, the character string “Tanaka” may be replaced with the second concept word “%userInfo.surname1%” and the character string “Suzuki” with the second concept word “%userInfo.surname2%”.
- the first replacement level is a process of replacing two surnames in an indistinguishable form
- the second replacement level is a process of replacing two surnames in a distinguishable form. Therefore, the second concept word is less abstract than the first concept word.
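The two replacement levels above can be sketched as follows. This is a minimal illustration, not the patented implementation; the `%...%` variable-name format follows the examples in this description, and the function names are hypothetical.

```python
# Sketch of the two replacement levels described above.
# The %...% variable-name format follows the examples in this text;
# the function names are illustrative assumptions.

def replace_level_1(text, surnames):
    """First replacement level: all surnames share one concept word,
    so they become indistinguishable in the log."""
    for name in surnames:
        text = text.replace(name, "%userInfo.surname%")
    return text

def replace_level_2(text, surnames):
    """Second replacement level: each surname gets a numbered concept
    word, so occurrences stay distinguishable but still abstract."""
    for i, name in enumerate(surnames, start=1):
        text = text.replace(name, f"%userInfo.surname{i}%")
    return text

utterance = "Tanaka and Suzuki attended"
print(replace_level_1(utterance, ["Tanaka", "Suzuki"]))
# -> %userInfo.surname% and %userInfo.surname% attended
print(replace_level_2(utterance, ["Tanaka", "Suzuki"]))
# -> %userInfo.surname1% and %userInfo.surname2% attended
```

The second level preserves which occurrences refer to the same person, which is why its abstraction level is lower than that of the first level.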
- The concealment level may be selectable from a plurality of replacement levels, may be selectable from a deletion level and one replacement level, or may be selectable from a deletion level and a plurality of replacement levels.
- the type of secret information that should be concealed may be set arbitrarily.
- the concealment level may be set so as to conceal only one type of secret information, or the concealment level may be set so as to conceal two or more types of secret information.
- the concealment level may be set so as to conceal only a part or all of the assumed secret information.
- the concealment level may be set such that one or more data items selected from the name, age, gender, and location are concealed. In any case, the concealment level is set so as to conceal at least a part of the secret information.
- the concealment level may indicate both the concealment method and the type of secret information to be concealed, or only one of them.
- the concealment level may be set for each individual user, or may be common to all users. Alternatively, the concealment level may be set for each individual speech attribute, for example, for each age group of the user.
- the concealment level of the speech sentence and the concealment level of the speech attribute may be the same or different from each other. In any case, the concealment level can be flexibly set in various ways.
- the dialogue apparatus 10 stores information related to the concealment level in advance so that the concealment unit 22 can determine the concealment level.
- The manner in which this information is stored is not limited.
- The concealment level may be represented by any one of a setting file, a database, a mapping table, an algorithm, a mathematical formula, and a threshold, or by a combination of two or more methods arbitrarily selected from these.
- the concealment level can be set in various ways.
- the concealment unit 22 determines one of one or more concealment levels prepared in advance based on the speech attribute included in the start request.
- the concealment unit 22 executes concealment processing for the start request based on the determined concealment level.
- The concealing unit 22 conceals, using the specified concealment method, the secret information (for example, at least a part of the user attributes) specified by the concealment level in the character string indicating the start request to be recorded in the log 40. For example, if the determined concealment level is the deletion level, the concealment unit 22 deletes the designated secret information from the character string. If the determined concealment level is a replacement level, the concealment unit 22 replaces the designated secret information in the character string with a concept word.
- the method for specifying the secret information to be concealed from the character string to be processed is not limited.
- The concealment unit 22 may specify the secret information using at least one method selected from character string matching, machine learning, anaphora analysis, and named entity extraction (entity linking).
- For example, the concealment unit 22 may specify the secret information to be concealed by character string matching and then process it based on the concealment level.
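The steps so far can be sketched as follows: select a concealment level from the utterance attributes, then conceal the designated items by character string matching. The selection policy, the attribute names, and the secret-to-concept mapping are all assumptions for illustration only.

```python
# Illustrative sketch only: determine a concealment level from the
# utterance attributes, then conceal designated secret strings by
# simple character string matching (one of the methods named above).

DELETE_LEVEL, REPLACE_LEVEL = "delete", "replace"

def determine_level(attributes):
    # Hypothetical policy: apply the stricter deletion level to minors.
    return DELETE_LEVEL if attributes.get("age", 0) < 18 else REPLACE_LEVEL

def conceal(text, secrets, level):
    """secrets maps each secret string to its concept word."""
    for secret, concept in secrets.items():
        if level == DELETE_LEVEL:
            text = text.replace(secret, "")       # deletion level
        else:
            text = text.replace(secret, concept)  # replacement level
    return text

attrs = {"name": "tanaka", "age": 28}
secrets = {"tanaka": "%userInfo.name%"}
level = determine_level(attrs)
print(conceal("my name is tanaka", secrets, level))
# -> my name is %userInfo.name%
```

A real concealment unit 22 could equally drive this from a setting file, database, or mapping table, as noted above.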
- In step S14, the recording unit 23 records the start request subjected to the concealment processing in the log 40. Since the secret information to be concealed has already been concealed (for example, deleted or replaced) at this point, it is never written in the log 40 in the first place.
- In step S15, an automatic dialogue is executed between the user terminal 90 and the dialogue device 10 in response to the start request. The front function 20 executes a process for starting the automatic dialogue, after which user utterances and system utterances are exchanged between the user terminal 90 and the dialogue device 10. The user can acquire or transmit desired information in this automatic dialogue.
- FIG. 3 shows an example of recording of utterance data in the log during the automatic dialogue (in other words, in step S15).
- In step S151, the user terminal 90 transmits user utterance data to the dialogue apparatus 10.
- the user utterance data is data including an utterance sentence representing the user utterance and an utterance attribute.
- the acquisition method of the user utterance (uttered text) in the user terminal 90 is not limited.
- The user terminal 90 may acquire a user utterance input by voice (that is, a user utterance represented by voice data) or a user utterance input as a character string (that is, a user utterance represented by text data).
- the user terminal 90 may automatically generate a user utterance in the form of voice data or text data based on a user instruction.
- the utterance attribute may be the same as that included in the start request, or may be different from at least a part of what is included in the start request.
- the utterance attribute is data associated with the user utterance (utterance sentence).
- the user terminal 90 generates user utterance data including the acquired user utterance and utterance attribute, and transmits the user utterance data.
- the front function 20 receives the user utterance data.
- In step S152, the front function 20 determines the bot function 30 for processing the user utterance data and transfers the user utterance data to that bot function 30.
- In step S153, the concealment unit 22 determines the concealment level based on the utterance attribute of the user utterance data.
- the concealing unit 22 determines the concealment level as in step S12.
- In step S154, the concealment unit 22 executes concealment processing on the user utterance data based on the determined concealment level.
- the concealment unit 22 executes concealment processing as in step S13.
- The utterance text of a user utterance can be freely written. Therefore, when identifying the secret information to be concealed in the utterance text, the concealment unit 22 may specify the secret information using at least one method selected from character string matching, machine learning, anaphora analysis, and named entity extraction (entity linking).
- the concealment unit 22 may specify secret information to be concealed in the speech attribute by character string matching.
- In step S155, the recording unit 23 records the concealed user utterance data in the log 40. As in step S14, the user utterance data is recorded in the log 40 in such a manner that the secret information to be concealed does not appear as it is.
- When recording the user utterance “Tell me the weather in Minato-ku Akasaka and Yokosuka City” in the log 40, the dialogue apparatus 10 conceals the place names in the utterance text and also conceals the user attribute (userInfo) included in the utterance attribute (clientData).
- “Minato-ku Akasaka” is assumed to be the user's work place
- “Yokosuka City” is assumed to be the user's residence.
- FIG. 4 shows three examples in which this utterance is concealed at the deletion level, at the first replacement level, and at the second replacement level (replacement with a second concept word whose abstraction level is lower than that of the first concept word).
- FIG. 4 shows data described using AIML (Artificial Intelligence Markup Language), but the method and rules for describing the utterance data are not limited in any way.
- When the concealment unit 22 selects the deletion level as the concealment level, it deletes these two place names from the utterance text, so the recording unit 23 records the character string “Tell me about the weather” in the log 40. The concealment unit 22 also deletes the name “tanaka” and the age “28” constituting the user attribute.
- When the concealment unit 22 selects the first replacement level as the concealment level, it replaces these two place names with a common character string. For example, the concealment unit 22 uses the variable name “%userInfo.place%” corresponding to a place name as the first concept word and replaces both “Minato-ku Akasaka” and “Yokosuka City” with this variable name. The concealment unit 22 also deletes the name “tanaka” and the age “28” constituting the user attribute. As this shows, the concealment level may differ between the utterance text and the utterance attribute.
- When the concealment unit 22 selects the second replacement level as the concealment level, it replaces the two place names with different character strings. For example, the concealment unit 22 replaces “Minato-ku Akasaka” with the variable name “%userInfo.workplace%”, the second concept word corresponding to the workplace, and replaces “Yokosuka City” with the variable name “%userInfo.home%”, the second concept word corresponding to the place of residence. The concealment unit 22 also deletes the name “tanaka” and the age “28” constituting the user attribute.
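The three concealment results above can be sketched as follows. The place-name detection here is a fixed mapping for illustration; as noted above, matching, machine learning, or entity extraction could be used in practice, and the handling of the dangling connectives at the deletion level is an assumption.

```python
import re

# Sketch of the three log records for the weather utterance above.
# The place-to-concept mapping and the connective cleanup are
# illustrative assumptions, not the patented implementation.

utterance = "Tell me the weather in Minato-ku Akasaka and Yokosuka City"
place_to_concept = {"Minato-ku Akasaka": "%userInfo.workplace%",
                    "Yokosuka City": "%userInfo.home%"}

def deletion_level(text):
    """Delete the place names (and their dangling connectives)."""
    for place in place_to_concept:
        text = re.sub(r"(\s(in|and)\s)?" + re.escape(place), "", text)
    return " ".join(text.split())

def first_replacement_level(text):
    """Replace every place name with one shared first concept word."""
    for place in place_to_concept:
        text = text.replace(place, "%userInfo.place%")
    return text

def second_replacement_level(text):
    """Replace each place name with its own, less abstract concept word."""
    for place, concept in place_to_concept.items():
        text = text.replace(place, concept)
    return text

print(deletion_level(utterance))           # -> Tell me the weather
print(first_replacement_level(utterance))  # shared %userInfo.place%
print(second_replacement_level(utterance)) # distinct workplace/home words
```

As in the description, the deletion level leaves no clue about the places, the first replacement level records one indistinguishable concept word, and the second records distinguishable but still abstract concept words.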
- the method of recording the utterance text in the log is different among the three examples.
- When the concealment level is the deletion level, the secret information to be concealed is completely deleted from the utterance text, so no clue about the place names is recorded in the log 40. Even if a third party other than the user (for example, the administrator of the dialogue apparatus 10) reads the log 40, the third party cannot identify which place's weather the user asked about. For example, the deleted part of “Tell me about the weather” could equally have contained a time such as “today” or “tomorrow”.
- When the concealment level is the first replacement level, the first concept word “%userInfo.place%” is recorded in the portion corresponding to the secret information.
- When the concealment level is the second replacement level, the second concept words “%userInfo.workplace%” and “%userInfo.home%” are recorded in the log 40. A third party can therefore guess that the user asked about the weather at two places and that those places relate to the user's workplace and residence, but cannot learn the specific places. In the example of FIG. 4, the abstraction level of the utterance sentence recorded in the log 40 is thus highest at the deletion level, lowest at the second replacement level, and, at the first replacement level, between the deletion level and the second replacement level.
- the recording method of the speech attribute in the log is the same in the three examples.
- the concealment unit 22 may replace at least a part of secret information in the speech attribute with a concept word.
- In step S156, the bot function 30 outputs system utterance data to the front function 20 in response to the user utterance.
- The bot function 30 receives the user utterance data from the front function 20 and, by referring to the scenario based on the user utterance and the utterance attribute included in that data, determines a system utterance presumed to be appropriate as a response to the user utterance. The bot function 30 then outputs system utterance data including the determined system utterance to the front function 20.
- the system utterance data is data including an utterance sentence representing the system utterance.
- the method of expressing the system utterance is not limited. For example, the system utterance can be expressed by voice or text.
- In step S157, the front function 20 transmits the system utterance data to the user terminal 90.
- the front function 20 may transmit the system utterance data after designating the output format of the system utterance (that is, after shaping the system utterance).
- the user terminal 90 receives and outputs the system utterance data, so that the user can recognize the reply of the bot to the user utterance.
- In step S158, the concealment unit 22 determines the concealment level of the system utterance. As in step S153, the concealment unit 22 may determine the concealment level based on the utterance attribute included in the user utterance data.
- In step S159, the concealment unit 22 executes concealment processing on the system utterance data based on the determined concealment level.
- The concealment unit 22 may specify the secret information to be concealed based on variables. For example, if the system utterance output to the user terminal 90 is “Minato-ku Akasaka and Yokosuka City are clear”, the system utterance may include the variable “%userInfo.workplace%” corresponding to “Minato-ku Akasaka” and the variable “%userInfo.home%” corresponding to “Yokosuka City”. In this case, the concealment unit 22 can determine, based on these variables, to conceal the two place names. Alternatively, the concealment unit 22 may specify the secret information by the same process as in step S154.
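One way to read this variable-based identification: when the bot builds its reply by filling a template, the template itself already marks every secret slot, so the concealment unit can log the unfilled form. A minimal sketch under that assumption (the template mechanism and names are illustrative, not prescribed by this description):

```python
# Sketch: the system-utterance template marks secret slots with
# %...% variables; the filled sentence goes to the user, while the
# unfilled template is what gets recorded in the log.

template = "%userInfo.workplace% and %userInfo.home% are clear"
values = {"%userInfo.workplace%": "Minato-ku Akasaka",
          "%userInfo.home%": "Yokosuka City"}

def render(template, values):
    """Produce the system utterance actually sent to the user."""
    out = template
    for var, val in values.items():
        out = out.replace(var, val)
    return out

print(render(template, values))
# -> Minato-ku Akasaka and Yokosuka City are clear
print(template)  # the concealed form that the recording unit logs
```

Because the variables mark exactly where user-specific values were inserted, no separate detection step is needed for system utterances in this case.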
- In step S160, the recording unit 23 records the concealed system utterance data in the log 40. As in step S155, the system utterance data is recorded in the log 40 in such a manner that the secret information to be concealed does not appear as it is.
- FIG. 3 shows, as steps S161 and S162, processes similar to steps S151 and S152.
- In response to step S161, the user utterance data is recorded in the log 40.
- In response to step S162, system utterance data is generated and transmitted to the user terminal 90, and that system utterance data is likewise recorded in the log 40.
- Since the secret information to be concealed is reliably concealed through the cooperation of the acquisition unit 21, the concealment unit 22, and the recording unit 23 before the utterance sentence and the utterance attribute are first recorded in the log 40, this processing is completely different from processing that later deletes secret information once it has been recorded in the log 40.
- the dialogue apparatus 10 may execute the concealment process for the utterance sentence without performing the concealment process for the utterance attribute. That is, it is not essential to conceal the speech attribute.
- Each functional block may be realized by one device that is physically and/or logically coupled, or by a plurality of devices in which two or more physically and/or logically separated devices are connected directly and/or indirectly (for example, by wire and/or wirelessly).
- the dialogue apparatus 10 may function as a computer that performs the processing according to the present embodiment.
- FIG. 5 is a diagram illustrating an example of a hardware configuration of the computer 100 that functions as the interactive apparatus 10.
- the computer 100 may physically include a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
- the term “apparatus” can be read as a circuit, a device, a unit, or the like.
- the hardware configuration of the interactive device 10 may be configured to include one or a plurality of each device illustrated in the figure, or may be configured not to include some devices.
- Each function of the interactive apparatus 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, causing the processor 1001 to perform computation, and controlling communication by the communication device 1004 and the reading and/or writing of data in the memory 1002 and the storage 1003.
- the processor 1001 controls the entire computer by operating an operating system, for example.
- the processor 1001 may be configured by a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, a register, and the like.
- the interactive device 10 may be realized by the processor 1001.
- the processor 1001 reads a program (program code), software module, and data from the storage 1003 and / or the communication device 1004 to the memory 1002, and executes various processes according to these.
- As the program, a program that causes a computer to execute at least a part of the operations described in the above embodiment is used.
- at least a part of the functional elements of the interactive apparatus 10 may be realized by a control program stored in the memory 1002 and operated by the processor 1001, and may be realized similarly for other functional blocks.
- Although the above-described various processes have been described as being executed by one processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001.
- the processor 1001 may be implemented by one or more chips. Note that the program may be transmitted from a network via a telecommunication line.
- The memory 1002 is a computer-readable recording medium and may include, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM (Random Access Memory).
- the memory 1002 may be called a register, a cache, a main memory (main storage device), or the like.
- the memory 1002 can store a program (program code), a software module, and the like that can be executed to implement the wireless communication method according to the embodiment of the present invention.
- The storage 1003 is a computer-readable recording medium and may be, for example, at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disc (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy disk, and a magnetic strip.
- the storage 1003 may be referred to as an auxiliary storage device.
- The above-described storage medium may be, for example, a database including the memory 1002 and/or the storage 1003, a server, or another suitable medium.
- the communication device 1004 is hardware (transmission / reception device) for performing communication between computers via a wired and / or wireless network, and is also referred to as a network device, a network controller, a network card, a communication module, or the like.
- at least some functional elements of the interactive device 10 may be realized by the communication device 1004.
- the input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, a sensor, etc.) that accepts an input from the outside.
- the output device 1006 is an output device (for example, a display, a speaker, an LED lamp, etc.) that performs output to the outside. Note that the input device 1005 and the output device 1006 may have an integrated configuration (for example, a touch panel).
- each device such as the processor 1001 and the memory 1002 is connected by a bus 1007 for communicating information.
- the bus 1007 may be configured with a single bus or may be configured with different buses between apparatuses.
- the computer 100 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a programmable logic device (PLD), and a field programmable gate array (FPGA), and some or all of the functional blocks may be realized by the hardware. For example, the processor 1001 may be implemented by at least one of these pieces of hardware.
- the interactive apparatus includes an acquisition unit that acquires an utterance sentence constituting an automatic dialogue executed with a user,
- a concealing unit that executes, on the utterance sentence in a state in which the utterance sentence is not yet recorded in a log, concealment processing for concealing at least a part of secret information, and
- a recording unit that records, in the log, the utterance sentence in which at least the part of the secret information is concealed.
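The three units above can be sketched as a minimal pipeline in which concealment always runs before recording, so raw secrets never reach the log. All names below (`DialogueDevice`, `SECRET_PATTERN`, the `<SECRET>` placeholder) and the choice of digit sequences as the secret information are illustrative assumptions, not part of the patent.

```python
import re

# Hypothetical pattern for secret information (here: digit sequences such as phone numbers).
SECRET_PATTERN = re.compile(r"\d{2,}")

class DialogueDevice:
    def __init__(self):
        self.log = []  # utterance log; only concealed sentences are ever stored here

    def acquire(self, utterance: str) -> str:
        # Acquisition unit: obtains an utterance sentence of the automatic dialogue.
        return utterance

    def conceal(self, utterance: str) -> str:
        # Concealment unit: masks secret information while the sentence is
        # still unrecorded, so the raw secret never reaches the log.
        return SECRET_PATTERN.sub("<SECRET>", utterance)

    def record(self, utterance: str) -> None:
        # Recording unit: writes the already-concealed sentence to the log.
        self.log.append(utterance)

    def handle(self, utterance: str) -> None:
        self.record(self.conceal(self.acquire(utterance)))

device = DialogueDevice()
device.handle("My phone number is 09012345678")
print(device.log[0])  # -> My phone number is <SECRET>
```

The key design point matching the claim is the ordering in `handle`: concealment is applied to the acquired sentence before `record` is called, not as a post-processing pass over an existing log.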
- the security of the confidential information appearing in the dialogue can be ensured more reliably.
- the amount of hardware resources used by the interactive device (for example, processor load and memory consumption) can be suppressed.
- the concealment unit may execute the concealment process based on at least one concealment level selected from a plurality of concealment levels determined in advance.
- the concealment process can be executed flexibly as necessary. In other words, the abstraction level of the utterance sentence recorded in the log can be changed as necessary.
- the plurality of concealment levels may include at least a deletion level indicating deletion of at least a part of the secret information and a replacement level indicating replacement of at least a part of the secret information with a concept word. In this case, it is possible to select whether the information to be concealed is completely deleted or replaced with another phrase.
- the plurality of concealment levels may include at least a first replacement level that replaces at least a part of the secret information with a first concept word, and a second replacement level that replaces at least a part of the secret information with a second concept word having a lower level of abstraction than the first concept word.
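The selectable levels can be sketched as follows. The level names, the concept-word table, and the example secret term are assumptions made for illustration; a real device would look concept words up in a dictionary or ontology rather than a hard-coded mapping.

```python
from enum import Enum, auto
import re

class ConcealmentLevel(Enum):
    DELETE = auto()        # deletion level: remove the secret information entirely
    REPLACE_HIGH = auto()  # first replacement level: highly abstract concept word
    REPLACE_LOW = auto()   # second replacement level: less abstract concept word

# Hypothetical concept-word table: each secret term maps to concept words at
# two abstraction levels (REPLACE_LOW is more concrete than REPLACE_HIGH).
CONCEPT_WORDS = {
    "Shinjuku": {"high": "a place", "low": "a district of Tokyo"},
}

def conceal(utterance: str, level: ConcealmentLevel) -> str:
    for secret, words in CONCEPT_WORDS.items():
        if level is ConcealmentLevel.DELETE:
            replacement = ""
        elif level is ConcealmentLevel.REPLACE_HIGH:
            replacement = words["high"]
        else:
            replacement = words["low"]
        utterance = utterance.replace(secret, replacement)
    # collapse whitespace left behind by deletions
    return re.sub(r"\s+", " ", utterance).strip()

print(conceal("I live in Shinjuku", ConcealmentLevel.REPLACE_HIGH))  # -> I live in a place
print(conceal("I live in Shinjuku", ConcealmentLevel.DELETE))        # -> I live in
```

Selecting among the levels is what lets the abstraction of the logged sentence be tuned: the deletion level removes the information outright, while the two replacement levels trade readability of the log against how much the concept word still reveals.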
- the acquisition unit may further acquire an utterance attribute associated with the utterance sentence,
- the concealment unit may further perform the concealment processing on the utterance attribute, and
- the recording unit may further record, in the log, the utterance attribute in which at least a part of the secret information is concealed.
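Extending the same idea to attributes, the sentence and every associated attribute value can pass through the same concealment step before either is logged. The record layout, attribute names, and digit-based secret pattern below are illustrative assumptions.

```python
import re

# Hypothetical pattern for secret information (digit sequences).
SECRET = re.compile(r"\d{2,}")

def conceal_text(text: str) -> str:
    return SECRET.sub("<SECRET>", text)

def record_entry(log: list, sentence: str, attributes: dict) -> None:
    # Both the utterance sentence and every attribute value are concealed
    # before the entry is written to the log.
    log.append({
        "sentence": conceal_text(sentence),
        "attributes": {k: conceal_text(v) for k, v in attributes.items()},
    })

log = []
record_entry(log, "Reserve room 4201", {"topic": "reservation for room 4201"})
print(log[0]["sentence"])             # -> Reserve room <SECRET>
print(log[0]["attributes"]["topic"])  # -> reservation for room <SECRET>
```

The point of concealing attributes as well is that metadata (topic, slot values, and the like) can leak the same secret that was masked in the sentence; applying one concealment step to both closes that gap.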
- notification of information is not limited to the aspect and embodiment described in this specification, and may be performed by other methods.
- notification of information may be implemented by physical layer signaling (for example, DCI (Downlink Control Information) and UCI (Uplink Control Information)), upper layer signaling (for example, RRC (Radio Resource Control) signaling and MAC (Medium Access Control) signaling), broadcast information (MIB (Master Information Block) and SIB (System Information Block)), other signals, or a combination thereof.
- the RRC signaling may be called an RRC message, and may be, for example, an RRC connection setup message, an RRC connection reconfiguration message, or the like.
- Each aspect / embodiment described in this specification may be applied to systems using LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-Wideband), Bluetooth (registered trademark), other appropriate systems, and / or next-generation systems extended based on these systems.
- Information etc. can be output from the upper layer (or lower layer) to the lower layer (or upper layer). Input / output may be performed via a plurality of network nodes.
- the input / output information or the like may be stored in a specific place (for example, a memory) or may be managed by a management table. Input / output information and the like can be overwritten, updated, or additionally written. The output information or the like may be deleted. The input information or the like may be transmitted to another device.
- the determination may be performed by a value represented by 1 bit (0 or 1), by a Boolean value (true or false), or by a comparison of numerical values (for example, comparison with a predetermined value).
- notification of predetermined information is not limited to being performed explicitly, and may be performed implicitly (for example, by not performing notification of the predetermined information).
- software, instructions, etc. may be transmitted / received via a transmission medium.
- when software is transmitted from a website, a server, or another remote source using wired technology (such as coaxial cable, fiber optic cable, twisted pair, and digital subscriber line (DSL)) and / or wireless technology (such as infrared, radio, and microwave), these wired and / or wireless technologies are included within the definition of a transmission medium.
- the terms "system" and "network" used in this specification are used interchangeably.
- information, parameters, and the like described in this specification may be represented by absolute values, may be represented by relative values from a predetermined value, or may be represented by other corresponding information.
- the radio resource may be indicated by an index.
- a user terminal or mobile communication terminal may also be called, by those skilled in the art, a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a mobile device, a wireless device, a wireless communication device, a remote device, a mobile subscriber station, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, or some other appropriate terminology.
- "determining" and "deciding" as used herein may encompass a wide variety of actions. For example, "determining" and "deciding" may include regarding judging, calculating, computing, processing, deriving, investigating, looking up (for example, searching in a table, a database, or another data structure), and ascertaining as "determining" and "deciding". In addition, "determining" and "deciding" may include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, and accessing (for example, accessing data in a memory) as "determining" and "deciding".
- "determining" and "deciding" may also include regarding resolving, selecting, choosing, establishing, and comparing as "determining" and "deciding". In other words, "determining" and "deciding" may include regarding some operation as "determining" and "deciding".
- the terms "connected" and "coupled", or any variation thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other.
- the coupling or connection between the elements may be physical, logical, or a combination thereof.
- as used herein, two elements can be considered "connected" or "coupled" to each other by using one or more wires, cables, and / or printed electrical connections, and, as some non-limiting and non-exhaustive examples, by using electromagnetic energy such as electromagnetic energy having wavelengths in the radio frequency region, the microwave region, and the light (both visible and invisible) region.
- the phrase “based on” does not mean “based only on”, unless expressly specified otherwise. In other words, the phrase “based on” means both “based only on” and “based at least on.”
- any reference to elements using designations such as "first" and "second" used herein does not generally limit the quantity or order of those elements. These designations can be used herein as a convenient way to distinguish between two or more elements. Thus, a reference to first and second elements does not mean that only two elements can be employed, or that the first element must in some way precede the second element.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Bioethics (AREA)
- Computer Hardware Design (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Artificial Intelligence (AREA)
- Machine Translation (AREA)
Abstract
According to one embodiment, the present invention relates to a dialogue device comprising: an acquisition unit for acquiring utterance sentences constituting an automatic dialogue carried out with a user; a concealment unit for executing, on the utterance sentences, a concealment process that masks at least a part of secret information while the utterance sentences are not yet recorded in a log; and a recording unit for recording, in the log, the utterance sentences in which at least the part of the secret information is concealed.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020519502A JP7033195B2 (ja) | 2018-05-14 | 2019-03-29 | 対話装置 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018-092832 | 2018-05-14 | ||
| JP2018092832 | 2018-05-14 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019220791A1 true WO2019220791A1 (fr) | 2019-11-21 |
Family
ID=68540111
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/014088 Ceased WO2019220791A1 (fr) | 2018-05-14 | 2019-03-29 | Dispositif de dialogue |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7033195B2 (fr) |
| WO (1) | WO2019220791A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022044387A (ja) * | 2020-09-07 | 2022-03-17 | 株式会社日立製作所 | 音声情報加工システム及び音声情報加工方法 |
| JP7730203B1 (ja) * | 2024-07-05 | 2025-08-27 | 株式会社Aces | 抽象化システム、抽象化方法及び抽象化プログラム |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006127148A (ja) * | 2004-10-28 | 2006-05-18 | Fujitsu Ltd | 音声自動対話システムにおける情報処理方法 |
| JP2007249770A (ja) * | 2006-03-17 | 2007-09-27 | Nec Corp | 個人情報隠蔽サービスシステム |
| JP2010079235A (ja) * | 2008-09-28 | 2010-04-08 | Avaya Inc | 個人(オーディ)情報を含まないメディア・ストリームを保存する方法 |
| WO2011142327A1 (fr) * | 2010-05-10 | 2011-11-17 | 日本電気株式会社 | Dispositif de traitement d'informations, procédé et programme de commande |
| WO2016136208A1 (fr) * | 2015-02-27 | 2016-09-01 | パナソニックIpマネジメント株式会社 | Dispositif d'interaction vocale, système d'interaction vocale, procédé de commande de dispositif d'interaction vocale |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5018541B2 (ja) * | 2008-02-19 | 2012-09-05 | 富士ゼロックス株式会社 | 情報処理装置および履歴情報管理プログラム |
2019
- 2019-03-29 WO PCT/JP2019/014088 patent/WO2019220791A1/fr not_active Ceased
- 2019-03-29 JP JP2020519502A patent/JP7033195B2/ja active Active
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022044387A (ja) * | 2020-09-07 | 2022-03-17 | 株式会社日立製作所 | 音声情報加工システム及び音声情報加工方法 |
| JP7388997B2 (ja) | 2020-09-07 | 2023-11-29 | 株式会社日立製作所 | 音声情報加工システム及び音声情報加工方法 |
| JP7730203B1 (ja) * | 2024-07-05 | 2025-08-27 | 株式会社Aces | 抽象化システム、抽象化方法及び抽象化プログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019220791A1 (ja) | 2021-02-12 |
| JP7033195B2 (ja) | 2022-03-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7166350B2 (ja) | 対話装置 | |
| US11868734B2 (en) | Dialogue system | |
| WO2019193796A1 (fr) | Serveur d'interaction | |
| JP7033195B2 (ja) | 対話装置 | |
| JP7043593B2 (ja) | 対話サーバ | |
| JP7016405B2 (ja) | 対話サーバ | |
| US11663420B2 (en) | Dialogue system | |
| JP6745402B2 (ja) | 質問推定装置 | |
| WO2019216054A1 (fr) | Serveur interactif | |
| WO2020235136A1 (fr) | Système interactif | |
| JP6960049B2 (ja) | 対話装置 | |
| US11604831B2 (en) | Interactive device | |
| JP7112487B2 (ja) | 対話装置 | |
| JP6895580B2 (ja) | 対話システム | |
| JP2022025917A (ja) | 対話装置 | |
| JP7429193B2 (ja) | 対話装置及び対話プログラム | |
| JP7093844B2 (ja) | 対話システム | |
| JP2019179394A (ja) | 情報提示システム | |
| WO2025243473A1 (fr) | Dispositif et procédé | |
| WO2024241734A1 (fr) | Système de détermination de phrases contextuelles, dispositif de traduction machine et dispositif d'apprentissage | |
| WO2026009320A1 (fr) | Dispositif et procédé | |
| WO2025234102A1 (fr) | Dispositif de sortie d'informations et procédé de sortie d'informations | |
| WO2025224892A1 (fr) | Dispositif et procédé | |
| WO2025210904A1 (fr) | Dispositif de conversion et procédé de conversion | |
| WO2022070792A1 (fr) | Système de réglage de paramètre |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19803222 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2020519502 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19803222 Country of ref document: EP Kind code of ref document: A1 |