CN113408275B - Word learning method, device, system and computing equipment - Google Patents
- Publication number: CN113408275B (application CN202010182902.7A)
- Authority
- CN
- China
- Prior art keywords
- word
- learning
- words
- nodes
- clique
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/06—Foreign languages
- G09B5/00—Electrically-operated educational appliances
- G09B5/02—Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
Abstract
An embodiment of the invention discloses a word learning method comprising: acquiring a target word currently to be learned in a word set, the target word being determined based on the user's learning progress on the word set and the learning order of the words in the word set; displaying the acquired target word; and displaying an associated example sentence corresponding to the target word, the associated example sentence including at least the target word and other words in the word set. Embodiments of the invention also disclose a corresponding word learning apparatus, system, and computing device.
Description
Technical Field
The present invention relates to the field of language learning technologies, and in particular, to a word learning method, device, system, and computing device.
Background
Language is composed of a large number of words; word learning is therefore an important part of language learning and the foundation of a language. Most current word learning schemes rely on the Ebbinghaus forgetting curve to assist in memorizing words.
The Ebbinghaus forgetting curve, discovered by the German psychologist Hermann Ebbinghaus, describes how the human brain forgets newly learned material. It indicates that forgetting follows a regular pattern: forgetting proceeds quickly at first and then slows down. Based on the forgetting curve, reviews of a word can therefore be scheduled near the forgetting points to achieve a better memorization effect.
However, Ebbinghaus's experiments used meaningless letter combinations that bore no relation to one another. In real word learning, many words are related, and when learning a certain word, other words closely related to it can be learned more easily. Studying and reviewing strictly according to the Ebbinghaus forgetting curve therefore yields limited learning efficiency.
Thus, a more advanced word learning scheme is desired.
Disclosure of Invention
To this end, embodiments of the present invention provide a word learning method, apparatus, system, and computing device in an effort to solve or at least alleviate the above-identified problems.
According to an aspect of an embodiment of the present invention, there is provided a word learning method comprising: acquiring a target word currently to be learned in a word set, the target word being determined based on the user's learning progress on the word set and the learning order of the words in the word set; displaying the acquired target word; and displaying an associated example sentence corresponding to the target word, the associated example sentence including at least the target word and other words in the word set.
Optionally, in the method according to the embodiment of the present invention, the step of displaying the associated example sentence of the target word comprises: displaying the associated example sentence corresponding to the target word in response to the user's marking operation on the target word, the marking operation indicating the user's degree of familiarity with the target word.
According to another aspect of the embodiment of the present invention, there is provided a word learning method including: receiving a word set selected by a user; acquiring a plurality of associated example sentences corresponding to the word set, wherein the associated example sentences comprise a plurality of words in the word set; and configuring the learning sequence of each word in the word set based on the word set and the plurality of related example sentences, and determining the related example sentences corresponding to each word.
Optionally, in the method according to the embodiment of the present invention, the step of acquiring a plurality of associated example sentences corresponding to the word set comprises: acquiring the associated example sentences from a corresponding source based on the type of the word set.
According to another aspect of an embodiment of the present invention, there is provided a word learning apparatus comprising: a communication module adapted to acquire a target word currently to be learned in a word set, the target word being determined based on the user's learning progress on the word set and the learning order of the words in the word set; and a display module adapted to display the acquired target word, and further adapted to display the associated example sentence corresponding to the target word, the associated example sentence including at least the target word and other words in the word set.
According to another aspect of an embodiment of the present invention, there is provided a word learning apparatus comprising: a communication module adapted to receive a word set selected by a user; an example sentence acquisition module adapted to acquire a plurality of associated example sentences corresponding to the word set, the associated example sentences containing a plurality of words of the word set; and a sequence configuration module adapted to configure the learning order of each word in the word set based on the word set and the plurality of associated example sentences, and to determine the associated example sentence corresponding to each word.
According to another aspect of an embodiment of the present invention, there is provided a word learning system including: a client on which the word learning apparatus according to the embodiment of the present invention resides; and a server on which the word learning apparatus according to the embodiment of the present invention resides.
According to yet another aspect of an embodiment of the present invention, there is provided a computing device including: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing the word learning method according to an embodiment of the present invention.
According to the word learning scheme provided by the embodiments of the invention, displaying the associated example sentences of a target word lets the user incidentally learn other words associated with it (for example, reviewing recently learned words and previewing words yet to be learned), which improves the user's retention of words and thus learning efficiency. Configuring the learning order of the words based on maximal cliques maximizes, as far as possible, the number of words the user learns through associated example sentences, helping the user master more words in a short time.
The foregoing is merely an overview of the technical solutions of the embodiments of the invention. So that the technical means of the embodiments can be understood more clearly and implemented according to the contents of the specification, specific embodiments of the invention are described in detail below.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which set forth the various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to fall within the scope of the claimed subject matter. The above, as well as additional objects, features, and advantages of the present disclosure will become more apparent from the following detailed description when read in conjunction with the accompanying drawings. Like reference numerals generally refer to like parts or elements throughout the present disclosure.
FIG. 1 shows a schematic diagram of a word learning system 100 according to one embodiment of the invention;
FIG. 2 shows a schematic diagram of a computing device 200 according to one embodiment of the invention;
FIG. 3 illustrates an interactive flow diagram of a word learning method 300 according to one embodiment of the invention;
FIG. 4 shows a schematic diagram of a word learning sequence according to one embodiment of the invention;
FIGS. 5A-5D are schematic diagrams illustrating a plurality of graphical user interfaces according to one embodiment of the present invention;
FIG. 6 illustrates an interactive flow diagram of a word learning method 600 according to one embodiment of the invention;
FIG. 7 shows a schematic diagram of a word learning device 700 according to one embodiment of the invention; and
FIG. 8 shows a schematic diagram of a word learning device 800 according to one embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
FIG. 1 shows a schematic diagram of a word learning system 100 according to one embodiment of the invention. The word learning system 100 may assist the user in word memory and learning. As shown in fig. 1, the word learning system 100 may include a client 120 and a server 140. In other implementations, the word learning system 100 may include different and/or additional modules.
The client 120 may receive user input; for example, it may provide a graphical user interface for the user to select a set of words to learn and to set the number of new words to learn daily. The server 140 may schedule the words to be learned each day based on the user's input and communicate with the client 120 via the network 160, for example sending the client the words the user is to learn today together with the entry data of those words. Network 160 may include wired and/or wireless communication paths.
According to an embodiment of the present invention, each of the components (clients, servers, etc.) in the word learning system 100 described above may be implemented by the computing device 200 as described below.
FIG. 2 shows a schematic diagram of a computing device 200 according to one embodiment of the invention. As shown in FIG. 2, in a basic configuration 202, computing device 200 typically includes a system memory 206 and one or more processors 204. A memory bus 208 may be used for communication between the processor 204 and the system memory 206.
Depending on the desired configuration, processor 204 may be any type of processor, including but not limited to: a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof. Processor 204 may include one or more levels of cache, such as a level-one cache 210 and a level-two cache 212, a processor core 214, and registers 216. The example processor core 214 may include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP core), or any combination thereof. An example memory controller 218 may be used with processor 204, or in some implementations memory controller 218 may be an internal part of processor 204.
Depending on the desired configuration, system memory 206 may be any type of memory including, but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. The system memory 206 may include an operating system 220, one or more applications 222, and program data 224. In some implementations, the application 222 may be arranged to execute instructions on an operating system by the one or more processors 204 using the program data 224.
Computing device 200 may also include an interface bus 240 that facilitates communication from various interface devices (e.g., output devices 242, peripheral interfaces 244, and communication devices 246) to basic configuration 202 via bus/interface controller 230. The example output device 242 includes a graphics processing unit 248 and an audio processing unit 250. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 252. The example peripheral interface 244 may include a serial interface controller 254 and a parallel interface controller 256, which may be configured to facilitate communication via one or more I/O ports 258 and external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.). The example communication device 246 may include a network controller 260 that may be arranged to facilitate communication with one or more other computing devices 262 over a network communication link via one or more communication ports 264.
The network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, program modules, and may include any information delivery media in a modulated data signal, such as a carrier wave or other transport mechanism. A "modulated data signal" may be a signal that has one or more of its data set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or special purpose network, and wireless media such as acoustic, radio Frequency (RF), microwave, infrared (IR) or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Computing device 200 may be implemented as a server, such as a database server, an application server, a WEB server, etc., or as a personal computer including desktop and notebook computer configurations. Of course, computing device 200 may also be implemented as at least a portion of a small-sized portable (or mobile) electronic device.
In an embodiment in accordance with the invention, computing device 200 may be implemented as word learning apparatus 700 and/or 800 and configured to perform word learning methods 300 and/or 600 in accordance with embodiments of the invention. The application 222 of the computing device 200 includes a plurality of instructions for executing the word learning method 300/600 according to an embodiment of the present invention, and the program data 224 may also store configuration data of the word learning system 100.
FIG. 3 illustrates an interactive flow diagram of a word learning method 300 according to one embodiment of the invention. The word learning method 300 is adapted to be executed in the word learning system 100.
As shown in FIG. 3, the word learning method 300 begins at step S310, in which the client 120 requests from the server 140 the target word currently to be learned in the word set. In some embodiments, the client may issue this request in response to the user starting to learn the word set, or finishing the learning of the previous word in the set.
In step S320, the server 140 may determine the target word according to the learning order of the words in the word set and the learning progress of the user for the word set. For example, the server 140 may store a learning order for each word in a set of words, all words in the set of words arranged in the learning order to form a word learning sequence. The server 140 may also store a learning progress of the user for the set of words that indicates which word in the word learning sequence the user learned. The server 140 selects, as the target word, the word immediately following the word indicated by the learning progress in the learning order.
FIG. 4 shows a schematic diagram of a word learning sequence according to one embodiment of the invention. In FIG. 4, the words are arranged from top to bottom in their configured learning order, with words nearer the top coming earlier. The arrow indicates that learning has progressed to word D in the sequence, so word E, located below word D, can be selected as the target word.
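As an illustration only, the selection logic of steps S310-S320 can be sketched in a few lines of Python; the function and variable names are assumptions for this sketch, not taken from the patent.

```python
# Minimal sketch of the target-word selection in step S320.
# `word_sequence` is the word set arranged in the configured learning order;
# `progress` is the index of the last word the user has learned (-1 if none).

def next_target_word(word_sequence: list[str], progress: int) -> str | None:
    """Return the word immediately after the learning progress, or None
    if the whole sequence has been learned."""
    nxt = progress + 1
    return word_sequence[nxt] if nxt < len(word_sequence) else None

# Example matching FIG. 4: learning has progressed to word D, so word E
# is selected as the target word.
sequence = ["A", "B", "C", "D", "E", "F"]
assert next_target_word(sequence, sequence.index("D")) == "E"
```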
The server 140 may transmit the target word to the client 120 so that the client 120 displays the acquired target word via the graphical user interface in step S330. Alternatively, the server 140 may send the target word and its entry data to the client 120, the entry data including paraphrasing, phonetic symbols, examination frequencies, example sentences, etc. of the target word.
In some embodiments, the graphical user interface displaying the target word may include marking buttons, via which the client 120 receives the user's marking operation on the displayed target word. The marking operation may indicate the user's degree of familiarity with the target word, such as recognizing the word, not recognizing it, or only fuzzily recognizing it.
In step S340, the server 140 may also transmit the associated example sentence corresponding to the target word (described in detail later) to the client 120, so that in step S350 the client 120 displays the associated example sentence, which includes other words of the word set in addition to the target word. Through the associated example sentence, the user can thus learn other words of the word set by association while learning the target word, improving learning efficiency. Optionally, the server 140 may send the client 120 the associated example sentence of the target word, the other words of the word set contained in it, and the related data of those words. The related data may include at least one of the entry data, learning time, and learning state of a word.
The learning state of a word may be either learned or to-be-learned. For a word whose learning state is learned, the server 140 stores the word's most recent actual learning or review time. For a word whose learning state is to-be-learned, the server 140 determines an estimated learning time based on the per-unit-time new word amount set by the user and the word's position in the learning order of the word set.
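A hedged sketch of the per-word record this implies is shown below; the field names are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class WordRecord:
    word: str
    entry_data: dict = field(default_factory=dict)   # paraphrase, phonetic symbols, ...
    learned: bool = False                            # learning state
    last_learned_at: datetime | None = None          # actual learn/review time, if learned
    estimated_learn_at: datetime | None = None       # estimated time, if still to be learned
```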
In some embodiments, the associated example sentence of the target word may be displayed in response to the user's marking operation on the target word. For example, if the marking operation indicates that the user does not recognize the target word or only fuzzily recognizes it, the associated example sentence may be displayed. If the marking operation indicates that the user recognizes the target word, the associated example sentence may or may not be displayed.
In the associated example sentence, the target word may be emphasized, for example highlighted or underlined, to help the user locate it. Besides the associated example sentence itself, the paraphrase of the target word in that sentence and the source, paraphrase, and pronunciation of the sentence may be displayed. The other words of the word set contained in the associated example sentence may also be obtained, and those words and/or their learning states displayed.
In some embodiments, the client 120 may also receive a user selection operation (e.g., a click operation) of the displayed other word, and display a learning time of the other word and/or entry data of the other word in response to the selection operation.
FIGS. 5A-5D show schematic diagrams of several graphical user interfaces according to one embodiment of the invention. As shown in FIG. 5A, the graphical user interface 510 displays the target word acknowledge, with "I know it" and "Give me a hint" marking buttons at the bottom of the interface. Clicking "I know it" indicates that the user recognizes the target word. Clicking "Give me a hint" leads to the graphical user interface 520 shown in FIG. 5B, whose bottom displays "Recalled it" and "Didn't recall it" marking buttons. Clicking "Didn't recall it" indicates that the user does not recognize the target word and leads to the graphical user interface 530 shown in FIG. 5C; clicking "Recalled it" indicates that the user fuzzily recognizes the target word and likewise leads to interface 530. The graphical user interface 530 displays the associated example sentence containing the target word, the paraphrase of the target word in that sentence, the examination frequency of the target word, and the paraphrase, pronunciation, and source of the sentence, with the target word acknowledge highlighted in the sentence.
Below the associated example sentence, the graphical user interface 530 also displays the other words of the word set appearing in the sentence (remonstrate, per, eventable, receiver) and their learning states. When the user clicks the word remonstrate, the word's learning time ("this word was learned 2 days ago") and its entry data may be displayed in the graphical user interface 540 shown in FIG. 5D.
The user may also click on the "next" button displayed at the bottom of the graphical user interface 540 to end learning of the currently displayed target word, acquire and display the next target word to be learned.
According to one embodiment of the present invention, in response to the user ending the learning of the currently displayed target word, the server 140 may set the word's learning state to learned and record its actual learning time. The server 140 may also update the user's learning progress on the word set, so that the next target word can be determined from the updated progress and the learning order of the words. The user is considered to have ended the learning of the currently displayed target word when the user clicks a completion button on the interface displaying the word (e.g., the "next" button in graphical user interface 530) or exits the learning of the word set.
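A minimal sketch of this server-side bookkeeping, reusing the WordRecord structure sketched earlier; all names are illustrative assumptions:

```python
from datetime import datetime

def finish_learning(records: dict, word_sequence: list[str], progress: int) -> int:
    """Mark the current target word as learned, record its actual learning
    time, and return the updated learning progress."""
    progress += 1                        # the word the user just finished
    record = records[word_sequence[progress]]
    record.learned = True
    record.last_learned_at = datetime.now()
    return progress                      # determines the next target word
```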
According to one embodiment of the present invention, the server 140 may determine, from the learning order of the words in the set, the user's learning progress on the set, and the per-unit-time new word amount set by the user, the target words to be learned in the current unit time together with their data, the associated example sentences corresponding to those target words together with their data, and the other words of the set contained in those sentences together with their data, and send all of these to the client 120. The client 120 then displays each target word and its associated example sentence in turn, following the learning order of the target words.
It should be noted that the word learning method according to the embodiments of the invention is directed at the first-time learning of new words and does not concern the review of already-learned words. The "target word" throughout refers to a new word the user has not yet learned; accordingly, learning a target word means learning it for the first time, and the learning order of the words in a set is their first-learning order.
How the learning order of the words in a word set and the associated example sentence corresponding to each word are determined is described in detail below with reference to FIG. 6.
FIG. 6 illustrates an interactive flow diagram of a word learning method 600 according to one embodiment of the invention. The word learning method 600 is adapted to be executed in the word learning system 100.
As shown in fig. 6, the word learning method 600 starts at step S610. In step S610, the client 120 may receive the word set selected by the user and/or the new word learning amount per unit time set by the user.
The client 120 may provide a user interface for the user to select the word set to learn, for example an IELTS/TOEFL word set, a CET-4/CET-6 word set, a TV-series/movie word set, or another suitable word set. The client 120 may also provide a user interface for the user to input the per-unit-time new word amount, i.e., the number of new words to learn per unit time, such as a daily or weekly new word amount.
The client 120 transmits the word set selected by the user and/or the new word learning amount per unit time input by the user to the server 140, and the server 140 configures the learning order of each word in the word set and the associated illustrative sentence for the user in step S620.
In step S630, the server 140 may acquire a plurality of associated example sentences corresponding to the word set, where an associated example sentence contains at least two words (i.e., a plurality of words) from the set. In some embodiments, the associated example sentences may be obtained from a source matching the type of the word set. For example, for an examination-type word set, the associated example sentences may be taken from real questions of past examinations; for a movie/TV-series word set, from the corresponding films or episodes or their subtitle text.
Then, in step S640, the server 140 may configure the learning order of each word in the word set and determine the associated illustrative sentence corresponding to each word based on the word set selected by the user and the plurality of associated illustrative sentences corresponding to the word set.
Specifically, a word relation graph may be constructed based on the word set and the associated example sentences corresponding to it. The word relation graph takes the words of the word set as nodes and the associated example sentences containing a word as the attributes of its node; an edge between two nodes indicates that there exists an associated example sentence containing the words corresponding to both nodes.
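As an illustration only, the graph construction can be sketched with networkx (an assumed dependency; any graph library would do), with whitespace tokenization as a simplification:

```python
from itertools import combinations
import networkx as nx

def build_word_relation_graph(word_set: set[str], sentences: list[str]) -> nx.Graph:
    """Nodes are the words of the set; each node's 'sentences' attribute holds
    the indices of the associated example sentences containing that word; an
    edge links two words that co-occur in at least one sentence."""
    g = nx.Graph()
    for w in word_set:
        g.add_node(w, sentences=set())
    for idx, sentence in enumerate(sentences):
        words_in_sentence = {w for w in word_set if w in sentence.split()}
        for w in words_in_sentence:
            g.nodes[w]["sentences"].add(idx)     # sentence index as node attribute
        g.add_edges_from(combinations(words_in_sentence, 2))
    return g
```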
After the word relation graph is constructed, a plurality of maximal cliques in the graph that satisfy an attribute constraint may be obtained. The attribute constraint requires that all nodes of a maximal clique share at least one common attribute, i.e., that the words corresponding to all nodes of the clique appear together in at least one associated example sentence.
For example, assume the word relation graph contains maximal cliques Q1 = (r1, r2, r3) and Q2 = (r1, r2, r4). Associated example sentences S1 and S2 both contain the word corresponding to node r1, so node r1 has attribute values S1 and S2. Associated example sentence S2 contains the word corresponding to node r2, so node r2 has attribute value S2. Associated example sentences S1 and S3 both contain the word corresponding to node r3, so node r3 has attribute values S1 and S3. Associated example sentences S2 and S3 both contain the word corresponding to node r4, so node r4 has attribute values S2 and S3. Clearly, the words corresponding to the nodes of Q1 never appear together in one associated example sentence, so its nodes share no common attribute and Q1 does not satisfy the attribute constraint. The words corresponding to the nodes of Q2 all appear in associated example sentence S2, so its nodes share the common attribute S2 and Q2 satisfies the attribute constraint.
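Continuing the sketch above, the constrained cliques can be obtained with networkx's maximal-clique enumeration (nx.find_cliques implements Bron-Kerbosch); this is illustrative, not the patent's prescribed implementation:

```python
import networkx as nx

def constrained_maximal_cliques(g: nx.Graph):
    """Return (clique_words, common_sentence_ids) pairs for every maximal
    clique whose nodes share at least one common sentence attribute."""
    result = []
    for clique in nx.find_cliques(g):
        # intersection of the sentence sets of all nodes in the clique
        common = set.intersection(*(g.nodes[w]["sentences"] for w in clique))
        if common:   # all words of the clique appear together in one sentence
            result.append((clique, common))
    return result
```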
Based on the obtained maximal cliques, the learning order of each word in the word set can be configured and the associated example sentence corresponding to each word determined.
Preferably, according to an embodiment of the present invention, so that the proportion of words the user learns through associated example sentences is as high as possible within a predetermined period (for example, the period after user registration), words belonging to maximal cliques with more nodes may be placed earlier in the learning order and words belonging to maximal cliques with fewer nodes later. That is, the learning order of the words of a clique with many nodes may precede that of the words of a clique with few nodes.
For example, the obtained maximal cliques may be traversed in descending order of node count, with the words of earlier-traversed cliques placed earlier in the learning order and the words of later-traversed cliques placed later. Within each traversed clique, the learning order of its words may be configured randomly: one word is picked at random and placed first among the clique's words, another is picked and placed second, and so on.
For each word in a traversed clique, before a learning order is assigned it may first be checked whether the word has already been assigned one (a previously traversed clique may also contain the word); if so, the word is skipped, otherwise the learning order is assigned. At the same time, the associated example sentence corresponding to the common attribute of the nodes of the clique currently being traversed (which contains the word) is determined as the word's associated example sentence. In this way, the words of the set contained in one sentence are linked through that associated example sentence, making word learning easier and more effective.
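A sketch of this ordering pass, under the assumption that cliques arrive as the (words, common_sentence_ids) pairs produced above; words outside every constrained clique would need a fallback the patent does not spell out:

```python
import random

def configure_learning_order(cliques):
    """Traverse cliques from most to fewest nodes, randomly order the words
    within each clique, skip already-ordered words, and attach one common
    sentence as each word's associated example sentence."""
    ordered, sentence_of = [], {}
    for words, common in sorted(cliques, key=lambda c: len(c[0]), reverse=True):
        shuffled = list(words)
        random.shuffle(shuffled)                 # random order within the clique
        for w in shuffled:
            if w in sentence_of:                 # ordered via an earlier clique
                continue
            ordered.append(w)
            sentence_of[w] = next(iter(common))  # a common-attribute sentence
    return ordered, sentence_of
```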
It should be noted that the above only gives one specific example of a configuration learning sequence, and that a person skilled in the art can envisage various ways for configuring the learning sequence, all of which are within the scope of protection of the present invention, based on the above examples.
After the learning order of the words in the set has been configured, the estimated learning time of each word can be determined from its learning order and the per-unit-time new word amount set by the user, and the learning state of each word can be initialized to to-be-learned.
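For illustration, with the unit time assumed to be one day, the estimated learning time follows directly from a word's position in the order:

```python
from datetime import date, timedelta

def estimated_learning_dates(ordered_words: list[str], words_per_day: int,
                             start: date | None = None) -> dict[str, date]:
    """Map each word to its estimated learning date: with n new words per
    day, the i-th word (0-based) lands on day i // n after the start."""
    start = start or date.today()
    return {w: start + timedelta(days=i // words_per_day)
            for i, w in enumerate(ordered_words)}
```

For example, with three new words per day, the fourth word in the learning order is estimated for the second day.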
Fig. 7 shows a schematic diagram of a word learning device 700 according to one embodiment of the invention. As shown in fig. 7, the word learning device 700 resides in the client 120 and may include a communication module 710 and a display module 720.
The communication module 710 is adapted to obtain a target word in the set of words currently to be learned, the target word being determined based on a user's progress in learning the set of words and a learning order of words in the set of words. The display module 720 is coupled to the communication module 710 and is adapted to display the acquired target word and to display a related example sentence corresponding to the target word, the related example sentence including at least the target word and other words in the set of words.
The word learning device 700 may further comprise an interaction module 730 (not shown in FIG. 7) adapted to receive user input, for example the user's selection of a word set or setting of the per-unit-time new word amount.
For detailed processing logic and implementation of the modules in the word learning device 700, reference is made to the related descriptions of the word learning methods 300 and 600 in conjunction with fig. 1-6, and no further description is given here.
Fig. 8 shows a schematic diagram of a word learning device 800 according to one embodiment of the invention. As shown in fig. 8, the word learning apparatus 800 resides in the server 140 and may include a communication module 810, an example sentence acquisition module 820, and a sequential configuration module 830.
The communication module 810 is adapted to receive a set of words selected by a user. The example sentence obtaining module 820 is coupled to the communication module 810 and is adapted to obtain a plurality of example related sentences corresponding to the word set, where the example related sentences include a plurality of words in the word set. The sequence configuration module 830 is coupled to the illustrative sentence acquisition module 820, and is adapted to configure a learning sequence of each word in the word set based on the word set and a plurality of associated illustrative sentences corresponding to the word set, and determine associated illustrative sentences corresponding to each word.
For detailed processing logic and implementation of the modules in the word learning device 800, reference is made to the related descriptions of the word learning methods 300 and 600 in conjunction with fig. 1-6, and no further description is given here.
In summary, according to the word learning scheme of the embodiments of the invention, displaying the associated example sentences of a target word lets the user incidentally learn other words associated with it (for example, reviewing recently learned words and previewing words yet to be learned), which improves the user's retention of words and thus learning efficiency. Configuring the learning order of the words based on maximal cliques maximizes, as far as possible, the number of words the user learns through associated example sentences, helping the user master more words in a short time.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present embodiments, or certain aspects or portions of the methods and apparatus of the present embodiments, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, U-drives, floppy diskettes, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing embodiments of the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the embodiments of the invention according to instructions in the program code stored in the memory.
By way of example, and not limitation, readable media comprise readable storage media and communication media. The readable storage medium stores information such as computer readable instructions, data structures, program modules, or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
The invention may further include:
A6. The method of A5, wherein the entry data includes at least one of: the paraphrase, examination frequency, pronunciation, and example sentences of the word, and the paraphrases of the example sentences.
B8. The method of B7, wherein the step of acquiring a plurality of associated example sentences corresponding to the word set comprises: acquiring the associated example sentences from a corresponding source based on the type of the word set.
B9. The method of B7, wherein configuring the learning order of each word in the word set and determining the associated example sentence corresponding to each word based on the word set and the plurality of associated example sentences comprises: constructing a word relation graph based on the word set and the plurality of associated example sentences, the word relation graph taking the words of the word set as nodes and the associated example sentences containing a word as the attributes of its node, an edge between two nodes indicating that there exists an associated example sentence containing the words corresponding to both nodes; acquiring a plurality of maximal cliques in the word relation graph that satisfy an attribute constraint, the attribute constraint requiring that all nodes of a maximal clique share at least one common attribute; and configuring the learning order of each word in the word set and determining the associated example sentence corresponding to each word based on the acquired maximal cliques.
B10. The method of B9, wherein the learning order of the words of a maximal clique with more nodes precedes the learning order of the words of a maximal clique with fewer nodes.
B11. The method of B10, wherein configuring the learning order of each word in the word set based on the acquired maximal cliques comprises: traversing the maximal cliques in descending order of node count; and placing the words of earlier-traversed cliques earlier in the learning order and the words of later-traversed cliques later.
B12. The method of B11, wherein determining the associated example sentence corresponding to each word based on the acquired maximal cliques comprises: for each word in a traversed maximal clique, determining the associated example sentence corresponding to the common attribute of the nodes of that clique as the associated example sentence of the word.
B13. The method of B7, further comprising: determining the estimated learning time of each word in the word set based on the learning order of the words and the per-unit-time new word amount set by the user.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with examples of embodiments of the invention. The required structure for a construction of such a system is apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It should be appreciated that the teachings of embodiments of the present invention described herein may be implemented in a variety of programming languages, and the above descriptions of specific languages are provided for disclosure of implementation of embodiments of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., an embodiment of the invention that is claimed, requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into a plurality of sub-modules.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of embodiments of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Furthermore, some of the above embodiments are described herein as methods or combinations of method elements that may be implemented by a processor of a computer system or by other means of performing the above described functions. Thus, a processor with the necessary instructions for implementing the above-described method or method element forms a means for implementing the method or method element. Furthermore, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is for carrying out the functions performed by the elements for carrying out the objects of the invention.
As used herein, unless otherwise specified the use of the ordinal terms "first," "second," "third," etc., to describe a general object merely denote different instances of like objects, and are not intended to imply that the objects so described must have a given order, either temporally, spatially, in ranking, or in any other manner.
While embodiments of the invention have been described with respect to a limited number of embodiments, those skilled in the art, having benefit of the above disclosure, will appreciate that other embodiments are contemplated within the scope of the embodiments of the invention described thereby. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The disclosure of embodiments of the invention is intended to be illustrative, but not limiting, of the scope of embodiments of the invention, which is set forth in the following claims.
Claims (6)
1. A method of word learning, comprising:
receiving a word set selected by a user;
acquiring a plurality of associated example sentences corresponding to the word set, wherein the associated example sentences comprise a plurality of words in the word set; and
based on the word set and the plurality of associated example sentences, configuring the learning order of each word in the word set and determining the associated example sentence corresponding to each word, comprising: constructing a word relation graph based on the word set and the plurality of associated example sentences, wherein the word relation graph takes the words of the word set as nodes and takes the associated example sentences containing a word as the attributes of its node, and an edge between two nodes indicates that there exists an associated example sentence containing the words corresponding to both nodes; acquiring a plurality of maximal cliques in the word relation graph that satisfy an attribute constraint, wherein the attribute constraint indicates that all nodes of a maximal clique have at least one common attribute, all nodes having at least one common attribute representing that the words corresponding to all nodes of the maximal clique appear together in at least one associated example sentence; and configuring the learning order of each word in the word set and determining the associated example sentence corresponding to each word based on the acquired plurality of maximal cliques;
wherein configuring the learning order of each word in the word set based on the acquired plurality of maximal cliques comprises: traversing the plurality of maximal cliques in descending order of node count; and configuring the learning order of the words included in earlier-traversed maximal cliques to be earlier and the learning order of the words included in later-traversed maximal cliques to be later; and
wherein determining the associated example sentence corresponding to each word based on the acquired plurality of maximal cliques comprises: for each word in a traversed maximal clique, determining the associated example sentence corresponding to the common attribute of the nodes of that traversed maximal clique as the associated example sentence corresponding to the word.
2. The method of claim 1, wherein the step of acquiring a plurality of associated example sentences corresponding to the word set comprises:
acquiring the associated example sentences from a corresponding source based on the type of the word set.
3. The method of claim 1, further comprising:
determining the estimated learning time of each word in the word set based on the learning order of the words in the word set and the per-unit-time new word amount set by the user.
4. A word learning device, comprising:
the communication module is used for receiving the word set selected by the user;
the example sentence acquisition module is used for acquiring a plurality of associated example sentences corresponding to the word set, wherein the associated example sentences comprise a plurality of words in the word set; and
the sequence configuration module is configured to configure the learning order of each word in the word set based on the word set and the plurality of associated example sentences and to determine the associated example sentence corresponding to each word, comprising: constructing a word relation graph based on the word set and the plurality of associated example sentences, wherein the word relation graph takes the words of the word set as nodes and takes the associated example sentences containing a word as the attributes of its node, and an edge between two nodes indicates that there exists an associated example sentence containing the words corresponding to both nodes; acquiring a plurality of maximal cliques in the word relation graph that satisfy an attribute constraint, wherein the attribute constraint indicates that all nodes of a maximal clique have at least one common attribute, all nodes having at least one common attribute representing that the words corresponding to all nodes of the maximal clique appear together in at least one associated example sentence; and configuring the learning order of each word in the word set and determining the associated example sentence corresponding to each word based on the acquired plurality of maximal cliques;
wherein configuring the learning order of each word in the word set based on the acquired plurality of maximal cliques comprises: traversing the plurality of maximal cliques in descending order of node count; and configuring the learning order of the words included in earlier-traversed maximal cliques to be earlier and the learning order of the words included in later-traversed maximal cliques to be later; and
wherein determining the associated example sentence corresponding to each word based on the acquired plurality of maximal cliques comprises: for each word in a traversed maximal clique, determining the associated example sentence corresponding to the common attribute of the nodes of that traversed maximal clique as the associated example sentence corresponding to the word.
5. A word learning system, comprising:
a client, the client hosting a word learning device, the word learning device comprising: the communication module is used for acquiring a target word to be learned currently in the word set, wherein the target word is determined based on the learning progress of a user on the word set and the learning sequence of each word in the word set; the display module is used for displaying the acquired target word and also used for displaying a related example sentence corresponding to the target word, wherein the related example sentence at least comprises the target word and other words in the word set; and
a server on which the word learning apparatus according to claim 4 resides.
6. A computing device, comprising:
one or more processors; and
a memory;
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing any of the word learning methods of claims 1-3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010182902.7A CN113408275B (en) | 2020-03-16 | 2020-03-16 | Word learning method, device, system and computing equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010182902.7A CN113408275B (en) | 2020-03-16 | 2020-03-16 | Word learning method, device, system and computing equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113408275A (en) | 2021-09-17
CN113408275B (en) | 2023-10-20
Family
ID=77676593
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010182902.7A Active CN113408275B (en) | 2020-03-16 | 2020-03-16 | Word learning method, device, system and computing equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113408275B (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190080626A1 (en) * | 2017-09-14 | 2019-03-14 | International Business Machines Corporation | Facilitating vocabulary expansion |
- 2020-03-16: Application CN202010182902.7A filed in China; granted as patent CN113408275B (status: Active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012118640A (en) * | 2010-11-30 | 2012-06-21 | Casio Comput Co Ltd | Example sentence book preparation device and example sentence book preparation program |
KR101142231B1 (en) * | 2011-06-15 | 2012-05-07 | 한민석 | Vocabulary learning apparatus and method thereof |
KR20140101548A (en) * | 2013-02-12 | 2014-08-20 | 주홍찬 | Apparatus and method for learning word by using link example sentence. |
KR20140142552A (en) * | 2013-06-04 | 2014-12-12 | 이상현 | Example Sentence Providing System, Terminal and Method based on Studied Words |
JP2015045904A (en) * | 2013-08-27 | 2015-03-12 | 株式会社リコー | Information processing apparatus and method |
Non-Patent Citations (1)
Title |
---|
Design and Implementation of an Android-Based Word Learning System; Xu Fenfen; China Master's Theses Full-text Database (2013, No. S2); see sections 2.2, 4.1-4.5, 5.2.6-5.2.8, 5.3, and 6 *
Also Published As
Publication number | Publication date |
---|---|
CN113408275A (en) | 2021-09-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108027873B (en) | Interacting with an assistant component based on captured stroke information | |
CN111552880B (en) | Knowledge graph-based data processing method and device, medium and electronic equipment | |
CN108845806B (en) | Applet distributing method, device, server and storage medium | |
EP1986175A2 (en) | Method, interface and system for obtaining user input | |
CN108958731B (en) | Application program interface generation method, device, equipment and storage medium | |
US20170132198A1 (en) | Provide interactive content generation for document | |
US11769013B2 (en) | Machine learning based tenant-specific chatbots for performing actions in a multi-tenant system | |
DE102014101026A1 (en) | Stylus shorthand | |
DE102014101042A1 (en) | Modifying a stylus input or response using an inferred motion | |
WO2020052060A1 (en) | Method and apparatus for generating correction statement | |
CN113408275B (en) | Word learning method, device, system and computing equipment | |
CN112883218A (en) | Image-text combined representation searching method, system, server and storage medium | |
CN113592430B (en) | Schedule management method, schedule management device, electronic equipment and storage medium | |
CN109331469A (en) | The method and electronic equipment of a kind of typing of language based on programming outpost information | |
US20210034946A1 (en) | Recognizing problems in productivity flow for productivity applications | |
KR101891754B1 (en) | Device and method of providing for learning application, and server of providing for learning content | |
US12242742B1 (en) | Storing data in a digital assistant | |
US20250087367A1 (en) | Method, apparatus, device and storage medium for information interaction | |
WO2024240145A1 (en) | Method and apparatus for application processing, and device and storage medium | |
WO2025091733A1 (en) | Method for interaction with digital assistant, apparatus, device and storage medium | |
CN119234235A (en) | Method, apparatus, device and medium for generating a response | |
CN117435711A (en) | Intelligent question-answering service processing method, electronic equipment and storage medium | |
CN119011520A (en) | Method, apparatus, device and storage medium for message processing | |
CN120631216A (en) | Interface interaction method, device, equipment and storage medium | |
JP2025104395A (en) | Information processing device, method, program, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||