
WO2019130333A1 - Learning device and method thereof - Google Patents

Learning device and method thereof

Info

Publication number
WO2019130333A1
Authority
WO
WIPO (PCT)
Prior art keywords
page
book
user
code
learning device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IN2018/050468
Other languages
French (fr)
Inventor
Samir Jain
Anubha Chaurasia
Mithun Kumar Biswas
Uday Kumar SWARNAPURI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bodhaguru Learning Private Ltd
Original Assignee
Bodhaguru Learning Private Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bodhaguru Learning Private Ltd filed Critical Bodhaguru Learning Private Ltd
Publication of WO2019130333A1 publication Critical patent/WO2019130333A1/en
Anticipated expiration legal-status: Critical
Current legal status: Ceased


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02 Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Definitions

  • the present disclosure generally relates to educational devices and, more particularly but not exclusively, to a learning device and a method of facilitating learning using an interactive paper book.
  • Paper-based learning materials such as books are the most common learning aid used in schools across the world. However, for learning a language in the early grades, regular practice is the most important requirement, and paper-based learning material presents a severe limitation, as the student may not be able to comprehend and read the text written in it.
  • Current personal computing devices, when used in conjunction with the Internet, provide vast opportunities for learning and teaching. However, early grade learners find such computing devices difficult to use, and the devices strain the eyes because of screen resolution and exposure to radiation from the display. Hence there is a need for a customized learning methodology with engaging activities to facilitate learning and inculcate self-learning in children.
  • Some existing learning devices use audio books to provide an appropriate response to user interaction. Because such audio books are printed worksheets, this approach may not be feasible for enabling continuous learning, and the response time of the devices to user interactions may be long.
  • Embodiments of the present disclosure relate to a learning device.
  • the device comprises a processor and a user interface unit.
  • the user interface unit comprises at least one layer with at least one user touch control located at a first position in the at least one layer. Each user touch control is configured to generate an output signal in response to activation of the at least one user touch control.
  • the user interface unit further comprises a holder overlying the user interface unit to receive a paper book that comprises at least one page. Each page of the paper book, identified by a book code and a page code, comprises at least one visual touch indicator located in a second position. Pressing at least one visual touch indicator on the paper book dynamically activates the at least one user touch control in the first position based on the page code, the book code and the second position of the at least one visual touch indicator.
  • the present disclosure relates to a method of facilitating learning.
  • the method comprises receiving an input of pressing at least one visual touch indicator on at least one page of the paper book from a user.
  • the at least one visual touch indicator on the at least one page of the paper book is located in a second position corresponding to at least one user touch control located at a first position of at least one layer of the user interface.
  • the method identifies the book code using the second position of the at least one visual touch indicator that is pressed on the first page of the paper book.
  • the method also identifies the page code using the second position of the at least one visual touch indicator that is pressed on any page of the paper book.
  • the at least one user touch control in the first position is activated using at least the page code, book code and the second position of the at least one visual touch indicator.
  • a unique output signal is generated.
  • Figure 1 depicts an exemplary architecture of a system for facilitating learning in accordance with some embodiments of the present disclosure
  • Figure 1a depicts exemplary layers of the learning device of Figure 1 in accordance with some embodiments of the present disclosure
  • Figure 1b depicts an exemplary first layer of the learning device in accordance with some embodiments of the present disclosure
  • Figure 1c depicts an exemplary user touch control layer of the learning device in accordance with some embodiments of the present disclosure
  • Figure 2 illustrates an exemplary flowchart showing a method for facilitating learning in accordance with some embodiments of the present disclosure
  • Figures 2a, 2b, 2c and 2d depict exemplary pages of the interactive paper book in accordance with some embodiments of the present disclosure.
  • Figure 3 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
  • the present disclosure relates to a learning device for facilitating learning for early grade learners.
  • the device is a portable electronic device capable of enhancing language reading and learning skills using a paper-based learning material.
  • the device may comprise a user interface unit with multiple layers of conductive traces separated by a spacer with one layer having multiple user touch controls located at different positions. Each user touch control is configured to generate an output signal to the processor in response to activation of the user touch controls by a user.
  • the user interface unit also comprises a holder onto which a user may place the learning material or a paper-based book containing various learning and assessment activities.
  • the book comprises at least one page, and each page may comprise visual touch indicators located in positions that correspond to the positions of the user touch controls of the user interface unit.
  • the user may begin the learning session by pressing a start visual touch indicator located at another position on the topmost layer of the user interface unit, which corresponds to the touch control associated with the start visual touch indicator.
  • the corresponding start user touch control is activated generating a start output signal indicating the beginning of interaction of a new learning material with the learning device.
  • the user places the paper-based learning book over the book holdable area of the user interface unit and may press at least one visual touch indicator on the book that has a unique position on the first page of the book, which identifies the unique book code.
  • the user then presses the visual touch indicator on the specific location on each page of the book, which identifies the page code for the identified book code.
  • the user touch controls associated with all the visual touch indicators for that page of the book are activated to enable the user to learn the content in that page of the book by pressing the visual touch indicators of the corresponding page on the book.
  • a unique output signal is generated.
  • the system may also comprise a user device coupled with the learning device to assess performance of the learner.
  • the user device comprises an integrated application to receive the user interaction data from the learning device and generate performance reports based on the received data.
  • the response time of the learning device in generating the output signals is reduced, with more efficient processing and lower memory and power usage.
  • different language books can be supported that are read from left to right and also from right to left.
  • Figure 1 depicts an exemplary architecture of a system for facilitating learning in accordance with some embodiments of the present disclosure.
  • the exemplary system 100 comprises one or more components configured to facilitate interactive learning for early grade learners.
  • the exemplary system 100 comprises a learning device 102, a data repository 103, and one or more user devices 104-1, 104-2, ... 104-N (collectively referred to as user device 104) connected via a communication network 108.
  • the communication network 108 may include, without limitation, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol, Bluetooth), the Internet and other configurations.
  • the learning device 102 is a compact case made of user friendly material.
  • Figure 1a illustrates the internal architecture of the learning device 102.
  • the learning device 102 comprises a user interface unit 110 comprising multiple layers.
  • the user interface unit 110 may comprise a plurality of layers, with a first layer 111 at the top capable of receiving a learning material overlying the first layer 111.
  • the first layer 111 is provided with a holder having one or more holding pins 112 as illustrated in Figure 1b.
  • the one or more holding pins 112 may be used to hold the learning material when placed over the first layer 111.
  • the user interface unit 110 also comprises a second layer (alternatively referred to as the user touch control layer) 114 beneath the first layer 111, which comprises a membrane keypad with one or more user touch controls (alternatively referred to as keys) arranged in rows and columns in a list of first positions 115 as illustrated in Figure 1c.
  • the first layer 111 of the user interface unit 110 also comprises a start visual touch indicator 116 as illustrated in Figure 1a, which corresponds to a start user touch control 117 of Figure 1c in a third position of the second layer 114.
  • the first layer of the user interface unit 110 may also comprise an alphanumeric or numeric keypad to enable the user to enter a book code manually by pressing the numbers on the keypad.
  • the user interface unit 110 may also comprise a touch conducting layer 118 below the user touch control layer 114 that receives the touch or press trigger from the upper layers of the user interface unit 110.
  • the touch conducting layer 118 connects to a processor 119 through a wire connector.
  • the processor 119 is configured to map list of second positions 124 of visual touch indicators on paper book with the list of first position 115 of user touch controls on the user touch control layer 114.
  • the user interface unit 110 also comprises at least one intermediary layer between the first, second and touch conducting layers to enable and disable the connectivity among the layers of the learning device 102 based on the user touch.
  • the at least one intermediary layer may be a glue layer that combines the first, the second and the touch conducting layers with spacing between them.
  • the learning material or paper book may be at least a spiral-bound or stapled book with at least one page.
  • Each book is assigned a unique book code 121.
  • each page of the book is assigned a page code 122 to uniquely identify the page and its associated learning content.
  • each page has at least one visual touch indicator on the book in a second position 124 corresponding to the at least one user touch control in the first position 115 on the user touch control layer 114.
  • the first page of the book comprises at least one visual touch indicator which when pressed enables the device to identify the book code 121.
  • each page of the book comprises at least one visual touch indicator on a unique second position to identify the page code 122 of the corresponding page.
  • each page of the book has a visual touch indicator at a fixed second position for the user to go to the previous or the next page, so as to select the page code 122 of the corresponding page.
  • the second position 124 of the visual touch indicators is determined based on the first position 115 of user touch controls.
  • the visual touch indicator may be a symbolic indicator indicating the presence of user touch control underneath the symbol which is configured to generate an output signal 126 in response to pressing of visual touch indicator on the page.
  • the data repository 103 stores book related data that include metadata about the paper-based book (alternatively referred to as the book).
  • the data repository 103 may store for each book, the book code 121, at least one page code 122, at least one output signal 126 and other book related data for each page of each book.
  • the data repository 103 also stores the list of first position 115 of user touch controls, the list of second position 124 of the visual touch indicators for each page and other data 128 related to each book.
  • the book code 121 and the page code 122 may be unique identification codes comprising, for example, alphanumeric characters to uniquely identify the book and the page respectively.
  • the first position 115 is defined as the location of at least one user touch control on the learning device 102.
  • the list of second position 124 for each page of the book indicates the location of at least one visual touch indicator on the page of the book, wherein the second position 124 is enabled for the first position 115 of the at least one user touch control of the learning device 102.
  • the output signal 126 may be a user response signal, such as playing an audio clip, blinking an LED or displaying a pattern on an LCD, that is to be generated in response to the pressing of a visual touch indicator on at least one page by the user.
  • the data repository 103 also stores the other data 128, which includes further information about each book, including the type of book (right-to-left reading book or left-to-right reading book), the type of page identification method in the book, and the type of content (for example flash card, read-aloud story, multiple choice questions, match the following or any other form of engaging activity), along with the list of the first positions 115 of the user touch controls that are to be dynamically activated in response to the pressing of the at least one visual touch indicator located at the second position 124 for each corresponding page code 122.
  • the data repository 103 may be integrated within the learning device 102.
  • the data repository 103 may be a standalone repository communicatively coupled with the learning device 102 over the communication network 108.
  • the data repository 103 may be a cloud-based storage repository from which data is retrieved by the learning device 102.
  • the user device 104 may be a mobile device, a portable computer or a computing device including the functionality for communicating over the network 108.
  • the mobile device can be a conventional web-enabled personal computer in the home, mobile computer (laptop, notebook, or subnotebook), Smart Phone (iPhone, Android), VOIP device, television set-top box, interactive kiosk, personal digital assistant, wireless electronic mail device, tablet computer or another device capable of communicating over the Internet or other appropriate communications network.
  • the user device 104 may comprise an integrated application that is configured to receive user interaction data from the learning device 102 and assess the performance of the student throughout the interactive session by generating performance reports.
  • the integrated application may also be a local application residing on the learning device 102.
  • the learning device 102 is configured to facilitate interactive learning for early grade learners.
  • the learning device 102 comprises at least the processor 119 and the user interface unit 110.
  • the learning device 102 further comprises an interactive unit 131, an identification unit 132, an activation unit 133, a response unit 134 and an assessment unit 135.
  • the learning device 102 may include other modules or functionality units to perform various miscellaneous functionalities of the learning device 102. It will be appreciated that such aforementioned components or units may be represented as a single unit or a combination of different units.
  • the modules may be implemented in the form of software executed by a processor, hardware and/or firmware.
  • the interactive unit 131 is configured to receive input from the user and transfer the input to the lower layers of the user interface unit 110.
  • the input from the user may be, for example, pressing of the start visual touch indicator 116 as given in Figure 1a, which indicates the start of the learning activity.
  • the user may place the book over the topmost layer of the user interface unit 110 secured by holding pins and may press at least one visual touch indicator on a first page of the book to identify the book code 121 of the book.
  • the interactive unit 131 receives the input from the user and transfers the input from the first layer 111 of the user interface unit 110 to the touch conducting layer 118.
  • the identification unit 132 determines the book code 121 when the user presses on the at least one visual touch indicator on the first page of the book.
  • the identification unit 132 determines the page code 122 of the page when a user touches on the at least one visual touch indicator located at the unique second position 124 on the page. In one embodiment, the identification unit 132 continuously determines the page code 122 of a current page by determining the second position 124 of the at least one visual touch indicator that is pressed and the page code of previous or next page of the book. In one embodiment, the data repository 103 may store for each page, mapping of page code 122 with the second position 124, and mapping of second position 124 of visual touch indicators with the corresponding first position 115 of user touch controls.
  • the activation unit 133 further determines the first position 115 of the user touch controls associated with the visual touch indicators of the identified page of the corresponding book and thereby activates the user touch controls in the first position 115 using at least the page code 122, the book code 121 and the second position 124 of the visual touch indicators.
  • Upon activation of the user touch controls associated with the current page, the response unit 134 generates the corresponding output signal 126 as a response when each visual touch indicator on the book is pressed.
  • the response unit 134 determines the first position 115 of the user touch control of the corresponding visual touch indicator that is pressed and using the repository 103, the response unit 134 further determines the visual touch indicator present in the second position 124. The response unit 134 further generates the output signal 126 configured as output in response to the pressing of the visual touch indicator on the second position 124 for the corresponding book code 121 and the page code 122.
  • the learning device 102 may be connected to an audio speaker that is configured to output an audio response when visual touch indicators are pressed.
  • the learning device may be connected to an LED, an LCD or any other output system to show an engaging response to the user.
  • the user may connect the learning device 102 with the user device 104.
  • learning device 102 may connect with the user device 104 using direct interconnection, local area network (LAN), wide area network (WAN), wireless network such as Bluetooth, the Internet, etc.
  • the assessment unit 135 connects the learning device 102 with the integrated application of the user device 104 and enables transferring of a user interaction/usage data and corresponding book’s data to the user device 104 for evaluation.
  • the assessment unit 135 monitors user interaction with the user interface unit 110 of the learning device 102, stores the monitored user interaction in the data repository 103 and dynamically sends the user interaction data to the user device 104 each time the user presses at least one visual touch indicator 220 on at least one page of the book.
  • the assessment unit 135 further enables the integrated application of the user device 104 to assess the performance of the user upon completion of learning session based on the user interaction data and to generate performance reports.
  • Figure 2 illustrates an exemplary flowchart showing a method for facilitating learning in accordance with some embodiments of the present disclosure
  • the method 200 comprises one or more blocks implemented by the processor 119 for facilitating learning.
  • the method 200 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
  • the order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 200. Additionally, individual blocks may be deleted from the method 200 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 200 can be implemented in any suitable hardware, software, firmware, or combination thereof.
  • the input may comprise pressing the start visual touch indicator 116 on the upper layer 111 of the user interface unit 110, which corresponds to the start user touch control 117 located at the third position.
  • the interactive unit 131 receives this start input from the user and generates the start output signal indicating the beginning of a learning session with the learning device 102. Further, the input includes pressing at least one visual touch indicator located in the second position 124 of a page of the book. The interactive unit 131 receives the input and transfers the input to the lower layers of the user interface unit 110. The input from the user is used to identify the book code 121 and the page code 122 at block 204.
  • the book code 121 and the page code 122 are identified.
  • the identification unit 132 determines the unique book code 121 and the page code 122 based on the input received from the user.
  • the user may place the book over the holder of the user interface unit 110 and may press at least one visual touch indicator on the first page of the book to identify the book code 121 of the book, as shown in Figure 2a.
  • the symbols 211 for example, indicated as apple and banana in Figure 2a are the visual touch indicators which when pressed identifies the book code 121 of the book.
  • the identification unit 132 determines the book code 121 when the user presses on the at least one visual touch indicator on the first page of the book.
  • the identification unit 132 determines the page code 122 of the page when a user touches on the at least one visual touch indicator located at the unique second position on the page.
  • the unique visual touch indicator for page identification may be indicated as GO symbol 218 as illustrated in Figure 2b.
  • the last column of user touch controls on the user touch control layer 114 may be dedicated to page identification, and the visual touch indicators 220 are arranged accordingly on each page.
  • the identification unit 132 determines the page code 122 of a current page by determining the second position 124 of the at least one visual touch indicator that is pressed and the page code 122 of the previous or next page of the book, when the user presses the next or previous symbol indicator 221 as illustrated in Figure 2c and Figure 2d.
  • the identification unit 132 also interprets the type of book based on the book code, for example a right-to-left reading book as illustrated in Figure 2d or a left-to-right reading book as illustrated in Figure 2c. Based on identification of the page of the book, the user touch controls for the corresponding page of the book are activated using the data in the data repository 103 (a simplified sketch of this navigation and direction-dependent mapping appears after this list).
  • the at least one user touch control is activated for a given page of the book.
  • the activation unit 133 activates the user touch controls corresponding to all the visual indicators associated with the identified page for the given book.
  • visual touch indicators 220 may be indicated as volume symbols in the second position 124 as shown in Figure 2b, Figure 2c and Figure 2d.
  • the activation unit 133 determines the second position 124 of all the visual touch indicators associated with the identified page.
  • the activation unit 133 further determines the first position 115 of the user touch controls associated with the visual touch indicators 220 of the identified page.
  • the activation unit 133 activates the user touch controls using at least the page code 122, the book code 121 and the second position 124 of the visual touch indicators 220.
  • the output signal 126 is generated when the user presses one of the visual touch indicators on the book.
  • the response unit 134 generates the output signal 126 corresponding to pressing on each visual touch indicator 220 of the identified page.
  • the response unit 134 determines the output signal 126 by extracting the information from data repository 103 based on the page code 122, the book code 121 and the second position 124 of the at least one visual touch indicator as pressed by the user.
  • the output signal 126 may be an audio signal played through the speaker connected to the learning device 102.
  • the output signal 126 may be the blinking of an LED or the display of a pattern on an LCD connected to the learning device 102.
  • the response may be a mix of playing an audio clip, blinking an LED, displaying a pattern on an LCD or any other suitable engaging response to the user.
  • FIG. 3 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
  • the computer system 302 may be a learning device 102, which is used for facilitating learning for early grade learners.
  • the computer system 302 may include a central processing unit (“CPU” or “processor”) 304.
  • the processor 304 may comprise at least one data processor for executing program components for executing user or system-generated business processes.
  • the processor 304 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
  • the processor 304 may be disposed in communication with one or more input/output (I/O) devices (306 and 308) via I/O interface 310.
  • the I/O interface 310 may employ communication protocols/methods such as, without limitation, audio, analog, digital, stereo, IEEE 1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System for Mobile Communications (GSM), Long-Term Evolution (LTE) or the like), etc.
  • the computer system 302 may communicate with one or more I/O devices (306 and 308).
  • the processor 304 may be disposed in communication with a communication network 312 via a network interface 314.
  • the network interface 314 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc.
  • the communication network 312 can be implemented as one of the several types of networks, such as intranet or any such wireless network interfaces.
  • the communication network 312 may either be a dedicated network or a shared network, which represents an association of several types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other.
  • the communication network 312 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.
  • the processor 304 may be disposed in communication with a memory 316 (e.g., RAM 318 and ROM 320), as shown in Figure 3, via a storage interface 322.
  • the storage interface 322 may connect to memory 316 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE 1394, Universal Serial Bus (USB), Fibre Channel, Small Computer Systems Interface (SCSI), etc.
  • the memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
  • the memory 316 may store a collection of program or database components, including, without limitation, user/application data 324, an operating system 326, a web browser 328, a mail client 330, a mail server 332, a user interface 334, and the like.
  • computer system 302 may store user/application data 324, such as the data, variables, records, etc. as described in this invention.
  • databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
  • the operating system 326 may facilitate resource management and operation of the computer system 302.
  • Examples of operating systems include, without limitation, Apple Macintosh™ OS X™, UNIX™, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD™, NetBSD™, OpenBSD™, etc.), Linux distributions (e.g., Red Hat™, Ubuntu™, K-Ubuntu™, etc.), International Business Machines (IBM™) OS/2™, Microsoft Windows™ (XP™, Vista/7/8, etc.), Apple iOS™, Google Android™, Blackberry™ Operating System (OS), or the like.
  • a user interface may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities.
  • GUIs may provide computer interaction interface elements on a display system operatively connected to the computer system 302, such as cursors, icons, check boxes, menus, windows, widgets, etc.
  • Graphical User Interfaces may be employed, including, without limitation, Apple™ Macintosh™ operating systems' Aqua™, IBM™ OS/2™, Microsoft™ Windows™ (e.g., Aero, Metro, etc.), Unix X-Windows™, web interface libraries (e.g., ActiveX, Java, JavaScript, AJAX, HTML, Adobe Flash, etc.), or the like.
  • a computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
  • a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein.
  • the term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., are non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
  • the present disclosure provides a learning device and a method thereof.
  • the present disclosure uses simple learning material such as a paper book. As a book comprises multiple pages that can be identified uniquely, the present disclosure enables continuous learning instead of requiring each page to be replaced individually on the learning device.
  • the present disclosure also supports easy reading for visually impaired children by supporting braille-encoded paper books as the learning material. As the user presses the touch control using just a finger (instead of using any form of pen), the learning device can also be used by visually impaired students to learn from braille-encoded paper books.
  • the learning device of the present disclosure also supports learning of multiple languages. The reading direction may differ across languages; for example, English is read from left to right, whereas Urdu is read from right to left.
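The following is a minimal illustrative sketch, not the claimed implementation, of the page-navigation and direction-dependent mapping behaviour described in the list above (the next/previous symbol indicator 221 and the support for left-to-right and right-to-left books). The function names, the (row, column) position encoding and the column-mirroring rule for right-to-left books are assumptions introduced only for illustration.

```python
def map_second_to_first(second_position, columns, reading_direction):
    """Map a visual touch indicator position on the page (second position 124)
    to a user touch control position on the keypad (first position 115).

    For a left-to-right book the two positions coincide; for a right-to-left
    book the column index is mirrored so the same keypad serves pages whose
    content runs in the opposite direction.  This mirroring rule is an
    illustrative assumption, not the mapping claimed in the disclosure.
    """
    row, col = second_position
    if reading_direction == "rtl":
        col = columns - 1 - col
    return (row, col)


def resolve_navigation(page_codes, current, pressed):
    """Return the page code selected by a 'next' or 'previous' press
    (symbol indicator 221); stay on the current page at either end."""
    step = 1 if pressed == "next" else -1
    index = page_codes.index(current) + step
    return page_codes[index] if 0 <= index < len(page_codes) else current


print(map_second_to_first((2, 1), columns=5, reading_direction="rtl"))   # (2, 3)
print(resolve_navigation(["PG-01", "PG-02", "PG-03"], "PG-02", "next"))  # PG-03
```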

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The present disclosure relates to a learning device and method for facilitating learning for early grade learners. The learning device comprises a user interface unit with multiple layers, one layer having multiple user touch controls located at different positions. A user may place a paper book containing various learning and assessment activities on the learning device and may interact with the book using the user interface unit. Each page of the paper book comprises visual touch indicators located in positions corresponding to the positions of the user touch controls. The device identifies a book code and a page code when the user presses at least one visual touch indicator on a page. Upon identification, the user touch controls corresponding to all the visual touch indicators of the identified page are activated and, when a corresponding visual touch indicator is pressed, a unique response is generated. Thus, an effective learning device with an interactive paper book is provided.

Description

Title: LEARNING DEVICE AND METHOD THEREOF
TECHNICAL FIELD
The present disclosure generally relates to educational devices and, more particularly but not exclusively, to a learning device and a method of facilitating learning using an interactive paper book.
BACKGROUND
Learning to read is a very important skill for early grade students, so that they can read to learn various subjects in the higher classes. Paper-based learning materials such as books are the most common learning aid used in schools across the world. However, for learning a language in the early grades, regular practice is the most important requirement, and paper-based learning material presents a severe limitation, as the student may not be able to comprehend and read the text written in it. Current personal computing devices, when used in conjunction with the Internet, provide vast opportunities for learning and teaching. However, early grade learners find such computing devices difficult to use, and the devices strain the eyes because of screen resolution and exposure to radiation from the display. Hence there is a need for a customized learning methodology with engaging activities to facilitate learning and inculcate self-learning in children. Some existing learning devices use audio books to provide an appropriate response to user interaction. Because such audio books are printed worksheets, this approach may not be feasible for enabling continuous learning, and the response time of the devices to user interactions may be long.
SUMMARY
One or more shortcomings of the prior art are overcome and additional advantages are provided through the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
Embodiments of the present disclosure relate to a learning device. The device comprises a processor and a user interface unit. The user interface unit comprises at least one layer with at least one user touch control located at a first position in the at least one layer. Each user touch control is configured to generate an output signal in response to activation of the at least one user touch control. The user interface unit further comprises a holder overlying the user interface unit to receive a paper book that comprises at least one page. Each page of the paper book, identified by a book code and a page code, comprises at least one visual touch indicator located in a second position. Pressing at least one visual touch indicator on the paper book dynamically activates the at least one user touch control in the first position based on the page code, the book code and the second position of the at least one visual touch indicator.
Further, the present disclosure relates to a method of facilitating learning. The method comprises receiving, from a user, an input of pressing at least one visual touch indicator on at least one page of the paper book. The at least one visual touch indicator on the at least one page is located in a second position corresponding to at least one user touch control located at a first position of at least one layer of the user interface. The method identifies the book code using the second position of the at least one visual touch indicator that is pressed on the first page of the paper book. The method also identifies the page code using the second position of the at least one visual touch indicator that is pressed on any page of the paper book. In response to the pressing of at least one visual touch indicator on the paper book, the at least one user touch control in the first position is activated using at least the page code, the book code and the second position of the at least one visual touch indicator. Upon activation of the at least one user touch control, a unique output signal is generated.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
Figure 1 depicts an exemplary architecture of a system for facilitating learning in accordance with some embodiments of the present disclosure;
Figure 1a depicts exemplary layers of the learning device of Figure 1 in accordance with some embodiments of the present disclosure;
Figure 1b depicts an exemplary first layer of the learning device in accordance with some embodiments of the present disclosure;
Figure 1c depicts an exemplary user touch control layer of the learning device in accordance with some embodiments of the present disclosure;
Figure 2 illustrates an exemplary flowchart showing a method for facilitating learning in accordance with some embodiments of the present disclosure;
Figures 2a, 2b, 2c and 2d depict exemplary pages of the interactive paper book in accordance with some embodiments of the present disclosure; and
Figure 3 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the specific forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises... a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
The present disclosure relates to a learning device for facilitating learning for early grade learners. The device is a portable electronic device capable of enhancing language reading and learning skills using a paper-based learning material. The device may comprise a user interface unit with multiple layers of conductive traces separated by a spacer, with one layer having multiple user touch controls located at different positions. Each user touch control is configured to generate an output signal to the processor in response to activation of the user touch controls by a user. The user interface unit also comprises a holder onto which a user may place the learning material or a paper-based book containing various learning and assessment activities. The book comprises at least one page, and each page may comprise visual touch indicators located in positions that correspond to the positions of the user touch controls of the user interface unit. The user may begin the learning session by pressing a start visual touch indicator located at another position on the topmost layer of the user interface unit, which corresponds to the touch control associated with the start visual touch indicator. In response, the corresponding start user touch control is activated, generating a start output signal indicating the beginning of interaction of a new learning material with the learning device. The user places the paper-based learning book over the book holdable area of the user interface unit and may press at least one visual touch indicator on the book that has a unique position on the first page of the book, which identifies the unique book code. The user then presses the visual touch indicator at the specific location on each page of the book, which identifies the page code for the identified book code. Upon identification of the book code and the page code, the user touch controls associated with all the visual touch indicators for that page of the book are activated to enable the user to learn the content on that page of the book by pressing the visual touch indicators of the corresponding page on the book. On pressing each visual touch indicator at its second position on the book, a unique output signal is generated.
In one embodiment, the system may also comprise a user device coupled with the learning device to assess the performance of the learner. The user device comprises an integrated application to receive the user interaction data from the learning device and generate performance reports based on the received data. As the user touch controls are activated dynamically in response to the corresponding visual touch indicators activated by the user, the response time of the learning device in generating the output signals is reduced, with more efficient processing and lower memory and power usage. Also, by easily mapping the user touch controls on the user interface unit to the interpretation of the second positions on the book, different language books can be supported that are read from left to right as well as from right to left. Thus, the present disclosure enables user-friendly interactive learning and assessment of early grade learners using a simple, low-complexity learning device.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
Figure 1 depicts an exemplary architecture of a system for facilitating learning in accordance with some embodiments of the present disclosure. As shown in Figure 1, the exemplary system 100 comprises one or more components configured to facilitate interactive learning for early grade learners. In one embodiment, the exemplary system 100 comprises a learning device 102, a data repository 103, and one or more user devices 104-1, 104-2, ... 104-N (collectively referred to as user device 104) connected via a communication network 108. The communication network 108 (alternatively referred to as the network) may include, without limitation, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol, Bluetooth), the Internet and other configurations.
The learning device 102, as illustrated in Figure 1, is a compact case made of user-friendly material. Figure 1a illustrates the internal architecture of the learning device 102. The learning device 102 comprises a user interface unit 110 comprising multiple layers. In one example, the user interface unit 110 may comprise a plurality of layers, with a first layer 111 at the top capable of receiving a learning material overlying the first layer 111. The first layer 111 is provided with a holder having one or more holding pins 112 as illustrated in Figure 1b. The one or more holding pins 112 may be used to hold the learning material when placed over the first layer 111. The user interface unit 110 also comprises a second layer (alternatively referred to as the user touch control layer) 114 beneath the first layer 111, which comprises a membrane keypad with one or more user touch controls (alternatively referred to as keys) arranged in rows and columns in a list of first positions 115 as illustrated in Figure 1c. The first layer 111 of the user interface unit 110 also comprises a start visual touch indicator 116 as illustrated in Figure 1a, which corresponds to a start user touch control 117 of Figure 1c in a third position of the second layer 114. In another embodiment, the first layer of the user interface unit 110 may also comprise an alphanumeric or numeric keypad to enable the user to enter a book code manually by pressing the numbers on the keypad. Further, the user may also manually input student data, such as a student identification number or roll number, using the keypad on the first layer. The user interface unit 110 may also comprise a touch conducting layer 118 below the user touch control layer 114 that receives the touch or press trigger from the upper layers of the user interface unit 110. The touch conducting layer 118 connects to a processor 119 through a wire connector. The processor 119 is configured to map the list of second positions 124 of visual touch indicators on the paper book to the list of first positions 115 of user touch controls on the user touch control layer 114. The user interface unit 110 also comprises at least one intermediary layer between the first, second and touch conducting layers to enable and disable the connectivity among the layers of the learning device 102 based on the user touch. In one example, the at least one intermediary layer may be a glue layer that combines the first, the second and the touch conducting layers with spacing between them.
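To make the layered arrangement above easier to follow, here is a minimal, hedged sketch of how the membrane keypad and its first positions 115 might be modelled in software. It is an illustration only; the class names, the (row, column) encoding and the choice of which key is reserved for the start control are assumptions, not details taken from the disclosure.

```python
from dataclasses import dataclass

# A "first position" on the user touch control layer 114 is modelled here as a
# (row, column) cell of the membrane keypad; the names below are illustrative.
@dataclass(frozen=True)
class FirstPosition:
    row: int
    column: int

@dataclass
class KeypadLayout:
    rows: int
    columns: int
    start_position: FirstPosition   # cell assumed to hold the start user touch control 117

    def is_valid(self, pos: FirstPosition) -> bool:
        # True if the pressed cell lies within the keypad grid.
        return 0 <= pos.row < self.rows and 0 <= pos.column < self.columns

    def is_start(self, pos: FirstPosition) -> bool:
        # True if the pressed cell is the reserved start control.
        return pos == self.start_position

# Example: a 6 x 5 keypad whose top-left key is reserved as the start control.
layout = KeypadLayout(rows=6, columns=5, start_position=FirstPosition(0, 0))
pressed = FirstPosition(2, 4)
print(layout.is_valid(pressed), layout.is_start(pressed))   # True False
```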
The learning material or paper book (alternatively referred to as the book) may be at least a spiral-bound or stapled book with at least one page. Each book is assigned a unique book code 121. Also, each page of the book is assigned a page code 122 to uniquely identify the page and its associated learning content. In one embodiment, each page has at least one visual touch indicator on the book in a second position 124 corresponding to the at least one user touch control in the first position 115 on the user touch control layer 114. In one embodiment, for each book, the first page of the book comprises at least one visual touch indicator which, when pressed, enables the device to identify the book code 121. In one embodiment, each page of the book comprises at least one visual touch indicator at a unique second position to identify the page code 122 of the corresponding page. In another embodiment, each page of the book has a visual touch indicator at a fixed second position for the user to go to the previous or the next page, so as to select the page code 122 of the corresponding page. The second position 124 of the visual touch indicators is determined based on the first position 115 of the user touch controls. The visual touch indicator may be a symbolic indicator indicating the presence of a user touch control underneath the symbol, which is configured to generate an output signal 126 in response to the pressing of the visual touch indicator on the page.
The data repository 103 stores book related data that include metadata about the paper-based book (alternatively referred to as the book). In one embodiment, the data repository 103 may store, for each book, the book code 121, at least one page code 122, at least one output signal 126 and other book related data for each page of each book. The data repository 103 also stores the list of first positions 115 of user touch controls, the list of second positions 124 of the visual touch indicators for each page and other data 128 related to each book. The book code 121 and the page code 122 may be unique identification codes comprising, for example, alphanumeric characters to uniquely identify the book and the page respectively. The first position 115 is defined as the location of at least one user touch control on the learning device 102. The list of second positions 124 for each page of the book indicates the location of at least one visual touch indicator on the page of the book, wherein the second position 124 is enabled for the first position 115 of the at least one user touch control of the learning device 102. The output signal 126 may be a user response signal, such as playing an audio clip, blinking an LED or displaying a pattern on an LCD, that is to be generated in response to the pressing of a visual touch indicator on at least one page by the user. The data repository 103 also stores the other data 128, which includes further information about each book, including the type of book (right-to-left reading book or left-to-right reading book), the type of page identification method in the book, and the type of content (for example flash card, read-aloud story, multiple choice questions, match the following or any other form of engaging activity), along with the list of the first positions 115 of the user touch controls that are to be dynamically activated in response to the pressing of the at least one visual touch indicator located at the second position 124 for each corresponding page code 122. In one embodiment, the data repository 103 may be integrated within the learning device 102. In another embodiment, the data repository 103 may be a standalone repository communicatively coupled with the learning device 102 over the communication network 108. In one embodiment, the data repository 103 may be a cloud-based storage repository from which data is retrieved by the learning device 102.
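As a hedged illustration of the kind of metadata the data repository 103 might hold (book code, page codes, indicator positions and output signals), the following Python sketch defines one possible schema. The field names, the (row, column) position encoding and the sample entries are assumptions for illustration only, not the structure claimed in the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Position = Tuple[int, int]   # (row, column); used for both first and second positions

@dataclass
class PageRecord:
    page_code: str
    # second position of each visual touch indicator -> description of output signal 126
    output_signals: Dict[Position, str] = field(default_factory=dict)
    # second position -> first position of the user touch control to activate
    control_map: Dict[Position, Position] = field(default_factory=dict)

@dataclass
class BookRecord:
    book_code: str
    reading_direction: str                  # "ltr" or "rtl" (part of other data 128)
    content_type: str                       # e.g. "read-aloud story", "flash card"
    pages: Dict[str, PageRecord] = field(default_factory=dict)

# Illustrative repository: one book with one page and two indicators.
repository: Dict[str, BookRecord] = {
    "BK-001": BookRecord(
        book_code="BK-001",
        reading_direction="ltr",
        content_type="read-aloud story",
        pages={
            "PG-01": PageRecord(
                page_code="PG-01",
                output_signals={(1, 2): "play audio clip 'apple.wav'",
                                (1, 3): "blink LED 2"},
                control_map={(1, 2): (1, 2), (1, 3): (1, 3)},
            )
        },
    )
}

print(repository["BK-001"].pages["PG-01"].output_signals[(1, 2)])
```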
The user device 104 may be a mobile device, a portable computer or a computing device including the functionality for communicating over the network 108. For example, the user device 104 can be a conventional web-enabled personal computer in the home, a mobile computer (laptop, notebook or subnotebook), a smartphone (iPhone, Android), a VOIP device, a television set-top box, an interactive kiosk, a personal digital assistant, a wireless electronic mail device, a tablet computer or another device capable of communicating over the Internet or other appropriate communications network. In one embodiment, the user device 104 may comprise an integrated application that is configured to receive user interaction data from the learning device 102 and assess the performance of the student throughout the interactive session by generating performance reports. The integrated application may also be a local application residing on the learning device 102.
In operation, the learning device 102 is configured to facilitate interactive learning for early grade learners. The learning device 102 comprises at least the processor 119 and the user interface unit 110. The learning device 102 further comprises an interactive unit 131, an identification unit 132, an activation unit 133, a response unit 134 and an assessment unit 135. The learning device 102 may include other modules or functionality units to perform various miscellaneous functionalities of the learning device 102. It will be appreciated that such aforementioned components or units may be represented as a single unit or a combination of different units. The modules may be implemented in the form of software executed by a processor, hardware and/or firmware.
The interactive unit 131 is configured to receive input from the user and transfer the input to the lower layers of the user interface unit 110. The input from the user may be, for example, pressing of the start visual touch indicator 116 as given in Figure 1a, which indicates the start of the learning activity. The user may place the book over the topmost layer of the user interface unit 110, secured by holding pins, and may press at least one visual touch indicator on a first page of the book to identify the book code 121 of the book. The interactive unit 131 receives the input from the user and transfers the input from the first layer 111 of the user interface unit 110 to the touch conducting layer 118. The identification unit 132 determines the book code 121 when the user presses the at least one visual touch indicator on the first page of the book. Further, the identification unit 132 determines the page code 122 of the page when the user touches the at least one visual touch indicator located at the unique second position 124 on the page. In one embodiment, the identification unit 132 continuously determines the page code 122 of a current page by determining the second position 124 of the at least one visual touch indicator that is pressed and the page code of the previous or next page of the book. In one embodiment, the data repository 103 may store, for each page, a mapping of the page code 122 with the second position 124, and a mapping of the second position 124 of the visual touch indicators with the corresponding first position 115 of the user touch controls. The activation unit 133 further determines the first position 115 of the user touch controls associated with the visual touch indicators of the identified page of the corresponding book and thereby activates the user touch controls in the first position 115 using at least the page code 122, the book code 121 and the second position 124 of the visual touch indicators. Upon activation of the user touch controls associated with the current page, the response unit 134 generates the corresponding output signal 126 as a response when each visual touch indicator on the book is pressed. In one embodiment, when the user presses the visual touch indicator located in the second position 124 on the book, the response unit 134 determines the first position 115 of the user touch control corresponding to the visual touch indicator that is pressed and, using the repository 103, the response unit 134 further determines the visual touch indicator present in the second position 124. The response unit 134 further generates the output signal 126 configured as output in response to the pressing of the visual touch indicator on the second position 124 for the corresponding book code 121 and the page code 122.
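The cooperation of the interactive unit 131, identification unit 132, activation unit 133 and response unit 134 can be summarized as a single touch-event handler. The sketch below is a simplified illustration only, assuming the dictionary layout of the earlier repository sketch; the helper maps, positions and codes (BOOK_FIRST_PAGE_MAP, PAGE_ID_MAP, "BK01" and so on) are hypothetical and not defined by the disclosure.

```python
# Sketch of the identify -> activate -> respond flow. The repository below is
# a trimmed copy of the earlier sketch; all names and positions are examples.

BOOK_REPOSITORY = {
    "BK01": {
        "pages": {
            "P01": {(2, 3): {"first_position": (2, 3),
                             "output_signal": {"audio": "apple.mp3"}}},
            "P02": {(1, 5): {"first_position": (1, 5),
                             "output_signal": {"lcd_pattern": "star"}}},
        },
    },
}
BOOK_FIRST_PAGE_MAP = {(0, 0): "BK01"}                    # first-page indicator -> book code 121
PAGE_ID_MAP = {"BK01": {(0, 7): "P01", (1, 7): "P02"}}    # page-identification indicators

class LearningSession:
    """Sketch of the cooperation of units 132, 133 and 134."""

    def __init__(self, repository, first_page_map, page_id_map):
        self.repository = repository
        self.first_page_map = first_page_map
        self.page_id_map = page_id_map
        self.book_code = None
        self.page_code = None
        self.active_controls = {}     # first position 115 -> output signal 126

    def handle_touch(self, position):
        """Process one press forwarded by the interactive unit 131."""
        if self.book_code is None:
            # Identification unit 132: the first-page indicator identifies the book.
            self.book_code = self.first_page_map.get(position)
            return None
        page_ids = self.page_id_map.get(self.book_code, {})
        if position in page_ids:
            # Identification unit 132: a dedicated indicator identifies the page.
            self.page_code = page_ids[position]
            indicators = self.repository[self.book_code]["pages"][self.page_code]
            # Activation unit 133: enable only this page's user touch controls.
            self.active_controls = {entry["first_position"]: entry["output_signal"]
                                    for entry in indicators.values()}
            return None
        # Response unit 134: emit the output signal of an activated control, if any.
        return self.active_controls.get(position)

# Example usage:
# session = LearningSession(BOOK_REPOSITORY, BOOK_FIRST_PAGE_MAP, PAGE_ID_MAP)
# session.handle_touch((0, 0))   # identifies book "BK01"
# session.handle_touch((0, 7))   # identifies page "P01" and activates its controls
# session.handle_touch((2, 3))   # returns {"audio": "apple.mp3"}
```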
In one aspect, the learning device 102 may be connected to an audio speaker that is configured to output an audio response upon pressing of the visual touch indicators. In another embodiment, the learning device may be connected to an LED, an LCD or any other output system to show an engaging response to the user.
Upon completion of the learning session, the user may connect the learning device 102 with the user device 104. In one embodiment, the learning device 102 may connect with the user device 104 using a direct interconnection, a local area network (LAN), a wide area network (WAN), a wireless network such as Bluetooth, the Internet, etc.
In one embodiment, the assessment unit 135 connects the learning device 102 with the integrated application of the user device 104 and enables transfer of user interaction/usage data and the corresponding book's data to the user device 104 for evaluation. In one embodiment, the assessment unit 135 monitors user interaction with the user interface unit 110 of the learning device 102, stores the monitored user interaction data in the data repository 103 and dynamically sends the user interaction data to the user device 104 each time the user presses at least one visual touch indicator 220 on at least one page of the book. The assessment unit 135 further enables the integrated application of the user device 104 to assess the performance of the user upon completion of the learning session based on the user interaction data and to generate performance reports.
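The user interaction data gathered by the assessment unit 135 may be pictured as a simple event log that is serialized for the integrated application. The sketch below uses only the Python standard library; the event fields and the plain JSON payload are assumptions made for illustration rather than a format defined by the disclosure.

```python
import json
import time

class AssessmentUnit:
    """Sketch of the assessment unit 135: record presses, then export them."""

    def __init__(self):
        self.events = []

    def record_press(self, book_code, page_code, second_position):
        # Called each time the user presses a visual touch indicator 220.
        self.events.append({
            "timestamp": time.time(),
            "book_code": book_code,
            "page_code": page_code,
            "second_position": list(second_position),
        })

    def export_for_user_device(self):
        # Serialized payload sent to the integrated application on the user
        # device 104 (for example over Bluetooth or USB) for report generation.
        return json.dumps({"interaction_data": self.events})
```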
Figure 2 illustrates an exemplary flowchart showing a method for facilitating learning in accordance with some embodiments of the present disclosure.
As illustrated in Figure 2, the method 200 comprises one or more blocks implemented by the processor 119 for facilitating learning. The method 200 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
The order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 200. Additionally, individual blocks may be deleted from the method 200 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 200 can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 202, input from the user is received. In one embodiment, the input may comprise pressing of the start user touch control 117 located at a third position on the upper layer 111 of the user interface unit 110. In an embodiment, the interactive unit 131 receives the input of pressing of the start user touch control 117 from the user and generates the start output signal indicating the beginning of the learning session with the learning device 102. Further, the input includes pressing of at least one visual touch indicator located in the second position 124 on a page of the book. The interactive unit 131 receives the input and transfers the input to the lower layers of the user interface unit 110. The input from the user is used to identify the book code 121 and the page code 122 at block 204.
At block 204, the book code 121 and the page code 122 are identified. In one embodiment, the identification unit 132 determines the unique book code 121 and the page code 122 based on the input received from the user. In one example, the user may place the book over the holder of the user interface unit 110 and may press at least one visual touch indicator on the first page of the book to identify the book code 121 of the book, as shown in Figure 2a. The symbols 211, indicated for example as an apple and a banana in Figure 2a, are the visual touch indicators which, when pressed, identify the book code 121 of the book. The identification unit 132 determines the book code 121 when the user presses the at least one visual touch indicator on the first page of the book. Further, the identification unit 132 determines the page code 122 of the page when the user touches the at least one visual touch indicator located at the unique second position on the page. The unique visual touch indicator for page identification may be indicated as a GO symbol 218 as illustrated in Figure 2b. As shown in Figure 2b, the last column of user touch controls on the user touch control layer 114 may be dedicated to page identification, and the visual touch indicators 220 are arranged correspondingly on each page. In one embodiment, the identification unit 132 determines the page code 122 of a current page by determining the second position 124 of the at least one visual touch indicator that is pressed and the page code 122 of the previous or next page of the book, when the user presses the next or previous symbol indicator 221 as illustrated in Figure 2c and Figure 2d. In another embodiment, the identification unit 132 also interprets the type of book, for example a right-to-left reading book as illustrated in Figure 2d and a left-to-right reading book as illustrated in Figure 2c, based on the book code. Based on identification of the page of the book, the user touch controls for the corresponding page of the book are activated using the data in the data repository 103.
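As one way of picturing the next/previous page identification method of block 204, the sketch below steps through page codes stored in reading order whenever a fixed navigation indicator is pressed. The page order, the navigation positions and the function name are illustrative assumptions only; for a right-to-left reading book the roles of the two fixed positions would simply be swapped.

```python
# Sketch of page-code resolution for the next/previous navigation method of
# block 204. PAGE_ORDER and the fixed navigation positions are assumptions.

PAGE_ORDER = {"BK01": ["P01", "P02", "P03"]}    # page codes in reading order
NEXT_POSITION = (7, 7)                          # fixed second position of the "next" symbol
PREVIOUS_POSITION = (7, 0)                      # fixed second position of the "previous" symbol

def resolve_page_code(book_code, current_page_code, pressed_position):
    """Return the new page code 122 after a next/previous indicator press."""
    pages = PAGE_ORDER[book_code]
    index = pages.index(current_page_code)
    if pressed_position == NEXT_POSITION and index + 1 < len(pages):
        return pages[index + 1]
    if pressed_position == PREVIOUS_POSITION and index > 0:
        return pages[index - 1]
    return current_page_code                    # unchanged if no navigation applies

# Example: resolve_page_code("BK01", "P01", (7, 7)) returns "P02".
```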
At block 206, the at least one user touch control is activated for a given page of the book. In one embodiment, the activation unit 133 activates the user touch controls corresponding to all the visual touch indicators associated with the identified page of the given book. In one example, on each page, the visual touch indicators 220 may be indicated as volume symbols in the second position 124 as shown in Figure 2b, Figure 2c and Figure 2d. The activation unit 133 determines the second position 124 of all the visual touch indicators associated with the identified page. The activation unit 133 further determines the first position 115 of the user touch controls associated with the visual touch indicators 220 of the identified page. Upon determining the first position 115 of the user touch controls, the activation unit 133 activates the user touch controls using at least the page code 122, the book code 121 and the second position 124 of the visual touch indicators 220.
At block 208, the output signal 126 is generated when the user presses one of the visual touch indicators on the book. In one embodiment, the response unit 134 generates the output signal 126 corresponding to the pressing of each visual touch indicator 220 of the identified page. In one example, the response unit 134 determines the output signal 126 by extracting the information from the data repository 103 based on the page code 122, the book code 121 and the second position 124 of the at least one visual touch indicator pressed by the user. In one embodiment, the output signal 126 may be an audio signal played via the speaker connected to the learning device 102. In another embodiment, the output signal 126 may be blinking of an LED or displaying a pattern on an LCD connected to the learning device 102. In one example, the response may be a mix of playing an audio clip, blinking of an LED, displaying a pattern on an LCD or any other suitable engaging response to the user.
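The response generation of block 208 may be pictured as a small dispatcher that routes an output signal 126 to whichever output systems are connected. In the sketch below, play_audio, set_led and show_lcd_pattern are hypothetical driver hooks standing in for the actual speaker, LED and LCD interfaces, which are not specified here.

```python
# Sketch of output-signal generation (block 208). The three driver functions
# are hypothetical placeholders, not real device APIs.

def play_audio(clip_name):
    print(f"[audio] playing {clip_name}")

def set_led(mode):
    print(f"[led] {mode}")

def show_lcd_pattern(pattern):
    print(f"[lcd] showing {pattern}")

def generate_output_signal(output_signal):
    """Dispatch one output signal 126 to the connected output systems."""
    if "audio" in output_signal:
        play_audio(output_signal["audio"])
    if "led" in output_signal:
        set_led(output_signal["led"])
    if "lcd_pattern" in output_signal:
        show_lcd_pattern(output_signal["lcd_pattern"])

# Example of the mixed response mentioned above.
generate_output_signal({"audio": "banana.mp3", "led": "blink"})
```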
Figure 3 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system 302 may be a learning device 102, which is used for facilitating learning for early grade learners. The computer system 302 may include a central processing unit (“CPU” or“processor”) 304. The processor 304 may comprise at least one data processor for executing program components for executing user or system-generated business processes. The processor 304 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor 304 may be disposed in communication with one or more input/output (I/O) devices (306 and 308) via I/O interface 310. The I/O interface 310 may employ communication protocols/methods such as, without limitation, audio, analog, digital, stereo, IEEE 1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System for Mobile Communications (GSM), Long-Term Evolution (LTE) or the like), etc.
Using the I/O interface 310, the computer system 302 may communicate with one or more I/O devices (306 and 308). In some implementations, the processor 304 may be disposed in communication with a communication network 312 via a network interface 314. The network interface 314 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. Using the network interface 314 and the communication network 312, the computer system 302 may be connected to the data repository 103 and the user device 104.
The communication network 312 can be implemented as one of several types of networks, such as an intranet or any such wireless network. The communication network 312 may either be a dedicated network or a shared network, which represents an association of several types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 312 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc. In some embodiments, the processor 304 may be disposed in communication with a memory 316, e.g., RAM 318 and ROM 320, as shown in Figure 3, via a storage interface 322. The storage interface 322 may connect to the memory 316 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE 1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory 316 may store a collection of program or database components, including, without limitation, user/application data 324, an operating system 326, a web browser 328, a mail client 330, a mail server 332, a user interface 334, and the like. In some embodiments, the computer system 302 may store user/application data 324, such as the data, variables, records, etc. as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system 326 may facilitate resource management and operation of the computer system 302. Examples of operating systems include, without limitation, Apple Macintosh™ OS X™, UNIX™, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD™, NetBSD™, OpenBSD™, etc.), Linux distributions (e.g., Red Hat™, Ubuntu™, Kubuntu™, etc.), International Business Machines (IBM™) OS/2™, Microsoft Windows™ (XP™, Vista/7/8, etc.), Apple iOS™, Google Android™, Blackberry™ Operating System (OS), or the like. A user interface may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 302, such as cursors, icons, check boxes, menus, windows, widgets, etc. Graphical User Interfaces (GUIs) may be employed, including, without limitation, Apple™ Macintosh™ operating systems' Aqua™, IBM™ OS/2™, Microsoft™ Windows™ (e.g., Aero, Metro, etc.), Unix X-Windows™, web interface libraries (e.g., ActiveX, Java, JavaScript, AJAX, HTML, Adobe Flash, etc.), or the like.
The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words "comprising," "having," "containing," and "including," and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms“a,”“an,” and“the” include plural references unless the context clearly dictates otherwise.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., are non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
Advantages of the embodiment of the disclosure are illustrated herein.
The present disclosure provides a learning device and a method thereof.
In the present disclosure, based on the book code and the page code, a specific set of user touch controls is activated. This does not require polling each touch indicator individually to activate its corresponding user touch control. This reduces the time consumed in data retrieval, providing a user response in minimal time and reducing power consumption. The present disclosure uses simple learning material such as a paper book. As a book comprises multiple pages that can be uniquely identified, the present disclosure enables continuous learning instead of replacing each page individually on the learning device. The present disclosure also supports easy reading for visually impaired children by supporting braille-encoded paper books as the learning material. As the user presses the touch control using just a finger (instead of any form of pen), the learning device can also be used by visually impaired students to learn from braille-encoded paper books. The learning device of the present disclosure also supports learning of multiple languages. The reading direction may differ across languages; for example, English is read from left to right, whereas Urdu is read from right to left.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the embodiments of the disclosure is intended to be illustrative, but not limiting, of the scope of the disclosure.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.

Claims

1. A learning device, comprising:
a processor; and
a user interface unit comprising at least one layer comprising at least one user touch control located at a first position in the at least one layer, each user touch control being configured to generate an output signal in response to activation of the at least one user touch control;
wherein the user interface unit comprises a holder to receive a paper book overlying the user interface unit, the paper book comprising at least one page identified by a page code and a book code, each page comprising at least one visual touch indicator located in a second position of the paper book that is pressed to dynamically activate the at least one user touch control based on at least the page code, book code and the second position of the at least one visual touch indicator.
2. The learning device as claimed in claim 1, wherein the at least one page of the paper book comprises the second position of the at least one visual touch indicator corresponding to the first position of the at least one user touch control of the at least one layer of the user interface unit of the learning device.
3. The learning device as claimed in claim 1, further comprises a data repository configured to store:
for each paper book and for each page of the paper book,
the first position of the at least one user touch control that is previously mapped to be activated in response to pressing of the at least one visual touch indicator;
the corresponding book code;
the page code;
the second position of the at least one visual touch indicator; and
the corresponding output signal to be generated.
4. The learning device as claimed in claim 3, wherein the at least one user touch control is configured to generate a unique output signal using the book code, the page code and the second position of the at least one visual touch indicator on the paper book.
5. The learning device as claimed in claim 1, wherein the at least one layer of the user interface unit comprises a start touch control at a third position that is activated to generate a start output signal indicating the beginning of the interaction of user with the learning device using the paper book.
6. The learning device as claimed in claim 1, wherein the processor identifies the book code by determining the second position of the at least one visual touch indicator on a first page of the paper book that is pressed, wherein the at least one visual touch indicator has a unique second position in the first page of each book.
7. The learning device as claimed in claim 1, wherein the processor identifies the page code of a current page by determining the second position of the at least one visual touch indicator on the paper book that is pressed and the page code of previous page of the book.
8. The learning device as claimed in claim 1, wherein the processor identifies the page code by determining the second position of the at least one visual touch indicator of the book that is pressed, wherein the at least one visual touch indicator has a unique second position in each page of each book.
9. A method of facilitating learning, the method comprising:
receiving, by a user interface unit of a learning device, an input from a user of the learning device, wherein the input comprises pressing of at least one visual touch indicator on at least one page of a paper book, wherein the at least one page comprises the at least one visual touch indicator located in a second position corresponding to at least one user touch control located at a first position of at least one layer of the user interface;
identifying, by a processor of the learning device, a book code and a page code using the second position of the at least one visual touch indicator that is pressed;
activating, by the processor, the previously mapped at least one user touch control in the first position in response to pressing of the at least one visual touch indicator using at least the page code, book code and the second position of the at least one visual touch indicator; and
generating, by the processor, a unique output signal upon activation of at least one user touch control.
10. The method as claimed in claim 9, further comprising generating, by the processor, a start output signal in response to pressing of a start user touch control located on a third position of at least one layer by the user, wherein the start user touch control is activated to initiate the interaction with the learning device.
11. The method as claimed in claim 9, further comprising
identifying the book code using the second position of the at least one visual touch indicator on a first page of the paper book that is pressed;
identifying the page code of a current page of the book using the second position of the at least one visual touch indicator on each page of the paper book and the page code of previous page of the paper book; and
identifying the unique output signal using the book code, the page code and the second position of the at least one visual touch indicator on the paper book that is pressed.
12. The method as claimed in claim 11, wherein identifying the page code comprises the step of determining the second position of the at least one visual touch indicator of the paper book that is pressed, wherein the at least one visual touch indicator has a unique second position in each page of each paper book.
PCT/IN2018/050468 2017-12-28 2018-07-19 Learning device and method thereof Ceased WO2019130333A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201741047058 2017-12-28
IN201741047058 2017-12-28

Publications (1)

Publication Number Publication Date
WO2019130333A1 true WO2019130333A1 (en) 2019-07-04

Family

ID=67066732

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2018/050468 Ceased WO2019130333A1 (en) 2017-12-28 2018-07-19 Learning device and method thereof

Country Status (1)

Country Link
WO (1) WO2019130333A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120192057A1 (en) * 2011-01-24 2012-07-26 Migos Charles J Device, Method, and Graphical User Interface for Navigating through an Electronic Document
WO2012115758A2 (en) * 2011-02-24 2012-08-30 Google Inc. Instructor-curated electronic textbook systems and methods

Similar Documents

Publication Publication Date Title
US9324242B2 (en) Electronic book that can communicate directly with hardware devices via a keyboard API interface
US20170103667A1 (en) Customized tests that allow a teacher to choose a level of difficulty
US10546508B2 (en) System and method for automated literacy assessment
US9536438B2 (en) System and method for customizing reading materials based on reading ability
JP2013145265A (en) Server, terminal device for learning, and learning content management method
US11574558B2 (en) Game-based method for developing foreign language vocabulary learning application
JP6957803B2 (en) Learning support device
KR101050173B1 (en) Online reading learning training system and method
KR101080092B1 (en) Foreign language word learning method and foreign language learning device using same
JP7041958B2 (en) Education support system and education support method
WO2019130333A1 (en) Learning device and method thereof
JP6262948B2 (en) English grammar learning system
US12014649B2 (en) Book recommendation and flashcard generation
Ebner et al. Cloud-based service for eBooks using EPUB under the Aspect of Learning Analytics
JP7599648B2 (en) Information processing device, control method for information processing device, and control program for information processing device
JP6519249B2 (en) Answer support program, answer support device, and answer support method
KR102129725B1 (en) Teaching materials for study of English
Inie et al. Developing evaluation metrics for active reading support
JP7172218B2 (en) Entry support device and program
Wahyudi et al. Strategies in Coping with Problems Faced by University Students in Speaking Class Across Different Proficiency Levels
KR20200018054A (en) Teaching materials for study of English
Gohil et al. Use of hot potatoes software for language teaching and learning
KR20220153930A (en) Method and apparatus for providing contents for learning study of public administration through repetitive memorizing
JP2017191120A (en) Print creation system, print creation method, and print creation program
JP2025048586A (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18896190

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18896190

Country of ref document: EP

Kind code of ref document: A1