US20190065447A1 - Method of processing analog data and electronic device thereof
- Publication number
- US20190065447A1 (application US16/173,437)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- symbols
- handwritten
- text
- digital
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G06F17/2264—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/12—Detection or correction of errors, e.g. by rescanning the pattern
- G06V30/127—Detection or correction of errors, e.g. by rescanning the pattern with the intervention of an operator
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/12—Use of codes for handling textual entities
- G06F40/151—Transformation
-
- G06K9/033—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
- G06V10/987—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns with the intervention of an operator
-
- G06K2209/01—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
Definitions
- the present invention relates generally to a method of processing acquired data and an electronic device thereof, and more particularly, to a method of displaying digital and analog text on an electronic device.
- An electronic device provides various services such as, for example, a camera function, a data communication function, a moving picture reproduction function, an audio reproduction function, a messenger function, a schedule management function, an alarm function, and an audio communication function.
- Various programs are used in the electronic device that can utilize these functions.
- the electronic device may convert analog text included in an image photographed through a camera, analog text included in an image received through network communication, or analog text included in an image stored in a memory, into digital text.
- the electronic device may be unable to acquire digital text corresponding to the analog text and, as a result, may output information that does not correspond with information displayed in the original document, such as an error code or incorrectly converted digital text.
- an aspect of the present invention provides a method of processing data and an electronic device thereof that can provide clear information about the text of an acquired document even when a portion or all of the text included in the acquired document cannot be matched to digital text.
- Another aspect of the present invention provides a method of processing data and an electronic device thereof that can perform a function of the electronic device, through handwriting input to an acquired document, without direct manipulation.
- a method includes displaying, via a display of the electronic device, an electronic document including an image; identifying, from the electronic document, a plurality of handwritten inputs distinguished from a plurality of text words included in the electronic document; acquiring, from among the plurality of handwritten inputs, first handwritten symbols; identifying, from the electronic document, at least one text word corresponding to each of the first handwritten symbols; and in response to receiving a user input, displaying, via the display, the at least one text word as associated with first digital symbols corresponding to each of the first handwritten symbols, based on an order of the first handwritten symbols.
- an electronic device includes a display; and at least one processor coupled to the display, configured to: display, via the display, an electronic document including an image; identify, from the electronic document, a plurality of handwritten inputs distinguished from a plurality of text words included in the electronic document; acquire, from among the plurality of handwritten inputs, first handwritten symbols; identify, from the electronic document, at least one text word corresponding to each of the first handwritten symbols; and in response to receiving a user input, display, via the display, the at least one text word as associated with first digital symbols corresponding to each of the first handwritten symbols, based on an order of the first handwritten symbols.
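- The flow summarized above can be pictured with a short, purely illustrative sketch; the data model, overlap heuristic, and all names below are assumptions for illustration, not the claimed implementation:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HandwrittenSymbol:
    kind: str                        # e.g. "circle_number", "check_mark", "underline"
    order: int                       # sequence position among the handwritten inputs
    bbox: Tuple[int, int, int, int]  # (x, y, w, h) region in the document image

@dataclass
class TextWord:
    text: str
    bbox: Tuple[int, int, int, int]

def overlaps(a, b) -> bool:
    """True when two (x, y, w, h) rectangles intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return not (bx > ax + aw or bx + bw < ax or by > ay + ah or by + bh < ay)

def words_for_symbol(symbol: HandwrittenSymbol, words: List[TextWord]) -> List[TextWord]:
    """Identify the printed text words whose regions overlap the handwritten symbol."""
    return [w for w in words if overlaps(symbol.bbox, w.bbox)]

def build_ordered_view(symbols: List[HandwrittenSymbol], words: List[TextWord]):
    """List each symbol with its associated words, ordered by the symbols'
    own sequence, as in the display step described above."""
    view = []
    for symbol in sorted(symbols, key=lambda s: s.order):
        view.append((symbol.kind, [w.text for w in words_for_symbol(symbol, words)]))
    return view
```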
- FIG. 1 is a block diagram illustrating a configuration of an electronic device, according to an embodiment of the present invention
- FIGS. 2A and 2B illustrate writing information that an electronic device can recognize, according to an embodiment of the present invention
- FIGS. 3A and 3B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention
- FIGS. 4A and 4B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention
- FIGS. 5A and 5B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention
- FIGS. 6A and 6B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention
- FIGS. 7A to 7E illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- FIGS. 8A and 8B are flowcharts illustrating an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- When describing various embodiments of the present invention, an electronic device will be described based on a touch screen that can perform an input operation through an input device and a display operation through a display unit on a physical screen.
- the display unit may include the input device, or the input device may be represented with the display unit.
- Embodiments of the present invention are not limited only to an electronic device including a touch screen, and may be applied to various electronic devices in which the display unit and the input device are physically separated or that include only one of the display unit and the input device.
- An electronic device may be embodied as a mobile communication user device, a Personal Digital Assistant (PDA), a Personal Computer (PC), a laptop computer, a smart phone, a smart pad, a smart television, a Netbook, a Mobile Internet Device (MID), an Ultra Mobile PC (UMPC), a tablet PC, a mobile pad, a media player, a hand-held computer, a navigation device, a smart watch, a Head Mounted Display (HMD), and a Moving Picture Experts Group layer-3 (MP3) player.
- When it is described herein that an element is “connected” or “coupled” to another element, it should be understood that the element may be directly connected or coupled to the other element or electrically coupled to the other element through a third element. In contrast, when it is described that an element is “directly connected” or “directly coupled” to another element, it should be understood that there is no intermediate part between the two parts.
- FIG. 1 is a block diagram illustrating a configuration of an electronic device, according to an embodiment of the present invention.
- an electronic device 100 includes a memory 110 and a processor unit 120.
- the electronic device 100 also includes an input and output processor 130 , a display unit 131 , a touch input device 132 , an audio processor 140 , a communication system 150 as a peripheral device, and other peripheral devices.
- the memory 110 includes a program storage unit 111 that stores a program for controlling operation of the electronic device 100 , and a data storage unit 112 that stores data generated while performing a program.
- the data storage unit 112 may store data generated by a program during operation of a processor 122 of the processor unit 120.
- operation information that can control the electronic device 100 or other electronic devices may be previously determined and a database in a table or list form may be formed and stored.
- the electronic device 100 may convert text displayed in document information and acquired by an image sensor 160 to digital text, and store data of the converted digital text as a digital document at the data storage unit 112 .
- the electronic device 100 may store, at the data storage unit 112, data of text for comparison, such as a character, a font of the character, a symbol, and a figure, together with corresponding digital text data (a character, a font of the character, a symbol, and a figure).
- the electronic device 100 may store an image of an area that is not converted to digital text as image data at the data storage unit 112.
- the program storage unit 111 includes an Optical Character Recognition (OCR) program 115 , a handwriting processing program 116 , a communication control program 117 , and at least one application program 118 .
- Programs included in the program storage unit 111 are formed in a set of instructions and may be represented with an instruction set.
- the application program 118 may include a software element of at least one application program installed at the memory 110 of the electronic device 100 .
- the OCR program 115 may convert text information, such as a character, a symbol, and a figure displayed in document information and acquired through the image sensor 160 , to digital text and generate a digital document in the same form as, or a form similar to, that of text displayed in a document.
- the handwriting processing program 116 may work with the OCR program 115. According to an embodiment of the present invention, the handwriting processing program 116 may convert a symbol input in handwriting in acquired document information to a digital symbol according to a predetermined method, and apply a function or an effect corresponding to the determined digital symbol to the digital document.
- the electronic device 100 may output a digital document in which a function or an effect corresponding to a digital symbol is applied.
- the handwriting processing program 116 may copy an area that cannot be converted to digital text as image data, or may crop it with a method such as copy-and-paste. Cropping may be a method of acquiring a partial area or the entire area of image data.
- the handwriting processing program 116 may display image data of an area that cannot be converted to digital text at a predetermined position of the digital document.
- Acquired document information described in embodiments of the present invention may be a document that includes text information in an analog form rather than digital text (e.g., image information of a document photographed through the image sensor 160 of the electronic device 100, or image information of a document including text stored at a memory of the electronic device 100).
- the electronic device 100 may detect text from image information of a document and determine digital text matched to the same shape as, or a shape similar to, that of the detected text.
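- As a minimal sketch of this matching step only (not the patent's algorithm), the following assumes the stored database holds grayscale glyph templates as NumPy arrays of the same size as the detected glyph; the similarity measure and the threshold are illustrative assumptions:

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized grayscale patches."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def match_digital_text(glyph: np.ndarray, template_db: dict, threshold: float = 0.85):
    """Return the digital character whose stored template best matches the
    detected glyph, or None when nothing in the database is similar enough
    (the unmatched case is handled later by cropping the image region)."""
    best_char, best_score = None, -1.0
    for char, template in template_db.items():
        score = similarity(glyph, template)
        if score > best_score:
            best_char, best_score = char, score
    return best_char if best_score >= threshold else None
```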
- the communication control program 117 may include at least one software element for controlling communication with at least one other electronic device using the communication system 150 or the image sensor 160 .
- the communication control program 117 may search for another electronic device for a communication connection. When another electronic device for the communication connection is found, the communication control program 117 may set a connection for communication with the other electronic device. Thereafter, by performing a performance search and session setting procedure with the connected other electronic device, the communication control program 117 may control to transmit and receive data (e.g., packet data) to and from the other electronic device through the communication system 150 .
- the memory 110 included in the electronic device 100 may be one or more memory components. According to an embodiment of the present invention, the memory 110 may perform a function of only the program storage unit 111, a function of only the data storage unit 112, or functions of both the program storage unit 111 and the data storage unit 112 according to use. A physical area within the memory 110 may not be clearly divided, depending on the characteristics of the electronic device 100.
- the processor unit 120 includes a memory interface 121 , at least one processor 122 , and a peripheral device interface 123 .
- the memory interface 121, the at least one processor 122, and the peripheral device interface 123 included in the processor unit 120 may be integrated in at least one circuit or may be embodied as separate constituent elements.
- the memory interface 121 may control access to the memory 110 by a constituent element such as the processor 122 or the peripheral device interface 123.
- the peripheral device interface 123 may control a connection of an input and output peripheral device, the processor 122 , and the memory interface 121 of the electronic device 100 .
- the processor 122 may control the electronic device 100 to provide various multimedia services using at least one software program and to display a UI operation of the electronic device 100 at the display unit 131 through the input and output processor 130 .
- the processor 122 may also control the touch input device 132 to provide a service that receives input of an instruction from outside of the electronic device 100 .
- the processor 122 may control the electronic device 100 to provide a service corresponding to each program.
- the input and output processor 130 may provide an interface between an input and output device 133 , such as the display unit 131 and the touch input device 132 , and the peripheral device interface 123 .
- the display unit 131 may receive state information of the electronic device 100 , a character, a moving picture, or a still picture input from outside of the processor unit 120 .
- the display unit 131 may form a UI operation, and display the UI operation through the input and output processor 130.
- the touch input device 132 may provide input data occurring by a user's selection to the processor unit 120 through the input and output processor 130 .
- the touch input device 132 may be formed with only a control button or may be formed with a keypad.
- the touch input device 132 may be provided with the input and output device 133 together with the display unit 131 so that an input and output may operate on one screen.
- the touch input device 132 used for the input and output device 133 may use at least one of a capacitive type, a resistive (pressure detection) type, an infrared ray type, an electromagnetic induction type, and an ultrasonic wave type.
- an input method of the touch input device 132 may be a method of inputting an instruction when an input means is positioned within a predetermined distance from the touch screen 133 , in addition to a method of directly touching and inputting the touch screen 133 .
- the input method may use inputs such as a hovering touch, a floating touch, an indirect touch, a proximity touch, or a non-contact input.
- the input and output device 133 is a device that physically couples the touch input device 132 as a single screen on the display unit 131 .
- the input and output device 133 may be a touch screen that can input an instruction by touching a screen configuration displayed in the display unit 131 .
- the touch screen can perform functions of both the display unit 131 that displays a UI operation of the electronic device 100 and the touch input device 132 that inputs an external instruction to the electronic device 100 .
- the touch screen may be formed as the touch screen 133 including the display unit 131 and the touch input device 132 .
- the touch screen 133 is formed in a complex touch panel in which a touch panel and a pen touch panel are formed together.
- the touch screen 133 of the electronic device 100 is not limited to a touch panel formed in a complex touch panel, and may be embodied as a touch screen to which a pen touch panel, which can perform only a pen touch, is applied.
- the audio processor 140 may provide an audio interface between a user and the electronic device 100 through a speaker 141 and a microphone 142 .
- the communication system 150 performs a communication function. According to an embodiment of the present invention, the communication system 150 may perform communication with another electronic device using at least one of mobile communication, wire communication, and satellite communication through a base station, and is connected to at least one short range wireless communication module to perform short range wireless communication.
- a short range wireless communication module may perform communication with another electronic device using at least one of short range wireless communication such as, for example, infrared ray communication, Bluetooth communication, Bluetooth Low Energy (BLE) communication, Wi-Fi communication, Near Field Communication (NFC) wireless communication, Zigbee communication, and Ultra WideBand (UWB) communication, Wireless Local Area Network (LAN) communication, and wire communication.
- the communication system 150 or a short range wireless communication module is divided and described, but the communication system 150 and the short range wireless communication module may perform communication in one communication system module.
- the image sensor 160 may photograph an object and generate image data.
- the image sensor 160 may include an optical unit and an operation detection sensor (motion sensor), and may be formed with a module such as an operation detection module and a camera module.
- the optical unit may be driven by a mechanical shutter, a motor, or an actuator, and may perform operations such as a zoom function and focusing by the actuator.
- the optical unit photographs a peripheral object, and the image sensor 160 may detect an image photographed by the optical unit and convert the detected image to an electrical signal.
- the image sensor 160 may be embodied as a sensor such as a Complementary Metal-Oxide Semiconductor (CMOS) or a Charge-Coupled Device (CCD), and another image sensor having high resolution may be used.
- the image sensor of the camera may house a global shutter therein. The global shutter may perform a function similar to a mechanical shutter housed in a sensor.
- a display by the electronic device 100 or an output by the electronic device 100 may be a term representing a method of displaying a moving picture, a still picture, or a GUI operation on the touch screen 133 of the electronic device 100, or outputting a signal sound or a voice to the speaker 141.
- the term display or output may be used herein, and when it is necessary to distinguish a display or an output, the display or the output may be separately described.
- FIGS. 2A and 2B illustrate writing information that an electronic device can recognize, according to an embodiment of the present invention.
- the electronic device 100 may detect analog text, such as a character or a symbol included in a document captured through a camera device including an image sensor, by using the handwriting processing program 116 . That is, the electronic device 100 may detect a character or a symbol input in handwriting by a user, as well as a character or a symbol printed in a constant font. The electronic device 100 converts the character or the symbol input in handwriting in the detected analog text to a digital text, i.e., to a digital character or symbol. The electronic device 100 may display the converted digital text on the touch screen 133 of the electronic device 100 .
- the electronic device 100 may detect a highlighter input 203 input in handwriting in a document in which a text of a constant font is printed.
- the electronic device 100 may convert the detected highlighter input 203 to a corresponding digital highlighter effect input.
- the electronic device 100 may detect a symbol input 201 , 207 , or 211 , input by handwriting in the document in which a text of a constant font is printed.
- the electronic device 100 may convert the detected symbol input 201 , 207 , or 211 to a corresponding digital symbol input.
- the electronic device 100 may detect a character and/or numeral input 205 , input by handwriting in the document in which a text of a constant font is printed.
- the electronic device 100 may convert the detected character and/or numeral input 205 to a corresponding digital character and/or numeral input.
- the electronic device 100 may detect an annotation symbol input 209 and 219, input by handwriting, in the document in which a text of a constant font is printed.
- the electronic device 100 may convert the detected annotation symbol to a corresponding digital annotation symbol input.
- the annotation symbol input 209 may be described as one of various symbol inputs, such as, for example, the symbol inputs 201 , 207 , and 211 .
- the electronic device 100 may detect a text 213 , input by handwriting in the document in which a text of a constant font is printed.
- the electronic device 100 may convert the detected symbol and text 213 input with handwriting to a corresponding digital description input.
- the electronic device 100 may detect a figure input 215 , input by handwriting in the document in which a text of a constant font is printed.
- the electronic device 100 may convert the figure input 215 detected in the document to a corresponding digital figure input.
- the electronic device 100 may detect an underline input 208 , 210 , 212 , and 217 that is input with handwriting in the document in which a text of a constant font is printed.
- the electronic device 100 may convert the detected underline input to a corresponding digital underline input.
- the electronic device 100 may detect underlining with various geometrical lines such as a wave 217 , a straight line 208 , 210 , or 212 , and a dotted line.
- an operation in which the electronic device 100 converts a text such as the highlighter input 203 , the character and/or numeral input 205 , the symbol input 201 , 207 , 209 , and 211 , the description input 213 , the underline input 208 , 210 , 212 , and 217 , or the figure input 215 to a digital text may include the operation of determining matching or similar data at the memory 110 of the electronic device 100 .
- the electronic device 100 may detect text information in which handwriting is input to the document in which a digital text is displayed, determine digital text corresponding to the detected text information, and display the determined digital text in the digital document.
- the electronic device 100 may determine whether a text included in document information acquired through text information stored at a database is a printed text of a digital form or a handwritten text.
- the electronic device 100 may convert handwritten text such as, for example, the detected highlighter input 203 , the character and/or numeral input 205 , the symbol input 201 , 207 , 209 , and 211 , the description input 213 , the underline input 208 , 210 , 212 , and 217 , or the figure input 215 to a digital text input such as, for example, a digital highlighter input 223 , a digital character and/or numeral input 225 , a digital symbol input 221 , 227 , 229 , and 231 , a digital description input 233 , a digital underline input 228 , 230 , 232 , and 237 , or a digital figure input 235 , corresponding to each text.
- the electronic device 100 may display the determined digital text input on the touch screen 133 .
- the electronic device 100 may output digital text displayed on the touch screen 133 with a sound through the speaker 141.
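- By way of a hedged example, a table-driven mapping like the one below could resolve recognized handwritten marks to the digital effects described above; the mark type names and effect fields are hypothetical and are not taken from the specification:

```python
# Hypothetical lookup table from a recognized handwritten mark type
# to the digital effect rendered in the generated document.
DIGITAL_EFFECTS = {
    "highlighter":        {"style": "background", "value": "yellow"},
    "underline_straight": {"style": "underline",  "value": "solid"},
    "underline_wave":     {"style": "underline",  "value": "wavy"},
    "asterisk":           {"style": "symbol",     "value": "*"},
}

def resolve_marks(marks):
    """Map each recognized mark to the digital effect shown at the same
    document position; unrecognized marks are returned separately so they
    can fall back to the cropped-image path described below."""
    resolved, unmatched = [], []
    for mark in marks:
        effect = DIGITAL_EFFECTS.get(mark["type"])
        if effect is not None:
            resolved.append({"position": mark["position"], **effect})
        else:
            unmatched.append(mark)
    return resolved, unmatched
```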
- the electronic device 100 may convert the detected highlighter input 203 to the corresponding digital highlighter effect 223 , and display the determined digital highlighter effect 223 in a predetermined area of the touch screen 133 , or an area of a digital document corresponding to an area of the highlighter input 203 .
- the electronic device 100 may convert the detected symbol input 201 to the corresponding digital symbol effect 221 (e.g., asterisk), and display the determined digital symbol effect 221 in a predetermined area of the touch screen 133 , or an area of a digital document corresponding to an area of the symbol input 201 .
- the electronic device 100 may convert the detected at least one symbol input 207 and 211 of the same type and in a continued order to the corresponding digital symbol input 227 or 231, and display the determined digital symbol input 227 or 231 at a predetermined position of the touch screen 133, or a position of the digital document corresponding to a position of the at least one symbol input 207 and 211.
- the electronic device 100 may convert the detected two or more same annotation symbols 209 and 219 to the corresponding digital annotation symbols 229 and 239 , and display the determined digital annotation symbols 229 and 239 at positions of the digital document corresponding to positions of the annotation symbols 209 and 219 or a predetermined position of the touch screen 133 .
- the electronic device 100 may move a display to a position of 239 of the digital document corresponding to the annotation 209 .
- the electronic device 100 may display the position of 209 of the digital document.
- the electronic device 100 may convert the symbol and character 213 input with handwriting to the corresponding digital symbol and text 233 .
- the electronic device 100 may convert the combined symbol and character 213 to the digital description input 233 through a preset database, as described above.
- the electronic device 100 may display the determined digital description input 233 at a position of the digital document corresponding to a position of the symbol and character 213, or a predetermined position of the touch screen 133.
- the electronic device 100 may convert a figure input that is input with handwriting, such as a rectangle 215, a triangle, or a circle, to the corresponding digital figure input 235, and display the determined digital figure input 235 at a position of the digital document corresponding to a position of the figure input 215, or a predetermined position of the touch screen 133.
- the electronic device 100 may convert an underline input, such as the detected straight line form underline 208, 210, or 212, a wave form underline 217, or a dotted line form underline, to a corresponding digital underline input 228, 230, 232, and 237, and display the determined digital underline input 228, 230, 232, and 237 at a position of the digital document corresponding to a position of the underline input 208, 210, 212, and 217 or a predetermined position of the touch screen 133.
- the electronic device 100 may not convert handwritten text into digital text.
- the electronic device 100 may copy or cut out a predetermined area including handwritten text that is not converted to a digital text.
- the electronic device 100 may display the area acquired through copying or cropping in a predetermined area of the digital document.
- the electronic device 100 may not convert a handwritten input such as 'due date: 2013.08.11' 205 to a corresponding digital input (a digital character and/or numeral input).
- Handwritten input may not be converted to a corresponding digital input when a digital text corresponding to a portion or the entirety of the handwritten input is not matched at a database.
- the electronic device 100 may crop the handwritten input area 205 in image data that photographs the document and display the cropped handwritten area 225 at a predetermined position of the digital document.
- the electronic device 100 may rotate and display a handwritten area that is obliquely input so that it has a slope corresponding to a text line of the digital document.
- the electronic device 100 may output a matching error code (wrongly converted text).
- the electronic device 100 may display a matching error code in the cropped handwritten area 225 of the digital document when the display is released, and display the cropped handwritten area 225 when a matching error code is displayed.
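- A small sketch of the cropping fallback described above, assuming the photographed page is available as a Pillow image; the bounding box and skew angle are assumed to come from an earlier detection step and the function name is hypothetical:

```python
from PIL import Image

def crop_unmatched_region(page_image: Image.Image, bbox, skew_degrees: float = 0.0) -> Image.Image:
    """Crop the handwritten region that could not be matched to digital text
    and, if it was written at an angle, rotate it so it aligns with the text
    lines of the digital document before it is placed there."""
    x, y, w, h = bbox
    patch = page_image.crop((x, y, x + w, y + h))
    if abs(skew_degrees) > 0.5:
        patch = patch.rotate(skew_degrees, expand=True, fillcolor="white")
    return patch
```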
- when displaying various digital inputs such as the above-described digital highlighter input, digital character and/or numeral input, digital symbol input, digital description input, digital underline input, or digital figure input on the touch screen 133, the electronic device 100 does not limit the area for display to a previously displayed area and may move a position thereof.
- FIGS. 3A and 3B illustrate operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- the electronic device 100 may photograph a document formed with a text and handwriting input and generate a digital document.
- the electronic device 100 may include a figure input with handwriting or a function corresponding to a figure in the digital document.
- the electronic device 100 may photograph a document formed with a text and handwriting input through an image sensor.
- the electronic device 100 may convert the handwritten input to a corresponding digital text.
- the electronic device 100 may detect a figure displayed with handwritten input in a text portion of a constant font in acquired document information.
- the electronic device 100 may determine a figure in a database that matches the handwritten input.
- the electronic device 100 may detect a figure of a rectangle 301 displayed around ‘optical character recognition’ in the acquired document information, and may detect a figure of an asterisk (*) 303 displayed near ‘OCR’.
- the electronic device 100 may determine a digital text matched to the quadrangle 301 and the asterisk 303 with reference to a database.
- the electronic device 100 may convert a text in which handwriting is input in acquired document information.
- the electronic device 100 may match the handwritten text to a digital text of a database, and display the acquired digital text or a function connected to the digital text in the digital document.
- the electronic device 100 may determine a digital figure matched to handwriting input, and display an acquired digital figure at a predetermined position of the digital document.
- the electronic device 100 may detect a figure 311 of a rectangular form input to an area of ‘optical character recognition’ in acquired document information.
- the electronic device 100 may determine information matched to the detected rectangular figure 311 through a database.
- the electronic device 100 may reverse and output an area ‘optical character recognition’ of the digital document according to the determined information.
- the electronic device 100 may detect that an asterisk 303 is input with handwriting in an area of ‘OCR’ in acquired document information.
- the electronic device 100 may determine information matched to the detected asterisk 303 through a database.
- the electronic device 100 may reset a font of a word ‘OCR’ 313 with a predetermined method according to the determined information.
- the electronic device 100 may detect the same word ‘OCR’ 315 .
- the electronic device 100 may reset a font of the detected ‘OCR’ 315 with the same method as the reset font of the ‘OCR’ 313.
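- The figure-driven effects described for FIGS. 3A and 3B can be sketched as follows; the input structures and field names are illustrative assumptions (a handwritten rectangle reverses the words it covers, and an asterisk restyles the anchored word and every other occurrence of the same word):

```python
def apply_figure_effects(document_words, figures):
    """Return a mapping from word index to effect, following the FIG. 3
    behaviour: a handwritten rectangle reverses (highlights) the covered
    words, and an asterisk restyles every occurrence of the anchored word."""
    styles = {}
    for fig in figures:
        if fig["shape"] == "rectangle":
            for idx in fig["covered_word_indexes"]:
                styles[idx] = "reverse"
        elif fig["shape"] == "asterisk":
            target = document_words[fig["anchor_word_index"]].lower()
            for idx, word in enumerate(document_words):
                if word.lower() == target:
                    styles[idx] = "emphasized_font"
    return styles
```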
- FIGS. 4A and 4B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- the electronic device 100 may generate a digital document through image data including text and handwriting input.
- the electronic device 100 may include a symbol input with handwriting or a function corresponding to a symbol in the digital document.
- the electronic device 100 may detect at least one symbol input with handwriting in acquired document information.
- the electronic device 100 may determine a digital symbol that matches the detected at least one symbol at a database.
- the electronic device 100 may determine two or more connected symbols among the determined symbols.
- the electronic device 100 may detect symbols 405, 401, and 403 input with handwriting in acquired document information.
- the electronic device 100 may determine a matched digital symbol through information of a database.
- the electronic device 100 may determine that two or more digital symbols of matched digital symbols 405 , 401 , and 403 are connected.
- the electronic device 100 may determine a word included in an area of the underline. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 403 in acquired document information and determine a phrase ‘bikes rides’ included in an area of an underline input with handwriting near 403. When determining a phrase ‘bikes rides’ included in an area near 403, the electronic device 100 may refer to acquired document information and/or a digital document generated through acquired document information.
- the electronic device 100 may determine a word from an area in which a symbol input with handwriting is positioned to an area in which a sentence is terminated. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 401 in acquired document information and determine a phrase ‘more photos’ from a word ‘more’ at a position of the symbol 401 to a word ‘photos’ where the sentence is terminated. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 405 in acquired document information and determine ‘be inspired’ from a word ‘be’ at a symbol position of 405 to a word ‘inspired’ where the sentence is terminated. When determining a phrase ‘more photos’ of 401 or a phrase ‘be inspired’ of 405 , the electronic device 100 may refer to acquired document information and/or a digital document generated through acquired document information.
- a heart symbol 407 may be included in a determined area of the symbol 405 .
- the electronic device 100 may include the heart symbol 407 input with handwriting beside ‘be inspired’.
- the electronic device 100 may match a symbol input with handwriting to a digital symbol of a database. When two or more related symbols are determined, the electronic device 100 may display two or more related symbols and a word or a sentence connected to each symbol.
- the electronic device 100 may display two or more circle characters determined through acquired document information in a separate area of the digital document.
- the electronic device 100 may convert a symbol of 405 , 401 , and 403 to a related symbol in acquired document information, as shown in the touch screen 133 of FIG. 4B .
- the electronic device 100 may determine words in an area including each symbol 405 , 401 , and 403 and another symbol, and display the word and the other symbol in a separate area.
- the electronic device 100 may determine at least one area of a predetermined area of the digital document, a pop-up window displayed in the digital document, and a layer area separate from the digital document as a separate area to be displayed.
- the electronic device 100 may determine ‘be inspired’ and the heart symbol 407 included in the symbol 405 as a first symbol area.
- the electronic device 100 may determine a digital heart symbol matched to the heart symbol 407 input with handwriting.
- the electronic device 100 may crop a symbol area of acquired document information and include the symbol area in the digital document.
- the electronic device 100 may determine ‘more photos’ included in the symbol 401 as a second symbol area.
- the electronic device 100 may determine ‘bikes rides’ included in the symbol 403 as a third symbol area.
- the electronic device 100 may input a symbol related to order, as shown in FIG. 4B .
- the electronic device 100 may determine that two or more symbols related to order are input in order of 405 , 403 , and 401 and rearrange the symbols in an order of 405 , 401 , and 403 .
- the electronic device 100 may reset a related symbol area input in an order of a second symbol area, a third symbol area, and a first symbol area to an order of a first symbol area, a second symbol area, and a third symbol area.
- a predetermined operation can display the above-described ‘numbered list’ in proper order.
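- A simple sketch of this reordering, under the assumption that each recognized area carries the number encoded by its handwritten symbol; names and structures are hypothetical:

```python
def arrange_numbered_list(symbol_areas):
    """Rearrange symbol areas into the order encoded by the handwritten
    symbols themselves, regardless of where they appear on the page,
    producing the 'numbered list' view described above."""
    return sorted(symbol_areas, key=lambda area: area["symbol_number"])

# Areas detected out of page order are re-listed by their handwritten numbers.
areas = [
    {"symbol_number": 2, "text": "more photos"},
    {"symbol_number": 3, "text": "bikes rides"},
    {"symbol_number": 1, "text": "be inspired"},
]
print(arrange_numbered_list(areas))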
- FIGS. 5A and 5B illustrate operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- the electronic device 100 may detect at least one symbol input with handwriting in acquired document information.
- the electronic device 100 may determine that the detected at least one symbol is formed with two or more types and distinguish and determine a symbol according to each type.
- the electronic device 100 may detect symbols 501, 503, 505, 507, and 509 and check marks 511 and 513 input with handwriting in acquired document information.
- the electronic device 100 may determine a matched digital symbol through information of a database.
- the electronic device 100 may determine that symbols 503 , 501 , and 509 match to one type (e.g., a circle character), may determine that symbols 505 and 507 match to another type (e.g., a parenthesis character), and may determine that check marks 511 and 513 match to a single type.
- the electronic device 100 may determine an area of a word or a sentence in relation to a distinguished symbol (e.g., a circle character). When a circle character area includes an underline, the electronic device 100 may determine a word corresponding to the underline. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 501 in acquired document information and determine a phrase ‘scanned images’ in an area of an underline 502 input with handwriting near the symbol 501 . When determining a phrase ‘scanned images’ in an area of the input underline 502 , the electronic device 100 may refer to acquired document information and/or a digital document generated through acquired document information.
- the electronic device 100 may determine a word ‘typewritten’ in an area of an underline 504 near the symbol 503 , and a phrase ‘text-to-speech’ in an area of an underline 510 near the symbol 509 , according to the above-described method.
- the electronic device 100 may determine a phrase ‘printed text into machine’ in an area of an underline 506 near the symbol 505 , according to the above-described method.
- the electronic device 100 may determine a phrase included to be associated with the symbol 507 as ‘printed records’, according to a method described-above with reference to FIG. 4A .
- the electronic device 100 may determine an area of a word or a sentence associated with a check mark in relation to a distinguished symbol (e.g., a check mark) according to the above-described method. Specifically, when the electronic device 100 detects a check mark input with handwriting, the electronic device 100 may determine an area of a word or a sentence from the position that the check mark indicates to the end of the sentence. According to an embodiment of the present invention, the electronic device 100 may determine an area corresponding to the check mark 511 as ‘OCR is a field of research in pattern recognition, artificial intelligence and computer vision’, and may determine an area corresponding to the check mark 513 as ‘“intelligent” systems with a high degree of recognition accuracy for most fonts are now common’.
- the electronic device 100 may determine a text in which handwriting is input to acquired document information.
- the electronic device 100 may display a word or a sentence connected to each type of symbol.
- the electronic device 100 may display a symbol such as a circle character, a parenthesis character, and a check mark according to each type determined through acquired document information, and may display a word or a sentence included in an area of the symbol in a separate area of the digital document.
- the electronic device 100 does not limit a range of the above-described symbol to a special character of a circle character or a parenthesis character, and may include various special characters in a range of a symbol.
- the electronic device 100 may determine the symbol 503 and ‘typewritten’ as a first circle character area, the symbol 501 and ‘scanned images’ as a second circle character area, the symbol 509 and ‘text-to-speech’ as a third circle character area, the symbol 505 and ‘printed text into machine’ as a first parenthesis character area, the symbol 507 and ‘printed record’ as a second parenthesis character area, the check mark 511 and ‘OCR is a field of research in pattern recognition, artificial intelligence and computer vision’ as a first check mark area, and the check mark 513 and ‘“intelligent” systems with a high degree of recognition accuracy for most fonts are now common’ as a second check mark area.
- Each area may be an area divided to correspond to each symbol type acquired in the electronic device 100 .
- the electronic device 100 may display symbols and corresponding text areas according to an order of the symbols on the touch screen 133 that displays the digital document.
- the electronic device 100 may determine that a circle character symbol input with handwriting is not input in order, and may output a circle character area acquired in the digital document according to a predetermined order.
- even when the circle character areas are input in an order of the second circle character area, the first circle character area, and the third circle character area, the electronic device 100 may determine the circle character areas in an order of the first circle character area, the second circle character area, and the third circle character area, according to a predetermined function of the circle character digital symbol.
- the electronic device 100 may display the first circle character area, the second circle character area, and the third circle character area, in which the order is determined, in a predetermined display area (e.g., ‘numbered text’) of the touch screen 133.
- the electronic device 100 may determine another type of symbol (e.g., a parenthesis character) that can be displayed in order, and may display the first parenthesis character area and the second parenthesis character area in the numbered text area that displays the above-described first circle character area, second circle character area, and third circle character area.
- the electronic device 100 may display a symbol corresponding to each type (e.g., a check mark area) and an area including the symbol in an area separate from an area that displays the digital document.
- the electronic device 100 may generate and display a new pop-up window for a first check mark area and a second check mark area on the touch screen 133 that displays the digital document.
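- A hedged sketch of this grouping step (type names and the grouping rule are assumptions): circle and parenthesis character areas are collected for the ‘numbered text’ area, while check-mark areas are collected for a separate pop-up window.

```python
from collections import defaultdict

def group_symbol_areas(symbol_areas):
    """Group recognized symbol areas by symbol type: circled and parenthesized
    characters are listed together in the 'numbered text' area, and check-mark
    areas are returned separately for display in a pop-up window."""
    groups = defaultdict(list)
    for area in symbol_areas:
        groups[area["type"]].append(area)
    numbered_text = groups["circle"] + groups["parenthesis"]
    popup_window = groups["check_mark"]
    return numbered_text, popup_window
```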
- FIGS. 6A and 6B illustrate operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- the electronic device 100 may output a text corresponding to handwriting input in a predetermined area (e.g., space between lines) formed with a text.
- the electronic device 100 may detect “Current issues” 603 input with handwriting between lines formed with a text of a document printed in a constant font.
- the electronic device 100 may detect “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting in an area in which a text is not printed and under a line formed with a text.
- the electronic device may determine a digital text matched to a detected text.
- the electronic device 100 may display a determined digital text “Current issues” 613 in a space between ‘Performing multisite designs’ and ‘Reach what may be’, which is the same position as that of “Current issues” 603 input with handwriting.
- the electronic device 100 may insert determined digital text “Reach themselves in situation what scanning will be in enough cases generally” 611 under “schofield increasing, the generalizability of” displayed in the digital document with reference to a position of “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting.
- the electronic device 100 may display determined “Reach themselves in situation what scanning will be in enough cases generally” 611 at the same position as that of “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting.
- the electronic device 100 may crop image data of a text area in which handwriting is input, and display the cropped image in a predetermined area of the digital document.
- the electronic device 100 may not determine characters of a digital text matched to at least one character. Because the electronic device 100 cannot determine a character of the digital text, the electronic device 100 cannot complete a digital text, such as the above-described “Reach themselves in situation what scanning will be in enough cases generally” 611 .
- the electronic device 100 may crop image data of an area “Reach themselves in situation what scanning will be in enough cases generally” 601 , and insert the cropped image 601 at a position that displays “Reach themselves in situation what scanning will be in enough cases generally” 611 with reference to FIG. 6B .
- the electronic device 100 may display the cropped image at a position of the corresponding digital document that is the same position as that of “Reach themselves in situation what scanning will be in enough cases generally” 601, or may determine and change the display to an arbitrary position.
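- The insert-or-fallback behaviour of FIGS. 6A and 6B can be sketched as below; the line-based document model and argument names are assumptions for illustration:

```python
def insert_handwritten_line(document_lines, anchor_index, digital_text=None, cropped_image=None):
    """Insert recognized handwritten text after the anchor line of the digital
    document; when no matching digital text was determined, insert the cropped
    image of the handwritten region at the same position instead."""
    entry = digital_text if digital_text is not None else {"image": cropped_image}
    return document_lines[:anchor_index + 1] + [entry] + document_lines[anchor_index + 1:]
```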
- FIGS. 7A to 7E illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- the electronic device 100 may display the digital text in the digital document with various methods.
- the electronic device 100 may determine each of various types of symbols or characters included in a text in which handwriting is input to acquired document information.
- the electronic device 100 may determine, in a symbol and character 701 input with handwriting in acquired document information, a symbol that designates a range of text printed in a constant font, an asterisk (a predetermined symbol), and a character.
- the electronic device 100 may determine a symbol that designates a range of a text and determine a text corresponding to a symbol that displays a range.
- the electronic device 100 may detect a character input with handwriting and determine a matched digital character.
- the electronic device 100 may determine an asterisk (predetermined symbol) and connect a text corresponding to a symbol that designates a range according to a function of an asterisk and a character input with handwriting.
- the electronic device 100 may determine a predetermined area of a printed text according to a symbol, input with handwriting, that designates a range of a text printed in a constant font.
- the electronic device 100 may determine a text area including a symbol in two lines that start with ‘High’ and terminate with ‘manner’ in FIG. 7A, and determine a sentence ‘at whoever cost of trouble to learn how to read’, divided by a comma (,) or a period (.) in the determined two lines, as an area corresponding to a symbol that designates a range.
- the electronic device 100 may display a menu for selecting at least one available display method.
- the electronic device 100 may convert a character and a symbol input with handwriting in acquired document information to a matched digital text, and provide a method that can display according to a function of the determined digital text.
- the electronic device 100 may output the text in a selection area separate from a digital document such as ‘Detected text’ 721 .
- the electronic device 100 may display, with an underline, ‘at whoever cost of trouble to learn how to read’ 723 corresponding to a symbol that designates a range, and insert a digital text ‘firm determination that should learn how to read’ 725 (a predetermined function of an asterisk) below ‘at whoever cost of trouble to learn how to read’ 723.
- the electronic device 100 may display an icon ‘save’ 727 and/or ‘cancel’ 729 that can determine whether to display in the digital document according to the above-described method.
- when the electronic device 100 cannot determine at least one of a character and a symbol input with handwriting in acquired document information to be a digital text, the electronic device 100 may provide various methods of displaying it in a predetermined digital document.
- the electronic device 100 may output the text in a selection area separate from a digital document such as ‘Detected text’ 707 .
- the electronic device 100 may determine a digital symbol matched to a symbol that designates a range, and may display the corresponding ‘at whoever cost of trouble to learn how to read’ 705 with an underline according to the digital symbol.
- when the electronic device 100 cannot determine a digital symbol matched to a symbol that designates a range, the electronic device 100 cannot display an underline for the corresponding ‘at whoever cost of trouble to learn how to read’ 705 according to the digital symbol.
- the electronic device 100 may be unable to determine at least one digital text among a predetermined symbol (asterisk) input with handwriting, a character input with handwriting, and a handwritten symbol that designates a range, and may not perform a combined function.
- the electronic device 100 may crop image data corresponding to a symbol and character area in which handwriting is input in the acquired document information, and display a cropped image at the same digital document position as that of an area in which handwriting is input.
- the electronic device 100 may display an icon ‘save’ 709 and/or ‘cancel’ 711 that can determine whether to display in the digital document according to the above-described method.
- the electronic device 100 may output a text in which handwriting is input to the digital document with at least one of various display methods.
- the electronic device 100 may set (reset) a method of outputting a text in which handwriting is input and that is output to the digital document.
- when the electronic device 100 selects ‘save’ 709 in a selection window shown in FIG. 7B (a), the electronic device 100 may display a digital document, as shown in FIG. 7C. Because the electronic device 100 cannot match at least one text to a digital text, the electronic device 100 may display a cropped image 733 and again perform an operation of determining a character or a symbol of the cropped image 733 as a digital text.
- the electronic device 100 may not display the determined digital text ‘at whoever cost of trouble to learn how to read’ 705 and ‘firm determination that should learn how to read’ 725 . Even when the electronic device 100 determines digital text matched to the handwritten text, the electronic device 100 may display the cropped image 733 in the digital document, as shown in FIG. 7C , according to a setting.
- in a method of displaying, in the digital document, a text in which handwriting is input to acquired document information, when a text area displayed in the digital document is selected with an input means 743 (e.g., a finger or an electronic pen), the electronic device 100 may be set to display the connected text.
- when determining whether a selection is made with an input means, the electronic device 100 may use at least one of directly touching the text area with the input means and indirectly touching (e.g., hovering over) the text area with the input means.
- the electronic device 100 may display digital text determined from a text printed in a constant font.
- the electronic device 100 may display an indication (e.g., an underline 741 ) that a connected text or connected data exists.
- the underline 741 in which data exists may be displayed according to a function of a symbol that designates a range, described with reference to FIG. 7B .
- the electronic device 100 may display a predetermined function in the digital document through various methods of outputting a text in which handwriting is input such as a symbol that designates a range.
- the electronic device 100 may change a color of the digital text to distinguish it from a digital text displayed with an underline that carries another function, and may add various effects, such as displaying the underline in a different form (e.g., two or three lines).
- the electronic device 100 may apply an effect according to a predetermined method.
- the electronic device 100 may display the connected data ‘firm determination that should learn how to read’ 745 for the digital text 731 , to which other data is connected, in an area (e.g., a pop-up window) separate from the digital document.
- the electronic device 100 may set to display an image (e.g., 733 of FIG. 7C ) that crops a text input in handwriting in addition to a method of displaying the digital text 745 determined to a text input in handwriting.
- the electronic device 100 may display an object (e.g., an icon) that controls to display a text in which handwriting is input in a predetermined area of the digital document.
- the electronic device 100 may connect ‘at whoever cost of trouble to learn how to read’ 751 displayed in the digital document of the electronic device 100 to digital text data or a text (crop image) in which handwriting is input according to a predetermined symbol.
- the electronic device 100 may display an object 753 that may display connected digital text data or a text (crop image) in which handwriting is input at a predetermined position of the digital document.
- a predetermined position that displays the object 753 may be a position within a determined range around ‘at whoever cost of trouble to learn how to read’ 751 , to which the digital text data or the handwritten text (cropped image) connected to the object 753 is linked according to a predetermined symbol.
- the electronic device 100 may generate and display a text (crop image) input in handwriting or digital text data connected to the object in an area separate from the digital document, as shown in FIG. 7D .
- the electronic device 100 does not limit a position of the displayed object 753 to a position within a determined range around ‘at whoever cost of trouble to learn how to read’ 751 , and may set the position of the displayed object 753 to any position of the digital document.
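- The connection between a displayed phrase, a predetermined symbol, and the data it links to can be modeled as a simple annotation record that a pop-up or icon reveals on demand. The following is a minimal sketch of such a model; it is an editorial illustration rather than part of the original disclosure, and the names (Annotation, DigitalDocument, anchor_text) are assumptions.

```python
# Minimal sketch: link a phrase in the digital document to its connected data,
# so selecting an anchor object near the phrase reveals the annotation.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Annotation:
    anchor_text: str                        # phrase marked by the range symbol
    digital_text: Optional[str]             # matched digital text, if any
    cropped_image: Optional[bytes] = None   # fallback image when matching failed


@dataclass
class DigitalDocument:
    body: str
    annotations: list = field(default_factory=list)

    def add_annotation(self, annotation: Annotation) -> None:
        self.annotations.append(annotation)

    def open_annotation(self, anchor_text: str) -> str:
        """Return what a pop-up window would show for the selected anchor."""
        for a in self.annotations:
            if a.anchor_text == anchor_text:
                return a.digital_text if a.digital_text else "<cropped handwriting image>"
        return ""


doc = DigitalDocument(body="... at whoever cost of trouble to learn how to read ...")
doc.add_annotation(Annotation(
    anchor_text="at whoever cost of trouble to learn how to read",
    digital_text="firm determination that should learn how to read"))
print(doc.open_annotation("at whoever cost of trouble to learn how to read"))
```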
- FIGS. 8A and 8B are flowcharts illustrating operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention.
- the electronic device 100 may detect a text in which handwriting is input from image data of an acquired analog document, and may convert the handwritten text to matching digital text.
- the electronic device 100 may perform operation corresponding to the symbol, and display a digital document generated based on a text included in image data of the analog document.
- the electronic device 100 photographs a document that displays the text through the image sensor 160 , in step 801 .
- the electronic device 100 may receive image data including the text through network communication, and document information acquired from image data previously stored at the memory 110 may include a text printed in a constant font and a text in which handwriting is input.
- the electronic device 100 may detect a text displayed in the document through a text recognition and conversion program such as the OCR program 115 and/or the handwriting processing program 116 .
- the electronic device 100 determines whether a digital character is matched to the text detected in the photographed document (acquired through direct photographing or received as photographed image data) with reference to a database, in step 803. If at least one character of a word cannot be determined, or if at least one word of a sentence cannot be determined, the electronic device 100 may terminate the operation of converting the corresponding word or sentence to digital text.
- the electronic device 100 converts a text input in handwriting to a matching digital text, in step 805 .
- the electronic device 100 may match various texts, such as various types of characters (e.g., a circle character and a parenthesis character), a highlighter, an underline, a figure, and a handwritten character, to the same text as, or a text similar to, that stored at a database.
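- The matching operation of steps 803 and 805 can be approximated as a similarity lookup against the stored reference data, with the word left unconverted when no stored entry is close enough. The sketch below is an editorial illustration under assumed values: the reference texts and the 0.8 threshold are not specified by the disclosure, and a real recognizer would compare stroke or glyph features rather than plain strings.

```python
# Minimal sketch: match a detected token against stored reference text.
from difflib import SequenceMatcher

REFERENCE_TEXTS = {"typewritten", "scanned images", "text-to-speech", "OCR"}


def match_digital_text(detected: str, threshold: float = 0.8):
    """Return the best-matching stored text, or None if nothing in the
    database is similar enough (the word is then left unconverted)."""
    best, best_score = None, 0.0
    for candidate in REFERENCE_TEXTS:
        score = SequenceMatcher(None, detected.lower(), candidate.lower()).ratio()
        if score > best_score:
            best, best_score = candidate, score
    return best if best_score >= threshold else None


print(match_digital_text("typewr1tten"))   # close enough -> 'typewritten'
print(match_digital_text("Xy#z"))          # no match -> None, keep as an image
```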
- the electronic device 100 may determine whether the handwriting input includes a predetermined symbol, in step 807. If the handwriting input includes a predetermined symbol and a digital symbol matched to the symbol is determined, the electronic device 100 may determine whether the digital symbol includes a predetermined function. If the digital symbol includes a predetermined function, the electronic device 100 performs an operation corresponding to the symbol, in step 809. If the digital symbol does not include a predetermined function, the electronic device 100 may proceed to step 811.
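- One way to realize the branch of steps 807 through 811 is a table that maps each recognized digital symbol to its predetermined function, falling through to the normal display path when no function is registered. The sketch below is an illustrative assumption; the symbol names and the HTML-like output are not taken from the disclosure.

```python
# Minimal sketch: dispatch a recognized digital symbol to its predetermined function.
def underline_range(doc, text):
    return doc.replace(text, f"<u>{text}</u>")


def insert_annotation(doc, text):
    return doc + f"\n* {text}"


SYMBOL_FUNCTIONS = {
    "asterisk": insert_annotation,   # e.g. '*' attaches an annotation to a range
    "underline": underline_range,    # e.g. a handwritten underline marks a range
}


def handle_symbol(doc, digital_symbol, associated_text):
    handler = SYMBOL_FUNCTIONS.get(digital_symbol)
    if handler is None:
        return doc                   # no predetermined function: go to step 811
    return handler(doc, associated_text)


print(handle_symbol("High resolution scans read in an accurate manner", "underline",
                    "High resolution scans"))
```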
- the electronic device 100 may determine a range of a printed text according to a function of a symbol input with handwriting. When displaying a determined range, the electronic device 100 may apply and display an effect such as an underline, a figure, and a contrast according to a function of a predetermined symbol.
- the electronic device 100 may display, in the digital document, the printed text of the determined range (e.g., a digital text matched to the printed text) and the handwritten text (e.g., a digital text matched to the handwritten input) according to the function of the handwritten symbol.
- the electronic device 100 may display an image of an analog text included in the digital document and/or the digital text according to a predetermined function ( 811 ).
- the electronic device 100 may display a digital document (e.g., FIG. 2B ) in the same form as or a form similar to that of an analog document (e.g., FIG. 2A ) including a text in which handwriting is input and a text printed according to a predetermined method at the memory 110 .
- the electronic device 100 may determine a predetermined function of a figure or a symbol input with handwriting, apply the function to a printed text according to a predetermined function and a text in which handwriting is input, and display the text in the digital document.
- a method of displaying the connected digital text (e.g., a digital text matched to a text in which handwriting is input) or a cropped image of the handwritten text in a pop-up window may be used.
- the electronic device 100 may terminate operation of the methodology of FIG. 8A .
- the electronic device 100 may acquire image data including a text printed in a constant font or a text in which handwriting is input.
- the electronic device 100 may detect at least one text included in image data and determine a matched digital text.
- the electronic device 100 may perform operation 823 .
- when the electronic device 100 cannot determine digital text matched to a text printed in a constant font or to a handwritten text, the electronic device 100 selects the text area that cannot be converted, in step 823. When the electronic device 100 can determine the matching digital text, the electronic device 100 may terminate the operation of FIG. 8B or may perform step 805 of FIG. 8A .
- when the electronic device 100 cannot determine digital text matched to a text printed in a constant font or to a handwritten text, the electronic device 100 acquires image data corresponding to the text area that cannot be converted to digital text. When acquiring the image data corresponding to the text area that cannot be converted to digital text, the electronic device 100 may use a cropping method, and the acquired image data is not limited to handwritten text. When the electronic device 100 cannot convert a text printed in a constant font to matching digital text, the electronic device 100 crops an image corresponding to the text area that is not converted.
- the electronic device 100 displays the cropped image of the text that is not converted to digital text at the corresponding position of the unconverted text in the generated digital document, in step 825.
- Operation of the electronic device 100 is not limited to operation of displaying at a corresponding position of a text that is not converted, and the electronic device 100 may provide a function of setting a displaying position and may provide a function of changing a position of a displayed image.
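- Steps 823 and 825 amount to cropping the unconvertible region from the acquired image and pasting it into the rendered digital document, optionally at a repositioned location. The sketch below assumes the Pillow imaging library and an illustrative bounding box; neither is named by the disclosure.

```python
# Minimal sketch: crop the region whose text could not be converted and paste
# it at the corresponding position of the rendered digital-document page.
from PIL import Image

page = Image.new("RGB", (800, 1000), "white")           # photographed analog page
digital_page = Image.new("RGB", (800, 1000), "white")   # rendered digital document

# Bounding box (left, upper, right, lower) of handwriting that could not be
# matched to digital text; the values are illustrative.
unconverted_box = (120, 430, 520, 470)

crop = page.crop(unconverted_box)
digital_page.paste(crop, unconverted_box[:2])   # same position as in the source

# The display position is not fixed; the device may later move the crop, e.g.
# digital_page.paste(crop, (120, 900)) to relocate it to a footer area.
digital_page.save("digital_page.png")
```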
- the electronic device 100 may terminate operation of the methodology of FIG. 8B .
- the electronic device 100 may crop an image corresponding to a printed text area that is not converted and include the cropped image in a corresponding area of the digital document.
- the electronic device 100 may convert to a digital document and display not only a document photographed through the image sensor and formed with a printed text and a handwritten text, but also a text included in a document stored in the memory of the electronic device 100 .
- the electronic device 100 may display a digital document corresponding to the entire area of FIG. 2B by combining partial areas of the document in which the converted digital text is displayed, or may display a predetermined area of the generated digital document.
- the electronic device 100 may recognize a text in which handwriting is input in the acquired document information, may provide a digital document generated with an operation corresponding to a handwritten character, figure, or symbol, and may clearly present a handwritten text that is not recognized by the electronic device 100 .
- Various embodiments of the present invention may be performed through at least one program which the memory 110 of the electronic device 100 includes, and may be directly controlled by a processor. Further, various embodiments may be controlled through at least one control module which a processor controls.
- Methods according to various embodiments of the present invention can be implemented in a form of hardware components, software components, or combinations thereof.
- a computer readable storage medium that stores at least one program (software module) may be provided.
- At least one program stored at a computer readable storage medium is configured to be executed by at least one processor within the electronic device 100 .
- At least one program may include an instruction that enables the electronic device 100 to execute a method, according to embodiments of the present invention.
- Such a program may be stored at a non-volatile memory including a Random Access Memory (RAM) and a flash memory, a Read-Only Memory (ROM), an Electrically Erasable and Programmable ROM (EEPROM), a magnetic disk storage device, a Compact Disk ROM (CD-ROM), a Digital Versatile Disk (DVD), or an optical storage device of other form, and a magnetic cassette.
- the program may be stored at a memory formed with a combination of a portion or the entire thereof. Further, each constituent memory may be included in plural.
- the program may be stored at an attachable storage device that may access the electronic device 100 through a communication network such as the Internet, an intranet, a Local Area Network (LAN), a Wireless LAN (WLAN), or a Storage Area Network (SAN), or a communication network formed with a combination thereof.
- a storage device can access to the electronic device 100 through an external port.
- a separate storage device on the communication network may provide access to a portable electronic device 100 .
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Quality & Reliability (AREA)
- User Interface Of Digital Computer (AREA)
- Facsimiles In General (AREA)
Abstract
Description
- This is a Continuation application of U.S. patent application Ser. No. 14/338,743, which was filed with the U.S. Patent and Trademark Office on Jul. 23, 2014, and claims priority under 35 U.S.C. § 119(a) to a Korean Patent Application No. 10-2013-0103292, which was filed in the Korean Intellectual Property Office on Aug. 29, 2013, the entire disclosure of each of which is incorporated herein by reference.
- The present invention relates generally to a method of processing acquired data and an electronic device thereof, and more particularly, to a method of displaying digital and analog text on an electronic device.
- An electronic device provides various services such as, for example, a camera function, a data communication function, a moving picture reproduction function, an audio reproduction function, a messenger function, a schedule management function, an alarm function, and an audio communication function. Various programs are used in the electronic device that can utilize these functions.
- The electronic device may convert analog text included in an image photographed through a camera, analog text included in an image received through network communication, or analog text included in an image which is stored at a memory, to digital text.
- In converting the analog text to the digital text, the electronic device may be unable to acquire digital text corresponding to the analog text and, as a result, may output information that does not correspond with information displayed in the original document, such as by displaying an error code or digital text which was converted incorrectly.
- The present invention has been made to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention provides a method of processing data and an electronic device thereof that can provide clear information about the text of an acquired document even when a portion or all of the text included in the acquired document cannot be converted to matching digital text.
- Another aspect of the present invention provides a method of processing data and an electronic device thereof that can perform a function of the electronic device, without a direct manipulation operation, through a text in which handwriting is input to an acquired document.
- In accordance with an aspect of the present invention, a method is provided that includes displaying, via a display of the electronic device, an electronic document including an image; identifying, from the electronic document, a plurality of handwritten inputs distinguished from a plurality of text words included in the electronic document; acquiring, from among the plurality of handwritten inputs, first handwritten symbols; identifying, from the electronic document, at least one text word corresponding to each of the first handwritten symbols; and in response to receiving a user input, displaying, via the display, the at least one text word as associated with first digital symbols corresponding to each of the first handwritten symbols, based on an order of the first handwritten symbols.
- In accordance with another aspect of the present invention, an electronic device is provided that includes a display; and at least one processor coupled to the display, configured to: display, via the display, an electronic document including an image; identify, from the electronic document, a plurality of handwritten inputs distinguished from a plurality of text words included in the electronic document; acquire, from among the plurality of handwritten inputs, first handwritten symbols; identify, from the electronic document, at least one text word corresponding to each of the first handwritten symbols; and in response to receiving a user input, display, via the display, the at least one text word as associated with first digital symbols corresponding to each of the first handwritten symbols, based on an order of the first handwritten symbols.
- The above and other aspects, features, and advantages of embodiments of the present invention will be more apparent from the following detailed description when taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a block diagram illustrating a configuration of an electronic device, according to an embodiment of the present invention; -
FIGS. 2A and 2B illustrate writing information that an electronic device can recognize, according to an embodiment of the present invention; -
FIGS. 3A and 3B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention; -
FIGS. 4A and 4B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention; -
FIGS. 5A and 5B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention; -
FIGS. 6A and 6B illustrate an operation of displaying document information that an electronic device, acquires according to an embodiment of the present invention; -
FIGS. 7A to 7E illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention; and -
FIGS. 8A and 8B are flowcharts illustrating an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention. - Embodiments of the present invention are described in detail with reference to the accompanying drawings. The same or similar components may be designated by the same or similar reference numerals although they are illustrated in different drawings. Detailed descriptions of constructions or processes known in the art may be omitted to avoid obscuring the subject matter of the present invention. Further, the terms used herein are defined according to the functions of the present invention. Therefore, the terms may vary depending on user's or operator's intention and usage. That is, the terms used herein must be understood based on the descriptions made herein.
- When describing various embodiments of the present invention, an electronic device will be described based on a touch screen that can perform an input operation through an input device and a display operation through a display unit on a physical screen. In a device configuration of the present invention, even if a display unit and an input device are separately shown when representing the display unit, the display unit may include the input device, or the input device may be represented with the display unit.
- Embodiments of the present invention are not limited only to an electronic device including a touch screen, and may be applied to various electronic devices in which the display unit and the input device are physically separated or that include only one of the display unit and the input device.
- An electronic device may be embodied as a mobile communication user device, a Personal Digital Assistant (PDA), a Personal Computer (PC), a laptop computer, a smart phone, a smart pad, a smart television, a Netbook, a Mobile Internet Device (MID), an Ultra Mobile PC (UMPC), a tablet PC, a mobile pad, a media player, a hand-held computer, a navigation device, a smart watch, a Head Mounted Display (HMD), and a Moving Picture Experts Group layer-3 (MP3) player.
- When it is described herein that an element is “connected” or “coupled” to another element, it should be understood that the element may be directly connected or coupled to the other element or electrically coupled to the other element through a third element. In contrast, when it is described that an element is “directly connected” or “directly coupled” to another element, it should be understood that there is no intermediate part between the two parts.
-
FIG. 1 is a block diagram illustrating a configuration of an electronic device, according to an embodiment of the present invention. - As shown in
FIG. 1 , anelectronic device 100 includes amemory 110, aprocessor unit 120. Theelectronic device 100 also includes an input andoutput processor 130, adisplay unit 131, atouch input device 132, anaudio processor 140, acommunication system 150 as a peripheral device, and other peripheral devices. - The
memory 110 includes a program storage unit 111 that stores a program for controlling operation of the electronic device 100 , and a data storage unit 112 that stores data generated while performing a program. The data storage unit 112 may store data generated at a program with operation of a processor 122 of the processor unit 120 . According to an embodiment of the present invention, at the data storage unit 112 , operation information that can control the electronic device 100 or other electronic devices may be previously determined, and a database in a table or list form may be formed and stored. According to an embodiment of the present invention, the electronic device 100 may convert text displayed in document information and acquired by an image sensor 160 to digital text, and store data of the converted digital text as a digital document at the data storage unit 112 . According to an embodiment of the present invention, in order to convert text displayed in a document to digital text, the electronic device 100 may store, at the data storage unit 112 , data of text, such as a character that can be compared, a font of the character, a symbol, and a figure, and digital text data (a character, a font of the character, a symbol, and a figure) corresponding to the data of text. According to an embodiment of the present invention, when text displayed in document information, which is acquired by an image sensor, cannot be converted to digital text, the electronic device 100 may store an image of the area that is not converted to digital text as image data at the data storage unit 112 . - The
program storage unit 111 includes an Optical Character Recognition (OCR)program 115, ahandwriting processing program 116, acommunication control program 117, and at least oneapplication program 118. Programs included in theprogram storage unit 111 are formed in a set of instructions and may be represented with an instruction set. Theapplication program 118 may include a software element of at least one application program installed at thememory 110 of theelectronic device 100. - The
OCR program 115 may convert text information, such as a character, a symbol, and a figure displayed in document information and acquired through theimage sensor 160, to digital text and generate a digital document in the same form as, or a form similar to, that of text displayed in a document. - The
handwriting processing program 116 may work with theOCR program 115. According to an embodiment of the present invention, thehandwriting processing program 116 may convert a symbol input in handwriting to acquired document information, to a digital symbol corresponding to a predetermined method, and apply a function or an effect corresponding to the determined digital symbol to the digital document. Theelectronic device 100 may output a digital document in which a function or an effect corresponding to a digital symbol is applied. - According to an embodiment of the present invention, when text input in handwriting to acquired document information cannot be converted to digital text, the
handwriting processing program 116 may copy text from an area that cannot be converted to digital text to image data, or may crop with a method such as pasting. Cropping may be a method of acquiring a partial area or an entire area of image data. Thehandwriting processing program 116 may display image data of an area that cannot be converted to digital text at a predetermined position of the digital document. - Acquired document information described in embodiments of the present invention may be a document that includes text information of an analog method that does not include a digital text (e.g., text information of a data form used in the
electronic device 100 like image information of a document that is photographed through theimage sensor 160 of theelectronic device 100 and image information of a document including text stored at a memory of the electronic device 100). Theelectronic device 100 may detect text from image information of a document and determine digital text matched to the same shape as, or a shape similar to, that of the detected text. - The
communication control program 117 may include at least one software element for controlling communication with at least one other electronic device using thecommunication system 150 or theimage sensor 160. According to an embodiment of the present invention, thecommunication control program 117 may search for another electronic device for a communication connection. When another electronic device for the communication connection is found, thecommunication control program 117 may set a connection for communication with the other electronic device. Thereafter, by performing a performance search and session setting procedure with the connected other electronic device, thecommunication control program 117 may control to transmit and receive data (e.g., packet data) to and from the other electronic device through thecommunication system 150. - The
memory 110 included in the electronic device 100 may be one or more memory components. According to an embodiment of the present invention, the memory 110 may perform a function of only the program storage unit 111 , a function of only the data storage unit 112 , or functions of both the program storage unit 111 and the data storage unit 112 according to use. In the memory 110 , a physical area within the memory 110 may not be clearly divided based on the characteristics of the electronic device 100 . - The
processor unit 120 includes amemory interface 121, at least oneprocessor 122, and aperipheral device interface 123. Thememory interface 121, the at least oneprocessor 122, and theperipheral device interface 123 included in theprocessor unit 120 may be integrated in at least one circuit or may be embodied with a separate constituent elements. - The
memory interface 121 may control access to thememory 110 of a constituent element such as theprocessor 122 or theperipheral device interface 123. - The
peripheral device interface 123 may control a connection of an input and output peripheral device, theprocessor 122, and thememory interface 121 of theelectronic device 100. - The
processor 122 may control theelectronic device 100 to provide various multimedia services using at least one software program and to display a UI operation of theelectronic device 100 at thedisplay unit 131 through the input andoutput processor 130. Theprocessor 122 may also control thetouch input device 132 to provide a service that receives input of an instruction from outside of theelectronic device 100. By executing at least one program stored at thememory 110, theprocessor 122 may control to provide a service corresponding to a corresponding program. - The input and
output processor 130 may provide an interface between an input andoutput device 133, such as thedisplay unit 131 and thetouch input device 132, and theperipheral device interface 123. - The
display unit 131 may receive state information of theelectronic device 100, a character, a moving picture, or a still picture input from outside of theprocessor unit 120. Thedisplay unit 131 may form an UI operation, and display the UI operation through the input andoutput processor 130. - The
touch input device 132 may provide input data occurring by a user's selection to theprocessor unit 120 through the input andoutput processor 130. According to an embodiment of the present invention, in order to receive data from outside of theelectronic device 100, thetouch input device 132 may be formed with only a control button or may be formed with a keypad. - According to an embodiment of the present invention, the
touch input device 132 may be provided with the input andoutput device 133 together with thedisplay unit 131 so that an input and output may operate on one screen. Thetouch input device 132 used for the input andoutput device 133 may use at least one of a capacitive type, a resistive (pressure detection) type, an infrared ray type, an electromagnetic induction type, and an ultrasonic wave type. - According to an embodiment of the present invention, an input method of the
touch input device 132 may be a method of inputting an instruction when an input means is positioned within a predetermined distance from thetouch screen 133, in addition to a method of directly touching and inputting thetouch screen 133. The input method may use inputs such as a hovering touch, a floating touch, an indirect touch, a proximity touch, or a non-contact input. - The input and
output device 133 is a device that physically couples thetouch input device 132 as a single screen on thedisplay unit 131. When operating theelectronic device 100, the input andoutput device 133 may be a touch screen that can input an instruction by touching a screen configuration displayed in thedisplay unit 131. The touch screen can perform functions of both thedisplay unit 131 that displays a UI operation of theelectronic device 100 and thetouch input device 132 that inputs an external instruction to theelectronic device 100. Thus, in the following description, the touch screen may be formed as thetouch screen 133 including thedisplay unit 131 and thetouch input device 132. In an embodiment of the present invention, thetouch screen 133 is formed in a complex touch panel in which a touch panel and a pen touch panel are formed together. Thetouch screen 133 of theelectronic device 100 is not limited to a touch panel formed in a complex touch panel, and may be embodied as a touch screen to which a pen touch panel, which can perform only a pen touch, is applied. - The
audio processor 140 may provide an audio interface between a user and theelectronic device 100 through a speaker 141 and amicrophone 142. - The
communication system 150 performs a communication function. According to an embodiment of the present invention, thecommunication system 150 may perform communication with another electronic device using at least one of mobile communication, wire communication, and satellite communication through a base station, and is connected to at least one short range wireless communication module to perform short range wireless communication. - According to an embodiment of the present invention, a short range wireless communication module may perform communication with another electronic device using at least one of short range wireless communication such as, for example, infrared ray communication, Bluetooth communication, Bluetooth Low Energy (BLE) communication, Wi-Fi communication, Near Field Communication (NFC) wireless communication, Zigbee communication, and Ultra WideBand (UWB) communication, Wireless Local Area Network (LAN) communication, and wire communication. According to an embodiment of the present invention, the
communication system 150 or a short range wireless communication module is divided and described, but thecommunication system 150 and the short range wireless communication module may perform communication in one communication system module. - The
image sensor 160 may photograph an object and generate image data. According to an embodiment of the present invention, the image sensor 160 may include an optical unit and an operation detection sensor (motion sensor), and may be formed with a module such as an operation detection module and a camera module. The optical unit may be driven by a mechanical shutter, a motor, or an actuator, and may perform operations such as a zoom function and focusing by the actuator. The optical unit photographs a peripheral object, and the image sensor 160 may detect an image photographed by the optical unit and convert the detected image to an electrical signal. The image sensor 160 may be embodied as a sensor such as a Complementary Metal-Oxide Semiconductor (CMOS) or a Charge-Coupled Device (CCD), and another image sensor having high resolution may be used. The image sensor of the camera may house a global shutter therein. The global shutter may perform a function similar to a mechanical shutter housed in a sensor. -
electronic device 100 or an output to theelectronic device 100 may be a term representing a method of displaying a moving picture, a still picture, or an GUI operation on thetouch screen 133 of theelectronic device 100 or outputting a signal sound or a voice to the speaker 141. The term display or output may be used herein, and when it is necessary to distinguish a display or an output, the display or the output may be separately described. -
FIGS. 2A and 2B illustrate writing information that an electronic device can recognize, according to an embodiment of the present invention. - The
electronic device 100 may detect analog text, such as a character or a symbol included in a document captured through a camera device including an image sensor, by using thehandwriting processing program 116. That is, theelectronic device 100 may detect a character or a symbol input in handwriting by a user, as well as a character or a symbol printed in a constant font. Theelectronic device 100 converts the character or the symbol input in handwriting in the detected analog text to a digital text, i.e., to a digital character or symbol. Theelectronic device 100 may display the converted digital text on thetouch screen 133 of theelectronic device 100. - According to an embodiment of the present invention, the
electronic device 100 may detect ahighlighter input 203 input in handwriting in a document in which a text of a constant font is printed. Theelectronic device 100 may convert the detectedhighlighter input 203 to a corresponding digital highlighter effect input. - According to an embodiment of the present invention, the
electronic device 100 may detect a symbol input 201 , 207 , or 211 , input by handwriting in the document in which a text of a constant font is printed. The electronic device 100 may convert the detected symbol input 201 , 207 , or 211 to a corresponding digital symbol input. - According to an embodiment of the present invention, the
electronic device 100 may detect a character and/ornumeral input 205, input by handwriting in the document in which a text of a constant font is printed. Theelectronic device 100 may convert the detected character and/ornumeral input 205 to a corresponding digital character and/or numeral input. - According to an embodiment of the present invention, the
electronic device 100 may detect an annotation symbol input 209 and 219 , input by handwriting in the document in which a text of a constant font is printed. When detecting at least two of the same symbols input with handwriting in a document in which a text of a constant font is printed, the electronic device 100 may convert the detected annotation symbol to a corresponding digital annotation symbol input. According to an embodiment of the present invention, the annotation symbol input 209 may be described as one of various symbol inputs, such as, for example, the symbol inputs 201 , 207 , and 211 . - According to an embodiment of the present invention, the
electronic device 100 may detect atext 213, input by handwriting in the document in which a text of a constant font is printed. Theelectronic device 100 may convert the detected symbol andtext 213 input with handwriting to a corresponding digital description input. - According to an embodiment of the present invention, the
electronic device 100 may detect afigure input 215, input by handwriting in the document in which a text of a constant font is printed. Theelectronic device 100 may convert thefigure input 215 detected in a document input with a digital text to a corresponding digital figure input. - According to an embodiment of the present invention, the
electronic device 100 may detect an underline input 208 , 210 , 212 , and 217 that is input with handwriting in the document in which a text of a constant font is printed. The electronic device 100 may convert the detected underline input to a corresponding digital underline input. The electronic device 100 may detect underlining with various geometrical lines such as a wave 217 , a straight line 208 , 210 , or 212 , and a dotted line. - According to an embodiment of the present invention, an operation in which the
electronic device 100 converts a text such as the highlighter input 203 , the character and/or numeral input 205 , the symbol input 201 , 207 , 209 , and 211 , the description input 213 , the underline input 208 , 210 , 212 , and 217 , or the figure input 215 to a digital text may include the operation of determining matching or similar data at the memory 110 of the electronic device 100 . - Referring to
FIG. 2B , theelectronic device 100 may detect text information in which handwriting is input to the document in which a digital text is displayed, determine digital text corresponding to the detected text information, and display the determined digital text in the digital document. Theelectronic device 100 may determine whether a text included in document information acquired through text information stored at a database is a printed text of a digital form or a handwritten text. - According to an embodiment of the present invention, the
electronic device 100 may convert handwritten text such as, for example, the detected highlighter input 203 , the character and/or numeral input 205 , the symbol input 201 , 207 , 209 , and 211 , the description input 213 , the underline input 208 , 210 , 212 , and 217 , or the figure input 215 to a digital text input such as, for example, a digital highlighter input 223 , a digital character and/or numeral input 225 , a digital symbol input 221 , 227 , 229 , and 231 , a digital description input 233 , a digital underline input 228 , 230 , 232 , and 237 , or a digital figure input 235 , corresponding to each text. The electronic device 100 may display the determined digital text input on the touch screen 133 . According to an embodiment of the present invention, the electronic device 100 may output digital text displayed on the touch screen 133 with a sound through the speaker 141 .
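- The table of conversions implied by FIGS. 2A and 2B can be sketched as a mapping from detected handwritten input classes to digital inputs, with a cropped image as the fallback when no digital equivalent is found. The class names and HTML-like rendering below are editorial assumptions, not the disclosure's format.

```python
# Minimal sketch: replace each detected handwritten input class by a digital input.
DIGITAL_EQUIVALENTS = {
    "highlighter": lambda t: f"<mark>{t}</mark>",
    "underline":   lambda t: f"<u>{t}</u>",
    "asterisk":    lambda t: f"*{t}",
    "figure:rect": lambda t: f"[{t}]",
}


def convert_handwritten_input(input_class, covered_text):
    render = DIGITAL_EQUIVALENTS.get(input_class)
    if render is None:
        # No matching digital input: keep the handwriting as a cropped image.
        return f"<img alt='handwritten: {covered_text}'/>"
    return render(covered_text)


print(convert_handwritten_input("highlighter", "optical character recognition"))
print(convert_handwritten_input("unknown-scribble", "due date: 2013.08.11"))
```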
- According to an embodiment of the present invention, the electronic device 100 may convert the detected highlighter input 203 to the corresponding digital highlighter effect 223 , and display the determined digital highlighter effect 223 in a predetermined area of the touch screen 133 , or an area of a digital document corresponding to an area of the highlighter input 203 . - According to an embodiment of the present invention, the
electronic device 100 may convert the detectedsymbol input 201 to the corresponding digital symbol effect 221 (e.g., asterisk), and display the determineddigital symbol effect 221 in a predetermined area of thetouch screen 133, or an area of a digital document corresponding to an area of thesymbol input 201. - According to an embodiment of the present invention, the
electronic device 100 may convert the detected at least one symbol input 207 and 211 of the same method and a continued order to the corresponding digital symbol input 227 or 231 , and display the determined digital symbol input 227 or 231 at a predetermined position of the touch screen 133 , or a position of the digital document corresponding to a position of the at least one symbol input 207 and 211 . - According to an embodiment of the present invention, the
electronic device 100 may convert the detected two or more same annotation symbols 209 and 219 to the corresponding digital annotation symbols 229 and 239 , and display the determined digital annotation symbols 229 and 239 at positions of the digital document corresponding to positions of the annotation symbols 209 and 219 , or a predetermined position of the touch screen 133 . According to an embodiment of the present invention, when selecting the annotation 229 of the digital document displayed on the touch screen 133 , the electronic device 100 may move the display to a position of 239 of the digital document corresponding to the annotation 209 . When selecting 239 , the electronic device 100 may display the position of 209 of the digital document.
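- The paired annotation behavior can be sketched as follows: occurrences of the same annotation mark are linked so that selecting one jumps to the position of the other. This is an editorial illustration; the data structures are assumptions.

```python
# Minimal sketch: link matching annotation marks and navigate between them.
from collections import defaultdict


def link_annotation_pairs(detected_symbols):
    """detected_symbols: list of (mark, position). Returns position -> target."""
    by_mark = defaultdict(list)
    for mark, pos in detected_symbols:
        by_mark[mark].append(pos)
    links = {}
    for positions in by_mark.values():
        if len(positions) == 2:            # only paired marks are linked
            a, b = positions
            links[a], links[b] = b, a
    return links


links = link_annotation_pairs([("*1", 120), ("check", 300), ("*1", 2040)])
print(links[120])   # selecting the first occurrence jumps to position 2040
```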
- According to an embodiment of the present invention, by combining at least one of the detected parentheses, asterisk, and text, the electronic device 100 may convert the symbol and character 213 input with handwriting to the corresponding digital symbol and text 233 . The electronic device 100 may convert the combined symbol and character 213 to the digital description input 233 through a preset database, as described above. The electronic device 100 may display the determined digital description input 233 at a position of the digital document corresponding to a position of the symbol and character 213 , or a predetermined position of the touch screen 133 . - According to an embodiment of the present invention, the
electronic device 100 may convert arectangle 215, a triangle, a circle, and figure input that is input with handwriting to the correspondingdigital figure input 235, and display the determineddigital figure input 235 at a position of the digital document corresponding to a position of thefigure input 215, or a predetermined position of thetouch screen 133. - According to an embodiment of the present invention, the
electronic device 100 may convert an underline input such as, for example, the detected straight line form underline 208 , 210 , or 212 , a wave form underline 217 , or a dotted line form underline to a corresponding digital underline input 228 , 230 , 232 , and 237 , and display the determined digital underline input 228 , 230 , 232 , and 237 at a position of the digital document corresponding to a position of the underline input 208 , 210 , 212 , and 217 or a predetermined position of the touch screen 133 . - According to an embodiment of the present invention, the
electronic device 100 may not convert handwritten text into digital text. Theelectronic device 100 may copy or cut out a predetermined area including handwritten text that is not converted to a digital text. Theelectronic device 100 may display a predetermined area cropped through copy or crop in a predetermined area of the digital document. - According to an embodiment of the present invention, the
electronic device 100 may not convert handwritten input (digital character and/or numeral input) such as ‘due date: 2013.08.11’ 205 to a corresponding digital input. Handwritten input may not be converted to a corresponding digital input when a digital text corresponding to a portion or the entirety of the handwritten input is not matched at a database. The electronic device 100 may crop the handwritten input area 205 in image data that photographs the document and display the cropped handwritten area 225 at a predetermined position of the digital document. When displaying the cropped handwritten area 225 , the electronic device 100 may rotate a handwritten area that is obliquely input and display it with a slope corresponding to a text line of the digital document. According to an embodiment of the present invention, when the electronic device 100 does not determine a character or numeral corresponding to at least one character or numeral of the handwritten input, the electronic device 100 may output a matching error code (wrongly converted text). When displaying the cropped handwritten area 225 , the electronic device 100 may display the matching error code in the cropped handwritten area 225 of the digital document when the display of the cropped area is released, and may display the cropped handwritten area 225 when the matching error code is displayed.
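- Rotating an obliquely written crop so that it lines up with the text lines of the digital document can be sketched as a simple image rotation before pasting. Pillow and the 7-degree slope below are assumptions used only for illustration.

```python
# Minimal sketch: align a cropped handwritten region with the document's text lines.
from PIL import Image

handwritten_crop = Image.new("RGB", (400, 60), "white")

measured_slope_degrees = 7.0          # slope of the handwriting vs. the text line
aligned = handwritten_crop.rotate(
    measured_slope_degrees,           # rotate back by the measured slope
    expand=True,                      # keep the whole crop visible
    fillcolor="white")                # background matches the document page

aligned.save("aligned_crop.png")
```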
- According to an embodiment of the present invention, when displaying various digital inputs such as the above-described digital highlighter input, digital character and/or numeral input, digital symbol input, digital description input, digital underline input, or digital figure input on the touch screen 133 , the electronic device 100 does not limit the area for display to a previously displayed area and may move a position thereof. -
FIGS. 3A and 3B illustrate operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention. - According to an embodiment of the present invention, the
electronic device 100 may photograph a document formed with a text and handwriting input and generate a digital document. When including a figure input with handwriting in acquired document information, theelectronic device 100 may include a figure input with handwriting or a function corresponding to a figure in the digital document. - According to an embodiment, the
electronic device 100 may photograph a document formed with a text and handwriting input through an image sensor. When including a text having handwriting input in acquired document information, theelectronic device 100 may convert the handwritten input to a corresponding digital text. - According to an embodiment of the present invention, the
electronic device 100 may detect a figure displayed with handwritten input in a text portion of a constant font in acquired document information. Theelectronic device 100 may determine a figure in a database that matches the handwritten input. - According to an embodiment of the present invention, the
electronic device 100 may detect a figure of a rectangle 301 displayed around ‘optical character recognition’ in the acquired document information, and may detect a figure of an asterisk (*) 303 displayed near ‘OCR’. The electronic device 100 may determine a digital text matched to the rectangle 301 and the asterisk 303 with reference to a database. - As shown in
FIG. 3B , theelectronic device 100 may convert a text having handwriting input to acquired document information. Theelectronic device 100 may match the text having handwriting input to a digital text of a database and display an acquired digital text or a function connected to the digital text in the digital document. - According to an embodiment of the present invention, the
electronic device 100 may determine a digital figure matched to handwriting input, and display an acquired digital figure at a predetermined position of the digital document. - According to an embodiment of the present invention, the
electronic device 100 may detect a figure 311 of a rectangular form input to an area of ‘optical character recognition’ in the acquired document information. The electronic device 100 may determine information matched to the detected rectangular form figure 311 through a database. The electronic device 100 may reverse and output the area ‘optical character recognition’ of the digital document according to the determined information. - According to an embodiment of the present invention, the
electronic device 100 may detect that anasterisk 303 is input with handwriting in an area of ‘OCR’ in acquired document information. Theelectronic device 100 may determine information matched to the detectedasterisk 303 through a database. Theelectronic device 100 may reset a font of a word ‘OCR’ 313 with a predetermined method according to the determined information. Theelectronic device 100 may detect the same word ‘OCR’ 315. Theelectronic device 100 may reset a font of the detected ‘OCR’ 315 with the same method as a resetting font of the ‘OCR’ 313. -
FIGS. 4A and 4B illustrate an operation of displaying document information that an electronic device acquires, according to an embodiment of the present invention. - The
electronic device 100 may generate a digital document through image data including text and handwriting input. When including a symbol input with handwriting in acquired document information, theelectronic device 100 may include a symbol input with handwriting or a function corresponding to a symbol in the digital document. - According to an embodiment of the present invention, the
electronic device 100 may detect at least one symbol input with handwriting in acquired document information. Theelectronic device 100 may determine a digital symbol that matches the detected at least one symbol at a database. Theelectronic device 100 may determine two or more connected symbols among the determined symbol. - According to an embodiment of the present invention, the
electronic device 100 may detect symbols 405 , 401 , and 403 input with handwriting in acquired document information. The electronic device 100 may determine a matched digital symbol through information of a database. The electronic device 100 may determine that two or more digital symbols of the matched digital symbols 405 , 401 , and 403 are connected. - According to an embodiment of the present invention, when acquired document information includes an underline in relation to a symbol input with handwriting, the
electronic device 100 may determine a word included in an area of the underline. According to an embodiment of the present invention, theelectronic device 100 may detect thesymbol 403 in acquired document information and determine a phrase ‘bikes rides’ included in an area of an underline input with handwriting in an area near 403. When determining a phrase ‘bikes rides’ included in an area of near 403, theelectronic device 100 may refer to acquired document information and/or a digital document generated through acquired document information. - According to an embodiment of the present invention, when acquired document information does not include handwriting input of an underline in relation to a symbol input with handwriting, the
electronic device 100 may determine a word from the area in which a symbol input with handwriting is positioned to the area in which the sentence is terminated. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 401 in acquired document information and determine a phrase ‘more photos’ from the word ‘more’ at the position of the symbol 401 to the word ‘photos’ where the sentence is terminated. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 405 in acquired document information and determine ‘be inspired’ from the word ‘be’ at the position of the symbol 405 to the word ‘inspired’ where the sentence is terminated. When determining the phrase ‘more photos’ of the symbol 401 or the phrase ‘be inspired’ of the symbol 405 , the electronic device 100 may refer to acquired document information and/or a digital document generated through acquired document information.
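- Determining the word range that belongs to a symbol when no underline accompanies it can be sketched as collecting the words from the symbol position up to the sentence terminator. The tokenization below is an illustrative simplification, not the disclosed recognizer.

```python
# Minimal sketch: range of words from a symbol position to the end of the sentence.
import re


def phrase_from_symbol(text, symbol_char_index):
    """Return the words from the symbol position to the sentence terminator."""
    remainder = text[symbol_char_index:]
    sentence = re.split(r"[.!?]", remainder, maxsplit=1)[0]
    return sentence.strip()


page_text = "Sign up to share more photos. Ride with friends and be inspired."
print(phrase_from_symbol(page_text, page_text.index("more")))    # 'more photos'
print(phrase_from_symbol(page_text, page_text.index("be ins")))  # 'be inspired'
```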
- According to an embodiment of the present invention, a heart symbol 407 may be included in the determined area of the symbol 405 . When determining the area of the symbol 405 , the electronic device 100 may include the heart symbol 407 input with handwriting beside ‘be inspired’. - The
electronic device 100 may match a symbol input with handwriting to a digital symbol of a database. When two or more related symbols are determined, theelectronic device 100 may display two or more related symbols and a word or a sentence connected to each symbol. - According to an embodiment of the present invention, the
electronic device 100 may display two or more circle characters determined through acquired document information in a separate area of the digital document. According to an embodiment of the present invention, the electronic device 100 may convert the symbols 405 , 401 , and 403 to related symbols in acquired document information, as shown in the touch screen 133 of FIG. 4B . The electronic device 100 may determine words in an area including each symbol 405 , 401 , and 403 and another symbol, and display the word and the other symbol in a separate area. The electronic device 100 may determine at least one of a predetermined area of the digital document, a pop-up window displayed in the digital document, and a layer area separate from the digital document as the separate area to be displayed. - According to an embodiment of the present invention, the
electronic device 100 may determine ‘be inspired’ and theheart symbol 407 included in thesymbol 405 as a first symbol area. Theelectronic device 100 may determine a digital heart symbol matched to theheart symbol 407 input with handwriting. When a digital heart symbol cannot be matched, theelectronic device 100 may crop a symbol area of acquired document information and include the symbol area in the digital document. According to an embodiment of the present invention, theelectronic device 100 may determine ‘more photos’ included in thesymbol 401 as a second symbol area. According to an embodiment of the present invention, theelectronic device 100 may determine ‘bikes rides’ included in thesymbol 403 as a third symbol area. - According to an embodiment of the present invention, when a symbol is related to order, as described above, the
electronic device 100 may input a symbol related to order, as shown inFIG. 4B . Theelectronic device 100 may determine that two or more symbols related to order are input in order of 405, 403, and 401 and rearrange the symbols in an order of 405, 401, and 403. According to an embodiment of the present invention, theelectronic device 100 may reset a related symbol area input in an order of a second symbol area, a third symbol area, and a first symbol area to an order of a first symbol area, a second symbol area, and a third symbol area. - According to an embodiment of the present invention, a predetermined operation can display the above-described ‘numbered list’ in proper order.
-
FIGS. 5A and 5B illustrate an operation of displaying document information acquired by an electronic device, according to an embodiment of the present invention.
- According to an embodiment of the present invention, the electronic device 100 may detect at least one symbol input with handwriting in the acquired document information. The electronic device 100 may determine that the detected symbols are formed of two or more types and distinguish the symbols according to each type.
- According to an embodiment of the present invention, the electronic device 100 may detect the symbols 501, 503, 505, 507, and 509 and the check marks 511 and 513 input with handwriting in the acquired document information. The electronic device 100 may determine a matched digital symbol through information of a database. The electronic device 100 may determine that the symbols 503, 501, and 509 match one type (e.g., a circle character), that the symbols 505 and 507 match another type (e.g., a parenthesis character), and that the check marks 511 and 513 match a single type.
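Distinguishing detected marks by type can be pictured with the following toy grouping; the (type, reference number) pairs are only stand-ins for whatever a real classifier and database lookup would return.

```python
from collections import defaultdict

def group_by_type(detections):
    """Group detected handwritten marks by the symbol type the database matched.

    detections: iterable of (matched_type, reference_number) pairs.
    """
    groups = defaultdict(list)
    for kind, ref in detections:
        groups[kind].append(ref)
    return dict(groups)

marks = [("circle", 503), ("circle", 501), ("parenthesis", 505),
         ("parenthesis", 507), ("circle", 509), ("check", 511), ("check", 513)]
print(group_by_type(marks))
# {'circle': [503, 501, 509], 'parenthesis': [505, 507], 'check': [511, 513]}
```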
- According to an embodiment of the present invention, the electronic device 100 may determine the area of a word or a sentence in relation to a distinguished symbol (e.g., a circle character). When a circle character area includes an underline, the electronic device 100 may determine the word corresponding to the underline. According to an embodiment of the present invention, the electronic device 100 may detect the symbol 501 in the acquired document information and determine the phrase ‘scanned images’ in the area of the underline 502 input with handwriting near the symbol 501. When determining the phrase ‘scanned images’ in the area of the input underline 502, the electronic device 100 may refer to the acquired document information and/or the digital document generated from the acquired document information. The electronic device 100 may determine the word ‘typewritten’ in the area of the underline 504 near the symbol 503, and the phrase ‘text-to-speech’ in the area of the underline 510 near the symbol 509, according to the above-described method.
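One plausible way to pick the words that correspond to a handwritten underline is to compare bounding boxes from layout analysis. The following sketch assumes such boxes are already available; the Box structure, the 6-pixel gap threshold, and the coordinate convention are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Box:
    x0: float  # left
    y0: float  # top
    x1: float  # right
    y1: float  # bottom

def words_over_underline(words, underline, max_gap=6.0):
    """Return the recognized words whose boxes sit directly above the underline.

    words: list of (text, Box) pairs; underline: Box of the handwritten stroke.
    """
    picked = []
    for text, box in words:
        overlaps = box.x0 < underline.x1 and box.x1 > underline.x0
        just_above = 0.0 <= underline.y0 - box.y1 <= max_gap
        if overlaps and just_above:
            picked.append(text)
    return " ".join(picked)

line = [("scanned", Box(10, 50, 60, 62)), ("images", Box(65, 50, 110, 62)),
        ("are", Box(115, 50, 135, 62))]
print(words_over_underline(line, Box(8, 64, 112, 66)))   # -> 'scanned images'
```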
- According to an embodiment of the present invention, the electronic device 100 may determine the phrase ‘printed text into machine’ in the area of the underline 506 near the symbol 505, according to the above-described method. When the area near the symbol 507 does not include an underline, the electronic device 100 may determine the phrase associated with the symbol 507 as ‘printed records’, according to the method described above with reference to FIG. 4A.
- According to an embodiment of the present invention, the electronic device 100 may determine the area of a word or a sentence associated with another distinguished symbol (e.g., a check mark) according to the above-described method. Specifically, when the electronic device 100 detects a check mark input with handwriting, the electronic device 100 may determine the area of the word or sentence from the point the check mark indicates to the end of the sentence. According to an embodiment of the present invention, the electronic device 100 may determine the area corresponding to the check mark 511 as ‘OCR is a field of research in pattern recognition, artificial intelligence and computer vision’, and may determine the area corresponding to the check mark 513 as ‘“intelligent” systems with a high degree of recognition accuracy for most fonts are now common’.
- As shown in FIG. 5B, the electronic device 100 may determine the handwritten text input to the acquired document information. When the symbols input with handwriting are formed of two or more types, the electronic device 100 may display the word or sentence connected to each type of symbol.
- As shown in FIG. 5B(a), according to an embodiment of the present invention, the electronic device 100 may display symbols such as a circle character, a parenthesis character, and a check mark according to each type determined from the acquired document information, and may display the word or sentence included in the area of each symbol in a separate area of the digital document. When determining the range of the above-described symbols, the electronic device 100 does not limit the range to the special characters of a circle character or a parenthesis character, and may include various other special characters in the range of a symbol. The electronic device 100 may determine the symbol 503 and ‘typewritten’ as a first circle character area, the symbol 501 and ‘scanned images’ as a second circle character area, the symbol 509 and ‘text-to-speech’ as a third circle character area, the symbol 505 and ‘printed text into machine’ as a first parenthesis character area, the symbol 507 and ‘printed records’ as a second parenthesis character area, the check mark 511 and ‘OCR is a field of research in pattern recognition, artificial intelligence and computer vision’ as a first check mark area, and the check mark 513 and ‘“intelligent” systems with a high degree of recognition accuracy for most fonts are now common’ as a second check mark area. Each area may be an area divided to correspond to each symbol type acquired by the electronic device 100.
- According to an embodiment of the present invention, the electronic device 100 may display the symbols and the corresponding text areas, according to the order of the symbols, on the touch screen 133 that displays the digital document.
- According to an embodiment of the present invention, the electronic device 100 may determine that circle character symbols input with handwriting were not input in order, and may output the acquired circle character areas in the digital document according to a predetermined order. The circle character areas may be input in the order of the second circle character area, the first circle character area, and the third circle character area, and the electronic device 100 may arrange them in the order of the first circle character area, the second circle character area, and the third circle character area, according to a predetermined function of the circle character digital symbol. The electronic device 100 may display the first, second, and third circle character areas, in the determined order, in a predetermined display area (e.g., ‘numbered text’) of the touch screen 133. According to an embodiment of the present invention, the electronic device 100 may determine another type of symbol (e.g., a parenthesis character area) that can be displayed in order, and may display the first and second parenthesis character areas in the numbered text area that displays the first, second, and third circle character areas described above.
- As shown in FIG. 5B(b), according to an embodiment of the present invention, the electronic device 100 may display the symbol corresponding to each type (e.g., a check mark area) and the area including the symbol in an area separate from the area that displays the digital document. According to an embodiment of the present invention, the electronic device 100 may generate and display a new pop-up window for the first check mark area and the second check mark area on the touch screen 133 that displays the digital document.
- FIGS. 6A and 6B illustrate an operation of displaying document information acquired by an electronic device, according to an embodiment of the present invention.
- When displaying a digital document generated from the acquired document information on a touch screen, the electronic device 100 may output text corresponding to handwriting input in a predetermined area (e.g., the space between lines) formed with printed text.
- As shown in FIG. 6A, the electronic device 100 may detect “Current issues” 603 input with handwriting between lines of text of a document printed in a constant font. The electronic device 100 may detect “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting in an area in which no text is printed, under a line of printed text. The electronic device 100 may determine a digital text matched to each detected text.
- According to FIG. 6B, the electronic device 100 may display the determined digital text “Current issues” 613 in the space between ‘Performing multisite designs’ and ‘Reach what may be’, which is the same position as that of “Current issues” 603 input with handwriting.
- According to an embodiment of the present invention, the electronic device 100 may insert the determined digital text “Reach themselves in situation what scanning will be in enough cases generally” 611 under “schofield increasing, the generalizability of” displayed in the digital document, with reference to the position of “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting.
- According to an embodiment of the present invention, the electronic device 100 may display the determined text “Reach themselves in situation what scanning will be in enough cases generally” 611 at the same position as that of “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting.
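A bare-bones sketch of this placement step follows. It is illustrative only: the printed document is modelled as a list of recognized lines, and the converted handwritten note is inserted after the line it was written beneath or between in the scanned image; real positioning would use layout coordinates rather than a line index.

```python
def insert_annotation(printed_lines, annotation, after_index):
    """Place a converted handwritten note after the printed line it follows."""
    merged = list(printed_lines)
    merged.insert(after_index + 1, annotation)
    return merged

page = ["Performing multisite designs", "Reach what may be"]
for line in insert_annotation(page, "Current issues", after_index=0):
    print(line)
# Performing multisite designs / Current issues / Reach what may be
```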
- According to an embodiment of the present invention, when the electronic device 100 cannot determine a digital text matched to a text input with handwriting, the electronic device 100 may crop the image data of the text area in which the handwriting is input, and display the cropped image in a predetermined area of the digital document.
- According to an embodiment of the present invention, when determining a digital text matched to the handwritten text “Reach themselves in situation what scanning will be in enough cases generally” 601, the electronic device 100 may fail to determine a digital character matched to at least one handwritten character. Because the electronic device 100 cannot determine that character of the digital text, the electronic device 100 cannot complete a digital text such as the above-described “Reach themselves in situation what scanning will be in enough cases generally” 611. The electronic device 100 may crop the image data of the area of “Reach themselves in situation what scanning will be in enough cases generally” 601, and insert the cropped image 601 at the position that would display “Reach themselves in situation what scanning will be in enough cases generally” 611 with reference to FIG. 6B. According to an embodiment of the present invention, when displaying the cropped image 601, the electronic device 100 may display the cropped image at the same position in the corresponding digital document as that of “Reach themselves in situation what scanning will be in enough cases generally” 601 input with handwriting, or may determine and change the position to an arbitrary position.
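The fallback just described (emit digital text only when every character was matched, otherwise keep a crop of the handwritten region) can be sketched as below. The per-character result list and the bytes placeholder for the cropped image are assumptions made for illustration.

```python
def convert_or_crop(matched_chars, cropped_region):
    """Return ('text', ...) when all characters were matched, else ('image', ...).

    matched_chars: list of recognized characters, with None for any character
    that could not be matched to a digital character.
    """
    if all(ch is not None for ch in matched_chars):
        return ("text", "".join(matched_chars))
    return ("image", cropped_region)   # the crop is inserted at the same position

print(convert_or_crop(list("Current issues"), b"..."))      # ('text', 'Current issues')
print(convert_or_crop(["R", None, "a", "c", "h"], b"..."))   # ('image', b'...')
```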
- FIGS. 7A to 7E illustrate an operation of displaying document information acquired by an electronic device, according to an embodiment of the present invention.
- The electronic device 100 may display the digital text in the digital document using various methods.
- According to an embodiment of the present invention, the electronic device 100 may determine each of the various types of symbols or characters included in the handwritten text input to the acquired document information.
- According to an embodiment of the present invention, the electronic device 100 may determine, in the symbol and character 701 input with handwriting in the acquired document information, a symbol that designates a range of text printed in a constant font, an asterisk (a predetermined symbol), and handwritten characters. The electronic device 100 may determine the symbol that designates a range of text and determine the text corresponding to that range symbol. The electronic device 100 may detect the characters input with handwriting and determine matched digital characters. The electronic device 100 may determine the asterisk (predetermined symbol) and, according to the function of the asterisk, connect the text corresponding to the range-designating symbol with the characters input with handwriting.
- According to an embodiment of the present invention, as shown in FIG. 7A, the electronic device 100 may determine, in the acquired document information, a predetermined area of the printed text according to a handwritten symbol that designates a range of text printed in a constant font. The electronic device 100 may determine a text area including the symbol as the two lines of FIG. 7A that start with ‘High’ and terminate with ‘manner’, and may determine the sentence ‘at whoever cost of trouble to learn how to read’, divided by a comma (,) or a period (.) within the determined two lines, as the area corresponding to the range-designating symbol.
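To make the range-plus-asterisk behaviour concrete, the sketch below pairs a span of the printed text with a handwritten note. It is only an illustration; the character offsets and the dictionary result format are assumptions, not the document's data model.

```python
def link_range_to_note(printed_text, range_start, range_end, handwritten_note):
    """Pair a range-designated span of printed text with the handwritten note
    that an asterisk links to it."""
    span = printed_text[range_start:range_end]
    return {"range_text": span, "linked_note": handwritten_note}

sentence = "at whoever cost of trouble to learn how to read, in a timely manner"
link = link_range_to_note(sentence, 0, 47,
                          "firm determination that should learn how to read")
print(link["range_text"])   # at whoever cost of trouble to learn how to read
print(link["linked_note"])  # firm determination that should learn how to read
```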
- According to an embodiment of the present invention, when displaying handwritten text input to the acquired document information in the digital document, the electronic device 100 may display a menu for selecting at least one display method.
- As shown in FIG. 7B(b), according to an embodiment of the present invention, the electronic device 100 may convert the characters and symbols input with handwriting in the acquired document information to a matched digital text, and provide a display method according to the function of the determined digital text.
- According to an embodiment of the present invention, when presenting the method of displaying the handwritten text in the digital document, the electronic device 100 may output the text in a selection area separate from the digital document, such as ‘Detected text’ 721. The electronic device 100 may display, with an underline, ‘at whoever cost of trouble to learn how to read’ 723 corresponding to the range-designating symbol, and may insert the digital text ‘firm determination that should learn how to read’ 725 (the predetermined function of the asterisk) below ‘at whoever cost of trouble to learn how to read’ 723. The electronic device 100 may display an icon ‘save’ 727 and/or ‘cancel’ 729 that can determine whether to display in the digital document according to the above-described method.
- As shown in FIG. 7B(a), according to an embodiment of the present invention, when the electronic device 100 cannot convert at least one of the characters and symbols input with handwriting in the acquired document information to a digital text, the electronic device 100 may provide various methods of displaying them in the digital document.
- According to an embodiment of the present invention, when presenting the method of displaying the handwritten text in the digital document, the electronic device 100 may output the text in a selection area separate from the digital document, such as ‘Detected text’ 707. The electronic device 100 may determine a digital symbol matched to the range-designating symbol and display an underline accordingly, as shown for the corresponding ‘at whoever cost of trouble to learn how to read’ 705.
- According to an embodiment of the present invention, when the electronic device 100 cannot determine a digital symbol matched to the range-designating symbol, the electronic device 100 cannot display an underline for the corresponding ‘at whoever cost of trouble to learn how to read’ 705 according to that digital symbol. The electronic device 100 may also fail to determine a digital text for at least one of the predetermined symbol (asterisk) input with handwriting, the characters input with handwriting, and the handwritten range-designating symbol, and may therefore not perform the combined function. The electronic device 100 may crop the image data corresponding to the symbol and character area in which the handwriting is input in the acquired document information, and display the cropped image at the same digital document position as that of the area in which the handwriting is input. The electronic device 100 may display an icon ‘save’ 709 and/or ‘cancel’ 711 that can determine whether to display in the digital document according to the above-described method.
- As shown in FIG. 7C, according to an embodiment of the present invention, the electronic device 100 may output the handwritten text to the digital document with at least one of various display methods. The electronic device 100 may set (or reset) the method of outputting the handwritten text to the digital document.
- According to an embodiment of the present invention, when ‘save’ 709 is selected in the selection window shown in FIG. 7B(a), the electronic device 100 may display a digital document as shown in FIG. 7C. Because the electronic device 100 cannot match at least one text to a digital text, the electronic device 100 may display a cropped image 733 and perform again the operation of converting a character or symbol of the cropped image 733 to a digital text.
- When ‘cancel’ 729 is selected in the selection window shown in FIG. 7B(b), the electronic device 100 may not display the determined digital texts ‘at whoever cost of trouble to learn how to read’ 705 and ‘firm determination that should learn how to read’ 725. Even when the electronic device 100 determines a digital text matched to the handwritten text, the electronic device 100 may, according to the setting, display the cropped image 733 in the digital document as shown in FIG. 7C.
- As shown in FIG. 7D, according to an embodiment of the present invention, in a method of displaying the handwritten text of the acquired document information in the digital document, when a text area displayed in the digital document is selected with an input means 743 (e.g., a finger or an electronic pen), the electronic device 100 may be set to display the connected text. According to an embodiment of the present invention, in determining what counts as selection with the input means, the electronic device 100 may use at least one of directly touching the text area with the input means and indirectly touching it (e.g., hovering) with the input means.
- According to an embodiment of the present invention, when displaying the digital document generated from the acquired document information on the touch screen 133, the electronic device 100 may display the digital text determined from the text printed in a constant font. When the displayed digital text includes an area connected to handwritten text, the electronic device 100 may indicate (e.g., with an underline 741) that connected text or connected data exists. According to an embodiment of the present invention, the underline 741 indicating that data exists may be displayed according to the function of the range-designating symbol described with reference to FIG. 7B. The electronic device 100 may indicate such a predetermined function in the digital document through various methods of outputting the handwritten text, such as the range-designating symbol. According to an embodiment of the present invention, the electronic device 100 may change the color of the digital text to distinguish it from a digital text displayed with an underline carrying another function, and may add various effects, such as displaying the underline in a different form, like two or three lines. The electronic device 100 may apply an effect according to a predetermined method.
- According to an embodiment of the present invention, when the digital text 731 to which other data is connected is selected, the electronic device 100 may display the connected data ‘firm determination that should learn how to read’ 745 in an area (e.g., a pop-up window) separate from the digital document that displays the digital text 731. According to an embodiment of the present invention, when displaying the connected data, the electronic device 100 may be set to display an image (e.g., 733 of FIG. 7C) that crops the handwritten text, in addition to the method of displaying the digital text 745 determined from the handwritten text.
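The link-and-pop-up behaviour can be pictured as a simple lookup keyed by the underlined span; the dictionary below and the on_select name are invented for illustration, and the actual pop-up rendering is left to a UI layer.

```python
from typing import Optional

# Hypothetical registry: underlined digital text -> connected data
# (a converted handwritten note, or a cropped image in a fuller implementation).
linked_data = {
    "at whoever cost of trouble to learn how to read":
        "firm determination that should learn how to read",
}

def on_select(span: str) -> Optional[str]:
    """Return the data connected to a selected span, or None if nothing is linked.
    A UI layer would show a non-None result in a pop-up window."""
    return linked_data.get(span)

print(on_select("at whoever cost of trouble to learn how to read"))
```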
- As shown in FIG. 7E, according to an embodiment of the present invention, in a method of displaying the handwritten text of the acquired document information in the digital document, the electronic device 100 may display an object (e.g., an icon) that controls the display of the handwritten text in a predetermined area of the digital document.
- According to an embodiment of the present invention, the electronic device 100 may connect ‘at whoever cost of trouble to learn how to read’ 751, displayed in the digital document of the electronic device 100, to digital text data or to handwritten text (a cropped image) according to a predetermined symbol. The electronic device 100 may display an object 753 that can display the connected digital text data or handwritten text (cropped image) at a predetermined position of the digital document. The predetermined position for displaying the object 753 may be a position within a determined range around ‘at whoever cost of trouble to learn how to read’ 751, to which the digital text data or handwritten text (cropped image) connected to the object 753 is linked according to the predetermined symbol. When the displayed object 753 is selected, the electronic device 100 may generate and display the handwritten text (cropped image) or digital text data connected to the object in an area separate from the digital document, as shown in FIG. 7D. The electronic device 100 does not limit the position of the displayed object 753 to a position within a determined range around ‘at whoever cost of trouble to learn how to read’ 751, and may set the position of the displayed object 753 to an arbitrary position of the digital document.
- FIGS. 8A and 8B are flowcharts illustrating an operation of displaying document information acquired by an electronic device, according to an embodiment of the present invention.
- As shown in FIG. 8A, according to an embodiment of the present invention, the electronic device 100 may detect handwritten text from the image data of an acquired analog document, and convert the handwritten text to a matching digital text. When the handwritten text includes a symbol, the electronic device 100 may perform an operation corresponding to the symbol, and display a digital document generated based on the text included in the image data of the analog document.
- The electronic device 100 photographs a document that displays the text through the image sensor 160, in step 801. The electronic device 100 may also receive image data including the text through network communication, and the document information acquired from image data previously stored in the memory 110 may include text printed in a constant font and text input with handwriting. The electronic device 100 may detect the text displayed in the document through a text recognition and conversion program such as the OCR program 115 and/or the handwriting processing program 116.
- The electronic device 100 determines whether a digital character is matched to text detected in the photographed document (acquired through direct photographing, or received as photographed image data), with reference to a database, in step 803. If at least one character of a word cannot be determined, or if at least one word of a sentence cannot be determined, the electronic device 100 may terminate the operation of converting the corresponding word or sentence to a digital text.
- If the characters and words can be determined and the handwriting input can be converted to digital text, the electronic device 100 converts the handwritten text to a matching digital text, in step 805. The electronic device 100 may match various handwritten inputs such as, for example, various types of characters (e.g., a circle character and a parenthesis character), a highlighter, an underline, a figure, and handwritten characters to the same or a similar text stored in a database.
- The electronic device 100 may determine whether the handwriting input includes a predetermined symbol, in step 807. If the handwriting input includes a predetermined symbol and a digital symbol matched to the symbol is determined, the electronic device 100 may determine whether the digital symbol includes a predetermined function. If the digital symbol includes a predetermined function, the electronic device 100 performs an operation corresponding to the symbol, in step 809. If the digital symbol does not include a predetermined function, the electronic device 100 may perform operation 811.
- According to an embodiment of the present invention, when a symbol includes a function of determining a range of text printed in a constant font, the electronic device 100 may determine the range of the printed text according to the function of the symbol input with handwriting. When displaying the determined range, the electronic device 100 may apply and display an effect such as an underline, a figure, or a contrast according to the function of the predetermined symbol. When the handwritten symbol includes a function of connecting a determined range of printed text in the acquired document information with handwritten text, the electronic device 100 may display, in the digital document, the printed text of the determined range (e.g., a digital text matched to the printed text) together with the handwritten text (e.g., a digital text matched to the handwritten text), according to the function of the symbol input with handwriting.
- The electronic device 100 may display an image of the analog text included in the digital document and/or the digital text according to the predetermined function, in step 811.
- When displaying analog handwritten text that includes a character, a symbol, or a figure in an analog document, the electronic device 100 may display a digital document (e.g., FIG. 2B) in the same form as, or a form similar to, that of the analog document (e.g., FIG. 2A) including the handwritten text and the printed text, according to a predetermined method stored in the memory 110.
- According to an embodiment of the present invention, the electronic device 100 may determine the predetermined function of a figure or symbol input with handwriting, apply the function to the printed text and the handwritten text according to that predetermined function, and display the text in the digital document. According to an embodiment of the present invention, as one method of displaying in the digital document, when the digital text of a determined range displayed in the digital document is selected, the connected digital text (e.g., a digital text matched to the handwritten text) or an image that crops the handwritten text may be displayed in a pop-up window.
- When the electronic device 100 performs operation 811, the electronic device 100 may terminate the operation of the methodology of FIG. 8A.
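Read as code, the flow of FIG. 8A (steps 801 to 811) might look roughly like the sketch below. Every helper here is a stub invented for illustration; the real recognition, database lookup, and rendering are outside its scope.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TextItem:
    raw: str                       # recognized handwritten or printed text
    symbol: Optional[str] = None   # e.g. 'range', 'asterisk', or None
    digital: Optional[str] = None  # filled in when conversion succeeds

def match_to_digital(item: TextItem) -> Optional[str]:
    # Stub for the database lookup of steps 803-805.
    return item.raw if item.raw.isprintable() else None

def symbol_has_function(symbol: Optional[str]) -> bool:
    # Stub for step 807: does the matched digital symbol carry a function?
    return symbol in {"range", "asterisk"}

def apply_symbol_function(item: TextItem) -> None:
    # Stub for step 809: e.g. underline a range or attach a linked note.
    item.digital = f"[{item.symbol}] {item.digital}"

def process_document(items):
    # Steps 801-811 in miniature: convert, apply symbol functions, then render.
    for item in items:
        item.digital = match_to_digital(item)             # steps 803-805
        if item.digital and symbol_has_function(item.symbol):
            apply_symbol_function(item)                    # steps 807-809
    return [item.digital or f"<crop:{item.raw}>" for item in items]  # step 811

page = [TextItem("Current issues"), TextItem("at whoever cost ...", symbol="range")]
print(process_document(page))
```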
- Referring now to FIG. 8B, according to an embodiment of the present invention, the electronic device 100 may acquire image data including text printed in a constant font or text input with handwriting. The electronic device 100 may detect at least one text included in the image data and determine a matched digital text. In step 821, it is determined whether the handwritten text can be converted to digital text. When converting a detected handwritten text to a digital text, if at least one character of a word cannot be determined, or if at least one word of a sentence cannot be determined, the electronic device 100 may perform operation 823. When the electronic device 100 cannot determine a digital text matched to the text printed in a constant font or the handwritten text, the electronic device 100 selects the text area that cannot be converted, in step 823. When the electronic device 100 can determine a digital text matched to the text printed in a constant font or the handwritten text, the electronic device 100 may terminate the operation of FIG. 8B, or may perform operation 805 of FIG. 8A.
- When the electronic device 100 cannot determine a digital text matched to the text printed in a constant font or the handwritten text, the electronic device 100 acquires image data corresponding to the text area that cannot be converted to digital text. When acquiring the image data corresponding to the text area that cannot be converted, the electronic device 100 may use a cropping method, and the acquired image data need not be limited to handwritten text. When the electronic device 100 cannot convert a text printed in a constant font to a matched digital text, the electronic device 100 crops an image corresponding to the text area that is not converted.
- The electronic device 100 displays a cropped image of the text that is not converted to a digital text at the corresponding position of the unconverted text in the generated digital document, in step 825. The operation of the electronic device 100 is not limited to displaying at the corresponding position of the unconverted text, and the electronic device 100 may provide a function of setting the display position and a function of changing the position of the displayed image.
- When the electronic device 100 performs step 825, the electronic device 100 may terminate the operation of the methodology of FIG. 8B.
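A compact, hedged sketch of the FIG. 8B flow (steps 821 to 825) follows; the region model, the converter stub, and the rendering are all invented placeholders rather than the claimed processing.

```python
def convert(region_text):
    """Stub for step 821: return digital text, or None when any part fails."""
    return None if "?" in region_text else region_text

def process_regions(regions):
    """Steps 821-825 in miniature: convert each region, or crop and place it."""
    document = []
    for text in regions:
        digital = convert(text)
        if digital is not None:
            document.append(digital)                        # converted normally
        else:
            document.append(f"<cropped image of: {text}>")  # steps 823-825
    return document

print(process_regions(["Current issues", "Reach themsel?es ..."]))
```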
- Although it is described that, when the electronic device 100 cannot determine a digital text matched to handwritten text, an image corresponding to the unconverted handwritten text area may be cropped, the operation of the electronic device 100 is not limited thereto; when a printed text cannot be converted to a matched digital text, the electronic device 100 may crop an image corresponding to the unconverted printed text area and include the cropped image in the corresponding area of the digital document.
- The electronic device 100 may convert to the digital document and display not only a document formed with printed text and handwritten text in an image photographed through the image sensor, but also text included in a document stored in the memory of the electronic device 100. The electronic device 100 may display a digital document corresponding to the entire area of FIG. 2B by combining partial areas of the document in which the converted digital text is arranged, or may display a predetermined area of the generated digital document.
- According to an embodiment of the present invention, the electronic device 100 may recognize the handwritten text input to the acquired document information, provide a digital document generated with operations corresponding to the characters, figures, or symbols input with handwriting, and clearly provide any handwritten text that is not recognized by the electronic device 100.
- Various embodiments of the present invention may be performed through at least one program included in the memory 110 of the electronic device 100, and may be directly controlled by a processor. Further, various embodiments may be controlled through at least one control module controlled by a processor.
- Methods according to various embodiments of the present invention can be implemented in the form of hardware components, software components, or combinations thereof. When implemented by software components, a computer readable storage medium that stores at least one program (software module) may be provided. The at least one program stored in the computer readable storage medium is configured to be executed by at least one processor within the electronic device 100. The at least one program may include instructions that enable the electronic device 100 to execute a method according to embodiments of the present invention.
- Such a program (software module, software) may be stored in a non-volatile memory including a Random Access Memory (RAM) and a flash memory, a Read-Only Memory (ROM), an Electrically Erasable and Programmable ROM (EEPROM), a magnetic disk storage device, a Compact Disk ROM (CD-ROM), a Digital Versatile Disk (DVD) or an optical storage device of another form, and a magnetic cassette. Alternatively, the program may be stored in a memory formed with a combination of some or all of these. Further, each constituent memory may be included in plural.
- Further, the program may be stored in an attachable storage device that can access the electronic device 100 through a communication network such as the Internet, an intranet, a Local Area Network (LAN), a Wireless LAN (WLAN), or a Storage Area Network (SAN), or a communication network formed with a combination thereof. Such a storage device can access the electronic device 100 through an external port.
- Further, a separate storage device on the communication network may provide access to the portable electronic device 100.
- While the invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims.
Claims (20)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/173,437 US20190065447A1 (en) | 2013-08-29 | 2018-10-29 | Method of processing analog data and electronic device thereof |
| US17/204,285 US11574115B2 (en) | 2013-08-29 | 2021-03-17 | Method of processing analog data and electronic device thereof |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020130103292A KR102147935B1 (en) | 2013-08-29 | 2013-08-29 | Method for processing data and an electronic device thereof |
| KR10-2013-0103292 | 2013-08-29 | ||
| US14/338,743 US20150067485A1 (en) | 2013-08-29 | 2014-07-23 | Method of processing data and electronic device thereof |
| US16/173,437 US20190065447A1 (en) | 2013-08-29 | 2018-10-29 | Method of processing analog data and electronic device thereof |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/338,743 Continuation US20150067485A1 (en) | 2013-08-29 | 2014-07-23 | Method of processing data and electronic device thereof |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/204,285 Continuation US11574115B2 (en) | 2013-08-29 | 2021-03-17 | Method of processing analog data and electronic device thereof |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20190065447A1 true US20190065447A1 (en) | 2019-02-28 |
Family
ID=51485455
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/338,743 Abandoned US20150067485A1 (en) | 2013-08-29 | 2014-07-23 | Method of processing data and electronic device thereof |
| US16/173,437 Abandoned US20190065447A1 (en) | 2013-08-29 | 2018-10-29 | Method of processing analog data and electronic device thereof |
| US17/204,285 Active US11574115B2 (en) | 2013-08-29 | 2021-03-17 | Method of processing analog data and electronic device thereof |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/338,743 Abandoned US20150067485A1 (en) | 2013-08-29 | 2014-07-23 | Method of processing data and electronic device thereof |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/204,285 Active US11574115B2 (en) | 2013-08-29 | 2021-03-17 | Method of processing analog data and electronic device thereof |
Country Status (3)
| Country | Link |
|---|---|
| US (3) | US20150067485A1 (en) |
| EP (1) | EP2843592A3 (en) |
| KR (1) | KR102147935B1 (en) |
Family Cites Families (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5583542A (en) | 1992-05-26 | 1996-12-10 | Apple Computer, Incorporated | Method for deleting objects on a computer display |
| US5331431A (en) * | 1992-08-31 | 1994-07-19 | Motorola, Inc. | Method and apparatus for transmitting and receiving encoded data |
| US6587587B2 (en) * | 1993-05-20 | 2003-07-01 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings |
| US5606690A (en) * | 1993-08-20 | 1997-02-25 | Canon Inc. | Non-literal textual search using fuzzy finite non-deterministic automata |
| US6021218A (en) * | 1993-09-07 | 2000-02-01 | Apple Computer, Inc. | System and method for organizing recognized and unrecognized objects on a computer display |
| JPH07200155A (en) * | 1993-12-10 | 1995-08-04 | Microsoft Corp | Detection of nonobjective result of pen-type computer system |
| US5729637A (en) * | 1994-08-31 | 1998-03-17 | Adobe Systems, Inc. | Method and apparatus for producing a hybrid data structure for displaying a raster image |
| US5900876A (en) * | 1995-04-14 | 1999-05-04 | Canon Kabushiki Kaisha | Information processing apparatus and method with display book page turning |
| US5889523A (en) * | 1997-11-25 | 1999-03-30 | Fuji Xerox Co., Ltd. | Method and apparatus for dynamically grouping a plurality of graphic objects |
| WO2000016221A1 (en) * | 1998-09-15 | 2000-03-23 | Microsoft Corporation | Interactive playlist generation using annotations |
| US6859909B1 (en) * | 2000-03-07 | 2005-02-22 | Microsoft Corporation | System and method for annotating web-based documents |
| CA2353682A1 (en) * | 2001-07-23 | 2003-01-23 | Ibm Canada Limited-Ibm Canada Limitee | Link management of document structures |
| US7266765B2 (en) * | 2001-08-31 | 2007-09-04 | Fuji Xerox Co., Ltd. | Detection and processing of annotated anchors |
| US20030214531A1 (en) | 2002-05-14 | 2003-11-20 | Microsoft Corporation | Ink input mechanisms |
| US20040139391A1 (en) * | 2003-01-15 | 2004-07-15 | Xerox Corporation | Integration of handwritten annotations into an electronic original |
| US7290251B2 (en) * | 2003-05-16 | 2007-10-30 | Microsoft Corporation | Method and system for providing a representation of merge conflicts in a three-way merge operation |
| US7295708B2 (en) * | 2003-09-24 | 2007-11-13 | Microsoft Corporation | System and method for detecting a list in ink input |
| US7574048B2 (en) * | 2004-09-03 | 2009-08-11 | Microsoft Corporation | Freeform digital ink annotation recognition |
| US9727564B2 (en) * | 2005-12-15 | 2017-08-08 | Nokia Technologies Oy | Annotating content with context metadata |
| US7913162B2 (en) * | 2005-12-20 | 2011-03-22 | Pitney Bowes Inc. | System and method for collaborative annotation using a digital pen |
| US20080008387A1 (en) * | 2006-07-06 | 2008-01-10 | Cheng Yi-Hsun E | Method and apparatus for recognition of handwritten symbols |
| US8508756B2 (en) * | 2006-12-28 | 2013-08-13 | Konica Minolta Business Technologies, Inc. | Image forming apparatus having capability for recognition and extraction of annotations and additionally written portions |
| KR101142270B1 (en) * | 2009-12-23 | 2012-05-07 | 주식회사 디오텍 | Handwriting input device having the document editting function and method thereof |
| US9460068B2 (en) * | 2010-02-03 | 2016-10-04 | Google Inc. | Narrative-based media organizing system for transforming and merging graphical representations of digital media within a work area |
| US20110239108A1 (en) | 2010-03-26 | 2011-09-29 | Microsoft Corporation | Configurable dynamic combination of html resources for download optimization in script based web page |
| US20120030564A1 (en) | 2010-07-30 | 2012-02-02 | International Business Machines Corporation | Domain-Specific Spell Check Overlays |
| US8380753B2 (en) * | 2011-01-18 | 2013-02-19 | Apple Inc. | Reconstruction of lists in a document |
| KR101862123B1 (en) | 2011-08-31 | 2018-05-30 | 삼성전자 주식회사 | Input device and method on terminal equipment having a touch module |
| US20140189593A1 (en) * | 2012-12-27 | 2014-07-03 | Kabushiki Kaisha Toshiba | Electronic device and input method |
- 2013
  - 2013-08-29 KR KR1020130103292A patent/KR102147935B1/en active Active
- 2014
  - 2014-07-23 US US14/338,743 patent/US20150067485A1/en not_active Abandoned
  - 2014-08-26 EP EP20140182238 patent/EP2843592A3/en not_active Ceased
- 2018
  - 2018-10-29 US US16/173,437 patent/US20190065447A1/en not_active Abandoned
- 2021
  - 2021-03-17 US US17/204,285 patent/US11574115B2/en active Active
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2022173239A1 (en) * | 2021-02-10 | 2022-08-18 | Samsung Electronics Co., Ltd. | Methods and systems for performing on-device image to text conversion |
| US12205388B2 (en) | 2021-02-10 | 2025-01-21 | Samsung Electronics Co., Ltd. | Methods and systems for performing on-device image to text conversion |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2843592A2 (en) | 2015-03-04 |
| KR20150025452A (en) | 2015-03-10 |
| EP2843592A3 (en) | 2015-05-06 |
| US11574115B2 (en) | 2023-02-07 |
| US20150067485A1 (en) | 2015-03-05 |
| KR102147935B1 (en) | 2020-08-25 |
| US20210200935A1 (en) | 2021-07-01 |
Similar Documents
| Publication | Title |
|---|---|
| US11574115B2 (en) | Method of processing analog data and electronic device thereof |
| KR102559028B1 (en) | Method and apparatus for recognizing handwriting | |
| CN102906671B (en) | Gesture input device and gesture input method | |
| JP5347673B2 (en) | Information processing apparatus, information processing method, and program | |
| US8111247B2 (en) | System and method for changing touch screen functionality | |
| US9335835B2 (en) | Method and apparatus for providing user interface | |
| US9172879B2 (en) | Image display control apparatus, image display apparatus, non-transitory computer readable medium, and image display control method | |
| US20130324089A1 (en) | Method for providing fingerprint-based shortcut key, machine-readable storage medium, and portable terminal | |
| JP4031255B2 (en) | Gesture command input device | |
| KR20140030361A (en) | Apparatus and method for recognizing a character in terminal equipment | |
| KR20150070870A (en) | Preview method of picture taken in camera and electronic device implementing the same | |
| CN108829644A (en) | Information processing unit, recording medium and the method for showing translation result | |
| US10593077B2 (en) | Associating digital ink markups with annotated content | |
| CN102667813B (en) | Information processing device, and control method of information processing device | |
| US20110037731A1 (en) | Electronic device and operating method thereof | |
| JP6390480B2 (en) | Information processing apparatus and data structure of data obtained by imaging on paper medium | |
| JP6036856B2 (en) | Electronic control apparatus, control method, and control program | |
| KR20140146884A (en) | Method for editing images captured by portable terminal and the portable terminal therefor | |
| CN107395966A (en) | A kind of photographic method, electronic equipment and computer-readable recording medium | |
| KR20150026382A (en) | Electronic apparatus and method for contacts management in electronic apparatus | |
| US8629846B2 (en) | Information processing apparatus and information processing method | |
| JP2010191907A (en) | Character input device and character input method | |
| KR20190063803A (en) | Method and apparatus for image synthesis of object | |
| JP2009245165A (en) | Face recognition device, face recognition program, face recognition method | |
| JP2009245166A (en) | Face recognition device, face recognition program, face recognition method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING RESPONSE FOR INFORMALITY, FEE DEFICIENCY OR CRF ACTION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING RESPONSE FOR INFORMALITY, FEE DEFICIENCY OR CRF ACTION |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |