
CN111565328A - Method and device for inputting intelligent equipment and remote controller - Google Patents


Info

Publication number
CN111565328A
CN111565328A
Authority
CN
China
Prior art keywords
input
intelligent equipment
interface
determining
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010361593.XA
Other languages
Chinese (zh)
Inventor
付加纯
郭宏志
崔传凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Shengshihui Technology Co ltd
Original Assignee
Beijing Shengshihui Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Shengshihui Technology Co ltd
Priority to CN202010361593.XA
Publication of CN111565328A
Legal status: Pending


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42212Specific keyboard arrangements
    • H04N21/42213Specific keyboard arrangements for facilitating data entry

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of smart device control and discloses a method for smart device input, comprising: when the smart device is determined to be in an input scene, displaying an input interface corresponding to the input scene; determining an input operation and generating an input instruction according to the input operation; and sending the input instruction to the smart device. With this method, the control end that controls the smart device automatically identifies the scene the device is in and, when the device is determined to be in an input scene, generates an input interface that meets the input requirement, which simplifies the user's input process, spares the user from adjusting the input keyboard type, speeds up information entry, and improves the user experience. The application also discloses an apparatus and a remote controller for smart device input.

Description

Method and device for inputting intelligent equipment and remote controller
Technical Field
The present application relates to the field of intelligent device control technologies, and for example, to a method and an apparatus for intelligent device input, and a remote controller.
Background
At present, with the development of network technology, users watch a rich variety of online videos through smart devices such as smart TVs, smart set-top boxes, and projection devices, and use these devices to play, review, or manage video content, as well as to listen to music, shop, or play games.
When controlling the smart device, the control operation can be completed through an adapted remote controller, or through a mobile terminal interconnected with the smart device. When inputting information into the smart device, a virtual keyboard is usually displayed on the device's display interface; the user must jump among the interface elements with the remote controller's direction keys and then press the enter key to select each item, completing the input operation.
In the process of implementing the embodiments of the present disclosure, at least the following problems were found in the related art: the information input process is cumbersome and input is slow; especially when a large amount of information must be entered, the user experience suffers.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
The embodiments of the present disclosure provide a method and an apparatus for smart device input, and a remote controller, which aim to solve the technical problem that completing information input through direction keys is cumbersome and slow.
In some embodiments, the method comprises:
when the intelligent equipment is determined to be in an input scene, displaying an input interface corresponding to the input scene;
determining input operation and generating an input instruction according to the input operation;
and sending the input instruction to the intelligent equipment.
In some embodiments, the apparatus includes a processor and a memory storing program instructions, the processor being configured to, when executing the program instructions, perform the method for smart device input described above.
In some embodiments, the remote control comprises the above-described means for smart device input.
The method, the device and the remote controller for inputting the intelligent equipment provided by the embodiment of the disclosure can realize the following technical effects:
the control end for controlling the intelligent device automatically identifies the scene of the intelligent device, and generates an input interface meeting input requirements when the intelligent device is determined to be in the input scene, so that the input process of a user is simplified, the adjustment of the type of an input keyboard by the user is avoided, the information input efficiency is accelerated, and the user experience is improved.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, which are not limiting; elements bearing the same reference numerals in the drawings denote like elements, wherein:
FIG. 1 is a schematic flow chart diagram of a method for smart device input provided by an embodiment of the present disclosure;
fig. 2a is a schematic view of a display interface corresponding to an input scene provided by an embodiment of the present disclosure;
FIG. 2b is a schematic diagram of a display interface corresponding to another input scenario provided by the embodiment of the present disclosure;
FIG. 2c is a schematic diagram of a display interface corresponding to another input scenario provided by the embodiment of the present disclosure;
FIG. 2d is a schematic diagram of a display interface corresponding to another input scenario provided by the embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an apparatus for smart device input provided in an embodiment of the present disclosure.
Detailed Description
So that the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawings.
The terms "first," "second," and the like in the description, the claims, and the above-described drawings of the embodiments of the present disclosure are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the present disclosure described herein can be practiced in sequences other than those illustrated or described here. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
The term "plurality" means two or more unless otherwise specified.
In the embodiment of the present disclosure, the character "/" indicates that the preceding and following objects are in an or relationship. For example, A/B represents: a or B.
The term "and/or" is an associative relationship that describes objects, meaning that three relationships may exist. For example, a and/or B, represents: a or B, or A and B.
Fig. 1 is a schematic flowchart of a method for inputting to an intelligent device according to an embodiment of the present disclosure, where the method includes the following steps:
s101, when the intelligent device is determined to be in an input scene, displaying an input interface corresponding to the input scene.
In the embodiment of the disclosure, whether the intelligent device is in an input scene is determined by the control end for the input of the intelligent device.
Optionally, the control end for smart device input is an intelligent remote controller, or a mobile terminal that can be interconnected with the smart device, such as a mobile phone or a tablet computer. The control end includes a screen for displaying the input interface, and interconnection with the smart device is completed before input to the smart device is performed.
In various embodiments, there are multiple ways to determine that the smart device is in an input scenario.
In some embodiments, determining that the smart device is in an input scenario includes: acquiring element components forming a display interface of the intelligent equipment; when the element component includes an input component, it is determined that the smart device is in an input scenario.
In some embodiments, determining that the smart device is in an input scenario includes: acquiring image information of a display interface of the intelligent equipment; segmenting the image information to determine constituent elements; when the constituent elements include the setting element, it is determined that the smart device is in the input scene.
In some embodiments, the setting elements include an input box, an alphabetical button list for initial-letter search, a candidate text list for text search, a search icon, a keyboard icon, and a sort navigation option for switching between sorted pages.
Fig. 2a, 2b, 2c and 2d are schematic diagrams corresponding to input scenes provided by the embodiments of the present disclosure.
As shown in fig. 2a, the element categories include an input box 201 and a keyboard icon 202. As shown in fig. 2b, the element categories include an input box 203, an alphabetical button list 204 for initial-letter search, and a search icon 205. Fig. 2c includes a candidate text list 206 for performing a text search. Fig. 2d includes a sort navigation option 207 for switching between sorted pages.
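The two scene-detection strategies above, inspecting the element components reported by the display interface and segmenting a screenshot into constituent elements, can be sketched as follows. All element names here are illustrative assumptions, not part of the disclosure:

```python
# Strategy 1: component types that count as input components (assumed names).
INPUT_COMPONENT_TYPES = {"input_box", "text_field"}

# Strategy 2: setting elements recovered by segmenting a screenshot (assumed names).
SETTING_ELEMENTS = {
    "input_box", "alphabet_button_list", "candidate_text_list",
    "search_icon", "keyboard_icon", "sort_navigation",
}

def is_input_scene_by_components(element_components):
    """True when any element component of the display interface is an input component."""
    return any(c in INPUT_COMPONENT_TYPES for c in element_components)

def is_input_scene_by_segmentation(segmented_elements):
    """True when the segmented constituent elements include a setting element."""
    return any(e in SETTING_ELEMENTS for e in segmented_elements)
```

In practice the first strategy presumes access to the device's UI element tree, while the second works from image information alone, which is why the patent lists them as alternatives.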
In different embodiments, there are multiple ways to display an input interface corresponding to an input scenario. Optionally, the method includes generating an input interface in real time and calling the input interface.
In some embodiments, displaying an input interface corresponding to an input scenario includes: integrating the constituent elements according to a display interface of the intelligent equipment; and generating an input interface matched with the display interface of the intelligent equipment, and displaying the input interface. When the intelligent device is provided with different application programs, the corresponding input interfaces are generated in real time according to the different application programs or application scenes, diversified control requirements are met, and user interaction experience is improved.
Here, generating the input interface matched with the display interface of the smart device includes: generating an interface identical to the device's display interface, comprising an input keyboard and a background, or an input keyboard, an input box, and a background; or generating a keyboard interface matched with the device's display interface, in which the key positions are the same as on the device's display interface.
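A minimal sketch of integrating the constituent elements into a matched input interface, under an assumed element representation (dicts with a `kind` field, which is not the patent's data model):

```python
def build_matched_interface(device_elements, full_mirror=True):
    """Generate a control-end input interface matched to the smart device's
    display interface. full_mirror=True reproduces keyboard, input box, and
    background; otherwise only the keyboard (same key positions) is kept."""
    keep = {"keyboard", "input_box", "background"} if full_mirror else {"keyboard"}
    return [e for e in device_elements if e["kind"] in keep]
```

A real control end would render actual widgets from these elements; the sketch only shows the filtering/integration step.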
In some embodiments, displaying an input interface corresponding to an input scenario includes: analyzing image information of a display interface of the intelligent equipment to acquire a displayed character type; calling a corresponding keyboard type according to the character type; and calling the corresponding input interface according to the keyboard type, and displaying the input interface.
Optionally, the acquired displayed character types include one or more types. When one character type is displayed, the keyboard corresponding to that character type is called up, or a keyboard combining that type with other character types; when two or more character types are displayed, a keyboard combining the corresponding character types is called up.
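The character-type-to-keyboard mapping just described might look like the following sketch; the type names and the `+`-joined encoding of a combined keyboard are assumptions for illustration:

```python
# Assumed mapping from displayed character type to keyboard type.
CHAR_TO_KEYBOARD = {
    "chinese": "pinyin",
    "latin": "english",
    "digit": "numeric",
    "symbol": "symbol",
}

def select_keyboard(char_types):
    """One displayed character type -> its keyboard; two or more types ->
    a keyboard combining the corresponding keyboard types."""
    keyboards = [CHAR_TO_KEYBOARD[t] for t in char_types]
    return keyboards[0] if len(keyboards) == 1 else "+".join(keyboards)
```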
In some embodiments, displaying an input interface corresponding to an input scenario includes: analyzing the intelligent equipment identification and/or the application program identification corresponding to the intelligent equipment display interface; determining the keyboard type according to the intelligent equipment identification and/or the application program identification; and calling the corresponding input interface according to the keyboard type, and displaying the input interface.
Optionally, when the input interface of the intelligent device is the system application display interface, the keyboard type is determined according to the identification of the intelligent device.
When the smart device is installed with applications provided by other operators, the content of the application's display interface remains unchanged; it is scaled proportionally with adaptive background filling according to the display ratio of the smart device, or different display interfaces are provided for different display ratios, in order to optimize the display effect.
Optionally, when the input interface of the intelligent device is an application program provided by another installed operator, the keyboard type is determined according to the application program identifier, or the keyboard type is determined according to the intelligent device identifier and the application program identifier.
In some embodiments, the keyboard types include pinyin keyboards, English keyboards, numeric keyboards, stroke keyboards, wubi (five-stroke) keyboards, and symbol keyboards. In some embodiments, a keyboard is formed by combining two or more of the aforementioned keyboard types, for example: a pinyin keyboard combined with a symbol keyboard; an English keyboard combined with a symbol keyboard; or an English keyboard combined with numeric and symbol keyboards.
In the mode where the input interface is called up, keyboards of different types are pre-stored at the control end for smart device input; when the required keyboard type is determined, the corresponding keyboard is fetched directly from system memory, which speeds up display of the input interface and improves information input efficiency.
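The pre-stored-keyboard mode can be sketched as a simple in-memory cache; `call_keyboard` and the layout representation are hypothetical names, not the patent's API:

```python
# Cache of pre-stored keyboard layouts, keyed by keyboard type.
_KEYBOARD_CACHE = {}

def call_keyboard(keyboard_type):
    """Fetch a pre-stored keyboard layout from memory, building it only once
    so that later calls avoid regenerating the interface."""
    if keyboard_type not in _KEYBOARD_CACHE:
        # Placeholder layout; a real control end would store full key maps here.
        _KEYBOARD_CACHE[keyboard_type] = {"type": keyboard_type}
    return _KEYBOARD_CACHE[keyboard_type]
```

The second call for the same type returns the same cached object, which is the speed-up the passage describes.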
In some embodiments, the keyboard type is a handwriting keyboard, and the handwriting keyboard is applied in designated input scenes, or in all input scenes, according to rules set by the user or according to usage habits. For example: when the user has set the keyboard type corresponding to a display interface of the smart device to stroke input, the handwriting keyboard is called up whenever the control end recognizes that the keyboard type corresponding to that display interface is stroke input. Or, when the user is an elderly person accustomed to handwriting input, the control end adjusts the default keyboard type to the handwriting keyboard according to how frequently the user switches the keyboard type.
S102, determining input operation and generating an input instruction according to the input operation.
In various embodiments, there are multiple ways to determine an input operation and generate an input instruction based on the input operation.
In some embodiments, determining an input operation and generating an input instruction according to the input operation includes: and determining a character corresponding to the touch focus, and generating an input instruction according to the character.
Optionally, the input instruction is generated in real time according to the input operation, that is, each character corresponds to one input instruction. Optionally, the input instruction is generated according to an interval time of the input operation or an input completion instruction, that is, the input instruction includes one or more characters.
In some embodiments, determining an input operation and generating an input instruction according to the input operation includes: determining coordinate information corresponding to the touch focus, and generating the input instruction according to the coordinate information. In the case where an input interface matched with the smart device's display interface has been generated, the input instruction is generated from the coordinate information of the touch focus, and the smart device determines the character selected by the user according to that coordinate information.
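The three instruction-generation modes described for S102 (per-character, batched, and coordinate-based) could be sketched as follows; the instruction dictionary format is an assumption, not the patent's wire protocol:

```python
def char_instruction(char):
    """Real-time mode: each touched character yields one instruction."""
    return {"kind": "char", "value": char}

def batch_instruction(chars):
    """Batched mode: characters accumulated until a pause or an
    input-complete action are sent as a single instruction."""
    return {"kind": "text", "value": "".join(chars)}

def coordinate_instruction(x, y):
    """Mirror mode: when the control-end interface matches the device display,
    the device maps (x, y) back to the selected key itself."""
    return {"kind": "coord", "x": x, "y": y}
```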
And S103, sending an input instruction to the intelligent equipment.
The control end for smart device input sends the input instruction to the smart device; the smart device parses and recognizes the input information and performs the subsequent corresponding operations according to it.
In the embodiment of the disclosure, the control terminal for controlling the intelligent device automatically identifies the scene of the intelligent device, and generates an input interface meeting the input requirement when the intelligent device is determined to be in the input scene, so that the input process of a user is simplified, the adjustment of the type of the input keyboard by the user is avoided, the information input efficiency is accelerated, and the user experience is improved.
The embodiment of the present disclosure provides an apparatus for intelligent device input, the apparatus including: the device comprises a determining module, a display module, a determining module, a generating module and a sending module.
Wherein the determination module is configured to determine whether the smart device is in an input scenario.
The display module is configured to display an input interface corresponding to an input scene when the intelligent device is determined to be in the input scene.
A determination module configured to determine an input operation.
A generating module configured to generate an input instruction according to the input operation.
A sending module configured to send the input instruction to the smart device.
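The five modules above could be wired together as in this illustrative sketch; the class and method names are assumptions, not the patent's literal API:

```python
class SmartDeviceInputController:
    """Illustrative wiring of the determination, display, operation-determination,
    generating, and sending modules as injected callables."""

    def __init__(self, detect, render, read_operation, encode, transport):
        self.detect = detect                  # determination module (input scene?)
        self.render = render                  # display module (show input interface)
        self.read_operation = read_operation  # determination module (input operation)
        self.encode = encode                  # generating module (operation -> instruction)
        self.transport = transport            # sending module (instruction -> device)

    def step(self, device_state):
        # Mirrors S101-S103: detect the input scene, display the interface,
        # determine the operation, generate the instruction, and send it.
        if self.detect(device_state):
            self.render(device_state)
            operation = self.read_operation()
            self.transport(self.encode(operation))
```

Injecting the modules as callables keeps each one independently replaceable, matching the modular decomposition the embodiment describes.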
The disclosed embodiments provide an apparatus for smart device input, comprising a processor and a memory storing program instructions, wherein the processor is configured, when executing the program instructions, to perform the method for smart device input of the above embodiments.
As shown in fig. 3, an apparatus for smart device input according to an embodiment of the present disclosure includes a processor (processor) 300 and a memory (memory) 301. Optionally, the apparatus may also include a communication interface (Communication Interface) 302 and a bus 303. The processor 300, the communication interface 302, and the memory 301 may communicate with each other via the bus 303. The communication interface 302 may be used for information transfer. The processor 300 may invoke logic instructions in the memory 301 to perform the method for smart device input of the above-described embodiments.
In addition, the logic instructions in the memory 301 may be implemented as software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium.
The memory 301 is a computer-readable storage medium, and can be used for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 300 performs functional applications and data processing, i.e., implementing the methods for smart device input in the above embodiments, by executing program instructions/modules stored in the memory 301.
The memory 301 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the memory 301 may include a high-speed random access memory, and may also include a nonvolatile memory.
The embodiment of the disclosure provides a remote controller, which comprises the device for inputting the intelligent equipment in the embodiment.
Embodiments of the present disclosure provide a computer-readable storage medium storing computer-executable instructions configured to perform the method for intelligent device input in the above embodiments.
The disclosed embodiments provide a computer program product comprising a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method for smart device input of the above embodiments.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes one or more instructions to enable a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium comprising: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes, and may also be a transient storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method or apparatus that comprises the element. In this document, each embodiment may be described with emphasis on differences from other embodiments, and the same and similar parts between the respective embodiments may be referred to each other. For methods, products, etc. of the embodiment disclosures, reference may be made to the description of the method section for relevance if it corresponds to the method section of the embodiment disclosure.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (10)

1. A method for smart device input, comprising:
when the intelligent equipment is determined to be in an input scene, displaying an input interface corresponding to the input scene;
determining input operation and generating an input instruction according to the input operation;
and sending the input instruction to the intelligent equipment.
2. The method of claim 1, wherein determining that the smart device is in an input scene comprises:
acquiring element components constituting a display interface of the smart device; and
when the element components include an input component, determining that the smart device is in an input scene.
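The scene detection of claims 1 and 2 can be illustrated with a minimal sketch. Everything here is assumed for illustration: the component-type names and the `is_input_scene` helper are hypothetical, not part of the claimed method.

```python
# Assumed set of component types that count as "input components" on the
# smart device's display interface (names are hypothetical).
INPUT_COMPONENT_TYPES = {"EditText", "SearchBox", "TextInput"}

def is_input_scene(element_components):
    """Return True when any element component of the display interface
    is an input component, i.e. the smart device is in an input scene."""
    return any(c in INPUT_COMPONENT_TYPES for c in element_components)
```

Under this sketch, a remote controller would poll or receive the component list of the current interface and show its own input interface only when `is_input_scene` returns True.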
3. The method of claim 1, wherein determining that the smart device is in an input scene comprises:
acquiring image information of a display interface of the smart device;
segmenting the image information to determine constituent elements; and
when the constituent elements include a preset element, determining that the smart device is in an input scene.
4. The method of claim 3, wherein the preset elements comprise an input box, a search icon, and a keyboard icon.
5. The method of claim 3 or 4, wherein displaying the input interface corresponding to the input scene comprises:
integrating the constituent elements according to the display interface of the smart device; and
generating an input interface matching the display interface of the smart device, and displaying the input interface.
6. The method of claim 1, wherein displaying the input interface corresponding to the input scene comprises:
analyzing image information of a display interface of the smart device to acquire the type of the displayed characters;
calling a corresponding keyboard type according to the character type; and
calling a corresponding input interface according to the keyboard type, and displaying the input interface.
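The character-type-to-keyboard mapping of claim 6 can be sketched as follows. The three categories and the keyboard names are assumptions for illustration; the claim itself does not fix any particular classification.

```python
def character_type(text):
    """Classify the characters shown on the display interface
    (assumed categories: chinese / digits / latin)."""
    if any('\u4e00' <= ch <= '\u9fff' for ch in text):  # CJK Unified Ideographs
        return "chinese"
    if text.strip() and all(ch.isdigit() or ch.isspace() for ch in text):
        return "digits"
    return "latin"

# Hypothetical mapping from character type to keyboard type.
KEYBOARD_BY_TYPE = {
    "chinese": "pinyin_keyboard",
    "digits": "numeric_keypad",
    "latin": "qwerty_keyboard",
}

def keyboard_for_display_text(text):
    """Call the keyboard type matching the characters on the interface."""
    return KEYBOARD_BY_TYPE[character_type(text)]
```

For example, a search prompt rendered in Chinese would select the assumed `pinyin_keyboard`, while a PIN entry field of digits would select `numeric_keypad`.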
7. The method of claim 1, wherein displaying the input interface corresponding to the input scene comprises:
analyzing a smart device identifier and/or an application identifier corresponding to the display interface of the smart device;
determining a keyboard type according to the smart device identifier and/or the application identifier; and
calling a corresponding input interface according to the keyboard type, and displaying the input interface.
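Claim 7's identifier-based lookup reduces to a table keyed by device and application. The table entries, identifiers, and default below are all hypothetical placeholders.

```python
# Hypothetical lookup table: (smart device identifier, application identifier)
# -> keyboard type. Real deployments would populate this from configuration.
KEYBOARD_TABLE = {
    ("tv-model-x", "video-app"): "qwerty_keyboard",
    ("tv-model-x", "pay-app"): "numeric_keypad",
}
DEFAULT_KEYBOARD = "qwerty_keyboard"

def keyboard_type(device_id, app_id):
    """Determine the keyboard type from the device and/or application identifier."""
    return KEYBOARD_TABLE.get((device_id, app_id), DEFAULT_KEYBOARD)
```

A payment application could thus receive a numeric keypad while a video search application receives a full keyboard, without analyzing the screen image at all.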
8. The method of claim 1, wherein determining an input operation and generating an input instruction according to the input operation comprises:
determining a character corresponding to a touch focus, and generating the input instruction according to the character; or
determining coordinate information corresponding to the touch focus, and generating the input instruction according to the coordinate information.
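The two alternatives of claim 8 can be sketched together: resolve the touch focus to a character via a keymap when one is available, otherwise forward the raw coordinates. The instruction encoding (a small dict) is an assumption; the claim does not specify a wire format.

```python
def make_input_instruction(touch_focus, keymap=None):
    """Generate an input instruction from the touch focus on the remote.

    If the focus position maps to a character on the displayed keyboard,
    the instruction carries the character; otherwise it carries the raw
    coordinate information for the smart device to interpret.
    """
    if keymap and touch_focus in keymap:
        return {"type": "char", "value": keymap[touch_focus]}
    return {"type": "coords", "value": touch_focus}
```

The generated instruction is then what claim 1's final step sends to the smart device.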
9. An apparatus for smart device input, comprising a processor and a memory storing program instructions, wherein the processor is configured to perform the method for smart device input according to any one of claims 1 to 8 when executing the program instructions.
10. A remote control comprising the apparatus of claim 9.
CN202010361593.XA 2020-04-30 2020-04-30 Method and device for inputting intelligent equipment and remote controller Pending CN111565328A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010361593.XA CN111565328A (en) 2020-04-30 2020-04-30 Method and device for inputting intelligent equipment and remote controller


Publications (1)

Publication Number Publication Date
CN111565328A true CN111565328A (en) 2020-08-21

Family

ID=72073296


Country Status (1)

Country Link
CN (1) CN111565328A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103596027A (en) * 2013-11-22 2014-02-19 乐视致新电子科技(天津)有限公司 Method and device for retrieving keyboards under different scenes of intelligent television
US20140165099A1 (en) * 2011-07-26 2014-06-12 Beijing Lenovo Software Ltd. Input method, smart tv and smart interaction system
CN104375666A (en) * 2014-12-11 2015-02-25 上海触乐信息科技有限公司 Cross-equipment input method, processing device, input equipment and intelligent display equipment
CN105992066A (en) * 2015-02-13 2016-10-05 Tcl集团股份有限公司 Character input method and character input device applied to intelligent device
CN106210815A (en) * 2011-05-09 2016-12-07 张沈平 A kind of input method for electric room and corresponding electronic equipment
CN107566900A (en) * 2017-08-30 2018-01-09 北京酷我科技有限公司 A kind of optimization method of the intelligent television App keyboard layouts based on button multiplexing
CN108762876A (en) * 2018-05-31 2018-11-06 努比亚技术有限公司 A kind of input method switching method, mobile terminal and computer storage media



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200821