
CN114911404A - Interface control method, control system, electronic device and storage medium - Google Patents


Info

Publication number
CN114911404A
CN114911404A (application number CN202210468334.6A)
Authority
CN
China
Prior art keywords
electronic device
interface
focus area
target
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210468334.6A
Other languages
Chinese (zh)
Inventor
石悌君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Kaihong Digital Industry Development Co Ltd
Original Assignee
Shenzhen Kaihong Digital Industry Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Kaihong Digital Industry Development Co Ltd filed Critical Shenzhen Kaihong Digital Industry Development Co Ltd
Priority to CN202210468334.6A priority Critical patent/CN114911404A/en
Publication of CN114911404A publication Critical patent/CN114911404A/en
Pending legal-status Critical Current

Classifications

    • G — Physics
    • G06 — Computing or calculating; counting
    • G06F — Electric digital data processing
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 — Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device
    • G06F 3/0488 — Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 — Inputting data by handwriting, e.g. gesture or text
    • G06F 3/0484 — Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 — Scrolling or panning
    • G06F 3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 — Copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present application relates to the technical field of intelligent interaction and provides an interface control method, a control system, an electronic device, and a storage medium, applied to a first electronic device that includes a non-touch screen. The method includes: sending interface content to a second electronic device, which includes a touch screen, so that the second electronic device generates a screen projection interface; receiving target coordinate information fed back by the second electronic device and determining a target position based on it, where the target coordinate information is generated by the second electronic device from a touch-screen event triggered by the user on the screen projection interface; acquiring the current focus area; determining the positional relationship between the target position and the current focus area; determining a target focus area based on that relationship and simulating a confirm-key instruction within the target focus area; and responding to the confirm-key instruction to complete the operation on the interface content. The present application greatly facilitates the user's operation of the interface content and improves the user experience.

Figure 202210468334


Description

Interface control method, control system, electronic device and storage medium
Technical Field
The present application relates to the field of intelligent interaction technologies, and in particular, to an interface control method, a control system, an electronic device, and a storage medium.
Background
With the development of technology, many electronic devices without a touch-screen function can install various applications (APPs), which the user operates through the device's remote controller.
For example, most home video devices (e.g., smart TVs, smart set-top boxes, smart projectors …) have essentially no touch-screen capability, but they do run an intelligent operating system (e.g., Android), so they can install the various APPs the user needs. At present, these home video devices are generally equipped with a key-type remote controller, and the user controls the device, or the APPs running on it, through the keys on that remote controller.
In actual use of a key-type remote controller, many operations are cumbersome, which inconveniences the user and degrades the experience. For example, to search for a movie the user must input characters (the movie's name): the user presses the direction keys on the remote controller many times to move the cursor displayed on the home video device to the desired letter, then presses the confirm key to input that letter, and so spells out the name letter by letter. This requires pressing the direction keys many times and is very inconvenient.
In summary, a new and convenient way for a user to operate a device without a touch screen function is needed.
Disclosure of Invention
The present application provides an interface control method, a control system, an electronic device, and a storage medium, so that a user can control the first electronic device through a second electronic device instead of a key-type remote controller, which greatly facilitates user operation and improves the user experience.
In a first aspect, the present application provides an interface control method applied to a first electronic device, where the first electronic device includes a touchless screen, and the method includes:
sending interface content to a second electronic device to enable the second electronic device to generate a screen projection interface, wherein the second electronic device comprises a touch screen;
receiving target coordinate information fed back by the second electronic device, and determining a target position based on the target coordinate information, wherein the target coordinate information is generated by the second electronic device based on a touch screen event triggered by a user on the screen projection interface;
acquiring a current focus area;
judging the position relation between the target position and the current focus area;
determining a target focus area based on the position relation, and simulating a confirmation key instruction in the target focus area;
and responding to the confirmation key instruction to finish the operation on the interface content.
Through the scheme, the user operation is greatly facilitated, and the experience of the user is improved.
In the interface control method provided by the present application, the determining a target focal region based on the position relationship includes:
when the target position is located in the current focus area, determining the current focus area as a target focus area;
and when the target position is not located in the current focus area, moving the current focus area to a target focus area according to a preset rule.
In the interface control method provided by the present application, moving the current focus area to a target focus area according to a preset rule includes:
planning a shortest path between the current focus area and a target position according to a preset direction;
and moving the current focus area to a target focus area according to the shortest path.
In the interface control method provided by the present application, the method further includes:
sending configuration information to the second electronic equipment so that the second electronic equipment generates a zoomed screen projection interface based on the configuration information and the interface content;
the configuration information includes resolution information, and the zoomed screen projection interface is obtained by the second electronic device scaling the interface content in equal proportion according to the resolution information.
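As a concrete illustration of the equal-proportion principle, the scaled projection size can be computed as below. This is a minimal sketch: the function name and the sample resolutions are assumptions for illustration and do not come from the application.

```python
def scaled_projection_size(src_w, src_h, dst_w, dst_h):
    """Scale the interface content to fit the destination screen while
    preserving the aspect ratio (the equal-proportion principle)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale), scale

# Example: a 1920x1080 TV interface projected onto a 1080x2340 phone screen
w, h, s = scaled_projection_size(1920, 1080, 1080, 2340)
```

With these sample resolutions the projection occupies the full 1080-pixel width of the phone screen, leaving blank margins above and below.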
In the interface control method provided by the present application, the configuration information further includes gesture information, and the method further includes:
sending gesture information to the second electronic device to enable the second electronic device to generate a blank area;
receiving sliding track information sent by the second electronic device, wherein the sliding track information is generated by the second electronic device based on a touch screen event triggered by a user to an empty area;
when the sliding track information is successfully matched with a preset sliding track, determining a corresponding control instruction according to the successfully matched preset sliding track;
and responding to the control instruction, and executing corresponding operation on the interface content.
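A minimal sketch of matching sliding-track information against preset sliding tracks is given below. The preset tracks, the instruction names, and the distance threshold are all hypothetical; the application does not specify a concrete matching rule.

```python
# Hypothetical mapping from a gesture's dominant direction to a control
# instruction; these names are illustrative, not from the application.
PRESET_TRACKS = {
    "left": "KEY_BACK",
    "up": "KEY_VOLUME_UP",
    "down": "KEY_VOLUME_DOWN",
}

def match_track(points, min_dist=50):
    """Classify a sliding track (a list of (x, y) points) by its dominant
    direction and look up the corresponding control instruction, or return
    None when no preset track matches."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_dist:
        return None  # track too short: treat as no match
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return PRESET_TRACKS.get(direction)
```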
In the interface control method provided by the application, the configuration information further includes virtual key information; the method further comprises the following steps:
sending virtual key information to the second electronic device to generate a virtual key area on the second electronic device;
receiving a virtual key instruction sent by the second electronic device, wherein the virtual key instruction is generated by the second electronic device based on a user operation in the virtual key area;
and responding to the virtual key instruction, and executing corresponding operation on the interface content.
In a second aspect, the present application provides an interface control system including a first electronic device and a second electronic device, where the first electronic device is configured to execute the interface control method described above, and the second electronic device is configured to execute steps including:
the second electronic equipment is used for receiving the interface content sent by the first electronic equipment and generating a screen projection interface;
the second electronic equipment generates coordinate information based on a touch screen event of the user on the screen projection interface, and scales the coordinate information to obtain target coordinate information;
and sending the target coordinate information to the first electronic equipment.
In the interface control system provided by the present application, the steps executed by the second electronic device further include:
receiving configuration information sent by the first electronic device, wherein the configuration information comprises resolution information;
zooming the interface content according to the resolution information and an equal proportion principle to generate a zoomed screen projection interface;
generating coordinate information based on a touch screen event triggered by the zoomed screen projection interface by a user, and zooming the coordinate information according to the equal proportion principle to obtain the target coordinate information;
and sending the target coordinate information to the first electronic equipment.
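The inverse scaling performed on the touch coordinates can be sketched as follows. This is an illustrative sketch: the function name, the margin offsets (which assume the projected area may not fill the whole touch screen), and the sample values are assumptions, not taken from the application.

```python
def to_target_coordinates(touch_x, touch_y, scale, offset_x=0, offset_y=0):
    """Invert the equal-proportion zoom: map a touch point on the scaled
    screen-projection interface back into the first device's original
    coordinate space. The offsets account for any blank margin around the
    projected area on the touch screen."""
    return ((touch_x - offset_x) / scale, (touch_y - offset_y) / scale)

# With a 0.5 zoom and no margin, phone point (540, 304) maps back to the
# first device's coordinate space
tx, ty = to_target_coordinates(540, 304, 0.5)
```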
In a third aspect, the present application further provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable by the processor, wherein the computer program, when executed by the processor, implements the steps of the interface control method as described above.
In a fourth aspect, the present application further provides a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the steps of the interface control method as described above.
Compared with the prior art, in the interface control method provided by the embodiment of the application, the first electronic device can determine the target position according to the target coordinate information, determine the target focus area according to the position relation between the target position and the current focus area, and simulate the confirmation key instruction in the target focus area, so that the electronic device without the touch screen function can be quickly and conveniently controlled, the user operation is greatly facilitated, and the experience of the user is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of an interface control method according to an embodiment of the present disclosure.
Fig. 2 is an interface content diagram of a first electronic device according to an embodiment of the present application.
Fig. 3 is a diagram illustrating a positional relationship between a current focal region and a target position in the embodiment of the present application.
Fig. 4 is a schematic structural diagram of a screen projection interface displayed on a screen of a second electronic device in an embodiment of the present application.
Fig. 5 is a schematic structural diagram of another screen projection interface displayed on a screen of a second electronic device according to an embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that for the convenience of clearly describing the technical solutions of the embodiments of the present application, the words "first", "second", and the like are used in the embodiments of the present application to distinguish the same items or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The inventor of the present application observed that a user needs a control device such as a remote controller to operate an electronic device without a touch screen, and that operating an existing remote controller is cumbersome, giving a poor user experience. In view of this, an embodiment of the present application provides a method for controlling an electronic device. The method is applied to a first electronic device that includes a non-touch screen and mainly includes: sending interface content to a second electronic device, which includes a touch screen, so that the second electronic device generates a screen projection interface; receiving target coordinate information fed back by the second electronic device and determining a target position based on it, where the target coordinate information is generated by the second electronic device based on a touch-screen event triggered by the user on the screen projection interface; acquiring the current focus area; determining the positional relationship between the target position and the current focus area; determining a target focus area based on the positional relationship and simulating a confirm-key instruction in the target focus area; and responding to the confirm-key instruction to complete the operation on the interface content.
It can be understood that the first electronic device in the embodiments of the present application may include a device without a touch screen, such as a smart television, a smart set-top box, a smart projector, and a smart speaker. That is, the first electronic device does not have a touch screen function.
The second electronic device may include a smart phone, tablet, or the like having a touch screen. That is, the second electronic device has a touch screen function.
Before the first electronic device sends interface content to the second electronic device, the first electronic device needs to establish connection with the second electronic device and perform communication. The first electronic device and the second electronic device can be connected through a communication network, so that the first electronic device and the second electronic device are in the same network.
The communication network may be a local area network (LAN) or a wide area network (WAN) such as the Internet, and may be implemented using any known network communication protocol, wired or wireless: for example Ethernet, Universal Serial Bus (USB), FireWire (IEEE 1394), Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet Protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol. For example, in some embodiments, the first electronic device and the second electronic device may establish a Wi-Fi connection via the Wi-Fi protocol.
Referring to fig. 1, an embodiment of the present application provides an interface control method, which is applied to a first electronic device, where the first electronic device includes a touchless screen, and the method includes:
s100, sending interface content to second electronic equipment to enable the second electronic equipment to generate a screen projection interface, wherein the second electronic equipment comprises a touch screen.
It is understood that the interface content sent by the first electronic device may include, but is not limited to: the screen interface of the first electronic device, or video, audio, images, documents, games, etc. played on it. Fig. 2 illustrates an interface content diagram of a first electronic device. Here the first electronic device is a smart television; a function bar is configured at the top of its screen, and the function bar may include several sub-function bars, such as recommendation, movie, education, shopping, game, and application bars, for the user to select. When the cursor stays on a sub-function bar, all application icons included in that sub-function bar are displayed below the function bar. The arranged application icons and the function bar together form the interface content.
For example, when the cursor stays on the "application bar", all application icons included in the "application bar" are displayed on the screen. At this time, the main interface formed by those application icons and the function bar is the interface content. After receiving it, the second electronic device can display the interface content on its own screen, that is, display the screen projection interface.
S200, receiving target coordinate information fed back by the second electronic device, and determining a target position based on the target coordinate information, where the target coordinate information is generated by the second electronic device based on a touch-screen event triggered by the user on the screen projection interface.
Generally, the user touches the touch screen of the second electronic device with a finger or a stylus to generate a touch-screen event. The second electronic device determines the position the user touched from that event, obtaining the target coordinate information, and sends it to the first electronic device. The first electronic device then determines the target position on itself that corresponds to the position of the touch-screen event.
Taking fig. 2 as an example, the first electronic device is a smart television and the second electronic device is a smartphone; the smart television sends its current interface content to the smartphone. Suppose the user touches "local play" on the smartphone's screen projection interface: the smartphone generates target coordinate information from the touch-screen event and sends it to the smart television. The smart television receives the target coordinate information and looks up the corresponding target position, namely the position of "local play" on the smart television.
S300, acquiring a current focus area;
the current focus area may be a position of a cursor of the first electronic device. Exemplarily, the first electronic device is taken as an example of an intelligent electronic device without a touch screen function; referring to fig. 2, the cursor in fig. 2 stays at the "local play", where the position of the "local play" is the current focus area, and in order to distinguish the cursor area from other areas, the electronic device may appropriately enlarge the cursor area, or configure different colors, etc. to complete the differentiated setting.
S400, judging the position relation between the target position and the current focus area;
when a user clicks a certain APP icon in a screen projection interface on the second electronic device, the second electronic device sends target coordinate information of the touch point to the first electronic device. The first electronic device judges a position area of the APP, on the first electronic device, required to be triggered by the user according to the target coordinate information, wherein the position area is often different from the current focus area, so that the position relationship between the target coordinate information and the current focus area needs to be determined.
S500, determining a target focus area based on the position relation, and simulating a confirmation key instruction in the target focus area;
wherein, based on the position relationship, determining a target focus area comprises:
when the target position is located in the current focus area, determining the current focus area as a target focus area;
and when the target position is not located in the current focus area, moving the current focus area to a target focus area according to a preset rule.
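The rule above reduces to a point-in-rectangle test over focus areas; a minimal sketch follows, in which the class and function names are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class FocusArea:
    x: int  # top-left corner
    y: int
    w: int  # width and height
    h: int

    def contains(self, px, py):
        """True if the point (px, py) lies inside this focus area."""
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def determine_target_focus_area(current, target_pos, all_areas):
    """Keep the current focus area if it already contains the target
    position; otherwise the focus area that contains the target position
    becomes the target focus area the focus must be moved to."""
    if current.contains(*target_pos):
        return current
    return next((a for a in all_areas if a.contains(*target_pos)), None)
```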
In general, the target focus area is the focus area whose bounds contain the target position (i.e., the converted coordinates).
Wherein, the preset rule may include: when the target position (i.e. the converted coordinates) is not located in the focus area, the shortest path between the current focus area and the target position is planned according to the preset direction. And then, according to the shortest path, moving the current focus area to the target focus area.
In order to facilitate understanding of the application scenario of the above scheme, a smart television without a virtual key function on the market is taken as an example for description. In an embodiment of the present application:
referring to fig. 3, fig. 3 is a schematic structural diagram illustrating that the current focus area is moved to the target position in the embodiment of the present application. When the first electronic device determines that the target position is not in the current focus area, the first electronic device may determine the shortest path according to all tracks and directions of the current focus area moving to the target focus area, so that the first electronic device moves the position of the current focus area to the target focus area, and the target position falls into the target focus area.
As for how the shortest path is determined, the first electronic device may determine it by the number of moves required. Taking the directions in fig. 3 as an example, the paths for moving the current focus area to the target focus area include:
Path one: move down once, then left twice.
Path two: move left twice, down once, then right once.
At this time, the first electronic device may take the path with the smallest number of movements as the most suitable movement path, and therefore determines that path one is the shortest path. The first electronic device then simulates direction-key instructions according to the track and moving directions of path one, so that the current focus area moves to the specified target position and coincides with the target focus area.
If the moving times of the paths are the same in the above scheme, the path with the minimum distance length may be used as the shortest path.
The calculation rule for the distance length may include:
and sequentially calculating the distance between the centers of each adjacent focus area along a path planned according to the moving times, and finally summing to obtain the distance length.
Taking path one in fig. 3 as an example, the length of path one may be:
sequentially calculating the distance between the center point of the current focus area and the center point of the lower focus area, the distance between the center point of the lower focus area (namely the focus area where the lower application icon is located) and the center point of the left adjacent focus area, and the distance between the center point of the left adjacent focus area and the center point of the target focus area; and summing all the calculated distances to obtain the length of path one.
For example, assuming that path one and path two in fig. 3 require the same number of moves, the method for determining the shortest path further includes:
the first electronic device obtains a first distance length of path one and a second distance length of path two;
the first electronic device compares the first distance length with the second distance length, and when the first distance length is smaller than the second distance length, path one is the shortest path.
Of course, if path one and path two require the same number of moves and the first distance length equals the second distance length, the first electronic device may select either path one or path two at random.
Alternatively, the shortest path may be selected primarily according to the moving distance length. Still taking fig. 3 as an example, the paths for moving the current focus area to the target focus area include path one and path two. As is apparent from the figure, the first distance length of path one is smaller than the second distance length of path two, so path one is the shortest path. If the distance lengths of several paths are the same, the path with the fewest moves may be used as the shortest path; and if both the distance length and the number of moves are the same among multiple paths, the first electronic device may randomly select one of them as the shortest path.
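The selection rules above (fewest moves first, ties broken by summed center-to-center distance, remaining ties broken arbitrarily) can be sketched as follows. This is an illustrative reading of the scheme, not the patented implementation; the candidate paths and their center points are invented for the example.

```python
import math

def path_length(centers):
    # Sum of the distances between the center points of each pair of
    # adjacent focus areas along the path, as described above.
    return sum(math.dist(centers[i], centers[i + 1])
               for i in range(len(centers) - 1))

def shortest_path(candidate_paths):
    # Fewest moves wins; ties are broken by the smaller distance length,
    # and min() then picks the first of any remaining ties (standing in
    # for the "random selection" mentioned in the text).
    return min(candidate_paths,
               key=lambda p: (len(p) - 1, path_length(p)))

# Path one: down once, then left twice (3 moves).
path_one = [(2, 0), (2, 1), (1, 1), (0, 1)]
# Path two: left twice, down once, then right once (4 moves).
path_two = [(2, 0), (1, 0), (0, 0), (0, 1), (1, 1)]

best = shortest_path([path_one, path_two])  # path one is selected
```

Here the tuple key makes the number of moves dominate, with distance length used only as a tie-breaker, matching the first scheme described above.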
After the target focus area is determined, the first electronic device may move the current focus area to the designated target focus area, and then simulate a confirm key instruction within the target focus area.
S600, responding to the confirm key instruction to complete the operation on the interface content.
When the first electronic device detects that the target position is within the focus area, it immediately simulates a confirm key instruction (namely the 'confirm' key) and activates the APP corresponding to the target focus area, thereby realizing the operation on the interface content of the first electronic device.
In addition, in view of the difference between the control modes of existing first and second electronic devices, some APPs of one device cannot run on the other; for example, the mobile phone version of the Youku APP cannot run on a smart television, so a developer has to separately develop a TV version of the Youku APP for the smart television, which increases the developer's development cost.
With the interface control method described above, the mobile phone version of the Youku APP can be used on the smart television, reducing the developer's development cost.
In an embodiment of the application, referring to fig. 2, when a user touches the Youku APP in the interface content (namely the screen projection interface) on the smart phone, the smart phone enlarges the coordinate information of the touch according to a second preset proportion to obtain target coordinate information, and sends the target coordinate information to the smart television. The smart television determines the target position touched by the user according to the target coordinate information and judges that the target position falls within the focus area of the Youku APP on the television screen. Since the target position is located in the current focus area, the smart television simulates a confirm key instruction, namely opens the Youku APP, and the Youku APP on the smart television is opened accordingly.
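The containment check implied here — deciding whether the target position derived from the received coordinates falls inside a focus area's rectangle — can be sketched as below. The rectangle bounds are hypothetical, not taken from the patent.

```python
def in_focus_area(target, area):
    # target: (x, y) point; area: (left, top, width, height) rectangle.
    x, y = target
    left, top, width, height = area
    return left <= x <= left + width and top <= y <= top + height

# Hypothetical bounds of the Youku APP's focus area on the TV screen.
youku_area = (100, 200, 160, 160)

# A target position inside the area would trigger the simulated confirm
# key; one outside it would trigger the path planning instead.
hit = in_focus_area((150, 250), youku_area)
miss = in_focus_area((50, 50), youku_area)
```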
In another embodiment of the present application, referring to fig. 2, suppose the user wants to return to the desktop of the smart television after watching a movie through the Youku APP on the smart television. The user can click the return control on the video playing interface of the Youku APP on the smart phone. After the user clicks the return control, the smart phone generates the corresponding coordinate information and converts it according to the equal-proportion principle to obtain target coordinate information. The smart television receives the target coordinate information, determines the target position, and judges that the target position falls within the focus area of the 'return control' on the smart television; that is, the target position is located in the current focus area, so the smart television simulates a confirm key instruction and triggers the 'return control' to execute the return operation.
It should be emphasized that video playing APPs such as the Youku APP, Tencent Video APP, iQIYI APP, and bilibili APP all have a return control, or at least a return function, when playing video or audio.
Further, when the target position is not located within the current focus area, the situation may be as follows:
Referring to fig. 2, the current focus area on the smart television is the position of 'Local Play'. The user clicks 'Youku Video' on the smart phone, and the smart phone generates target coordinate information and sends it to the smart television. The smart television then plans the shortest path between the target position and the position of the current focus area according to the preset rule. As can be seen from fig. 2, there are many possible paths, and the shortest path passes through 'Local Play', 'NetEase Open Course', 'Sohu Video MAX', and 'Youku Video' in sequence.
Then, according to the planned shortest path, the smart television moves the current focus area to the target position, namely to 'Youku Video'. After the current focus area reaches 'Youku Video', the smart television simulates a confirm key instruction, and 'Youku Video' is opened.
The inventor of the present application finds that, in most cases, the screen sizes of the first electronic device and the second electronic device differ; in that case, to ensure the user's experience, the first electronic device may also send its own configuration information to the second electronic device.
When the configuration information of the first electronic device includes resolution information, the second electronic device receives the resolution information and, based on the resolution information and the size of its own screen, displays the interface content in equal proportion according to the equal-proportion principle, thereby obtaining a zoomed screen projection interface.
Displaying the interface content in equal proportion according to the equal-proportion principle may mean: after the screen is projected to the second electronic device, the ratio of width to height of the screen projection interface displayed on the second electronic device is the same as the ratio of width to height of the interface content of the first electronic device. The zoomed interface content, namely the zoomed screen projection interface, is then displayed on the second electronic device. The equal-proportion principle keeps the width-to-height ratio of the screen projection interface unchanged, prevents the screen projection interface from deforming, and thus preserves the user's experience.
Illustratively, when the first electronic device is a smart television, the second electronic device is a smart phone, and the resolution of the smart television is 1920 x 1080, the ratio of width to height of the interface content of the smart television may be recorded as 16/9. Due to the size limitation of the smart phone's screen, the smart phone displays the interface to be projected from the smart television on its own screen in equal proportion; for example, the resolution of the interface content is reduced to 960 x 540, and the ratio of width to height of the screen projection interface remains 16/9.
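The 1920 x 1080 to 960 x 540 example follows from a simple fit computation. The sketch below is one way to realize the equal-proportion principle; the phone's available display area is an assumed parameter, not a value from the patent.

```python
def fit_equal_proportion(src_w, src_h, max_w, max_h):
    # Scale (src_w, src_h) to fit inside (max_w, max_h) while keeping
    # the width-to-height ratio unchanged, per the equal-proportion
    # principle described above.
    scale = min(max_w / src_w, max_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# Fitting the TV's 1920x1080 interface into an assumed 1080x540
# display area on the phone halves both dimensions, keeping 16/9.
w, h = fit_equal_proportion(1920, 1080, 1080, 540)
```

Taking the minimum of the two axis ratios guarantees the scaled interface fits both dimensions without distortion, which is exactly why the projection interface does not deform.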
Furthermore, for aesthetics, the screen projection interface may be positioned near the middle of the second electronic device's screen.
In this scheme, the second electronic device may generate coordinate information based on a touch screen event triggered by the user on the zoomed screen projection interface;
the second electronic device may then convert the coordinate information to obtain target coordinate information, and send the target coordinate information to the first electronic device.
Specifically, when the user operates with a finger or a stylus in the screen projection interface and triggers a touch event, the second electronic device acquires the coordinate information of the touch event;
the second electronic device converts the coordinate information to obtain the target coordinate information and sends it to the first electronic device.
Converting the coordinate information may include: scaling the coordinate information according to the equal-proportion principle to obtain the target coordinate information.
When the smart television projects its screen to the smart phone, the size of the smart television is larger than that of the smart phone, so the screen projection interface needs to be compressed according to the equal-proportion principle so that it can be clearly displayed on the smart phone.
Correspondingly, if the content of the smart phone is projected onto the smart television, the size of the smart television being larger than that of the smart phone, the screen projection interface needs to be enlarged according to the equal-proportion principle so that it can be clearly displayed on the smart television.
To facilitate understanding of the step of scaling the coordinate information according to the equal-proportion principle, the following description takes a smart television and a smart phone as examples:
assume that the relative coordinate information (x, y) of the position touched by the user is acquired on the smart phone (with the upper-left corner of the touch area as the origin); the relative coordinate information is converted into an absolute coordinate (nx, ny) (with the upper-left corner of the whole screen as the origin) according to the equal-proportion principle, and the absolute coordinate is sent to the first electronic device.
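One way to realize this conversion is to scale the relative coordinates by the ratio between the two interfaces' dimensions. A minimal sketch, with the interface sizes as assumed values:

```python
def to_target_coords(x, y, proj_w, proj_h, screen_w, screen_h):
    # Scale a touch point (x, y) measured in the projection interface
    # (proj_w x proj_h) up to the first device's full screen
    # (screen_w x screen_h) by the equal-proportion principle.
    return x * screen_w / proj_w, y * screen_h / proj_h

# A touch at (480, 270) on a 960x540 projection interface maps to the
# corresponding point on a 1920x1080 television screen.
nx, ny = to_target_coords(480, 270, 960, 540, 1920, 1080)
```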
It should be emphasized that the algorithm for converting the coordinate information is not limited in the embodiments of the present application.
As a further aspect of the present application, the configuration information may further include gesture information.
The first electronic device sends the gesture information to the second electronic device, so that a blank area is generated on the second electronic device;
the first electronic device receives sliding track information sent by the second electronic device;
the first electronic device matches the sliding track information against preset sliding tracks, and when the sliding track information is successfully matched with a preset sliding track, determines the corresponding control instruction according to the successfully matched preset sliding track. Preset sliding tracks are pre-stored in the first electronic device, and different preset sliding tracks correspond to different operation instructions.
The first electronic device can respond to the control instruction and execute the corresponding operation, so as to complete the operation on the interface content.
The preset sliding tracks might conflict with the inherent sliding tracks of the second electronic device; in view of this, the embodiment of the present application further includes:
the second electronic device judges whether the sliding track information corresponds to an inherent sliding track of the second electronic device or to a sliding track capable of controlling the first electronic device;
if the sliding track information corresponds to an inherent sliding track of the second electronic device, the second electronic device does not send it; if not, the second electronic device sends the sliding track information, and the first electronic device then matches it against the preset sliding tracks;
if the matching succeeds, the first electronic device responds to the operation instruction triggered by the sliding track information and executes the operation corresponding to that instruction.
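This two-stage flow (the second device filters out its own inherent tracks; the first device matches the remainder against its preset tracks) can be sketched as below. The track names and control instructions are invented for illustration.

```python
INHERENT_TRACKS = {"edge_swipe_back"}        # handled by the phone itself
PRESET_TRACKS = {                            # pre-stored on the first device
    "blank_left_to_right": "NEXT_PAGE",
    "blank_right_to_left": "PREV_PAGE",
}

def second_device_send(track):
    # The second device does not send tracks that belong to its own
    # inherent sliding tracks; everything else is forwarded.
    return None if track in INHERENT_TRACKS else track

def first_device_match(track):
    # The first device matches the received track against its preset
    # tracks; a successful match yields the control instruction.
    return PRESET_TRACKS.get(track)

sent = second_device_send("blank_left_to_right")
instruction = first_device_match(sent) if sent is not None else None
```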
The following describes the above scheme in detail by taking the first electronic device as a smart television, the second electronic device as a smart phone, and displaying a blank area on the smart phone as an example.
Referring to fig. 4, fig. 4 shows a screen projection interface displayed on a screen of a smartphone, where a space is left between a boundary of the screen projection interface and an adjacent screen boundary to form a blank area (i.e., a black frame portion in the drawing).
According to an embodiment of the application, preset sliding tracks are stored in the smart television; the smart television can determine the corresponding operation instruction according to a preset sliding track and adjust the interface content according to that instruction. When the user slides from one end of the blank area to the other on the smart phone, the smart phone acquires the sliding track information and sends it to the smart television, which matches it against the preset sliding tracks. When the matching succeeds, the smart television determines the control instruction corresponding to the successfully matched sliding track and executes the corresponding operation to adjust the interface content.
Meanwhile, if the sliding track corresponds to an inherent sliding track of the smart phone, the smart phone determines the corresponding control instruction itself, and the smart television does not receive the sliding track information.
In addition, as another embodiment of the present application, the preset sliding tracks may be made different from the inherent sliding tracks of the second electronic device, namely the smart phone.
Illustratively, when the user slides a finger within the screen projection interface, the start point and the end point of the corresponding sliding track are both located in the screen projection interface, and that sliding track is used to control the content of the screen projection interface on the smart phone. When the start point of the sliding track is located in the blank area or the screen projection interface and the end point is located in another blank area, the sliding track is used to control the smart phone itself rather than the screen projection interface. In this way, conflicts between the inherent sliding tracks of the smart phone and the sliding tracks that control the screen projection interface can be effectively avoided.
Furthermore, when the user needs to control the smart television through the smart phone, the user can slide a finger within the screen projection interface, so that the smart television determines the corresponding control instruction according to the sliding track and performs the corresponding operation. If the user needs to operate the smart phone itself, for example to split the smart phone's screen, the sliding track of the split-screen operation instruction may include: the user's thumb located in the screen projection interface, the index finger located in the blank area, and the thumb and index finger moving toward each other. When the user slides fingers on the smart phone's screen to form this sliding track, the smart phone executes the split-screen operation.
In addition, the sliding track information can be set manually, so that different sliding track information corresponds to different control instructions. For example, if a swipe from one point to another on the smart phone's screen covers more than half the screen distance, the smart phone returns to its home page; if the swipe does not exceed the half-screen distance, the smart television returns to its home page.
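The half-screen rule in this example reduces to a single distance comparison. A sketch, with the dispatch targets named only for illustration:

```python
def dispatch_swipe(start_x, end_x, screen_width):
    # A swipe longer than half the screen width sends the phone back to
    # its home page; a shorter swipe sends the television back to its
    # home page, per the manually set rule described above.
    if abs(end_x - start_x) > screen_width / 2:
        return "PHONE_HOME"
    return "TV_HOME"

long_swipe = dispatch_swipe(0, 700, 1080)    # more than half the screen
short_swipe = dispatch_swipe(0, 300, 1080)   # less than half the screen
```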
As a further aspect of the present application, the configuration information may further include virtual key information.
The first electronic device sends the virtual key information to the second electronic device, so that a virtual key area is generated on the second electronic device.
The first electronic device receives a virtual key instruction sent by the second electronic device, where the virtual key instruction is generated by the second electronic device based on the user's operation in the virtual key area;
the first electronic device responds to the virtual key instruction and executes the corresponding operation, so as to complete the operation on the interface content.
Correspondingly, in this further scheme, the second electronic device may also provide a virtual key area in addition to the screen projection interface and the blank area. The second electronic device can display the screen projection interface in the middle of its screen, with a blank area on one side of the screen projection interface and a virtual key area on the other side. For example, referring to fig. 4, a virtual key area may be arranged on the left side of the screen projection interface in fig. 4, and may include a progress adjustment key and a volume adjustment key, while a blank area is arranged on the right side of the screen projection interface. When the user touches a virtual key in the virtual key area, a virtual key instruction corresponding to that key is generated and sent to the first electronic device, and the first electronic device executes the corresponding operation according to the virtual key instruction.
The virtual keys may include a direction adjustment control, a confirm control, a volume adjustment control, and the like. Thus, when the user clicks the direction adjustment control on the second electronic device, a movement operation instruction is generated and sent to the first electronic device, and the first electronic device moves the cursor position according to the movement operation instruction.
Of course, the embodiment of the present application may also be applied to the case of multiple first electronic devices. Referring to fig. 5, fig. 5 is a schematic structural diagram of another screen projection interface displayed on the screen of the second electronic device according to an embodiment of the present disclosure. In fig. 5, the second electronic device is a smart phone, and the two first electronic devices are a smart television and a smart speaker. The smart phone can lay out the interface contents according to the configuration information of the smart television and the smart speaker, combined with the size of its own screen.
In general, if multiple first electronic devices without a touch screen project their screens to one second electronic device, the second electronic device may lay out the interface contents according to the configuration information of each first electronic device, combined with the size of its own screen.
For example, suppose the configuration information of the smart television includes only resolution information, while the configuration information of the smart speaker includes resolution information, virtual key information, and gesture information. The user may project the interface contents of the smart television and the smart speaker to the smart phone simultaneously or in sequence. The smart phone can then reasonably divide its screen according to the configuration information of the two projecting devices and arrange the corresponding screen projection interfaces, virtual key area, and blank area. Taking the orientation of fig. 5 as an example, the screen projection interface corresponding to the smart television is displayed on the left side of the smart phone, and the screen projection interface corresponding to the smart speaker is displayed on the right side; meanwhile, a virtual key area can be displayed above the screen projection interface corresponding to the smart speaker, and a blank area below it.
To save bandwidth and ensure transmission efficiency, the first electronic device (e.g., a smart television) may compress the interface content to a certain extent while ensuring that the picture outline in the interface content can still be clearly displayed on the screen of the second electronic device (e.g., a smart phone).
It will be appreciated that the user may select the degree of compression of the interface content based on the actual circumstances.
An embodiment of the present application further provides an interface control system, where the system includes a first electronic device and a second electronic device, the first electronic device is configured to execute the interface control method described above, and the steps executed by the second electronic device include:
receiving the interface content sent by the first electronic device and generating a screen projection interface;
generating coordinate information based on the user's touch screen event on the screen projection interface, and scaling the coordinate information to obtain target coordinate information;
and sending the target coordinate information to the first electronic device.
Further, the steps executed by the second electronic device may also include: receiving configuration information sent by the first electronic device, where the configuration information includes resolution information;
zooming interface contents according to the resolution information and an equal proportion principle to generate a screen projection interface;
generating coordinate information based on a touch screen event triggered by a user on a screen projection interface, and zooming the coordinate information according to an equal proportion principle to obtain target coordinate information;
and sending the target coordinate information to the first electronic equipment.
Specifically, when the user operates with a finger or a stylus in the compressed screen projection interface and triggers a touch event, the second electronic device acquires the coordinate information of the touch event;
the second electronic device then converts the coordinate information, generates the target coordinate information from the converted coordinates, and sends the target coordinate information to the first electronic device.
Of course, the configuration information may also include gesture information; the second electronic device receives the gesture information and generates the blank area according to it.
The second electronic device acquires the sliding track information of the user in the blank area and, when it judges that the sliding track information is not inherent track information of the second electronic device, sends the sliding track information to the first electronic device, so that the first electronic device can execute the corresponding operation according to the sliding track information.
Further, the configuration information may also include virtual key information, and the second electronic device generates the corresponding virtual key area according to the virtual key information. When the user clicks a virtual key in the virtual key area on the second electronic device, the second electronic device sends the corresponding virtual key instruction to the first electronic device.
An embodiment of the present application further provides an electronic device; referring to fig. 6, fig. 6 is a schematic structural diagram of the electronic device. The electronic device 500 includes a processor (CPU, GPU, FPGA, etc.) 501, which can perform part or all of the processing in the embodiments shown in the above-described drawings according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage section 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data necessary for the operation of the electronic device 500. The processor 501, the ROM 502, and the RAM 503 are connected to one another by a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), and a speaker; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card or a modem. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read out from it can be installed into the storage section 508 as needed.
In particular, according to embodiments of the present application, the method described above with reference to the figures may be implemented as a computer software program. For example, embodiments of the present application include a computer program product comprising a computer program tangibly embodied on a computer-readable medium, the computer program comprising program code for performing the methods shown in the figures. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 509, and/or installed from the removable medium 511.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented by software or hardware. The units or modules described may also be provided in a processor, and the names of the units or modules do not in some cases constitute a limitation on the units or modules themselves.
As another embodiment, the present application also provides a computer-readable storage medium, which may be the computer-readable storage medium included in the screen projection device in the above embodiment; or it may be a separate computer readable storage medium not incorporated into the device. The computer-readable storage medium stores one or more programs, which are used by one or more processors to execute the control method of the electronic device described in the present application.
While the present application has been described with reference to specific embodiments, the scope of protection is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope disclosed herein. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An interface control method, applied to a first electronic device, the first electronic device comprising a non-touch screen, the method comprising:
sending interface content to a second electronic device, so that the second electronic device generates a screen projection interface, wherein the second electronic device comprises a touch screen;
receiving target coordinate information fed back by the second electronic device, and determining a target position based on the target coordinate information, wherein the target coordinate information is generated by the second electronic device based on a touch screen event triggered by a user on the screen projection interface;
acquiring a current focus area;
judging the positional relationship between the target position and the current focus area;
determining a target focus area based on the positional relationship, and simulating a confirm key instruction within the target focus area;
and responding to the confirm key instruction to complete the operation on the interface content.
2. The control method according to claim 1, wherein the determining a target focus area based on the positional relationship comprises:
when the target position is within the current focus area, determining the current focus area as the target focus area;
and when the target position is not within the current focus area, moving the current focus area to a target focus area according to a preset rule.
3. The control method according to claim 2, wherein the moving the current focus area to a target focus area according to a preset rule comprises:
planning the shortest path between the current focus area and the target position according to preset directions;
and moving the current focus area to the target focus area according to the shortest path.
4. The control method according to any one of claims 1-3, further comprising:
sending configuration information to the second electronic device, so that the second electronic device generates a zoomed screen projection interface based on the configuration information and the interface content;
wherein the configuration information comprises resolution information, and the zoomed screen projection interface is obtained by the first electronic device scaling the interface content based on the equal-proportion principle according to the resolution information.
5. The control method according to claim 4, wherein the configuration information further comprises gesture information, and the method further comprises:
sending the gesture information to the second electronic device, so that the second electronic device generates a blank area;
receiving sliding track information sent by the second electronic device, wherein the sliding track information is generated by the second electronic device based on a touch screen event triggered by the user on the blank area;
when the sliding track information is successfully matched with a preset sliding track, determining the corresponding control instruction according to the successfully matched preset sliding track;
and in response to the control instruction, performing the corresponding operation on the interface content.
6. The control method according to claim 4, wherein the configuration information further comprises virtual key information, and the method further comprises:
sending the virtual key information to the second electronic device to generate a virtual key area on the second electronic device;
receiving a virtual key instruction sent by the second electronic device, wherein the virtual key instruction is generated by the second electronic device based on the user's operation in the virtual key area;
and in response to the virtual key instruction, performing the corresponding operation on the interface content.
7. An interface control system, comprising a first electronic device and a second electronic device, wherein the first electronic device is configured to execute the interface control method according to any one of claims 1-6, and the steps executed by the second electronic device comprise:
receiving the interface content sent by the first electronic device and generating a screen projection interface;
generating coordinate information based on the user's touch screen event on the screen projection interface, and scaling the coordinate information to obtain target coordinate information;
and sending the target coordinate information to the first electronic device.
8. The control system according to claim 7, wherein the steps executed by the second electronic device further comprise:
receiving configuration information sent by the first electronic device, wherein the configuration information comprises resolution information;
scaling the interface content according to the resolution information and the equal-proportion principle to generate a zoomed screen projection interface;
generating coordinate information based on a touch screen event triggered by the user on the zoomed screen projection interface, and scaling the coordinate information according to the equal-proportion principle to obtain the target coordinate information;
and sending the target coordinate information to the first electronic device.
9. An electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable by the processor, wherein when the computer program is executed by the processor, the steps of the interface control method according to any one of claims 1 to 6 are implemented.
An electronic device, characterized in that the electronic device comprises a processor, a memory, and a computer program stored on the memory and executable by the processor, wherein the computer program is executed by the processor During execution, the steps of the interface control method according to any one of claims 1 to 6 are implemented. 10.一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机程序,其中所述计算机程序被处理器执行时,实现如权利要求1至6中任一项所述的界面控制方法的步骤。10. A computer-readable storage medium, characterized in that, a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program as claimed in any one of claims 1 to 6 is implemented. The steps of the interface control method described above.
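Claim 3 plans a shortest path between the current focus area and the target position along preset directions, then steps the focus along it. The patent does not disclose an implementation; the sketch below is one illustrative reading in which focusable controls form a grid and the preset ordering is "horizontal first, then vertical" — the function names, the grid model, and the direction ordering are all assumptions.

```python
def plan_shortest_path(current, target):
    """Return a list of direction steps moving the focus from `current`
    to `target`, where each argument is an (x, y) cell on a grid of
    focusable controls. Horizontal moves are emitted before vertical
    ones, as one possible preset direction ordering."""
    path = []
    x, y = current
    tx, ty = target
    # Step column-by-column toward the target's x coordinate.
    while x != tx:
        path.append("RIGHT" if tx > x else "LEFT")
        x += 1 if tx > x else -1
    # Then step row-by-row toward the target's y coordinate.
    while y != ty:
        path.append("DOWN" if ty > y else "UP")
        y += 1 if ty > y else -1
    return path
```

On a grid this Manhattan-style walk is already a shortest path; a real focus framework would additionally have to skip cells with no focusable control, which the claim leaves to the "preset rule".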
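Claims 7-8 have the second device scale the projected interface in equal proportions and then scale the touch coordinates by the same factor before reporting them back, so the first device receives coordinates in its own native resolution. A minimal sketch of that mapping, with all names assumed for illustration:

```python
def scale_factor(src_res, dst_res):
    """Uniform (equal-proportion) scale that fits the source resolution
    (w, h) inside the destination resolution (w, h)."""
    sw, sh = src_res
    dw, dh = dst_res
    return min(dw / sw, dh / sh)

def to_source_coords(touch_xy, src_res, dst_res):
    """Map a touch point on the scaled projection interface back to the
    first device's native pixel coordinates by undoing the scale."""
    s = scale_factor(src_res, dst_res)
    x, y = touch_xy
    return (round(x / s), round(y / s))
```

For example, a 1920x1080 interface projected onto a 960x540 region has a scale factor of 0.5, so a touch at (480, 270) on the projection maps back to (960, 540) on the source device.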
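Claim 5 matches the user's sliding track in the blank area against preset tracks and dispatches the corresponding control instruction. The patent does not say how tracks are compared; one common approach, sketched below, quantises the sampled points into a sequence of dominant directions and looks that sequence up in a table. The preset tracks, instruction names, and quantisation scheme here are assumptions for illustration.

```python
# Hypothetical preset tracks mapped to control instructions.
PRESET_TRACKS = {
    ("LEFT",): "BACK",
    ("DOWN", "RIGHT"): "CONFIRM",
}

def to_directions(points):
    """Reduce a list of (x, y) samples to dominant move directions,
    collapsing consecutive repeats."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if dx == 0 and dy == 0:
            continue  # ignore stationary samples
        if abs(dx) >= abs(dy):
            d = "RIGHT" if dx > 0 else "LEFT"
        else:
            d = "DOWN" if dy > 0 else "UP"
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return dirs

def match_track(points):
    """Return the control instruction for a sliding track, or None
    when no preset track matches."""
    return PRESET_TRACKS.get(tuple(to_directions(points)))
```

A production implementation would likely add noise tolerance (minimum segment length, fuzzy matching), but the lookup structure is the same: track in, instruction out.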
CN202210468334.6A 2022-04-29 2022-04-29 Interface control method, control system, electronic device and storage medium Pending CN114911404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210468334.6A CN114911404A (en) 2022-04-29 2022-04-29 Interface control method, control system, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN114911404A (en) 2022-08-16

Family

ID=82765193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210468334.6A Pending CN114911404A (en) 2022-04-29 2022-04-29 Interface control method, control system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN114911404A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104298461A (en) * 2014-09-10 2015-01-21 海信集团有限公司 Method for controlling intelligent device through mobile terminal and mobile terminal of method
CN107071551A (en) * 2017-04-26 2017-08-18 四川长虹电器股份有限公司 Applied to the multi-screen interactive screen response method in intelligent television system
CN108124179A (en) * 2017-12-14 2018-06-05 上海斐讯数据通信技术有限公司 The method and system that focus switches between a kind of more controls
CN111629245A (en) * 2020-05-29 2020-09-04 深圳Tcl数字技术有限公司 Focus control method, television and storage medium
CN114240754A (en) * 2021-12-20 2022-03-25 海宁奕斯伟集成电路设计有限公司 Screen projection processing method, device, electronic device, and computer-readable storage medium


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116029746A (en) * 2023-03-28 2023-04-28 深圳市湘凡科技有限公司 Data processing method and related device
CN116029746B (en) * 2023-03-28 2023-07-04 深圳市湘凡科技有限公司 Data processing method and related device
CN118695043A (en) * 2024-07-03 2024-09-24 湖南快乐阳光互动娱乐传媒有限公司 A method and device for moving focus of user interface
CN118842889A (en) * 2024-07-24 2024-10-25 稻兴光学(厦门)有限公司 Method, system and equipment for controlling display of large-screen equipment

Similar Documents

Publication Publication Date Title
CN114911404A (en) Interface control method, control system, electronic device and storage medium
CN108595137B (en) Wireless screen projection method and device and screen projector
CN110515580B (en) Display control method, device and terminal
CN110456907A (en) Virtual screen control method, device, terminal equipment and storage medium
JP2013504826A (en) Method and apparatus for providing an application interface on a computer peripheral
WO2014002241A1 (en) Display system, display apparatus, display terminal, and display method and control program for display terminal
CN115120979B (en) Virtual object display control method, device, storage medium and electronic device
CN104301661A (en) Intelligent household monitoring method and client and related devices
CN106293563B (en) Control method and electronic equipment
CN105824399A (en) Control method, system and apparatus for projection equipment
KR101987859B1 (en) A program, a game system, an electronic device, a server, and a game control method for improving operability of user input
CN112073770A (en) Display device and video communication data processing method
CN110837308B (en) Information processing method and device and electronic equipment
CN110928509B (en) Display control method, display control device, storage medium, communication terminal
CN114461124B (en) Screen projection control method, device, screen projection device, and computer-readable storage medium
US20130244730A1 (en) User terminal capable of sharing image and method for controlling the same
CN113918070B (en) Synchronous display method, device, readable storage medium and electronic device
CN114461123A (en) Screen projection control method and device, screen projector and computer readable storage medium
TWI547862B (en) Multi - point handwriting input control system and method
CN110692036A (en) Presentation server, data relay method, and method for generating virtual pointer
US11875465B2 (en) Virtual reality data-processing device, system and method
CN115129277A (en) An interaction method, display device and VR device
CN115068937A (en) Game picture display adjusting method and device, storage medium and electronic equipment
KR102679825B1 (en) display device
CN111381670A (en) Virtual content interaction method, device, system, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220816