CN108154864B - Display screen processing method, first electronic device and second electronic device - Google Patents
Info
- Publication number
- CN108154864B CN201711403441.6A
- Authority
- CN
- China
- Prior art keywords
- display screen
- virtual display
- virtual
- user
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1446—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/002—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/14—Solving problems related to the presentation of information to be displayed
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
An embodiment of the invention provides a display screen processing method, a first electronic device, and a second electronic device. The first electronic device displays one or more virtual display screens through an optical lens module and is connected to a second electronic device that has a physical display screen. The first electronic device may acquire usage scenario information of the virtual display screen and adjust display parameters of the virtual display screen based on that information. Because the virtual display screen is not limited by physical devices or space, its display parameters, such as the display shape, can be changed.
Description
Technical Field
The invention relates to the field of communication technology, and in particular to a display screen processing method, a first electronic device, and a second electronic device.
Background
Current electronic devices all have physical display screens, and the display parameters of a physical display screen, such as its display form, cannot be changed.
Disclosure of Invention
In view of this, the present invention provides a display screen processing method, a first electronic device, and a second electronic device, so as to overcome the problem in the prior art that the display parameters of a physical display screen cannot be changed.
To achieve this purpose, the invention provides the following technical solutions:
a display screen processing method is applied to first electronic equipment, wherein the first electronic equipment displays a virtual display screen through an optical lens module, the first electronic equipment is connected with second electronic equipment, and the second electronic equipment is provided with a physical display screen; the method comprises the following steps:
acquiring usage scenario information of the virtual display screen;
and adjusting display parameters of the virtual display screen based on the usage scenario information.
A display screen processing method is applied to a second electronic device and includes the following steps:
acquiring usage scenario information of a virtual display screen displayed by a first electronic device through an optical lens module, where the second electronic device is connected to the first electronic device and has a physical display screen;
and adjusting display parameters of the virtual display screen in the first electronic device based on the usage scenario information.
A first electronic device connected to a second electronic device, the second electronic device having a physical display screen, the first electronic device comprising:
an optical lens module for displaying one or more virtual display screens;
a memory for storing a program;
a processor configured to execute the program, the program specifically configured to:
acquire usage scenario information of the virtual display screen;
and adjust display parameters of the virtual display screen based on the usage scenario information.
A second electronic device, comprising:
a physical display screen;
a memory for storing a program;
a processor configured to execute the program, the program specifically configured to:
acquire usage scenario information of one or more virtual display screens displayed by a first electronic device through an optical lens module, where the second electronic device is connected to the first electronic device;
and adjust display parameters of a virtual display screen in the first electronic device based on the usage scenario information.
As can be seen from the foregoing technical solutions, compared with the prior art, the embodiment of the present invention provides a display screen processing method in which a first electronic device displays one or more virtual display screens through an optical lens module, where the first electronic device is connected to a second electronic device and the second electronic device has a physical display screen. The first electronic device may acquire usage scenario information of the virtual display screen and adjust display parameters of the virtual display screen based on that information; because the virtual display screen is not limited by physical devices or space, its display parameters, such as the display shape, can be changed.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only embodiments of the present invention; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
FIG. 1 is a schematic diagram of a display screen processing system according to an embodiment of the present invention;
fig. 2 is a flowchart of a method of an implementation manner of a display screen processing method applied to a first electronic device according to an embodiment of the present invention;
fig. 3 is a schematic diagram of acquiring a gaze direction of a user according to an embodiment of the present invention;
fig. 4a to 4l and fig. 5a to 5e are schematic diagrams of application scenarios provided in the embodiment of the present invention;
fig. 6 is a flowchart of a display screen processing method applied to a second electronic device according to an embodiment of the present invention;
fig. 7 is an internal structural diagram of an implementation manner of a first electronic device according to an embodiment of the present invention;
fig. 8 is a structural diagram of an implementation manner of a second electronic device according to an embodiment of the present invention;
fig. 9 is a block diagram of another implementation manner of a first electronic device according to an embodiment of the present invention;
fig. 10 is a block diagram of another implementation manner of a second electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The display screen processing method provided by the embodiment of the invention can be applied to the first electronic equipment or the second electronic equipment.
The first electronic device may be a wearable electronic device, e.g., a head mounted display.
The first electronic device includes a lens module that can display one or more virtual display screens. The first electronic device and the second electronic device may be connected, for example through an adapter, through Bluetooth, or through Wi-Fi; this is not specifically limited in the embodiments of the present invention. Fig. 1 illustrates an example in which the first electronic device and the second electronic device are connected via an adapter.
The second electronic device may be an electronic device such as a desktop computer, a mobile terminal (e.g., a smartphone or a laptop), or a tablet such as an iPad.
The second electronic device includes a physical display screen, and the virtual display screen displayed by the first electronic device is used to display one or more window interfaces of the physical display screen; that is, the virtual display screen is equivalent to an extension screen of the physical display screen.
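The extension-screen relationship above can be sketched in a few lines of Python. This is an illustrative data model, not part of the patent: the `VirtualScreen` class, the round-robin distribution policy, and all names are assumptions chosen for demonstration.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualScreen:
    """One virtual display screen shown by the first electronic device."""
    screen_id: int
    # Window interfaces mirrored from the physical display screen.
    window_ids: list = field(default_factory=list)

def extend_to_virtual(physical_windows, n_virtual):
    """Distribute window interfaces of the physical screen across virtual
    screens, treating each virtual screen as an extension screen.
    Round-robin assignment is an illustrative policy only."""
    screens = [VirtualScreen(i) for i in range(n_virtual)]
    for idx, win in enumerate(physical_windows):
        screens[idx % n_virtual].window_ids.append(win)
    return screens
```

For example, three windows spread over two virtual screens land as two on the first and one on the second.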
The display screen processing method provided by the embodiments of the present invention can be applied to various application scenarios; the embodiments provide, but are not limited to, the following.
First application scenario:
the method comprises the steps that the first electronic equipment determines using scene information of a virtual display screen; and adjusting display parameters of the virtual display screen based on the usage scenario information.
In a first application scenario, the display screen processing method is implemented entirely in the first electronic device.
Second application scenario:
fig. 1 is a schematic diagram of a display screen processing system according to an embodiment of the present invention. The display screen processing system comprises a first electronic device 11 and a second electronic device 12.
The second electronic device 12 determines the usage scenario information of the virtual display screen, and sends the usage scenario information of the virtual display screen to the first electronic device 11, so that the first electronic device 11 adjusts the display parameters of the virtual display screen based on the usage scenario information.
In a second application scenario, a usage scenario of the virtual display screen is obtained by the second electronic device 12.
The third application scenario:
the second electronic device 12 determines second usage scenario information of the virtual display screen and sends the second usage scenario information of the virtual display screen to the first electronic device 11, the first electronic device 11 may also determine first usage scenario information of the virtual display screen, and the first electronic device 11 adjusts display parameters of the virtual display screen based on the first usage scenario information and the second usage scenario information.
In a third application scenario, both the second electronic device 12 and the first electronic device 11 may determine usage scenario information of the virtual display screen.
A fourth application scenario:
the first electronic device 11 may determine usage scenario information of a virtual display screen displayed by itself, and send the usage scenario information to the second electronic device 12; the second electronic device 12 adjusts the display parameters of the virtual display screen displayed by the first electronic device 11 based on the usage scenario information of the virtual display screen.
The manner in which the second electronic device 12 adjusts the first electronic device 11 may include:
the second electronic device 12 obtains the display parameter based on the usage scenario information of the virtual display screen, and sends the display parameter to the first electronic device 11, and the first electronic device 11 displays the virtual display screen based on the display parameter.
Alternatively,
the second electronic device 12 obtains, based on the usage scenario information of the virtual display screen, first indication information for instructing the first electronic device 11 to adjust the display parameters of the virtual display screen, and sends the first indication information to the first electronic device 11; the first electronic device 11 then adjusts the display parameters of the virtual display screen based on the first indication information.
In a fourth application scenario, the usage scenario information of the virtual display screen is completely obtained by the first electronic device 11.
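The two adjustment manners just described can be sketched from the second device's side. This is a hedged illustration: the message fields, the `distance_m` key, and the 0.5 m threshold are assumptions, not values specified by the patent.

```python
def build_adjustment_message(scenario_info, send_parameters=True):
    """Second-device side: build the message sent to the first device.

    send_parameters=True corresponds to the first manner (the second device
    computes concrete display parameters and sends them); False corresponds
    to the second manner (only first indication information is sent, and
    the first device computes the parameters itself). Field names and the
    0.5 m threshold are illustrative assumptions.
    """
    if send_parameters:
        # Manner 1: compute a concrete display parameter here.
        size = "small" if scenario_info.get("distance_m", 1.0) < 0.5 else "large"
        return {"type": "display_parameters", "display_size": size}
    # Manner 2: forward only an indication plus the scenario information.
    return {"type": "indication",
            "action": "adjust_display_parameters",
            "scenario": scenario_info}
```

Either message shape lets the first electronic device end up with adjusted display parameters; the difference is only where the computation happens.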
Fifth application scenario:
the first electronic device 11 may determine first usage scenario information of a virtual display screen displayed by itself, and send the first usage scenario information to the second electronic device 12; the second electronic device 12 may itself determine second usage scenario information for the virtual display screen; the second electronic device 12 adjusts the display parameters of the virtual display screen based on the first usage scenario information and the second usage scenario information of the virtual display screen.
In a fifth application scenario, both the first electronic device 11 and the second electronic device 12 may determine a usage scenario of the virtual display screen. The second electronic device 12 combines the two kinds of usage scenario information to adjust the display parameters of the virtual display screen.
Based on the two kinds of usage scenario information, the manner in which the second electronic device 12 adjusts the display parameters of the virtual display screen may be the same as the manner in which the second electronic device 12 adjusts the first electronic device 11 described in the fourth application scenario; details are not repeated here.
The following describes, with reference to the first, second, and third application scenarios, the display screen processing method applied to the first electronic device according to an embodiment of the present invention. Fig. 2 is a method flowchart of one implementation of this method, which includes:
step S201: and acquiring the use scene information of the virtual display screen.
The virtual display screen mentioned in step S201 refers to one or more virtual display screens displayed by the first electronic device in a broad sense.
Methods for acquiring the usage scenario information of the virtual display screen include, but are not limited to:
acquiring behavior information of the user with respect to at least one of the virtual display screens, and/or acquiring window interface information displayed by at least one of the virtual display screens.
The number of "at least one virtual display screen" may be one or more. Acquiring the user's behavior information with respect to the at least one virtual display screen, and/or acquiring the window interface information displayed by the at least one virtual display screen, includes:
determining the at least one virtual display screen from the virtual display screens;
and acquiring the behavior information for the at least one virtual display screen, and/or the window interface information displayed by the at least one virtual display screen.
Implementations of determining the at least one virtual display screen from the virtual display screens displayed by the first electronic device include, but are not limited to, the following methods.
In the first method, the virtual display screen at which the focus position corresponding to the user's gaze direction is located is determined from the virtual display screens displayed by the first electronic device.
There are various methods for acquiring the user's gaze direction; the embodiments of the present invention provide, but are not limited to, the following.
the first method for acquiring the sight direction of a user comprises the following steps:
a near infrared sensor NIR sensor may be provided in the first electronic device. When a user wears a first electronic device (such as a head-mounted electronic device), infrared light emitted by the near-infrared sensor illuminates eyes of the user, at the moment, the iris of the eyes reflects the infrared light, and the position parameters of eyeballs of the user can be determined by detecting the reflected light through the near-infrared sensor, wherein the position parameters can be specific positions and directions of the eyeballs, so that the sight line direction of the user is obtained.
Specifically, the method includes: acquiring the coordinate position of the user's eyeball on a first preset plane, and acquiring the included angle between the user's eyeball and a first preset direction.
Fig. 3 is a schematic diagram of acquiring the user's gaze direction according to an embodiment of the present invention. The first electronic device may include a MEMS sensor 31, and the sensor shown in fig. 3 may be a near-infrared (NIR) sensor. In this embodiment, a first preset plane may be predefined; for example, it may be a plane A perpendicular to a reference light ray emitted by the first electronic device. The intersection point of the illumination light emitted by the first electronic device with the first preset plane is defined as the coordinate origin (0, 0), and the coordinate position of the user's eyeball on the first preset plane is then determined from the projection position, detected by the near-infrared sensor, of the light refracted by the iris on that plane. For example, the position of the user's eye in the figure may be (-1, 1), located in the first quadrant on plane A.
In addition, the included angle between the user's eyeball and a first preset direction is obtained in this embodiment. Preferably, the first preset direction is defined as the direction of a reference light ray emitted by the first electronic device, such as the horizontal direction in the figure; as can be seen from fig. 3, the included angle between the user's eyeball and the first preset direction is α.
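The two measurements above, the eyeball coordinate on the preset plane and the angle α to the reference ray, can be combined into a gaze-direction vector. The frame convention below (reference ray along +z, tilt applied toward the eyeball's offset direction) is an assumption for illustration; the patent does not fix a coordinate frame.

```python
import math

def gaze_direction(eye_xy, alpha_deg):
    """Turn the eyeball's (x, y) coordinate on the first preset plane and
    the included angle alpha with the reference ray into a unit gaze
    vector. Axis conventions are illustrative assumptions."""
    x, y = eye_xy
    r = math.hypot(x, y)
    a = math.radians(alpha_deg)
    if r == 0.0:
        # Eyeball at the coordinate origin: gaze along the reference ray.
        return (0.0, 0.0, 1.0)
    ux, uy = x / r, y / r  # unit offset direction on the preset plane
    return (math.sin(a) * ux, math.sin(a) * uy, math.cos(a))
```

With the example values from fig. 3 (eyeball at (-1, 1), α = 45°) this yields a unit vector tilted toward the first quadrant of plane A.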
It should be noted that the above definitions are only illustrative; the present invention is not limited to this way of defining the preset direction or acquiring the user's eyeball position information. For example, the plane on which the first electronic device displays the virtual scene may be defined as the first preset plane, in which case the projection position of the user's eyeball on that plane is obtained.
The second method for acquiring the user's gaze direction: the user's gaze direction is obtained using a view-angle tracking technology.
The projection direction of the user's eyeballs and the position parameters of the user's eyeballs are acquired using the view-angle tracking technology, thereby obtaining the projection direction parameters.
Alternatively, the position parameters of the optical lens module and the rotation angle of the optical lens module are acquired using the view-angle tracking technology.
The optical lens module projects light to the user's eyes so that the user can see the virtual display screen. When the user's gaze direction changes, the rotation angle and position of the optical lens module change accordingly; the projection direction parameters can therefore be obtained from the rotation angle and position parameters of the optical lens module.
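Deriving a projection direction from the lens module's rotation angle can be sketched as follows. The yaw/pitch decomposition and the axis convention (+z straight ahead, +y up) are assumptions made for the example; the patent only states that the rotation angle and position determine the projection direction.

```python
import math

def projection_direction(yaw_deg, pitch_deg):
    """Unit projection-direction vector from the optical lens module's
    rotation angles. Yaw/pitch split and axes are illustrative."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # left/right component
            math.sin(pitch),                   # up/down component
            math.cos(pitch) * math.cos(yaw))   # forward component
```

A zero rotation angle corresponds to looking straight ahead along the reference direction.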
The third method for acquiring the user's gaze direction: the user's gaze direction is obtained using a SLAM (Simultaneous Localization and Mapping) algorithm.
The SLAM algorithm can identify and detect the position of the physical display screen in real time, and can also obtain the position and rotation angle of the optical lens module in the first electronic device in real time.
The fourth method for acquiring the sight direction of the user comprises the following steps:
tracking the user's head movement and determining the user's gaze information from the head movement.
In this manner, the first electronic device may be an electronic device worn on the user's head and may optionally be provided with motion detection sensors such as an acceleration sensor. These sensors detect the user's head movement by detecting the movement of the first electronic device itself, and the user's gaze direction can be determined from the detected motion state information combined with the relative positional relationship between the user's head and the first electronic device.
In an alternative embodiment, the focus position corresponding to the user's gaze direction may switch among multiple virtual display screens; that is, at different times, the focus position corresponding to the user's gaze direction is located on different virtual display screens.
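Mapping a focus position to the virtual display screen it falls on can be sketched with a simple hit test. Modelling the screens as axis-aligned rectangles in a shared plane is an assumption for illustration; real virtual screens may be placed arbitrarily in 3D space.

```python
def screen_in_focus(focus_xy, screens):
    """Return the id of the virtual display screen whose rectangle contains
    the focus position, or None if the focus falls between screens.
    screens maps an id to an (x, y, width, height) rectangle; this flat
    layout is an illustrative assumption."""
    fx, fy = focus_xy
    for screen_id, (x, y, w, h) in screens.items():
        if x <= fx <= x + w and y <= fy <= y + h:
            return screen_id
    return None
```

Calling this at successive times reproduces the switching behavior described above: as the gaze focus moves, a different screen id is returned.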
In the second method, one or more of the virtual display screens displayed by the first electronic device 11 are determined as the at least one virtual display screen.
The usage scenario information corresponding to the at least one virtual display screen may be determined by the first electronic device itself; or it may be determined by the second electronic device and then sent to the first electronic device; or the second electronic device may determine the user's behavior information for the at least one virtual display screen (or the window interface information displayed by the at least one virtual display screen), send it to the first electronic device, and the first electronic device may determine the remaining part, namely the window interface information displayed by the at least one virtual display screen (or the user's behavior information for it).
Step S202: adjusting the display parameters of the virtual display screen based on the usage scenario information.
The display parameters may include at least one of a display position, a display shape, and a display size of the virtual display screen, where the display size includes a display area and/or a display resolution.
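For illustration only, the display parameters described above might be modeled as a simple structure. The Python names below (`DisplayParams` and its fields) are assumptions for the sketch, not terms defined by the embodiment.

```python
from dataclasses import dataclass

@dataclass
class DisplayParams:
    position: tuple     # display position of the virtual display screen, e.g. (x, y, z)
    shape: str          # display shape: "straight" or "curved"
    area: float         # display area component of the display size
    resolution: tuple   # display resolution component, e.g. (width_px, height_px)

# Example: a straight virtual screen about two meters in front of the user
params = DisplayParams(position=(0.0, 1.2, 2.0), shape="straight",
                       area=0.35, resolution=(1920, 1080))
```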
The embodiment of the invention provides a display screen processing method applied to a first electronic device. The first electronic device displays one or more virtual display screens through an optical lens module and is connected with a second electronic device that is provided with a physical display screen. The first electronic device may acquire usage scenario information of the virtual display screen and adjust display parameters of the virtual display screen based on that information; because the virtual display screen is not limited by physical devices or physical space, its display parameters, such as the display shape, can be changed freely.
The following describes an implementation manner of "adjusting display parameters of the virtual display screen based on the usage scenario information" in detail.
The first manner: if the usage scenario information of the virtual display screen displayed by the first electronic device is distance change information between the user and the at least one virtual display screen respectively, adjusting the display parameters of the virtual display screen based on the usage scenario information in the embodiment of the present invention includes: respectively adjusting the display size of the at least one virtual display screen based on the distance change information between the user and the at least one virtual display screen.
The following description will take one of the at least one virtual display screen as an example.
In an alternative embodiment, if the user approaches the virtual display screen, that is, the distance between the user and the virtual display screen changes from a first distance to a second distance with the first distance greater than the second distance, the size of the virtual display screen is reduced so that the user can still see the entire virtual display screen. If the user moves away from the virtual display screen, that is, the distance changes from a third distance to a fourth distance with the third distance smaller than the fourth distance, the size of the virtual display screen is increased so that the user can still clearly see the window interface displayed in the virtual display screen.
In an alternative embodiment, the virtual display screen and the window interface displayed in it are enlarged or reduced in the same proportion. That is, the adjustment of the virtual display screen is an overall adjustment: when the virtual display screen is scaled as a whole, the display content displayed in it is also enlarged or reduced as a whole, which is convenient for the user to watch.
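As a minimal sketch of the proportional adjustment just described (the function name and the linear-scaling policy are illustrative assumptions, not part of the embodiment):

```python
def adjust_size(current_size, old_distance, new_distance):
    """Scale a virtual display screen (and, implicitly, the window
    interface shown in it) in proportion to the change in the distance
    between the user and the screen: moving closer shrinks the screen so
    it stays fully in view; moving away enlarges it so content stays
    legible."""
    if old_distance <= 0 or new_distance <= 0:
        raise ValueError("distances must be positive")
    return current_size * (new_distance / old_distance)
```

For example, if the user doubles their distance to the screen, the display size doubles under this policy.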
Fig. 4a to 4b are schematic diagrams illustrating a change in the display size of a virtual display screen according to an embodiment of the present invention.
Fig. 4a to 4b show the process in which the user gradually moves away from the virtual display screen 41; therefore, the display size of the virtual display screen 41 in fig. 4b is larger than that in fig. 4a.
In another alternative embodiment, a user may approach one virtual display screen (referred to herein as a first virtual display screen) while moving away from the other virtual display screen (referred to herein as a second virtual display screen), where the display size of the first virtual display screen is increasing while the display size of the second virtual display screen is decreasing.
In yet another alternative embodiment, a user may approach the virtual display screen only when the window interface displayed in it is not clear: when users cannot see clearly, they subconsciously adjust, or attempt to adjust, their distance to the virtual display screen through actions such as bending forward and squinting, and if such actions are captured, the display size of the virtual display screen may be increased. Accordingly, adjusting the display size of the at least one virtual display screen based on the distance change information between the user and the at least one virtual display screen includes: obtaining, from the distance change information and the user's sight line information, intention parameters respectively corresponding to the at least one virtual display screen and representing the user's intention; and respectively adjusting the display size of the at least one virtual display screen based on those intention parameters. The following description takes one of the at least one virtual display screen as an example.
The user's sight line information may further include squinting or blinking. Squinting or blinking information can reflect, on the one hand, how clearly the user currently sees the virtual display screen and, on the other hand, how fatigued the user's eyes currently are. In the embodiment of the invention, according to the user's blinking or squinting information, the display size of the virtual display screen can be adjusted dynamically, the display can be switched to an eye-protection theme color or background color, or information such as the display brightness can be adjusted dynamically, so that the virtual display screen forms various intelligent dynamic display effects. The user's gaze information may be determined through image acquisition and/or analysis by the first electronic device, or obtained through image acquisition and/or analysis by a camera connected to the first electronic device.
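One way the intention parameter combining distance change and gaze cues might be computed is sketched below; the weights, thresholds, and function name are illustrative assumptions and not taken from the embodiment.

```python
def intention_parameter(distance_delta, squinting, blink_rate):
    """Combine distance change and gaze cues into a signed intent score:
    positive suggests the user wants the screen larger/clearer, negative
    suggests smaller. distance_delta is the change in user-to-screen
    distance (negative = leaning toward the screen); blink_rate is blinks
    per minute."""
    score = 0.0
    if distance_delta < 0:        # leaning toward the screen
        score += 1.0
    elif distance_delta > 0:      # leaning away from the screen
        score -= 1.0
    if squinting:                 # struggling to see clearly
        score += 0.5
    if blink_rate > 20:           # frequent blinking suggests eye fatigue;
        score += 0.25             # could also trigger an eye-protection theme
    return score
```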
The second manner: if the usage scenario information respectively corresponding to the at least one virtual display screen is the sight line projection positions of the user respectively corresponding to the at least one virtual display screen, adjusting the display parameters of the virtual display screen based on the usage scenario information in the embodiment of the present invention includes: dynamically adjusting the position of the at least one virtual display screen in the horizontal direction and/or the vertical direction respectively based on the user's sight line projection position.
Fig. 4c is a schematic diagram illustrating another variation of the display size of the virtual display screen according to the embodiment of the present invention.
In fig. 4c, the position of the seat of the user 42 does not change, but the body of the user 42 leans toward the virtual display screen 43, changing from the solid line 421 to the broken line 422, that is, bending in a direction approaching the virtual display screen 43; the display size of the virtual display screen 43 is therefore increased, becoming the virtual display screen 43 indicated by the broken line pointed to by the arrow.
Conversely, when looking at the window interface displayed by the virtual display screen, if the user feels that the window interface is displayed too large to be seen clearly, the user may actively move away from the virtual display screen, or may subconsciously adjust, or attempt to adjust, the distance to the virtual display screen through actions such as leaning backwards and squinting; if such actions are captured, the display size of the virtual display screen can be reduced.
In an alternative embodiment, the virtual display screen and the window interface displayed in it are enlarged or reduced in the same proportion. That is, the adjustment of the virtual display screen is an overall adjustment: when the virtual display screen is scaled as a whole, the display content displayed in it is also enlarged or reduced as a whole, which is convenient for the user to watch.
The third manner: if the usage scenario information respectively corresponding to the at least one virtual display screen is the sight line directions of the user respectively corresponding to the at least one virtual display screen, adjusting the display parameters of the virtual display screen based on the usage scenario information in the embodiment of the present invention includes: dynamically adjusting the orientation of the at least one virtual display screen based on the user's sight line direction respectively, so that the orientation of the at least one virtual display screen changes with the user's sight line direction.
The following description will take one of the at least one virtual display screen as an example.
In an optional embodiment, the at least one virtual display screen is a virtual display screen where a focus position corresponding to the user's gaze direction is located.
For example, if the user's sight line direction is toward the southeast, the virtual display screen is oriented toward the northwest; if the user looks up, for example watching a video while lying down, the user's sight line direction is from bottom to top and the virtual display screen faces downward. That is, the user's sight line direction is directly opposite the virtual display screen, and preferably the user's head corresponds to the center position of the virtual display screen, which is convenient for viewing. Fig. 4d to 4e are schematic diagrams illustrating orientation changes of a virtual display screen according to an embodiment of the present invention.
Fig. 4d shows that the focus position corresponding to the sight line direction of the user 42 is on the virtual display screen 41; the user looks at the virtual display screen 41 from the left, so the virtual display screen 41 faces right, and the user's sight line is directly opposite the virtual display screen 41, which is convenient for viewing. Fig. 4e shows that the sight line of the user 42 is directed to the right, and the virtual display screen 41 faces left, so that again the user's sight line is directly opposite the virtual display screen 41.
In summary, if the user turns his head, the user's sight line direction changes and the orientation of the virtual display screen changes dynamically with it, as if the virtual display screen rotates around the user in space as the head turns. In this embodiment, the orientation of the virtual display screen is dynamically adjusted according to the user's sight line direction, so that the virtual display screen dynamically tracks the user's line of sight and is always displayed within the user's visual range; thus the user does not need to search everywhere for the virtual display screen to be watched.
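The gaze-tracking orientation rule above (screen always directly opposite the sight line) can be sketched as follows; representing directions as 3-vectors is an assumption of the sketch.

```python
def screen_orientation(gaze_direction):
    """Return the screen's facing direction as the opposite of the user's
    gaze direction, so the screen surface is always directly opposite the
    line of sight (e.g. gaze pointing southeast -> screen facing
    northwest, gaze pointing up -> screen facing down)."""
    gx, gy, gz = gaze_direction
    return (-gx, -gy, -gz)
```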
The following description will take one of the at least one virtual display screen as an example.
With reference to fig. 4f to 4h: in fig. 4f the virtual display screen 43 is at its default position, and assuming the focus position corresponding to the user's gaze direction is on the virtual display screen 41, if the user's gaze projection position moves downward, the display position of the virtual display screen 41 moves downward; the dotted line in fig. 4f marks where the upper edge of the virtual display screen 41 was aligned before it moved down.
In fig. 4g, the user's gaze projection position shifts upward, for example when the user stands up to view the virtual display screen 41, so the display position of the virtual display screen 41 shifts upward; the dotted line in fig. 4g marks where the upper edge of the virtual display screen 41 was aligned before it moved up.
In fig. 4h, the user's gaze projection position shifts to the left, for example when the user moves to the left, so the display position of the virtual display screen 41 shifts to the left; the dotted line in fig. 4h marks where the right edge of the virtual display screen 41 was aligned before it moved left.
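The horizontal/vertical repositioning shown in figs. 4f to 4h amounts to moving the focused screen toward the gaze projection point. A sketch follows; the smoothing factor `alpha` and the function name are assumptions for illustration.

```python
def follow_gaze_projection(screen_pos, gaze_point, alpha=1.0):
    """Shift the focused virtual screen's (horizontal, vertical) position
    toward the user's gaze projection point. alpha=1.0 snaps the screen
    fully to the gaze point; smaller values give a smoothed follow."""
    sx, sy = screen_pos
    gx, gy = gaze_point
    return (sx + alpha * (gx - sx), sy + alpha * (gy - sy))
```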
The fourth manner: if the usage scenario information respectively corresponding to the at least one virtual display screen is the distance change information between the user and the at least one virtual display screen, adjusting the display parameters of the virtual display screen based on the usage scenario information in the embodiment of the present invention includes: respectively and dynamically adjusting the bending form of the at least one virtual display screen based on the distance change information between the user and the at least one virtual display screen.
The following description will take one of the at least one virtual display screen as an example.
The virtual display screen may be adjusted into a curved screen based on the distance change information between the user and the virtual display screen; that is, the bending form of the virtual display screen can be adjusted based on that distance change information.
In an optional embodiment, the dynamically adjusting the bending form of the at least one virtual display screen based on the distance variation information between the user and the at least one virtual display screen respectively includes:
for each virtual display screen, obtaining at least one row of pixels from the virtual display screen, where each row of pixels is parallel to a horizontal plane;
and for each row of pixels, adjusting the bending form of the row so that each pixel in the row is at an equal distance from the user, thereby obtaining the bending forms corresponding to the at least one row of pixels and thus the bending form corresponding to the at least one virtual display screen.
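The two steps above can be sketched in Python. The parameterization (head at the origin, a fixed subtended arc angle, 2D cross-section coordinates) is an illustrative assumption; the key property, that every pixel in a row ends up equidistant from the user, is the one stated in the embodiment.

```python
import math

def bend_row(num_pixels, radius, arc_width, center=(0.0, 0.0)):
    """Place one horizontal row of pixels on a circular arc of the given
    radius centered on the user's head, so every pixel in the row is the
    same distance from the user. arc_width is the angle (radians) the
    row subtends; coordinates are a top-down cross-section (x, z)."""
    cx, cz = center
    points = []
    for i in range(num_pixels):
        # spread pixels evenly across the arc, centered in front of the user
        theta = -arc_width / 2 + arc_width * i / (num_pixels - 1)
        points.append((cx + radius * math.sin(theta),
                       cz + radius * math.cos(theta)))
    return points

row = bend_row(5, radius=2.0, arc_width=math.pi / 3)
# every pixel in the row sits the same distance (the radius) from the head
distances = [math.hypot(x, z) for x, z in row]
```

A larger user-to-screen distance corresponds to a larger radius and hence a smaller curvature, matching the comparison between figs. 4i and 4j below.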
The following examples are given.
Fig. 4i is a cross-sectional view of a physical display screen and a virtual display screen according to an embodiment of the present invention.
Assuming the focus position corresponding to the user's sight line direction is on the virtual display screen 41, fig. 4i shows a cross-sectional view of the virtual display screen 41 after bending, a cross-sectional view of the physical display screen 40, and a cross-sectional view of the virtual display screen 43. It can be seen that the cross-section of the virtual display screen 41 may be an arc of a circle, with the position of the user, preferably the user's head or eyeball, located at the center of the circle corresponding to that arc.
Fig. 4j is another cross-sectional view of a physical display screen and a virtual display screen according to an embodiment of the present invention.
The distance between the user and the virtual display screen 41 in fig. 4j is greater than in fig. 4i, so the bending degree of the virtual display screen 41 is less than in fig. 4i. Nevertheless, the cross-section of the virtual display screen 41 is still an arc of a circle, with the position of the user, preferably the user's head or eyeball, located at the center of the circle corresponding to that arc.
In summary, the bending curvature of the virtual display screen 41 can be automatically adjusted according to the distance from the user to the virtual display screen, so as to ensure that the head of the user is always at the center of the circle corresponding to the arc formed by each cross section of the virtual display screen 41, that is, the distances from each point on the arc formed by each cross section of the virtual display screen 41 to the head of the user are respectively equal. Since the user is further away from the virtual display screen 41 in fig. 4j than in fig. 4i, the curvature of the cross-section of the virtual display screen 41 in fig. 4j is smaller than the curvature of the cross-section of the virtual display screen 41 in fig. 4 i.
The virtual display screen has a plurality of cross sections parallel to the horizontal plane; for any such cross section, the row of pixels corresponding to it forms an arc in which each point is at an equal distance from the user's head.
The fifth manner: if the usage scenario information is the window interface displayed by a first virtual display screen of the at least one virtual display screen, adjusting the display parameters of the virtual display screen based on the usage scenario information in the embodiment of the present invention includes: acquiring window interface information displayed by the first virtual display screen; and adjusting the first virtual display screen to be a straight screen or a curved screen based on the displayed window interface information.
The following description will take one of the at least one virtual display screen as an example.
The first electronic device 11 or the second electronic device 12 may identify the type of the application program to which the window interface displayed in the virtual display screen belongs; the window interface information may be this application type. In an optional embodiment, if the application program to which the displayed window interface belongs is an entertainment application, the virtual display screen may be controlled to be a curved screen, since a curved screen improves the user's immersion in entertainment applications such as games or movies; if the application program is one the user does not want deformed, for example a high-precision working application such as CAD, the virtual display screen may be controlled to be a straight screen.
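A sketch of this application-type mapping is given below; the category sets and the conservative straight-screen default are assumptions, not part of the embodiment.

```python
def screen_shape_for(app_type):
    """Map the application type of the displayed window interface to a
    screen shape: entertainment apps (games, video) get a curved screen
    for immersion; high-precision working apps (e.g. CAD) get a straight
    screen to avoid deformation."""
    entertainment = {"game", "video", "movie"}
    precision = {"cad", "drafting"}
    if app_type.lower() in entertainment:
        return "curved"
    if app_type.lower() in precision:
        return "straight"
    return "straight"  # assumed default: avoid deforming unknown app types
```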
For a method of controlling the virtual display screen to be a curved screen, refer to the fourth manner; details are not described here again.
As shown in fig. 4k, the application program to which the window interface displayed on the virtual display screen 43 belongs is an entertainment application program, and therefore the virtual display screen 43 is controlled to be a curved screen. As shown in fig. 4l, since the application program to which the window interface displayed on the virtual display screen 43 belongs is an application program such as a high-precision job class, the virtual display screen 43 is a straight screen.
Fig. 4a to 4l are only examples; the embodiment of the present invention does not limit the number and/or the display positions of the virtual display screens displayed by the optical lens module of the first electronic device. For example, the number of virtual display screens may be one, two, three, four, five, six, seven, or eight, and the position of a virtual display screen displayed by the first electronic device through the optical lens module may be on the left side, and/or the right side, and/or the upper side, and/or the lower side of the physical display screen.
In summary, the five implementation manners may each be used separately, or any two, any three, any four, or all five of them may be combined.
For example, the first implementation manner and the second implementation manner are combined, and a virtual display screen in the at least one virtual display screen is taken as an example for description.
The display size, bending form, and orientation of the virtual display screen can be dynamically adjusted based on the distance change information between the user and the virtual display screen together with the user's sight line direction. For example, if the distance between the user and the virtual display screen goes from near to far and the user's sight line is directed to the left, then optionally the display size of the virtual display screen is increased, the virtual display screen is controlled to be a curved screen, and the orientation of the virtual display screen is adjusted to the right.
Other combinations are similar and will not be described in detail here.
It can be understood that, in practical applications, the distance between the user and the virtual display screen, the user's sight line direction, the user's sight line projection position, and the window interface displayed in the virtual display screen may all change. Assume the distance between the user and the virtual display screen goes from near to far, the user's sight line is directed to the right, the user's sight line projection position moves upward, and the application program to which the displayed window interface belongs is an entertainment application; then the display size of the virtual display screen becomes larger, the virtual display screen becomes curved, the display position of the virtual display screen moves upward, and the virtual display screen faces left.
In practical applications, the first electronic device displays a plurality of virtual display screens through the optical lens module. If the second electronic device is in a screen saver state or a screen off state, the physical display screen does not display any window interface, and the focus position corresponding to the user's sight line direction should be on a virtual display screen. In an optional embodiment, the display positions of at least two virtual display screens are moved so that the at least two virtual display screens are spliced together.
As shown in fig. 5a to 5b, schematic diagrams of a plurality of virtual display screens provided by the embodiment of the present invention after being spliced are shown.
After the plurality of virtual display screens are spliced, for example, the virtual display screen 41 and the virtual display screen 43 may be used as independent virtual display screens, or may be used as an integrated spliced display screen.
Fig. 5a and 5b take the second electronic device being a notebook computer as an example; the notebook computer is closed so that its physical display screen 40 folds onto its physical keyboard, placing the notebook computer in a screen saver state or a screen off state.
Fig. 5a shows the effect after the virtual display screen 41 and the virtual display screen 43 are spliced; fig. 5b shows the effect after the first electronic device displays six virtual display screens through the optical lens module and the six virtual display screens are spliced.
After the plurality of virtual display screens are spliced, the virtual display screens can be used as independent virtual display screens respectively, and can also be used as an integral spliced display screen.
Fig. 5a and 5b both show that after the virtual display screens are spliced, each virtual display screen is still used as an independent virtual display screen, for example, the virtual display screen 51 in fig. 5b displays a first window interface; the virtual display 52 displays a second window interface and the virtual display 53 displays a third window interface.
Generally, if the user's eyes need to switch among a plurality of virtual display screens whose display resolutions are not consistent, switching from a high-resolution virtual display screen to a low-resolution one may make the display insufficiently sharp, while switching from a low-resolution virtual display screen to a high-resolution one may make the display excessively sharp; either way, the user's eyes tire easily. Thus, in a preferred embodiment, the display resolution of each virtual display screen is the same.
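The preferred uniform-resolution rule could be enforced as below; choosing the highest available resolution as the common target is an assumed policy for the sketch.

```python
def unify_resolution(resolutions):
    """Give all virtual display screens the same display resolution so the
    user's eyes need not readjust when switching between screens.
    resolutions is a list of (width_px, height_px) tuples; the highest
    resolution among them is used as the shared target."""
    target = max(resolutions, key=lambda wh: wh[0] * wh[1])
    return [target for _ in resolutions]
```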
In an optional embodiment, if the physical display screen is in a screen saver state or a screen off state, at least two virtual display screens are spliced to form a spliced display screen.
The at least two virtual display screens are spliced to form an integral spliced display screen, which can be controlled to display at least one window interface as a whole. As shown in fig. 5c, the spliced display screen 54 is formed by splicing six virtual display screens and can display a window interface 541 as an integral display screen.
The spliced display screen can also be adjusted to be a curved screen, and the specific method is the same as the method for adjusting the bending form of the single virtual display screen, and is not repeated here. For example, for a tiled display screen, at least one row of pixels is obtained from the tiled display screen, each row of pixels being parallel to a horizontal plane;
and aiming at each row of pixels, adjusting the bending form of the row to ensure that the distances between each pixel in the row and the user are equal, so as to obtain the bending form corresponding to the at least one row of pixels, and thus obtain the bending form corresponding to the spliced display screen.
In practical application, the first electronic device is provided with a plurality of virtual display screens through the optical lens module. If the second electronic device is in a bright screen state, in an optional embodiment, the display position of the at least one virtual display screen is moved, so that the at least one virtual display screen and the physical display screen are spliced together to form a spliced display screen.
At least one virtual display screen and the physical display screen are spliced to form an integral spliced display screen. In an optional embodiment, both the virtual display screens and the physical display screen included in the spliced display screen can be used as independent display screens whose displayed window interfaces do not interfere with one another, as shown in fig. 5d: the physical display screen 40 displays one window interface, and the virtual display screen 54 and the virtual display screen 55 each display other window interfaces, without mutual interference between the screens.
In another alternative embodiment, the spliced display screen can display at least one window interface as an integral display screen.
As shown in fig. 5e, the spliced display screen 56 is formed by splicing five virtual display screens and the physical display screen, and can display a window interface as a whole.
It is understood that the physical display screen may be a curved screen or a straight screen. If the physical display screen is a curved screen, the spliced display screen is adjusted to be a curved screen so that, visually, it appears as one integral display screen; if the physical display screen is a straight screen, the spliced display screen is adjusted to be a straight screen for the same reason.
It can be understood that, since the bending form of the physical display screen is fixed and cannot be changed, adjusting the bending form of the spliced display screen depends on the bending form of the physical display screen and the position of the physical display screen within the spliced display screen.
To sum up, if the physical display screen is a curved screen, adjusting the spliced display screen to be a curved screen includes:
if the physical display screen is a curved surface screen, determining the position of the physical display screen in the spliced display screen;
and adjusting the bending form of the spliced display screen based on the bending form of the physical display screen and the position of the physical display screen in the spliced display screen.
Assuming the physical display screen is located at the lower left of the spliced display screen, and taking fig. 5e as an example: the bending form of the virtual display screen 58 is the same as that of the physical display screen 40; the bending forms of the virtual display screen 57 and the virtual display screen 54 are the same; and the bending forms of the virtual display screen 55 and the virtual display screen 59 are the same.
The process of adjusting the bending form of the tiled display screen is the same as the method of adjusting the bending form of a single virtual display screen, and is not described herein again. For example, for a tiled display screen, at least one row of pixels is obtained from the tiled display screen, each row of pixels being parallel to a horizontal plane;
and aiming at each row of pixels, adjusting the bending form of the row to ensure that the distances between each pixel in the row and the user are equal, so as to obtain the bending form corresponding to the at least one row of pixels, and thus obtain the bending form corresponding to the spliced display screen.
A display screen processing method applied to a second electronic device is described below with reference to a fourth application scenario and a fifth application scenario, and as shown in fig. 6, is a flowchart of the display screen processing method applied to the second electronic device according to the embodiment of the present invention, where the method includes:
step S601: the method comprises the steps of obtaining the use scene information of the virtual display screen displayed by the first electronic equipment through the optical lens module.
The number of the virtual display screens displayed by the first electronic equipment through the optical lens module is one or more.
In an alternative embodiment, the usage scenario information of the virtual display screen obtained by the first electronic device includes, but is not limited to: behavior information of a user for at least one of the virtual display screens, and/or window interface information displayed by at least one of the virtual display screens.
Step S602: adjusting display parameters of a virtual display screen in the first electronic device based on the usage scenario information.
In step S602, the second electronic device 12 may adjust the display parameters of the virtual display screen in the first electronic device based on the usage scenario information of the virtual display screen obtained from the first electronic device.
In an alternative embodiment, the second electronic device 12 may also obtain usage scenario information of the virtual display screen.
At this time, step S602 may include: adjusting the display parameters of the virtual display screen in the first electronic device based on both the usage scenario information of the virtual display screen obtained from the first electronic device and the usage scenario information of the virtual display screen determined by the second electronic device itself.
The manner in which the second electronic device obtains the usage scenario information of the virtual display screen is the same as that of the first electronic device; reference may be made to the description of the first electronic device obtaining the usage scenario information of the virtual display screen, which is not repeated here.
In an alternative embodiment, step S602 may be implemented in various ways, and the following ways are provided in the embodiments of the present invention.
The first method is as follows:
generating first indication information based on the usage scenario information, where the first indication information is used for instructing the first electronic device to adjust the display parameters of the virtual display screen; and sending the first indication information to the first electronic device.
If the usage scenario information of the virtual display screens is behavior information of a user for at least one virtual display screen in the virtual display screens, generating the first indication information based on the usage scenario information includes:
generating first indication information for respectively adjusting the display size of the at least one virtual display screen and/or respectively dynamically adjusting the bending form of the at least one virtual display screen based on the distance change information of the user from the at least one virtual display screen;
and/or,
generating first indication information for dynamically adjusting the orientation of the at least one virtual display screen respectively based on the sight line direction of the user, so that the orientation of the at least one virtual display screen changes along with the change of the sight line direction of the user;
and/or,
and generating first indication information for dynamically adjusting the position of the at least one virtual display screen in the horizontal direction and/or the vertical direction respectively based on the sight line projection position of the user.
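The gaze-driven adjustments listed above can be illustrated by recomputing the screen's pose from the gaze ray: the screen centre follows the gaze projection and the screen normal points back at the user, so orientation and position track the line of sight. The flat-pose model and all names are assumptions for illustration:

```python
import math

def follow_gaze(eye, gaze_dir, distance):
    """Recompute a virtual screen's pose from the user's gaze: the screen
    centre is placed `distance` along the gaze ray, and the screen normal
    points back toward the user, so the screen orientation follows the gaze
    direction and its position follows the gaze projection."""
    norm = math.sqrt(sum(c * c for c in gaze_dir))
    unit = tuple(c / norm for c in gaze_dir)
    centre = tuple(e + distance * c for e, c in zip(eye, unit))
    normal = tuple(-c for c in unit)  # screen faces the user
    return centre, normal
```

Calling this each time the gaze changes keeps the virtual screen facing the user, which is the behaviour the first indication information is meant to trigger.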
If the usage scenario information of the virtual display screens is window interface information displayed by at least one virtual display screen in the virtual display screens, generating the first indication information based on the usage scenario information includes:
acquiring window interface information displayed by a first virtual display screen in the at least one virtual display screen;
and generating first indication information for adjusting the first virtual display screen to be a straight screen or a curved screen based on the displayed window interface information.
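One simple way to derive the straight-or-curved indication from window interface information is a lookup on the window type. The mapping below is an illustrative assumption (the patent does not prescribe which interfaces get which shape), as are the field names:

```python
# Illustrative assumption: immersive window types read better on a curved
# screen, text-oriented types on a straight (flat) one.
CURVED_WINDOW_TYPES = {"video", "game", "panorama"}

def shape_indication(screen_id, window_type):
    """Build first indication information telling the first device whether
    to render the given virtual screen as a straight or a curved screen."""
    shape = "curved" if window_type in CURVED_WINDOW_TYPES else "straight"
    return {"indication": "adjust_shape", "screen": screen_id, "shape": shape}
```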
The first electronic device may obtain display parameters respectively corresponding to the at least one virtual display screen based on the first indication information.
The second method comprises the following steps:
obtaining and adjusting display parameters of a virtual display screen in the first electronic device based on the usage scenario information; and sending the display parameters to the first electronic device.
If the usage scenario of the virtual display screen is behavior information of a user for at least one virtual display screen in the virtual display screens, obtaining and adjusting display parameters of the virtual display screen in the first electronic device based on the usage scenario information, including:
obtaining display sizes respectively corresponding to the at least one virtual display screen and/or obtaining a bending form of the at least one virtual display screen based on distance change information of the user and the at least one virtual display screen;
and/or,
respectively obtaining the orientation of the at least one virtual display screen based on the sight line direction of the user, so that the orientation of the at least one virtual display screen changes along with the change of the sight line direction of the user;
and/or,
and respectively obtaining the position of the at least one virtual display screen in the horizontal direction and/or the position of the at least one virtual display screen in the vertical direction based on the sight line projection position of the user.
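For the distance-driven part of the list above, one plausible rule for obtaining the display size is to scale it in proportion to the user's distance, so the screen's apparent size stays constant as the user moves. A sketch under that assumption (function and parameter names are illustrative):

```python
def rescale_for_distance(base_size, base_distance, current_distance):
    """Scale the virtual screen's (width, height) in proportion to the
    user's current distance so that its apparent (angular) size stays
    constant as the user moves closer or farther away."""
    factor = current_distance / base_distance
    return tuple(side * factor for side in base_size)
```

For example, a 0.4 m x 0.3 m screen calibrated at 1 m would be doubled to 0.8 m x 0.6 m when the user steps back to 2 m.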
If the usage scenario of the virtual display screen is window interface information displayed by at least one virtual display screen in the virtual display screen, obtaining and adjusting display parameters of the virtual display screen in the first electronic device based on the usage scenario information, including:
acquiring window interface information displayed by a first virtual display screen in the at least one virtual display screen;
and obtaining the adjustment information of the straight screen or the curved screen corresponding to the first virtual display screen based on the displayed window interface information.
In an optional embodiment, the display screen processing method applied to the second electronic device further includes:
if the physical display screen is in a screen saver state or a screen off state, generating second indication information for controlling at least two virtual display screens in the first electronic equipment to be spliced to form a spliced display screen;
or,
and if the physical display screen is in a bright screen state, generating third indication information for controlling the physical display screen and the at least one virtual display screen of the first electronic device to be spliced to form a spliced display screen.
After receiving the second indication information, the first electronic device may generate a control instruction for controlling the at least two virtual display screens to be spliced to form a spliced display screen; or, the second indication information is used as a corresponding control instruction.
After receiving the third indication information, the first electronic device may generate a control instruction for controlling the physical display screen and the at least one virtual display screen of the first electronic device to be spliced to form a spliced display screen, or use the third indication information as a corresponding control instruction.
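The choice between the second and third indication information described above is a simple branch on the physical screen's state. A sketch, with state values and message fields that are assumptions for illustration:

```python
def splice_indication(physical_state, virtual_ids):
    """Choose between the second and third indication information based on
    the state of the second device's physical screen: in screen-saver or
    screen-off state, only virtual screens are spliced; with the screen
    lit, the physical screen is spliced together with the virtual ones."""
    if physical_state in ("screen_saver", "off"):
        if len(virtual_ids) < 2:
            raise ValueError("splicing virtual screens needs at least two")
        return {"indication": "second", "splice": list(virtual_ids)}
    return {"indication": "third", "splice": ["physical"] + list(virtual_ids)}
```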
In an optional embodiment, the display screen processing method applied to the second electronic device further includes:
and controlling the spliced display screen to display at least one window interface as an integral display screen, or generating fourth indication information for controlling the spliced display screen to display at least one window interface as an integral display screen.
In the case that the physical display screen is in a bright screen state and the physical display screen and the at least one virtual display screen are spliced to form a spliced display screen, the display screen processing method applied to the second electronic device further includes:
if the physical display screen is a straight screen, generating fifth indication information for adjusting the spliced display screen to be a straight screen;
and if the physical display screen is a curved screen, generating sixth indication information for adjusting the spliced display screen to be a curved screen.
After receiving the fifth indication information, the first electronic device may generate a control instruction for controlling the spliced display screen to be a straight screen; or, the fifth indication information is used as the corresponding control instruction.
After receiving the sixth indication information, the first electronic device may generate a control instruction for controlling the spliced display screen to be a curved screen, or use the sixth indication information as the corresponding control instruction.
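The fifth-or-sixth choice amounts to propagating the physical screen's shape to the whole spliced screen. A sketch, where the field names and the bend-radius model are assumptions, not something the patent specifies:

```python
def spliced_shape_indication(physical_shape, physical_radius=None):
    """Pick the fifth or sixth indication information so the spliced
    screen matches the physical screen: a straight physical screen keeps
    the spliced screen straight; a curved one propagates its bend radius
    to the whole spliced screen."""
    if physical_shape == "curved":
        return {"indication": "sixth", "shape": "curved",
                "radius": physical_radius}
    return {"indication": "fifth", "shape": "straight", "radius": None}
```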
As shown in fig. 7, an internal structure diagram of an implementation manner of a first electronic device provided in an embodiment of the present invention is shown, where the first electronic device includes:
the optical lens module 71 is used for displaying a virtual display screen;
a first obtaining module 72, configured to obtain usage scenario information of the virtual display screen;
a first adjusting module 73, configured to adjust a display parameter of the virtual display screen based on the usage scenario information.
Optionally, the first obtaining module 72 includes:
the first acquisition unit is used for acquiring behavior information of a user aiming at least one virtual display screen in the virtual display screens; and/or the second acquiring unit is used for acquiring window interface information displayed by at least one of the virtual display screens.
Optionally, if the obtaining of the usage scenario of the virtual display screens is to obtain behavior information of a user for at least one of the virtual display screens, the first adjusting module 73 includes:
the first adjusting unit is used for respectively adjusting the display size of the at least one virtual display screen and/or respectively and dynamically adjusting the bending form of the at least one virtual display screen based on the distance change information between the user and the at least one virtual display screen;
and/or,
a second adjusting unit, configured to dynamically adjust the orientation of the at least one virtual display screen based on the gaze direction of the user, respectively, so that the orientation of the at least one virtual display screen changes along with the change of the gaze direction of the user;
and/or,
and the third adjusting unit is used for dynamically adjusting the position of the at least one virtual display screen in the horizontal direction and/or the vertical direction respectively based on the sight line projection position of the user.
Optionally, the first adjusting unit includes:
the first acquisition subunit is used for acquiring at least one line of pixels from each virtual display screen, wherein each line of pixels is parallel to the horizontal plane;
the first adjusting subunit is configured to adjust the bending form of each line of pixels, so that distances between each pixel in the line and the user are equal, and thus, bending forms corresponding to the at least one line of pixels are obtained, and thus, bending forms corresponding to the at least one virtual display screen are obtained.
Optionally, if the obtaining of the usage scenario of the virtual display screen is to obtain window interface information displayed by at least one virtual display screen in the virtual display screen, the first adjusting module 73 includes:
the second acquisition unit is used for acquiring window interface information displayed by a first virtual display screen in the at least one virtual display screen;
and the fourth adjusting unit is used for adjusting the first virtual display screen to be a straight screen or a curved screen based on the displayed window interface information.
Optionally, when the first electronic device displays a plurality of virtual display screens through the optical lens module, the first electronic device further includes:
the first splicing module is used for splicing at least two virtual display screens to form a spliced display screen if the physical display screen is in a screen saver state or a screen off state;
or,
and the second splicing module is used for splicing the physical display screen and the at least one virtual display screen to form a spliced display screen if the physical display screen is in a bright screen state.
Optionally, the first electronic device further includes:
and the control module is used for controlling the spliced display screen to display at least one window interface as a whole display screen.
Optionally, in the case that the physical display screen is in a bright screen state and the physical display screen and the at least one virtual display screen are spliced to form a spliced display screen, the first electronic device further includes:
the second adjusting module is used for adjusting the spliced display screen to be a straight screen if the physical display screen is the straight screen;
and the third adjusting module is used for adjusting the spliced display screen to be a curved screen if the physical display screen is the curved screen.
Optionally, the third adjusting module includes:
the determining unit is used for determining the position of the physical display screen in the spliced display screen if the physical display screen is a curved screen;
and the fifth adjusting unit is used for adjusting the bending form of the spliced display screen based on the bending form of the physical display screen and the position of the physical display screen in the spliced display screen.
As shown in fig. 8, a structure diagram of an implementation manner of a second electronic device provided in an embodiment of the present invention is shown, where the second electronic device includes:
a physical display screen 40;
the first obtaining module 81 is configured to obtain usage scenario information of a virtual display screen displayed by a first electronic device through an optical lens module, where the second electronic device is connected to the first electronic device, and the second electronic device has a physical display screen;
a first adjusting module 82, configured to adjust a display parameter of the virtual display screen in the first electronic device based on the usage scenario information.
Optionally, the usage scenario information of the virtual display screen includes: the behavior information of the user aiming at least one of the virtual display screens and/or the window interface information displayed by at least one of the virtual display screens.
Optionally, the first adjusting module 82 includes:
a first generating unit configured to generate first indication information based on the usage scenario information; the first sending unit is used for sending the first indication information to the first electronic equipment;
or,
the second generating unit is used for obtaining and adjusting display parameters of a virtual display screen in the first electronic device based on the usage scenario information; and the second sending unit is used for sending the display parameters to the first electronic device.
Optionally, if the usage scenario of the virtual display screens is behavior information of a user for at least one of the virtual display screens, the first generating unit includes:
the first generating subunit is configured to generate, based on the distance change information between the user and the at least one virtual display screen, first indication information used for adjusting the display size of the at least one virtual display screen, and/or dynamically adjusting the bending form of the at least one virtual display screen, respectively;
and/or,
a second generating subunit, configured to generate, based on the gaze direction of the user, first indication information for dynamically adjusting the orientations of the at least one virtual display screen, respectively, so that the orientation of the at least one virtual display screen changes along with a change in the gaze direction of the user;
and/or,
a third generating subunit, configured to generate, based on the gaze projection position of the user, first indication information for dynamically adjusting a position of the at least one virtual display screen in a horizontal direction and/or a position of the at least one virtual display screen in a vertical direction, respectively.
Optionally, if the usage scenario of the virtual display screen is window interface information displayed by at least one virtual display screen in the virtual display screen, the first generating unit includes:
the first acquiring subunit is configured to acquire window interface information displayed by a first virtual display screen in the at least one virtual display screen;
and the fourth generating subunit is configured to generate, based on the displayed window interface information, first indication information for adjusting the first virtual display screen to be a straight screen or a curved screen.
Optionally, if the usage scenario of the virtual display screens is behavior information of a user for at least one of the virtual display screens, the second generating unit includes:
the second obtaining subunit is configured to obtain, based on the distance change information between the user and the at least one virtual display screen, a display size corresponding to the at least one virtual display screen, and/or a bending form of the at least one virtual display screen;
and/or,
a third obtaining subunit, configured to obtain, based on the gaze direction of the user, orientations of the at least one virtual display screen, respectively, so that the orientations of the at least one virtual display screen change along with changes in the gaze direction of the user;
and/or,
and the fourth acquisition subunit is used for respectively acquiring the position of the at least one virtual display screen in the horizontal direction and/or the position of the at least one virtual display screen in the vertical direction based on the sight line projection position of the user.
Optionally, if the usage scenario of the virtual display screen is window interface information displayed by at least one virtual display screen in the virtual display screen, the second generating unit includes:
a fifth obtaining subunit, configured to obtain window interface information displayed by a first virtual display screen in the at least one virtual display screen;
and the sixth obtaining subunit is configured to obtain, based on the displayed window interface information, adjustment information of the straight screen or the curved screen corresponding to the first virtual display screen.
Optionally, the second electronic device further includes:
the first generation module is used for generating second indication information for controlling at least two virtual display screens in the first electronic equipment to be spliced to form a spliced display screen if the physical display screen is in a screen saver state or a screen off state;
or,
and the second generation module is used for generating third indication information for controlling the physical display screen and the at least one virtual display screen of the first electronic device to be spliced to form a spliced display screen if the physical display screen is in a bright screen state.
Optionally, the second electronic device further includes:
the control module is used for controlling the spliced display screen to display at least one window interface as a whole display screen, or the third generation module is used for generating fourth indication information used for controlling the spliced display screen to display at least one window interface as a whole display screen.
Optionally, if the physical display screen is in a bright screen state, under the condition that the physical display screen and the at least one virtual display screen are spliced to form one spliced display screen, the method further includes:
a third generating module, configured to generate fifth indication information for adjusting the spliced display screen to be a straight screen if the physical display screen is a straight screen;
and the fourth generation module is used for generating sixth indication information for adjusting the spliced display screen to be a curved screen if the physical display screen is the curved screen.
As shown in fig. 9, a structure diagram of another implementation manner of a first electronic device provided in an embodiment of the present invention is shown, where the first electronic device includes:
an optical lens module 90 for displaying one or more virtual display screens;
a memory 91 for storing a program;
a processor 92 configured to execute the program, the program being specifically configured to:
acquiring the usage scenario information of the virtual display screen;
and adjusting the display parameters of the virtual display screen based on the usage scenario information.
The first electronic device further includes: a bus, a communication interface 93, an input device 94, and an output device 95.
The optical lens module 90, the processor 92, the memory 91, the communication interface 93, the input device 94, and the output device 95 are connected to each other by a bus. Wherein:
a bus may include a path that transfers information between components of a computer system.
The processor 92 may be a general-purpose processor, such as a general-purpose central processing unit (CPU), a network processor (NP), or a microprocessor; or an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of the program of the present invention; it may also be a digital signal processor (DSP), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components.
The processor 92 may include a main processor and may also include a baseband chip, modem, and the like.
The memory 91 stores the program for executing the technical solution of the present invention, and may also store an operating system and other key services. In particular, the program may include program code, and the program code includes computer operating instructions. More specifically, the memory 91 may include a read-only memory (ROM), other types of static storage devices that can store static information and instructions, a random access memory (RAM), other types of dynamic storage devices that can store information and instructions, disk storage, flash memory, and the like.
Output device 95 may include equipment that allows output of information to a user, such as a display screen, a printer, speakers, etc.
The processor 92 executes the program stored in the memory 91 and invokes other devices, which may be used to implement the steps of the display screen processing method provided by the embodiments of the present invention.
As shown in fig. 10, a structure diagram of another implementation manner of a second electronic device provided in the embodiment of the present invention is shown, where the second electronic device includes:
a physical display screen 40;
a memory 1001 for storing a program;
a processor 1002 configured to execute the program, the program being specifically configured to:
acquiring usage scenario information of one or more virtual display screens displayed by a first electronic device through an optical lens module, wherein the second electronic device is connected with the first electronic device;
and adjusting display parameters of a virtual display screen in the first electronic device based on the usage scenario information.
The second electronic device further includes: a bus, a communication interface 1003, an input device 1004, and an output device 1005.
The physical display screen 40, the processor 1002, the memory 1001, the communication interface 1003, the input device 1004, and the output device 1005 are connected to each other via a bus.
Embodiments of the present invention further provide a storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the steps included in any of the embodiments of the display screen processing method applied to the first electronic device.
Embodiments of the present invention further provide a storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the steps included in any of the embodiments of the display screen processing method applied to the second electronic device.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (8)
1. A display screen processing method, applied to a first electronic device, wherein the first electronic device displays a virtual display screen through an optical lens module, the first electronic device is connected with a second electronic device, and the second electronic device has a physical display screen; the method comprises the following steps:
acquiring usage scenario information of the virtual display screen;
adjusting display parameters of the virtual display screen based on the usage scenario information; wherein the acquiring of the usage scenario information of the virtual display screen includes at least one of:
acquiring behavior information of a user for at least one virtual display screen in the virtual display screens; or,
acquiring window interface information displayed by at least one virtual display screen in the virtual display screens;
if the obtaining of the usage scenario of the virtual display screen is to obtain behavior information of a user for at least one virtual display screen in the virtual display screen, the adjusting of the display parameters of the virtual display screen based on the usage scenario information includes:
dynamically adjusting the bending form of the at least one virtual display screen respectively based on distance change information between the user and the at least one virtual display screen; wherein the dynamically adjusting the bending form of the at least one virtual display screen based on the distance change information between the user and the at least one virtual display screen includes: for each virtual display screen, obtaining at least one row of pixels from the virtual display screen, wherein each row of pixels is parallel to the horizontal plane; and, for each row of pixels, adjusting the bending form of the row so that every pixel in the row is at the same distance from the user, thereby obtaining the bending forms corresponding to the at least one row of pixels, and thus the bending forms corresponding to the at least one virtual display screen.
2. The display screen processing method according to claim 1, wherein if the obtaining of the usage scenario of the virtual display screens is obtaining of behavior information of a user for at least one of the virtual display screens, the adjusting of the display parameters of the virtual display screens based on the usage scenario information further comprises:
respectively adjusting the display size of the at least one virtual display screen based on the distance change information of the user from the at least one virtual display screen;
and/or,
dynamically adjusting the orientation of the at least one virtual display screen based on the sight line direction of the user respectively, so that the orientation of the at least one virtual display screen changes along with the change of the sight line direction of the user;
and/or,
and dynamically adjusting the position of the at least one virtual display screen in the horizontal direction and/or the position of the at least one virtual display screen in the vertical direction respectively based on the sight line projection position of the user.
3. The display screen processing method according to claim 1, wherein if the obtaining of the usage scenario of the virtual display screen is to obtain window interface information displayed by at least one of the virtual display screens, the adjusting of the display parameters of the virtual display screen based on the usage scenario information includes:
acquiring window interface information displayed by a first virtual display screen in the at least one virtual display screen;
and adjusting the first virtual display screen to be a straight screen or a curved screen based on the displayed window interface information.
4. The display screen processing method according to claim 1, wherein the first electronic device displays a plurality of virtual display screens through the optical lens module, the method further comprising:
if the physical display screen is in a screen saver state or a screen off state, splicing at least two virtual display screens to form a spliced display screen;
or,
and if the physical display screen is in a bright screen state, splicing the physical display screen and at least one virtual display screen to form a spliced display screen.
5. The display screen processing method according to claim 4, further comprising:
controlling the spliced display screen to be used as an integral display screen to display at least one window interface;
if the physical display screen is in a bright screen state, the physical display screen and at least one virtual display screen are spliced to form a spliced display screen, and the method further comprises the following steps:
if the physical display screen is a straight screen, adjusting the spliced display screen to be the straight screen;
if the physical display screen is a curved screen, adjusting the spliced display screen to be a curved screen;
wherein, if the physical display screen is a curved screen, the adjusting of the spliced display screen to be a curved screen includes:
if the physical display screen is a curved screen, determining the position of the physical display screen in the spliced display screen;
and adjusting the bending form of the spliced display screen based on the bending form of the physical display screen and the position of the physical display screen in the spliced display screen.
6. A display screen processing method is applied to a second electronic device and comprises the following steps:
acquiring usage scenario information of a virtual display screen displayed by a first electronic device through an optical lens module, wherein the second electronic device is connected with the first electronic device and has a physical display screen;
adjusting display parameters of the virtual display screen in the first electronic device based on the usage scenario information;
wherein, the acquiring of the usage scenario information of the virtual display screen includes at least one of:
acquiring behavior information of a user aiming at least one virtual display screen in the virtual display screens; or the like, or, alternatively,
acquiring window interface information displayed by at least one virtual display screen in the virtual display screens;
if the obtaining of the usage scenario of the virtual display screen is to obtain behavior information of a user for at least one virtual display screen in the virtual display screen, the adjusting of the display parameters of the virtual display screen in the first electronic device based on the usage scenario information includes:
and respectively and dynamically adjusting the bending form of the at least one virtual display screen based on the distance change information between the user and the at least one virtual display screen.
7. A first electronic device, wherein the first electronic device is connected to a second electronic device having a physical display screen, the first electronic device comprising:
an optical lens module configured to display one or more virtual display screens;
a memory for storing a program;
a processor configured to execute the program, the program specifically configured to:
acquire usage scenario information of the virtual display screens;
adjust display parameters of the virtual display screens based on the usage scenario information;
wherein the acquiring usage scenario information of the virtual display screens comprises at least one of:
acquiring behavior information of a user with respect to at least one of the virtual display screens; or
acquiring window interface information displayed by at least one of the virtual display screens;
wherein, if the acquiring usage scenario information is acquiring behavior information of a user with respect to at least one of the virtual display screens, the adjusting display parameters based on the usage scenario information comprises:
dynamically adjusting the bending form of the at least one virtual display screen based on distance change information between the user and the at least one virtual display screen, respectively, wherein the dynamic adjusting comprises: for each virtual display screen, obtaining at least one row of pixels from the virtual display screen, each row of pixels being parallel to a horizontal plane; and, for each row of pixels, adjusting the bending form of the row so that every pixel in the row is equidistant from the user, thereby obtaining the bending forms corresponding to the at least one row of pixels and hence the bending form of each of the at least one virtual display screen.
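The equal-distance condition in the claim above amounts to placing each horizontal pixel row on a circular arc centred on the user, so that moving closer bends the screen more sharply. A minimal geometric sketch of that reading (function and parameter names are hypothetical, not from the patent):

```python
import math

def bend_pixel_row(row_width, num_pixels, user_distance):
    """Bend one horizontal pixel row so every pixel is the same
    distance from the user: the row becomes an arc of a circle of
    radius `user_distance` centred on the user's eye position.

    Returns (x, z) coordinates per pixel in the user's horizontal
    plane, with the user at the origin looking along +z.
    """
    # Preserve the row's physical width as arc length along the circle.
    total_angle = row_width / user_distance
    coords = []
    for i in range(num_pixels):
        # Spread pixels evenly across the arc, centred on the view axis.
        t = i / (num_pixels - 1) if num_pixels > 1 else 0.5
        theta = (t - 0.5) * total_angle
        coords.append((user_distance * math.sin(theta),
                       user_distance * math.cos(theta)))
    return coords

# Recomputing per frame as user_distance changes gives the claimed
# dynamic adjustment of the bending form.
row = bend_pixel_row(row_width=0.8, num_pixels=5, user_distance=1.2)
```

Because each row is bent independently, the stacked rows together define the bending form of the whole virtual display screen, as the claim describes.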
8. A second electronic device, comprising:
a physical display screen;
a memory for storing a program;
a processor configured to execute the program, the program specifically configured to:
acquire usage scenario information of one or more virtual display screens displayed by a first electronic device through an optical lens module, wherein the second electronic device is connected to the first electronic device;
adjust display parameters of the virtual display screens in the first electronic device based on the usage scenario information;
wherein the acquiring usage scenario information of the virtual display screens comprises at least one of:
acquiring behavior information of a user with respect to at least one of the virtual display screens; or
acquiring window interface information displayed by at least one of the virtual display screens;
wherein, if the acquiring usage scenario information is acquiring behavior information of a user with respect to at least one of the virtual display screens, the adjusting display parameters of the virtual display screens in the first electronic device based on the usage scenario information comprises:
dynamically adjusting the bending form of the at least one virtual display screen based on distance change information between the user and the at least one virtual display screen, respectively; wherein the dynamic adjusting comprises: for each virtual display screen, obtaining at least one row of pixels from the virtual display screen, each row of pixels being parallel to a horizontal plane; and, for each row of pixels, adjusting the bending form of the row so that every pixel in the row is equidistant from the user, thereby obtaining the bending forms corresponding to the at least one row of pixels and hence the bending form of each of the at least one virtual display screen.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711403441.6A CN108154864B (en) | 2017-12-22 | 2017-12-22 | Display screen processing method, first electronic device and second electronic device |
US16/230,952 US20190196710A1 (en) | 2017-12-22 | 2018-12-21 | Display screen processing method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711403441.6A CN108154864B (en) | 2017-12-22 | 2017-12-22 | Display screen processing method, first electronic device and second electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108154864A CN108154864A (en) | 2018-06-12 |
CN108154864B true CN108154864B (en) | 2020-02-21 |
Family
ID=62464289
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711403441.6A Active CN108154864B (en) | 2017-12-22 | 2017-12-22 | Display screen processing method, first electronic device and second electronic device |
Country Status (2)
Country | Link |
---|---|
US (1) | US20190196710A1 (en) |
CN (1) | CN108154864B (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11036284B2 (en) | 2018-09-14 | 2021-06-15 | Apple Inc. | Tracking and drift correction |
CN109474738A (en) * | 2018-10-30 | 2019-03-15 | 努比亚技术有限公司 | Terminal and its eyeshield mode control method, computer readable storage medium |
EP4270159A3 (en) | 2019-09-26 | 2024-01-03 | Apple Inc. | Wearable electronic device presenting a computer-generated reality environment |
US11379033B2 (en) * | 2019-09-26 | 2022-07-05 | Apple Inc. | Augmented devices |
WO2021062278A1 (en) | 2019-09-27 | 2021-04-01 | Apple Inc. | Environment for remote communication |
CN110851227B (en) * | 2019-11-13 | 2021-10-22 | 联想(北京)有限公司 | Display control method and electronic equipment |
CN111026488B (en) * | 2019-12-06 | 2023-04-07 | Tcl移动通信科技(宁波)有限公司 | Communication data saving method, device, terminal equipment and storage medium |
CN111143003B (en) * | 2019-12-25 | 2024-01-23 | 维沃移动通信有限公司 | Desktop display method and electronic equipment |
WO2021262507A1 (en) * | 2020-06-22 | 2021-12-30 | Sterling Labs Llc | Displaying a virtual display |
CN112618299A (en) * | 2020-12-18 | 2021-04-09 | 上海影创信息科技有限公司 | Eyesight protection method and system and VR glasses thereof |
CN114721752B (en) * | 2020-12-18 | 2024-07-26 | 青岛海信移动通信技术有限公司 | Mobile terminal and display method of application interface thereof |
CN112764624B (en) * | 2021-01-26 | 2022-09-09 | 维沃移动通信有限公司 | Information screen display method and device |
CN114120326A (en) * | 2021-12-31 | 2022-03-01 | 读书郎教育科技有限公司 | A global scanning pen with a screen protector |
CN117056869B (en) * | 2023-10-11 | 2024-09-13 | 轩创(广州)网络科技有限公司 | Electronic information data association method and system based on artificial intelligence |
CN117056749B (en) * | 2023-10-12 | 2024-02-06 | 深圳市信润富联数字科技有限公司 | Point cloud data processing method and device, electronic equipment and readable storage medium |
CN117784996B (en) * | 2023-12-27 | 2024-09-03 | 深圳市勤泰智能信息技术有限公司 | Multi-screen intelligent management system, method, device and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103294428A (en) * | 2012-03-02 | 2013-09-11 | 联想(北京)有限公司 | Information display method and electronic equipment |
CN105793764A (en) * | 2013-12-27 | 2016-07-20 | 英特尔公司 | Device, method and system for providing extended display device for head-mounted display device |
CN106453885A (en) * | 2016-09-30 | 2017-02-22 | 努比亚技术有限公司 | Eye protecting terminal and eye protecting method |
CN106997242A (en) * | 2017-03-28 | 2017-08-01 | 联想(北京)有限公司 | Methods for interface management and head-mounted display apparatus |
CN107085489A (en) * | 2017-03-21 | 2017-08-22 | 联想(北京)有限公司 | A control method and electronic device |
CN107168513A (en) * | 2017-03-22 | 2017-09-15 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN107247511A (en) * | 2017-05-05 | 2017-10-13 | 浙江大学 | A kind of across object exchange method and device based on the dynamic seizure of eye in virtual reality |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6198462B1 (en) * | 1994-10-14 | 2001-03-06 | Hughes Electronics Corporation | Virtual display screen system |
WO2007007727A1 (en) * | 2005-07-11 | 2007-01-18 | Sharp Kabushiki Kaisha | Video transmitting apparatus, video display apparatus, video transmitting method and video display method |
US8786675B2 (en) * | 2008-01-23 | 2014-07-22 | Michael F. Deering | Systems using eye mounted displays |
US8767053B2 (en) * | 2010-08-26 | 2014-07-01 | Stmicroelectronics, Inc. | Method and apparatus for viewing stereoscopic video material simultaneously with multiple participants |
US20130002724A1 (en) * | 2011-06-30 | 2013-01-03 | Google Inc. | Wearable computer with curved display and navigation tool |
US9232145B2 (en) * | 2012-04-24 | 2016-01-05 | Lenovo (Beijing) Co., Ltd. | Hand-held electronic device and display method |
US20130326364A1 (en) * | 2012-05-31 | 2013-12-05 | Stephen G. Latta | Position relative hologram interactions |
US20150042640A1 (en) * | 2013-08-07 | 2015-02-12 | Cherif Atia Algreatly | Floating 3d image in midair |
IL236243A (en) * | 2014-12-14 | 2016-08-31 | Elbit Systems Ltd | Visual perception enhancement of displayed color symbology |
US20160334868A1 (en) * | 2015-05-15 | 2016-11-17 | Dell Products L.P. | Method and system for adapting a display based on input from an iris camera |
KR102471977B1 (en) * | 2015-11-06 | 2022-11-30 | 삼성전자 주식회사 | Method for displaying one or more virtual objects in a plurality of electronic devices, and an electronic device supporting the method |
US10229541B2 (en) * | 2016-01-28 | 2019-03-12 | Sony Interactive Entertainment America Llc | Methods and systems for navigation within virtual reality space using head mounted display |
US10754416B2 (en) * | 2016-11-14 | 2020-08-25 | Logitech Europe S.A. | Systems and methods for a peripheral-centric augmented/virtual reality environment |
- 2017-12-22: CN application CN201711403441.6A, published as CN108154864B, status: Active
- 2018-12-21: US application US16/230,952, published as US20190196710A1, status: Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20190196710A1 (en) | 2019-06-27 |
CN108154864A (en) | 2018-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108154864B (en) | Display screen processing method, first electronic device and second electronic device | |
EP3752897B1 (en) | Systems and methods for eye tracking in virtual reality and augmented reality applications | |
KR102544062B1 (en) | Method for displaying virtual image, storage medium and electronic device therefor | |
US20200272234A1 (en) | Method for controlling device on the basis of eyeball motion, and device therefor | |
US8836768B1 (en) | Method and system enabling natural user interface gestures with user wearable glasses | |
CN111886564B (en) | Information processing device, information processing method, and program | |
JP5728009B2 (en) | Instruction input device, instruction input method, program, recording medium, and integrated circuit | |
CN107977586B (en) | Display content processing method, first electronic device and second electronic device | |
CN103827780B (en) | Method and system for virtual input device | |
US20190385372A1 (en) | Positioning a virtual reality passthrough region at a known distance | |
US9530051B2 (en) | Pupil detection device | |
US9823745B1 (en) | Method and apparatus for selectively presenting content | |
EP4026318A1 (en) | Intelligent stylus beam and assisted probabilistic input to element mapping in 2d and 3d graphical user interfaces | |
US20240361835A1 (en) | Methods for displaying and rearranging objects in an environment | |
CN105917292A (en) | Eye Gaze Detection Using Multiple Light Sources and Sensors | |
TW201600887A (en) | A method and a display device with pixel repartition optimization | |
US10867174B2 (en) | System and method for tracking a focal point for a head mounted device | |
US20240152245A1 (en) | Devices, Methods, and Graphical User Interfaces for Interacting with Window Controls in Three-Dimensional Environments | |
US12380653B2 (en) | Devices, methods, and graphical user interfaces for maps | |
US11301049B2 (en) | User interface control based on elbow-anchored arm gestures | |
WO2025024469A1 (en) | Devices, methods, and graphical user interfaces for sharing content in a communication session | |
US20240403080A1 (en) | Devices, methods, and graphical user interfaces for displaying views of physical locations | |
CN110858095A (en) | Electronic device that can be controlled by head and its operation method | |
WO2024064373A1 (en) | Devices, methods, and graphical user interfaces for interacting with window controls in three-dimensional environments | |
KR101492832B1 (en) | Method for controlling display screen and display apparatus thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||