Detailed Description
So that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without making any inventive effort shall fall within the scope of protection of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and in the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, system, article, or apparatus.
According to one embodiment of the present invention, an embodiment of a method of controlling view display is provided. It should be noted that the steps shown in the flowcharts of the figures may be performed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order other than that shown or described herein.
The method of controlling view display in one embodiment of the present invention may be run on a terminal device or a server. The terminal device may be a local terminal device. When the method of controlling view display is run on a server, the method can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes the server and a client device.
In an alternative embodiment, various cloud applications, such as cloud gaming, may be run under the cloud interaction system. Taking cloud gaming as an example, cloud gaming refers to a gaming mode based on cloud computing. In the running mode of a cloud game, the entity that runs the game program and the entity that presents the game picture are separated: the storage and execution of the method of controlling view display are completed on a cloud gaming server, while the client device is used for receiving and sending data and for presenting the game picture. For example, the client device may be a display device close to the user side that has a data transmission function, such as a mobile terminal, a television, a computer, or a handheld computer, while the terminal device that performs the information processing is the cloud gaming server in the cloud. When playing the game, the player operates the client device to send an operation instruction to the cloud gaming server; the cloud gaming server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the client device through a network; finally, the client device decodes the data and outputs the game picture.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The manner in which the local terminal device provides the graphical user interface to the player may include a variety of ways; for example, the interface may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In a possible implementation manner, the embodiment of the invention provides a method for controlling view display, and a graphical user interface is provided through a terminal device, wherein the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
Taking a mobile terminal running as a local terminal device as an example, the mobile terminal may be a terminal device such as a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a handheld computer, a Mobile Internet Device (MID), a PAD, a game console, etc. Fig. 1 is a block diagram of a hardware structure of a mobile terminal for a method of controlling view display according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more (only one is shown in fig. 1) processors 102 (the processors 102 may include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), a field-programmable gate array (FPGA), a neural network processing unit (NPU), a tensor processing unit (TPU), an artificial intelligence (AI) processor, etc.) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106 for communication functions, an input-output device 108, and a display device 110. It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and does not limit the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration from that shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to a method of controlling view display in an embodiment of the present invention, and the processor 102 executes the computer program stored in the memory 104 to perform various functional applications and data processing, that is, to implement the above-described method of controlling view display. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, which may be connected to the mobile terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as a NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
The input of the input-output device 108 may come from a plurality of human interface devices (HIDs), such as a keyboard and mouse, a gamepad, or other special game controllers (e.g., a steering wheel, a fishing rod, a dance mat, a remote control, etc.). Some human interface devices may provide output functions in addition to input functions, such as the force feedback and vibration of a gamepad, the audio output of a controller, etc.
The display device 110 may be, for example, a head-up display (HUD) or a touch-screen liquid crystal display (LCD), also referred to as a "touch screen" or "touch display." The liquid crystal display enables a user to interact with the user interface of the mobile terminal. In some embodiments, the mobile terminal has a graphical user interface (GUI) with which a user may interact through finger contacts and/or gestures on a touch-sensitive surface. The human-machine interaction functionality optionally includes interactions such as creating web pages, drawing, word processing, making electronic documents, gaming, video conferencing, instant messaging, sending and receiving e-mail, call interfaces, playing digital video, playing digital music, and/or web browsing. Executable instructions for performing the human-machine interaction functionality described above are configured/stored in one or more processor-executable computer program products or readable storage media.
In a possible implementation, a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system. Fig. 2 is a flowchart of a method for controlling view display according to an embodiment of the present invention. The display content of the graphical user interface includes a game picture obtained by a virtual camera photographing a game scene at a default viewing angle. As shown in fig. 2, the method includes the following steps:
Step S21, displaying a first view in a first touch area in the graphical user interface, wherein the first view is a thumbnail view corresponding to a game picture under a default view angle;
The graphical user interface may be a graphical user interface of a game application program, and the graphical user interface may display a game picture, where the game picture is an image frame captured by a virtual camera in a game scene at the default viewing angle. The default viewing angle may be pre-specified by a technician or may be determined in real time based on the game scene.
The first view may be a thumbnail view corresponding to the game screen at the default view angle, and the thumbnail view may be a small-sized view in which partial information of the game screen is displayed.
The first touch area may be a partial area supporting a touch operation in the graphical user interface. The partial area in the graphical user interface may be an area designated in advance by a technician or may be an area set autonomously by a player. The first view is displayed in the first touch area.
For example, in the graphical user interface of a mobile-side multiplayer online battle arena (Multiplayer Online Battle Arena, MOBA) game, the first view may be a thumbnail of a virtual map (often referred to as a "mini-map"), and the first touch area may be the area of the graphical user interface for displaying the thumbnail.
For another example, in the graphical user interface of a sandbox game (such as Minecraft), the first view may be a thumbnail view of a virtual building built by a player, and the first touch area may be the area in the graphical user interface for displaying the thumbnail view.
Fig. 3 is a schematic view of an alternative first touch area according to an embodiment of the present invention, and fig. 4 is a schematic view of an alternative first view according to an embodiment of the present invention. As shown in fig. 3 and 4, in the virtual game scene a, the first touch area is a square area in the upper right corner of the graphical user interface, and the first touch area is used for displaying a first view, and the first view is a thumbnail of the game screen a.
Step S22, responding to a first touch operation acting on the first touch area, and displaying a second view in a second touch area in the graphical user interface, wherein the second touch area is larger than the first touch area, and the second view is an observation view corresponding to the thumbnail view;
The first touch operation may be used to control the display of the second view in the second touch area in the graphical user interface. The second touch area is larger than the first touch area, and the size of the second view may be larger than the size of the first view.
In an actual application scenario, the first touch operation may be a touch operation for displaying the thumbnail view in an enlarged manner. For example, the first touch operation may be a click touch operation (such as clicking a designated button to zoom in and double clicking the first touch area to zoom in), or a gesture touch operation (such as two-finger drag to zoom in and multi-finger designated gesture to zoom in). The first touch operation may be specified in advance by a technician or may be set autonomously by a player.
The second view may be an observation view corresponding to the thumbnail view. The thumbnail view may be a small-sized view displaying partial information of the game picture, while the observation view may be a normal-sized view (the size may be determined by the default viewing angle) displaying all information of the game picture, or information designated by the player (which may be set autonomously by the player).
The second touch area may be part or all of the area supporting touch operations in the graphical user interface. The second touch area may be an area pre-designated by a technician or an area set autonomously by the player (e.g., it may be set to display the view in full screen). The second view is displayed in the second touch area.
For example, in the graphical user interface of a sandbox game, the first view is a thumbnail view of a virtual building built by a player, and the second view is an observation view of that virtual building. When a double-click touch operation (corresponding to the first touch operation) is received, the observation view is displayed in full screen (the full screen corresponding to the second touch area).
Fig. 5 is a schematic view of an optional second touch area according to an embodiment of the invention, and fig. 6 is a schematic view of an optional second view according to an embodiment of the invention. As shown in fig. 3, 4, 5 and 6, in the virtual game scene a, the second touch area is a square area on the right side of the graphical user interface, the size of the second touch area is larger than that of the first touch area, and the second touch area is used for displaying a second view, where the second view is an observation view of the game screen a.
Specifically, as shown in fig. 6, a virtual building B1 including a three-layer building structure is displayed in the observation view (corresponding to the second view) of the game screen a. The display content of the observation view of the game screen a further includes a control bar (for adjusting the virtual time in the game scene a) and a plurality of buttons (for adjusting the virtual building B1). The observation view of the game screen a may respond to the user's touch operations (e.g., zooming in, rotating, panning, etc.).
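Steps S21 and S22 can be sketched as a minimal view-state model. The following Python sketch is purely illustrative: the class and field names (ViewController, Rect, etc.) are hypothetical and not part of the claimed method; the rectangle sizes are placeholder assumptions.

```python
from dataclasses import dataclass

# Hypothetical screen-space rectangles for the two touch areas.
@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    width: float
    height: float

    @property
    def area(self) -> float:
        return self.width * self.height

class ViewController:
    """Sketch of steps S21-S22: a thumbnail (first view) shown in a small
    first touch area is replaced by an observation view (second view) in a
    larger second touch area when a first touch operation lands inside the
    first touch area."""

    def __init__(self, first_area: Rect, second_area: Rect):
        # Per step S22, the second touch area is larger than the first.
        assert second_area.area > first_area.area
        self.first_area = first_area
        self.second_area = second_area
        self.active_view = "thumbnail"  # the first view is shown by default

    @staticmethod
    def _contains(rect: Rect, x: float, y: float) -> bool:
        return (rect.x <= x <= rect.x + rect.width
                and rect.y <= y <= rect.y + rect.height)

    def on_first_touch(self, x: float, y: float) -> None:
        # Only a touch inside the first touch area opens the observation view.
        if self._contains(self.first_area, x, y):
            self.active_view = "observation"

vc = ViewController(first_area=Rect(900, 0, 80, 80),
                    second_area=Rect(500, 0, 480, 540))
vc.on_first_touch(940, 40)  # tap inside the thumbnail area
print(vc.active_view)       # -> observation
```

A tap outside the first touch area leaves the thumbnail displayed, matching the behavior that the first touch operation must act on the first touch area.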
In step S23, in response to a second touch operation applied to the second touch area, the view angle corresponding to the second view is adjusted from the default view angle to the target view angle, and the game scene is previewed under the target view angle.
The second touch area may be an area of a graphical user interface for displaying the second view. The second touch operation may be used to control a viewing angle of the second view displayed in the second touch area in the graphical user interface to be adjusted from a default viewing angle to a target viewing angle. The target view angle may be determined in real time by the second touch operation, or may be a view angle (e.g., a front view angle, a top view angle, a left view angle, etc.) preset by the player and corresponding to the second touch operation.
In an actual application scenario, the second touch operation may be a touch operation for adjusting the viewing angle. For example, the second touch operation may be a click touch operation (such as clicking a designated button to quickly restore the front viewing angle, or double-clicking the second touch area to quickly restore the top viewing angle), or a gesture touch operation (such as a single-finger drag rotating about a fixed point to adjust the viewing direction, or a multi-finger designated gesture rotating about a fixed axis to adjust the viewing direction). The second touch operation may be specified in advance by a technician or may be set autonomously by the player.
The second touch operation may also be used to preview the game scene at the target viewing angle. For example, the second touch operation may be used to preview, at the target viewing angle, designated state information (e.g., the construction progress) of a designated virtual object (e.g., a virtual building model) in the game scene. Previewing the construction progress may include displaying the built partial model corresponding to the virtual building model, displaying a construction progress bar and a construction progress percentage value of the virtual building model, and the like. Previewing the construction progress of the virtual building model in the game scene at the target viewing angle may mean displaying, at the target viewing angle, the part of the virtual building model that has been constructed in the game scene.
For another example, in the actual application scenario, the second touch operation may be a touch operation for displaying a construction progress of the virtual building model. For example, the second touch operation may be a click touch operation (e.g., clicking a "display progress" button or a "query" button to display the construction progress), or a gesture touch operation (e.g., a two-finger pinch gesture touch to display the construction progress). The second touch operation may be specified in advance by a technician or may be set autonomously by the player.
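The construction-progress percentage value mentioned above can be sketched as a simple ratio. The helper below is a hypothetical illustration (the function name and the notion of counting "parts" are assumptions, not part of the claimed method):

```python
def construction_progress(built_parts: int, total_parts: int) -> float:
    """Hypothetical helper: the percentage of a virtual building model that
    has been constructed, as might back the progress bar and percentage
    value shown in the preview."""
    if total_parts <= 0:
        raise ValueError("total_parts must be positive")
    return round(100.0 * built_parts / total_parts, 1)

# e.g. 2 of the 3 floors of virtual building B1 are completed
print(construction_progress(2, 3))  # -> 66.7
```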
In at least some embodiments of the present invention, first, a first view is displayed in a first touch area in a graphical user interface, where the first view is a thumbnail view corresponding to the game picture at a default viewing angle. Then, in response to a first touch operation acting on the first touch area, a second view is displayed in a second touch area in the graphical user interface, where the second touch area is larger than the first touch area and the second view is an observation view corresponding to the thumbnail view. Finally, in response to a second touch operation acting on the second touch area, the viewing angle corresponding to the second view is adjusted from the default viewing angle to a target viewing angle, and the game scene is previewed at the target viewing angle. This achieves the purpose of observing and previewing a virtual game scene through an operable thumbnail view, thereby achieving the technical effect of improving the convenience of observation operations and the richness of functions in the virtual game scene, and further solving the technical problems of cumbersome and complex operation and poor player experience in the methods for constructing a virtual model provided by the related art.
Optionally, the method for controlling view display may further include the following steps:
step S241, acquiring a first position of the virtual character in the game scene;
step S242, determining a second position of the virtual camera in the game scene based on the first position;
Step S243, determining a default viewing angle using the first position and the second position.
Still taking the construction of the virtual building B1 in the game scene a as an example, fig. 7 is a schematic diagram of an alternative first position determination according to an embodiment of the present invention. As shown in fig. 7, in the game scene a, a three-dimensional rectangular coordinate system is built with the virtual character r0 as the origin of coordinates, and the positive directions of the X axis, the Y axis, and the Z axis are shown in fig. 7, so that the coordinates of the position of the virtual character r0 (corresponding to the first position) can be determined to be (0, 0, 0).
Still taking the construction of the virtual building B1 in the game scene a as an example, fig. 8 is a schematic diagram of an alternative second position determination according to an embodiment of the present invention. As shown in fig. 8, in the game scene a, the coordinates of the position of the virtual camera (corresponding to the second position) may be determined to be (10m, 40m, 28m) based on the three-dimensional rectangular coordinate system XYZ corresponding to the virtual character r0.
Still taking the construction of the virtual building B1 in the game scene a as an example, fig. 9 is a schematic diagram of an alternative default viewing angle according to one embodiment of the invention. From the coordinates (0, 0, 0) of the position of the virtual character r0 and the coordinates (10m, 40m, 28m) of the position of the virtual camera in the game scene a, the default viewing angle for constructing the virtual building B1 can be determined. The virtual building B1 displayed at the default viewing angle is shown in fig. 9.
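One way to realize steps S241-S243 is to take the default viewing direction as the unit vector from the virtual camera (second position) toward the virtual character (first position). The sketch below uses the example coordinates from figs. 7-8; the exact angle convention is an assumption, since the embodiment does not fix one:

```python
import math

def view_direction(camera_pos, character_pos):
    """Sketch of steps S241-S243: the default viewing direction as the
    normalized vector from the second position (camera) to the first
    position (character)."""
    dx, dy, dz = (t - c for t, c in zip(character_pos, camera_pos))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# Character r0 at the origin, camera at (10m, 40m, 28m) as in fig. 8.
d = view_direction((10.0, 40.0, 28.0), (0.0, 0.0, 0.0))
print(all(c < 0 for c in d))  # camera looks back toward the origin -> True
```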
Still taking the construction of the virtual building B1 in the game scene a as an example, the second touch operation may be a zoom-in touch operation, a rotating touch operation, or a panning touch operation; correspondingly, the target viewing angle may be the default viewing angle after zooming in, after rotation, or after panning, respectively.
Fig. 10 is a schematic diagram of an alternative adjustment of the viewing angle of the second view according to an embodiment of the present invention. As shown in fig. 10, after a zoom-in touch operation (corresponding to the above second touch operation) is performed on the second view at the default viewing angle, the viewing angle corresponding to the second view may be adjusted to a viewing angle v1, where the viewing angle v1 is the default viewing angle after zooming in.
Fig. 11 is a schematic diagram of an alternative adjustment of the viewing angle of the second view according to an embodiment of the present invention. As shown in fig. 11, after a rotating touch operation (corresponding to the above second touch operation) is performed on the second view at the default viewing angle, the viewing angle corresponding to the second view may be adjusted to a viewing angle v2, where the viewing angle v2 is the default viewing angle after rotation.
Fig. 12 is a schematic diagram of an alternative adjustment of the viewing angle of the second view according to an embodiment of the present invention. As shown in fig. 12, after a panning touch operation (corresponding to the above second touch operation) is performed on the second view at the default viewing angle, the viewing angle corresponding to the second view may be adjusted to a viewing angle v3, where the viewing angle v3 is the default viewing angle after panning.
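The zoom, rotate, and pan adjustments of figs. 10-12 can be sketched as operations on a simple camera state, together with a reset that restores the default viewing angle (cf. step S251 below). All class and field names here are illustrative assumptions; an actual engine camera would carry more state:

```python
from dataclasses import dataclass, field

@dataclass
class ViewCamera:
    """Sketch of the second touch operations: zoom, rotate, and pan move the
    viewing angle away from the default; reset restores it."""
    distance: float = 50.0            # distance from the observed model
    yaw_deg: float = 0.0              # rotation about the vertical axis
    target: tuple = (0.0, 0.0, 0.0)   # point the camera looks at
    _default: dict = field(default=None, repr=False)

    def __post_init__(self):
        # Remember the default viewing angle for later reset.
        self._default = {"distance": self.distance,
                         "yaw_deg": self.yaw_deg,
                         "target": self.target}

    def zoom(self, factor: float):        # zoomed-in viewing angle (fig. 10)
        self.distance /= factor

    def rotate(self, delta_deg: float):   # rotated viewing angle (fig. 11)
        self.yaw_deg = (self.yaw_deg + delta_deg) % 360.0

    def pan(self, dx: float, dy: float):  # panned viewing angle (fig. 12)
        x, y, z = self.target
        self.target = (x + dx, y + dy, z)

    def reset(self):                      # restore the default viewing angle
        self.distance = self._default["distance"]
        self.yaw_deg = self._default["yaw_deg"]
        self.target = self._default["target"]

cam = ViewCamera()
cam.zoom(2.0); cam.rotate(45.0); cam.pan(3.0, -1.0)
print(cam.distance, cam.yaw_deg, cam.target)  # 25.0 45.0 (3.0, -1.0, 0.0)
cam.reset()
print(cam.distance, cam.yaw_deg)              # 50.0 0.0
```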
By this method of adjusting the viewing angle of the second view, the player can flexibly change the viewing angle in the observation view to observe the construction status of the virtual building B1, which improves the player's game experience.
Optionally, the method for controlling view display may further include the following steps:
Step S251, responding to a third touch operation acting on a first control in a second touch area, and recovering a view angle corresponding to a second view from a target view angle to a default view angle, wherein the first control is used for resetting the view angle of the second view;
in step S252, in response to exiting the second view, the first view remains displayed within the first touch area.
The first control may be a "reset" button within the second touch area that may be used to restore the view angle of the second view to the default view angle. The third touch operation may be clicking the "reset" button or pressing the "reset" button for a long time. When the second view is exited, the first touch area in the graphical user interface will display the first view.
Still taking the construction of the virtual building B1 in the game scene a as an example, fig. 13 is a schematic diagram of an alternative control for resetting the viewing angle according to an embodiment of the present invention. As shown in fig. 13, a reset touch operation (corresponding to the third touch operation described above) performed on the second view at the viewing angle v2 may be clicking the button 1 in the second view. When the reset touch operation is detected, the viewing angle of the second view is adjusted to the default viewing angle; then, when an exit-view touch operation (such as a long press on the second view) acting on the second touch area is detected, the first view at the default viewing angle is displayed in the first touch area of the graphical user interface.
Optionally, the method for controlling view display may further include the following steps:
and step S26, in response to exiting the second view, updating and displaying a third view in the first touch area, wherein the third view is a thumbnail view corresponding to the second view under the target view angle.
Still taking the construction of the virtual building B1 in the game scene a as an example, fig. 14 is a schematic diagram of an alternative updated thumbnail display according to an embodiment of the present invention. As shown in fig. 14, after a rotating touch operation is performed on the second view, the second view corresponds to the viewing angle v2; when the exit-view touch operation is detected, a thumbnail corresponding to the second view at the viewing angle v2 (corresponding to the third view described above) is displayed in the first touch area of the graphical user interface.
Optionally, the method for controlling view display may further include the following steps:
step S271, in response to a fourth touch operation applied to a second control in the second touch area, acquiring a first mark position in the second view, where the second control is used to activate a mark mode in the second view;
In step S272, in response to exiting the second view, the second marker position is synchronously marked, where the second marker position is a corresponding position of the first marker position within the game screen.
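The synchronization of steps S271-S272 can be sketched as a coordinate mapping from the second view to the game picture: the first mark position is normalized against the view rectangle and re-projected onto the screen rectangle. The linear mapping and the (x, y, width, height) rectangle convention are assumptions for illustration only:

```python
def map_mark_to_screen(mark_xy, view_rect, screen_rect):
    """Sketch of steps S271-S272: convert a position marked in the second
    view (first mark position) to the corresponding position in the game
    picture (second mark position)."""
    mx, my = mark_xy
    vx, vy, vw, vh = view_rect
    sx, sy, sw, sh = screen_rect
    # Normalized position of the mark inside the second view...
    u, v = (mx - vx) / vw, (my - vy) / vh
    # ...placed at the same relative position within the game picture.
    return (sx + u * sw, sy + v * sh)

# A mark at the center of a 400x400 observation view maps to the center
# of a 1920x1080 game picture.
print(map_mark_to_screen((200, 200), (0, 0, 400, 400), (0, 0, 1920, 1080)))
# -> (960.0, 540.0)
```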
Still taking the construction of the virtual building B1 in the game scene a as an example, fig. 15 is a schematic diagram of an alternative display of a marked position according to an embodiment of the present invention. As shown in fig. 15, when an activate-mark touch operation (corresponding to the fourth touch operation described above) acting on the second touch area is detected, the marked position is acquired from the second view. The marked position may be a position marked by the player through a designated gesture touch operation (e.g., tapping, smearing, etc.) acting on the virtual building B1. The activate-mark touch operation may be clicking or long-pressing the button 2 in the second view. As shown in fig. 15, the present example marks the bottommost layer (the hatched portion in the figure) of the three-layer building structure of the virtual building B1.
As also shown in fig. 15, when the second view is exited, a thumbnail of the second view with the mark is displayed within the first touch area of the graphical user interface. When an adjustment operation (e.g., replacement, deletion, dismantling, etc.) is detected on the marked portion, the mark may no longer be displayed in the first view and the second view.
Through the marking method for the virtual building B1 in the game scene A, a player can record the position to be adjusted in the construction process, so that the accuracy of multiple adjustments is ensured.
Optionally, the method for controlling view display may further include the following steps:
Step S281, responding to a fifth touch operation acting on a third control in the second touch area, and displaying a planar map model corresponding to the game scene, wherein the third control is used for activating a blueprint mode in the second view;
step S282, selecting a target display range from the planar map model in response to a sixth touch operation acting on the planar map model;
In step S283, the fourth view is updated and displayed in the second touch area based on the target display range, wherein the fourth view is an observation view determined by the target display range.
Still taking the building of the virtual building B1 in the game scene A as an example, fig. 16 is a schematic diagram of an alternative blueprint mode operation according to one embodiment of the present invention. As shown in fig. 16, when an open-blueprint touch operation (corresponding to the fifth touch operation described above) acting on the second view is detected, a planar map model corresponding to the game scene A is displayed in the graphical user interface (such as the blueprint mode view in fig. 16). The open-blueprint touch operation may be clicking button 3 in the second view (e.g., an "activate blueprint mode" button).
As shown in fig. 16, when a selection range touch operation (corresponding to the sixth touch operation described above) acting on the planar map model is detected, a target display range is selected from the planar map model.
As also shown in fig. 16, a fourth view is displayed in the second touch area of the graphical user interface based on the target display range. The fourth view is an observation view of the virtual building B1 determined by the target display range in the game scene A.
As also shown in fig. 16, in the blueprint mode view, a height value and a depth value may be input to determine the target display range. During the above operation of selecting the target display range, the planar map model may be enlarged, reduced, or closed. After the planar map model is closed, only the objects within the selected target display range are displayed in the second view on the graphical user interface, so as to avoid interference from other objects when building the virtual building B1.
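For illustration only (not part of the claimed method; the range representation, field names, and object format are hypothetical assumptions), the blueprint-mode range selection and filtering described above may be sketched as:

```python
from dataclasses import dataclass

@dataclass
class TargetRange:
    """Hypothetical target display range: an area selected on the planar
    map plus the height and depth values entered in the blueprint view."""
    x_min: float
    x_max: float
    z_min: float
    z_max: float
    height: float  # upper vertical bound entered in the blueprint view
    depth: float   # lower vertical bound entered in the blueprint view

    def contains(self, pos):
        # An object is inside the range if its horizontal position falls
        # within the selected map area and its vertical position falls
        # between the depth and height bounds.
        x, y, z = pos
        return (self.x_min <= x <= self.x_max
                and self.z_min <= z <= self.z_max
                and -self.depth <= y <= self.height)

def filter_scene(objects, rng):
    # After the planar map is closed, only objects inside the selected
    # target display range are rendered in the fourth view.
    return [o for o in objects if rng.contains(o["pos"])]
```

In this sketch, closing the planar map would trigger `filter_scene` over the scene's objects, so that only the building and its immediate surroundings remain visible in the updated view.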
Optionally, the method for controlling view display may further include the following steps:
Step S29: in response to a seventh touch operation acting on a fourth control in the second touch area, previewing the display state of the virtual building model under different ambient light intensities, wherein the fourth control is used for controlling the ambient light change of the virtual building model within a preset time range.
Still taking the building of the virtual building B1 in the game scene A as an example, the fourth control described above may be a control bar in the second view for controlling the virtual time in the game scene A (the extent of the control bar corresponds to 24 hours of a day in the game scene A). The seventh touch operation may be dragging the control handle of the control bar. When the seventh touch operation is detected, the display state of the virtual building B1 at the time corresponding to the control handle of the control bar can be previewed in the second view on the graphical user interface. As the control handle of the control bar is dragged, it corresponds to different virtual times and thus to different ambient light intensities.
By this method of previewing the light and shadow effects of the virtual building, the display effect of the virtual building B1 under the different ambient light intensities occurring at different times of a day can be observed rapidly without affecting the actual virtual time of the game scene A, so that the player can conveniently adjust the building process of the virtual building B1, improving the player's game experience.
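For illustration only (not part of the claimed method; the linear time mapping and the day/night intensity curve below are hypothetical assumptions, as a real engine would supply its own lighting model), the mapping from the control handle to a previewed ambient light intensity may be sketched as:

```python
import math

def handle_to_virtual_hour(fraction):
    # The control bar's full extent corresponds to 24 virtual hours, so the
    # handle position (0.0 .. 1.0 along the bar) maps linearly to a time of
    # day; out-of-range drags are clamped to the bar's ends.
    return 24.0 * min(max(fraction, 0.0), 1.0)

def ambient_intensity(hour):
    # Hypothetical day/night curve: intensity peaks at noon and falls to a
    # small night-time floor so the preview never goes fully dark.
    return max(0.05, math.sin(math.pi * hour / 24.0))
```

Because the preview only reads a candidate hour derived from the handle, the actual virtual time of the game scene is left untouched, matching the behavior described above.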
It is easy to notice that, by the method provided by the embodiment of the present invention, the following functions can be realized in the virtual game scene: quick display of the observation view, adjustment of the observation view by gesture operation, construction assisted by the marking function, a local view of the blueprint target range, and quick preview of time-of-day lighting effects. Therefore, the method of this embodiment achieves the technical effect of improving the convenience of observation operations and the richness of functions in the virtual game scene.
From the description of the above embodiments, it will be clear to those skilled in the art that the method according to the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by means of hardware, although in many cases the former is preferred. Based on such understanding, the technical solution of the present invention, essentially or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method according to the embodiments of the present invention.
This embodiment also provides an apparatus for controlling view display, which is used for implementing the above embodiments and preferred implementations; what has already been described is not repeated here. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 17 is a block diagram of an apparatus for controlling view display according to an embodiment of the present invention. A graphical user interface is provided through a terminal device, and the display content of the graphical user interface includes a game screen captured by a virtual camera in a game scene under a default view angle. As shown in fig. 17, the apparatus includes: a first display module 171 for displaying a first view in a first touch area in the graphical user interface, wherein the first view is a thumbnail view corresponding to the game screen under the default view angle; a second display module 172 for displaying a second view in a second touch area in the graphical user interface in response to a first touch operation acting on the first touch area, wherein the second touch area is larger than the first touch area and the second view is an observation view corresponding to the thumbnail view; and a control module 173 for adjusting the view angle corresponding to the second view from the default view angle to a target view angle in response to a second touch operation acting on the second touch area, and previewing the game scene under the target view angle.
Alternatively, FIG. 18 is a block diagram of an apparatus for controlling view display according to an alternative embodiment of the present invention, as shown in FIG. 18, which includes a determining module 174 for acquiring a first position of a virtual character in a game scene, determining a second position of a virtual camera in the game scene based on the first position, and determining a default view angle using the first position and the second position, in addition to all the modules shown in FIG. 17.
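For illustration only (not part of the claimed apparatus; the fixed camera offset and the yaw/pitch representation of the view angle are hypothetical assumptions), the logic of the determining module 174 may be sketched as:

```python
import math

def camera_position(character_pos, offset=(0.0, 8.0, -6.0)):
    # The second position (the virtual camera) is derived from the first
    # position (the virtual character) by a fixed offset placing the
    # camera above and behind the character; the offset is an assumption.
    return tuple(c + o for c, o in zip(character_pos, offset))

def default_view_angle(character_pos, camera_pos):
    # The default view angle is determined from the two positions as the
    # direction from the camera toward the character, expressed here as
    # (yaw, pitch) in degrees.
    dx, dy, dz = (c - p for c, p in zip(character_pos, camera_pos))
    yaw = math.degrees(math.atan2(dx, dz))
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return yaw, pitch
```

With the assumed offset, a camera above and behind the character yields a negative pitch, i.e., the default view angle looks down toward the character.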
Optionally, FIG. 19 is a block diagram of an apparatus for controlling view display according to an embodiment of the present invention, where the apparatus includes, as shown in FIG. 19, a restoration module 175, in addition to all the modules shown in FIG. 18, for restoring a view angle corresponding to a second view from a target view angle to a default view angle in response to a third touch operation applied to a first control in a second touch area, where the first control is configured to perform view angle restoration on the second view, and in response to exiting the second view, keeps displaying the first view in the first touch area.
Optionally, FIG. 20 is a block diagram of an apparatus for controlling view display according to an embodiment of the present invention, and as shown in FIG. 20, the apparatus includes, in addition to all the modules shown in FIG. 19, a first update module 176 configured to update and display a third view in the first touch area in response to exiting the second view, where the third view is a thumbnail view corresponding to the second view at the target view angle.
Optionally, FIG. 21 is a block diagram of an apparatus for controlling view display according to an embodiment of the present invention, where the apparatus includes, as shown in FIG. 21, a marking module 177 for acquiring a first marking location in a second view in response to a fourth touch operation on a second control in a second touch area, where the second control is used to activate a marking mode in the second view, and synchronously marking the second marking location in response to exiting the second view, where the second marking location is a corresponding location of the first marking location in a game screen, in addition to all the modules shown in FIG. 20.
Optionally, FIG. 22 is a block diagram of an apparatus for controlling view display according to an embodiment of the present invention. As shown in FIG. 22, the apparatus includes, in addition to all the modules shown in FIG. 21, a second update module 178 for displaying a planar map model corresponding to a game scene in response to a fifth touch operation applied to a third control in a second touch area, where the third control is used to activate a blueprint mode in the second view, selecting a target display range from the planar map model in response to a sixth touch operation applied to the planar map model, and updating and displaying a fourth view in the second touch area based on the target display range, where the fourth view is an observation view determined by the target display range.
Optionally, FIG. 23 is a block diagram of an apparatus for controlling view display according to an embodiment of the present invention, and as shown in FIG. 23, the apparatus further includes a preview module 179, in addition to all the modules shown in FIG. 22, for previewing a display state of a virtual building model under different ambient light intensities in response to a seventh touch operation applied to a fourth control in the second touch area, where the fourth control is used to control an ambient light change of the virtual building model within a preset time range.
It should be noted that each of the above modules may be implemented by software or hardware. For the latter, this may be achieved in the following manner, but is not limited thereto: the above modules are all located in the same processor, or the above modules are located in different processors in any combination.
Embodiments of the present invention also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-mentioned computer readable storage medium may include, but is not limited to, a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing a computer program.
Alternatively, in this embodiment, the above-mentioned computer-readable storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Optionally, the computer readable storage medium is further configured to store program code for displaying a first view in a first touch area in the graphical user interface, wherein the first view is a thumbnail view corresponding to a game screen under a default view angle, displaying a second view in a second touch area in the graphical user interface in response to a first touch operation acting on the first touch area, wherein the second touch area is larger than the first touch area, the second view is an observation view corresponding to the thumbnail view, adjusting a view angle corresponding to the second view from the default view angle to a target view angle in response to a second touch operation acting on the second touch area, and previewing the game scene under the target view angle.
Optionally, the above computer readable storage medium is further arranged to store program code for obtaining a first position of the virtual character in the game scene, determining a second position of the virtual camera in the game scene based on the first position, determining a default perspective using the first position and the second position.
Optionally, the computer readable storage medium is further configured to store program code for restoring a perspective corresponding to the second view from the target perspective to a default perspective in response to a third touch operation acting on a first control within the second touch area, wherein the first control is configured to reset the perspective of the second view, and maintaining the first view displayed within the first touch area in response to exiting the second view.
Optionally, the computer readable storage medium is further configured to store program code for updating display of a third view within the first touch area in response to exiting the second view, wherein the third view is a thumbnail view corresponding to the second view at the target perspective.
Optionally, the computer readable storage medium is further arranged to store program code for, in response to a fourth touch operation acting on a second control in the second touch area, obtaining a first marker position in the second view, wherein the second control is used to activate a marker mode in the second view, and in response to exiting the second view, synchronously marking the second marker position, wherein the second marker position is a corresponding position of the first marker position in the game screen.
Optionally, the computer readable storage medium is further configured to store program code for displaying a planar map model corresponding to the game scene in response to a fifth touch operation acting on a third control in the second touch area, wherein the third control is used for activating a blueprint mode in the second view, selecting a target display range from the planar map model in response to a sixth touch operation acting on the planar map model, and updating and displaying a fourth view in the second touch area based on the target display range, wherein the fourth view is an observation view determined by the target display range.
Optionally, the computer readable storage medium is further configured to store program code for previewing a display status of the virtual building model under different ambient light intensities in response to a seventh touch operation acting on a fourth control in the second touch area, wherein the fourth control is configured to control an ambient light variation of the virtual building model within a preset time range.
In the computer-readable storage medium of this embodiment, a technical solution of a method of controlling view display is provided. The method comprises the steps of firstly displaying a first view in a first touch area in a graphical user interface, wherein the first view is a thumbnail view corresponding to a game picture under a default view angle, further, responding to first touch operation acting on the first touch area, displaying a second view in a second touch area in the graphical user interface, wherein the second touch area is larger than the first touch area, the second view is an observation view corresponding to the thumbnail view, responding to second touch operation acting on the second touch area, adjusting the view angle corresponding to the second view from the default view angle to a target view angle, and previewing the game scene under the target view angle, so that the purpose of observing and previewing the virtual game scene through the operable thumbnail view is achieved, the technical effects of improving the convenience of observing operation and the richness of functions in the virtual game scene are achieved, and the technical problems of great operation difficulty, complex flow and poor player experience of a method provided by related technologies are solved.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present invention may be embodied in the form of a software product, which may be stored in a computer readable storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present invention.
In an exemplary embodiment of the present invention, a computer-readable storage medium stores thereon a program product capable of implementing the method described above in this embodiment. In some possible implementations, the various aspects of the embodiments of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary methods" section of this embodiment, when the program product is run on the terminal device.
A program product for implementing the above-described method according to an embodiment of the present invention may employ a portable compact disc read-only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the embodiments of the present invention is not limited thereto, and in the embodiments of the present invention, the computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Any combination of one or more computer readable media may be employed by the program product described above. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
The method comprises the steps of displaying a first view in a first touch area in a graphical user interface, wherein the first view is a thumbnail view corresponding to a game picture under a default view angle, displaying a second view in a second touch area in the graphical user interface in response to a first touch operation acting on the first touch area, wherein the second touch area is larger than the first touch area, the second view is an observation view corresponding to the thumbnail view, and adjusting the view angle corresponding to the second view from the default view angle to a target view angle in response to a second touch operation acting on the second touch area, and previewing a game scene under the target view angle.
Optionally, the processor may be further configured to perform the steps of obtaining a first position of the virtual character in the game scene, determining a second position of the virtual camera in the game scene based on the first position, and determining the default perspective using the first position and the second position.
Optionally, the processor may be further configured to perform, by the computer program, the steps of restoring a perspective corresponding to the second view from the target perspective to the default perspective in response to a third touch operation acting on a first control within the second touch area, wherein the first control is configured to reset the perspective of the second view, and maintaining display of the first view within the first touch area in response to exiting the second view.
Optionally, the processor may be further configured to execute, by the computer program, in response to exiting the second view, to update display a third view within the first touch area, wherein the third view is a thumbnail view corresponding to the second view at the target perspective.
Optionally, the processor may be further configured to perform the steps of obtaining, by the computer program, a first marker position in the second view in response to a fourth touch operation acting on a second control in the second touch area, wherein the second control is configured to activate a marker mode in the second view, and in response to exiting the second view, synchronously marking the second marker position, wherein the second marker position is a corresponding position of the first marker position in the game screen.
Optionally, the processor may be further configured to execute the steps of displaying, by the computer program, a planar map model corresponding to the game scene in response to a fifth touch operation acting on a third control in the second touch area, where the third control is used to activate a blueprint mode in the second view, selecting a target display range from the planar map model in response to a sixth touch operation acting on the planar map model, and updating and displaying, based on the target display range, a fourth view in the second touch area, where the fourth view is an observation view determined by the target display range.
Optionally, the processor may be further configured to execute, by means of a computer program, a step of previewing a display state of the virtual building model under different ambient light intensities in response to a seventh touch operation acting on a fourth control in the second touch area, where the fourth control is used to control an ambient light variation of the virtual building model within a preset time range.
In the electronic device of this embodiment, a technical solution of a method of controlling view display is provided. The method comprises the steps of firstly displaying a first view in a first touch area in a graphical user interface, wherein the first view is a thumbnail view corresponding to a game picture under a default view angle, further, responding to first touch operation acting on the first touch area, displaying a second view in a second touch area in the graphical user interface, wherein the second touch area is larger than the first touch area, the second view is an observation view corresponding to the thumbnail view, responding to second touch operation acting on the second touch area, adjusting the view angle corresponding to the second view from the default view angle to a target view angle, and previewing the game scene under the target view angle, so that the purpose of observing and previewing the virtual game scene through the operable thumbnail view is achieved, the technical effects of improving the convenience of observing operation and the richness of functions in the virtual game scene are achieved, and the technical problems of great operation difficulty, complex flow and poor player experience of a method provided by related technologies are solved.
Fig. 24 is a schematic diagram of an electronic device according to an embodiment of the invention. As shown in fig. 24, the electronic device 2400 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present invention.
As shown in fig. 24, the electronic apparatus 2400 is represented in the form of a general-purpose computing device. The components of the electronic device 2400 may include, but are not limited to, at least one processor 2410, at least one memory 2420, a bus 2430 connecting different system components (including the memory 2420 and the processor 2410), and a display 2440.
Wherein the memory 2420 stores program code that can be executed by the processor 2410 to cause the processor 2410 to perform the steps according to various exemplary embodiments of the present invention described in the above method section of the embodiment of the present invention.
The memory 2420 may include readable media in the form of volatile memory units, such as Random Access Memory (RAM) 24201 and/or cache memory unit 24202, and may further include Read Only Memory (ROM) 24203, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
In some examples, memory 2420 may also include a program/utility 24204 having a set (at least one) of program modules 24205, such program modules 24205 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. The memory 2420 may further include memory located remotely from the processor 2410, which may be connected to the electronic device 2400 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The bus 2430 can be one or more of several types of bus structure including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor 2410, or a local bus using any of a variety of bus architectures.
The display 2440 may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of the electronic device 2400.
Optionally, the electronic apparatus 2400 may also communicate with one or more external devices 1400 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic apparatus 2400, and/or with any device (e.g., router, modem, etc.) that enables the electronic apparatus 2400 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 2450. Also, electronic device 2400 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet via network adapter 2460. As shown in fig. 24, the network adapter 2460 communicates with other modules of the electronic device 2400 over the bus 2430. It should be appreciated that although not shown in FIG. 24, other hardware and/or software modules may be used in connection with the electronic apparatus 2400, which may include, but is not limited to, microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The electronic device 2400 described above may also include a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 24 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the electronic device 2400 may also include more or fewer components than shown in fig. 24, or have a different configuration than shown in fig. 24. The memory 2420 may be used to store a computer program and corresponding data, such as a computer program and corresponding data corresponding to a method of controlling view display in an embodiment of the present invention. The processor 2410 executes a computer program stored in the memory 2420 to perform various functional applications and data processing, i.e., to implement the above-described method of controlling view display. The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present invention, it should be understood that the disclosed technology may be implemented in other manners. The above-described apparatus embodiments are merely exemplary; for example, the division of the units is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention, essentially or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The storage medium includes a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that modifications and adaptations may be made by those of ordinary skill in the art without departing from the principles of the present invention, and such modifications and adaptations shall also fall within the scope of the present invention.