Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with one or more embodiments of the present specification. Rather, they are merely examples of apparatus and methods consistent with aspects of one or more embodiments of the present specification, as detailed in the accompanying claims.
It should be noted that, in other embodiments, the steps of the corresponding method are not necessarily performed in the order shown and described in this specification. In some other embodiments, the method may include more or fewer steps than described here. Furthermore, a single step described in this specification may be split into multiple steps in other embodiments, while multiple steps described in this specification may be combined into a single step in other embodiments.
Example 1
This embodiment concerns a virtual space for projecting virtual picture content. In an embodiment of the present disclosure, the virtual space includes a plurality of viewports, and the virtual picture content corresponding to each viewport is projected into that viewport, so that the viewports together form a fully immersive virtual space. The scheme aims to realize real-time editing of virtual picture content in the virtual space.
Referring to fig. 3, the method for implementing real-time editing based on the extended phantom window provided by the present scheme includes:
creating, in the virtual engine, at least one phantom window in one-to-one correspondence with the viewports of the virtual space;
displaying the virtual picture content corresponding to each viewport in the corresponding phantom window, and transmitting the corresponding viewport rendering data, wherein the viewport rendering data comprises at least a rendering mode, a viewport orientation, and a viewport matrix; and
the virtual engine rendering, based on the viewport rendering data and the virtual picture content, the picture for the corresponding viewport of the virtual space and projecting it onto that viewport.
It should be noted that, in the step of "creating, in the virtual engine, at least one phantom window in one-to-one correspondence with the viewports of the virtual space," the number of phantom windows equals the number of viewports in the virtual space, and each viewport corresponds to exactly one phantom window.
In a specific application scenario, as shown in fig. 1, the virtual space includes five viewports: a ground viewport, an east viewport, a south viewport, a west viewport, and a north viewport. The east, south, west, and north viewports are arranged in sequence in the east, south, west, and north directions, and the ground viewport is disposed on the bottom plane of these four viewports, so that the five viewports jointly enclose a bowl-shaped space. The projection device can turn toward the different viewport directions and project the corresponding virtual picture content. Accordingly, five phantom windows, corresponding respectively to the ground viewport, the east viewport, the south viewport, the west viewport, and the north viewport, are created in the virtual engine.
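The one-to-one correspondence between viewports and phantom windows described above can be sketched as a simple mapping. The names and record fields below are illustrative assumptions, not an actual engine API:

```python
# Illustrative sketch of the one-to-one viewport-to-phantom-window mapping
# for the five-sided "bowl" space; names and fields are assumptions.
VIEWPORTS = ["ground", "east", "south", "west", "north"]

def create_phantom_windows(viewports):
    """Create one phantom window record per viewport (one-to-one)."""
    return {vp: {"viewport_orientation": vp, "window_id": i}
            for i, vp in enumerate(viewports)}

windows = create_phantom_windows(VIEWPORTS)
```

Because the mapping is one-to-one, each phantom window carries its viewport's orientation, which later distinguishes the viewports when rendering data is transmitted.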
Editing of the virtual engine may be performed within the virtual engine editor. Because the phantom windows are displayed directly in the virtual space, a designer editing in the virtual engine can see the result fed back on the phantom windows in real time.
In the step of displaying the virtual picture content corresponding to each viewport in the corresponding phantom window, the picture content is edited in the editor of the virtual engine, converted into virtual picture content, and filled into the corresponding phantom window through a rendering interface.
It is noted that the operator can edit the virtual picture content in the editor of the virtual engine, including but not limited to replacing, modifying, deleting, or adding any map data, object materials, or blueprint content. The corresponding virtual picture content is then displayed in the phantom window, so that the operator can see the modified picture inside the virtual engine.
It should be noted that the virtual picture content is content that has already been rendered using the viewport rendering data, so the operator can play back the virtual picture of the virtual space within the editor of the virtual engine. In some embodiments, the virtual picture content is filled into the phantom window through a rendering interface.
Unlike scenes that require wearing VR or AR devices, in this virtual space the viewports are unbound from the observer's viewing angle: the observer can move freely in the virtual space, and the virtual picture content displayed in the virtual space changes visually according to the observer's coordinate position. Specifically, when the observer approaches a particular viewport, the virtual picture content projected by that viewport should appear wider, and when the observer moves away from it, the content should appear narrower, so that the observer perceives spatial depth in the virtual space. This change in the virtual picture content is controlled by the viewport matrix transmitted in the present scheme.
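The wider/narrower relationship just described can be checked with elementary geometry: the angle a viewport subtends at the observer grows as the observer approaches its plane. A minimal sketch, with hypothetical sizes:

```python
import math

def subtended_fov_deg(viewport_width, distance):
    """Horizontal angle (in degrees) that a viewport of the given width
    subtends at an observer standing `distance` away from its plane."""
    return math.degrees(2.0 * math.atan(viewport_width / (2.0 * distance)))

near_fov = subtended_fov_deg(4.0, 1.0)  # observer close to the viewport: wider
far_fov = subtended_fov_deg(4.0, 4.0)   # observer farther away: narrower
```

This is the geometric effect the transmitted viewport matrix must reproduce as the observer's coordinates change.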
In the step of generating the virtual picture content, not only does the viewport matrix of each viewport influence the rendering result, but the orientation and rendering mode of each viewport do as well. The present scheme therefore transmits the corresponding viewport rendering data when each phantom window is rendered, and the viewport rendering data comprises at least a rendering mode, a viewport orientation, and a viewport matrix.
The rendering mode is the mode in which the virtual engine renders the viewport, which may be selected as PERSPECTIVE (perspective projection). The viewport orientation is the direction the rendered viewport faces; for example, the orientation of the east viewport is east, which serves to distinguish and identify each viewport. The viewport matrix carries the view transformation information related to the user's observation coordinates; it is created automatically from the user's observation coordinates in the virtual space and the viewport window size. A schematic representation of the viewport matrix is as follows:
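The original matrix illustration is not reproduced here. Under the PERSPECTIVE mode named above, a viewport matrix of this kind is conventionally a 4×4 homogeneous matrix whose entries depend on the viewport window size (through the aspect ratio) and the observation parameters; the following generic symbolic form is an assumption about that standard layout, not the document's specific matrix:

```latex
M_{\mathrm{viewport}} =
\begin{pmatrix}
\dfrac{f}{a} & 0 & 0 & 0 \\
0 & f & 0 & 0 \\
0 & 0 & \dfrac{z_f + z_n}{z_n - z_f} & \dfrac{2 z_f z_n}{z_n - z_f} \\
0 & 0 & -1 & 0
\end{pmatrix},
\qquad f = \cot\frac{\theta}{2}
```

where \(a\) is the aspect ratio of the viewport window, \(\theta\) the vertical field of view, and \(z_n, z_f\) the near and far clipping distances.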
It should be noted that the matrix elements in the viewport matrix are calculated automatically from parameters such as the viewport window size and the observation coordinate information, so the value of any individual matrix element shown is only illustrative and has no practical meaning.
Therefore, the virtual engine needs to acquire the user's observation coordinate information and calculate the viewport matrix of the current viewport based on that information and the viewport window size of the current viewport.
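As a hedged sketch, the calculation just described might look as follows; the function name and default parameters are assumptions, and only the dependence on the viewport window size (through the aspect ratio) is taken from the text:

```python
import math

def viewport_matrix(window_w, window_h, fov_y_deg=90.0, z_near=0.1, z_far=100.0):
    """Build a standard PERSPECTIVE projection matrix whose aspect ratio
    comes from the viewport window size (a sketch, not an engine API)."""
    aspect = window_w / window_h
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)  # cot(fov/2)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (z_far + z_near) / (z_near - z_far),
         (2.0 * z_far * z_near) / (z_near - z_far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

m = viewport_matrix(3840, 2160)  # a viewport window of the editor's size
```

In a full pipeline, the user's observation coordinates would additionally determine a view (look-at) transform composed with this projection; that part is omitted here.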
In the step of displaying the virtual picture content corresponding to each viewport in the corresponding phantom window and transmitting the corresponding viewport rendering data, the user's observation coordinate information is acquired, the viewport matrix is obtained in combination with the viewport corresponding to the current phantom window, and the edited picture content is processed with the viewport matrix to obtain the virtual picture content.
In addition, it should be noted that, in order for the virtual engine to process the edited picture content according to the viewport matrix and obtain the virtual picture content, the virtual engine first needs to reload (override) the rendering class of its editor, and then pass the viewport matrix into it.
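The "reload the rendering class, then pass in the viewport matrix" step could be sketched as subclassing the editor's renderer. Every name here (EditorRenderer, apply_matrix, and so on) is a hypothetical stand-in, not the engine's actual class:

```python
def apply_matrix(matrix, content):
    """Stand-in for transforming edited picture content by a viewport matrix."""
    return {"matrix": matrix, "content": content}

class EditorRenderer:
    """Stand-in for the editor's default rendering class."""
    def render(self, content):
        return content

class PhantomWindowRenderer(EditorRenderer):
    """Overridden ('reloaded') renderer that applies the viewport matrix
    before handing content to the editor's default rendering path."""
    def __init__(self, viewport_matrix):
        self.viewport_matrix = viewport_matrix

    def render(self, content):
        return super().render(apply_matrix(self.viewport_matrix, content))

renderer = PhantomWindowRenderer(viewport_matrix=[[1, 0], [0, 1]])
frame = renderer.render("edited picture content")
```

The design point is that the editor's default path is left intact; only the per-phantom-window transform is injected ahead of it.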
In the step in which the virtual engine renders the corresponding viewport of the virtual space based on the viewport rendering data and the virtual picture content and projects onto the corresponding viewport, the viewport that needs to be rendered is determined from the viewport rendering data, and the virtual picture content is projected onto that viewport.
In an embodiment of the present disclosure, the virtual space includes five viewports: a ground viewport, an east viewport, a south viewport, a west viewport, and a north viewport. The corresponding optical engine (projector) devices may be oriented in five directions, namely due east, due south, due west, due north, and toward the ground, so as to project the virtual picture content onto the corresponding viewports.
In addition, if the sizes of the viewport window and the phantom window are inconsistent with the window size of the virtual engine editor, the viewport matrix and the phantom window cannot be adapted to each other. That is, in the editor of the virtual engine, the viewport matrix is generated automatically according to the size of the corresponding viewport window, and if the viewport window size is inconsistent with the size of the virtual engine editor, the virtual picture content becomes distorted.
To solve this problem, in the step of displaying the virtual picture content corresponding to each viewport in the corresponding phantom window, the virtual engine creates a phantom window with the same size ratio as the viewport, processes the edited picture content with the viewport matrix to obtain the virtual picture content, renders it into the phantom window, and then scales the picture of the phantom window to fit the size ratio of the virtual engine editor's window frame.
In this scheme, the virtual engine editor provides a function that refreshes the size ratio of the virtual picture content to match the size ratio of the editor window. Because the virtual picture content is generated from the corresponding viewport matrix, when the virtual engine projects it onto the corresponding viewport, picture content matching the viewport size is still obtained.
Illustratively, assume the ground viewport is square and the window of the virtual engine editor is rectangular. When rendering the phantom window corresponding to the ground viewport, the virtual engine creates a 3840×3840 phantom window, processes the edited picture content with the viewport matrix to obtain the virtual picture content, and renders it into the phantom window; this ensures that the content rendered onto the ground viewport is square. The virtual picture content is then compressed to a 3840×2160 format and filled into the editor window frame. Thus, the ground viewport's pixels are compressed within the 3840×2160 phantom window, but are restored to the square area when the virtual engine performs projection restoration within the virtual space.
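The 3840×3840 to 3840×2160 round trip in this example can be verified with the per-axis scale factors alone; this sketch only checks the arithmetic:

```python
def compress_scale(native, window):
    """Per-axis scale factors when fitting a natively rendered picture
    (e.g. the square 3840x3840 ground render) into the editor window."""
    return (window[0] / native[0], window[1] / native[1])

def restore(window, scale):
    """Undo the compression at projection time to recover the native size."""
    return (round(window[0] / scale[0]), round(window[1] / scale[1]))

scale = compress_scale((3840, 3840), (3840, 2160))  # vertical compression only
native = restore((3840, 2160), scale)               # square size recovered
```

Because only the vertical axis is compressed (by 2160/3840 = 0.5625), the projection step can recover the square picture exactly.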
According to the scheme, by extending phantom windows, the virtual space acts as a real-time responder to the virtual engine, achieving the effect that virtual picture content can be checked in real time in the editor of the virtual engine. In addition, by relying on the powerful editing functions of the virtual engine, the scheme can present the complex editing process as a 3D display to the greatest extent.
Because the scheme realizes real-time 3D display, it applies very well to certain specific scenarios. For example, in a tiny-part assembly project in the education field, the instructor only needs to select a tiny object in the virtual engine with the mouse and keyboard; with the engine's built-in functions, the selected object can be given different special effects, and movement control is likewise fine-grained and convenient. This can enhance trainees' understanding of the manner of operation.
Example 2
Based on the same conception, referring to fig. 3, the present application also provides a device for realizing real-time editing in a virtual space based on the extended phantom window, which comprises:
a phantom window expansion unit, configured to create, in the virtual engine, at least one phantom window in one-to-one correspondence with the viewports of the virtual space;
an editing unit, configured to display the virtual picture content corresponding to each viewport in the corresponding phantom window and to transmit the corresponding viewport rendering data, wherein the viewport rendering data comprises at least a rendering mode, a viewport orientation, and a viewport matrix; and
a rendering unit, configured to cause the virtual engine to render, based on the viewport rendering data and the virtual picture content, the picture for the corresponding viewport of the virtual space and to project it onto that viewport.
The device for realizing real-time editing in a virtual space based on the extended phantom window can be implemented as a plug-in within the virtual engine. Technical content in this Example 2 that is the same as in Example 1 is detailed in the description of Example 1 and is not repeated here.
Example 3
This embodiment also provides an electronic device. Referring to fig. 4, it comprises a memory 404 and a processor 402, the memory 404 having stored therein a computer program, and the processor 402 being arranged to run the computer program so as to perform the steps in any of the above embodiments of the method for real-time editing in a virtual space based on the extended phantom window.
In particular, the processor 402 may include a central processing unit (CPU), or an application specific integrated circuit (ASIC), or may be configured as one or more integrated circuits that implement embodiments of the present application.
The memory 404 may include mass storage for data or instructions. By way of example, and not limitation, the memory 404 may comprise a hard disk drive (HDD), a floppy disk drive, a solid state drive (SSD), flash memory, an optical disk, a magneto-optical disk, a magnetic tape, a Universal Serial Bus (USB) drive, or a combination of two or more of these. The memory 404 may include removable or non-removable (or fixed) media, where appropriate. The memory 404 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, the memory 404 is a non-volatile memory. In particular embodiments, the memory 404 includes read-only memory (ROM) and random access memory (RAM). Where appropriate, the ROM may be a mask-programmed ROM, a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), an electrically alterable ROM (EAROM), or flash memory (FLASH), or a combination of two or more of these. The RAM may be a static random access memory (SRAM) or a dynamic random access memory (DRAM), where the DRAM may be a fast page mode DRAM (FPM DRAM), an extended data out DRAM (EDO DRAM), a synchronous DRAM (SDRAM), or the like, where appropriate.
Memory 404 may be used to store or cache various data files that need to be processed and/or used for communication, as well as possible computer program instructions for execution by processor 402.
The processor 402 implements the method for real-time editing in a virtual space based on the extended phantom window in any of the above embodiments by reading and executing the computer program instructions stored in the memory 404.
Optionally, the electronic device may further include a transmission device 406 and an input/output device 408, where both the transmission device 406 and the input/output device 408 are connected to the processor 402.
The transmission device 406 may be used to receive or transmit data via a network. Specific examples of the network may include a wired or wireless network provided by a communication provider of the electronic device. In one example, the transmission device includes a network interface controller (NIC) that can connect to other network devices through a base station so as to communicate with the internet. In another example, the transmission device 406 may be a radio frequency (RF) module, which is configured to communicate with the internet wirelessly.
The input/output device 408 is used for editing content and the like; the output information may be virtual picture content on an extended phantom window, content projected onto a viewport, or the like.
Alternatively, in this embodiment, the above-mentioned processor 402 may be configured to execute the following steps through a computer program:
creating, in the virtual engine, at least one phantom window in one-to-one correspondence with the viewports of the virtual space;
displaying the virtual picture content corresponding to each viewport in the corresponding phantom window, and transmitting the corresponding viewport rendering data, wherein the viewport rendering data comprises at least a rendering mode, a viewport orientation, and a viewport matrix; and
the virtual engine rendering, based on the viewport rendering data and the virtual picture content, the picture for the corresponding viewport of the virtual space and projecting it onto that viewport.
It should be noted that, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and alternative implementations, which are not repeated in this embodiment.
In general, the various embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects of the invention may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto. While various aspects of the invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
Embodiments of the invention may be implemented by computer software executable by a data processor of a mobile device, such as in a processor entity, or by hardware, or by a combination of software and hardware. Computer software or programs (also referred to as program products) including software routines, applets, and/or macros can be stored in any apparatus-readable data storage medium and they include program instructions for performing particular tasks. The computer program product may include one or more computer-executable components configured to perform embodiments when the program is run. The one or more computer-executable components may be at least one software code or a portion thereof. In addition, in this regard, it should be noted that any blocks of the logic flows as illustrated may represent program steps, or interconnected logic circuits, blocks and functions, or a combination of program steps and logic circuits, blocks and functions. The software may be stored on a physical medium such as a memory chip or memory block implemented within a processor, a magnetic medium such as a hard disk or floppy disk, and an optical medium such as, for example, a DVD and its data variants, a CD, etc. The physical medium is a non-transitory medium.
It should be understood by those skilled in the art that the technical features of the above embodiments may be combined in any manner. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The foregoing examples illustrate only a few embodiments of the application and are described in relative detail, but they are not therefore to be construed as limiting the scope of the application. It should be noted that several variations and modifications may be made by those skilled in the art without departing from the spirit of the application, all of which fall within the protection scope of the application. Accordingly, the scope of protection of the application should be determined by the appended claims.