
CN113077541B - Virtual sky picture rendering method and related equipment - Google Patents

Virtual sky picture rendering method and related equipment

Info

Publication number
CN113077541B
CN113077541B
Authority
CN
China
Prior art keywords: information, vertex, color information, preset, pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110362290.4A
Other languages
Chinese (zh)
Other versions
CN113077541A (en)
Inventor
刘立
周明付
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Yiju Future Network Technology Co ltd
Original Assignee
Guangzhou Yiju Future Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Yiju Future Network Technology Co ltd
Priority to CN202110362290.4A
Publication of CN113077541A
Application granted
Publication of CN113077541B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/10 - Geometric effects
    • G06T15/20 - Perspective computation
    • G06T15/205 - Image-based rendering
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/001 - Texturing; Colouring; Generation of texture or colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

The embodiment of the application provides a rendering method of a virtual sky picture and related equipment, and relates to the technical field of computers. The method comprises the following steps: acquiring vector information of each vertex normal on a preset sphere model; determining color information of each pixel point position on a preset sphere model based on preset background color information of a virtual sky picture and vector information of each vertex normal; and rendering a virtual sky picture on a preset sphere model based on the color information of the positions of the pixel points. The implementation of the scheme of the application can effectively improve the efficiency of rendering the virtual sky picture.

Description

Virtual sky picture rendering method and related equipment
Technical Field
The application relates to the technical field of computers, in particular to a rendering method of a virtual sky picture and related equipment.
Background
In some programs or web pages that provide a scene experience, such as 3D game programs or web pages, a virtual environment that is visually close to the real world is generally constructed, so that the user obtains a realistic experience when using the program.
In the prior art, a sky picture in a virtual environment is generally constructed by rendering, in the product, a sky panorama that has been manually pre-drawn or photographed. This approach places very high requirements on the up-front drawing or shooting; meanwhile, the resolution of the rendered sky picture is generally low, so the picture appears blurred; and the rendered sky picture is static, which results in a poor user experience.
Disclosure of Invention
The technical solution provided by the present application aims to solve at least one of the above technical drawbacks. The technical scheme is as follows:
in a first aspect of the present application, a method for rendering a virtual sky picture is provided, including:
acquiring vector information of each vertex normal on a preset sphere model;
determining color information of each pixel point position on the preset sphere model based on preset background color information of a virtual sky picture and vector information of each vertex normal;
and rendering a virtual sky picture on the preset sphere model based on the color information of the positions of the pixel points.
With reference to the first aspect, in a first implementation manner of the first aspect, the obtaining vector information of normals of vertices on a preset sphere model includes:
acquiring a preset sphere model used for constructing a virtual sky picture in a virtual environment, and vertex coordinate information of each vertex on the preset sphere model;
and determining vector information of the normals of the vertexes based on the vertex coordinate information and the normals corresponding to the vertexes.
With reference to the first implementation manner of the first aspect, in a second implementation manner of the first aspect, the preset sphere model includes a hemisphere model, and the preset sphere model is composed of at least two first lines arranged in a latitude direction and at least two second lines arranged in a longitude direction; the vertex comprises an intersection of the first line and the second line.
With reference to the first aspect, in a third implementation manner of the first aspect, the determining color information of each pixel point position on the preset sphere model based on the preset background color information of the virtual sky picture and the vector information of each vertex normal includes any one of:
aiming at each vertex, calculating to obtain color information of the vertex position based on preset background color information of a virtual sky picture and Y-axis information of a vertex normal corresponding to the vertex on a right-hand coordinate system; determining color information of each pixel point position on the preset sphere model based on the color information of each vertex position;
determining vector information of the normal of each pixel point on the preset sphere model based on the vector information of the normal of each vertex; and aiming at each pixel point, calculating to obtain the color information of each pixel point position based on the preset background color information of the virtual sky picture and the Y-axis information of the pixel point normal corresponding to the pixel point on a right-hand coordinate system.
With reference to the third implementation manner of the first aspect, in a fourth implementation manner of the first aspect, the color information of the vertex position is described as:
color = backcolor - normal.y² * C
wherein, color is the color information of a vertex position or a pixel point position, backcolor is the preset background color information, normal.y is the Y-axis information of the vertex normal or the pixel point normal on a right-hand coordinate system, and C is a coefficient for adjusting the color change of the virtual sky picture.
With reference to the third implementation manner of the first aspect, in a fifth implementation manner of the first aspect, the determining color information of each pixel point position on the preset sphere model based on the color information of each vertex position includes:
calculating color information of each pixel point position on the preset sphere model by a linear interpolation method based on the color information of each vertex position;
the determining vector information of the normals of the pixel points on the preset sphere model based on the vector information of the normals of the vertexes includes:
and calculating the vector information of the normal of each pixel point on the preset sphere model by a linear interpolation method based on the vector information of the normal of each vertex.
With reference to the first aspect, in a sixth implementation manner of the first aspect, the preset background color information includes at least one of a color value representing blue in an RGB mode, a color value dynamically adjusted based on weather information, and a color value dynamically adjusted based on a time of earth rotation or earth revolution.
In a second aspect of the present application, there is provided an apparatus for rendering a virtual sky picture, comprising:
the acquisition module is used for acquiring vector information of each vertex normal on a preset spherical model;
the determining module is used for determining color information of each pixel point position on the preset sphere model based on preset background color information of a virtual sky picture and vector information of each vertex normal;
and the rendering module is used for rendering a virtual sky picture on the preset sphere model based on the color information of the positions of the pixel points.
In a third aspect of the present application, there is provided an electronic device including:
one or more processors;
a memory;
one or more computer programs, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the method for rendering a virtual sky picture provided in the first aspect.
In a fourth aspect of the present application, a computer-readable storage medium is provided for storing computer instructions which, when executed on a computer, cause the computer to perform the method for rendering a virtual sky picture provided in the first aspect.
The beneficial effects of the technical solution provided by the present application are as follows:
in the present application, color information of each pixel point position on a preset sphere model is determined by acquiring vector information of each vertex normal on the model and combining it with preset background color information of the virtual sky picture to be rendered, and the virtual sky picture is then rendered on the preset sphere model based on the color information of each pixel point position. With this scheme, no manually drawn or photographed material needs to be prepared in advance, which reduces the rendering cost and the required expertise; because the rendering result is produced by a computer executing the relevant operation logic, the resolution of the virtual sky picture can be effectively controlled; and because the color information of each pixel point position can be controlled through the preset background color information and the vector information of the vertex normals, the overall appearance of the virtual sky picture can be controlled, dynamic changes of the virtual sky picture are easy to realize, and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a flowchart illustrating a method for rendering a virtual sky picture according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a preset sphere model according to an embodiment of the present disclosure;
fig. 3a is a schematic cross-sectional view of a preset sphere model according to an embodiment of the present disclosure;
FIG. 3b is a schematic diagram of vertex and pixel point positions according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a right-hand coordinate system provided by an embodiment of the present application;
fig. 5 is a schematic operating environment diagram of a method for rendering a virtual sky picture according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an apparatus for rendering a virtual sky picture according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
In the related art, a sky picture in a virtual environment is generally constructed by rendering, in the product, a sky panorama that has been pre-drawn or photographed by a person (e.g., an artist). This approach places very high requirements on the up-front drawing or shooting; meanwhile, the resolution of the rendered sky picture is generally low, so the picture appears blurred, and if the resolution is increased, significant pressure is placed on hardware such as the video memory; moreover, the rendered sky picture is static, and if it needs to be varied for different environments (e.g., a sandstorm, daytime, a rainy day, etc.), a plurality of panoramas need to be produced separately, which is very costly and results in a poor user experience.
To solve at least one of the above problems, the present application provides a method for rendering a virtual sky picture and related apparatus. The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 shows a flowchart of a method for rendering a virtual sky picture according to an embodiment of the present application. The method may be executed by any electronic device, such as a user terminal or a server; the user terminal may be a smartphone, a tablet computer, a notebook computer, or a desktop computer, and the present application is not limited thereto. Specifically, the method includes the following steps S101 to S103:
step S101: and obtaining vector information of each vertex normal on a preset sphere model.
In particular, the preset sphere model may be used to construct a virtual sky picture in a virtual environment. The virtual environment may be the virtual environment displayed when various programs (such as applets or application programs installed on the terminal) run on the terminal, or when an online webpage is browsed through the terminal; the virtual environment may be designed to imitate the real world or may be an entirely imaginary environment; and the virtual environment may be two-dimensional or three-dimensional, to suit the requirements of different networks, applications, or plans. The preset sphere model may be a spherical mesh structure; the model comprises a hemisphere model (shown in fig. 2) composed of at least two first lines arranged in a latitude direction (the transverse direction shown in fig. 2) and at least two second lines arranged in a longitude direction (the longitudinal direction shown in fig. 2); the vertex comprises an intersection of the first line and the second line.
The vertex normal may be the normal corresponding to a vertex. In the embodiment of the present application, one vertex may belong to multiple surfaces (as shown in fig. 2, a vertex may belong to four surfaces or to two surfaces, and the vertex at the topmost point may belong to N surfaces, where N is twice the number of second lines), so one vertex may correspond to multiple normals, one per surface. To improve the operation accuracy, in the embodiment of the present application a vertex normal may specifically refer to the mean of the multiple normals corresponding to one vertex: if the multiple normals correspond to respective vectors, the vector of the vertex normal is taken as the mean of those normal vectors.
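As a minimal sketch of this averaging (assuming the unit normal of each surface adjacent to the vertex is already known; the function and variable names are illustrative and not taken from the application, and the final renormalisation is an extra step commonly used so the result stays a unit vector):

    import numpy as np

    def vertex_normal(face_normals):
        # face_normals: one 3D unit normal per surface adjacent to the vertex
        # (two, four, or N surfaces depending on where the vertex sits on the mesh).
        mean = np.mean(np.asarray(face_normals, dtype=float), axis=0)
        return mean / np.linalg.norm(mean)   # renormalise the averaged vector

    # Example: a vertex shared by four surfaces whose normals tilt slightly apart.
    n = vertex_normal([(0.0, 1.0, 0.1), (0.0, 1.0, -0.1),
                       (0.1, 1.0, 0.0), (-0.1, 1.0, 0.0)])
    print(n)   # approximately (0, 1, 0)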
Wherein the vector information comprises information characterizing the direction and the length.
Optionally, the preset sphere model and the vector information of each vertex normal on the model may be pre-constructed before the embodiment of the present application is executed, and may be used to construct various sky-related element pictures (such as sun, starry sky, moon, etc.); in addition, the preset sphere model and the vector information of each vertex normal on the model may also be constructed in an initial stage of executing the embodiment of the present application to adapt to the virtual sky picture. This is not limited in this application.
Step S102: and determining the color information of the position of each pixel point on the preset sphere model based on the preset background color information of the virtual sky picture and the vector information of each vertex normal.
Alternatively, the virtual sky picture may be a sky with a 3D effect.
The preset background color information includes a color value representing blue in the RGB mode (blue is used as the base color of the virtual sky picture). Optionally, the preset background color may be adjusted according to the sky effect to be created, and it may be dynamically adjusted according to the rotation and/or revolution time of the earth: the rotation of the earth produces day and night, so the color value may be set to blue during the day and adjusted toward black at night; the revolution of the earth produces the four seasons, so a dark blue color value may be used in summer and a light blue (cyan) color value in spring. Optionally, the preset background color may also be dynamically adjusted according to the weather conditions: weather information is acquired in real time or periodically, and the color value corresponding to each kind of weather is determined based on a preset mapping list between weather information and color values.
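A sketch of how such dynamic adjustment could look; the hour thresholds, RGB values, and weather-to-color mapping below are illustrative assumptions rather than values given in the application:

    # Illustrative mapping list from weather to an RGB background color (0-255).
    WEATHER_COLORS = {
        "clear":     (64, 144, 255),   # blue base color
        "sandstorm": (194, 154, 108),
        "rain":      (110, 120, 140),
    }

    def background_color(hour, weather="clear"):
        # hour: local hour 0-23, standing in for the day/night effect of rotation.
        if weather != "clear" and weather in WEATHER_COLORS:
            return WEATHER_COLORS[weather]
        if 6 <= hour < 18:                 # daytime
            return WEATHER_COLORS["clear"]
        return (5, 5, 20)                  # near-black night sky

    print(background_color(hour=14))                  # daytime blue
    print(background_color(hour=2))                   # night
    print(background_color(hour=10, weather="rain"))  # overridden by weather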
Specifically, determining the color information of each pixel point position based on the preset background color information of the virtual sky picture and the vector information of each vertex normal involves the principle of atmospheric scattering. Atmospheric scattering refers to the phenomenon in which electromagnetic waves interact with atmospheric molecules, aerosols, and the like, so that the incident energy is redistributed in all directions according to a certain rule. The redistributed color information at each vertex position is finally determined by combining the preset background color information with the vector information of the vertex normal in each direction. The preset background color information includes values representing various colors in the RGB mode; the color information of each pixel point position includes values, also representing colors in the RGB mode, obtained by processing the preset background color information together with the vector information of each vertex normal.
Optionally, the color information of each pixel point position on the preset sphere model may be obtained based on the color information of each vertex position (described in the following embodiments); alternatively, after the vector information of the normal corresponding to each pixel point on the preset sphere model is obtained based on the vector information of each vertex normal, the color information of each pixel point position is determined based on the preset background color information and the vector information of each pixel point normal.
Step S103: and rendering a virtual sky picture on a preset sphere model based on the color information of the positions of the pixel points.
Specifically, the color information of each pixel point position is added to the preset sphere model, and then a virtual sky picture can be rendered based on the color value of each pixel point position on the preset sphere model.
The preset sphere model is described below with reference to fig. 2 to fig. 3b.
In one embodiment, the step S101 of obtaining vector information of each vertex normal on the preset sphere model includes the following steps A1-A2:
step A1: acquiring a preset sphere model used for constructing a virtual sky picture in a virtual environment, and presetting vertex coordinate information of each vertex on the sphere model.
Specifically, since the preset sphere model is a stereoscopic model, the vertex coordinate information of each vertex on the preset sphere model includes three-dimensional coordinate information (x, y, z). The coordinate origin may be set at the topmost point of the sphere model or at its center point, or at any other position, which is not limited in the present application. The vertex coordinate information of each vertex includes the three-dimensional coordinate information corresponding to the intersection points of all the first lines and all the second lines in the mesh.
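As a sketch (not taken from the application) of how the vertex coordinates at the intersections of the latitude and longitude lines might be generated for a unit-radius hemisphere centred at the origin, with Y as the up axis of a right-hand coordinate system; the grid counts are illustrative:

    import math

    def hemisphere_vertices(n_lat=8, n_lon=16, radius=1.0):
        # Vertex coordinates at the intersections of n_lat latitude rings
        # and n_lon longitude lines of a hemisphere whose pole lies on +Y.
        verts = []
        for i in range(n_lat + 1):              # i = 0: equator ring, i = n_lat: top
            phi = (math.pi / 2) * i / n_lat     # elevation angle above the equator
            y = radius * math.sin(phi)
            ring_r = radius * math.cos(phi)     # radius of this latitude ring
            for j in range(n_lon):
                theta = 2 * math.pi * j / n_lon
                verts.append((ring_r * math.cos(theta), y, ring_r * math.sin(theta)))
        return verts                            # the top ring degenerates to the pole

    vertices = hemisphere_vertices()
    print(len(vertices))    # (n_lat + 1) * n_lon intersection points
    print(vertices[0])      # a point on the equator

For a sphere centred at the origin, the normalised position vector of a vertex can also serve directly as its vertex normal, which agrees with the averaged face normals as the grid becomes finer.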
Optionally, in consideration that vector information in the direction shown in fig. 3a is mainly used when determining the color information of each vertex position in step S102, in this embodiment of the present application, a right-hand coordinate system (shown in fig. 4) may be used to establish a preset sphere model coordinate system; in addition, other three-dimensional coordinate systems may be considered.
Step A2: vector information of each vertex normal is determined based on the vertex coordinate information and the normal corresponding to each vertex.
Specifically, a vertex may correspond to multiple normals. For example, if a vertex belongs to four surfaces (as shown in fig. 2, a surface may correspond to four vertices or to three vertices), the vertex corresponds to four normals; the vector information of the vertex normal (which may be a unit vector) can then be obtained based on the coordinate information of the vertex and all of its corresponding normals.
A specific procedure of how to determine color information of each vertex position is explained below.
In an embodiment, the determining, in step S102, color information of each pixel point position on the preset sphere model based on the preset background color information of the virtual sky picture and the vector information of each vertex normal includes any one of the following steps B1-B2:
step B1: aiming at each vertex, calculating to obtain color information of the vertex position based on preset background color information of a virtual sky picture and Y-axis information of a vertex normal corresponding to the vertex on a right-hand coordinate system; and determining the color information of the position of each pixel point on the preset spherical surface model based on the color information of each vertex position.
Specifically, as shown in fig. 3a, for each vertex normal, the color information of the corresponding vertex position may be calculated by combining its Y-axis information on the right-hand coordinate system (the Y-axis component of the unit vector, which also reflects how the sky color changes in different directions) with the preset background color information of the virtual sky picture.
Wherein the color information of the vertex position is described as the following formula (1):
color = backcolor - normal.y² * C
.... formula (1)
wherein, color is the color information of the vertex position, backcolor is the preset background color information, normal.y is the Y-axis information of the vertex normal on the right-hand coordinate system, and C is a coefficient for adjusting the color change of the virtual sky picture.
Optionally, as shown in fig. 3a, the Y-axis information corresponding to a vertex at the bottom position is 0, and the Y-axis information corresponding to the vertex at the top position is 1; on this basis, the coefficient C adjusts the color change of the virtual sky picture rendered by the preset sphere model. The coefficient C may be any value in the range (0, 1), such as 0.25 or 0.7 (when C is 0.7, the virtual sky picture can exhibit a cartoon-style effect). The larger the coefficient C, the larger the color change that the virtual sky picture finally presents.
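A sketch of formula (1) evaluated at one vertex; the application does not specify whether the subtraction acts on a single value or on each RGB channel, so the sketch below assumes a per-channel subtraction, and the colors are illustrative:

    def vertex_color(backcolor, normal_y, c=0.25):
        # Formula (1): color = backcolor - normal.y^2 * C.
        # backcolor: preset background color as (r, g, b) with channels in [0, 1]
        # normal_y:  Y-axis component of the unit vertex normal
        #            (0 near the horizon, 1 at the top of the hemisphere)
        # c:         coefficient controlling how strongly the color changes upward
        falloff = normal_y ** 2 * c
        return tuple(max(0.0, ch - falloff) for ch in backcolor)

    sky_blue = (0.25, 0.56, 1.0)
    print(vertex_color(sky_blue, normal_y=0.0))   # horizon keeps the base color
    print(vertex_color(sky_blue, normal_y=1.0))   # zenith is darkened by C

With C = 0.25 the gradient from horizon to zenith is gentle; pushing C toward 0.7 exaggerates it, which matches the cartoon-style effect mentioned above.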
Determining color information of each pixel point position on the preset sphere model based on the color information of each vertex position in step B1 includes the following steps B11:
step B11: and calculating the color information of the position of each pixel point on the preset sphere model by a linear interpolation method based on the color information of each vertex position.
Specifically, as shown in fig. 3b, after the color information corresponding to the positions of vertex A and vertex B is obtained through the calculation in step S102, the color information of a certain pixel point C located between vertex A and vertex B may be obtained based on the following formula (2):
C_C = C_A + (C_B - C_A) * |AC| / |AB|
.... formula (2)
wherein, formula (2) represents the calculation process of linear interpolation, |AC| is the distance from vertex A to pixel point C, |AB| is the distance from vertex A to vertex B, C_C is the color information corresponding to the position of pixel point C, C_A is the color information corresponding to the position of vertex A, and C_B is the color information corresponding to the position of vertex B.
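A sketch of formula (2), with t = |AC| / |AB| standing for the relative position of pixel point C between vertex A and vertex B (in a shader this weight is normally supplied by the rasterizer):

    def lerp_color(color_a, color_b, t):
        # Linear interpolation per channel: C_C = C_A + (C_B - C_A) * t,
        # where t in [0, 1] is how far pixel C lies from vertex A toward vertex B.
        return tuple(a + (b - a) * t for a, b in zip(color_a, color_b))

    print(lerp_color((0.25, 0.56, 1.0), (0.10, 0.40, 0.85), t=0.5))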
Step B2: determining vector information of the normal of each pixel point on a preset sphere model based on the vector information of the normal of each vertex; and aiming at each pixel point, calculating to obtain the color information of each pixel point position based on the preset background color information of the virtual sky picture and the Y-axis information of the pixel point normal corresponding to the pixel point on a right-hand coordinate system.
Specifically, in step B2, the vector information of the normal of each pixel point on the preset sphere model is determined based on the vector information of the normal of each vertex, including the following steps B21:
step B21: based on the vector information of the normal lines of all the vertexes, the vector information of the normal lines of all the pixel points on the preset sphere model is calculated through a linear interpolation method.
As shown in fig. 3B, after the vector information of the vertex normal of vertex a and the vector information of the vertex normal of vertex B are obtained in step S101, when the vector information of the pixel point normal of a certain pixel point C between vertex a and vertex B is obtained, the vector information may be obtained based on the following formula (3):
N_C = N_A + (N_B - N_A) * |AC| / |AB|
.... formula (3)
wherein, formula (3) represents the calculation process of linear interpolation, |AC| and |AB| are the same distances as in formula (2), N_C is the vector information of the pixel point normal corresponding to pixel point C, N_A is the vector information of the vertex normal corresponding to vertex A, and N_B is the vector information of the vertex normal corresponding to vertex B.
Specifically, when the color information of the corresponding position is calculated for each pixel point, the calculation may still be performed with formula (1) above; in that case, color in formula (1) is the color information of the pixel point position, and normal.y is the Y-axis information of the pixel point normal on the right-hand coordinate system.
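Putting step B2 together as a sketch: the vertex normals of A and B are interpolated to a per-pixel normal via formula (3), and formula (1) is then applied with that normal's Y component. Renormalising the interpolated normal is an assumption here, since the application only specifies linear interpolation:

    import math

    def pixel_color(backcolor, normal_a, normal_b, t, c=0.25):
        # Formula (3): interpolate the normal between vertices A and B.
        nx, ny, nz = (a + (b - a) * t for a, b in zip(normal_a, normal_b))
        length = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
        ny /= length                         # keep the normal at unit length
        # Formula (1) applied with the pixel normal's Y component, per channel.
        return tuple(max(0.0, ch - ny ** 2 * c) for ch in backcolor)

    print(pixel_color((0.25, 0.56, 1.0),
                      normal_a=(0.0, 0.3, 0.95),
                      normal_b=(0.0, 0.6, 0.80),
                      t=0.5))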
The following describes a specific operation process of the rendering method of a virtual sky picture provided by the present application.
In one embodiment, as shown in fig. 5, the operating environment of the embodiment of the present application may involve a terminal 400, a network 300, a server 200, and a database 500. The execution subject of the method provided by the embodiment of the present application may be the terminal 400 or the server 200.
Specifically, the preset sphere model and the preset background color information may be data stored in the database 500. When rendering the virtual sky picture, the terminal 400 may initiate a rendering request to the server 200 through the network 300; the server 200 acquires the relevant information from the database 500 and processes it; and finally, after the virtual sky picture is rendered, the relevant data is sent to the terminal 400 through the network 300 for display. By applying the computing power of cloud computing, this approach can effectively reduce the operating pressure on the user terminal. Cloud computing is a computing mode that distributes computing tasks over a resource pool formed by a large number of computers, so that various application systems can acquire computing power, storage space, and information services as needed. The network that provides the resources is referred to as the "cloud". Resources in the "cloud" appear to the user to be infinitely expandable, available at any time, used on demand, and paid for on demand. The database 500 may be regarded as an electronic file cabinet, that is, a place for storing electronic files, in which a user may add, query, update, or delete data. A "database" is a collection of data that is stored together in a manner that can be shared by multiple users, has as little redundancy as possible, and is independent of the application.
Specifically, the method provided by the embodiment of the present application may also be executed by the terminal 400, where the preset sphere model and the preset background color information may be stored in a memory of the terminal 400, or may be stored in the database 500.
An embodiment of the present application provides a rendering apparatus of a virtual sky picture, as shown in fig. 6, the rendering apparatus 600 of the virtual sky picture may include: an acquisition module 601, a determination module 602, and a rendering module 603.
The obtaining module 601 is configured to obtain vector information of each vertex normal on a preset sphere model; the determining module 602 is configured to determine color information of each pixel point position on the preset sphere model based on preset background color information of the virtual sky picture and the vector information of each vertex normal; and the rendering module 603 is configured to render a virtual sky picture on the preset sphere model based on the color information of each pixel point position.
In an embodiment, the obtaining module 601, when performing the step of obtaining vector information of each vertex normal on the preset sphere model, is further configured to perform the following steps:
acquiring a preset sphere model used for constructing a virtual sky picture in a virtual environment, and vertex coordinate information of each vertex on the preset sphere model;
vector information of each vertex normal is determined based on the vertex coordinate information and the normal corresponding to each vertex.
In one embodiment, the preset sphere model comprises a hemisphere model, and the hemisphere model is composed of at least two first lines arranged in the direction of latitude and at least two second lines arranged in the direction of longitude; the vertex comprises an intersection of the first line and the second line.
In an embodiment, the determining module 602 is configured to, when performing the step of determining the color information of each pixel point position on the preset sphere model based on the vector information of each vertex normal and the preset background color information of the virtual sky picture, further perform any one of the following steps:
aiming at each vertex, calculating to obtain color information of the vertex position based on preset background color information of a virtual sky picture and Y-axis information of a vertex normal corresponding to the vertex on a right-hand coordinate system; determining color information of each pixel point position on a preset sphere model based on the color information of each vertex position;
determining vector information of the normal of each pixel point on a preset sphere model based on the vector information of the normal of each vertex; and aiming at each pixel point, calculating to obtain the color information of each pixel point position based on the preset background color information of the virtual sky picture and the Y-axis information of the pixel point normal corresponding to the pixel point on a right-hand coordinate system.
In one embodiment, the color information is described as:
color = backcolor - normal.y² * C
wherein, color is the color information of a vertex position or a pixel point position, backcolor is the preset background color information, normal.y is the Y-axis information of the vertex normal or the pixel point normal on a right-hand coordinate system, and C is a coefficient for adjusting the color change of the virtual sky picture.
In an embodiment, the determining module 602 is further configured to perform the following steps:
determining color information of each pixel point position on a preset sphere model based on the color information of each vertex position, comprising:
calculating color information of each pixel point position on a preset sphere model by a linear interpolation method based on the color information of each vertex position;
based on the vector information of the normals of the vertexes, determining the vector information of the normals of the pixels on the preset sphere model, and the method comprises the following steps:
and calculating the vector information of the normal of each pixel point on the preset sphere model by a linear interpolation method based on the vector information of the normal of each vertex.
In one embodiment, the preset background color information includes at least one of a color value representing blue in the RGB mode, a color value dynamically adjusted based on weather information, and a color value dynamically adjusted based on a time of earth rotation or earth revolution.
The apparatus according to the embodiment of the present application may execute the method provided by the embodiment of the present application, and the implementation principle is similar, the actions executed by the modules in the apparatus according to the embodiments of the present application correspond to the steps in the method according to the embodiments of the present application, and for the detailed functional description of the modules in the apparatus, reference may be specifically made to the description in the corresponding method shown in the foregoing, and details are not repeated here.
An embodiment of the present application provides an electronic device, including a memory and a processor, the memory storing at least one program that, when executed by the processor, implements the following: color information of each pixel point position on a preset sphere model is determined by acquiring vector information of each vertex normal on the model and combining it with preset background color information of the virtual sky picture to be rendered, and the virtual sky picture is then rendered on the preset sphere model based on the color information of each pixel point position. With this scheme, no manually drawn or photographed material needs to be prepared in advance, which reduces the rendering cost and the required expertise; because the rendering result is produced by a computer executing the relevant operation logic, the resolution of the virtual sky picture can be effectively controlled; and because the color information of each pixel point position can be controlled through the preset background color information and the vector information of the vertex normals, the overall appearance of the virtual sky picture can be controlled, dynamic changes of the virtual sky picture are easy to realize, and the user experience is improved.
In an alternative embodiment, an electronic device is provided, as shown in fig. 7, the electronic device 700 shown in fig. 7 comprising: a processor 701 and a memory 703. The processor 701 is coupled to a memory 703, such as via a bus 702. Optionally, the electronic device 700 may further include a transceiver 704, and the transceiver 704 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data. It should be noted that the transceiver 704 is not limited to one in practical applications, and the structure of the electronic device 700 is not limited to the embodiment of the present application.
The Processor 701 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 701 may also be a combination of computing functions, e.g., comprising one or more microprocessors, DSPs, and microprocessors, among others.
Bus 702 may include a path that transfers information between the above components. The bus 702 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 702 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 7, but this is not intended to represent only one bus or type of bus.
The Memory 703 may be a ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 703 is used for storing application program codes (computer programs) for executing the present application, and is controlled by the processor 701. The processor 701 is configured to execute application program code stored in the memory 703 to implement the content shown in the foregoing method embodiments.
Among them, electronic devices include but are not limited to: smart phones, tablet computers, notebook computers, smart speakers, smart watches, vehicle-mounted devices, and the like.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. A processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method for rendering the virtual sky picture provided in the above-mentioned various alternative implementations.
The present application provides a computer-readable storage medium, on which a computer program is stored, which, when running on a computer, enables the computer to execute the corresponding content in the foregoing method embodiments.
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
The foregoing is only a partial embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and such modifications and refinements shall also fall within the protection scope of the present invention.

Claims (9)

1. A method for rendering a virtual sky picture, comprising:
acquiring vector information of each vertex normal on a preset sphere model;
determining color information of each pixel point position on the preset sphere model based on preset background color information of the virtual sky picture and vector information of each vertex normal, including: aiming at each vertex, calculating to obtain color information of the vertex position based on preset background color information of a virtual sky picture and Y-axis information of a vertex normal corresponding to the vertex on a right-hand coordinate system; determining color information of each pixel point position on the preset sphere model based on the color information of each vertex position; or determining vector information of the normals of all pixel points on the preset sphere model based on the vector information of the normals of all vertexes; aiming at each pixel point, calculating to obtain color information of each pixel point position based on preset background color information of a virtual sky picture and Y-axis information of a pixel point normal corresponding to the pixel point on a right-hand coordinate system;
rendering a virtual sky picture on the preset sphere model based on the color information of the positions of the pixel points;
the product of Y-axis information of the vertex normal on a right-hand coordinate system and a coefficient for adjusting the color change of the virtual sky picture forms a first parameter; the product of the Y-axis information of the pixel point normal on the right-hand coordinate system and the coefficient for adjusting the color change of the virtual sky picture forms a second parameter;
the color information of the vertex position is determined by the difference value of the preset background color information and the first parameter; and the color information of the pixel point position is determined by the difference value of the preset background color information and the second parameter.
2. The method according to claim 1, wherein the obtaining vector information of each vertex normal on the preset sphere model comprises:
acquiring a preset sphere model used for constructing a virtual sky picture in a virtual environment, and vertex coordinate information of each vertex on the preset sphere model;
and determining vector information of the normals of the vertexes based on the vertex coordinate information and the normals corresponding to the vertexes.
3. The method according to claim 2, wherein the preset sphere model includes a hemisphere model composed of at least two first lines laid in a latitude direction and at least two second lines laid in a longitude direction; the vertex comprises an intersection of the first line and the second line.
4. The method of claim 1, wherein the color information is described as:
color = backcolor - normal.y² * C
wherein, color is the color information of a vertex position or a pixel point position, backcolor is the preset background color information, normal.y is the Y-axis information of the vertex normal or the pixel point normal on a right-hand coordinate system, and C is a coefficient for adjusting the color change of the virtual sky picture.
5. The method of claim 1,
the determining the color information of the position of each pixel point on the preset sphere model based on the color information of each vertex position includes:
calculating color information of each pixel point position on the preset sphere model by a linear interpolation method based on the color information of each vertex position;
the determining vector information of the normals of the pixel points on the preset sphere model based on the vector information of the normals of the vertexes includes:
and calculating the vector information of the normal of each pixel point on the preset sphere model by a linear interpolation method based on the vector information of the normal of each vertex.
6. The method of claim 1, wherein the preset background color information includes at least one of a color value characterizing blue in an RGB mode, a color value dynamically adjusted based on weather information, and a color value dynamically adjusted based on a time of earth rotation or earth revolution.
7. An apparatus for rendering a virtual sky picture, comprising:
the acquisition module is used for acquiring vector information of each vertex normal on a preset spherical model;
a determining module, configured to determine color information of each pixel point position on the preset sphere model based on preset background color information of a virtual sky picture and vector information of each vertex normal, including: aiming at each vertex, calculating to obtain color information of the vertex position based on preset background color information of a virtual sky picture and Y-axis information of a vertex normal corresponding to the vertex on a right-hand coordinate system; determining color information of each pixel point position on the preset sphere model based on the color information of each vertex position; or determining vector information of the normals of all pixel points on the preset sphere model based on the vector information of the normals of all vertexes; aiming at each pixel point, calculating to obtain color information of each pixel point position based on preset background color information of a virtual sky picture and Y-axis information of a pixel point normal corresponding to the pixel point on a right-hand coordinate system;
the rendering module is used for rendering a virtual sky picture on the preset sphere model based on the color information of the positions of the pixel points;
the product of Y-axis information of the vertex normal on a right-hand coordinate system and a coefficient for adjusting the color change of the virtual sky picture forms a first parameter; the product of the Y-axis information of the pixel point normal on the right-hand coordinate system and the coefficient for adjusting the color change of the virtual sky picture forms a second parameter;
the color information of the vertex position is determined by the difference value of the preset background color information and the first parameter; and the color information of the pixel point position is determined by the difference value of the preset background color information and the second parameter.
8. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a memory;
one or more computer programs, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to: performing the method according to any one of claims 1 to 6.
9. A computer-readable storage medium for storing computer instructions which, when executed on a computer, cause the computer to perform the method of any of claims 1 to 6.
CN202110362290.4A 2021-04-02 2021-04-02 Virtual sky picture rendering method and related equipment Active CN113077541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110362290.4A CN113077541B (en) 2021-04-02 2021-04-02 Virtual sky picture rendering method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110362290.4A CN113077541B (en) 2021-04-02 2021-04-02 Virtual sky picture rendering method and related equipment

Publications (2)

Publication Number Publication Date
CN113077541A CN113077541A (en) 2021-07-06
CN113077541B true CN113077541B (en) 2022-01-18

Family

ID=76615072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110362290.4A Active CN113077541B (en) 2021-04-02 2021-04-02 Virtual sky picture rendering method and related equipment

Country Status (1)

Country Link
CN (1) CN113077541B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113592999B (en) * 2021-08-05 2022-10-28 广州益聚未来网络科技有限公司 Rendering method of virtual luminous body and related equipment
CN113936084B (en) * 2021-10-28 2022-10-28 广州益聚未来网络科技有限公司 Generation method of target elements in virtual sky and related equipment
CN114470756B (en) * 2021-12-31 2025-09-02 北京像素软件科技股份有限公司 Starry sky simulation method, starry sky simulation device, computer equipment and medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111068312A (en) * 2019-12-02 2020-04-28 网易(杭州)网络有限公司 Game picture rendering method and device, storage medium and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6825851B1 (en) * 2000-08-23 2004-11-30 Nintendo Co., Ltd. Method and apparatus for environment-mapped bump-mapping in a graphics system
CN109427088B (en) * 2017-08-18 2023-02-03 腾讯科技(深圳)有限公司 Rendering method for simulating illumination and terminal
CN111145326B (en) * 2019-12-26 2023-12-19 网易(杭州)网络有限公司 Processing method of three-dimensional virtual cloud model, storage medium, processor and electronic device
CN111420404B (en) * 2020-03-20 2023-04-07 网易(杭州)网络有限公司 Method and device for rendering objects in game, electronic equipment and storage medium
CN112233215B (en) * 2020-10-15 2023-08-22 网易(杭州)网络有限公司 Contour rendering method, device, equipment and storage medium
CN112263837B (en) * 2020-11-16 2021-12-21 腾讯科技(深圳)有限公司 Weather rendering method, device, equipment and storage medium in virtual environment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111068312A (en) * 2019-12-02 2020-04-28 网易(杭州)网络有限公司 Game picture rendering method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN113077541A (en) 2021-07-06

Similar Documents

Publication Publication Date Title
CN112884875B (en) Image rendering method, device, computer equipment and storage medium
CN113077541B (en) Virtual sky picture rendering method and related equipment
CN110910486B (en) Indoor scene illumination estimation model, method and device, storage medium and rendering method
CN104637089B (en) Three-dimensional model data processing method and device
CN113643414B (en) Three-dimensional image generation method and device, electronic equipment and storage medium
CN107886552B (en) Mapping processing method and device
US9965893B2 (en) Curvature-driven normal interpolation for shading applications
US8675013B1 (en) Rendering spherical space primitives in a cartesian coordinate system
Okura et al. Mixed-reality world exploration using image-based rendering
CN113592999B (en) Rendering method of virtual luminous body and related equipment
CN116977531A (en) Three-dimensional texture image generation method, three-dimensional texture image generation device, computer equipment and storage medium
Livny et al. A GPU persistent grid mapping for terrain rendering
CN114842127B (en) Terrain rendering method and device, electronic equipment, medium and product
CN108304450A (en) The online method for previewing of threedimensional model and device
CN113538502A (en) Picture clipping method and device, electronic equipment and storage medium
WO2024244659A1 (en) Cloud map processing method and apparatus, computer device, computer readable storage medium, and computer program product
CN109658495B (en) Rendering method and device for ambient light shielding effect and electronic equipment
CN113274735B (en) Model processing method and device, electronic equipment and computer readable storage medium
Masood et al. High‐performance virtual globe GPU terrain rendering using game engine
WO2024199097A1 (en) Rendering method and corresponding device
CN116485989B (en) Image processing method, device, equipment and storage medium
CN115810086B (en) Three-dimensional scene reconstruction method, device, computer equipment and storage medium
CN117934675A (en) Fog effect processing method, device, equipment, medium and program product for virtual scene
CN116934934A (en) Anti-aliasing method and device for picture, computer readable medium and electronic equipment
US20240371079A1 (en) Face-Oriented Geometry Streaming

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant