WO2019042183A1 - Virtual scene display method, apparatus and storage medium - Google Patents
Virtual scene display method, apparatus and storage medium
- Publication number
- WO2019042183A1 (PCT/CN2018/101451)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- virtual scene
- animation
- display
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/33—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
- A63F13/335—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/63—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
- A63F13/47—Controlling the progress of the video game involving branching, e.g. choosing one of several possible scenarios at a given point in time
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6661—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
Definitions
- the embodiments of the present invention relate to the field of Internet technologies, and in particular, to a virtual scene display method, apparatus, and storage medium.
- in an animation display, a plurality of independent scenes are usually included; one scene may be defined as a series of pictures presenting one continuous event or one continuous motion.
- the existing scheme mainly realizes scene switching by using a program script to control multiple material elements on multiple layers to move at the same time, thereby simulating the movement of the lens.
- the embodiment of the present application provides a method for switching the virtual scene.
- An embodiment of the present application provides a virtual scene display method, where the method includes:
- displaying, in an animation display area of the display interface, an animation interface corresponding to the animation data, where the animation interface includes a first virtual scene composed of a plurality of animation elements;
- a video clip in the video data is played in response to an operation of an animated element in the animated interface, the video clip exhibiting a transition from the first virtual scene to a second virtual scene.
- An embodiment of the present application provides a virtual scene display apparatus, where the apparatus includes: a processor and a memory, wherein the memory stores computer readable instructions, and the instructions may cause the processor to:
- displaying, in an animation display area of the display interface, an animation interface corresponding to the animation data, where the animation interface includes a first virtual scene composed of a plurality of animation elements;
- a video clip in the video data is played in response to an operation of an animated element in the animated interface, the video clip exhibiting a transition from the first virtual scene to a second virtual scene.
- the embodiment of the present application provides a computer readable storage medium storing a computer program executable by a processor to perform a virtual scene display method of various embodiments.
- the virtual scene and the virtual scene switching picture are made into a video, and the effect of simulated lens movement is achieved by playing video clips within the animation display, thereby reducing the processing resource consumption of the terminal device and avoiding picture stuttering. This also facilitates art production and modification, so that artists can produce high-quality animation effects without considering the impact of the animation on handset performance.
- Figure 1-1 to Figure 1-2 are schematic diagrams of the scene switching effect interface for simulating lens movement;
- FIG. 2 is a schematic diagram of a system of an embodiment of the present application.
- FIG. 3 is a block diagram of an apparatus of an embodiment of the present application.
- 4A and 4B are flowcharts showing a virtual scene display method according to an embodiment of the present application.
- FIG. 5 is a schematic diagram of a virtual scene opening interface according to an embodiment of the present application.
- FIG. 6 is a flowchart of a method for implementing step S430 in FIG. 4B according to an embodiment of the present application;
- FIG. 7-1 to FIG. 7-3 are schematic diagrams of a switching effect interface for simulating lens movement using the prior art;
- FIG. 8 is a flowchart of a virtual scene display method based on the embodiment of FIG. 4B;
- FIG. 9 is a schematic diagram of a target location video segment display interface according to an embodiment of the present application.
- FIG. 10 is a detailed flowchart of a virtual scene display method according to an embodiment of the present application.
- FIG. 11 is a schematic diagram of segmenting video segments of different locations according to scene movement requirements according to an embodiment of the present application.
- FIG. 12 is a block diagram of a virtual scene display apparatus according to an embodiment of the present application.
- Figure 13 is a block diagram of an initial segment display module of the embodiment of Figure 12.
- Figure 1-1 to Figure 1-2 are schematic diagrams of the scene switching effect interface for simulating lens movement.
- Figure 1-1 to Figure 1-2 simulate the scene switching screen when the lens moves to the right.
- the embodiment of the present application splits the animation display content into multiple parts, wherein the part of the scene conversion is realized by video, and the other parts are still realized by animation.
- 2 is a schematic diagram of a system of an embodiment of the present application.
- the system can include: a terminal device 110 and at least one content providing device 120.
- the content providing device 120 is a device that provides the terminal device 110 with presentation content.
- the display content may include a web page with an animated display effect, web content, game content in a game app, and the like.
- the content providing device 120 may be a website, a web game server, an online game server, or the like.
- the terminal device 110 is an electronic device that can acquire content and present content through a network.
- the terminal device 110 may include, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a PC, a set top box, and the like.
- the terminal device 110 can acquire the presentation content from the content providing device 120 by running the application, and present the presentation content in the display device.
- the display device may be a screen of the terminal device 110, a display, a television, a projector, and the like.
- the application for acquiring and presenting content in the terminal device 110 may include, but is not limited to, a browser, a game software APP, and the like.
- the game software APP is installed in the terminal device 110, and after receiving a user trigger to run the game software APP, the terminal device 110 can invoke the stored game software APP to execute the virtual scene display method of each embodiment.
- FIG. 3 is a block diagram of an apparatus 200 in accordance with an embodiment of the present application.
- device 200 can be terminal device 110 in FIG. 2.
- the terminal device may be a mobile terminal, such as a smart phone, a tablet computer, etc., or may be a non-portable terminal device such as a PC or a set top box.
- apparatus 200 can include one or more of the following components: processing component 202, memory 204, power component 206, multimedia component 208, audio component 210, sensor component 214, and communication component 216.
- Processing component 202 controls the overall operation of device 200 and may include one or more processors 218 to execute instructions to perform all or part of the steps of the methods.
- Memory 204 stores various types of data, including instructions for applications or methods in device 200.
- the memory 204 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- the apparatus 200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital signal processors, digital signal processing devices, programmable logic devices, field programmable gate arrays, controllers, microcontrollers, microprocessors, or other electronic components, for performing the methods of the various embodiments.
- FIG. 4A is a flowchart of a virtual scene display method according to an embodiment of the present application. As shown in FIG. 4A, the method can include the following steps.
- in step S41, animation data and video data are acquired.
- Animation data is data used to render animation effects and can include data for describing animation elements and motion logic. Animation data can be generated using animation editing tools such as Flash.
- in step S42, an animation interface corresponding to the animation data is displayed in an animation display area of the display interface, where the animation interface includes a first virtual scene composed of a plurality of animation elements.
- in step S43, in response to an operation on an animation element in the animation interface, a video segment in the video data is played, the video segment showing a picture transitioning from the first virtual scene to the second virtual scene.
- the video data may include data for one or more video segments.
- the preset video clip position corresponding to the entry instruction may be acquired; and the video clip corresponding to the video clip position in the video data is played.
- the video clip location may include a starting location of the video clip in the video data.
- the location can be a time location or a location of a frame.
- the corresponding video data and the position of the video segment in the video data may be set in advance for each operation that triggers scene switching, so that when such an operation is detected, the position of the video segment corresponding to the operation can be determined according to the preset information.
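- as a sketch of how such preset information might look, the snippet below maps each scene-switching operation to the position of its video segment inside the single pre-produced video. All names and times are hypothetical illustrations, not taken from the patent:

```javascript
// Hypothetical preset table: for each operation that triggers a scene
// switch, record where its video segment starts and ends (in seconds)
// inside the single pre-produced video.
const clipTable = {
  enterLab:  { start: 0, end: 5 },   // simulated lens at area 1
  moveRight: { start: 5, end: 8 },   // lens moving from area 1 to area 2
  atArea2:   { start: 8, end: 13 },  // simulated lens at area 2
};

// Look up the preset clip position for a detected operation.
function clipForOperation(op) {
  const clip = clipTable[op];
  if (!clip) throw new Error(`no video clip preset for operation: ${op}`);
  return clip;
}
```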
- the video clip location may include the start time of the video clip in the video data.
- the time attribute of the video element in the display interface may be configured as the start time, so that the video element plays a video segment corresponding to the start time position in the video data.
- the animated display area may be provided by a Canvas element of HTML5; the video element may be a Video element of HTML5.
- the currentTime property of the Video element in the display interface can be set to the start time of the video clip to be played, so that the Video element plays, starting from that time in the video data, the video clip that switches the virtual scene.
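- a minimal sketch of this technique, assuming an HTML5 page with a Video element (the element lookup and the clip time are illustrative assumptions):

```javascript
// Jump an HTML5 Video element to the start of the desired clip and play.
// Written against the HTMLMediaElement API (currentTime, play), but it
// works with any object exposing the same properties, so it can be
// exercised with a stub outside the browser.
function playClipFrom(video, startTime) {
  video.currentTime = startTime;  // seek to the clip's start position
  if (typeof video.play === 'function') video.play();
  return video.currentTime;
}

// In a browser this would be, e.g.:
//   playClipFrom(document.querySelector('video'), 8); // clip for area 2
```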
- when the video data includes a plurality of video segments, different needs in the animation display can be met by playing different video segments in the video data.
- the combination of video clips and interactive elements can also be utilized to achieve interaction with the user.
- a second video segment corresponding to the second virtual scene in the video data may be played, with an interactive element superimposed on the display area of the second video segment; in response to an operation on the interactive element, a third video segment of the video data corresponding to the interactive element is played.
- the dependence on the animation data can be further reduced, thereby reducing the consumption of processing resources of the terminal.
- the second video segment can also be played cyclically while waiting for the user to issue an operation instruction. That is, when playback of the second video clip ends and no operation has been detected, the second video clip is played again and the interactive element is superimposed again.
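- one way to sketch this looping behavior (the clip bounds are assumptions; in a browser the handler would be attached via the HTML5 `timeupdate` event):

```javascript
// Loop a clip until the user interacts: each time playback passes the
// clip's end, seek back to its start; stop looping once an operation on
// the superimposed interactive element has been detected.
function makeLooper(video, clip) {
  let interacted = false;
  const onTimeUpdate = () => {
    if (!interacted && video.currentTime >= clip.end) {
      video.currentTime = clip.start;  // replay the second video clip
    }
  };
  return {
    onTimeUpdate,                  // attach: video.addEventListener('timeupdate', ...)
    stop() { interacted = true; }, // call when the interactive element is operated
  };
}
```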
- FIG. 4B is a flowchart of a virtual scene display method according to an embodiment of the present application.
- the virtual scene display method for simulating lens movement may be applied to, and executed by, for example, the terminal device 110 in the implementation environment shown in FIG. 2.
- the virtual scene display method of the simulated lens movement may be performed by the terminal device 110, and may include the following steps.
- in step S410, a trigger instruction for starting the virtual scene display is received, and a virtual scene opening interface is displayed.
- the virtual scene involved in the embodiment of the present application includes an animation-implemented user interface part and a video part.
- the terminal device 110 runs the game software APP to display the virtual scene opening interface as shown in FIG. 5.
- the open interface can generate animation effects and logical information files using animation editing tools such as Flash software.
- a control for entering the virtual scene function may be created and displayed in the open interface, for example, the "Enter Lab" button shown in FIG. 5.
- in step S430, an entry instruction for the virtual scene function in the opening interface is received, and the initial position video segment indicated by the entry instruction is played, with the simulated lens at an initial position. That is, the displayed picture transitions from the virtual scene of the open interface to the virtual scene corresponding to the initial position.
- the user clicks the "Enter Lab" button in the opening interface shown in FIG. 5, the terminal device 110 receives an entry instruction to enter the virtual scene, and playback jumps to the initial position video segment indicated by the entry instruction.
- playing the initial position video clip simulates the picture captured when the lens is at the initial position.
- a movement control button may be set on the upper layer of the initial position video clip display interface; when the user clicks the movement control button, the terminal device 110 receives a movement trigger instruction that triggers the simulated lens movement function.
- a virtual scene of a game often includes scenes of multiple regions and can switch back and forth between them; for example, the simulated lens moves from area 1 to area 2, and moves from area 2 back to area 1.
- the initial position may mean that the simulated lens is located in area 1; when the terminal device receives the entry instruction, which indicates the video segment corresponding to area 1, playback can jump directly to that video segment, thereby simulating the lens being at area 1.
- in step S450, a movement trigger instruction for triggering the simulated lens movement function is received, and playback jumps to the moving video segment indicated by the movement trigger instruction, simulating the movement of the lens from the initial position to the target position. That is, the displayed picture transitions from the virtual scene corresponding to the initial position to the virtual scene corresponding to the target position.
- the moving video segment may be the picture during the simulated lens's movement from area 1 to area 2, or the picture during its movement from area 2 to area 3; that is, the moving video segment is the picture captured during the simulated lens's movement from the initial position to the target position. The pictures of movement between different areas are generally different, so different movement trigger instructions may correspond to different moving video segments.
- the picture of the simulated lens located in area 1, the picture of the simulated lens moving from area 1 to area 2, and the picture of the simulated lens located in area 2 are encoded into one video in advance, and the start and end times in the video of each of these sections are recorded respectively.
- while playing the video clip of the simulated lens in area 1 (i.e., the initial position video clip), the terminal device 110 waits for the user to click the movement control button on the upper layer of the video display interface; after receiving the resulting movement trigger instruction, the terminal device 110 jumps to the video segment, indicated by the instruction, of the simulated lens moving from area 1 to area 2 (i.e., the moving video segment), and plays the picture of the simulated lens moving from area 1 to area 2.
- by playing the moving video clip, the user is presented with the visual effect of the lens moving slowly from area 1 to area 2.
- in step S470, when playback of the moving video segment ends, playback jumps to the target position video segment indicated by the movement trigger instruction, with the simulated lens at the target position. That is, the displayed picture is the virtual scene corresponding to the target position.
- at this time, the lens has moved from area 1 to area 2 and captures the picture of area 2.
- the video segment corresponding to area 2 can be played by jumping directly according to the movement trigger instruction.
- the terminal device 110 is a smart phone.
- in the prior art, the smart phone simulates the movement of the lens by controlling multiple material elements on multiple layers of the canvas to move at the same time. This requires extensive testing, is not easy to modify and maintain, and moving a large number of material elements consumes the performance of the smartphone; improper handling can also cause the picture to stutter.
- by playing video segments, the embodiment of the present application directly plays different video segments before, during, and after the simulated lens movement, thereby presenting the visual effect of lens movement. Making the entire virtual scene into a video reduces the performance consumption of the smartphone and avoids picture stuttering.
- FIG. 6 is a schematic flowchart of the details of step S430 corresponding to the embodiment of FIG. 4B. As shown in FIG. 6, step S430 specifically includes:
- step S431 receiving an entry instruction of the virtual scene function in the open interface, and dynamically configuring a play time attribute of the initial position video segment indicated by the entry instruction according to the entry trigger time indicated by the entry instruction;
- the terminal device 110 can play video through the video element of an HTML5 document; the video element provides playback control methods, and the specified playback position in the video can be set and returned by setting the currentTime attribute of the video element.
- the terminal device invokes the program script to dynamically configure the play time attribute of the video clip at the initial position indicated by the instruction, that is, the currentTime attribute of the video clip of the initial position.
- the program script can be called to set the currentTime attribute of the initial position video clip to the indicated entry trigger time.
- in step S432, the initial position video segment is played according to the play start time indicated by the play time attribute.
- the video clip of the initial position can be played on time according to the play start time included in the play time attribute.
- similarly, a program script may be called to configure the play time attribute of the moving video segment according to the time indicated by the movement trigger instruction.
- the play time attribute of the target position video segment is configured according to the play end time of the moving video segment, so that the target position video segment starts playing when the moving video segment ends.
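- this back-to-back chaining can be sketched as follows (segment boundaries and function names are illustrative, not from the patent):

```javascript
// The moving clip and the target-position clip are adjacent sections of
// the single pre-produced video: the target clip's play start time is
// configured to equal the moving clip's play end time, so the two clips
// play back-to-back with no gap.
function configureTargetClip(movingClip, targetDuration) {
  const start = movingClip.end;  // begin exactly when the moving clip ends
  return { start, end: start + targetDuration };
}
```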
- the prior art configures the currentTime attribute of each frame of the video in the program script, and decrements the currentTime attribute frame by frame to achieve the effect of simulating lens movement, as shown in FIG. 7-1 to FIG. 7-3.
- modifying the currentTime attribute by the program script for every frame of the video not only causes the video to play without sound, but also runs unsmoothly on poorly performing mobile phone devices and causes the picture to stutter.
- in the embodiment of the present application, the currentTime attribute of the video segment indicated by the trigger instruction is modified so that playback jumps directly to the corresponding video segment, without modifying the currentTime attribute for every frame of the video; the effect of simulated lens movement is thus achieved, and the sound of the video does not disappear.
- the virtual scene display method for simulating lens movement provided by the embodiment of the present application further includes the following steps:
- in step S801, a fallback trigger instruction for triggering the simulated lens fallback function is received, and playback jumps to the fallback video segment indicated by the fallback trigger instruction, showing the process of the simulated lens moving from the target position back to the initial position.
- a control button for triggering the simulated lens fallback function may be superimposed on the upper layer of the target position video clip display interface.
- after the terminal device receives the fallback trigger instruction for triggering the simulated lens fallback function, playback jumps to the fallback video segment specified by the fallback trigger instruction; playing the fallback video segment simulates the visual effect of the lens moving from the target position to the initial position.
- the moving video segment and the back-off video segment are mutually inverted, that is, the first frame of the moving video segment is the last frame of the backward video segment, and the first frame of the backward video segment is the last frame of the moving video segment.
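- this mutual-inverse relation between the moving clip and the fallback clip can be expressed as a small check over frame sequences (the frame lists are illustrative stand-ins for the actual video content):

```javascript
// Two clips are mutual inverses when one is the frame-by-frame reversal
// of the other: the first frame of the moving clip is the last frame of
// the fallback clip, and vice versa.
function isReverseOf(movingFrames, fallbackFrames) {
  if (movingFrames.length !== fallbackFrames.length) return false;
  return movingFrames.every(
    (frame, i) => frame === fallbackFrames[fallbackFrames.length - 1 - i]
  );
}
```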
- the jump to the fallback video clip can be performed by calling a program script to modify the currentTime property of the fallback video clip.
- in step S802, when playback of the fallback video clip ends, playback jumps to the initial position video segment indicated by the fallback trigger instruction, with the simulated lens at the initial position, waiting to receive a movement trigger instruction that triggers the simulated scene movement function.
- at this time, the lens has finished returning from the target position to the initial position; jumping back to play the initial position video clip simulates the visual effect of the lens being at the initial position.
- during playback of the initial position video clip, the terminal may wait to receive a movement trigger instruction that triggers the simulated scene movement function, and then proceed to steps S450-S470.
- the virtual scene display method for the simulated lens movement provided by the embodiment of the present application further includes:
- determining whether there is a next-position video segment; if present, looping the target position video segment while waiting to receive a trigger instruction that triggers the simulated lens to move to the next position.
- the terminal device 110 determines whether there is still a next-location video segment, in other words, whether the simulated lens is still required to move to a next location. For example, after the simulated lens moves from area 1 (the initial position) to area 2 (the target position), it is determined whether there is a video clip of area 3 (the next position); if present, the video clip corresponding to area 2 is played in a loop while waiting to receive a trigger instruction that moves the simulated lens to area 3, or a trigger instruction that makes the simulated lens fall back to area 1.
- upon receiving a trigger instruction to move to area 3, playback jumps to the moving video segment between area 2 and area 3, after which the video clip of area 3 is played. If a trigger instruction to fall back to area 1 is received, the video clip of area 1 can be displayed by performing steps S801-S802 above.
- determining whether there is a next-position video segment includes:
- the video segments of different regions may be numbered according to the movement of the scene in advance, for example, from 1, 2, 3, 4 to N, respectively representing the video segment of the number 1 region, the video segment of the number 2 region, and so on.
- the video clip of the number 2 area is the target position video clip.
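- with the regions numbered 1 to N as above, deciding whether a next-position video clip exists reduces to a bounds check; a sketch (the function name is a hypothetical label for this determination):

```javascript
// Regions are numbered 1..N in the order the scene moves through them.
// A next-position video clip exists whenever the current region number
// is below the highest numbered region.
function hasNextSegment(currentRegion, totalRegions) {
  return currentRegion < totalRegions;
}
```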
- when a video clip of the next location exists, a scene left-shift control button (for example, the "Enter Energy Room" button in FIG. 9) and a scene right-shift control button (such as the "Enter Transform Room" button in FIG. 9) may be superimposed on the upper layer of the video clip display interface of the target location.
- the trigger instruction for moving the simulated lens to the next position may be a trigger instruction generated by triggering the scene right-shift control button;
- the fallback trigger instruction may be a trigger instruction generated by triggering the scene left-shift control button.
- a detailed process of a virtual scene display method for simulating lens movement is as follows:
- step S1001: receiving a user-triggered instruction to load the virtual scene display application;
- step S1002: displaying a start interface (refer to FIG. 5);
- step S1003: when the user clicks the "Enter Lab" button, the terminal device receives the user's trigger instruction for starting the virtual scene function of the start interface, and then plays the initial-position video segment;
- step S1004: judging whether a next-position video segment exists;
- step S1005: if a next-position video segment exists, looping the initial-position video segment; if not, displaying the landing page after the initial-position video segment ends;
- step S1006: while the initial-position video segment loops, when the user clicks the right-shift control button, a trigger instruction generated by the scene right-shift control button is received; when the user clicks the left-shift control button, a trigger instruction generated by the scene left-shift control button is received;
- step S1007: when the user clicks the right-shift control button, a scene right-shift video segment is played; when the user clicks the left-shift control button, a scene left-shift video segment is played. That is, the video segment of the specified position is played cyclically, the specified left-shift video segment is played when the user clicks the left-shift button, and the specified right-shift video segment is played when the user clicks the right-shift button;
- step S1008: when the scene right-shift segment or left-shift segment finishes playing, displaying the video segment of the corresponding position after the scene is shifted right or left;
- step S1009: continuing to determine whether a next-location video segment exists; if it exists, looping the video segment of the corresponding location in step S1008 while waiting for the user to trigger the left-shift or right-shift control button.
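Steps S1004-S1009 amount to a small state machine over segment numbers. The reducer below is an illustrative sketch (the function name and the event vocabulary are assumptions), with looping and actual video playback omitted:

```javascript
// position: current segment number (1..total); event: 'left', 'right',
// or null (keep looping the current segment).
function step(position, event, total) {
  if (event === 'right' && position < total) return position + 1; // scene shifts right
  if (event === 'left' && position > 1) return position - 1;      // scene shifts left
  return position; // no valid move: keep looping the current clip
}
```

Starting from the initial position 1, a 'right' event advances to segment 2 (playing the right-shift clip on the way), and a later 'left' event falls back toward segment 1.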
- the technical solution provided by the embodiments of the present application can simulate the movement of a lens in a virtual scene, presenting the animated scene with a visual effect that can also be played backwards, while guaranteeing high image quality without losing the original audio of the video.
- the technique places low performance requirements on smart devices such as mobile phones and, under ideal network conditions, does not cause image stuttering, giving users a smooth visual experience of lens movement, page turning, scrolling, and rewinding.
- the following are apparatus embodiments of the present application, which may be used to perform the embodiments of the virtual scene display method for simulated lens movement performed by the terminal device 110 described above.
- for details not disclosed in the apparatus embodiments, refer to the embodiments of the virtual scene display method for simulated lens movement in the present application.
- FIG. 12 is a block diagram of a virtual scene display apparatus for simulated lens movement according to an exemplary embodiment, which may be used in the terminal device 110 of the implementation environment shown in FIG. 2 to perform all or part of the steps of the virtual scene display method for simulated lens movement shown in any of FIG. 4B, FIG. 6, FIG. 8, and FIG. 10.
- the apparatus includes, but is not limited to: an opening interface display module 1210, an initial segment display module 1230, a moving segment display module 1250, and a target segment display module 1270.
- the opening interface display module 1210 is configured to receive a trigger instruction for enabling the virtual scene display and to display a virtual scene opening interface;
- the initial segment display module 1230 is configured to receive an entry instruction for the virtual scene function in the opening interface and to play the initial-position video segment indicated by the entry instruction, the simulated lens being at the initial position;
- the moving segment display module 1250 is configured to receive a movement trigger instruction that triggers the simulated lens movement function and to jump to play the moving video segment indicated by the movement trigger instruction, which simulates the process of the lens moving from the initial position to a target position;
- the target segment display module 1270 is configured to jump to play the target-position video segment indicated by the movement trigger instruction when the moving video segment finishes playing, the simulated lens being at the target position.
- the opening interface display module 1210 can be, for example, one of the multimedia components 208 in the physical structure of FIG.
- the initial segment display module 1230, the moving segment display module 1250, and the target segment display module 1270 may also be functional modules for performing the corresponding steps of the virtual scene display method for simulated lens movement described above. It will be appreciated that these modules can be implemented in hardware, software, or a combination of both. When implemented in hardware, these modules may be implemented as one or more hardware modules, such as one or more application-specific integrated circuits. When implemented in software, these modules may be implemented as one or more computer programs executed on one or more processors, such as programs stored in the memory 204 and executed by the processor 218 of FIG.
- the initial segment display module 1230 may include, but is not limited to:
- the attribute configuration unit 1231, configured to receive an entry instruction for the virtual scene function in the opening interface and to dynamically configure, according to the entry trigger time indicated by the entry instruction, the play-time attribute of the initial-position video segment indicated by the entry instruction;
- the playing unit 1232, configured to play the initial-position video segment according to the play start time indicated by the play-time attribute.
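On the HTML5 implementation named in the claims, units 1231 and 1232 roughly correspond to configuring a time attribute and then seeking a Video element to it. The sketch below uses a plain object in place of the element; the attribute shape, segment table, and function name are assumptions for illustration:

```javascript
// Dynamically configure the play-time attribute from the entry trigger
// time, then start the segment at the indicated start time.
function configureAndPlay(video, segment, entryTriggerTime) {
  const playTimeAttr = {
    startAt: segment.start,        // where the initial-position clip begins
    triggeredAt: entryTriggerTime, // when the entry instruction arrived
  };
  video.currentTime = playTimeAttr.startAt; // seek, as with HTMLMediaElement.currentTime
  return playTimeAttr;
}
```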
- the virtual scene display apparatus for simulated lens movement may further include, but is not limited to:
- a fallback segment display module, configured to receive a fallback trigger instruction that triggers the simulated lens fallback function and to jump to play the fallback video segment indicated by the fallback trigger instruction, which shows the process of the simulated lens moving from the target position back to the initial position.
- the virtual scene display apparatus for simulated lens movement may further include, but is not limited to:
- a video segment determining module, configured to determine whether a next-location video segment exists; if it exists, the target-location video segment is played cyclically while waiting to receive a trigger instruction that moves the simulated lens to the next location.
- the video segment determining module includes:
- a number judging unit, configured to judge, according to the preset numbering order of the different video segments and the number of the target-position video segment, whether a video segment with the next number exists.
- an embodiment of the present application further provides an electronic device, which can be used in the terminal device 110 of the implementation environment shown in FIG. 2 to perform all or part of the steps of the virtual scene display method for simulated lens movement shown in any of FIG. 4B, FIG. 6, FIG. 8, and FIG. 10.
- the electronic device includes:
- a memory for storing processor-executable instructions; and
- a processor configured to perform the virtual scene display method for simulated lens movement described in the foregoing embodiments.
- a storage medium is also provided, which is a computer readable storage medium, for example a transitory or non-transitory computer readable storage medium including instructions.
- the storage medium stores a computer program executable by the processor 218 to perform the virtual scene display method for simulated lens movement described in the foregoing embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
- Studio Devices (AREA)
Claims (13)
- A virtual scene display method, applied to a terminal device, the method comprising: acquiring animation data and video data; displaying, in an animation display area of a display interface, an animation interface corresponding to the animation data, the animation interface comprising a first virtual scene composed of a plurality of animation elements; and in response to an operation on an animation element in the animation interface, playing a video segment in the video data, the video segment showing a transition from the first virtual scene to a second virtual scene.
- The method according to claim 1, wherein playing the video segment in the video data comprises: acquiring a preset video segment position corresponding to the operation; and playing the video segment corresponding to the video segment position in the video data.
- The method according to claim 2, wherein the video segment position comprises a start time of the video segment within the video data; and playing the video segment corresponding to the video segment position comprises: setting a time attribute of a video element in the display interface to the start time, so that the video element plays the video segment corresponding to the start time in the video data.
- The method according to claim 3, wherein the animation display area is provided by an HTML5 Canvas element, and the video element is an HTML5 Video element.
- The method according to claim 1, wherein the video data comprises a plurality of video segments, and the method further comprises: playing a second video segment in the video data corresponding to the second virtual scene, and superimposing an interactive element on a display area of the second video segment; and in response to an operation on the interactive element, playing a third video segment in the video data corresponding to the interactive element.
- The method according to claim 5, wherein, in response to determining that the operation is not detected, when playback of the second video segment finishes, the second video segment is played again and the interactive element is superimposed again.
- A virtual scene display apparatus, comprising a processor and a memory, the memory storing computer readable instructions that cause the processor to: acquire animation data and video data; display, in an animation display area of a display interface, an animation interface corresponding to the animation data, the animation interface comprising a first virtual scene composed of a plurality of animation elements; and in response to an operation on an animation element in the animation interface, play a video segment in the video data, the video segment showing a transition from the first virtual scene to a second virtual scene.
- The apparatus according to claim 7, wherein the instructions cause the processor to: acquire a preset video segment position corresponding to the operation; and play the video segment corresponding to the video segment position in the video data.
- The apparatus according to claim 8, wherein the instructions cause the processor to: obtain, using the video segment position, a start time of the video segment within the video data; and set a time attribute of a video element in the display interface to the start time, so that the video element plays the video segment corresponding to the start time in the video data.
- The apparatus according to claim 9, wherein the instructions cause the processor to: provide the animation display area using an HTML5 Canvas element, and provide the video element using an HTML5 Video element.
- The apparatus according to claim 7, wherein the video data comprises a plurality of video segments, and the instructions cause the processor to: play a second video segment in the video data corresponding to the second virtual scene, and superimpose an interactive element on a display area of the second video segment; and in response to an operation on the interactive element, play a third video segment in the video data corresponding to the interactive element.
- The apparatus according to claim 11, wherein the video data comprises a plurality of video segments, and the instructions cause the processor to: in response to determining that the operation is not detected, when playback of the second video segment finishes, play the second video segment again and superimpose the interactive element again.
- A computer readable storage medium storing a computer program executable by a processor to perform the virtual scene display method according to any one of claims 1 to 6.
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020197030918A KR102339205B1 (ko) | 2017-08-31 | 2018-08-21 | Virtual scene display method and device, and storage medium |
| EP18851267.7A EP3677322A4 (en) | 2017-08-31 | 2018-08-21 | VIRTUAL SCENE DISPLAY METHOD AND DEVICE, AND INFORMATION MEDIA |
| JP2020511776A JP6995425B2 (ja) | 2017-08-31 | 2018-08-21 | Virtual scene display method, apparatus, and computer program |
| US16/551,498 US11341706B2 (en) | 2017-08-31 | 2019-08-26 | Virtual scene display method and apparatus, and storage medium |
| US17/727,636 US11620784B2 (en) | 2017-08-31 | 2022-04-22 | Virtual scene display method and apparatus, and storage medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710770774.6 | 2017-08-31 | ||
| CN201710770774.6A CN109420338A (zh) | 2017-08-31 | 2017-08-31 | Virtual scene display method and apparatus for simulated lens movement, and electronic device |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/551,498 Continuation US11341706B2 (en) | 2017-08-31 | 2019-08-26 | Virtual scene display method and apparatus, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019042183A1 true WO2019042183A1 (zh) | 2019-03-07 |
Family
ID=65504698
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2018/101451 Ceased WO2019042183A1 (zh) | 2018-08-21 | Virtual scene display method, apparatus, and storage medium |
Country Status (6)
| Country | Link |
|---|---|
| US (2) | US11341706B2 (zh) |
| EP (1) | EP3677322A4 (zh) |
| JP (1) | JP6995425B2 (zh) |
| KR (1) | KR102339205B1 (zh) |
| CN (1) | CN109420338A (zh) |
| WO (1) | WO2019042183A1 (zh) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113761281A (zh) * | 2021-04-26 | 2021-12-07 | 腾讯科技(深圳)有限公司 | Virtual resource processing method and apparatus, medium, and electronic device |
| CN114237475A (zh) * | 2021-12-31 | 2022-03-25 | 视伴科技(北京)有限公司 | Method, system, device, and medium for constructing a virtual studio |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110166842B (zh) * | 2018-11-19 | 2020-10-16 | 深圳市腾讯信息技术有限公司 | Video file operation method, apparatus, and storage medium |
| CN111494954B (zh) * | 2020-04-22 | 2023-09-15 | 网易(杭州)网络有限公司 | Animation processing method and apparatus in a game, electronic device, and storage medium |
| CN113744377B (zh) * | 2020-05-27 | 2025-04-08 | 腾讯科技(深圳)有限公司 | Animation processing system, method, apparatus, device, and medium |
| CN111667589B (zh) * | 2020-06-12 | 2024-06-11 | 上海商汤智能科技有限公司 | Method and apparatus for triggering display of animation effects, electronic device, and storage medium |
| CN111803946B (zh) * | 2020-07-22 | 2024-02-09 | 网易(杭州)网络有限公司 | Lens switching method and apparatus in a game, and electronic device |
| CN112188255A (zh) * | 2020-09-30 | 2021-01-05 | 北京字跳网络技术有限公司 | Video-based interaction and video processing method, apparatus, device, and storage medium |
| CN112770135B (zh) * | 2021-01-21 | 2021-12-10 | 腾讯科技(深圳)有限公司 | Live-streaming-based content explanation method and apparatus, electronic device, and storage medium |
| CN114367113B (zh) * | 2022-01-13 | 2025-10-31 | 上海莉莉丝科技股份有限公司 | Method, device, medium, and computer program product for editing a virtual scene |
| CN114915819B (zh) * | 2022-03-30 | 2023-09-15 | 卡莱特云科技股份有限公司 | Interactive-screen-based data interaction method, apparatus, and system |
| CN115068936B (zh) * | 2022-05-19 | 2025-09-09 | 网易(杭州)网络有限公司 | Animation playback method and apparatus, computer device, and storage medium |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6040841A (en) * | 1996-08-02 | 2000-03-21 | Microsoft Corporation | Method and system for virtual cinematography |
| CN101465957A (zh) * | 2008-12-30 | 2009-06-24 | 应旭峰 | System for realizing remote-control interaction in a virtual three-dimensional scene |
| CN101908232A (zh) * | 2010-07-30 | 2010-12-08 | 重庆埃默科技有限责任公司 | Interactive scene simulation system and virtual scene simulation method |
| CN105069827A (zh) * | 2015-08-19 | 2015-11-18 | 北京中科大洋科技发展股份有限公司 | Method for processing video transitions using a three-dimensional model |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11137849A (ja) * | 1997-11-07 | 1999-05-25 | Daiichikosho Co Ltd | Computer game device |
| EP1230959A1 (en) * | 2001-02-13 | 2002-08-14 | Pumpkin Pie Net Limited | Video simulation method and program |
| DE60223483T2 (de) * | 2001-10-29 | 2008-09-18 | Humax Co. Ltd., Yougin | Method for recording a digital broadcast program and time-based playback of a recorded broadcast program, and corresponding apparatus |
| US20050075166A1 (en) * | 2002-05-14 | 2005-04-07 | Hemstreet Paul A. | Media program with interactive feature |
| US7299417B1 (en) * | 2003-07-30 | 2007-11-20 | Barris Joel M | System or method for interacting with a representation of physical space |
| CN101005609B (zh) * | 2006-01-21 | 2010-11-03 | 腾讯科技(深圳)有限公司 | Method and system for generating interactive video images |
| US8622831B2 (en) * | 2007-06-21 | 2014-01-07 | Microsoft Corporation | Responsive cutscenes in video games |
| JP5292146B2 (ja) | 2009-03-25 | 2013-09-18 | 株式会社ソニー・コンピュータエンタテインメント | Information processing device and information processing method |
| EP2430833A4 (en) * | 2009-05-13 | 2014-01-22 | Coincident Tv Inc | REPRODUCTION AND PROCESSING OF LINKED AND COMMENTED AUDIOVISUAL WORKS |
| US20120001925A1 (en) * | 2010-06-30 | 2012-01-05 | Ati Technologies, Ulc | Dynamic Feedback Load Balancing |
| CN102508662A (zh) * | 2011-11-04 | 2012-06-20 | 广东科学技术职业学院 | 一种基于brew平台的通用手机游戏开发系统和方法 |
| US10223926B2 (en) * | 2013-03-14 | 2019-03-05 | Nike, Inc. | Skateboard system |
- 2017-08-31 CN CN201710770774.6A patent/CN109420338A/zh active Pending
- 2018-08-21 KR KR1020197030918A patent/KR102339205B1/ko active Active
- 2018-08-21 JP JP2020511776A patent/JP6995425B2/ja active Active
- 2018-08-21 WO PCT/CN2018/101451 patent/WO2019042183A1/zh not_active Ceased
- 2018-08-21 EP EP18851267.7A patent/EP3677322A4/en active Pending
- 2019-08-26 US US16/551,498 patent/US11341706B2/en active Active
- 2022-04-22 US US17/727,636 patent/US11620784B2/en active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6040841A (en) * | 1996-08-02 | 2000-03-21 | Microsoft Corporation | Method and system for virtual cinematography |
| CN101465957A (zh) * | 2008-12-30 | 2009-06-24 | 应旭峰 | System for realizing remote-control interaction in a virtual three-dimensional scene |
| CN101908232A (zh) * | 2010-07-30 | 2010-12-08 | 重庆埃默科技有限责任公司 | Interactive scene simulation system and virtual scene simulation method |
| CN105069827A (zh) * | 2015-08-19 | 2015-11-18 | 北京中科大洋科技发展股份有限公司 | Method for processing video transitions using a three-dimensional model |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3677322A4 * |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113761281A (zh) * | 2021-04-26 | 2021-12-07 | 腾讯科技(深圳)有限公司 | Virtual resource processing method and apparatus, medium, and electronic device |
| CN113761281B (zh) * | 2021-04-26 | 2024-05-14 | 腾讯科技(深圳)有限公司 | Virtual resource processing method and apparatus, medium, and electronic device |
| CN114237475A (zh) * | 2021-12-31 | 2022-03-25 | 视伴科技(北京)有限公司 | Method, system, device, and medium for constructing a virtual studio |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3677322A4 (en) | 2020-09-16 |
| KR20190131074A (ko) | 2019-11-25 |
| CN109420338A (zh) | 2019-03-05 |
| JP2020532904A (ja) | 2020-11-12 |
| US11620784B2 (en) | 2023-04-04 |
| US20190378319A1 (en) | 2019-12-12 |
| US11341706B2 (en) | 2022-05-24 |
| KR102339205B1 (ko) | 2021-12-14 |
| JP6995425B2 (ja) | 2022-01-14 |
| US20220245881A1 (en) | 2022-08-04 |
| EP3677322A1 (en) | 2020-07-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019042183A1 (zh) | Virtual scene display method, apparatus, and storage medium | |
| CN112153288B (zh) | Method, apparatus, device, and medium for publishing videos or images | |
| US11079923B1 (en) | User interface for a video capture device | |
| CN101213606B (zh) | Synchronization system and method for interactive multimedia presentation management | |
| KR101365829B1 (ko) | Computer readable medium encoded with computer-executable instructions for performing a method of playing an interactive multimedia presentation, and presentation system and apparatus for playing an interactive multimedia presentation | |
| KR101265936B1 (ko) | Synchronization aspects of interactive multimedia presentation management | |
| KR102131322B1 (ko) | Computing device, method, and computer program for processing video | |
| WO2022068639A1 (zh) | Video-based interaction method, apparatus, device, and storage medium | |
| CN101193298A (zh) | System, method, and medium for playing moving images | |
| WO2022063090A1 (zh) | Method, apparatus, device, and storage medium for user guidance | |
| US10319411B2 (en) | Device and method for playing an interactive audiovisual movie | |
| CN102089823A (zh) | Aspects of media content presentation | |
| US11941728B2 (en) | Previewing method and apparatus for effect application, and device, and storage medium | |
| JP2023521199A (ja) | Video stream playback control method, device, and storage medium | |
| US20180239504A1 (en) | Systems and methods for providing webinars | |
| WO2022252998A1 (zh) | Video processing method, apparatus, device, and storage medium | |
| WO2023088484A1 (zh) | Method, apparatus, device, and storage medium for multimedia resource editing scenarios | |
| CN113038014A (zh) | Video processing method of application program and electronic device | |
| CN116389802A (zh) | Video processing method, apparatus, device, and storage medium | |
| JP6916860B2 (ja) | Program, system, and method for playing video | |
| CN111367598B (zh) | Action instruction processing method and apparatus, electronic device, and computer readable storage medium | |
| CN107277602B (zh) | Information acquisition method and electronic device | |
| US10939187B1 (en) | Traversing a semantic graph to process requests for video | |
| CN119583911B (zh) | Method and apparatus for processing video files | |
| KR102276789B1 (ko) | Video editing method and apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18851267; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 20197030918; Country of ref document: KR; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 2020511776; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2018851267; Country of ref document: EP; Effective date: 20200331 |