US20170064245A1 - Method, device, terminal device, and storage medium for video effect processing - Google Patents
- Legal status: Granted
Classifications
- H04N21/440281—Processing of video elementary streams involving reformatting operations for household redistribution, storage or real-time display, by altering the temporal resolution, e.g. by frame skipping
- H04N5/783—Television signal recording: adaptations for reproducing at a rate different from the recording rate
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G11B27/005—Reproducing at a different information rate from the information rate of recording
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals, on discs
- G11B27/28—Indexing; addressing; timing or synchronising by using information signals recorded by the same method as the main recording
- G11B27/34—Indicating arrangements
- H04N21/47217—End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
- G06T2207/10016—Video; image sequence (indexing scheme for image analysis or enhancement)
Definitions
- the present disclosure relates to video processing and, more particularly, to a method, device, terminal device, and storage medium for video effect processing.
- conventionally, in order to add a video effect, such as fast forward or slow motion, to a video on a terminal device, a user needs to install video editing software on the terminal device, perform effect processing operations on the video using the video editing software, and then save a new copy of the video. The user can then play the copy on the terminal device to watch the video with the video effect.
- the method includes receiving a video effect processing instruction associated with a video, performing a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and performing the video effect processing on the target video frames during playback of the video.
- a terminal device including a processor and a memory storing instructions.
- the instructions when executed by the processor, cause the processor to receive a video effect processing instruction associated with a video, perform a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and perform the video effect processing on the target video frames during playback of the video.
- a non-transitory computer-readable storage medium storing instructions that, when executed by a processor in a terminal, cause the terminal to receive a video effect processing instruction associated with a video, perform a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and perform the video effect processing on the target video frames during playback of the video.
- FIG. 1 is a flow chart showing a method for video effect processing according to an exemplary embodiment.
- FIG. 2 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- FIG. 3 is a schematic diagram showing a user clicking a video effect processing button.
- FIG. 4 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- FIG. 5 is a schematic diagram showing a user performing a slide operation on an interface of a terminal device that is playing a video.
- FIG. 6 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- FIG. 7 is a block diagram illustrating a device for video effect processing according to an exemplary embodiment.
- FIG. 8 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 9 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 10 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 11 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 12 is a block diagram illustrating a terminal device according to an exemplary embodiment.
- a method for video effect processing consistent with the present disclosure can be implemented, for example, in a terminal device such as a smart phone, a computer, a tablet device, a Personal Digital Assistant (PDA), or the like, and more specifically, can be performed by a component with processing function, such as a Central Processing Unit (CPU), in the terminal device.
- FIG. 1 is a flow chart of a method for video effect processing according to an exemplary embodiment.
- a video effect processing instruction is received.
- the user may wish to add video effects to the video. For example, for a video showing a rabbit running on grass, if a slow motion effect is added to the part showing the rabbit running, the movement of the running rabbit can be seen clearly, giving the user watching the video the feeling of watching a blockbuster movie.
- as another example, a large portion of a surveillance video may be merely a still picture that does not change for a long time. The user watching the surveillance video may wish to fast forward it to reduce the time for watching the video.
- a fast forward or a slow motion video effect can be performed directly while the video is played.
- ordinarily, the terminal device decodes and plays the video at a normal speed. If the user wishes to add a fast forward or slow motion video effect to the video, the user can turn on a video effect processing function before playing the video, or input a video effect processing instruction to the terminal device interactively while the video is being played.
- the video effect processing instruction generated by such operations can trigger the terminal device to perform a video effect processing on the video.
- a fast forward or a slow motion video effect processing is performed on the video according to the video effect processing instruction. That is, when triggered by the video effect processing instruction, the terminal device initiates the fast forward or slow motion video effect processing on the video.
- the terminal device can also acquire instruction information on performing the video effect processing on the video according to the video effect processing instruction.
- the user can generate the video effect processing instruction by clicking a preset video effect processing button, which can be a virtual button on a touch screen of the terminal device, or a physical button of the terminal device.
- the terminal device can include two special buttons for fast forward and slow motion, respectively.
- the terminal device can include only one special button and the video effect processing instruction can be generated by various clicking operations, such as a single click and a double click.
- FIG. 2 is a flow chart of a method for video effect processing according to another exemplary embodiment.
- a video effect processing instruction is received.
- the video effect processing instruction is generated after a user clicks a preset video effect processing button, which is configured to trigger a video effect processing on a video to be played, such as the fast forward or slow motion video effect processing.
- the user can click on the preset video effect processing button before playing the video to generate a slow motion video effect processing instruction that triggers the terminal device to perform the slow motion video effect processing while playing the video.
- the preset video effect processing button can be a virtual button on the touch screen of the terminal device or a physical button of the terminal device.
- the preset video effect processing button can be one of two special buttons for fast forward and slow motion, respectively, or can be a special button configured to generate different video effect processing instructions for fast forward and slow motion by different clicking operations such as single click and double click.
- FIG. 3 is a schematic diagram showing the user clicking the video effect processing button. As illustrated in FIG. 3 , the buttons for fast forward and slow motion are provided on an interface for playing the video on the terminal device.
- a movement amount detection is performed on the video according to the video effect processing instruction and video frames that require the video effect processing are determined.
- the terminal device can perform the movement amount detection automatically on the video during the playback of the video according to the above video effect processing instruction.
- the video includes a plurality of video frames, which may be processed differently according to the present disclosure.
- the slow motion video effect processing can be performed such that the user can more clearly see the movement or can be performed to realize a slowing-down effect of the fast movement.
- the fast forward video effect processing can be performed to shorten the watching time or to prevent the user from being impatient.
- the terminal device can perform the movement amount detection on each frame of the video using a Motion Estimation and Motion Compensation (MEMC) technology, to find fast movement frames or slow movement frames according to the received video effect processing instruction, based on the speed of changes of the image content in each frame of the video.
- the terminal device can determine the fast movement frames as video frames that require slow motion video effect processing and the slow movement frames as video frames that require fast forward video effect processing.
- each video frame can be divided into a plurality of preset blocks.
- the terminal device can perform the movement amount detection, for example, through the MEMC technology, on the video according to the above video effect processing instruction to acquire a motion vector magnitude value for each of the preset blocks in each video frame. If the video effect processing instruction is the slow motion video effect processing instruction, the terminal device determines the number of fast blocks in each of the video frames, where a fast block refers to a preset block having a motion vector magnitude value greater than a motion vector threshold. For each video frame, the terminal device calculates a first ratio between the number of fast blocks and the total number of preset blocks in the video frame, and judges whether the first ratio is greater than a first preset ratio threshold. If the first ratio is greater than the first preset ratio threshold, which means the corresponding video frame involves fast movement, the terminal device determines that the corresponding video frame is a video frame that requires the slow motion video effect processing.
- the terminal device determines the number of slow blocks in each of the video frames, where a slow block refers to a preset block having a motion vector magnitude value smaller than the motion vector threshold. For each video frame, the terminal device calculates a second ratio between the number of slow blocks and the total number of preset blocks in the video frame, and judges whether the second ratio is greater than a second preset ratio threshold. If the second ratio is greater than the second preset ratio threshold, which means the corresponding video frame involves slow movement or a still or close-to-still image, the terminal device determines that the corresponding video frame is a video frame that requires the fast forward video effect processing.
- a video frame that requires either the slow motion video effect processing or the fast forward video effect processing is also referred to as a “video frame to be processed.”
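- the per-frame classification described above can be sketched as follows. This is an illustrative sketch only: the function name, the input format (one motion vector magnitude per preset block), and the threshold values are assumptions for demonstration, not values given in the disclosure.

```python
def classify_frame(block_magnitudes, motion_threshold=4.0,
                   fast_ratio_threshold=0.5, slow_ratio_threshold=0.5):
    """Label one frame given the motion vector magnitude of each preset block."""
    total = len(block_magnitudes)
    # fast blocks: magnitude greater than the motion vector threshold
    fast = sum(1 for m in block_magnitudes if m > motion_threshold)
    # slow blocks: magnitude smaller than the motion vector threshold
    slow = sum(1 for m in block_magnitudes if m < motion_threshold)
    if fast / total > fast_ratio_threshold:
        return "slow-motion"    # fast movement: candidate for slow motion effect
    if slow / total > slow_ratio_threshold:
        return "fast-forward"   # slow or still content: candidate for fast forward
    return "normal"

print(classify_frame([6.0, 7.5, 5.2, 0.3]))  # mostly fast blocks -> slow-motion
print(classify_frame([0.1, 0.2, 0.0, 5.0]))  # mostly slow blocks -> fast-forward
```

- note that a frame with a mix of fast and slow blocks that crosses neither ratio threshold is left unprocessed, matching the idea that only frames clearly involving fast or slow movement become "video frames to be processed."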
- the video effect processing is performed on the video frames to be processed according to the video effect processing instruction.
- the terminal device performs a frame interpolation, such as an MEMC technology based frame interpolation, on the video frames to be processed according to a preset interpolation algorithm.
- the preset interpolation algorithm can include an interpolation multiple and a correspondence between the number of interpolated frames and the moving speed in the video frames to be processed.
- the terminal device performs a frame extraction, such as an MEMC technology based frame extraction, on the video frames to be processed according to a preset extraction algorithm.
- the preset extraction algorithm can include an extraction ratio and a correspondence between the number of extracted frames and the moving speed in the video frames to be processed.
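- the two operations above can be sketched as follows. In the disclosure the interpolation is MEMC based, i.e., the inserted frames are motion compensated; in this sketch a simple linear blend of neighboring frames stands in for that, and extraction is plain frame dropping. Function names and the toy one-sample "frames" are illustrative assumptions.

```python
def interpolate_frames(frames, multiple=2):
    """Insert (multiple - 1) blended frames between each pair of neighbors
    to slow apparent motion (stand-in for MEMC interpolation)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, multiple):
            t = k / multiple
            out.append([(1 - t) * x + t * y for x, y in zip(a, b)])
    out.append(frames[-1])
    return out

def extract_frames(frames, keep_every=2):
    """Keep every keep_every-th frame to shorten playback (fast forward)."""
    return frames[::keep_every]

clip = [[0.0], [2.0], [4.0]]          # three one-sample "frames"
print(interpolate_frames(clip))       # [[0.0], [1.0], [2.0], [3.0], [4.0]]
print(extract_frames(clip))           # [[0.0], [4.0]]
```

- at a fixed display rate, doubling the frame count halves the apparent speed, and keeping every second frame doubles it, which is the intended slow motion and fast forward effect.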
- FIG. 4 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- a video effect processing instruction is received.
- the video effect processing instruction is generated after a user performs a slide operation on an interface of the terminal device that is playing a video.
- the slide operation can include, for example, a left slide operation to generate a slow motion video effect processing instruction or a right slide operation to generate a fast forward video effect processing instruction.
- FIG. 5 is a schematic diagram of the user performing the left slide operation on the interface with a finger during the playback of the video.
- if the slide operation is the left slide operation, a slow motion video effect may be added to the video.
- if the slide operation is the right slide operation, a fast forward video effect may be added to the video.
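- the gesture-to-instruction mapping described above can be sketched as a small lookup; the string labels are illustrative assumptions, not identifiers from the disclosure.

```python
def instruction_for_slide(direction):
    """Map a slide gesture on the playback interface to an effect instruction."""
    mapping = {"left": "slow-motion", "right": "fast-forward"}
    return mapping.get(direction)  # None for unrecognized gestures

print(instruction_for_slide("left"))   # slow-motion
print(instruction_for_slide("right"))  # fast-forward
```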
- a video effect process is performed on video frames that start from a video frame corresponding to the time at which the video effect processing instruction is received, also referred to herein as “instruction receiving time,” to realize the corresponding video effect processing on the video.
- the terminal device starts to perform a frame interpolation on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset interpolation algorithm to realize the slow motion video effect in the video.
- the terminal device starts to perform a frame extraction on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset extraction algorithm to realize the fast forward video effect in the video.
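- processing only from the instruction receiving time onward can be sketched as follows. Here the frames are abstract tokens, naive frame repetition stands in for the preset MEMC interpolation algorithm, and dropping every second frame stands in for the preset extraction algorithm; all names are illustrative assumptions.

```python
def apply_effect_from(frames, start_index, effect):
    """Leave frames before start_index untouched; process the rest."""
    head, tail = frames[:start_index], frames[start_index:]
    if effect == "fast-forward":
        tail = tail[::2]                            # frame extraction
    elif effect == "slow-motion":
        tail = [f for f in tail for _ in range(2)]  # naive frame repetition
    return head + tail

print(apply_effect_from([1, 2, 3, 4], 2, "fast-forward"))  # [1, 2, 3]
print(apply_effect_from([1, 2, 3, 4], 2, "slow-motion"))   # [1, 2, 3, 3, 4, 4]
```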
- FIG. 6 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- a video effect processing instruction is received.
- the video effect processing instruction is generated after a user clicks a preset video effect processing button on an interface of the terminal device that is playing a video, and the preset video effect processing button is configured to trigger the video effect processing on the video being played.
- a video effect process is performed on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the corresponding video effect processing on the video.
- the terminal device starts to perform a frame interpolation on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset interpolation algorithm to realize the slow motion video effect in the video.
- the terminal device starts to perform a frame extraction on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset extraction algorithm to realize the fast forward video effect in the video.
- the video effect processing instruction is received before the video is played.
- the terminal device automatically performs the movement amount detection to determine the video frames that need to be processed.
- the video effect processing instruction is received while the video is being played.
- the terminal device does not actively perform the movement amount detection on the video. Rather, the terminal device starts to perform the corresponding video effect processing on the video from the video frames corresponding to the instruction receiving time. That is, in this scenario, the terminal triggers the video effect processing according to an interactive operation of the user.
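- the two trigger scenarios summarized above can be sketched as a simple dispatch; the strategy labels are illustrative assumptions.

```python
def processing_strategy(playback_started):
    """Choose how to handle a newly received video effect processing instruction."""
    if not playback_started:
        # instruction received before playback: detect target frames automatically
        return "movement-amount-detection"
    # instruction received during playback: process from the current frame onward
    return "from-instruction-receiving-time"

print(processing_strategy(False))  # movement-amount-detection
print(processing_strategy(True))   # from-instruction-receiving-time
```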
- FIG. 7 is a block diagram illustrating a device 700 for video effect processing according to an exemplary embodiment.
- the device 700 includes a receiving unit 701 and a video effect processing unit 702 .
- the receiving unit 701 is configured to receive a video effect processing instruction.
- the video effect processing unit 702 is configured to perform a video effect processing, such as a slow motion video effect processing or a fast forward video effect processing, on a video according to the video effect processing instruction during the playback of the video.
- the receiving unit 701 is further configured to receive the video effect processing instruction that is generated after a user clicks a preset video effect processing button.
- FIG. 8 is a block diagram illustrating a device 800 for video effect processing according to another exemplary embodiment.
- the video effect processing unit 702 includes a slow motion movement detection module 801 and an interpolation processing module 802 .
- the slow motion movement detection module 801 is configured to perform a movement amount detection on the video according to the video effect processing instruction and determine video frames that require the slow motion video effect processing in the video.
- the interpolation processing module 802 is configured to perform a frame interpolation process on these video frames according to a preset interpolation algorithm to realize the slow motion video effect processing on the video.
- FIG. 9 is a block diagram illustrating a device 900 for video effect processing according to another exemplary embodiment.
- the slow motion movement detection module 801 includes a magnitude value acquiring sub module 901 , a first determining sub module 902 , a calculation sub module 903 , a deciding sub module 904 , and a slow motion frame determining sub module 905 .
- the magnitude value acquiring sub module 901 is configured to perform a movement amount detection on the video according to the video effect processing instruction and acquire motion vector magnitude values of all preset blocks in each video frame of the video.
- the first determining sub module 902 is configured to determine fast blocks in a video frame.
- the calculation sub module 903 is configured to calculate a first ratio between the number of fast blocks and the total number of preset blocks in the video frame.
- the deciding sub module 904 is configured to judge whether the first ratio is greater than a first preset ratio threshold.
- the slow motion frame determining sub module 905 is configured to determine that the video frame requires the slow motion video effect processing if the first ratio is greater than the first preset ratio threshold.
- FIG. 10 is a block diagram illustrating a device 1000 for video effect processing according to another exemplary embodiment.
- the video effect processing unit 702 includes a fast forward movement detection module 1001 and an extraction processing module 1002 .
- the fast forward movement detection module 1001 is configured to perform a movement amount detection on the video that is to be processed according to the video effect processing instruction and determine, from the video, video frames that require the fast forward video effect processing.
- the extraction processing module 1002 is configured to perform a frame extraction process on these video frames according to a preset extraction algorithm to realize the fast forward video effect processing on the video.
- FIG. 11 is a block diagram illustrating a device 1100 for video effect processing according to another exemplary embodiment.
- the fast forward movement detection module 1001 includes the magnitude value acquiring sub module 901 , a second determining sub module 1102 , a calculation sub module 1103 , a deciding sub module 1104 , and a fast forward frame determining sub module 1105 .
- the second determining sub module 1102 is configured to determine slow blocks in a video frame.
- the calculation sub module 1103 is configured to calculate a second ratio between the number of slow blocks and the total number of preset blocks in the video frame.
- the deciding sub module 1104 is configured to judge whether the second ratio is greater than a second preset ratio threshold.
- the fast forward frame determining sub module 1105 is configured to determine that the video frame requires the fast forward video effect processing if the second ratio is greater than the second preset ratio threshold.
- the receiving unit 701 is further configured to receive the video effect processing instruction that is generated after the user performs a slide operation on an interface of the terminal that is playing a video.
- the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset interpolation algorithm, a frame interpolation process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the slow motion effect processing on the video.
- the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset extraction algorithm, a frame extraction process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the fast forward effect processing on the video.
- the receiving unit 701 is further configured to receive the video effect processing instruction that is generated after the user clicks a preset video effect processing button on the interface of the terminal device that is playing the video.
- the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset interpolation algorithm, a frame interpolation process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the slow motion video effect processing on the video.
- the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset extraction algorithm, a frame extraction process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the fast forward video effect processing on the video.
- FIG. 12 is a block diagram of a terminal device 1200 according to an exemplary embodiment.
- the terminal device 1200 may be a smart phone, a computer, a tablet device, a PDA (Personal Digital Assistant), or the like.
- the terminal device 1200 includes one or more of the following components: a processing component 1202 , a memory 1204 , a power component 1206 , a multimedia component 1208 , an audio component 1210 , an input/output (I/O) interface 1212 , a sensor component 1214 , and a communication component 1216 .
- the processing component 1202 typically controls overall operations of the terminal device 1200 , such as the operations associated with display, telephone calls, data communications, camera operations, and recording operations.
- the processing component 1202 may include one or more processors 1220 to execute instructions to perform all or part of a method consistent with the present disclosure, such as one of the above-described exemplary methods.
- the processing component 1202 may include one or more modules which facilitate the interaction between the processing component 1202 and other components.
- the processing component 1202 may include a multimedia module to facilitate the interaction between the multimedia component 1208 and the processing component 1202 .
- the memory 1204 is configured to store various types of data to support the operation of the terminal device 1200 . Examples of such data include instructions for any applications or methods operated on the terminal device 1200 , contact data, phonebook data, messages, pictures, video, etc.
- the memory 1204 may be implemented using any type of volatile or non-volatile memory devices, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic or optical disk.
- the power component 1206 provides power to various components of the terminal device 1200 .
- the power component 1206 may include a power management system, one or more power sources, and any other components associated with the generation, management, and distribution of power in the terminal device 1200 .
- the multimedia component 1208 includes a screen providing an output interface between the terminal device 1200 and the user.
- the screen may include a liquid crystal display (LCD) and a touch panel. If the screen includes the touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
- the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense a boundary of a touch or swipe action, but also sense a period of time and a pressure associated with the touch or swipe action.
- the multimedia component 1208 includes a front camera and/or a rear camera.
- the front camera and the rear camera may receive an external multimedia datum while the terminal device 1200 is in an operation mode, such as a photographing mode or a video mode.
- Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability.
- the audio component 1210 is configured to output and/or input audio signals.
- the audio component 1210 includes a microphone configured to receive an external audio signal when the terminal device 1200 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode.
- the received audio signal may be further stored in the memory 1204 or transmitted via the communication component 1216 .
- the audio component 1210 further includes a speaker to output audio signals.
- the I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, such as a keyboard, a click wheel, buttons, and the like.
- the buttons may include, but are not limited to, a home button, a volume button, a starting button, and a locking button.
- the sensor component 1214 includes one or more sensors to provide status assessments of various aspects of the terminal device 1200 .
- the sensor component 1214 may detect an open/closed status of the terminal device 1200 , relative positioning of components, e.g., the display and the keypad, of the terminal device 1200 , a change in position of the terminal device 1200 or a component of the terminal device 1200 , a presence or absence of user contact with the terminal device 1200 , an orientation or an acceleration/deceleration of the terminal device 1200 , and a change in temperature of the terminal device 1200 .
- the sensor component 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- the sensor component 1214 may also include a light sensor, such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-coupled Device) image sensor, for use in imaging applications.
- the sensor component 1214 may also include an accelerometer sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- the communication component 1216 is configured to facilitate communication, wired or wirelessly, between the terminal device 1200 and other devices.
- the terminal device 1200 can access a wireless network based on a communication standard, such as WiFi (Wireless Fidelity), 3G, or 4G, or a combination thereof.
- the communication component 1216 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel.
- the communication component 1216 further includes a near field communication (NFC) module to facilitate short-range communications.
- the NFC module may be implemented based on a radio frequency identification (RFID) technology, an infrared data association (IrDA) technology, an ultra-wideband (UWB) technology, a Bluetooth technology, or other technologies.
- the terminal device 1200 may be implemented with one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components, for performing a method consistent with the present disclosure, such as one of the above-described exemplary methods.
- A non-transitory computer-readable storage medium including instructions, such as those included in the memory 1204 and executable by the processor 1220 in the terminal device 1200, is also provided for performing a method consistent with the present disclosure, such as one of the above-described exemplary methods.
- the non-transitory computer-readable storage medium may be a ROM, a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disc, an optical data storage device, or the like.
- a fast forward video effect processing or a slow motion video effect processing can be performed during playback of a video according to a video effect processing instruction.
- the video playback and the video effect processing can be performed simultaneously.
- the efficiency of the video effect processing is increased and a user's enjoyment of the video playback is improved.
Abstract
A method for video effect processing includes receiving a video effect processing instruction associated with a video, performing a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and performing the video effect processing on the target video frames during playback of the video.
Description
- This application is based upon and claims priority to Chinese Patent Application 201510541656.9, filed on Aug. 28, 2015, the entire contents of which are incorporated herein by reference.
- The present disclosure relates to video processing and, more particularly, to a method, device, terminal device, and storage medium for video effect processing.
- In conventional technologies, in order to add a video effect, such as fast forward or slow motion, to a video on a terminal device, a user needs to install video editing software on the terminal device, perform effect processing operations on the video using the video editing software, and then save a new copy of the video. Then, the user can play the copy on the terminal device to watch the video with the video effect, such as fast forward or slow motion.
- In accordance with the present disclosure, there is provided a method for video effect processing. The method includes receiving a video effect processing instruction associated with a video, performing a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and performing the video effect processing on the target video frames during playback of the video.
- In accordance with the present disclosure, there is also provided a terminal device including a processor and a memory storing instructions. The instructions, when executed by the processor, cause the processor to receive a video effect processing instruction associated with a video, perform a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and perform the video effect processing on the target video frames during playback of the video.
- In accordance with the present disclosure, there is also provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor in a terminal, cause the terminal to receive a video effect processing instruction associated with a video, perform a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing, and perform the video effect processing on the target video frames during playback of the video.
- It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the present disclosure.
- The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
- FIG. 1 is a flow chart showing a method for video effect processing according to an exemplary embodiment.
- FIG. 2 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- FIG. 3 is a schematic diagram showing a user clicking a video effect processing button.
- FIG. 4 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- FIG. 5 is a schematic diagram showing a user performing a slide operation on an interface of a terminal device that is playing a video.
- FIG. 6 is a flow chart showing a method for video effect processing according to another exemplary embodiment.
- FIG. 7 is a block diagram illustrating a device for video effect processing according to an exemplary embodiment.
- FIG. 8 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 9 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 10 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 11 is a block diagram illustrating a device for video effect processing according to another exemplary embodiment.
- FIG. 12 is a block diagram illustrating a terminal device according to an exemplary embodiment.
- Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise represented. The implementations set forth in the following description of exemplary embodiments do not represent all implementations consistent with the present disclosure. Instead, they are merely examples of devices and methods consistent with aspects related to the invention as recited in the appended claims.
- A method for video effect processing consistent with the present disclosure can be implemented, for example, in a terminal device such as a smart phone, a computer, a tablet device, a Personal Digital Assistant (PDA), or the like, and more specifically, can be performed by a component with a processing function, such as a Central Processing Unit (CPU), in the terminal device.
- FIG. 1 is a flow chart of a method for video effect processing according to an exemplary embodiment. As illustrated in FIG. 1, at 101, a video effect processing instruction is received. Sometimes, while watching a video on the terminal device, the user may wish to add video effects to the video. For example, for a video showing a rabbit running on grass, if a slow motion effect is added to the part showing the rabbit running, the movement of the running rabbit can be seen clearly, such that the user watching the video may have the feeling of enjoying a blockbuster movie. As another example, a large portion of a surveillance video may be merely a still picture that does not change for a long time. The user watching the surveillance video may wish to fast forward it to reduce the watching time. According to the present disclosure, a fast forward or a slow motion video effect can be applied directly while the video is played. Generally, the terminal device decodes and plays the video at a normal speed. If the user wishes to add a fast forward or slow motion video effect to the video, the user can turn on a video effect processing function before playing the video, or input a video effect processing instruction to the terminal interactively while the video is being played. The video effect processing instruction generated by such operations can trigger the terminal device to perform a video effect processing on the video.
- At 102, during playback of the video, a fast forward or a slow motion video effect processing is performed on the video according to the video effect processing instruction. That is, when triggered by the video effect processing instruction, the terminal device initiates the fast forward or the slow motion video effect processing on the video.
At the same time, the terminal device can also acquire, according to the video effect processing instruction, information on how to perform the video effect processing on the video. For example, the user can generate the video effect processing instruction by clicking a preset video effect processing button, which can be a virtual button on a touch screen of the terminal device, or a physical button of the terminal device. In some embodiments, the terminal device can include two special buttons for fast forward and slow motion, respectively. In some embodiments, the terminal device can include only one special button, and the video effect processing instruction can be generated by various clicking operations, such as a single click and a double click.
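As a loose illustration of the single-button variant just described, the mapping from click gesture to instruction type might look like the following sketch. The gesture names and the assignment of single versus double click are assumptions for illustration; the disclosure only says that different clicking operations generate different instructions.

```python
# Hypothetical mapping for a single special button: which click gesture
# produces which video effect processing instruction. The concrete
# assignment below is an illustrative assumption, not part of the disclosure.
CLICK_TO_INSTRUCTION = {
    "single_click": "fast_forward",
    "double_click": "slow_motion",
}

def on_button_click(click_type):
    """Return the instruction generated by a click, or None if unrecognized."""
    return CLICK_TO_INSTRUCTION.get(click_type)

print(on_button_click("double_click"))  # slow_motion
print(on_button_click("long_press"))    # None
```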
- FIG. 2 is a flow chart of a method for video effect processing according to another exemplary embodiment. As illustrated in FIG. 2, at 201, a video effect processing instruction is received. The video effect processing instruction is generated after a user clicks a preset video effect processing button, which is configured to trigger a video effect processing, such as the fast forward or slow motion video effect processing, on a video to be played. For example, the user can click the preset video effect processing button before playing the video to generate a slow motion video effect processing instruction that triggers the terminal device to perform the slow motion video effect processing while playing the video. The preset video effect processing button can be a virtual button on the touch screen of the terminal device or a physical button of the terminal device. The preset video effect processing button can be one of two special buttons for fast forward and slow motion, respectively, or can be a single special button configured to generate different video effect processing instructions for fast forward and slow motion through different clicking operations, such as a single click and a double click. FIG. 3 is a schematic diagram showing the user clicking the video effect processing button. As illustrated in FIG. 3, the buttons for fast forward and slow motion are provided on an interface for playing the video on the terminal device.
- At 202, a movement amount detection is performed on the video according to the video effect processing instruction and video frames that require the video effect processing are determined.
- The terminal device can perform the movement amount detection automatically on the video during the playback of the video according to the above video effect processing instruction. The video includes a plurality of video frames, which may be processed differently according to the present disclosure. For example, for video frames containing fast movement, which are also referred to herein as "fast movement frames," the slow motion video effect processing can be performed so that the user can more clearly see the movement, or to realize a slowing-down effect on the fast movement. On the other hand, for video frames containing slow movement or including still or close-to-still images, which are also referred to herein as "slow movement frames," the fast forward video effect processing can be performed to shorten the watching time and to prevent the user from becoming impatient. In some embodiments, the terminal device can perform the movement amount detection on each frame of the video using a Motion Estimation and Motion Compensation (MEMC) technology, to find fast movement frames or slow movement frames according to the received video effect processing instruction, based on how quickly the image content changes in each frame of the video. The terminal device can determine the fast movement frames as video frames that require the slow motion video effect processing and the slow movement frames as video frames that require the fast forward video effect processing.
- In some embodiments, each video frame can be divided into a plurality of preset blocks. The terminal device can perform the movement amount detection, for example, through the MEMC technology, on the video according to the above video effect processing instruction to acquire a motion vector magnitude value for each of the preset blocks in each video frame. If the video effect processing instruction is the slow motion video effect processing instruction, the terminal device determines the number of fast blocks in each of the video frames, where a fast block refers to a preset block having a motion vector magnitude value greater than a motion vector threshold. For each video frame, the terminal device calculates a first ratio between the number of fast blocks and the total number of preset blocks in the video frame, and judges whether the first ratio is greater than a first preset ratio threshold. If the first ratio is greater than the first preset ratio threshold, which means the corresponding video frame involves fast movement, the terminal device determines that the corresponding video frame is a video frame that requires the slow motion video effect processing.
- On the other hand, if the video effect processing instruction is the fast forward video effect processing instruction, the terminal device determines the number of slow blocks in each of the video frames, where a slow block refers to a preset block having a motion vector magnitude value smaller than the motion vector threshold. For each video frame, the terminal device calculates a second ratio between the number of slow blocks and the total number of preset blocks in the video frame, and judges whether the second ratio is greater than a second preset ratio threshold. If the second ratio is greater than the second preset ratio threshold, which means the corresponding video frame involves slow movement or a still or close-to-still image, the terminal device determines that the corresponding video frame is a video frame that requires the fast forward video effect processing. Hereinafter, a video frame that requires either the slow motion video effect processing or the fast forward video effect processing is also referred to as a “video frame to be processed.”
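The per-frame decision described in the two paragraphs above can be sketched as follows. This is a simplified illustration, not the patented implementation: in practice the motion vector magnitudes would come from an MEMC engine, and the threshold values below are arbitrary example numbers.

```python
def frame_needs_effect(block_magnitudes, motion_threshold, ratio_threshold, mode):
    """Decide whether one video frame requires the requested effect.

    block_magnitudes: motion vector magnitude of each preset block in the frame.
    mode: "slow_motion" counts fast blocks (magnitude > motion_threshold) and
          compares the first ratio against the first preset ratio threshold;
          "fast_forward" counts slow blocks (magnitude < motion_threshold) and
          compares the second ratio against the second preset ratio threshold.
    """
    total = len(block_magnitudes)
    if mode == "slow_motion":
        hits = sum(1 for m in block_magnitudes if m > motion_threshold)
    elif mode == "fast_forward":
        hits = sum(1 for m in block_magnitudes if m < motion_threshold)
    else:
        raise ValueError("unknown mode: " + mode)
    return hits / total > ratio_threshold

# Three of four blocks move fast, so the frame qualifies for slow motion:
print(frame_needs_effect([9.0, 8.5, 7.2, 0.4], 5.0, 0.5, "slow_motion"))   # True
# Three of four blocks are nearly still, so the frame qualifies for fast forward:
print(frame_needs_effect([0.1, 0.0, 0.2, 6.0], 5.0, 0.5, "fast_forward"))  # True
```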
- At 203, the video effect processing is performed on the video frames to be processed according to the video effect processing instruction. If the video effect processing instruction is the slow motion video effect processing instruction, the terminal device performs a frame interpolation, such as an MEMC technology based frame interpolation, on the video frames to be processed according to a preset interpolation algorithm. The preset interpolation algorithm can specify an interpolation multiple and a correspondence between the number of interpolated frames and the moving speed in the video frames to be processed.
- On the other hand, if the video effect processing instruction is the fast forward video effect processing instruction, the terminal device performs a frame extraction, such as an MEMC technology based frame extraction, on the video frames to be processed according to a preset extraction algorithm. The preset extraction algorithm can specify an extraction ratio and a correspondence between the number of extracted frames and the moving speed in the video frames to be processed.
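The two processing paths above can be caricatured on a decoded frame sequence as follows. This sketch only duplicates or drops frames; a real MEMC-based interpolation synthesizes new motion-compensated in-between frames rather than repeating existing ones, and the multiple and ratio values are placeholder assumptions.

```python
def interpolate_frames(frames, multiple):
    """Slow motion sketch: expand each frame to `multiple` copies so the
    segment plays `multiple` times longer. (A real MEMC interpolator would
    generate new in-between frames instead of duplicating existing ones.)"""
    return [f for f in frames for _ in range(multiple)]

def extract_frames(frames, ratio):
    """Fast forward sketch: keep one frame out of every `ratio` frames."""
    return frames[::ratio]

frames = list(range(8))               # stand-ins for 8 decoded frames
print(interpolate_frames(frames, 2))  # 16 frames: 0, 0, 1, 1, ..., 7, 7
print(extract_frames(frames, 4))      # [0, 4]
```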
- FIG. 4 is a flow chart showing a method for video effect processing according to another exemplary embodiment. As illustrated in FIG. 4, at 401, a video effect processing instruction is received. The video effect processing instruction is generated after a user performs a slide operation on an interface of the terminal device that is playing a video. The slide operation can include, for example, a left slide operation to generate a slow motion video effect processing instruction or a right slide operation to generate a fast forward video effect processing instruction. For example, FIG. 5 is a schematic diagram of the user performing the left slide operation on the interface with a finger during the playback of the video. Thus, if the slide operation is the left slide operation, a slow motion video effect may be added to the video. On the other hand, if the slide operation is the right slide operation, a fast forward video effect may be added to the video.
- At 402, according to the video effect processing instruction and a preset algorithm, a video effect process is performed on video frames that start from a video frame corresponding to the time at which the video effect processing instruction is received, also referred to herein as "instruction receiving time," to realize the corresponding video effect processing on the video.
- If the slide operation is the left slide operation, the terminal device starts to perform a frame interpolation on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset interpolation algorithm to realize the slow motion video effect in the video. On the other hand, if the slide operation is the right slide operation, the terminal device starts to perform a frame extraction on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset extraction algorithm to realize the fast forward video effect in the video.
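The slide handling above can be sketched as a small dispatch that leaves already-played frames untouched and processes only the frames from the instruction receiving time onward. The direction-to-effect mapping follows the description above; the 2x interpolation and 1-in-2 extraction factors are illustrative assumptions.

```python
def apply_slide_effect(direction, frames, received_at):
    """Apply the effect only to frames from the instruction receiving time on.

    direction: "left" (slow motion) or "right" (fast forward).
    received_at: index of the frame being played when the slide is received.
    The 2x interpolation and 1-in-2 extraction are placeholder parameters.
    """
    head, tail = frames[:received_at], frames[received_at:]
    if direction == "left":                     # slow motion: interpolate
        tail = [f for f in tail for _ in range(2)]
    elif direction == "right":                  # fast forward: extract
        tail = tail[::2]
    return head + tail

frames = [0, 1, 2, 3, 4, 5]
print(apply_slide_effect("left", frames, 4))   # [0, 1, 2, 3, 4, 4, 5, 5]
print(apply_slide_effect("right", frames, 2))  # [0, 1, 2, 4]
```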
- FIG. 6 is a flow chart showing a method for video effect processing according to another exemplary embodiment. As illustrated in FIG. 6, at 601, a video effect processing instruction is received. The video effect processing instruction is generated after a user clicks a preset video effect processing button on an interface of the terminal device that is playing a video, and the preset video effect processing button is configured to trigger the video effect processing on the video being played.
- At 602, according to the video effect processing instruction and a preset algorithm, a video effect process is performed on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the corresponding video effect processing on the video.
- If the video effect processing button clicked by the user is a slow motion video effect processing button, the terminal device starts to perform a frame interpolation on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset interpolation algorithm to realize the slow motion video effect in the video. On the other hand, if the video effect processing button clicked by the user is a fast forward video effect processing button, the terminal device starts to perform a frame extraction on the video frames starting from the video frame corresponding to the instruction receiving time. Specifically, the terminal device processes these video frames according to a preset extraction algorithm to realize the fast forward video effect in the video.
- In the exemplary methods described above in connection with FIGS. 2 and 3, the video effect processing instruction is received before the video is played. Thus, during the playback of the video, the terminal device automatically performs the movement amount detection to determine the video frames that need to be processed. On the other hand, in the exemplary methods described above in connection with FIGS. 4-6, the video effect processing instruction is received while the video is being played. In this scenario, the terminal device does not actively perform the movement amount detection on the video. Rather, the terminal device starts to perform the corresponding video effect processing on the video from the video frame corresponding to the instruction receiving time. That is, in this scenario, the terminal device triggers the video effect processing according to an interactive operation of the user. -
FIG. 7 is a block diagram illustrating a device 700 for video effect processing according to an exemplary embodiment. Referring to FIG. 7, the device 700 includes a receiving unit 701 and a video effect processing unit 702. The receiving unit 701 is configured to receive a video effect processing instruction. The video effect processing unit 702 is configured to perform a video effect processing, such as a slow motion video effect processing or a fast forward video effect processing, on a video according to the video effect processing instruction during the playback of the video.
- In some embodiments, the receiving unit 701 is further configured to receive the video effect processing instruction that is generated after a user clicks a preset video effect processing button. -
FIG. 8 is a block diagram illustrating a device 800 for video effect processing according to another exemplary embodiment. Referring to FIG. 8, the video effect processing unit 702 includes a slow motion movement detection module 801 and an interpolation processing module 802. The slow motion movement detection module 801 is configured to perform a movement amount detection on the video according to the video effect processing instruction and determine video frames that require the slow motion video effect processing in the video. The interpolation processing module 802 is configured to perform a frame interpolation process on these video frames according to a preset interpolation algorithm to realize the slow motion video effect processing on the video. -
FIG. 9 is a block diagram illustrating a device 900 for video effect processing according to another exemplary embodiment. Referring to FIG. 9, the slow motion movement detection module 801 includes a magnitude value acquiring sub module 901, a first determining sub module 902, a calculation sub module 903, a deciding sub module 904, and a slow motion frame determining sub module 905. The magnitude value acquiring sub module 901 is configured to perform a movement amount detection on the video according to the video effect processing instruction and acquire motion vector magnitude values of all preset blocks in each video frame of the video. The first determining sub module 902 is configured to determine fast blocks in a video frame. The calculation sub module 903 is configured to calculate a first ratio between the number of fast blocks and the total number of preset blocks in the video frame. The deciding sub module 904 is configured to judge whether the first ratio is greater than a first preset ratio threshold. The slow motion frame determining sub module 905 is configured to determine that the video frame requires the slow motion video effect processing if the first ratio is greater than the first preset ratio threshold. -
FIG. 10 is a block diagram illustrating a device 1000 for video effect processing according to another exemplary embodiment. Referring to FIG. 10, the video effect processing unit 702 includes a fast forward movement detection module 1001 and an extraction processing module 1002. The fast forward movement detection module 1001 is configured to perform a movement amount detection on the video that is to be processed according to the video effect processing instruction and determine, from the video, video frames that require the fast forward video effect processing. The extraction processing module 1002 is configured to perform a frame extraction process on these video frames according to a preset extraction algorithm to realize the fast forward video effect processing on the video. -
FIG. 11 is a block diagram illustrating a device 1100 for video effect processing according to another exemplary embodiment. Referring to FIG. 11, the fast forward movement detection module 1001 includes the magnitude value acquiring sub module 901, a second determining sub module 1102, a calculation sub module 1103, a deciding sub module 1104, and a fast forward frame determining sub module 1105. The second determining sub module 1102 is configured to determine slow blocks in a video frame. The calculation sub module 1103 is configured to calculate a second ratio between the number of slow blocks and the total number of preset blocks in the video frame. The deciding sub module 1104 is configured to judge whether the second ratio is greater than a second preset ratio threshold. The fast forward frame determining sub module 1105 is configured to determine that the video frame requires the fast forward video effect processing if the second ratio is greater than the second preset ratio threshold.
- In some embodiments, the receiving unit 701 is further configured to receive the video effect processing instruction that is generated after the user performs a slide operation on an interface of the terminal device that is playing a video.
- If the slide operation includes a left slide operation indicating the slow motion video effect processing, the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset interpolation algorithm, a frame interpolation process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the slow motion video effect processing on the video.
- On the other hand, if the slide operation includes a right slide operation indicating the fast forward video effect processing, the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset extraction algorithm, a frame extraction process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the fast forward video effect processing on the video.
- In some embodiments, the receiving unit 701 is further configured to receive the video effect processing instruction that is generated after the user clicks a preset video effect processing button on the interface of the terminal device that is playing the video.
- If the preset video effect processing button is configured to trigger the slow motion video effect processing, the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset interpolation algorithm, a frame interpolation process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the slow motion video effect processing on the video.
- On the other hand, if the preset video effect processing button is configured to trigger the fast forward video effect processing, the video effect processing unit 702 is further configured to perform, according to the video effect processing instruction and a preset extraction algorithm, a frame extraction process on the video frames that start from a video frame corresponding to the instruction receiving time, to realize the fast forward video effect processing on the video.
- Operations of individual modules in the above-described exemplary devices are similar to the exemplary methods described above, and thus their details are omitted here.
-
FIG. 12 is a block diagram of aterminal device 1200 according to an exemplary embodiment. For example, theterminal device 1200 may be a smart phone, a computer, a tablet device, a PDA (Personal Digital Assistant), or the like. - Referring to
FIG. 12, the terminal device 1200 includes one or more of the following components: a processing component 1202, a memory 1204, a power component 1206, a multimedia component 1208, an audio component 1210, an input/output (I/O) interface 1212, a sensor component 1214, and a communication component 1216. - The
processing component 1202 typically controls overall operations of the terminal device 1200, such as the operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1202 may include one or more processors 1220 to execute instructions to perform all or part of a method consistent with the present disclosure, such as one of the above-described exemplary methods. Moreover, the processing component 1202 may include one or more modules which facilitate the interaction between the processing component 1202 and other components. For example, the processing component 1202 may include a multimedia module to facilitate the interaction between the multimedia component 1208 and the processing component 1202. - The
memory 1204 is configured to store various types of data to support the operation of the terminal device 1200. Examples of such data include instructions for any applications or methods operated on the terminal device 1200, contact data, phonebook data, messages, pictures, video, etc. The memory 1204 may be implemented using any type of volatile or non-volatile memory devices, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, or a magnetic or optical disk. - The
power component 1206 provides power to various components of the terminal device 1200. The power component 1206 may include a power management system, one or more power sources, and any other components associated with the generation, management, and distribution of power in the terminal device 1200. - The
multimedia component 1208 includes a screen providing an output interface between the terminal device 1200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel. If the screen includes the touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense a boundary of a touch or swipe action, but also sense a period of time and a pressure associated with the touch or swipe action. In some embodiments, the multimedia component 1208 includes a front camera and/or a rear camera. The front camera and the rear camera may receive an external multimedia datum while the terminal device 1200 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability. - The
audio component 1210 is configured to output and/or input audio signals. For example, the audio component 1210 includes a microphone configured to receive an external audio signal when the terminal device 1200 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may be further stored in the memory 1204 or transmitted via the communication component 1216. In some embodiments, the audio component 1210 further includes a speaker to output audio signals. - The I/
O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, such as a keyboard, a click wheel, buttons, and the like. The buttons may include, but are not limited to, a home button, a volume button, a starting button, and a locking button. - The
sensor component 1214 includes one or more sensors to provide status assessments of various aspects of the terminal device 1200. For instance, the sensor component 1214 may detect an open/closed status of the terminal device 1200, relative positioning of components, e.g., the display and the keypad, of the terminal device 1200, a change in position of the terminal device 1200 or a component of the terminal device 1200, a presence or absence of user contact with the terminal device 1200, an orientation or an acceleration/deceleration of the terminal device 1200, and a change in temperature of the terminal device 1200. The sensor component 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1214 may also include a light sensor, such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-coupled Device) image sensor, for use in imaging applications. In some embodiments, the sensor component 1214 may also include an accelerometer sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor. - The
communication component 1216 is configured to facilitate communication, wired or wireless, between the terminal device 1200 and other devices. The terminal device 1200 can access a wireless network based on a communication standard, such as WiFi (Wireless Fidelity), 3G, or 4G, or a combination thereof. In one exemplary embodiment, the communication component 1216 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1216 further includes a near field communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on a radio frequency identification (RFID) technology, an infrared data association (IrDA) technology, an ultra-wideband (UWB) technology, a Bluetooth technology, or other technologies. - In exemplary embodiments, the
terminal device 1200 may be implemented with one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components, for performing a method consistent with the present disclosure, such as one of the above-described exemplary methods. - In exemplary embodiments, there is also provided a non-transitory computer-readable storage medium including instructions, such as included in the
memory 1204, executable by the processor 1220 in the terminal device 1200, for performing a method consistent with the present disclosure, such as one of the above-described exemplary methods. For example, the non-transitory computer-readable storage medium may be a ROM, a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disc, an optical data storage device, or the like. - According to the present disclosure, a fast forward video effect processing or a slow motion video effect processing can be performed during playback of a video according to a video effect processing instruction. Thus, the video playback and the video effect processing can be performed simultaneously. As a result, the efficiency of the video effect processing is increased and a user's enjoyment of the video playback is improved.
- Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the present disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
- It will be appreciated that the present invention is not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from the scope thereof. It is intended that the scope of the present disclosure only be limited by the appended claims.
Claims (13)
1. A method for video effect processing, comprising:
receiving a video effect processing instruction associated with a video;
performing a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing; and
performing the video effect processing on the target video frames during playback of the video.
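The three steps recited in claim 1 can be sketched end to end as follows. This is a hedged illustrative sketch, not the claimed implementation: the helper name, thresholds, and the frame-duplicate/frame-drop stand-ins for the preset interpolation and extraction algorithms are all assumptions.

```python
def movement_detected(block_magnitudes, motion_threshold=4.0, ratio_threshold=0.5):
    # Stand-in movement amount detection: the fraction of preset blocks whose
    # motion vector magnitude exceeds a threshold, compared to a ratio threshold.
    fast_blocks = sum(1 for m in block_magnitudes if m > motion_threshold)
    return fast_blocks / len(block_magnitudes) > ratio_threshold

def process_video(frames, per_frame_blocks, effect):
    # Step 2: determine the target frames; step 3: apply the effect during
    # playback. Duplicating a frame (slow motion) and dropping a frame
    # (fast forward) are crude stand-ins for the preset algorithms.
    output = []
    for frame, blocks in zip(frames, per_frame_blocks):
        if movement_detected(blocks):
            if effect == "slow_motion":
                output += [frame, frame]  # slow down: repeat the target frame
            # effect == "fast_forward": drop the target frame entirely
        else:
            output.append(frame)
    return output

frames = ["f0", "f1", "f2"]
blocks = [[5.0, 5.0], [1.0, 1.0], [6.0, 6.0]]
process_video(frames, blocks, "slow_motion")   # ['f0', 'f0', 'f1', 'f2', 'f2']
process_video(frames, blocks, "fast_forward")  # ['f1']
```

Step 1 (receiving the instruction) is represented by the `effect` argument; a device would obtain it from the slide operation or button click described in the specification.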
2. The method according to claim 1, wherein receiving the video effect processing instruction includes:
receiving a video effect processing instruction generated by a user clicking a preset video effect processing button.
3. The method according to claim 2, wherein:
the video effect processing instruction includes a slow motion video effect processing instruction, and
performing the video effect processing on the target video frames includes:
performing a frame interpolation process on the target video frames according to a preset interpolation algorithm.
4. The method according to claim 3, wherein performing the movement amount detection to determine the target video frames includes, for each of the plurality of video frames:
acquiring motion vector magnitude values of all preset blocks in the video frame;
determining a number of fast blocks in the video frame, a fast block being a preset block that has a motion vector magnitude value greater than a motion vector threshold;
calculating a ratio between the number of fast blocks and a total number of preset blocks in the video frame;
deciding whether the ratio is greater than a preset ratio threshold; and
determining, if the ratio is greater than the preset ratio threshold, the video frame to be a target video frame.
5. The method according to claim 2, wherein:
the video effect processing instruction includes a fast forward video effect processing instruction, and
performing the video effect processing on the target video frames includes:
performing a frame extraction process on the target video frames according to a preset extraction algorithm.
6. The method according to claim 5, wherein performing the movement amount detection to determine the target video frames includes, for each of the plurality of video frames:
acquiring motion vector magnitude values of all preset blocks in the video frame;
determining a number of slow blocks in the video frame, a slow block being a preset block that has a motion vector magnitude value smaller than a motion vector threshold;
calculating a ratio between the number of slow blocks and a total number of preset blocks in the video frame;
deciding whether the ratio is greater than a preset ratio threshold; and
determining, if the ratio is greater than the preset ratio threshold, the video frame to be a target video frame.
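The per-frame decisions recited in claims 4 and 6 differ only in the direction of the comparison against the motion vector threshold. A hedged sketch (the threshold values in the examples are illustrative assumptions, not values from the disclosure):

```python
def is_target_frame(block_magnitudes, motion_threshold, ratio_threshold, mode):
    """Per-frame decision as recited in claim 4 (mode="fast") and claim 6
    (mode="slow"): count the blocks whose motion vector magnitude is greater
    (fast) or smaller (slow) than the motion vector threshold, then compare
    the ratio of that count to the total block count against the preset
    ratio threshold."""
    if mode == "fast":
        count = sum(1 for m in block_magnitudes if m > motion_threshold)
    else:  # mode == "slow"
        count = sum(1 for m in block_magnitudes if m < motion_threshold)
    return count / len(block_magnitudes) > ratio_threshold

# A frame where 3 of 4 blocks move fast (ratio 0.75 > 0.5) is a target for
# slow motion processing:
is_target_frame([5.0, 9.0, 2.0, 8.0], 4.0, 0.5, mode="fast")  # True
# A frame where exactly half the blocks are slow (ratio 0.5) does not exceed
# a 0.5 ratio threshold:
is_target_frame([1.0, 2.0, 9.0, 8.0], 4.0, 0.5, mode="slow")  # False
```

Note the claims use a strict "greater than" comparison for the ratio, so a ratio exactly equal to the threshold does not qualify, as the second example shows.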
7. A terminal device, comprising:
a processor; and
a memory storing instructions that, when executed by the processor, cause the processor to:
receive a video effect processing instruction associated with a video;
perform a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing; and
perform the video effect processing on the target video frames during playback of the video.
8. The terminal device according to claim 7, wherein the instructions further cause the processor to:
receive a video effect processing instruction generated by a user clicking a preset video effect processing button.
9. The terminal device according to claim 8, wherein:
the video effect processing instruction includes a slow motion video effect processing instruction, and
the instructions further cause the processor to:
perform a frame interpolation process on the target video frames according to a preset interpolation algorithm.
10. The terminal device according to claim 9, wherein the instructions further cause the processor to, for each of the plurality of video frames:
acquire motion vector magnitude values of all preset blocks in the video frame;
determine a number of fast blocks in the video frame, a fast block being a preset block that has a motion vector magnitude value greater than a motion vector threshold;
calculate a ratio between the number of fast blocks and a total number of preset blocks in the video frame;
decide whether the ratio is greater than a preset ratio threshold; and
determine, if the ratio is greater than the preset ratio threshold, the video frame to be a target video frame.
11. The terminal device according to claim 8, wherein:
the video effect processing instruction includes a fast forward video effect processing instruction, and
the instructions further cause the processor to:
perform a frame extraction process on the target video frames according to a preset extraction algorithm.
12. The terminal device according to claim 11, wherein the instructions further cause the processor to, for each of the plurality of video frames:
acquire motion vector magnitude values of all preset blocks in the video frame;
determine a number of slow blocks in the video frame, a slow block being a preset block that has a motion vector magnitude value smaller than a motion vector threshold;
calculate a ratio between the number of slow blocks and a total number of preset blocks in the video frame;
decide whether the ratio is greater than a preset ratio threshold; and
determine, if the ratio is greater than the preset ratio threshold, the video frame to be a target video frame.
13. A non-transitory computer-readable storage medium storing instructions that, when executed by a processor in a terminal, cause the terminal to:
receive a video effect processing instruction associated with a video;
perform a movement amount detection on each of a plurality of video frames of the video to determine target video frames that require the video effect processing; and
perform the video effect processing on the target video frames during playback of the video.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510541656.9A CN105120337A (en) | 2015-08-28 | 2015-08-28 | Video special effect processing method, video special effect processing device and terminal equipment |
| CN201510541656 | 2015-08-28 | ||
| CN201510541656.9 | 2015-08-28 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20170064245A1 true US20170064245A1 (en) | 2017-03-02 |
| US10212386B2 US10212386B2 (en) | 2019-02-19 |
Family
ID=54668183
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/158,276 Active US10212386B2 (en) | 2015-08-28 | 2016-05-18 | Method, device, terminal device, and storage medium for video effect processing |
Country Status (8)
| Country | Link |
|---|---|
| US (1) | US10212386B2 (en) |
| EP (1) | EP3136391B1 (en) |
| JP (1) | JP6321301B2 (en) |
| KR (1) | KR101756044B1 (en) |
| CN (1) | CN105120337A (en) |
| MX (1) | MX369882B (en) |
| RU (1) | RU2640735C2 (en) |
| WO (1) | WO2017036038A1 (en) |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105120337A (en) * | 2015-08-28 | 2015-12-02 | 小米科技有限责任公司 | Video special effect processing method, video special effect processing device and terminal equipment |
| CN107707930B (en) * | 2016-08-09 | 2021-01-15 | 北京奇虎科技有限公司 | Video processing method, device and system |
| CN109068052B (en) * | 2018-07-24 | 2020-11-10 | 努比亚技术有限公司 | Video shooting method, mobile terminal and computer readable storage medium |
| WO2020019212A1 (en) * | 2018-07-25 | 2020-01-30 | 深圳市大疆创新科技有限公司 | Video playback speed control method and system, control terminal, and mobile platform |
| CN109040615A (en) * | 2018-08-10 | 2018-12-18 | 北京微播视界科技有限公司 | Special video effect adding method, device, terminal device and computer storage medium |
| CN111385639B (en) * | 2018-12-28 | 2021-02-12 | 广州市百果园信息技术有限公司 | Video special effect adding method, device, equipment and storage medium |
| CN111260760B (en) | 2020-01-10 | 2023-06-20 | 腾讯科技(深圳)有限公司 | Image processing method, device, electronic equipment and storage medium |
| CN111586321B (en) * | 2020-05-08 | 2023-05-12 | Oppo广东移动通信有限公司 | Video generation method, device, electronic equipment and computer readable storage medium |
| CN111641829B (en) * | 2020-05-16 | 2022-07-22 | Oppo广东移动通信有限公司 | Video processing method, device and system, storage medium and electronic equipment |
| CN114581566A (en) * | 2022-03-10 | 2022-06-03 | 北京字跳网络技术有限公司 | A method, device, device and medium for generating animation special effects |
| CN115119040B (en) * | 2022-07-19 | 2024-01-30 | 北京字跳网络技术有限公司 | Video processing method, device, electronic equipment and storage medium |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20040227817A1 (en) * | 2003-02-14 | 2004-11-18 | Takashi Oya | Motion detecting system, motion detecting method, motion detecting apparatus, and program for implementing the method |
| US20050155072A1 (en) * | 2003-10-07 | 2005-07-14 | Ucentric Holdings, Inc. | Digital video recording and playback system with quality of service playback from multiple locations via a home area network |
| US20100092151A1 (en) * | 2007-02-01 | 2010-04-15 | Sony Corporation | Image reproducing apparatus, image reproducing method, image capturing apparatus, and control method therefor |
| US20110176740A1 (en) * | 2009-07-29 | 2011-07-21 | Panasonic Corporation | Image coding method, image coding apparatus, program, and integrated circuit |
Family Cites Families (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3319678B2 (en) | 1995-08-11 | 2002-09-03 | ケイディーディーアイ株式会社 | High-speed video browsing system |
| KR100303727B1 (en) * | 1998-04-28 | 2001-09-24 | 구자홍 | Adaptive display speed control system |
| US6424789B1 (en) | 1999-08-17 | 2002-07-23 | Koninklijke Philips Electronics N.V. | System and method for performing fast forward and slow motion speed changes in a video stream based on video content |
| EP1331816B1 (en) | 1999-11-10 | 2004-09-22 | Thomson Licensing S.A. | Method for editing source video to fast motion on the recordable media |
| JP2001238182A (en) * | 2000-02-23 | 2001-08-31 | Sanyo Electric Co Ltd | Image reproduction device and image reproduction method |
| US20020051081A1 (en) * | 2000-06-30 | 2002-05-02 | Osamu Hori | Special reproduction control information describing method, special reproduction control information creating apparatus and method therefor, and video reproduction apparatus and method therefor |
| US7003154B1 (en) | 2000-11-17 | 2006-02-21 | Mitsubishi Electric Research Laboratories, Inc. | Adaptively processing a video based on content characteristics of frames in a video |
| KR100551952B1 (en) | 2002-10-18 | 2006-02-20 | 주식회사 모티스 | Motion Detection Method in Image Compression |
| JP4208595B2 (en) * | 2003-02-14 | 2009-01-14 | キヤノン株式会社 | Image change detection system |
| JP3800207B2 (en) * | 2003-07-18 | 2006-07-26 | ソニー株式会社 | Imaging device |
| KR100552077B1 (en) | 2003-08-29 | 2006-02-20 | 바로비젼(주) | Content providing system and mobile terminal for it |
| US7177532B2 (en) * | 2003-12-12 | 2007-02-13 | Numark Industries, Llc | Dual video player for disc jockeys |
| JP4727342B2 (en) * | 2004-09-15 | 2011-07-20 | ソニー株式会社 | Image processing apparatus, image processing method, image processing program, and program storage medium |
| CN101075949A (en) * | 2006-05-15 | 2007-11-21 | 中兴通讯股份有限公司 | Method for changing fluid-medium file broadcasting speed |
| JP4181598B2 (en) * | 2006-12-22 | 2008-11-19 | シャープ株式会社 | Image display apparatus and method, image processing apparatus and method |
| US8136133B2 (en) * | 2007-11-13 | 2012-03-13 | Walker Digital, Llc | Methods and systems for broadcasting modified live media |
| CN101448092A (en) * | 2007-11-28 | 2009-06-03 | 新奥特(北京)视频技术有限公司 | Method for realizing video forwarding and slowing |
| KR101486254B1 (en) * | 2008-10-10 | 2015-01-28 | 삼성전자주식회사 | Method for setting frame rate conversion and display apparatus applying the same |
| CN101600107B (en) * | 2009-07-08 | 2012-01-25 | 杭州华三通信技术有限公司 | Method for adjusting play speed of videotape as well as system and device |
| CN102065320B (en) * | 2009-11-12 | 2012-12-19 | 三星电子(中国)研发中心 | Method and equipment for processing trick playing command related to transport stream (TS) code stream |
| CN101815199B (en) * | 2010-04-07 | 2013-08-07 | 中兴通讯股份有限公司 | Video processing method and terminal |
| EP2429192A1 (en) * | 2010-08-17 | 2012-03-14 | Streamworks International S.A. | Video signal processing |
| WO2013113985A1 (en) | 2012-01-31 | 2013-08-08 | Nokia Corporation | Method, apparatus and computer program product for generation of motion images |
| KR101366150B1 (en) * | 2013-06-21 | 2014-02-25 | (주)티비스톰 | Moving picture playing controlling user interfacing method and computer readable record-medium on which program for excuting method therof |
| US20150221335A1 (en) | 2014-02-05 | 2015-08-06 | Here Global B.V. | Retiming in a Video Sequence |
| CN104080006B (en) * | 2014-07-10 | 2017-10-27 | 福州瑞芯微电子股份有限公司 | A kind of video process apparatus and method |
| CN105120337A (en) * | 2015-08-28 | 2015-12-02 | 小米科技有限责任公司 | Video special effect processing method, video special effect processing device and terminal equipment |
-
2015
- 2015-08-28 CN CN201510541656.9A patent/CN105120337A/en active Pending
- 2015-12-30 JP JP2017537008A patent/JP6321301B2/en active Active
- 2015-12-30 WO PCT/CN2015/099729 patent/WO2017036038A1/en not_active Ceased
- 2015-12-30 RU RU2016114551A patent/RU2640735C2/en active
- 2015-12-30 KR KR1020167006989A patent/KR101756044B1/en active Active
- 2015-12-30 MX MX2016003307A patent/MX369882B/en active IP Right Grant
-
2016
- 2016-05-18 US US15/158,276 patent/US10212386B2/en active Active
- 2016-06-17 EP EP16175108.6A patent/EP3136391B1/en active Active
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110163043A (en) * | 2018-05-18 | 2019-08-23 | 腾讯科技(深圳)有限公司 | Type of face detection method, device, storage medium and electronic device |
| CN109862380A (en) * | 2019-01-10 | 2019-06-07 | 北京达佳互联信息技术有限公司 | Video data handling procedure, device and server, electronic equipment and storage medium |
| US11297232B2 (en) | 2019-01-25 | 2022-04-05 | Samsung Electronics Co., Ltd. | Apparatus and method for producing slow motion video |
| US11917291B2 (en) | 2019-01-25 | 2024-02-27 | Samsung Electronics Co., Ltd. | Apparatus and method for producing slow motion video |
| US12342073B2 (en) | 2019-01-25 | 2025-06-24 | Samsung Electronics Co., Ltd | Apparatus and method for producing slow motion video |
| US11955144B2 (en) * | 2020-12-29 | 2024-04-09 | Snap Inc. | Video creation and editing and associated user interface |
Also Published As
| Publication number | Publication date |
|---|---|
| CN105120337A (en) | 2015-12-02 |
| JP2017536783A (en) | 2017-12-07 |
| JP6321301B2 (en) | 2018-05-09 |
| US10212386B2 (en) | 2019-02-19 |
| KR20170036654A (en) | 2017-04-03 |
| EP3136391B1 (en) | 2019-10-09 |
| MX2016003307A (en) | 2017-12-07 |
| RU2016114551A (en) | 2017-10-19 |
| MX369882B (en) | 2019-11-25 |
| EP3136391A1 (en) | 2017-03-01 |
| WO2017036038A1 (en) | 2017-03-09 |
| KR101756044B1 (en) | 2017-07-07 |
| RU2640735C2 (en) | 2018-01-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10212386B2 (en) | Method, device, terminal device, and storage medium for video effect processing | |
| CN109600659B (en) | Operation method, device, device and storage medium when playing video | |
| KR101772177B1 (en) | Method and apparatus for obtaining photograph | |
| EP3786778A1 (en) | Method and device for screen projection, terminal and storage medium | |
| US20170344192A1 (en) | Method and device for playing live videos | |
| US20170304735A1 (en) | Method and Apparatus for Performing Live Broadcast on Game | |
| US11545188B2 (en) | Video processing method, video playing method, devices and storage medium | |
| CN106559712B (en) | Video playback processing method, device and terminal device | |
| US20170178289A1 (en) | Method, device and computer-readable storage medium for video display | |
| US9959484B2 (en) | Method and apparatus for generating image filter | |
| EP2998960B1 (en) | Method and device for video browsing | |
| CN105979383A (en) | Image acquisition method and device | |
| CN105631803B (en) | The method and apparatus of filter processing | |
| RU2666626C1 (en) | Playback state controlling method and device | |
| KR20180037235A (en) | Information processing method and apparatus | |
| US9799376B2 (en) | Method and device for video browsing based on keyframe | |
| US20220256230A1 (en) | Method and apparatus for video playing | |
| US20220222831A1 (en) | Method for processing images and electronic device therefor | |
| US11600300B2 (en) | Method and device for generating dynamic image | |
| CN108769769B (en) | Video playback method, device and computer-readable storage medium | |
| CN112445348A (en) | Expression processing method, device and medium | |
| CN105791924B (en) | Acquisition method, capture device and the electronic device of video and/or audio | |
| CN110809184A (en) | Video processing method, device and storage medium | |
| CN112784107A (en) | Method, device and storage medium for extracting picture from video | |
| CN119031179A (en) | Video processing method, device, electronic device and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: XIAOMI INC., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, JIE;WU, XIAOYONG;WANG, WEI;REEL/FRAME:038643/0005 Effective date: 20160406 |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |