
HK1130099A - Method and system for online remixing of digital multimedia - Google Patents


Info

Publication number
HK1130099A
Authority
HK
Hong Kong
Prior art keywords
media asset
resolution
resolution media
low
asset
Prior art date
Application number
HK09108061.8A
Other languages
Chinese (zh)
Inventor
Michael George Folgner
Ryan Brice Cunningham
Original Assignee
Yahoo Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yahoo Inc.
Publication of HK1130099A publication Critical patent/HK1130099A/en

Description

Method and system for online remixing of digital multimedia
Copyright notice
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
RELATED APPLICATIONS
This application claims priority to U.S. Provisional Application No. 60/758,664, filed January 13, 2006, and to U.S. Provisional Application No. 60/790,569, filed April 10, 2006, both of which are incorporated herein by reference.
Background
In the current Internet, there are many different types of media assets in the form of digital files. Digital files may contain data representing one or more types of content, including but not limited to audio, images, and video. For example, media assets include a variety of file formats, such as MPEG-1 Audio Layer 3 ("MP3") for audio, Joint Photographic Experts Group ("JPEG") for images, Moving Picture Experts Group ("MPEG-2" and "MPEG-4") for video, Adobe Flash for animation, and executable files.
Such media assets are currently created and edited using applications running locally on a dedicated computer. For example, in the case of digital video, popular applications for creating and editing media assets include Apple's iMovie and Final Cut Pro, and Microsoft's Movie Maker. After the media asset is created and edited, one or more files may be sent to a computer (e.g., a server) located on a distributed network such as the Internet. The server may host these files for viewing by different users. Examples of companies running such servers are YouTube (http://youtube.com) and Google Video (http://video.google.com).
Currently, users must create and/or edit media assets on their client computers before sending them to the server. Many users are therefore unable to edit a media asset from another client, for example, in the event that the user's client computer does not contain the appropriate media asset or application for editing. Furthermore, editing applications are typically designed for professional or high-end customer markets. Such applications do not address the needs of general customers who lack specialized computers with comparable processing power and/or storage capacity.
In addition, typical clients often do not have the transmission bandwidth necessary to send, share, or access media assets that are widely dispersed across the network. Many media assets are increasingly stored on computers connected to the Internet. For example, a provider such as Getty Images sells media assets (e.g., images) stored on computers connected to the Internet. Thus, when a user requests a media asset for manipulation or editing, the asset is typically transported in its entirety over a network. Especially in the case of digital video, such delivery may consume a large amount of processing and transmission resources.
Disclosure of Invention
Based on this background, systems and methods have been developed for manipulating media assets in a networked computing environment where processing power, bandwidth, and/or storage capacity may be limited. More specifically, systems and methods have been developed by which low-resolution media assets optimized for transmission over low-bandwidth networks and for editing and manipulation in environments with low processing power and low storage capacity can be created and high-resolution media assets can be created for playback.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method for editing a low-resolution media asset to produce a high-resolution edited media asset. The method includes receiving a request from a requestor to edit a first high-resolution media asset. The method also includes sending a low-resolution media asset to the requestor, the low-resolution media asset being based on the first high-resolution media asset. The method also includes receiving editing instructions associated with the low-resolution media asset from a requestor. The method also includes generating a second high-resolution media asset based on the first high-resolution media asset and editing instructions associated with the low-resolution media asset.
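The patent does not provide an implementation, but the four steps of this method can be sketched as follows, modeling an asset as a list of frames and a trim as the only edit operation (all names here are illustrative assumptions, not the patent's own):

```python
# Hypothetical sketch of the four-step method; all names are illustrative.
# A "media asset" is modeled as a list of frames; an edit instruction is a
# (start, end) trim expressed against the low-resolution proxy.

def downscale(frames, factor=2):
    """Produce a low-resolution proxy by keeping every Nth frame."""
    return frames[::factor]

def apply_edits(frames, trim):
    """Apply a trim, recorded against the proxy, to a frame list."""
    start, end = trim
    return frames[start:end]

def edit_high_res(high_res, choose_trim, factor=2):
    """Steps 2-4: send a proxy, collect edits, re-apply at full resolution."""
    proxy = downscale(high_res, factor)      # step 2: low-res asset sent to requestor
    p_start, p_end = choose_trim(proxy)      # step 3: edit instructions from requestor
    # Scale proxy-relative frame indices back onto the high-resolution timeline.
    return apply_edits(high_res, (p_start * factor, p_end * factor))  # step 4

# A requestor that trims the proxy to its middle half:
master = list(range(100))                    # a 100-frame high-resolution asset
result = edit_high_res(master, lambda p: (len(p) // 4, 3 * len(p) // 4))
```

The point of the sketch is that only the small proxy and the even smaller edit instructions cross the network; the full-resolution frames never leave the server.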
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer readable medium encoding or containing computer executable instructions for performing a method of editing a low-resolution media asset to produce a high-resolution edited media asset. The computer-readable medium includes instructions for receiving a request from a requestor to edit a first high-resolution media asset. The computer-readable medium further includes instructions for transmitting a low-resolution media asset to the requestor, the low-resolution media asset being based on the first high-resolution media asset. The computer-readable medium includes instructions for receiving editing instructions associated with the low-resolution media asset from a requestor. The computer-readable medium further includes instructions for generating a second high-resolution media asset based on the first high-resolution media asset and the edit instructions associated with the low-resolution media asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a system. The system includes a high-resolution media asset library. The system also includes a low-resolution media asset generator that generates a low-resolution media asset from a high-resolution media asset contained in a high-resolution media asset library. The system includes a high-resolution media asset editor that applies edits to the high-resolution media asset based on edits made to an associated low-resolution media asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method. The method includes receiving a request to generate a video asset that identifies a start frame and an end frame in a keyframe master asset. The method also includes generating a first portion of the video asset containing one or more keyframes associated with the starting frame, the keyframes obtained from a keyframe master asset. The method includes generating a second portion of the video asset, the second portion containing a set of keyframes and optimized frames, the optimized frames obtained from an optimized master asset associated with a keyframe master asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method. The method includes receiving a request to generate a video asset that identifies a start frame and an end frame in a master asset. The method also includes generating a first portion of the video asset containing one or more keyframes associated with the starting frame, the keyframes obtained from a keyframe master asset corresponding to the master asset. The method includes generating a second portion of the video asset, the second portion containing a set of keyframes and optimized frames, the optimized frames obtained from an optimized master asset corresponding to the master asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method. The method includes receiving a request to generate a video asset that identifies a start frame and an end frame in an optimized master asset. The method also includes generating a keyframe master asset based on the optimized master asset, the keyframe master asset including one or more keyframes corresponding to the starting frame. The method includes generating a first portion of the video asset that includes at least an identified start frame in the optimized master asset. The method also includes generating a second portion of the video asset, the second portion including a set of keyframes and optimized frames, the optimized frames obtained from the optimized master asset.
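A minimal sketch of the assembly strategy shared by the three methods above, assuming an asset is a list of frames and the optimized master places keyframes at a fixed interval (both are illustrative assumptions; the patent does not fix a format or interval):

```python
# Illustrative sketch of assembling a clip from two master versions of the
# same video: a full-keyframe master (every frame self-contained) and an
# optimized master (keyframes only at fixed intervals, delta frames between).

KEYFRAME_INTERVAL = 10  # assumed keyframe spacing of the optimized master

def generate_video_asset(start, end, keyframe_master, optimized_master):
    """Build a clip that can begin playback on an arbitrary frame.

    The first portion borrows self-contained keyframes from the keyframe
    master until the optimized master's next keyframe boundary; from there
    the second portion reuses the optimized master's smaller frames.
    """
    # First keyframe boundary at or after the requested start frame.
    boundary = -(-start // KEYFRAME_INTERVAL) * KEYFRAME_INTERVAL
    first_portion = keyframe_master[start:min(boundary, end)]
    second_portion = optimized_master[min(boundary, end):end]
    return first_portion + second_portion
```

A start frame that already falls on a keyframe boundary yields an empty first portion, so the whole clip comes from the bandwidth-efficient optimized master.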
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium encoding or containing computer-executable instructions for performing a method. The computer-readable medium includes instructions for receiving a request to generate a video asset that identifies a start frame and an end frame in a keyframe master asset. The computer-readable medium further includes instructions for generating a first portion of the video asset containing one or more keyframes associated with the starting frame, the keyframes obtained from a keyframe master asset. The computer-readable medium includes instructions for generating a second portion of the video asset, the second portion containing a set of keyframes and optimized frames, the optimized frames obtained from an optimized master asset associated with a keyframe master asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium encoding or containing computer-executable instructions for performing a method. The computer-readable medium includes instructions for receiving a request to generate a video asset that identifies a start frame and an end frame in a master asset. The computer-readable medium further includes instructions for generating a first portion of the video asset containing one or more keyframes associated with the starting frame, the keyframes obtained from a keyframe master asset corresponding to the master asset. The computer-readable medium includes instructions for generating a second portion of the video asset, the second portion containing a set of keyframes and optimized frames, the optimized frames obtained from an optimized master asset corresponding to the master asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium encoding or containing computer-executable instructions for performing a method. The computer-readable medium includes instructions for receiving a request to generate a video asset that identifies a start frame and an end frame in an optimized master asset. The computer-readable medium further includes instructions for generating a keyframe master asset based on the optimized master asset, the keyframe master asset including one or more keyframes corresponding to the starting frame. The computer-readable medium includes instructions for generating a first portion of a video asset that includes at least an identified start frame in an optimized master asset. The computer-readable medium also includes instructions for generating a second portion of the video asset, the second portion including a set of keyframes and optimized frames, the optimized frames obtained from the optimized master asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a system. The system includes a master asset library storing at least one high-resolution master asset. The system also includes a specification applicator that stores at least one edit specification for applying edits to at least one high-resolution master asset. The system includes a master asset editor that applies the at least one edit specification to the at least one high-resolution master asset. The system also includes an edit asset generator that generates a low-resolution asset corresponding to the high-resolution master asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method. The method includes editing a low-resolution media asset, the low-resolution media asset corresponding to a master high-resolution media asset. The method also includes generating an edit specification based on the edits to the low-resolution media asset. The method includes applying an edit specification to a master high-resolution media asset to produce an edited high-resolution media asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium having a data structure stored thereon. The computer readable medium includes a first data field that includes data identifying a high-resolution media asset. The computer-readable medium also includes a second data field including data describing one or more edits made to a low-resolution media asset associated with the high-resolution media asset.
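A hypothetical shape for this two-field data structure, with illustrative field and operation names (the patent specifies only the two fields, not their encoding):

```python
# Hypothetical edit-specification record: a reference to the high-resolution
# asset plus the edits recorded against its low-resolution proxy.
from dataclasses import dataclass, field

@dataclass
class EditSpecification:
    asset_id: str                               # first field: identifies the high-res asset
    edits: list = field(default_factory=list)   # second field: edits made to the proxy

    def add_edit(self, operation, **params):
        self.edits.append({"op": operation, **params})

spec = EditSpecification(asset_id="master-0042")
spec.add_edit("trim", start_frame=24, end_frame=74)
spec.add_edit("overlay_text", text="Remix", at_frame=30)
```

A record like this is typically a few hundred bytes, versus megabytes or gigabytes for the asset it describes, which is what makes shipping it across the network cheap.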
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method for identifying edit information for a media asset. The method includes editing a low-resolution media asset containing at least a first portion corresponding to a first high-resolution master media asset and a second portion corresponding to a second high-resolution master media asset. The method also includes receiving a request to generate a high-resolution edited media asset, the request identifying the first high-resolution master media asset and the second high-resolution master media asset. The method includes generating the high-resolution edited media asset. The method also includes associating edit information with the high-resolution edited media asset, the edit information identifying the first high-resolution master media asset and the second high-resolution master media asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium encoding or containing computer-executable instructions for performing a method for identifying edit information for a media asset. The method includes editing a low-resolution media asset comprising at least a first portion corresponding to a first high-resolution master media asset and a second portion corresponding to a second high-resolution master media asset. The method also includes receiving a request to generate a high-resolution edited media asset, the request identifying the first high-resolution master media asset and the second high-resolution master media asset. The method includes generating the high-resolution edited media asset. The method also includes associating edit information with the high-resolution edited media asset, the edit information identifying the first high-resolution master media asset and the second high-resolution master media asset.
In one example, which is intended to be illustrative and not limiting, the present invention can be considered a method for rendering (rendering) a media asset. The method includes receiving a command to render an aggregate media asset defined by an edit specification, the edit specification identifying at least a first media asset associated with at least one edit instruction. The method also includes retrieving the edit specification. The method includes retrieving a first media asset. The method also includes rendering, on a media asset rendering device, a first media asset of the aggregate media asset in accordance with the at least one edit instruction.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium encoding or containing computer-executable instructions for performing a method for rendering a media asset. The method includes receiving a command to render an aggregate media asset defined by an edit specification, the edit specification identifying at least a first media asset associated with at least one edit instruction. The method also includes retrieving the edit specification. The method includes retrieving a first media asset. The method also includes rendering, on a media asset rendering device, the first media asset of the aggregate media asset in accordance with the at least one edit instruction.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a method for editing an aggregate media asset. The method includes receiving a stream corresponding to an aggregate media asset from a remote computing device in a playback session, the aggregate media asset comprising at least one component media asset. The method also includes rendering the aggregate media asset on an image rendering device. The method includes receiving a user command to edit an edit specification associated with the aggregate media asset. The method also includes initiating an edit session for editing an edit specification associated with the aggregate media asset.
In one example (which example is intended to be illustrative and not restrictive), the present invention may be considered a computer-readable medium encoding or containing computer-executable instructions for performing a method for editing an aggregate media asset. The method includes receiving a stream corresponding to an aggregate media asset from a remote computing device in a playback session, the aggregate media asset comprising at least one component media asset. The method also includes rendering the aggregate media asset on an image rendering device. The method includes receiving a user command to edit an edit specification associated with the aggregate media asset. The method also includes initiating an edit session for editing an edit specification associated with the aggregate media asset.
In one example, which example is intended to be illustrative and not restrictive, the present invention may be considered a method for storing an aggregate media asset. The method includes storing a plurality of component media assets. The method also includes storing a first aggregate edit specification including at least one command for rendering the plurality of component media assets to produce a first aggregate media asset.
These and various other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. Additional features will be set forth in part in the description which follows, or may be learned by practice of the embodiments. The advantages and features will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, are included to illustrate the following example systems and methods and are not intended to limit in any way the scope of the invention, which is defined by the appended claims.
FIG. 1 illustrates an embodiment of a system for manipulating media assets in a networked computing environment.
FIG. 2 illustrates an embodiment of a system for manipulating media assets in a networked computing environment.
Fig. 3 illustrates an embodiment of a method for editing a low-resolution media asset to produce a high-resolution edited media asset.
FIG. 4 illustrates an embodiment of a method for generating a media asset.
FIG. 5 illustrates an embodiment of a method for generating a media asset.
FIG. 6 illustrates an embodiment of a method for generating a media asset.
FIG. 7 illustrates an embodiment of a method for recording edits to media content.
FIG. 8 illustrates an embodiment of a method for identifying edit information for a media asset.
FIG. 9 illustrates an embodiment of a method for rendering a media asset.
Fig. 10 illustrates an embodiment of a method for storing an aggregate media asset.
FIG. 11 illustrates an embodiment of a method for editing an aggregate media asset.
Detailed Description
Fig. 1 illustrates an embodiment of a system 100 for generating media assets. In one embodiment, system 100 includes a master asset library 102. In one embodiment, master asset library 102 may be a logical grouping of data including, but not limited to, high-resolution and low-resolution media assets. In another embodiment, master asset library 102 may be a physical grouping of data including, but not limited to, high-resolution and low-resolution media assets. In one embodiment, master asset library 102 may comprise one or more databases and reside on one or more servers. In one embodiment, master asset library 102 may include multiple libraries, including public, private, and shared libraries. In one embodiment, the master asset library 102 may be organized as a searchable library. In another embodiment, one or more servers comprising master asset library 102 may include connections to one or more storage devices for storing digital files.
For purposes of this disclosure, the term "file," as used in the specification, appended claims, and drawings associated with this disclosure, generally refers to a collection of information that is stored as a unit and that may be retrieved, modified, stored, deleted, or transmitted. Storage devices may include, but are not limited to, volatile memory (e.g., RAM, DRAM), non-volatile memory (e.g., ROM, EPROM, flash memory), and devices such as hard disk drives and optical drives. A storage device may store information redundantly. Storage devices may also be connected in parallel, serially, or in some other connection configuration. As set forth in this embodiment, one or more assets may reside in the master asset library 102.
For purposes of this disclosure, as used in the specification, appended claims, and drawings associated with this disclosure, an "asset" refers to a logical collection of content that can be included in one or more files. For example, an asset may comprise a single file (e.g., an MPEG video file) that contains image (e.g., still frames of video), audio, and video information. As another example, an asset may also include a collection of files (e.g., JPEG image files) that may be collectively used to render an animation or video. As another example, an asset may also include an executable file (e.g., an executable vector graphics file, such as an SWF file or a FLA file). Master asset library 102 may include many types of assets including, but not limited to, videos, images, animations, text, executables, and audio. In one embodiment, master asset library 102 may include one or more high-resolution master assets. In other parts of this disclosure, the "master asset" will be described as a digital file containing video content. However, those skilled in the art will recognize that a master asset is not limited to containing video information; as previously described, a master asset may contain various types of information including, but not limited to, images, audio, text, executables, and/or animations.
In one embodiment, the media assets may be stored in the master asset library 102 in a manner that preserves the quality of the media assets. For example, in the case of a media asset comprising video information, two important aspects of video quality are spatial resolution and temporal resolution. Spatial resolution generally describes the degree of sharpness of the displayed image, while temporal resolution generally describes the degree of smoothness of the motion. Motion video, such as a movie, includes a certain number of frames per second to represent motion in a scene. In general, the first step in digitizing video is to divide each frame into a large number of discrete picture elements, or pixels. The larger the number of pixels, the higher the spatial resolution. Similarly, the more frames per second, the higher the temporal resolution.
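The two quantities multiply, which is why proxies matter for bandwidth. For example, the raw (uncompressed) data rate of a video stream is the product of pixel count, color depth, and frame rate, as this worked example shows (the resolutions chosen are illustrative, not values from the patent):

```python
# Raw (uncompressed) data rate = width * height * bytes-per-pixel * frames/sec.
def raw_data_rate(width, height, fps, bytes_per_pixel=3):
    """Bytes per second of uncompressed video, assuming 24-bit color."""
    return width * height * bytes_per_pixel * fps

# Halving spatial resolution in each dimension and halving the frame rate
# cuts the raw data rate by a factor of eight, which is the kind of saving
# a low-resolution editing proxy provides over its master.
low = raw_data_rate(320, 240, 15)    # a plausible editing proxy
high = raw_data_rate(640, 480, 30)   # a plausible corresponding master
```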
In one embodiment, the media assets may be stored in the master asset library 102 as master assets that are not directly manipulated. For example, a media asset may be saved in its original form in the master asset library 102, while still being used to create copies or derivative media assets (e.g., low-resolution assets). In one embodiment, media assets may also be stored in the master asset library 102 along with corresponding or associated assets. In one embodiment, a media asset stored in the master asset library 102 may be stored as multiple versions of the same media asset. For example, the multiple versions of a media asset stored in the master asset library 102 may include: an all-keyframe version that does not use interframe similarity for compression purposes; and an optimized version that exploits interframe similarity. In one embodiment, the original media asset may represent the all-keyframe version. In another embodiment, the original media asset may initially be in the form of, or stored as, an optimized version. Those skilled in the art will recognize that media assets may take many forms within the master asset library 102 that are within the scope of the present disclosure.
In one embodiment, system 100 also includes an edit asset generator 104. In one embodiment, the edit asset generator 104 may include transcoding hardware and/or software that, among other things, may convert media assets from one format to another. For example, a transcoder may be used to convert MPEG files into QuickTime files. As another example, a transcoder may be used to convert JPEG files into bitmap (e.g., .BMP) files. As yet another example, a transcoder may be used to standardize media asset formats to the Flash video (.FLV) format. In one embodiment, the transcoder may create more than one version of the original media asset. For example, when an original media asset is received, the transcoder may convert the original media asset into a high-resolution version or a low-resolution version. As another example, a transcoder may convert an original media asset into one or more files. In one embodiment, the transcoder may reside on a remote computing device. In another embodiment, the transcoder may reside on one or more connected computers. In one embodiment, the edit asset generator 104 may also include hardware and/or software for transferring and/or uploading media assets to one or more computers. In another embodiment, the edit asset generator 104 may include or be connected to hardware and/or software for capturing media assets from an external source (e.g., a digital camera).
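The patent does not name a transcoding tool, but a wrapper of the kind described here might, for example, drive FFmpeg to standardize an upload to FLV while also emitting a low-resolution proxy. The file names, scale, and bitrate below are illustrative assumptions:

```python
# Sketch of a transcoder wrapper that produces both a standardized FLV
# master and a low-resolution editing proxy from one upload, using FFmpeg
# (an assumption; the patent does not specify a tool).
import subprocess

def transcode_commands(source, asset_id):
    """Build FFmpeg invocations for a high-res master and a low-res proxy."""
    master = ["ffmpeg", "-i", source, f"{asset_id}_master.flv"]
    proxy = ["ffmpeg", "-i", source,
             "-vf", "scale=320:-2",        # downscale to 320px wide
             "-b:v", "300k",               # low bitrate for cheap transmission
             f"{asset_id}_proxy.flv"]
    return [master, proxy]

def run_transcodes(source, asset_id):
    """Execute both conversions (requires ffmpeg on PATH)."""
    for cmd in transcode_commands(source, asset_id):
        subprocess.run(cmd, check=True)
```

Separating command construction from execution keeps the conversion plan testable without actually invoking the external tool.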
In one embodiment, the edit asset generator 104 may generate a low-resolution version of a high-resolution media asset stored in the master asset library 102. In another embodiment, the edit asset generator 104 may transmit a low-resolution version of a media asset stored in the master asset library 102 to a remote computing device by, for example, converting the media asset in real time and transmitting it as a stream. In another embodiment, the edit asset generator 104 may generate a low-quality version of another media asset (e.g., a master asset) such that the low-quality version remains small while still providing sufficient data to enable a user to edit it.
In one embodiment, system 100 may also include a specification applicator 106. In one embodiment, the specification applicator 106 may include one or more files or edit specifications containing data for editing and modifying media assets (e.g., high-resolution media assets). In one embodiment, the specification applicator 106 may include one or more edit specifications that include modification instructions for a high-resolution media asset based on edits made to a corresponding or associated low-resolution media asset. In one embodiment, specification applicator 106 may store one or more edit specifications in one or more libraries.
In one embodiment, the system 100 also includes a master asset editor 108, and the master asset editor 108 may apply one or more edit specifications to a media asset. For example, the master asset editor 108 may apply an edit specification stored in the specification applicator 106 to a first high-resolution media asset to create another high-resolution media asset, e.g., a second high-resolution media asset. In one embodiment, the master asset editor 108 may apply an edit specification to a media asset in real time. For example, the master asset editor 108 may modify a media asset as it is sent to another location. In another embodiment, the master asset editor 108 may apply an edit specification to a media asset in non-real time. For example, the master asset editor 108 may apply an edit specification to a media asset as part of a scheduled process. In one embodiment, the master asset editor 108 may be used to minimize the need to transfer large media assets over a network. For example, by storing edits in an edit specification, the master asset editor 108 may transmit smaller data files across a network, enabling a remote computing device to direct the manipulation of high-quality assets stored on one or more local computers (e.g., computers comprising the master asset library).
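One way to picture the master asset editor is as an interpreter that replays an edit specification, instruction by instruction, against the master asset. The operation names and the frame-list asset model in this sketch are assumptions for illustration:

```python
# Illustrative sketch of a master asset editor replaying an edit
# specification (recorded against a low-res proxy) on the high-res master.
# The operation vocabulary here is hypothetical, not from the patent.

def apply_specification(master_frames, edit_spec):
    """Replay each edit instruction, in order, against the master asset."""
    frames = list(master_frames)
    for edit in edit_spec:
        if edit["op"] == "trim":
            frames = frames[edit["start"]:edit["end"]]
        elif edit["op"] == "reverse":
            frames = frames[::-1]
        elif edit["op"] == "repeat":
            frames = frames * edit["times"]
    return frames

# Replaying a trim-then-reverse specification on a 10-frame master:
edited = apply_specification(range(10), [
    {"op": "trim", "start": 2, "end": 6},
    {"op": "reverse"},
])
```

Because the specification is pure data, the same replay can run eagerly in a scheduled batch or lazily as frames stream out, matching the real-time and non-real-time modes described above.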
In another embodiment, the master asset editor 108 may respond to a command from a remote computing device (e.g., clicking a "remix" button at the remote computing device may command the master asset editor 108 to apply an edit specification to a high-resolution media asset). For example, the master asset editor 108 may dynamically and/or interactively apply an edit specification to a media asset upon a user command from a remote computing device. In one embodiment, the master asset editor 108 may dynamically apply an edit specification to a high-resolution asset, thereby producing an edited high-resolution media asset for playback. In another embodiment, the master asset editor 108 may apply an edit specification partly on a remote computing device and partly on one or more computers connected over a network (e.g., the Internet 114). For example, bifurcating application of the edit specification in this way may minimize the size of the edited high-resolution asset before it is transmitted to the remote computing device for playback. In another embodiment, the master asset editor 108 may apply an edit specification on a remote computing device to take advantage of vector-based processing that can be performed efficiently on the remote computing device at playback.
In one embodiment, the system 100 also includes an editor 110, which may reside on a remote computing device 112 connected to one or more networked computers via, for example, the Internet 114. In one embodiment, the editor 110 may include software. For example, the editor 110 may be a stand-alone program. As another example, the editor 110 may include one or more instructions that may be executed via another program, such as a browser (e.g., Microsoft's Internet Explorer). In one embodiment, the editor 110 may be designed with a user interface similar to other media editing programs. In one embodiment, the editor 110 may contain connections to the following components: the master asset library 102, the edit asset generator 104, the specification applicator 106, and/or the master asset editor 108. In one embodiment, the editor 110 may include pre-built or "default" edit specifications that may be applied to a media asset by the remote computing device. In one embodiment, the editor 110 may include a player program for displaying media assets and/or applying one or more instructions from an edit specification when playing back media assets. In another embodiment, the editor 110 may be connected to a player program (e.g., a stand-alone editor may be connected to a browser).
Fig. 2 illustrates an embodiment of a system 200 for generating media assets. In one embodiment, the system 200 includes a high-resolution media asset library 202. In one embodiment, the high-resolution media asset library 202 may be a shared library, a public library, and/or a private library. In one embodiment, the high-resolution media asset library 202 may include at least one video file. In another embodiment, the high-resolution media asset library 202 may include at least one audio file. In yet another embodiment, the high-resolution media asset library 202 may include at least one reference to a media asset residing on the remote computing device 212. In one embodiment, the high-resolution media asset library 202 may reside on multiple computing devices.
In one embodiment, the system 200 also includes a low-resolution media asset generator 204, the low-resolution media asset generator 204 generating low-resolution media assets from high-resolution media assets contained in a high-resolution media asset library. For example, as described above, the low-resolution media asset generator 204 may convert a high-resolution media asset to a low-resolution media asset.
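The patent does not specify how the conversion is performed; as an illustration only, generating a low-resolution proxy can be modeled by reducing both spatial resolution (pixels per frame) and temporal resolution (frames per second). All names and the nearest-neighbor scheme below are hypothetical; a production system would use a transcoder such as FFmpeg.

```python
# Illustrative model of a low-resolution media asset generator.
# A frame is a 2D list of pixels; an asset is a list of frames.

def downsample_frame(frame, factor):
    """Keep every `factor`-th pixel in each dimension (nearest-neighbor)."""
    return [row[::factor] for row in frame[::factor]]

def generate_low_res_asset(high_res_frames, spatial_factor=2, temporal_factor=2):
    """Drop frames (lower frame rate) and downsample the frames that remain."""
    return [downsample_frame(f, spatial_factor)
            for f in high_res_frames[::temporal_factor]]

# A tiny 4-frame "master"; each frame is 4x4 pixels tagged (t, x, y).
master = [[[(t, x, y) for x in range(4)] for y in range(4)] for t in range(4)]
proxy = generate_low_res_asset(master)
assert len(proxy) == 2          # half the frames
assert len(proxy[0]) == 2       # half the rows per frame
assert len(proxy[0][0]) == 2    # half the pixels per row
```

The proxy carries enough visual information to drive editing decisions while being far smaller to transmit than the master.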
In one embodiment, the system 200 also includes a low-resolution media asset editor 208, the low-resolution media asset editor 208 transmitting edits to an associated low-resolution media asset to one or more computers via a network, such as the Internet 214. In another embodiment, the low-resolution media asset editor 208 may reside on a computing device remote from the high-resolution media asset editor, such as the remote computing device 212. In another embodiment, the low-resolution media asset editor 208 may utilize a browser. For example, the low-resolution media asset editor 208 may store the low-resolution media asset in a cache of a browser.
In one embodiment, system 200 may also include an image rendering device 210 that displays the associated low-resolution media assets. In one embodiment, the image rendering device 210 resides on a computing device 212 remote from the high-resolution media asset editor 206. In another embodiment, image rendering device 210 may utilize a browser.
In one embodiment, the system 200 also includes a high-resolution media asset editor 206 that applies edits to the high-resolution media asset based on edits made to the associated low-resolution media asset.
Fig. 3 illustrates an embodiment of a method 300 for editing a low-resolution media asset to produce a high-resolution edited media asset. In method 300, a request to edit a first high-resolution media asset is received from a requestor in a requesting operation 302. In one embodiment, the first high-resolution media asset may comprise a plurality of files, and the receiving of the request to edit the first high-resolution media asset in the requesting operation 302 may further comprise receiving a request to edit at least one of the plurality of files. In another embodiment, the requesting operation 302 may further include receiving a request to edit at least one high-resolution audio or video file.
In the method 300, a low-resolution media asset based on a first high-resolution media asset is sent to a requestor in a sending operation 304. In one embodiment, the sending operation 304 may comprise sending at least one low-resolution audio or video file. In another embodiment, the sending operation 304 may further comprise converting at least one high-resolution audio or video file associated with the first high-resolution media asset from a first file format to at least one low-resolution audio or video file, respectively, having a second file format. For example, a high-resolution uncompressed audio file (e.g., a WAV file) may be converted to a compressed audio file (e.g., an MP3 file). As another example, a file compressed at a lower compression ratio may be converted to a file of the same format encoded at a higher compression ratio.
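As a toy model of such an audio conversion (the sample-dropping and bit-shifting scheme here is purely illustrative and is not how any real codec such as MP3 works):

```python
def to_low_res_audio(samples, rate_factor=2, bit_shift=8):
    """Drop every `rate_factor`-th sample (lower sample rate) and discard
    low-order bits (coarser quantization) -- a stand-in for re-encoding
    at a higher compression ratio."""
    return [s >> bit_shift for s in samples[::rate_factor]]

pcm = list(range(0, 4096, 256))   # 16 fake 16-bit PCM samples
low = to_low_res_audio(pcm)
assert len(low) == 8              # half the samples survive
```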
The method 300 then includes receiving editing instructions associated with the low-resolution media asset from the requestor in a receiving operation 306. In one embodiment, receiving operation 306 may further comprise receiving an instruction to modify a video presentation property of at least one high resolution video file. For example, modifying attributes of a video presentation may include receiving instructions to modify: an image aspect ratio, a spatial resolution value, a temporal resolution value, a bit rate value, or a compression value. In another embodiment, receiving operation 306 may further comprise receiving instructions to modify a timeline (e.g., an order of frames) of the at least one high resolution video file.
The method 300 further includes generating a second high-resolution media asset based on the first high-resolution media asset and the edit instruction associated with the low-resolution media asset in a generating operation 308. In one embodiment of the generating operation 308, the edit specification is applied to at least one high-resolution audio or video file that includes the first high-resolution media asset. In yet another embodiment, the generating operation 308 generates at least one high resolution audio or video file. In another embodiment, the generating operation 308 further comprises the steps of: generating a copy of at least one high-resolution audio or video file associated with a first high-resolution media asset; applying editing instructions to the at least one high-resolution audio or video file, respectively; and saving the copy as a second high-resolution media asset.
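The copy/apply/save flow of the generating operation 308 might be sketched as follows, with frames modeled as strings and a hypothetical edit-instruction vocabulary limited to trim and reorder operations (the patent does not fix an instruction format):

```python
import copy

def apply_edit_spec(master_frames, edit_spec):
    """Apply edit instructions, recorded against the low-resolution proxy,
    to a copy of the high-resolution master; the master itself is untouched."""
    frames = copy.deepcopy(master_frames)
    for op, arg in edit_spec:
        if op == "trim":                      # keep frames [start, end)
            start, end = arg
            frames = frames[start:end]
        elif op == "reorder":                 # arg is a permutation of indices
            frames = [frames[i] for i in arg]
    return frames

master = ["f0", "f1", "f2", "f3", "f4"]
spec = [("trim", (1, 4)), ("reorder", [2, 0, 1])]
edited = apply_edit_spec(master, spec)        # the "second" high-res asset
assert edited == ["f3", "f1", "f2"]
assert master == ["f0", "f1", "f2", "f3", "f4"]   # first asset preserved
```

Because the instructions are applied to a copy, the first high-resolution media asset remains available for other edits.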
In another embodiment of method 300, at least a portion of the second high-resolution media asset may be transmitted to a remote computing device. In yet another embodiment of method 300, at least a portion of the second high-resolution media asset may be displayed by an image rendering device. For example, the image rendering device may take the form of a browser residing on a remote computing device.
Fig. 4 illustrates an embodiment of a method 400 for generating a media asset. In the method 400, a request to generate a video asset is received in a receiving operation 402, the request identifying a start frame and an end frame in a keyframe master asset. For example, the request of receiving operation 402 may identify a first portion and/or a second portion of the video asset.
In generate first portion operation 404, the method 400 then includes generating a first portion of the video asset, where the first portion contains one or more keyframes associated with the starting frame, the keyframes being obtained from the keyframe master asset. For example, where the keyframe master asset comprises an uncompressed video file, one or more frames of the uncompressed video file may comprise a keyframe associated with a starting frame of the media asset.
In generate second portion operation 406, the method 400 further includes generating a second portion of the video asset, where the second portion contains a set of keyframes and optimized frames obtained from an optimized master asset associated with the keyframe master asset. For example, where the optimized master asset comprises a compressed video file, a set of compressed frames may be combined into the video asset along with one or more uncompressed frames from the uncompressed video file.
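One way to picture this assembly (a simplified sketch; the function and index structure are invented for illustration) is that frames before the next keyframe boundary must be taken from the fully keyframed master, while frames from that boundary onward can be copied from the optimized stream without re-encoding:

```python
def assemble_clip(start, end, keyframe_master, optimized, keyframe_index):
    """Build a clip covering frames [start, end). Frames before the next
    keyframe boundary come from the keyframe master (every frame is
    independently decodable there); frames from the boundary on are reused
    verbatim from the optimized master. `keyframe_index` lists the frame
    numbers that are keyframes in the optimized stream."""
    boundary = next((k for k in keyframe_index if k >= start), end)
    first_part = keyframe_master[start:min(boundary, end)]
    second_part = optimized[min(boundary, end):end]
    return first_part + second_part

key_master = [f"K{i}" for i in range(10)]   # all-keyframe master
optimized = [f"O{i}" for i in range(10)]    # inter-coded (optimized) master
clip = assemble_clip(3, 8, key_master, optimized, keyframe_index=[0, 5])
assert clip == ["K3", "K4", "O5", "O6", "O7"]
```

This mirrors why a keyframe master is useful: a cut can begin at an arbitrary frame even though the optimized stream is only decodable from its keyframes.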
In another embodiment of method 400, a library of master assets may be maintained such that keyframe master assets and optimized master assets corresponding to at least one library master asset may be generated. In yet another embodiment of the method 400, the request may identify a start key frame or an end key frame in the key frame master asset corresponding to the start frame or the end frame, respectively.
Fig. 5 illustrates an embodiment of a method 500 for generating a media asset. In method 500, a request to generate a video asset is received in receiving operation 502, the request identifying a start frame and an end frame in a master asset. For example, the request of receiving operation 502 may identify a first portion and/or a second portion of the video asset.
In generate a first portion operation 504, the method 500 then includes generating a first portion of the video asset, where the first portion contains one or more keyframes associated with the starting frame and the keyframes are obtained from a keyframe master asset corresponding to the master asset.
In generate second portion operation 506, the method 500 then includes generating a second portion of the video asset, where the second portion contains a set of keyframes and optimized frames, the optimized frames obtained from an optimized master asset corresponding to the master asset. For example, where the optimized master asset comprises a compressed video file, a set of compressed frames may be combined into the video asset along with one or more uncompressed frames from the keyframe master asset.
In another embodiment of method 500, a library of master assets may be maintained such that keyframe master assets and optimized master assets corresponding to at least one library master asset may be generated. In yet another embodiment of method 500, the request may identify a start key frame or an end key frame in the key frame master asset that corresponds to the start frame or the end frame, respectively.
Fig. 6 illustrates an embodiment of a method 600 for generating a media asset. In the method 600, a request to generate a video asset is received in a receiving operation 602, where the request identifies a start frame and an end frame in an optimized master asset. For example, the request of receiving operation 602 may identify a first portion and/or a second portion of the video asset.
In a generate keyframe operation 604, the method 600 then includes generating a keyframe master asset based on the optimized master asset, the keyframe master asset including one or more keyframes corresponding to the starting frame. In generate a first portion operation 606, the method 600 further includes generating a first portion of the video asset, where the first portion includes at least the start frame identified in the optimized master asset. In generate second portion operation 608, the method 600 then further includes generating a second portion of the video asset, where the second portion includes a set of keyframes and optimized frames, the optimized frames obtained from the optimized master asset.
In another embodiment of method 600, a library of master assets may be maintained such that keyframe master assets and optimized master assets corresponding to at least one library master asset may be generated. In yet another embodiment of method 600, the request may identify a start key frame or an end key frame in the key frame master asset that corresponds to the start frame or the end frame, respectively.
Fig. 7 illustrates an embodiment of a method 700 for recording edits to media content. In method 700, a low-resolution media asset corresponding to a master high-resolution media asset is edited in an editing operation 702. In one embodiment, editing includes modifying an image of a low-resolution media asset corresponding to a master high-resolution media asset. For example, in the case of an image comprising pixel data, the pixels may be manipulated such that they appear in different colors or at different brightnesses. In another embodiment, editing includes modifying a duration of a low-resolution media asset corresponding to a duration of a master high-resolution media asset. For example, modifying the duration may include shortening the low-resolution media asset and the high-resolution media asset corresponding to the low-resolution media asset.
In yet another embodiment, where the master high-resolution media asset and the low-resolution media asset include at least one or more frames of video information, the editing includes modifying a transition property of the at least one or more frames of video information of the low-resolution media asset corresponding to the master high-resolution media asset. For example, transitions such as fades may replace the image of one frame with the image of another frame. In another embodiment, editing includes modifying a volume value of an audio component of a low-resolution media asset corresponding to a master high-resolution media asset. For example, a media asset that includes video information may include an audio track that may be played more or less strongly depending on whether a greater or lesser volume value is selected.
In another embodiment, where the master high-resolution media asset and the low-resolution media asset comprise at least two or more frames of sequential video information, the editing comprises modifying an order of the at least two or more frames of sequential video information of the low-resolution media asset corresponding to the master high-resolution media asset. For example, the order of the second frame may be adjusted to precede the first frame of the media asset comprising video information.
In yet another embodiment, editing includes modifying one or more uniform resource locators (e.g., URLs) associated with a low-resolution media asset corresponding to a master high-resolution media asset. In yet another embodiment, editing includes modifying a playback rate (e.g., 30 frames per second) of a low-resolution media asset corresponding to a master high-resolution media asset. In yet another embodiment, editing includes modifying a resolution (e.g., temporal or spatial resolution) of a low-resolution media asset corresponding to a master high-resolution media asset. In one embodiment, the editing may occur on a remote computing device. For example, the edit specification itself may be created on a remote computing device. Similarly, for example, the edited high-resolution media asset may be sent to a remote computing device for rendering on an image rendering device such as a browser.
The method 700 then includes generating an edit specification based on the edits to the low-resolution media asset in a generating operation 704. The method 700 further includes, in an applying operation 706, applying the edit specification to the master high-resolution media asset to create an edited high-resolution media asset. In one embodiment, method 700 further comprises rendering the edited high-resolution media asset on an image rendering device. For example, rendering the edited high-resolution media asset may itself include applying a media asset filter to the edited high-resolution media asset. As another example, applying the media asset filter may include overlaying the edited high-resolution media asset with an animation. As another example, applying the media asset filter may also include changing a display attribute of the edited high-resolution media asset. Changing display attributes may include, but is not limited to, changing video presentation attributes. In this example, applying the media asset filter can include changing a video effect, a title, a frame rate, a trick play effect (e.g., the media asset filter can change a fast forward, pause, slow motion, and/or rewind operation), and/or a composite display (e.g., displaying a portion of two different media assets at least simultaneously, such as in the case of picture-in-picture and/or green screen composite). In another embodiment, method 700 may further include storing the edit specification. For example, the edit specification may be stored on a remote computing device or one or more computers connected via a network, e.g., via the internet.
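Although the patent does not define a schema for the edit specification, the idea that edits recorded against the low-resolution proxy can be stored, transmitted, and replayed against the master can be sketched with a hypothetical JSON specification keyed by time rather than by frame, so that the same specification resolves to different frame ranges at the proxy and master frame rates (all field names and identifiers below are invented):

```python
import json

# Hypothetical resolution-independent edit specification: edits are
# recorded in seconds, not frames, so one spec fits both proxy and master.
edit_spec = {
    "asset_id": "master-0042",    # made-up identifier
    "edits": [
        {"op": "trim", "start_s": 2.0, "end_s": 5.0},
        {"op": "volume", "gain": 0.5},
    ],
}

def seconds_to_frames(spec, fps):
    """Translate the time-based trim edits to frame indices for an asset
    played at `fps` frames per second (other ops are passed over here)."""
    out = []
    for e in spec["edits"]:
        if e["op"] == "trim":
            out.append(("trim", int(e["start_s"] * fps), int(e["end_s"] * fps)))
    return out

wire = json.dumps(edit_spec)      # small enough to store or transmit
assert seconds_to_frames(json.loads(wire), fps=15) == [("trim", 30, 75)]    # proxy
assert seconds_to_frames(json.loads(wire), fps=30) == [("trim", 60, 150)]   # master
```

Because only this small recipe (not the media itself) crosses the network, the same edits can be replayed server-side against the full-resolution master.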
FIG. 8 illustrates an embodiment of a method 800 for identifying edit information for a media asset. In method 800, a low-resolution media asset is edited in an editing operation 802, wherein the low-resolution media asset contains at least a first portion corresponding to a first high-resolution master media asset and a second portion corresponding to a second high-resolution master media asset. In one embodiment, the editing operation 802 further comprises storing at least some of the editing information as metadata with the high-resolution edited media asset. In another embodiment, the editing operation 802 may occur on a remote computing device.
In receiving operation 804, the method 800 then includes receiving a request to generate a high-resolution edited media asset, where the request identifies a first high-resolution master media asset and a second high-resolution master media asset. The method 800 then includes generating the high-resolution edited media asset in a generating operation 806. The method 800 further includes, in an associating operation 808, associating edit information with the high-resolution edited media asset, wherein the edit information identifies the first high-resolution master media asset and the second high-resolution master media asset.
In one embodiment, method 800 further comprises retrieving the first high-resolution master media asset or the second high-resolution master media asset. In another embodiment, the method 800 further comprises assembling the retrieved first high-resolution media asset and the retrieved second high-resolution media asset into a high-resolution edited media asset.
FIG. 9 illustrates an embodiment of a method 900 for rendering a media asset. In method 900, a command to render an aggregate media asset defined by an edit specification is received in a receiving operation 902, wherein the edit specification identifies at least a first media asset associated with at least one edit instruction. In one embodiment, receiving operation 902 comprises an end user command. In another embodiment, receiving operation 902 may comprise a command issued by a computing device, such as a remote computing device. In another embodiment, the receiving operation 902 may include a series of commands that together represent a command for rendering an aggregate media asset defined by an edit specification.
In edit specification retrieval operation 904, an edit specification is retrieved. In one embodiment, the retrieving operation 904 may comprise retrieving the edit specification from memory or some other storage device. In another embodiment, the retrieving operation 904 may include retrieving the edit specification from a remote computing device. In yet another embodiment, retrieving the edit specification in retrieving operation 904 can include retrieving several edit specifications that collectively constitute a single related edit specification. For example, several edit specifications may be associated with different media assets (e.g., the programs of a show may each comprise a media asset) that together constitute a single related edit specification (e.g., for the entire show, including each program of the show). In one embodiment, the edit specification may identify a second media asset associated with a second edit instruction, which may likewise be retrieved and rendered on a media asset rendering device.
In a media asset retrieval operation 906, a first media asset is retrieved. In one embodiment, the retrieving operation 906 may comprise retrieving the first media asset from a remote computing device. In another embodiment, the retrieving operation 906 may comprise retrieving the first media asset from memory or some other storage device. In yet another embodiment, the retrieving operation 906 may include retrieving a portion of the first media asset (e.g., a header or a first portion of a file). In another embodiment of the retrieving operation 906, the first media asset may comprise a plurality of sub-parts. According to the example set forth in retrieving operation 904, a first media asset in the form of a video (e.g., a show having a plurality of programs) may include a plurality of media asset portions (e.g., a plurality of programs represented by different media assets). In this example, the edit specification may contain information that links or otherwise relates a plurality of different media assets together to form a single related media asset.
In a rendering operation 908, a first media asset of the aggregate media asset is displayed on the media asset rendering device according to at least one edit instruction. In one embodiment, the edit instruction may identify or point to a second media asset. In one embodiment, a media asset rendering device may include a display for video information and a speaker for audio information. In embodiments where a second media asset is present, the second media asset may include information similar to the first media asset (e.g., both the first and second media assets may contain audio or video information) or different from the first media asset (e.g., the second media asset may contain audio information, such as commentary for a movie, while the first media asset may contain video information, such as images and voice for a movie). In another embodiment, rendering operation 908 may also include editing instructions for: modifying a transition property of a transition from the first media asset to the second media asset; overlaying effects and/or titles on an asset; combining two assets (e.g., a picture-in-picture and/or green screen combination generated from editing instructions); modifying a frame rate and/or presentation rate of at least a portion of the media asset; modifying a duration of the first media asset; modifying a display attribute of the first media asset; or modifying an audio attribute of the first media asset.
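For instance, a transition edit instruction joining two assets could be modeled as a crossfade. In the sketch below (invented for illustration), each frame is reduced to a single brightness value, and the blend weights ramp linearly across the overlap:

```python
def crossfade(a_frames, b_frames, overlap):
    """Render a transition: over `overlap` frames, blend the tail of
    asset A into the head of asset B (values model per-frame brightness).
    Requires overlap >= 1."""
    out = list(a_frames[:-overlap])
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)            # weight of asset B ramps up
        out.append(round((1 - w) * a_frames[-overlap + i] + w * b_frames[i], 3))
    out += b_frames[overlap:]
    return out

a = [100, 100, 100, 100]   # bright asset
b = [0, 0, 0, 0]           # dark asset
seq = crossfade(a, b, overlap=2)
assert len(seq) == 6
assert seq[2] > seq[3] > seq[4]   # brightness falls through the transition
```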
Fig. 10 illustrates an embodiment of a method 1000 for storing an aggregate media asset. In method 1000, a plurality of component media assets are stored in a storing operation 1002. For example, by way of illustration and not limitation, storing operation 1002 may comprise caching at least one of the plurality of component media assets in a memory. As another example, one or more component media assets can be cached in a memory cache reserved for a program, such as an internet browser.
In a storing operation 1004, a first aggregate edit specification is stored, wherein the first aggregate edit specification includes at least one command for rendering the plurality of component media assets to produce a first aggregate media asset. For example, an aggregate media asset may comprise one or more component media assets that contain video information. In this example, the component videos may be ordered such that they are rendered in some order as an aggregate video (e.g., a video clip). In one embodiment, the storing operation 1004 includes storing at least one command to sequentially display a first portion of the plurality of component media assets. For example, the command for display may modify the playback duration of a component media asset that includes video information. In another embodiment of the storing operation 1004, at least one command to render an effect corresponding to at least one of the plurality of component media assets may be stored. As one example, the storing operation 1004 may include storing a command for one or more effects that make up a transition between component media assets. In yet another embodiment of the storing operation 1004, a second aggregate edit specification can be stored, the second aggregate edit specification including at least one command for rendering a plurality of component media assets to produce a second aggregate media asset.
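The relationship between stored components and multiple aggregate edit specifications can be sketched as follows (all identifiers are invented): the component assets are stored once, and each specification is merely a rendering recipe over them, so two different aggregates share the same underlying storage.

```python
# Component assets stored once, keyed by hypothetical asset IDs.
components = {"intro": ["i0", "i1"], "body": ["b0", "b1", "b2"], "outro": ["o0"]}

def render_aggregate(spec, library):
    """Render an aggregate asset from a spec: a list of (asset_id, duration)
    commands, where duration caps how many frames of the component play."""
    frames = []
    for asset_id, duration in spec:
        frames += library[asset_id][:duration]
    return frames

spec_full = [("intro", 2), ("body", 3), ("outro", 1)]
spec_teaser = [("body", 1), ("outro", 1)]   # second aggregate, same components
assert render_aggregate(spec_full, components) == ["i0", "i1", "b0", "b1", "b2", "o0"]
assert render_aggregate(spec_teaser, components) == ["b0", "o0"]
```

Editing an aggregate then means rewriting its recipe, not re-encoding or duplicating any component media.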
FIG. 11 illustrates an embodiment of a method for editing an aggregate media asset.
In method 1100, a stream corresponding to an aggregate media asset from a remote computing device is received in a playback session in a receiving operation 1102, the aggregate media asset comprising at least one component media asset. For example, a playback session may include a user environment that allows playback of media assets. As another example, a playback session may include one or more programs that may display one or more files. According to this example, the playback session may include an internet browser capable of receiving the streaming aggregate media asset. In this example, the aggregate media asset may comprise one or more component media assets residing on the remote computing device. The one or more component media assets can be streamed, thereby achieving bandwidth and processing efficiency on the local computing device.
In a rendering operation 1104, the aggregate media asset is rendered on an image rendering device. For example, an aggregate media asset may be displayed displaying pixel information from the aggregate media asset that includes video information. In a receiving operation 1106, a user command to edit an edit specification associated with an aggregate media asset is received. As previously discussed, the edit specification can take a variety of forms, including but not limited to one or more of the following files: the file contains metadata and other information associated with the component media assets that may be associated with the aggregate media asset.
In an initiating operation 1108, an edit session is initiated to edit an edit specification associated with the aggregate media asset. In one embodiment, initiating operation 1108 includes displaying information corresponding to an edit specification associated with the aggregate media asset. For example, an editing session may allow a user to adjust the duration of a certain component media asset. In another embodiment, the method 1100 further includes modifying an edit specification associated with the aggregate media asset, thereby changing the aggregate media asset. According to the foregoing example, once a component media asset is edited in an editing session, the edit to the component media asset may be reflected in the aggregate media asset.
Those skilled in the art will recognize that the methods and systems of the present disclosure may be implemented in a variety of ways and as such are not limited by the foregoing exemplary embodiments and examples. In other words, functional elements and individual functions performed by a single or multiple components in various combinations of hardware and software or firmware may be distributed among software applications at either the client or server or both. In this regard, any number of the features of the different embodiments described herein may be combined into single or multiple embodiments, and alternative embodiments having fewer than or more than all of the features described herein are possible. The functionality may also be distributed, in whole or in part, among multiple components, in manners now known or to be known. Thus, many software/hardware/firmware combinations are possible in implementing the functions, features, interfaces and preferences described herein. Moreover, the scope of the present disclosure covers conventionally known manners for implementing the described features and functions and interfaces, as well as those changes and modifications that may be made to the hardware or software or firmware components described herein as would be understood by those skilled in the art now and in the future.
Although various embodiments have been described for purposes of this disclosure, various changes and modifications may be made which are within the scope of the invention. For example, the edit specification may also include instructions for layering multiple audio tracks together or combining different audio samples together. As another example, online reconfiguration of a three-dimensional gaming environment (e.g., editing of a 3D gaming environment) may be accomplished using the methods and systems described for generating low-resolution media assets corresponding to high-resolution media assets. As yet another example, the methods and systems described herein allow for interactive reconfiguration of Internet pages.
Many other changes may be made which will be suggested to a person skilled in the art by changes in itself, which changes are encompassed within the scope of the invention disclosed and as defined by the appended claims.

Claims (52)

1. A method for editing a low-resolution media asset to produce a high-resolution edited media asset, comprising:
receiving a request from a requestor to edit a first high-resolution media asset;
sending a low-resolution media asset to the requestor, the low-resolution media asset being based on the first high-resolution media asset;
receiving editing instructions associated with the low-resolution media asset from a requestor; and
generating a second high-resolution media asset based on the first high-resolution media asset and editing instructions associated with the low-resolution media asset.
2. The method of claim 1, further comprising:
transmitting at least a portion of the second high-resolution media asset to a remote computing device.
3. The method of claim 2, further comprising:
displaying, with an image rendering device associated with the remote computing device, at least a portion of the second high-resolution media asset.
4. The method of claim 1 wherein the first high-resolution media asset comprises a plurality of files and receiving a request to edit the first high-resolution media asset further comprises receiving a request to edit at least one of the plurality of files.
5. The method of claim 1, wherein receiving a request to edit a first high-resolution media asset further comprises:
receiving a request to edit at least one high-resolution video file.
6. The method of claim 1, wherein generating a second high-resolution media asset further comprises:
applying the edit specification to at least one high-resolution video file comprising the first high-resolution media asset.
7. The method of claim 6, wherein receiving the editing instructions further comprises:
receiving instructions for modifying video presentation properties of the at least one high-resolution video file.
8. The method of claim 7, wherein receiving instructions for modifying video presentation properties of the at least one high-resolution video file further comprises:
receiving instructions for modifying an image aspect ratio, a spatial resolution value, a temporal resolution value, a bit rate value, or a compression value.
9. The method of claim 6, wherein receiving the editing instructions further comprises:
receiving an instruction to modify a timeline of the at least one high-resolution video file.
10. The method of claim 1, wherein generating the second high-resolution media asset further comprises:
generating at least one high-resolution video file.
11. The method of claim 1, wherein generating the second high-resolution media asset further comprises:
generating a copy of at least one high-resolution video file associated with the first high-resolution media asset;
applying the edit specification to the at least one high-resolution video file; and
saving the copy as the second high-resolution media asset.
12. The method of claim 1, wherein sending the low-resolution media asset further comprises:
sending at least one low-resolution video file.
13. The method of claim 12, wherein sending the low-resolution media asset further comprises:
converting at least one high-resolution video file associated with the first high-resolution media asset from a first file format to the at least one low-resolution video file having a second file format.
14. The method of claim 1, wherein receiving a request to edit a first high-resolution media asset further comprises:
receiving a request to edit at least one high-resolution audio file.
15. The method of claim 1, wherein generating a second high-resolution media asset further comprises:
applying the edit specification to at least one high-resolution audio file comprising the first high-resolution media asset.
16. The method of claim 1, wherein generating a second high-resolution media asset further comprises:
generating at least one high-resolution audio file.
17. The method of claim 1, wherein generating a second high-resolution media asset further comprises:
generating a copy of at least one high-resolution audio file associated with the first high-resolution media asset;
applying the editing instructions to the at least one high-resolution audio file; and
saving the copy as the second high-resolution media asset.
18. The method of claim 1, wherein sending the low-resolution media asset further comprises:
sending at least one low-resolution audio file.
19. The method of claim 18, wherein sending the low-resolution media asset further comprises:
converting at least one high-resolution audio file associated with the first high-resolution media asset from a first file format to the at least one low-resolution audio file having a second file format.
20. A computer readable medium encoding or containing computer executable instructions for performing a method for editing a low-resolution media asset to produce a high-resolution edited media asset, the method comprising:
receiving a request from a requestor to edit a first high-resolution media asset;
sending a low-resolution media asset to the requestor, the low-resolution media asset being based on the first high-resolution media asset;
receiving, from the requestor, editing instructions associated with the low-resolution media asset; and
generating a second high-resolution media asset based on the first high-resolution media asset and the editing instructions associated with the low-resolution media asset.
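The four steps of claim 20 form a proxy-editing round trip: the requestor only ever handles a low-resolution stand-in, while the server replays the received edits against the high-resolution original. The schematic sketch below is an illustration only; `make_proxy` and `apply_edits` are hypothetical callbacks, not elements of the claims:

```python
class RemixSession:
    """Proxy-editing round trip: the requestor sees only low-resolution
    proxies; edits are replayed server-side against the high-resolution
    original to generate a second high-resolution asset."""

    def __init__(self, library, make_proxy, apply_edits):
        self.library = library          # asset_id -> high-resolution asset
        self.make_proxy = make_proxy    # high-res asset -> low-res proxy
        self.apply_edits = apply_edits  # (high-res asset, edits) -> new asset

    def request_edit(self, asset_id):
        """Steps 1-2: receive an edit request, send back a low-resolution proxy."""
        return self.make_proxy(self.library[asset_id])

    def submit_edits(self, asset_id, edits, new_id):
        """Steps 3-4: receive editing instructions made against the proxy and
        generate a second high-resolution asset from the original."""
        self.library[new_id] = self.apply_edits(self.library[asset_id], edits)
        return self.library[new_id]

library = {"a1": "HIRES:a1"}
session = RemixSession(
    library,
    make_proxy=lambda asset: asset.replace("HIRES", "LORES"),
    apply_edits=lambda asset, edits: asset + "+" + ",".join(edits),
)
```

The design choice the claims exploit is that only the small proxy and the small edit description cross the network; the large high-resolution files never leave the server.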
21. The computer readable medium of claim 20, further comprising instructions for:
transmitting at least a portion of the second high-resolution media asset to a remote computing device.
22. The computer readable medium of claim 21, further comprising instructions for:
displaying, with an image rendering device associated with the remote computing device, at least a portion of the second high-resolution media asset.
23. The computer readable medium of claim 20, wherein the first high-resolution media asset comprises a plurality of files and receiving the request to edit the first high-resolution media asset further comprises instructions for: receiving a request to edit at least one of the plurality of files.
24. The computer readable medium of claim 20, wherein receiving a request to edit a first high-resolution media asset further comprises instructions for:
receiving a request to edit at least one high-resolution video file.
25. The computer readable medium of claim 20, wherein generating a second high-resolution media asset further comprises instructions for:
applying the editing instructions to at least one high-resolution video file comprised in the first high-resolution media asset.
26. The computer readable medium of claim 25, wherein receiving the editing instructions further comprises instructions for:
receiving instructions for modifying video presentation properties of the at least one high-resolution video file.
27. The computer readable medium of claim 26, wherein receiving instructions for modifying video presentation properties of the at least one high-resolution video file further comprises instructions for:
receiving instructions for modifying an image aspect ratio, a spatial resolution value, a temporal resolution value, a bit rate value, or a compression value.
28. The computer readable medium of claim 25, wherein receiving the editing instructions further comprises instructions for:
receiving an instruction to modify a timeline of the at least one high-resolution video file.
29. The computer readable medium of claim 20, wherein generating the second high-resolution media asset further comprises instructions for:
generating at least one high-resolution video file.
30. The computer readable medium of claim 20, wherein generating the second high-resolution media asset further comprises instructions for:
generating a copy of at least one high-resolution video file associated with the first high-resolution media asset;
applying the editing instructions to the at least one high-resolution video file; and
saving the copy as the second high-resolution media asset.
31. The computer readable medium of claim 20, wherein sending the low resolution media asset further comprises instructions for:
sending at least one low-resolution video file.
32. The computer readable medium of claim 31, wherein sending the low resolution media asset further comprises instructions for:
converting at least one high-resolution video file associated with the first high-resolution media asset from a first file format to the at least one low-resolution video file having a second file format.
33. The computer readable medium of claim 20, wherein receiving a request to edit a first high-resolution media asset further comprises instructions for:
receiving a request to edit at least one high-resolution audio file.
34. The computer readable medium of claim 20, wherein generating a second high-resolution media asset further comprises instructions for:
applying the editing instructions to at least one high-resolution audio file comprised in the first high-resolution media asset.
35. The computer readable medium of claim 20, wherein generating a second high-resolution media asset further comprises instructions for:
generating at least one high-resolution audio file.
36. The computer readable medium of claim 20, wherein generating a second high-resolution media asset further comprises instructions for:
generating a copy of at least one high-resolution audio file associated with the first high-resolution media asset;
applying the editing instructions to the at least one high-resolution audio file; and
saving the copy as the second high-resolution media asset.
37. The computer readable medium of claim 20, wherein sending the low resolution media asset further comprises instructions for:
sending at least one low-resolution audio file.
38. The computer readable medium of claim 37, wherein sending the low resolution media asset further comprises instructions for:
converting at least one high-resolution audio file associated with the first high-resolution media asset from a first file format to the at least one low-resolution audio file having a second file format.
39. A system, comprising:
a high-resolution media asset library;
a low-resolution media asset generator that generates low-resolution media assets from high-resolution media assets contained in the high-resolution media asset library; and
a high-resolution media asset editor that applies edits to the high-resolution media asset based on edits made to the associated low-resolution media asset.
40. The system of claim 39, further comprising:
a low-resolution media asset editor, on a computing device remote from the high-resolution media asset editor, that sends edits made to the associated low-resolution media asset.
41. The system of claim 40 wherein the low resolution media asset editor utilizes a browser.
42. The system of claim 41, wherein the low-resolution media assets are stored in a cache of the browser.
43. The system of claim 40, further comprising:
an image rendering device on a computing device remote from the high-resolution media asset editor that displays the associated low-resolution media asset.
44. The system of claim 43, wherein the image rendering device utilizes a browser.
45. The system of claim 44, wherein the associated low-resolution media assets are stored in a cache of the browser.
46. The system of claim 39, wherein the high-resolution media asset library is a shared library.
47. The system of claim 39, wherein the high-resolution media asset library is a public library.
48. The system of claim 39, wherein the high-resolution media asset library is a proprietary library.
49. The system of claim 39 wherein the high-resolution media asset library comprises at least one video file.
50. The system of claim 39, wherein the high-resolution media asset library comprises at least one audio file.
51. The system of claim 39 wherein the high-resolution media asset library includes at least one reference to a media asset residing on a remote computing device.
52. The system of claim 39 wherein the high-resolution media asset library resides on a plurality of computing devices.
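Claims 51 and 52 allow the high-resolution library to mix locally stored files with references to assets residing on remote computing devices. A minimal sketch of such a library follows (all names and the URL scheme check are illustrative assumptions, not claim language):

```python
class DistributedAssetLibrary:
    """High-resolution library whose entries are either local file paths or
    references (URLs) to media assets residing on remote computing devices."""

    def __init__(self):
        self._entries = {}  # asset_id -> local path or remote URL

    def add(self, asset_id: str, location: str) -> None:
        self._entries[asset_id] = location

    def is_remote(self, asset_id: str) -> bool:
        # Treat URL-shaped locations as references to remote devices.
        return self._entries[asset_id].startswith(("http://", "https://"))

    def locate(self, asset_id: str) -> str:
        return self._entries[asset_id]

library = DistributedAssetLibrary()
library.add("local-1", "/var/media/local-1.mp4")
library.add("remote-1", "https://peer.example.com/assets/remote-1.mp4")
```

Storing a reference instead of the bytes lets the library span multiple computing devices (claim 52) while presenting a single lookup interface to the editor.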
HK09108061.8A 2006-01-13 2007-01-12 Method and system for online remixing of digital multimedia HK1130099A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US60/758,664 2006-01-13
US60/790,569 2006-04-10
US11/622,920 2007-01-12

Publications (1)

Publication Number Publication Date
HK1130099A (en) 2009-12-18

Similar Documents

Publication Publication Date Title
US8411758B2 (en) Method and system for online remixing of digital multimedia
KR100976887B1 (en) Method and system for creating and applying dynamic media specification builder and applicator
US8868465B2 (en) Method and system for publishing media content
KR100991583B1 (en) A method, computer readable storage medium and system for combining editorial information with media content
JP5112287B2 (en) Method and system for providing distributed editing and storage of digital media over a network
US20070240072A1 (en) User interface for editing media assets
KR100987862B1 (en) Method and system for recording edits in media content
CN101395918B (en) Method and system for creating and applying dynamic media specification creator and applicator
JP2015510727A (en) Method and system for providing file data for media files
HK1130099A (en) Method and system for online remixing of digital multimedia
HK1130137A (en) Method and system for combining edit information with media content
HK1130138A (en) Method and system for recording edits to media content
HK1130136A (en) Method and system for creating and applying dynamic media specification creator and applicator