
WO2016205674A1 - Dynamic adaptive contribution streaming - Google Patents


Info

Publication number: WO2016205674A1
Authority: WO (WIPO (PCT))
Prior art keywords: video content, rate, server, dacs, client
Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: PCT/US2016/038114
Other languages: French (fr)
Inventor: Alexander GILADI
Current Assignee: Vid Scale Inc (the listed assignees may be inaccurate)
Original Assignee: Vid Scale Inc
Application filed by Vid Scale Inc
Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/44029: Processing of video elementary streams, involving reformatting operations of video signals for household redistribution, storage or real-time display, for generating different versions
    • H04N 21/2743: Video hosting of uploaded data from client
    • H04N 21/44245: Monitoring the upstream path of the transmission network, e.g. its availability, bandwidth
    • H04N 21/8456: Structuring of content by decomposing the content in the time domain, e.g. in time segments

Definitions

  • Some communications are performed via expensive special purpose satellite links, for example, due to a lack of reliable capability to communicate with appropriate quality and/or latency over the Internet.
  • Live broadcasting of media content may be achieved by streaming through Hypertext Transfer Protocol (HTTP) streaming technologies.
  • traditional live newsgathering is very expensive, as it requires dedicated satellite bandwidth, dedicated vans with satellite antennas, and other high cost infrastructure.
  • a client device may receive video content, for example, by recording the video content (e.g., via a camera of the client device).
  • the client device may determine available bandwidth of a wireless communication channel for communication with a server.
  • the wireless communication channel may be that of a consumer grade network, for example, as described herein.
  • the available bandwidth may change over time.
  • the client device may determine a plurality of rates to encode the video content.
  • the client device may encode the video content to generate a plurality of segments.
  • the client device may determine a rate to encode each of the segments. The rate may be determined based on the available bandwidth at the time the client device is encoding each specific segment.
  • the encoding rate for each segment may increase or decrease accordingly.
  • the client device may send each segment to the server via the wireless communication channel of the consumer grade network, for example, in real-time as the client device is receiving the video content.
  • the client device may generate a manifest file for the video content, which for example, may comprise information related to the video content (e.g., each of the encoded segments of the video content).
  • the client device may send the manifest file to the server, for example, via the wireless communication channel.
  • the manifest file may be static or the client device may dynamically update the manifest file.
  • the client device may monitor the available bandwidth, identify a change in the available bandwidth, determine (e.g., select) rates to encode segments of the video content based on the available bandwidth, and update the manifest file accordingly.
  • the client device may send the updated manifest file to the server. Thereafter, the client device may encode and send the video content at the determined rates (e.g., in real-time).
  • the recording, encoding, and sending the video content may occur in real-time.
  • the client device may encode segments of the video content at varying rates and send the segments in real-time.
  • the client device may encode the video content at higher rates when more bandwidth for the wireless channel is available, and encode the video content at lower rates when less bandwidth for the wireless channel is available.
  • the client device may encode and send each of the segments at a single rate in real-time, where a plurality of different rates may be used when encoding the entirety of the video content (e.g., based on the available bandwidth at the time of encoding each segment).
  • the client device may encode and send each of the segments to the server at rates that were not part of the real-time transmission, for example, such that the client device sends each of the segments of the video content at each and every one of the plurality of rates (e.g., backfills at a later time); a minimal sketch of this per-segment adaptation loop follows below.
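  • The bullets above describe a per-segment loop: measure bandwidth, pick a rate, encode, upload. A minimal Python sketch of that loop follows; the rate ladder and the capture, encoder, measure_bandwidth, and upload interfaces are hypothetical placeholders, not taken from the patent.

```python
RATE_LADDER = [500_000, 1_500_000, 4_000_000]  # hypothetical encoding bitrates (bps)

def pick_rate(available_bps, ladder=RATE_LADDER):
    """Choose the highest ladder rate that fits the measured available bandwidth."""
    fitting = [r for r in ladder if r <= available_bps]
    return max(fitting) if fitting else min(ladder)

def stream(capture, encoder, measure_bandwidth, upload):
    """Encode and send segments in real-time while content is being received.

    capture yields raw segments; encoder, measure_bandwidth, and upload are
    hypothetical interfaces standing in for the device's camera, codec, and
    HTTP uploader."""
    for index, raw_segment in enumerate(capture):
        rate = pick_rate(measure_bandwidth())            # re-measured per segment
        encoded = encoder.encode(raw_segment, bitrate=rate)
        upload(f"seg_{index:05d}_{rate}.mp4", encoded)   # e.g., HTTP PUT
```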
  • FIG. 1 is a block diagram of an example of a DASH system model.
  • FIG. 2 is a block diagram of an example DACS system.
  • FIG. 3 is an interaction diagram showing an example of DACS client flow.
  • FIG. 4 is an interaction diagram showing an example of DACS client flow.
  • FIG. 5 is an interaction diagram showing an example of DACS client flow.
  • FIG. 6 is an interaction diagram showing an example of DACS client flow.
  • FIG. 7A is a system diagram of an example communications system in which one or more disclosed embodiments may be implemented.
  • FIG. 7B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 7A.
  • FIG. 7C is a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in FIG. 7A.
  • FIG. 7D is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 7A.
  • FIG. 7E is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 7A.
  • a DACS protocol for contribution streaming may have similarities and differences compared to Dynamic Adaptive Streaming over HTTP (DASH).
  • a DACS client may, for example, run on or be connected to an acquisition device (e.g., a client device such as a mobile phone, DSLR, professional camera, etc.).
  • a DACS client may provide a manifest file, generate one or more encodings of video content (e.g., representations, segments, etc.).
  • a DACS client may upload one or more representations using Hypertext Transfer Protocol (HTTP) methods such as, for example, PUT or PUSH.
  • Content segments and/or representations may be uploaded via one or more network interfaces at one or more times (e.g., real-time and thereafter), for example, for load balancing, bandwidth adaptation, trading off content quality and latency among various representations, adding or removing enhancements, etc.
  • a CMD (contribution media description) file may comprise a manifest providing a content receiver with information about the video content (e.g., content segments).
  • a DACS client may specify and/or update one or more CMDs.
  • a CMD may enumerate multiple rates (e.g., bitrates). A rate entry may comprise encoding parameters such as bitrates, frame rates, bit depth, and/or the like.
  • a DACS client may adaptively switch between multiple rates, for example, based on available bandwidth between the DACS client and the receiving device. Techniques described may be applicable, for example, to Moving Picture Experts Group (MPEG) DASH, Society of Cable Telecommunications Engineers (SCTE) specifications, and/or other adaptive streaming technologies.
  • Over-the-top (OTT) streaming may utilize the Internet as a delivery medium.
  • Hardware capabilities may provide a wide range of video-capable devices, e.g., mobile devices, Internet set-top boxes (STBs), and network TVs.
  • Network capabilities may provide high-quality video delivery over the Internet.
  • Closed networks may be controlled by a multi-system operator (MSO).
  • the Internet may be a best effort environment, in which bandwidth and/or latency may change.
  • Network conditions may be volatile in mobile networks. Dynamic adaptation to network changes (volatility) may improve user experience.
  • HTTP streaming may provide a video transport protocol.
  • HTTP infrastructure such as video content distribution networks (CDNs) and/or the ubiquity of HTTP support on multiple platforms and devices may permit use and scaling of HTTP for Internet video streaming.
  • Some firewalls may disallow user datagram protocol (UDP) traffic while permitting HTTP, which allows video over HTTP to penetrate behind firewalls.
  • HTTP streaming may be used for rate-adaptive streaming.
  • HTTP adaptive streaming may segment an asset (e.g., video content), for example, virtually or physically. HTTP adaptive streaming may publish segments to CDNs.
  • Intelligence may reside in a client.
  • a client may acquire knowledge of published alternative encodings (e.g., representations) and a way or technique to construct uniform resource locators (URLs), for example, to download a segment from a given representation.
  • An adaptive bitrate (ABR) client may observe network conditions and/or may select a combination of rate, resolution, etc., for example, to provide a quality (e.g., a best quality) of experience on a client device at a (e.g., each) instance of time.
  • a client may issue an HTTP GET request to download a segment, for example, when a client determines an optimal URL to use.
  • MPEG DASH may be built on top of a ubiquitous HTTP/TCP/IP stack.
  • DASH may define a manifest format, e.g., Media Presentation Description (MPD).
  • DASH may define segment formats, for example, for International Organization for Standardization (ISO) Base Media File Format (BMFF) and/or MPEG-2 Transport Streams (TS).
  • DASH may define a set of quality metrics at network, client operation and/or media presentation levels. Quality metrics may enable an interoperable technique, for example, to monitor Quality of Experience and Quality of Service.
  • a DASH concept may be a representation.
  • a representation may be defined as a single encoded version of an asset (e.g., a complete asset) or a subset of an asset's components.
  • a DASH (e.g., DASH264) representation may be, for example, an ISO-BMFF file comprising unmultiplexed 2.5 Mbps 720p Advanced Video Coding (AVC) video, with audio carried separately, e.g., as MPEG-4 Advanced Audio Coding (AAC).
  • a single transport stream comprising video, audio, and subtitles may be a single multiplexed representation.
  • a structure may be combined.
  • video and English audio may be a single multiplexed representation, while Spanish and Chinese audio tracks may be separate unmultiplexed representations.
  • a segment may be a minimal or smallest individually addressable unit of media data.
  • a segment may be, for example, an entity that may be downloaded using URLs advertised via the MPD.
  • a media segment may be a 4-second part of a live broadcast, where playout time starts at 0:42:38, ends at 0:42:42, and may be available within a 3-minute time window.
  • a media segment may be a complete on-demand movie, which may be available for part or all of a period a movie is licensed.
  • An MPD may be an extensible markup language (XML) document.
  • An XML document may advertise the available media and may provide information for a client, for example, to select a representation, make adaptation decisions and/or retrieve segments from a network.
  • An MPD may be independent of a segment.
  • An MPD may signal properties, for example, to determine whether a representation may be played.
  • An MPD may signal functional properties (e.g., an indication whether segments start at random access points).
  • An MPD may use a hierarchical data model to describe a presentation.
  • a representation may be the lowest conceptual level of a hierarchical data model.
  • An MPD may signal information, such as bandwidth, codecs for presentation and/or techniques to construct URLs for accessing segments. Additional information may be provided, such as trick mode, random access information, layer and view information for scalable and multiview codecs and/or generic schemes that may be supported by a client wishing to play a given representation.
  • DASH may provide rich and flexible URL construction functionality.
  • a single monolithic per-segment URL may be possible in DASH.
  • DASH may allow dynamic URL construction, for example, using templates.
  • a list (e.g., an explicit list) of URLs and byte ranges may, for example, reach several thousand elements per representation, such as when short segments are used.
  • DASH may permit use of predefined variables, such as segment number, segment time, etc., and printf-style syntax, for example, for on-the-fly construction of URLs using templates. Segments may be listed (e.g., seg_00001.ts, seg_00002.ts, . . ., seg_03600.ts).
  • a single line, e.g., seg_$Index%05d$.ts, may be written, for example, to express any number of segments.
  • a single line may be written, for example, even when segments may not be retrieved at the time an MPD is fetched; a small expansion sketch follows below.
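  • As an illustration of the printf-style syntax, a short Python sketch that expands a $Index%05d$-style placeholder into concrete segment names. The $Index$ variable name follows the example above; standard DASH templates use names such as $Number$ and $Time$, which follow the same pattern.

```python
import re

def expand_template(template, index):
    """Expand a $Index%05d$-style placeholder using printf-style formatting."""
    return re.sub(
        r"\$Index(?:%([0-9]*d))?\$",
        lambda m: ("%" + (m.group(1) or "d")) % index,
        template,
    )

# seg_00001.ts ... seg_03600.ts, all expressed by one template line:
print(expand_template("seg_$Index%05d$.ts", 1))     # seg_00001.ts
print(expand_template("seg_$Index%05d$.ts", 3600))  # seg_03600.ts
```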
  • Multi-segment representations may be used with templates, e.g., in DASH-AVC/264, for efficiency.
  • Different representations of an (e.g., the same) asset or a (e.g., the same) component, such as in an un-multiplexed case, may be grouped into adaptation sets.
  • Representations within an adaptation set may render the same content (e.g., video content).
  • a client may switch between representations.
  • an adaptation set may include 10 representations with video encoded with different rates and resolutions.
  • a client may switch between representations at segment or subsegment granularity while presenting the same video content to the viewer.
  • a seamless representation switch may occur, for example, under one or more segment-level restrictions.
  • Restrictions may be specified in DASH profiles and DASH subsets adopted by multiple SDOs. Segment restrictions may be applied to one or more (e.g., all) representations within an adaptation set. Bitstream switching may occur.
  • a time-limited subset of the presentation may be referred to as a period.
  • Adaptation sets may be valid within a period. Adaptation sets in different periods may or may not contain similar representations (e.g., in terms of codecs, rates, etc.).
  • An MPD may contain a single period for the duration of the asset. Periods may be used for advertisement (ad) markup. Different periods may be dedicated to parts of the asset itself and/or to an advertisement.
  • An MPD may be an XML document that presents a hierarchy.
  • a hierarchy may comprise, for example, global presentation-level properties (e.g., timing), period-level properties and adaptation sets available for periods. Representations may be the lowest level of a hierarchy.
  • DASH may use a simplified version of XLink, for example, to permit loading parts of the MPD (e.g., periods) in real-time from a remote location.
  • An example may be ad insertion, such as when precise timing of ad breaks is known ahead of time.
  • Ad servers may determine an ad in real-time.
  • a dynamic MPD may change and may be periodically reloaded by a client.
  • a static MPD may be valid for a (e.g., an entire) presentation.
  • a static MPD may be used, for example, for Video on Demand (VoD) applications.
  • a dynamic MPD may be used, for example, for live and Personal Video Recorder (PVR) applications.
  • a media segment may be a type of segment.
  • Media segments may be time-bounded parts of a representation. Approximate segment durations may appear in an MPD. Segment duration may or may not be the same for all segments. As an example, DASH-AVC/264 may use segments with durations within a 25% tolerance margin.
  • An MPD may comprise information regarding media segments that are unavailable at the time the MPD is read by the client, for example, in a live broadcast scenario. Segments may be (e.g., only) available within a well-defined availability time window. A time window may be calculated from the wall-clock time and/or segment duration.
  • An index segment is a type of segment. Index segments may appear as side files and/or within media segments. An index segment may comprise, for example, timing and/or random access information. Indexes may permit efficient implementation of random access and trick modes. Index segments may be used for efficient bitstream switching. Indexing may be used, for example, in VoD and PVR type applications.
  • Bitstream switching may be implemented, for example, using segment-level and/or representation-level properties.
  • DASH may provide functional (e.g., explicit functional) requirements for properties. Properties may be expressed in the MPD in a format-independent way. A (e.g., each) segment format specification may comprise format-level restrictions that correspond to generic requirements.
  • Media segment i of representation R may be denoted as S_R(i), having a duration d_R(i) and an earliest presentation time EPT(S_R(i)).
  • Switching efficiency may be based in part on time alignment of segments for representations within an adaptation set.
  • Eq. 1 may define a relationship for a (e.g., every) pair of representations R1, R2 within an adaptation set; for example, time alignment may require EPT(S_R1(i)) = EPT(S_R2(i)) and d_R1(i) = d_R2(i) for each segment index i (Eq. 1).
  • a segment may start with a random access point of certain types. Switching may occur at a segment border with or without overlapped downloads and dual decoding. Bitstream switching may occur at a subsegment level, for example, when indexing is used and similar requirements hold for subsegments. Time alignment and random access point placement restrictions may be applied. Restrictions may translate into encodings with matching instantaneous decoder refresh (IDR) placement.
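  • A small Python check of the alignment relationship reconstructed as Eq. 1 above; the tuple-based segment-timeline format is an illustrative assumption.

```python
def time_aligned(rep1, rep2, tol=0.0):
    """Check the Eq. 1-style relationship: matching EPT and duration per index.

    rep1, rep2: lists of (earliest_presentation_time, duration), one per segment."""
    return len(rep1) == len(rep2) and all(
        abs(e1 - e2) <= tol and abs(d1 - d2) <= tol
        for (e1, d1), (e2, d2) in zip(rep1, rep2)
    )

# Two representations with 4-second segments starting at t=0 are aligned:
a = [(0.0, 4.0), (4.0, 4.0), (8.0, 4.0)]
b = [(0.0, 4.0), (4.0, 4.0), (8.0, 4.0)]
assert time_aligned(a, b)
```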
  • FIG. 1 is a block diagram of an example of a DASH system model.
  • a DASH system model may comprise a DASH client 1000.
  • FIG. 1 may illustrate logical components of a conceptual DASH client model.
  • a DASH client 1000 may comprise an access client (e.g., an HTTP client, a DASH access engine 1020), a media engine 1040, and an application 1060.
  • the DASH access engine 1020 may receive the MPD from a DASH server, construct and issue requests, and receive segments or parts of segments from the DASH server.
  • the DASH access engine 1020 may pass events and/or timing to the application 1060.
  • a DASH server may generate MPD files and/or encode video content. Interfaces may be defined, e.g., on-the-wire formats of the MPD and segments.
  • the DASH access engine 1020 may comprise a JavaScript implementation.
  • the media engine 1040 may decode, output, and present media.
  • the output of the DASH access engine 1020 may comprise media in MPEG container formats (e.g., MP4 file format or MPEG-2 transport stream) and timing information that maps internal timing of the media to a timeline of presentation.
  • the DASH access engine 1020 may include an HTTP server. It may be assumed that a combination of encoded chunks of media and/or timing information may be sufficient for correct rendering of video content.
  • the media engine 1040 may be provided by a browser, a browser plugin (e.g., Flash or Silverlight), and/or an operating system.
  • Timing behavior of a DASH client may be defined.
  • segments (e.g., segments mentioned in a manifest) may not all be available at a given time.
  • a client may poll (e.g., continuously) for updated manifests in Apple HLS.
  • DASH MPD may reduce polling behavior, for example, by defining MPD update frequency and allowing explicit calculation of segment availability.
  • a static MPD may be valid (e.g., always valid, or valid for the entire interaction between the DASH client and the DASH server).
  • a dynamic MPD may be valid, for example, from the time it was fetched by the client.
  • a refresh period may be explicitly stated.
  • An MPD may have a notion of versioning. As an example, an MPD may explicitly expose its publication time.
  • An MPD may provide an availability time of the earliest segment of a period, T_A(0).
  • Media segment n may be available, for example, starting from a time given by Eq. 2: T_A(n) = T_A(0) + sum of d(i) for i = 0 to n-1, where d(i) may denote the duration of media segment i (Eq. 2).
  • Media segment n may be available for the duration of the timeshift buffer T_s, which may be explicitly stated in an MPD.
  • Availability window size may impact catch-up TV functionality of a DASH deployment. Segment availability time may be relied upon by an access client, for example, when segment availability time falls within an MPD validity period.
  • MPD may declare bandwidth B for a representation R.
  • MPD may define a global minimum buffering time, BT.
  • An access client may pass a segment to the media engine, for example, after B*BT bits are downloaded. A segment may start with a random access point.
  • the earliest time that a segment n may be passed to a media engine may be, for example, T_A(n) + T_d(n) + BT, where T_d(n) may represent the download time of segment n.
  • a DASH client may start playout (e.g., in real-time), for example, to minimize delay.
  • MPD may propose a presentation delay as an offset from T_A(n), for example, to synchronize different clients.
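  • The timing rules above (Eq. 2 and the earliest-playout expression) can be mirrored directly in code. A Python sketch under the reconstruction given above; variable names are illustrative.

```python
def availability_time(t0, durations, n):
    """Eq. 2: T_A(n) = T_A(0) + sum of d(i) for i in 0..n-1."""
    return t0 + sum(durations[:n])

def earliest_playout(t0, durations, n, download_time, buffering_time):
    """Earliest time segment n may be passed to the media engine:
    T_A(n) + T_d(n) + BT."""
    return availability_time(t0, durations, n) + download_time + buffering_time

durs = [4.0] * 10                                   # 4-second segments
print(availability_time(1000.0, durs, 3))           # 1012.0
print(earliest_playout(1000.0, durs, 3, 0.5, 2.0))  # 1014.5
```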
  • a tight synchronization of segment HTTP GET requests may create a thundering herd effect, which may tax infrastructure.
  • MPD validity and segment availability may be calculated, for example, using absolute (e.g., wall-clock) time.
  • Media time may be expressed within segments.
  • Drift may develop between the encoder and client clocks, for example, in a live scenario. Drift may be addressed at a container level, for example, when MPEG-2 TS and ISO-BMFF provide synchronization functionality.
  • HTTP may be stateless and client-driven. Push-style events may be emulated using frequent polls. Upcoming ad breaks may be signaled before (e.g., 3 to 8 seconds before) they start, for example, to insert ads in cable/Internet Protocol Television (IPTV) systems.
  • Events may be "blobs," for example, with explicit time and duration information and application-specific payloads.
  • Inband events may be message boxes appearing at a beginning of a media segment(s).
  • MPD events may be a period-level list of timed elements.
  • DASH may define an MPD validity expiration event.
  • a validity expiration event may identify an early MPD version that is valid after a given presentation time.
  • the early MPD version may be the earliest version that is valid after a given presentation time.
  • DASH may be agnostic to content protection, such as digital rights management (DRM).
  • DASH may support signaling a DRM scheme and its properties, e.g., within an MPD.
  • a DRM scheme may be signaled, for example, via a ContentProtection descriptor.
  • a value (e.g., an opaque value) may be passed within a descriptor.
  • a DRM scheme may be signaled, for example, using a unique identifier and/or a definition of the meaning of the passed value.
  • a DRM scheme may be signaled, for example, using a scheme-specific namespace.
  • MPEG may define protection standards, such as Common Encryption for ISO-BMFF (CENC) and Segment Encryption and Authentication.
  • Common encryption may standardize which parts of a sample are encrypted and how encryption metadata is signaled within a track.
  • a DRM module may deliver encryption keys to the client, for example, based on encryption metadata in a segment. Decryption may use, for example, Advanced Encryption Standard Counter Mode (AES-CTR) or AES-Cipher Block Chaining (CBC) modes.
  • CENC may be extensible, e.g., to use other encryption algorithms.
  • Common Encryption may be used with DRM systems. Common Encryption may be used in DASH-AVC/264.
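  • For concreteness, a minimal sketch of decrypting one encrypted sample with AES-CTR (the first mode named above), using the Python cryptography package. Key delivery and counter-block derivation are handled by a DRM module in practice; the function shown is illustrative only.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def decrypt_sample_ctr(key: bytes, counter_block: bytes, ciphertext: bytes) -> bytes:
    """Decrypt one encrypted sample with AES-CTR.

    key: 16-byte content key (delivered by a DRM module in practice).
    counter_block: 16-byte initial counter derived from the sample's
    encryption metadata (the derivation is scheme-specific and not shown)."""
    decryptor = Cipher(algorithms.AES(key), modes.CTR(counter_block)).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()
```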
  • DASH Segment Encryption and Authentication may be agnostic to the segment format.
  • Encryption metadata may be passed via the MPD.
  • MPD may comprise information on which key is used for segment decryption and/or how to obtain a key.
  • AES-CBC encryption and HTTPS-based key transport may be utilized, for example, in Apple HLS.
  • MPEG-2 TS media segments may be compatible with encrypted HLS segments.
  • DASH-SEA may be extensible.
  • DASH-SEA may permit other encryption algorithms and/or DRM systems, which may be similar to CENC.
  • DASH-SEA may provide a segment authenticity framework.
  • a framework may, for example, confirm that a segment received by a client is the segment an MPD author intended the client to receive. Confirmation may be performed, for example, using message authentication codes (MAC) and/or digest algorithms. Confirmation may prevent unauthorized content modification.
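  • A short Python sketch of both verification styles named above (MAC-based and digest-based), using the standard hmac and hashlib modules. How the expected values are conveyed (e.g., via the MPD) is not shown and is left as an assumption.

```python
import hashlib
import hmac

def segment_digest(segment: bytes) -> str:
    """Digest-style check: an MPD author could publish this value per segment."""
    return hashlib.sha256(segment).hexdigest()

def segment_mac(key: bytes, segment: bytes) -> str:
    """MAC-style check: requires a key shared between author and client."""
    return hmac.new(key, segment, hashlib.sha256).hexdigest()

def is_authentic(expected_digest: str, received: bytes) -> bool:
    return hmac.compare_digest(expected_digest, segment_digest(received))
```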
  • content quality may be reduced, for example, to reduce content latency to a processing or serving location.
  • Content may be streamed (e.g., over the Internet) at the time of content acquisition, for example, using improvements in mobile networks (e.g., LTE-A) and WiFi ubiquity.
  • a consumer grade network may include a network that is available to the general public, for example, at a fee.
  • a consumer grade network may comprise one or multiple network access technologies such as 3G cellular, 4G cellular, 5G cellular, WiFi, and/or the like.
  • a consumer grade network may be terrestrial (e.g., entirely terrestrial).
  • a consumer grade network may comprise satellite-based Internet.
  • a private network may be a dedicated network for a selected few businesses, for example, an extraterrestrial satellite network dedicated to a broadcasting company such as NBC, ABC, FOX, etc.
  • a device may comprise multiple antennas (e.g., Wi-Fi and LTE), for example, to increase bandwidth, e.g., to send high quality content (e.g., video content).
  • Multiple devices, e.g., multiple Long Term Evolution (LTE) modems and/or multiple WiFi connections to different networks, may be used to increase bandwidth.
  • Codecs, such as High Efficiency Video Coding (HEVC), may reduce upload bandwidth.
  • DACS may include a protocol for dynamic adaptive contribution streaming.
  • DACS may include a client and a server.
  • the protocol may include a contribution media description (CMD).
  • the CMD may provide a manifest file, e.g., for uploaded segments.
  • the CMD may be static or dynamic. For example, the CMD may be updated periodically.
  • One or more versions of the CMD may be sent to a server (e.g., to the cloud).
  • Segments may be encoded, for example, on a consumer grade encoder on a device.
  • the device may be a WTRU including a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, a virtual reality headset, consumer electronics, and the like.
  • the device may include a mobile phone, digital single-lens reflex camera (DSLR), professional camera, and/or the like.
  • Encoded segments may be sent (e.g., uploaded) over a unicast protocol, such as HTTP or Google QUIC (see the upload sketch below).
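  • A minimal upload helper, assuming the widely used Python requests library; the URL layout and content type are placeholders, and HTTP POST would work analogously.

```python
import requests  # widely used third-party HTTP client

def upload_segment(base_url: str, name: str, data: bytes) -> bool:
    """Upload one encoded segment with HTTP PUT; POST works analogously."""
    resp = requests.put(f"{base_url}/{name}", data=data,
                        headers={"Content-Type": "video/mp4"}, timeout=30)
    return resp.ok

# e.g., upload_segment("https://example.com/contrib", "seg_00001.mp4", payload)
```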
  • An uploading client may include, e.g., an upload engine.
  • the client may measure available bandwidth, e.g., continuously.
  • An uploading client may upload segments from a representation with adequate bandwidth.
  • An uploading client may inform or command an encoder to change a representation (e.g., to change encoding settings such as a target rate or quality level), for example, when the measured available bandwidth changes.
  • the encoder may be integrated in the device or separate from the device.
  • a device may include an adaptive encoder and/or a multi-rate encoder.
  • An adaptive encoder (AE) may output a (e.g., a single) encoded stream with capability to reconfigure (e.g., dynamically) rate control and/or output rate (e.g., in real-time).
  • the output rate may be set by a DACS client, for example, based on the available bandwidth (e.g., available upload throughput).
  • the output rate may be updated, for example, when the measured available bandwidth changes.
  • the AE may be implemented in software, hardware, or a combination of hardware and software.
  • the AE may comprise hardware for outputting an encoded stream and/or dynamically reconfiguring rate control and/or output rate in real-time.
  • the AE may comprise software (e.g., a software encoder) for outputting an encoded stream and/or dynamically reconfiguring rate control and/or output rate in real-time.
  • a multi-rate encoder (MRE) may output simultaneous streams at different rates.
  • the output rates may be set by a DACS client, for example, based on the available bandwidth (e.g., the available upload throughput).
  • the output rates may be updated, for example, when the measured available bandwidth changes over time.
  • the MRE may be implemented in software, hardware, or a combination of hardware and software.
  • an MRE may comprise hardware for producing multiple different encodings of content at different rates.
  • the MRE may be implemented by running multiple software encoders concurrently (e.g., in different threads), or by having a single software encoder that has the capability of outputting multiple different encoding rates, or of switching contexts/states to alternately service multiple different encoding tasks; a threaded sketch follows below.
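  • A sketch of the multiple-software-encoders-in-threads option, using Python's concurrent.futures; the encoder object and its encode(raw, bitrate=...) signature are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

RATES = [500_000, 1_500_000, 4_000_000]  # hypothetical MRE output rates (bps)

def encode_multi_rate(encoder, raw_segment, rates=RATES):
    """Encode one raw segment at several rates concurrently, one thread per
    rate, mirroring the multiple-software-encoders option described above.
    encoder.encode(raw, bitrate=...) is an assumed interface."""
    with ThreadPoolExecutor(max_workers=len(rates)) as pool:
        futures = {rate: pool.submit(encoder.encode, raw_segment, bitrate=rate)
                   for rate in rates}
        return {rate: future.result() for rate, future in futures.items()}
```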
  • a device may determine (e.g., measure) a change in bandwidth of a communication channel with a server (e.g., a communication channel of a consumer grade network) when determining the encoding rate of content (e.g., the encoding rate of a segment of video content).
  • the device may monitor available bandwidth for a wireless communication channel (e.g., of a consumer grade network).
  • the device may determine a change of the available bandwidth of the wireless communication channel (e.g., of the consumer grade network) and/or adapt the encoding rate to the change of the available bandwidth of the wireless communication channel .
  • the device may monitor an aggregated available bandwidth of one or more network access types (e.g., WiFi, Cellular).
  • the device may determine a change of the aggregated available bandwidth of the one or more network access types and/or adapt the encoding rate to the change of the aggregated available bandwidth of the one or more network access types.
  • the device may switch between the one or more network access types. For example, the device may switch between WiFi and 3G, 3G and 4G, 4G and 5G, etc.
  • the device may determine that the bandwidth of a different wireless communication channel and/or network is better than the bandwidth of the current channel used for uploading the content. In such instances, the device may switch to the different channel and/or different network, and continue to encode and upload content (e.g., in real-time); a bandwidth-monitoring sketch follows below.
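  • The monitoring bullets above only require that available (possibly aggregated) bandwidth be tracked over time. One common way to do that, shown here as an illustrative Python sketch, is an exponentially weighted moving average of per-segment upload throughput; the smoothing factor is arbitrary and not prescribed by the text.

```python
class BandwidthMonitor:
    """Exponentially weighted estimate of upload throughput, updated per upload."""

    def __init__(self, alpha=0.3):           # smoothing factor is illustrative
        self.alpha = alpha
        self.estimate_bps = None

    def record(self, bytes_sent: int, seconds: float) -> float:
        sample = 8 * bytes_sent / seconds    # bits per second for this upload
        if self.estimate_bps is None:
            self.estimate_bps = sample
        else:
            self.estimate_bps = (self.alpha * sample
                                 + (1 - self.alpha) * self.estimate_bps)
        return self.estimate_bps

def aggregate(monitors) -> float:
    """Aggregated available bandwidth across access types (e.g., WiFi + cellular)."""
    return sum(m.estimate_bps or 0.0 for m in monitors)
```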
  • a device may encode segments of video content at one or more rates, for example, in real-time.
  • the device may receive video content (e.g., a portion of the entirety of the video content), encode the video content to generate a segment of the video content, and then send the segment to a server in real-time.
  • the device may receive, encode, and send segments to the server before the device has received the entirety of the video content (e.g., while the device is receiving the video content).
  • a complete video content may not be received and/or stored by the device before the device sends the segments to the server.
  • the device may adaptively determine the rate to encode subsequent segments of the content (e.g., based on the available channel bandwidth) as the device is receiving and sending segments of the content to a remote site (e.g., a server).
  • a device may upload media segments at a plurality of rates, for example, to permit a server to provide a multi-rate DASH presentation offering. Higher-quality segments that were not uploaded in time may be uploaded afterwards, for example, to permit a replay of footage at a higher quality than quality available in real-time.
  • base layer segments may be uploaded (e.g., uploaded as content is captured and encoded, or uploaded substantially in real-time) while enhancement layers that may not be uploaded initially (e.g., could not be uploaded in real-time) may be uploaded later.
  • a multi-rate encoding client may, if CPU power permits, encode the multiple rates as the content is captured or acquired and may store the multiple rates locally for uploading later (e.g., may schedule uploading of the resulting multiple-rate content segments based on available upload throughput).
  • a multi-rate encoding client may store the original captured content, or may initially produce a high quality (e.g., lightly compressed) version of the content for local storage, and may then use the stored original content or the stored high quality version of the content to encode the multiple rate versions of the content for uploading over a time period.
  • segments of a first representation at a first rate may be uploaded via a first network interface, while segments of a second representation at a second rate may be uploaded via a second network interface.
  • the i-th segment may be uploaded via interface mod(i, N), for example, when N network interfaces are available (see the sketch below).
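  • A one-line illustration of the mod(i, N) schedule in Python; the interface names are hypothetical.

```python
def interface_for_segment(i: int, interfaces: list):
    """Round-robin schedule: segment i goes out via interface mod(i, N)."""
    return interfaces[i % len(interfaces)]

nets = ["wifi0", "lte0", "lte1"]  # hypothetical interface names
print([interface_for_segment(i, nets) for i in range(6)])
# ['wifi0', 'lte0', 'lte1', 'wifi0', 'lte0', 'lte1']
```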
  • Different segments of the same representation may be uploaded via different network interfaces.
  • segments of a first representation at a first rate may be uploaded substantially in real-time (e.g., as the content is captured and is encoded at the first rate) via a first network interface, while segments of a second representation at a second rate may be uploaded via a second network interface with additional latency relative to the corresponding segments of the first representation.
  • Multiple interfaces may be on the same network (e.g., "bonded" interfaces) and/or on different networks. Bonding may occur at an application level or a driver/kernel level.
  • An adaptive upload protocol may be implemented based on a hypertext transfer protocol (HTTP).
  • An HTTP-based implementation of an adaptive upload protocol may have some similarities and/or differences relative to moving picture experts group (MPEG) DASH. Segments may be uploaded, for example, using HTTP PUT or POST techniques.
  • Structure of an MPD (e.g., or of a similar manifest adapted to be used for adaptive uploading of content) may be extended, for example, with upload-related elements.
  • Table 1 provides an example of an element that may convey an upload URL. Locally stored segment URLs may be implemented, for example, using a file:// URL scheme. Table 1 - Example of an element to convey an upload URL
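  • The markup of Table 1 did not survive extraction. As a stand-in, here is a hedged Python sketch that emits a CMD-like manifest carrying an UploadBaseUrl element (the element name appears later in the text); the Period/Representation nesting, borrowed from MPD, is an assumption about the rest of the schema.

```python
import xml.etree.ElementTree as ET

def build_cmd(upload_base_url: str, rates_bps: list) -> bytes:
    """Emit a CMD-like manifest carrying an UploadBaseUrl element.

    UploadBaseUrl is named in the text; the surrounding structure is assumed."""
    cmd = ET.Element("CMD")
    ET.SubElement(cmd, "UploadBaseUrl").text = upload_base_url
    period = ET.SubElement(cmd, "Period")
    for rate in rates_bps:
        ET.SubElement(period, "Representation", bandwidth=str(rate))
    return ET.tostring(cmd, encoding="utf-8")

print(build_cmd("https://example.com/contrib/", [500_000, 1_500_000]).decode())
```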
  • AE and/or MRE output rates may be listed in a contribution media description (CMD) file.
  • a CMD may provide a manifest, e.g., for uploaded segments.
  • the CMD may be based on MPD.
  • the CMD may reuse syntax elements of the well-known MPD manifest format.
  • the CMD may have additional elements such as, for example, the UploadBaseUrl (e.g., as defined in Table 1).
  • the CMD may include one or more of the determined rate, a coding parameter, resolution, information on a codec stream, initialization information, metadata associated with the encoded video content, suggested URL routes, time to announce the manifest file, time resolution of a segment, and a time window.
  • the CMD may be updated, for example, based on a change in the available bandwidth.
  • the updated CMD may be sent to a server via a wireless communication channel of a consumer grade network for communication with a server.
  • a DACS client may, for example, run on an acquisition device (e.g., mobile phone, DSLR, professional camera).
  • a device running DACS may be connected to an acquisition device, for example, using a non-compressed and/or high quality video interface, such as serial digital interface (SDI), High-Definition Multimedia Interface (HDMI), or IP (e.g., using Society of Motion Picture and Television Engineers (SMPTE) 2022 and/or high quality (e.g., mezzanine quality) H.264, H.265 or MPEG-2 for video).
  • Received footage may be encoded, for example, using a single-rate adaptive or multi-rate encoder.
  • Footage may be augmented, for example, with acquisition information, such as location, altitude, direction, and camera and lens identity and parameters, etc. The augmentation may be provided in the CMD manifest generated by the DACS client and/or may be sent inband within the uploaded content segments.
  • Encoded information may be uploaded, for example, to the cloud.
  • the uploaded encoded information may be made available for streaming, for example, after transcoding into distribution formats and editing.
  • the uploaded segments may be made available (e.g., as uploaded, or after some processing such as transcoding and/or editing) as a DASH presentation.
  • DACS (e.g., a DACS client) may generate and/or update a CMD.
  • One or more versions of the CMD may be uploaded to the cloud.
  • FIG. 2 is a block diagram of an example DACS system 200.
  • a DACS client 220 may connect to an acquisition device 210. The DACS client 220 may be integrated with the acquisition device 210.
  • the DACS client 220 may comprise acquisition capabilities, e.g., the DACS client 260.
  • DACS protocol may be used to provide and/or update the CMD (e.g., a manifest of uploaded segments), e.g., to a DACS server 240.
  • DACS protocol may be used to upload segments (e.g., content such as video content segments, which may be of varying bitrates) to the DACS server 240.
  • Information such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 220 to the DACS server 240 through Network 230, for example, via HTTP PUT or HTTP POST techniques.
  • Information such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 260 to the DACS server 240 through Network 250, for example, via HTTP PUT or HTTP POST techniques.
  • the DACS server 240 may communicate with a DASH server 270, e.g., the DACS server 240 may send the information to the DASH server 270 and/or communicate with the DASH server 270 in a similar manner as the DACS server 240 communicates with the DACS client 260.
  • the DACS server 240 may track the communication with the DASH server 270.
  • the DASH server 270 may send content segments, an MPD file, and/or updated MPD files to the DASH client 290 via the network 280.
  • the communication between the DASH server 270 and the DASH client 290 may follow DASH protocol.
  • the DACS server 240 or another cloud-based entity may offer the provided video content segments as a DASH presentation.
  • the DACS server 240 may have DASH server functionality, and/or the uploaded video content segments at the various rates provided by the DACS client 260 may be transferred to a separate DASH server 270.
  • a suitable MPD (e.g., or multiple such MPDs, in dynamic fashion) may be generated.
  • the one or multiple MPDs may advertise the uploaded media segments at the various rates to a DASH client 290, which, for example, may then request, retrieve and play back the media segments according to the DASH specifications.
  • the one or multiple MPDs may be generated, for example, by the DACS server 240 which received the CMDs and/or the uploaded segments, by a DASH server 270 to which the uploaded video content was transferred, and/or by another cloud-based entity capable of generating MPDs.
  • the one or multiple MPDs may be made available to a DASH client 290 (e.g., by the DASH server 270, or by other means).
  • the DASH server 270 may generate an MPD and/or may offer content streaming of a presentation to the DASH client 290 while the DASH Server 270 is receiving content for the presentation from the DACS Server 240.
  • the DASH server 270 may generate an MPD and/or may offer content streaming of a presentation to the DASH client 290 while the DACS Server 240 is receiving uploaded segments for the presentation from the DACS client (e.g., 220 and/or 260).
  • the DASH server 270 may generate an initial MPD based on the content segments that are available at the DASH server 270.
  • the DASH server 270 may generate (e.g., dynamically) one or more updated MPDs that reflect additional content of the presentation when such content is received by the DASH Server 270 and/or is available for streaming.
  • a DACS client may upload information, such as one or more CMDs and content segments, via more than one network, e.g., via more than one network interface.
  • a DACS client may include an AE and/or an MRE.
  • the DACS client in FIG. 3 and FIG. 4 may include an AE.
  • the DACS client in FIG. 5 and FIG. 6 may include an MRE.
  • a DACS client may receive video content, e.g., by recording the video content, or by receiving the video content from a connected content acquisition device.
  • the DACS client may determine the available bandwidth that a consumer grade network may offer. Through the consumer grade network, the DACS client may communicate with a server.
  • the DACS client may determine a rate(s) at which the DACS client may encode the video content based on the available bandwidth.
  • the DACS client may monitor the available bandwidth.
  • when the DACS client detects a change in the available bandwidth, the DACS client may switch to a different rate at which the DACS client may encode subsequent segments.
  • the DACS client may send a DACS server a CMD that includes the rate(s).
  • the CMD may be updated, e.g., based on the available bandwidth.
  • An adaptive encoder may be an encoder that outputs an (e.g., a single) encoded stream with capability to reconfigure (e.g., dynamically) rate control and/or output bitrate.
  • a DACS client may set an AE bitrate, for example, as needed or desired.
  • the bitrate may be set based on the available upload throughput, for example.
  • a DACS client may switch to a different bitrate, for example, due to network conditions, and/or may set the AE bitrate appropriately.
  • a DACS client may insert a new period into the CMD.
  • the DACS client may upload the result as an updated CMD.
  • FIG. 3 is an interaction diagram showing an example of DACS client flow 300.
  • FIG. 3 illustrates CMD and media segments sent (e.g., uploaded) from a DACS client 320 with an AE.
  • the DACS client 320 may provide a CMD that may enumerate multiple rates such as rates A, B, and/or C at 322.
  • Information, such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 320 to the DACS server 340 through a network via HTTP PUT or HTTP POST techniques, e.g., as shown in a communication 322.
  • the network may include a consumer grade network for communication between the DACS client 320 and the DACS server 340.
  • the DACS client 320 may stream media content in multiple rates in real-time.
  • the DACS client 320 may adaptively switch between multiple rates, such as rates A, B, and/or C when providing the content. Switching may occur, for example, based on available bandwidth of a wireless communication channel of the consumer grade network that is used for communication between the DACS client 320 and the DACS server 340.
  • the DACS client 320 may determine to encode media segments at rate A, for example, based on the available bandwidth of the wireless communication channel or the DACS client 320 may be configured to begin encoding at a particular rate independent of the communication channel.
  • the DACS client 320 may encode one or more media segments at rate A, and upload the media segments to the DACS server 340 at 324-324N.
  • the DACS client 320 may monitor (e.g., continuously monitor) the available bandwidth of the wireless communication channel of the consumer grade network.
  • the DACS client 320 may determine that the available bandwidth of the wireless communication channel of the consumer grade network changes.
  • the DACS client may determine that the wireless communication channel can now accommodate a segment encoded at rate C at 325. Accordingly, the DACS client 320 may encode one or more subsequent media segments at rate C, and upload the media segments to the DACS server 340 at 326-326N. The DACS client 320 may determine additional changes in the bandwidth of the wireless communication channel. For example, the DACS client 320 may determine to encode subsequent media segments at rate B at 327 (e.g., the channel may no longer be able to accommodate segments encoded at rate C for real-time streaming). Accordingly, the DACS client 320 may encode one or more subsequent media segments at rate B, and upload the media segments to the DACS server 340 at 332-332N. Although not illustrated, the DACS client flow 300 may include HTTP responses from the DACS server 340.
  • FIG. 4 is an interaction diagram showing an example of DACS client flow 400.
  • FIG. 4 illustrates CMD and media segments sent (e.g., uploaded) from a DACS client 420 with an AE.
  • the DACS client 420 may upload media segments, for example, according to one or more of rates defined in an updated (e.g., most recently updated) CMD.
  • the DACS client 420 may adaptively switch between rates, for example, based on available bandwidth in a wireless communication channel for communication between the DACS client 420 and the DACS server 440.
  • the wireless communication channel may be that of a consumer grade network.
  • the DACS client 420 may send a dynamic CMD to the server 440 at 422.
  • the CMD may specify one or more rates (e.g., rate A) and/or an expiration time of the CMD.
  • Information such as the CMD, updated CMDs, and/or video content segments may be sent from the DACS client 420 to the DACS server 440 through a network via HTTP PUT or HTTP POST techniques, e.g., as shown in a communication 422.
  • the network may include a consumer grade network for communication between the DACS client 420 and the DACS server 440.
  • the DACS client 420 may determine to encode media segments at rate A and stream the media content (e.g., video content) in real-time.
  • the DACS client 420 may make the determination based on the available bandwidth of the wireless communication channel of the consumer grade network that is used for communication between the DACS client 420 and the DACS server 440.
  • the DACS client 420 may be configured to begin encoding at rate A. For example, the DACS client 420 may determine to encode at rate A independent of the available bandwidth of the communication channel.
  • the DACS client 420 may encode one or more media segments at rate A and upload them to the DACS server 440.
  • the DACS client 420 may adaptively switch to a different rate and/or switch from the different rate to rate A. Switching may occur, for example, based on the available bandwidth of the wireless communication channel with the DACS server 440 (e.g., a wireless communication channel of the consumer grade network).
  • the DACS client 420 may monitor (e.g., continuously monitor) available bandwidth of the wireless communication channel.
  • the DACS client 420 may determine that available bandwidth of the wireless communication channel changes. For example, the DACS client 420 may determine that the wireless communication channel may accommodate a segment encoded at rate B at 426.
  • the DACS client 420 may update the CMD to include, e.g., additional rates that a changing bandwidth of a wireless communication channel may accommodate.
  • the DACS client 420 may update the CMD to include rate B.
  • the DACS client 420 may send the updated CMD to the DACS server (e.g., in real-time) at 428.
  • the updated CMD may include information related to the segment that may be encoded at the additional rate.
  • the DACS client 420 may send the updated CMD to the DACS server (e.g., in real-time) at 428 through the wireless communication channel via HTTP PUT or HTTP POST techniques, e.g., as shown at 428.
  • the DACS client 420 may encode one or more subsequent media segments at rate B.
  • the DACS client 420 may monitor (e.g., continuously measure) the available bandwidth and adaptively switch the encoding rate of the video content (e.g., in real-time) based on the change in available bandwidth. For example, the DACS client 420 may determine to encode subsequent segments at rate A at 432. For example, the wireless communication channel may no longer be able to accommodate segments encoded at rate B for real-time streaming. The DACS client 420 may encode one or more subsequent media segments at rate A, and upload the media segments to the DACS server 440 at 434-434N (not shown in FIG. 4). Although not illustrated, the DACS client flow 400 may include HTTP responses from the DACS server 440.
  • a DACS client may upload the entirety of media content (e.g., a video) but do so by uploading segments encoded according to a plurality of different rates.
  • the DACS client 320 may upload a varying, multi-rate encoding of the media content to the server in real-time (e.g., based on the available bandwidth at the time of encoding segments of the content), and subsequently upload segments of the content at rates that were not part of the real-time transmission (e.g., backfill to ensure that the server has a complete version of the content at each of the plurality of rates).
  • the DACS client in FIG. 5 and FIG. 6 may include a MRE.
  • a DACS client with an MRE may encode and/or send to a DACS server simultaneous streams at different rates via a network.
  • the DACS client may upload to a server (e.g., to the cloud) video content with one or more representations that are encoded at different rates.
  • the DACS client may send a DACS server (e.g., a cloud) a CMD that includes the different rates via the network.
  • the CMD may be updated, for example, based on the available bandwidth.
  • a multi-rate encoder may output simultaneous streams at different bitrates.
  • MRE output bitrates may be listed in the CMD.
  • a DACS client may upload, e.g., to the cloud, content with one or more representations.
  • a DACS client may send to a server a representation of a segment that is sustainable with available bandwidth.
  • the DACS client may perform a refinement, for example, when more bandwidth is available (e.g., after the end of recording).
  • the DACS client may upload higher-quality segments of a representation.
  • a time window for refinement may be signaled (e.g., explicitly) in the CMD.
  • FIG. 5 is an interaction diagram showing an example of DACS client flow 500.
  • DACS client flow 500 illustrates an example call flow of CMD and media segments being sent (e.g., uploaded) from a DACS client with an MRE 520 to a DACS server 540 (e.g., in real-time).
  • the DACS client 520 may provide a CMD that may enumerate multiple rates such as rates A, B, and/or C at 522.
  • Information, such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 520 to the DACS server 540 through a network via HTTP PUT or HTTP POST techniques, for example, as shown in a communication 522.
  • the network may include a consumer grade network for communication between the DACS client 520 and the DACS server 540.
  • the DACS client 520 may stream media content in multiple rates simultaneously or successively.
  • the DACS client 520 may determine to encode media segments at rate A, B, and C.
  • the DACS client 520 may be configured to begin encoding at rate A, B, and C.
  • the DACS client 520 may encode one or more media segments at rate A, B, and C, and upload the media segments to the DACS server 540.
  • the DACS client 520 may encode segment- 1 at rate A, B, and C.
  • the DACS client 520 may upload segment-1 at rate A, B, and C to the DACS server 540 at 524-1, 526-1, and 528-1 simultaneously via a network, or at different times and/or via different networks.
  • the network may be a wireless communication channel of a consumer grade network.
  • the DACS client 520 may encode segment-2 at rate A, B, and C.
  • the DACS client 520 may upload segment-2 at rate A, B, and C to the DACS server 540 at 524-2, 526-2, and 528-2 simultaneously via a network, or at different times and/or via different networks.
• the DACS client 520 may upload segment-N at rate A, B, and C to the DACS server 540 at 524-N, 526-N, and 528-N simultaneously via a network, or at different times and/or via different networks (not shown in FIG. 5).
• the network the DACS client 520 used at 524-2, 526-2, and 528-2 may differ from the network the DACS client 520 used at 524-1, 526-1, and 528-1.
  • the network the DACS client 520 used for sending segment-2 at rate A, B, and C to the DACS server 540 at 524-2, 526-2, and 528-2 may be a different consumer grade network.
  • the DACS client flow 500 may include HTTP responses from the DACS server 540.
• the DACS client 520 may encode multiple rates (e.g., simultaneously) if sufficient encoding power and/or sufficient upload bandwidth are available.
  • the DACS client 520 may upload segments at the multiple rates to the DACS Server 540 if sufficient encoding power and/or sufficient upload bandwidth are available.
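One way to realize the simultaneous uploads of the FIG. 5 flow is a thread per advertised rate, as in the hedged sketch below; the helper names, URL pattern, and rate ids are illustrative assumptions.

```python
# One segment pushed at several rates concurrently (FIG. 5 style).
from concurrent.futures import ThreadPoolExecutor
import requests

def put_segment(server, index, rate, payload):
    url = f"{server}/seg-{index}-{rate}.mp4"  # assumed naming scheme
    requests.put(url, data=payload, timeout=30).raise_for_status()
    return rate

def push_all_rates(server, index, encodings):
    """encodings: dict mapping rate id -> encoded segment bytes."""
    with ThreadPoolExecutor(max_workers=len(encodings)) as pool:
        futures = [pool.submit(put_segment, server, index, r, p)
                   for r, p in encodings.items()]
        return [f.result() for f in futures]  # raises if an upload failed
```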
  • the upload bandwidth may not be sufficient to achieve continuous real-time uploading of segments at all advertised rates.
• there may be an interruption.
  • a temporary lack of computing power on the DACS client 520 may occur.
• the DACS client 520 may drop transmission of segments at one or more of the advertised rates.
• the DACS client may resume real-time uploading of segments at all advertised rates when sufficient bandwidth is available, or when the interruption is over.
• the DACS client 520 may, at a later time, upload the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase. Uploading the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase may occur after an entirety of content is captured and/or when the initial (e.g., real-time) multi-rate streaming phase is over.
  • Uploading the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase may occur while the content is being captured and/or when the initial (e.g., real-time) multi-rate streaming phase is in progress. Uploading the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase may occur when sufficient bandwidth is available.
  • a DACS client may generate a manifest file comprising multiple rates, and/or encode video content at one or more rates in real-time, and send the video content at one or more of the multiple rates subsequently (e.g., not in real-time) to the server.
  • the DACS client may encode the video content at a rate (e.g., the lowest rate) in real-time such that the video content is encoded at an acceptable quality and/or send the video content with the acceptable quality at the lowest rate to the server.
• the DACS client may subsequently (e.g., not in real-time) select a rate corresponding to a video quality that is higher than the acceptable quality, encode the video content according to the selected rate, and/or send the encoded content to the DACS server.
  • FIG. 6 is an interaction diagram showing an example of DACS client flow 600.
  • the DACS client flow 600 illustrates an example call flow of CMD and media segments being sent (e.g., uploaded) from a DACS client 620 to a DACS server 640 (e.g., in real-time).
  • the DACS client 620 may comprise an MRE.
  • the DACS client 620 may send a CMD to a DACS server 640 at 622.
  • the CMD may enumerate one or more rates, such as rate A, B and/or C.
• the rates and/or the number of the rates may be preconfigured (e.g., based on processing power of the DACS client 620 and/or other circuitry of the DACS client 620), and/or may be based on the available bandwidth of a wireless communication channel.
• Information such as the CMD, updated CMD's, and video content segments may be sent from the DACS client 620 to the DACS server 640 through a network via HTTP PUT or HTTP POST techniques, e.g., as shown at 622.
  • the network may include a consumer grade network for communication between the DACS client 620 and the DACS server 640.
• the DACS client 620 may upload segment-1 through segment-N according to the CMD at rate A at 624-1 to 624-N.
• media segments at rate A may be uploaded substantially in real-time, for example, as the video content is captured and encoded.
• An entirety of the media content may be encoded into segments 1-N at rate A.
  • the encoded video content at rate A may have the lowest acceptable quality.
  • a DACS client may upload video content (e.g., one or more segments) at additional rates.
  • a DACS client may upload media segments at additional rates, for example, to permit a DACS server to provide a multi-rate DASH presentation offering.
• a DACS client may upload video content at additional rates, for example, after an entirety of the video content at an original rate is uploaded (e.g., not in real-time) and/or at a time when a DACS client determines there is enough bandwidth available to upload segments encoded at additional rates.
  • the DACS client 620 may upload segments 1 -N at rate B at 630- 1 to 630-N.
  • the DACS client 620 may upload segments 1-N at rate C at 632-1 to 632-N.
• the encoded video content at rate B and/or at rate C may have a quality higher than the video content encoded at rate A.
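The FIG. 6 flow reduces to a simple two-phase schedule: every segment at the lowest acceptable rate in real time, then the higher rates afterwards. A self-contained sketch (the rate ids are illustrative):

```python
def contribution_schedule(n_segments, rates=("A", "B", "C")):
    """Return (phase, segment_index, rate) tuples: phase 1 streams the
    lowest rate in real time; phase 2 backfills the remaining rates."""
    low, *higher = rates
    schedule = [(1, i, low) for i in range(n_segments)]
    schedule += [(2, i, r) for r in higher for i in range(n_segments)]
    return schedule

print(contribution_schedule(2))
# -> [(1, 0, 'A'), (1, 1, 'A'), (2, 0, 'B'), (2, 1, 'B'),
#     (2, 0, 'C'), (2, 1, 'C')]
```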
  • the DACS client flow 600 may include HTTP responses from the DACS server 640.
• the DACS client may implement a combination of one or more of the techniques illustrated in FIG. 3, FIG. 4, FIG. 5, and FIG. 6.
  • a DACS client may adaptively switch between various rates for encoding and/or uploading multiple segments.
  • the DACS client may initially encode and/or upload various segments at rates that adaptively change as the video content is captured and encoded.
  • the rate at which each segment is encoded may be adapted to an available upload bandwidth of a wireless communication channel of a consumer grade network for communication with a server.
  • the DACS client may switch rates as it encodes and/or uploads the video content segments, for example, as illustrated in FIG. 3.
• the DACS client may switch rates according to a CMD and/or an updated CMD. The CMD and/or the updated CMD may enumerate the rates, for example, as illustrated in FIG. 4.
  • the DACS client may (e.g., with some latency, or after the adaptively encoded version of the video content is uploaded) upload additional video content segments.
  • the additional video content segments may comprise one or more alternative versions of each previously encoded/uploaded video content segment.
  • the video content segments may be encoded and/or uploaded at additional rates which were not previously used.
  • the DACS client may track which segments were initially uploaded during the adaptive encoding phase.
  • the DACS server may receive an entirety of the video content that comprises segments encoded at various rates.
• the DACS client may determine the rates at which a video content segment may be uploaded but was not initially or previously uploaded. Based on this determination, the DACS client may encode and/or may upload the segments at the rates that were not previously used. The DACS client may (e.g., after some latency) provide all video content segments at all of a set of intended rates. The set of intended rates may have been announced in a CMD that the DACS client sent to a DACS server, for example. After a DACS client has provided content segments at multiple rates to the DACS server and/or another cloud-based entity, the DACS server and/or the other cloud-based entity may offer the provided content segments as a DASH presentation.
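That bookkeeping fits in a few lines. The sketch below derives which (segment, rate) pairs are still owed after the adaptive phase, assuming the real-time history is kept as a set of pairs:

```python
def missing_uploads(advertised_rates, uploaded):
    """uploaded: set of (segment_index, rate) pairs sent in real time."""
    segments = {i for i, _ in uploaded}
    return [(i, r) for i in sorted(segments)
            for r in advertised_rates if (i, r) not in uploaded]

uploaded = {(0, "A"), (1, "B"), (2, "A")}  # adaptive-phase history
print(missing_uploads(("A", "B"), uploaded))
# -> [(0, 'B'), (1, 'A'), (2, 'B')]
```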
  • the DACS server may have DASH server functionality, or transfer the uploaded content segments at the multiple rates that are provided by the DACS client to a separate DASH server.
• a suitable MPD (e.g., multiple suitable MPD's in a dynamic fashion) may be generated for the uploaded content segments.
• the one or multiple MPD's may advertise the uploaded media segments at the multiple rates to a DASH client, which, for example, may then request, retrieve, and/or play back the media segments according to DASH specifications.
  • the one or multiple MPD's may be generated, for example, by the DACS server which received the CMDs and/or the video content segments, by a DASH server to which the video content segments were transferred, and/or by a cloud-based entity capable of generating MPD's.
  • the one or multiple MPD's may be made available to DASH clients (e.g., by the DASH server, or by other means).
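As an illustration of the server side, a minimal static MPD could be assembled once all rates have arrived. The sketch assumes a seg-$Number$-<rate>.mp4 naming scheme and omits attributes (e.g., profiles) that a production MPD would carry.

```python
# Toy static MPD generator; naming scheme and codec string are assumptions.
RATES_BPS = {"A": 1_000_000, "B": 2_500_000, "C": 5_000_000}

def build_mpd(duration_s, seg_duration_s):
    reps = "\n".join(
        f'      <Representation id="{r}" bandwidth="{bps}" '
        f'mimeType="video/mp4" codecs="avc1.64001f">\n'
        f'        <SegmentTemplate media="seg-$Number$-{r}.mp4" '
        f'duration="{seg_duration_s}" timescale="1" startNumber="1"/>\n'
        f'      </Representation>'
        for r, bps in RATES_BPS.items())
    return (f'<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"\n'
            f'     mediaPresentationDuration="PT{duration_s}S" '
            f'minBufferTime="PT2S">\n'
            f'  <Period>\n    <AdaptationSet>\n{reps}\n'
            f'    </AdaptationSet>\n  </Period>\n</MPD>')

print(build_mpd(duration_s=60, seg_duration_s=2))
```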
  • FIG. 7A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented.
• the communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users.
  • the communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth.
• the communications system 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
• the communications system 100 may include wireless transmit/receive units (WTRUs) 102a, 102b, 102c, and/or 102d (which generally or collectively may be referred to as WTRU 102), a radio access network (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements.
  • WTRUs 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wireless environment.
  • the WTRUs 102a, 102b, 102c, 102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.
• the communications system 100 may also include a base station 114a and a base station 114b.
• Base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112.
  • the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
• the base station 114a may be part of the RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc.
• the base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown).
  • the cell may further be divided into cell sectors.
• the cell associated with the base station 114a may be divided into three sectors.
  • the base station 114a may include three transceivers, e.g., one for each sector of the cell.
• the base station 114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
• the base stations 114a, 114b may communicate with one or more of the WTRUs 102a, 102b, 102c, 102d over an air interface 115/116/117, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.).
  • the air interface 115/116/117 may be established using any suitable radio access technology (RAT).
• the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like.
• the base station 114a in the RAN 103/104/105 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA).
• WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+).
  • HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
• the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
• the base station 114a and the WTRUs 102a, 102b, 102c may implement radio technologies such as IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
• the base station 114b in FIG. 7A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like.
• the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN).
• the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN).
• the base station 114b and the WTRUs 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell.
• the base station 114b may have a direct connection to the Internet 110.
• the base station 114b may not be required to access the Internet 110 via the core network 106/107/109.
• the RAN 103/104/105 may be in communication with the core network 106/107/109.
  • the core network 106/107/109 may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102a, 102b, 102c, 102d.
  • the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication.
  • the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT.
  • the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
  • the core network 106/107/109 may also serve as a gateway for the WTRUs 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or other networks 112.
  • the PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS).
  • the Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite.
• the networks 112 may include wired or wireless communications networks owned and/or operated by other service providers.
  • the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT.
  • One or more of the WTRUs 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities, e.g., the WTRUs 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wireless networks over different wireless links.
• the WTRU 102c shown in FIG. 7A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
  • FIG. 7B is a system diagram of an example WTRU 102.
  • the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138.
• the base stations 114a and 114b, and/or the nodes that base stations 114a and 114b may represent, such as but not limited to a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include one or more of the elements depicted in FIG. 7B.
• the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like.
• the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
• the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 7B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
• the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 115/116/117.
  • the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example.
  • the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
• the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117.
  • the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122.
  • the WTRU 102 may have multi-mode capabilities.
• the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.
• the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
• the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
• the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
• the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
• the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102.
  • the power source 134 may be any suitable device for powering the WTRU 102.
  • the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
• the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102.
• the WTRU 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
• the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
  • FIG. 7C is a system diagram of the RAN 103 and the core network 106 according to an embodiment.
• the RAN 103 may employ a UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 115.
  • the RAN 103 may also be in communication with the core network 106.
• the RAN 103 may include Node-Bs 140a, 140b, 140c, which may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 115.
  • the Node-Bs 140a, 140b, 140c may each be associated with a particular cell (not shown) within the RAN 103.
  • the RAN 103 may also include RNCs 142a, 142b. It will be appreciated that the RAN 103 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment.
• the Node-Bs 140a, 140b may be in communication with the RNC 142a. Additionally, the Node-B 140c may be in communication with the RNC 142b.
  • the Node-Bs 140a, 140b, 140c may communicate with the respective RNCs 142a, 142b via an Iub interface.
  • the RNCs 142a, 142b may be in communication with one another via an Iur interface.
  • Each of the RNCs 142a, 142b may be configured to control the respective Node-Bs 140a, 140b, 140c to which it is connected.
• each of the RNCs 142a, 142b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.
  • the core network 106 shown in FIG. 7C may include a media gateway (MGW) 144, a mobile switching center (MSC) 146, a serving GPRS support node (SGSN) 148, and/or a gateway GPRS support node (GGSN) 150. While each of the foregoing elements are depicted as part of the core network 106, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the RNC 142a in the RAN 103 may be connected to the MSC 146 in the core network 106 via an IuCS interface.
  • the MSC 146 may be connected to the MGW 144.
  • the MSC 146 and the MGW 144 may provide the WTRUs 102a, 102b, 102c with access to circuit- switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and land-line communications devices.
• the RNC 142a in the RAN 103 may also be connected to the SGSN 148 in the core network 106 via an IuPS interface.
• the SGSN 148 may be connected to the GGSN 150.
• the SGSN 148 and the GGSN 150 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
• the core network 106 may also be connected to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
  • FIG. 7D is a system diagram of the RAN 104 and the core network 107 according to an embodiment.
• the RAN 104 may employ an E-UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 116.
  • the RAN 104 may also be in communication with the core network 107.
  • the RAN 104 may include eNode-Bs 160a, 160b, 160c, though it will be appreciated that the RAN 104 may include any number of eNode-Bs while remaining consistent with an embodiment.
• the eNode-Bs 160a, 160b, 160c may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 116.
• the eNode-Bs 160a, 160b, 160c may implement MIMO technology.
• the eNode-B 160a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a.
• Each of the eNode-Bs 160a, 160b, 160c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 7D, the eNode-Bs 160a, 160b, 160c may communicate with one another over an X2 interface.
• the core network 107 shown in FIG. 7D may include a mobility management entity (MME) 162, a serving gateway 164, and a packet data network (PDN) gateway 166. While each of the foregoing elements are depicted as part of the core network 107, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
• the MME 162 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via an S1 interface and may serve as a control node.
• the MME 162 may be responsible for authenticating users of the WTRUs 102a, 102b, 102c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 102a, 102b, 102c, and the like.
  • the MME 162 may also provide a control plane function for switching between the RAN 104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.
• the serving gateway 164 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via the S1 interface.
  • the serving gateway 164 may generally route and forward user data packets to/from the WTRUs 102a, 102b, 102c.
• the serving gateway 164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 102a, 102b, 102c, managing and storing contexts of the WTRUs 102a, 102b, 102c, and the like.
• the serving gateway 164 may also be connected to the PDN gateway 166, which may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
  • the core network 107 may facilitate communications with other networks.
• the core network 107 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and land-line communications devices.
  • the core network 107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 107 and the PSTN 108.
• the core network 107 may provide the WTRUs 102a, 102b, 102c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
  • FIG. 7E is a system diagram of the RAN 105 and the core network 109 according to an embodiment.
• the RAN 105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 117.
  • the communication links between the different functional entities of the WTRUs 102a, 102b, 102c, the RAN 105, and the core network 109 may be defined as reference points.
• the RAN 105 may include base stations 180a, 180b, 180c, and an ASN gateway 182, though it will be appreciated that the RAN 105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment.
• the base stations 180a, 180b, 180c may each be associated with a particular cell (not shown) in the RAN 105 and may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 117.
• the base stations 180a, 180b, 180c may implement MIMO technology.
• the base station 180a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a.
  • the base stations 180a, 180b, 180c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like.
  • the ASN gateway 182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 109, and the like.
• the air interface 117 between the WTRUs 102a, 102b, 102c and the RAN 105 may be defined as an R1 reference point that implements the IEEE 802.16 specification.
• each of the WTRUs 102a, 102b, 102c may establish a logical interface (not shown) with the core network 109.
• the logical interface between the WTRUs 102a, 102b, 102c and the core network 109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
  • the communication link between each of the base stations 180a, 180b, 180c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations.
• the communication link between the base stations 180a, 180b, 180c and the ASN gateway 182 may be defined as an R6 reference point.
  • the R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 102a, 102b, 102c.
• the RAN 105 may be connected to the core network 109.
• the communication link between the RAN 105 and the core network 109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example.
• the core network 109 may include a mobile IP home agent (MIP-HA) 184, an authentication, authorization, accounting (AAA) server 186, and a gateway 188. While each of the foregoing elements are depicted as part of the core network 109, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the MIP-HA may be responsible for IP address management, and may enable the WTRUs 102a, 102b, 102c to roam between different ASNs and/or different core networks.
  • the MIP-HA 184 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
  • the AAA server 186 may be responsible for user authentication and for supporting user services.
  • the gateway 188 may facilitate interworking with other networks.
  • the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and land-line communications devices.
  • the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to the networks 1 12, which may include other wired or wireless networks that are owned and/or operated by other service providers.
  • the RAN 105 may be connected to other ASNs and the core network 109 may be connected to other core networks.
• the communication link between the RAN 105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 102a, 102b, 102c between the RAN 105 and the other ASNs.
• the communication link between the core network 109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.


Abstract

Systems, methods, and instrumentalities are disclosed for dynamic adaptive contribution streaming (DACS). A client device may receive video content. The client device may record the video content. The client device may determine available bandwidth of a wireless communication channel of a consumer grade network for communication with a server. The available bandwidth may change over time. The client device may determine multiple rates to encode the video content based on the available bandwidth. The client device may encode the video content to generate multiple segments, wherein a segment may be encoded at a rate determined based on the available bandwidth at the time of its encoding. The client device may send (e.g., in real-time) the segment to the server via the wireless communication channel of the consumer grade network.

Description

[0001] This application claims the benefit of U.S. Provisional Patent Application No. 62/181,077, filed June 17, 2015, the contents of which are incorporated by reference herein.
BACKGROUND
[0002] Some communications (e.g., news gathering, battleground intelligence) are performed via expensive special purpose satellite links, for example, due to a lack of reliable capability to communicate with appropriate quality and/or latency over the Internet. Live broadcasting of media content may be achieved by streaming through Hypertext Transfer Protocol (HTTP) streaming technologies. However, traditional live newsgathering is very expensive, as it requires dedicated satellite bandwidth, dedicated vans with satellite antennas, and other high cost infrastructure.
SUMMARY
[0003] Systems, methods, and instrumentalities are disclosed for dynamic adaptive contribution streaming (DACS). A client device may receive video content, for example, by recording the video content (e.g., via a camera of the client device). The client device may determine available bandwidth of a wireless communication channel for communication with a server. The wireless communication channel may be that of a consumer grade network, for example, as described herein. The available bandwidth may change over time. The client device may determine a plurality of rates to encode the video content. The client device may encode the video content to generate a plurality of segments. The client device may determine a rate to encode each of the segments. The rate may be determined based on the available bandwidth at the time the client device is encoding each specific segment. For example, as the available bandwidth of the communication channel increases or decreases, the encoding rate for each segment may increase or decrease accordingly. The client device may send each segment to the server via the wireless communication channel of the consumer grade network, for example, in real-time as the client device is receiving the video content.
[0004] The client device may generate a manifest file for the video content, which, for example, may comprise information related to the video content (e.g., each of the encoded segments of the video content). The client device may send the manifest file to the server, for example, via the wireless communication channel. The manifest file may be static or the client device may dynamically update the manifest file. For example, the client device may monitor the available bandwidth, identify a change in the available bandwidth, determine (e.g., select) rates to encode segments of the video content based on the available bandwidth, and update the manifest file accordingly. The client device may send the updated manifest file to the server. Thereafter, the client device may encode and send the video content at the determined rates (e.g., in real-time).
[0005] The recording, encoding, and sending of the video content may occur in real-time. The client device may encode segments of the video content at varying rates and send the segments in real-time. The client device may encode the video content at higher rates when more bandwidth for the wireless channel is available, and encode the video content at lower rates when less bandwidth for the wireless channel is available. As such, the client device may encode and send each of the segments at a single rate in real-time, where a plurality of different rates may be used when encoding the entirety of the video content (e.g., based on the available bandwidth at the time of encoding each segment). Thereafter (e.g., at a later time), the client device may encode and send to the server each of the segments at the rates that were not part of the real-time transmission, for example, such that the client device sends each of the segments of the video content at each and every one of the plurality of rates (e.g., backfills at a later time).
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] FIG. 1 is a block diagram of an example of a DASH system model.
[0007] FIG. 2 is a block diagram of an example DACS system.
[0008] FIG. 3 is an interaction diagram showing an example of DACS client flow.
[0009] FIG. 4 is an interaction diagram showing an example of DACS client flow.
[0010] FIG. 5 is an interaction diagram showing an example of DACS client flow.
[0011] FIG. 6 is an interaction diagram showing an example of DACS client flow. [0012] FIG. 7A is a system diagram of an example communications system in which one or more disclosed embodiments may be implemented.
[0013] FIG. 7B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 7A.
[0014] FIG. 7C is a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in FIG. 7A.
[0015] FIG. 7D is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 7A.
[0016] FIG. 7E is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 7A.
DETAILED DESCRIPTION
[0017] A DACS protocol for contribution streaming may have similarities and differences compared to Dynamic Adaptive Streaming over HTTP (DASH). A DACS client may, for example, run on or be connected to an acquisition device (e.g., a client device such as a mobile phone, DSLR, professional camera, etc.). A DACS client may provide a manifest file and generate one or more encodings of video content (e.g., representations, segments, etc.). A DACS client may upload one or more representations using Hypertext Transfer Protocol (HTTP) methods such as, for example, PUT or POST. Content segments and/or representations may be uploaded via one or more network interfaces at one or more times (e.g., real-time and thereafter), for example, for load balancing, bandwidth adaptation, trading off content quality and latency among various representations, adding or removing enhancements, etc. A CMD (contribution media description) file may comprise a manifest providing a content receiver with information about the video content (e.g., content segments). A DACS client may specify and/or update one or more CMDs. A CMD may enumerate multiple rates (e.g., bitrates). The rate may comprise encoding parameters such as rates, frame rates, bit depth, and/or the like. A DACS client may adaptively switch between multiple rates, for example, based on available bandwidth between the DACS client and the receiving device. Techniques described may be applicable, for example, to Moving Picture Experts Group (MPEG) DASH, Society of Cable
Telecommunications Engineers (SCTE) DASH, DASH-Industry Forum (IF) Interoperability
Points (IOP), Third Generation Partnership Project (3GPP) SA4, etc. [0018] Over-the-top (OTT) streaming may utilize the Internet as a delivery medium. Hardware capabilities may provide a wide range of video-capable devices, e.g., mobile devices, Internet set-top boxes (STBs), and network TVs. Network capabilities may provide high-quality video delivery over the Internet.
[0019] Closed networks may be controlled by a multi-system operator (MSO). The Internet may be a best effort environment, in which bandwidth and/or latency may change.
Network conditions may be volatile in mobile networks. Dynamic adaptation to network changes (volatility) may improve user experience.
[0020] An example of adaptive streaming is HTTP streaming. HTTP may provide a video transport protocol. HTTP infrastructure, such as video content distribution networks (CDNs) and/or the ubiquity of HTTP support on multiple platforms and devices may permit use and scaling of HTTP for Internet video streaming. Some firewalls may disallow user datagram protocol (UDP) traffic while permitting video over HTTP penetration behind firewalls. HTTP streaming may be used for rate-adaptive streaming.
[0021] HTTP adaptive streaming may segment an asset (e.g., video content), for example, virtually or physically. HTTP adaptive streaming may publish segments to CDN's. Intelligence may reside in a client. A client may acquire knowledge of published alternative encodings (e.g., representations) and a way or technique to construct uniform resource locators (URLs), for example, to download a segment from a given representation. An adaptive bitrate (ABR) client may observe network conditions and/or may select a combination of rate, resolution, etc., for example, to provide a quality (e.g., a best quality) of experience on a client device at a (e.g., each) instance of time. A client may issue an HTTP GET request to download a segment, for example, when a client determines an optimal URL to use.
[0022] MPEG DASH may be built on top of a ubiquitous HTTP/TCP/IP stack. DASH may define a manifest format, e.g., Media Presentation Description (MPD). DASH may define segment formats, for example, for International Organization for Standardization (ISO) Base Media File Format (BMFF) and/or MPEG-2 Transport Streams (TS). DASH may define a set of quality metrics at network, client operation and/or media presentation levels. Quality metrics may enable an interoperable technique, for example, to monitor Quality of Experience and Quality of Service.
[0023] A DASH concept may be a representation. A representation may be defined as a single encoded version of an asset (e.g., a complete asset) or a subset of an asset's components. A DASH (e.g., DASH264) representation may be, for example, an ISO-BMFF comprising unmultiplexed 2.5 Mbps 720p Advanced Video Coding (AVC) video. As an example, there may be separate ISO-BMFF representations for 96 kbps MPEG-4 Advanced Audio Coding (AAC) audio in different languages. A single transport stream comprising video, audio, and subtitles may be a single multiplexed representation. A structure may be combined. As an example, video and English audio may be a single multiplexed representation, while Spanish and Chinese audio tracks may be separate unmultiplexed representations.
[0024] A segment may be a minimal or smallest individually addressable unit of media data. A segment may be, for example, an entity that may be downloaded using URLs advertised via the MPD. In an example, a media segment may be a 4-second part of a live broadcast, where play out time starts at 0:42:38, ends at 0:42:42, and may be available within a 3-minute time window. In an example, a media segment may be a complete on-demand movie, which may be available for part or all of a period a movie is licensed.
[0025] An MPD may be an extensible markup language (XML) document. An XML document may advertise the available media and may provide information for a client, for example, to select a representation, make adaptation decisions and/or retrieve segments from a network. An MPD may be independent of a segment. An MPD may signal properties, for example, to determine whether a representation may be played. An MPD may signal functional properties (e.g., an indication whether segments start at random access points). An MPD may use a hierarchical data model to describe a presentation.
[0026] A representation may be the lowest conceptual level of a hierarchical data model. An MPD may signal information, such as bandwidth, codecs for presentation and/or techniques to construct URLs for accessing segments. Additional information may be provided, such as trick mode, random access information, layer and view information for scalable and multiview codecs and/or generic schemes that may be supported by a client wishing to play a given representation.
[0027] DASH may provide rich and flexible URL construction functionality. A single monolithic per-segment URL may be possible in DASH. DASH may allow dynamic
construction of URLs, for example, by combining parts of the URL (e.g., base URLs) that appear at different levels of a hierarchical data model. Segments may have and/or provide multi-path functionality, for example, when multiple base URLs are used. Segments requested from more than one location may improve performance and/or reliability. [0028] A list (e.g., an explicit list) of URLs and byte ranges may, for example, reach several thousand elements per representation, such as when short segments are used. DASH may permit use of predefined variables, such as segment number, segment time, etc., and printf-style syntax, for example, for on-the-fly construction of URLs using templates. Segments may be listed (e.g., seg_00001.ts, seg_00002.ts, ..., seg_03600.ts). A single line, e.g.,
seg_$Index%05d$.ts, may be written, for example, to express any number of segments. A single line may be written, for example, even when segments may not be retrieved at the time an MPD is fetched. Multi-segment representations may be used with templates, e.g., in DASH-AVC/264, for example, for template efficiency.
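The printf-style template above can be expanded mechanically. The sketch below is one plausible reading of the $Name%fmt$ syntax used in the example (DASH itself defines identifiers such as $Number$ and $Time$); it is not the normative resolution algorithm.

```python
# Expand $Name%fmt$ or $Name$ fields in a segment URL template.
import re

def expand(template, **values):
    def repl(match):
        name, fmt = match.group(1), match.group(2) or "d"
        return ("%" + fmt) % values[name]
    return re.sub(r"\$(\w+)(?:%(\w+))?\$", repl, template)

print(expand("seg_$Index%05d$.ts", Index=3600))  # -> seg_03600.ts
```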
[0029] Different representations of an (e.g., the same) asset or a (e.g., the same) component, such as in an un-multiplexed case, may be grouped into adaptation sets.
Representations within an adaptation set may render the same content (e.g., video content). A client may switch between representations.
[0030] For example, an adaptation set may include 10 representations with video encoded with different rates and resolutions. A client may switch between representations at segment or subsegment granularity while presenting the same video content to the viewer. A seamless representation switch may occur, for example, under one or more segment-level restrictions. Restrictions may be specified in DASH profiles and DASH subsets adopted by multiple SDO's. Segment restrictions may be applied to one or more (e.g., all) representations within an adaptation set. Bitstream switching may occur.
[0031] A time-limited subset of the presentation may be referred to as a period.
Adaptation sets may be valid within a period. Adaptation sets in different periods may or may not contain similar representations (e.g., in terms of codecs, rates, etc.). An MPD may contain a single period for the duration of the asset. Periods may be used for advertisement (ad) markup. Different periods may be dedicated to parts of the asset itself and/or to an advertisement.
[0032] An MPD may be an XML document that presents a hierarchy. A hierarchy may comprise, for example, global presentation-level properties (e.g., timing), period-level properties and adaptation sets available for periods. Representations may be the lowest level of a hierarchy.
[0033] DASH may use a simplified version of XLink, for example, to permit loading parts of the MPD (e.g., periods) in real-time from a remote location. An example may be ad insertion, such as when precise timing of ad breaks is known ahead of time. Ad servers may determine an ad in real-time. [0034] A dynamic MPD may change and may be periodically reloaded by a client. A static MPD may be valid for a (e.g., an entire) presentation. A static MPD may be used, for example, for Video on Demand (VoD) applications. A dynamic MPD may be used, for example, for live and Personal Video Recorder (PVR) applications.
[0035] A media segment may be a type of segment. Media segments may be time-bounded parts of a representation. Approximate segment durations may appear in an MPD. Segment duration may or may not be the same for all segments. As an example, DASH-AVC/264 may use segments with durations within a 25% tolerance margin.
[0036] An MPD may comprise information regarding media segments that are unavailable at the time the MPD is read by the client, for example, in a live broadcast scenario. Segments may be (e.g., only) available within a well-defined availability time window. A time window may be calculated from the wall-clock time and/or segment duration.
[0037] An index segment is a type of segment. Index segments may appear as side files and/or within media segments. An index segment may comprise, for example, timing and/or random access information. Indexes may permit efficient implementation of random access and trick modes. Index segments may be used for efficient bitstream switching. Indexing may be used, for example, in VoD and PVR type applications.
[0038] Bitstream switching may be implemented, for example, using segment-level and/or representation-level properties. DASH may provide functional (e.g., explicit functional) requirements for properties. Properties may be expressed in the MPD in a format-independent way. A (e.g., each) segment format specification may comprise format-level restrictions that correspond to generic requirements.
[0039] Media segment i of representation R may be denoted as S_R(i), having a duration D(S_R(i)). An earliest presentation time (EPT) may be denoted as EPT(S_R(i)). EPT may correspond to the earliest presentation time of a segment.
[0040] Switching efficiency may be based in part on time alignment of segments for representations within an adaptation set. Eq. 1 may define a relationship for a (e.g., every) pair of representations R1 and R2 within an adaptation set:

EPT(S_R1(i)) ≥ EPT(S_R2(i-1)) + D(S_R2(i-1))    (Eq. 1)

[0041] A segment may start with a random access point of certain types. Switching may occur at a segment border with or without overlapped downloads and dual decoding. Bitstream switching may occur at a subsegment level, for example when indexing is used and similar requirements hold for subsegments. Time alignment and random access point placement restrictions may be applied. Restrictions may translate into encodings with matching
instantaneous decoder refresh (IDR) frames at segment borders and closed group of pictures (GOPs).
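As a numeric illustration of the Eq. 1 relationship above, the check below takes per-segment (EPT, duration) pairs for two representations; the sample values assume perfectly aligned 4-second segments.

```python
def switchable(rep1, rep2):
    """rep*: list of (ept, dur) per segment. True if segment i of rep1
    never starts before segment i-1 of rep2 has ended (Eq. 1)."""
    return all(rep1[i][0] >= rep2[i - 1][0] + rep2[i - 1][1]
               for i in range(1, min(len(rep1), len(rep2))))

r1 = [(0, 4), (4, 4), (8, 4)]
r2 = [(0, 4), (4, 4), (8, 4)]
print(switchable(r1, r2))  # -> True
```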
[0042] FIG. 1 is a block diagram of an example of a DASH system model. A DASH system model may comprise a DASH client 1000. FIG. 1 may illustrate logical components of a conceptual DASH client model. A DASH client 1000 may comprise an access client (e.g., an HTTP client, a DASH access engine 1020), a media engine 1040, and an application 1060. The DASH access engine 1020 may receive the MPD from a DASH server, construct and issue requests, and receive segments or parts of segments from the DASH server. The DASH access engine 1020 may pass events and/or timing to the application 1060. A DASH server may generate MPD files and/or encode video content. Interfaces may be defined, e.g., on-the-wire formats of the MPD and segments. The DASH access engine 1020 may comprise JavaScript.
[0043] The media engine 1040 may decode, output, and present media. The output of the DASH access engine 1020 may comprise media in MPEG container formats (e.g., MP4 file format or MPEG-2 transport stream) and timing information that maps internal timing of the media to a timeline of presentation. The DASH access engine 1020 may include an HTTP server. It may be assumed that a combination of encoded chunks of media and/or timing information may be sufficient for correct rendering of video content. The media engine 1040 may be provided by a browser, a browser plugin (e.g., Flash or Silverlight), and/or an operating system.
[0044] Timing behavior of a DASH client may be defined. As an example, segments (e.g., segments mentioned in a manifest) may be valid in Apple HLS. A client may poll (e.g., continuously) for updated manifests in Apple HLS. DASH MPD may reduce polling behavior, for example, by defining MPD update frequency and allowing explicit calculation of segment availability.
[0045] A static MPD may be valid (e.g., always valid, or valid for the entire interaction between the DASH client and the DASH server). A dynamic MPD may be valid, for example, from the time it was fetched by the client. A refresh period may be explicitly stated. An MPD may have a notion of versioning. As an example, an MPD may explicitly expose its publication time. An MPD may provide an availability time of the earliest segment of a period, T_R(0).
Media segment n may be available, for example, starting from a time given by Eq. 2:

T_R(n) = T_R(0) + Σ_{i=0}^{n-1} D(S_R(i))    Eq. 2
[0046] Media segment n may be available for the duration of the timeshift buffer Ts, which may be explicitly stated in an MPD. Availability window size may impact catch-up TV functionality of a DASH deployment. Segment availability time may be relied upon by an access client, for example when segment availability time falls within an MPD validity period.
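For illustration only, a minimal Python sketch of the availability window implied by Eq. 2 and the timeshift buffer; the numeric values are hypothetical and all times are in seconds.

def availability_window(n, t_r0, durations, timeshift):
    """Return (start, end) of the availability window for media segment n:
    it becomes available at T_R(0) + sum of D(S_R(i)) for i < n (Eq. 2) and
    stays available for the duration of the timeshift buffer Ts."""
    start = t_r0 + sum(durations[:n])
    return start, start + timeshift

start, end = availability_window(n=3, t_r0=1000.0,
                                 durations=[2.0, 2.0, 2.0, 2.0],
                                 timeshift=30.0)
print(start, end)  # 1006.0 1036.0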
[0047] MPD may declare bandwidth B for a representation R. MPD may define a global minimum buffering time, BT. An access client may pass a segment to the media engine, for example, after B*BT bits are downloaded. A segment may start with a random access point. The earliest time that a segment n may be passed to a media engine may be, for example, T_R(n) + T_D(n) + BT, where T_D(n) may represent the download time of segment n. A DASH client may start play out (e.g., in real-time), for example, to minimize delay. MPD may propose a presentation delay as an offset from T_R(n), for example, to synchronize different clients. A tight synchronization of segment HTTP GET requests may create a thundering herd effect, which may tax infrastructure.
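For illustration only, a minimal Python sketch of this buffering rule. It treats T_D(n) as the time needed to download the first B*BT bits at the measured throughput, which is one plausible reading of the description above; all figures are hypothetical.

def earliest_pass_time(t_avail, bandwidth_bps, min_buffer_time_s,
                       throughput_bps):
    """Earliest time segment n may be passed to the media engine:
    T_R(n) + T_D(n) + BT, with T_D(n) covering the first B * BT bits."""
    t_d = (bandwidth_bps * min_buffer_time_s) / throughput_bps
    return t_avail + t_d + min_buffer_time_s

# Representation declared at 2 Mbps with BT = 4 s, downloaded at 8 Mbps.
print(earliest_pass_time(1006.0, 2e6, 4.0, 8e6))  # 1011.0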
[0048] MPD validity and segment availability may be calculated, for example, using absolute (e.g., wall-clock) time. Media time may be expressed within segments. Drift may develop between the encoder and client clocks, for example, in a live scenario. Drift may be addressed at a container level, for example, when MPEG-2 TS and ISO-BMFF provide synchronization functionality.
[0049] HTTP may be stateless and client-driven. Push-style events may be emulated using frequent polls. Upcoming ad breaks may be signaled before (e.g., 3 to 8 seconds before) they start, for example, to insert ads in cable/Internet Protocol Television (IPTV) systems.
[0050] Events may be "blobs," for example, with explicit time and duration information and application-specific payloads. Inband events may be message boxes appearing at a beginning of a media segment(s). MPD events may be a period-level list of timed elements. DASH may define an MPD validity expiration event. A validity expiration event may identify an early MPD version that is valid after a given presentation time. For example, the early MPD version may be the earliest version that is valid after a given presentation time.
[0051] DASH may be agnostic to content protection, such as digital rights management (DRM). DASH may support signaling a DRM scheme and its properties, e.g., within an MPD. A DRM scheme may be signaled, for example, via a ContentProtection descriptor. A value (e.g., an opaque value) may be passed within a descriptor. A DRM scheme may be signaled, for example, using a unique identifier and/or a definition of the meaning of the passed value. A DRM scheme may be signaled, for example, using a scheme-specific namespace.
[0052] MPEG may define protection standards, such as Common Encryption for ISO-BMFF (CENC) and Segment Encryption and Authentication. Common encryption may standardize which parts of a sample are encrypted and how encryption metadata is signaled within a track. A DRM module may deliver encryption keys to the client, for example, based on encryption metadata in a segment. Decryption may use, for example, Advanced Encryption Standard Counter Mode (AES-CTR) or AES-Cipher Block Chaining (CBC) modes. CENC may be extensible, e.g., to use other encryption algorithms. Common Encryption may be used with DRM systems. Common Encryption may be used in DASH-AVC/264.
[0053] DASH Segment Encryption and Authentication (DASH-SEA) may be agnostic to the segment format. Encryption metadata may be passed via the MPD. For example, MPD may comprise information on which key is used for segment decryption and/or how to obtain a key. AES-CBC encryption and HTTPS-based key transport may be utilized, for example, in Apple HLS. MPEG-2 TS media segments may be compatible with encrypted HLS segments. DASH-SEA may be extensible. DASH-SEA may permit other encryption algorithms and/or DRM systems, which may be similar to CENC.
[0054] DASH-SEA may provide a segment authenticity framework. A framework may, for example, confirm that a segment received by a client is the segment an MPD author intended the client to receive. Confirmation may be performed, for example, using message
authentication code (MAC) or digest algorithms. Confirmation may prevent content
modification within a network, such as ad replacement and inband event alteration.
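For illustration only, a minimal Python sketch of a digest-based authenticity check, assuming the MPD author publishes a SHA-256 digest per segment; the actual algorithm choice and how digests or MAC keys are distributed are outside this sketch.

import hashlib
import hmac

def segment_matches_digest(segment_bytes, expected_hex_digest):
    """Confirm the received segment is the one the MPD author intended,
    using a constant-time comparison of SHA-256 digests."""
    actual = hashlib.sha256(segment_bytes).hexdigest()
    return hmac.compare_digest(actual, expected_hex_digest)

segment = b"\x00\x01\x02"  # received media segment payload (hypothetical)
expected = hashlib.sha256(segment).hexdigest()  # normally taken from the MPD
print(segment_matches_digest(segment, expected))  # True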
[0055] In some scenarios or use cases, e.g., news gathering, battleground intelligence or telemedicine, content quality may be reduced, for example, to reduce content latency to a processing or serving location. Content may be streamed (e.g., over the Internet) at the time of content acquisition, for example, using improvements in mobile networks (e.g., LTE-A) and WiFi ubiquity.
[0056] There may be an economic advantage to streaming content (e.g., video content) through a consumer grade network. The price of local consumer-grade internet access may be less than the cost of private networks, such as special purpose satellite links, by orders of magnitude. A consumer grade network may include a network that is available to the general public, for example, at a fee. For example, a consumer grade network may include a
telecommunication network that a service provider (e.g., Verizon Wireless, ATT, T-Mobile, Sprint, etc.) offers to the general public for a fee. A consumer grade network may comprise one or multiple network access technologies such as 3G cellular, 4G cellular, 5G cellular, WiFi, and/or the like. A consumer grade network may be terrestrial (e.g., entirely terrestrial). A consumer grade network may comprise satellite-based Internet. A private network may be a dedicated network for a selected few businesses, for example, an extraterrestrial satellite network dedicated to a broadcasting company such as NBC, ABC, FOX, etc.
[0057] A device may comprise multiple antennas (e.g., Wi-Fi and LTE), for example, to increase bandwidth, e.g., to send high quality content (e.g., video content). Multiple devices, e.g., multiple Long Term Evolution (LTE) modems and/or multiple WiFi connections to different networks, may be used to increase bandwidth. Codecs, such as High Efficiency Video Coding (HEVC), may reduce upload bandwidth.
[0058] DACS (dynamic adaptive contribution protocol) may include a protocol for dynamic adaptive contribution streaming. DACS may include a client and a server. The protocol may include a contribution media description (CMD). The CMD may provide a manifest file, e.g., for uploaded segments. The CMD may be static or dynamic. For example, the CMD may be updated periodically. One or more versions of the CMD may be sent to a server (e.g., to the cloud).
[0059] Segments may be encoded, for example, on a consumer grade encoder on a device. The device may be a WTRU including a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, a virtual reality headset, consumer electronics, and the like. The device may include a mobile phone, digital single-lens reflex camera (DSLR), professional camera, and/or the like. Encoded segments may be sent (e.g., uploaded) over a unicast protocol, such as HTTP or Google QUIC. Segments may be encoded and uploaded using
a file transfer protocol, such as Aspera or File Transfer Protocol (FTP). An uploading client (e.g., a dynamic adaptive contribution protocol (DACS) client) may include, e.g., an upload engine. The client may measure available bandwidth, e.g., continuously. An uploading client may upload segments from a representation with adequate bandwidth. An uploading client may inform or command an encoder to change a representation (e.g., to change encoding settings such as a target rate or quality level), for example, when the measured available bandwidth changes. The encoder may be integrated in the device or separate from the device.
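For illustration only, a minimal Python sketch of the upload-engine behavior described above: the client measures available upload bandwidth and picks the highest advertised rate the channel can sustain, then commands the encoder accordingly. The rate list, headroom factor, and Encoder interface are hypothetical.

ADVERTISED_RATES_BPS = [500_000, 1_500_000, 4_000_000]  # e.g., rates A, B, C
HEADROOM = 0.8  # keep the upload rate below the measured bandwidth

def select_rate(measured_bandwidth_bps):
    """Return the highest sustainable rate, falling back to the lowest."""
    sustainable = [r for r in ADVERTISED_RATES_BPS
                   if r <= measured_bandwidth_bps * HEADROOM]
    return max(sustainable) if sustainable else min(ADVERTISED_RATES_BPS)

class Encoder:
    """Stand-in for an adaptive encoder that accepts rate-control commands."""
    def set_target_rate(self, bps):
        print(f"encoder reconfigured to {bps} bps")

encoder = Encoder()
for measured in (5_200_000, 1_800_000, 600_000):  # bandwidth samples
    encoder.set_target_rate(select_rate(measured))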
[0060] A device may include an adaptive encoder and/or a multi-rate encoder. An adaptive encoder (AE) may output an (e.g., a single) encoded stream with capability to reconfigure (e.g., dynamically) rate control and/or output rate (e.g., in real-time). The output rate may be set by a DACS client, for example, based on the available bandwidth (e.g., available upload throughput). The output rate may be updated, for example, when the measured available bandwidth changes. The AE may be implemented in software, hardware, or a combination of hardware and software. For example, the AE may implement hardware for outputting an encoded stream and/or dynamically reconfiguring rate control and/or output rate in real-time, and/or the AE may implement software (e.g., a software encoder) for outputting an encoded stream and/or dynamically reconfiguring rate control and/or output rate in real-time.
[0061] A multi-rate encoder (MRE) may output simultaneous streams at different rates. The output rates may be set by a DACS client, for example, based on the available bandwidth (e.g., the available upload throughput). The output rates may be updated, for example, when the measured available bandwidth changes over time. The MRE may be implemented in software, hardware, or a combination of hardware and software. As an example, an MRE may implement hardware for producing multiple different encodings of content at different rates. The MRE may be implemented by running multiple software encoders concurrently (e.g., in different threads), or by having a single software encoder that has the capability of outputting multiple different encoding rates, or of switching contexts/states to alternately service multiple different encoding tasks.
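For illustration only, a minimal Python sketch of one of the MRE implementation options named above (multiple software encoders run concurrently in different threads); the encode_at_rate() body is a hypothetical stand-in for a real codec invocation.

import threading

def encode_at_rate(raw_frames, rate_bps, out):
    """Hypothetical single-rate encode of one segment's worth of frames."""
    out[rate_bps] = f"<{len(raw_frames)} frames encoded at {rate_bps} bps>"

def encode_multi_rate(raw_frames, rates_bps):
    """Produce one encoding of the same input per target rate, concurrently."""
    out = {}
    threads = [threading.Thread(target=encode_at_rate,
                                args=(raw_frames, r, out))
               for r in rates_bps]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return out

print(encode_multi_rate(["f0", "f1", "f2"], [500_000, 1_500_000]))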
[0062] A device (e.g., a DACS client) may determine (e.g., measure) a change in bandwidth of a communication channel with a server (e.g., a communication channel of a consumer grade network) when determining the encoding rate of content (e.g., the encoding rate of a segment of video content). In one or more examples, the device may monitor available bandwidth for a wireless communication channel (e.g., of a consumer grade network). The device may determine a change of the available bandwidth of the wireless communication channel (e.g., of the consumer grade network) and/or adapt the encoding rate to the change of the available bandwidth of the wireless communication channel . The device may monitor an aggregated available bandwidth of one or more network access types (e.g., WiFi, Cellular). The device may determine a change of the aggregated available bandwidth of the one or more network access types and/or adapt the encoding rate to the change of the aggregated available bandwidth of the one or more network access types. The device may switch between the one or more network access types. For example, the device may switch between WiFi and 3G, 3G and 4G, 4G and 5G etc. The device may determine that bandwidth of a different wireless
communication channel of the consumer grade network may become available, or a wireless communication channel of a different consumer grade network may become available, and the device may determine that bandwidth of the different communication channel and/or network is better than the current channel or bandwidth used for uploading the content. In such instances, the device may switch to the different channel and/or different network, and continue to encode and upload content (e.g., in real-time).
[0063] A device may encode segments of video content at one or more rates, for example, in real-time. The device may receive video content (e.g., a portion of the entirety of the video content), encode the video content to generate a segment of the video content, and then send the segment to a server in real-time. For example, the device may receive, encode, and send segments to the server before the device has received the entirety of the video content (e.g., while the device is receiving the video content). A complete video content may not be received and/or stored by the device before the device sends the segments to the server. The device may adaptively determine the rate to encode subsequent segments of the content (e.g., based on the available channel bandwidth) as the device is receiving and sending segments of the content to a remote site (e.g., a server).
[0064] A device may upload media segments at a plurality of rates, for example, to permit a server to provide a multi-rate DASH presentation offering. Higher-quality segments that were not uploaded in time may be uploaded afterwards, for example, to permit a replay of footage at a higher quality than quality available in real-time. In an example of scalable video coding, base layer segments may be uploaded (e.g., uploaded as content is captured and encoded, or uploaded substantially in real-time) while enhancement layers that may not be uploaded initially (e.g., could not be uploaded in real-time) may be uploaded later. A multi-rate encoding client may, if CPU power permits, encode the multiple rates as the content is captured or acquired and may store the multiple rates locally for uploading later (e.g., may schedule uploading of the resulting multiple-rate content segments based on available upload throughput). A multi-rate encoding client may store the original captured content, or may initially produce a high quality (e.g., lightly compressed) version of the content for local storage, and may then use the stored original content or the stored high quality version of the content to encode the multiple rate versions of the content for uploading over a time period.
[0065] Multiple network interfaces may be used simultaneously. Different segments may be uploaded through different interfaces. For example, segments of a first representation at a first rate may be uploaded via a first network interface, and segments of a second
representation at a second rate may be uploaded via a second network interface. Given N network interfaces, the ith segment may be uploaded via interface mod(i, N), for example. Different segments of the same representation may be uploaded via different network interfaces. As another example, segments of a first representation at a first rate may be uploaded substantially in real-time (e.g., as the content is captured and is encoded at the first rate) via a first network interface, while segments of a second representation at a second rate may be uploaded via a second network interface with additional latency relative to the corresponding segments of the first representation. Multiple interfaces may be on the same network (e.g., "bonded" interfaces) and/or on different networks. Bonding may occur at an application level or a driver/kernel level.
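For illustration only, a minimal Python sketch of the mod(i, N) interface mapping described above; the interface names are hypothetical.

interfaces = ["wlan0", "lte0"]  # N = 2 bonded or independent interfaces

def interface_for_segment(i, interfaces):
    """Round-robin mapping: the ith segment goes out via interface mod(i, N)."""
    return interfaces[i % len(interfaces)]

for i in range(5):
    print(i, interface_for_segment(i, interfaces))
# 0 wlan0 / 1 lte0 / 2 wlan0 / 3 lte0 / 4 wlan0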
[0066] An adaptive upload protocol may be implemented based on a hypertext transfer protocol (HTTP). An HTTP-based implementation of an adaptive upload protocol may have some similarities and/or differences relative to moving picture experts group (MPEG) DASH. Segments may be uploaded, for example, using HTTP PUT or POST techniques. Structure of an MPD (e.g., or of a similar manifest adapted to be used for adaptive uploading of content) may comprise segment URLs that point to an UploadBaseURL element, for example, to permit specification of upload characteristics.
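For illustration only, a minimal Python sketch of a segment upload over HTTP PUT, using the widely used requests library; the upload base URL and the segment naming convention are hypothetical.

import requests

UPLOAD_BASE_URL = "https://ingest.example.com/contribution/"

def upload_segment(segment_bytes, segment_name):
    """PUT one media segment to the upload URL; returns the HTTP status."""
    resp = requests.put(
        UPLOAD_BASE_URL + segment_name,
        data=segment_bytes,
        headers={"Content-Type": "video/mp4"},
    )
    return resp.status_code

# Example use (hypothetical file and path):
# upload_segment(open("seg-0001.m4s", "rb").read(), "repA/seg-0001.m4s")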
[0067] Table 1 provides an example of an element that may convey an upload URL. Locally stored segment URLs may be implemented, for example, using a file:// URL scheme.

Table 1 - Example of an element to convey an upload URL
[Table 1 is rendered as an image in the source; it defines the UploadBaseURL element that conveys the upload URL.]
[0068] AE and/or MRE output rates may be listed in a contribution media description (CMD) file. A CMD (contribution media description) may provide a manifest, e.g., for uploaded segments. The CMD may be based on MPD. For example, the CMD may reuse syntax elements of the well-known MPD manifest format. The CMD may have additional elements such as, for example, the UploadBaseURL (e.g., as defined in Table 1). The CMD may include one or more of the determined rate, a coding parameter, resolution, information on a codec stream, initialization information, metadata associated with the encoded video content, suggested URL routes, time to announce the manifest file, time resolution of a segment, and a time window. The CMD may be updated, for example, based on a change in the available bandwidth. The updated CMD may be sent to the server via a wireless communication channel of a consumer grade network.
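For illustration only, a minimal Python sketch that assembles a CMD-like manifest. The CMD reuses MPD-style syntax plus an UploadBaseURL element, but the exact element and attribute names below are illustrative assumptions rather than a normative schema.

import xml.etree.ElementTree as ET

def build_cmd(upload_base_url, rates_bps, segment_duration_s):
    """Build a small CMD document listing the upload URL and output rates."""
    cmd = ET.Element("CMD", type="dynamic")
    ET.SubElement(cmd, "UploadBaseURL").text = upload_base_url
    adaptation = ET.SubElement(cmd, "AdaptationSet", mimeType="video/mp4")
    for i, rate in enumerate(rates_bps):
        ET.SubElement(adaptation, "Representation",
                      id=f"rep{i}", bandwidth=str(rate),
                      segmentDuration=str(segment_duration_s))
    return ET.tostring(cmd, encoding="unicode")

print(build_cmd("https://ingest.example.com/contribution/",
                [500_000, 1_500_000, 4_000_000], 2))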
[0069] A DACS client may, for example, run on an acquisition device (e.g., mobile phone, DSLR, professional camera). A device running DACS may be connected to an acquisition device, for example, using a non-compressed and/or high quality video interface, such as serial digital interface (SDI), High-Definition Multimedia Interface (HDMI) or IP (e.g., using Society of Motion Picture and Television Engineers (SMPTE) 2022 and/or high quality (e.g., mezzanine quality) H.264, H.265 or MPEG-2 for video). [0070] Received footage may be encoded, for example, using a single-rate adaptive or multi-rate encoder. Footage may be augmented, for example, with acquisition information, such as location, altitude, direction, and camera and lens identity and parameters, etc. Footage may be augmented with information, such as potential places for advertisement breaks. Such
augmentation may be provided in the CMD manifest generated by the DACS client and/or may be sent inband within the uploaded content segments.
[0071] Encoded information may be uploaded, for example, to the cloud. The uploaded encoded information may be made available for streaming, for example, after transcoding into distribution formats and editing. For example, the uploaded segments may be made available (e.g., as uploaded, or after some processing such as transcoding and/or editing) as a DASH presentation. DACS (e.g., a DACS client) may measure and monitor available bandwidth for uploading and/or may determine the upload rates such that the upload may be maintained at a sustainable rate. CMD may be generated and/or updated. One or more versions of the CMD may be uploaded to the cloud.
[0072] FIG. 2 is a block diagram of an example DACS system 200. A DACS client 220 may connect to an acquisition device 210. The DACS client 220 may be integrated with the acquisition device 210. A DACS client may comprise acquisition capabilities, e.g., the DACS client 260. DACS protocol may be used to provide and/or update the CMD (e.g., a manifest of uploaded segments), e.g., to a DACS server 240. DACS protocol may be used to upload segments (e.g., content such as video content segments, which may be of varying bitrates) to the DACS server 240. Information such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 220 to the DACS server 240 through Network 230, for example, via HTTP PUT or HTTP POST techniques.
[0073] Information, such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 260 to the DACS server 240 through Network 250, for example, via HTTP PUT or HTTP POST techniques. The DACS server 240 may communicate with a DASH server 270, e.g., the DACS server 240 may send the information to the DASH server 270 and/or communicate with the DASH server 270 in a similar manner as the DACS server 240 communicates with the DACS client 260. The DACS server 240 may track the communication with the DASH server 270. The DASH server 270 may send content segments, an MPD file, and/or updated MPD files to the DASH client 290 via the network 280. The communication between the DASH server 270 and the DASH client 290 may follow DASH protocol. For example, when an MRE-capable DACS client 260 has provided content segments at multiple rates, the DACS server 240 or another cloud-based entity may offer the provided video content segments as a DASH presentation. The DACS server 240 may have DASH server functionality, and/or the uploaded video content segments at the various rates provided by the DACS client 260 may be transferred to a separate DASH server 270. A suitable MPD (e.g., or multiple such MPDs in dynamic fashion) may be generated based on the one or multiple CMDs provided by the DACS client 260. The one or multiple MPDs may advertise the uploaded media segments at the various rates to a DASH client 290, which, for example, may then request, retrieve, and play back the media segments according to the DASH specifications. The one or multiple MPDs may be generated, for example, by the DACS server 240 which received the CMDs and/or the uploaded segments, by a DASH server 270 to which the uploaded video content was transferred, and/or by another cloud-based entity capable of generating MPDs. The one or multiple MPDs may be made available to a DASH client 290 (e.g., by the DASH server 270, or by other means). The DASH server 270 may generate an MPD and/or may offer content streaming of a presentation to the DASH client 290 while the DASH server 270 is receiving content for the presentation from the DACS server 240. The DASH server 270 may generate an MPD and/or may offer content streaming of a presentation to the DASH client 290 while the DACS server 240 is receiving uploaded segments for the presentation from the DACS client (e.g., 220 and/or 260). For example, the DASH server 270 may generate an initial MPD based on the content segments that are available at the DASH server 270. The DASH server 270 may generate (e.g., dynamically) one or more updated MPDs that reflect additional content of the presentation when such content is received by the DASH server 270 and/or is available for streaming.
[0074] Although not shown in FIG. 2, a DACS client (e.g., a single DACS client) may upload information, such as one or more CMDs and content segments, via more than one network, e.g., via more than one network interface.
[0075] A DACS client may include an AE and/or an MRE. For example, the DACS client in FIG. 3 and FIG. 4 may include an AE. The DACS client in FIG. 5 and FIG. 6 may include an MRE. A DACS client may receive video content, e.g., by recording the video content, or by receiving the video content from a connected content acquisition device. The DACS client may determine the available bandwidth that a consumer grade network may offer. Through the consumer grade network, the DACS client may communicate with a server. The DACS client may determine a rate(s) at which the DACS client may encode the video content based on the available bandwidth. The DACS client may monitor the available bandwidth. When the DACS client detects a change in the available bandwidth, the DACS client may switch to a different rate at which the DACS client may encode subsequent segments. The DACS client may send a DACS server a CMD that includes the rate(s). The CMD may be updated, e.g., based on the available bandwidth.
[0076] An adaptive encoder (AE) may be an encoder that outputs an (e.g., a single) encoded stream with capability to reconfigure (e.g., dynamically) rate control and/or output bitrate.
[0077] A DACS client may set an AE bitrate, for example, as needed or desired. The bitrate may be set based on the available upload throughput, for example. A DACS client may switch to a different bitrate, for example, due to network conditions, and/or may set the AE bitrate appropriately. A DACS client may insert a new period into the CMD. The DACS client may upload the result as an updated CMD.
[0078] FIG. 3 is an interaction diagram showing an example of DACS client flow 300. FIG. 3 illustrates CMD and media segments sent (e.g., uploaded) from a DACS client 320 with an AE. The DACS client 320 may provide a CMD that may enumerate multiple rates such as rates A, B, and/or C at 322. Information, such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 320 to the DACS server 340 through a network via HTTP PUT or HTTP POST techniques, e.g., as shown in a communication 322. The network may include a consumer grade network for communication between the DACS client 320 and the DACS server 340.
[0079] The DACS client 320 may stream media content in multiple rates in real-time. The DACS client 320 may adaptively switch between multiple rates, such as rates A, B, and/or C when providing the content. Switching may occur, for example, based on available bandwidth of a wireless communication channel of the consumer grade network that is used for
communication between the DACS client 320 and the DACS server 340. The DACS client 320 may determine to encode media segments at rate A, for example, based on the available bandwidth of the wireless communication channel, or the DACS client 320 may be configured to begin encoding at a particular rate independent of the communication channel. The DACS client 320 may encode one or more media segments at rate A, and upload the media segments to the DACS server 340 at 324-324N. [0080] The DACS client 320 may monitor (e.g., continuously monitor) the available bandwidth of the wireless communication channel of the consumer grade network. The DACS client 320 may determine that the available bandwidth of the wireless communication channel of the consumer grade network changes. For example, the DACS client may determine that the wireless communication channel can now accommodate a segment encoded at rate C at 325. Accordingly, the DACS client 320 may encode one or more subsequent media segments at rate C, and upload the media segments to the DACS server 340 at 326-326N. The DACS client 320 may determine additional changes in the bandwidth of the wireless communication channel. For example, the DACS client 320 may determine to encode subsequent media segments at rate B at 327 (e.g., the channel may no longer be able to accommodate segments encoded at rate C for real-time streaming). Accordingly, the DACS client 320 may encode one or more subsequent media segments at rate B, and upload the media segments to the DACS server 340 at 332-332N. Although not illustrated, the DACS client flow 300 may include HTTP responses from the DACS server 340.
[0081] FIG. 4 is an interaction diagram showing an example of DACS client flow 400. FIG. 4 illustrates CMD and media segments sent (e.g., uploaded) from a DACS client 420 with an AE. The DACS client 420 may upload media segments, for example, according to one or more of rates defined in an updated (e.g., most recently updated) CMD. The DACS client 420 may adaptively switch between rates, for example, based on available bandwidth in a wireless communication channel for communication between the DACS client 420 and the DACS server 440. The wireless communication channel may be that of a consumer grade network.
[0082] As shown in FIG. 4, the DACS client 420 may send a dynamic CMD to the server 440 at 422. For example, the CMD may specify one or more rates (e.g., rate A) and/or an expiration time of the CMD. Information such as the CMD, updated CMDs, and/or video content segments may be sent from the DACS client 420 to the DACS server 440 through a network via HTTP PUT or HTTP POST techniques, e.g., as shown in a communication 422. The network may include a consumer grade network for communication between the DACS client 420 and the DACS server 440.
[0083] The DACS client 420 may determine to encode media segments at rate A and stream the media content (e.g., video content) in real-time. The DACS client 420 may make the determination based on the available bandwidth of the wireless communication channel of the consumer grade network that is used for communication between the DACS client 420 and the DACS server 440. The DACS client 420 may be configured to begin encoding at rate A. For example, the DACS client 420 may determine to encode at rate A independent of the
communication channel. The DACS client 420 may encode one or more media segments at rate
A, and upload the media segments to the DACS server 440 at 424-424N.
[0084] The DACS client 420 may adaptively switch to a different rate and/or switch from the different rate to rate A. Switching may occur, for example, based on the available bandwidth of the wireless communication channel with the DACS server 440 (e.g., a wireless
communication channel of a consumer grade network). The DACS client 420 may monitor (e.g., continuously monitor) available bandwidth of the wireless communication channel. The DACS client 420 may determine that available bandwidth of the wireless communication channel changes. For example, the DACS client 420 may determine that the wireless communication channel may accommodate a segment encoded at rate B at 426.
[0085] The DACS client 420 may update the CMD to include, e.g., additional rates that a changing bandwidth of a wireless communication channel may accommodate. For example, the DACS client 420 may update the CMD to include rate B. The DACS client 420 may send the updated CMD to the DACS server (e.g., in real-time) at 428. The updated CMD may include information related to the segment that may be encoded at the additional rate. The DACS client 420 may send the updated CMD to the DACS server (e.g., in real-time) at 428 through the wireless communication channel via HTTP PUT or HTTP POST techniques, e.g., as shown at 428.
[0086] The DACS client 420 may encode one or more subsequent media segments at rate
B, and upload the media segments to the DACS server 440 at 430-430N. The DACS client 420 may monitor (e.g., continuously measure) the available bandwidth and adaptively switch the encoding rate of the video content (e.g., in real-time) based on the change in available bandwidth. For example, the DACS client 420 may determine to encode subsequent segments at rate A at 432. For example, the wireless communication channel may no longer be able to accommodate segments encoded at rate B for real-time streaming. The DACS client 420 may encode one or more subsequent media segments at rate A, and upload the media segments to the DACS server 440 at 434-434N (not shown in FIG. 4). Although not illustrated, the DACS client flow 400 may include HTTP responses from the DACS server 440.
[0087] A DACS client (e.g., the DACS client 320 and/or the DACS client 420) may upload the entirety of media content (e.g., a video) but do so by uploading segments encoded according to a plurality of different rates. For example, the DACS client 320 may upload a varying, multi-rate encoding of the media content to the server in real-time (e.g., based on the available bandwidth at the time of encoding segments of the content), and subsequently upload segments of the content at rates that were not part of the real-time transmission (e.g., backfill to ensure that the server has a complete version of the content at each of the plurality of rates).
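For illustration only, a minimal Python sketch of this backfill behavior: the client records which (segment, rate) pairs were uploaded in real time and afterwards uploads the missing pairs so the server ends up with every segment at every announced rate; upload() is a hypothetical stand-in for the HTTP PUT path.

def upload(segment_index, rate_bps):
    """Hypothetical stand-in for encoding/uploading one segment at one rate."""
    print(f"uploading segment {segment_index} at {rate_bps} bps")

rates = [500_000, 1_500_000, 4_000_000]
# (segment, rate) pairs that made it out during real-time adaptive streaming.
realtime_uploads = {(0, 1_500_000), (1, 4_000_000), (2, 500_000)}
num_segments = 3

# Backfill phase: fill in every rate that was skipped during live streaming.
for i in range(num_segments):
    for rate in rates:
        if (i, rate) not in realtime_uploads:
            upload(i, rate)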
[0088] The DACS client in FIG. 5 and FIG. 6 may include an MRE. A DACS client with an MRE may encode and/or send to a DACS server simultaneous streams at different rates via a network. The DACS client may upload to a server (e.g., to the cloud) video content with one or more representations that are encoded at different rates. The DACS client may send a DACS server (e.g., a cloud) a CMD that includes the different rates via the network. The CMD may be updated, for example, based on the available bandwidth.
[0089] A multi-rate encoder (MRE) may output simultaneous streams at different bitrates. MRE output bitrates may be listed in the CMD. A DACS client may upload, e.g., to the cloud, content with one or more representations. As an example, a DACS client may send to a server a representation of a segment that is sustainable with available bandwidth. The DACS client may perform a refinement, for example, when more bandwidth is available (e.g., after the end of recording). The DACS client may upload higher-quality segments of a representation. A time window for refinement may be signaled (e.g., explicitly) in the CMD.
[0090] FIG. 5 is an interaction diagram showing an example of DACS client flow 500. DACS client flow 500 illustrates an example call flow of CMD and media segments being sent (e.g., uploaded) from a DACS client with an MRE 520 to a DACS server 540 (e.g., in real-time). The DACS client 520 may provide a CMD that may enumerate multiple rates such as rates A, B, and/or C at 522. Information, such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 520 to the DACS server 540 through a network via HTTP PUT or HTTP POST techniques, for example, as shown in a communication 522. The network may include a consumer grade network for communication between the DACS client 520 and the DACS server 540.
[0091] The DACS client 520 may stream media content in multiple rates simultaneously or successively. The DACS client 520 may determine to encode media segments at rate A, B, and C. The DACS client 520 may be configured to begin encoding at rate A, B, and C. The DACS client 520 may encode one or more media segments at rate A, B, and C, and upload the media segments to the DACS server 540. For example, the DACS client 520 may encode segment-1 at rate A, B, and C. The DACS client 520 may upload segment-1 at rate A, B, and C to the DACS server 540 at 524-1, 526-1, and 528-1 simultaneously via a network, or at different times and/or via different networks. The network may be a wireless communication channel of a consumer grade network.
[0092] The DACS client 520 may encode segment-2 at rate A, B, and C. The DACS client 520 may upload segment-2 at rate A, B, and C to the DACS server 540 at 524-2, 526-2, and 528-2 simultaneously via a network, or at different times and/or via different networks. The DACS client 520 may upload segment-N at rate A, B, and C to the DACS server 540 at 524-N, 526-N, and 528-N simultaneously via a network, or at different times via different networks (not shown in FIG. 5). The network the DACS client 520 used at 524-2, 526-2, and 528-2 may differ from the network the DACS client 520 used at 524-1, 526-1, and 528-1. For example, the network the DACS client 520 used for sending segment-2 at rate A, B, and C to the DACS server 540 at 524-2, 526-2, and 528-2 may be a different consumer grade network. Although not illustrated, the DACS client flow 500 may include HTTP responses from the DACS server 540.
[0093] The DACS client 520 may encode multiple rates (e.g., simultaneously) if sufficient encoding power and/or sufficient upload bandwidth are available. The DACS client 520 may upload segments at the multiple rates to the DACS server 540 if sufficient encoding power and/or sufficient upload bandwidth are available. In some cases or at some time, the upload bandwidth may not be sufficient to achieve continuous real-time uploading of segments at all advertised rates. In some cases or at some time, there may be an interruption. For example, a temporary lack of computing power on the DACS client 520 may occur. The DACS client 520 may drop transmission of segments at one or more of the advertised rates. The DACS client may later resume real-time uploading of segments at all advertised rates when sufficient bandwidth is available, or when the interruption is over. The DACS client 520 may, at a later time, upload the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase. Uploading the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase may occur after an entirety of content is captured and/or when the initial (e.g., real-time) multi-rate streaming phase is over. Uploading the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase may occur while the content is being captured and/or when the initial (e.g., real-time) multi-rate streaming phase is in progress. Uploading the segments at the one or more of the advertised rates that were dropped during the initial (e.g., real-time) multi-rate streaming phase may occur when sufficient bandwidth is available.
[0094] A DACS client (e.g., the DACS client 620) may generate a manifest file comprising multiple rates, and/or encode video content at one or more rates in real-time, and send the video content at one or more of the multiple rates subsequently (e.g., not in real-time) to the server. For example, the DACS client may encode the video content at a rate (e.g., the lowest rate) in real-time such that the video content is encoded at an acceptable quality and/or send the video content with the acceptable quality at the lowest rate to the server. The DACS client may subsequently (e.g., not in real-time) select a rate corresponding to a video quality that is higher than the acceptable quality, encode the video content according to the selected rate, and/or send the encoded content to the DACS server.
[0095] FIG. 6 is an interaction diagram showing an example of DACS client flow 600. The DACS client flow 600 illustrates an example call flow of CMD and media segments being sent (e.g., uploaded) from a DACS client 620 to a DACS server 640 (e.g., in real-time). The DACS client 620 may comprise an MRE. The DACS client 620 may send a CMD to a DACS server 640 at 622. The CMD may enumerate one or more rates, such as rate A, B and/or C. The rates and/or the number of the rates may be preconfigured (e.g., based on processing power of the DACS client 620 and/or other circuitry of the DACS client 620), and/or may be based on the available bandwidth of a wireless communication channel. Information such as the CMD, updated CMDs, and video content segments may be sent from the DACS client 620 to the DACS server 640 through a network via HTTP PUT or HTTP POST techniques, e.g., as shown at 622. The network may include a consumer grade network for communication between the DACS client 620 and the DACS server 640.
[0096] The DACS client 620 may upload segments 1, 2, and so on until segment N according to the CMD at rate A at 624-1 to 624-N. As an example, media segments at rate A may be uploaded substantially in real-time, for example, as video content is captured and encoded. An entirety of the media content may be encoded into segments 1-N at rate A. The encoded video content at rate A may have the lowest acceptable quality.
[0097] A DACS client may upload video content (e.g., one or more segments) at additional rates. A DACS client may upload media segments at additional rates, for example, to permit a DACS server to provide a multi-rate DASH presentation offering. A DACS client may upload video content at additional rates, for example, after an entirety of the video content at an original rate is uploaded (e.g., not in real-time) and/or at a time when a DACS client determines there is enough bandwidth available to upload segments encoded at additional rates. As shown in FIG. 6, the DACS client 620 may upload segments 1-N at rate B at 630-1 to 630-N. The DACS client 620 may upload segments 1-N at rate C at 632-1 to 632-N. The encoded video content at rate B and/or at rate C may have a quality higher than the encoded video content encoded at rate A. Although not illustrated, the DACS client flow 600 may include HTTP responses from the DACS server 640.
[0098] The DACS client may implement the combination of one or more techniques illustrated in FIG. 3, FIG. 4, FIG. 5, and FIG. 6. For example, a DACS client may adaptively switch between various rates for encoding and/or uploading multiple segments. The DACS client may initially encode and/or upload various segments at rates that adaptively change as the video content is captured and encoded. The rate at which each segment is encoded may be adapted to an available upload bandwidth of a wireless communication channel of a consumer grade network for communication with a server. The DACS client may switch rates as it encodes and/or uploads the video content segments, for example, as illustrated in FIG. 3. The DACS client may switch rates according to a CMD and/or an updated CMD. The CMD and/or the updated CMD may enumerate the rates, for example, as illustrated in FIG. 4. The DACS client may (e.g., with some latency, or after the adaptively encoded version of the video content is uploaded) upload additional video content segments. The additional video content segments may comprise one or more alternative versions of each previously encoded/uploaded video content segment. For example, the video content segments may be encoded and/or uploaded at additional rates which were not previously used. The DACS client may track which segments were initially uploaded during the adaptive encoding phase. For example, during the adaptive encoding phase, the DACS server may receive an entirety of the video content that comprises segments encoded at various rates. The DACS client may determine the rates at which a video content segment may be uploaded but was not initially or previously uploaded. Based on this determination, the DACS client may encode and/or may upload the segments at the rates that were not previously used. The DACS client may (e.g., after some latency) provide all video content segments at all of a set of intended rates. The set of intended rates may have been announced by the DACS client in a CMD and sent by the DACS client to a DACS server, for example. [0099] After a DACS client has provided content segments at multiple rates to the DACS server and/or another cloud-based entity, the DACS server and/or another cloud-based entity may offer the provided content segments as a DASH presentation. For example, the DACS server may have DASH server functionality, or transfer the uploaded content segments at the multiple rates that are provided by the DACS client to a separate DASH server. A suitable MPD (e.g., multiple suitable MPDs in a dynamic fashion) may be generated based on the one or multiple CMDs provided by the DACS client. The one or multiple MPDs may advertise the uploaded media segments at the multiple rates to a DASH client, which, for example, may then request, retrieve, and/or play back the media segments according to DASH specifications. The one or multiple MPDs may be generated, for example, by the DACS server which received the CMDs and/or the video content segments, by a DASH server to which the video content segments were transferred, and/or by a cloud-based entity capable of generating MPDs. The one or multiple MPDs may be made available to DASH clients (e.g., by the DASH server, or by other means).
[0100] FIG. 7A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented. The communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications systems 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
[0101] As shown in FIG. 7A, the communications system 100 may include wireless transmit/receive units (WTRUs) 102a, 102b, 102c, and/or 102d (which generally or collectively may be referred to as WTRU 102), a radio access network (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. WTRUs 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wireless environment. By way of example, the WTRUs 102a, 102b, 102c, 102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.
[0102] The communications systems 100 may also include a base station 114a and a base station 114b. Base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112. By way of example, the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
[0103] The base station 114a may be part of the RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 114a may be divided into three sectors. Thus, in one embodiment, the base station 114a may include three transceivers, e.g., one for each sector of the cell. In another embodiment, the base station 114a may employ multiple-input multiple output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
[0104] The base stations 114a, 114b may communicate with one or more of the WTRUs 102a, 102b, 102c, 102d over an air interface 115/116/117, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 115/116/117 may be established using any suitable radio access technology (RAT).
[0105] More specifically, as noted above, the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 114a in the RAN 103/104/105 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA).
WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
[0106] In another embodiment, the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
[0107] In other embodiments, the base station 114a and the WTRUs 102a, 102b, 102c may implement radio technologies such as IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
[0108] The base station 114b in FIG. 7A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. In one embodiment, the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 114b and the WTRUs 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in FIG. 7A, the base station 114b may have a direct connection to the Internet 110. Thus, the base station 114b may not be required to access the Internet 110 via the core network 106/107/109.
[0109] The RAN 103/104/105 may be in communication with the core network
106/107/109, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102a, 102b, 102c, 102d. For example, the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 7A, it will be appreciated that the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT. For example, in addition to being connected to the RAN 103/104/105, which may be utilizing an E-UTRA radio technology, the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
[0110] The core network 106/107/109 may also serve as a gateway for the WTRUs 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or other networks 112. The PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite. The networks 112 may include wired or wireless
communications networks owned and/or operated by other service providers. For example, the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT.
[0111] One or more of the WTRUs 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities, e.g., the WTRUs 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wireless networks over different wireless links. For example, the WTRU 102c shown in FIG. 7A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
[0112] FIG. 7B is a system diagram of an example WTRU 102. As shown in FIG. 7B, the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138. It will be appreciated that the WTRU 102 may include any subcombination of the foregoing elements while remaining consistent with an embodiment. Also, embodiments contemplate that the base stations 114a and 114b, and/or the nodes that base stations 114a and 114b may represent, such as but not limited to a transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include one or more of the elements depicted in FIG. 7B.
[0113] The processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 7B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
[0114] The transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 115/116/117. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
[0115] In addition, although the transmit/receive element 122 is depicted in FIG. 7B as a single element, the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117.
[0116] The transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example. [0117] The processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
[0118] The processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102. The power source 134 may be any suitable device for powering the WTRU 102. For example, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
[0119] The processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102. In addition to, or in lieu of, the information from the GPS chipset 136, the WTRU 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
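By way of a hedged illustration only (this sketch is not part of the disclosure), the timing-based determination mentioned above can be reduced to a simple trilateration when one-way propagation delays from three base stations with known coordinates are assumed to be measurable; real systems typically use more elaborate methods (e.g., observed time-difference-of-arrival), and the station positions and the helper name trilaterate below are illustrative assumptions:

```python
# Minimal sketch: position from signal timing, assuming synchronized
# one-way delay measurements to three base stations (illustrative only).
C = 299_792_458.0  # speed of light, m/s

def trilaterate(stations, delays_s):
    """Solve for (x, y) from three (xi, yi) stations and one-way delays.

    Linearizes the range equations (x-xi)^2 + (y-yi)^2 = ri^2 by
    subtracting the first equation from the other two, leaving a 2x2
    linear system solved with Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = stations
    r0, r1, r2 = (C * d for d in delays_s)  # delays converted to ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = (x1**2 - x0**2) + (y1**2 - y0**2) + r0**2 - r1**2
    b2 = (x2**2 - x0**2) + (y2**2 - y0**2) + r0**2 - r2**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Example: stations 1 km apart; true position is (300 m, 400 m).
stations = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
ranges = [500.0, 650_000 ** 0.5, 450_000 ** 0.5]
delays = [r / C for r in ranges]
print(trilaterate(stations, delays))  # ~ (300.0, 400.0)
```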
[0120] The processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
[0121] FIG. 7C is a system diagram of the RAN 103 and the core network 106 according to an embodiment. As noted above, the RAN 103 may employ a UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 115. The RAN 103 may also be in communication with the core network 106. As shown in FIG. 7C, the RAN 103 may include Node-Bs 140a, 140b, 140c, which may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 115. The Node-Bs 140a, 140b, 140c may each be associated with a particular cell (not shown) within the RAN 103. The RAN 103 may also include RNCs 142a, 142b. It will be appreciated that the RAN 103 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment.
[0122] As shown in FIG. 7C, the Node-Bs 140a, 140b may be in communication with the RNC 142a. Additionally, the Node-B 140c may be in communication with the RNC 142b. The Node-Bs 140a, 140b, 140c may communicate with the respective RNCs 142a, 142b via an Iub interface. The RNCs 142a, 142b may be in communication with one another via an Iur interface. Each of the RNCs 142a, 142b may be configured to control the respective Node-Bs 140a, 140b, 140c to which it is connected. In addition, each of the RNCs 142a, 142b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.
[0123] The core network 106 shown in FIG. 7C may include a media gateway (MGW) 144, a mobile switching center (MSC) 146, a serving GPRS support node (SGSN) 148, and/or a gateway GPRS support node (GGSN) 150. While each of the foregoing elements are depicted as part of the core network 106, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
[0124] The RNC 142a in the RAN 103 may be connected to the MSC 146 in the core network 106 via an IuCS interface. The MSC 146 may be connected to the MGW 144. The MSC 146 and the MGW 144 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and land-line communications devices.
[0125] The RNC 142a in the RAN 103 may also be connected to the SGSN 148 in the core network 106 via an IuPS interface. The SGSN 148 may be connected to the GGSN 150. The SGSN 148 and the GGSN 150 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
[0126] As noted above, the core network 106 may also be connected to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
[0127] FIG. 7D is a system diagram of the RAN 104 and the core network 107 according to an embodiment. As noted above, the RAN 104 may employ an E-UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 116. The RAN 104 may also be in communication with the core network 107.
[0128] The RAN 104 may include eNode-Bs 160a, 160b, 160c, though it will be appreciated that the RAN 104 may include any number of eNode-Bs while remaining consistent with an embodiment. The eNode-Bs 160a, 160b, 160c may each include one or more
transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 116. In one embodiment, the eNode-Bs 160a, 160b, 160c may implement MIMO technology. Thus, the eNode-B 160a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a.
[0129] Each of the eNode-Bs 160a, 160b, 160c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 7D, the eNode-Bs 160a, 160b, 160c may communicate with one another over an X2 interface.
[0130] The core network 107 shown in FIG. 7D may include a mobility management entity (MME) 162, a serving gateway 164, and a packet data network (PDN) gateway 166. While each of the foregoing elements are depicted as part of the core network 107, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
[0131] The MME 162 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via an S1 interface and may serve as a control node. For example, the MME 162 may be responsible for authenticating users of the WTRUs 102a, 102b, 102c, bearer
activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 102a, 102b, 102c, and the like. The MME 162 may also provide a control plane function for switching between the RAN 104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.
[0132] The serving gateway 164 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via the S1 interface. The serving gateway 164 may generally route and forward user data packets to/from the WTRUs 102a, 102b, 102c. The serving gateway 164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 102a, 102b, 102c, managing and storing contexts of the WTRUs 102a, 102b, 102c, and the like.
[0133] The serving gateway 164 may also be connected to the PDN gateway 166, which may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
[0134] The core network 107 may facilitate communications with other networks. For example, the core network 107 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and land-line communications devices. For example, the core network 107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 107 and the PSTN 108. In addition, the core network 107 may provide the WTRUs 102a, 102b, 102c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
[0135] FIG. 7E is a system diagram of the RAN 105 and the core network 109 according to an embodiment. The RAN 105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 117. As will be further discussed below, the communication links between the different functional entities of the WTRUs 102a, 102b, 102c, the RAN 105, and the core network 109 may be defined as reference points.
[0136] As shown in FIG. 7E, the RAN 105 may include base stations 180a, 180b, 180c, and an ASN gateway 182, though it will be appreciated that the RAN 105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment. The base stations 180a, 180b, 180c may each be associated with a particular cell (not shown) in the RAN 105 and may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 117. In one embodiment, the base stations 180a, 180b, 180c may implement MIMO technology. Thus, the base station 180a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a. The base stations 180a, 180b, 180c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like. The ASN gateway 182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 109, and the like.
[0137] The air interface 117 between the WTRUs 102a, 102b, 102c and the RAN 105 may be defined as an R1 reference point that implements the IEEE 802.16 specification. In addition, each of the WTRUs 102a, 102b, 102c may establish a logical interface (not shown) with the core network 109. The logical interface between the WTRUs 102a, 102b, 102c and the core network 109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
[0138] The communication link between each of the base stations 180a, 180b, 180c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations. The communication link between the base stations 180a, 180b, 180c and the ASN gateway 182 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 102a, 102b, 102c.
[0139] As shown in FIG. 7E, the RAN 105 may be connected to the core network 109. The communication link between the RAN 105 and the core network 109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example. The core network 109 may include a mobile IP home agent (MIP-HA) 184, an authentication, authorization, accounting (AAA) server 186, and a gateway 188. While each of the foregoing elements are depicted as part of the core network 109, it will be
appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
[0140] The MIP-HA 184 may be responsible for IP address management, and may enable the WTRUs 102a, 102b, 102c to roam between different ASNs and/or different core networks. The MIP-HA 184 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices. The AAA server 186 may be responsible for user authentication and for supporting user services. The gateway 188 may facilitate interworking with other networks. For example, the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and land-line communications devices. In addition, the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
[0141] Although not shown in FIG. 7E, it will be appreciated that the RAN 105 may be connected to other ASNs and the core network 109 may be connected to other core networks. The communication link between the RAN 105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 102a, 102b, 102c between the RAN 105 and the other ASNs. The communication link between the core network 109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
[0142] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element may be used alone or in any combination with the other features and elements. In addition, techniques described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
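As a non-authoritative sketch of how the contribution-side rate adaptation claimed below might be realized in software, the following Python fragment encodes each captured segment at a rate chosen from the uplink bandwidth measured just before that segment is encoded, and re-sends the manifest whenever the chosen rate changes. The rate ladder, the safety margin, and all helper callables (estimate_uplink_bps, encode_segment, build_manifest, upload) are assumptions for illustration, not elements of the disclosure:

```python
# Hedged sketch of a dynamic adaptive contribution streaming (DACS) client
# loop; every helper callable here is an assumed stand-in.
RATE_LADDER_BPS = [500_000, 1_000_000, 2_500_000, 5_000_000]  # example ladder
SAFETY = 0.8  # encode below the measured uplink to leave upload headroom


def pick_rate(available_bps):
    """Return the highest ladder rate that fits within the safety budget."""
    budget = available_bps * SAFETY
    fitting = [r for r in RATE_LADDER_BPS if r <= budget]
    return fitting[-1] if fitting else RATE_LADDER_BPS[0]


def stream_contribution(capture, server, estimate_uplink_bps, encode_segment,
                        build_manifest, upload, segment_seconds=2):
    """Encode and upload each captured segment at a rate chosen from the
    bandwidth measured just before that segment is encoded, re-sending the
    manifest to the server whenever the chosen rate changes."""
    sent_rates = []
    current_rate = None
    for index, raw_segment in enumerate(capture):  # raw frames, per segment
        rate = pick_rate(estimate_uplink_bps())    # bandwidth changes over time
        if rate != current_rate:
            # The manifest advertises the rate in use; update it on change.
            manifest = build_manifest(rates=[rate], first_new_segment=index)
            upload(server, "manifest.mpd", manifest)
            current_rate = rate
        media = encode_segment(raw_segment, bitrate=rate,
                               duration=segment_seconds)
        upload(server, f"seg-{index:05d}-{rate}.m4s", media)  # in real time
        sent_rates.append(rate)
    return sent_rates  # later usable by a post-session backfill pass
```

Keeping the encode rate below a fraction (SAFETY) of the measured uplink leaves headroom so that each segment can finish uploading within roughly its own duration, which is what allows the contribution to remain real-time as conditions fluctuate.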

Claims

What is claimed is:
1. A method for sending rate adaptive streaming video content comprising:
receiving video content on a client device;
determining available bandwidth of a wireless communication channel of a consumer grade network for communication with a server, wherein the available bandwidth changes over time;
determining a first rate to encode the video content based on the available bandwidth;
generating a manifest file for the video content, wherein the manifest file comprises the first rate;
sending the manifest file to the server;
encoding the video content to generate a first segment encoded at the first rate; and
sending the first segment to the server in real-time.
2. The method of claim 1, further comprising:
determining a change in the available bandwidth of the wireless communication channel of the consumer grade network for communication with the server;
determining a second rate based on the change in the available bandwidth, the second rate being different from the first rate;
encoding the video content to generate a second segment encoded at the second rate; and
sending the second segment to the server in real-time.
3. The method of claim 2, wherein the first segment immediately precedes the second segment in time.
4. The method of claim 2, further comprising:
generating an updated manifest file for the video content, the updated manifest file comprising the second rate; and
sending the updated manifest file to the server.
5. The method of claim 2, further comprising:
after an entirety of the video content has been sent to the server, encoding the video content to generate the first segment encoded at the second rate and the second segment encoded at the first rate; and
sending the first segment encoded at the second rate and the second segment encoded at the first rate to the server.
6. The method of claim 1, wherein the manifest file comprises information related to the first segment encoded at the first rate.
7. The method of claim 1, further comprising:
dynamically updating the manifest file; and
sending the updated manifest file to the server.
8. The method of claim 7, wherein dynamically updating the manifest file comprises:
monitoring the available bandwidth;
identifying a change in the available bandwidth; and
updating the manifest file based on the change in the available bandwidth.
9. The method of claim 1, wherein the manifest file is generated based on the encoded video content.
10. The method of claim 1, wherein receiving the video content on a client device comprises recording the video content using the client device.
11. A method for sending rate adaptive streaming video content comprising:
receiving video content on a client device;
determining a plurality of rates for encoding the video content, wherein the plurality of rates comprise a first rate and a second rate, wherein the first rate is associated with a first level of quality for the video content, the second rate is associated with a second level of quality for the video content, and the first level of quality is lower than the second level of quality;
generating a manifest file comprising the plurality of rates;
sending the manifest file to a server;
encoding the video content at the first rate; and
sending the video content encoded at the first rate to the server in real-time.
12. The method of claim 11, further comprising:
encoding the video content at the second rate; and
sending the video content encoded at the second rate to the server after an entirety of the video content is encoded at the first rate and sent to the server.
13. A method for sending rate adaptive streaming video content comprising:
receiving video content on a client device, wherein the video content is recorded by the client device;
determining available bandwidth of a wireless communication channel for communication with a server, wherein the available bandwidth changes over time;
determining a plurality of rates to encode the video content;
encoding the video content to generate a plurality of segments, wherein each of the plurality of segments is encoded at a rate determined based on the available bandwidth at the time of its encoding; and
sending each of the plurality of segments to the server in real-time.
14. A wireless transmit/receive unit (WTRU) comprising:
a memory; and
a processor configured to:
receive video content on a client device;
determine available bandwidth of a wireless communication channel of a consumer grade network for communication with a server, wherein the available bandwidth changes over time;
determine a first rate to encode the video content based on the available bandwidth;
generate a manifest file for the video content, wherein the manifest file comprises the first rate;
send the manifest file to the server;
encode the video content to generate a first segment encoded at the first rate; and
send the first segment to the server in real-time.
15. The WTRU of claim 14, wherein the processor is further configured to:
determine a change in the available bandwidth of the wireless communication channel of the consumer grade network for communication with the server;
determine a second rate based on the change in the available bandwidth, the second rate being different from the first rate;
encode the video content to generate a second segment encoded at the second rate; and
send the second segment to the server in real-time.
16. The WTRU of claim 15, wherein the first segment immediately precedes the second segment in time.
17. The WTRU of claim 15, wherein the processor is further configured to:
generate an updated manifest file for the video content, the updated manifest file comprising the second rate; and
send the updated manifest file to the server.
18. The WTRU of claim 15, wherein the processor is further configured to:
after an entirety of the video content has been sent to the server, encode the video content to generate the first segment encoded at the second rate and the second segment encoded at the first rate; and
send the first segment encoded at the second rate and the second segment encoded at the first rate to the server.
19. The WTRU of claim 14, wherein the manifest file comprises information related to the first segment encoded at the first rate.
20. The WTRU of claim 14, wherein the processor is further configured to:
dynamically update the manifest file; and
send the updated manifest file to the server.
21. The WTRU of claim 20, wherein the processor is further configured to:
monitor the available bandwidth;
identify a change in the available bandwidth; and
update the manifest file based on the change in the available bandwidth.
22. The WTRU of claim 14, wherein the manifest file is generated based on the encoded video content.
23. The WTRU of claim 14, wherein receiving the video content on a client device comprises recording the video content using the client device.
24. A wireless transmit/receive unit (WTRU) comprising:
a memory; and
a processor configured to:
receive video content on a client device;
determine a plurality of rates for encoding the video content, wherein the plurality of rates comprise a first rate and a second rate, wherein the first rate is associated with a first level of quality for the video content, the second rate is associated with a second level of quality for the video content, and the first level of quality is lower than the second level of quality;
generate a manifest file comprising the plurality of rates;
send the manifest file to a server;
encode the video content at the first rate; and
send the video content encoded at the first rate to the server in real-time.
25. The WTRU of claim 24, wherein the processor is further configured to:
encode the video content at the second rate; and
send the video content encoded at the second rate to the server after an entirety of the video content is encoded at the first rate and sent to the server.
26. A wireless transmit/receive unit (WTRU) comprising:
a memory; and
a processor configured to:
receive video content on a client device, wherein the video content is recorded by the client device;
determine available bandwidth of a wireless communication channel for communication with a server, wherein the available bandwidth changes over time;
determine a plurality of rates to encode the video content;
encode the video content to generate a plurality of segments, wherein each of the plurality of segments is encoded at a rate determined based on the available bandwidth at the time of its encoding; and
send each of the plurality of segments to the server in real-time.
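Again for illustration only, a post-session backfill pass in the spirit of claims 5, 12, 18, and 25 might, once the entire live contribution has been sent, re-encode at a higher rate any segment that was uploaded at a lower one. This hedged sketch continues the naming of the earlier fragment; capture_store, encode_segment, and upload are hypothetical stand-ins:

```python
# Hedged sketch of post-session quality backfill; all names are assumptions.
def backfill(capture_store, server, sent_rates, target_rate,
             encode_segment, upload, segment_seconds=2):
    """After the entire live contribution has been sent, re-encode every
    segment that went out below target_rate and upload the replacement,
    so the server ends up holding a higher-quality copy of each segment."""
    for index, live_rate in enumerate(sent_rates):
        if live_rate >= target_rate:
            continue  # this segment already reached the target quality
        raw_segment = capture_store[index]  # original captured frames
        media = encode_segment(raw_segment, bitrate=target_rate,
                               duration=segment_seconds)
        upload(server, f"seg-{index:05d}-{target_rate}.m4s", media)
```

Running the replacement uploads only after the live session ends means the backfill never competes with real-time segments for the constrained uplink.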
PCT/US2016/038114 2015-06-17 2016-06-17 Dynamic adaptive contribution streaming Ceased WO2016205674A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562181077P 2015-06-17 2015-06-17
US62/181,077 2015-06-17

Publications (1)

Publication Number Publication Date
WO2016205674A1 true WO2016205674A1 (en) 2016-12-22

Family

ID=56345219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/038114 Ceased WO2016205674A1 (en) 2015-06-17 2016-06-17 Dynamic adaptive contribution streaming

Country Status (1)

Country Link
WO (1) WO2016205674A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6389473B1 (en) * 1998-03-24 2002-05-14 Geo Interactive Media Group Ltd. Network media streaming
US20110129011A1 (en) * 2009-11-30 2011-06-02 Alcatel-Lucent Usa Inc. Method Of Opportunity-Based Transmission Of Wireless Video
US20120158985A1 (en) * 2010-12-21 2012-06-21 Microsoft Corporation Distributed smooth streaming utilizing dynamic manifests
US20120259996A1 (en) * 2011-04-06 2012-10-11 Sony Corporation Reception apparatus, reception method, and program
EP2680603A1 (en) * 2012-06-29 2014-01-01 Orange Processing technique to provide real-time content to client entities
WO2014041547A1 (en) * 2012-09-13 2014-03-20 Yevvo Entertainment Inc. Live video broadcasting from a mobile device
WO2015007795A1 (en) * 2013-07-16 2015-01-22 Bitmovin Gmbh Apparatus and method for cloud assisted adaptive streaming

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SEO BEOMJOO ET AL: "An experimental study of video uploading from mobile devices with HTTP streaming", PROCEEDING MMSYS '12 PROCEEDINGS OF THE 3RD MULTIMEDIA SYSTEMS CONFERENCE, 22 February 2012 (2012-02-22), NY, USA, XP055056191, ISBN: 978-1-45-031131-1, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/2160000/2155589/p215-seo.pdf?ip=145.64.134.245&acc=ACTIVE%20SERVICE&key=986B26D8D17D60C88D75A192E3112143&CFID=217259358&CFTOKEN=92200429&__acm__=1368608861_4bcea2a97529dafd945916b34fc14fcc> [retrieved on 20130312], DOI: 10.1145/2155555.2155589 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019105528A1 (en) * 2017-11-28 2019-06-06 Telefonaktiebolaget Lm Ericsson (Publ) Controlled uplink adaptive streaming based on server performance measurement data
US11223862B2 (en) 2017-11-28 2022-01-11 Telefonaktiebolaget Lm Ericsson (Publ) Controlled uplink adaptive streaming based on server performance measurement data
US10581707B2 (en) 2018-04-10 2020-03-03 At&T Intellectual Property I, L.P. Method and apparatus for selective segment replacement in HAS video streaming adaptation
US11032345B2 (en) 2018-05-10 2021-06-08 Microsoft Technology Licensing, Llc Client side data stream processing
US20220321627A1 (en) * 2021-03-31 2022-10-06 Tencent America LLC Methods and apparatus for just-in-time content preparation in 5g networks
US12219002B2 (en) * 2021-03-31 2025-02-04 Tencent America LLC Methods and apparatus for just-in-time content preparation in 5G networks
CN119182807A (en) * 2024-11-21 2024-12-24 成都秦川物联网科技股份有限公司 Data aggregation system and method for industrial Internet of things sub-service platform

Similar Documents

Publication Publication Date Title
US20230209109A1 (en) Systems and methods for generalized http headers in dynamic adaptive streaming over http (dash)
US12021883B2 (en) Detecting man-in-the-middle attacks in adaptive streaming
US10880349B2 (en) Quality-driven streaming
JP6455741B2 (en) Streaming with video orientation adjustment (CVO)
US20140019635A1 (en) Operation and architecture for dash streaming clients
WO2016205674A1 (en) Dynamic adaptive contribution streaming
WO2016172328A1 (en) Content protection and modification detection in adaptive streaming and transport streams
WO2017100569A1 (en) Trick mode restrictions for mpeg dash
HK40107189A (en) Systems and methods for generalized http headers in dynamic adaptive streaming over http (dash)
WO2016004237A1 (en) Media presentation description signaling in typical broadcast content
HK1204513B (en) Quality-driven streaming

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16734799

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16734799

Country of ref document: EP

Kind code of ref document: A1