HK1140280A - Synchronous delivery of media content in a collaborative environment - Google Patents
Description
Technical Field
The present invention relates generally to systems and methods for displaying and controlling time-based media content, such as video and/or audio content, via a network, such as the internet or an intranet, and more particularly to interfaces and methods for synchronously displaying and controlling time-based media content in a collaborative environment.
Background
Sharing time-based media content (e.g., video or audio files) in a social or collaborative environment, such as an instant messaging environment, can be difficult. For example, users may typically share references (e.g., send links/URLs) to media objects with each other in a conversation window (or other social or collaborative environment) of an instant messaging application. However, because each user independently controls the display (e.g., stops, pauses, etc.), the users have no practical way to keep the display of the media object synchronized with the accompanying conversation. For example, when sharing media objects, it is often desirable to talk to and share reactions with other users more or less in real time. However, when a video URL is sent during an instant messaging conversation, the out-of-sync display of the media object often makes it difficult for the sender and recipient to converse efficiently about it, leading to exchanges such as "is it playing yet?" or "tell me when you get to the part with the mountain," and the like.
In addition, controlling the display of time-based media content in a social or collaborative environment presents usability difficulties. For example, even if the users coordinate the start of play of a media object, a display control action on one user device (e.g., one user pausing or rewinding a shared video) may cause that device to lose synchronization with the other user devices.
Accordingly, it is desirable to synchronously communicate and share time-based media content, such as audio and/or video, among a group of people. In addition, it is desirable to allow such media to be controlled in real time by more than one member of the community in a synchronized manner.
Disclosure of Invention
According to one aspect and one example of the present invention, an apparatus and method are provided for facilitating the synchronized display of a time-based media object (e.g., a video or audio file) with a remote device. In one example, the apparatus includes: control logic for controlling display of the media object in response to a user input command with the first device; and communication logic to cause transmission of a user input command (e.g., play, pause, skip, etc.) and a metric associated with the media object (e.g., time or frame reference) to the second device for synchronizing display of the media object by the first and second devices.
The communication logic may also receive user input commands and metrics from the second device, the control logic controlling the display of the media object by the first device based on the received commands and metrics. Thus, in one example, all user devices sharing the media object may input a command to control the display of the media object in synchronization with all user devices.
In one example, the first and second devices may communicate via peer-to-peer communication, and the apparatus may be further operable to add synchronization control of the media object to a communication channel between two or more user devices. Further, in one example, the apparatus may also leverage existing instant messaging applications, including existing buddy lists, profile information, and the like.
The user input commands may include play, pause, rewind/fast forward, seek, and the like. A user input command may be transmitted in response to a user selection of a control button associated with the displayed interface or a command entered in an associated instant messaging conversation. Further, the apparatus may be included in various devices, such as a personal computer, a mobile phone and/or mobile personal entertainment device, a DVR, and so forth.
Thus, in some examples described herein, the present invention provides a web-based architecture for sharing media objects among multiple users. Software (or other logic) located on each user device creates a communication channel between the user devices to synchronously deliver the user-specified media objects. Any user may control the playback and display of media in real-time by, for example, stopping, fast-forwarding, or slow-motion playing of the media object. Because the media is synchronized between user devices, each user device screen simultaneously displays the same segment of media. In this way, the present invention may facilitate collaborative communication by presenting synchronized media.
According to another aspect, a method for facilitating synchronous display of time-based media objects by a plurality of devices is provided. In one example, the method includes: controlling display of the media object in response to a user input command with the first device; and causing the user input command and the metric associated with the media object to be transmitted to a second device for synchronizing display of the media object by the first and second devices.
According to another aspect, a computer-readable medium is provided that includes instructions for facilitating synchronous display of time-based media objects by a plurality of devices. In one example, the instructions are to: controlling display of the media object in response to a user input command with the first device; and causing the user input command and the metric associated with the media object to be transmitted to a second device for synchronizing display of the media object by the first and second devices.
The invention and its various aspects will be better understood in view of the following detailed description taken in conjunction with the accompanying drawings and claims.
Drawings
FIG. 1 schematically illustrates an exemplary system and environment for communicating between two user devices sharing media content, such as time-based media objects;
FIG. 2A schematically illustrates an exemplary interface for displaying and controlling synchronized delivery of time-based media objects;
FIG. 2B illustrates a screen shot of an exemplary interface for displaying and controlling time-based media objects in conjunction with an instant messaging application;
FIG. 3 illustrates an exemplary signaling diagram between two users for synchronizing the display and control of media objects, according to one example;
FIGS. 4A and 4B illustrate an exemplary method for synchronizing the display of media objects;
FIG. 5 illustrates an exemplary method for controlling the display of synchronized media objects; and
FIG. 6 illustrates an exemplary computing system that may be used to implement the processing functionality of various aspects of the invention.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the invention. Descriptions of specific devices, techniques, and applications are provided only as examples. Various modifications to the examples described herein will be readily apparent to those of ordinary skill in the art, and the general principles defined herein may be applied to other examples and applications without departing from the spirit and scope of the invention. Thus, the present invention is in no way intended to be limited to the examples described and illustrated herein, but rather is to be accorded the full scope consistent with the claims.
In accordance with one aspect of the present invention, an apparatus and interface are provided for sharing time-based content (e.g., media objects) in a near-synchronous manner. In one example, software (or other logic) located on the user device creates a communication channel between at least one other user device for synchronized display and control of user-specified media content. The media content may include various media objects, such as one or more audio and/or video files, accessible (locally or remotely) by at least one user device. In addition, the media content may be provided by a third party.
In one example, each user wishing to share media content accesses logic (e.g., software) that causes an interface to be displayed that facilitates the synchronized delivery of the media content between the user devices. The software may be executed locally on the user's system, e.g., as a plug-in or applet downloaded or installed on the user's device, or remotely, e.g., as a servlet accessed through a web browser. In addition, the software, located locally or remotely, facilitates direct or indirect connections between users who wish to share media content. For example, a direct IP-to-IP (peer-to-peer) connection may be created between user devices, or an indirect connection may be created through a server that streams media content. In either case, a communication connection is established between the user devices to synchronize the display and control of the shared media object.
A user wishing to share media content may communicate a reference, such as a Uniform Resource Identifier (URI), associated with a media object to one or more user devices. Based on the URI, each user device may establish a synchronized exchange. The software ensures that each user is always viewing the same or substantially the same piece of media content. For example, a signal may be sent between users indicating the current location of the media object. The location may include a frame number, a time interval, or any other metric that can ensure synchronization between users.
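The command-plus-metric signaling described above can be sketched as a small message format. The following is an illustrative Python sketch, not the patent's actual protocol; the JSON encoding, field names, and millisecond units are assumptions made for the example.

```python
import json

def make_sync_message(command, position_ms, media_uri):
    # Hypothetical wire format: the user command plus a metric
    # (here a millisecond offset) locating the current position
    # in the shared media object.
    return json.dumps({
        "command": command,          # e.g., "play", "pause", "seek"
        "position_ms": position_ms,  # metric: time reference into the media
        "media": media_uri,          # URI identifying the shared object
    })

def parse_sync_message(raw):
    # Decode a received message back into (command, metric, URI).
    msg = json.loads(raw)
    return msg["command"], msg["position_ms"], msg["media"]

# Round trip: a "pause at 10 seconds" message for a shared video.
raw = make_sync_message("pause", 10_000, "http://example.com/video.mpg")
cmd, pos, uri = parse_sync_message(raw)
```

Any metric that identifies a unique location (frame number instead of time, for instance) could be carried in the same way.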
After establishing a connection between the user devices, each user may control the media content via an interface displayed on each user device. For example, if one user enters a command, such as pause a video, the command and time reference will be transmitted to the other user devices to pause the video at the same location for all users sharing the video. Additionally, if one user wishes to jump to a particular segment or location of a media object, the jump and time are communicated to all users, and they are directed to the same segment or location of the media object accordingly.
According to some examples, other forms of communication, such as instant messaging, audio, and/or video conferencing, may operate in conjunction with the described interface. In these examples, users sharing media content may control playback of the media and collaborate through other forms of communication. Additionally, in one example, the invention may be implemented for an instant messaging service that allows users to enter comments about media being shared. Additionally, the user device may include a personal computer (e.g., a desktop computer or a laptop computer) as well as other media platforms, such as a television, a mobile phone, a Digital Video Recorder (DVR), or a personal media player (e.g., MP3 or a video player). For example, each user may share television-based media content using a DVR device that is operable to communicate with other user devices (whether similar or different, e.g., other DVR devices, personal computers, personal video players, etc.) via a network.
For convenience, videos are sometimes used and described as examples of media content and media objects manipulated by exemplary devices, interfaces, and methods; however, those skilled in the art will recognize that the various examples are similarly or equally applicable to other media objects (e.g., viewing and controlling a media object may be applicable to viewing a video file (with or without audio), listening to an audio file such as a soundtrack, or a combination thereof) subject to appropriate modification and using appropriate other functionality.
FIG. 1 schematically illustrates an exemplary system and environment for communicating between two user devices 110 sharing media content, such as time-based media objects. In particular, FIG. 1 illustrates two user devices 110 that may communicate with each other to share a media object, and may also communicate with one or more of media source 120, web server 122, and advertisement server 130 via network 112.
In one example, user devices 110 each include suitable communication logic 102 to interface and communicate (partially or wholly) with other devices 110, as well as media sources 120, web servers 122, and the like, via network 112. For example, communication logic 102 may cause a command (e.g., a user-entered command such as play, pause, or fast forward) and a metric reference (e.g., time or frame) associated with the shared display of the media object to be transmitted (and/or received) with other devices 110. In addition, user device 110 also includes control logic 104 for controlling the display of media content associated therewith in response to such commands (generated by the user device itself in response to user input or by other user devices in communication therewith).
Each user device 110 is further operable to display an interface (see, e.g., interface 200 of fig. 2A) for displaying and controlling media objects in a synchronized manner, where the interface may be facilitated locally by the user device 110 via logic executed locally by the user device 110, e.g., via a plug-in or applet downloaded or installed on the user device 110, or remotely, e.g., by launching the applet through a web browser from the web server 122. Further, logic, either locally or remotely located, may facilitate a direct or indirect connection between user devices 110 (i.e., between device A and device B) sharing media content. For example, a direct IP-to-IP (peer-to-peer) connection may be created between user devices 110, or an indirect connection may be created through a server (e.g., web server 122) used to stream media content to user devices 110. In either case, a communication connection is established between the user devices 110 to relay commands and metrics for the synchronized display and control of the shared media objects.
Further, user device 110 may include logic for receiving and loading data conforming to any media format, such as still images (e.g., JPEG, TIFF), video (e.g., MPEG, AVI, Flash), or audio (e.g., MP3, OGG). Note that the format in which the media objects are displayed need not be the same when sharing the media objects; for example, the first user device 110 may display the media object via an MPEG player and the second user device 110 displays the media object via a Flash player.
User device 110 may include suitable hardware and software for performing the described functions, such as a processor connected to an input device (e.g., a keyboard), a network interface, memory, and a display. The memory may include logic or software operable in conjunction with the device to perform some of the functions described herein. The device is operable to include suitable interfaces for messaging tools such as an email inbox, Instant Messaging (IM), Short Messaging Service (SMS), Multimedia Messaging Service (MMS), and the like. The device is further operable to display a web browser for accessing the internet or a user account, including, for example, a webmail account such as a Yahoo! Mail account or a Hotmail account, and the like. User device 110 may also include Wireless Application Protocol (WAP) features or other data communication protocols suitable for communication via network 112.
The network 112 may include a Wide Area Network (WAN) such as the internet, a wireless network such as a wireless gateway (e.g., a cellular, satellite, or other wireless network), a wired network such as a cable or fiber optic network, or a combination of wireless and wired systems. User devices 110 may communicate, in part or in whole, via wireless or hardwired communication (e.g., ethernet, IEEE 802.11b wireless, etc.). Additionally, communications between user device 110 and media source 120 may include (or access) various servers or devices, such as advertisement server 130, mail server (not shown), mobile server (not shown), and so forth.
Media source 120, web server 122, and/or advertisement server 130 may include logic to receive user credentials, user input, contextual information, media objects, and the like. To this end, media source 120, web server 122, and/or advertisement server 130 may utilize protocols such as the Common Gateway Interface (CGI) and associated applications (or "scripts"), or Java servlets (e.g., Java applications running on a wireless server, web server, etc.), to present information and receive input from user device 110. Media source 120, web server 122, and/or advertisement server 130 (although described separately herein) may actually comprise multiple servers, computers, devices, etc. (wired and/or wireless) that communicate and cooperate to perform some or all of the functions described herein. In addition, media source 120, web server 122, and/or advertisement server 130 may be implemented in various forms and include various hardware, software, or firmware to perform the examples described herein.
Additionally, the advertisement server 130 may operate to deliver advertisements to the user device 110. For example, the ad server 130 may include logic to cause advertisements to be displayed with or in association with the delivered media content based on various factors such as the media content accessed or delivered. In other examples, the advertisement may alternatively or additionally be based on user profile information associated with the user device 110 (e.g., accessed via the user device 110 or the web server 122). In other examples, the advertisement may be randomly generated or associated with the user device 110.
Fig. 2A schematically illustrates an exemplary interface 200 for displaying and controlling synchronized delivery of time-based media objects, and fig. 2B illustrates a screen shot of an exemplary interface 200B for displaying and controlling time-based media objects in conjunction with an instant messaging application (and may be referenced in conjunction with fig. 2A).
The interface 200 generally includes a display portion 202 for displaying media objects (e.g., video files). In other examples, the media object may comprise an audio file such that display portion 202 is not used to display images associated with the received media object. In such a case, the display portion 202 may include a status bar indicating playback progress of the audio file (e.g., the elapsed time or the time remaining).
The interface 200 also includes controls 204a-204d for controlling the display of media objects in response to user selections. Controls 204a-204d may include controls for play, pause, fast forward, and rewind, as shown. Other controls may be envisaged, for example controls for slow motion, skipping N seconds backward/forward, stopping, etc.
Interface 200 also includes or interacts with communication logic for communicating user-entered commands to other user devices sharing the media object. For example, in response to the user selecting "pause," interface 200 causes the command to be transmitted to other user devices that share the media object. Further, a metric reference (e.g., time or frame number) associated with a command for controlling display of a media object is communicated to the other user devices, which execute the command in a synchronized manner. For example, continuing with the "pause" example, interface 200 will transmit the command "pause" together with a time, e.g., 10 seconds (or frame 550), to the other user device. The command, together with the metric, will cause the other user device to execute the pause command at the specified time. If the other user device(s) have played past the specified metric, the display will seek or jump back to the specified time and pause; on the other hand, if the other user device(s) have not yet reached the specified metric, the display will continue and pause when the metric is reached.
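The seek-back-or-continue behavior just described can be expressed as a small decision function. The sketch below is illustrative only; the function name, action labels, and millisecond units are assumptions, not part of the patent.

```python
def apply_remote_pause(local_position_ms, target_position_ms):
    """Decide how a device honors a remote 'pause at metric' command:
    seek back and pause if it has played past the metric, or keep
    playing and pause on arrival if it has not reached the metric."""
    if local_position_ms > target_position_ms:
        return ("seek_and_pause", target_position_ms)      # overshot: jump back
    if local_position_ms < target_position_ms:
        return ("pause_when_reached", target_position_ms)  # not there yet
    return ("pause_now", target_position_ms)               # already at the metric

# A device already at 12 s seeks back to the 10 s pause point ...
assert apply_remote_pause(12_000, 10_000) == ("seek_and_pause", 10_000)
# ... while a device at 8 s keeps playing and pauses on arrival.
assert apply_remote_pause(8_000, 10_000) == ("pause_when_reached", 10_000)
```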
Additionally, the user interface 200 may include a communication section 240 that may display a conversation window associated with an instant messaging application or other social or collaborative application, for example. In one example, the user may alternatively or additionally control the display of media objects in display portion 202 via text input commands. For example, a user entering the text "pause" or "play" may cause interface 200 to control the media object in display portion 202 and pass the command to other user devices in communication therewith.
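Mapping text entered in the conversation window to playback commands might look like the following sketch. The keyword set and exact-match rule are assumptions; a real implementation would need a more careful way to distinguish commands from ordinary chat.

```python
# Hypothetical set of recognized control keywords.
PLAYBACK_KEYWORDS = {"play", "pause", "stop", "rewind", "fast forward"}

def extract_control_command(chat_line):
    """Return a playback command if the chat line is exactly a known
    control keyword; otherwise treat it as ordinary conversation."""
    text = chat_line.strip().lower()
    return text if text in PLAYBACK_KEYWORDS else None
```

A recognized command would both control the local display portion 202 and be forwarded to the other user devices, exactly as a button press would be.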
In other examples, communication portion 240 may include a voice over IP (VoIP) interface, video telephony, text messaging interface, a "smart" DVR remote control, or other collaborative or social applications. Further, the communication portion 240 and the display portion 202 need not be included in the same window or displayed simultaneously with the user device.
Referring to FIG. 2B, an exemplary screen shot of another interface 200b is illustrated. Interface 200b is similar to interface 200, including display portion 202b, communication portion 240b, and controls 204, and therefore only the differences will be discussed in detail. In this example, the communication portion 240b is implemented as an instant messaging application through which a user may exchange links to media objects, media object files, text, voice, etc. Further, as previously described, the user may enter commands, such as "pause" or "play," in lower window 242b, which operate, in conjunction with the remote device/user, to control the display of media objects in display portion 202b.
The interface 200b may also include various other applications associated therewith, from editing operations (e.g., clip(s), grab, cut, etc.) to communication operations (e.g., email, save, share, etc.).
Fig. 3 illustrates an exemplary signaling diagram between user device A and user device B for initiating a sharing session and exchanging commands, according to one example. The described communication may be performed in an environment similar or dissimilar to the environment described with reference to fig. 1.
In one example, software to facilitate media delivery and communication between user device A and user device B is received from a web server at 440 and 442. For example, the software may be executed locally with user devices A and B, e.g., as a received plug-in or applet for generating a user interface such as user interface 200 of fig. 2A. Alternatively, the user may initiate a servlet through a web server for facilitating communication between user device A and user device B.
User device a may then transmit an indication of the shared media object to user device B at 450. For example, user device A may communicate a link to a media object accessible by device B. The link may be to a media object of a remote media source (e.g., a media server) or to a media object located at user device a. Additionally, user device A may transfer the media object itself to user device B initially or in response to acceptance by the user of user device B.
In one example, user device A may send a desired shared media object and references or files associated with the following software to user device B: the software is used to display an interface that facilitates synchronized media display between user device a and user device B. Further, before or after transmitting the reference or file for sharing, the user devices a and B may open a communication channel, such as a communication channel similar to that of a conventional IM application.
In examples where the media object is located at a third party media source (e.g., a video and/or music server), user devices A and B request and receive the media object at 452a and 452b, respectively. In other examples, the media object may be transferred (directly or via a remote server) from one of user devices A and B to the other, such that both user devices A and B have access to the same media object for synchronized display. For example, the media object may be stored on user device A, which may include a personal entertainment device (e.g., an MP3 player, iPod™, etc.), a DVR (e.g., TiVo™, etc.), and the like.
User device B then sends an indication to user device A at 454 that user device B is ready to play the media object. Those of ordinary skill in the art will appreciate that user device B need not receive the complete media object data before sending such an indication, but need only receive data sufficient to begin displaying the media object.
In response to the indication from user device B at 454, user device A sends a "play" command to user device B at 456 (when user device A is also ready to play the media object). The command at 456 may also include a metric associated with the command, e.g., a metric corresponding to the beginning of the media object. Note that in other examples, the ready indication may originate from user device A and/or the play command may originate from user device B. In addition, in the case of multiple user devices, user device A may wait until all user devices indicate that they are ready before issuing a play command.
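The multi-device case, in which the initiator waits for every participant's ready indication before issuing play, can be sketched as follows. The class and method names are illustrative, not from the patent.

```python
class PlayCoordinator:
    """Collect 'ready' indications from all invited devices, then
    emit a single 'play' command with a starting metric of zero."""

    def __init__(self, invited_device_ids):
        self.pending = set(invited_device_ids)

    def mark_ready(self, device_id):
        self.pending.discard(device_id)
        if not self.pending:
            return ("play", 0)  # all devices ready: start from the beginning
        return None             # still waiting on at least one device

coordinator = PlayCoordinator({"B", "C"})
first = coordinator.mark_ready("B")   # still waiting on C
second = coordinator.mark_ready("C")  # all ready: broadcast play
```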
Additionally, user device A may delay playing the media object locally (e.g., on user device A) to account for the communication time to user device B. For example, the user device may apply a standard delay time, or estimate the communication time by pinging user device B, to reduce the time difference resulting from communication lag between user device A and user device B, such that the media object is displayed by user devices A and B in near synchronization (e.g., a display offset of less than 1 second in one example, and less than 500 ms in another example).
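A rough sketch of the ping-based estimate: time a few round trips to the peer and take half the average round-trip time as the amount by which to delay local playback. Here `ping_fn` is a placeholder for whatever echo mechanism the communication channel provides; it is an assumption, not a patent-defined interface.

```python
import time

def estimate_one_way_delay_ms(ping_fn, samples=3):
    """Estimate one-way delay to the peer as half the mean round-trip
    time over a few pings, so the sender can delay its own playback
    and start in near synchronization with the remote device."""
    rtts_ms = []
    for _ in range(samples):
        start = time.monotonic()
        ping_fn()  # one round trip to the remote device
        rtts_ms.append((time.monotonic() - start) * 1000.0)
    return sum(rtts_ms) / len(rtts_ms) / 2.0

# Simulated peer with a ~20 ms round trip: delay playback by roughly 10 ms.
delay_ms = estimate_one_way_delay_ms(lambda: time.sleep(0.02))
```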
In response to a user-entered command (e.g., a selection to pause, rewind, or skip back N seconds), user device A causes the command and an associated metric (e.g., time, frame number, etc.) to be transmitted to user device B at 458. The transmission of commands and metrics allows both user devices A and B to control the display of media objects in a similar manner and maintain synchronization. For example, a pause command is communicated with a metric, such as the time or frame number at which user device A paused, causing user device B to pause the display of the media object at the same metric reference. Thus, regardless of any drift between users, the pause will occur at the same point in the media object. Subsequent command (e.g., play) transfers may occur at 460 and 462 (similar to that described at 454 and 456).
Another exemplary command includes a skip command (e.g., skip N seconds forward/backward). The skip command may be transmitted as a seek command along with a destination metric. For example, if user device A is controlled by the user to skip back 10 seconds, user device A causes transmission of the seek command and a destination metric corresponding to 10 seconds before the current location of user device A. Thus, if the devices are slightly out of sync, they will jump or seek to the same reference point in the media object (though not necessarily both jumping by the same amount). Again, user devices A and B may communicate as described at 454 and 456 to synchronize playback after a seek command, and will resume playback from the same point of the media object.
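Converting a relative skip into an absolute seek target can be sketched in one line; transmitting the destination metric rather than the offset is what keeps slightly drifted devices aligned, since all of them land on the same absolute point. The clamp at zero, the function name, and the millisecond units are assumptions for illustration.

```python
def skip_to_seek_target(current_position_ms, skip_ms):
    """Convert a relative skip (negative = backward) into the absolute
    destination metric that is transmitted to the other devices."""
    return max(0, current_position_ms + skip_ms)

# Skipping back 10 s from 0:25 transmits an absolute target of 0:15.
assert skip_to_seek_target(25_000, -10_000) == 15_000
# A skip past the start clamps to the beginning of the media object.
assert skip_to_seek_target(3_000, -10_000) == 0
```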
Finally, one or both of user devices A and B may periodically update the other device with status commands and metrics to determine whether the media object is playing synchronously on each device. For example, one device may drift out of synchronization over time, and the status commands and metrics may cause one or more of the devices to pause or skip the display and resynchronize the devices. Of course, absolute synchronization is not necessary, and the system may set a desired tolerance within which the display is considered near-synchronized, e.g., synchronized to within about 1 second.
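One possible resynchronization policy consistent with the paragraph above: compare positions from a periodic status message and, when the drift exceeds the tolerance, either pause briefly (if ahead) or seek forward (if behind). The specific policy, action labels, and default tolerance are assumptions, not prescribed by the patent.

```python
def resync_action(local_ms, remote_ms, tolerance_ms=1000):
    """Decide a corrective action from a periodic status exchange:
    within tolerance do nothing; if ahead, pause long enough for the
    remote device to catch up; if behind, seek to the remote position."""
    drift_ms = local_ms - remote_ms
    if abs(drift_ms) <= tolerance_ms:
        return ("in_sync", 0)
    if drift_ms > 0:
        return ("pause_for_ms", drift_ms)  # we are ahead: wait
    return ("seek_to_ms", remote_ms)       # we are behind: jump forward

assert resync_action(10_200, 10_000) == ("in_sync", 0)
assert resync_action(13_000, 10_000) == ("pause_for_ms", 3_000)
```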
Figures 4A and 4B illustrate an exemplary method for sharing media content in a synchronized manner, from the perspectives of an initiating device and an invited device. For example, at 400, a first user device may send a media object (e.g., a media object file or a link/reference thereto) to at least one second user device. If the media object is remote, the first user device loads the media object or otherwise prepares it for play at 410. When a ready or play command is received from the other user device(s) at 420, the media object plays (the first user device may also issue the play command at 430, depending on the particular implementation).
The second user device receives the media object or a link thereto at 450 and begins loading or accessing the media object at 460 (the second user device may also access or load an appropriate interface for displaying the shared media object). When the media object is sufficiently loaded for playback, the second device may signal a ready indication to the first user device at 470 and play the media object in response to a received play command at 480. Again, in other examples, the second user device may instead wait for a ready signal from the first user device and itself issue the play command to the first user device.
FIG. 5 illustrates exemplary operations for controlling a shared media object between two or more users. First, a command is input with a first device at 500. At 510, the first device causes the command and an associated metric to be transmitted to the other devices that share the media object. At 520, the first user device may wait for a confirmation or ready indication that the second user device is ready to execute the command entered at 500. For example, the first user device may wait for an acknowledgment or ready indication for skip-forward and skip-backward commands before starting play at the new time/frame of the media object.
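The wait-for-acknowledgment flow of FIG. 5 can be sketched as a sender-side function. Here `send_fn` and `await_ack_fn` are placeholder hooks standing in for the communication channel; they are assumptions made for the sketch, not interfaces defined by the patent.

```python
def seek_with_ack(send_fn, await_ack_fn, target_ms):
    """Transmit a seek command with its destination metric, then wait
    for the other device(s) to acknowledge readiness before resuming
    playback at the new position."""
    send_fn(("seek", target_ms))
    if await_ack_fn():
        return ("play", target_ms)  # peers ready: resume together
    return ("hold", target_ms)      # no ack yet: stay paused

# Usage with stand-in hooks: record what was sent, always acknowledge.
sent = []
result = seek_with_ack(sent.append, lambda: True, 15_000)
```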
FIG. 6 illustrates an exemplary computing system 600 that can be used to implement processing functionality (e.g., as a user device, web server, media source, etc.) for various aspects of the invention. Those skilled in the relevant art will also recognize how to implement the invention using other computer systems or architectures. Computing system 600 may represent, for example, a user device (e.g., desktop computer, mobile phone, personal entertainment device, DVR, etc.), a mainframe, a server, or any other type of special or general purpose computing device as may be appropriate or appropriate for a given application or environment. Computing device 600 may include one or more processors, such as a processor 604. Processor 604 can be implemented using a general or special purpose processing engine such as, for example, a microprocessor, microcontroller or other control logic. In this example, processor 604 is connected to a bus 602 or other communication medium.
Computing system 600 can also include a main memory 608, preferably Random Access Memory (RAM) or other dynamic storage, for storing information and instructions to be executed by processor 604. Main memory 608 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 604. Computing system 600 may likewise include a read only memory ("ROM") or other static storage device coupled to bus 602 for storing static information and instructions for processor 604.
Computing system 600 may also include information storage mechanism 610, which may include, for example, a media drive 612 and a removable storage interface 620. The media drive 612 may include a drive or other mechanism to support fixed or removable storage media, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other fixed or removable media drive. Storage media 618 may include, for example, a hard disk, floppy disk, magnetic tape, optical disk, CD or DVD, or other fixed or removable medium that is read by and written to by media drive 612. As these examples illustrate, the storage media 618 may include a computer-readable storage medium having particular computer software or data stored therein.
In alternative embodiments, information storage mechanism 610 may include other similar media for allowing computer programs or other instructions or data to be loaded into computing system 600. Such media may include, for example, a removable storage unit 622 and an interface 620, such as a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, and other removable storage units 622 and interfaces 620 that allow software and data to be transferred from the removable storage unit 622 to computing system 600.
Computing system 600 may also include a communications interface 624. Communications interface 624 may be used to allow software and data to be transferred between computing system 600 and external devices. Examples of communications interface 624 may include a modem, a network interface (e.g., an Ethernet or other NIC card), a communications port (e.g., a USB port), a PCMCIA slot and card, etc. Software and data transferred via communications interface 624 are in the form of signals, which may be electronic, electromagnetic, optical, or other signals capable of being received by communications interface 624. These signals are provided to communications interface 624 via a channel 628. This channel 628 may carry signals and may be implemented using a wireless medium, wire or cable, fiber optics, or other communications medium. Some examples of a channel include a telephone line, a cellular telephone link, an RF link, a network interface, a local or wide area network, and other communications channels.
In this document, the terms "computer program product" and "computer-readable medium" may be used generally to refer to media such as, for example, memory 608, storage device 618, storage unit 622, or signals on channel 628. These and other forms of computer-readable media may be involved in providing one or more sequences of one or more instructions to processor 604 for execution. Such instructions, generally referred to as "computer program code" (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system 600 to perform features or functions of various embodiments of the present invention.
In an embodiment where the elements are implemented using software, the software may be stored in a computer-readable medium and loaded into computing system 600 using, for example, media drive 612, removable storage unit 622, or communications interface 624. The control logic (in this example, software instructions or computer program code), when executed by the processor 604, causes the processor 604 to perform the functions of the invention as described herein.
It will be appreciated that the above description for clarity has described embodiments of the invention with reference to different functional units and processors. However, it will be apparent that any suitable distribution of functionality between different functional units, processors or domains may be used without detracting from the invention. For example, functions illustrated to be performed by separate processors or controllers may be performed by the same processor or controller. Hence, references to specific functional units are only to be seen as references to suitable means for providing the described functionality rather than indicative of a strict logical or physical structure or organization.
Although the present invention has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the invention is limited only by the claims. In addition, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in accordance with the invention.
Furthermore, although individually listed, a plurality of means, elements or method steps may be implemented by e.g. a single unit or processor. Additionally, although individual features may be included in different claims, these may possibly advantageously be combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Furthermore, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather the feature may be equally applicable to other claim categories, as appropriate.
Also, the aspects of the invention described in connection with the embodiments may stand alone as the invention.
Also, it will be understood that various modifications and substitutions may be made by those skilled in the art without departing from the spirit and scope of the invention. The invention is not to be limited by the foregoing illustrative details, but rather is to be defined according to the claims.
Claims (39)
1. An apparatus for facilitating synchronous display of time-based media objects by a plurality of devices, the apparatus comprising:
control logic for controlling display of the media object in response to a user input command with a first device; and
communication logic to cause transmission of the user input command and a metric associated with the media object to a second device for synchronizing display of the media object by the first and second devices.
2. The apparatus of claim 1, wherein the communication logic is further to receive commands and metrics from the second device, and wherein the control logic is further to control display of the media object by the first device based on the received commands and metrics.
3. The apparatus of claim 1, wherein user input commands are communicated between the first and second devices via peer-to-peer communication.
4. The apparatus of claim 1, wherein the user input command is communicated via a communication channel of a collaborative application.
5. The apparatus of claim 1, wherein the user input command is communicated via a communication channel of an instant messaging application.
6. The apparatus of claim 1, further comprising display logic for displaying an interface for displaying the media object.
7. The apparatus of claim 1, wherein the metric comprises a time or frame reference associated with the media object.
8. The apparatus of claim 1, wherein the command comprises a pause command.
9. The apparatus of claim 1, wherein the command comprises a seek command, and wherein the metric is a destination of the seek command.
10. The apparatus of claim 1, wherein the user input command comprises a text input in an instant messaging application.
11. The apparatus of claim 1, wherein the media object comprises a video file.
12. The apparatus of claim 1, wherein the media object comprises an audio file.
13. The apparatus of claim 1, wherein the apparatus comprises a DVR.
14. The apparatus of claim 1, wherein the apparatus comprises a personal media player.
15. An apparatus for facilitating synchronous display of a time-based media object with a remote device, the apparatus comprising:
a display; and
communication logic to facilitate a synchronous communication channel with the remote device, wherein a user input command and a metric associated with a media object are transmitted to the remote device for synchronizing display of the media object by the display and by the remote device.
16. The apparatus of claim 15, further comprising control logic to control display of the media object in response to a user input command.
17. The apparatus of claim 16, the communication logic further to receive commands and metrics from the remote device, wherein the control logic further controls the display of the media object based on the received commands and metrics from the remote device.
18. The apparatus of claim 15, further comprising display logic for displaying an interface for displaying and controlling the media object.
19. The apparatus of claim 15, wherein the metric comprises a time or frame reference associated with the media object.
20. A method for facilitating synchronous display of time-based media objects by a plurality of devices, the method comprising:
controlling display of the media object in response to a user input command with a first device; and
causing the user input command and the metric associated with the media object to be transmitted to a second device for synchronizing display of the media object by the first and second devices.
21. The method of claim 20, further comprising causing peer-to-peer communication between the first device and the second device.
22. The method of claim 20, further comprising controlling display of the media object by the first device based on commands and metrics received from the second device.
23. The method of claim 20, further comprising displaying an interface for displaying the media object.
24. The method of claim 20, wherein the metric comprises a time or frame reference associated with the media object.
25. The method of claim 20, wherein the command comprises a pause or seek command.
26. The method of claim 20, further comprising transmitting the command in response to text input in an instant messaging application.
27. The method of claim 20, wherein the media object comprises a video or audio file.
28. The method of claim 20, wherein the first device comprises a DVR.
29. The method of claim 20, wherein the first device comprises a personal media player.
30. A computer-readable medium comprising instructions for assisting a plurality of devices in synchronously displaying time-based media objects, the instructions causing performance of a method comprising:
controlling display of the media object in response to a user input command with a first device; and
causing the user input command and the metric associated with the media object to be transmitted to a second device for synchronizing display of the media object by the first and second devices.
31. The computer-readable medium of claim 30, further comprising instructions for causing peer-to-peer communication between the first device and the second device.
32. The computer-readable medium of claim 30, further comprising instructions for controlling display of the media object by the first device based on commands and metrics received from the second device.
33. The computer-readable medium of claim 30, further comprising instructions for displaying an interface for displaying the media object.
34. The computer-readable medium of claim 30, wherein the metric comprises a time or frame reference associated with the media object.
35. The computer-readable medium of claim 30, wherein the command comprises a pause or seek command.
36. The computer-readable medium of claim 30, further comprising instructions for transmitting the command in response to text input in an instant messaging application.
37. The computer-readable medium of claim 30, wherein the media object comprises a video or audio file.
38. The computer-readable medium of claim 30, wherein the first device comprises a DVR.
39. The computer-readable medium of claim 30, wherein the first device comprises a personal media player.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/710,026 | 2007-02-22 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK1140280A true HK1140280A (en) | 2010-10-08 |
Similar Documents
| Publication | Title |
|---|---|
| US11671470B2 (en) | Synchronous delivery of media content in a collaborative environment |
| US20080209075A1 (en) | Synchronous delivery of media content and real-time communication for online dating |
| US11831693B2 (en) | Ambient, ad hoc, multimedia collaboration in a group-based communication system |
| JP5301425B2 (en) | Group content presentation and system and method for organizing group communications during group content presentation |
| JP6370941B2 (en) | Communications system |
| US7426192B2 (en) | Network conference system, conference server, record server, and conference terminal |
| CN105144673B (en) | Server-interventional audio-video communication with reduced latency |
| US20090327425A1 (en) | Switching between and dual existence in live and recorded versions of a meeting |
| US20130198288A1 (en) | Systems, methods, and computer programs for suspending and resuming an online conference |
| US20140213227A1 (en) | Mobile device capable of substantially synchronized sharing of streaming media, calls and other content with other devices |
| US20120233644A1 (en) | Mobile device capable of substantially synchronized sharing of streaming media with other devices |
| KR20140066641A (en) | Server, multimedia apparatus, and control method thereof |
| AU2010287252A1 (en) | Method for play synchronization and device using the same |
| US20130103770A1 (en) | Distributed semi-synchronized event driven playback of multimedia |
| CA3213247A1 (en) | Method and system for integrating video content in a video conference session |
| US8571189B2 (en) | Efficient transmission of audio and non-audio portions of a communication session for phones |
| US20070121818A1 (en) | Information processing apparatus, information processing method, and program that enable viewing of content during telephone call |
| HK1140280A (en) | Synchronous delivery of media content in a collaborative environment |
| JPWO2010073506A1 (en) | Program content viewing system and content reception recording/playback apparatus |
| JP2014027524A (en) | Moving image reproduction system, moving image reproduction device and progress state management device |
| HK1179021A (en) | Distributed semi-synchronized event driven playback of multimedia |