WO2004014059A2 - Method and apparatus for processing image-based events in a meeting management system - Google Patents
Method and apparatus for processing image-based events in a meeting management system
- Publication number
- WO2004014059A2 (PCT/US2003/023194)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- meeting
- application
- image processing
- event
- Prior art date
- Legal status
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A method and apparatus (300) are provided for processing image based events in a meeting management system. A method is provided for recording a meeting having at least one participant (200), comprising the steps of detecting at least one image processing event associated with the meeting; and recording the image processing event for subsequent playback. Each recorded image processing event is tagged with a time stamp and information about the event. In this manner, each image processing event can be broadcast to a remote user or recorded for subsequent playback. A method and apparatus are also provided for sharing an application with a plurality of participants in a meeting. The method comprises the steps of capturing an image of the application; and broadcasting the image of the application to each participant. The invention allows images of individual applications or an entire desktop to be shared.
Description
METHOD AND APPARATUS FOR PROCESSING IMAGE-BASED EVENTS IN A
MEETING MANAGEMENT SYSTEM
Cross Reference to Related Application
This application claims the benefit of United States Provisional Application Number 60/400,745, filed August 2, 2002. This application is related to PCT Patent Application entitled "Method and Apparatus for Identifying a Speaker in a Meeting Management System," (Attorney Docket No. 1008-4) filed contemporaneously herewith.
Field of the Invention
The present invention relates generally to project management systems and, more particularly, to project management systems that facilitate the synchronous interaction of a number of individuals to create and modify documents and to perform other project tasks.
Background of the Invention
Project management systems increase productivity and efficiency of members of a project team by automating the flow of information, including documents and files, among team members. Project management systems are often deployed to support collaborative work among a group of individuals, such as the members of a project team. Asynchronous collaboration systems allow team members to collaborate on one or more project tasks independently in time or space. Synchronous collaboration systems, on the other hand, allow team members to simultaneously collaborate on one or more project tasks in the same or a different location.
As the employees of an enterprise become more distributed in time and place, for example, due to flexible work hours, globalization and the deliberate distribution of employees to reduce dependence on any single, centralized enterprise location, it becomes even more important to provide team members with an effective tool for asynchronous and synchronous collaboration. In today's enterprise environment, it is important for a project management system to permit distributed team members to initiate ad-hoc virtual meetings, for example, over the Internet. Generally, such project management systems must allow distributed team members to communicate and interact as if the team members were in the same place.
In addition, it is important that meetings are recorded for subsequent access, for example, by any late arriving users or those team members that are unable to participate in the actual meeting. Project management systems typically time stamp the various media components of a meeting, such as the audio and video aspects, so that the presentation of the various media components can be synchronized, and so that a user can selectively access any desired portion of the meeting. Currently, however, there is no commercially available system that records image processing events, such as rotating an image or zooming in or out on an image, such that the image processing events can be recreated at a remote terminal or at a later time. A need therefore exists for an improved project management system and method that provides enhanced recording and annotation of image based events.
When team members collaborate, they often share and revise documents, such as tables, charts and drawings. In order for team members to collaborate on a given document at the same time, however, each user must have the associated application installed. A number of interactive application sharing techniques exist, such as pcAnywhere from Symantec Corp. of Cupertino, CA, Virtual Network Computing (VNC) from AT&T Laboratories, Cambridge, and NetMeeting from Microsoft Corp. of Redmond, WA. Such interactive application sharing techniques allow remote users to take over a presentation and to manipulate the actual application on the presentation source machine. In most meetings, however, animations are not required to understand the presentation. In fact, the majority of presentations rely on static information graphics to convey ideas (for example, spreadsheets, charts, graphs, blueprints, and CAD diagrams). In addition, transmitting user manipulations of the shared document, such as mouse movements, button clicks and keyboard actions, over a network, such as the Internet, provides for a very poor user experience. Dragging and manipulating windows with delayed feedback is extremely challenging for most users. There is a significant round-trip delay between an initial manipulation, the execution of the action, and the update of the display. Furthermore, an interactive application sharing session is not easily recorded.
A need therefore exists for a method and apparatus for presenting shared documents to each team member without requiring that each team member have a given application installed, and without requiring that all commands for document modification be transmitted to each participant.
Summary of the Invention
The present invention provides a project management system that allows one or more team members to work on a project. Generally, a method and apparatus are provided for processing image based events in a meeting management system. According to one aspect of the invention, a method is provided for recording a meeting having at least one participant. The method comprises the steps of detecting at least one image processing event associated with the meeting; and recording the image processing event for subsequent playback. The image processing event may include, for example, rotation of an image, zooming in or out on an image or another function that manipulates an image. In addition, each recorded image processing event is tagged with a time stamp and information about the event, such as an event type and a participant associated with the image processing event. In this manner, each image processing event can be broadcast to a remote user or recorded for subsequent playback.
According to another aspect of the invention, a method is provided for sharing an application with a plurality of participants in a meeting. The method comprises the steps of capturing an image of the application; and broadcasting the image of the application to each participant. The image may be converted, for example, to a bit map or a graphics file format. The method may be initiated by a predefined hot key sequence. The invention allows images of individual applications or an entire desktop to be shared.
A more complete understanding of the present invention, as well as further features and advantages of the present invention, will be obtained by reference to the following detailed description and drawings.
Brief Description of the Drawings
FIG. 1 illustrates the network environment 100 in which the present invention can operate; FIG. 2 is a schematic block diagram of an exemplary participant terminal of FIG. 1;
FIG. 3 is a schematic block diagram of an exemplary meeting management system of FIG. 1;
FIG. 4 illustrates an exemplary meeting management interface 400 for managing past, present and future meetings that incorporates features of the present invention;
FIG. 5 illustrates an exemplary session interface that allows a user to participate in an ongoing meeting;
FIG. 6 illustrates an exemplary meeting review interface that incorporates features of the present invention that allows a user to review past meetings; FIG. 7 is a flow chart describing an exemplary speaker detection process incorporating features of the present invention;
FIG. 8 illustrates an exemplary session interface that allows a user to share images of an application according to the present invention; and
FIG. 9 is a flow chart describing an exemplary application sharing process incorporating features of the present invention.
Detailed Description
FIG. 1 illustrates the network environment 100 in which the present invention can operate. As shown in FIG. 1, one or more meeting participants, each employing a participant terminal 200-1 through 200-N (hereinafter, collectively referred to as participant terminals 200 and discussed further below in conjunction with FIG. 2), are connected to a network 120. The meeting participants may be, for example, members of a project team. The network 120 may be embodied, for example, as any wired or wireless network, or a combination of the foregoing, including the Public Switched Telephone Network (PSTN), the cellular telephone network or the Internet.
According to one aspect of the invention, a meeting management system 300, as discussed further below in conjunction with FIG. 3, is provided to allow two or more participants to participate in a virtual meeting, or to allow one or more participants to review a previous meeting. In addition, the meeting management system 300 allows one or more meeting participants to establish an agenda for a meeting and to assign action items during a meeting, and provides a built-in teleconferencing component that automatically initiates an audio meeting among meeting participants. The data-sharing interactions in meetings and the synchronized audio streams associated with them are recorded and are available asynchronously for playback or exporting to new meetings (or both).
FIG. 2 is a schematic block diagram of an exemplary participant terminal 200. As shown in FIG. 2, the participant terminal 200 includes a personal computer (comprising, e.g., a memory and processor), as well as one or more of a screen projector 1195, a screen overlay tablet 1105, a speaker 1185, a camera 1175 for capturing images, a microphone 1165 for capturing audio, a memory 1140 for storing documents, and a user interface 1120 (such as one or more of a pen, keyboard or mouse). Components 1130, 1150, 1160 and 1170 transform signals as indicated by their corresponding function, in a known manner.
The screen overlay tablet 1105 captures x-y coordinate movements of a pen over a screen. According to a feature of the present invention, discussed further below, the coordinates of a given modification event that changes a document are recorded as part of the event. The coordinates of the pen markups on the screen overlay tablet 1105 are captured and transformed to an appropriate format by converter 1130. If a tablet 1105 is not employed, a mouse can simulate the pen and tablet, as would be apparent to a person of ordinary skill in the art. Input from a keyboard 1120 as a markup text on the screen is also captured and passed to converter 1130. Converter 1130 converts the series of inputs and provides them to a central service component 1200 in the meeting management system 300, via the network 120, for recording and propagation to other participants, if appropriate, as discussed further below.
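The exact data structure used by converter 1130 is not specified above; as one possibility, a pen stroke could be packaged as a small event object holding the owner, the base document and the sampled x-y coordinates, as in the following Java sketch (the class and field names are illustrative assumptions rather than elements of the system described above).

import java.util.ArrayList;
import java.util.List;

// Illustrative only: one possible shape for a pen mark-up event produced by
// converter 1130 from the x-y samples captured on the overlay tablet 1105.
public class PenMarkupEvent {
    final String documentId;      // base document being marked up
    final String participantId;   // owner of the mark-up (from the directory log-in)
    final long timestampMillis;   // when the stroke was completed
    final List<int[]> strokePoints = new ArrayList<>(); // {x, y} samples in screen coordinates

    PenMarkupEvent(String documentId, String participantId, long timestampMillis) {
        this.documentId = documentId;
        this.participantId = participantId;
        this.timestampMillis = timestampMillis;
    }

    void addPoint(int x, int y) {
        strokePoints.add(new int[] { x, y });
    }

    public static void main(String[] args) {
        // Simulate a short stroke; a real terminal would stream tablet samples here.
        PenMarkupEvent stroke = new PenMarkupEvent("design-review.ppt", "alice", System.currentTimeMillis());
        stroke.addPoint(120, 300);
        stroke.addPoint(125, 304);
        stroke.addPoint(131, 309);
        System.out.println("Captured " + stroke.strokePoints.size() + " points for " + stroke.documentId);
    }
}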
All signals from the microphone 1165 and camera 1175 are processed by components 1160 and 1170, respectively, and then sent, for example, in a compressed format to the central service component 1200 in the meeting management system 300 for recording and propagation to other participants, if appropriate. Documents in the document storage 1140 can be converted to a compressed bit map by component 1150 and sent to the central service component 1200 for recording and propagation, as discussed further below in conjunction with FIGS. 8 and 9.
The main screen projected by projector 1195 receives propagation data from other participants from a sound board 1220 of the central service component 1200 in the meeting management system 300. For a detailed description of an exemplary sound board 1220, see PCT Patent Application Serial Number PCT/US 03/09876, entitled
"Method and Apparatus for Synchronous Project Collaboration," filed March 31, 2003, and incorporated by reference. Audio signals are directed to the speaker 1185 and image data are passed to the projector 1195 for screen projection. In one implementation, an automatic log-in component 1180 is automatically started when the participant terminal 200 is powered up. The automatic log-in component 1180 is programmed with the address of a directory system 1300 in the meeting management system 300 and performs an automatic log in procedure with the directory system 1300, in a known manner.
FIG. 3 is a schematic block diagram of an exemplary meeting management system 300. As shown in FIG. 3, the meeting management system 300 includes a central service component 1200 and a directory system 1300. In addition, the directory system 1300 provides a user management system in a remote configuration. The central service component 1200 is connected to the participant terminal 200 of each participant (or location) via the network 120. The meeting management system 300 may be embodied as one or more distributed or local servers. For example, the meeting management system 300 may include, for audio conferencing and recording, the AG2000 or AG4000 communication servers commercially available from NMS Communications of Framingham, MA (data event recording is handled by a proprietary server).
According to one aspect of the invention, shown in FIG. 3 and discussed further below, the central service component 1200 includes an event recorder 1205. The event recorder 1205 includes a time stamper 1210 and an event tagger 1215. As previously indicated, the present invention records each event in a meeting, such as a document modification, together with a time stamp indicating when the event occurred and a tag that annotates the event with information on the nature of the event, and the participants associated with the event. For example, for an overlay modification to a document, the event would identify the base document, overlay text, coordinates of the overlay and the participant who provided the overlay text. In this manner, the event recorder 1205 adds an ownership flag and time stamp to all images, audio, pen mark-ups, text mark-ups or other data and events received from a participant terminal 200 and captured as the result of activities of a local or remote participant. Thus, the present invention provides for time stamping and event tagging of each event recorded as part of a meeting. For example, the ownership flag associated with a pen markup data will be assigned based on the identification of the participant associated
with the participant terminal 200 that has logged in to the directory system 1300. In this manner, the event recorder 1205 allows a number of different indexes to be created that will enhance the ability to replay and retrieve various meetings of interest. For example, the buddy list information indicating the current participants provides an indication of participants that come and go during a meeting. In addition, the information provided by components 1130, 1150, 1160 and 1170 permits indexing of the document mark-ups, page changes, speaker identity, and meeting content (e.g., what did a participant say at any point in the meeting).
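By way of illustration, the time stamping and tagging performed by the event recorder 1205 might look like the following Java sketch; the class and tag names are assumptions chosen for the example rather than details taken from the system described above.

import java.time.Instant;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of the event recorder 1205: every incoming item is wrapped
// with a time stamp (time stamper 1210) and descriptive tags (event tagger 1215).
public class EventRecorder {

    public static class RecordedEvent {
        final Instant timestamp;
        final Map<String, String> tags;
        final byte[] payload;               // e.g. compressed audio, image, or mark-up data

        RecordedEvent(Instant timestamp, Map<String, String> tags, byte[] payload) {
            this.timestamp = timestamp;
            this.tags = tags;
            this.payload = payload;
        }
    }

    // Adds the ownership flag and time stamp before the event is stored or broadcast.
    public RecordedEvent record(String eventType, String participantId, byte[] payload) {
        Map<String, String> tags = new LinkedHashMap<>();
        tags.put("type", eventType);        // e.g. "pen-markup", "page-change", "image-rotate"
        tags.put("owner", participantId);   // participant logged in at the originating terminal
        return new RecordedEvent(Instant.now(), tags, payload);
    }

    public static void main(String[] args) {
        EventRecorder recorder = new EventRecorder();
        RecordedEvent e = recorder.record("pen-markup", "alice", new byte[] { 1, 2, 3 });
        System.out.println(e.timestamp + " " + e.tags);
    }
}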
The time-stamped data is stored in a meeting calendar and room repository 1230. The meeting calendar and room repository 1230 is essentially a calendar system that registers past and future meetings and holds meeting spaces 1240. For each meeting space 1240, meeting attributes, such as starting time, ending time, participants, presentations, conclusions, audio conversations, pictures of participants, mark-ups on presentations, and other data relating to a meeting are defined. The time-stamped data is also sent to the sound board 1220, described in
PCT Patent Application Serial Number PCT/US 03/09876, entitled "Method and Apparatus for Synchronous Project Collaboration," filed March 31, 2003, and incorporated by reference. Generally, the sound board 1220 makes actions by one team member visible to another team member. In other words, the sound board 1220 propagates all data associated with a given meeting to all registered participants.
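A meeting space 1240 in the repository 1230 can be pictured as a simple container for the attributes listed above; the following Java sketch is one illustrative way to represent it, and the field names are assumptions made for the example.

import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of a meeting space 1240 held in the calendar/room repository 1230.
// Field names are assumptions chosen to mirror the attributes listed above.
public class MeetingSpace {
    Instant startTime;
    Instant endTime;
    final List<String> participants = new ArrayList<>();
    final List<String> presentations = new ArrayList<>();   // documents shown during the meeting
    final List<String> conclusions = new ArrayList<>();     // action items and outcomes
    final List<byte[]> audioSegments = new ArrayList<>();   // time-stamped audio conversations
    final List<byte[]> markups = new ArrayList<>();         // mark-ups on presentations

    public static void main(String[] args) {
        MeetingSpace space = new MeetingSpace();
        space.startTime = Instant.parse("2003-03-31T15:00:00Z");
        space.participants.add("alice");
        space.presentations.add("q1-budget.xls");
        System.out.println("Meeting with " + space.participants.size() + " registered participant(s)");
    }
}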
The sound board 1220 intercepts an incremental change (addition or modification) to a base document of one team member and broadcasts such intercepted traffic to all other active client agents of other active team members (and also records such intercepted traffic in an addendum database). Thus, all the team members in a synchronous session will share changes to the documents by sharing addendum additions in real time. The manner in which the sound board 1220 serializes the various modification requests made by each team member and ensures that each team member is presented with a consistent view of the shared document is discussed in PCT Patent Application Serial Number PCT/US 03/09876. The sound board 1220 consists of a serializer and a broadcaster. Each user can submit conflicting change requests for an object spontaneously and concurrently. For example, a first user might request that an object is moved to the left while another user
might request that the same object is moved to the right. The serializer receives each of the change requests and serializes them, for example, based on an arrival time or a global clock. Serialized requests are then sent to the broadcaster which broadcasts the requests to all users. The change requests can be broadcast to all currently active users in real-time, and can be stored in the meeting repository 1230 for subsequent access, e.g., by any late arriving users, as would be apparent to a person of ordinary skill in the art.
Thus, one participant's action will be replicated on the corresponding participant terminal 200 of other participants. A connection from the sound board 1220 to each participant terminal 200 has a first in first out (FIFO) queue for every connection. In other words, if a late arriving participant terminal 200 connects after a given meeting has begun, all data up to the point when the late arriving user joins the meeting is stored in the FIFO queue and sent to the participant terminal 200 in the appropriate order. Thus, a late arriving participant terminal 200 will not miss anything.
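One way to realize the serialize-and-broadcast behavior, including the per-connection FIFO queues that let a late-arriving terminal catch up, is sketched below in Java; the method names and the use of simple in-memory queues are assumptions made for the example, not the actual sound board implementation.

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;

// Illustrative sketch of the sound board 1220: change requests are serialized in
// arrival order and appended to a FIFO queue per participant connection, so a
// late-arriving terminal replays everything it missed in the original order.
public class SoundBoard {
    private final Map<String, Queue<String>> queues = new LinkedHashMap<>();
    private final List<String> history = new ArrayList<>();   // serialized meeting record

    // Called when a participant terminal connects; its queue is pre-loaded with the history so far.
    public synchronized void connect(String participantId) {
        queues.put(participantId, new ArrayDeque<>(history));
    }

    // Serializer + broadcaster: requests are handled one at a time, recorded,
    // and appended to every connection's FIFO queue.
    public synchronized void submitChange(String changeRequest) {
        history.add(changeRequest);
        for (Queue<String> q : queues.values()) {
            q.add(changeRequest);
        }
    }

    public synchronized Queue<String> queueFor(String participantId) {
        return queues.get(participantId);
    }

    public static void main(String[] args) {
        SoundBoard board = new SoundBoard();
        board.connect("alice");
        board.submitChange("move object-7 left");
        board.submitChange("move object-7 right");
        board.connect("bob"); // late arrival: queue pre-loaded with everything so far
        System.out.println("Bob's queue: " + board.queueFor("bob"));
    }
}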
The directory system 1300 includes a directory 1310 of all potential meeting participants (and the corresponding participant terminal 200). In addition, the directory system 1300 maintains a table 1320 indicating the participants that are currently connected and available. The meeting participants identified in table 1320 are generally a subset of the people identified in the directory 1310. The directory system 1300 also includes a display 1330 indicating the active user list 1320 and a user directory management controller for management purposes. The information presented on display 1330 can also be superimposed on part of the main screen as projected by projector 1195 on the screen 1105 associated with each client interface 1120, 1105. For example, the presentation of the directory of available people in the system 1320 on the user display 1105 provides a buddy list. Each user or participant should be registered with the directory 1310, for example, using a manual process or automated presence detection techniques. The directory 1310 can be managed by meeting participants with an appropriate privilege level, in a known manner.
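The relationship between the full directory 1310 and the active-user table 1320 (the buddy list) can be illustrated with a minimal Java sketch; the method names below are assumptions introduced for the example.

import java.util.LinkedHashSet;
import java.util.Set;

// Illustrative sketch of the directory system 1300: directory 1310 lists all
// potential participants, while table 1320 tracks the subset currently logged in
// (the buddy list superimposed on each participant's display).
public class DirectorySystem {
    private final Set<String> directory = new LinkedHashSet<>();   // 1310: all registered users
    private final Set<String> active = new LinkedHashSet<>();      // 1320: currently connected users

    public void register(String userId) { directory.add(userId); }
    public void logIn(String userId)    { if (directory.contains(userId)) active.add(userId); }
    public void logOut(String userId)   { active.remove(userId); }
    public Set<String> buddyList()      { return active; }

    public static void main(String[] args) {
        DirectorySystem dir = new DirectorySystem();
        dir.register("alice");
        dir.register("bob");
        dir.logIn("alice");
        System.out.println("Buddy list: " + dir.buddyList()); // [alice]
    }
}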
Capturing Participant Activities
Voice and other audio signals are captured by a microphone or a telephone handset 1165. In the case of a microphone, the signal can be digitized and compressed by an optional voice compression component 1160. In the case of a handset 1165, the phone signal can be sent to the sound board 1220 for digitizing and compression. The digitized, compressed audio signal is then sent to the event recorder 1205 for annotation, in a manner discussed further below. The digitized, compressed audio signal is also passed to the sound board 1220 for broadcasting to the participant terminals 200 corresponding to each participant, for playback by the speaker 1185. At the same time, the annotated audio signal is stored in the appropriate meeting room 1240 for recording.
Images and video of, e.g., objects, documents and faces at each participant terminal 200 are optionally captured by video camera 1175. The captured images are processed by component 1170 and sent to the event recorder 1205 for annotation. The digitized, compressed images are also passed to the sound board 1220 for broadcasting to the participant terminals 200 corresponding to each participant. At the same time, the annotated image signal is stored in the appropriate meeting room 1240 for recording. According to one aspect of the present invention, each image is recorded for subsequent playback, as well as all image processing commands, such as user commands to rotate, zoom, modify or manipulate an image.
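As an illustration of this aspect, the following Java sketch records a base image once and then appends each image processing command as a time-stamped, owner-tagged event, so the same manipulations can be re-applied at a remote terminal or during playback; the command encoding shown is an assumption, not a format defined above.

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: the base image is stored once, and every later image
// processing command (rotate, zoom, crop, ...) is appended as a time-stamped
// event so the manipulations can be recreated remotely or on replay.
public class ImageEventLog {

    public static class ImageEvent {
        final long timestampMillis;
        final String participantId;
        final String command;      // e.g. "rotate:90", "zoom:1.5", "crop:10,10,200,150"

        ImageEvent(long timestampMillis, String participantId, String command) {
            this.timestampMillis = timestampMillis;
            this.participantId = participantId;
            this.command = command;
        }
    }

    final byte[] baseImage;                       // the shared image itself (e.g. a PNG)
    final List<ImageEvent> events = new ArrayList<>();

    ImageEventLog(byte[] baseImage) {
        this.baseImage = baseImage;
    }

    void record(String participantId, String command) {
        events.add(new ImageEvent(System.currentTimeMillis(), participantId, command));
    }

    // Playback simply re-applies the recorded commands in time-stamp order.
    void replay() {
        for (ImageEvent e : events) {
            System.out.println(e.timestampMillis + " " + e.participantId + " -> " + e.command);
        }
    }

    public static void main(String[] args) {
        ImageEventLog log = new ImageEventLog(new byte[0]);
        log.record("alice", "rotate:90");
        log.record("bob", "zoom:2.0");
        log.replay();
    }
}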
Keyboard, mouse and stylus movements are captured by converter 1130 and are converted to overlay mark-ups. The markups are sent to the event recorder 1205 for annotation. The digitized, compressed mark-ups are also passed to the sound board 1220 for broadcasting to the participant terminals 200 corresponding to each participant. At the same time, the annotated mark-ups are stored in the appropriate meeting room 1240 for recording.
Meeting Management Interface
FIG. 4 illustrates an exemplary meeting management interface 400 for managing past, present and future meetings that incorporates features of the present invention. The meeting management interface 400 can be employed to create meetings, and to define associated agendas and participant lists. The exemplary management interface 400 includes a section 410 that provides a mechanism for selecting a meeting, for example, using a keyword search. A second section 420 indicates all meetings that satisfy any search criteria or filtering information that was entered in section 410, with each row corresponding to a different meeting. The summary information provided in
section 420 may identify, for example, the name, project, organizer and start time associated with the corresponding meeting.
A third section 430 shows the details of one particular meeting selected from the meeting list 420. The exemplary meeting details presented in section 430 may indicate, for example, the meeting organizer, status (open or closed), creation date, last update date, a list of various sessions included in the meeting, a list of meeting participants, a meeting description and meeting contents (e.g., documents, video and audio).
As shown in FIG. 4, the exemplary meeting management interface 400 also includes a tool bar 440 that allows a participant to join, review, empty, export, import, close, edit or delete a selected meeting. If the participant clicks on the "join" icon, the participant will then be presented with a session interface 500 that coordinates an active (real-time) meeting, as discussed further below in conjunction with FIG. 5. Similarly, if the participant clicks on the "review" icon, the participant will then be presented with a meeting review interface 600 that allows a user to review a prior meeting, as discussed further below in conjunction with FIG. 6. The export icon allows the contents of a prior meeting to be provided to a new meeting. The close icon prevents additional information from being added to an existing meeting. The edit icon allows properties of a meeting to be modified. The delete icon removes the meeting from the meeting repository 1230. FIG. 5 illustrates an exemplary session interface 500 that allows a user to participate in an ongoing meeting. As shown in FIG. 5, the exemplary session interface 500 includes three tabs 510, 520, 530 that can be selected to access various functions of the interface 500. As presented in FIG. 5, the presentation tab is selected and presents the participants with the primary presentation information, such as mark-ups or overlays on a document. The action tab 520 allows a participant to edit or define action items associated with the meeting. The agenda tab 530 allows the user to edit or define agenda items associated with the meeting. The session interface 500 also includes a window 540 for identifying any components or content associated with the meeting, such as images. A selected content item from window 540 will automatically be presented in the presentation window 510.
A meeting participant window 550 identifies all of the participants who have been defined for the current meeting. Individual names in the meeting participant
window 550 can optionally be highlighted to indicate those participants that have joined the meeting. If an active participant clicks on a phone icon 560, all other participants who have logged into the directory system 1300 will automatically be contacted by telephone and invited to join the meeting. In one implementation, a participant database is maintained that includes a current telephone number for each participant. The participant database is accessed to retrieve the telephone number of each participant for a given meeting and an automated dialing system is employed to include the various participants in the meeting by automatically dialing the telephone numbers and providing a bridge. The participant database may also record a corporate affiliation, address, and role (e.g., administrator, manager or user) for each participant.
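The automated dial-out might be organized as sketched below in Java; the Dialer interface is hypothetical (no particular telephony API is identified above), and the phone numbers shown are placeholders.

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of the automated dial-out described above. The Dialer
// interface is hypothetical; a real deployment would delegate to whatever
// conferencing bridge is actually installed.
public class AutoDialer {

    interface Dialer {
        void dialIntoBridge(String phoneNumber, String meetingId); // hypothetical bridge API
    }

    private final Map<String, String> participantPhoneBook = new LinkedHashMap<>();
    private final Dialer dialer;

    AutoDialer(Dialer dialer) {
        this.dialer = dialer;
    }

    void addParticipant(String participantId, String phoneNumber) {
        participantPhoneBook.put(participantId, phoneNumber);
    }

    // Invoked when a participant clicks the phone icon 560: every registered
    // participant of the meeting is dialed and bridged into the audio conference.
    void inviteAll(String meetingId) {
        for (Map.Entry<String, String> entry : participantPhoneBook.entrySet()) {
            dialer.dialIntoBridge(entry.getValue(), meetingId);
        }
    }

    public static void main(String[] args) {
        AutoDialer auto = new AutoDialer((number, meeting) ->
                System.out.println("Dialing " + number + " into meeting " + meeting));
        auto.addParticipant("alice", "+1-555-0100");
        auto.addParticipant("bob", "+1-555-0101");
        auto.inviteAll("design-review");
    }
}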
According to a further aspect of the present invention, whenever audio information is associated with a meeting, such as two participants speaking at the same location or via a telephone conference established among participants, an additional connection is optionally provided to an audio recorder for recording the audio component of the meeting. This recorder records the mixed or combined audio of all participants at the sound board 1220.
FIG. 6 illustrates an exemplary meeting review interface 600 incorporating features of the present invention that allows a user to review past meetings. As shown in FIG. 6, the exemplary meeting review interface 600 includes a first section 610 that provides a coarse time scale, such as a two hour window in the exemplary embodiment. In an automatic replay mode, the locator moves from left to right at normal speed. If a user grabs and moves the locator to a desired location, the user can randomly access any portion of the meeting. The coarse time scale 610 allows a participant to select a desired portion of a selected meeting. A second section 620 provides a fine time scale, such as a 15 minute window in the exemplary embodiment, indicating each of the events in the selected 15 minute window. Events may include, for example, participant X joined, participant X left, page N of Presentation Y is shown, participant X wrote markups on presentation Y, and participant X inserted a bookmark. In one implementation, events are categorized into four exemplary categories: action, agenda, presentation and people, each identified by a corresponding icon. A third section 630 of the interface 600 allows a user to selectively include or exclude each category of event. In the window 640, a number of selected action items are presented. Area 650 is a presentation board that presents the
display content (such as images or overlays) associated with the meeting in a replay mode, synchronized with the overall meeting presentation.
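The event filtering performed by sections 620 and 630 amounts to selecting time-stamped events that fall inside the chosen window and belong to the enabled categories; the following Java sketch illustrates that selection under assumed class and field names.

import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.EnumSet;
import java.util.List;
import java.util.Set;

// Illustrative sketch of the review interface's filtering: the fine time scale 620
// shows only the events inside the selected window that belong to the categories
// enabled in section 630.
public class MeetingReplayFilter {

    enum Category { ACTION, AGENDA, PRESENTATION, PEOPLE }

    static class Event {
        final Instant when;
        final Category category;
        final String description;
        Event(Instant when, Category category, String description) {
            this.when = when;
            this.category = category;
            this.description = description;
        }
    }

    static List<Event> select(List<Event> all, Instant windowStart, Duration windowLength,
                              Set<Category> enabled) {
        Instant windowEnd = windowStart.plus(windowLength);
        List<Event> shown = new ArrayList<>();
        for (Event e : all) {
            boolean inWindow = !e.when.isBefore(windowStart) && e.when.isBefore(windowEnd);
            if (inWindow && enabled.contains(e.category)) {
                shown.add(e);
            }
        }
        return shown;
    }

    public static void main(String[] args) {
        Instant start = Instant.parse("2003-03-31T15:00:00Z");
        List<Event> all = List.of(
                new Event(start.plusSeconds(60), Category.PEOPLE, "participant X joined"),
                new Event(start.plusSeconds(300), Category.PRESENTATION, "page 2 of presentation Y shown"),
                new Event(start.plusSeconds(1200), Category.ACTION, "action item assigned"));
        // Show only presentation events in the first 15 minutes.
        List<Event> shown = select(all, start, Duration.ofMinutes(15), EnumSet.of(Category.PRESENTATION));
        shown.forEach(e -> System.out.println(e.when + " " + e.description));
    }
}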
FIG. 7 illustrates a speaker detection process 700 incorporating features of the present invention. As shown in FIG. 7, the speaker detection process 700 receives the audio signals from each of the N active participants in a meeting, in a manner discussed above. In an exemplary implementation, the speaker detection process 700 continuously monitors the signal strength (e.g., volume) of each of the N participants during step 710. A test is performed during step 720 to determine if the dominant channel changes (e.g., a different channel having the highest volume level for at least a predefined minimum time interval). If it is determined that there has been a change in the dominant channel, a speaker change is identified during step 720 and the participant associated with the dominant channel is thereafter used during step 730 for tagging each audio event. The name or user identifier of the speaker can be obtained, for example, from registration information that indicates a given speaker is associated with a given channel. Another aspect of the present invention allows images of user applications to be shared. In one implementation, images of a single window containing an application or an entire desktop can be shared. The application sharing function provides a solution that captures any application running on the client as an image to be broadcast to all participants in the meeting. Image sharing allows a meeting participant to take a "snapshot" of an application running on their machine and broadcast it to the other participants in a meeting. This snapshot is a one-time event and must be repeated if updates are made to the presentation.
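A minimal Java sketch of the dominant-channel test of FIG. 7, described above, is shown below; it assumes periodic per-channel volume samples, and the class and variable names are illustrative assumptions.

// Illustrative sketch of the speaker detection process 700 (FIG. 7): the channel
// with the highest volume is treated as the speaker, but only after it has stayed
// dominant for a minimum interval, to avoid tagging every brief noise as a change.
public class SpeakerDetector {
    private final long minDominantMillis;   // predefined minimum interval
    private int currentSpeaker = -1;        // channel index used for tagging audio events
    private int candidate = -1;
    private long candidateSince;

    SpeakerDetector(long minDominantMillis) {
        this.minDominantMillis = minDominantMillis;
    }

    // Called periodically (step 710) with the current volume of each of the N channels.
    // Returns the channel currently used to tag audio events (step 730).
    int update(double[] channelVolumes, long nowMillis) {
        int loudest = 0;
        for (int i = 1; i < channelVolumes.length; i++) {
            if (channelVolumes[i] > channelVolumes[loudest]) {
                loudest = i;
            }
        }
        if (loudest != candidate) {              // a different channel has become loudest
            candidate = loudest;
            candidateSince = nowMillis;
        } else if (loudest != currentSpeaker
                && nowMillis - candidateSince >= minDominantMillis) {
            currentSpeaker = loudest;            // step 720: dominant channel change confirmed
        }
        return currentSpeaker;
    }

    public static void main(String[] args) {
        SpeakerDetector detector = new SpeakerDetector(500);
        detector.update(new double[] { 0.9, 0.1, 0.2 }, 0);
        int speaker = detector.update(new double[] { 0.8, 0.1, 0.2 }, 600);
        System.out.println("Tagging audio events with channel " + speaker); // channel 0
    }
}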
In one implementation, the application image sharing function of the present invention is initiated using one or more predefined "hot keys." A first hot key sequence, such as shift-ctrl-c, can capture an image of a single window containing an application, and a second hot key sequence, such as shift-ctrl-d, can capture an image of the entire desktop. The specific hot key sequences can be predefined or configured by each user.
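A minimal sketch of such a user-configurable binding of hot key sequences to capture scopes is shown below; the sequence strings, scope names and class names are assumptions, and the platform-specific key hooking itself is omitted:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative per-user mapping of hot key sequences to capture scopes.
enum CaptureScope { APPLICATION_WINDOW, ENTIRE_DESKTOP }

final class HotKeyConfig {
    private final Map<String, CaptureScope> bindings = new HashMap<>();

    HotKeyConfig() {
        // Defaults matching the examples in the text; each user may override them.
        bindings.put("shift-ctrl-c", CaptureScope.APPLICATION_WINDOW);
        bindings.put("shift-ctrl-d", CaptureScope.ENTIRE_DESKTOP);
    }

    void rebind(String sequence, CaptureScope scope) {
        bindings.put(sequence, scope);
    }

    CaptureScope scopeFor(String sequence) {
        return bindings.get(sequence);   // null if the sequence is not a capture hot key
    }
}
```

A key-event listener could consult `scopeFor` and, on a match, trigger the corresponding capture described below in conjunction with FIG. 9.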
To share an image of an application, the mouse pointer should be over the application (or desktop) to be shared. The user types the hot key sequence (such as shift-ctrl-c). The selected application is brought to the foreground, and the image is taken and uploaded to the meeting management system 300. A session interface 800, discussed
below in conjunction with FIG. 8, is made active and the image is loaded into the panel. The manner in which the application image is obtained and shared is discussed further below in conjunction with FIG. 9.
FIG. 8 illustrates an exemplary session interface 800 that allows a user to share an image of an application. As shown in FIG. 8, the exemplary session interface 800 is an extension of the session interface 500 discussed above in conjunction with FIG. 5. In addition, when configured for application image sharing, the exemplary session interface 800 includes a presentation snapshot window 810 that allows a participant to control the portion of the shared image that is presented in a presentation window 820. By adjusting a rectangular box within the presentation snapshot window 810, the user can pan to any desired portion of the entire image, which is then presented in window 820. In addition, the exemplary session interface 800 includes a set of icons 830 that activate a number of well-known image processing functions, such as zoom in, zoom out, rotate, fit-to-page and crop. As previously indicated, whenever a user activates an image processing function, the resulting image manipulation is recorded as an event. The exemplary shared application shown in FIG. 8 is a browser presenting a web page from Collabo-Technology, Inc.
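As a hedged illustration of how such a recorded manipulation might be represented (the type names and fields below are assumptions, chosen to reflect the time stamp, event type and participant information described elsewhere in this specification), each activation of one of the icons 830 could produce an event record such as:

```java
import java.time.Instant;

// Illustrative representation of a recorded image processing event:
// tagged with a time stamp, an event type, and the participant who invoked it.
enum ImageEventType { ZOOM_IN, ZOOM_OUT, ROTATE, FIT_TO_PAGE, CROP, PAN }

record ImageProcessingEvent(Instant timeStamp,
                            ImageEventType type,
                            String participantId,
                            String imageId) {

    /** Convenience factory that stamps the event with the current time. */
    static ImageProcessingEvent now(ImageEventType type, String participantId, String imageId) {
        return new ImageProcessingEvent(Instant.now(), type, participantId, imageId);
    }
}
```

Each such record could then be broadcast to remote participants and appended to the meeting record so that the manipulation can be recreated during replay.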
FIG. 9 is a flow chart describing an exemplary application image sharing process 900 incorporating features of the present invention. As shown in FIG. 9, the application image sharing process 900 is initiated during step 910 when a predefined hot key sequence is detected (such as shift-ctrl-c or shift-ctrl-d). Once a predefined hot key sequence is detected, a library function, for example from a Java library, is invoked during step 920 to create an image of the application within the selected window (if the application hot key sequence is detected) or of the entire desktop (if the desktop hot key sequence is detected). In the case of a shared application, the image capture function uses the boundary of the window containing the application. In the case of a shared desktop, the image capture function uses the size of the user's display as the boundary.
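The text above refers only to a library function from, for example, a Java library; as one possible sketch under that assumption, the standard AWT `Robot` class could implement the two capture cases of step 920 as follows (class and method names other than the AWT calls are illustrative):

```java
import java.awt.AWTException;
import java.awt.Dimension;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;

// Sketch of step 920: capture either the selected window's bounds or the whole display.
final class ScreenCapture {

    /** Captures the region covered by the given window bounds (shared application case). */
    static BufferedImage captureWindow(Rectangle windowBounds) throws AWTException {
        return new Robot().createScreenCapture(windowBounds);
    }

    /** Captures the entire desktop, using the display size as the boundary (shared desktop case). */
    static BufferedImage captureDesktop() throws AWTException {
        Dimension screen = Toolkit.getDefaultToolkit().getScreenSize();
        return new Robot().createScreenCapture(new Rectangle(0, 0, screen.width, screen.height));
    }
}
```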
Thereafter, the captured image is optionally converted to a bit map and then to a graphics file, such as a portable network graphics (PNG) file, during step 930. The application image is then annotated (i.e., the event is tagged with the appropriate time stamp, as well as information on the nature of the event and the participant that invoked the application sharing), broadcast to all participants and recorded
in the meeting repository 1230. Thereafter, the shared application image will be presented in the session interface 800 (FIG. 8) of each active participant.
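A minimal sketch of the encoding and annotation described for step 930 appears below; the PNG encoding uses the standard `javax.imageio` API, while the event record, field names and the notion of a publish step are assumptions introduced only for illustration:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.time.Instant;
import javax.imageio.ImageIO;

// Sketch of step 930: encode the captured image as PNG and tag it with event metadata.
final class SharedImagePublisher {

    /** Hypothetical record describing the annotated application-sharing event. */
    record SharedImageEvent(Instant timeStamp, String eventType, String participantId, byte[] pngBytes) { }

    static SharedImageEvent publish(BufferedImage captured, String participantId) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(captured, "png", out);   // convert the captured bit map into a PNG graphics file
        // The event is tagged with a time stamp, an event type and the invoking participant,
        // so that it can be broadcast to all participants and recorded for later playback.
        return new SharedImageEvent(Instant.now(), "application-image-share", participantId, out.toByteArray());
    }
}
```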
It is to be understood that the embodiments and variations shown and described herein are merely illustrative of the principles of this invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.
Claims
1. A method for recording a meeting having at least one participant, comprising the steps of: detecting at least one image processing event associated with said meeting; and recording said image processing event for subsequent playback.
2. The method of claim 1, wherein said at least one image processing event includes a rotation of an image.
3. The method of claim 1, wherein said at least one image processing event includes a zooming in or out on an image.
4. The method of claim 1, wherein said image processing event is a function that manipulates an image.
5. The method of claim 1, wherein said image processing event is a user interface action.
6. The method of claim 1, further comprising the step of tagging each of said image processing events with a time stamp and information about said event.
7. The method of claim 6, wherein said information about said event indicates an event type.
8. The method of claim 6, wherein said information about said event indicates a participant associated with said image processing event.
9. The method of claim 1, further comprising the step of broadcasting said image processing events to each meeting participant.
10. A method for sharing an application with a plurality of participants in a meeting, comprising the steps of: capturing an image of said application; and broadcasting said image of said application to each of said participants.
11. The method of claim 10, further comprising the step of converting said image to a bit map.
12. The method of claim 10, further comprising the step of converting said image to a graphics file format.
13. The method of claim 10, wherein said method is initiated by a predefined hot key sequence.
14. The method of claim 10, wherein said image is an image of a window including said application.
15. The method of claim 10, wherein said image is an image of an entire desktop including said application.
16. The method of claim 10, further comprising the step of broadcasting changes to said application as overlays to said broadcast image.
17. A system for recording a meeting having at least one participant, comprising: a memory; and at least one processor, coupled to the memory, operative to: detect at least one image processing event associated with said meeting; and record said image processing event for subsequent playback.
18. The system of claim 17, wherein said at least one image processing event includes a rotation of an image.
19. The system of claim 17, wherein said at least one image processing event includes a zooming in or out on an image.
20. The system of claim 17, wherein said image processing event is a function that manipulates an image.
21. The system of claim 17, wherein said image processing event is a user interface action.
22. The system of claim 17, wherein said processor is further configured to tag each of said image processing events with a time stamp and information about said event.
23. The system of claim 17, wherein said processor is further configured to broadcast said image processing events to each meeting participant.
24. A system for sharing an application with a plurality of participants in a meeting, comprising: a memory; and at least one processor, coupled to the memory, operative to: capture an image of said application; and broadcast said image of said application to each of said participants.
25. The system of claim 24, wherein said processor is further configured to convert said image to a bit map.
26. The system of claim 24, wherein said processor is further configured to convert said image to a graphics file format.
27. The system of claim 24, wherein said image is an image of a window including said application.
28. The system of claim 24, wherein said image is an image of an entire desktop including said application.
29. The system of claim 24, wherein said processor is further configured to broadcast changes to said application as overlays to said broadcast image.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2003252158A AU2003252158A1 (en) | 2002-08-02 | 2003-07-25 | Method and apparatus for processing image-based events in a meeting management system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US40074502P | 2002-08-02 | 2002-08-02 | |
| US60/400,745 | 2002-08-02 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2004014059A2 true WO2004014059A2 (en) | 2004-02-12 |
| WO2004014059A3 WO2004014059A3 (en) | 2004-09-02 |
Family
ID=31495869
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2003/023194 Ceased WO2004014059A2 (en) | 2002-08-02 | 2003-07-25 | Method and apparatus for processing image-based events in a meeting management system |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2003252158A1 (en) |
| WO (1) | WO2004014059A2 (en) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7676582B2 (en) | 2006-06-30 | 2010-03-09 | Microsoft Corporation | Optimized desktop sharing viewer join |
| US20110196930A1 (en) * | 2004-09-20 | 2011-08-11 | Jitendra Chawla | Methods and apparatuses for reporting based on attention of a user during a collaboration session |
| US20120001999A1 (en) * | 2010-07-01 | 2012-01-05 | Tandberg Telecom As | Apparatus and method for changing a camera configuration in response to switching between modes of operation |
| US8516143B2 (en) | 2009-03-31 | 2013-08-20 | International Business Machines Corporation | Transmitting data within remote application |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6016478A (en) * | 1996-08-13 | 2000-01-18 | Starfish Software, Inc. | Scheduling system with methods for peer-to-peer scheduling of remote users |
| WO2002080076A1 (en) * | 2001-03-30 | 2002-10-10 | Sanches Manuel J | Method, system, and software for managing enterprise action initiatives |
- 2003
- 2003-07-25 AU AU2003252158A patent/AU2003252158A1/en not_active Abandoned
- 2003-07-25 WO PCT/US2003/023194 patent/WO2004014059A2/en not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| AU2003252158A8 (en) | 2004-02-23 |
| WO2004014059A3 (en) | 2004-09-02 |
| AU2003252158A1 (en) | 2004-02-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20240273147A1 (en) | Systems and methods for escalating a collaboration interface | |
| US7099798B2 (en) | Event-based system and process for recording and playback of collaborative electronic presentations | |
| US9071615B2 (en) | Shared space for communicating information | |
| US7707249B2 (en) | Systems and methods for collaboration | |
| US20120150577A1 (en) | Meeting lifecycle management | |
| EP2458540A1 (en) | Systems and methods for collaboration | |
| US20060053194A1 (en) | Systems and methods for collaboration | |
| US20060080432A1 (en) | Systems and methods for collaboration | |
| US20060053195A1 (en) | Systems and methods for collaboration | |
| US20060101022A1 (en) | System and process for providing an interactive, computer network-based, virtual team worksite | |
| JP2006146415A (en) | Conference support system | |
| JP2005222246A (en) | Cooperative work support system and method | |
| WO2004014059A2 (en) | Method and apparatus for processing image-based events in a meeting management system | |
| WO2004014054A1 (en) | Method and apparatus for identifying a speaker in a conferencing system | |
| JP2006113886A (en) | Meeting system, control method therefor, and program | |
| JP2024153516A (en) | Information processing program, information processing method, and information processing device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AK | Designated states | Kind code of ref document: A2; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
| | AL | Designated countries for regional patents | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | 122 | Ep: pct application non-entry in european phase | |
| | NENP | Non-entry into the national phase | Ref country code: JP |
| | WWW | Wipo information: withdrawn in national office | Country of ref document: JP |