HK1194883B - Embedded appliance for multimedia capture
Description
The present application is a divisional application of the invention patent application having application number 200780030452.2, filed on June 22, 2007, and entitled "Embedded appliance for multimedia capture".
Technical Field
The present invention relates to an apparatus and method for media signal capture, and more particularly, to an apparatus and method for capturing a media signal using an embedded device.
Background
The ability to capture live media recordings of, for example, classroom lectures and meetings, and to make them available on demand for viewing after the event, is increasingly important for institutions such as universities and businesses. While some commercial schemes for capturing and publishing live recordings are known, they are typically implemented on general-purpose devices such as personal computers (PCs). Because these PC-based capture schemes use common components and software, they are expensive, difficult to maintain, inefficient when capturing and storing signals, vulnerable to security threats, require special technical support, and are difficult to integrate into, for example, a smart classroom environment. Thus, there is a need for a purpose-built multimedia capture device.
Disclosure of Invention
A multimedia device is provided comprising an input port dedicated to receiving real-time media signals and a processor system dedicated to capturing real-time media signals. The processor system defines an embedded environment. The input port and the processor system are integrated in a multimedia capture device. The input ports include an audio input port and at least one of a visual capture input port or a digital image input port.
Drawings
FIG. 1 is a system block diagram illustrating an embedded appliance coupled to a control server on a network according to an embodiment of the present invention.
FIG. 2 is a system block diagram illustrating an embedded appliance having input ports, a processor system, a memory, and an alarm module according to an embodiment of the present invention.
Fig. 3 is a block diagram illustrating the flow of media signals through modules in a control server according to an embodiment of the present invention.
FIG. 4 is a system block diagram of an example embodiment of an embedded appliance having input ports, output ports, a processor system, and a memory according to an embodiment of the invention.
FIG. 5 is a flow chart illustrating the capturing, processing, storing and/or sending of media signals using an embedded appliance according to an embodiment of the invention.
Detailed Description
An embedded appliance for multimedia capture (also referred to as an "embedded appliance") is a device dedicated to capturing, processing, storing and/or sending real-time media signals (e.g., audio signals, video signals, visual-capture signals, digital image signals). The embedded appliance may capture one or more real-time media signals, which may include, for example, digital image signals, visual capture signals, audio signals, and/or video signals of an ongoing classroom presentation. After one or more media signals have been captured, the embedded appliance may process the one or more signals by, for example, compressing, indexing, encoding, decoding, synchronizing, and/or formatting the content. Embedded appliances may, for example, be distributed over a network and coordinated according to a schedule to capture, process, store, and send real-time media signals for eventual retrieval by a user from, for example, a control server and/or one or more servers configured as, for example, a course management system. The media streams being captured on an embedded appliance may optionally also be monitored and/or further processed by the control server prior to distribution.
As a special-purpose (i.e., purpose-specific) device with an embedded environment, the embedded appliance uses a hardened operating system (OS) and a processor (e.g., a processor system) to capture, process, store, and/or send real-time media signals. The hardened OS is configured to resist security attacks (e.g., to prevent access by unauthorized users or programs) and to facilitate only those functions related to the capture, processing, storage, and/or sending of real-time media signals. In other words, the hardware and software are integrated into the embedded appliance and are designed specifically to capture, process, store and/or send real-time media signals. Because the hardware and software for capturing, processing, storing, and/or sending real-time media signals are integrated into the embedded environment of the embedded appliance, the costs and complexity associated with installation, tuning, design, deployment, and technical support are reduced as compared to a general-purpose system.
The real-time media signal represents an image and/or sound of an event captured by the sensor at substantially the same time as the event occurred and transmitted between the sensor and the embedded appliance at the time of capture without appreciable delay. The capturing, processing, storing and/or sending of real-time media signals by the embedded appliance may be performed at any time. Throughout the description, a real-time media signal is also referred to as a media signal.
FIG. 1 is a block diagram illustrating a number of embedded appliances 100 distributed over a network 110 and connected to a control server 120. In this embodiment, the control server 120 is connected to a server 130 that is configured, for example, as a course management system (e.g., a server running Blackboard™ or WebCT). The network 110 may be any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), implemented as a wired or wireless network in various environments such as an office or a university campus. The embedded appliances 100 can capture real-time media signals, including audio signals, visual capture signals, digital image signals, and video signals, acquired through electronic capture devices or sensors such as microphones, Web cameras, video cameras, still cameras, and video players. The embedded appliances 100 can also be configured to process, store and/or send the captured real-time media signals. Data related to the content captured in the real-time media signals may also be processed, stored, and/or sent; such data may include, for example, a capture time, a capture location, and/or a speaker's name.
The embedded appliance 100 can be prompted to start and stop capturing real-time media signals in response to start and stop indicators generated by, for example, the control server 120 or the embedded appliance 100. The start and stop indicators can be generated according to a schedule determined and/or stored by the control server 120 and/or each embedded appliance 100. For example, if implemented in a university campus environment, the embedded appliance 100 may be installed in a classroom of the university and connected through a university communications network. For example, the embedded appliance 100 can be prompted to capture media signals from a particular university classroom at a particular time according to a schedule stored on the embedded appliance 100.
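To make the schedule-driven start/stop behavior concrete, the following Python sketch derives start and stop indicators from a stored schedule. It is illustrative only: the `ScheduleEntry` shape, its field names, and the weekly-recurrence model are assumptions of this example, not part of the disclosed design.

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class ScheduleEntry:
    """One recurring capture window (hypothetical schedule format)."""
    weekday: int   # 0 = Monday
    start: time
    stop: time

def current_indicator(schedule: list[ScheduleEntry], now: datetime) -> str:
    """Return 'start' while inside any scheduled window, else 'stop'."""
    for entry in schedule:
        if now.weekday() == entry.weekday and entry.start <= now.time() < entry.stop:
            return "start"
    return "stop"

# Example: capture a one-hour lecture every Tuesday at 10:00.
lectures = [ScheduleEntry(weekday=1, start=time(10, 0), stop=time(11, 0))]
print(current_indicator(lectures, datetime(2024, 4, 2, 10, 30)))  # -> start
```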
The media signals captured by each embedded appliance 100 can be processed, stored, and transmitted to the control server 120. The control server 120 receives the media signals and sends them to the server 130, where the content of the media signals is made available for distribution. In certain embodiments, the content of the media signal is made available for distribution to the user 140 at the control server 120. In some embodiments, further processing of the media signal may be performed on the control server 120 and/or another processing device (not shown) before making the content of the media signal available for distribution. For example, the embedded appliance 100 and/or the control server 120 can process the media signals by compressing, indexing, encoding, transcoding, synchronizing, and/or formatting the media signals.
The embedded appliance 100 can be prompted to start and stop sending processed real-time media signals in response to start and/or stop indicators generated by, for example, the control server 120 or the embedded appliance 100. The start and stop indicators may be generated according to a schedule or according to defined conditions. In some embodiments, the start and/or stop indicator may be a trigger signal generated by a trigger generator in the control server and received by a trigger receiver in the embedded appliance. Further details regarding trigger signals are set forth, in relation to the capture of video signals, in co-pending application serial No. 10/076,872, published as US 2002/0175991 A1, entitled "GPI Trigger Over TCP/IP for Video Acquisition," which is incorporated herein by reference.
The embedded appliance 100 can also be configured to send media signals after any stage of processing. For example, depending on network traffic conditions, the embedded appliance 100 can be configured to send to the control server 120 compressed, but unsynchronized and unformatted, portions of the audio and digital image signals. The control server 120 can then be configured to synchronize and format the audio and digital image signals received from the embedded appliance 100.
The capture of media signals on the embedded appliance 100 can also be monitored by the control server 120, for example, by confidence monitoring signals. The control server 120 can use the confidence monitoring signal to determine whether a particular embedded appliance 100 is properly capturing media signals based on, for example, a schedule. The confidence monitoring signal can include any combination of media signals or portions of any media signals (e.g., split signals) captured and/or detected by the embedded appliance 100. For example, the confidence monitoring signal may be a frame/image periodically acquired from a video signal. The confidence monitoring signal may be a compressed or uncompressed media signal. The confidence monitoring signal may also be a separate signal/indicator generated from the media signal indicating that the media signal is being captured. For example, the confidence monitoring signal can be a binary indicator that indicates that a particular media signal is being captured or not captured by the embedded appliance 100.
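As a minimal sketch of what a confidence monitoring message might contain, the snippet below builds the binary capturing/not-capturing indicator per input port described above. The report format and the `is_capturing()` interface are assumptions made for illustration.

```python
import json
import time

class PortStatus:
    """Hypothetical stand-in for an input-port driver."""
    def __init__(self, capturing: bool):
        self._capturing = capturing

    def is_capturing(self) -> bool:
        return self._capturing

def confidence_report(ports: dict) -> str:
    """Build one confidence-monitoring message: a binary indicator per port
    stating whether a media signal is currently being captured."""
    return json.dumps({
        "timestamp": time.time(),
        "ports": {name: port.is_capturing() for name, port in ports.items()},
    })

# Example: audio is being captured, the video port is idle.
print(confidence_report({"audio": PortStatus(True), "video": PortStatus(False)}))
```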
Although FIG. 1 depicts only a single control server 120 connected to multiple embedded appliances 100, in other embodiments more than one control server 120 may be connected to any combination of embedded appliances 100. For example, two control servers 120 can be configured to coordinate the capture, processing, storage, and/or sending of media signals captured by the embedded appliances 100. The embedded appliances 100 can be programmed to recognize multiple control servers 120 and can be programmed, for example, to send a portion of the processed media signals to one of the control servers 120.
FIG. 2 is a system block diagram illustrating an embedded appliance 200 having a number of input ports 210, a processor system 250, a memory 260, and an alarm module 280. The embedded appliance 200 captures real-time media signals from various electronic devices via the input ports 210 in response to start and stop indicators generated by a scheduler 258 in the processor system 250. The processor system 250 receives the media signals and compresses them using a compression module 254. The processor system 250 can use the memory 260 to perform any function associated with the embedded appliance 200, such as storing the compressed media signals. When prompted by the scheduler 258, the embedded appliance 200 sends the compressed media signals to the control server 220. For example, the compressed media signals can be sent as a multiplexed signal to the control server 220 over a network connection via an output port (not shown) of the embedded appliance 200.
The input ports 210 include one or more audio input ports 202, one or more visual capture input ports 204, one or more video input ports 206, and one or more digital image input ports 208. Each input port 210 is integrated as part of the embedded environment of the embedded appliance 200. The media signal captured by the input port 210 may be received as an analog signal or as a digital signal. If received as an analog signal, the processor system 250 may convert the analog signal to a digital signal and vice versa.
One or more audio input ports 202 are used to capture audio signals. For example, the one or more audio input ports 202 may be one or more RCA stereo audio input ports, one or more 1/4" jack stereo audio input ports, one or more XLR input ports, and/or one or more Universal Serial Bus (USB) ports. The audio signal may be generated by any type of device capable of generating an audio signal, such as a separate microphone or a microphone connected to a video camera.
The one or more visual capture input ports 204 receive digital or analog Video Graphics Array (VGA) signals through, for example, one or more VGA input ports, one or more Digital Visual Interface (DVI) input ports, one or more Extended Graphics Array (XGA) input ports, one or more HD-15 input ports, and/or one or more BNC connector ports. The visual capture input ports 204 capture images generated by, for example, a computer or a microscope. Images can also be captured from an electronic device connected to a visual capture input port 204, such as an electronic whiteboard that transmits images through, for example, a VGA signal.
One or more video input ports 206 receive motion video signals from a device such as a video camera via one or more input ports including, but not limited to, one or more s-video input ports, one or more composite video input ports, and/or one or more component video input ports.
The one or more digital image input ports 208 capture digital images via one or more input ports, such as one or more ethernet ports and/or one or more USB ports. For example, a digital camera or a Web camera may be used to acquire the digital image.
The hardware components in the processor system 250, which may include, for example, an application-specific integrated circuit (ASIC), a central processing unit (CPU), a module, a digital signal processor (DSP), a processor, and/or a co-processor, are configured to perform functions specifically associated with capturing, processing, storing, and/or sending media signals.
The embedded appliance 200 captures any combination of real-time media signals received through the input ports 210 using the processor system 250. The embedded appliance 200 acquires each media signal synchronously, although the media signals are collected via different input ports 210. For example, although the sound of chalk hitting a classroom blackboard may be received through the audio input port 202 via a microphone, the motion of a professor's hand waving chalk may also be received simultaneously using a video camera connected to the video input port 206. The embedded appliance 200 synchronously receives and processes these media signals.
In some embodiments, the embedded appliance 200 can be configured to capture only certain portions of the media signal. For example, the embedded appliance 200 may be configured to capture and store sound received via a microphone, while ignoring electrostatic noise and/or silent pauses. For example, the embedded appliance 200 may also be configured to capture a video signal or digital image signal only when movement or substantial change in the scene is detected. In many embodiments, each of the input ports 210 included in the embedded appliance 200 can be configured to capture one or more media signals at different and/or variable rates. For example, video input port 206 may be configured to receive video signals at a higher frame rate than the frame rate of digital images received by digital image input port 208.
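The silence- and motion-gating described above could be reduced to simple threshold tests like the ones below; the thresholds and the frame/sample representations are illustrative assumptions, not values from the patent.

```python
def should_store_audio(samples, silence_threshold=0.01):
    """Skip near-silent audio blocks. `samples` are floats in [-1, 1];
    the RMS threshold is an arbitrary illustrative value."""
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    return rms > silence_threshold

def should_store_frame(prev, curr, changed_fraction=0.05):
    """Store a frame only when enough pixels changed. Frames are modeled as
    equal-length sequences of 8-bit luma values, a deliberate simplification."""
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > 16)
    return changed / len(curr) > changed_fraction

assert not should_store_audio([0.0, 0.001, -0.002])   # near-silence: skip
assert should_store_frame([0] * 100, [255] * 100)     # scene change: keep
```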
When receiving media signals, the processor system 250 may compress them using the compression module 254. For example, the compression module 254 may compress an audio signal and a synchronously received digital VGA signal into a number of formats, such as a Moving Picture Experts Group (MPEG) layer-2 format. The compression module 254 may also compress media signals into more than one format simultaneously. For example, if a digital image signal and an associated audio signal are received, the digital image signal may be compressed into a Joint Photographic Experts Group (JPEG) format while the audio signal is compressed into an MPEG audio layer-3 (MP3) format. In some embodiments, the compression module 254 may compress a single media signal into multiple formats simultaneously. Similarly, one or more media signals may be compressed into a single compressed stream (e.g., MPEG-4).
The compression module 254 may be configured to adjust a number of variables including frame rate, bit rate, frequency, resolution, color, and stability of the input signal using any combination of lossy or lossless formats employing one or more codecs. A codec is a device, hardware module, and/or software module configured to encode and/or decode, for example, a captured media signal. The compression module 254 may also be configured to simultaneously compress, decompress, encode, and/or decode any combination of media signals into any combination of formats. The formats need not be compatible with each other.
In some embodiments, the processor system 250 and the compression module 254 may be configured to use different codecs depending on the type of input device connected to an input port 210. For example, if a Web camera is used to capture a digital image via the digital image input port 208, the image may be compressed into a TIFF format; if a digital still camera is used to capture the digital image signal instead, the processor system 250 and the compression module 254 may be programmed or configured to detect the difference and use a JPEG compression codec instead.
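A device-type-to-codec lookup of the kind just described could be as simple as the table below; the device-type strings and the fallback choice are hypothetical, chosen only to mirror the Web-camera/still-camera example in the text.

```python
# Hypothetical device-type -> codec table; names are illustrative,
# not an actual appliance API.
CODEC_BY_DEVICE = {
    "web_camera": "tiff",
    "digital_still_camera": "jpeg",
}

def select_codec(device_type: str, default: str = "jpeg") -> str:
    """Pick a compression codec for the device detected on an input port."""
    return CODEC_BY_DEVICE.get(device_type, default)

assert select_codec("web_camera") == "tiff"
assert select_codec("digital_still_camera") == "jpeg"
```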
After the processor system 250 compresses the media signals, the compressed media signals are stored in the memory 260, for example, for later sending to the control server 220 for further processing. The memory 260 may be any suitable type of fixed and/or removable storage device. The memory may be, but is not limited to, magnetic tape, a digital video disk (DVD), a digital video cassette (DVC), random-access memory (RAM), flash memory, and/or a hard drive. The size of the memory 260 may vary depending on the amount of storage required for a particular application. For example, the size of the memory 260 can be increased if the embedded appliance 200 is intended to capture large quantities of media signals compressed in a lossless format. The size of the memory 260 can also be increased, for example, if the embedded appliance 200 is intended to capture media signals over a relatively long period of time (e.g., during a network failure) without uploading the captured media signals to, for example, the control server 220. The memory 260 can thus be used to prevent the loss of captured media signals that cannot be sent to, for example, the control server because of a network outage. In some embodiments, the processor system 250 may, if necessary, use the memory 260 to buffer information received via the input ports 210 before compression.
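The store-and-forward role of the memory 260 can be sketched as a queue of compressed segments that are discarded only once delivery is acknowledged. This is a RAM-backed sketch for brevity; a real appliance would persist to non-volatile storage, and the `send` callback is an assumed interface.

```python
from collections import deque

class CaptureBuffer:
    """Hold compressed segments until the control server acknowledges them,
    so a network outage does not lose captured media."""
    def __init__(self):
        self._pending = deque()

    def store(self, segment: bytes) -> None:
        self._pending.append(segment)

    def flush(self, send) -> int:
        """`send` returns True on acknowledged delivery. Stop at the first
        failure so unsent segments stay queued, in order, for a later retry.
        Returns the number of segments delivered."""
        delivered = 0
        while self._pending and send(self._pending[0]):
            self._pending.popleft()
            delivered += 1
        return delivered

buf = CaptureBuffer()
buf.store(b"segment-1")
buf.store(b"segment-2")
assert buf.flush(lambda seg: False) == 0   # network down: nothing is lost
assert buf.flush(lambda seg: True) == 2    # network back: both delivered
```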
The processor system 250 also includes a scheduler 258 that can generate start and stop indicators that prompt the embedded appliance 200, for example, to start and stop capturing and/or start and stop sending media signals. The scheduler 258 can access a schedule that is stored locally on the embedded appliance 200 or on the control server 220. The schedule may include, for example, start and stop times that are specific to an input port 210. For example, if a professor teaches a one-hour class on the same day each week for 4 months, the scheduler 258 may use a schedule to prompt the embedded appliance 200 to capture that one-hour class each week over the 4-month period. The scheduler 258 can be configured to capture or send media signals according to more than one schedule stored, for example, on the embedded appliance 200.
Scheduler 258 may generate a schedule or receive a schedule from control server 220. For example, the scheduler 258 may generate a schedule for transmitting the captured media signals based on input from the control server 220 indicating a preferred transmission time. In some embodiments, the scheduler 258 can access and execute a schedule, such as one sent from the control server 220 and stored in the memory 260 of the embedded appliance 200. In some embodiments, the scheduler 258 can be used to not only start and stop the capture and/or transmission of media signals by the embedded appliance 200, but also to start and stop the processing and/or storage of media signals.
Instead of using scheduling to prompt the capture and/or transmission of media signals, the scheduler 258 may prompt certain functions to be performed according to defined criteria. For example, the scheduler 258 can be configured to prompt the transmission of media signals from the embedded appliance 200 when a certain amount of bandwidth is available for use by the embedded appliance 200. In some embodiments, scheduler 258 is included as a hardware and/or software module separate from processor system 250.
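A defined-criteria trigger such as "send when enough bandwidth is free" might look like the polling loop below, which pairs naturally with the `CaptureBuffer` sketch above. The 512 kbps threshold, the poll interval, and the `measure_kbps`/`send` callables are all illustrative assumptions.

```python
import time

def send_when_bandwidth_allows(buffer, measure_kbps, send,
                               required_kbps=512.0, poll_s=5.0, max_polls=3):
    """Flush the capture buffer once measured free bandwidth exceeds what the
    upload needs; give up (until the next scheduling cycle) after a few polls."""
    for _ in range(max_polls):
        if measure_kbps() >= required_kbps:
            buffer.flush(send)
            return True
        time.sleep(poll_s)
    return False
```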
In some embodiments, instead of the processor system 250 having multiple processors, the embedded appliance includes a single processor that may be any type of processor (e.g., an embedded processor or a general-purpose processor) configured to define and/or operate in an embedded environment. The single processor can be configured to perform the functions performed by the processor system 250 and/or other functions within the embedded appliance 200. In some embodiments, the processor system 250 may also include other processors and/or co-processors in addition to the compression module 254 that are configured to operate in the embedded environment of the embedded appliance 200.
In some alternative embodiments, the functions of the scheduler in the embedded appliance may be performed by the control server. In some such embodiments, the embedded appliance may be designed without the scheduler if the full functionality of the scheduler is performed by the control server. For example, the control server can store a schedule associated with each embedded appliance distributed over the network and can send start and stop indicators to each embedded appliance to capture and/or send media signals.
In some embodiments, the start and stop indicators from the control server 220 can be based on variables such as the storage and/or sending capacity of each embedded appliance 200. The control server 220 can query each embedded appliance 200, for example, to determine how much memory 260 capacity each embedded appliance 200 has available. The control server 220 can also, for example, receive signals from each embedded appliance 200 indicating how much memory 260 capacity each embedded appliance 200 has available. The control server 220 can then prioritize and prompt the sending of information from the embedded appliances 200 based on memory capacity indicators.
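One plausible reading of this prioritization, sketched below, is to order uploads so that the appliances reporting the least free memory transmit first; the report format (appliance ID mapped to free megabytes) is an assumption of this example.

```python
def prioritize_uploads(capacity_reports: dict) -> list:
    """Order appliances so those with the least free memory upload first,
    reducing the risk of overflowing an appliance's local storage."""
    return sorted(capacity_reports, key=capacity_reports.get)

assert prioritize_uploads({"room-101": 800.0, "room-202": 40.0}) == ["room-202", "room-101"]
```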
As shown in FIG. 2, the embedded appliance 200 may include an alarm module 280, which is a hardware and/or software module. The alarm module 280 may include both an output port (not shown) for sending signals and an input port for receiving signals. In the event that physical security is breached, the alarm module 280 may be used to send a signal to the control server 220. For example, if the embedded appliance 200 is moved from a fixed location, such as a conference room in a building, the alarm module 280 can send a signal indicating that a physical breach has occurred. The alarm module 280 can send an indicator associated with the embedded appliance 200 so that the compromised embedded appliance 200 can be identified by, for example, the control server 220. In addition, the control server 220 can send a status query (e.g., a ping signal) to the alarm module 280 to determine, for example, whether the embedded appliance 200 is functioning properly and/or has been physically compromised (e.g., removed).
FIG. 2 also illustrates that the embedded appliance 200 can be controlled using a direct control signal 230 from, for example, a user. The embedded appliance 200 may include an interface, such as a graphical user interface (GUI) (not shown), a physical display (not shown), or buttons (not shown), to generate a direct control signal 230 that controls some or all of the functions that can be performed by the embedded appliance 200. The direct control signal 230 can be used, for example, to modify a schedule stored on the embedded appliance 200, to modify the processing of media signals, to troubleshoot errors on the embedded appliance 200, or to control the embedded appliance when, for example, the control server 220 is down. The direct control signal 230 may also be used, for example, to start and stop the capturing and/or sending of media signals. The embedded appliance 200 can be configured to require authentication of the user (e.g., username/password) before accepting a direct control signal 230 sent from the user via an interface (not shown). The direct control signal 230 may also be generated using, for example, an interface (not shown) that is not directly coupled to the embedded appliance 200. In some embodiments, the control server 220 may be used to directly control the embedded appliance.
In certain embodiments, the processor system 250 may include other software and/or hardware modules that perform other processing functions, such as encoding, decoding, indexing, formatting, and/or synchronization of media signals. In some embodiments, the embedded appliance 200 can be configured without the compression module 254 and can send uncompressed media signals to the control server 220.
Fig. 3 is a block diagram illustrating the flow of media signals from the embedded appliance through the modules in the control server 390. The control server 390 receives the individual compressed real-time media signals 305 including the compressed audio signal 300, the compressed visual capture signal 310, the compressed video signal 320, and the compressed digital image signal 330. Although the figure depicts each media signal 305 being received independently, the media signals 305 may be received by the control server 390 over a network connection such as an Internet Protocol (IP) network as a multiplexed signal that is de-multiplexed by the control server 390 when received. In some embodiments, the media signals 305 may be combined into one or more signals encoded into one or more formats by the embedded appliance, which when received, may be decoded and separated by the control server 390. For example, the audio and video signals may be combined into a single MPEG-2 signal before the embedded appliance sends them to the control server 390. Additionally, the control server 390 may also receive media signals 305 from more than one embedded appliance, and may process each media signal 305 in parallel using, for example, multi-threaded processing.
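The demultiplexing and multi-threaded processing mentioned here can be sketched as follows; the `(signal_type, payload)` packet shape is a stand-in for a real container format such as an MPEG-2 transport stream, and is an assumption of this example.

```python
from concurrent.futures import ThreadPoolExecutor

def demultiplex(packets):
    """Split one multiplexed stream into per-type streams, e.g. separating
    audio, visual capture, video, and digital image payloads on receipt."""
    streams = {}
    for signal_type, payload in packets:
        streams.setdefault(signal_type, []).append(payload)
    return streams

def process_all(appliance_streams, process_one):
    """Process media from several appliances in parallel, mirroring the
    multi-threaded processing described above; `process_one` stands in for
    the decode/index/encode pipeline."""
    with ThreadPoolExecutor() as pool:
        results = pool.map(process_one, appliance_streams.values())
        return dict(zip(appliance_streams.keys(), results))

# Example: two media types interleaved in one multiplexed feed.
print(demultiplex([("audio", b"a0"), ("video", b"v0"), ("audio", b"a1")]))
```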
Each compressed media signal 305 received by the control server 390 is processed similarly. Each media signal 305 is processed through a decode module 315, an index module 325, and an encode module 335. After each media signal 305 has been processed (e.g., individually processed, processed as a group), the signals are synchronized and/or formatted by a synchronizer/formatter 350.
The processing of the compressed video signal 320 will be used as a representative example of the processing of the compressed media signals 305; the processing of the remaining signals 305 can be understood in light of this representative example.
When the control server 390 receives the compressed video signal 320, the decoding module 322 decompresses the signal from its compressed format into a decoded video signal. The decode module 322 may be configured to detect the format of the compressed video signal 320 when received to properly decode/decompress the signal 320. When the compressed video signal 320 is converted to a decoded video signal, it may be decoded to its original format or into any other format that may be used by the control server 390 to continue processing the signal. In some embodiments, the compressed video signal 320 may be received in a format that can be processed by the control server 390 without decoding. In this case, the video signal 320 may be made to bypass the decoding module 322.
The index module 324 then processes the decoded video signal to index it, for example by determining and marking scene changes. The indexing is performed so that the decoded video signal can later be properly synchronized with the other media signals 305 by the synchronizer/formatter 350, and so that associated index points are available for use by, for example, an end user (not shown). An analogous index module detects segments, rather than scenes, from the compressed audio signal 300 so that the compressed audio signal 300 can be correctly synchronized with the other media signals 305 and can provide associated index points for use by, for example, an end user (not shown). The decoded video signal, with its index (e.g., scene-change markings), is then encoded by the encode module 326 into an encoding that can be synchronized and formatted by the synchronizer/formatter 350.
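Scene-change indexing of this kind is often implemented as a frame-difference threshold. The sketch below uses that common approach with toy luma-only frames; both the representation and the threshold are illustrative assumptions, not the patented method.

```python
def index_scene_changes(frames, threshold=0.3):
    """Return indices of frames where the picture changes sharply. Each frame
    is a list of 8-bit luma values; the mean absolute difference between
    consecutive frames, normalized to [0, 1], is compared to the threshold."""
    marks = []
    for i in range(1, len(frames)):
        diff = sum(abs(a - b) for a, b in zip(frames[i - 1], frames[i]))
        if diff / (255.0 * len(frames[i])) > threshold:
            marks.append(i)
    return marks

static = [10] * 16
bright = [200] * 16
assert index_scene_changes([static, static, bright, bright]) == [2]
```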
Returning to the general discussion of fig. 3, the synchronizer/formatter 350 receives the media signal 305 after processing by the decode 315, index 325, and encode 335 modules. The synchronizer/formatter 350 indexes, synchronizes and formats the media signals so that they may be accessed by a user via the user interface 340. In the synchronization process, the scenes in each media signal are synchronized with the audio segments so that the sound of, for example, a falling pen hitting the floor can be matched to the video of the pen hitting the floor. The synchronizer/formatter 350 may format the synchronized media signal into any format usable by a user.
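The matching of video index points to audio segments during synchronization can be illustrated with a nearest-boundary search over a shared capture clock. The assumption that both signals carry timestamps from a common clock is this example's, though it is consistent with the synchronous capture described earlier.

```python
from bisect import bisect_left

def nearest_audio_segment(scene_time_s, segment_times_s):
    """Match a video index point to the closest audio segment boundary so,
    e.g., the sound of a dropped pen lines up with the video of it landing.
    Times are seconds on a shared capture clock; boundaries must be sorted."""
    i = bisect_left(segment_times_s, scene_time_s)
    candidates = segment_times_s[max(0, i - 1):i + 1]
    return min(candidates, key=lambda t: abs(t - scene_time_s))

assert nearest_audio_segment(12.4, [0.0, 10.0, 12.5, 30.0]) == 12.5
```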
The synchronizer/formatter 350 may receive auxiliary material 370 and may combine the auxiliary material 370 with the media signals 305 that the modules have processed. For example, the auxiliary material 370 may be additional marking information that can be combined with the processed media signals to assist in the synchronization process. In some embodiments, the auxiliary material may be additional media signals, captured by other multimedia capture devices (not shown), that are to be combined with the already-described media signals 305. Although not shown in FIG. 3, the control server 390 may also include separate modules for decoding, indexing (e.g., scene/segment detection or optical character recognition), and/or encoding the auxiliary material 370 received by the control server 390.
Although fig. 3 depicts separate modules performing the decoding, indexing, encoding, synchronizing, and formatting, the functionality of each module may be further refined and/or combined into one or more processors or modules. These functions may also be refined and/or combined onto more than one control server. Additionally, the control server 390 may also include a memory (not shown) or separate database (not shown) for storing and/or caching information received from one or more embedded devices.
Any combination of the functions performed by any of the modules and/or other components of the control server 390 may alternatively be performed on an embedded appliance. For example, the embedded appliance may perform indexing before compressing the media signal and transmitting it to the control server 390.
The control server 390 may also receive input signals from a user via the user interface 340. For example, the user interface 340 may be a remote computer that interfaces with the control server 390 via a network connection and/or may be an interface integrated into the control server 390. The user interface 340 may be used to control any of the modules and their associated functions and/or to specify parameters for processing information on the control server 390. For example, the user input signal may specify the type of format that should be used by the synchronizer/formatter 350 for a particular set of media signals 305 received at the control server 390. The user interface 340 can be configured to enable a user to manually manipulate any media signal 305 received by embedded appliances distributed over a network.
The user interface 340 can also be used to access, monitor, and/or control any embedded appliances (not shown) that may be connected to the control server 390 and distributed, for example, over a network. Access to the embedded appliances and/or the control server 390 via the user interface 340 can be, for example, password protected. The user interface 340 can be used, for example, to define the schedules used by the embedded appliances, or the schedule used by the control server to send signals that cause the distributed embedded appliances to start and stop capturing, processing, storing, and/or sending. The user interface 340 can also be used to view confidence monitoring signals that can be generated by the embedded appliances connected to the control server 390.
The user interface 340 may also be used to access the final synchronized/formatted content generated by the control server 390. More than one user interface 340 may be distributed over a network and may be configured to access the content produced by the control server 390 (e.g., personal computers that are distributed over a network and used to access the control server 390). In certain embodiments, the control server 390 sends the content to a server (not shown) on which the content is made available to one or more users through a user interface 340.
As shown in FIG. 3, the control server 390 includes an alarm module 380 that is used to detect security breaches of any embedded appliance that may be associated with the control server 390. For example, in the event of a physical breach of an embedded appliance (not shown), the alarm module 380 may be used to send a signal to a user via the user interface 340. In some embodiments, the alarm module 380 can be programmed to send an indicator to a user, for example via e-mail, that a particular embedded appliance has been breached.
Fig. 4 is a system block diagram of an example embodiment of an embedded appliance 400 having a number of input ports 410, a number of output ports 420, a processor system 450, and a memory 460. The embedded appliance 400 captures real-time media signals from an electronic device (e.g., microphone, camera) via the input port 410 in response to start and stop indicators generated by the scheduler 456. The processor system 450 accesses the memory 460 to perform functions associated with the embedded appliance 400, such as storing processed media signals. The embedded appliance 400 transmits the processed media signals to the control server 440 via the output port 420.
The input ports 410 include one or more audio input ports 412, one or more visual capture input ports 414, one or more video input ports 416, and one or more digital image input ports 418. Each input port 410 communicates with a corresponding output port 420 that is configured to output the media signal received via that input port. The output ports 420 include one or more audio output ports 422, one or more visual capture output ports 424, one or more video output ports 426, and one or more digital image output ports 428. The output ports 420 may be used to send processed media signals, stored, for example, in the memory 460, to the control server 440. The output ports 420 may also be used to output signals, such as confidence monitoring signals, to, for example, the control server 440 or other electronic devices 480.
The processor system 450 includes an embedded processor 452, a co-processor 454, and a scheduler 456. For example, the embedded processor 452 and/or the co-processor 454 may be a Digital Signal Processor (DSP) dedicated to processing media signals by capturing, compressing, encoding, decoding, indexing, synchronizing and/or formatting the media signals. For example, the co-processor 454 may be a processor, such as a Field Programmable Gate Array (FPGA), which is programmed to control the functions performed by the embedded processor 452. For example, the co-processor 454 and/or the embedded processor 452 may include the scheduler 456 as a module. In some embodiments, the scheduler 456 is included as a module separate from the processor system 450.
In some embodiments, rather than the processor system 450 having multiple processors, the embedded appliance includes a single processor, which may be any type of processor (e.g., an embedded processor or a general-purpose processor) configured to define and/or operate within an embedded environment. The single processor may be configured to perform the functions performed by the processor system 450 and/or other functions within the embedded appliance 400. In some embodiments, the processor system 450 may include, in addition to the embedded processor 452 and the co-processor 454, other processors and/or co-processors configured to operate in the embedded environment of the embedded appliance 400.
The processor system 450 and/or other processors (not shown) not included in the processor system 450 can be configured to perform additional functions of the embedded appliance 400. For example, the processor system 450 may be configured to support the partitioning of the captured media signal. In such a case, the processor system 450 can be configured to include hardware and/or software modules such as a visual capture distribution amplifier (e.g., an on-board VGA distribution amplifier), a visual capture signal splitter, and/or a visual capture sync amplifier. For example, some combinations of these hardware and/or software modules may cause the embedded appliance 400 to capture a VGA signal via the visual-capture input port 414 and return a copy of the signal (also referred to as a split signal) to the electronic device 480 (e.g., classroom projector) via the visual-capture output port 424. Using these hardware and/or software modules, the processor system 450 may also be configured to synchronize and stabilize the split media signal before it is transmitted to the electronic device 480.
In certain embodiments, for example, processor system 450 may be programmed to support an ethernet switch (not shown) (e.g., a multi-port fast ethernet switch, a gigabit ethernet switch) and/or a power-over-ethernet (PoE) port (not shown). Some embodiments of the embedded appliance 400 may include an integrated relay device (not shown) that shunts signals (e.g., visually captures input signals) through the embedded appliance 400 in the event of a power failure. For example, if the power to the embedded appliance 400 is interrupted, the integrated relay device may transmit a media signal to a classroom projector through the embedded appliance and the output of the output port 420.
FIG. 5 is a flow diagram illustrating the capturing, processing, storing, and/or sending of media signals using an embedded appliance according to an embodiment of the invention. The flow diagram depicts, at 500, the embedded appliance receiving a start capture indicator. The start capture indicator indicates when the embedded appliance is to capture real-time media signals. The start capture indicator at 500 may indicate that the embedded appliance is to begin capturing media signals immediately upon their creation, at a later user-specified time, or according to a schedule. The start capture indicator at 500 may also indicate that the embedded appliance is to capture only a subset of the media signals, for example, only the audio signal and the visual capture signal.
As shown in FIG. 5, the embedded appliance captures and compresses media signals in response to the start capture indicator at 510, 512, 514, and 516. More specifically, at 510 the embedded appliance captures and compresses the audio signal; at 512 it captures and compresses the visual capture signal; at 514 it captures and compresses the digital image signal; and at 516 it captures and compresses the video signal. Although FIG. 5 depicts each of these types of media signals being captured, processed, etc. separately, the remainder of the discussion of FIG. 5 refers to all of the media signals collectively rather than to each individual media signal. Additionally, although the flow diagram depicts all of the media signals, any combination of media signals may be captured, processed, stored, and sent by the embedded appliance. For example, the embedded appliance may capture more than one audio signal and a single visual capture signal without capturing a digital image signal or a video signal.
After the media signals have been captured and compressed at 510-516, each captured media signal is stored on the embedded appliance at 520-526. In this embodiment, the media signals are stored locally on the embedded appliance, but in some embodiments the media signals may be stored, for example, on a remote database that is accessible by the embedded appliance. The flow diagram depicts the capturing and compressing of the media signals at 510-516 and the storing of the media signals at 520-526 as separate steps; the capturing and compressing at 510-516 and the storing at 520-526 continue until the embedded appliance receives a stop capture indicator at 530. The stop capture indicator at 530 indicates that the embedded appliance is to stop capturing, compressing, and storing the media signals.
The start capture indicator at 500 and the stop capture indicator at 530 may be generated by the embedded appliance or by the control server according to a schedule or according to defined criteria. In some embodiments, separate stop and start indicators may be sent to capture different media signals. Although not depicted in this flow chart, the capturing, compressing, and storing of the media signal may be paused and resumed at virtually any time. The pause may be prompted using a stop capture indicator generated by, for example, the control server or the embedded appliance, and the resume prompted by a start capture indicator generated by the same.
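The start/stop/pause/resume flow of FIG. 5 reduces to a small state machine. The sketch below is a minimal illustration in which each `tick()` stands for one capture-compress-store step, an abstraction of this example rather than a detail from the patent.

```python
class CaptureStateMachine:
    """Minimal model of FIG. 5: a stop-capture indicator pauses capture and a
    later start-capture indicator resumes it."""
    def __init__(self):
        self.capturing = False
        self.segments_stored = 0

    def on_indicator(self, indicator: str) -> None:
        if indicator == "start":
            self.capturing = True
        elif indicator == "stop":
            self.capturing = False

    def tick(self) -> None:
        """One capture/compress/store step; a no-op while paused."""
        if self.capturing:
            self.segments_stored += 1

sm = CaptureStateMachine()
for event in ["start", None, "stop", None, "start", None]:
    if event:
        sm.on_indicator(event)
    sm.tick()
assert sm.segments_stored == 4  # ticks while stopped store nothing
```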
At 540, the embedded appliance receives a send indicator indicating that the embedded appliance is to send the stored media signals. The send indicator at 540 may be generated, for example, by the embedded appliance or by the control server according to a schedule. The send indicator at 540 may indicate that the embedded appliance is to send the stored media signals immediately or at some specified later time. The send indicator at 540 may also indicate that the embedded appliance is to send only a portion of one or more stored media signals, for example, only a portion of the captured, compressed, and stored digital image signal.
In response to the send indicator received at 540, the media signals are sent from the embedded appliance at 550-556. The media signals are then decoded, indexed, and encoded at 560-566, and synchronized and formatted at 570. Any of the decoding, indexing, and encoding at 560-566, and of the synchronizing and formatting at 570, may be performed on the embedded appliance or on the control server. For example, indexing of a video signal (e.g., scene detection) may be performed at the embedded appliance before the embedded appliance sends the video signal to, for example, the control server.
After the media signals have been synchronized and formatted at 570, the media signals are made available at 580 for access by a user. The media signals are synchronized based on the markings created during the indexing at 560-566. The media signals may be formatted into one or more types of formats. For example, a user may access the signals at the control server and/or at one or more servers (e.g., servers configured as course management systems) from a personal computer, via a network connection, using a username and password.
In summary, an apparatus and method for capturing, processing, storing, and/or sending media signals using an embedded appliance have been described. While various embodiments of the invention have been described above, it should be understood that they have been presented by way of example only, and that various changes in form and detail may be made. For example, the processors and/or modules of an embedded appliance can be included on separate electronic boards in one or more housings.
Claims (39)
1. A method for multimedia capture, comprising:
receiving a schedule configured to trigger capture and compression of an audio signal and capture and compression of at least one of a visual capture signal or a digital image signal, the schedule received from a control device separate from a device having a processor system;
capturing the audio signal and capturing the at least one of a visual capture signal or a digital image signal based on the schedule;
compressing the audio signal and compressing the at least one of a visual capture signal or a digital image signal based on a compression variable associated with the schedule to produce a compressed signal; and
after generating the compressed signal, transmitting the compressed signal.
2. The method of claim 1, wherein:
the compression variable associated with the schedule includes at least one of a frame rate, a bit rate, a frequency, a resolution, a color, or a stability associated with at least one of the audio signal, visual capture signal, or digital image signal.
3. The method of claim 1, wherein:
the compressing includes compressing the at least one of a visual capture signal or a digital image signal into a lossy format based on the scheduling.
4. The method of claim 1, wherein:
the compressing includes simultaneously compressing the at least one of a visual capture signal or a digital image signal into a first format and a second format based on the schedule.
5. The method of claim 1, wherein:
the compressing includes compressing the at least one of a visual capture signal or a digital image signal into a first format at a first time and into a second format at a second time, the second time different from the first time, based on the schedule.
6. An apparatus for multimedia capture, comprising:
a plurality of input ports including (1) an audio input port configured to receive an audio signal and (2) at least one of a visual capture input port configured to receive a visual capture signal or a digital image input port configured to receive a digital image signal, the plurality of input ports configured to receive a plurality of media signals including the audio signal and at least one of the visual capture signal or the digital image signal; and
a processor system located in an embedded environment and coupled to the plurality of input ports, the plurality of input ports and the processor system being co-located in a multimedia capture device, the processor system configured to receive a schedule from a control device separate from the multimedia capture device,
the processor system is configured to capture the plurality of media signals based on the schedule, the processor system is configured to compress the plurality of media signals based on a compression variable associated with the schedule to produce a plurality of compressed signals, the processor system is configured to transmit the plurality of compressed signals after producing the compressed signals.
7. The apparatus of claim 6, wherein:
the compression variable associated with the schedule includes at least one of a frame rate, a bit rate, a frequency, a resolution, a color, or a stability associated with the plurality of media signals.
8. The apparatus of claim 6, wherein:
the processor system is configured to compress the plurality of media signals into a lossy format based on the schedule.
9. The apparatus of claim 6, wherein:
the processor system is configured to receive the schedule from the control device,
the processor system is configured to send a compressed signal to the control device.
10. The apparatus of claim 6, wherein:
the processor system is configured to receive the schedule from the control device,
the processor system is configured to send the compressed signal to a device separate from the control device and the multimedia capture device.
11. A method for multimedia capture, comprising:
receiving a first direct control signal representative of a start indicator and responsive to a first selection by a user of an embedded appliance containing a processor system, the start indicator configured to trigger capture of at least one of a visual capture signal or a digital image signal and an audio signal;
receiving scheduling information associated with the capture of the audio signal and at least one of the visual capture signal or the digital image signal from a control device separate from the embedded appliance;
capturing at least one of a visual capture signal or a digital image signal and an audio signal based on the start indicator to produce a captured signal;
receiving a second direct control signal representative of a transmit indicator and responsive to a second selection by a user, the transmit indicator configured to trigger transmission of the captured signal; and
transmitting the captured signal after the captured signal is produced such that the captured signal is formatted according to the scheduling information.
12. The method of claim 11, wherein the second direct control signal is received in response to a selection by a user of an embedded appliance containing a processor system.
13. The method of claim 11, wherein the second direct control signal is received in response to a selection of a button of an embedded device containing a processor system.
14. The method of claim 11, wherein the second direct control signal is received when a defined condition is satisfied.
15. The method of claim 11, wherein the second direct control signal is received when the control device is not operatively available, the captured signal being sent to the control device when the control device becomes operatively available.
16. The method of claim 11, wherein:
the first direct control signal is received in response to a first selection by a user of an embedded device containing a processor system; and
the second direct control signal is received in response to a second selection by the user.
17. The method of claim 11, wherein:
the first direct control signal is received in response to selection of a first button of an embedded device containing a processor system; and
the second direct control signal is received in response to a selection of a second button of the embedded appliance.
18. The method of claim 11, wherein:
the first direct control signal is received when a first defined condition is met,
the second direct control signal is received when a second defined condition is satisfied.
19. The method of claim 11, wherein:
the start indicator is a first start indicator received at a first time, the transmit indicator is a first transmit indicator received at a second time after the first time,
the scheduling information includes at least one of a second start indicator or a second transmit indicator, the scheduling information is received at a third time prior to the first time and prior to the second time,
at least one of the second start indicator or the second transmit indicator is replaced by the first start indicator or the first transmit indicator, respectively.
20. A method for multimedia capture, comprising:
receiving a start indicator configured to trigger capture of an audio signal on a processor system and capture of at least one of a visual capture signal or a digital image signal on the processor system;
capturing an audio signal on the processor system in response to the start indicator;
capturing, in response to the start indicator, at least one of a visual capture signal or a digital image signal on the processor system, the processor system being located in a special-purpose embedded appliance dedicated to capturing, processing, storing and sending a plurality of real-time signals including the audio signal and the at least one of a visual capture signal or a digital image signal; and
the following transmission is performed: (1) transmitting the audio signal after the audio signal is captured, (2) transmitting at least one of the visual capture signal or the digital image signal after the at least one of the visual capture signal or the digital image signal is captured, and (3) transmitting data associated with the audio signal and the at least one of the visual capture signal or the digital image signal, the data including a capture time and a capture location.
21. The method of claim 20, wherein the data includes a speaker name.
22. The method of claim 20, wherein the first portion of the audio signal and at least one of the data, the visual capture signal, or the digital image signal are transmitted to a first control device of a plurality of control devices operatively coupled to the processor system and are not transmitted to a second control device of the plurality of control devices.
23. The method of claim 20, wherein:
the first portion of the audio signal and at least one of the data, the visual capture signal or the digital image signal are transmitted to a first control device of a plurality of control devices operatively coupled to the processor system and are not transmitted to a second control device of the plurality of control devices; and
a second portion of the audio signal and at least one of the visual capture signal or the digital image signal are transmitted to the second control device and not to the first control device.
24. The method of claim 20, wherein:
the first portion of the audio signal and at least one of the data, visual capture signal or digital image signal are transmitted to a first control device of a plurality of control devices operatively coupled to the processor system and are not transmitted to a second control device of the plurality of control devices;
each control device of the plurality of control devices is operatively coupled to the remaining control devices of the plurality of control devices; and
the processor system is a first processor system, and the first control device is operatively coupled to a second processor system that is physically separate from the first processor system.
25. A method for multimedia capture, comprising:
capturing an audio signal on a processor system of a specific-purpose embedded appliance;
capturing at least one of a visual capture signal or a digital image signal on a processor system of a specific-purpose embedded appliance dedicated to capturing, processing, storing, and transmitting a plurality of real-time signals including at least one of a visual capture signal or a digital image signal and an audio signal; and
transmitting at least one of a visual capture signal or a digital image signal and a first portion of an audio signal to a first control device of a plurality of control devices operatively coupled to the processor system and not to a second control device of the plurality of control devices.
26. The method of claim 25, further comprising:
identifying the plurality of control devices before the first portion of the audio signal is transmitted to the first control device,
in response to the plurality of control devices being identified, a first portion of the audio signal is transmitted to a first control device.
27. The method of claim 25, further comprising:
transmitting the at least one of the visual capture signal or the digital image signal and a second portion of the audio signal to the second control device and not to the first control device.
28. The method of claim 25, wherein each of the plurality of control devices is operatively coupled to the remaining ones of the plurality of control devices.
29. The method of claim 25, wherein the processor system is a first processor system, the first control device being operatively coupled to a second processor system that is physically separate from the first processor system.
30. The method of claim 25, further comprising:
sending data associated with the audio signal and at least one of the visual capture signal or the digital image signal to the first control device, the data including a capture time and a capture location.
31. The method of claim 25, further comprising:
transmitting the at least one of the visual capture signal or the digital image signal and a second portion of the audio signal to the second control device and not to the first control device;
sending data associated with the first portion of the audio signal and the at least one of the visual capture signal or the digital image signal to the first control device, the data including a capture time and a capture location; and
sending data associated with the second portion of the audio signal and the at least one of the visual capture signal or the digital image signal to the second control device, the data including a capture time and a capture location.
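Claims 25 through 31 describe directing different portions of the captured streams, together with their associated data, to different control devices. The following sketch illustrates one way such selective routing could look; the pairing of portions with devices and every identifier here (`route_portions`, `device.send`, `metadata_for`) is an assumption, not something the claims specify.

```python
def route_portions(audio_portions, image_signal, control_devices, metadata_for):
    """Send each audio portion (plus the image signal and its metadata)
    to exactly one control device and to no other device."""
    for portion, device in zip(audio_portions, control_devices):
        device.send("audio", portion)
        device.send("image", image_signal)
        # Associated data per claims 30-31: capture time and capture location.
        device.send("metadata", metadata_for(portion))
```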
32. A method for multimedia capture, comprising:
receiving a schedule configured to trigger capture and compression of an audio signal and capture and compression of at least one of a visual capture signal or a digital image signal, the schedule received from a control device separate from a device having a processor system;
capturing an audio signal through a first input port based on the schedule to produce a first captured signal;
capturing at least one of a visual capture signal or a digital image signal through a second input port based on the schedule to produce a second captured signal;
detecting a device type of a capture device coupled to the first input port;
detecting a device type of a capture device coupled to the second input port;
compressing the first captured signal based on a first compression variable associated with the schedule and based on the device type of the capture device coupled to the first input port to produce a first compressed signal;
compressing the second captured signal based on a second compression variable associated with the schedule and based on the device type of the capture device coupled to the second input port to produce a second compressed signal; and
transmitting the first compressed signal and the second compressed signal after the first compressed signal and the second compressed signal are produced.
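To make the sequence in claim 32 easier to follow, here is a compact sketch of one schedule-driven capture pass. It is a reading aid only; the `schedule`, `appliance`, and `compress` interfaces are invented for illustration and are not defined by the patent.

```python
def compress(signal, variables, device_type):
    # Placeholder: a real appliance would pick a codec (see claims 33-37)
    # and apply the schedule's compression variables here.
    raise NotImplementedError

def run_scheduled_capture(schedule, appliance):
    # The schedule arrives from a separate control device and triggers capture.
    schedule.wait_until_start()

    # Capture a signal through each of the two input ports.
    first_captured = appliance.ports[0].capture()   # audio signal
    second_captured = appliance.ports[1].capture()  # visual capture / digital image

    # Detect the type of capture device attached to each port.
    first_type = appliance.ports[0].detect_device_type()
    second_type = appliance.ports[1].detect_device_type()

    # Compress each captured signal using the schedule's compression
    # variables together with the detected device type.
    first_compressed = compress(first_captured, schedule.compression_vars[0], first_type)
    second_compressed = compress(second_captured, schedule.compression_vars[1], second_type)

    # Transmit only after both compressed signals have been produced.
    appliance.transmit(first_compressed)
    appliance.transmit(second_compressed)
```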
33. The method of claim 32, further comprising:
selecting a first codec from a plurality of codecs based on the device type of the capture device coupled to the first input port before the first captured signal is compressed; and
selecting a second codec from the plurality of codecs based on the device type of the capture device coupled to the second input port before the second captured signal is compressed.
34. The method of claim 32, further comprising:
selecting a first codec from a plurality of codecs based on the device type of the capture device coupled to the first input port; and
selecting a second codec from the plurality of codecs based on the device type of the capture device coupled to the second input port,
wherein the first captured signal is compressed using the first codec to produce the first compressed signal, and
the second captured signal is compressed using the second codec to produce the second compressed signal.
35. The method of claim 32, wherein:
the capture device coupled to the second input port is coupled to the second input port at a second time after a first time, and
detecting the device type of the capture device coupled to the second input port includes detecting a difference between a capture device coupled to the second input port at the first time and the capture device coupled to the second input port at the second time.
36. The method of claim 32, wherein:
the capture device coupled to the second input port is coupled to the second input port at a second time after a first time,
detecting the device type of the capture device coupled to the second input port comprises detecting a difference between a capture device coupled to the second input port at the first time and the capture device coupled to the second input port at the second time,
the method further comprises:
a codec is selected from a plurality of codecs based on the device type of the capture device coupled to the second input port at the second time, and the second captured signal is compressed using the codec to produce the second compressed signal.
37. The method of claim 32, wherein:
the second captured signal is a digital image signal,
the capture device coupled to the second input port is coupled to the second input port at a second time after a first time, and
detecting the device type of the capture device coupled to the second input port comprises detecting a difference between a capture device coupled to the second input port at the first time and the capture device coupled to the second input port at the second time,
the method further comprises:
selecting a first codec from a plurality of codecs when the device type of the capture device coupled to the second input port at the second time is a webcam, the first codec being configured to produce a compressed signal in TIFF format; and
selecting a second codec from the plurality of codecs when the device type of the capture device coupled to the second input port at the second time is a digital camera, the second codec being configured to produce a compressed signal in JPEG format.
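Claim 37's webcam-to-TIFF and digital-camera-to-JPEG pairing amounts to a lookup from detected device type to codec. A minimal sketch of that mapping follows; everything beyond the two pairings named in the claim, including the fallback behavior and the string keys, is an assumption.

```python
# Codec selection keyed by detected device type. Only the webcam -> TIFF and
# digital camera -> JPEG pairings come from claim 37; the default is assumed.
CODEC_BY_DEVICE_TYPE = {
    "webcam": "tiff",
    "digital_camera": "jpeg",
}

def select_codec(device_type: str, default: str = "jpeg") -> str:
    # Fall back to a default codec for device types the table does not name.
    return CODEC_BY_DEVICE_TYPE.get(device_type, default)
```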
38. The method of claim 32, wherein:
a first compression variable associated with the schedule includes at least one of a frame rate, a bit rate, a frequency, a resolution, a color, or a stability associated with the first captured signal,
a second compression variable associated with the schedule includes at least one of a frame rate, a bit rate, a frequency, a resolution, a color, or a stability associated with the second captured signal.
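The compression variables enumerated in claim 38 can be pictured as a simple record attached to the schedule. The sketch below is illustrative only; the field types and units are assumptions, since the claim names the variables without defining them.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CompressionVariables:
    # One field per variable named in claim 38; a schedule may set any subset.
    frame_rate: Optional[float] = None            # frames per second
    bit_rate: Optional[int] = None                # bits per second
    frequency: Optional[int] = None               # e.g. audio sample rate in Hz
    resolution: Optional[Tuple[int, int]] = None  # (width, height) in pixels
    color: Optional[str] = None                   # e.g. "color" or "grayscale"
    stability: Optional[bool] = None              # whether stabilization applies
```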
39. The method of claim 32, wherein:
compressing the first captured signal includes compressing the first captured signal into a lossy format based on a first compression variable associated with the schedule to produce a first compressed signal;
compressing the second captured signal includes compressing the second captured signal into a lossy format based on a second compression variable associated with the schedule to produce a second compressed signal.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
US11/472,997 | 2006-06-23 | |
Publications (2)
Publication Number | Publication Date
---|---
HK1194883A (en) | 2014-10-24
HK1194883B (en) | 2018-02-23