
WO2015118664A1 - Image transmission device, image reception device, and surveillance camera system, video conference system, and vehicle-mounted camera system using same - Google Patents

Image transmission device, image reception device, and surveillance camera system, video conference system, and vehicle-mounted camera system using same

Info

Publication number
WO2015118664A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image data
unit
processing unit
compression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/052925
Other languages
English (en)
Japanese (ja)
Inventor
稲田 圭介
甲 展明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maxell Ltd
Original Assignee
Hitachi Maxell Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Maxell Ltd filed Critical Hitachi Maxell Ltd
Priority to PCT/JP2014/052925 priority Critical patent/WO2015118664A1/fr
Publication of WO2015118664A1 publication Critical patent/WO2015118664A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2365Multiplexing of several video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • H04N19/109Selection of coding mode or of prediction mode among a plurality of temporal predictive coding modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/115Selection of the code volume for a coding unit prior to coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/14Coding unit complexity, e.g. amount of activity or edge presence estimation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the technical field of the present invention relates to video information transmission / reception technology and a system using the technology.
  • DisplayPort: registered trademark of VESA
  • HDMI: High-Definition Multimedia Interface (registered trademark of HDMI Licensing, LLC)
  • VESA: Video Electronics Standards Association
  • Ethernet: registered trademark
  • AVB: Audio Video Bridging
  • Patent Document 1 describes "a video transmission system and a video transmission control method comprising a video transmission apparatus 2 that compresses and encodes a video signal from a camera 1 and sends it out, and an operation terminal 3 that decodes the compression-encoded video signal received from the video transmission apparatus 2, displays it on a video display unit 11, and controls the camera 1, wherein the video transmission apparatus 2 includes a low-delay mode encoding unit 7 and a high-quality mode encoding unit 8 that compress and encode the video signal from the camera 1, a mode selection unit 6, and a control signal processing unit 9 that, when a camera control signal is received from the operation terminal 3, controls the mode selection unit 6 to select the low-delay mode encoding unit 7 as a low-delay mode and controls the imaging direction or imaging range of the camera 1, thereby reducing the display delay time on the video display unit 11" (see FIG. 1 of Patent Document 1).
  • In this way, the delay amount can be set by switching between the so-called two types of compression, compression that achieves high image quality and compression that achieves low delay, according to the required delay amount.
  • However, this approach has a problem in that, when compression is performed with low delay, the image quality deteriorates compared with the case where a larger delay amount is allowed.
  • The present invention has been made in view of the above-described prior art, and an object thereof is to provide an image transmission apparatus, an image reception apparatus, and an image transmission method that can transmit image data with high image quality and low delay, as well as a surveillance camera system, a video conference system, and an in-vehicle camera system using them.
  • In order to achieve the above-described object, an image transmission device and an image reception device as described in the claims are provided. More specifically, the image transmission device includes an image input unit to which image data is input, a compression processing unit that generates compressed image data from the image input to the image input unit, and a data transfer unit that outputs the compressed image data generated by the compression processing unit to a transmission line; the compression processing unit generates a plurality of compressed image data having different compression delay times, and the data transfer unit multiplexes and outputs the plurality of compressed image data having different compression delay times generated by the compression processing unit.
  • The image reception device includes a data receiving unit that receives the multiplexed plurality of compressed image data having different compression delay times and extracts each compressed image data, a decompression processing unit that decompresses the plurality of compressed image data having different compression delay times received by the data receiving unit to generate a plurality of image data, an additional image generation unit that generates additional image data, and an image superimposing unit that superimposes a plurality of image data.
  • The image superimposing unit superimposes the additional image data, or other image data generated by the decompression processing unit, on a part of the image data generated by the decompression processing unit, and outputs the result.
  • Further, in order to achieve the above-described object, a surveillance camera system, a video conference system, and an in-vehicle camera system are provided as systems using the image transmission device and the image reception device.
  • According to the present invention, an image transmission apparatus and an image receiving apparatus capable of transmitting image data with high image quality and low delay are provided, and furthermore, a practically excellent surveillance camera system, video conference system, and in-vehicle camera system using them are provided.
  • FIG. 1 is a block diagram showing an image transmission system according to the present embodiment. As is apparent from the figure, the image transmission system has a configuration in which an image transmission apparatus 1 and an image reception apparatus 2 are connected by a cable 3.
  • The image transmission device 1 is a device that compresses image data and transmits it. More specifically, the image transmission device 1 is a so-called image recording/playback device that receives image data decoded from a received digital broadcast for viewing, or image data captured by a camera, and outputs image data to another device via an HDMI cable, a LAN, or the like.
  • Examples of the image transmission apparatus 1 include a recorder, a digital TV with a built-in recorder function, a personal computer with a built-in recorder function, a mobile phone with a camera function and a recorder function, a camcorder, an in-vehicle camera, and the like.
  • the image receiving device 2 is a display device that inputs image data and outputs an image to a monitor using an HDMI cable, a LAN, or the like.
  • Examples of the image receiving device 2 include a digital TV, a display, a projector, a mobile phone, a signage device, an in-vehicle peripheral monitoring device, and the like.
  • The cable 3 is a data transmission path for performing data communication, such as communication of image data, between the image transmission device 1 and the image reception device 2.
  • Examples of the cable 3 include a wired cable compliant with the HDMI standard, the DisplayPort standard, or Ethernet, or a data transmission path for performing wireless data communication.
  • the image input unit 10 is an input unit for inputting image data to the image transmission apparatus 1.
  • Examples of the image data input to the image input unit 10 include digital image data input from a monitoring camera or an in-vehicle camera.
  • the image processing unit 11 performs digital image processing on the image data from the image input unit 10 and outputs the image data after the digital image processing to the compression processing unit 12.
  • Examples of digital image processing include rotation, enlargement or reduction processing, frame rate conversion processing, edge extraction, motion vector extraction, high frequency component removal, noise removal, and the like.
  • the compression processing unit 12 performs compression processing on the image data from the image input unit 10 and the image data from the image processing unit 11 and outputs the compressed data to the data transfer unit 13.
  • the data transfer unit 13 converts the two types of compressed image data (compressed image data A and compressed image data B) compressed by the compression processing unit 12 into signals in a format suitable for cable transmission and outputs the signals to the cable 3.
  • An example of a signal in a format suitable for cable transmission is described in the HDMI standard.
  • In the HDMI standard, image data adopts the TMDS data transmission format.
  • Another example of a signal in a format suitable for cable transmission is described in the IEEE P1722 standard used in in-vehicle Ethernet.
  • the compressed image data is transmitted according to AVTP (Audio Video Transport Protocol) Video Protocol.
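  • As a rough illustration only, the following Python sketch shows one way a data transfer unit could tag and interleave the two compressed streams before sending them over a single transmission path. The packet layout, field names, and the send() callback are assumptions made for this example; they are not the AVTP Video PDU format or any part of the HDMI or P1722 specifications.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class MuxPacket:
    stream_id: str    # "A" = high-quality stream, "B" = low-delay stream (names assumed)
    frame_count: int  # frame counter shared by both streams for the same input frame
    payload: bytes    # chunk of compressed image data

def multiplex(stream_a: Iterable[bytes], stream_b: Iterable[bytes],
              send: Callable[[MuxPacket], None]) -> None:
    """Interleave per-frame chunks of the two compressed streams onto one link.

    The low-delay stream B is sent first within each frame period so that the
    receiver can start recognition processing as early as possible.
    """
    for frame_count, (chunk_a, chunk_b) in enumerate(zip(stream_a, stream_b)):
        send(MuxPacket("B", frame_count, chunk_b))  # low-delay data first
        send(MuxPacket("A", frame_count, chunk_a))  # high-quality data next

# Example: collect the packets instead of writing them to a cable.
if __name__ == "__main__":
    sent = []
    multiplex([b"a0", b"a1"], [b"b0", b"b1"], sent.append)
    print([(p.stream_id, p.frame_count) for p in sent])
```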
  • the user IF unit 14 is an input unit for inputting a signal for controlling the operation of the image transmission apparatus 1.
  • An example of the user IF unit 14 is a remote control receiving unit.
  • the control signal from the user IF unit 14 is output to the control unit 15.
  • the control unit 15 controls the entire image transmission apparatus 1 in accordance with the signal from the user IF unit 14.
  • An example of the control unit 15 is a microprocessor. Image data from the image transmission apparatus 1 is supplied to the image reception apparatus 2 via the cable 3.
  • Data receiving unit 20 receives a signal in a format suitable for cable transmission.
  • The signal input to the data receiving unit 20 is converted from the cable transmission format into predetermined digital data, and the two types of compressed image data (compressed image data A and compressed image data B) are extracted and output to the decompression processing unit 21.
  • the decompression processing unit 21 decompresses the compression processing performed by the compression processing unit 12 in the image transmission apparatus 1 to generate two types of image data (decoded image data A and decoded image data B). Also, the decoded image data A is output to the image superimposing unit 23, and the decoded image data B is output to the additional image generating unit 22.
  • The additional image generation unit 22 extracts image features from the decoded image data B, generates additional image data based on the result, and outputs the additional image data to the image superimposing unit 23.
  • Examples of image feature extraction include object detection, moving object detection, state detection, edge extraction, and extraction of pixels having luminance within a predetermined threshold range.
  • Examples of object detection in an in-vehicle periphery monitoring system include vehicle detection, detection of objects darting out into the road, lane detection, road surface detection, and detection of objects such as traffic lights.
  • Another example of object detection is suspicious person detection in a monitoring system.
  • Examples of state detection include red signal detection, road surface state detection, weather detection, and wrong-way driving detection.
  • The additional image generation unit 22 may output the decoded image data B as it is, or may output image feature extraction information.
  • The image superimposing unit 23 generates image data in which the two types of image data input from the decompression processing unit 21 and the additional image generation unit 22 are superimposed, and outputs the generated image data to the display unit 24.
  • The image superimposing unit 23 includes a memory unit and a memory control unit for temporarily storing images. During the period until the decoded image data A is input to the image superimposing unit 23, the additional image data is temporarily stored in the memory unit; at the timing of the superimposition processing, the additional image data is read from the memory unit and superimposed on the decoded image data A. Further, the image superimposing unit 23 may perform image quality improvement processing on the decoded image data A using the decoded image data B or the additional image data. One example of such processing is edge enhancement of the decoded image data A using the decoded image data B or additional image data that includes edge detection information. A rough sketch of this buffering behaviour follows.
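  • The following Python/NumPy code is an informal sketch of the buffering behaviour described above, not the actual implementation of the image superimposing unit 23: the additional image data is held in a memory buffer until the corresponding decoded image data A arrives, then overlaid onto it. The pixel-overwrite rule and the buffer structure are assumptions made for illustration.

```python
import numpy as np

class ImageSuperimposer:
    """Buffers additional image data until decoded image data A is available."""

    def __init__(self) -> None:
        self._pending = {}  # frame_count -> additional image (H, W, 3) uint8

    def store_additional(self, frame_count: int, additional: np.ndarray) -> None:
        # Temporarily store the additional image (e.g. detection rectangles).
        self._pending[frame_count] = additional

    def superimpose(self, frame_count: int, decoded_a: np.ndarray) -> np.ndarray:
        # Read the buffered additional image at the superimposition timing and
        # overwrite decoded image A wherever the additional image is non-black.
        additional = self._pending.pop(frame_count, None)
        if additional is None:
            return decoded_a
        mask = additional.any(axis=-1, keepdims=True)
        return np.where(mask, additional, decoded_a)

# Example with tiny dummy frames.
if __name__ == "__main__":
    sup = ImageSuperimposer()
    overlay = np.zeros((4, 4, 3), np.uint8)
    overlay[1, 1] = (0, 255, 0)
    sup.store_additional(0, overlay)
    out = sup.superimpose(0, np.full((4, 4, 3), 128, np.uint8))
    print(out[1, 1])  # -> [  0 255   0]
```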
  • the display unit 24 converts the input image data into a signal suitable for the display method and displays it on the screen.
  • Examples of the display unit 24 include a display unit such as a liquid crystal display, a plasma display, an organic EL (Electro-Luminescence) display, and a projector projection display.
  • the user IF unit 25 is an input unit for inputting a signal for controlling the operation of the image receiving device 2.
  • An example of the user IF unit 25 is a remote control receiving unit.
  • a control signal from the user IF unit 25 is supplied to the control unit 26.
  • the control unit 26 is a control unit that controls the entire image receiving apparatus 2 in accordance with a signal from the user IF unit 25.
  • FIG. 2 is a diagram showing an effective area in which image data in one frame period is transmitted and a blanking period in which image data is not transmitted.
  • a region indicated by reference numeral 400 is a vertical period, and the vertical period 400 includes a vertical blanking period 401 and a vertical effective period 402.
  • The VSYNC signal is a 1-bit signal that is set to 1 during a prescribed number of lines from the beginning of the vertical blanking period 401 and set to 0 during the rest of the vertical blanking period and the vertical effective period 402.
  • An example of the prescribed number of lines is 4 lines.
  • An area indicated by a reference numeral 403 is a horizontal period, and the horizontal period 403 includes a horizontal blanking period 404 and a horizontal effective period 405.
  • The HSYNC signal is a 1-bit signal that is set to 1 during a prescribed number of pixels from the beginning of the horizontal blanking period 404 and set to 0 during the rest of the horizontal blanking period and the horizontal effective period 405.
  • An example of the prescribed number of pixels is 40 pixels.
  • the effective period 406 is an area surrounded by a vertical effective period 402 and a horizontal effective period 405, and image data is allocated to this period.
  • the blanking period 407 is an area surrounded by a vertical blanking period 401 and a horizontal blanking period 404.
  • the compressed image data and the sub-compression code information are transmitted during the effective period 406, and the main compression code information is transmitted during the blanking period 407.
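  • The following sketch illustrates, under assumed frame dimensions, how a position within a frame can be classified into the effective area 406 or the blanking period 407, and how the VSYNC/HSYNC flags described above would be derived. The values 4 lines and 40 pixels come from the examples in the text; the total line and pixel counts are arbitrary assumptions, not values from the specification.

```python
# Assumed example geometry (not from the specification): 1125 total lines of
# which 45 are vertical blanking, 2200 total pixels per line of which 280 are
# horizontal blanking.
TOTAL_LINES, VBLANK_LINES = 1125, 45
TOTAL_PIXELS, HBLANK_PIXELS = 2200, 280
VSYNC_LINES, HSYNC_PIXELS = 4, 40  # example values given in the description

def classify(line: int, pixel: int) -> dict:
    """Return sync flags and the region/data type for one position in the frame."""
    in_vblank = line < VBLANK_LINES
    in_hblank = pixel < HBLANK_PIXELS
    vsync = 1 if line < VSYNC_LINES else 0
    hsync = 1 if pixel < HSYNC_PIXELS else 0
    if not in_vblank and not in_hblank:
        region = "effective area 406"
        data = "compressed image data + sub-compression code information"
    else:
        region = "blanking period 407"
        data = "main compression code information / packetized attached data"
    return {"vsync": vsync, "hsync": hsync, "region": region, "data": data}

if __name__ == "__main__":
    print(classify(2, 10))      # inside vertical blanking, VSYNC and HSYNC asserted
    print(classify(500, 1000))  # inside the effective area
```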
  • In the blanking period 407, data obtained by packetizing audio data and other attached data is also transmitted.
  • a method of sending a packet of voice data or the like as a reliable packet in the blanking period 407 is disclosed in, for example, Japanese translations of PCT publication No. 2005-514873.
  • Since an error correction code is included in the packet data in the blanking period, errors occurring in the transmission path can be corrected, which increases error tolerance. Further, the packet data in the blanking period is transmitted over two physically different channels, and the channel used for transmission is switched at fixed intervals. Therefore, a burst error occurring in one channel does not affect the other channel, so that the data error can be corrected.
  • In terms of error rate, this yields an improvement to about 10⁻¹⁴ in the horizontal blanking period, versus about 10⁻⁹ in the horizontal effective period.
  • FIG. 3 is a diagram illustrating a detailed example of the image processing unit 11 described above.
  • the frame thinning unit 110 is a block for thinning out frames of input image data supplied from the image input unit 10.
  • For example, for input image data input at 30 fps (frames per second), every other frame is thinned out and image data at 15 fps is output.
  • Alternatively, the input image data may be output at its original frame rate without thinning.
  • the image feature extraction image generation unit 112 extracts features of the input image from the input image data, and generates image data obtained by simplifying or omitting image data other than the features.
  • Examples of feature extraction processing for image data include first-order differential filters (edge detection, line detection) and second-order differential filters (Laplacian filter).
  • An example of edge detection is a Sobel filter.
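  • A minimal NumPy sketch of the kind of processing attributed to the image processing unit 11 follows: frame thinning (30 fps to 15 fps by dropping every other frame) and first-order differential (Sobel) edge extraction. This is an illustrative approximation made under the assumption of grayscale input, not the unit's actual implementation.

```python
import numpy as np

def thin_frames(frames, keep_every=2):
    """Frame thinning: keep_every=2 turns 30 fps input into 15 fps output."""
    for i, frame in enumerate(frames):
        if i % keep_every == 0:
            yield frame

def sobel_edges(gray: np.ndarray) -> np.ndarray:
    """First-order differential (Sobel) edge magnitude of a grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
    ky = kx.T
    padded = np.pad(gray.astype(np.float32), 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w), np.float32)
    gy = np.zeros((h, w), np.float32)
    for dy in range(3):
        for dx in range(3):
            window = padded[dy:dy + h, dx:dx + w]
            gx += kx[dy, dx] * window
            gy += ky[dy, dx] * window
    mag = np.hypot(gx, gy)
    return np.clip(mag, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    img = np.zeros((8, 8), np.uint8)
    img[:, 4:] = 255                           # a vertical edge
    edges = sobel_edges(img)
    print(edges[4, 3], edges[4, 6])            # strong response at the edge, none elsewhere
    print(len(list(thin_frames(range(30)))))   # 30 input frames -> 15 kept
```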
  • Although the frame thinning unit 110, the reduction unit 111, and the image feature extraction image generation unit 112 have been described as components of the image processing unit 11, not all of these components need to be included.
  • For example, the image processing unit 11 may be configured by only the reduction unit 111, or by only the image feature extraction image generation unit 112.
  • FIG. 4 is a diagram illustrating a detailed example of the compression processing unit 12.
  • a compression unit A that generates compressed image data A used for viewing and a compression unit B that generates compressed image data B used for recognition are provided.
  • the compression processing delay in the compression unit B is short.
  • the compression unit A performs high-quality compression processing on the image data supplied from the image input unit 10.
  • One example is compression processing using H.264 B-pictures.
  • A B-picture is compressed using both past and future frames. Because a future frame is used, a delay of one frame time or more occurs during compression processing, so this method is not suitable for low-delay transmission; however, a high-quality decoded image can be obtained, making it suitable for viewing purposes.
  • the compression unit B performs low-delay compression processing on the image data supplied from the image processing unit 11.
  • One example is compression processing using H.264 P-pictures.
  • A P-picture is compressed using only past frames. Because no future frame is used, the compression delay associated with waiting for future frames does not occur in the compression unit B, making this method suitable for recognition applications such as object detection.
  • The image data supplied from the image processing unit 11 has already been subjected to digital image processing that reduces the amount of image information, so the amount of code generated after compression can be greatly reduced. In other words, since the feature-extracted image data requires only a small amount of code, a larger amount of code can be allocated to the compression unit A, which can therefore compress with higher image quality.
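  • To make the contrast between the two compression paths concrete, here is a hedged sketch that drives a general-purpose H.264 encoder (libx264 via the ffmpeg command line) in two configurations: B-pictures allowed for the viewing stream, and P-pictures only with low-latency tuning for the recognition stream. The patent does not prescribe ffmpeg or these options, and the file names are placeholders; this merely stands in for compression unit A and compression unit B.

```python
import subprocess

def encode(src: str, dst: str, extra_args: list[str]) -> None:
    """Run ffmpeg (assumed to be installed) with the given libx264 options."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", *extra_args, dst],
        check=True,
    )

if __name__ == "__main__":
    # Compression unit A: viewing stream. B-pictures allowed, so future frames
    # are referenced (at least one frame of added delay, better quality).
    encode("camera_input.mp4", "stream_a.mp4", ["-bf", "2", "-g", "30"])

    # Compression unit B: recognition stream. No B-pictures and zero-latency
    # tuning, so only past frames are referenced and there is no wait for
    # future frames.
    encode("edge_extracted_input.mp4", "stream_b.mp4",
           ["-bf", "0", "-tune", "zerolatency", "-g", "30"])
```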
  • the memory control unit 122 performs control for temporarily storing the compressed image data B supplied from the compression unit B in the memory unit 123.
  • An example of this control is a method of temporarily storing the compressed image data B processed during the effective period 406 in the memory unit 123 and outputting it during the subsequent blanking period 407.
  • FIGS. 5A to 5D show examples of input image data and output image data of the additional image generating unit 22 and the image superimposing unit 23.
  • An image 50 shown in FIG. 5A shows an example in which image data (decoded image of the compressed image data A) supplied from the decompression processing unit 21 to the image superimposing unit 23 is displayed as an image. That is, in the image 50, two vehicles, a traveling vehicle 501 and a traveling vehicle 502, are drawn on the traveling road surface 500.
  • The traveling vehicle 501 is a vehicle that has cut into the host vehicle's lane from the side, and the traveling vehicle 502 is a vehicle far ahead of the host vehicle.
  • a background 503 is also drawn.
  • An image 51 shown in FIG. 5B shows an example in which image data (decoded image of the compressed image data B) supplied from the decompression processing unit 21 to the additional image generation unit 22 is displayed as an image. That is, the compressed image data B is image data after edge detection by the image processing unit 11, and image information other than the edge information is simplified or removed.
  • a drawing 511 is an image after the edge of the traveling vehicle 501 is extracted.
  • a drawing 512 is an image after edge extraction of the traveling vehicle 502.
  • An image 52 shown in FIG. 5C is an example in which the image data supplied from the additional image generating unit 22 to the image superimposing unit 23 is displayed as an image.
  • The additional image generation unit 22 performs vehicle detection on the input image 51 and generates a rectangular figure surrounding each detected vehicle.
  • The color of the rectangular figure may be changed according to the distance and direction from the host vehicle. The figure may also be changed over time, for example by blinking. A rough drawing sketch is given after the description of FIG. 5D below.
  • The rectangular figure 521 surrounds the traveling vehicle 501, and the rectangular figure 522 surrounds the traveling vehicle 502.
  • An image 53 shown in FIG. 5D is an example in which image data supplied from the image superimposing unit 23 to the display unit 24 is displayed as an image.
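  • As referenced above, here is a small OpenCV sketch of the kind of additional image the additional image generation unit 22 is described as producing: rectangles drawn around detected vehicles, with the colour varied by distance. The detection results are assumed to already exist (for example from recognition on the decoded edge image), and the colour thresholds and drawing style are illustrative choices only.

```python
import numpy as np
import cv2

def make_additional_image(frame_shape, detections):
    """Draw a rectangle around each detection on an otherwise black image.

    detections: list of (x, y, w, h, distance_m) tuples, assumed to come from
    a vehicle detector run on the decoded edge image (decoded image data B).
    """
    additional = np.zeros(frame_shape, dtype=np.uint8)  # black = left untouched when superimposed
    for x, y, w, h, distance in detections:
        # Nearby vehicles in red, distant ones in green (BGR order).
        color = (0, 0, 255) if distance < 10.0 else (0, 255, 0)
        cv2.rectangle(additional, (x, y), (x + w, y + h), color, thickness=2)
    return additional

if __name__ == "__main__":
    overlay = make_additional_image((480, 640, 3), [(100, 200, 80, 60, 6.5),
                                                    (400, 180, 40, 30, 45.0)])
    cv2.imwrite("additional.png", overlay)
```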
  • FIG. 6 is a diagram illustrating a detailed example of the decompression processing unit 21.
  • the decompression unit A210 performs decompression processing on the compressed image data A to generate decoded image data A.
  • the decompression unit B211 performs decompression processing on the compressed image data B and generates decoded image data B.
  • FIG. 7 is a diagram showing an example of an image input and output timing chart in the present embodiment.
  • a signal 600 indicates a vertical blanking signal for the input images 604 and 605 of the image transmission apparatus 1.
  • a signal 610 indicates a vertical blanking signal for the output images 614, 615, and 617 of the image transmission apparatus 1.
  • a signal 620 indicates a vertical blanking signal for the input images 624, 625, and 627 of the image superimposing unit 23 of the image receiving device 2.
  • a signal 630 indicates a vertical blanking signal for the output image 631 of the image superimposing unit 23 of the image receiving device 2.
  • the vertical blanking signals 600, 610, 620, and 630 are composed of an effective period 601 and a blanking period 602.
  • the output image 614 is compressed image data B obtained by compressing the input image 604 by the compression unit B.
  • the output image 617 is compressed image data A obtained by compressing the input image 604 by the compression unit A.
  • the compression delay time 618 for the compressed image data B is shorter than the compression delay time 619 for the compressed image data A.
  • For the compression unit A, an example is shown in which compression processing is performed using a future frame one frame ahead.
  • the input image 624 is decoded image data B obtained by decompressing the compressed image data 614 by the decompression unit B.
  • The image data 626 is an additional image generated after the recognition processing time 628 for the decoded image data 624 in the additional image generation unit 22.
  • The input image 627 is decoded image data A obtained by decompressing the compressed image data 617 by the decompression unit A.
  • the output image 631 is image data after the image data 626 and the input image 627 are superimposed.
  • FIG. 8 is a diagram showing another example of an image input and output timing diagram in the present embodiment.
  • signals having the same numbers as those in FIG. 7 are the same as those in FIG.
  • This example is characterized in that the compressed image data B 614, compressed with a low delay, is transmitted in both the effective period and the blanking period; by this method, the compressed image data B 614 is made redundant.
  • In the additional image generation unit 22 of the image receiving device 2, when there is an abnormality in the decoded image data B 624, transmission error tolerance is improved by using the redundant decoded image data B 726 instead, as in the sketch below.
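  • One way to picture the redundancy described above: the compressed image data B is received twice (once from the effective period and once from the blanking period), and the receiver falls back to the second copy when the first one fails an integrity check. The CRC-based check below is an assumption made for this sketch; the actual error detection mechanism is not specified here.

```python
import zlib
from typing import Optional

def pick_valid_copy(primary: bytes, redundant: bytes,
                    expected_crc: int) -> Optional[bytes]:
    """Return whichever copy of compressed image data B matches the expected CRC."""
    if zlib.crc32(primary) == expected_crc:
        return primary
    if zlib.crc32(redundant) == expected_crc:
        return redundant   # fall back to the copy sent in the blanking period
    return None            # both copies corrupted; skip this frame

if __name__ == "__main__":
    good = b"compressed-frame-B"
    crc = zlib.crc32(good)
    print(pick_valid_copy(b"corrupted....", good, crc) == good)  # True
```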
  • FIG. 9 is a diagram showing yet another example of an image input and output timing chart in the present embodiment.
  • This example is characterized in that, in the image transmission apparatus 1, the transmission of the compressed image data B (see reference numeral 614 in FIG. 7) in the effective period is eliminated. According to this method, only the compressed image data A is transmitted as compressed image data during the effective period, so that the image quality of the decoded image data A can be further improved.
  • FIG. 10 is a diagram showing still another example of the image input and output timing chart in the present embodiment.
  • This is an example in which the compression unit A (see reference numeral 120 in FIG. 4) of the image transmission apparatus 1 is enabled to perform compression processing using frames up to two frames ahead. With this method, the image quality of the decoded image data A can be further improved.
  • The compression unit A may also be configured to perform compression processing using frames up to N frames ahead. Likewise, in FIGS. 7, 8, and 9, the compression processing may be performed using frames up to N frames ahead.
  • FIG. 11 shows an example of a timing chart for each frame of the input image data, the compressed image data A, the compressed image data B, the additional image, and the finally displayed display image in the present embodiment.
  • the APVF Header of the APVF Video PDU format defined by the AVTP Video Protocol in the P1722 standard is shown.
  • Frames 1402 and 1404 are compressed image data for the input image data 1401.
  • a person 1409 exists in the frame 1401.
  • The compressed image data B (frame 1402) is compressed with a small delay.
  • The compressed image data A (frame 1404) is compressed with less compression deterioration than the compressed image data B by using more time and a wider range of frame information. As a result, the compressed image data A achieves higher image quality than the compressed image data B.
  • The additional image 1406 is a frame-shaped figure drawn around the person in the frame, obtained by performing decompression processing and person recognition processing on the frame 1402.
  • the frame 1405 is an image displayed on the monitor.
  • Reference numeral 1407 denotes an image obtained by superimposing the decoded image (person) obtained by decompressing the compressed image data A (frame 1404) and the additional image 1406 (frame-shaped figure).
  • compressed image data A and compressed image data B are generated from the input image and output from the image transmission apparatus.
  • the compressed image data A and the compressed image data B are decompressed with a time difference of one frame or more.
  • The image receiving apparatus generates the additional image 1406 based on the compressed image data B, and displays it by superimposing it on the decoded image obtained by decompressing the compressed image data A.
  • The frcount values of the compressed image data B (frame 1402) and the compressed image data A (frame 1404), which are generated based on the same frame (frame 1401) of the input image, are set to the same value.
  • As a result, the image receiving apparatus can identify the compressed image data A and the compressed image data B that correspond to the same frame (frame 1401) of the input image.
  • the AVTP Video Protocol has been described as an example.
  • the present invention is not limited to this example.
  • For example, the frame count information may instead be stored in MPEG2-TS, or in the user data area of an H.264/AVC NAL unit.
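  • The pairing by frame count described above can be pictured with the following sketch, which buffers packets from the two streams and releases a pair as soon as both streams have delivered data for the same frcount value. The packet representation is an assumption made for the example; only the idea of a shared frame counter comes from the description.

```python
from collections import defaultdict

class FrameMatcher:
    """Pair compressed image data A and B that carry the same frcount value."""

    def __init__(self) -> None:
        self._buffer = defaultdict(dict)  # frcount -> {"A": payload, "B": payload}

    def push(self, stream_id: str, frcount: int, payload: bytes):
        """Store one packet; return (frcount, payload_A, payload_B) when complete."""
        self._buffer[frcount][stream_id] = payload
        entry = self._buffer[frcount]
        if "A" in entry and "B" in entry:
            del self._buffer[frcount]
            return frcount, entry["A"], entry["B"]
        return None

if __name__ == "__main__":
    m = FrameMatcher()
    print(m.push("B", 7, b"edges"))          # None: waiting for the matching A packet
    print(m.push("A", 7, b"full-picture"))   # (7, b'full-picture', b'edges')
```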
  • As described above, the image data transmitted by the image transmission apparatus is compressed and transmitted using two types of compression methods having different delay times, which makes it possible to achieve both high image quality and high recognition performance. Furthermore, by making the compressed image data that is compressed with low delay redundant, transmission error tolerance can be improved.
  • FIG. 12 shows a surveillance camera system
  • FIG. 13 shows a video conference system
  • FIG. 14 shows an example of an in-vehicle camera system.
  • In FIG. 12, reference numerals 1101, 1102, and 1103 denote monitoring cameras installed at points A, B, and C, respectively; 1104 denotes a monitoring center that receives images captured by the monitoring cameras 1101, 1102, and 1103; and 1105 denotes a wide area network (WAN) such as an Internet line. Images captured by the monitoring cameras 1101 to 1103 can be displayed on a monitor or the like in the monitoring center 1104 via the WAN 1105.
  • Although FIG. 12 shows an example with three monitoring cameras, the number of monitoring cameras may be two or fewer, or four or more.
  • The image transmission apparatus is mounted on, for example, the monitoring cameras 1101 to 1103. That is, the image transmission apparatus performs the compression processing described above on the input image input via the lenses of the monitoring cameras 1101 to 1103, and the compressed input image is output to the WAN 1105.
  • the image receiving apparatus of the present embodiment is mounted on the monitoring center 1104, for example.
  • The monitoring center 1104 includes a display monitor. That is, the image receiving apparatus performs the decompression processing, recognition processing, additional image generation processing, and processing for superimposing the decompressed input image and the additional image described above on the compressed input image, and displays the image after the superimposition processing on the display monitor.
  • the image displayed on the monitor may be an image after decompression processing or an additional image.
  • reference numerals 1201, 1202, and 1203 are video conference apparatuses installed at points A, B, and C, respectively, and 1204 is a WAN such as an Internet line.
  • images captured by the cameras of the video conference apparatuses 1201 to 1203 can be displayed on the monitors of the video conference apparatuses 1201 to 1203 via the WAN 1204.
  • Although FIG. 13 shows an example with three video conference apparatuses, the number of video conference apparatuses may be two, or four or more.
  • The image transmission apparatus is mounted on each camera of the video conference apparatuses 1201 to 1203, for example. That is, the image transmission apparatus performs the compression processing described above on the input image input via the camera lens of each of the video conference apparatuses 1201 to 1203, and the compressed input image is output to the WAN 1204.
  • The image receiving apparatus according to the present embodiment is also installed in each of the video conference apparatuses 1201, 1202, and 1203, for example. Each of the video conference apparatuses 1201, 1202, and 1203 may include a display monitor itself, or the monitor may be externally attached.
  • In this case, the image receiving apparatus performs the decompression processing, recognition processing, additional image generation processing, and processing for superimposing the decompressed input image and the additional image on the compressed input image, and displays the image after the superimposition processing on the display monitor.
  • the image displayed on the monitor may be an image after expansion processing or an additional image.
  • In FIG. 14, reference numeral 1301 denotes an automobile; 1302 and 1303 denote in-vehicle cameras mounted on the automobile 1301; and 1306 denotes an ECU (Electronic Control Unit) that decompresses the compressed image data produced by the in-vehicle cameras.
  • Reference numeral 1304 denotes a monitor that displays images captured by the in-vehicle cameras 1302 and 1303, video decompressed by the ECU 1306, an OSD (On Screen Display) reflecting the recognition processing results, and the like, and 1305 denotes a local area network (LAN) in the automobile 1301.
  • Images captured by the in-vehicle cameras 1302 and 1303 can be displayed on the monitor 1304 after being subjected to image processing such as expansion processing, recognition processing, and OSD addition by the ECU 1306 via the LAN 1305.
  • Although FIG. 14 shows an example in which two in-vehicle cameras are mounted, the number of in-vehicle cameras may be one, or three or more.
  • The image transmission apparatus of the present embodiment is mounted on, for example, the in-vehicle cameras 1302 and 1303. That is, the image transmission apparatus performs the compression processing described above on the input image input via the lenses of the in-vehicle cameras 1302 and 1303, and the compressed input image is output to the LAN 1305.
  • the image receiving apparatus of the present embodiment is mounted on the ECU 1306, for example. Note that the image receiving apparatus may have both functions of the ECU 1306 and the monitor 1304.
  • The image receiving apparatus performs the decompression processing, recognition processing, additional image generation processing, and processing for superimposing the decompressed input image and the additional image described above on the compressed input image, and outputs the processed image to the monitor 1304.
  • The present invention is not limited to the above-described embodiments, and various modifications are included.
  • The above-described embodiments describe the entire system in detail in order to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to embodiments having all of the described configurations.
  • a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
  • Information such as programs, tables, and files that realize each function can be stored in a memory, a hard disk, a recording device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • The control lines and information lines shown are those considered necessary for the explanation, and not all the control lines and information lines of an actual product are necessarily shown. In practice, almost all of the components may be considered to be connected to each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention concerns an image transmission device and an image reception device that can be used in a system that compresses and transmits image data, and that achieve both high image quality and high recognition performance. The image transmission device comprises: an image input unit into which image data is input; a compression processing unit that generates compressed image data from the image input to the image input unit; and a data transfer unit that outputs the compressed image data generated by the compression processing unit to a transmission path. The compression processing unit generates a plurality of pieces of compressed image data that differ in compression delay time. The data transfer unit multiplexes and outputs the plurality of pieces of compressed image data, generated by the compression processing unit, that differ in compression delay time. The image reception device comprises: a data receiving unit that receives the multiplexed plurality of pieces of compressed image data that differ in compression delay time and extracts each piece of compressed image data; a decompression processing unit that decompresses each of the plurality of pieces of compressed image data that differ in compression delay time and are received by the data receiving unit, and generates a plurality of pieces of image data; an additional image generation unit that generates additional image data; and an image superimposing unit that superimposes a plurality of pieces of image data. The image superimposing unit superimposes the additional image data from the decompression processing unit, or other image data generated by the decompression processing unit, onto a part of the image data generated by the decompression processing unit, and outputs the result.
PCT/JP2014/052925 2014-02-07 2014-02-07 Image transmission device, image reception device, and surveillance camera system, video conference system, and vehicle-mounted camera system using same Ceased WO2015118664A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/052925 WO2015118664A1 (fr) 2014-02-07 2014-02-07 Image transmission device, image reception device, and surveillance camera system, video conference system, and vehicle-mounted camera system using same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/052925 WO2015118664A1 (fr) 2014-02-07 2014-02-07 Image transmission device, image reception device, and surveillance camera system, video conference system, and vehicle-mounted camera system using same

Publications (1)

Publication Number Publication Date
WO2015118664A1 true WO2015118664A1 (fr) 2015-08-13

Family

ID=53777494

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/052925 Ceased WO2015118664A1 (fr) 2014-02-07 2014-02-07 Image transmission device, image reception device, and surveillance camera system, video conference system, and vehicle-mounted camera system using same

Country Status (1)

Country Link
WO (1) WO2015118664A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018107655A (ja) * 2016-12-27 2018-07-05 株式会社Nexpoint Surveillance camera system
CN110050462A (zh) * 2016-12-22 2019-07-23 康奈可关精株式会社 Image display control device
WO2023136211A1 (fr) * 2022-01-14 2023-07-20 マクセル株式会社 Imaging lens system, camera module, vehicle-mounted system, and moving object
US20240121519A1 (en) * 2021-02-09 2024-04-11 Sony Group Corporation Information processing device, information processing method, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007336260A (ja) * 2006-06-15 2007-12-27 Matsushita Electric Ind Co Ltd Video monitoring apparatus
JP2008181196A (ja) * 2007-01-23 2008-08-07 Nikon Corp Image processing device, electronic camera, and image processing program
JP2010288186A (ja) * 2009-06-15 2010-12-24 Olympus Corp Image transmission device and image reception device
JP2013535921A (ja) * 2010-08-06 2013-09-12 トムソン ライセンシング Apparatus and method for receiving a signal
JP2013186834A (ja) * 2012-03-09 2013-09-19 Canon Inc Image processing apparatus, control method for image processing apparatus, and program

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110050462A (zh) * 2016-12-22 2019-07-23 康奈可关精株式会社 Image display control device
JP2018107655A (ja) * 2016-12-27 2018-07-05 株式会社Nexpoint Surveillance camera system
WO2018123078A1 (fr) * 2016-12-27 2018-07-05 株式会社Nexpoint Surveillance camera system
US20240121519A1 (en) * 2021-02-09 2024-04-11 Sony Group Corporation Information processing device, information processing method, and program
WO2023136211A1 (fr) * 2022-01-14 2023-07-20 マクセル株式会社 Imaging lens system, camera module, vehicle-mounted system, and moving object


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14881722

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14881722

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP