EP3528169B1 - Image transmission apparatus, camera system, and image transmission method - Google Patents
- Publication number
- EP3528169B1 (application EP19156677.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- camera
- passenger
- information
- imaged
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D11/00—Passenger or crew accommodation; Flight-deck installations not otherwise provided for
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/593—Recognising seat occupancy
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
Definitions
- the present disclosure relates to an image transmission apparatus in a moving body, a camera system, and an image transmission method.
- US 2014/195609 A1 relates to an imaging method, wherein an image including a plurality of people is captured and transmitted to devices belonging to people recognized within the image. The people may then approve or disapprove the image and, based thereon, portions corresponding to the people are removed from the image, which is subsequently published.
- EP 2 924 663 A1 relates to a camera system for counting passengers of a vehicle.
- the present disclosure provides an image transmission apparatus, a camera system, and an image transmission method useful for safely checking on the state of a passenger from outside a moving body.
- the shape, functions, and the like of the “camera” shall not be construed as being limited, and the term “camera” shall encompass dome, box, movable (pan and tilt), fixed, analog, digital, omnidirectional (360°), wired, wireless, and other types of cameras.
- the terms "image” and “image signal” shall be construed as encompassing videos and still images.
- processing to remove an image region shall be construed as encompassing masking the image region.
- masking shall be construed as encompassing modifying the pixel values of the image region to predetermined values so that they are all a uniform color, or subjecting the image region to mosaic or blurring processing.
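As a concrete illustration of uniform-color masking, the following minimal sketch overwrites every pixel of a bounding box with a single color. The nested-list frame representation, the region coordinates, and the function name are all illustrative assumptions, not part of the patent:

```python
def mask_region(frame, region, color=(0, 0, 0)):
    """Overwrite every pixel inside `region` with a uniform color.

    frame  : list of rows, each a list of (r, g, b) tuples
    region : (x0, y0, x1, y1) bounding box, end-exclusive
    """
    x0, y0, x1, y1 = region
    for y in range(y0, y1):
        for x in range(x0, x1):
            frame[y][x] = color
    return frame

# A 4x4 dummy frame of white pixels; mask the top-left 2x2 block.
frame = [[(255, 255, 255)] * 4 for _ in range(4)]
mask_region(frame, (0, 0, 2, 2))
```

Mosaic or blurring processing would replace the uniform-color write with a downsample-and-upsample or an averaging filter over the same region.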
- a camera system 11 is installed in an aircraft 1.
- the camera system 11 is communicably connected to a ground monitoring system 15 via an aircraft wireless device 12 installed in the aircraft 1, a satellite wireless device 13, and a ground wireless device 14.
- the camera system 11 includes a camera 111 and a server 113.
- the camera system 11 captures images of the interior of the aircraft 1, and outputs the captured images out of the aircraft via the aircraft wireless device 12.
- the aircraft wireless device 12 is installed in the aircraft 1 and controls an antenna (not illustrated in the drawings) that enables communication with the satellite wireless device 13, and controls wireless signals for transmitting and receiving. Note that the aircraft wireless device 12 may bypass the satellite wireless device 13 and communicate directly with the ground wireless device 14, such as in air-to-ground communication.
- the satellite wireless device 13 is a satellite that communicates with the aircraft wireless device 12 and the ground wireless device 14.
- the ground wireless device 14 is capable of transmitting and receiving various signals to and from the satellite wireless device 13, and is connected to the ground monitoring system 15.
- the ground monitoring system 15 includes a server owned by an airline company and devices owned by passengers and family members of the passengers that use the airline company.
- a passenger and/or family member of the passenger sends a confirmation request for an in-flight image for a specific aircraft (reserved aircraft or aircraft that the passenger is riding on) to the server from a device such as a smartphone or tablet.
- the server receives the image transmission request from the device and transmits an image transmission request signal to the camera system 11 via each of the ground wireless device 14, the satellite wireless device 13, and the aircraft wireless device 12.
- the camera system 11 in response to the request from the ground monitoring system 15 (the image transmission request signal), transmits images, audio, and the like of the interior of the aircraft from the aircraft wireless device 12 to the ground monitoring system 15 via the satellite wireless device 13 and the ground wireless device 14.
- the ground monitoring system 15 can be simultaneously connected to the wireless devices of a plurality of aircraft and, in the present disclosure, the operations and processing of the ground monitoring system 15 can be simultaneously executed for the camera systems of a plurality of aircraft.
- FIG. 2 illustrates the relationships between seats and imaging ranges of cameras installed in the aircraft 1.
- the aircraft advancing direction D1 of the aircraft 1 is depicted as pointing toward the left of the page.
- seats 211, 212, 213, 214, 221, 222, 223, and 224 are arranged from front to back in the aircraft advancing direction D1.
- Cameras 111 and 112 are installed in the ceiling of an aisle area A23.
- the seats 211 to 214 are included in an imaging range A21, which is the imaging range of the camera 111
- the seats 221 to 224 are included in an imaging range A22, which is the imaging range of the camera 112.
- in order to transmit the image of a passenger seated in a specific seat, the camera whose imaging range covers that seat must be selected. For example, in order to transmit the image of the passenger seated in seat 211, the camera 111, whose imaging range covers the seat 211, is selected.
- FIG. 3 illustrates the configuration of the camera system 11.
- the camera system 11 includes a camera 111, a camera 112, and a server 113 that connects to the cameras 111 and 112. Note that an example of the camera system 11 is described that includes two cameras (the camera 111 and the camera 112). However, configurations are possible in which one camera or three or more cameras are provided.
- the camera 111 includes an imager 1111 and an image outputter 1112.
- the camera 112 includes an imager 1121 and an image outputter 1122.
- the imagers 1111 and 1121 each include a lens and an image sensor.
- the lens collects light that enters from outside the camera 111 and forms an image on the imaging surface of the image sensor. Examples of the lens include fisheye lenses and wide-angle lenses.
- the image sensor is, for example, an imaging device of a complementary metal oxide semiconductor (CMOS) or a charged-coupled device (CCD). The image sensor converts the optical image formed on the imaging surface to an electrical signal.
- each of the image outputters 1112 and 1122 includes a central processing unit (CPU), a micro processing unit (MPU), or a digital signal processor (DSP).
- the image outputters perform predetermined signal processing on the electrical signals from the imagers to generate human-recognizable data (frames) of the captured images, and output the generated data as image signals.
- the imager 1111 captures the image of the imaging range A21 illustrated in FIG. 2 , and transmits an image signal to the image outputter 1112.
- the imager 1121 captures the image of the imaging range A22, and transmits an image signal to the image outputter 1122.
- the image outputter 1112 outputs the image signal, sent from the imager 1111, to the server 113.
- the image outputter 1122 outputs the image signal, sent from the imager 1121, to the server 113.
- the server 113 includes a selector 1131, a receiver 1130, an image processor 1132, a transmitter 1133, and a storage 1135.
- the selector 1131 and the image processor 1132 are constituted by a processor 1139 that includes a central processing unit (CPU), a micro processing unit (MPU), a digital signal processor (DSP), or the like.
- the processor 1139 realizes the functions of the selector 1131 and the image processor 1132 by executing a program that is stored in the memory.
- the receiver 1130 receives an image transmission request signal Re1 that is sent from the ground monitoring system 15 via the ground wireless device 14, the satellite wireless device 13, and the aircraft wireless device 12.
- the image transmission request signal Re1 includes imaging subject information that identifies the passenger to be imaged.
- the imaging subject information includes seat information that identifies the seat of the passenger to be imaged, and identification information of the passenger to be imaged (hereinafter referred to as "passenger identifying information").
- the imaging subject information may be input in real-time (during travel) from the ground monitoring system 15, or may be received prior to boarding and registered in the storage 1135 or the like.
- the seat information is the seat number.
- the passenger identifying information is information that identifies the passenger and, for example, is a ticket number, a reservation number, a member registration number for the airline company, a passport number, a password, or the like. From the standpoint of security and to limit the in-flight images that can be checked to those captured on the aircraft 1, when transmitting the image transmission request signal Re1 from the ground monitoring system 15 to the server 113, authentication information known only to the passenger may be simultaneously sent with the identifying information, and authentication processing (described later) may be performed.
- the selector 1131 receives the image signals output from the image outputter 1112 of the camera 111 and the image signals output from the image outputter 1122 of the camera 112.
- the selector 1131 acquires the image transmission request signal Re1 from the receiver 1130. As described later, the selector 1131 selects the image captured by one camera among the plurality of cameras 111 and 112.
- the image processor 1132 executes processing to remove other image regions aside from the image region that covers the passenger to be imaged from the image of the camera 111 or 112 that was selected by the selector 1131.
- the transmitter 1133 outputs, to the aircraft wireless device 12, an image signal Im1 of the image processed by the image processor 1132, and transmits the image signal Im1 to the ground monitoring system 15 via the satellite wireless device 13 and the ground wireless device 14.
- the storage 1135 is configured from a semiconductor member, a magnetic disk, or the like.
- the storage 1135 stores tables T1 and T2 illustrated in FIGS. 5 and 6 .
- the passenger identifying information (ticket number, reservation number, a member registration number for the airline company, password, or the like) and the seat information (seat number or the like) are associated and stored in the table T1.
- the seat information, camera information, and image region information are associated and stored in the table T2.
- the selector 1131 uses these pieces of information for the authentication processing (described later) and for the selection of the camera image.
- the camera information is information that identifies the camera that has the imaging range that covers a seat.
- each camera has an imaging range that covers a plurality of seats; therefore, a plurality of pieces of seat information are associated with each camera and stored.
- the image region information is information that specifies the image region that covers the position of the seat (or passenger) to be imaged. For example, when, as illustrated in FIG. 2 , the imaging range of the camera 111 includes seats 211 to 214 and the requested imaging subject is the seat 211 (passenger P1), image region 41a is associated with the seat 211 (passenger P1) and stored, as illustrated in FIG. 7 .
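The lookups over tables T1 and T2 can be pictured as chained dictionary accesses: passenger identifying information resolves to a seat, and the seat resolves to a camera and an image region. The entries below are invented placeholders for illustration, not values from the patent:

```python
# Table T1: passenger identifying information -> seat information.
T1 = {"TICKET-0001": "211", "TICKET-0002": "212"}

# Table T2: seat information -> (camera information, image region bounding box).
T2 = {
    "211": ("camera111", (0, 0, 320, 240)),
    "212": ("camera111", (320, 0, 640, 240)),
    "221": ("camera112", (0, 0, 320, 240)),
}

def resolve(passenger_id):
    """Return the seat, camera, and image region covering a passenger."""
    seat = T1[passenger_id]      # identify the seat (table T1)
    camera, region = T2[seat]    # identify camera and region (table T2)
    return seat, camera, region

print(resolve("TICKET-0001"))  # ('211', 'camera111', (0, 0, 320, 240))
```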
- the aircraft wireless device 12 that is connected to the camera system 11 includes a signal converter 121.
- the signal converter 121 converts the image signal Im1 to a wireless signal and outputs the wireless signal.
- the outputted wireless signal is sent to the ground monitoring system 15 via the satellite wireless device 13 and the ground wireless device 14.
- the server 113 executes the following processing.
- Authentication processing is executed by the selector 1131 of the server 113 using the imaging subject information included in the image transmission request signal Re1.
- the selector 1131 carries out the authentication processing by comparing the seat information and the passenger identifying information included in the imaging subject information with the seat information and the passenger identifying information of the table T1 of FIG. 5 that is stored in the storage 1135 (S102).
- when the identifying information matches (S103; YES), step S104 is executed.
- in a case where the imaging subject information included in the image transmission request signal Re1 includes only the passenger identifying information, the imaging subject information is compared with the passenger identifying information of the table T1 of FIG. 5 and, thereafter, the seat of the passenger is identified based on the correspondence relationship with the seat information of table T1.
- the selector 1131 identifies the seat of the passenger to be imaged based on the imaging subject information (S104).
- the selector 1131 references the table T2 stored in the storage 1135 to identify the camera that corresponds to the identified seat (S105), and selects the image captured by that camera (S106).
- the selected image is processed by the image processor 1132 (S107). Specifically, the image processor 1132 references the table T2 ( FIG. 6 ) stored in the storage 1135 to acquire the image region information (for example, coordinate values or the like) that corresponds to the seat information of the passenger to be imaged. The image processor 1132 performs processing to remove image regions other than the image region that corresponds to the image region information.
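The removal step can be read as the complement of masking: the region from table T2 is kept, and everything else is blanked out. A minimal sketch under the same illustrative assumptions as before (nested-list frames, invented coordinates and fill color):

```python
def keep_only_region(frame, region, fill=(128, 128, 128)):
    """Blank out every pixel outside `region` (end-exclusive bounding box)."""
    x0, y0, x1, y1 = region
    for y, row in enumerate(frame):
        for x in range(len(row)):
            if not (x0 <= x < x1 and y0 <= y < y1):
                row[x] = fill  # privacy mask over the other passengers
    return frame

# Keep only the top-left 2x2 region of a 3x4 dummy frame.
frame = [[(255, 255, 255)] * 4 for _ in range(3)]
keep_only_region(frame, (0, 0, 2, 2))
```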
- FIG. 7 illustrates an example of the processing carried out by the image processor 1132.
- the seat specified by the imaging subject information is the seat 211, and the image that includes the passenger P1 seated in the seat 211 is processed.
- the seat 211 is covered in the imaging range A21 of the camera 111.
- the unprocessed image 41 is an image signal that is input to the image processor 1132 from the image outputter 1112 of the camera 111.
- the image includes, in addition to the passenger P1 seated in the seat 211, the other passengers seated in the seats 212 to 214 that are covered in the imaging range A21 ( FIG. 2 ) of the camera 111.
- the processed image 42 is an image signal that is output from the image processor 1132.
- the image processor 1132 performs processing so that the image regions that cover passengers other than the passenger P1 seated in the seat 211 are covered by a privacy mask 421. This processing is performed based on the information of table T2 of FIG. 6 .
- the shape of the privacy mask 421 illustrated in FIG. 7 is an example, and a configuration is possible in which masks are used that cover only the faces of the other passengers.
- the image processor 1132 may be configured to remove, by facial recognition, the image regions other than the image region that covers the passenger to be imaged. For example, a configuration is possible in which the image processor 1132 acquires feature information of the face of the passenger to be imaged, and performs processing to remove face regions that do not correspond with that feature information.
- the processed image is sent, as the image signal Im1, by the transmitter 1133 to the ground monitoring system 15 (S108). If there is an end operation (stop request from the ground monitoring system 15 or the like), the processing is ended.
- when the identifying information does not match in step S103 (S103; NO), the processor 1139 of the server 113 sends a notification, indicating that an image will not be sent, from the transmitter 1133 to the ground monitoring system 15 (S110).
- the server 113 as the image transmission apparatus is an apparatus that connects to the plurality of cameras 111 and 112 that image the interior of a moving body, namely the aircraft 1.
- the server 113 includes the receiver 1130, the processor 1139, and the transmitter 1133.
- the receiver 1130 receives the image transmission request signal Re1 and the imaging subject information that identifies the passenger to be imaged from the ground monitoring system 15, which is an external device of the aircraft 1.
- the processor 1139 selects, based on the imaging subject information, an image captured by at least one camera of the plurality of cameras 111 and 112, and executes processing to remove, from the selected image, the other image regions aside from the image region that covers the passenger to be imaged.
- the transmitter 1133 transmits the processed image to the ground monitoring system 15.
- the identifying information and the seat information of the passenger and the characteristics of the images captured by the cameras installed in the aircraft 1 are used. As such, it is possible to safely check on the state of a specific passenger from outside the aircraft while maintaining the privacy of the other passengers in the aircraft.
- images, audio, and the like of what occurred on-site at the time of the incident or accident can be recorded in the ground monitoring system 15, and the recorded content can be reviewed after the incident or accident in order to investigate the cause of the incident or accident.
- the images, audio, and the like in the aircraft 1 can be analyzed in real-time and used in a variety of applications.
1-5 Modification Examples
- the server 113 executes steps S201 to S203 in the same manner as steps S101 to S103 of the processing of FIG. 4 .
- the imaging subject information received together with the image transmission request signal Re1 includes feature information that enables facial recognition of the passenger to be imaged. Note that this feature information is not limited to being received together with the image transmission request signal Re1, and may be acquired in advance and stored in the storage 1135 or the like.
- the selector 1131 selects, based on the seat information included in the imaging subject information or the seat information stored in the storage 1135, the image captured by the corresponding seat camera (S204).
- the processor 1139 uses the passenger identifying information to perform facial recognition of the selected image (S205). When the facial recognition is successful, the image captured by that seat camera is selected (S206).
- the processor 1139 of the server 113 acquires the images captured by the plurality of in-aircraft cameras (S207), and uses the passenger identifying information to perform facial recognition on the images captured by the various in-aircraft cameras that were acquired (S208).
- when the facial recognition is successful for one of the acquired images, the image captured by the in-aircraft camera that output that image is selected (S209).
- the selected image is processed by the image processor 1132 in the same manner as described for steps S107 to S108 of FIG. 4 (S210).
- the image captured by the in-aircraft camera that was selected in step S209 is subjected to processing to remove the image regions other than the image region that covers the passenger for which facial recognition was successful.
- the processed image is sent, as the image signal Im1, to the ground monitoring system 15 via the transmitter 1133 (S211).
- when the facial recognition is unsuccessful, the processor 1139 sends a notification, indicating that an image will not be sent, to the ground monitoring system 15 via the transmitter 1133.
- the processor 1139 of the server 113 may determine the seat information of the passenger for which facial recognition was successful from the selected image of the in-aircraft camera, and correct the seat information held by the server 113.
- a configuration is possible in which the facial recognition processing is repeated at a predetermined time or a predetermined number of times when the passenger is not present in the seat in steps S205 or S208 (when recognition of a human face is not possible).
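The fallback search of steps S207 to S209 amounts to iterating over the images of every in-aircraft camera until one matches the passenger's facial features. The sketch below uses a stand-in `matches` predicate in place of real facial recognition; the camera data and all names are hypothetical:

```python
def select_camera_by_face(cameras, target_features, matches):
    """Return the first camera whose current image contains the target face.

    cameras         : mapping of camera id -> captured image (any object)
    target_features : feature information of the passenger to be imaged
    matches         : predicate (image, features) -> bool, standing in for
                      the facial recognition step
    """
    for cam_id, image in cameras.items():
        if matches(image, target_features):  # S208: facial recognition
            return cam_id                    # S209: select this camera
    return None                              # no match: no image is sent

# Toy data: each "image" is just the set of faces it contains.
cameras = {"camera111": {"P1", "P2"}, "camera112": {"P3"}}
found = select_camera_by_face(cameras, "P3", lambda img, f: f in img)
print(found)  # camera112
```

Repeating the recognition at a predetermined time or number of times, as described above, would simply wrap this loop in a retry.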
- FIG. 9 illustrates the arrangement of a security camera system 51 in the aircraft. Note that the number and installation locations of the cameras are examples.
- FIG. 10 illustrates an imaging range of a camera to be installed in the aircraft.
- the interior of the aircraft is roughly divided into seat areas A62 and an aisle area A63.
- a camera 511 is installed in the ceiling of the aisle area A63.
- the imaging range of the camera 511 is as indicated by an imaging range A61.
- FIG. 11 is a block diagram illustrating the configurations of a security camera system 51 and an aircraft system 52.
- the security camera system 51 includes the camera 511 and a server 513.
- in this case, an example is described in which one camera is provided, but a configuration is possible in which two or more cameras are provided.
- the camera 511 includes an imager 5111 and an image outputter 5112.
- the operations of the imager 5111 and the image outputter 5112 are the same as those of the imager and the image outputter of Embodiment 1 and, as such, redundant descriptions thereof are avoided.
- the server 513 includes an image analyzer 5131 and an outputter 5133.
- the image analyzer 5131 receives image signals that are output from the image outputter 5112 of the camera 511. Additionally, the image analyzer 5131 receives sit-down request information Re2 that is issued from the aircraft system 52 when taking off, landing, or when sudden turbulence is expected.
- upon receipt of the sit-down request information Re2, the image analyzer 5131 analyzes the image signals and determines whether there are passengers that are not seated. Then, when there is a passenger that is not seated, the image analyzer 5131 calculates the positional information of the passenger based on an identification number, installation location, and the like of the camera that captured the image of the passenger, and issues a notification, as a not-seated alert Al1, to the aircraft system 52.
- FIG. 12 illustrates an example of an image 81 output from the image outputter 5112 of the camera 511.
- the image analyzer 5131 acquires the image 81, determines the seat areas A62 and the aisle area A63, and analyzes whether a passenger is present in the aisle area A63.
- when a passenger is present in the aisle area A63, the not-seated alert Al1 is issued via the outputter 5133 to the aircraft system 52, and the crew members are notified.
- the image analyzer 5131 can determine if the person present in the aisle area is a passenger or a crew member based on clothing, hair style, movement, or the like. Thus, the generation of not-seated alerts Al1 about crew members can be suppressed.
- the image analyzer 5131 executes an analysis of the image signal not only upon receipt of the sit-down request information Re2, but also at a predetermined cycle after the sit-down request information Re2 is received.
- the content of the not-seated alert Al1 can be updated and notified to the crew members as a time series. For example, it is possible to exclude, from the not-seated alerts Al1, passengers who were in the aisle area immediately after the fasten seatbelt sign was turned on (immediately after receipt of the sit-down request information Re2) but sat down right away. Additionally, it is possible to identify aisle areas in which passengers are present for an extended amount of time after the fasten seatbelt sign has been turned on, and the not-seated alerts Al1 can be updated so as to direct the crew members to those aisle areas with priority.
- the image analyzer 5131 may count, in the image signal, the number of people present in an aisle area, and change the priority of the positional information included in the not-seated alert Al1 depending on the number of people. For example, assume that five passengers are present in an aisle area captured by a first camera and one passenger is present in an aisle area captured by a second camera. In this case, the image analyzer 5131 can assign higher priority to the positional information in the aircraft calculated based on the installation location of the first camera than to the positional information in the aircraft calculated based on the installation location of the second camera, include this priority in the not-seated alert Al1, and notify the crew members. As a result of this configuration, the crew members can efficiently start guiding passengers to their seats, beginning with the locations where there are more passengers in the aisle areas.
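The prioritization described above can be sketched as sorting camera locations by the number of standing passengers detected, highest count first. The counts, camera identifiers, and positional strings below are invented for illustration:

```python
def build_not_seated_alert(counts, locations):
    """Order aisle locations by detected passenger count, highest first.

    counts    : camera id -> number of passengers seen in its aisle area
    locations : camera id -> positional information in the aircraft
    """
    ranked = sorted(counts, key=counts.get, reverse=True)
    # Cameras whose aisle area is empty generate no alert entry.
    return [(locations[cam], counts[cam]) for cam in ranked if counts[cam] > 0]

counts = {"cam1": 5, "cam2": 1, "cam3": 0}
locations = {"cam1": "rows 10-14", "cam2": "rows 20-24", "cam3": "rows 30-34"}
alert = build_not_seated_alert(counts, locations)
print(alert)  # [('rows 10-14', 5), ('rows 20-24', 1)]
```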
- in the embodiments described above, the moving body is an aircraft, but the technology according to the present disclosure is not limited thereto, and may be installed in a train, a bus, a marine vessel, or other vehicle.
- the processor 1139 of the server 113 may be implemented by a processor that is constituted by a dedicated electronic circuit that is designed to realize a predetermined function. Examples of such a processor include an FPGA and an ASIC.
- the program for performing the processing of each functional block of the server 113 may be stored in a storage device such as a hard disk or ROM. In this case, the program is read out to RAM and executed.
- each functional block of the server 113 may be realized by hardware, or may be realized by software (including cases when realized by an operating system (OS) or middleware, or with a predetermined library). Furthermore, the processing of each functional block of the server 113 may be realized by mixed processing by software and hardware.
- Programs and methods that cause a computer to execute the processing of the various functional blocks of the server 113, and computer-readable recording media on which those programs are stored are within the scope of the present disclosure.
- Examples of computer-readable recording media include flexible disks, hard disks, CD-ROMs, MOs, DVDs, DVD-ROMs, DVD-RAMs, BDs (Blu-ray Disc), and semiconductor memory.
- the computer program is not limited to being stored on the recording media described above, and may be acquired over an electric telecommunication line, a wireless or wired communication line, a network such as the Internet, or the like.
- Each step described in the flowcharts may be executed by a single device, or may be shared and executed by a plurality of devices. Furthermore, when a single step includes a plurality of processes, the plurality of processes of that single step may be executed by a single device, or may be shared and executed by a plurality of devices.
- system refers to collections of pluralities of constituents (devices, modules (parts), and the like), and should not be construed as suggesting that all of the constituents are included in the same housing. Accordingly, the term “system” encompasses a plurality of devices that are stored in separate housings and are connected across a network, and a single device including a single housing and a plurality of modules stored in that housing.
Description
- The present disclosure relates to an image transmission apparatus in a moving body, a camera system, and an image transmission method.
- There are occasions when a person that requires care such as a child or an elderly person rides a moving body such as an aircraft or a train unaccompanied. In such a case, there is a demand by the family or guardian of the passenger to check on, by image, the state of the passenger from outside the moving body.
- While it is possible to use an image captured by a camera disposed in the moving body, doing so may violate the privacy of other passengers. As such, from a practical standpoint, it has been difficult for the family or guardian of a passenger to check on the passenger from outside a moving body.
- US 2014/195609 A1 relates to an imaging method, wherein an image including a plurality of people is captured and transmitted to devices belonging to people recognized within the image. The people may then approve or disapprove the image and, based thereon, portions corresponding to the people are removed from the image, which is subsequently published.
- EP 2 924 663 A1 relates to a camera system for counting passengers of a vehicle.
- WO 2015/155379 A1 relates to a monitoring system installed on a vehicle for determining whether or not seats of the vehicle are occupied.
- The present disclosure provides an image transmission apparatus, a camera system, and an image transmission method useful for safely checking on the state of a passenger from outside a moving body.
- The invention is defined by the independent claims. The dependent claims describe advantageous embodiments.
-
-
FIG. 1 schematically illustrates the configuration of a communication system that includes a camera system according to Embodiment 1;
- FIG. 2 illustrates relationships, in an aircraft, between seats and imaging ranges of cameras in Embodiment 1;
- FIG. 3 illustrates the configuration of a camera system according to Embodiment 1;
- FIG. 4 is a flowchart illustrating the behavior of a server of the camera system according to Embodiment 1;
- FIG. 5 illustrates an example of an information table for image selection by the server according to Embodiment 1;
- FIG. 6 illustrates an example of an information table for image selection by the server according to Embodiment 1;
- FIG. 7 illustrates an example of image processing by the server according to Embodiment 1;
- FIG. 8 is a flowchart illustrating the behavior of the server of the camera system according to a modification example of Embodiment 1;
- FIG. 9 is a drawing illustrating the arrangement, in an aircraft, of a camera system according to Embodiment 2;
- FIG. 10 illustrates relationships, in the aircraft, between areas and the imaging range of a camera in Embodiment 2;
- FIG. 11 illustrates the configuration of the camera system according to Embodiment 2; and
- FIG. 12 illustrates an example of an image captured by the camera in Embodiment 2.
- Next, embodiments of the present disclosure will be described with reference to the drawings.
- In the following, an example of a case in which the moving body is a commercial aircraft will be described.
- Unless otherwise stipulated in each embodiment, as used in the following description, the shape, functions, and the like of the "camera" shall not be construed as being limited, and the term "camera" shall encompass dome, box, movable (pan and tilt), fixed, analog, digital, omnidirectional (360°), wired, wireless, and other types of cameras. The terms "image" and "image signal" shall be construed as encompassing videos and still images.
- The phrase "processing to remove an image region" shall be construed as encompassing masking the image region. The term "masking" shall be construed as encompassing modifying the pixel values of the image region to predetermined values so that they are all a uniform color, or subjecting the image region to mosaic or blurring processing.
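As an illustrative aside (not part of the original disclosure), the uniform-color style of masking described above can be sketched as follows; the function name, pixel representation, and fill value are assumptions made for this example:

```python
def mask_region(image, region, color=128):
    """Mask the rectangular region (x, y, w, h) of an image with a
    uniform color, one of the masking styles described above.  The
    image is a list of rows, each row a list of pixel values; mosaic
    or blurring processing would replace the fill step below."""
    x, y, w, h = region
    out = [row[:] for row in image]  # leave the input image untouched
    for ry in range(y, y + h):
        for rx in range(x, x + w):
            out[ry][rx] = color
    return out
```

Mosaic processing would instead average the pixel values over small tiles inside the region, and blurring would apply a smoothing filter; both follow the same region-bounded loop.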
- In the following description, an example is given of a system that basically combines cameras and a separate device (a server or the like), but the present disclosure is not limited thereto and embodiments are possible in which cameras are implemented alone.
- As illustrated in
FIG. 1, a camera system 11 is installed in an aircraft 1. The camera system 11 is communicably connected to a ground monitoring system 15 via an aircraft wireless device 12 installed in the aircraft 1, a satellite wireless device 13, and a ground wireless device 14.
- As described later, the camera system 11 includes a camera 111 and a server 113. The camera system 11 captures images of the interior of the aircraft 1, and outputs the captured images out of the aircraft via the aircraft wireless device 12.
- The aircraft wireless device 12 is installed in the aircraft 1, controls an antenna (not illustrated in the drawings) that enables communication with the satellite wireless device 13, and controls wireless signals for transmission and reception. Note that the aircraft wireless device 12 may bypass the satellite wireless device 13 and communicate directly with the ground wireless device 14, such as in air-to-ground communication. The satellite wireless device 13 is a satellite that communicates with the aircraft wireless device 12 and the ground wireless device 14. The ground wireless device 14 is capable of transmitting and receiving various signals to and from the satellite wireless device 13, and is connected to the ground monitoring system 15.
- In one example, the ground monitoring system 15 includes a server owned by an airline company and devices owned by passengers of the airline company and their family members. A passenger and/or a family member of the passenger sends a confirmation request for an in-flight image of a specific aircraft (a reserved aircraft or the aircraft that the passenger is riding on) to the server from a device such as a smartphone or tablet. The server receives the image transmission request from the device and transmits an image transmission request signal to the camera system 11 via each of the ground wireless device 14, the satellite wireless device 13, and the aircraft wireless device 12.
- In an overall configuration such as that described above, in response to the request from the ground monitoring system 15 (the image transmission request signal), the camera system 11 transmits images, audio, and the like of the interior of the aircraft from the aircraft wireless device 12 to the ground monitoring system 15 via the satellite wireless device 13 and the ground wireless device 14.
- Note that the ground monitoring system 15 can be simultaneously connected to the wireless devices of a plurality of aircraft and, in the present disclosure, the operations and processing of the ground monitoring system 15 can be simultaneously executed for the camera systems of a plurality of aircraft.
- FIG. 2 illustrates the relationships between seats and the imaging ranges of the cameras installed in the aircraft 1. In FIG. 2, the advancing direction D1 of the aircraft 1 is depicted as pointing toward the left of the page. Seats 211 to 214 and seats 221 to 224, as well as cameras 111 and 112, are installed in the aircraft 1. The seats 211 to 214 are included in an imaging range A21, which is the imaging range of the camera 111, and the seats 221 to 224 are included in an imaging range A22, which is the imaging range of the camera 112.
- Note that the number of seats, the positions of the cameras, and the imaging ranges illustrated in the drawings are merely examples and the present disclosure is not limited thereto.
- As illustrated in
FIG. 2, in order to transmit the image of a passenger seated in a specific seat, the camera that has the imaging range covering that seat must be selected. For example, in order to transmit the image of the passenger seated in the seat 211, the camera 111, which has the imaging range that covers the seat 211, is selected. -
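The seat-to-camera selection just described can be sketched as follows; the table contents mirror FIG. 2, while the function and camera names are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical coverage table mirroring FIG. 2: the camera 111 covers
# seats 211-214 (imaging range A21) and the camera 112 covers seats
# 221-224 (imaging range A22).
COVERAGE = {
    "camera111": {211, 212, 213, 214},
    "camera112": {221, 222, 223, 224},
}

def select_camera(seat, coverage=COVERAGE):
    """Return the camera whose imaging range covers the given seat,
    or None when no installed camera covers it."""
    for camera, seats in coverage.items():
        if seat in seats:
            return camera
    return None
```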
FIG. 3 illustrates the configuration of the camera system 11. The camera system 11 includes a camera 111, a camera 112, and a server 113 that connects to the cameras 111 and 112. In this case, an example of the camera system 11 is described that includes two cameras (the camera 111 and the camera 112). However, configurations are possible in which one camera or three or more cameras are provided.
- The camera 111 includes an imager 1111 and an image outputter 1112. The camera 112 includes an imager 1121 and an image outputter 1122.
- The imagers 1111 and 1121 each include a lens and an image sensor. The lens condenses light that enters from outside the camera and forms an image on the imaging surface of the image sensor. Examples of the lens include fisheye lenses and wide-angle lenses. The image sensor is, for example, a complementary metal oxide semiconductor (CMOS) or charge-coupled device (CCD) imaging device. The image sensor converts the optical image formed on the imaging surface to an electrical signal.
- In one example, each of the image outputters 1112 and 1122 includes a central processing unit (CPU), a micro processing unit (MPU), or a digital signal processor (DSP). Data (frames) of the captured image that are recognizable by humans are generated by performing predetermined signal processing using the electrical signals from the
imager 1111, and the generated data is output as image signals. - The
imager 1111 captures the image of the imaging range A21 illustrated in FIG. 2, and transmits an image signal to the image outputter 1112. Likewise, the imager 1121 captures the image of the imaging range A22, and transmits an image signal to the image outputter 1122.
- The image outputter 1112 outputs the image signal, sent from the imager 1111, to the server 113. Likewise, the image outputter 1122 outputs the image signal, sent from the imager 1121, to the server 113.
- The server 113 includes a selector 1131, a receiver 1130, an image processor 1132, a transmitter 1133, and a storage 1135.
- In one example, the selector 1131 and the image processor 1132 are constituted by a processor 1139 that includes a central processing unit (CPU), a micro processing unit (MPU), a digital signal processor (DSP), or the like. The processor 1139 realizes the functions of the selector 1131 and the image processor 1132 by executing a program that is stored in memory.
- The
receiver 1130 receives an image transmission request signal Re1 that is sent from theground monitoring system 15 via theground wireless device 14, thesatellite wireless device 13, and theaircraft wireless device 12. - The image transmission request signal Re1 includes imaging subject information that identifies the passenger to be imaged. The imaging subject information includes seat information that identifies the seat of the passenger to be imaged, and identification information of the passenger to be imaged (hereinafter referred to as "passenger identifying information").
- The imaging subject information may be input in real-time (during travel) from the
ground monitoring system 15, or may be received prior to boarding and registered in the storage 1135 or the like.
- In one example, the seat information is the seat number. The passenger identifying information is information that identifies the passenger and, for example, is a ticket number, a reservation number, a member registration number for the airline company, a passport number, a password, or the like. From the standpoint of security, and to limit the in-flight images that can be checked to those captured on the
aircraft 1, when transmitting the image transmission request signal Re1 from the ground monitoring system 15 to the server 113, authentication information known only to the passenger may be simultaneously sent with the identifying information, and authentication processing (described later) may be performed. - The
selector 1131 receives the image signals output from the image outputter 1112 of the camera 111 and the image signals output from the image outputter 1122 of the camera 112. The selector 1131 acquires the image transmission request signal Re1 from the receiver 1130. As described later, the selector 1131 selects the image captured by one camera among the plurality of cameras 111 and 112.
- As described later, the image processor 1132 executes processing to remove the other image regions, aside from the image region that covers the passenger to be imaged, from the image of the camera selected by the selector 1131.
- The
transmitter 1133 outputs, to the aircraft wireless device 12, an image signal Im1 of the image processed by the image processor 1132, and transmits the image signal Im1 to the ground monitoring system 15 via the satellite wireless device 13 and the ground wireless device 14.
- The storage 1135 is configured from a semiconductor memory, a magnetic disk, or the like. In one example, the storage 1135 stores the tables T1 and T2 illustrated in FIGS. 5 and 6. The passenger identifying information (a ticket number, a reservation number, a member registration number for the airline company, a password, or the like) and the seat information (a seat number or the like) are associated and stored in the table T1. The seat information, camera information, and image region information are associated and stored in the table T2. The selector 1131 uses these pieces of information for the authentication processing (described later) and for the selection of the camera image.
- The camera information is information that identifies the camera that has the imaging range that covers a seat. In this case, each camera has an imaging range that covers a plurality of seats and, therefore, a plurality of pieces of seat information are associated with each camera and stored. When the image captured by one camera has an imaging range that covers a plurality of seats (or passengers), the image region information is information that specifies the image region that covers the position of the seat (or passenger) to be imaged. For example, when, as illustrated in
FIG. 2, the imaging range of the camera 111 includes the seats 211 to 214 and the requested imaging subject is the seat 211 (passenger P1), the image region 41a is associated with the seat 211 (passenger P1) and stored, as illustrated in FIG. 7.
- The
aircraft wireless device 12 that is connected to the camera system 11 includes a signal converter 121. The signal converter 121 converts the image signal Im1 to a wireless signal and outputs the wireless signal. The outputted wireless signal is sent to the ground monitoring system 15 via the satellite wireless device 13 and the ground wireless device 14.
- The operations of the camera system 11 will be described with reference to FIG. 4.
- When the receiver 1130 receives an image transmission request signal Re1 (S101; YES), the server 113 executes the following processing.
- Authentication processing is executed by the
selector 1131 of the server 113 using the imaging subject information included in the image transmission request signal Re1. In one example, the selector 1131 carries out the authentication processing by comparing the seat information and the passenger identifying information included in the imaging subject information with the seat information and the passenger identifying information of the table T1 of FIG. 5 that is stored in the storage 1135 (S102). When, as a result of the comparison, the identifying information matches (S103; YES), step S104 is executed.
- Note that a configuration is possible in which the imaging subject information included in the image transmission request signal Re1 includes only the passenger identifying information. In this case, the imaging subject information is compared with the passenger identifying information of the table T1 of FIG. 5 and, thereafter, the seat of the passenger is identified based on the correspondence relationship with the seat information of the table T1.
- The selector 1131 identifies the seat of the passenger to be imaged based on the imaging subject information (S104). The selector 1131 references the table T2 stored in the storage 1135 to identify the camera that corresponds to the identified seat (S105), and selects the image captured by that camera (S106).
- The selected image is processed by the image processor 1132 (S107). Specifically, the image processor 1132 references the table T2 (FIG. 6) stored in the storage 1135 to acquire the image region information (for example, coordinate values or the like) that corresponds to the seat information of the passenger to be imaged. The image processor 1132 performs processing to remove the image regions other than the image region that corresponds to the image region information. -
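Steps S102 to S107 can be summarized in a short sketch; the table contents and identifiers below are invented placeholders standing in for tables T1 and T2 (FIGS. 5 and 6), not values from the disclosure:

```python
# Hypothetical stand-ins for table T1 (passenger ID -> seat) and
# table T2 (seat -> covering camera and image region to keep).
T1 = {"TICKET-001": "211"}
T2 = {"211": ("camera111", (0, 0, 160, 120))}

def handle_request(passenger_id, seat):
    """Sketch of S102-S107: authenticate the request against T1,
    look up the covering camera and image region in T2, and return
    what the image processor needs; everything outside the returned
    region would then be removed (masked)."""
    if T1.get(passenger_id) != seat:   # S102/S103: authentication
        return None                    # -> "no image" notification (S110)
    camera, region = T2[seat]          # S104-S106: camera selection
    return camera, region              # S107: mask all other regions
```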
FIG. 7 illustrates an example of the processing carried out by the image processor 1132. Here, an example of a case is described in which the seat specified by the imaging subject information is the seat 211, and the image that includes the passenger P1 seated in the seat 211 is processed. As illustrated in FIG. 2, the seat 211 is covered in the imaging range A21 of the camera 111.
- The unprocessed image 41 is an image signal that is input to the image processor 1132 from the image outputter 1112 of the camera 111. The image includes, in addition to the passenger P1 seated in the seat 211, the other passengers seated in the seats 212 to 214 that are covered in the imaging range A21 (FIG. 2) of the camera 111.
- Meanwhile, the processed image 42 is an image signal that is output from the image processor 1132. The image processor 1132 performs processing so that the image regions that cover passengers other than the passenger P1 seated in the seat 211 are covered by a privacy mask 421. This processing is performed based on the information of the table T2 of FIG. 6. Note that the shape of the privacy mask 421 illustrated in FIG. 7 is an example, and a configuration is possible in which masks are used that cover only the faces of the other passengers.
- Additionally, the image processor 1132 may be configured to remove, by facial recognition, the image regions other than the image region that covers the passenger to be imaged. For example, a configuration is possible in which the image processor 1132 acquires feature information of the face of the passenger to be imaged, and performs processing to remove face regions that do not correspond with that feature information.
- The processed image is sent, as the image signal Im1, by the transmitter 1133 to the ground monitoring system 15 (S108). If there is an end operation (a stop request from the ground monitoring system 15 or the like), the processing is ended.
- Meanwhile, when the identifying information does not match and authentication fails in step S103 (S103; NO), the processor 1139 of the server 113 sends a notification, indicating that an image will not be sent, from the transmitter 1133 to the ground monitoring system 15 (S110).
- In the
camera system 11 according to the present disclosure, the server 113 as the image transmission apparatus is an apparatus that connects to the plurality of cameras 111 and 112 installed in the aircraft 1. The server 113 includes the receiver 1130, the processor 1139, and the transmitter 1133. The receiver 1130 receives the image transmission request signal Re1 and the imaging subject information that identifies the passenger to be imaged from the ground monitoring system 15, which is an external device of the aircraft 1. The processor 1139 selects, based on the imaging subject information, an image captured by at least one camera of the plurality of cameras 111 and 112, and executes processing to remove the image regions other than the image region that covers the passenger to be imaged. The transmitter 1133 transmits the processed image to the ground monitoring system 15.
- Typically, it is not possible to check on, by image, situations in an aircraft from outside the
aircraft 1. However, when, for example, a child or elderly person is unaccompanied, there is a demand by the family of the passenger to check on, by image, the state of the passenger during travel. In this case, it is possible to capture an image of the passenger using a camera that is installed in the aircraft and transmit that image out of the aircraft. However, if that captured image is sent without modification, there is a risk of violating the privacy of the other passengers. Additionally, since, unlike typical monitoring cameras, an image that includes a specific image target is selected from images captured by a plurality of cameras, there is a high risk of transmitting an image of the wrong person. - In the
camera system 11 according to the present disclosure, the identifying information and the seat information of the passenger, as well as the characteristics of the images captured by the cameras installed in the aircraft 1, are used. As such, it is possible to safely check on the state of a specific passenger from outside the aircraft while maintaining the privacy of the other passengers in the aircraft.
- Additionally, in the camera system 11 according to the present disclosure, when an incident or accident occurs in the aircraft, images, audio, and the like of what occurred on-site at the time of the incident or accident can be recorded in the ground monitoring system 15, and the recorded content can be reviewed after the incident or accident in order to investigate its cause. Moreover, the images, audio, and the like in the aircraft 1 can be analyzed in real-time and used in a variety of applications.
1-5 Modification Examples
- (1) A configuration is possible in which the passenger identifying information recorded in advance in the
storage 1135 of the server 113 of the camera system 11 and the imaging subject information sent from the ground monitoring system 15 do not include the seat information and, instead, include feature information representing physical features that can be recognized from images, such as facial features.
In this case, when the feature information of the imaging subject is received from the ground monitoring system 15, the processor 1139 of the server 113 determines, based on facial recognition or the like, the appropriate passenger from the images captured by the plurality of cameras. The image captured by the camera that has the imaging range covering the passenger to be imaged may be selected by facial recognition. Additionally, a configuration is possible in which the seat information is acquired by determining the seat information of the passenger that has passed facial recognition based on the selected camera image.
- (2) In Embodiment 1, an example is described in which an image captured by a camera installed in the ceiling or the like of the aircraft 1 (hereinafter referred to as "in-aircraft camera") is selected. However, the present disclosure is not limited thereto. For example, a configuration is possible in which an image captured by a camera installed in the monitor of each seat (hereinafter referred to as "seat camera") is selected. In this case, since each seat camera has a one-to-one positional relationship with a seat, it is easier to identify, in the various images, the image region that covers the passenger to be imaged and the region that is to be subjected to the image processing by the image processor 1132.
- (3) In some cases, in moving bodies such as the
aircraft 1, the seats of the passengers may be changed after boarding. In this case, the acquired seat information may not match the actual seat information. In such cases, the image can be selected according to the processing illustrated by the flowchart of FIG. 8.
- The server 113 executes steps S201 to S203 in the same manner as steps S101 to S103 of the processing of FIG. 4. The imaging subject information received together with the image transmission request signal Re1 includes feature information that enables facial recognition of the passenger to be imaged. Note that this feature information is not limited to being received together with the image transmission request signal Re1, and may be acquired in advance and stored in the storage 1135 or the like.
- The selector 1131 selects, based on the seat information included in the imaging subject information or the seat information stored in the storage 1135, the image captured by the corresponding seat camera (S204). The processor 1139 uses the passenger identifying information to perform facial recognition on the selected image (S205). When the facial recognition is successful, the image captured by that seat camera is selected (S206).
- However, when the facial recognition in step S205 fails, the processor 1139 of the server 113 acquires the images captured by the plurality of in-aircraft cameras (S207), and uses the passenger identifying information to perform facial recognition on the acquired images captured by the various in-aircraft cameras (S208). When there is an image that passes the facial recognition, the image captured by the in-aircraft camera that output that image is selected (S209).
- The selected image is processed by the image processor 1132 in the same manner as described for steps S107 to S108 of FIG. 4 (S210). The image captured by the in-aircraft camera that was selected in step S209 is subjected to processing to remove the image regions other than the image region that covers the passenger for which facial recognition was successful. The processed image is sent, as the image signal Im1, to the ground monitoring system 15 via the transmitter 1133 (S211).
- Meanwhile, when the identifying information does not match and authentication fails (S203; NO), or when the facial recognition in step S205 or S208 fails, the processor 1139 sends a notification, indicating that an image will not be sent, to the ground monitoring system 15 via the transmitter 1133.
- Note that, if the passenger to be imaged is not included in the image of the seat camera in step S205, this means that the passenger indicated in the seat information of the passenger identifying information and the passenger that is actually seated in that seat do not match. In this case, the processor 1139 of the server 113 may determine the seat information of the passenger for which facial recognition was successful from the selected image of the in-aircraft camera, and correct the seat information held by the server 113.
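The fallback selection of steps S204 to S209 can be sketched as follows; the `recognized` callback stands in for the facial recognition step, and all names are illustrative assumptions:

```python
def select_image(seat_image, cabin_images, recognized):
    """Sketch of S204-S209: try the seat camera image first; when the
    target passenger's face is not recognized there, fall back to
    scanning the images from the in-aircraft cameras.  `recognized`
    returns True when the target passenger appears in an image."""
    if recognized(seat_image):        # S204-S206: seat camera first
        return seat_image
    for image in cabin_images:        # S207-S209: in-aircraft cameras
        if recognized(image):
            return image
    return None                       # recognition failed -> notify only
```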
- (4) With the
camera system 11 according to Embodiment 1, the description is focused on transmitting an image signal to the ground monitoring system 15, but the present disclosure is not limited thereto. A configuration is possible in which a microphone is installed in addition to the camera in the aircraft 1. In this case, an audio signal of the passenger seated in the seat specified by the imaging subject information may be acquired by the microphone, the acquired audio signal may be subjected to audio processing, by the processor 1139 of the server 113, to remove the voices of the other passengers, and the processed audio signal may be synchronized with the image signal and sent to the ground monitoring system 15.
- (5) A configuration is possible in which the server 113 of the camera system 11 transmits, at a predetermined time, the image signal of the passenger seated in the seat specified by the imaging subject information to the ground monitoring system 15 via the various communication devices (the aircraft wireless device 12, the satellite wireless device 13, the ground wireless device 14, and the like). For example, a configuration is possible in which the processor 1139 transmits an alarm and/or an image signal to the ground monitoring system 15 in accordance with a flight phase (take-off time, landing time, or the like) and/or in-flight services (meal service time, lights-on time, and lights-off time). In this case, the server 113 may control the transmission time of the image signal by acquiring, from the system in the aircraft 1, flight information, ON/OFF information of lighting, information about crew member announcements, terminal operation information, and the like.
- (6) A configuration is possible in which the server 113 of the camera system 11 analyzes the emotional or health state of the passenger from expressions and/or motions, based on the image of the passenger seated in the seat specified by the imaging subject information. In this case, when the processor 1139 determines that the passenger is in a specified condition (the passenger does not move for a set amount of time, or the like), an alarm and/or an image signal may be sent to the ground monitoring system 15 via the various communication devices.
- (7) With the camera system 11 according to Embodiment 1, an image captured by one camera was selected for the image transmission request signal Re1, but a configuration is possible in which images captured by a plurality of cameras, which have imaging ranges that cover the passenger to be imaged, are selected and sent to the ground monitoring system 15. With such a configuration, images of the passenger from a plurality of angles can be sent. -
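For the "does not move for a set amount of time" condition of modification (6), one possible sketch is a counter over per-frame change values; the representation and thresholds are assumptions made for illustration only:

```python
def motionless_too_long(frame_changes, threshold, limit):
    """Return True when the amount of change in the passenger's image
    region stays below `threshold` for `limit` consecutive frames,
    i.e. the condition under which modification (6) would raise an
    alarm toward the ground monitoring system."""
    still_frames = 0
    for change in frame_changes:
        still_frames = still_frames + 1 if change < threshold else 0
        if still_frames >= limit:
            return True
    return False
```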
-
FIG. 9 illustrates the arrangement of a security camera system 51 in the aircraft. Note that the number and installation locations of the cameras are examples. -
FIG. 10 illustrates an imaging range of a camera to be installed in the aircraft. As illustrated in FIG. 10, the interior of the aircraft is roughly divided into seat areas A62 and an aisle area A63. A camera 511 is installed in the ceiling of the aisle area A63. In this case, the imaging range of the camera 511 is as indicated by an imaging range A61.
- FIG. 11 is a block diagram illustrating the configurations of a security camera system 51 and an aircraft system 52.
- The security camera system 51 includes the camera 511 and a server 513. In this case, an example is described in which one camera is provided, but a configuration is possible in which two or more cameras are provided.
- The camera 511 includes an imager 5111 and an image outputter 5112. The operations of the imager 5111 and the image outputter 5112 are the same as those of the imager and the image outputter of Embodiment 1 and, as such, redundant descriptions thereof are avoided.
- The server 513 includes an image analyzer 5131 and an outputter 5133.
- The image analyzer 5131 receives image signals that are output from the image outputter 5112 of the camera 511. Additionally, the image analyzer 5131 receives sit-down request information Re2 that is issued from the aircraft system 52 when taking off, when landing, or when sudden turbulence is expected.
- Moreover, upon receipt of the sit-down request information Re2, the image analyzer 5131 analyzes the image signals and determines whether there are passengers that are not seated. Then, when there is a passenger that is not seated, the image analyzer 5131 calculates the positional information of the passenger based on an identification number, installation location, and the like of the camera that captured the image of the passenger, and issues a notification, as a not-seated alert Al1, to the aircraft system 52. -
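The seated/not-seated determination can be sketched as a point-in-rectangle test against the aisle area A63; the coordinate representation and names are assumptions for this example:

```python
def passengers_in_aisle(person_positions, aisle_area):
    """Return the detected person positions that fall inside the aisle
    area (x, y, w, h) -- the condition that triggers a not-seated
    alert Al1.  Positions inside the seat areas are ignored."""
    ax, ay, aw, ah = aisle_area
    return [(px, py) for px, py in person_positions
            if ax <= px < ax + aw and ay <= py < ay + ah]
```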
FIG. 12 illustrates an example of an image 81 output from the image outputter 5112 of the camera 511. The image analyzer 5131 acquires the image 81, determines the seat areas A62 and the aisle area A63, and analyzes whether a passenger is present in the aisle area A63. In the example illustrated in FIG. 12, since a passenger P2 is present in the aisle area A63, the not-seated alert Al1 is issued via the outputter 5133 to the aircraft system 52, and the crew members are notified.
- Note that the image analyzer 5131 can determine whether the person present in the aisle area is a passenger or a crew member based on clothing, hair style, movement, or the like. Thus, the generation of not-seated alerts Al1 about crew members can be suppressed.
- Note that a configuration is possible in which the image analyzer 5131 executes an analysis of the image signal not only upon receipt of the sit-down request information Re2, but also at a predetermined cycle after the sit-down request information Re2 is received. As a result of this configuration, the content of the not-seated alert Al1 can be updated and notified to the crew members as a time series. For example, it is possible to exclude from the not-seated alerts Al1 passengers who were in the aisle area immediately after the fasten-seatbelt sign was turned on (immediately after receipt of the sit-down request information Re2) but sat down right away. Additionally, it is possible to identify aisle areas in which passengers are present for an extended amount of time after the fasten-seatbelt sign has been turned on, and the not-seated alerts Al1 can be updated so as to direct the crew members to those aisle areas with priority.
- Furthermore, the image analyzer 5131 may count, in the image signal, the number of people present in an aisle area, and change the priority of the positional information included in the not-seated alert Al1 depending on the number of people. For example, assume that five passengers are present in an aisle area captured by a first camera and one passenger is present in an aisle area captured by a second camera. In this case, the image analyzer 5131 can assign higher priority to the positional information in the aircraft calculated based on the installation location of the first camera than to the positional information calculated based on the installation location of the second camera, include this priority in the not-seated alert Al1, and notify the crew members. As a result of this configuration, the crew members can efficiently start guiding passengers to their seats, beginning with the locations where there are more passengers in the aisle areas. -
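The headcount-based prioritization in the preceding paragraph can be sketched as a simple sort; the camera identifiers are illustrative:

```python
def prioritize_alerts(standing_counts):
    """Order camera locations by the number of people standing in the
    aisle area each camera covers, highest first, so that crew members
    can be directed to the busiest locations before the others."""
    return sorted(standing_counts.items(),
                  key=lambda item: item[1], reverse=True)
```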
- The embodiments described above have been given as examples of the technology that is disclosed in the present application. However, the technology according to the present disclosure is not limited thereto, and changes, substitutions, additions, and omissions can be applied to the embodiments. Moreover, the constituents described in the embodiments may be combined to create new embodiments.
- In the embodiments described above, an example is described in which the moving body is an aircraft, but the technology according to the present disclosure is not limited thereto. For example, the technology according to the present disclosure may be installed in a train, a bus, a marine vessel, or another vehicle.
- Moreover, the
processor 1139 of the server 113 may be implemented by a processor constituted by a dedicated electronic circuit designed to realize a predetermined function. Examples of such a processor include an FPGA and an ASIC. - Additionally, the program for performing the processing of each functional block of the
server 113 may be stored in a storage device such as a hard disk or ROM. In this case, the program is read out to RAM to be executed. - The processing of each functional block of the
server 113 may be realized by hardware, or may be realized by software (including cases where it is realized by an operating system (OS) or middleware, or with a predetermined library). Furthermore, the processing of each functional block of the server 113 may be realized by mixed processing by software and hardware. - Programs and methods that cause a computer to execute the processing of the various functional blocks of the
server 113, and computer-readable recording media on which those programs are stored, are within the scope of the present disclosure. Examples of computer-readable recording media include flexible disks, hard disks, CD-ROMs, MOs, DVDs, DVD-ROMs, DVD-RAMs, BDs (Blu-ray Discs), and semiconductor memory. The computer program is not limited to being stored on the recording media described above, and may be acquired over an electric telecommunication line, a wireless or wired communication line, a network such as the Internet, or the like. - Each step described in the flowcharts may be executed by a single device, or may be shared and executed by a plurality of devices. Furthermore, when a single step includes a plurality of processes, the plurality of processes of that single step may be executed by a single device, or may be shared and executed by a plurality of devices.
- In the present disclosure, the terms "system," "apparatus," and "device" refer to collections of pluralities of constituents (devices, modules (parts), and the like), and should not be construed as suggesting that all of the constituents are included in the same housing. Accordingly, the term "system" encompasses both a plurality of devices that are stored in separate housings and are connected across a network, and a single device including a single housing and a plurality of modules stored in that housing.
Claims (12)
- An image transmission apparatus (113) connectable to a plurality of cameras (111, 112) disposed in a vehicle (1) that image an interior of the vehicle (1), the image transmission apparatus (113) comprising:
a receiver (1130) that receives, from an external device (15) of the vehicle (1), an image transmission request including imaging subject information that identifies a passenger of the vehicle (1) to be imaged;
a processor (1139) that selects, based on the imaging subject information, an image captured by at least one camera of the plurality of cameras (111, 112), and executes processing to remove, from the image, other image regions aside from an image region that covers the passenger to be imaged; and
a transmitter (1133) that transmits the image that has been processed to the external device (15);
characterized in that
the plurality of cameras (111, 112) include a first camera that has an imaging range covering a seat of the vehicle (1), and a second camera that has an imaging range covering a plurality of seats,
the imaging subject information includes seat information of the vehicle (1), and identification information that indicates a feature of a face of the passenger to be imaged, and
the processor (1139)
acquires an image captured by the first camera that corresponds to the seat of the seat information,
determines, based on the identification information, whether the passenger to be imaged is included in the image captured by the first camera,
when the passenger to be imaged is included in the image captured by the first camera, selects the image captured by the first camera,
when the passenger to be imaged is not included in the image captured by the first camera, selects, based on the identification information, an image captured by at least one second camera that has an imaging range covering the passenger to be imaged, and
changes the seat information when the passenger to be imaged is not included in the image captured by the first camera.
- The image transmission apparatus (113) according to claim 1, wherein
the imaging subject information includes the seat information that identifies a seat of the passenger to be imaged and identification information of the passenger to be imaged.
- The image transmission apparatus (113) according to claim 1 or 2, further comprising:
a storage (1135) in which the seat information that identifies seats of the vehicle (1) and first identification information that identifies a plurality of passengers of the vehicle (1) are associated with each other and stored, wherein
the imaging subject information includes second identification information that identifies the passenger to be imaged,
the processor (1139) determines whether the first identification information and the second identification information match with each other, and
only when the first identification information and the second identification information match, the transmitter (1133) transmits the image to the external device (15).
- The image transmission apparatus (113) according to any one of claims 1 to 3, further comprising:
a storage (1135) in which the seat information that identifies seats of the vehicle (1) and camera information for identifying a camera that has an imaging range covering the seat are associated with each other and stored, wherein
the processor (1139) selects, based on the imaging subject information, an image captured by at least one camera that corresponds to the seat information stored in the storage (1135).
- The image transmission apparatus (113) according to claim 4, wherein:
the storage (1135) stores image region information that indicates, among images captured by one camera that has an imaging range covering a plurality of the seats, an image region that covers each of the seats, and
the processor (1139) executes processing to remove the other image regions from the image based on the image region information.
- The image transmission apparatus (113) according to any one of claims 1 to 4, wherein:
the receiver (1130) receives feature information about the passenger to be imaged, and
the processor (1139) selects, based on the feature information, an image captured by at least one camera that has an imaging range covering the passenger to be imaged.
- The image transmission apparatus (113) according to any one of claims 1 to 6, wherein:
the vehicle (1) is an aircraft, and
the transmitter (1133) transmits the image to the external device (15) in accordance with a flight phase or an in-flight service time of the aircraft.
- The image transmission apparatus (113) according to any one of claims 1 to 7, wherein the processor (1139) determines, based on the image that is selected, whether an abnormality of the passenger to be imaged exists and, when an abnormality is determined, outputs an alarm.
- A camera system (11), comprising:
the image transmission apparatus (113) according to any one of claims 1 to 8; and
the plurality of cameras (111, 112) disposed in the vehicle (1) so as to be connected to the image transmission apparatus (113).
- The camera system (11) according to claim 9, wherein the plurality of cameras (111, 112) includes at least one of a first camera that has an imaging range covering a seat of the vehicle (1), and a second camera that has an imaging range covering a plurality of seats.
- An image transmission method including, by using an image transmission apparatus (113) connected to a plurality of cameras (111, 112) disposed in a vehicle (1) that image an interior of the vehicle (1):
receiving, from an external device (15) of the vehicle (1), an image transmission request including imaging subject information that identifies a passenger of the vehicle (1) to be imaged,
selecting, based on the imaging subject information, an image captured by at least one camera of the plurality of cameras (111, 112),
executing processing to remove, from the image, other image regions aside from an image region that covers the passenger to be imaged, and
transmitting the image that has been processed to the external device (15);
characterized in that
the plurality of cameras (111, 112) include a first camera that has an imaging range covering a seat of the vehicle (1), and a second camera that has an imaging range covering a plurality of seats,
the imaging subject information includes seat information of the vehicle (1), and identification information that indicates a feature of a face of the passenger to be imaged, and
the method further includes
acquiring an image captured by the first camera that corresponds to the seat of the seat information,
determining, based on the identification information, whether the passenger to be imaged is included in the image captured by the first camera,
when the passenger to be imaged is included in the image captured by the first camera, selecting the image captured by the first camera,
when the passenger to be imaged is not included in the image captured by the first camera, selecting, based on the identification information, an image captured by at least one second camera that has an imaging range covering the passenger to be imaged, and
changing the seat information when the passenger to be imaged is not included in the image captured by the first camera.
- The image transmission method according to claim 11, wherein
the imaging subject information includes the seat information that identifies seats of the vehicle (1) and identification information of the passenger to be imaged.
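The two-stage camera-selection flow recited in the independent claims, try the single-seat camera first, then fall back to a wide-angle camera and update the stored seat information, can be sketched as follows. The face-matching helper and the camera lookup structures are hypothetical stand-ins; the claims do not prescribe any particular recognition algorithm.

```python
# Illustrative sketch of the camera-selection flow in the independent claims.
# matches_face, first_cameras, and second_cameras are hypothetical stand-ins.

def select_passenger_image(seat_info, face_feature, first_cameras,
                           second_cameras, matches_face):
    """Return (image, updated_seat_info) for the passenger to be imaged.

    first_cameras: maps a seat to the single-seat (first) camera's image.
    second_cameras: list of (seat, image) pairs from wide-angle (second)
        cameras, each covering several seats.
    matches_face(image, face_feature): True when the passenger whose face
        feature is given appears in the image.
    """
    image = first_cameras.get(seat_info)
    if image is not None and matches_face(image, face_feature):
        # Passenger found at the expected seat: select the first camera's image.
        return image, seat_info
    # Passenger not at the expected seat: search the wide-angle cameras and
    # change the stored seat information to the seat where they were found.
    for seat, wide_image in second_cameras:
        if matches_face(wide_image, face_feature):
            return wide_image, seat
    return None, seat_info

# Toy matcher for demonstration: a face "appears" in an image when the
# face label is a substring of the image label.
toy_match = lambda img, face: face in img
img, seat = select_passenger_image(
    "12A", "alice",
    first_cameras={"12A": "seat-12A-empty"},
    second_cameras=[("12C", "aisle-view-alice")],
    matches_face=toy_match)
```

In the example, the first camera at seat 12A does not show the passenger, so the wide-angle image covering seat 12C is selected and the seat information is changed to 12C, mirroring the final limitation of the claims.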
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862631427P | 2018-02-15 | 2018-02-15 |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3528169A1 EP3528169A1 (en) | 2019-08-21 |
EP3528169B1 true EP3528169B1 (en) | 2025-04-16 |
Family
ID=66101791
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19156677.7A Active EP3528169B1 (en) | 2018-02-15 | 2019-02-12 | Image transmission apparatus, camera system, and image transmission method |
Country Status (2)
Country | Link |
---|---|
US (1) | US10911722B2 (en) |
EP (1) | EP3528169B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11640723B2 (en) * | 2020-10-20 | 2023-05-02 | Rosemount Aerospace Inc. | System and method for enhanced surveillance using video analytics |
US20230156055A1 (en) * | 2021-11-17 | 2023-05-18 | The Boeing Company | System for Transferring Data from a Moving Vehicle to a Remote Monitoring Node |
KR102570386B1 (en) * | 2023-03-17 | 2023-08-28 | (주)지앤티솔루션 | Service providing system and method for detecting the number of passengers of vehicle in high occupancy vehicle lane |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4816828A (en) * | 1986-03-27 | 1989-03-28 | Feher Kornel J | Aircraft damage assessment and surveillance system |
US7634662B2 (en) * | 2002-11-21 | 2009-12-15 | Monroe David A | Method for incorporating facial recognition technology in a multimedia surveillance system |
US7131136B2 (en) * | 2002-07-10 | 2006-10-31 | E-Watch, Inc. | Comprehensive multi-media surveillance and response system for aircraft, operations centers, airports and other commercial transports, centers and terminals |
US6771186B1 (en) * | 2001-10-22 | 2004-08-03 | Birinder R. Boveja | Wireless remote control of systems for countering hostile activity aboard an airplane |
JP2005018307A (en) | 2003-06-25 | 2005-01-20 | Hitachi Kokusai Electric Inc | Automatic ticket inspection system |
US8824751B2 (en) * | 2013-01-07 | 2014-09-02 | MTN Satellite Communications | Digital photograph group editing and access |
EP2924663B1 (en) * | 2014-03-26 | 2019-10-23 | Airbus Operations GmbH | Automatic head count determination on board an aircraft |
CA2941924A1 (en) * | 2014-04-07 | 2015-10-15 | Zodiac Aerotechnics | Cabin monitoring system and cabin of aircraft or spacecraft |
US10198802B2 (en) * | 2015-07-01 | 2019-02-05 | Hitachi Kokusai Electric Inc. | Monitoring system, photography-side device, and verification-side device |
- 2019
- 2019-02-12 EP EP19156677.7A patent/EP3528169B1/en active Active
- 2019-02-13 US US16/274,795 patent/US10911722B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10911722B2 (en) | 2021-02-02 |
EP3528169A1 (en) | 2019-08-21 |
US20190253671A1 (en) | 2019-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3528169B1 (en) | Image transmission apparatus, camera system, and image transmission method | |
US10881357B1 (en) | Systems and methods for monitoring the health of vehicle passengers using camera images | |
EP3606053B1 (en) | Monitoring system and monitoring method | |
EP3530569B1 (en) | Universal passenger seat system and data interface | |
JP2024019238A (en) | Video monitoring method, video monitoring system, video monitoring terminal, information processing device and program | |
CN114402319A (en) | System, method and computer program for enabling operations based on user authorization | |
US20170113801A1 (en) | Cabin monitoring system and cabin of aircraft or spacecraft | |
EP2924663B1 (en) | Automatic head count determination on board an aircraft | |
US10691955B1 (en) | Aircraft cabin artificial intelligence crew assistance | |
US10885342B1 (en) | Intelligent monitoring camera using computer vision and intelligent personal audio assistant capabilities to maintain privacy | |
US11740315B2 (en) | Mobile body detection device, mobile body detection method, and mobile body detection program | |
US20190191128A1 (en) | Image processing apparatus, image processing method, and storage medium having program stored therein | |
WO2020194584A1 (en) | Object tracking device, control method, and program | |
EP3503021A1 (en) | Information processing device, system, information processing method, and storage medium | |
KR101820344B1 (en) | Image sensing device included in the emergency propagation function | |
WO2019107167A1 (en) | Image processing device, image processing system, image pickup device, image pickup system, and image processing method | |
CN108877141A (en) | Safety zone monitoring system and method | |
KR101825134B1 (en) | System for crime prevention of drone using emotion recognition device | |
US20160221687A1 (en) | Aircraft communication network | |
JP2019008474A (en) | Monitoring support system and monitoring support method | |
CN116994198A (en) | Passenger behavior detection method, apparatus, electronic device, and computer-readable medium | |
JP7235325B2 (en) | Suspicious object detection system | |
CN112232142A (en) | Safety belt identification method and device and computer readable storage medium | |
US11640671B2 (en) | Monitoring system and method for identifying an object of interest after the object of interest has undergone a change in appearance | |
Bhatnagar et al. | Design of a cnn based autonomous in-seat passenger anomaly detection system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20200219 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210512 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Free format text: PREVIOUS MAIN CLASS: G06K0009000000 Ipc: G06V0020520000 Ref country code: DE Ref legal event code: R079 Ref document number: 602019068608 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06K0009000000 Ipc: G06V0020520000 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 7/18 20060101ALI20241120BHEP Ipc: G06V 20/59 20220101ALI20241120BHEP Ipc: G06V 20/52 20220101AFI20241120BHEP |
|
INTG | Intention to grant announced |
Effective date: 20241202 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: DE Ref legal event code: R096 Ref document number: 602019068608 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20250416 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250416 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1786308 Country of ref document: AT Kind code of ref document: T Effective date: 20250416 |