WO2022244062A1 - Information processing device, information processing method, and computer program - Google Patents
- Publication number
- WO2022244062A1 (PCT/JP2021/018624)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- probability
- information processing
- frame
- situation
- existence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to an information processing device, an information processing method, and a computer program.
- Information processing devices have various functions; for example, some are used to grasp the situation of targets (including not only tangible but also intangible objects). Examples of such information processing devices include devices that can estimate what kind of situation a target is in, for example by acquiring the distance between objects (an example of targets) to determine whether they are likely to collide, or by estimating human emotions (another example of a target).
- Patent Literature 1 proposes an information processing apparatus capable of obtaining biological information from a sensor and estimating human emotions.
- the present invention has been made in view of such circumstances, and aims to provide an information processing device, an information processing method, and a computer program that can appropriately grasp the situation of a target.
- an image acquisition unit configured to be capable of acquiring an image
- the image includes a plurality of frames with different timings
- the plurality of frames includes a first frame and a second frame whose timing is later than the first frame
- the probability acquisition unit is configured to acquire the existence probability based on the image
- the existence probability is a probability that an object exists in the frame
- the determination unit is configured to determine the state of the object based on the existence probabilities of the first and second frames.
- the determination unit is configured to determine the state of the object based on the existence probabilities of the first and second frames, so that the state of the target (object) can be properly grasped.
- the probability acquisition unit is configured to be capable of further acquiring coordinates of the object based on the image, and an information processing apparatus is provided in which the determination unit compares the existence probabilities and determines the situation of the object when it determines that the distance between the coordinates of the object in the first and second frames is equal to or less than a predetermined distance.
- the probability acquisition unit is configured to be capable of further acquiring type data based on the image
- the type data is data specifying the type of the object
- an information processing apparatus is provided in which the determination unit compares the existence probabilities and determines the situation of the object when it determines that the type of the object in the first and second frames is the same.
- a type identification probability is specified for each candidate of the type of the object, and an information processing device is provided in which the determination unit is configured to identify the candidate with the highest type identification probability among the candidates as the type of the object.
- the information processing device is provided, wherein the determination unit determines the situation of the object by comparing the existence probabilities when determining that the object in the first and second frames is the same.
- the situation corresponds to movement of the object, and an information processing device is provided that is configured to determine that the object is approaching when the determination unit determines that the difference between the existence probability of the second frame and the existence probability of the first frame is larger than a predetermined first threshold.
- an information processing device is provided, wherein the object includes a ship, and the information processing device is configured to determine the situation of relative movement of the ship.
- a correction probability acquisition unit is further provided, the correction probability acquisition unit being configured to acquire a correction probability based on the existence probability of the second frame and the existence probability of the first frame; the correction probability is the probability that the object exists in a frame between the timing of the first frame and the timing of the second frame; and an information processing device is provided in which the determination unit is configured to determine the situation of the object based on the correction probability.
- the correction probability acquired by the correction probability acquisition unit is higher than the existence probability of the second frame.
- an information processing device is provided in which the determination unit determines the situation of the object based on the correction probability when both the existence probabilities of the first and second frames are lower than a predetermined second threshold.
- the information processing device is provided, wherein the correction probability is higher than the second threshold.
- the determination unit determines the situation of the object without using the correction probability when at least one of the existence probabilities of the first and second frames is higher than the second threshold.
- an information processing method is provided comprising an image acquisition step, a probability acquisition step, and a determination step, wherein the image acquisition step acquires an image, the image includes a plurality of frames with different timings, the plurality of frames includes a first frame and a second frame whose timing is later than the first frame, the probability acquisition step acquires an existence probability based on the image, the existence probability being the probability that an object exists in the frame, and the determination step determines the situation of the object based on the existence probabilities of the first and second frames.
- a computer program is provided for executing an information processing method comprising an image acquisition step, a probability acquisition step, and a determination step, wherein the image acquisition step acquires an image, the image includes a plurality of frames with different timings, the plurality of frames includes a first frame and a second frame whose timing is later than the first frame, the probability acquisition step acquires an existence probability based on the image, the existence probability being the probability that an object exists in the frame, and the determination step determines the situation of the object based on the existence probabilities of the first and second frames.
- the present invention comprises a probability acquisition unit and a correction probability acquisition unit, wherein the probability acquisition unit can acquire a first situation probability and a second situation probability whose timing is later than the first situation probability, the first and second situation probabilities being probabilities representing the situation of a target; the correction probability acquisition unit can acquire a correction probability based on the first and second situation probabilities, the correction probability being a probability representing the situation of the target between the timing of acquiring the first situation probability and the timing of acquiring the second situation probability; thus an information processing apparatus is provided.
- the correction probability acquired by the correction probability acquisition unit is higher than the second situation probability.
- the correction probability acquired by the correction probability acquisition unit is lower than the second situation probability.
- FIG. 1 is a functional block diagram of an information processing device 100 according to an embodiment.
- FIG. 2A is an explanatory diagram of an image D1 input to the probability acquisition unit 2 and an output D2 output from the probability acquisition unit 2.
- FIG. 2B is a schematic diagram for explaining the type data d3.
- FIG. 3 is a schematic diagram of the image D1.
- FIG. 4 is a position explanatory diagram of each object, showing the state of each object in the image D1 shown in FIG. 3 as viewed from above.
- FIG. 5A is a schematic diagram of the first frame of image D1.
- FIG. 5B is a schematic diagram of the second frame of image D1.
- FIG. 6 is a control flowchart of the information processing device 100 according to the embodiment.
- FIG. 7 is a functional block diagram of an information processing apparatus 100 according to Modification 1.
- FIG. 8 is a control flowchart of the information processing apparatus 100 according to Modification 1.
- FIG. 9 is a schematic diagram of an image frame according to Modification 2.
- FIG. 10A is a schematic diagram of the first frame of the image D1 according to Modification 3.
- FIG. 10B is a schematic diagram of the second frame of the image D1 according to Modification 3.
- FIG. 11A is an explanatory diagram of an image D1 input to the probability acquisition unit 2 and an output D2 output from the probability acquisition unit 2 according to Modification 3.
- FIG. 11B is a schematic diagram for explaining the existence probability d1 according to Modification 3.
- FIG. 12 is a control flowchart of the information processing apparatus 100 according to Modification 3.
- FIG. 13 is a functional block diagram of an information processing apparatus 100 according to Modification 4.
- the information processing device 100 will be described as being mounted on a ship (the onboard ship obt shown in FIG. 4).
- the information processing device 100 has a function of quickly determining whether or not a predetermined target on the sea is approaching the onboard vessel obt.
- the target is an object (tangible object).
- it is not limited whether or not the object operates by its own power.
- the object is not limited to an object floating on the sea, and may be an object floating or flying.
- Objects are, for example, ships, buoys, yachts, and the like. Some ships cannot turn abruptly or change speed quickly, so knowing at an early stage whether an object is approaching the onboard ship is useful from the viewpoint of avoiding a collision between the onboard ship and the object.
- the imaging device 20 is configured to output to the information processing device 100 data (image D1) obtained by imaging the surroundings (the front, in the embodiment) of the onboard ship obt.
- the imaging device 20 is attached to the onboard ship obt.
- the external device 21 is composed of, for example, a speaker, a display, or the like, and has a function of informing passengers of the on-board ship obt whether or not an object is approaching by means of voice, text, or the like.
- Information processing apparatus 100 can determine whether an object is approaching based on a plurality of frames of image D1 and output the determination result to the external device 21. As a result, the information processing apparatus 100 can prompt the passengers of the onboard ship obt to take an avoidance action at an early stage.
- the information processing device 100 includes an image acquisition unit 1, a probability acquisition unit 2, a determination unit 3, and an output unit 4.
- Each component of the information processing apparatus 100 may be realized by software or by hardware.
- various functions can be realized by the CPU executing a computer program.
- the program may be stored in a built-in storage unit, or may be stored in a computer-readable non-transitory recording medium. Alternatively, a program stored in an external storage unit may be read out and executed by so-called cloud computing.
- it can be implemented by various circuits such as an ASIC, FPGA, or DRP. The various information and concepts handled in this embodiment are represented, as aggregates of binary bits composed of 0s and 1s, by high and low signal values, and communication and computation can be performed according to the software or hardware aspect described above.
- the image acquisition unit 1 is configured to acquire an image D1 from an imaging device 20.
- the processing performed by the image acquisition unit 1 is an example of the image acquisition step.
- the image D1 has multiple frames.
- a plurality of frames is data for which time-series information is specified.
- the multiple frames include a first frame D11 and a second frame D12 whose timing is later than the first frame, as shown in FIGS. 5A and 5B.
- the timing (time) in the first frame is defined as first timing
- the timing (time) in the second frame is defined as second timing.
- the ship ob1 is approaching the onboard ship obt. Therefore, in the frame shown in FIG. 5B, the ship ob1 is displayed gradually larger and more clearly than in the frame shown in FIG. 5A.
- the frame of the image D1 shows the ship ob1, the sky ob2, and the sea ob3.
- the object refers to an object such as a buoy, a yacht, or the like, other than the ship ob1, which may become an obstacle to the progress of the ship.
- the ship ob1 is located in front of the onboard ship obt (imaging device 20). As shown in FIG. 4, defining the Cartesian coordinates centering on the on-board ship obt, the ship ob1 is positioned on the y-axis. In FIG. 4, the y-axis is parallel to the traveling direction of the on-board vessel obt, and the x-axis is orthogonal to the y-axis. Vessel ob1 is an object approaching onboard vessel obt. Specifically, the ship ob1 is moving in the -y-axis direction.
- the determination unit 3 which will be described later, performs situation determination such as whether or not an object is approaching based on the difference in the situation probabilities of the object in the two frames.
- a situation probability is a probability representing a situation of an object (an object in the embodiment), and a situation probability in the embodiment is an existence probability of the object (an object in the embodiment).
- Existence probability is the probability that a predetermined object exists in the frame of the image.
- the difference in timing between the first frame and the second frame is not particularly limited, but is preferably set according to the device on which the information processing apparatus 100 is installed. For example, when the information processing apparatus 100 is mounted on a ship as in the embodiment, the moving speed of an object that can be an obstacle is lower than that of a vehicle, an airplane, or the like.
- the difference between the first timing and the second timing is, for example, 10 seconds, 20 seconds, 30 seconds, 40 seconds, 50 seconds, 1 minute, 2 minutes, 3 minutes, 4 minutes, 5 minutes, 6 minutes, 7 minutes, 8 minutes, 9 minutes, or 10 minutes. The difference may also be expressed as a range bounded by any two of the values listed here.
- the probability acquisition unit 2 acquires an output D2 including the existence probability d1 based on the image D1, as shown in FIG. 2A.
- the probability acquisition unit 2 then outputs the output D2 to the determination unit 3 .
- the output D2 has existence probability d1 (existence probability data), coordinates d2 (coordinate data), and type data d3.
- the existence probability d1 is data (situation probability data) in which the situation probability (existence probability in the embodiment) is specified.
- a configuration example of these data will be described.
- the existence probability d1 is data specifying the value of the existence probability of an object in the frame of the image D1.
- the existence probability of the existence probability d1 is the probability that a predetermined object exists in the frame of the image D1.
- the coordinates d2 are data specifying the coordinates of the object in the frame of the image D1.
- the coordinates of coordinate d2 are in the xz coordinate system.
- the coordinates d2 also include data specifying the width W and height H of the object in the frame of the image D1.
- the type data d3 is data for specifying the type of the object in the frame of the image D1.
- the type data d3 is data in which the probability is specified for each predetermined type for each object in the frame of the image D1.
- predetermined types in an embodiment include ships, buoys, yachts, and the like.
- the objects in the frame of image D1 include ships, buoys and yachts.
- the type data d3 specifies, for the object corresponding to the ship ob1, the probabilities of all candidate types, such as the probability of being a ship, the probability of being a buoy, and the probability of being a yacht.
- the above output D2 (existence probability d1, coordinates d2, and type data d3) can be calculated based on a learning model that inputs image D1 and outputs output D2.
- a learning model is a model that is trained using a large amount of training data (pairs of known input data and correct answer data) so that it can predict outputs for future inputs.
- the input data of the training data are image data (frames) capturing objects such as ships on the sea, and the correct answer data include the existence probability of the object shown in the image data, the coordinates of the object, the width and height of the object, and the type of the object.
- the learning model (machine learning model) can employ various object detection algorithms such as YOLO (You Only Look Once) and SSD (Single Shot Detector).
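As a concrete illustration, the per-object output D2 described above (existence probability d1, coordinates d2 with width and height, and per-type probabilities d3) can be represented as a small data structure, in the style a YOLO- or SSD-like detector would populate per frame. This is a sketch for illustration only; the field names and the example values are assumptions, not part of the disclosed apparatus.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    """One object in one frame, modeled on the output D2 (names are illustrative)."""
    existence: float   # existence probability d1, in [0, 1]
    x: float           # coordinates d2: position in the image plane
    z: float
    width: float       # bounding-box width W, also part of d2
    height: float      # bounding-box height H, also part of d2
    type_probs: dict = field(default_factory=dict)  # type data d3: per-candidate probabilities

# Example detection of the ship ob1: the "ship" candidate dominates d3.
ob1 = Detection(existence=0.85, x=120.0, z=40.0, width=60.0, height=25.0,
                type_probs={"ship": 0.9, "buoy": 0.05, "yacht": 0.05})
print(max(ob1.type_probs, key=ob1.type_probs.get))  # -> ship
```

The type identified from d3 is simply the candidate with the highest type identification probability, as the determination unit 3 does later in the document.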
- the determination unit 3 has a function of determining the presence or absence of the coordinates d2 (hereinafter also referred to as the first function), a function of determining whether the distance between the coordinates of objects in different frames is equal to or less than a predetermined distance (hereinafter also referred to as the second function), a function of determining whether the types of objects in different frames are the same (hereinafter also referred to as the third function), and a function of comparing the existence probabilities of objects in different frames (hereinafter also referred to as the fourth function).
- the determination unit 3 is configured to be able to output a determination result indicating the state of the object.
- the determination unit 3 determines the presence or absence of the coordinates d2 of the second frame (the chronologically later frame). If the existence probability of the second frame is 0%, the output D2 does not include the coordinates d2; if the existence probability is higher than 0%, the output D2 includes the coordinates d2. That is, in the embodiment, the presence of the coordinates d2 in the output D2 is equivalent to the existence probability being higher than 0%.
- the first function may be replaced by threshold determination. That is, the determination unit 3 may be replaced with a function of determining whether or not the existence probability is higher than a predetermined threshold value (for example, 5%). In other words, under conditions where the existence probability is considerably low, the information processing apparatus 100 may process the object as not existing in the frame.
- the probability acquisition unit 2 acquires the coordinates of the object in the first frame (the chronologically earlier frame) and the coordinates of the object in the second frame (the chronologically later frame).
- these coordinates are output to the determination unit 3 as part of the output D2.
- the determination unit 3 determines whether the inter-coordinate distance between the coordinates of the object in the first frame and the coordinates of the object in the second frame is equal to or less than a predetermined distance.
- the information processing apparatus 100 uses this inter-coordinate distance as a criterion for determining whether or not the object in each frame is the same. This is because, if the distance between coordinates is equal to or less than the predetermined distance, it increases the possibility that the object in the first frame and the object in the second frame are the same.
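The second function reduces to a Euclidean distance test between the coordinates d2 of the two frames. A minimal sketch follows; the function name, the tuple convention, and the numeric values are assumptions for illustration.

```python
import math

def same_position(p1, p2, max_dist):
    """Second function: True when the inter-coordinate distance between the
    object in the first frame (p1) and in the second frame (p2) is equal to
    or less than the predetermined distance max_dist."""
    dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return dist <= max_dist

# The ship ob1 moves only slightly between frames, so it passes the test
# and may be the same object; a far-away detection fails it.
print(same_position((120.0, 40.0), (123.0, 44.0), max_dist=10.0))  # True
print(same_position((0.0, 0.0), (30.0, 40.0), max_dist=10.0))      # False
```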
- the probability acquisition unit 2 acquires the object type data d3 of the first frame (the chronologically earlier frame) and the object type data d3 of the second frame (the chronologically later frame). These data are output to the determination unit 3 as the output D2.
- the determination unit 3 identifies the type of the object using the type data d3 of each frame. Specifically, the determination unit 3 identifies the candidate with the highest probability among the object type candidate probabilities (type identification probabilities) in the first frame as the object type. For example, in the case of the image D1 shown in FIG. 3, since the ship is clearly displayed, the probability of being a ship is calculated to be high, and the probabilities of being a buoy or a yacht are calculated to be low.
- the determination unit 3 compares the magnitude of these probabilities and identifies the ship, which is the highest candidate, as the type of object.
- the determination unit 3 similarly identifies the type of object for the second frame. That is, the determination unit 3 identifies the candidate with the highest probability among the object type candidate probabilities in the second frame as the object type. Then, the determination unit 3 determines whether or not the type of the object in the first frame is the same as the type of the object in the second frame. In this way, the information processing apparatus 100 uses the type of object as a criterion for determining whether the objects in each frame are the same. This is because if the types are the same, the possibility that the object in the first frame and the object in the second frame are the same increases.
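The third function is thus an argmax over the type identification probabilities in each frame, followed by an equality check. The sketch below uses illustrative probability values; the helper name is an assumption.

```python
def identify_type(type_probs):
    """Identify the candidate with the highest type identification
    probability as the type of the object (per frame)."""
    return max(type_probs, key=type_probs.get)

first  = {"ship": 0.7, "buoy": 0.2, "yacht": 0.1}   # type data d3, first frame
second = {"ship": 0.8, "buoy": 0.1, "yacht": 0.1}   # type data d3, second frame

# Same identified type in both frames -> the objects may be the same.
print(identify_type(first) == identify_type(second))  # True
```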
- the probability acquisition unit 2 acquires the object existence probability of the first frame (the chronologically earlier frame) and the object existence probability of the second frame (the chronologically later frame), and these existence probabilities are output to the determination unit 3 as the output D2.
- the determination unit 3 has a function of comparing the existence probability of the object in the first frame with the existence probability of the object in the second frame. Specifically, the determination unit 3 determines that the object is approaching when the value (difference) obtained by subtracting the existence probability of the first frame from the existence probability of the second frame is greater than a predetermined threshold Th1 (an example of the first threshold).
- in other words, the determination unit 3 determines that an object is approaching when the existence probability rises by more than the predetermined threshold Th1 from the first frame to the second frame. In this way, the information processing apparatus 100 makes the situation judgment using the difference in existence probability rather than the existence probability alone. Note that the determination result of the determination unit 3 corresponds to the determination result D3.
- the value of the threshold Th1 is not particularly limited, but the value of the threshold Th1 can be set to any numerical value from 10 to 50, for example.
- the value of the threshold Th1 can be appropriately set according to the device on which the information processing apparatus 100 is installed.
- the output unit 4 is configured to output the determination result D3 to the external device 21 .
- the external device 21 is not limited to equipment such as a display mounted on the onboard ship obt; the determination result may also be output to a control center that supervises the operation status of the onboard ship obt.
- Step S1 Judgment of presence/absence of coordinate d2 (first function)
- the determination unit 3 determines whether or not the coordinate d2 exists in the second frame (in the later frame in chronological order). If it exists, the process proceeds to step S2. If it does not exist, the process proceeds to step S5-2.
- Step S2 Threshold determination of distance between coordinates (second function)
- the determination unit 3 determines whether or not the distance between the coordinates of the object in the first and second frames is equal to or less than a predetermined distance. If the distance is less than or equal to the predetermined distance, there is a possibility that both the objects in the first and second frames are the same, so the process proceeds to step S3. If the distance is not equal to or less than the predetermined distance, the process proceeds to step S5-2.
- Step S3 Judgment of Same Type (Third Function)
- the determination unit 3 determines whether or not the types of objects in the first and second frames are the same. In making this determination, the determination unit 3 specifies in advance the types of objects in the first and second frames. That is, the determination unit 3 identifies the candidate with the highest probability (type identification probability) among the object type candidates as the object type for each of the first and second frames. If the types of objects are the same, the possibility that both objects are the same increases, so the process proceeds to step S4. If the types of objects are not the same, the process proceeds to step S5-2.
- Step S4 Comparison of existence probabilities (fourth function)
- the information processing apparatus 100 assumes that the objects in the first and second frames are the same when steps S1 to S3 have been performed. That is, steps S1 to S3 can be said to be steps for determining whether the objects in the first and second frames are the same.
- the determination unit 3 acquires the difference in the existence probability of the objects considered to be identical (a value obtained by subtracting the existence probability of the object in the first frame from the existence probability of the object in the second frame). Then, the determination unit 3 determines whether or not this difference is greater than a predetermined threshold value Th1. If it is greater than the threshold Th1, the process proceeds to step S5-1. If it is not greater than the threshold, the process proceeds to step S5-2.
- Step S4 is an example of a determination step.
- Step S5-1 Approaching
- the determination unit 3 generates a determination result D3 indicating that the object is approaching.
- Step S5-2 Not Approaching
- the determination unit 3 generates a determination result D3 indicating that the object is not approaching.
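The flowchart of FIG. 6 (steps S1 to S5-2) can be summarized in code. This is a sketch under stated assumptions: each frame's output D2 is represented as a dict with existence probability, coordinates, and type probabilities; `None` stands for a missing coordinates d2; and the threshold values are placeholders, not values from the publication.

```python
import math

def is_approaching(first, second, max_dist=10.0, th1=0.2):
    """Steps S1-S5 of FIG. 6. `first`/`second` hold the output D2 for the
    first and second frames; th1 plays the role of the first threshold Th1."""
    # Step S1: presence of coordinates d2 in the second frame.
    if second is None:
        return False  # -> step S5-2: not approaching
    # Step S2: inter-coordinate distance at most the predetermined distance.
    if math.hypot(second["x"] - first["x"], second["z"] - first["z"]) > max_dist:
        return False
    # Step S3: same identified type (highest type identification probability).
    t1 = max(first["type_probs"], key=first["type_probs"].get)
    t2 = max(second["type_probs"], key=second["type_probs"].get)
    if t1 != t2:
        return False
    # Step S4: existence-probability difference greater than Th1 -> step S5-1.
    return (second["existence"] - first["existence"]) > th1

f1 = {"x": 120.0, "z": 40.0, "existence": 0.4,
      "type_probs": {"ship": 0.7, "buoy": 0.3}}
f2 = {"x": 123.0, "z": 44.0, "existence": 0.7,
      "type_probs": {"ship": 0.8, "buoy": 0.2}}
print(is_approaching(f1, f2))  # True: approaching (step S5-1)
```

Steps S1 to S3 act as a gate establishing that the two detections plausibly refer to the same object, and only then is the probability difference of step S4 evaluated.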
- the information processing apparatus 100 is configured such that the determination unit 3 determines the situation of an object based on the existence probabilities of the first and second frames. That is, unlike conventional information processing apparatuses, the information processing apparatus 100 determines whether an avoidance action should be taken using the difference between existence probabilities rather than a single probability value alone. Therefore, the information processing apparatus 100 can determine earlier than conventional apparatuses whether a situation calling for specific control exists, and can appropriately exhibit its functions (such as the avoidance control function).
- 5-1 Modification 1: Correction Probability Acquisition Unit 5
- the information processing apparatus 100 may further include a correction probability acquisition unit 5, as shown in FIG.
- the correction probability acquisition unit 5 is configured to be able to acquire the correction probability D21 based on the difference between the existence probability of the second frame and the existence probability of the first frame.
- the correction probability D21 is data in which the value of the correction probability is specified.
- the correction probability is the probability that an object exists in the frame between the timing of the first frame and the timing of the second frame.
- in the embodiment, the difference in existence probability itself is compared with the threshold value, whereas in this modification the difference is converted back into a probability (the correction probability Pc).
- the correction probability Pc can be obtained based on, for example, a predetermined relational expression. That is, once the existence probabilities of the first and second frames are determined, the correction probability Pc is uniquely determined.
- This relational expression is set so that, for example, when the existence probability of the second frame is higher than that of the first frame, the correction probability is higher than the existence probability of the second frame. That is, when the existence probability increases, the object is likely to be approaching, so the correction probability Pc is set higher than the existence probability of the second frame. Conversely, when the existence probability decreases, the object is likely not approaching; therefore, when the existence probability of the first frame is higher than that of the second frame, the relational expression sets the correction probability Pc lower than the existence probability of the second frame.
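The publication does not disclose the relational expression itself, so any concrete formula is an assumption. One simple function consistent with the stated properties (Pc above the second-frame probability when the probability rose, below it when it fell, and sensitive to the absolute magnitude of the probability as well as to the difference) is sketched below; the `gain` parameter is purely illustrative.

```python
def correction_probability(p1, p2, gain=2.0):
    """Illustrative correction probability Pc (the actual relational
    expression is not disclosed). Pc exceeds p2 when the existence
    probability rose (p2 > p1) and falls below p2 when it dropped.
    The adjustment scales with p2 itself, so the same difference
    yields a different Pc at different probability levels."""
    pc = p2 + gain * (p2 - p1) * p2
    return min(max(pc, 0.0), 1.0)  # clamp to a valid probability

# Same 5-point rise, different magnitudes (cf. 1%->6% vs 20%->25%):
# the low-probability case gets only a small boost, the higher one a larger boost.
print(correction_probability(0.01, 0.06) > 0.06)   # True
print(correction_probability(0.20, 0.25) > 0.25)   # True
```

Under such a formula, two frame pairs with the same raw difference can land on opposite sides of the threshold Th2, which matches the discussion of the 1%-to-6% versus 20%-to-25% cases below.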
- Step S4 is a step in which the correction probability acquisition unit 5 acquires the correction probability D21 based on the difference between the existence probability of the second frame and the existence probability of the first frame.
- in step S5, the determination unit 3 determines whether or not the correction probability Pc of the correction probability D21 is greater than the threshold Th2. If the correction probability Pc is greater than the threshold Th2, it is determined that the object is approaching (step S6-1).
- consider a case where the existence probability of the object in the first frame is 1% and that in the second frame is 6%, and a case where the existence probability in the first frame is 20% and that in the second frame is 25%. The difference itself is the same in both cases.
- the magnitude of the existence probability differs between the former case and the latter case. Therefore, even if the difference itself is the same between the former case and the latter case, it may be preferable to differentiate the determination contents. For example, in the former case, since the existence probability itself is very low, an error may have increased the existence probability. On the other hand, in the latter case, since the existence probability is relatively high, the increase in the existence probability is not an error but is likely to be significant.
- Therefore, the predetermined relational expression of the correction probability acquisition unit 5 according to Modification 1 is preferably set so that, even if the difference in existence probability is the same, the determination contents differ when the magnitude of the existence probability itself differs.
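One way to make the relational expression sensitive to the magnitude of the existence probability, as Modification 1 suggests, is sketched below; scaling the shift by the second-frame probability is purely an illustrative assumption.

```python
def correction_probability_scaled(p1: float, p2: float, k: float = 1.0) -> float:
    """A magnitude-aware variant (an illustrative assumption): the same
    absolute difference produces a smaller correction when the probabilities
    themselves are small, by scaling the shift with the second-frame value."""
    pc = p2 + k * (p2 - p1) * p2
    return max(0.0, min(1.0, pc))

# 1% -> 6%: shift = 0.05 * 0.06 = 0.003 (tiny; plausibly noise)
# 20% -> 25%: shift = 0.05 * 0.25 = 0.0125 (larger; plausibly significant)
```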
- The determination unit 3 may be configured to determine the situation of the object based on the correction probability Pc when both of the existence probabilities of the first and second frames are lower than a predetermined threshold (an example of the second threshold). In this case, the object is determined to be approaching when the correction probability Pc is higher than that threshold. Further, when at least one of the existence probabilities of the first and second frames is higher than the threshold (an example of the second threshold), the determination unit 3 may be configured to determine the situation of the object without using the correction probability Pc. By deciding whether or not to use the correction probability according to the magnitude of the existence probabilities of the first and second frames in this way, appropriate situation determination can be realized according to the scene.
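The gating described above can be sketched as follows; the threshold values and the inner Pc formula are hypothetical, chosen only to illustrate the control flow.

```python
def determine_situation(p1: float, p2: float,
                        th2: float = 0.5, th1: float = 0.05) -> bool:
    """Decide whether the object is approaching.

    Use the correction probability Pc only when both frame existence
    probabilities are below the second threshold th2; otherwise fall back
    to the plain difference test (all values are hypothetical)."""
    if p1 < th2 and p2 < th2:
        pc = min(1.0, max(0.0, p2 + 0.5 * (p2 - p1)))  # assumed Pc formula
        return pc > th2
    return (p2 - p1) > th1
```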
- For example, when the buoy ob4 exists near the onboard ship, the existence probability of the buoy ob4 is quite high. Thus, when the existence probability is higher than a predetermined threshold, it can be assumed that the object is already near the onboard ship.
- In such a case, the determination using the correction probability (steps S1 to S5-2 in FIG. 6) is preferably not executed. For example, a step of comparing the existence probability with the threshold may be added before step S1 in the flowchart shown in FIG. 6: if the existence probability is less than the threshold, the process proceeds to step S1, and if the existence probability is equal to or greater than the threshold, approach determination using a known method may be performed instead.
- As such a known method, approach determination can be performed based on, for example, changes in the size of an object or changes in the coordinates of an object.
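A minimal sketch of such a known size-based method follows; the growth factor is a hypothetical parameter, not a value from the text.

```python
def approaching_by_size(w1: float, h1: float,
                        w2: float, h2: float,
                        growth: float = 1.1) -> bool:
    """Known-method fallback mentioned above: an object whose bounding box
    area grows between the first and second frames is treated as approaching
    (the growth factor 1.1 is an assumed parameter)."""
    return (w2 * h2) > growth * (w1 * h1)
```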
- For example, suppose the ship ob1, the buoy ob4, and the ship ob5 are contained in the frame. These objects are distinguished in step S2 by whether the distance between their coordinates is equal to or less than the threshold. That is, when the flowchart shown in FIG. 6 is executed for the ship ob1, the coordinates d2 of two other objects (the buoy ob4 and the ship ob5) exist in addition to the coordinates d2 of the ship ob1.
- By the distance determination in step S2, it is possible to avoid misjudging the ship ob1 in the second frame as the buoy ob4 or the ship ob5. The same applies to the buoy ob4 and the ship ob5: erroneous determination of the buoy ob4 as the ship ob1 or the ship ob5, or of the ship ob5 as the ship ob1 or the buoy ob4, can likewise be avoided.
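The identity check of step S2 can be sketched as a nearest-neighbor matching on coordinates; the data layout and the distance threshold are assumptions introduced for illustration.

```python
import math

def match_objects(first_frame: dict, second_frame: dict,
                  max_dist: float = 50.0) -> list:
    """Pair each object in the second frame with the nearest first-frame
    object whose coordinate distance is at or below the threshold, so that
    ship ob1 is not confused with buoy ob4 or ship ob5 (frame dicts map an
    object label to its (x, z) coordinates; both are assumed layouts)."""
    pairs = []
    for name2, (x2, z2) in second_frame.items():
        best = None
        for name1, (x1, z1) in first_frame.items():
            d = math.hypot(x2 - x1, z2 - z1)
            if d <= max_dist and (best is None or d < best[1]):
                best = (name1, d)
        if best is not None:
            pairs.append((best[0], name2))
    return pairs
```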
- In the embodiment, the information processing device 100 is mounted on a ship and determines the situation of whether or not an obstacle is approaching the ship, but the situation determination is not limited to this. The situation determination may be a determination of a change in the color of an object; for example, it may be applied to changes in the color of traffic lights or changes in the color of leaves.
- In Modification 3, a traffic light will be used as an example, and the information processing device 100 is described as being mounted in a vehicle. Furthermore, a situation in which the color of the traffic light changes from red (the color indicating stop) to blue (the color indicating that starting is permitted) while the vehicle is stopped will be described as an example.
- In the embodiment, the existence probability and the type of object were handled as separate data. Since the situation determination in Modification 3 concerns a change in the color of the traffic light, the probability acquisition unit 2 acquires the existence probability d1 and the coordinates d2 as the output D2, as shown in FIG. 11A. Then, as shown in FIG. 11B, the existence probability d1 is structured to indicate the probability that each color is present.
- In Modification 3, the object is a traffic light, and the objects are distinguished according to the lighting color.
- The first frame D11 of the image D1 shows a traffic light ob61 lit in red, a road ob7, and a sidewalk ob8.
- The second frame D12 of the image D1 shows a traffic light ob62 lit in blue, a road ob7, and a sidewalk ob8.
- Steps S1 and S2 are the same as in FIG. 6, so the description is omitted.
- Step S3 Determination of Color Type
- In step S3, the determination unit 3 determines whether or not the color of the traffic light has changed between the first and second frames based on the existence probability d1. In making this determination, the determination unit 3 first specifies the color of the traffic light in each of the first and second frames based on the existence probability d1. That is, for each of the first and second frames, the determination unit 3 identifies the candidate with the highest probability among the color candidates as the color of the traffic light.
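The color identification and change detection can be sketched as follows; the color keys and the use of a plain dictionary are illustrative assumptions.

```python
def identify_color(color_probs: dict) -> str:
    """Pick the lighting-color candidate with the highest probability, as the
    determination unit does for each frame (dictionary keys are assumed)."""
    return max(color_probs, key=color_probs.get)

def color_changed_red_to_blue(first_probs: dict, second_probs: dict) -> bool:
    # Step S3: the situation of interest is a change from red to blue.
    return (identify_color(first_probs) == "red"
            and identify_color(second_probs) == "blue")
```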
- When the determination unit 3 determines that the color of the traffic light has changed from red to blue, the process proceeds to step S4-1.
- If the determination unit 3 determines that the color of the traffic light has not changed, or that the color has changed but not from red to blue, the process proceeds to step S4-2.
- Step S4-1 Change from Red to Blue
- In step S4-1, the determination unit 3 generates a determination result D3 representing the situation in which the color of the traffic light has changed from red to blue. Accordingly, the information processing device 100 can inform the passenger that the vehicle may start. Further, when the vehicle is automatically driven, the information processing device 100 can output, to the control device of the vehicle, information indicating that the vehicle may proceed.
- Step S4-2 No Change from Red to Blue
- In step S4-2, the determination unit 3 generates a determination result D3 representing the situation in which the color of the traffic light has not changed from red to blue.
- Accordingly, the information processing device 100 can inform the passenger that the vehicle needs to be kept stopped. Further, in the case of automatic driving, the information processing device 100 can output, to the control device of the vehicle, information indicating that the vehicle needs to remain stopped.
- Modification 4 Emotion analysis using voice as input
- The situation determination in the embodiment and Modifications 1 to 3 concerned the movement of tangible objects and changes in the color of tangible objects, but the situation determination is not limited to this. The information processing apparatus 100 can also be applied to, for example, emotion analysis using voice as input.
- The situation determination in Modification 4 corresponds to determination of the degree of emotion. The degree of emotion ranges, for example, from a minimum of 0% to a maximum of 100%, with a higher value representing a better impression. Here, the emotion is an example of a target, and the degree of emotion is an example of the situation probability.
- In Modification 4, the above-mentioned situation probability is used instead of the existence probability. The concept of situation probability includes the concept of the existence probability in the embodiment and Modifications 1 to 3; that is, the existence probability is one aspect of the situation probability. The situation probability is a probability representing the situation of any target, not limited to tangible objects.
- The information processing apparatus 100 according to Modification 4 includes a voice acquisition unit 1t instead of the image acquisition unit 1 according to the embodiment. The voice acquisition unit 1t is configured to be able to acquire voice D1t (for example, words) from a voice input device 20t such as a microphone. The probability acquisition unit 2 is configured to output the situation probability (degree of emotion) as the output D2 when the voice D1t is input from the voice acquisition unit 1t.
- The situation probability is data specifying the value of the situation probability of the target at a certain timing. For example, assume that when voice data at a first timing is input to the probability acquisition unit 2, the probability acquisition unit 2 outputs data specifying a situation probability of 10% (an example of the first situation probability), and that when voice data at a second timing, later than the first timing, is input, the probability acquisition unit 2 outputs data specifying a situation probability of 20% (an example of the second situation probability).
- The determination unit 3 has a function of calculating the difference between situation probabilities. Specifically, the determination unit 3 is configured to determine that the emotion has improved when the value (difference) obtained by subtracting the situation probability at the first timing (10%) from the situation probability at the second timing (20%) is greater than a predetermined threshold value (for example, 5%). In the example of Modification 4, the difference is 10%, which is larger than the threshold, so the determination unit 3 determines that the emotion has improved.
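The determination in Modification 4 reduces to a simple difference test, sketched below with the text's example values (10% to 20% against a 5% threshold); the function name is an assumption.

```python
def emotion_improved(p_t1: float, p_t2: float, threshold: float = 0.05) -> bool:
    """Judge that the emotion has improved when the situation probability at
    the later timing exceeds the earlier one by more than the threshold
    (5% in the text's example)."""
    return (p_t2 - p_t1) > threshold
```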
- In Modification 4, the correction probability acquisition unit 5 may be configured to be able to acquire the correction probability based on the difference between the situation probabilities at the first and second timings. The correction probability is data in which the value of the correction probability of the target is specified; it is a probability representing the situation of the target (the emotion in Modification 4) between the first timing and the second timing. For example, when the situation probability at the second timing is higher than the situation probability at the first timing, the correction probability acquired by the correction probability acquisition unit 5 is set higher than the situation probability at the second timing. Conversely, when the situation probability at the first timing is higher than that at the second timing, the correction probability acquired by the correction probability acquisition unit 5 is set lower than the situation probability at the second timing.
Description
The present invention relates to an information processing device, an information processing method, and a computer program.
Information processing devices have various functions; for example, some information processing devices are used to grasp the situation of a target (including not only tangible objects but also intangible ones). Examples of such information processing devices include devices that can determine whether objects are likely to collide by acquiring the distance between objects (an example of targets), and devices that can estimate the situation of a human emotion (another example of a target). For example, Patent Literature 1 proposes an information processing apparatus capable of obtaining biological information from a sensor and estimating human emotions.
It is desired that the situation of a target can be properly grasped regardless of the scene in which the information processing device is used.
The present invention has been made in view of such circumstances, and aims to provide an information processing device, an information processing method, and a computer program that can appropriately grasp the situation of a target.
According to the present invention, there is provided an information processing device comprising an image acquisition unit, a probability acquisition unit, and a determination unit, wherein the image acquisition unit is configured to be capable of acquiring an image, the image includes a plurality of frames with different timings, the plurality of frames includes a first frame and a second frame whose timing is later than the first frame, the probability acquisition unit is configured to be capable of acquiring an existence probability based on the image, the existence probability is a probability that an object exists in the frame, and the determination unit is configured to determine the situation of the object based on the existence probabilities of the first and second frames.
In the present invention, the determination unit is configured to determine the situation of the object based on the existence probabilities of the first and second frames, so that the situation of the target (object) can be appropriately grasped.
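The core determination described above can be sketched in Python as follows; the function name, the concrete threshold value, and the use of fractional probabilities are illustrative assumptions, not details fixed by the disclosure.

```python
def is_approaching(p_first: float, p_second: float,
                   threshold: float = 0.05) -> bool:
    """Judge that the object is approaching when the existence probability
    rises from the first frame to the later second frame by more than a
    predetermined first threshold (the 0.05 default is hypothetical)."""
    return (p_second - p_first) > threshold

# A rising existence probability (e.g. 0.10 -> 0.20) suggests approach.
```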
Various embodiments of the present invention are illustrated below. The embodiments shown below can be combined with each other.
Preferably, the probability acquisition unit is configured to be capable of further acquiring coordinates of the object based on the image, and the determination unit compares the existence probabilities and determines the situation of the object when it determines that the distance between the coordinates of the object in the first and second frames is equal to or less than a predetermined distance.
Preferably, the probability acquisition unit is configured to be capable of further acquiring type data based on the image, the type data being data specifying the type of the object, and the determination unit compares the existence probabilities and determines the situation of the object when it determines that the type of the object is the same in the first and second frames.
Preferably, the type data specifies a type identification probability for each candidate for the type of the object, and the determination unit is configured to identify, as the type of the object, the candidate with the highest type identification probability among the candidates.
Preferably, the determination unit compares the existence probabilities and determines the situation of the object when it determines that the object is the same in the first and second frames.
Preferably, the situation corresponds to movement of the object, and the determination unit is configured to determine that the object is approaching when it determines that the difference between the existence probability of the second frame and the existence probability of the first frame is greater than a predetermined first threshold.
Preferably, the object includes a ship, and the information processing device is configured to determine the situation of relative movement of the ship.
Preferably, the information processing device further comprises a correction probability acquisition unit configured to be capable of acquiring a correction probability based on the existence probability of the second frame and the existence probability of the first frame, the correction probability being a probability that the object exists in the frame between the timing of the first frame and the timing of the second frame, and the determination unit is configured to determine the situation of the object based on the correction probability.
Preferably, when the existence probability of the second frame is higher than the existence probability of the first frame, the correction probability acquired by the correction probability acquisition unit is higher than the existence probability of the second frame.
Preferably, when the existence probability of the first frame is higher than the existence probability of the second frame, the correction probability acquired by the correction probability acquisition unit is lower than the existence probability of the second frame.
Preferably, the determination unit is configured to determine the situation of the object based on the correction probability when both of the existence probabilities of the first and second frames are lower than a predetermined second threshold.
Preferably, the correction probability is higher than the second threshold.
Preferably, the determination unit is configured to determine the situation of the object without using the correction probability when at least one of the existence probabilities of the first and second frames is higher than the second threshold.
According to another aspect of an embodiment of the present invention, there is provided an information processing method comprising an image acquisition step, a probability acquisition step, and a determination step, wherein the image acquisition step acquires an image, the image includes a plurality of frames with different timings, the plurality of frames includes a first frame and a second frame whose timing is later than the first frame, the probability acquisition step acquires an existence probability based on the image, the existence probability being the probability that an object exists in the frame, and the determination step determines the situation of the object based on the existence probabilities of the first and second frames.
According to another aspect of an embodiment of the present invention, there is provided a computer program for executing an information processing method comprising an image acquisition step, a probability acquisition step, and a determination step, wherein the image acquisition step acquires an image, the image includes a plurality of frames with different timings, the plurality of frames includes a first frame and a second frame whose timing is later than the first frame, the probability acquisition step acquires an existence probability based on the image, the existence probability being the probability that an object exists in the frame, and the determination step determines the situation of the object based on the existence probabilities of the first and second frames.
According to another aspect of an embodiment of the present invention, there is provided an information processing device comprising a probability acquisition unit and a correction probability acquisition unit, wherein the probability acquisition unit is configured to be capable of acquiring a first situation probability and a second situation probability whose timing is later than that of the first situation probability, the first and second situation probabilities being probabilities representing the situation of a target, and the correction probability acquisition unit is configured to be capable of acquiring, based on the first and second situation probabilities, a correction probability representing the situation of the target between the timing at which the first situation probability is acquired and the timing at which the second situation probability is acquired.
Preferably, when the second situation probability is higher than the first situation probability, the correction probability acquired by the correction probability acquisition unit is higher than the second situation probability.
Preferably, when the first situation probability is higher than the second situation probability, the correction probability acquired by the correction probability acquisition unit is lower than the second situation probability.
Embodiments of the present invention will be described below with reference to the drawings. The various features shown in the embodiments below can be combined with each other. In addition, an invention is established independently for each feature.
1 Description of Overall Configuration
The overall configuration of the information processing apparatus 100 according to the embodiment will be described. In the embodiment, the information processing device 100 is described as being mounted on a ship (the onboard ship obt shown in FIG. 4). The information processing device 100 has a function of determining at an early stage whether or not a predetermined target on the sea is approaching the onboard ship obt. Here, in the embodiment, the target is an object (tangible object). Whether or not the object moves under its own power is not limited. Also, the object is not limited to one floating on the sea and may be one that is drifting or flying. The object is, for example, a ship, a buoy, or a yacht. Since some ships cannot easily make sharp turns or rapid accelerations, knowing at an early stage whether an object is approaching the ship is useful for avoiding a collision between the ship and the object.
In addition to the information processing device 100, the imaging device 20 and the external device 21 shown in FIG. 1 are mounted on the onboard ship obt. The imaging device 20 is configured to output data (image D1) obtained by imaging the surroundings (the front, in the embodiment) of the onboard ship obt to the information processing device 100. The imaging device 20 is attached to the onboard ship obt. The external device 21 is composed of, for example, a speaker or a display, and has a function of informing passengers of the onboard ship obt whether or not an object is approaching by means of voice, text, or the like. The information processing apparatus 100 can determine whether an object is approaching based on a plurality of frames of the image D1 and output the determination result to the external device 21. As a result, the information processing apparatus 100 can prompt the passengers of the onboard ship obt to take avoidance action at an early stage.
The information processing device 100 includes an image acquisition unit 1, a probability acquisition unit 2, a determination unit 3, and an output unit 4. Each component of the information processing apparatus 100 may be realized by software or by hardware. When realized by software, various functions can be realized by a CPU executing a computer program. The program may be stored in a built-in storage unit or in a computer-readable non-transitory recording medium. Alternatively, a program stored in an external storage unit may be read out and executed by so-called cloud computing. When realized by hardware, the components can be implemented by various circuits such as an ASIC, FPGA, or DRP. This embodiment handles various kinds of information and the concepts encompassing it; these are represented, as binary bit aggregates composed of 0s and 1s, by high and low signal values, and communication and computation can be executed according to the software or hardware aspects described above.
2 Detailed Configuration Description of Information Processing Apparatus 100
2-1 Image Acquisition Unit 1
As shown in FIG. 1, the image acquisition unit 1 is configured to be capable of acquiring the image D1 from the imaging device 20. The processing performed by the image acquisition unit 1 is an example of the image acquisition step. Here, in the embodiment, the image D1 has a plurality of frames. The plurality of frames are data for which time-series information is specified. Specifically, as shown in FIGS. 5A and 5B, the plurality of frames includes a first frame D11 and a second frame D12 whose timing is later than the first frame. The timing (time) of the first frame is defined as the first timing, and the timing (time) of the second frame is defined as the second timing. In the example frames shown in FIGS. 5A and 5B, the ship ob1 is approaching the onboard ship obt. Therefore, in the frame shown in FIG. 5B, the ship ob1 appears larger and clearer than in the frame shown in FIG. 5A.
A configuration example of the image D1 will be described with reference to FIG. 3. In the embodiment, the frame of the image D1 shows the ship ob1, the sky ob2, and the sea ob3. In the embodiment, the object refers to a target that can be an obstacle to the progress of the ship, such as a buoy or a yacht, in addition to the ship ob1.
The ship ob1 is located in front of the onboard ship obt (imaging device 20). As shown in FIG. 4, when Cartesian coordinates are defined centered on the onboard ship obt, the ship ob1 is positioned on the y-axis. In FIG. 4, the y-axis is parallel to the traveling direction of the onboard ship obt, and the x-axis is orthogonal to the y-axis. The ship ob1 is an object approaching the onboard ship obt; specifically, the ship ob1 is moving in the -y-axis direction.
In the embodiment, the determination unit 3, which will be described later, performs situation determination, such as whether or not an object is approaching, based on the difference between the situation probabilities of the object in two frames. The situation probability is a probability representing the situation of a target (an object in the embodiment), and the situation probability in the embodiment is the existence probability of the object. The existence probability is the probability that a predetermined object exists in a frame of the image.
The difference in timing between the first frame and the second frame is not particularly limited, but is preferably set according to the platform on which the information processing apparatus 100 is mounted. For example, when the information processing apparatus 100 is mounted on a ship as in the embodiment, the moving speed of a potential obstacle is lower than in the case of a vehicle or an airplane. Therefore, the difference between the first timing and the second timing is, for example, 10 seconds, 20 seconds, 30 seconds, 40 seconds, 50 seconds, 1 minute, 2 minutes, 3 minutes, 4 minutes, 5 minutes, 6 minutes, 7 minutes, 8 minutes, 9 minutes, or 10 minutes. The difference between the first timing and the second timing may also be expressed as a range between any two of the values listed here.
2-2 Probability acquisition unit 2
As shown in FIG. 2A, the probability acquisition unit 2 acquires an output D2, including the existence probability d1, based on the image D1, and outputs D2 to the determination unit 3. The processing performed by the probability acquisition unit 2 is an example of the probability acquisition step. The output D2 comprises the existence probability d1 (existence probability data), the coordinates d2 (coordinate data), and the type data d3. The existence probability d1 is data in which the situation probability (the existence probability in the embodiment) is specified, that is, situation probability data. Configuration examples of these data are described below.
The existence probability d1 is data specifying the value of the existence probability of an object in a frame of the image D1, that is, the probability that a predetermined object exists in that frame.
The coordinates d2 are data specifying the coordinates of the object in the frame of the image D1. In the embodiment, the coordinates d2 are expressed in an x-z coordinate system. The coordinates d2 also include data specifying the width W and the height H of the object in the frame.
The type data d3 is data for identifying the type of the object in the frame of the image D1: for each object in the frame, it specifies a probability for each predetermined type. For example, as shown in FIG. 2B, the predetermined types in the embodiment include ship, buoy, and yacht, and the objects in the frame of the image D1 include ships, buoys, and yachts. For the object corresponding to the ship ob1, for instance, the type data d3 specifies the probabilities of all the predetermined candidate types, such as the probability of being a ship, the probability of being a buoy, and the probability of being a yacht.
The output D2 described above (the existence probability d1, the coordinates d2, and the type data d3) can be calculated with a learning model that takes the image D1 as input and produces the output D2. A learning model is a model trained on a large amount of teacher data (pairs of known input data and correct-answer data) so that it can predict future outputs. In this embodiment, the input data of the teacher data are image data (frames) capturing objects such as ships at sea, and the correct-answer data include the existence probability of the object shown in the image data, the coordinates of the object, the width and height of the object, and the type of the object. The learning model (machine learning model) can employ various object detection algorithms, such as YOLO (You Only Look Once) or SSD (Single Shot Detector).
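As a concrete illustration, the per-object content of the output D2 can be modeled as a small record. The class and field names below are illustrative assumptions, not terms defined in the description, and the probability values are arbitrary examples.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Detection:
    """One detected object in one frame of the image D1 (illustrative sketch)."""
    existence: float                    # existence probability d1, in percent
    x: float                            # coordinates d2 (x-z coordinate system)
    z: float
    width: float                        # width W of the object in the frame
    height: float                       # height H of the object in the frame
    type_probs: Dict[str, float] = field(default_factory=dict)  # type data d3

# Example detection for the ship ob1: the "ship" candidate dominates
# the type identification probabilities.
ob1 = Detection(existence=62.0, x=0.0, z=120.0, width=40.0, height=25.0,
                type_probs={"ship": 0.90, "buoy": 0.03, "yacht": 0.07})
```

A real detector such as YOLO or SSD would emit one such record per object per frame; the determination unit 3 then only needs the fields shown here.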
2-3 Determination unit 3
The determination unit 3 has a function of determining the presence or absence of the coordinates d2 (hereinafter also called the first function), a function of determining whether the inter-coordinate distance between objects in different frames is equal to or less than a predetermined distance (the second function), a function of determining whether the types of the objects in different frames are the same (the third function), and a function of comparing the existence probabilities of the objects in different frames (the fourth function). The determination unit 3 is configured to output a determination result indicating the situation of the object.
First, the first function is described. The determination unit 3 determines whether the coordinates d2 are present for the second frame (the chronologically later frame). If the existence probability for the second frame is 0%, the output D2 does not include the coordinates d2; if the existence probability is higher than 0%, the output D2 includes the coordinates d2. In the embodiment, therefore, the presence of the coordinates d2 in the output D2 is equivalent to the existence probability being higher than 0%. The first function may be replaced by a threshold determination: the determination unit 3 may instead determine whether the existence probability is higher than a predetermined threshold (for example, 5%). In other words, when the existence probability is quite low, the information processing apparatus 100 may treat the object as not present in the frame.
Next, the second function is described. The image acquisition unit 1 acquires the coordinates of the object in the first frame (the chronologically earlier frame) and in the second frame (the chronologically later frame), and outputs them to the determination unit 3 as part of the output D2. The determination unit 3 determines whether the inter-coordinate distance between the object's coordinates in the first frame and its coordinates in the second frame is equal to or less than a predetermined distance. The information processing apparatus 100 uses this inter-coordinate distance as a criterion for judging whether the object in each frame is the same: if the distance is within the predetermined distance, it is more likely that the object in the first frame and the object in the second frame are the same.
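The second function can be sketched as a simple distance test. The description does not fix the metric or the predetermined distance, so the Euclidean distance and the parameter below are assumptions for illustration.

```python
import math

def same_position(coords1, coords2, max_distance):
    """Second function (sketch): True when the inter-coordinate distance
    between the coordinates d2 in the first frame (coords1) and in the
    second frame (coords2) is within the predetermined distance.
    Coordinates are (x, z) pairs in the frame coordinate system."""
    dx = coords2[0] - coords1[0]
    dz = coords2[1] - coords1[1]
    return math.hypot(dx, dz) <= max_distance
```

For example, detections at (0, 0) and (3, 4) are 5 units apart, so they pass the test only when the predetermined distance is at least 5.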
Next, the third function is described. The image acquisition unit 1 acquires the type data d3 of the object in the first frame (the chronologically earlier frame) and in the second frame (the chronologically later frame), and outputs them to the determination unit 3 as part of the output D2.
The determination unit 3 identifies the type of the object using the type data d3 of each frame. Specifically, among the type candidates for the object in the first frame, it identifies the candidate with the highest probability (type identification probability) as the object's type. For example, in the image D1 shown in FIG. 3, the ship is clearly visible, so the probability of being a ship is calculated to be high while the probabilities of being a buoy or a yacht are calculated to be low; comparing these probabilities, the determination unit 3 identifies the highest candidate, ship, as the type of the object. It identifies the type of the object in the second frame in the same way, that is, as the type candidate with the highest probability in the second frame.
The determination unit 3 then determines whether the type of the object in the first frame is the same as the type of the object in the second frame. The information processing apparatus 100 thus uses the object's type as a criterion for judging whether the object in each frame is the same: if the types match, it is more likely that the object in the first frame and the object in the second frame are the same.
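The type identification of the third function is an argmax over the type identification probabilities. A minimal sketch, with candidate names taken from the example of FIG. 2B:

```python
def identify_type(type_probs):
    """Third function (sketch): identify the candidate with the highest
    type identification probability as the object's type."""
    return max(type_probs, key=type_probs.get)

def same_type(type_probs_1st, type_probs_2nd):
    """True when the first and second frames identify the same type."""
    return identify_type(type_probs_1st) == identify_type(type_probs_2nd)
```

Note that the raw probabilities may differ between frames (0.9 vs. 0.6 for "ship", say) while the identified type, and hence this check, stays the same.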
Next, the fourth function is described. The image acquisition unit 1 acquires the existence probability of the object in the first frame (the chronologically earlier frame) and in the second frame (the chronologically later frame), and outputs them to the determination unit 3 as part of the output D2. The determination unit 3 has a function of comparing the existence probability of the object in the first frame with that in the second frame. Specifically, when the value (difference) obtained by subtracting the existence probability of the first frame from that of the second frame is larger than a predetermined threshold Th1 (an example of the first threshold), the determination unit 3 determines that the object is approaching. In other words, the determination unit 3 determines that the object is approaching when the existence probability has risen by more than the predetermined threshold Th1 from the first frame to the second frame. In this way, the information processing apparatus 100 judges the situation not from an existence probability alone but from the difference between existence probabilities. The determination of the determination unit 3 corresponds to the determination result D3.
The value of the threshold Th1 is not particularly limited, but it can be set, for example, to any value from 10 to 50, and can be chosen appropriately according to the vehicle on which the information processing apparatus 100 is mounted.
2-4 Output unit 4
The output unit 4 is configured to output the determination result D3 to an external device 21. The external device 21 is not limited to equipment such as a display mounted on the own ship obt; the determination result may also be output to a control center that supervises the operation of the own ship obt.
3 Description of Operation
The operation of the information processing apparatus 100 is described with reference to FIG. 6. The control flowchart of FIG. 6 can be executed once frames at different timings have been acquired.
Step S1: Determining the presence or absence of the coordinates d2 (first function)
The determination unit 3 determines whether the coordinates d2 are present for the second frame (the chronologically later frame). If they are present, the process proceeds to step S2; if not, it proceeds to step S5-2.
Step S2: Threshold determination of the inter-coordinate distance (second function)
The determination unit 3 determines whether the distance between the coordinates of the object in the first and second frames is equal to or less than a predetermined distance. If it is, the two objects in the first and second frames may be the same, so the process proceeds to step S3; otherwise it proceeds to step S5-2.
Step S3: Determining whether the types are the same (third function)
The determination unit 3 determines whether the types of the objects in the first and second frames are the same. Before making this determination, the determination unit 3 identifies the type of the object in each frame: for each of the first and second frames, it identifies the type candidate with the highest probability (type identification probability) as the object's type. If the types are the same, it is even more likely that the two objects are the same, so the process proceeds to step S4; otherwise it proceeds to step S5-2.
Step S4: Comparing the existence probabilities (fourth function)
When steps S1 to S3 have all been passed, the information processing apparatus 100 according to the embodiment regards the objects in the first and second frames as the same; steps S1 to S3 can thus be regarded as steps for determining whether the objects in the first and second frames are the same. The determination unit 3 obtains the difference between the existence probabilities of this object regarded as the same (the value obtained by subtracting the existence probability in the first frame from that in the second frame) and determines whether the difference is larger than the predetermined threshold Th1. If it is larger than Th1, the process proceeds to step S5-1; otherwise it proceeds to step S5-2. Step S4 is an example of the determination step.
Step S5-1: Approaching
The determination unit 3 generates a determination result D3 indicating that the object is approaching.
Step S5-2: Not approaching
The determination unit 3 generates a determination result D3 indicating that the object is not approaching.
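The flow of steps S1 to S5-2 can be sketched end to end as follows. The per-frame records, the predetermined distance, and the threshold Th1 are illustrative assumptions; each early return corresponds to the branch to step S5-2.

```python
import math

def judge_approaching(frame1, frame2, max_distance=50.0, th1=20.0):
    """Sketch of the flowchart of FIG. 6 (steps S1 to S5-2). frame1 and
    frame2 describe one object in the first (earlier) and second (later)
    frames, as dicts with keys 'coords' ((x, z) or None), 'existence'
    (percent) and 'types' (type identification probabilities)."""
    # S1: are the coordinates d2 present for the second frame?
    if frame2["coords"] is None:
        return False                      # S5-2: not approaching
    # S2: is the inter-coordinate distance within the predetermined distance?
    (x1, z1), (x2, z2) = frame1["coords"], frame2["coords"]
    if math.hypot(x2 - x1, z2 - z1) > max_distance:
        return False                      # S5-2
    # S3: is the identified type the same in both frames?
    if max(frame1["types"], key=frame1["types"].get) != \
       max(frame2["types"], key=frame2["types"].get):
        return False                      # S5-2
    # S4: is the existence-probability difference larger than Th1?
    return (frame2["existence"] - frame1["existence"]) > th1  # S5-1 / S5-2
```

With an existence probability rising from 30% to 55% and matching positions and types, the difference of 25 exceeds the assumed Th1 of 20, so the result corresponds to step S5-1.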
4 Effects of the Embodiment
Conventionally, information processing apparatuses are known that execute a specific control (for example, control of an operation for avoiding an object) when the probability that something is a specific object is determined to be equal to or greater than a predetermined threshold. In such an apparatus, when the value of that probability fluctuates rapidly, the specific control is more likely to fail to execute appropriately. The same is true for vehicles, such as large ships, that cannot quickly increase speed or change course. In other words, depending on the scene in which it is used, a conventional information processing apparatus may fail to grasp the situation of the target (object) appropriately and, as a result, may fail to execute the specific control appropriately.
In the information processing apparatus 100 according to this embodiment, the determination unit 3 determines the situation of the object based on the existence probabilities of the first and second frames. That is, instead of using a single probability, as a conventional apparatus does, to determine whether an avoidance operation is required, the information processing apparatus 100 uses the difference between existence probabilities. The information processing apparatus 100 can therefore determine earlier than a conventional apparatus whether a situation requiring the specific control has arisen, and can appropriately exercise the functions of the various devices (for example, the control function for the operation of avoiding an object).
5 Modifications
5-1 Modification 1: Correction probability acquisition unit 5
In the embodiment, the situation determination is based on the difference between the existence probabilities of the first and second frames, but the invention is not limited to this. As shown in FIG. 7, the information processing apparatus 100 may further include a correction probability acquisition unit 5.
The correction probability acquisition unit 5 is configured to acquire a correction probability D21 based on the difference between the existence probability of the second frame and that of the first frame. The correction probability D21 is data specifying the value of the correction probability, which is the probability that the object exists in the frame between the timing of the first frame and that of the second frame. Whereas the embodiment compares the difference between existence probabilities directly with a threshold, this modification differs in converting the difference back into a probability (the correction probability Pc).
The correction probability Pc can be obtained, for example, from a predetermined relational expression: once the existence probabilities of the first and second frames are determined, the correction probability Pc is uniquely determined.
This relational expression is set so that, for example, when the existence probability of the second frame is higher than that of the first frame, the correction probability is higher than the existence probability of the second frame. That is, when the existence probability is rising, the object is likely to be approaching, so the correction probability Pc is made higher than the existence probability of the second frame.
Conversely, when the existence probability is falling, the object can be regarded as not approaching. The relational expression is therefore set so that, when the existence probability of the first frame is higher than that of the second frame, the correction probability Pc is lower than the existence probability of the second frame.
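One relational expression satisfying both conditions above is a linear extrapolation of the difference. This is only an illustrative choice, as the description does not specify the expression; the gain is an assumed parameter, and clipping keeps Pc a valid probability.

```python
def corrected_probability(p1, p2, gain=0.5):
    """Modification 1 (sketch): convert the existence probabilities of the
    first frame (p1) and second frame (p2), in percent, into a correction
    probability Pc. When p2 > p1 the result exceeds p2 (rising probability,
    object likely approaching); when p1 > p2 it falls below p2."""
    pc = p2 + gain * (p2 - p1)
    return max(0.0, min(100.0, pc))       # keep Pc within [0, 100]
```

A gain that itself grows with the magnitude of the probabilities would make equal differences count for more at higher probability levels.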
As shown in FIG. 8, the flowchart of this modification differs from that of the embodiment in including steps S4 and S5. In step S4, the correction probability acquisition unit 5 acquires the correction probability D21 based on the difference between the existence probability of the second frame and that of the first frame. In step S5, the determination unit 3 determines whether the correction probability Pc of the correction probability D21 is larger than a threshold Th2. If the correction probability Pc is larger than the threshold Th2, the object is determined to be approaching (step S6-1).
Note that the difference itself is the same when, for example, the existence probability of the object is 1% in the first frame and 6% in the second frame as when it is 20% in the first frame and 25% in the second frame. However, the magnitudes of the existence probabilities differ between the two cases, so even though the difference itself is the same, it may be preferable to differentiate the determination. In the former case, the existence probability itself is very low, so the increase may simply be due to error; in the latter case, the existence probability is relatively high, so the increase is unlikely to be an error and is probably significant. The predetermined relational expression of the correction probability acquisition unit 5 according to Modification 1 is therefore preferably set so that, even when the difference between the existence probabilities is the same, the determination differs when the magnitudes of the existence probabilities themselves differ.
It is also preferable to be able to select whether the situation determination uses the difference or the correction probability.
For example, the determination unit 3 may be configured to determine the situation of the object based on the correction probability Pc when both the existence probabilities of the first and second frames are lower than a predetermined threshold (an example of the second threshold). In this case, the correction probability Pc is higher than the predetermined threshold (the second threshold).
The determination unit 3 may also be configured to determine the situation of the object without using the correction probability Pc when at least one of the existence probabilities of the first and second frames is higher than that threshold (the second threshold). Deciding whether to use the correction probability according to the magnitudes of the existence probabilities of the first and second frames can thus be expected to yield a situation determination appropriate to the scene.
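The selection between the difference test and the correction probability can be sketched as follows. Th1, Th2, and the correction rule are assumed values; the correction probability is used only when both existence probabilities fall below the second threshold Th2.

```python
def judge_situation(p1, p2, th1=20.0, th2=40.0, gain=0.5):
    """Sketch of the selection described above. p1 and p2 are the
    existence probabilities (percent) of the first and second frames."""
    if p1 < th2 and p2 < th2:
        # Both probabilities low: judge with the correction probability Pc,
        # which must exceed the second threshold for a positive judgment.
        pc = max(0.0, min(100.0, p2 + gain * (p2 - p1)))
        return pc > th2
    # Otherwise: judge with the plain difference against Th1.
    return (p2 - p1) > th1
```

With the assumed values, probabilities of 10% and 35% (both below Th2) are judged via Pc, while 30% and 60% fall back to the plain difference test.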
5-2 Modification 2: When multiple objects exist
In the embodiment, only one object to be determined (the ship ob1) is within the frame, but the invention is not limited to this; multiple objects may be within the frame. For example, as shown in FIG. 9, a buoy ob4 and a ship ob5 may appear in the frame of the image.
Because the buoy ob4 is near the own ship, its existence probability is quite high. For an object with a high existence probability, it is preferable to perform the approach determination by another known method rather than by the series of controls described in the embodiment (steps S1 to S5-2 in FIG. 6). That is, when the existence probability is higher than a predetermined threshold, the object can be regarded as inherently near the own ship, so the series of controls described in the embodiment is preferably not executed. In this case, for example, a step of determining whether the existence probability is equal to or higher than the threshold may be added before step S1 of the flowchart in FIG. 6: if the existence probability is below the threshold, the process proceeds to step S1; if it is equal to or above the threshold, an approach determination by a known method is performed, for example based on changes in the size of the object or changes in its coordinates.
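The pre-check described above can be sketched as a branch in front of step S1. The high-probability threshold and the size-growth rule standing in for the "known method" are assumptions for illustration only.

```python
def pre_check(frame1, frame2, high_threshold=80.0):
    """Modification 2 (sketch): an object whose existence probability is
    already at or above the threshold is treated as near the own ship,
    and a conventional approach judgment (here: bounding-box growth) is
    used instead of the probability-difference flow. Returns True/False
    from the conventional judgment, or None to fall through to step S1."""
    if frame2["existence"] >= high_threshold:
        area1 = frame1["width"] * frame1["height"]
        area2 = frame2["width"] * frame2["height"]
        return area2 > area1          # an approaching object grows in the frame
    return None                       # continue with the flow of FIG. 6
```

Returning None rather than a verdict keeps the two judgment paths cleanly separated: the caller runs steps S1 onward only when the pre-check declines to decide.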
The frame also contains the ship ob1 and the ship ob5, and these objects are distinguished in step S2 by whether the inter-coordinate distance is within the threshold. That is, when the flowchart of FIG. 6 is executed for the ship ob1, the coordinates d2 of two other objects (the buoy ob4 and the ship ob5) exist in addition to the coordinates d2 of the ship ob1. However, whereas the difference (inter-coordinate distance) between the coordinates d2 of the ship ob1 in the first frame and in the second frame is short, the distances from the coordinates d2 of the ship ob1 in the first frame to the coordinates d2 of the buoy ob4 or the ship ob5 in the second frame are long. Executing step S2 therefore prevents the ship ob1 in the second frame from being misidentified as the buoy ob4 or the ship ob5. The same applies to the buoy ob4 and the ship ob5: executing step S2 prevents the buoy ob4 from being misidentified as the ship ob1 or the ship ob5, and the ship ob5 from being misidentified as the ship ob1 or the buoy ob4.
5-3 Modification 3: Change in color state
In the embodiment, the information processing apparatus 100 is mounted on a ship and determines whether the ship is approaching an obstacle, but the invention is not limited to this. The situation determination may instead concern a change in the color of an object, for example a change in the color of a traffic light or a change in the color of tree leaves.
Modification 3 is described using a traffic light as an example, with the information processing apparatus 100 mounted on a vehicle. As an example, consider the situation in which, while the vehicle is stopped, the color of the traffic light changes from red (the stop color) to green (the color indicating that the vehicle may start).
In the embodiment, the existence probability and the object type were separated into distinct data in the output D2. Because the situation determination in Modification 3 concerns a change in the color of the traffic light, the probability acquisition unit 2 acquires, as the output D2, the existence probability d1 and the coordinates d2, as shown in FIG. 11A. As shown in FIG. 11B, the existence probability d1 is structured to indicate the probability that each color is present.
A configuration example of the image D1 is described with reference to FIGS. 10A and 10B. In Modification 3, the object is a traffic light, and objects are distinguished by the color that is lit. As shown in FIG. 10A, the first frame D11 of the image D1 shows a traffic light ob61 lit red, a road ob7, and a sidewalk ob8. As shown in FIG. 10B, the second frame D12 of the image D1 shows a traffic light ob62 lit green, the road ob7, and the sidewalk ob8.
Next, the operation of the information processing device 100 according to Modification 3 will be described with reference to FIG. 12. Steps S1 and S2 are the same as in FIG. 6, so their description is omitted.
Step S3: Determination of Color Type
In step S3, the determination unit 3 determines, based on the existence probability d1, whether the color of the traffic light has changed between the first and second frames. To make this determination, the determination unit 3 first identifies the color of the traffic light in each of the first and second frames based on the existence probability d1. That is, for each of the first and second frames, the determination unit 3 identifies the color candidate with the highest probability as the color type.
When the determination unit 3 determines that the color of the traffic light has changed from red to green, the process proceeds to step S4-1. When the determination unit 3 determines that the color has not changed, or that it has changed but not from red to green, the process proceeds to step S4-2.
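The selection and branching in step S3 can be sketched as follows. This is an assumed illustration, not the patent's implementation: the function names `identify_color` and `step_s3` and the step labels returned are hypothetical, but the logic mirrors the text, where the highest-probability color candidate is taken as each frame's color and the process branches on a red-to-green change.

```python
# Minimal sketch (assumed, not from the patent) of step S3: for each frame,
# pick the color candidate with the highest existence probability, then
# branch to S4-1 only when the color changed from red to green.

def identify_color(d1: dict) -> str:
    """Select the color candidate with the highest existence probability."""
    return max(d1, key=d1.get)

def step_s3(d1_frame1: dict, d1_frame2: dict) -> str:
    color1 = identify_color(d1_frame1)
    color2 = identify_color(d1_frame2)
    if color1 == "red" and color2 == "green":
        return "S4-1"  # red -> green: vehicle may start
    return "S4-2"      # no change, or a change other than red -> green

print(step_s3({"red": 0.9, "green": 0.1}, {"red": 0.2, "green": 0.8}))  # S4-1
print(step_s3({"red": 0.9, "green": 0.1}, {"red": 0.8, "green": 0.2}))  # S4-2
```

Note that a green-to-red change also falls through to S4-2, matching the text's "changed but not from red to green" case.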
Step S4-1: Change from Red to Green
The determination unit 3 generates a determination result D3 representing the situation in which the color of the traffic light has changed from red to green. The information processing device 100 can thereby inform the passengers that the vehicle may start. In the case of automated driving, the information processing device 100 can also output, to the vehicle's control device, information indicating that the vehicle may proceed.
Step S4-2: No Change from Red to Green
The determination unit 3 generates a determination result D3 representing the situation in which the color of the traffic light has not changed from red to green. The information processing device 100 can thereby inform the passengers that the vehicle needs to remain stopped. In the case of automated driving, the information processing device 100 can also output, to the vehicle's control device, information indicating that the vehicle needs to remain stopped.
5-4 Modification 4: Emotion Analysis Using Voice as Input
The situation determinations in the embodiment and Modifications 1 to 3 concerned the movement of tangible objects or changes in their color, but the determination is not limited to these. The information processing device 100 can also be applied, for example, to emotion analysis that takes voice as input. Specifically, the situation determination in Modification 4 corresponds to determining the degree of an emotion. In Modification 4, the degree of emotion ranges, for example, from a minimum of 0% to a maximum of 100%, with higher values representing a better impression.
Note that an emotion is an example of a target, and the degree of emotion is an example of a situation probability.
In Modification 4, the situation probability described above is used instead of the existence probability. The concept of the situation probability encompasses the concept of the existence probability in the embodiment and Modifications 1 to 3; in other words, the existence probability is one form of the situation probability. The situation probability is a probability representing the situation of any target, not limited to tangible objects.
As shown in FIG. 13, the information processing device 100 according to Modification 4 includes a voice acquisition unit 1t instead of the image acquisition unit 1 of the embodiment. The voice acquisition unit 1t is configured to acquire voice D1t (for example, words) from a voice input device 20t such as a microphone.
The probability acquisition unit 2 is configured to output a situation probability (the degree of emotion) as the output D2 when the voice D1t is input from the voice acquisition unit 1t. The situation probability is data in which the value of the situation probability of the target at a certain timing is specified. For example, suppose that when the voice data at a first timing is input to the probability acquisition unit 2, it outputs data specifying a situation probability of 10% (an example of the first situation probability), and that when the voice at a second timing, later than the first timing, is input, it outputs data specifying a situation probability of 20% (an example of the second situation probability).
The determination unit 3 has a function of calculating the difference between situation probabilities. Specifically, the determination unit 3 is configured to determine that the emotion has improved when the value (difference) obtained by subtracting the situation probability at the first timing (10%) from the situation probability at the second timing (20%) is greater than a predetermined threshold (for example, 5%). In this example, the difference is 10%, which is greater than the threshold, so the determination unit 3 determines that the emotion has improved.
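The determination rule above reduces to a single comparison, sketched here. The function name and the default threshold of 5% are taken from the text's worked example; everything else is an assumed illustration, not the patent's implementation.

```python
# Sketch of the determination unit 3's rule in Modification 4 (assumed):
# the emotion is judged to have improved when the situation probability
# rose by strictly more than a predetermined threshold (5% in the text).

def emotion_improved(p1: float, p2: float, threshold: float = 5.0) -> bool:
    """p1: situation probability (%) at the first timing,
    p2: situation probability (%) at the second, later timing."""
    return (p2 - p1) > threshold

# The worked example from the text: 10% at the first timing, 20% at the second.
print(emotion_improved(10.0, 20.0))  # True  (difference 10 > threshold 5)
print(emotion_improved(10.0, 14.0))  # False (difference 4 is not > 5)
```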
Modification 4 can also be combined with Modification 1. That is, the correction probability acquisition unit 5 may be configured to acquire a correction probability based on the difference between the situation probabilities at the first and second timings.
The correction probability is data in which the value of the correction probability of the target is specified. It is a probability representing the situation of the target (in Modification 4, the emotion) between the first timing and the second timing.
For example, when the situation probability at the second timing is higher than the situation probability at the first timing, the corrected situation probability acquired by the correction probability acquisition unit 5 is set to a value higher than the situation probability at the second timing. Conversely, when the situation probability at the first timing is higher than the situation probability at the second timing, the corrected situation probability is set to a value lower than the situation probability at the second timing.
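One rule consistent with the two cases above can be sketched as follows. The patent only constrains the corrected value to lie above (or below) the second situation probability; the fixed `margin` used here, and the function name, are assumptions for illustration only.

```python
# Sketch of one possible correction rule consistent with the description:
# when the situation probability rose, the corrected probability exceeds the
# second value; when it fell, the corrected probability is below it.
# The fixed margin is an illustrative assumption, not from the patent.

def corrected_probability(p1: float, p2: float, margin: float = 5.0) -> float:
    """p1, p2: situation probabilities (%) at the first and second timings."""
    if p2 > p1:
        return min(p2 + margin, 100.0)  # higher than the second probability
    if p1 > p2:
        return max(p2 - margin, 0.0)    # lower than the second probability
    return p2                           # unchanged: no correction applied

print(corrected_probability(10.0, 20.0))  # 25.0
print(corrected_probability(20.0, 10.0))  # 5.0
```

Clamping to the 0-100% range keeps the corrected value a valid degree of emotion under the text's definition.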
1: image acquisition unit, 1t: voice acquisition unit, 2: probability acquisition unit, 3: determination unit, 4: output unit, 5: correction probability acquisition unit, 20: imaging device, 20t: voice input device, 21: external device, 100: information processing device, D1: image, D11: first frame, D12: second frame, D1t: voice, D2: output, D21: correction probability, D3: determination result, H: height, Pc: correction probability, Th1: threshold, Th2: threshold, W: width, d1: existence probability, d2: coordinates, d3: type data, ob1: ship, ob2: sky, ob3: sea, ob4: buoy, ob5: ship, ob61: traffic light lit in red, ob62: traffic light lit in green, ob7: road, ob8: sidewalk, obt: own ship (ship carrying the device)
Claims (18)
An information processing device comprising an image acquisition unit, a probability acquisition unit, and a determination unit, wherein
the image acquisition unit is configured to acquire an image,
the image includes a plurality of frames with different timings,
the plurality of frames includes a first frame and a second frame whose timing is later than the first frame,
the probability acquisition unit is configured to acquire an existence probability based on the image,
the existence probability is a probability that an object exists in the frame, and
the determination unit is configured to determine a situation of the object based on the existence probabilities of the first and second frames.
The information processing device according to claim 1, wherein
the probability acquisition unit is further configured to acquire coordinates of the object based on the image, and
the determination unit determines the situation of the object by comparing the existence probabilities when it determines that a distance between the coordinates of the object in the first and second frames is equal to or less than a predetermined distance.
The information processing device according to claim 1 or claim 2, wherein
the probability acquisition unit is further configured to acquire type data based on the image,
the type data is data specifying a type of the object, and
the determination unit determines the situation of the object by comparing the existence probabilities when it determines that the type of the object is the same in the first and second frames.
The information processing device according to claim 3, wherein
a type identification probability is specified in the type data for each candidate for the type of the object, and
the determination unit is configured to identify, as the type of the object, the candidate with the highest type identification probability among the candidates.
The information processing device according to claim 1, wherein the determination unit determines the situation of the object by comparing the existence probabilities when it determines that the object in the first and second frames is the same.
The information processing device according to any one of claims 1 to 5, wherein
the situation corresponds to movement of the object, and
the determination unit is configured to determine that the object is approaching when it determines that a difference between the existence probability of the second frame and the existence probability of the first frame is greater than a predetermined first threshold.
The information processing device according to claim 6, wherein
the object includes a ship, and
the information processing device is configured to determine the situation of relative movement of the ship.
The information processing device according to claim 1, further comprising a correction probability acquisition unit, wherein
the correction probability acquisition unit is configured to acquire a correction probability based on the existence probability of the second frame and the existence probability of the first frame,
the correction probability is a probability that the object exists in the frame between the timing of the first frame and the timing of the second frame, and
the determination unit is configured to determine the situation of the object based on the correction probability.
The information processing device according to claim 8, wherein, when the existence probability of the second frame is higher than the existence probability of the first frame, the correction probability acquired by the correction probability acquisition unit is higher than the existence probability of the second frame.
The information processing device according to claim 8 or claim 9, wherein, when the existence probability of the first frame is higher than the existence probability of the second frame, the correction probability acquired by the correction probability acquisition unit is lower than the existence probability of the second frame.
The information processing device according to any one of claims 8 to 10, wherein the determination unit is configured to determine the situation of the object based on the correction probability when both of the existence probabilities of the first and second frames are lower than a predetermined second threshold.
The information processing device according to claim 11, wherein the correction probability is higher than the second threshold.
The information processing device according to claim 11 or claim 12, wherein the determination unit is configured to determine the situation of the object without using the correction probability when at least one of the existence probabilities of the first and second frames is higher than the second threshold.
An information processing method comprising an image acquisition step, a probability acquisition step, and a determination step, wherein
in the image acquisition step, an image is acquired,
the image includes a plurality of frames with different timings,
the plurality of frames includes a first frame and a second frame whose timing is later than the first frame,
in the probability acquisition step, an existence probability is acquired based on the image,
the existence probability is a probability that an object exists in the frame, and
in the determination step, a situation of the object is determined based on the existence probabilities of the first and second frames.
An information processing device comprising a probability acquisition unit and a correction probability acquisition unit, wherein
the probability acquisition unit is configured to acquire a first situation probability and a second situation probability whose timing is later than the first situation probability,
the first and second situation probabilities are probabilities representing a situation of a target,
the correction probability acquisition unit is configured to acquire a correction probability based on the first and second situation probabilities, and
the correction probability is a probability representing the situation of the target between the timing at which the first situation probability is acquired and the timing at which the second situation probability is acquired.
The information processing device according to claim 16, wherein, when the second situation probability is higher than the first situation probability, the correction probability acquired by the correction probability acquisition unit is higher than the second situation probability.
The information processing device according to claim 16 or claim 17, wherein, when the first situation probability is higher than the second situation probability, the correction probability acquired by the correction probability acquisition unit is lower than the second situation probability.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/018624 WO2022244062A1 (en) | 2021-05-17 | 2021-05-17 | Information processing device, information processing method, and computer program |
| JP2023522011A JP7462113B2 (en) | 2021-05-17 | 2021-05-17 | Information processing device, information processing method, and computer program |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/018624 WO2022244062A1 (en) | 2021-05-17 | 2021-05-17 | Information processing device, information processing method, and computer program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022244062A1 (en) | 2022-11-24 |
Family
ID=84141391
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/018624 Ceased WO2022244062A1 (en) | 2021-05-17 | 2021-05-17 | Information processing device, information processing method, and computer program |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7462113B2 (en) |
| WO (1) | WO2022244062A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010118851A (en) * | 2008-11-12 | 2010-05-27 | Toyota Motor Corp | Traveling object, traveling object system, and image management method |
| JP2011091705A (en) * | 2009-10-23 | 2011-05-06 | Canon Inc | Image processing apparatus, and image processing method |
| JP2019067159A (en) * | 2017-09-29 | 2019-04-25 | クラリオン株式会社 | Lane marking recognition device |
| CN112200101A (en) * | 2020-10-15 | 2021-01-08 | 河南省交通规划设计研究院股份有限公司 | Video monitoring and analyzing method for maritime business based on artificial intelligence |
- 2021-05-17: PCT application PCT/JP2021/018624 filed (published as WO2022244062A1; status: Ceased)
- 2021-05-17: JP application JP2023522011A (granted as JP7462113B2; status: Active)
Also Published As
| Publication number | Publication date |
|---|---|
| JP7462113B2 (en) | 2024-04-04 |
| JPWO2022244062A1 (en) | 2022-11-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12067764B2 (en) | Brake light detection | |
| US10817732B2 (en) | Automated assessment of collision risk based on computer vision | |
| US20200353932A1 (en) | Traffic light detection method and apparatus, intelligent driving method and apparatus, vehicle, and electronic device | |
| US12005922B2 (en) | Toward simulation of driver behavior in driving automation | |
| US20210097855A1 (en) | Multiple exposure event determination | |
| CA3068258C (en) | Rare instance classifiers | |
| US11514310B2 (en) | Training a classifier to detect open vehicle doors | |
| KR102318027B1 (en) | Vision-based sample-efficient reinforcement learning framework for autonomous driving | |
| US20130211682A1 (en) | System and method for traffic signal recognition | |
| US10776642B2 (en) | Sampling training data for in-cabin human detection from raw video | |
| KR20210104712A (en) | Fast CNN classification of multi-frame semantic signals | |
| KR20150085988A (en) | Method and system for recognition of speed limit sign using front camera | |
| CN116176625A (en) | Vehicle driving control method, system, device and medium based on machine vision | |
| WO2021017341A1 (en) | Method and apparatus for recognizing driving state of intelligent driving device, and device | |
| CN116524396A (en) | Vehicle appearance abnormality detection method and device, electronic equipment and storage medium | |
| WO2022244062A1 (en) | Information processing device, information processing method, and computer program | |
| US20160092752A1 (en) | Image recognition apparatus | |
| CN116494992B (en) | Vehicle control method and device, electronic equipment and storage medium | |
| KR20210089044A (en) | Method of selecting training data for object detection and object detection device for detecting object using object detection model trained using method | |
| CN116311181B (en) | Method and system for rapidly detecting abnormal driving | |
| US20220139071A1 (en) | Information processing device, information processing method, information processing program, and information processing system | |
| Lu et al. | A BEV Scene Classification Method based on Historical Location Points and Unsupervised Learning | |
| US12190698B2 (en) | Systems and methods for automatic spotting and safety indicator deployment | |
| CN114494743B (en) | Vehicle component matching method based on deep learning, storage medium and device | |
| CN115861967B (en) | Target detection method, device, vehicle and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21940682; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2023522011; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21940682; Country of ref document: EP; Kind code of ref document: A1 |