WO2018198212A1 - Information processing device, information processing method, and computer-readable storage medium - Google Patents
- Publication number: WO2018198212A1 (application PCT/JP2017/016451)
- Authority: WIPO (PCT)
- Prior art keywords
- display mode
- point
- candidate point
- candidate
- image
- Legal status: Ceased
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9021—SAR image post-processing techniques
- G01S13/9027—Pattern recognition for feature extraction
Definitions
- This disclosure relates to processing of data acquired by a radar.
- Synthetic Aperture Radar (SAR) is one of the technologies for observing the state of the earth's surface by irradiating it with electromagnetic waves from above and acquiring the intensity of the reflected electromagnetic waves (hereinafter also referred to as "reflected waves"). Based on the data acquired by SAR, a two-dimensional map of the intensity of the reflected waves (hereinafter, a "SAR image") can be generated.
- A SAR image is a map in which each reflected wave is treated as a reflection from a defined reference plane (for example, the ground surface), and the intensity of the reflected wave is represented on a plane representing that reference plane.
- The position at which the intensity of a reflected wave is represented in the SAR image is based on the distance between the position where the reflection occurred and the position of the antenna that received it. Therefore, the intensity of a reflected wave from a position away from the reference plane is represented in the SAR image at a position shifted toward the radar, relative to the actual position, according to its height above the reference plane.
- Consequently, the image formed in the SAR image by reflected waves from an object whose shape is not flat is an image in which the actual shape of the object is distorted.
- The phenomenon in which such a distorted image is generated is called foreshortening.
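For illustration, the shift described above can be sketched numerically. Under a far-field (parallel-ray) approximation, a scatterer at height h above the reference plane appears displaced toward the radar by roughly h / tan θ in ground range, where θ is the incidence angle. This is a geometric sketch with hypothetical names and numbers, not part of the patent itself:

```python
import math

def foreshortening_shift(height, incidence_angle_deg):
    """Approximate ground-range displacement (toward the radar) of a
    scatterer `height` units above the reference plane, assuming a
    far-field (parallel-ray) geometry.

    incidence_angle_deg: angle between the radar line of sight and the
    vertical at the target.
    """
    theta = math.radians(incidence_angle_deg)
    # slant-range shortening h*cos(theta), mapped back to ground range
    return height / math.tan(theta)

# A 30 m tall structure observed at a 45-degree incidence angle appears
# shifted about 30 m toward the radar in the ground-range image.
print(round(foreshortening_shift(30.0, 45.0), 1))  # → 30.0
```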
- Patent Documents 1 and 2 disclose apparatuses that perform correction processing called ortho correction in order to correct foreshortening.
- Patent Document 3 discloses a technique for correcting not only foreshortening but also a phenomenon called layover.
- the layover is a phenomenon in which a reflected wave signal from a position at a certain height and a reflected wave signal from a position different from the position overlap in the SAR image.
- Patent Document 4, a document related to the present disclosure, describes an occlusion area in an image photographed by a camera.
- the ortho correction as disclosed in Patent Documents 1 and 2 is not assumed to be performed on a SAR image in which a layover has occurred.
- the ortho correction is a correction in which the position of a point where distortion occurs in the SAR image is shifted to a position estimated as a true position where a signal (reflected wave) represented at the point is generated.
- the ortho correction is a correction that is performed on the assumption that there is one position candidate that is estimated as the true position where the reflected wave is emitted at the point to be corrected.
- Patent Document 3 discloses a method for correcting layover, but this method requires a plurality of SAR images with different distortion patterns. In the absence of any supplemental information, it is in principle impossible to distinguish the reflected waves from two or more points that contribute to the signal at a point in a layover region of a single SAR image.
- That is, if the candidate points contributing to the signal at a given point in a SAR image are not narrowed down, it is customary for a person, viewing the SAR image together with an optical image, to estimate which points contribute to the signal based on experience and various other information.
- In addition to SAR images, the images used in the present invention may be images obtained by other methods that estimate the state of an object by observing the reflection of electromagnetic waves, such as images based on RAR (Real Aperture Radar).
- An information processing apparatus includes: candidate point extraction means for extracting candidate points, which are points contributing to the signal at a target point (a point specified in an intensity map of signals from an observed object acquired by a radar), based on the position of the target point in a three-dimensional space and the shape of the observed object; display mode determination means for determining the display mode of a display indicating the positions of the candidate points in a spatial image in which the observed object is captured, based on the positions of the candidate points in the three-dimensional space and the imaging conditions of the spatial image; and image generation means for generating an image in which the positions of the candidate points in the spatial image are displayed according to the determined display mode.
- An information processing method includes: extracting candidate points, which are points contributing to the signal at a target point (a point specified in an intensity map of signals from an observed object acquired by a radar), based on the position of the target point in a three-dimensional space and the shape of the observed object; determining the display mode of a display indicating the positions of the candidate points in a spatial image in which the observed object is captured, based on the positions of the candidate points in the three-dimensional space and the shooting conditions of the spatial image; and generating an image in which the positions of the candidate points in the spatial image are displayed according to the determined display mode.
- A program causes a computer to execute: a candidate point extraction process for extracting candidate points, which are points contributing to the signal at a target point (a point specified in an intensity map of signals from an observed object acquired by a radar), based on the position of the target point in a three-dimensional space and the shape of the observed object; a display mode determination process for determining the display mode of a display indicating the positions of the candidate points in a spatial image in which the observed object is captured, based on the positions of the candidate points in the three-dimensional space and the imaging conditions of the spatial image; and an image generation process for generating an image in which the positions of the candidate points in the spatial image are displayed according to the determined display mode.
- The program is stored in, for example, a computer-readable non-volatile storage medium.
- According to the present invention, in an intensity map of signals from an observed object acquired by a radar, it becomes easy to understand which points on the observed object contribute to the signal at a point in a region where layover occurs.
- FIG. 1 is a diagram for explaining layover.
- In FIG. 1, an observation device S0 that performs observation by SAR and a structure M that exists in the observed range are shown.
- The observation device S0 is, for example, a satellite or an aircraft equipped with the radar.
- While moving through the sky, the observation device S0 transmits electromagnetic waves with the radar and receives the reflected waves.
- The arrow indicates the traveling direction of the observation device S0, that is, the traveling direction of the radar (also referred to as the azimuth direction).
- Electromagnetic waves emitted from the observation device S0 are reflected (back-scattered) by the ground surface and by the structure M on the ground, and part of the reflected waves returns to and is received by the radar.
- From this observation, the distance between the position of the observation device S0 and the reflection point of the electromagnetic wave is specified.
- The point Qa is a point on the ground surface.
- The point Qb is a point on the surface of the structure M, away from the ground surface.
- Here, the distance between the observation device S0 and the point Qa is equal to the distance between the observation device S0 and the point Qb.
- Moreover, the straight line connecting the points Qa and Qb is perpendicular to the traveling direction of the radar.
- In this case, the reflected wave from the point Qa and the reflected wave from the point Qb cannot be distinguished by the observation device S0. That is, the intensity of the reflected wave from the point Qa and the intensity of the reflected wave from the point Qb are observed intermingled.
- An example of an image representing the intensity distribution of the reflected waves (a SAR image) generated in such a case is shown in FIG. 2.
- the arrow indicates the traveling direction of the radar.
- the SAR image is generated based on the intensity of the reflected wave received by the radar and the distance between the point where the reflected wave is emitted and the radar.
- Reflected waves from two or more points that lie on the plane containing the radar's position and perpendicular to the radar's traveling direction, and that are equidistant from the radar, are not distinguished.
- The point P is a point reflecting the intensity of the reflected wave from the point Qa, and the intensity indicated at this point P also reflects the intensity of the reflected wave from the point Qb.
- a white area including the point P is an area where a layover has occurred.
- an area painted black represents an area shaded by the structure M against the radar. This region is also called radar shadow.
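The equal-distance condition behind layover can be sketched as follows. In the cross-sectional plane perpendicular to the azimuth direction, a ground point and an elevated point on a structure that share the same slant range fall into the same SAR pixel. All coordinates below are hypothetical illustration values, not taken from the patent:

```python
import math

def slant_range(sensor, point):
    """Euclidean distance between the sensor and a scatterer, both given
    as (ground_range, height) coordinates in the plane perpendicular to
    the azimuth direction."""
    return math.hypot(point[0] - sensor[0], point[1] - sensor[1])

# Hypothetical geometry: sensor S0 at ground range 0, altitude 5000.
s0 = (0.0, 5000.0)
q_a = (1100.0, 0.0)       # point Qa on the ground surface
r = slant_range(s0, q_a)  # the shared slant range "R"

# Point Qb on a structure at ground range 1200, at exactly the height
# (about 23 units here) that gives it the same slant range as Qa:
h_b = 5000.0 - math.sqrt(r**2 - 1200.0**2)
q_b = (1200.0, h_b)

# Equal slant ranges -> both returns fall into the same SAR pixel,
# so their intensities are observed intermingled (layover).
assert abs(slant_range(s0, q_b) - r) < 1e-6
```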
- a reference three-dimensional space is defined in the processing performed by the information processing apparatus 11.
- a three-dimensional coordinate system is defined for the reference three-dimensional space.
- this three-dimensional coordinate system is referred to as a reference three-dimensional coordinate system or a reference coordinate system.
- the reference coordinate system may be, for example, a geodetic system or a coordinate system of model data 1113 that is three-dimensional data described later.
- In the following description, when an arbitrary point described in a first coordinate system can be described by coordinates in a second coordinate system, the first coordinate system is written as being related to the second coordinate system.
- FIG. 3 is a block diagram showing a configuration of the information processing apparatus 11 according to the first embodiment.
- the information processing apparatus 11 includes a storage unit 111, a feature point extraction unit 112, a geocoding unit 113, a candidate point extraction unit 114, a display mode determination unit 115, an image generation unit 116, and a display control unit 117.
- the storage unit 111, the feature point extraction unit 112, the geocoding unit 113, the candidate point extraction unit 114, the display mode determination unit 115, and the image generation unit 116 are connected so as to be able to transmit / receive data to / from each other.
- data transmission between each unit of the information processing apparatus 11 may be performed directly via a signal line, or may be performed by reading and writing to a shared storage area (for example, the storage unit 111).
- data movement is described by the words “send data” and “receive data”, but the method of transmitting data is not limited to the method of transmitting directly.
- the information processing apparatus 11 is connected to the display device 21 so as to be communicable.
- the storage unit 111 stores SAR data 1111, SAR data parameters 1112, model data 1113, an aerial image 1114, and shooting condition information 1115.
- SAR data 1111 is data obtained by observation using SAR.
- Targets observed by the SAR (hereinafter also referred to as “observed object”) are, for example, the ground surface and buildings.
- the SAR data 1111 is data that can generate at least a SAR image represented under a coordinate system related to a reference coordinate system.
- the SAR data 1111 includes an observation value and information associated with the observation value.
- the observed value is, for example, the intensity of the observed reflected wave.
- The information associated with an observation value is, for example, information such as the position and traveling direction of the radar that observed the reflected wave, and the distance between the reflection point and the radar derived from the observation of the reflected wave.
- the SAR data 1111 may include information on the depression angle of the radar (the elevation angle of the radar viewed from the reflection point) with respect to the object to be observed.
- the information regarding the position is described by, for example, a combination of longitude, latitude, and altitude in the geodetic system.
- the SAR data 1111 may be a SAR image itself.
- In the present embodiment, observation data obtained by SAR is assumed as the data to be used, but data of observation results obtained by, for example, RAR (Real Aperture Radar) may be used instead of SAR.
- the SAR data parameter 1112 is a parameter indicating the relationship between the data included in the SAR data 1111 and the reference coordinate system.
- the SAR data parameter 1112 is a parameter for assigning a position in the reference coordinate system to the observation value included in the SAR data 1111.
- For example, when the information included in the SAR data 1111 is described under a coordinate system different from the reference coordinate system, the SAR data parameter 1112 is a parameter for converting that information into information described under the reference coordinate system.
- the coordinate system of the SAR image is related to the reference coordinate system by the SAR data parameter 1112. That is, an arbitrary point in the SAR image is associated with one point in the reference coordinate system.
- the model data 1113 is data representing the shape of an object in three dimensions, such as topography and building structure.
- the model data 1113 is, for example, DEM (Digital Elevation Model; digital elevation model).
- the model data 1113 may be DSM (Digital Surface Model) that is data of the earth surface including the structure, or DTM (Digital Terrain Model) that is data of the shape of the ground surface.
- the model data 1113 may have DTM and three-dimensional data of a structure separately.
- the coordinate system used for the model data 1113 is related to the reference coordinate system. That is, an arbitrary point in the model data 1113 can be described by coordinates in the reference coordinate system.
- The spatial image 1114 is an image in which a space including the object observed by the SAR is captured.
- The spatial image 1114 may be, for example, any of optical images such as satellite photographs and aerial photographs, maps, topographic maps, and CG (Computer Graphics) images representing the topography.
- The spatial image 1114 may also be a projection view of the model data 1113.
- In other words, the spatial image 1114 is an image in which the geographical shapes and the arrangement of objects in the space it represents are easy for the user of the information processing apparatus 11 (that is, a person viewing the image output by the information processing apparatus 11) to understand intuitively.
- the spatial image 1114 may be captured from outside the information processing apparatus 11 or may be generated by projecting the model data 1113 by the image generation unit 116 described later.
- Shooting condition information 1115 is information related to shooting conditions (capturing conditions) of the spatial image 1114.
- the imaging condition of the spatial image 1114 is how the spatial image 1114 is acquired.
- the shooting condition information 1115 is information that can uniquely specify the shooting range of the spatial image 1114.
- the shooting condition information 1115 is represented by a plurality of parameter values related to the shooting range of the spatial image 1114, for example.
- The spatial image is an image captured from a specific position, and the body that performed the capture (for example, an imaging device such as a camera) is referred to as the photographing body.
- When the spatial image 1114 is an image obtained without an actual photographing process by a device, such as when the spatial image 1114 is generated by projecting the model data 1113, a virtual photographing body may be assumed.
- The photographing condition information 1115 is described by, for example, the position of the photographing body and information indicating its photographing range.
- For example, the shooting condition information 1115 may be described by the coordinates of the photographing body in the reference coordinate system and the four coordinates, in the reference coordinate system, of the points captured at the four corners of the spatial image 1114.
- In this case, the shooting range is the region bounded by the four half-lines extending from the position of the photographing body through those four coordinates.
- Strictly speaking, the position of the photographing body is the position of its viewpoint with respect to the spatial image 1114, but in practice the information on the position of the photographing body used by the display mode determination unit 115 need not be exact.
- For example, as the information indicating the position of the photographing body, the display mode determination unit 115 may use position information acquired by a device with a GPS (Global Positioning System) function mounted on the vehicle (an aircraft, an artificial satellite, or the like) carrying the photographing body.
- the information indicating the position in the shooting condition information 1115 is given by, for example, a set of values of parameters (for example, longitude, latitude, and altitude) in the reference coordinate system.
- the position in the reference three-dimensional space of any point included in the range of the space included in the spatial image 1114 can be uniquely specified by the shooting condition information 1115.
- Conversely, the position in the spatial image 1114 of any such point may be uniquely identified based on the shooting condition information 1115.
- Each parameter of the imaging condition information 1115 may be a parameter of a coordinate system different from the reference coordinate system. In that case, the imaging condition information 1115 only needs to include a conversion parameter for converting the parameter value in the coordinate system to the parameter value in the reference coordinate system.
- The photographing condition information 1115 may also be described by, for example, the position, posture, and angle of view of the photographing body.
- The posture of the photographing body can be described by the photographing direction, that is, the direction of the optical axis of the photographing body at the time of photographing, and by a parameter indicating the relationship between the vertical direction of the spatial image 1114 and the reference coordinate system.
- The angle of view can be described by parameters indicating the vertical viewing angle and the horizontal viewing angle.
- Alternatively, the information indicating the position of the photographing body may be described by parameter values indicating the direction of the photographing body as viewed from the subject.
- For example, the information indicating the position of the photographing body may be a set of an azimuth and an elevation angle.
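As an illustration of how a position, posture, and angle of view determine where a point appears in a spatial image, the following is a minimal pinhole-projection sketch. The patent does not prescribe a projection model; all function names, vectors, and numbers are hypothetical:

```python
import math

def _sub(a, b):
    return [x - y for x, y in zip(a, b)]

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def project(point, cam_pos, forward, up, hfov_deg, vfov_deg):
    """Project `point` (reference coordinates) into normalized image
    coordinates in [-1, 1] x [-1, 1] under a simple pinhole model.
    `forward` and `up` are assumed orthonormal. Returns None if the
    point is behind the viewpoint."""
    d = _sub(point, cam_pos)
    right = _cross(forward, up)
    depth = _dot(d, forward)
    if depth <= 0.0:
        return None  # behind the photographing body
    u = _dot(d, right) / depth / math.tan(math.radians(hfov_deg) / 2)
    v = _dot(d, up) / depth / math.tan(math.radians(vfov_deg) / 2)
    return (u, v)    # inside the image iff |u| <= 1 and |v| <= 1

# Viewpoint 100 units above the origin looking straight down, with
# 90-degree view angles; a ground point 50 units along +y:
uv = project([0.0, 50.0, 0.0], [0.0, 0.0, 100.0],
             forward=[0.0, 0.0, -1.0], up=[0.0, 1.0, 0.0],
             hfov_deg=90.0, vfov_deg=90.0)
print(uv)  # → (0.0, 0.5)
```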
- the shooting condition information 1115 may include shooting time information.
- the storage unit 111 does not always need to hold data in the information processing apparatus 11.
- the storage unit 111 may record data on a device or a recording medium outside the information processing apparatus 11 and acquire the data as necessary. That is, the storage unit 111 only needs to be configured to acquire data requested by each unit in the processing of each unit of the information processing apparatus 11 described below.
- A feature point is a point extracted by a predetermined method from the points in the SAR data 1111 whose indicated signal intensity is not zero. That is, the feature point extraction unit 112 extracts one or more points from the SAR data 1111 by a predetermined point extraction method.
- the points extracted from the SAR data 1111 are a data group related to one point in the SAR image (for example, a set of an observation value and information associated with the observation value).
- the feature point extraction unit 112 extracts feature points by, for example, a method of extracting points that may give useful information in the analysis of the SAR data 1111.
- the feature point extraction unit 112 may extract points by a technique called PS-InSAR (Permanent Scatterers Interferometric SAR).
- PS-InSAR is a technique for extracting a point where a change in signal intensity is observed based on a phase shift from a plurality of SAR images.
- the feature point extraction unit 112 may extract a point that satisfies a predetermined condition (for example, the signal intensity exceeds a predetermined threshold) as the feature point.
- This predetermined condition may be set by a user or a designer of the information processing apparatus 11, for example.
- the feature point extraction unit 112 may extract points selected by human judgment as feature points.
- the feature point extraction unit 112 sends the extracted feature point information to the geocoding unit 113.
- the feature point information includes at least information capable of specifying coordinates in the reference coordinate system.
- the feature point information is represented by the position and traveling direction of the observation device that has acquired the SAR data in the range including the feature point, and the distance between the observation device and the signal reflection point at the feature point.
- The geocoding unit 113 converts this information, based on the SAR data parameters 1112, into information represented by the position, traveling direction, and distance of the observation device in the reference coordinate system. The geocoding unit 113 then identifies the points (coordinates) in the reference coordinate system that satisfy all of the following conditions: the distance between the point and the position of the observation device is the distance indicated by the feature point information; and the point is included in the plane that passes through the position of the observation device and is perpendicular to its traveling direction.
- the coordinates of the identified point are the coordinates in the reference coordinate system of the feature point indicated by the feature point information.
- the geocoding unit 113 assigns the coordinates of the points specified in this way to the feature points indicated by the feature point information.
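The geocoding step above can be sketched for the simple case of a flat reference plane at z = 0: find the point on that plane whose distance from the observation device equals the observed slant range and which lies in the plane through the device perpendicular to its travel direction. The coordinates, function name, and the `look_right` convention below are hypothetical illustration choices:

```python
import math

def geocode(device_pos, travel_dir, slant_range, look_right=True):
    """Locate, on the reference plane z = 0, the point whose distance
    from the observation device equals `slant_range` and which lies in
    the plane through the device perpendicular to its (horizontal)
    travel direction. Returns (x, y, 0.0)."""
    x, y, h = device_pos
    dx, dy = travel_dir
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    ground = math.sqrt(slant_range**2 - h**2)  # horizontal offset
    # horizontal unit vector perpendicular to the travel direction
    cx, cy = (dy, -dx) if look_right else (-dy, dx)
    return (x + ground * cx, y + ground * cy, 0.0)

# Device at altitude 3000 traveling along +y, slant range 5000:
print(geocode((0.0, 0.0, 3000.0), (0.0, 1.0), 5000.0))  # → (4000.0, 0.0, 0.0)
```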
- The candidate point extraction unit 114 extracts points related to a feature point (hereinafter, "candidate points") for the feature point that has been given coordinates in the reference coordinate system.
- the candidate points related to the feature points will be described below.
- the signal intensity indicated at the feature point (referred to as point P) in the region where the layover occurs may be the sum of the intensity of the reflected waves from a plurality of points.
- a point in the three-dimensional space that may contribute to the signal intensity indicated at the point P is referred to as a candidate point related to the point P in this embodiment.
- FIG. 4 is a diagram for explaining an example of candidate points.
- FIG. 4 is a cross-sectional view of the reference three-dimensional space cut by a plane passing through the point P and perpendicular to the radar traveling direction (azimuth direction).
- The line GL is the cross-sectional line of the reference plane of the reference three-dimensional space, that is, the plane on which the feature point is located.
- The line ML is the cross-sectional line of the three-dimensional structure represented by the model data 1113.
- The point S1 is a point indicating the position of the radar.
- The position of the point P is the coordinate position given by the geocoding unit 113.
- The distance between the points P and S1 is assumed to be "R".
- What is reflected in the signal intensity indicated at the point P is the reflected wave from any point whose distance from the point S1 is "R" in this cross-sectional view. That is, the points involved in the point P are the points at which the arc of radius "R" centered on the point S1 intersects the line ML.
- The points Q1, Q2, Q3, and Q4 are the points, other than the point P, at which the arc of radius "R" centered on the point S1 intersects the line ML. Therefore, these points Q1, Q2, Q3, and Q4 are candidate points related to the point P.
- In other words, it suffices for the candidate point extraction unit 114 to extract, as candidate points, the points that lie on the plane containing the point P and perpendicular to the traveling direction of the radar, and whose distance from the radar is equal to the distance between the radar and the point P.
- The candidate points extracted by the candidate point extraction unit 114 may be the points Q1, Q2, and Q4, excluding the point Q3.
- That is, the candidate point extraction unit 114 may exclude the point Q3 from the candidate points on the grounds that the line segment connecting the point Q3 and the point S1 intersects the line ML at a point other than Q3.
- This determination can be performed using the cross-sectional line of the model data 1113 in the plane of the reference three-dimensional space that passes through the point P and is perpendicular to the azimuth direction, the positions of the points S1 and P, and the distance "R" between the points S1 and P.
- Alternatively, the point Q3 may be removed from the candidate points on the grounds that a straight line passing through the point Q3 and parallel to the incident line of the electromagnetic wave from the radar intersects the line ML (that is, the point Q3 is in the radar shadow).
- In that case, the candidate point extraction unit 114 may extract candidate points under the approximation that the incident directions of the electromagnetic waves from the observation device to the observed object are all parallel to each other.
- Under this approximation, the position of a candidate point can be calculated using the azimuth and the depression angle θ of the point S1 instead of the coordinates of the point S1 and the distance "R".
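The arc-intersection idea of FIG. 4 can be sketched as follows: represent the cross-sectional line ML as a polyline and intersect each segment with the circle of radius "R" centered on S1. The cross-section (flat ground with a 30-unit wall), the sensor position, and "R" are hypothetical illustration values; radar-shadow filtering (the point Q3 case) is omitted for brevity:

```python
import math

def circle_segment_intersections(center, radius, p0, p1, eps=1e-9):
    """Points where the circle |p - center| = radius meets the segment
    p0-p1 (all 2-D tuples in the cross-sectional plane)."""
    cx, cy = center
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    fx, fy = p0[0] - cx, p0[1] - cy
    a = dx*dx + dy*dy
    b = 2 * (fx*dx + fy*dy)
    c = fx*fx + fy*fy - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return []
    sq = math.sqrt(disc)
    hits = []
    for t in ((-b - sq) / (2*a), (-b + sq) / (2*a)):
        if -eps <= t <= 1 + eps:
            hits.append((p0[0] + t*dx, p0[1] + t*dy))
    return hits

def candidate_points(sensor, radius, profile):
    """All points where the arc of radius `radius` about the sensor
    crosses the cross-section polyline `profile` of the model data."""
    pts = []
    for p0, p1 in zip(profile, profile[1:]):
        pts.extend(circle_segment_intersections(sensor, radius, p0, p1))
    return pts

# Hypothetical cross-section: flat ground with a 30-unit-high wall
# between ground ranges 900 and 1000; sensor S1 overhead at one side.
profile = [(0.0, 0.0), (900.0, 0.0), (900.0, 30.0), (1000.0, 30.0),
           (1000.0, 0.0), (2000.0, 0.0)]
s1 = (0.0, 4000.0)
r = math.hypot(950.0, 3970.0)   # slant range through a roof point
for q in candidate_points(s1, r, profile):
    print((round(q[0], 2), round(q[1], 2)))
# prints:
# (814.53, 0.0)   <- ground point (the geocoded point P)
# (900.0, 18.37)  <- point on the front wall
# (950.0, 30.0)   <- point on the roof
```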
- the candidate point extraction unit 114 sends candidate points related to the feature points to the display mode determination unit 115 and the image generation unit 116.
- the display mode is a state of display determined by, for example, the shape, size, color, brightness, transparency, movement, and change with time of a figure to be displayed.
- the “candidate point display mode” is a display mode for displaying the position of the candidate point.
- "Displaying candidate points" means displaying an indication of the positions of the candidate points.
- the shooting condition is indicated by the shooting condition information 1115 as described above.
- the display mode determination unit 115 receives the coordinates of the candidate points from the candidate point extraction unit 114 when determining the display mode of the candidate points. Further, the display mode determination unit 115 reads out the model data 1113 and the imaging condition information 1115 of the spatial image used for the image generated by the image generation unit 116 from the storage unit 111.
- For example, the display mode determination unit 115 determines different display modes depending on whether or not a candidate point is positioned, in the three-dimensional space, in a region that is a blind spot in the spatial image.
- A region that is a blind spot in the spatial image is a region that is included in the imaging range of the spatial image but is blocked by an object appearing in the spatial image, and therefore is not visible from the position of the photographing body that captured the spatial image.
- an area that becomes a blind spot in a spatial image is also referred to as an occlusion area.
- FIG. 6 is a diagram for explaining the occlusion area.
- The point S2 represents the position of the photographing body.
- The solid M is a rectangular parallelepiped structure.
- The range of the three-dimensional region indicated by the dotted lines is the shooting range.
- The straight lines Lc, Ld, and Le are straight lines extending from the position of the point S2 through the points Qc, Qd, and Qe, respectively, on the solid M.
- a three-dimensional area indicated by diagonal lines in FIG. 6 is an occlusion area.
- the display mode determination unit 115 first determines whether each candidate point is located in the occlusion area.
- FIG. 7 is a diagram for explaining a method for determining whether a candidate point is located in the occlusion area.
- FIG. 7 is a cross-sectional view of a plane that includes the point S2, which represents the position of the photographing body of the spatial image, and the point Q2.
- a line GL is a cross-sectional line of the reference plane of the reference three-dimensional space
- a line ML is a cross-sectional line of the three-dimensional structure represented by the model data 1113.
- The display mode determination unit 115 first calculates the line segment connecting the candidate point Q2 and the point S2. The display mode determination unit 115 then determines whether this line segment has an intersection with the line ML at a point other than the point Q2 (that is, whether it intersects the model data 1113). When the line segment has such an intersection with the line ML, the display mode determination unit 115 determines that the candidate point is located in the occlusion area. When the line segment has no such intersection with the line ML, the display mode determination unit 115 determines that the candidate point is not located in the occlusion area. In the example of FIG. 7, the line segment connecting the point Q2 and the point S2 has an intersection Qf with the line ML. Thus, the display mode determination unit 115 determines that the candidate point Q2 is located in the occlusion area.
- the display mode determination unit 115 determines whether or not the candidate points included in the imaging range of the spatial image are located in the occlusion area, for example, as described above.
- the line segment connecting the candidate point and the point S2 may instead be a half line extending from the candidate point in the direction of the imaging device.
- when the point S2 is sufficiently far away, the directions from the candidate points to the imaging device can be regarded as identical. That is, by regarding the point S2 as sufficiently far away, the display mode determination unit 115 can perform the determination for all candidate points in the spatial image using half lines in the same direction. In this case, the calculation cost of the determination can be reduced.
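the occlusion determination described above can be sketched in two dimensions, on the cross-sectional plane of FIG. 7 (a simplified illustration only; the model cross-section ML is assumed to be given as a polyline of 2D vertices, and all function names are ours, not part of the apparatus):

```python
# Simplified 2D sketch of the occlusion test of FIG. 7, on the plane
# containing the imaging position S and a candidate point Q. The model
# cross-section ML is assumed to be a polyline (list of 2D vertices).

def _ccw(a, b, c):
    # Twice the signed area of triangle (a, b, c); the sign gives orientation.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def _properly_intersect(p1, p2, p3, p4):
    # Strict (proper) intersection of segments p1-p2 and p3-p4. Because the
    # test is strict, an intersection exactly at an endpoint -- such as the
    # candidate point Q lying on ML itself -- is not counted.
    d1, d2 = _ccw(p3, p4, p1), _ccw(p3, p4, p2)
    d3, d4 = _ccw(p1, p2, p3), _ccw(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

def crossing_count(q, s, ml):
    """Number of times the segment from Q to S crosses the polyline ML."""
    return sum(_properly_intersect(q, s, a, b) for a, b in zip(ml, ml[1:]))

def in_occlusion(q, s, ml):
    # Q is in the occlusion area if the segment to S crosses ML at least once.
    return crossing_count(q, s, ml) >= 1
```

for the far-sensor approximation, `s` can simply be replaced by a point far away in the common direction of the imaging device, so that the same half-line direction is used for every candidate point.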
- the display mode determination unit 115 determines the display mode of each candidate point so that the display mode of candidate points located in the occlusion area is different from the display mode of candidate points not positioned in the occlusion area.
- the display mode determination unit 115 associates information indicating the determined display mode with each candidate point. For example, the display mode determination unit 115 may set a property value related to the display mode for each candidate point. Alternatively, the display mode determination unit 115 may associate a candidate point with the identification information of a previously prepared set of property values related to the display mode. To associate second information with first information is to generate data indicating that the first information and the second information are associated with each other.
- the display mode determination unit 115 sets the transmittance value of the candidate points not located in the occlusion area to a value different from the transmittance of the candidate points located in the occlusion area.
- the transmittance is a parameter indicating the degree to which a superimposed graphic contributes to the pixel value at the superimposed position. For example, when a graphic with a transmittance of 0% is superimposed, the pixel value at the superimposed position depends only on the color of the graphic. When a graphic whose transmittance is not 0% is superimposed, the pixel value at the superimposed position also depends on the pixel value before superimposition. In other words, a graphic whose transmittance is not 0% is displayed as a semitransparent graphic.
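the contribution described here can be sketched as standard alpha compositing (the helper below is illustrative, assuming 8-bit RGB channels):

```python
def blend(graphic_rgb, image_rgb, transmittance_pct):
    """Composite a graphic color over an image pixel.

    transmittance 0%   -> the result depends only on the graphic color;
    transmittance 100% -> the graphic is invisible;
    values in between  -> the graphic appears semitransparent.
    """
    t = transmittance_pct / 100.0
    return tuple(round((1.0 - t) * g + t * p)
                 for g, p in zip(graphic_rgb, image_rgb))
```

for example, a red marker with 50% transmittance over a blue pixel yields a purplish mixture, whereas at 0% the pixel shows pure red.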
- FIG. 8 is a table showing examples of properties relating to the display mode and example values of each property in two types of display modes ("first display mode" and "second display mode").
- the display mode determination unit 115 holds in advance, or stores in the storage unit 111, data representing the contents of a table such as that shown in FIG. 8, and associates either the "first display mode" or the "second display mode" with each candidate point. Note that the properties related to the display mode may include a property designating whether or not a graphic is displayed at all.
- the display mode determination unit 115 associates the first display mode with candidate points that are not located in the occlusion area, and associates the second display mode with candidate points that are located in the occlusion area. By doing so, the candidate points located in the occlusion area are displayed in a semitransparent color on the display device 21.
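one way to hold such an association in memory can be sketched as follows (the concrete property names and values are illustrative only, loosely following the two-mode table of FIG. 8):

```python
# Illustrative property sets for the two display modes: the "first display
# mode" for candidate points visible from the imaging device, the "second
# display mode" for candidate points in the occlusion area.
DISPLAY_MODES = {
    "first display mode":  {"shape": "circle", "transmittance_pct": 0},
    "second display mode": {"shape": "circle", "transmittance_pct": 50},
}

def assign_display_modes(candidate_points, occluded_flags):
    """Associate each candidate point with a display-mode identifier."""
    return {
        point: ("second display mode" if occluded else "first display mode")
        for point, occluded in zip(candidate_points, occluded_flags)
    }
```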
- the transmittance may be set for each portion of the graphic to be displayed (such as an outline and an internal region).
- for example, the display mode determination unit 115 may determine, for each candidate point, a different display mode according to the number of times the half line extending from the candidate point toward the observation device intersects the line ML. For example, the display mode determination unit 115 may set the transmittance to 50% for a candidate point whose half line intersects the line ML once, and to 80% for a candidate point whose half line intersects the line ML two or more times.
- the display mode determination unit 115 may also determine a display mode that changes according to the distance between the candidate point and the imaging device. For example, the display mode determination unit 115 may determine the display mode of each candidate point so that the farther the candidate point is from the imaging device, the smaller the graphic indicating the candidate point becomes. Alternatively, the display mode determination unit 115 may determine the display mode of each candidate point so that the farther the candidate point is from the imaging device, the lower the brightness of the graphic indicating the candidate point becomes. Such a configuration makes the positional relationship between candidate points easier to understand.
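a distance-dependent size rule of this kind can be sketched as follows (the thresholds and pixel sizes are illustrative assumptions; the same interpolation could drive brightness instead of size):

```python
def marker_size_px(distance, near=100.0, far=1000.0, max_px=12, min_px=4):
    """Shrink the marker linearly as the candidate point gets farther from
    the imaging device; clamp the size outside the [near, far] range."""
    if distance <= near:
        return max_px
    if distance >= far:
        return min_px
    frac = (distance - near) / (far - near)   # 0.0 at near, 1.0 at far
    return round(max_px - frac * (max_px - min_px))
```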
- the display mode determination unit 115 sends the display mode of each candidate point to the image generation unit 116.
- the format of data generated by the image generation unit 116 is not limited to the image format. The image generated by the image generation unit 116 may be data having information necessary for the display device 21 to display.
- the image generation unit 116 reads a spatial image used for the generated image from the spatial image 1114 stored in the storage unit 111.
- the image generation unit 116 may determine the image to be read based on an instruction from the user, for example.
- the image generation unit 116 may receive information specifying one of the plurality of spatial images 1114 from the user.
- the image generation unit 116 may receive information specifying a range in the three-dimensional space and read a spatial image including the specified range.
- the image generation unit 116 may accept information designating feature points or candidate points that the user desires to display. Then, the image generation unit 116 may specify a range in the reference three-dimensional space including the designated feature point or candidate point, and read a spatial image including the specified range. Note that the information that specifies the feature points or candidate points that the user desires to display may be information that specifies the SAR data 1111.
- the image generation unit 116 may cut out a part of the spatial image 1114 stored in the storage unit 111 and read it as the spatial image to be used. For example, when the image generation unit 116 reads a spatial image based on the candidate points that the user desires to display, the image generation unit 116 may cut out a range including all of those candidate points from the spatial image 1114 and read the cut-out image as the spatial image.
- the image generation unit 116 receives the coordinates of the candidate points extracted by the candidate point extraction unit 114 from the candidate point extraction unit 114. Further, the image generation unit 116 acquires information indicating the display mode of each candidate point from the display mode determination unit 115.
- the image generation unit 116 superimposes a display indicating the position of the candidate point extracted by the candidate point extraction unit 114 on the read spatial image according to the display mode determined by the display mode determination unit 115. Thereby, a spatial image in which candidate points are shown is generated.
- the spatial image generated by the image generation unit 116 and indicating the candidate points is also referred to as a “point display image”.
- the image generation unit 116 may specify the position of the candidate point in the spatial image by calculation based on the shooting condition information 1115.
- the image generation unit 116 specifies the shooting range and shooting direction of the spatial image based on the shooting condition information 1115. Then, the image generation unit 116 obtains a cut surface of the shooting range by a plane that includes the candidate point and is perpendicular to the shooting direction. The positional relationship between the cut surface and the candidate point corresponds to the positional relationship between the spatial image and the candidate point.
- the image generation unit 116 may then specify the coordinates of the candidate point by relating the coordinates of the cut surface to the coordinates of the spatial image.
- the identified coordinates are the coordinates of candidate points in the spatial image.
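under the parallel-ray assumption, the correspondence between the cut surface and the image can be sketched as an orthographic projection (a simplification of this sketch only; the basis vectors `right` and `up` spanning the cut plane are assumptions, not part of the apparatus):

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_to_image(point, shooting_dir, right, up, origin=(0.0, 0.0, 0.0)):
    """Project a 3D candidate point onto the plane perpendicular to the
    (unit) shooting direction and return its 2D in-plane coordinates.

    right and up are unit vectors spanning the cut plane; they must be
    orthogonal to shooting_dir and to each other.
    """
    p = tuple(pi - oi for pi, oi in zip(point, origin))
    # Drop the component along the shooting direction, i.e. move the point
    # onto the cut plane without shifting it within the plane.
    along = _dot(p, shooting_dir)
    p_plane = tuple(pi - along * di for pi, di in zip(p, shooting_dir))
    return _dot(p_plane, right), _dot(p_plane, up)
```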
- the optical satellite image may have been corrected by ortho correction or the like. In that case, the position at which the candidate point is indicated is also corrected. For example, the position of the candidate point may be corrected using the correction parameters used in the correction of the optical satellite image.
- the above-described method for specifying the position of the candidate point in the spatial image is an example.
- the image generation unit 116 may specify the position of the candidate point in the spatial image based on the position of the candidate point in the reference coordinate system and the relationship between the spatial image and the reference coordinate system.
- the image generation unit 116 sends the generated point display image to the display control unit 117.
- the display control unit 117 causes the display device 21 to display the point display image by, for example, outputting the point display image to the display device 21.
- the display device 21 is a display such as a liquid crystal monitor or a projector.
- the display device 21 may have a function as an input unit like a touch panel.
- in this example, the display device 21 is connected to the information processing device 11 as an external device of the information processing device 11, but the display device 21 may instead be included in the information processing device 11 as a display unit.
- the viewer who sees the display on the display device 21 knows the result of the processing by the information processing device 11. Specifically, the viewer can observe the point display image generated by the image generation unit 116.
- the feature point extraction unit 112 of the information processing apparatus 11 acquires the SAR data 1111 from the storage unit 111 (S111).
- the acquired SAR data 1111 includes at least SAR data in a range included in the spatial image used in step S121 described later.
- the feature point extraction unit 112 extracts feature points from the acquired SAR data 1111 (step S112).
- the geocoding unit 113 assigns coordinates indicating the position of the feature point in the reference coordinate system to the extracted feature point (step S113).
- the geocoding unit 113 sends the coordinates assigned to the extracted feature points to the candidate point extraction unit 114.
- the candidate point extraction unit 114 extracts candidate points related to the feature point based on the coordinates of the feature point and the model data 1113 (step S114). That is, the candidate point extraction unit 114 specifies the coordinates of candidate points related to the feature points. Then, the candidate point extraction unit 114 sends the coordinates of the candidate points to the display mode determination unit 115 and the image generation unit 116.
- the candidate point extraction unit 114 may store the coordinates of the candidate points in the storage unit 111 in a format in which the feature points and the candidate points are associated with each other.
- the display mode determination unit 115 reads the shooting condition information of the spatial image used in the process of step S124 described later (step S121). In addition, the display mode determination unit 115 acquires the coordinates of the candidate points extracted by the candidate point extraction unit 114 from the candidate point extraction unit 114. When the coordinates of the candidate points are stored in the storage unit 111, the display mode determination unit 115 may read the coordinates of the candidate points from the storage unit 111. Then, the display mode determination unit 115 specifies the candidate points included in the range of the spatial image based on the shooting condition information (step S122).
- the display mode determination unit 115 determines the display mode of each candidate point included in the range of the spatial image based on the positions of those candidate points, the model data 1113, and the shooting condition information (step S123).
- the display mode determination unit 115 sends information on the display mode of the determined candidate points to the image generation unit 116.
- the image generation unit 116 generates a point display image that is a spatial image indicating the position of the candidate point (step S124). Specifically, the image generation unit 116 reads a spatial image from the storage unit 111 and superimposes a display indicating the position of the candidate point on the spatial image according to the display mode determined by the display mode determination unit 115.
- the timing at which the spatial image used by the image generation unit 116 is determined may be before or after the processing for acquiring the SAR data is performed. That is, in one example, after the spatial image to be used is determined, the information processing apparatus 11 may specify the SAR data 1111 covering the range included in the determined spatial image and execute the processing from step S111 to step S114 on the specified SAR data. In another example, the information processing apparatus 11 may perform the processing from step S111 to step S114 in advance, before the spatial image to be used is determined, on SAR data 1111 covering a range that can be included in the spatial image 1114.
- the image generation unit 116 sends the generated image to the display control unit 117.
- the display control unit 117 causes the display device 21 to display the point display image generated by the image generation unit 116. Thereby, the display device 21 displays the point display image generated by the image generation unit 116 (step S125). By viewing this display, the viewer can easily understand the candidate points related to the feature points extracted in the SAR data 1111.
- a viewer can easily understand a point that contributes to a signal at a point in a region where a layover occurs in the SAR image.
- the reason is that the candidate point extraction unit 114 extracts, based on the model data 1113, candidate points that may have contributed to the signal at the feature point, and the image generation unit 116 generates a point display image, that is, a spatial image in which the positions of those candidate points are displayed.
- in addition, the display mode determination unit 115 determines, based on the position of each candidate point and the shooting conditions, a display mode that makes the position of the candidate point displayed in the point display image easier to understand.
- FIG. 11 is a diagram illustrating an example of a dot display image displayed by the display device 21.
- the letters “Q 5 ” and “Q 6 ” and the curves indicating the points are shown for convenience of explanation and are not included in the actually displayed image.
- a circle representing a candidate point is superimposed on an optical satellite image showing a building.
- no candidate point is located in the occlusion area. Therefore, the display mode of each candidate point is uniform.
- FIG. 12 is a diagram illustrating an example of a point display image that uses as its spatial image an optical image, taken from another position, of the building and candidate points displayed in FIG. 11.
- in FIG. 12, the candidate points Q5 and Q6, which are located in the occlusion area, are displayed in a mode different from that of the other candidate points.
- FIG. 13 is a diagram showing a point display image with the same angle of view as the point display image shown in FIG. 12 in the case where the display mode determination unit 115 determines the display mode so that the size of the circle indicating a candidate point differs according to its distance.
- since the circles indicating the candidate points Q5 and Q6 are displayed smaller than the circles indicating the other candidate points, which are located on the front of the building, it can be seen that the candidate points Q5 and Q6 are not points located on the front of the building.
- FIG. 14 is a diagram showing a point display image having the same angle of view as the point display image shown in FIG. 13 when the display mode determination unit 115 does not function.
- the display modes of the candidate points Q 5 and Q 6 are the same as the display modes of the other candidate points.
- a viewer who sees only the point display image shown in FIG. 14 cannot identify which surface of the building the candidate points Q5 and Q6 are located on. Even if the viewer obtains useful information from the information represented by the point Q5, the viewer cannot determine which position of the building that information relates to.
- for example, in PS-InSAR, displacement at feature points can be observed using two sets of SAR data 1111 having different acquisition times. However, even if the viewer knows that the feature point related to the candidate point Q5 has undergone displacement, the viewer cannot determine specifically which location has displaced. The viewer may even mistakenly conclude that the front of the building has displaced. Furthermore, if the viewer attempts to determine, based on the positions of the candidate points, the degree to which each candidate point contributes to the signal at the feature point, the contributions of the points Q5 and Q6 cannot be determined, because the positions of the points Q5 and Q6 cannot be identified.
- the inconveniences described above can be resolved by the display mode determination unit 115. That is, the information processing apparatus 11 provides the viewer with more easily understandable information regarding the feature points.
- [Modification 1] In the operation example of the information processing apparatus 11 described above, the order of the process of step S111 and the process of step S112 may be reversed. That is, the feature point extraction unit 112 may extract feature points from the points to which the geocoding unit 113 has assigned coordinates.
- the display mode determination unit 115 may determine the display mode so that the display indicating the position of a candidate point further indicates the direction of the radar that received the signal from the candidate point, or the incident direction of the electromagnetic wave from the radar.
- FIG. 15 is a diagram illustrating an example of a point display image when the display mode determination unit 115 determines a display mode that indicates the incident direction of the electromagnetic wave from the radar with respect to each candidate point.
- the incident direction of the electromagnetic wave from the radar is indicated by an arrow that overlaps the circle indicating the candidate point.
- the viewer can know the direction of the radar.
- the figure showing the incident direction may be a figure showing not only a direction parallel to the image but also a three-dimensional direction.
- FIG. 16 is a diagram illustrating an example of a figure indicating a three-dimensional direction.
- the arrow shown in FIG. 16 points toward the lower right and into the page.
- the incident direction is a direction indicated by an arrow.
- the direction opposite to the direction indicated by the arrow is the direction of the radar viewed from the candidate point. According to such a display, the viewer can more specifically know the incident direction of the electromagnetic wave from the radar.
- since a candidate point is a point on which the electromagnetic wave from the radar was incident, the candidate point is not included in the occlusion area at least when it is viewed from the indicated direction. Therefore, the display showing the incident direction of the electromagnetic wave from the radar allows the viewer to know a shooting direction in which the candidate point is not hidden. Further, when a displayed candidate point has a plurality of candidate positions in the three-dimensional space, points that cannot be observed from the indicated direction can be excluded from the candidates. The viewer may therefore be able to deepen his or her understanding of the position of the candidate point in the three-dimensional space.
- the display mode determination unit 115 may further be configured to determine the display mode so that the display mode of candidate points related to a specific feature point is different from the display mode of other candidate points.
- the display mode determination unit 115 may determine the display mode so that candidate points related to the feature points designated by the user are displayed in white and other candidate points are displayed in black.
- FIG. 17 is a block diagram illustrating a configuration of the information processing apparatus 12 including the designation receiving unit 118.
- the designation accepting unit 118 accepts designation of feature points from the user of the information processing apparatus 12, for example.
- the information processing apparatus 12 may cause the display device 21 to display a SAR image showing the feature points.
- the designation accepting unit 118 may accept a user's selection of one or more of the feature points shown in the SAR image. The selection may be performed via an input device such as a mouse. The selected feature points are the designated feature points.
- the designation accepting unit 118 may accept designation of a plurality of feature points.
- the designation receiving unit 118 sends information on the designated feature points to the display mode determining unit 115.
- the designated feature point information is, for example, an identification number or coordinates associated with each feature point.
- the display mode determination unit 115 identifies candidate points related to the specified feature point. For example, the display mode determination unit 115 may cause the candidate point extraction unit 114 to extract candidate points related to the designated feature point and receive information on the extracted candidate points. Alternatively, when information that associates the feature points with the candidate points is stored in the storage unit 111, the display mode determination unit 115 may identify the candidate points based on the information.
- the designation accepting unit 118 may accept designation of candidate points instead of designation of feature points. For example, the user may select any candidate point among the candidate points included in the point display image displayed by the process of step S125. The designation accepting unit 118 may accept the selection and specify the feature point related to the selected candidate point.
- the display mode determination unit 115 determines a display mode different from the display mode of other candidate points as the display mode of the identified candidate points. Then, the image generation unit 116 generates a point display image in which candidate points are displayed according to the determined display mode. By displaying this point display image on the display device 21, the viewer can see information on candidate points related to the designated feature point.
- FIG. 18 is a diagram illustrating an example of a point display image generated by the information processing apparatus 12 according to the third modification.
- candidate points related to a specific feature point are displayed in white, and other candidate points are displayed in black.
- the transmittance of a figure indicating a candidate point located in the occlusion area among candidate points related to a specific feature point is 50%.
- the display mode determination unit 115 may determine the display mode of the candidate points related to the specific feature point so that their display includes an indication of the incident direction of the electromagnetic wave from the radar, as in the second modification.
- Such a display can further suppress the possibility that the viewer misidentifies the position of the candidate point.
- the candidate points located on the wall surface of the building shown in FIG. 18 are located on a surface different from the front surface of the building.
- FIG. 19 is a block diagram illustrating a configuration of the information processing apparatus 10.
- the information processing apparatus 10 includes a candidate point extraction unit 104, a display mode determination unit 105, and an image generation unit 106.
- the candidate point extraction unit 104 extracts candidate points, which are points that contribute to the signal at a target point, based on the position in the three-dimensional space of the target point, which is a point specified in the intensity map of the signal from the observed object acquired by the radar, and on the shape of the observed object.
- the candidate point extraction unit 114 of each of the above embodiments is an example of the candidate point extraction unit 104.
- the signal intensity map is, for example, a SAR image.
- a point specified in the intensity map is associated with a point in the three-dimensional space.
- An example of the target point is a feature point in the first embodiment.
- the shape of the object to be observed is given by, for example, three-dimensional model data.
- the display mode determination unit 105 determines the display mode of the display indicating the position of a candidate point in the spatial image in which the observed object is captured, based on the position of the candidate point in the three-dimensional space and the shooting conditions of the spatial image.
- the shape of the subject shown in the spatial image is given by, for example, three-dimensional model data.
- the display mode determination unit 115 of each of the above embodiments is an example of the display mode determination unit 105.
- the image generation unit 106 generates an image in which the position of the candidate point in the spatial image is displayed according to the determined display mode.
- the association between points in the three-dimensional space and points in the spatial image may be established in advance or may be performed by the image generation unit 106.
- the image generation unit 106 generates an image indicating the position of the candidate point in the spatial image based on the position of the candidate point in the three-dimensional space and the relationship between the spatial image and the three-dimensional space.
- the image generation unit 116 in each of the above embodiments is an example of the image generation unit 106.
- FIG. 20 is a flowchart showing an operation flow of the information processing apparatus 10.
- the candidate point extraction unit 104 extracts candidate points, which are points that contribute to the signal at a target point, based on the position in the three-dimensional space of the target point, which is a point specified in the intensity map of the signal from the observed object acquired by the radar, and on the shape of the observed object (step S101).
- the display mode determination unit 105 determines the display mode of the display indicating the position of the candidate point in the spatial image in which the observed object is captured, based on the position of the candidate point in the three-dimensional space and the shooting conditions of the spatial image (step S102).
- the image generation unit 106 generates an image in which the position of the candidate point in the spatial image is displayed according to the determined display mode (step S103).
- this is because the candidate point extraction unit 104 extracts candidate points that contribute to the signal at the target point based on the model data, and the image generation unit 106 generates an image in which the positions of the candidate points in the spatial image are displayed. Furthermore, the display mode of the display indicating the position of each candidate point is determined based on the position of the candidate point in the three-dimensional space and the shooting conditions of the spatial image.
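the three steps S101 to S103 can be sketched as a minimal pipeline (every function body below is a placeholder standing in for the corresponding unit 104 to 106, not an implementation of the apparatus):

```python
def extract_candidates(target_point_3d, object_shape):
    # Placeholder for the candidate point extraction unit 104; a real
    # implementation would use the shape (model data) of the observed object.
    return [target_point_3d]

def decide_modes(candidates, shooting_conditions):
    # Placeholder for the display mode determination unit 105.
    return {c: "first display mode" for c in candidates}

def render(candidates, modes):
    # Placeholder for the image generation unit 106: pairs of (point, mode).
    return [(c, modes[c]) for c in candidates]

def point_display_image(target_point_3d, object_shape, shooting_conditions):
    candidates = extract_candidates(target_point_3d, object_shape)   # S101
    modes = decide_modes(candidates, shooting_conditions)            # S102
    return render(candidates, modes)                                 # S103
```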
- each component of each device represents a functional unit block.
- computer-readable storage media include, for example, portable media such as optical disks, magnetic disks, magneto-optical disks, and non-volatile semiconductor memories, as well as storage devices such as ROMs (Read Only Memory) and hard disks built into computer systems.
- computer-readable storage media also include media that dynamically hold a program for a short time, such as a communication line used when a program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that temporarily hold a program, such as a volatile memory in a computer system serving as a server or a client.
- the program may be a program for realizing a part of the functions described above, or may be a program that realizes those functions in combination with a program already stored in the computer system.
- the “computer system” is a system including a computer 900 as shown in FIG. 21 as an example.
- the computer 900 includes the following configuration.
- a CPU (Central Processing Unit) 901
- a ROM (Read Only Memory) 902 and a RAM (Random Access Memory) 903
- a program 904A and storage information 904B loaded into the RAM 903
- a storage device 905 that stores the program 904A and storage information 904B
- a drive device 907 that reads / writes from / to the storage medium 906
- a communication interface 908 connected to the communication network 909
- each component of each device in each embodiment is realized when the CPU 901 loads the program 904A, which realizes the function of the component, into the RAM 903 and executes it.
- a program 904A for realizing the function of each component of each device is stored in advance in the storage device 905 or the ROM 902, for example. Then, the CPU 901 reads the program 904A as necessary.
- the storage device 905 is, for example, a hard disk.
- the program 904A may be supplied to the CPU 901 via the communication network 909, or may be stored in advance in the storage medium 906, read out to the drive device 907, and supplied to the CPU 901.
- the storage medium 906 is a portable medium such as an optical disk, a magnetic disk, a magneto-optical disk, and a nonvolatile semiconductor memory.
- each device may be realized by a possible combination of a separate computer 900 and a program for each component.
- a plurality of constituent elements included in each device may be realized by a possible combination of one computer 900 and a program.
- each device may be realized by other general-purpose or dedicated circuits, computers, or combinations thereof. These may be configured by a single chip or may be configured by a plurality of chips connected via a bus.
- when a part or all of the components of each device are realized by a plurality of computers, circuits, or the like, the plurality of computers, circuits, or the like may be arranged centrally or in a distributed manner.
- the computer, the circuit, and the like may be realized as a form in which each is connected via a communication network, such as a client and server system and a cloud computing system.
- [Appendix 1] An information processing apparatus comprising: candidate point extraction means for extracting, based on the position in three-dimensional space of a target point, which is a point specified in an intensity map of a signal from an observed object acquired by a radar, and on the shape of the observed object, a candidate point that contributes to the signal at the target point; and display mode determination means for determining, based on the position of the candidate point in the three-dimensional space and the imaging conditions of a spatial image in which the observed object is captured, a display mode of a display indicating the position of the candidate point in the spatial image.
- [Appendix 2] The information processing apparatus according to appendix 1, wherein the display mode determination means determines a different display mode depending on whether or not the candidate point is located in a region that is a blind spot when viewed from the imaging body that captured the spatial image.
- [Appendix 3] The information processing apparatus according to appendix 2, wherein the display mode determination means determines a different display mode according to the number of times a half line from the candidate point toward the imaging body intersects the surface of the subject in the spatial image.
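The display mode determination of Appendix 1 maps a candidate point's position in three-dimensional space into the spatial image using the imaging conditions. The specification gives no code; the following is a minimal sketch assuming a simple pinhole camera model (the function name, parameters, and model choice are illustrative, not taken from the specification):

```python
import numpy as np

def project_candidate_point(point_3d, camera_pos, rotation, focal_len, principal_pt):
    """Project a 3D candidate point into pixel coordinates of the spatial image.

    rotation: 3x3 world-to-camera rotation matrix (part of the imaging conditions).
    Returns (u, v) pixel coordinates, or None if the point lies behind the camera.
    """
    # Transform the point into the camera coordinate frame.
    p_cam = rotation @ (np.asarray(point_3d, float) - np.asarray(camera_pos, float))
    if p_cam[2] <= 0:  # behind the image plane: cannot be displayed
        return None
    # Perspective division onto the image plane.
    u = focal_len * p_cam[0] / p_cam[2] + principal_pt[0]
    v = focal_len * p_cam[1] / p_cam[2] + principal_pt[1]
    return (u, v)
```

The resulting (u, v) position is where a marker for the candidate point would be drawn on the spatial image.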
- [Appendix 4] The information processing apparatus according to any one of appendices 1 to 3, wherein the display mode determination means further determines the display mode of the display indicating the position of the candidate point such that the display indicates the incident direction of the electromagnetic wave from the radar that is the basis of the signal to which the candidate point contributes.
- [Appendix 5] The information processing apparatus according to any one of appendices 1 to 4, wherein the display mode determination means determines the display mode of the display such that, for at least one of the candidate points, part or all of the display indicating the position of the candidate point is a transparent color.
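The "transparent color" display of Appendix 5 can be realized by alpha-compositing the marker color over the underlying spatial image pixel, so the image remains visible through the marker. A minimal sketch, with illustrative names (`blend_marker`, `alpha`) that are not from the specification:

```python
def blend_marker(base_rgb, marker_rgb, alpha):
    """Alpha-blend a semi-transparent marker color over a background pixel.

    alpha = 0.0 leaves the background unchanged (fully transparent marker);
    alpha = 1.0 paints the marker color opaquely.
    """
    return tuple(round(alpha * m + (1.0 - alpha) * b)
                 for b, m in zip(base_rgb, marker_rgb))
```

Applying this per pixel over the marker's footprint yields a display through which the observed object stays partially visible.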
- [Appendix 6] The information processing apparatus according to any one of appendices 1 to 5, wherein the display mode determination means further determines, as the display mode of the display indicating the position of the candidate point contributing to the signal at the target point specified by a predetermined method, a display mode different from the display mode of the display indicating the positions of the other candidate points in the spatial image.
- [Appendix 8] The information processing method according to appendix 7, wherein a different display mode is determined depending on whether or not the candidate point is located in a region that is a blind spot when viewed from the imaging body that captured the spatial image.
- [Appendix 9] The information processing method according to appendix 8, wherein a different display mode is determined according to the number of times a half line from the candidate point toward the imaging body intersects the surface of the subject in the spatial image.
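The blind-spot test of Appendices 2, 3, 8, and 9 amounts to counting how many times the half line from a candidate point toward the imaging body crosses the observed object's surface; zero crossings mean the candidate point is directly visible. A sketch assuming the surface is given as a triangle mesh and using the standard Moller-Trumbore ray-triangle test (all names here are illustrative, not from the specification):

```python
import numpy as np

def ray_triangle_hits(origin, direction, triangle, eps=1e-9):
    """Return True if the half line origin + t*direction (t > 0) crosses the triangle.

    Implements the Moller-Trumbore intersection test.
    """
    v0, v1, v2 = (np.asarray(v, float) for v in triangle)
    e1, e2 = v1 - v0, v2 - v0
    d = np.asarray(direction, float)
    h = np.cross(d, e2)
    a = e1 @ h
    if abs(a) < eps:            # ray parallel to the triangle plane
        return False
    f = 1.0 / a
    s = np.asarray(origin, float) - v0
    u = f * (s @ h)
    if u < 0.0 or u > 1.0:      # outside the triangle (barycentric u)
        return False
    q = np.cross(s, e1)
    v = f * (d @ q)
    if v < 0.0 or u + v > 1.0:  # outside the triangle (barycentric v)
        return False
    return f * (e2 @ q) > eps   # intersection must lie in front of the origin

def occlusion_count(candidate_pt, camera_pos, surface_triangles):
    """Count the surface crossings of the half line from the candidate point
    toward the imaging body; 0 means the point is directly visible."""
    direction = np.asarray(camera_pos, float) - np.asarray(candidate_pt, float)
    return sum(ray_triangle_hits(candidate_pt, direction, tri)
               for tri in surface_triangles)
```

The count returned by `occlusion_count` could then select among the different display modes (for example, a distinct marker color per number of crossings).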
- [Appendix 10] The information processing method according to any one of appendices 7 to 9, wherein the display mode of the display indicating the position of the candidate point is determined such that the display further indicates the incident direction of the electromagnetic wave from the radar that is the basis of the signal to which the candidate point contributes.
- [Appendix 11] The information processing method according to any one of appendices 7 to 10, wherein the display mode of the display is determined such that, for at least one of the candidate points, part or all of the display indicating the position of the candidate point is a transparent color.
- [Appendix 12] The information processing method according to any one of appendices 7 to 11, wherein a display mode different from the display mode of the display indicating the positions of the other candidate points in the spatial image is determined as the display mode of the display indicating the position of the candidate point contributing to the signal at the target point specified by a predetermined method.
- [Appendix 14] The storage medium according to appendix 13, wherein the display mode determination process determines a different display mode depending on whether or not the candidate point is located in a region that is a blind spot when viewed from the imaging body that captured the spatial image.
- [Appendix 15] The storage medium according to appendix 14, wherein the display mode determination process determines a different display mode according to the number of times a half line from the candidate point toward the imaging body intersects the surface of the subject in the spatial image.
- [Appendix 16] The storage medium according to any one of appendices 13 to 15, wherein the display mode determination process determines the display mode of the display indicating the position of the candidate point such that the display further indicates the incident direction of the electromagnetic wave from the radar that is the basis of the signal to which the candidate point contributes.
- [Appendix 17] The storage medium according to any one of appendices 13 to 16, wherein the display mode determination process determines the display mode of the display such that, for at least one of the candidate points, part or all of the display indicating the position of the candidate point is a transparent color.
- [Appendix 18] The storage medium according to any one of appendices 13 to 17, wherein the display mode determination process further determines, as the display mode of the display indicating the position of the candidate point contributing to the signal at the target point specified by a predetermined method, a display mode different from the display mode of the display indicating the positions of the other candidate points in the spatial image.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The present invention makes it easier to understand which point on an observed object contributes to the signal at a point in a region where radar layover occurs on an intensity map of a signal from the observed object acquired by radar. An information processing device (11) according to one embodiment comprises: a candidate point extraction unit (114) for extracting, on the basis of the position in three-dimensional space of a target point specified on an intensity map (1111) of a signal from an observed object acquired by radar and of the shape of the observed object, a candidate point that contributes to the signal at the target point; a display mode determination unit (115) for determining, on the basis of the position of the candidate point in the three-dimensional space and the photographing conditions of a spatial image (1114) showing the observed object, a display mode for a display indicating the position of the candidate point on the spatial image (1114); and an image generation unit (116) for generating an image indicating the position of the candidate point on the spatial image (1114) using the determined display mode.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/016451 WO2018198212A1 (fr) | 2017-04-26 | 2017-04-26 | Dispositif de traitement d'informations, procédé de traitement d'informations et support d'informations lisible par ordinateur |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018198212A1 true WO2018198212A1 (fr) | 2018-11-01 |
Family
ID=63919532
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/016451 Ceased WO2018198212A1 (fr) | 2017-04-26 | 2017-04-26 | Dispositif de traitement d'informations, procédé de traitement d'informations et support d'informations lisible par ordinateur |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018198212A1 (fr) |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0933649 (ja) * | 1995-07-21 | 1997-02-07 | Toshiba Corp | ISAR image target identification processing device |
| JP2002267749 (ja) * | 2001-03-14 | 2002-09-18 | Mitsubishi Electric Corp | Imaging radar device |
| JP2004333445 (ja) * | 2003-05-12 | 2004-11-25 | Mitsubishi Electric Corp | Ground truth support device and ground truth support program |
| JP2008185375 (ja) * | 2007-01-29 | 2008-08-14 | Mitsubishi Electric Corp | 3D shape calculation device and distortion correction device for SAR images |
| US20120274505 (en) * | 2011-04-27 | 2012-11-01 | Lockheed Martin Corporation | Automated registration of synthetic aperture radar imagery with high resolution digital elevation models |
| JP2016090361 (ja) * | 2014-11-04 | 2016-05-23 | National Institute of Information and Communications Technology (NICT) | Method for extracting vertical structures from SAR interferograms |
| WO2016125206 (fr) * | 2015-02-06 | 2016-08-11 | Mitsubishi Electric Corporation | Synthetic aperture radar signal processing device |
| US20160259046 (en) * | 2014-04-14 | 2016-09-08 | Vricon Systems Ab | Method and system for rendering a synthetic aperture radar image |
| US20170059702 (en) * | 2015-05-07 | 2017-03-02 | Thales Holdings Uk Plc | Synthetic aperture radar |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN111436208B (zh) | | Planning method and device for surveying and mapping sampling points, control terminal and storage medium |
| KR101809067B1 (ko) | | Determination of mobile display position and orientation using micropower impulse radar |
| JP7596424B2 (ja) | | Displaying objects based on multiple models |
| CN110703805B (zh) | | Route planning method, apparatus, device, unmanned aerial vehicle and medium for surveying three-dimensional objects |
| CN102239503B (zh) | | Stereo matching processing device, stereo matching processing method, and recording medium |
| SG189284A1 (en) | | Rapid 3D modeling |
| CN112967344A (zh) | | Camera extrinsic parameter calibration method, device, storage medium and program product |
| JP2020135764A (ja) | | Three-dimensional object modeling method, three-dimensional object modeling apparatus, server, 3D model creation system, and program |
| WO2024213029A1 (fr) | | Stakeout method and apparatus, device and storage medium |
| CN113870365A (zh) | | Camera calibration method, image generation method, apparatus, device and storage medium |
| US9852542B1 (en) | | Methods and apparatus related to georeferenced pose of 3D models |
| CN115825067A (zh) | | UAV-based geological information collection method and system, and electronic device |
| US20130120373A1 (en) | | Object distribution range setting device and object distribution range setting method |
| US10930079B1 (en) | | Techniques for displaying augmentations that represent cadastral lines and other near-ground features |
| KR20180097004A (ko) | | Method for matching targets between a vehicle radar target list and a vision image |
| US12159348B2 (en) | | Image processing apparatus, image processing method, and storage medium |
| JP6741154B2 (ja) | | Information processing apparatus, information processing method, and program |
| WO2018198212A1 (fr) | | Information processing device, information processing method, and computer-readable information storage medium |
| US10460427B2 (en) | | Converting imagery and charts to polar projection |
| JP7020418B2 (ja) | | Information processing apparatus, information processing method, and program |
| KR102329031B1 (ko) | | Method for determining the maximum zoom level of a 3D modeling visualization system |
| US12085641B2 (en) | | Image processing device, image processing method, and image processing computer program |
| EP4075789A1 (fr) | | Imaging device, imaging method, and program |
| WO2007138866A1 (fr) | | Three-dimensional projection method and three-dimensional pattern display device |
| JP2017182287A (ja) | | Ledger generation device and ledger generation program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17907249; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17907249; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |