
WO2014077788A2 - A target acquisition system and method - Google Patents

A target acquisition system and method

Info

Publication number
WO2014077788A2
Authority
WO
WIPO (PCT)
Prior art keywords
target
unit
host
acquisition system
order
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/TR2013/000333
Other languages
French (fr)
Other versions
WO2014077788A3 (en)
Inventor
Özcan REMZI
Original Assignee
Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş.
Publication of WO2014077788A2
Publication of WO2014077788A3
Anticipated expiration
Legal status: Ceased

Links

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G 3/00: Aiming or laying means
    • F41G 3/02: Aiming or laying means using an independent line of sight
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G 3/00: Aiming or laying means
    • F41G 3/06: Aiming or laying means with rangefinder
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G 3/00: Aiming or laying means
    • F41G 3/22: Aiming or laying means for vehicle-borne armament, e.g. on aircraft
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00: Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12: Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00: Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/78: Direction-finders using electromagnetic waves other than radio waves
    • G01S 3/782: Systems for determining direction or deviation from predetermined direction
    • G01S 3/785: Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S 3/786: Systems in which the desired condition is maintained automatically
    • G01S 3/7864: T.V. type tracking systems


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a target acquisition system (1) and method (100) that enable the locations and distances of targets to be determined. The target acquisition system (1) according to the invention comprises: at least one imaging unit (2) which captures, and thus displays, the visible and/or infrared light waves reflected from objects within an environment; at least one informing unit (3) which presents the images sensed by the imaging unit as visual and/or audio output and allows the user to enter commands; and at least one controlling unit (4) which processes the images transmitted by the imaging unit (2) in accordance with the commands given by the user and changes the position of the host (K) in order to obtain the necessary data.

Description

DESCRIPTION
A TARGET ACQUISITION SYSTEM AND METHOD
Technical Field
The present invention relates to a target acquisition system and method that enable the determination of the locations and distances of targets.
Prior Art
It is important for law enforcement forces to be able to identify conditions that could threaten public safety and to act against them. Such conditions can be determined correctly by means of the information obtained; however, the way that information is gathered can itself endanger security under some circumstances. As a general example, using a flash to capture images in a dark environment makes others aware that images are being captured there. Similarly, illuminating an environment with visible and/or infrared light in order to capture images reveals that imaging is taking place. Likewise, active sensors that form images from signals scattered back by obstacles can also be detected easily. In other words, any system that transmits visual, audio, magnetic, thermal or similar signals is bound to be sensed, which introduces the risk that the location of the system carrying out the imaging will be determined.
In aircraft in particular, the distance to a target is determined by means of a laser beam transmitted to the target. However, since the target can easily determine the location of the aircraft transmitting the laser beam, the aircraft itself is put at risk.
Owing to these disadvantages, if such an acquisition procedure is to be carried out covertly, it must be performed passively; in other words, the party carrying out the acquisition should do so without sending any kind of signal into the environment.
The Japanese patent document JP2003325519, in the known state of the art, describes a three-dimensional video composing apparatus. As the ultrasonic waves it uses can be sensed by the target, it is not suitable for several circumstances.
Brief Description of the Invention
The aim of the invention is to develop a target acquisition system and method that enable the determination of the positions and distances of targets.
Another aim of the invention is to realize a target acquisition system and method that enable the passive determination of the positions and distances of targets.
Another aim of the invention is to realize an easily integrated target acquisition system and method.
Yet another aim of the invention is to provide a cost-effective target acquisition system and method.
Detailed Description of the Invention
The target acquisition system and method provided in order to achieve the aims of this invention are shown in the attached figures, wherein:
Figure 1 - Schematic view of the target acquisition system.
Figure 2 - Flow diagram of the target acquisition method.
The parts shown in the figures have each been numbered, and their corresponding references have been given below.
1. Target acquisition system
2. Imaging unit
3. Informing unit
4. Controlling unit
K. Host
O. Automatic pilot
The target acquisition system (1) according to the invention comprises:
- At least one imaging unit (2) which captures, and thus displays, the visible and/or infrared light waves reflected from objects within an environment,
- At least one informing unit (3) which presents the images sensed by the imaging unit as visual and/or audio output and allows the user to enter commands,
- At least one controlling unit (4) which processes the images transmitted by the imaging unit (2) in accordance with the commands given by the user and changes the position of the host (K) in order to obtain the necessary data.
According to the target acquisition system (1) subject to the present invention, the imaging unit (2) is fixed to a host (K).
According to a preferred embodiment of the present invention, the host (K) is an aircraft. In this embodiment, while the host (K) is flying, the imaging unit (2) captures images of the environment and transfers them to the informing unit (3). The user can choose any desired target from the image shown on the informing unit (3). This target can be an object in the displayed image or the whole region.
The patterns belonging to the target chosen by the user are determined by the controlling unit (4). The controlling unit (4) then tracks the target by means of these patterns. The controlling unit (4) controls the imaging unit (2) or the host (K) and ensures that the target stays within the line of vision of the imaging unit. According to an embodiment of the invention, in order to ensure that the target stays within the line of vision of the imaging unit (2) and to determine the location of the target, the controlling unit (4) causes the informing unit (3) to publish information defining in which direction the user should orient the imaging unit (2) or the host (K).
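The patent does not specify a particular pattern-tracking algorithm. As one plausible illustration, pattern determination and tracking could be done with template matching; the sketch below uses OpenCV, and the patch size and confidence threshold are assumptions, not part of the patented method.

```python
import cv2
import numpy as np

def select_pattern(frame: np.ndarray, x: int, y: int, half: int = 16) -> np.ndarray:
    """Cut a patch around the user-chosen target pixel; this patch plays
    the role of the 'patterns belonging to the target'."""
    return frame[y - half:y + half, x - half:x + half].copy()

def track(frame: np.ndarray, pattern: np.ndarray):
    """Locate the pattern in the current frame. Returns the top-left corner
    of the best match, or None when the match score is too weak, which
    would correspond to the target leaving the line of vision."""
    scores = cv2.matchTemplate(frame, pattern, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    if max_val < 0.6:   # arbitrary confidence threshold (assumption)
        return None
    return max_loc
```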
The published information can be written, symbolic, audio, etc., but is not limited to these examples. By directing the host (K) or the imaging unit (2) according to this information, the user ensures that the target stays within the line of vision of the imaging unit (2) and obtains, from suitable positions, the images necessary to determine the location and distance of the target. In this application, if the target leaves the line of vision of the imaging unit (2), the information about the manoeuvres carried out by the host (K) after the target left the line of vision is used, and the new location of the target relative to the line of vision is calculated. The manoeuvres needed to bring the target back into the line of vision of the imaging unit (2) are then communicated to the user by the informing unit (3), enabling the user to orient the host (K) or the imaging unit. According to another embodiment of the invention, the controlling unit (4) orients the host (K) by means of an automatic pilot (O) in order to determine the location of the target and to keep the target within the line of vision of the imaging unit (2). Thus, the images necessary to determine the location of the target can be captured from suitable positions.
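The document states that manoeuvre information is used to compute where the target now lies relative to the line of sight, but gives no formula. One plausible reading is to re-express the last known line-of-sight direction in the host's new body frame using the recorded attitude change; the sketch below does exactly that and, as an assumption, neglects the host's translation, which is reasonable only for distant targets.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def relocate_target(los_body: np.ndarray,
                    att_then: Rotation,
                    att_now: Rotation) -> np.ndarray:
    """Re-express the last line-of-sight unit vector (host body frame)
    in the current body frame after the host has manoeuvred.
    att_then, att_now : host attitude (body-to-world) at the two instants.
    Host translation is ignored (assumption: target is far away)."""
    los_world = att_then.apply(los_body)    # direction is fixed in the world frame
    return att_now.inv().apply(los_world)   # same direction seen from the new attitude
```

The returned vector tells which way the imaging unit (2) or host (K) must be turned to bring the target back into view, which is the content of the guidance the informing unit (3) would publish.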
The precision of the target location and distance determined from the images captured by the imaging unit (2) increases as the number of captured images increases. When the number of captured images is sufficient to calculate the position and distance of the target, the controlling unit (4) submits to the user, by means of the informing unit (3), the location and distance information of the target together with the possible error margin of that information. If the user finds the precision of the information insufficient, images of the target continue to be captured. The procedures for bringing the host (K) to a suitable position from which to capture an image of the target are carried out as explained above.
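Passively estimating range from several images taken at known host positions amounts to intersecting bearing rays, and more rays generally tighten the estimate, matching the precision claim above. A standard least-squares ray intersection (a textbook technique, not taken from the patent) looks like this:

```python
import numpy as np

def triangulate(positions, bearings):
    """Least-squares intersection of bearing rays.
    positions : (N, 3) host positions where images were taken
    bearings  : (N, 3) line-of-sight vectors toward the target
    Returns the 3-D point minimising the summed squared distance to all
    rays. Assumes the rays are not all parallel (A would be singular)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(np.asarray(positions, float), np.asarray(bearings, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

The residual distances from the solved point to each ray provide a natural error margin of the kind the informing unit (3) reports alongside the location and distance data.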
The target acquisition method (100) according to the invention comprises the steps below:
- Determination of a target by the user by means of the images transferred from the informing unit (3) (101),
- Determination by the controlling unit (4) of the patterns belonging to the target chosen by the user (102),
- Tracking of the target by the controlling unit (4) by following these patterns (103),
- Determining whether the target has stayed within the line of vision of the imaging unit (2) (104),
- If the target is out of the line of vision of the imaging unit (2), ensuring that the host (K) or the imaging unit (2) is oriented so that the target comes back into the line of vision of the imaging unit (2), and returning to step 104 (105),
- If the target is within the line of vision of the imaging unit (2), ensuring that the host (K) is directed to a position from which the position of the target can be determined (106),
- Displaying the target (107),
- Calculating the location and distance data of the target by using the images (108),
- Submitting the calculated location and distance data, together with their precision data, to the user via the informing unit (3) (109),
- Determining whether the precision of the location and distance data is high enough (110),
- If the precision is not high enough, going back to step 106 in order to re-capture images of the target.
According to the target acquisition method (100) subject to the invention, the user determines a target by means of the images submitted by the informing unit (3) (101). The target can be selected by touching it on the informing unit (3) or by any other data entry method. Following the determination of the target, the controlling unit (4) determines the patterns that belong to the target chosen by the user (102); thus, even if the location of the host (K) and/or the target changes, the target can be followed. After the patterns are determined, the controlling unit (4) tracks the target by using these patterns (103). The controlling unit (4) then determines whether the target remains within the line of vision of the imaging unit (2) (104). If the target is out of the line of vision of the imaging unit (2), the controlling unit (4) ensures that the host (K) or the imaging unit (2) is oriented so that the target comes back into the line of vision of the imaging unit (2) (105). If the target is within the line of vision of the imaging unit (2), the controlling unit (4) ensures that the host (K) is oriented so that it can capture an image from which the location of the target can be determined (106). When the host (K) reaches a suitable position, the target is displayed and tracked by the imaging unit (2) (107). The controlling unit (4) then calculates the location and distance data by using the images captured by the imaging unit (2) (108). The precision data, together with the location and distance data calculated by the controlling unit (4), are submitted to the user by means of the informing unit (3) (109). The user decides whether the precision of these data is high enough for the intended use (110). If it is not, the procedure is repeated from step 106 in order to re-capture images of the target.
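Read as software, steps 101 through 110 form a simple control loop. The sketch below is a structural illustration only; `ui`, `camera` and `controller`, and all of their methods, are hypothetical placeholders rather than the patented implementation.

```python
def acquire_target(ui, camera, controller):
    """Structural sketch of method steps 101-110 (all names are assumptions)."""
    target = ui.pick_target(camera.frame())               # step 101: user selects target
    pattern = controller.learn_pattern(target)            # step 102: learn its patterns
    while True:
        pos = controller.track(camera.frame(), pattern)   # step 103: pattern tracking
        if pos is None:                                   # step 104: still in view?
            controller.steer_back_into_view()             # step 105: re-orient host/imager
            continue
        controller.steer_to_vantage_point()               # step 106: move to a good position
        images = camera.capture_sequence()                # step 107: image the target
        loc, dist, err = controller.solve(images)         # step 108: location/distance
        ui.report(loc, dist, err)                         # step 109: show data + error margin
        if ui.precision_ok():                             # step 110: user accepts precision?
            return loc, dist                              # otherwise loop back to step 106
```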
According to a preferred embodiment of the present invention, in the step of orienting the host (K) so that it can capture an image to determine the location of the target (106), the shifting and rotating procedures used to bring the host (K) to a suitable position are applied either independently or in combination.
According to a preferred embodiment of the present invention, in the step of orienting the host (K) so that it can capture an image to determine the location of the target (106), information explaining how the user has to move the host (K) or the imaging unit (2) in order to bring the host (K) to the correct position is submitted to the user by the informing unit (3). In this embodiment, if the target drifts out of the line of vision of the imaging unit (2), the information explaining how the user should move the host (K) or the imaging unit (2) in order to bring the target back into the line of vision of the imaging unit (2) is submitted to the user by means of the informing unit (3). Visual and/or audio warning signals are also transmitted by the informing unit (3) so that the user is informed at the moment the host (K) or the imaging unit (2) reaches a position suitable for capturing an image.
According to another embodiment of the present invention, in the step of orienting the host (K) so that it can capture an image to determine the location of the target (106), the manoeuvres necessary to bring the host (K) to the right position are submitted to the automatic pilot (O), and the host (K) is directed by the automatic pilot. In this embodiment, if the target goes out of the line of vision of the imaging unit (2), the information explaining how the automatic pilot (O) should move the host (K) in order to bring the target back into the line of vision of the imaging unit (2) is submitted to the automatic pilot (O) by the controlling unit.
In an embodiment of the invention, the target acquisition system (1) subject to the present invention has at least two imaging units (2). In this case, an image taken from each imaging unit (2) is used to calculate the location and distance information of the target. When more than one imaging unit (2) is used on the host (K), the imaging units (2) are preferably placed on the host (K) at the farthest possible distance from each other. The distance between the imaging units and the angular offset of the imaging units (2) relative to the axis of the host (K) are fed to the controlling unit (4) as parameters to be used in calculating the location and distance information of the target. According to an embodiment of the invention, the area viewed by the imaging unit (2) can be a whole region; in this case, the procedures carried out to obtain the location and distance information of the target are performed in order to prepare a depth map of the region. In an embodiment of the invention, the depth map of a certain region and the location and distance information of a target located within that region are obtained together.
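With two imaging units (2) separated by a known baseline, range can be recovered from a single simultaneous image pair, and a wider baseline gives better precision, which is why the units are placed as far apart as possible. The patent only names the baseline and angular offsets as parameters; the classic rectified pinhole-stereo relation below is a standard assumption, not a quote from the document.

```python
def stereo_range(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Classic rectified-stereo range: Z = f * B / d.
    baseline_m   : distance between the two imaging units on the host
    focal_px     : focal length expressed in pixels
    disparity_px : horizontal pixel shift of the target between the two images"""
    if disparity_px <= 0:
        raise ValueError("target at infinity or mismatched correspondence")
    return focal_px * baseline_m / disparity_px
```

The angular offsets of each unit relative to the host axis, which the controlling unit (4) takes as parameters, would enter as a rectification step applied before the disparity is measured.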
According to a preferred embodiment of the invention, the target whose location and distance information has been calculated and/or whose depth map has been prepared is shown on a three-dimensional informing unit (3).
Thus the user can obtain a visual perception of the depth of the displayed target. In another embodiment of the present invention, the target whose depth map has been prepared and/or whose location and distance information has been calculated can be viewed on a two-dimensional informing unit (3). According to this embodiment, in order to emphasize the depth information belonging to the target, the images shown on the informing unit (3) are shaded, coloured and/or cross-hatched according to the depth of the target. According to the preferred embodiment, the target whose location and distance are being calculated is shown on the informing unit (3) in a different colour, shade and/or cross-hatch, so that the user can distinguish the targeted object from other objects. According to a preferred embodiment of the present invention, the regions for which a depth map cannot be obtained are shown with a different colour, shade and/or cross-hatch. By this means the user can see whether depth information for a region could be obtained.
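Depth-based shading on a two-dimensional informing unit (3) can be done with a simple normalisation, painting regions with no recovered depth in a distinct colour as the text describes. A minimal sketch, assuming the depth map is a NumPy array with NaN marking unrecoverable regions:

```python
import numpy as np

def colourise_depth(depth: np.ndarray) -> np.ndarray:
    """Map a depth map to an 8-bit image: valid pixels become grey levels
    scaled by depth; regions with no depth estimate (NaN) are painted red
    so the user can tell them apart."""
    img = np.zeros(depth.shape + (3,), dtype=np.uint8)
    valid = ~np.isnan(depth)
    if valid.any():
        d = depth[valid]
        span = max(float(np.ptp(d)), 1e-9)            # avoid divide-by-zero on flat maps
        grey = (255 * (d - d.min()) / span).astype(np.uint8)
        img[valid] = grey[:, None]                    # near/far rendered dark/light
    img[~valid] = (255, 0, 0)                         # no-depth regions in a distinct colour
    return img
```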
In a preferred embodiment of the present invention, the imaging unit (2) has an optical zoom feature in order to increase the precision of the location, distance and depth information. By this means, the details of the target whose information is being obtained can be captured more precisely, and the target is isolated from objects that are not of interest.
Various embodiments of the target acquisition system (1) and method (100) subject to the present invention can be developed; the invention cannot be limited to the examples given herein and is essentially as described in the claims.

Claims

1. A target acquisition system (1) basically comprising:
at least one imaging unit (2) which captures, and thus displays, the visible and/or infrared light waves reflected from objects within an environment,
at least one informing unit (3) which presents the images sensed by the imaging unit as visual and/or audio output and allows the user to enter commands, and
at least one controlling unit (4) which processes the images transmitted by the imaging unit (2) in accordance with the commands given by the user,
characterized in that said at least one controlling unit (4) changes the location of the host (K) in order to obtain the necessary data.
2. A target acquisition system (1) according to claim 1, characterized in that it comprises a controlling unit (4) which determines the patterns belonging to a target chosen by the user from an image shown on the informing unit (3).
3. A target acquisition system (1) according to claim 2, characterized in that it comprises a controlling unit (4) which tracks a target by using the patterns that belong to that target.
4. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises a controlling unit (4) which, by controlling the imaging unit (2) or the host (K), ensures that the tracked target stays within the line of vision of the imaging unit (2).
5. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises a controlling unit (4) which causes the informing unit (3) to publish information defining in which direction the user should orient the host (K) or the imaging unit (2), in order to determine the location of the target and to ensure that the target stays within the line of vision of the imaging unit (2).
6. A target acquisition system (1) according to claim 5, characterized in that it comprises an informing unit (3) which publishes information in writing, in symbols, as audio, etc.
7. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises a controlling unit (4) which, in the case that the target goes out of the line of vision of the imaging unit (2), calculates where the target is positioned by using the information belonging to the manoeuvres realized by the host (K) after the target left the line of vision.
8. A target acquisition system (1) according to claim 7, characterized in that it comprises a controlling unit (4) which enables the user to orient the imaging unit (2) or the host (K) by informing the user, via the informing unit (3), of the manoeuvres that need to be carried out in order to bring the target back into the line of vision of the imaging unit (2).
9. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises an informing unit (3) which submits a visual and/or audio signal in order to inform the user when the host (K) or the imaging unit (2) has been brought to a suitable image-capturing position.
10. A target acquisition system (1) according to claims 1-4, characterized in that it comprises a controlling unit (4) which calculates where the target is located relative to the line of vision, by using the information belonging to the manoeuvres realized by the host (K) after the target left the line of vision of the imaging unit (2).
11. A target acquisition system (1) according to claim 10, characterized in that it comprises a controlling unit (4) which orients the host (K) by means of an automatic pilot (O) in order to keep the target within the line of vision of the imaging unit (2) and to determine the location of the target.
12. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises at least two imaging units (2), preferably positioned on the host (K) at the farthest possible distance from each other.
13. A target acquisition system (1) according to claim 12, characterized in that it comprises a controlling unit (4) which takes as parameters information such as the angular offset of the imaging units (2) relative to the axis of the host (K) and the distance between the imaging units, to be used in calculating the location and distance information of the target.
14. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises a three-dimensional informing unit (3) so that the user can obtain a visual perception of the target whose depth map has been prepared and/or whose location and distance information has been calculated.
15. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises an informing unit (3) wherein the displayed images are shaded, coloured and/or cross-hatched according to the depth of the target in order to emphasize the depth information of the target.
16. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises an informing unit (3) wherein the target whose location and distance are being calculated is shown with a different shade, colour and/or cross-hatch.
17. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises an informing unit (3) wherein the areas for which a depth map cannot be obtained are shown with a different colour, shade and/or cross-hatch.
18. A target acquisition system (1) according to any of the preceding claims, characterized in that it comprises an imaging unit (2) having an optical zoom feature in order to increase the precision of the location, distance and depth information.
19. A target acquisition method (100) characterized in that it comprises the following steps:
- Determination of a target by the user by means of the images transferred from the informing unit (3) (101),
- Determination by the controlling unit (4) of the patterns belonging to the target chosen by the user (102),
- Tracking of the target by the controlling unit (4) by following these patterns (103),
- Determining whether the target has stayed within the line of vision of the imaging unit (2) (104),
- If the target is out of the line of vision of the imaging unit (2), ensuring that the host (K) or the imaging unit (2) is oriented so that the target comes back into the line of vision of the imaging unit (2), and returning to step 104 (105),
- If the target is within the line of vision of the imaging unit (2), ensuring that the host (K) is directed to a position from which the position of the target can be determined (106),
- Displaying the target (107),
- Calculating the location and distance data of the target by using the images (108),
- Submitting the calculated location and distance data, together with their precision data, to the user via the informing unit (3) (109),
- Determining whether the precision of the location and distance data is high enough (110),
- If the precision is not high enough, going back to step 106 in order to re-capture images of the target.
20. A target acquisition method (100) according to claim 19, characterized in that, within the step of orienting the host (K) to a location at which an image can be captured in order to determine the position of the target (106), it comprises applying the shifting and rotating procedures for the host (K), independently or in combination, in order to bring the host (K) to a suitable position.
21. A target acquisition method (100) according to claim 19 or 20, characterized in that, within the step of orienting the host (K) to a location at which an image can be captured in order to determine the position of the target (106), it comprises transmitting to the user, via the informing unit (3), the information which explains how the host (K) or the imaging unit (2) should be moved in order to bring the host (K) to a suitable position.
22. A target acquisition method (100) according to claim 19, characterized in that, within the step of orienting the host (K) to a location at which an image can be captured in order to determine the position of the target (106), it comprises transmitting the manoeuvres necessary to bring the host (K) to a suitable position to the automatic pilot (O), thus enabling the host (K) to be oriented by the automatic pilot.
23. A target acquisition method (100) according to claims 19-22, characterized in that it enables obtaining the depth map of a certain region and/or the location and distance information of a target located within said region.
PCT/TR2013/000333 2012-11-16 2013-11-01 A target acquisition system and method Ceased WO2014077788A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR201213262 2012-11-16
TR2012/13262 2012-11-16

Publications (2)

Publication Number Publication Date
WO2014077788A2 true WO2014077788A2 (en) 2014-05-22
WO2014077788A3 WO2014077788A3 (en) 2014-09-04

Family

ID=49753448

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2013/000333 Ceased WO2014077788A2 (en) 2012-11-16 2013-11-01 A target acquisition system and method

Country Status (1)

Country Link
WO (1) WO2014077788A2 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011159288A2 (en) * 2010-06-15 2011-12-22 Flir Systems, Inc. Gimbal positioning with target velocity compensation

Also Published As

Publication number Publication date
WO2014077788A3 (en) 2014-09-04

Similar Documents

Publication Publication Date Title
EP1892149B1 (en) Method for imaging the surrounding of a vehicle and system therefor
JP7157054B2 (en) Vehicle navigation based on aligned images and LIDAR information
CN107000749B (en) Vehicle travel control device and travel control method
US9057609B2 (en) Ground-based camera surveying and guiding method for aircraft landing and unmanned aerial vehicle recovery
JP3345113B2 (en) Target object recognition method and target identification method
US8049658B1 (en) Determination of the three-dimensional location of a target viewed by a camera
US20080180351A1 (en) Vehicle display system and method with enhanced vision system and synthetic vision system image display
CN109017570A (en) Vehicle periphery scene rendering method and device, vehicle
KR102077597B1 (en) Anti-air tracking device and operation method of the same
WO2011104984A1 (en) Vehicle surroundings monitoring device
US10351241B2 (en) Device and method for an unmanned flying object
CN114930112B (en) Intelligent system for controlling combat vehicle turret functions
CN104166137A (en) Target comprehensive parameter tracking measurement method based on display of radar warning situation map
CN105676884A (en) Infrared thermal imaging searching/ tracking/ aiming device and method
US20150378000A1 (en) Delay compensation while controlling a remote sensor
US20100045449A1 (en) Method for detecting a traffic space
US11874379B2 (en) Time-resolved contrast imaging for lidar
CN114084129A (en) Fusion-based vehicle automatic driving control method and system
US9001186B2 (en) Method and device for combining at least two images to form a panoramic image
CN110210527A (en) Maritime Law Enforcement reconnaissance system based on machine vision joint perception
EP3865809B1 (en) Apparatus and method to improve a situational awareness of a pilot or driver
WO2014077788A2 (en) A target acquisition system and method
US8699781B1 (en) Embedded symbology for use with scene imaging system
DE102011100269A1 (en) Electro-optical fire control unit for a gun
CN111999882A (en) Large-view-field long-wave infrared cloud-penetrating early warning method attached to tracking telescope

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13802748

Country of ref document: EP

Kind code of ref document: A2

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase

Ref document number: 13802748

Country of ref document: EP

Kind code of ref document: A2