CN117690040A - A target scene determination method, device and equipment - Google Patents
A target scene determination method, device and equipment
- Publication number
- CN117690040A (publication) / CN202410130192.1A (application)
- Authority
- CN
- China
- Prior art keywords
- observation
- remote sensor
- target
- scene
- time window
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Astronomy & Astrophysics (AREA)
- Remote Sensing (AREA)
- Image Processing (AREA)
Abstract
Description
Technical field
This application relates to the field of remote sensing technology, and in particular to a target scene determination method, device, and equipment.
Background
Monitoring of strip-shaped areas requires coverage of a large region, while the image swath of a remote sensor is limited, so each observation activity can only cover part of the regional target and the remote sensor must perform multiple observation activities to cover the regional target completely. To meet this need, a decomposition method for strip-shaped area monitoring tasks is studied, which comprehensively considers factors such as the satellite orbit, the observation range of the remote sensor, the imaging mode of the remote sensor, the visible time window, and the geographical location of the regional target, so as to segment strip-shaped regional targets reasonably and effectively and provide input for collaborative satellite planning.
However, in the scene determination process of current collaborative satellite planning, factors such as satellite trajectory characteristics, the observation range of the remote sensor, and the geographical location of the regional target often lead to extreme situations in which the target is segmented unreasonably and visible time windows are wasted.
Summary of the invention
Based on this, in view of the above technical problems, it is necessary to provide a target scene determination method, device, and equipment that can improve the utilization of visible time windows.
In a first aspect, this application provides a target scene determination method, which includes:
determining the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target, where each remote sensor has its own remote sensor parameters;
segmenting the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes; and
determining the target scene from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window.
In one embodiment, determining the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target includes:
selecting remote sensors based on each regional target, and determining the visible time window between each remote sensor and the corresponding regional target; and
using specific software to determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target.
In one embodiment, segmenting the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes includes:
selecting an unprocessed visible time window;
determining the observation range of the remote sensor according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to the remote sensor associated with the unprocessed visible time window; and
segmenting the regional target within the observation range according to a preselected regional target segmentation mode to obtain candidate observation scenes.
In one embodiment, segmenting the regional target within the observation range according to the preselected regional target segmentation mode to obtain candidate observation scenes includes:
when the preselected regional target segmentation mode is the strip segmentation mode, segmenting the regional target within the observation range according to the strip segmentation mode to obtain candidate observation scenes.
In one embodiment, when the preselected regional target segmentation mode is the strip segmentation mode, segmenting the regional target within the observation range according to the strip segmentation mode to obtain candidate observation scenes includes:
when the preselected regional target segmentation mode is the strip segmentation mode, dividing the regional target into a set of mutually parallel strips; and
when the remote sensor images in single-scene mode, dividing the strips within the observation range into single scenes to obtain candidate observation scenes.
In one embodiment, determining the target scene from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window includes:
selecting an unvisited candidate observation scene as the current observation scene;
calculating the continuous observation time needed for the remote sensor to observe its observation range according to the flight speed and observation range of the remote sensor; and
determining whether the current observation scene is the target scene according to the continuous observation time and the corresponding visible time window.
In one embodiment, determining whether the current observation scene is the target scene according to the continuous observation time and the corresponding visible time window includes:
judging whether the continuous observation time matches the corresponding visible time window; and
if so, determining that the current observation scene is the target scene.
In a second aspect, this application further provides a target scene determination device, which includes:
an equation determination module, configured to determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target, where each remote sensor has its own remote sensor parameters;
a segmentation module, configured to segment the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes; and
a scene determination module, configured to determine the target scene from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window.
In a third aspect, this application further provides a computer device. The computer device includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above target scene determination method when executing the computer program.
In a fourth aspect, this application further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the steps of the above target scene determination method are implemented.
In the above target scene determination method, the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target is determined; the regional target is then segmented according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes; and the target scene is determined from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window. Compared with prior-art methods, which often lead to extreme situations in which the target is segmented unreasonably and visible time windows are wasted, the target scene determination method of this application segments the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor. Because both the sub-satellite point trajectory equation and the remote sensor parameters are obtained from the corresponding remote sensor, they have high credibility, so segmenting the regional target according to them is more reasonable and the utilization of visible time windows is improved.
Description of the drawings
Figure 1 is a schematic flowchart of a target scene determination method in an embodiment;
Figure 2 is a schematic flowchart of determining the sub-satellite point trajectory equation in an embodiment;
Figure 3 is a schematic flowchart of determining candidate observation scenes in an embodiment;
Figure 4 is a schematic flowchart of determining the target scene in an embodiment;
Figure 5 is a structural block diagram of a target scene determination device in an embodiment;
Figure 6 is an internal structure diagram of a computer device in an embodiment.
Detailed description of the embodiments
In order to make the purpose, technical solutions, and advantages of this application clearer, this application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain this application and are not intended to limit it.
Monitoring of strip-shaped areas requires coverage of a large region, while the image swath of a remote sensor is limited, so each observation activity can only cover part of the regional target and the remote sensor must perform multiple observation activities to cover the regional target completely. To meet this need, a decomposition method for strip-shaped area monitoring tasks is studied, which comprehensively considers factors such as the satellite orbit, the observation range of the remote sensor, the imaging mode of the remote sensor, the visible time window, and the geographical location of the regional target, so as to segment strip-shaped regional targets reasonably and effectively and provide input for collaborative satellite planning. However, in the scene determination process of current collaborative satellite planning, factors such as satellite trajectory characteristics, the observation range of the remote sensor, and the geographical location of the regional target often lead to extreme situations in which the target is segmented unreasonably and visible time windows are wasted. Based on this, embodiments of this application provide a target scene determination method to address the above technical problems.
In one embodiment, Figure 1 shows a target scene determination method provided according to an embodiment of this application. The method is described as being applied to a server and includes the following steps:
S101: determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target.
Optionally, each remote sensor has its own remote sensor parameters. The visible time window may be the time window between a remote sensor and the entire regional target; the sub-satellite point trajectory equation may describe the curve formed by all sub-satellite points as the satellite moves around the Earth during that time window.
Specifically, the satellite orbit model built into professional software, or an existing domestic satellite orbit model, is used to determine the sub-satellite point trajectory equation within each visible time window.
It should be noted that, according to the user requirements for image type, ground resolution, remote sensing resource preferences, and other aspects, appropriate satellites and remote sensors are allocated, and the set of visible time windows in which the remote sensors can observe the regional target is calculated.
S102: segment the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes.
Optionally, when the strip-shaped area to be monitored is large, the image swath of the remote sensor is limited, each observation activity can only cover part of the regional target, and the remote sensor needs to perform multiple observation activities to cover the regional target completely. In this case, the strip-shaped regional target needs to be segmented so that the regional target can be covered completely.
S103: determine the target scene from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window.
Optionally, the continuous observation time required to observe each candidate observation scene and the corresponding visible time window are determined separately, and the similarity between the continuous observation time and the corresponding visible time window is compared. When the similarity exceeds a preset similarity threshold, the continuous observation time and the corresponding visible time window are determined to match, and the current candidate observation scene can be determined as the target scene; otherwise, the current candidate observation scene is deleted from the candidate observation scenes.
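The similarity measure and the threshold are not specified in this application, so the following Python snippet is only a minimal sketch of this matching step: it assumes the similarity is taken as the ratio of the required continuous observation time to the length of the visible time window, and the function and parameter names are hypothetical.

```python
def is_target_scene(observe_duration_s, window_start_s, window_end_s,
                    similarity_threshold=0.8):
    """Decide whether a candidate observation scene is kept as a target scene.

    The scene is kept only when the required continuous observation time
    "matches" its visible time window; here the similarity is assumed to be
    the ratio of the required duration to the window length.
    """
    window_length_s = window_end_s - window_start_s
    if window_length_s <= 0 or observe_duration_s > window_length_s:
        return False  # the window cannot accommodate the observation at all
    similarity = observe_duration_s / window_length_s
    return similarity >= similarity_threshold


# A 42 s observation fills most of a 50 s window and is kept, while the same
# observation in a 300 s window would leave most of that window unused.
print(is_target_scene(42.0, 0.0, 50.0))   # True
print(is_target_scene(42.0, 0.0, 300.0))  # False
```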
In the above target scene determination method, the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target is determined; the regional target is then segmented according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes; and the target scene is determined from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window. Compared with prior-art methods, which often lead to extreme situations in which the target is segmented unreasonably and visible time windows are wasted, the target scene determination method of this application segments the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor. Because both the sub-satellite point trajectory equation and the remote sensor parameters are obtained from the corresponding remote sensor, they have high credibility, so segmenting the regional target according to them is more reasonable and the utilization of visible time windows is improved.
On the basis of the above embodiment, the step of determining the sub-satellite point trajectory equation is decomposed and refined through Figure 2. As shown in Figure 2, it includes the following implementation process:
S201: select remote sensors based on each regional target, and determine the visible time window between each remote sensor and the corresponding regional target.
Optionally, after each regional target has been determined, the remote sensor for each regional target is selected based on the principle of the shortest distance between the regional target and the remote sensor, and the visible time window between each remote sensor and its corresponding regional target is determined.
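As a minimal sketch of this shortest-distance selection rule, the snippet below assumes each regional target is summarized by a centroid and each remote sensor by a representative sub-satellite ground position, both expressed in the same plane coordinate system; the names and the plain Euclidean distance are illustrative assumptions rather than part of this application.

```python
import math

def select_sensor_for_target(target_centroid, sensor_positions):
    """Pick the remote sensor whose representative sub-satellite ground position
    is closest to the centroid of the regional target (all (x, y) plane coords).
    """
    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    return min(sensor_positions,
               key=lambda name: distance(target_centroid, sensor_positions[name]))


sensors = {"sensor_A": (120.0, 30.0), "sensor_B": (95.0, 42.0)}
print(select_sensor_for_target((118.5, 31.2), sensors))  # sensor_A
```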
S202: use specific software to determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target.
Optionally, software such as STK (Satellite Tool Kit) is used to determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target.
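The application does not describe how the trajectory equation is extracted from the orbit tool, so the following Python sketch only illustrates one plausible route: it assumes the sub-satellite points inside one visible time window have been sampled and exported (for example from STK) as (x, y) positions in the plane coordinate system, and that the local trajectory is well approximated by a straight line y = kx + b, as in the embodiment described below.

```python
import numpy as np

def fit_subsatellite_track(xs, ys):
    """Fit the local sub-satellite point trajectory y = k*x + b from points
    sampled within one visible time window; return (k, b, x_start, x_end).
    """
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    k, b = np.polyfit(xs, ys, deg=1)  # least-squares straight-line fit
    return k, b, float(xs.min()), float(xs.max())


# Example with a short, nearly linear track sampled at a few epochs.
k, b, x_s, x_e = fit_subsatellite_track(
    xs=[100.0, 101.0, 102.0, 103.0],
    ys=[40.0, 39.2, 38.4, 37.6],
)
print(f"y = {k:.2f} * x + {b:.2f} on [{x_s}, {x_e}]")
```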
It can be understood that this embodiment provides one possible implementation for determining the sub-satellite point trajectory equation, which lays the foundation for the subsequent determination of the target scene.
On the basis of the above embodiments, the step of determining candidate observation scenes is decomposed and refined through Figure 3. As shown in Figure 3, it includes the following implementation process:
S301: select an unprocessed visible time window.
Optionally, assume that within a given time range (generally 24 hours) imaging reconnaissance satellite remote sensors need to be scheduled to observe multiple regional targets, and that the range definition of every regional target in its own plane coordinate system, the image type requirements, the remote sensor parameters, the lighting condition requirements, and other information are known. Let the set of visible time windows between any remote sensor and a regional target be W = {w_1, w_2, ..., w_K}, where K is the total number of visible time windows between that remote sensor and the regional target. For any time window in W, the local trajectory equation of the satellite platform carrying the remote sensor is y = kx + b, x ∈ [x_s, x_e], where x_s and x_e are the abscissas of the start point and end point of the trajectory line, respectively.
Optionally, an unprocessed time window is selected from the corresponding set W of visible time windows.
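For concreteness, the bookkeeping described above can be sketched with a simple data structure; the field names are illustrative assumptions, and the trajectory is stored directly as the parameters k and b together with the abscissa interval [x_s, x_e].

```python
from dataclasses import dataclass

@dataclass
class VisibleTimeWindow:
    sensor_id: str
    target_id: str
    start_s: float    # window start, seconds from the scenario epoch
    end_s: float      # window end
    k: float          # slope of the local sub-satellite track y = k*x + b
    b: float          # intercept of the local track
    x_s: float        # abscissa of the track start point
    x_e: float        # abscissa of the track end point
    processed: bool = False

def next_unprocessed(windows):
    """Return the first visible time window that has not been processed yet."""
    for w in windows:
        if not w.processed:
            return w
    return None
```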
S302: determine the observation range of the remote sensor according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to the remote sensor associated with the unprocessed visible time window.
Based on the corresponding sub-satellite point trajectory equation and the parameters of the remote sensor, the observation range of the remote sensor in the plane rectangular coordinate system is calculated, where the observation range of the remote sensor refers to the area that can be observed, while still meeting the spatial resolution requirements, by adjusting the roll (side-sway) angle and the forward and backward pitch angles.
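A minimal sketch of this step is given below. It uses a flat-ground approximation in which the reachable cross-track half-width is h·tan(maximum roll angle) plus half the swath width, and represents the observation range as a quadrilateral obtained by offsetting the fitted track perpendicular to its direction; the formula and all parameter names are illustrative assumptions rather than the exact computation used in this application.

```python
import math

def observation_range(k, b, x_s, x_e, altitude_km, max_roll_deg, swath_km):
    """Approximate the observation range in the plane coordinate system as a
    quadrilateral: the sub-satellite track y = k*x + b between x_s and x_e,
    widened on both sides by the reachable cross-track half-width.
    """
    half_width = altitude_km * math.tan(math.radians(max_roll_deg)) + swath_km / 2.0
    norm = math.hypot(1.0, k)          # length of the track direction (1, k)
    nx, ny = -k / norm, 1.0 / norm     # unit vector perpendicular to the track
    p1 = (x_s, k * x_s + b)
    p2 = (x_e, k * x_e + b)
    return [
        (p1[0] + nx * half_width, p1[1] + ny * half_width),
        (p2[0] + nx * half_width, p2[1] + ny * half_width),
        (p2[0] - nx * half_width, p2[1] - ny * half_width),
        (p1[0] - nx * half_width, p1[1] - ny * half_width),
    ]
```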
S303: segment the regional target within the observation range according to the preselected regional target segmentation mode to obtain candidate observation scenes.
Optionally, the regional target segmentation mode preselected by the user is judged, and after the preselected mode has been identified, candidate observation scenes are determined case by case.
Specifically, when the preselected regional target segmentation mode is the strip segmentation mode, the regional target is divided into a set of mutually parallel strips, and when the remote sensor images in single-scene mode, the strips within the observation range are divided into single scenes to obtain candidate observation scenes; when the preselected regional target segmentation mode is the manual intervention mode, candidate observation scenes are screened from the user's predefined scene reference library according to the observation range of the remote sensor.
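As an illustration of the strip segmentation mode followed by single-scene cutting, the sketch below assumes the regional target has already been reduced to an axis-aligned bounding box in a plane coordinate system whose x-axis lies along the flight direction; the parameter names and the rectangular scenes are simplifying assumptions.

```python
def split_into_scenes(target_bbox, strip_width_km, scene_length_km):
    """Split a regional target into parallel strips of one swath width and then
    cut each strip into single scenes of one scene length along the x-axis.

    target_bbox is (x_min, y_min, x_max, y_max) with the x-axis assumed to be
    the flight direction, so strips are parallel to it.
    """
    x_min, y_min, x_max, y_max = target_bbox
    scenes = []
    y = y_min
    while y < y_max:                      # one strip per swath width
        y_top = min(y + strip_width_km, y_max)
        x = x_min
        while x < x_max:                  # one single scene per scene length
            x_right = min(x + scene_length_km, x_max)
            scenes.append((x, y, x_right, y_top))
            x = x_right
        y = y_top
    return scenes


# Example: a 100 km x 45 km strip-shaped target, 20 km swath, 30 km scenes.
for scene in split_into_scenes((0.0, 0.0, 100.0, 45.0), 20.0, 30.0):
    print(scene)
```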
It can be understood that this embodiment provides one possible implementation for determining candidate observation scenes, which lays the foundation for the subsequent determination of the target scene.
On the basis of the above embodiments, the step of determining the target scene is decomposed and refined through Figure 4. As shown in Figure 4, it includes the following implementation process:
S401: select an unvisited candidate observation scene as the current observation scene.
S402: calculate the continuous observation time needed for the remote sensor to observe its observation range according to the flight speed and observation range of the remote sensor.
Optionally, an unvisited candidate observation scene is selected from the candidate observation scenes as the current observation scene, and the continuous observation time needed for the remote sensing resource to complete the observation is calculated according to the flight speed of the satellite platform and the coverage of the observation scene.
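The application does not give the formula for the continuous observation time explicitly; the sketch below assumes the simplest relation, namely the along-track extent of the observation scene divided by the ground speed of the sub-satellite point, with illustrative names and values.

```python
def continuous_observation_time(scene_along_track_km, ground_speed_km_s):
    """Continuous observation time needed to sweep one observation scene,
    taken as the along-track extent divided by the sub-satellite ground speed.
    """
    return scene_along_track_km / ground_speed_km_s


# Example: a 30 km scene at a typical low-Earth-orbit ground speed of ~6.8 km/s.
print(f"{continuous_observation_time(30.0, 6.8):.1f} s")  # about 4.4 s
```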
S403: determine whether the current observation scene is the target scene according to the continuous observation time and the corresponding visible time window.
Optionally, the visible time window between the current observation scene and the corresponding remote sensor within the specified orbit pass is calculated. The time window of the current observation scene is calculated as follows: the coordinates of the boundary vertices of the observation scene are converted from the plane coordinate system to the geodetic coordinate system through the inverse coordinate projection formula; these longitude and latitude coordinates are then used to generate the corresponding regional target definition in the STK software, and the STK software is used to calculate the visible time window between the remote sensor and the current observation scene. At this stage, the remote sensor used when calculating the required continuous observation time must be the same remote sensor that was used to segment out this scene.
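The inverse projection itself depends on which plane coordinate system is used; the sketch below assumes, purely for illustration, that the scene vertices are given in a UTM zone 50N projection (EPSG:32650) and uses pyproj to convert them back to longitude and latitude, which could then be fed to an orbit tool such as STK as an area-target definition.

```python
from pyproj import Transformer

def scene_vertices_to_lonlat(vertices_xy, plane_crs="EPSG:32650"):
    """Convert observation-scene boundary vertices from a projected plane
    coordinate system back to geodetic longitude/latitude (WGS84).
    """
    transformer = Transformer.from_crs(plane_crs, "EPSG:4326", always_xy=True)
    return [transformer.transform(x, y) for x, y in vertices_xy]


corners = [(500000.0, 3300000.0), (530000.0, 3300000.0),
           (530000.0, 3320000.0), (500000.0, 3320000.0)]
print(scene_vertices_to_lonlat(corners))  # [(lon, lat), ...] of the four corners
```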
After the continuous observation time and the corresponding visible time window have been obtained, in order to obtain the target scene, it is necessary to judge whether the continuous observation time and the corresponding visible time window match. When they match, the current observation scene is determined to be the target scene; when they do not match, the current observation scene is deleted.
It can be understood that this embodiment provides one possible implementation for determining the target scene, which lays the foundation for the subsequent determination of the target scene.
It should be understood that although the steps in the flowcharts involved in the above embodiments are shown in sequence as indicated by the arrows, these steps are not necessarily executed in the order indicated by the arrows. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in the flowcharts involved in the above embodiments may include multiple sub-steps or stages. These sub-steps or stages are not necessarily completed at the same time, but may be executed at different times, and their execution order is not necessarily sequential; they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Based on the same inventive concept, embodiments of this application also provide a target scene determination device for implementing the above target scene determination method. The solution provided by this device is similar to the solution recorded in the above method, so for the specific limitations in the embodiments of one or more target scene determination devices provided below, reference may be made to the above limitations on the target scene determination method, which will not be repeated here.
In one embodiment, Figure 5 shows a structural block diagram of a target scene determination device. As shown in Figure 5, a target scene determination device 5 is provided. The device 5 includes an equation determination module 50, a segmentation module 51, and a scene determination module 52, wherein:
the equation determination module 50 is configured to determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target;
each remote sensor has its own remote sensor parameters;
the segmentation module 51 is configured to segment the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes; and
the scene determination module 52 is configured to determine the target scene from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window.
In the above target scene determination device, the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target is determined; the regional target is then segmented according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor to obtain candidate observation scenes; and the target scene is determined from the candidate observation scenes according to the continuous observation time required to observe each candidate observation scene and the corresponding visible time window. Compared with prior-art devices, which often lead to extreme situations in which the target is segmented unreasonably and visible time windows are wasted, the target scene determination device of this application segments the regional target according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to each remote sensor. Because both the sub-satellite point trajectory equation and the remote sensor parameters are obtained from the corresponding remote sensor, they have high credibility, so segmenting the regional target according to them is more reasonable and the utilization of visible time windows is improved.
In one embodiment, the target scene determination system further includes an image sensor, and the image sensor is interconnected with an analog-to-digital converter and a central control processor. The above equation determination module 50 is specifically configured to:
select remote sensors based on each regional target and determine the visible time window between each remote sensor and the corresponding regional target; and use specific software to determine the sub-satellite point trajectory equation within each visible time window between each remote sensor and the corresponding regional target.
In one embodiment, the above segmentation module 51 includes:
a first selection unit, configured to select an unprocessed visible time window;
a first determination unit, configured to determine the observation range of the remote sensor according to the sub-satellite point trajectory equation and remote sensor parameters corresponding to the remote sensor associated with the unprocessed visible time window; and
a segmentation unit, configured to segment the regional target within the observation range according to the preselected regional target segmentation mode to obtain candidate observation scenes.
In one embodiment, the above segmentation unit includes:
a segmentation subunit, configured to, when the preselected regional target segmentation mode is the strip segmentation mode, segment the regional target within the observation range according to the strip segmentation mode to obtain candidate observation scenes.
In one embodiment, the above segmentation subunit is specifically configured to:
when the preselected regional target segmentation mode is the strip segmentation mode, divide the regional target into a set of mutually parallel strips; and when the remote sensor images in single-scene mode, divide the strips within the observation range into single scenes to obtain candidate observation scenes.
In one embodiment, the above scene determination module 52 includes:
a second selection unit, configured to select an unvisited candidate observation scene as the current observation scene;
a calculation unit, configured to calculate the continuous observation time needed for the remote sensor to observe its observation range according to the flight speed and observation range of the remote sensor; and
a second determination unit, configured to determine whether the current observation scene is the target scene according to the continuous observation time and the corresponding visible time window.
In one embodiment, the above second determination unit is specifically configured to:
judge whether the continuous observation time matches the corresponding visible time window, and if so, determine that the current observation scene is the target scene.
Each module in the above target scene determination device may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in or independent of the processor of the computer device in the form of hardware, or may be stored in the memory of the computer device in the form of software, so that the processor can call and execute the operations corresponding to each of the above modules.
In one embodiment, a computer device is provided. The computer device may be a server, and its internal structure diagram may be as shown in Figure 6. The computer device includes a processor, a memory, a network interface, and a transceiver connected through a system bus. The processor of the computer device is used to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The transceiver of the computer device is used to receive or send data under the control of the processor. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store data such as sample data. The network interface of the computer device is used to communicate with external terminals through a network connection. When the computer program is executed by the processor, a target scene determination method is implemented.
Those skilled in the art can understand that the structure shown in Figure 6 is only a block diagram of part of the structure related to the solution of this application and does not constitute a limitation on the computer device to which the solution of this application is applied. A specific computer device may include more or fewer components than shown in the figure, or combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, including a memory and a processor, where a computer program is stored in the memory. For the principles and specific processes implemented by the processor in each embodiment, reference may be made to the description in the above embodiments of the target scene determination method, which will not be repeated here.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. For the principles and specific processes implemented when the computer program is executed by a processor, reference may be made to the description in the above embodiments of the target scene determination method, which will not be repeated here.
It should be noted that the information involved in this application (including but not limited to the information related to observation scenes and the remote sensor parameters in this application) is information or data fully authorized by all parties.
Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be completed by instructing the relevant hardware through a computer program. The computer program can be stored in a non-volatile computer-readable storage medium, and when executed, it may include the processes of the above method embodiments. Any reference to memory, database, or other media used in the embodiments provided in this application may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM) or external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases involved in the embodiments provided in this application may include at least one of a relational database and a non-relational database. Non-relational databases may include blockchain-based distributed databases and the like, but are not limited thereto. The processors involved in the embodiments provided in this application may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, but are not limited thereto.
The technical features of the above embodiments can be combined arbitrarily. To keep the description concise, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in the combination of these technical features, they should all be considered to be within the scope of this specification.
The above embodiments only express several implementations of this application, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent of this application. It should be noted that, for those of ordinary skill in the art, several modifications and improvements can be made without departing from the concept of this application, and these all fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the appended claims.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410130192.1A CN117690040B (en) | 2024-01-31 | 2024-01-31 | Target scene determining method, device and equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410130192.1A CN117690040B (en) | 2024-01-31 | 2024-01-31 | Target scene determining method, device and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117690040A true CN117690040A (en) | 2024-03-12 |
CN117690040B CN117690040B (en) | 2024-06-25 |
Family
ID=90137391
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410130192.1A Active CN117690040B (en) | 2024-01-31 | 2024-01-31 | Target scene determining method, device and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117690040B (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060026678A1 (en) * | 2004-07-29 | 2006-02-02 | Zakas Phillip H | System and method of characterizing and managing electronic traffic |
CN105787173A (en) * | 2016-02-25 | 2016-07-20 | 中国地质大学(武汉) | Multi-satellite earth-observation task scheduling and planning method and device |
WO2021063188A1 (en) * | 2019-09-30 | 2021-04-08 | 华为技术有限公司 | Neighbor relation configuration method and apparatus applicable to satellite network |
CN116740306A (en) * | 2023-08-09 | 2023-09-12 | 北京空间飞行器总体设计部 | Remote sensing satellite earth observation mission planning method and device based on road network guidance |
Non-Patent Citations (2)
Title |
---|
LIN ZHAO: "The Attitude Control Algorithm of Agile Optical Satellite Oriented to Nonparallel-Ground-Track-Imaging", IEEE, 19 October 2019 (2019-10-19), pages 1 - 12 *
HAN PENG: "Agile imaging satellite mission planning based on a genetic algorithm with relative imaging time encoding", Journal of Astronautics (宇航学报), 15 November 2021 (2021-11-15), pages 1428 - 1438 *
Also Published As
Publication number | Publication date |
---|---|
CN117690040B (en) | 2024-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110687923B (en) | Unmanned aerial vehicle long-distance tracking flight method, device, equipment and storage medium | |
US10593110B2 (en) | Method and device for computing a path in a game scene | |
CN114930391B (en) | Map updating method, device, computer equipment and storage medium | |
US11534917B2 (en) | Methods, systems, articles of manufacture and apparatus to improve resource utilization for binary tree structures | |
JP2020507853A (en) | Method and apparatus for three-dimensional point cloud reconstruction | |
WO2024197815A1 (en) | Engineering machinery mapping method and device, and readable storage medium | |
CN112731961A (en) | Path planning method, device, equipment and storage medium | |
US20220244068A1 (en) | Dynamic map generation with focus on construction and localization field of technology | |
CN118319188A (en) | Repositioning method for cleaning robot, cleaning robot and computer equipment | |
CN115883172B (en) | Abnormality monitoring method, device, computer equipment and storage medium | |
CN116301043A (en) | Heterogeneous UAV cluster cooperative search optimization method and system based on multi-situation map fusion | |
WO2024139270A1 (en) | Grid map construction method, robot and computer-readable storage medium | |
CN117148872A (en) | Robot collaborative source searching method, device and equipment under multi-gas diffusion source scene | |
CN117690040B (en) | Target scene determining method, device and equipment | |
CN117705119A (en) | Navigation positioning method, equipment and storage medium for dynamic loading of large-scene 3D map | |
CN115311424B (en) | Three-dimensional reconstruction method and device of target scene, unmanned aerial vehicle and storage medium | |
CN114383621B (en) | Track deviation rectifying method based on grid map, electronic equipment and storage medium | |
CN113670253A (en) | Space target posture inversion method and device, computing equipment and storage medium | |
CN119180936B (en) | Video track positioning method, device, computer equipment and storage medium | |
CN116720387B (en) | Target compound tracking system architecture modeling method based on edge calculation | |
CN118425980B (en) | Edge path determination method, device, robot and storage medium | |
KR102837969B1 (en) | Method, computing device and computer program for precisely estimating robot position and orientation using low-resolution and high-resolution maps | |
CN120147331B (en) | Point cloud processing method, point cloud processing device, computer equipment and readable storage medium | |
US20250045625A1 (en) | Active search-based approach for sensor querying in geographically overlapping edge networks | |
CN117609412B (en) | Spatial object association method and device based on network structure information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 20240628
Address after: 100086 No. 82, Zhichun Road, Haidian District, Beijing
Patentee after: SPACE STAR TECHNOLOGY Co.,Ltd.
Country or region after: China
Patentee after: TIANJIN ZHONG WEI AEROSPACE DATA SYSTEM TECHNOLOGY Co.,Ltd.
Address before: Research building of super large spacecraft assembly test center, 101 Shenzhou Avenue, Binhai science and Technology Park, Binhai New Area, Tianjin
Patentee before: TIANJIN ZHONG WEI AEROSPACE DATA SYSTEM TECHNOLOGY Co.,Ltd.
Country or region before: China