
WO2018152783A1 - Image processing method and device, and aircraft - Google Patents

Image processing method and device, and aircraft

Info

Publication number
WO2018152783A1
WO2018152783A1 (PCT/CN2017/074817)
Authority
WO
WIPO (PCT)
Prior art keywords
image
processing
aircraft
system time
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/074817
Other languages
English (en)
Chinese (zh)
Inventor
李泽飞
熊川樘
吴智强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Priority to CN201780005386.7A (CN108496365A, zh)
Priority to PCT/CN2017/074817 (WO2018152783A1, fr)
Publication of WO2018152783A1 (fr)
Anticipated expiration
Legal status: Ceased (current)

Classifications

    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; client middleware
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/80: Camera processing pipelines; components thereof
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • B64U 2101/30: UAVs specially adapted for imaging, photography or videography

Definitions

  • the present invention relates to consumer electronics technology, and more particularly to an image processing method, a processing device, and an aircraft.
  • In the related art, a time stamp is generally added to the video formed by the aircraft in post-processing.
  • Adding the time stamp in post-processing can easily make the time associated with the video inaccurate and reduces the reliability of the time displayed in the video.
  • Embodiments of the present invention provide an image processing method, a processing device, and an aircraft.
  • the invention provides an image processing method for an aircraft, wherein the aircraft is provided with an imaging device, and the processing method comprises the following steps:
  • controlling the imaging device to capture an image; and synthesizing the system time of the aircraft at the moment the imaging device captures the image into the image.
  • the present invention provides an image processing apparatus for an aircraft, the aircraft being provided with an imaging device, the processing device comprising:
  • a control module for controlling the imaging device to capture an image; and
  • a first processing module for synthesizing the system time of the aircraft at the moment the imaging device captures the image into the image.
  • An aircraft of an embodiment of the present invention includes an imaging device and the processing device.
  • The image processing method, the processing device, and the aircraft of the embodiments of the present invention synthesize the system time of the aircraft at the moment the imaging device captures an image into the image in real time, ensuring the accuracy and credibility of the time displayed in the image.
  • FIG. 1 is a schematic flow chart of a method for processing an image according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of functional modules of an aircraft according to an embodiment of the present invention.
  • FIG. 3 is another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of functional modules of a first processing module according to an embodiment of the present invention.
  • FIG. 5 is still another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of another functional module of a first processing module according to an embodiment of the present invention.
  • FIG. 7 is still another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of another functional module of an aircraft according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of still another functional module of the first processing module according to the embodiment of the present invention.
  • FIG. 10 is still another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of functional blocks of a processing device according to an embodiment of the present invention.
  • FIG. 12 is still another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 13 is a schematic diagram of communication between an aircraft and a terminal according to an embodiment of the present invention.
  • FIG. 14 is a schematic diagram of another functional module of a processing device according to an embodiment of the present invention.
  • FIG. 15 is still another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 16 is a schematic diagram of image transmission according to an embodiment of the present invention.
  • FIG. 17 is a schematic diagram of still another functional module of the processing device according to an embodiment of the present invention.
  • FIG. 18 is still another schematic flowchart of a method for processing an image according to an embodiment of the present invention.
  • FIG. 19 is another schematic diagram of image transmission according to an embodiment of the present invention.
  • Aircraft 100, imaging device 10, processing device 20, control module 22, first processing module 24, first processing unit 242, second processing unit 244, third processing unit 246, processing subunit 2462, fourth processing unit 248, second processing module 26, third processing module 28, fourth processing module 29, positioning device 30, remote controller 500, relay terminal 600, cloud server 700, terminal 800, display 80, and monitoring terminal 900.
  • The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
  • Features defined by "first" or "second" may explicitly or implicitly include one or more of the described features.
  • Unless otherwise specifically defined, "a plurality" means two or more.
  • In the description of the present invention, it should be noted that, unless otherwise explicitly specified and defined, the terms "installation", "connection", and "connected" are to be understood broadly: a connection may be fixed, detachable, or integral; it may be mechanical, electrical, or a communication between elements; it may be direct or indirect through an intermediate medium; and it may be internal communication between two elements or an interaction between two elements.
  • the specific meanings of the above terms in the present invention can be understood on a case-by-case basis.
  • the image processing method of the embodiment of the present invention can be applied to the aircraft 100.
  • the imaging device 10 is disposed on the aircraft 100.
  • the image processing method includes the following steps:
  • S22 Control the imaging device 10 to capture an image.
  • S24 Synthesize the system time of the aircraft 100 at the moment the imaging device 10 captures the image into the image.
  • the image processing device 20 can be used with the aircraft 100.
  • the imaging device 10 is disposed on the aircraft 100.
  • Processing device 20 includes a control module 22 and a first processing module 24.
  • The control module 22 is configured to control the imaging device 10 to capture an image.
  • The first processing module 24 is configured to synthesize the system time of the aircraft 100 at the moment the imaging device 10 captures the image into the image.
  • the processing method of the embodiment of the present invention may be implemented by the processing device 20 of the embodiment of the present invention, wherein the step S22 may be implemented by the control module 22, and the step S24 may be implemented by the first processing module 24.
  • the processing device 20 of the embodiments of the present invention may be applied to the aircraft 100 of the embodiment of the present invention, or the aircraft 100 of the embodiment of the present invention includes the processing device 20 of the embodiment of the present invention. Further, the aircraft 100 of the embodiment of the present invention further includes an imaging device 10 in which the imaging device 10 and the processing device 20 are electrically connected.
  • The image processing method, the processing device 20, and the aircraft 100 of the embodiments of the present invention synthesize the system time of the aircraft 100 at the moment the imaging device 10 captures an image into the image in real time, ensuring the accuracy and credibility of the time displayed in the image.
  • Each frame has a corresponding imaging time, namely the system time of the aircraft 100 at that moment. Synthesizing this imaging time into each frame ensures that the imaging time of every frame can be obtained accurately and quickly.
  • In scenarios that require an accurate imaging time, such as live broadcasting, real-time monitoring, or exploration, the time of the images obtained by the processing method, the processing device 20, and the aircraft 100 of the embodiments of the present invention therefore has greater accuracy and credibility.
  • aircraft 100 includes an unmanned aerial vehicle.
  • step S24 includes the following steps:
  • S242 Convert the system time into a character string.
  • S244 Process the image to write the character string into the image.
  • the first processing module 24 includes a first processing unit 242 and a second processing unit 244.
  • the first processing unit 242 is configured to convert the system time into a string.
  • the second processing unit 244 is for processing the image to write a string into the image.
  • step S242 can be implemented by the first processing unit 242
  • step S244 can be implemented by the second processing unit 244.
  • In this way, the image and the corresponding system time are combined more closely, which avoids errors introduced by post-processing synthesis and prevents the data from being tampered with.
  • Specifically, the processing device 20 converts the system time of each frame into a character string and writes the character string directly into the image data of that frame, such as its pixels, thereby synthesizing the system time at which the image was captured into the image.
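  • As an illustration only (not part of the original disclosure), the following minimal sketch shows one way such a character string could be written into a frame, assuming OpenCV and NumPy are available and frames arrive as arrays; the text position, font, and format are arbitrary choices.

```python
# Illustrative sketch only: convert a system time to a string and write it into
# the pixels of a frame. Assumes OpenCV (cv2) and NumPy; the position, font and
# text style are arbitrary choices, not taken from the original disclosure.
from datetime import datetime

import cv2
import numpy as np


def stamp_frame(frame: np.ndarray, system_time: datetime) -> np.ndarray:
    text = system_time.strftime("%Y-%m-%d %H:%M:%S")  # year-month-day hour:minute:second
    stamped = frame.copy()
    cv2.putText(stamped, text, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                0.8, (255, 255, 255), 2, cv2.LINE_AA)
    return stamped


if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a captured frame
    cv2.imwrite("stamped.jpg", stamp_frame(frame, datetime.now()))
```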
  • step S24 includes the following steps:
  • S246 Update the system time to obtain an updated system time.
  • S248 Synthesize the updated system time into the image.
  • the first processing module 24 includes a third processing unit 246 and a fourth processing unit 248.
  • the third processing unit 246 is configured to update the system time to obtain the updated system time.
  • the fourth processing unit 248 is configured to synthesize the updated system time into the image.
  • step S246 can be implemented by the third processing unit 246, and step S248 can be implemented by the fourth processing unit 248.
  • The system time of the aircraft 100 may accumulate a large error as time passes.
  • The system time therefore needs to be updated, which reduces the error and improves the credibility of the system time.
  • aircraft 100 includes positioning device 30.
  • Step S246 includes the following steps:
  • S2462 Processing the time acquired by the positioning device 30 as the updated system time.
  • the aircraft 100 includes a positioning device 30.
  • the third processing unit 246 includes a processing sub-unit 2462.
  • the processing sub-unit 2462 is configured to process the time acquired by the positioning device 30 as the updated system time.
  • step S2462 can be implemented by the processing sub-unit 2462.
  • the system time can be updated by the time acquired by the positioning device 30, thereby improving the accuracy and credibility of the system time.
  • The timer inside the aircraft 100 has a certain error. Over a short period the error is small, so the system time is relatively accurate and can be used normally; as time accumulates, however, the growing error makes the system time inaccurate, so a positioning device 30, such as a GPS positioning device, is needed to obtain a more accurate time as the system time.
  • The GPS positioning device can send a time-acquisition request to the satellites it communicates with and receive the time returned by the satellites as the updated system time.
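  • A minimal sketch of such an update, assuming a hypothetical positioning-device interface (get_utc_time) and an arbitrarily chosen drift threshold; a real implementation would read the time from the GPS receiver driver instead.

```python
# Illustrative sketch only: refresh the on-board system time from a positioning
# device when the local clock may have drifted. PositioningDevice.get_utc_time
# is a hypothetical interface, not an API from the original disclosure.
from datetime import datetime, timezone


class PositioningDevice:
    """Hypothetical stand-in for a GPS receiver that reports UTC time."""

    def get_utc_time(self) -> datetime:
        return datetime.now(timezone.utc)  # placeholder for reading the GPS fix


def updated_system_time(local_clock: datetime, gps: PositioningDevice,
                        max_drift_s: float = 1.0) -> datetime:
    """Use the GPS-reported time whenever the local clock has drifted too far.

    `local_clock` is expected to be timezone-aware (UTC).
    """
    gps_time = gps.get_utc_time()
    drift = abs((local_clock - gps_time).total_seconds())
    return gps_time if drift > max_drift_s else local_clock
```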
  • the processing method includes the following steps:
  • S26 Encode the image synthesized with the system time.
  • processing device 20 includes a second processing module 26 .
  • the second processing module 26 is for encoding the image synthesized with the system time.
  • step S26 can be implemented by the second processing module 26.
  • In practice, the image is generally not transmitted directly; it is first encoded to produce a smaller image file, which improves transmission efficiency. The image can also be encrypted during encoding to ensure the security of the image data during transmission.
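  • A minimal sketch of such an encoding step, assuming OpenCV is available; JPEG is used purely for illustration, whereas an actual aircraft pipeline would more likely use a hardware H.264/H.265 encoder, and encryption is omitted here.

```python
# Illustrative sketch only: encode a stamped frame into a compact byte stream
# before transmission. JPEG via OpenCV is used for simplicity; it is not the
# encoder described in the original disclosure.
import cv2
import numpy as np


def encode_frame(stamped_frame: np.ndarray, quality: int = 80) -> bytes:
    ok, buf = cv2.imencode(".jpg", stamped_frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
    if not ok:
        raise RuntimeError("frame encoding failed")
    return buf.tobytes()
```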
  • aircraft 100 is in communication with terminal 800.
  • Terminal 800 includes display 80 and terminal time, and display 80 is used to display images that are synthesized with system time.
  • the processing method includes the following steps:
  • S28 Compare the terminal time with the system time to obtain a delay time.
  • aircraft 100 is in communication with terminal 800
  • terminal 800 includes display 80 and terminal time
  • display 80 is used to display images synthesized with system time
  • the processing device 20 includes a third processing module 28.
  • the third processing module 28 is for comparing the terminal time and the system time to obtain a delay time.
  • step S28 can be implemented by the third processing module 28.
  • The delay time can be taken as the video delay of the remote monitoring, so that the monitoring time can be grasped accurately.
  • In other embodiments, the terminal 800 includes a terminal processor for comparing the terminal time with the system time to obtain the delay time; that is, step S28 may also be implemented by the terminal processor, and this is not limited here.
  • the delay time can be displayed on the display 80.
  • the system time is also displayed in the image.
  • the format of the displayed system time is: year-month-day hour:minute:second.
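  • A minimal sketch of the delay comparison, assuming the system time embedded in a received frame has already been recovered as a string in the year-month-day hour:minute:second format and that both clocks are expressed in UTC.

```python
# Illustrative sketch only: estimate the transmission delay by comparing the
# system time embedded in a frame with the terminal's own clock. Assumes both
# times are in UTC and the embedded string uses "%Y-%m-%d %H:%M:%S".
from datetime import datetime, timezone


def delay_seconds(embedded_time: str, terminal_time: datetime) -> float:
    frame_time = datetime.strptime(embedded_time, "%Y-%m-%d %H:%M:%S")
    frame_time = frame_time.replace(tzinfo=timezone.utc)
    return (terminal_time - frame_time).total_seconds()


# Example: delay of a frame stamped at 10:15:30 UTC, measured at the terminal.
print(delay_seconds("2017-02-24 10:15:30", datetime.now(timezone.utc)))
```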
  • aircraft 100 is in communication with remote control 500, and remote control 500 is in communication with monitoring terminal 900.
  • the processing method includes the following steps:
  • S29 The image is sequentially transmitted to the monitoring terminal 900 through the aircraft 100 and the remote controller 500.
  • aircraft 100 is in communication with remote control 500, which is in communication with monitoring terminal 900.
  • The processing device 20 includes a fourth processing module 29.
  • the fourth processing module 29 is configured to send the image to the monitoring terminal 900 through the aircraft 100 and the remote controller 500 in sequence.
  • step S29 can be implemented by the fourth processing module 29.
  • the aircraft 100 can directly transmit images to the monitoring terminal 900 through the remote controller 500.
  • the monitoring terminal 900 is in direct communication with the remote controller 500, and the aircraft 100 (such as an unmanned aerial vehicle) is generally in direct communication with the remote controller 500, so the aircraft 100 can transmit images to the monitoring terminal 900 via the remote controller 500.
  • the monitoring terminal 900 can obtain the system time corresponding to each frame image and each frame image by decoding the image.
  • the monitoring terminal 900 is a client, that is, a display terminal that monitors the environment in which the aircraft 100 is located.
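  • The following sketch is illustrative only: it sends length-prefixed encoded frames over a plain TCP socket to show the per-frame data flow. The actual aircraft-to-remote-controller link is a proprietary radio link, so the socket API here is a stand-in, not the real transport.

```python
# Illustrative sketch only: length-prefixed framing of encoded images over a
# TCP socket. Plain sockets stand in for the proprietary aircraft/remote
# controller radio link; they are not the transport used in the disclosure.
import socket
import struct


def send_frame(sock: socket.socket, encoded_frame: bytes) -> None:
    sock.sendall(struct.pack(">I", len(encoded_frame)) + encoded_frame)


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("connection closed before frame was complete")
        data += chunk
    return data


def recv_frame(sock: socket.socket) -> bytes:
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)
```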
  • the remote controller 500 communicates with the monitoring terminal 900 through the relay terminal 600 and the cloud server 700.
  • the processing method includes the following steps:
  • S31 The image is sent to the monitoring terminal 900 through the remote controller 500, the relay terminal 600, and the cloud server 700 in sequence.
  • the remote control 500 communicates with the monitoring terminal 900 via the relay terminal 600 and the cloud server 700.
  • the fourth processing module 29 is further configured to send the image to the monitoring terminal 900 through the remote controller 500, the relay terminal 600, and the cloud server 700 in sequence.
  • step S31 can also be implemented by the fourth processing module 29.
  • the aircraft 100 can transmit images to the monitoring terminal 900 through the remote controller 500, the relay terminal 600, and the cloud server 700.
  • In this scenario, the monitoring terminal 900 monitors remotely and cannot communicate directly with the remote controller 500, while the aircraft 100 generally communicates directly with the remote controller 500. The remote controller 500 can therefore transmit the image from the aircraft 100 to the monitoring terminal 900 through the relay terminal 600 (such as a base station or a smart terminal, where the smart terminal includes a mobile phone, a tablet computer, and the like) and the cloud server 700; that is, the image is transmitted along the path aircraft 100 → remote controller 500 → relay terminal 600 → cloud server 700 → monitoring terminal 900.
  • the monitoring terminal 900 can obtain the system time corresponding to each frame image and each frame image by decoding the image.
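  • As a rough illustration of the relay path only: a node such as the relay terminal 600 or the cloud server 700 can simply forward the byte stream unchanged from its upstream link to its downstream link. The sockets below are hypothetical placeholders for the real links.

```python
# Illustrative sketch only: a relay node forwards the frame byte stream
# unchanged from an upstream link (e.g. the remote controller side) to a
# downstream link (e.g. the cloud server or monitoring terminal side).
import socket


def relay_frames(upstream: socket.socket, downstream: socket.socket,
                 chunk_size: int = 4096) -> None:
    while True:
        data = upstream.recv(chunk_size)
        if not data:
            break  # upstream closed; stop forwarding
        downstream.sendall(data)
```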
  • A "computer-readable medium" can be any apparatus that can contain, store, communicate, propagate, or transport a program for use by, or in connection with, an instruction execution system, apparatus, or device.
  • More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM).
  • The computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner where necessary, and then stored in a computer memory.
  • portions of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • Multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, as in another embodiment, they can be implemented by any one of the following techniques known in the art, or a combination thereof: a discrete logic circuit having logic gates for performing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
  • each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • The above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • The integrated modules, if implemented in the form of software functional modules and sold or used as stand-alone products, may also be stored in a computer-readable storage medium.
  • The above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

An image processing method applied to an aircraft (100) is disclosed. The aircraft (100) is provided with an imaging device (10). The processing method comprises the following steps: (S22) controlling the imaging device (10) to perform imaging to obtain an image; and (S24) synthesizing the system time of the aircraft (100) at the moment the imaging device (10) performs imaging into the image. An image processing device (20) and the aircraft (100) are also disclosed.
PCT/CN2017/074817 2017-02-24 2017-02-24 Image processing method and device, and aircraft Ceased WO2018152783A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780005386.7A CN108496365A (zh) 2017-02-24 2017-02-24 Image processing method, processing device and aircraft
PCT/CN2017/074817 WO2018152783A1 (fr) 2017-02-24 2017-02-24 Image processing method and device, and aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/074817 WO2018152783A1 (fr) 2017-02-24 2017-02-24 Image processing method and device, and aircraft

Publications (1)

Publication Number Publication Date
WO2018152783A1 (fr) 2018-08-30

Family

ID=63254093

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/074817 Ceased WO2018152783A1 (fr) 2017-02-24 2017-02-24 Procédé et dispositif de traitement d'image, et aéronef

Country Status (2)

Country Link
CN (1) CN108496365A (fr)
WO (1) WO2018152783A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112399081A (zh) * 2020-11-03 2021-02-23 深圳市中博科创信息技术有限公司 一种图像数据和控制数据一体化传输的方法和系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101593412A (zh) * 2008-05-26 2009-12-02 奥城同立科技开发(北京)有限公司 高速公路车辆超速的全面监控方法
EP2226246A2 (fr) * 2009-03-04 2010-09-08 Honeywell International Inc. Système et procédés d'affichage vidéo avec sensibilisation spatiale améliorée
CN104902083A (zh) * 2015-05-08 2015-09-09 惠州Tcl移动通信有限公司 基于移动终端的电子游记生成方法及其系统
CN105373629A (zh) * 2015-12-17 2016-03-02 谭圆圆 基于无人飞行器的飞行状态数据处理装置及其方法
US20170053674A1 (en) * 2015-07-27 2017-02-23 Vantage Robotics, Llc System for recording and synchronizing audio and video associated with a uav flight

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009133776A (ja) * 2007-11-30 2009-06-18 Sony Corp 撮像装置と時刻修正方法
CN104734797A (zh) * 2013-12-18 2015-06-24 鸿富锦精密工业(深圳)有限公司 时间同步的方法及电子装置
CN105897392A (zh) * 2014-12-15 2016-08-24 中国空间技术研究院 星地时间同步系统和方法

Also Published As

Publication number Publication date
CN108496365A (zh) 2018-09-04

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17897900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 17897900

Country of ref document: EP

Kind code of ref document: A1