
CN109803089A - Electronic equipment and mobile platform - Google Patents

Electronic equipment and mobile platform

Info

Publication number
CN109803089A
CN109803089A (application CN201910008293.0A)
Authority
CN
China
Prior art keywords
time of flight
application processor
initial depth
image
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910008293.0A
Other languages
Chinese (zh)
Other versions
CN109803089B (en)
Inventor
张学勇 (Zhang Xueyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910008293.0A
Publication of CN109803089A
Application granted
Publication of CN109803089B
Status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)
  • Studio Devices (AREA)

Abstract


The present application discloses an electronic device and a mobile platform. The electronic device includes a body and a plurality of time-of-flight assemblies disposed at a plurality of different orientations on the body. Each time-of-flight assembly includes two light emitters, each with a field of view of any value from 80 to 120 degrees, and one light receiver with a field of view of any value from 180 to 200 degrees. The light emitters are used to emit laser pulses out of the body, and the light receiver is used to receive the laser pulses, emitted by the corresponding two light emitters, that are reflected by the target subject. The light emitters of the multiple time-of-flight assemblies emit laser light simultaneously, and the light receivers of the multiple time-of-flight assemblies are exposed simultaneously, to obtain a panoramic depth image. In the electronic device and mobile platform of the embodiments of the present application, multiple light emitters located at multiple different orientations of the body emit laser light simultaneously, and multiple light receivers are exposed simultaneously to obtain a panoramic depth image, so that relatively comprehensive depth information can be acquired at once.

Description

Electronic equipment and mobile platform
Technical field
This application relates to image acquisition technologies, and more specifically to an electronic device and a mobile platform.
Background
To diversify the functions of electronic devices, a depth image acquisition apparatus may be provided on an electronic device to capture depth images of a target subject. However, a current integrated phase-shift rangefinder can only obtain a depth image in a single direction or within a single angular range, so the depth information obtained is limited.
Summary of the invention
Embodiments of the application provide an electronic device and a mobile platform.
The electronic device of the embodiments of the application includes a body and a plurality of time-of-flight assemblies disposed on the body. The multiple time-of-flight assemblies are located at a plurality of different orientations of the body, and each time-of-flight assembly includes two light emitters and one light receiver. The field of view of each light emitter is any value from 80 to 120 degrees, and the field of view of each light receiver is any value from 180 to 200 degrees. The light emitters are used to emit laser pulses out of the body, and the light receiver is used to receive the laser pulses, emitted by the corresponding two light emitters, that are reflected by the target subject. The light emitters of the multiple time-of-flight assemblies emit laser light simultaneously, and the light receivers of the multiple time-of-flight assemblies are exposed simultaneously, to obtain a panoramic depth image.
The mobile platform of the embodiments of the application includes a body and a plurality of time-of-flight assemblies disposed on the body. The multiple time-of-flight assemblies are located at a plurality of different orientations of the body, and each time-of-flight assembly includes two light emitters and one light receiver. The field of view of each light emitter is any value from 80 to 120 degrees, and the field of view of each light receiver is any value from 180 to 200 degrees. The light emitters are used to emit laser pulses out of the body, and the light receiver is used to receive the laser pulses, emitted by the corresponding two light emitters, that are reflected by the target subject. The light emitters of the multiple time-of-flight assemblies emit laser light simultaneously, and the light receivers of the multiple time-of-flight assemblies are exposed simultaneously, to obtain a panoramic depth image.
In the electronic device and mobile platform of the embodiments of the application, multiple light emitters located at multiple different orientations of the body emit laser light simultaneously, and multiple light receivers are exposed simultaneously to obtain a panoramic depth image, so that relatively comprehensive depth information can be acquired at once.
Additional aspects and advantages of the embodiments of the application will be set forth in part in the following description, will in part become apparent from the description, or will be learned through practice of the embodiments of the application.
Brief description of the drawings
The above and/or additional aspects and advantages of the application will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic structural diagram of an electronic device according to certain embodiments of the application;
Fig. 2 is a schematic block diagram of an electronic device according to certain embodiments of the application;
Fig. 3 is a schematic structural diagram of a light emitter of a time-of-flight assembly according to certain embodiments of the application;
Fig. 4 is a schematic diagram of an application scenario of an electronic device according to certain embodiments of the application;
Fig. 5 is a schematic diagram of the coordinate systems used for stitching initial depth images according to certain embodiments of the application;
Fig. 6 to Fig. 10 are schematic diagrams of application scenarios of electronic devices according to certain embodiments of the application;
Fig. 11 to Fig. 14 are schematic structural diagrams of mobile platforms according to certain embodiments of the application.
Detailed description
Embodiments of the application are further described below with reference to the accompanying drawings. Throughout the drawings, the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions. The embodiments described with reference to the drawings are exemplary, are intended only to explain the embodiments of the application, and should not be construed as limiting the application.
Referring to Fig. 1 and Fig. 2, the electronic device 100 of the embodiments of the application includes a body 10, time-of-flight assemblies 20, camera assemblies 30, microprocessors 40, and an application processor 50.
The body 10 has a plurality of different orientations. As shown in Fig. 1, the body 10 may have four different orientations, which in the clockwise direction are: a first orientation, a second orientation, a third orientation, and a fourth orientation. The first orientation is opposite the third orientation, and the second orientation is opposite the fourth orientation. The first orientation corresponds to the top of the body 10, the second orientation corresponds to the right side of the body 10, the third orientation corresponds to the bottom of the body 10, and the fourth orientation corresponds to the left side of the body 10.
The time-of-flight assemblies 20 are disposed on the body 10. There may be a plurality of time-of-flight assemblies 20, located at a plurality of different orientations of the body 10. Specifically, there may be two time-of-flight assemblies 20, namely time-of-flight assembly 20a and time-of-flight assembly 20b. Time-of-flight assembly 20a is disposed at the first orientation, and time-of-flight assembly 20b is disposed at the third orientation. Of course, there may also be four time-of-flight assemblies 20 (or any other number greater than two), with the two additional time-of-flight assemblies 20 disposed at the second orientation and the fourth orientation, respectively. The embodiments of the application are described by taking two time-of-flight assemblies 20 as an example. It can be understood that two time-of-flight assemblies 20 suffice to obtain a panoramic depth image (a panoramic depth image means a depth image whose field of view is greater than or equal to 180 degrees; for example, the field of view of the panoramic depth image may be 180 degrees, 240 degrees, 360 degrees, 480 degrees, 720 degrees, etc.), which helps reduce the manufacturing cost, volume, and power consumption of the electronic device 100. The electronic device 100 of this embodiment may be a portable electronic device provided with multiple time-of-flight assemblies 20, such as a mobile phone, a tablet computer, or a laptop; in this case, the body 10 may be a mobile phone body, a tablet computer body, a laptop body, etc. For an electronic device 100 with strict thickness requirements, such as a mobile phone, the sides of the body are usually too thin to mount a time-of-flight assembly 20; using two time-of-flight assemblies 20 to obtain the panoramic depth image solves this problem, as the two time-of-flight assemblies 20 can be mounted on the front and the back of the phone body, respectively. In addition, obtaining the panoramic depth image with only two time-of-flight assemblies 20 also helps reduce the amount of computation required for the panoramic depth image.
Each time-of-flight assembly 20 includes two light emitters 22 and one light receiver 24. The light emitters 22 are used to emit laser pulses out of the body 10, and the light receiver 24 is used to receive the laser pulses, emitted by the corresponding two light emitters 22, that are reflected by the target subject. Specifically, time-of-flight assembly 20a includes light emitter 222a, light emitter 224a, and light receiver 24a; time-of-flight assembly 20b includes light emitter 222b, light emitter 224b, and light receiver 24b. Light emitters 222a and 224a both emit laser pulses toward the first orientation outside the body 10, and light emitters 222b and 224b both emit laser pulses toward the third orientation outside the body 10. Light receiver 24a receives the laser pulses emitted by light emitters 222a and 224a and reflected by the target subject at the first orientation, and light receiver 24b receives the laser pulses emitted by light emitters 222b and 224b and reflected by the target subject at the third orientation. In this way, the various regions outside the body 10 can be covered. Compared with an existing device that must rotate 360 degrees to obtain relatively comprehensive depth information, the electronic device 100 of this embodiment can obtain relatively comprehensive depth information at once without rotating, so it is simple to operate and responds quickly.
The light emitters 22 of the multiple time-of-flight assemblies 20 emit laser light simultaneously, and correspondingly, the light receivers 24 of the multiple time-of-flight assemblies 20 are exposed simultaneously, to obtain the panoramic depth image; the multiple time-of-flight assemblies 20 are, for example, the two time-of-flight assemblies 20. Specifically, light emitters 222a, 224a, 222b, and 224b emit laser light simultaneously, and light receivers 24a and 24b are exposed simultaneously. Because the multiple light emitters 22 emit laser light simultaneously and the multiple light receivers 24 are exposed simultaneously, the multiple initial depth images obtained from the laser pulses received by the multiple light receivers 24 share the same timeliness: they reflect the scene in each orientation outside the body 10 at the same moment, i.e., a panoramic depth image of a single moment.
The field of view of each light emitter 22 is any value from 80 to 120 degrees, and the field of view of each light receiver 24 is any value from 180 to 200 degrees.
In one embodiment, the field of view of each light emitter 22 is any value from 80 to 90 degrees; for example, the fields of view of light emitters 222a, 224a, 222b, and 224b are all 80 degrees, and the fields of view of light receivers 24a and 24b are both 180 degrees. When the field of view of a light emitter 22 is smaller, its manufacturing process is simpler and its manufacturing cost is lower, and the uniformity of the emitted laser light can be improved. When the field of view of a light receiver 24 is smaller, the lens distortion is smaller, the quality of the acquired initial depth images is better, and so is the quality of the resulting panoramic depth image, allowing accurate depth information to be obtained.
In one embodiment, the sum of the fields of view of light emitters 222a, 224a, 222b, and 224b equals 360 degrees, and the sum of the fields of view of light receivers 24a and 24b equals 360 degrees. Specifically, the fields of view of light emitters 222a, 224a, 222b, and 224b may all be 90 degrees, and the fields of view of light receivers 24a and 24b may both be 180 degrees, with the fields of view of the four light emitters 22 not overlapping one another and the fields of view of the two light receivers 24 not overlapping each other, so as to obtain a panoramic depth image of 360 degrees or approximately 360 degrees. Alternatively, the fields of view of light emitters 222a and 224a may be 80 degrees, the fields of view of light emitters 222b and 224b may be 100 degrees, and the fields of view of light receivers 24a and 24b may both be 180 degrees; the four light emitters 22 complement one another in angle, and the two light receivers 24 complement each other in angle, to obtain a panoramic depth image of 360 degrees or approximately 360 degrees.
In one embodiment, the sum of the fields of view of light emitters 222a, 224a, 222b, and 224b is greater than 360 degrees, the sum of the fields of view of light receivers 24a and 24b is greater than 360 degrees, the fields of view of at least two of the four light emitters 22 overlap each other, and the fields of view of the two light receivers 24 overlap each other. Specifically, the fields of view of light emitters 222a, 224a, 222b, and 224b may all be 100 degrees, with the fields of view of the four light emitters 22 overlapping pairwise; the fields of view of light receivers 24a and 24b may both be 200 degrees, with the fields of view of the two light receivers 24 overlapping each other. When obtaining the panoramic depth image, the overlapping edge portions of the two initial depth images can first be identified, and the two initial depth images can then be stitched into a 360-degree panoramic depth image. Because the fields of view of the four light emitters 22 overlap pairwise and the fields of view of the two light receivers 24 overlap each other, the panoramic depth image obtained is ensured to cover the full 360 degrees of depth information outside the body 10.
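The full-circle coverage condition described in this embodiment can be illustrated with a short sketch. This is an explanatory aid rather than part of the disclosure; the interval convention (counterclockwise start/end angles that may wrap past 360 degrees) and the function name are our own assumptions.

```python
# Illustrative sketch: check whether a set of receiver fields of view,
# modeled as angular intervals on a circle, jointly cover all 360 degrees.
def covers_full_circle(intervals):
    """intervals: list of (start_deg, end_deg) measured counterclockwise; may wrap past 360."""
    # Unroll each interval into one or two non-wrapping segments on [0, 360].
    segments = []
    for start, end in intervals:
        start %= 360
        end = start + ((end - start) % 360 or 360)  # interval length in (0, 360]
        if end <= 360:
            segments.append((start, end))
        else:                        # wraps past 0 degrees
            segments.append((start, 360))
            segments.append((0, end - 360))
    segments.sort()
    covered_to = 0.0
    for s, e in segments:
        if s > covered_to:           # gap found
            return False
        covered_to = max(covered_to, e)
    return covered_to >= 360

# Two 200-degree receivers at opposite orientations (the overlapping embodiment above):
print(covers_full_circle([(350, 190), (170, 10)]))   # → True
# Two exactly complementary 180-degree receivers:
print(covers_full_circle([(0, 180), (180, 360)]))    # → True
```

Shrinking either interval below 180 degrees opens a gap and the check fails, which mirrors why the overlapping (or exactly complementary) configurations above are required.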
Of course, the specific values of the fields of view of the light emitters 22 and light receivers 24 are not limited to the examples above. Those skilled in the art may set the field of view of a light emitter 22 to any value from 80 to 120 degrees and the field of view of a light receiver 24 to any value from 180 to 200 degrees as needed, for example: the field of view of a light emitter 22 may be 80, 82, 84, 86, 90, 92, 94, 96, 98, 104, or 120 degrees, or any value in between; the field of view of a light receiver 24 may be 180, 181, 182, 187, 188, 193.2, 195, or 200 degrees, or any value in between. This is not restricted here.
Referring to Fig. 3, each light emitter 22 includes a light source 222 and a diffuser 224. The light source 222 is used to emit laser light (for example infrared laser light, in which case the light receiver 24 is an infrared camera), and the diffuser 224 is used to diffuse the laser light emitted by the light source 222.
Under normal circumstances, the laser pulses emitted by the adjacent light emitters 22 of two adjacent time-of-flight assemblies 20 are likely to interfere with each other, for example when the fields of view of the light emitters 22 of two adjacent time-of-flight assemblies 20 overlap each other. Therefore, to improve the accuracy of the acquired depth information, the wavelengths of the laser pulses emitted by the adjacent light emitters 22 of two adjacent time-of-flight assemblies 20 may differ, so that the initial depth images can be distinguished and computed.
Specifically, suppose the wavelength of the laser pulses emitted by light emitter 222a at the first orientation is λ1, the wavelength of the laser pulses emitted by light emitter 224a at the first orientation is λ2, the wavelength of the laser pulses emitted by light emitter 222b at the third orientation is λ3, and the wavelength of the laser pulses emitted by light emitter 224b at the third orientation is λ4; then it suffices that λ1 ≠ λ3 and λ2 ≠ λ4. Here λ1 may or may not equal λ2 (since light emitters 222a and 224a are at the same orientation and belong to the same time-of-flight assembly 20a, overlap between them has little effect on the acquisition of depth information even when λ1 equals λ2, so λ1 may or may not equal λ2); likewise λ3 may or may not equal λ4 (as above, overlap has little effect on the acquisition of depth information when λ3 equals λ4, so λ3 may or may not equal λ4); λ1 may or may not equal λ4; and λ2 may or may not equal λ3. Preferably, the wavelength emitted by each light emitter 22 is different, to further improve the accuracy of the acquired depth information. In other words, when λ1 ≠ λ2 ≠ λ3 ≠ λ4, the laser pulses emitted by the multiple light emitters 22 do not interfere with one another, which makes the computation of the initial depth images simplest. In addition, each light receiver 24 is configured to receive the laser pulses of the wavelengths emitted by its corresponding light emitters 22. For example, light receiver 24a receives the laser pulses of the wavelengths emitted by light emitters 222a and 224a, but cannot receive the laser pulses of the wavelengths emitted by light emitters 222b and 224b. Similarly, light receiver 24b receives only the laser pulses of the wavelengths emitted by light emitters 222b and 224b.
Taking the case where the laser pulses emitted by the light emitters 22 are infrared light, whose wavelength ranges from 770 nanometers to 1 millimeter: λ1 may be any value from 770 to 1000 nanometers, λ2 any value from 1000 to 1200 nanometers, λ3 any value from 1200 to 1400 nanometers, and λ4 any value from 1400 to 1600 nanometers. Light receiver 24a receives the 770–1000 nanometer laser pulses emitted by light emitter 222a and the 1000–1200 nanometer laser pulses emitted by light emitter 224a, and light receiver 24b receives the 1200–1400 nanometer laser pulses emitted by light emitter 222b and the 1400–1600 nanometer laser pulses emitted by light emitter 224b.
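The wavelength allocation above can be checked mechanically. The sketch below is illustrative only; the band representation and function names are our own, not the patent's. Bands within the same assembly are deliberately not checked, mirroring the point above that emitters of one assembly may share a wavelength.

```python
# Illustrative sketch: verify that the wavelength bands assigned to the emitters
# satisfy the anti-interference rule — emitters of *different* (adjacent)
# assemblies must not use overlapping bands.
def bands_overlap(a, b):
    """Open-interval overlap: shared edges (e.g. exactly 1200 nm) do not count."""
    return a[0] < b[1] and b[0] < a[1]

def assignment_ok(assemblies):
    """assemblies: dict mapping assembly id -> list of (lo_nm, hi_nm) emitter bands."""
    ids = list(assemblies)
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            for band_a in assemblies[ids[i]]:
                for band_b in assemblies[ids[j]]:
                    if bands_overlap(band_a, band_b):
                        return False   # cross-assembly interference possible
    return True

# The example allocation from the text, in nanometers:
plan = {"20a": [(770, 1000), (1000, 1200)],
        "20b": [(1200, 1400), (1400, 1600)]}
print(assignment_ok(plan))  # → True
```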
It should be pointed out that, besides making the wavelengths of the laser pulses emitted by the light emitters 22 differ, those skilled in the art may also use other means to prevent different time-of-flight assemblies 20 from interfering with one another when working simultaneously, which is not restricted here. Alternatively, a small degree of interference may simply be ignored and the initial depth images computed directly; or, when computing the initial depth images, the influence of the interference may be filtered out by certain algorithmic processing.
Referring to Fig. 1 and Fig. 2, the camera assemblies 30 are disposed on the body 10. There may be a plurality of camera assemblies 30, each corresponding to one time-of-flight assembly 20. For example, when there are two time-of-flight assemblies 20, there are also two camera assemblies 30, disposed at the first orientation and the third orientation, respectively.
The multiple camera assemblies 30 are connected to the application processor 50. Each camera assembly 30 is used to capture a scene image of the target subject and output it to the application processor 50. In this embodiment, the two camera assemblies 30 capture the scene image of the target subject at the first orientation and the scene image of the target subject at the third orientation, respectively, and output them to the application processor 50. It can be understood that the field of view of each camera assembly 30 is the same as or approximately the same as that of the light receiver 24 of the corresponding time-of-flight assembly 20, so that each scene image can be better matched with the corresponding initial depth image.
A camera assembly 30 may be a visible-light camera 32 or an infrared camera 34. When the camera assembly 30 is a visible-light camera 32, the scene image is a visible-light image; when the camera assembly 30 is an infrared camera 34, the scene image is an infrared-light image.
Referring to Fig. 2, a microprocessor 40 may be a processing chip. There may be a plurality of microprocessors 40, each corresponding to one time-of-flight assembly 20. For example, in this embodiment there are two time-of-flight assemblies 20 and accordingly two microprocessors 40. Each microprocessor 40 is connected to both the light emitters 22 and the light receiver 24 of the corresponding time-of-flight assembly 20. Each microprocessor 40 can drive the corresponding light emitters 22 to emit laser light through a drive circuit, and the control of the multiple microprocessors 40 makes the four light emitters 22 emit laser light simultaneously. Each microprocessor 40 is also used to provide the corresponding light receiver 24 with clock information for receiving the laser pulses so that the light receiver 24 is exposed, and the control of the two microprocessors 40 makes the two light receivers 24 expose simultaneously. Each microprocessor 40 is further used to obtain an initial depth image from the laser pulses emitted by the light emitters 22 of the corresponding time-of-flight assembly 20 and the laser pulses received by the light receiver 24. For example, the two microprocessors 40 respectively obtain an initial depth image P1 from the laser pulses emitted by the light emitters of time-of-flight assembly 20a and the laser pulses received by light receiver 24a, and an initial depth image P2 from the laser pulses emitted by the light emitters of time-of-flight assembly 20b and the laser pulses received by light receiver 24b (as shown in the upper part of Fig. 4). Each microprocessor 40 may also perform tiling, distortion correction, self-calibration, and other algorithmic processing on the initial depth images to improve their quality.
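The patent does not spell out the depth computation itself, but a pulsed time-of-flight measurement conventionally recovers per-pixel distance from the round-trip delay of each laser pulse, d = c·Δt / 2. A minimal sketch under that standard assumption, with names of our own:

```python
# Illustrative sketch (not from the patent text): distance from round-trip time.
C = 299_792_458.0  # speed of light, m/s

def depth_from_delay(round_trip_s):
    """Distance in meters for a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A pulse returning after ~13.34 ns corresponds to a target roughly 2 m away:
print(round(depth_from_delay(13.34e-9), 2))  # → 2.0
```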
It can be understood that there may instead be only one microprocessor 40, in which case the microprocessor 40 must obtain the initial depth images in turn from the laser pulses emitted by the light emitters 22 and received by the light receiver 24 of each time-of-flight assembly 20. Compared with a single microprocessor 40, two microprocessors 40 process faster and with lower latency.
The two microprocessors 40 are both connected to the application processor 50 and transmit the initial depth images to the application processor 50. In one embodiment, a microprocessor 40 may be connected to the application processor 50 through a Mobile Industry Processor Interface (MIPI); specifically, the microprocessor 40 is connected through the MIPI to the Trusted Execution Environment (TEE) of the application processor 50, so that the data (the initial depth images) in the microprocessor 40 are transmitted directly into the trusted execution environment, improving the security of the information inside the electronic device 100. The code and memory region of the trusted execution environment are controlled by an access-control unit and cannot be accessed by programs in the Rich Execution Environment (REE); both the trusted execution environment and the rich execution environment may be formed in the application processor 50.
The application processor 50 may serve as a system of the electronic device 100. The application processor 50 can reset a microprocessor 40, wake a microprocessor 40, debug a microprocessor 40, and so on. The application processor 50 can also be connected to multiple electronic components of the electronic device 100 and control them to run in predetermined modes; for example, the application processor 50 connects to the visible-light camera 32 and the infrared camera 34 to control them to capture visible-light images and infrared-light images and to process those images. When the electronic device 100 includes a display screen, the application processor 50 can control the display screen to display predetermined pictures; the application processor 50 can also control an antenna of the electronic device 100 to send or receive predetermined data, and so on.
Referring to Fig. 4, in one embodiment, the application processor 50 is used to synthesize the two initial depth images obtained by the two microprocessors 40 into one frame of panoramic depth image according to the fields of view of the light receivers 24.
Specifically, referring also to Fig. 1, a rectangular coordinate system XOY is established with the center of the body 10 as the origin O, the transverse axis as the X axis, and the longitudinal axis as the Y axis. In the coordinate system XOY, the field of view of light receiver 24a spans from 190 degrees to 350 degrees (measured clockwise; the same applies below), the field of view of light emitter 222a spans from 190 degrees to 90 degrees, the field of view of light emitter 224a spans from 90 degrees to 350 degrees, the field of view of light receiver 24b spans from 10 degrees to 170 degrees, the field of view of light emitter 222b spans from 270 degrees to 170 degrees, and the field of view of light emitter 224b spans from 10 degrees to 270 degrees. The application processor 50 then stitches initial depth image P1 and initial depth image P2 into one frame of 360-degree panoramic depth image P12 according to the fields of view of the two light receivers 24, after which the depth information can be used.
In each initial depth image that a microprocessor 40 obtains from the laser pulses emitted by the light emitters 22 and received by the light receiver 24 of the corresponding time-of-flight assembly 20, the depth information of each pixel is the distance between the target subject at the corresponding orientation and the light receiver 24 at that orientation. That is, the depth information of each pixel in initial depth image P1 is the distance between the target subject at the first orientation and light receiver 24a, and the depth information of each pixel in initial depth image P2 is the distance between the target subject at the third orientation and light receiver 24b. In the process of stitching the multiple initial depth images of the multiple orientations into one frame of 360-degree panoramic depth image, the depth information of each pixel in each initial depth image must first be converted into unified depth information, which represents the distance between each target subject at each orientation and a single reference position. After the depth information is converted into unified depth information, it is convenient for the application processor 50 to stitch the initial depth images according to the unified depth information.
Specifically, a reference coordinate system is selected. The reference coordinate system may be the image coordinate system of the optical receiver 24 in one orientation, or some other coordinate system may be chosen as the reference. Taking Fig. 5 as an example, the coordinate system xo-yo-zo is the reference coordinate system. The coordinate system xa-ya-za shown in Fig. 5 is the image coordinate system of the optical receiver 24a, and the coordinate system xb-yb-zb is the image coordinate system of the optical receiver 24b. The application processor 50 converts the depth information of each pixel in the initial depth image P1 into unified depth information according to the rotation matrix and translation matrix between the coordinate system xa-ya-za and the reference coordinate system xo-yo-zo, and converts the depth information of each pixel in the initial depth image P2 into unified depth information according to the rotation matrix and translation matrix between the coordinate system xb-yb-zb and the reference coordinate system xo-yo-zo.
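The rotation-and-translation conversion can be sketched as follows. This is a minimal illustration only: the rotation matrix and translation vector would come from calibrating how each optical receiver 24 is mounted on the body 10, and the function name, the 180-degree example pose, and the 6 cm offset are all assumptions for illustration, not values from the patent.

```python
import numpy as np

def to_reference_frame(points, R, t):
    """Convert 3D points from a receiver's image coordinate system
    (e.g. xa-ya-za) into the reference coordinate system xo-yo-zo.

    points: (N, 3) array of (x, y, z) coordinates in the receiver frame.
    R:      (3, 3) rotation matrix, receiver frame -> reference frame.
    t:      (3,) translation vector between the two frame origins.
    """
    return points @ R.T + t

# Example: a receiver mounted facing the opposite direction, i.e.
# rotated 180 degrees about the vertical (y) axis, offset 6 cm.
R_b = np.array([[-1.0, 0.0,  0.0],
                [ 0.0, 1.0,  0.0],
                [ 0.0, 0.0, -1.0]])
t_b = np.array([0.0, 0.0, 0.06])

pts_b = np.array([[0.1, 0.2, 1.5]])   # a point seen by that receiver
print(to_reference_frame(pts_b, R_b, t_b))
```

Applying the same transform to every pixel of an initial depth image expresses all depths relative to the single reference position mentioned above.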
After the depth information conversion is complete, all the initial depth images lie in one unified reference coordinate system, and each pixel of each initial depth image has a corresponding coordinate (xo, yo, zo); the initial depth images can then be stitched by coordinate matching. For example, suppose a pixel Pa in the initial depth image P1 has the coordinate (xo1, yo1, zo1), and a pixel Pb in the initial depth image P2 also has the coordinate (xo1, yo1, zo1). Since Pa and Pb have identical coordinate values in the current reference coordinate system, they are in fact the same point, so when the initial depth image P1 and the initial depth image P2 are stitched, the pixel Pa must coincide with the pixel Pb. In this way, the application processor 50 can stitch multiple initial depth images through the matching relationship of their coordinates and obtain a 360-degree panoramic depth image.
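The coordinate-matching splice can be sketched as below. This is a toy illustration under assumed names: a real implementation on full-resolution depth maps would use a spatial index rather than a linear scan, and the tolerance `tol` stands in for the error limit the text discusses next.

```python
import numpy as np

def stitch_by_coordinates(cloud1, cloud2, tol=1e-3):
    """Merge two point clouds that already share one reference frame:
    a point of cloud2 lying within `tol` of some point of cloud1 is
    treated as the same physical point (like Pa and Pb in the text)
    and kept only once.
    """
    merged = list(cloud1)
    for p in cloud2:
        dists = np.linalg.norm(cloud1 - p, axis=1)
        if dists.min() > tol:          # no counterpart in cloud1
            merged.append(p)
    return np.array(merged)

c1 = np.array([[0.0, 0.0, 1.0], [0.5, 0.0, 1.0]])
c2 = np.array([[0.5, 0.0, 1.0], [1.0, 0.0, 1.0]])  # first point duplicates c1
print(stitch_by_coordinates(c1, c2))   # 3 unique points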
It should be noted that stitching initial depth images based on the coordinate matching relationship requires the resolution of the initial depth images to be greater than a preset resolution. It will be appreciated that if the resolution of the initial depth images is low, the accuracy of the coordinates (xo, yo, zo) will also be relatively low; in that case, matching directly by coordinates may cause the problem that the points Pa and Pb do not actually coincide but differ by an offset whose value exceeds the error limit. If the resolution of the images is high, the accuracy of the coordinates (xo, yo, zo) will also be relatively high; in that case, when matching directly by coordinates, even if the points Pa and Pb do not coincide exactly but differ by an offset, the value of the offset will be less than the error limit, i.e., within the permitted error range, and will not unduly affect the stitching of the initial depth images.
It will be appreciated that in subsequent embodiments, two or more initial depth images may be stitched or synthesized in the manner described above; this is not illustrated case by case again.
The application processor 50 may also synthesize the two initial depth images with the two corresponding visible-light images into three-dimensional scene images for display and viewing by the user. For example, let the two visible-light images be the visible-light image V1 and the visible-light image V2. The application processor 50 may synthesize the initial depth image P1 with the visible-light image V1 and the initial depth image P2 with the visible-light image V2, and then stitch the two synthesized images to obtain one 360-degree three-dimensional scene image. Alternatively, the application processor 50 may first stitch the initial depth image P1 and the initial depth image P2 to obtain one 360-degree panoramic depth image, stitch the visible-light image V1 and the visible-light image V2 to obtain one 360-degree panoramic visible-light image, and then synthesize the panoramic depth image and the panoramic visible-light image into a 360-degree three-dimensional scene image.
Referring to Fig. 6, in one embodiment, the application processor 50 is configured to identify the target subject according to the two initial depth images obtained by the two microprocessors 40 and the two scene images captured by the two camera assemblies 30.
Specifically, when the scene images are infrared-light images, the two infrared-light images may be the infrared-light image I1 and the infrared-light image I2. The application processor 50 identifies the target subject in the first orientation according to the initial depth image P1 and the infrared-light image I1, and identifies the target subject in the third orientation according to the initial depth image P2 and the infrared-light image I2. When the scene images are visible-light images, the two visible-light images are the visible-light image V1 and the visible-light image V2. The application processor 50 identifies the target subject in the first orientation according to the initial depth image P1 and the visible-light image V1, and identifies the target subject in the third orientation according to the initial depth image P2 and the visible-light image V2.
When identifying the target subject means performing face recognition, the application processor 50 achieves higher accuracy by using infrared-light images as the scene images. The process by which the application processor 50 performs face recognition according to an initial depth image and an infrared-light image may be as follows:
First, face detection is performed according to the infrared-light image to determine a target face region. Since the infrared-light image contains the detailed information of the scene, after the infrared-light image is obtained, face detection can be performed on it to detect whether it contains a face. If the infrared-light image contains a face, the target face region where the face is located is extracted from the infrared-light image.
Then, liveness detection is performed on the target face region according to the initial depth image. Since each initial depth image corresponds to an infrared-light image and contains the depth information of the corresponding infrared-light image, the depth information corresponding to the target face region can be obtained from the initial depth image. Furthermore, since a living face is three-dimensional while a face displayed in a picture or on a screen is planar, it can be judged from the depth information of the target face region whether the region is three-dimensional or planar, thereby performing liveness detection on the target face region.
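The planar-versus-solid judgment can be sketched roughly as follows. The plane-fitting approach and the threshold value are assumptions chosen for illustration; the patent does not specify how the planarity of the target face region is computed.

```python
import numpy as np

def is_live_face(depth_region, flatness_thresh=0.01):
    """Rough liveness check: a face printed on paper or shown on a
    screen is (nearly) planar, so its depth points fit a plane with
    a very small residual; a real, three-dimensional face does not.

    depth_region: (N, 3) array of (x, y, z) points in the face region.
    """
    # Fit a plane z = a*x + b*y + c by least squares.
    A = np.column_stack([depth_region[:, 0], depth_region[:, 1],
                         np.ones(len(depth_region))])
    coeffs, *_ = np.linalg.lstsq(A, depth_region[:, 2], rcond=None)
    residual = depth_region[:, 2] - A @ coeffs
    return float(np.std(residual)) > flatness_thresh

# A perfectly planar "face" (e.g. a tilted photo) fails the check.
flat = np.array([[x, y, 0.5 + 0.1 * x]
                 for x in np.linspace(0, 1, 10)
                 for y in np.linspace(0, 1, 10)])
print(is_live_face(flat))   # False
```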
If liveness detection succeeds, the target face attribute parameters corresponding to the target face region are obtained, and face matching is performed on the target face region in the infrared-light image according to the target face attribute parameters to obtain a face matching result. The target face attribute parameters are parameters that can characterize the attributes of the target face; identification and matching of the target face can be performed according to them. The target face attribute parameters include, but are not limited to, face deflection angle, face luminance parameters, facial-feature parameters, skin-quality parameters, geometric feature parameters, and so on. The electronic device 100 may pre-store face attribute parameters for matching. After the target face attribute parameters are obtained, they can be compared with the pre-stored face attribute parameters. If the target face attribute parameters match the pre-stored face attribute parameters, face recognition succeeds.
It should be pointed out that the specific process by which the application processor 50 performs face recognition according to the initial depth image and the infrared-light image is not limited to the above; for example, the application processor 50 may also use the initial depth image to assist in detecting the facial contour, thereby improving the face recognition accuracy, and so on. The process by which the application processor 50 performs face recognition according to an initial depth image and a visible-light image is similar to the process of performing face recognition according to an initial depth image and an infrared-light image, and is not described separately here.
Referring to Fig. 6 and Fig. 7, the application processor 50 is further configured, when identifying the target subject according to the two initial depth images and the two scene images fails, to synthesize the two initial depth images obtained by the two microprocessors 40 into one merged depth image according to the fields of view of the optical receivers 24, to synthesize the two scene images captured by the two camera assemblies 30 into one merged scene image, and to identify the target subject according to the merged depth image and the merged scene image.
Specifically, in the embodiments shown in Fig. 6 and Fig. 7, since the field of view of the optical receiver 24 of each time-of-flight component 20 is limited, it may happen that half of a face lies in the initial depth image P1 and the other half lies in the initial depth image P2. The application processor 50 therefore synthesizes the initial depth image P1 and the initial depth image P2 into one merged depth image P12, and correspondingly synthesizes the infrared-light image I1 and the infrared-light image I2 (or the visible-light image V1 and the visible-light image V2) into one merged scene image I12 (or V12), so as to identify the target subject again according to the merged depth image P12 and the merged scene image I12 (or V12).
Referring to Fig. 8 and Fig. 9, in one embodiment, the application processor 50 is configured to judge the change in the distance between the target subject and the electronic device 100 according to multiple initial depth images.
Specifically, each optical transmitter 22 may emit laser pulses multiple times, and correspondingly each optical receiver 24 may expose multiple times. For example, at a first moment, the optical transmitters of the time-of-flight component 20a and of the time-of-flight component 20b emit laser pulses, the optical receiver 24a and the optical receiver 24b receive the laser pulses, and the two microprocessors 40 correspondingly obtain the initial depth image P11 and the initial depth image P21. At a second moment, the optical transmitters of the time-of-flight component 20a and of the time-of-flight component 20b emit laser pulses, the optical receiver 24a and the optical receiver 24b receive the laser pulses, and the two microprocessors 40 correspondingly obtain the initial depth image P12 and the initial depth image P22. The application processor 50 then judges the change in the distance between the target subject in the first orientation and the electronic device 100 according to the initial depth image P11 and the initial depth image P12, and judges the change in the distance between the target subject in the third orientation and the electronic device 100 according to the initial depth image P21 and the initial depth image P22.
It will be appreciated that, since the initial depth images contain the depth information of the target subject, the application processor 50 can judge the change in the distance between the target subject in the corresponding orientation and the electronic device 100 according to the depth information at multiple successive moments.
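A minimal sketch of that judgment might compare a representative depth of the subject between two moments. The frame names echo P11 and P12 above, but the use of the minimum depth and the 1 cm dead-band are illustrative assumptions, not details given in the patent.

```python
import numpy as np

def distance_trend(depth_frames, dead_band=0.01):
    """Judge how the subject's distance changes across successive
    depth frames (e.g. P11 at the first moment, P12 at the second)
    by comparing the minimum depth in each frame.

    depth_frames: sequence of 2D arrays of per-pixel depths (meters).
    Returns "closer", "farther" or "steady".
    """
    mins = [float(np.min(f)) for f in depth_frames]
    delta = mins[-1] - mins[0]
    if delta < -dead_band:
        return "closer"
    if delta > dead_band:
        return "farther"
    return "steady"

p11 = np.full((4, 4), 2.0)   # subject 2.0 m away at the first moment
p12 = np.full((4, 4), 1.6)   # 1.6 m away at the second moment
print(distance_trend([p11, p12]))   # "closer"
```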
Referring to Fig. 10, the application processor 50 is further configured, when judging the distance change according to multiple initial depth images fails, to synthesize the two initial depth images obtained by the two microprocessors 40 into one merged depth image according to the fields of view of the optical receivers 24; the application processor 50 performs this synthesis step continuously to obtain multiple successive merged depth images, and judges the distance change according to the multiple merged depth images.
Specifically, in the embodiment shown in Fig. 10, since the field of view of the optical receiver 24 of each time-of-flight component 20 is limited, it may happen that half of a face lies in the initial depth image P11 and the other half lies in the initial depth image P21. The application processor 50 therefore synthesizes the initial depth image P11 and the initial depth image P21 at the first moment into one merged depth image P121, correspondingly synthesizes the initial depth image P12 and the initial depth image P22 at the second moment into one merged depth image P122, and then re-judges the distance change according to the two merged depth images P121 and P122.
Referring to Fig. 9, when it is judged from multiple initial depth images that the distance change is a distance reduction, or when it is judged from multiple merged depth images that the distance change is a distance reduction, the application processor 50 may increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by at least one microprocessor 40.
It will be appreciated that when the distance between the target subject and the electronic device 100 decreases, the electronic device 100 cannot predict in advance whether the reduction poses a risk; therefore, the application processor 50 may increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by at least one microprocessor 40, so as to follow the distance change more closely. Specifically, when it is judged that the distance corresponding to a certain orientation decreases, the application processor 50 may increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by the microprocessor 40 of that orientation.
For example, at the first moment, the two microprocessors 40 obtain the initial depth image P11 and the initial depth image P21 respectively; at the second moment, the two microprocessors 40 obtain the initial depth image P12 and the initial depth image P22 respectively; at the third moment, the two microprocessors 40 obtain the initial depth image P13 and the initial depth image P23 respectively; and at the fourth moment, the two microprocessors 40 obtain the initial depth image P14 and the initial depth image P24 respectively.
Under normal circumstances, the application processor 50 selects the initial depth image P11 and the initial depth image P14 to judge the change in the distance between the target subject in the first orientation and the electronic device 100, and selects the initial depth image P21 and the initial depth image P24 to judge the change in the distance between the target subject in the third orientation and the electronic device 100. The rate at which the application processor 50 takes initial depth images for each orientation is one frame after every two skipped, i.e., one frame is selected out of every three.
When it is judged from the initial depth image P11 and the initial depth image P14 that the distance corresponding to the first orientation decreases, the application processor 50 may instead select the initial depth image P11 and the initial depth image P13 to judge the change in the distance between the target subject in the first orientation and the electronic device 100. The rate at which the application processor 50 takes initial depth images of the first orientation becomes one frame after every one skipped, i.e., one frame is selected out of every two. The frame rate of the other orientation remains unchanged, i.e., the application processor 50 still selects the initial depth image P21 and the initial depth image P24 to judge the distance change.
When it is judged from the initial depth image P11 and the initial depth image P14 that the distance corresponding to the first orientation decreases, and at the same time it is judged from the initial depth image P21 and the initial depth image P24 that the distance corresponding to the third orientation decreases, the application processor 50 may select the initial depth image P11 and the initial depth image P13 to judge the change in the distance between the target subject in the first orientation and the electronic device 100, and select the initial depth image P21 and the initial depth image P23 to judge the change in the distance between the target subject in the third orientation and the electronic device 100; the rate at which the application processor 50 takes initial depth images of the first and third orientations becomes one frame after every one skipped, i.e., one frame is selected out of every two.
Of course, when it is judged that the distance corresponding to any one orientation decreases, the application processor 50 may also increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by every microprocessor 40. That is, when it is judged from the initial depth image P11 and the initial depth image P14 that the distance between the target subject in the first orientation and the electronic device 100 decreases, the application processor 50 may select the initial depth image P11 and the initial depth image P13 to judge the change in the distance between the target subject in the first orientation and the electronic device 100, and select the initial depth image P21 and the initial depth image P23 to judge the change in the distance between the target subject in the third orientation and the electronic device 100.
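The selection policy in this example reduces to changing a sampling stride over the stream of initial depth images. A minimal sketch under assumed names (the patent describes the selection in terms of moments, not code):

```python
def select_frames(frames, stride):
    """Pick every `stride`-th frame from a stream of initial depth
    images: stride=3 mirrors the normal case in the text (P11 then
    P14), stride=2 the closer-watch case (P11 then P13) used once a
    distance reduction has been detected.
    """
    return frames[::stride]

stream = ["P11", "P12", "P13", "P14"]
print(select_frames(stream, 3))   # ['P11', 'P14']  normal rate
print(select_frames(stream, 2))   # ['P11', 'P13']  raised rate
```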
When the distance decreases, the application processor 50 may also judge the distance change in combination with the visible-light images or infrared-light images. Specifically, the application processor 50 first identifies the target subject according to the visible-light images or infrared-light images, and then judges the distance change according to the initial depth images at multiple moments, so that the electronic device 100 can perform different operations for different target subjects and different distance changes. Alternatively, when the distance decreases, the microprocessor 40 may control the corresponding optical transmitter 22 to increase the frequency of laser emission and the optical receiver 24 to increase the frequency of exposure, and so on.
It should be noted that the electronic device 100 of this embodiment may also serve as an external terminal, fixedly or detachably mounted outside a portable electronic device such as a mobile phone, tablet computer, or laptop computer, and may also be fixedly mounted on a movable object such as a vehicle body (as shown in Fig. 7 and Fig. 8), an unmanned aerial vehicle body, a robot body, or a ship body. In specific use, when the electronic device 100 synthesizes one panoramic depth image from multiple initial depth images as described above, the panoramic depth image can be used for three-dimensional modeling, simultaneous localization and mapping (SLAM), and augmented reality display. When the electronic device 100 identifies the target subject as described above, it can be applied to face-recognition unlocking and payment on portable electronic devices, or to obstacle avoidance by robots, vehicles, unmanned aerial vehicles, ships, and the like. When the electronic device 100 judges the change in the distance between the target subject and itself as described above, it can be applied to automatic driving and object tracking by robots, vehicles, unmanned aerial vehicles, ships, and the like.
Referring to Fig. 2 and Fig. 11, an embodiment of the present application further provides a mobile platform 300. The mobile platform 300 includes a body 10 and multiple time-of-flight components 20 arranged on the body 10. The multiple time-of-flight components 20 are located at multiple different orientations of the body 10. Each time-of-flight component 20 includes two optical transmitters 22 and one optical receiver 24. The field of view of each optical transmitter 22 is any value from 80 degrees to 120 degrees, and the field of view of each optical receiver 24 is any value from 180 degrees to 200 degrees. The optical transmitters 22 are configured to emit laser pulses outward from the body 10, and the optical receiver 24 is configured to receive the laser pulses emitted by the corresponding two optical transmitters 22 and reflected by the target subject. The optical transmitters 22 of the multiple time-of-flight components 20 emit laser light simultaneously, and the optical receivers 24 of the multiple time-of-flight components 20 expose simultaneously, to obtain a panoramic depth image.
Specifically, the body 10 may be a vehicle body, an unmanned aerial vehicle body, a robot body, or a ship body.
Referring to Fig. 11, when the body 10 is a vehicle body, the number of time-of-flight components 20 is two, and the two time-of-flight components 20 are respectively mounted on two sides of the vehicle body, for example the front and the rear, or the left side and the right side. The vehicle body can carry the two time-of-flight components 20 as it moves along a road, constructing a 360-degree panoramic depth image of the travel route for use as a reference map, or obtaining the initial depth images of two different orientations to identify the target subject and judge the change in the distance between the target subject and the mobile platform 300, thereby controlling the vehicle body to accelerate, decelerate, stop, detour, and so on, and realizing unmanned obstacle avoidance. For example, while the vehicle is moving on the road, if it is recognized that the distance between the target subject and the vehicle decreases and the target subject is a pit in the road, the vehicle decelerates with a first acceleration; if it is recognized that the distance between the target subject and the vehicle decreases and the target subject is a person, the vehicle decelerates with a second acceleration, where the absolute value of the first acceleration is less than the absolute value of the second acceleration. In this way, performing different operations for different target subjects when the distance decreases can make the vehicle more intelligent.
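A decision rule of this kind can be sketched as follows. Only the ordering of the two magnitudes (|first acceleration| < |second acceleration|) comes from the text; the numeric values, names, and the fallback case are invented placeholders.

```python
def braking_acceleration(target_kind, closing):
    """Pick a deceleration (m/s^2, negative = braking) when the
    distance to the recognized target is shrinking: gentler for a
    pit in the road, harder for a person, so that
    |a_pit| < |a_person| as in the text.
    """
    if not closing:          # distance not decreasing: no braking
        return 0.0
    # Illustrative values only; -4.0 is a hypothetical default for
    # target kinds the text does not enumerate.
    return {"pit": -2.0, "person": -6.0}.get(target_kind, -4.0)

print(braking_acceleration("pit", True))      # -2.0
print(braking_acceleration("person", True))   # -6.0
```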
Referring to Fig. 12, when the body 10 is an unmanned aerial vehicle body, the number of time-of-flight components 20 is two, and the two time-of-flight components 20 are respectively mounted on opposite sides of the unmanned aerial vehicle body, for example the front and rear sides or the left and right sides, or on opposite sides of a gimbal carried on the unmanned aerial vehicle body. The unmanned aerial vehicle body can carry the multiple time-of-flight components 20 in flight for aerial photography, inspection, and so on; the unmanned aerial vehicle can return the obtained panoramic depth image to a ground control terminal, or perform SLAM directly. The multiple time-of-flight components 20 enable the unmanned aerial vehicle to accelerate, decelerate, stop, avoid obstacles, and track objects.
Referring to Fig. 13, when the body 10 is a robot body, for example that of a sweeping robot, the number of time-of-flight components 20 is two, and the two time-of-flight components 20 are respectively mounted on opposite sides of the robot body. The robot body can carry the multiple time-of-flight components 20 as it moves about the home, obtaining the initial depth images of multiple different orientations, so as to identify the target subject and judge the change in the distance between the target subject and the mobile platform 300, thereby controlling the movement of the robot body and enabling the robot to remove garbage, avoid obstacles, and so on.
Referring to Fig. 14, when the body 10 is a ship body, the number of time-of-flight components 20 is two, and the two time-of-flight components 20 are respectively mounted on opposite sides of the ship body. The ship body can carry the time-of-flight components 20 as it moves, obtaining the initial depth images of multiple different orientations, so as to accurately identify the target subject in adverse environments (for example, in fog or haze), judge the change in the distance between the target subject and the mobile platform 300, improve navigation safety, and so on.
The mobile platform 300 of the embodiments of the present application is a platform capable of independent movement, and the multiple time-of-flight components 20 are mounted on its body 10 to obtain panoramic depth images. By contrast, the body of the electronic device 100 of the embodiments of the present application generally cannot move independently; the electronic device 100 may further be mounted on a device capable of movement, similar to the mobile platform 300, thereby helping that device obtain panoramic depth images.
It should be pointed out that the above descriptions of the body 10, the time-of-flight components 20, the camera assemblies 30, the microprocessors 40, and the application processor 50 of the electronic device 100 apply equally to the mobile platform 300 of the embodiments of the present application, and are not repeated here.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and should not be construed as limiting the application; those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the application, and the scope of the application is defined by the claims and their equivalents.

Claims (11)

1. An electronic device, characterized in that the electronic device comprises:
a body; and
a plurality of time-of-flight components arranged on the body, the plurality of time-of-flight components being located at a plurality of different orientations of the body, each time-of-flight component comprising two optical transmitters and one optical receiver, the field of view of each optical transmitter being any value from 80 degrees to 120 degrees, the field of view of each optical receiver being any value from 180 degrees to 200 degrees, the optical transmitters being configured to emit laser pulses outward from the body, and the optical receiver being configured to receive the laser pulses emitted by the corresponding two optical transmitters and reflected by a target subject;
wherein the optical transmitters of the plurality of time-of-flight components emit laser light simultaneously, and the optical receivers of the plurality of time-of-flight components expose simultaneously, so as to obtain a panoramic depth image.
2. The electronic device according to claim 1, characterized in that there are two time-of-flight components, and the wavelengths of the laser pulses emitted by adjacent optical transmitters of the two time-of-flight components are different.
3. The electronic device according to claim 2, characterized in that the wavelength of the laser pulses emitted by each optical transmitter is different.
4. The electronic device according to claim 2, characterized in that the electronic device further comprises an application processor and two microprocessors, each microprocessor corresponding to one time-of-flight component, both microprocessors being connected to the application processor, each microprocessor being configured to obtain an initial depth image according to the laser pulses emitted by the optical transmitters of the corresponding time-of-flight component and the laser pulses received by the optical receiver, and to transmit the initial depth image to the application processor; and the application processor is configured to synthesize the two initial depth images obtained by the two microprocessors into one said panoramic depth image according to the fields of view of the optical receivers.
5. The electronic device according to claim 2, characterized in that the electronic device further comprises an application processor and two microprocessors, each microprocessor corresponding to one time-of-flight component, both microprocessors being connected to the application processor, each microprocessor being configured to obtain an initial depth image according to the laser pulses emitted by the optical transmitters of the corresponding time-of-flight component and the laser pulses received by the optical receiver, and to transmit the initial depth image to the application processor;
the electronic device further comprises two camera assemblies arranged on the body, each camera assembly corresponding to one time-of-flight component, both camera assemblies being connected to the application processor, each camera assembly being configured to capture a scene image of the target subject and output the scene image to the application processor; and
the application processor is configured to identify the target subject according to the two initial depth images obtained by the two microprocessors and the two scene images captured by the two camera assemblies.
6. The electronic device according to claim 5, characterized in that the application processor is further configured, when identifying the target subject according to the two initial depth images and the two scene images fails, to synthesize the two initial depth images obtained by the two microprocessors into one merged depth image according to the fields of view of the optical receivers, to synthesize the two scene images captured by the two camera assemblies into one merged scene image, and to identify the target subject according to the merged depth image and the merged scene image.
7. The electronic device according to claim 2, characterized in that the electronic device further comprises an application processor and two microprocessors, each microprocessor corresponding to one time-of-flight component, both microprocessors being connected to the application processor, each microprocessor being configured to obtain multiple initial depth images according to the laser pulses emitted multiple times by the optical transmitters of the corresponding time-of-flight component and the laser pulses received by the optical receiver, and to transmit the initial depth images to the application processor; and the application processor is configured to judge the change in the distance between the target subject and the electronic device according to the multiple initial depth images.
8. The electronic device according to claim 7, characterized in that the application processor is further configured, when judging the distance change according to the multiple initial depth images fails, to synthesize the two initial depth images obtained by the two microprocessors into one merged depth image according to the fields of view of the optical receivers; the application processor performs the synthesis step continuously to obtain multiple successive merged depth images, and judges the distance change according to the multiple merged depth images.
9. The electronic device according to claim 7 or 8, characterized in that the application processor is further configured, when judging that the distance change is a distance reduction, to increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by at least one of the microprocessors.
10. A mobile platform, characterized in that the mobile platform comprises:
a body; and
a plurality of time-of-flight components arranged on the body, the plurality of time-of-flight components being located at a plurality of different orientations of the body, each time-of-flight component comprising two optical transmitters and one optical receiver, the field of view of each optical transmitter being any value from 80 degrees to 120 degrees, the field of view of each optical receiver being any value from 180 degrees to 200 degrees, the optical transmitters being configured to emit laser pulses outward from the body, and the optical receiver being configured to receive the laser pulses emitted by the corresponding two optical transmitters and reflected by a target subject;
wherein the optical transmitters of the plurality of time-of-flight components emit laser light simultaneously, and the optical receivers of the plurality of time-of-flight components expose simultaneously, so as to obtain a panoramic depth image.
11. The mobile platform according to claim 10, wherein the body is a vehicle body, an unmanned aerial vehicle body, a robot body, or a boat body.
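The control flow recited in claims 7 to 9 (estimate the subject distance from successive initial depth images, fall back to a merged frame when estimation fails, and raise the acquisition frame rate when the subject approaches) can be sketched as follows. This is a minimal illustration: every function name, the mean-based distance estimate, and the naive concatenation-based merge are our assumptions, not anything specified in the patent.

```python
def estimate_distance(depth_image):
    """Return the mean of valid (non-zero) depth samples, or None on failure."""
    valid = [d for d in depth_image if d > 0]
    return sum(valid) / len(valid) if valid else None

def merge_depth_images(left, right, overlap=0):
    """Naively concatenate two initial depth images into one merged frame;
    a real implementation would stitch according to the receivers' fields of view."""
    return left + (right[overlap:] if overlap else right)

def update_frame_rate(prev_dist, curr_dist, fps, boosted_fps=60):
    """Raise the acquisition frame rate when the subject is approaching."""
    if prev_dist is not None and curr_dist is not None and curr_dist < prev_dist:
        return boosted_fps
    return fps
```

For instance, if two consecutive distance estimates drop from 5 m to 4 m, `update_frame_rate` would switch the acquisition rate from its current value to the boosted rate, matching the behavior claimed for an approaching subject.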
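The depth acquisition in claim 10 rests on the time-of-flight principle: each optical receiver measures the round-trip delay of a laser pulse, and depth is half the distance light travels in that time. A minimal sketch, with the function name and example delay chosen by us for illustration:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(round_trip_seconds):
    """Depth in metres from a measured pulse round-trip time (d = c * t / 2)."""
    return C * round_trip_seconds / 2.0

# A 20 ns round trip corresponds to a depth of about 3 m.
```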
CN201910008293.0A 2019-01-04 2019-01-04 Electronic Devices and Mobile Platforms Expired - Fee Related CN109803089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910008293.0A CN109803089B (en) 2019-01-04 2019-01-04 Electronic Devices and Mobile Platforms


Publications (2)

Publication Number Publication Date
CN109803089A true CN109803089A (en) 2019-05-24
CN109803089B CN109803089B (en) 2021-05-18

Family

ID=66558483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910008293.0A Expired - Fee Related CN109803089B (en) 2019-01-04 2019-01-04 Electronic Devices and Mobile Platforms

Country Status (1)

Country Link
CN (1) CN109803089B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106371281A (en) * 2016-11-02 2017-02-01 辽宁中蓝电子科技有限公司 Multi-module 360-degree space scanning and positioning 3D camera based on structured light
US9653874B1 (en) * 2011-04-14 2017-05-16 William J. Asprey Trichel pulse energy devices
CN107263480A (en) * 2017-07-21 2017-10-20 深圳市萨斯智能科技有限公司 A kind of robot manipulation's method and robot
CN107742296A (en) * 2017-09-11 2018-02-27 广东欧珀移动通信有限公司 Dynamic image generation method and electronic device
US20180139431A1 (en) * 2012-02-24 2018-05-17 Matterport, Inc. Capturing and aligning panoramic image and depth data
CN108471487A (en) * 2017-02-23 2018-08-31 钰立微电子股份有限公司 Image device for generating panoramic depth image and related image device
CN108541304A (en) * 2015-04-29 2018-09-14 苹果公司 Flight time depth map with flexible scan pattern
CN108616703A (en) * 2018-04-23 2018-10-02 Oppo广东移动通信有限公司 Electronic device, control method thereof, computer apparatus, and readable storage medium


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113126111A (en) * 2019-12-30 2021-07-16 Oppo广东移动通信有限公司 Time-of-flight module and electronic equipment
CN113126111B (en) * 2019-12-30 2024-02-09 Oppo广东移动通信有限公司 Time-of-flight module and electronic device
CN114095713A (en) * 2021-11-23 2022-02-25 京东方科技集团股份有限公司 Imaging module, processing method, system, device and medium thereof

Also Published As

Publication number Publication date
CN109803089B (en) 2021-05-18

Similar Documents

Publication Publication Date Title
CN109862275A (en) Electronic equipment and mobile platform
US11277597B1 (en) Marker-based guided AR experience
US9432593B2 (en) Target object information acquisition method and electronic device
US20250252602A1 (en) Collaborative augmented reality eyewear with ego motion alignment
US12010286B2 (en) Separable distortion disparity determination
CN106371281A (en) Multi-module 360-degree space scanning and positioning 3D camera based on structured light
US20210232858A1 (en) Methods and systems for training an object detection algorithm using synthetic images
US12073536B2 (en) Dirty lens image correction
EP4172681A1 (en) Augmented reality eyewear with 3d costumes
CN108885487B (en) Gesture control method of wearable system and wearable system
CN109618108B (en) Electronic Devices and Mobile Platforms
US11580300B1 (en) Ring motion capture and message composition system
CN109587303A (en) Electronic equipment and mobile platform
JP2016157458A (en) Information processing device
CN109788172A (en) Electronic equipment and mobile platform
CN109803089A (en) Electronic equipment and mobile platform
CN109688400A (en) Electronic equipment and mobile platform
CN109587304A (en) Electronic equipment and mobile platform
CN109660731A (en) Electronic equipment and mobile platform
CN109618085A (en) Electronic Devices and Mobile Platforms
CN109788195A (en) Electronic equipment and mobile platform
CN109788196A (en) Electronic equipment and mobile platform
CN109729250A (en) Electronic Devices and Mobile Platforms
CN109660733A (en) Electronic equipment and mobile platform
CN109660732A (en) Electronic equipment and mobile platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210518