
CN109803079A - A mobile terminal, a photographing method thereof, and a computer storage medium - Google Patents


Info

Publication number
CN109803079A
CN109803079A (application CN201910120670.XA; granted as CN109803079B)
Authority
CN
China
Prior art keywords
target
captured
mobile terminal
laser
camera module
Prior art date
Legal status
Granted
Application number
CN201910120670.XA
Other languages
Chinese (zh)
Other versions
CN109803079B (en)
Inventor
马美雪
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910120670.XA
Publication of CN109803079A
Application granted
Publication of CN109803079B
Expired - Fee Related

Landscapes

  • Studio Devices (AREA)

Abstract


The present application discloses a mobile terminal, a photographing method thereof, and a computer storage medium. The mobile terminal includes: a camera module; a first sensor component for obtaining the position change of a target to be photographed; and a controller for determining the motion state of the target to be photographed according to its position change and, when the camera module shoots, performing motion compensation according to that motion state. In this way, the clarity of the moving target can be guaranteed, and the quality of photographing moving objects is improved.

Description

A mobile terminal, a photographing method thereof, and a computer storage medium
Technical field
The present application relates to the technical field of mobile terminals, and in particular to a mobile terminal, a photographing method thereof, and a computer storage medium.
Background art
In existing human-computer interaction, in-air gesture operations are generally implemented with the camera on a mobile phone, including depth cameras, binocular cameras, and monocular cameras. The camera detects the user's gesture state, image recognition is performed based on machine-learning algorithms, and the result is compared with preset gesture-motion images to realize the in-air gesture operation. Other schemes use an infrared laser emitter and judge the gesture state by detecting the infrared reflection that is received.
Gesture recognition implemented with a camera and an infrared emitter is limited by the angles of the camera and the infrared receiver: it only works when the action occurs within a characteristic range, and it must be combined with complex image algorithms, which consumes considerable system resources. In addition, the camera and the infrared emitter draw a large amount of power, which is unfavorable for mobile portable devices.
Summary of the invention
One technical solution adopted by the present application is to provide a mobile terminal, which includes: a camera module; a first sensor component for obtaining the position change of a target to be captured; and a controller for determining the motion state of the target to be captured according to its position change and, according to that motion state, performing motion compensation when the camera module shoots.
Another technical solution adopted by the present application is to provide a photographing method for a mobile terminal, the method including: obtaining the position change of a target to be captured; determining the motion state of the target to be captured according to its position change; and, when the camera module shoots, performing motion compensation according to the motion state of the target to be captured.
Another technical solution adopted by the present application is to provide a computer storage medium for storing a computer program which, when executed by a processor, implements the method described above.
The mobile terminal provided by the present application includes: a camera module; a first sensor component for obtaining the position change of a target to be captured; and a controller for determining the motion state of the target to be captured according to its position change and, when the camera module shoots, performing motion compensation according to that motion state. In this way, motion compensation can be performed during shooting based on the movement speed of the target to be captured, which guarantees the clarity of the moving target and improves the quality of shooting moving objects.
Brief description of the drawings
To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from them without creative effort. In the drawings:
Fig. 1 is a structural schematic diagram of a first embodiment of the mobile terminal provided by the present application;
Fig. 2 is a structural schematic diagram of the first sensor component in the first embodiment of the mobile terminal provided by the present application;
Fig. 3 is a schematic diagram of the movement of the target to be captured;
Fig. 4 is a schematic diagram of the moving distance of the target to be captured;
Fig. 5 is a structural schematic diagram of a second embodiment of the mobile terminal provided by the present application;
Fig. 6 is a structural schematic diagram of a third embodiment of the mobile terminal provided by the present application;
Fig. 7 is a structural schematic diagram of the camera module in the third embodiment of the mobile terminal provided by the present application;
Fig. 8 is a structural schematic diagram of the camera motor in the third embodiment of the mobile terminal provided by the present application;
Fig. 9 is a schematic flowchart of a first embodiment of the photographing method for a mobile terminal provided by the present application;
Fig. 10 is a schematic flowchart of a second embodiment of the photographing method for a mobile terminal provided by the present application;
Fig. 11 is a structural schematic diagram of an embodiment of the computer storage medium provided by the present application.
Specific embodiment
Referring to Fig. 1, Fig. 1 is a structural schematic diagram of the first embodiment of the mobile terminal provided by the present application. The mobile terminal 10 includes a camera module 11, a controller 12, and a first sensor component 13.
The camera module 11 is used for shooting, including taking photos and recording video. The controller 12 may be a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and is used for processing data or for issuing control instructions to control the operation of the other components.
The first sensor component 13 is used to obtain the position of the target to be captured and its change; the controller 12 is used to determine the motion state of the target to be captured according to the position change and, according to that motion state, to perform motion compensation when the camera module shoots.
The motion state may include the moving direction, speed, and acceleration of the target to be captured; the speed is the main concern here. It should be understood that a feasible scheme for detecting the speed of the target to be captured is to detect the change of its position over time with an optical sensor and calculate the speed indirectly. Detecting the position change of the target to be captured means obtaining the change in the distance and the direction between the target to be captured and the capture device. For example, in one case the direction of the line between the target and the mobile terminal is constant while the distance between them is changing; in another case the distance between the target and the mobile terminal is constant while the direction of the line between them is changing. Several ways of detecting the distance between the object to be shot and the capture device are introduced below.
Optionally, the first sensor component 13 may be a laser sensor component, including a laser emitter and a laser receiver, which performs distance detection using TOF (Time of Flight). In TOF, i.e., time-of-flight 3D imaging, the laser emitter continuously sends light pulses to the object to be measured; the laser receiver then receives the light returned from the object, and the object distance is obtained by detecting the (round-trip) flight time of the light pulses.
The TOF ranging method belongs to two-way ranging: it mainly uses the round-trip flight time of a signal between two asynchronous transceivers (or via a reflecting surface) to measure the distance between nodes. Traditional ranging technology is divided into two-way and one-way ranging. When the signal is well modulated, or under non-line-of-sight conditions, the result estimated by the RSSI (Received Signal Strength Indication) ranging method is more satisfactory; under line-of-sight conditions, the TOF ranging estimation method can make up for the deficiencies of the RSSI-based method.
Optionally, the first sensor component 13 may be a 3D structured-light sensor component, including a projector and a camera. After the projector projects specific optical information onto the surface of the object to be measured and the background, the camera captures it. The position and the depth of the object are calculated from the change of the optical signal caused by the object, the entire three-dimensional space is reconstructed, and the distance of the target to be measured is thus obtained.
Optionally, in other embodiments, the distance detection can also be realized with an ultrasonic sensor or a lidar. Besides the distance sensor, an inertial sensor, an accelerometer, a gyroscope, or the like can also be used to further correct the detected distance.
The distance detection is introduced below through a specific embodiment.
As shown in Fig. 2, Fig. 2 is a structural schematic diagram of the first sensor component in the first embodiment of the mobile terminal provided by the present application. The first sensor component 13 is a multi-point laser ranging module, including laser emitters 131 and corresponding laser receivers 132.
The laser emitter 131 is used to emit laser light toward the target to be captured; the laser receiver 132 is used to receive the reflected laser light; the controller (not shown in Fig. 2) is used to calculate the distance between the first sensor component 13 and the object A to be shot from the time difference between the laser emitter 131 and the corresponding laser receiver 132. It should be understood that, in a practical structure, the distance between the laser emitter 131 and the laser receiver 132 is very small and can be ignored, so the emitted light and the reflected light can be considered approximately coincident. The distance between the first sensor component 13 and the object A to be shot can then be calculated using the following formula:

L = c · t / 2

where L is the distance between the first sensor component 13 and the object A to be shot, c is the speed of light, and t is the time difference between the moment the laser emitter 131 emits the laser and the moment the laser receiver 132 receives it.
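As a minimal sketch of this time-of-flight distance calculation (the factor of 2 accounts for the round trip; the function name and units are illustrative, not from the patent):

```python
C = 299_792_458  # speed of light c, in m/s

def tof_distance(t: float) -> float:
    """One-way distance from a round-trip time-of-flight measurement t
    (seconds): the pulse travels to the target and back, so L = c * t / 2."""
    return C * t / 2

# A pulse that returns after 10 ns corresponds to roughly 1.5 m.
d = tof_distance(10e-9)
```

At these time scales, measuring t to sub-nanosecond precision is what limits the range resolution.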
Optionally, the laser emitters 131 and the corresponding laser receivers 132 can be distributed in an array, for example a 4×4 array.
As shown in Fig. 3, Fig. 3 is a schematic diagram of the movement of the target to be captured, showing the movement of the target at the three moments t0, t1, and t2. It should be understood that, since the position of the mobile terminal generally does not change, the positions of the sensors can be considered fixed; if the distances detected by different sensors then change, the target to be captured can be considered to have moved. Further, the movement of the target to be captured can be determined from the change of the distance obtained by each sensor.
As shown in Fig. 4, Fig. 4 is a schematic diagram of the moving distance of the target to be captured. Two adjacent sensors measure the distances a and b respectively within the time interval t. Since the angle θ between the sensors can be set accurately during design, production, and calibration, the moving distance s of the target to be captured can be calculated:

s = √(a² + b² − 2ab·cos θ)

Further, the speed v of the target to be captured:

v = s / t

where v is the speed of the target to be captured, a is the distance between the first laser emitter and the target point on the target to be captured at the first moment, b is the distance between the second laser emitter and the target point at the second moment, θ is the angle between the first laser emitter and the second laser emitter, and t is the length of time between the first moment and the second moment.
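The speed computation from the two laser readings a and b and the beam angle θ can be sketched as follows; the law-of-cosines step for the chord between the two measured points is an assumption implied by the triangle geometry, and the function name is illustrative:

```python
import math

def target_speed(a: float, b: float, theta: float, t: float) -> float:
    """Speed of the target from two laser readings taken t seconds apart.
    a and b are distances measured along two beams separated by the angle
    theta (radians); the chord s between the two measured points follows
    from the law of cosines, and v = s / t."""
    s = math.sqrt(a * a + b * b - 2 * a * b * math.cos(theta))
    return s / t

# Readings of 2.0 m and 2.0 m on beams 30 degrees apart, 0.5 s apart.
v = target_speed(2.0, 2.0, math.radians(30.0), 0.5)
```

Note that this gives only the component of motion visible to the two beams; a larger sensor array refines both the distance and the direction estimate.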
Referring to Fig. 5, Fig. 5 is a structural schematic diagram of the second embodiment of the mobile terminal provided by the present application. The mobile terminal 50 includes a camera module 51, a controller 52, a first sensor component 53, and a second sensor component 54.
The camera module 51 is used for shooting, including taking photos and recording video. The controller 52 may be a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and is used for processing data or for issuing control instructions to control the operation of the other components.
The first sensor component 53 is used to obtain the position change of the target to be captured, and the controller 52 is used to determine the motion state of the target to be captured according to the position change. The second sensor component 54 is used to obtain the motion state of the mobile terminal 50, and the controller 52 is used to perform motion compensation according to the motion state of the target to be captured and the motion state of the mobile terminal 50 when the camera module 51 shoots.
The motion state obtained with the first sensor component 53 is the moving direction, speed, and acceleration of the target to be captured, etc.; the motion state obtained with the second sensor component 54 is the moving direction, speed, and acceleration of the mobile terminal 50, and may also be the jitter frequency, the jitter amplitude, etc. of the mobile terminal 50.
Specifically, the second sensor component 54 may include two gyroscopes (or accelerometers), which detect the left-right and the front-back tilt angles respectively and then transfer the two angle signals to the controller 52; the controller 52 obtains the motion state of the mobile terminal 50 based on these angle signals.
Referring to Fig. 6, Fig. 6 is a structural schematic diagram of the third embodiment of the mobile terminal provided by the present application. The mobile terminal 60 includes a camera module 61, a controller 62, and a first sensor component 63.
The camera module 61 is used for shooting, including taking photos and recording video. The controller 62 may be a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and is used for processing data or for issuing control instructions to control the operation of the other components.
The first sensor component 63 is used to obtain the position change of the target to be captured; the controller 62 is used to determine the motion state of the target to be captured according to the position change and, according to that motion state, to issue control instructions to the camera module 61 so as to perform motion compensation when the camera module shoots.
As shown in Fig. 7, Fig. 7 is a structural schematic diagram of the camera module in the third embodiment of the mobile terminal provided by the present application. The camera module 61 specifically includes a camera motor 611, a lens 612, and a driver 613.
The camera module 61 may be arranged on a circuit board 60a in the mobile terminal 60; the circuit board 60a may specifically be a flexible printed circuit (FPC) connected above the mainboard of the electronic device through a BTB (board-to-board) connector. In addition, the camera module 61 may also include other components such as a light sensor and a flash, or may be combined with other modules such as an earpiece to form one multi-functional module, which will not be repeated here.
The camera motor 611 may be a mechanical motor, an electronic motor, a ring-shaped ultrasonic motor, etc., which is not limited here. Optionally, in one embodiment, a voice coil motor can be used to adjust the position of the lens 612. A voice coil motor (Voice Coil Actuator / Voice Coil Motor) is a device that converts electrical energy into mechanical energy and realizes linear motion and motion over a limited swing angle.
Referring also to Fig. 8, Fig. 8 is a structural schematic diagram of the camera motor in the third embodiment of the mobile terminal provided by the present application. The camera motor 611 includes a magnetic shield 611a, a magnet 611b, an upper spring plate 611c, a lens carrier 611d, a motor coil 611e, a lower spring plate 611f, and a motor base 611g. The lens 612 is fixed on the lens carrier 611d; the driver 613 is specifically used to obtain driving instructions and control the current strength in the motor coil 611e, thereby controlling the movement of the lens carrier 611d in the motor 611.
Optionally, the camera motor 611 may be a voice coil motor. The working principle of a voice coil motor is as follows: in the permanent magnetic field formed by the magnet 611b, the stretch position of the spring plates (i.e., the upper spring plate 611c and the lower spring plate 611f) is controlled by changing the direct current in the motor coil 611e, thereby driving the lens carrier 611d, and in turn the lens 612, to move back and forth.
In combination with the above embodiments, after the controller 62 in this embodiment obtains the motion state (such as the movement speed) of the target to be captured, motion compensation can be performed in two ways.
The first way may also be called optical image stabilization. Optical image stabilization suspends the lens with magnetic force, thereby effectively overcoming image blur caused by camera vibration; the effect is more obvious on digital cameras with large zoom lenses. In general, a gyroscope in the lens detects small movements and immediately transmits the signal to a microprocessor, which calculates the displacement that needs to be compensated; the compensation lens group then compensates according to the jitter direction and the displacement of the lens, adjusting its position and angle accordingly so that the optical path remains stable, thus effectively overcoming image blur caused by camera vibration.
In this embodiment, the optical image stabilization technique is applied to the movement of the target to be captured. When the laser array detects that the target to be captured has moved, the microprocessor determines its movement speed from the change of its distance; the compensation lens group then compensates according to the moving direction and the movement speed of the target, adjusting its position and angle accordingly so that the optical path remains stable, thus effectively overcoming the image blur.
Specifically, the movement speed of the lens can be adjusted in a fixed proportion. For example, the distance between the mobile terminal and the target to be captured has been obtained in the foregoing embodiment; assuming here that the lens and the sensor are at the same distance from the target to be captured, the movement speed of the lens can be adjusted according to the following formula:

v2 = v1 · s2 / s1

where v1 is the movement speed of the target to be captured, v2 is the movement speed of the lens, s1 is the distance between the target to be captured and the CCD (image sensor), and s2 is the distance between the lens and the CCD.
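A sketch of this proportional lens-speed adjustment, assuming the simple ratio v2 = v1 · s2 / s1 implied by the symbol definitions (the function name and the example figures are illustrative):

```python
def lens_speed(v1: float, s1: float, s2: float) -> float:
    """Movement speed v2 for the compensation lens: the target's speed v1
    scaled by the ratio of the lens-to-CCD distance s2 to the
    target-to-CCD distance s1."""
    return v1 * s2 / s1

# A target 2 m from the CCD moving at 1 m/s, with the lens 4 mm from
# the CCD, calls for a lens speed of 2 mm/s.
v2 = lens_speed(1.0, 2.0, 0.004)
```

Because s2 is orders of magnitude smaller than s1, the required lens travel stays within the sub-millimeter stroke a voice coil motor can deliver.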
The second way may also be called electronic image stabilization. Electronic image stabilization mainly refers to forcibly raising the sensitivity parameter of the CCD while speeding up the shutter, analyzing the image obtained on the CCD, and then compensating with the edge image. Electronic image stabilization is actually a technique that compensates for jitter by reducing image quality; it tries to find a balance point between image quality and image jitter.
Electronic image stabilization produces its stabilizing effect by processing the picture with digital circuitry. When the stabilization circuit works, the captured picture is only about 90% of the real picture; the digital circuit then performs blur judgment on the jitter of the target to be captured and carries out jitter compensation with the remaining roughly 10% of the picture. This method is low-cost, but it reduces the utilization rate of the CCD and brings a certain loss of image sharpness.
In contrast to the prior art, the mobile terminal in this embodiment includes: a camera module; a first sensor component for obtaining the distance of the target to be captured; and a controller for determining the motion state of the target to be captured according to the change of the distance and, when the camera module shoots, performing motion compensation according to that motion state. In this way, motion compensation can be performed during shooting based on the movement speed of the target to be captured, which guarantees the clarity of the moving target and improves the quality of shooting moving objects.
Referring to Fig. 9, Fig. 9 is a schematic flowchart of the first embodiment of the photographing method for a mobile terminal provided by the present application. The method includes:
Step 91: obtain the position change of the target to be captured.
In one case, the direction of the line between the target to be captured and the mobile terminal is constant while the distance between them is changing; in another case, the distance between the target to be captured and the mobile terminal is constant while the direction of the line between them is changing.
Optionally, a laser sensor component can be used to obtain the position change of the target to be captured. Specifically, the laser sensor component includes a laser emitter and a laser receiver and performs distance detection using TOF (Time of Flight): the laser emitter continuously sends light pulses to the object to be measured, the laser receiver then receives the light returned from the object, and the object distance is obtained by detecting the (round-trip) flight time of the light pulses.
The TOF ranging method belongs to two-way ranging: it mainly uses the round-trip flight time of a signal between two asynchronous transceivers (or via a reflecting surface) to measure the distance between nodes. Traditional ranging technology is divided into two-way and one-way ranging. When the signal is well modulated, or under non-line-of-sight conditions, the result estimated by the RSSI (Received Signal Strength Indication) ranging method is more satisfactory; under line-of-sight conditions, the TOF ranging estimation method can make up for the deficiencies of the RSSI-based method.
Optionally, the distance of the target to be captured can also be obtained with a structured-light sensor component. Specifically, the structured-light sensor component includes a projector and a camera; after the projector projects specific optical information onto the surface of the object to be measured and the background, the camera captures it. The position and the depth of the object are calculated from the change of the optical signal caused by the object, the entire three-dimensional space is reconstructed, and the distance of the object to be measured is thus obtained.
Step 92: determine the motion state of the target to be captured according to its position change.
Optionally, the distance detection can be performed with a multi-point laser matrix. Optionally, the multi-point laser matrix can adopt a 4×4 array, i.e., 16 laser emitters and 16 corresponding laser receivers. The distance of the object to be shot can then be calculated using the following formula:

L = c · t / 2

where L is the distance between the mobile terminal and the object to be shot, c is the speed of light, and t is the time difference between the moment the laser emitter emits the laser and the moment the laser receiver receives it.
It should be understood that, since the position of the mobile terminal generally does not change, the positions of the sensors can be considered fixed; if the distances detected by different sensors then change, the target to be captured can be considered to have moved. Further, the movement of the target to be captured can be determined from the change of the distance obtained by each sensor.
In combination with Fig. 4, two adjacent sensors measure the distances a and b respectively within the time interval t. Since the angle θ between the sensors can be set accurately during design, production, and calibration, the moving distance s of the target to be captured can be calculated:

s = √(a² + b² − 2ab·cos θ)

Further, the speed v of the target to be captured:

v = s / t

where v is the speed of the target to be captured, a is the distance between the first laser emitter and the target point on the target to be captured at the first moment, b is the distance between the second laser emitter and the target point at the second moment, θ is the angle between the first laser emitter and the second laser emitter, and t is the length of time between the first moment and the second moment.
Step 93: when the camera module shoots, perform motion compensation according to the motion state of the target to be captured.
Optionally, step 93 may specifically be: when the camera module shoots, adjust the position of the lens in the camera module according to the motion state of the target to be captured, so as to perform motion compensation while the camera module shoots.
This way may also be called optical image stabilization. Optical image stabilization suspends the lens with magnetic force, thereby effectively overcoming image blur caused by camera vibration; the effect is more obvious on digital cameras with large zoom lenses. In general, a gyroscope in the lens detects small movements and immediately transmits the signal to a microprocessor, which calculates the displacement that needs to be compensated; the compensation lens group then compensates according to the jitter direction and the displacement of the lens, adjusting its position and angle accordingly so that the optical path remains stable, thus effectively overcoming image blur caused by camera vibration.
In this embodiment, the optical image stabilization technique is applied to the movement of the target to be captured. When the laser array detects that the target to be captured has moved, the microprocessor determines its movement speed from the change of its distance; the compensation lens group then compensates according to the moving direction and the movement speed of the target, adjusting its position and angle accordingly so that the optical path remains stable, thus effectively overcoming the image blur.
Specifically, the movement speed of the lens can be adjusted in a fixed proportion. For example, the distance between the mobile terminal and the target to be captured has been obtained in the foregoing embodiment; assuming here that the lens and the sensor are at the same distance from the target to be captured, the movement speed of the lens can be adjusted according to the following formula:

v2 = v1 · s2 / s1

where v1 is the movement speed of the target to be captured, v2 is the movement speed of the lens, s1 is the distance between the target to be captured and the CCD (image sensor), and s2 is the distance between the lens and the CCD.
Optionally, step 93 may also specifically be: when the camera module shoots, adjust the exposure time of the camera module so as to perform motion compensation while the camera module shoots.
This way may also be called electronic image stabilization. Electronic image stabilization mainly refers to forcibly raising the sensitivity parameter of the CCD while speeding up the shutter, analyzing the image obtained on the CCD, and then compensating with the edge image. Electronic image stabilization is actually a technique that compensates for jitter by reducing image quality; it tries to find a balance point between image quality and image jitter.
Electronic image stabilization produces its stabilizing effect by processing the picture with digital circuitry. When the stabilization circuit works, the captured picture is only about 90% of the real picture; the digital circuit then performs blur judgment on the jitter of the target to be captured and carries out jitter compensation with the remaining roughly 10% of the picture. This method is low-cost, but it reduces the utilization rate of the CCD and brings a certain loss of image sharpness.
0, Figure 10 is the flow diagram of the photographic method second embodiment of mobile terminal provided by the present application refering to fig. 1, This method comprises:
Step 101: obtain the position change of the target to be captured.
Step 102: determine the motion state of the target to be captured according to the position change of the target to be captured.
Step 103: obtain the motion state of the mobile terminal.
In general, a gyroscope in the lens assembly detects small movements, from which the motion state of the mobile terminal is obtained.
It should be understood that steps 101 and 102 above are used to detect the motion state of the target to be captured, while step 103 is used to detect the motion state of the mobile terminal; the order of the two may be exchanged, or both may be performed simultaneously, and no restriction is made here.
Step 104: when the camera module shoots, perform motion compensation according to the motion state of the target to be captured and the motion state of the mobile terminal.
Optionally, compared with compensating for a single motion, the motion states of both the target to be captured and the mobile terminal are obtained here, and both are compensated for at the same time.

For example, if the target to be captured and the mobile terminal move in the same direction, the offset can be smaller than when compensating for the target's motion alone; if they move in opposite directions, the offset can be larger than when compensating for the target's motion alone.

Specifically, a separate offset can be obtained for each type of motion. For example, a first offset is obtained for the speed of the target to be captured, and a second offset is obtained for the speed of the mobile terminal. Whether the moving directions of the target to be captured and the mobile terminal are the same is then judged: if they differ, the sum of the first offset and the second offset is taken as the final offset; if they are the same, the difference between the first offset and the second offset is taken as the final offset.
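The offset combination above can be sketched as follows, consistent with the intuition that same-direction motion needs a smaller net offset; the gain k mapping a speed to an offset is a placeholder assumption, and the direction test is reduced to a boolean:

```python
def combined_offset(v_target: float, v_terminal: float,
                    same_direction: bool, k: float = 1.0) -> float:
    """Combine per-source compensation offsets into a final offset.

    v_target / v_terminal: speeds of the target and of the handset.
    k: placeholder gain mapping a speed to an offset (illustrative).
    When target and terminal move in the same direction their
    relative motion partially cancels, so the offsets are
    subtracted; opposite-direction motion adds up.
    """
    first = k * v_target     # first offset, from the target's speed
    second = k * v_terminal  # second offset, from the terminal's speed
    if same_direction:
        return abs(first - second)
    return first + second
```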
Referring to Figure 11, Figure 11 is a schematic structural diagram of an embodiment of a computer storage medium provided by the present application. A computer program 111 is stored in the computer storage medium 110, and when executed by a processor the computer program 111 implements the following method:
obtaining the position change of the target to be captured; determining the motion state of the target to be captured according to the position change of the target to be captured; and, when the camera module shoots, performing motion compensation according to the motion state of the target to be captured.

Optionally, when executed by a processor, the computer program 111 is further used to implement the following method: emitting laser toward the target to be captured using a multiple spot laser ranging module; receiving the reflected laser using the multiple spot laser ranging module; and determining the position change of the target to be captured based on the time difference between emitting the laser and receiving the laser.

Optionally, when executed by a processor, the computer program 111 is further used to implement the following method: calculating the speed of the target to be captured using the formula v = √(a² + b² − 2ab·cosθ) / t, wherein v is the speed of the target to be captured, a is the distance between the first laser emitter and the target point on the target to be captured at a first moment, b is the distance between the second laser emitter and the target point at a second moment, θ is the angle between the first laser emitter and the second laser emitter, and t is the time span between the first moment and the second moment.
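Under the law-of-cosines reading suggested by the symbol definitions, the speed calculation can be sketched as (the function name is illustrative):

```python
import math

def target_speed(a: float, b: float, theta: float, t: float) -> float:
    """Speed of the target point from two laser range samples.

    a: range from the first emitter at the first moment;
    b: range from the second emitter at the second moment;
    theta: angle between the two emitters, in radians;
    t: time span between the two moments (t > 0).
    The chord travelled by the target point between the two
    samples follows from the law of cosines on the two ranges
    and the emitter angle; dividing by t gives the speed.
    """
    if t <= 0:
        raise ValueError("time span must be positive")
    d = math.sqrt(a * a + b * b - 2.0 * a * b * math.cos(theta))
    return d / t
```

With a = 3, b = 4 and the emitters at a right angle, the chord is 5, so over a 2 s interval the estimated speed is 2.5 units/s.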
Optionally, when executed by a processor, the computer program 111 is further used to implement the following method: when the camera module shoots, performing motion compensation according to the motion state of the target to be captured and the motion state of the mobile terminal.

Optionally, when executed by a processor, the computer program 111 is further used to implement the following method: when the camera module shoots, performing position adjustment on the lens in the camera module according to the motion state of the target to be captured, so as to perform motion compensation while the camera module is shooting.

Optionally, when executed by a processor, the computer program 111 is further used to implement the following method: when the camera module shoots, adjusting the exposure time of the camera module, so as to perform motion compensation while the camera module is shooting.
When implemented in the form of a software functional unit and sold or used as an independent product, the embodiments of the present application may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely an embodiment of the present application and is not intended to limit the patent scope of the application. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present application.

Claims (17)

1. A mobile terminal, characterized by comprising:
a camera module;
a first sensory package for obtaining the position change of a target to be captured; and
a controller for determining the motion state of the target to be captured according to the position change of the target to be captured, and for performing motion compensation according to the motion state of the target to be captured when the camera module shoots.
2. mobile terminal according to claim 1, which is characterized in that
the first sensory package is a multiple spot laser ranging module, comprising:
a plurality of laser emitters distributed in an array, for emitting laser toward the target to be captured; and
a plurality of laser receivers arranged in correspondence with the laser emitters, for receiving the reflected laser;
and the controller is configured to determine the speed of the target to be captured according to the position changes measured by each laser emitter and its corresponding laser receiver.
3. mobile terminal according to claim 2, which is characterized in that
the controller specifically calculates the speed of the target to be captured using the following formula:
v = √(a² + b² − 2ab·cosθ) / t
wherein v is the speed of the target to be captured, a is the distance between the first laser emitter and a target point on the target to be captured at a first moment, b is the distance between the second laser emitter and said target point at a second moment, θ is the angle between the first laser emitter and the second laser emitter, and t is the time span between the first moment and the second moment.
4. mobile terminal according to claim 2, which is characterized in that
the camera module is a rear camera of the mobile terminal, and the multiple spot laser ranging module and the camera module are disposed on the same side of the mobile terminal.
5. mobile terminal according to claim 1, which is characterized in that
the mobile terminal further comprises a second sensory package for obtaining the motion state of the mobile terminal;
and the controller is further configured to perform motion compensation according to the motion state of the target to be captured and the motion state of the mobile terminal when the camera module shoots.
6. mobile terminal according to claim 5, which is characterized in that
the second sensory package is an acceleration sensor.
7. mobile terminal according to claim 1, which is characterized in that
the controller is specifically configured to perform position adjustment on the lens in the camera module according to the motion state of the target to be captured, so as to perform motion compensation when the camera module shoots.
8. mobile terminal according to claim 7, which is characterized in that
the camera module specifically comprises:
a camera motor, including a motor base and a lens carrier;
a lens mounted on the lens carrier; and
a driver for obtaining a driving instruction from the controller to control movement of the lens carrier, thereby performing position adjustment on the lens.
9. The mobile terminal according to claim 8, characterized in that
the camera motor further includes a motor coil, and the lens carrier is disposed in the motor coil;
and the driver is specifically configured to obtain the driving instruction from the controller to control the current of the motor coil, thereby controlling the movement of the lens carrier so as to perform position adjustment on the lens.
10. mobile terminal according to claim 1, which is characterized in that
the controller is specifically configured to adjust the exposure time of the camera module according to the motion state of the target to be captured, so as to perform motion compensation when the camera module shoots.
11. A photographing method of a mobile terminal, characterized by comprising:
obtaining the position change of a target to be captured;
determining the motion state of the target to be captured according to the position change of the target to be captured; and
when a camera module shoots, performing motion compensation according to the motion state of the target to be captured.
12. The photographing method according to claim 11, characterized in that
the step of obtaining the position change of the target to be captured comprises:
emitting laser toward the target to be captured using a multiple spot laser ranging module;
receiving the reflected laser using the multiple spot laser ranging module; and
determining the position change of the target to be captured based on the time difference between emitting the laser and receiving the laser.
13. The photographing method according to claim 11, characterized in that
the step of determining the motion state of the target to be captured according to the position change of the target to be captured comprises:
calculating the speed of the target to be captured using the following formula:
v = √(a² + b² − 2ab·cosθ) / t
wherein v is the speed of the target to be captured, a is the distance between a first laser emitter and a target point on the target to be captured at a first moment, b is the distance between a second laser emitter and said target point at a second moment, θ is the angle between the first laser emitter and the second laser emitter, and t is the time span between the first moment and the second moment.
14. The method according to claim 11, characterized in that
the method further comprises:
obtaining the motion state of the mobile terminal;
and the step of performing motion compensation according to the motion state of the target to be captured when the camera module shoots comprises:
when the camera module shoots, performing motion compensation according to the motion state of the target to be captured and the motion state of the mobile terminal.
15. The method according to claim 11, characterized in that
the step of performing motion compensation according to the motion state of the target to be captured when the camera module shoots comprises:
when the camera module shoots, performing position adjustment on the lens in the camera module according to the motion state of the target to be captured, so as to perform motion compensation while the camera module is shooting.
16. The method according to claim 11, characterized in that
the step of performing motion compensation according to the motion state of the target to be captured when the camera module shoots comprises:
when the camera module shoots, adjusting the exposure time of the camera module, so as to perform motion compensation while the camera module is shooting.
17. A computer storage medium, characterized in that the computer storage medium is used to store a computer program, and when the computer program is executed by a processor, the method according to any one of claims 11 to 16 is implemented.
CN201910120670.XA 2019-02-18 2019-02-18 Mobile terminal, photographing method thereof and computer storage medium Expired - Fee Related CN109803079B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910120670.XA CN109803079B (en) 2019-02-18 2019-02-18 Mobile terminal, photographing method thereof and computer storage medium


Publications (2)

Publication Number Publication Date
CN109803079A true CN109803079A (en) 2019-05-24
CN109803079B CN109803079B (en) 2021-04-27

Family

ID=66560963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910120670.XA Expired - Fee Related CN109803079B (en) 2019-02-18 2019-02-18 Mobile terminal, photographing method thereof and computer storage medium

Country Status (1)

Country Link
CN (1) CN109803079B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0782089B2 (en) * 1990-06-14 1995-09-06 浜松ホトニクス株式会社 Speed measuring instrument
CN101491084A (en) * 2006-07-20 2009-07-22 松下电器产业株式会社 image capture device
CN101753708A (en) * 2008-12-22 2010-06-23 康佳集团股份有限公司 Mobile phone capable of measuring velocity and method for measuring movement velocity of object by mobile phone
CN104853084A (en) * 2014-02-19 2015-08-19 佳能株式会社 Image processing apparatus and control method thereof
CN105100614A (en) * 2015-07-24 2015-11-25 小米科技有限责任公司 Optical anti-vibration realization method, apparatus and electronic equipment
CN105763785A (en) * 2014-12-15 2016-07-13 富泰华工业(深圳)有限公司 Shooting system and method with dynamic adjustment of shutter speed
JP2018194770A (en) * 2017-05-22 2018-12-06 キヤノン株式会社 Camera system, interchangeable and camera


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111479011A (en) * 2020-04-02 2020-07-31 Oppo广东移动通信有限公司 Power adjustment method, device, storage medium and electronic equipment
CN111736173A (en) * 2020-05-24 2020-10-02 深圳奥比中光科技有限公司 Depth measuring device and method based on TOF and electronic equipment
CN113395447A (en) * 2021-05-31 2021-09-14 江西晶浩光学有限公司 Anti-shake mechanism, image pickup device, and electronic apparatus
CN114254492A (en) * 2021-12-08 2022-03-29 新国脉文旅科技有限公司 A simulation method of passenger behavior trajectory based on passenger flow portrait
CN114254492B (en) * 2021-12-08 2024-12-06 新国脉文旅科技有限公司 A method for simulating passenger flow behavior trajectory based on passenger flow portrait
CN115883966A (en) * 2022-12-16 2023-03-31 西安闻泰信息技术有限公司 Shooting anti-shake method, device, medium and equipment

Also Published As

Publication number Publication date
CN109803079B (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN109803079A (en) A mobile terminal, a photographing method thereof, and a computer storage medium
EP3467585B1 (en) Head-mounted display tracking system
CN114762314B (en) Electronic device and method for controlling camera motion
WO2016000533A1 (en) Fast focusing shooting module in mobile phone
US10504243B2 (en) Calibration system for a head-mounted display tracking system
US9208565B2 (en) Method and apparatus for estimating three-dimensional position and orientation through sensor fusion
EP4006623A1 (en) Optical anti-shake apparatus and control method
CN107894588A (en) Mobile terminal, distance measurement method, dimension measurement method and device
US10216002B2 (en) Anti-shake correction system for curved optical sensor
CN105704380A (en) Camera focusing method and electric device
CN108377380A (en) Image scanning system and method thereof
CN208836252U (en) Camera and unmanned plane
CN110771143B (en) Control method of handheld PTZ, handheld PTZ, and handheld device
KR102794864B1 (en) Electronic device for using depth information and operating method thereof
WO2017117749A1 (en) Follow focus system and method based on multiple ranging approaches, and photographing system
CN108007426A (en) A kind of camera distance measuring method and system
CN107025666A (en) Single-camera-based depth detection method, device, and electronic device
CN111343381B (en) Method, device, electronic device and storage medium for controlling anti-shake function on
CN110336993A (en) Depth camera control method and device, electronic equipment and storage medium
CN110225247B (en) Image processing method and electronic equipment
CN114070994A (en) Camera module device, camera system, electronic equipment and automatic zooming imaging method
CN112335227A (en) Control device, imaging system, control method, and program
CN105824100A (en) Focusing method and apparatus
CN117998205A (en) A method for anti-shake of binocular camera
JP2012124554A (en) Imaging apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210427