US20210303084A1 - Computer readable recording medium can perform image sensing system control method and image sensing system - Google Patents
- Publication number
- US20210303084A1 (U.S. application Ser. No. 16/830,212)
- Authority
- US
- United States
- Prior art keywords
- time
- polling
- frame
- image sensor
- motion delta
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/50—Allocation of resources, e.g. of the central processing unit [CPU]
- G06F9/5005—Allocation of resources, e.g. of the central processing unit [CPU] to service a request
- G06F9/5027—Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
- G06F9/5038—Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals considering the execution order of a plurality of tasks, e.g. taking priority or time dependency constraints into consideration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- the present invention relates to an image sensing system control method and an image sensing system, and particularly relates to an image sensing system control method and an image sensing system which can reduce the effect caused by non-synchronization between the frame rate and the MCU polling.
- the image sensor thereof captures frames at a predetermined frame rate and then computes the motion delta between different frames.
- Such predetermined frame rate may change corresponding to different modes, for example, an active mode or a standby mode.
- a MCU (micro control unit) polls the image sensor for motion delta, i.e. requests the image sensor to output motion delta.
- the MCU polling rate and the image sensor frame rate are usually different and non-synchronized with each other. As a result, motion delta output and MCU polling will never be consistent.
- FIG. 1 is a schematic diagram illustrating the non-synchronization between the frame rate and the MCU polling in prior art.
- frames f_1, f_2 . . . f_8 are captured by an image sensor, and motion deltas D_1, D_2 . . . between different frames are respectively computed by the image sensor.
- a MCU coupled to the image sensor generates pollings P_1-P_3 to request motion delta.
- the image sensor outputs motion deltas D_1, D_2 to the MCU responding to the polling P_1, outputs motion deltas D_3, D_4, D_5 to the MCU responding to the polling P_2, and outputs motion deltas D_6, D_7 to the MCU responding to the polling P_3.
- the pollings P_1, P_2, P_3 respectively have different latencies L_1, L_2, L_3 from the frames f_3, f_6, and f_8.
- the MCU may receive different numbers of motion deltas responding to different pollings.
- the MCU receives two motion deltas D_1, D_2 for the polling P_1, but receives three motion deltas D_3, D_4, D_5 for the polling P_2. Since the motion deltas are always applied to compute a position of the optical pointing device, the issues illustrated in FIG. 1 may affect the accuracy of position computing.
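The varying delta count described above can be sketched as a small simulation. This is a hypothetical illustration, not from the patent; the function name and parameters are mine, and it assumes one motion delta becomes available per completed frame period.

```python
# Hypothetical simulation (not from the patent): count how many per-frame
# motion deltas are ready in each MCU polling interval when the frame clock
# and the polling clock are not synchronized, as in FIG. 1.
import math

def deltas_per_poll(frame_period, poll_period, n_polls):
    """Number of new motion deltas available at each successive poll."""
    counts = []
    ready_before = 0
    for k in range(1, n_polls + 1):
        # one motion delta becomes ready per completed frame period
        ready_now = math.floor(k * poll_period / frame_period)
        counts.append(ready_now - ready_before)
        ready_before = ready_now
    return counts
```

With frame_period = 1 and poll_period = 2.5, the MCU alternately collects 2 and 3 deltas per poll, illustrating the inconsistency the invention addresses; only when the two periods are integer multiples is the count constant.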
- one objective of the present invention is to provide an image sensing system control method that can reduce the effect caused by non-synchronization between the frame rate and the polling.
- Another objective of the present invention is to provide an image sensing system that can reduce the effect caused by non-synchronization between the frame rate and the polling.
- an image sensing system control method, applied to an image sensing system comprising an image sensor and a control circuit, comprising: (a) predicting a first velocity of the image sensor; (b) calculating a first time duration between a first frame time and a first polling time after the first frame time, wherein the image sensor captures a first frame at the first frame time and receives a first polling from the control circuit at the first polling time; and (c) calculating a first predicted motion delta of the first time duration according to the first velocity and the first time duration.
- an image sensing system comprising: a control circuit; and an image sensor, configured to perform: (a) predicting a first velocity of the image sensor; (b) calculating a first time duration between a first frame time and a first polling time after the first frame time, wherein the image sensor captures a first frame at the first frame time and receives a first polling from the control circuit at the first polling time; and (c) calculating a first predicted motion delta of the first time duration according to the first velocity and the first time duration.
- the motion delta can be output corresponding to the time difference between a time of the frame and a time of the polling, thus reducing the effect caused by non-synchronization between the frame rate and the polling.
- FIG. 1 is a schematic diagram illustrating the non-synchronization between the frame rate and the MCU polling in prior art.
- FIG. 2 is a block diagram illustrating an image sensing system according to one embodiment of the present invention.
- FIG. 3 - FIG. 7 are schematic diagrams illustrating image sensing system control methods according to different embodiments of the present invention.
- FIG. 8 is a flow chart illustrating an image sensing system control method according to one embodiment of the present invention.
- FIG. 9 is a block diagram illustrating an application that the image sensing system provided by the present invention is applied to a computer.
- each component in following descriptions can be implemented by hardware (e.g. a device or a circuit) or hardware with software (e.g. a program installed to a processor).
- the method in following descriptions can be executed by programs stored in a non-transitory computer readable recording medium such as a hard disk, an optical disc or a memory.
- the terms “first”, “second” and “third” in following descriptions are only for the purpose of distinguishing different elements, and do not mean the sequence of the elements. For example, a first device and a second device only mean these devices can have the same structure but are different devices.
- FIG. 2 is a block diagram illustrating an image sensing system 200 according to one embodiment of the present invention.
- the image sensing system 200 comprises a control circuit 201 and an image sensor 203 .
- the control circuit 201 can be above-mentioned MCU or any other device which can perform the same function, such as a processor.
- the image sensor 203 is configured to capture a plurality of frames and to compute motion deltas between different frames.
- the control circuit 201 generates pollings to the image sensor 203, and the image sensor 203 outputs motion deltas responding to the pollings. It will be appreciated that the control circuit 201 and the image sensor 203 can be provided in the same electronic device (e.g. an optical pointing device), but can also be provided in different electronic devices.
- the image sensor 203 mentioned here is configured to capture a plurality of frames and to compute motion deltas between different frames, thus it can also be named a motion sensor.
- alternatively, the image sensor 203 can be configured only to capture a plurality of frames, and other computations can be performed by other circuits independent from the image sensor 203.
- At least one velocity of the image sensor (or at least one velocity of an electronic device comprising the image sensor) is predicted.
- at least one predicted motion delta is calculated according to the predicted velocity, and an output motion delta, which is output to the control circuit, is calculated based on the predicted motion delta.
- Many methods can be applied to predict the velocity, and they will be illustrated in detail in following descriptions.
- FIG. 3 - FIG. 6 are schematic diagrams illustrating image sensing system control methods according to different embodiments of the present invention.
- the velocity is predicted according to real time motion delta.
- the velocity is predicted according to the polling period and the accumulated motion delta corresponding to at least one polling.
- the image sensor 203 captures a first frame f_1 at the first frame time T_f1 and receives a first polling P_1 from the control circuit 201 at the first polling time T_P1. Further, the image sensor 203 captures a second frame f_2 at the second frame time T_f2 and receives a second polling P_2 from the control circuit 201 at the second polling time T_P2. The image sensor 203 calculates a first time duration TD_1 between the first frame time T_f1 and the first polling time T_P1 after the first frame time T_f1.
- the image sensor 203 calculates a second time duration TD_2 between the first frame time T_f1 and the second frame time T_f2 before the first frame time T_f1. After that, the image sensor 203 predicts a first velocity V_1 of the image sensor 203 or the electronic device comprising the image sensor 203, according to a first motion delta D_1 between the first frame f_1 and the second frame f_2, and according to the second time duration TD_2. Next, the image sensor 203 calculates a first predicted motion delta PD_1 of the first time duration TD_1 according to the first velocity V_1 and the first time duration TD_1. In one embodiment, the image sensor 203 predicts the first velocity according to the equation V_1 = D_1/(T_f1 − T_f2), wherein D_1 indicates the first motion delta, T_f1 indicates the first frame time and T_f2 indicates the second frame time.
- the first frame time T_f1 and the second frame time T_f2 are before the first polling time T_P1 and after the second polling time T_P2. That is, the image sensor 203 uses frames located between two continuous pollings to predict the first velocity V_1. Further, the image sensor 203 does not capture any frame in the second time duration TD_2. In other words, no frame exists in the time duration between the two frames which the image sensor 203 uses to predict the first velocity V_1.
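The FIG. 3 arithmetic can be sketched as follows. This is a minimal illustration under the equations above (V_1 = D_1/(T_f1 − T_f2), PD_1 = V_1 × TD_1); the function names are mine, not the patent's.

```python
# Sketch of the FIG. 3 embodiment: velocity from the last inter-frame motion
# delta, then extrapolation over the frame-to-poll gap TD_1.

def predict_velocity(d_1, t_f1, t_f2):
    """V_1 = D_1 / (T_f1 - T_f2): motion delta over the inter-frame duration TD_2."""
    return d_1 / (t_f1 - t_f2)

def predicted_motion_delta(v_1, td_1):
    """PD_1 = V_1 * TD_1: extrapolated motion between the last frame and the poll."""
    return v_1 * td_1
```

For example, a delta of 4 counts between frames 2 ms apart gives a velocity of 2 counts/ms, so a 0.5 ms frame-to-poll gap contributes a predicted delta of 1 count.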
- At least one frame exists in the time duration between the two frames which the image sensor 203 uses to predict the first velocity.
- frames f_a, f_b are captured by the image sensor 203 in the second time duration TD_2.
- the image sensor 203 can still predict the first velocity based on the above-mentioned equation V_1 = D_1/(T_f1 − T_f2), wherein T_f1 indicates the first frame time and T_f2 indicates the second frame time. Namely, the image sensor 203 predicts the first velocity V_1 based on the equation V_1 = (D_2a + D_ab + D_b1)/(TD_2a + TD_ab + TD_b1).
- D_2a, D_ab and D_b1 respectively mean the motion deltas between the second frame f_2 and the frame f_a, between the frame f_a and the frame f_b, and between the frame f_b and the first frame f_1.
- TD_2a, TD_ab and TD_b1 respectively mean the time durations between the second frame f_2 and the frame f_a, between the frame f_a and the frame f_b, and between the frame f_b and the first frame f_1.
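The FIG. 4 variant reduces to a total-delta-over-total-duration ratio; a minimal sketch (function name assumed, not from the patent):

```python
# Sketch of the FIG. 4 embodiment: when intermediate frames f_a, f_b lie inside
# TD_2, the velocity is the summed segment deltas over the summed segment
# durations, e.g. (D_2a + D_ab + D_b1) / (TD_2a + TD_ab + TD_b1).

def predict_velocity_segments(deltas, durations):
    """Velocity over several consecutive frame-to-frame segments."""
    return sum(deltas) / sum(durations)
```

Note that with a single segment this degenerates to the FIG. 3 equation V_1 = D_1/(T_f1 − T_f2).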
- the first velocity V_ 1 can be predicted by various methods.
- the first velocity V_ 1 is predicted according to the polling period and the accumulated motion delta corresponding to at least one polling.
- the image sensor 203 calculates a first polling period Pe_1 between the first polling time T_P1 and the second polling time T_P2.
- the image sensor 203 calculates a first accumulated motion delta ACM_1 of the first polling P_1.
- the image sensor 203 uses a conventional motion delta calculating method to calculate the first accumulated motion delta ACM_1, which is illustrated in FIG. 1.
- a first frame f_1, a second frame f_2 and a third frame f_3 exist between the first polling time T_P1 and the second polling time T_P2.
- the motion delta between the first frame f_1 and the second frame f_2 is motion delta D_1
- the motion delta between the second frame f_2 and the third frame f_3 is motion delta D_2. Therefore, in the embodiment of FIG. 5, the first accumulated motion delta ACM_1 is motion delta D_1 + motion delta D_2.
- the embodiments illustrated in FIG. 3 and FIG. 4 can be combined with the embodiment of FIG. 5.
- in such case, the first predicted motion delta PD_1 is also calculated, and the first accumulated motion delta ACM_1 is motion delta D_1 + motion delta D_2 + first predicted motion delta PD_1.
- in such case, the first predicted motion delta PD_1 can be predicted twice by different methods (the method in FIGS. 3-4, and the method in FIGS. 5-6).
- the image sensor 203 predicts the first velocity V_1 according to the first accumulated motion delta ACM_1 and the first polling period Pe_1. In one embodiment, the image sensor 203 predicts the first velocity V_1 according to the equation V_1 = ACM_1/Pe_1, wherein ACM_1 indicates the first accumulated motion delta and Pe_1 indicates the first polling period.
- the image sensor 203 calculates the first predicted motion delta PD_1 by PD_1 = V_1 × TD_1, wherein TD_1 is the above-mentioned first time duration.
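The polling-period method of FIG. 5 is a two-step computation; a minimal sketch (function name mine) combining V_1 = ACM_1/Pe_1 with PD_1 = V_1 × TD_1:

```python
# Sketch of the FIG. 5 embodiment: velocity from the accumulated motion delta
# over one polling period, then extrapolation over the frame-to-poll gap TD_1.

def predict_delta_from_polling(acm_1, pe_1, td_1):
    """PD_1 = (ACM_1 / Pe_1) * TD_1."""
    v_1 = acm_1 / pe_1          # average velocity during the polling period
    return v_1 * td_1           # predicted motion in the remaining gap
```

For example, 6 counts accumulated over a 3 ms polling period gives 2 counts/ms, so a 0.5 ms gap yields a predicted delta of 1 count.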
- the embodiment illustrated in FIG. 5 is not limited to using only one accumulated motion delta and only one polling period.
- the image sensor 203 receives a third polling P_3 at a third polling time T_P3 before the second polling time T_P2.
- the image sensor 203 captures frames f_a, f_b between the second polling time T_P2 and the third polling time T_P3. Therefore, the second accumulated motion delta ACM_2 of the second polling P_2 is the motion delta D_ab between the frame f_a and the frame f_b.
- the image sensor 203 predicts a second velocity V_2 according to the second accumulated motion delta ACM_2 and the second polling period Pe_2.
- After the second velocity V_2 is predicted, the image sensor 203 performs a weighting equation on the first velocity V_1 and the second velocity V_2 to generate a weighting result. Then, the image sensor 203 predicts the velocity of the image sensor 203 or the electronic device comprising the image sensor 203 according to the weighting result. In one embodiment, the image sensor 203 calculates an average of the first velocity V_1 and the second velocity V_2 as the weighting result. It will be appreciated that although the embodiment in FIG. 6 uses two polling periods and two accumulated motion deltas to predict the velocity, more than two polling periods and more than two accumulated motion deltas can be used as well.
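The weighting step can be sketched as below. The function name and the equal default weights are assumptions; the text only states that an average of V_1 and V_2 is one possible weighting result.

```python
# Sketch of the FIG. 6 weighting: combine per-poll velocities into one
# predicted velocity. Equal weights reproduce the plain average in the text;
# unequal weights could favor the more recent polling period.

def weighted_velocity(velocities, weights=None):
    """Weighted combination of per-polling-period velocities."""
    if weights is None:
        weights = [1.0] * len(velocities)   # default: plain average
    total = sum(w * v for w, v in zip(weights, velocities))
    return total / sum(weights)
```

This form also extends naturally to more than two polling periods, as the text allows.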
- FIG. 7 is a schematic diagram illustrating how the image sensor 203 reports an output motion delta responding to a polling after at least one velocity is predicted.
- the image sensor respectively receives a first polling P_1, a second polling P_2, and a third polling P_3 at the first polling time T_P1, the second polling time T_P2, and the third polling time T_P3.
- the image sensor 203 respectively captures a first frame f_1, a second frame f_2 and a third frame f_3 at the first frame time T_f1, the second frame time T_f2, and the third frame time T_f3.
- the first velocity V_1 and the second velocity V_2 are predicted via above-mentioned methods, thus the first predicted motion delta PD_1 and the second predicted motion delta PD_2 can be acquired.
- the image sensor 203 reports an output motion delta responding to the first polling P_1 according to a first accumulated motion delta ACM_1 corresponding to the first polling, the first predicted motion delta PD_1 and the second predicted motion delta PD_2.
- the first accumulated motion delta ACM_1 can be the sum of the motion delta D_1 between the first frame f_1 and the second frame f_2 and the motion delta D_2 between the second frame f_2 and the third frame f_3.
- the output motion delta is ACM_1 + PD_1 − PD_2.
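The reporting rule OD = ACM_1 + PD_1 − PD_2 keeps a running correction: each poll adds the newly predicted tail and removes the tail already reported with the previous poll. A minimal stateful sketch (class name and interface are mine, not the patent's):

```python
# Sketch of the FIG. 7 reporting rule: OD_k = ACM_k + PD_k - PD_{k-1}.
# Each poll reports the accumulated delta plus the current predicted tail,
# minus the tail that was already credited to the previous poll.

class DeltaReporter:
    def __init__(self):
        self._prev_pd = 0.0     # predicted tail reported with the last poll

    def report(self, acm, pd):
        """Return the output motion delta for one poll."""
        od = acm + pd - self._prev_pd
        self._prev_pd = pd
        return od
```

A design consequence: summed over many polls the corrections cancel, so no motion is double-counted or lost across poll boundaries.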
- an image sensing system control method can be acquired, which can be performed by at least one program recorded in a non-transitory computer readable recording medium such as an optical disc, a hard disk or a memory card.
- FIG. 8 is a flow chart illustrating an image sensing system control method according to one embodiment of the present invention, which comprises following steps:
- a first velocity (e.g. V_1 in FIG. 3-FIG. 6) of the image sensor (e.g. 203 in FIG. 2) is predicted.
- a first time duration (e.g. TD_1 in FIG. 3) between a first frame time (e.g. T_f1 in FIG. 3) and a first polling time (e.g. T_P1 in FIG. 3) after the first frame time T_f1 is calculated.
- FIG. 9 is a block diagram illustrating an application that the image sensing system provided by the present invention is applied to a computer.
- the control circuit 201 and the image sensor 203 are provided in an electronic device 900 .
- the electronic device 900 can be any kind of device which can communicate with the computer 901 .
- the electronic device 900 can be an optical navigation device such as an optical mouse, or an optical pointing device.
- the control circuit 201 can poll the image sensor 203 to output the above-mentioned output motion delta OD. After receiving the output motion delta OD, the control circuit 201 may transmit the output motion delta OD to the computer 901 .
- the computer 901 can be replaced by any other electronic device.
- a protocol analyzer 903 can be used to capture the communication between the control circuit 201 and the image sensor 203. After that, the output motion delta OD (source) is extracted from the log file generated by the protocol analyzer 903. Also, the output motion delta OD (destination) is captured at the computer 901, and the output motion delta OD (source) and the output motion delta OD (destination) are compared by the data comparator 905. If the control circuit 201 does not change the output motion delta OD from the image sensor 203, the output motion delta OD (source) and the output motion delta OD (destination) are the same.
- the protocol analyzer 903 can be implemented by a circuit specifically designed for capturing the output motion delta OD (source). Also, the protocol analyzer 903 can be a processor installed with at least one program to capture the output motion delta OD (source).
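The data comparator's check can be sketched as a simple sequence comparison. This is a hypothetical illustration of the FIG. 9 verification setup; the function name is mine.

```python
# Sketch of the data comparator 905: deltas captured at the sensor-MCU bus
# (source) should match those received at the computer (destination) when the
# control circuit forwards the output motion delta OD unchanged.

def deltas_match(source_log, destination_log):
    """True when the two captured delta sequences are identical."""
    return list(source_log) == list(destination_log)
```

Any mismatch indicates that the control circuit altered (or dropped) an output motion delta in transit.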
- the motion delta can be output corresponding to the time difference between a time of the frame and a time of the polling, thus reducing the effect caused by non-synchronization between the frame rate and the polling.
Abstract
Description
- The present invention relates to an image sensing system control method and an image sensing system, and particularly relates to an image sensing system control method and an image sensing system which can reduce the effect caused by non-synchronization between the frame rate and the MCU polling.
- In an optical navigation device such as an optical mouse, the image sensor thereof captures frames at a predetermined frame rate and then computes the motion delta between different frames. Such predetermined frame rate may change corresponding to different modes, for example, an active mode or a standby mode. Also, a MCU (micro control unit) polls the image sensor for motion delta (i.e. requests the image sensor to output motion delta). However, the MCU polling rate and the image sensor frame rate are usually different and non-synchronized with each other. As a result, motion delta output and MCU polling will never be consistent.
-
FIG. 1 is a schematic diagram illustrating the non-synchronization between the frame rate and the MCU polling in prior art. As illustrated in FIG. 1, frames f_1, f_2 . . . f_8 are captured by an image sensor, and motion deltas D_1, D_2 . . . between different frames are respectively computed by the image sensor. Also, a MCU coupled to the image sensor generates pollings P_1-P_3 to request motion delta. - For more details, the image sensor outputs motion deltas D_1, D_2 to the MCU responding to the polling P_1, outputs motion deltas D_3, D_4, D_5 to the MCU responding to the polling P_2, and outputs motion deltas D_6, D_7 to the MCU responding to the polling P_3. However, due to the non-synchronization, the pollings P_1, P_2, P_3 respectively have different latencies L_1, L_2, L_3 from the frames f_3, f_6, and f_8. Also, due to the non-synchronization, the MCU may receive different numbers of motion deltas responding to different pollings. For example, the MCU receives two motion deltas D_1, D_2 for the polling P_1, but receives three motion deltas D_3, D_4, D_5 for the polling P_2. Since the motion deltas are always applied to compute a position of the optical pointing device, the issues illustrated in FIG. 1 may affect the accuracy of position computing. - Therefore, one objective of the present invention is to provide an image sensing system control method that can reduce the effect caused by non-synchronization between the frame rate and the polling.
- Another objective of the present invention is to provide an image sensing system that can reduce the effect caused by non-synchronization between the frame rate and the polling.
- One embodiment of the present invention discloses an image sensing system control method, applied to an image sensing system comprising an image sensor and a control circuit, comprising: (a) predicting a first velocity of the image sensor; (b) calculating a first time duration between a first frame time and a first polling time after the first frame time, wherein the image sensor captures a first frame at the first frame time and receives a first polling from the control circuit at the first polling time; and (c) calculating a first predicted motion delta of the first time duration according to the first velocity and the first time duration.
- Another embodiment of the present invention discloses an image sensing system, comprising: a control circuit; and an image sensor, configured to perform: (a) predicting a first velocity of the image sensor; (b) calculating a first time duration between a first frame time and a first polling time after the first frame time, wherein the image sensor captures a first frame at the first frame time and receives a first polling from the control circuit at the first polling time; and (c) calculating a first predicted motion delta of the first time duration according to the first velocity and the first time duration.
- In view of above-mentioned embodiments, the motion delta can be output corresponding to the time difference between a time of the frame and a time of the polling, thus reducing the effect caused by non-synchronization between the frame rate and the polling.
- These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
-
FIG. 1 is a schematic diagram illustrating the non-synchronization between the frame rate and the MCU polling in prior art. -
FIG. 2 is a block diagram illustrating an image sensing system according to one embodiment of the present invention. -
FIG. 3 -FIG. 7 are schematic diagrams illustrating image sensing system control methods according to different embodiments of the present invention. -
FIG. 8 is a flow chart illustrating an image sensing system control method according to one embodiment of the present invention. -
FIG. 9 is a block diagram illustrating an application in which the image sensing system provided by the present invention is applied to a computer. - Several embodiments are provided in following descriptions to explain the concept of the present invention. Each component in following descriptions can be implemented by hardware (e.g. a device or a circuit) or hardware with software (e.g. a program installed to a processor). Besides, the method in following descriptions can be executed by programs stored in a non-transitory computer readable recording medium such as a hard disk, an optical disc or a memory. Besides, the terms “first”, “second” and “third” in following descriptions are only for the purpose of distinguishing different elements, and do not mean the sequence of the elements. For example, a first device and a second device only mean these devices can have the same structure but are different devices.
-
FIG. 2 is a block diagram illustrating an image sensing system 200 according to one embodiment of the present invention. As illustrated in FIG. 2, the image sensing system 200 comprises a control circuit 201 and an image sensor 203. The control circuit 201 can be above-mentioned MCU or any other device which can perform the same function, such as a processor. The image sensor 203 is configured to capture a plurality of frames and to compute motion deltas between different frames. The control circuit 201 generates pollings to the image sensor 203, and the image sensor 203 outputs motion deltas responding to the pollings. It will be appreciated that the control circuit 201 and the image sensor 203 can be provided in the same electronic device (e.g. an optical pointing device), but can also be provided in different electronic devices as well. Additionally, the image sensor 203 mentioned here is configured to capture a plurality of frames and to compute motion deltas between different frames, thus it can also be named a motion sensor. However, the image sensor 203 can be configured only to capture a plurality of frames, and other computations can be performed by other circuits independent from the image sensor 203. - In following embodiments, at least one velocity of the image sensor (or at least one velocity of an electronic device comprising the image sensor) is predicted. After that, at least one predicted motion delta is calculated according to the predicted velocity, and an output motion delta, which is output to the control circuit, is calculated based on the predicted motion delta. Many methods can be applied to predict the velocity, and they will be illustrated in detail in following descriptions.
-
FIG. 3 -FIG. 6 are schematic diagrams illustrating image sensing system control methods according to different embodiments of the present invention. In the embodiments ofFIG. 3 andFIG. 4 , the velocity is predicted according to real time motion delta. Also, in the embodiments ofFIG. 5 andFIG. 6 , the velocity is predicted according to the polling period and the accumulated motion delta corresponding to at least one polling. - Please refer to
FIG. 3 , theimage sensor 203 captures a first frame f_1 at the first frame time T_f1 and receives a first polling P_1 from thecontrol circuit 201 at the first polling time T_P1. Further, theimage sensor 203 captures a second frame f_2 at the second frame time T_f2 and receives a second polling P_2 from thecontrol circuit 201 at the second polling time T_P2. Theimages sensor 203 calculates a first time duration TD_1 between the first frame time T_f1 and the first polling time T_P1 after the first frame time T_f1. Further, theimage sensor 203 calculates a second time duration TD_2 between the first frame time T_f1 and the second frame time T_f2 before the first frame time T_f1. After that, theimage sensor 203 predicts a first velocity V_1 of theimage sensor 203 or the electronic device comprising theimage sensor 203, according to a first motion delta D_1 between the first frame f_1 and the second frame f_2, and according to the second time duration TD_2. Next, theimage sensor 203 calculates a first predicted motion delta PD_1 of the first time duration TD_1 according to the first velocity V_1 and the first time duration TD_1. In one embodiment, theimage sensor 203 predicts the first velocity according to an equation of -
- indicates the first motion delta, T_f1 indicates the first frame time and T_f2 indicates the second frame time. In such case, the first velocity equals to
-
- In the embodiment of
FIG. 3 , the first frame time T_f1 and the second frame time T_f2 are before the first polling time T_P1 and after the second polling time T_P2. That is, theimage sensor 203 uses frames located between two continuous pollings to predict the first velocity V_1. Further, theimage sensor 203 does not capture any frame in the second time duration TD_2. In other words, no frame exists in the time duration between the two frames which theimage sensor 203 uses to predict the first velocity V_1. - However, in one embodiment, at least one frame exists in the time duration between the two frames which the
image sensor 203 uses to predict the first velocity. Please refer to FIG. 4 ; in such embodiment, frames f_a, f_b are captured by the image sensor 203 in the second time duration TD_2. However, the image sensor 203 can still predict the first velocity based on the above-mentioned equation of V_1=D_1/(T_f1−T_f2), wherein D_1 indicates the first motion delta, T_f1 indicates the first frame time and T_f2 indicates the second frame time. Namely, the image sensor 203 predicts the first velocity V_1 based on the equation of V_1=(D_2a+D_ab+D_b1)/(TD_2a+TD_ab+TD_b1), wherein D_2a, D_ab and D_b1 respectively indicate the motion deltas between the second frame f_2/frame f_a, frame f_a/frame f_b and frame f_b/first frame f_1. Also, TD_2a, TD_ab and TD_b1 respectively indicate the time durations between the second frame f_2/frame f_a, frame f_a/frame f_b and frame f_b/first frame f_1.
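- A minimal sketch of this segmented form of the prediction (hypothetical Python with illustrative names; the per-segment deltas and durations correspond to the D and TD values defined above):

```python
def segmented_first_velocity(deltas, durations):
    """Predict V_1 from the motion deltas of consecutive frame pairs
    (e.g. D_2a, D_ab, D_b1) and their matching time durations
    (e.g. TD_2a, TD_ab, TD_b1)."""
    return sum(deltas) / sum(durations)

# Frames f_2, f_a, f_b, f_1: 8 counts of total motion over 4 ms.
v1 = segmented_first_velocity([3, 2, 3], [1.5, 1.0, 1.5])  # 2.0 counts per ms
```

Because the segment durations sum to T_f1−T_f2 and the segment deltas sum to the total motion between the two outer frames, this reduces to the two-frame equation when no intermediate frame exists.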
- As above-mentioned, the first velocity V_1 can be predicted by various methods. In the embodiments of
FIG. 5 and FIG. 6 , the first velocity V_1 is predicted according to the polling period and the accumulated motion delta corresponding to at least one polling. In the embodiment of FIG. 5 , the image sensor 203 calculates a first polling period Pe_1 between the first polling time T_P1 and the second polling time T_P2. Also, the image sensor 203 calculates a first accumulated motion delta ACM_1 of the first polling P_1. In one embodiment, the image sensor 203 uses a conventional motion delta calculating method, which is illustrated in FIG. 1 , to calculate the first accumulated motion delta ACM_1. For example, in the embodiment of FIG. 5 , a first frame f_1, a second frame f_2 and a third frame f_3 exist between the first polling time T_P1 and the second polling time T_P2. Also, the motion delta between the first frame f_1 and the second frame f_2 is motion delta D_1, and the motion delta between the second frame f_2 and the third frame f_3 is motion delta D_2. Therefore, in the embodiment of FIG. 5 , the first accumulated motion delta ACM_1 is motion delta D_1 + motion delta D_2. Please note that the embodiments illustrated in FIG. 3 and FIG. 4 can be combined with the embodiment of FIG. 5 . In such case, the first predicted motion delta PD_1 is also calculated, and the first accumulated motion delta ACM_1 is motion delta D_1 + motion delta D_2 + first predicted motion delta PD_1. In such case, the first predicted motion delta PD_1 can be predicted by two different methods (the method in FIGS. 3-4 , and the method in FIGS. 5-6 ).
- After the first accumulated motion delta ACM_1 is acquired, the
image sensor 203 predicts the first velocity V_1 according to the first accumulated motion delta ACM_1 and the first polling period Pe_1. In one embodiment, the image sensor 203 predicts the first velocity V_1 according to an equation of V_1=ACM_1/Pe_1, wherein ACM_1 indicates the first accumulated motion delta, and Pe_1 indicates the first polling period. After that, the image sensor 203 calculates the first predicted motion delta PD_1 by V_1×TD_1. TD_1 is the above-mentioned first time duration.
- The embodiment illustrated in
FIG. 5 is not limited to using only one accumulated motion delta and only one polling period. Please refer to FIG. 6 ; the image sensor 203 receives a third polling P_3 at a third polling time T_P3 before the second polling time T_P2. Also, the image sensor 203 captures frames f_a, f_b between the second polling time T_P2 and the third polling time T_P3. Therefore, the second accumulated motion delta ACM_2 of the second polling P_2 is the motion delta D_ab between the frame f_a and the frame f_b. The image sensor 203 predicts a second velocity V_2 according to the second accumulated motion delta ACM_2 and the second polling period Pe_2, i.e. V_2=ACM_2/Pe_2.
- After the second velocity V_2 is predicted, the
image sensor 203 performs a weighting equation on the first velocity V_1 and the second velocity V_2 to generate a weighting result. Then, the image sensor 203 predicts the velocity of the image sensor 203 or the electronic device comprising the image sensor 203 according to the weighting result. In one embodiment, the image sensor 203 calculates an average of the first velocity V_1 and the second velocity V_2 as the weighting result. It will be appreciated that although the embodiment in FIG. 6 uses two polling periods and two accumulated motion deltas to predict the velocity, more than two polling periods and more than two accumulated motion deltas can also be used to predict the velocity. -
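- The weighting step can be sketched as follows (hypothetical Python; the equal-weight default reproduces the plain averaging described for FIG. 6, while unequal weights illustrate the more general weighting equation):

```python
def weighted_velocity(velocities, weights=None):
    """Combine velocities predicted from several polling periods
    (e.g. V_1, V_2) into a single weighting result."""
    if weights is None:
        weights = [1.0] * len(velocities)  # plain average by default
    return sum(v * w for v, w in zip(velocities, weights)) / sum(weights)

v = weighted_velocity([2.0, 4.0])  # average of V_1 and V_2 -> 3.0
```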
FIG. 7 is a schematic diagram illustrating how the image sensor 203 reports an output motion delta responding to a polling after at least one velocity is predicted. In the embodiment of FIG. 7 , the image sensor 203 respectively receives a first polling P_1, a second polling P_2, and a third polling P_3 at the first polling time T_P1, the second polling time T_P2, and the third polling time T_P3. Further, the image sensor 203 respectively captures a first frame f_1, a second frame f_2 and a third frame f_3 at the first frame time T_f1, the second frame time T_f2, and the third frame time T_f3. Also, the first velocity V_1 and the second velocity V_2 are predicted via the above-mentioned methods, thus the first predicted motion delta PD_1 and the second predicted motion delta PD_2 can be acquired.
- Afterwards, the
image sensor 203 reports an output motion delta responding to the first polling P_1 according to a first accumulated motion delta ACM_1 corresponding to the first polling, the first predicted motion delta PD_1 and the second predicted motion delta PD_2. In such case, the first accumulated motion delta ACM_1 can be the sum of the motion delta D_1 between the first frame f_1 and the second frame f_2 and the motion delta D_2 between the second frame f_2 and the third frame f_3. Also, in one embodiment, the output motion delta is ACM_1+PD_1−PD_2.
- In view of the above-mentioned embodiments, an image sensing system control method can be acquired, which can be performed by at least one program recorded in a non-transitory computer readable recording medium such as an optical disc, a hard disk or a memory card.
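- The reporting rule for the first polling can be sketched as follows (hypothetical Python; the reading that PD_2 is subtracted because it was already reported as the predicted portion of the previous polling is our interpretation of the embodiment):

```python
def output_motion_delta(acm_1, pd_1, pd_2):
    """Output motion delta reported in response to the first polling P_1:
    the accumulated motion delta of captured frames (ACM_1), plus the motion
    predicted for the gap before the current polling (PD_1), minus the motion
    already predicted and reported at the previous polling (PD_2)."""
    return acm_1 + pd_1 - pd_2

od = output_motion_delta(acm_1=5, pd_1=2, pd_2=1)  # reports 6
```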
FIG. 8 is a flow chart illustrating an image sensing system control method according to one embodiment of the present invention, which comprises the following steps:
Step 801 - Predict a first velocity (e.g. V_1 in
FIG. 3 -FIG. 6 ) of the image sensor (e.g. 203 in FIG. 2 ).
Step 803 - Calculate a first time duration (e.g. TD_1 in
FIG. 1 ) between a first frame time (e.g. T_f1 in FIG. 1 ) and a first polling time (e.g. T_P1 in FIG. 1 ) after the first frame time T_f1.
Step 805 - Calculate a first predicted motion delta PD_1 of the first time duration according to the first velocity and the first time duration.
- Other details are explained in the above-mentioned embodiments, and are thus omitted here for brevity. Please note that the above-mentioned embodiments are only illustrative examples. Combinations or variations based on the above-mentioned teachings should also fall within the scope of the present invention.
-
FIG. 9 is a block diagram illustrating an application in which the image sensing system provided by the present invention is applied to a computer. In the embodiment of FIG. 9 , the control circuit 201 and the image sensor 203 are provided in an electronic device 900. The electronic device 900 can be any kind of device which can communicate with the computer 901. For example, the electronic device 900 can be an optical navigation device such as an optical mouse, or an optical pointing device. As above-mentioned, the control circuit 201 can poll the image sensor 203 to output the above-mentioned output motion delta OD. After receiving the output motion delta OD, the control circuit 201 may transmit the output motion delta OD to the computer 901. Please note that the computer 901 can be replaced by any other electronic device.
- Therefore, to confirm whether the
computer 901 correctly receives the output motion delta OD, a protocol analyzer 903 can be used to capture the communication between the control circuit 201 and the image sensor 203. After that, the output motion delta OD (source) is extracted from the log file generated by the protocol analyzer 903. Also, the output motion delta OD (destination) is captured at the computer 901, and the data comparator 905 compares the output motion delta OD (source) with the output motion delta OD (destination). If the control circuit 201 does not change the output motion delta OD from the image sensor 203, the output motion delta OD (source) and the output motion delta OD (destination) are the same. The protocol analyzer 903 can be implemented by a circuit specifically designed for capturing the output motion delta OD (source). Also, the protocol analyzer 903 can be a processor installed with at least one program to capture the output motion delta OD (source).
- In view of the above-mentioned embodiments, the motion delta can be output corresponding to the time difference between a time of the frame and a time of the polling, thus the effect caused by non-synchronization between the frame rate and the polling can be reduced.
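- The comparison performed by the data comparator 905 can be sketched as follows (hypothetical Python; the function name and the list-of-samples log format are illustrative assumptions, not the analyzer's actual output format):

```python
def compare_motion_deltas(source_log, destination_log):
    """Return True when every output motion delta extracted from the
    protocol-analyzer log (source) matches the corresponding value
    captured at the computer 901 (destination)."""
    if len(source_log) != len(destination_log):
        return False
    return all(s == d for s, d in zip(source_log, destination_log))

compare_motion_deltas([1, 2, 3], [1, 2, 3])  # True: the deltas were not changed in transit
```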
- Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Claims (22)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/830,212 US11119587B1 (en) | 2020-03-25 | 2020-03-25 | Computer readable recording medium can perform image sensing system control method and image sensing system |
| CN202011172818.3A CN113448723B (en) | 2020-03-25 | 2020-10-28 | Image sensing system control method and image sensing system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/830,212 US11119587B1 (en) | 2020-03-25 | 2020-03-25 | Computer readable recording medium can perform image sensing system control method and image sensing system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US11119587B1 US11119587B1 (en) | 2021-09-14 |
| US20210303084A1 true US20210303084A1 (en) | 2021-09-30 |
Family
ID=77665844
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/830,212 Active US11119587B1 (en) | 2020-03-25 | 2020-03-25 | Computer readable recording medium can perform image sensing system control method and image sensing system |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US11119587B1 (en) |
| CN (1) | CN113448723B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12379796B2 (en) * | 2023-09-22 | 2025-08-05 | Pixart Imaging Inc. | Image sensing system control method and image sensing system |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110150287A1 (en) * | 2009-12-17 | 2011-06-23 | Flosdorf Stephen P | Detection of Local Motion between Image Frames |
| US20140368688A1 (en) * | 2013-06-14 | 2014-12-18 | Qualcomm Incorporated | Computer vision application processing |
| US20150301618A1 (en) * | 2014-04-22 | 2015-10-22 | Pixart Imaging (Penang) Sdn. Bhd. | Sub-frame accumulation method and apparatus for keeping reporting errors of an optical navigation sensor consistent across all frame rates |
| US20150334315A1 (en) * | 2009-03-02 | 2015-11-19 | Flir Systems, Inc. | Infrared imaging enhancement with fusion |
| US9749524B1 (en) * | 2012-05-25 | 2017-08-29 | Apple Inc. | Methods and systems for determining a direction of a sweep motion |
| US10819896B1 (en) * | 2019-05-13 | 2020-10-27 | Pixart Imaging Inc. | Computer readable recording medium can perform image sensing system control method and image sensing system |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104053052A (en) * | 2013-03-15 | 2014-09-17 | 奇高电子股份有限公司 | Method and device for dynamically adjusting image update rate |
| US9927884B2 (en) * | 2015-11-06 | 2018-03-27 | Pixart Imaging (Penang) Sdn. Bhd. | Non transitory computer readable recording medium for executing image processing method, and image sensing device applying the image processing method |
| US10490068B2 (en) * | 2016-10-31 | 2019-11-26 | Veniam, Inc. | Systems and methods for achieving road action consensus, for example among autonomous vehicles, in a network of moving things |
| DE112017006689T5 (en) * | 2016-12-30 | 2019-09-12 | Intel Corporation | PROCESS AND DEVICES FOR RADIO COMMUNICATION |
| US10187607B1 (en) * | 2017-04-04 | 2019-01-22 | Gopro, Inc. | Systems and methods for using a variable capture frame rate for video capture |
| US10302450B1 (en) * | 2017-06-19 | 2019-05-28 | Rockwell Collins, Inc. | Methods and systems for high accuracy and integrity estimation of flight critical aircraft states |
-
2020
- 2020-03-25 US US16/830,212 patent/US11119587B1/en active Active
- 2020-10-28 CN CN202011172818.3A patent/CN113448723B/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150334315A1 (en) * | 2009-03-02 | 2015-11-19 | Flir Systems, Inc. | Infrared imaging enhancement with fusion |
| US20110150287A1 (en) * | 2009-12-17 | 2011-06-23 | Flosdorf Stephen P | Detection of Local Motion between Image Frames |
| US9749524B1 (en) * | 2012-05-25 | 2017-08-29 | Apple Inc. | Methods and systems for determining a direction of a sweep motion |
| US20140368688A1 (en) * | 2013-06-14 | 2014-12-18 | Qualcomm Incorporated | Computer vision application processing |
| US20150301618A1 (en) * | 2014-04-22 | 2015-10-22 | Pixart Imaging (Penang) Sdn. Bhd. | Sub-frame accumulation method and apparatus for keeping reporting errors of an optical navigation sensor consistent across all frame rates |
| US10819896B1 (en) * | 2019-05-13 | 2020-10-27 | Pixart Imaging Inc. | Computer readable recording medium can perform image sensing system control method and image sensing system |
| US20200366823A1 (en) * | 2019-05-13 | 2020-11-19 | Pixart Imaging Inc. | Computer readable recording medium can perform image sensing system control method and image sensing system |
Also Published As
| Publication number | Publication date |
|---|---|
| US11119587B1 (en) | 2021-09-14 |
| CN113448723A (en) | 2021-09-28 |
| CN113448723B (en) | 2023-10-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11392475B2 (en) | Job power predicting method and information processing apparatus | |
| CN112033351A (en) | Monocular camera-based distance measuring method and electronic equipment | |
| CN110069194B (en) | Page blockage determining method and device, electronic equipment and readable storage medium | |
| CN113903389B (en) | Slow disk detection method and device and computer readable and writable storage medium | |
| US9851789B2 (en) | Information processing technique for eye gaze movements | |
| CN112004133A (en) | Audio-visual coherence method, device, projection device and readable storage medium | |
| CN111063011A (en) | Face image processing method, device, equipment and medium | |
| US11119587B1 (en) | Computer readable recording medium can perform image sensing system control method and image sensing system | |
| US10819896B1 (en) | Computer readable recording medium can perform image sensing system control method and image sensing system | |
| JP2009200672A (en) | Image signal processing apparatus, image signal processing method, and program | |
| US10593059B1 (en) | Object location estimating method with timestamp alignment function and related object location estimating device | |
| CN109933537B (en) | Stuck detection method, related device, equipment and computer readable medium | |
| KR20190014628A (en) | Bigdata based temperature management assessment method and system | |
| US9483125B2 (en) | Position information obtaining device and method, and image display system | |
| CN104116505A (en) | Pulse estimation device and pulse estimation program | |
| CN115033165B (en) | Touch event processing method and device, storage medium and electronic equipment | |
| US11962898B2 (en) | Computer readable recording medium which can perform image sensing system control method and image sensing system | |
| KR100780057B1 (en) | Video Gravial Shot Converter and its method | |
| US8463037B2 (en) | Detection of low contrast for image processing | |
| KR101994287B1 (en) | Apparatus and method for backup image | |
| JP3747230B2 (en) | Video analysis system | |
| US12223664B2 (en) | Image processing apparatus, image processing method, and non-transitory computer-readable storage medium | |
| CN113572683A (en) | Data processing method and device, electronic equipment and storage medium | |
| CN112288774B (en) | Mobile detection method, mobile detection device, electronic equipment and storage medium | |
| US8514328B2 (en) | Motion vector based image segmentation |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: PIXART IMAGING INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KONG, SHANG CHAN;REEL/FRAME:052229/0437 Effective date: 20200220 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |