US20130314558A1 - Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof - Google Patents
- Publication number: US20130314558A1 (application US13/868,092)
- Authority: US (United States)
- Prior art keywords: image capture, distance, specific action, sensing result, capture device
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N5/23222
- H04N5/772—Interface circuits between a recording apparatus and a television camera placed in the same enclosure
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4007—Scaling based on interpolation, e.g. bilinear interpolation
- G06T3/4023—Scaling based on decimating pixels or lines of pixels; based on inserting pixels or lines of pixels
- G06T3/4053—Scaling based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
- G06T5/73—Deblurring; Sharpening
- G06T7/0002—Inspection of images, e.g. flaw detection
- H04N23/64—Computer-aided capture of images, e.g. advice or proposal for image composition or decision on when to take image
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/6811—Motion detection based on the image signal
- H04N23/682—Vibration or motion blur correction
- H04N9/79—Processing of colour television signals in connection with recording
- G06T2207/10016—Video; Image sequence
- G06T2207/20192—Edge enhancement; Edge preservation
- G06T2207/30168—Image quality inspection
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the disclosed embodiments of the present invention relate to controlling an image capture module, and more particularly, to an image capture device for starting a specific action in advance when determining that the specific action associated with an image capture module is about to be triggered and related image capture method thereof.
- Camera modules have become popular elements used in a variety of applications.
- a smartphone is typically equipped with a camera module, thus allowing a user to easily and conveniently take pictures by using the smartphone.
- however, the smartphone is prone to generating blurred images.
- the camera aperture and/or sensor size of the smartphone is typically small, which leads to a small amount of light arriving at each pixel of the camera sensor. As a result, the image quality may suffer from the small camera aperture and/or sensor size.
- the smartphone tends to be affected by hand shake. Specifically, when the user's finger touches a physical shutter/capture button or a virtual shutter/capture button on the smartphone, the shake of the smartphone will last for a period of time. Hence, any picture taken during this period of time would be affected by the hand shake.
- An image deblurring algorithm may be performed upon the blurred images. However, the computational complexity of the image deblurring algorithm is very high, resulting in considerable power consumption. Moreover, artifacts will be introduced if the image deblurring algorithm is not perfect.
- a camera module with an optical image stabilizer is expensive.
- the conventional smartphone is generally equipped with a digital image stabilizer (i.e., an electronic image stabilizer (EIS)).
- the digital image stabilizer can counteract the motion of images, but fails to prevent image blurring.
- an image capture device for starting a specific action in advance when determining that the specific action associated with an image capture module is about to be triggered and related image capture method thereof are proposed to solve the above-mentioned problem.
- an exemplary image capture device includes an image capture module, a sensor arranged for sensing an object to generate a sensing result, and a controller arranged for checking the sensing result to determine if a specific action associated with the image capture module is about to be triggered and controlling the image capture module to start the specific action in advance when determining that the specific action is about to be triggered.
- an exemplary image capture method includes: sensing an object to generate a sensing result; checking the sensing result to determine if a specific action associated with an image capture module is about to be triggered; and when determining that the specific action is about to be triggered, controlling the image capture module to start the specific action in advance.
- FIG. 1 is a block diagram illustrating an image capture device according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating an image capture method according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating a first embodiment of step 206 shown in FIG. 2 .
- FIG. 4 is a diagram illustrating a second embodiment of step 206 shown in FIG. 2 .
- FIG. 5 is a diagram illustrating a third embodiment of step 206 shown in FIG. 2 .
- FIG. 6 is a diagram illustrating a fourth embodiment of step 206 shown in FIG. 2 .
- the main concept of the present invention is to capture one or more still image(s) or start/end a video recording operation before an object (e.g., a finger of a user or a pen with magnetism that is used by the user) actually touches an image capture device. In this way, the image blurring caused by unwanted hand shake applied to the image capture device is avoided. Further details are described below.
- FIG. 1 is a block diagram illustrating an image capture device according to an embodiment of the present invention.
- the image capture device 100 may be at least a portion (i.e., part or all) of an electronic device.
- the image capture device 100 may be implemented in a portable device such as a smartphone or a digital camera.
- the image capture device 100 includes, but is not limited to, an image capture module 102 , a sensor 104 and a controller 106 .
- the image capture module 102 has the image capture capability, and may be used to generate still image(s) under an image capture mode (i.e., a photo mode) and generate a video sequence under a video recording mode.
- the sensor 104 is coupled to the controller 106 , and arranged for sensing an object OBJ to generate a sensing result SR.
- the object OBJ may trigger a specific action to be performed by the image capture module 102 .
- the sensing result SR carries information indicative of the triggering status of the specific action.
- the specific action may be an image capture action or an action of starting/ending video recording; and the object OBJ may be a finger of a user or a pen with magnetism that is used by the user.
- the controller 106 is coupled to the sensor 104 and the image capture module 102 , and arranged for receiving the sensing result SR and controlling the image capture module 102 based on the received sensing result SR. Specifically, the controller 106 checks the sensing result SR to determine if the specific action associated with the image capture module 102 is about to be triggered, and controls the image capture module 102 to start the specific action in advance when determining that the specific action is about to be triggered (i.e., the object OBJ is close to the image capture device 100 but does not touch the image capture device 100 yet).
- the image capture module 102 is controlled by the controller 106 to start the image capture action (i.e., enter an image capture mode) before the image capture device 100 is actually touched by the object OBJ, thus making captured still images free from image blurring caused by unwanted hand shake.
- the image capture module 102 is controlled by the controller 106 to start the action of starting video recording (i.e., enter a video recording mode) before the image capture device 100 is actually touched by the object OBJ, thus making captured video frames in the beginning of the video recording free from image blurring caused by hand shake.
- the image capture module 102 is controlled by the controller 106 to start the action of ending video recording (i.e., leave the video recording mode) before the image capture device 100 is actually touched by the object OBJ, thus making captured video frames in the end of the video recording free from image blurring caused by hand shake.
- FIG. 2 is a flowchart illustrating an image capture method according to an embodiment of the present invention. Provided that the result is substantially the same, the steps are not required to be executed in the exact order shown in FIG. 2 .
- the exemplary image capture method may include following steps.
- Step 200 Start.
- Step 202 The image capture module 102 enters a camera preview mode.
- Step 204 Utilize the sensor 104 to sense the object OBJ, and accordingly generate the sensing result SR.
- Step 206 Utilize the controller 106 to check the sensing result SR to determine if the specific action associated with the image capture module 102 is about to be triggered. If yes, go to step 208 ; otherwise, go to step 202 .
- Step 208 Utilize the controller 106 to control the image capture module 102 to leave camera preview mode and enter a different camera mode (e.g., an image capture mode or a video recording mode) to start the specific action.
- Step 210 The specific action is actually triggered by the object OBJ touching the image capture device 100 .
- Step 212 End.
- the image capture module 102 may enter a camera preview mode to generate a preview image or a preview video sequence on a display screen (not shown) of the image capture device 100 (step 202 ). The image capture module 102 stays in the camera preview mode until it is determined that the specific action associated with the image capture module 102 is about to be triggered (step 206 ). As can be seen from the flowchart in FIG. 2 , the specific action is started in advance once the controller 106 judges that the specific action is about to be triggered (steps 206 and 208 ). That is, when a predetermined criterion is met, the controller 106 activates the specific action of the image capture module 102 even though the object OBJ has not yet actually triggered the specific action (steps 208 and 210 ).
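The preview-to-action flow above can be sketched as a small state machine. This is an illustrative sketch only: the names (CameraState, run_once, is_about_to_trigger) are assumptions, not from the patent, and the step numbers in the comments refer to FIG. 2.

```python
from enum import Enum

class CameraState(Enum):
    PREVIEW = "preview"   # step 202: camera preview mode
    ACTION = "action"     # step 208: image capture / video recording started

def run_once(sensing_results, is_about_to_trigger):
    """Walk through sensing results (step 204) and record the state after
    each sample; step 206 is modelled by the is_about_to_trigger predicate."""
    state = CameraState.PREVIEW
    history = [state]
    for sr in sensing_results:
        if state is CameraState.PREVIEW and is_about_to_trigger(sr):
            state = CameraState.ACTION   # step 208: start the action early
        history.append(state)
    return history

# Example: a bare distance threshold stands in for the step 206 criterion.
states = run_once([90, 60, 30, 10], lambda d: d < 40)
```

The action is entered on the third sample, before the distance reaches zero, which is the "start in advance" behavior the flowchart describes.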
- step 206 is performed to determine whether the specific action should be activated in advance.
- the controller 106 may refer to the sensing result SR to determine a distance D between the object OBJ and the image capture device 100 (e.g., a distance between the object OBJ and the sensor 104 ), and refer to the distance D to determine if the specific action is about to be triggered.
- FIG. 3 is a diagram illustrating a first embodiment of step 206 shown in FIG. 2 .
- the step 206 may be realized using following steps.
- Step 302 Estimate the distance D between the object OBJ and the image capture device 100 according to information given by the sensing result SR.
- Step 304 Compare the distance D with a predetermined threshold TH D .
- Step 306 Check if the distance D is shorter than the predetermined threshold TH D . If yes, go to step 308 ; otherwise, go to step 316 .
- Step 308 Count a time period T in which the distance D is continuously found shorter than the predetermined threshold TH D .
- Step 310 Compare the time period T with a predetermined time duration TH T .
- Step 312 Check if the time period T reaches the predetermined time duration TH T . If yes, go to step 314 ; otherwise, go to step 302 .
- Step 314 Determine that the specific action is about to be triggered.
- Step 316 Determine that the specific action is not about to be triggered.
- the controller 106 determines that the specific action is about to be triggered when the distance D is continuously found shorter than the predetermined threshold TH D over the predetermined time duration TH T . Specifically, when the distance D becomes shorter than the predetermined threshold TH D , this means that the object OBJ is close to the image capture device 100 (steps 302 - 306 ). It is possible that the user is going to trigger the specific action associated with the image capture module 102 . To avoid misjudgment, the predetermined time duration TH T is employed in this embodiment.
- if the distance D fails to remain shorter than the predetermined threshold TH D throughout the predetermined time duration TH T , the controller 106 would not decide that the specific action is about to be triggered (steps 308 - 312 ). That is, when there is one determination result showing that the distance D is not shorter than the predetermined threshold TH D before the predetermined time duration TH T expires, the controller 106 aborts the current counting operation of the time period T in which the distance D remains shorter than the predetermined threshold TH D , and decides that the specific action is not about to be triggered.
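A minimal sketch of the FIG. 3 criterion, assuming discrete sensing samples stand in for the time period T and the duration TH T; the function name and the sample-based timing are illustrative assumptions.

```python
def duration_criterion(distances, th_d, th_t):
    """Return True once the distance stays below th_d for th_t samples."""
    count = 0                    # models the counted time period T (step 308)
    for d in distances:
        if d < th_d:             # step 306: object is close
            count += 1
            if count >= th_t:    # step 312: duration TH_T reached
                return True      # step 314: action is about to be triggered
        else:
            count = 0            # any long sample aborts the count (step 316)
    return False                 # step 316: not about to be triggered

# A brief dip below the threshold does not fire; a sustained one does.
duration_criterion([5, 9, 5, 9, 5], th_d=8, th_t=3)     # False
duration_criterion([5, 5, 9, 5, 5, 5], th_d=8, th_t=3)  # True
```

Resetting the count on any long sample is exactly the misjudgment-prevention behavior described above: a finger that merely passes near the device never accumulates the required duration.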
- the flow shown in FIG. 3 is merely one feasible implementation of the step 206 shown in FIG. 2 .
- the steps 308 - 312 may be omitted.
- the controller 106 may determine that the specific action is about to be triggered each time the distance D is found shorter than the predetermined threshold TH D . This also falls within the scope of the present invention.
- steps 308 - 312 are used to avoid misjudgment by checking if the distance D is continuously found shorter than the predetermined threshold TH D over the predetermined time duration TH T .
- a different misjudgment prevention scheme may be employed.
- FIG. 4 is a diagram illustrating a second embodiment of step 206 shown in FIG. 2 .
- the step 206 may be realized using following steps.
- Step 502 Estimate the distance (e.g., a first distance D 1 ) between the object OBJ and the image capture device 100 according to information given by the sensing result SR.
- Step 504 Compare the first distance D 1 with a predetermined threshold TH D .
- Step 506 Check if the first distance D 1 is shorter than the predetermined threshold TH D . If yes, go to step 508 ; otherwise, go to step 516 .
- Step 508 Estimate the distance (e.g., a second distance D 2 ) between the object OBJ and the image capture device 100 according to information given by the sensing result SR.
- Step 510 Compare the second distance D 2 with the first distance D 1 .
- Step 512 Check if the second distance D 2 is shorter than the first distance D 1 . If yes, go to step 514 ; otherwise, go to step 516 .
- Step 514 Determine that the specific action is about to be triggered.
- Step 516 Determine that the specific action is not about to be triggered.
- the controller 106 determines that the specific action is about to be triggered when the estimated distance (i.e., first distance D 1 ) is shorter than the predetermined threshold TH D at one time point and then the estimated distance (i.e., second distance D 2 ) becomes shorter at the next time point.
- when the first distance D 1 becomes shorter than the predetermined threshold TH D , this means that the object OBJ is close to the image capture device 100 , and it is possible that the user is going to trigger the specific action associated with the image capture module 102 .
- if the second distance D 2 is not shorter than the first distance D 1 , the controller 106 would not decide that the specific action is about to be triggered (steps 508 - 512 and 516 ). That is, the controller 106 does not decide that the specific action is about to be triggered unless the sequentially estimated distances D 1 and D 2 are both shorter than the predetermined threshold TH D and the latter is shorter than the former (steps 508 - 514 ).
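The FIG. 4 criterion reduces to a two-sample check: the object must already be close, and still getting closer. A hedged sketch, with an assumed function name:

```python
def two_sample_criterion(d1, d2, th_d):
    """FIG. 4-style check on two consecutive distance estimates."""
    if d1 >= th_d:       # step 506: first estimate not close enough
        return False     # step 516: not about to be triggered
    return d2 < d1       # step 512: still approaching -> step 514, else 516

two_sample_criterion(6, 4, th_d=8)   # True: close and getting closer
two_sample_criterion(6, 7, th_d=8)   # False: close but retreating
```

Compared with the FIG. 3 duration check, this variant reacts after only two samples but demands a monotonically shrinking distance, which filters out a finger that hovers without committing.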
- in step 302 / 502 / 508 , the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 is estimated by the controller 106 based on information given by the sensing result SR generated from the sensor 104 .
- several exemplary schemes for estimating the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 are given below.
- the sensor 104 acts as a shutter/capture button
- the controller 106 is configured to determine the distance D/D 1 /D 2 by using skin color information of the object OBJ that is derived from the sensing result SR.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a back camera of the smartphone.
- the sensor 104 generates captured images of the object OBJ to serve as the sensing result SR.
- the controller 106 analyzes each captured image of the object OBJ to obtain the skin color information of the object OBJ.
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action associated with the front camera (i.e., the image capture module 102 ).
- the skin color information of the object OBJ would indicate a finger area within each captured image of the object OBJ.
- the size of the finger area is inversely proportional to the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 . That is, if the finger area is larger, the object OBJ is closer to the image capture device 100 . Hence, the size of the finger area can be used to estimate the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 .
- the controller 106 determines that the distance D/D 1 /D 2 is shorter when an area of skin color (i.e., the size of the finger area) is found larger.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a color sensor implemented in the smartphone.
- the sensor 104 detects the skin color of the object OBJ, and accordingly generates the sensing result SR.
- the skin color information of the object OBJ is directly provided by the sensor 104 .
- the sensor 104 acts as a shutter/capture button
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the controller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the size variation of the finger area.
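The skin-color scheme above can be sketched as thresholding each captured frame into a binary skin mask and treating the mask area as an inverse proxy for distance. The RGB rule below is a crude illustrative stand-in, not the patent's actual detector, and all names are assumptions.

```python
def skin_area(frame):
    """Count pixels passing a crude RGB skin test; frame is a list of rows
    of (r, g, b) tuples. Real skin detectors are far more elaborate."""
    def is_skin(r, g, b):
        return r > 95 and g > 40 and b > 20 and r > g and r > b
    return sum(is_skin(*px) for row in frame for px in row)

def finger_approaching(prev_frame, curr_frame):
    """A growing skin-color area is read as a shrinking distance."""
    return skin_area(curr_frame) > skin_area(prev_frame)

SKIN = (200, 120, 90)              # passes the crude skin test above
BG = (20, 40, 200)                 # background pixel
far = [[BG, SKIN], [BG, BG]]       # small finger area: finger still distant
near = [[SKIN, SKIN], [SKIN, BG]]  # larger finger area: finger closer
finger_approaching(far, near)      # True
```

Monitoring the size variation of the finger area across frames, as the controller 106 does, amounts to repeated calls of this comparison on successive sensing results.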
- the sensor 104 acts as a shutter/capture button, and the controller 106 is configured to determine the distance D/D 1 /D 2 by using light information of the object OBJ that is derived from the sensing result SR.
- the image capture module 102 is a front camera of a smartphone, and the sensor 104 is a back camera of the smartphone.
- the sensor 104 generates captured images of the object OBJ to serve as the sensing result SR.
- the controller 106 analyzes each captured image of the object OBJ to obtain the light information (i.e., brightness information).
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the light information would indicate whether the user's finger is close to the image capture device 100 due to the fact that the intensity of the brightness decreases as the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 decreases. That is, if the captured image generated from the sensor 104 becomes darker, the object OBJ is closer to the image capture device 100 .
- the intensity of brightness can be used to estimate the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 .
- the controller 106 determines that the distance D/D 1 /D 2 is shorter when the intensity of brightness is found lower.
- when the intensity of brightness decreases to be close to a dark level (i.e., the non-zero distance D/D 1 /D 2 is shorter than the predetermined threshold TH D ), it is possible that the user's finger is going to touch the shutter/capture button.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a light sensor implemented in the smartphone.
- the sensor 104 detects the ambient light, and accordingly generates the sensing result SR.
- the light information is directly provided by the sensor 104 .
- the sensor 104 acts as a shutter/capture button
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the controller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the brightness variation of the ambient light detection result.
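A minimal sketch of the light-based scheme: as the finger moves toward the back camera (or light sensor), the mean brightness of the sensing result drops toward a dark level. The dark-level threshold and function names are assumed tuning details, not values from the patent.

```python
def mean_brightness(frame):
    """Average luma of a frame given as a flat list of 0-255 gray values."""
    return sum(frame) / len(frame)

def is_nearly_covered(frame, dark_level=30):
    """Brightness near the dark level is read as the object being very
    close to (almost covering) the sensor."""
    return mean_brightness(frame) <= dark_level

uncovered = [180, 170, 190, 200]   # normal ambient light
covered = [10, 15, 5, 20]          # finger almost on the back camera
is_nearly_covered(covered)         # True
is_nearly_covered(uncovered)       # False
```

In practice the threshold would have to sit well above the sensor's noise floor but below typical ambient readings, since this scheme cannot distinguish an approaching finger from a genuinely dark scene.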
- the sensor 104 acts as a shutter/capture button
- the controller 106 is configured to determine the distance D/D 1 /D 2 by using proximity information of the object OBJ that is derived from the sensing result SR.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a back camera of the smartphone.
- the sensor 104 generates captured images of the object OBJ to serve as the sensing result SR.
- the controller 106 analyzes each captured image of the object OBJ to obtain the proximity information of the object OBJ.
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the proximity information of the object OBJ would indicate whether the object OBJ is in the proximity of the image capture device 100 .
- the proximity information can be used to estimate the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 .
- the controller 106 determines that the distance D/D 1 /D 2 is shorter when the proximity information of the object OBJ indicates that the object OBJ is closer to the image capture device 100 .
- when the proximity information of the object OBJ indicates that the object OBJ is close to the image capture device 100 , it is possible that the user's finger is going to touch the shutter/capture button.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a proximity sensor implemented in the smartphone.
- the sensor 104 detects if the object OBJ is in the proximity of the image capture device 100 , and accordingly generates the sensing result SR.
- the proximity information of the object OBJ is directly provided by the sensor 104 .
- the sensor 104 acts as a shutter/capture button
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the controller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the variation of the proximity detection result.
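Many phone proximity sensors report only a binary near/far flag rather than a distance, so monitoring "the variation of the proximity detection result" can be sketched as watching for a far-to-near transition that persists for a few samples. The debounce length is an assumed tuning constant, not from the patent.

```python
def proximity_trigger(readings, debounce=2):
    """readings: iterable of booleans (True = object detected nearby).
    Fire only after `debounce` consecutive near readings, so a single
    spurious detection does not start the action."""
    run = 0
    for near in readings:
        run = run + 1 if near else 0   # consecutive near readings so far
        if run >= debounce:            # stable near detection
            return True
    return False

proximity_trigger([False, True, False, True, True])   # True
proximity_trigger([True, False, True, False])         # False
```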
- the sensor 104 acts as a shutter/capture button
- the controller 106 is configured to determine the distance D/D 1 /D 2 by using range information of the object OBJ that is derived from the sensing result SR.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a back camera of the smartphone.
- the sensor 104 generates captured images of the object OBJ to serve as the sensing result SR.
- the controller 106 analyzes each captured image of the object OBJ to obtain the range information of the object OBJ.
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the range information of the object OBJ directly gives an estimated value of the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 .
- the controller 106 determines that the distance D/D 1 /D 2 is shorter when the range information of the object OBJ indicates that the object OBJ is closer to the image capture device 100 .
- when the range information of the object OBJ indicates that the object OBJ is close to the image capture device 100 , it is possible that the user's finger is going to touch the shutter/capture button.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a range sensor implemented in the smartphone.
- the sensor 104 measures the distance between the object OBJ and the image capture device 100 , and accordingly generates the sensing result SR.
- the range information of the object OBJ is directly provided by the sensor 104 .
- the sensor 104 acts as a shutter/capture button
- the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action.
- the controller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the variation of the range detection result.
- the controller 106 is configured to determine the distance D/D 1 /D 2 by using depth information of the object OBJ that is derived from the sensing result SR.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a dual-lens camera of the smartphone.
- the sensor 104 is capable of generating a plurality of image pairs, each including a left-view captured image and a right-view captured image of the object OBJ, to serve as the sensing result SR.
- the controller 106 may perform disparity analysis based on the left-view captured image and the right-view captured image of each image pair, and then refer to the disparity analysis result to obtain the depth information of the object OBJ.
- the estimated depth of the object OBJ is proportional to the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 .
- the distance D/D 1 /D 2 between the object OBJ and the image capture device 100 can be estimated based on the depth information of the object OBJ.
- the controller 106 determines that the distance D/D 1 /D 2 is shorter when the depth information of the object OBJ indicates that the object OBJ is closer to the image capture device 100 . Therefore, before the object OBJ, such as user's finger, actually touches a shutter/capture button to trigger the aforementioned specific action, the depth information of the object OBJ would indicate that the object OBJ is approaching the image capture device 100 . When the depth information of the object OBJ indicates that the object OBJ is close to the image capture device 100 , it is possible that user's finger is going to touch the shutter/capture button.
- the image capture module 102 is a front camera of a smartphone
- the sensor 104 is a depth sensor implemented in the smartphone.
- the sensor 104 measures the depth of the object OBJ, and accordingly generates the sensing result SR.
- the depth information of the object OBJ is directly provided by the sensor 104 .
- the user may use his/her finger to touch a shutter/capture button to trigger the aforementioned specific action.
- the controller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the variation of the depth detection result.
- the sensor 104 is implemented using a depth sensing liquid crystal display (LCD) panel. More specifically, the sensor 104 is an LCD panel with depth sensing elements integrated therein. Hence, the sensor 104 may be used to display a virtual shutter/capture button.
- the controller 106 is configured to determine the distance D/D 1 /D 2 by using depth information of the object OBJ that is derived from the sensing result SR, where the depth information of the object OBJ is directly provided by the sensor 104 .
- the controller 106 is capable of determining if user's finger is approaching the virtual shutter/capture button by monitoring the variation of the depth detection result.
- when the object OBJ is close to the virtual shutter/capture button on the screen, it is possible that user's finger is going to touch the virtual shutter/capture button on the screen.
- FIG. 5 is a diagram illustrating a third embodiment of step 206 shown in FIG. 2 .
- the step 206 may be realized using the following steps.
- Step 402: Compare one of an electrical property (e.g., current magnitude) and a magnetic property (e.g., magnetism magnitude) of the sensing result SR with a predetermined threshold TH P .
- Step 404: Check if the checked property is greater than the predetermined threshold TH P . If yes, go to step 406 ; otherwise, go to step 414 .
- Step 406: Count a time period T in which the checked property is continuously found greater than the predetermined threshold TH P .
- Step 408: Compare the time period T with a predetermined time duration TH T .
- Step 410: Check if the time period T reaches the predetermined time duration TH T . If yes, go to step 412 ; otherwise, go to step 402 .
- Step 412: Determine that the specific action is about to be triggered.
- Step 414: Determine that the specific action is not about to be triggered.
- the controller 106 determines that the specific action is about to be triggered when the checked property (e.g., one of the electrical property (e.g., current magnitude) and the magnetic property (e.g., magnetism magnitude) of the sensing result SR) is continuously found greater than the predetermined threshold TH P over the predetermined time duration TH T .
- the predetermined time duration TH T is employed in this embodiment.
- the controller 106 would not decide that the specific action is about to be triggered (steps 406 - 410 ). That is, when there is one determination result showing that the checked property is not greater than the predetermined threshold TH P before the predetermined time duration TH T expires, the controller 106 skips the current counting operation of the time period T in which the checked property is greater than the predetermined threshold TH P , and decides that the specific action is not about to be triggered.
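One way to realize the threshold-plus-duration check of steps 402-414 is sketched below, assuming the sensing result is sampled periodically. The function and parameter names are illustrative; resetting the elapsed count when a sample falls back below TH P mirrors the misjudgment-avoidance behavior described above.

```python
def action_about_to_trigger(samples, th_p, th_t, sample_period):
    """samples: property readings (e.g. current magnitude of the sensing
    result SR) taken every sample_period seconds. Returns True once the
    checked property has stayed above th_p continuously for th_t seconds."""
    elapsed = 0.0
    for value in samples:
        if value > th_p:                 # steps 402-404: above the threshold
            elapsed += sample_period     # step 406: count the time period T
            if elapsed >= th_t:          # steps 408-410: duration reached
                return True              # step 412
        else:
            elapsed = 0.0                # a single miss discards the count
    return False                         # step 414
```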
- the flow shown in FIG. 5 is merely one feasible implementation of the step 206 shown in FIG. 2 .
- the steps 406 - 410 may be omitted.
- the controller 106 may determine that the specific action is about to be triggered each time the checked property is found greater than the predetermined threshold TH P . This also falls within the scope of the present invention.
- steps 406 - 410 are used to avoid misjudgment by checking if the checked property (e.g., the electrical/magnetic property of the sensing result SR) is continuously found greater than the predetermined threshold TH P over the predetermined time duration TH T .
- FIG. 6 is a diagram illustrating a fourth embodiment of step 206 shown in FIG. 2 .
- the step 206 may be realized using the following steps.
- Step 602: Compare a first checked property P 1 with a predetermined threshold TH P , where the first checked property P 1 is one of an electrical property (e.g., current magnitude) and a magnetic property (e.g., magnetism magnitude) of the sensing result SR .
- Step 604: Check if the first checked property P 1 is greater than the predetermined threshold TH P . If yes, go to step 606 ; otherwise, go to step 612 .
- Step 606: Compare a second checked property P 2 with the first checked property P 1 , where the second checked property P 2 is also one of the electrical property (e.g., current magnitude) and the magnetic property (e.g., magnetism magnitude) of the sensing result SR .
- that is, the first checked property P 1 and the second checked property P 2 are both electrical properties or both magnetic properties.
- Step 608: Check if the second checked property P 2 is greater than the first checked property P 1 . If yes, go to step 610 ; otherwise, go to step 612 .
- Step 610: Determine that the specific action is about to be triggered.
- Step 612: Determine that the specific action is not about to be triggered.
- the controller 106 determines that the specific action is about to be triggered when the checked property (i.e., first checked property P 1 ) is greater than the predetermined threshold TH P at one time point and then the checked property (i.e., second checked property P 2 ) becomes greater at the next time point.
- the electrical/magnetic property of the sensing result SR is checked again.
- the controller 106 would not decide that the specific action is about to be triggered (steps 606 , 608 , and 612 ). That is, the controller 106 does not decide that the specific action is about to be triggered unless the sequentially checked properties P 1 and P 2 are both greater than the predetermined threshold TH P and the latter is greater than the former (steps 608 and 610 ).
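The two-point check of FIG. 6 condenses into a few lines. This hedged sketch assumes two successive readings of the same checked property (both electrical or both magnetic); the function and parameter names are illustrative.

```python
def action_imminent(p1, p2, th_p):
    """p1, p2: the first and second checked properties sampled at
    consecutive time points; th_p: the predetermined threshold."""
    if p1 <= th_p:        # step 604 fails -> step 612
        return False
    return p2 > p1        # step 608: rising property -> step 610, else 612
```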
- the controller 106 refers to one of the electrical property and the magnetic property of the sensing result SR to determine if the specific action is about to be triggered.
- the sensor 104 may be implemented using a floating touch panel composed of self capacitive sensors.
- the sensing result SR of the sensor 104 would have its current magnitude inversely proportional to the distance between the object OBJ and the image capture device 100 . Due to the use of self capacitive sensors, the sensor 104 is able to detect the object OBJ before the object OBJ has a physical contact with the sensor 104 .
- a virtual shutter/capture button may be displayed on a screen beneath the floating touch panel.
- the controller 106 is capable of determining if user's finger is approaching the virtual shutter/capture button by monitoring the variation of the current magnitude of the sensing result SR.
- the object OBJ is found close to the virtual shutter/capture button on the screen, it is possible that user's finger is going to touch the virtual shutter/capture button.
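Since the current magnitude of the sensing result SR is stated to be inversely proportional to the distance between the object OBJ and the device, a rough distance estimate can be recovered by inverting that relation. The calibration constant k and the threshold below are assumed values for illustration only.

```python
def estimate_distance(current_ma, k=10.0):
    """Invert the stated relation current ∝ 1/distance: distance = k / current.
    k is an assumed calibration constant of the self capacitive sensor."""
    if current_ma <= 0:
        raise ValueError("no object detected by the floating touch panel")
    return k / current_ma

def finger_near_virtual_button(current_ma, th_d=2.0, k=10.0):
    """True when the estimated distance falls inside the assumed threshold,
    i.e. the finger is hovering close to the virtual shutter/capture button."""
    return estimate_distance(current_ma, k) < th_d
```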
- the object OBJ may be a pen with magnetism
- the sensor 104 may be implemented using a sensor board installed on the image capture device 100 .
- Based on the magnetic coupling between the object OBJ and the sensor 104 , the sensor 104 generates the sensing result SR with a corresponding magnetism magnitude.
- the sensing result SR of the sensor 104 would have its magnetism magnitude inversely proportional to the distance between the object OBJ and the image capture device 100 .
- the sensor 104 is able to detect the object OBJ before the object OBJ has a physical contact with a virtual shutter/capture button on a screen to trigger the aforementioned specific action.
- the controller 106 is capable of determining if the pen with magnetism is approaching the virtual shutter/capture button by monitoring the variation of the magnetism magnitude of the sensing result SR . When the object OBJ is found close to the virtual shutter/capture button, it is possible that the pen with magnetism is going to touch the virtual shutter/capture button.
Abstract
An image capture device has an image capture module, a sensor and a controller. The sensor senses an object to generate a sensing result. The controller checks the sensing result to determine if a specific action associated with the image capture module is about to be triggered, and controls the image capture module to start the specific action in advance when determining that the specific action is about to be triggered.
Description
- This application claims the benefit of U.S. provisional application No. 61/651,499, filed on May 24, 2012 and incorporated herein by reference.
- The disclosed embodiments of the present invention relate to controlling an image capture module, and more particularly, to an image capture device for starting a specific action in advance when determining that the specific action associated with an image capture module is about to be triggered and related image capture method thereof.
- Camera modules have become popular elements used in a variety of applications. For example, a smartphone is typically equipped with a camera module, thus allowing a user to easily and conveniently take pictures by using the smartphone. However, due to inherent characteristics of the smartphone, the smartphone is prone to generate blurred images. For example, the camera aperture and/or sensor size of the smartphone is typically small, which leads to a small amount of light arriving at each pixel in the camera sensor. As a result, the image quality may suffer from the small camera aperture and/or sensor size.
- Besides, due to the light weight and portability of the smartphone, the smartphone tends to be affected by hand shake. Specifically, when user's finger touches a physical shutter/capture button or a virtual shutter/capture button on the smartphone, the shake of the smartphone will last for a period of time. Hence, any picture taken during this period of time would be affected by the hand shake. An image deblurring algorithm may be performed upon the blurred images. However, the computational complexity of the image deblurring algorithm is very high, resulting in considerable power consumption. Besides, artifacts will be introduced if the image deblurring algorithm is not perfect.
- Moreover, a camera module with an optical image stabilizer (OIS) is expensive. Hence, the conventional smartphone is generally equipped with a digital image stabilizer (i.e., an electronic image stabilizer (EIS)). The digital image stabilizer can counteract the motion of images, but fails to prevent image blurring.
- Thus, there is a need for an innovative image capture device which is capable of generating non-blurred pictures.
- In accordance with exemplary embodiments of the present invention, an image capture device for starting a specific action in advance when determining that the specific action associated with an image capture module is about to be triggered and related image capture method thereof are proposed to solve the above-mentioned problem.
- According to a first aspect of the present invention, an exemplary image capture device is disclosed. The exemplary image capture device includes an image capture module, a sensor arranged for sensing an object to generate a sensing result, and a controller arranged for checking the sensing result to determine if a specific action associated with the image capture module is about to be triggered and controlling the image capture module to start the specific action in advance when determining that the specific action is about to be triggered.
- According to a second aspect of the present invention, an exemplary image capture method is disclosed. The exemplary image capture method includes: sensing an object to generate a sensing result; checking the sensing result to determine if a specific action associated with an image capture module is about to be triggered; and when determining that the specific action is about to be triggered, controlling the image capture module to start the specific action in advance.
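The claimed method can be pictured as a small controller loop; the enum, function names, and the distance-threshold predicate in the usage example below are illustrative stand-ins for the sensing and checking steps, not limitations from the claims.

```python
from enum import Enum, auto

class Mode(Enum):
    PREVIEW = auto()
    CAPTURE = auto()   # stands for the image capture or video recording mode

def run_controller(sensing_results, about_to_trigger):
    """Walk a sequence of sensing results; leave the preview mode and start
    the specific action as soon as the check says it is about to be
    triggered, i.e. before the device is actually touched."""
    mode = Mode.PREVIEW
    for sr in sensing_results:       # sense the object to get a result
        if about_to_trigger(sr):     # check the sensing result
            mode = Mode.CAPTURE      # start the specific action in advance
            break
    return mode
```

For instance, interpreting sensing results as distances with a predicate `lambda d: d < 3`, the sequence 10, 5, 2 ends in the capture mode, while 10, 9 stays in preview.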
- These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
FIG. 1 is a block diagram illustrating an image capture device according to an embodiment of the present invention. -
FIG. 2 is a flowchart illustrating an image capture method according to an embodiment of the present invention. -
FIG. 3 is a diagram illustrating a first embodiment of step 206 shown in FIG. 2 . -
FIG. 4 is a diagram illustrating a second embodiment of step 206 shown in FIG. 2 . -
FIG. 5 is a diagram illustrating a third embodiment of step 206 shown in FIG. 2 . -
FIG. 6 is a diagram illustrating a fourth embodiment of step 206 shown in FIG. 2 . - Certain terms are used throughout the description and following claims to refer to particular components. As one skilled in the art will appreciate, manufacturers may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not function. In the following description and in the claims, the terms "include" and "comprise" are used in an open-ended fashion, and thus should be interpreted to mean "include, but not limited to . . . ". Also, the term "couple" is intended to mean either an indirect or direct electrical connection. Accordingly, if one device is coupled to another device, that connection may be through a direct electrical connection, or through an indirect electrical connection via other devices and connections.
- The main concept of the present invention is to capture one or more still image(s) or start/end a video recording operation before an object (e.g., a finger of a user or a pen with magnetism that is used by the user) actually touches an image capture device. In this way, the image blurring caused by unwanted hand shake applied to the image capture device is avoided. Further details are described below.
- Please refer to
FIG. 1 , which is a block diagram illustrating an image capture device according to an embodiment of the present invention. The image capture device 100 may be at least a portion (i.e., part or all) of an electronic device. For example, the image capture device 100 may be implemented in a portable device such as a smartphone or a digital camera. In this embodiment, the image capture device 100 includes, but is not limited to, an image capture module 102 , a sensor 104 and a controller 106 . The image capture module 102 has the image capture capability, and may be used to generate still image(s) under an image capture mode (i.e., a photo mode) and generate a video sequence under a video recording mode. As the present invention focuses on the control scheme applied to the image capture module 102 rather than an internal structure of the image capture module 102 , further description of the internal structure of the image capture module 102 is omitted here for brevity. - The
sensor 104 is coupled to the controller 106 , and arranged for sensing an object OBJ to generate a sensing result SR . The object OBJ may trigger a specific action to be performed by the image capture module 102 . Thus, the sensing result SR carries information indicative of the triggering status of the specific action. By way of example, but not limitation, the specific action may be an image capture action or an action of starting/ending video recording; and the object OBJ may be a finger of a user or a pen with magnetism that is used by the user. - The
controller 106 is coupled to the sensor 104 and the image capture module 102 , and arranged for receiving the sensing result SR and controlling the image capture module 102 based on the received sensing result SR . Specifically, the controller 106 checks the sensing result SR to determine if the specific action associated with the image capture module 102 is about to be triggered, and controls the image capture module 102 to start the specific action in advance when determining that the specific action is about to be triggered (i.e., the object OBJ is close to the image capture device 100 but does not touch the image capture device 100 yet). In a case where the specific action is an image capture action, the image capture module 102 is controlled by the controller 106 to start the image capture action (i.e., enter an image capture mode) before the image capture device 100 is actually touched by the object OBJ, thus making captured still images free from image blurring caused by unwanted hand shake. In another case where the specific action is an action of starting video recording, the image capture module 102 is controlled by the controller 106 to start the action of starting video recording (i.e., enter a video recording mode) before the image capture device 100 is actually touched by the object OBJ, thus making captured video frames in the beginning of the video recording free from image blurring caused by hand shake. In yet another case where the specific action is an action of ending video recording, the image capture module 102 is controlled by the controller 106 to start the action of ending video recording (i.e., leave the video recording mode) before the image capture device 100 is actually touched by the object OBJ, thus making captured video frames in the end of the video recording free from image blurring caused by hand shake. - Please refer to
FIG. 1 in conjunction with FIG. 2 . FIG. 2 is a flowchart illustrating an image capture method according to an embodiment of the present invention. Provided that the result is substantially the same, the steps are not required to be executed in the exact order shown in FIG. 2 . The exemplary image capture method may include the following steps. - Step 200: Start.
- Step 202: The
image capture module 102 enters a camera preview mode. - Step 204: Utilize the
sensor 104 to sense the object OBJ, and accordingly generate the sensing result SR. - Step 206: Utilize the
controller 106 to check the sensing result SR to determine if the specific action associated with the image capture module 102 is about to be triggered. If yes, go to step 208; otherwise, go to step 202. - Step 208: Utilize the
controller 106 to control the image capture module 102 to leave camera preview mode and enter a different camera mode (e.g., an image capture mode or a video recording mode) to start the specific action. - Step 210: The specific action is actually triggered by the object OBJ touching the
image capture device 100. - Step 212: End.
- Before the user actually triggers the specific action (e.g., an image capture action, an action of starting video recording, or an action of ending video recording), the
image capture module 102 may enter a camera preview mode to generate a preview image or a preview video sequence on a display screen (not shown) of the image capture device 100 (step 202). Thus, theimage capture module 102 stays in the camera preview mode until it is determined that the specific action associated with theimage capture module 102 is about to be triggered (step 206). As can be seen from the flowchart inFIG. 2 , the specific action is started in advance at the time thecontroller 106 judges that the specific action is about to be triggered (steps 206 and 208). That is, when a predetermined criterion is met, thecontroller 106 would activate the specific action of theimage capture module 102 even though the object OBJ does not actually trigger the specific action (steps 208 and 210). - As mentioned above,
step 206 is performed to determine whether the specific action should be activated in advance. In one exemplary design, thecontroller 106 may refer to the sensing result SR to determine a distance D between the object OBJ and the image capture device 100 (e.g., a distance between the object OBJ and the sensor 104), and refers to the distance D to determine if the specific action is about to be triggered. Please refer toFIG. 3 , which is a diagram illustrating a first embodiment ofstep 206 shown inFIG. 2 . In this embodiment, thestep 206 may be realized using following steps. - Step 302: Estimate the distance D between the object OBJ and the
image capture device 100 according to information given by the sensing result SR. - Step 304: Compare the distance D with a predetermined threshold THD.
- Step 306: Check if the distance D is shorter than the predetermined threshold THD. If yes, go to step 308; otherwise, go to step 316.
- Step 308: Count a time period T in which the distance D is continuously found shorter than the predetermined threshold THD.
- Step 310: Compare the time period T with a predetermined time duration THT.
- Step 312: Check if the time period T reaches the predetermined time duration THT. If yes, go to step 314; otherwise, go to step 302.
- Step 314: Determine that the specific action is about to be triggered.
- Step 316: Determine that the specific action is not about to be triggered.
- In this embodiment, the
controller 106 determines that the specific action is about to be triggered when the distance D is continuously found shorter than the predetermined threshold THD over the predetermined time duration THT. Specifically, when the distance D becomes shorter than the predetermined threshold THD, this means that the object OBJ is close to the image capture device 100 (steps 302-306). It is possible that the user is going to trigger the specific action associated with theimage capture module 102. To avoid misjudgment, the predetermined time duration THT is employed in this embodiment. Therefore, if the time period in which the distance D remains shorter than the predetermined threshold THD does not last up to the predetermined time duration THT, thecontroller 106 would not decide that the specific action is about to be triggered (steps 308-312). That is, when there is one determination result showing that the distance D is not shorter than the predetermined threshold THD before the predetermined time duration THT is expired, thecontroller 106 skips the current counting operation of the time period T in which the distance D remains shorter than the predetermined threshold THD, and decides that the specific action is not about to be triggered. - The flow shown in
FIG. 3 is merely one feasible implementation of thestep 206 shown inFIG. 2 . In an alternative design, the steps 308-312 may be omitted. Hence, thecontroller 106 may determine that the specific action is about to be triggered each time the distance D is found shorter than the predetermined threshold THD. This also falls within the scope of the present invention. - In the exemplary shown in
FIG. 3 , steps 308-312 are used to avoid misjudgment by checking if the distance D is continuously found shorter than the predetermined threshold THD over the predetermined time duration THT. Alternatively, a different misjudgment prevention scheme may be employed. Please refer toFIG. 4 , which is a diagram illustrating a second embodiment ofstep 206 shown inFIG. 2 . In this embodiment, thestep 206 may be realized using following steps. - Step 502: Estimate the distance (e.g., a first distance D1) between the object OBJ and the
image capture device 100 according to information given by the sensing result SR. - Step 504: Compare the first distance D1 with a predetermined threshold THD.
- Step 506: Check if the first distance D1 is shorter than the predetermined threshold THD. If yes, go to step 508; otherwise, go to step 516.
- Step 508: Estimate the distance (e.g., a second distance D2) between the object OBJ and the
image capture device 100 according to information given by the sensing result SR. - Step 510: Compare the second distance D2 with the first distance D1.
- Step 512: Check if the second distance D2 is shorter than the first distance D1. If yes, go to step 514; otherwise, go to step 516.
- Step 514: Determine that the specific action is about to be triggered.
- Step 516: Determine that the specific action is not about to be triggered.
- In this embodiment, the
controller 106 determines that the specific action is about to be triggered when the estimated distance (i.e., first distance D1) is shorter than the predetermined threshold THD at one time point and then the estimated distance (i.e., second distance D2) becomes shorter at the next time point. Specifically, when the first distance D1 becomes shorter than the predetermined threshold THD, this means that the object OBJ is close to the image capture device 100 (steps 502-506). It is possible that the user is going to trigger the specific action associated with theimage capture module 102. To avoid misjudgment, the distance between the object OBJ and theimage capture device 100 is estimated again. Therefore, if the second distance D2 is not shorter than the first distance D1, thecontroller 106 would not decide that the specific action is about to be triggered (steps 508-512 and 516). That is, thecontroller 106 does not decide that the specific action is about to be triggered unless the sequentially estimated distances D1 and D2 are both shorter than the predetermined threshold THD and the later is shorter than the former (steps 508-514). - In
step 302/502/508, the distance D/D1 /D2 between the object OBJ and theimage capture device 100 is estimated by thecontroller 106 based on information given by the sensing result SR generated from thesensor 104. Several examples for achieving estimation of the distance D/D1/D2 between the object OBJ and theimage capture device 100 are given as below. - In a first exemplary implementation, the
sensor 104 acts as a shutter/capture button, and thecontroller 106 is configured to determine the distance D/D1/D2 by using skin color information of the object OBJ that is derived from the sensing result SR. For example, theimage capture module 102 is a front camera of a smartphone, and thesensor 104 is a back camera of the smartphone. Thus, thesensor 104 generates captured images of the object OBJ to serve as the sensing result SR. After receiving the sensing result SR (i.e., captured images of the object OBJ), thecontroller 106 analyzes each captured image of the object OBJ to obtain the skin color information of the object OBJ. As thesensor 104 is a back camera which acts as a shutter/capture button, the user may use his/her finger to touch thesensor 104 to trigger the aforementioned specific action associated with the front camera (i.e., the image capture module 102). The skin color information of the object OBJ would indicate a finger area within each captured image of the object OBJ. The size of the finger area is inversely proportional to the distance D/D1/D2 between the object OBJ and theimage capture device 100. That is, if the finger area is larger, the object OBJ is closer to theimage capture device 100. Hence, the size of the finger area can be used to estimate the distance D/D1/D2 between the object OBJ and theimage capture device 100. In this embodiment, thecontroller 106 determines that the distance D/D1/D2 is shorter when an area of skin color (i.e., the size of the finger area) is found larger. When the finger area increases to occupy most of the captured image of the object OBJ (i.e., the non-zero distance D/D1/D2 is shorter than the predetermined threshold THD), it is possible that user's finger is going to touch the shutter/capture button. - Alternatively, the
image capture module 102 is a front camera of a smartphone, and thesensor 104 is a color sensor implemented in the smartphone. Thus, thesensor 104 detects the skin color of the object OBJ, and accordingly generates the sensing result SR. In other words, the skin color information of the object OBJ is directly provided by thesensor 104. As thesensor 104 acts as a shutter/capture button, the user may use his/her finger to touch thesensor 104 to trigger the aforementioned specific action. After receiving the sensing result SR (i.e., skin color detection result), thecontroller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the size variation of the finger area. - In a second exemplary implementation, the
sensor 104 acts as a shutter/capture button, and thecontroller 106 is configured to determine the distance D/D1/D2 by using light information of the object OBJ that is derived from the sensing result SR. For example, theimage capture module 102 is a front camera of a smartphone, and thesensor 104 is a back camera of the smartphone. Thus, thesensor 104 generates captured images of the object OBJ to serve as the sensing result SR. After receiving the sensing result SR (i.e., captured images of the object OBJ), thecontroller 106 analyzes each captured image of the object OBJ to obtain the light information (i.e., brightness information). As thesensor 104 acts as a shutter/capture button, the user may use his/her finger to touch thesensor 104 to trigger the aforementioned specific action. The light information would indicate whether user's finger is close to theimage capture device 100 due to the fact that the intensity of the brightness is inversely proportional to the distance D/D1/D2 between the object OBJ and theimage capture device 100. That is, if the captured image generated from thesensor 104 becomes darker, the object OBJ is closer to theimage capture device 100. Hence, the intensity of brightness can be used to estimate the distance D/D1/D2 between the object OBJ and theimage capture device 100. In this embodiment, thecontroller 106 determines that the distance D/D1/D2 is shorter when the intensity of brightness is found lower. When the intensity of brightness decreases to be close to a dark level (i.e., the non-zero distance D/D1/D2 is shorter than the predetermined threshold THD), it is possible that user's finger is going to touch the shutter/capture button. - Alternatively, the
image capture module 102 is a front camera of a smartphone, and thesensor 104 is a light sensor implemented in the smartphone. Thus, thesensor 104 detects the ambient light, and accordingly generates the sensing result SR. In other words, the light information is directly provided by thesensor 104. As thesensor 104 acts as a shutter/capture button, the user may use his/her finger to touch thesensor 104 to trigger the aforementioned specific action. After receiving the sensing result SR (i.e., ambient light detection result), thecontroller 106 is capable of determining if user's finger is approaching the shutter/capture button by monitoring the brightness variation of the ambient light detection result. - In a third exemplary implementation, the
sensor 104 acts as a shutter/capture button, and thecontroller 106 is configured to determine the distance D/D1/D2 by using proximity information of the object OBJ that is derived from the sensing result SR. For example, theimage capture module 102 is a front camera of a smartphone, and thesensor 104 is a back camera of the smartphone. Thus, thesensor 104 generates captured images of the object OBJ to serve as the sensing result SR. After receiving the sensing result SR (i.e., captured images of the object OBJ), thecontroller 106 analyzes each captured image of the object OBJ to obtain the proximity information of the object OBJ. As thesensor 104 acts as a shutter/capture button, the user may use his/her finger to touch thesensor 104 to trigger the aforementioned specific action. The proximity information of the object OBJ would indicate whether the object OBJ is in the proximity of theimage capture device 100. Hence, the proximity information can be used to estimate the distance D/D1/D2 between the object OBJ and theimage capture device 100. In this embodiment, thecontroller 106 determines that the distance D/D1/D2 is shorter when the proximity information of the object OBJ indicates that the object OBJ is closer to theimage capture device 100. When the proximity information of the object OBJ indicates that the object OBJ is close to theimage capture device 100, it is possible that user's finger is going to touch the shutter/capture button. - Alternatively, the
image capture module 102 is a front camera of a smartphone, and the sensor 104 is a proximity sensor implemented in the smartphone. Thus, the sensor 104 detects if the object OBJ is in the proximity of the image capture device 100, and accordingly generates the sensing result SR. In other words, the proximity information of the object OBJ is directly provided by the sensor 104. As the sensor 104 acts as a shutter/capture button, the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action. After receiving the sensing result SR (i.e., the proximity detection result), the controller 106 is capable of determining if the user's finger is approaching the shutter/capture button by monitoring the variation of the proximity detection result. - In a fourth exemplary implementation, the
sensor 104 acts as a shutter/capture button, and the controller 106 is configured to determine the distance D/D1/D2 by using range information of the object OBJ that is derived from the sensing result SR. For example, the image capture module 102 is a front camera of a smartphone, and the sensor 104 is a back camera of the smartphone. Thus, the sensor 104 generates captured images of the object OBJ to serve as the sensing result SR. After receiving the sensing result SR (i.e., captured images of the object OBJ), the controller 106 analyzes each captured image of the object OBJ to obtain the range information of the object OBJ. As the sensor 104 acts as a shutter/capture button, the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action. The range information of the object OBJ directly gives an estimated value of the distance D/D1/D2 between the object OBJ and the image capture device 100. Hence, the controller 106 determines that the distance D/D1/D2 is shorter when the range information of the object OBJ indicates that the object OBJ is closer to the image capture device 100. When the range information of the object OBJ indicates that the object OBJ is close to the image capture device 100, it is possible that the user's finger is going to touch the shutter/capture button. - Alternatively, the
image capture module 102 is a front camera of a smartphone, and the sensor 104 is a range sensor implemented in the smartphone. Thus, the sensor 104 measures the distance between the object OBJ and the image capture device 100, and accordingly generates the sensing result SR. In other words, the range information of the object OBJ is directly provided by the sensor 104. As the sensor 104 acts as a shutter/capture button, the user may use his/her finger to touch the sensor 104 to trigger the aforementioned specific action. After receiving the sensing result SR (i.e., the range detection result), the controller 106 is capable of determining if the user's finger is approaching the shutter/capture button by monitoring the variation of the range detection result. - In a fifth exemplary implementation, the
controller 106 is configured to determine the distance D/D1/D2 by using depth information of the object OBJ that is derived from the sensing result SR. For example, the image capture module 102 is a front camera of a smartphone, and the sensor 104 is a dual-lens camera of the smartphone. Thus, the sensor 104 is capable of generating a plurality of image pairs, each including a left-view captured image and a right-view captured image of the object OBJ, to serve as the sensing result SR. After receiving the sensing result SR (i.e., the image pairs), the controller 106 may perform disparity analysis based on the left-view captured image and the right-view captured image of each image pair, and then refer to the disparity analysis result to obtain the depth information of the object OBJ. The estimated depth of the object OBJ is proportional to the distance D/D1/D2 between the object OBJ and the image capture device 100. Hence, the distance D/D1/D2 between the object OBJ and the image capture device 100 can be estimated based on the depth information of the object OBJ. In this embodiment, the controller 106 determines that the distance D/D1/D2 is shorter when the depth information of the object OBJ indicates that the object OBJ is closer to the image capture device 100. Therefore, before the object OBJ, such as the user's finger, actually touches a shutter/capture button to trigger the aforementioned specific action, the depth information of the object OBJ indicates that the object OBJ is approaching the image capture device 100. When the depth information of the object OBJ indicates that the object OBJ is close to the image capture device 100, it is possible that the user's finger is going to touch the shutter/capture button. - Alternatively, the
image capture module 102 is a front camera of a smartphone, and the sensor 104 is a depth sensor implemented in the smartphone. Thus, the sensor 104 measures the depth of the object OBJ, and accordingly generates the sensing result SR. In other words, the depth information of the object OBJ is directly provided by the sensor 104. As mentioned above, the user may use his/her finger to touch a shutter/capture button to trigger the aforementioned specific action. After receiving the sensing result SR (i.e., the depth detection result), the controller 106 is capable of determining if the user's finger is approaching the shutter/capture button by monitoring the variation of the depth detection result. - In a sixth exemplary implementation, the
sensor 104 is implemented using a depth sensing liquid crystal display (LCD) panel. More specifically, the sensor 104 is an LCD panel with depth sensing elements integrated therein. Hence, the sensor 104 may be used to display a virtual shutter/capture button. The controller 106 is configured to determine the distance D/D1/D2 by using depth information of the object OBJ that is derived from the sensing result SR, where the depth information of the object OBJ is directly provided by the sensor 104. As the user may use his/her finger to touch the virtual shutter/capture button displayed on the depth sensing LCD panel to trigger the aforementioned specific action, the controller 106 is capable of determining if the user's finger is approaching the virtual shutter/capture button by monitoring the variation of the depth detection result. When the object OBJ is close to the virtual shutter/capture button on the screen, it is possible that the user's finger is going to touch the virtual shutter/capture button on the screen. - Regarding the exemplary flows shown in
FIG. 3 and FIG. 4, the distance D/D1/D2 between the object OBJ and the image capture device 100 needs to be estimated/calculated based on information given by the sensing result SR. However, this is for illustrative purposes only, and is not meant to be a limitation of the present invention. Please refer to FIG. 5, which is a diagram illustrating a third embodiment of step 206 shown in FIG. 2. In this embodiment, the step 206 may be realized using the following steps. - Step 402: Compare one of an electrical property (e.g., current magnitude) and a magnetic property (e.g., magnetism magnitude) of the sensing result SR with a predetermined threshold THP.
- Step 404: Check if the checked property is greater than the predetermined threshold THP. If yes, go to step 406; otherwise, go to step 414.
- Step 406: Count a time period T in which the checked property is continuously found greater than the predetermined threshold THP.
- Step 408: Compare the time period T with a predetermined time duration THT.
- Step 410: Check if the time period T reaches the predetermined time duration THT. If yes, go to step 412; otherwise, go to step 402.
- Step 412: Determine that the specific action is about to be triggered.
- Step 414: Determine that the specific action is not about to be triggered.
- In this embodiment, the
controller 106 determines that the specific action is about to be triggered when the checked property (e.g., one of the electrical property (e.g., current magnitude) and the magnetic property (e.g., magnetism magnitude) of the sensing result SR) is continuously found greater than the predetermined threshold THP over the predetermined time duration THT. Specifically, when the checked property becomes greater than the predetermined threshold THP, this means that the object OBJ is close to, but does not have contact with, the image capture device 100 (steps 402 and 404). It is possible that the user is going to trigger the specific action associated with the image capture module 102. To avoid misjudgment, the predetermined time duration THT is employed in this embodiment. Therefore, if the time period in which the checked property is greater than the predetermined threshold THP does not last up to the predetermined time duration THT, the controller 106 would not decide that the specific action is about to be triggered (steps 406-410). That is, when one determination result shows that the checked property is not greater than the predetermined threshold THP before the predetermined time duration THT expires, the controller 106 abandons the current counting operation of the time period T in which the checked property is greater than the predetermined threshold THP, and decides that the specific action is not about to be triggered. - The flow shown in
FIG. 5 is merely one feasible implementation of the step 206 shown in FIG. 2. In an alternative design, steps 406-410 may be omitted. Hence, the controller 106 may determine that the specific action is about to be triggered each time the checked property is found greater than the predetermined threshold THP. This also falls within the scope of the present invention. - In the exemplary design shown in
FIG. 5, steps 406-410 are used to avoid misjudgment by checking if the checked property (e.g., the electrical/magnetic property of the sensing result SR) is continuously found greater than the predetermined threshold THP over the predetermined time duration THT. Alternatively, a different misjudgment prevention scheme may be employed. Please refer to FIG. 6, which is a diagram illustrating a fourth embodiment of step 206 shown in FIG. 2. In this embodiment, the step 206 may be realized using the following steps.
- Step 604: Check if the first checked property P1 is greater than the predetermined threshold THP. If yes, go to step 606; otherwise, go to step 612.
- Step 606: Compare a second checked property P2 with the first checked property P1, where the second checked property P2 is also one of the electrical property (e.g., current magnitude) and the magnetic property (e.g., magnetism magnitude) of the sensing result SR. Specifically, both of the first checked property P1 and the second checked property P2 may be electrical properties or magnetic properties.
- Step 608: Check if the second checked property P2 is greater than the first checked property P1. If yes, go to step 610; otherwise, go to step 612.
- Step 610: Determine that the specific action is about to be triggered.
- Step 612: Determine that the specific action is not about to be triggered.
- In this embodiment, the
controller 106 determines that the specific action is about to be triggered when the checked property (i.e., the first checked property P1) is greater than the predetermined threshold THP at one time point and the checked property (i.e., the second checked property P2) becomes greater at the next time point. Specifically, when the first checked property P1 becomes greater than the predetermined threshold THP, this means that the object OBJ is close to, but does not have contact with, the image capture device 100 (steps 602 and 604). It is possible that the user is going to trigger the specific action associated with the image capture module 102. To avoid misjudgment, the electrical/magnetic property of the sensing result SR is checked again. Therefore, if the second checked property P2 is not greater than the first checked property P1, the controller 106 would not decide that the specific action is about to be triggered (steps 606, 608, and 612). That is, the controller 106 does not decide that the specific action is about to be triggered unless the sequentially checked properties P1 and P2 are both greater than the predetermined threshold THP and the latter is greater than the former (steps 608 and 610). - As mentioned above, the
controller 106 refers to one of the electrical property and the magnetic property of the sensing result SR to determine if the specific action is about to be triggered. In a case where the electrical property (e.g., current magnitude) of the sensing result SR is checked in step 402/602/606, the sensor 104 may be implemented using a floating touch panel composed of self-capacitive sensors. Hence, the sensing result SR of the sensor 104 would have its current magnitude inversely proportional to the distance between the object OBJ and the image capture device 100. Due to the use of self-capacitive sensors, the sensor 104 is able to detect the object OBJ before the object OBJ has a physical contact with the sensor 104. In addition, a virtual shutter/capture button may be displayed on a screen beneath the floating touch panel. As the user may use his/her finger to touch the virtual shutter/capture button, thereby triggering the aforementioned specific action through physical contact with the sensor 104 disposed on the screen, the controller 106 is capable of determining if the user's finger is approaching the virtual shutter/capture button by monitoring the variation of the current magnitude of the sensing result SR. When the object OBJ is found close to the virtual shutter/capture button on the screen, it is possible that the user's finger is going to touch the virtual shutter/capture button. - In another case where the magnetic property (e.g., magnetism magnitude) of the sensing result SR is checked in
step 402/602/606, the object OBJ may be a pen with magnetism, and the sensor 104 may be implemented using a sensor board installed on the image capture device 100. Specifically, based on the magnetic coupling between the object OBJ and the sensor 104, the sensor 104 generates the sensing result SR with a corresponding magnetism magnitude. Hence, the sensing result SR of the sensor 104 would have its magnetism magnitude inversely proportional to the distance between the object OBJ and the image capture device 100. Due to the use of the pen with magnetism, the sensor 104 is able to detect the object OBJ before the object OBJ has a physical contact with a virtual shutter/capture button on a screen to trigger the aforementioned specific action. The controller 106 is capable of determining if the pen with magnetism is approaching the virtual shutter/capture button by monitoring the variation of the magnetism magnitude of the sensing result SR. When the object OBJ is found close to the virtual shutter/capture button, it is possible that the pen with magnetism is going to touch the virtual shutter/capture button. - Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
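The trigger-detection flows of FIG. 5 (steps 402-414) and FIG. 6 (steps 602-612) described above can be sketched as follows. This is a minimal illustration only: the patent does not supply code, and the function names, the sample-list stand-in for real-time polling, and the `tht_count` stand-in for the duration THT are all assumptions of this sketch.

```python
def held_above_threshold(samples, thp, tht_count):
    """Sketch of the FIG. 5 flow (steps 402-414): the specific action is
    deemed 'about to be triggered' only when the checked property (an
    electrical or magnetic magnitude) exceeds the threshold THP for
    tht_count consecutive samples, standing in for the predetermined
    time duration THT. Any sample at or below THP abandons the current
    count, which is the misjudgment guard the embodiment describes."""
    run = 0
    for value in samples:
        if value > thp:           # step 404: above threshold?
            run += 1              # step 406: keep counting the period
            if run >= tht_count:  # steps 408/410: duration reached
                return True       # step 412: about to be triggered
        else:
            run = 0               # step 414 path: restart the count
    return False


def rising_above_threshold(p1, p2, thp):
    """Sketch of the FIG. 6 flow (steps 602-612): the trigger is deemed
    imminent only when the first checked property P1 exceeds THP
    (steps 602/604) and the later sample P2 is greater still
    (steps 606/608), i.e. the object keeps approaching."""
    if p1 <= thp:      # step 604 'no' branch -> step 612
        return False
    return p2 > p1     # step 608 -> step 610 (True) or step 612 (False)
```

In both sketches a larger property value corresponds to a shorter object-to-device distance, matching the inverse proportionality the description attributes to the self-capacitive current and the magnetism magnitude.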
Claims (25)
1. An image capture device comprising:
an image capture module;
a sensor, arranged for sensing an object to generate a sensing result; and
a controller, arranged for checking the sensing result to determine if a specific action associated with the image capture module is about to be triggered and controlling the image capture module to start the specific action in advance when determining that the specific action is about to be triggered.
2. The image capture device of claim 1, wherein the specific action is an image capture action, an action of starting video recording or an action of ending video recording.
3. The image capture device of claim 1, wherein the controller refers to the sensing result to determine a distance between the object and the image capture device, and refers to the distance to determine if the specific action is about to be triggered.
4. The image capture device of claim 3, wherein the controller determines that the specific action is about to be triggered when the distance is continuously found shorter than a predetermined threshold over a predetermined time duration.
5. The image capture device of claim 3, wherein the controller determines that the specific action is about to be triggered when the distance is shorter than the predetermined threshold and a next distance between the object and the image capture device is shorter than the distance.
6. The image capture device of claim 3, wherein the controller determines the distance by using skin color information of the object that is derived from the sensing result.
7. The image capture device of claim 6, wherein the controller determines that the distance is shorter when an area of skin color is found larger.
8. The image capture device of claim 3, wherein the controller determines the distance by using light information that is derived from the sensing result.
9. The image capture device of claim 3, wherein the controller determines the distance by using proximity information of the object that is derived from the sensing result.
10. The image capture device of claim 3, wherein the controller determines the distance by using range information of the object that is derived from the sensing result or depth information of the object that is derived from the sensing result.
11. The image capture device of claim 10, wherein when the controller determines the distance by using the depth information of the object, the sensor is a depth sensing liquid crystal display (LCD) panel.
12. The image capture device of claim 1, wherein the controller refers to one of an electrical property and a magnetic property of the sensing result to determine if the specific action is about to be triggered.
13. The image capture device of claim 12, wherein the sensor is a floating touch panel.
14. The image capture device of claim 12, wherein the object sensed by the sensor is a pen with magnetism.
15. An image capture method comprising:
sensing an object to generate a sensing result;
checking the sensing result to determine if a specific action associated with an image capture module is about to be triggered; and
when determining that the specific action is about to be triggered, controlling the image capture module to start the specific action in advance.
16. The image capture method of claim 15, wherein the specific action is an image capture action, an action of starting video recording or an action of ending video recording.
17. The image capture method of claim 15, wherein the step of checking the sensing result to determine if the specific action associated with the image capture module is about to be triggered comprises:
referring to the sensing result to determine a distance between the object and the image capture device; and
referring to the distance to determine if the specific action is about to be triggered.
18. The image capture method of claim 17, wherein it is determined that the specific action is about to be triggered when the distance is continuously found shorter than a predetermined threshold over a predetermined time duration.
19. The image capture method of claim 17, wherein it is determined that the specific action is about to be triggered when the distance is shorter than the predetermined threshold and a next distance between the object and the image capture device is shorter than the distance.
20. The image capture method of claim 17, wherein the step of referring to the sensing result to determine the distance between the object and the image capture device comprises:
determining the distance by using skin color information of the object that is derived from the sensing result.
21. The image capture method of claim 20, wherein it is determined that the distance is shorter when an area of skin color is found larger.
22. The image capture method of claim 17, wherein the step of referring to the sensing result to determine the distance between the object and the image capture device comprises:
determining the distance by using light information that is derived from the sensing result.
23. The image capture method of claim 17, wherein the step of referring to the sensing result to determine the distance between the object and the image capture device comprises:
determining the distance by using proximity information of the object that is derived from the sensing result.
24. The image capture method of claim 17, wherein the step of referring to the sensing result to determine the distance between the object and the image capture device comprises:
determining the distance by using range information of the object that is derived from the sensing result or depth information of the object that is derived from the sensing result.
25. The image capture method of claim 15, wherein the step of checking the sensing result to determine if the specific action associated with the image capture module is about to be triggered comprises:
referring to one of an electrical property and a magnetic property of the sensing result to determine if the specific action is about to be triggered.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/868,092 US20130314558A1 (en) | 2012-05-24 | 2013-04-22 | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
| CN2013101858481A CN103428425A (en) | 2012-05-24 | 2013-05-20 | Image capturing device and image capturing method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201261651499P | 2012-05-24 | 2012-05-24 | |
| US13/868,092 US20130314558A1 (en) | 2012-05-24 | 2013-04-22 | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20130314558A1 true US20130314558A1 (en) | 2013-11-28 |
Family
ID=49621289
Family Applications (6)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/868,092 Abandoned US20130314558A1 (en) | 2012-05-24 | 2013-04-22 | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
| US13/868,072 Active US9503645B2 (en) | 2012-05-24 | 2013-04-22 | Preview system for concurrently displaying multiple preview images generated based on input image generated by image capture apparatus and related preview method thereof |
| US13/890,254 Abandoned US20130314511A1 (en) | 2012-05-24 | 2013-05-09 | Image capture device controlled according to image capture quality and related image capture method thereof |
| US13/891,201 Active 2033-07-20 US9066013B2 (en) | 2012-05-24 | 2013-05-10 | Content-adaptive image resizing method and related apparatus thereof |
| US13/891,196 Active US9560276B2 (en) | 2012-05-24 | 2013-05-10 | Video recording method of recording output video sequence for image capture module and related video recording apparatus thereof |
| US15/296,002 Active US9681055B2 (en) | 2012-05-24 | 2016-10-17 | Preview system for concurrently displaying multiple preview images generated based on input image generated by image capture apparatus and related preview method thereof |
Family Applications After (5)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/868,072 Active US9503645B2 (en) | 2012-05-24 | 2013-04-22 | Preview system for concurrently displaying multiple preview images generated based on input image generated by image capture apparatus and related preview method thereof |
| US13/890,254 Abandoned US20130314511A1 (en) | 2012-05-24 | 2013-05-09 | Image capture device controlled according to image capture quality and related image capture method thereof |
| US13/891,201 Active 2033-07-20 US9066013B2 (en) | 2012-05-24 | 2013-05-10 | Content-adaptive image resizing method and related apparatus thereof |
| US13/891,196 Active US9560276B2 (en) | 2012-05-24 | 2013-05-10 | Video recording method of recording output video sequence for image capture module and related video recording apparatus thereof |
| US15/296,002 Active US9681055B2 (en) | 2012-05-24 | 2016-10-17 | Preview system for concurrently displaying multiple preview images generated based on input image generated by image capture apparatus and related preview method thereof |
Country Status (2)
| Country | Link |
|---|---|
| US (6) | US20130314558A1 (en) |
| CN (5) | CN103428423A (en) |
Families Citing this family (91)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9936143B2 (en) | 2007-10-31 | 2018-04-03 | Google Technology Holdings LLC | Imager module with electronic shutter |
| KR101720774B1 (en) * | 2010-11-24 | 2017-03-28 | 삼성전자주식회사 | Digital photographing apparatus and method for providing a picture thereof |
| US20150036880A1 (en) * | 2012-03-29 | 2015-02-05 | Nec Corporation | Analysis system |
| JP5880263B2 (en) * | 2012-05-02 | 2016-03-08 | ソニー株式会社 | Display control device, display control method, program, and recording medium |
| US9392322B2 (en) | 2012-05-10 | 2016-07-12 | Google Technology Holdings LLC | Method of visually synchronizing differing camera feeds with common subject |
| US20130314558A1 (en) * | 2012-05-24 | 2013-11-28 | Mediatek Inc. | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
| EP2912602A4 (en) * | 2012-10-23 | 2016-03-16 | Ishay Sivan | Real time assessment of picture quality |
| US9282244B2 (en) | 2013-03-14 | 2016-03-08 | Microsoft Technology Licensing, Llc | Camera non-touch switch |
| EP2975995B1 (en) * | 2013-03-20 | 2023-05-31 | Covidien LP | System for enhancing picture-in-picture display for imaging devices used for surgical procedures |
| CN105264419B (en) * | 2013-06-06 | 2017-09-22 | 富士胶片株式会社 | Autofocus and its method of controlling operation |
| KR102082661B1 (en) * | 2013-07-12 | 2020-02-28 | 삼성전자주식회사 | Photograph image generating method of electronic device, and apparatus thereof |
| US9582716B2 (en) * | 2013-09-09 | 2017-02-28 | Delta ID Inc. | Apparatuses and methods for iris based biometric recognition |
| KR20150051085A (en) * | 2013-11-01 | 2015-05-11 | 삼성전자주식회사 | Method for obtaining high dynamic range image,Computer readable storage medium of recording the method and a digital photographing apparatus. |
| US9210327B2 (en) * | 2013-12-02 | 2015-12-08 | Yahoo! Inc. | Blur aware photo feedback |
| CN104333689A (en) * | 2014-03-05 | 2015-02-04 | 广州三星通信技术研究有限公司 | Method and device for displaying preview image during shooting |
| US9357127B2 (en) | 2014-03-18 | 2016-05-31 | Google Technology Holdings LLC | System for auto-HDR capture decision making |
| CN103916602B (en) * | 2014-04-17 | 2019-01-15 | 努比亚技术有限公司 | Method, first movement terminal and the system of long-range shooting control |
| KR102105961B1 (en) | 2014-05-13 | 2020-05-28 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
| US9729784B2 (en) | 2014-05-21 | 2017-08-08 | Google Technology Holdings LLC | Enhanced image capture |
| US9628702B2 (en) | 2014-05-21 | 2017-04-18 | Google Technology Holdings LLC | Enhanced image capture |
| US9813611B2 (en) * | 2014-05-21 | 2017-11-07 | Google Technology Holdings LLC | Enhanced image capture |
| US9774779B2 (en) | 2014-05-21 | 2017-09-26 | Google Technology Holdings LLC | Enhanced image capture |
| US9451178B2 (en) | 2014-05-22 | 2016-09-20 | Microsoft Technology Licensing, Llc | Automatic insertion of video into a photo story |
| US9503644B2 (en) | 2014-05-22 | 2016-11-22 | Microsoft Technology Licensing, Llc | Using image properties for processing and editing of multiple resolution images |
| US11184580B2 (en) * | 2014-05-22 | 2021-11-23 | Microsoft Technology Licensing, Llc | Automatically curating video to fit display time |
| GB201412818D0 (en) * | 2014-07-18 | 2014-09-03 | Omg Plc | Minimisation of blur in still image capture |
| US9413947B2 (en) | 2014-07-31 | 2016-08-09 | Google Technology Holdings LLC | Capturing images of active subjects according to activity profiles |
| CN104200189B (en) * | 2014-08-27 | 2017-05-03 | 苏州佳世达电通有限公司 | Barcode scanning device and processing method thereof |
| KR102189647B1 (en) * | 2014-09-02 | 2020-12-11 | 삼성전자주식회사 | Display apparatus, system and controlling method thereof |
| KR20160029536A (en) * | 2014-09-05 | 2016-03-15 | 엘지전자 주식회사 | Mobile terminal and control method for the mobile terminal |
| KR102252448B1 (en) * | 2014-09-12 | 2021-05-14 | 삼성전자주식회사 | Method for controlling and an electronic device thereof |
| US9654700B2 (en) | 2014-09-16 | 2017-05-16 | Google Technology Holdings LLC | Computational camera using fusion of image sensors |
| CN105516579B (en) * | 2014-09-25 | 2019-02-05 | 联想(北京)有限公司 | An image processing method, device and electronic device |
| EP3010225B1 (en) * | 2014-10-14 | 2019-07-24 | Nokia Technologies OY | A method, apparatus and computer program for automatically capturing an image |
| CN104581379A (en) * | 2014-12-31 | 2015-04-29 | 乐视网信息技术(北京)股份有限公司 | Video preview image selecting method and device |
| TWI565317B (en) * | 2015-01-06 | 2017-01-01 | 緯創資通股份有限公司 | Image processing method and mobile electronic device |
| FR3043233B1 (en) * | 2015-10-30 | 2018-04-06 | Merry Pixel | METHOD OF AUTOMATICALLY SELECTING IMAGES FROM A MOBILE DEVICE |
| CN105872352A (en) * | 2015-12-08 | 2016-08-17 | 乐视移动智能信息技术(北京)有限公司 | Method and device for shooting picture |
| US9881191B2 (en) * | 2015-12-14 | 2018-01-30 | Leadot Innovation, Inc. | Method of controlling operation of cataloged smart devices |
| US10015400B2 (en) * | 2015-12-17 | 2018-07-03 | Lg Electronics Inc. | Mobile terminal for capturing an image and associated image capturing method |
| US9743077B2 (en) | 2016-01-12 | 2017-08-22 | Sling Media LLC | Detection and marking of low quality video content |
| KR20180023197A (en) * | 2016-08-25 | 2018-03-07 | LG Electronics Inc. | Terminal and method for controlling the same |
| CN107800950B (en) * | 2016-09-06 | 2020-07-31 | Teco Image Systems Co., Ltd. | Image acquisition method |
| ES2858370T5 (en) * | 2016-10-11 | 2024-04-29 | Signify Holding Bv | Surveillance system and procedure for controlling a surveillance system |
| CN108713318A (en) * | 2016-10-31 | 2018-10-26 | Huawei Technologies Co., Ltd. | Video frame processing method and device |
| CN106453962B (en) * | 2016-11-30 | 2020-01-21 | Zhuhai Meizu Technology Co., Ltd. | Camera shooting control method of double-screen intelligent terminal |
| US20180227502A1 (en) * | 2017-02-06 | 2018-08-09 | Qualcomm Incorporated | Systems and methods for reduced power consumption in imaging pipelines |
| CN106937045B (en) | 2017-02-23 | 2020-08-14 | Huawei Machine Co., Ltd. | Display method of preview image, terminal equipment and computer storage medium |
| CN108781254A (en) * | 2017-03-14 | 2018-11-09 | Huawei Technologies Co., Ltd. | Photo preview method, graphical user interface and terminal |
| WO2019031086A1 (en) * | 2017-08-09 | 2019-02-14 | Fujifilm Corporation | Image processing system, server device, image processing method, and image processing program |
| JP7023663B2 (en) * | 2017-10-12 | 2022-02-22 | Canon Inc. | Image pickup device and its control method |
| CN107731020B (en) * | 2017-11-07 | 2020-05-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Multimedia playing method, device, storage medium and electronic equipment |
| CN107809590B (en) * | 2017-11-08 | 2020-04-28 | Qingdao Hisense Mobile Communications Technology Co., Ltd. | Method and device for taking pictures |
| CN110086905B (en) * | 2018-03-26 | 2020-08-21 | Huawei Technologies Co., Ltd. | Video recording method and electronic equipment |
| BR112020019378A2 (en) * | 2018-03-26 | 2021-01-05 | Huawei Technologies Co., Ltd. | METHOD AND ELECTRONIC DEVICE FOR VIDEO RECORDING |
| US10861148B2 (en) * | 2018-04-30 | 2020-12-08 | General Electric Company | Systems and methods for improved component inspection |
| CN109005337B (en) * | 2018-07-05 | 2021-08-24 | Vivo Mobile Communication Co., Ltd. | Photographing method and terminal |
| CN108600647A (en) * | 2018-07-24 | 2018-09-28 | 努比亚技术有限公司 | Shooting preview method, mobile terminal and storage medium |
| CN109257538A (en) * | 2018-09-10 | 2019-01-22 | Oppo (Chongqing) Intelligent Technology Co., Ltd. | Camera control method and related apparatus |
| KR102637732B1 (en) * | 2018-09-21 | 2024-02-19 | 삼성전자주식회사 | Image signal processor, method of operating the image signal processor, and application processor including the image signal processor |
| US10872240B2 (en) * | 2018-09-28 | 2020-12-22 | Opentv, Inc. | Systems and methods for generating media content |
| CN109194839B (en) * | 2018-10-30 | 2020-10-23 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Display control method, terminal and computer readable storage medium |
| CN109830077A (en) * | 2019-01-15 | 2019-05-31 | Suzhou Qisda Optoelectronics Co., Ltd. | Monitoring device and monitoring method |
| CN109922271A (en) * | 2019-04-18 | 2019-06-21 | Gree Electric Appliances, Inc. of Zhuhai | Mobile terminal based on folding screen and photographing method thereof |
| US10812771B1 (en) * | 2019-06-12 | 2020-10-20 | At&T Intellectual Property I, L.P. | Methods, systems, and devices for adjusting image content for streaming panoramic video content |
| JP7210388B2 (en) * | 2019-06-25 | 2023-01-23 | Canon Inc. | IMAGE PROCESSING DEVICE, IMAGING DEVICE, CONTROL METHOD AND PROGRAM FOR IMAGE PROCESSING DEVICE |
| KR102665968B1 (en) * | 2019-06-27 | 2024-05-16 | Samsung Electronics Co., Ltd. | Method and apparatus for blur estimation |
| CN110896451B (en) * | 2019-11-20 | 2022-01-28 | Vivo Mobile Communication Co., Ltd. | Preview picture display method, electronic device and computer readable storage medium |
| CN114205522B (en) | 2020-01-23 | 2023-07-18 | Huawei Technologies Co., Ltd. | Telephoto shooting method and electronic device |
| US10835106B1 (en) | 2020-02-21 | 2020-11-17 | Ambu A/S | Portable monitor |
| US10980397B1 (en) | 2020-02-21 | 2021-04-20 | Ambu A/S | Video processing device |
| US11109741B1 (en) | 2020-02-21 | 2021-09-07 | Ambu A/S | Video processing apparatus |
| US11166622B2 (en) | 2020-02-21 | 2021-11-09 | Ambu A/S | Video processing apparatus |
| CN116114257A (en) * | 2020-07-31 | 2023-05-12 | Fujifilm Corporation | Image processing device, image processing method, image processing program, and image pickup device |
| CN114125344B (en) * | 2020-08-31 | 2023-06-23 | BOE Technology Group Co., Ltd. | Video processing device and method, monitor device, computer device, medium |
| WO2022055273A1 (en) * | 2020-09-09 | 2022-03-17 | Samsung Electronics Co., Ltd. | Method and electronic device for applying adaptive zoom on an image |
| CN114205515B (en) * | 2020-09-18 | 2023-04-07 | Honor Device Co., Ltd. | Anti-shake processing method for video and electronic equipment |
| CN112333382B (en) | 2020-10-14 | 2022-06-10 | Vivo Mobile Communication (Hangzhou) Co., Ltd. | Shooting method and device and electronic equipment |
| WO2022099118A1 (en) * | 2020-11-09 | 2022-05-12 | Canon U.S.A., Inc. | Detection of image sharpness in frequency domain |
| FR3118380B1 (en) * | 2020-12-22 | 2024-08-30 | Fond B Com | Method for encoding images of a video sequence to be encoded, decoding method, corresponding devices and system. |
| CN112954193B (en) * | 2021-01-27 | 2023-02-10 | Vivo Mobile Communication Co., Ltd. | Shooting method, shooting device, electronic equipment and medium |
| US11716531B2 (en) | 2021-03-22 | 2023-08-01 | International Business Machines Corporation | Quality of multimedia |
| US11483472B2 (en) * | 2021-03-22 | 2022-10-25 | International Business Machines Corporation | Enhancing quality of multimedia |
| US11533427B2 (en) | 2021-03-22 | 2022-12-20 | International Business Machines Corporation | Multimedia quality evaluation |
| EP4086845B1 (en) * | 2021-05-07 | 2025-05-14 | Nokia Technologies Oy | Image processing |
| EP4115789B1 (en) | 2021-07-08 | 2023-12-20 | Ambu A/S | Endoscope image processing device |
| CN116095504B (en) * | 2021-10-29 | 2025-09-26 | Realtek Semiconductor Corp. | Image processing system and related image processing method for image enhancement based on region control and multi-branch processing architecture |
| CN114422713B (en) * | 2022-03-29 | 2022-06-24 | Hunan Aerospace Jiecheng Electronic Equipment Co., Ltd. | Image acquisition and intelligent interpretation processing device and method |
| KR102823705B1 (en) * | 2022-05-26 | 2025-06-24 | Electronics and Telecommunications Research Institute | Method and apparatus for learning facial feature extractor for low-resolution face recognition |
| EP4627591A2 (en) * | 2022-12-01 | 2025-10-08 | Align Technology, Inc. | Augmented video generation with dental modifications |
| US20240185518A1 (en) * | 2022-12-01 | 2024-06-06 | Align Technology, Inc. | Augmented video generation with dental modifications |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080129854A1 (en) * | 2006-11-14 | 2008-06-05 | Casio Computer Co., Ltd. | Imaging apparatus, imaging method and program thereof |
| US20110158623A1 (en) * | 2009-12-30 | 2011-06-30 | Chi Mei Communication Systems, Inc. | Camera device and method for taking photos |
Family Cites Families (54)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0754966B2 (en) | 1985-12-09 | 1995-06-07 | Hitachi, Ltd. | Contour correction circuit |
| US6075926A (en) | 1997-04-21 | 2000-06-13 | Hewlett-Packard Company | Computerized method for improving data resolution |
| JP3564031B2 (en) | 1999-03-16 | 2004-09-08 | Olympus Corporation | Electronic still camera |
| JP2000278710A (en) * | 1999-03-26 | 2000-10-06 | Ricoh Co Ltd | Binocular stereoscopic image evaluation device |
| JP2001211351A (en) * | 2000-01-27 | 2001-08-03 | Fuji Photo Film Co Ltd | Image pickup device and its operation control method |
| US20020056083A1 (en) * | 2000-03-29 | 2002-05-09 | Istvan Anthony F. | System and method for picture-in-browser scaling |
| GB0125774D0 (en) * | 2001-10-26 | 2001-12-19 | Cableform Ltd | Method and apparatus for image matching |
| JP4198449B2 (en) * | 2002-02-22 | 2008-12-17 | Fujifilm Corporation | Digital camera |
| JP2004080252A (en) | 2002-08-14 | 2004-03-11 | Toshiba Corp | Image display apparatus and method |
| US7269300B2 (en) | 2003-10-24 | 2007-09-11 | Eastman Kodak Company | Sharpening a digital image in accordance with magnification values |
| EP1746819B1 (en) * | 2004-05-13 | 2012-05-23 | Sony Corporation | Imaging device, image display method, and user interface |
| US7545391B2 (en) | 2004-07-30 | 2009-06-09 | Algolith Inc. | Content adaptive resizer |
| US7711211B2 (en) | 2005-06-08 | 2010-05-04 | Xerox Corporation | Method for assembling a collection of digital images |
| US8045047B2 (en) * | 2005-06-23 | 2011-10-25 | Nokia Corporation | Method and apparatus for digital image processing of an image having different scaling rates |
| US7448753B1 (en) | 2005-07-19 | 2008-11-11 | Chinnock Randal B | Portable Digital Medical Camera for Capturing Images of the Retina or the External Auditory Canal, and Methods of Use |
| CN101909156B (en) * | 2005-11-02 | 2013-01-16 | Olympus Corporation | Electronic camera and image processing method thereof |
| JP4956988B2 (en) * | 2005-12-19 | 2012-06-20 | Casio Computer Co., Ltd. | Imaging device |
| US20070283269A1 (en) * | 2006-05-31 | 2007-12-06 | Pere Obrador | Method and system for onboard camera video editing |
| JP4904108B2 (en) * | 2006-07-25 | 2012-03-28 | Fujifilm Corporation | Imaging apparatus and image display control method |
| JP4218720B2 (en) | 2006-09-22 | 2009-02-04 | Sony Corporation | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND COMPUTER PROGRAM |
| JP2008096868A (en) * | 2006-10-16 | 2008-04-24 | Sony Corp | Imaging display device and imaging display method |
| US8615112B2 (en) * | 2007-03-30 | 2013-12-24 | Casio Computer Co., Ltd. | Image pickup apparatus equipped with face-recognition function |
| JP4139430B1 (en) | 2007-04-27 | 2008-08-27 | Sharp Corporation | Image processing apparatus and method, image display apparatus and method |
| JP2008306236A (en) | 2007-06-05 | 2008-12-18 | Sony Corp | Image display device, image display method, program for image display method, and recording medium recording program for image display method |
| US20080304568A1 (en) * | 2007-06-11 | 2008-12-11 | Himax Technologies Limited | Method for motion-compensated frame rate up-conversion |
| JP5053731B2 (en) * | 2007-07-03 | 2012-10-17 | Canon Inc. | Image display control device, image display control method, program, and recording medium |
| JP4999649B2 (en) * | 2007-11-09 | 2012-08-15 | Canon Inc. | Display device |
| JP5003529B2 (en) | 2008-02-25 | 2012-08-15 | Nikon Corporation | Imaging apparatus and object detection method |
| CN101266650A (en) | 2008-03-31 | 2008-09-17 | 北京中星微电子有限公司 | An image storage method based on face detection |
| TW200947355A (en) | 2008-05-15 | 2009-11-16 | Ind Tech Res Inst | Intelligent multi-direction display system and method |
| JP4543105B2 (en) | 2008-08-08 | 2010-09-15 | Toshiba Corporation | Information reproduction apparatus and reproduction control method |
| EP2207342B1 (en) * | 2009-01-07 | 2017-12-06 | LG Electronics Inc. | Mobile terminal and camera image control method thereof |
| EP2396768B1 (en) | 2009-02-12 | 2013-04-17 | Dolby Laboratories Licensing Corporation | Quality evaluation of sequences of images |
| JP5294922B2 (en) | 2009-02-26 | 2013-09-18 | Canon Inc. | Playback apparatus and playback method |
| JP2011045039A (en) * | 2009-07-21 | 2011-03-03 | Fujifilm Corp | Compound-eye imaging apparatus |
| US8373802B1 (en) | 2009-09-01 | 2013-02-12 | Disney Enterprises, Inc. | Art-directable retargeting for streaming video |
| US20110084962A1 (en) | 2009-10-12 | 2011-04-14 | Jong Hwan Kim | Mobile terminal and image processing method therein |
| JP5116754B2 (en) | 2009-12-10 | 2013-01-09 | Sharp Corporation | Optical detection device and electronic apparatus |
| US8294748B2 (en) | 2009-12-11 | 2012-10-23 | DigitalOptics Corporation Europe Limited | Panorama imaging using a blending map |
| US20110149029A1 (en) | 2009-12-17 | 2011-06-23 | Marcus Kellerman | Method and system for pulldown processing for 3d video |
| JP5218388B2 (en) * | 2009-12-25 | 2013-06-26 | Casio Computer Co., Ltd. | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM |
| US20110301980A1 (en) | 2010-06-03 | 2011-12-08 | Siemens Medical Solutions Usa, Inc. | Automated Medical Image Storage System |
| JP5569206B2 (en) | 2010-07-15 | 2014-08-13 | Sony Corporation | Image processing apparatus and method |
| US20120019677A1 (en) | 2010-07-26 | 2012-01-26 | Nethra Imaging Inc. | Image stabilization in a digital camera |
| CN102457673A (en) * | 2010-10-26 | 2012-05-16 | 宏达国际电子股份有限公司 | Image acquisition method and system |
| JP5779959B2 (en) * | 2011-04-21 | 2015-09-16 | Ricoh Co., Ltd. | Imaging device |
| WO2013001165A1 (en) | 2011-06-28 | 2013-01-03 | Nokia Corporation | A method, a system, a viewing device and a computer program for picture rendering |
| US9530192B2 (en) | 2011-06-30 | 2016-12-27 | Kodak Alaris Inc. | Method for determining stereo quality score and automatically improving the quality of stereo images |
| FR2978894A1 (en) * | 2011-08-02 | 2013-02-08 | St Microelectronics Grenoble 2 | METHOD FOR PREVIEWING IMAGE IN A DIGITAL VIEWING APPARATUS |
| US9001255B2 (en) * | 2011-09-30 | 2015-04-07 | Olympus Imaging Corp. | Imaging apparatus, imaging method, and computer-readable storage medium for trimming and enlarging a portion of a subject image based on touch panel inputs |
| CN103842903B (en) * | 2011-09-30 | 2015-06-03 | Fujifilm Corporation | Imaging device, imaging method, and program |
| US9269323B2 (en) | 2011-10-28 | 2016-02-23 | Microsoft Technology Licensing, Llc | Image layout for a display |
| US8848068B2 (en) | 2012-05-08 | 2014-09-30 | Oulun Yliopisto | Automated recognition algorithm for detecting facial expressions |
| US20130314558A1 (en) * | 2012-05-24 | 2013-11-28 | Mediatek Inc. | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
- 2013
- 2013-04-22 US US13/868,092 patent/US20130314558A1/en not_active Abandoned
- 2013-04-22 US US13/868,072 patent/US9503645B2/en active Active
- 2013-05-09 US US13/890,254 patent/US20130314511A1/en not_active Abandoned
- 2013-05-10 US US13/891,201 patent/US9066013B2/en active Active
- 2013-05-10 US US13/891,196 patent/US9560276B2/en active Active
- 2013-05-17 CN CN2013101845833A patent/CN103428423A/en active Pending
- 2013-05-20 CN CN2013101858481A patent/CN103428425A/en active Pending
- 2013-05-24 CN CN201310196545.XA patent/CN103428460B/en active Active
- 2013-05-24 CN CN201310196539.4A patent/CN103428428B/en not_active Expired - Fee Related
- 2013-05-24 CN CN201310196195.7A patent/CN103428427B/en active Active
- 2016
- 2016-10-17 US US15/296,002 patent/US9681055B2/en active Active
Cited By (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140028894A1 (en) * | 2012-07-25 | 2014-01-30 | Samsung Electronics Co., Ltd. | Digital photographing apparatus and method of controlling same |
| US20140099994A1 (en) * | 2012-10-04 | 2014-04-10 | Nvidia Corporation | Electronic camera embodying a proximity sensor |
| US9948861B2 (en) * | 2012-11-12 | 2018-04-17 | Samsung Electronics Co., Ltd. | Method and apparatus for capturing and displaying an image |
| US20140132817A1 (en) * | 2012-11-12 | 2014-05-15 | Samsung Electronics Co., Ltd. | Method and apparatus for capturing and displaying an image |
| US20150022432A1 (en) * | 2013-07-17 | 2015-01-22 | Lenovo (Singapore) Pte. Ltd. | Special gestures for camera control and image processing operations |
| US9430045B2 (en) * | 2013-07-17 | 2016-08-30 | Lenovo (Singapore) Pte. Ltd. | Special gestures for camera control and image processing operations |
| US10587789B2 (en) * | 2014-10-24 | 2020-03-10 | Samsung Electronics Co., Ltd. | Image sensor simultaneously generating image and proximity signals |
| US20160119522A1 (en) * | 2014-10-24 | 2016-04-28 | Samsung Electronics Co., Ltd. | Image sensor simultaneously generating image and proximity signals |
| US10277888B2 (en) * | 2015-01-16 | 2019-04-30 | Qualcomm Incorporated | Depth triggered event feature |
| US10939035B2 (en) | 2016-12-07 | 2021-03-02 | Zte Corporation | Photograph-capture method, apparatus, terminal, and storage medium |
| US11949990B2 (en) | 2018-10-05 | 2024-04-02 | Google Llc | Scale-down capture preview for a panorama capture user interface |
| US12256150B2 (en) | 2018-10-05 | 2025-03-18 | Google Llc | Scale-down capture preview for a panorama capture user interface |
| US12266113B2 (en) | 2019-07-15 | 2025-04-01 | Google Llc | Automatically segmenting and adjusting images |
| US11847770B2 (en) | 2019-09-30 | 2023-12-19 | Google Llc | Automatic generation of all-in-focus images with a mobile camera |
| US12333685B2 (en) | 2019-09-30 | 2025-06-17 | Google Llc | Automatic generation of all-in-focus images with a mobile camera |
| US12046072B2 (en) | 2019-10-10 | 2024-07-23 | Google Llc | Camera synchronization and image tagging for face authentication |
| US11546524B2 (en) | 2019-10-11 | 2023-01-03 | Google Llc | Reducing a flicker effect of multiple light sources in an image |
| US12120435B2 (en) | 2019-10-11 | 2024-10-15 | Google Llc | Reducing a flicker effect of multiple light sources in an image |
| US11856295B2 (en) | 2020-07-29 | 2023-12-26 | Google Llc | Multi-camera video stabilization |
Also Published As
| Publication number | Publication date |
|---|---|
| CN103428428A (en) | 2013-12-04 |
| CN103428460A (en) | 2013-12-04 |
| CN103428460B (en) | 2018-03-13 |
| US20130314511A1 (en) | 2013-11-28 |
| CN103428427A (en) | 2013-12-04 |
| US9066013B2 (en) | 2015-06-23 |
| CN103428423A (en) | 2013-12-04 |
| US20170034448A1 (en) | 2017-02-02 |
| US20130315556A1 (en) | 2013-11-28 |
| US9681055B2 (en) | 2017-06-13 |
| CN103428428B (en) | 2017-06-16 |
| US20130314580A1 (en) | 2013-11-28 |
| CN103428425A (en) | 2013-12-04 |
| CN103428427B (en) | 2016-06-08 |
| US9560276B2 (en) | 2017-01-31 |
| US9503645B2 (en) | 2016-11-22 |
| US20130315499A1 (en) | 2013-11-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20130314558A1 (en) | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof | |
| US10419668B2 (en) | Portable device with adaptive panoramic image processor | |
| JP4018695B2 (en) | Method and apparatus for continuous focusing and exposure adjustment in a digital imaging device | |
| KR101278335B1 (en) | Handheld electronic device, dual image capturing method applied thereto, and computer program product loaded therein | |
| US10878539B2 (en) | Image-processing method, apparatus and device | |
| US9973743B2 (en) | Electronic device having dynamically controlled flashlight for image capturing and related control method | |
| CN106603904B (en) | Capturing stable images using ambient light sensor based triggers | |
| US9060129B2 (en) | Imaging device, control method of imaging device, and computer program | |
| US20140292794A1 (en) | Mobile information apparatus and display control method | |
| CN103916592A (en) | Apparatus and method for photographing portrait in portable terminal having camera | |
| US9071760B2 (en) | Image pickup apparatus | |
| CN105744116A (en) | Detection method, control method, detection device, control device and electronic device | |
| JP2013179536A (en) | Electronic apparatus and control method therefor | |
| KR20160018354A (en) | Detecting apparatus, detecting method and computer program stored in recording medium | |
| CN105681626A (en) | Detection method, control method, detection device, control device and electronic device | |
| KR101434027B1 (en) | Imaging device, imaging method and storage medium | |
| JP2017045326A (en) | Display device, control method therefor, program, and storage medium | |
| JP2015011702A (en) | Information processing apparatus and program | |
| US9389489B2 (en) | Photographing apparatus for recognizing type of external device, method of controlling the photographing apparatus, and the external device | |
| CN105872378A (en) | Control method, control device and electronic device | |
| CN106415528B (en) | Translation device | |
| KR20110043991A (en) | Motion blur processing apparatus and method | |
| CN105847686B (en) | Control method, control device, and electronic device | |
| JP5660306B2 (en) | Imaging apparatus, program, and imaging method | |
| JP2018054762A5 (en) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: MEDIATEK INC., TAIWAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JU, CHI-CHENG;CHEN, DING-YUN;HO, CHENG-TSAI;REEL/FRAME:030273/0622; Effective date: 20130419 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |