US20080002963A1 - Systems and methods for capturing images of objects - Google Patents
Systems and methods for capturing images of objects
- Publication number
- US20080002963A1 (application US11/736,655)
- Authority
- US
- United States
- Prior art keywords
- displayed
- image object
- orientation type
- image
- shutter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00281—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
- H04N1/00307—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a mobile telephone apparatus
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00352—Input means
- H04N1/00384—Key input means, e.g. buttons or keypads
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/00411—Display of information to the user, e.g. menus the display also being used for user input, e.g. touch screen
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N1/32101—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N1/32128—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3254—Orientation, e.g. landscape or portrait; Location or order of the image data, e.g. in memory
Definitions
- the invention relates to digital image generation, and more particularly, to systems and methods for determining image orientations when capturing focused objects.
- Mobile electronic devices such as mobile phones, personal digital assistants (PDAs) or similar, are typically equipped with embedded camera modules containing lenses, image sensor modules, image signal processors (ISPs) and others, to capture images of objects such as still images or video frames corresponding to focused objects (e.g. people, animals, flowers, mountain, stones or similar).
- the mobile electronic device may be held vertically or horizontally for focusing on objects to be captured.
- Mobile electronic devices may be equipped with gyro sensors to detect the orientation in which the device is held, such as vertical or horizontal; however, such sensors increase hardware cost.
- An embodiment of a method for generating an image object, performed by a mobile electronic device comprises the following steps.
- the mobile electronic device comprises a first shutter object and a second shutter object.
- a signal is detected. It is determined whether the signal is generated by the first shutter object or the second shutter object.
- a first orientation type is determined when the signal is generated by the first shutter object.
- a second orientation type is determined when the signal is generated by the second shutter object.
- the image object with the determined orientation type is stored.
- An embodiment of a system comprises a first shutter object, a second shutter object and a processor coupled thereto.
- the processor, coupled to the first and second shutter objects, detects a signal, determines whether the signal is generated by the first shutter object or the second shutter object, determines a first orientation type when the signal is generated by the first shutter object, determines a second orientation type when the signal is generated by the second shutter object, and stores the image object with the determined orientation type.
- the image object is to be displayed in response to the stored orientation type.
- FIG. 1 is a diagram of a hardware environment applicable to a mobile electronic device
- FIG. 2 shows the opposite side of an embodiment of a mobile electronic device
- FIGS. 3 a , 3 b , 4 a , 4 b , 5 a , 5 b , 6 a and 6 b are schematic diagrams illustrating embodiments of shutter object placement
- FIGS. 7 a and 7 b are diagrams of the opposite side of an embodiment of a mobile electronic device
- FIG. 8 is a flowchart illustrating an embodiment of a method for capturing images of objects
- FIGS. 9 a and 9 b are diagrams respectively containing tables
- FIGS. 10 a to 10 h are diagrams illustrating mappings between the stored image objects and representations on a display device
- FIG. 11 is a flowchart illustrating an embodiment of a method for displaying image objects
- FIGS. 12 to 15 are diagrams respectively illustrating capture of an image of a person by a mobile phone and display of the captured image on an external display;
- FIGS. 16 a to 16 d are diagrams illustrating adjustment of a direction indicated by an iconic indicator in various aspects before capturing a skyscraper;
- FIG. 17 is a flowchart illustrating an embodiment of a method for capturing images of objects
- FIG. 18 is a diagram containing a table
- FIG. 19 is a diagram of an embodiment of a pipeline for video encoding
- FIG. 20 is a flowchart illustrating an embodiment of buffer write procedure for writing one sensed image to a frame buffer
- FIG. 21 is a diagram of an image DMA controller writing pixel values of one sensed image from an image sensor to a frame buffer by employing a buffer write procedure of FIG. 20 ;
- FIG. 22 is a flowchart illustrating an embodiment of buffer write procedure for writing one sensed image to a frame buffer
- FIG. 23 is a diagram of an image DMA controller writing pixel values of one sensed image from an image sensor to a frame buffer by employing a buffer write procedure of FIG. 22 ;
- FIG. 24 is a diagram of an embodiment of a pipeline for video encoding
- FIGS. 25 a and 25 b are flowcharts illustrating an embodiment of buffer read procedure for reading one sensed image from a frame buffer
- FIGS. 26 a and 26 b are flowcharts illustrating an embodiment of buffer read procedure for reading one sensed image from a frame buffer
- FIG. 27 is a diagram of a video encoder reading pixel values of one sensed image from a frame buffer and generating encoded video stream;
- FIGS. 28 and 29 are diagrams of embodiments of pipelines for video encoding.
- FIG. 1 is a diagram of a hardware environment applicable to a mobile electronic device 100 mainly comprising a communication system 1301 , a microphone 1302 , a speaker 1303 , an antenna 1304 , a processor 1305 , memory 1306 , an image sensor module 1307 , lens 1308 , an image sensor 1309 , a sensor controller and image processor 1310 , an image encoder 1312 , a touch panel controller 1320 , and a key pad controller 1330 .
- the communication system 1301 communicates with other remote mobile electronic devices via the antenna 1304 when connecting to a cellular network, such as global system for mobile communications (GSM), general packet radio service (GPRS), enhanced data rates for global evolution (EDGE), code division multiple access (CDMA), wideband code division multiple access (WCDMA) or circuit switched data (CSD) system or similar.
- the image sensor module 1307 containing lenses 1308 and the image sensor 1309 , as well as the sensor controller and image processor 1310 and image encoder 1312 provide image object generating capability.
- the image sensor module 1307 may contain charge coupled device (CCD) image sensors, complementary metal oxide semiconductor (CMOS) image sensors or similar to record the intensity of light as variable charges.
- the sensor controller and image processor 1310 quantifies the variable charge into a discrete color value.
- a bitmap image contains a plurality of pixel data quantified by the sensor controller and image processor 1310 at a given resolution such as 640×480, 1024×768 and so on.
- the quantified bitmap images may be further converted into a well-known format such as joint photographic experts group (JPEG), graphics interchange format (GIF) or similar, by the image encoder 1312 to generate compressed still images such as JPEG or GIF images.
- the image encoder 1312 may be a video encoder to compress and organize a series of the quantified bitmap images into a series of video frames such as MPEG-1, MPEG-2, MPEG-4, H.263 or H.264 I-, P- and B-frames.
- the still images and/or video frames generated by the image encoder 1312 may be stored in memory 1306 such as dynamic random access memory (DRAM), synchronous DRAM (SDRAM), flash memory or similar, or the storage media 1313 such as a compact flash (CF), memory stick (MS), smart media (SM), or SD memory card or similar.
- the generated still images and/or video frames may be displayed on the display device 1314 such as a color super-twisted nematic (CSTN) display, a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display or similar.
- Users may direct the mobile electronic device 100 to capture images of objects corresponding to focused objects by pressing soft keys 1321 and 1322 on a touch panel ( FIG. 4 a ), hard keys on a key pad 1331 or side shutter button 1332 ( FIG. 3 a ).
- the processor 1305 may direct various camera mechanisms such as an autofocus motor, a shutter motor and/or a diaphragm motor (not shown), the sensor controller and image processor 1310 and image encoder 1312 to capture images of objects.
- FIG. 2 shows the opposite side of an embodiment of a mobile electronic device 100 containing the antenna 1304 and camera lens 1308 .
- Objects brought into focus by the camera lens 1308 are captured and transformed into image objects upon detecting the described shutter or recording signal.
- Hard keys on the keypad 1331 , the shutter button 1332 and soft keys 1321 and 1322 on the touch panel 1323 capable of generating the shutter or recording signals are referred to as shutter objects.
- At least two shutter objects for capturing images of objects such as still images and video frames are provided, and when detecting a shutter or recording signal, an orientation type corresponding to the shutter object generating the shutter or recording signal is determined and the generated image objects with the determined orientation type are stored, enabling the generated image object to be displayed in response to the determined orientation type.
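- as a rough illustration of the above capture path, the following C sketch (all names hypothetical; the patent publishes no code) maps each shutter object to the orientation type stored with the generated image object, per table 91 of FIG. 9 a :

```c
#include <stdio.h>

/* Hypothetical stand-ins for the capture pipeline of FIG. 1. */
typedef struct { int id; } image_t;
typedef enum { SHUTTER_V_OBJ, SHUTTER_H_OBJ } shutter_obj_t;

static image_t capture_image(void) { image_t im = { 42 }; return im; }

static void store_image(image_t im, int orientation)
{
    /* In the patent, the type goes into an EXIF tag or an MPEG user data box. */
    printf("stored image %d with orientation type %d\n", im.id, orientation);
}

/* Mapping of table 91 (sensor placed as in FIG. 7a): V-Obj -> 1, H-Obj -> 8. */
static int orientation_for(shutter_obj_t src)
{
    return (src == SHUTTER_V_OBJ) ? 1 : 8;
}

/* Called when a shutter or recording signal is detected. */
void on_shutter_signal(shutter_obj_t src)
{
    store_image(capture_image(), orientation_for(src));
}

int main(void) { on_shutter_signal(SHUTTER_H_OBJ); return 0; }
```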
- FIGS. 3 a and 3 b are schematic diagrams illustrating embodiments of shutter object placement.
- referring to FIG. 3 a , a hard shutter key 310 on a keypad (e.g. 1331 of FIG. 1 ) is disposed on the front panel of a mobile electronic device, and a shutter button 1332 is disposed on one lateral side (e.g. the right lateral side) of the mobile electronic device.
- a display device (e.g. 1314 of FIG. 1 ) or a touch panel (e.g. 1323 of FIG. 1 ) may continuously display images generated by an image sensor module (e.g. 1307 of FIG. 1 ), facilitating focus on certain objects.
- a user may hold the mobile electronic device vertically to focus on certain objects to be captured, and, when pressing the hard shutter key 310 with a thumb, an image object corresponding to the focused objects is generated and stored in memory (e.g. 1306 of FIG. 1 ) or storage media (e.g. 1313 of FIG. 1 ).
- a user may hold the mobile electronic device horizontally to focus on certain objects, and, when pressing the shutter button 1332 with a forefinger, an image object corresponding to the focused objects is generated and stored.
- FIGS. 4 a and 4 b are schematic diagrams illustrating embodiments of shutter object placement.
- Two soft keys 1321 and 1322 are displayed on a touch panel (e.g. 1323 of FIG. 1 ) of a mobile electronic device.
- the soft key 1321 presents an icon indicating that it is preferable to click the soft key 1321 to capture an image of the focused objects when the mobile electronic device is vertically oriented.
- the soft key 1322 presents an icon indicating that it is preferable to click the soft key 1322 to capture an image of the focused objects when the mobile electronic device is horizontally oriented.
- the touch panel may continuously display images generated by an image sensor module (e.g. 1307 of FIG. 1 ) in preview area W 400 , facilitating focus on certain objects.
- referring to FIG. 4 a , a user may hold the mobile electronic device vertically to focus on certain objects to be captured, and when clicking the soft key 1321 with a thumb, an image object corresponding to the focused objects is generated and stored in memory (e.g. 1306 of FIG. 1 ) or storage media (e.g. 1313 of FIG. 1 ).
- a user may hold the mobile electronic device horizontally to focus on certain objects, and when clicking the soft key 1322 with a forefinger, an image object corresponding to the focused objects is generated and stored.
- FIGS. 5 a and 5 b are schematic diagrams illustrating an embodiment of shutter object placement.
- the described hard shutter key 310 on a keypad (e.g. 1331 of FIG. 1 ) is disposed on the front panel of a mobile electronic device, and the described soft key 1322 is displayed on a touch panel (e.g. 1323 of FIG. 1 ) of a mobile electronic device.
- the touch panel may continuously display images generated by an image sensor module (e.g. 1307 of FIG. 1 ) in preview area W 300 , facilitating focus on certain objects.
- referring to FIG. 5 a , a user may hold the mobile electronic device vertically to focus on certain objects to be captured, and when pressing the hard shutter key 310 with a thumb, an image object corresponding to the focused objects is generated and stored.
- a user may hold the mobile electronic device horizontally to focus on certain objects, and, when clicking the soft key 1322 with a forefinger, an image object corresponding to the focused objects is generated and stored.
- FIGS. 6 a and 6 b are schematic diagrams illustrating an embodiment of shutter object placement.
- the described soft key 1321 is displayed on a touch panel (e.g. 1323 of FIG. 1 ) of a mobile electronic device, and the described shutter button 1332 is disposed on one lateral side (e.g. the right lateral side) of the mobile electronic device.
- the touch panel may continuously display images generated by an image sensor module (e.g. 1307 of FIG. 1 ) in preview area W 600 , facilitating focus on certain objects.
- referring to FIG. 6 a , a user may hold the mobile electronic device vertically to focus on certain objects to be captured, and when clicking the soft key 1321 with a thumb, an image object corresponding to the focused objects is generated and stored.
- referring to FIG. 6 b , a user may hold the mobile electronic device horizontally to focus on certain objects, and when pressing the shutter button 1332 with a forefinger, an image object corresponding to the focused objects is generated and stored.
- FIGS. 7 a and 7 b are diagrams of the opposite side of an embodiment of a mobile electronic device, illustrating placement of the image sensor in two aspects.
- the image sensor is typically an array of CMOS, CCD cells or similar.
- in FIG. 7 a , at least one long edge of the image sensor 1309 is placed parallel to at least one short edge of a mobile electronic device.
- in FIG. 7 b , at least one short edge of the image sensor 1309 is placed parallel to at least one short edge of a mobile electronic device.
- a cell P(0,0) is located in column 0 (i.e. the first column) of row 0 (i.e. the first row) of the image sensor.
- image objects are generated by sequentially scanning the image sensor 1309 to retrieve and convert variable charges into discrete color values.
- the scanning process performed by a sensor controller and image processor (e.g. 1310 of FIG. 1 ), comprises scanning from the first column to the last column in a row. Upon reaching the last column in a row, the next row is scanned from the first column to the last column. The scanning process continues until the entire image sensor has been scanned and all color values have been acquired.
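- a minimal C sketch of this row-major scan is shown below; read_cell and quantize are assumed helper names standing in for the sensor controller and image processor 1310 :

```c
#define SENSOR_WIDTH  640
#define SENSOR_HEIGHT 480

/* Hypothetical helpers: read one cell's charge, quantify it to a color value. */
static int read_cell(int row, int col) { (void)row; (void)col; return 0; }
static int quantize(int charge)        { return charge; }

static int color[SENSOR_HEIGHT][SENSOR_WIDTH];

/* Scan columns 0..W-1 of row 0, then row 1, and so on, until the whole
   sensor has been read and all color values acquired. */
static void scan_sensor(void)
{
    for (int row = 0; row < SENSOR_HEIGHT; row++)
        for (int col = 0; col < SENSOR_WIDTH; col++)
            color[row][col] = quantize(read_cell(row, col));
}
```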
- FIG. 8 is a flowchart illustrating an embodiment of a method for capturing images of objects, performed by a processor of a mobile electronic device (e.g. 1305 of FIG. 1 ).
- information indicating mapping relationships between shutter objects and orientation types is provided. Such information may be stored in memory (e.g. 1306 of FIG. 1 ) or storage media (e.g. 1313 of FIG. 1 ).
- FIGS. 9 a and 9 b are diagrams respectively containing tables 91 and 93 .
- the table 91 describes information indicating mapping relationships when an image sensor is placed as shown in FIG. 7 a
- the table 93 describes information indicating mapping relationships when an image sensor is placed as shown in FIG. 7 b .
- V-Obj may identify a shutter object easily pressed or clicked with a thumb (e.g. 310 of FIG. 3 a or FIG. 5 a , or 1321 of FIG. 4 a or FIG. 6 a ) when the mobile electronic device is vertically oriented.
- H-Obj may identify a shutter object easily pressed or clicked with a forefinger (e.g. 1332 of FIG. 3 b or FIG. 6 b , or 1322 of FIG. 4 b or FIG. 5 b ) when the mobile electronic device is horizontally held.
- Each orientation type indicates mappings between the stored image objects and representations on a display device (e.g. 1314 of FIG. 1 ), a touch panel (e.g. 1323 of FIG. 1 ) or an external display such as a CRT monitor, a TFT-LCD display (not shown), a plasma display (not shown), an OLED display (not shown) or similar.
- FIGS. 10 a to 10 h are diagrams illustrating mappings between the stored image objects and representations on a display device.
- An orientation type equal to one indicates that row 0 of a stored image/frame P 100 a is displayed at the top of a displayed image/frame P 200 a , and column 0 of the stored image/frame P 100 a is displayed at the left-hand side of the displayed image/frame P 200 a ; the display result is as shown in FIG. 10 a .
- An orientation type equal to two indicates that row 0 of a stored image/frame P 100 b is displayed at the top of a displayed image/frame P 200 b , and column 0 of the stored image/frame P 100 b is displayed at the right-hand side of the displayed image/frame P 200 b ; the display result is as shown in FIG. 10 b .
- An orientation type equal to three indicates that row 0 of a stored image/frame P 100 c is displayed at the bottom of a displayed image/frame P 200 c , and column 0 of the stored image/frame P 100 c is displayed at the right-hand side of the displayed image/frame P 200 c ; the display result is as shown in FIG. 10 c .
- An orientation type equal to four indicates that row 0 of a stored image/frame P 100 d is displayed at the bottom of a displayed image/frame P 200 d , and column 0 of the stored image/frame P 100 d is displayed at the left-hand side of the displayed image/frame P 200 d ; the display result is as shown in FIG. 10 d .
- An orientation type equal to five indicates that row 0 of a stored image/frame P 100 e is displayed at the left-hand side of a displayed image/frame P 200 e , and column 0 of the stored image/frame P 100 e is displayed at the top of the displayed image/frame P 200 e ; the display result is as shown in FIG. 10 e .
- An orientation type equal to six indicates that row 0 of a stored image/frame P 100 f is displayed at the right-hand side of a displayed image/frame P 200 f , and column 0 of the stored image/frame P 100 f is displayed at the top of the displayed image/frame P 200 f ; the display result is as shown in FIG. 10 f .
- An orientation type equal to seven indicates that row 0 of a stored image/frame P 100 g is displayed at the right-hand side of a displayed image/frame P 200 g , and column 0 of the stored image/frame P 100 g is displayed at the bottom of the displayed image/frame P 200 g ; the display result is as shown in FIG. 10 g .
- An orientation type equal to eight indicates that row 0 of a stored image/frame P 100 h is displayed at the left-hand side of a displayed image/frame P 200 h , and column 0 of the stored image/frame P 100 h is displayed at the bottom of the displayed image/frame P 200 h ; the display result is as shown in FIG. 10 h .
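- the eight types can be summarized in a small lookup table recording where row 0 and column 0 of the stored image land in the displayed image; a C sketch is shown below (the values coincide with the EXIF orientation tag's 1-8 semantics):

```c
typedef enum { EDGE_TOP, EDGE_BOTTOM, EDGE_LEFT, EDGE_RIGHT } edge_t;

/* orient_map[t] = where row 0 and column 0 of the stored image are
   displayed for orientation type t (see FIGS. 10a-10h). */
static const struct { edge_t row0; edge_t col0; } orient_map[9] = {
    [1] = { EDGE_TOP,    EDGE_LEFT   },
    [2] = { EDGE_TOP,    EDGE_RIGHT  },
    [3] = { EDGE_BOTTOM, EDGE_RIGHT  },
    [4] = { EDGE_BOTTOM, EDGE_LEFT   },
    [5] = { EDGE_LEFT,   EDGE_TOP    },
    [6] = { EDGE_RIGHT,  EDGE_TOP    },
    [7] = { EDGE_RIGHT,  EDGE_BOTTOM },
    [8] = { EDGE_LEFT,   EDGE_BOTTOM },
};
```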
- the utility of the provided information is described in the following steps.
- a shutter or recording signal is detected.
- the shutter or recording signal may be generated by one of multiple shutter objects such as soft keys on a touch panel (e.g. 1321 of FIG. 4 a , 4 b , 6 a or 6 b , or 1322 of FIG. 4 a , 4 b , 5 a or 5 b ), hard keys on a keypad (e.g. 310 of FIG. 3 a , 3 b , 5 a or 5 b ), and a shutter button disposed on a lateral side (e.g. 1332 of FIG. 3 a , 3 b , 6 a or 6 b ).
- the shutter signal will direct relevant electronic devices of the mobile electronic device to generate a still image.
- the recording signal will direct relevant electronic devices of the mobile electronic device to generate a series of video frames.
- step S 831 it is determined which shutter object generated the detected shutter or recording signal.
- step S 841 an image object is acquired via an image sensor module (e.g. 1307 of FIG. 1 ), a sensor controller and image processor (e.g. 1310 of FIG. 1 ) and/or an image encoder (e.g. 1312 of FIG. 1 ).
- an orientation type for the acquired image object is determined according to the provided information and the shutter object generating the shutter or recording signal. For example, when an image sensor is placed as shown in FIG. 7 a , according to the table 91 of FIG. 9 a :
- the orientation type is determined to be one when the shutter or recording signal is generated by a shutter object identified by “V-Obj” (e.g. 310 of FIG. 3 a , 1321 of FIG. 4 a , 310 of FIG. 5 a , or 1321 of FIG. 6 a ); otherwise, the orientation type is determined to be eight when the shutter or recording signal is generated by a shutter object identified by “H-Obj” (e.g. 1332 of FIG. 3 b , 1322 of FIG. 4 b , 1322 of FIG. 5 b , or 1332 of FIG. 6 b ).
- the acquired image object with the determined orientation type is stored in memory (e.g. 1306 of FIG. 1 ) or storage media (e.g. 1313 of FIG. 1 ).
- the determined orientation type may be stored in an orientation tag (0x112) of a still image header compatible with the exchangeable image file format (EXIF), set forth in “Exchangeable Image File Format for Digital Still Cameras: Exif Version 2.2,” established in April 2002.
- the determined orientation type, following a proprietary identifier (e.g. “MTKORIT”), may be stored in a user data (udta) box of an MPEG file set forth by ISO 14496-12, first edition, Feb. 1, 2004. It is to be understood that the pixel data arrangement of the stored still images or video frames is not changed when storing the orientation type in the orientation tag of a still image header or the udta box of an MPEG file.
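- for illustration only: in a raw EXIF header, the orientation is carried by a 12-byte IFD entry with tag 0x0112, type SHORT (3) and count 1. A minimal sketch (not a full EXIF writer; byte-order handling is omitted):

```c
#include <stdint.h>

/* One IFD entry of an EXIF header; a SHORT value sits in the low bytes
   of the 4-byte value field (byte-order dependent in a real file). */
struct ifd_entry {
    uint16_t tag;    /* 0x0112 = Orientation */
    uint16_t type;   /* 3 = SHORT */
    uint32_t count;  /* 1 */
    uint32_t value;  /* orientation type 1..8 */
};

static struct ifd_entry make_orientation_entry(uint16_t orientation)
{
    struct ifd_entry e = { 0x0112, 3, 1, orientation };
    return e;
}
```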
- FIG. 11 is a flowchart illustrating an embodiment of a method for displaying image objects, performed by a processor of a mobile electronic device (e.g. 1305 of FIG. 1 ), a processor of a computer (not shown), or similar.
- an image object is acquired from memory (e.g. 1306 of FIG. 1 ) or storage media ( 1313 of FIG. 1 ).
- an orientation type for the acquired image object is acquired.
- the orientation type may be acquired from the described orientation tag or the described udta box.
- the acquired image object is displayed in response to the acquired orientation type.
- for display details, refer to the above description of FIGS. 10 a to 10 h.
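- one possible way (an assumption, not code from the patent) to honor the stored type at display time is to map it to a rotation plus an optional horizontal mirror; the convention below applies the clockwise rotation first, then the mirror:

```c
struct disp_xform { int rotate_cw_deg; int mirror_h; };

/* Display transform per orientation type. Types 2, 4, 5 and 7 involve a
   mirror; the examples in this document use only types 1, 6 and 8. */
static struct disp_xform xform_for(int orientation)
{
    switch (orientation) {
    case 2:  return (struct disp_xform){ 0,   1 };
    case 3:  return (struct disp_xform){ 180, 0 };
    case 4:  return (struct disp_xform){ 180, 1 };
    case 5:  return (struct disp_xform){ 90,  1 };
    case 6:  return (struct disp_xform){ 90,  0 };
    case 7:  return (struct disp_xform){ 270, 1 };
    case 8:  return (struct disp_xform){ 270, 0 };
    default: return (struct disp_xform){ 0,   0 };  /* type 1: as stored */
    }
}
```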
- FIG. 12 is a diagram illustrating capture of an image of a person by a mobile phone and display of the captured image on an external display.
- an image sensor 1309 of the mobile phone is placed as shown in FIG. 7 a , the described table 91 ( FIG. 9 a ) is provided (referring to step S 811 of FIG. 8 ).
- a shutter or recording signal is detected (referring to step S 821 of FIG. 8 ).
- a processor determines that a hard shutter key 310 identified with “V-Obj” generates the shutter or recording signal (referring to step S 831 ), acquires an image object IMG 120 (referring to step S 841 ), determines an orientation type equal to one, INFO 120 , for the acquired image object IMG 120 by retrieving the provided table 91 (referring to step S 851 ), and stores the acquired image object IMG 120 with the determined orientation type INFO 120 . A computer then acquires the stored image object IMG 120 (referring to step S 1110 of FIG. 11 ), acquires the stored orientation type equal to one for the acquired image object (referring to step S 1120 of FIG. 11 ), and displays the acquired image object in response to the stored orientation type.
- the mobile phone may also display the acquired image object on a screen thereof in response to the stored orientation type by a photo browsing application.
- the display result can be deduced by the analogy of FIG. 12 .
- FIG. 13 is a diagram illustrating capture of an image of a person by a mobile phone and display of the captured image on an external display.
- an image sensor 1309 of the mobile phone is placed as shown in FIG. 7 a , the described table 91 ( FIG. 9 a ) is provided (referring to step S 811 of FIG. 8 ).
- a shutter or recording signal is detected (referring to step S 821 of FIG. 8 ).
- a processor determines that a side shutter button 1332 identified with “H-Obj” generates the shutter or recording signal (referring to step S 831 ), acquires an image object IMG 130 (referring to step S 841 ), determines an orientation type equal to eight, INFO 130 , for the acquired image object IMG 130 by retrieving the provided table 91 (referring to step S 851 ), and stores the acquired image object IMG 130 with the determined orientation type INFO 130 . A computer then acquires the stored image object IMG 130 (referring to step S 1110 of FIG. 11 ), acquires the stored orientation type equal to eight for the acquired image object (referring to step S 1120 of FIG. 11 ), and displays the acquired image object in response to the stored orientation type.
- the mobile phone may also display the acquired image object on a screen thereof in response to the stored orientation type by a photo browsing application.
- the display result can be deduced by the analogy of FIG. 13 .
- FIG. 14 is a diagram illustrating capture of an image of a person by a mobile phone and display of the captured image on an external display.
- an image sensor 1309 of the mobile phone is placed as shown in FIG. 7 b , the described table 93 ( FIG. 9 b ) is provided (referring to step S 811 of FIG. 8 ).
- a shutter or recording signal is detected (referring to step S 821 of FIG. 8 ).
- a processor determines that a hard shutter key 310 identified with “V-Obj” generates the shutter or recording signal (referring to step S 831 ), acquires an image object IMG 140 (referring to step S 841 ), determines an orientation type equal to six, INFO 140 , for the acquired image object IMG 140 by retrieving the provided table 93 (referring to step S 851 ), and stores the acquired image object IMG 140 with the determined orientation type INFO 140 . A computer then acquires the stored image object IMG 140 (referring to step S 1110 of FIG. 11 ), acquires the stored orientation type equal to six for the acquired image object (referring to step S 1120 of FIG. 11 ), and displays the acquired image object in response to the stored orientation type.
- the mobile phone may also display the acquired image object on a screen thereof in response to the stored orientation type by a photo browsing application.
- the display result can be deduced by the analogy of FIG. 14 .
- FIG. 15 is a diagram illustrating capture of an image of a person by a mobile phone and display of the captured image on an external display.
- an image sensor 1309 of the mobile phone is placed as shown in FIG. 7 b , the described table 93 ( FIG. 9 b ) is provided (referring to step S 811 of FIG. 8 ).
- a shutter or recording signal is detected (referring to step S 821 of FIG. 8 ).
- a processor determines that a side shutter button 1332 identified with “H-Obj” generates the shutter or recording signal (referring to step S 831 ), acquires an image object IMG 150 (referring to step S 841 ), determines an orientation type equal to one, INFO 150 , for the acquired image object IMG 150 by retrieving the provided table 93 (referring to step S 851 ), and stores the acquired image object IMG 150 with the determined orientation type INFO 150 . A computer then acquires the stored image object IMG 150 (referring to step S 1110 of FIG. 11 ), acquires the stored orientation type equal to one for the acquired image object (referring to step S 1120 of FIG. 11 ), and displays the acquired image object in response to the stored orientation type.
- a mobile phone may also display the acquired image object on a screen thereof in response to the stored orientation type by a photo browsing application.
- the display result can be deduced by the analogy of FIG. 15 .
- alternatively, the orientation type may be determined by a direction of an iconic indicator displayed on a touch panel (e.g. 1323 of FIG. 1 ) or a display device (e.g. 1314 of FIG. 1 ).
- the direction of the iconic indicator may be adjusted by clicking the specific region. For example, suppose that at least one short edge of an image sensor of a mobile electronic device is placed parallel to at least one short edge of the mobile electronic device, as shown in FIG. 7 b .
- FIGS. 16 a to 16 d are diagrams illustrating adjustment of a direction indicated by an iconic indicator in various aspects before capturing a skyscraper.
- a head of an iconic person I 1600 a is initially displayed toward a direction D up because the image sensor is placed as shown in FIG. 7 b .
- a user recognizes that the skyscraper cannot be fully viewed in the touch panel as shown in FIG. 16 a .
- the user then vertically holds the mobile electronic device to focus on the skyscraper as shown in FIG. 16 b and discovers that the head of the iconic person I 1600 a points in a wrong direction.
- the user can click a specific region on the touch panel, displaying the iconic person, to counterclockwise rotate the iconic person by ninety degrees, i.e. from the direction D up to a direction D left .
- specifically, a processor (e.g. 1305 of FIG. 1 ) counterclockwise rotates the iconic person on the touch panel by ninety degrees when a touch panel controller (e.g. 1320 of FIG. 1 ) detects a signal indicating that the specific region on the touch panel is clicked.
- the iconic person may be rotated by pressing a hard key on a keypad ( 1331 of FIG. 1 ) as shown in FIG. 16 d .
- similarly, the processor counterclockwise rotates the iconic person on the touch panel by ninety degrees when a keypad controller (e.g. 1330 of FIG. 1 ) detects that the hard key is pressed, the result being shown as I 1600 b of FIG. 16 d .
- a direction flag stored in a memory (e.g. 1306 of FIG. 1 ) is updated to record the direction indicated by the iconic indicator each time the iconic person is rotated.
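- a small C sketch of this direction flag (assumed names); each click or keypress turns the indicator ninety degrees counterclockwise and records the new direction:

```c
/* Enum order chosen so that +1 (mod 4) is a ninety-degree counterclockwise
   turn: up -> left -> down -> right -> up. */
typedef enum { D_UP, D_LEFT, D_DOWN, D_RIGHT } direction_t;

static direction_t direction_flag = D_UP;  /* kept in memory 1306 */

static void rotate_indicator_ccw(void)
{
    direction_flag = (direction_t)((direction_flag + 1) % 4);
    /* ...then redraw the iconic person pointing toward direction_flag. */
}
```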
- FIG. 17 is a flowchart illustrating an embodiment of a method for capturing images of objects, performed by a processor of a mobile electronic device (e.g. 1305 of FIG. 1 ).
- step S 1711 information indicating mapping relationships between directions indicated by a displayed iconic indicator and orientation types is provided. Such information may be stored in memory (e.g. 1306 of FIG. 1 ) or storage media (e.g. 1313 of FIG. 1 ).
- FIG. 18 is a diagram containing a table 180 .
- the table 180 describes information indicating mapping relationships when an image sensor is placed as shown in FIG. 7 b , wherein “D up ”, “D down ”, “D left ” and “D right ” may identify the directions as shown in FIGS. 16 a to 16 d .
- an orientation type ranging from 1 to 8 can be assigned to each of “D up ”, “D down ”, “D left ” and “D right ”. Details of the orientation types may follow the descriptions for FIGS. 10 a to 10 h .
- it is to be understood that the pixel data (i.e. discrete color values) arrangement of the stored image object is not changed when storing the determined orientation type.
- the table 180 can be implemented in various data structures such as two-dimensional arrays or similar.
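- for instance, the table could be a one-dimensional array indexed by the direction flag; in the sketch below only D left → 6 is stated in the text and D up → 1 follows from table 93 , while the other two entries are illustrative placeholders:

```c
typedef enum { D_UP, D_LEFT, D_DOWN, D_RIGHT } direction_t;  /* as in the earlier sketch */

/* Table 180 (sensor placed as in FIG. 7b). D_LEFT -> 6 is given in the
   text; D_UP -> 1 follows from table 93; D_DOWN/D_RIGHT are guesses. */
static const int orientation_of[4] = {
    [D_UP] = 1, [D_LEFT] = 6, [D_DOWN] = 3, [D_RIGHT] = 8,
};
```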
- a shutter or recording signal is detected.
- the shutter or recording signal may be generated by one of soft keys on a touch panel (e.g. 1321 of FIG. 4 a , 4 b , 6 a or 6 b , or 1322 of FIG. 4 a , 4 b , 5 a or 5 b ), hard keys on a keypad (e.g. 310 of FIG. 3 a , 3 b , 5 a or 5 b ), and a shutter button disposed on a lateral side (e.g. 1332 of FIG. 3 a , 3 b , 6 a or 6 b ).
- it is determined which direction the displayed iconic indicator indicates.
- the direction indicated by the displayed iconic indicator is preferably determined by inspecting the stored direction flag.
- the shutter signal will direct relevant electronic devices of the mobile electronic device to generate a still image.
- the recording signal will direct relevant electronic devices of the mobile electronic device to generate a series of video frames.
- it is determined which shutter object generated the detected shutter signal.
- an image object is acquired via an image sensor module (e.g. 1307 of FIG. 1 ), a sensor controller and image processor (e.g. 1310 of FIG. 1 ) and/or an image encoder (e.g. 1312 of FIG. 1 ).
- an orientation type for the acquired image object is determined according to the provided information and the direction indicated by the displayed iconic indicator.
- the orientation type is determined to be six when the direction indicated by the iconic indicator is “D left ” as shown in FIG. 16 c or 16 d .
- the acquired image object with the determined orientation type is stored in memory (e.g. 1306 of FIG. 1 ) or storage media (e.g. 1313 of FIG. 1 ).
- the determined orientation type may be stored in the described orientation tag (0x112) of a still image header file.
- the determined orientation type, following a proprietary identifier (e.g. “MTKORIT”), may be stored in the described user data (udta) box of an MPEG file.
- FIG. 19 is a diagram of an embodiment of a pipeline for video encoding.
- the pipeline for video encoding comprises the processor 1305 , memory 1306 , image sensor 1309 , video encoder 1312 , an image DMA (direct memory access) controller 1910 preferably resident on the sensor controller and image processor 1310 , and a frame buffer 1930 preferably resident in the memory 1306 .
- the image DMA controller 1910 contains several buffer write procedures in hardware circuits. Before video encoding, the processor 1305 instructs the image DMA controller 1910 to employ one buffer write procedure according to a determined orientation type. Thereafter, the image DMA controller 1910 receives color values (e.g. RGB, YCbCr or similar) of pixels along the described scanning process from the image sensor 1309 and writes the received color values of each pixel to the frame buffer 1930 with reference to the instructed buffer write procedure during video encoding.
- the sensed image may be rotated and stored in the frame buffer 1930 .
- the video encoder 1312 subsequently acquires an image by serially reading color values from the frame buffer 1930 , and encodes the acquired image into a video bitstream by performing MPEG, H.26x encoding methods, or similar.
- FIG. 20 is a flowchart illustrating an embodiment of buffer write procedure for writing one sensed image to a frame buffer when at least one short edge of an image sensor of a mobile electronic device is placed parallel to at least one short edge of a mobile electronic device as shown in FIG. 7 b , and the rotation type is one.
- a variable i is set to zero.
- color values for one pixel are received.
- the received color values are written to Buffer[OFFSET+i], where the constant “OFFSET” indicates the beginning address of the frame buffer 1930 .
- the variable i is increased by one. Note that each cell in the frame buffer 1930 has sufficient space for storing color values of one pixel.
- step S 2031 it is determined whether i is equal to a total number of pixels denoted as N(image). If so, the process ends, otherwise, the process proceeds to step S 2021 in order to process the next pixel.
- FIG. 21 is a diagram of an image DMA controller writing pixel values of one sensed image from an image sensor to a frame buffer by employing the buffer write procedure as shown in FIG. 20 . Note that, when employing the buffer write procedure as shown in FIG. 20 , the sensed image is not rotated.
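- expressed in C (pixel_t and receive_pixel are assumed helpers), the FIG. 20 procedure is a plain sequential copy, so the stored image is unrotated:

```c
#include <stddef.h>

typedef unsigned int pixel_t;              /* packed color values of one pixel */

enum { OFFSET = 0, N_IMAGE = 640 * 480 };  /* example geometry */

static pixel_t frame_buffer[OFFSET + N_IMAGE];

static pixel_t receive_pixel(void) { return 0; }  /* next pixel in scan order */

/* Buffer write, rotation type one: pixel i of the scan goes to slot i. */
static void buffer_write_identity(void)
{
    for (size_t i = 0; i < N_IMAGE; i++)
        frame_buffer[OFFSET + i] = receive_pixel();
}
```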
- FIG. 22 is a flowchart illustrating an embodiment of buffer write procedure for writing one sensed image to a frame buffer when at least one short edge of an image sensor of a mobile electronic device is placed parallel to at least one short edge of a mobile electronic device as shown in FIG. 7 b , and the rotation type is six.
- a variable i is set to one.
- a variable j is set to one.
- color values for one pixel are received.
- step S 2225 the received color values are written to Buffer[OFFSET+SENSOR_HEIGHT×j−i], where the constant “OFFSET” indicates the beginning address of the frame buffer 1930 , and the constant “SENSOR_HEIGHT” indicates the height of the image sensor 1309 .
- step S 2231 it is determined whether j is equal to a constant “SENSOR_WIDTH” indicating the width of the image sensor 1309 . If so, the process proceeds to step S 2241 , otherwise, to step S 2233 .
- step S 2233 the variable j is increased by one.
- step S 2241 it is determined whether i is equal to the constant “SENSOR_HEIGHT”.
- FIG. 23 is a diagram of an image DMA controller writing pixel values of one sensed image from an image sensor to a frame buffer by employing the buffer write procedure as shown in FIG. 22 . Note that, when employing the buffer write procedure as shown in FIG. 22 , the sensed image is rotated.
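- the FIG. 22 procedure in the same C style; writing the pixel scanned at row i, column j (both 1-based) to slot SENSOR_HEIGHT×j−i turns rows of the sensed image into columns of the stored image, i.e. a ninety-degree rotation:

```c
typedef unsigned int pixel_t;

enum { OFFSET = 0, SENSOR_WIDTH = 320, SENSOR_HEIGHT = 240 };

static pixel_t frame_buffer[OFFSET + SENSOR_WIDTH * SENSOR_HEIGHT];

static pixel_t receive_pixel(void) { return 0; }  /* next pixel in scan order */

/* Buffer write following the flowchart of FIG. 22: indices run from 1 to
   SENSOR_HEIGHT (rows) and 1 to SENSOR_WIDTH (columns), so the slots span
   OFFSET .. OFFSET + SENSOR_WIDTH*SENSOR_HEIGHT - 1 exactly once. */
static void buffer_write_rotated(void)
{
    for (int i = 1; i <= SENSOR_HEIGHT; i++)
        for (int j = 1; j <= SENSOR_WIDTH; j++)
            frame_buffer[OFFSET + SENSOR_HEIGHT * j - i] = receive_pixel();
}
```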
- FIG. 24 is a diagram of an embodiment of a pipeline for video encoding.
- the pipeline for video encoding comprises the processor 1305 , memory 1306 , image sensor 1309 , video encoder 1312 , an image DMA (direct memory access) controller 1910 preferably resident on the sensor controller and image processor 1310 , and a frame buffer 1930 preferably resident in the memory 1306 .
- the video encoder 1312 contains several buffer read procedures in hardware circuits. Before video encoding, the processor 1305 instructs the video encoder 1312 to employ one buffer read procedure according to a determined orientation type. Thereafter, the image DMA controller 1910 receives color values (e.g. RGB, YCbCr or similar) of pixels along the described scanning process from the image sensor 1309 and writes them sequentially to the frame buffer 1930 .
- the video encoder 1312 subsequently acquires an image by reading color values from the frame buffer 1930 with reference to the instructed buffer read procedure, and encodes the acquired image into a video bitstream by performing MPEG, H.26x encoding methods, or similar.
- the sensed image may be rotated and encoded in a video bitstream.
- FIGS. 25 a and 25 b are flowcharts illustrating an embodiment of buffer read procedure for reading one sensed image from a frame buffer when at least one short edge of an image sensor of a mobile electronic device is placed parallel to at least one short edge of a mobile electronic device as shown in FIG. 7 b , and the rotation type is one.
- the buffer read procedure organizes the image in the frame buffer into blocks, and performs a series of encoding methods for each block, such as color space transform, down-sampling, discrete cosine transform (DCT), quantization, variable length encoding (VLE), entropy encoding, motion estimation, and/or others.
- step S 2511 variables i, j, next_i, next_j, block_count, block_height_count, and block_width_count are initially set to zeros.
- step S 2521 color values for one pixel are read from Buffer[OFFSET+SENSOR_HEIGHT×i+j] and treated as one pixel of a block, denoted as block[block_count][block_height_count][block_width_count], where the constant “OFFSET” indicates the beginning address of the frame buffer 1930 , and the constant “SENSOR_HEIGHT” indicates the height of the image sensor 1309 .
- step S 2523 it is determined whether the variable block_width_count is equal to a constant “BLOCK_WIDTH” minus one, where the constant “BLOCK_WIDTH” indicates a block width. If so, the process completes one row of a block and proceeds to step S 2531 , otherwise, to step S 2525 .
- step S 2525 the variables j, and block_width_count are increased by one.
- step S 2531 it is determined whether the variable block_height_count is equal to a constant “BLOCK_HEIGHT” minus one, where the constant “BLOCK_HEIGHT” indicates a block height. If so, the process completes one block and proceeds to step S 2541 , otherwise, to step S 2533 .
- step S 2533 the variable i is increased by one, the variable j is set to the variable next_j, the variable block_height_count is increased by one, and the variable block_width_count is set to zero.
- step S 2541 it is determined whether the value of SENSOR_HEIGHT×i+j+1 is a multiple of the constant “SENSOR_WIDTH”. Supposing that the width of an image sensor is 320 , the multiples of the constant “SENSOR_WIDTH” are 320, 640, 960, 1280, and so on. If so, the process completes all rows of a slice and proceeds to step S 2551 , otherwise, to step S 2543 .
- step S 2543 the variable i is set to the variable next_i, the variable next_j is set to the variable j plus one, the variable j is increased by one, the variables block_height_count and block_width_count are set to zero, and the variable block_count is increased by one.
- step S 2551 it is determined whether the value of SENSOR_HEIGHT×i+j+1 is equal to the value of SENSOR_HEIGHT×SENSOR_WIDTH. If so, the process ends to complete a sensed image, otherwise, the process proceeds to step S 2553 .
- step S 2553 the variable i is increased by one, the variable next_i is set to the variable i plus one, the variables j, next_j, block_width_count and block_height_count are set to zeros, and the variable block_count is increased by one.
- step S 2545 the newly acquired block denoted as block[block_count] is encoded.
- the newly acquired block may be encoded by color space transform, down-sampling, discrete cosine transform (DCT), quantization, variable length encoding (VLE), entropy encoding, motion estimation, and/or others. Note that the sequence of steps S 2511 to S 2553 is only provided for improved understanding. Those skilled in the art may arrange the functions of steps S 2511 to S 2553 in parallel hardware circuits without departing from the scope and spirit of the buffer read procedure in order to improve encoding efficiency.
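- the counter bookkeeping of steps S 2511 to S 2553 amounts to walking the frame buffer block by block; a simplified nested-loop C rendering (a sketch of the unrotated case, not the hardware's counter-based form):

```c
typedef unsigned int pixel_t;

enum {
    OFFSET = 0, IMG_WIDTH = 320, IMG_HEIGHT = 240,
    BLOCK_WIDTH = 8, BLOCK_HEIGHT = 8,
};

static pixel_t frame_buffer[OFFSET + IMG_WIDTH * IMG_HEIGHT];

/* Stand-in for the encoding stages: DCT, quantization, VLE, motion
   estimation, and so on. */
static void encode_block(pixel_t blk[BLOCK_HEIGHT][BLOCK_WIDTH]) { (void)blk; }

/* Gather BLOCK_HEIGHT x BLOCK_WIDTH pixels per block, left to right and
   top to bottom, handing each completed block to the encoder. */
static void buffer_read_blocks(void)
{
    pixel_t blk[BLOCK_HEIGHT][BLOCK_WIDTH];
    for (int by = 0; by < IMG_HEIGHT / BLOCK_HEIGHT; by++)
        for (int bx = 0; bx < IMG_WIDTH / BLOCK_WIDTH; bx++) {
            for (int r = 0; r < BLOCK_HEIGHT; r++)
                for (int c = 0; c < BLOCK_WIDTH; c++)
                    blk[r][c] = frame_buffer[OFFSET
                        + (by * BLOCK_HEIGHT + r) * IMG_WIDTH
                        + (bx * BLOCK_WIDTH + c)];
            encode_block(blk);
        }
}
```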
- FIGS. 26 a and 26 b are flowcharts illustrating an embodiment of buffer read procedure for reading one sensed image from a frame buffer when the image is to be rotated during reading.
- step S 2623 it is determined whether the variable block_width_count is equal to a constant “BLOCK_WIDTH” minus one, where the constant “BLOCK_WIDTH” indicates a block width. If so, the process completes one row of a block and proceeds to step S 2631 , otherwise, to step S 2625 .
- step S 2625 the variable j is decreased by one, and the variable block_width_count is increased by one.
- step S 2631 it is determined whether the variable block_height_count is equal to a constant “BLOCK_HEIGHT” minus one, where the constant “BLOCK_HEIGHT” indicates a block height.
- step S 2633 the variable i is increased by one, the variable j is set to the variable next_j, the variable block_height_count is increased by one, and the variable block_width_count is set to zero.
- step S 2641 it is determined whether the value of SENSOR_HEIGHT×j+i+1 is a value between one and the constant “SENSOR_WIDTH”. If so, the process completes all rows of a slice and proceeds to step S 2651 , otherwise, to step S 2643 .
- step S 2643 the variable i is set to the variable next_i, the variable next_j is set to the variable j minus one, the variable j is decreased by one, the variables block_height_count and block_width_count are set to zero, and the variable block_count is increased by one.
- step S 2651 it is determined whether the value of SENSOR_HEIGHT×j+i+1 is equal to the constant “SENSOR_WIDTH”. If so, the process ends to complete a sensed image, otherwise, to step S 2653 .
- step S 2653 the variable i is increased by one, the variable next_i is set to the variable i plus one, the variables j and next_j are set to 239, the variables block_width_count and block_height_count are set to zeros, and the variable block_count is increased by one.
- step S 2645 the newly acquired block denoted as block[block_count] is encoded. Note that the sequence of steps S 2611 to S 2653 is only provided for improved understanding. Those skilled in the art may arrange the functions of steps S 2611 to S 2653 in parallel hardware circuits without departing from the scope and spirit of the buffer read procedure in order to improve encoding efficiency.
- FIG. 28 is a diagram of an embodiment of a pipeline for video encoding.
- the pipeline for video encoding comprises the processor 1305 , memory 1306 , image sensor 1309 , video encoder 1312 , and a frame buffer 1930 preferably resident in the memory 1306 .
- the processor 1305 receives color values of pixels along the described scanning process from the image sensor 1309 and writes the received color values of each pixel to the frame buffer 1930 with reference to one buffer write procedure implemented in program codes according to a determined orientation type. Details of buffer write procedure may follow the descriptions of FIGS. 20 and 22 .
- the sensed image may be rotated and stored in the frame buffer 1930 .
- the video encoder 1312 subsequently acquires an image by serially reading color values from the frame buffer 1930 , and encodes the acquired image into a video bitstream by performing MPEG, H.26x encoding methods, or similar.
- FIG. 29 is a diagram of an embodiment of a pipeline for video encoding.
- the pipeline for video encoding comprises the processor 1305 , memory 1306 , image sensor 1309 , video encoder 1312 , the image DMA controller 1910 preferably resident on the sensor controller and image processor 1310 , and the frame buffer 1930 preferably resident in the memory 1306 .
- the image DMA controller 1910 receives color values of pixels along the described scanning process from the image sensor 1309 and writes the received color values of each pixel to the frame buffer 1930 from the beginning to the end.
- the processor 1305 acquires an image by reading color values from the frame buffer 1930 with reference to one buffer read procedure implemented in program codes according to a determined orientation type. Details of the buffer read procedure may follow the descriptions of FIGS. 25 a to 26 b , except that steps S 2545 and S 2645 are updated to output the newly acquired block, denoted as block[block_count], to the video encoder 1312 .
- thereby, the sensed image may be rotated and output to the video encoder 1312 . Thereafter, the video encoder 1312 encodes the acquired image into a video bitstream by performing MPEG, H.26x encoding methods, or similar.
- Methods for capturing and displaying images of objects may take the form of program codes (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMS, hard drives, or any other machine-readable storage medium, wherein, when the program codes are loaded into and executed by a machine, such as a mobile phone, a computer, a DVD recorder or similar, the machine becomes an apparatus for practicing the invention.
- the disclosed methods may also be embodied in the form of program codes transmitted over some transmission medium, such as electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program codes are received, loaded into and executed by a machine, such as a mobile phone or a computer, the machine becomes an apparatus for practicing the invention.
- when implemented on a general-purpose processor, the program codes combine with the processor to provide a unique apparatus that operates analogously to specific logic circuits.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Studio Devices (AREA)
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/736,655 US20080002963A1 (en) | 2006-06-28 | 2007-04-18 | Systems and methods for capturing images of objects |
| DE102007029630A DE102007029630A1 (de) | 2006-06-28 | 2007-06-26 | Systeme und Verfahren zum Erfassen von Objektbildern |
| CN2013101494019A CN103209276A (zh) | 2006-06-28 | 2007-06-28 | 产生图像物件的方法及系统 |
| TW096123443A TWI333783B (en) | 2006-06-28 | 2007-06-28 | Methods and systems for generating an image object |
| US13/043,753 US8648931B2 (en) | 2006-06-28 | 2011-03-09 | Systems and methods for capturing images of objects |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US80601106P | 2006-06-28 | 2006-06-28 | |
| US11/736,655 US20080002963A1 (en) | 2006-06-28 | 2007-04-18 | Systems and methods for capturing images of objects |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/043,753 Division US8648931B2 (en) | 2006-06-28 | 2011-03-09 | Systems and methods for capturing images of objects |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20080002963A1 true US20080002963A1 (en) | 2008-01-03 |
Family
ID=38777193
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/736,655 Abandoned US20080002963A1 (en) | 2006-06-28 | 2007-04-18 | Systems and methods for capturing images of objects |
| US13/043,753 Active 2027-09-25 US8648931B2 (en) | 2006-06-28 | 2011-03-09 | Systems and methods for capturing images of objects |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/043,753 Active 2027-09-25 US8648931B2 (en) | 2006-06-28 | 2011-03-09 | Systems and methods for capturing images of objects |
Country Status (4)
| Country | Link |
|---|---|
| US (2) | US20080002963A1 (zh) |
| CN (1) | CN103209276A (zh) |
| DE (1) | DE102007029630A1 (zh) |
| TW (1) | TWI333783B (zh) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090015703A1 (en) * | 2007-07-11 | 2009-01-15 | Lg Electronics Inc. | Portable terminal having touch sensing based image capture function and image capture method therefor |
| US20110157421A1 (en) * | 2006-06-28 | 2011-06-30 | Mediatek Inc. | Systems and Methods for Capturing Images of Objects |
| US20110298940A1 (en) * | 2010-06-07 | 2011-12-08 | Samsung Electronics Co., Ltd. | Method and apparatus for operating camera function in portable terminal |
| US20110300903A1 (en) * | 2009-02-24 | 2011-12-08 | Kyocera Corporation | Portable electronic device and control method therefor |
| CN102918560A (zh) * | 2010-06-28 | 2013-02-06 | 英特尔公司 | 图像信号处理器复用 |
| US20140118567A1 (en) * | 2012-11-01 | 2014-05-01 | Samsung Techwin Co., Ltd. | Method of and system for detecting motion in real time |
| WO2014086357A1 (en) * | 2012-12-05 | 2014-06-12 | Aspekt R&D A/S | Photo survey |
| EP3104592A1 (en) * | 2015-06-08 | 2016-12-14 | Jin Wook Rim | Method for providing user interface in user terminal including camera |
| EP3459797A1 (de) | 2015-09-24 | 2019-03-27 | Autoliv Development AB | Fahrzeugsensitiver sensor mit mehrteiliger sensormasse |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012121802A1 (en) | 2011-03-10 | 2012-09-13 | Vidyo, Inc. | Render-orientation information in video bitstream |
| TWI496090B (zh) | 2012-09-05 | 2015-08-11 | Ind Tech Res Inst | 使用深度影像的物件定位方法與裝置 |
| JP6071543B2 (ja) * | 2012-12-27 | 2017-02-01 | キヤノン株式会社 | 電子機器及び電子機器の制御方法 |
| US9445031B2 (en) * | 2014-01-02 | 2016-09-13 | Matt Sandy | Article of clothing |
| JP6659148B2 (ja) * | 2016-02-03 | 2020-03-04 | キヤノン株式会社 | 表示制御装置及びその制御方法、プログラム、並びに記憶媒体 |
| US11438509B2 (en) * | 2019-03-29 | 2022-09-06 | Canon Kabushiki Kaisha | Imaging apparatus configured to record orientation of the imaging apparatus when an image is captured |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6262769B1 (en) * | 1997-07-31 | 2001-07-17 | Flashpoint Technology, Inc. | Method and system for auto rotating a graphical user interface for managing portrait and landscape images in an image capture unit |
| US6473123B1 (en) * | 1997-08-21 | 2002-10-29 | Flash Point Technology, Inc. | Method and system for organizing DMA transfers to support image rotation |
| US6597817B1 (en) * | 1997-07-15 | 2003-07-22 | Silverbrook Research Pty Ltd | Orientation detection for digital cameras |
| US20040185878A1 (en) * | 2003-01-30 | 2004-09-23 | Jung-Oh Woo | Device and method for displaying pictures in wireless mobile terminal |
| US20050168583A1 (en) * | 2002-04-16 | 2005-08-04 | Thomason Graham G. | Image rotation correction for video or photographic equipment |
| US7054552B2 (en) * | 2004-06-25 | 2006-05-30 | Nokia Corporation | Vertical and horizontal pictures taken without camera rotation |
| US20060221351A1 (en) * | 2005-03-29 | 2006-10-05 | Dahai Yu | Handheld metrology imaging system and method |
| US7532235B2 (en) * | 2003-10-27 | 2009-05-12 | Fujifilm Corporation | Photographic apparatus |
| US7554578B2 (en) * | 2000-07-11 | 2009-06-30 | Phase One A/S | Digital camera with integrated accelerometers |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6192257B1 (en) * | 1998-03-31 | 2001-02-20 | Lucent Technologies Inc. | Wireless communication terminal having video image capability |
| US6148149A (en) * | 1998-05-26 | 2000-11-14 | Microsoft Corporation | Automatic image rotation in digital cameras |
| JP2001054084A (ja) * | 1999-08-09 | 2001-02-23 | Matsushita Electric Ind Co Ltd | Videophone apparatus |
| JP3671883B2 (ja) | 2001-08-15 | 2005-07-13 | Sony Corporation | Image recording and reproducing apparatus |
| US6842652B2 (en) * | 2002-02-22 | 2005-01-11 | Concord Camera Corp. | Image capture device |
| JP2004145291A (ja) * | 2002-10-03 | 2004-05-20 | Casio Comput Co Ltd | Image display device, image display method, and program |
| JP4053444B2 (ja) * | 2003-03-07 | 2008-02-27 | Sharp Corporation | Portable multifunction electronic device |
| JP5093968B2 (ja) * | 2003-10-15 | 2012-12-12 | Olympus Corporation | Camera |
| US20050104976A1 (en) * | 2003-11-17 | 2005-05-19 | Kevin Currans | System and method for applying inference information to digital camera metadata to identify digital picture content |
| US20060033819A1 (en) * | 2004-08-12 | 2006-02-16 | Sony Corporation | Method and apparatus for automatic orientation correction of digital photographs |
| US8031775B2 (en) * | 2006-02-03 | 2011-10-04 | Eastman Kodak Company | Analyzing camera captured video for key frames |
| US20080002963A1 (en) * | 2006-06-28 | 2008-01-03 | MediaTek Inc. | Systems and methods for capturing images of objects |
- 2007
  - 2007-04-18 US US11/736,655 patent/US20080002963A1/en not_active Abandoned
  - 2007-06-26 DE DE102007029630A patent/DE102007029630A1/de not_active Ceased
  - 2007-06-28 CN CN2013101494019A patent/CN103209276A/zh active Pending
  - 2007-06-28 TW TW096123443A patent/TWI333783B/zh not_active IP Right Cessation
- 2011
  - 2011-03-09 US US13/043,753 patent/US8648931B2/en active Active
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6597817B1 (en) * | 1997-07-15 | 2003-07-22 | Silverbrook Research Pty Ltd | Orientation detection for digital cameras |
| US6262769B1 (en) * | 1997-07-31 | 2001-07-17 | Flashpoint Technology, Inc. | Method and system for auto rotating a graphical user interface for managing portrait and landscape images in an image capture unit |
| US6473123B1 (en) * | 1997-08-21 | 2002-10-29 | Flash Point Technology, Inc. | Method and system for organizing DMA transfers to support image rotation |
| US7554578B2 (en) * | 2000-07-11 | 2009-06-30 | Phase One A/S | Digital camera with integrated accelerometers |
| US20050168583A1 (en) * | 2002-04-16 | 2005-08-04 | Thomason Graham G. | Image rotation correction for video or photographic equipment |
| US20040185878A1 (en) * | 2003-01-30 | 2004-09-23 | Jung-Oh Woo | Device and method for displaying pictures in wireless mobile terminal |
| US7532235B2 (en) * | 2003-10-27 | 2009-05-12 | Fujifilm Corporation | Photographic apparatus |
| US7054552B2 (en) * | 2004-06-25 | 2006-05-30 | Nokia Corporation | Vertical and horizontal pictures taken without camera rotation |
| US20060221351A1 (en) * | 2005-03-29 | 2006-10-05 | Dahai Yu | Handheld metrology imaging system and method |
Cited By (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110157421A1 (en) * | 2006-06-28 | 2011-06-30 | Mediatek Inc. | Systems and Methods for Capturing Images of Objects |
| US8648931B2 (en) * | 2006-06-28 | 2014-02-11 | Mediatek Inc. | Systems and methods for capturing images of objects |
| US8203640B2 (en) * | 2007-07-11 | 2012-06-19 | Lg Electronics Inc. | Portable terminal having touch sensing based image capture function and image capture method therefor |
| US20090015703A1 (en) * | 2007-07-11 | 2009-01-15 | Lg Electronics Inc. | Portable terminal having touch sensing based image capture function and image capture method therefor |
| US8676250B2 (en) * | 2009-02-24 | 2014-03-18 | Kyocera Corporation | Portable electronic device and control method therefor |
| US20110300903A1 (en) * | 2009-02-24 | 2011-12-08 | Kyocera Corporation | Portable electronic device and control method therefor |
| US8625020B2 (en) * | 2010-06-07 | 2014-01-07 | Samsung Electronics Co., Ltd. | Method and apparatus for operating camera function in portable terminal |
| US20110298940A1 (en) * | 2010-06-07 | 2011-12-08 | Samsung Electronics Co., Ltd. | Method and apparatus for operating camera function in portable terminal |
| US9485421B2 (en) | 2010-06-07 | 2016-11-01 | Samsung Electronics Co., Ltd. | Method and apparatus for operating camera function in portable terminal |
| US9712745B2 (en) | 2010-06-07 | 2017-07-18 | Samsung Electronics Co., Ltd. | Method and apparatus for operating camera function in portable terminal |
| CN102918560A (zh) * | 2010-06-28 | 2013-02-06 | Intel Corporation | Image signal processor multiplexing |
| US20140118567A1 (en) * | 2012-11-01 | 2014-05-01 | Samsung Techwin Co., Ltd. | Method of and system for detecting motion in real time |
| US9338353B2 (en) * | 2012-11-01 | 2016-05-10 | Hanwha Techwin Co., Ltd. | Method of and system for detecting motion in real time |
| WO2014086357A1 (en) * | 2012-12-05 | 2014-06-12 | Aspekt R&D A/S | Photo survey |
| US9565334B2 (en) | 2012-12-05 | 2017-02-07 | Aspekt R&D A/S | Photo survey using smart device with camera |
| EP3104592A1 (en) * | 2015-06-08 | 2016-12-14 | Jin Wook Rim | Method for providing user interface in user terminal including camera |
| EP3459797A1 (de) | 2015-09-24 | 2019-03-27 | Autoliv Development AB | Vehicle-sensitive sensor with multi-part sensor mass |
Also Published As
| Publication number | Publication date |
|---|---|
| US20110157421A1 (en) | 2011-06-30 |
| DE102007029630A1 (de) | 2008-01-03 |
| TWI333783B (en) | 2010-11-21 |
| CN103209276A (zh) | 2013-07-17 |
| TW200803477A (en) | 2008-01-01 |
| US8648931B2 (en) | 2014-02-11 |
Similar Documents
| Publication | Title |
|---|---|
| US8648931B2 (en) | Systems and methods for capturing images of objects |
| US20090066693A1 (en) | Encoding A Depth Map Into An Image Using Analysis Of Two Consecutive Captured Frames |
| US8570335B2 (en) | Mobile device and method for displaying thumbnails on the mobile device |
| CN115315954A (zh) | Tile-based video coding for machines |
| US20070188624A1 (en) | Image capturing method and image-capturing device thereof |
| US20090041363A1 (en) | Image Processing Apparatus For Reducing JPEG Image Capturing Time And JPEG Image Capturing Method Performed By Using Same |
| US8194145B2 (en) | Method for resizing image in wireless terminal and wireless terminal adapted for resizing |
| US20130093913A1 (en) | Information processing apparatus, display method, and information processing system |
| JP7100493B2 (ja) | Display control device, control method therefor, and program |
| US8081228B2 (en) | Apparatus and method for processing image data |
| KR100639109B1 (ko) | Apparatus for generating a thumbnail JPEG image, generation method thereof, and storage medium therefor |
| US20100007678A1 (en) | Handheld electrical communication device and image processing method thereof |
| KR20100077940A (ko) | Digital image processing apparatus and method |
| US7545416B2 (en) | Image processing device and camera including CPU which determines whether processing performed using external memory |
| US8797424B2 (en) | Image processing apparatus for reading compressed data from memory via data bus and image processing method performed in the image processing apparatus |
| CN101719985B (zh) | Method and device for image capture and processing |
| CN115472140B (zh) | Display method, display apparatus, electronic device, and readable storage medium |
| CN102780842A (zh) | Handheld electronic device and dual image capture method applicable thereto |
| JP4870563B2 (ja) | Image processing method and apparatus in a portable device |
| CN1863297A (zh) | Method for displaying image data on a portable terminal |
| US7944510B2 (en) | Broadcast receiving apparatus for capturing broadcast signal and method thereof |
| CN101098401A (zh) | Method and system for generating an image object |
| US8154643B2 (en) | Image pick-up apparatus, an image processing apparatus and an image processing method, for displaying image data on an external display with appropriate color space conversion based on resolution of image data and external display |
| US20070046792A1 (en) | Image compositing |
| KR20040068635A (ko) | System and method for image rotation of an electronic picture frame for TV |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: MEDIATEK INC., TAIWAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: CHUANG, CHENG TE; CHANG, YU-CHUNG; CHEN, CHENG-CHE; SIGNING DATES FROM 20060818 TO 20060821; REEL/FRAME: 019176/0403 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| | AS | Assignment | Owner name: XUESHAN TECHNOLOGIES INC., CANADA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MEDIATEK INC.; REEL/FRAME: 056593/0167. Effective date: 20201223 |
| | AS | Assignment | Owner name: TAIWAN SEMICONDUCTOR MANUFACTURING COMPANY, LTD., TAIWAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: XUESHAN TECHNOLOGIES INC.; REEL/FRAME: 056714/0487. Effective date: 20210331 |