US20130100026A1 - Proximity Screen Display and User Interface - Google Patents
- Publication number
- US20130100026A1 (U.S. application Ser. No. 13/655,910)
- Authority
- US
- United States
- Prior art keywords
- screen display
- proximity screen
- proximity
- elements
- imaging elements
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
Definitions
- the present disclosure relates generally to a proximity screen display for use in a communication device, and more specifically to integration of integrated imaging elements within the proximity screen display for use in adjusting various parameters of video data and/or image data that is being displayed by a display area of the proximity screen display.
- a conventional communication device includes a conventional touch screen display, such as resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition touch screen display to provide some examples, which operates as an interface between the communication device and a user of the communication device.
- the conventional touch screen display operates as an output device for displaying image and/or video data for the user.
- the conventional touch screen display operates as an input device for receiving command data, control data, and/or other data from the user of the communication device.
- the conventional touch screen display includes an integrated virtual keyboard, also referred to as an on-screen integrated virtual keyboard, for receiving the command data, control data, and/or other data from the user of the communication device.
- the continued evolution of silicon semiconductor fabrication technologies has reduced a size of the conventional communication device as well as a size of the conventional touch screen display and its integrated virtual keyboard.
- the alphanumeric keys of the integrated virtual keyboard have also decreased in size, thereby making use of the integrated virtual keyboard more difficult.
- users with larger hands can have difficulty in selecting from among the alphanumeric keys leading to erroneous keys being selected.
- the conventional communication device is not properly oriented when the integrated virtual keyboard is in use leading to erroneous keys being selected.
- a user of the conventional communication device often times holds the conventional communication device at an angle which leads to erroneous keys being selected.
- FIG. 1 illustrates a block diagram of an exemplary communication device according to an exemplary embodiment of the present disclosure
- FIG. 2A illustrates a first block diagram of an exemplary configuration and arrangement of perimeter imaging elements surrounding a screen to form a proximity screen display of the communication device according to an exemplary embodiment of the present disclosure
- FIG. 2B illustrates a second block diagram of an exemplary configuration and arrangement of integrated imaging elements within the communication device according to an exemplary embodiment of the present disclosure
- FIG. 3A illustrates a first exemplary integration of the imaging elements within a screen of the communication device according to an exemplary embodiment of the present disclosure
- FIG. 3B illustrates a second exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure
- FIG. 3C illustrates a third exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure
- FIG. 4 illustrates an integrated imaging element within the communication device according to an exemplary embodiment of the present disclosure
- FIG. 5A illustrates a single pixel element of a flexible organic light-emitting diode screen display according to an exemplary embodiment of the present disclosure
- FIG. 5B illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure
- FIG. 6A illustrates a single pixel element of an electronic paper, e-paper, or electronic ink screen display according to an exemplary embodiment of the present disclosure
- FIG. 6B illustrates a single pixel element of an electronic paper, e-paper, or electronic ink proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure
- FIG. 7A illustrates a single pixel element of a liquid crystal screen display according to an exemplary embodiment of the present disclosure
- FIG. 7B illustrates a single pixel element of a liquid crystal proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure
- FIG. 8 illustrates an exemplary proximity screen display and proximity screen display interface that can be implemented within the communication device according to an exemplary embodiment of the present disclosure
- FIG. 9 is a first flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure.
- FIG. 10 is a second flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure.
- Embodiments of the disclosure can be implemented in hardware, firmware, software, or any combination thereof. Embodiments of the disclosure can also be implemented as instructions stored on a machine-readable medium, which can be read and executed by one or more processors.
- a machine-readable medium can include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device).
- a machine-readable medium can include non-transitory machine-readable mediums such as read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; and others.
- the machine-readable medium can include transitory machine-readable medium such as electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.).
- firmware, software, routines, instructions can be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.
- module shall be understood to include at least one of software, firmware, and hardware (such as one or more circuits, microchips, or devices, or any combination thereof), and any combination thereof.
- each module can include one, or more than one, component within an actual device, and each component that forms a part of the described module can function either cooperatively or independently of any other component forming a part of the module.
- multiple modules described herein can represent a single component within an actual device. Further, components within a module can be in a single device or distributed among multiple devices in a wired or wireless manner.
- a communication device that includes a proximity screen display having one or more imaging elements that are configured and arranged around a periphery of a display area of the proximity screen display and/or integrated within the display area.
- the one or more integrated imaging elements are configured and arranged to sense light in their field of view.
- the one or more integrated imaging elements provide one or more various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view.
- the communication device can adjust various parameters of video data and/or image data that is being displayed by the display area in response to the various sensing signals.
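The relationship between the sensing-signal magnitudes and a display-parameter adjustment can be sketched as follows. This is a minimal illustration, not the patent's implementation; the normalization range, function names, and the brightness mapping are all assumptions.

```python
# Hypothetical sketch: adjust a display parameter (here, brightness) from
# imaging-element sensing signals. Magnitudes are assumed normalized to
# 0.0 (dark) .. 1.0 (bright); the 20%..100% output range is illustrative.

def average_light_level(sensor_readings):
    """Average the sensing-signal magnitudes over all imaging elements."""
    return sum(sensor_readings) / len(sensor_readings)

def adjusted_brightness(sensor_readings, min_pct=20, max_pct=100):
    """Map the ambient-light estimate onto a display brightness percentage."""
    level = average_light_level(sensor_readings)
    return min_pct + (max_pct - min_pct) * level

print(adjusted_brightness([0.0, 0.0, 0.0]))  # dark environment -> 20.0
print(adjusted_brightness([1.0, 1.0, 1.0]))  # bright light -> 100.0
```

The same scheme generalizes to other parameters the disclosure mentions (zoom, resolution, pitch, roll, yaw) by substituting a different mapping function.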
- FIG. 1 illustrates a block diagram of an exemplary communication device according to an exemplary embodiment of the present disclosure.
- a communication device 100 communicates information, such as audio data, video data, image data, command data, control data and/or other data to provide some examples, between a near-end user and a far-end user over various wired and/or wireless communication networks.
- the communication device 100 can represent a mobile communication device, such as a cellular phone or a smartphone, a mobile computing device, such as a tablet computer or a laptop computer, or any other electronic device that is capable of communicating information over communication networks that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
- the communication device 100 can communicate information that is received from the far-end user, as well as information that is generated by the communication device 100 , to the near-end user using a proximity screen display. Additionally, the near-end user can communicate information to the far-end user, as well as information to the communication device 100 , using the proximity screen display.
- the communication device 100 includes a communication module 102 , a host processor 104 , a proximity screen display 106 , a proximity screen display interface 108 , and a communication interface 110 .
- the communication module 102 can include a Bluetooth module, a Global Positioning System (GPS) module, a cellular module, a wireless local area network (WLAN) module, a near field communication (NFC) module, a radio frequency identification (RFID) module and/or a wireless power transfer (WPT) module.
- the Bluetooth module, the cellular module, the WLAN module, the NFC module, and the RFID module provide wireless communication between the communication device 100 and other Bluetooth, other cellular, other WLAN, other NFC, and other RFID capable communication devices, respectively, in accordance with various communication standards or protocols.
- These various communication standards or protocols can include various cellular communication standards such as a Third Generation Partnership Project (3GPP) Long Term Evolution (LTE) communications standard, a fourth generation (4G) mobile communications standard, or a third generation (3G) mobile communications standard, various networking protocols such as a Worldwide Interoperability for Microwave Access (WiMAX) communications standard or a Wi-Fi communications standard, and various NFC/RFID communications protocols such as ISO 1422, ISO/IEC 14443, ISO/IEC 15693, ISO/IEC 18000, or FeliCa to provide some examples.
- the GPS module receives various signals from various satellites to determine location information for the communication device 100 .
- the WPT module supports wireless transmission of power between the communication device 100 and another WPT capable communication device.
- the host processor 104 controls overall operation and/or configuration of the communication device 100 .
- the host processor 104 can receive and/or process information from a user interface such as an alphanumeric keypad, a microphone, a mouse, a speaker, and/or from other electrical devices or host devices that are coupled to the communication device 100 .
- the host processor 104 can provide this information to the communication module 102 and/or the proximity screen display interface 108 .
- the host processor 104 can receive and/or process information from the communication module 102 and/or the proximity screen display interface 108 .
- the host processor 104 can provide this information to the user interface, to other electrical devices or host devices, and/or to the communication module 102 and/or the proximity screen display interface 108 .
- the host processor 104 can execute one or more applications such as Short Message Service (SMS) for text messaging, electronic mailing, and/or audio and/or video recording to provide some examples, and/or software applications such as a calendar and/or a phone book to provide some examples.
- the proximity screen display 106 represents an electronic visual display that can detect a presence and/or a location of a touch that is proximate to its display area.
- the proximity screen display 106 includes a display area to provide information from the proximity screen display interface 108 to the near-end user. Additionally, the proximity screen display 106 includes one or more integrated imaging elements that are integrated within and/or approximate to the display area to provide information from the near-end user to the proximity screen display interface 108 .
- the one or more integrated imaging elements are configured and arranged to detect a presence and/or a location of a touch from the near-end user.
- the touch can represent a physical touching of the display area by the near-end user and/or by other passive objects available to the near-end user, such as a stylus to provide an example or proximity of the near-end user and/or the other passive objects to the display area.
- the proximity screen display interface 108 can be configured via user setup and/or via an application programming interface (API) to respond to certain objects and classes of objects, e.g., a finger, fingers, hands, a stylus, a pencil, etc.
- any object within a user's grasp can be held in front of the proximity screen display 106 for imaging and thereafter can be used as the primary means of interacting through the proximity screen display interface 108 .
- This setup can be for all applications and operating system interactions on the communication device 100 , or it can apply to a single application or to only the operating system.
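The per-scope configuration described above (all applications, a single application, or only the operating system) can be sketched as a small registry. This is an illustrative data structure only; the class, method names, and scope labels are invented, not the patent's API.

```python
# Hypothetical sketch of a per-scope registry of accepted input-object
# classes. Scope names ("sketch_app", GLOBAL_SCOPE) are illustrative.

GLOBAL_SCOPE = "*"  # applies to all applications and the operating system

class InputObjectConfig:
    def __init__(self):
        # scope (application name, "os", or GLOBAL_SCOPE) -> accepted classes
        self._accepted = {}

    def allow(self, scope, object_class):
        self._accepted.setdefault(scope, set()).add(object_class)

    def is_accepted(self, scope, object_class):
        """An object is accepted if its class is enabled for the given
        scope or enabled globally."""
        return (object_class in self._accepted.get(scope, set())
                or object_class in self._accepted.get(GLOBAL_SCOPE, set()))

config = InputObjectConfig()
config.allow(GLOBAL_SCOPE, "finger")   # fingers accepted everywhere
config.allow("sketch_app", "stylus")   # stylus only in one application

print(config.is_accepted("sketch_app", "stylus"))  # True
print(config.is_accepted("mail_app", "stylus"))    # False
print(config.is_accepted("mail_app", "finger"))    # True (global)
```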
- a user wearing gloves can easily train the proximity screen display interface 108 to recognize certain gloved hand and finger movements as the input via a similar setup process. Thereafter, any time the proximity screen display interface 108 recognizes a trained or default input element, a pop-up window can appear on the proximity screen display 106 prompting the user to accept an automatic input device setup, so that the setup can be applied without retraining.
- while full contact of a finger, hand, gloved hand, stylus, or other input object with the surface of the proximity screen display 106 may be required, it need not be.
- a particular input element can be defined to have a working range. For example, full contact can be required for a pointer finger arrangement in one application. In another application, and perhaps for a stylus, passing within two (2) centimeters of the screen surface within a defined speed range can be characterized as contact. Similarly, a double-click tapping motion carried out within ten (10) centimeters of the screen surface, even without actual contact, can be recognized and applied as a full-contact double click.
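The working-range rules in the example above can be expressed as a small classifier. The thresholds (2 cm, 10 cm) come from the text; the element names, speed range, and function signature are assumptions for illustration.

```python
# Hypothetical sketch of per-input-element working ranges: full contact for
# a pointer finger, near-contact within 2 cm at a defined speed for a
# stylus, and a double tap within 10 cm treated as a full-contact double
# click. The 1..50 cm/s speed range is invented for the example.

def classify_event(element, distance_cm, speed_cm_s, motion):
    if element == "finger":
        # This application requires actual contact for a pointer finger.
        return "click" if distance_cm == 0.0 and motion == "tap" else None
    if element == "stylus":
        if motion == "tap" and distance_cm <= 2.0 and 1.0 <= speed_cm_s <= 50.0:
            return "click"          # near-contact tap characterized as contact
        if motion == "double_tap" and distance_cm <= 10.0:
            return "double_click"   # no actual contact needed
    return None

print(classify_event("stylus", 1.5, 10.0, "tap"))         # click
print(classify_event("stylus", 8.0, 10.0, "double_tap"))  # double_click
print(classify_event("finger", 1.5, 10.0, "tap"))         # None
```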
- a full set of default behaviors for a default set of input types and classes may be preloaded. Adjustments to the underlying settings, whether preloaded or not, can be made over time based on actual interactions. For example, a first attempt at a particular input element motion that is slightly outside a working range, or slightly off from what has been defined, may not be recognized; a re-attempt, though, may be. Once the first attempt is effectively recognized via the subsequent re-attempt, modifications and adjustments can be made so that subsequent interactions similar to the first attempt will be recognized.
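One way to realize the miss-then-recognize adjustment above is to widen the working range toward the missed attempt once an immediate re-attempt succeeds. The class, the growth step, and the distances below are all illustrative assumptions.

```python
# Hypothetical sketch: a working range that widens when a first attempt
# just outside the range fails but the re-attempt succeeds, so that
# similar future attempts are recognized. All values are illustrative.

class WorkingRange:
    def __init__(self, max_distance_cm, grow_step_cm=0.5):
        self.max_distance_cm = max_distance_cm
        self.grow_step_cm = grow_step_cm
        self._last_missed = None

    def attempt(self, distance_cm):
        recognized = distance_cm <= self.max_distance_cm
        if recognized and self._last_missed is not None:
            # Miss followed by a recognized re-attempt: widen the range
            # toward the missed distance, bounded by the growth step.
            self.max_distance_cm = min(
                self._last_missed, self.max_distance_cm + self.grow_step_cm)
            self._last_missed = None
        elif not recognized:
            self._last_missed = distance_cm
        return recognized

r = WorkingRange(max_distance_cm=2.0)
print(r.attempt(2.3))      # False: slightly outside the working range
print(r.attempt(1.8))      # True: re-attempt recognized, range widens
print(r.max_distance_cm)   # 2.3
print(r.attempt(2.3))      # True: the original motion is now recognized
```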
- touch typing motions can be recognized such as (i) movement of the fingers away from and toward the screen (change in size indicating distance away), (ii) changes in finger shape (which indicate a pressure associated with a keystroke), (iii) lighting characteristics and shading, (iv) movement velocities/accelerations, (v) relocation, and (vi) rates of change of all the above.
- This touch typing mode can be activated when a user decides to begin typing by merely bringing their finger set in a typing configuration toward the screen. By the time the fingers reach the screen, a keypad will be configured to fit appropriately thereunder without forcing the user to find finger to key placement. Thus, a user can begin typing without looking at the keys to make sure the fingers are maintaining their alignment. Instead, the keys will automatically be aligned, sized and positioned to fit the user. With this configuration, small or large hands with a more natural finger positioning can be easily accommodated.
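The automatic alignment, sizing, and positioning of the keys under the approaching fingers can be sketched geometrically: detect the eight home-row fingertip positions and center the home-row keys beneath them. Key names, the pixel coordinates, and the 0.9 sizing factor are assumptions, not the patent's method.

```python
# Hypothetical sketch: fit home-row keys directly under detected fingertip
# locations so keys align to the user's hands rather than vice versa.
# Coordinates are screen pixels; the sizing rule is illustrative.

HOME_KEYS = ["a", "s", "d", "f", "j", "k", "l", ";"]

def fit_home_row(fingertips):
    """fingertips: eight (x, y) points, ordered left pinky .. right pinky.
    Returns a mapping key -> (x, y, key_width)."""
    xs = sorted(x for x, _ in fingertips)
    # Derive key width from the average horizontal finger spacing, so
    # small and large hands both get a comfortable key pitch.
    pitch = (xs[-1] - xs[0]) / (len(xs) - 1)
    return {key: (x, y, pitch * 0.9)
            for key, (x, y) in zip(HOME_KEYS, fingertips)}

layout = fit_home_row([(100, 400), (160, 395), (220, 390), (280, 392),
                       (420, 392), (480, 390), (540, 395), (600, 400)])
print(layout["f"])  # the "f" key centered under the left index finger
```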
- the touch typing mode may also be activated by a software application, user interaction with a field that requires typing input, or by any other gesture input (whether full contact or not) and via voice recognized commands.
- thumb typing may be represented by one or more thumb typing modes. Some of these modes might be tailored to interact using a more traditional full contact mode with rather fixed key offerings. Other thumb typing modes may take advantage of the non-contact and user-specific-tailoring aspects of the present invention. For example, thumb typing with or without gloves, small or big thumbs, short or long thumbs, short thumb range in x, y, z directions or regions can all be taken into account in adjusting and tailoring an effective interface for a particular user.
- an elevated, non-contact typing mode, i.e., a “hovering non-contact typing mode,” may be provided which recognizes fingers with non-contact typing motions.
- a hovering contact typing mode may also be selected wherein a finger needs full contact to be recognized as a key depression. No matter what typing mode is selected though, the key recognition range and associated finger behaviors can be accounted for dynamically to support a given user and user's situation.
- Spelling and grammar tools running within underlying software (operating system or application) on the host processor 104 (or any other dedicated processing circuitry, perhaps within the proximity screen display interface 108 ) assist in such dynamic tailoring no matter what the typing mode happens to be. For example, while in a hovering contact typing mode, spelling and grammatical mistakes can be recognized for stored finger motions. Some users often strike an “a” instead of a “q” after typing a “w.” Likewise, some users may hold their fingers in a hovering and striking position somewhat off of a normal horizontal alignment, leading to other typing errors. Other finger motions might not be identified because the finger range of motion may be impaired for yet another user. By analyzing mistakes over time, as identified with spelling and grammar tools and unsolicited user corrections, along with evaluating both the strokes and hand positions associated with the mistakes and the spatial characteristics of a keyboard layout, a comfortable, effective, and accurate typing input interface can be established for each user.
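The correction-driven tailoring described above (e.g., the "a"-for-"q"-after-"w" confusion) can be sketched as a per-user confusion model that learns from observed corrections. The data structure, threshold, and method names are invented for illustration.

```python
# Hypothetical sketch: learn per-user key confusions from spelling
# corrections and bias future ambiguous strikes. Threshold is illustrative.

from collections import Counter

class ConfusionModel:
    def __init__(self, threshold=3):
        self.threshold = threshold
        # (previous_key, struck_key) -> counts of the keys corrected to
        self.corrections = {}

    def record_correction(self, prev_key, struck, corrected_to):
        self.corrections.setdefault((prev_key, struck),
                                    Counter())[corrected_to] += 1

    def resolve(self, prev_key, struck):
        """Return the key the user most likely intended, once a confusion
        has been observed often enough; otherwise keep the struck key."""
        counts = self.corrections.get((prev_key, struck))
        if counts:
            intended, n = counts.most_common(1)[0]
            if n >= self.threshold:
                return intended
        return struck

model = ConfusionModel()
for _ in range(3):
    model.record_correction("w", "a", "q")  # user repeatedly fixes "wa"->"wq"
print(model.resolve("w", "a"))  # q : learned confusion applied
print(model.resolve("x", "a"))  # a : no confusion recorded in this context
```

A fuller version would also shift the key hit regions spatially, per the keyboard-layout evaluation the text describes.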
- gestures such as placing all fingertips together then opening the hand to reveal a palm might be recognized to perform a function such as returning to a desktop.
- Mapping of such and any other gestures, whether involving some aspect of contact or not, to a particular function on the communication device 100 can be managed via default offerings and in a training based setup fashion.
- Such gestures can be made with any object or body part and can be combined (“married”) with other forms of user input.
- An example of such a marriage might be a gesture plus a detected voice command (carried out simultaneously or in sequence) that is used to trigger performance of a function.
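A gesture-to-function mapping, including the gesture-plus-voice pairing just described, can be sketched as a lookup table. Gesture names, voice commands, and function labels are all illustrative assumptions.

```python
# Hypothetical sketch: map gestures, optionally paired with a voice
# command, onto device functions. All names are illustrative defaults of
# the kind the text says can also be set up via training.

GESTURE_MAP = {
    # gesture alone
    ("open_palm", None): "return_to_desktop",
    # gesture "married" to a voice command
    ("two_finger_circle", "send"): "send_message",
}

def dispatch(gesture, voice_command=None):
    action = GESTURE_MAP.get((gesture, voice_command))
    if action is None:
        # fall back to a gesture-only binding if one exists
        action = GESTURE_MAP.get((gesture, None))
    return action

print(dispatch("open_palm"))                  # return_to_desktop
print(dispatch("two_finger_circle", "send"))  # send_message
print(dispatch("two_finger_circle"))          # None: voice command required
```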
- the proximity screen display interface 108 and associated screen may be fitted with full-contact-only touch screen technology.
- both the visual recognition aspects and the full contact detection approaches can be used together in a reinforcing way, or be selected as operational alternatives wherein one may be powered down and the other up per user or application software command.
- fingers and hands might be replaced with a data structure representing each joint in each hand along with related motion and position data.
- fingertip contact might be represented by an estimated contact time, pressure and duration.
- such data can be characterized for a particular user as matching a particular input target.
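The reduced representation described in the preceding items, joints with position and motion data, and fingertip contact reduced to an estimated contact time, pressure, and duration, can be sketched as plain data structures. All field and type names are invented for illustration.

```python
# Hypothetical sketch of the data structures replacing raw finger/hand
# imagery: per-joint position and motion, plus a reduced contact event.

from dataclasses import dataclass, field

@dataclass
class Joint:
    name: str                       # e.g., "index_tip", "index_pip"
    position: tuple                 # (x, y, z) relative to the screen, cm
    velocity: tuple = (0.0, 0.0, 0.0)

@dataclass
class HandModel:
    joints: list = field(default_factory=list)

    def joint(self, name):
        return next(j for j in self.joints if j.name == name)

@dataclass
class ContactEvent:
    contact_time_s: float           # estimated moment of contact
    pressure: float                 # estimated from fingertip deformation
    duration_s: float

hand = HandModel([Joint("index_tip", (1.0, 2.0, 0.4)),
                  Joint("index_pip", (1.0, 3.5, 1.2))])
tap = ContactEvent(contact_time_s=12.50, pressure=0.7, duration_s=0.08)
print(hand.joint("index_tip").position)  # (1.0, 2.0, 0.4)
print(tap.pressure)                      # 0.7
```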
- at perhaps ten (10) frames per second of capture in grey-scale, and with a conversion to a data structure of pivot points, lengths between pivot points (e.g., joints), and so on, such capture may offer sufficient user input resolution, whether or not full contact is made, for some embodiments.
- if the image capture quality of the proximity screen display 106 is high enough, the full screen or portions thereof may be used for capturing images and video that are sufficient for consumption by the human eye.
- in such cases, the imaging capabilities of the screen serve dual purposes (imaging for the eye and imaging for user input interfacing).
- the one or more integrated imaging elements are configured and arranged to sense light in their field of view.
- the one or more integrated imaging elements provide one or more various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view to the proximity screen display interface 108 .
- the sensing signals have a first magnitude when the one or more integrated imaging elements are exposed to bright light, a second magnitude when the one or more integrated imaging elements are exposed to darkness, and a third magnitude that varies between the first magnitude and the second magnitude as the amount of light sensed by the one or more integrated imaging elements varies between bright and dark.
- the proximity screen display interface 108 communicates information between the communication module 102 , the host processor 104 , and the proximity screen display 106 .
- the proximity screen display interface 108 provides various control signals to the proximity screen display 106 for configuration of its display area to display the information.
- the proximity screen display interface 108 can provide various control signals to the proximity screen display 106 for configuration of its display area to display image data or video data received from the communication module 102 and/or the host processor 104 .
- the proximity screen display interface 108 can interpret the various sensing signals provided by the proximity screen display 106 to determine the presence and/or the location of the near-end user and/or the other passive objects.
- the proximity screen display interface 108 can interpolate the magnitudes of the various sensing signals to generate an image of the environment surrounding the display area, and use that image to determine the presence and/or the location of the near-end user and/or the other passive objects.
- the proximity screen display interface 108 can compare various images of the environment surrounding the display area at different instances in time to determine movement of the near-end user and/or the other passive objects.
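The two steps just described, interpolating sensor magnitudes into an image and comparing successive images to detect movement, can be sketched with linear interpolation and frame differencing. Grid sizes, the interpolation factor, and the motion threshold are illustrative assumptions.

```python
# Hypothetical sketch: (1) interpolate a coarse grid of sensing-signal
# magnitudes into a denser "image" of the space in front of the display,
# (2) difference successive images to detect movement of an object.

def interpolate_row(row, factor):
    """Linearly interpolate between adjacent sensor magnitudes in a row."""
    out = []
    for a, b in zip(row, row[1:]):
        out.extend(a + (b - a) * k / factor for k in range(factor))
    out.append(row[-1])
    return out

def frame_difference(prev, curr):
    """Sum of absolute per-cell changes between two interpolated frames."""
    return sum(abs(p - c) for rp, rc in zip(prev, curr)
               for p, c in zip(rp, rc))

frame1 = [interpolate_row(r, 2) for r in [[0.1, 0.1, 0.1], [0.1, 0.1, 0.1]]]
frame2 = [interpolate_row(r, 2) for r in [[0.1, 0.9, 0.1], [0.1, 0.1, 0.1]]]

MOTION_THRESHOLD = 0.5
print(frame_difference(frame1, frame1) > MOTION_THRESHOLD)  # False: static
print(frame_difference(frame1, frame2) > MOTION_THRESHOLD)  # True: movement
```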
- the proximity screen display interface 108 can recognize specific portions of the object, such as one or more fingers of a hand of the near-end user to provide an example, from one or more images of the environment surrounding the display area.
- the proximity screen display interface 108 can assign various control and/or command data to different specific portions of the object and provide the respective control and/or command data to the communication module 102 and/or the host processor 104 when a respective specific portion of the object has been recognized. Yet further, the proximity screen display interface 108 can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the information provided by the communication module 102 and/or the host processor 104 in response to the various sensing signals provided by the proximity screen display 106 .
- the proximity screen display interface 108 can adjust the zoom, resolution, pitch, roll, and/or yaw of image data, such as an image of an integrated virtual keyboard to provide an example, or video data to align the image data or video data with the movement of the near-end user and/or the other passive objects.
- the communication interface 110 routes various communications between the communications module 102 , the host processor 104 , and the proximity screen display interface 108 .
- These communications can include various digital signals, such as one or more commands and/or data to provide some examples, various analog signals, such as direct current (DC) currents and/or voltages to provide some examples, or any combination thereof.
- the communication interface 110 can be implemented as a series of wired and/or wireless interconnections between the communications module 102 , the host processor 104 , and the proximity screen display interface 108 .
- the interconnections of the communication interface 110 can be arranged to form a parallel interface to route communications between the communications module 102 , the host processor 104 , and the proximity screen display interface 108 in parallel, a serial interface to route communications between the communications module 102 , the host processor 104 , and the proximity screen display interface 108 in series, or any combination thereof.
- the one or more integrated imaging elements sense changes in light resulting from the movement of the near-end user and/or the other passive objects in their field of view.
- the exemplary configurations and arrangements of the one or more integrated imaging elements to be discussed below are for illustrative purposes only. Those skilled in the relevant art(s) will recognize that other configurations and arrangements of the one or more integrated imaging elements are possible without departing from the spirit and scope of the present disclosure.
- the various proximity screen displays and associated assemblies described herein can include any suitable number of the integrated imaging elements ranging from a single integrated imaging element up to many thousands of integrated imaging elements. However, even larger numbers of integrated imaging elements are possible as will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
- one or more imaging elements can be positioned outside the perimeter of the screen surface.
- Other embodiments may use imaging elements integrated into the screen itself, while yet others position the imaging elements on top of or behind the screen. No matter what the configuration of a particular embodiment, the resulting proximity screen display and associated processing hardware and software act in concert to provide user output and user input interfacing in a common, spatially mapped interface arrangement.
- FIG. 2A illustrates a first block diagram of an exemplary configuration and arrangement of perimeter imaging elements surrounding a screen to form a proximity screen display of the communication device according to an exemplary embodiment of the present disclosure.
- a proximity screen display 200 includes integrated imaging elements 202 . 1 through 202 . i that are configured and arranged around a periphery of a display area 204 .
- although the integrated imaging elements 202 . 1 through 202 . i are illustrated as being configured and arranged around the periphery of the display area 204 in a uniform manner, this is for illustrative purposes only.
- the integrated imaging elements 202 . 1 through 202 . i can be configured and arranged around the periphery of the display area 204 in any suitable manner without departing from the spirit and scope of the present disclosure.
- the proximity screen display 200 can represent an exemplary embodiment of the proximity screen display 106 .
- the integrated imaging elements 202 . 1 through 202 . i can be configured and arranged around the periphery of the display area 204 in a uniform manner to form rows 206 and/or columns 208 of the integrated imaging elements 202 . 1 through 202 . i .
- the integrated imaging elements 202 . 1 through 202 . i and the display area 204 can be formed onto a common chip or die or separate chips or dies.
- the integrated imaging elements 202 . 1 through 202 . i and the display area 204 can be integrated within a mechanical housing of a communication device, such as the communication device 100 .
- the mechanical housing can include various openings or cutouts to accommodate the integrated imaging elements 202 . 1 through 202 . i.
- FIG. 2B illustrates a second block diagram of an exemplary configuration and arrangement of integrated imaging elements within the communication device according to an exemplary embodiment of the present disclosure.
- a proximity screen display 210 includes integrated imaging elements 212 . 1 through 212 . i that are configured and arranged in a uniform manner to form a matrix that is integrated within a display area 214 .
- the integrated imaging elements 212 . 1 through 212 . i can be configured and arranged in rows 216 and/or columns 218 to form the matrix that is integrated within the display area 214 .
- the matrix shown in FIG. 2B is merely illustrative; those skilled in the relevant art(s) will recognize that the matrix can be increased or reduced depending on the design goals of a particular embodiment.
- the array size may bear no relationship, or a full relationship, to the underlying pixel array size and layout.
- the proximity screen display 210 can represent an exemplary embodiment of the proximity screen display 106 .
- some configurations and arrangements of the proximity screen display 106 can include a first set of integrated imaging elements, similar to the imaging elements 202 . 1 through 202 . i , around the periphery of its display area and a second set of the integrated imaging elements, similar to the integrated imaging elements 212 . 1 through 212 . i , integrated within the display area.
- some configurations and arrangements of the proximity screen display 106 can include integrated imaging elements, similar to the integrated imaging elements 202 . 1 through 202 . i and/or the one or more integrated imaging elements 212 . 1 through 212 . i .
- the rows and/or the columns of these integrated imaging elements can include different numbers of integrated imaging elements.
- some configurations and arrangements of the proximity screen display 106 can include integrated imaging elements, similar to the integrated imaging elements 202 . 1 through 202 . i and/or the one or more integrated imaging elements 212 . 1 through 212 . i , that are configured and arranged to form any geometric shape around the periphery of its display area and/or integrated within the display area.
- One or more integrated imaging elements can be integrated within a proximity screen display, such as the proximity screen display 106 , the proximity screen display 200 , or the proximity screen display 210 to provide some examples, of a communication device, such as the communication device 100 to provide an example.
- the one or more integrated imaging elements can be formed onto and/or within the proximity screen display around the periphery of a display area in a similar manner as the display area 204 and/or integrated within the display area in a similar manner as the display area 214 .
- FIG. 3A illustrates a first exemplary integration of the imaging elements within a screen of the communication device according to an exemplary embodiment of the present disclosure.
- One or more integrated imaging elements 302 . 1 through 302 . k can be formed onto a substrate 304 of a proximity screen display 300 .
- the proximity screen display 300 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 200 , or the proximity screen display 210 to provide some examples.
- the integrated imaging elements 302 . 1 through 302 . k can represent an exemplary embodiment of the imaging elements 202 . 1 through 202 . i or the imaging elements 212 . 1 through 212 . i.
- the proximity screen display 300 can be formed onto a single substrate or multiple substrates that are communicatively coupled to each other.
- the substrate 304 can represent a portion of the single substrate, one of the multiple substrates, or a portion of one of the multiple substrates.
- the substrate 304 can be integrated within a mechanical housing 306 of a communication device, such as the communication device 100 to provide an example.
- the mechanical housing 306 can include various openings 308 . 1 through 308 . k to accommodate the integrated imaging elements 302 . 1 through 302 . k.
- the openings 308 . 1 through 308 . k can be physical hole type openings or merely comprise transparent material areas through which imaging can be conducted.
- the integrated imaging elements 302 . 1 through 302 . k can take on the structure, shape and architecture of a mostly flat imaging structure.
- FIG. 3B illustrates a second exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure.
- Similar to the proximity screen display 300 , a proximity screen display 310 includes the one or more integrated imaging elements 302 . 1 through 302 . k that are formed onto the substrate 304 . However, as an alternative to the mechanical housing 306 , a protective coating of transparent material 312 can be placed onto the substrate 304 of the communication device to protect the integrated imaging elements 302 . 1 through 302 . k.
- FIG. 3C illustrates a third exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure.
- One or more integrated imaging elements 316 . 1 through 316 . k can be integrated within one or more integrated circuit layers 318 . 1 through 318 . m of a substrate 320 of a proximity screen display 314 .
- the proximity screen display 314 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 200 , or the proximity screen display 210 to provide some examples.
- the integrated imaging elements 316 . 1 through 316 . k can represent an exemplary embodiment of the imaging elements 202 . 1 through 202 . i or the imaging elements 212 . 1 through 212 . i.
- the proximity screen display 314 can be formed onto a single substrate or multiple substrates that are communicatively coupled to each other.
- the substrate 320 can represent a portion of the single substrate, one of the multiple substrates, or a portion of one of the multiple substrates.
- the one or more integrated imaging elements 316 . 1 through 316 . k can be integrated within the one or more integrated circuit layers 318 . 1 through 318 . m of the substrate 320 .
- the one or more integrated imaging elements 316 . 1 through 316 . k are formed onto an integrated circuit layer 318 . 1 that represents a substrate of semiconductor material, which is often flexible.
- the one or more integrated imaging elements 316 . 1 through 316 . k as well as a display area of the proximity screen display 314 are formed using various integrated circuit layers between the integrated circuit layers 318 . 1 and 318 . m.
- a flexible transparent cover is often formed onto the one or more integrated imaging elements 316 . 1 through 316 . k as the integrated circuit layer 318 . m.
- the one or more integrated imaging elements sense changes in light resulting from the movement of the near-end user and/or the other passive objects in their field of view.
- the one or more integrated imaging elements are implemented using various photosensor and/or photodetector devices to provide an example, which convert energy of the light into electrical energy, such as current or voltage, by a photovoltaic effect.
- the photosensor and/or photodetector devices can include active pixel element sensors, light emitting diodes, optical detectors, photoresistors, photovoltaic cells, photodiodes, phototransistors, and/or other devices that are capable of converting the energy of the light into the electrical energy that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
- These various photosensor and/or photodetector devices can be integrated around the periphery of a display area of a proximity screen display, such as the proximity screen display 106 , the proximity screen display 200 , or the proximity screen display 210 to provide some examples, in a similar manner as the imaging elements 202 . 1 through 202 . i , and/or integrated within the display area in a similar manner as the integrated imaging elements 212 . 1 through 212 . i.
- FIG. 4 illustrates an integrated imaging element within the communication device according to an exemplary embodiment of the present disclosure.
- the integrated imaging element is implemented using a photovoltaic device 400 .
- When photons of sufficient energy strike the photovoltaic device 400 , they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes.
- These negatively charged electrons move toward a cathode of the photovoltaic device 400 and/or the positively charged electron holes move toward an anode of the photovoltaic device 400 producing a current and/or voltage.
- This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106 .
- the photovoltaic device 400 includes a transparent conducting cathode layer 402 , an optional buffer layer 404 , a donor-acceptor layer 406 , and a transparent conducting anode layer 408 that are configured and arranged as a planar heterojunction, although a bulk heterojunction (BHJ) or an ordered heterojunction (OHJ) can be used, onto a transparent substrate 410 .
- the transparent conducting cathode layer 402 represents a cathode of the photovoltaic device 400 that attracts negatively charged electrons from the optional buffer layer 404 and/or the donor-acceptor layer 406 when the photons of sufficient energy strike the photovoltaic device 400 .
- the attracting of the negatively charged electrons to the transparent conducting cathode layer 402 can produce the current and/or the voltage which is indicative of an intensity of the photons striking the photovoltaic device 400 .
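As a minimal sketch of the relationship stated above between photon intensity and the produced current, the photocurrent of a photovoltaic sensing element scales with the incident photon rate. The quantum-efficiency value and the function name are assumed example figures, not values from the disclosure.

```python
# Illustrative sketch (assumed parameters): the photocurrent produced by
# the photovoltaic device is proportional to the rate of incident
# photons, I = q * QE * photon_rate, where QE is the quantum efficiency.

ELECTRON_CHARGE = 1.602e-19  # coulombs

def photocurrent(photon_rate, quantum_efficiency=0.6):
    """Current (amps) produced by photons striking the device per second."""
    return ELECTRON_CHARGE * quantum_efficiency * photon_rate

# 1e12 photons/s at 60% QE -> roughly 96 nA: a sensing signal whose
# magnitude tracks the intensity of light striking the element.
print(f"{photocurrent(1e12):.3e} A")
```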
- the optional buffer layer 404 is an intrinsic semiconductor layer, also called an undoped semiconductor layer or i-type semiconductor layer, which represents a semiconductor layer without any significant impurity atoms.
- the optional buffer layer 404 can be implemented with the donor-acceptor layer 406 to form a p-type semiconductor, intrinsic (i-type) semiconductor, n-type semiconductor (PIN) structure.
- the negatively charged electrons and/or the positively charged electron holes from the donor-acceptor layer 406 accumulate within the optional buffer layer 404 when the photons of sufficient energy strike the photovoltaic device 400 .
- a current can flow between the transparent conducting cathode layer 402 and the transparent conducting anode layer 408 when a sufficient number of the negatively charged electrons and/or the positively charged electron holes have accumulated in the optional buffer layer 404 .
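The accumulate-then-conduct behavior described in the two points above can be modeled as a simple integrate-and-compare loop. The threshold value and sample counts below are hypothetical illustration figures, not values from the disclosure.

```python
# Illustrative sketch (hypothetical numbers): carriers photogenerated in
# the donor-acceptor layer accumulate in the optional buffer layer, and
# conduction between the cathode and anode layers begins once the
# accumulated charge crosses a threshold.

def conducts_after(photon_counts, threshold=1000):
    """Return the sample index at which accumulated carriers first reach
    the threshold, or None if the threshold is never reached."""
    accumulated = 0
    for n, count in enumerate(photon_counts):
        accumulated += count
        if accumulated >= threshold:
            return n
    return None

print(conducts_after([100, 200, 300, 500, 50]))  # 3: crossed at 4th sample
print(conducts_after([10, 20, 30]))              # None: too little light
```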
- the donor-acceptor layer 406 can be doped with impurity atoms of an acceptor type, such as boron or aluminum to provide some examples, that are capable of accepting an electron and/or doped with impurity atoms of a donor type, such as phosphorus, arsenic, or antimony to provide some examples that are capable of donating an electron.
- a first portion of the donor-acceptor layer 406 is doped with the impurity atoms of the acceptor type and a second portion of the donor-acceptor layer 406 is doped with the impurity atoms of the donor type to form a p-n junction.
- the donor-acceptor layer 406 provides negatively charged electrons to the transparent conducting cathode layer 402 and/or positively charged electron holes to the transparent conducting anode layer 408 when the photons of sufficient energy strike the photovoltaic device 400 causing a current to flow between the transparent conducting cathode layer 402 and the transparent conducting anode layer 408 .
- the transparent conducting anode layer 408 represents an anode of the photovoltaic device 400 that attracts positively charged electron holes from the optional buffer layer 404 and/or the donor-acceptor layer 406 when the photons of sufficient energy strike the photovoltaic device 400 .
- the attracting of the positively charged electron holes to the transparent conducting anode layer 408 can produce the current and/or the voltage which is indicative of an intensity of the photons striking the photovoltaic device 400 .
- an integrated imaging element, such as one of the integrated imaging elements 212 . 1 through 212 . i , one of the integrated imaging elements 316 . 1 through 316 . k , or the photovoltaic device 400 to provide some examples, can be integrated within a display area, such as the display area 214 to provide an example, of a proximity screen display, such as the proximity screen display 106 , the proximity screen display 210 , or the proximity screen display 314 to provide some examples.
- These various proximity screen displays can be placed in a side-by-side configuration with various imaging elements, such as one or more of the photovoltaic device 400 to provide an example, to form the proximity screen display 210 .
- a transparent flexible cover can be placed onto the proximity screen display 210 to cover the various proximity screen displays and the various imaging elements.
- a 10K-imaging element array can be placed in a middle of a full high-definition array of pixel elements of these proximity screen displays to form the proximity screen display 210 .
- arrays of imaging elements can be interdigitated with arrays of pixel elements of these proximity screen displays to form a checkerboard like configuration.
- Other ratios of imaging elements to pixel elements and of imaging element arrays to pixel element arrays can be selected depending on the goals of the specific design embodiment.
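The interdigitated checkerboard-like configuration mentioned above can be sketched as a simple parity rule over grid coordinates. The function and grid size are illustrative assumptions; other ratios would replace the parity test.

```python
# Illustrative sketch (not from the disclosure): interdigitating imaging
# elements with display pixel elements in a checkerboard-like pattern,
# one element type per grid site depending on coordinate parity.

def element_type(row, col):
    """'imaging' on one color of the checkerboard, 'pixel' on the other."""
    return "imaging" if (row + col) % 2 == 0 else "pixel"

grid = [[element_type(r, c) for c in range(4)] for r in range(4)]
for row in grid:
    print(row)

# Half the sites carry imaging elements, half carry pixel elements.
imaging_count = sum(row.count("imaging") for row in grid)
print(imaging_count)  # 8 of 16
```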
- various imaging elements, such as one or more of the photovoltaic device 400 to provide an example, can be placed in an on-top configuration with the various proximity screen displays. In this configuration, the various imaging elements are placed on top of the various proximity screen displays. In some situations, various layers of the various imaging elements and the various proximity screen displays can be shared, such as a flexible transparent cover of the various imaging elements and a transparent substrate of the various proximity screen displays.
- The discussion to follow first describes various proximity screen displays, such as an organic light-emitting diode proximity screen display, an electronic paper, e-paper, or electronic ink proximity screen display, or a liquid crystal display to provide some examples.
- the discussion to follow then describes integration of the imaging element within these various proximity screen displays to form exemplary integrations of the integrated imaging elements.
- FIG. 5A illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display according to an exemplary embodiment of the present disclosure.
- An organic light-emitting diode proximity screen display 500 includes one or more pixel elements that are configured and arranged to form a display area.
- Each of the one or more pixel elements includes one or more layers of one or more organic compounds which emit light in response to an electric current.
- the one or more layers of the one or more organic compounds are positioned between two electrodes on a substrate.
- the organic light-emitting diode proximity screen display 500 can represent a bottom emission device that uses a transparent or semi-transparent bottom electrode to emit the light through a transparent substrate or a top emission device that uses a transparent or semi-transparent top electrode to directly emit the light.
- a single pixel element of the organic light-emitting diode proximity screen display 500 includes an optional flexible transparent cover 502 , a transparent conducting cathode 504 , an optional electron transport layer 506 , one or more organic emission layers 508 , an optional hole transport layer 510 , and a transparent conducting anode 512 that are formed on a flexible substrate 514 such as a substrate of polyethylene terephthalate to provide an example.
- the optional flexible transparent cover 502 represents a protective coating of transparent material 312 that can be placed onto the flexible substrate 514 to protect the transparent conducting cathode 504 , the optional electron transport layer 506 , the one or more organic emission layers 508 , the optional hole transport layer 510 , and the transparent conducting anode 512 .
- the transparent conducting cathode 504 provides a current of electrons when a voltage at the transparent conducting anode 512 is positive with respect to the transparent conducting cathode 504 .
- This current of electrons injects electrons into lowest unoccupied molecular orbitals (LUMO) of the one or more organic emission layers 508 at the optional electron transport layer 506 and withdraws electrons from highest occupied molecular orbitals (HOMO) of the one or more organic emission layers 508 at the optional hole transport layer 510 , forming electron holes.
- the optional electron transport layer 506 can be doped with impurity atoms of a donor type, such as phosphorus, arsenic, or antimony to provide some examples, which are capable of donating an electron.
- the electron transport layer provides excess carrier electrons to the one or more organic emission layers 508 as the current flows through the optional electron transport layer 506 from the transparent conducting cathode 504 to the transparent conducting anode 512 .
- the one or more organic emission layers 508 provide various electrostatic forces to bring the electrons and the holes towards each other whereupon the electrons and the holes recombine to form a bound state of the electron and hole, often referred to as an exciton.
- the decay of the exciton results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible region.
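The statement above that exciton decay emits radiation at a visible frequency follows from the Planck relation E = h·f, equivalently λ = h·c / E. The sketch below uses standard physical constants; the 2.3 eV example energy is an assumed illustrative value.

```python
# Illustrative sketch: converting an exciton relaxation energy into the
# wavelength of the emitted radiation via lambda = h * c / E.

PLANCK = 6.626e-34      # J*s
LIGHT_SPEED = 2.998e8   # m/s
EV = 1.602e-19          # J per electron-volt

def emission_wavelength_nm(energy_ev):
    """Wavelength in nanometers of a photon carrying energy_ev electron-volts."""
    return PLANCK * LIGHT_SPEED / (energy_ev * EV) * 1e9

# An assumed ~2.3 eV relaxation lands in the green part of the visible band.
print(round(emission_wavelength_nm(2.3)))  # ~539 nm
```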
- the one or more organic emission layers 508 can include organometallic chelates, fluorescent and phosphorescent dyes, conjugated dendrimer to provide some examples.
- the optional hole transport layer 510 can be doped with impurity atoms of an acceptor type, such as boron or aluminum to provide some examples, that are capable of accepting an electron.
- the hole transport layer provides excess carrier holes to the one or more organic emission layers 508 as the current flows through the optional hole transport layer 510 from the transparent conducting cathode 504 to the transparent conducting anode 512 .
- the transparent conducting anode 512 receives the current of electrons when the voltage at the transparent conducting anode 512 is positive with respect to the transparent conducting cathode 504 .
- the transparent conducting anode 512 can be shared with other anode electrodes of other pixel elements of the organic light-emitting diode proximity screen display 500 .
- FIG. 5B illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure.
- An organic light-emitting diode proximity screen display 520 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of one or more organic compounds which emit light in response to an electric current. The one or more layers of the one or more organic compounds are positioned between two electrodes on a substrate.
- the organic light-emitting diode proximity screen display 520 also includes an integrated imaging element that is integrated within the display area.
- the organic light-emitting diode proximity screen display 520 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 210 , or the proximity screen display 314 to provide some examples.
- a single pixel element of the organic light-emitting diode proximity screen display 520 includes an imaging element 524 and an organic light-emitting diode touch screen pixel element 526 .
- the imaging element 524 senses changes in light 522 resulting from the movement of the near-end user and/or the other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element 524 , they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from a photovoltaic absorption layer 530 toward a transparent conducting cathode 528 which represents a cathode of the imaging element 524 .
- the positively charged electron holes move from the photovoltaic absorption layer 530 toward a shared transparent conducting electrode 532 which represents an anode of the imaging element 524 .
- the photovoltaic absorption layer 530 can be implemented in a substantially similar manner using the optional buffer layer 404 and/or the donor-acceptor layer 406 .
- the movement of the negatively charged electrons toward the transparent conducting cathode 528 and the movement of the positively charged electron holes toward the shared transparent conducting electrode 532 produce a current and/or voltage.
- This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106 .
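The per-element sensing signals described above form a coarse image whose frame-to-frame changes reveal a moving user or object. The sketch below is a hypothetical thresholding scheme, not the disclosure's method; the threshold values and signal levels are assumptions.

```python
# Illustrative sketch (hypothetical thresholds): comparing successive
# readouts of the imaging-element sensing signals to detect the light
# changes caused by a user or object moving in the field of view.

def motion_detected(prev_frame, curr_frame, delta=0.2, min_changed=2):
    """Flag motion when at least `min_changed` elements change by more
    than `delta` between consecutive readouts."""
    changed = sum(
        1 for p, c in zip(prev_frame, curr_frame) if abs(c - p) > delta
    )
    return changed >= min_changed

frame_a = [0.9, 0.9, 0.9, 0.9]            # ambient light on four elements
frame_b = [0.9, 0.4, 0.3, 0.9]            # a hand shadows two elements
print(motion_detected(frame_a, frame_b))  # True
print(motion_detected(frame_a, frame_a))  # False
```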
- the organic light-emitting diode touch screen pixel element 526 includes one or more layers of one or more organic compounds which emit light in response to an electric current.
- the organic light-emitting diode touch screen pixel element 526 can be implemented in a similar manner as a pixel element of the organic light-emitting diode proximity screen display 500 .
- the organic light-emitting diode touch screen pixel element 526 includes the optional electron transport layer 506 , the one or more organic emission layers 508 , the optional hole transport layer 510 , and the transparent conducting anode 512 with the transparent conducting cathode 504 of the organic light-emitting diode proximity screen display 500 being replaced by the shared transparent conducting electrode 532 .
- the shared transparent conducting electrode 532 provides a current of electrons when a voltage at the transparent conducting anode 512 is positive with respect to the shared transparent conducting electrode 532 .
- This current of electrons injects electrons into lowest unoccupied molecular orbitals (LUMO) of the one or more organic emission layers 508 at the optional electron transport layer 506 and withdraws electrons from highest occupied molecular orbitals (HOMO) of the one or more organic emission layers 508 at the optional hole transport layer 510 , forming electron holes.
- the decay of the exciton results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible region.
- FIG. 6A illustrates a single pixel element of an electronic paper, e-paper, or electronic ink screen display.
- an electronic paper, e-paper, or electronic ink screen display 600 includes one or more pixel elements that are configured and arranged to form a display area.
- Each of the one or more pixel elements includes various charged pigment particles in a layer of liquid polymer that can be re-configured and re-arranged by applying various electric fields to two electrodes on a substrate. Applying a negative charge to a top electrode repels white pigment particles to a bottom of the layer of liquid polymer forcing black pigment particles to a top of the layer of liquid polymer which results in a black appearance.
- a single pixel element of the electronic paper, e-paper, or electronic ink screen display 600 includes the optional flexible transparent cover 502 , a top transparent conducting electrode 602 , one or more liquid polymer layers 604 , and a bottom transparent conducting electrode 606 that are formed on the flexible substrate 514 .
- the top transparent conducting electrode 602 attracts positively charged pigment particles to a top of the one or more liquid polymer layers 604 and repels negatively charged pigment particles to a bottom of the one or more liquid polymer layers 604 when a negative charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606 .
- the top transparent conducting electrode 602 repels the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when a positive charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606 .
- the one or more liquid polymer layers 604 includes one or more layers of various liquid polymers that suspend the positively charged pigment particles and negatively charged pigment particles until a charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606 .
- the positively charged pigment particles represent black pigment particles and the negatively charged pigment particles represent white pigment particles.
- the pixel element of the electronic paper, e-paper, or electronic ink screen display 600 will have a black appearance when the black pigment particles are attracted to the top transparent conducting electrode 602 and the white pigment particles are repelled to the bottom transparent conducting electrode 606 .
- the pixel element of the electronic paper, e-paper, or electronic ink screen display 600 will have a white appearance when the white pigment particles are attracted to the top transparent conducting electrode 602 and the black pigment particles are repelled to the bottom transparent conducting electrode 606 .
- the bottom transparent conducting electrode 606 repels the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the negative charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606 .
- the bottom transparent conducting electrode 606 attracts the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and repels the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when the positive charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606 .
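The pigment logic described above reduces to a polarity rule: with black pigment positively charged and white pigment negatively charged, the sign of the charge applied at the top electrode selects which particles reach the viewing surface. The sketch below is a minimal model of that rule; the function name and charge encoding are illustrative assumptions.

```python
# Illustrative sketch of the e-ink pixel logic described above: a
# negative top-electrode charge attracts the positively charged black
# pigment to the viewing surface (black appearance); a positive charge
# attracts the negatively charged white pigment instead (white appearance).

def eink_appearance(top_electrode_charge):
    """'black' when the top electrode is negative, 'white' when positive."""
    if top_electrode_charge < 0:
        return "black"   # positive black particles pulled to the top
    return "white"       # negative white particles pulled to the top

print(eink_appearance(-1))  # black
print(eink_appearance(+1))  # white
```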
- FIG. 6B illustrates a single pixel element of an electronic paper, e-paper, or electronic ink proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure.
- An electronic paper, e-paper, or electronic ink proximity screen display 620 includes one or more pixel elements that are configured and arranged to form a display area.
- the electronic paper, e-paper, or electronic ink proximity screen display 620 also includes an integrated imaging element that is integrated within the display area.
- the electronic paper, e-paper, or electronic ink proximity screen display 620 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 210 , or the proximity screen display 314 to provide some examples.
- a single pixel element of the electronic paper, e-paper, or electronic ink proximity screen display 620 includes an imaging element 622 and an electronic paper, e-paper, or electronic ink touch screen pixel element 624 .
- the imaging element 622 senses changes in the light 522 resulting from the movement of the near-end user and/or the other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element 622 , they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from the photovoltaic absorption layer 530 toward a transparent conducting shared electrode 630 which represents a cathode of the imaging element 622 .
- the positively charged electron holes move from the photovoltaic absorption layer 530 toward the transparent conducting anode 512 which represents an anode of the imaging element 622 .
- the movement of the negatively charged electrons toward the transparent conducting shared electrode 630 and the movement of the positively charged electron holes toward transparent conducting anode 512 produce a current and/or voltage.
- This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106 .
- the single pixel element of the electronic paper, e-paper, or electronic ink proximity screen display 620 includes a first conducting element 626 and a second conducting element 628 .
- the first conducting element 626 attracts positively charged pigment particles to a first side of the one or more liquid polymer layers 604 and repels negatively charged pigment particles to a second side of the one or more liquid polymer layers 604 when a negative charge is applied between the first conducting element 626 and the second conducting element 628 .
- the second conducting element 628 repels the positively charged pigment particles to the first side of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the second side of the one or more liquid polymer layers 604 when the negative charge is applied between the first conducting element 626 and the second conducting element 628 .
- the attracting of the positively charged pigment particles to the first side and the negatively charged pigment particles to the second side allows the light 522 to pass through the one or more liquid polymer layers 604 to strike the imaging element 622 .
- the electronic paper, e-paper, or electronic ink touch screen pixel element 624 includes various charged pigment particles in a layer of liquid polymer that can be re-configured and re-arranged by applying various electric fields to two electrodes on the substrate.
- the electronic paper, e-paper, or electronic ink touch screen pixel element 624 can be implemented in a similar manner as a pixel element of the electronic paper, e-paper, or electronic ink screen display 600 .
- the electronic paper, e-paper, or electronic ink touch screen pixel element 624 includes the top transparent conducting electrode 602 and the one or more liquid polymer layers 604 with the bottom transparent conducting electrode 606 of the electronic paper, e-paper, or electronic ink screen display 600 being replaced by the transparent conducting shared electrode 630 .
- the transparent conducting shared electrode 630 repels the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the negative charge is applied between the top transparent conducting electrode 602 and the transparent conducting shared electrode 630 .
- the transparent conducting shared electrode 630 attracts the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and repels the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when the positive charge is applied between the top transparent conducting electrode 602 and the transparent conducting shared electrode 630 .
- FIG. 7A illustrates a single pixel element of a liquid crystal screen display.
- a liquid crystal screen display 700 includes one or more pixel elements that are configured and arranged to form a display area.
- Each of the one or more pixel elements includes one or more layers of liquid crystal material aligned between two electrodes. Before an electric field is applied between the two electrodes, the alignment surfaces at the two electrodes are oriented perpendicular to each other, and so molecules of the liquid crystal material arrange themselves in a helical structure, or twist. When a voltage applied between the two electrodes is large enough, the liquid crystal molecules in a center of the layers of liquid crystal material are almost completely untwisted and the polarization of the incident light is not rotated as it passes through the liquid crystal material.
- This light will then be mainly polarized perpendicular to a horizontal polarizing filter, and thus be blocked and the pixel element will appear black.
- a light source, such as a backlight or a reflector to provide some examples, is often included to provide the light that passes through the pixel element.
- a single pixel element of the liquid crystal screen display 700 includes the optional flexible transparent cover 502 , a vertical axis polarizing filter 702 , a top transparent conducting electrode 704 , one or more liquid crystal layers 706 , a horizontal axis polarizing filter 708 , and a bottom transparent conducting electrode 710 that are formed on the flexible substrate 514 .
- the vertical axis polarizing filter 702 is configured to pass vertical components of the light passing through it while absorbing and/or reflecting horizontal components.
- the horizontal axis polarizing filter 708 is configured to pass horizontal components of the light passing through it while absorbing and/or reflecting vertical components.
- the one or more liquid crystal layers 706 contain liquid crystals that twist and untwist at varying degrees to allow light to pass through when a voltage is applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710 .
- the liquid crystals untwist changing their polarization and in proportion to a voltage applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710 .
- FIG. 7B illustrates a single pixel element of a liquid crystal proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure.
- a liquid crystal proximity screen display 720 includes one or more pixel elements that are configured and arranged to form a display area.
- the liquid crystal proximity screen display 720 also includes an integrated imaging element that is integrated within the display area.
- the liquid crystal proximity screen display 720 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 210 , or the proximity screen display 314 to provide some examples.
- a single pixel element of the liquid crystal proximity screen display 720 includes the vertical axis polarizing filter 702 , the one or more liquid crystal layers 706 , the horizontal axis polarizing filter 708 , the bottom transparent conducting electrode 710 , and a transparent conducting shared electrode 722 that are configured and arranged on the flexible substrate 514 .
- This pixel element of the liquid crystal proximity screen display 720 can be implemented in a similar manner as a pixel element of the liquid crystal screen display 700 with the top transparent conducting electrode 704 of the liquid crystal screen display 700 being replaced by the transparent conducting shared electrode 722 .
- the liquid crystals of the one or more liquid crystal layers 706 twist and untwist at varying degrees to allow light to pass through when a voltage is applied between the transparent conducting shared electrode 722 and the bottom transparent conducting electrode 710 .
- the single pixel element of the liquid crystal proximity screen display 720 also includes the transparent conducting cathode 528 , the photovoltaic absorption layer 530 , and the transparent conducting shared electrode 722 that are configured and arranged to form an imaging element.
- the imaging element senses changes in the light 522 resulting from the movement of the near-end user and/or the other passive objects in their field of view. When photons of sufficient energy of the light 522 strike the imaging element, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from the photovoltaic absorption layer 530 toward the transparent conducting shared electrode 722 , which represents a cathode of the imaging element.
- the positively charged electron holes move from the photovoltaic absorption layer 530 toward the transparent conducting anode 512 which represents an anode of the imaging element.
- the movement of the negatively charged electrons toward the transparent conducting shared electrode 722 and the movement of the positively charged electron holes toward transparent conducting anode 512 produce a current and/or voltage.
- This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106 .
- FIG. 8 illustrates an exemplary proximity screen display and proximity screen display interface that can be implemented within the communication device according to an exemplary embodiment of the present disclosure.
- a proximity screen display interface 800 provides various control signals to a proximity screen display 802 for configuration of its display area to display information from a host processor, such as the host processor 104 to provide an example, and/or a communication module, such as the communication module 102 to provide an example. Additionally, the proximity screen display interface 800 can interpret various sensing signals provided by the proximity screen display 802 to determine the presence and/or the location of the near-end user and/or the other passive objects.
- the proximity screen display interface 800 can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the information provided by the communication module and/or the host processor in response to the various sensing signals provided by the proximity screen display 802 .
- the proximity screen display interface 800 can represent an exemplary embodiment of the proximity screen display interface 108 and the proximity screen display 802 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 200 , the proximity screen display 210 , the proximity screen display 300 , the proximity screen display 310 , the proximity screen display 314 , the organic light-emitting diode proximity screen display 520 , the electronic paper, e-paper, or electronic ink proximity screen display 620 , the liquid crystal proximity screen display 720 , or any combination thereof.
- the proximity screen display interface 800 includes a touch screen controller 804 , a display area driver module 806 , and an integrated imaging element driver module 808 .
- the touch screen controller 804 controls overall operation and/or configuration of the proximity screen display 802 .
- the touch screen controller 804 receives information 850 from the host processor and/or the communication module.
- the information 850 can include video and/or image data to be displayed by the proximity screen display 802 .
- the information 850 can include command and/or control data to control the operation and/or configuration of the proximity screen display 802 .
- This command and/or control data can include backlight parameters, contrast parameters, brightness parameters, sharpness parameters, color parameters, tint parameters, refresh rate parameters, aspect ratio parameters, and/or resolution parameters to control the displaying of the video and/or image data within a display area of the proximity screen display 802 .
- the touch screen controller 804 provides video and/or image data 852 to the display area driver module 806 that is to be displayed in accordance with the command and/or control data.
- the command and/or control data can include sensing rate parameters or sensing scheme parameters to control the operation and/or configuration of the integrated imaging elements that are configured and arranged around a periphery of the display area and/or within the proximity screen display.
- the touch screen controller 804 provides imaging command and/or control 854 to the integrated imaging element driver module 808 to control the operation and/or configuration of the integrated imaging elements. Further, the proximity screen display interface 800 can adjust various parameters, such as the zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the video and/or image data in response to the sensing signals 856 provided by the proximity screen display 802 .
- the display area driver module 806 processes the video and/or image data 852 to provide display area control signals 860 for displaying of the video and/or image data 852 on the display area of the proximity screen display 802 .
- the display area of the proximity screen display 802 is configured and arranged as multiple rows and/or columns of pixel elements that are configured and arranged as a matrix.
- Each of the pixel elements can display a pixel of red, green, blue, black, white, or any combination thereof color by applying a current and/or a voltage to its respective electrodes.
- a first electrode of each of the pixel elements in each row is coupled to each other.
- a second electrode of each of the pixel elements in each column is coupled to each other.
- the display area driver module 806 provides various currents and/or voltages to the rows and/or the columns of the matrix to configure the multiple rows and/or columns of pixel elements to display the video and/or image data 852 .
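The row/column matrix addressing described above can be sketched as follows; the function and frame format are hypothetical illustrations of how a display area driver module might sequence its drive events, not the disclosed circuitry:

```python
def scan_display(frame):
    """Row-at-a-time matrix addressing (illustrative sketch).

    `frame` is a 2-D list of pixel drive levels. Each scan cycle
    activates one row electrode and places each pixel's drive level
    on its column electrode, so a full frame needs one pass per row.
    Returns the sequence of (active_row, column_levels) drive events
    that a display area driver module would emit.
    """
    events = []
    for row_index, row in enumerate(frame):
        column_levels = list(row)       # one drive level per column
        events.append((row_index, column_levels))
    return events

frame = [[0, 255], [128, 64]]
for row, levels in scan_display(frame):
    print(row, levels)
```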
- the integrated imaging element driver module 808 provides integrated imaging element control signals 862 to configure the integrated imaging elements to sense light in their field of view.
- the integrated imaging elements are controlled in a substantially opposite manner as the pixel elements. For example, when a pixel element displays its respective pixel of red, green, blue, black, white, or any combination thereof color, namely active or turned “ON”, its respective integrated imaging element is configured to be inactive or turned “OFF”. Likewise, when an integrated imaging element is sensing the light in its field of view, namely active or turned “ON”, its respective pixel element is configured to be inactive or turned “OFF”.
- the integrated imaging element and its respective pixel element can be duty cycled to switch between their inactive or active configurations at a sufficient rate without affecting an appearance of the video and/or image data 852 to the near-end user.
- the touch screen controller 804 determines which of the pixel elements are to be inactive based upon the video and/or image data 852 .
- the touch screen controller 804 provides the imaging command and/or control 854 to cause integrated imaging elements that correspond to these inactive pixel elements to be active to sense the light in their field of view.
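The complementary activation of pixel elements and their co-located integrated imaging elements can be sketched as a simple mask computation; the boolean frame representation is an assumption for illustration:

```python
def imaging_enable_mask(pixel_active):
    """Complementary activation of co-located imaging elements.

    A pixel element and its integrated imaging element share a
    location, so the controller enables an imaging element exactly
    when its pixel element is inactive ("OFF"), and vice versa.
    `pixel_active` is a 2-D list of booleans for the pixel matrix.
    """
    return [[not active for active in row] for row in pixel_active]

pixels = [[True, False], [False, True]]
print(imaging_enable_mask(pixels))  # [[False, True], [True, False]]
```

Duty cycling would alternate `pixel_active` and the resulting mask fast enough that the displayed image appears unchanged.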
- the integrated imaging elements of the proximity screen display 802 are configured and arranged as multiple rows and/or columns that are configured and arranged as a matrix.
- a first electrode of each of the integrated imaging elements in each row is coupled to each other and a second electrode of each of the integrated imaging elements in each column is coupled to each other.
- the integrated imaging element driver module 808 provides various currents and/or voltages to the rows and/or the columns of the matrix to configure the multiple rows and/or columns of integrated imaging elements to sense the light in their field of view to provide the sensing signals 856 .
- the magnitudes of the sensing signals 856 depend upon an amount of light sensed in their field of view.
- the proximity screen display 802 can include one or more pixel elements that are configured and arranged in multiple rows and/or columns to form a display area for displaying of the video and/or image data 852 .
- the proximity screen display 802 also includes integrated imaging elements that are configured and arranged around the periphery of the display area and/or within the proximity screen display 802 to sense the light in their field of view.
- the proximity screen display 802 can include more, fewer, or an equal number of integrated imaging elements as pixel elements.
- the integrated imaging elements provide various voltages and/or currents as the sensing signals 856 that are indicative of the light in their field of view.
- the sensing signals 856 have a first magnitude when the one or more integrated imaging elements are exposed to a bright light, a second magnitude when the one or more integrated imaging elements are exposed to a dark light, and a third magnitude that varies between the first magnitude and the second magnitude as the amount of light sensed by the integrated imaging elements varies between the bright light and the dark light.
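The first/second/third magnitude behavior described above can be sketched as a clamped linear mapping; the endpoint magnitudes and normalized light scale are illustrative assumptions:

```python
def sensing_signal(light_level, dark_magnitude=0.1, bright_magnitude=1.0):
    """Map sensed light to a sensing-signal magnitude (sketch).

    `light_level` is normalized to [0, 1]. The returned magnitude
    varies linearly between the dark-light and bright-light values,
    mirroring the first/second/third magnitude behavior described
    above. The endpoint magnitudes are illustrative assumptions.
    """
    light_level = max(0.0, min(1.0, light_level))
    return dark_magnitude + (bright_magnitude - dark_magnitude) * light_level

print(sensing_signal(0.0))  # dark light
print(sensing_signal(1.0))  # bright light
```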
- a proximity screen display interface such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples, can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of video data or image data, provided by a communication module, such as the communication module 102 to provide an example, and/or a host processor, such as the host processor 104 to provide an example.
- the proximity screen display interface can adjust the various parameters of the video data or the image data in response to the presence and/or the location of the touch from the near-end user in relation to a proximity screen display, such as the proximity screen display 106 , the proximity screen display 200 , the proximity screen display 210 , the proximity screen display 300 , the proximity screen display 310 , the proximity screen display 314 , the organic light-emitting diode proximity screen display 520 , the electronic paper, e-paper, or electronic ink proximity screen display 620 , the liquid crystal proximity screen display 720 , or the proximity screen display 802 to provide some examples.
- FIG. 9 is a flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure.
- the disclosure is not limited to this operational description. Rather, it will be apparent to persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes the steps in FIG. 9 .
- the operational control flow receives video data, image data, command data, and/or control data for displaying the image and/or the video data on to a display area 904 of the proximity screen display 900 .
- the proximity screen display 900 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 200 , the proximity screen display 210 , the proximity screen display 300 , the proximity screen display 310 , the proximity screen display 314 , the organic light-emitting diode proximity screen display 520 , the electronic paper, e-paper, or electronic ink proximity screen display 620 , the liquid crystal proximity screen display 720 , or the proximity screen display 802 to provide some examples.
- the proximity screen display 900 can also be implemented using any conventional proximity screen display, such as any conventional resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition proximity screen display to provide some examples, that is capable of detecting a presence or a location of an object without departing from the spirit and scope of the present disclosure.
- a proximity screen display interface such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples, receives the video data, image data, command data, and/or control data for displaying the image and/or the video data on to a display area 904 of a proximity screen display 900 .
- the operational control flow displays the image and/or the video data onto the display area 904 in accordance with the command data and/or control data.
- the proximity screen display interface provides various display area control signals, such as the various display area control signals 860 to provide an example, in response to the video data, image data, command data, and/or control data to configure and arrange the display area 904 to display the video and/or image data on the display area.
- the operational control flow detects a presence or a location of an object, such as a finger 902 of a user, which is proximate to the display area 904 .
- the proximity screen display 900 can include one or more integrated imaging elements that can be integrated around a periphery of the display area 904 in a similar manner as the imaging elements 202.1 through 202.i and/or integrated within the display area in a similar manner as the integrated imaging elements 212.1 through 212.i .
- the one or more integrated imaging elements are configured and arranged to sense light in their field of view.
- the proximity screen display interface provides various integrated imaging element control signals, such as the integrated imaging element control signals 862 to provide an example, to configure the one or more integrated imaging elements to sense light in their field of view.
- the one or more integrated imaging elements provide various sensing signals, such as the sensing signals 856 to provide an example, whose magnitudes depend upon an amount of light sensed in their field of view at different locations within the proximity screen display.
- the proximity screen display interface can interpolate an image of an environment surrounding the display area from magnitudes of the various sensing signals to detect the presence or the location of the object. Additionally, the proximity screen display interface can compare various images of the environment surrounding the display area 904 at different instances in time to determine movement of the object within the environment.
- the proximity screen display interface can recognize specific portions of the object, such as one or more fingers of a hand of the user, from one or more images of the environment surrounding the display area 904 .
- the proximity screen display interface can assign various command and/or control data to different specific portions of the object and provide respective command and/or control data when a respective specific portion of the object has been recognized.
- the proximity screen display interface can provide command and/or control data to scroll down, with no hypertext jumping or zooming, upon recognition of the right thumb, or command and/or control data to select hypertext links and invoke zooming upon recognition of the right index finger.
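One way to realize the frame comparison and portion-specific command assignment described above is sketched below; the frame-differencing approach, threshold, and command names are assumptions for illustration, not the disclosed implementation:

```python
def detect_object(frame_prev, frame_curr, threshold=0.2):
    """Locate a proximate object by frame differencing (sketch).

    Both frames are 2-D lists of sensing-signal magnitudes sampled
    from the imaging elements. The object location is taken as the
    cell with the largest magnitude change between the two frames;
    None is returned when no change exceeds `threshold`.
    """
    best, location = 0.0, None
    for r, (prev_row, curr_row) in enumerate(zip(frame_prev, frame_curr)):
        for c, (prev, curr) in enumerate(zip(prev_row, curr_row)):
            change = abs(curr - prev)
            if change > threshold and change > best:
                best, location = change, (r, c)
    return location

# Hypothetical mapping of recognized portions of the object to
# command data, per the right-thumb / right-index-finger example.
FINGER_COMMANDS = {
    "right_thumb": "scroll_down",
    "right_index": "select_and_zoom",
}

prev = [[0.9, 0.9], [0.9, 0.9]]
curr = [[0.9, 0.2], [0.9, 0.9]]   # shadow cast by an approaching finger
print(detect_object(prev, curr))  # (0, 1)
```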
- the operational control flow adjusts the image and/or the video data as being displayed on the display area 904 in response to detecting the presence or the location of the object.
- the proximity screen display interface can adjust the various display area control signals to adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the image and/or the video data as being displayed on the display area 904 .
- the proximity screen display interface can adjust the various parameters to enlarge or to zoom into a coincidental portion 902 of the image and/or the video data, such as one or more alphanumeric keys of an integrated virtual keyboard to provide an example, as being displayed on the display area 904 that coincides with the location of the object.
- the coincidental portion 902 of the image and/or the video data can appear to be larger to the user.
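The enlargement of a coincidental portion can be sketched as computing a zoom window centered on the detected location; the coordinate conventions and clamping behavior are illustrative assumptions:

```python
def coincidental_zoom(width, height, location, factor=2.0):
    """Compute a zoom window around a detected object (sketch).

    Returns the (x, y, w, h) source rectangle, centered on the
    object's display-area coordinates, that the display driver would
    scale up to fill the full display area, making the coincidental
    portion appear `factor` times larger. Clamping keeps the window
    inside the display area.
    """
    win_w, win_h = width / factor, height / factor
    x = min(max(location[0] - win_w / 2, 0), width - win_w)
    y = min(max(location[1] - win_h / 2, 0), height - win_h)
    return (x, y, win_w, win_h)

print(coincidental_zoom(800, 600, (400, 300)))  # (200.0, 150.0, 400.0, 300.0)
```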
- the operational control flow reverts to step 952 to display the image and/or the video data.
- various full-contact or non-contact modes can be automatically identified by the proximity screen display 900 and/or its corresponding display interface, such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples.
- Such modes can also be selected by a user via a setup process.
- One such mode involves a non-contact “click” selection using a pointer finger.
- Such mode can be established as a factory default or user defined, trained and configured to cause a particular software function to trigger, such as launching a software API (application program interface) On_Click() type function.
- underlying pixel based visual graphics can be set to a particular user selected magnification which may be set to any degree of magnification or to 100% (or otherwise turned off). As the pointer finger moves closer, the magnification can be set to scale up or down or merely stay the same. Likewise, the circle size can be made to change or stay the same.
- When the pointer finger hovers over an active input element (button, down arrow, text field, or widget), the circle can be made to stabilize or to reflect free movement in the x-y-z directions. With stabilization, even a user with shaky hands can find a target input element and stay thereon long enough to recognize same and carry out perhaps a double click motion without ever touching the screen.
- the stabilization may involve centering and appropriate zooming along with a temporary dwell time that may be user configured or calculated for each user through training or through captured behaviors.
- the stabilization can be over-ridden when the pointer finger motion appears to be intentional and not within a range of a particular human's jittering.
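The jitter-tolerant stabilization described above can be sketched with a simple dead-band filter; the jitter radius and hold-position policy are illustrative assumptions (a real implementation might also apply the centering, zooming, and dwell-time behavior described above):

```python
def stabilize(positions, jitter_radius=5.0):
    """Pointer stabilization with a jitter dead-band (sketch).

    Movements smaller than `jitter_radius` are treated as hand
    tremor and the stabilized position holds still; larger movements
    are treated as intentional and override the stabilization.
    `jitter_radius` is an assumed per-user value that could come
    from training or captured behaviors.
    """
    if not positions:
        return []
    stabilized = [positions[0]]
    for x, y in positions[1:]:
        sx, sy = stabilized[-1]
        if ((x - sx) ** 2 + (y - sy) ** 2) ** 0.5 <= jitter_radius:
            stabilized.append((sx, sy))      # tremor: hold position
        else:
            stabilized.append((x, y))        # intentional move: follow
    return stabilized

path = [(100, 100), (102, 101), (99, 98), (140, 100)]
print(stabilize(path))  # [(100, 100), (100, 100), (100, 100), (140, 100)]
```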
- FIG. 10 is a second flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure.
- the disclosure is not limited to this operational description. Rather, it will be apparent to persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes the steps in FIG. 10 .
- the operational control flow receives the video data, image data, command data, and/or control data for displaying the image and/or the video data on to a display area 1004 of the proximity screen display 1000 .
- the proximity screen display 1000 can represent an exemplary embodiment of the proximity screen display 106 , the proximity screen display 200 , the proximity screen display 210 , the proximity screen display 300 , the proximity screen display 310 , the proximity screen display 314 , the organic light-emitting diode proximity screen display 520 , the electronic paper, e-paper, or electronic ink proximity screen display 620 , the liquid crystal proximity screen display 720 , or the proximity screen display 802 to provide some examples.
- the proximity screen display 1000 can also be implemented using any conventional proximity screen display, such as any conventional resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition proximity screen display to provide some examples, that is capable of detecting a presence or a location of an object without departing from the spirit and scope of the present disclosure.
- the operational control flow displays the image and/or the video data onto the display area 1004 in accordance with the command data and/or control data.
- the operational control flow detects a presence or a location of an object, such as one or more hands 1002 of a user, which is proximate to the display area 1004 in a substantially similar manner as described in step 954 .
- the operational control flow adjusts the image and/or the video data as being displayed on the display area 1004 in response to detecting the presence or the location of the object.
- the proximity screen display interface can adjust the various display area control signals to adjust the various parameters of the image and/or the video data as being displayed on the display area 1004 .
- the proximity screen display interface can adjust the various parameters to enlarge or to zoom into coincidental portions 1006 of the image and/or the video data as being displayed on the display area 1004 that coincide with various portions of the object, such as one or more fingers of the one or more hands 1002 .
- the coincidental portions 1006 of the image and/or the video data can appear to be larger to the user.
- the proximity screen display interface can adjust the various parameters to adjust an orientation 1008 of the image and/or the video data as being displayed on the display area 1004 to coincide with the various portions of the object.
- This other example is particularly useful when the image and/or the video data correspond to an integrated virtual keyboard.
- This other example allows the orientation of the integrated virtual keyboard to be adjusted to coincide with the one or more fingers of the one or more hands 1002 .
- the proximity screen display 1000 and supporting hardware and software within the illustrated tablet device can support various typing modes, including touch typing (full finger contact with finger lifts and presses), finger hover with keystroking finger screen contact, hover without contact but with keystroke-like finger motions, and so on. Thumb typing and hunt and peck styles can also be selected. And even with hunt and peck, other objects can be used for the pecking (e.g., a stylus, pencil, or broom handle). Typing input can even support one handed typing, missing digits, and obscure typing preferences and layouts. No matter what the typing configuration though, user specific tailoring is supported through training, ongoing monitoring of typing success (spelling/grammar/corrections) and associated hand and finger positions, motions, and characterizations thereof.
- the proximity screen display 1000 and supporting hardware and software respond to detecting the approach and construct a keyboard layout that fits the locations of the fingers in their natural typing readiness configuration (again as shown).
- This keyboard layout may change during the approach as fingers move or only change (or are created) upon contact.
- a finger striking range can be identified which may also account for keystroke sequencing. From such information key sizing may be adjusted on a key by key basis.
- the active area for the letter “q” may need to be much larger than the active area for the letter “a” and both may be ellipsoidal or other more natural finger movement related active key area shapes.
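The key-by-key sizing of active areas can be sketched as scaling each key's active radius by a per-user miss rate; the statistics and radii here are illustrative assumptions (real active areas might be ellipsoidal, as noted above):

```python
def key_active_radii(miss_rates, base_radius=20.0, max_radius=40.0):
    """Size each key's active area from per-key miss statistics (sketch).

    Keys a user strikes less accurately (higher miss rate, e.g. a
    pinky stretch to "q") get proportionally larger active areas,
    up to `max_radius`; accurately struck keys keep the base size.
    The miss rates and radii are illustrative assumptions.
    """
    radii = {}
    for key, miss_rate in miss_rates.items():
        radius = base_radius + (max_radius - base_radius) * miss_rate
        radii[key] = min(radius, max_radius)
    return radii

print(key_active_radii({"q": 0.5, "a": 0.1}))  # {'q': 30.0, 'a': 22.0}
```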
- a visual keyboard element can be placed but need not be.
- the visual keyboard shape and locations can be constructed to parallel how a user typically types or to help tighten up problem areas where a user often makes mistakes.
- a real time predictive spelling can be turned on.
- Such feature involves making a prediction as to the next letter that will be typed (based on spelling and grammar rules) and which letters will not be typed.
- one of such keys can be made to favor another. For example, when ten (10) percent of the time a user hits an upper region associated with their typical letter “a” strokes and a lower region associated with their typical letter “q,” then “a” could get favored treatment due to perhaps a most likely spelling prediction. Subsequent keys might change the prediction resulting in a swapping event from “a” to “q” of course. Thus, to avoid confusion, several modes of operation might be selected.
- In a first mode, both letters can be presented before one is finalized based on subsequent typing entry and spelling and grammar considerations for an entire word or word sequence which either verifies or conflicts with a prediction.
- In a second mode, the most likely letter is presented and visually swapped if it proves incorrect.
- In a third mode, both letter options are withheld until enough further letters are received to converge on one or the other. This applies to more than two possibilities as well based on their nearness in physical keyboard layout.
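The prediction-based favoring of one candidate key over another can be sketched as follows; the dictionary-continuation heuristic is a stand-in assumption for a full spelling and grammar model:

```python
def resolve_keystroke(candidates, prefix, dictionary):
    """Resolve an ambiguous keystroke with spelling prediction (sketch).

    `candidates` are the letters whose active areas the strike fell
    between (e.g. ["a", "q"]); the one that extends `prefix` toward
    the most dictionary words is favored, mirroring the "a" vs. "q"
    example above. A tiny word list stands in for a real spelling
    and grammar model.
    """
    def continuations(letter):
        extended = prefix + letter
        return sum(1 for word in dictionary if word.startswith(extended))
    return max(candidates, key=continuations)

words = ["queen", "quick", "apple", "ample", "arrow"]
print(resolve_keystroke(["a", "q"], "", words))   # 'a' (3 continuations vs 2)
print(resolve_keystroke(["u", "i"], "q", words))  # 'u' (only "qu" words exist)
```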
- Each finger contact area for a given key is represented by statistical variation data for each particular user.
- Such statistics and associated predictions, resizing and relative locations can be extended to three dimensions and can be associated with any element of each hands digits (e.g., joints).
- a particular user's hand motion associated with a pinky-stretch for a “q” might be much more important than the landing spot.
Description
- The present application claims the benefit of U.S. Provisional Patent Appl. No. 61/549,495, filed Oct. 20, 2011, which is incorporated herein by reference in its entirety.
- 1. Field of Disclosure
- The present disclosure relates generally to a proximity screen display for use in a communication device, and more specifically to integration of integrated imaging elements within the proximity screen display for use in adjusting various parameters of video data and/or image data that is being displayed by a display area of the proximity screen display.
- 2. Related Art
- A conventional communication device includes a conventional touch screen display, such as resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition touch screen display to provide some examples, which operates as an interface between the communication device and a user of the communication device. The conventional touch screen display operates as an output device for displaying image and/or video data for the user. Additionally, the conventional touch screen display operates as an input device for receiving command data, control data, and/or other data from the user of the communication device. Most often, the conventional touch screen display includes an integrated virtual keyboard, also referred to as an on-screen integrated virtual keyboard, for receiving the command data, control data, and/or other data from the user of the communication device.
- The continued evolution of silicon semiconductor fabrication technologies has reduced a size of the conventional communication device as well as a size of the conventional touch screen display and its integrated virtual keyboard. As a result, the alphanumeric keys of the integrated virtual keyboard have also decreased in size, thereby making use of the integrated virtual keyboard more difficult. For example, users with larger hands can have difficulty in selecting from among the alphanumeric keys, leading to erroneous keys being selected. As another example, the conventional communication device is often not properly oriented when the integrated virtual keyboard is in use, leading to erroneous keys being selected. In this other example, a user of the conventional communication device oftentimes holds the conventional communication device at an angle, which leads to erroneous keys being selected.
- Manufacturers have developed various error correction techniques within the conventional communication device to correct for errors that result from erroneous keys being selected. These error correction techniques conventionally include sophisticated error correction algorithms to interpolate a word or a phrase from a word or a phrase having errors that was entered on the integrated virtual keyboard.
- Embodiments of the disclosure are described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the leftmost digit(s) of a reference number identifies the drawing in which the reference number first appears.
-
FIG. 1 illustrates a block diagram of an exemplary communication device according to an exemplary embodiment of the present disclosure; -
FIG. 2A illustrates a first block diagram of an exemplary configuration and arrangement of perimeter imaging elements surrounding a screen to form a proximity screen display of the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 2B illustrates a second block diagram of an exemplary configuration and arrangement of integrated imaging elements within the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 3A illustrates a first exemplary integration of the imaging elements within a screen of the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 3B illustrates a second exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 3C illustrates a third exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 4 illustrates an integrated imaging element within the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 5A illustrates a single pixel element of a flexible organic light-emitting diode screen display according to an exemplary embodiment of the present disclosure; -
FIG. 5B illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure; -
FIG. 6A illustrates a single pixel element of an electronic paper, e-paper, or electronic ink screen display according to an exemplary embodiment of the present disclosure; -
FIG. 6B illustrates a single pixel element of an electronic paper, e-paper, or electronic ink proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure; -
FIG. 7A illustrates a single pixel element of a liquid crystal screen display according to an exemplary embodiment of the present disclosure; -
FIG. 7B illustrates a single pixel element of a liquid crystal proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure; -
FIG. 8 illustrates an exemplary proximity screen display and proximity screen display interface that can be implemented within the communication device according to an exemplary embodiment of the present disclosure; -
FIG. 9 is a first flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure; and -
FIG. 10 is a second flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure. - The disclosure will now be described with reference to the accompanying drawings. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the reference number.
- The following Detailed Description refers to accompanying drawings to illustrate exemplary embodiments consistent with the disclosure. References in the Detailed Description to “one exemplary embodiment,” “an exemplary embodiment,” “an example exemplary embodiment,” etc., indicate that the exemplary embodiment described can include a particular feature, structure, or characteristic, but every exemplary embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same exemplary embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an exemplary embodiment, it is within the knowledge of those skilled in the relevant art(s) to effect such feature, structure, or characteristic in connection with other exemplary embodiments whether or not explicitly described.
- The exemplary embodiments described herein are provided for illustrative purposes, and are not limiting. Other exemplary embodiments are possible, and modifications can be made to the exemplary embodiments within the spirit and scope of the disclosure. Therefore, the Detailed Description is not meant to limit the disclosure. Rather, the scope of the disclosure is defined only in accordance with the following claims and their equivalents.
- Embodiments of the disclosure can be implemented in hardware, firmware, software, or any combination thereof. Embodiments of the disclosure can also be implemented as instructions stored on a machine-readable medium, which can be read and executed by one or more processors. A machine-readable medium can include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium can include non-transitory machine-readable mediums such as read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; and others. As another example, the machine-readable medium can include transitory machine-readable mediums such as electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Further, firmware, software, routines, and instructions can be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.
- The following Detailed Description of the exemplary embodiments will so fully reveal the general nature of the disclosure that others can, by applying knowledge of those skilled in relevant art(s), readily modify and/or adapt for various applications such exemplary embodiments, without undue experimentation, without departing from the spirit and scope of the disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the exemplary embodiments based upon the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in relevant art(s) in light of the teachings herein.
- For purposes of this discussion, the term “module” shall be understood to include at least one of software, firmware, and hardware (such as one or more circuits, microchips, or devices, or any combination thereof), and any combination thereof. In addition, it will be understood that each module can include one, or more than one, component within an actual device, and each component that forms a part of the described module can function either cooperatively or independently of any other component forming a part of the module. Conversely, multiple modules described herein can represent a single component within an actual device. Further, components within a module can be in a single device or distributed among multiple devices in a wired or wireless manner.
- The following Detailed Description describes a communication device having a proximity screen display that includes one or more imaging elements that are configured and arranged around a periphery of a display area of the proximity screen display and/or integrated within the display area. The one or more integrated imaging elements are configured and arranged to sense light in their field of view. The one or more integrated imaging elements provide various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view. The communication device can adjust various parameters of video data and/or image data that is being displayed by the display area in response to the various sensing signals.
-
FIG. 1 illustrates a block diagram of an exemplary communication device according to an exemplary embodiment of the present disclosure. A communication device 100 communicates information, such as audio data, video data, image data, command data, control data and/or other data to provide some examples, between a near-end user and a far-end user over various wired and/or wireless communication networks. The communication device 100 can represent a mobile communication device, such as a cellular phone or a smartphone, a mobile computing device, such as a tablet computer or a laptop computer, or any other electronic device that is capable of communicating information over communication networks that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. The communication device 100 can communicate information that is received from the far-end user, as well as information that is generated by the communication device 100, to the near-end user using a proximity screen display. Additionally, the near-end user can communicate information to the far-end user, as well as information to the communication device 100, using the proximity screen display. The communication device 100 includes a communication module 102, a host processor 104, a proximity screen display 106, a proximity screen display interface 108, and a communication interface 110. - The
communication module 102 can include a Bluetooth module, a Global Positioning System (GPS) module, a cellular module, a wireless local area network (WLAN) module, a near field communication (NFC) module, a radio frequency identification (RFID) module and/or a wireless power transfer (WPT) module. The Bluetooth module, the cellular module, the WLAN module, the NFC module, and the RFID module provide wireless communication between the communication device 100 and other Bluetooth, other cellular, other WLAN, other NFC, and other RFID capable communication devices, respectively, in accordance with various communication standards or protocols. These various communication standards or protocols can include various cellular communication standards such as a third Generation Partnership Project (3GPP) Long Term Evolution (LTE) communications standard, a fourth generation (4G) mobile communications standard, or a third generation (3G) mobile communications standard, various networking protocols such as a Worldwide Interoperability for Microwave Access (WiMAX) communications standard or a Wi-Fi communications standard, and various NFC/RFID communications protocols such as ISO 1422, ISO/IEC 14443, ISO/IEC 15693, ISO/IEC 18000, or FeliCa to provide some examples. The GPS module receives various signals from various satellites to determine location information for the communication device 100. The WPT module supports wireless transmission of power between the communication device 100 and another WPT capable communication device. - The
host processor 104 controls overall operation and/or configuration of the communication device 100. The host processor 104 can receive and/or process information from a user interface such as an alphanumeric keypad, a microphone, a mouse, a speaker, and/or from other electrical devices or host devices that are coupled to the communication device 100. The host processor 104 can provide this information to the communication module 102 and/or the proximity screen display interface 108. Additionally, the host processor 104 can receive and/or process information from the communication module 102 and/or the proximity screen display interface 108. The host processor 104 can provide this information to the user interface, to other electrical devices or host devices, and/or to the communication module 102 and/or the proximity screen display interface 108. Further, the host processor 104 can execute one or more applications such as Short Message Service (SMS) for text messaging, electronic mailing, and/or audio and/or video recording to provide some examples, and/or software applications such as a calendar and/or a phone book to provide some examples. - The
proximity screen display 106 represents an electronic visual display that can detect a presence and/or a location of a touch that is proximate to its display area. The proximity screen display 106 includes a display area to provide information from the proximity screen display interface 108 to the near-end user. Additionally, the proximity screen display 106 includes one or more integrated imaging elements that are integrated within and/or proximate to the display area to provide information from the near-end user to the proximity screen display interface 108. The one or more integrated imaging elements are configured and arranged to detect a presence and/or a location of a touch from the near-end user. The touch can represent a physical touching of the display area by the near-end user and/or by other passive objects available to the near-end user, such as a stylus to provide an example, or proximity of the near-end user and/or the other passive objects to the display area. - The proximity
screen display interface 108 can be configured via user setup and/or via a software programming interface (API) to respond to certain objects and classes of objects, e.g., a finger, fingers, hands, stylus, pencil, etc. Via user setup, any object within a user's grasp can be held in front of the proximity screen display 106 for imaging and thereafter can be used as the primary means of interacting through the proximity screen display interface 108. This setup can be for all applications and operating system interactions on the communication device 100, or it can apply to a single application or to only the operating system. Similarly, a user wearing gloves can easily train the proximity screen display interface 108 to recognize certain gloved hand and finger movements as the input via a similar setup process. Thereafter, any time the proximity screen display interface 108 recognizes a trained or default input element, a pop-up window can appear on the proximity screen display 106 to prompt the user to accept an automatic input device setup, which can be made without having to retrain. - Moreover, although full contact of a finger, hand, gloved hand, stylus, or other input object with the surface of the proximity
screen display interface 108 may be required in some configurations, it need not be. Through setup and the software programming interface, a particular input element can be defined to have a working range. For example, full contact can be required for a pointer finger arrangement in one application. In another application, and for perhaps a stylus, passing within two (2) centimeters of the screen surface within a defined speed range can be characterized as contact. Similarly, a double click tapping motion carried out within ten (10) centimeters of the screen surface, even without actual contact, can be recognized and applied as a full contact double click.
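The working-range behavior described above lends itself to a small software sketch. The following is illustrative only and not part of the disclosure; the element names, the 2 cm stylus distance, and the speed window are assumptions chosen to mirror the examples in the preceding paragraph:

```python
from dataclasses import dataclass

@dataclass
class WorkingRange:
    """Hypothetical per-input-element working range (illustrative names)."""
    max_distance_cm: float               # how far from the screen still counts as contact
    min_speed_cm_s: float = 0.0          # optional speed window for the approach
    max_speed_cm_s: float = float("inf")

# Illustrative configuration: a pointer finger requires full contact,
# while a stylus registers "contact" within 2 cm inside a speed window.
RANGES = {
    "finger": WorkingRange(max_distance_cm=0.0),
    "stylus": WorkingRange(max_distance_cm=2.0, min_speed_cm_s=1.0, max_speed_cm_s=50.0),
}

def is_contact(element: str, distance_cm: float, speed_cm_s: float) -> bool:
    """Classify a proximity event as contact per the element's working range."""
    r = RANGES[element]
    return (distance_cm <= r.max_distance_cm
            and r.min_speed_cm_s <= speed_cm_s <= r.max_speed_cm_s)
```

Under this sketch, a stylus pass at 1.5 cm within the speed window counts as contact, while the same motion at 3 cm does not.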
- For example, conventional touch screens typically found in tablet and smartphone devices do not support touch typing. The proximity
screen display interface 108 supports such interactions by “looking” at the fingers and finger motions of one or both sets of fingers being placed on the screen surface. Such looking involves repeated capture of images to form a sequence of images which together comprise video. By analyzing the sequence in real time, touch typing motions can be recognized such as (i) movement of the fingers away from and toward the screen (change in size indicating distance away), (ii) changes in finger shape (which indicate a pressure associated with a keystroke), (iii) lighting characteristics and shading, (iv) movement velocities/accelerations, (v) relocation, and (vi) rates of change of all the above. - This touch typing mode can be activated when a user decides to begin typing by merely bringing their finger set in a typing configuration toward the screen. By the time the fingers reach the screen, a keypad will be configured to fit appropriately thereunder without forcing the user to find finger to key placement. Thus, a user can begin typing without looking at the keys to make sure the fingers are maintaining their alignment. Instead, the keys will automatically be aligned, sized and positioned to fit the user. With this configuration, small or large hands with a more natural finger positioning can be easily accommodated. The touch typing mode may also be activated by a software application, user interaction with a field that requires typing input, or by any other gesture input (whether full contact or not) and via voice recognized commands.
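Item (i) above — inferring motion toward or away from the screen from change in apparent size — can be sketched as follows. The fingertip-area measure and the 10% threshold are illustrative assumptions, not values from the disclosure:

```python
def classify_finger_motion(areas):
    """Given apparent fingertip areas (in pixels) from successive captured
    frames, infer whether the finger is moving toward the screen (apparent
    size grows), away from it (size shrinks), or holding steady."""
    if len(areas) < 2:
        return "steady"
    growth = (areas[-1] - areas[0]) / areas[0]
    if growth > 0.10:        # illustrative threshold for "approaching"
        return "toward"
    if growth < -0.10:       # illustrative threshold for "retreating"
        return "away"
    return "steady"
```

The same frame-sequence analysis could be extended to the remaining items (velocity, shape change, and their rates of change) by differencing these per-frame measurements.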
- Of course, instead of touch typing, a user may select a thumb typing configuration or pointer finger input mode as an overall default or on an application by application basis. Thumb typing may be represented by one or more thumb typing modes. Some of these modes might be tailored to interact using a more traditional full contact mode with rather fixed key offerings. Other thumb typing modes may take advantage of the non-contact and user-specific-tailoring aspects of the present invention. For example, thumb typing with or without gloves, small or big thumbs, short or long thumbs, short thumb range in x, y, z directions or regions can all be taken into account in adjusting and tailoring an effective interface for a particular user. Similarly, an elevated and non-contact typing mode (i.e., a “hovering non-contact typing mode” which may recognize fingers with non-contact typing motions. A hovering contact typing mode may also be selected wherein a finger needs full contact to be recognized as a key depression. No matter what typing mode is selected though, the key recognition range and associated finger behaviors can be accounted for dynamically to support a given user and user's situation.
- Spelling and grammar tools running within underlying software (operating system or application) on the host processor 104 (or any dedicated other processing circuitry perhaps within the proximity screen display interface 108) assist in such dynamic tailoring no matter what the typing mode happens to be. For example, while in a hovering contact typing mode, spelling and grammatical mistakes can be recognized for stored finger motions. Some users often strike an “a” instead of a “q” after typing a “w.” Likewise, some users may hold their fingers in a hovering and striking position in a configuration somewhat off of a normal horizontal alignment leading to other typing errors. Other finger motions might not be identified as the finger range of motion may be impaired on yet another user. By analysis of mistakes over time as identified with spelling and grammar tools and unsolicited user corrections along with evaluation of both the strokes and hand positions associated with the mistakes and spatial characteristics of a keyboard layout, comfortable, effective and accurate typing input interface can be established for each user.
- Other types of full, partial and no-contact user interfacing can also be established through the proximity
screen display interface 108. For example, non-contact gestures such as placing all fingertips together then opening the hand to reveal a palm might be recognized to perform a function such as returning to a desktop. Mapping of such and any other gestures, whether involving some aspect of contact or not, to a particular function on the communication device 100 can be managed via default offerings and in a training based setup fashion. Such gestures can be made with any object or body part and can be married with other user input. An example of such a marriage might be a gesture plus a detected voice command (simultaneously carried out or in a sequence) used to trigger performance of a function. - Although illustrated as solely an image capture based interface, the proximity
screen display interface 108 and associated screen may be fitted with full-contact-only touch screen technology. With such a configuration, both the visual recognition aspects and full contact detection approaches can be used together in a reinforcing way, or be selected as operational alternatives wherein only one may be powered down or up per user or application software command. - There are various ways for capturing images and video sequences in proximity of the
screen display 106 such as those discussed in more detail below. No matter what approach is used, because the image and video output is not intended for consumption by a human eye, capture and storage approaches may follow an entirely different set of goals and requirements. For example, recognizing a finger and finger position accurately may not require full HD (High Definition) color, contrast and resolution. Similarly, the image capture rate need not be fixed nor occur at typical film frame rates. Instead, for each particular design embodiment, the goals are much simpler and can be serviced with perhaps lower cost image processing elements, although more complexity and greater demand requirements could still be met (especially when high speed gesture and motion detection is needed). Likewise, storage may take the form of “stick-man” representations of body parts or other input elements. For example, in a compressed form, fingers and hands might be replaced with a data structure representing each joint in each hand along with related motion and position data. Within such or another data structure, fingertip contact might be represented by an estimated contact time, pressure and duration. Thus, by best-fit correlation of hand motion data to a target set of known motions or gestures, a user's input behaviors can be identified. - During training and when attempting to correct for identified mistakes, such data can be characterized for a particular user as matching a particular input target. At perhaps ten (10) frames per second capture in grey-scale, and with a conversion to a data structure of a pivot point, length between pivot points (e.g., joints), and so on, capture of user input, whether or not involving full contact, may be sufficient for some embodiments. If, however, the image capture quality of the
proximity screen display 106 is high enough, the full screen or portions thereof may be used for capturing images and video that are sufficient for consumption by the human eye. In such embodiments, the imaging capabilities of the screen serve dual purposes (imaging for the eye and imaging for user input interfacing). - The one or more integrated imaging elements are configured and arranged to sense light in their field of view. The one or more integrated imaging elements provide one or more various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view to the proximity
screen display interface 108. For example, the sensing signals have a first magnitude when the one or more integrated imaging elements are exposed to a bright light, a second magnitude when the one or more integrated imaging elements are exposed to a dark light, and a third magnitude that varies between the first magnitude and the second magnitude as the amount of light sensed by the one or more integrated imaging elements varies between the bright light and the dark light. - The proximity
screen display interface 108 communicates information between the communication module 102, the host processor 104, and the proximity screen display 106. The proximity screen display interface 108 provides various control signals to the proximity screen display 106 for configuration of its display area to display the information. For example, the proximity screen display interface 108 can provide various control signals to the proximity screen display 106 for configuration of its display area to display image data or video data received from the communication module 102 and/or the host processor 104. Additionally, the proximity screen display interface 108 can interpret the various sensing signals provided by the proximity screen display 106 to determine the presence and/or the location of the near-end user and/or the other passive objects. For example, the proximity screen display interface 108 can interpolate an image of an environment surrounding the display area from magnitudes of the various sensing signals to determine the presence and/or the location of the near-end user and/or the other passive objects. As another example, the proximity screen display interface 108 can compare various images of the environment surrounding the display area at different instances in time to determine movement of the near-end user and/or the other passive objects. Further, the proximity screen display interface 108 can recognize specific portions of the object, such as one or more fingers of a hand of the near-end user to provide an example, from one or more images of the environment surrounding the display area. The proximity screen display interface 108 can assign various control and/or command data to different specific portions of the object and provide respective control and/or command data to the communication module 102 and/or the host processor 104 when a respective specific portion of the object has been recognized.
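The image-comparison step described above — contrasting interpolated sensor images captured at different instants to locate movement — can be sketched with simple frame differencing. The grid representation and change threshold are illustrative assumptions:

```python
def detect_movement(frame_a, frame_b, threshold=0.2):
    """Compare two sensor images (2-D lists of normalized magnitudes)
    captured at different instants and return the (row, col) cells whose
    change exceeds the threshold, i.e. where an object likely moved."""
    moved = []
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                moved.append((r, c))
    return moved
```

The resulting cell coordinates give the interface a coarse location of the near-end user or passive object relative to the display area.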
Yet further, the proximity screen display interface 108 can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the information provided by the communication module 102 and/or the host processor 104 in response to the various sensing signals provided by the proximity screen display 106. For example, the proximity screen display interface 108 can adjust the zoom, resolution, pitch, roll, and/or yaw of image data, such as an image of an integrated virtual keyboard to provide an example, or video data to align the image data or video data with the movement of the near-end user and/or the other passive objects. - The
communication interface 110 routes various communications between the communications module 102, the host processor 104, and the proximity screen display interface 108. These communications can include various digital signals, such as one or more commands and/or data to provide some examples, various analog signals, such as direct current (DC) currents and/or voltages to provide some examples, or any combination thereof. The communication interface 110 can be implemented as a series of wired and/or wireless interconnections between the communications module 102, the host processor 104, and the proximity screen display interface 108. The interconnections of the communication interface 110 can be arranged to form a parallel interface to route communications between the communications module 102, the host processor 104, and the proximity screen display interface 108 in parallel, a serial interface to route these communications serially, or any combination thereof. - As discussed above, the one or more integrated imaging elements sense changes in light resulting from the movement of the near-end user and/or the other passive objects in their field of view. The exemplary configurations and arrangements of the one or more integrated imaging elements to be discussed below are for illustrative purposes only. Those skilled in the relevant art(s) will recognize that other configurations and arrangements of the one or more integrated imaging elements are possible without departing from the spirit and scope of the present disclosure.
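The light-dependent signal magnitudes described earlier (a first magnitude in bright light, a second in the dark, and values in between) can be sketched as a simple normalization; the lux endpoints below are illustrative assumptions, not values from the disclosure:

```python
def sensing_signal(lux, dark_lux=10.0, bright_lux=1000.0):
    """Map sensed light to a normalized signal magnitude: 0.0 at or below
    the dark level, 1.0 at or above the bright level, and a proportional
    value as the sensed light varies in between."""
    if lux <= dark_lux:
        return 0.0
    if lux >= bright_lux:
        return 1.0
    return (lux - dark_lux) / (bright_lux - dark_lux)
```

An object shadowing an imaging element drives its signal toward the dark endpoint, which is what lets movement be read from changes in these magnitudes.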
- The various proximity screen displays and associated assemblies described herein can include any suitable number of the integrated imaging elements ranging from a single integrated imaging element up to many thousands of integrated imaging elements. However, even larger numbers of integrated imaging elements are possible as will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. For example, one or more imaging elements can be positioned outside the perimeter of the screen surface. Other embodiments may use imaging elements integrated into the screen itself, while yet others position the imaging elements on top of or behind the screen. No matter what the configuration of a particular embodiment, the resulting proximity screen display and associated processing hardware and software act in concert to provide user output and user input interfacing in a common, spatially mapped interface arrangement.
-
FIG. 2A illustrates a first block diagram of an exemplary configuration and arrangement of perimeter imaging elements surrounding a screen to form a proximity screen display of the communication device according to an exemplary embodiment of the present disclosure. A proximity screen display 200 includes integrated imaging elements 202.1 through 202.i that are configured and arranged around a periphery of a display area 204. Although the integrated imaging elements 202.1 through 202.i are illustrated as being configured and arranged around the periphery of the display area 204 in a uniform manner, this is for illustrative purposes only. Those skilled in the relevant art(s) will recognize that the integrated imaging elements 202.1 through 202.i can be configured and arranged around the periphery of the display area 204 in any suitable manner without departing from the spirit and scope of the present disclosure. The proximity screen display 200 can represent an exemplary embodiment of the proximity screen display 106. - As shown in
FIG. 2A, the integrated imaging elements 202.1 through 202.i can be configured and arranged around the periphery of the display area 204 in a uniform manner to form rows 206 and/or columns 208 of the integrated imaging elements 202.1 through 202.i. The integrated imaging elements 202.1 through 202.i and the display area 204 can be formed onto a common chip or die or separate chips or dies. Additionally, the integrated imaging elements 202.1 through 202.i and the display area 204 can be integrated within a mechanical housing of a communication device, such as the communication device 100. The mechanical housing can include various openings or cutouts to accommodate the integrated imaging elements 202.1 through 202.i. -
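For a uniform row-and-column arrangement like the one above, the mapping between an element's label index and its grid position is a simple calculation. The 1-based indexing mirrors the element labels (202.1, 202.2, and so on); the column count is an illustrative assumption:

```python
def element_position(index, cols):
    """Map a 1-based linear imaging-element index (as in elements
    202.1 .. 202.i) to its (row, column) in a grid with `cols` columns."""
    return divmod(index - 1, cols)

def element_index(row, col, cols):
    """Inverse mapping: (row, column) back to the 1-based element index."""
    return row * cols + col + 1
```

Such a mapping is what lets per-element sensing signals be assembled into the two-dimensional sensor images discussed earlier.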
FIG. 2B illustrates a second block diagram of an exemplary configuration and arrangement of integrated imaging elements within the communication device according to an exemplary embodiment of the present disclosure. A proximity screen display 210 includes integrated imaging elements 212.1 through 212.i that are configured and arranged in a uniform manner to form a matrix that is integrated within a display area 214. As shown in FIG. 2B, the integrated imaging elements 212.1 through 212.i can be configured and arranged in rows 216 and/or columns 218 to form the matrix that is integrated within the display area 214. The matrix shown in FIG. 2B is merely illustrative; those skilled in the relevant art(s) will recognize that the matrix can be increased or reduced depending on the design goals of a particular embodiment. Moreover, the array size may bear no relationship, or a full relationship, to the underlying pixel array size and layout. The proximity screen display 210 can represent an exemplary embodiment of the proximity screen display 106. - That is, other configurations and arrangements of the
proximity screen display 106 are possible without departing from the spirit and scope of the present disclosure. For example, some configurations and arrangements of the proximity screen display 106 can include a first set of integrated imaging elements, similar to the integrated imaging elements 202.1 through 202.i, around the periphery of its display area and a second set of integrated imaging elements, similar to the integrated imaging elements 212.1 through 212.i, integrated within the display area. As another example, some configurations and arrangements of the proximity screen display 106 can include integrated imaging elements, similar to the integrated imaging elements 202.1 through 202.i and/or the integrated imaging elements 212.1 through 212.i, which are configured and arranged around the periphery or integrated within the matrix, respectively, in a non-uniform manner. In this example, the rows and/or the columns of these integrated imaging elements can include different numbers of integrated imaging elements. As a further example, some configurations and arrangements of the proximity screen display 106 can include integrated imaging elements, similar to the integrated imaging elements 202.1 through 202.i and/or the integrated imaging elements 212.1 through 212.i, that are configured and arranged to form any geometric shape around the periphery of the display area and/or integrated within the display area. - One or more integrated imaging elements, such as the integrated imaging elements 202.1 through 202.i or the integrated imaging elements 212.1 through 212.i to provide an example, can be integrated within a proximity screen display, such as the
proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples, of a communication device, such as the communication device 100 to provide an example. The one or more integrated imaging elements can be formed onto and/or within the proximity screen display around the periphery of a display area in a similar manner as the display area 204 and/or integrated within the display area in a similar manner as the display area 214. -
FIG. 3A illustrates a first exemplary integration of the imaging elements within a screen of the communication device according to an exemplary embodiment of the present disclosure. One or more integrated imaging elements 302.1 through 302.k can be formed onto a substrate 304 of a proximity screen display 300. The proximity screen display 300 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples. As such, the integrated imaging elements 302.1 through 302.k can represent an exemplary embodiment of the imaging elements 202.1 through 202.i or the imaging elements 212.1 through 212.i. - As shown in
FIG. 3A, the proximity screen display 300 can be formed onto a single substrate or multiple substrates that are communicatively coupled to each other. The substrate 304 can represent a portion of the single substrate, one of the multiple substrates, or a portion of one of the multiple substrates. The substrate 304 can be integrated within a mechanical housing 306 of a communication device, such as the communication device 100 to provide an example. The mechanical housing 306 can include various openings 308.1 through 308.k to accommodate the integrated imaging elements 302.1 through 302.k. The openings 308.1 through 308.k can be physical holes or simply areas of transparent material through which imaging can be conducted. The integrated imaging elements 302.1 through 302.k may comprise single-photodetector imagers or more complex photodetector arrays with associated lensing, depending on the particular embodiment. Although shown as having a substantially three-dimensional shape, the integrated imaging elements 302.1 through 302.k can instead take on the structure, shape, and architecture of a mostly flat imaging structure. -
FIG. 3B illustrates a second exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure. Similar to the proximity screen display 300, a proximity screen display 310 includes the one or more integrated imaging elements 302.1 through 302.k that are formed onto the substrate 304. However, as an alternative to the mechanical housing 306, a protective coating of transparent material 312 can be placed onto the substrate 304 of the communication device to protect the integrated imaging elements 302.1 through 302.k. -
FIG. 3C illustrates a third exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure. One or more integrated imaging elements 316.1 through 316.k can be integrated within one or more integrated circuit layers 318.1 through 318.m of a substrate 320 of a proximity screen display 314. The proximity screen display 314 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples. As such, the integrated imaging elements 316.1 through 316.k can represent an exemplary embodiment of the imaging elements 202.1 through 202.i or the imaging elements 212.1 through 212.i. - As shown in
FIG. 3C, the proximity screen display 314 can be formed onto a single substrate or multiple substrates that are communicatively coupled to each other. The substrate 320 can represent a portion of the single substrate, one of the multiple substrates, or a portion of one of the multiple substrates. The one or more integrated imaging elements 316.1 through 316.k can be integrated within the one or more integrated circuit layers 318.1 through 318.m of the substrate 320. Typically, the one or more integrated imaging elements 316.1 through 316.k are formed onto an integrated circuit layer 318.1 that represents a substrate of semiconductor material, which is often flexible. The one or more integrated imaging elements 316.1 through 316.k, as well as a display area of the proximity screen display 314, are formed using various integrated circuit layers between the integrated circuit layers 318.1 and 318.m. Optionally, a flexible transparent cover can be formed onto the one or more integrated imaging elements 316.1 through 316.k as the integrated circuit layer 318.m. - As discussed above, the one or more integrated imaging elements sense changes in light resulting from the movement of the near-end user and/or other passive objects in their field of view. Typically, the one or more integrated imaging elements are implemented using various photosensor and/or photodetector devices to provide an example, which convert energy of the light into electrical energy, such as current or voltage, by a photovoltaic effect. The photosensor and/or photodetector devices can include active pixel element sensors, light emitting diodes, optical detectors, photoresistors, photovoltaic cells, photodiodes, phototransistors, and/or other devices that are capable of converting the energy of the light into the electrical energy, as will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. 
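Since the integrated imaging elements report sensed light levels as sensing signals, movement can be inferred by comparing successive readings from the same elements. The following is a minimal Python sketch of that idea, not the disclosure's own method; the normalized light levels, the change threshold, and the element count needed to report movement are all illustrative assumptions:

```python
def detect_motion(prev_frame, curr_frame, threshold=0.1):
    """Flag imaging elements whose sensed light level changed by more
    than `threshold` between two successive readings.

    Frames are flat lists of normalized light levels (0.0-1.0); the
    threshold value is an illustrative assumption."""
    return [abs(c - p) > threshold for p, c in zip(prev_frame, curr_frame)]

def movement_detected(prev_frame, curr_frame, threshold=0.1, min_elements=1):
    """Report movement when at least `min_elements` imaging elements
    registered a significant change in light level."""
    return sum(detect_motion(prev_frame, curr_frame, threshold)) >= min_elements
```

In practice the processing hardware could also weight which elements changed, since the elements are spatially mapped to positions around or within the display area.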
These various photosensor and/or photodetector devices can be integrated around the periphery of a display area of a proximity screen display, such as the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples, in a similar manner as the imaging elements 202.1 through 202.i, and/or integrated within the display area in a similar manner as the integrated imaging elements 212.1 through 212.i. -
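The photovoltaic conversion performed by these devices can be approximated numerically: the photon arrival rate equals the optical power divided by the photon energy hc/λ, and each absorbed photon contributes at most one electron of charge. The Python sketch below illustrates this standard relation; the wavelength, power, and quantum efficiency values used in testing are illustrative assumptions, not device parameters from the disclosure:

```python
# Physical constants (SI units).
Q = 1.602176634e-19   # elementary charge, C
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s

def photocurrent(optical_power_w, wavelength_m, quantum_efficiency):
    """Estimate the photocurrent produced by a photodetector element.

    Photon arrival rate = optical power / (h*c / wavelength); each
    photon yields one electron with probability `quantum_efficiency`."""
    photon_energy = H * C / wavelength_m          # energy per photon, J
    photon_rate = optical_power_w / photon_energy # photons per second
    return quantum_efficiency * Q * photon_rate   # amperes
```

The resulting current scales linearly with the incident optical power, which is why the sensing signal is indicative of the intensity of the photons striking the device.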
FIG. 4 illustrates an integrated imaging element within the communication device according to an exemplary embodiment of the present disclosure. As shown in FIG. 4, the integrated imaging element is implemented using a photovoltaic device 400. When photons of sufficient energy strike the photovoltaic device 400, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. These negatively charged electrons move toward a cathode of the photovoltaic device 400 and/or the positively charged electron holes move toward an anode of the photovoltaic device 400, producing a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106. - As shown in
FIG. 4, the photovoltaic device 400 includes a transparent conducting cathode layer 402, an optional buffer layer 404, a donor-acceptor layer 406, and a transparent conducting anode layer 408 that are configured and arranged as a planar heterojunction onto a transparent substrate 410, although a bulk heterojunction (BHJ) or an ordered heterojunction (OHJ) can also be used. The transparent conducting cathode layer 402 represents a cathode of the photovoltaic device 400 that attracts negatively charged electrons from the optional buffer layer 404 and/or the donor-acceptor layer 406 when the photons of sufficient energy strike the photovoltaic device 400. The attraction of the negatively charged electrons to the transparent conducting cathode layer 402 can produce the current and/or the voltage, which is indicative of an intensity of the photons striking the photovoltaic device 400. - The
optional buffer layer 404 is an intrinsic semiconductor layer, also called an undoped or i-type semiconductor layer, which represents a semiconductor layer without any significant impurity atoms. In some situations, the optional buffer layer 404 can be implemented with the donor-acceptor layer 406 to form a p-type/intrinsic/n-type (PIN) semiconductor structure. In these situations, the negatively charged electrons and/or the positively charged electron holes from the donor-acceptor layer 406 accumulate within the optional buffer layer 404 when the photons of sufficient energy strike the photovoltaic device 400. A current can flow between the transparent conducting cathode layer 402 and the transparent conducting anode layer 408 when a sufficient number of the negatively charged electrons and/or the positively charged electron holes have accumulated in the optional buffer layer 404. - The donor-
acceptor layer 406 can be doped with impurity atoms of an acceptor type, such as boron or aluminum to provide some examples, that are capable of accepting an electron, and/or doped with impurity atoms of a donor type, such as phosphorus, arsenic, or antimony to provide some examples, that are capable of donating an electron. In some situations, a first portion of the donor-acceptor layer 406 is doped with the impurity atoms of the acceptor type and a second portion of the donor-acceptor layer 406 is doped with the impurity atoms of the donor type to form a p-n junction. The donor-acceptor layer 406 provides negatively charged electrons to the transparent conducting cathode layer 402 and/or positively charged electron holes to the transparent conducting anode layer 408 when the photons of sufficient energy strike the photovoltaic device 400, causing a current to flow between the transparent conducting cathode layer 402 and the transparent conducting anode layer 408. - The transparent
conducting anode layer 408 represents an anode of the photovoltaic device 400 that attracts positively charged electron holes from the optional buffer layer 404 and/or the donor-acceptor layer 406 when the photons of sufficient energy strike the photovoltaic device 400. The attraction of the positively charged electron holes to the transparent conducting anode layer 408 can produce the current and/or the voltage, which is indicative of an intensity of the photons striking the photovoltaic device 400. - As discussed above, an integrated imaging element, such as one of the integrated imaging elements 212.1 through 212.i, one of the integrated imaging elements 316.1 through 316.k, or the
photovoltaic device 400 to provide some examples, can be integrated within a display area, such as the display area 214 to provide an example, of a proximity screen display, such as the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples. These various proximity screen displays can be placed in a side-by-side configuration with various imaging elements, such as one or more of the photovoltaic device 400 to provide an example, to form the proximity screen display 210. Optionally, a transparent flexible cover can be placed onto the proximity screen display 210 to cover the various proximity screen displays and the various imaging elements. For example, 140 10K-imaging elements can be placed in a middle of a full high-definition array of pixel elements of these proximity screen displays to form the proximity screen display 210. As another example, arrays of imaging elements can be interdigitated with arrays of pixel elements of these proximity screen displays to form a checkerboard-like configuration. Other ratios of imaging elements to pixel elements, and of imaging element arrays to pixel elements, can be selected depending on the goals of the specific design embodiment. - Alternatively, or in addition, various imaging elements, such as one or more of the
photovoltaic device 400 to provide an example, can be placed in an on-top configuration with the various proximity screen displays. In this configuration, the various imaging elements are placed on top of the various proximity screen displays. In some situations, various layers of the imaging elements and the proximity screen displays can be shared, such as a flexible transparent cover and a transparent substrate. - The discussion to follow describes various proximity screen displays, such as an organic light-emitting diode proximity screen display; an electronic paper, e-paper, or electronic ink proximity screen display; or a liquid crystal display to provide some examples. The discussion then describes integration of the imaging element within these various proximity screen displays to form exemplary integrations of the integrated imaging elements.
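The checkerboard-like interdigitation of imaging elements with pixel elements described above can be sketched as a simple grid-generation routine. This is a minimal Python illustration; the grid dimensions and the 1:1 ratio of imaging elements to pixel elements are illustrative assumptions, since the disclosure allows other ratios:

```python
def checkerboard_layout(rows, cols):
    """Build a rows x cols grid in which 'P' marks a display pixel
    element and 'I' marks an imaging element, interdigitated in a
    checkerboard pattern as one possible side-by-side configuration."""
    return [['I' if (r + c) % 2 else 'P' for c in range(cols)]
            for r in range(rows)]
```

Other ratios could be modeled by changing the modulus test, e.g. placing one imaging element per N pixel elements.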
-
FIG. 5A illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display according to an exemplary embodiment of the present disclosure. An organic light-emitting diode proximity screen display 500 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of one or more organic compounds which emit light in response to an electric current. The one or more layers of the one or more organic compounds are positioned between two electrodes on a substrate. The organic light-emitting diode proximity screen display 500 can represent a bottom emission device that uses a transparent or semi-transparent bottom electrode to emit the light through a transparent substrate, or a top emission device that uses a transparent or semi-transparent top electrode to directly emit the light. A single pixel element of the organic light-emitting diode proximity screen display 500 includes an optional flexible transparent cover 502, a transparent conducting cathode 504, an optional electron transport layer 506, one or more organic emission layers 508, an optional hole transport layer 510, and a transparent conducting anode 512 that are formed on a flexible substrate 514, such as a substrate of polyethylene terephthalate to provide an example. - The optional flexible
transparent cover 502 represents a protective coating of transparent material, such as the transparent material 312, that can be placed onto the flexible substrate 514 to protect the transparent conducting cathode 504, the optional electron transport layer 506, the one or more organic emission layers 508, the optional hole transport layer 510, and the transparent conducting anode 512. - The
transparent conducting cathode 504 provides a current of electrons when a voltage at the transparent conducting anode 512 is positive with respect to the transparent conducting cathode 504. This current of electrons injects electrons into the lowest unoccupied molecular orbitals (LUMO) of the one or more organic emission layers 508 at the optional electron transport layer 506 and withdraws electrons from the highest occupied molecular orbitals (HOMO) of the one or more organic emission layers 508 at the optional hole transport layer 510, forming electron holes. - The optional
electron transport layer 506 can be doped with impurity atoms of a donor type, such as phosphorus, arsenic, or antimony to provide some examples, which are capable of donating an electron. The optional electron transport layer 506 provides excess carrier electrons to the one or more organic emission layers 508 as the current flows through the optional electron transport layer 506 from the transparent conducting cathode 504 to the transparent conducting anode 512. - The one or more
organic emission layers 508 provide various electrostatic forces that bring the electrons and the holes towards each other, whereupon the electrons and the holes recombine to form a bound state of the electron and hole, often referred to as an exciton. The decay of the exciton results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible region. The one or more organic emission layers 508 can include organometallic chelates, fluorescent and phosphorescent dyes, and conjugated dendrimers to provide some examples. - The optional
hole transport layer 510 can be doped with impurity atoms of an acceptor type, such as boron or aluminum to provide some examples, that are capable of accepting an electron. The optional hole transport layer 510 provides excess carrier holes to the one or more organic emission layers 508 as the current flows through the optional hole transport layer 510 from the transparent conducting cathode 504 to the transparent conducting anode 512. - The
transparent conducting anode 512 receives the current of electrons when the voltage at the transparent conducting anode 512 is positive with respect to the transparent conducting cathode 504. In some situations, the transparent conducting anode 512 can be shared with other anode electrodes of other pixel elements of the organic light-emitting diode proximity screen display 500. -
FIG. 5B illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure. An organic light-emitting diode proximity screen display 520 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of one or more organic compounds which emit light in response to an electric current. The one or more layers of the one or more organic compounds are positioned between two electrodes on a substrate. The organic light-emitting diode proximity screen display 520 also includes an integrated imaging element that is integrated within the display area. The organic light-emitting diode proximity screen display 520 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples. - A single pixel element of the organic light-emitting diode
proximity screen display 520 includes an imaging element 524 and an organic light-emitting diode touch screen pixel element 526. The imaging element 524 senses changes in light 522 resulting from the movement of the near-end user and/or other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element 524, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from a photovoltaic absorption layer 530 toward a transparent conducting cathode 528, which represents a cathode of the imaging element 524. Similarly, the positively charged electron holes move from the photovoltaic absorption layer 530 toward a shared transparent conducting electrode 532, which represents an anode of the imaging element 524. In an exemplary embodiment, the photovoltaic absorption layer 530 can be implemented in a substantially similar manner using the optional buffer layer 404 and/or the donor-acceptor layer 406. The movement of the negatively charged electrons toward the transparent conducting cathode 528 and the movement of the positively charged electron holes toward the shared transparent conducting electrode 532 produce a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106. - The organic light-emitting diode touch
screen pixel element 526 includes one or more layers of one or more organic compounds which emit light in response to an electric current. The organic light-emitting diode touch screen pixel element 526 can be implemented in a similar manner as a pixel element of the organic light-emitting diode proximity screen display 500. As such, the organic light-emitting diode touch screen pixel element 526 includes the optional electron transport layer 506, the one or more organic emission layers 508, the optional hole transport layer 510, and the transparent conducting anode 512, with the transparent conducting cathode 504 of the organic light-emitting diode proximity screen display 500 being replaced by the shared transparent conducting electrode 532. - The shared
transparent conducting electrode 532 provides a current of electrons when a voltage at the transparent conducting anode 512 is positive with respect to the shared transparent conducting electrode 532. This current of electrons injects electrons into the lowest unoccupied molecular orbitals (LUMO) of the one or more organic emission layers 508 at the optional electron transport layer 506 and withdraws electrons from the highest occupied molecular orbitals (HOMO) of the one or more organic emission layers 508 at the optional hole transport layer 510, forming electron holes. Various electrostatic forces bring the electrons and the holes towards each other in the one or more organic emission layers 508, whereupon the electrons and the holes recombine to form the exciton. The decay of the exciton results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible region. -
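The emitted radiation's place in the visible region follows from the exciton energy through the standard relation λ = hc/E. The Python sketch below illustrates this relation; the example exciton energy used in testing is an illustrative assumption, since organic emitters vary widely:

```python
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def emission_wavelength_nm(exciton_energy_ev):
    """Wavelength (nm) of the photon emitted when an exciton of the
    given energy (eV) decays: lambda = h*c / E."""
    return H * C / (exciton_energy_ev * EV) * 1e9

def is_visible(wavelength_nm):
    """Rough visible-band check (approximately 380-750 nm)."""
    return 380.0 <= wavelength_nm <= 750.0
```

An exciton energy near 2.3 eV, for instance, corresponds to green emission near the middle of the visible band.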
FIG. 6A illustrates a single pixel element of an electronic paper, e-paper, or electronic ink screen display. An electronic paper, e-paper, or electronic ink screen display 600 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes various charged pigment particles in a layer of liquid polymer that can be re-configured and re-arranged by applying various electric fields to two electrodes on a substrate. Applying a negative charge to a top electrode repels white pigment particles to a bottom of the layer of liquid polymer, forcing black pigment particles to a top of the layer of liquid polymer, which results in a black appearance. Similarly, applying a positive charge to the top electrode repels black pigment particles to the bottom of the layer of liquid polymer, forcing white pigment particles to the top of the layer of liquid polymer, which results in a white appearance. A single pixel element of the electronic paper, e-paper, or electronic ink screen display 600 includes the optional flexible transparent cover 502, a top transparent conducting electrode 602, one or more liquid polymer layers 604, and a bottom transparent conducting electrode 606 that are formed on the flexible substrate 514. - The top transparent conducting electrode 602 attracts positively charged pigment particles to a top of the one or more liquid polymer layers 604 and repels negatively charged pigment particles to a bottom of the one or more liquid polymer layers 604 when a negative charge is applied between the top transparent conducting electrode 602 and the bottom
transparent conducting electrode 606. Likewise, the top transparent conducting electrode 602 repels the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when a positive charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606. - The one or more liquid polymer layers 604 include one or more layers of various liquid polymers that suspend the positively charged pigment particles and the negatively charged pigment particles until a charge is applied between the top transparent conducting electrode 602 and the bottom
transparent conducting electrode 606. The positively charged pigment particles are black pigment particles and the negatively charged pigment particles are white pigment particles. The pixel element of the electronic paper, e-paper, or electronic ink screen display 600 will have a black appearance when the black pigment particles are attracted to the top transparent conducting electrode 602 and the white pigment particles are repelled to the bottom transparent conducting electrode 606. Likewise, the pixel element of the electronic paper, e-paper, or electronic ink screen display 600 will have a white appearance when the white pigment particles are attracted to the top transparent conducting electrode 602 and the black pigment particles are repelled to the bottom transparent conducting electrode 606. - The bottom
transparent conducting electrode 606 repels the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the negative charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606. Likewise, the bottom transparent conducting electrode 606 attracts the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and repels the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when the positive charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606. -
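The charge-to-appearance behavior described for this pixel element reduces to a small lookup: with positively charged black particles and negatively charged white particles, the polarity applied to the top electrode determines which particles are pulled to the viewing surface. A minimal Python sketch of that logic as described, not an implementation from the disclosure:

```python
def pixel_appearance(top_electrode_charge):
    """Appearance of an e-ink pixel element given the charge applied
    to the top transparent conducting electrode.

    Per the description above: a negative top electrode attracts the
    positively charged black particles to the top (black appearance);
    a positive top electrode attracts the negatively charged white
    particles to the top (white appearance)."""
    if top_electrode_charge == 'negative':
        return 'black'  # positive black particles pulled to the top
    if top_electrode_charge == 'positive':
        return 'white'  # negative white particles pulled to the top
    raise ValueError("charge must be 'positive' or 'negative'")
```

Since the particles stay suspended until a charge is applied, the pixel element retains its last appearance without power, which is the bistable property of electrophoretic displays.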
FIG. 6B illustrates a single pixel element of an electronic paper, e-paper, or electronic ink proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure. An electronic paper, e-paper, or electronic ink proximity screen display 620 includes one or more pixel elements that are configured and arranged to form a display area. The electronic paper, e-paper, or electronic ink proximity screen display 620 also includes an integrated imaging element that is integrated within the display area. The electronic paper, e-paper, or electronic ink proximity screen display 620 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples. - A single pixel element of the electronic paper, e-paper, or electronic ink
proximity screen display 620 includes an imaging element 622 and an electronic paper, e-paper, or electronic ink touch screen pixel element 624. The imaging element 622 senses changes in the light 522 resulting from the movement of the near-end user and/or other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element 622, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from the photovoltaic absorption layer 530 toward a transparent conducting shared electrode 630, which represents a cathode of the imaging element 622. Similarly, the positively charged electron holes move from the photovoltaic absorption layer 530 toward the transparent conducting anode 512, which represents an anode of the imaging element 622. The movement of the negatively charged electrons toward the transparent conducting shared electrode 630 and the movement of the positively charged electron holes toward the transparent conducting anode 512 produce a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106. - Additionally, the single pixel element of the electronic paper, e-paper, or electronic ink
proximity screen display 620 includes a first conducting element 626 and a second conducting element 628. The first conducting element 626 attracts positively charged pigment particles to a first side of the one or more liquid polymer layers 604 and repels negatively charged pigment particles to a second side of the one or more liquid polymer layers 604 when a negative charge is applied between the first conducting element 626 and the second conducting element 628. Similarly, the second conducting element 628 repels the positively charged pigment particles to the first side of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the second side of the one or more liquid polymer layers 604 when the negative charge is applied between the first conducting element 626 and the second conducting element 628. The attracting of the positively charged pigment particles to the first side and the negatively charged pigment particles to the second side allows the light 522 to pass through the one or more liquid polymer layers 604 to strike the imaging element 622. - The electronic paper, e-paper, or electronic ink
touch screen pixel element 624 includes various charged pigment particles in a layer of liquid polymer that can be re-configured and re-arranged by applying various electric fields to two electrodes on the substrate. The electronic paper, e-paper, or electronic ink touch screen pixel element 624 can be implemented in a similar manner as a pixel element of the electronic paper, e-paper, or electronic ink screen display 600. As such, the electronic paper, e-paper, or electronic ink touch screen pixel element 624 includes the top transparent conducting electrode 602 and the one or more liquid polymer layers 604, with the bottom transparent conducting electrode 606 of the electronic paper, e-paper, or electronic ink screen display 600 being replaced by the transparent conducting shared electrode 630. - The transparent conducting shared
electrode 630 repels the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the negative charge is applied between the top transparent conducting electrode 602 and the transparent conducting shared electrode 630. Likewise, the transparent conducting shared electrode 630 attracts the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and repels the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the positive charge is applied between the top transparent conducting electrode 602 and the transparent conducting shared electrode 630. -
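To make the electrode-driven pigment behavior concrete, the following sketch models a single pixel element. The function name and the simple "top"/"bottom" state model are illustrative assumptions, not part of the disclosure; the positive-charge case is simplified to a straight swap of the two pigment populations.

```python
# Illustrative sketch only: models how the charge polarity applied
# across the electrodes determines where each pigment population
# settles within one e-ink pixel element.

def pigment_positions(applied_charge):
    """Return the resting position of each pigment population for one
    pixel element, given the polarity applied across the electrodes."""
    if applied_charge == "negative":
        # positive pigments repelled to the top, negative attracted down
        return {"positive_pigment": "top", "negative_pigment": "bottom"}
    if applied_charge == "positive":
        # polarity reversed: the two pigment populations swap sides
        return {"positive_pigment": "bottom", "negative_pigment": "top"}
    raise ValueError("applied_charge must be 'positive' or 'negative'")
```

Clearing both pigment populations to the sides of the optical path is what lets light reach an imaging element behind the polymer layers.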
FIG. 7A illustrates a single pixel element of a liquid crystal screen display. A liquid crystal screen display 700 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of liquid crystal material aligned between two electrodes. Before an electric field is applied between the two electrodes, the alignment layers at the two electrodes are oriented perpendicular to each other, and so molecules of the liquid crystal material arrange themselves in a helical structure, or twist. When a voltage applied between the two electrodes is large enough, the liquid crystal molecules in a center of the layers of liquid crystal material are almost completely untwisted and the polarization of the incident light is not rotated as it passes through the liquid crystal material. This light is then polarized mainly perpendicular to a horizontal polarizing filter, and thus is blocked, and the pixel element appears black. By controlling the voltage applied between the two electrodes, light can be allowed to pass through in varying amounts, thus constituting different levels of gray. A light source, such as a backlight or a reflector to provide some examples, is often included to provide the light that passes through the pixel element. - A single pixel element of the liquid crystal
screen display 700 includes the optional flexible transparent cover 502, a vertical axis polarizing filter 702, a top transparent conducting electrode 704, one or more liquid crystal layers 706, a horizontal axis polarizing filter 708, and a bottom transparent conducting electrode 710 that are formed on the flexible substrate 514. The vertical axis polarizing filter 702 is configured to pass vertical components of the light passing through it while absorbing and/or reflecting horizontal components. The horizontal axis polarizing filter 708 is configured to pass horizontal components of the light passing through it while absorbing and/or reflecting vertical components. - The one or more
liquid crystal layers 706 contain liquid crystals that twist and untwist at varying degrees to allow light to pass through when a voltage is applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710. The liquid crystals untwist, changing their polarization, in proportion to the voltage applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710. By properly adjusting the level of the voltage applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710, almost any gray level or transmission can be achieved. -
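The voltage-to-gray-level relationship described above can be sketched as follows. This is an illustrative simplification, not the disclosed implementation: a normally-white twisted nematic response is approximated with a linear ramp between assumed threshold and saturation voltages.

```python
def transmission(voltage, v_threshold=1.0, v_saturation=3.0):
    """Fraction of incident light a normally-white pixel passes.

    Below v_threshold the helix stays fully twisted and the rotated
    polarization passes the horizontal filter (maximum transmission);
    above v_saturation the molecules are untwisted and the light is
    blocked (the pixel appears black); in between, a linear ramp
    stands in for the real electro-optic response curve.
    """
    if voltage <= v_threshold:
        return 1.0
    if voltage >= v_saturation:
        return 0.0
    return 1.0 - (voltage - v_threshold) / (v_saturation - v_threshold)
```

Sweeping the electrode voltage through the intermediate range yields the "almost any gray level" behavior noted above.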
FIG. 7B illustrates a single pixel element of a liquid crystal proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure. A liquid crystal proximity screen display 720 includes one or more pixel elements that are configured and arranged to form a display area. The liquid crystal proximity screen display 720 also includes an integrated imaging element that is integrated within the display area. The liquid crystal proximity screen display 720 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples. - A single pixel element of the liquid crystal
proximity screen display 720 includes the vertical axis polarizing filter 702, the one or more liquid crystal layers 706, the horizontal axis polarizing filter 708, the bottom transparent conducting electrode 710, and a transparent conducting shared electrode 722 that are configured and arranged on the flexible substrate 514 to form a pixel element of the liquid crystal proximity screen display 720. This pixel element of the liquid crystal proximity screen display 720 can be implemented in a similar manner as a pixel element of the liquid crystal screen display 700, with the top transparent conducting electrode 704 of the liquid crystal screen display 700 being replaced by the transparent conducting shared electrode 722. The liquid crystals of the one or more liquid crystal layers 706 twist and untwist at varying degrees to allow light to pass through when a voltage is applied between the transparent conducting shared electrode 722 and the bottom transparent conducting electrode 710. - The single pixel element of the liquid crystal
proximity screen display 720 also includes the transparent conducting cathode 528, the photovoltaic absorption layer 530, and the transparent conducting shared electrode 722 that are configured and arranged to form an imaging element. The imaging element senses changes in the light 522 resulting from the movement of the near-end user and/or the other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from the photovoltaic absorption layer 530 toward the transparent conducting shared electrode 722, which represents a cathode of the imaging element. Similarly, the positively charged electron holes move from the photovoltaic absorption layer 530 toward the transparent conducting anode 512, which represents an anode of the imaging element. The movement of the negatively charged electrons toward the transparent conducting shared electrode 722 and the movement of the positively charged electron holes toward the transparent conducting anode 512 produce a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106. -
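As a rough illustration of how the imaging element's photogenerated current and/or voltage tracks incident light, consider the following sketch. The function name, responsivity, and dark level are assumed values for illustration only, not parameters from the disclosure.

```python
def sensing_signal(photon_flux, responsivity=0.5, dark_level=0.01):
    """Map light absorbed by the photovoltaic layer to a sensing-signal
    magnitude: photons create electron/hole pairs whose drift toward
    the cathode and anode produces a current/voltage that grows with
    the amount of light in the element's field of view."""
    return dark_level + responsivity * photon_flux

bright = sensing_signal(1.0)    # bright light -> larger magnitude
dark = sensing_signal(0.05)     # dark light  -> smaller magnitude
```

A linear model like this is the simplest stand-in; a real photodiode response would saturate at high flux.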
FIG. 8 illustrates an exemplary proximity screen display and proximity screen display interface that can be implemented within the communication device according to an exemplary embodiment of the present disclosure. A proximity screen display interface 800 provides various control signals to a proximity screen display 802 for configuration of its display area to display information from a host processor, such as the host processor 104 to provide an example, and/or a communication module, such as the communication module 102 to provide an example. Additionally, the proximity screen display interface 800 can interpret various sensing signals provided by the proximity screen display 802 to determine the presence and/or the location of the near-end user and/or the other passive objects. Further, the proximity screen display interface 800 can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the information provided by the communication module and/or the host processor in response to the various sensing signals provided by the proximity screen display 802. The proximity screen display interface 800 can represent an exemplary embodiment of the proximity screen display interface 108, and the proximity screen display 802 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or any combination thereof. - The proximity
screen display interface 800 includes a touch screen controller 804, a display area driver module 806, and an integrated imaging element driver module 808. The touch screen controller 804 controls overall operation and/or configuration of the proximity screen display 802. As shown in FIG. 8, the touch screen controller 804 receives information 850 from the host processor and/or the communication module. For example, the information 850 can include video and/or image data to be displayed by the proximity screen display 802. As another example, the information 850 can include command and/or control data to control the operation and/or configuration of the proximity screen display 802. This command and/or control data can include backlight parameters, contrast parameters, brightness parameters, sharpness parameters, color parameters, tint parameters, refresh rate parameters, aspect ratio parameters, and/or resolution parameters to control the displaying of the video and/or image data within a display area of the proximity screen display 802. The touch screen controller 804 provides video and/or image data 852 to the display area driver module 806 that is to be displayed in accordance with the command and/or control data. Additionally, the command and/or control data can include sensing rate parameters or sensing scheme parameters to control the operation and/or configuration of the integrated imaging elements that are configured and arranged around a periphery of the display area and/or within the proximity screen display. The touch screen controller 804 provides imaging command and/or control 854 to the integrated imaging element driver module 808 to control the operation and/or configuration of the integrated imaging elements.
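The routing performed by the touch screen controller 804 — video and/or image data 852 to the display area driver module 806, and imaging command and/or control 854 to the integrated imaging element driver module 808 — might be sketched as below. The dictionary keys and function name are hypothetical, not part of the disclosure.

```python
def route_information(information):
    """Split the combined information (850) into display data (852)
    for the display area driver module and imaging command/control
    (854) for the integrated imaging element driver module."""
    display_data = {
        "pixels": information.get("video_or_image_data"),
        "params": {k: v for k, v in information.items()
                   if k in ("backlight", "contrast", "brightness",
                            "sharpness", "refresh_rate", "resolution")},
    }
    imaging_control = {k: v for k, v in information.items()
                       if k in ("sensing_rate", "sensing_scheme")}
    return display_data, imaging_control
```

Keeping the two control paths separate mirrors the figure's split between the display area driver and the imaging element driver.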
Further, the proximity screen display interface 800 can adjust various parameters, such as the zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the video and/or image data in response to the sensing signals 856 provided by the proximity screen display 802. - The display
area driver module 806 processes the video and/or image data 852 to provide display area control signals 860 for displaying of the video and/or image data 852 on the display area of the proximity screen display 802. Typically, the display area of the proximity screen display 802 is configured and arranged as multiple rows and/or columns of pixel elements that are configured and arranged as a matrix. Each of the pixel elements can display a pixel of red, green, blue, black, white, or any combination thereof by applying a current and/or a voltage to its respective electrodes. A first electrode of each of the pixel elements in each row is coupled to each other. Similarly, a second electrode of each of the pixel elements in each column is coupled to each other. The display area driver module 806 provides various currents and/or voltages to the rows and/or the columns of the matrix to configure the multiple rows and/or columns of pixel elements to display the video and/or image data 852. - The integrated imaging
element driver module 808 provides integrated imaging element control signals 862 to configure the integrated imaging elements to sense light in their field of view. Typically, the integrated imaging elements are controlled in a substantially opposite manner as the pixel elements. For example, when a pixel element displays its respective pixel of red, green, blue, black, white, or any combination thereof, namely active or turned "ON", its respective integrated imaging element is configured to be inactive or turned "OFF". Likewise, when an integrated imaging element is sensing the light in its field of view, namely active or turned "ON", its respective pixel element is configured to be inactive or turned "OFF". In some situations, the integrated imaging element and its respective pixel element can be duty cycled to switch between their inactive or active configurations at a sufficient rate without affecting an appearance of the video and/or image data 852 to the near-end user. In other situations, the touch screen controller 804 determines which of the pixel elements are to be inactive based upon the video and/or image data 852. In these other situations, the touch screen controller 804 provides the imaging command and/or control 854 to cause integrated imaging elements that correspond to these inactive pixel elements to be active to sense the light in their field of view. In an exemplary embodiment, the integrated imaging elements of the proximity screen display 802 are configured and arranged as multiple rows and/or columns that are configured and arranged as a matrix. - In this exemplary embodiment, a first electrode of each of the integrated imaging elements in each row is coupled to each other and a second electrode of each of the integrated imaging elements in each column is coupled to each other. In this exemplary embodiment, the integrated imaging
element driver module 808 provides various currents and/or voltages to the rows and/or the columns of the matrix to configure the multiple rows and/or columns of integrated imaging elements to sense the light in their field of view to provide the sensing signals 856. Generally, the magnitudes of the sensing signals 856 depend upon an amount of light sensed in their field of view. - The
proximity screen display 802 can include one or more pixel elements that are configured and arranged in multiple rows and/or columns to form a display area for displaying of the video and/or image data 852. The proximity screen display 802 also includes integrated imaging elements that are configured and arranged around the periphery of the display area and/or within the proximity screen display 802 to sense the light in their field of view. The proximity screen display 802 can include more, fewer, or as many integrated imaging elements as pixel elements. The integrated imaging elements provide various voltages and/or currents as the sensing signals 856 that are indicative of the light in their field of view. For example, the sensing signals 856 have a first magnitude when the one or more integrated imaging elements are exposed to a bright light, a second magnitude when the one or more integrated imaging elements are exposed to a dark light, and a third magnitude that varies between the first magnitude and the second magnitude as the amount of light sensed by the integrated imaging elements varies between the bright light and the dark light. - As discussed above, a proximity screen display interface, such as the proximity
screen display interface 108 or the proximity screen display interface 800 to provide some examples, can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of video data or image data, provided by a communication module, such as the communication module 102 to provide an example, and/or a host processor, such as the host processor 104 to provide an example. The proximity screen display interface can adjust the various parameters of the video data or the image data in response to the presence and/or the location of the touch from the near-end user in relation to a proximity screen display, such as the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or the proximity screen display 802 to provide some examples. -
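The row/column matrix drive and the pixel/imaging-element duty cycling described above can be sketched together as follows. The event tuples and the duty_ratio parameter are illustrative assumptions, not the disclosed implementation.

```python
def scan_frame(frame, duty_ratio=4):
    """Yield drive events for a frame held as rows of gray levels 0-255.

    For duty_ratio-1 of every duty_ratio time slots the pixel elements
    are active ('drive' events carrying per-column voltages); in the
    remaining slot the co-located integrated imaging elements sense
    light ('sense' events), so a pixel element and its respective
    imaging element are never 'ON' at the same time.
    """
    for t in range(duty_ratio):
        for r, row in enumerate(frame):
            if t == duty_ratio - 1:
                yield ("sense", r, None)
            else:
                # pixel elements ON: column voltage tracks gray level
                yield ("drive", r, [g / 255.0 for g in row])
```

At a high enough duty-cycle rate, the sensing slots are invisible to the viewer while still providing the sensing signals 856.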
FIG. 9 is a flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure. The disclosure is not limited to this operational description. Rather, it will be apparent to persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes the steps in FIG. 9. - At
step 950, the operational control flow receives video data, image data, command data, and/or control data for displaying the image and/or the video data onto a display area 904 of the proximity screen display 900. The proximity screen display 900 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or the proximity screen display 802 to provide some examples. However, those skilled in the relevant art(s) will recognize that the proximity screen display 900 can also be implemented using any conventional proximity screen display, such as any conventional resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition proximity screen display to provide some examples, that is capable of detecting a presence or a location of an object without departing from the spirit and scope of the present disclosure. - A proximity screen display interface, such as the proximity
screen display interface 108 or the proximity screen display interface 800 to provide some examples, receives the video data, image data, command data, and/or control data for displaying the image and/or the video data onto the display area 904 of the proximity screen display 900. - At
step 952, the operational control flow displays the image and/or the video data onto the display area 904 in accordance with the command data and/or control data. Specifically, the proximity screen display interface provides various display area control signals, such as the various display area control signals 860 to provide an example, in response to the video data, image data, command data, and/or control data to configure and arrange the display area 904 to display the video and/or image data on the display area. - At
step 954, the operational control flow detects a presence or a location of an object, such as a finger 902 of a user, which is proximate to the display area 904. Specifically, the proximity screen display 900 can include one or more integrated imaging elements that can be integrated around a periphery of the display area 904 in a similar manner as the imaging elements 202.1 through 202.i and/or integrated within the display area in a similar manner as the integrated imaging elements 212.1 through 212.i. The one or more integrated imaging elements are configured and arranged to sense light in their field of view. - The proximity screen display interface provides various integrated imaging element control signals, such as the integrated imaging element control signals 862 to provide an example, to configure the one or more integrated imaging elements to sense light in their field of view. The one or more integrated imaging elements provide various sensing signals, such as the sensing signals 856 to provide an example, whose magnitudes depend upon an amount of light sensed in their field of view at different locations within the proximity screen display. The proximity screen display interface can interpolate an image of an environment surrounding the display area from magnitudes of the various sensing signals to detect the presence or the location of the object. Additionally, the proximity screen display interface can compare various images of the environment surrounding the
display area 904 at different instances in time to determine movement of the object within the environment. - Further, the proximity screen display interface can recognize specific portions of the object, such as one or more fingers of a hand of the user, from one or more images of the environment surrounding the
display area 904. The proximity screen display interface can assign various control and/or command data to different specific portions of the object and provide respective control and/or command data when a respective specific portion of the object has been recognized. For example, the proximity screen display interface can provide control and/or command data to scroll down without hypertext jumping or zooming upon recognition of the right thumb, or control and/or command data to select hypertext links and invoke zooming upon recognition of the right index finger. - At
step 956, the operational control flow adjusts the image and/or the video data as being displayed on the display area 904 in response to detecting the presence or the location of the object. Specifically, the proximity screen display interface can adjust the various display area control signals to adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the image and/or the video data as being displayed on the display area 904. For example, the proximity screen display interface can adjust the various parameters to enlarge or to zoom into a coincidental portion 902 of the image and/or the video data, such as one or more alphanumeric keys of an integrated virtual keyboard to provide an example, as being displayed on the display area 904 that coincides with the location of the object. In this example, the coincidental portion 902 of the image and/or the video data can appear to be larger to the user. - Thereafter, the operational control flow reverts to step 952 to display the image and/or the video data.
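A minimal sketch of the detection and finger-specific command assignment described in steps 954 and 956 follows; frame differencing stands in for the image comparison, and all names and the command strings are illustrative assumptions.

```python
def motion_cells(prev_frame, curr_frame, threshold=0.2):
    """Locate integrated imaging elements whose sensed light changed
    between two instants, approximating where an object moved."""
    return [(r, c)
            for r, row in enumerate(curr_frame)
            for c, value in enumerate(row)
            if abs(value - prev_frame[r][c]) > threshold]

# assignment of command data to recognized portions of the object
FINGER_COMMANDS = {
    "right_thumb": "scroll_down",      # no hypertext jumping or zooming
    "right_index": "select_and_zoom",  # select links and invoke zooming
}

def command_for(recognized_finger):
    """Dispatch the command data assigned to a recognized finger."""
    return FINGER_COMMANDS.get(recognized_finger, "none")
```

The differenced cells give the object's location; a recognizer (not sketched here) would then classify which finger produced the motion.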
- As mentioned previously, various full or non-contact modes can be automatically identified by the
proximity screen display 900 and/or its corresponding display interface, such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples. Such modes can also be selected by a user via a setup process. One such mode involves a non-contact "click" selection using a pointer finger. Such a mode can be established as a factory default or user defined, trained and configured to cause a particular software function to trigger, such as launching a software API (application program interface) On_Click( ) type function. In particular, when a pointer finger comes within approximately fifteen (15) centimeters of the screen surface, a bordered circle of a larger size appears on the screen at a corresponding x-y location. Within this circle, underlying pixel based visual graphics can be set to a particular user selected magnification, which may be set to any degree of magnification or to 100% (or otherwise turned off). As the pointer finger moves closer, the magnification can be set to scale up or down or merely stay the same. Likewise, the circle size can be made to change or stay the same. When passing over an active input element (button, down arrow, text field or widget), the circle can be made to stabilize or to reflect free movement in the x-y-z directions. With stabilization, even a user with shaky hands can find a target input element and stay thereon long enough to recognize same and carry out perhaps a double click motion without ever touching the screen. The stabilization may involve centering and appropriate zooming along with a temporary dwell time that may be user configured or calculated for each user through training or through captured behaviors. The stabilization can be over-ridden when the pointer finger motion appears to be intentional and not within a range of a particular human's jittering. -
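The hover stabilization with intentional-motion override described above might be sketched as below; the jitter radius and the simple averaging scheme are assumptions standing in for per-user trained behavior.

```python
def stabilized_position(samples, jitter_radius=0.5):
    """Center the pointer over a target when recent (x, y) hover
    samples stay within the user's jitter range; pass the newest
    sample through unchanged when the spread exceeds jitter_radius,
    i.e. the motion looks intentional and stabilization is overridden."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    spread = max(max(xs) - min(xs), max(ys) - min(ys))
    if spread <= jitter_radius:
        return (sum(xs) / len(xs), sum(ys) / len(ys))  # stabilized
    return samples[-1]                                  # intentional move
```

A trained system would replace the fixed jitter_radius with a range captured from the particular user's behavior.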
FIG. 10 is a second flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure. The disclosure is not limited to this operational description. Rather, it will be apparent to persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes the steps in FIG. 10. - At
step 950, the operational control flow receives the video data, image data, command data, and/or control data for displaying the image and/or the video data onto a display area 1004 of the proximity screen display 1000. The proximity screen display 1000 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or the proximity screen display 802 to provide some examples. However, those skilled in the relevant art(s) will recognize that the proximity screen display 1000 can also be implemented using any conventional proximity screen display, such as any conventional resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition proximity screen display to provide some examples, that is capable of detecting a presence or a location of an object without departing from the spirit and scope of the present disclosure. - At
step 952, the operational control flow displays the image and/or the video data onto the display area 1004 in accordance with the command data and/or control data. - At
step 1050, the operational control flow detects a presence or a location of an object, such as one or more hands 1002 of a user, which is proximate to the display area 1004 in a substantially similar manner as described in step 954. - At
step 1052, the operational control flow adjusts the image and/or the video data as being displayed on the display area 1004 in response to detecting the presence or the location of the object. Specifically, the proximity screen display interface can adjust the various display area control signals to adjust the various parameters of the image and/or the video data as being displayed on the display area 1004. For example, the proximity screen display interface can adjust the various parameters to enlarge or to zoom into coincidental portions 1006 of the image and/or the video data as being displayed on the display area 1004 that coincide with various portions of the object, such as one or more fingers of the one or more hands 1002. In this example, the coincidental portions 1006 of the image and/or the video data can appear to be larger to the user. As another example, the proximity screen display interface can adjust the various parameters to adjust an orientation 1008 of the image and/or the video data as being displayed on the display area 1004 to coincide with the various portions of the object. This other example is particularly useful when the image and/or the video data correspond to an integrated virtual keyboard. This other example allows the orientation of the integrated virtual keyboard to be adjusted to coincide with the one or more fingers of the one or more hands 1002. - As discussed previously, the
proximity screen display 1000 and supporting hardware and software within the illustrated tablet device can support various typing modes, including touch typing (full finger contact with finger lifts and presses), finger hover with keystroking finger screen contact, hover without contact but with keystroke-like finger motions, and so on. Thumb typing and hunt-and-peck styles can also be selected. And even with hunt and peck, other objects can be used for the pecking (e.g., a stylus, pencil, or broom handle). Typing input can even support one-handed typing, missing digits, and obscure typing preferences and layouts. No matter what the typing configuration though, user-specific tailoring is supported through training, ongoing monitoring of typing success (spelling/grammar/corrections) and associated hand and finger positions, motions, and characterizations thereof. - To illustrate such tailoring, consider a situation wherein a user brings their hands into full contact with the tablet as illustrated. Instead of forcing a keyboard fit on the user, the
proximity screen display 1000 and supporting hardware and software respond to detecting the approach and construct a keyboard layout that fits the locations of the fingers in their natural typing readiness configuration (again as shown). This keyboard layout may change during the approach as fingers move or only change (or are created) upon contact. Thereafter, as the user types, corrects, makes mistakes and so on, a finger striking range can be identified, which may also account for keystroke sequencing. From such information, key sizing may be adjusted on a key-by-key basis. For example, the active area for the letter "q" may need to be much larger than the active area for the letter "a", and both may be ellipsoidal or other more natural finger movement related active key area shapes. From key area shapes, a visual keyboard element can be placed but need not be. The visual keyboard shape and locations can be constructed to parallel how a user typically types or to help tighten up problem areas where a user often makes mistakes. - To further correct such mistakes, real-time predictive spelling can be turned on. Such a feature involves making a prediction as to the next letter that will be typed (based on spelling and grammar rules) and which letters will not be typed. When such letters occur in proximity on a keyboard layout, one of such keys can be made to favor another. For example, when ten (10) percent of the time a user hits an upper region associated with their typical letter "a" strokes and a lower region associated with their typical letter "q," then "a" could get favored treatment due to perhaps the most likely spelling prediction. Subsequent keys might change the prediction, resulting in a swapping event from "a" to "q" of course. Thus, to avoid confusion, several modes of operation might be selected.
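The per-key active-area sizing from a user's strike statistics might be sketched as follows; the population standard deviation stands in for the "finger striking range", and the base radius and scale factor are assumed parameters.

```python
from statistics import pstdev

def key_active_radius(strike_offsets, base_radius=1.0, scale=2.0):
    """Size one key's active area from a user's recorded strike
    offsets (distance of each touch from the key center). A noisier
    finger gets a larger, more forgiving target, so the active area
    for 'q' can end up larger than the one for 'a'."""
    return base_radius + scale * pstdev(strike_offsets)
```

A fuller model would also track direction, yielding the ellipsoidal active areas mentioned above rather than circles.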
In a first mode, both letters can be presented before one is finalized based on subsequent typing entry and spelling and grammar considerations for an entire word or word sequence which either verifies or conflicts with a prediction. In a second mode, the most likely letter is presented and visually swapped if it proves incorrect. In a third mode, both letter options are withheld until enough further letters are received to converge on one or the other. This applies to more than two possibilities as well, based on their nearness in physical keyboard layout.
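The three disambiguation modes can be sketched as a single dispatch function; the function name and the integer mode encoding are illustrative assumptions.

```python
def resolve_ambiguous_key(candidates, mode, predicted):
    """candidates: letters whose key regions overlap the strike point,
    e.g. ['a', 'q']; predicted: the letter the spelling/grammar engine
    currently favors.

    mode 1: present every candidate, finalizing later from context;
    mode 2: present the most likely letter, swapping it if proven wrong;
    mode 3: withhold all options until further letters force convergence.
    """
    if mode == 1:
        return list(candidates)
    if mode == 2:
        return [predicted]
    if mode == 3:
        return []
    raise ValueError("mode must be 1, 2, or 3")
```

The same dispatch generalizes to more than two candidates, since the candidate list can hold every key near the strike point.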
- Each finger contact area for a given key is represented by statistical variation data for each particular user. Such statistics and associated predictions, resizing and relative locations can be extended to three dimensions and can be associated with any element of each hand's digits (e.g., joints). Thus, for example, a particular user's hand motion associated with a pinky-stretch for a "q" might be much more important than the landing spot.
- Such characterizations and predictions apply equally for other types of keyboard input modes as well as non-keyboard, input element interactions involving full contact, no contact, and partial contact motions of hands and fingers, other body parts and held objects.
- It is to be appreciated that the Detailed Description section, and not the Abstract section, is intended to be used to interpret the claims. The Abstract section can set forth one or more, but not all, exemplary embodiments of the disclosure, and thus, is not intended to limit the disclosure and the appended claims in any way.
- The disclosure has been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
- It will be apparent to those skilled in the relevant art(s) that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/655,910 US20130100026A1 (en) | 2011-10-20 | 2012-10-19 | Proximity Screen Display and User Interface |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161549495P | 2011-10-20 | 2011-10-20 | |
| US13/655,910 US20130100026A1 (en) | 2011-10-20 | 2012-10-19 | Proximity Screen Display and User Interface |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20130100026A1 true US20130100026A1 (en) | 2013-04-25 |
Family
ID=48135540
Family Applications (5)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/361,579 Abandoned US20130100334A1 (en) | 2011-10-20 | 2012-01-30 | Method and System for an Adaptive Auto-Focus Algorithm |
| US13/397,240 Active 2032-08-01 US8749607B2 (en) | 2011-10-20 | 2012-02-15 | Face equalization in video conferencing |
| US13/435,909 Abandoned US20130101275A1 (en) | 2011-10-20 | 2012-03-30 | Video Memory Having Internal Programmable Scanning Element |
| US13/628,750 Abandoned US20130101162A1 (en) | 2011-10-20 | 2012-09-27 | Multimedia System with Processing of Multimedia Data Streams |
| US13/655,910 Abandoned US20130100026A1 (en) | 2011-10-20 | 2012-10-19 | Proximity Screen Display and User Interface |
Family Applications Before (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/361,579 Abandoned US20130100334A1 (en) | 2011-10-20 | 2012-01-30 | Method and System for an Adaptive Auto-Focus Algorithm |
| US13/397,240 Active 2032-08-01 US8749607B2 (en) | 2011-10-20 | 2012-02-15 | Face equalization in video conferencing |
| US13/435,909 Abandoned US20130101275A1 (en) | 2011-10-20 | 2012-03-30 | Video Memory Having Internal Programmable Scanning Element |
| US13/628,750 Abandoned US20130101162A1 (en) | 2011-10-20 | 2012-09-27 | Multimedia System with Processing of Multimedia Data Streams |
Country Status (3)
| Country | Link |
|---|---|
| US (5) | US20130100334A1 (en) |
| CN (1) | CN103226279A (en) |
| TW (1) | TW201329554A (en) |
Families Citing this family (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9479677B2 (en) * | 2012-09-05 | 2016-10-25 | Intel Corporation | Protocol for communications between platforms and image devices |
| US9246898B2 (en) * | 2012-11-20 | 2016-01-26 | Utility Associates, Inc. | System and method for securely distributing legal evidence |
| US10136061B2 (en) | 2015-01-30 | 2018-11-20 | Microsoft Technology Licensing, Llc | Automatic processing of automatic image capture parameter adjustment |
| CN105138962A (en) * | 2015-07-28 | 2015-12-09 | 小米科技有限责任公司 | Image display method and image display device |
| WO2018004001A1 (en) * | 2016-06-30 | 2018-01-04 | 株式会社ニコン | Camera |
| US9881194B1 (en) * | 2016-09-19 | 2018-01-30 | Hand Held Products, Inc. | Dot peen mark image acquisition |
| KR102646750B1 (en) * | 2018-03-21 | 2024-03-13 | 삼성전자주식회사 | Method for adjusting focus based on spread-level of display object and electronic device implementing the same |
| CN109521547B (en) * | 2018-12-21 | 2021-03-26 | 广州医软智能科技有限公司 | Variable-step-length automatic focusing method and system |
| TWI724788B (en) * | 2020-02-14 | 2021-04-11 | 國立清華大學 | Method for integrating processing-in-sensor and in-memory computing and system thereof |
| CN113269211B (en) * | 2020-02-14 | 2024-09-06 | 神盾股份有限公司 | Integration method and system of processing unit in sensor and computing unit in memory |
| CN113938599B (en) * | 2020-07-14 | 2024-03-08 | 浙江宇视科技有限公司 | Electric lens focusing method and device, electronic equipment and storage medium |
| EP4391522A1 (en) * | 2022-12-21 | 2024-06-26 | GN Audio A/S | Video-conference device and method |
| USD1070830S1 (en) * | 2023-01-27 | 2025-04-15 | Gn Audio A/S | Video bar |
| USD1070798S1 (en) * | 2023-01-27 | 2025-04-15 | Gn Audio A/S | Video bar |
| USD1070800S1 (en) * | 2023-01-27 | 2025-04-15 | Gn Audio A/S | Video bar |
| CN116320595B (en) * | 2023-02-08 | 2025-11-21 | 联想(北京)有限公司 | Recording method and device for multimedia data |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070132741A1 (en) * | 2005-12-14 | 2007-06-14 | Yen-Chang Chiu | Movement detection method for multiple objects on a capacitive touchpad |
| US20090103161A1 (en) * | 2007-10-19 | 2009-04-23 | Qualcomm Mems Technologies, Inc. | Display with integrated photovoltaic device |
| US20090139778A1 (en) * | 2007-11-30 | 2009-06-04 | Microsoft Corporation | User Input Using Proximity Sensing |
| US20100156807A1 (en) * | 2008-12-19 | 2010-06-24 | Verizon Data Services Llc | Zooming keyboard/keypad |
| US20100245731A1 (en) * | 2009-03-31 | 2010-09-30 | Benjie Limketkai | Integrated photovoltaic cell for display device |
| US20110128264A1 (en) * | 2009-11-27 | 2011-06-02 | National Taiwan University | Transflective display device |
| US20110286076A1 (en) * | 2010-05-19 | 2011-11-24 | Au Optronics Corporation | Electrophoretic Display Device |
| US20120306767A1 (en) * | 2011-06-02 | 2012-12-06 | Alan Stirling Campbell | Method for editing an electronic image on a touch screen display |
Family Cites Families (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5260736A (en) * | 1991-09-04 | 1993-11-09 | Fuji Photo Film Co., Ltd. | Auto focus control device |
| US6236431B1 (en) * | 1993-05-27 | 2001-05-22 | Canon Kabushiki Kaisha | Video camera apparatus with distance measurement area adjusted based on electronic magnification |
| US6496277B1 (en) * | 1999-07-23 | 2002-12-17 | Xerox Corporation | Data flow control and storage facility for an image reproduction system |
| US7079289B2 (en) * | 2001-10-01 | 2006-07-18 | Xerox Corporation | Rank-order error diffusion image processing |
| US7538815B1 (en) * | 2002-01-23 | 2009-05-26 | Marena Systems Corporation | Autofocus system and method using focus measure gradient |
| US7187413B2 (en) * | 2002-07-25 | 2007-03-06 | Lockheed Martin Corporation | Method and system for using an image based autofocus algorithm |
| EP1528419B1 (en) * | 2002-08-07 | 2018-01-24 | Panasonic Intellectual Property Management Co., Ltd. | Focusing device |
| JP2005055618A (en) * | 2003-08-01 | 2005-03-03 | Seiko Precision Inc | Projector, focusing method and focusing program for the same |
| JP2005156971A (en) * | 2003-11-26 | 2005-06-16 | Tamron Co Ltd | Autofocusing device, camera provided with the same, and camera body |
| US20070177860A1 (en) * | 2004-03-15 | 2007-08-02 | Anthony Hooley | Camera autofocus |
| US7515201B2 (en) * | 2004-06-16 | 2009-04-07 | Hoya Corporation | Focus detection method and focus detection apparatus |
| US8432582B2 (en) * | 2004-08-20 | 2013-04-30 | Xerox Corporation | Uniformity compensation in halftoned images |
| US8154769B2 (en) * | 2005-02-15 | 2012-04-10 | Ricoh Co. Ltd | Systems and methods for generating and processing evolutionary documents |
| DE602006006582D1 (en) * | 2005-08-08 | 2009-06-10 | Mep Imaging Technologies Ltd | ADAPTIVE EXPOSURE CONTROL |
| US7956929B2 (en) * | 2005-10-31 | 2011-06-07 | Broadcom Corporation | Video background subtractor system |
| EP1976268A1 (en) * | 2005-12-28 | 2008-10-01 | Olympus Corporation | Imaging system and image processing program |
| EP2033066A4 (en) * | 2006-05-31 | 2012-08-15 | Ibm | Method and system for transformation of logical data objects for storage |
| KR100780957B1 (en) * | 2006-08-21 | 2007-12-03 | 삼성전자주식회사 | Image Selection Device and Method |
| US8144186B2 (en) * | 2007-03-09 | 2012-03-27 | Polycom, Inc. | Appearance matching for videoconferencing |
| KR100897768B1 (en) * | 2007-05-01 | 2009-05-15 | 삼성전자주식회사 | Auto focusing method and devices that can use the method |
| US20110109764A1 (en) * | 2008-09-24 | 2011-05-12 | Li Hong | Autofocus technique utilizing gradient histogram distribution characteristics |
| JP4620150B2 (en) * | 2008-10-23 | 2011-01-26 | 株式会社東芝 | Electronic device and video processing method |
| US8655146B2 (en) * | 2009-03-31 | 2014-02-18 | Broadcom Corporation | Collection and concurrent integration of supplemental information related to currently playing media |
| JP5569329B2 (en) * | 2010-10-15 | 2014-08-13 | 大日本印刷株式会社 | Conference system, monitoring system, image processing apparatus, image processing method, image processing program, etc. |
2012
- 2012-01-30 US US13/361,579 patent/US20130100334A1/en not_active Abandoned
- 2012-02-15 US US13/397,240 patent/US8749607B2/en active Active
- 2012-03-30 US US13/435,909 patent/US20130101275A1/en not_active Abandoned
- 2012-09-10 TW TW101133027A patent/TW201329554A/en unknown
- 2012-09-27 US US13/628,750 patent/US20130101162A1/en not_active Abandoned
- 2012-10-19 US US13/655,910 patent/US20130100026A1/en not_active Abandoned
- 2012-12-26 CN CN2012105761813A patent/CN103226279A/en active Pending
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140118259A1 (en) * | 2012-11-01 | 2014-05-01 | Pantech Co., Ltd. | Portable device and method for providing user interface thereof |
| US20140181755A1 (en) * | 2012-12-20 | 2014-06-26 | Samsung Electronics Co., Ltd | Volumetric image display device and method of providing user interface using visual indicator |
| US10120526B2 (en) * | 2012-12-20 | 2018-11-06 | Samsung Electronics Co., Ltd. | Volumetric image display device and method of providing user interface using visual indicator |
| US20190302893A1 (en) * | 2013-12-27 | 2019-10-03 | Rovi Guides, Inc. | Methods and systems for selecting media guidance functions based on tactile attributes of a user input |
| US10901511B2 (en) * | 2013-12-27 | 2021-01-26 | Rovi Guides, Inc. | Methods and systems for selecting media guidance functions based on tactile attributes of a user input |
| US10139954B2 (en) * | 2016-03-30 | 2018-11-27 | Qisda Optronics (Suzhou) Co., Ltd. | Display device and operating method thereof |
| US10049625B1 (en) * | 2016-12-21 | 2018-08-14 | Amazon Technologies, Inc. | Context-based rendering |
| US20180301078A1 (en) * | 2017-06-23 | 2018-10-18 | Hisense Mobile Communications Technology Co., Ltd. | Method and dual screen devices for displaying text |
| US12259751B2 (en) * | 2020-08-25 | 2025-03-25 | Fujifilm Business Innovation Corp. | Display control device, display device, and non-transitory computer readable medium |
Also Published As
| Publication number | Publication date |
|---|---|
| CN103226279A (en) | 2013-07-31 |
| US20130101275A1 (en) | 2013-04-25 |
| US20130101162A1 (en) | 2013-04-25 |
| US20130100334A1 (en) | 2013-04-25 |
| US20130100235A1 (en) | 2013-04-25 |
| TW201329554A (en) | 2013-07-16 |
| US8749607B2 (en) | 2014-06-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20130100026A1 (en) | Proximity Screen Display and User Interface | |
| US10909916B2 (en) | OLED array substrate, OLED display panel, pixel circuit, driving method and method for fingerprint recognition using OLED display panel | |
| US9830009B2 (en) | Apparatus and method for detecting hovering commands | |
| EP3428967B1 (en) | Electronic device having display | |
| KR101761543B1 (en) | Touch sensor and method for driving the same and display device | |
| US8665223B2 (en) | Display device and method providing display contact information based on an amount of received light | |
| TWI696957B (en) | Method and device for synchronously collecting fingerprint information | |
| CN106484176A (en) | Pressure detector and the touch input device containing which of pressure-sensitivity can be adjusted | |
| CN113903300A (en) | Display panel, calibration method, calibration device and electronic equipment | |
| KR20220147962A (en) | Foldable electronic device | |
| US10302484B2 (en) | Optical sensor module | |
| CN116580636A (en) | display device | |
| KR102014779B1 (en) | Electronic apparatus and method of driving a display | |
| US11194428B2 (en) | Touch screen, pressure-sensitive touch method, and display apparatus | |
| US20240423038A1 (en) | Display device | |
| US20230105095A1 (en) | Electronic device and method of operating the same | |
| CN116185218A (en) | touch screen monitor | |
| KR102321652B1 (en) | Display Apparatus | |
| US20210223903A1 (en) | Control method for display screen, and electronic device | |
| CN223503355U (en) | Illuminated display device | |
| US12260840B2 (en) | Display device and driving method of the same | |
| US11101506B2 (en) | Mobile device for determining magnitude of light volume, method for controlling mobile device thereof and non-transitory storage medium thereof | |
| KR20240068880A (en) | Display device | |
| JP2025065216A (en) | Display device | |
| CN113571007A (en) | Sub-pixel driving circuit and pixel driving circuit |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VITSNUDEL, ILIA;BENNETT, JAMES;REEL/FRAME:029159/0645 Effective date: 20121018 |
|
| AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
| AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120 |
|
| AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001 Effective date: 20170119 |