
HK1150698B - Camera data management and user interface apparatuses, systems, and methods - Google Patents


Info

Publication number
HK1150698B
Authority
HK
Hong Kong
Prior art keywords
camera
image
session
user interface
visual indicator
Prior art date
Application number
HK11104514.6A
Other languages
Chinese (zh)
Other versions
HK1150698A1 (en)
Inventor
Heath Stallings
Sok Y. Hwang
Shadman Zafar
Original Assignee
Verizon Patent And Licensing Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 12/165,411 (US8477228B2)
Application filed by Verizon Patent And Licensing Inc. filed Critical Verizon Patent And Licensing Inc.
Publication of HK1150698A1
Publication of HK1150698B


Description

Camera data management and user interface apparatuses, systems, and methods
Cross Reference to Related Applications
This application claims priority to U.S. patent application No. 12/165,411, filed June 30, 2008, which is hereby incorporated by reference in its entirety.
Technical Field
Background
As digital photo and data storage technology advances, and as the costs associated with the technology decrease, digital cameras have become commonplace in society. For example, digital cameras are included in many mobile phones. However, conventional data management and user interface applications provided by digital cameras, and particularly user interface applications provided by camera phones, tend to be unsightly and difficult to use.
Disclosure of Invention
Drawings
The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements.
FIG. 1 illustrates an exemplary camera data management and user interface system.
Fig. 2 illustrates an exemplary mobile device in which the system of fig. 1 is implemented.
Fig. 3 illustrates an exemplary session-based organization of camera images.
FIGS. 4A-4I illustrate exemplary graphical user interface views that may be displayed.
FIG. 5 illustrates an exemplary publication system.
Fig. 6 illustrates an exemplary content distribution subsystem.
FIG. 7 illustrates an exemplary camera data management and user interface method.
Detailed Description
Exemplary camera data management and user interface devices, systems, and methods are described herein.
In certain exemplary embodiments, a graphical user interface is provided for display. A plurality of graphical user interface views may be displayed in the graphical user interface and may be configured to allow a user to interact with camera-related features and functions and with camera image data. In some examples, a graphical user interface including a live camera sensor view is displayed, and an image manager pane and the live camera sensor view are displayed together in the graphical user interface in response to capture of a camera image. The image manager pane includes a visual indicator representing the captured camera image.
In certain exemplary embodiments, camera images are captured and automatically assigned to sessions based on a predetermined session grouping heuristic. Such sessions may be defined and used to organize camera images into groups for processing. Examples of sessions and of assigning camera images to sessions are described further below.
In certain exemplary embodiments, data representing the captured camera image is provided to a content distribution subsystem over a network, and the content distribution subsystem is configured to distribute the data representing the camera image to a plurality of predetermined destinations. In some examples, the destination is defined by a user, and the content distribution subsystem is configured to send the camera image to the defined destination.
Exemplary embodiments of camera data management and user interface devices, systems, and methods will now be described in more detail with reference to the accompanying drawings.
Fig. 1 illustrates an exemplary camera data management and user interface system 100 (or simply "system 100"). As shown in fig. 1, the system 100 may include a communication device 110, a processing device 120, a storage device 130, an input/output ("I/O") device 140, a camera device 150, a session management device 160, a user interface device 170, and a publishing device 180 communicatively connected to one another. The devices 110-180 may be communicatively coupled using any suitable technique and may communicate using any communication platform and/or technology suitable for transporting communications and data between the devices 110-180, including well-known communication platforms and technologies.
In some examples, system 100 may include any computing hardware and/or instructions (e.g., software programs) configured to perform the processes described herein, or a combination of computing instructions and hardware. In particular, it should be understood that system 100 or components of system 100 may be implemented on one physical computing device or may be implemented on multiple physical computing devices. Thus, the system 100 may comprise any of a number of well-known computing devices and may use any of a number of well-known computer operating systems.
One or more of the processes described herein may be implemented, at least in part, as computer-executable instructions tangibly embodied in a computer-readable medium, i.e., instructions executable by one or more computing devices. Generally, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions may be stored and transmitted using a variety of known computer-readable media.
A computer-readable medium (also referred to as a processor-readable medium) includes any medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including but not limited to: non-volatile media, volatile media, and transmission media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory ("DRAM"), which typically constitutes a main memory. Transmission media may include, for example, coaxial cables, copper wire, and fiber optics, including the wires that comprise a system bus coupled to the processor of a computer. Transmission media can include or convey acoustic waves, light waves, and electromagnetic emissions, such as those generated during radio frequency ("RF") and infrared ("IR") communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, a DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computing device can read.
Accordingly, each of the devices 110-180 may be implemented as hardware, as computing instructions (e.g., software) tangibly embodied on a computer-readable medium, or as a combination of hardware and computing instructions configured to perform one or more of the processes described herein. In particular embodiments, for example, session management device 160, user interface device 170, and/or one or more other devices may be implemented as one or more software applications embodied on a computer-readable medium, such as storage device 130 or other memory, and configured to direct processing device 120 to perform one or more of the processes described herein.
The components of system 100 shown in fig. 1 are merely illustrative. In other embodiments, one or more components may be added, omitted, or reconfigured. In particular embodiments, for example, communication device 110 and/or publishing device 180 may be omitted. In certain embodiments, session management device 160 and/or user interface device 170 may be omitted. Each of the devices 110-180 will now be described in greater detail.
Communication device 110 may be configured to send and/or receive communications to/from one or more external apparatuses (e.g., servers), and may include and/or use any apparatus, logic, communication media, communication protocols, and/or other technologies suitable for sending and receiving communications and data, including data representing camera images (e.g., photographs) and/or issuing commands and data. Examples of such communication technologies, devices, media, and protocols include, but are not limited to: data transmission media, communication devices, transmission control protocol ("TCP"), internet protocol ("IP"), file transfer protocol ("FTP"), Telnet, hypertext transfer protocol ("HTTP"), secure hypertext transfer protocol ("HTTPS"), session initiation protocol ("SIP"), simple object access protocol ("SOAP"), extensible markup language ("XML") and variations thereof, simple mail transfer protocol ("SMTP"), real-time transport protocol ("RTP"), user datagram protocol ("UDP"), global system for mobile communications ("GSM") technologies, code division multiple access ("CDMA") technologies, time division multiple access ("TDMA") technologies, short message service ("SMS"), multimedia message service ("MMS"), evolution-data optimized protocol ("EVDO"), radio frequency ("RF") signaling technologies, signaling system 7 ("SS7") technologies, Ethernet, in-band and out-of-band signaling technologies, and other suitable communication networks and technologies.
Processing device 120 may include one or more processors and may be configured to perform and/or direct the performance of one or more processes or operations described herein. Processing device 120 may direct the performance of operations according to computer-executable instructions, which may be stored, for example, in storage device 130 or another computer-readable medium. For example, processing device 120 may be configured to process data, including demodulating, decoding, and parsing the acquired data, and encode and modulate the data for transmission by communication device 110.
Storage 130 may include one or more data storage media, apparatuses, or configurations and may use any type, form, and combination of storage media. For example, storage device 130 may include, but is not limited to: a hard disk drive, a network drive, a flash drive, a magnetic disk, an optical disk, random access memory ("RAM"), dynamic RAM ("DRAM"), other nonvolatile and/or volatile memory units, or combinations or sub-combinations thereof. Data including data representing camera images and/or image metadata may be temporarily and/or permanently stored in storage device 130.
Input/output device 140 may be configured to receive user input and provide user output, and may include any hardware, firmware, software, or combination thereof that supports input and output capabilities. For example, input/output device 140 may include one or more means for capturing user input, including but not limited to: a microphone, voice recognition technology, a keyboard or keypad, a touch screen component, a receiver (e.g., an RF or infrared receiver), and one or more input buttons.
Input/output device 140 may include one or more means for providing output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more display drivers, one or more audio speakers, and one or more audio drivers. The output may include audio, video, text, and/or tactile output. In particular embodiments, for example, input/output device 140 is configured to display a graphical user interface ("GUI") for viewing by a user. An exemplary GUI that may be displayed via the input/output device 140 is further described below.
The camera device 150 may include any combination of hardware, software, and/or firmware configured to capture camera images. For example, the camera device 150 may include components of a still-photo camera and/or video camera, such as a camera lens and a camera sensor. Any suitable camera technologies and devices may be used. Accordingly, the camera device 150 may capture one or more camera images, including generating data (e.g., digital image data) representative of the camera images. The data representing the captured images may be provided to one or more of the other devices 110-140 and 160-180 for processing and/or storage. For example, the camera image data may be temporarily or permanently stored in the storage device 130. A camera image may include one or more images and/or data representing one or more images captured by the camera device 150, including but not limited to a photograph, a video, or another collection of image frames.
The session management device 160 may be configured to organize, or direct the processing device 120 to organize, the camera images and/or related data into sessions based on a predetermined session grouping heuristic. Session management device 160 may also provide one or more tools for defining session grouping criteria. Exemplary definitions of session grouping criteria and session-based organization of camera images are described further below.
The user interface device 170 may be configured to generate or direct the processing device 120 to generate one or more user interfaces. For example, one or more GUIs may be generated and provided to the input/output device 140 for display. As mentioned above, exemplary GUI views are described further below.
Publishing device 180 may be configured to perform or direct performance of one or more operations for publishing camera images. Publishing may include, but is not limited to: provide one or more camera images to the input/output device 140 for display; means for providing one or more camera images to the communication device 110 for transmission to an external device or for storage and/or distribution (e.g., automated scheduled distribution); and/or provide one or more camera images to an external service or platform (e.g., a social networking website) for display. Examples of publishing camera images are described further below.
The system 100 may be implemented in a number of ways and may be adapted for specific applications. Fig. 2 illustrates an exemplary mobile device 200 on which the system 100 may be implemented. The mobile device 200 may include one or more of the devices 110-180 shown in fig. 1 and may be configured to perform one or more of the processes and/or operations described herein. In a particular embodiment, the mobile device 200 includes a mobile telephone device having a built-in digital camera. In other embodiments, system 100 may be implemented in other devices or other types of devices.
As shown in FIG. 2, the mobile device 200 may include a plurality of input buttons 210-1 through 210-8 (collectively, "input buttons 210"), which may be actuated by a user to provide input to the mobile device 200. Exemplary input buttons may include "soft" and/or "hard" coded input buttons. "Soft" coded buttons may be dynamically associated with different user input commands and/or operations based on the operating environment of the mobile device 200, and "hard" coded buttons may be statically associated with corresponding user input commands and/or operations of the mobile device 200. FIG. 2 illustrates the mobile device 200 including "soft" input buttons 210-1 and 210-2. The operations associated with the "soft" input buttons 210-1 and 210-2 may be visually indicated to the user; FIG. 2 shows visually indicated "menu" and "options" operations associated with "soft" input buttons 210-1 and 210-2, respectively. Mobile device 200 further includes a "clear" ("CLR") input button 210-3, a "send" input button 210-4, an "end" input button 210-5, a camera mode input button 210-6, a select button 210-7, and one or more directional (e.g., "up," "down," "left," and "right") input buttons 210-8.
A user of mobile device 200 may use one or more of the input buttons 210 to provide user input configured to initiate mobile device operations. For example, the camera mode input button 210-6 may be actuated to activate or terminate the camera mode on the mobile device 200. When the camera mode is active, the camera device 150 may operate in a state configured to capture camera images. As another example, the directional input buttons 210-8 may be used to navigate a visual selector within the GUI and to highlight or otherwise indicate a particular selectable item in the GUI. The select button 210-7 may be used to select one or more highlighted items, thereby initiating one or more operations associated with the selected items.
As shown in fig. 2, mobile device 200 may include a display 220 configured to display a graphical user interface 230 ("GUI 230") for viewing by a user of mobile device 200. The display 220 may be included in the input/output device 140 and may include a display screen on which the GUI 230 may be displayed. Examples of the GUI 230 and the various views that may be displayed in the GUI 230 are described in more detail below.
To facilitate an understanding of the session management device 160 and session management operations, fig. 3 illustrates an exemplary organization of camera images by session. The camera device 150 may capture camera images 310-1 through 310-N (collectively "camera images 310"), and the session management device 160 may organize the captured camera images 310, including selectively organizing the camera images 310 into one or more sessions 320-1 and 320-2 (collectively "sessions 320"). In particular embodiments, when a camera image 310 is captured (e.g., when a photograph is taken), the camera image 310 may be automatically assigned to one or more sessions 320.
A session 320 may include a defined set of one or more camera images 310. A session may be defined in any way that may suit a particular implementation or application of the system 100. In particular embodiments, a session 320 may be defined by specifying one or more criteria that, when satisfied, qualify a camera image 310 for inclusion in the session 320. When a camera image 310 is determined to satisfy the criteria, the camera image 310 may be assigned to the session 320.
As an example, the session 320 may be defined as a set of one or more camera images acquired over a continuous period of time during which the camera mode is active (i.e., the camera device 150 is in a state configured to acquire camera images). The activation of the camera mode may be a criterion defined for establishing a new session 320 and the deactivation of the camera mode may be a criterion defined for ending the session. Thus, when the user activates the camera mode, the session management device 160 may establish a new session 320-1. Any camera images 310-1 and 310-2 acquired between the time the camera mode is activated (and session 320-1 is established) and the time the camera mode is deactivated (and session 320-1 is closed) may be assigned to session 320-1. This process may be repeated for other camera images obtained during a time period between another activation and deactivation of the camera mode. The camera device 150 may be configured to activate or deactivate the camera mode in response to a predetermined event, such as the user turning the camera mode on or off (e.g., using the "camera mode" button 210-6), or the passage of a predetermined length of time from the acquisition of the most recent camera image.
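The camera-mode criterion described above behaves like a small state machine: activation opens a session, deactivation closes it, and every capture in between joins the open session. A minimal sketch, using hypothetical names (`SessionManager`, `Session`, `capture`) that do not appear in the patent:

```python
class Session:
    def __init__(self, session_id):
        self.session_id = session_id
        self.images = []

class SessionManager:
    """Opens a session when the camera mode is activated and closes it
    when the mode is deactivated (a sketch, not the patent's design)."""
    def __init__(self):
        self.sessions = []
        self.active = None

    def activate_camera_mode(self):
        # Criterion: activating the camera mode establishes a new session.
        self.active = Session(len(self.sessions) + 1)
        self.sessions.append(self.active)

    def deactivate_camera_mode(self):
        # Criterion: deactivating the camera mode closes the open session.
        self.active = None

    def capture(self, image):
        # Any image captured while the mode is active joins the open session.
        if self.active is not None:
            self.active.images.append(image)

mgr = SessionManager()
mgr.activate_camera_mode()      # first session opens
mgr.capture("310-1")
mgr.capture("310-2")
mgr.deactivate_camera_mode()    # first session closes
mgr.activate_camera_mode()      # second session opens
mgr.capture("310-3")
```

A timeout-based deactivation (the "predetermined length of time" criterion) could be layered on by calling `deactivate_camera_mode` when the gap since the last capture exceeds a threshold.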
As another example, a session 320 may be defined based on geographic criteria. For example, a session 320 may be defined to include a grouping criterion specifying a common geographic location (e.g., a particular geographic location or area). Accordingly, camera images 310 associated with that geographic location may be assigned to the session 320. For example, mobile device 200 may be configured to detect its geographic location, such as by using GPS technology to determine GPS coordinates of the detected location of mobile device 200. When a camera image 310 is captured, location data representing the geographic location of mobile device 200 may be associated with the camera image 310. For example, the location data may be included in the camera image metadata. The session management device 160 may be configured to use the location data to selectively assign camera images 310 to sessions 320 that have been defined by geographic location as described above. Accordingly, camera images 310 associated with a particular geographic location (e.g., camera images 310 captured within that location) may be grouped into a session 320.
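As an illustrative sketch of the geographic criterion, the hypothetical helper below tests whether an image's GPS metadata falls within a session's area, modeled here as a center point and a radius in kilometers. The haversine helper and all names are assumptions, not the patent's own model:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def matches_geo_session(image_meta, session_center, radius_km):
    """True if the image's GPS metadata falls inside the session's area."""
    lat, lon = image_meta["gps"]
    return haversine_km(lat, lon, *session_center) <= radius_km

# A "home" session centered on a point, with a 5 km grouping radius.
home = (40.7128, -74.0060)
near = {"gps": (40.7130, -74.0050)}   # a few hundred meters away
far = {"gps": (34.0522, -118.2437)}   # thousands of kilometers away
```

In practice the criterion might instead be a bounding box, a named place, or reverse-geocoded region; the point is only that assignment reduces to a predicate over the image's location metadata.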
As another example, a session 320 may be defined based on time data. For example, a session 320 may be defined to include camera images 310 associated with a particular range of time, such as camera images 310 captured within a particular day, week, or month. Accordingly, camera images 310 may be selectively assigned to sessions 320 based on time data associated with the camera images 310, such as time data indicating when the camera images 310 were captured.
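A minimal sketch of the time-based criterion, bucketing images into sessions by capture date; the metadata layout and function name are assumptions for illustration:

```python
from collections import defaultdict
from datetime import datetime

def group_by_day(images):
    """Bucket images into sessions keyed by their capture date (ISO string)."""
    sessions = defaultdict(list)
    for image in images:
        day = image["captured_at"].date().isoformat()
        sessions[day].append(image["id"])
    return dict(sessions)

images = [
    {"id": "310-1", "captured_at": datetime(2008, 6, 30, 9, 15)},
    {"id": "310-2", "captured_at": datetime(2008, 6, 30, 9, 20)},
    {"id": "310-3", "captured_at": datetime(2008, 7, 1, 14, 0)},
]
sessions = group_by_day(images)
```

Swapping the key function (ISO week, month, or an arbitrary start/end range) yields the other time-range groupings the paragraph mentions.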
Combinations of various session grouping criteria, including any combination or sub-combination of the above examples, may be used to define a session 320.
In particular embodiments, the session management device 160 may be configured to automatically and selectively assign captured camera images 310 to one or more sessions 320 based on a predetermined session grouping heuristic 330. The session grouping heuristic 330 may be defined to represent one or more defined session grouping criteria associated with one or more sessions 320. Thus, the session grouping heuristic 330 may include a set of rules configured to be used by the session management device 160 to automatically and selectively assign captured camera images 310 to sessions 320.
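One way such a rule set could be modeled, purely as an illustration, is as an ordered list of rules, each pairing a session with a predicate over image metadata, where the first matching rule wins. This rule structure is an assumption, not the patent's data model:

```python
def assign(image_meta, rules, default="unsorted"):
    """Return the name of the first session whose predicate accepts the
    image metadata, or a default session when no rule matches."""
    for session_name, predicate in rules:
        if predicate(image_meta):
            return session_name
    return default

# Hypothetical heuristic combining a geographic and a temporal criterion.
rules = [
    ("work", lambda m: m.get("location") == "office"),
    ("weekend", lambda m: m.get("weekday") in ("Sat", "Sun")),
]

a = assign({"location": "office", "weekday": "Mon"}, rules)
b = assign({"location": "park", "weekday": "Sun"}, rules)
c = assign({"location": "park", "weekday": "Tue"}, rules)
```

Updating the heuristic to reflect user-defined criteria (as described below) then amounts to editing this rule list.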
The session management device 160 may provide one or more tools configured to enable a user (e.g., a user of the mobile device 200) to manage sessions 320, including defining, establishing, opening, modifying, closing, deleting, naming, searching, accessing, and otherwise processing sessions 320 or the camera images 310 assigned to them. Using the tools, a user of mobile device 200 may customize one or more session grouping criteria that, when satisfied, qualify a camera image 310 for assignment to a session 320. Session management device 160 may define and/or update the session grouping heuristic 330 to represent the user-defined criteria. Thus, the user may establish customized session grouping criteria and sessions 320.
In particular embodiments, one or more of the tools provided by session management device 160 may be configured to enable a user to assign an identifier (e.g., a name) to session 320. For example, sessions 320 defined to include camera images 310 associated with a particular geographic location may be descriptively named. Examples of such session names may include "home," "work," "lake," "road trip," and so forth. The session management device 160 may further provide one or more tools for searching and selecting sessions 320.
The camera images 310 included in a session 320 may be displayed, identified, selected, and/or processed together. As described further below, a user interface may display camera images 310 organized by session 320, or may provide an indication of the session 320 to which one or more camera images 310 are assigned. A session 320 may be selected, and the camera images 310 included in the session 320 may be collectively selected and processed. For example, a session 320 may be selected and published, meaning that the camera images 310 within the session 320 are collectively selected and published. Examples of publishing are described further below, as are examples of session-based user interface views, session indicators in user interface views, and session-based operations.
To facilitate understanding of the user interface device 170 and the exemplary user interface provided by the user interface device 170, fig. 4A-4I illustrate exemplary GUI views that may be generated and provided by the user interface device 170 for display.
Fig. 4A illustrates the GUI 230 in which a live camera sensor view 410 is displayed. When the camera device 150 is operating in the active camera mode as described above, the live camera sensor view 410 may be displayed in the GUI 230. The live camera sensor view 410 may be a real-time or near real-time representation of the view detected by the camera sensor.
The camera device 150 may capture a camera image (e.g., camera image 310-1) representing the live camera sensor view 410. For example, a user of the mobile device 200 may actuate the select button 210-7 or another input mechanism, and the camera device 150 may responsively capture a camera image 310-1 representing the live camera sensor view 410.
The view shown in FIG. 4A may be updated when the camera image 310-1 is captured. For example, fig. 4B illustrates the GUI 230 in which the image manager pane 420 and the live camera sensor view 410 are displayed together. In the view shown in fig. 4B, the live camera sensor view 410 may be as described above and may continue to display a real-time or near real-time representation of the view detected by the camera sensor. In the illustrated example, the image manager pane 420 is displayed overlaid on the live camera sensor view 410 of fig. 4A. This is merely illustrative. In other embodiments, the live camera sensor view 410 of fig. 4A may be resized, for example, from a full-screen view to a partial-screen view, to make room for the image manager pane 420 in the GUI 230.
Although fig. 4B illustrates a vertically oriented image manager pane 420 positioned along the left side of the live camera sensor view 410 and aligned with the left side of the GUI 230, this is merely illustrative. Other locations, shapes, orientations, and sizes of the image manager pane 420 may be used in other embodiments. For example, the image manager pane 420 may be oriented horizontally and positioned along the top or bottom edge of the GUI 230. As another example, the image manager pane 420 may be configured to form a border or partial border around the live camera sensor view 410 and/or the GUI 230.
The image manager pane 420 may be displayed in the GUI 230 in response to the capture of the camera image 310-1 or in response to another predetermined event (e.g., a predetermined user input command). In particular embodiments, the user interface device 170 may be configured to continue to display the image manager pane 420 in the GUI 230 until the occurrence of a predetermined event, such as the elapse of a predetermined time after capturing the camera image 310-1 or the deactivation of the camera mode. For example, the image manager pane 420 may be displayed temporarily when the camera image 310-1 is captured and hidden from view after a predetermined length of time has elapsed. When the image manager pane 420 is hidden, the view in the GUI 230 may return to the full-screen live camera sensor view 410 illustrated in fig. 4A. In other embodiments, the image manager pane 420 continues to be displayed in the GUI 230 after the camera image 310-1 is captured, for as long as the associated session remains active.
The image manager pane 420 may include one or more visual indicators representing one or more camera images 310. In FIG. 4B, for example, the image manager pane 420 includes a visual indicator 440-1 representing the camera image 310-1 captured from the live camera sensor view 410. Any suitable visual indicator may be used. In a particular embodiment, the visual indicator 440-1 includes a thumbnail of the captured camera image 310-1.
In certain embodiments, the visual indicator 440-1 may comprise a selectable object. Fig. 4B illustrates a selector 450 positioned at (e.g., highlighting) the visual indicator 440-1. The user of the mobile device 200 may provide input to navigate the selector 450 and to highlight and/or select the visual indicator 440-1 in the image manager pane 420.
One or more operations may be performed on the camera image 310-1 associated with a selected visual indicator 440-1 displayed in the image manager pane 420. For example, with the visual indicator 440-1 highlighted by the selector 450, an options button (e.g., the "soft" button 210-2) may be actuated. In response, the user interface device 170 may provide a set of selectable operations that may be applied to the camera image 310-1. Examples of such operations include, but are not limited to: deleting, permanently storing, naming, attaching comments to, and publishing the camera image 310-1. Examples of such operations applied to a camera image 310-1 or a group of camera images 310 (e.g., a session 320) are described further below.
With the live camera sensor view 410 and the image manager pane 420 displayed together in the GUI 230, the user can view, manage, and manipulate the camera image 310-1 from the image manager pane 420 while still being able to view the live camera sensor view 410. That is, the live camera sensor view 410 does not have to be closed in order for the user to view, manage, and manipulate the camera image 310-1.
In a particular example, another camera image 310-2 may be captured from the user interface view illustrated in fig. 4B. For example, while the view shown in fig. 4B is displayed, the user of the mobile device 200 may actuate the select button 210-7 or another input mechanism, and the camera device 150 may responsively capture another camera image 310-2 representing the live camera sensor view 410.
The view shown in FIG. 4B may be updated when another camera image 310-2 is captured. For example, FIG. 4C illustrates the GUI 230 with visual indicators 440-1 and 440-2 (collectively "visual indicators 440") included in the image manager pane 420. The visual indicator 440-2 may be added to the image manager pane 420 and may represent the newly captured camera image 310-2.
In FIG. 4C, the selector 450 is located at the visual indicator 440-2. In certain embodiments, the user interface device 170 may be configured to automatically position the selector 450 at the most recently added visual indicator in the image manager pane 420. In the current example, the selector 450 is repositioned from the visual indicator 440-1 in FIG. 4B to the visual indicator 440-2 in FIG. 4C when the camera image 310-2 is captured and the visual indicator 440-2 is added to the image manager pane 420.
The selector 450 may be navigated in the image manager pane 420 based on user input. For example, actuation of the "up" direction of directional button 210-8 may cause the selector 450 to move from visual indicator 440-2 to visual indicator 440-1 in FIG. 4C. Thus, the user may navigate the selector 450 in the image manager pane 420 and select and initiate an operation on any camera image 310 associated with a visual indicator 440 included in the image manager pane 420.
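The selector behavior described above (auto-positioning at the newest indicator, then moving with directional input) can be sketched as follows; the list-plus-index model and all names are assumptions, not the patent's implementation:

```python
class ImageManagerPane:
    """Sketch of a pane holding visual indicators and a movable selector."""
    def __init__(self):
        self.indicators = []
        self.selector = None  # index of the highlighted indicator

    def add_indicator(self, indicator):
        self.indicators.append(indicator)
        # Auto-position the selector at the most recently added indicator.
        self.selector = len(self.indicators) - 1

    def navigate(self, direction):
        # Directional input moves the selector within bounds.
        if direction == "up" and self.selector > 0:
            self.selector -= 1
        elif direction == "down" and self.selector < len(self.indicators) - 1:
            self.selector += 1
        return self.indicators[self.selector]

pane = ImageManagerPane()
pane.add_indicator("440-1")   # first capture
pane.add_indicator("440-2")   # second capture; selector jumps here
selected_after_add = pane.indicators[pane.selector]
selected_after_up = pane.navigate("up")
```

Selecting the highlighted indicator would then dispatch whichever operation (delete, store, publish, and so on) the user chooses from the options menu.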
In particular embodiments, when the live camera sensor view 410 and the image manager panel 420 are displayed together in the GUI 230, particular inputs may be defined as being associated with either the live camera sensor view 410 or the image manager panel 420. For example, when the view shown in FIG. 4C is displayed in the GUI 230, actuation of a particular one of the input buttons 210 of the mobile device 200 may be configured to initiate capture of a camera image 310, while actuation of one or more other input buttons 210 of the mobile device 200 may be configured to initiate navigation among, and selection of, the visual indicators 440 included in the image manager panel 420. Thus, the live camera sensor view 410 and the image manager panel 420 may be simultaneously active for receiving particular user inputs and performing corresponding operations.
In other embodiments, the user interface device 170 may be configured to switch an active input mode between the live camera sensor view 410 and the image manager panel 420 in response to user input. When the live camera sensor view 410 or the image manager panel 420 is in the active input mode, user input may be received for that particular view 410 or panel 420. As an example of switching input modes, when the input mode is active for the image manager panel 420, receipt of a predetermined user input may cause the input mode to become inactive for the image manager panel 420 and active for the live camera sensor view 410. In a particular example, the "right" direction of the directional input button 210-8 may be associated with switching the active input mode from the image manager panel 420 to the live camera sensor view 410. A visual indication of a switch-active-input-mode command may be displayed within the GUI 230. Each of FIGS. 4B and 4C illustrates a visual "right" directional arrow indicator 455-1 located near the selector 450 and pointing in the direction of the live camera sensor view 410. This may indicate that actuation of the "right" direction of the directional input button 210-8 is configured to cause the active input mode to switch from the image manager panel 420 to the live camera sensor view 410. Similarly, another predetermined input, such as the "left" direction of the directional input button 210-8, may be configured to cause the active input mode to switch from the live camera sensor view 410 to the image manager panel 420.
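The mode-switching behavior described above can be sketched as a small two-state machine. The following Python sketch is illustrative only; the class and constant names are assumptions, not identifiers from the described embodiments.

```python
LIVE_VIEW = "live_camera_sensor_view"
MANAGER_PANEL = "image_manager_panel"

class InputModeController:
    """Tracks which region of the GUI currently receives user input."""

    def __init__(self):
        # After a capture, the image manager panel holds the selector.
        self.active = MANAGER_PANEL

    def handle_direction(self, direction):
        # A "right" press hands input focus to the live view; "left"
        # hands it back to the panel. Other inputs leave the mode alone.
        if direction == "right" and self.active == MANAGER_PANEL:
            self.active = LIVE_VIEW
        elif direction == "left" and self.active == LIVE_VIEW:
            self.active = MANAGER_PANEL
        return self.active
```

In this model, directional inputs that do not match the current mode are simply ignored, mirroring how a directional button can serve double duty for navigation and mode switching.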
In particular embodiments, the image manager panel 420 may be session specific. That is, the visual indicators 440 in the image manager panel 420 may be associated with camera images 310 corresponding to a particular session 320. For example, when the camera image 310-2 is captured and assigned to the session 320-1, the user interface device 170 may be configured to include content associated with the same session 320-1 in the image manager panel 420. For example, at least the visual indicators 440-1 and 440-2 may be included in the image manager panel 420 because the corresponding camera images 310-1 and 310-2, respectively, are associated with the same session 320-1. The session 320-1 to which the most recently captured camera image is assigned may be referred to as an "active session," and the image manager panel 420 may be configured to include visual indicators 440 for the camera images associated with the active session.
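A session-specific panel can be modeled as a simple filter over the set of captured images. This sketch is a hypothetical illustration of the behavior described above; the record shapes and function name are assumed, not taken from the embodiments.

```python
def panel_indicators(images, active_session):
    """Return indicator ids only for images assigned to the active session."""
    return [img["indicator"] for img in images if img["session"] == active_session]

# Assumed sample data: two images in session "320-1", one in "320-2".
captured = [
    {"indicator": "440-1", "session": "320-1"},
    {"indicator": "440-2", "session": "320-1"},
    {"indicator": "440-3", "session": "320-2"},
]

# Only the active session's indicators populate the panel.
panel = panel_indicators(captured, "320-1")
```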
The image manager panel 420 with session-specific content may help reduce the chance of confusion and/or error as compared to conventional camera interfaces that are not session specific. For example, when the camera images 310-1 and 310-2 are captured and assigned to the session 320-1, and the visual indicators 440-1 and 440-2 associated with the captured camera images 310-1 and 310-2 are displayed in the image manager panel 420, the user can manage the camera images 310-1 and 310-2 in the session 320-1 individually or as a group without having to consider other camera images not associated with the session 320-1. For example, without having to view and/or sort through other camera images not included in the session 320-1, the user may scroll through the visual indicators 440-1 and 440-2 in the image manager panel 420 and initiate performance of at least one operation on the corresponding camera images 310-1 and 310-2, such as deleting an unwanted camera image (e.g., camera image 310-2) from the session 320-1 before permanently storing the remaining camera images (e.g., camera image 310-1) in the storage device 130.
The user interface device 170 may be configured to provide other session-specific views and/or session indicators associated with the camera images 310. For example, FIG. 4D illustrates an exemplary camera image library view 460 in the GUI 230. As shown, the library view 460 may include a plurality of visual indicators 440-1 through 440-J, each of which may correspond to a camera image 310. The GUI 230 may further include session indicators 470-1 and 470-2 (collectively "session indicators 470") configured to indicate that particular visual indicators 440-1 and 440-2, and/or the corresponding camera images (e.g., camera images 310-1 and 310-2), are associated with a common session 320-1. Thus, the user may discern from the GUI view which camera images 310-1 and 310-2 are associated with a particular session 320-1. Although FIG. 4D illustrates an exemplary session indicator 470, other session indicators or other types of session indicators may be used in other embodiments. For example, a similar color, background, border, brightness, or transparency may be used to indicate an association with the common session 320-1.
The user interface device 170 may be configured to provide the user with the ability to select a plurality of visual indicators 440 from the visual indicators 440 included in the image manager panel 420 or the library view 460. For example, the selector 450 may be used to navigate to and select one or more of the visual indicators 440-1 and 440-4 to establish a set of selected camera images (e.g., the camera images 310-1 and 310-4 associated with the visual indicators 440-1 and 440-4).
In particular embodiments, the user interface device 170 may be configured to enable a user to navigate to and select a particular visual indicator 440-1, or the corresponding camera image 310-1, associated with the session 320-1, and to select, from the selected visual indicator 440-1 or camera image 310-1, all other camera images associated with the same session 320-1 (e.g., the camera image 310-2 corresponding to the visual indicator 440-2). For example, with the visual indicator 440-1 highlighted by the selector 450 as shown in FIG. 4D, the user may select an "options" input button (e.g., the "soft" input button 210-2) to launch a window in the GUI 230 that includes selectable operations that may be performed in relation to the camera image 310-1 associated with the visual indicator 440-1.
FIG. 4E illustrates an exemplary options window 475-1 displayed in the GUI 230. The options window 475-1 may include one or more selectable options associated with one or more predetermined operations that may be applied to the selected visual indicator 440-1 and/or the corresponding camera image 310-1. Examples of such options and/or operations include deleting, editing, zooming in on, and publishing the selected camera image 310-1. As shown, one of the selectable options in the options window 475-1 may correspond to an operation for identifying and selecting all camera images associated with the same session 320-1 as the selected camera image 310-1. When this option is selected, all camera images 310 associated with the session 320-1 are selected for inclusion in the selected group.
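The "select all images in the same session" operation described above amounts to expanding a single selection by its session assignment. The following is a hypothetical sketch; the function name and record shapes are assumptions for illustration only.

```python
def select_session_group(images, selected_id):
    """Expand a single selected image to every image sharing its session."""
    # Find the session of the image the user selected.
    session = next(img["session"] for img in images if img["id"] == selected_id)
    # Select every image assigned to that same session.
    return sorted(img["id"] for img in images if img["session"] == session)

# Assumed sample library: two images share session "320-1".
library = [
    {"id": "310-1", "session": "320-1"},
    {"id": "310-2", "session": "320-1"},
    {"id": "310-3", "session": "320-2"},
]
```

Selecting "310-1" here would yield the group {"310-1", "310-2"}, which could then be edited, deleted, or published as a unit.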
The selected group may be indicated in the GUI 230. FIG. 4F illustrates the library view 460 with the visual indicators 440-1 and 440-2 visually marked to indicate that the corresponding camera images 310-1 and 310-2 are included in the user-selected group. In the illustrated example, the group indicator includes a check mark graphic associated with each of the visual indicators 440-1 and 440-2. This is merely illustrative; other group indicators may be used in other embodiments.
The camera images 310-1 and 310-2 selected for inclusion in the group may be operated on as a group. With a group of camera images 310-1 and 310-2 selected as shown in FIG. 4F, for example, the user may select an "options" input (e.g., the "soft" input button 210-2) to launch another options window 475-2 in the GUI 230, the options window 475-2 including selectable operations that may be applied to the selected group of camera images 310-1 and 310-2. FIG. 4G illustrates the options window 475-2, which includes selectable operations for editing, deleting, and publishing the camera images 310-1 and 310-2 as a group. If one of the selectable options in the options window 475-2 is selected, one or more operations associated with the selected option may be applied to the selected group of camera images 310. For example, when the user selects the "publish" option from the options window 475-2 shown in FIG. 4G, the camera images 310-1 and 310-2 included in the selected group may be published concurrently.
While any of a number of operations may be applied to a selected camera image 310 or a selected group of camera images 310, an example of publishing one or more selected camera images 310 will now be described. Publishing a camera image 310 or a group of camera images 310 may include, but is not limited to, sending one or more camera images 310 to another apparatus (e.g., another mobile phone), to a contact in a contacts database, to an external service or site (e.g., a social networking site), to a distribution service, to the storage device 130, to an external data storage device, to the input/output device 140 for display, and/or to any other device in, or interface with, the system 100.
For example, the user may wish to send the selected group of camera images 310-1 and 310-2 to an individual included in the contacts database. In the view shown in FIG. 4G, the user may select the "publish image" option from the options window 475-2. In response to this selection, another options window 475-3 may be displayed in the GUI 230, for example, as shown in FIG. 4H. The options window 475-3 may include selectable options for publishing the selected camera images 310-1 and 310-2. In the example shown in FIG. 4H, the selectable options include options to publish the selected camera images 310-1 and 310-2 to a social networking site, to a World Wide Web location (e.g., a particular website), to one or more contacts, to a locally defined distribution list (e.g., a predetermined group of contacts), and to a distribution service labeled "Express". An example of distribution by an exemplary distribution service is described further below.
If the user selects the "contacts" option from the list of options in the options window 475-3, the user interface device 170 may display another options window 475-4 in the GUI 230, such as shown in FIG. 4I. As shown in FIG. 4I, the options window 475-4 may include one or more selectable options corresponding to predefined contacts, which may be accessed from a contacts database stored in the storage device 130. The user may select one of the contacts listed in the options window 475-4. In response, the publishing device 180 may initiate a transmission to send data representing the selected camera images 310-1 and 310-2 to one or more communication devices associated with the selected contact in the contacts database. For example, the data may be sent to a mobile phone, an email address, and/or another destination specified in the contacts database.
Sending the camera image 310 to a selected contact is only one example of publishing the camera image 310. As described above, publishing may include providing data representing the camera image 310 to other destinations, such as a website and/or a social networking site (e.g., the user's page on the social networking site).
Another example of publishing the camera image 310 includes sending the camera image, or a selected group of camera images, to a distribution service for distribution by the service to one or more predetermined destinations. FIG. 5 illustrates an exemplary publication system 500 (or simply "system 500") in which the mobile device 200 may provide (e.g., upload) one or more camera images 310 to a content distribution subsystem 510 over a network 525. The content distribution subsystem 510 may be configured to distribute the camera images 310 to one or more predetermined destinations 530. FIG. 5 illustrates a single camera image 310-1 uploaded from the mobile device 200 to the content distribution subsystem 510 and distributed from the content distribution subsystem 510 to a plurality of predetermined destinations 530.
The mobile device 200 and the content distribution subsystem 510 may communicate over the network 525 using any communication platforms and technologies suitable for transporting data and/or communication signals, including known communication technologies, devices, media, and protocols supportive of remote data communications, examples of which include, but are not limited to: data transmission media, communications devices, Transmission Control Protocol ("TCP"), Internet Protocol ("IP"), File Transfer Protocol ("FTP"), Telnet, Hypertext Transfer Protocol ("HTTP"), Hypertext Transfer Protocol Secure ("HTTPS"), Session Initiation Protocol ("SIP"), Simple Object Access Protocol ("SOAP"), Extensible Markup Language ("XML") and variations thereof, Simple Mail Transfer Protocol ("SMTP"), Real-Time Transport Protocol ("RTP"), User Datagram Protocol ("UDP"), Global System for Mobile Communications ("GSM") technologies, Code Division Multiple Access ("CDMA") technologies, Evolution Data Optimized Protocol ("EVDO"), Time Division Multiple Access ("TDMA") technologies, Short Message Service ("SMS"), Multimedia Message Service ("MMS"), radio frequency ("RF") signaling technologies, wireless communication technologies (e.g., Bluetooth, Wi-Fi, etc.), in-band and out-of-band signaling technologies, and other suitable communications networks and technologies.
Network 525 may include one or more networks including, but not limited to: a wireless network, a mobile phone network (e.g., a cellular phone network), a closed media network, an open media network, a closed communication network, an open communication network, a satellite network, a navigation network, a broadband network, a narrowband network, a voice communication network (e.g., a voice over internet protocol (VoIP) network), the internet, a wide area network, a local area network, a public network, a private network, and any other network capable of carrying data and/or communication signals between the mobile device 200 and the content distribution subsystem 510. In certain exemplary embodiments, the network 525 comprises a mobile phone network, and the content distribution subsystem 510 and the mobile device 200 are configured to communicate with each other using mobile phone network communication techniques.
In some examples, system 500 may include any computing hardware and/or instructions (e.g., software programs), or combination of computing instructions and hardware, configured to perform the processes described herein. In particular, it should be appreciated that the components of system 500 may be implemented on one physical computing device, or may be implemented on more than one physical computing device. Thus, the system 500 may include any of a number of computing devices and/or computer operating systems (e.g., a mobile device operating system).
Thus, the processes described herein may be implemented, at least in part, as computer-executable instructions tangibly embodied in a computer-readable medium, i.e., instructions executable by one or more computing devices. Generally, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions may be stored and transmitted using a variety of known computer-readable media, including any computer-readable media described above.
The predetermined destination 530 may include any device, service, or other destination configured to receive the camera image 310 distributed by the content distribution subsystem 510. Examples of predetermined destinations include, but are not limited to, another device associated with a user of mobile device 200 (e.g., a personal computer or television service set top box), another mobile device 200 associated with the user or another user (e.g., another mobile phone), a server device associated with a service (e.g., a social networking site server), a data storage device and/or service, and any other destination configured to receive distribution data representing camera image 310.
Any suitable communication technologies may be used to distribute the camera image 310 from the content distribution subsystem 510 to one or more predetermined destinations 530, including any of the communication devices, protocols, formats, networks, and technologies described above. The content distribution subsystem 510 may be configured to distribute the camera image 310 over the same network 525 used to communicate with the mobile device 200 ("in-network") and/or over a communication channel controlled by a common authority. Alternatively or additionally, the content distribution subsystem 510 may be configured to distribute the camera image 310 over a communication channel other than the one used to communicate with the mobile device 200 ("out-of-network") and/or over a communication channel not controlled by that common authority.
The distribution of the camera image 310 by the content distribution subsystem 510 may allow a user of the mobile device 200 to minimize or avoid fees typically charged for sending data representing the camera images 310. For example, certain conventional mobile phone services may be configured to charge a fee for each such communication sent from the mobile device 200. Thus, if the user of the mobile device 200 sent the camera images 310 directly from the mobile device 200 to a plurality of predetermined destinations 530, a fee would be incurred for each individual communication. Alternatively, the user of the mobile device 200 may send the camera images 310 to the content distribution subsystem 510 and incur only a single fee for that transmission. The content distribution subsystem 510 may then distribute the camera images 310 to the plurality of predetermined destinations 530 without the user of the mobile device 200 incurring additional fees for the distribution.
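The fee comparison above reduces to simple arithmetic: a per-transmission fee is incurred once for each communication sent from the mobile device. A toy illustration, using an assumed per-transmission fee (the patent specifies no actual fee amounts):

```python
def direct_cost(n_destinations, fee_per_transmission):
    """Cost of sending one image directly to each destination."""
    return n_destinations * fee_per_transmission

def via_subsystem_cost(fee_per_transmission):
    """Cost of a single upload; the subsystem fans out at no extra charge."""
    return fee_per_transmission

# With 5 destinations and an assumed $0.25 fee per transmission:
# direct sending incurs 5 fees, uploading to the subsystem incurs 1.
```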
The content distribution subsystem 510 may include one or more devices (e.g., one or more servers) configured to receive and distribute data representing the camera images 310 using one or more communication technologies. FIG. 6 illustrates an exemplary content distribution subsystem 510. The components of the content distribution subsystem 510 may include or be implemented as hardware, computing instructions (e.g., software) embodied on one or more computer-readable media, or a combination thereof. In particular embodiments, for example, one or more components of the content distribution subsystem 510 may include or be implemented on at least one server configured to communicate over the network 525. Although an exemplary content distribution subsystem 510 is shown in FIG. 6, the exemplary components illustrated in FIG. 6 are not intended to be limiting. Indeed, additional or alternative components and/or embodiments may be used.
As shown in FIG. 6, the content distribution subsystem 510 may include a communication module 610, which may be configured to communicate with the mobile device 200, including receiving data representing the camera images 310 from the mobile device 200 and providing data representing the camera images 310 to one or more predetermined destinations 530. The communication module 610 may be configured to support multiple communication platforms, protocols, and formats such that the content distribution subsystem 510 may receive content from, and distribute content to, multiple computing platforms (e.g., mobile phone service platforms, web-based platforms, user television platforms, etc.) using multiple communication technologies. Thus, the content distribution subsystem 510 may support a multi-platform system in which content may be received from, and provided to, different platforms.
The content distribution subsystem 510 may include a processing module 620, the processing module 620 configured to control the operation of the components of the content distribution subsystem 510. The processing module 620 may perform operations or direct the performance of operations according to computer-executable instructions stored to a computer-readable medium, such as the data storage area 630. For example, the processing module 620 may be configured to process (e.g., encode, decode, modulate, and/or demodulate) data and communications received from the mobile device 200 and/or the predetermined destination 530, or to be transmitted to the mobile device 200 and/or the predetermined destination 530. As another example, the processing module 620 can be configured to perform data management operations on data stored in the data storage 630. For example, the processing module 620 may manipulate data, including storing data to the data store 630, and indexing, searching, accessing, retrieving, modifying, annotating, copying, and/or deleting data stored in the data store 630.
Data storage 630 may include one or more data storage media, devices, or configurations, and any type, form, and combination of storage media may be used. For example, the data storage 630 may include, but is not limited to, a hard disk drive, a network drive, a flash drive, a magnetic disk, an optical disk, random access memory ("RAM"), dynamic RAM ("DRAM"), other nonvolatile and/or volatile memory units, or combinations or sub-combinations thereof. The data store 630 may store any suitable type or form of electronic data, including camera image data 640 and profile data 650.
Camera image data 640 may include data representing one or more camera images 310, including the camera image 310 received from the mobile device 200 over the network 525. The camera image data 640 may further include data related to the camera images 310, including, for example, camera image metadata.
Profile data 650 may include information associated with one or more users, which may include subscribers to one or more services provided over the network 525, such as the user of the mobile device 200. The profile data 650 may include any information describing a user, user preferences, user-specific settings, and/or services provided to the user. In particular embodiments, the profile data 650 may include predetermined distribution settings associated with a user. The predetermined distribution settings may be used to identify one or more destinations to which the camera images 310 are to be distributed, as described further below.
The distribution settings for a user may be custom defined by the user. The content distribution subsystem 510 and/or the mobile device 200 may be configured to provide one or more tools for customized definition of distribution settings. The tools may be provided in any suitable manner and may include any mechanism or process by which a user can customize one or more predetermined distribution destinations 530. For example, a graphical user interface may be provided that includes one or more tools configured to enable a user to provide distribution information and settings. Thus, the user profile may include personalized distribution settings specifying one or more predetermined distribution destinations and related information, such as addresses, access information (e.g., usernames and passwords), interface information (e.g., API access information), and any other information that may be helpful in identifying the distribution destinations and distributing the camera image 310 thereto. Accordingly, the user profile and the predetermined distribution settings included therein may be used to automatically distribute data representing the camera image 310 from the content distribution subsystem 510 to one or more predetermined destinations 530.
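A user profile carrying predetermined distribution settings might be organized as follows. This is purely an assumed shape for illustration; the patent does not specify field names or a storage format.

```python
# Hypothetical profile record with predetermined distribution settings.
# Every key and value here is an assumption, not from the embodiments.
profile = {
    "user_id": "user-123",
    "distribution_settings": [
        {
            "destination": "social-networking-site",
            "address": "https://social.example/api/upload",
            "access": {"username": "u", "password": "p"},
        },
        {
            "destination": "backup-storage",
            "address": "https://backup.example/store",
        },
    ],
}
```

Each entry pairs a destination with the address and access information needed to deliver a camera image there, so distribution can proceed without prompting the user.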
As shown in FIG. 6, the content distribution subsystem 510 may further include a distribution module 660, which may include or be implemented as hardware, computing instructions (e.g., software) tangibly embodied on a computer-readable medium, or a combination of hardware and embodied computing instructions configured to perform one or more of the content distribution processes described herein. In particular embodiments, the distribution module 660 may be implemented as a software application embodied on a computer-readable medium, such as the data storage 630, and configured to direct the processing module 620 to perform one or more of the processes described herein.
The content distribution subsystem 510 may be configured to identify when one or more camera images 310 received from the mobile device 200 are to be distributed based on predetermined distribution settings specified in the profile 650. The content distribution subsystem 510 may identify one or more predetermined distribution destinations from the predetermined distribution settings, and may distribute the camera image 310 to the predetermined destination 530 or initiate distribution of the camera image 310 to the predetermined destination 530.
For example, when the user of the mobile device 200 selects the "Express" distribution service option from the list of selectable options in the options window 475-3 shown in FIG. 4H, the mobile device 200 may provide data representing the selected camera image 310-1 (or, in other examples, a selected group of camera images 310) to the content distribution subsystem 510 over the network 525, as shown in FIG. 5. Along with the camera image 310-1, the mobile device 200 may provide an indication that the camera image 310-1 is being provided for distribution in accordance with the "Express" distribution service. The communication module 610 may receive the data, and the distribution module 660 may identify from the data a request to distribute the camera image 310-1 in accordance with the "Express" distribution service. The distribution module 660 may access an appropriate profile in the profile data 650, such as a profile associated with the user of the mobile device 200 from which the camera image 310-1 was received. The distribution module 660 may use the predetermined distribution settings specified in the identified profile to determine one or more predetermined destinations 530 to which the camera image 310-1 is to be sent. Using the information included in the distribution settings, the distribution module 660 may initiate distribution of the camera image 310-1 to the identified predetermined destinations 530. The camera image 310-1 may be automatically distributed to the predetermined destinations 530 in accordance with the predetermined distribution settings, without human intervention. In this or a similar manner, a user may upload one or more camera images 310 from the mobile device 200 to the content distribution subsystem 510 for automatic distribution from the content distribution subsystem 510 to one or more predetermined destinations 530.
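The distribution flow described above, a profile lookup followed by fan-out to each predetermined destination, can be sketched as follows. All names and data shapes are hypothetical, and the "send" step is a stub rather than a real transmission.

```python
def distribute_image(image_id, user_id, profiles):
    """Resolve the uploader's predetermined destinations and fan out.

    Returns one (address, image_id) delivery per destination, modeling
    automatic distribution with no further user intervention.
    """
    settings = profiles[user_id]["distribution_settings"]
    return [(dest["address"], image_id) for dest in settings]

# Assumed profile store keyed by user id.
profiles = {
    "user-123": {
        "distribution_settings": [
            {"address": "https://social.example/api/upload"},
            {"address": "https://backup.example/store"},
        ]
    }
}

deliveries = distribute_image("310-1", "user-123", profiles)
```

A single upload thus produces one delivery per predetermined destination, which is the behavior that lets the subsystem absorb the per-destination transmissions on the user's behalf.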
Thus, the captured camera image 310 may be managed, including by uploading and automatically sending the camera image 310 to a predetermined destination 530, such as a blog, a data backup storage device, and/or a social networking site.
The user interface device 170 may be configured to provide visual animation effects in the GUI 230 in association with the GUI views and/or one or more of the operations described above. For example, an animation effect may be displayed that represents the transmission, capture, and/or publication of a camera image 310. In particular embodiments, the animation effect may be displayed concurrently with performance of the data transfer, capture, and/or publication operation. The animation effect may help improve the user experience during memory access latency periods.
As an example of an animation effect, when the GUI view shown in FIG. 4B is displayed and the user captures the camera image 310-2, the user interface device 170 may provide an animation effect designed to illustrate the capture of the camera image 310-2 and/or the transfer of data for the camera image 310-2 from the live camera sensor view 410 to the image manager panel 420. In particular embodiments, the animation effect may be designed to illustrate a funnel-like flow of camera image pixels, such as a flow from the live camera sensor view 410 to the visual indicator 440-2 in the image manager panel 420. In other embodiments, the animation effect may be designed to illustrate a spiral flow and/or compression of camera image pixels, such as from the live camera sensor view 410 to the visual indicator 440-2 in the image manager panel 420. Such an animation effect may provide a visual appearance of pixels being drawn from the live camera sensor view 410 into the visual indicator 440-2. These examples are merely illustrative; similar animation effects may be used in other GUI views, including to illustrate the capture and storage of the camera image 310-2 to the library view 460. Other animation effects may be used in other embodiments.
As another example, an animation effect may be provided and configured to illustrate the publication of the camera image 310. For example, the animation effect may illustrate an expansion of pixels and/or a flow of pixels from the visual indicator 440-2 to a publication destination 530, such as to display the camera image 310-2 in a full-screen view and/or to transmit the pixels to an external destination such as a social networking site.
FIG. 7 illustrates an exemplary camera data management and user interface method. Although FIG. 7 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 7.
In step 710, a camera image is acquired. Step 710 may be performed in any manner as described above, including the camera device 150 capturing a camera image.
In step 720, the acquired camera image is assigned to at least one session based on a predetermined session grouping heuristic. Step 720 may be performed in any manner described above, including using one or more criteria in the predetermined session grouping heuristic 330 to determine that the camera image qualifies for assignment to a session. Step 720 may be performed automatically.
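One plausible criterion for a session grouping heuristic is temporal proximity: images captured close together in time share a session. This is an assumption for illustration only; the predetermined session grouping heuristic 330 may use other or additional criteria, and all names here are hypothetical.

```python
def assign_session(log, capture_time, window_seconds=300):
    """Append a captured image to the log and return its session number.

    Joins the previous image's session when captured within the time
    window; otherwise starts a new session.
    """
    if log and capture_time - log[-1]["time"] <= window_seconds:
        session = log[-1]["session"]
    else:
        session = log[-1]["session"] + 1 if log else 1
    log.append({"time": capture_time, "session": session})
    return session
```

Under this sketch, two images captured 100 seconds apart land in the same session, while an image captured 15 minutes later starts a new one.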
In step 730, a graphical user interface is provided for display. Step 730 may be performed in any manner described above, including the user interface device 170 generating and providing the GUI 230 to the input/output device 140, and the input/output device 140 displaying the GUI 230 for viewing by a user.
As described above, various graphical user interface views may be displayed in the GUI 230, including any of the exemplary graphical user interface views described above and/or illustrated in the figures. User input may be received, and user output may be provided, through the graphical user interface views, as described above. For example, in step 740, an animation effect representing the acquisition of the camera image may be provided in the graphical user interface in any manner described above.
In step 750, a set of one or more camera images may be identified based on user input. Step 750 may be performed in any manner described above, including the user navigating through the GUI 230 and selecting one or more visual indicators representing one or more camera images. From the user selection, the user interface device 170 may identify the set of one or more selected camera images.
In step 760, the set of one or more selected camera images is published. Step 760 may be performed in any manner as described above, including identifying a publish command provided by the user (e.g., by selecting a "publish" option in GUI 230), and providing data representing the one or more selected camera images. In particular embodiments, publishing may include sending data representing the one or more selected camera images to content distribution subsystem 510 over network 525.
At step 770, data representing the one or more selected camera images is distributed to one or more predetermined destinations. Step 770 may be performed in any manner as described above, including content distribution subsystem 510 receiving data representing the one or more camera images and automatically distributing the data representing the one or more camera images to one or more predetermined destinations. In certain embodiments, the camera images are distributed to predetermined destinations according to distribution settings included in a profile (e.g., a user profile).
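For illustration only, distribution according to per-profile settings might look like the following sketch; the profile layout, the `distribution_settings` key, and the `deliver` callback are hypothetical names introduced here, not terms from the disclosure:

```python
def distribute(image_data, profile, deliver):
    """Send image_data to every destination enabled in the profile's
    distribution settings via the supplied deliver callback; returns
    the list of destinations that were sent to."""
    reached = []
    for destination, enabled in profile.get("distribution_settings", {}).items():
        if enabled:
            deliver(destination, image_data)
            reached.append(destination)
    return reached
```

In the arrangement described above, logic of this kind would run in the content distribution subsystem 510 after it receives the image data over the network 525.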
In the foregoing specification, various exemplary embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the appended claims. For example, particular features of one embodiment described herein may be combined with, or substituted for, features of another embodiment described herein. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (23)

1. A method for camera data management, comprising:
displaying, by a computing device, a live camera sensor view in a graphical user interface;
capturing, by the computing device, a camera image;
displaying, by the computing device, a visual indicator representing the captured camera image along with the live camera sensor view in the graphical user interface;
in response to the capturing, displaying, by the computing device, an image manager pane in the graphical user interface with the live camera sensor view, the image manager pane including the visual indicator; and
switching, by the computing device, an active input mode between the live camera sensor view and the image manager pane in response to user input, wherein a selector is displayed in the image manager pane including the visual indicator representing the captured camera image and concurrently with the live camera sensor view in the graphical user interface, the selector including a directional arrow indicator indicating a direction for switching the input mode directly to an active input button for the live camera sensor view.
2. The method of claim 1, further comprising: automatically assigning, by the computing device, the captured camera image to a session based on a predetermined session grouping heuristic.
3. The method of claim 2, wherein the visual indicator comprises a session indicator indicating the assignment of the camera image to the session.
4. The method of claim 1, wherein the visual indicator comprises a user-selectable object.
5. The method of claim 1, wherein the direction indicates a direction on the graphical user interface from the visual indicator to the live camera sensor view.
6. The method of claim 3, wherein the visual indicator comprises a thumbnail image of the captured camera image, and
the session indicator is another image occupying a subspace of the thumbnail image.
7. The method of claim 5, wherein the directional arrow indicator is displayed along an edge of the selector.
8. The method of claim 1, further comprising: displaying, by the computing device, an animation effect in the graphical user interface in response to the capturing of the camera image, the animation effect visually indicating movement of image pixels from the live camera sensor view to the visual indicator representing the camera image.
9. The method of claim 1, further comprising:
providing, by the computing device, data representing the camera image to a content distribution subsystem over a network; and
distributing data representing the camera image from the content distribution subsystem to a plurality of predetermined destinations.
10. The method of claim 1, tangibly embodied on at least one non-transitory computer-readable medium as computer-executable instructions.
11. A method for camera data management, comprising:
displaying, by a computing device, a graphical user interface comprising a live camera sensor view;
capturing, by the computing device, a camera image;
automatically assigning, by the computing device, the camera image to a session based on a predetermined session grouping heuristic;
in response to the capturing, displaying, by the computing device, an image manager pane in the graphical user interface with the live camera sensor view, the image manager pane including a visual indicator representing the captured camera image; and
switching, by the computing device, an active input mode between the live camera sensor view and the image manager pane in response to user input, wherein a selector is displayed in the image manager pane including the visual indicator representing the captured camera image and concurrently with the live camera sensor view in the graphical user interface, the selector including a directional arrow indicator indicating a direction for switching the input mode directly to an active input button for the live camera sensor view.
12. The method of claim 11, wherein the direction indicates a direction on the graphical user interface from the visual indicator to the live camera sensor view.
13. The method of claim 11, further comprising:
capturing, by the computing device, another camera image;
automatically assigning, by the computing device, the another camera image to the session based on the predetermined session grouping heuristic; and
displaying, by the computing device, another visual indicator representing the another captured camera image in the image manager pane.
14. The method of claim 12, wherein the directional arrow indicator is displayed along an edge of the selector.
15. The method of claim 11, wherein the session is defined to include one or more camera images captured over a continuous period of time during which a camera mode is active.
16. The method of claim 11, wherein the session is defined to include one or more camera images captured at a common geographic location.
17. The method of claim 11, wherein the visual indicator comprises a session indicator indicating the assignment of the camera image to the session,
the visual indicator includes a thumbnail image of the captured camera image, and
the session indicator is another image occupying a subspace of the thumbnail image.
18. A system for camera data management, comprising:
a user interface device configured to provide a live camera sensor view for display in a graphical user interface;
a camera device comprising a camera configured to capture camera images; and
a processor configured to execute the user interface device,
wherein the user interface device is further configured to:
providing a visual indicator representing the captured camera image for display in the graphical user interface with the live camera sensor view,
providing an image manager pane for display in the graphical user interface with the live camera sensor view, the image manager pane including the visual indicator;
switching an active input mode between the live camera sensor view and the image manager pane in response to user input, an
Displaying a selector in the image manager pane that includes the visual indicator representing the captured camera image within the graphical user interface and concurrently with the live camera sensor view, the selector including a directional arrow indicator that indicates a direction for switching the input mode directly to an input button active for the live camera sensor view.
19. The system of claim 18, wherein the direction indicates a direction on the graphical user interface from the visual indicator to the live camera sensor view.
20. The system of claim 18, further comprising: a session management device configured to automatically assign the captured camera images to a session based on a predetermined session grouping heuristic.
21. The system of claim 20, wherein the user interface device is further configured to provide a session indicator for display in the graphical user interface, the session indicator configured to indicate an association of the visual indicator with the session.
22. The system of claim 18, implemented on a mobile phone device.
23. The system of claim 22, further comprising a content distribution subsystem configured to communicate with the mobile phone device over a network, including receiving data representing the camera image from the mobile phone device over the network, wherein the content distribution subsystem is configured to distribute the data representing the camera image to a plurality of predetermined destinations.
HK11104514.6A 2008-06-30 2009-06-29 Camera data management and user interface apparatuses, systems, and methods HK1150698B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12/165,411 2008-06-30
US12/165,411 US8477228B2 (en) 2008-06-30 2008-06-30 Camera data management and user interface apparatuses, systems, and methods
PCT/US2009/049018 WO2010002768A1 (en) 2008-06-30 2009-06-29 Camera data management and user interface apparatuses, systems, and methods

Publications (2)

Publication Number Publication Date
HK1150698A1 (en) 2012-01-06
HK1150698B true HK1150698B (en) 2014-06-27
