
HK1192961A - Mobile device power state - Google Patents

Mobile device power state

Info

Publication number: HK1192961A
Application number: HK14106470.0A
Authority: HK (Hong Kong)
Prior art keywords: computing device, orientation, input, input device, power state
Other languages: Chinese (zh)
Inventors: Jim Tom Belesiu, Sharon Drasnin, Michael A. Schwager, Christopher Harry Stoumbos, Mark J. Seilstad
Original assignee: Microsoft Technology Licensing, LLC
Application filed by Microsoft Technology Licensing, LLC
Description

Mobile device power state
RELATED APPLICATIONS
This application claims priority under 35 U.S.C. § 119(e) to the following U.S. provisional patent applications, the entire contents of each of which are incorporated herein by reference:
U.S. provisional patent application No. 61/606,321, attorney docket No. 336082.01, entitled "Screen Edge," filed March 2, 2012;
U.S. provisional patent application No. 61/606,301, attorney docket No. 336083.01, entitled "Input Device Functionality," filed March 2, 2012;
U.S. provisional patent application No. 61/606,311, attorney docket No. 336084.01, entitled "Functional Hinge," filed March 2, 2012;
U.S. provisional patent application No. 61/606,333, attorney docket No. 336086.01, entitled "Usage and Authentication," filed March 2, 2012;
U.S. provisional patent application No. 61/613,745, attorney docket No. 336086.02, entitled "Usage and Authentication," filed March 21, 2012;
U.S. provisional patent application No. 61/606,336, attorney docket No. 336087.01, entitled "Kickstand and Camera," filed March 2, 2012; and
U.S. provisional patent application No. 61/607,451, attorney docket No. 336143.01, entitled "Spanway Provisional," filed March 6, 2012.
Background
Mobile computing devices have been developed to add functionality that can be provided to users in mobile scenarios. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, compose text, interact with an application, and so forth.
Because mobile computing devices are configured to be mobile, these devices typically include some type of battery that acts as a mobile power source for the device. One limitation associated with using battery power is that batteries have a limited useful charge life. When the battery charge of a mobile computing device is depleted, the battery must be recharged or replaced to maintain device operability. Managing the power consumption of a mobile computing device is therefore an important consideration for extending battery life.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Techniques for mobile device power states are described. In one or more implementations, a mobile device includes a computing device flexibly coupled with an input device via a flexible hinge. Accordingly, the mobile device may operate in a variety of different power states based on the positional orientation of the computing device relative to the associated input device. For example, the computing device and the input device may be at different respective tilt angles. The techniques may determine a tilt angle between the computing device and the input device, and may determine a particular power state of the computing device and/or the input device based on the tilt angle. For example, different tilt angle ranges may correspond to different power states.
In one or more implementations, an application resident on a computing device may operate in different application states based on the positional orientation of the computing device relative to an associated input device. For example, particular functions of an application may be enabled or disabled based on the tilt angle between the computing device and the input device. Thus, different tilt angle ranges may correspond to different application states.
In one or more implementations, a technique may cause a computing device to transition between power states in response to detected vibrations. For example, a vibration detection mechanism (e.g., an accelerometer) associated with a computing device in a low power mode may detect vibrations of the computing device and/or an input device coupled with the computing device. For example, the vibrations may be caused by user input to a touch function of the computing device, such as a touch screen of the computing device, a trackpad of a coupled input device, and so forth. Alternatively, the vibration may be due to some other contact with the computing device, such as a result of the user inadvertently striking the computing device, a vibration of a table or other surface on which the computing device is located, and so forth. Thus, the techniques discussed herein may distinguish vibrations caused by touch inputs to a touch function from other types of vibrations. Based on this distinction, the techniques may determine whether to transition between device power states.
Drawings
The detailed description is described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. The entities depicted in the figures may represent one or more entities, whereby reference may be made interchangeably to the singular or plural forms of the entities in the discussion.
FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques described herein.
FIG. 2 depicts an example implementation of the input device of FIG. 1, where the figure shows the flexible hinge in more detail.
FIG. 3 depicts an example orientation of an input device relative to a computing device in accordance with one or more embodiments.
FIG. 4 depicts an example orientation of an input device relative to a computing device in accordance with one or more embodiments.
FIG. 5 depicts an example orientation of an input device relative to a computing device in accordance with one or more embodiments.
FIG. 6 depicts an example orientation of an input device relative to a computing device in accordance with one or more embodiments.
FIG. 7 depicts an example orientation of an input device relative to a computing device in accordance with one or more embodiments.
FIG. 8 depicts an example orientation of an input device relative to a computing device in accordance with one or more embodiments.
FIG. 9 depicts some example rotational orientations of a computing device relative to an input device in accordance with one or more embodiments.
FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
FIG. 13 shows an example system containing various components of an example device that can be implemented as any of the types of computing devices described with reference to FIGS. 1-12 to implement embodiments of the techniques described herein.
Detailed Description
Overview
Techniques for mobile device power states are described. In one or more implementations, a mobile device includes a computing device flexibly coupled with an input device via a flexible hinge. Examples of the input device include a keyboard, a touch panel, a combination of a keyboard and a touch panel, and the like. The computing device includes a display device (e.g., a display surface) and has independent operability separate from the input device, such as outputting content, receiving touch input, and so forth. Thus, the input device provides a mechanism for providing input to the computing device, but the computing device may also operate to provide functionality independent of the input device.
In one or more implementations, a computing device may operate in a plurality of different power states based on a positional orientation of the computing device relative to an associated input device. For example, the computing device and the input device may be at respective different tilt angles. The techniques may determine a tilt angle between the computing device and the input device, and may determine a particular power state of the computing device and/or the input device based on the tilt angle. For example, different tilt angle ranges may correspond to different device power states.
In one or more implementations, an application resident on a computing device may operate in different application states based on the positional orientation of the computing device relative to an associated input device. For example, particular functions of an application may be enabled or disabled based on the tilt angle between the computing device and the input device. Thus, different tilt angle ranges may correspond to different application states.
In one or more implementations, a technique may cause a computing device to transition between power states in response to detected vibrations. For example, a vibration detection mechanism (e.g., an accelerometer) associated with a computing device in a low power mode may detect vibrations of the computing device and/or an input device coupled with the computing device. For example, the vibrations may be caused by user input to a touch function associated with the computing device, such as a touch screen of the computing device and a trackpad of a coupled input device, among others. Alternatively, the vibration may be due to some other contact with the computing device, such as a result of the user inadvertently striking the computing device, a vibration of a table or other surface on which the computing device is located, and so forth.
In response to the detected vibration, the technique can query a computing device function to determine whether the vibration was caused by a touch input by the user. For example, capacitive touch input mechanisms (e.g., trackpads, touch screens, etc.) may be powered on and queried to determine whether the mechanisms are receiving touch input from a user. For example, the touch input may indicate a user's intent to cause the computing device to transition from a low power (e.g., sleep) mode to a functional mode.
If the mechanism indicates that it is receiving touch input, the computing device may wake from the low power mode. If there is no indication of touch input, the computing device may remain in the low power state. Thus, the techniques may detect vibrations of a computing device with a lower-power sensing mechanism and may determine whether the vibrations originate from a touch input using a higher-power mechanism (e.g., a capacitive touch sensor). Doing so enables the higher-power sensing mechanism to remain in a low power state (e.g., an off state) until queried to confirm the presence of a touch input, thereby reducing power consumption of the computing device.
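The wake decision described above can be sketched as a small gating function. This is an illustrative Python sketch (the function names are assumptions, not from the description) in which the higher-power capacitive sensor is only queried after the low-power accelerometer has reported a vibration:

```python
from typing import Callable


def should_wake(vibration_detected: bool,
                query_touch_sensor: Callable[[], bool]) -> bool:
    """Two-tier wake decision: a low-power sensor (the accelerometer)
    gates a higher-power one (the capacitive touch sensor).

    query_touch_sensor stands in for powering on and polling the
    capacitive mechanism; it is only invoked when a vibration was felt,
    so the expensive sensor can otherwise stay off.
    """
    if not vibration_detected:
        # No vibration: leave the device (and the touch sensor) asleep.
        return False
    # A vibration was felt; power on the capacitive sensor and ask
    # whether an actual touch input is present.
    return query_touch_sensor()
```

For example, a table bump would register a vibration but no touch, so the query would report no contact and the device would remain in its low power state.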
In the discussion that follows, an example environment is first described in which the techniques described herein may be employed. Next, a section entitled "Example Device Orientations" describes some example mobile device orientations in accordance with one or more embodiments. Example procedures that may be performed in the example environment, as well as in other environments, are then described; accordingly, performance of the example procedures is not limited to the example environment, and the example environment is not limited to performance of the example procedures. Next, a section entitled "Touch-Initiated Power State Transitions" describes example embodiments that transition between power states based on touch input. Finally, example systems and devices that can implement one or more embodiments are described. Still further, while an input device is described, other devices that do not include input functionality, such as a cover, are also contemplated.
Example Environment
FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques described herein. The illustrated environment 100 includes one example of a computing device 102 that is physically communicatively coupled to an input device 104 via a flexible hinge 106. The computing device 102 may be configured in a variety of ways. For example, the computing device 102 may be configured for mobile use, such as a mobile phone, a tablet computer as shown, and so on. Thus, the computing device 102 may range from full resource devices with rich memory and processor resources to low resource devices with limited memory and/or processing resources. The computing device 102 may also involve software that causes the computing device 102 to perform one or more operations.
For example, the computing device 102 is illustrated as including an input/output module 108. The input/output module 108 represents functionality relating to processing of inputs and rendering of outputs of the computing device 102. The input/output module 108 may process a variety of different inputs, such as inputs corresponding to keys of the input device 104 or to keys of a virtual keyboard displayed by the display device 110, inputs that are gestures recognized through touchscreen functionality of the input device 104 and/or the display device 110 and that cause the corresponding operations to be performed, and so on. Thus, the input/output module 108 may support a variety of different input techniques by recognizing and leveraging the distinctions between types of inputs, including keys, gestures, and so forth.
In the illustrated example, the input device 104 is configured with an input portion comprising a keyboard having a QWERTY key arrangement and a trackpad, however, other key arrangements are also contemplated. Still further, other unconventional configurations are also contemplated, such as configurations of game controllers, simulated musical instruments, and so forth. Thus, the input device 104 and the keys included with the input device 104 may take on a variety of different configurations to support a variety of different functions.
As previously described, the input device 104 is physically and communicatively coupled to the computing device 102 in this example using the flexible hinge 106. The flexible hinge 106 is flexible in that the rotational movement supported by the hinge is achieved through flexing (e.g., bending) of the material forming the hinge, as opposed to mechanical rotation supported by a pin, although such an embodiment is also contemplated. Still further, this flexible rotation may be configured to support movement in one or more directions (e.g., vertically in the figure) while restricting movement in other directions, such as lateral movement of the input device 104 relative to the computing device 102. This may be used to support consistent alignment of the input device 104 relative to the computing device 102, for example, to align sensors used to change power states, application states, and so forth.
For example, the flexible hinge 106 may be formed using one or more layers of fabric and may contain wires formed as flexible traces to communicatively couple the input device 104 to the computing device 102, and vice versa. Such communication may be used, for example, to communicate key press results to the computing device 102, receive power from the computing device, perform authentication, provide supplemental power to the computing device 102, and so forth. The flexible hinge 106 may be configured in a variety of ways, further discussion of which may be found in connection with the figures discussed below.
Computing device 102 also includes an orientation module 112 that represents functionality for determining the positional orientation of computing device 102 relative to input device 104. For example, the orientation module 112 may receive orientation information from the computing device accelerometer 114 and from the input device accelerometer 116. The orientation module 112 may use the orientation information from the respective accelerometers to determine the relative orientation of the device. For example, the relative orientation may indicate an angle of inclination of the computing device 102 (e.g., the display device 110) relative to the input device 104. This orientation information may be used to perform different tasks, such as determining an appropriate power state of the computing device 102 and/or the input device 104, determining an application state for different applications, and so forth.
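As a rough illustration of how the orientation module 112 might combine the two accelerometer readings, the following Python sketch represents each reading as a single tilt angle relative to the earth's surface (a simplification of real accelerometer output; the type and function names are illustrative, not from the description):

```python
from dataclasses import dataclass


@dataclass
class AccelerometerReading:
    """Simplified reading: the device's tilt relative to the earth's
    surface, in degrees (real accelerometers report a 3-axis vector)."""
    tilt_deg: float


def relative_tilt(computing_device: AccelerometerReading,
                  input_device: AccelerometerReading) -> float:
    """Tilt angle of the computing device relative to the input device,
    normalized to the range [0, 360)."""
    return (computing_device.tilt_deg - input_device.tilt_deg) % 360.0
```

With the input device lying flat (0 degrees) and the computing device propped at 115 degrees, this yields a 115-degree relative tilt, which would fall in a typing-style angular range.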
A power state module 118 is included that represents functionality to cause the computing device 102 and/or the input device 104 to operate in different power states. For example, the power state module 118 may power up, power down, and hibernate the computing device 102 and/or the input device 104 based on the different device orientations determined by the orientation module 112. In addition, other various power states are also contemplated. For example, different tilt angle ranges may be associated with different power states of the computing device 102 and/or the input device 104.
The power state module 118 may also be used to cause the computing device 102 and/or the input device 104 to transition power states based on detected vibrations, for example, detected by the computing device accelerometer 114 and/or the input device accelerometer 116. Such vibrations may be due to user contact with the computing device 102 and/or the input device 104. For example, a user may touch the display device 110 and/or the trackpad 120 to initiate waking up the computing device 102 and/or the input device 104 from a sleep mode. The vibration may also be caused by other forms of contact, such as a user hitting the device and/or the surface on which the device is located. As discussed in detail below, the techniques herein may be implemented to distinguish between wake events (e.g., a user touching a key and/or trackpad 120) and non-wake events such as accidental contact with a device.
As described above, the computing device 102 may be rotated to assume different orientations with respect to the input device 104. For example, the computing device 102 may be rotated to a closed position in which the input device 104 covers the display device 110. An example technique for detecting when a computing device is in a closed position uses a first sensing portion 122 and a second sensing portion 124. The first sensing portion 122 is located in a region of the computing device 102, such as below an outer surface near an edge of the computing device 102. Likewise, the second sensing portion 124 may also be located below the outer surface near the edge of the input device 104. The first sensing portion 122 and the second sensing portion 124 together form a sensing mechanism that can detect when the computing device 102 is in the closed position.
For example, the sensing mechanism may utilize the Hall effect, using magnetic force to detect the proximity of the computing device 102 and the input device 104. For example, the first sensing portion 122 may include a Hall effect sensor, and the second sensing portion 124 may include a magnet. When the computing device 102 is rotated to the closed position, the first sensing portion 122 is aligned with the second sensing portion 124, whereby the Hall effect sensor in the first sensing portion 122 detects the magnet in the second sensing portion 124. The first sensing portion 122 can then indicate to different functionality, such as the orientation module 112, the power state module 118, and so forth, that the computing device 102 is in the closed position. When the computing device 102 is positioned away from the input device 104, the first sensing portion 122 does not detect the second sensing portion 124 and can thus indicate to different functionality that the computing device 102 is in an open position.
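The closed/open indication from the sensing mechanism can be sketched as follows; the function and the string states are illustrative stand-ins for however the sensing portion actually reports to the orientation and power state modules:

```python
from typing import Callable


def on_hall_sensor_change(magnet_detected: bool,
                          notify: Callable[[str], None]) -> str:
    """When the Hall effect sensor in the first sensing portion detects
    (or no longer detects) the magnet in the second sensing portion,
    report the corresponding lid position to interested functionality
    (e.g., the orientation module or the power state module).
    """
    position = "closed" if magnet_detected else "open"
    notify(position)  # e.g., trigger a power state transition
    return position
```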
FIG. 2 depicts an example implementation 200 of the input device 104 of FIG. 1 in which the flexible hinge 106 is shown in greater detail. In this example, a connection portion 202 of the input device is shown, where the connection portion is configured to provide a communicative physical connection between the input device 104 and the computing device 102. The illustrated connection portion 202 has a height and cross-section configured to be received in a housing channel of the computing device 102, although this arrangement may be reversed without departing from the spirit and scope thereof.
The connection portion 202 is flexibly connected to a portion of the input device 104 containing the keys by using the flexible hinge 106. Thus, when the connection portion 202 is physically connected to the computing device, the combination of the connection portion 202 and the flexible hinge 106 will support movement of the input device 104 relative to the computing device 102, similar to a hinge of a book.
The connection portion 202 in this example is illustrated as containing magnetic coupling devices 204, 206, mechanical coupling protrusions 208, 210, and communication contacts 212. The magnetic coupling devices 204, 206 are configured to magnetically couple to complementary magnetic coupling devices of the computing device 102 through the use of one or more magnets. In this way, the input device 104 may be physically secured to the computing device 102 through the use of magnetic forces.
The connection portion 202 also includes mechanical coupling protrusions 208, 210 to form a mechanical physical connection between the input device 104 and the computing device 102. The communication contacts 212 are configured to contact corresponding communication contacts of the computing device 102 to form a communicative coupling between the devices.
Having discussed an example environment in which embodiments herein may operate, some example device orientations in accordance with one or more embodiments will now be considered.
Example Device Orientations
The following discussion presents some example device orientations. As described in detail, different device orientations may be associated with different device power states, different application states, and so on.
FIG. 3 illustrates that the input device 104 may be rotated to position the input device 104 against the display device 110 of the computing device 102 such that it assumes an orientation 300. In the orientation 300, the input device 104 may act as a cover, whereby the input device 104 may protect the display device 110 from damage. In an implementation, the orientation 300 may correspond to a closed position of the computing device 102.
As mentioned above, in the closed position, the first sensing portion 122 may detect the proximity of the second sensing portion 124. Thus, the first sensing portion 122 can indicate to different functions that the computing device 102 is in the closed position. For example, the power state module 118 may determine that the computing device 102 is in a closed position and may cause the computing device 102 to transition to an off power state. In the off power state, various functions may be powered down and/or hibernated, such as the input device 104, the display device 110, and so forth.
FIG. 4 illustrates rotation of the input device 104 away from the computing device 102 to cause the computing device to assume an orientation 400. The orientation 400 includes a gap 402 introduced between the computing device 102 and the input device 104. In an implementation, the orientation 400 may be unintentionally caused by a user, such as inadvertently touching the computing device 102 and/or the input device 104, which may cause the computing device 102 to sag slightly away from the input device 104, thereby introducing the gap 402.
In at least some embodiments, at the orientation 400, the first sensing portion 122 cannot detect the proximity of the second sensing portion 124. For example, the distance between the first sensing portion 122 and the second sensing portion 124 introduced by the gap 402 may result in the first sensing portion 122 not detecting the second sensing portion 124.
When the computing device 102 is oriented at an angle relative to the input device 104, such as in the orientation 400, the techniques may determine the angle. For example, the computing device accelerometer 114 may determine the angle at which the computing device 102 is oriented relative to the earth's surface. Still further, the input device accelerometer 116 may determine the angle at which the input device 104 is oriented relative to the earth's surface. As discussed in detail below, the two angles may be compared to determine the tilt angle of the computing device 102 relative to the input device 104.
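One common way to obtain each device's earth-relative angle from accelerometer output is to take the arctangent of two gravity components. The following sketch assumes a simplified two-axis reading (`ax` along the device face, `ay` perpendicular to it); this axis convention is an assumption for illustration, not the patent's stated method:

```python
import math


def earth_relative_angle(ax: float, ay: float) -> float:
    """Estimate a device's tilt relative to the earth's surface from two
    gravity components reported by its accelerometer. Returns degrees
    in [0, 360)."""
    return math.degrees(math.atan2(ay, ax)) % 360.0
```

The tilt angle between the computing device and the input device would then be the difference between the two devices' results.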
In the example shown in FIG. 4, the computing device 102 is oriented at an angle 404 relative to the input device 104. For example, the angle 404 may be determined to be approximately 4 degrees. Although in the orientation 400 the computing device 102 is rotated slightly away from the input device 104 through the angle 404, for purposes of determining an appropriate power state the computing device 102 may still be considered to be in a closed position. For example, the angle 404 may fall within an angular range corresponding to a closed position of the computing device 102, such as a range of 0-30 degrees. As mentioned above, the closed position may correspond to an off power state in which different functionality may be powered down and/or hibernated.
FIG. 5 illustrates an example orientation 500 of the computing device 102. In the orientation 500, the input device 104 is laid flat on a surface and the computing device 102 is deployed at an angle that allows viewing of the display device 110, for example, using a stand 502 deployed on the back of the computing device 102. The orientation 500 may correspond to a typing arrangement whereby input is received via the input device 104, such as using keyboard keys, trackpad 120, and so forth.
With further reference to the example shown in FIG. 5, the computing device 102 is oriented at an angle 504 relative to the input device 104. For example, the angle 504 may be determined to be approximately 115 degrees. The angle 504 may be considered to be within a range of angles corresponding to a typing position of the computing device 102, such as a range of 31-180 degrees. Within this range of angles, the computing device 102 and/or the input device 104 may be placed in a typing power state, in which the input device 104 and the computing device 102 may be powered on such that input can be provided to the computing device 102 via the input device 104.
FIG. 6 illustrates another example orientation of the computing device 102, generally at 600. In the orientation 600, the computing device 102 is oriented such that the display device 110 faces away from the input device 104. In this example, the stand 502 may support the computing device 102, for example, via contact with a back side of the input device 104. Although not explicitly shown here, a cover may be used to cover and protect the front surface of the input device 104.
With further reference to the example shown in FIG. 6, the display device 110 of the computing device 102 is determined to be oriented at an angle 602 relative to the input device 104. For example, the angle 602 may be determined to be approximately 295 degrees. The angle 602 may be considered to be within a range of angles corresponding to a viewing position of the computing device 102, such as a range of 200-360 degrees. The orientation 600 may allow for easy use and/or viewing of the display device 110, such as for viewing content, providing touch input to the computing device 102, and so forth.
Within this range of angles, the computing device 102 and/or the input device 104 may be placed in a viewing power state. In the viewing power state, the computing device 102 may be powered on and the input device 104 may be powered off or hibernated. Battery power used to power the input device 104 may thereby be conserved while still allowing interaction with and/or viewing of the display device 110 of the computing device 102.
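Putting the example ranges together (0-30 degrees closed, 31-180 degrees typing, 200-360 degrees viewing), a tilt-angle-to-power-state mapping could be sketched as below. Note that the description leaves the 181-199 degree gap unspecified; treating it as part of the viewing range here is an assumption of this sketch:

```python
def power_state_for_tilt(angle_deg: float) -> str:
    """Map a tilt angle between the computing device and the input
    device to an example power state, using the angular ranges given
    in the description."""
    if 0.0 <= angle_deg <= 30.0:
        return "off"      # closed position: functionality powered down/hibernated
    if angle_deg <= 180.0:
        return "typing"   # both devices powered on for keyboard/trackpad input
    # 181-360 degrees: treated as the viewing range here (the 181-199
    # gap is an assumption; the description specifies 200-360).
    return "viewing"      # display on; input device powered off or hibernated
```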
FIG. 7 illustrates an example orientation 700 in which the input device 104 can also be rotated to be deployed against the back of the computing device 102, such as against a rear housing of the computing device 102 that is disposed opposite the display device 110 of the computing device 102. In this example, the flexible hinge 106 may be caused to "wrap around" the connection portion 202 by the orientation of the connection portion 202 relative to the computing device 102 in order to position the input device 104 at the rear of the computing device 102.
Such wrap-around may result in a portion of the rear of the computing device 102 remaining exposed. This can be used for a variety of functions, such as in the example orientation 700, allowing the use of a camera 702 located at the rear of the computing device 102 even if a significant portion of the rear of the computing device 102 has been covered by the input device 104.
With further reference to the example shown in FIG. 7, the display device 110 of the computing device 102 is determined to be oriented at an angle 704 relative to the input device 104. As an example, the angle 704 may be determined to be approximately 360 degrees. The angle 704 may thus be considered to be within the angular range corresponding to the viewing position (mentioned above), whereby the computing device 102 is in the viewing power state. As mentioned above, the viewing power state may allow viewing of and/or interaction with the display device 110 while the input device 104 is powered down or hibernated. The camera 702 may be powered in the viewing power state so that photographs may be captured while the computing device is in this state.
FIG. 8 illustrates another example orientation of the computing device 102, generally at 800. In the orientation 800, the computing device 102 is rotated laterally, e.g., relative to a vertical orientation of the surface 802 on which the computing device 102 is placed. The display device 110 is visible, and the input device 104 is rotated away from the display device 110. In at least some implementations, the width of the input device 104 can be narrower than the width of the computing device 102. Additionally or alternatively, the width of the input device 104 may be tapered such that the edge closest to the hinge 106 is wider than the outermost edge. Doing so may allow the face of the display device to lean back in the orientation 800 to provide a proper viewing angle.
With further reference to the example shown in fig. 8, the techniques discussed herein may determine that the computing device 102 is deployed in the orientation 800. For example, the computing device accelerometer 114 and/or the input device accelerometer 116 may determine that the computing device 102 and/or the input device 104 are rotated to the orientation 800. In the orientation 800, the screen orientation of the display device 110 may be rotated 90 degrees, for example, to a portrait viewing mode. Still further, the computing device 102 may be placed in a viewing power state. As mentioned above, the viewing power state may allow viewing of and/or interaction with the display device 110 while the input device 104 is powered down or hibernated.
FIG. 9 illustrates that the computing device 102 may be rotated in a plurality of different angular ranges relative to the input device 104. As described in detail herein, different angular ranges may be associated with different power states, different application states, and so forth.
An angular range 900 corresponding to the closed position of the computing device 102 is illustrated. Thus, if the computing device 102 is positioned at an angle within the angular range 900 relative to the input device 104, it may be determined that the computing device 102 is in the closed position. As mentioned above, the closed position may include an associated off power state in which different functions may be powered down and/or hibernated, such as the input device 104, the display device 110, and so forth.
Also shown is a range of angles 902 corresponding to the typing orientation of the computing device 102. Thus, if the computing device 102 is positioned at an angle within the range of angles 902 relative to the input device 104, the computing device 102 may be determined to be in a typing orientation. In this orientation, the computing device 102 and/or the input device 104 may be placed in a typing power state in which the input device 104 and the computing device 102 may be powered on, whereby input may be provided to the computing device 102 via the input device 104, touch input may be provided to the display device 110, and so on.
Fig. 9 also illustrates a range of angles 904 corresponding to a viewing position of the computing device 102. Thus, if the computing device 102 is positioned at an angle within the range of angles 904 relative to the input device 104, it may be determined that the computing device 102 is in a viewing orientation. In this orientation, the computing device 102 and/or the input device 104 may be placed in a viewing power state, whereby the computing device 102 may be powered on and the input device 104 may be powered off or hibernated.
The orientations, angular ranges, power states, etc. described above are given for illustrative purposes only. It is contemplated that a variety of different orientations, power states, and angular ranges may be implemented within the spirit and scope of the claimed embodiments.
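As one hypothetical sketch of the angle-range mapping described above: the boundary values below are assumptions made purely for illustration, since the description assigns no specific angles to the ranges 900, 902, and 904.

```python
# Hypothetical angle ranges (in degrees). The description does not specify
# exact boundaries, so these values are illustrative only.
CLOSED_RANGE = (0.0, 15.0)      # corresponds to range 900 (closed position)
TYPING_RANGE = (15.0, 135.0)    # corresponds to range 902 (typing orientation)
# Angles at or beyond TYPING_RANGE are treated as range 904 (viewing).

def power_state_for_angle(angle_degrees):
    """Map the angle between the computing device and the input device
    to a power state name."""
    if CLOSED_RANGE[0] <= angle_degrees < CLOSED_RANGE[1]:
        return "off"      # functions such as input and display powered down
    if TYPING_RANGE[0] <= angle_degrees < TYPING_RANGE[1]:
        return "typing"   # both devices powered; keyboard input enabled
    return "viewing"      # computing device powered; input device off/hibernated
```

A lookup like this could run whenever the relative orientation changes, with each returned state driving the power-up or hibernation of the associated functions.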
Having discussed some example device orientations, some example procedures in accordance with one or more embodiments will now be considered.
Exemplary procedure
FIG. 10 is a flow diagram that describes steps in a method in accordance with one or more embodiments. In at least some embodiments, the method can be used to determine the orientation of a computing device relative to an input device.
Step 1000 determines a gravitational orientation of a computing device. For example, the orientation (e.g., gravity vector) of the computing device accelerometer 114 relative to the earth's gravity may be determined. In an embodiment, this may include determining an angle at which the axis of the computing device accelerometer 114 is oriented relative to earth gravity.
Step 1002 determines a gravitational orientation of an input device. For example, the orientation of the input device accelerometer 116 relative to the earth's gravity may be determined. In an embodiment, this may include determining the angle at which the axis of the input device accelerometer 116 is oriented relative to earth gravity.
Step 1004 determines an orientation of the computing device relative to the input device by comparing a gravitational orientation of the computing device to a gravitational orientation of the input device. For example, the angle at which the computing device is oriented relative to gravity may be compared to the angle at which the input device is oriented relative to gravity to determine the angle at which the computing device is oriented relative to the input device.
One example way to determine the orientation is to consider it as the angle Θ (theta) between the computing device and the input device. Theta can be expressed by the equation Θ = arccos((A · B) / (|A| |B|)), that is, the dot product of the gravity vectors divided by the product of their magnitudes, where A is the gravity vector of the computing device and B is the gravity vector of the input device. This equation is presented for illustrative purposes only, and a variety of techniques may be used to determine the orientation of the computing device relative to the input device within the spirit and scope of the claimed embodiments.
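A minimal sketch of this computation, assuming three-axis accelerometer readings represented as (x, y, z) tuples:

```python
import math

def relative_angle(a, b):
    """Angle theta (in degrees) between gravity vector A of the computing
    device and gravity vector B of the input device, via
    theta = arccos((A . B) / (|A| |B|))."""
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(x * x for x in b))
    # Clamp to [-1, 1] to guard against floating-point rounding.
    cos_theta = max(-1.0, min(1.0, dot / (mag_a * mag_b)))
    return math.degrees(math.acos(cos_theta))
```

Note that arccos returns values in [0°, 180°], so distinguishing, say, a 270° wrap-around from a 90° opening would need additional sign or hinge information; the equation above, like the one in the text, is illustrative only.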
Although techniques are discussed herein with respect to determining relative orientations using accelerometers, a variety of different techniques may be used to determine orientation within the spirit and scope of the claimed embodiments.
FIG. 11 is a flow diagram that describes steps in a method in accordance with one or more embodiments. Step 1100 determines the orientation of the computing device relative to the input device. As described above, the orientation may include the angle at which the computing device is oriented relative to the input device, or vice versa.
Step 1102 determines a power state based on the orientation. For example, the power state may be determined for a computing device, an input device, and/or other devices operatively associated with the computing device. Examples of different power states have been discussed above.
Step 1104 determines an application state based on the orientation. For example, based on a particular orientation, a particular function of an application may be enabled or disabled. In an embodiment, steps 1102 and 1104 may occur simultaneously, sequentially, alternately, or otherwise.
The application state may be determined from a set of application states that may be applied to the application while the application is running on the computing device. Thus, the application may comprise different operating states, and at least some of the operating states depend on the device orientation. For example, consider a scenario that contains an application that allows a user to play a digital piano with the aid of a computing device. An input device operatively attached to the computing device may include keys that may be depressed to play different notes of the piano. Thus, when the input device is deployed in an orientation in which input may be provided via the input device (such as the orientation 500 discussed above with reference to FIG. 5), the application may enable functionality to receive input from the input device to play a note.
However, the application may disable functionality for receiving input from the input device when the input device is deployed in a different orientation. For example, in the orientation 700 described above, the input device 104 may be powered down or hibernated. Thus, in this orientation, the instantiated application may disable the functionality of receiving input via the input device. Still further, the application may enable other functionality for receiving input, such as presenting visual piano keys that may be displayed via the display device 110 and may receive touch input from a user to play the digital piano.
As another example, the input device may be configured as a game controller. Thus, the gaming application may enable or disable particular functionality related to the game based on the orientation of the computing device and/or the input device.
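A minimal sketch of this kind of orientation-dependent application state, using the piano example above; the orientation names and state keys are hypothetical, chosen only to illustrate the enable/disable behavior:

```python
def apply_orientation(app_state, orientation):
    """Enable or disable an application's input functionality based on the
    device orientation. `app_state` is a plain dict of feature flags."""
    if orientation == "typing":        # e.g., the orientation of FIG. 5:
        app_state["hardware_keys_enabled"] = True   # play notes via keys
        app_state["onscreen_keys_enabled"] = False
    elif orientation == "wrapped":     # e.g., the orientation of FIG. 7:
        app_state["hardware_keys_enabled"] = False  # input device hibernated
        app_state["onscreen_keys_enabled"] = True   # fall back to touch keys
    return app_state
```

The same pattern could drive a game controller scenario, with the gaming application toggling game-specific functions as the orientation changes.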
Touch initiated power state transitions
In at least some implementations, techniques allow for transitioning power states in response to a detected touch interaction. For example, vibrations generated by touch interactions may be detected, thereby triggering certain events.
FIG. 12 is a flow diagram that describes steps in a method in accordance with one or more embodiments. Step 1200 monitors for vibration on a device in a low power state. For example, the monitoring may be performed in a low power state (e.g., sleep mode) of the computing device 102 in which various functions are powered off and/or hibernated, such as the keyboard and trackpad 120 of the input device 104, the processor and/or display device 110 of the computing device 102, and so forth. In the low power state, the computing device accelerometer 114 and/or the input device accelerometer 116 may remain powered in order to detect vibration.
Step 1202 detects vibration on the device. For example, the computing device accelerometer 114 and/or the input device accelerometer 116 may detect vibrations. As mentioned above, a variety of different events may cause the device to vibrate. For example, a user may provide touch input to the touch functionality to cause the device to wake up from a low power mode. Alternatively, the vibration may be caused by other types of contact with the device, such as the user hitting the device, the user hitting a surface on which the device is located, and so forth.
Step 1204 determines whether the vibration exceeds a vibration threshold. The vibration threshold may be specified according to any suitable metric, such as in units of gravitational acceleration ("g"), hertz ("Hz"), or the like. For example, over some window of time T, vibration may be detected if the readings from the accelerometer contain N zero crossings and N + 1 values whose magnitude exceeds a threshold. For instance, if within 5 ms the readings from the accelerometer are +1 g, then -1 g, then +1 g, this may be considered a simple impact or vibration event. These values are merely examples, and other values may be used in accordance with particular embodiments.
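A minimal sketch of this zero-crossing heuristic; the threshold and crossing counts below are the example values from the text, not mandated by any particular implementation:

```python
def vibration_detected(samples, threshold_g=1.0, min_crossings=2):
    """Return True when a window of accelerometer readings contains at
    least `min_crossings` zero crossings and `min_crossings + 1` values
    whose magnitude exceeds the threshold (the N / N + 1 rule above)."""
    crossings = sum(1 for prev, cur in zip(samples, samples[1:])
                    if prev * cur < 0)               # sign change = crossing
    peaks = sum(1 for s in samples if abs(s) >= threshold_g)
    return crossings >= min_crossings and peaks >= min_crossings + 1
```

With these defaults, the +1 g, -1 g, +1 g sequence from the text yields two zero crossings and three above-threshold values, and so counts as a vibration event.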
If the vibration does not exceed the vibration threshold ("NO"), the method returns to step 1200. If the vibration exceeds the vibration threshold ("YES"), step 1206 powers on the touch functionality. The touch functionality includes functionality configured to receive touch input; examples include a trackpad (e.g., trackpad 120), a touch screen (e.g., display device 110), a capacitive touch device, a keyboard of the input device 104, and so forth. In at least some implementations, the accelerometer that detects the vibration can issue a notification to a device processor, which can then cause power to be provided to the touch functionality. For example, the touch functionality may be in a powered-down mode, e.g., a sleep mode, before the vibration is detected. Thus, the touch functionality may be powered on in response to detecting the vibration.
Step 1208 determines whether a touch input has been received via the touch functionality. For example, the touch functionality may be queried to determine whether a touch input was received. If no touch input is received ("NO"), step 1210 powers down the touch functionality. As mentioned above, in addition to touch input to the touch functionality, vibrations may also originate from other forms of contact with the device, such as a user inadvertently bumping the device. In at least some embodiments, the method can then return to step 1200.
If a touch input is received ("YES"), step 1212 causes the device to transition to a different power state. For example, the device may transition from a low power state to a powered state. Examples of power-on states include the key-in and view power states described above. Thus, different power states may cause different functions to be powered on, such as the processor of the computing device 102, the display device 110, the input device 104, and so forth.
Thus, the approach described in FIG. 12 can allow touch functions that consume more power (e.g., capacitive sensing functions) to remain in a low power mode while functions that consume relatively less power can remain powered on in order to detect vibrations associated with possible touch interactions. If vibration is detected, the touch function may be powered on to determine if the vibration is caused by a touch input to the touch function, such as by a user wishing to wake up the device from a low power mode. Thus, low power functions (e.g., accelerometers) may be used as a monitoring mechanism, while functions that consume more power (e.g., touch functions) may be used as a confirmation mechanism to determine whether the detected vibrations are the result of a touch input or some other event.
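The monitor/confirm flow of FIG. 12 can be sketched as follows, with hypothetical stub drivers standing in for real accelerometer and touch hardware; the driver interfaces are assumptions made for illustration only:

```python
class StubAccelerometer:
    """Hypothetical low-power accelerometer driver (the monitor)."""
    def __init__(self, windows):
        self._windows = list(windows)   # queued windows of readings, in g

    def read(self):
        return self._windows.pop(0) if self._windows else []

class StubTouch:
    """Hypothetical touch-function driver (the confirmation mechanism)."""
    def __init__(self, will_receive_input):
        self.powered = False
        self._will_receive = will_receive_input

    def power_on(self):
        self.powered = True

    def power_down(self):
        self.powered = False

    def input_received(self):
        return self._will_receive

def exceeds_threshold(samples, threshold_g=1.0):
    # Minimal stand-in for the threshold test of step 1204.
    return any(abs(s) >= threshold_g for s in samples)

def monitor_once(accelerometer, touch):
    """One pass through the FIG. 12 flow."""
    samples = accelerometer.read()          # steps 1200/1202: monitor/detect
    if not exceeds_threshold(samples):      # step 1204: below threshold
        return "low-power"                  # remain asleep
    touch.power_on()                        # step 1206: confirm with touch
    if touch.input_received():              # step 1208
        return "powered"                    # step 1212: transition power state
    touch.power_down()                      # step 1210: false alarm
    return "low-power"
```

In this sketch, a bump that produces vibration but no touch input leaves the device asleep with the touch function powered back down, while a deliberate touch transitions the device to a powered state.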
Exemplary System and device
FIG. 13 illustrates an example system generally at 1300 that includes an example computing device 1302, which is representative of one or more computing systems and/or devices that can implement the various techniques described herein. The computing device 1302 may be configured to assume a mobile configuration, for example through use of a housing shaped and sized to be held or carried by one or more hands of a user. Illustrated examples include mobile phones, mobile gaming and music devices, and tablet computers, although other examples are contemplated.
The illustrated example computing device 1302 includes a processing system 1304, one or more computer-readable media 1306, and one or more I/O interfaces 1308, which are communicatively coupled to each other. Although not shown, computing device 1302 may also include a system bus or other data and command transfer system that couples the various components to one another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. In addition, various other examples are also contemplated, such as control and data lines.
The processing system 1304 represents functionality that uses hardware to perform one or more operations. Accordingly, the processing system 1304 is illustrated as including hardware elements 1310 that may be configured as processors, functional blocks, and the like. This may include hardware implementations as application specific integrated circuits or other logic devices formed using one or more semiconductors. The hardware element 1310 is not limited by the materials from which it is constructed or the processing mechanisms employed within it. For example, a processor may include one or more semiconductors and/or transistors (e.g., electronic Integrated Circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.
Computer-readable storage medium 1306 is illustrated as including memory/storage 1312. The memory/storage 1312 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1312 may include volatile media (such as Random Access Memory (RAM)) and/or nonvolatile media (such as Read Only Memory (ROM), flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1312 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disk, and so forth). The computer-readable medium 1306 may be configured in a variety of other ways as further described below.
One or more input/output interfaces 1308 represent functionality that enables a user to enter commands and information into the computing device 1302, and also enables information to be presented to the user and/or other components or devices through use of various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a haptic response device, and so forth. Thus, the computing device 1302 may be configured in a variety of ways to support user interaction.
Computing device 1302 is further illustrated as being physically and communicatively coupled to an input device 1314, where the input device 1314 is physically and communicatively removable from the computing device 1302. In this manner, a variety of different input devices may be coupled with computing device 1302 in a variety of configurations to support a variety of functions. In this example, input device 1314 includes one or more keys 1316, which may be configured as pressure sensitive keys, mechanical toggle keys, and the like.
The input device 1314 is further illustrated as including one or more modules 1318 that can be configured to support a variety of functions. For example, the one or more modules 1318 may be configured to process analog and/or digital signals received from the keys 1316 to determine whether a keystroke was intended, to determine whether an input indicates a resting pressure, to support authentication of the input device 1314 to be operational with the computing device 1302, and so forth.
The various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. The terms "module," "functionality," and "component" as used herein generally represent software, firmware, hardware, or a combination thereof. Furthermore, the features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
The techniques may also be implemented in a network environment, such as one that uses various cloud-based resources. For example, the above-described methods, processes, etc., may utilize network resources to enable different functionality.
Implementations of the modules and techniques described may be stored on or transmitted across some form of computer readable media. Computer-readable media can include a variety of media that can be accessed by computing device 1302. By way of example, and not limitation, such computer-readable media may comprise "computer-readable storage media" and "computer-readable signal media".
"computer-readable storage medium" may refer to media and/or devices that allow information to be stored, permanently and/or non-temporarily, as opposed to mere signal transmission, carrier waves, or the signal itself. Accordingly, computer-readable storage media refer to non-signal bearing media. The computer-readable storage media include hardware, such as volatile and nonvolatile, removable and non-removable media and/or storage devices, implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, logic elements/circuits or other data. Examples of computer readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage devices, tangible media, or articles of manufacture suitable for storing the desired information and capable of being accessed by a computer.
"computer-readable signal medium" may refer to a signal-bearing medium configured to transmit instructions to the hardware of computing device 1302, which may be transmitted via a network, for example. "signal media" may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier wave, data signal, or other transport mechanism. Signal media also includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes both wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
As previously described, hardware element 1310 and computer-readable medium 1306 represent modules, programmable device logic, and/or fixed device logic implemented in hardware, which may be used in some embodiments to implement at least some aspects of the techniques described herein, such as to execute one or more instructions. The hardware may include components of integrated circuits or systems on a chip, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Complex Programmable Logic Devices (CPLDs), and other implementations using silicon or other hardware. In this context, hardware may operate as a processing device to perform program tasks defined by instructions and/or logic contained in the hardware, as well as hardware to store instructions for execution (e.g., the computer-readable storage media described above).
Combinations of the above may also be used to implement the different techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage medium and/or may be implemented by one or more hardware elements 1310. Computing device 1302 may be configured to implement particular instructions and/or functions corresponding to software and/or hardware modules. Accordingly, implementations of modules that may be executed as software by the computing device 1302 may be implemented at least in part in hardware, for example, using computer-readable storage media and/or hardware elements 1310 of the processing system 1304. These instructions and/or functions may be executed/operated by one or more articles of manufacture (e.g., one or more computing devices 1302 and/or processing systems 1304) to implement the techniques, modules, and examples described herein.
Various methods are discussed herein that may be implemented to perform the techniques discussed herein. Aspects of the method may be implemented in hardware, firmware, software, or a combination thereof. The methodologies are shown as a set of blocks, and the blocks specify operations performed by one or more devices, but the blocks are not necessarily limited to the order in which the blocks shown perform the operations. Still further, operations shown with respect to a particular method may be combined and/or interchanged with operations of a different method according to one or more embodiments. Aspects of these methods may be implemented via interactions between the different entities discussed above with reference to environment 100.
Conclusion
Although the illustrative embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described above. Rather, the specific features and acts are disclosed as example forms of implementing the claims.

Claims (11)

1. A computer-implemented method, comprising:
determining (1100) an orientation of a computing device relative to an input device communicatively coupled to the computing device; and
determining (1102) a power state of one or more of the computing device or input device based on the orientation.
2. The method of claim 1, wherein the orientation comprises an angle at which the computing device is positioned relative to the input device.
3. The method of claim 1, wherein the orientation comprises an angle by which the computing device is rotated relative to the input device, and wherein the determining comprises associating the angle with one of a plurality of angular ranges, each of the angular ranges corresponding to a respective power state of the computing device.
4. The method of claim 1, wherein the orientation comprises a typing orientation, and wherein the power state comprises a typing power state in which the computing device and the input device are powered so that input may be provided to the computing device via the input device.
5. The method of claim 1, wherein the orientation comprises a viewing orientation, and wherein the power state comprises a viewing power state in which the computing device is powered on and the input device is one of powered off or hibernated.
6. A computer-implemented method, comprising:
determining (1100) an orientation of a computing device relative to an input device communicatively coupled to the computing device; and
an application state is determined (1104) based on the orientation.
7. The method of claim 6, wherein the determining comprises enabling or disabling functionality of an application based on the orientation.
8. The method of claim 6, wherein the application state is determined from a set of application states that may be applied to the application while the application is running on a computing device.
9. A computer-implemented method, comprising:
monitoring (1200) vibrations in the device in a low power state;
in response to detecting the vibration, determining (1204) whether the vibration exceeds a vibration threshold;
if the vibration exceeds a vibration threshold:
energizing (1206) a touch function associated with the device;
determining (1208) whether a touch input is received via a touch function; and
if a touch input is received via the touch functionality, the device is caused (1212) to transition to a different power state.
10. The method of claim 9, wherein the low power state comprises a state in which one or more functions of the device are powered off, wherein the different power state comprises a state in which one or more functions are powered on.
11. A computer-readable storage medium having computer-readable instructions stored thereon, which when executed by a computer, cause the computer to perform the method of any one of claims 1-10.
HK14106470.0A 2012-03-02 2014-06-27 Mobile device power state HK1192961A (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US61/606,301 2012-03-02
US61/606,321 2012-03-02
US61/606,336 2012-03-02
US61/606,333 2012-03-02
US61/606,313 2012-03-02
US61/607,451 2012-03-06
US61/613,745 2012-03-21
US13/471,001 2012-05-14

Publications (1)

Publication Number Publication Date
HK1192961A true HK1192961A (en) 2014-09-05


Similar Documents

Publication Publication Date Title
US9047207B2 (en) Mobile device power state
US20150234478A1 (en) Mobile Device Application State
KR102085267B1 (en) Use of Accelerometer Inputs to Change the Operating State of a Convertible Computing Device
US9892490B2 (en) Electronic apparatus
KR101880913B1 (en) Method and apparatus for controlling flexible display panel
CN105144016B (en) Additional input device, equipment and method
US9417665B2 (en) Providing an alternative human interface
US20110285631A1 (en) Information processing apparatus and method of displaying a virtual keyboard
US9674335B2 (en) Multi-configuration input device
US20120120000A1 (en) Method of interacting with a portable electronic device
US20130019192A1 (en) Pickup hand detection and its application for mobile devices
WO2012135935A2 (en) Portable electronic device having gesture recognition and a method for controlling the same
JP2015518579A (en) Flexible display device and operation method thereof
KR20170075798A (en) Fabric laminated touch input device
JP2009199537A (en) Electronic apparatus and method of controlling same
WO2012061917A1 (en) Motion gestures interface for portable electronic device
JP6304232B2 (en) Portable electronic device, its control method and program
HK1192961A (en) Mobile device power state
US20110296223A1 (en) Electronic Apparatus and Control Method of the Electronic Apparatus
JP2014071833A (en) Electronic apparatus, display change method, display change program
JP2015064738A (en) Electronic apparatus, processing method and program
HK1191707B (en) Sensor fusion algorithm