US20190377451A1 - Electronic device applicable to interaction control - Google Patents
- Publication number
- US20190377451A1 (application US16/158,324)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- force
- predetermined
- sensitive component
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to control of user interfaces, and more particularly, to an electronic device applicable to interaction control.
- Interaction mechanisms have been applied in electronic devices such as multifunctional mobile phones. These mechanisms typically provide interaction between a human and a machine, rather than interaction between machines. Some related arts have attempted to achieve interaction between machines, but this results in problems such as high manufacturing cost and excess space occupied by the overall structure. Hence, there is a need for a novel method and related structure which can achieve interaction mechanisms between devices without introducing side effects, or in a way that is less likely to introduce side effects.
- An objective of the present invention is to provide an electronic device applicable to interaction control, in order to solve the above-mentioned problems.
- Another objective of the present invention is to provide an electronic device applicable to interaction control, in order to achieve an interaction mechanism between devices without introducing side effects, or in a way that is less likely to introduce side effects.
- At least one embodiment of the present invention provides a first electronic device, applicable to interact with a second electronic device.
- the first electronic device comprises a first force-sensitive component, a camera module and a processing circuit.
- the first force-sensitive component is arranged to detect a contact event between the first electronic device and the second electronic device, and the camera module is arranged to capture at least one partial image of at least one predetermined image, wherein a display module of the second electronic device displays said at least one predetermined image.
- the processing circuit is coupled to the first force-sensitive component as well as the camera module.
- the processing circuit determines at least one relative location of the first electronic device with respect to the second electronic device according to said at least one partial image for the second electronic device to use, and the display module of the second electronic device displays a display content corresponding to said at least one relative location.
- At least one embodiment of the present invention provides a second electronic device applicable to interact with a first electronic device.
- the second electronic device comprises a second force-sensitive component, a display module and a processing circuit.
- the second force-sensitive component is arranged to detect a contact event between the first electronic device and the second electronic device.
- the display module is arranged to display at least one predetermined image, wherein a camera module of the first electronic device captures at least one partial image of said at least one predetermined image, to allow the first electronic device to determine at least one relative location of the first electronic device with respect to the second electronic device for the second electronic device to use.
- the processing circuit is coupled to the second force-sensitive component as well as the display module. The processing circuit controls the display module to display said at least one predetermined image, and controls the display module to display a display content corresponding to said at least one relative location.
- the present invention is capable of performing proper control of operations of the electronic devices, and more particularly, is capable of achieving an interaction mechanism between devices in limited space to avoid various problems presented in the related arts.
- the embodiments illustrated by the present invention may solve the related art problems without significantly introducing additional costs.
- FIG. 1 is a diagram illustrating an interaction system according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating a working flow of a method for performing interaction control according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating implementation details of the interaction system according to an embodiment of the present invention.
- FIG. 4 is a diagram illustrating implementation details of the interaction system according to another embodiment of the present invention.
- FIG. 5 is a diagram illustrating implementation details of the interaction system according to yet another embodiment of the present invention.
- FIG. 6 is a diagram illustrating implementation details of the interaction system according to still another embodiment of the present invention.
- FIG. 7 illustrates some examples of predetermined images in the method.
- FIG. 8 illustrates a working flow of the method according to an embodiment of the present invention.
- FIG. 1 is a diagram illustrating an interaction system 100 according to an embodiment of the present invention.
- the interaction system 100 may comprise electronic devices 110 and 120 .
- Examples of the electronic device 110 may include, but are not limited to: a tablet, a multifunctional mobile phone, a wearable device, a simple input gadget (e.g. a gadget allowing for simple input commands), a toy, etc.
- Examples of the electronic device 120 may include, but are not limited to: an interaction table, all-in-one personal computer (PC), laptop computer, tablet, multifunctional mobile phone, large sized toy, etc., wherein the size of the electronic device 120 is usually larger than that of the electronic device 110 , for example, may be 8 inches or above, but the present invention is not limited thereto.
- the electronic device 110 may comprise a processing circuit 112 (e.g. a processor, microprocessor, microcontroller, etc.), at least one force-sensitive component 114 (e.g. one or more force-sensitive components, such as force sensors, pressure sensors, etc.), a communications module 116 (particularly, a wireless communications module conforming to a predetermined communications standard such as Bluetooth or Wi-Fi) and a camera module 118 .
- the electronic device 120 may comprise a processing circuit 122 (e.g. a processor, microprocessor, microcontroller, etc.), at least one force-sensitive component 124 (e.g. one or more force-sensitive components, such as force sensors, pressure sensors, etc.), a communications module 126 and a display module 128 , but the present invention is not limited thereto.
- the structure of the interaction system 100 of the present invention is modifiable.
- the quantity, type, arrangement, etc. of at least one of the electronic devices 110 and 120 can be modified.
- some components in any of the electronic devices 110 and 120 may be integrated into a same module.
- the processing circuits 112 and 122 may be arranged to control operations of the electronic devices 110 and 120 , respectively, and the force-sensitive components 114 and 124 may be arranged to perform force-related sensing for the electronic devices 110 and 120 (e.g. sensing pressure, such as the pressure that one of the electronic devices 110 and 120 applies to the other), respectively.
- the communications modules 116 and 126 may be arranged to perform wireless communications for the electronic devices 110 and 120 , respectively, in order to allow exchanging of information between the electronic devices 110 and 120 .
- the camera module 118 may capture one or more images for the electronic device 110 , such as at least one portion (e.g. one or more partial images) of the predetermined image displayed by the display module 128 of the electronic device 120 .
- the display module 128 may display the predetermined image for the electronic device 120 for further use of the electronic device 110 , and may also display images regarding interaction control (e.g. modified images such as images modified in response to interaction control) for the electronic device 120 , in order to improve the user experience.
- FIG. 2 is a diagram illustrating a working flow 200 of a method for performing interaction control according to an embodiment of the present invention, wherein the method may be applied to the interaction system 100 , the electronic devices 110 and 120 , the processing circuits 112 and 122 , and other components shown in FIG. 1 .
- the processing circuits 112 and 122 may control the respective operations of the electronic devices 110 and 120 according to the method.
- the first electronic device and second electronic device in the working flow 200 may be described using the electronic devices 110 and 120 respectively, and their own components (e.g. the first and the second force-sensitive components) may respectively be described using the respective components of the electronic devices 110 and 120 (e.g. the force-sensitive components 114 and 124 ).
- the interaction system 100 may utilize at least one of the force-sensitive component 114 of the electronic device 110 and the force-sensitive component 124 of the electronic device 120 (e.g. the force-sensitive components 114 and/or 124 ) to detect a contact event between the electronic devices 110 and 120 . More particularly, it may utilize the aforementioned at least one of the force-sensitive components 114 and 124 to sense the pressure between the electronic devices 110 and 120 , and, when at least one condition is satisfied, determine that the contact event occurs, wherein the aforementioned at least one condition may comprise: the pressure falls within a predetermined pressure range. The condition may further comprise: the period during which the pressure falls within the predetermined pressure range reaches a predetermined time threshold.
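The contact-event condition above (pressure within a predetermined range, sustained for a predetermined duration) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the function name, sampling model, and threshold values are all assumptions.

```python
# Hypothetical thresholds (the patent leaves the exact values open).
PRESSURE_MIN = 30.0   # g/cm^2, assumed lower bound of the predetermined range
PRESSURE_MAX = 80.0   # g/cm^2, assumed upper bound of the predetermined range
TIME_THRESHOLD = 3.0  # seconds the pressure must persist within the range

def detect_contact_event(samples, sample_period):
    """samples: pressure readings taken every sample_period seconds.
    Returns True once the pressure has stayed inside the predetermined
    range for at least TIME_THRESHOLD seconds, otherwise False."""
    elapsed = 0.0
    for pressure in samples:
        if PRESSURE_MIN <= pressure <= PRESSURE_MAX:
            elapsed += sample_period
            if elapsed >= TIME_THRESHOLD:
                return True
        else:
            elapsed = 0.0  # pressure left the range; restart the timer
    return False
```

A brief contact that leaves the pressure range resets the timer, which mirrors the "period reaches a predetermined time threshold" condition.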
- the interaction system 100 may utilize the display module 128 of the electronic device 120 to display at least one predetermined image (e.g. one or more predetermined images).
- the processing circuit 122 may control the display module 128 to display the aforementioned at least one predetermined image, for example, may store the aforementioned at least one predetermined image into a storage (not shown) in the electronic device 120 in advance, and may display the aforementioned at least one predetermined image (which is read from this storage) in Step 220 , but the present invention is not limited thereto.
- the interaction system 100 may utilize the camera module 118 of the electronic device 110 to capture at least one partial image of the aforementioned at least one predetermined image (e.g. one or more partial images).
- the interaction system 100 may utilize the electronic device 110 (particularly, the processing circuit 112 therein) to determine at least one relative location (e.g. one or more relative locations) of the electronic device 110 with respect to the electronic device 120 according to the aforementioned at least one partial image, for the electronic device 120 to use.
- the electronic device 110 may store the aforementioned at least one predetermined image into a storage (not shown) in the electronic device 110 in advance, and may determine the aforementioned at least one relative location according to the location of the aforementioned at least one partial image on the aforementioned at least one predetermined image (which is read from this storage) in Step 240 .
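One way to realise this step is to match the captured partial image against the stored predetermined image. The patent does not specify an algorithm, so the naive sum-of-absolute-differences matcher below is only an assumption for illustration.

```python
def locate_partial(full, part):
    """Slide the captured partial image `part` over the stored
    predetermined image `full` (both 2-D lists of grayscale values)
    and return the (row, col) offset with the smallest sum of
    absolute differences, i.e. the best-matching location."""
    fh, fw = len(full), len(full[0])
    ph, pw = len(part), len(part[0])
    best, best_pos = None, (0, 0)
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            sad = sum(abs(full[r + i][c + j] - part[i][j])
                      for i in range(ph) for j in range(pw))
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos
```

A production implementation would likely use a library routine (e.g. normalized cross-correlation) rather than this brute-force scan, but the principle is the same: the offset of the best match gives the relative location.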
- the aforementioned at least one partial image may carry location information for indicating the aforementioned at least one relative location based on a predetermined rule.
- the electronic device 110 may obtain the location information from the aforementioned at least one partial image based on the predetermined rule, in order to determine the aforementioned at least one relative location according to the location information.
- the interaction system 100 may utilize the display module 128 of the electronic device 120 to display the display content that corresponds to the aforementioned at least one relative location, such as the above-mentioned modified images.
- the processing circuit 122 may control the display module 128 to display the display content that corresponds to the aforementioned at least one relative location, such as corresponding auxiliary information, icons, symbols, etc.
- the processing circuit 122 may perform the operations in Step 250 in response to one or more actions of a user placing the electronic device 110 on the aforementioned at least one relative location.
- one or more steps may be added into, modified, or removed from the working flow 200 .
- some operation(s) of one of the electronic devices 110 and 120 may be performed by the other of the electronic devices 110 and 120 .
- FIG. 3 is a diagram illustrating implementation details of the interaction system 100 according to an embodiment of the present invention.
- the electronic devices 110 M and 120 T may be taken as examples of the electronic devices 110 and 120 , respectively.
- the electronic devices 110 M and 120 T may be implemented as a multifunctional mobile phone and an all-in-one personal computer respectively, and may have their own touch screens with built-in force-sensitive components 114 F and 124 respectively.
- the force-sensitive component 114 F, the camera 118 F (particularly, the front camera) and the touch screen of the electronic device 120 T may be taken as examples of the force-sensitive component 114 , the camera module 118 and the display module 128 respectively, but the present invention is not limited thereto.
- the force-sensitive components 114 F and 124 may be located above or under their corresponding touch screens, respectively.
- the force-sensitive components 114 F and 124 may be transparent.
- FIG. 4 is a diagram illustrating implementation details of the interaction system 100 according to another embodiment of the present invention.
- the force-sensitive component 114 B and the camera 118 B may be taken as examples of the force-sensitive component 114 and the camera module 118 respectively, but the present invention is not limited thereto. For brevity, similar descriptions for this embodiment are not repeated in detail here.
- the force-sensitive component 114 may comprise the force-sensitive components 114 F and 114 B
- the camera module 118 may comprise the cameras 118 F and 118 B.
- FIG. 5 is a diagram illustrating implementation details of the interaction system 100 according to yet another embodiment of the present invention.
- the electronic device 110 C may be taken as an example of the electronic device 110 .
- the electronic device 110 C may be implemented as a toy (e.g. an electronic chess set), wherein the force-sensitive component 114 C and the camera 118 C may be taken as examples of the force-sensitive component 114 and the camera module 118 respectively, and the force-sensitive component 114 C may be transparent, but the present invention is not limited thereto.
- similar descriptions for this embodiment are not repeated in detail here.
- FIG. 6 is a diagram illustrating implementation details of the interaction system 100 according to still another embodiment of the present invention.
- the electronic device 110 A may be taken as an example of the electronic device 110 .
- the electronic device 110 A may be implemented as a simple input gadget, wherein the force-sensitive component 114 A and the camera 118 A may be taken as examples of the force-sensitive component 114 and the camera module 118 respectively, and the force-sensitive component 114 A may be located around the camera 118 A (e.g. surrounding the camera 118 A), but the present invention is not limited thereto.
- similar descriptions for this embodiment are not repeated in detail here.
- FIG. 7 illustrates some examples of the predetermined image mentioned in the method mentioned above.
- the electronic device 110 A (such as a simple input gadget) is illustrated in FIG. 7 to indicate that different relative locations thereof may respectively correspond to different location information carried by the predetermined image.
- Example (a) comprises a series of predetermined images which may be arranged to sequentially display a downwardly-moving horizontal black stripe (depicted by a shaded pattern).
- the processing circuit 122 may control the display module 128 to start displaying the series of predetermined images, making the horizontal black stripe move downwards at a steady speed (e.g. a first speed). Since the display period of each predetermined image in the series of predetermined images and the total display period of the series of predetermined images are known, when detecting the horizontal black stripe through the camera 118 A, the electronic device 110 A (e.g. the processing circuit 112 therein) may determine a time period (such as the period between the time point when the occurrence of the contact event is confirmed and the current time point), and determine the Y coordinate value of the relative location of the electronic device 110 A according to the ratio of this time period to this total display period.
- Example (b) comprises another series of predetermined images which may be arranged to sequentially display a rightward-moving vertical black stripe (depicted by a shaded pattern).
- the processing circuit 122 may control the display module 128 to start displaying the other series of predetermined images, making the vertical black stripe move to the right-hand side at a steady speed (e.g. a second speed). Since the display period of each predetermined image of the other series of predetermined images and the total display period of the other series of predetermined images are already known, when detecting the vertical black stripe via the camera 118 A, the electronic device 110 A (e.g. the processing circuit 112 therein) may determine a time period (such as the period between the time point when the occurrence of displaying is confirmed and the current time point), and determine the X coordinate value of the relative location of the electronic device 110 A according to the ratio of this time period to this total display period.
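Examples (a) and (b) both decode one coordinate from the ratio of an elapsed time to the total display period of the stripe sweep. A minimal sketch of that time-ratio decoding follows; the function name and the mapping to pixels are assumptions, since the patent only states the ratio relationship.

```python
def coord_from_sweep(t_detect, t_start, total_period, axis_length):
    """Decode one coordinate from a stripe sweep: the fraction of the
    total display period elapsed when the camera detects the stripe
    gives the stripe's position along the swept axis (the Y coordinate
    for the downward sweep in Example (a), the X coordinate for the
    rightward sweep in Example (b))."""
    ratio = (t_detect - t_start) / total_period
    return ratio * axis_length
```

For instance, if the stripe is detected halfway through a 2-second sweep on a 1080-pixel axis, the decoded coordinate is 540 pixels.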
- Example (c) comprises a multi-area image which may be divided into multiple areas, wherein the distribution of the respective average brightness of the areas and the brightness distribution within each area of the areas are based on the predetermined rule.
- the respective average brightness of the areas may differ from one another, and may vary with the change of the area locations, and more particularly, may increase upwards and rightwards (e.g. from bottom to top and from left to right); and the brightness in each area of the areas may vary with the change of the locations, and more particularly, may decrease upwards and rightwards (e.g. from bottom to top and from left to right); but the present invention is not limited thereto.
- the predetermined rule may be modified, and the multi-area image will change correspondingly.
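The brightness-based encoding of Example (c) can be sketched as a lookup against a known ladder of average-brightness levels. The details below (evenly spaced levels, nearest-level matching) are assumptions for illustration, not the patent's exact predetermined rule.

```python
def build_brightness_ladder(n_areas, lo=20, hi=235):
    """Known average-brightness level assigned to each area, spaced
    evenly so that brightness increases monotonically with the area
    index (mirroring the bottom-to-top / left-to-right increase)."""
    step = (hi - lo) / (n_areas - 1)
    return [lo + i * step for i in range(n_areas)]

def area_from_brightness(measured, ladder):
    """Return the index of the area whose known level is closest to
    the average brightness measured by the camera-side device."""
    return min(range(len(ladder)), key=lambda i: abs(ladder[i] - measured))
```

Because each area has a distinct average brightness, the capturing device can identify which area it sits over from the partial image alone; the within-area gradient described above could then refine the position inside that area.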
- Example (d) comprises a multi-area image which may be divided into multiple sets of areas, and two adjacent sets of areas may be separated by a bold horizontal bar, wherein the respective colors of the multiple sets of areas (depicted by various types of patterns in FIG. 7 ), the distribution of the respective average brightness of multiple areas in each set of areas within the multiple sets of areas, and the brightness distribution within each area of the multiple areas are based on the predetermined rule.
- the respective colors of the multiple sets of areas may be different from one another; and the respective average brightness of the areas in each set of areas within the multiple sets of areas may be different from one another, and may vary with the change of area locations, and more particularly, may increase from left to right; but the present invention is not limited thereto.
- the predetermined rule may be modified, and the multi-area image will change correspondingly.
- the number of the multiple areas in Example (c) and the way of dividing or differentiating them may be modified.
- the set count of the multiple sets of areas in Example (d) and the way of dividing or differentiating them, and/or the number of the multiple areas in Example (d) and the way of dividing or differentiating them may be modified.
- examples of the aforementioned at least one predetermined image may include, but are not limited to: one or a combination of various image characteristics (such as colors, shapes, sizes, distributions, etc.), and actual images (such as photos of landscapes).
- FIG. 8 illustrates a working flow 400 of the method according to an embodiment of the present invention.
- the user may place the electronic device 110 on the electronic device 120 , resulting in pressure between the two electronic devices, wherein the electronic devices 110 and 120 may be regarded as an upper device and a lower device, respectively.
- the electronic devices 110 and 120 (such as the upper device and the lower device) may respectively sense the pressure via the force-sensitive components 114 and 124 mentioned in Steps S 11 -S 14 , in order to identify each other, and may utilize the aforementioned at least one predetermined image to perform positioning in Steps S 15 -S 18 , in order to determine the aforementioned at least one relative location for further usage in Step S 19 .
- In Step S 11 , the electronic devices 110 and 120 may turn on the force-sensitive components 114 and 124 , respectively.
- In Step S 12 , the electronic devices 110 and 120 may utilize the force-sensitive components 114 and 124 to sense pressure, respectively.
- At least one of the electronic devices 110 and 120 may determine whether there is an up-down relationship between the electronic devices 110 and 120 according to the level and the duration of pressure, in order to obtain the device attributes.
- Each of the processing circuits 112 and 122 may be arranged to determine that the aforementioned at least one condition is satisfied. More specifically, the aforementioned at least one condition may comprise: the pressure falls within the predetermined pressure range (e.g. 50 gram/square centimeter, or any of other predetermined values); and the period when the pressure falls within the predetermined pressure range reaches the predetermined time threshold (e.g. 3 seconds, or any of other predetermined values).
- the processing circuit 112 may inform the processing circuit 122 that the aforementioned at least one condition is satisfied, and may transmit the device attributes of the electronic device 110 .
- the processing circuit 122 may receive the device attributes in order to confirm the occurrence of the contact event.
- the electronic device 120 may define which one of the electronic devices 110 and 120 is an upper device, and define the other as a lower device.
- In this case, the electronic device 110 is the upper device, and the electronic device 120 is the lower device.
- the lower device (i.e. the electronic device 120 in this case) may display the aforementioned at least one predetermined image, and the upper device (i.e. the electronic device 110 in this case) may capture the partial image(s) thereof via its camera module.
- the upper device may calculate the space information of the upper device according to the partial image(s), such as the aforementioned at least one relative location.
- the upper device may output the space information of the upper device to the lower device.
- the lower device may display a corresponding user interface (e.g. the display content corresponding to the aforementioned at least one relative location, such as corresponding auxiliary information, icons, symbols, etc.) on the surrounding of the upper device.
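Given the relative location reported by the upper device, the lower device can place UI elements around that point. The geometry below (anchor points on a circle, clamped to the screen) is a hypothetical sketch of this surrounding-UI step, not the patent's layout rule.

```python
import math

def ui_anchor_points(x, y, radius, screen_w, screen_h, n_items=4):
    """Spread n_items anchor points evenly on a circle of the given
    radius around the upper device's reported location (x, y),
    clamping each point so it stays on the lower device's screen."""
    points = []
    for i in range(n_items):
        angle = 2 * math.pi * i / n_items
        px = min(max(x + radius * math.cos(angle), 0), screen_w)
        py = min(max(y + radius * math.sin(angle), 0), screen_h)
        points.append((px, py))
    return points
```

Icons, auxiliary information, or symbols could then be drawn at these anchor points so the user interface appears around the physical footprint of the upper device.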
- one or more steps may be added to, modified, or removed from the working flow 400 .
- the inter-device relationship between the electronic devices 110 and 120 may comprise the predetermined pressure range and the predetermined time threshold.
- the electronic devices 110 and 120 may connect to at least one network (e.g. the Internet) in advance, and may obtain the inter-device relationship according to a history record, such as the inter-device relationship created for executing any of the working flows 200 and 400 , wherein the inter-device relationship may be uploaded to a database beforehand, for the electronic devices 110 and 120 to download its data when needed.
- the electronic devices 110 and 120 may confirm the occurrence of the contact event and/or define which one amongst them is the upper device, and which one amongst them is the lower device.
- the inter-device relationship may further comprise the respective device attributes of the electronic devices 110 and 120 , such as size, weight, product model number, etc.
- the inter-device relationship may be determined with the aid of other information, such as the user account, relationships between friends (e.g. friends on a social network), and the wireless connection state (e.g. whether wireless signals from the peer device are detectable).
- the electronic device 110 may perform image analysis according to the aforementioned at least one partial image, in order to determine the relative height (i.e. the Z coordinate value), tilt angle, rotation angle, etc. of the electronic device 110 with respect to the electronic device 120 , to enable the electronic device 120 to correspondingly modify the user interface displayed by the display module 128 .
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Description
- The present invention relates to control of user interfaces, and more particularly, to an electronic device applicable to interaction control.
- Interaction mechanisms have been applied in electronic devices such as multifunctional mobile phones. Interaction mechanisms provide typical interactions between human and machine, rather than interactions between machines. Some related arts have attempted to achieve interactions between machines, but problems such as high manufacturing cost and excess space taken up by the overall structure occur as a result. Hence, there is a need for a novel method and related structure which can achieve interaction mechanisms between devices without introducing a side effect, or in a way that less likely to introduce a side effect.
- An objective of the present invention is to provide an electronic device applicable to interaction control, in order to solve the above-addressed problem.
- Another objective of the present invention is to provide an electronic device applicable to interaction control, in order to achieve an interaction mechanism between devices without introducing a side effect, or in a way that less likely to introduce a side effect.
- At least one embodiment of the present invention provides a first electronic device, applicable to interact with a second electronic device. The first electronic device comprises a first force-sensitive component, a camera module and a processing circuit. The first force-sensitive component is arranged to detect a contact event between the first electronic device and the second electronic device, and the camera module is arranged to capture at least one partial image of at least one predetermined image, wherein a display module of the second electronic device displays said at least one predetermined image. The processing circuit is coupled to the first force-sensitive component as well as the camera module. The processing circuit determines at least one relative location of the first electronic device with respect to the second electronic device according to said at least one partial image for the second electronic device to use, and the display module of the second electronic device displays a display content corresponding to said at least one relative location.
- At least one embodiment of the present invention provides a second electronic device applicable to interact with a first electronic device. The second electronic device comprises a second force-sensitive component, a display module and a processing circuit. The second force-sensitive component is arranged to detect a contact event between the first electronic device and the second electronic device. The display module is arranged to display at least one predetermined image, wherein a camera module of the first electronic device captures at least one partial image of said at least one predetermined image, to allow the first electronic device to determine at least one relative location of the first electronic device with respect to the second electronic device for the second electronic device to use. The processing circuit is coupled to the second force-sensitive component as well as the display module. The processing circuit controls the display module to display said at least one predetermined image, and controls the display module to display a display content corresponding to said at least one relative location.
- The present invention is capable of performing proper control of operations of the electronic devices, and more particularly, is capable of achieving an interaction mechanism between devices in limited space to avoid various problems presented in the related arts. In addition, the embodiments illustrated by the present invention may solve the related art problems without introducing significant additional costs.
- These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
- FIG. 1 is a diagram illustrating an interaction system according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating a working flow of a method for performing interaction control according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating implementation details of the interaction system according to an embodiment of the present invention.
- FIG. 4 is a diagram illustrating implementation details of the interaction system according to another embodiment of the present invention.
- FIG. 5 is a diagram illustrating implementation details of the interaction system according to yet another embodiment of the present invention.
- FIG. 6 is a diagram illustrating implementation details of the interaction system according to still another embodiment of the present invention.
- FIG. 7 illustrates some examples of predetermined images in the method.
- FIG. 8 illustrates a working flow of the method according to an embodiment of the present invention.
FIG. 1 is a diagram illustrating an interaction system 100 according to an embodiment of the present invention. The interaction system 100 may comprise electronic devices 110 and 120. Examples of the electronic device 110 may include, but are not limited to: a tablet, a multifunctional mobile phone, a wearable device, a simple input gadget (e.g. a gadget allowing for simple input commands), a toy, etc. Examples of the electronic device 120 may include, but are not limited to: an interaction table, an all-in-one personal computer (PC), a laptop computer, a tablet, a multifunctional mobile phone, a large-sized toy, etc., wherein the size of the electronic device 120 is usually larger than that of the electronic device 110 (for example, 8 inches or above), but the present invention is not limited thereto. - As shown in
FIG. 1, the electronic device 110 may comprise a processing circuit 112 (e.g. a processor, microprocessor, microcontroller, etc.), at least one force-sensitive component 114 (e.g. one or more force-sensitive components, such as force sensors, pressure sensors, etc.), a communications module 116 (particularly, a wireless communications module conforming to a predetermined communications standard such as Bluetooth or Wi-Fi) and a camera module 118. The electronic device 120 may comprise a processing circuit 122 (e.g. a processor, microprocessor, microcontroller, etc.), at least one force-sensitive component 124 (e.g. one or more force-sensitive components, such as force sensors, pressure sensors, etc.), a communications module 126 (particularly, a wireless communications module conforming to the predetermined communications standard such as Bluetooth or Wi-Fi) and a display module 128, but the present invention is not limited thereto. As long as the result is substantially the same, the structure of the interaction system 100 of the present invention is modifiable. For example, the quantity, type, arrangement, etc. of at least one of the electronic devices 110 and 120 (e.g. the electronic device 110 and/or the electronic device 120) can be modified. In another example, some components in any of the electronic devices 110 and 120 can be modified. - According to this embodiment, the
processing circuits 112 and 122 may control operations of the electronic devices 110 and 120, respectively. The force-sensitive components 114 and 124 may perform force sensing for the electronic devices 110 and 120 (e.g. sensing pressure, such as the pressure that one of the electronic devices 110 and 120 applies to the other), and the communications modules 116 and 126 may perform communications operations for the electronic devices 110 and 120 (e.g. communications between the electronic devices 110 and 120). Under the control of the processing circuit 112, the camera module 118 may capture one or more images for the electronic device 110, such as at least one portion (e.g. a portion or all) of a predetermined image displayed by the display module 128, for use of interaction control. Under the control of the processing circuit 122, the display module 128 may display the predetermined image for the electronic device 120 for further use of the electronic device 110, and may also display images regarding interaction control (e.g. modified images such as images modified in response to interaction control) for the electronic device 120, in order to improve the user experience. -
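As a concrete illustration of the pressure-based contact detection described above, the following sketch confirms a contact event when both force-sensitive components report pressure at nearly the same time. This is an illustrative sketch only; the threshold, time window, reading format, and function name are assumptions, not values or interfaces from the patent.

```python
# Illustrative sketch of pressure-based contact detection between two
# stacked devices. Threshold and field names are assumed, not from the patent.
FORCE_THRESHOLD = 0.5   # assumed minimum force reading indicating contact
TIME_WINDOW = 0.2       # assumed maximum gap (s) between the two readings

def contact_event_detected(upper_reading, lower_reading):
    """Confirm a contact event when both force-sensitive components
    report a force above the threshold at nearly the same time."""
    both_pressed = (upper_reading["force"] >= FORCE_THRESHOLD and
                    lower_reading["force"] >= FORCE_THRESHOLD)
    concurrent = abs(upper_reading["t"] - lower_reading["t"]) <= TIME_WINDOW
    return both_pressed and concurrent

# A device placed on the table at about t = 1.0 s triggers the event:
print(contact_event_detected({"force": 0.8, "t": 1.00},
                             {"force": 0.9, "t": 1.05}))  # True
```

Requiring both readings, rather than either one alone, mirrors the idea that the contact event involves both the upper and the lower device.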
FIG. 2 is a diagram illustrating a working flow 200 of a method for performing interaction control according to an embodiment of the present invention, wherein the method may be applied to the interaction system 100, the electronic devices 110 and 120, and the processing circuits 112 and 122 shown in FIG. 1. The processing circuits 112 and 122 may control the electronic devices 110 and 120 to perform the interaction control. For better comprehension, the working flow 200 may be described with the aid of components of the electronic devices 110 and 120 (e.g. the force-sensitive components 114 and 124). - In
Step 210, the interaction system 100 (for example, at least one of the electronic devices 110 and 120, such as the electronic devices 110 and/or 120; for another example, at least one of the processing circuits 112 and 122, such as the processing circuits 112 and/or 122) may utilize at least one component within the force-sensitive component 114 of the electronic device 110 and the force-sensitive component 124 of the electronic device 120 (e.g. the force-sensitive components 114 and/or 124) to detect a contact event between the electronic devices 110 and 120. For example, the force-sensitive components 114 and 124 may respectively perform force sensing for the electronic devices 110 and 120. - In
Step 220, the interaction system 100 may utilize the display module 128 of the electronic device 120 to display at least one predetermined image (e.g. one or more predetermined images). The processing circuit 122 may control the display module 128 to display the aforementioned at least one predetermined image; for example, it may store the aforementioned at least one predetermined image into a storage (not shown) in the electronic device 120 in advance, and may display the aforementioned at least one predetermined image (which is read from this storage) in Step 220, but the present invention is not limited thereto. - In
Step 230, the interaction system 100 may utilize the camera module 118 of the electronic device 110 to capture at least one partial image of the aforementioned at least one predetermined image (e.g. one or more partial images). - In Step 240, the
interaction system 100 may utilize the electronic device 110 (particularly, the processing circuit 112 therein) to determine at least one relative location (e.g. one or more relative locations) of the electronic device 110 with respect to the electronic device 120 according to the aforementioned at least one partial image, for the electronic device 120 to use. For example, the electronic device 110 may store the aforementioned at least one predetermined image into a storage (not shown) in the electronic device 110 in advance, and may determine the aforementioned at least one relative location according to the location of the aforementioned at least one partial image on the aforementioned at least one predetermined image (which is read from this storage) in Step 240. In another example, based on a predetermined rule, the aforementioned at least one partial image carries location information for indicating the aforementioned at least one relative location, and the electronic device 110 may obtain the location information from the aforementioned at least one partial image based on the predetermined rule, in order to determine the aforementioned at least one relative location according to the location information. - In
Step 250, the interaction system 100 may utilize the display module 128 of the electronic device 120 to display the display content that corresponds to the aforementioned at least one relative location, such as the above-mentioned modified images. The processing circuit 122 may control the display module 128 to display the display content that corresponds to the aforementioned at least one relative location, such as corresponding auxiliary information, icons, symbols, etc. For example, the processing circuit 122 may perform the operations in Step 250 in response to one or more actions of a user placing the electronic device 110 on the aforementioned at least one relative location. - According to some embodiments, one or more steps may be added into, modified, or removed from the working
flow 200. According to some embodiments, as long as the implementation of the present invention is not hindered, some operation(s) of one of the electronic devices 110 and 120 may be performed by the other one of the electronic devices 110 and 120. -
FIG. 3 is a diagram illustrating implementation details of the interaction system 100 according to an embodiment of the present invention. The electronic devices shown in FIG. 3 may be taken as examples of the electronic devices 110 and 120, respectively. For better understanding, the force-sensitive component 114F, the camera 118F (particularly, the front camera) and the touch screen of the electronic device 120T may be taken as examples of the force-sensitive component 114, the camera module 118 and the display module 128, respectively, but the present invention is not limited thereto. In some embodiments, the implementation of the force-sensitive components 114 and 124 may vary. -
FIG. 4 is a diagram illustrating implementation details of the interaction system 100 according to another embodiment of the present invention. The force-sensitive component 114B and the camera 118B (particularly, the back camera) may be taken as examples of the force-sensitive component 114 and the camera module 118, respectively, but the present invention is not limited thereto. For brevity, similar descriptions for this embodiment are not repeated in detail here. - According to some embodiments, the force-sensitive component 114 may comprise the force-sensitive components 114F and 114B, and the camera module 118 may comprise the cameras 118F and 118B. -
FIG. 5 is a diagram illustrating implementation details of the interaction system 100 according to yet another embodiment of the present invention. The electronic device 110C may be taken as an example of the electronic device 110. For better understanding, the electronic device 110C may be implemented as a toy (e.g. an electronic chess set), wherein the force-sensitive component 114C and the camera 118C may be taken as examples of the force-sensitive component 114 and the camera module 118, respectively, and the force-sensitive component 114C may be transparent, but the present invention is not limited thereto. For brevity, similar descriptions for this embodiment are not repeated in detail here. -
FIG. 6 is a diagram illustrating implementation details of the interaction system 100 according to still another embodiment of the present invention. The electronic device 110A may be taken as an example of the electronic device 110. For better understanding, the electronic device 110A may be implemented as a simple input gadget, wherein the force-sensitive component 114A and the camera 118A may be taken as examples of the force-sensitive component 114 and the camera module 118, respectively, and the force-sensitive component 114A may be located around the camera 118A (e.g. surrounding the camera 118A), but the present invention is not limited thereto. For brevity, similar descriptions for this embodiment are not repeated in detail here. -
FIG. 7 illustrates some examples of the predetermined image mentioned in the method described above. For better understanding, the electronic device 110A (such as a simple input gadget) is illustrated in FIG. 7, to indicate that different relative locations thereof may respectively correspond to different location information carried by the predetermined image. - Example (a) comprises a series of predetermined images which may be arranged to sequentially display a downwardly-moving horizontal black stripe (depicted by a shaded pattern). When the occurrence of the contact event is confirmed or determined, the
processing circuit 122 may control the display module 128 to start displaying the series of predetermined images, making the horizontal black stripe move downwards at a steady speed (e.g. a first speed). Since the display period of each predetermined image in the series of predetermined images and the total display period of the series of predetermined images are known, when detecting the horizontal black stripe through the camera 118A, the electronic device 110A (e.g. the processing circuit 112) may determine a time period (such as the period between the time point when the occurrence of the contact event is confirmed and the current time point), and determine the Y coordinate value of the relative location of the electronic device 110A according to the ratio of this time period to this total display period. - Example (b) comprises another series of predetermined images which may be arranged to sequentially display a rightward-moving vertical black stripe (depicted by a shaded pattern). After the display of the series of predetermined images is completed, the
processing circuit 122 may control the display module 128 to start displaying the other series of predetermined images, making the vertical black stripe move to the right-hand side at a steady speed (e.g. a second speed). Since the display period of each predetermined image of the other series of predetermined images and the total display period of the other series of predetermined images are already known, when detecting the vertical black stripe via the camera 118A, the electronic device 110A (e.g. the processing circuit 112) may determine a time period (such as the period between the time point when the start of this display is confirmed and the current time point), and determine the X coordinate value of the relative location of the electronic device 110A according to the ratio of this time period to this total display period. - Example (c) comprises a multi-area image which may be divided into multiple areas, wherein the distribution of the respective average brightness of the areas and the brightness distribution within each area of the areas are based on the predetermined rule. For example, the respective average brightness of the areas may differ from one another, and may vary with the change of the area locations, and more particularly, may increase upward and rightward (e.g. from bottom to top and from left to right); and the brightness in each area of the areas may vary with the change of the locations, and more particularly, may decrease upward and rightward (e.g. from bottom to top and from left to right); but the present invention is not limited thereto. In some examples, the predetermined rule may be modified, and the multi-area image will change correspondingly.
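Examples (a) and (b) amount to encoding each coordinate in time: because the stripe sweeps the display at a known steady speed over a known total period, the elapsed time until the stripe passes under the camera reveals the coordinate. The following minimal sketch of that decoding uses illustrative names and values (the sweep duration and display extent are assumptions, not figures from the patent):

```python
def coordinate_from_sweep(t_detect, t_sweep_start, total_period, display_extent):
    """Map the elapsed time until the moving stripe is detected to a
    coordinate, using the known total sweep period. The same formula
    serves the Y sweep of Example (a) and the X sweep of Example (b)."""
    ratio = (t_detect - t_sweep_start) / total_period
    return ratio * display_extent

# Example (a): a stripe takes 2.0 s to sweep a 120 mm display and is
# detected 0.5 s after the sweep starts, giving a coordinate of 30 mm.
y = coordinate_from_sweep(t_detect=0.5, t_sweep_start=0.0,
                          total_period=2.0, display_extent=120.0)
print(y)  # 30.0
```

Running the Y sweep first and the X sweep afterwards, as in Examples (a) and (b), lets a single camera recover both coordinates from two timed detections.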
- Example (d) comprises a multi-area image which may be divided into multiple sets of areas, and two adjacent sets of areas may be separated by a bold horizontal bar, wherein the respective colors of the multiple sets of areas (depicted by various types of patterns in
FIG. 7), the distribution of the respective average brightness of multiple areas in each set of areas within the multiple sets of areas, and the brightness distribution within each area of the multiple areas are based on the predetermined rule. For example, the respective colors of the multiple sets of areas may be different from one another; and the respective average brightness of the areas in each set of areas within the multiple sets of areas may be different from one another, and may vary with the change of area locations, and more particularly, may increase from left to right; but the present invention is not limited thereto. In some examples, the predetermined rule may be modified, and the multi-area image will change correspondingly. - According to some embodiments, the number of the multiple areas in Example (c) and the way of dividing or differentiating them may be modified. In addition, according to some embodiments, the set count of the multiple sets of areas in Example (d), the number of the multiple areas therein, and the way of dividing or differentiating them may be modified.
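The idea behind Example (c) can be illustrated as follows: each area's average brightness acts as a code for the area's grid position, so a captured patch identifies its own area, and Example (d) extends the same idea with per-set colors. This is a toy sketch under an assumed grid size and an assumed linear brightness mapping; neither is specified by the patent.

```python
ROWS, COLS = 4, 4                 # assumed grid size, not from the patent
LEVELS = ROWS * COLS

def area_brightness(row, col):
    """Assign each area a distinct average brightness that increases
    from bottom-left to top-right (row 0 is the bottom row)."""
    return (row * COLS + col) / (LEVELS - 1)   # normalised 0..1

def decode_area(measured_brightness):
    """Recover (row, col) from a measured average brightness by
    inverting the linear mapping above."""
    index = round(measured_brightness * (LEVELS - 1))
    return divmod(index, COLS)

print(decode_area(area_brightness(2, 3)))  # (2, 3)
```

Because every area has a unique average brightness, one brightness measurement pins down the area; a finer within-area brightness gradient, as the text describes, could then refine the position inside that area.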
- According to some embodiments, examples of the aforementioned at least one predetermined image may include, but are not limited to: one or a combination of various image characteristics (such as colors, shapes, sizes, distributions, etc.), and actual images (such as photos of landscapes).
-
FIG. 8 illustrates a working flow 400 of the method according to an embodiment of the present invention. For better understanding, the user may place the electronic device 110 on the electronic device 120, resulting in pressure between the two electronic devices, wherein the electronic devices 110 and 120 (such as the upper device and the lower device) may respectively sense the pressure via the force-sensitive components 114 and 124. - In Step S11, the
electronic devices 110 and 120 may perform force sensing via the force-sensitive components 114 and 124, respectively. - In Step S12, the
electronic devices 110 and 120 may determine whether the sensing results of the force-sensitive components 114 and 124 satisfy at least one condition (e.g. whether the sensed pressure reaches a threshold). - In Step S13, at least one of the
electronic devices 110 and 120 (e.g. the processing circuits 112 and/or 122 thereof) may determine whether there is an up-down relationship between the electronic devices 110 and 120. For example, the processing circuits 112 and 122 may communicate with each other via the communications modules 116 and 126; the processing circuit 112 may inform the processing circuit 122 that the aforementioned at least one condition is satisfied, and may transmit the device attributes of the electronic device 110. The processing circuit 122 may receive the device attributes in order to confirm the occurrence of the contact event. - In Step S14, the electronic device 120 (e.g. the processing circuit 122) may define which one of the
electronic devices 110 and 120 is the upper device and which one is the lower device. In this case, the electronic device 110 is the upper device, and the electronic device 120 is the lower device. - In Step S15, the lower device (i.e. the
electronic device 120 in this case) may display predetermined image(s) (e.g. one or more predetermined images) for positioning, such as those shown in any of Examples (a)-(d). - In Step S16, the upper device (i.e. the
electronic device 110 in this case) may obtain partial image(s) (e.g. one or more partial images) from the predetermined image(s). - In Step S17, the upper device may calculate the space information of the upper device according to the partial image(s), such as the aforementioned at least one relative location.
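Once the partial image has been located inside the stored predetermined image, the calculation of Step S17 can be sketched as a pixel-to-display-coordinate conversion. The function name, the pixel density, and the camera-offset correction below are illustrative assumptions rather than the patent's implementation (the matching step itself is also omitted here):

```python
def space_information(match_px, px_per_mm, camera_offset_mm=(0.0, 0.0)):
    """Convert the pixel position at which the captured partial image
    matches the stored predetermined image into a location (in mm) on
    the lower device's display, optionally correcting for the camera's
    offset from the upper device's centre."""
    x_px, y_px = match_px
    return (x_px / px_per_mm + camera_offset_mm[0],
            y_px / px_per_mm + camera_offset_mm[1])

# A match found at pixel (400, 200) on a 10 px/mm display maps to the
# point 40 mm across and 20 mm down:
print(space_information((400, 200), px_per_mm=10.0))  # (40.0, 20.0)
```

The result is the "space information" handed to the lower device in Step S18, which then knows where on its own display the upper device sits.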
- In Step S18, the upper device may output the space information of the upper device to the lower device.
- In Step S19, the lower device may display a corresponding user interface (e.g. the display content corresponding to the aforementioned at least one relative location, such as corresponding auxiliary information, icons, symbols, etc.) in the area surrounding the upper device.
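The role decision of Steps S13 and S14 can be sketched as an attribute exchange followed by a comparison. The attribute used here (display size) matches the observation earlier that the electronic device 120 is usually the larger device, but the exact attributes and names below are illustrative assumptions:

```python
def assign_roles(dev_a, dev_b):
    """Decide which device acts as the upper device (captures images)
    and which as the lower device (displays the predetermined images).
    The device with the smaller display is assumed to be resting on top."""
    if dev_a["display_inches"] <= dev_b["display_inches"]:
        return dev_a["name"], dev_b["name"]   # (upper, lower)
    return dev_b["name"], dev_a["name"]

upper, lower = assign_roles({"name": "device 110", "display_inches": 6.0},
                            {"name": "device 120", "display_inches": 24.0})
print(upper, lower)  # device 110 device 120
```

After this decision, the flow proceeds asymmetrically: the lower device displays the positioning images (Step S15) while the upper device captures and analyzes them (Steps S16 and S17).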
- According to some embodiments, one or more steps may be added to, modified, or removed from the working
flow 400. - According to some embodiments, the inter-device relationship between the
electronic devices electronic devices electronic devices electronic devices electronic devices - According to some embodiments, the electronic device 110 (e.g. the processing circuit 112) may perform image analysis according to the aforementioned at least one partial image, in order to determine the relative height (i.e. the Z coordinate value), tilt angle, rotation angle, etc. of the
electronic device 110 with respect to the electronic device 120, to enable the electronic device 120 to correspondingly modify the user interface displayed by the display module 128. - Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Claims (10)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW107119683A | 2018-06-07 | ||
TW107119683 | 2018-06-07 | ||
TW107119683A TWI669636B (en) | 2018-06-07 | 2018-06-07 | Electronic device applicable to interaction control |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190377451A1 true US20190377451A1 (en) | 2019-12-12 |
US10664086B2 US10664086B2 (en) | 2020-05-26 |
Family
ID=68316745
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/158,324 Active US10664086B2 (en) | 2018-06-07 | 2018-10-12 | Electronic device applicable to interaction control |
Country Status (2)
Country | Link |
---|---|
US (1) | US10664086B2 (en) |
TW (1) | TWI669636B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080195735A1 (en) * | 2007-01-25 | 2008-08-14 | Microsoft Corporation | Motion Triggered Data Transfer |
US20140043209A1 (en) * | 2012-08-10 | 2014-02-13 | Research In Motion Limited | Stacked device position identification |
US20190107899A1 (en) * | 2017-10-05 | 2019-04-11 | Htc Corporation | Method for operating electronic device, electronic device and computer-readable recording medium thereof |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9075464B2 (en) * | 2013-01-30 | 2015-07-07 | Blackberry Limited | Stylus based object modification on a touch-sensitive display |
WO2015047360A1 (en) * | 2013-09-29 | 2015-04-02 | Rinand Solutions Llc | Force sensing compliant enclosure |
WO2015077018A1 (en) * | 2013-11-21 | 2015-05-28 | 3M Innovative Properties Company | Touch systems and methods employing force direction determination |
US9860451B2 (en) * | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
TWI545484B (en) * | 2015-06-17 | 2016-08-11 | 和碩聯合科技股份有限公司 | Force sensing electronic apparatus |
Also Published As
Publication number | Publication date |
---|---|
US10664086B2 (en) | 2020-05-26 |
TWI669636B (en) | 2019-08-21 |
TW202001497A (en) | 2020-01-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102491443B1 (en) | Display adaptation method and apparatus for application, device, and storage medium | |
CN106055327B (en) | Display method and electronic equipment | |
CN102736854B (en) | Communication terminal and the screen adjustment method based on this communication terminal | |
CN103649895B (en) | Adjustment method and terminal of icon display | |
EP4024186B1 (en) | Screenshot method and terminal device | |
CN108153503B (en) | Display control method and related product | |
CN103809741B (en) | Electronic device and method for determining depth of 3D object image in 3D environment image | |
CN104536661A (en) | Terminal screen shot method | |
TWI691889B (en) | Electronic device and method for displaying icons | |
CN107998657B (en) | A sorting processing method, device and computer-readable storage medium | |
US20120313976A1 (en) | Computer-readable storage medium having display control program stored therein, display control method, display control system, and display control apparatus | |
CN108121493A (en) | Display control method and related product | |
CN108427586A (en) | Using push terminal, method and the computer readable storage medium of theme | |
US10664086B2 (en) | Electronic device applicable to interaction control | |
CN103064672B (en) | A kind of 3D view method of adjustment and device | |
US10031589B2 (en) | Apparatuses, methods and computer programs for remote control | |
US10048834B2 (en) | Information processing method and electronic device | |
CN110618775B (en) | Electronic device for interactive control | |
CN104536564A (en) | Terminal | |
JP6092818B2 (en) | Image processing apparatus, image processing method, image processing program, and print order receiving apparatus | |
US9817555B2 (en) | Information processing device and information processing method | |
CN103377639A (en) | A terminal and a display control method | |
EP3756083B1 (en) | Electronic device and method of executing function thereof | |
KR20130115953A (en) | Method for displaying image and mobile terminal therfor | |
CN108021313A (en) | A kind of picture browsing method and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ACER INCORPORATED, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KO, CHUEH-PIN;REEL/FRAME:047144/0141 Effective date: 20180627 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |