WO2018005167A1 - Deleting items based on user interaction - Google Patents
Deleting items based on user interaction
- Publication number
- WO2018005167A1 (PCT/US2017/038353)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- items
- representation
- presentation
- modified
- determining
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F7/00—Methods or arrangements for processing data by operating upon the order or content of the data handled
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/16—File or folder operations, e.g. details of user interfaces specifically adapted to file systems
- G06F16/162—Delete operations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present application relates to the field of device technology.
- the present application relates to techniques for clearing files at an electronic device.
- Files may accumulate on a device (e.g., a smartphone) due to various reasons. For example, users may download files onto a device from a web browser and/or receive at the device files that are sent from other devices. As such, over time, a smartphone typically stores a large quantity of cache files and installation package files, in addition to other types of downloaded content. If these files (e.g., cache files and installation package files) are not promptly cleared, they will accumulate and consume a large portion of the overall storage space at the device.
- a user clears (e.g., permanently deletes) files by marking certain applications to be deleted using a clearing software, for example.
- the user may fail to get a sense of the progress and/or outcome of the file clearing process.
- FIG. 1 is a diagram showing an embodiment of a device at which items are deleted based on user interaction.
- FIG. 2 is a flow diagram showing an embodiment of a process for deleting a set of items at a device.
- FIGS. 3A through 3D show example display screens of a device at which user input operations comprising a user's swipes through at least some of a representation of a set of items cause at least a portion of the set of items to be deleted.
- FIG. 4 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- FIG. 5 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- FIG. 6 shows an example display screen that presents a size of items that were deleted.
- FIG. 7 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- FIG. 8 is an example of a modified presentation of a representation after a user sound operation has been detected.
- FIG. 9 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- FIG. 10 is a structural diagram of an embodiment of a display device at which items are deleted based on user interaction.
- FIG. 11 is a structural diagram of an embodiment of an electronic device at which items are deleted based on user interaction.
- the invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor.
- these implementations, or any other form that the invention may take, may be referred to as techniques.
- the order of the steps of disclosed processes may be altered within the scope of the invention.
- a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task.
- the term 'processor' refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
- a first piece of information could be called a second piece of information and, similarly, a second piece of information could be called a first piece of information, without departing from the scope of the present application.
- depending on the context, the word "if" as used herein could be interpreted as "when" or "upon being confirmed."
- an "item" comprises a file such as, for example, a cache file, an installation package file, or a residual file of unloaded software.
- cache files and installation package files include electronic files that are downloaded using the device's browser or downloaded by online or instant chat software.
- Examples of residual files of unloaded software comprise data files left over by unloaded software (e.g., software that is not currently executing).
- to "delete" a set of items comprises to mark the items for deletion (e.g., the items will no longer be accessible to the user and their locations in storage are marked to be reclaimed by new data) or to permanently remove the items from the device (e.g., their locations in storage are overwritten by new data such that the items are no longer stored at the device).
- a representation associated with the set of items is generated.
- the representation comprises data that can be presented as a visualization of the set of items.
- the representation comprises a multimedia file such as a thumbnail image associated with one or more of the set of items to be deleted.
- the representation associated with the set of items is presented at the device (e.g., at the touchscreen of the device).
- a user input operation associated with modifying the presentation of the representation associated with the set of items is detected at the device.
- the user input operation may comprise various different types of user input with respect to the device and different types of user input operations may be detected by different sensors that are part of the device.
- the greater the repetitions of and/or the greater the duration of a user input operation that is detected, the greater the extent to which the appearance of the representation will be modified in its presentation at the device. For example, the greater the repetitions of and/or the greater the duration of a user input operation, the more that the presentation of the representation will be reduced in size. How much of the set of items will be deleted is determined by the modified presentation of the representation associated with the set of items.
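To make the relationship above concrete, here is a minimal sketch in Python. It is illustrative only: the function names, the repetition/duration limits, and the linear mapping are assumptions, not details specified by the application.

```python
# Illustrative sketch: maps the extent of a detected user input operation to
# the fraction of the representation that is hidden, and sizes the deletion
# to match. All names and constants here are assumed for illustration.

def hidden_fraction(repetitions: int = 0, duration_s: float = 0.0,
                    reps_for_full_clear: int = 5,
                    duration_for_full_clear_s: float = 5.0) -> float:
    """Return a value in [0.0, 1.0]; 1.0 means the representation is fully hidden."""
    by_reps = repetitions / reps_for_full_clear
    by_duration = duration_s / duration_for_full_clear_s
    return min(max(by_reps, by_duration), 1.0)

def amount_to_delete_mb(total_mb: float, fraction: float) -> float:
    """How much of the set of items to delete, per the modified presentation."""
    return total_mb * fraction

# Three hand waves out of an assumed five for a full clear -> 60% hidden,
# so 60% of a 600MB set (360MB) would be deleted.
print(amount_to_delete_mb(600, hidden_fraction(repetitions=3)))  # 360.0
```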
- FIG. 1 is a diagram showing an embodiment of a device at which items are deleted based on user interaction.
- Device 100 includes touch sensor 102, sound sensor 104, light sensor 106, storage 108, and deletion engine 110.
- touch sensor 102, sound sensor 104, light sensor 106, and deletion engine 110 may be implemented using one or more of hardware and software.
- Device 100 may comprise a mobile device, smart phone, tablet device, laptop computer, desktop computer, and/or any other computing device.
- Deletion engine 110 is configured to determine a set of items to be deleted.
- the set of items comprises one or more files that have been selected by the user to be deleted and/or one or more files that have been stored at storage 108 for at least a predetermined period of time.
- Deletion engine 110 is configured to analyze the attributes and/or metadata associated with the set of items to determine a corresponding representation of the set of items.
- the representation may comprise a thumbnail image based on at least one image that is included in the set of items and/or a generic image that is not specific to any content included in the set of items.
- Deletion engine 110 is further configured to present the representation at the display (e.g., touchscreen) of device 100.
- the representation is presented in a manner that overlays any other content that was previously presented at the device (e.g., a previously executing application or the desktop of the device).
- the representation may also be presented with text or another form of instructions that invite a user to interact with device 100 in at least one particular way to modify the appearance of the representation that is presented. Examples of the presentation of the representation are described below in connection with FIG. 3A.
- the deletion engine can be implemented as a stand-alone application or as a part of the operating system.
- a user may perform an operation with respect to device 100 per the instructions presented with the representation that describe the particular type of user interaction to be detected. Different types of user input operations may be detected by different sensors that are included in device 100.
- a first type of user interaction is a user touch operation on the screen of device 100 and can be detected by touch sensor 102.
- touch sensor 102 comprises a touchscreen that is layered over an electronic visual display component of device 100 and is configured to capture user motions involving contact with the screen.
- a user input operation that is detectable by touch sensor 102 comprises a user's finger motion or user's use of a stylus to wipe/clear away at least a portion of the presentation of the representation at the screen of device 100.
- a portion of the presentation of the representation that is wiped/cleared away by a user's touch motion becomes hidden from view at the screen and thereby reduces the amount/size of the presentation of the representation that can be viewed at the screen.
- a second type of user interaction is a user inputting a sound operation (e.g., by blowing air) towards device 100 and can be detected by sound sensor 104.
- sound sensor 104 comprises a microphone and is configured to capture sounds.
- sound sensor 104 comprises an air flow sensor and is configured to detect air flow, derive airflow velocity, and calculate airflow volumes.
- a user input operation that is detectable by sound sensor 104 comprises blowing air into sound sensor 104, which may have an interface at the exterior of device 100 for ease of access. For example, the longer that the user blows air into sound sensor 104, the more of the presentation of the representation that is wiped/cleared away and becomes hidden from view at the screen. Therefore, the longer the duration of a user blowing air (or making another noise) into sound sensor 104, the less that the representation can be viewed at the screen.
- a third type of user interaction is a user making a gesture near device 100 and can be detected by light sensor 106.
- a user's gesture may also be captured by a motion sensor (not shown) in device 100.
- light sensor 106 is configured to capture changes in light intensity near device 100.
- a user input operation that is detectable by light sensor 106 comprises a user hand performing a gesture (e.g., a hand wave) near light sensor 106 of device 100.
- the more hand wave motions that are detected by light sensor 106 (e.g., as a function of the patterns of changes in light), the more of the presentation of the representation that is wiped/cleared away and becomes hidden from view at the screen. Therefore, the greater the number of hand wave motions that are detectable as changes in the light intensity near light sensor 106, the less that the representation can be viewed at the screen.
- a fourth type of user interaction is a cursor operation on the screen of device 100 and can be detected by a pointing device (not shown) that is connected to and/or a part of device 100.
- the pointing device comprises a computer mouse.
- a cursor operation that is detectable by the pointing device comprises a user using the cursor that appears at the display screen of device 100 to wipe/clear away at least a portion of the presentation of the representation at the screen of device 100.
- a portion of the presentation of the representation that is wiped/cleared away by a user's cursor operation becomes hidden from view at the screen and thereby reduces the amount/size of the presentation of the representation that can be viewed at the screen.
- Deletion engine 110 is configured to determine how much of the set of items to actually delete from storage 108 based on the amount, if any, of the presentation of the representation that is still viewable at the screen after the completion of the user input operation(s), as will be described in further detail below.
- FIG. 2 is a flow diagram showing an embodiment of a process for deleting a set of items at a device. In some embodiments, process 200 is implemented on a device such as device 100 of FIG. 1.
- a set of items is determined to be deleted from a device.
- a set of items to be deleted may be determined as a set of items that have been identified by a user to be deleted at the device. For example, the user may have selected one or more items to delete from a file clearing application that is executing at the device (e.g., a storage management application on the iPhone). In another example, the user may have selected one or more items to delete by inputting a preset multi-key combination for triggering file clearing at the device. In some embodiments, a set of items may be programmatically determined to be deleted by an application or operating system executing at the device.
- for example, if items consume more than a predetermined threshold percentage (e.g., 50%) of the storage space at the device, those items may be automatically determined to be deleted.
- a visual representation of the set of items is generated.
- the representation comprises a multimedia file such as a picture or video with or without audio effects.
- the representation is generated based on metadata and/or content associated with the items in the set to be deleted.
- the representation may be a thumbnail version of an image that is included in the set of items to be deleted.
- the attributes of the representation to be generated may include the shape of the representation displayed on the display screen, the size of the representation that is displayed at the display screen, the image(s) of the representation that are displayed on the display screen, and so on.
- the representation may be presented as a rectangular shape with image content.
- the size of the representation is one-tenth of the total area of the display screen of the device.
- the image of the representation may comprise a color distribution of specific colors.
- the display area of the representation on the display screen of the device may be proportional to the size of the set of items to be deleted. For example, when the items to be deleted are 10MB in size, the display area of the representation on the display screen is a x b, and when the items to be deleted are 20MB in size, the display area is a x 2b, where a is the width of the shape of the representation and b is the height of the shape of the representation.
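The proportional sizing in the example above can be expressed as a short sketch, assuming a fixed width a and a height b that scales linearly with the total size of the items (the pixel constants below are hypothetical):

```python
# Sketch of the a x b / a x 2b example above; pixel values are assumptions.
BASE_WIDTH_PX = 200        # a: width of the shape of the representation
BASE_HEIGHT_PX = 100       # b: height for a reference size of 10MB
REFERENCE_SIZE_MB = 10.0

def display_area_px(items_size_mb: float) -> tuple[int, int]:
    """Return (width, height) so that 10MB -> a x b and 20MB -> a x 2b."""
    height = int(BASE_HEIGHT_PX * (items_size_mb / REFERENCE_SIZE_MB))
    return (BASE_WIDTH_PX, height)

print(display_area_px(10))  # (200, 100) -> a x b
print(display_area_px(20))  # (200, 200) -> a x 2b
```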
- the representation (e.g., a multimedia file) is presented at the display screen of the device.
- the representation is presented as an overlay over an executing application at the device.
- the representation is presented as an overlay over a desktop display of icons associated with one or more applications that have been installed on the device.
- the representation may be presented at the display screen of the device in any way that is designed to capture the attention of the user.
- the representation is also presented with instructions (e.g., text and/or an image) that instruct a user to interact with the presented representation to cause at least some of the set of items to become deleted.
- the presented instructions may instruct the user to bring one or more fingers (or a stylus pen) to the display screen (which is also a touchscreen) and make swiping motions against the portion of the touchscreen that includes the presentation of the representation.
- the presented instructions may instruct the user to make a specified noise (e.g., a blowing noise) into a microphone or air flow sensor of the device.
- the presented instructions may instruct the user to make a gesture (e.g., a wave of the user's palm) near a light sensor of the device.
- a user input operation associated with modifying the presentation of the representation associated with the set of items is detected at the device.
- the greater the extent to which the user input operation is performed (e.g., the more finger swiping actions that were detected against the touchscreen, the longer the duration of the user's producing a noise into the microphone or air flow sensor, or the greater the number of repetitions of a hand motion near the device), the greater the extent to which the presentation of the representation will be modified.
- the greater the extent to which the user input operation is performed and detected by the device, the more of the area of the representation that becomes erased/cleared away or hidden from view at the display screen of the device.
- the greater the extent of the user input operation that is detected, the less of the area of the representation that remains visible at the display screen of the device, so the user can experience/see a reduction/removal in the representation of the set of items to be deleted that is proportional to the extent of their user input operation.
- At 210, at least a portion of the set of items is deleted based at least in part on the modified presentation of the representation associated with the set of items.
- How much area of the representation that remains to be presented at the display screen at the device correlates with the amount of the set of items that is to be deleted. In some embodiments, if the presented representation is entirely erased/cleared away, then the set of items is to be deleted entirely. In some embodiments, if a threshold amount (e.g., area) of the representation has been erased/cleared away, then the set of items is to be deleted entirely. In some embodiments, if less than a threshold amount (e.g., area) of the representation has been erased/cleared away, then the portion of the set of items that is to be deleted may be proportional to the erased/cleared away area of the representation.
- the amount of storage space that is freed as a result of the amount of items that are deleted is subsequently presented at the display screen for the device (e.g., so that the user can receive confirmation of the deletion as well as a sense of how much storage space was freed up as a consequence).
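The deletion decision at 210 can be sketched as a single function. The 90% full-delete threshold and the names below are assumptions; the application describes the cutoff only as "a threshold amount (e.g., area)":

```python
# Sketch of the decision at 210: delete everything once at least a threshold
# share of the representation is erased; otherwise delete proportionally.

def amount_to_delete_mb(total_mb: float, cleared_area: float,
                        original_area: float,
                        full_delete_threshold: float = 0.9) -> float:
    cleared_ratio = cleared_area / original_area
    if cleared_ratio >= full_delete_threshold:  # fully or nearly fully erased
        return total_mb                         # delete the entire set
    return total_mb * cleared_ratio             # proportional deletion

print(amount_to_delete_mb(600, cleared_area=0.95, original_area=1.0))  # 600
print(amount_to_delete_mb(600, cleared_area=0.50, original_area=1.0))  # 300.0
```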
- tying deletion to these interactions may also incentivize users to more regularly delete unused items off of the device.
- FIGS. 3A through 3D show example display screens of a device at which user input operations comprising a user's swipes through at least some of a representation of a set of items cause at least a portion of the set of items to be deleted.
- FIG. 3A is an example of a display screen of device 300 at which representation 302 of a set of items that is to be deleted is presented.
- Representation 302, in this example, comprises an image and is overlaid on some icons that are displayed at device 300.
- the size and appearance of representation 302 is determined based on metadata and/or content of the items to be deleted.
- FIG. 3B is a first example of a modified presentation of representation 302 after a user input operation has occurred with respect to representation 302.
- the user input operation that is detected by (e.g., a touchscreen of) device 300 is a user's finger swipe through representation 302.
- the presentation of representation 302 is updated to reflect the user's tracked finger swipe.
- FIG. 3C is a second example of a modified presentation of representation 302 after a subsequent user input operation has occurred with respect to representation 302.
- the user input operation that is detected by (e.g., a touchscreen of) device 300 is a user's subsequent finger swipe through representation 302.
- the result of the user's second finger swipe through a part of representation 302 below portion 306 of representation 302 is the removal from view of portion 312 of representation 302 that was affected by the user's second finger swipe, as if portion 312 was erased by the user's finger.
- the center curve of a user's touch operation with respect to the representation is determined. Referring to FIGS. 3B and 3C, the center curves 320 and 322 corresponding to the first and second user's touch operations are shown. The center curves 320 and 322 correspond to the user's touch operations on the touchscreen.
- in some embodiments, the portion of the representation affected by a touch operation comprises the area within a predetermined distance (e.g., measured in number of pixels, centimeters, etc.) of the center curve. Multiplying the predetermined distance by the length of the swipe will yield the approximate size/area of the affected area, in some embodiments.
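A sketch of this area estimate, assuming the center curve is available as a list of sampled touch points (the names and the stroke width are hypothetical):

```python
# Sketch: approximate the cleared area as (length of center curve) x
# (predetermined stroke width), per the description above.
import math

def swipe_affected_area_px2(center_curve: list[tuple[float, float]],
                            stroke_width_px: float = 40.0) -> float:
    """center_curve: sampled (x, y) touch points tracing the swipe."""
    length = sum(math.dist(p, q)
                 for p, q in zip(center_curve, center_curve[1:]))
    return length * stroke_width_px

# A straight 300px swipe with an assumed 40px-wide stroke clears ~12000 px^2.
print(swipe_affected_area_px2([(0, 0), (300, 0)]))  # 12000.0
```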
- affected portions of representation 302 are no longer displayed at the display screen of device 300.
- Additional user input operations comprising user's swiping motions across representation 302 may be received (not shown) and cause corresponding portions of representation 302 to be hidden from view at the display screen of device 300.
- the presentation of representation 302 may be presented as increasingly less sharp and more blurry until an image associated with representation 302 disappears altogether.
- the blurring effects can be achieved by applying a filter to the original image, substituting pixels in the original image, and/or other appropriate techniques, in some embodiments.
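As one possible realization of the blurring effect, the sketch below uses the Pillow imaging library (an assumption; the application names no library) to render an increasingly blurry copy of the representation's image per detected input operation:

```python
# Sketch: progressive blur as a filter applied to the original image.
from PIL import Image, ImageFilter

def blurred_frame(original: Image.Image, operations_detected: int,
                  radius_per_operation: float = 4.0) -> Image.Image:
    """Each detected input operation increases the Gaussian blur radius,
    so the representation grows less sharp until it effectively vanishes."""
    radius = operations_detected * radius_per_operation
    return original.filter(ImageFilter.GaussianBlur(radius=radius))

# representation = Image.open("representation.png")  # hypothetical asset
# frame = blurred_frame(representation, operations_detected=3)
```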
- if representation 302 included sounds, then as more user input operations are detected with respect to representation 302, the presentation of representation 302 may be presented as emitting quieter and quieter sounds until the sounds disappear altogether.
- the presentation of representation 302 may gradually disappear from top to bottom.
- depending on how much of representation 302 is cleared away in response to one or more user input operations, at least a portion of the set of items that is represented by representation 302 is deleted from device 300.
- to delete items refers to either marking the storage space associated with such items for reclamation or to immediately overwrite the storage space associated with such items with new data.
- the portion, if any, of representation 302 that remains in view at the display screen is analyzed to determine how much of the set of items to delete. In some embodiments, if all of representation 302 has been removed from view by the user input operations, then all of the set of items are to be deleted. In some embodiments, if less than all of representation 302 has been removed from view by the user input operations, then the remaining area of representation 302 is compared to the original area of representation 302 to determine how much of the set of items is to be deleted. For example, assume the total size of a set of items to be deleted is 600MB.
- FIG. 3D is an example of a display screen at which a message (314) indicating the size of the items that have been deleted as a result of the modification of the presentation of representation 302 by one or more user input operations is displayed.
- the remainder, if any, of representation 302 was checked to determine how much of the original total size of the set of items to delete. In the example of FIG. 3D, it was determined that 500MB of the original total size of the set of items (which was at least 500MB in size) was deleted.
- the original total size of the set of items may be presented at the display screen (e.g., "500MB of 600MB worth of files have been deleted").
- FIG. 4 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- process 400 is implemented at a device such as device 100 of FIG. 1.
- steps 208 and 210 of process 200 of FIG. 2 may be implemented, at least in part, using process 400.
- Process 400 describes an example of deleting a set of items based on user input operations comprising a user's touch operation (e.g., finger swiping motions) on a representation of a set of items that is presented at a display screen of a device.
- Process 400 describes deleting a portion of the set of items to be deleted based on the ratio of the representation that was cleared away/removed/modified by one or more user's touch operations.
- a user input operation comprising a touch operation with respect to modifying a presentation of a representation of a set of items is detected using a touch sensor.
- One or more user input operations comprising user's touch operations on the representation that is presented at the display screen of the device are detected.
- Each touch operation on the representation that is presented at the display screen clears/removes a corresponding portion of the representation from being displayed.
- a modified presentation of the representation is presented in response to the detected touch operation.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's touch operations is presented as being hidden from view at the display screen of the device. For example, the clearing away/removing/modifying of each portion of the representation by the touch operation may be presented as an animation in which the size of the presentation diminishes over several frames.
- by modifying the presentation of the representation in response to the user's input operation, the user may experience immediate feedback to their input operations.
- a portion of the presentation of the representation that has been modified by the touch operation is determined.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's touch operations is determined using the technique described above in connection with FIG. 3C. In some embodiments, the area of the portion of the representation that has been cleared away/removed/modified by the user's touch operations may be determined after it is determined that the user has finished performing user input operations.
- one technique by which to determine that the user has finished performing user input operations is the lack of detected user's touch operations at the touchscreen of the device for at least a predetermined length of time.
- a comparison of a first area associated with the portion of the presentation of the representation that has been modified by the touch operation to a second area associated with the presentation of the representation is determined.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's touch operations is compared to the original, total area of the representation (prior to being modified by user input operations) to determine a ratio or a percentage of the representation that has been cleared away/removed/modified.
- the ratio or percentage of the total size of the set of items that is deleted as a result of the detected user's touch operations is the same as the ratio or a percentage of the representation that has been cleared away/removed/modified by the user's touch operations.
- the deletion operation can be performed programmatically. For example, an operating system can be invoked to perform the deletion operation by either marking the storage space associated with the deleted files as "dirty" and therefore ready to be reclaimed/written over by new data or directly overwriting the storage space associated with the deleted files with new or default data.
- the device detects that the area of a representation that has been cleared from the display screen is c, and the total, original area of the representation is d. Therefore, the total quantity of the set of items to be deleted from the device can be determined according to the ratio of c to d. For example, assume that the ratio of c to d is one-third and the total quantity of the set of items to be deleted on the device is 600MB. Thus, 1/3 of 600MB (200MB) of the set of items to be deleted will be deleted.
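The c-to-d ratio computation from this example, as a one-function sketch (names assumed):

```python
# Sketch of the ratio-based quantity: cleared area c over original area d.
def quantity_to_delete_mb(total_mb: float, c: float, d: float) -> float:
    return total_mb * (c / d)

# c/d = 1/3 with a 600MB set -> 200MB deleted, matching the example above.
print(quantity_to_delete_mb(600, c=1.0, d=3.0))  # 200.0
```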
- FIG. 5 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- process 500 is implemented at a device such as device 100 of FIG. 1.
- steps 208 and 210 of process 200 of FIG. 2 may be implemented, at least in part, using process 500.
- Process 500 describes an example of deleting a set of items based on user input operations comprising a user's touch operation (e.g., finger motions) on a representation of a set of items that is presented at a display screen of a device.
- a user input operation comprising a touch operation with respect to a presentation of a representation of a set of items is detected using a touch sensor.
- Step 502 is similar to step 402 of process 400 of FIG. 4.
- a modified presentation of the representation is presented in response to the detected touch operation.
- Step 504 is similar to step 404 of process 400 of FIG. 4.
- At 506, a portion of the presentation of the representation that has been modified by the touch operation is determined. Step 506 is similar to step 406 of process 400 of FIG. 4.
- an area difference between a first area associated with the presentation of the representation and a second area associated with the portion of the presentation of the representation that has been modified by the touch operation is determined.
- the area of the remaining portion of the representation that is presented is determined as a difference between the overall area of the presentation of the representation and the portion of the presentation of the representation that has been modified by the touch operation. Put another way, the remaining area of the representation that was not cleared away /removed/modified by one or more user's touch operations is determined.
- At 510, it is determined whether the area difference is less than a threshold area. In the event that the area difference is less than the threshold area, control is transferred to 512. Otherwise, in the event that the area difference is not less than the threshold area, control is transferred to 514.
- At 512, the set of items is completely deleted. Because the remainder of the representation is less than a predetermined threshold area, it is assumed that the user had intended to delete the entire set of items even if he or she did not clear away/remove/modify the entire presentation of the representation on the display screen.
- the predetermined threshold area may be determined to be a relatively small percentage (e.g., 10%) of the total, original area of the representation. Therefore, it is possible to ensure that a set of items is entirely deleted from the device, which simplifies/reduces the interactions that need to be performed by the user.
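The area logic of process 500 (steps 510 through 516) can be sketched as a single decision function. The 10% remainder threshold comes from the example above; everything else (names, normalized areas) is assumed:

```python
# Sketch of process 500's area logic: if the remaining visible area falls
# below a small threshold of the original, delete the whole set; otherwise
# delete in proportion to the cleared (modified) area.

def process_500_delete_mb(total_mb: float, original_area: float,
                          modified_area: float,
                          remainder_threshold: float = 0.10) -> float:
    remaining_area = original_area - modified_area       # area difference
    if remaining_area < remainder_threshold * original_area:
        return total_mb                                  # complete deletion
    return total_mb * (modified_area / original_area)    # proportional

print(process_500_delete_mb(600, original_area=1.0, modified_area=0.95))  # 600
print(process_500_delete_mb(600, original_area=1.0, modified_area=0.50))  # 300.0
```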
- At 514, a comparison of the second area associated with the portion of the presentation of the representation to the first area associated with the presentation of the representation is determined.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's touch operations is compared to the original, total area of the representation (prior to being modified by user input operations) to determine a ratio or a percentage of the representation that has been cleared away/removed/modified.
- At 516, at least a portion of the set of items to be deleted is determined based at least in part on the comparison.
- the ratio or percentage of the total size of the set of items that is deleted as a result of the detected user's touch operations is the same as the ratio or percentage of the representation that has been cleared away/removed/modified by the user's touch operations.
- a size associated with the at least a portion of the set of items that is deleted is presented. If the entire set of items was deleted at step 512, then the total size of the set of items is presented as the amount of items that was deleted. For example, if the total size of the set of items to be deleted is 600MB and the entire set of items was determined to be deleted at step 510, then a message that indicates that 600MB worth of files were deleted is presented, as shown in FIG. 6. In FIG. 6, message 612 that indicates that 600MB worth of files have been deleted is presented at the display screen of device 600. If less than the entire set of items was deleted at step 516, then the portion of the total size of the set of items that was deleted based on the determined ratio is presented.
- process 500 may be similarly implemented for a user input operation that comprises a cursor operation that is detected by a pointing device.
- FIG. 7 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- process 700 is implemented at a device such as device 100 of FIG. 1.
- steps 208 and 210 of process 200 of FIG. 2 may be implemented, at least in part, using process 700.
- Process 700 describes an example of deleting a set of items based on user input operations comprising a user's sound input operation (e.g., producing a sound or blowing air into a microphone or air flow sensor) on a representation of a set of items that is presented at a display screen of a device.
- a user input operation comprising a sound input operation with respect to modifying a presentation of a representation of a set of items is detected using a sound sensor.
- the sound sensor comprises a microphone or an air flow sensor.
- the sound sensor has an interface near the exterior of the device.
- the sound sensor receives input via an earphone jack of the device.
- the sound input operation produced by the user may comprise a user's verbal speech, a user's blowing air into the interface of the sound sensor, and/or any other type of noise or sound that is produced by a user.
- a duration associated with the sound input operation is determined.
- a time duration associated with the detected sound input operation is measured.
- the sound input operation refers to a continuous sound input.
- the sound input operation refers to a series of sound inputs that are separated by brief pauses of silence.
- the longer the measured duration of the sound input operation, the more that the presentation of a representation of a set of items to be deleted is modified (cleared away/removed).
- an airflow sensor detects the airflow strength of a user's air blowing operation. When the detected airflow strength reaches a preset strength value, the duration of the air blowing operation is measured.
- a modified presentation of the representation is presented in response to the duration of the detected sound input operation.
- the threshold length of time is dynamically generated based on the total size of the set of items to be deleted. For example, a unit of time is associated with a unit of storage size, and accordingly, a set of items associated with a larger size will have a correspondingly longer threshold length of time and a set of items associated with a smaller size will have a correspondingly shorter threshold length of time. Any measured duration of the sound input operation that is less than the threshold length of time will result in a proportional portion of the presentation of the representation being cleared away/removed/modified at the display screen.
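The dynamically generated threshold can be sketched as below; the one-second-per-100MB mapping is purely an assumed unit-time-per-unit-size constant:

```python
# Sketch: threshold duration scales with the total size of the set of items,
# and shorter sound inputs clear a proportional share of the representation.

SECONDS_PER_100_MB = 1.0  # assumed unit of time per unit of storage size

def threshold_seconds(total_mb: float) -> float:
    return (total_mb / 100.0) * SECONDS_PER_100_MB

def fraction_cleared(duration_s: float, total_mb: float) -> float:
    """Durations at or above the threshold clear (and delete) everything;
    shorter durations clear a proportional share."""
    return min(duration_s / threshold_seconds(total_mb), 1.0)

print(threshold_seconds(600))      # 6.0 -> a 600MB set needs 6s of sound
print(fraction_cleared(3.0, 600))  # 0.5 -> half cleared, half deleted
```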
- it is determined whether the duration associated with the sound input operation is equal to or greater than a threshold length of time. In the event that the measured duration of the sound input operation is equal to or greater than the threshold length of time, control is transferred to 710. Otherwise, in the event that the measured duration of the sound input operation is less than the threshold length of time, control is transferred to 712.
- At 710, the set of items is completely deleted. Because the measured duration of the sound input operation is at least as long as the threshold length of time, it is assumed that the user had intended to delete the entire set of items.
- At 712, a comparison of the duration associated with the sound input operation to the threshold length of time is determined. In the event that the measured duration of time is less than the threshold length of time, a ratio between the measured duration of time and the threshold length of time is determined.
- At 714, at least a portion of the set of items is deleted based at least in part on the comparison. Both the portion of the presentation of the representation that is still being shown at the display screen and the portion of the set of items that is determined to be deleted are determined based on the ratio between the measured duration of time and the threshold length of time.
- a size associated with the at least a portion of the set of items that is deleted is presented.
- FIG. 8 is an example of a modified presentation of a representation after a user sound operation has been detected. In the example of FIG. 8, the top portion of the presentation of representation 802 has been cleared away in response to a detected user sound input operation.
- for example, assume that the threshold length of time of a user sound input operation is 5 milliseconds, i.e., the sound input operation needs to be at least 5 milliseconds long for representation 802 to be completely cleared away/hidden from view at the display screen of device 800. Then, using a process such as process 700 of FIG. 7, if a user input operation of only 1 millisecond were detected, then only one-fifth of representation 802 would be cleared away/hidden from view, as shown in the example of FIG. 8.
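A quick arithmetic check of the FIG. 8 numbers (values taken from the text above):

```python
# Worked check of the FIG. 8 example: a 1ms input against a 5ms threshold
# clears one-fifth of representation 802.
threshold_ms = 5   # duration needed to completely clear representation 802
detected_ms = 1    # duration of the detected sound input operation
print(detected_ms / threshold_ms)  # 0.2 -> one-fifth cleared away
```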
- FIG. 9 is a flow diagram showing an example of a process for detecting a user input operation and determining at least a portion of a set of items to delete.
- process 900 is implemented at a device such as device 100 of FIG. 1.
- steps 208 and 210 of process 200 of FIG. 2 may be implemented, at least in part, using process 900.
- Process 900 describes an example of deleting a set of items based on user input operations comprising a user's gesture operation (e.g., a wave of the user's hand that does not contact the display screen) on a representation of a set of items that is presented at a display screen of a device.
- a user input operation comprising a gesture operation is detected using a light intensity sensor.
- One or more user input operations comprising user's gesture operations are detected.
- Each gesture operation (e.g., within a predetermined distance from the display screen) clears away/removes a portion of the representation from being displayed.
- the user's gesture operation, which can be the user's hand wave, is detected by the light intensity sensor of the device based on the change in light intensity patterns.
- the more user hand wave motions that are detected by the device, the more of the representation that is cleared away/removed from being displayed.
- each detected hand wave motion corresponds to a predetermined amount (e.g., 20%) of the representation being cleared away/removed from being displayed.
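Under the example above, per-gesture clearing reduces to a simple counter; the 20% constant is the example's, the rest is assumed:

```python
# Sketch: each detected hand wave hides a fixed share of the representation.
CLEAR_PER_WAVE = 0.20  # 20% per detected wave, per the example above

def fraction_hidden_after_waves(wave_count: int) -> float:
    return min(wave_count * CLEAR_PER_WAVE, 1.0)

for waves in range(6):
    print(waves, round(fraction_hidden_after_waves(waves), 2))
# Five waves fully hide the representation, so the entire set is deleted.
```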
- a modified presentation of the representation is presented in response to the detected gesture operation.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's gesture operations is presented as being hidden from view at the display screen of the device. For example, the clearing away/removing/modifying of each portion of the representation by the gesture operation may be presented as an animation.
- the user may experience immediate feedback to their input operations.
- a portion of a presentation of a representation of a set of items that has been modified by the gesture operation is determined.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's gesture operations is determined.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's gesture operations may be determined after it is determined that the user has finished performing user input operations. For example, one technique by which to determine that the user has finished performing user input operations is the lack of detected user's gesture operations for at least a predetermined length of time.
- an area difference between a first area associated with the presentation of the representation and a second area associated with the portion of the presentation of the representation that has been modified by the gesture operation is determined.
- the area of the remaining portion of the representation that is presented is determined as a difference between the overall area of the presentation of the representation and the portion of the presentation of the representation that has been modified by the gesture operation. Put another way, the remaining area of the representation that was not cleared away/removed/modified by one or more user's gesture operations is determined.
- it is determined whether the area difference is less than a threshold area. In the event that the area difference is less than the threshold area, control is transferred to 912. Otherwise, in the event that the area difference is not less than the threshold area, control is transferred to 914.
- At 912, the set of items is completely deleted. Because the remainder of the representation is less than a predetermined threshold area, it is assumed that the user had intended to delete the entire set of items even if he or she did not clear away/remove/modify the entire presentation of the representation on the display screen.
- the predetermined threshold area may be determined to be a relatively small percentage (e.g., 10%) of the total, original area of the representation. Therefore, it is possible to ensure that a set of items is to be entirely deleted from the device, which simplifies/reduces the interactions that need to be performed by the user.
- At 914, a comparison of the second area associated with the portion of the presentation of the representation to the first area associated with the presentation of the representation is determined.
- the area of the portion of the representation that has been cleared away/removed/modified by the user's gesture operations is compared to the original, total area of the representation (prior to being modified by user input operations) to determine a ratio or a percentage of the representation that has been cleared away/removed/modified.
- At 916, at least a portion of the set of items is deleted based at least in part on the comparison.
- the ratio or percentage of the total size of the set of items that is deleted as a result of the detected user's gesture operations is the same as the ratio or percentage of the representation that has been cleared away/removed/modified by the user's gesture operations.
- a size associated with the at least a portion of the set of items that is deleted is presented. If the entire set of items was deleted at step 916, then the total size of the set of items is presented as the amount of items that was deleted. For example, if the total size of the set of items to be deleted is 600MB and the entire set of items was determined to be deleted at step 916, then a message that indicates that 600MB worth of files were deleted is presented. If less than the entire set of items was deleted at step 916, then the portion of the total size of the set of items that was deleted based on the determined ratio is presented.
- some embodiments described herein tightly link a user's gesture operations with the device to the process of deleting items from the device so that the user can better perceive the removal of such files.
- FIG. 10 is a structural diagram of an embodiment of a display device at which items are deleted based on user interaction.
- the device may comprise a processor, an internal bus, a network interface, memory, and non-volatile memory.
- the device may also comprise other hardware required for business services, which are not shown.
- the processor loads the corresponding computer program from the non-volatile memory into the internal memory and then runs it.
- the display component described above is implemented in the logical layer which includes the operating system, in some embodiments.
- the present application does not exclude other implementations in addition to a software implementation, e.g., a logic device (e.g., a programmable electronic component) or a combined software/hardware form.
- the entity that executes the process flow need not be limited to the various logical units; it may also be implemented by hardware or a logic device.
- FIG. 11 is a structural diagram of an embodiment of an electronic device at which items are deleted based on user interaction.
- the electronic device may comprise a processor, an internal bus, a network interface, memory, and non-volatile memory.
- the electronic device may also comprise other hardware required for business services, which are not shown.
- the processor loads the corresponding computer program from the non-volatile memory into the internal memory and then runs it.
- the deletion engine described above is implemented in the logical layer.
- the present application does not exclude other implementations in addition to a software implementation, e.g., a logic device or a combined software/hardware form.
- the entity that executes the process flow need not be limited to the various logical units. It may also be implemented by hardware or a logic device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17820935.9A EP3479216A4 (en) | 2016-06-29 | 2017-06-20 | DELETING ITEMS BASED ON USER INTERACTION |
JP2018562513A JP6718524B2 (en) | 2016-06-29 | 2017-06-20 | Delete items based on user interaction |
KR1020187034568A KR102198988B1 (en) | 2016-06-29 | 2017-06-20 | Deleting items based on user interaction |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610499292.7A CN107545010B (en) | 2016-06-29 | 2016-06-29 | Display method, file cleaning method and device, display equipment and electronic equipment |
CN201610499292.7 | 2016-06-29 | ||
US15/626,958 | 2017-06-19 | ||
US15/626,958 US10503694B2 (en) | 2016-06-29 | 2017-06-19 | Deleting items based on user interation |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018005167A1 (en) | 2018-01-04 |
Family
ID=60786621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2017/038353 WO2018005167A1 (en) | 2016-06-29 | 2017-06-20 | Deleting items based on user interaction |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018005167A1 (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6389433B1 (en) * | 1999-07-16 | 2002-05-14 | Microsoft Corporation | Method and system for automatically merging files into a single instance store |
US8914330B2 (en) * | 2004-09-17 | 2014-12-16 | International Business Machines Corporation | Bulk deletion through segmented files |
US8818971B1 (en) * | 2012-01-30 | 2014-08-26 | Google Inc. | Processing bulk deletions in distributed databases |
US20150134913A1 (en) * | 2013-11-14 | 2015-05-14 | Cheetah Mobile Inc. | Method and apparatus for cleaning files in a mobile terminal and associated mobile terminal |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108829327B (en) | Writing method and device for interactive smart device | |
US9135022B2 (en) | Cross window animation | |
CN101923425B (en) | Method and device for realizing window switching based on sliding terminal screen | |
US8413075B2 (en) | Gesture movies | |
JP6115965B2 (en) | Low-latency touch input device | |
CN103649875B (en) | Manage content through actions on context-based menus | |
CA2800108C (en) | Jump, checkmark, and strikethrough gestures | |
US9836313B2 (en) | Low-latency visual response to input via pre-generation of alternative graphical representations of application elements and input handling on a graphical processing unit | |
KR20120066122A (en) | Method and device for controlling touch screen using timeline bar, recording medium for program for the same, and user terminal having the same | |
WO2014178842A1 (en) | Generate preview of content | |
US20110185301A1 (en) | Providing sensory information based on detected events | |
US20130009991A1 (en) | Methods and systems for displaying interfaces | |
JP2016502200A (en) | Content manipulation using swipe gesture recognition technology | |
CN110413187B (en) | Method and device for processing annotations of interactive intelligent equipment | |
US10503694B2 (en) | Deleting items based on user interation | |
JP2025505836A (en) | Video display method, device and storage medium | |
CN107025100A (en) | Play method, interface rendering intent and device, the equipment of multi-medium data | |
CN107609433A (en) | Method for secret protection and electronic equipment | |
WO2018005167A1 (en) | Deleting items based on user interaction | |
CN113574500A (en) | Interface presentation on a display | |
CN114003320B (en) | Desktop implementation method, terminal equipment, device and storage medium | |
CN106231190A (en) | A kind of based on the double formation method opened of front camera and rear camera and terminal | |
KR20160019774A (en) | Method and apparatus of controlling display, and computer program for executing the method | |
CN114090092A (en) | Interaction method and device of electronic equipment | |
HK1249779B (en) | Display method, file cleaning method and apparatus, display device and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17820935 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20187034568 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2018562513 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017820935 Country of ref document: EP Effective date: 20190129 |