
GB2498041A - Displaying available targets for an object during a drag and drop operation - Google Patents


Info

Publication number
GB2498041A
GB2498041A GB1220229.7A GB201220229A
Authority
GB
United Kingdom
Prior art keywords
item
items
user
drop
list
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1220229.7A
Other versions
GB201220229D0 (en)
Inventor
Christopher John Evans
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
RARA MEDIA GROUP Ltd
Original Assignee
RARA MEDIA GROUP Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by RARA MEDIA GROUP Ltd filed Critical RARA MEDIA GROUP Ltd
Publication of GB201220229D0 publication Critical patent/GB201220229D0/en
Publication of GB2498041A publication Critical patent/GB2498041A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method involves presenting a user with available actions and options for an item/object that is dragged across a graphical user interface (GUI) during a drag and drop operation. The method comprises displaying the possible actions for the item as one or more drop targets when the drag operation begins, the drop targets being visually distinguishable from other areas of the screen or display; and allowing the user to select an option by dropping the dragged item onto one of the available drop targets. This method may be used for adding music tracks or albums to playlists, or for sharing and recommending media content to friends via e-mail or social networking sites. The available actions may be defined for each item, and a set of target objects may be defined for each action. The drop targets may be shown for a predetermined period, such as five seconds.

Description

A METHOD FOR IMPROVING THE UTILITY OF A USER INTERFACE
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to a method of improving the utility of a user interface shown on a screen of a computing device. One implementation represents an improvement to the drag and drop user interface idiom.
2. Description of the Prior Art
As computing systems and applications have grown in complexity, a number of different user interfaces have been utilised, each with different strengths and flaws. One problem which is common to all such legacy interfaces, however, is that they are missing a robust way to signal to the end user which actions are available to that user.
In particular, where the action to be undertaken consists of a noun-verb-noun operation, such as "This item... to be shared on a social network" or "This music track... to be added to my playlists", user interfaces are able to provide three basic approaches for carrying out such operations:
* Verb-based interfaces require the user to first choose the verb (e.g. the "Add to..." command) and then select which track(s) are to be added and which playlist those items should be added to.
* Object-based interfaces require the user to first choose the object (e.g. the playlist) and then select an "Add tracks" command before choosing which items to add to that playlist.
* Subject-based interfaces require the user to first choose the subject (e.g. items like music tracks) and then either (i) choose the "Add to..." command and then choose which playlist those items should be added to; or (ii) choose the playlist and then choose the "Add to..." command.
While there are situations where Verb- or Object-based user interfaces are appropriate, it is subject-based interfaces which are most utilised in the real world: users are used to selecting item(s) and then doing something with those items.
As computing devices transition more and more away from legacy keyboard-and-mouse interfaces and towards touchscreen technologies, subject-based user interfaces are becoming increasingly central to user interactions with those devices, though that UI idiom is of course also still used within mouse-based user interfaces.
However, the primary problem with pre-existing subject-based user interfaces is that such interfaces have historically lacked a mechanism to make it clear and obvious to the user which actions can be undertaken with the item(s) which the user has selected. This problem is particularly apparent in situations where the user has several possible actions which can be undertaken with his selected item(s).
Historically, attempts to signal the user's available actions have relied on subtle cues in the user interface, by:
* Automatically enabling or disabling user interface elements depending on which item(s) the user has selected. This approach may be seen with the "Edit" toolbar in Microsoft™ Word™, for example: if text is currently selected then the cut/copy icons are enabled, otherwise those toolbar icons are disabled.
* Allowing the user to drag-and-drop item(s) and having the graphic for the item being dragged dynamically change to indicate when it passes over an area where it may be dropped to perform some action (a "drop target").
The problems with those legacy approaches are readily apparent:
* The user can only see visual cues if they are visible. Where a particular icon (or any other form of, or region constituting, a drop target) becomes enabled, then for the user to notice that that action is available requires both that:
  o the drop target (e.g. icon) itself is visible at the time (i.e. it is not off screen or covered up by, for example, another window or dialog box); and
  o the change from disabled to enabled status of that drop target (e.g. icon) is readily noticeable and is noticed by the user.
* Where one or more "drop targets" is available, it is not necessarily clear to the end user what those drop targets are without requiring the user to drag their selected item(s) across every possible visible target.
In the commonplace circumstance of one or more drop targets not actually being visible, it becomes much more difficult for the user to identify that the associated action is actually available to him.
In cases where the desired action has no "drop target", such as the actions to view, edit, print or delete items, legacy user interfaces need either to provide additional "special" drop targets for those actions, such as the "recycle bin" in Windows™, the home page editor on Android™ systems or a printer icon, or not to permit such actions to be performed using drag-and-drop at all, which is commonly the case with view/open/play or edit commands. In the latter case, the user is required to use a different mode of interaction with the interface (select item then click button) and the use of drag-and-drop is simply not supported by legacy interfaces in that context.
Where a given target action is unavailable, the user is, in traditional interfaces, not provided with any particular cue that that is the case.
The present invention solves these historical problems by providing a mechanism to clearly display the user's range of possible actions as soon as the user starts to drag one or more items within the user interface.
BRIEF SUMMARY OF THE INVENTION
The present invention provides a mechanism by which the drag and drop user interface idiom can be used to carry out actions within the user interface of a computer program without requiring the end user to notice, or hunt for, subtle cues as to which actions are available.
When the user starts to drag an item, the user interface is automatically modified in such a manner that the valid drop targets for that item are made visually distinct on the screen, in comparison to those regions of the screen (e.g. icons, folders etc.) that are not valid drop targets.
Thus, the present invention allows the user to simply and rapidly drop that item onto one of those drop targets in order to execute the desired command, without having to first expend time and effort to locate and identify potential drop targets within the overall pre-existing user interface. Instead, the available drop targets for that item are specifically drawn to the user's attention, e.g. made visually prominent in some manner.
Definitions

For convenience, and to avoid needless repetition, the terms "music" and "media content" in this document are to be taken to encompass all "media content" which is in digital form or which it is possible to convert to digital form, including but not limited to books, magazines, newspapers and other periodicals, video in the form of digital video, motion pictures, television shows (as series, as seasons and as individual episodes), computer games and other interactive media, images (photographic or otherwise) and music.
Similarly, the term "track" indicates a specific item of media content, whether that be a song, a television show, an eBook or portion thereof, a computer game or any other discrete item of media content.
The terms "playlist", "timeline" and "album" are used interchangeably to indicate collections of "tracks" and/or interstitials which have been conjoined together such that they may be treated as a single entity for the purposes of analysis or recommendation.
The term "play queue" is used to refer to the set of items in a digital media player which, unless further modified, will automatically be played next by that media player in the order defined by that list in combination with the media player's settings.
The terms "digital media catalogue", "digital music catalogue", "media catalogue" and "catalogue" are used interchangeably to indicate a collection of tracks and/or albums to which a user may be allowed access for listening purposes. The digital media catalogue may aggregate both digital media files and their associated metadata or, in another example embodiment, the digital media and metadata may be delivered from multiple such catalogues. There is no implication that only one such catalogue exists, and the term encompasses access to multiple separate catalogues simultaneously, whether consecutively, concurrently or by aggregation. The actual catalogue utilised by any given operation may be fixed or may vary over time and/or according to the location or access rights of a particular device or end-user.
The abbreviation "DRM" is used to refer to a "Digital Rights Management" system or mechanism used to grant access rights to a digital media file.
The verbs "to listen", "to view" and "to play" are to be taken as encompassing any interaction between a human and media content, whether that be listening to audio content, watching video or image content, reading books or other textual content, playing a computer game, interacting with interactive media content or some combination of such activities.
The terms "user", "consumer", "end user" and "individual" are used interchangeably to refer to the person, or group of people making use of the facilities provided by the interface. In all cases, the masculine includes the feminine and vice versa.
The terms "device" and "media player" are used interchangeably to refer to any computational device which is capable of playing digital media content, including but not limited to MP3 players, television sets, home entertainment systems, home computer systems, mobile computing devices, games consoles, handheld games consoles, IVEs or other vehicular-based media players, or any other applicable device or software media player on such a device; in essence, anything capable of media playback.
The term "DSP" ("Digital Signal Processing") refers to any computational processing of digital media content in order to extract additional metadata from that content. Such calculated metadata may take a variety of forms, including deriving the tempo of a musical track or identifying one or more spots within the digital media file which are gauged to be representative of that content as a whole.
The terms "drag and drop" and "drag-and-drop" are used interchangeably to refer to the user interface idiom whereby one or more selected items may be dragged within the user interface to another position in that interface (the "drop target") in order to execute an action.
The terms "dragged item" and "item which is being dragged" are used interchangeably to refer to the item or items which are the subject of a drag and drop operation within a user interface, however defined.
The term "drop target" refers to any location within a user interface where releasing, or "dropping", an item which is being dragged will trigger the execution of an action.
The terms "target item", "target subject" and "subject" are used interchangeably to refer to the item which is the subject of the verb-subject action of a drop target which uses such a construction. For example, if the drop target is "Share on Facebook™" then the action's verb is "share" and the target item or "subject" is "Facebook".
The term "subject-based interface" refers to a user interface in which the user first chooses one or more "subject" items and then either (i) chooses the action to take and then, where applicable, the target of that action (such as choosing the command "Send to" and then the name of the friend to send the subject item(s) to); or (ii) chooses the target of the action and then, where applicable, the action to undertake (such as, in a drag-and-drop idiom, dragging the subject item(s) onto a particular friend's icon and then choosing the "Send to" command, implicitly or explicitly).
BRIEF DESCRIPTION OF THE FIGURES
The figures are wireframes that depict the layout or arrangement of a web-based application that implements the present invention; the wireframes show the main interface elements and navigational systems, and how they work together.
Figure 1: The user starting to drag an item (in this example, a music track).
Figure 2: An example of the presentation of available actions for the item being dragged.
Figure 3: The user dropping the item onto a drop target to perform an action (in this case, "Add item to Playlist 1").
Figure 4: An example of a consolidated list of actions displayed in a fixed location at the base of the display.
Figure 5: An example of two top-level sets of actions displayed in a fixed location at the right side of the display.
Figure 6: An illustration of scrolling as implemented in one example embodiment.
DETAILED DESCRIPTION OF THE INVENTION
The present invention provides a mechanism by which the drag and drop user interface idiom can be used to carry out actions within the user interface of a computer program without requiring the end user to notice, or hunt for, subtle cues as to which actions are available. Instead, drop targets declare themselves clearly and distinctly so that the user easily and quickly knows where he can drag and drop an item to execute an action.
In its preferred embodiment, a user interface which implements the present invention appears to the end user as the following steps:

1. The user selects one or more items using whatever selection mechanisms are provided by the pre-existing user interface.
2. The user starts to "drag" his selected item(s) using whatever signal for starting a "drag" operation is recognised by the pre-existing user interface, such as a long mouse press or a long finger tap in a touchscreen interface. This is illustrated in FIGURE 1. Figure 1 shows a screen display for a web-based music service, in which a music category (e.g. classical, rock, jazz etc.) with an appropriate icon is shown at 1, and various specific music tracks in that category are shown in a 3 x 3 grid of rectangular lozenges 2. The cover artwork for the various music tracks scheduled to be played is shown at 3. Conventional music control icons (back, play, forward) and a timeline are shown at 4. The user has selected a specific track 5 and is starting to drag it.
3. The user interface is reconfigured to display the possible actions available to the user with respect to the selected item(s). This is illustrated in FIGURE 2, which shows two large panels overlying the top and bottom sections of the screen; these panels present to the user in a very clear and distinct manner the drop targets available for the music track item that is being dragged. The top panel 20, headed 'Add to', includes a 4 x 2 grid of lozenges with various options, such as 'Queue', 'Song radio', 'Playlist A' and 'Playlist C' on the top row. The user can drag and drop the music track to the 'Queue' drop target lozenge to initiate adding that music track to the queue of tracks to be played. A scroll arrow 21 indicates further options (not shown) to the right. The lower panel 22, headed 'Share to', includes a 4 x 2 grid of lozenges with various options, such as 'Social Networking Service 1', 'Social Networking Service n', 'Email' etc. The user can drag and drop the music track to the 'Email' lozenge to initiate the action of opening an email client, with the music track automatically added as an attachment.
4. The user "drops" the selected item(s) onto one of the drop targets provided, thereby initiating his desired action. This is illustrated in FIGURE 3, which shows the user dropping the music track onto an action 'Playlist 1' that the user has previously scrolled to in the upper 'Add to' panel. That initiates the action of adding that music track to the user's Playlist 1.

As can be seen from the example illustrated, the present invention avoids the limitations of legacy subject-based user interfaces by allowing the end user both to see clearly, at a glance, some or all of his available actions for the selected item(s) and to carry out those actions rapidly, without needing to explore the user interface in search of subtle visual cues as to which actions are available.
In the example presented in FIGURES 1, 2 and 3, the available actions presented are grouped such that they represent traditional drop targets gathered together, optionally together with some or all other possible actions, such as those which might traditionally have been presented in a right-click "context menu" interface, for display and use by the end user in a more convenient manner.
In another example embodiment, the drop targets presented also include actions which do not require an "object", such as actions to delete, print, edit, play, open or view the selected item(s), or any other available actions. In that example embodiment, the drag-and-drop idiom provides a single unified interface for interaction with the system which permits the user, should he so wish, to carry out all available actions without changing his manner of interaction with the system.
Implementing the Present Invention

In order to implement the present invention within a user interface, there are two major aspects which need to be considered:
1. How to determine which actions to display to the end user when that user starts to drag one or more items within the user interface
2. How to present those actions to the end user

Other technical considerations, chiefly how to detect when an end user has started to drag an item within the user interface and how to identify which item(s) are being so dragged, are very much system specific, but any programmer reasonably skilled in the art would be able to carry out that portion of the implementation task without difficulty.
Actually carrying out the action requested by the end user is an application-specific function which is ancillary to the present embodiment, which is concerned solely with presenting those available actions to the end user for selection on detection of the drag event occurring.
Determining which actions to display

The set of actions which may be carried out on a given selected item (or set of items) will in some cases be large.
The decision as to which of those actions should be presented to the end user using the present invention is necessarily a design decision, and one which therefore must be determined by the user interface designer of the particular software system or application in which the present invention is implemented.
In the preferred embodiment, all possible actions are displayed for the dragged item. In other example embodiments only a selected subset of actions is displayed, defined according to whichever criteria are deemed appropriate.
Defining "available actions"

The key aspect required is to link each item, or, in the preferred embodiment, each kind of item, which can be drag-and-dropped in the user interface to an action which can be performed on that item. In one example embodiment, shown in Figures 1-5, the present invention is used in a digital media system in which some of the actions which may be performed are to play a music track or playlist, to add a track to a playlist, to remove a track from a playlist, to share an item with a friend or a group on a social networking service, and so forth.
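This linkage from item kind to available actions can be sketched as a simple lookup table. The item kinds and action identifiers below (track, album, addToPlaylist and so on) are illustrative assumptions for the digital media example, not part of the disclosure itself:

```typescript
// Hypothetical registry mapping each draggable item kind to the actions
// that may be performed on items of that kind.
type ItemKind = "track" | "album" | "playlist";

interface Action {
  id: string;
  label: string;
}

const availableActions: Record<ItemKind, Action[]> = {
  track: [
    { id: "play", label: "Play" },
    { id: "addToPlaylist", label: "Add to playlist" },
    { id: "addToQueue", label: "Add to play queue" },
    { id: "share", label: "Share to" },
  ],
  album: [
    { id: "play", label: "Play" },
    { id: "addToQueue", label: "Add to play queue" },
    { id: "share", label: "Share to" },
  ],
  playlist: [
    { id: "play", label: "Play" },
    { id: "share", label: "Share to" },
  ],
};

// Look up the actions to display when a drag begins on an item of a given kind.
function actionsFor(kind: ItemKind): Action[] {
  return availableActions[kind];
}
```

A registry of this shape keeps the "which actions are available" design decision in one place, so the designer can vary it per item kind without touching the drag-and-drop machinery.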
Defining the "target list" for actions

Additionally, each combination of item type and action may optionally have a "target list", which is an associated list which defines those things which may be the object of that action.
In an example embodiment of the present invention implemented within a digital media system, a typical such list might define that tracks may be added to items of type playlist. In that example, the items of type playlist would be a "variable list" of playlists, since the number of eligible items in that list varies according to the rights of the user performing that action.
In the same digital media system example embodiment, another typical object of the "add to" action for tracks might be the current user's play queue: the list of items which are awaiting playback for that user. In that case, there is just a single object item, the user's play queue, and thus the "target list" is called a "fixed list" in that instance.
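The distinction between actions with no target, fixed lists and variable lists can be sketched as a discriminated union; the type and field names here are assumptions for illustration:

```typescript
// Sketch of the "target list" concept: an action may need no target at all
// (e.g. delete or print), may have a fixed list of targets (e.g. the single
// play queue), or may have a variable list computed at drag time (e.g. the
// playlists the current user is allowed to edit).
interface TargetItem {
  id: string;
  label: string;
}

type TargetList =
  | { kind: "none" }
  | { kind: "fixed"; items: TargetItem[] }
  | { kind: "variable"; resolve: (userId: string) => TargetItem[] };

// Resolve a target list to the concrete items to show as drop targets
// for the given user.
function resolveTargets(list: TargetList, userId: string): TargetItem[] {
  switch (list.kind) {
    case "none":
      return [];
    case "fixed":
      return list.items;
    case "variable":
      return list.resolve(userId);
  }
}
```

Modelling variable lists as a resolver function captures the point made above: the eligible items are not known until drag time, because they depend on the rights of the user performing the action.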
Defining the display order for actions

In the preferred embodiment of the present invention, actions may also optionally be assigned an ordering, with a sub-ordering according to the type of target list associated with that action and a further sub-ordering defined by the designer but typically permitting ordering to be determined by one or more of the following factors:
* How recently the user has interacted with an item, such as by accessing or viewing that item
* Whether the user has marked an item as a "favourite"
* An ordering determined by a recommendations engine
* Alphabetical ordering

For example, in one example embodiment (see Figures) actions for tracks are first defined with the ordering:
* Add to
* Share to

where that top-level ordering is defined such that those actions are to be displayed as separate groupings in the user interface.
In the preferred embodiment, the set of sub-lists for a given action may consist of various fixed and variable lists and even actions with no actual target, permitting lists of all sub-types to be grouped for a given action, as illustrated in FIGURE 2 and FIGURE 5. Within each such grouping, in that example embodiment, items are further sorted such that they appear in the following order:
1. Actions which have no target item or items, such as "delete" or "print", are shown first
2. Fixed-list items, such as "Add to... play queue", are shown next
3. Variable-list items are shown next, with the most recently interacted-with items appearing first in that sub-section, any user-denoted favourite items next, and then any other items shown alphabetically

In other example embodiments, sort ordering may be defined differently or may be handled in a hard-coded manner, such that, for instance, items in sub-lists are merely shown in alphabetical order, or no sort ordering might be defined for sub-lists and/or items in sub-lists.
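The three-tier ordering described above can be sketched as a comparator; the `DropTarget` shape and its field names are illustrative assumptions:

```typescript
// Sketch of the preferred-embodiment sort: no-target actions first, then
// fixed-list items, then variable-list items ordered by recency of
// interaction, then favourites, then alphabetically.
interface DropTarget {
  label: string;
  listKind: "none" | "fixed" | "variable";
  lastAccessed?: number; // epoch ms; undefined if never accessed
  favourite?: boolean;
}

function sortDropTargets(targets: DropTarget[]): DropTarget[] {
  const rank = { none: 0, fixed: 1, variable: 2 };
  return [...targets].sort((a, b) => {
    // Tier 1: group by target-list type.
    if (rank[a.listKind] !== rank[b.listKind]) {
      return rank[a.listKind] - rank[b.listKind];
    }
    // Tier 2: most recently accessed first.
    const aRec = a.lastAccessed ?? 0;
    const bRec = b.lastAccessed ?? 0;
    if (aRec !== bRec) return bRec - aRec;
    // Tier 3: favourites before the remainder.
    if (!!a.favourite !== !!b.favourite) return a.favourite ? -1 : 1;
    // Tier 4: alphabetical fallback.
    return a.label.localeCompare(b.label);
  });
}
```

The comparator returns a new array rather than sorting in place, so the unsorted definition order of the action lists is preserved for embodiments that choose a different ordering.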
Displaying available actions to the end user

In the present invention, the key aspect of the display of available actions to the end user is that when the user drags an item in the user interface, the available actions for that item, the definition of which is disclosed above, are distinctly displayed to the end user in the form of drop targets.
In the preferred embodiment, the set of drop targets is displayed to the end user as an overlay onto the user interface as soon as the drag action is detected. In another example embodiment, the set of drop targets replaces the pre-existing user interface. In still another example embodiment, the set of drop targets is displayed within a brief period of up to five seconds in duration following the detection of the drag action. In still another example embodiment, the pre-existing interface is graphically manipulated, by changing its size or colouring or by any other means, when the set of drop targets is displayed.
In the preferred embodiment, the prior user interface is restored to normal, howsoever defined, following the completion of the drag-and-drop operation, whether that completion is due to the triggering of an action or to the cancellation of the drag-and-drop operation.
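The overlay-then-restore behaviour described in the two preceding paragraphs can be sketched as a pure state transition, independent of any particular GUI toolkit; the `UiState` shape and function names are assumptions for illustration:

```typescript
// Minimal sketch of the preferred embodiment: starting a drag overlays the
// drop targets; ending the drag (whether an action was triggered or the
// operation was cancelled) restores the prior interface.
interface UiState {
  overlayVisible: boolean;
  dropTargets: string[];
}

function onDragStart(state: UiState, targetsForItem: string[]): UiState {
  return { overlayVisible: true, dropTargets: targetsForItem };
}

function onDragEnd(state: UiState): UiState {
  // Restoration does not depend on how the drag ended.
  return { overlayVisible: false, dropTargets: [] };
}
```

In a real web implementation these transitions would typically be driven by the platform's drag events (e.g. dragstart/dragend), with `targetsForItem` computed from the dragged item's kind as discussed above.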
In the preferred embodiment, the order in which drop targets are displayed in the user interface is further determined by the sort order defined for target lists and sub-lists, as disclosed above.
Display location for drop targets

Various different example embodiments may opt to present that list of available actions in different locations on the display. Wherever those drop targets are displayed, the only requirement for the present invention is that each of the drop targets defined as available be accessible to the end user.
Possible display locations for the set of drop targets include, but are not limited to:
* Above, below or to the left or right of the item being dragged
* At fixed locations on the display, such as at the top, bottom, left or right side of the interface, as shown in FIGURES 4 and 5.
* Grouped such that certain defined top-level actions are shown as their own independent lists of actions with "target" items shown in a list in each group, as illustrated in FIGURES 2 and 3.
* Arranged near to or even around the item being dragged, such as one example embodiment where actions are shown in a circular arrangement around the item being dragged.
* Shown as "action" icons only, which, where applicable, then cause the display of that action's list of individual drop target items when the user hovers over that icon.
* Shown as "target object" icons which, when the item being dragged is dropped onto those icons, may in one example embodiment trigger a request to the end user to specify which action to perform if multiple actions are possible. For example, if the user drops a track onto the icon for a linked friend on a social networking service, and there are two possible actions available in that system, such as "send track to this friend" and "recommend track to this friend", then in that example embodiment of the present invention the user would at that point be asked whether he wishes to recommend the track to that friend or simply send the track to that friend.
* Any other reasonable method of display of the actions available for the item being dragged.
Scrolling

Where a list of available actions is too long to be displayed in its entirety, or merely for reasons of aesthetics or for any other reason, the list or lists of drop targets may be presented to the end user, in the preferred embodiment, as a scrollable area which, in one example implementation of that embodiment, may be scrolled by hovering the item being dragged at one or other extreme of the displayed section of that list while dragging the item, or by any other method of triggering scrolling which is supported by the operating system or application within which the present invention is implemented.
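The edge-hover scrolling just described can be sketched as a small pure function; the 10% edge threshold, the `ListViewport` shape and the one-item-per-step increment are all assumptions chosen for illustration:

```typescript
// Sketch of edge-hover scrolling for a drop-target list: while dragging,
// hovering near either end of the visible portion of the list scrolls it
// one item in that direction, clamped to the list bounds.
interface ListViewport {
  scrollOffset: number; // index of the first visible item
  visibleCount: number;
  totalCount: number;
}

// pointerFraction is the pointer's position along the scroll axis of the
// list area, expressed as a fraction in [0, 1].
function autoScroll(vp: ListViewport, pointerFraction: number): number {
  const EDGE = 0.1; // hovering within 10% of either end triggers scrolling
  if (pointerFraction < EDGE && vp.scrollOffset > 0) {
    return vp.scrollOffset - 1;
  }
  const maxOffset = vp.totalCount - vp.visibleCount;
  if (pointerFraction > 1 - EDGE && vp.scrollOffset < maxOffset) {
    return vp.scrollOffset + 1;
  }
  return vp.scrollOffset;
}
```

In practice this function would be called repeatedly on a timer while the dragged item hovers at the list edge, producing the continuous scrolling effect shown in FIGURE 6.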
An example of how such a scrolling area might appear to the end user is illustrated in FIGURE 6.
Core Concepts

A method for presenting an end user with the available actions for an item being dragged in a user interface shown on a screen of a computing device, comprising the steps of: (a) displaying the possible actions for an item as one or more drop targets when that item begins to be dragged, the drop targets being visually distinguishable on the screen from other regions or icons on the screen; (b) allowing the user to select an action by dropping the dragged item onto an associated drop target displayed in step (a) above.
* the drop targets are made visually distinguishable on the screen from other regions or icons on the screen by virtue of a change to what is shown on screen, and the change is triggered by the detection of the item being dragged.
* The method including the steps of defining which are the available actions for items which may be dragged within a user interface; and defining, for available actions, the "target list", which is the set of possible target objects for that action and which may comprise no target items in some cases.
* the displayed set of drop targets for the dragged item is removed from the user interface when the drag-and-drop operation is completed, whether that completion is by the execution of an action, the cancelling of the drag-and-drop operation or any other means
* the user interface is restored to normal when the set of drop targets is removed from the user interface.
* defining the available actions includes defining one or more sets of lists of targets for each action.
* the target list for an action may be one or more of no target items; a set of one or more fixed targets; a list of target items defined algorithmically; or any other defined list of targets.
* a list of target items is defined algorithmically and includes one or more of the user's most recently accessed items within a given set of items; the user's marked favourites within a given set of items; all items within a given set of items which do not fall into one or more other list or lists of items, such as a list of those items of a given type which are neither recently accessed nor the user's marked favourites; any other defined list of target items.
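One way such algorithmically defined lists might be computed is sketched below. This is hypothetical Python; the field name "id" and the use of sets of identifiers are illustrative assumptions.

```python
def partition_targets(items, recent_ids, favourite_ids):
    """Split a set of items into the user's recently accessed items,
    marked favourites, and the remainder (items in neither list).

    An item may appear in both the recents and favourites lists if it
    satisfies both criteria."""
    recents = [i for i in items if i["id"] in recent_ids]
    favourites = [i for i in items if i["id"] in favourite_ids]
    others = [i for i in items
              if i["id"] not in recent_ids and i["id"] not in favourite_ids]
    return recents, favourites, others
```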
* the list of target items is sorted alphabetically, sorted according to the order in which the target items in that list were most recently accessed, sorted according to any other defined criterion or combination of one or more criteria or is arranged randomly or is not sorted.
* the target list consists of a consolidation of two or more other target lists, whether aggregated consecutively in a defined sort order or intermixed and re-sorted or aggregated without regard to ordering or consolidated in any manner.
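The sorting and consolidation options above might be realised along these lines. This is a hypothetical Python sketch; the mode names are illustrative assumptions.

```python
def consolidate(target_lists, mode="consecutive", key=None):
    """Combine several target lists into one.

    "consecutive" aggregates the lists one after another in their given
    order; "resorted" intermixes all items and re-sorts them by `key`
    (e.g. alphabetically, or by most recent access time)."""
    merged = [t for lst in target_lists for t in lst]
    if mode == "consecutive":
        return merged
    if mode == "resorted":
        return sorted(merged, key=key)
    raise ValueError("unknown consolidation mode: %s" % mode)
```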
* the set of available actions for items includes the actions of adding an item to a playlist, adding an item to a play queue, sharing an item with one or more users via a social networking service, recommending an item to another person or persons, removing an item from a playlist, deleting an item, printing an item or any other actions applicable to the item being dragged.
* displaying the possible actions as drop targets comprises displaying one or more drop targets for (i) one or more available actions for the dragged item; or (ii) one or more possible target items for an available action for the dragged item; or (iii) one or more combination of available action and possible target item for that action for the dragged item; or (iv) any combination or subset or combination of subsets of the listed drop targets, howsoever constituted.
* drop targets are displayed consolidated into one or more lists of drop targets, grouped according to action or item type or by any other criteria.
* the list of drop targets is displayed as a horizontal or vertical line of drop targets or in a circular or polygonal formation or in any other appropriate formation in the user interface.
* the list of drop targets is displayed in the vicinity of the original location of the dragged item or at a fixed location in the user interface or in any other reasonable location within the user interface.
* the list of drop targets is displayed wholly or partially overlaying the pre-existing user interface or wholly or partially in place of the pre-existing user interface.
* the list of drop targets is displayed as a scrollable list such that all available items in the targets list are accessible even if all target items are not, for whatever reason, displayed simultaneously.
* the drop targets are graphically manipulated versions of the pre-existing user interface.
* the drop targets are shown for a period, such as 5 seconds, following detection of the drag action.
* the drop targets are shown for a period, such as 5 seconds, following detection of a voice command to show the drop targets.
* the available action for an item is determined by consideration of one or more of the kind of item which that item is defined as being or any other metadata related to that item or any metadata concerning the user currently dragging that item or by any other means.
* the metadata concerning the user dragging an item comprises the set of social network services to which that user belongs, the settings of preferences of that user and/or of the service which that user is utilising or any other metadata.
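Determining the available actions from the item's kind and the user's metadata could look roughly like this. This is hypothetical Python; the item kinds, action names and metadata fields are illustrative assumptions, not part of the specification.

```python
def available_actions(item, user):
    """Derive the available actions for a dragged item from the item's
    kind and from metadata about the user dragging it."""
    actions = []
    if item.get("kind") == "track":
        actions += ["add to playlist", "add to play queue"]
    if user.get("social_networks"):      # user belongs to at least one service
        actions.append("share via social network")
    if user.get("allow_delete", False):  # a user or service preference setting
        actions.append("delete")
    return actions
```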
* the list of drop targets is displayed as soon as the drag action is detected or within five seconds of the drag action being detected.
* an item in fact consists of two or more items.
* the method is implemented within a system or software application which is used to interact with digital media content.
Other aspects include: A computing device, such as a tablet, smartphone or computer, programmed to perform any of the preceding methods.
Computer software that, when running on a computing device, enables that device to perform any of the preceding methods.

Claims (1)

  1. <claim-text>CLAIMS 1. A method for presenting an end user with the available actions for an item being dragged in a user interface shown on a screen of a computing device, comprising the steps of (a) displaying the possible actions for an item as one or more drop targets when that item begins to be dragged, the drop targets being visually distinguishable on the screen from other regions or icons on the screen; (b) allowing the user to select an action by dropping the dragged item onto an associated drop target displayed in step (a) above.</claim-text> <claim-text>2. The method of claim 1 in which the drop targets are made visually distinguishable on the screen from other regions or icons on the screen by virtue of a change to what is shown on screen, and the change is triggered by the detection of the item being dragged.</claim-text> <claim-text>3. The method of claim 1 or 2, including the steps of defining which are the available actions for items which may be dragged within a user interface; and defining, for available actions, the "target list," which is the set of possible target objects for that action and which may comprise no target items in some cases.</claim-text> <claim-text>4. The method of any preceding claim, where the displayed set of drop targets for the dragged item is removed from the user interface when the drag-and-drop operation is completed, whether that completion is by the execution of an action or the cancelling of the drag-and-drop operation or by any other means.</claim-text> <claim-text>5. The method of any preceding claim, where the user interface is restored to normal when the set of drop targets is removed from the user interface.</claim-text> <claim-text>6. The method of any preceding claim where defining the available actions includes defining one or more sets of lists of targets for each action.</claim-text> <claim-text>7. 
The method of any preceding claim 3 or 6 where the target list for an action may be one or more of no target items; a set of one or more fixed targets; a list of target items defined algorithmically; or any other defined list of targets.</claim-text> <claim-text>8. The method of any preceding claim 3, 6 or 7 where a list of target items is defined algorithmically and includes one or more of the user's most recently accessed items within a given set of items; the user's marked favourites within a given set of items; all items within a given set of items which do not fall into one or more other list or lists of items, such as a list of those items of a given type which are neither recently accessed nor the user's marked favourites; any other defined list of target items.</claim-text> <claim-text>9. The method of any preceding claim 3, 6, 7 or 8 where the list of target items is sorted alphabetically, sorted according to the order in which the target items in that list were most recently accessed, sorted according to any other defined criterion or combination of one or more criteria or is arranged randomly or is not sorted.</claim-text> <claim-text>10. The method of any preceding claim 3, 6, 7, 8 or 9 where the target list consists of a consolidation of two or more other target lists, whether aggregated consecutively in a defined sort order or intermixed and re-sorted or aggregated without regard to ordering or consolidated in any manner.</claim-text> <claim-text>11. The method of any preceding claim where the set of available actions for items includes the actions of adding an item to a playlist, adding an item to a play queue, sharing an item with one or more users via a social networking service, recommending an item to another person or persons, removing an item from a playlist, deleting an item, printing an item or any other actions applicable to the item being dragged.</claim-text> <claim-text>12. 
The method of any preceding claim where displaying the possible actions as drop targets comprises displaying one or more drop targets for (i) one or more available actions for the dragged item; or (ii) one or more possible target items for an available action for the dragged item; or (iii) one or more combination of available action and possible target item for that action for the dragged item; or (iv) any combination or subset or combination of subsets of the listed drop targets, howsoever constituted.</claim-text> <claim-text>13. The method of any preceding claim where drop targets are displayed consolidated into one or more lists of drop targets, grouped according to action or item type or by any other criteria.</claim-text> <claim-text>14. The method of preceding claim 13 where the list of drop targets is displayed as a horizontal or vertical line of drop targets or in a circular or polygonal formation or in any other appropriate formation in the user interface.</claim-text> <claim-text>15. The method of preceding claim 13 where the list of drop targets is displayed in the vicinity of the original location of the dragged item or at a fixed location in the user interface or in any other reasonable location within the user interface.</claim-text> <claim-text>16. The method of preceding claim 13 where the list of drop targets is displayed wholly or partially overlaying the pre-existing user interface or wholly or partially in place of the pre-existing user interface.</claim-text> <claim-text>17. The method of preceding claim 13 where the list of drop targets is displayed as a scrollable list such that all available items in the targets list are accessible even if all target items are not, for whatever reason, displayed simultaneously.</claim-text> <claim-text>18. The method of any preceding claim where the drop targets are graphically manipulated versions of the pre-existing user interface.</claim-text> <claim-text>19. The method of any preceding claim where the drop targets are shown for a period, such as 5 seconds, following detection of the drag action.</claim-text> <claim-text>20. 
The method of any preceding claim where the drop targets are shown for a period, such as 5 seconds, following detection of a voice command to show the drop targets.</claim-text> <claim-text>21. The method of any preceding claim where the available action for an item is determined by consideration of one or more of the kind of item which that item is defined as being or any other metadata related to that item or any metadata concerning the user currently dragging that item or by any other means.</claim-text> <claim-text>22. The method of claim 16 where the metadata concerning the user dragging an item comprises the set of social network services to which that user belongs, the settings of preferences of that user and/or of the service which that user is utilising or any other metadata.</claim-text> <claim-text>23. The method of any preceding claim where the list of drop targets is displayed as soon as the drag action is detected or within five seconds of the drag action being detected.</claim-text> <claim-text>24. The method of any preceding claim where an item in fact consists of two or more items.</claim-text> <claim-text>25. The method of any preceding claim where that method is implemented within a system or software application which is used to interact with digital media content.</claim-text> <claim-text>26. A computing device, such as a tablet, smartphone or computer, programmed to perform any of the preceding methods.</claim-text> <claim-text>27. Computer software that, when running on a computing device, enables that device to perform any of the preceding methods.</claim-text>
GB1220229.7A 2011-11-09 2012-11-09 Displaying available targets for an object during a drag and drop operation Withdrawn GB2498041A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1119383.6A GB201119383D0 (en) 2011-11-09 2011-11-09 Rara

Publications (2)

Publication Number Publication Date
GB201220229D0 GB201220229D0 (en) 2012-12-26
GB2498041A true GB2498041A (en) 2013-07-03

Family

ID=45421544

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1119383.6A Ceased GB201119383D0 (en) 2011-11-09 2011-11-09 Rara
GB1220229.7A Withdrawn GB2498041A (en) 2011-11-09 2012-11-09 Displaying available targets for an object during a drag and drop operation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1119383.6A Ceased GB201119383D0 (en) 2011-11-09 2011-11-09 Rara

Country Status (2)

Country Link
GB (2) GB201119383D0 (en)
WO (1) WO2013068761A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115407909A (en) * 2021-05-27 2022-11-29 Oppo广东移动通信有限公司 Content sharing method, device, terminal and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040001094A1 (en) * 2002-06-28 2004-01-01 Johannes Unnewehr Automatic identification of drop zones
US20090259959A1 (en) * 2008-04-09 2009-10-15 International Business Machines Corporation Seamless drag and drop operation with multiple event handlers
US20100070899A1 (en) * 2008-09-12 2010-03-18 Meebo, Inc. Techniques for sharing content on a web page
US20100083154A1 (en) * 2008-09-30 2010-04-01 Fujifilm Corporation Apparatus, method and program for controlling drag and drop operation and computer terminal
US20100146425A1 (en) * 2008-12-08 2010-06-10 Lance John M Drag and drop target indication in a graphical user interface
US8051382B1 (en) * 2008-10-30 2011-11-01 Hewlett-Packard Development Company, L.P. Displaying rating indications for drop targets in response to user dragging of mobile icon

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7665028B2 (en) * 2005-07-13 2010-02-16 Microsoft Corporation Rich drag drop user interface
US8793605B2 (en) * 2006-03-29 2014-07-29 Yahoo! Inc. Smart drag-and-drop
US7546545B2 (en) * 2006-09-27 2009-06-09 International Business Machines Corporation Emphasizing drop destinations for a selected entity based upon prior drop destinations
US20080295012A1 (en) * 2007-05-23 2008-11-27 Microsoft Corporation Drag-and-drop abstraction
US20090276701A1 (en) * 2008-04-30 2009-11-05 Nokia Corporation Apparatus, method and computer program product for facilitating drag-and-drop of an object


Also Published As

Publication number Publication date
GB201119383D0 (en) 2011-12-21
WO2013068761A1 (en) 2013-05-16
GB201220229D0 (en) 2012-12-26


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)