
WO2018146493A1 - Graphical user interface device and method - Google Patents


Info

Publication number
WO2018146493A1
Authority
WO
WIPO (PCT)
Prior art keywords
visual object
vertices
state
user interface
virtual movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB2018/050382
Other languages
English (en)
Inventor
Nicolas COMER-CALDER
Aron Alexander SCHLEIDER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Voucher Market Ltd
Original Assignee
Voucher Market Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Voucher Market Ltd filed Critical Voucher Market Ltd
Publication of WO2018146493A1 publication Critical patent/WO2018146493A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Field of Invention: The present invention relates to a graphical user interface device and method, and is in the field of graphical user interfaces. More particularly, but not exclusively, the present invention relates to controlling access within an electronic device using a graphical user interface.
  • Access is often mediated via a graphical user interface where a graphical element is displayed and where user input is received to enable interaction with the graphical element. In the simplest form this may involve, for example, receiving a pointer-click to an icon.
  • virtually wrapped objects are displayed to the user and the user may click on the wrapped object.
  • a predefined animation is displayed where the virtually wrapped object is unwrapped through a series of animation or video frames.
  • the unwrapped object may represent content or functionality, such as a greeting card, a virtual gift, an image corresponding to an actual gift or a gift card redeemable for a physical or virtual product.
  • There is provided a computer-implemented method of controlling an electronic device with a display and a user input, including: displaying a visual object in a first state within a graphical user interface on the display; detecting a user input event comprising virtual movement within the graphical user interface; and, in response to the user input event: during the virtual movement, modifying the visual object in correspondence with the virtual movement such that the visual object is converted from the first state to a second state; and providing access to content or functionality when the visual object is in the second state.
  • Figure 1 shows a block diagram illustrating an electronic device in accordance with an embodiment of the invention
  • Figure 2 shows a block diagram illustrating a software architecture for the electronic device in accordance with an embodiment of the invention
  • Figure 3 shows a flow diagram illustrating a method in accordance with an embodiment of the invention
  • Figures 4a to 4d show a series of screenshots illustrating a graphical user interface in accordance with an embodiment of the invention
  • Figure 5a shows a diagram illustrating the forces acting upon vertices within a grid in accordance with an embodiment of the invention
  • Figure 6c shows a flow diagram illustrating a method in accordance with an embodiment of the invention.

Detailed Description of Preferred Embodiments
  • the present invention provides a graphical user interface device and method.
  • Referring to Figure 1, an electronic device 100 in accordance with an embodiment of the invention is shown.
  • the device 100 includes a processor 101, a memory 102, a user input apparatus 103, and a display apparatus 104.
  • the device 100 may also include a communications apparatus 105.
  • the input apparatus 103 may include one or more of a touch/near-touch input, an audio input, a keyboard, a pointer device (such as a mouse), or any other type of input.
  • the display apparatus 104 may include one or more of a digital screen (such as an LED or OLED screen), an e-ink screen, or any other type of display.
  • the input and display apparatuses 103 and 104 may form an integrated user interface 106 such as a touch or near-touch screen.
  • the device 100 may constitute a personal computing device such as a desktop or laptop computer, or a mobile device 100, such as a smart-phone or a tablet.
  • the device 100 may include a common operating system 107 such as Apple iOS, Google Android, or Microsoft Windows Phone for mobile devices or Microsoft Windows or Apple OSX for personal computing devices.
  • the processor 101 may be configured to display a visual object within a graphical user interface in a first and second state on the display apparatus 104.
  • the processor 101 may be configured to convert the state of the visual object from the first state to the second in response to a user input via the input apparatus 103 at the graphical user interface.
  • the user input may comprise a virtual movement, for example, a touch movement detected moving from one location to another or a pointer-based movement detected moving from one location to another.
  • the states may be reflected as visual differences on the display.
  • the processor 101 may be further configured to provide content or functionality to the user. Access to the content or functionality may be provided by the processor 101 when the visual object is converted into the second state.
  • the memory 102 may be configured to store software applications 108, libraries 109, the operating system 107, device drivers 110, and data 111.
  • the processor 101 is configured to execute the software applications 108, libraries 109, operating system 107, and device drivers 110, and to retrieve data 111.
  • the communications apparatus 105 may be configured to communicate with one or more other devices or servers via a communications interface such as wifi, Bluetooth, and/or cellular (e.g. 2G, 3G, or 4G) and/or across a network (such as a cellular network, a private LAN/WLAN and/or the Internet).
  • Software applications 201 are provided at a top layer. Below this layer are user interface APIs (Application Programming Interfaces) 202 which provide access for the application software 201 to user interface libraries. Below this layer are operating system APIs 203 which provide access for the application software 201 and user interface libraries to the core operating system 204. Below the core operating system 204 are the device drivers 205 which provide access to the input 103, output 104, and communication 105 apparatuses.
  • the user interface APIs 202 include exposed calls to instruct the processor 101 to display a visual object within a graphical user interface in a first state on the display apparatus 104 and to instruct the processor 101 to convert the state of the visual object from the first displayed state to a second displayed state in response to a user input via the input apparatus 103 at the graphical user interface.
  • the calls may be exposed to the application layer 201.
  • In step 301, a visual object is displayed in a first state to a user on an electronic device (e.g. on display apparatus 104) within a graphical user interface.
  • the visual object may be a visual representation of a sheet of material such as paper.
  • the visual object may be comprised of a visible or non-visible grid.
  • the grid may be comprised of interconnected vertices.
  • In step 302, a user input event may be detected at the graphical user interface.
  • the user input event may comprise virtual movement.
  • Virtual movement may be a continuous series of touch events detected by a touchscreen across an area, or a pointer-based movement from one location to another. It will be appreciated that virtual movement may be captured in a variety of ways.
  • the virtual movement may comprise a path.
  • the path may not be predefined. That is, the path may be only defined by the user when the user interacts with the user input apparatus to provide the user input event.
  • the user input event may comprise multiple virtual movements defined by touch events detected by a touch-screen.
  • each of the multiple virtual movements may correspond to a path such that multiple paths are defined.
  • In step 303, in response to the user input event and during the virtual movement, the visual object is converted, in correspondence to the virtual movement, from the first state to a second state. Modification of the visual object is reflected within the graphical user interface.
  • the modification may be tears, folds, or crumpling of the sheet, such that the sheet of material is simulated within the graphical user interface.
  • Where the visual object is comprised of a grid of vertices, virtual movement proximate to the vertices may result in "disconnections" between those vertices. In this way, where the visual object represents a sheet of material, a tearing in the sheet may be simulated.
  • the effect of the virtual movement on a localised aspect of the grid may iterate throughout the grid of vertices to provide a crumpling effect.
  • a 2D image may be mapped to the grid of vertices such that modifications to the grid of vertices result in a visual change to the 2D image, which may, in a first state, be displayed in a flat form and, after modification to the grid of vertices, may be displayed in a three-dimensional form illustrating crumpling.
  • a force is calculated on each vertex of the grid of vertices, and virtual movement changes the forces on one or more vertices within the grid of vertices, such that, beyond a specific force threshold, a vertex may split into two representing a "disconnection" within the grid.
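The threshold-based "disconnection" described above can be sketched in Python. This is an illustrative simplification, not the patent's implementation: it models a disconnection as the removal of an over-stressed connection between two vertices rather than literally splitting a vertex in two, and the function name and data layout are assumptions.

```python
def apply_tearing(springs, force_magnitudes, threshold):
    """Partition spring connections into intact and "torn" sets.

    springs          : list of (vertex_a, vertex_b) index pairs
    force_magnitudes : per-spring force magnitudes, same order as springs
    threshold        : force beyond which a connection breaks,
                       simulating a tear in the sheet of material
    """
    kept, torn = [], []
    for spring, force in zip(springs, force_magnitudes):
        # a connection under excessive force "disconnects"
        (torn if force > threshold else kept).append(spring)
    return kept, torn
```

Under this model a tear propagates naturally: once one connection breaks, load redistributes to its neighbours, which may push them over the threshold in subsequent frames.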
  • associated sound effects may be played such as tearing, folding or crumpling sound effects.
  • vertices may experience force increases in two opposing directions resulting in more rapid "disconnections".
  • vertices are not split into two to represent disconnections, but the visual object may comprise two or more grids of vertices, each grid of vertices mapped to one part of the image or one image of a plurality of images. Modification may then affect one or more of the grids of vertices depending on the type of user input event, resulting in one grid or both moving in relation to the other, simulating, for example, a tearing of the overall visual object.
  • In step 304, access is provided to content or functionality once the visual object is converted into the second state. Access may be provided to the user within the graphical user interface.
  • the content or functionality may include display of a greeting card or message card, display of a gift card, and/or providing the ability to purchase products or services. It will be appreciated that access to various types of content or functionality may be provided.
  • Figures 4a to 4d show diagrams illustrating a sequence of user interactions with screenshots generated by a software application in accordance with a method of the invention.
  • a visual object representing wrapping paper is displayed to the user within a graphical user interface.
  • An animated icon may be displayed to the user to indicate the type of action that is possible in relation to this visual object.
  • a dragging user input is indicated.
  • a dragging user input is detected within the graphical user interface in relation to the visual object.
  • the visual representation of the wrapping paper tears and folds organically corresponding to where and how the user drags.
  • the content or functionality is a further wrapping paper to be interacted with and a new visual object representing this further wrapping paper is displayed within the graphical user interface as shown in Figure 4c.
  • the display of the wrapping paper shows tearing and folding of the paper to display a message card behind in Figure 4d.
  • An image (such as a 2D bitmap image) is mapped into a grid of vertices. Each connected pair of vertices within the grid is connected by a spring.
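A minimal sketch of this construction, with names and data layout chosen for illustration rather than taken from the patent: each vertex stores a position and the texture coordinate that maps the 2D image onto the grid, and each horizontally or vertically adjacent pair is recorded as a spring.

```python
def build_grid(cols, rows, spacing):
    """Create a cols x rows grid of vertices (cols, rows >= 2) and the
    springs connecting each adjacent horizontal/vertical pair."""
    vertices = []
    for r in range(rows):
        for c in range(cols):
            vertices.append({
                "pos": (c * spacing, r * spacing),       # rest position
                "uv": (c / (cols - 1), r / (rows - 1)),  # image mapping
            })
    springs = []  # (index_a, index_b) pairs; rest length equals spacing
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:
                springs.append((i, i + 1))     # horizontal neighbour
            if r + 1 < rows:
                springs.append((i, i + cols))  # vertical neighbour
    return vertices, springs
```

Rendering can then sample the image at each vertex's `uv` coordinate while drawing at its displaced `pos`, so deforming the grid visually deforms the image.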
  • In an initial state, before user events are detected, the grid is in a rest state and the forces upon the vertices sum to null, as shown in Figure 5a.
  • When a user event is first detected (such as one or more touch events), an anchor point is created at each user event initiation point corresponding to the grid (e.g. a touch event on a touch-screen).
  • One or more anchor springs may be created between each of the one or more anchor points and one or more proximate vertices.
  • As the virtual movement proceeds, the anchor points are moved to correspond to the virtual movement. Tension then increases on the one or more anchor springs and additional force is applied to the associated vertices.
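A sketch of anchor creation, under the assumption (not stated in the source) that an anchor is tethered by anchor springs to every vertex within a fixed radius of the touch point; the function name and data layout are illustrative:

```python
import math

def create_anchor(touch_pos, vertices, radius):
    """Create an anchor point at a touch position, tethered to each
    vertex within `radius` of the touch.

    vertices: list of dicts, each with a "pos" (x, y) entry.
    Returns the anchor: its position and the tethered vertex indices.
    Dragging then updates anchor["pos"], which stretches the anchor
    springs and applies additional force to the tethered vertices.
    """
    tethered = [
        i for i, v in enumerate(vertices)
        if math.hypot(v["pos"][0] - touch_pos[0],
                      v["pos"][1] - touch_pos[1]) <= radius
    ]
    return {"pos": touch_pos, "vertices": tethered}
```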
  • Each point (i.e. vertex or anchor point) may be defined with parameters such as position and velocity. Springs may be defined with parameters such as rest length and stiffness.
  • Forces between two points (p1 and p2) may be updated as follows:
  • Point integration using velocity and acceleration over a given deltaTime may be defined as:
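The force-update and integration formulas themselves are not reproduced in this text. One conventional reading, a Hooke's-law spring force combined with semi-implicit Euler integration, can be sketched as follows; the damping constant and function names are assumptions for illustration, not the patent's definitions:

```python
import math

def spring_force(p1, p2, rest_length, stiffness):
    """Force exerted on point p1 by the spring joining p1 and p2
    (Hooke's law): pulls p1 toward p2 when stretched past rest_length,
    pushes it away when compressed."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)  # coincident points: direction undefined
    magnitude = stiffness * (dist - rest_length)
    return (magnitude * dx / dist, magnitude * dy / dist)

def integrate(pos, vel, force, mass, delta_time, damping=0.98):
    """Advance one point by one frame: acceleration from force/mass,
    velocity update (damped so the sheet settles), then position
    update over delta_time (semi-implicit Euler)."""
    ax, ay = force[0] / mass, force[1] / mass
    vx = (vel[0] + ax * delta_time) * damping
    vy = (vel[1] + ay * delta_time) * damping
    return (pos[0] + vx * delta_time, pos[1] + vy * delta_time), (vx, vy)
```

Each frame, the forces from all springs (grid springs and anchor springs) attached to a point would be summed before calling `integrate` for that point.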
  • multiple images forming a larger visual object may each be mapped onto a separate grid.
  • These image-grid pairs exist within the user interface as wrapping-paper objects.
  • a user input event comprising one or more touch events may connect the anchor points for each touch event to vertices at one of the wrapping-paper objects via the anchor springs.
  • a user input event comprising a single touch event will affect vertices of the grid for one wrapping-paper object.
  • a user input event comprising multiple touch events may affect multiple wrapping-paper objects when the vertices proximate to the anchor points exist on different wrapping-paper objects.
  • a user provides a single input point, for example using a mouse or one finger on a touch-screen.
  • An example of a user interacting with two wrapping-paper objects in accordance with the invention will be described with reference to Figure 5c.
  • a user provides two input points, for example using two fingers on a touch-screen.
  • Where the larger visual object is a layer, there may exist multiple layers within the user interface such that the layers are stacked in front of one another. Each layer may partially or completely obscure the layer behind it. These layers exist within the interface as wrapping objects. An embodiment of the invention will now be described with reference to Figures 6a to 6b.
  • This embodiment may include five system modules.
  • the system modules may be implemented in software or within firmware.
  • the modules may execute at the user interface layer 202 and expose user interface APIs to application software executing at the application layer 201.
  • the system modules may include a Wrapping Manager module, a Wrapping module, a Wrapping Geometry module, a Wrapping Paper module, and Wrapping Spring module.
  • the Wrapping Manager module may be configured for managing the entire "unwrapping" user interface process. It may be configured for:
  • the Wrapping module may be configured for managing an "unwrapping" layer. It may be configured for:
  • the Wrapping Geometry module may be configured for creating the grid plane geometry. It may be configured for:
  • the Wrapping Paper module may be configured for managing each separate "paper piece” or Wrapping Paper object. It may be configured for:
  • the Wrapping Spring module may be configured for calculating the forces applied between two vertices. It may be configured for:
  • the Wrapping Spring module may require rest length and stiffness parameters.
  • the Wrapping Manager module receives user input and executes the main update function calling the active Wrapping object.
  • the Wrapping object manages the Wrapping Paper objects that represent each separate piece of paper.
  • Each Wrapping Paper object updates its vertices using its Wrapping Spring objects for the physics calculations.
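The delegation described in the last three points can be sketched as a class skeleton. This is an illustrative reconstruction of the module hierarchy only; method names, constructors, and the frame counter are assumptions, and the per-vertex physics is elided:

```python
class WrappingSpring:
    """Calculates the force applied between two vertices; requires rest
    length and stiffness parameters, per the description."""
    def __init__(self, rest_length, stiffness):
        self.rest_length = rest_length
        self.stiffness = stiffness

class WrappingGeometry:
    """Creates the grid plane geometry (vertices and connections)."""
    pass

class WrappingPaper:
    """Manages one separate "paper piece": updates its vertices using
    its WrappingSpring objects for the physics calculations."""
    def __init__(self, springs):
        self.springs = springs
        self.frames_updated = 0
    def update(self, delta_time):
        # spring-force accumulation and vertex integration would go here
        self.frames_updated += 1

class Wrapping:
    """Manages one "unwrapping" layer and its WrappingPaper pieces."""
    def __init__(self, papers):
        self.papers = papers
    def update(self, delta_time):
        for paper in self.papers:
            paper.update(delta_time)

class WrappingManager:
    """Receives user input and executes the main update function,
    calling the active Wrapping object."""
    def __init__(self, wrappings):
        self.wrappings = wrappings
        self.active = wrappings[0] if wrappings else None
    def update(self, delta_time):
        if self.active is not None:
            self.active.update(delta_time)
```

Each frame, the manager's `update` cascades down to every paper piece of the active layer, matching the control flow described above.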
  • A potential advantage of some embodiments of the present invention is that immediate visual feedback is provided to the user, that the visual feedback correlates more directly to the user action, and that the skeuomorphic attributes enhance user ease-of-use, particularly for users with less exposure to technology or technical ability. At least some of these advantages may thus provide an improved graphical user interface, which in turn provides an improved device. Furthermore, a potential advantage of some embodiments of the present invention is that a user interface API is provided to application software developers to facilitate ease of creation of this user interface element. While the present invention has been illustrated by the description of the embodiments thereof, and while the embodiments have been described in considerable detail, it is not the intention of the applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details, representative apparatus and method, and illustrative examples shown and described. Accordingly, departures may be made from such details without departure from the spirit or scope of the applicant's general inventive concept.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A computer-implemented method of controlling an electronic device with a display and a user input is disclosed. The method comprises the steps of: displaying a visual object in a first state within a graphical user interface on the display; detecting a user input event comprising virtual movement within the graphical user interface; and, in response to the user input event: during the virtual movement, modifying the visual object in correspondence with the virtual movement such that the visual object is converted from the first state to a second state; and providing access to content or functionality when the visual object is in the second state. The virtual movement does not follow a predefined path.
PCT/GB2018/050382 2017-02-10 2018-02-12 Graphical user interface device and method Ceased WO2018146493A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1702286.4A GB201702286D0 (en) 2017-02-10 2017-02-10 A graphical user interface device and method
GB1702286.4 2017-02-10

Publications (1)

Publication Number Publication Date
WO2018146493A1 true WO2018146493A1 (fr) 2018-08-16

Family

ID=58462068

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/050382 Ceased WO2018146493A1 (fr) 2018-08-16 2018-02-12 Graphical user interface device and method

Country Status (2)

Country Link
GB (1) GB201702286D0 (fr)
WO (1) WO2018146493A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060284852A1 (en) * 2005-06-15 2006-12-21 Microsoft Corporation Peel back user interface to show hidden functions
US20120098639A1 (en) * 2010-10-26 2012-04-26 Nokia Corporation Method and apparatus for providing a device unlock mechanism
EP2587361A2 (fr) * 2011-10-25 2013-05-01 Samsung Electronics Co., Ltd Procédé et appareil d'affichage de livre électronique dans un terminal ayant une fonction de lecteur de livre électronique
US20130159914A1 (en) * 2011-12-19 2013-06-20 Samsung Electronics Co., Ltd. Method for displaying page shape and display apparatus thereof
US20130205255A1 (en) * 2012-02-06 2013-08-08 Hothead Games, Inc. Virtual Opening of Boxes and Packs of Cards


Also Published As

Publication number Publication date
GB201702286D0 (en) 2017-03-29

Similar Documents

Publication Publication Date Title
US12307080B2 (en) Displaying a three dimensional user interface
US7954066B2 (en) Interface engine providing a continuous user interface
US20210141523A1 (en) Platform-independent user interface system
CN102023706B (zh) 用于与虚拟环境中的对象进行交互的系统
US8786517B2 (en) System and method for displaying a user interface across multiple electronic devices
US20120092340A1 (en) Systems, methods, and computer-readable media for manipulating graphical objects
EP2584450A2 (fr) Procédé de modification d'attributs rendus d'éléments de liste dans une interface utilisateur
US20130127870A1 (en) Focus-change invariance in a graphical display
US20100235769A1 (en) Smooth layout animation of continuous and non-continuous properties
WO2020180421A1 (fr) Accrochage d'un objet virtuel sur une surface cible
CN110473273B (zh) 矢量图形的绘制方法、装置、存储介质及终端
JP6532981B2 (ja) 持続的ノードフレームワーク
EP2992411A1 (fr) Manipulation automatique de données visualisées en fonction de l'interactivité
CN108255546A (zh) 一种数据加载动画的实现方法及装置
CA2806906C (fr) Systeme et procede pour afficher une interface utilisateur sur plusieurs dispositifs electroniques
CN116610881A (zh) 一种基于低代码软件的WebGL浏览交互方法
CN111708533B (zh) 在应用瘦客户端中设置鼠标显示状态的方法及装置
WO2018146493A1 (fr) Dispositif et procédé d'interface utilisateur graphique
CN109416638B (zh) 可定制的紧凑叠加窗口
US11907646B1 (en) HTML element based rendering supporting interactive objects
JP2017072977A (ja) コンピュータ・プログラム
CN118708093A (zh) 网页控件的滑动控制方法及装置、设备、存储介质
US20070184906A1 (en) Method of controlling interactions between objects
CN117971220A (zh) 控件显示方法、介质、装置和计算设备
CN115857778A (zh) 页面生成方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18714008

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18714008

Country of ref document: EP

Kind code of ref document: A1