
WO2018001456A1 - Device and method for haptic exploration of a rendered object - Google Patents

Device and method for haptic exploration of a rendered object

Info

Publication number
WO2018001456A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
haptic
rendering
finger
touchscreen
Prior art date
Application number
PCT/EP2016/064947
Other languages
French (fr)
Inventor
José ARAÚJO
Guoqiang Zhang
Lars Andersson
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget Lm Ericsson (Publ) filed Critical Telefonaktiebolaget Lm Ericsson (Publ)
Priority to PCT/EP2016/064947 priority Critical patent/WO2018001456A1/en
Publication of WO2018001456A1 publication Critical patent/WO2018001456A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user

Definitions

  • the invention relates to a device for haptic exploration of an object rendered on a haptic touchscreen comprised in the device, a method of a device for haptic exploration of an object rendered on a haptic touchscreen comprised in the device, a corresponding computer program, and a corresponding computer program product.
  • the haptic internet is considered to be the next step in mobile networking. It is envisioned that users of mobile devices, such as mobile phones, smartphones, tablets, and the like, will be able to communicate by means of touch, in addition to voice and video. This is achieved by means of haptic interfaces which provide haptic feedback using actuation technologies utilizing, e.g., ultrasound, vibrotactile, electrostatic, or piezoelectric transducers.
  • haptic feedback is typically used for improving the user's interaction with a graphical user interface, such as buttons or other user-interface objects displayed on a touchscreen, or to enable haptic exploration of a physical object or material which is rendered on the touchscreen.
  • Haptic exploration is a mechanism by which humans learn about the surface properties of (unknown) objects. Through the sense of touch, we are able to learn about attributes such as object shape, surface texture, stiffness, and temperature.
  • a user swiping with his/her finger over an image which is rendered on a haptic touchscreen and which represents a physical object or material can feel the texture of the rendered object or material at the location of touch, since the device is capable of tracking the finger's position.
  • a possible use case for such technology is e-commerce, where users may be given the opportunity to explore the haptic characteristics of an object, such as a piece of furniture or a clothing item, in addition to its visual appearance, by rendering the object on a haptic display.
  • since additional information needs to be communicated to the haptic device for rendering the haptic characteristics of an object, e.g., friction, stiffness, or texture, the efficient selection and transmission of data which is required for rendering haptic properties of an object needs to be addressed for efficient bandwidth utilization.
  • a device for rendering an object on a haptic touchscreen is provided.
  • the haptic touchscreen is comprised in the device, which may, e.g., be any one of a display, a mobile phone, a smartphone, a mobile terminal, a User Equipment (UE), a tablet, a laptop, or the like.
  • the device is operative to render the object on the haptic touchscreen using first data which comprises data for graphically rendering the object, determine haptic exploration characteristics of a finger interacting with the haptic touchscreen for haptically exploring the object, acquire second data for rendering one or more haptic properties of the object, and render the one or more haptic properties of the object using the second data.
  • the second data is selected based on the determined haptic exploration characteristics.
  • a method of rendering an object on a haptic touchscreen comprises rendering the object on the haptic touchscreen using first data which comprises data for graphically rendering the object, determining haptic exploration characteristics of a finger interacting with the haptic touchscreen for haptically exploring the object, acquiring second data for rendering one or more haptic properties of the object, and rendering the one or more haptic properties of the object using the second data.
  • the second data is selected based on the determined haptic exploration characteristics.
  • a computer program comprises computer-executable instructions for causing a device to perform the method according to an embodiment of the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
  • a computer program product comprises a computer-readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
  • the invention makes use of an understanding that the amount of haptic data which is transmitted to a device comprising a haptic touchscreen, herein referred to as a haptic device, for rendering an object for haptic exploration by a user, using a finger or any other body part, may be reduced by selectively providing haptic data to the haptic device based on determined haptic exploration characteristics of the finger, or other body part, interacting with the haptic touchscreen for haptically exploring the object, i.e., sensing the object using a finger or other body part.
  • an object may, e.g., be a representation of an item offered for sale in an online shop, such as a piece of furniture, a clothing item, and the like.
  • the object can be graphically rendered, using media data, e.g., image data, and/or haptically rendered.
  • Haptic rendering of an object is to be understood as rendering haptic properties of the object on a haptic interface, such as a haptic display, for haptic exploration.
  • the haptic interface may be a haptic touchscreen which is capable of both sensing haptic exploration characteristics, such as the position, velocity, and force of the finger or other body part, and providing haptic feedback via, e.g., piezoelectric actuators, ultrasonic actuators, and electrostatic actuators.
  • haptic touchscreens are known, e.g., from the Tpad phone.
  • Embodiments of the invention first render an object graphically, using first data, before acquiring second data for rendering one or more haptic properties of the object.
  • the rendered object may, e.g., be selected by the user of the haptic device from a list of displayed objects, such as items for sale in an online shop.
  • a problem which is addressed herein is the selection of the second data for rendering the one or more haptic properties of the object.
  • Known solutions simply acquire all haptic data for an object, resulting in a considerable increase in the amount of data which needs to be transmitted to the haptic device.
  • the first data and the second data may be acquired by retrieving the data from a network node providing the data, such as a server of an online shop, or the like.
  • the invention is based on an understanding that the second data may be selected based on haptic exploration characteristics which provide information about the haptic exploration of the finger, or other body part.
  • the haptic exploration characteristics may be determined by tracking the finger interacting with the haptic touchscreen during haptic exploration of the rendered object, and comprise any one, or a combination of, a position, a velocity, and a force, applied by the finger to the touchscreen when exploring the object.
  • the first data may further comprise data for haptically rendering the one or more haptic properties of the object with a first level of detail
  • the second data comprises data for haptically rendering the one or more haptic properties of the object with a second level of detail which is higher than the first level of detail.
  • the first data may comprise data for rendering only a subset of the haptic properties of the object, e.g., only texture
  • the second data comprises data for rendering additional haptic properties of the object, e.g., stiffness.
  • the first data may comprise haptic data for rendering an outline of the object.
  • the user can start exploring the haptic properties of the object, and the rendered haptic properties of the object are improved when the second data is received by the haptic device and used for re-rendering the object.
  • at least two regions may be defined for the object, and the second data is selected based on a number of times the finger has explored the object within each of the at least two regions.
  • the second data is only acquired by the device for one or more regions which in fact are explored by the finger or other body part, or about to be explored. This may be achieved by tracking a position, and optionally a velocity, of the finger or other body part relative to the regions which are defined for the rendered object, and selecting the second data accordingly.
  • the second data may be haptic data for a region which is currently explored by the finger, i.e., the current position of the finger is within that region.
  • the second data may be haptic data for a region which the finger is likely to explore, which region may be identified based on the current position of the finger and a velocity of the finger.
  • the second data comprises at least one of friction data, stiffness data, and texture data.
  • the second data is selected based on whether the finger is exploring a friction, a stiffness, or a texture, respectively, of the object or a region thereof. This can be determined by characterizing the interaction of the finger with the haptic touchscreen, e.g., based on whether the finger slides over the haptic touchscreen or is pressed onto the haptic touchscreen.
  • the second data is available in at least two different resolutions, and is selected based on a velocity of the finger exploring the object, or a region thereof. That is, if the finger or other body part is moving quickly over the rendered object or a region thereof, the haptic properties of the object are rendered at a low resolution. If the finger or body part is moving slowly over the rendered object or regions thereof, the haptic properties of the object are rendered at a high resolution.
  • This embodiment of the invention is based on an understanding that the user may feel more details when moving the finger slowly, as compared to moving the finger rapidly over the haptic touchscreen.
  • the second data is acquired by selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for the object, requesting the second data from the network node providing the second data, and receiving the second data from the network node providing the second data.
  • the haptic device which selects the second data and requests the selected second data from the network node providing the second data.
  • the information identifying second data which is available for the object may be comprised in the first data, e.g., as metadata.
  • the second data is acquired by transmitting the determined haptic exploration characteristics, or information derived therefrom, to the network node providing the second data, and receiving the second data from the network node providing the second data. That is, the haptic device transmits the determined haptic exploration characteristics to the network node providing the second data, e.g., a server of an online shop, to enable the network node to select the second data for transmission to the haptic device. As an alternative, the haptic device may transmit ranking information which is derived from the determined haptic exploration characteristics, based on which ranking information the network node selects the second data for transmission to the haptic device.
  • Fig. 1 shows a device for rendering an object on a haptic touchscreen comprised in the device, in accordance with embodiments of the invention.
  • Fig. 2 shows a sequence diagram illustrating rendering an object on a haptic touchscreen, in accordance with an embodiment of the invention.
  • Fig. 3 shows a sequence diagram illustrating rendering an object on a haptic touchscreen, in accordance with another embodiment of the invention.
  • Fig. 4 shows a processing means comprised in the device for rendering an object on a haptic touchscreen, in accordance with an embodiment of the invention.
  • Fig. 5 shows a processing means comprised in the device for rendering an object on a haptic touchscreen, in accordance with another embodiment of the invention.
  • Fig. 6 shows a method of rendering an object on a haptic touchscreen, in accordance with embodiments of the invention.
  • a device 100 for rendering an object 110 on a haptic touchscreen 101 is illustrated.
  • Device 100 comprising haptic touchscreen 101 may, e.g., be any one of a display (for offering or advertising an item for sale in a shop), a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like.
  • Haptic touchscreen 101 is arranged for providing haptic feedback and may utilize any known actuation technology.
  • haptic touchscreen 101 may be based on ultrasonic transducers, vibrotactile transducers, electrostatic transducers, or piezoelectric transducers.
  • haptic device 100 further comprises processing means 102, which is described in more detail further below, and communications module 103, which is operative to effect communications between haptic device 100 and an external network node.
  • communications module 103 may be operative to effect communications via one or more wired or wireless communications networks, such as a cellular mobile network, e.g., a Global System for Mobile Communications (GSM) network, a Universal Mobile Telecommunications System (UMTS) network, or a Long Term Evolution (LTE) network, or a Wireless Local Area Network (WLAN)/WiFi network.
  • Haptic device 100 is operative to render object 110, in Fig. 1 exemplified by a chair, for haptic exploration by a finger 121 or any other body part of a user, e.g., hand 120 or one of the other fingers of hand 120.
  • haptic exploration is to be understood as a mechanism by which a user of haptic device 100 learns about the haptic properties of an object rendered on haptic touchscreen 101, such as object 110.
  • the user is able to learn about attributes such as friction, texture, and stiffness, by swiping finger 121 over, or pressing finger 121 onto, object 110 which is rendered on haptic touchscreen 101.
  • haptic device 100 is operative to render object 110 on haptic touchscreen 101 using first data which comprises data for graphically rendering object 110, i.e., rendering a visual representation of object 110.
  • first data may comprise image data for rendering an image representing object 110.
  • Haptic device 100 is further operative to determine haptic exploration characteristics of finger 121, or any other body part of the user, interacting with haptic touchscreen 101 for haptically exploring object 110.
  • haptic exploration characteristics comprise at least one of a position, a velocity, and a force, applied by finger 121, or any other body part of the user, to haptic touchscreen 101 when exploring object 110.
  • Haptic device 100 is further operative to acquire second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics of finger 121, as is described further below, and to render the one or more haptic properties of object 110 using the second data.
  • the second data comprises data (haptic data) for haptically rendering object 110, such that the haptic properties of object 110, including but not limited to friction, texture, and stiffness, can be sensed by the user when touching haptic touchscreen 101 with finger 121 at a position 131 where object 110 is rendered, or by swiping finger 121 across at least a part of object 110 rendered on haptic touchscreen 101, e.g., along path 132.
  • the first data may further comprise data for haptically rendering the one or more haptic properties of object 110 with a first level of detail.
  • the second data comprises data for rendering the one or more haptic properties of object 110 with a second level of detail which is higher than the first level of detail.
  • the first data may comprise data for rendering only a subset of the haptic properties of object 110, e.g., only its texture, whereas the second data comprises data for rendering additional haptic properties of object 110, e.g., its stiffness.
  • the first data may comprise haptic data for rendering an outline of object 110.
  • haptic device 100 can start rendering object 110 at a lower level of detail, thereby allowing the user to start exploring object 110 before the second data has been received at haptic device 100 and object 110 has been re-rendered. This results in reduced latency and improved user experience.
  • the second data may be selected according to a number of alternatives, which are described in the following.
  • At least two regions may be defined for object 110.
  • a first region 111 may be defined for a seat of chair 110, and a second region 112 may be defined for a back of chair 110.
  • the seat, which has the style of a cushion, may have haptic properties which are considerably different from those of the back.
  • a user who browses through a collection of chairs which are offered for sale by an online shop may be interested in comparing how the seat feels for different chairs.
  • haptic device 100 acquires and renders only second data which describes the haptic properties of first region 111, rather than acquiring second data which describes the haptic properties of the entire chair, or both regions 111 and 112.
  • the regions for which the second data is acquired may be selected in different ways. As an example, the second data may be selected based on a number of times finger 121 has explored object 110, or similar objects such as a collection of chairs, within each of the at least two regions 111 and 112.
  • the second data for the one or more most-explored region(s) may be selected.
  • the second data for the most-explored region may be acquired and rendered first, and the second data for one or more less-explored regions is acquired and rendered subsequently, e.g., based on an availability of bandwidth for communications effected via communications module 103.
  • the second data may comprise at least one of friction data, stiffness data, and texture data.
  • the second data may be selected based on whether finger 121 explores the friction, the stiffness, or the texture, respectively, of object 110, or one of its regions 111 and 112.
  • the type of haptic property which the user is exploring using finger 121 can, e.g., be determined based on whether finger 121 is pressing onto haptic touchscreen 101 or sliding over haptic touchscreen 101. For instance, if finger 121 is pressing onto haptic touchscreen 101, e.g., at position 131 within rendered object 110, it may be concluded that the user is exploring the stiffness of object 110.
  • if finger 121 is sliding, or swiping, over haptic touchscreen 101, e.g., along path 132 over rendered object 110, it may be concluded that the user is exploring friction if finger 121 slides across haptic touchscreen 101 with a high pressure at a low velocity, and that the user is exploring texture if finger 121 slides across haptic touchscreen 101 with a lower pressure and/or at a higher velocity.
  • Haptic device 100 may be operative to determine the type of haptic property which is explored by finger 121, or any other body part, based on threshold values for velocity and/or pressure, which may be set by a manufacturer of haptic device 100 and which may optionally be configurable by the user. That is, haptic device 100 may be operative to distinguish low pressure and high pressure by means of comparison with a pressure threshold value and, correspondingly, low velocity and high velocity by means of comparison with a velocity threshold value.
  • the second data may be available in at least two different resolutions, a first (low) resolution and a second (high) resolution which is higher than the first resolution.
  • the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111 or 112 of object 110.
  • if finger 121 is moving quickly over object 110 or a region thereof, it is sufficient to render the haptic properties of object 110 at a low resolution, as the user may not be able to sense all details. Therefore, it suffices to acquire the second data at a first (low) resolution and render object 110 accordingly.
  • haptic device 100 may be operative to distinguish low velocity and high velocity based on a velocity threshold value which may be set by a manufacturer of haptic device 100 and which may optionally be configurable by the user.
  • the second data may initially be acquired at the first (low) resolution and subsequently in the second (high) resolution, either after object 110 has been rendered using the second data in the first resolution, or while the second data in the first resolution is acquired and/or object 110 is rendered in the first resolution.
  • haptic device 100 can start rendering object 110 using the second data in the first (low) resolution, thereby allowing the user to start exploring object 110, before the second data in the second resolution has been received at haptic device 100 and object 110 can be re-rendered using the second data in the second (high) resolution. This results in reduced latency and improved user experience.
  • Figs. 2 and 3 show sequence diagrams illustrating rendering an object on haptic touchscreen 101 comprised in haptic device 100.
  • the second data 224 may be acquired from a network node 200 providing the second data, such as a server of an online shop, which is accessible via communications module 103 accessing a communications network, e.g., a cellular mobile network or a WLAN/WiFi network.
  • First data 212 may either be pushed to haptic device 100, or transmitted to haptic device 100 in response to a request 211 for first data received by server 200.
  • Request 211 may, e.g., be a HyperText Transfer Protocol (HTTP) GET request identifying the object for which the first data is requested, such as object 110.
  • the object may, e.g., be identified by a unique identifier, such as a text string, a number string, or a character string.
  • in HTTP request 211, the first data may be identified by a Uniform Resource Locator (URL).
  • server 200 may transmit first data 212 in an HTTP response message, e.g., as an HTTP 200 OK message, as is known in the art.
  • First data 212 comprises (media) data for graphically rendering object 110 on haptic touchscreen 101, and may optionally comprise (haptic) data for rendering one or more haptic properties of object 110 with a first (low) level of detail.
  • object 110 is rendered 213 at device 100, on haptic touchscreen 101. More specifically, object 110 is graphically rendered 213 using the media data comprised in first data 212, and may optionally be haptically rendered 213 using any haptic data with a first (low) level of detail which is comprised in first data 212.
  • haptic device 100 determines 221 the haptic exploration characteristics of finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, and selects 222 the second data for object 110.
  • the second data is selected 222 based on the determined haptic exploration characteristics, as is described hereinbefore, and further based on information identifying second data which is available for the object, e.g., in the form of metadata.
  • the information identifying second data which is available for the object may optionally be comprised in first data 212.
  • first data 212 may, in addition to media data for graphically rendering object 110, comprise metadata defining any one or a combination of the following:
    - haptic properties for which haptic data is available for object 110, such as friction, stiffness, or texture,
    - one or more levels of detail for which haptic data is available for object 110, and
  • Second HTTP GET request 223 may, e.g., be an HTTP GET request in which the selected second data for object 110 is identified, analogous to request 211.
  • server 200 may transmit the second data in a response message 224, e.g., as an HTTP 200 OK message.
  • Second data 224 comprises (haptic) data for rendering the one or more haptic properties of object 110. If haptic data with a first (low) level of detail has been transmitted to haptic device 100 with first data 212, second data 224 may comprise haptic data with a second level of detail which is higher than the first level of detail. It will be appreciated that second data 224 may optionally comprise additional data, such as media data.
  • haptic device 100 re-renders 225 object 110, i.e., it renders the haptic properties of object 110 using second data 224.
  • in Fig. 3, an alternative solution for rendering an object on haptic touchscreen 101 comprised in haptic device 100 is illustrated.
  • the second data 324 may be acquired from network node 200 providing the second data.
  • First data 312 may either be pushed to haptic device 100, or transmitted to haptic device 100 in response to a request 311 for first data received by server 200.
  • Request 311 may, e.g., be an HTTP GET request identifying the object for which the first data is requested, such as object 110.
  • server 200 may transmit first data 312 in an HTTP response message, e.g., as an HTTP 200 OK message, as is known in the art.
  • First data 312 comprises (media) data for graphically rendering object 110 on haptic touchscreen 101, and may optionally comprise (haptic) data for rendering one or more haptic properties of object 110 with a first (low) level of detail.
  • object 110 is rendered 313 at device 100, on haptic touchscreen 101. More specifically, object 110 is graphically rendered 313 using the media data comprised in first data 312, and may optionally be haptically rendered 313 using any haptic data with a first (low) level of detail which is comprised in first data 312.
  • haptic device 100 determines 321 the haptic exploration characteristics of finger 121 interacting with haptic touchscreen 101 for haptically exploring object 1 10, and transmits the haptic exploration characteristics 322 to server 200.
  • server 200 selects 323 the second data for object 1 10, similar to what has been described hereinbefore with reference to Fig. 2.
  • the embodiments described with reference to Figs. 2 and 3 differ in the way the second data is selected. More specifically, whereas the second data is selected 222 by haptic device 100 for the embodiment described with reference to Fig. 2, the second data is selected 323 by server 200 for the embodiment illustrated in Fig. 3, based on haptic exploration characteristics 322 received from haptic device 100.
  • haptic device 100 may transmit any information 322 derived from the determined haptic exploration characteristics, such as ranking information, instead of the characteristics themselves.
  • a corresponding rank may be transmitted to server 200, such as R_reg(1, 0.9) and R_reg(2, 0.1), for first region 111 and second region 112, respectively.
  • Such rank values may be further normalized per the total number of regions and explorations. This allows constructing a model which maps the determined exploration characteristics, in particular velocity and force, into the ranking information which is transmitted to server 200.
  • Second data 324 comprises (haptic) data for rendering the one or more haptic properties of object 110. If haptic data with a first (low) level of detail has been transmitted to haptic device 100 with first data 312, second data 324 may comprise haptic data with a second level of detail which is higher than the first level of detail. It will be appreciated that second data 324 may optionally comprise additional data, such as media data. Subsequent to receiving second data 324, haptic device 100 re-renders 325 object 110, i.e., it renders the haptic properties of object 110 using second data 324.
  • Embodiment 400 of processing means 102 comprises a processing unit 401, such as a general-purpose processor, and a computer-readable storage medium 402, such as a Random Access Memory (RAM), a Flash memory, or the like.
  • processing means 400 comprises one or more interfaces 404 ("I/O" in Fig. 4) for controlling and/or receiving information from haptic touchscreen 101 and communications module 103.
  • Communications module 103 may, e.g., be a cellular communications module for effecting wireless communications via GSM, UMTS, LTE, or the like, or a WLAN/WiFi module for effecting communications via a WLAN.
  • Memory 402 contains computer-executable instructions 403, i.e., a computer program, for causing a haptic device 100, such as a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like, comprising a haptic touchscreen 101, to perform in accordance with an embodiment of the invention as described herein, when computer-executable instructions 403 are executed on processing unit 401.
  • haptic device 100 becomes operative to render an object 110 on haptic touchscreen 101 using first data comprising data for graphically rendering object 110, determine haptic exploration characteristics of a finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, acquire second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics, and render the one or more haptic properties of object 110 using the second data.
  • the haptic exploration characteristics may, e.g., comprise at least one of a position, a velocity, and a force, applied by finger 121 to touchscreen 101 when exploring object 110.
  • the first data may further comprise data for haptically rendering the one or more haptic properties of object 110 with a first level of detail
  • the second data may comprise data for haptically rendering the one or more haptic properties of object 110 with a second level of detail which is higher than the first level of detail
  • At least two regions 111 and 112 may be defined for object 110, and the second data is selected based on a number of times finger 121 has explored object 110 within each of the at least two regions.
  • the second data may comprise at least one of friction data, stiffness data, and texture data, and the second data may be selected based on finger 121 exploring a friction, a stiffness, or a texture, respectively, of object 110, or a region 111/112 thereof.
  • the second data may be available in at least two different resolutions, and the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111/112 thereof.
  • the second data may be acquired from a network node 200 providing the second data.
  • Network node 200 is accessible over a communications network, e.g., a cellular mobile network or a WLAN/WiFi network, via communications module 103.
  • haptic device 100 may become operative to acquire the second data by selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for object 110, requesting the second data from network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
  • the information identifying second data which is available for object 110 is comprised in the first data.
  • haptic device 100 may become operative to acquire the second data by transmitting the determined haptic exploration characteristics, or information derived therefrom, to network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
  • haptic device 100 may become operative to perform additional and/or alternative steps, in accordance with embodiments of the invention described throughout this disclosure.
  • An alternative embodiment 500 of processing means 102 comprises a rendering module 501 , a tracking module 502, and a data module 503.
  • processing means 500 comprises one or more interfaces 504 ("I/O" in Fig. 5) for controlling and/or receiving information from haptic touchscreen 101 and communications module 103.
  • Communications module 103 may, e.g., be a cellular communications module for effecting wireless communications via GSM, UMTS, LTE, or the like, or a WLAN/WiFi module for effecting communications via a WLAN.
  • Rendering module 501, tracking module 502, and data module 503, are adapted to cause a haptic device 100, such as a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like, comprising a haptic touchscreen 101, to perform in accordance with an embodiment of the invention as described herein.
  • rendering module 501 is adapted to render an object 110 on haptic touchscreen 101 using first data comprising data for graphically rendering object 110, and tracking module 502 is adapted to determine haptic exploration characteristics of a finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110.
  • Data module 503 is adapted to acquire second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics.
  • Rendering module 501 is further adapted to render the one or more haptic properties of object 110 using the second data.
  • the haptic exploration characteristics may, e.g., comprise at least one of a position, a velocity, and a force, applied by finger 121 to touchscreen 101 when exploring object 110.
  • the first data may further comprise data for haptically rendering the one or more haptic properties of object 110 with a first level of detail
  • the second data may comprise data for haptically rendering the one or more haptic properties of object 110 with a second level of detail which is higher than the first level of detail
  • At least two regions 111 and 112 may be defined for object 110, and the second data is selected based on a number of times finger 121 has explored object 110 within each of the at least two regions.
  • the second data may comprise at least one of friction data, stiffness data, and texture data, and the second data may be selected based on finger 121 exploring a friction, a stiffness, or a texture, respectively, of object 110, or a region 111/112 thereof.
  • the second data may be available in at least two different resolutions, and the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111/112 thereof.
  • the second data may be acquired from a network node 200 providing the second data.
  • Network node 200 is accessible over a communications network, e.g., a cellular mobile network or a WLAN/WiFi network, via communications module 103.
  • data module 503 may be adapted to acquire the second data by selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for object 110, requesting the second data from network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
  • the information identifying second data which is available for object 110 is comprised in the first data.
  • data module 503 may be adapted to acquire the second data by transmitting the determined haptic exploration characteristics, or information derived therefrom, to network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
  • rendering module 501 may be adapted to perform additional and/or alternative steps, in accordance with embodiments of the invention described throughout this disclosure.
  • processing means 500 may comprise additional modules which are adapted to perform additional and/or alternative steps, in accordance with embodiments of the invention described throughout this disclosure.
  • Modules 401-404 and 501-504, as well as any additional modules comprised in processing means 500 may be implemented by any kind of electronic circuitry, e.g., any one, or a combination of, analogue electronic circuitry, digital electronic circuitry, and a processing unit executing a suitable computer program.
  • Method 600 comprises rendering 602 an object 110 on a haptic touchscreen 101 using first data which comprises data for graphically rendering object 110, determining 603 haptic exploration characteristics of a finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, acquiring 604 second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics, and rendering 605 the one or more haptic properties of object 110 using the second data (an illustrative end-to-end sketch of these steps is given after this list).
  • the first data may, e.g., be acquired 601 from a network node providing first data, such as a server of an online shop, or the like.
  • the haptic exploration characteristics may, e.g., comprise at least one of a position, a velocity, and a force, applied by finger 121 to touchscreen 101 when exploring object 110.
  • the first data may further comprise data for haptically rendering 602 the one or more haptic properties of object 110 with a first level of detail
  • the second data may comprise data for haptically rendering 606 the one or more haptic properties of the object with a second level of detail which is higher than the first level of detail.
  • At least two regions 111 and 112 may be defined for object 110, and the second data is selected based on a number of times finger 121 has explored object 110 within each of the at least two regions.
  • the second data may comprise at least one of friction data, stiffness data, and texture data, and the second data may be selected based on finger 121 exploring a friction, a stiffness, or a texture, respectively, of object 110, or a region 111/112 thereof.
  • the second data may be available in at least two different resolutions, and the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111/112 thereof.
  • the second data is acquired 604 from a network node 200 providing the second data.
  • Network node 200 is accessible over a communications network, e.g., a cellular mobile network or a WLAN/WiFi network, via communications module 103.
  • the acquiring 604 of the second data may comprise selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for object 110, requesting the second data from network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
  • the information identifying second data which is available for object 110 is comprised in the first data.
  • acquiring 604 the second data may comprise transmitting the determined haptic exploration characteristics, or information derived therefrom, to network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
  • method 600 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure.
  • An embodiment of method 600 may be implemented in software, i.e., as computer-executable instructions, and may be performed by any one of a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, and the like.
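The flow of method 600, together with the Fig. 2 exchange between haptic device 100 and server 200, can be illustrated by the short, hypothetical Python sketch below. The endpoint paths, metadata keys, and the screen helper methods are invented for illustration and are not prescribed by this disclosure; the selection rule is deliberately simplistic.

    import requests  # third-party HTTP client, used here for brevity

    def explore_object(server, object_id, screen):
        # Steps 601-602: acquire the first data and render object 110 graphically.
        first = requests.get(f"{server}/objects/{object_id}", timeout=5).json()
        screen.draw_image(first["image"])

        # Step 603: determine haptic exploration characteristics of finger 121.
        position, velocity, force = screen.track_finger()

        # Step 604: select and acquire second data based on those characteristics,
        # restricted to what the metadata in the first data advertises.
        wanted = "stiffness" if velocity == 0 else "texture"
        if wanted in first.get("metadata", {}).get("haptic_properties", []):
            second = requests.get(f"{server}/objects/{object_id}/haptic",
                                  params={"property": wanted}, timeout=5)
            second.raise_for_status()  # expect an HTTP 200 OK response
            # Step 605: render the selected haptic properties of object 110.
            screen.set_haptic_map(second.content)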

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A haptic device (100) for rendering an object (110) on a haptic touchscreen (101) comprised in the device is provided. The device is operative to render the object on the haptic touchscreen using first data which comprises data for graphically rendering the object, determine one or more haptic exploration characteristics of a finger (121) interacting with the haptic touchscreen for haptically exploring the object, acquire second data for rendering one or more haptic properties of the object, and render the one or more haptic properties of the object using the second data. The second data is selected based on the determined haptic exploration characteristics, such as a position, a velocity, and a force, applied by the finger to the touchscreen when exploring the object.

Description

DEVICE AND METHOD FOR HAPTIC EXPLORATION OF A RENDERED OBJECT
Technical field
The invention relates to a device for haptic exploration of an object rendered on a haptic touchscreen comprised in the device, a method of a device for haptic exploration of an object rendered on a haptic touchscreen comprised in the device, a corresponding computer program, and a corresponding computer program product.
Background
The haptic internet is considered to be the next step in mobile networking. It is envisioned that users of mobile devices, such as mobile phones, smartphones, tablets, and the like, will be able to communicate by means of touch, in addition to voice and video. This is achieved by means of haptic interfaces which provide haptic feedback using actuation technologies utilizing, e.g., ultrasound, vibrotactile, electrostatic, or piezoelectric transducers.
Manufacturers of laptops and handheld devices have started implementing haptic feedback in trackpads and touchscreens. For instance, the TPad Phone is provided with a variable-friction tactile display which changes friction, or resistance force, as a finger slides across the screen. Haptic feedback is typically used for improving the user's interaction with a graphical user interface, such as buttons or other user-interface objects displayed on a touchscreen, or to enable haptic exploration of a physical object or material which is rendered on the touchscreen.
Haptic exploration is a mechanism by which humans learn about the surface properties of (unknown) objects. Through the sense of touch, we are able to learn about attributes such as object shape, surface texture, stiffness, and temperature. A user swiping with his/her finger over an image which is rendered on a haptic touchscreen and which represents a physical object or material can feel the texture of the rendered object or material at the location of touch, since the device is capable of tracking the finger's position.
A possible use case for such technology is e-commerce, where users may be given the opportunity to explore the haptic characteristics of an object, such as a piece of furniture or a clothing item, in addition to its visual appearance, by rendering the object on a haptic display. However, since additional information needs to be communicated to the haptic device for rendering the haptic characteristics of an object, e.g., friction, stiffness, or texture, the efficient selection and transmission of data which is required for rendering haptic properties of an object needs to be addressed for efficient bandwidth utilization.
Summary
It is an object of the invention to provide an improved alternative to the above techniques and prior art.
More specifically, it is an object of the invention to provide an improved solution for rendering objects on a haptic touchscreen for haptic exploration by a finger of a user. In particular, it is an object of the invention to provide a solution for rendering objects on a haptic touchscreen which requires less bandwidth for transmitting haptic information to the device than known solutions.
These and other objects of the invention are achieved by means of different aspects of the invention, as defined by the independent claims. Embodiments of the invention are characterized by the dependent claims.
According to a first aspect of the invention, a device for rendering an object on a haptic touchscreen is provided. The haptic touchscreen is comprised in the device, which may, e.g., be any one of a display, a mobile phone, a smartphone, a mobile terminal, a User Equipment (UE), a tablet, a laptop, or the like. The device is operative to render the object on the haptic touchscreen using first data which comprises data for graphically rendering the object, determine haptic exploration characteristics of a finger interacting with the haptic touchscreen for haptically exploring the object, acquire second data for rendering one or more haptic properties of the object, and render the one or more haptic properties of the object using the second data. The second data is selected based on the determined haptic exploration characteristics.
According to a second aspect of the invention, a method of rendering an object on a haptic touchscreen is provided. The device may, e.g., be any one of a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like, comprising a haptic touchscreen. The method comprises rendering the object on the haptic touchscreen using first data which comprises data for graphically rendering the object, determining haptic exploration characteristics of a finger interacting with the haptic touchscreen for haptically exploring the object, acquiring second data for rendering one or more haptic properties of the object, and rendering the one or more haptic properties of the object using the second data. The second data is selected based on the determined haptic exploration characteristics.
According to a third aspect of the invention, a computer program is provided. The computer program comprises computer-executable instructions for causing a device to perform the method according to an embodiment of the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
According to a fourth aspect of the invention, a computer program product is provided. The computer program product comprises a computer-readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
The invention makes use of an understanding that the amount of haptic data which is transmitted to a device comprising a haptic touchscreen, herein referred to as a haptic device, for rendering an object for haptic exploration by a user, using a finger or any other body part, may be reduced by selectively providing haptic data to the haptic device based on determined haptic exploration characteristics of the finger, or other body part, interacting with the haptic touchscreen for haptically exploring the object, i.e., sensing the object using a finger or other body part.
In the present context, an object may, e.g., be a representation of an item offered for sale in an online shop, such as a piece of furniture, a clothing item, and the like. The object can be graphically rendered, using media data, e.g., image data, and/or haptically rendered. Haptic rendering of an object is to be understood as rendering haptic properties of the object on a haptic interface, such as a haptic display, for haptic exploration. In particular, the haptic interface may be a haptic touchscreen which is capable of both sensing haptic exploration characteristics, such as the position, velocity, and force of the finger or other body part, and providing haptic feedback via, e.g., piezoelectric actuators, ultrasonic actuators, and electrostatic actuators. Such haptic touchscreens are known, e.g., from the Tpad phone.
Embodiments of the invention first render an object graphically, using first data, before acquiring second data for rendering one or more haptic properties of the object. The rendered object may, e.g., be selected by the user of the haptic device from a list of displayed objects, such as items for sale in an online shop. A problem which is addressed herein is the selection of the second data for rendering the one or more haptic properties of the object. Known solutions simply acquire all haptic data for an object, resulting in a considerable increase in the amount of data which needs to be transmitted to the haptic device. The first data and the second data may be acquired by retrieving the data from a network node providing the data, such as a server of an online shop, or the like.
The invention is based on an understanding that the second data may be selected based on haptic exploration characteristics which provide information about the haptic exploration of the finger, or other body part. The haptic exploration characteristics may be determined by tracking the finger interacting with the haptic touchscreen during haptic exploration of the rendered object, and comprise any one, or a combination of, a position, a velocity, and a force, applied by the finger to the touchscreen when exploring the object. Thereby, only data which is required for rendering a haptic property which is actually explored by the user, and/or haptic data for a region which is actually explored by the user, is transmitted to the haptic device. Accordingly, the amount of data required for rendering haptic properties of the object is reduced and bandwidth is saved. In addition, embodiments of the invention may result in a reduced latency, due to the reduced amount of data which is received by the haptic device and rendered on the haptic touchscreen, resulting in an improved user experience.
According to an embodiment of the invention, the first data may further comprise data for haptically rendering the one or more haptic properties of the object with a first level of detail, and the second data comprises data for haptically rendering the one or more haptic properties of the object with a second level of detail which is higher than the first level of detail. This is advantageous in that the haptic properties of the object can at least partly be rendered for haptic exploration by the user. For instance, the first data may comprise data for rendering only a subset of the haptic properties of the object, e.g., only texture, whereas the second data comprises data for rendering additional haptic properties of the object, e.g., stiffness. As a further example, the first data may comprise haptic data for rendering an outline of the object. In that way, the user can start exploring the haptic properties of the object, and the rendered haptic properties of the object are improved when the second data is received by the haptic device and used for re-rendering the object.
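As a rough illustration of this split between the first and second data, the following hypothetical sketch merges a coarse haptic layer bundled with the first data with the finer layer delivered later; the key names are assumptions made for illustration only, not part of the invention.

    def haptic_layers(first_data, second_data=None):
        # Coarse layer shipped with the first data, e.g. only a texture map or an outline.
        layers = dict(first_data.get("haptic_coarse", {}))
        if second_data is not None:
            # The finer layer from the second data overrides and extends the coarse one,
            # e.g. adding stiffness once it has been received and the object is re-rendered.
            layers.update(second_data.get("haptic_fine", {}))
        return layers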
According to an embodiment of the invention, at least two regions are defined for the object, and the second data is selected based on a number of times the finger has explored the object within each of the at least two regions. Advantageously, the second data is only acquired by the device for one or more regions which in fact are explored by the finger or other body part, or about to be explored. This may be achieved by tracking a position, and optionally a velocity, of the finger or other body part relative to the regions which are defined for the rendered object, and selecting the second data accordingly. For instance, the second data may be haptic data for a region which is currently explored by the finger, i.e., the current position of the finger is within that region. Alternatively, the second data may be haptic data for a region which the finger is likely to explore, which region may be identified based on the current position of the finger and a velocity of the finger.
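A minimal sketch of this region-based selection might look as follows, assuming regions are given as axis-aligned rectangles in screen coordinates and that past finger positions are available; the names and the look-ahead heuristic are illustrative assumptions rather than part of this disclosure.

    from collections import Counter

    def select_region(regions, position, velocity, history, lookahead=0.1):
        """regions: {name: (x0, y0, x1, y1)}; position, velocity: (x, y) tuples;
        history: iterable of past finger positions; lookahead: seconds of look-ahead."""
        def containing(point):
            for name, (x0, y0, x1, y1) in regions.items():
                if x0 <= point[0] <= x1 and y0 <= point[1] <= y1:
                    return name
            return None

        current = containing(position)          # region currently under the finger
        if current is not None:
            return current
        predicted = (position[0] + velocity[0] * lookahead,
                     position[1] + velocity[1] * lookahead)
        ahead = containing(predicted)           # region the finger is heading towards
        if ahead is not None:
            return ahead
        # Otherwise fall back to the most frequently explored region so far.
        counts = Counter(filter(None, (containing(p) for p in history)))
        return counts.most_common(1)[0][0] if counts else None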
According to an embodiment of the invention, the second data comprises at least one of friction data, stiffness data, and texture data. The second data is selected based on whether the finger is exploring a friction, a stiffness, or a texture, respectively, of the object or a region thereof. This can be determined by characterizing the interaction of the finger with the haptic touchscreen, e.g., based on whether the finger slides over the haptic touchscreen or is pressed onto the haptic touchscreen.
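The following hypothetical classifier captures this heuristic: a finger pressing without sliding suggests stiffness, a slow and firm slide suggests friction, and a lighter or faster slide suggests texture. The threshold values are placeholders and not values taken from this disclosure.

    SLIDE_THRESHOLD = 5.0       # pixels per second below which the finger counts as pressing
    VELOCITY_THRESHOLD = 80.0   # pixels per second separating slow and fast slides
    PRESSURE_THRESHOLD = 0.5    # normalised force separating low and high pressure

    def classify_exploration(pressure, velocity):
        """Guess which haptic property the user is exploring from finger pressure and velocity."""
        if velocity < SLIDE_THRESHOLD:
            return "stiffness"   # finger is essentially pressing onto the screen
        if pressure >= PRESSURE_THRESHOLD and velocity < VELOCITY_THRESHOLD:
            return "friction"    # slow, firm slide over the rendered object
        return "texture"         # light and/or fast slide over the rendered object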
According to an embodiment of the invention, the second data is available in at least two different resolutions, and is selected based on a velocity of the finger exploring the object, or a region thereof. That is, if the finger or other body part is moving quickly over the rendered object or a region thereof, the haptic properties of the object are rendered at a low resolution. If the finger or body part is moving slowly over the rendered object or regions thereof, the haptic properties of the object are rendered at a high resolution. This embodiment of the invention is based on an understanding that the user may feel more details when moving the finger slowly, as compared to moving the finger rapidly over the haptic touchscreen.
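A corresponding velocity test might select between the two resolutions as sketched below; again the threshold is an invented placeholder, and in practice the low-resolution data could be acquired first and the high-resolution data fetched once the finger slows down.

    SLOW_FINGER = 100.0  # pixels per second; illustrative threshold only

    def resolution_for(velocity):
        # A slowly moving finger can sense more detail, so request the high-resolution data.
        return "high" if abs(velocity) <= SLOW_FINGER else "low"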
According to an embodiment of the invention, the second data is acquired by selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for the object, requesting the second data from the network node providing the second data, and receiving the second data from the network node providing the second data. Thus, it is the haptic device which selects the second data and requests the selected second data from the network node providing the second data. The information identifying second data which is available for the object may be comprised in the first data, e.g., as metadata.
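One conceivable shape for such metadata, carried alongside the image data in the first data, is sketched below; the field names and the exact set of fields are assumptions made only to illustrate how the device could know which second data it may request.

    FIRST_DATA_METADATA = {
        "object_id": "chair-4711",                                  # invented identifier
        "haptic_properties": ["friction", "stiffness", "texture"],  # properties with second data available
        "levels_of_detail": ["low", "high"],                        # levels of detail available
        "resolutions": ["low", "high"],                             # resolutions available
        "regions": {"111": "seat", "112": "back"},                  # regions defined for the object
    }

    def can_request(metadata, haptic_property):
        # The device only requests second data that the network node actually provides.
        return haptic_property in metadata.get("haptic_properties", [])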
According to another embodiment of the invention, the second data is acquired by transmitting the determined haptic exploration characteristics, or information derived therefrom, to the network node providing the second data, and receiving the second data from the network node providing the second data. That is, the haptic device transmits the determined haptic exploration characteristics to the network node providing the second data, e.g., a server of an online shop, to enable the network node to select the second data for transmission to the haptic device. As an alternative, the haptic device may transmit ranking information which is derived from the determined haptic exploration characteristics, based on which ranking information the network node selects the second data for transmission to the haptic device.
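The ranking information mentioned here could, for example, be derived as normalised per-region exploration counts, loosely following the R_reg(k, x) style notation appearing elsewhere in this document; the normalisation shown is an assumption for illustration.

    def region_ranks(exploration_counts):
        """exploration_counts: {region_id: number of explorations within that region}."""
        total = sum(exploration_counts.values()) or 1
        ranks = [(region, count / total) for region, count in exploration_counts.items()]
        # e.g. {111: 9, 112: 1} -> [(111, 0.9), (112, 0.1)], akin to R_reg(1, 0.9) and R_reg(2, 0.1)
        return sorted(ranks, key=lambda item: item[1], reverse=True)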
Even though advantages of the invention have in some cases been described with reference to embodiments of the first aspect of the invention, corresponding reasoning applies to embodiments of other aspects of the invention. Further objectives of, features of, and advantages with, the invention will become apparent when studying the following detailed disclosure, the drawings and the appended claims. Those skilled in the art realize that different features of the invention can be combined to create embodiments other than those described in the following.
Brief description of the drawings
The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:
Fig. 1 shows a device for rendering an object on a haptic touchscreen comprised in the device, in accordance with embodiments of the invention.
Fig. 2 shows a sequence diagram illustrating rendering an object on a haptic touchscreen, in accordance with an embodiment of the invention.
Fig. 3 shows a sequence diagram illustrating rendering an object on a haptic touchscreen, in accordance with another embodiment of the invention.
Fig. 4 shows a processing means comprised in the device for rendering an object on a haptic touchscreen, in accordance with an embodiment of the invention.
Fig. 5 shows a processing means comprised in the device for rendering an object on a haptic touchscreen, in accordance with another embodiment of the invention.
Fig. 6 shows a method of rendering an object on a haptic touchscreen, in accordance with embodiments of the invention.
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
Detailed description
The invention will now be described more fully herein after with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In Fig. 1, an embodiment of a device 100 for rendering an object 110 on a haptic touchscreen 101 is illustrated. Device 100 comprising haptic touchscreen 101, throughout this disclosure also referred to as haptic device, may, e.g., be any one of a display (for offering or advertising an item for sale in a shop), a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like. Haptic touchscreen 101 is arranged for providing haptic feedback and may utilize any known actuation technology. For instance, haptic touchscreen 101 may be based on ultrasonic transducers, vibrotactile transducers, electrostatic transducers, or piezoelectric transducers.
In addition to haptic touchscreen 101, haptic device 100 further comprises processing means 102, which is described in more detail further below, and communications module 103, which is operative to effect communications between haptic device 100 and an external network node. In particular, communications module 103 may be operative to effect communications via one or more wired or wireless communications networks such as a cellular mobile network, e.g., a Global System for Mobile Communications (GSM) network, a Universal Mobile Telecommunications System (UMTS) network, or a Long Term Evolution (LTE) network, or a Wireless Local Area Network (WLAN)/WiFi network. Communications module 103 may, e.g., be a cellular communications module or a WLAN/WiFi module.
Haptic device 100 is operative to render object 110, in Fig. 1 exemplified by a chair, for haptic exploration by a finger 121 or any other body part of a user, e.g., hand 120 or one of the other fingers of hand 120. In the present context, haptic exploration is to be understood as a mechanism by which a user of haptic device 100 learns about the haptic properties of an object rendered on haptic touchscreen 101, such as object 110. Through the sense of touch, the user is able to learn about attributes such as friction, texture, and stiffness, by swiping finger 121 over, or pressing finger 121 onto, object 110 which is rendered on haptic touchscreen 101.
More specifically, haptic device 100 is operative to render object 110 on haptic touchscreen 101 using first data which comprises data for graphically rendering object 110, i.e., rendering a visual representation of object 110. For instance, the first data may comprise image data for rendering an image representing object 110. Haptic device 100 is further operative to determine haptic exploration characteristics of finger 121, or any other body part of the user, interacting with haptic touchscreen 101 for haptically exploring object 110. This is achieved by tracking finger 121 during its interaction with haptic touchscreen 101, e.g., when swiping over haptic touchscreen 101, along a path 132 of finger 121, or exerting pressure (by touching or pressing) at a position 131 on haptic touchscreen 101, as is known from conventional, i.e., non-haptic, touchscreens. More specifically, the haptic exploration characteristics comprise at least one of a position, a velocity, and a force, applied by finger 121, or any other body part of the user, to haptic touchscreen 101 when exploring object 110.
Haptic device 100 is further operative to acquire second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics of finger 121, as is described further below, and to render the one or more haptic properties of object 110 using the second data. That is, whereas the first data only comprises data for graphically rendering object 110, i.e., for rendering a visual representation such as an image of object 110, the second data comprises data (haptic data) for haptically rendering object 110, such that the haptic properties of object 110, including but not limited to friction, texture, and stiffness, can be sensed by the user when touching haptic touchscreen 101 with finger 121 at a position 131 where object 110 is rendered, or by swiping finger 121 across at least a part of object 110 rendered on haptic touchscreen 101, e.g., along path 132.
Optionally, the first data may further comprise data for haptically rendering the one or more haptic properties of object 110 with a first level of detail. In this case, the second data comprises data for rendering the one or more haptic properties of object 110 with a second level of detail which is higher than the first level of detail. For instance, the first data may comprise data for rendering only a subset of the haptic properties of object 110, e.g., only its texture, whereas the second data comprises data for rendering additional haptic properties of object 110, e.g., its stiffness. As a further example, the first data may comprise haptic data for rendering an outline of object 110. In that way, the user can start exploring the haptic properties of the object, e.g., feeling the edges of chair 110, and the rendered haptic properties of object 110 are improved when the second data is received by haptic device 100 and used for re-rendering object 110, i.e., rendering its haptic properties with an increased level of detail. Advantageously, haptic device 100 can start rendering object 110 at a lower level of detail, thereby allowing the user to start exploring object 110 before the second data has been received at haptic device 100 and object 110 has been re-rendered. This results in reduced latency and improved user experience.
The second data may be selected according to a number of alternatives, which are described in the following.
For instance, at least two regions may be defined for object 110. With reference to chair 110 shown in Fig. 1, a first region 111 may be defined for a seat of chair 110, and a second region 112 may be defined for a back of chair 110. For chair 110 which is illustrated in Fig. 1, the seat, which has the style of a cushion, may have haptic properties which are considerably different from those of the back. Oftentimes, a user who browses through a collection of chairs which are offered for sale by an online shop may be interested in comparing how the seat feels for different chairs. Accordingly, the user is likely to explore object 110 in first region 111, i.e., the seat of chair 110, by touching and/or swiping with finger 121. It is therefore advantageous for haptic device 100 to acquire and render only second data which describes the haptic properties of first region 111, rather than acquiring second data which describes the haptic properties of the entire chair, or both regions 111 and 112. The regions for which the second data is acquired may be selected in different ways. As an example, the second data may be selected based on a number of times finger 121 has explored object 110, or similar objects such as a collection of chairs, within each of the at least two regions 111 and 112. For instance, only the second data for the one or more most-explored region(s) may be selected. Alternatively, the second data for the most-explored region may be acquired and rendered first, and the second data for one or more less-explored regions are acquired and rendered subsequently, e.g., based on an availability of a bandwidth for communications effected via communications module 103.
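By way of a purely illustrative, non-limiting sketch, such region-based selection could be realized along the following lines on the haptic device. The Python code below is an assumption for illustration only; the region names, coordinates, and data structures are hypothetical and not prescribed by the embodiments described herein.

from collections import Counter

class RegionTracker:
    """Counts how often the finger has explored each defined region of a rendered object."""

    def __init__(self, regions):
        # regions: mapping of region id -> bounding box (x_min, y_min, x_max, y_max)
        # in touchscreen coordinates; here two hypothetical regions of chair 110.
        self.regions = regions
        self.exploration_counts = Counter()

    def record_touch(self, x, y):
        """Attribute a touch or swipe sample to the region it falls into, if any."""
        for region_id, (x_min, y_min, x_max, y_max) in self.regions.items():
            if x_min <= x <= x_max and y_min <= y <= y_max:
                self.exploration_counts[region_id] += 1
                break

    def regions_by_interest(self):
        """Return region ids ordered from most explored to least explored."""
        return [region_id for region_id, _ in self.exploration_counts.most_common()]

# Hypothetical usage: acquire second data for the most-explored region first,
# and for less-explored regions later, e.g., when bandwidth allows.
tracker = RegionTracker({"seat": (10, 200, 300, 400), "back": (10, 0, 300, 199)})
for x, y in [(150, 300), (160, 310), (50, 100)]:
    tracker.record_touch(x, y)
priority = tracker.regions_by_interest()  # e.g., ["seat", "back"]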
As an alternative, the second data may comprise at least one of friction data, stiffness data, and texture data. The second data may be selected based on whether finger 121 explores the friction, the stiffness, or the texture, respectively, of object 110, or one of its regions 111 and 112. The type of haptic property which the user is exploring using finger 121 can, e.g., be determined based on whether finger 121 is pressing onto haptic touchscreen 101 or sliding over haptic touchscreen 101. For instance, if finger 121 is pressing onto haptic touchscreen 101, e.g., at position 131 within rendered object 110, it may be concluded that the user is exploring the stiffness of object 110. On the other hand, if finger 121 is sliding, or swiping, over haptic touchscreen 101, e.g., along path 132 over rendered object 110, it may be concluded that the user is exploring friction if finger 121 slides across haptic touchscreen 101 with a high pressure at a low velocity, and that the user is exploring texture if finger 121 slides across haptic touchscreen 101 with a low pressure at a high velocity. Haptic device 100 may be operative to determine the type of haptic property which is explored by finger 121, or any other body part, based on threshold values for velocity and/or pressure, which may be set by a manufacturer of haptic device 100 and which may optionally be configurable by the user. That is, haptic device 100 may be operative to distinguish low pressure and high pressure by means of comparison with a pressure threshold value and, correspondingly, to distinguish low velocity and high velocity by means of comparison with a velocity threshold value.
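A minimal sketch of such a threshold-based classification is given below in Python, for illustration only; the threshold values and the treatment of interactions that fall outside the two described sliding cases are assumptions and may be chosen differently, e.g., by the manufacturer or the user.

# Hypothetical threshold values; in practice they may be set by the manufacturer
# of haptic device 100 and may optionally be user-configurable.
PRESSURE_THRESHOLD = 0.5   # normalized finger pressure (0..1)
VELOCITY_THRESHOLD = 50.0  # finger velocity in mm/s

def classify_exploration(pressure, velocity, is_sliding):
    """Infer which haptic property the user is exploring from the finger interaction."""
    if not is_sliding:
        # Finger pressing onto the touchscreen: stiffness exploration.
        return "stiffness"
    if pressure > PRESSURE_THRESHOLD and velocity < VELOCITY_THRESHOLD:
        # Sliding with high pressure at low velocity: friction exploration.
        return "friction"
    # In this simplified sketch, any other slide (notably low pressure at
    # high velocity) is treated as texture exploration.
    return "texture"

# The returned label can then be used to select friction data, stiffness data,
# or texture data as the second data to be acquired.
print(classify_exploration(pressure=0.8, velocity=20.0, is_sliding=True))  # friction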
As a further alternative, the second data may be available in at least two different resolutions, a first (low) resolution and a second (high) resolution which is higher than the first resolution. In this case, the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111 or 112 of object 110. Advantageously, if finger 121 is moving quickly over object 110 or a region thereof, it is sufficient to render the haptic properties of object 110 at a low resolution, as the user may not be able to sense all details. Therefore, it suffices to acquire the second data at a first (low) resolution and render object 110 accordingly. If, on the other hand, it is detected that finger 121 swipes across object 110, or a region thereof, at a lower velocity, the second data is acquired in a second (high) resolution and rendered accordingly, allowing the user to sense more, or finer, details. In accordance with what is described hereinbefore, haptic device 100 may be operative to distinguish low velocity and high velocity based on a velocity threshold value which may be set by a manufacturer of haptic device 100 and which may optionally be configurable by the user. Optionally, the second data may initially be acquired at the first (low) resolution and subsequently in the second (high) resolution, either after object 110 has been rendered using the second data in the first resolution, or while the second data in the first resolution is acquired and/or object 110 is rendered in the first resolution. Advantageously, haptic device 100 can start rendering object 110 using the second data in the first (low) resolution, thereby allowing the user to start exploring object 110, before the second data in the second resolution has been received at haptic device 100 and object 110 can be re-rendered using the second data in the second (high) resolution. This results in reduced latency and improved user experience.
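Correspondingly, a possible velocity-based selection of the resolution of the second data, including the optional progressive acquisition, is sketched below; the velocity threshold, the resolution labels, and the fetch callback are illustrative assumptions.

VELOCITY_THRESHOLD = 50.0  # mm/s, hypothetical value

def select_resolution(finger_velocity):
    """Select the resolution of the second (haptic) data from the finger velocity."""
    # Fast movement: coarse haptic detail suffices, so the low resolution is selected.
    # Slow movement: the user can feel finer details, so the high resolution is selected.
    return "low" if finger_velocity > VELOCITY_THRESHOLD else "high"

def acquire_second_data(fetch, finger_velocity):
    """Progressively acquire second data: low resolution first, then high if warranted."""
    data = fetch(resolution="low")       # render quickly to keep latency low
    if select_resolution(finger_velocity) == "high":
        data = fetch(resolution="high")  # re-render with finer detail once available
    return data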
In the following, embodiments of the invention are further elucidated with reference to Figs. 2 and 3, which show sequence diagrams illustrating rendering an object on haptic touchscreen 101 comprised in haptic device 100.
With reference to Fig. 2, the second data 224, and optionally also the first data 212, may be acquired from a network node 200 providing the second data, such as a server of an online shop, which is accessible via communications module 103 accessing a communications network, e.g., a cellular mobile network or a WLAN/WiFi network.
First data 212 may either be pushed to haptic device 100, or transmitted to haptic device 100 in response to a request 211 for first data received by server 200. Request 211 may, e.g., be a HyperText Transfer Protocol (HTTP) GET request identifying the object for which the first data is requested, such as object 110. The object may, e.g., be identified by a unique identifier, such as a text string, a number string, or a character string. As an example, in HTTP request 211 the first data may be identified by a Uniform Resource Locator (URL) of the form http://www.server.com/item123/first_data.jpg, where it is assumed here that the requested first data for object "item123" only comprises media data, i.e., data for graphically rendering object 110, in this case an image of type "jpg". As a further example, HTTP GET request 211 may be of the form http://www.server.com/shop?item=123&data=first, utilizing a query string for conveying the request for first data ("first") for an object listed as item "123".
In response to receiving request 211, server 200 may transmit first data 212 in an HTTP response message, e.g., as an HTTP 200 OK message, as is known in the art. First data 212 comprises (media) data for graphically rendering object 110 on haptic touchscreen 101, and may optionally comprise (haptic) data for rendering one or more haptic properties of object 110 with a first (low) level of detail.
In response to receiving first data 212, object 110 is rendered 213 at device 100, on haptic touchscreen 101. More specifically, object 110 is graphically rendered 213 using the media data comprised in first data 212, and may optionally be haptically rendered 213 using any haptic data with a first (low) level of detail which is comprised in first data 212.
Then, haptic device 100 determines 221 the haptic exploration characteristics of finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, and selects 222 the second data for object 110. The second data is selected 222 based on the determined haptic exploration characteristics, as is described hereinbefore, and further based on information identifying second data which is available for the object, e.g., in the form of metadata. The information identifying second data which is available for the object may optionally be comprised in first data 212. For instance, first data 212 may, in addition to media data for graphically rendering object 110, comprise metadata defining any one or a combination of the following:
- One or more regions 111 and 112 of object 110,
- One or more types of haptic properties for which haptic data is available for object 110, such as friction, stiffness, or texture,
- One or more levels of detail for which haptic data is available for object 110, and
- One or more resolutions in which haptic data is available for object 110.
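Purely as an illustration, such metadata accompanying first data 212 could take the following form; the field names and values below are hypothetical and are not prescribed by the embodiments described herein.

# Hypothetical metadata, e.g., delivered alongside the media data for item "123",
# identifying which second (haptic) data the server can provide for object 110.
available_second_data = {
    "object_id": "123",
    "regions": ["1", "2"],                # e.g., region 111 (seat) and region 112 (back)
    "haptic_properties": ["friction", "stiffness", "texture"],
    "levels_of_detail": [1, 2],           # e.g., 1 = outline only, 2 = full haptic model
    "resolutions": ["low", "high"],
}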
Subsequent to selecting 222 the second data, the selected second data is requested 223 from server 200, e.g., using another HTTP GET request 223 identifying the selected second data. Second HTTP GET request 223 may, e.g., be of the form http://www.server.com/item123/second_data_reg1.hdf, where it is assumed that the second data for a region "reg1" is requested, and that the requested second data is of type "hdf" (Haptic Data File) (see, e.g., D. Wang, Y. Zhang, and J. Wu, "A novel haptic file format for sharing haptic sensation by record-play method"). It will be appreciated that embodiments of the invention are not limited to any specific file type which is used for conveying haptic data to haptic device 100. As a further example, HTTP GET request 223 may be of the form http://www.server.com/shop?item=123&data=sec&reg=1, where the request for the second data ("sec") for region "1" of an object listed as item "123" is conveyed as a query string.
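As a non-limiting sketch, the request for the selected second data could be issued from the haptic device as shown below; the use of the Python requests library, the timeout value, and the server address are assumptions for the purpose of illustration, while the query parameters mirror the example URL given above.

import requests

def request_second_data(item_id, region_id, base_url="http://www.server.com/shop"):
    """Request the selected second (haptic) data for one region of an object."""
    params = {"item": item_id, "data": "sec", "reg": region_id}
    response = requests.get(base_url, params=params, timeout=5)
    response.raise_for_status()
    # The returned payload, e.g., a haptic data file, is then used by the
    # haptic device to re-render the haptic properties of the object.
    return response.content

# Example: fetch the haptic data for region "1" (the seat) of item "123".
haptic_payload = request_second_data("123", "1")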
In response to receiving request 223, server 200 may transmit the second data in a response message 224, e.g., as an HTTP 200 OK message. Second data 224 comprises (haptic) data for rendering the one or more haptic properties of object 110. If haptic data with a first (low) level of detail has been transmitted to haptic device 100 with first data 212, second data 224 may comprise haptic data with a second level of detail which is higher than the first level of detail. It will be appreciated that second data 224 may optionally comprise additional data, such as media data.
Subsequent to receiving second data 224, haptic device 100 re-renders 225 object 110, i.e., it renders the haptic properties of object 110 using second data 224.
In Fig. 3, an alternative solution for rendering an object on haptic touchscreen 101 comprised in haptic device 100 is illustrated.
Similar to what has been described with reference to Fig. 2, the second data 324, and optionally also the first data 312, may be acquired from network node 200 providing the second data. First data 312 may either be pushed to haptic device 100, or transmitted to haptic device 100 in response to a request 311 for first data received by server 200. Request 311 may, e.g., be an HTTP GET request identifying the object for which the first data is requested, such as object 110.
In response to receiving request 311, server 200 may transmit first data 312 in an HTTP response message, e.g., as an HTTP 200 OK message, as is known in the art. First data 312 comprises (media) data for graphically rendering object 110 on haptic touchscreen 101, and may optionally comprise (haptic) data for rendering one or more haptic properties of object 110 with a first (low) level of detail.
In response to receiving first data 312, object 110 is rendered 313 at device 100, on haptic touchscreen 101. More specifically, object 110 is graphically rendered 313 using the media data comprised in first data 312, and may optionally be haptically rendered 313 using any haptic data with a first (low) level of detail which is comprised in first data 312.
Then, haptic device 100 determines 321 the haptic exploration characteristics of finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, and transmits the haptic exploration characteristics 322 to server 200. In response to receiving haptic exploration characteristics 322, server 200 selects 323 the second data for object 110, similar to what has been described hereinbefore with reference to Fig. 2.
The embodiments described with reference to Figs. 2 and 3 differ in the way the second data is selected. More specifically, whereas the second data is selected 222 by haptic device 100 for the embodiment described with reference to Fig. 2, the second data is selected 323 by server 200 for the embodiment illustrated in Fig. 3, based on haptic exploration characteristics 322 received from haptic device 100.
As an alternative, rather than transmitting the determined haptic exploration characteristics to server 200, haptic device 100 may transmit any information 322 derived from the determined haptic exploration characteristics, e.g., a derived ranking of the regions which are defined for object 110, or the like. For instance, if it is determined that the user explores first region 111 90% of the times and second region 112 10% of the times, a corresponding rank may be transmitted to server 200, such as Rreg(1, 0.9) and Rreg(2, 0.1), for first region 111 and second region 112, respectively. As a further example, one can define a rank ROe(K, X, N, M) as the number of times the user K explores the force/velocity model given by region M in a Delaunay triangulation set, in the object region N for object type X. Such a value may be further normalized by the total number of regions and explorations. This allows constructing a model which maps the determined exploration characteristics, in particular velocity and force, into the acceleration which is to be felt by the user (see, e.g., H. Culbertson, J. J. Lopez Delgado, and K. J. Kuchenbecker, "The Penn Haptic Texture Toolkit for Modeling, Rendering, and Evaluating Haptic Virtual Textures", Departmental Papers (MEAM), University of Pennsylvania, paper 299, 2014).
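As an illustration of how such ranking information could be derived on the haptic device before being transmitted to server 200, consider the simplified Python sketch below; the data structures and the normalization are assumptions, and only the per-region exploration share from the example above is computed.

from collections import Counter

def region_ranks(explored_regions):
    """Derive a normalized rank per region from a log of detected explorations.

    explored_regions: iterable of region ids, one entry per exploration,
    e.g., ["1", "1", "2", ...]. Returns a mapping region id -> share of explorations.
    """
    counts = Counter(explored_regions)
    total = sum(counts.values())
    return {region: count / total for region, count in counts.items()}

# Example: nine explorations of region "1" (the seat) and one of region "2" (the back)
ranks = region_ranks(["1"] * 9 + ["2"])
# ranks == {"1": 0.9, "2": 0.1}, corresponding to Rreg(1, 0.9) and Rreg(2, 0.1)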
Similar to what has been described with reference to Fig. 2, subsequent to selecting 323 the second data, server 200 transmits the second data in a response message 324, e.g., as an HTTP 200 OK message. Second data 324 comprises (haptic) data for rendering the one or more haptic properties of object 110. If haptic data with a first (low) level of detail has been transmitted to haptic device 100 with first data 312, second data 324 may comprise haptic data with a second level of detail which is higher than the first level of detail. It will be appreciated that second data 324 may optionally comprise additional data, such as media data. Subsequent to receiving second data 324, haptic device 100 re-renders 325 object 110, i.e., it renders the haptic properties of object 110 using second data 324.
In the following, embodiments 400 and 500 of processing means 102, comprised in haptic device 100, are described with reference to Figs. 4 and 5.
Embodiment 400 of processing means 102, shown in Fig. 4, comprises a processing unit 401, such as a general purpose processor, and a computer-readable storage medium 402, such as a Random Access Memory (RAM), a Flash memory, or the like. In addition, processing means 400 comprises one or more interfaces 404 ("I/O" in Fig. 4) for controlling and/or receiving information from haptic touchscreen 101 and communications module 103. Communications module 103 may, e.g., be a cellular communications module for effecting wireless communications via GSM, UMTS, LTE, or the like, or a WLAN/WiFi module for effecting communications via a WLAN. Memory 402 contains computer-executable instructions 403, i.e., a computer program, for causing a haptic device 100, such as a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like, comprising a haptic touchscreen 101, to perform in accordance with an embodiment of the invention as described herein, when computer-executable instructions 403 are executed on processing unit 401.
In particular, haptic device 100 becomes operative to render an object 110 on haptic touchscreen 101 using first data comprising data for graphically rendering object 110, determine haptic exploration characteristics of a finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, acquire second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics, and render the one or more haptic properties of object 110 using the second data. The haptic exploration characteristics may, e.g., comprise at least one of a position, a velocity, and a force, applied by finger 121 to touchscreen 101 when exploring object 110.
Optionally, the first data may further comprise data for haptically rendering the one or more haptic properties of object 110 with a first level of detail, and the second data may comprise data for haptically rendering the one or more haptic properties of object 110 with a second level of detail which is higher than the first level of detail.
Optionally, at least two regions 111 and 112 are defined for object 110, and the second data is selected based on a number of times finger 121 has explored object 110 within each of the at least two regions.
Optionally, the second data may comprise at least one of friction data, stiffness data, and texture data, and the second data may be selected based on finger 121 exploring a friction, a stiffness, or a texture, respectively, of object 110, or a region 111/112 thereof.
Optionally, the second data may be available in at least two different resolutions, and the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111/112 thereof.
Optionally, the second data may be acquired from a network node 200 providing the second data. Network node 200 is accessible over a communications network.
For instance, haptic device 100 may become operative to acquire the second data by selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for object 110, requesting the second data from network node 200 providing the second data, and receiving the second data from network node 200 providing the second data. Optionally, the information identifying second data which is available for object 110 is comprised in the first data.
Alternatively, haptic device 100 may become operative to acquire the second data by transmitting the determined haptic exploration characteristics, or information derived therefrom, to network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
It will be appreciated that haptic device 100 may become operative to perform additional and/or alternative steps, in accordance with embodiments of the invention described throughout this disclosure.
An alternative embodiment 500 of processing means 102, shown in Fig. 5, comprises a rendering module 501, a tracking module 502, and a data module 503. In addition, processing means 500 comprises one or more interfaces 504 ("I/O" in Fig. 5) for controlling and/or receiving information from haptic touchscreen 101 and communications module 103. Communications module 103 may, e.g., be a cellular communications module for effecting wireless communications via GSM, UMTS, LTE, or the like, or a WLAN/WiFi module for effecting communications via a WLAN. Rendering module 501, tracking module 502, and data module 503, are adapted to cause a haptic device 100, such as a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, or the like, comprising a haptic touchscreen 101, to perform in accordance with an embodiment of the invention as described herein.
In particular, rendering module 501 is adapted to render an object 110 on haptic touchscreen 101 using first data comprising data for graphically rendering object 110, and tracking module 502 is adapted to determine haptic exploration characteristics of a finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110. Data module 503 is adapted to acquire second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics. Rendering module 501 is further adapted to render the one or more haptic properties of object 110 using the second data. The haptic exploration characteristics may, e.g., comprise at least one of a position, a velocity, and a force, applied by finger 121 to touchscreen 101 when exploring object 110.
Optionally, the first data may further comprise data for haptically rendering the one or more haptic properties of object 110 with a first level of detail, and the second data may comprise data for haptically rendering the one or more haptic properties of object 110 with a second level of detail which is higher than the first level of detail.
Optionally, at least two regions 111 and 112 are defined for object 110, and the second data is selected based on a number of times finger 121 has explored object 110 within each of the at least two regions.
Optionally, the second data may comprise at least one of friction data, stiffness data, and texture data, and the second data may be selected based on finger 121 exploring a friction, a stiffness, or a texture, respectively, of object 110, or a region 111/112 thereof.
Optionally, the second data may be available in at least two different resolutions, and the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111/112 thereof.
Optionally, the second data may be acquired from a network node 200 providing the second data. Network node 200 is accessible over a communications network.
For instance, data module 503 may be adapted to acquire the second data by selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for object 110, requesting the second data from network node 200 providing the second data, and receiving the second data from network node 200 providing the second data. Optionally, the information identifying second data which is available for object 110 is comprised in the first data.
Alternatively, data module 503 may be adapted to acquire the second data by transmitting the determined haptic exploration characteristics, or information derived therefrom, to network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
It will be appreciated that rendering module 501 , tracking module 502, and data module 503, may be adapted to perform additional and/or alternative steps, in accordance with embodiments of the invention described throughout this disclosure. It will also be appreciated that processing means 500 may comprise additional modules which are adapted to perform additional and/or alternative steps, in accordance with embodiments of the invention described throughout this disclosure.
Modules 401-404 and 501-504, as well as any additional modules comprised in processing means 500, may be implemented by any kind of electronic circuitry, e.g., any one, or a combination of, analogue electronic circuitry, digital electronic circuitry, and a processing unit executing a suitable computer program.
In the following, embodiments 600 of the method of rendering an object on a haptic touchscreen are described with reference to Fig. 6.
Method 600 comprises rendering 602 an object 110 on a haptic touchscreen 101 using first data which comprises data for graphically rendering object 110, determining 603 haptic exploration characteristics of a finger 121 interacting with haptic touchscreen 101 for haptically exploring object 110, acquiring 604 second data for rendering one or more haptic properties of object 110, wherein the second data is selected based on the determined haptic exploration characteristics, and rendering 605 the one or more haptic properties of object 110 using the second data. The first data may, e.g., be acquired 601 from a network node providing first data, such as a server of an online shop, or the like. The haptic exploration characteristics may, e.g., comprise at least one of a position, a velocity, and a force, applied by finger 121 to touchscreen 101 when exploring object 110.
Optionally, the first data may further comprise data for haptically rendering 602 the one or more haptic properties of object 110 with a first level of detail, and the second data may comprise data for haptically rendering 606 the one or more haptic properties of the object with a second level of detail which is higher than the first level of detail.
Optionally, at least two regions 111 and 112 may be defined for object 110, and the second data is selected based on a number of times finger 121 has explored object 110 within each of the at least two regions.
Optionally, the second data may comprise at least one of friction data, stiffness data, and texture data, and the second data may be selected based on finger 121 exploring a friction, a stiffness, or a texture, respectively, of object 110, or a region 111/112 thereof.
Optionally, the second data may be available in at least two different resolutions, and the second data may be selected based on a velocity of finger 121 exploring object 110, or a region 111/112 thereof.
Optionally, the second data is acquired 604 from a network node 200 providing the second data. Network node 200 is accessible over a communications network.
For instance, the acquiring 604 the second data may comprise selecting the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for object 110, requesting the second data from network node 200 providing the second data, and receiving the second data from network node 200 providing the second data. Optionally, the information identifying second data which is available for object 110 is comprised in the first data.
Alternatively, acquiring 604 the second data may comprise transmitting the determined haptic exploration characteristics, or information derived therefrom, to network node 200 providing the second data, and receiving the second data from network node 200 providing the second data.
It will be appreciated that method 600 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure. An embodiment of method 600 may be implemented in software, i.e., as computer-executable instructions, and may be performed by any one of a display, a mobile phone, a smartphone, a mobile terminal, a UE, a tablet, a laptop, and the like.
The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.

Claims

1. A device (100) for rendering an object (110) on a haptic touchscreen (101) comprised in the device, the device being operative to:
render (213; 313) the object on the haptic touchscreen using first data (211; 311) comprising data for graphically rendering the object,
determine (221; 321) haptic exploration characteristics of a finger (121) interacting with the haptic touchscreen for haptically exploring the object,
acquire (222-224; 322-324) second data for rendering one or more haptic properties of the object, wherein the second data is selected (222; 323) based on the determined haptic exploration characteristics, and
render (225; 325) the one or more haptic properties of the object using the second data.
2. The device according to claim 1, wherein the first data (212; 312) further comprises data for haptically rendering (213; 313) the one or more haptic properties of the object (110) with a first level of detail, and the second data (224; 324) comprises data for haptically rendering (225; 325) the one or more haptic properties of the object with a second level of detail which is higher than the first level of detail.
3. The device according to claims 1 or 2, wherein at least two regions (111; 112) are defined for the object (110), and the second data is selected (222; 323) based on a number of times the finger (121) has explored the object within each of the at least two regions.
4. The device according to claims 1 or 2, wherein the second data (224; 324) comprises at least one of: friction data, stiffness data, and texture data, and the second data is selected (222; 323) based on the finger (121) exploring a friction, a stiffness, or a texture, respectively, of the object (110), or a region (111, 112) thereof.
5. The device according to claims 1 or 2, wherein the second data (224; 324) is available in at least two different resolutions, and the second data is selected (222; 323) based on a velocity of the finger (121) exploring the object (110), or a region (111, 112) thereof.
6. The device according to any one of claims 1 to 5, wherein the second data (224; 324) is acquired from a network node (200) providing the second data, which network node is accessible over a communications network.
7. The device according to claim 6, the device being operative to acquire the second data (224; 324) by:
selecting (222) the second data based on the determined (221) haptic exploration characteristics and further based on information identifying second data which is available for the object (110),
requesting (223) the second data from the network node (200) providing the second data, and
receiving (224) the second data from the network node providing the second data.
8. The device according to claim 7, wherein the information identifying second data (224) which is available for the object is comprised in the first data (212).
9. The device according to claim 6, the device being operative to acquire the second data (324) by:
transmitting (322) the determined (321) haptic exploration characteristics, or information derived therefrom, to the network node (200) providing the second data, and
receiving (324) the second data from the network node providing the second data.
10. The device according to any one of claims 1 to 9, wherein the haptic exploration characteristics comprise at least one of: a position, a velocity, and a force, applied by the finger (121) to the touchscreen (101) when exploring the object (110).
11. The device according to any one of claims 1 to 10, the device being any one of: a display, a mobile phone, a smartphone, a mobile terminal, a User Equipment, UE, a tablet, and a laptop.
12. A method (600) of rendering an object (110) on a haptic touchscreen, the method comprising:
rendering (213; 313; 602) the object on the haptic touchscreen using first data (211; 311) comprising data for graphically rendering the object,
determining (221; 321; 603) haptic exploration characteristics of a finger (121) interacting with the haptic touchscreen for haptically exploring the object,
acquiring (222-224; 322-324; 604) second data for rendering one or more haptic properties of the object, wherein the second data is selected (222; 323) based on the determined haptic exploration characteristics, and
rendering (225; 325; 605) the one or more haptic properties of the object using the second data.
13. The method according to claim 12, wherein the first data (212; 312) further comprises data for haptically rendering (213; 313; 602) the one or more haptic properties of the object (110) with a first level of detail, and the second data (224; 324) comprises data for haptically rendering (225; 325; 605) the one or more haptic properties of the object with a second level of detail which is higher than the first level of detail.
14. The method according to claims 12 or 13, wherein at least two regions (111; 112) are defined for the object (110), and the second data is selected (222; 323) based on a number of times the finger (121) has explored the object within each of the at least two regions.
15. The method according to claims 12 or 13, wherein the second data (224; 324) comprises at least one of: friction data, stiffness data, and texture data, and the second data is selected (222; 323) based on the finger exploring a friction, a stiffness, or a texture, respectively, of the object (110), or a region (111, 112) thereof.
16. The method according to claims 12 or 13, wherein the second data (224; 324) is available in at least two different resolutions, and the second data is selected (222; 323) based on a velocity of the finger (121) exploring the object (110), or a region (111, 112) thereof.
17. The method according to any one of claims 12 to 16, wherein the second data (224; 324) is acquired (604) from a network node (200) providing the second data, which network node is accessible over a communications network.
18. The method according to claim 17, wherein the acquiring (604) the second data comprises: selecting (222) the second data based on the determined haptic exploration characteristics and further based on information identifying second data which is available for the object,
requesting (223) the second data from the network node (200) providing the second data, and
receiving (224) the second data from the network node providing the second data.
19. The method according to claim 18, wherein the information identifying second data (224) which is available for the object is comprised in the first data (212).
20. The method according to claim 17, wherein the acquiring (604) the second data comprises:
transmitting (322) the determined (321) haptic exploration characteristics, or information derived therefrom, to the network node (200) providing the second data, and
receiving (324) the second data from the network node providing the second data.
21. The method according to any one of claims 12 to 20, wherein the haptic exploration characteristics comprise at least one of: a position, a velocity, and a force, applied by the finger (121) to the touchscreen when exploring the object (110).
22. A computer program (403) comprising computer-executable instructions for causing a device to perform the method according to any one of claims 12 to 21, when the computer-executable instructions are executed on a processing unit (401) comprised in the device.
23. A computer program product comprising a computer-readable storage medium (402), the computer-readable storage medium having the computer program (403) according to claim 22 embodied therein.
PCT/EP2016/064947 2016-06-28 2016-06-28 Device and method for haptic exploration of a rendered object WO2018001456A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/064947 WO2018001456A1 (en) 2016-06-28 2016-06-28 Device and method for haptic exploration of a rendered object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/064947 WO2018001456A1 (en) 2016-06-28 2016-06-28 Device and method for haptic exploration of a rendered object

Publications (1)

Publication Number Publication Date
WO2018001456A1 true WO2018001456A1 (en) 2018-01-04

Family

ID=56289499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/064947 WO2018001456A1 (en) 2016-06-28 2016-06-28 Device and method for haptic exploration of a rendered object

Country Status (1)

Country Link
WO (1) WO2018001456A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020109668A1 (en) * 1995-12-13 2002-08-15 Rosenberg Louis B. Controlling haptic feedback for enhancing navigation in a graphical environment
US20050195154A1 (en) * 2004-03-02 2005-09-08 Robbins Daniel C. Advanced navigation techniques for portable devices
US20100231367A1 (en) * 2009-03-12 2010-09-16 Immersion Corporation Systems and Methods for Providing Features in a Friction Display
EP2461228A2 (en) * 2010-12-02 2012-06-06 Immersion Corporation Haptic feedback assisted text manipulation
WO2015121969A1 (en) * 2014-02-14 2015-08-20 富士通株式会社 Tactile device and system
US20150323995A1 (en) * 2014-05-09 2015-11-12 Samsung Electronics Co., Ltd. Tactile feedback apparatuses and methods for providing sensations of writing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
H. Culbertson, J. J. Lopez Delgado, and K. J. Kuchenbecker, "The Penn Haptic Texture Toolkit for Modeling, Rendering, and Evaluating Haptic Virtual Textures", 2014, paper 299

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210356936A1 (en) * 2016-04-27 2021-11-18 Sang Hun Park Interior design product fabricating system
US12422819B2 (en) * 2016-04-27 2025-09-23 Sang Hun Park Interior design product fabricating system

Similar Documents

Publication Publication Date Title
CN105283869B (en) Frequent website based on browse mode
KR101716350B1 (en) Animation sequence associated with image
RU2576247C1 (en) Method of capturing content and mobile terminal therefor
US10776854B2 (en) Merchandise recommendation device, merchandise recommendation method, and program
CN103999028B (en) Invisible control
JP6434483B2 (en) Interactive elements for launching from the user interface
KR102093652B1 (en) Animation sequence associated with feedback user-interface element
CN103324408B (en) The method and its mobile terminal of shared content
US9015584B2 (en) Mobile device and method for controlling the same
CN107710131A (en) Content-browsing user interface
CN118092720A (en) Setup program for electronic device
WO2013123757A1 (en) File data transmission method and device
WO2013135270A1 (en) An apparatus and method for navigating on a touch sensitive screen thereof
CN104956301A (en) Display device and method for controlling display device
WO2020007011A1 (en) Personal information sharing method and apparatus, terminal device, and storage medium
US9588635B2 (en) Multi-modal content consumption model
CN104423796A (en) User interface based on device context
JP2015191551A (en) Electronics
TW201234259A (en) Systems and methods for screen data management, and computer program products thereof
EP3559780B1 (en) A method and arrangement for handling haptic feedback
US10152496B2 (en) User interface device, search method, and program
JP2014164695A (en) Data processing device and program
WO2014041930A1 (en) User inteface device, search method, and program
CN110554880B (en) Setup program for electronic device
US20150234926A1 (en) User interface device, search method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16733068

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16733068

Country of ref document: EP

Kind code of ref document: A1