
WO2025040381A1 - Touchless gesture control-enabling data processing apparatus, system and method - Google Patents

Touchless gesture control-enabling data processing apparatus, system and method

Info

Publication number
WO2025040381A1
WO2025040381A1 PCT/EP2024/071496 EP2024071496W
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
data processing
processing apparatus
control
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/071496
Other languages
French (fr)
Inventor
Martin Seiler
Christian Ensslen
Maria Rothvoss Buchheimer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ameria AG
Original Assignee
Ameria AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ameria AG
Publication of WO2025040381A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • a click event may refer to an action representative of a pressing and releasing of a button (e.g., on a pointing device such as a mouse) for tasks such as selecting items or interacting with software.
  • for a click event, the time between pressing and releasing the button is shorter than for a press-and-release event.
  • a single, double or triple click may relate to the number of clicks within a predefined duration.
  • a press-and-release event may refer to an action of pressing a button or key and then releasing it, generating a single input signal or event.
  • a drag-and-drop event may relate to clicking, holding, moving an object, and then releasing it to perform an action like moving or copying the object.
  • a hold-and-repeat event may relate to holding down a button or key, resulting in repeated input signals after a brief delay, often used for actions requiring continuous repetition.
  • a scrolling event may relate to moving content on a screen by using a finger or a pointing device, allowing users to navigate through longer pages or lists.
  • a tap event may relate to a brief touch or click action on a touchscreen or button, often used to select items or activate functions.
  • a swipe event may relate to a deliberate movement of a finger or cursor across a touchscreen or touchpad, commonly used to navigate between screens or pages.
  • a pinch-to-zoom event may relate to placing two fingers close together on a touchscreen and then spreading them apart to zoom in or bringing them closer to zoom out.
  • a rotate event may relate to using two fingers to rotate an object on a touchscreen, allowing users to manipulate the orientation of an image or element.
  • a multi-finger-gesture event may relate to complex touch interactions involving multiple fingers on a touchscreen, enabling advanced actions such as drawing, gaming, or navigation.
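  • To make the foregoing concrete, the following minimal sketch (in Python) models a control command for two-dimensional display control as a record combining position information with associated event information. The class and field names are illustrative assumptions, not part of the disclosure.

        from dataclasses import dataclass
        from enum import Enum, auto

        class Event2D(Enum):
            # A subset of the two-dimensional event types listed above.
            CLICK = auto()
            DOUBLE_CLICK = auto()
            PRESS_AND_RELEASE = auto()
            DRAG_AND_DROP = auto()
            SCROLL = auto()
            SWIPE = auto()
            PINCH_TO_ZOOM = auto()

        @dataclass
        class ControlCommand2D:
            # Two-dimensional position information, measured from a
            # reference point such as the display's top-left corner.
            x: int
            y: int
            # Two-dimensional event information associated with the position.
            event: Event2D

        # Example: a single click at pixel position (640, 360).
        cmd = ControlCommand2D(x=640, y=360, event=Event2D.CLICK)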
  • the display interface standard may support only control commands for two-dimensional display control. This way, it is ensured that the command is interpretable by most computing systems, because two-dimensional display control (e.g., a mouse click) is a widespread technology. Accordingly, the compatibility and scalability of the system are improved (e.g., with respect to the operating system (OS) running on the display system).
  • the display interface standard may be a Human Interface Device, HID, standard.
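  • As an illustration of what a control command "in accordance with the HID standard" can look like at the byte level, the sketch below packs the 3-byte boot-protocol mouse report defined by the USB HID specification (a button bitmask followed by signed 8-bit X and Y displacements). Actual devices may declare richer report formats (e.g., absolute coordinates for a digitizer) in their report descriptor; this is a minimal, assumed example only.

        import struct

        def boot_mouse_report(buttons: int, dx: int, dy: int) -> bytes:
            # buttons: bit 0 = left, bit 1 = right, bit 2 = middle.
            # dx, dy:  relative displacement in the range -127..127
            #          (positive y points down by HID convention).
            return struct.pack("<Bbb", buttons & 0x07, dx, dy)

        # Left button pressed, pointer moved 10 units right and 5 down.
        assert boot_mouse_report(0x01, 10, 5) == b"\x01\x0a\x05"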
  • the one or more control commands may comprise at least one control command for three-dimensional touchless gesture control. This way, more advanced interaction (e.g., 3D cursors or gesture recognition and gesture commands) is provided.
  • a control command for three-dimensional touchless gesture control should be understood as referring to a specific gesture, movement, and/or action performed in the three-dimensional interaction space in front of the sensor system to trigger a particular response or function. These commands are used to interact with the display system using touchless gestures, providing users with a hands-free and intuitive means of controlling various functionalities. Key characteristics of control commands for three-dimensional touchless gesture control may include one or more of the following:
  • Each control command may correspond to a distinct gesture or movement pattern. This could involve waving a hand, making a grabbing motion, pointing, or performing other defined gestures.
  • the sensor system and/or the data processing apparatus may be equipped with algorithms that analyze the captured data from sensors like depth cameras. These algorithms may recognize and/or interpret the performed gesture based on predefined criteria. Once a specific gesture is recognized, it may be mapped to a corresponding function and/or action to be caused within the display system. For instance, a swipe gesture might be mapped to scrolling through content, or a grabbing motion could trigger a selection.
  • Control commands for touchless gestures may be executed in real-time, enabling users to interact with the display system without delay.
  • the recognized control commands may be translated into input signals that are interpreted by the user interface of the display system, allowing the intended action to take place.
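  • The mapping from recognized gestures to functions described above can be pictured as a simple lookup table; the gesture names and handler functions below are invented purely for illustration.

        def scroll_content(direction: str) -> None:
            print(f"scrolling {direction}")

        def select_item() -> None:
            print("item selected")

        # Hypothetical mapping from recognized gestures to display actions.
        GESTURE_ACTIONS = {
            "swipe_up": lambda: scroll_content("up"),
            "swipe_down": lambda: scroll_content("down"),
            "grab": select_item,
        }

        def dispatch(gesture: str) -> None:
            # Translate a recognized gesture into its mapped action;
            # unrecognized gestures are silently ignored.
            action = GESTURE_ACTIONS.get(gesture)
            if action is not None:
                action()

        dispatch("grab")  # prints "item selected"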
  • a control command for three-dimensional touchless gesture control may comprise three-dimensional position information and/or three-dimensional event information. This way, unintended or confusing system responsiveness is avoided, because a specific three-dimensional event, as intended by the user, is executed at a specific position, as intended by the user.
  • three-dimensional position information may relate to an extension of two-dimensional position information by an additional dimension (e.g., a z- coordinate).
  • the processing unit may further be configured to determine a three-dimensional touchless control gesture based on the received sensor data.
  • the generating of the one or more commands for the display system may comprise generating, if the three-dimensional touchless control gesture is convertible into a two-dimensional event, a control command in accordance with the display interface standard or generating, if the three-dimensional touchless control gesture is not convertible into a two-dimensional event, a three-dimensional control command.
  • in a first step, a three-dimensional touchless control gesture (performed by the user) is determined. Some of these gestures correspond directly to a two-dimensional event (e.g., a tap or a click). These types of three-dimensional touchless control gestures can be converted into a corresponding two-dimensional event. Accordingly, a respective control command which is in accordance with the display interface standard can be generated. This way, interoperability and compatibility of the system are ensured, allowing for scalability.
  • there might be three-dimensional touchless control gestures which cannot be (fully) represented as a two-dimensional event and thus need respective additional computation and/or processing. These three-dimensional touchless control gestures typically provide more advanced interactions. These three-dimensional touchless control gestures may then be converted into corresponding three-dimensional control commands. A receiver of these commands may require additional processing capabilities (e.g., provided by special software).
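  • A minimal sketch of this branching logic is given below. Which gestures count as "convertible" is an assumption made for illustration (here: tap-like gestures); the disclosure does not prescribe a particular rule, and all names are hypothetical.

        # Gestures assumed, for illustration, to map cleanly onto a
        # two-dimensional event supported by the display interface standard.
        CONVERTIBLE = {"tap", "click", "double_tap"}

        def generate_command(gesture: str, x: int, y: int, z: float) -> dict:
            if gesture in CONVERTIBLE:
                # Control command in accordance with the display interface
                # standard: two-dimensional position plus event information.
                return {"kind": "2d", "x": x, "y": y, "event": gesture}
            # Otherwise emit a three-dimensional control command; its
            # receiver needs additional processing capabilities.
            return {"kind": "3d", "x": x, "y": y, "z": z, "gesture": gesture}

        print(generate_command("tap", 100, 200, 0.35))            # -> 2d command
        print(generate_command("rotate_in_air", 100, 200, 0.35))  # -> 3d command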
  • the output interface may comprise a wired communication unit, in particular a Power over Ethernet, PoE, unit, configured to transmit the one or more control commands to the display system.
  • a wired communication unit as described throughout the present disclosure may be capable of supporting any suitable type of wired communication, such as PoE, Ethernet, USB, or the like.
  • the output interface may comprise a wireless communication unit, in particular a Wireless Local Area Network, WLAN, unit or a Bluetooth unit, configured to transmit the one or more control commands to the display system.
  • a wireless communication unit as described throughout the present disclosure may be capable of supporting any suitable type of wireless communication. It may also be possible to combine both units (i.e., the wired and the wireless unit) into one unit to combine the respective advantages.
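  • Whichever physical link is used (wired Ethernet/PoE or wireless WLAN/Bluetooth), transmitting the generated commands can reduce to writing bytes to a connection. The host, port, and newline-delimited JSON framing in the sketch below are purely hypothetical and not part of any display interface standard.

        import json
        import socket

        def send_command(command: dict, host: str = "192.168.0.50",
                         port: int = 9000) -> None:
            # Assumed framing: one JSON object per line.
            payload = (json.dumps(command) + "\n").encode("utf-8")
            with socket.create_connection((host, port), timeout=1.0) as sock:
                sock.sendall(payload)

        # Usage (requires a listener at the assumed address):
        # send_command({"kind": "2d", "x": 640, "y": 360, "event": "tap"})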
  • the data processing system may comprise a data processing apparatus according to any one of the aspects described herein.
  • the data processing system may comprise a sensor system configured for capturing sensor data representing one or more three-dimensional touchless control gestures performed by a user of a display system.
  • the data processing system may comprise the display system.
  • the sensor system may comprise one or more depth sensors, in particular one or more depth cameras, more particularly two or more depth sensors or depth cameras. This way, the accuracy of capturing the sensor data may be increased.
  • depth cameras may provide an improved accuracy compared to other depth sensors such as infrared cameras.
  • the method may be performed by a data processing apparatus.
  • the method may be computer-implemented.
  • the method may comprise a step of receiving, via an input interface of the data processing apparatus, sensor data captured by a sensor system.
  • the sensor data may represent one or more three-dimensional touchless control gestures performed by a user of a display system.
  • the method may comprise a step of generating, by a processing unit of the data processing apparatus, based at least in part on the received sensor data, one or more control commands for the display system. At least one control command of the one or more control commands may be in accordance with a display interface standard.
  • the method may comprise a step of outputting, via an output interface of the data processing apparatus, the one or more control commands to the display system.
  • the data processing apparatus may be a data processing apparatus according to the aspects as described herein.
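  • Taken together, the three method steps suggest a processing loop of the following shape. sensor.read_frame() and display_link.send() are assumed placeholder APIs for whatever sensor SDK and transport are actually used, and generate_commands stands in for the gesture-recognition and conversion logic sketched earlier.

        from typing import Iterable

        def generate_commands(frame: bytes) -> Iterable[dict]:
            # Placeholder for gesture recognition and command generation;
            # a real implementation would analyze the sensor frame here.
            return []

        def run(sensor, display_link) -> None:
            while True:
                # Step 1: receive sensor data via the input interface.
                frame = sensor.read_frame()        # assumed sensor API
                # Step 2: generate one or more control commands.
                for cmd in generate_commands(frame):
                    # Step 3: output the commands via the output interface.
                    display_link.send(cmd)         # assumed transport API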
  • Another aspect of the present disclosure relates to a computer program or a computer-readable medium having stored thereon a computer program, the computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to any one of the aspects of the present disclosure.
  • Fig. 1 A block diagram with a general overview of a data processing system in accordance with embodiments of the present disclosure.
  • Figs. 2a-b Wireless implementations of the data processing system in accordance with embodiments of the present disclosure.
  • Figs. 3a-b Wired implementations of the data processing system in accordance with embodiments of the present disclosure.
  • the data processing apparatus 102 may comprise an input interface configured to receive sensor data captured by the sensor system 104. As described above, the sensor data may represent one or more three-dimensional touchless control gestures performed by a user of the display system 106.
  • the data processing apparatus 102 may comprise a processing unit configured to generate, based at least in part on the received sensor data, one or more control commands for the display system 106. At least one control command of the one or more control commands may be in accordance with a display interface standard.
  • the data processing apparatus 102 may comprise an output interface configured to output the one or more control commands to the display system 106.
  • the at least one control command may be a control command for two-dimensional display control.
  • the two-dimensional control command may comprise two-dimensional position information and/or two-dimensional event information associated with the two-dimensional position information.
  • the two-dimensional position information may consist of an x-coordinate and a y-coordinate.
  • the two-dimensional event information associated with the two-dimensional position information may comprise one or more of: a click event, in particular a single-click, a double-click or a triple-click, a press-and-release event, a hold-and-repeat event, a movement event, such as a position change, an acceleration and/or a velocity associated with a position change, a scrolling event, a tap event, a swipe event, a pinch-to-zoom event, a rotate event and a multi-finger-gesture event.
  • the display interface standard may support only control commands for two-dimensional display control.
  • the display interface standard may be a Human Interface Device (HID) standard.
  • the one or more control commands may comprise at least one control command for three-dimensional touchless gesture control.
  • the control command for three-dimensional touchless gesture control may comprise three-dimensional position information and/or three-dimensional event information.
  • the processing unit may further be configured to determine a three-dimensional touchless control gesture based on the received sensor data.
  • the generating of the one or more commands for the display system may comprise generating, if the three-dimensional touchless control gesture is convertible into a two-dimensional event, a command in accordance with the display interface standard or generating, if the three-dimensional touchless control gesture is not convertible into a two-dimensional event, a three-dimensional control command (i.e., a control command for three-dimensional touchless gesture control).
  • the processing unit is further configured to determine, based at least in part on the three-dimensional touchless control gesture, whether the three-dimensional touchless control gesture is convertible into a two-dimensional event or not. It may be possible that the processing unit is further configured to convert the three-dimensional touchless control gesture into a two-dimensional event or a three-dimensional event respectively.
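  • One plausible conversion rule, assumed here only for illustration, is to project the gesture's three-dimensional position onto the display plane and emit a two-dimensional tap event whenever the hand crosses a virtual touch plane at a fixed depth; gestures that stay above the plane remain three-dimensional events. All names and thresholds below are hypothetical.

        # Assumed virtual touch plane: depths at or below this value
        # (in meters from the sensor) are treated as a touch.
        Z_TOUCH = 0.30

        def convert(x: float, y: float, z: float,
                    width: int = 1920, height: int = 1080) -> dict:
            # Project normalized (0..1) hand coordinates onto the display.
            px, py = int(x * width), int(y * height)
            if z <= Z_TOUCH:
                return {"kind": "2d", "event": "tap", "x": px, "y": py}
            return {"kind": "3d", "event": "hover", "x": px, "y": py, "z": z}

        print(convert(0.5, 0.5, 0.25))  # 2D tap at the screen center
        print(convert(0.5, 0.5, 0.60))  # 3D hover event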
  • the output interface may comprise a wired communication unit, in particular a Power over Ethernet, PoE, unit, configured to transmit the one or more control commands to the display system.
  • the output interface may comprise a wireless communication unit, in particular a Wireless Local Area Network, WLAN, unit or a Bluetooth unit, configured to transmit the one or more control commands to the display system.
  • the position and/or event information associated with the hidden cursor may be monitored to recognize user input (e.g., touchless control gestures) and to generate corresponding control commands for the display system 106.
  • Fig. 2a illustrates a wireless implementation 200a of the data processing system 100 in accordance with an exemplary embodiment.
  • the sensor system 104 is not shown in the illustrated implementation 200a. However, it is to be understood that the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 200a.
  • the wireless implementation 200a involves a microcontroller 202.
  • the microcontroller 202 as explained with respect to the embodiments of the present invention may be part of the data processing apparatus or may be part of another end device (e.g., the display system 106).
  • the data processing apparatus 102 and the microcontroller 202 may represent a logical unit. This means that the data processing apparatus 102 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.).
  • the display system 106 may represent a separate unit as indicated by the dashed line.
  • the display system 106 may be from a different manufacturer than the logical unit comprising the data processing apparatus 102 and the microcontroller 202.
  • components of the logical unit (i.e., the data processing apparatus 102 and the microcontroller 202) may be positioned/mounted at the same location while the display system 106 is positioned/mounted at a different (e.g., remote) location.
  • both communication paths (i.e., the path between the data processing apparatus 102 and the microcontroller 202 as well as the path between the logical unit and the display system 106) may be a two-way communication as indicated by the arrowheads showing in both directions.
  • a two-way communication may relate to both entities participating in the communication (e.g., the logical unit and the display system 106) being able to receive and transmit data (e.g., control commands according to aspects of the present disclosure).
  • Fig. 2b illustrates another wireless implementation 200b of the data processing system 100 in accordance with an exemplary embodiment.
  • the sensor system 104 is not shown in the illustrated implementation 200b.
  • the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 200b.
  • the wireless implementation 200b involves a microcontroller 202.
  • the display system 106 and the microcontroller 202 may represent a logical unit. This means that the display system 106 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.).
  • the data processing apparatus 102 may represent a separate unit as indicated by the dashed line.
  • the data processing apparatus 102 may be from a different manufacturer than the logical unit comprising the display system 106 and the microcontroller 202.
  • components of the logical unit (i.e., the display system 106 and the microcontroller 202) may be positioned/mounted at the same location while the data processing apparatus 102 is positioned/mounted at a different (e.g., remote) location.
  • the communication between the two may be wired (e.g., using USB).
  • communication between the logical unit (i.e., the display system 106 and the microcontroller 202) and the data processing apparatus 102, as indicated by the dashed arrow, may be wireless (e.g., using Bluetooth).
  • both communication paths (i.e., the path between the data processing apparatus 102 and the logical unit as well as the path between the microcontroller 202 and the display system 106) may be a one-way communication as indicated by the arrowhead.
  • a one-way communication may relate to only one entity participating in the communication (e.g., the microcontroller 202) being able to transmit data (e.g., control commands according to aspects of the present disclosure) while the other entity (e.g., the display system 106) is only able to receive the data.
  • Fig. 3a illustrates a wired implementation 300a of the data processing system 100 in accordance with an exemplary embodiment.
  • the sensor system 104 is not shown in the illustrated implementation 300a. However, it is to be understood that the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 300a.
  • the wired implementation 300a involves a microcontroller 202.
  • the display system 106 and the microcontroller 202 may represent a logical unit. This means that the display system 106 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.).
  • the data processing apparatus 102 may represent a separate unit as indicated by the dashed line.
  • the data processing apparatus 102 may be from a different manufacturer than the logical unit comprising the display system 106 and the microcontroller 202.
  • components of the logical unit (i.e., the display system 106 and the microcontroller 202) may be positioned/mounted at the same location while the data processing apparatus 102 is positioned/mounted at a different (e.g., remote) location.
  • the communication between the two may be wired (e.g., using USB).
  • the communication between the logical unit (i.e., the display system 106 and the microcontroller 202) and the data processing apparatus 102, as indicated by the arrow may also be wired (e.g., using Ethernet).
  • both communication paths (i.e., the path between the data processing apparatus 102 and the logical unit as well as the path between the microcontroller 202 and the display system 106) may be a one-way communication as indicated by the arrowhead.
  • a one-way communication may relate to only one entity participating in the communication (e.g., the microcontroller 202) being able to transmit data (e.g., control commands according to aspects of the present disclosure) while the other entity (e.g., the display system 106) is only able to receive the data.
  • Fig. 3b illustrates another wired implementation 300b of the data processing system 100 in accordance with an exemplary embodiment.
  • the sensor system 104 is not shown in the illustrated implementation 300b.
  • the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 300b.
  • the wired implementation 300b involves a microcontroller 202.
  • the data processing apparatus 102 and the microcontroller 202 may represent a logical unit. This means that the data processing apparatus 102 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.).
  • the display system 106 may represent a separate unit as indicated by the dashed line. For example, the display system 106 may be from a different manufacturer than the logical unit comprising the data processing apparatus 102 and the microcontroller 202.
  • components of the logical unit may be positioned/mounted at the same location while the display system 106 is positioned/mounted at a different (e.g., remote) location.
  • the communication between the two may be wired (e.g., using USB).
  • the communication between the logical unit (i.e., the data processing apparatus 102 and the microcontroller 202) and the display system 106, as indicated by the arrow, may also be wired (e.g., using USB).
  • the communication path between the data processing apparatus 102 and the microcontroller 202 may be a two-way communication as indicated by the arrowheads showing in both directions.
  • the path between the logical unit and the display system 106 may be a one-way communication as indicated by the arrowhead.
  • a two-way communication may relate to both entities participating in the communication (e.g., the data processing apparatus 102 and the microcontroller 202) being able to receive and transmit data (e.g., control commands according to aspects of the present disclosure).
  • a one-way communication may relate to only one entity participating in the communication (e.g., the logical unit) being able to transmit data (e.g., control commands according to aspects of the present disclosure) while the other entity (e.g., the display system 106) is only able to receive the data.
  • although aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
  • Embodiments of the present disclosure may be implemented on a computer system.
  • the computer system may be a local computer device (e.g., personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g., a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers).
  • the computer system may comprise any circuit or combination of circuits.
  • the computer system may include one or more processors which can be of any type.
  • processor may mean any type of computational circuit, such as but not limited to a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), multiple core processor, a field programmable gate array (FPGA), or any other type of processor or processing circuit.
  • circuits that may be included in the computer system may be a custom circuit, an application-specific integrated circuit (ASIC), or the like, such as, for example, one or more circuits (such as a communication circuit) for use in wireless devices like mobile telephones, tablet computers, laptop computers, two-way radios, and similar electronic systems.
  • the computer system may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random-access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like.
  • the computer system may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system.
  • Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
  • embodiments of the present disclosure can be implemented in hardware or in software.
  • the implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
  • Some embodiments according to the present disclosure comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
  • embodiments of the present disclosure can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer.
  • the program code may, for example, be stored on a machine-readable carrier.
  • other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine-readable carrier.
  • an embodiment of the present disclosure is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
  • a further embodiment of the present disclosure is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor.
  • the data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory.
  • a further embodiment of the present disclosure is an apparatus as described herein comprising a processor and the storage medium.
  • a further embodiment of the present disclosure is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein.
  • the data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
  • a further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
  • a further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
  • a further embodiment according to the present disclosure comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver.
  • the receiver may, for example, be a computer, a mobile device, a memory device or the like.
  • the apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
  • a programmable logic device (for example, a field programmable gate array) may cooperate with a microprocessor in order to perform one of the methods described herein.
  • the methods are preferably performed by any hardware apparatus.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a data processing system. The data processing system may comprise a data processing apparatus, a sensor system configured for capturing sensor data representing one or more three-dimensional touchless control gestures performed by a user of a display system, and the display system. The data processing apparatus may be configured to receive the sensor data, to generate, based at least in part on the received sensor data, one or more control commands for the display system (106), wherein at least one control command of the one or more control commands is in accordance with a display interface standard, and to output the one or more control commands to the display system (106).

Description

TOUCHLESS GESTURE CONTROL-ENABLING DATA PROCESSING APPARATUS, SYSTEM AND METHOD
TECHNICAL FIELD
The present disclosure generally relates to the field of touchless gesture control, and more specifically to techniques for controlling a display system using corresponding control commands. Certain embodiments may provide for simplified deployment and incorporation of three-dimensional touchless gesture control into existing end devices, such as display systems.
BACKGROUND
Display systems have evolved significantly in recent years, offering an array of innovative features and functionalities to enhance user experience and interaction. One notable advancement in display technology has been the integration of touchless gesture control, allowing users to interact with the display without the need for physical contact. This has been made possible through the incorporation of sensor systems, such as depth cameras, that capture and interpret user gestures and movements.
Touchless gesture control technology holds great promise for a wide range of applications, from interactive displays in public spaces to personal devices such as smartphones and tablets. By enabling users to navigate, select, and manipulate content through intuitive gestures, touchless gesture control systems offer enhanced convenience and accessibility.
However, the implementation of touchless gesture control is not without challenges. While sensor systems like depth cameras provide the raw data necessary for interpreting user gestures, specialized software is typically required to process this data and enable effective touchless interaction. This software typically translates captured movements into recognizable commands, ensuring seamless communication between users and the display system.
The complexity of integrating touchless gesture control arises from the diversity of operating systems employed by various display system manufacturers. Prominent examples include Tizen by Samsung, WebOS by LG, iOS by Apple, and ChromeOS by Google. Each operating system presents its own unique software environment and requirements for successful integration.
Developing touchless gesture control software tailored to each individual operating system is a multifaceted challenge. The software must be meticulously crafted to align with the intricacies of the specific operating system, ensuring optimal performance and compatibility. Additionally, maintaining and updating the software for each operating system demands continuous efforts, resulting in substantial overhead in terms of development, testing, and support.
Another pertinent concern pertains to the protection of sensitive algorithms typically embedded within the touchless gesture control software. The source code of such software may incorporate proprietary algorithms that contribute to the software’s efficiency and performance. Unfortunately, the deployment of software onto various display systems may expose it to potential reverse engineering attempts. Unauthorized access to the software’s source code could enable third parties to decipher and exploit these proprietary algorithms, leading to security risks, for instance.
Therefore, there exists a clear need for improved touchless gesture control techniques that address these challenges. An innovative solution is required to streamline the integration process across diverse operating systems, alleviate the burden of maintaining multiple software versions, and safeguard sensitive algorithms from potential reverse engineering threats. Such a solution would not only enhance the user experience of touchless gesture control but also provide a more efficient and secure pathway for its implementation across a variety of display systems.
It is therefore an objective of the present disclosure to provide a scalable yet secure solution for equipping display systems with touchless gesture control capabilities, thereby overcoming the above-mentioned disadvantages of the prior art at least in part.
SUMMARY OF THE DISCLOSURE
The objective is solved by the subject-matter defined in the independent claims. Advantageous modifications of embodiments of the present disclosure are defined in the dependent claims as well as in the description and the figures. As a general overview, certain aspects of the present disclosure provide for a way to incorporate three-dimensional touchless gesture control into display systems by converting at least some of the three-dimensional control gestures into control commands which conform to a well-established display interface standard. This way, the sensor system may effectively present itself as a standardized human interface device without requiring special driver installations on the display system.
One aspect of the present disclosure relates to a data processing apparatus. The data processing apparatus may comprise an input interface configured to receive sensor data captured by a sensor system. The sensor data may represent one or more three-dimensional touchless control gestures performed by a user of a display system. The data processing apparatus may comprise a processing unit configured to generate, based at least in part on the received sensor data, one or more control commands for the display system. At least one control command of the one or more control commands may be in accordance with a display interface standard. The data processing apparatus may comprise an output interface configured to output the one or more control commands to the display system.
This way, deployment may be greatly simplified, because complex three-dimensional touchless control gestures are (at least to some degree) converted into control commands which are in accordance with a display interface standard. As a result, no additional use-case specific programming is required, which allows the apparatus to be provided as a plug-and-play solution.
Throughout the present disclosure, a data processing apparatus should be understood as any electronic device or system designed to perform various operations on input data to produce desired output or results. A data processing apparatus may execute software instructions that involve tasks like arithmetic calculations, data storage and retrieval, data transformation, algorithm execution, and/or communication with external devices or networks. The term “data processing apparatus” encompasses a broad spectrum of devices, reflecting the diverse applications and functions that modern computing technology enables. Specific implementation examples will be provided further below.
Throughout the present disclosure, an input interface and an output interface may be components or mechanisms that facilitate the interaction between a user or external device and the data processing apparatus. They enable the transfer of information, data, and/or signals into the data processing apparatus (input) or out of the data processing apparatus (output). These interfaces may enable communication, control, and/or data exchange between the user/external device and the data processing apparatus. An input interface should be understood as a means by which an external device, such as the sensor system, can provide information, commands, and/or data to the data processing apparatus. An output interface should be understood as a means through which the data processing apparatus can convey information, results, data, and/or control commands to external devices, such as the display system. The input interface and the output interface may be distinct interfaces or may be integrated into a single input/output interface.
Throughout the present disclosure, a sensor system should be understood as a collection of one or more interconnected sensors, devices, and/or components designed to detect and measure specific physical phenomena or environmental conditions. These systems gather data from the surrounding environment and convert it into electrical or digital signals that can be processed, analyzed, and/or utilized for various applications.
Throughout the present disclosure, a three-dimensional touchless control gesture may refer to a non-contact and/or in-air hand movement or motion performed in a three-dimensional interaction space to interact with the display system without the need for physical touch. These gestures may involve using the position, orientation, and/or movement of one or more hands and/or one or more fingers to control and/or manipulate digital content, interfaces, or functions.
One characteristic of three-dimensional touchless control gestures is dimensionality: Unlike traditional touch gestures that operate in two dimensions (such as tapping or swiping on a touchscreen), three-dimensional gestures may incorporate the depth dimension, allowing users to control devices by moving their hands closer to or farther away from the sensor, and by performing more complex gestures than would be possible in only two dimensions. Three-dimensional touchless control gestures often involve various hand and/or finger movements, such as pointing, waving, grabbing, or swiping in the air. The precise motion and trajectory of the hands and/or fingers are interpreted by the sensor system to trigger specific actions. Advanced algorithms and/or software may be used to recognize and interpret the user’s hand and/or finger movements. These algorithms may analyze the captured data to determine the intended gesture and its corresponding action. Three-dimensional touchless control gestures can operate within a defined interaction space in front of the sensor system. This space is where users perform their gestures, and its boundaries are typically determined by the sensor system’s capabilities. Using three-dimensional touchless control gestures, natural human movements and gestures may be mimicked and/or enhanced, making the human-computer interaction intuitive and user-friendly.
Throughout the present disclosure, a display system should be understood as any arrangement of hardware, software, and/or components designed to present visual information, content, and/or data to users. Display systems encompass a wide range of devices that generate visual output, enabling users to view and interact with various forms of information, including text, images, videos, graphics, and user interfaces. These systems may convey information, facilitate communication, and enhance user experience across different applications and industries. The term “display system” is meant to indicate that the system has display capabilities, but the system itself may come in various shapes and configurations of an electronic user device. For example, smartphones, such as the Apple iPhone and Samsung Galaxy series, serve as pocket-sized computing hubs, enabling voice calls, messaging, internet browsing, app usage, and multimedia consumption. Tablets like the Apple iPad and Microsoft Surface offer larger touchscreens for browsing, reading, gaming, and content creation. Laptops and ultrabooks from companies like Dell, HP, and Lenovo provide portable computing power for work, study, and multimedia tasks. E-readers like the Amazon Kindle cater to digital book enthusiasts, while smartwatches such as the Apple Watch and Samsung Galaxy Watch offer fitness tracking, notifications, and app integration from users’ wrists. Additionally, displays and monitors, including those from brands like LG, ASUS, and Dell, serve as external screens for computers, enhancing visual experience, productivity, and gaming immersion. These devices collectively exemplify the technological diversity that enhances daily life and connectivity for users worldwide, and all of them can greatly benefit from the three-dimensional touchless gesture control capabilities enabled through the aspects disclosed herein.
Throughout the present disclosure, the term “control command for a display system” should be understood as referring to instructions or signals that cause manipulation of content displayed by the display system. One example of such content manipulation is the control of a display pointer, such as a mouse cursor.
Throughout the present disclosure, a display interface standard should be understood as a set of agreed-upon technical specifications, protocols, and/or guidelines that govern the connection and/or communication between a display device and a source device, such as the data processing apparatus. Display interface standards ensure compatibility and interoperability between different devices, allowing them to communicate effectively and display content accurately. One example of a display interface standard is the Human Interface Device (HID) standard, which refers to a set of specifications and protocols developed by the USB Implementers Forum (USB-IF) to enable consistent communication between various input devices and computer systems using the Universal Serial Bus (USB) interface. The HID standard defines how devices like keyboards, mice, game controllers, touchscreens, and other input peripherals should communicate with computers, ensuring interoperability and easy plug-and-play functionality. Throughout the present disclosure, the display interface standard may be a standard which the display system supports (i.e., the display system is able to operate according to control commands that are in accordance with the display interface standard).
According to another aspect of the present disclosure, the at least one control command may be a control command for two-dimensional display control. This way, common user interactions (e.g., manipulating a display pointer or performing a mouse click) are enabled. Accordingly, usability as well as user experience of the system are improved because users find themselves in a familiar interaction environment.
According to another aspect of the present disclosure, the control command for two-dimensional display control may comprise two-dimensional position information and/or two-dimensional event information associated with the two-dimensional position information. This way, unintended or confusing system responsiveness is avoided, because a specific two-dimensional event, as intended by the user, is executed at a specific position, as intended by the user.
According to another aspect of the present disclosure, the two-dimensional position information may consist of an x-coordinate and a y-coordinate. The two-dimensional event information associated with the two-dimensional position information may comprise one or more of: a click event, in particular a single-click, a double-click or a triple-click, a press-and-release event, a drag-and-drop event, a hold-and-repeat event, a movement event, such as a position change, an acceleration and/or a velocity associated with a position change, a scrolling event, a tap event, a swipe event, a pinch-to-zoom event, a rotate event and a multi-finger-gesture event. This way, interaction of a user with the display system is improved. This is because more precise information about the position of the user input is provided. In addition, the specific two-dimensional events allow an intuitive interaction.
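Purely by way of illustration, the following Python sketch models how such a two-dimensional control command could be represented as a data structure; all names (TwoDEvent, ControlCommand2D) are hypothetical and not part of the present disclosure, and the individual events are defined in the paragraphs that follow.

```python
from dataclasses import dataclass
from enum import Enum, auto

class TwoDEvent(Enum):
    """Hypothetical enumeration of the two-dimensional events listed above."""
    SINGLE_CLICK = auto()
    DOUBLE_CLICK = auto()
    TRIPLE_CLICK = auto()
    PRESS_AND_RELEASE = auto()
    DRAG_AND_DROP = auto()
    HOLD_AND_REPEAT = auto()
    MOVE = auto()
    SCROLL = auto()
    TAP = auto()
    SWIPE = auto()
    PINCH_TO_ZOOM = auto()
    ROTATE = auto()
    MULTI_FINGER_GESTURE = auto()

@dataclass
class ControlCommand2D:
    """A two-dimensional control command: a position plus an associated event."""
    x: int  # horizontal distance from the reference point
    y: int  # vertical distance from the reference point
    event: TwoDEvent

# Example: a single click at position (320, 240)
cmd = ControlCommand2D(x=320, y=240, event=TwoDEvent.SINGLE_CLICK)
```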
Throughout the present disclosure, two-dimensional position information may relate to information measured from a reference point (e.g., an origin of a coordinate system) of the display system. For example, the x-coordinate may relate to a horizontal distance from the reference point and the y-coordinate may relate to a vertical distance from the reference point. The reference point may be positioned in a corner of the display system or at its center point.
Throughout the present disclosure, a click event may refer to an action representative of a pressing and releasing of a button (e.g., on a pointing device such as a mouse) for tasks such as selecting items or interacting with software. Typically, the time between pressing and releasing the button is shorter compared to a press-and-release event. A single, double or triple click may relate to the number of clicks within a predefined duration.
Throughout the present disclosure, a press-and-release event may refer to an action of pressing a button or key and then releasing it, generating a single input signal or event.
Throughout the present disclosure, a drag-and-drop event may relate to clicking, holding, moving an object, and then releasing it to perform an action like moving or copying the object.
Throughout the present disclosure, a hold-and-repeat event may relate to holding down a button or key, resulting in repeated input signals after a brief delay, often used for actions requiring continuous repetition.
Throughout the present disclosure, a scrolling event may relate to moving content on a screen by using a finger or a pointing device, allowing users to navigate through longer pages or lists.
Throughout the present disclosure, a tap event may relate to a brief touch or click action on a touchscreen or button, often used to select items or activate functions.
Throughout the present disclosure, a swipe event may relate to a deliberate movement of a finger or cursor across a touchscreen or touchpad, commonly used to navigate between screens or pages.
Throughout the present disclosure, a pinch-to-zoom event may relate to placing two fingers close together on a touchscreen and then spreading them apart to zoom in or bringing them closer to zoom out.
Throughout the present disclosure, a rotate event may relate to using two fingers to rotate an object on a touchscreen, allowing users to manipulate the orientation of an image or element.

Throughout the present disclosure, a multi-finger-gesture event may relate to complex touch interactions involving multiple fingers on a touchscreen, enabling advanced actions such as drawing, gaming, or navigation.
According to another aspect of the present disclosure, the display interface standard may support only control commands for two-dimensional display control. This way, it is ensured that the command is interpretable by most computing systems, because two-dimensional display control (e.g., a mouse click) is a widespread technology. Accordingly, the compatibility and scalability of the system are improved (e.g., with respect to the operating system (OS) running on the display system).
According to another aspect of the present disclosure, the display interface standard may be a Human Interface Device, HID, standard. This way, compatibility and scalability of the system is further improved, because using the HID standard provides a plug-and-play solution which does not require any further specific software (e.g., specific drivers).
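By way of a hedged illustration only: a boot-protocol HID mouse report consists of a button bitmask followed by signed 8-bit relative X and Y displacements, whereas absolute-positioning devices such as digitizers typically report 16-bit logical coordinates whose exact layout is dictated by the device’s HID report descriptor. The following Python sketch packs both layouts using only the standard library; the absolute layout shown is an assumption for illustration, not a normative format.

```python
import struct

def pack_boot_mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a 3-byte boot-protocol mouse report: button bitmask,
    then signed 8-bit relative X and Y displacements."""
    return struct.pack("<Bbb", buttons & 0x07, dx, dy)

def pack_absolute_report(buttons: int, x: int, y: int) -> bytes:
    """Pack a hypothetical absolute-positioning report with 16-bit
    little-endian logical coordinates (layout assumed, not normative)."""
    return struct.pack("<BHH", buttons & 0x07, x, y)

# Left button held while moving 5 units right and 3 up (negative Y is up here)
print(pack_boot_mouse_report(buttons=0x01, dx=5, dy=-3).hex())
# Left click at logical position (16384, 8192)
print(pack_absolute_report(buttons=0x01, x=16384, y=8192).hex())
```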
According to another aspect of the present disclosure, the one or more control commands may comprise at least one control command for three-dimensional touchless gesture control. This way, more advanced interaction (e.g., 3D cursors or gesture recognition and gesture commands) is provided.
Throughout the present disclosure, a control command for three-dimensional touchless gesture control should be understood as referring to a specific gesture, movement, and/or action performed in the three-dimensional interaction space in front of the sensor system to trigger a particular response or function. These commands are used to interact with the display system using touchless gestures, providing users with a hands-free and intuitive means of controlling various functionalities. Key characteristics of control commands for three-dimensional touchless gesture control may include one or more of the following:
Gesture Specification: Each control command may correspond to a distinct gesture or movement pattern. This could involve waving a hand, making a grabbing motion, pointing, or performing other defined gestures.
Gesture Recognition: The sensor system and/or the data processing apparatus may be equipped with algorithms that analyze the captured data from sensors like depth cameras. These algorithms may recognize and/or interpret the performed gesture based on predefined criteria.

Mapping to Functions: Once a specific gesture is recognized, it may be mapped to a corresponding function and/or action to be caused within the display system. For instance, a swipe gesture might be mapped to scrolling through content, or a grabbing motion could trigger a selection (see the sketch after this list).
Real-time Interaction: Control commands for touchless gestures may be executed in real-time, enabling users to interact with the display system without delay.
User Interface Integration: The recognized control commands may be translated into input signals that are interpreted by the user interface of the display system, allowing the intended action to take place.
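To make the mapping characteristic concrete, the sketch below shows a minimal gesture-to-action table in Python; the gesture names, the actions, and the dispatch helper are purely illustrative assumptions and not prescribed by the present disclosure.

```python
from typing import Callable

def scroll_content(direction: str) -> None:
    print(f"scrolling {direction}")

def select_item() -> None:
    print("item selected")

# Hypothetical mapping from recognized gestures to display-system actions
GESTURE_ACTIONS: dict[str, Callable[[], None]] = {
    "swipe_up": lambda: scroll_content("up"),
    "swipe_down": lambda: scroll_content("down"),
    "grab": select_item,
}

def dispatch(recognized_gesture: str) -> None:
    """Invoke the action mapped to a recognized gesture, if any."""
    action = GESTURE_ACTIONS.get(recognized_gesture)
    if action is not None:
        action()

dispatch("grab")  # -> "item selected"
```

In practice, the mapped actions would emit control commands of the kinds described herein rather than print statements.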
According to another aspect of the present disclosure, the control command for three-dimensional touchless gesture control may comprise three-dimensional position information and/or three-dimensional event information. This way, unintended or confusing system responsiveness is avoided, because a specific three-dimensional event, as intended by the user, is executed at a specific position, as intended by the user.
Throughout the present disclosure, three-dimensional position information may relate to an extension of two-dimensional position information by an additional dimension (e.g., a z-coordinate).
According to another aspect of the present disclosure, the processing unit may further be configured to determine a three-dimensional touchless control gesture based on the received sensor data. The generating of the one or more commands for the display system may comprise generating, if the three-dimensional touchless control gesture is convertible into a two-dimensional event, a control command in accordance with the display interface standard, or generating, if the three-dimensional touchless control gesture is not convertible into a two-dimensional event, a three-dimensional control command.
This way, system efficiency is improved. Sensor data is received and a three-dimensional touchless control gesture (performed by the user) is determined. There might be certain three-dimensional touchless control gestures which can be represented by a two-dimensional event (e.g., a tap or a click). These types of three-dimensional touchless control gestures can be converted into a corresponding two-dimensional event. Accordingly, a respective control command which is in accordance with the display interface standard can be generated. This way, interoperability and compatibility of the system is ensured, allowing for scalability. On the other hand, there might be three-dimensional touchless control gestures which cannot be (fully) represented as a two-dimensional event and thus require additional computation and/or processing. These three-dimensional touchless control gestures typically provide more advanced interactions. These three-dimensional touchless control gestures may then be converted into corresponding three-dimensional control commands. A receiver of these commands may require additional processing capabilities (e.g., provided by special software).
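A minimal sketch of this branching logic, assuming a hypothetical Gesture type and an illustrative, assumed set of gestures that have a two-dimensional equivalent (none of the names below come from the disclosure), might look as follows:

```python
from dataclasses import dataclass

# Gestures assumed (for illustration) to have a two-dimensional equivalent
CONVERTIBLE_TO_2D = {"air_tap": "tap", "air_swipe": "swipe"}

@dataclass
class Gesture:
    name: str
    x: int
    y: int
    z: int  # depth coordinate extending the two-dimensional position

def generate_command(gesture: Gesture) -> dict:
    """Emit a display-interface-standard 2D command when the gesture is
    convertible into a 2D event, otherwise a 3D control command."""
    if gesture.name in CONVERTIBLE_TO_2D:
        # Depth is dropped; the event is expressed per the display interface standard
        return {"type": "2d", "x": gesture.x, "y": gesture.y,
                "event": CONVERTIBLE_TO_2D[gesture.name]}
    # Advanced gesture: keep the full 3D position and event information
    return {"type": "3d", "x": gesture.x, "y": gesture.y, "z": gesture.z,
            "event": gesture.name}

print(generate_command(Gesture("air_tap", 320, 240, 15)))
print(generate_command(Gesture("rotate_3d", 320, 240, 15)))
```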
According to another aspect of the present disclosure, the output interface may comprise a wired communication unit, in particular a Power over Ethernet, PoE, unit, configured to transmit the one or more control commands to the display system. This way, transmission of data as well as power supply is ensured. In addition, installation and operation are simplified, especially in environments where access to other power sources is limited.
It is to be understood that a wired communication unit as described throughout the present disclosure may be capable of supporting any suitable type of wired communication, such as PoE, Ethernet, USB, or the like.
According to another aspect of the present disclosure, the output interface may comprise a wireless communication unit, in particular a Wireless Local Area Network, WLAN, unit or a Bluetooth unit, configured to transmit the one or more control commands to the display system. This way, convenience and flexibility are improved. By integrating wireless technologies (e.g., WLAN, Bluetooth, Cellular etc.) the system can also be installed in locations which are not accessible for wired connections. In addition, a seamless integration into existing wireless networks is enabled.
It is to be understood that a wireless communication unit as described throughout the present disclosure may be capable of supporting any suitable type of wireless communication. It may also be possible to combine both units (i.e., the wired and the wireless unit) into one unit to combine the respective advantages.
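Independent of whether the link is wired or wireless, the output interface ultimately serializes control commands onto the chosen transport. As a purely illustrative sketch, with the framing (newline-delimited JSON), host, and port being assumptions rather than anything the disclosure specifies, commands could be transmitted over a TCP connection as follows:

```python
import json
import socket

def send_commands(host: str, port: int, commands: list[dict]) -> None:
    """Send newline-delimited JSON control commands over TCP.
    Host, port, and framing are illustrative assumptions."""
    with socket.create_connection((host, port)) as sock:
        for cmd in commands:
            sock.sendall((json.dumps(cmd) + "\n").encode("utf-8"))

# Example usage against a hypothetical display-system endpoint:
# send_commands("192.168.0.42", 5000,
#               [{"type": "2d", "x": 320, "y": 240, "event": "tap"}])
```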
Another aspect of the present disclosure relates to a data processing system. The data processing system may comprise a data processing apparatus according to any one of the aspects described herein. The data processing system may comprise a sensor system configured for capturing sensor data representing one or more three-dimensional touchless control gestures performed by a user of a display system. The data processing system may comprise the display system.

According to another aspect of the present disclosure, the sensor system may comprise one or more depth sensors, in particular one or more depth cameras, more particularly two or more depth sensors or depth cameras. This way, the accuracy of capturing the sensor data may be increased. In particular, depth cameras may provide an improved accuracy compared to other depth sensors such as infrared cameras.
Another aspect of the present disclosure relates to a method. The method may be performed by a data processing apparatus. The method may be computer-implemented. The method may comprise a step of receiving, via an input interface of the data processing apparatus, sensor data captured by a sensor system. The sensor data may represent one or more three-dimensional touchless control gestures performed by a user of a display system. The method may comprise a step of generating, by a processing unit of the data processing apparatus, based at least in part on the received sensor data, one or more control commands for the display system. At least one control command of the one or more control commands may be in accordance with a display interface standard. The method may comprise a step of outputting, via an output interface of the data processing apparatus, the one or more control commands to the display system. The data processing apparatus may be a data processing apparatus according to the aspects as described herein.
Another aspect of the present disclosure relates to a computer program or a computer- readable medium having stored thereon a computer program, the computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to any one of the aspects of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure may be better understood by reference to the following drawings:
Fig. 1: A block diagram with a general overview of a data processing system in accordance with embodiments of the present disclosure.
Figs. 2a-b: Wireless implementations of the data processing system in accordance with embodiments of the present disclosure.
Figs. 3a-b: Wired implementations of the data processing system in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION
In the following, representative embodiments illustrated in the accompanying drawings will be explained. It should be understood that the illustrated embodiments and the following descriptions refer to examples which are not intended to limit the embodiments to one preferred embodiment.
Fig. 1 illustrates a schematic overview of a data processing system 100 in accordance with an exemplary embodiment. The data processing system 100 comprises a data processing apparatus 102 according to aspects of the present disclosure, a display system 106, and a sensor system 104 configured for capturing sensor data representing one or more three-dimensional touchless control gestures performed by a user of the display system 106.
The data processing apparatus 102 may comprise an input interface configured to receive sensor data captured by the sensor system 104. As described above, the sensor data may represent one or more three-dimensional touchless control gestures performed by a user of the display system 106. The data processing apparatus 102 may comprise a processing unit configured to generate, based at least in part on the received sensor data, one or more control commands for the display system 106. At least one control command of the one or more control commands may be in accordance with a display interface standard. The data processing apparatus 102 may comprise an output interface configured to output the one or more control commands to the display system 106.
In one particular configuration, the data processing apparatus 102 may be implemented as a standard personal computer combined with one or more additional microcontrollers. For example, the microcontroller may act as a HID-compliant device using aspects of the present disclosure.
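For illustration of the microcontroller acting as a HID-compliant device, the following sketch assumes a board running CircuitPython with the adafruit_hid library available; this concrete stack is an assumption made for the example and is not mandated by the present disclosure.

```python
# CircuitPython sketch for the microcontroller side (assumes a board with
# native USB support and the adafruit_hid library installed)
import usb_hid
from adafruit_hid.mouse import Mouse

mouse = Mouse(usb_hid.devices)  # enumerate to the host as an HID mouse

# Emit input as it might be issued after the data processing apparatus has
# converted a touchless gesture into a two-dimensional event
mouse.move(x=10, y=-5)          # relative move: 10 right, 5 up
mouse.click(Mouse.LEFT_BUTTON)  # a single left click
```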
The at least one control command may be a control command for two-dimensional display control. The two-dimensional control command may comprise two-dimensional position information and/or two-dimensional event information associated with the two-dimensional position information. The two-dimensional position information may consist of an x-coordinate and a y-coordinate. The two-dimensional event information associated with the two-dimensional position information may comprise one or more of: a click event, in particular a single-click, a double-click or a triple-click, a press-and-release event, a hold-and-repeat event, a movement event, such as a position change, an acceleration and/or a velocity associated with a position change, a scrolling event, a tap event, a swipe event, a pinch-to-zoom event, a rotate event and a multi-finger-gesture event. The display interface standard may support only control commands for two-dimensional display control. The display interface standard may be a Human Interface Device (HID) standard.
The one or more control commands may comprise at least one control command for three-dimensional touchless gesture control. The control command for three-dimensional touchless gesture control may comprise three-dimensional position information and/or three-dimensional event information. The processing unit may further be configured to determine one or more three-dimensional touchless control gestures based on the received sensor data. The generating of the one or more commands for the display system may comprise generating, if the three-dimensional touchless control gesture is convertible into a two-dimensional event, a command in accordance with the display interface standard, or generating, if the three-dimensional touchless control gesture is not convertible into a two-dimensional event, a three-dimensional control command (i.e., a control command for three-dimensional touchless gesture control). It may be possible that the processing unit is further configured to determine, based at least in part on the three-dimensional touchless control gesture, whether the three-dimensional touchless control gesture is convertible into a two-dimensional event or not. It may be possible that the processing unit is further configured to convert the three-dimensional touchless control gesture into a two-dimensional event or a three-dimensional event, respectively.
The output interface may comprise a wired communication unit, in particular a Power over Ethernet, PoE, unit, configured to transmit the one or more control commands to the display system. The output interface may comprise a wireless communication unit, in particular a Wireless Local Area Network, WLAN, unit or a Bluetooth unit, configured to transmit the one or more control commands to the display system.
When a user is interacting with the data processing system 100 by means of touchless interaction with the display system 106 it may be preferable to hide the cursor to avoid confusion at the user’s side. Nevertheless, the position and/or event information associated with the hidden cursor may be monitored to recognize user input (e.g., touchless control gestures) and to generate corresponding control commands for the display system 106.
Fig. 2a illustrates a wireless implementation 200a of the data processing system 100 in accordance with an exemplary embodiment. For the sake of clarity, the sensor system 104 is not shown in the illustrated implementation 200a. However, it is to be understood that the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 200a.
As can be seen, the wireless implementation 200a involves a microcontroller 202. The microcontroller 202, as explained with respect to the embodiments of the present disclosure, may be part of the data processing apparatus or may be part of another end device (e.g., the display system 106). In the implementation 200a, the data processing apparatus 102 and the microcontroller 202 may represent a logical unit. This means that the data processing apparatus 102 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.). In contrast, the display system 106 may represent a separate unit as indicated by the dashed line. For example, the display system 106 may be from a different manufacturer than the logical unit comprising the data processing apparatus 102 and the microcontroller 202. Alternatively or additionally, components of the logical unit (i.e., the data processing apparatus 102 and the microcontroller 202) may be positioned/mounted at the same location while the display system 106 is positioned/mounted at a different (e.g., remote) location.
In this example implementation 200a in which the data processing apparatus 102 and the microcontroller 202 form a logical unit, the communication between the two, as indicated by the solid arrow, may be wired (e.g., using USB). On the other hand, communication between the logical unit (i.e., the data processing apparatus 102 and the microcontroller 202) and the display system 106, as indicated by the dashed arrow, may be wireless (e.g., using Bluetooth). In this example, both communication paths (i.e., the path between the data processing apparatus 102 and the microcontroller 202 as well as the path between the logical unit and the display system 106) may each be a two-way communication as indicated by the arrowheads pointing in both directions. A two-way communication may relate to both entities participating in the communication (e.g., the logical unit and the display system 106) being able to receive and transmit data (e.g., control commands according to aspects of the present disclosure).
Fig. 2b illustrates another wireless implementation 200b of the data processing system 100 in accordance with an exemplary embodiment. For the sake of clarity, the sensor system 104 is not shown in the illustrated implementation 200b. However, it is to be understood that the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 200b. As can be seen, the wireless implementation 200b involves a microcontroller 202. In the implementation 200b the display system 106 and the microcontroller 202 may represent a logical unit. This means that the display system 106 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.). In contrast, the data processing apparatus 102 may represent a separate unit as indicated by the dashed line. For example, the data processing apparatus 102 may be from a different manufacturer than the logical unit comprising the display system 106 and the microcontroller 202. Alternatively or additionally, components of the logical unit (i.e., the display system 106 and the microcontroller 202) may be positioned/mounted at the same location while the data processing apparatus 102 is positioned/mounted at a different (e.g., remote) location.
In this example implementation 200b in which the display system 106 and the microcontroller 202 form a logical unit, the communication between the two, as indicated by the arrow, may be wired (e.g., using USB). On the other hand, communication between the logical unit (i.e., the display system 106 and the microcontroller 202) and the data processing apparatus 102, as indicated by the dashed arrow, may be wireless (e.g., using Bluetooth). In this example, both communication paths (i.e., the path between the data processing apparatus 102 and the logical unit as well as the path between the microcontroller 202 and the display system 106) may each be a one-way communication as indicated by the arrowheads. A one-way communication may relate to only one entity participating in the communication (e.g., the microcontroller 202) being able to transmit data (e.g., control commands according to aspects of the present disclosure) while the other entity (e.g., the display system 106) is only able to receive the data.
Fig. 3a illustrates a wired implementation 300a of the data processing system 100 in accordance with an exemplary embodiment. For the sake of clarity, the sensor system 104 is not shown in the illustrated implementation 300a. However, it is to be understood that the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 300a.
As can be seen, the wired implementation 300a involves a microcontroller 202. In the implementation 300a the display system 106 and the microcontroller 202 may represent a logical unit. This means that the display system 106 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.). In contrast, the data processing apparatus 102 may represent a separate unit as indicated by the dashed line. For example, the data processing apparatus 102 may be from a different manufacturer than the logical unit comprising the display system 106 and the microcontroller 202. Alternatively or additionally, components of the logical unit (i.e., the display system 106 and the microcontroller 202) may be positioned/mounted at the same location while the data processing apparatus 102 is positioned/mounted at a different (e.g., remote) location.
In this example implementation 300a in which the display system 106 and the microcontroller 202 form a logical unit, the communication between the two, as indicated by the arrow, may be wired (e.g., using USB). The communication between the logical unit (i.e., the display system 106 and the microcontroller 202) and the data processing apparatus 102, as indicated by the arrow, may also be wired (e.g., using Ethernet). In this example, both communication paths (i.e., the path between the data processing apparatus 102 and the logical unit as well as the path between the microcontroller 202 and the display system 106) may each be a one-way communication as indicated by the arrowheads. A one-way communication may relate to only one entity participating in the communication (e.g., the microcontroller 202) being able to transmit data (e.g., control commands according to aspects of the present disclosure) while the other entity (e.g., the display system 106) is only able to receive the data.
Fig. 3b illustrates another wired implementation 300b of the data processing system 100 in accordance with an exemplary embodiment. For the sake of clarity, the sensor system 104 is not shown in the illustrated implementation 300b. However, it is to be understood that the sensor system 104 as well as other units as described throughout this disclosure can be incorporated into the implementation 300b.
As can be seen, the wired implementation 300b involves a microcontroller 202. In the implementation 300b the data processing apparatus 102 and the microcontroller 202 may represent a logical unit. This means that the data processing apparatus 102 and the microcontroller 202 may be seen as one unit (e.g., with respect to the entity producing the two, with respect to the positioning or mounting etc.). In contrast, the display system 106 may represent a separate unit as indicated by the dashed line. For example, the display system 106 may be from a different manufacturer than the logical unit comprising the data processing apparatus 102 and the microcontroller 202. Alternatively or additionally, components of the logical unit (i.e., the data processing apparatus 102 and the microcontroller 202) may be positioned/mounted at the same location while the display system 106 is positioned/mounted at a different (e.g., remote) location.
In this example implementation 300b in which the data processing apparatus 102 and the microcontroller 202 form a logical unit, the communication between the two, as indicated by the arrow, may be wired (e.g., using USB). The communication between the logical unit (i.e., the data processing apparatus 102 and the microcontroller 202) and the display system 106, as indicated by the arrow, may also be wired (e.g., using USB). In this example, the communication path between the data processing apparatus 102 and the microcontroller 202 may be a two-way communication as indicated by the arrowheads pointing in both directions. The path between the logical unit and the display system 106 may be a one-way communication as indicated by the arrowhead. A two-way communication may relate to both entities participating in the communication (e.g., the data processing apparatus 102 and the microcontroller 202) being able to receive and transmit data (e.g., control commands according to aspects of the present disclosure). A one-way communication may relate to only one entity participating in the communication (e.g., the logical unit) being able to transmit data (e.g., control commands according to aspects of the present disclosure) while the other entity (e.g., the display system 106) is only able to receive the data.
As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.
Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
Embodiments of the present disclosure may be implemented on a computer system. The computer system may be a local computer device (e.g., personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g., a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers). The computer system may comprise any circuit or combination of circuits. In one embodiment, the computer system may include one or more processors which can be of any type. As used herein, processor may mean any type of computational circuit, such as but not limited to a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), multiple core processor, a field programmable gate array (FPGA), or any other type of processor or processing circuit. Other types of circuits that may be included in the computer system may be a custom circuit, an application-specific integrated circuit (ASIC), or the like, such as, for example, one or more circuits (such as a communication circuit) for use in wireless devices like mobile telephones, tablet computers, laptop computers, two-way radios, and similar electronic systems. The computer system may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random-access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like. The computer system may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system.
Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
Depending on certain implementation requirements, embodiments of the present disclosure can be implemented in hardware or in software. The implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
Some embodiments according to the present disclosure comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed. Generally, embodiments of the present disclosure can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer. The program code may, for example, be stored on a machine-readable carrier.
Other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine-readable carrier.
In other words, an embodiment of the present disclosure is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
A further embodiment of the present disclosure is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor. The data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory. A further embodiment of the present disclosure is an apparatus as described herein comprising a processor and the storage medium.
A further embodiment of the present disclosure is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein. The data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
A further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
A further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
A further embodiment according to the present disclosure comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver. The receiver may, for example, be a computer, a mobile device, a memory device or the like. The apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.

In some embodiments, a programmable logic device (for example, a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein. In some embodiments, a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein. Generally, the methods are preferably performed by any hardware apparatus.

Claims

1. A data processing apparatus (102), comprising:
an input interface configured to receive sensor data captured by a sensor system (104), the sensor data representing one or more three-dimensional touchless control gestures performed by a user of a display system (106);
a processing unit configured to generate, based at least in part on the received sensor data, one or more control commands for the display system (106);
wherein at least one control command of the one or more control commands is in accordance with a display interface standard; and
an output interface configured to output the one or more control commands to the display system (106).
2. The data processing apparatus (102) of claim 1, wherein the at least one control command is a control command for two-dimensional display control.
3. The data processing apparatus (102) of claim 2, wherein the control command for two-dimensional display control comprises two-dimensional position information and/or two-dimensional event information associated with the two-dimensional position information.
4. The data processing apparatus (102) of claim 3, wherein the two-dimensional position information consists of an x-coordinate and a y-coordinate; and wherein the two-dimensional event information associated with the two-dimensional position information comprises one or more of: a click event, in particular a single-click, a double-click or a triple-click, a press-and-release event, a drag-and-drop event, a hold-and-repeat event, a movement event, such as a position change, an acceleration and/or a velocity associated with a position change, a scrolling event, a tap event, a swipe event, a pinch-to-zoom event, a rotate event and a multi-finger-gesture event.
5. The data processing apparatus (102) of any one of the preceding claims, wherein the display interface standard supports only control commands for two-dimensional display control.
6. The data processing apparatus (102) of any one of the preceding claims, wherein the display interface standard is a Human Interface Device, HID, standard.
7. The data processing apparatus (102) of any one of the preceding claims, wherein the one or more control commands comprise at least one control command for three-dimensional touchless gesture control.
8. The data processing apparatus (102) of claim 7, wherein the control command for three-dimensional touchless gesture control comprises three-dimensional position information and/or three-dimensional event information.
9. The data processing apparatus (102) of any one of the preceding claims, wherein the processing unit is further configured to:
determine a three-dimensional touchless control gesture based on the received sensor data;
and wherein the generating the one or more commands for the display system (106) comprises:
generating, if the three-dimensional touchless control gesture is convertible into a two-dimensional event, a control command in accordance with the display interface standard; or
generating, if the three-dimensional touchless control gesture is not convertible into a two-dimensional event, a three-dimensional control command.
10. The data processing apparatus (102) of any one of the preceding claims, wherein the output interface comprises a wired communication unit, in particular a Power over Ethernet, PoE, unit, configured to transmit the one or more control commands to the display system (106).
11. The data processing apparatus (102) of any one of the preceding claims, wherein the output interface comprises a wireless communication unit, in particular a Wireless Local Area Network, WLAN, unit or a Bluetooth unit, configured to transmit the one or more control commands to the display system (106).
12. A data processing system (100) comprising:
a data processing apparatus (102) according to any one of the preceding claims;
a sensor system (104) configured for capturing sensor data representing one or more three-dimensional touchless control gestures performed by a user of a display system (106); and
the display system (106).
13. The data processing system (100) of claim 12, wherein the sensor system (104) comprises one or more depth sensors, in particular one or more depth cameras.
14. A computer-implemented method performed by a data processing apparatus (102), the method comprising the steps of:
receiving, via an input interface of the data processing apparatus (102), sensor data captured by a sensor system (104), the sensor data representing one or more three-dimensional touchless control gestures performed by a user of a display system (106);
generating, by a processing unit of the data processing apparatus (102), based at least in part on the received sensor data, one or more control commands for the display system (106);
wherein at least one control command of the one or more control commands is in accordance with a display interface standard; and
outputting, via an output interface of the data processing apparatus (102), the one or more control commands to the display system (106).
15. A computer program or a computer-readable medium having stored thereon a computer program, the computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of claim 14.
PCT/EP2024/071496 2023-08-21 2024-07-29 Touchless gesture control-enabling data processing apparatus, system and method Pending WO2025040381A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102023122357 2023-08-21
DE102023122357.7 2023-08-21

Publications (1)

Publication Number Publication Date
WO2025040381A1 (en) 2025-02-27

Family

ID=92208682

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/071496 Pending WO2025040381A1 (en) 2023-08-21 2024-07-29 Touchless gesture control-enabling data processing apparatus, system and method

Country Status (1)

Country Link
WO (1) WO2025040381A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220197431A1 (en) * 2020-07-30 2022-06-23 Ncr Corporation Methods, system, and apparatus for touchless terminal interface interaction
US20230037571A1 (en) * 2019-12-31 2023-02-09 Neonode Inc. Contactless touch input system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24751250

Country of ref document: EP

Kind code of ref document: A1