
TR2023011601A2 - A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE - Google Patents


Info

Publication number
TR2023011601A2
TR2023011601A2
Authority
TR
Turkey
Prior art keywords
user
mouse
server
virtual
enables
Prior art date
Application number
TR2023/011601A
Other languages
Turkish (tr)
Inventor
Emre Taşar Davut
Sunturlu Burcu
Kaldi Ersin
Original Assignee
Tuerkiye Garanti Bankasi Anonim Sirketi
Türkiye Garanti Bankası Anonim Şirketi
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Türkiye Garanti Bankası Anonim Şirketi
Priority to TR2023/011601A priority Critical patent/TR2023011601A2/en
Publication of TR2023011601A2 publication Critical patent/TR2023011601A2/en
Priority to PCT/TR2024/050710 priority patent/WO2025063923A1/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

This invention relates to a system (1) that enables a user to interact with virtual reality and augmented reality environments by tracking the user's hand movements through artificial intelligence algorithms.

Description

A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE

Technical Field

This invention relates to a system that enables a user to interact with virtual reality and augmented reality environments by tracking the user's hand movements through artificial intelligence algorithms.

Prior Art

Today, various devices exist for enabling user interaction with virtual reality and augmented reality environments: physical keyboards and mice, which require dedicated space and are impractical in mobile or compact settings; touch screens, which can be limiting in terms of haptic feedback and input options; specialized virtual reality/augmented reality controllers, which are expensive and not compatible with all applications; and motion recognition devices, which often require specialized hardware or have limited functionality. However, these methods have disadvantages such as lack of portability, limited integration of traditional input devices into virtual reality/augmented reality environments, and the absence of adaptability and customization options for varying user preferences or physical abilities.
Therefore, considering the studies and shortcomings in the current technology, it is understood that there is a need for a system that allows users to interact with their devices using natural hand movements, without physical peripherals such as keyboards and mice, and that provides improved accessibility for users with disabilities or limited mobility. United States patent document US 2019/0265781 A1, which falls within the state of the art, describes a system that enables user interaction with a virtual input device in a virtual or augmented reality environment. In that system, a virtual input device is rendered by a display device, the movement of a user's hand is tracked, impact data corresponding to a detected impact of at least one finger of the user's hand on a physical surface is received from an impact detection device, and the location of the detected impact relative to the rendered virtual input device is determined. Another document in the state of the art, United States patent document US 2019/0295322 A1, relates to peripheral devices that provide mixed reality and/or augmented reality interactive experiences. In that system, peripheral devices are used to create an interface between humans and computers. Common peripheral devices include keyboards, computer mice, image scanners, speakers, microphones, webcams, head-mounted displays, probe tips, pens, pointing devices, and more. Some of these peripheral devices can detect a movement from a user and translate it into input for the computer. The movement of a handheld peripheral device, such as a computer mouse, relative to a surface can be translated into a corresponding movement of a cursor displayed on a graphical user interface. Similarly, a keyboard can detect the movement of a key and/or the touch of a finger on a key and send a signal to the computer to display the specific information represented by that key.
Brief Description of the Invention

The aim of this invention is to realize a system that, through artificial intelligence algorithms, tracks a user's hand movements and enables the user to interact with virtual reality and augmented reality environments. Another aim of this invention is to realize a system that converts the user's hand movements into keyboard and mouse input, allowing the user to perform tasks without needing physical peripherals.

Detailed Description of the Invention

The "System Enabling the Use of a Virtual Keyboard and Mouse" realized to achieve the aim of this invention is shown in the attached figure. Figure 1 is a schematic view of the system of the invention. The parts in the figure are numbered individually, and the corresponding numbers are given below.

1. System
2. Electronic Device
3. Camera
4. Server

The system (1) of the invention, developed to enable the user to perform tasks without needing physical peripherals by converting the user's hand movements into keyboard and mouse input, comprises:

- at least one electronic device (2) configured to exchange data using any remote communication protocol and to run at least one application;
- at least one camera (3) configured to capture images and track the user's hand movements in real time;
- at least one server (4) configured to communicate with the electronic device (2) using any remote communication protocol, exchange data with the camera (3), analyze the user's hand movement data accessed through the camera (3) using machine learning and computer vision algorithms, define specific movements as input in the form of keyboard and mouse actions, and integrate virtual reality and augmented reality environments onto the electronic device (2).

The electronic device (2) in the system (1) is configured to exchange data using any remote communication protocol and to run at least one application.
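The camera-to-server-to-device flow described above can be sketched as a minimal pipeline. This is an illustrative sketch only; the names (`GestureServer`, `InputEvent`, `demo_recognizer`) and the frame representation are hypothetical and do not appear in the patent:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

@dataclass
class InputEvent:
    """A keyboard or mouse action recognized from a hand gesture."""
    kind: str     # "key" or "mouse"
    payload: str  # e.g. "A" for a key press, "left_click" for the mouse

class GestureServer:
    """Stands in for the server (4): turns per-frame hand data from the
    camera (3) into input events forwarded to the electronic device (2)."""

    def __init__(self, recognize: Callable[[dict], Optional[InputEvent]]):
        self.recognize = recognize

    def run(self, frames: Iterable[dict]) -> list:
        events = []
        for frame in frames:            # frames arrive from the camera (3)
            event = self.recognize(frame)
            if event is not None:       # unrecognized gestures produce no input
                events.append(event)
        return events

# A trivial recognizer: an "index_tap" gesture becomes a left click.
def demo_recognizer(frame: dict) -> Optional[InputEvent]:
    if frame.get("gesture") == "index_tap":
        return InputEvent("mouse", "left_click")
    return None

server = GestureServer(demo_recognizer)
events = server.run([{"gesture": "index_tap"}, {"gesture": "open_palm"}])
```

In a real deployment the recognizer would be a machine learning / computer vision model rather than a dictionary lookup; the sketch only shows how the three components could be wired.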
The electronic device (2) is a device such as a mobile phone, computer, laptop, or tablet. The electronic device (2) is configured to connect to the server (4) using any remote communication protocol known in the state of the art and to enable data exchange between the camera (3) and the server (4) over this connection. In the preferred embodiment of the invention, the electronic device (2) is configured to exchange data with the server (4) over a data network in the form of the Internet.

The camera (3) in the system (1) runs on the electronic device (2) and is configured to monitor and capture images of the user's hands in real time.

The server (4) in the system (1) is configured to communicate with the electronic device (2) using any communication protocol known in the state of the art and, through this communication, to exchange data with the camera (3) running on the electronic device (2). The server (4) is configured to communicate with the electronic device (2) over a data network in the form of the Internet. The server (4) is configured to exchange data with the camera (3) and to identify and process the images of the user's hands accessed through the camera (3). The server (4) is configured to analyze the user's hand movement data accessed through the camera (3) using machine learning and computer vision algorithms and to identify the movements as input in the form of keyboard and mouse actions. The server (4) is configured to integrate virtual reality and augmented reality environments onto the electronic device (2), and to enable users to interact with these integrated environments by identifying the user's movements accessed via the camera (3) as input in the form of keyboard and mouse actions through machine learning and computer vision algorithms.
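One concrete piece of turning hand movements into keyboard input is deciding which virtual key a tracked fingertip is over. The following sketch hit-tests a fingertip (in normalized image coordinates, as hand-tracking libraries commonly emit) against a simple key grid; the layout geometry and the function name `key_at` are illustrative assumptions, not details from the patent:

```python
from typing import List, Optional

def key_at(x: float, y: float, layout: List[str],
           origin=(0.1, 0.6), key_w=0.08, key_h=0.1) -> Optional[str]:
    """Return the virtual key under normalized fingertip coordinates (x, y),
    or None if the fingertip is outside the keyboard area.

    `layout` is a list of rows, e.g. ["QWERTYUIOP", ...]; `origin` is the
    top-left corner of the grid, and key_w/key_h are per-key dimensions,
    all in 0..1 image coordinates.
    """
    for row, keys in enumerate(layout):
        for col, key in enumerate(keys):
            left = origin[0] + col * key_w
            top = origin[1] + row * key_h
            if left <= x < left + key_w and top <= y < top + key_h:
                return key
    return None

layout = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
```

A server-side recognizer could call such a function with each fingertip position reported by the vision model, emitting a key-press event when a "tap" gesture is also detected at that position.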
The server (4) is configured to allow users to adjust their preferences, in the form of virtual input layout and motion recognition settings, on the virtual reality and augmented reality environments integrated onto the electronic device (2) to suit their individual needs and abilities.

Industrial Application of the Invention

In the system (1), the server (4) identifies and processes the images of the user's hands accessed through the camera (3), analyzes the user's hand movement data accessed through the camera (3) via machine learning and computer vision algorithms, and defines certain movements as input in the form of keyboard and mouse actions. The server (4) integrates virtual reality and augmented reality environments onto the electronic device (2) and enables users to interact with these integrated environments by defining certain movements of the user, accessed through the camera (3) via machine learning and computer vision algorithms, as input in the form of keyboard and mouse actions. The server (4) enables users to adjust their virtual input layout and motion recognition settings on the virtual reality and augmented reality environments integrated onto the electronic device (2) to suit their individual needs and abilities.

Thus, a versatile and user-friendly solution is offered, combining image capture and processing, motion recognition, virtual reality/augmented reality integration, AI-based gesture recognition, and customizable virtual input technologies, allowing users to interact with their devices using natural hand gestures without needing physical peripherals such as keyboards and mice, and providing improved accessibility for users with disabilities or limited mobility.
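The user-adjustable preferences described above (virtual input layout, motion recognition settings) can be sketched as a small configuration object. The field names and default values here are illustrative assumptions, not parameters defined by the patent:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class InputPreferences:
    """Hypothetical per-user settings for the virtual keyboard and mouse."""
    layout: str = "qwerty"         # virtual input layout selected by the user
    pinch_threshold: float = 0.05  # normalized fingertip distance counted as a "press"
    dwell_ms: int = 150            # how long a gesture must hold before it fires

defaults = InputPreferences()

# A user with limited fine motor control might loosen the pinch threshold
# and require a longer dwell time to avoid accidental presses.
accessible = replace(defaults, pinch_threshold=0.09, dwell_ms=400)
```

Keeping the preferences immutable (`frozen=True`) and deriving variants with `replace` makes it easy for the server to hold a validated baseline while applying per-user overrides.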
Around these basic concepts, it is possible to develop a wide variety of embodiments of the invention, "A System Enabling the Use of a Virtual Keyboard and Mouse (1)"; the invention is not limited to the examples described herein and is essentially as defined in the claims.

Claims (1)

1.
TR2023/011601A 2023-09-18 2023-09-18 A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE TR2023011601A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
TR2023/011601A TR2023011601A2 (en) 2023-09-18 2023-09-18 A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE
PCT/TR2024/050710 WO2025063923A1 (en) 2023-09-18 2024-06-25 A system for using a virtual keyboard and a mouse

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TR2023/011601A TR2023011601A2 (en) 2023-09-18 2023-09-18 A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE

Publications (1)

Publication Number Publication Date
TR2023011601A2 (en) 2023-12-21

Family

ID=95071815

Family Applications (1)

Application Number Title Priority Date Filing Date
TR2023/011601A TR2023011601A2 (en) 2023-09-18 2023-09-18 A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE

Country Status (2)

Country Link
TR (1) TR2023011601A2 (en)
WO (1) WO2025063923A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2591808A1 (en) * 2007-07-11 2009-01-11 Hsien-Hsiang Chiu Intelligent object tracking and gestures sensing input device
GB2483168B (en) * 2009-10-13 2013-06-12 Pointgrab Ltd Computer vision gesture based control of a device
US10732725B2 (en) * 2018-09-25 2020-08-04 XRSpace CO., LTD. Method and apparatus of interactive display based on gesture recognition
US20210248826A1 (en) * 2020-02-07 2021-08-12 Krikey, Inc. Surface distinction for mobile rendered augmented reality
KR102542830B1 (en) * 2021-04-16 2023-06-14 서울과학기술대학교 산학협력단 Mobile game system for playing Baduk-ball based on augmented reality and method therefor
CN115268743B (en) * 2022-07-29 2025-02-18 深圳市商汤科技有限公司 Image processing method, device, electronic device, information input system and medium
CN116048374B (en) * 2023-03-05 2023-08-29 广州网才信息技术有限公司 Online examination method and system for virtual invisible keyboard
CN116560497A (en) * 2023-04-06 2023-08-08 周源 Gesture signal input method and system based on image recognition

Also Published As

Publication number Publication date
WO2025063923A1 (en) 2025-03-27

Similar Documents

Publication Publication Date Title
Lee et al. Interaction methods for smart glasses: A survey
Liebers et al. Identifying users by their hand tracking data in augmented and virtual reality
CN102541256A (en) Position aware gestures with visual feedback as input method
CN114115689A (en) Cross-environment sharing
JP2006340370A (en) Input device by fingertip-mounting sensor
Cicek et al. Designing and evaluating head-based pointing on smartphones for people with motor impairments
Lu et al. Classification, application, challenge, and future of midair gestures in augmented reality
Zhang et al. A hybrid 2D–3D tangible interface combining a smartphone and controller for virtual reality
Corradini et al. A map-based system using speech and 3D gestures for pervasive computing
Jo et al. Enhancing virtual and augmented reality interactions with a MediaPipe-based hand gesture recognition user interface
Rahim et al. Gestural flick input-based non-touch interface for character input
TR2023011601A2 (en) A SYSTEM THAT ENABLES THE USE OF A VIRTUAL KEYBOARD AND MOUSE
Caporusso et al. Enabling touch-based communication in wearable devices for people with sensory and multisensory impairments
CN104834410A (en) Input apparatus and input method
Kaveri et al. Object tracking glove
Sawicki et al. Head movement based interaction in mobility
Sainadh et al. A Real-Time Human Computer Interaction Using Hand Gestures in OpenCV
Aggarwal et al. Gesture-based computer control
Ikematsu et al. Investigation of smartphone grasping posture detection method using corneal reflection images through a crowdsourced experiment
Sankar et al. Virtual Mouse Using Hand Gesture
WO2018034386A1 (en) Smartboard system linked with biometric information and method thereof
Kang et al. An alternative method for smartphone input using AR markers
Khan A survey of interaction techniques and devices for large high resolution displays
Samudrala et al. Hand Gesture Canvas-a New Digital Drawing Experience
Sri Charan et al. Effective Gesture-Based Framework for Capturing User Input