
US20160239087A1 - Content-aware haptic system and associated control method - Google Patents


Info

Publication number
US20160239087A1
US20160239087A1 (Application US15/016,605)
Authority
US
United States
Prior art keywords
haptic
signal
input signal
control signal
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/016,605
Inventor
Shuo-Li Shih
Kui-Chang Tseng
Yu-Hao Huang
Tsu-Ming Liu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MediaTek Inc
Original Assignee
MediaTek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MediaTek Inc filed Critical MediaTek Inc
Priority to US15/016,605 (published as US20160239087A1)
Assigned to MEDIATEK INC. Assignors: HUANG, YU-HAO; LIU, TSU-MING; SHIH, SHUO-LI; TSENG, KUI-CHANG
Priority to CN201610085507.0A (published as CN105892645A)
Publication of US20160239087A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/51Housings
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/06Investigating concentration of particle suspensions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/10Investigating individual particles
    • G01N15/14Optical investigation techniques, e.g. flow cytometry
    • G01N15/1434Optical arrangements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/47Scattering, i.e. diffuse reflection
    • G01N21/49Scattering, i.e. diffuse reflection within a body or fluid
    • G01N21/53Scattering, i.e. diffuse reflection within a body or fluid within a flowing fluid, e.g. smoke
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/55Specular reflectivity
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/60Rotation of whole images or parts thereof
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4408Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video stream encryption, e.g. re-encrypting a decrypted video stream for redistribution in a home network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4858End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N7/144Constructional details of the terminal equipment, e.g. arrangements of the camera and the display camera and display on the same optical axis, e.g. optically multiplexing the camera and display for eye to eye contact
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
    • G01N15/06Investigating concentration of particle suspensions
    • G01N15/075Investigating concentration of particle suspensions by optical means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00Features of devices classified in G01N21/00
    • G01N2201/06Illumination; Optics
    • G01N2201/062LED's
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00Features of devices classified in G01N21/00
    • G01N2201/06Illumination; Optics
    • G01N2201/063Illuminating optical parts
    • G01N2201/0638Refractive parts
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/048023D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N2007/145Handheld terminals

Definitions

  • the haptic device 130 is equipped with a transcutaneous electrical nerve stimulation device having a plurality of electrodes.
  • the processing unit 110 is capable of converting an incoming video signal or audio signal into a control signal 20 .
  • the control circuitry 120 may provide a plurality of driving signals 30 to control the plurality of electrodes in the haptic device 130 to provide a transcutaneous nerve stimulation haptic feedback.
  • the driving signals 30 indicate amplitude, frequency and/or duration of electrical impulses which would be applied by the plurality of electrodes on the haptic device 130 .
  • the haptic device 130 is equipped with heaters.
  • the processing unit 110 is capable of converting an incoming video signal or audio signal into a control signal 20 .
  • the control circuitry 120 may provide a plurality of driving signals 30 to control the heaters in the haptic device 130 to provide thermal stimulation haptic feedback.
  • the driving signals 30 control the thermal energy and/or heating duration of the heaters of the haptic device 130.
  • FIG. 2 is a flow chart of a control method for a haptic system in accordance with an embodiment of the invention.
  • in step S210, the input signal 10 is received.
  • in step S220, haptic conversion is performed based on the content of the input signal 10 to generate a control signal 20 indicating the associated haptic feedback.
  • in step S230, a haptic feedback driving signal 30 is provided to the haptic device 130 according to the control signal 20.
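As a rough illustration of the flow in FIG. 2, the following Python sketch walks an input through the three steps; the function names, the content representation, and the 5 V full-scale drive level are illustrative assumptions, not details taken from the patent.

```python
# Minimal sketch of the FIG. 2 flow (steps S210-S230); all names and
# values are illustrative, not from the patent.

def haptic_conversion(content):
    """S220: map input content to a control signal (strength, location)."""
    # 'content' is assumed to be a dict with a normalized activity level
    # in [0, 1]; a real system would decode and analyze actual media.
    strength = min(1.0, max(0.0, content.get("activity", 0.0)))
    location = content.get("region", (0, 0))
    return {"strength": strength, "location": location}

def generate_driving_signal(control):
    """S230: convert the control signal into a driving waveform level."""
    max_drive_volts = 5.0          # assumed full-scale drive voltage
    return control["strength"] * max_drive_volts, control["location"]

# S210: receive an input signal (stubbed here as precomputed content features).
input_content = {"activity": 0.7, "region": (3, 5)}
control_signal = haptic_conversion(input_content)
drive_volts, region = generate_driving_signal(control_signal)
print(f"drive {drive_volts:.2f} V at region {region}")
```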
  • the input content can be processed so as to generate a driving signal that provides various haptic feedback with the help of sensor data.
  • the haptic system 100 may further include at least one sensor 140, such as a GPS sensor, ambient light sensor, proximity sensor, thermal sensor, accelerometer, pressure sensor, heart-rate sensor, or UV sensor.
  • the sensor 140 provides sensor data 40 to the processing unit 110 .
  • the processing unit 110 receives the input video signal 10 and GPS data from the GPS sensor and generates a control signal 20 according to the input video signal 10 and GPS data, where the control signal 20 may be converted to the driving signal 30 to control the haptic device 130 to provide the haptic feedback, e.g. vibration haptic feedback.
  • the vibration magnitude may depend on the motion vectors of the image frame of the video signal, and also on the GPS data, e.g. the geographical location of the user, such as at home or at the office.
  • the vibration haptic feedback may be weaker when the user is at home, and the vibration haptic feedback may be stronger when the user is in the office.
  • the processing unit 110 may also generate the control signal 20 according to the video signal and a plurality of sensor data 40 so as to control the haptic feedback of the haptic device 130 .
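The combination described above can be pictured with the following sketch, in which a video-derived motion magnitude is scaled by a GPS-derived location context; the location gains and the 32-pixel full-scale value are illustrative assumptions.

```python
# Illustrative sketch only: scale content-derived vibration by sensor
# context (e.g., a coarse GPS-derived location); numbers are assumptions.

LOCATION_GAIN = {"home": 0.5, "office": 1.0, "unknown": 0.8}

def vibration_amplitude(motion_magnitude, location="unknown"):
    """Combine video-derived motion magnitude with GPS-derived context."""
    base = min(1.0, motion_magnitude / 32.0)   # 32-pixel MV treated as full scale
    return base * LOCATION_GAIN.get(location, 0.8)

print(vibration_amplitude(20.0, "home"))    # weaker feedback at home
print(vibration_amplitude(20.0, "office"))  # stronger feedback at the office
```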
  • FIGS. 3A and 3B are diagrams illustrating enhancing haptic feedback on the touch position in accordance with an embodiment of the invention.
  • the haptic device 130 includes a touch module 131 configured to detect the touch position 310 where the user's finger is touching the haptic device 130 or the mid-air space above the haptic device 130, as shown in FIG. 3A.
  • the touch module 131 is further configured to send the touch position information 50 to the processing unit 110 , and thus the processing unit 110 may adjust the control signal 20 to enhance the haptic feedback corresponding to the touch position 310 .
  • the processing unit 110 may increase the strength (e.g. magnitude or frequency) of the haptic feedback on or near the touch position 310 , and thus the user may sense a stronger haptic feedback on the touch position 310 , as shown in FIG. 3B .
  • the haptic device 130 further comprises an ultrasound haptic component.
  • the ultrasound haptic component is capable of producing a volumetric haptic shape to the user by vibrating the air particles near the ultrasound haptic component, so that the user may sense the friction haptic feedback by “touching” the mid-air where the ultrasound haptic component renders the volumetric haptic shape.
  • the ultrasound haptic component is capable of detecting the touch position where the user is touching in mid-air, and sends the touch position information 50 to the processing unit 110 .
  • the processing unit 110 may enhance the ultrasound haptic feedback on the touch position.
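A possible realization of this touch-position enhancement, for either on-surface or mid-air touch, is sketched below: the content-derived strength map is boosted around the reported position. The grid size, boost, and falloff are assumptions.

```python
import numpy as np

# Sketch of enhancing haptic strength near a reported touch position 310;
# the grid size, boost, and falloff are illustrative assumptions.

def enhance_at_touch(strength_map, touch_rc, boost=0.5, sigma=2.0):
    """Add a Gaussian-weighted boost centred on the touch position."""
    rows, cols = strength_map.shape
    r, c = np.ogrid[:rows, :cols]
    dist2 = (r - touch_rc[0]) ** 2 + (c - touch_rc[1]) ** 2
    enhanced = strength_map + boost * np.exp(-dist2 / (2.0 * sigma ** 2))
    return np.clip(enhanced, 0.0, 1.0)

base = np.full((8, 8), 0.3)          # uniform content-derived strength
print(enhance_at_touch(base, (4, 4)).round(2))
```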
  • FIG. 4 is a flow chart of a control method for a haptic system in accordance with another embodiment of the invention.
  • in step S410, the input signal 10 and the sensor data 40 are received.
  • in step S420, haptic conversion is performed based on the content of the input signal 10 and the sensor data 40 from the sensors 140 of the haptic system 100 to generate a control signal 20 indicating the associated haptic feedback.
  • in step S430, the haptic feedback driving signal 30 is provided to drive the haptic device 130 according to the control signal 20.
  • the sensors 140 of the haptic system 100 may further comprise a microphone for detecting an acoustic signal produced by the user.
  • the sensors 140 of the haptic system 100 may further comprise a camera for detecting the motions performed by the user.
  • the processing unit 110 can be further controlled by the user through a gesture, a voice command, an ultrasound signal, or a pose captured by the microphone or the camera. Specifically, when the processing unit 110 detects the gesture, voice command, ultrasound signal, or pose performed by the user, the processing unit 110 may apply a specific haptic signal instead of the original control signal, so that the control circuitry 120 would generate the driving signal 30 according to the specific haptic signal.
  • the haptic signal includes vibration, friction, or deformation, but the invention is not limited thereto.
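A sketch of the override behavior described above: when a recognized gesture or voice command is detected, a predefined haptic pattern replaces the content-derived control signal. The command names and pattern values are hypothetical.

```python
# Sketch: substitute a predefined haptic pattern when a user command is
# detected from the microphone or camera; patterns and names are assumptions.

PRESET_PATTERNS = {
    "double_tap_gesture": {"type": "vibration", "strength": 1.0, "ms": 80},
    "voice_stop":         {"type": "vibration", "strength": 0.0, "ms": 0},
}

def select_control_signal(content_signal, detected_command=None):
    """Use the content-derived control signal unless a command overrides it."""
    if detected_command in PRESET_PATTERNS:
        return PRESET_PATTERNS[detected_command]
    return content_signal

print(select_control_signal({"type": "friction", "strength": 0.4}, "voice_stop"))
```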
  • the haptic system 100 may further comprise a network interface unit 150 configured to communicate with other devices using wired or wireless protocols such as LAN, Bluetooth, Wi-Fi, or LTE.
  • the processing unit 110 of the haptic system 100 may receive a status signal 60 indicating a remote user's action or a remote object's status from a remote electronic device (not shown in FIG. 1 ), and the processing unit 110 may generate the control signal 20 corresponding to the status signal 60 , and convert the control signal 20 to a driving signal 30 through control circuitry 120 .
  • the remote electronic device may also comprise sensors, such as an accelerometer and a gyroscope (but not limited thereto), that detect the remote user's action or the remote object's status, and the status signal 60 is generated based on the sensor data from the sensors of the remote electronic device.
  • the processing unit 110 may analyze an input image signal and identify objects made of different materials by using object recognition techniques or a material database from an external source. The processing unit 110 is configured to generate haptic signals corresponding to different materials, such as cloth, glass, paper, metal, or air, but the invention is not limited thereto.
  • when the haptic device 130 provides friction haptic feedback, the user may sense that the cloth is rough and the metal is slippery when touching the associated portion of the haptic device 130, e.g. an electrostatic passive haptic display. That is, each material has a corresponding haptic signal that causes an individual haptic feedback, e.g. different friction haptic feedback driven by driving signals 30 having different magnitudes and/or frequencies.
  • the haptic signals of different materials can be recorded using analog signals such as audio signals.
  • each material has its own acoustic material haptic file.
  • the processing unit 110 may retrieve the acoustic material haptic files of the recognized materials from a database (not shown in FIG. 1 ), and generate the control signals 20 associated with the recognized materials according to the retrieved acoustic material haptic files.
  • the database could be stored in the haptic system 100, or it could be received from outside the haptic system 100 through the network interface unit 150.
  • the haptic properties of different materials can be easily duplicated and distributed, and different haptic systems may share the same acoustic material haptic files, so that the user may sense the same haptic feedback on different haptic systems.
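The lookup described above might be organized as in the sketch below, mapping recognized materials to stored acoustic material haptic files; the file names, database contents, and recognizer output format are placeholders, not details from the patent.

```python
# Sketch of mapping recognized materials to stored "acoustic material haptic
# files"; file names and the recognizer output are placeholders.

MATERIAL_HAPTIC_DB = {
    "cloth": "haptics/cloth.wav",   # rough: stronger/denser friction drive
    "metal": "haptics/metal.wav",   # slippery: weak friction drive
    "glass": "haptics/glass.wav",
}

def control_signals_for_objects(recognized):
    """recognized: list of (material, region) pairs from object recognition."""
    signals = []
    for material, region in recognized:
        haptic_file = MATERIAL_HAPTIC_DB.get(material)
        if haptic_file is not None:
            signals.append({"region": region, "haptic_file": haptic_file})
    return signals

print(control_signals_for_objects([("cloth", (0, 0)), ("metal", (4, 2))]))
```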
  • the input signal 10 may include an analog content, e.g. an analog audio signal, which can be regarded as the control signal 20 .
  • the analog content could be treated as the driving signal 30 and sent to the haptic device 130 for providing haptic feedback directly.
  • the processing unit 110 may determine whether the input signal 10 is an analog signal. If the input signal 10 is an analog signal, the processing unit 110 may directly send the analog signal to the haptic device 130 .
  • the haptic feedback output directly from the input audio signal can also reflect vibration carried through the air, such as a collision between materials or the explosion of fireworks.
  • the processing unit 110 may perform haptic conversion to convert the digital signal into an analog control signal that is transmitted to the control circuitry 120 .
  • both the digital content and the analog content may exist in a multimedia file.
  • the processing unit 110 may perform haptic conversion on the video signal to generate a first control signal, and directly send the analog acoustic signal to the control circuitry 120 as the second control signal. Accordingly, the haptic device 130 performs haptic feedback according to the first control signal and the second control signal.
  • FIG. 5 is a flow chart of a control method for a haptic system in accordance with yet another embodiment of the invention.
  • in step S510, an input signal comprising a digital signal and an analog signal is received.
  • in step S520, haptic conversion is performed on the digital signal to generate a first control signal.
  • in step S530, the analog signal is directly used as a second control signal.
  • in step S540, the haptic feedback driving signal 30 is provided according to the first control signal and the second control signal.
  • the first control signal is converted by the DAC 121 and then mixed with the second control signal to generate the driving signal 30.
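The mixing step can be pictured with the following sketch, in which a toy DAC scales digital codes to a waveform that is then blended with the analog control signal; the bit depth, full-scale voltage, and mix weights are assumptions.

```python
import numpy as np

# Sketch of the FIG. 5 flow: a digital control signal is converted (here a
# toy "DAC" simply scales sample values) and mixed with an analog control
# signal; sample values, scaling, and mix weights are assumptions.

def dac(digital_samples, full_scale=5.0, bits=8):
    """Toy DAC: map unsigned integer codes to a voltage waveform."""
    return full_scale * np.asarray(digital_samples, dtype=float) / (2 ** bits - 1)

def mix_driving_signal(first_digital, second_analog, w_first=0.6, w_second=0.4):
    """S520-S540: convert the digital control signal and mix in the analog one."""
    first_analog = dac(first_digital)
    n = min(len(first_analog), len(second_analog))
    return w_first * first_analog[:n] + w_second * np.asarray(second_analog[:n])

first = [0, 64, 128, 255]            # digital control signal from haptic conversion
second = [0.1, 0.2, 0.1, 0.0]        # analog (e.g., audio) control signal
print(mix_driving_signal(first, second))
```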

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pathology (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Dispersion Chemistry (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Studio Devices (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Push-Button Switches (AREA)
  • Telephone Function (AREA)

Abstract

A haptic system and an associated control method are provided. The haptic system includes: a processor, and control circuitry. The processor is configured to receive an input signal and perform haptic conversion on the input signal according to content of the input signal to generate a first control signal. The control circuitry is configured to generate a driving signal according to the first control signal so as to provide haptic feedback through a haptic device.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 62/116,710 filed on Feb. 16, 2015, the entirety of which is incorporated by reference herein.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates to haptic systems, and, in particular, to a content-aware haptic system and an associated control method.
  • 2. Description of the Related Art
  • In recent years, haptic technologies have emerged in academic reports and industrial products. However, current haptic technologies mostly focus on delivering a haptic experience to the end user, and haptic experiences are not easy to record or to duplicate across different haptic systems. In addition, the patterns of haptic feedback are usually edited or predetermined in a specific manner, such as using a predefined mapping table to map specific symbols into corresponding haptic patterns. Conventional haptic technologies do not control the haptic feedback according to the content of an input signal such as a video signal, image signal, audio signal, etc.
  • Accordingly, there is demand for a haptic system and an associated control method to solve the aforementioned problems.
  • BRIEF SUMMARY OF THE INVENTION
  • A detailed description is given in the following embodiments with reference to the accompanying drawings.
  • In an exemplary embodiment, a haptic system is provided. The haptic system includes: a processor, and control circuitry. The processor is configured to receive an input signal and perform haptic conversion on the input signal according to content of the input signal to generate a first control signal. The control circuitry is configured to generate a driving signal according to the first control signal so as to provide haptic feedback through a haptic device.
  • In another exemplary embodiment, a control method for a haptic system is provided. The method comprises the steps of: receiving an input signal; performing haptic conversion on the input signal according to the content of the input signal to generate a first control signal; and generating a driving signal according to the first control signal so as to provide haptic feedback through a haptic device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention can be more fully understood by reading the subsequent detailed description and examples with references made to the accompanying drawings, wherein:
  • FIG. 1 is a block diagram of a haptic system in accordance with an embodiment of the invention;
  • FIG. 2 is a flow chart of a control method for a haptic system in accordance with an embodiment of the invention;
  • FIGS. 3A and 3B are diagrams illustrating enhancing haptic feedback on the touch position in accordance with an embodiment of the invention;
  • FIG. 4 is a flow chart of a control method for a haptic system in accordance with another embodiment of the invention; and
  • FIG. 5 is a flow chart of a control method for a haptic system in accordance with yet another embodiment of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The following description is of the best-contemplated mode of carrying out the invention. This description is made for the purpose of illustrating the general principles of the invention and should not be taken in a limiting sense. The scope of the invention is best determined by reference to the appended claims.
  • The present disclosure is directed to systems and methods for providing content-aware haptic controls. Haptic systems may be used for actuation such as vibration, shape deformation (e.g., contouring a flat surface, or deformation), friction change, volumetric haptic shape, transcutaneous nerve stimulation, temperature stimulation, or other suitable actuations or combinations of actuations which could provide tactile feedback to a user. Haptic systems may also be used for sensing stimuli such as, for example, contact on a display screen, patterns of contact on a screen, shape changes, physical changes of a system or component, or other suitable stimuli or combinations of stimuli which may be received. Haptic systems may sense particular stimuli, change one or more characteristics of a shape change element, or both. Haptic systems may perform sensing functions and actuating functions by an integrated haptic device. In some embodiments, haptic systems may be coupled to a display screen, audio processing circuitry, image processing circuitry, device hardware, or other systems to provide for any combination of tactile, visual, and audio interactions. Actuation may occur, in some embodiments, substantially normal to a substantially planar surface, which may allow for three-dimensional contouring of the planar surface.
  • FIG. 1 is a block diagram of a haptic system in accordance with an embodiment of the invention. As shown in FIG. 1, the haptic system 100 includes a processing unit 110, control circuitry 120, and a haptic device 130. The processing unit 110 is configured to receive an input signal 10 and perform a haptic conversion to convert the content of the input signal 10 into a control signal 20 that is transmitted to the control circuitry 120. For example, the processing unit 110 may comprise one or more processors or digital signal processors (DSP), but the invention is not limited thereto. The haptic conversion may include operations selected from at least one of decoding, processing, analyzing, or converting the content of the input signal, retrieving, transforming, or calculating relevant information of the input signal, and any combination of the operations mentioned above, so as to generate the control signal 20 accordingly. The control signal 20 may be one single signal or a set of signals, and may include information indicating the strength of the haptic feedback to be provided as well as the location at which the haptic feedback is rendered on the haptic device 130. The input signal 10 may be an image, video, acoustic signal, message, or sign, but the invention is not limited thereto. The details for converting different types of input signals into an associated control signal will be described later.
  • The control circuitry 120 is configured to receive the control signal 20 and generate a driving signal 30 so as to control the haptic device 130 to provide haptic feedback. For example, the control circuitry 120 may comprise a digital-to-analog converter (DAC) 121 that is configured to convert a digital control signal 20 received from the processing unit 110 into the driving signal 30, e.g. an audio waveform. The haptic feedback provided by the haptic device 130 may include vibration, shape deformation, friction change, volumetric haptic shape, transcutaneous nerve stimulation, temperature stimulation, etc., but the invention is not limited thereto.
  • In an embodiment, given that the input signal 10 is a video signal and the haptic device 130 includes a vibrator to provide vibration haptic feedback, the processing unit 110 may analyze the video signal to obtain the motion vectors of the macroblocks of an image frame in the video signal. The magnitude and/or frequency of the driving signal 30 depend on the motion vectors of the macroblocks of the image frame. For example, when the scene in the video signal moves fast, the magnitude of the motion vectors of the image frame is larger, and thus the processing unit 110 may provide a control signal 20 indicating a higher amplitude and/or frequency of vibration to the control circuitry 120, so that the control circuitry 120 may provide a driving signal 30 to control the haptic device 130 to provide vibration haptic feedback having a corresponding amplitude and/or frequency. It should be noted that the control circuitry 120 may provide a driving signal 30 to control the haptic device 130 so as to provide global vibration haptic feedback for the whole haptic device 130. In yet another embodiment, the control circuitry 120 may provide a plurality of driving signals 30 to locally control a plurality of vibrators located in specific regions of the haptic device 130 so as to provide regional vibration haptic feedback respectively.
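A minimal sketch of the mapping described above, assuming motion vectors are available per macroblock; the 32-pixel full-scale magnitude and the 50-250 Hz vibration range are arbitrary illustrative choices, not values from the disclosure.

```python
import numpy as np

# Illustrative sketch: derive a vibration command from the average motion
# vector magnitude of a frame's macroblocks; the ranges are assumptions.

def vibration_from_motion(motion_vectors, max_mv=32.0,
                          max_amplitude=1.0, freq_range_hz=(50.0, 250.0)):
    """motion_vectors: array of shape (num_macroblocks, 2) in pixels."""
    mv = np.asarray(motion_vectors, dtype=float)
    mean_mag = float(np.mean(np.hypot(mv[:, 0], mv[:, 1]))) if len(mv) else 0.0
    level = min(1.0, mean_mag / max_mv)        # 0 = static scene, 1 = fast motion
    amplitude = max_amplitude * level
    frequency = freq_range_hz[0] + level * (freq_range_hz[1] - freq_range_hz[0])
    return amplitude, frequency

print(vibration_from_motion([[12, 5], [20, -8], [3, 1]]))
```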
  • In addition, the processing unit 110 may analyze the video signal to obtain the foreground object, its position, and motion vectors of the foreground object. The processing unit 110 may generate the control signal 20 to control at least a local vibrator of the haptic device 130 on the position at which the foreground object is presented. In some embodiments, the processing unit 110 may also detect the content change of the video signal, and generate the control signal 20 corresponding to the content change.
  • Moreover, the processing unit 110 may also analyze the acoustic content of the video signal, and may generate the control signal 20 based on the acoustic content. For example, when there is a huge explosion scene in the video signal, the associated acoustic content may contain a sudden, high-magnitude explosion sound. Accordingly, the control signal 20 generated by the processing unit 110 may indicate a sudden strong vibration on the haptic device 130 while the explosion occurs.
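One plausible way to realize this, sketched below, is to track the short-term RMS energy of the audio track and treat a sudden jump as a cue for strong vibration; the frame length and full-scale level are assumptions.

```python
import numpy as np

# Sketch: derive vibration strength from the short-term energy of the
# acoustic content (e.g., an explosion); frame size and scaling are assumptions.

def vibration_from_audio(samples, frame=1024, full_scale_rms=0.5):
    """Return one vibration strength in [0, 1] per audio frame."""
    x = np.asarray(samples, dtype=float)
    n_frames = len(x) // frame
    strengths = []
    for i in range(n_frames):
        rms = np.sqrt(np.mean(x[i * frame:(i + 1) * frame] ** 2))
        strengths.append(min(1.0, rms / full_scale_rms))
    return strengths

quiet = 0.01 * np.random.randn(2048)
boom = np.concatenate([quiet, 0.6 * np.random.randn(1024)])  # sudden loud burst
print(vibration_from_audio(boom))
```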
  • In another embodiment, the haptic device 130 includes an electrostatic generator which is capable of providing electrostatic haptic feedback. For example, the haptic device 130 may be an electrostatic passive haptic display or an electrostatic passive haptic touchpad and the input signal 10 is a video signal, but the invention is not limited thereto. Given that the haptic device 130 is an electrostatic passive haptic display, the user may sense friction change while sliding his/her finger or fingers on the surface of the haptic device 130, and the friction force sensed by the user corresponds to the content of the image. Additionally, the friction force (i.e. sensing a slippery or rough surface) sensed by the user can be controlled by the driving signal 30 applied to the haptic device 130. In some embodiments, given that the haptic device 130 is an electrostatic passive haptic display, the user may place his/her finger or fingers directly on the surface of the haptic device 130, and a voltage difference occurs between the finger (as an electrode) and the haptic device 130. This allows a direct touch without the use of any additional components. In addition, the surface of the haptic device 130 may include several segmented electrodes placed in a plurality of different regions. In some other embodiments, given that the haptic device 130 is an electrostatic passive haptic touchpad, the user may experience tactile feedback by putting his/her finger or fingers on the touchpad and exploring the surface of the haptic device 130. For example, there may be a cursor or pointer on a display of the haptic system 100, and the user may sense the friction haptic feedback where the cursor or pointer is located when putting his/her finger on the surface of the haptic device 130.
  • In the aforementioned embodiments, the processing unit 110 may perform a two-dimensional frequency-domain transformation (e.g. a discrete wavelet transform (DWT) or discrete cosine transform (DCT)) on the image frame of the video signal, and thus a map in the frequency domain can be obtained. When the pixel values in a specific image block vary significantly, the frequency content of that block is higher, and thus the control signal 20 generated by the processing unit 110 may control the control circuitry 120 to provide the driving signal 30 with a higher voltage or frequency, so that the user senses that the friction force on the specific image block is larger (i.e. a rough surface) while sliding his/her finger over the corresponding image block of the haptic device 130.
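The sketch below illustrates the idea with a block-wise 2-D DCT computed in plain NumPy: blocks whose AC (non-DC) energy is high are assigned a higher friction-drive level. The block size, energy normalization, and drive scaling are assumptions for illustration only.

```python
import numpy as np

# Sketch: estimate per-block high-frequency content with an 8x8 DCT-II and
# map it to a friction-drive level; the mapping constants are assumptions.

def dct_matrix(n=8):
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def friction_levels(gray_frame, block=8):
    """Return a map of drive levels in [0, 1], one per image block."""
    C = dct_matrix(block)
    h, w = gray_frame.shape
    levels = np.zeros((h // block, w // block))
    for bi in range(h // block):
        for bj in range(w // block):
            blk = gray_frame[bi*block:(bi+1)*block, bj*block:(bj+1)*block]
            coeffs = C @ blk @ C.T                              # 2-D DCT-II of the block
            ac_energy = np.sum(coeffs ** 2) - coeffs[0, 0] ** 2  # drop the DC term
            levels[bi, bj] = min(1.0, ac_energy / 1e5)           # assumed scale factor
    return levels

frame = np.random.randint(0, 256, (16, 16)).astype(float)
print(friction_levels(frame).round(2))
```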
In some embodiments, the haptic device 130 (e.g. an electrostatic passive haptic display) is able to provide friction haptic feedback. The driving signal 30 applied to the haptic device 130 to generate friction haptic feedback may depend on the magnitude of the motion vectors of the image frame. Additionally, the control circuitry 120 may provide a plurality of driving signals 30 to control different segmented electrodes in the haptic device 130, providing friction haptic feedback at the location at which the foreground object or a salient image feature (e.g. a human face) appears in the image frame. Alternatively, the friction force provided by the haptic device 130 may also depend on the magnitude or frequency of the acoustic content of the incoming video signal; the details can be found in the aforementioned embodiments.
Similarly, given that the haptic device 130 is further capable of providing deformation haptic feedback (e.g. through a display screen having a deformation function on its surface), the processing unit 110 may also generate the control signal 20 associated with the content of the video signal. For example, as described above, the degree of deformation haptic feedback also corresponds to the magnitude of the motion vectors and/or the content change in the video signal.
In an embodiment, the haptic device 130 is a haptic deformation display having a plurality of deformation components, e.g. a micro-fluidic display panel, and the processing unit 110 is capable of converting an incoming stereoscopic video signal or 3D image into a control signal 20 indicating the deformation magnitude and shape of the deformation components in different regions of the haptic device 130. It should be noted that the surface of the haptic device 130 can be divided into a plurality of regions, and the control circuitry 120 may generate a plurality of driving signals 30 to control the deformation component of each respective region according to the control signal 20 from the processing unit 110. For example, the haptic deformation display is originally in a first-shape configuration, and the deformation components may at least partially define the shape of the haptic deformation display, thereby causing the haptic deformation display to deform into a second-shape configuration that is different from the first-shape configuration. The second-shape configuration may then be substantially maintained.
The processing unit 110 may calculate depth information (e.g. a depth map) associated with an image frame in the stereoscopic video signal or 3D image, and the deformation height of the deformation component of a specific region corresponds to the associated depth information. In addition, the processing unit 110 may perform foreground segmentation on the incoming video signal or image to obtain a foreground object. The control signal 20 then controls the individual deformation component of each region of the haptic device 130 to generate a deformation shape corresponding to the segmentation result of the foreground object.
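A minimal sketch of this depth-to-deformation mapping, assuming the depth map and (optionally) a foreground mask are given, might look as follows; the region grid and maximum height are illustrative assumptions.

```python
import numpy as np

def deformation_heights(depth_map, foreground_mask=None,
                        regions=(8, 8), max_height_mm=3.0):
    """Per-region deformation heights derived from a depth map.

    Closer content (smaller depth) is raised higher. If a foreground
    mask is given, only the segmented object is rendered in relief.
    """
    near = 1.0 / np.maximum(depth_map.astype(float), 1e-6)  # nearness: invert depth
    if foreground_mask is not None:
        near = near * foreground_mask
    rows, cols = regions
    h, w = near.shape
    heights = np.zeros(regions)
    for r in range(rows):
        for c in range(cols):
            cell = near[r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols]
            heights[r, c] = cell.mean()
    peak = heights.max()
    return heights / peak * max_height_mm if peak > 0 else heights
```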
In some embodiments, the control signal 20 is generated according to the temporal motion prediction results of the incoming stereoscopic video signal or 3D image when the haptic system 100 is used in a video codec system. Similarly, the haptic feedback (e.g. slippery or rough) may correspond to the magnitude of the motion vectors in the stereoscopic video signal or 3D image. In addition, the processing unit 110 may perform foreground segmentation on the stereoscopic video signal or 3D image to obtain a foreground object, and generate the control signal 20 accordingly. The control circuitry 120 may then generate the driving signals 30 to control the deformation components at the position where the foreground object is presented on the haptic deformation display. Furthermore, the deformation behavior may also correspond to the acoustic content of the stereoscopic video signal; the aforementioned embodiments can be referred to for the details.
In an embodiment, the haptic device 130 is equipped with a transcutaneous electrical nerve stimulation (TENS) device having a plurality of electrodes. The processing unit 110 is capable of converting an incoming video signal or audio signal into a control signal 20. The control circuitry 120 may provide a plurality of driving signals 30 to control the plurality of electrodes in the haptic device 130 to provide transcutaneous nerve stimulation haptic feedback. The driving signals 30 indicate the amplitude, frequency, and/or duration of the electrical impulses to be applied by the plurality of electrodes of the haptic device 130.
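One hedged reading of this mapping is a monotone scaling of impulse parameters with a normalized content intensity (e.g. loudness or motion magnitude), as sketched below; the parameter ranges are placeholders chosen purely for illustration and are neither disclosed nor clinically validated values.

```python
def tens_pulse_params(intensity,
                      amp_range=(1.0, 8.0),      # mA, illustrative placeholder range
                      freq_range=(10.0, 100.0),  # Hz, illustrative placeholder range
                      dur_range=(50.0, 250.0)):  # microseconds, illustrative placeholder range
    """Map a normalized content intensity in [0, 1] to TENS drive parameters.

    Stronger content (louder sound, faster motion) yields larger, faster,
    longer electrical impulses in this sketch.
    """
    x = min(max(intensity, 0.0), 1.0)
    scale = lambda rng: rng[0] + x * (rng[1] - rng[0])
    return {'amplitude_mA': scale(amp_range),
            'frequency_Hz': scale(freq_range),
            'duration_us': scale(dur_range)}
```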
In yet another embodiment, the haptic device 130 is equipped with heaters. The processing unit 110 is capable of converting an incoming video signal or audio signal into a control signal 20. The control circuitry 120 may provide a plurality of driving signals 30 to control the heaters in the haptic device 130 to provide thermal stimulation haptic feedback. The driving signals 30 control the thermal energy and/or heating duration of the heaters of the haptic device 130.
In view of the above, the haptic system 100 is able to convert the input content into various types of haptic feedback, e.g. vibration, shape deformation, friction change, volumetric haptic shape, transcutaneous nerve stimulation, and/or temperature stimulation, as mentioned in the embodiments above. FIG. 2 is a flow chart of a control method for a haptic system in accordance with an embodiment of the invention. In FIG. 2, in step S210, the input signal 10 is received. In step S220, haptic conversion is performed based on the content of the input signal 10 to generate a control signal 20 indicating the associated haptic feedback. In step S230, a haptic feedback driving signal 30 is provided to the haptic device 130 according to the control signal 20.
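Steps S210-S230 map naturally onto a small pipeline; the skeleton below is only a structural sketch, with the conversion and driving stages supplied by the caller as stand-ins for the processing unit 110 and control circuitry 120.

```python
def haptic_pipeline(input_stream, convert, drive):
    """Skeleton of the FIG. 2 flow applied to a stream of input chunks.

    `convert` stands in for the processing unit's haptic conversion (S220)
    and `drive` for the control circuitry producing the driving signal (S230).
    """
    for chunk in input_stream:           # S210: receive the input signal
        control_signal = convert(chunk)  # S220: content-based haptic conversion
        yield drive(control_signal)      # S230: driving signal for the haptic device
```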
In yet another embodiment, the input content can be processed together with sensor data to generate the driving signal for various types of haptic feedback. For example, the haptic system 100 may further include at least one sensor 140, such as a GPS sensor, an ambient light sensor, a proximity sensor, a thermal sensor, an accelerometer, a pressure sensor, a heart-rate sensor, or a UV sensor. The sensor 140 provides sensor data 40 to the processing unit 110. In an embodiment, the processing unit 110 receives the input video signal 10 and GPS data from the GPS sensor and generates a control signal 20 according to both, where the control signal 20 may be converted into the driving signal 30 to control the haptic device 130 to provide the haptic feedback, e.g. vibration haptic feedback. For example, the vibration magnitude may depend on the motion vectors of the image frames of the video signal and also on the GPS data, e.g. the geographical location of the user, such as at home or in the office. The vibration haptic feedback may be weaker when the user is at home and stronger when the user is in the office. Additionally, the processing unit 110 may also generate the control signal 20 according to the video signal and a plurality of sensor data 40 so as to control the haptic feedback of the haptic device 130.
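A minimal sketch of this sensor fusion, assuming a coarse location label has already been derived from the GPS data, could scale the content-derived amplitude by a per-location gain; the labels and gains below follow the home/office example above and are purely illustrative.

```python
def fuse_with_location(base_amplitude, location, profile=None):
    """Scale a content-derived vibration amplitude by GPS context.

    `profile` maps a coarse location label to a gain; the defaults follow
    the example in the text (weaker at home, stronger at the office).
    """
    profile = profile or {'home': 0.5, 'office': 1.5}
    gain = profile.get(location, 1.0)        # unknown places: leave unchanged
    return min(base_amplitude * gain, 1.0)   # clamp to the actuator's range
```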
FIGS. 3A and 3B are diagrams illustrating enhancement of haptic feedback at the touch position in accordance with an embodiment of the invention. In an embodiment, the haptic device 130 includes a touch module 131 configured to detect the touch position 310 where the user's finger is touching the haptic device 130 or the mid-air space above it, as shown in FIG. 3A. The touch module 131 is further configured to send the touch position information 50 to the processing unit 110, so that the processing unit 110 may adjust the control signal 20 to enhance the haptic feedback corresponding to the touch position 310. For example, when the user's finger touches the haptic device 130, the user may sense a first haptic feedback at the touch position 310. After receiving the touch position information 50, the processing unit 110 may increase the strength (e.g. magnitude or frequency) of the haptic feedback on or near the touch position 310, so that the user senses a stronger haptic feedback at the touch position 310, as shown in FIG. 3B.
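One plausible implementation of this enhancement adds a localized boost around the reported touch position on a per-region feedback map, as in the hedged sketch below; the Gaussian profile and its parameters are assumptions, not disclosed behavior.

```python
import numpy as np

def enhance_at_touch(feedback_map, touch_rc, boost=0.5, sigma=1.5):
    """Strengthen haptic feedback around a reported touch position.

    feedback_map: (rows, cols) base strengths from haptic conversion.
    touch_rc: (row, col) region reported by the touch module.
    A Gaussian bump centred on the touch adds up to `boost` extra strength.
    """
    rows, cols = feedback_map.shape
    rr, cc = np.mgrid[0:rows, 0:cols]
    dist2 = (rr - touch_rc[0]) ** 2 + (cc - touch_rc[1]) ** 2
    bump = boost * np.exp(-dist2 / (2 * sigma ** 2))
    return np.clip(feedback_map + bump, 0.0, 1.0)
```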
In an embodiment, the haptic device 130 further comprises an ultrasound haptic component. The ultrasound haptic component is capable of presenting a volumetric haptic shape to the user by vibrating the air particles near the ultrasound haptic component, so that the user may sense friction haptic feedback by "touching" the mid-air region where the ultrasound haptic component renders the volumetric haptic shape. In addition, the ultrasound haptic component is capable of detecting the touch position where the user is touching in mid-air, and it sends the touch position information 50 to the processing unit 110. Thus, the processing unit 110 may enhance the ultrasound haptic feedback at the touch position.
In view of the above, the haptic system 100 is able to convert the input content and the sensor data 40 into various types of haptic feedback, e.g. vibration, shape deformation, friction change, and/or volumetric haptic shape, as mentioned in the embodiments above. FIG. 4 is a flow chart of a control method for a haptic system in accordance with another embodiment of the invention. Referring to FIG. 4, in step S410, the input signal 10 and the sensor data 40 are received. In step S420, haptic conversion is performed based on the content of the input signal 10 and the sensor data 40 from the sensors 140 of the haptic system 100 to generate a control signal 20 indicating the associated haptic feedback. In step S430, the haptic feedback driving signal 30 is provided to drive the haptic device 130 according to the control signal 20.
In another embodiment, the sensors 140 of the haptic system 100 may further comprise a microphone for detecting acoustic signals produced by the user, and a camera for detecting motions performed by the user. For example, the processing unit 110 can be further controlled by the user through a gesture, a voice command, an ultrasound signal, or a pose captured by the microphone or the camera. Specifically, when the processing unit 110 detects the gesture, voice command, ultrasound signal, or pose performed by the user, the processing unit 110 may apply a specific haptic signal instead of the original control signal, so that the control circuitry 120 generates the driving signal 30 according to the specific haptic signal. It should be noted that the haptic signal may indicate vibration, friction, or deformation, but the invention is not limited thereto.
In an embodiment, the haptic system 100 may further comprise a network interface unit 150 configured to communicate with other devices using wired or wireless protocols such as LAN, Bluetooth, Wi-Fi, LTE, etc. For example, the processing unit 110 of the haptic system 100 may receive a status signal 60 indicating a remote user's action or a remote object's status from a remote electronic device (not shown in FIG. 1); the processing unit 110 may then generate the control signal 20 corresponding to the status signal 60 and convert the control signal 20 into a driving signal 30 through the control circuitry 120. It should be noted that the remote electronic device may also comprise sensors, such as an accelerometer and a gyroscope (but not limited thereto), that detect the remote user's action or the remote object's status, and the status signal 60 is generated based on the sensor data from the sensors of the remote electronic device.
In yet another embodiment, the processing unit 110 may analyze an input image signal and identify objects made of different materials in the image signal by using object recognition techniques or a material database from an external source. The processing unit 110 is configured to be capable of generating haptic signals corresponding to different materials such as cloth, glass, paper, metal, or air, but the invention is not limited thereto. For example, given that the haptic device 130 provides friction haptic feedback, the user may sense that cloth is rough and metal is slippery when touching the associated portion of the haptic device 130, e.g. an electrostatic passive haptic display. That is, each material has a corresponding haptic signal that causes an individual haptic feedback, e.g. different friction haptic feedback driven by driving signals 30 having different magnitudes and/or frequencies.
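A hedged sketch of such a material-to-haptic-signal mapping is a simple lookup table from recognized labels to friction drive parameters; all labels and numbers below are illustrative placeholders, not disclosed values.

```python
# Illustrative lookup from a recognized material label to friction drive
# parameters; higher voltage at lower frequency is felt as rougher here.
MATERIAL_HAPTICS = {
    'cloth': {'voltage': 100.0, 'frequency_hz': 40.0},   # rough
    'paper': {'voltage': 70.0,  'frequency_hz': 80.0},
    'glass': {'voltage': 30.0,  'frequency_hz': 200.0},  # slippery
    'metal': {'voltage': 25.0,  'frequency_hz': 240.0},  # slippery
}

def friction_for_material(label, default=None):
    """Return friction drive parameters for a recognized material label."""
    default = default or {'voltage': 50.0, 'frequency_hz': 120.0}
    return MATERIAL_HAPTICS.get(label, default)
```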
Furthermore, the haptic signals of different materials can be recorded as analog signals such as audio signals. Specifically, each material has its own acoustic material haptic file. When the processing unit 110 recognizes the materials in the video signal, it may retrieve the acoustic material haptic files of the recognized materials from a database (not shown in FIG. 1) and generate the control signals 20 associated with the recognized materials according to the retrieved files. The database may be stored in the haptic system 100, or it may be received from outside the haptic system 100 through the network interface unit 150. With the help of acoustic material haptic files, the haptic properties of different materials can be easily duplicated and distributed, and different haptic systems may share the same acoustic material haptic files, so that the user senses the same haptic feedback on different haptic systems.
In an embodiment, the input signal 10 may include analog content, e.g. an analog audio signal, which can be regarded as the control signal 20. Thus, the analog content can be treated as the driving signal 30 and sent to the haptic device 130 to provide haptic feedback directly. For example, the processing unit 110 may determine whether the input signal 10 is an analog signal. If the input signal 10 is an analog signal, the processing unit 110 may send it directly to the haptic device 130. For example, the haptic feedback output directly from the input audio signal can also reflect vibrations carried by air waves, such as a collision between materials or the explosion of fireworks.
If the input signal 10 is a digital signal, the processing unit 110 may perform haptic conversion to convert the digital signal into an analog control signal that is transmitted to the control circuitry 120. Note, however, that both digital content and analog content may exist in the same multimedia file. For example, when the input signal is a video signal including video images and an analog acoustic signal, the processing unit 110 may perform haptic conversion on the video images to generate a first control signal, and directly send the analog acoustic signal to the control circuitry 120 as a second control signal. Accordingly, the haptic device 130 provides haptic feedback according to the first control signal and the second control signal.
FIG. 5 is a flow chart of a control method for a haptic system in accordance with yet another embodiment of the invention. In step S510, an input signal comprising a digital signal and an analog signal is received. In step S520, haptic conversion is performed on the digital signal to generate a first control signal. In step S530, the analog signal is directly taken as a second control signal. In step S540, the haptic feedback driving signal 30 is provided according to the first control signal and the second control signal. In an embodiment, the first control signal is converted by a DAC 121 and then mixed with the second control signal to generate the driving signal 30.
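Putting steps S510-S540 together, a structural sketch of the mixed digital/analog path might look like the following; the `convert`, `dac`, and `mix` arguments are stand-ins for the haptic conversion, the DAC 121, and the mixing stage, not disclosed interfaces.

```python
import numpy as np

def mixed_driving_signal(digital_part, analog_part, convert, dac, mix=0.5):
    """Structural sketch of the FIG. 5 flow for a mixed digital/analog input.

    The digital part goes through haptic conversion (`convert`, S520) and a
    DAC model (`dac`); the analog part is used directly as the second
    control signal (S530); the two are blended into one driving signal (S540).
    """
    first_control = convert(digital_part)
    first_analog = np.asarray(dac(first_control), dtype=float)
    second_control = np.asarray(analog_part, dtype=float)
    n = min(len(first_analog), len(second_control))
    return mix * first_analog[:n] + (1.0 - mix) * second_control[:n]
```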
While the invention has been described by way of example and in terms of the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. On the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.

Claims (18)

What is claimed is:
1. A haptic system, comprising:
a processor, configured to receive an input signal and perform haptic conversion on the input signal according to content of the input signal to generate a first control signal; and
control circuitry, configured to generate a driving signal according to the first control signal so as to provide haptic feedback through a haptic device.
2. The haptic system as claimed in claim 1, wherein the input signal comprises a video signal, an image signal, an audio signal, a message, a sign, or a combination thereof.
3. The haptic system as claimed in claim 1, wherein the haptic feedback comprises vibration, shape deformation, friction change, or volumetric haptic shape.
4. The haptic system as claimed in claim 1, wherein the haptic conversion performed by the processor comprises one or more operations selected from: decoding, processing, analyzing, or converting the content of the input signal; retrieving, transforming, or calculating relevant information of the input signal; and any combination thereof.
5. The haptic system as claimed in claim 1, wherein the input signal is a video signal, and the processor analyzes the video signal to obtain motion vectors of an image frame of the video signal or obtain foreground object information, and haptic feedback controlled by the driving signal corresponds to the motion vectors or is associated with a position of the foreground object.
6. The haptic system as claimed in claim 1, wherein the processor is coupled to a sensor and receives sensor data from the sensor; wherein the processor generates the first control signal according to the content of the input signal and the sensor data.
7. The haptic system as claimed in claim 1, wherein the processor is coupled to a touch module and receives touch position information from the touch module; wherein the touch position information indicates a location where a user is touching the haptic device; wherein the processor further generates a second control signal so as to provide enhanced haptic feedback around the touch position.
8. The haptic system as claimed in claim 1, further comprising: a microphone or a camera, wherein the processor further modifies the first control signal according to a voice command or an ultrasound signal captured by the microphone, or a gesture or pose of a user captured by the camera.
9. The haptic system as claimed in claim 1, wherein the input signal is an image signal, and the processor analyzes the image signal to identify material information of the image signal and generates the first control signal associated with the material shown in the image signal, and the control circuitry generates the driving signal to provide haptic feedback according to the first control signal.
10. The haptic system as claimed in claim 9, wherein the material has an individual material haptic file, and the processor retrieves the individual material haptic file from a database when the material is identified.
11. A control method for a haptic system, the method comprising:
receiving an input signal;
performing haptic conversion on the input signal according to content of the input signal to generate a first control signal; and
generating a driving signal according to the first control signal so as to provide haptic feedback through a haptic device.
12. The method as claimed in claim 11, wherein the haptic feedback comprises vibration, shape deformation, friction change, or volumetric haptic shape.
13. The method as claimed in claim 11, wherein the haptic conversion comprises one or more operations selected from: decoding, processing, analyzing, or converting the content of the input signal; retrieving, transforming, or calculating relevant information of the input signal; and any combination thereof.
14. The method as claimed in claim 11, further comprising:
receiving sensor data from a sensor;
wherein the first control signal is generated according to the content of the input signal and the sensor data.
15. The method as claimed in claim 11, further comprising:
detecting touch position information indicating where a user is touching the haptic device; and
generating a second control signal so as to provide enhanced haptic feedback around the touch position.
16. The method as claimed in claim 11, wherein the input signal is an image signal, and the method further comprises:
analyzing the image signal to identify material information of the image signal;
generating the first control signal associated with the material shown in the image signal; and
generating the driving signal to provide haptic feedback according to the first control signal.
17. The method as claimed in claim 16, wherein the material has an individual material haptic file, and the method further comprises:
retrieving the individual material haptic file from a database when the material is identified.
18. The method as claimed in claim 11, wherein the input signal is a video signal, and the step of performing haptic conversion further comprises:
analyzing the video signal to obtain motion vectors of an image frame of the video signal or obtain foreground object information;
wherein the haptic feedback controlled by the driving signal corresponds to the motion vectors or is associated with a position of the foreground object.