US20190329417A1 - Method for performing emotional gestures by a device to interact with a user - Google Patents
- Publication number
- US20190329417A1 (U.S. application Ser. No. 16/507,621)
- Authority
- US
- United States
- Prior art keywords
- user
- gesture
- emotional
- determined
- body portion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
- B25J11/001—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means with emotions simulating means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/003—Controls for manipulators by means of an audio-responsive input
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/081—Touching devices, e.g. pressure-sensitive
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/086—Proximity sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/087—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices for sensing other physical parameters, e.g. electrical or chemical properties
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Definitions
- FIG. 1A is a schematic diagram of a device for performing emotional gestures according to an embodiment.
- FIG. 1B is a schematic diagram of a device for performing emotional gestures with a user device attached thereto, according to an embodiment.
- FIG. 2 is an exploded view of the first body portion of the device illustrated in FIGS. 1A and 1B , according to an embodiment.
- FIG. 3 is a perspective view of the first body portion and the second body portion of the device illustrated in FIGS. 1A and 1B , according to an embodiment.
- FIG. 4 is an exploded perspective view of the second body portion of the device illustrated in FIGS. 1A and 1B according to an embodiment.
- FIG. 5 is a block diagram of a controller for controlling a device for performing emotional gestures.
- FIG. 6 is a flowchart of a method for performing an emotional gesture based on a received electronic message according to an embodiment.
- FIG. 7 is a flowchart of a method for performing an emotional gesture based on a recommendation according to an embodiment.
- FIG. 8 is a flowchart of a method for performing an emotional gesture based on a determined user emotional state, according to an embodiment.
- the various disclosed embodiments include a device configured to perform and display gestures that may be interpreted as emotional gestures by a user.
- the device includes a base connected to a first body portion, where the first body portion is rotatable relative to the base.
- a second body portion is placed above the first body portion and is attached thereto via an electro-mechanical arm.
- FIG. 1A is an example schematic diagram of a device 100 for performing emotional gestures according to an embodiment.
- the device 100 comprises a base 110 , which may include therein a variety of electronic components, hardware components, and the like.
- the base 110 may further include a volume control 180 , a speaker 190 , and a microphone 195 .
- a first body portion 120 may be mounted to the base 110 within a ring 170 designed to accept the first body portion 120 therein.
- the first body portion 120 may include a hollow hemisphere mounted above a hollow cylinder, although other appropriate bodies and shapes may be used while having a base configured to fit into the ring 170 .
- a first aperture 125 crossing through the apex of the hemisphere of the first body portion 120 provides access into and out of the hollow interior volume of the first body portion 120 .
- the first body portion 120 is mounted to the base 110 within the confinement of the ring 170 such that it may rotate about its vertical axis of symmetry, i.e., an axis extending perpendicular from the base.
- the first body portion 120 rotates clockwise or counterclockwise relative to the base 110 .
- the rotation of the first body portion 120 about the base 110 may be achieved by, for example, a motor (not shown) mounted to the base 110 or a motor (not shown) mounted within the hollow of the first body portion 120 .
- the device 100 further includes a second body portion 140 .
- the second body portion 140 may additionally include a hollow hemisphere mounted onto a hollow cylindrical portion, although other appropriate bodies may be used.
- a second aperture 145 is located at the apex of the hemisphere of the second body portion 140 . When assembled, the second aperture 145 is positioned to align with the first aperture 125 .
- the second body portion 140 is mounted to the first body portion 120 by an electro-mechanical member (not shown in FIG. 1 ) placed within the hollow of the first body portion 120 and protruding into the hollow of the second body portion 140 through the first aperture 125 and the second aperture 145 .
- the electro-mechanical member enables motion of the second body portion 140 with respect to the first body portion 120 in a motion that imitates at least an emotional gesture understandable to a human user.
- the combined motion of the second body portion 140 with respect to the first body portion 120 and the first body portion 120 with respect to the base 110 is configured to correspond to one or more of a plurality of predetermined emotional gestures capable of being presented by such movement.
- a head camera assembly (not shown) may be embedded within the second body portion 140 .
- the head camera assembly comprises at least one image capturing sensor that allows capturing images and videos.
- the base 110 may be further equipped with a stand 160 that is designed to provide support to a user device, such as a portable computing device.
- the stand 160 may include two vertical support pillars that may include therein electronic elements.
- Examples of such elements include wires, sensors, charging cables, and wireless charging components, which may be configured to communicatively connect the stand to the user device.
- a camera assembly 165 is embedded within a top side of the stand 160 .
- the camera assembly 165 includes at least one image capturing sensor.
- a user device 150 is shown supported by the stand 160 .
- the user device 150 may include a portable electronic device including a smartphone, a mobile phone, a tablet computer, a wearable device, and the like.
- the device 100 is configured to communicate with the user device 150 via a controller (not shown).
- the user device 150 may further include at least a display unit used to display content, e.g., multimedia.
- the user device 150 may also include sensors, e.g., a camera, a microphone, a light sensor, and the like. The input identified by the sensors of the user device 150 may be relayed to the controller of the device 100 to determine whether one or more electro-mechanical gestures are to be performed.
- the device 100 may further include an audio system, including, e.g., a speaker 190 .
- the speaker 190 in one embodiment is embedded in the base 110 .
- the audio system may be utilized to, for example, play music, make alert sounds, play voice messages, and other audio or audiovisual signals generated by the device 100 .
- the microphone 195 , also part of the audio system, may be adapted to receive voice instructions from a user.
- the device 100 may further include an illumination system (not shown). Such a system may be implemented using, for example, one or more light emitting diodes (LEDs).
- the illumination system may be configured to enable the device 100 to support emotional gestures and relay information to a user, e.g., by blinking or displaying a particular color. For example, an incoming message may be indicated on the device by an LED pulsing green light.
- the LEDs of the illumination system may be placed on the base 110 , on the ring 170 , or within the first or second body portions 120 , 140 of the device 100 .
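- As an illustration of how such notification patterns might be driven in software, the sketch below maps event names to blink patterns; the set_led() primitive and the event names are assumptions, not part of the patent.

```python
# Illustrative only: maps notification types to LED blink patterns, assuming a
# hypothetical set_led(color, on) primitive exposed by the illumination system.
import time

PATTERNS = {
    "incoming_message": {"color": "green", "pulses": 3, "period_s": 0.5},
    "attention":        {"color": "white", "pulses": 1, "period_s": 1.0},
}

def set_led(color: str, on: bool) -> None:   # stub standing in for the real LED driver
    print(f"LED {color} {'on' if on else 'off'}")

def indicate(event: str) -> None:
    pattern = PATTERNS.get(event)
    if pattern is None:
        return
    for _ in range(pattern["pulses"]):
        set_led(pattern["color"], True)
        time.sleep(pattern["period_s"] / 2)
        set_led(pattern["color"], False)
        time.sleep(pattern["period_s"] / 2)

indicate("incoming_message")   # e.g., pulse green when an electronic message arrives
```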
- Emotional gestures understood by humans are, for example and without limitation, gestures such as: slowly tilting a head downward towards a chest in an expression interpreted as being sorry or ashamed; tilting the head to the left or right towards the shoulder as an expression of posing a question; nodding the head upwards and downwards vigorously as indicating enthusiastic agreement; shaking a head from side to side as indicating disagreement, and so on.
- a profile of a plurality of emotional gestures may be compiled and used by the device 100 .
- the device 100 is configured to relay similar emotional gestures by movements of the first body portion 120 and the second body portion 140 relative to each other and to the base 110 .
- the emotional gestures may be predefined movements that mimic or are similar to certain gestures of humans.
- the device may be configured to direct the gesture toward a particular individual within a room. For example, for an emotional gesture of expressing agreement towards a particular user who is moving from one side of a room to another, the first body portion 120 may perform movements that track the user, such as a rotation about a vertical axis relative to the base 110 , while the second body portion 140 may move upwards and downwards relative to the first body portion 120 to mimic a nodding motion.
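- A minimal sketch of the tracking-and-nodding behavior described above; the rotate_base()/move_head() motor interface is assumed for illustration and is not defined by the patent.

```python
# Illustrative sketch: the motor interface is an assumption, not taken from the patent.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Motors:
    rotate_base: Callable[[float], None]  # rotate first body portion about the vertical axis (degrees)
    move_head: Callable[[float], None]    # raise/lower second body portion via the arm (degrees)

def nod_in_agreement(motors: Motors, user_bearing_deg: float, cycles: int = 3) -> None:
    """Face a (possibly moving) user, then nod the second body portion up and down."""
    motors.rotate_base(user_bearing_deg)      # track the user across the room
    for _ in range(cycles):
        motors.move_head(+15.0)               # upward
        motors.move_head(-15.0)               # downward
    motors.move_head(0.0)                     # return to neutral

# Stub wiring for demonstration; real hardware would drive the electro-mechanical member.
motors = Motors(rotate_base=lambda deg: print(f"base -> {deg:+.0f} deg"),
                move_head=lambda deg: print(f"head -> {deg:+.0f} deg"))
nod_in_agreement(motors, user_bearing_deg=40.0)
```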
- FIG. 2 shows an example exploded view of the first body portion 120 of the device illustrated in FIGS. 1A and 1B , according to an embodiment.
- the first body portion 120 includes an electro-mechanical member 130 used by the device 100 for performing emotional gestures.
- the electro-mechanical member 130 is mounted within the hollow of the first body portion 120 and at least partially protrudes through the first aperture 125 .
- the electro-mechanical member 130 may further be connected to the base 110 , such that the first body portion 120 may be moved relative thereto.
- the electro-mechanical member 130 is configured to control movements of the first body portion 120 and the second body portion 140 and includes an assembly of a plurality of mechanical elements that in combination enable such motion.
- the plurality of mechanical elements may include a variety of combinations, such as, but not limited to, shafts, axles, pivots, wheels, cogwheels, poles, and belts.
- the electro-mechanical member 130 may be connected to one or more electric motors (not shown) configured to rotate the first body portion 120 about a vertical axis.
- the electric motor of the electro-mechanical member 130 may be physically connected to the first body portion 120 by an axis that enables a complete 360-degree spin.
- an arm 121 of the electro-mechanical member 130 may be connected to the electric motor and extend through the first aperture 125 to be physically connected to the second body portion 140 by an axis that enables movement of the second body portion 140 via the arm 121 , e.g., moving the second body portion upwards, downwards, forwards, and backwards.
- the arm 121 may include a narrow portion configured to fit within the first aperture 125 and the second aperture 145 , such that the first body portion 120 and second body portion 140 may be connected through the arm 121 .
- Additional components within the first body portion 120 may include a connector 123 adapted to connect the electro-mechanical member 130 to a controller (not shown) within the device 100 .
- the electric motor is connected to a spring system 122 that is configured to allow for smooth movements of the arm, and, in turn, the second body portion 140 , without the use of cogwheels or gears.
- the combined movements of the first body portion 120 and the second body portion 140 may be configured to perform diverse emotional gestures.
- the first body portion 120 may rotate right while the second body portion 140 performs a tilting movement, which may be interpreted as posing a question to a user detected to be positioned to the right of the device 100 .
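- One way such a profile of combined movements could be represented is as a lookup table of movement steps per gesture; the step encoding and angle values below are illustrative assumptions, not values from the patent.

```python
# Each gesture is a sequence of (base_rotation_deg, head_move_deg) steps.
GESTURE_PROFILE = {
    "agreement":    [(0, +15), (0, -15), (0, +15), (0, -15)],   # nod
    "question":     [(+30, 0), (0, +10)],                       # turn right, tilt head
    "sorry":        [(0, -20)],                                 # slowly lower head
    "disagreement": [(-15, 0), (+30, 0), (-15, 0)],             # shake from side to side
}

def perform(gesture: str) -> None:
    for base_deg, head_deg in GESTURE_PROFILE[gesture]:
        # Replace the prints with real commands to the electro-mechanical member.
        if base_deg:
            print(f"rotate first body portion by {base_deg:+d} deg")
        if head_deg:
            print(f"move second body portion by {head_deg:+d} deg")

perform("question")   # e.g., interpreted as posing a question to a user on the right
```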
- FIG. 3 is an example perspective view of the first body portion 120 and the second body portion 140 of the device illustrated in FIGS. 1A and 1B , according to an embodiment.
- When assembled, the arm 121 protrudes from the first aperture 125 and extends through the second aperture 145 .
- a bearing assembly 142 may be secured to the top of the arm 121 and configured to hold the arm 121 in place within the hollow of the second body portion 140 .
- FIG. 4 is an example exploded perspective view of the second body portion 140 of the device for performing emotional gestures according to an embodiment.
- When fully assembled, the neck 121 extends into the interior volume of the hollow second body portion 140 and is attached thereto.
- a head motor assembly 143 is disposed within the second body portion and connected to the neck 121 .
- the head motor assembly 143 may be configured to allow for additional movement of the second body portion 140 with respect to the first body portion 120 .
- a connector 144 allows for a connection between the head motor assembly 143 and a controller (not shown).
- a motherboard 146 is in communication with the head motor assembly 143 and the connector 144 , and is configured to be controlled via the controller.
- FIG. 5 is an example block diagram of a controller 500 of the device 100 implemented according to an embodiment.
- the controller 500 is disposed within the base 110 of the device 100 .
- the controller 500 is placed within the hollow of the first body portion 120 or the second body portion 140 of the device 100 .
- the controller 500 includes a processing circuitry 510 that is configured to control at least the motion of the various electro-mechanical segments of the device 100 .
- the processing circuitry 510 may be realized as one or more hardware logic components and circuits.
- illustrative types of hardware logic components include field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), and the like, or any other hardware logic components that can perform calculations or other manipulations of information.
- the controller 500 further includes a memory 520 .
- the memory 520 may contain therein instructions that, when executed by the processing circuitry 510 , cause the controller 500 to execute actions, such as performing a motion of one or more portions of the device 100 , receiving an input from one or more sensors, displaying a light pattern, and the like.
- the memory 520 may store therein user information, e.g., data associated with a user's behavior pattern.
- the memory 520 is further configured to store software.
- Software shall be construed broadly to mean any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
- Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code).
- the instructions cause the processing circuitry 510 to perform the various processes described herein. Specifically, the instructions, when executed, cause the processing circuitry 510 to cause the first body portion 120 , the second body portion 140 , the electro-mechanical member 130 , and the arm 121 of the device 100 to perform emotional gestures as described herein.
- the memory 520 may further include a memory portion (not shown) including the instructions.
- the controller 500 further includes a communication interface 530 which is configured to perform wired 532 communications, wireless 534 communications, or both, with external components, such as a wired or wireless network, wired or wireless computing devices, and so on.
- the communication interface 530 may be configured to communicate with the user device to receive data and instructions therefrom.
- the controller 500 may further include an input/output (I/O) interface 540 that may be utilized to control the various electronics of the device 100 , such as sensors 550 , including sensors on the device 100 , sensors on the user device 150 , the electro-mechanical member 130 , and more.
- the sensors 550 may include, but are not limited to, environmental sensors, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, and a touch detector, one or more of which may be configured to sense and identify real-time data associated with a user.
- For example, a motion detector may sense movement and a proximity sensor may detect that the movement is within a predetermined distance of the device 100 . As a result, instructions may be sent to light up the illumination system of the device 100 and raise the second body portion 140 , mimicking a gesture indicating attention or interest.
- the real-time data may be saved and stored within the device 100 , e.g., within the memory 520 , and may be used as historical data to assist with identifying behavior patterns, changes occurring in behavior patterns, and the like.
- the controller 500 may determine, based on sensory input from a sensor 550 , that a certain emotional gesture is appropriate based on identification of a specific user behavior. As a result, the controller 500 may cause the first body portion 120 , the electro-mechanical member 130 and the second body portion 140 to perform one or more movements that may be interpreted by the user as one or more emotional gestures.
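- A simplified sketch of that sensing-to-gesture decision; the sensor fields, thresholds, and gesture names are assumptions chosen for illustration, not the patented logic.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorSnapshot:
    motion_detected: bool
    proximity_m: float      # distance of the detected movement from the device
    sound_level_db: float

ATTENTION_DISTANCE_M = 1.5  # assumed threshold; the patent does not specify values

def choose_gesture(reading: SensorSnapshot) -> Optional[str]:
    """Map one sensor snapshot to the name of an emotional gesture, or None."""
    if reading.motion_detected and reading.proximity_m <= ATTENTION_DISTANCE_M:
        return "attention"  # e.g., light the LEDs and raise the second body portion
    if reading.sound_level_db > 60.0:
        return "question"   # e.g., tilt the second body portion toward the sound
    return None

print(choose_gesture(SensorSnapshot(True, 0.8, 40.0)))  # -> "attention"
```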
- Methods implemented by the device 100 may be utilized for several purposes.
- An example of such a purpose may be performing an electro-mechanical gesture based on a receipt of an electronic message by identification of information based on the electronic message and collection of data with respect to a user's state.
- the device 100 may be further utilized to perform an electro-mechanical gesture in response to receipt of an electronic recommendation.
- the method may be configured to receive an electronic recommendation, collect data related to the user, analyze the data and the information associated with the recommendation to determine a proactive electro-mechanical gesture associated with at least one emotion. Then, the at least one electro-mechanical gesture is performed.
- the device 100 may be configured to respond to detected loneliness of a user.
- the device 100 may be configured to detect loneliness of a user using predetermined loneliness profiles based on various parameters including, e.g., identifying the amount of time a user has been stationary watching television, sleeping, sitting in a chair without significant movement, and the like.
- the controller of the device may be configured to select at least one electro-mechanical gesture from a predetermined set of electro-mechanical gestures based on the detected loneliness profile.
- the gestures may be adapted for various users.
- different users may experience different electro-mechanical gestures from the device based on identical parameters.
- the electro-mechanical gesture may include rotating the first body segment 120 towards the user, moving the second body portion 140 forward and towards the user, and causing the illumination system to illuminate and the audio system to play music.
- FIG. 6 is an example flowchart of a method 600 for performing an emotional gesture based on a received electronic message according to an embodiment.
- an electronic message is received by the device.
- the electronic message may include, for example, a short message service (SMS) message, a multimedia messaging service (MMS) message, an email, a voice mail, a message sent over a social network, and the like.
- the electronic message may be received over a network, such as a local area network (LAN), a wireless network, the Internet, and the like.
- the received message may include both content and metadata, where metadata is data describing the content.
- the content may include text, such as alphanumeric characters, words, sentences, queries, an image, a picture, a video, an audio recording, a combination thereof, and the like.
- the metadata of the electronic message can include information about a sender of the message, a device or phone number associated with a device from which the electronic message was received, a time the message was initially sent, a time the message was received, the size of the message, and the like.
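- For illustration, the content/metadata split might be modeled as follows; the field names are assumptions rather than terms from the patent.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class MessageMetadata:
    sender_id: str              # account, device, or phone number associated with the sender
    sent_at: datetime
    received_at: datetime
    size_bytes: int
    channel: str                # e.g., "sms", "email", "social"

@dataclass
class ElectronicMessage:
    text: Optional[str] = None                       # alphanumeric content, if any
    media: List[str] = field(default_factory=list)   # images, video, audio references
    metadata: Optional[MessageMetadata] = None

msg = ElectronicMessage(
    text="Are we still on for dinner Sunday at 6?",
    metadata=MessageMetadata("daughter-account", datetime(2018, 1, 5, 17, 0),
                             datetime(2018, 1, 5, 17, 0, 2), 120, "social"),
)
print(msg.metadata.sender_id)
```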
- the received electronic message is analyzed.
- the analysis includes identifying the content and the metadata of the electronic message and determining the intent of a sender of the message based on the content and the metadata.
- the intent may include asking a question to an intended recipient, determining the current state of an intended recipient, e.g., the current emotional state of the intended recipient, sending a reminder to an intended recipient and the like.
- the intended recipient is a user of the device 100 , discussed above.
- the analysis further includes identifying the sender of the electronic message.
- the identity of the sender may include a name of the sender, the relationship between the sender and the intended recipient (also referred to as the user), and the like.
- a sender may be identified as a family relative, a friend, a co-worker, a social worker, and so on, of the user.
- the identity of the sender may be useful in determining the sender's intent. For example, based on historical data associated with the user, including previously received messages, if the sender is determined to be a child of the user and the message has been sent at a time that many previous messages have been sent, it may be determined that the child is checking in on the wellbeing of the user.
- the historical data may be retrieved from a database, as discussed below.
- the sender of an electronic message may be identified as the daughter of the intended recipient, where the electronic message is determined to have been sent through a messaging service, e.g., Facebook® Messenger, where the sender is associated with a user account previously known to be associated with the daughter of the intended recipient.
- the content of the electronic message may be analyzed to determine that the intent of the daughter is to confirm that the user and sender are meeting for dinner on the upcoming Sunday at 6 pm at a particular restaurant.
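- A toy example of how sender identity and message content could feed an intent estimate; the contact map and keyword rules are illustrative stand-ins for the analysis described above.

```python
from typing import Dict, Optional

# Hypothetical mapping from sender accounts to relationships with the user.
CONTACTS: Dict[str, str] = {"daughter-account": "daughter", "coworker-7": "co-worker"}

def determine_intent(sender_id: str, text: str) -> Dict[str, Optional[str]]:
    relationship = CONTACTS.get(sender_id)          # identify the sender relative to the user
    text_lower = text.lower()
    if "?" in text and any(word in text_lower for word in ("dinner", "meet", "sunday")):
        intent = "confirm_plans"
    elif any(phrase in text_lower for phrase in ("how are you", "checking in")):
        intent = "check_wellbeing"
    else:
        intent = "unknown"
    return {"relationship": relationship, "intent": intent}

print(determine_intent("daughter-account", "Are we still on for dinner Sunday at 6?"))
# -> {'relationship': 'daughter', 'intent': 'confirm_plans'}
```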
- data associated with the user is also collected and analyzed; the data may include real-time data or historical data.
- Real-time data may be information associated with a user's current state, and may be accessed and determined via one or more sensors, e.g., the sensors 550 discussed above in FIG. 5 .
- the sensors may be connected to a device and may include an environmental sensor, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, a touch detector, a combination thereof, and the like.
- real-time data may include frequency of motion detected from the user within a room within a predetermined period of time, e.g., the previous three hours, based on data received via a proximity sensor and a motion sensor.
- Historical data may include previously recorded real-time data of the user, attributes associated with the user, and the like, and be indicative of the user's behavior patterns and preferences based on previously determined data. For example, historical data may indicate that the user's sense of hearing is in a poor condition.
- the historical data may be stored in, for example, a memory, e.g., the memory 520 of the device of FIG. 5 , a database, a cloud database, and the like.
- an interaction objective is determined.
- the real-time and historical data may be utilized to determine at least one interaction objective.
- the interaction objectives are desired goals related to a determined user state to be achieved by a proactive interaction based on the received electronic message and user data. For example, if, based on the analyzed data, it is determined that a user's emotional state is identified as lonely, at least one interaction objective may be to improve this user's state and minimize perceived loneliness by performing at least one electro-mechanical gesture related to the received electronic message.
- the determination of the interaction objectives may be achieved by collecting and analyzing the data associated with the user from, for example, a plurality of sensors, social networks used by the user, previous user data stored on a database, and the like, in order to identify the current user state.
- the electro-mechanical gestures can be adjusted for each individual user. For example, if it is determined that a user has been without human interaction for 24 hours, the electro-mechanical gesture that will be provided in response to a receipt of an electronic message, e.g., an electronic message used to check in with a user from a relative, will be different from the electro-mechanical gesture that will be provided in case the user has been entertaining company.
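- The per-user adjustment described above might look like the following in outline; the 24-hour case is taken from the example, while the gesture names and return format are assumptions.

```python
from typing import List

def select_gestures_for_message(intent: str, hours_since_human_contact: float) -> List[str]:
    """Pick gestures so that the same incoming message yields a warmer response
    for a user who has been without human interaction for a long time."""
    gestures = ["turn_toward_user"]
    if intent == "check_wellbeing":
        gestures.append("nod_gently")
    if hours_since_human_contact >= 24:
        # Richer, more proactive response for a user who has been alone.
        gestures += ["raise_head", "pulse_led_green", "play_soft_music"]
    return gestures

print(select_gestures_for_message("check_wellbeing", hours_since_human_contact=30))
print(select_gestures_for_message("check_wellbeing", hours_since_human_contact=2))
```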
- At S 650 , at least one electro-mechanical gesture to be performed is determined based on the analysis of the electronic message and the analysis of the data of the user.
- the electro-mechanical gesture may include at least one of the electro-mechanical gestures determined to achieve the interaction objective.
- the electro-mechanical gesture may include emotional gestures that are predefined movements that mimic or are similar to certain gestures of humans, as described herein above.
- the determined at least one electro-mechanical gesture is performed, e.g., by the device described in FIGS. 1A-5 .
- a response to the electronic message is sent, e.g., to the sender.
- the response may indicate that the electronic message has been received by the user.
- the response may include a reaction or reply from the user intended to be sent to the sender.
- the response may be sent, e.g., over a network, and may include text, such as alphanumeric characters, words, sentences, queries, an image, a picture, a video, an audio recording, a combination thereof, and the like.
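- Putting the steps of FIG. 6 together, a controller might orchestrate them roughly as below; every helper here is a placeholder for the analyses described above, not the patented implementation.

```python
def handle_incoming_message(msg) -> None:
    """Rough outline: receive, analyze, collect user data, decide, perform, respond."""
    analysis = analyze_message(msg)                         # content, metadata, sender intent
    user_data = collect_user_data()                         # real-time sensor data + stored history
    objective = determine_interaction_objective(user_data)  # e.g., "reduce perceived loneliness"
    gestures = select_gestures(analysis, objective)         # gestures that serve the objective
    for gesture in gestures:                                # drive the electro-mechanical member
        perform(gesture)
    send_response(msg, acknowledged=True)                   # notify the sender

# Placeholder implementations so the outline runs end to end:
def analyze_message(msg): return {"intent": "check_wellbeing"}
def collect_user_data(): return {"hours_since_human_contact": 30}
def determine_interaction_objective(data): return "reduce perceived loneliness"
def select_gestures(analysis, objective): return ["turn_toward_user", "nod_gently"]
def perform(gesture): print("performing", gesture)
def send_response(msg, acknowledged): print("response sent:", acknowledged)

handle_incoming_message({"text": "Just checking in!"})
```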
- FIG. 7 is an example flowchart of a method 700 for performing an emotional gesture based on a recommendation according to an embodiment.
- a recommendation is received.
- the recommendation may include an instruction to perform an electro-mechanical gesture, to display a multimedia content item, to provide a link to a multimedia content item, and the like.
- the recommendation is received from a person, e.g., from a user device over a network.
- the recommendation is received from a recommendation generator, which is configured to generate a recommendation based on data associated with a user.
- the recommendation generator may be the controller discussed in FIG. 5 above.
- the data may include real-time data or historical data.
- Real-time data may be information associated with a user's current state, and may be accessed and determined via one or more sensors, e.g., the sensors 550 discussed above in FIG. 5 .
- the sensors may be connected to a device and may include an environmental sensor, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, a touch detector, a combination thereof, and the like.
- real-time data may include frequency of motion detected from the user within a room within a predetermined period of time, e.g., the previous three hours, based on data received via a proximity sensor and a motion sensor.
- Historical data may include previously recorded real-time data of the user, attributes associated with the user, and the like, and be indicative of the user's behavior patterns and preferences based on previously determined data. For example, historical data may indicate that the user's sense of hearing is in a poor condition.
- the historical data may be stored in, for example, a memory, e.g., the memory 520 of the device of FIG. 5 , a database, a cloud database, and the like.
- the data and the electronic recommendation are analyzed to determine an action to be performed based thereon.
- the analysis includes determining a user's emotional state, and determining an appropriate action based on the determined emotional state and the received recommendation.
- the action includes an electro-mechanical gesture, e.g., an emotional gesture, to be performed.
- the electro-mechanical gesture may include causing a device to display a show of interest in the user, and displaying a link to an uplifting video within a display of the device.
- the action does not include an electro-mechanical gesture.
- the action is performed.
- the action is performed by a device configured to perform emotional gestures, e.g., the device discussed above in FIGS. 1A-5 .
- the electro-mechanical gesture may include moving a first body portion and a second body portion to mimic human emotional gestures, as discussed above.
- the action may include displaying a multimedia content, e.g., based on the received recommendation, in tandem with one or more electro-mechanical gestures.
- an electro-mechanical gesture is configured to create an emotional reaction by the user.
- an electro-mechanical gesture may include rotating a first body segment slowly towards the user and moving the second body portion slowly forward, imitating a human gesture that expresses an interest in a companion's feelings.
- a user response to the action is determined.
- the response may be determined based on data collected from a plurality of sensors.
- a user response includes at least one of: a response to the electro-mechanical gesture, and a response to the electronic recommendation. For example, if the user's response to a recommended video was expressed by smiling and laughing, the user's state may be determined to be positive, and the video, the type of video, the time of the suggestion, and the like, may be stored in a storage, e.g., in a database, for managing future recommendations.
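- In outline, the recommendation flow of FIG. 7 could be sketched as follows; the emotional-state estimate, action names, and feedback store are deliberately simplified assumptions.

```python
feedback_log = []   # stands in for the database used to manage future recommendations

def handle_recommendation(recommendation: dict, sensor_data: dict) -> None:
    # Crude state estimate; a real system would analyze many indicators.
    state = "lonely" if sensor_data.get("hours_idle", 0) > 6 else "neutral"
    if state == "lonely":
        actions = ["turn_toward_user", "lean_forward", f"show:{recommendation['video_url']}"]
    else:
        actions = [f"show:{recommendation['video_url']}"]
    for action in actions:
        print("performing", action)              # drive gestures / display content
    reaction = observe_user_reaction()           # e.g., smile or laughter detected via the camera
    feedback_log.append({"recommendation": recommendation, "state": state, "reaction": reaction})

def observe_user_reaction() -> str:
    return "smiled"                              # placeholder for camera/microphone analysis

handle_recommendation({"video_url": "https://example.com/uplifting"}, {"hours_idle": 8})
print(feedback_log)
```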
- FIG. 8 is an example flowchart of a method for performing an emotional gesture based on a determined user emotional state, according to an embodiment.
- real-time indicators that indicate the current state of a user are received.
- Real-time indicators may be received from sensors, i.e., the sensors discussed above, including an environmental sensor, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, a touch detector, a combination thereof, and the like.
- Examples of real-time indicators include whether user motions have been detected over a period of time, e.g., by a motion sensor, whether conversations have taken place over a period of time, e.g., by a microphone, and the like.
- the real-time indicators related to a current state of a user are analyzed.
- the analysis includes comparing the received real-time indicators with previously determined user profiles. For example, based on detected motions, it may be determined whether the user is currently awake or asleep, if there is current user movement, if there has been user movement over a previously determined period of time, if the user has watched television without moving for a predetermined period of time, and the like.
- the analysis may further include comparing the determined current emotional state of a user to a plurality of predetermined profiles.
- a plurality of loneliness profiles may be accessed via a database, where each profile includes a set of parameters that may indicate loneliness.
- the parameters may include data associated with real-time indicators, including the amount of time a user has been idle, the amount of time between movements, the amount of time between conversations involving the user, and the like.
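- A loneliness profile of that kind might be encoded as a set of thresholds compared against the real-time indicators; the profile names and values below are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LonelinessProfile:
    name: str
    idle_hours_threshold: float              # time the user has been stationary
    no_conversation_hours_threshold: float   # time since the last conversation involving the user

PROFILES = [
    LonelinessProfile("mild",   idle_hours_threshold=3.0, no_conversation_hours_threshold=6.0),
    LonelinessProfile("strong", idle_hours_threshold=6.0, no_conversation_hours_threshold=24.0),
]

def match_profile(hours_idle: float, hours_since_conversation: float) -> Optional[LonelinessProfile]:
    """Return the strongest profile whose thresholds the observed indicators exceed."""
    matched = None
    for profile in PROFILES:                 # ordered from mild to strong
        if (hours_idle >= profile.idle_hours_threshold
                and hours_since_conversation >= profile.no_conversation_hours_threshold):
            matched = profile
    return matched

print(match_profile(hours_idle=7.0, hours_since_conversation=26.0))   # -> strong profile
```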
- a user's current emotional state is determined based on the analysis, e.g., if the user is determined to be in a lonely state.
- At S 840 , at least one electro-mechanical gesture to be performed is selected based on the user's determined emotional state.
- the gesture may be selected from a plurality of electro-mechanical gestures based on the user's emotional state, where the electro-mechanical gestures may be associated with predetermined profiles associated with at least one emotion.
- the plurality of electro-mechanical gestures may be dynamically updated based on, for example, a user's reactions to the selected electro-mechanical gesture as identified by sensors. Thus, different users may experience different electro-mechanical gestures based on identical circumstances.
- At least one electro-mechanical gesture may be selected to mimic an emotional gesture, such as, for example, rotating a first body segment of a device, e.g., the device of FIGS. 1A-5 , towards the user, moving a second body portion forward and towards the user, and the like, to display a gesture of interest.
- certain multimedia items may be displayed or played, e.g., a light turned on, a video played, music played, and the like.
- each of the selected electro-mechanical gestures is performed.
- the various embodiments disclosed herein can be implemented as hardware, firmware, software, or any combination thereof.
- the software is preferably implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices.
- the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
- the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces.
- the computer platform may also include an operating system and microinstruction code.
- a non-transitory computer readable medium is any computer readable medium except for a transitory propagating signal.
- the phrase “at least one of” followed by a listing of items means that any of the listed items can be utilized individually, or any combination of two or more of the listed items can be utilized. For example, if a system is described as including “at least one of A, B, and C,” the system can include A alone; B alone; C alone; A and B in combination; B and C in combination; A and C in combination; or A, B, and C in combination.
Description
- This application is a continuation of International Application No. PCT/US2018/012923 filed Jan. 9, 2018 which claims the benefit of U.S. Provisional Patent Application No. 62/444,384 and U.S. Provisional Application No. 62/444,386, both filed on Jan. 10, 2017, the contents of which are hereby incorporated by reference.
- The present disclosure relates generally to electronic devices, and more specifically to electronic devices designed to perform electro-mechanical gestures that portray particular emotions.
- Electronic devices, including personal electronic devices such as smartphones, tablet computers, consumer robots, and the like, have been recently designed with ever increasing capabilities. Such capabilities fall within a wide range, including, for example, automatically cleaning or vacuuming a floor, playing high definition video clips, identifying a user by a fingerprint detector, running applications with multiple uses, accessing the internet from various locations, and the like.
- In recent years, microelectronics advancement, computer development, control theory development and the availability of electro-mechanical and hydro mechanical servomechanisms, among others, have been key factors in robotics evolution, giving rise to a new generation of automatons known as social robots. Social robots can conduct what appears to be emotional and cognitive activities, interacting and communicating with people in a simple and pleasant manner following a series of behaviors, patterns and social norms. Advancements in the field of robotics have included the development of biped robots with human appearances that facilitate interaction between the robots and humans by introducing anthropomorphic human traits in the robots. The robots often include a precise mechanical structure allowing for specific physical locomotion and handling skill.
- Although social robots have sensory systems to perceive the surrounding environment and are capable of interacting with human beings, the self-expressions they are currently programmed to display remain limited. Current social robots' performances include simple direct responses to a user's actions. For example, these responses may include performing a movement or series of movements based on predetermined paths. Vacuum robots employ such predetermined paths in order to efficiently maximize coverage of a floor plan and may run based on a user determined schedule. Responses may further include predetermined movements when encountering a known obstacle, which may be employed by biped robots to maneuver a course. However, these responses are difficult to employ when the desired application of the robot is to directly respond to a user's queries or to determine and respond to a user's mood. Without the ability to provide gestures and movements that appear as emotional in nature, robots become less appealing to many users, especially those who are less familiar with robotic technology.
- It would therefore be advantageous to provide a solution that would overcome the challenges noted above.
- A summary of several example embodiments of the disclosure follows. This summary is provided for the convenience of the reader to provide a basic understanding of such embodiments and does not wholly define the breadth of the disclosure. This summary is not an extensive overview of all contemplated embodiments, and is intended to neither identify key or critical elements of all embodiments nor to delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more embodiments in a simplified form as a prelude to the more detailed description that is presented later. For convenience, the term “certain embodiments” may be used herein to refer to a single embodiment or multiple embodiments of the disclosure.
- Certain embodiments disclosed herein include a device for performing emotional gestures to interact with a user. The device includes a base; a controller; a first body portion pivotally connected to the base, the first body portion having a first aperture; an electro-mechanical member disposed within the first body portion and connected to the controller; and a second body portion connected to the electro-mechanical member, the second body portion having a second aperture. The electro-mechanical member is configured to extend from the first body portion through the first aperture to the second body portion through the second aperture and the controller is configured to control movements of the electro-mechanical member and the first body portion, where the movements include emotional gestures.
- The subject matter disclosed herein is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other objects, features, and advantages of the disclosed embodiments will be apparent from the following detailed description taken in conjunction with the accompanying drawings.
- It is important to note that the embodiments disclosed herein are only examples of the many advantageous uses of the innovative teachings herein. In general, statements made in the specification of the present application do not necessarily limit any of the various claimed embodiments. Moreover, some statements may apply to some inventive features but not to others. In general, unless otherwise indicated, singular elements may be in plural and vice versa with no loss of generality. In the drawings, like numerals refer to like parts through several views.
- The various disclosed embodiments include a device configured to perform and display gestures that may be interpreted as emotional gestures by a user. The device includes a base connected to a first body portion, where the first body portion is rotatable relative to the base. A second body portion is placed above the first body portion and is attached thereto via an electro-mechanical arm.
-
FIG. 1A is an example schematic diagram of adevice 100 for performing emotional gestures according to an embodiment. Thedevice 100 comprises abase 110, which may include therein a variety of electronic components, hardware components, and the like. Thebase 110 may further include avolume control 180, aspeaker 190, and amicrophone 195. - A
first body portion 120 may be mounted to thebase 110 within aring 170 designed to accept thefirst body portion 120 therein. Thefirst body portion 120 may include a hollow hemisphere mounted above a hollow cylinder, although other appropriate bodies and shapes may be used while having a base configured to fit into thering 170. Afirst aperture 125 crossing through the apex of the hemisphere of thefirst body portion 120 provides access into and out of the hollow interior volume of thefirst body portion 120. Thefirst body portion 120 is mounted to thebase 110 within the confinement of thering 170 such that it may rotate about its vertical axis symmetry, i.e., an axis extending perpendicular from the base. For example, thefirst body portion 120 rotates clockwise or counterclockwise relative to thebase 110. The rotation of thefirst body portion 120 about thebase 110 may be achieved by, for example, a motor (not shown) mounted to thebase 110 or a motor (not shown) mounted within the hollow of thefirst body portion 120. - The
device 100 further includes asecond body portion 140. Thesecond body portion 140 may additionally include a hollow hemisphere mounted onto a hollow cylindrical portion, although other appropriate bodies may be used. Asecond aperture 145 is located at the apex of the hemisphere of thesecond body portion 140. When assembled, thesecond aperture 145 is positioned to align with thefirst aperture 125. - The
second body portion 140 is mounted to thefirst body portion 120 by an electro-mechanical member (not shown inFIG. 1 ) placed within the hollow of thefirst body portion 120 and protruding into the hollow of thesecond body portion 140 through thefirst aperture 125 and thesecond aperture 145. - In an embodiment, the electro-mechanical member enables motion of the
second body portion 140 with respect of thefirst body portion 120 in a motion that imitates at least an emotional gesture understandable to a human user. The combined motion of thesecond body portion 140 with respect of thefirst body portion 120 and thefirst body portion 120 with respect to thebase 110 is configured to correspond to one or more of a plurality of predetermined emotional gestures capable of being presented by such movement. A head camera assembly (not shown) may be embedded within thesecond body portion 140. The head camera assembly comprises at least one image capturing sensor that allows capturing images and videos. - The base 110 may be further equipped with a
stand 160 that is designed to provide support to a user device, such as a portable computing device. Thestand 160 may include two vertical support pillars that may include therein electronic elements. Example for such elements include wires, sensors, charging cables, wireless charging components, and the like and may be configured to communicatively connect the stand to the user device. - In an embodiment, a
camera assembly 165 is embedded within a top side of thestand 160. Thecamera assembly 165 includes at least one image capturing sensor. - According to some embodiments, shown in
FIG. 1B , auser device 150 is shown supported by thestand 160. Theuser device 150 may include a portable electronic device including a smartphone, a mobile phone, a tablet computer, a wearable device, and the like. Thedevice 100 is configured to communicate with theuser device 150 via a controller (not shown). Theuser device 150 may further include at least a display unit used to display content, e.g., multimedia. According to an embodiment, theuser device 150 may also include sensors, e.g., a camera, a microphone, a light sensor, and the like. The input identified by the sensors of theuser device 150 may be relayed to the controller of thedevice 100 to determine whether one or more electro-mechanical gestures are to be performed. - Returning to
FIG. 1A , thedevice 100 may further include an audio system, including, e.g., aspeaker 190. Thespeaker 190 in one embodiment is embedded in thebase 110. The audio system may be utilized to, for example, play music, make alert sounds, play voice messages, and other audio or audiovisual signals generated by thedevice 100. Themicrophone 195, being also part of the audio system, may be adapted to receive voice instructions from a user. - The
device 100 may further include an illumination system (not shown). Such a system may be implemented using, for example, one or more light emitting diodes (LEDs). The illumination system may be configured to enable thedevice 100 to support emotional gestures and relay information to a user, e.g., by blinking or displaying a particular color. For example, an incoming message may be indicated on the device by a LED pulsing green light. The LEDs of the illumination system may be placed on thebase 110, on thering 170, or within on the first or 120, 140 of thesecond body portions device 100. - Emotional gestures understood by humans are, for example and without limitation, gestures such as: slowly tilting a head downward towards a chest in an expression interpreted as being sorry or ashamed; tilting the head to the left of right towards the shoulder as an expression of posing a question; nodding the head upwards and downwards vigorously as indicating enthusiastic agreement; shaking a head from side to side as indicating disagreement, and so on. A profile of a plurality of emotional gestures may be compiled and used by the
device 100. - In an embodiment, the
device 100 is configured to relay similar emotional gestures by movements of thefirst body portion 120 and thesecond body portion 140 relative to each other and to thebase 110. The emotional gestures may be predefined movements that mimic or are similar to certain gestures of humans. Further, the device may be configured to direct the gesture toward a particular individual within a room. For example, for an emotional gesture of expressing agreement towards a particular user who is moving from one side of a room to another, thefirst body portion 120 may perform movements that track the user, such as a rotation about a vertical axis relative to thebase 110, while thesecond body portion 140 may move upwards and downwards relative to thefirst body portion 120 to mimic a nodding motion. -
FIG. 2 shows an example exploded view of thefirst body portion 120 of the illustrated inFIGS. 1A and 1B , according to an embodiment. Thefirst body portion 120 includes an electro-mechanical member 130 used by thedevice 100 for performing emotional gestures. The electro-mechanical member 130 is mounted within the hollow of thefirst body portion 120 and at least partially protrudes through thefirst aperture 125. The electro-mechanical member 130 may further be connected to thebase 110, such that thefirst body portion 120 may be moved relative thereto. - The electro-
mechanical member 130 is configured to control movements of thefirst body portion 120 and thesecond body portion 140 and includes an assembly of a plurality of mechanical elements that in combination enable such motion. The plurality of mechanical elements may include a variety of combinations, such as, but not limited to, shafts, axles, pivots, wheels, cogwheels, poles, and belts. The electro-mechanical member 130 may be connected to one or more electric motors (not shown) configured to rotate thefirst body portion 120 about a vertical axis. - In an example embodiment, the electric motor of the electro-
mechanical member 130 may be physically connected to thefirst body portion 120 by an axis that enables a complete 360-degree spin. In another example embodiment, anarm 121 of the electro-mechanical member 130 may be connected to the electric motor and extend through thefirst aperture 125 to be physically connected to thesecond body portion 140 by an axis that enables movement of thesecond body portion 140 via thearm 121, e.g., moving the second body portion upwards, downwards, forwards, and backwards. Thearm 121 may include a narrow portion configured to fit within thefirst aperture 125 and thesecond aperture 145, such that thefirst body portion 120 andsecond body portion 140 may be connected through thearm 121. Additional components within thefirst body portion 120 may include aconnector 123 adapted to connect the electro-mechanical member 130 to a controller (not shown) within thedevice 100. In an embodiment, the electric motor is connected to aspring system 122 that is configured to allow for smooth movements of the arm, and, in turn, thesecond body portion 140, without the use of cogwheels or gears. - The combined movements of the
first body portion 120 and thesecond body portion 140 may be configured to perform diverse emotional gestures. For example, thefirst body portion 120 may rotate right while thesecond body portion 140 performs a tilting movement, which may be interpreted as posing a question to a user detected to be positioned to the right of thedevice 100. -
FIG. 3 is an example perspective view of thefirst body portion 120 and thesecond body portion 140 of the device illustrated inFIGS. 1A and 1B , according to an embodiment. When assembled, thearm 121 protrudes from thefirst aperture 125 and extends through thesecond aperture 145. A bearingassembly 142 may be secured to the top of thearm 121 and configured to hold thearm 121 in place within the hollow of thesecond body portion 140. -
FIG. 4 is an example exploded perspective view of thesecond body portion 140 of the device for performing emotional gestures according to an embodiment. When fully assembled, theneck 121 extends into the interior volume of the hollowsecond body portion 140 and is attached thereto. In an embodiment, ahead motor assembly 142 is disposed within the second body portion and connected to theneck 121. Thehead motor assembly 143 may be configured to allow for additional movement of thesecond body portion 140 with respect to thefirst body portion 120. Aconnector 144 allows for a connection between thehead motor assembly 143 and a controller (not shown). In an embodiment, amotherboard 146 is in communication with thehead motor assembly 143 and theconnector 144, and is configured to be controlled via the controller. -
- FIG. 5 is an example block diagram of a controller 500 of the device 100 implemented according to an embodiment. In an embodiment, the controller 500 is disposed within the base 110 of the device 100. In another embodiment, the controller 500 is placed within the hollow of the first body portion 120 or the second body portion 140 of the device 100. The controller 500 includes processing circuitry 510 that is configured to control at least the motion of the various electro-mechanical segments of the device 100. The processing circuitry 510 may be realized as one or more hardware logic components and circuits. For example, and without limitation, illustrative types of hardware logic components that can be used include field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), and the like, or any other hardware logic components that can perform calculations or other manipulations of information.
- The controller 500 further includes a memory 520. The memory 520 may contain therein instructions that, when executed by the processing circuitry 510, cause the controller 500 to execute actions, such as performing a motion of one or more portions of the device 100, receiving an input from one or more sensors, displaying a light pattern, and the like. According to an embodiment, the memory 520 may store therein user information, e.g., data associated with a user's behavior pattern. The memory 520 is further configured to store software. Software shall be construed broadly to mean any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions cause the processing circuitry 510 to perform the various processes described herein. Specifically, the instructions, when executed, cause the processing circuitry 510 to cause the first body portion 120, the second body portion 140, the electro-mechanical member 130, and the arm 121 of the device 100 to perform emotional gestures as described herein. In a further embodiment, the memory 520 may further include a memory portion (not shown) including the instructions.
- The controller 500 further includes a communication interface 530 which is configured to perform wired 532 communications, wireless 534 communications, or both, with external components, such as a wired or wireless network, wired or wireless computing devices, and so on. The communication interface 530 may be configured to communicate with the user device to receive data and instructions therefrom.
- The controller 500 may further include an input/output (I/O) interface 540 that may be utilized to control the various electronics of the device 100, such as the sensors 550, including sensors on the device 100, sensors on the user device 150, the electro-mechanical member 130, and more. The sensors 550 may include, but are not limited to, environmental sensors, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, and a touch detector, one or more of which may be configured to sense and identify real-time data associated with a user. For example, a motion detector may sense movement, and a proximity sensor may detect that the movement is within a predetermined distance of the device 100. As a result, instructions may be sent to light up the illumination system of the device 100 and raise the second body portion 140, mimicking a gesture indicating attention or interest.
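As a non-limiting illustration of the sensor-driven behavior described above, the following Python sketch shows how a single sensor snapshot (motion plus proximity) might be mapped to the "attention" actions of illuminating the device and raising the second body portion. The data structure, threshold, and action names are illustrative assumptions rather than the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorReading:
    """Hypothetical snapshot of the sensors 550; field names are illustrative."""
    motion_detected: bool
    proximity_cm: float

ATTENTION_DISTANCE_CM = 150.0  # assumed "predetermined distance"

def on_sensor_reading(reading: SensorReading) -> List[str]:
    """Return the device actions implied by one sensor snapshot."""
    actions: List[str] = []
    if reading.motion_detected and reading.proximity_cm <= ATTENTION_DISTANCE_CM:
        # Movement close to the device: mimic a gesture of attention or interest.
        actions.append("illuminate")
        actions.append("raise_second_body_portion")
    return actions

if __name__ == "__main__":
    print(on_sensor_reading(SensorReading(motion_detected=True, proximity_cm=90.0)))
    # -> ['illuminate', 'raise_second_body_portion']
    print(on_sensor_reading(SensorReading(motion_detected=True, proximity_cm=400.0)))
    # -> []
```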
- According to an embodiment, the real-time data may be saved and stored within the device 100, e.g., within the memory 520, and may be used as historical data to assist with identifying behavior patterns, changes occurring in behavior patterns, and the like.
- As a non-limiting example, the controller 500 may determine, based on sensory input from a sensor 550, that a certain emotional gesture is appropriate based on identification of a specific user behavior. As a result, the controller 500 may cause the first body portion 120, the electro-mechanical member 130, and the second body portion 140 to perform one or more movements that may be interpreted by the user as one or more emotional gestures.
- Methods implemented by the device 100 may be utilized for several purposes. One example purpose is performing an electro-mechanical gesture upon receipt of an electronic message, based on identification of information in the electronic message and collection of data with respect to a user's state.
- According to another example method, the device 100 may be further utilized to perform an electro-mechanical gesture in response to receipt of an electronic recommendation. The method may be configured to receive an electronic recommendation, collect data related to the user, and analyze the data and the information associated with the recommendation to determine a proactive electro-mechanical gesture associated with at least one emotion. Then, the at least one electro-mechanical gesture is performed. According to another exemplary method executed using the device 100, the device 100 may be configured to respond to detected loneliness of a user.
- For example, the device 100 may be configured to detect loneliness of a user using predetermined loneliness profiles based on various parameters including, e.g., identifying the amount of time a user has been stationary watching television, sleeping, sitting in a chair without significant movement, and the like. The controller of the device may be configured to select at least one electro-mechanical gesture from a predetermined set of electro-mechanical gestures based on the detected loneliness profile. The gestures may be adapted for various users. Thus, different users may experience different electro-mechanical gestures from the device based on identical parameters. As an example gesture, where a user is identified as lonely, the electro-mechanical gesture may include rotating the first body portion 120 towards the user, moving the second body portion 140 forward and towards the user, and causing the illumination system to illuminate and the audio system to play music.
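A minimal Python sketch of the loneliness-profile matching and per-user gesture adaptation described above follows. The profile thresholds, user identifiers, and gesture names are illustrative assumptions; the point is that identical parameters can yield different gesture sets for different users.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class LonelinessProfile:
    """Hypothetical predetermined profile; thresholds are illustrative."""
    name: str
    min_stationary_minutes: float     # time sitting/watching TV without movement
    min_minutes_since_speech: float

PROFILES: List[LonelinessProfile] = [
    LonelinessProfile("mildly_lonely", min_stationary_minutes=120, min_minutes_since_speech=180),
    LonelinessProfile("lonely", min_stationary_minutes=240, min_minutes_since_speech=480),
]

# Per-user gesture preferences: identical parameters may yield different gestures.
USER_GESTURES: Dict[str, Dict[str, List[str]]] = {
    "user_a": {"lonely": ["rotate_toward_user", "lean_forward", "illuminate", "play_music"]},
    "user_b": {"lonely": ["rotate_toward_user", "lean_forward"]},  # prefers a quieter device
}

def match_profile(stationary_minutes: float, minutes_since_speech: float) -> Optional[LonelinessProfile]:
    """Return the most specific profile whose thresholds are all met."""
    matched = [p for p in PROFILES
               if stationary_minutes >= p.min_stationary_minutes
               and minutes_since_speech >= p.min_minutes_since_speech]
    return max(matched, key=lambda p: p.min_stationary_minutes) if matched else None

def select_gestures(user_id: str, stationary_minutes: float, minutes_since_speech: float) -> List[str]:
    profile = match_profile(stationary_minutes, minutes_since_speech)
    if profile is None:
        return []
    return USER_GESTURES.get(user_id, {}).get(profile.name, ["rotate_toward_user"])

if __name__ == "__main__":
    print(select_gestures("user_a", stationary_minutes=300, minutes_since_speech=600))
    print(select_gestures("user_b", stationary_minutes=300, minutes_since_speech=600))
```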
- FIG. 6 is an example flowchart of a method 600 for performing an emotional gesture based on a received electronic message according to an embodiment. At S610, an electronic message is received by the device. The electronic message may include, for example, a short message service (SMS) message, a multimedia messaging service (MMS) message, an email, a voice mail, a message sent over a social network, and the like. The electronic message may be received over a network, such as a local area network (LAN), a wireless network, the Internet, and the like.
- The received message may include both content and metadata, where metadata is data describing the content. The content may include text, such as alphanumeric characters, words, sentences, queries, an image, a picture, a video, an audio recording, a combination thereof, and the like. The metadata of the electronic message can include information about a sender of the message, a device or phone number associated with a device from which the electronic message was received, a time the message was initially sent, a time the message was received, the size of the message, and the like.
- At S620, the received electronic message is analyzed. In one embodiment, the analysis includes identifying the content and the metadata of the electronic message and determining the intent of a sender of the message based on the content and the metadata. The intent may include asking a question to an intended recipient, determining the current state of an intended recipient, e.g., the current emotional state of the intended recipient, sending a reminder to an intended recipient, and the like. In an embodiment, the intended recipient is a user of the
device 100, discussed above.
- In another embodiment, the analysis further includes identifying the sender of the electronic message. The identity of the sender may include a name of the sender, the relationship between the sender and the intended recipient (also referred to as the user), and the like. For example, a sender may be identified as a family relative, a friend, a co-worker, a social worker, and so on, of the user. The identity of the sender may be useful in determining the sender's intent. For example, based on historical data associated with the user, including previously received messages, if the sender is determined to be a child of the user and the message has been sent at a time when many previous messages have been sent, it may be determined that the child is checking in on the wellbeing of the user. The historical data may be retrieved from a database, as discussed below.
- As an example, the sender of an electronic message may be identified as the daughter of the intended recipient, where the electronic message is determined to have been sent through a messaging service, e.g., Facebook® Messenger, where the sender is associated with a user account previously known to be associated with the daughter of the intended recipient. According to the same example, the content of the electronic message may be analyzed to determine that the intent of the daughter is to confirm that the user and sender are meeting for dinner on the upcoming Sunday at 6 pm at a particular restaurant.
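The following Python sketch illustrates one possible (assumed) way to combine content and metadata when identifying the sender and inferring intent, along the lines of the daughter/dinner example above. The account names, keyword heuristics, and field names are hypothetical and are used only to make the analysis step concrete.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class ElectronicMessage:
    """Hypothetical message structure: content plus metadata."""
    content: str
    sender_account: str
    channel: str       # e.g. "sms", "messenger", "email"
    sent_hour: int     # hour of day the message was sent

# Known accounts mapped to the sender's relationship with the user (historical data).
KNOWN_SENDERS: Dict[str, str] = {"jane.doe.account": "daughter"}

def identify_sender(msg: ElectronicMessage) -> str:
    return KNOWN_SENDERS.get(msg.sender_account, "unknown")

def infer_intent(msg: ElectronicMessage, relationship: str) -> str:
    """Very rough keyword/metadata heuristic for the sender's intent."""
    text = msg.content.lower()
    if any(word in text for word in ("dinner", "meet", "sunday", "restaurant")):
        return "confirm_plan"
    if "?" in text:
        return "question"
    if relationship in ("daughter", "son") and 17 <= msg.sent_hour <= 21:
        # Relatives often check in during the evening (assumed pattern).
        return "check_in"
    return "general"

if __name__ == "__main__":
    msg = ElectronicMessage("Are we still on for dinner Sunday at 6 at the usual restaurant?",
                            "jane.doe.account", "messenger", sent_hour=18)
    who = identify_sender(msg)
    print(who, infer_intent(msg, who))  # -> daughter confirm_plan
```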
- At S630, data of the intended recipient, or the user, is received. The data may include real-time data or historical data. Real-time data may be information associated with a user's current state, and may be accessed and determined via one or more sensors, e.g., the
sensors 550 discussed above in FIG. 5. As mentioned above, the sensors may be connected to a device and may include an environmental sensor, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, a touch detector, a combination thereof, and the like. As an example, real-time data may include the frequency of motion detected from the user within a room over a predetermined period of time, e.g., the previous three hours, based on data received via a proximity sensor and a motion sensor.
- Historical data may include previously recorded real-time data of the user, attributes associated with the user, and the like, and be indicative of the user's behavior patterns and preferences based on previously determined data. For example, historical data may indicate that the user's sense of hearing is in a poor condition. The historical data may be stored in, for example, a memory, e.g., the
memory 520 of the device of FIG. 5, a database, a cloud database, and the like.
- At optional S640, an interaction objective is determined. The real-time and historical data may be utilized to determine at least one interaction objective. The interaction objectives are desired goals related to a determined user state to be achieved by a proactive interaction based on the received electronic message and user data. For example, if, based on the analyzed data, a user's emotional state is determined to be lonely, at least one interaction objective may be to improve the user's state and minimize perceived loneliness by performing at least one electro-mechanical gesture related to the received electronic message. The determination of the interaction objectives may be achieved by collecting and analyzing the data associated with the user from, for example, a plurality of sensors, social networks used by the user, previous user data stored on a database, and the like, in order to identify the current user state. The electro-mechanical gestures can be adjusted for each individual user. For example, if it is determined that a user has been without human interaction for 24 hours, the electro-mechanical gesture provided in response to receipt of an electronic message, e.g., a check-in message from a relative, will be different from the electro-mechanical gesture provided in case the user has been entertaining company.
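As a non-limiting sketch of S640, the Python snippet below derives an interaction objective from summarized real-time and historical data and then adjusts gesture parameters for the individual user. The field names, thresholds, and objective labels are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class UserData:
    """Hypothetical summary of real-time and historical user data."""
    hours_without_human_interaction: float
    currently_has_company: bool
    hearing_impaired: bool   # e.g., known from historical data

def determine_interaction_objective(data: UserData) -> str:
    """Map the user's determined state to a desired interaction objective."""
    if data.currently_has_company:
        return "do_not_disturb"        # keep gestures subtle while guests are present
    if data.hours_without_human_interaction >= 24:
        return "reduce_loneliness"     # proactive, engaging gestures
    return "acknowledge_message"       # a brief acknowledgment is enough

def adjust_for_user(objective: str, data: UserData) -> dict:
    """Tune gesture parameters to the individual user."""
    return {
        "objective": objective,
        "use_audio": not data.hearing_impaired,  # rely on light/motion if hearing is poor
        "gesture_intensity": "high" if objective == "reduce_loneliness" else "low",
    }

if __name__ == "__main__":
    lonely_user = UserData(30, currently_has_company=False, hearing_impaired=True)
    print(adjust_for_user(determine_interaction_objective(lonely_user), lonely_user))
```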
- At S650, at least one electro-mechanical gesture to be performed is determined based on the analysis of the electronic message and the analysis of the data of the user. The electro-mechanical gesture may include at least one of the electro-mechanical gestures determined to achieve the interaction objective. The electro-mechanical gesture may include emotional gestures that are predefined movements that mimic or are similar to certain gestures of humans, as described herein above.
- At S660, the determined at least one electro-mechanical gesture is performed, e.g., by the device described in
FIGS. 1A-5.
- At optional S670, a response to the electronic message is sent, e.g., to the sender. In an embodiment, the response may indicate that the electronic message has been received by the user. In an additional embodiment, the response may include a reaction or reply from the user intended to be sent to the sender. The response may be sent, e.g., over a network, and may include text, such as alphanumeric characters, words, sentences, queries, an image, a picture, a video, an audio recording, a combination thereof, and the like.
- At S680, it is determined if additional electronic messages have been received. If so, execution continues at S610; otherwise execution terminates.
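Putting the steps of method 600 together, the following Python sketch walks one message through S620-S670 against a stub device. Every method on the stub is an assumption introduced for illustration; it is not the controller's actual interface.

```python
class StubDevice:
    """Minimal stand-in for the device 100; every method here is an assumption."""
    def analyze_message(self, message):
        return message.get("intent", "check_in")
    def analyze_user(self, user_data):
        return "lonely" if user_data.get("hours_alone", 0) > 24 else "ok"
    def determine_objective(self, state):
        return "reduce_loneliness" if state == "lonely" else "acknowledge"
    def select_gestures(self, intent, state, objective):
        return ["rotate_toward_user", "lean_forward"] if objective == "reduce_loneliness" else ["nod"]
    def perform(self, gesture):
        print(f"performing {gesture}")
    def send_response(self, message, acknowledged):
        print(f"response sent: acknowledged={acknowledged}")

def handle_electronic_message(message: dict, user_data: dict, device: StubDevice) -> None:
    """Hypothetical pass over S620-S670 of method 600."""
    intent = device.analyze_message(message)                           # S620: content + metadata
    state = device.analyze_user(user_data)                             # S630: real-time + historical data
    objective = device.determine_objective(state)                      # S640 (optional)
    for gesture in device.select_gestures(intent, state, objective):   # S650-S660
        device.perform(gesture)
    device.send_response(message, acknowledged=True)                   # S670 (optional)

if __name__ == "__main__":
    handle_electronic_message({"intent": "check_in"}, {"hours_alone": 30}, StubDevice())
```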
-
FIG. 7 is an example flowchart of a method 700 for performing an emotional gesture based on a recommendation according to an embodiment. At S710, a recommendation is received. The recommendation may include an instruction to perform an electro-mechanical gesture, to display a multimedia content item, to provide a link to a multimedia content item, and the like. In an embodiment, the recommendation is received from a person, e.g., from a user device over a network. In a further embodiment, the recommendation is received from a recommendation generator, which is configured to generate a recommendation based on data associated with a user. The recommendation generator may be the controller discussed in FIG. 5 above.
- At S720, data of the intended recipient, or the user, is received. The data may include real-time data or historical data. Real-time data may be information associated with a user's current state, and may be accessed and determined via one or more sensors, e.g., the
sensors 550 discussed above in FIG. 5. As mentioned above, the sensors may be connected to a device and may include an environmental sensor, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, a touch detector, a combination thereof, and the like. As an example, real-time data may include the frequency of motion detected from the user within a room over a predetermined period of time, e.g., the previous three hours, based on data received via a proximity sensor and a motion sensor.
- Historical data may include previously recorded real-time data of the user, attributes associated with the user, and the like, and be indicative of the user's behavior patterns and preferences based on previously determined data. For example, historical data may indicate that the user's sense of hearing is in a poor condition. The historical data may be stored in, for example, a memory, e.g., the
memory 520 of the device of FIG. 5, a database, a cloud database, and the like.
- At S730, the data and the electronic recommendation are analyzed to determine an action to be performed based thereon. In an embodiment, the analysis includes determining a user's emotional state, and determining an appropriate action based on the determined emotional state and the received recommendation. In an embodiment, the action includes an electro-mechanical gesture, e.g., an emotional gesture, to be performed. For example, if the recommendation includes a display of a video, and it is determined that a user is currently in a lonely state, the electro-mechanical gesture may include causing the device to display a show of interest toward the user, and displaying a link to an uplifting video within a display of the device. In an embodiment, the action does not include an electro-mechanical gesture.
- At S740, the action is performed. In an embodiment, the action is performed by a device configured to perform emotional gestures, e.g., the device discussed above in
FIGS. 1A-5. The electro-mechanical gesture may include moving a first body portion and a second body portion to mimic human emotional gestures, as discussed above. Further, the action may include displaying multimedia content, e.g., based on the received recommendation, in tandem with one or more electro-mechanical gestures. In an embodiment, an electro-mechanical gesture is configured to create an emotional reaction by the user. For example, an electro-mechanical gesture may include rotating a first body portion slowly towards the user and moving the second body portion slowly forward, imitating a human gesture that expresses an interest in a companion's feelings.
- At S750, a user response to the action is determined. The response may be determined based on data collected from a plurality of sensors. A user response includes at least one of: a response to the electro-mechanical gesture, and a response to the electronic recommendation. For example, if the user's response to a recommended video was expressed by smiling and laughing, the user's state may be determined to be positive, and the video, the type of video, the time of the suggestion, and the like, may be stored in a storage, e.g., in a database, for managing future recommendations.
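A short Python sketch of the recommendation flow follows: S730 combines the recommendation with the determined emotional state to produce an action, and S750 records the observed reaction for managing future recommendations. The dictionary keys, gesture names, and example URL are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RecommendationLog:
    """Hypothetical store of recommendation outcomes used for future suggestions."""
    entries: List[dict] = field(default_factory=list)

def determine_action(recommendation: dict, emotional_state: str) -> List[str]:
    """Combine the recommendation with the user's determined state (S730)."""
    action: List[str] = []
    if emotional_state == "lonely":
        action += ["rotate_toward_user", "lean_forward"]   # show interest in the user
    if recommendation.get("type") == "video":
        action.append(f"display_link:{recommendation['url']}")
    return action

def record_response(log: RecommendationLog, recommendation: dict, response: str) -> None:
    """S750: keep the observed reaction for managing future recommendations."""
    log.entries.append({"recommendation": recommendation, "response": response})

if __name__ == "__main__":
    rec = {"type": "video", "url": "https://example.com/uplifting"}
    print(determine_action(rec, "lonely"))
    log = RecommendationLog()
    record_response(log, rec, "smiled_and_laughed")
    print(log.entries)
```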
-
FIG. 8 is an example flowchart of a method for performing an emotional gesture based on a determined user emotional state, according to an embodiment. At S810, real-time indicators that indicate the current state of a user are received. Real-time indicators may be received from sensors, i.e., the sensors discussed above, including an environmental sensor, a camera, a microphone, a motion detector, a proximity sensor, a light sensor, a temperature sensor, a touch detector, a combination thereof, and the like. Examples of real-time indicators include whether user motions have been detected over a period of time, e.g., by a motion sensor, whether conversations have taken place over a period of time, e.g., by a microphone, and the like.
- At S820, the real-time indicators related to a current state of a user are analyzed. In an embodiment, the analysis includes comparing the received real-time indicators with previously determined user profiles. For example, based on detected motions, it may be determined whether the user is currently awake or asleep, whether there is current user movement, whether there has been user movement over a previously determined period of time, whether the user has watched television without moving for a predetermined period of time, and the like. The analysis may further include comparing the determined current emotional state of a user to a plurality of predetermined profiles. As a non-limiting example, a plurality of loneliness profiles may be accessed via a database, where each profile includes a set of parameters that may indicate loneliness. The parameters may include data associated with real-time indicators, including the amount of time a user has been idle, the amount of time between movements, the amount of time between conversations involving the user, and the like.
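As a non-limiting sketch of S810-S830, the Python snippet below reduces timestamped motion and speech events to idle-time features and compares them against simple, assumed profile thresholds to classify the user's current state. The thresholds and state labels are illustrative assumptions.

```python
from datetime import datetime, timedelta
from typing import List

def minutes_since_last(events: List[datetime], now: datetime) -> float:
    """Minutes since the most recent event, or a very large value if none were seen."""
    return (now - max(events)).total_seconds() / 60 if events else float("inf")

def classify_state(motion_events: List[datetime],
                   speech_events: List[datetime],
                   now: datetime) -> str:
    """Compare real-time indicators against assumed profile thresholds (S820-S830)."""
    idle_minutes = minutes_since_last(motion_events, now)
    quiet_minutes = minutes_since_last(speech_events, now)
    if idle_minutes > 180 and quiet_minutes > 360:
        return "lonely"
    if idle_minutes > 60:
        return "resting"
    return "active"

if __name__ == "__main__":
    now = datetime(2018, 1, 9, 20, 0)
    motion = [now - timedelta(hours=4)]   # last movement four hours ago
    speech = [now - timedelta(hours=8)]   # last conversation eight hours ago
    print(classify_state(motion, speech, now))  # -> lonely
```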
- At S830, a user's current emotional state is determined based on the analysis, e.g., if the user is determined to be in a lonely state.
- At S840, at least one electro-mechanical gesture is selected to be performed based on the determined user's emotional state. The gesture may be selected from a plurality of electro-mechanical gestures based on the user's emotional state, where the electro-mechanical gestures may be associated with predetermined profiles associated with at least one emotion. In an embodiment, the plurality of electro-mechanical gestures may be dynamically updated based on, for example, a user's reactions to the selected electro-mechanical gesture as identified by sensors. Thus, different users may experience different electro-mechanical gestures based on identical circumstances.
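The dynamic updating described above can be sketched as a simple per-user scoring loop: gestures that draw positive reactions are reinforced, with occasional exploration so the gesture set can adapt over time. The scoring scheme and gesture names below are illustrative assumptions, not the disclosed selection logic.

```python
from collections import defaultdict
import random
from typing import Dict, List, Tuple

# Per-(user, emotional state) scores for each candidate gesture; all values are illustrative.
scores: Dict[Tuple[str, str], Dict[str, float]] = defaultdict(lambda: defaultdict(float))

CANDIDATES: List[str] = ["rotate_toward_user", "lean_forward", "play_music", "illuminate"]

def select_gesture(user_id: str, state: str, epsilon: float = 0.1) -> str:
    """Pick the best-scoring gesture, exploring occasionally so the set can adapt."""
    if random.random() < epsilon:
        return random.choice(CANDIDATES)
    table = scores[(user_id, state)]
    return max(CANDIDATES, key=lambda g: table[g])

def update_from_reaction(user_id: str, state: str, gesture: str, reaction_positive: bool) -> None:
    """Adjust the score based on the reaction identified by the sensors."""
    scores[(user_id, state)][gesture] += 1.0 if reaction_positive else -1.0

if __name__ == "__main__":
    random.seed(0)
    update_from_reaction("user_a", "lonely", "play_music", reaction_positive=True)
    update_from_reaction("user_b", "lonely", "play_music", reaction_positive=False)
    # Identical circumstances, different users, potentially different gestures.
    print(select_gesture("user_a", "lonely"), select_gesture("user_b", "lonely"))
```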
- As an example, if a user is identified as currently having a lonely state, at least one electro-mechanical gesture may be selected to mimic an emotional gesture, such as, for example, rotating a first body segment of a device, e.g., the device of
FIGS. 1A-5, towards the user, moving a second body portion forward and towards the user, and the like, to display a gesture of interest. According to another embodiment, certain multimedia items may be displayed or played, e.g., a light turned on, a video played, music played, and the like.
- At S850, each of the selected electro-mechanical gestures is performed. At S860, it is checked whether more real-time indicators have been received, and if so, execution continues with S810; otherwise, execution terminates.
- The various embodiments disclosed herein can be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium is any computer readable medium except for a transitory propagating signal.
- As used herein, the phrase “at least one of” followed by a listing of items means that any of the listed items can be utilized individually, or any combination of two or more of the listed items can be utilized. For example, if a system is described as including “at least one of A, B, and C,” the system can include A alone; B alone; C alone; A and B in combination; B and C in combination; A and C in combination; or A, B, and C in combination.
- All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the disclosed embodiment and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosed embodiments, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.