
CN108406803B - Interaction system and interaction method of multi-user and service robot - Google Patents

Interaction system and interaction method of multi-user and service robot

Info

Publication number
CN108406803B
CN108406803B
Authority
CN
China
Prior art keywords
wearing
unit
control system
positioning
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810380483.0A
Other languages
Chinese (zh)
Other versions
CN108406803A (en)
Inventor
黄玲 (Huang Ling)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Zegan Biotechnology Co ltd
Original Assignee
Shanghai Zegan Biotechnology Co ltd
Suzhou Santi Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Zegan Biotechnology Co ltd, Suzhou Santi Intelligent Technology Co ltd filed Critical Shanghai Zegan Biotechnology Co ltd
Priority to CN201810380483.0A priority Critical patent/CN108406803B/en
Priority to CN201911422628.XA priority patent/CN111037583A/en
Publication of CN108406803A publication Critical patent/CN108406803A/en
Application granted
Publication of CN108406803B publication Critical patent/CN108406803B/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

The invention discloses an interaction system and an interaction method for multiple users and a service robot. The interaction system comprises a master control part arranged on the service robot and a plurality of wearing parts worn by the users. The master control part comprises a control system, a communication unit and a positioning unit. Each wearing part comprises a sub-communication unit capable of communicating with the communication unit, a receiver for receiving the infrared light emitted by the infrared-light-generating elements, a positioning micro base station, a positioning tag and a micro-processing unit; the sub-communication units of different wearing parts can communicate with one another. Through signal exchange between the master control part on the service robot and the wearing parts worn by the users, the robot can track each user's position and speaking state, so that during guided tours and presentations it can effectively handle complex situations such as users falling behind or speaking out of turn, giving users a good experience.

Description

Interaction system and interaction method of multiple users and service robot
Technical Field
The invention relates to the technical field of robots, and in particular to human-robot interaction.
Background
At present, human-computer interaction is an important research area in robotics. Many interaction methods already exist, mainly voice interaction, touch interaction, input interaction, and combinations of these forms. However, existing human-robot interaction is still largely designed for one-to-one situations. In one-to-many situations, such as a tour-guide robot that must lead a group of users around while explaining exhibits, the robot cannot control situations in which users fall behind or speak out of turn, because users act on their own initiative; a satisfactory interaction system and method for this case has therefore been lacking.
Disclosure of Invention
Object of the invention: to overcome the above defects in the prior art, the invention provides an interaction system and an interaction method for multiple users and a service robot that are suitable for one-to-many human-machine interaction.
Technical solution: to achieve this object, the interaction system of multiple users and a service robot comprises a master control part arranged on the service robot and a plurality of wearing parts worn by the users.
The master control part comprises a control system, a communication unit and a positioning unit, the communication unit and the positioning unit both being electrically connected with the control system.
Each wearing part comprises a sub-communication unit capable of communicating with the communication unit, a receiver used in cooperation with the positioning unit, a positioning micro base station, a positioning tag and a micro-processing unit; the sub-communication units of different wearing parts can communicate with one another.
The positioning unit comprises a first light emitter capable of generating infrared light and a container with a vertical slit; the first light emitter is arranged in the container, and the container can rotate about a vertical axis under the action of a rotation driving unit. The positioning unit further comprises a second light emitter capable of generating infrared light, and the receiver can sense infrared light.
Furthermore, the wearing part also comprises an earphone unit and a microphone unit, and the earphone unit and the microphone unit are both connected with the micro-processing unit.
Further, the wearing part also comprises a vibration unit which is connected with the micro-processing unit.
Further, the wearing part also comprises a button unit which is connected with the micro-processing unit.
The interaction method based on this interaction system of multiple users and a service robot comprises the following steps:
Step one: the control system turns on the first light emitter and drives the rotation driving unit, thereby rotating the container in which the first light emitter is arranged.
Step two: each wearing part has a unique number. When the receiver of a wearing part receives the infrared light generated by the first light emitter, that wearing part generates feedback information and sends it to the control system; the feedback information comprises the number of the wearing part and the time at which it received the infrared light. From the feedback information and the known relation between the rotation angle of the rotation driving unit and time, the control system calculates the bearing of each wearing part that received the infrared light.
Step three: the control system turns on the second light emitter. Each wearing part that previously received the infrared light of the first light emitter, upon receiving the infrared light of the second light emitter, again sends its number and the reception time to the control system as second feedback information. From the turn-on time of the second light emitter and the second feedback information, the control system obtains the distance between each such wearing part and the positioning unit; combined with the bearing calculated in step two, the position of the wearing part is known. Among the wearing parts whose positions are known, the control system designates three as position reference points.
Step four: the control system sends an instruction to all wearing parts other than the three position reference points, so that the positioning tag of each of those wearing parts sends positioning information to the positioning micro base stations of the three reference wearing parts; the positioning information comprises the number of the wearing part and the time at which it sent the information.
Step five: the three reference wearing parts receive the positioning information, append the reception time to each piece of positioning information, and forward it to the control system.
Step six: from the positioning information and the known positions of the three reference wearing parts, the control system calculates the positions of the remaining wearing parts, obtaining the position distribution of all wearing parts.
Further, the control system derives the distribution of the users wearing the wearing parts from the position distribution of all wearing parts, controls the moving speed of the service robot accordingly, and, when necessary, stops the robot to wait until all wearing parts are within a set range before continuing the guided tour.
Further, the wearing part also comprises an earphone unit and a microphone unit, both connected with the micro-processing unit; the control system manages each wearing part's right to listen to information through the earphone unit and its right to speak through the microphone unit.
Further, the wearing part also comprises a vibration unit connected with the micro-processing unit; when the control system, from the distribution of the users, determines that some users have fallen seriously behind, it can remind them to catch up with the group through the vibration units of the wearing parts they wear.
Further, the wearing part also comprises a button unit; when a user wants to ask a question, a request signal can be sent to the control system through the button unit, and once the control system accepts the request, the microphone unit of the wearing part that sent the request starts collecting the voice signal.
Beneficial effects: through signal exchange between the master control part on the service robot and the wearing parts worn by the users, this interaction system and method enable the robot to track each user's position and speaking state, so that during guided tours and presentations it can effectively handle complex situations such as users falling behind or speaking out of turn, giving users a good experience.
Drawings
FIG. 1 is a system component diagram of an interactive system of multiple users and a service robot;
FIG. 2 is a view showing an installation position of the positioning unit;
FIG. 3 is a block diagram of a positioning unit;
fig. 4 is a distribution diagram of the general control part and the plurality of wearing parts.
Detailed Description
The present invention will be further described with reference to the accompanying drawings.
The interactive system of the multi-user and the service robot as shown in fig. 1 comprises a main control part 100 arranged on the service robot and a plurality of wearing parts 200 worn by the user;
the general control part 100 comprises a control system 110, a communication unit 120 and a positioning unit 130, wherein the communication unit 120 and the positioning unit 130 are electrically connected with the control system 110; preferably, the positioning unit 130 is disposed on top of the robot, as shown in fig. 2.
The wearing part 200 comprises a sub-communication unit 210 capable of communicating with the communication unit 120, a receiver 220 for cooperating with the positioning unit 130, a positioning micro base station 230, a positioning tag 240 and a micro processing unit 250; the sub-communication units 210 of the different wearing portions 200 can communicate with each other. The sub-communication unit 210, the receiver 220, the positioning micro base station 230, and the positioning tag 240 are all connected to the micro processing unit 250.
As shown in fig. 3, the positioning unit 130 includes a first light emitter 131 (shown by a dotted line) that generates infrared light and a container 132 having a vertical slit; the first light emitter 131 is disposed in the container 132, the container 132 can rotate about a vertical axis under the action of the rotation driving unit 133, and the receiver 220 can sense the infrared light. The light emitted by the first light emitter 131 can therefore exit only through the vertical slit, forming a vertical beam whose spread is small in the horizontal plane and large in the vertical plane. As the container rotates about the vertical axis, this beam sweeps around the robot and reaches the receiver 220 on each wearing part 200 sooner or later; the bearing of the wearing part 200 carrying a given receiver 220 can then be inferred from the time at which that receiver 220 detects the infrared light and the orientation of the vertical slit on the container 132 at that time. The positioning unit 130 further comprises a second light emitter 134 that also generates infrared light; its emission is not shielded, so its infrared light is broadcast in all directions through 360 degrees. By turning the second light emitter 134 on and comparing its emission time with the time at which the receiver 220 on a wearing part 200 detects the light, the control system 110 can estimate the distance between that wearing part 200 and the robot. With both the bearing and the distance of a wearing part 200 known, its position is determined.
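The two timing-based estimates in this paragraph, bearing from the rotating slit sweep and distance from the omnidirectional broadcast, can be sketched as follows. This is an illustrative sketch only: the rotation speed, the propagation speed, and all function names are assumptions rather than values or code from the patent, and a light-based time-of-flight distance would in practice demand extremely fine clock resolution and synchronization.

```python
# Illustrative sketch of the positioning unit's two timing estimates.
# ROTATION_SPEED_DEG_S and the propagation speed are assumed values.

ROTATION_SPEED_DEG_S = 360.0  # assumed: one full sweep of the slit per second

def bearing_from_sweep(t_sweep_start: float, t_hit: float,
                       slit_start_deg: float = 0.0) -> float:
    """Bearing of a wearing part, inferred from when its receiver saw the
    rotating vertical beam, relative to the sweep's start orientation."""
    return (slit_start_deg + ROTATION_SPEED_DEG_S * (t_hit - t_sweep_start)) % 360.0

def distance_from_broadcast(t_emit: float, t_hit: float,
                            propagation_speed: float) -> float:
    """Distance from the second emitter, from the one-way delay of its pulse."""
    return propagation_speed * (t_hit - t_emit)

# A receiver hit 0.25 s into the sweep lies at bearing 90 degrees.
assert bearing_from_sweep(0.0, 0.25) == 90.0
```

A (bearing, distance) pair obtained this way is simply a polar coordinate of the wearing part relative to the positioning unit.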
The wearing part 200 further comprises an earphone unit 260 and a microphone unit 270, and both the earphone unit 260 and the microphone unit 270 are connected to the microprocessor unit 250.
The wearing part 200 further comprises a vibration unit 280, and the vibration unit 280 is connected with the micro-processing unit 250.
The wearing part 200 further comprises a button unit 290, the button unit 290 being connected to the microprocessor unit 250.
In the interaction method based on this interaction system of multiple users and a service robot, the robot obtains the position of each user through the following steps:
Step one: the control system 110 turns on the first light emitter 131 and drives the rotation driving unit 133, thereby rotating the container 132 in which the first light emitter 131 is arranged.
Step two: each wearing part 200 has a unique number. When the receiver 220 of a wearing part 200 receives the infrared light generated by the first light emitter 131, that wearing part generates feedback information and sends it to the control system 110 via communication between the sub-communication unit 210 and the communication unit 120; the feedback information comprises the number of the wearing part 200 and the time at which it received the infrared light. From the feedback information and the known relation between the rotation angle of the rotation driving unit 133 and time, the control system 110 calculates the bearing of each wearing part 200 that received the infrared light (i.e., the direction of the wearing part 200 relative to the main control part 100, represented by the parameter θ in fig. 4).
Step three: the control system 110 turns on the second light emitter 134. Each wearing part 200 that previously received the infrared light of the first light emitter 131, upon receiving the infrared light of the second light emitter 134, again sends its number and the reception time to the control system 110 as second feedback information. From the turn-on time of the second light emitter 134 and the second feedback information, the control system 110 obtains the distance D (fig. 4) between each such wearing part 200 and the positioning unit 130; combined with the bearing estimated in step two, the position of the wearing part 200 is known. Among the wearing parts 200 whose positions are known, the control system 110 designates three as position reference points (shown as black blocks in fig. 4; the other wearing parts 200 are shown as white blocks).
Step four: the control system 110 sends an instruction to all wearing parts 200 other than the three position reference points, so that the positioning tag 240 of each of those wearing parts sends positioning information to the positioning micro base stations 230 of the three reference wearing parts 200; the positioning information comprises the number of the wearing part 200 and the time at which it sent the information.
Step five: the three reference wearing parts 200 receive the positioning information, append the reception time to each piece of positioning information, and forward it to the control system 110.
Step six: from the positioning information and the known positions of the three reference wearing parts 200, the control system 110 estimates the positions of the remaining wearing parts 200 (first obtaining the distances of each non-reference wearing part 200 to the three reference wearing parts 200, i.e., D1, D2 and D3 in fig. 4, and then estimating its position from the positions of the reference wearing parts), obtaining the position distribution of all wearing parts 200.
The control system 110 derives the distribution of the users wearing the wearing parts 200 from the position distribution of all wearing parts 200, controls the moving speed of the service robot accordingly, and, when necessary, stops the robot to wait until all wearing parts 200 are within a set range (i.e., within the circle of a certain radius centered on the service robot, shown as the dashed circle in fig. 4) before continuing the guided tour.
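This pacing rule, slowing down as the group spreads out and stopping when someone leaves the set range, could be sketched as below. The thresholds and the linear speed ramp are assumptions for illustration only; the patent leaves the control law unspecified.

```python
# Illustrative pacing rule for the service robot. MAX_SPEED and
# WAIT_RADIUS are assumed values, not taken from the patent.

MAX_SPEED = 1.0     # m/s, assumed cruise speed of the robot
WAIT_RADIUS = 10.0  # m, assumed "set range" around the robot

def plan_speed(distances):
    """Choose the robot's speed from the farthest wearing part's distance."""
    farthest = max(distances)
    if farthest > WAIT_RADIUS:
        return 0.0  # someone left the set range: stop and wait
    # Otherwise scale speed down linearly as the group spreads out.
    return MAX_SPEED * (1.0 - farthest / WAIT_RADIUS)

assert plan_speed([2.0, 3.0, 12.0]) == 0.0  # a straggler: wait
assert plan_speed([0.0]) == MAX_SPEED       # everyone close: full speed
```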
The wearing part 200 further comprises an earphone unit 260 and a microphone unit 270, both connected with the micro-processing unit 250. The control system 110 manages each wearing part's right to listen to information through the earphone unit 260 and its right to speak through the microphone unit 270. In this way the users' speech can be effectively managed, the problem of extracting speech in a noisy environment is avoided, and the control system 110 of the robot can relay the speech of a single user to the other users through the earphone units 260 of their wearing parts 200.
The wearing part 200 further comprises a vibration unit 280 connected with the micro-processing unit 250. When the control system 110, from the distribution of the users wearing the wearing parts 200, determines that some users have fallen seriously behind, it can remind them to catch up with the group through the vibration units 280 of the wearing parts 200 they wear.
The wearing part 200 further comprises a button unit 290. When a user wants to ask a question, a request signal can be sent to the control system 110 through the button unit 290; once the control system 110 accepts the request, the microphone unit 270 of the wearing part 200 that sent the request starts collecting the voice signal.
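The button-request flow is essentially floor control: requests queue up, and only the wearing part currently granted the floor has an open microphone. A hypothetical sketch of that logic follows; the class and method names are illustrative, not from the patent.

```python
# Hypothetical floor-control logic for question requests from button units.

class FloorControl:
    def __init__(self):
        self.speaker = None  # number of the wearing part holding the floor
        self.pending = []    # queued request numbers, in arrival order

    def request(self, wearable_id):
        """A button press arrives from the given wearing part."""
        if self.speaker is None:
            self.speaker = wearable_id  # grant immediately: its mic opens
        else:
            self.pending.append(wearable_id)

    def release(self):
        """Current speaker finishes; grant the floor to the next in queue."""
        self.speaker = self.pending.pop(0) if self.pending else None

    def mic_open(self, wearable_id):
        """Only the floor holder's microphone collects speech."""
        return self.speaker == wearable_id

fc = FloorControl()
fc.request(3)
fc.request(7)
assert fc.mic_open(3) and not fc.mic_open(7)
fc.release()
assert fc.mic_open(7)
```

Relaying the granted user's speech to everyone else's earphone units would then use the listening rights the control system already manages.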
In the multi-user and service robot interaction system and interaction method described above, signal exchange between the master control part on the service robot and the wearing parts worn by the users lets the robot track each user's position and speaking state, so that during guided tours and presentations it can effectively handle complex situations such as users falling behind or speaking out of turn, giving users a good experience.
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.

Claims (4)

1. An interaction method of an interaction system of multiple users and a service robot, characterized in that:
the interactive system of the multi-user and the service robot comprises a general control part (100) arranged on the service robot and a plurality of wearing parts (200) worn by the user;
the general control part (100) comprises a control system (110), a communication unit (120) and a positioning unit (130), wherein the communication unit (120) and the positioning unit (130) are electrically connected with the control system (110);
the wearing part (200) comprises a sub-communication unit (210) which can communicate with the communication unit (120), a receiver (220) which is used for matching with the positioning unit (130), a positioning micro base station (230), a positioning tag (240) and a micro processing unit (250); the sub-communication units (210) of different wearing parts (200) can communicate with each other;
the positioning unit (130) comprises a first light emitter (131) capable of generating infrared light and a container (132) with a vertical slit; the first light emitter (131) is arranged in the container (132), and the container (132) can rotate about a vertical axis under the action of a rotation driving unit (133); the positioning unit (130) further comprises a second light emitter (134) capable of generating infrared light; the receiver (220) can sense infrared light;
the method comprises the following steps:
step one: the control system (110) turns on the first light emitter (131) and drives the rotation driving unit (133), thereby rotating the container (132) in which the first light emitter (131) is arranged;
step two: each wearing part (200) has a unique number; when the receiver (220) of a wearing part (200) receives the infrared light generated by the first light emitter (131), that wearing part (200) generates feedback information and sends it to the control system (110), the feedback information comprising the number of the wearing part (200) and the time at which it received the infrared light; from the feedback information and the relation between the rotation angle of the rotation driving unit (133) and time, the control system (110) calculates the bearing of each wearing part (200) that received the infrared light;
step three: the control system (110) turns on the second light emitter (134); each wearing part (200) that previously received the infrared light of the first light emitter (131), upon receiving the infrared light of the second light emitter (134), again sends its number and the reception time to the control system (110) as second feedback information; from the turn-on time of the second light emitter (134) and the second feedback information, the control system (110) obtains the distance between the wearing part (200) and the positioning unit (130), and, combined with the bearing estimated in step two, the position of the wearing part (200) is known; among the wearing parts (200) whose positions are known, the control system (110) designates three wearing parts (200) as position reference points;
step four: the control system (110) sends an instruction to all wearing parts (200) other than the three position reference points, so that the positioning tag (240) of each of those wearing parts (200) sends positioning information to the positioning micro base stations (230) of the three reference wearing parts (200), the positioning information comprising the number of the wearing part (200) and the time at which it sent the positioning information;
step five: the three wearing parts (200) serving as position reference points receive the positioning information, append the reception time to each piece of positioning information, and forward it to the control system (110);
step six: from the positioning information and the position information of the wearing parts (200) serving as position reference points, the control system (110) calculates the positions of the remaining wearing parts (200), obtaining the position distribution of all wearing parts (200).
2. The method of claim 1, characterized in that: the wearing part (200) further comprises an earphone unit (260) and a microphone unit (270), both connected with the micro-processing unit (250); the control system (110) manages each wearing part's (200) right to listen to information through the earphone unit (260) and its right to speak through the microphone unit (270).
3. The method of claim 1, characterized in that: the wearing part (200) further comprises a vibration unit (280) connected with the micro-processing unit (250); when the control system (110), from the distribution of the users wearing the wearing parts (200), determines that some users have fallen seriously behind, it can remind them to catch up with the group through the vibration units (280) of the wearing parts (200) they wear.
4. The method of claim 2, characterized in that: the wearing part (200) further comprises a button unit (290); when a user wants to ask a question, a request signal can be sent to the control system (110) through the button unit (290), and once the control system (110) accepts the request, the microphone unit (270) of the wearing part (200) that sent the request starts collecting the voice signal.
CN201810380483.0A 2018-04-25 2018-04-25 Interaction system and interaction method of multi-user and service robot Expired - Fee Related CN108406803B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810380483.0A CN108406803B (en) 2018-04-25 2018-04-25 Interaction system and interaction method of multi-user and service robot
CN201911422628.XA CN111037583A (en) 2018-04-25 2018-04-25 Multi-user and service robot interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810380483.0A CN108406803B (en) 2018-04-25 2018-04-25 Interaction system and interaction method of multi-user and service robot

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201911422628.XA Division CN111037583A (en) 2018-04-25 2018-04-25 Multi-user and service robot interaction system

Publications (2)

Publication Number Publication Date
CN108406803A (en) 2018-08-17
CN108406803B (en) 2020-05-19

Family

ID=63136608

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810380483.0A Expired - Fee Related CN108406803B (en) 2018-04-25 2018-04-25 Interaction system and interaction method of multi-user and service robot
CN201911422628.XA Pending CN111037583A (en) 2018-04-25 2018-04-25 Multi-user and service robot interaction system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201911422628.XA Pending CN111037583A (en) 2018-04-25 2018-04-25 Multi-user and service robot interaction system

Country Status (1)

Country Link
CN (2) CN108406803B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203118401U (en) * 2013-01-24 2013-08-07 天津恒达文博科技有限公司 Intelligent audio guide
WO2016056226A1 (en) * 2014-10-10 2016-04-14 パナソニックIpマネジメント株式会社 Autonomous travel-type cleaner
CN205720649U (en) * 2016-06-28 2016-11-23 北醒(北京)光子科技有限公司 One directly drives small rotary scanning range unit
CN206115270U (en) * 2016-08-31 2017-04-19 厦门轻游信息科技有限公司 Mutual induction type has guide robot of navigation and explanation function
CN107243909A (en) * 2017-07-26 2017-10-13 武汉盛德物联科技有限公司 Intelligent robot system for nursing
CN107608352A (en) * 2017-09-19 2018-01-19 宁波美健机器人有限公司 A kind of guide robot


Also Published As

Publication number Publication date
CN111037583A (en) 2020-04-21
CN108406803A (en) 2018-08-17

Similar Documents

Publication Publication Date Title
US11259108B2 (en) Information processing device and information processing method
CN111436040B (en) Method for triangularly positioning and retrieving Bluetooth device, Bluetooth device and positioning system
KR20200087720A (en) A method and apparatus for obtaining positioning information
US20250008023A1 (en) Systems and methods for providing headset voice control to employees in quick-service restaurants
AU2018332721A1 (en) Push to talk for the internet of things
EP1882393A1 (en) Method and system for lighting control
WO2015039581A1 (en) Positioning method based on visible light source, mobile terminal and controller
AU2017253763B2 (en) Method for operating a production plant and production plant
CN108406803B (en) Interaction system and interaction method of multi-user and service robot
CN102184007A (en) Interactive intelligent conference system based on pattern recognition and using method thereof
CN105204517A (en) Personal service method and system for small and mini-type unmanned aerial vehicles
WO2013139090A1 (en) Method, system, and related device for operating display device
WO2020026413A1 (en) Wireless terminal device and wireless power supply equipment
US10319220B2 (en) Control arrangement and control method
JP2023081259A (en) Control device for unmanned aircraft and control method thereof
JP2021023517A (en) Performance control system, method and program
KR101720144B1 (en) Voice communication system using helmet with short-range wireless communication and method thereof
CN111436020B (en) Bluetooth positioning method, Bluetooth device searching method, Bluetooth device and positioning system
CN113066218A (en) Number calling system with consulting room path guidance
CN201475018U (en) Fan display control device
KR102051233B1 (en) System for visualizing acoustics of unmanned aerial vehicle
CN210574126U (en) Visitor prompt system and prompting lamp
US20240025698A1 (en) Elevator system having elevator operating devices for passengers with limited mobility
JP2007235935A (en) Instrument controller
US20230022806A1 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200423

Address after: 201800 room jt21632, building 4, block B, 925 Yecheng Road, Jiading Industrial Zone, Jiading District, Shanghai

Applicant after: Shanghai zegan Biotechnology Co.,Ltd.

Address before: 17, No. 2588, Swan Road, 215100, Suzhou, Jiangsu, Wuzhong District

Applicant before: SUZHOU SANTI INTELLIGENT TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200519
