WO2019240330A1 - Image-based force prediction system and method therefor - Google Patents
Image-based force prediction system and method therefor
- Publication number
- WO2019240330A1 (PCT/KR2018/011808)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot arm
- motion information
- force
- time series
- interaction force
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L11/00—Measuring steady or quasi-steady pressure of a fluid or a fluent solid material by means not provided for in group G01L7/00 or G01L9/00
- G01L11/02—Measuring steady or quasi-steady pressure of a fluid or a fluent solid material by means not provided for in group G01L7/00 or G01L9/00 by optical means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
Definitions
- The deep learning algorithm can be trained by comparing the interaction force of the robot arm and the physical properties of the object, which are predicted by inputting the time series motion information of the robot arm and the actual images into the neural-network-based deep learning algorithm, with the interaction force of the robot arm and the physical properties of the object measured using a sensor.
- The deep learning algorithm of the force prediction method for the robot arm may include: extracting a region associated with the motion of the robot arm from the actual images using a convolutional neural network (CNN) into which the actual images are input; calculating class scores of the time series motion information of the robot arm using a first fully-connected (FC) layer into which the time series motion information is input; inputting the extracted region and the class scores of the time series motion information into a recurrent neural network (RNN) to learn the relationship between the region that changes over time and the time series motion information; and calculating class scores of the learning result with a second FC layer into which the learning result of the RNN is input, thereby predicting the interaction force of the robot arm and the physical property values of the object corresponding to the time series motion information and the actual images (an illustrative architecture sketch follows this list).
- The correlation between the actual images, the time series motion information of the robot arm, and the interaction force of the robot arm may be learned.
- The time series motion information of the force prediction method for the robot arm may include at least one of a change in the position of the robot arm over time and a change in the motion of the robot arm over time.
- The force predicting unit of the force learning device for the robot arm may generate a virtual image representing the change in the motion information over time in graph form, based on the time series motion information, and input it to the first convolutional layer of the CNN (a rendering sketch follows this list).
- The features of the real images output from the convolutional layers are matched using the unified layer, the matched information is classified through the FC layer, and the interaction force of the robot arm and the physical property values of the object corresponding to the time series motion information and the real images can thereby be predicted.
- Figure 7 is an example of the interaction force prediction process of the robot arm according to an embodiment of the present invention.
- the force predictor 230 may input the actual images 511 accumulated over time to the second convolutional layer 510 which is different from the first convolutional layer.
- The force predictor 230 may classify the matched information through the FC layer 540 to predict the interaction force of the robot arm and the physical properties of the object corresponding to the real-time image.
- the force predictor 630 may predict the interaction force of the robot arm based on the actual images and the time series motion information.
- The force predictor 630 may predict the interaction force of the robot arm by applying the actual images, the time series motion information, and the physical properties of the object to the database 120, in which the learned correlation between the actual images, the time series motion information, the physical properties of the object, and the interaction force of the robot arm is stored.
- Because the present invention can predict the interaction force of the robot arm using the actual images, an error in the sensor can be detected by comparing the interaction force of the robot arm measured by the sensor with the interaction force of the robot arm predicted using the actual images, making it possible to determine whether the sensor is abnormal (a minimal comparison check is sketched after this list).
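The pipeline described above (a CNN over the actual images, a first FC layer over the time series motion information, an RNN over the combined features, and a second FC layer producing the predictions) can be summarized with a minimal sketch. This is one illustrative realization under stated assumptions: the framework (PyTorch), the class name ForcePredictor, and all layer sizes and output dimensions are not specified in the patent.

```python
import torch
import torch.nn as nn

class ForcePredictor(nn.Module):
    def __init__(self, motion_dim=6, hidden_dim=128, num_outputs=4):
        super().__init__()
        # CNN: extracts features of the image region associated with the robot arm's motion.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)), nn.Flatten(),  # 32 * 4 * 4 = 512 features per frame
        )
        # First FC layer: class scores for the time series motion information.
        self.fc_motion = nn.Linear(motion_dim, 64)
        # RNN: relates the image region changing over time to the motion information.
        self.rnn = nn.LSTM(input_size=512 + 64, hidden_size=hidden_dim, batch_first=True)
        # Second FC layer: interaction force and physical property values.
        self.fc_out = nn.Linear(hidden_dim, num_outputs)

    def forward(self, images, motions):
        # images:  (batch, time, 3, H, W) actual images accumulated over time
        # motions: (batch, time, motion_dim) time series motion information
        b, t = images.shape[:2]
        img_feat = self.cnn(images.flatten(0, 1)).view(b, t, -1)
        motion_feat = torch.relu(self.fc_motion(motions))
        seq, _ = self.rnn(torch.cat([img_feat, motion_feat], dim=-1))
        return self.fc_out(seq[:, -1])  # prediction at the last time step


model = ForcePredictor()
pred = model(torch.randn(2, 10, 3, 64, 64), torch.randn(2, 10, 6))
print(pred.shape)  # torch.Size([2, 4]), e.g. three force components plus one property value
```

In the described method, training would then compare such predicted values against the interaction force and physical properties measured with a sensor.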
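The "virtual image" item above describes rendering the time series motion information as a graph-form image before it is input to the first convolutional layer. Below is a hedged sketch of such a rendering step; the use of matplotlib, the pixel size, and the function name motion_to_virtual_image are assumptions for illustration, not details taken from the patent.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

def motion_to_virtual_image(motion_series, pixels=64):
    """Render a (time, channels) motion array as an RGB image array (a graph-form image)."""
    fig, ax = plt.subplots(figsize=(2, 2), dpi=pixels // 2)  # 2 in * 32 dpi = 64 px
    ax.plot(motion_series)
    ax.axis("off")
    fig.canvas.draw()
    img = np.asarray(fig.canvas.buffer_rgba())[..., :3]  # drop the alpha channel
    plt.close(fig)
    return img

# Example: three motion channels over 100 time steps rendered as a 64 x 64 image.
virtual = motion_to_virtual_image(np.random.rand(100, 3))
print(virtual.shape)  # (64, 64, 3)
```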
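The sensor check described in the last item reduces to comparing the sensor-measured interaction force with the image-based prediction. The tolerance value and function name below are illustrative assumptions.

```python
def sensor_seems_abnormal(measured_force, predicted_force, tolerance=0.5):
    """Flag a possible sensor fault when measurement and image-based prediction disagree."""
    return abs(measured_force - predicted_force) > tolerance

# Example: the sensor reports 2.4 N while the image-based model predicts 1.1 N.
print(sensor_seems_abnormal(measured_force=2.4, predicted_force=1.1))  # True
```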
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to an image-based force prediction system and a method therefor. The force prediction method for a robot arm of the image-based force prediction system may comprise the steps of: acquiring actual images generated by continuously photographing the motion of a robot arm interacting with an object; acquiring time series motion information about a change in the motion of the robot arm over time; and predicting an interaction force of the robot arm on the basis of the actual images and the time series motion information.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020180066861A KR102094360B1 (ko) | 2018-06-11 | 2018-06-11 | 영상 기반 힘 예측 시스템 및 그 방법 (Image-based force prediction system and method therefor) |
| KR10-2018-0066861 | 2018-06-11 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019240330A1 (fr) | 2019-12-19 |
Family
ID=68843407
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2018/011808 Ceased WO2019240330A1 (fr) | Image-based force prediction system and method therefor | 2018-06-11 | 2018-10-08 |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR102094360B1 (fr) |
| WO (1) | WO2019240330A1 (fr) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102386009B1 (ko) * | 2020-07-30 | 2022-04-13 | 네이버랩스 주식회사 | 로봇 작업의 학습 방법 및 로봇 시스템 (Robot task learning method and robot system) |
| KR102401800B1 (ko) * | 2021-10-28 | 2022-05-26 | 주식회사 오비고 | 오브젝트 실감 기술을 구현하기 위한 학습 방법과 체험 방법 및 이를 이용한 학습 장치와 체험 장치 (Learning and experiencing methods for implementing realistic object technology, and learning and experiencing devices using the same) |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101199940B1 (ko) * | 2010-12-15 | 2012-11-09 | 전자부품연구원 | 이동체 탑재형 영상 추적 장치 (Image tracking device mounted on a moving object) |
| US9327406B1 (en) * | 2014-08-19 | 2016-05-03 | Google Inc. | Object segmentation based on detected object-specific visual cues |
| JP6522488B2 (ja) * | 2015-07-31 | 2019-05-29 | ファナック株式会社 | ワークの取り出し動作を学習する機械学習装置、ロボットシステムおよび機械学習方法 (Machine learning device, robot system, and machine learning method for learning a workpiece picking operation) |
- 2018
- 2018-06-11: KR KR1020180066861A, published as KR102094360B1 (ko), status Active
- 2018-10-08: WO PCT/KR2018/011808, published as WO2019240330A1 (fr), status Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008008746A (ja) * | 2006-06-29 | 2008-01-17 | Univ Of Tokyo | 反射像を用いた触覚センサ (Tactile sensor using a reflection image) |
| JP2010066028A (ja) * | 2008-09-08 | 2010-03-25 | Hiroshima Univ | 印加力推定装置及び方法 (Applied force estimation apparatus and method) |
Non-Patent Citations (3)
| Title |
|---|
| AVILES ET AL.: "A Recurrent Neural Network Approach for 3d Vision-based Force Estimation", 2014 4TH INTERNATIONAL CONFERENCE ON IMAGE PROCESSING THEORY, TOOLS AND APPLICATIONS (IPTA), 14 October 2014 (2014-10-14), pages 1 - 6, XP032716190, DOI: 10.1109/IPTA.2014.7001941 * |
| AVILES ET AL.: "Sensorless Force Estimation using a Neuro-Vision-Based Approach for Robotic-Assisted Surgery", 2015 7TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING (NER), 22 April 2015 (2015-04-22), pages 86 - 89, XP033166170, DOI: 10.1109/NER.2015.7146566 * |
| AVILES ET AL.: "V-ANFIS for Dealing with Visual Uncertainty for Force Estimation in Robotic Surgery", 16TH WORLD CONGRESS OF THE INTERNATIONAL FUZZY SYSTEMS ASSOCIATION (IFSA), 1 January 2015 (2015-01-01), pages 1465 - 1472, XP055664910, DOI: 10.2991/ifsa-eusflat-15.2015.208 * |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112530553A (zh) * | 2020-12-03 | 2021-03-19 | 中国科学院深圳先进技术研究院 | 软组织与工具之间的交互力估计方法及装置 (Method and device for estimating the interaction force between soft tissue and a tool) |
| US12306611B1 (en) * | 2020-12-11 | 2025-05-20 | Amazon Technologies, Inc. | Validation of a robotic manipulation event based on a classifier |
Also Published As
| Publication number | Publication date |
|---|---|
| KR102094360B1 (ko) | 2020-03-30 |
| KR20190140546A (ko) | 2019-12-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019240330A1 (fr) | | Image-based force prediction system and method therefor |
| WO2019132168A1 (fr) | | Surgical image data learning system |
| WO2017164478A1 (fr) | | Method and apparatus for recognizing micro-expressions through deep learning analysis of micro-facial dynamics |
| WO2022059969A1 (fr) | | Deep neural network pre-training method for classifying electrocardiogram data |
| WO2017022882A1 (fr) | | Apparatus for pathological diagnosis classification of medical images, and pathological diagnosis system using the same |
| WO2022265197A1 (fr) | | Method and device for analyzing an endoscopic image on the basis of artificial intelligence |
| WO2020196985A1 (fr) | | Apparatus and method for video action recognition and action section detection |
| WO2021045367A1 (fr) | | Method and computer program for determining a psychological state through a drawing process of the counseling recipient |
| WO2022149894A1 (fr) | | Method for training an artificial neural network providing a determination result for a pathological specimen, and computing system for performing the same |
| WO2019117563A1 (fr) | | Integrated predictive analysis apparatus for interactive telehealth and operating method therefor |
| WO2022231392A1 (fr) | | Method and device for implementing a self-evolving platform through automatic machine learning |
| WO2023033270A1 (fr) | | Deep-learning-based method and device for predicting analysis results |
| WO2023068440A1 (fr) | | Robot hand system and method for controlling a robot hand |
| WO2021246700A1 (fr) | | Method and device for predicting a user state |
| WO2017171266A1 (fr) | | Diagnostic model generation method and diagnostic model generation apparatus therefor |
| WO2023182796A1 (fr) | | Artificial intelligence device for detecting defective products on the basis of product images, and method therefor |
| WO2023018299A1 (fr) | | Blood pressure wavelet-transform-based apparatus and method for predicting hypotension, and method for training a hypotension prediction model thereof |
| WO2020050456A1 (fr) | | Method for evaluating the degree of abnormality of equipment data |
| WO2019112385A1 (fr) | | Method for encoding temporal information of image-segment frame-specific features for video recognition |
| WO2019164273A1 (fr) | | Method and device for predicting surgery time on the basis of a surgical image |
| WO2020045903A1 (fr) | | Method and device for size-independent object detection using a convolutional neural network |
| WO2023200038A1 (fr) | | Eye-tracking system and method for monitoring a golf swing, and non-transitory computer-readable recording medium |
| WO2019164278A1 (fr) | | Method and device for obtaining surgical information using a surgical image |
| WO2022181919A1 (fr) | | Device and method for providing a virtual-reality-based operation environment |
| WO2019124602A1 (fr) | | Method and devices for tracking an object |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18922362; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18922362; Country of ref document: EP; Kind code of ref document: A1 |