
CN120814908A - An oral bone grafting navigation system based on multimodal mixed reality interaction - Google Patents

An oral bone grafting navigation system based on multimodal mixed reality interaction

Info

Publication number
CN120814908A
Authority
CN
China
Prior art keywords
bone
mixed reality
navigation
screw
oral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202511338994.2A
Other languages
Chinese (zh)
Inventor
满毅
夏溦瑶
屈依丽
周炜凯
田陶然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan University
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Priority to CN202511338994.2A priority Critical patent/CN120814908A/en
Publication of CN120814908A publication Critical patent/CN120814908A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25 User interfaces for surgical systems
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2068 Surgical navigation systems using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2034/252 User interfaces for surgical systems indicating steps of a surgical procedure

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract


The present invention relates to the field of stomatology and discloses an oral bone grafting navigation system based on multimodal mixed reality interaction, comprising a data processing module, a mixed reality display module, and a navigation module; the data processing module is used to obtain a patient's oral CBCT data and oral scan data, and generate a composite three-dimensional model of fused bone tissue and dentition morphology based on the acquired data, thereby obtaining the three-dimensional morphology of the bone graft and the target placement position and screw fixation position; the mixed reality display module includes a head-mounted display device capable of displaying a virtual indication unit and a virtual surgical area presentation unit; and the navigation module includes a navigation marker and an optical positioning camera for real-time tracking of the spatial position of surgical instruments, bone grafts, and screws. The present invention combines multimodal imaging data with the mixed reality display module and the navigation module to achieve real-time visual guidance of the bone graft position and screw implantation path, thereby improving the efficiency and safety of bone grafting surgery.

Description

Oral bone grafting navigation system based on multimodal mixed reality interaction
Technical Field
The invention relates to the field of stomatology, and in particular to an oral bone grafting navigation system based on multimodal mixed reality interaction.
Background
In oral bone grafting surgery, accurate fixation of the bone graft is key to ensuring postoperative bone fusion. In conventional bone grafting, the graft is fixed with screws according to the surgeon's experience, which raises three problems: 1. insufficient screw positioning accuracy: the surgeon judges the implantation position, depth and angle of the screws only from two-dimensional images, so the screws easily deviate from the planned path and may damage neurovascular structures (such as the inferior alveolar nerve) or penetrate the cortical bone; 2. low fixation efficiency: the positions of the graft and the screws must be adjusted repeatedly, and multiple intraoperative radiographs are taken for confirmation, prolonging the operation; 3. lack of visual guidance: without real-time dynamic three-dimensional guidance, the spatial relationship among the screws, the graft and the host bone cannot be judged intuitively, which may lead to graft displacement or screw loosening.
Although existing navigation systems can track the graft position, they do not accurately guide the screw fixation process, and they rely on comparing a virtual model shown on a screen with the real surgical field, forcing the surgeon to switch gaze frequently during the operation and disrupting its continuity. A mixed reality system that can navigate both the bone graft and the fixation screws in real time is therefore needed to solve the inaccurate fixation and insufficient visualization of the prior art.
Disclosure of Invention
Aiming at the problems of low positioning precision and insufficient visualization in screw fixation during oral bone grafting surgery in the prior art, the invention provides an oral bone grafting navigation system based on multimodal mixed reality interaction.
In order to achieve the technical purpose, the invention adopts the following technical scheme:
An oral bone grafting navigation system based on multimodal mixed reality interaction comprises a data processing module, a mixed reality display module and a navigation module.
The data processing module comprises an image processing unit, a bone grafting planning unit and a data output unit. The image processing unit acquires CBCT data and oral scan data of the patient's oral cavity, registers the two data sets through feature points, generates a three-dimensional oral model containing the bone defect area through a three-dimensional reconstruction algorithm, and marks the bone structure and the bone defect boundary. The bone grafting planning unit plans the three-dimensional form of the bone graft and its target placement position in the three-dimensional model, marks the joint surface between the graft and the host bone, designs the screw fixation positions on the model, and marks the entry point, depth and angle of each screw implantation path. The data output unit generates navigation data containing the three-dimensional graft model and the planned screw coordinates and transmits it to the mixed reality display module and the navigation module.
The mixed reality display module comprises a head-mounted display device capable of displaying a virtual indication unit and a virtual surgical area presentation unit. The virtual indication unit displays the target and real-time positions of the bone graft and the screws in real time; the virtual surgical area presentation unit displays the three-dimensional model of the patient's oral cavity and a real-time virtual model of the instruments during surgery.
The navigation module comprises navigation markers and an optical positioning camera; it tracks the spatial positions of the surgical instruments, the bone graft and the screws in real time and transmits the real-time spatial position data and the three-dimensional spatial deviations to the data processing module and the mixed reality display module.
Further, the virtual indication unit comprises a bone graft virtual indicator and a screw virtual indicator. The bone graft virtual indicator indicates the position, angle and depth of the graft; the screw virtual indicator indicates the horizontal position, depth and angular accuracy of the screw.
Further, the bone graft virtual indicator uses two positioning frames to indicate the position and angle of the graft: the target positioning frame is fixed beside the target position in the patient's bone defect area, and the real-time positioning frame is attached to the edge of the graft. When the two frames overlap completely, the position and angle of the graft meet the plan. Graft depth is indicated by a depth progress bar placed beside the target positioning frame, which displays in real time the vertical distance between the current and target positions of the graft.
Further, the screw virtual indicator uses two positioning circles to indicate the horizontal position and depth of the screw: the target positioning circle is displayed at the planned implantation point, and the real-time positioning circle is attached to the drill bit of the implant handpiece. When the two circles overlap completely, the horizontal position of the screw meets the plan. An annular depth indicator bar on the outer ring of the target positioning circle displays the implantation depth; when the ring is completely filled, the screw has reached the preset depth. The angular accuracy of the screw is indicated by an annular dashed indicator bar surrounding the implant handpiece, in which the dash length in each direction reflects the angular deviation in that direction.
Further, the virtual surgical area presentation unit displays the patient's reconstructed three-dimensional bone model, rendered in semi-transparent white and marked with the bone defect area and the preoperatively planned graft and screw positions, and dynamically displays a virtual model of the implant handpiece synchronized with the pose of the real instrument and a virtual model of the screw that follows the drill-bit position in real time.
Further, the mixed reality display module comprises a spatial matching unit. The spatial matching unit comprises a locating board carrying marker points that can be recognized by both the navigation module and the mixed reality display module; each module identifies the spatial coordinates of the board, and the relative transformation between the two module coordinate systems is established to achieve spatial registration.
Further, the mixed reality display module comprises a gesture recognition unit. The system interface adopts a layered-focus design that divides the display into a status layer and a guidance layer: the status layer continuously shows basic surgical information, and the guidance layer dynamically presents navigation guidance.
Further, the navigation markers comprise fixed markers and mobile markers. The fixed markers are mounted on the patient's teeth to establish the patient coordinate system reference; the mobile markers are mounted at the tip of the bone forceps and at the tail of the implant handpiece to establish the instrument coordinate systems and provide the real-time position and pose of the instruments. The optical positioning camera captures the feature points of the navigation markers and establishes the transformation between the instrument and patient coordinate systems.
Further, the fixed marker is a medical-grade ArUco two-dimensional code plate, the mobile marker at the tip of the bone forceps is an ArUco two-dimensional code, and the mobile marker at the tail of the implant handpiece is an annular ArUco marker band.
The invention has the beneficial effects that:
The oral bone grafting navigation system combines multimodal imaging data, including CBCT data and high-precision oral scan data, and, through the combination of mixed reality display and navigation, achieves real-time visual guidance of the graft position and the screw implantation path. It accurately controls the graft placement angle and the screw implantation position, depth and angle, ensures stable fixation between the graft and the host bone, improves the efficiency of bone grafting surgery, reduces the risk of neurovascular injury and improves the postoperative bone fusion effect.
The system achieves sub-millimetre real-time registration of multimodal imaging data, reducing the registration error of conventional navigation systems from 1-2 mm to below 0.3 mm through a dual marker-point/surface registration strategy. Improved ArUco markers serve as the coarse registration reference: at least three non-coplanar marker points are placed in the patient's oral cavity to establish a stable patient coordinate system, significantly improving marker recognition accuracy and occlusion resistance. On this basis, the system performs feature-point surface registration: a high-precision oral scanner acquires the surface topology of the surgical area, and at least 5000 feature points are extracted and matched to the preoperative CBCT-reconstructed three-dimensional model by ICP (Iterative Closest Point). To avoid the tendency of standard ICP to fall into local optima, the ArUco marker registration supplies the initial estimate and the feature-point surface registration then refines it; the two registration results are fused by a Kalman filter, and a stable spatial transformation matrix is finally output, ensuring accurate alignment of the virtual model with the real anatomy.
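As a rough sketch of the alignment step at the heart of the ICP stage described above, the closed-form Kabsch/SVD solution computes the rigid rotation and translation between two matched point sets; the function name and toy data below are illustrative, not from the patent.

```python
import numpy as np

def rigid_align(source, target):
    """Kabsch/SVD least-squares solution for the rotation R and translation t
    mapping `source` points onto `target`; this is the alignment step run
    inside each ICP iteration."""
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    c_src, c_tgt = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - c_src).T @ (tgt - c_tgt)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_tgt - R @ c_src
    return R, t

# Example: recover a known pose from synthetic marker-point correspondences.
rng = np.random.default_rng(0)
pts = rng.random((6, 3))
a = np.deg2rad(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
moved = pts @ R_true.T + t_true
R, t = rigid_align(pts, moved)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

A full ICP loop would alternate nearest-neighbour matching with this alignment step; the marker-based coarse registration supplies the initial guess that keeps the loop out of local optima.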
It should be apparent that, in light of the foregoing, various modifications, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
The above-described aspects of the present invention will be described in further detail below with reference to specific embodiments in the form of examples. It should not be construed that the scope of the above subject matter of the present invention is limited to the following examples. All techniques implemented based on the above description of the invention are within the scope of the invention.
Drawings
Fig. 1 is a schematic diagram of the three-dimensional model.
Fig. 2 shows the bone graft virtual indicator status display area.
Fig. 3 shows the screw virtual indicator display.
Fig. 4 shows the virtual surgical area presentation.
Fig. 5 shows the navigation status indication display.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Example 1:
An oral bone grafting navigation system based on multimodal mixed reality interaction comprises a data processing module, a mixed reality display module and a navigation module.
(I) Data processing module
This module handles data input and three-dimensional reconstruction, and specifically comprises an image processing unit, a bone grafting planning unit and a data output unit.
The image processing unit acquires CBCT data of the patient's oral cavity and oral scan data containing the dentition and soft-tissue surface morphology, registers the two through feature points (registration error ≤ 50 μm), generates a three-dimensional oral model containing the bone defect area through a three-dimensional reconstruction algorithm, and marks the bone structure and bone defect boundary, as shown in Fig. 1.
The bone grafting planning unit allows the surgeon to plan, in the three-dimensional model, the source of the bone graft, its three-dimensional form (size/curvature/thickness) and its target placement position (coordinates/angle); to mark the joint surface between the graft and the host bone, shown as the yellow area in Fig. 1; and to design the screw fixation positions on the model (1-3 fixation points per graft), marking the entry point, depth and angle of each implantation path (angle to the graft surface normal ≤ 15°) while avoiding damage to important structures.
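The ≤ 15° constraint between a planned screw path and the graft surface normal reduces to an angle check between two vectors; a minimal sketch (function name and sample vectors are illustrative, not from the patent):

```python
import math

def screw_angle_deg(screw_axis, surface_normal):
    """Angle in degrees between the planned screw axis and the graft
    surface normal; the plan requires this to stay <= 15 degrees."""
    dot = sum(a * b for a, b in zip(screw_axis, surface_normal))
    na = math.sqrt(sum(a * a for a in screw_axis))
    nb = math.sqrt(sum(b * b for b in surface_normal))
    cos_theta = max(-1.0, min(1.0, dot / (na * nb)))  # clamp for acos
    return math.degrees(math.acos(cos_theta))

# A screw tilted 10 degrees from the normal passes; 20 degrees fails.
normal = (0.0, 0.0, 1.0)
tilted_10 = (math.sin(math.radians(10)), 0.0, math.cos(math.radians(10)))
tilted_20 = (math.sin(math.radians(20)), 0.0, math.cos(math.radians(20)))
print(screw_angle_deg(tilted_10, normal) <= 15.0)  # True
print(screw_angle_deg(tilted_20, normal) <= 15.0)  # False
```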
The data output unit generates navigation data comprising the three-dimensional graft model and the planned screw coordinates (X/Y/Z position, implantation depth and angle) and transmits it to the mixed reality display module and the navigation module.
(II) Mixed reality display module
This module is based on a mixed reality head-mounted display device and provides visual guidance for graft placement and screw implantation through visual indicators, specifically as follows:
1. Virtual indication unit
The corresponding bone graft virtual indicator and screw virtual indicator are displayed in real time in the mixed reality display module according to the three-dimensional form and target placement position of the graft, and the target position, angle and depth of the screws, as designed by the data processing module.
The bone graft virtual indicator provides position/angle indication and depth indication. As shown in Fig. 2, the position and angle of the graft are indicated by two positioning frames: the target positioning frame (purple) is fixed beside the target position in the patient's bone defect area and keeps its relative position as the patient's head moves; the real-time positioning frame (white) is attached to the edge of the graft, moves synchronously with it and keeps a fixed relative position to it. By adjusting the position and angle of the graft until the real-time frame is completely embedded in, and coincides with, the target frame, the graft position and angle are brought to the planned values. Graft depth is indicated by a depth progress bar displayed beside the target positioning frame, which shows in real time the vertical distance between the graft's current and target positions; the bar value updates dynamically to visually reflect how far the graft has advanced into the target position. Preferably, the positioning frames are rectangular.
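The depth progress bar described above can be sketched as a simple mapping from vertical distance to a fill percentage; the 10 mm full-scale value and the z-axis-only distance are illustrative assumptions, not specified in the patent.

```python
def depth_progress(current_pos, target_pos):
    """Vertical (z-axis) distance in mm between the graft's current and
    target positions, plus a 0-100% fill value for the progress bar.
    Positions are (x, y, z) in millimetres; FULL_SCALE_MM is an assumed
    display parameter."""
    FULL_SCALE_MM = 10.0
    dist_mm = abs(target_pos[2] - current_pos[2])
    fill = max(0.0, min(1.0, 1.0 - dist_mm / FULL_SCALE_MM))
    return dist_mm, round(fill * 100.0, 1)

# Graft 2.5 mm above its target: bar reads 75% full.
dist, pct = depth_progress((12.0, 4.0, 7.5), (12.0, 4.0, 5.0))
print(dist, pct)  # 2.5 75.0
```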
The screw virtual indicator provides horizontal position/depth indication and angular accuracy indication. As shown in Fig. 3, the horizontal position and depth of the screw are indicated by two positioning circles: the target positioning circle (blue) is displayed at the planned implantation point and keeps its relative position as the patient moves, and the real-time positioning circle (yellow) is attached near the drill bit of the implant handpiece and moves synchronously with it. When the two circles overlap completely, the screw's horizontal position meets the plan. An annular depth indicator bar on the outer ring of the target positioning circle displays the implantation depth; preferably, the bar fills gradually from 0° to 360°, and when it is completely filled the screw has reached the preset depth. The angular accuracy indication shows the alignment of the screw with an annular dashed indicator bar: a ring of angle-error dashes surrounding the implant handpiece, where the dash length in each direction corresponds to the angular deviation in that direction, so the larger the deviation, the longer the dash. The surgeon tilts the handpiece away from the direction of the extended dashes until the angular error falls within the threshold.
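The annular depth bar and the directional dashes can be sketched as two small mappings; the `gain` factor and the decomposition of angular error into ±x/±y components are illustrative assumptions, not from the patent.

```python
def ring_fill_deg(depth_mm, planned_depth_mm):
    """Map current drilling depth to the 0-360 degree fill of the annular
    depth indicator; 360 means the screw has reached planned depth."""
    return max(0.0, min(360.0, 360.0 * depth_mm / planned_depth_mm))

def dash_lengths(angle_error_x_deg, angle_error_y_deg, gain=2.0):
    """Dash lengths around the handpiece (arbitrary display units, assumed
    gain): larger deviation in a direction means a longer dash there."""
    return {
        "+x": gain * max(0.0, angle_error_x_deg),
        "-x": gain * max(0.0, -angle_error_x_deg),
        "+y": gain * max(0.0, angle_error_y_deg),
        "-y": gain * max(0.0, -angle_error_y_deg),
    }

print(ring_fill_deg(6.0, 12.0))  # 180.0  (halfway to planned depth)
print(dash_lengths(4.0, -1.5))   # longest dash toward +x, a shorter one toward -y
```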
2. Virtual operation area presentation unit
The virtual surgical area presentation unit displays the patient's reconstructed three-dimensional bone model rendered in semi-transparent white, clearly marks the bone defect area and the preoperatively planned graft and screw positions, as shown in Fig. 4, and dynamically displays a virtual model of the implant handpiece synchronized with the pose of the real instrument and a virtual screw model that follows the drill-bit position in real time, visually presenting the operative trajectory.
3. Space matching unit
To accurately align the virtual indicators with the real scene, a multifunctional locating board is placed in the surgical area as the coordinate-system registration medium. The board carries marker points that can be recognized by both the navigation module and the mixed reality display module; through bidirectional recognition of the board's spatial coordinates, the relative transformation between the two module coordinate systems is computed and established, ensuring that the virtual indicators map accurately onto the actual positions of the patient and the implant handpiece.
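The bidirectional recognition of the shared board amounts to chaining two observed poses of the same object; a minimal sketch with 4x4 homogeneous transforms (all pose values below are made-up test data, not from the patent):

```python
import numpy as np

def pose_inv(T):
    """Invert a 4x4 rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def bridge(T_cam_board, T_hmd_board):
    """Both the optical camera and the HMD observe the same locating board;
    the camera-to-HMD transform follows by chaining the two poses."""
    return T_hmd_board @ pose_inv(T_cam_board)

def make_pose(rz_deg, t):
    a = np.deg2rad(rz_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

# Check: a point expressed in board coordinates lands at the same HMD-frame
# position whether mapped via the camera or directly via the HMD.
T_cam_board = make_pose(15, [0.1, 0.2, 1.0])
T_hmd_board = make_pose(-40, [0.3, -0.1, 0.8])
T_hmd_cam = bridge(T_cam_board, T_hmd_board)
p_board = np.array([0.05, 0.02, 0.0, 1.0])
print(np.allclose(T_hmd_cam @ (T_cam_board @ p_board), T_hmd_board @ p_board))  # True
```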
4. Gesture recognition unit
The system interface adopts a layered-focus design. As shown in Fig. 5, key information is divided into a status layer and a guidance layer with different visual salience: the status layer continuously displays basic surgical information such as the current instrument position and graft status, and the guidance layer dynamically presents navigation guidance such as the target path and current position.
(III) Navigation module
The navigation module comprises a high-precision optical positioning camera and navigation markers, and tracks the spatial positions of the surgical instruments and the bone graft in real time through optical positioning.
The navigation markers comprise fixed markers and mobile markers. The fixed markers are medical-grade ArUco two-dimensional code plates attached to the lingual surfaces of the patient's maxillary and mandibular central incisors to establish the oral coordinate system reference. The mobile markers comprise a graft-holding instrument marker and an implant handpiece marker, used to establish the instrument coordinate systems: the graft-holding instrument marker is an ArUco two-dimensional code at the tip of the bone forceps, enabling 6-degree-of-freedom (position and pose) tracking of the graft, and the implant handpiece marker is an annular ArUco marker band at the tail of the handpiece, so that it remains stably visible to the optical positioning camera while the handpiece rotates.
The optical positioning system comprises at least two cameras. It captures images of the navigation markers, decodes them in real time, computes the markers' three-dimensional coordinates and pose parameters, establishes the transformation between the instrument and patient coordinate systems, and performs real-time localization. It also performs deviation analysis for graft positioning and screw implantation: it computes the three-dimensional coordinate deviation and angular deviation between the actual graft position and the preoperative plan, and analyzes in real time the distance and angular deviation between the drill tip of the implant handpiece and the planned screw path. The deviation data are sent to the mixed reality display module over a low-latency wireless protocol, driving dynamic updates of the graft/screw virtual indicators.
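The per-frame deviation analysis reduces to a point distance and a vector angle; a minimal sketch (the function name and sample values are illustrative, not from the patent):

```python
import math

def deviation(actual_tip, planned_entry, actual_axis, planned_axis):
    """Positional deviation (mm) between the drill tip and the planned
    entry point, and angular deviation (degrees) between the drill axis
    and the planned screw path, as sent to the display module each frame."""
    pos_dev = math.dist(actual_tip, planned_entry)
    dot = sum(a * b for a, b in zip(actual_axis, planned_axis))
    norm = (math.sqrt(sum(a * a for a in actual_axis))
            * math.sqrt(sum(b * b for b in planned_axis)))
    ang_dev = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return pos_dev, ang_dev

# Drill tip 0.3 mm off the planned entry, axis perfectly aligned.
pos, ang = deviation((10.3, 4.0, 2.0), (10.0, 4.0, 2.0),
                     (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
print(round(pos, 3), round(ang, 1))  # 0.3 0.0
```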
The oral bone grafting navigation system of the invention is implemented as follows:
(I) Preoperative preparation
1.1 Multimodal data acquisition and three-dimensional planning by the data processing module
CBCT data of the patient's oral cavity (slice thickness 0.3 mm) are acquired and input to the data processing module to generate a three-dimensional model containing the bone defect area. The surgeon plans the three-dimensional form of the graft in the model (curvature matching error ≤ 0.5 mm) and its target position (coordinate precision ± 0.1 mm), marks 1-3 screw fixation points on the graft surface and designs the implantation paths: the entry point at the centre of each planned point, a depth of 8-15 mm, and an implantation angle within 15° of the graft surface normal.
Dentition and soft-tissue surface morphology are acquired with a high-precision oral scanner (accuracy 20 μm, single-frame scan time ≤ 0.5 s).
The CBCT and oral scan data are registered through feature points (registration error ≤ 50 μm) to generate a composite three-dimensional model fusing bone tissue and dentition morphology.
1.2 Navigation marker mounting and calibration
The medical-grade ArUco marker plates are attached to the lingual side of the patient's upper and lower central incisors, a customized marker module is mounted at the tip of the bone forceps, and the annular marker band is fixed at the tail of the implant handpiece. The optical positioning camera (accuracy ± 0.1 mm) captures the feature points of the navigation markers and establishes the transformation between the instrument and patient coordinate systems (error ≤ 0.3 mm).
(II) Surgical procedure
2.1 System initialization and spatial registration
The surgeon wears the mixed reality head-mounted device, and the system establishes the patient coordinate system from the fixed markers. The multifunctional locating board is placed in the surgical area, registration between the virtual model and the real scene is completed through bidirectional coordinate computation (registration error ≤ 0.5 mm), and the display module loads the semi-transparent bone model, the graft target positioning frame and the planned screw paths.
2.2 Bone graft placement navigation
The graft is picked up with the marker-equipped bone forceps, and the mixed reality device displays in real time the real-time positioning frame (white) at the graft edge and the target positioning frame (purple) beside the bone defect area, as shown in Fig. 2. The surgeon adjusts the graft position; when the white real-time frame is completely embedded in the purple target frame, the depth progress bar shows a vertical distance < 0.3 mm and the angular deviation is < 3°, the indicator turns solid green, and the gap at the graft joint surface is ≤ 0.2 mm.
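The acceptance condition that turns the indicator solid green can be sketched as a threshold check using the figures quoted in this step (the function name is illustrative):

```python
def graft_seated(vertical_dist_mm, angle_dev_deg, joint_gap_mm):
    """Acceptance test for bone graft placement from this embodiment:
    vertical distance < 0.3 mm, angular deviation < 3 degrees, and a
    joint-surface gap <= 0.2 mm turn the indicator solid green."""
    return (vertical_dist_mm < 0.3
            and angle_dev_deg < 3.0
            and joint_gap_mm <= 0.2)

print(graft_seated(0.25, 2.0, 0.15))  # True  (all thresholds met)
print(graft_seated(0.40, 2.0, 0.15))  # False (vertical distance too large)
```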
2.3 Precise navigation of screw implantation
The surgeon switches to the marker-equipped implant handpiece; the display module shows the target positioning circle (blue) at the planned entry point, and the real-time positioning circle (yellow) is displayed synchronously near the drill bit, as shown in fig. 3. When the two circles fully overlap (horizontal deviation ≤ 0.5 mm), the implantation depth is shown by the color-fill progress (0-360°) of the outer ring of the target circle, while the dashed angle-error lines around the handpiece shorten dynamically. When the depth reaches the planned value (e.g. 12 mm) and the angular deviation is ≤ 3°, the system emits an audible prompt and the surgeon drives in the screw.
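The annular depth indicator and the audible-prompt condition above map naturally onto two small functions. This is a minimal sketch under the thresholds given in the description; the function names are hypothetical:

```python
def depth_fill_degrees(current_depth_mm, planned_depth_mm):
    """Map drilling progress onto the annular indicator: 0 mm -> 0 deg of
    fill, planned depth -> full 360 deg, clamped at both ends."""
    frac = max(0.0, min(1.0, current_depth_mm / planned_depth_mm))
    return 360.0 * frac

def screw_ready(horiz_dev_mm, depth_mm, planned_mm, angle_dev_deg):
    """Audible-prompt condition from the description: circles overlapped
    (horizontal deviation <= 0.5 mm), planned depth reached, and angular
    deviation <= 3 degrees."""
    return horiz_dev_mm <= 0.5 and depth_mm >= planned_mm and angle_dev_deg <= 3.0
```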
2.4 Multi-screw cooperative fixation procedure
After the first screw is fixed, the display module automatically activates the guide line for the next screw, and the virtual operation area renders in real time the spatial relationship between the placed screws and the bone fragments to avoid screw interference. After all screws are fixed, the system generates a deviation report (positional deviation ≤ 0.5 mm, angular deviation ≤ 3°).
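A coarse screw-interference test consistent with the above could compare the minimum distance between screw axes against the screw diameter. The sampling approach and the 2 mm diameter below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def min_axis_distance(p0, p1, q0, q1, n=50):
    """Approximate the minimum distance between two screw axes (segments
    p0-p1 and q0-q1) by sampling points along each segment; coarse but
    simple, and adequate as an interference screen."""
    p0, p1, q0, q1 = (np.asarray(v, float) for v in (p0, p1, q0, q1))
    s = np.linspace(0.0, 1.0, n)[:, None]
    a = p0 + s * (p1 - p0)   # sample points along screw A
    b = q0 + s * (q1 - q0)   # sample points along screw B
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return float(d.min())

def interferes(p0, p1, q0, q1, screw_diameter_mm=2.0):
    """Flag interference if the axes come closer than one screw diameter
    (the diameter value here is illustrative)."""
    return min_axis_distance(p0, p1, q0, q1) < screw_diameter_mm
```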
(III) post-operative treatment
3.1 Data recording and device disinfection
The markers are removed and the instruments are sterilized; the operative data are stored, including the actual bone-fragment coordinates (deviation ±0.15 mm), the measured screw implantation depths and angles, and the real-time deviation curve (sampling frequency 10 Hz).
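The 10 Hz deviation stream could be recorded as timestamped samples and serialized for the post-operative report. The record layout below is an assumption for illustration; the patent does not specify a storage format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class DeviationSample:
    """One sample of the intraoperative deviation stream; at 10 Hz the
    i-th sample has t_s = i / 10."""
    t_s: float          # seconds since the start of tracking
    position_mm: float  # positional deviation of the tracked object
    angle_deg: float    # angular deviation of the tracked object

def serialize_log(samples):
    """Serialize the deviation curve as JSON lines for the stored record."""
    return "\n".join(json.dumps(asdict(s)) for s in samples)
```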
3.2 Post-operative evaluation report generation
The system generates an evaluation report from the intraoperative data, including a three-dimensional reconstruction comparison map and the neurovascular safety-distance check result (≥ 2 mm), and can fuse post-operative CBCT images to analyze the fixation outcome.
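If the nerve canal is traced as a polyline, the neurovascular safety-distance check (≥ 2 mm) reduces to a point-to-polyline distance. A minimal sketch under that assumed representation (the patent does not specify how the canal is modeled):

```python
import numpy as np

def point_to_polyline_mm(p, polyline):
    """Minimum distance from point p to a polyline (e.g. a traced nerve
    canal), computed exactly per segment via projection and clamping."""
    p = np.asarray(p, float)
    pts = np.asarray(polyline, float)
    best = np.inf
    for a, b in zip(pts[:-1], pts[1:]):
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        best = min(best, float(np.linalg.norm(p - (a + t * ab))))
    return best

def neurovascular_safe(p, nerve, min_mm=2.0):
    """Safety criterion from the report: distance to the nerve >= 2 mm."""
    return point_to_polyline_mm(p, nerve) >= min_mm
```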
Of course, the present invention is capable of various other embodiments, and those skilled in the art may make corresponding modifications and variations in light of the present invention without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

1. An oral bone grafting navigation system based on multimodal mixed reality interaction, characterized by comprising a data processing module, a mixed reality display module and a navigation module;
The data processing module comprises an image processing unit, a bone grafting planning unit and a data output unit, wherein the image processing unit is configured to acquire CBCT data and intraoral scan data of the patient's oral cavity, register the CBCT data and the intraoral scan data through feature points, generate, via a three-dimensional reconstruction algorithm, a three-dimensional oral model containing the bone defect area, and mark the bone structure and the bone defect boundary; the bone grafting planning unit is configured to plan the three-dimensional form and target placement position of the bone fragment in the three-dimensional model, mark the interface between the bone fragment and the host bone, design the screw fixation positions on the three-dimensional model, and mark the entry point, depth and angle of each screw implantation path; and the data output unit is configured to generate navigation data containing the bone fragment three-dimensional model and the planned screw coordinates and transmit the navigation data to the mixed reality display module and the navigation module;
The mixed reality display module comprises a head-mounted display device providing a virtual indication unit and a virtual operation area presentation unit, wherein the virtual indication unit is used for displaying in real time the target and real-time positions of the bone fragments and screws, and the virtual operation area presentation unit is used for displaying the three-dimensional model of the patient's oral cavity and real-time virtual models of the instruments during surgery;
The navigation module comprises a navigation marker and an optical positioning camera, and is used for tracking the spatial positions of the surgical instrument, the bone fragments and the screws in real time and transmitting the real-time spatial position data and the three-dimensional spatial deviation to the data processing module and the mixed reality display module.
2. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 1, wherein the virtual indication unit comprises a bone fragment virtual indicator and a screw virtual indicator, the bone fragment virtual indicator being used for indicating the position, angle and depth of a bone fragment, and the screw virtual indicator being used for indicating the horizontal position, depth and angular accuracy of a screw.
3. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 2, wherein the bone fragment virtual indicator indicates the position and angle of a bone fragment with two positioning frames: the target positioning frame is fixed beside the target position in the patient's bone defect area, and the real-time positioning frame is attached to the edge of the bone fragment; when the two positioning frames completely overlap, the position and angle of the bone fragment meet the planning requirements; the depth of the bone fragment is indicated by a depth progress bar, which is arranged beside the target positioning frame and displays in real time the vertical distance between the current position and the target position of the bone fragment.
4. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 2, wherein the screw virtual indicator indicates the horizontal position and depth of a screw with two positioning circles: the target positioning circle is displayed at the planned implantation point of the screw, and the real-time positioning circle is attached to the drill bit of the implant handpiece; when the real-time positioning circle and the target positioning circle completely overlap, the horizontal position of the screw meets the planning requirement; an annular depth indication bar on the outer ring of the target positioning circle displays the implantation depth, and when the annular depth indication bar is completely filled, the screw has reached the preset depth; the angular accuracy of the screw is indicated by annular dashed indication lines surrounding the implant handpiece, the dashed-line length in each direction reflecting the angular deviation in that direction.
5. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 1, wherein the virtual operation area presentation unit displays the reconstructed three-dimensional bone model of the patient, the three-dimensional bone model being rendered in semitransparent white and marked with the bone defect area and the preoperatively planned bone fragment and screw positions, and dynamically displays a virtual model of the implant handpiece synchronized with the pose of the real instrument and a virtual screw model that moves in real time with the drill bit position.
6. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 1, wherein the mixed reality display module further comprises a space matching unit, the space matching unit comprising a positioning plate provided with marking points identifiable by both the navigation module and the mixed reality display module; the navigation module and the mixed reality display module each identify the spatial coordinates of the positioning plate, and the relative transformation between the two module coordinate systems is established to achieve spatial registration.
7. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 1, wherein the mixed reality display module further comprises a gesture recognition unit, and the system interface adopts a layered-focus design divided into a state layer and a guidance layer, the state layer continuously displaying basic operation information and the guidance layer dynamically presenting navigation guidance.
8. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 1, wherein the navigation markers comprise fixed markers and movable markers, the fixed markers being arranged on the teeth of the patient's oral cavity to establish the patient coordinate system reference, and the movable markers being arranged at the tip of the bone forceps and at the tail of the implant handpiece to establish the instrument coordinate system and provide the real-time position and pose of each instrument; the optical positioning camera collects the feature points of the navigation markers and establishes the transformation between the instrument coordinate system and the patient coordinate system.
9. The oral bone grafting navigation system based on multimodal mixed reality interaction of claim 8, wherein the fixed marker is a medical-grade ArUco two-dimensional code marker plate, the movable marker at the tip of the bone forceps is an ArUco two-dimensional code, and the movable marker at the tail of the implant handpiece is an annular ArUco marker band.
CN202511338994.2A 2025-09-18 2025-09-18 An oral bone grafting navigation system based on multimodal mixed reality interaction Pending CN120814908A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202511338994.2A CN120814908A (en) 2025-09-18 2025-09-18 An oral bone grafting navigation system based on multimodal mixed reality interaction

Publications (1)

Publication Number Publication Date
CN120814908A true CN120814908A (en) 2025-10-21

Family

ID=97366278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202511338994.2A Pending CN120814908A (en) 2025-09-18 2025-09-18 An oral bone grafting navigation system based on multimodal mixed reality interaction

Country Status (1)

Country Link
CN (1) CN120814908A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120814875A (en) * 2025-09-18 2025-10-21 四川大学 Oral osteotomy navigation system, method and storage medium based on mixed reality interaction

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108742898A (en) * 2018-06-12 2018-11-06 中国人民解放军总医院 Tooth-planting navigation system based on mixed reality
CN112972027A (en) * 2021-03-15 2021-06-18 四川大学 Orthodontic micro-implant implantation positioning method using mixed reality technology
CN114587657A (en) * 2022-02-06 2022-06-07 上海诠视传感技术有限公司 Oral implantation auxiliary navigation method and system based on mixed reality technology
CN114668534A (en) * 2022-03-25 2022-06-28 杭州键嘉机器人有限公司 Intraoperative implantation precision detection system and method for oral dental implant surgery
CN115778589A (en) * 2022-12-28 2023-03-14 同济大学 Device and method for assisting implantation of dental implant

Similar Documents

Publication Publication Date Title
CN106264659B (en) For guiding the 3D system and method for object
US9161821B2 (en) Advanced bone marker and custom implants
JP4488678B2 (en) Establishing a three-dimensional display of bone X-ray images
JP7126516B2 (en) Guided method of performing oral and maxillofacial procedures, and related systems
CN111658065A (en) A digital guidance system for mandibular resection surgery
WO2022126828A1 (en) Navigation system and method for joint replacement surgery
KR102352789B1 (en) Surgical robot system for integrated surgical planning and implant preparation, and associated method
US12285307B2 (en) System and method for guiding medical instruments
US7367801B2 (en) Method for precisely-positioned production of a cavity, especially a bone cavity and instrument therefor
CN110236674A (en) A liver surgery navigation method and system based on structured light scanning
CN108201470B (en) Autonomous dental implant robot system and equipment and method thereof
Hong et al. Medical navigation system for otologic surgery based on hybrid registration and virtual intraoperative computed tomography
CN108742898A (en) Tooth-planting navigation system based on mixed reality
CN106413621A (en) Surgical assemblies for housing force transmitting members
CN112885436B (en) Dental surgery real-time auxiliary system based on augmented reality three-dimensional imaging
JP2017164007A (en) Medical image processing apparatus, medical image processing method, program
CN102784003A (en) Pediculus arcus vertebrae internal fixation operation navigation system based on structured light scanning
CN110537985A (en) Spine space coordinate system positioning device and method for augmented reality surgery system
JP2023502927A (en) Visualization system for use in a surgical environment
CN120814908A (en) An oral bone grafting navigation system based on multimodal mixed reality interaction
JP2014131552A (en) Medical support device
CN111728695B (en) A beam-assisted positioning system for craniotomy
CN116737031A (en) A tooth root information visualization system and method based on mixed reality
CN119278005A (en) Landmarks integrated with the oral bracket
CN110720985A (en) Multi-mode guided surgical navigation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination