WO2020261249A1 - Orthopaedic pre-operative planning system - Google Patents
- Publication number
- WO2020261249A1 (PCT/IB2020/056143)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- implant
- patient
- model
- anatomical
- operative
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/505—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of bone
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4851—Prosthesis assessment or monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7275—Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/465—Displaying means of special interest adapted to display user selection data, e.g. graphical user interface, icons or menus
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/467—Arrangements for interfacing with the operator or the patient characterised by special input means
- A61B6/468—Arrangements for interfacing with the operator or the patient characterised by special input means allowing annotation or message recording
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/104—Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/05—Surgical care
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/48—Diagnostic techniques
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/02—Prostheses implantable into the body
- A61F2/28—Bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/02—Prostheses implantable into the body
- A61F2/30—Joints
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B42/00—Obtaining records using waves other than optical waves; Visualisation of such records by using optical means
- G03B42/02—Obtaining records using waves other than optical waves; Visualisation of such records by using optical means using X-rays
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Definitions
- This disclosure relates to the general field of planning an operative procedure, particularly an orthopedic procedure. Methods, systems, and devices for pre-operatively selecting and positioning implant components to use in operative procedures are also disclosed.
- the disclosure is also related to methods, systems, and devices for pre-operatively planning joint-replacement surgery.
- the disclosure is also related to methods, systems, and devices for measuring, predicting, and comparing patient joint function before and after joint-replacement surgery.
- prostheses or implants, for example hip, knee or shoulder implants, amongst others.
- the primary aim of joint replacement surgery is to restore patient joint function.
- it is difficult to predict the functional impact of the selection and positioning of implants given variations in patient bone shape, muscle geometry and function, and movement styles for specific function tasks.
- a joint, e.g. a shoulder
- a measure of the ability to perform the task is the magnitude of that range of angles.
- a healthy subject may be able to rotate their shoulder in flexion 180 degrees while a patient requiring shoulder arthroplasty may only manage 90 degrees. It is the aim of the shoulder arthroplasty to restore that range of angles to 180 degrees or as much as possible through implant selection and placement.
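The range-of-angles metric described above can be sketched in a few lines of Python. The function names and numbers are illustrative only, not taken from the disclosure:

```python
import numpy as np

def range_of_motion(angles_deg):
    """Functional metric: the magnitude of the achievable angle range."""
    angles_deg = np.asarray(angles_deg, dtype=float)
    return float(angles_deg.max() - angles_deg.min())

def restoration_ratio(patient_rom, target_rom):
    """Fraction of the target (e.g. healthy) range the patient achieves."""
    return patient_rom / target_rom

# A healthy shoulder flexes through 0-180 degrees; the patient manages 0-90.
healthy = range_of_motion([0.0, 45.0, 90.0, 135.0, 180.0])  # 180.0
patient = range_of_motion([0.0, 30.0, 60.0, 90.0])          # 90.0
print(restoration_ratio(patient, healthy))                   # 0.5
```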
- Implants can include, without limitation, permanent implants, e.g. artificial joints.
- the disclosure broadly provides a method for determining one or more of selection, positioning or placement of a surgical implant, the method including the steps of:
- predicting post-operative function of the structure for one or more implants; selecting one or more of the implant, the implant position or the implant location to improve the predicted post-operative function.
- the selection includes minimising one or more differences between the predicted post-operative function and the predicted unimpaired function.
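The minimisation described in this bullet can be sketched as selecting the candidate with the smallest deviation from predicted unimpaired function. All metric values and names below are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical predicted functional metrics (degrees of ROM per task) for the
# unimpaired joint and for each candidate implant; names are illustrative.
predicted_unimpaired = np.array([180.0, 90.0, 70.0])  # flexion, abduction, rotation

candidates = {
    "implant_A": np.array([160.0, 85.0, 60.0]),
    "implant_B": np.array([175.0, 80.0, 68.0]),
}

def functional_deficit(post_op, unimpaired):
    """Sum of squared differences between predicted post-operative and
    predicted unimpaired function across the functional tasks."""
    return float(np.sum((np.asarray(post_op) - np.asarray(unimpaired)) ** 2))

best = min(candidates, key=lambda k: functional_deficit(candidates[k], predicted_unimpaired))
print(best)  # implant_B (deficit 129 vs 525 for implant_A)
```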
- pre-operative data for the impaired structure is obtained.
- Data for the subject or patient may also be obtained, as may data for a population of subjects.
- post-operative data may be obtained to improve the predictive functions.
- the method includes producing a patient anatomical model.
- the model comprises a 3D model.
- the model is generated from one or more patient medical images.
- one or more patient medical images are processed and a statistical shape model is used to produce the patient anatomical model.
- the method may include a machine learning method, such as an artificial neural network or a deep neural network, for performing one or more method steps, for example classifying and/or filtering patient anatomical or medical images.
- the statistical shape model is used to identify or produce one or more of: an anatomical landmark, feature or region; one or more geometric models; one or more morphometric measurements.
- the anatomical landmark(s), feature(s) or region(s) may be a surgically relevant landmark, feature or region.
- the surgically relevant landmark or feature may comprise a fixation point, or region or location for an implant.
- the anatomical landmark(s), feature(s) or region(s) may be a relevant landmark, feature or region for determining a pre- or post-operative patient function, for example a pre- or post-operative range of movement.
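A statistical shape model of the kind described can be sketched as a mean shape plus principal modes of variation; once mode weights are fitted to the patient mesh, landmarks are read off fixed vertex indices of the morphed canonical mesh. Everything below (shapes, indices, landmark names) is synthetic and illustrative, not from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)
n_verts = 500
mean_shape = rng.normal(size=(n_verts, 3))
modes = rng.normal(size=(4, n_verts, 3))      # 4 modes of variation (PCA-like)

def reconstruct(weights):
    """Instance of the statistical shape model for given mode weights."""
    return mean_shape + np.tensordot(weights, modes, axes=1)

def fit_weights(target_points):
    """Least-squares fit of mode weights to target vertex positions
    (assumes correspondence; a real fit also registers pose)."""
    A = modes.reshape(len(modes), -1).T       # (n_verts*3, 4)
    b = (target_points - mean_shape).ravel()
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

LANDMARKS = {"greater_tuberosity": 42, "glenoid_centre": 317}  # fixed indices

true_w = np.array([1.5, -0.5, 0.2, 0.0])
patient_mesh = reconstruct(true_w)            # stand-in for a raw patient mesh
fitted = reconstruct(fit_weights(patient_mesh))
for name, idx in LANDMARKS.items():
    print(name, fitted[idx])                  # landmark positions on the patient
```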
- the disclosure broadly provides a method for determining one or more of selection, positioning or placement of a surgical implant, the method including the steps of:
- predicting post-operative function of the anatomical structure for one or more implants; selecting one or more of the implant, the implant position or the implant location to improve the predicted post-operative function of the structure.
- the methods above may be applied to determine one or more of: the type of implant; the shape of the implant; the fixing points for the implant.
- the method includes producing a patient anatomical model.
- the model comprises a 3D model.
- the model is generated from one or more patient medical images.
- one or more patient medical images are processed and a statistical shape model is used to produce the patient anatomical model.
- the method may include a machine learning method, such as an artificial neural network or a deep neural network, for performing one or more method steps, for example classifying and/or filtering patient anatomical or medical images.
- the statistical shape model is used to identify or produce one or more of: an anatomical landmark, feature or region; one or more geometric models; one or more morphometric measurements.
- the anatomical landmark(s), feature(s) or region(s) may be a surgically relevant landmark, feature or region.
- the surgically relevant landmark or feature may comprise a fixation point, or region or location for an implant.
- the anatomical landmark(s), feature(s) or region(s) may be a relevant landmark, feature or region for determining a pre- or post-operative patient function, for example a pre- or post-operative range of movement.
- the disclosure provides a method or system for producing medical images for predicting unimpaired or post-operative function of anatomical structures.
- the method includes producing a patient anatomical model.
- the model comprises a 3D model.
- the model is generated from one or more patient medical images.
- one or more patient medical images are processed and a statistical shape model is used to produce the patient anatomical model.
- the method may include a machine learning method, such as an artificial neural network or a deep neural network, for performing one or more method steps, for example classifying and/or filtering patient anatomical or medical images.
- the statistical shape model is used to identify or produce one or more of: an anatomical landmark, feature or region; one or more geometric models; one or more morphometric measurements.
- the anatomical landmark(s), feature(s) or region(s) may be a surgically relevant landmark, feature or region.
- the surgically relevant landmark or feature may comprise a fixation point, or region or location for an implant.
- the anatomical landmark(s), feature(s) or region(s) may be a relevant landmark, feature or region for determining a pre- or post-operative patient function, for example a pre- or post-operative range of movement.
- the disclosure provides a graphical user interface for facilitating one or more of the foregoing methods.
- the interface includes a 3D representation of patient anatomy and a proposed implant superimposed on the patient anatomy.
- the 3D representation is manipulable to provide a plurality of view perspectives.
- the interface shows implant or patient joint orientation in a plurality of different planes.
- the planes are orthogonal to each other.
- the disclosure provides apparatus for implementing the foregoing methods.
- the apparatus comprises a client-server system.
- the disclosure provides a system for implementing the foregoing methods.
- Figures 1A and 1B are diagrams showing apparatus including a processing environment for implementing the method and system disclosed herein;
- FIG. 2 is a diagrammatic system overview
- Figure 3 is a diagram showing generation of a machine-learning/biomechanical hybrid model that predicts patient functional metrics
- Figure 4 is a diagram of an implant and placement sub-system
- Figure 5 is a diagram of an image processing sub-system
- Figure 6 is a flow chart illustrating an example or embodiment of 3D model generation and simulation of implant fit
- Figure 7 is a diagrammatic illustration of a client-server system providing an example of implementation of the invention
- Figures 8A-D are sketches showing examples of landmarks or geometric features that may be identified to form anatomical models
- Figures 9A and B are sketches showing examples of landmarks in the form of target surgical features for implant integration on the bone (or similar structures) in the model
- Figures 10-17 show examples of a Graphical User Interface to facilitate use of the system.
- the following description focuses on embodiments of the present invention applicable for planning an orthopedic procedure.
- the method includes positioning a virtual implant component relative to a digital model of the patient’s anatomy.
- Embodiments of the invention will be described in the following with regard to planning a hip replacement procedure using a hip implant comprising an acetabular cup component and a femoral stem component.
- joint implant procedures e.g. a knee implant procedure, an ankle implant procedure etc. wherein one or several implant components may be included in the procedure.
- positioning a virtual implant component may comprise defining positional information for at least one of an affected femoral head, affected femoral shaft, unaffected femoral head, unaffected femoral shaft, a cup of a virtual implant, and a stem of a virtual implant.
- modelling of anatomical structures as disclosed herein is not limited to modelling bone (although this is used as a primary example), but includes other structures including without limitation connective tissue, ligaments, tendons, cartilage, muscles and vascular structures.
- the tools used to perform pre-operative planning as disclosed herein are computer-implemented. Accordingly, aspects of the present disclosure are implemented in a data processing environment.
- the data processing environment 10 may include a plurality of individual networks, such as wireless networks and wired networks.
- a plurality of wired and/or wireless devices 11 may communicate over a network 12 with third party information sources 13, data processing services 14 and information management system 15 which may include a data store 16 and a data processing system or computing device 20.
- the computing device 20 is shown in more detail in Figure 1B.
- the device 20 may be implemented as a microprocessor which can process data accessed from additional network based resources, such as web sites or other supplemental content delivery.
- Figure 1B depicts one embodiment of an architecture of illustrative computing device 20 for implementing various aspects of the data processing system or environment 10 in accordance with aspects of the present application.
- the data processing system 10 can be a part of the instantiation of a set of virtual machine instances.
- the computing device 20 may be a stand-alone device that functions as the data processing system 10.
- the general architecture of the device 20 depicted in Figure 1B includes an arrangement of computer hardware and software components that may be used to implement aspects of the present disclosure.
- the device 20 includes a processing unit 24, a network interface 26, a computer readable medium drive 28, and an input/output device interface 29, all of which may communicate with one another by way of a communication bus.
- the components of the computing device 20 may be physical hardware components or implemented in a virtualized environment.
- the network interface 26 may provide connectivity to one or more networks or computing systems.
- the processing unit 24 may thus receive information and instructions from other computing systems or services via a network.
- the processing unit 24 may also communicate to and from memory 30 and further provide output information.
- the memory 30 may include computer program instructions that the processing unit 24 executes in order to implement one or more embodiments.
- the memory generally includes RAM, ROM, or other persistent or non-transitory memory.
- the memory may store an operating system 34 that provides computer program instructions for use by the processing unit 24 in the general administration and operation of the device.
- the memory may further include computer program instructions and other information for implementing aspects of the present disclosure.
- the memory includes interface software 32 for receiving and processing requests from the client devices 11.
- Memory 30 includes an information match processing component 36 for processing the user interactions to create graphical interfaces as described herein.
- In FIG. 2, an overall schematic of one embodiment of a system 100 for pre-operative planning is illustrated. Although this embodiment is described with reference to implants, those skilled in the art will appreciate that the system is applicable to other operative procedures.
- Model 104 receives patient anatomical models 110 and an initial surgical plan 113, along with pre-operative patient motion data.
- This motion data comprises patient motion data 111 (derived from a pre-operative assessment 101) and patient motion data 112 (derived from an assessment of post-operative patient function).
- patient anatomical models are derived from pre-operative images 117 which are processed at 102 to provide models 110.
- the outputs from the model 104 include functional metrics 104a for implementing a surgical plan, and pre- and post-operative functional metrics 104b,c,d.
- Pre-operative metrics 104b,c can be used to develop a pre-operative range of motion analysis 105 which can be used to determine implant selection and placement as shown in 103.
- Post-operative metrics 104d may be used to develop a post-operative range of motion analysis 107, which may be compared with the pre-operative analysis data 105 for optional review by a system user such as a surgeon at 114 before determining a surgical outcome 115 that may be provided to the model 104 as data to improve future modelling and processing, for example through use of machine learning.
- the system can process data automatically with minimum input from a surgeon. This may depend on the nature or complexity of the operative procedure.
- the surgical procedure can be planned with no specific decisions needing to be taken by a surgeon.
- the surgical plan may be provided in a machine-readable form to enable a machine such as a robot to perform the surgical procedure.
- the surgeon may be able to make manual selections based on data such as pre- or post-operative outcomes determined by the model 104.
- Implant models 118 are provided to allow the system to perform the required modelling for placement of the implant as part of the procedure, and the post-operative outcomes.
- implant models 118 allow model 104 to produce data for placement of the implant relative to the patient anatomical structures and to allow visualisation of the implant as required.
- outcomes can be optimised automatically.
- the implant selection and placement can be optimised automatically. This may occur by an iterative process, for example: initial implant selection and placement data can be input into the surgical plan 113 and processed again in accordance with model 104, and this process may continue until a selection and placement is determined that falls within one or more threshold parameters.
- the threshold parameters may for example include some of the pre-operative and/or post-operative range of motion analysis data 105, 107.
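The iterate-until-threshold loop described above can be sketched with a simple hill-climbing search over placement parameters. The deficit function here is a synthetic quadratic stand-in for model 104's predicted range-of-motion shortfall; all names and numbers are illustrative:

```python
import numpy as np

def predicted_deficit(placement):
    """Stand-in for the model's predicted post-operative ROM deficit."""
    target = np.array([12.0, -3.0, 40.0])   # hypothetical optimal placement
    return float(np.sum((np.asarray(placement) - target) ** 2))

def optimise_placement(initial, threshold=1e-3, step=0.5, max_iter=10000):
    """Adjust placement parameters until the deficit is within the threshold."""
    p = np.asarray(initial, dtype=float)
    d = predicted_deficit(p)
    for _ in range(max_iter):
        if d < threshold:                    # within required threshold: stop
            break
        improved = False
        for axis in range(3):                # coordinate-wise trial moves
            for delta in (step, -step):
                q = p.copy()
                q[axis] += delta
                dq = predicted_deficit(q)
                if dq < d:
                    p, d, improved = q, dq, True
        if not improved:
            step *= 0.5                      # refine the search
    return p, d

placement, deficit = optimise_placement([0.0, 0.0, 0.0])
print(deficit < 1e-3)                        # True
```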
- a surgeon may use data from the pre-operative range of motion analysis 105 for example to try using a completely different form of implant at 103.
- the surgeon may have the option to approve or alter, at 116, the pre-operative range of motion analysis 105 and thus provide a manual adjustment, or iteration, or override for implant selection and placement 103.
- a finalised surgical plan 119 can be produced as an output. As can be seen from Figure 2, data from the plan can be fed back as an input to the model 104, allowing the system to iteratively process all inputs until optimisation has been performed to within one or more required thresholds or parameters, or until an output is manually selected.
- the surgical plan, and other data produced by the system can be visualised to provide a human user such as a surgeon with images that can assist the surgical process and/or allow the user to visualise implant placement and the effects the implant may have on post-operative ranges of movement or other effects that may be experienced by the patient.
- model 104 may use machine learning to assist with predictive functions of the system.
- the post-operative function assessment might be performed by model 104 based on post-operative data obtained from previous patients.
- a predicted post-operative assessment may be used as another input.
- system 100 broadly provides a digitally implemented surgical planning system having:
- IMU inertial measurement unit
- An image processing sub-system 102 that uses deep neural networks to produce digital models of the patient’s anatomy.
- An implant selection and placement sub-system 103 that
- a machine-learning/biomechanical hybrid model (the hybrid model) 104 that:
  a. predicts the patient’s functional metric given an initial or working surgical plan 113;
  b. estimates the patient’s current functional metric given motion data from 101;
  c. predicts the patient’s normal functional metric given motion data from 101;
  d. learns from pre- and post-operative functional data and patient anatomy data to improve its predictions in (a), (b), and (c).
- a range of motion analysis sub-system 105 that compares the output of 104 and provides feedback to 103 for visualisation and optimisation of implant selection and placement.
- IMU inertial measurement unit
- Model 104 includes a kinematic model generator 201 that produces a digital functional model 204 of the required or subject patient anatomy from patient anatomy models 110 generated by the image processing sub-system (which is described further below).
- This model consists of anatomical structures including without limitation bones, joints, muscles and other features having geometric constraints adapted to the patient’s anatomy that model the function of the patient’s musculoskeletal system.
- the model generator 201 can optionally take as input a surgical plan 113 that includes one or more implants. In this example, the generator adjusts the joint geometry to replace the patient’s own joint with the artificial joint.
- a functional metric estimator 202 produces the functional metric from IMU data and the patient-specific kinematic model from 204 to provide estimated pre-operative functional metrics 205.
- a functional metric predictor 203 that uses the kinematic model, patient medical images, estimated functional metrics from 205, raw patient motion data, and population models of anatomy and function 206 to predict the functional metric 207 of the patient if they had a normal joint (in the example of a joint replacement procedure).
- This predictor uses a combination of biomechanical models and machine-learning techniques to combine the various input data types for a prediction.
- the implant selection and placement subsystem 103 may include an implant fit simulator 401 that fits each implant in an implant library 118 to the patient's anatomical models 110 taking into account surgical constraints.
- the fit simulator 401 scores and then ranks each implant at step 406 based on how well it a. fits to the patient anatomy
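A fit-simulator score of the kind described (how well an implant fits the patient anatomy, subject to surgical constraints) can be sketched as a point-to-surface distance with a constraint penalty. The meshes, scales, and names below are synthetic stand-ins, not the patent's scoring function:

```python
import numpy as np

rng = np.random.default_rng(1)
bone_surface = rng.normal(size=(2000, 3))   # stand-in patient bone mesh points

def fit_score(implant_points, max_protrusion=0.5):
    """Mean distance from implant surface points to the nearest bone point,
    plus a penalty where a hypothetical protrusion constraint is violated."""
    d = np.linalg.norm(
        implant_points[:, None, :] - bone_surface[None, :, :], axis=2
    ).min(axis=1)
    penalty = np.sum(np.maximum(d - max_protrusion, 0.0))
    return float(d.mean() + penalty)

# Two hypothetical library entries: one closely conforming, one poorly fitting.
library = {
    "stem_small":  bone_surface[:200] + rng.normal(scale=0.02, size=(200, 3)),
    "stem_medium": bone_surface[:200] + rng.normal(scale=0.5, size=(200, 3)),
}
ranked = sorted(library, key=lambda name: fit_score(library[name]))
print(ranked[0])   # the implant whose surface lies closest to the bone
```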
- the outputs of 401 are sent to a graphical user interface 402 which displays them to a user (e.g. a surgeon). In some embodiments this allows the surgeon to perform one or more of the following:
- this allows the surgeon to view the implant, and/or the implant and anatomical structures, from one or more selected directions and/or distances, in cross-section, and/or with overlaid graphical and text data
- the selected implant(s) and their positioning are input to the hybrid biomechanical model 403 to predict the post-operative function (see above)
- a range of motion analysis 405 is performed on the predicted normal function 207 and the predicted post-operative function to calculate the difference at 408.
- the system 100 is configured to minimise difference 408, so this difference is sent to the fit simulator 401 to adjust implant positioning and scoring, and to the user interface 402 to give the user feedback on the performance of the selected implant(s).
- the subsystem 102 may include a deep neural network (DNN) and a set of image filters 301 that generate 3D models of anatomical structures such as bones, muscles, and other relevant anatomical structures from one or more medical images 117 of the patient.
- the image or images may comprise 2D X-ray, 3D X-ray CT, 3D MRI, or other modalities.
- the DNN is trained to associate input image texture with output 3D voxel volumes of the various anatomical structures.
- a series of image filters including thresholding, region-growing, Gaussian smoothing, and marching-cubes then convert the 3D voxel volumes into 3D triangulated meshes.
- these raw geometric models or meshes 302 consist of arbitrarily ordered triangles with no information about anatomical regions or landmarks.
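The filter chain from DNN voxel output to a clean structure can be sketched with toy data. The marching-cubes meshing step itself would typically come from a library (e.g. `skimage.measure.marching_cubes`) and is only noted in a comment; the thresholding and region-growing steps below are minimal illustrative implementations, not the patent's actual filters:

```python
import numpy as np

def threshold(volume, level):
    """Binarise per-voxel probabilities at a chosen level."""
    return volume >= level

def region_grow(mask, seed):
    """Keep only voxels 6-connected to the seed, discarding stray islands.
    (A meshing step such as skimage.measure.marching_cubes would then
    convert the grown mask into a triangulated surface.)"""
    grown = np.zeros_like(mask, dtype=bool)
    stack = [seed]
    while stack:
        idx = stack.pop()
        if grown[idx] or not mask[idx]:
            continue
        grown[idx] = True
        z, y, x = idx
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if 0 <= nz < mask.shape[0] and 0 <= ny < mask.shape[1] \
                    and 0 <= nx < mask.shape[2]:
                stack.append((nz, ny, nx))
    return grown

# Tiny synthetic probability volume: a bright 2x2x2 block plus one
# stray noise voxel that is not connected to the block.
vol = np.zeros((4, 4, 4))
vol[1:3, 1:3, 1:3] = 0.9
vol[0, 0, 0] = 0.95
mask = region_grow(threshold(vol, 0.5), seed=(1, 1, 1))
print(int(mask.sum()))  # 8: the block survives, the stray voxel is dropped
```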
- Sub-system 102 also includes statistical shape models (SSM) 303 which may be fitted to the raw meshes.
- the SSM morphs a canonical triangulation of each anatomical object to the raw mesh so that meshes of the patient’s anatomy are obtained with consistent triangulation as shown at 304.
- This allows the system to map anatomical regions and landmarks onto the geometry as shown at 305, and automatically take morphometric measurements such as lengths, angles, areas, and volumes as shown at 306.
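Once landmarks sit at known vertex indices on the consistently triangulated mesh (305), the morphometric measurements at 306 reduce to geometry on landmark coordinates. A minimal sketch, in which the landmark names and coordinates are purely illustrative:

```python
import numpy as np

def length(p, q):
    """Euclidean distance between two landmark coordinates."""
    return float(np.linalg.norm(np.asarray(p) - np.asarray(q)))

def angle_deg(a, b, c):
    """Angle at vertex b, in degrees, formed by points a-b-c."""
    u = np.asarray(a) - np.asarray(b)
    v = np.asarray(c) - np.asarray(b)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Illustrative landmark coordinates (not real anatomy):
head = (0.0, 0.0, 0.0)
neck = (3.0, 0.0, 0.0)
shaft = (3.0, 4.0, 0.0)
print(length(head, neck))            # 3.0
print(angle_deg(head, neck, shaft))  # 90.0
```

Areas and volumes would similarly be summed over the mesh faces enclosed by a mapped anatomical region.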
- the process begins at 640 and the first step is acquisition or uploading of medical anatomical images at 641.
- a 3D model of the patient anatomy relevant to the procedure (e.g. a patient anatomical structure such as a bone or bones) is then generated.
- a digital model of an implant from a library of implants can then be automatically or manually selected.
- the implant model has its own target surgical features which are already identified, or may be adjusted or identified dependent on the anatomical digital model.
- the fit between the anatomical and implant models is simulated in step 644.
- the model is used to simulate patient function, for example range of joint movement, in step 645. If function is adequate, the planning can be completed at 646. If inadequate, then an alternative implant can be selected and simulated as indicated by path 647.
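The loop at 641-647 can be sketched as: reconstruct the anatomy, then try implants until simulated function is adequate. Everything below is a schematic stand-in for the real subsystems (imaging, fit simulation, function simulation), with illustrative numbers:

```python
def plan(implant_library, simulate_function, adequate):
    """Try each implant in turn (path 647) until simulated function is
    adequate (646). Returns the chosen implant id, or None if the
    library is exhausted."""
    for implant in implant_library:
        rom = simulate_function(implant)  # step 645
        if adequate(rom):
            return implant
    return None

# Illustrative: simulated flexion range (degrees) per implant size,
# with 110 degrees assumed adequate for the planned procedure.
rom_table = {"size-1": 95.0, "size-2": 118.0, "size-3": 112.0}
chosen = plan(["size-1", "size-2", "size-3"],
              simulate_function=rom_table.get,
              adequate=lambda rom: rom >= 110.0)
print(chosen)  # size-2: the first implant whose simulated function passes
```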
- Information management system 15, in this embodiment represented as a server 752, communicates via a network 12 with a client-side application 754 which may be executed by a machine 11.
- the client-side application 754 may be used by a user, for example a surgeon planning an orthopedic procedure, to open a new surgical case and upload patient anatomical images, as shown in block 755.
- the anatomical images may be sourced from a variety of different medical imaging modalities, for example, X-ray, CT or MRI. Some modalities may be provided as 2D images, for example X-ray sourced images. Others may be 3D (or consist of a stack of 2D images that can be represented as a 3D image) for example sourced by CT or MRI.
- the client-side application provides the images to the server 752 as 2D images 756 or 3D images 757.
- An image processing application running on the server then performs a 3D reconstruction of the patient anatomy from the images 756, 757 as shown in block 758, to automatically generate a 3D model of the patient anatomy.
- the anatomy which is modelled will include the anatomical region which is the subject of the procedure, for example a hip or shoulder or knee.
- the 3D model generated in block 758 is provided as a digital model in a format (such as STL, PLY, OBJ, or other formats) which can readily be provided back to the client-side application 754 as shown in block 760. This enables the user to readily visualise the patient anatomy and manipulate the representation appearing on the client-side device so that the user can obtain an adequate visualisation of all parts of the patient anatomy relevant to the intended procedure.
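One reason formats like STL suit block 760 is that they serialise a mesh as a flat list of triangles, which any client-side viewer can parse. A minimal binary-STL writer illustrates the layout; this is a sketch of the format, not the system's actual exporter:

```python
import struct

def write_binary_stl(path, triangles):
    """triangles: iterable of (normal, (v1, v2, v3)) where the normal
    and each vertex are 3-tuples of floats. Binary STL layout:
    80-byte header, uint32 triangle count, then per triangle
    4 x 3 floats plus a uint16 attribute byte count (50 bytes each)."""
    tris = list(triangles)
    with open(path, "wb") as f:
        f.write(b"\0" * 80)                    # 80-byte header
        f.write(struct.pack("<I", len(tris)))  # triangle count
        for normal, (v1, v2, v3) in tris:
            for vec in (normal, v1, v2, v3):
                f.write(struct.pack("<3f", *vec))
            f.write(struct.pack("<H", 0))      # attribute byte count

tri = (((0.0, 0.0, 1.0), ((0, 0, 0), (1, 0, 0), (0, 1, 0))),)
write_binary_stl("single_tri.stl", tri)
# Resulting file size: 80 + 4 + 50 bytes per triangle = 134 bytes here.
```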
- the application represented by block 758 may make use of an additional tool such as an artificial neural network 759 which in some embodiments may comprise one or more deep neural networks.
- the server 752 may also include a database 761 comprising a collection of statistical shape models (SSMs) of patient anatomy (e.g. bones, or other tissues and structures) which may be used to generate or reconstruct the 3D model.
- the 3D anatomical model is produced or reconstructed from 2D input medical anatomic images 756 by firstly using deep neural network 759 to identify selected landmarks, which may comprise certain geometric features such as the volumes, regions, contours, or discrete points in the images belonging to the anatomical object.
- Figures 8A-D are sketches using the hip joint as an example, in particular the femoral head 801 as located next to or within the acetabulum 802.
- Figure 8A shows a geometric feature comprising the volume 803 (which is shaded) of the femoral head.
- Figure 8B shows a region 804 (which is shaded) of the femoral head occupied by a plane in cross section.
- Figure 8C shows a contour 805 (which is shown in broken outline) of the femoral head in a plane in cross section.
- Figure 8D shows identified points 806 on the femoral head.
- the next step is to fit an SSM of the related anatomical structure to the landmarks or contours to thus reconstruct a 3D model of the bone.
- the 3D anatomical model is produced or reconstructed from 2D input medical anatomic images 756 by using deep neural network 759 to directly predict the parameters of an SSM of a bone from one or more medical images. The predicted parameters can then be used to generate a 3D model of the bone from the SSM.
- the 3D anatomical model is produced or reconstructed from a 3D image volume (for example composed of a set of 2D CT or MRI images), such as input medical anatomic images 757, by using deep neural network 759 to identify and label the relevant regions of bones of interest from the 3D image volume.
- the identified volume, region, contour, or points may encompass or be on a single connected portion of one object (e.g. part of one bone), multiple unconnected regions of one object (e.g. different pieces of a fractured bone), or multiple objects (e.g. all the bones that make up a joint, such as the femur, tibia, and patella in the knee, or a larger structure, such as multiple vertebrae that make up the spine).
- the next step is to identify landmarks in the form of target surgical features for implant integration on the bone (or similar structures) in the 3D model. These target surgical features or regions are mapped onto the patient 3D models using an SSM. This can be achieved by:
- a canonical representation of a 3D geometry (e.g. a triangulated mesh of a bone); and
- a description of the modes of variation of that mean shape observed in a population (e.g. the variation in shape of a bone across a human population)
- An SSM's canonical representation can be customised to the shape of a particular individual by morphing the mean shape according to the modes of variation, each weighted by a different score; or
- An SSM can be fitted to an individual's shape by
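The two SSM operations described above can be sketched with toy numbers: generation (mean shape plus weighted modes) and fitting (least-squares mode scores for a target shape). Shapes here are flattened vertex arrays; a real SSM would be built by PCA over a training population:

```python
import numpy as np

def generate(mean, modes, scores):
    """Customise the canonical mean shape by the weighted modes:
    x = mean + modes.T @ scores."""
    return mean + modes.T @ scores

def fit_scores(mean, modes, target):
    """Least-squares mode scores that reconstruct the target shape."""
    return np.linalg.lstsq(modes.T, target - mean, rcond=None)[0]

mean = np.array([0.0, 0.0, 0.0, 0.0])      # 2 vertices x 2D, flattened
modes = np.array([[1.0, 0.0, 1.0, 0.0],    # mode 1: stretch in x
                  [0.0, 1.0, 0.0, -1.0]])  # mode 2: shear in y
target = np.array([2.0, 0.5, 2.0, -0.5])   # an individual's shape

scores = fit_scores(mean, modes, target)
print(np.round(scores, 3))                                 # [2.  0.5]
print(bool(np.allclose(generate(mean, modes, scores), target)))  # True
```

Because the fitted shape carries the canonical triangulation, every annotation on the canonical mesh transfers automatically to the individual.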
- the SSM of each bone contains additional information about anatomical points, regions, axes, and other geometric features on the canonical geometry (e.g. triangulated mesh), for example spheres, cylinders, and cones best fitted to the geometry. Examples are shown in Figures 9A and 9B, in which sketches of a femur 900 are shown marked up with reference to the landmarks, regions and features described below.
- An anatomical landmark 901 can be described by the index of the mesh vertex that is closest to the landmark
- An anatomical region 902 is described by the set of indices of the mesh vertices and faces that fall within the region
- An additional feature is an anatomical axis 903 defined by a line between two landmarks, a line fitted through three or more landmarks, a line or axis 904 fitted through a region, or a line fitted through a combination of landmarks and regions.
- An additional feature is a circle with a centre and radius fitted to three or more anatomical landmarks and/or a region.
- An additional feature is a sphere 907 with a centre 905 and radius 906 fitted to four or more anatomical landmarks and/or a region.
- An additional feature is a plane 908 with an in-plane point and a normal vector calculated from two landmarks or fitted through a region
- An additional feature is a local Cartesian coordinate system with an origin point and three orthogonal vectors calculated from at least three landmarks, or a combination of landmarks, axes, and planes.
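The sphere feature (905-907) fitted to four or more landmarks has a standard closed-form solution: each point satisfies |p|² = 2c·p + (r² − |c|²), which is linear in the centre c and the scalar d = r² − |c|². A sketch of that least-squares fit, with illustrative landmark coordinates:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere through four or more points, via the linear
    system |p|^2 = 2 c.p + d, where d = r^2 - |c|^2.
    Returns (centre, radius)."""
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, d = x[:3], x[3]
    radius = float(np.sqrt(d + centre @ centre))
    return centre, radius

# Four illustrative landmarks lying on a unit sphere centred at (1, 2, 3),
# e.g. points picked on a femoral head to locate its centre:
pts = [(2, 2, 3), (0, 2, 3), (1, 3, 3), (1, 2, 4)]
centre, radius = fit_sphere(pts)
print(np.round(centre, 6), round(radius, 6))  # [1. 2. 3.] 1.0
```

The same linear-least-squares pattern extends to the circle, plane, and axis features listed above.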
- the morphed mesh now becomes an accurate representation of the patient's bone shape that is annotated with the locations of their anatomical landmarks, regions, and features. These landmarks, regions, and features provide targets and constraints for the fitting of implants.
- the next step is to select an implant from the library of implant shapes and sizes and simulate implant fit from the selected implant, or simply perform simulation across the library of implant shapes and sizes.
- Simulation involves the optimisation of fit between predefined regions on the implant geometry and regions on the patient anatomy.
- the fit is between an implant 3D model and the morphed structural (e.g. bone) model from the step above.
- the implant model is also annotated with landmark points and regions
- the quality of the fit is quantified by calculating a score based on one or more geometric and/or functional measurements between the implant and the bones.
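One plausible geometric measurement for such a score is the RMS distance from the implant's contact-region points to the bone surface. The sketch below approximates point-to-surface distance by nearest-vertex distance, which is a simplifying assumption but behaves the same way (lower score = better fit):

```python
import numpy as np

def rms_fit_score(implant_points, bone_vertices):
    """RMS nearest-vertex distance from implant contact points to the
    bone mesh vertices (a stand-in for true point-to-surface distance)."""
    ip = np.asarray(implant_points, float)[:, None, :]  # (n, 1, 3)
    bv = np.asarray(bone_vertices, float)[None, :, :]   # (1, m, 3)
    nearest = np.sqrt(((ip - bv) ** 2).sum(-1)).min(axis=1)
    return float(np.sqrt((nearest ** 2).mean()))

# Illustrative data: a tiny bone "mesh" and two implant poses.
bone = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
good = [(0, 0, 0.1), (1, 0, 0.1)]  # implant hugging the surface
poor = [(0, 0, 2.0), (1, 0, 2.0)]  # implant standing off the bone
print(rms_fit_score(good, bone) < rms_fit_score(poor, bone))  # True
```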
- Landmarks and other features on the fitted implant are used to define geometries (e.g. planes, spheres, cuboids) that are then used to simulate the resection (cutting) of the bone required to deliver the implant operatively.
- the system 100 allows simulation of patient function with their native and implanted anatomy
- the anatomical landmarks and regions on the morphed bone models and implant models are used to construct joint coordinate systems between adjacent bones (e.g. femur and pelvis).
- Native function is concerned with the relative motion of un-implanted bone models
- Implanted function is concerned with the relative motion of models that are the combination of fitted implants and resected bones.
- the function of the patient anatomy is determined by the ability of their bone and implant models to move freely relative to each other as governed by their joint coordinate systems.
- 3D models having the ability to move relative to one another as governed by biomechanical models of passive and active forces provided by muscles, tendons, ligaments, and other anatomical structures, as well as mass, inertia, and other physical properties.
- function can be defined by the ability of the joint to move freely to achieve the motion required for a functional task, e.g. standing up, walking, reaching, grabbing, arm swinging.
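The functional-adequacy test just described reduces to checking that the joint's simulated free range covers the range each task demands. The task requirements below are illustrative numbers, not clinical thresholds:

```python
def covers(free_range, required_range):
    """free_range and required_range are (min_deg, max_deg) joint angles;
    a task is achievable when its required range lies within the range
    the joint can move through freely."""
    lo, hi = free_range
    rlo, rhi = required_range
    return lo <= rlo and rhi <= hi

# Illustrative per-task flexion requirements (degrees):
tasks = {"standing up": (0.0, 95.0), "walking": (0.0, 70.0)}
simulated_rom = (-5.0, 90.0)  # simulated implanted free range

print({t: covers(simulated_rom, r) for t, r in tasks.items()})
# walking is achievable; standing up (needs 95 deg of flexion) is not
```

A result like this is exactly the kind of difference that path 647 or the feedback into the fit simulator 401 would act on.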
- the system 100 includes a user interface which is shown in Figures 10-17.
- a graphical user interface 1000 has a case identifier field 1001 beneath which case state tracker 1002 is provided.
- the case state tracker allows the user to immediately recognise the case status, including without limitation whether the pre-operative plan is complete, whether surgery has been performed and whether post-operative assessment has been completed.
- Controls 1005 include control elements configured to allow the user to make manual adjustments at various stages of the planning process, or to allow the system to perform steps automatically. Summary information on each step is provided in fields 1006-1009 and these may include graphical control elements to allow the user to navigate to processes involved in some or multiple steps and/or use controls 1005 to implement changes in implant selection or positioning for example.
- An approval or sign-off button 1010 allows user or supervisor approval of the plan produced by the system, or alternatively approval for selected steps in the process.
- a field or window 1003 is provided in which a display of a 3D model of the patient anatomy, and of simulated implant fit (in this example) on the 3D model, is portrayed.
- the display or visualisation in window 1003 is able to be manipulated by the user, and this is shown by way of example in figures 11-17.
- the display shows patient function pre- and post-implantation, and allows the user to select implants based on simulation results and adjust implant position and orientation. Furthermore, the user is informed in real time of quantitative changes to implant fit as they make adjustments. Post-operative measurements, native measurements, implant-specific visualisations and image overlay controls are provided in field or window 1004.
- this window provides a multiaxis visualisation of joint centre offset in multiple planes as shown in 1004A and 1004B in Figure 11, with one plane showing joint offset in the posterior-anterior and superior-inferior axes, and the other plane showing joint offset in the medial-lateral and posterior-anterior axes. Reference numerals are omitted from Figures 12-17 for clarity.
- computer, display, and/or input device may each be separate computer systems, applications, or processes, may run as part of the same computer system, application, or process, or one or more may be combined to run as part of one application or process, and/or each may be part of or run on a computer system.
- a computer system may include a bus or other communication mechanism for communicating information, and a processor coupled with the bus for processing information.
- the computer systems may have a main memory, such as a random access memory or other dynamic storage device, coupled to the bus. The main memory may be used to store instructions and temporary variables.
- the computer systems may also include a read-only memory or other static storage device coupled to the bus for storing static information and instructions.
- the computer systems may also be coupled to a display, such as a CRT or LCD monitor.
- Input devices may also be coupled to the computer system. These input devices may include a mouse, a trackball, or cursor direction keys.
- Each computer system may be implemented using one or more physical computers or computer systems or portions thereof.
- the instructions executed by the computer system may also be read in from a computer-readable medium.
- the computer-readable medium may be a CD, DVD, optical or magnetic disk, laserdisc, carrier wave, or any other medium that is readable by the computer system.
- hardwired circuitry may be used in place of or in combination with software instructions executed by the processor.
- Communication among modules, systems, devices, and elements may be over a direct or switched connection, and wired or wireless networks or connections, via directly connected wires, or any other appropriate communication mechanism.
- the communication among modules, systems, devices, and elements may include
- Communication may also include messages related to HTTP, HTTPS, FTP, TCP, IP, ebMS OASIS/ebXML, secure sockets, VPN, encrypted or unencrypted pipes, MIME, SMTP, MIME Multipart/Related Content-type, SQL, etc.
- 3D graphics processing may be used for displaying or rendering including processing based on WebGL, OpenGL, Direct3D, Java 3D, etc.
- Whole, partial, or modified 3D graphics packages may also be used, such packages including 3DS Max, SolidWorks, Maya, Form Z, Cybermotion 3D, Blender, or any others.
- various parts of the needed rendering may occur on traditional or specialized graphics hardware.
- the rendering may also occur on the general CPU, on programmable hardware, on a separate processor, be distributed over multiple processors or multiple dedicated graphics cards, or use any other appropriate combination of hardware or technique.
- conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.
- All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors, such as those computer systems described above.
- the code modules may be stored in any type of computer-readable medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Pathology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Theoretical Computer Science (AREA)
- Artificial Intelligence (AREA)
- Radiology & Medical Imaging (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Physiology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- High Energy & Nuclear Physics (AREA)
- Evolutionary Computation (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Computing Systems (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Psychiatry (AREA)
- Signal Processing (AREA)
- Computational Linguistics (AREA)
- Robotics (AREA)
- Data Mining & Analysis (AREA)
- Transplantation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Fuzzy Systems (AREA)
Abstract
Description
Claims
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CA3145179A CA3145179A1 (en) | 2019-06-28 | 2020-06-29 | Orthopaedic pre-operative planning system |
| JP2021576871A JP2022545151A (en) | 2019-06-28 | 2020-06-29 | Pre-Orthopedic Surgery Planning System |
| US17/620,291 US12226163B2 (en) | 2019-06-28 | 2020-06-29 | Orthopaedic pre-operative planning system |
| AU2020307681A AU2020307681A1 (en) | 2019-06-28 | 2020-06-29 | Orthopaedic pre-operative planning system |
| EP20831426.0A EP3989856A4 (en) | 2019-06-28 | 2020-06-29 | ORTHOPEDIC PREOPERATIVE PLANNING SYSTEM |
| US19/026,033 US20250169885A1 (en) | 2019-06-28 | 2025-01-16 | Orthopaedic pre-operative planning system |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| NZ75500519 | 2019-06-28 | ||
| NZ755005 | 2019-06-28 | ||
| NZ76367920 | 2020-04-20 | ||
| NZ763679 | 2020-04-20 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/620,291 A-371-Of-International US12226163B2 (en) | 2019-06-28 | 2020-06-29 | Orthopaedic pre-operative planning system |
| US19/026,033 Continuation US20250169885A1 (en) | 2019-06-28 | 2025-01-16 | Orthopaedic pre-operative planning system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020261249A1 true WO2020261249A1 (en) | 2020-12-30 |
Family
ID=74061829
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2020/056143 Ceased WO2020261249A1 (en) | 2019-06-28 | 2020-06-29 | Orthopaedic pre-operative planning system |
Country Status (5)
| Country | Link |
|---|---|
| EP (1) | EP3989856A4 (en) |
| JP (1) | JP2022545151A (en) |
| AU (1) | AU2020307681A1 (en) |
| CA (1) | CA3145179A1 (en) |
| WO (1) | WO2020261249A1 (en) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021215936A1 (en) * | 2020-04-20 | 2021-10-28 | Formus Labs Limited | Surgical system |
| WO2023096706A1 (en) * | 2021-11-24 | 2023-06-01 | Smith & Nephew, Inc. | System and method for determining femoral contact points |
| EP4333755A4 (en) * | 2021-05-06 | 2025-03-12 | Zimmer Biomet Pty Ltd | DYNAMIC HIP ARTHROPLASTY OPTIMIZATION SYSTEM |
| JP2025523511A (en) * | 2022-06-24 | 2025-07-23 | Paragon 28, Inc. | Implant Identification |
| EP4426222A4 (en) * | 2021-11-03 | 2025-08-13 | Carlsmed Inc | PATIENT-SPECIFIC ARTHROPOLASTY DEVICES AND ASSOCIATED SYSTEMS AND METHODS |
| WO2025226549A1 (en) * | 2024-04-23 | 2025-10-30 | Howmedica Osteonics Corp. | Shoulder arthroplasty visualization and planning |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012113030A1 (en) * | 2011-02-25 | 2012-08-30 | Optimized Ortho Pty Ltd | A computer-implemented method, a computing device and a computer readable storage medium for providing alignment information data for the alignment of an orthopaedic implant for a joint of a patient |
| WO2018067966A1 (en) | 2016-10-07 | 2018-04-12 | New York Society For The Relief Of The Ruptured And Crippled, Maintaining The Hostpital For Special Surgery | Patient specific 3-d interactive total joint model and surgical planning system |
| CN109157286A (en) * | 2018-10-25 | 2019-01-08 | 北京爱康宜诚医疗器材有限公司 | Data predication method and device |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8078440B2 (en) * | 2008-09-19 | 2011-12-13 | Smith & Nephew, Inc. | Operatively tuning implants for increased performance |
| CN103796609A (en) * | 2011-07-20 | 2014-05-14 | 史密夫和内修有限公司 | Systems and methods for optimizing implant fit to anatomy |
| US10258256B2 (en) * | 2014-12-09 | 2019-04-16 | TechMah Medical | Bone reconstruction and orthopedic implants |
| AU2016371923B2 (en) * | 2015-12-17 | 2019-05-09 | Materialise N.V. | Pre-operative determination of implant configuration for soft-tissue balancing in orthopedic surgery |
| EP3878391A1 (en) * | 2016-03-14 | 2021-09-15 | Mohamed R. Mahfouz | A surgical navigation system |
| WO2018009794A1 (en) * | 2016-07-08 | 2018-01-11 | Biomet Manufacturing, Llc | Reverse shoulder pre-operative planning |
2020
- 2020-06-29 WO PCT/IB2020/056143 patent/WO2020261249A1/en not_active Ceased
- 2020-06-29 CA CA3145179A patent/CA3145179A1/en active Pending
- 2020-06-29 JP JP2021576871A patent/JP2022545151A/en active Pending
- 2020-06-29 AU AU2020307681A patent/AU2020307681A1/en active Pending
- 2020-06-29 EP EP20831426.0A patent/EP3989856A4/en active Pending
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3989856A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3989856A1 (en) | 2022-05-04 |
| JP2022545151A (en) | 2022-10-26 |
| EP3989856A4 (en) | 2022-08-24 |
| AU2020307681A1 (en) | 2022-01-20 |
| CA3145179A1 (en) | 2020-12-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20250169885A1 (en) | Orthopaedic pre-operative planning system | |
| JP7662595B2 (en) | Bone reconstruction and orthopedic implants | |
| EP3989856A1 (en) | Orthopaedic pre-operative planning system | |
| JP6833912B2 (en) | Bone reconstruction and orthopedic implants | |
| Jun et al. | Design of patient-specific hip implants based on the 3D geometry of the human femur | |
| CN107847275A (en) | For preoperative planning positioning the system and method for soft tissue | |
| US20250195143A1 (en) | Patient-specific orthopedic implant evaluation | |
| JP7704965B2 (en) | Surgical planning system and method for preoperative assessment of center of rotation data - Patents.com | |
| Kotecki et al. | Automation of the Determining Parameters Process Used to Assess the State of Hip Joint Degeneration Based on CT Imaging | |
| CN121189101A (en) | Personalized orthopedic surgery planning system and method based on multidimensional biomechanics simulation and closed-loop optimization | |
| WO2026035151A1 (en) | Method and system for optimising orthopaedic treatment | |
| Vartziotis et al. | Integrated digital engineering methodology for virtual orthopedics surgery planning |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20831426 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2021576871 Country of ref document: JP Kind code of ref document: A Ref document number: 3145179 Country of ref document: CA |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2020307681 Country of ref document: AU Date of ref document: 20200629 Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2020831426 Country of ref document: EP |