
WO2001027702A1 - Micropositioning system - Google Patents

Micropositioning system

Info

Publication number
WO2001027702A1
WO2001027702A1 (PCT/GB2000/003817)
Authority
WO
WIPO (PCT)
Prior art keywords
tool
image
projected image
micropositioning
micropositioning system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB2000/003817
Other languages
English (en)
Inventor
Anthony James Douglas
Paul Edward Jarvis
Kevin William Beggs
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Priority to AT00964511T priority Critical patent/ATE247840T1/de
Priority to CA002385541A priority patent/CA2385541C/fr
Priority to EP00964511A priority patent/EP1221076B1/fr
Priority to JP2001530653A priority patent/JP3504936B2/ja
Priority to DE60004692T priority patent/DE60004692T2/de
Priority to AU75441/00A priority patent/AU7544100A/en
Priority to US09/700,880 priority patent/US6472676B1/en
Publication of WO2001027702A1 publication Critical patent/WO2001027702A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/401Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes

Definitions

  • This invention relates to the general field of manufacturing and more particularly to the positioning of tools for carrying out manufacturing or inspection operations.
  • the accuracy with which the tools are positioned is dependent upon the level of accuracy required in the finished product. For example, in aircraft manufacture, many components need to be produced to very high standards of accuracy, and are often fitted and finished by hand to meet the required tolerances.
  • the present invention seeks to alleviate the problem of expense associated with the known tool positioning methods described above, by providing an apparatus and a method for accurately positioning tools for use in manufacturing or inspection operations whilst reducing the need for costly tooling such as jigs.
  • a micropositioning system comprising: a radiation source for projecting an image onto a surface of an article, the image being part of a manufacturing template and the image representing a predetermined position on the surface of the article where a manufacturing or inspection operation is to be undertaken; a radiation detector for detecting the projected image; tool conveyancing means for carrying a tool adapted to perform manufacturing or inspection operations; processor means for calculating at least two dimensional co-ordinates of the projected image detected by the radiation detector relative to the tool; and control means for controlling the tool conveyancing means so as to position the tool in a predefined spatial relationship with the projected image in response to a signal from the processor means.
  • the information contained within the manufacturing template is obtained directly from a CAD model of the article.
  • the radiation source may be a laser.
  • the radiation source provides radiation visible to the human eye so that an operator may view the image.
  • the radiation source may be for example a Virtek Laseredge 3D laser projection system. Two radiation sources may be used for complex surfaces.
  • the radiation source may project an image in the form of an ellipse.
  • the radiation source may alternatively project an image in the form of a cross, or a circle.
  • the image is preferably of a size in the range 0.5 to 3.0 cm.
  • the image is projected onto a surface at a location where a manufacturing or inspection operation is to be carried out. Several images may be simultaneously projected to provide, for example, a drill template on a surface such as an aircraft panel.
  • the radiation detector preferably comprises a camera and an image processing system.
  • the camera may comprise an array of solid state charge coupled devices (CCDs).
  • the array may be linear or rectangular.
  • the CCDs produce a charge proportional to the amount of light falling on them and the charge from each device in the array is preferably used by the image processing system to build up an image.
  • the image processing system preferably comprises a frame grabber for digitising the image and a computer adapted for processing the image.
  • the image is advantageously processed by the computer to identify features such as areas of the same intensity or changes in intensity, for example.
  • the image processor advantageously is thereby able to identify an image such as a cross projected by the radiation source, and locate the centre of the image.
  • the tool conveyancing means may comprise a tool holding device, for example, a chuck.
  • the tool conveyancing means preferably further comprises a moveable stage.
  • the tool holding device is advantageously mounted on the moveable stage.
  • the moveable stage is preferably able to move in at least x and y directions, where the x and y directions are normal to each other and are in one plane (the x-y plane).
  • the moveable stage may be servo motor actuated.
  • the moveable stage may additionally be able to move in a z direction, where the z direction is normal to the x-y plane.
  • the tool holding device may be adapted to move in the z direction.
  • the tool holding device is advantageously mounted to the moveable stage in a manner such that the tool holding device may move relative to the moveable stage in the z direction.
  • the moveable stage is preferably mounted on a platform, such that it is able to move relative to the platform.
  • the platform preferably comprises attachment means for allowing the platform to be releasably attached to the surface.
  • the attachment means may comprise a vacuum sucker.
  • the vacuum sucker may comprise a rubber seal and venturi ejector vacuum pump.
  • the attachment means may comprise a magnetic portion, if the surface is ferrous.
  • the attachment means may comprise a mechanical fastener, such as a bolt or clamp, for example.
  • the platform may comprise one or more adjustable feet for allowing the micropositioning system to operate on curved or uneven surfaces.
  • the adjustable feet are preferably individually adjustable, and are for adjusting the distance between the surface and the platform.
  • the adjustable feet may be manually or automatically adjustable, and may utilise hydraulic or electrical jacks, or telescopic or screw thread mechanical arrangements.
  • the micropositioning system preferably comprises normalisation means for checking that the tool is substantially normal to the surface prior to a manufacturing operation being carried out.
  • the normalisation means may automatically control the adjustable feet to ensure that the platform is stable with respect to the surface and to alter the orientation of the platform and with it the inclination of the tool.
  • the normalisation means may comprise a sensor such as, for example, a linear potentiometer.
  • the normalisation means may comprise at least two sensors located on the platform in a manner such that, in use, the sensors are adjacent the surface.
  • the normalisation means may comprise a sensor such as, for example, a radiation source and reflected radiation detector system, where at least two such sensors are located on the platform such that, in use, the sensors are perpendicular to the surface.
  • the sensors are preferably used to determine if the moveable stage of the platform is substantially parallel to the surface in cases where the surface is substantially flat, or in the case of a curved surface, whether the moveable stage mounted on the platform is substantially tangential to the surface.
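As a minimal sketch of how such sensor readings could be combined (the patent does not prescribe an algorithm; the function name, tolerance value, and millimetre units below are assumptions), the platform can be treated as parallel or tangential to the surface when all gap readings agree within a small tolerance:

```python
def platform_normal(sensor_gaps_mm, tol_mm=0.2):
    """Hypothetical parallelism check: the platform (and hence the
    tool axis) is treated as normal to the surface when all
    platform-to-surface gap readings agree within tol_mm."""
    spread = max(sensor_gaps_mm) - min(sensor_gaps_mm)
    return spread <= tol_mm
```

With four sensors near the corners of the platform, near-equal readings such as `[5.0, 5.1, 5.05, 4.95]` would pass the check, while a 1 mm disparity on one foot would fail it.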
  • the normalisation means may further comprise a tool normalisation aid for checking that the tool is normal to the moveable stage.
  • the processor means advantageously uses data obtained from the image processing system to determine the location of the image with respect to the position of the tool.
  • the control means may comprise a servo motor and a motion controller.
  • the control means preferably comprises at least two servo motors, at least one for actuating movement of the moveable stage in the x direction and at least one for actuating movement of the moveable stage in the y direction.
  • the motion controller advantageously controls the movement of the moveable stage in at least the x and y directions.
  • the control means may further comprise a servo motor for actuating movement of the tool holder in the z direction.
  • the motion controller may control the movement of the tool holder in the z direction.
  • the processor means are adapted to communicate with the control means.
  • the tool conveyancing means may comprise an extendable arm for holding a tool.
  • the tool may be a drill.
  • the tool may be a milling tool or a grinding tool or a welding tool or a rivet insertion tool.
  • the tool may be an inspection tool or a non destructive testing tool.
  • the tool may be a spray gun or blast gun.
  • a camera may be provided on the tool holder, for sending a 'tool's eye view' to a monitor visible to the micropositioning device operator. The operator is then able to visually verify that the operation is being carried out on the surface correctly and at the place where the image is being projected.
  • the platform, moveable plate and tool holding means are preferably mainly manufactured from a material having light weight and good strength, for example, aluminium alloy or carbon fibre composite.
  • a handle is preferably provided on the platform for enabling an operator to position the platform on the surface to be drilled.
  • a method for accurately positioning tools comprising at least the steps of: projecting an image onto a surface of an article, the image being part of a manufacturing template and the image representing a predetermined position on the surface of the article where a manufacturing or inspection operation is to be undertaken; detecting the projected image; processing the projected image; calculating at least two dimensional co-ordinates of the projected image relative to a tool adapted to perform manufacturing or inspection operations; and moving the tool so that it is positioned in a predefined spatial relationship with respect to the projected image.
  • a feature such as an area having a greater intensity than its surroundings is identified by an image processing system. The centre of the area may then be determined by the image processing system.
  • a feature such as a change in intensity between adjacent areas may be identified by the image processing system, corresponding to a boundary of a projected image.
  • the image processing system locates the centre of the projected image.
  • the two dimensional co-ordinates of the centre of the projected image relative to a tool are then advantageously calculated by a processor.
  • the lighting is preferably controlled to give a high contrast between the projected image on the surface and the rest of the surface.
  • the lighting is chosen to minimise unwanted reflections, shadows, and other uneven illumination.
  • the tool is manoeuvrable in the x, y and z directions, where the x and y directions preferably represent a two dimensional plane substantially parallel or tangential to the surface and the z direction is normal to the x, y plane.
  • the tool is preferably held in an x, y plane substantially parallel or tangential to the surface, and displaced in the z direction toward or away from the surface.
  • the tool is normalised so that in use its line of action is normal to the surface.
  • the processor sends a signal to cause the tool to be moved in the x, y plane so that it is located at the same x, y co-ordinates as the centre of the projected image.
  • the motion of the tool in the x, y plane is preferably achieved by a servo motor.
  • one servo motor controls movement in the x direction and one motor controls movement in the y direction.
  • the servo motors are preferably controlled by a motion controller which receives move command instructions from the processor.
  • the processor works out how the tool needs to move in the x and y directions to be at the same x and y coordinates as the centre of the image and then instructs the motion controller to actuate the servo motors to achieve this movement.
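The offset calculation described above can be sketched as follows; the encoder resolution and function names are illustrative assumptions, not taken from the patent:

```python
COUNTS_PER_MM = 100  # assumed servo encoder resolution

def move_command(tool_xy, target_xy, counts_per_mm=COUNTS_PER_MM):
    """Offset between the tool and the image centre, expressed as
    encoder counts for the x and y axis servo motors."""
    dx_mm = target_xy[0] - tool_xy[0]
    dy_mm = target_xy[1] - tool_xy[1]
    return round(dx_mm * counts_per_mm), round(dy_mm * counts_per_mm)
```

The motion controller would then drive each axis through the returned count offsets so the drill arrives over the image centre.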
  • the movement of the tool in the z direction may be achieved for example by a pneumatic cylinder or by a servo motor.
  • the rate of movement in the z direction of the tool is preferably controlled by an adjustable spring damper unit.
  • the platform is releasably attached to the surface by the operator prior to undertaking a manufacturing operation.
  • the operator may position the platform adjacent the projected image.
  • the operator then preferably checks that the platform is positioned correctly. This check may be undertaken using normalisation sensors.
  • the tool is prevented from operating when the normalisation sensors indicate that the platform is not positioned correctly.
  • the normalisation sensors may control the movement of adjustable feet to ensure that the platform is stable with respect to the surface, and to alter the orientation of the platform. Alternatively the operator may manually control the movement of the adjustable feet.
  • the micropositioning system is preferably calibrated to allow the x, y co-ordinates within the field of view of the radiation detector to be linked to the x, y position of the tool.
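One simple way such a calibration could be realised (the two-point linear fit below is an assumption; the patent only states that camera and tool co-ordinates are linked) is to observe the tool at two known stage positions and fit a per-axis scale and offset:

```python
def calibrate_axis(px_a, mm_a, px_b, mm_b):
    """Fit a linear pixel-to-millimetre mapping for one axis from two
    reference observations of (pixel position, stage position)."""
    scale = (mm_b - mm_a) / (px_b - px_a)
    offset = mm_a - scale * px_a
    return scale, offset

def px_to_mm(px, scale, offset):
    """Convert a camera pixel co-ordinate to a stage co-ordinate."""
    return scale * px + offset
```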
  • the operator is able to visually inspect the result of the operation on a monitor, the monitor receiving an image of the surface from a camera located adjacent the tool.
  • Figure 1 shows a schematic diagram of the system according to the present invention.
  • Figure 2 shows an isometric view of the micropositioning unit part of the system.
  • Figure 3 shows a plan view of a part of the system.
  • Figure 4 shows a side view of the part of the system shown in Figure 2.
  • Figure 5 shows an image projected by the system.
  • FIG. 1 shows a schematic diagram of a micropositioning system 1 in accordance with the present invention.
  • a micropositioning unit 3 is placed on a surface 5.
  • the surface 5 is an aircraft panel curved to match the fuselage profile, the radius of curvature being 2m.
  • the micropositioning unit comprises a moveable stage 7 mounted on a platform 9.
  • the platform 9 has vacuum suckers 11 attached to its underside for releasably attaching the platform 9 to the surface 5.
  • a tool holder 13 is mounted on the moveable stage 7.
  • a tool 15 is held in the tool holder 13.
  • a camera 17 and a light source 19 are mounted adjacent the tool 15, the light source 19 projecting a beam 21 onto the surface 5.
  • the moveable stage 7 is operable by an x direction servo motor 23 and a y direction servo motor 25, where the x and y directions are substantially in the same plane as the moveable stage.
  • the camera 17 is connected to a monitor 27 and to a camera power source 29.
  • the tool 15 is connected to a tool power source 31 and the micropositioning unit 3 is connected to a unit power source 33.
  • the camera 17 is also connected to an image processor 35 that forms part of a processing unit 37.
  • the processing unit 37 further comprises a processor 39, a control panel 63, and a motion controller 41.
  • the motion controller controls the x and y direction servo motors, 23 and 25 respectively.
  • the control panel 63 comprises operator controls, such as button 65.
  • a laser projector 43 is positioned to project a beam of radiation 45 onto surface 5, adjacent the micropositioning unit 3.
  • Figure 2 shows an isometric view of the micropositioning unit 3 positioned on the surface of an aircraft panel 47.
  • the laser projector (not shown) is projecting a row of three crosses 49, 51, 53 onto the panel 47.
  • the micropositioning unit is releasably attached to the panel 47 by vacuum suckers 11.
  • a drill 55 is held by the tool holder 13, and a camera 17 is mounted adjacent drill 55.
  • a handle 57 is provided on the micropositioning unit 3 for allowing an operator to lift the unit 3 more easily.
  • Figure 3 shows a plan view of the micropositioning unit 3.
  • the unit 3 has four vacuum suckers 11 on its base for releasably attaching the platform 9 to a surface.
  • the unit 3 also has normalisation sensors 59 adjacent the vacuum suckers 11 for ensuring that the platform 9 and associated moveable stage 7 are parallel or tangential with respect to a surface.
  • the unit 3 also comprises a tool control means 61 for controlling the z direction movement of a tool held in the tool holder 13, and a moveable stage 7 for moving the tool in the x and y directions.
  • the z direction is substantially normal to the plane of the moveable stage.
  • Figure 4 shows a side view of part of the micropositioning unit 3 of Figure 2.
  • a drill 55 is held in a tool holder 13.
  • the tool holder 13 is mounted on a moveable stage 7, the moveable stage 7 being mounted on a platform 9.
  • Tool control means 61 controls the z direction movement of the drill 55.
  • the micropositioning system 1 is positioned near to a surface on which a machining or inspecting operation is to be carried out.
  • the surface is preferably substantially flat and may have a curvature radius of 2 m or over.
  • the surface is an aircraft panel 47.
  • the curvature radius may be significantly smaller than 2m.
  • the laser projector 43 projects a beam of radiation 45 in the form of a cross 49 onto the aircraft panel 47.
  • the aircraft panel 47 is held in a jig (not shown) which has been datumed so that the precise position of the panel is known with respect to the laser projector and so the cross 49 is projected onto the exact part of the panel 47 which needs to be drilled.
  • the laser projector may project several crosses, as a drill template, onto the panel 47 if several drilling operations are required.
  • the laser projector 43 takes its drill template from the CAD model of the aircraft panel 47.
  • the cross 49 should be within the field of view of the camera 17.
  • a light source 19, such as a small torch, is mounted adjacent the camera 17 and is set to project light 21 onto the panel 47 to indicate the field of view of the camera 17. The operator can then easily see whether the cross 49 is within the field of view of the camera 17.
  • the micropositioning unit 3 has a green light and a red light associated with the normalisation sensors 59. If the normalisation sensors 59 indicate that the unit is not correctly positioned, the red light comes on, and the operator is prevented from moving the drill 55 in the z direction. The operator must then adjust the unit until the green light comes on, indicating that the unit 3 is positioned correctly.
  • the unit 3 is then releasably clamped to the panel 47 using the vacuum suckers 11 on the platform 9 of the unit 3.
  • the vacuum is activated by the operator pressing a button (not shown) on the unit 3.
  • the operator is able to see the "camera's eye view" of the panel 47 on the monitor 27, which receives images from the camera 17.
  • when the operator is happy with the position of the micropositioning unit 3, he activates the drilling process by pressing a button 65 located on the processing unit control panel 63.
  • the camera 17 then captures an image of the cross 49 on the panel 47 and the image processor 35 identifies the cross 49 and calculates its centre. This technique is described further with reference to Figure 5.
  • the processor 39 then calculates the relative position of the centre of the cross 49 with respect to the drill 55 and calculates the distance the drill 55 needs to move in the x and y directions to be positioned at the same x and y co-ordinates as the centre of the cross 49.
  • the processor 39 imparts this information to the motion controller 41, which controls the operation of the x and y direction servo motors 23, 25 that move the moveable stage 7.
  • the tool 15 is positioned in the tool holder 13 such that the tool 15 will be normal to the surface of the panel 47 when the platform 9 and associated moveable stage 7 are substantially parallel or tangential to the surface of the panel 47.
  • the normalisation sensors 59 that indicate whether the platform 9 is substantially parallel or tangential to a surface will thereby also indicate whether the tool 15 is normal to the surface.
  • the tool holder 13 is mounted onto the moveable stage 7, and may have an extensible arm. In this example the tool holder 13 has no extensible arm.
  • the servo motors 23, 25 move the moveable stage 7 until the drill bit is positioned at the same x, y co-ordinates as the centre of the cross 49.
  • the drill 55 is then started automatically and moved in the z direction to drill a hole into the panel 47 through the centre of the cross 49.
  • the drill 55 is then retracted so that the camera 17 can display a live image of the cross and the hole on the monitor 27.
  • the operator can then inspect the hole using the monitor 27 to confirm that the hole was drilled in the correct place.
  • Figure 5 shows the field of view 71 of the camera 17.
  • a cross 49 is projected by the laser projector 43 onto a surface.
  • the cross has a centre 73.
  • the camera 17 uses an array of charge coupled devices (CCDs) which produce a charge proportional to the light falling on them.
  • the array may be linear or rectangular.
  • the charge from each device in the array is used to build up an image 75 comprising numerous pixels 79, 81 where each CCD corresponds to a pixel.
  • the intensity of each pixel corresponds to the charge generated by the corresponding CCD.
  • a monitor 27 is used to display the image to allow the operator to check that the cross 49 is in the correct position.
  • the image 75 corresponds to the field of view 71 of the camera 17.
  • the CCDs directed at the portion of the surface onto which the cross is projected receive more illumination than those CCDs directed at portions of the surface that do not have a cross projected onto them.
  • the CCDs receiving illumination from the projected cross generate a higher charge output than those not receiving illumination, and so the corresponding image 75 comprises pixels of greater intensity 79 and pixels of lower intensity 81, the pixels of greater intensity forming an image of the cross 77.
  • the pixels at the extremities 79, 83 of the image of the cross 77 are of lower intensity than those closer to the centre of the image of the cross 85, 87 as the CCDs directed at the extremities of the projected cross do not receive as much illumination as those directed at the centre of the projected cross.
  • the image 75 is processed by the image processor 35.
  • the image 75 can be processed to identify various features such as areas of the same intensity, for example a 'blob' of light or changes in intensity, as for example at an edge of a projected feature.
  • the image processing algorithm used to identify and locate the centre of a projected cross is as follows:
    a) The image 75 is 'thresholded' to leave only those pixels above a certain intensity. In the example of Figure 5, the intensity threshold would be set to eliminate the lower intensity pixels 81 and those pixels at the extremities of the image of the cross.
    b) Adjacent pixels above the threshold are joined to form clusters or blobs.
    c) A bounding box is defined around each blob.
    d) A series of statistics are calculated for each blob, including: the centre, the centre of gravity, and the size.
    e) All blobs below a pre-set size are discarded.
    f) The largest blob is chosen as the detected feature.
    g) The bounding box is shrunk by 20%.
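The thresholding and blob-finding steps above can be sketched in code. This is an illustrative reconstruction, not the patented implementation: the 4-connectivity, list-of-rows image format, and default minimum blob size are assumptions.

```python
def find_cross_centre(image, threshold, min_size=5):
    """Threshold the image, cluster adjacent above-threshold pixels
    into blobs, discard small blobs, and return the bounding-box
    centre (x, y) of the largest blob.  `image` is a list of rows of
    pixel intensities."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                # flood-fill one blob of 4-connected bright pixels
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(pixels)
    blobs = [b for b in blobs if len(b) >= min_size]
    if not blobs:
        return None
    best = max(blobs, key=len)
    ys = [p[0] for p in best]
    xs = [p[1] for p in best]
    # a symmetric 20% shrink of the bounding box does not move its
    # centre, so the centre of the unshrunk box is returned directly
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)
```

For a synthetic image containing a bright cross plus a single stray bright pixel, the stray pixel is discarded as an undersized blob and the centre of the cross is returned.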
  • the laser projector may project features other than crosses, for example circular or elliptical blobs.
  • the image processing algorithm follows a similar process to that described for the cross up to and including step f, but then the x, y co-ordinates of the centre of gravity of the largest blob are determined and these x, y co-ordinates are sent to the processor.
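For such circular or elliptical blobs, the centre of gravity could be computed as an intensity-weighted centroid; this is a sketch under the assumption that a blob is represented as a list of (row, column) pixel positions.

```python
def blob_centre_of_gravity(pixels, image):
    """Intensity-weighted centroid (x, y) of a blob, where `pixels`
    is a list of (row, col) positions and `image` is a list of rows
    of pixel intensities."""
    total = sum(image[y][x] for y, x in pixels)
    cx = sum(x * image[y][x] for y, x in pixels) / total
    cy = sum(y * image[y][x] for y, x in pixels) / total
    return cx, cy
```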
  • the image processing can be made much easier and more accurate by careful control of the environment to improve the image and simplify the analysis. This can be assisted by controlling the lighting, for example, to ensure that the lighting is constant over the surface with no reflections or shadows and that there is a reasonable difference in intensity between the projected feature and the background lighting.
  • Providing a datum for the article being manufactured or inspected and establishing the position of the article with respect to the radiation source before utilising the present invention is also important, as the feature must be projected onto the correct part of a surface. Also the distance between the camera and the article needs to be known to ascertain the scale, and the position of the drill with respect to the field of view of the camera must also be known.
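As an illustration of the scale relationship (the thin-lens approximation and the parameter values in the usage note are assumptions, not figures from the patent), the width of surface imaged by one pixel grows linearly with the camera-to-surface distance:

```python
def mm_per_pixel(distance_mm, pixel_pitch_mm, focal_length_mm):
    """Thin-lens estimate of image scale: the width of surface, in
    millimetres, imaged onto a single CCD pixel at the given
    camera-to-surface working distance."""
    return distance_mm * pixel_pitch_mm / focal_length_mm
```

For example, with an assumed 8 mm lens and 10 µm pixels at a 200 mm stand-off, each pixel would cover 0.25 mm of the surface.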
  • a link into the CAD model of the article can be used to relate the contents of the image to the article. The article may be clamped to a work table, and its position relative to the radiation source can then be determined using known positioning techniques prior to projecting the image, thereby reducing the need for jigs.
  • the base of the micropositioning unit may contain wheels or casters to assist with positioning, and adjustable feet to allow improved operation on curved surfaces; means other than vacuum suckers, such as clamps or electromagnetic devices, for example, may be used for releasably securing the unit to a surface.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Machine Tool Sensing Apparatuses (AREA)
  • Inorganic Fibers (AREA)
  • Crystals, And After-Treatments Of Crystals (AREA)
  • Low-Molecular Organic Synthesis Reactions Using Catalysts (AREA)
  • Numerical Control (AREA)
  • Control Of Position Or Direction (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Magnetic Bearings And Hydrostatic Bearings (AREA)
  • Linear Motors (AREA)
  • Manipulator (AREA)

Abstract

The present invention concerns an apparatus and a method for accurately positioning tools for manufacturing or inspection operations, reducing the need for costly tooling such as jigs. The method for accurately positioning tools comprises at least the steps of: projecting an image onto a surface; detecting the projected image; processing the projected image; calculating at least two dimensional co-ordinates of this image relative to a tool; and moving the tool so that it is positioned in a predefined spatial relationship with respect to the projected image. The micropositioning system comprises a radiation source for projecting an image onto a surface, a radiation detector for detecting the projected image, tool conveyancing means for carrying a tool, a processor for calculating at least two dimensional co-ordinates of the projected image detected by the radiation detector relative to the tool, and control means for controlling the tool conveyancing means so as to position the tool in a predefined spatial relationship with the projected image in response to a signal from the processor.
PCT/GB2000/003817 1999-10-09 2000-10-06 Micropositioning system Ceased WO2001027702A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
AT00964511T ATE247840T1 (de) 1999-10-09 2000-10-06 Mikropositionierungssystem
CA002385541A CA2385541C (fr) 1999-10-09 2000-10-06 Systeme de micropositionnement
EP00964511A EP1221076B1 (fr) 1999-10-09 2000-10-06 Systeme de micropositionnement
JP2001530653A JP3504936B2 (ja) 1999-10-09 2000-10-06 マイクロポジショニングシステム
DE60004692T DE60004692T2 (de) 1999-10-09 2000-10-06 Mikropositionierungssystem
AU75441/00A AU7544100A (en) 1999-10-09 2000-10-06 Micropositioning system
US09/700,880 US6472676B1 (en) 1999-10-09 2000-10-06 Micropositioning system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9923795.0 1999-10-09
GBGB9923795.0A GB9923795D0 (en) 1999-10-09 1999-10-09 Micropositioning system

Publications (1)

Publication Number Publication Date
WO2001027702A1 true WO2001027702A1 (fr) 2001-04-19

Family

ID=10862338

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2000/003817 Ceased WO2001027702A1 (fr) 1999-10-09 2000-10-06 Systeme de micropositionnement

Country Status (12)

Country Link
US (1) US6472676B1 (fr)
EP (1) EP1221076B1 (fr)
JP (1) JP3504936B2 (fr)
AT (1) ATE247840T1 (fr)
AU (1) AU7544100A (fr)
CA (1) CA2385541C (fr)
DE (1) DE60004692T2 (fr)
ES (1) ES2200941T3 (fr)
GB (1) GB9923795D0 (fr)
MY (1) MY125388A (fr)
TW (1) TW514577B (fr)
WO (1) WO2001027702A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003001359A (ja) * 2001-06-20 2003-01-07 Daiwa House Ind Co Ltd リベット締結方法およびそれに用いるリベット締結具
DE102007026100A1 (de) * 2007-06-05 2008-12-11 Airbus Deutschland Gmbh Bearbeitungsvorrichtung und Verfahren zum Bearbeiten eines Schichtverbundwerkstoffs
US8393834B2 (en) 2007-06-05 2013-03-12 Airbus Operations Gmbh Machining apparatus and method for machining a laminate
DE102016224435A1 (de) * 2016-12-08 2018-06-14 Robert Bosch Gmbh Werkzeug und Verfahren zum Führen eines Benutzers eines Werkzeugs bei einer Behandlung mindestens eines Werkstücks

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7880116B2 (en) * 2003-03-18 2011-02-01 Loma Linda University Medical Center Laser head for irradiation and removal of material from a surface of a structure
US6934607B2 (en) 2003-03-21 2005-08-23 Fmc Technologies, Inc. Method and apparatus for visually indexing objects upon a moving surface
US7755761B2 (en) * 2004-11-12 2010-07-13 The Boeing Company Self-normalizing contour drilling machine
DE102005026012B4 * 2005-06-07 2008-04-17 Airbus Deutschland Gmbh Hand-held device for producing bores, recesses or flat surfaces
EP2036648A1 * 2007-09-14 2009-03-18 Plakoni Engineering Device for facilitating manual welding with a plurality of light-projecting marking units
JP2010256341A * 2009-03-31 2010-11-11 Toshiba Mach Co Ltd Cutting edge position detection method and cutting edge position detection device
US8413307B2 (en) * 2009-07-06 2013-04-09 The Boeing Company Guide assembly and method
US8899886B1 (en) * 2009-11-25 2014-12-02 The Boeing Company Laser signature vision system
WO2015062646A1 * 2013-10-30 2015-05-07 Proceq Ag Assembly and method for inspecting an object, in particular a structure
JP2017522196A * 2014-07-09 2017-08-10 Magswitch Technology Inc. Magnetic tool stand
TWI549792B (zh) * 2014-10-16 2016-09-21 Gison Machinery Co Ltd Pneumatic machinery
US10646930B2 (en) * 2017-02-03 2020-05-12 The Boeing Company System and method for precisely drilling matched hole patterns using surface mapped features
KR102197729B1 * 2018-12-13 2021-01-04 주식회사 강한이노시스 Industrial machinery diagnosis and position control system
CN113680891A * 2020-05-19 2021-11-23 四川精创通信网络维护有限公司 Hydraulic punching machine for precision hole punching

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4453085A (en) * 1981-05-11 1984-06-05 Diffracto Ltd. Electro-optical systems for control of robots, manipulator arms and co-ordinate measuring machines
US4720870A (en) * 1985-06-18 1988-01-19 Billiotte Jean Marie Method of automatically and electronically analyzing patterns in order to distinguish symmetrical perceptible areas in a scene together with their centers of symmetry
EP0331108A2 * 1988-03-04 1989-09-06 Max Mayer Maschinenbau GmbH Burlafingen Method and device for positioning workpieces
EP0433803A1 * 1989-12-21 1991-06-26 Hughes Aircraft Company Apparatus for locating solder joints
FR2707017A1 (en) * 1993-06-25 1994-12-30 Couval Sa Device for projecting an image on a machine-tool table
WO1997010925A2 * 1995-09-21 1997-03-27 Douglas Industries, Inc. Window glass cleaning apparatus
US5633707A (en) * 1993-05-18 1997-05-27 Seemann; Henry R. Method for non-destructive inspection of an aircraft
US5663885A (en) * 1994-04-23 1997-09-02 Stahl; Anton Procedure and device for processing cutting material
EP0902345A2 * 1997-09-12 1999-03-17 Mitutoyo Corporation Sensor coordinate system driving apparatus

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3795449A (en) * 1973-01-11 1974-03-05 Lockheed Missiles Space Cutter monitor
US4373804A (en) 1979-04-30 1983-02-15 Diffracto Ltd. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
US4468695A (en) 1980-11-20 1984-08-28 Tokico Ltd. Robot
US6317953B1 (en) * 1981-05-11 2001-11-20 Lmi-Diffracto Vision target based assembly
US4412121A (en) 1981-08-28 1983-10-25 S R I International Implement positioning apparatus and process
US4654949A (en) 1982-02-16 1987-04-07 Diffracto Ltd. Method for automatically handling, assembling and working on objects
US4523100A (en) * 1982-08-11 1985-06-11 R & D Associates Optical vernier positioning for robot arm
US4611292A (en) 1982-10-06 1986-09-09 Hitachi, Ltd. Robot vision system
US4647208A (en) * 1985-07-22 1987-03-03 Perceptron, Inc. Method for spatial measurement of holes
US4744664A (en) * 1986-06-03 1988-05-17 Mechanical Technology Incorporated Method and apparatus for determining the position of a feature of an object
JP2646776B2 1989-12-28 1997-08-27 Hitachi Koki Co Ltd Vision-corrected positioning device
JP2509357B2 1990-01-19 1996-06-19 Tokico Ltd Workpiece position detection device
US5172326A (en) * 1990-03-19 1992-12-15 Forcam, Incorporated Patterned web cutting method and system for operation manipulation of displayed nested templates relative to a displayed image of a patterned web
US5146965A (en) * 1990-10-29 1992-09-15 Nigel Gibson Router attachment
JPH04178506A 1990-11-13 1992-06-25 Matsushita Electric Ind Co Ltd Three-dimensional position measuring method for a workpiece
JP3073341B2 1992-12-04 2000-08-07 Mitsubishi Heavy Industries Ltd Robot positioning method and apparatus
US5446635A (en) 1993-06-24 1995-08-29 Quarton, Inc. Laser assembly for marking a line on a workpiece for guiding a cutting tool
GB9405299D0 (en) 1994-03-17 1994-04-27 Roke Manor Research Improvements in or relating to video-based systems for computer assisted surgery and localisation
IT1279210B1 1995-05-16 1997-12-04 Dea Spa Vision device and method for contactless three-dimensional measurement
US5666202A (en) * 1995-08-22 1997-09-09 Kyrazis; Demos High bandwidth, dynamically rigid metrology system for the measurement and control of intelligent manufacturing processes
DE59507532D1 * 1995-10-13 2000-02-03 Schablonentechnik Kufstein Ag Method for producing a stencil, in particular for paper or textile printing
US5768792A (en) * 1996-02-09 1998-06-23 Faro Technologies Inc. Method and apparatus for measuring and tube fitting
US6090158A (en) * 1998-09-08 2000-07-18 Levi Strauss & Co. Localized finishing of garment workpieces
JP3328878B2 * 1998-10-26 2002-09-30 Shibuya Kogyo Co Ltd Bonding apparatus
WO2000025185A1 * 1998-10-27 2000-05-04 Irobotics, Inc. Robotic process planning using templates


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003001359A * 2001-06-20 2003-01-07 Daiwa House Ind Co Ltd Rivet fastening method and rivet fastener used therefor
DE102007026100A1 * 2007-06-05 2008-12-11 Airbus Deutschland Gmbh Machining apparatus and method for machining a laminate
US8393834B2 2007-06-05 2013-03-12 Airbus Operations Gmbh Machining apparatus and method for machining a laminate
DE102007026100B4 * 2007-06-05 2013-03-21 Airbus Operations Gmbh Machining apparatus and method for machining a laminate
US8568069B2 2007-06-05 2013-10-29 Airbus Operations Gmbh Machining apparatus and method for machining a laminate
DE102016224435A1 * 2016-12-08 2018-06-14 Robert Bosch Gmbh Tool and method for guiding a user of a tool when machining at least one workpiece

Also Published As

Publication number Publication date
MY125388A (en) 2006-07-31
US6472676B1 (en) 2002-10-29
AU7544100A (en) 2001-04-23
EP1221076B1 (fr) 2003-08-20
CA2385541C (fr) 2008-08-05
ATE247840T1 (de) 2003-09-15
DE60004692T2 (de) 2004-04-01
EP1221076A1 (fr) 2002-07-10
DE60004692D1 (de) 2003-09-25
ES2200941T3 (es) 2004-03-16
JP3504936B2 (ja) 2004-03-08
CA2385541A1 (fr) 2001-04-19
TW514577B (en) 2002-12-21
GB9923795D0 (en) 1999-12-08
JP2003511778A (ja) 2003-03-25

Similar Documents

Publication Publication Date Title
CA2385541C (fr) Micropositioning system
US11667030B2 (en) Machining station, workpiece holding system, and method of machining a workpiece
KR101013749B1 CNC machine tool equipped with a vision system
CN107253084B High-efficiency high-precision robotic automatic milling system for digital aircraft assembly
EP1301310B1 Tool positioning system
US5390128A (en) Robotic processing and inspection system
US11992962B2 (en) Robot and robot system
CN114515924A Automatic welding system and method for tower foot workpieces based on weld seam recognition
AU2001270761A1 (en) Tool positioning system
TWI704028B Tool path positioning compensation system for fixture offset
CN117047237A Intelligent flexible welding system and method for special-shaped parts
CN108489361A Go/no-go detection system for workpieces with holes
WO2022091767A1 Image processing method, image processing device, robot-mounted transfer device, and system
CN120038642B Grinding device and grinding method for repairing damage to curved components
WO2022191148A1 Teaching tool and teaching device for setting a teaching point using an operator's hand
CN118076461A Mountable integrated scanning and milling system and method of using same
Ngom et al. Basic design of a computer vision based controller for desktop NC engraving machine
Chalus et al. 3D robotic welding with a laser profile scanner
Wang et al. The use of a machine vision system in a flexible manufacturing cell incorporating an automated coordinate measuring machine
JPH03166086A Touch-up method
CN117943870A Fine attitude adjustment method for CNC machining of die forgings, application, and flexible attitude adjustment device
CN117139810A Method and device for online automatic detection of resistance spot welding electrode pose and electrode surface

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 09700880

Country of ref document: US

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2001 530653

Kind code of ref document: A

Format of ref document f/p: F

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2000964511

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2385541

Country of ref document: CA

WWP Wipo information: published in national office

Ref document number: 2000964511

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 2000964511

Country of ref document: EP