US20020133263A1 - Data acquisition image analysis image manipulation interface - Google Patents
- Publication number
- US20020133263A1 (application US09/984,952, filed as US98495201A)
- Authority
- US
- United States
- Prior art keywords
- image
- interface
- workpiece
- control means
- annular control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
Definitions
- the invention relates to interfaces for use on a visual display of a computer system to allow an operator to interact with the computer system. More specifically, it relates to an improved user friendly interface for use in an interactive environment which involves the positioning of an object for imaging and also for analysis, comparison and manipulation of the image of the object.
- the object of the present invention is achieved by providing an interface apparatus for manipulating a workpiece positionable by movement means.
- the interface apparatus has a first annular control means responsive to operator input for generating a rotational control signal for rotating said workpiece within a common reference frame and a second annular control means responsive to operator input for generating a translational motion control signal for movement of said workpiece within the common reference frame.
- the apparatus also has a command and control mechanism to generate the specified control signals upon operator input, transmit the control signals to said movement means and initiate activity specified by the operator input.
- the apparatus in this aspect of the invention allows the operator to position the workpiece for at least one of the following: imaging, analysis and comparison.
- the interface apparatus has a third annular control means for generating a translational motion control signal, for movement of the workpiece within the reference frame, wherein the control signal generated by the second annular control means moves the workpiece in small increments and the signal generated by the third annular control means moves the workpiece in large increments.
- the two annular control means of the interface apparatus are concentric so that they share a common center.
- the interface apparatus has at the common center of the first and second concentric annular control means at least two activable buttons, the first button when activated rotates the work piece in a clockwise direction and the second button when activated rotates the work piece in a counterclockwise direction.
- the second annular control means of the interface apparatus is segmented into four equal arcs, with each arc forming a separate button for directed motion, which on individual activation of each button generates a control signal for movement of the workpiece on a vector in a direction which is radially away from a common center of the arcs and normal to a center of a peripheral outside edge of the arc so activated, the arcs being so positioned that the direction of a vector of any one arc is perpendicular to the direction of vectors of the two adjacent arcs when each is activated by operator input, and 180 degrees from the direction of a vector of an arc on the opposite side of the common center of the arcs when activated by operator input.
- the interface apparatus includes a third annular ring, also segmented into four equal arcs, each arc so formed being paired with an arc of the second annular ring so that the paired arcs form two buttons for movement of the workpiece along a vector in the same direction, one of the arcs of each pair providing for small incremental movements of the workpiece and the other arc of each pair providing for large incremental movements of the workpiece.
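The arc-pair arrangement above can be pictured concretely. The following is a minimal Python sketch, not taken from the patent; the names (`DIRECTIONS`, `translation_for_button`) and the step sizes are assumptions used only to illustrate how each arc pair could map to a fixed direction and increment.

```python
# Hypothetical sketch: mapping the four arc pairs of the second and third
# annular rings to translation vectors.  Names and step sizes are invented
# for illustration; the patent does not prescribe an implementation.

# Unit vectors for the four compass directions used by the arcs
# (screen convention: +x is east/right, +y is north/up).
DIRECTIONS = {
    "N": (0.0, 1.0),
    "E": (1.0, 0.0),
    "S": (0.0, -1.0),
    "W": (-1.0, 0.0),
}

SMALL_STEP = 1.0    # second (inner) ring: small increments, e.g. ~1 mm
LARGE_STEP = 10.0   # third (middle) ring: large increments, e.g. ~1 cm

def translation_for_button(ring: int, compass: str) -> tuple[float, float]:
    """Return the (dx, dy) translation produced by one click on an arc button.

    ring 2 -> small increment, ring 3 -> large increment, matching the
    pairing of arcs described for buttons 30N/32N, 30E/32E, and so on.
    """
    step = SMALL_STEP if ring == 2 else LARGE_STEP
    dx, dy = DIRECTIONS[compass]
    return (dx * step, dy * step)

# Example: one click on a third-ring east arc moves the workpiece a large
# increment to the right; one click on a second-ring north arc moves it
# slightly up.
print(translation_for_button(3, "E"))   # (10.0, 0.0)
print(translation_for_button(2, "N"))   # (0.0, 1.0)
```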
- control signal generated by operator input at a selected spot on the second annular control means moves the workpiece within the common reference frame in a direction of a vector radially away from a common center of the second annular control means and normal to a tangent line formed at the closest point on an outside periphery of the annular control means to the selected spot on the second annular control means activated by operator input.
- the command and control mechanism is a programmable computer with a visual display.
- the computer system of the invention being in functional communication with an imaging device and an object positioning device, the imaging device being placed in relation to the object positioning device such that upon operator input, applied through the interface, the computer system can generate the necessary control signal to position the object held by the object positioning device in a focal plane of the object imaging device so that the imaging device can focus on the object and transmit to the computer for display on the visual display the image of that object so obtained.
- the first annular control means can be switched between a first operational state wherein it rotates an object and a second operational state wherein contours of a surface of the object can be mapped to an outside periphery of the first annular control means so that a representation of the contours of the surface of the object appears on the outside periphery of the first annular control means.
- the second operation state maps those portions of the contours of the object which have been successfully imaged and stored in a memory by the computer system.
- the object imaged in the preferred embodiment is generally a spent bullet, and the first annular control means of the interface apparatus in the second operational state displays on its outside periphery contours of land engraved areas of the bullet successfully imaged and stored by the computer in a memory.
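As an illustration of this contour-mapping state, the short Python sketch below assumes the scanned surface is available as a radius-versus-angle function; the stand-in function `contour_radius`, the base radius and the scale factor are invented for the example and are not the patent's method.

```python
# Hypothetical sketch: deforming the outside periphery of the first annular
# ring so that it follows the scanned contour of the object (e.g. a spent
# bullet).  The contour function and scaling are assumptions for illustration.
import math

def contour_radius(theta: float) -> float:
    """Stand-in for the scanned surface profile: object radius (mm) as a
    function of angle.  Here a circle with shallow grooves loosely mimicking
    land engraved areas."""
    return 4.5 + 0.3 * math.cos(6 * theta)

def ring_outline(base_radius: float = 100.0, scale: float = 5.0, samples: int = 360):
    """Return screen-space points for the deformed outer periphery of the
    first annular ring, one point per sampled angle."""
    points = []
    for i in range(samples):
        theta = 2 * math.pi * i / samples
        # Offset the ring radius by the deviation from the nominal radius.
        r = base_radius + scale * (contour_radius(theta) - 4.5)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

outline = ring_outline()
print(len(outline), outline[0])  # 360 sampled periphery points
```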
- the workpiece is an image of a reference object and an image of a test object, and the computer system can then simultaneously display on the visual display an image of a test object and an image of a reference object, and operator input applied through the interface can switch the computer system between three different image analysis states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating the combined images of the test object and the reference object.
- When the interface is in the third state, the images of the test object and the reference object are joined together on the visual display in an overlapping configuration, with the visible portions of the image of the reference object and the image of the test object separated by a line of separation, the line of separation being manipulated by operator input means.
- Operator input to manipulate the line of separation while the interface is in the third state causes the line of separation to rotate about a central point, which, as it rotates, successively reveals different portions of the overlapped images of the test object and the reference object so that the operator can compare and analyze them.
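This overlap-and-separation behaviour can be sketched as a simple compositing rule. The Python below is a hypothetical illustration (the function name `composite`, the half-plane convention and the use of nested lists are all assumptions), showing only that rotating the line's angle changes which portions of each image are visible.

```python
# Hypothetical sketch: combining a reference image and a test image in one
# window, separated by a line of separation through a central point.  As the
# line's angle changes, different parts of each image become visible.
import math

def composite(reference, test, center, angle):
    """Return a composited image: pixels on one side of the separation line
    show the reference image, pixels on the other side show the test image.

    reference, test : 2-D lists of equal size
    center          : (cx, cy) point the line rotates about
    angle           : orientation of the line of separation, in radians
    """
    cx, cy = center
    # Normal to the line of separation; its sign decides which image shows.
    nx, ny = -math.sin(angle), math.cos(angle)
    h, w = len(reference), len(reference[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            side = (x - cx) * nx + (y - cy) * ny
            out[y][x] = reference[y][x] if side >= 0 else test[y][x]
    return out

# Example with tiny 4x4 "images": rotating the line reveals different parts.
ref = [[1] * 4 for _ in range(4)]
tst = [[2] * 4 for _ in range(4)]
print(composite(ref, tst, center=(1.5, 1.5), angle=0.0))
```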
- the first annular control means has an operator activable marker which, as the operator moves it around the first annular ring, generates a signal that causes the workpiece to rotate in the same direction and through the same angular distance as the marker is moved by the operator on the first annular control means.
- the operator activates the interface with a pointing device.
- the pointing device can be a mouse, trackball, touch pad, light pen or PC stylus.
- the operator can activate the various parts of the interface with a touch sensitive screen.
- the first annular control means, when activated, generates a control signal of rotational motion which rotates a workpiece within a reference frame.
- the second annular control means, when activated, generates a control signal of translational motion which moves the workpiece within the reference frame.
- the method of the invention preferably includes one or more of the additional steps of generating the first and second annular control means such that they are concentric and thus share a common center, and generating a third annular control means which shares the common center with the first and second annular control means, wherein activation of the third annular control means generates a signal of translational motion which moves the workpiece in large increments and the signal of translational motion generated by activation of the second annular control means moves the work piece in small increments.
- Another alternative aspect of the method of this invention involves generating a control signal by operator input at a selected spot on the second annular control means which moves the workpiece in the direction of a vector pointing radially away from a center of the second annular control means and normal to a line tangent to a point on an outside circumference of the second annular control means which point is the closest point on the outside periphery to the selected spot.
- the step of generating the interface includes segmenting the second annular control means into four equal arcs, with each arc forming a separate button for directed motion, which on individual activation of each button generates a control signal for movement of the workpiece on a vector in a direction which is radially away from a common center of the arcs, said vector being normal to the center of the peripheral outside edge of the arc so activated, the arcs being so positioned that the direction of the vector of any one arc is perpendicular to the direction of the vectors of the two adjacent arcs and 180 degrees from the direction of the vector of the arc on the opposite side of the common center.
- This alternative aspect can include the additional step of generating a third concentric annular control means which shares the common center with the second annular control means, the third annular control means being segmented into four equal arcs, each arc so formed by the third annular control-means being paired with an arc of the second annular control means so that the paired arcs form two buttons for movement of the workpiece along a vector in the same direction, one of the arcs of each pair providing for small incremental movements of the workpiece and the other arc of each pair providing for large incremental movements of the workpiece.
- the method of this invention can also include generating two buttons at the common center of the annular control means, one of said buttons upon operator activation rotates the workpiece in a clockwise direction and the other button on operator activation rotates the workpiece in a counterclockwise direction.
- the method of this invention can include switching between two operating modes, a first mode for image acquisition and second for image analysis, comparison and manipulation.
- the step of operating in the first operating mode comprises manipulating with the interface a workpiece which is both an object and an image of that object, the image of the object so manipulated appearing on the visual display.
- the workpiece is an image of a reference object and an image of a test object and the step of operating in the second operating mode includes simultaneously displaying on the visual display the image of the test object and the image of the reference object, and a further step, switching the interface in the second operating mode between three different states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating a combined image of the test object and the reference object.
- the step of operating in the first operating mode can also include the step of selecting one of two different states to operate in: a first state for acquisition of an image of a cartridge case and a second state of the first operating mode for acquisition of images of the land engraved areas of a spent bullet.
- the step of operating in the second state includes the step of mapping to the first annular control means a representation of each land engraved area successfully imaged.
- the invention provides a method of displaying contours of a surface of an object, the method comprising the steps of: providing an interface with a peripheral surface; obtaining positional information on contours of said surface of said object; and altering said interface's peripheral surface to display said information on the contours of portions of said object.
- the step of obtaining the information on the contours of said surface of said object comprises scanning the surface of said object and generating a mathematical function approximating said surface of said object.
- the step of altering said interface's peripheral surface comprises: altering the peripheral surface of said interface apparatus with the information from said mathematical function.
- FIG. 1 is a schematic drawing of the essential features of the present invention
- FIG. 1A depicts an object being manipulated by the present invention
- FIG. 2 is a view of a screen display which implements the interface of the present invention with other visual interfaces;
- FIG. 3 provides a schematic of a system with which the screen display in FIG. 2 would be used;
- FIG. 4A provides an alternate arrangement for the interface of the present invention
- FIG. 4B provides a second alternate version of the interface of the present invention.
- FIG. 4C provides a third alternate version of the interface of the present invention.
- FIG. 4D provides a view of a portion of FIG. 4B
- FIG. 5 is part of a flow chart of one system which incorporates the interface of the present invention.
- FIG. 6 is the rest of flow chart of the system of FIG. 5 which incorporates the interface of the present invention.
- FIG. 7 is another screen display which implements the interface of the present invention with other visual interfaces
- FIG. 8 provides a schematic of a system with which the screen display in FIG. 7 would be used;
- FIG. 9 is a perspective view of a spent bullet
- FIG. 10A is a view of the interface of the present invention in one of its implementations
- FIG. 10B is another view of the interface of the present invention after completion of the implementation in FIG. 10A;
- FIG. 11 is a view of a screen display which incorporates the interface of the present invention in a cartridge case analysis mode
- FIG. 11A depicts the window activity indicator when the window with the test image is active
- FIG. 11B depicts the window activity indicator when the window with the reference image is active
- FIG. 11C depicts the window activity indicator when the reference image and the test image are combined in one window and overlapped, with a line of reference indicating the boundaries between the images;
- FIG. 12 is a view of a screen display of another implementation of the interface of the present invention in a cartridge case analysis mode
- FIG. 13 is a view of a screen display which incorporates the present invention in a spent bullet analysis mode in which the image of the test and reference bullets occupy their own windows;
- FIG. 13A is a view of a screen display of the present invention in a spent bullet analysis mode in which the images of the test and reference bullets are combined in an overlapping mode in the same window, with the images separated by a line of separation;
- FIG. 14A is a view of an end of a spent bullet fragment
- FIG. 14B is an interface of the present invention with the scanned contour of the spent bullet fragment mapped to its first annular ring;
- FIG. 14C is the interface of FIG. 14B with an indication of those areas successfully imaged
- FIG. 15 is an operative schematic view of the present invention presented in the drawings and description herein;
- FIG. 15A is a portion of the operative schematic view of FIG. 15 of the present invention with the various parts of the interface of the preferred embodiment of the present invention represented.
- Visual display 41 generally being a computer screen such as a CRT, liquid crystal display or any similar device which provides the user of a computer with a visual display.
- the appropriately configured software, which will be discussed below, would generate directional dial interface 21 on visual display 41 and allow the operator of the computer 48 to interact with the computer 48 in the manner which will shortly be described.
- the computer system depicted in FIG. 1 includes a computer 48 , connected by a video bus 53 to visual display 41 .
- the computer in turn has a keyboard 49 connected to it, as well as pointing device 43 (in the example, a mouse) connected by cable 52 .
- the computer system would be running the appropriate software which generates the image of the interface 21 among other things.
- a knowledgeable software writer could compose the necessary software in a variety of different ways to achieve the effect and purpose of the present invention.
- the directional dial interface 21 in the preferred embodiment has three annular rings 24 , 26 and 28 with a common center 29 .
- the first annular ring 24 which is the outermost of the three, is an unbroken ring.
- the second annular ring 26 is the innermost one and in the preferred embodiment is sectioned into four separate buttons 30 N, 30 E, 30 S, and 30 W.
- the third annular ring 28 lies between the first ring 24 and the second ring 26 and like ring 26 in the preferred embodiment is sectioned into four separate buttons 32 N, 32 E, 32 S, and 32 W.
- the common center 29 has two triangular buttons 37 and 39 .
- buttons of directional dial interface 21 have a specific general function which allow the computer operator to move and manipulate the spatial position of an image of an object in a specific frame of reference.
- This frame of reference is in fact the visual display or screen 41 on which the image of the object appears.
- the image of an object 45 provides one example of an image which can be manipulated with the interface of the present invention.
- the screen itself is two dimensional; however, as is well known in the art, a visual display can depict an image of an object in three dimensions, and the interface could be configured to handle such object manipulation.
- the operator may be manipulating an image of an object on the visual display; in actuality, the operator could be manipulating an image of an object previously taken and stored for later use or the image of an object being viewed in real time.
- the system can be configured such that each manipulation of the image being viewed could result in a corresponding movement of the object.
- the object being in another corresponding reference frame, where it is being viewed by an appropriate imaging device 63 FIG. 3.
- the other or secondary reference frame being the focal plane of the imaging device 63 .
- the primary reference frame being the visual display.
- the imaging device 63 which is transmitting the image of the object so viewed to the computer 62 for imaging on the screen 61 could be any number of cameras or a charge-coupled device (CCD) which are well known in the art.
- buttons of directional dial interface 21 FIG. 1 in the preferred embodiment would be activated by the computer operator using a pointing device 43 .
- Item 43 in FIG. 1 is a standard two button mouse. As is well known in the art, the mouse has at least one button. However, in most instances the mouse has two and sometimes three buttons. One of the buttons, generally the left button 43 A, is the primary activating button.
- Mouse 43 has a corresponding cursor 47 generated on visual display 41 by the appropriate and commonly available software which is well known in the art. The operator moves the pointing device 43 , about on a flat surface, in this case pad 51 .
- Each move of the pointing device 43 results in a corresponding move of the pointing device cursor 47 on screen 41 .
- This allows the operator to position the pointing device cursor 47 at any position on screen 41 in a few moves of the pointing device 43 .
- the cursor 47 could be easily moved to position 47 A, 47 B, 47 C or any other position on the visual display.
- when pointing device cursor 47 in visual display 41 is placed on one of the buttons of directional dial interface 21 and the appropriate button on the pointing device 43 is pushed, movement of the workpiece, in this case image 45 , is initiated.
- the actual direction of movement depends on which button of directional dial interface 21 is activated by the pointing device 43 .
- the actual specific directions in which directional dial interface 21 moves image 45 will be discussed in detail below.
- buttons of the second annular ring 26 and the third annular ring 28 when appropriately activated, provide for translational moves of the workpiece.
- the eight buttons of annular rings 26 and 28 , for purposes of description, can be thought of as moving the workpiece in directions equivalent to the four points of the compass.
- the four sides of the visual display 41 can be labeled as follows: the top as north (N), the bottom as south (S), the left side as west (W) and the right side as east (E).
- buttons 30 N and 32 N would move image 45 north on the screen
- buttons 30 S and 32 S would move image 45 south on the screen
- buttons 30 W and 32 W move image 45 west on the screen
- buttons 30 E and 32 E move image 45 east on the screen.
- the buttons of the second annular ring 26 namely 30 N, 30 S, 30 W and 30 E would move image 45 in small increments in their respective directions of motion.
- the buttons of the third annular ring 28 namely 32 N, 32 S, 32 W and 32 E move image 45 in large increments in their respective directions of motion.
- the system of the present invention allows the operator to set the speed and distance each button of the second and third annular rings: 30 N, 30 S, 30 E, 30 W, 32 N, 32 S, 32 E and 32 W move the image 45 on the screen 41 .
- the operator could set inner annular ring buttons 30 N, 30 S, 30 E and 30 W to move the workpiece or image 45 in small increments of about a millimeter for each click of the pointing device button 43 A when the pointing device cursor 47 is placed on it.
- the operator could also configure the system such that when the pointing device cursor is placed on one of these buttons, 30 N, 30 S, 30 E and 30 W, and the pointing device button is held down for more than two seconds, the image 45 moves at a rate of one millimeter every fifth of a second.
- the operator could set the buttons of the middle ring 32 N, 32 S, 32 E and 32 W, on activation, to move the image 45 in centimeter increments on the screen with each click of the pointing device button 43 A while the pointing device cursor 47 is on the button activated.
- the operator could provide for continuous movement of image 45 on depression of one of these buttons, 32 N, 32 S, 32 E and 32 W, for more than two seconds etc.
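A rough sketch of this click-versus-hold behaviour follows; the class `ArcButton` and its parameters are invented for illustration, with the two-second hold threshold and the fifth-of-a-second repeat period taken from the example values above.

```python
# Hypothetical sketch of the two activation behaviours described above:
# a single click moves the image by one increment, while holding the button
# down longer than a threshold repeats the move at a fixed rate.  The class
# and method names are assumptions, not the patent's implementation.

class ArcButton:
    def __init__(self, step_mm: float, hold_threshold_s: float = 2.0,
                 repeat_period_s: float = 0.2):
        self.step_mm = step_mm                  # distance moved per increment
        self.hold_threshold_s = hold_threshold_s  # hold time before repeating
        self.repeat_period_s = repeat_period_s    # time between repeats

    def moves_for_press(self, press_duration_s: float) -> int:
        """Number of increments produced by one press of the button."""
        if press_duration_s < self.hold_threshold_s:
            return 1                              # a simple click: one increment
        extra = press_duration_s - self.hold_threshold_s
        return 1 + int(extra / self.repeat_period_s)  # continuous movement

small = ArcButton(step_mm=1.0)      # inner-ring button, ~1 mm per move
print(small.moves_for_press(0.1))   # 1 (single click)
print(small.moves_for_press(3.0))   # >1 (held past the threshold; repeats ~5 times/s)
```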
- buttons of translational motion move the image of the object 45 in the direction in which the arrows on each of the respective buttons point.
- Each of the pairs of east, west, north and south buttons are in orthogonal relationship to the adjacent buttons with respect to the direction in which they move the image 45 .
- Each pair of buttons moves image 45 in a direction 180 degrees from that of the pair of buttons on the opposite side of the common center 29 .
- the pair of buttons 30 N and 32 N, pair 30 S and 32 S, pair 30 E and 32 E, and pair 30 W and 32 W could each be considered to have an associated vector of motion equivalent to the arrows which appear on each in the directional dial interface 21 .
- the direction of each vector being the direction the arrows point on each button.
- Buttons 30 E and 32 E move the image 45 in the direction of a vector pointing in the east direction on the screen, which is at right angles, i.e. orthogonal, to the directions in which the adjacent pairs of buttons, pair 30 N and 32 N and pair 30 S and 32 S, move image 45 .
- buttons 30 W and 32 W on the other side of the common center 29 from button pair 30 E and 32 E, move the image in the direction of a vector pointing to the west direction.
- the direction of the vector of pair 30 W and 32 W being 180 degrees from the direction of the vector of direction of pair 30 E and 32 E.
- buttons 30 N and 32 N move object 45 in the direction of a vector pointing to the north on the screen and button pair 32 S and 30 S moves image 45 in the direction of a vector pointing to the south.
- First annular ring 24 in the preferred embodiment surrounds the entire interface 21 forming its outer boundary.
- First annular ring 24 provides one means to rotate image 45 .
- One activates annular ring 24 by moving mouse cursor 47 to the position 47 C and placing the mouse cursor 47 on ring marker 34 of the first annular ring 24 .
- Once mouse cursor 47 is placed on ring marker 34 the operator then clicks on the appropriate button on the mouse 43 and holds that button down and drags ring marker 34 around annular ring 24 which results in a corresponding rotational movement of image 45 .
- moving ring marker 34 by the above method from position 56 A to 56 B results in a corresponding movement of image 45 .
- point 56 AA on image 45 moves to position 56 BB.
- the axis of rotation about which image 45 rotates is selected by default as the center of the image as initially acquired. However, as depicted in FIG. 1A the operator can change the axis about which the image 45 rotates by moving the cursor 47 to the appropriate position such as point 57 , for the purposes of this example, and clicking on the appropriate mouse button. Thus, if ring marker 34 is moved from point 56 A to 56 B with the axis of rotation at point 57 in FIG. 1A image 45 rotates to new position 45 B.
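The marker-drag rotation and the selectable rotation axis can be illustrated with a little coordinate geometry. The Python sketch below is hypothetical (the helper names `rotate_about` and `marker_drag_angle` are not from the patent) and rotates a single point rather than a full image.

```python
# Hypothetical sketch: rotating image coordinates about an operator-selected
# axis point (such as point 57 in FIG. 1A) by the angle through which the
# ring marker 34 was dragged.  A real implementation would transform pixels
# or a display transform; here only one point is rotated for illustration.
import math

def rotate_about(point, axis, angle_rad):
    """Rotate `point` about `axis` by `angle_rad` (counterclockwise)."""
    px, py = point
    ax, ay = axis
    dx, dy = px - ax, py - ay
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    return (ax + dx * cos_a - dy * sin_a,
            ay + dx * sin_a + dy * cos_a)

def marker_drag_angle(start_xy, end_xy, ring_center):
    """Angle swept by the ring marker as it is dragged from start to end."""
    cx, cy = ring_center
    a0 = math.atan2(start_xy[1] - cy, start_xy[0] - cx)
    a1 = math.atan2(end_xy[1] - cy, end_xy[0] - cx)
    return a1 - a0

# Example: dragging the marker a quarter turn rotates a point of the image
# by the same angle about the chosen axis.
angle = marker_drag_angle((100, 0), (0, 100), ring_center=(0, 0))
print(rotate_about((10, 0), axis=(5, 0), angle_rad=angle))  # approximately (5.0, 5.0)
```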
- buttons 37 and 39 provide another means to initiate rotational motion of object 45 .
- One of the buttons, such as 37 , when activated by moving the pointing device cursor 47 to button 37 and depressing the appropriate pointing device button 43 A, rotates image 45 in a clockwise direction, and the other button 39 , when activated, rotates it in a counterclockwise direction.
- each click of button 37 or 39 rotates the image in one degree increments.
- the image 45 rotates at a steady and moderate pace for as long as the cursor 47 remains on button 37 or 39 and the appropriate pointing device button, i.e. 43 A or 43 B, is depressed.
- the two center buttons 37 and 39 thus serve as the fine adjustments of angular positioning in the system of this invention, and the first annular ring provides for substantial and quick adjustments.
- the interface of the present invention can take on different configurations and not depart from the fundamental concept of intuitive functionality it provides.
- the directional dial interface 21 could take on the configuration shown in FIG. 4A where the four pairs of buttons point towards the four points of a compass.
- the buttons are activated in the manner noted above, through use of a pointing device, wherein the screen cursor 47 is placed over the arrow buttons 30 (N, S, E and W) or 32 (N, S, E and W) and clicked.
- Each of the pairs of buttons having the same function as described above with respect to movement.
- Another alternative configuration is to segment the second and third annular rings into more segments, such as depicted in FIG. 4C. As shown in FIG. 4C, the second and third annular rings are segmented into eight arcs. This results in eight sets of two buttons for a total of eight directions or compass points in which the image of the object, or the object itself, can be moved within the frame of reference with only one click of the pointing device. For example, buttons 32 NW and 30 NW would move the image on screen 41 in a northwest direction, between the directions of button pairs 30 W and 32 W and 30 N and 32 N.
- the second annular ring 72 and third annular ring 73 could be presented as solid rings.
- the second annular ring 72 and third annular ring 73 would still be used for translational movement of the object.
- clicking the mouse cursor 47 on a section of the second annular ring 72 or third annular ring 73 would cause the image to move in a direction normal to a line tangent to the point on the outside curvature 80 which is closest to the spot on the ring clicked.
- FIG. 4D depicts a portion of annular ring 73 from FIG. 4B, specifically that portion around point 78 .
- vector 79 is normal to tangent line 77 .
- Tangent line 77 is tangent at point 76 on the outside periphery of annular ring 73 .
- Tangent point 76 is the closest point on the outside curvature or periphery 80 of annular ring 73 to spot 78 , the spot clicked by the operator to initiate movement of the image.
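For a circular ring the geometry of FIG. 4D reduces to a radial direction through the clicked spot. The following Python sketch is an illustrative assumption of how that direction and the tangent point might be computed, not the patent's implementation.

```python
# Hypothetical sketch of the geometry in FIG. 4D: a click at spot 78 on a
# solid annular ring moves the image along vector 79, which points radially
# outward through the clicked spot, i.e. normal to the tangent line 77 at
# the closest point 76 on the outer periphery 80.  Names are illustrative.
import math

def click_direction(click_xy, ring_center):
    """Unit vector of translational motion for a click on a solid ring."""
    dx = click_xy[0] - ring_center[0]
    dy = click_xy[1] - ring_center[1]
    length = math.hypot(dx, dy)
    if length == 0:
        raise ValueError("click coincides with the ring center")
    # For a circular ring, the outward radial direction through the clicked
    # spot is automatically normal to the tangent at the nearest periphery
    # point, so no separate tangent computation is needed.
    return (dx / length, dy / length)

def closest_periphery_point(click_xy, ring_center, outer_radius):
    """Nearest point on the outer periphery to the clicked spot (point 76)."""
    ux, uy = click_direction(click_xy, ring_center)
    return (ring_center[0] + ux * outer_radius,
            ring_center[1] + uy * outer_radius)

direction = click_direction((30.0, 40.0), ring_center=(0.0, 0.0))
print(direction)                                                  # (0.6, 0.8)
print(closest_periphery_point((30.0, 40.0), (0.0, 0.0), outer_radius=100.0))
```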
- the invention includes the feature of allowing the operator to control the positioning of an object in real time through use of directional dial interface 21 .
- the purpose of positioning the object could be for obtaining an image for storage and later analysis, to work on the object positioned or for the handling of toxic or dangerous materials in a secure area removed from the operator.
- FIG. 3 shows a system set up to position an object 65 for imaging.
- Computer 62 , using the appropriate software, controls the visual display 61 as well as optical imaging device 63 and positioning stage 64 for this system. Technologies, including hardware and software, for implementing and controlling such devices are well known in the art.
- the operator would exercise control through keyboard 49 and pointing device 43 .
- FIG. 2 shows the directional dial interface 21 of the present invention integrated on visual display 41 A with various other interface devices to form an extended system. The interface system of display 41 A would then appear on screen 61 .
- the additional interface apparatuses are not essential for practicing the present invention.
- The operator, through use of directional dial interface 21 in the manner described above, would then position the object 65 in the appropriate position for imaging. Use of directional dial interface 21 in this manner would result in movement of object 65 to the appropriate position through instructions sent by computer 62 to positioning stage 64 .
- the operator could also control the brightness of the object by slide 71 and focus of the optical device 63 by slide 70 .
- Both interface devices 71 and 70 are well known in the art, as are techniques for control and use of optical devices such as 63 , which not only has a standard optical imaging device but also the appropriate means to transmit the image viewed in a form which can be displayed on screen 61 .
- the system has a common reference frame.
- the image acquisition process is made up of two parts.
- the visual display provides the primary reference frame and the focal plane of the imaging device provides the secondary reference frame.
- the operator would then view on screen 61 the image so captured, as depicted in window 74 on screen 41 A.
- the signal transmitted by optical device 63 could either be an analog or a digital signal. Suitable apparatus and techniques well known in the art could be used.
- the imaging device 63 would include a charge-coupled device (CCD), well known in the art, which transmits a digitized signal of that image.
- the operator could instruct the system, by activating button 75 , to save the image to a storage device, not specifically shown, but which would be part of the computer system 62 and certainly well known in the art.
- the image so obtained could be saved as a file in the usual manner and held for later retrieval and use.
- the invention has been described in fairly general terms up to this point. The following description will discuss implementation of the invention in a system which takes images of objects, stores those images and subsequently uses those images for comparison and analysis with other similarly obtained images.
- the Integrated Ballistics Identification System or IBIS of Forensic Technology provides a still developing system for automated and systematized forensic ballistics analysis. The system relies in part on computers and thus control and operation would be significantly enhanced with user friendly interfaces among other things.
- a number of patents have issued relating to different aspects of this automated forensic ballistics analysis system, such as the following U.S. patents: “Method And Apparatus For Monitoring And Adjusting The Position Of An Article Under Optical Observation,” U.S. Pat. No.
- FIGS. 5 and 6 provide flow charts with the major functional elements of the current preferred embodiment of the system which uses the directional dial interface of the present invention. Only so much of this system will be described as is necessary to understand the full capacity and functionality of the directional dial interface.
- the program is activated, 79 FIG. 5, and then the operating mode 80 is selected.
- the system has two operating modes, image acquisition mode 90 and image comparison, analysis and manipulation mode 81 FIG. 6. If the image acquisition mode is selected, then one of two sub-modes must be selected, either the sub-mode for acquisition of the image of a cartridge case 91 or the sub-mode for acquisition of the image of a spent bullet 93 . If the sub-mode for acquisition of the image of the cartridge case 91 is selected then directional dial interface 21 is generated on the visual display together with the rest of the working interface. Starting the sub-mode for acquisition of images of the cartridge case 91 , in the preferred embodiment, only activates the buttons of translational motion 92 .
- this sub-state assists in assuring that the operator has successfully obtained images of the land engraved areas on a spent bullet being imaged. As the operator rotates the spent bullet, imaging the land engraved areas on the spent bullet, those portions successfully imaged are mapped as depressions to the first annular ring. This allows the operator to keep track of what has been imaged and know when all of the land engraved areas on the spent bullet have been imaged.
- After activation of the program 79 the operator also has the option of starting the image comparison, analysis and manipulation mode 81 . Then, depending on whether the operator wants to compare previously acquired images of spent bullets or shells, he selects either the spent bullet comparison, analysis and manipulation sub mode 82 or the cartridge case analysis, comparison and manipulation sub mode 84 . If the spent bullet comparison and analysis sub mode 82 is selected the directional dial interface 21 is generated on the screen. Interface 21 appears on the screen with the other related interfaces, but only its buttons of translational movement are activated.
- If the cartridge case analysis mode 84 FIG. 6 has been selected, then after the directional dial interface 21 and the other related interfaces of the system have been generated 85 , the buttons of translational and rotational motion are activated.
- the operator must still select a state to operate in from a choice of three possible operational states available in the cartridge case analysis mode.
- In the cartridge case analysis mode the operator usually has two images on the screen to work with; one is an image of a reference object which will be compared to another image, the image of a test object. Both images are of spent cartridge cases and the purpose is comparison, to determine if a match exists, such as whether both were fired from the same firearm.
- the operator can switch into the image of the reference object manipulation sub-state 87 to move the reference image around.
- the operator can then switch to the image of the test object manipulation sub-state 88 to move the test image around. Finally, the operator can switch to an image comparison sub-state 89 which joins both images as one image separated by a line of separation. As will be discussed in detail below, half of each image, such as half of the image of the test object and half of the image of the reference object, appears together separated by a line of separation. Rotation of the line of separation about a central axis, as described below, progressively reveals different portions of each image so that both can be compared simultaneously.
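The modes and sub-states of FIGS. 5 and 6 can be pictured as a small state model. In the Python sketch below the enum names and the cycling order are assumptions made only to illustrate the structure of the operating modes and the three analysis sub-states; the patent describes the states but not a data model.

```python
# Hypothetical sketch of the operating-mode structure of FIGS. 5 and 6.
# Enum names and the cycle order are illustrative assumptions.
from enum import Enum

class SubMode(Enum):
    CARTRIDGE_ACQUISITION = "cartridge case image acquisition (91)"
    BULLET_ACQUISITION = "spent bullet image acquisition (93)"
    CARTRIDGE_ANALYSIS = "cartridge case analysis (84)"
    BULLET_ANALYSIS = "spent bullet analysis (82)"

class AnalysisState(Enum):
    REFERENCE = "manipulate reference image (87)"
    TEST = "manipulate test image (88)"
    COMBINED = "overlapped images with line of separation (89)"

def next_analysis_state(state: AnalysisState) -> AnalysisState:
    """Cycle through the three analysis sub-states, as an operator might by
    repeatedly activating a window-selection control."""
    order = [AnalysisState.REFERENCE, AnalysisState.TEST, AnalysisState.COMBINED]
    return order[(order.index(state) + 1) % len(order)]

# Example: walking through the three sub-states of the analysis mode.
state = AnalysisState.REFERENCE
for _ in range(3):
    print(state.value)
    state = next_analysis_state(state)
```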
- FIG. 8 depicts schematically the basic components of the spent bullet image acquisition system.
- the system includes a visual display 103 connected to an appropriately programmed computer 102 .
- the operator controls the computer with keyboard 104 and pointing device 105 in the usual manner.
- the computer in turn controls optical imaging device 106 and spent bullet holding and positioning stage 108 .
- the operator thus can position the spent bullet 107 to take appropriate images as will be described shortly.
- Various components of this system are described in detail in U.S. Pat. Nos. 5,379,106; 5,390,108; and 5,559,489 which were discussed above and incorporated herein by reference.
- the spent bullet 107 being imaged appears in FIG. 9.
- the spent bullet, generally made of lead or copper, after being forced down the barrel of a gun by the explosion of the gun powder, has etched thereon land engraved areas 109 .
- the rifling in a gun barrel consists of spiral alternating grooves and raised areas, called land areas. It is well known that gun rifling, a feature used for at least the last 100 to 200 years, imparts a spin to the bullet as it travels down the barrel and in so doing adds enormous stability to the spent bullet on leaving the barrel. This stability in turn substantially increases the range and accuracy of the bullet fired from the gun.
- As depicted in FIG. 8, the operator will successively obtain images of each LEA on spent bullet 107 as it is rotated in spent bullet holder 108 .
- the directional dial interface 21 FIG. 7 provides the operator with the means of keeping track of the images of the LEA's as he or she rotates the spent bullet taking the images.
- the operator actually views a magnified image of the LEA in window 99 in interface 98 .
- Interface 98 appears on visual display 103 FIG. 8.
- the operator picks out a shoulder 110 FIG. 9 at the beginning of a LEA 109 and marks it with mouse cursor 115 FIG. 7; the operator then activates reference mark 35 on the directional dial 21 , making it correspond to the first shoulder on the spent bullet.
- buttons of translational motion 30 (N, E, S and W) and 32 (N, E, S and W) perform the same function and allow the operator to move the image about to optimally position it for imaging.
- rotational buttons 37 and 39 allow the operator to adjust the angular relationship of the image to the viewing window 99 to also help optimize the image obtained.
- If the operator selects the cartridge case image comparison, analysis and manipulation mode 84 and 85 FIG. 6, interface 126 FIG. 11 would appear on the visual display.
- the initial display, besides having the interface features depicted, including the directional dial interface 21 of the present invention, has two separate windows: window 127 , which has the image of the test object 122 , in this case the cartridge case under examination, and window 128 , which has the reference object 121 , another cartridge case image, to which the test object 122 is to be compared.
- the operator has the option of making either window 128 or 127 active by placing the cursor of the system pointing device 43 on button 130 and depressing the left pointing device button. By making either window 127 or 128 active, the operator can then manipulate the image in the active window with directional dial 21 .
- buttons 30 and 32 would allow the operator to move the image of the object around as described above.
- the operator could rotate the object with central buttons 37 and 39 or with the first annular ring 24 in the manner described above to place the object in the active window in the proper angular orientation.
- Indicator 124 tells the operator which window is active. In the preferred embodiment when window 127 is active indicator 124 is clear or lightly shaded 124 A FIG. 11A. If window 128 is active then indicator 124 is dark in color 124 B as depicted in FIG. 11B. The operator can also put the system into a third state as depicted in FIG. 12. If the operator puts the system into this third state indicator 124 is half dark and half light 124 C FIG. 11C.
- the third state depicted in FIG. 12 combines half of each image 121 A and 122 A at any one time.
- the images are separated by a line of separation 123 .
- the operator can rotate line of separation 123 about a center point 132 and by so doing progressively reveal different portions of object 121 A and 122 A at the same time. This provides the operator with another means of comparing the image of the reference object 121 and the image of the test object 122 .
- the operator has three options with which to initiate rotation of the line of separation 123 .
- the operator can place the pointing device screen cursor 47 on the line of separation 123 and drag it around in a circular motion.
- the operator can use directional dial central buttons 37 and 39 in the fashion described above to cause the line of separation to rotate in a clockwise direction or a counter clockwise direction.
- the operator would place the pointing device cursor 47 on the selected button and depress the appropriate pointing device button with his or her finger to initiate rotation.
- the operator can use the first annular ring 24 in this state to rotate the line of separation.
- the operator would, as in the fashion described above, place the pointing device screen cursor 47 on the annular ring marker 34 , depress the appropriate pointing device button with his or her hand and drag the annular ring cursor around the ring until the desired position is reached.
- the indicator 124 is half light and half dark 124 C FIG. 11C.
- the directional dial 21 is also utilized in the analysis submode 82 for movement of the spent bullet image in the vertical and horizontal directions.
- FIG. 13 depicts how the overall interface appears in this mode with the directional dial 21 implemented for use to supplement the system.
- the buttons of translational movement 30 and 32 (N, E, S and W) on dial 21 move the reference image 161 in window 168 or the image of the test object 162 in window 169 , depending on which of the two windows 168 or 169 is active.
- activity indicator 124 is clear 124 A FIG. 11A.
- activity indicator 124 is dark 124 B FIG. 11B.
- when window 168 is active, the system is in the reference image substate 83 B FIG. 6 of the spent bullet analysis mode.
- when window 169 is active, the system is in the test image substate 83 C FIG. 6 of the spent bullet analysis mode.
- the reference image 161 and the test image 162 can also be combined in one window as depicted in FIG. 13A.
- There the images are overlapped with the images separated by a line of separation 160 .
- the line of separation can be moved horizontally back and forth with buttons 30 E, 32 E, 30 W and 32 W.
- the line of separation can also be moved back and forth by placing cursor 47 on it and dragging the line 160 back and forth. By moving the line of separation back and forth the operator can successively reveal different portions of each spent bullet. Moving the line of separation 160 to the left would reveal more of test image 162 and cover up part of reference image 161 . On the other hand, moving line of separation 160 to the right would reveal more of reference image 161 and cover up portions of test image 162 .
- activity indicator 124 is half dark and half light as depicted by 124 C FIG. 11C.
- In FIGS. 13 and 13A appears virtual thumb wheel 163 , which an operator of the system uses to stretch or compress the images of the spent bullets displayed on the screen during the image analysis mode.
- the thumb wheel 163 is only used with the spent bullet image analysis mode 82 FIG. 6.
- when the operator rolls the cursor up on the thumb wheel, the image in the active window stretches out in a uniform and proportional manner along the vertical axis of the image so that its individual features can be more easily studied.
- rolling the cursor down on the thumb wheel compresses the image in the active window.
- Compression occurs in a uniform and proportional manner for the image along the vertical axis of the image.
- compression and stretching of the image occurs in a uniform manner only in one dimension, along the vertical axis of the image which is generally perpendicular to the direction of the land engraved areas on the image of the spent bullet.
- the apparatus could be adapted to stretch the image in more than one direction.
- This stretching or sizing apparatus controlled by thumb wheel 163 aids in analysis of spent bullets which have been deformed to some extent on impact after firing, but on which the striations left on the land engraved areas can still be observed and analyzed.
- To return the image to its original dimensions the operator merely clicks twice on the center of the thumb wheel 163 . The operator can switch between the two windows 168 and 169 by clicking twice on the window to be activated or by clicking twice on indicator 124 , which successively cycles the system through each of the three sub states 83 B, 83 C and 83 D.
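A hypothetical sketch of the thumb-wheel behaviour follows; the class name `ThumbWheel`, the per-increment scale step and the minimum scale are assumptions, intended only to illustrate scaling along the vertical axis alone and the double-click reset.

```python
# Hypothetical sketch of the virtual thumb wheel: rolling the wheel scales
# the active image uniformly along its vertical axis only, and a double-click
# on the wheel centre restores the original dimensions.  The step size and
# minimum scale are invented for illustration.

class ThumbWheel:
    def __init__(self, step: float = 0.05):
        self.step = step           # fractional change per wheel increment
        self.vertical_scale = 1.0  # 1.0 == original image height

    def roll(self, increments: int) -> float:
        """Positive increments stretch the image vertically, negative
        increments compress it; the horizontal dimension is untouched."""
        self.vertical_scale = max(0.1, self.vertical_scale + increments * self.step)
        return self.vertical_scale

    def reset(self) -> float:
        """Double-clicking the wheel centre returns the image to its
        original dimensions."""
        self.vertical_scale = 1.0
        return self.vertical_scale

    def apply(self, width: float, height: float) -> tuple[float, float]:
        """New display size of the active image (only the height changes)."""
        return (width, height * self.vertical_scale)

wheel = ThumbWheel()
wheel.roll(+4)                    # stretch vertically
print(wheel.apply(200.0, 300.0))  # width unchanged, height stretched ~20%
wheel.reset()
print(wheel.apply(200.0, 300.0))  # (200.0, 300.0): back to original size
```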
- the practice of the '717 invention uses this information to calculate a mathematical function of the surface contours of that portion of the spent bullet or object scanned.
- the system of that invention uses the function obtained from the scanning path to obtain an optimal imaging path. What the initial scanning path amounts to then is a function of the contour of the outside surface scanned.
- the practice of the '717 patent can easily be incorporated into the practice of the current invention.
- the function obtained in the initial scan according to the practice of the '106 patent can be mapped to the first annular ring 24 so it provides an outline of the contours of the surface of the object scanned. Then as those portions of the scanned area are imaged, the imaged areas can be designated on the first annular ring 24 .
- the spent bullets obtained at a crime scene are often deformed or are fragments. This results from the fact that the bullet or bullets are most often made of lead or brass, which shatter or become deformed to some extent on impact after being shot.
- FIG. 14A depicts a portion of a deformed bullet being scanned to create the mathematical function of the scanning path.
- Optical imaging device 133 scans bullet fragment 131 as it rotates about axis 142 .
- Axis 142 is perpendicular to the plane of the paper and forms the rotational axis of a spent bullet holding and rotating device.
- FIG. 8 schematically depicts the various major parts of the system.
- U.S. Pat. Nos. 5,379,106 and 5,390,108, already incorporated by reference herein, go into specific detail on various related aspects of the systems and devices used.
- As shown in FIG. 14A, the spent bullet 131 only has two relatively intact LEAs over the surface being scanned, LEAs 137 and 139 .
- spent bullet 131 also has two partially intact LEA's, 138 and 140 .
- FIG. 14B depicts the interface 143 with the function of the scan obtained from spent bullet 131 mapped to the periphery of annular ring 145 .
- intact LEA's 137 A and 139 A appear thereon.
- Partial LEA's 138 A and 140 A also appear thereon. That portion of the spent bullet missing is indicated by dashed line 144 .
- x's 146 indicate a significant departure of the circumference of the spent bullet from its original shape as a result of its deformation.
- FIG. 14C depicts the interface 143 after successful image acquisition of the LEA's. The hatched lines at 137 B, 138 B, 139 B and 140 B indicate the successful image acquisition. Any number of options exist for indicating on the display successful imaging of LEA's, including color coding.
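Keeping track of which LEAs have been imaged, so they can be marked on the ring as in FIG. 14C, could be as simple as flagging angular intervals. The Python sketch below is an assumed illustration (the class `LeaProgress` and the degree intervals are invented), not the system's actual bookkeeping.

```python
# Hypothetical sketch: recording, per angular sector of the first annular
# ring, which land engraved areas (LEAs) have been successfully imaged, so
# they can be marked (hatched, colour-coded, etc.) on the ring periphery.

class LeaProgress:
    def __init__(self):
        # Each LEA is recorded as an angular interval (degrees) on the ring,
        # keyed by a label, together with an "imaged" flag.
        self.leas: dict[str, tuple[float, float, bool]] = {}

    def register(self, label: str, start_deg: float, end_deg: float) -> None:
        """Map a contour feature (an LEA) to an arc of the ring periphery."""
        self.leas[label] = (start_deg, end_deg, False)

    def mark_imaged(self, label: str) -> None:
        """Flag an LEA after its image has been acquired and stored."""
        start, end, _ = self.leas[label]
        self.leas[label] = (start, end, True)

    def all_imaged(self) -> bool:
        """True when every registered LEA has been imaged."""
        return all(done for _, _, done in self.leas.values())

# Example: two intact LEAs registered; only one has been imaged so far.
progress = LeaProgress()
progress.register("LEA 137", 10.0, 50.0)
progress.register("LEA 139", 190.0, 230.0)
progress.mark_imaged("LEA 137")
print(progress.all_imaged())   # False: LEA 139 still to be imaged
```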
- While the present invention could be implemented on an electromechanical system, in the preferred embodiment it is implemented on a programmable computer. Specifically, a programmable digital computer system is used in the preferred embodiment. In the last 10 to 20 years progress in the development of programmable digital computers has been incremental. Computer hardware has, in fact, become a commodity, and now software in a sense has become a commodity. Those skilled in the art, on reading the preceding disclosure, will know that by using standard software writing techniques as well as available software modules, appropriate software programs can be prepared to implement the invention as described herein without the need for any experimentation. In fact the present invention could be implemented in a variety of software languages, e.g. C, Unix etc.
- FIG. 15 provides an operative view of the system and its functional states.
- the interface 173 and mode selection device 176 would appear on the screen in an operator input window 203 in the preferred embodiment.
- Initial operator input to the mode selection device 176 selects one of the two available operating modes, either an image acquisition or an image analysis mode. For example, if an operator selects the image acquisition mode using pointing device 194 by inputting to mode selection device 176 , this in turn transmits a mode selection signal 186 to the central signal and control unit 171 .
- the operator then with pointing device 194 applies input to the interface device 173 which results in generation of a workpiece operative signal 183 which prompts the central signal and control unit 171 to transmit an object positioning signal 184 to the object holding and movement device 174 .
- the workpiece is both the object being imaged in real time and the image of the object which would appear on the display 205 , which makes up part of the image acquisition means.
- the object imaging device 191 transmits the focused image to image acquisition and movement device 175 , an appropriate combination of hardware and software.
- Image acquisition and movement device 175 transmits the viewed image in real time to display window 205 as well as saving the selected image to image storage 197 .
- Many of these aspects have been discussed in detail above. If the operator selects the image analysis mode through input with a pointing device 194 to the mode selection device 176 it generates mode selection signal 186 to central signal and control unit 171 . Then operator input to the interface 173 with pointing device 194 results in generation of operative signal 183 to central signal and control unit 171 which then transmits image movement signal 185 to image movement device 175 for movement of the image on display window 205 .
- the interface device 173 in the preferred embodiment consists of four different parts: the first annular control 173 A, second annular control 173 B, third annular control 173 C and incremental rotational control 173 D. Operator input to each results in generation by each of their respective operative signals 183 A, 183 B, 183 C and 183 D for movement of the workpiece.
- the work piece being the image of an object or the image and the object imaged.
- the central signal and control unit 171 would generate the object positioning signal 184 or the image positioning signal 185 .
- each of the parts of interface 173 would generate an operative signal which would initiate a specific type of movement in the workpiece as follows: the first annular control 173 A would cause rotational motion of the workpiece, the second annular control 173 B would cause small translational moves in the workpiece, the third annular control 173 C would cause large translational moves in the workpiece and the incremental rotational control 173 D, the two central buttons 37 and 39 in FIG. 1, would rotate the workpiece in small increments.
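The signal routing of FIG. 15A can be summarised in a small dispatch table. In the Python sketch below the `dispatch` function, the command strings and the mode enum are assumptions used only to illustrate how one operative signal becomes either an object-positioning or an image-positioning command.

```python
# Hypothetical sketch of the signal flow in FIG. 15A: each part of interface
# 173 emits its own operative signal (183A-183D), and the central signal and
# control unit turns it into either an object-positioning or an image-
# positioning command depending on the selected mode.  Names are invented.
from enum import Enum

class Mode(Enum):
    ACQUISITION = "image acquisition"
    ANALYSIS = "image analysis"

def dispatch(control: str, mode: Mode) -> str:
    """Map an operative signal from a control (173A-173D) to an action."""
    actions = {
        "173A": "rotate workpiece (first annular control)",
        "173B": "translate workpiece, small increment (second annular control)",
        "173C": "translate workpiece, large increment (third annular control)",
        "173D": "rotate workpiece, small increment (central buttons)",
    }
    target = ("object positioning signal 184" if mode is Mode.ACQUISITION
              else "image movement signal 185")
    return f"{actions[control]} -> {target}"

# Example: the same control produces different downstream signals in the two modes.
print(dispatch("173B", Mode.ACQUISITION))
print(dispatch("173A", Mode.ANALYSIS))
```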
- the image acquisition mode as noted in detail above has a submode in which the surface profile of an object is mapped to the periphery of the first annular control means.
- mapping here means that the outside periphery of the first annular shape is changed to resemble the surface contours of the object imaged.
- the image analysis mode in a certain aspect of its operation has three sub-operating states.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
An interface apparatus for use with a computer system. The apparatus in the preferred embodiment has three concentric annular rings, the outermost of which is a solid ring, the two inner rings each being segmented into four equal arcs, which arcs form four pairs of two buttons each. The interface also has at its center two buttons. The interface is used to position an object for imaging and also to manipulate and position the image of an object for comparison and analysis. The two center buttons and the outside annular ring are used to generate signals to rotate an object or the image of the object in a common reference frame. The four pairs of arcs of the two inner concentric rings are used to generate signals for translational motion of the object or the image of the object. The four pairs of arcs are positioned to move the object or the image of the object in one of four perpendicular directions within the common reference frame. In a sub-operating state the outside periphery of the outermost solid ring can be made to represent the contours of the surface of an object and also to indicate which portions of the surface of that object have been imaged.
Description
- The invention relates to interfaces for use on a visual display of a computer system to allow an operator to interact with the computer system. More specifically, it relates to an improved user friendly interface for use in an interactive environment which involves the positioning of an object for imaging and also for analysis, comparison and manipulation of the image of the object.
- The first personal computers had very crude and hard-to-use interfaces which in some instances amounted to knowing special codes which one then typed into the computer. Since then, one of the driving forces in the development and improvement of computers has been the improvement of the user interface. The trend has been to develop user friendly, intuitive and generally graphical, visually based interfaces. In fact, the success or failure of some products, in particular software products, has been their user interfaces and how easy they made the product to master and use. However, such a plethora of interfaces now exist that the user now often has to master a new interface for each new product or application encountered and used.
- Thus, there is a continuing need to improve interfaces and make them easier and more intuitive in their operation. Additionally, a significant need exists to make interfaces with enough flexibility and capacity to allow them to function with a fairly wide variety of applications but still retain their flexibility, intuitive feel and usefulness. One significant area in which this need exists is image acquisition and image analysis, comparison and manipulation.
- It is an object of the present invention to provide a user friendly generally intuitive interface for use with a computer system which has sufficient flexibility and adaptability to be used in a variety of different applications.
- The object of the present invention is achieved by providing an interface apparatus for manipulating a workpiece positionable by movement means. The interface apparatus has a first annular control means responsive to operator input for generating a rotational control signal for rotating said workpiece within a common reference frame and a second annular control means responsive to operator input for generating a translational motion control signal for movement of said workpiece within the common reference frame. The apparatus also has a command and control mechanism to generate the specified control signals upon operator input, transmit the control signals to said movement means and initiate activity specified by the operator input. The apparatus in this aspect of the invention allows the operator to position the workpiece for at least one of the following imaging, analysis and comparison.
- In another aspect of this invention, the interface apparatus has a third annular control means for generating a translational motion control signal, for movement of the workpiece within the reference frame, wherein the control signal generated by the second annular control means moves the workpiece in small increments and the signal generated by the third annular control means moves the workpiece in large increments.
- In another aspect of this invention the two annular control means of the interface apparatus are concentric so that they share a common center. The interface apparatus has at the common center of the first and second concentric annular control means at least two activable buttons, the first button when activated rotates the work piece in a clockwise direction and the second button when activated rotates the work piece in a counterclockwise direction.
- In another aspect of this invention the second annular control means of the interface apparatus is segmented into four equal arcs, with each arc forming a separate button for directed motion, which on individual activation of each button generates a control signal for movement of the workpiece on a vector in a direction which is radially away from a common center of the arcs and normal to a center of a peripheral outside edge of the arc so activated, the arcs being so positioned that the direction of a vector of any one arc is perpendicular to the direction of vectors of the two adjacent arcs when each is activated by operator input, and 180 degrees from the direction of a vector of an arc on the opposite side of the common center of the arcs when activated by operator input. The interface apparatus includes a third annular ring, also segmented into four equal arcs, each arc so formed being paired with an arc of the second annular ring so that the paired arcs form two buttons for movement of the workpiece along a vector in the same direction, one of the arcs of each pair providing for small incremental movements of the workpiece and the other arc of each pair providing for large incremental movements of the workpiece.
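- For illustration only, the following sketch shows one way such a segmented control could map an operator's click to a directed move: the click is classified first by radius (which ring, hence which increment size) and then by angle (which of the four arcs, hence which direction). The Python form, the function name classify_click and the ring radii and step sizes are assumptions made for this example and are not part of the disclosed apparatus.

```python
import math

# Hypothetical geometry for the two segmented rings; the radii and step
# sizes are illustrative assumptions, not values taken from the patent.
INNER_RING = (40.0, 70.0)    # second annular ring: small increments
MIDDLE_RING = (70.0, 100.0)  # third annular ring: large increments
SMALL_STEP = 1.0             # e.g. millimetres per click
LARGE_STEP = 10.0

def classify_click(x, y, cx, cy):
    """Return a translation (dx, dy) for a click at (x, y) on a dial centred
    at (cx, cy), or None if the click falls outside both segmented rings."""
    vx, vy = x - cx, y - cy
    r = math.hypot(vx, vy)
    if INNER_RING[0] <= r < INNER_RING[1]:
        step = SMALL_STEP
    elif MIDDLE_RING[0] <= r < MIDDLE_RING[1]:
        step = LARGE_STEP
    else:
        return None
    # Each ring is split into four 90-degree arcs centred on the compass
    # points; the arc containing the click decides the direction of motion.
    angle = math.degrees(math.atan2(-vy, vx)) % 360  # screen y grows downward
    if 45 <= angle < 135:      # north
        return (0.0, -step)
    if 135 <= angle < 225:     # west
        return (-step, 0.0)
    if 225 <= angle < 315:     # south
        return (0.0, step)
    return (step, 0.0)         # east
```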
- In yet another aspect of this invention the control signal generated by operator input at a selected spot on the second annular control means moves the workpiece within the common reference frame in a direction of a vector radially away from a common center of the second annular control means and normal to a tangent line formed at the closest point on an outside periphery of the annular control means to the selected spot on the second annular control means activated by operator input.
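- The geometry described in this aspect reduces to a simple radial rule: the point on the outside periphery closest to the selected spot lies on the ray from the common center through that spot, so the normal to the tangent at that point is just the unit vector from the center to the spot. A minimal sketch follows; the Python form and the name movement_vector are illustrative assumptions, not part of the disclosure.

```python
import math

def movement_vector(spot_x, spot_y, cx, cy, step):
    """Direction of motion for a click anywhere on a solid annular control:
    radially away from the common centre (cx, cy), i.e. along the normal to
    the tangent at the point of the outside periphery closest to the click."""
    vx, vy = spot_x - cx, spot_y - cy
    r = math.hypot(vx, vy)
    if r == 0:
        return (0.0, 0.0)      # degenerate click exactly at the centre
    return (step * vx / r, step * vy / r)
```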
- In yet another aspect of the invention the command and control mechanism is a programmable computer with a visual display. The computer system of the invention being in functional communication with an imaging device and an object positioning device, the imaging device being placed in relation to the object positioning device such that upon operator input, applied through the interface, the computer system can generate the necessary control signal to position the object held by the object positioning device in a focal plane of the object imaging device so that the imaging device can focus on the object and transmit to the computer for display on the visual display the image of that object so obtained.
- In yet another aspect of this invention the first annular control means can be switched between a first operational state wherein it rotates an object and a second operational state wherein contours of a surface of the object can be mapped to an outside periphery of the first annular control means so that a representation of the contours of the surface of the object appears on the outside periphery of the first annular control means. The second operational state maps those portions of the contours of the object which have been successfully imaged and stored in a memory by the computer system. The object imaged in the preferred embodiment is generally a spent bullet, and the first annular control means of the interface apparatus in the second operational state displays on its outside periphery contours of land engraved areas of the bullet successfully imaged and stored by the computer in a memory.
- In yet another aspect of this invention, the workpiece is an image of a reference object and an image of a test object and the computer system can then simultaneously display, on the visual display, an image of a test object and an image of a reference object, and operator input applied through the interface can switch the computer system between three different image analysis states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating the combined images of the test object and the reference object. When the interface is in the third state, the image of the test object and the image of the reference object are joined together on the visual display in an overlapping configuration with the visible portions of the image of the reference object and the image of the test object separated by a line of separation, the line of separation being manipulated by operator input means. Operator input to manipulate the line of separation, while the interface is in the third state, causes the line of separation to rotate about a central point and, as it rotates, to successively reveal different portions of the overlapped images of the test object and the reference object so that the operator can compare and analyze them.
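- As a hedged illustration of the third state only: if the two images are held as equally sized pixel arrays, the line of separation can be realized as a signed-distance test against a line through a chosen central point, with each pixel drawn from the reference image on one side and from the test image on the other. The NumPy sketch below is an assumption about one possible implementation, not the patented method.

```python
import numpy as np

def composite(reference, test, cx, cy, angle_deg):
    """Overlap a reference image and a test image of equal size, showing the
    reference on one side of a line of separation through (cx, cy) at
    angle_deg and the test image on the other side.  Sweeping angle_deg
    rotates the line and progressively reveals each image."""
    h, w = reference.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(angle_deg)
    # Signed distance of every pixel centre from the line of separation.
    side = (xs - cx) * np.sin(theta) - (ys - cy) * np.cos(theta)
    mask = side >= 0
    if reference.ndim == 3:          # colour images: broadcast over channels
        mask = mask[..., None]
    return np.where(mask, reference, test)
```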
- In yet another aspect of this invention, the first annular control means has an operator activable marker, which marker, as the operator moves it around the first annular ring generates the signal which causes the workpiece to rotate in the same direction through the same angular distance as the marker is moved, by the operator, on the first annular control means.
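- As a sketch of this aspect (an illustrative assumption, not the claimed mechanism), the angular distance the marker is dragged can be measured about the dial's center and the same rotation applied to the workpiece about whatever axis of rotation the operator has selected.

```python
import math

def marker_angle(px, py, cx, cy):
    """Angle of the ring marker (or of the pointer dragging it) about the
    dial centre (cx, cy)."""
    return math.atan2(py - cy, px - cx)

def rotate_point(x, y, axis_x, axis_y, delta):
    """Rotate a workpiece point through the same angle delta the marker was
    dragged through, about the operator-selected axis of rotation."""
    dx, dy = x - axis_x, y - axis_y
    cos_d, sin_d = math.cos(delta), math.sin(delta)
    return (axis_x + dx * cos_d - dy * sin_d,
            axis_y + dx * sin_d + dy * cos_d)

# delta is the difference between the marker angle where the drag ends and
# where it started, e.g.:
# delta = marker_angle(x1, y1, cx, cy) - marker_angle(x0, y0, cx, cy)
```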
- In yet another aspect of this invention the operator activates the interface with a pointing device. The pointing device can be a mouse, track ball, touch pad, light pen or PC stylus. Alternatively, the operator can activate the various parts of the interface with a touch sensitive screen.
- According to the invention, there is also provided a method for a computer system to manipulate objects and images which includes the steps of:
- generating an activable interface for a visual display with at least two annular control means: the first annular control means, when activated, generates a control signal of rotational motion which rotates a workpiece within a reference frame, the second annular control means, when activated, generates a control signal of translational motion which moves the workpiece within the reference frame;
- moving a workpiece within the reference frame to a desired location in that reference frame, with the control signal of translational motion generated by activating the second annular control means;
- rotating the workpiece within the reference frame to a desired angular orientation with the control signal of rotational motion generated by activating the first annular control means; and
- conducting at least one of the following: imaging, analysis and comparison.
- The method of the invention preferably includes one or more of the additional steps of generating the first and second annular control means such that they are concentric and thus share a common center. Generating a third annular control means which shares the common center with the first and second annular control means, wherein activation of the third annular control means generates a signal of translational motion which moves the workpiece in large increments and the signal of translational motion generated by activation of the second annular control means moves the work piece in small increments.
- Another alternative aspect of the method of this invention involves generating a control signal by operator input at a selected spot on the second annular control means which moves the workpiece in the direction of a vector pointing radially away from a center of the second annular control means and normal to a line tangent to a point on an outside circumference of the second annular control means which point is the closest point on the outside periphery to the selected spot.
- In yet another alternative aspect of the method of this invention the step of generating the interface includes segmenting the second annular control means into four equal arcs, with each arc forming a separate button for directed motion, which on individual activation of each button generates a control signal for movement of the workpiece on a vector in a direction which is radially away from a common center of the arcs, said vector being normal to the center of the peripheral outside edge of the arc so activated, the arcs being so positioned that the direction of the vector of any one arc being perpendicular to the direction of the vectors of the two adjacent arcs and 180 degrees from the direction of the vector of the arc on the opposite of the common center. This alternative aspect can include the additional step of generating a third concentric annular control means which shares the common center with the second annular control means, the third annular control means being segmented into four equal arcs, each arc so formed by the third annular control-means being paired with an arc of the second annular control means so that the paired arcs form two buttons for movement of the workpiece along a vector in the same direction, one of the arcs of each pair providing for small incremental movements of the workpiece and the other arc of each pair providing for large incremental movements of the workpiece. The method of this invention can also include generating two buttons at the common center of the annular control means, one of said buttons upon operator activation rotates the workpiece in a clockwise direction and the other button on operator activation rotates the workpiece in a counterclockwise direction.
- In yet another aspect of the method of this invention it can include switching between two operating modes, a first mode for image acquisition and second for image analysis, comparison and manipulation. The step of operating in the first operating mode comprises manipulating with the interface a workpiece which is both an object and an image of that object, the image of the object so manipulated appearing on the visual display. In the step of operating in the second operating mode the workpiece is an image of a reference object and an image of a test object and the step of operating in the second operating mode includes simultaneously displaying on the visual display the image of the test object and the image of the reference object, and a further step, switching the interface in the second operating mode between three different states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating a combined image of the test object and the reference object. The step of operating in the first operating mode can also include the step of selecting one of two different states to operate in: a first state for acquisition of an image of a cartridge case and a second state of the first operating mode for acquisition of images of the land engraved areas of a spent bullet. The step of operating in the second state includes the step of mapping to the first annular control means a representation of each land engraved area successfully imaged.
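- The operating modes, sub-modes and analysis states described in this method can be pictured as a small state model. The following Python enums are purely illustrative assumptions about one way to organize that switching; the names are not taken from the disclosure.

```python
from enum import Enum, auto

class Mode(Enum):
    IMAGE_ACQUISITION = auto()
    IMAGE_ANALYSIS = auto()

class AcquisitionSubMode(Enum):
    CARTRIDGE_CASE = auto()
    SPENT_BULLET = auto()

class AnalysisState(Enum):
    TEST_IMAGE = auto()        # manipulate the image of the test object
    REFERENCE_IMAGE = auto()   # manipulate the image of the reference object
    COMBINED_IMAGES = auto()   # manipulate the overlapped, combined image

def next_analysis_state(state):
    """Cycle through the three analysis states, for example each time the
    operator clicks the window activity indicator."""
    order = [AnalysisState.TEST_IMAGE,
             AnalysisState.REFERENCE_IMAGE,
             AnalysisState.COMBINED_IMAGES]
    return order[(order.index(state) + 1) % len(order)]
```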
- In another aspect of the invention it provides a method of displaying contours of a surface of an object, the method comprising the steps of: providing an interface with a peripheral surface; obtaining positional information on contours of said surface of said object; and altering said interface's peripheral surface to display said information on the contours of portions of said object.
- In yet another aspect of this method the step of obtaining the information on the contours of said surface of said object comprises scanning the surface of said object and generating a mathematical function approximating said surface of said object.
- In yet another aspect of this method the step of altering said interface's peripheral surface comprises: altering the peripheral surface of said interface apparatus with the information from said mathematical function.
- In yet another aspect of this method it includes the additional step of imaging selected portions of said surface of said object and indicating on the peripheral surface of said interface which portions of said surface of said object have been imaged.
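- A hedged sketch of these steps: if the scan yields a function giving the object's surface radius at each angle, that function can be sampled to deform the drawn periphery of the interface, and the angular sectors already imaged can be recorded so they are rendered differently. The Python below, including the names periphery_outline and mark_imaged and the scaling scheme, is an assumption for illustration only.

```python
import math

def periphery_outline(radius_fn, cx, cy, base_radius, scale, samples=360):
    """Deform the outside periphery of the first annular control so that it
    follows the scanned surface profile.  radius_fn(theta) is the function
    (or interpolated scan data) approximating the object's surface radius at
    angle theta; the profile is scaled and added to the ring's base radius."""
    points = []
    for i in range(samples):
        theta = 2 * math.pi * i / samples
        r = base_radius + scale * radius_fn(theta)
        points.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return points

def mark_imaged(imaged_sectors, start_deg, end_deg):
    """Record an angular sector (e.g. one land engraved area) as successfully
    imaged so it can be drawn differently on the ring periphery."""
    imaged_sectors.append((start_deg % 360, end_deg % 360))
    return imaged_sectors
```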
- The invention will be better understood by an examination of the following description, together with the accompanying drawings, in which:
- FIG. 1 is a schematic drawing of the essential features of the present invention;
- FIG. 1A depicts an object being manipulated by the present invention;
- FIG. 2 is a view of a screen display which implements the interface of the present invention with other visual interfaces;
- FIG. 3 provides a schematic of a system with which the screen display in FIG. 2 would be used;
- FIG. 4A provides an alternate arrangement for the interface of the present invention;
- FIG. 4B provides a second alternate version of the interface of the present invention;
- FIG. 4C provides a third alternate version of the interface of the present invention;
- FIG. 4D provides a view of a portion of FIG. 4B;
- FIG. 5 is part of a flow chart of one system which incorporates the interface of the present invention;
- FIG. 6 is the rest of the flow chart of the system of FIG. 5 which incorporates the interface of the present invention;
- FIG. 7 is another screen display which implements the interface of the present invention with other visual interfaces;
- FIG. 8 provides a schematic of a system with which the screen display in FIG. 7 would be used;
- FIG. 9 is a perspective view of a spent bullet;
- FIG. 10A is a view of the interface of the present invention in one of its implementations;
- FIG. 10B is another view of the interface of the present invention after completion of the implementation in FIG. 10A;
- FIG. 11 is a view of a screen display which incorporates the interface of the present invention in a cartridge case analysis mode;
- FIG. 11A depicts the window activity indicator when the window with the test image is active;
- FIG. 11B depicts the window activity indicator when the window with the reference image is active;
- FIG. 11C depicts the window activity indicator when the reference image and the test image are combined in one window and overlapped, with a line of separation indicating the boundary between the images;
- FIG. 12 is a view of a screen display of another implementation of the interface of the present invention in a cartridge case analysis mode;
- FIG. 13 is a view of a screen display which incorporates the present invention in a spent bullet analysis mode in which the image of the test and reference bullets occupy their own windows;
- FIG. 13A is a view of a screen display which incorporates the present invention in a spent bullet analysis mode in which the images of the test and reference bullets are combined in an overlapping mode in the same window with the images separated by a line of separation;
- FIG. 14A is a view of an end of a spent bullet fragment;
- FIG. 14B is an interface of the present invention with the scanned contour of the spent bullet fragment mapped to its first annular ring;
- FIG. 14C is the interface of FIG. 14B with an indication of those areas successfully imaged;
- FIG. 15 is an operative schematic view of the present invention presented in the drawings and description herein; and
- FIG. 15A is a portion of the operative schematic view of FIG. 15 of the present invention with the various parts of the interface of the preferred embodiment of the present invention represented.
- Overview of the Invention
- The
interface 21 of the present invention appears on visual display 41 in FIG. 1. Visual display 41 generally being a computer screen such as a CRT, liquid crystal display or any similar device which provides the user of a computer with a visual display. The appropriately configured software, which will be discussed below, would generate directional dial interface 21 on visual display 41 and allow the operator of the computer 48 to interact with the computer 48 in the manner which will shortly be described. - The computer system depicted in FIG. 1 includes a
computer 48, connected by a video bus 53 to visual display 41. The computer in turn has a keyboard 49 connected to it as well as pointing device 43, in the example, a mouse, by cable 52. The computer system would be running the appropriate software which generates the image of the interface 21 among other things. In fact, as will be noted in more detail below, given the current state of development of software once the configuration and mode of the present invention is described a knowledgeable software writer could compose the necessary software in a variety of different ways to achieve the effect and purpose of the present invention. - The
directional dial interface 21 in the preferred embodiment has three annular rings 24, 26 and 28 which share a common center 29. The first annular ring 24, which is the outermost of the three, is an unbroken ring. The second annular ring 26 is the innermost one and in the preferred embodiment is sectioned into four separate buttons 30N, 30S, 30E and 30W. The third annular ring 28 lies between the first ring 24 and the second ring 26 and, like ring 26, in the preferred embodiment is sectioned into four separate buttons 32N, 32S, 32E and 32W. The common center 29 has two triangular buttons 37 and 39. - All of the buttons of
directional dial interface 21 have a specific general function which allows the computer operator to move and manipulate the spatial position of an image of an object in a specific frame of reference. This frame of reference is in fact the visual display or screen 41 on which the image of the object appears. The image of an object 45 provides one example of an image which can be manipulated with the interface of the present invention. Although the screen itself is two dimensional, as is well known in the art, a visual display can depict an image of an object in three dimensions and the interface could be configured to handle such object manipulation. Additionally, although the operator may be manipulating an image of an object on the visual display, in actuality, the operator could be manipulating an image of an object previously taken and stored for later use or the image of an object being viewed in real time. If the operator is viewing an image of an object in real time (i.e. the operator views the image of an object as it is actually being taken) the system can be configured such that each manipulation of the image being viewed could result in a corresponding movement of the object. The object is then in another, corresponding reference frame, where it is being viewed by an appropriate imaging device 63, FIG. 3. The other or secondary reference frame is the focal plane of the imaging device 63; the primary reference frame is the visual display. The imaging device 63 which transmits the image of the object so viewed to the computer 62 for imaging on the screen 61 could be any of a number of cameras or a charge coupled device (CCD), which are well known in the art. One such setup is depicted in FIG. 3 and will be discussed again below. From here on, the term “workpiece” can refer to both the image of an object and the object itself when being imaged. - The buttons of
directional dial interface 21 FIG. 1 in the preferred embodiment would be activated by the computer operator using a pointing device 43. Item 43 in FIG. 1 is a standard two button mouse. As is well known in the art, the mouse has at least one button. However, in most instances the mouse has two and sometimes three buttons. One of the buttons, generally the left button 43A, is the primary activating button. Mouse 43 has a corresponding cursor 47 generated on visual display 41 by the appropriate and commonly available software which is well known in the art. The operator moves the pointing device 43 about on a flat surface, in this case pad 51. Each move of the pointing device 43, as is well known by those who use them, results in a corresponding move of the pointing device cursor 47 on screen 41. This allows the operator to position the pointing device cursor 47 at any position on screen 41 in a few moves of the pointing device 43. For example, by making the appropriate moves of pointing device 43 on pad 51 the cursor 47 could be easily moved to a new position. When the pointing device cursor 47, in visual display 41, is placed on one of the buttons of directional dial interface 21 and the appropriate button on the pointing device 43 is pushed, movement of the workpiece, in this case image 45, is initiated. The actual direction of movement depends on which button of directional dial interface 21 is activated by the pointing device 43. The actual specific directions in which directional dial interface 21 moves image 45 will be discussed in detail below. - Although the example herein uses a mouse as the pointing device, it is well known in the art that other pointing devices would work as well as a mouse, among them being track balls, touch sensitive pads, light pens etc. Since these devices are now commonly used by anyone who uses a personal computer, and they are used in the standard fashion with the invention, there is no need to describe how they would work. Also, it is well known in the art how one would implement the use of such devices with computer hardware and software in a standard fashion without the need for any experimentation. It will also be readily perceived by those skilled in the art that a touch sensitive visual display or screen could be used for activation of the buttons of the
directional dial interface 21. Such a touch sensitive screen could be activated by a human hand or stylus designed for such activity. - Returning to the
directional dial interface 21, depicted in FIG. 1, the buttons of the second annular ring 26 and the third annular ring 28, when appropriately activated, provide for translational moves of the workpiece. In the preferred embodiment the eight buttons of annular rings 26 and 28 move the workpiece in four directions, which can be designated on visual display 41 as follows: the top as north (N), the bottom as south (S), the left side as west (W) and the right side as east (E). Thus, buttons 30N and 32N would move image 45 north on the screen, buttons 30S and 32S would move image 45 south on the screen, buttons 30W and 32W move image 45 west on the screen and buttons 30E and 32E move image 45 east on the screen. The buttons of the second annular ring 26, namely 30N, 30S, 30W and 30E, would move image 45 in small increments in their respective directions of motion. On the other hand the buttons of the third annular ring 28, namely 32N, 32S, 32W and 32E, move image 45 in large increments in their respective directions of motion. Naturally, if the operator is viewing the image of an object in real time the initiation of movements of the image on the screen with appropriate hardware and software would result in corresponding movements of the object being imaged in its reference frame, generally the focal plane of an imaging device. - The system of the present invention, through the use of well known software and hardware devices and techniques, allows the operator to set the speed and distance each button of the second and third annular rings, 30N, 30S, 30E, 30W, 32N, 32S, 32E and 32W, moves the
image 45 on the screen 41. For example, the operator could set inner annular ring buttons 30N, 30S, 30E and 30W to move image 45 in small increments of about a millimeter for each click of the pointing device button 43A when the pointing device cursor 47 is placed on it. The operator could also configure the system such that when the pointing device cursor is placed on one of these buttons, 30N, 30S, 30E and 30W, and the pointing device button is held down for more than two seconds the image 45 moves at a rate of one millimeter every fifth of a second. Likewise, the operator could set the buttons of the middle ring, 32N, 32S, 32E and 32W, to move image 45 in centimeter increments on the screen with each click of the pointing device button 43A while the pointing device cursor 47 is on the button activated. Additionally, the operator could provide for continuous movement of image 45 on depression of one of these buttons, 32N, 32S, 32E and 32W, for more than two seconds etc. - Thus it can be seen that the buttons of translational motion move the image of the
object 45 in the direction in which the arrows on each of the respective buttons point. Each of the pairs of east, west, north and south buttons is in orthogonal relationship to the adjacent buttons with respect to the direction in which they move the image 45. Each pair of buttons moves image 45 in a direction 180 degrees from that of the pair of buttons on the opposite side of the common center 29. - Each of the buttons: pair 30N and 32N,
pair 30S and 32S, pair 30E and 32E and pair 30W and 32W, moves image 45 along a vector radiating from the common center 29 of directional dial interface 21. The direction of each vector being the direction the arrows point on each button. Buttons 30E and 32E move image 45 in the direction of a vector pointing in the east direction on the screen, which is at right angles, i.e. orthogonal, to the directional vectors of the adjacent pairs of buttons, pair 30N and 32N and pair 30S and 32S. Button pair 30W and 32W lies 180 degrees across the common center 29 from button pair 30E and 32E and moves image 45 in the direction of a vector pointing to the west. Likewise, buttons 30N and 32N move object 45 in the direction of a vector pointing to the north on the screen and button pair 32S and 30S moves image 45 in the direction of a vector pointing to the south. - By moving
image 45 in any one of the four primary directions the appropriate number of times and in the proper sequence the operator can reposition image 45 anywhere in the reference frame on the screen. For example, assume the operator wanted to move image 45 to position 45A; the operator could accomplish this transition with two moves, one in the east direction by activating buttons 30E or 32E and one in the south direction by activating button 30S or 32S. In fact, one potentially could move image 45 to any position in the reference frame 41, the visual display, with no more than two moves. - First
annular ring 24 in the preferred embodiment surrounds the entire interface 21 forming its outer boundary. First annular ring 24 provides one means to rotate image 45. One activates annular ring 24 by moving mouse cursor 47 to the position 47C and placing the mouse cursor 47 on ring marker 34 of the first annular ring 24. Once mouse cursor 47 is placed on ring marker 34 the operator then clicks on the appropriate button on the mouse 43 and holds that button down and drags ring marker 34 around annular ring 24, which results in a corresponding rotational movement of image 45. In FIG. 1, assuming the axis of rotation of image 45 is at its center, moving ring marker 34 by the above method from position 56A to 56B results in a corresponding movement of image 45. Thus point 56AA on image 45 moves to position 56BB. The axis of rotation about which image 45 rotates is selected by default as the center of the image as initially acquired. However, as depicted in FIG. 1A the operator can change the axis about which the image 45 rotates by moving the cursor 47 to the appropriate position, such as point 57 for the purposes of this example, and clicking on the appropriate mouse button. Thus, if ring marker 34 is moved from point 56A to 56B with the axis of rotation at point 57 in FIG. 1A, image 45 rotates to new position 45B. - Referring back to FIG. 1
center buttons object 45. One of the buttons, such as 37, when activated by moving pointing device,cursor 47, tobutton 37 and depressing the appropriatepointing device button 43A rotatesimage 45 in a clockwise direction and theother button 39, when activated rotates it in a counter clockwise direction. In the preferred embodiment, each click ofbutton button image 45 rotates at a steady and moderate pace for as long as thecursor 47 remains onbutton center buttons - Alternative Interface Configurations
- The interface of the present invention can take on different configurations and not depart from the fundamental concept of intuitive functionality it provides. The
directional dial interface 21 could take on the configuration shown in FIG. 4A where the four pairs of buttons point towards the four points of a compass. The buttons are activated in the manner, as noted above, through use of a pointing device wherein thescreen cursor 47 is placed over the arrow buttons 30 (N, S, E and W) or 32 (N, S, E and W) and clicked. Each of the pairs of buttons having the same function as described above with respect to movement. - Another alternative configuration is to segment the third and second annular rings into more segments such as depicted in FIG. 4C. As shown, FIG. 4C the second and third annular rings are segmented into eight arcs. This results in eight sets of two buttons for a total of eight directions or compass points the image of the object, or the object itself, can be moved in within the frame of reference with only one click of the pointing device. For example, buttons32NW and 30NW would move the image on
screen 41 in a northwest direction between the direction of button pairs 30W and 32W and 30N and 32N. - In another alternative, as depicted in FIG. 4B, the second
annular ring 72 and thirdannular ring 73 could be presented as solid rings. Here again the secondannular ring 72 and thirdannular rings 73 would still be used for transitional movement of the object. However, clicking themouse cursor 47 on a section of the secondannular ring 72 or thirdannular ring 73 would cause the image to move in a direction normal to a line tangent to the point on theoutside curvature 80 which is closest to the spot on the ring clicked. For example, the computer operator would place thepointing device cursor 47 atspot 78, click and movement of the image in the frame of reference would occur in the direction ofvector 79, a direction normal to the outside curvature orperiphery 80 of the ring so activated, in this instance thethird ring 73. The second ring could be for small incremental movements and the third ring would still be for large movements. FIG. 4D depicts a portion ofannular ring 73 from FIG. 4B, specifically that portion aroundpoint 78. As can be seen thereonvector 79 is normal totangent line 77.Tangent line 77 is tangent atpoint 76 on the outside periphery ofannular ring 73.Tangent point 76 is the closest point on the outside curvature orperiphery 80 ofannular ring 73 to spot 78, the spot clicked by the operator to initiate movement of the image. - Other Uses
- As mentioned above, the invention includes the feature of allowing the operator to control the positioning of an object in real time through use of
directional dial interface 21. The purpose of positioning the object could be for obtaining an image for storage and later analysis, to work on the object positioned or for the handling of toxic or dangerous materials in a secure area removed from the operator. FIG. 3 shows a system set up to position anobject 65 for imaging.Computer 62 using the appropriate software controls, thevisual display 61, as well asoptical imaging device 63 andpositioning stage 64 for this system. Technologies including hardware and software for implementing and controlling such devices are well known in the art. The operator would exercise control throughkeyboard 49 andpointing device 43. FIG. 2 shows thedirectional dial interface 21 of the present invention integrated onvisual display 41A with various other interface devices to form an extended system. The interface system ofdisplay 41A would then appear onscreen 61. However, the additional interface apparatuses are not essential for practicing the present invention. - The operator through use of
directional dial interface 21, in the manner described above, would then position theobject 65 in the appropriate position for imaging. Use ofdirectional dial interface 21, in the manner described above, would result in movement ofobject 65 to the appropriate position through instructions sent bycomputer 62 topositioning stage 64. The operator could also control the brightness of the object byslide 71 and focus of theoptical device 63 byslide 70. Bothinterface device screen 61. As noted above the system has a common reference frame. The image acquisition process is made up of two parts. The visual display provides the primary reference frame and the focal plane of the imaging device provides the secondary reference frame. - The operator on
screen 61 would then view the image so captured as depicted inwindow 74 onscreen 41A. The signal transmitted byoptical device 63 could either be an analog or a digital signal. Suitable apparatus and techniques well known in the art could be used. In the preferred embodiment theimaging device 63 would include a charged coupled device (CCD) well known in the art. This transmits a digitalized signal of that image. After making the appropriate adjustments to obtain an optimal image, as described above, the operator could instruct the system, by activatingbutton 75, to save the image to a storage device, not specifically shown, but which would be part of thecomputer system 62 and certainly well known in the art. The image so obtained could be saved as a file in the usual manner and held for later retrieval and use. - Various Modes and States of Operation
- The invention has been described in fairly general terms up to this point. The following description will discuss implementation of the invention in a system which takes images of objects, stores those images and subsequently uses those images for comparison and analysis with other similarly obtained images. The Integrated Ballistics Identification System or IBIS of Forensic Technology provides a still developing system for automated and systematized forensic ballistics analysis. The system relies in part on computers and thus control and operation would be significantly enhanced with user friendly interfaces among other things. A number of patents have issued relating to different aspects of this automated forensic ballistics analysis system such as the following: U.S. patents: “Method And Apparatus For Monitoring And Adjusting The Position Of An Article Under Optical Observation” U.S. Pat. No. 5,379,106; “Computer Automated Bullet Analysis Apparatus” U.S. Pat. No. 5,390,108; “Method For Monitoring And Adjusting The Position Of An Object Under Optical Observation For Imaging” U.S. Pat. No. 5,633,717; “Fired Cartridge Examination Method And Imaging Apparatus” U.S. Pat. No. 5,654,801; And “Method And Apparatus For Obtaining A Signature From A Fired Bullet” U.S. Pat. No. 5,659,489. All of these patents are incorporated herein by reference.
- An overview of the implementation of the directional dial interface of the present invention in the IBIS ballistics analysis system will be discussed with the aid of flow charts. Then specific implementations of the directional dial interface in the IBIS ballistics analysis system will be reviewed. FIG. 5 and6 provide flow charts with the major functional elements of the current preferred embodiment of the system which uses the directional dial interface of the present invention. Only so much of this system will be described, as is necessary, to understand the full capacity and functionality of the directional dial interface.
- First the program is activated,79 FIG. 5, and then the operating
mode 80 is selected. The system has two operating modes,image acquisition mode 90 and image comparison, analysis andmanipulation mode 81 FIG. 6. If the image acquisition mode is selected, then one of two sub-modes must be selected, either the sub-mode for acquisition of the image of acartridge case 91 or the sub-mode for acquisition of the image of a spentbullet 93. If the sub-mode for acquisition of the image of thecartridge case 91 is selected thendirectional dial interface 21 is generated on the visual display together with the rest of the working interface. Starting the sub-mode for acquisition of images of thecartridge case 91, in the preferred embodiment, only activates the buttons oftransitional motion 92. On the other hand, starting the sub-mode for acquisition of images of the spentbullet 93 activates not only the buttons oftranslational motion 94, it also activates a unique sub-state which uses the first annular ring which will be described in detail below. Briefly, this sub-state assists in assuring that the operator has successfully obtained images of the land engraved areas on a spent bullet being imaged. As the operator rotates the spent bullet imaging the land engraved areas on the spent bullet those portions successfully imaged are mapped as depressions to the first annular ring. This allows the operator to keep track of what has been imaged and know when all of the land engraved areas on the spent bullet have been imaged. - After activation of the
program 79 the operator also has the option of starting the image comparison, analysis andmanipulation mode 81. Then depending on whether or not the operator wants to compare previously acquired images of spent bullets or shells he selects either the spent bullet comparison, analysis andmanipulation sub mode 82 or the cartridge case analysis, comparison andmanipulation sub mode 84. If the spent bullet comparison andanalysis sub mode 82 is selected thedirectional dial interface 21 is generated on the screen.Interface 21 appears on the screen with the other related interfaces, but only its buttons of translational movement are activated. - If the cartridge
case analysis mode 84 FIG. 6 has been selected after generation of thedirectional dial interface 21 and the other related interfaces of the system have been generated 85 then the buttons of translational and rotational motion are activated. However, the operator must still select a state to operate in from a choice of three possible operational states available in the cartridge case analysis mode. In the cartridge case analysis mode, the operator usually has two images on the screen to work with, one is an image of a reference object which will be compared to another image, the image of a test object. Both images are of spent cartridge cases and the purpose is for comparison, to determine if a match exists, such as, were both fired from the same firearm. Thus, the operator can switch into the image of the referenceobject manipulation sub-state 87 to move the reference image around. The operator can then switch to the image of the testobject manipulation sub-state 88 to move the test image around. Finally, the operator can switch to animage comparison sub-state 89 which joins both images as one image separated by a line of separation. As will be discussed in detail below half of each image, such as half of the image of the test object and half of the image of the reference object, appears together separated by a line of separation. Rotation of the line of separation, as described below about a central axis progressively reveals different portions of each image so the both can be compared simultaneously. - Detailed Implementation
- As noted above, after selection of the spent bullet image
acquisition sub mode 93 FIG. 5 the program generates thedirectional dial interface 21 on the screen together with the associated interfaces and activates theinterface 94. FIG. 8 depicts schematically the basic components of the spent bullet image acquisition system. The system includes avisual display 103 connected to an appropriately programmedcomputer 102. The operator controls the computer withkeyboard 104 andpointing device 105 in the usual manner. The computer in turn controlsoptical imaging device 106 and spent bullet holding andpositioning stage 108. The operator thus can position the spentbullet 107 to take appropriate images as will be described shortly. Various components of this system are described in detail in U.S. Pat. Nos. 5,379,106; 5,390,108; and 5,559,489 which were discussed above and incorporated herein by reference. - The spent
bullet 108 being imaged appears in FIG. 9. The spent bullet, generally made of lead or copper, after being forced down the barrel of gun by the explosion of the gun powder has etched thereon land engravedareas 109. The rifling in a gun barrel consists of spiral alternating grooves and raised areas, called land areas. It is well known that gun rifling, a feature used for at least the last 100 to 200 years, imparts a spin to the bullet as it travels down the barrel and in so doing adds incredible stability to the spent bullet on leaving the barrel. This stability in turn substantially increase the range and accuracy of the bullet fired from the gun. It has also been known since at least the first part of the twentieth century that the land areas of each gun when they etch the land engraved areas (LEA) on a spent bullet, also leave unique markings or striations which can identify the gun from which the spent bullet was fired. - Referring to FIG. 8 the operator will successively obtain images of each LEA on spent
bullet 107 as it is rotated in spentbullet holder 108. Thedirectional dial interface 21 FIG. 7 provides the operator with the means of keeping track of the images of the LEA's as he or she rotates the spent bullet taking the images. The operator actually views a magnified image of the LEA inwindow 99 ininterface 98.Interface 98 appears onvisual display 103 FIG. 8. The operator picks out ashoulder 110 FIG. 9 at the beginning of aLEA 109 and marks it withmouse cursor 115 FIG. 7, the operator then activatesreference mark 35 on thedirectional dial 21 making it correspond to the first shoulder on the spent bullet. The operator then rotates the spentbullet 107 inholder 108 moving down the LEA taking appropriate images of it and stops at theshoulder 110 on the opposite side of theLEA 109. Once the operator has successfully acquired an image or images of that LEA anindentation 111 FIG. 10A appears on the outside of the firstannular ring 24. Thus, the operator slowly rotates the spent bullet inholder 108 and successively obtains images of each LEA.Indentations 112 FIG. 10A and 10B andindentations - Referring back to FIG. 7, the other functions of the
directional dial interface 21 remain the same. The buttons of translational motion 30(N, E, S and W) and 32 (N, E, S and W) perform the same function and allow the operator to move the image about to optimally position it for imaging. Likewise,rotational buttons viewing window 99 to also help optimize the image obtained. - If the operator selects the cartridge
case imaging sub-mode overall interface 41A of FIG. 2 have already been described in detail above as they relate to thedirectional dial interface 21. U.S. Pat. No. 5,654,801 identified and incorporated herein by references discloses a cartridge case examination and imaging method and apparatus which would work with the system. - If the operator selects the cartridge case image comparison, analysis and
manipulation mode interface 126 FIG. 11 would appear on the visual display. The initial display besides having the interface features depicted including thedirectional dial interface 21 of the present invention has two separate windows,window 127 which has the image of thetest object 122, in this case the cartridge case under examination andwindow 128 which has thereference object 121, another cartridge case image, to which thetest object 122 is to be compared. The operator has the option of making eitherwindow cursor 43 of the system pointing device onbutton 130 and depressing the left pointing device button. By making eitherwindow directional dial 21. Thus as discussed in detail above translational buttons 30 (N, E, S and W) and 32 (N, E, S and W) would allow the operator to move the image of the object around as described above. Likewise, the operator could rotate the object withcentral buttons annular ring 24 in the manner described above to place the object in the active window in the proper angular orientation.Indicator 124 tells the operator which window is active. In the preferred embodiment whenwindow 127 isactive indicator 124 is clear or lightly shaded 124A FIG. 11A. Ifwindow 128 is active thenindicator 124 is dark incolor 124B as depicted in FIG. 11B. The operator can also put the system into a third state as depicted in FIG. 12. If the operator puts the system into thisthird state indicator 124 is half dark and half light 124C FIG. 11C. - The third state depicted in FIG. 12 combines half of each
image separation 123. The operator can rotate line ofseparation 123 about acenter point 132 and by so doing progressively reveal different portions ofobject reference object 121 and the image of thetest object 122. The operator has three options with which to initiate rotation of the line ofseparation 123. The operator can place the pointingdevice screen cursor 47 on the line ofseparation 123 and drag it around in a circular motion. The operator can use directional dialcentral buttons pointing device cursor 47 on the selected button and depress the appropriate pointing device button with his or her finger to initiate rotation. Finally, the operator can use the firstannular ring 24 in this state to rotate the line of separation. The operator would, as in the fashion described above, place the pointingdevice screen cursor 47 on the annular ring marker, 34 depress the appropriate pointing device button with his or her hand and drag the annular ring cursor around the ring until the desired position is reached. When the system is in the third state theindicator 124 is half light and half dark 124C FIG. 11C. - In the preferred embodiment the
directional dial 21 is also utilized in theanalysis submode 82 for movement of the spent bullet image in vertical and the horizontal direction. FIG. 13 depicts how the overall interface appears in this mode with thedirectional dial 21 implemented for use to supplement the system. The buttons of translationalmovement 30 and 32 (N, E, S and W) ondial 21 move the image of thereference image 161 inwindow 168 or the image of thetest object 162 inwindow 169 depending on which of the twowindows window 169 is active and thetest image 162 can be manipulatedactivity indicator 124 is clear 124A FIG. 11A. Whenwindow 168 is active and thereference image 161 can be manipulated thenactivity indicator 124 is dark 124B FIG. 11B. Whenwindow 168 is active the system is in thereference image substate 83B FIG. 6 of the spent bullet analysis mode. Whenwindow 169 is active the system is in thetest image substate 83C FIG. 6 of the spent bullet analysis mode. - The
reference image 161 and thetest image 162 can also be combined in one window as depicted in FIG. 13A. There the images are overlapped with the images separated by a line ofseparation 160. The line of separation can be moved horizontally back and forth withbuttons cursor 47 on it and dragging theline 160 back and forth. By moving the line of separation back and forth the operator can successively reveal different portions of each spent bullet. Moving the line ofseparation 160 to the left would reveal more oftest image 162 and cover-up part of 161. On the other hand moving line ofseparation 160 to the right would reveal more ofreference image 161 and cover up portions oftest image 162. When the images are combined in one window as depicted in FIG.13A activity indicator 124 is half dark and half light as depicted by 124C FIG. 11C. When the images of the test and reference spentbullets image comparison substate 83D of the spent bullet analysis mode. - On FIGS. 13 and 13A appears
virtual thumb wheel 163 which an operator of the system uses to stretch or compress the images of the spent bullets displayed on the screen during the image analysis mode. In the preferred embodiment thethumb wheel 163 is only used with the spent bulletimage analysis mode 82 FIG. 6. One positions themouse cursor 47 on thewheel 163 generally at its center and moves thecursor 47 up or down onvirtual thumb wheel 163. In the preferred embodiment when the operator moves thecursor 47 up on thethumb wheel 163 the image in the active window stretches out in a uniform and proportional manner along the vertical axis of the image so that its individual features can be more easily studied. When one moves the cursor down on the thumb wheel it compresses the image in the active window. Compression occurs in a uniform and proportional manner for the image along the vertical axis of the image. In the preferred embodiment compression and stretching of the image occurs in a uniform manner only in one dimension, along the vertical axis of the image which is generally perpendicular to the direction of the land engraved areas on the image of the spent bullet. However, if necessary the apparatus could be adapted to stretch the image in more than one direction. This stretching or sizing apparatus controlled bythumb wheel 163 aids in analysis of spent bullets which have been deformed to some extent on impact after firing but the striations left on the land engraved areas can still be observed and analyzed. To return the image to its original dimensions the operator merely clicks twice on the center of thethumb wheel 163. The operator can switch between the twowindows indicator 124 which successively cycles the system through each of the threesub states - Reference was made above to the IBAS system and patents which relate to that system. A discussion then ensued in general terms which described how the present invention related to the IBAS system and the identified and incorporated various US patents with specific reference to the use of those patents. Another application involving one of those patents will now be discussed. U.S. Pat. No. 5,633,717 mentioned above and incorporated by reference herein describes an apparatus and method for scanning and then imaging an object. The '717 provides for an initial scan of the object, in the preferred embodiment, a spent bullet to obtain a mathematical function approximating the surface scanned. It creates this function by measuring the distance between the surface of the spent bullet or object being imaged and the camera or imaging device. The practice of the '717 invention then uses this information to calculate a mathematical function of the surface contours of that portion of the spent bullet or object scanned. The system of that invention then uses the function obtained from the scanning path to obtain an optimal imaging path. What the initial scanning path amounts to then is a function of the contour of the outside surface scanned.
- The practice of the '717 patent can easily be incorporated into the practice of the current invention. The function obtained in the initial scan according to the practice of the '106 patent can be mapped to the first
annular ring 24 so it provides an outline of the contours of the surface of the object scanned. Then as those portions of the scanned area are imaged, the imaged areas can be designated on the firstannular ring 24. Often, the spent bullets obtained at a crime scene are deformed or are fragments. This results from the fact that the bullets or bullets are most often made of lead or brass which shatter or become deformed to some extent on impact after being shot. FIG. 14A depicts a portion of a deformed bullet being scanned to create the mathematical function of the scanning path.Optical imaging device 133scans bullet fragment 131 as it rotates about onaxis 142.Axis 142 is perpendicular to the plane of the paper and forms the rotational axis of a spent bullet holding and rotating device. FIG. 8 schematically depicts the various major parts of the system. U.S. Pat. Nos. 5,379,106 and 5,390,108 already incorporated by reference herein go into specific detail on various related aspects the systems and devices used. - As can be seen in FIG. 14A the spent
bullet 131 only has two relatively intact LEAs over the surface being scannedLEA spent bullet 131 has two LEA's partially intact 138 and 140. FIG. 14B depicts theinterface 143 with the function of the scan obtained from spentbullet 131 mapped to the periphery ofannular ring 145. As can be seen, intact LEA's 137A and 139A appear thereon. Partial LEA's 138A and 140A also appear thereon. That portion of the spent bullet missing is indicated by dashedline 144. Also, x's 146 indicate a significant departure of the circumference of the spent bullet from its original shape as a result of its deformation. FIG. 14C depicts theinterface 143 after successful image acquisition of the LEA's. Images as indicated by the hatched lines at 137B, 138B, 139B and 140B indicate the successful image acquisition. Any number of options exist for indicating on the display successful imaging of LEA's including color coding. - Although, in some aspects the present invention could be implemented on an electromechanical system, in the preferred embodiment, it is implemented on a programmable computer. Specifically, a programmable digital computer system is used in the preferred embodiment. In the last 10 to 20 years progress in the development of programmable digital computers has been astounding. Computer hardware, has in fact, become a commodity and now software in a sense has become a commodity. Those skilled in the art, on reading the proceeding disclosure, will know that by using standard software writing techniques as well as available software modules appropriate software programs can be prepared to implement the invention as described herein without the need for any experimentation. In fact the present invention could be implemented in a variety of software languages i.e. C, Unix etc. Additionally, it could be written to operate on a variety of operating systems including Window®, Windows NT®, Unix based operating systems etc. Since there is nothing unique about the software necessary to implement the present invention no detailed source code is included. For example, the software necessary to generate a signal when an operator touches
button 32E FIG. 1 to cause workpiece, in this case, an image, 45 to move it can be written any number of different ways to accomplish the result. Thus, in a sense the means for movement of the workpiece is in fact generic and pro forma. The same could be said of all of the other software necessary to run and control the operation of the concept of the present invention. - Regarding the mechanical and other techniques for the focusing of the imaging device, positioning of the work piece etc., these are generally generic and well known to those skilled in the art. The exceptions being those concepts claimed in the various patents which have been cited herein and incorporated by reference herein.
- FIG. 15 provides an operative view of the system and its functional states. The
interface 173 andmode selection device 176 would appear on the screen in anoperator input window 203 in the preferred embodiment. Initial operator input to themode selection device 176 selects one of the two available operating modes either an image acquisition or an image analysis mode. For example if an operator selects the image acquisition mode usingpointing device 194 by inputting tomode selection device 176 this in turn transmits amode selection signal 186 to the central signal andcontrol unit 171. The operator then withpointing device 194 applies input to theinterface device 173 which results in generation of a workpieceoperative signal 183 which prompts the central signal andcontrol unit 171 to transmit anobject positioning signal 184 to the object holding andmovement device 174. The workpiece being both the object being imaged in real time as well as the image of the object which would appear on thedisplay 205 which would make up part of the image acquisition means. Theobject imaging device 191 transmits the focused image to image acquisition andmovement device 175, an appropriate combination of hardware and software. Image acquisition andmovement device 175 transmits the viewed image in real time to displaywindow 205 as well as saving the selected image to imagestorage 197. Many of these aspects have been discussed in detail above. If the operator selects the image analysis mode through input with apointing device 194 to themode selection device 176 it generatesmode selection signal 186 to central signal andcontrol unit 171. Then operator input to theinterface 173 withpointing device 194 results in generation ofoperative signal 183 to central signal andcontrol unit 171 which then transmitsimage movement signal 185 toimage movement device 175 for movement of the image ondisplay window 205. - As can be seen with FIG. 15A the
- As can be seen in FIG. 15A, the interface device 173 in the preferred embodiment consists of four different parts: the first annular control 173A, the second annular control 173B, the third annular control 173C and the incremental rotational control 173D. Operator input to each results in generation of its respective operative signal 183A, 183B, 183C or 183D for movement of the workpiece, the workpiece being the image of an object, or the image and the object imaged. Depending on the mode selected through mode selection device 176, by selection of 176A or 176B, the central signal and control unit 171 would generate the object positioning signal 184 or the image positioning signal 185. As noted above in detail, each of the parts of interface 173 would generate an operative signal which would initiate a specific type of movement of the workpiece, as follows: the first annular control 173A would cause rotational motion of the workpiece, the second annular control 173B would cause small translational moves of the workpiece, the third annular control 173C would cause large translational moves of the workpiece, and the incremental rotational control 173D, the two central buttons, would cause incremental rotation of the workpiece in the clockwise and counterclockwise directions. A minimal sketch of this input-to-motion mapping is given at the end of this description, below.
- While the invention has been particularly shown and described with reference to a preferred embodiment thereof, it will be understood by those skilled in the art that various changes in form and detail may be made to it without departing from the spirit and scope of the invention.
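As referenced above, a minimal, purely illustrative sketch of how operator input to the parts of interface 173 could be converted into motion commands follows: a click on a translational ring yields a vector pointing radially away from the common center (normal to the tangent at the nearest peripheral point), and the central buttons yield incremental rotations. The function names, step sizes and sign convention are assumptions for the sketch only:

```python
import math

def translation_vector(click_x, click_y, center_x, center_y, step):
    """Scaled vector pointing radially away from the ring's common center through
    the clicked spot, i.e. normal to the tangent at the nearest peripheral point."""
    dx, dy = click_x - center_x, click_y - center_y
    length = math.hypot(dx, dy) or 1.0
    return (step * dx / length, step * dy / length)

def rotation_step(button, increment_deg=1.0):
    """Incremental rotation for the two central buttons (sign convention assumed)."""
    return -increment_deg if button == "clockwise" else increment_deg

# A click on the right-hand arc of the small-step ring produces a small move right;
# the same spot on the large-step ring would simply use a larger step value.
print(translation_vector(150, 100, 100, 100, step=0.1))   # (0.1, 0.0)
print(rotation_step("clockwise"))                          # -1.0
```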
Claims (74)
1. An interface apparatus for manipulating a workpiece positionable by movement means, said apparatus comprising:
a first annular control means responsive to operator input for generating a rotational control signal for rotating said workpiece within a common reference frame;
a second annular control means responsive to operator input for generating a translational motion control signal for movement of said workpiece within the common reference frame; and
such movement allowing the operator to position the workpiece for at least one of the following: imaging, analysis and comparison.
2. The interface apparatus of claim 1 which further comprises a third annular control means for generating a translational motion control signal, for movement of the workpiece within the reference frame, wherein the control signal generated by the second annular control means moves the workpiece in small increments and the signal generated by the third annular control means moves the workpiece in large increments.
3. The interface apparatus of claim 1 wherein the two annular control means are concentric so that they share a common center.
4. The interface apparatus of claim 3 which further comprises, at the common center of the first and second concentric annular control means, at least two activable buttons, the first button when activated rotates the work piece in a clockwise direction and the second button when activated rotates the work piece in a counterclockwise direction.
5. The interface apparatus of claim 1 wherein the second annular control means is segmented into four equal arcs, with each arc forming a separate button for directed motion, which on individual activation of each button generates a control signal for movement of the workpiece on a vector in a direction which is radially away from a common center of the arcs and normal to a tangent at a center of a peripheral outside edge of the arc so activated, the arcs being so positioned that the direction of a vector of any one arc is perpendicular to the direction of vectors of the two adjacent arcs when each is activated by operator input, and 180 degrees from the direction of a vector of an arc on the opposite side of the common center of the arcs when activated by operator input.
6. The interface apparatus of claim 5 wherein the interface includes a third annular ring, also segmented into four equal arcs, each arc so formed being paired with an arc of the second annular ring so that the paired arcs form two buttons for movement of the workpiece along a vector in the same direction, one of the arcs of each pair providing for small incremental movements of the workpiece and the other arc of each pair providing for large incremental movements of the workpiece.
7. The interface apparatus of claim 1 wherein the control signal generated by operator input at a selected spot on the second annular control means moves the workpiece within the common reference frame in a direction of a vector radially away from a common center of the second annular control means and normal to a tangent line formed at the closest point on an outside periphery of the annular control means to the selected spot on the second annular control means activated by operator input.
8. The interface apparatus of claim 6 wherein the command and control mechanism is a programmable computer with a visual display.
9. The interface apparatus of claim 8 wherein the workpiece is both an object and an image of that object and the computer system is in functional communication with an imaging device and an object positioning device, the imaging device being placed in relation to the object positioning device such that upon operator input, applied through the interface, the computer system can generate the necessary control signal to position the object held by the object positioning device in a focal plane of the object imaging device so that the imaging device can focus on the object and transmit to the computer for display on the visual display the image of that object so obtained.
10. The interface apparatus of claim 9 wherein the first annular control means can be switched between a first operational state wherein it rotates the object and a second operational state wherein contours of a surface of the object can be mapped to an outside periphery of the first annular control means so that a representation of the contours of the surface of the object appears on the outside periphery of the first annular control means.
11. The interface apparatus of claim 10 wherein the second operational state maps those portions of the contours of the object which have been successfully imaged and stored in a memory by the computer system.
12. The interface apparatus of claim 11 wherein the object imaged is a spent bullet and the first annular control means of the interface apparatus in the second operational state displays on its outside periphery contours of land engraved areas of the bullet successfully imaged and stored by the computer in a memory.
13. The interface apparatus of claim 8 wherein the workpiece is an image of a reference object and an image of a test object and the computer system can simultaneously display on the visual display an image of a test object and an image of a reference object and operator input applied through the interface can switch the computer system between three different image analysis states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating the combined images of the test object and the reference object.
14. The interface apparatus of claim 13 wherein when the interface is in the third state, the image of the test object and the reference object are joined together on the visual display in an overlapping configuration with the visible portions of the image of the reference object and the image of the test object separated by a line of separation, the line of separation being manipulated by operator input means.
15. The interface apparatus of claim 14 wherein operator input to manipulate the line of separation, while the interface is in the third state, causes the line of separation to rotate about a central point, which as it rotates it successively reveals different portions of the overlapped images of the test object and the image of the reference object so that the operator can compare and analyze them.
16. The interface apparatus of claim 2 wherein the first annular control means has an operator activable marker, which marker, as the operator moves it around the first annular ring, generates the signal which causes the workpiece to rotate in the same direction through the same angular distance as the marker is moved by the operator on the first annular control means.
17. The interface apparatus of claim 1 wherein the operator activates the interface with a pointing device.
18. The interface apparatus of claim 17 wherein the pointing device is taken from one of the group consisting of: a mouse, track ball, touch pad, light pen and PC stylus.
19. The interface apparatus of claim 1 wherein the operator activates the various parts of the interface with a touch sensitive screen.
20. In a computer system, a visual display interface for manipulating a workpiece comprising: an operator activated interface for a visual display with four operator activable buttons positioned in an orthogonal relationship to each other around a common center and each button, when individually activated by operator input, generates a signal which causes a workpiece in a common frame of reference to move in a direction of a vector of motion pointing radially away from the common center, which vector is at right angles to the direction of vectors of motion of the two adjacent buttons, and at 180 degrees from the direction of a vector of motion of the button on an opposite side of the common center.
21. The interface apparatus of claim 20 which further comprises an annular ring which generates a control signal which rotates the workpiece upon operator input.
22. The interface apparatus of claim 21 wherein the interface further comprises four additional buttons, one of each of which is paired with one of the four original buttons, thus forming four pairs of two buttons each, wherein one button of each pair, when activated, moves the workpiece in small increments in the direction of the vector of motion of the original button of the pair, and the second button of each pair moves the workpiece in larger increments in the same direction.
23. The interface apparatus of claim 22 wherein the pairs of buttons in orthogonal relation to each other share the common center with the annular ring.
24. The interface apparatus of claim 23 wherein the annular ring circumscribes the four pairs of buttons.
25. The interface apparatus of claim 24 which further comprises two additional buttons at the common center, the first button upon operator input rotates the workpiece in a clockwise direction and the second button upon operator input rotates the workpiece in a counter clockwise direction.
26. The interface apparatus of claim 25 wherein the interface can be switched between two different operating modes, a first mode for image acquisition and a second mode for image analysis.
27. The interface apparatus of claim 26 wherein the work piece is an object and the image of that object and in which the computer system is in functional communication with an imaging device and an object positioning device, the imaging device being placed in relation to the object positioning device such that upon operator input applied through the interface the computer system can generate the necessary control signal to position the object held by the object positioning device in a focal plane of the object imaging device so that the imaging device can focus on the object and transmit to the computer the image of the object so obtained.
28. The interface apparatus of claim 27 wherein the common reference frame further comprises a primary reference frame and a secondary reference frame, the primary reference frame being the visual display shared by the image of the object and the interface apparatus and the secondary reference frame being the focal plane of the imaging device; the primary reference frame and the secondary reference frame are functionally linked so that during real time imaging movement of the image resulting from operator input to the interface apparatus results in corresponding movement of the object, so that the operator can position the object to acquire images of any surface of the object.
29. The interface apparatus of claim 28 wherein the annular ring has two functional operating states between which it can be changed while in the first operating mode of image acquisition, a first operational state in which it rotates the object and the image of that object being imaged, and a second state in which those portions of the object which are successfully imaged by the system, and whose images are saved by the system, are representatively mapped to an outside periphery of the annular ring so that a representation of those portions of the object appears on the said periphery.
30. The interface apparatus of claim 29 wherein the object imaged while the system is in the first state of the first operating mode is a cartridge case.
31. The interface apparatus of claim 29 wherein the object imaged while the system is in the second state of the first operating mode is a spent bullet from a firearm and the portions of the spent bullet which are representatively mapped to the annular ring upon successful imaging are land engraved areas of the spent bullet.
32. The interface apparatus of claim 26 wherein the workpiece is both an image of a test object and an image of a reference object and when the computer system is in the second mode of image analysis it can simultaneously display, on the visual display, the image of the test object and the image of the reference object and operator input through the interface can switch the computer system between three different imaging states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating the combined images of the test object and the reference object.
33. The interface apparatus of claim 32 wherein when the interface is in the third state, the image of the test object and the reference object are joined together on the visual display in an overlapping configuration with the visible portions of the image of the reference object and the image of the test object separated by a line of separation, the line of separation being manipulated by the operator generated control signal.
34. The interface apparatus of claim 33 wherein activation of the first annular control means to generate the control signal, while the interface is in the third state, causes the line of separation to rotate about a central point, which as it rotates it successively reveals different portions of the overlapped images of the test object and the image of the reference object so that the operator can compare and analyze them.
35. The interface of claim 34 wherein the image of the object under analysis is a cartridge case.
36. In a computer system, a method for manipulating objects and images comprising the steps of:
generating an activable interface for a visual display with at least two annular control means: the first annular control means, when activated, generates a control signal of rotational motion which rotates a workpiece within a reference frame, the second annular control means, when activated, generates a control signal of translational motion which moves the workpiece within the reference frame;
moving a workpiece within the reference frame to a desired location in that reference frame, with the control signal of translational motion generated by activating the second annular control means;
rotating the workpiece within the reference frame to a desired angular orientation with the control signal of rotational motion generated by activating the first annular control means; and
conducting at least one of the following: imaging, analysis and comparison.
37. The method of claim 36 wherein the step of generating the interface further comprises the step of generating the first and second annular control means such that they are concentric and thus share a common center.
38. The method of claim 37 wherein the step of generating the interface further comprises: generating a third annular control means which shares the common center with the first and second annular control means and wherein activation of the third annular control means generates a signal of translational motion which moves the workpiece in large increments and the signal of translational motion generated by activation of the second annular control means moves the work piece in small increments.
39. The method of claim 37 wherein generating the interface further comprises the step of the interface and the workpiece sharing a common reference frame.
40. The method of claim 36 wherein generation of a control signal by operator input at a selected spot on the second annular control means moves the workpiece in the direction of a vector pointing radially away from a center of the second annular control means and normal to a line tangent to a point on an outside circumference of the second annular control means which point is the closest point on the outside periphery to the selected spot.
41. The method of claim 36 wherein the step of generating the interface comprises: segmenting the second annular control means into four equal arcs which share a common center, with each arc forming a separate button for directed motion, which on individual activation of each button generates a control signal for movement of the workpiece on a vector in a direction which is radially away from the common center of the arcs, said vector being normal to the center of the peripheral outside edge of the arc so activated, the arcs being so positioned that the direction of the vector of any one arc is perpendicular to the direction of the vectors of the two adjacent arcs and 180 degrees from the direction of the vector of the arc on the opposite side of the common center.
42. The method of claim 41 wherein the step of generating the interface further comprises generating a third concentric annular control means which shares the common center with the second annular control means, the third annular control means being segmented into four equal arcs, each arc so formed by the third annular control means being paired with an arc of the second annular control means so that the paired arcs form two buttons for movement of the workpiece along a vector in the same direction, one of the arcs of each pair providing for small incremental movements of the workpiece and the other arc of each pair providing for large incremental movements of the workpiece.
43. The method of claim 42 wherein generating the interface further comprises generating two buttons at the common center of the annular control means, one of said buttons upon operator activation rotates the workpiece in a clockwise direction and the other button on operator activation rotates the workpiece in a counterclockwise direction.
44. The method of claim 36 comprising the further step of switching between two operating modes, a first mode for image acquisition and a second mode for image analysis, comparison and manipulation.
45. The method of claim 44 wherein the step of operating in the first operating mode comprises manipulating with the interface a workpiece which is both an object and an image of that object, the image of the object so manipulated appearing on the visual display.
46. The method of claim 44 wherein the step of operating in the second operating mode comprises the workpiece being an image of a reference object and an image of a test object and simultaneously displaying on the visual display the image of the test object and the image of the reference object, and the further step of switching the interface in the second operating mode between three different states, a first state for manipulating the image of the test object, a second state for manipulating the image of the reference object and a third state for manipulating a combined image of the test object and the reference object.
47. The method of claim 44 wherein the step of operating in the first operating mode comprises the step of selecting one of two different states to operate in: a first state for acquisition of an image of a cartridge case and a second state for acquisition of images of the land engraved areas of a spent bullet.
48. The method of claim 47 wherein the step of operating in the second state includes the step of mapping to the first annular control means a representation of each land engraved area successfully imaged.
49. The method of claim 48 wherein the step of the second state imaging the land engraved areas further comprises the step of a third state mapping contours of the object imaged to the first annular control means.
50. The method of claim 48 comprising the step of entering the third state before entering the second state.
51. A method of displaying contours of a surface of an object, the method comprising the steps of:
providing an interface with a peripheral surface;
obtaining positional information on contours of said surface of said object; and
altering said interface's peripheral surface to display said information on the contours of portions of said object.
52. The method of claim 51 wherein the step of obtaining the information on the contours of said surface of said object comprises scanning the surface of said object and generating a mathematical function approximating said surface of said object.
53. The method of claim 52 wherein the step of altering said interface's peripheral surface comprises: altering the peripheral surface of said interface apparatus with the information from said mathematical function.
54. The method of claim 53 comprising the further step of imaging selected portions of said surface of said object and indicating on the peripheral surface of said interface which portions of said surface of said object have been imaged.
55. The method of claim 52 wherein the scanned object is cylindrical in shape.
56. The method of claim 52 wherein said object is a spent bullet.
57. The method of claim 56 comprising the further step of imaging selected portions of said spent bullet surface and indicating on the peripheral surface of said interface which portions of said spent bullet have been imaged.
58. The method of claim 57 wherein the portions of said spent bullet imaged are land engraved areas.
59. An interface apparatus of a computer system adaptable for representing contours of a surface of an object, said apparatus comprising:
an annular control means with a peripheral surface;
an imaging system capable of obtaining information on the surface of the object; and
means to alter the peripheral surface of the annular control means with the information obtained by the imaging system so that the peripheral surface represents the contours of the surface of the object.
60. The interface apparatus of claim 59 wherein the imaging system obtains the information on the contours of the surface of the object by scanning the object and generating a mathematical function representative of the contours of the surface which mathematical function is the information used to alter the outside peripheral surface by the altering means.
61. The interface apparatus of claim 60 wherein when the imaging system obtains images of portions of the surface of the object, the peripheral surface of the annular control means can indicate which portions of the surface of the object were imaged.
62. The interface apparatus of claim 61 wherein the imaging system comprises an imaging device and an object holding device, the holding device being configured to hold and manipulate the object and the imaging device being positioned to focus on and image the object held by the holding device so that the object may be scanned and imaged.
63. The interface apparatus of claim 62 wherein the computer system is in control of the operation of the imaging system, the means to alter, and the annular control means and the annular control means appears on a visual display of the computer system.
64. A system for manipulating a work piece, said workpiece being an object or an image of that object, said system comprising:
interface means for receiving an operator input and generating an operative signal,
mode selection means for selecting between an acquisition and an analysis mode,
acquisition movement means for generating a workpiece positioning signal upon the operator's selection of the acquisition mode and generation of an operative signal,
analysis movement means for generating an image positioning signal upon the operator's selection of the analysis mode and generation of the operative signal,
means for receiving said mode selection signal and for providing said operative signal selectively to one of: the acquisition movement means in response to selection of the acquisition mode, or the analysis movement means in response to selection of the analysis mode.
65. The system of claim 64 wherein said acquisition movement means for generating a workpiece positioning signal and analysis movement means for generating an image positioning signal further comprises means to generate a first workpiece positioning signal which provides for rotational motion of the work piece, and a second workpiece positioning signal which provides for translational motion of the work piece.
66. The system of claim 64 wherein said acquisition mode has a first sub-mode for acquisition of a profile of contours of a surface of the object and means for displaying an image of the profile of the contours of the surface of the object.
67. The system of claim 65 wherein said analysis movement means has means to display the image of a test object and the image of a reference object adjacent to each other and the mode selection means can change the analysis movement means among three operating states: a first operating state for rotation and translational motion of the image of the reference object, a second operating state for rotation and translation of the image of the test object and a third operating state for displaying the combined image of the test and the reference object joined by a line of separation.
68. The system of claim 67 wherein when the system is in the third operating state the signal of rotational motion, upon input of an operative signal, rotates the line of separation about a point on the line of separation, whereby rotation of the line of separation successively reveals different portions of the image of the test and reference objects.
69. The system of claim 64 wherein the interface receives operator input from a pointing device and the operator selects the operating mode with the pointing device.
70. The system of claim 65 wherein the interface means has a first and a second annular control means, wherein the first annular control means receives operator input for generation of the first workpiece positioning signal which provides for rotational motion and the second annular control means receives operator input for generation of the second workpiece positioning signal which provides for translational motion of the workpiece.
71. The system of claim 70 wherein the interface means further comprises a third annular control means, wherein the third annular control means receives operator input for generation of a signal for large translational movements of the workpiece and the second annular control means receives operator input for generation of a signal for small translational movements of the workpiece.
72. The system of claim 71 wherein the first annular control means, the second annular control means and the third annular control means all share a common center and at the common center the interface has means for incremental rotational movement whereby the operator input generates a signal which provides for incremental rotational movement of the work piece.
73. A method for dimensionally adjusting an image for optimal image analysis comprising:
displaying an image on a screen;
sending a signal to expand the image so that the image adjusts along at least one dimensional axis;
wherein the adjusting of the image can be stretching it out or compressing it; and
whereby the adjustment of the image facilitates analysis of the image and comparison of the image with at least one other image.
74. An interface apparatus for dimensionally adjusting an image for optimal image analysis comprising: a thumb wheel responsive to operator input, said thumb wheel being operatively connected to an image adjustment mechanism so that when the operator moves the thumb wheel the image adjusts its size along at least one axis, wherein the adjustment can either be a stretching out of the image or a compressing of the image to facilitate analysis of the image and comparison of the image with at least one other image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/984,952 US20020133263A1 (en) | 1998-07-08 | 2001-10-31 | Data acquisition image analysis image manipulation interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/111,743 US6336052B1 (en) | 1998-07-08 | 1998-07-08 | Data acquistion image analysis image manipulation interface |
US09/984,952 US20020133263A1 (en) | 1998-07-08 | 2001-10-31 | Data acquisition image analysis image manipulation interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/111,743 Division US6336052B1 (en) | 1998-07-08 | 1998-07-08 | Data acquistion image analysis image manipulation interface |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020133263A1 true US20020133263A1 (en) | 2002-09-19 |
Family
ID=22340205
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/111,743 Expired - Lifetime US6336052B1 (en) | 1998-07-08 | 1998-07-08 | Data acquistion image analysis image manipulation interface |
US09/984,952 Abandoned US20020133263A1 (en) | 1998-07-08 | 2001-10-31 | Data acquisition image analysis image manipulation interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/111,743 Expired - Lifetime US6336052B1 (en) | 1998-07-08 | 1998-07-08 | Data acquistion image analysis image manipulation interface |
Country Status (1)
Country | Link |
---|---|
US (2) | US6336052B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140022173A1 (en) * | 2012-07-17 | 2014-01-23 | Giga-Byte Technology Co., Ltd. | Computer input device with switchable operation modes and mode switching method thereof |
US8817016B2 (en) | 2010-04-08 | 2014-08-26 | Forensic Technology Wai, Inc. | Generation of a modified 3D image of an object comprising tool marks |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7051287B1 (en) * | 1998-12-14 | 2006-05-23 | Canon Kabushiki Kaisha | Display device with frame reduction, display control method thereof, and storage medium |
US6448983B1 (en) * | 1999-01-14 | 2002-09-10 | General Electric Company | Method for selection of a design of experiment |
US8407595B1 (en) | 2000-02-11 | 2013-03-26 | Sony Corporation | Imaging service for automating the display of images |
US7810037B1 (en) | 2000-02-11 | 2010-10-05 | Sony Corporation | Online story collaboration |
US7262778B1 (en) | 2000-02-11 | 2007-08-28 | Sony Corporation | Automatic color adjustment of a template design |
US7058903B1 (en) * | 2000-02-11 | 2006-06-06 | Sony Corporation | Image database jog/shuttle search |
US7136528B2 (en) * | 2000-02-11 | 2006-11-14 | Sony Corporation | System and method for editing digital images |
US6631303B1 (en) * | 2000-03-24 | 2003-10-07 | Microsoft Corporation | Imaging compensation method for optical pointing devices |
US20020073143A1 (en) * | 2000-08-31 | 2002-06-13 | Edwards Eric D. | File archive and media transfer system with user notification |
US6788288B2 (en) * | 2000-09-11 | 2004-09-07 | Matsushita Electric Industrial Co., Ltd. | Coordinate input device and portable information apparatus equipped with coordinate input device |
US20030081216A1 (en) * | 2001-11-01 | 2003-05-01 | Martin Ebert | Graphical user interface for sample positioning |
US20030122783A1 (en) * | 2001-12-28 | 2003-07-03 | Green Carl I. | Horizontal wheel user input device |
WO2004061594A2 (en) * | 2002-12-16 | 2004-07-22 | Microsoft Corporation | Systems and methods for interfacing with computer devices |
JP2006092321A (en) * | 2004-09-24 | 2006-04-06 | Toshiba Corp | Electronic equipment and touchpad device |
JP2007160642A (en) * | 2005-12-13 | 2007-06-28 | Sumitomo Heavy Ind Ltd | Molding machine control system, molding machine, control apparatus, and molding machine control method |
US9237294B2 (en) | 2010-03-05 | 2016-01-12 | Sony Corporation | Apparatus and method for replacing a broadcasted advertisement based on both heuristic information and attempts in altering the playback of the advertisement |
US9298598B2 (en) * | 2010-03-22 | 2016-03-29 | Red Hat, Inc. | Automated visual testing |
KR101341025B1 (en) * | 2010-08-11 | 2013-12-13 | 엘지디스플레이 주식회사 | Simulation method for image quality improvement of image display device and circuit the same |
US20120096380A1 (en) * | 2010-10-13 | 2012-04-19 | Wagner David L | Color Selection Graphical User Interface |
US9832528B2 (en) | 2010-10-21 | 2017-11-28 | Sony Corporation | System and method for merging network-based content with broadcasted programming content |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4052603A (en) * | 1974-12-23 | 1977-10-04 | International Business Machines Corporation | Object positioning process and apparatus |
US4987412A (en) * | 1988-08-25 | 1991-01-22 | The United States Of America As Represented By The United States Department Of Energy | Method and apparatus for the simultaneous display and correlation of independently generated images |
US5546525A (en) | 1989-11-13 | 1996-08-13 | Lotus Development Corporation | Computer user interface with multimode selection of displayed controls |
US5390108A (en) | 1991-05-24 | 1995-02-14 | Forensic Technology Wai Inc. | Computer automated bullet analysis apparatus |
US5428367A (en) | 1991-07-08 | 1995-06-27 | Mikan; Peter J. | Computer mouse simulator having see-through touchscreen device and external electronic interface therefor |
US5594471A (en) | 1992-01-09 | 1997-01-14 | Casco Development, Inc. | Industrial touchscreen workstation with programmable interface and method |
US5379106A (en) * | 1992-04-24 | 1995-01-03 | Forensic Technology Wai, Inc. | Method and apparatus for monitoring and adjusting the position of an article under optical observation |
CA2101864A1 (en) | 1992-08-27 | 1994-02-28 | Claudia Carpenter | Customizable program control interface for a computer system |
US5659693A (en) | 1992-08-27 | 1997-08-19 | Starfish Software, Inc. | User interface with individually configurable panel interface for use in a computer system |
US5392388A (en) * | 1992-12-04 | 1995-02-21 | International Business Machines Corporation | Method and system for viewing graphic images in a data processing system |
US5581670A (en) | 1993-07-21 | 1996-12-03 | Xerox Corporation | User interface having movable sheet with click-through tools |
CA2124624C (en) | 1993-07-21 | 1999-07-13 | Eric A. Bier | User interface having click-through tools that can be composed with other tools |
US5592195A (en) * | 1994-11-21 | 1997-01-07 | International Business Machines Corporation | Information displaying device |
US5654801A (en) | 1995-01-03 | 1997-08-05 | Forensic Technology Wai Inc. | Fired cartridge examination method and imaging apparatus |
US5894294A (en) * | 1996-02-22 | 1999-04-13 | Brother Kogyo Kabushiki Kaisha | Sewing pattern display device |
US5808613A (en) * | 1996-05-28 | 1998-09-15 | Silicon Graphics, Inc. | Network navigator with enhanced navigational abilities |
US5633717A (en) | 1996-06-26 | 1997-05-27 | Forensic Technology Wai Inc. | Method for monitoring and adjusting the position of an object under optical observation for imaging |
- 1998-07-08: US US09/111,743 granted as patent US6336052B1 (en), status: Expired - Lifetime
- 2001-10-31: US US09/984,952 published as US20020133263A1 (en), status: Abandoned
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8817016B2 (en) | 2010-04-08 | 2014-08-26 | Forensic Technology Wai, Inc. | Generation of a modified 3D image of an object comprising tool marks |
US20140022173A1 (en) * | 2012-07-17 | 2014-01-23 | Giga-Byte Technology Co., Ltd. | Computer input device with switchable operation modes and mode switching method thereof |
US9024879B2 (en) * | 2012-07-17 | 2015-05-05 | Giga-Byte Technology Co., Ltd. | Computer input device with switchable operation modes and mode switching method thereof |
Also Published As
Publication number | Publication date |
---|---|
US6336052B1 (en) | 2002-01-01 |
Legal Events
Date | Code | Title | Description 
---|---|---|---|
 | AS | Assignment | Owner name: FORENSIC TECHNOLOGIES WAI INC., CANADA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: OUELLET, JEAN-FRANCOIS; RANNOU, PATRICK; REEL/FRAME: 012296/0347; Effective date: 19980706
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION