
GB2043941A - Improvements in or Relating to Visual Display Apparatus - Google Patents


Info

Publication number
GB2043941A
GB2043941A
Authority
GB
United Kingdom
Prior art keywords
zone
definition
line
area
head
Prior art date
Legal status
Granted
Application number
GB8001166A
Other versions
GB2043941B (en)
Current Assignee
Thales Training and Simulation Ltd
Original Assignee
Thales Training and Simulation Ltd
Priority date
Filing date
Publication date
Application filed by Thales Training and Simulation Ltd filed Critical Thales Training and Simulation Ltd
Priority to GB8001166A (GB2043941B)
Priority to CA000344313A (CA1147073A)
Publication of GB2043941A
Application granted
Publication of GB2043941B
Legal status: Expired

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B27/0149 Head-up displays characterised by mechanical features
    • G02B2027/0154 Head-up displays characterised by mechanical features with movable elements
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G02B2027/0192 Supplementary details
    • G02B2027/0198 System for aligning or maintaining alignment of an image in a predetermined direction


Abstract

The invention provides head-coupled area-of-interest visual display apparatus, particularly for ground-based craft-flight simulators. The apparatus provides binocular vision of a display projected onto a part-spherical retro-reflective screen, covering an area of interest determined by craft position and heading and by the viewer's instantaneous line of view. Line scan apparatus is cockpit-mounted, line image transmission is by fibre optic light guide ribbon, and frame scan apparatus is helmet-mounted. Sensing means detect head/helmet movements to permit voluntary individual scanning of a wide angle of simulated view from the craft. The projected image comprises two zones, a larger area zone of lower definition and a smaller area zone of higher definition. Separate line-scanning apparatus and fibre optic light guides may be used for each zone.

Description

SPECIFICATION

Improvements in or Relating to Visual Display Apparatus

This invention relates to visual display apparatus, particularly for ground-based flight simulators and particularly for providing a display covering a wide-angle field of view. The invention provides such apparatus capable of providing a display for a sole pilot or simultaneously for two pilots.
The apparatus is of the head-coupled area-of-interest type, wherein an image is projected upon a screen and is appropriately changed both according to the simulated craft position and angular orientation and according to the viewer's instantaneous line of view, and is simultaneously moved on the screen to occupy the viewer's field of view.
Apparatus of this type was described in prior patent Specification Number 1,489,758. Such apparatus provided an area-of-interest display for a sole observer which was pseudo-collimated, that is, the same image was projected for left and right eyes, so as to appear at infinity.
The present invention provides improved apparatus of this type which provides a display having two zones, a larger area zone of lower definition and an inset smaller area zone of higher definition.
Accordingly, the invention provides head-coupled area-of-interest visual display apparatus providing a displayed scene having two zones, the first zone being of greater area and having a lower definition image relatively to the second zone, the second zone being of smaller area, being inset within the first zone and having a higher definition image relatively to the first zone, comprising: a part-spherical retro-reflective concave screen of area greater than a viewer's instantaneous field of view; a helmet; sensing means for sensing the orientation of the viewer's head and helmet; visual image generating means for generating a simulated scene in the direction of the viewer's instantaneous line of view according to the viewer's simulated position and orientation and under control of the said sensing means, the said image generator being adapted for providing two visual images corresponding respectively to the two said zones of the displayed scene; a laser light source; separate laser beam modulators for each zone of the displayed scene; separate line scanners for each zone of the said scene for scanning the modulated laser beam over the input ends of respective fibre optic light guides, the said fibre optic light guides having their output ends at spaced-apart positions on the viewer's helmet; and frame scanning means mounted on the said helmet for receiving light from the light guide outputs and projecting the light as simultaneous scan lines of the two said zones to form a composite two-zone displayed scene on the screen.
Short Description of Drawings

In order that the invention may readily be carried into practice, one embodiment will now be described in detail, by way of example, with reference to the accompanying drawings, in which:

Fig. 1 is a diagrammatic, simplified perspective view showing a pilot seated in relation to a part-spherical screen for viewing a display comprising a lower-definition zone and an inset zone of higher definition;

Fig. 2 is a diagrammatic, simplified representation of one laser source, laser beam modulator, line scanner, fibre optic light guide ribbon and helmet-mounted frame scanner combination used in the apparatus of Fig. 1;

Fig. 3 is a diagrammatic side view of the frame scanner of Fig. 2;

Fig. 4 is a diagrammatic representation of the projection screen of Fig. 1 with a two-zone display corresponding to a particular line of view shown thereon;

Fig. 5 is a diagrammatic representation of a preferred arrangement of laser source, two-zone laser beam modulators, fibre optic guides and helmet-mounted frame scanner used in the invention;

Fig. 6 shows diagrammatically the method of storing an image in a buffer store so that it may be displayed as an image with low and high resolution parts;

Fig. 7 is a diagrammatic perspective view showing apparatus for a preferred form of the invention providing a pseudo-collimated display of large area with lower definition and an inset smaller area of high definition; and

Fig. 8 is a detail view showing a line scanner alternative to those of Fig. 2 and Fig. 5.
Description of the Example

In the accompanying drawings, the same elements are indicated by the same reference numerals throughout.
Fig. 1 shows in simplified form the apparatus according to the invention for generating and displaying a two-zone area-of-interest view. A pilot 10 wearing a helmet 12 is seated within a part-spherical shell having a retro-reflective interior surface, partially represented in Fig. 1 by the concave retro-reflective screen 14. The pilot's line of vision 67 intersects the screen at point 17.

The field of view for each eye is centred on the point 17. The view displayed comprises two zones, each zone covering at least half of the field of view. For simplicity, the combined zones will be referred to as the displayed scene.

The displayed scene depends, in this example, upon the simulated position of an aircraft during an exercise flight, the attitude of the aircraft, the pilot's seating position in the aircraft and the pilot's instantaneous line of view as determined by the instantaneous orientation of the pilot's head and helmet. The position of point 17 on the screen 14, and hence the position of the displayed view on the screen, depends only on the pilot's head and helmet orientation.
The two zone images are generated by an image generator 20 of the computer-generated image type which includes a frame buffer store 20'. The pilot's head orientation is sensed by a head orientation sensor 22, which is fixedly mounted within the simulated aircraft cockpit in a mounting 24. The displayed view is projected onto the screen 14, centred in the appropriate locations as two raster-scanned images, the line scan apparatus being cockpit-mounted and the frame scan apparatus being mounted on the helmet 12. Line scan may be either across the screen 14 or up or down. In the present example, line scan is such that the projected scan line on the screen and the line between the pilot's eyes lie in the same plane. The frame scan is orthogonal thereto. Thus, when the pilot's head is upright, line scan is horizontal and frame scan vertical.
Referring still to Fig. 1, a laser source 30 provides an output laser beam 31 which is directed through beam-splitter and reflector elements 32, 33 to provide two beams 34 and 36 of equal intensity. Laser beam 34 passes through a full-colour modulator 38 controlled from the image generator 20 according to the first zone image. Laser beam 36 passes through a full-colour modulator 40 controlled from the image generator 20 according to the second zone image. Both modulated beams 34' and 36' pass to a double line scanner 42 fixedly mounted in the simulated aircraft cockpit. The two scanners, described in detail later herein, provide two respective scanned beams 44 and 46 which are respectively scanned over the input ends 48 and 50 of two fibre optic light guide ribbons 52 and 54.
In Fig. 1, the output ends of the two light guides 52 and 54 are shown spaced apart on the helmet 12, for clarity, and the emergent light beams are separately focussed by spherical lenses 62 and 64 respectively onto the mirror 60 of a common frame scanner. A practical form of the apparatus uses a common spherical lens 62, as is shown in Fig. 5, which is described later herein.
The two fibre optic light guides provide a flexible linkage between the fixed line scanner 42 and the movable helmet 12. In Fig. 1, the emergent scanned light beams from the respective ends 56 and 58 of the light guides 52 and 54 are focussed by spherical lenses 62 and 64 and directed onto a plane mirror 60. The first zone beams are reflected by the mirror 60 along divergent paths to form a scan line of the first zone image. Similarly, the second zone beams are reflected by the mirror 60 along divergent paths to form a scan line of the second zone image. The centre line of the displayed scene is thereby formed on the screen 14 at point 17.
The mirror 60 is long in relation to its width and is carried in bearings at its ends which are mounted on the helmet 12. These bearings are provided by motors 74 and 76 at the two ends which move the mirror 60 to provide the required frame scan.
The mirror 60 may be a single plane mirror which is either oscillated or rotated by the motors 74, 76 on its axis parallel to the plane in which the line scan is projected or the mirror 60 may be a multi-faceted polygon mirror rod of, for example, octagonal cross-section which is continuously rotated by the motors 74, 76. In the present example, the mirror 60 is a single plane mirror and is rotationally oscillated for frame scan.
As the pilot's head moves, the displayed view moves over the screen so as to remain in the pilot's new line of view, and the view itself is changed according to the simulated real-world view in the direction of the line of view.
To this end, the visual system receives data from the host flight computer on lines 80 and 81.
Position data defining the simulated aircraft position throughout a simulated flight exercise is supplied to the image generator 20 on line 80.
Attitude data, defining the simulated aircraft instantaneous attitude, is supplied on line 81 to a vector summing unit 82 together with head orientation data, defining the pilot's actual instantaneous line of view, on line 84. The summed output is supplied to the image generator 20 on line 86. A throughput delay error signal, obtained by subtracting the head attitude input to the image generator one throughput delay period ago from the current head attitude position, is supplied to the throughput delay error control unit 100 on line 119.
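The summation and delay-error subtraction described above can be sketched in outline as follows. This is an illustrative sketch only: the per-axis addition, the (heading, pitch, roll) tuple ordering and all names are assumptions, since the specification gives no formulas.

```python
from collections import deque

def line_of_view(aircraft_attitude, head_orientation):
    """Vector summing unit 82 in outline: add aircraft attitude and
    head orientation per axis (heading/azimuth, pitch, roll) to give
    the pilot's line of view relative to the simulated terrain."""
    return tuple(a + h for a, h in zip(aircraft_attitude, head_orientation))

class ThroughputDelayError:
    """Error signal: current head attitude minus the head attitude
    that was input to the image generator one throughput delay
    period (here, delay_frames updates) ago."""
    def __init__(self, delay_frames):
        self.history = deque(maxlen=delay_frames + 1)

    def update(self, head_attitude):
        self.history.append(head_attitude)
        old = self.history[0]  # attitude one delay period ago
        return tuple(c - o for c, o in zip(head_attitude, old))
```

With a two-frame delay, a step change in head attitude produces a non-zero error signal until the delayed input catches up.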
The two images, respectively for the first and second zone views, in accordance with the inputted data and allowing for the known seating position of the pilot in the simulated aircraft type, are supplied to the respective modulators 38 and 40 on lines 88 and 90.
It will be appreciated that the change of the displayed image with simulated aircraft position is relatively slow. However, the change of the displayed image with head orientation may be complete and is relatively very rapid. The image generator is unable to compute an entirely new image immediately a new line of view is established due to the throughput delay of the image generator computer. To overcome this limitation the residual old displayed view is derotated to its former screen position until the computed new displayed view is available.
The required image derotation can be effected by controlling the relationship between the video signal and the line scan and frame scan positions.
This control can be produced in a number of ways.
The line scanner is typically a continuously rotating polygon mirror which sweeps the input laser beam or beams through an arc to produce a line scan, as in the example of Fig. 2. Three alternatives are available:

(i) If the video signal is produced at a constant rate, then the line scan drive may be phase modulated to maintain the correct line in space and so produce an image with the correct spatial orientation. If the line projection system is capable of transmitting only the displayed field of view, then the image size will be only that part which is common to both the computed and projected images. If the fibre optic ribbon and the projection system are capable of projecting more than the required field of view in the line scan direction, then the field of view obtained may be held constant.
(ii) The video signal may be produced at a constant rate and the line scanner rotated at a constant rate. The required angular shift may then be introduced with a supplementary mirror. Line scanning apparatus, alternative to that of Fig. 2 and including such a supplementary mirror, is described later herein with reference to Fig. 8.
(iii) The polygon mirror may be run at a constant angular velocity and the video signal timing adjusted by altering the time at which the video signal is read out of the frame store 20' of the image generator 20. This ensures that the video signal corresponding to a point in space is produced at the predetermined time that the scanner points the light beam at that part of the screen representing the required point in space.
Of these three methods described above, method (i) involves the phase modulation of a mechanical system rotating at high speed and has the disadvantages associated with the inertia and response times of such a system. Method (ii) overcomes some of these problems by using a supplementary mirror. This mirror does not rotate at high speed but nevertheless has the inertia inherent in any mechanical system and so it will have some response time. Method (iii) requires only the ability to read out a memory at controlled times. Since a memory is not a mechanical system, it has no inertia and can be read out in a discontinuous manner if required. Accordingly, method (iii) is the preferred method for line scan synchronisation in the present invention.
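Method (iii) amounts to shifting the frame-store read address rather than perturbing the mirror drive. A minimal sketch, with the one-dimensional pixel model and all names assumed for illustration:

```python
def derotated_readout(frame_line, scanner_pixel, shift_pixels):
    """Method (iii) in outline: the polygon mirror runs at constant
    angular velocity, so scanner_pixel fixes the beam's instantaneous
    pointing direction. Derotation is achieved by shifting the
    frame-store address, so the video produced corresponds to the
    point in space the beam is currently aimed at."""
    addr = scanner_pixel + shift_pixels
    if 0 <= addr < len(frame_line):
        return frame_line[addr]
    return 0  # beam points outside the stored line: blank video

# One stored scan line, derotated by two pixels: the image appears
# displaced on the screen with no change to the mirror drive.
line = [10, 20, 30, 40, 50]
projected = [derotated_readout(line, p, 2) for p in range(5)]
```

The discontinuous readout that the text notes as an advantage of method (iii) corresponds here to choosing `shift_pixels` freely on every line.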
The frame scanner of Fig. 1 does not offer the same options as does the line scanner, due to the difficulties of implementation. The alternative methods corresponding to those described for the line scanner are as follows:

(i) If the video signal is produced at a constant rate, then the frame scan drive may be controlled to give the required pointing direction. In this case the frame scanner will be a position servomechanism driven by a sawtooth waveform in which the starting point of the ramp and the slope of the ramp may each vary in a controlled manner, in order to give a constant angular sweep in free space when the projector mount is being subjected to angular shifts.
(ii) The use of a supplementary mirror is impractical in the frame scanner of Fig. 1.
(iii) If the frame scanner is driven with a sawtooth of constant period, start point and slope, then the read out times from the frame store 20' may be adjusted to produce the video signal when the scanner is at the required orientation in free space.
Of these three methods, method (i) requires adjustments to the period and rate of a mechanical system which, due to its construction, has a very low inertia. Hence, the settling time following such a disturbance may be acceptable.
It can preserve the instantaneous field of view constant through the throughput delay period.
Method (ii) is impractical due to the physical constraints of the projection lens and frame scanner assembly of Fig. 1. Method (iii) involves adjustment to a system having neither inertia nor any requirement of continuity. However, method (iii) reduces the virtual field of view during the throughput delay period.
Continuing with the description of the apparatus of Fig. 1, a synchronising pulse generator 106 supplies pulses on line 108 to the throughput delay error control unit 100.
Line scan control signals are supplied to the line scanners of unit 42 from unit 92 by way of line 94. Frame scan control signals are supplied to the frame scan motors 74, 76 from unit 96 by way of a flexible line 98. Video synchronisation timing pulses are fed to the frame buffer 20' of the C.G.I. image generator 20 from the unit 100 on line 110. Control of the relative timings between the line scan control 92, the frame scan control 96 and the C.G.I. image generator frame buffer 20' is effected by the throughput delay error compensation circuit 100 by way of lines 102, 104 and 110, respectively.
It will be noted that the projection middle lines 66 and 68 do not coincide with the lines of view 70 and 72, for the reason that projection is effected from above the pilot's eyes. Projected onto any vertical plane, the respective lines diverge away from the screen. The angle of divergence is small but is nevertheless great enough, compared with the apex angle of the half-brilliance cone of reflection of a retro-reflective screen material, to result in a viewed scene of much reduced brilliance. It is preferred therefore to use a screen of modified retro-reflective material for which the axis of the half-brilliance cone of reflection is depressed downwardly by the angle between the projection lines 66, 68 and the lines of view 70, 72.
The various units of the apparatus, shown in the block schematic part of Fig. 1, will now be considered in further detail in the following order:

C.G.I. Image Generator
Laser Source
Laser Beam Modulator
Line Scanner
Fibre Optic Light Guide Ribbon
Frame Scanner
Retro-reflective Screen
Helmet-Head Orientation Sensor
Throughput Delay Error Compensation Unit
Line Scan Control
Frame Scan Control
C.G.I. Image Generator

The displayed scene corresponds to a real world scene as it would be visible from the simulated aircraft during flight. In earlier visual display apparatus for ground-based simulators, the visual image was generated using a scale model and a closed-circuit television camera. The camera lens, comprising an optical probe, was moved over the model correspondingly to the aircraft simulated position, altitude, heading, pitch and roll. The generated image was varied according to all these factors.
According to a more recent technique, now well established, the same form of image is computer-generated. The technique is explained in text books such as, for example, "Principles of Interactive Computer Graphics", by William M. Newman and Robert F. Sproull, published in 1973 by McGraw-Hill Book Company, New York and elsewhere.
The signals available to the image generator computer from the host flight computer of the simulator are: aircraft position X, Y, altitude, heading, pitch and roll. C.G.I. image generators are known which generate the direct ahead view from the aircraft according to the input data, including solid-looking features with surface detail, concealing hidden edge-lines and surfaces as the aircraft flies around such objects, and clipping and windowing the display according to the simulated field of view.
The image generator 20 of Fig. 1 is of this general type. Aircraft position and attitude data are supplied from the host flight computer on line 80. Aircraft heading, pitch and roll data are supplied on line 81.
However, the image generated in the apparatus of Fig. 1 is in the actual instantaneous line of view of the pilot. This view is determined also by the pilot's line of view heading and pitch and head roll relatively to the aircraft axes. Head azimuth, head pitch and head roll are determined by the head orientation sensor 22 and these data are supplied on line 84 to the summing unit 82, which adds these values to the aircraft heading, pitch and roll values respectively. The output information defining the pilot's line of view relatively to the simulated terrain overflown is supplied to the image generator 20 on line 86.
The point midway between the pilot's eyes is a constant position offset above and to the left of the aircraft longitudinal axis. This offset requires only constant values to be added to aircraft altitude and position respectively throughout an entire exercise.
For the generation of separate zone images, two similar image generators are included in the image generator 20. The same data are continuously inputted to both image generators, but each image generator provides for an area of image corresponding to the respective zone. The smaller area zone provides for correspondingly greater detail within the same video signal bandwidth.
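The bandwidth argument can be made concrete with a little arithmetic. The figures below are invented for illustration; the specification quotes no numbers:

```python
def degrees_per_pixel(zone_width_deg, pixels_per_line):
    """Angular resolution of a zone: scene width divided by the
    number of picture elements the video bandwidth allows per line."""
    return zone_width_deg / pixels_per_line

# Same video bandwidth (same pixels per line) for both generators:
low = degrees_per_pixel(80.0, 1000)   # wide, low-definition zone
high = degrees_per_pixel(20.0, 1000)  # inset zone a quarter as wide
ratio = low / high                    # the inset is four times finer
```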
Laser Source, Laser Beam Modulator, Line Scanner, Fibre Optic Light Guide Ribbon and Frame Scanner

The basic components of a laser source, laser beam modulator, line scanner, fibre optic light guide ribbon and frame scanner combination, for the apparatus of Fig. 1, will first be described with reference to the simplified diagrams of Fig. 2 and Fig. 3.
Fig. 2 shows the laser beam source 30 which provides the output laser beam 31 directed through the full colour modulator 38. Both the laser beam source 30 and the modulator 38 are of known form. The full-colour modulated beam output is shown at 31' in this figure, in which intermediate beam-splitters are not shown. The line scanner is shown generally at 42.
The line scanner comprises a synchronously-driven polygonal section mirror drum 144 which rotates continuously in the direction shown by the arrow 145 to sweep the beam 31' over the scan path 44. One pass occurs for the movement of each mirror facet of the mirror drum 144 past the beam 31'.
A fibre optic light guide, formed into a flat ribbon 52 over most of its length, has individual groups of fibres formed into an arc at the input end 48 of the light guide. The width of the line scan 44 may exactly cover the arc at 48. The modulated beam 31' is then scanned along the entire arc at 48 for each line of the image.
At the output end 56 of the fibre optic light guide 52, the individual groups of fibres are similarly formed into an arc, the fibre groups occurring in the same sequence at the two ends 48 and 56, so that the scanned image line at the input end 48 is exactly reproduced at the output end 56.
The emergent rays from the output end 56 of the light guide 52 are focussed by the spherical lens 62 onto the face of the frame scanning mirror 60. As shown in Fig. 1, the mirror 60 is mounted on the pilot's helmet 12 in bearings provided by reciprocating motors 74 and 76.
With the mirror 60 stationary, the emergent rays are reflected from the mirror 60, as shown instantaneously at 66, to form a single line of the image. As the mirror 60 is moved, successive lines of the image are projected to form the entire scanned image corresponding to one zone of the display.
Fig. 3 shows, in side view, the output end 56 of the light guide 52, the spherical lens 62, the mirror 60 and the reflected beam 66 as described above with reference to Fig. 2.
A second line scanner, comprising a second mirror drum, produces a second line scan over the input end 50 of the second fibre optic light guide 54, as is shown in Fig. 1 and in Fig. 5.
Referring now to Fig. 4, there is shown at 14 a part of the screen 14 of Fig. 1 and the point 17 at which the pilot's line of view 67, Fig. 1, intersects the screen 14 is shown as the centre point of a small circle on the screen 14 of Fig. 4.
A first zone 201 extends over an area represented by the line 211 and corresponds to the pilot's field of view. A smaller zone 202 extends over an area represented by the broken line 212 and corresponds to a much smaller area within which greater detail can be appreciated visually by a viewer.
The high-definition zone may be inset in the low-definition zone in a number of alternative ways. Thus, the low-definition zone 201 may be blanked out to leave a blank zone 202 and the high-definition zone 202 may be optically inset in the blank area so provided. Alternatively, the low-definition zone 201 may include the high-definition zone 202 and the increased definition be provided by a high-definition image electronically inset within the lower definition image.
In either case, the high-definition zone 202 is always of smaller area than the low-definition zone 201. It may either be central within the low-definition zone 201, if head coupling only is used to position the image, or it may be offset from centre, if head coupling is used to position the boundary 211 of the low-definition image and eye coupling is used to position the boundary 212 of the high-definition image.
The apparatus of Fig. 5 uses the same spherical lens 62 and frame scanner mirror 60 for both zones. A separate line scanner and fibre optic guide are used for each zone.
Fig. 5 shows the laser source 30 and beam-splitter and reflector elements 32, 33 which provide two beams of equal intensity, as in Fig. 1. Laser beam 34 passes through the modulator 38 and laser beam 36 passes through the modulator 40, providing modulated output beams 34' and 36', respectively, which are directed to the double line scanners generally shown at 42 and 43, respectively.
In Fig. 5, the modulator 38 receives the high-definition video input and the modulator 40 receives the low-definition input. The fibre optic light guide 52 is composed of fine fibres and transmits the high-definition image projected as area 202, Fig. 4. The fibre optic light guide 53 is composed of coarse fibres and transmits the low-definition image projected as area 201, Fig. 4.
In the preferred system, the low-definition fibres are each four times the diameter of the high-definition fibres. The line scanning polygons 144 and 144a, respectively, are driven on a common shaft, not shown, and the high-definition area scanner 144 has four times as many facets as the low-definition scanner 144a.
The high-definition line scan is thus one quarter the length of the low-definition line scan.
The high-definition fibre guide 52 is capable of transmitting a line scan covering the whole field of view. Only one quarter is used at any time, and the portion selected determines where the high-definition insert 202 is positioned across the total field of view 201, Fig. 4. This selection is effected by a mirror 402 mounted on a pivot 404, with rotational position 403 determined by the viewer's eye movements relative to his head.
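The selection of one quarter of the available line scan can be sketched as follows. This is a hypothetical sketch: the field-of-view figure, the clamping behaviour and the function name are assumptions not given in the specification.

```python
def select_insert(eye_azimuth_deg, total_fov_deg=80.0):
    """Choose which quarter of the high-definition line scan is used,
    from the viewer's horizontal eye position relative to the head.
    Returns the (start, end) angles of the inset 202 within the total
    field of view 201. The centre of the inset tracks the eye but is
    clamped so the inset never extends beyond the low-definition zone."""
    inset = total_fov_deg / 4.0   # one quarter of the full scan
    half = total_fov_deg / 2.0
    centre = max(-half + inset / 2.0,
                 min(half - inset / 2.0, eye_azimuth_deg))
    return (centre - inset / 2.0, centre + inset / 2.0)
```

With eyes ahead the inset sits centrally; a large eye deflection pins it against the edge of the total field of view.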
Since a common frame scanner 60 is used, this results in a selected column of the screen being scanned by both a high resolution spot and a low resolution spot. Video signals may be routed to either modulator in order to display the appropriate resolution image, with an appropriate delay to compensate for the angular separation between the two light guides at the projection lens end.
A complete column of high resolution image may be projected if desired or the interval over which a high resolution image is projected may be determined by monitoring vertical eye movement relative to the head.
The system shown diagrammatically in Fig. 5 is duplicated to provide an image for each of the pilot's two eyes, and may use the same video signal to produce a pseudo-collimated image or correctly computed video to produce a stereo pair of images. The apparatus of Fig. 7 uses two line scanners for each eye but uses the same low-definition and high-definition video information to provide a pseudo-collimated image.
An alternative form of the apparatus uses only a single high-definition fibre optic guide and line scanner. In this case, the image is computed with two different resolutions and stored in a buffer store as shown diagrammatically in Fig. 6.
The resolution of the display is equal to the resolution of a high resolution computed picture element 406. A low resolution picture element 405 is 16 times the size of a high resolution picture element. The image is computed as a low-definition scene or as a high-definition scene and stored in the appropriate portion of the bit map.
To read the memory, the address of each successive high resolution element is computed in the desired sequence and the video corresponding to that address is used to modulate the laser beam. When reading the low resolution portion of the image store, the two least significant bits of the address of both the line and the picture element along that line are ignored. Thus the video level will only change at the boundaries of the coarse picture elements 405. When the high resolution portion of the memory is accessed, all the address bits are used, so allowing the video signal to change at the fine picture element 406 boundaries. The position of the boundary 212 between the two areas is determined during picture composition and can either be fixed relative to the total field of view boundary or controlled by the measured or predicted eye angular position with respect to head orientation.
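The two-resolution read-out described above can be sketched in a few lines. This is an illustrative model only, not the patented circuitry; the function and store layout names are hypothetical:

```python
# Illustrative sketch (not the patented hardware) of reading one bit map at
# two definitions. Masking the two least-significant bits of both the line
# address and the element address makes the video level repeat over 4 x 4
# groups of fine elements, i.e. coarse elements of sixteen times the area.

def read_video(store, line, element, high_definition):
    """Return the video level for one picture element.

    store           -- 2-D list indexed [line][element] (the bit map)
    high_definition -- True inside the high-definition zone boundary 212
    """
    if not high_definition:
        # Ignore the two least-significant address bits in each axis: the
        # same stored value is re-read for a 4 x 4 block of fine elements.
        line &= ~0b11
        element &= ~0b11
    return store[line][element]
```

A coarse picture element 405 therefore spans 4 lines by 4 elements, sixteen fine elements 406, consistent with the area ratio stated in Claim 5.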
Referring to Fig. 7, which illustrates the form of the invention which provides the pilot with a pseudo-collimated display with an inset high-definition zone, it will be noted that the apparatus is generally similar to that of Fig. 1, except that two projectors are used, one above each eye of the pilot. Thus, two projectors use respectively lenses 62 and 64 to project the line image of a pair of fibre optic light guide output ends onto the screen 14 by way of the common frame scanning mirror 60.
The respective pairs of light guide ends 56, 57 and 58, 59 are relatively disposed as shown in Fig. 5.
The respective ends 56, 57 and 58, 59 terminate the light guide pairs 52, 53 and 54, 55.
The input ends 48, 50 and 49, 51 are scanned respectively by the high-definition zone line scanner 42 and by the low-definition zone line scanner 43.
The line scanner 42 scans the modulated laser beam 34' over the two light guide input ends 48 and 50. The line scanner 43 scans the modulated laser beam 35' over the two light guide input ends 49 and 51.
The respective high-definition zone and low-definition zone laser beam modulators 38 and 40 both receive their video modulation signals from the store 20' of the C.G.I. image generator 20, under control of the pulses supplied by the throughput delay error control unit 100 on line 110.
Fig. 8 shows line scanning apparatus alternative to that of Fig. 2 and Fig. 5 and includes a supplementary mirror 202. The mirror 202 is pivotable on an axis 203 which is parallel to the spin axis 204 of the polygon mirror line scanner 144.
To effect image derotation for head movement in the direction of line scan by the method (ii) described earlier, the mirror 202 is rotationally positioned about its axis 203 by a motor 205 in a controlled manner so that the swept arc 44 is positioned at the required part of the arc 48 at the input end of the fibre optic light guide 52. The motor 205 is controlled from the throughput delay error control unit 100 by a signal on line 102.
Modified Retro-Reflective Screen
Retro-reflective projection screen material such as that sold under the name Scotchlite (Registered Trade Mark) has a reflection characteristic such that light incident upon the screen is reflected back along the line of incidence. That is to say, reflected light is brightest on the line of incidence, falling rapidly in intensity as the eye is displaced from the line of incidence in any direction. With one retro-reflective material, observed brightness falls to one-half intensity at an angular displacement of 0.8° from the line of incidence. Stated in other words, the area of half-brightness is the base area of a cone which has its axis on the line of incidence and a half-angle of 0.8° at its apex.
In the projection apparatus described with reference to Fig. 1, the line of incidence 66, between the frame scanner 60 and the screen 14, makes an angle which is also approximately 0.8° with the line of view 70, between the screen 14 and the eye of the pilot 10. Thus, with an unmodified retro-reflective screen, the projected image would be seen at only half-brightness by the pilot.
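The order of magnitude of this angle follows from simple geometry: the projector is helmet-mounted a few centimetres above the eye, so the incidence/viewing angle is the arctangent of that separation over the screen distance. The figures below are assumptions for illustration only; they are not stated in the specification:

```python
import math

# Illustrative arithmetic only. The projector-to-eye separation and screen
# distance are assumed values, chosen to show how an angle of roughly 0.8
# degrees (the half-brightness cone of the unmodified material) can arise.

def incidence_angle_deg(projector_eye_separation_m, screen_distance_m):
    """Angle between line of incidence and line of view, in degrees."""
    return math.degrees(math.atan(projector_eye_separation_m / screen_distance_m))

angle = incidence_angle_deg(0.07, 5.0)  # assumed 7 cm separation, 5 m screen
# angle is of the order of 0.8 degrees, i.e. at the half-brightness cone of
# the unmodified retro-reflective material -- hence the preference for
# modifying the screen's reflection characteristic.
```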
In the apparatus of the invention, it is preferred to modify the reflection characteristic of the screen in order to increase the brightness of the projected image on the pilot's line of view, while decreasing brightness elsewhere. This modification is effected by placing a diffraction grating in front of the screen surface.
Head/Helmet Orientation Sensor
Mechanical linkages have been proposed to sense the orientation of a pilot's helmet relative to an aircraft cockpit. However, mechanical arrangements of any sort are undesirable in the environment of an aircraft simulator cockpit.
It is preferred to effect helmet orientation sensing by non-contact means. Any suitable known head/helmet orientation sensor may be used in apparatus of the present invention to provide electrical signals defining instantaneous helmet orientation. One such sensor is that described by R. G. Stoutmeyer and others in U.S. Patent No. 3,917,412, entitled "Advanced Helmet Tracker Using Lateral Photodetection and Light-Emitting Diodes". Such apparatus is further described by Edgar B. Lewis in U.S. Patent No. 4,028,725, entitled "High-Resolution Vision System".
Throughput Delay Error Compensation Unit, Line Scan Control and Frame Scan Control
As has been explained earlier in the description, the C.G.I. image generator 20 takes an appreciable time to compute a new view for display when the pilot's line of view is changed.
The delay is of the order of 100 msec. However, when any viewer changes his line of view by extensive head movement, there is a perceptual delay before the viewer appreciates the new view before him. This delay is also of the same order of time as the image generator delay.
In a simplified form of the apparatus according to the invention, means are provided merely to ensure that the old display is not projected along the new line of view of the changed head position. In this simplified form of the apparatus, a large change-of-head-orientation signal on line 119 is effective to blank out the projected view for a period of some 100 msec until the new view has been computed.
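The simplified blanking scheme can be sketched as follows. The class and threshold are hypothetical (the specification does not quantify what constitutes a "large" change); the 100 msec period is the throughput delay quoted above:

```python
# Minimal sketch (hypothetical names and threshold) of the simplified
# scheme: a large head-orientation change blanks the projected view for
# roughly the image-generator throughput delay, so the old picture is
# never shown along the new line of view.

BLANK_PERIOD_S = 0.100    # order of the C.G.I. recompute time
LARGE_CHANGE_DEG = 10.0   # assumed threshold; not given in the text

class BlankingController:
    def __init__(self):
        self._blank_until = 0.0

    def update(self, head_change_deg, now_s):
        """Call once per frame; returns True while the display is blanked."""
        if abs(head_change_deg) > LARGE_CHANGE_DEG:
            # Restart the blanking interval on each large movement.
            self._blank_until = now_s + BLANK_PERIOD_S
        return now_s < self._blank_until
```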
The apparatus of Fig. 1 provides means for the derotation of the projected image upon rotation of the pilot's head. Derotation is considered to be of especial importance when head movement is such that the new field of view is not separate from the old field of view but is within it or overlaps it.
The displayed view is some 100° in azimuth and some 70° in elevation, with respect to the pilot's line of view. Although a viewer's field of view may exceed these angles, the marginal areas are of low interest and the central area of prime interest may be a cone of perhaps only 5° about the line of vision. It is therefore readily possible for the pilot to change his line of view so as to move this area of central interest within the initial displayed view area.
In the apparatus of Fig. 1, line scan is in a direction across the screen 14 and frame scan is orthogonal thereto. The head orientation sensor 22 provides signals resolved into head azimuth movement and head pitch movement.
The synchronising pulse generator 106 provides line-synchronising and frame-synchronising pulse outputs of equally spaced-apart pulses. Upon a change of head azimuth, the output signal on line 119 causes the throughput delay error control unit 100 to provide a relative change of phase of the line synchronising pulses supplied by control unit 92 to the line scanner 42, and of the video synchronising pulses supplied on line 110 by the throughput delay error control unit 100 to the frame buffer store 20', so controlling read-out from the store 20' in the sense to displace the displayed image equally and oppositely to every change of head azimuth.
Similarly, the output signal on line 119 causes control unit 100 together with frame scan control unit 96 to provide a relative change of phase of the frame synchronising pulses supplied by control unit 96 to the frame scanning motors 74 and 76.
Thereby, upon head rotation in azimuth or pitch or both, the displayed view is displaced oppositely. The derotation is maintained for a period of some 100 msec, until the new view is computed. The original relative timing of the synchronising pulses is then restored, so that the new view is displayed in the direction of the new line of view.
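The timing of this derotation can be summarised in a short sketch (hypothetical function name; the 100 msec figure is the image-generator delay quoted above):

```python
# Sketch of the derotation timing: while the new view is being computed,
# the sync-pulse phase offset displaces the image equally and oppositely
# to the head rotation; once the recompute period has elapsed the offset
# is removed and the freshly computed view appears on the new line of view.

RECOMPUTE_S = 0.100  # order of the C.G.I. throughput delay

def derotation_offset_deg(head_change_deg, elapsed_s):
    """Image displacement applied via the sync-pulse phase, in degrees."""
    if elapsed_s < RECOMPUTE_S:
        return -head_change_deg  # equal and opposite to the head rotation
    return 0.0                   # new view ready: original timing restored
```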

Claims (10)

Claims
1. Head-coupled, area-of-interest, visual display apparatus providing a displayed scene having two zones, the first zone being of greater area and having a lower definition image relatively to the second zone, the second zone being of smaller area, being inset within the first zone and having a higher definition image relatively to the first zone, comprising a part-spherical retro-reflective concave screen of area greater than a viewer's instantaneous field of view, a helmet, sensing means for sensing the orientation of the viewer's head and helmet, visual image generating means for generating a simulated scene in the direction of the viewer's instantaneous line of view according to the viewer's simulated position and orientation and under control of the said sensing means, the said image generator being adapted for providing two visual images corresponding respectively to the two said zones of the displayed scene, a laser light source, separate laser beam modulators for each zone of the displayed scene, separate line scanners for each zone of the said scene for scanning the modulated laser beam over the input ends of respective fibre optic light guides, the said fibre optic light guides having their output ends at spaced-apart positions on the viewer's helmet, and frame scanning means mounted on the said helmet for receiving light from the light guide outputs and projecting the light as simultaneous scan lines of the two said zones to form a composite two-zone displayed scene on the screen.
2. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 1, in which the said high-definition zone is inset within the said low-definition zone by blanking out a part of the low-definition zone to accommodate the high definition zone therein.
3. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 1, in which the said high-definition zone is inset within the low-definition zone by being overlaid upon part of the area of said low-definition zone.
4. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 1, in which the visual image generating means generates both the low-definition and the high-definition zones of the simulated scene and includes a frame buffer store in which the zone information is held.
5. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 4, in which the simulated scene is computed in high-definition, stored in the said frame buffer store and subsequently read out, the high-definition zone video information being derived by reading all bits of every store address and the low-definition zone video information being derived by ignoring the two least-significant address bits, thereby to provide low-definition picture elements of sixteen times the area of high-definition picture elements.
6. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 1, in which the separate line scanners for the low-definition zone and high-definition zone are both continuously rotatable polygon mirror scanners, mounted after the respective low-definition zone and high-definition zone modulators for scanning the respective modulated laser beam over the input ends of low-definition and high-definition fibre optic light guides respectively.
7. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 6, in which the low-definition fibre optic light guide is composed of coarse fibres and the high-definition fibre optic light guide is composed of fine fibres, thereby to cover a smaller line width.
8. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 7, in which the fibres of the low-definition fibre optic light guide are of four times the diameter of the fibres of the high-definition fibre optic light guide.
9. Head-coupled, area-of-interest, visual display apparatus as claimed in Claim 8, in which the low-definition zone polygon mirror line scanner and the high-definition zone polygon mirror line scanner are mounted for rotation on a common shaft and the high-definition zone polygon mirror line scanner has four times as many facets as the low-definition zone polygon mirror line scanner and the high-definition zone scanner includes an intermediate pivotable mirror mounted between the polygon mirror and the input end of the high-definition zone fibre optic light guide.
10. Apparatus as claimed in Claim 1, constructed substantially as described herein with reference to the accompanying drawings or as one of the modifications thereto described herein.
GB8001166A 1979-01-24 1980-01-14 Visual display apparatus Expired GB2043941B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB8001166A GB2043941B (en) 1979-01-24 1980-01-14 Visual display apparatus
CA000344313A CA1147073A (en) 1979-01-24 1980-01-24 Visual display apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB7902504 1979-01-24
GB8001166A GB2043941B (en) 1979-01-24 1980-01-14 Visual display apparatus

Publications (2)

Publication Number Publication Date
GB2043941A true GB2043941A (en) 1980-10-08
GB2043941B GB2043941B (en) 1983-08-17

Family

ID=26270325

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8001166A Expired GB2043941B (en) 1979-01-24 1980-01-14 Visual display apparatus

Country Status (2)

Country Link
CA (1) CA1147073A (en)
GB (1) GB2043941B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2253280A (en) * 1990-12-20 1992-09-02 Gen Electric Projection system
DE102005028210A1 (en) * 2005-06-17 2006-12-28 Carl Zeiss Ag Stereo display device has support unit which is attachable on head of user whereby left and right image module is fixed on support unit for generation of image for left and right eye of user
DE102005028210B4 (en) * 2005-06-17 2009-10-15 Carl Zeiss Ag Stereo display and stereo display method

Also Published As

Publication number Publication date
GB2043941B (en) 1983-08-17
CA1147073A (en) 1983-05-24

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee