GB2640972A - Optically variable image device - Google Patents

Optically variable image device

Info

Publication number
GB2640972A
GB2640972A GB2406660.7A GB202406660A
Authority
GB
United Kingdom
Prior art keywords
pixel
light
optical
scene
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2406660.7A
Other versions
GB202406660D0 (en)
Inventor
Ryzi Zbyněk
Kolarík Vladimir
Houha Roman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Iqs Group A S
Original Assignee
Iqs Group A S
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Iqs Group A S filed Critical Iqs Group A S
Priority to GB2406660.7A priority Critical patent/GB2640972A/en
Publication of GB202406660D0 publication Critical patent/GB202406660D0/en
Priority to PCT/EP2025/062925 priority patent/WO2025233536A1/en
Publication of GB2640972A publication Critical patent/GB2640972A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B42BOOKBINDING; ALBUMS; FILES; SPECIAL PRINTED MATTER
    • B42DBOOKS; BOOK COVERS; LOOSE LEAVES; PRINTED MATTER CHARACTERISED BY IDENTIFICATION OR SECURITY FEATURES; PRINTED MATTER OF SPECIAL FORMAT OR STYLE NOT OTHERWISE PROVIDED FOR; DEVICES FOR USE THEREWITH AND NOT OTHERWISE PROVIDED FOR; MOVABLE-STRIP WRITING OR READING APPARATUS
    • B42D25/00Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof
    • B42D25/20Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof characterised by a particular use or purpose
    • B42D25/29Securities; Bank notes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B42BOOKBINDING; ALBUMS; FILES; SPECIAL PRINTED MATTER
    • B42DBOOKS; BOOK COVERS; LOOSE LEAVES; PRINTED MATTER CHARACTERISED BY IDENTIFICATION OR SECURITY FEATURES; PRINTED MATTER OF SPECIAL FORMAT OR STYLE NOT OTHERWISE PROVIDED FOR; DEVICES FOR USE THEREWITH AND NOT OTHERWISE PROVIDED FOR; MOVABLE-STRIP WRITING OR READING APPARATUS
    • B42D25/00Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof
    • B42D25/30Identification or security features, e.g. for preventing forgery
    • B42D25/324Reliefs
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B42BOOKBINDING; ALBUMS; FILES; SPECIAL PRINTED MATTER
    • B42DBOOKS; BOOK COVERS; LOOSE LEAVES; PRINTED MATTER CHARACTERISED BY IDENTIFICATION OR SECURITY FEATURES; PRINTED MATTER OF SPECIAL FORMAT OR STYLE NOT OTHERWISE PROVIDED FOR; DEVICES FOR USE THEREWITH AND NOT OTHERWISE PROVIDED FOR; MOVABLE-STRIP WRITING OR READING APPARATUS
    • B42D25/00Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof
    • B42D25/30Identification or security features, e.g. for preventing forgery
    • B42D25/328Diffraction gratings; Holograms
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005Adaptation of holography to specific applications
    • G03H1/0011Adaptation of holography to specific applications for security or authentication
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04Processes or apparatus for producing holograms
    • G03H1/08Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/22Processes or apparatus for obtaining an optical image from holograms
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07DHANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/003Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using security elements
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07DHANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/003Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using security elements
    • G07D7/0032Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using security elements using holograms
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005Adaptation of holography to specific applications
    • G03H1/0011Adaptation of holography to specific applications for security or authentication
    • G03H2001/0016Covert holograms or holobjects requiring additional knowledge to be perceived, e.g. holobject reconstructed only under IR illumination
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H2210/00Object characteristics
    • G03H2210/303D object

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Holography (AREA)

Abstract

An optically variable image device 1 forms an observable optically variable image/output light pattern of light 11 scattered from a three-dimensional object/scene 7. A layer 2 of optical material has multiple pixels 3, each pixel encoded with a respective portion of the image as a representation of scattered light 11 from the object/scene. That portion of the scattered light which is scattered from a respective sub-area 505, 508 located on a pre-selected surface 6 of the object/scene corresponds by mapping to the respective pixel 305, 308 in which its representation is encoded. Each pixel includes a respective optical structure with optical properties forming a respective phase-modulated non-planar wavefront component 12 of the output light pattern. This component (i) corresponds to the respective portion of the image to be formed that is recorded in that pixel, and (ii) has a phase modulation that is characteristic of, or a function of, the modulating optical properties of the respective optical structure of that pixel. The component's shape is independent of the shape of the wavefront of the light scattered from the respective sub-area. The component formed by each pixel has at least one respective main propagation direction 1012, 1015. The device may be for security/authentication.

Description

OPTICALLY VARIABLE IMAGE DEVICE
TECHNICAL FIELD
This invention relates to an optically variable image device, especially for example for use for security or authentication purposes in various end applications such as brand protection, document or banknote protection, passports, visas and ID cards or documents, driving licences, membership cards, tickets, certificates, packaging, works of art, antiques or other valuable items, etc. More particularly, though not exclusively, it relates to such an optically variable image device which upon illumination with incident light forms an observable optically variable image of light scattered from a three-dimensional object or scene. The optically variable image device has recorded in encoded form in respective pixels thereof respective portions of the optically variable image to be formed, and upon illumination of the device with incident light the pixels collectively form an output light pattern which resembles the scattered light from the original object or scene. In other aspects the present invention further relates to a method for producing such an optically variable image device, a method for recreating an observable optically variable image from an encoded record thereof in such an optically variable image device, and an article or product having applied or affixed thereto or incorporated therein such an optically variable image device.
BACKGROUND OF THE INVENTION AND PRIOR ART
As used herein, various terms and expressions are intended to have particular defined meanings, as follows: As used herein, the term "optically variable image" means an image whose visual appearance to a viewer changes with an angle at which the viewer observes the image and/or with an angle at which the device is illuminated by light for the purpose of the image's formation. For example, a change in the image's appearance may be perceived by the viewer as a change in one or more parameters of the formed image (or a portion thereof), such as its shape, configuration, colour, brightness, angle of view or composition and/or even as a change in the identity or subject-matter of the image itself (or a portion thereof) that is formed by the device.
As used herein, the terms "recorded" and "encoded", as applied to the form in which the optically variable image (or a portion thereof) to be formed is stored, incorporated or represented within the optical material of the pixels of the device, encompass the provision of any form of encoded recording of the optically variable image (or portion thereof) within the optical material of the respective pixels, and these terms encompass the formation of an encoded record of a representation of that image (or portion thereof) in the optical material of the respective pixels by any suitable means. Furthermore, these terms encompass various practical forms of such encoded recordings of representations of images (or portion(s) thereof) to be formed in finally produced practical embodiments of devices according to the invention, whether they be a final one-off or unique or single specific embodiment version of an optically variable image device according to the invention, or one of a plurality of final like or substantially identical or cloned replicated or derivative optically variable image devices all being according to a single specific embodiment version of such a device but being substantially identical to, or substantially clones of, each other.
Moreover, as the term is used herein, a reference to a "recording" step or procedure (or its grammatical equivalents) includes any process which forms an optical structure in the optical material of a given pixel which is capable at least of modulating the phase of light incident thereon. Such processes can include modifying existing optical or physical properties of the optical material of a given pixel, or actually forming particular optical or physical features in the optical material of the pixel such that they have or exhibit the desired optical or physical properties. Such processes can also include combining or assembling together (e.g. by coating or lamination) plural optical materials (optionally with one or more of them being pre-modified or pre-formed so as to have or exhibit particular preliminary optical properties) to form a final optical structure of the given pixel which has or exhibits the desired final properties. Such recording processes may for instance comprise performing any of the following techniques: exposure of the relevant optical material by a laser or electron beam, chemical or dry etching, embossing, UV casting, coating using any of various printing techniques or vacuum deposition techniques, lamination, etc, and such techniques may be used either singly or in any combination or sequence of two or more thereof in order to achieve the relevant desired phase-modulating end properties of the optical structure of the relevant pixel(s).
As used herein, the term "light" is to be construed broadly as referring to electromagnetic radiation in any region or portion of the electromagnetic spectrum. In many embodiments of the various aspects of the invention, however, the light that is used to define the original object or scene's scattered light that is recorded in encoded form in the pixels of the device, as well as the light that is used to illuminate the optically variable image device for the purpose of forming the observable optically variable image corresponding to the original object or scene, may typically be light within the visible region of the electromagnetic spectrum. In particular, in many embodiments of the invention the light may be white light, or alternatively it may be light of a selected range or band of frequencies/wavelengths within the visible region of the spectrum. Alternatively still, however, light which includes at least a portion that falls outside the visible region of the spectrum, e.g. into the infrared or ultraviolet regions, may be used instead, if desired.
As used herein, the term "non-planar wavefront component" refers to a component of the output light pattern that is actually generated by a particular pixel's optical structure formed in the layer of optical material, which layer constitutes at least part of the optically variable image device of the invention. Such a "non-planar wavefront component" is specifically designed to form a portion of the image of the respective portion of the light scattered from the three-dimensional object or scene. In some embodiment instances the output light pattern generated by a particular pixel may further comprise one or more other (e.g. additional or auxiliary) components which may nevertheless also have a non-planar wavefront; however, these are not primarily designed to contribute to the image of the light scattered by the three-dimensional object or scene, but instead they are designed to create additional or auxiliary overt or covert images or image components and/or light patterns, and so these other wavefronts are not encompassed by the term "non-planar wavefront component" as that term is used throughout this disclosure.
Also in connection with the term "non-planar wavefront component" as used herein, the reference to its "non-planarity" is to be understood as referring to the wavefront component's purposive design, i.e. it being designed intentionally (i.e. under the particular design conditions set by the designer) so as to have a form or shape that is other than planar or generally planar. The reference to the "non-planar" form/nature of the "non-planar wavefront component" therefore excludes any feature(s) of its form or shape (especially minor feature(s) of its form or shape) that are any of the following: (i) irregularities, i.e. one or more local micro- or macro-scale deviations from a true planar form/shape, that may be present naturally as technological imperfections or anomalies (e.g. in the form of roughness or inhomogeneity of the layer and/or the optical material), or (ii) irregularities, i.e. one or more local micro- or macro-scale deviations from a true planar form/shape, that may be present intentionally as part of the design process by the designer (e.g. in the form of one or more additional modulation feature(s) of the relevant wavefront component), or (iii) deformation(s) of the relevant wavefront component occurring due to any curvature of the layer itself or due to any diffraction effect(s) that may be inevitably present owing to the spatially limited character of the wavefront component generated from a spatially limited area (i.e. a pixel), which may cause various wavefront modulations and may affect its divergence and/or generation, especially of higher diffraction orders.
The word "main" - as referring to a particular propagation direction of either a non-planar wavefront component formed by a respective pixel or of light scattered from a respective sub-area on the pre-selected surface of the object or scene (especially having either passed through or been reflected from such a given sub-area) - means, in each respective case, that propagation direction which satisfies at least one of the following definitions (which one being at the choice of the designer of the device): (i) it being an average propagation direction of the respective light wavefront, which has been reduced thereto from one or more directional propagating constituents of the respective light wavefront with weighting according to their respective intensities, or (ii) it being a given (especially a single given) directional propagating constituent of the respective light wavefront which is highest in intensity (i.e. carries the highest proportion of the respective light wavefront's energy).
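The two alternative definitions of a "main" propagation direction above can be sketched numerically. The following is purely illustrative (function and parameter names are assumptions, not from the patent): a wavefront is reduced to a list of directional constituents, each a unit vector with an intensity, and the two definitions become (i) an intensity-weighted average direction and (ii) the single strongest constituent.

```python
import math

def main_direction(constituents):
    """Reduce directional constituents of a light wavefront to a 'main'
    propagation direction, per the two alternative definitions.

    constituents: list of ((dx, dy, dz), intensity) pairs, where each
    (dx, dy, dz) is a unit propagation vector.
    Returns (averaged, strongest): definition (i) and definition (ii).
    """
    # Definition (i): intensity-weighted average, renormalised to unit length
    sx = sum(d[0] * w for d, w in constituents)
    sy = sum(d[1] * w for d, w in constituents)
    sz = sum(d[2] * w for d, w in constituents)
    norm = math.sqrt(sx * sx + sy * sy + sz * sz)
    averaged = (sx / norm, sy / norm, sz / norm)

    # Definition (ii): the constituent carrying the highest intensity
    strongest = max(constituents, key=lambda c: c[1])[0]
    return averaged, strongest
```

For a wavefront dominated by one constituent the two definitions coincide; they differ when the scattered energy is spread over several comparable directions, which is why the patent leaves the choice to the designer.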
References to light being light which is "scattered from" a given sub-area on a pre-selected surface of the three-dimensional object or scene means that light which is scattered away from that given sub-area by either transmission through or reflection from that sub-area. Additionally and more specifically, references to such light which is "scattered from" a given sub-area on the pre-selected surface of the three-dimensional object or scene means and refers to that light which originates outside the three-dimensional object or scene and is incident thereon, and which during the interaction with the object or scene is incident on a receiving side of the relevant sub-area on the pre-selected surface of the object or scene and emerges from that sub-area as scattered light.
As used herein, the term "design wavelength" means the wavelength of light which the optical properties of the optical structures of the pixels of devices according to embodiments of the invention are designed and produced with reference to. It may be noted, however, that in certain cases the design wavelength may not necessarily or always be, or it may not have to be (although in certain instances it may indeed be), identical to the wavelength of light scattered by the three-dimensional object or scene (whose image is to be formed by the device) or the incident light that is modulated by the layer of the device to form the image that is observable by a viewer. Moreover, as used herein, the term "pre-selected wavelength" is to be understood as meaning the same thing as "design wavelength", and indeed such terms may be used interchangeably with the same meaning in various places within this disclosure.
As used herein, the term "mapping" (and its grammatically equivalent terms) - as referring to the creating of a correspondence between each respective sub-area located on a pre-selected surface of the three-dimensional object or scene and a respective pixel in the layer of the device, or vice versa - means the assigning of respective ones of the sub-areas to respective ones of the pixels, or vice versa, so as to define or designate respective sub-area-pixel pairs. Such assigning, or defining/designating as respective sub-area-pixel pairs, may generally be done through the use of respective projection vectors or projection lines extending (e.g. drawn virtually or calculated theoretically) between the respective sub-areas and pixels to form each mapped pair, whilst maintaining a spatial correspondence between the distribution of the pixels in the pixel arrangement in the layer and the distribution of the sub-areas on the pre-selected surface of the three-dimensional object or scene (i.e. each respective sub-area-pixel pair has the same respective neighbouring or adjacent sub-area-pixel pairs as the respective individual sub-areas have and the respective individual pixels have in both distributions).
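Such a mapping can be illustrated with a minimal sketch. The patent does not prescribe any particular projection; the version below assumes a parallel (orthogonal) projection of sub-area centres onto the layer plane and a regular square pixel grid, both of which are illustrative choices. Because a parallel projection preserves left/right and up/down ordering, neighbouring sub-areas map to neighbouring pixels, satisfying the spatial-correspondence requirement.

```python
def map_subareas_to_pixels(subarea_centres, grid_origin, pitch):
    """Assign each sub-area on the pre-selected surface to a pixel in the
    layer by orthogonal projection onto the layer plane (taken as z = 0).

    subarea_centres: list of (x, y, z) centre points of the sub-areas.
    grid_origin: (x0, y0) position of pixel (0, 0) in the layer plane.
    pitch: pixel spacing of the assumed square grid.
    Returns {subarea_index: (column, row)} defining the sub-area-pixel pairs.
    """
    x0, y0 = grid_origin
    mapping = {}
    for i, (x, y, _z) in enumerate(subarea_centres):
        # Drop z (the projection line is normal to the layer) and snap
        # the projected point to the nearest pixel of the grid.
        col = round((x - x0) / pitch)
        row = round((y - y0) / pitch)
        mapping[i] = (col, row)
    return mapping
```

An oblique projection (projection vectors tilted relative to the layer normal) would work equally well under the patent's definition, provided the neighbourhood relations of the two distributions are preserved.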
The art is replete with many ways in which a three-dimensional perception of a three-dimensional object can be created using a substantially flat recording medium and illumination conditions. Probably the oldest ones are based on artistic shadowing techniques used in paintings or drawings of a scene or an object. Depending on the skills of the artist, some artworks can even entice the observer to touch the image to be convinced that the scene or object he/she is looking at is actually not three- but just two-dimensional.
Nevertheless, such images remain static regardless of the observer's position or illumination conditions. A leap in the art of three-dimensional imaging was the discovery of the principle of holographic recording of three-dimensional objects or scenes. This technique allowed recording of the light which a given object or scene reflects or scatters into various directions.
A fully three-dimensional recording of an image or scene on a two-dimensional medium is called a hologram, and it offers the observer a three-dimensional perception of a real object or scene that can be viewed differently from different observation angles. Such images can generally be termed "optically variable images" and they represent the truest resemblance of the real object or scene in question. However, the price the observer usually pays is that the illumination conditions need to be quite special, such as unidirectional or point-source lighting, or even monochromatic lighting, not to mention that the cost of the equipment and recording operation needed to create a hologram is usually much higher than for creating a drawing or a painting of a similar size.
Any image on a two-dimensional medium which resembles a three-dimensional object or scene always attracts attention, especially those images which appear variable under variable observation or illumination conditions. Over the years a large number of techniques have been invented for creating a three-dimensional perception of an image on a substantially two-dimensional medium and for finding a compromise or balance between static and fully three-dimensional images. These have generally relied on arbitrary and special illumination conditions, including for example stereoscopic imaging using lenticular arrays, micro-faceted surfaces either in a pixelated or zonal form and which work by discretizing the relief surface of an object and transferring it into a two-dimensional optical relief medium, and micro- or nanostructures which mimic the reflecting properties of a relief surface by means of diffraction or other optical effects. In more recent years these known techniques have focused on creating a unique arrangement of a sophisticated micro- or nanostructure which can provide a three-dimensional perception of an object or scene often in combination with other features of the structure or its optical properties which can be advantageously used not only for aesthetic but also authentication or security purposes.
Nowadays in various markets, in particular the brand protection or authentication and document security markets, there is a constant need for new authentication features of devices that record optically variable images and which can thus be used as authentication or security elements. In recent years achromatic bas-relief-type elements have become a standard in optically variable image security elements, not only for their distinctive design but especially for their clear visibility to the naked eye of the observer without the need for a special reading or viewing device. Such elements are highly distinguishable even in poor viewing conditions, especially under diffuse lighting conditions. This is also achieved by virtue of the fact that the relief is made up of specific blazed zonal or faceted structures. However, the production of optically variable image elements with such structures is technologically demanding and expensive, and this already usefully leads to significant limitations for potential counterfeiters.
As mentioned above, such security or protective elements can be designed in such a way that they provide an achromatic image. An achromatic character of such images with a three-dimensional appearance is often a desired feature that clearly distinguishes such security or protective elements from standard rainbow diffractive optically variable image devices hitherto known in the art. Various techniques and structures for producing images from such known elements have already been developed in the art, and examples thereof are described for instance in published patent documents US5105306A, WO2018/201208A1, EP3059093A1 (and its corresponding US2013/093172A1) and WO2004/048119A1. A common feature of these known techniques or processes is the conversion of the bas-relief of a three-dimensional object or its geometrical properties into a micro-structured surface in the form of Fresnel-like zones or facets.
However, known optically variable image devices such as those mentioned above inherently come with various limitations or shortcomings. For instance, the images that they recreate often involve viewing parallax, which can detract from the efficacy of more complex images that need or desirably have true three-dimensional viewing capability without parallax effects.
Furthermore, the manner in which the optical structures of such known devices record the optically variable images in question sometimes limits their ability to record secondary or complementary (e.g. overt or covert) authentication or security features or effects that may be useful to incorporate into an overall recreated authentication or security image or image set or field.
More particularly, with ever increasing demands being placed on optically variable image devices useful for authentication or security purposes in terms of their ability to incorporate ever more complex and multi-feature or multi-effect security components, as well as for their non-counterfeitability, there is an increasing need in the art for new optically variable image devices that overcome at least some of the limitations and shortcomings of the known art of optically variable image devices such as those discussed above, and furthermore that allow for the creation of new and/or additional security or authentication features or effects. It is a primary object of the present invention to address this need.
SUMMARY OF THE INVENTION
The present invention has been devised with the above object at its heart, and aims to provide a new form of optically variable image device that uses a different and novel approach to recording an optically variable image - which resembles or corresponds to a three-dimensional object or scene - in encoded form in a device which upon illumination with light forms the optically variable image corresponding to that three-dimensional object or scene.
Accordingly, in a first aspect the present invention provides an optically variable image device for forming an observable optically variable image of light scattered from a three-dimensional object or scene upon illumination of the device with incident light, the image to be formed comprising an output light pattern, the device comprising a layer of optical material with an arrangement of a plurality of pixels defined or definable thereon, each pixel having recorded therein in encoded form a respective portion of the image to be formed, wherein: each respective portion of the image to be formed that is recorded in a respective pixel is encoded therein as a representation of a respective portion of the said scattered light from the three-dimensional object or scene, where that respective portion of the said scattered light from the three- dimensional object or scene is scattered from a respective sub-area located on a pre-selected surface of the three-dimensional object or scene, and which respective sub-area corresponds by mapping to the respective pixel in the pixel arrangement in which is encoded its representation; each pixel comprises a respective optical structure with optical properties capable of modulating the phase of light incident thereon and forming a respective phase-modulated non-planar wavefront component of the output light pattern, which respective non-planar wavefront component formed by that respective pixel (i) corresponds to the respective portion of the image to be formed that is recorded in that respective pixel, and (ii) has phase modulation that is characteristic of or a function of the modulating optical properties of the respective optical structure of that respective pixel, where a shape of the said respective non-planar wavefront component is independent of the shape of a wavefront of the said scattered light from the respective subarea on the pre-selected surface of the three-dimensional object or scene that corresponds by mapping to 
the respective pixel in the pixel arrangement in which is encoded its representation, and where upon illumination of each respective pixel of the layer of the device with incident light, each said pixel forms a respective portion of the observable optically variable image in the form of the said respective non-planar wavefront component of the output light pattern, and where the said respective non-planar wavefront component formed by each respective pixel, upon that respective pixel being illuminated under pre-selected illumination conditions, has at least one respective main propagation direction that is either substantially equal to or at least is dependent on a respective at least one main propagation direction of light scattered from the respective sub-area on the pre-selected surface of the three-dimensional object or scene that corresponds by mapping to the respective pixel in the pixel arrangement in which is encoded its representation, upon the three-dimensional object or scene being illuminated under the said pre-selected illumination conditions, where the said pre-selected illumination conditions comprise, for each respective pixel and each respective sub-area on the pre-selected surface of the three-dimensional object or scene, illumination by a planar light wave of a pre-selected or design wavelength and at a respective pre-selected angle of incidence on the respective pixel and on the three-dimensional object or scene, as the case may be, and in the case of the respective sub-area the said planar light wave originating outside the three-dimensional object or scene and being incident on a receiving side of the respective sub-area as it interacts with the three-dimensional object or scene; whereby upon collective illumination of the arrangement of pixels of the layer of the device with incident light, the plurality of pixels collectively form the observable optically variable image in the form of a plurality of portions thereof in the form of a 
plurality of said respective non-planar wavefront components of the output light pattern, which plurality of said respective non-planar wavefront components of the output light pattern collectively correspond to the said scattered light from the three-dimensional object or scene whose image is formed by the device.
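The core behaviour defined in the first aspect above (each pixel transforming a planar incident wave into a phase-modulated non-planar wavefront component with a controlled main propagation direction) can be illustrated with a simplified numerical sketch. The wavelength, pixel size and the particular tilt-plus-curvature phase profile below are illustrative assumptions for the purpose of this sketch only, and are not values taken from this disclosure:

```python
import numpy as np

WAVELENGTH = 532e-9   # assumed design wavelength (not specified in the text)
PIXEL_SIZE = 50e-6    # assumed pixel side length

def pixel_phase(x, theta_main, focal=-2e-3, wl=WAVELENGTH):
    # Tilt term fixes the main propagation direction; the weak
    # quadratic term makes the outgoing wavefront non-planar
    # (slightly diverging, since focal < 0).
    k = 2 * np.pi / wl
    return k * np.sin(theta_main) * x + k * x**2 / (2 * focal)

# Sample the phase across one pixel and recover the local propagation
# angle from the transverse phase gradient: sin(theta) = (dphi/dx) / k.
xs = np.linspace(-PIXEL_SIZE / 2, PIXEL_SIZE / 2, 201)
phi = pixel_phase(xs, theta_main=np.radians(20))
k = 2 * np.pi / WAVELENGTH
local_angle = np.degrees(np.arcsin(np.gradient(phi, xs) / k))

main_direction = local_angle[len(xs) // 2]          # angle at pixel centre
angular_spread = local_angle.max() - local_angle.min()
```

The sketch recovers a main direction equal to the designed 20° tilt while the quadratic term spreads the local propagation angles over roughly a degree, i.e. the wavefront component is non-planar yet keeps a well-defined main propagation direction, as the claim language requires.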
In a second aspect the present invention provides a method for producing an optically variable image device for forming an observable optically variable image of light scattered from a three-dimensional object or scene upon illumination of the device with incident light, the image to be formed comprising an output light pattern, and the optically variable image device being a device according to any one of claims 1 to 33, wherein the method comprises: (1) either before or after step (2), pre-selecting a surface of the three-dimensional object or scene for the purpose of the following steps of the method; (2) either after or before step (1), providing a layer of optical material with an arrangement of a plurality of pixels defined or definable thereon, each respective pixel being for forming a respective optical structure therein with optical properties capable of modulating the phase of the light incident thereon, where each respective pixel is for forming a said respective optical structure which is capable of modulating the phase of light incident thereon so as to form a respective phase-modulated non-planar wavefront component of the output light pattern, which respective non-planar wavefront component to be formed by that respective pixel (i) corresponds to a respective portion of the image to be formed that is to be recorded in that respective pixel, and (ii) has phase modulation that is characteristic of or a function of the modulating optical properties of the respective optical structure to be formed in that respective pixel; (3) mapping each respective one of a plurality of sub-areas located on the preselected surface of the three-dimensional object or scene onto each corresponding respective pixel that is to record the corresponding respective portion of the image to be formed, or alternatively mapping each respective pixel that is to record a respective portion of the image to be formed onto each corresponding respective one of a plurality of sub-areas 
located on the pre-selected surface of the three-dimensional object or scene; (4) for the purpose of subsequent encoding in the respective pixels of the respective recorded portions of the image to be formed, defining pre-selected illumination conditions comprising, for each respective pixel and each respective sub-area on the pre-selected surface of the three-dimensional object or scene, illumination by a planar light wave of a preselected or design wavelength and at a respective pre-selected angle of incidence on the respective pixel and on the three-dimensional object or scene, as the case may be, and in the case of the respective sub-area the said planar light wave originating outside the three-dimensional object or scene and being incident on a receiving side of the respective subarea as it interacts with the three-dimensional object or scene; (5) for each respective sub-area located on the pre-selected surface of the three-dimensional object or scene, and for the pre-selected illumination conditions defined in step (4), determining, especially from calculation or measurement, at least one respective main propagation direction of light scattered from the respective sub-area on the pre-selected surface of the three-dimensional object or scene that corresponds by the said mapping to the respective mapped pixel in the pixel arrangement in which is to be encoded its representation, and therefrom assigning that respective at least one main propagation direction of the scattered light from the respective sub-area to a respective at least one main propagation direction of the respective non-planar wavefront component of the output light pattern to be formed by that respective pixel after interaction therewith of the incident light thereon under the pre-selected illumination conditions, where that determining and assigning are such that the said respective non-planar wavefront component to be formed by each respective pixel, upon said respective pixel being 
illuminated under the said pre-selected illumination conditions, has at least one said respective main propagation direction that is either substantially equal to or at least is dependent on a respective at least one said main propagation direction of the said scattered light from the respective sub-area on the pre-selected surface of the three-dimensional object or scene; (6) for each respective mapped pixel, using the respective at least one main propagation direction of the respective non-planar wavefront component to be formed by that respective pixel as determined and assigned in step (5), designing a shape of the said respective non-planar wavefront component to be formed by that respective pixel, which respective non-planar wavefront component to be formed by that respective pixel is independent of the shape of a wavefront of the said scattered light from the respective subarea on the pre-selected surface of the three-dimensional object or scene that corresponds by the said mapping to the respective pixel in the pixel arrangement in which is to be encoded its representation; (7) for each respective pixel in the pixel arrangement, calculating the modulating optical properties of the optical material that is to form the respective optical structure thereof and which is capable of transforming a planar wavefront of the light incident on that respective pixel into a said designed respective phase-modulated non-planar wavefront component of the output light pattern that is characteristic of or a function of those modulating optical properties of the respective optical structure to be formed in that respective pixel; and (8) recording in the optical material of each respective pixel each respective optical structure thereof that is capable of forming each respective phase-modulated non-planar wavefront component of the output light pattern of the image to be formed, where each respective phase-modulated non-planar wavefront component is so recorded in the 
respective pixel's optical structure in encoded form as said modulating optical properties of the optical material of the respective optical structure as calculated in step (7), whereby each respective encoded record of each respective portion of the image to be formed is so recorded in the respective optical structure of the respective pixel as a representation of the corresponding respective portion of the said scattered light from the respective sub-area on the pre-selected surface of the three-dimensional object or scene.
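Steps (3), (5) and (7) of the method above can be sketched in simplified form as follows. The grid of sub-areas, the sample scatter directions and the design wavelength are all invented illustrative values, not data from this disclosure, and the "tilt coefficients" stand in for the full calculation of the modulating optical properties:

```python
import numpy as np

WL = 532e-9           # assumed design wavelength
K = 2 * np.pi / WL

# Step (5) inputs: invented sample data giving the main scatter
# direction (polar, azimuth, in degrees) of the light scattered
# from each sub-area on the pre-selected surface, as would be
# obtained by measurement or calculation.
sub_area_directions = {
    (0, 0): (10.0, 0.0),
    (0, 1): (12.0, 45.0),
    (1, 0): (15.0, 90.0),
    (1, 1): (20.0, 180.0),
}

def map_sub_area_to_pixel(idx):
    # Step (3): here simply the identity mapping; in practice the
    # mapping may scale, mirror or otherwise transform the grid.
    return idx

def tilt_coefficients(polar_deg, azimuth_deg):
    # Step (7), simplified: the transverse phase-gradient (tilt)
    # coefficients that steer a plane wave into the assigned
    # main propagation direction.
    th, az = np.radians(polar_deg), np.radians(azimuth_deg)
    return K * np.sin(th) * np.cos(az), K * np.sin(th) * np.sin(az)

# Steps (3) and (5): assign each mapped pixel its main direction.
pixel_directions = {map_sub_area_to_pixel(i): d
                    for i, d in sub_area_directions.items()}

# Step (7): per-pixel tilt coefficients for the encoded structure.
pixel_tilts = {p: tilt_coefficients(*d) for p, d in pixel_directions.items()}
```

The essential point of the sketch is the direction assignment: each pixel inherits the main propagation direction of its mapped sub-area, and only then is a wavefront shape and optical structure designed around that direction (steps (6) to (8)).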
In some practical embodiments of the above-defined method of the second aspect, especially in embodiments in which the produced layer following step (8) - i.e. with its arrangement of pixels with the respective portions of the image to be formed recorded in encoded form in the respective optical structures thereof - does not itself inherently constitute the final optically variable image device being produced, the method may further include, after step (8), a step of: (9) assembling or incorporating the layer with its arrangement of pixels (with the respective portions of the image to be formed recorded in encoded form in the respective optical structures thereof) into the final optically variable image device being produced, optionally by application of the layer onto, or by embedding of the layer into, or by combining the layer with (e.g. by lamination), one or more carrier layers or other structural or optically functional or non-functional layers of the produced device.
It is to be understood that in certain practical forms of such embodiments as in the preceding paragraph, such an "assembling or incorporating" step (9) may contribute to the overall procedure for "recording" in the optical material of each respective pixel each respective optical structure thereof that is capable of forming each respective phase-modulated non-planar wavefront component of the output light pattern of the image to be formed, as defined generally in step (8). Thus, in such embodiment forms, the overall procedure of "recording" as defined generally as step (8) may only be complete once such an additional step (9) has been carried out.
In the context of commercial production, practical embodiments of the above-defined method of the second aspect may be designed and carried out so as to produce any desired number of final optically variable image devices according to the first aspect of the invention, especially a plurality of such devices which in many typical commercial production scenarios may each be a replicated or derivative such device according to a single embodiment version thereof.
For example, in some such embodiments, the complete above-defined method of the second aspect comprising the above-defined steps (1) to (7) as well as the above-defined "recording" step (8), and optionally also the additional above-defined "assembling or incorporating" step (9), may be designed and carried out any desired or required number of times, e.g. either once only or a plurality of times, so as to produce one or more respective final specific embodiment versions of an optically variable image device according to the first aspect of the invention upon the or each respective such rendering of the complete method.
However, in other such embodiments, only that part of the above-defined method of the second aspect comprising the defined steps (1) to (7) may be designed and carried out once only - e.g. for the purpose of defining the encoding of each respective portion of the image to be formed that is to be recorded in the respective pixels of a single specific embodiment version of such a device according to the first aspect of the invention - and then only the remaining above-defined "recording" step (8), and optionally also the additional above-defined "assembling or incorporating" step (9), may be carried out any desired or required number of times, especially a plural number of times, on each respective one of a plurality of substantially identical discrete layers of optical material, so as to produce a plurality of final replicated or derivative devices all being according to that single specific embodiment version thereof but which are substantially identical to, or are substantial clones of, each other.
In a third aspect the present invention provides a method of forming an observable optically variable image of light scattered from a three-dimensional object or scene, the image being formed from an encoded record thereof in an optically variable image device, wherein the method comprises: (1) providing said optically variable image device being an optically variable image device according to the first aspect of the invention or any embodiment thereof or an optically variable image device produced by the method according to the second aspect of the invention or any embodiment thereof; and (2) illuminating the arrangement of pixels of the layer of the device collectively with incident light; whereby upon said illumination of each respective pixel of the layer of the device with the incident light, each said pixel forms a respective portion of the observable optically variable image in the form of the said respective non-planar wavefront component of the output light pattern, and whereby upon said collective illumination of the arrangement of pixels of the layer of the device with the incident light, the plurality of pixels collectively form the observable optically variable image in the form of a plurality of portions thereof in the form of a plurality of said respective non-planar wavefront components of the output light pattern, which plurality of said respective non-planar wavefront components of the output light pattern collectively correspond to the said scattered light from the three-dimensional object or scene whose image is formed by the device.
In embodiments of the invention, it is to be understood that in the case of each pixel comprising a respective optical structure with optical properties that render it capable of modulating the phase of light incident thereon and forming a respective phase-modulated non-planar wavefront component of the output light pattern, it may be possible within this definition, or possible at the same time, for the optical properties of the respective optical structure of the respective pixel to additionally modulate one or more other properties of the light incident on the respective pixel, in particular its amplitude and/or its polarization. Thus, the manner of modulation of the light incident on the respective pixels may not be limited to phase modulation only, and the respective pixels may modulate the incident light in other ways too, within the scope of embodiments of the invention.
In many embodiments of the invention in its various aspects, the observable optically variable image of the light scattered from the three-dimensional object or scene that is formed by the optically variable image device upon illumination thereof with incident light may comprise an achromatic such image. In general, an "achromatic" image (i.e. an "achromatic" light distribution pattern that forms an image) means an image that provides substantially the same appearance to an observer when formed by different wavelengths of light incident on the optically variable image device and/or when images formed by different wavelengths of polychromatic incident light overlap or are mixed together to generate an appearance that is perceivable as essentially achromatic or having substantially the same colour as the incident light itself. The image in question may be so viewable or perceivable as achromatic within a limited wavelength range. In typical visual applications this may mean in the visible range of light wavelengths, e.g. from about 400 to 780 nm or even from about 475 to 650 nm. Alternatively or additionally, the image in question may be so viewable or perceivable as achromatic in a limited range of incidence or observation angles, for example as near-specular reflection of light from the optically variable image (e.g. within a 10, 20 or 30° angular range) or under preferred observation conditions (e.g. where the angle between the light source and the observer is about 30°).
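The distinction between chromatic and achromatic behaviour can be illustrated with the standard thin-grating equation (illustrative physics only; the grating period and wavelengths below are assumed example values, not parameters of this disclosure). A fine diffractive structure steers different wavelengths into noticeably different directions, whereas a mirror-like (zeroth-order) redirection is wavelength independent and so appears achromatic:

```python
import numpy as np

def diffraction_angle_deg(wavelength, period, order=1, incidence_deg=0.0):
    # Thin-grating equation: sin(theta_out) = sin(theta_in) + m * wl / d
    s = np.sin(np.radians(incidence_deg)) + order * wavelength / period
    return np.degrees(np.arcsin(s))

# A fine 1 micron grating separates blue (450 nm) and red (650 nm)
# first orders by more than ten degrees, so its image changes colour
# with viewing angle; the zeroth (mirror-like) order redirects all
# wavelengths identically and so looks achromatic.
blue = diffraction_angle_deg(450e-9, 1e-6)
red = diffraction_angle_deg(650e-9, 1e-6)
dispersion = red - blue
```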
In most practical embodiments of the invention in its various aspects, the representational relationship between the respective portions of the output light pattern formed by the device (and comprising the phase-modulated non-planar wavefront components thereof) and the respective portions of the scattered light from the three-dimensional object or scene may be such that they merely resemble each other or are similar to or are an imitation or approximation of each other, instead of strictly being a substantially exact or near-substantially-exact reproduction or duplicate thereof - provided that those resembling/similar/imitating/approximating respective output light pattern portions (comprising the phase-modulated non-planar wavefront components) and the respective portions of the scattered light from the original object or scene have or retain as between them generally similar or approximately or roughly or generally in the vicinity of (or even possibly substantially, in certain embodiment cases) the same respective main propagation directions. In other words, in most practical embodiments of the invention the optically variable image device may merely act or function to form an observable optically variable image that has or provides approximately or roughly or generally in the vicinity of (or even possibly substantially, in some embodiment cases) the same or a merely similar perception (especially when perceived by an observer) as the scattered light from the original three-dimensional object or scene.
In some embodiments of the invention, the encoding by which the respective portions of the image to be formed are recorded in the respective pixels may include not only the encoding of the modulated optical properties of the optical material in the respective pixels that record therein in that encoded form the respective phase-modulated non-planar output light wavefront components, but the said encoding may also include additional encoding as defined in one of the following (a), (b) or (c): (a) auxiliary encoding in the optical material in one or more of, optionally substantially all of, the pixels which modifies or modulates the said encoding (that records in the respective pixels the respective phase-modulated non-planar output light wavefront components) so as to encode therein at least one modification to or modulation of the phase-modulated non-planar output light wavefront components which defines, and manifests itself in the output light pattern as, at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component of the output light pattern; or (b) encoding in the optical material in one or more of, optionally substantially all of, the pixels of at least one additional or auxiliary wavefront component which defines, and manifests itself in the output light pattern as, at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component of the output light pattern; or (c) a combination of (a) and (b); wherein in either case (a), (b) or (c) the additional encoding of the said at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component may be manifested in the output light pattern that is formable by the device as a modification or modulation of, or as an additional or auxiliary feature, characteristic, image or image component of, or
as a subtraction of a portion from, one or more portions of the image to be formed by the device.
In some such embodiments, the additional encoding of the at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component may be so additionally encoded so as to be included in or present within or across or superimposed onto or subtracted from either (i) a respective single one of the phase-modulated non-planar output light wavefront components, or one or more portions thereof, that is formable by the device, or (ii) collectively a plurality of, or optionally substantially all of, the phase-modulated non-planar output light wavefront components, or one or more portions of each of one or more of said phase-modulated non-planar output light wavefront components of the plurality, that are formable by the device.

Alternatively or additionally, in some such embodiments, the maximum or average intensity of the or each one of the at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component(s) may be less than half of, optionally less than a quarter of, further optionally less than a tenth of, the maximum or average (as the respective case may be) intensity of each respective phase-modulated non-planar output light wavefront component forming a respective portion of the image to be formed of the three-dimensional object or scene.
In some such embodiments as in the preceding three paragraphs, it may be that the said additional encoding as defined by (a), (b) or (c) is so additionally encoded in a given pixel so as to comprise encoding of either: (i) a representation of at least part of (especially a part less than the whole of) the scattered light from a sub-area located on the pre-selected surface of the three-dimensional object or scene that is an adjacent or neighbouring sub-area to the respective sub-area thereon that corresponds by mapping to the said given pixel, or (ii) a representation of at least part of (especially a part less than the whole of) the portion of the output light pattern that is generated by an adjacent or neighbouring pixel to the said given pixel.
Such embodiments as in the preceding paragraph may be useful in that the said additional encoding may constitute a tool in the designer's hands which may be used to refine the appearance of the overall output light pattern generated by the device (e.g. to produce smoother transitions in viewing light generated by neighbouring pixels in the layer by application of fine-tuned intensity weights, wavefront forms, etc.), and which may also contribute to hiding or suppressing of noise generated in the output light pattern due to interaction of incident light with the pixel boundaries.
In some such embodiments as in any of the preceding five paragraphs, the additional encoding as defined in (a), (b) or (c) of the at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component may comprise one or more of, or any combination of any plurality of, any of the following: (i) auxiliary encoding of a spatially-limiting or truncating feature or characteristic which limits or restricts or truncates the spatial extent and/or shape of one or more of the phase-modulated non-planar output light wavefront components; (ii) auxiliary encoding of a phase variation feature or characteristic which further varies the phase-modulation of one or more of the non-planar output light wavefront components; (iii) encoding of a covert or auxiliary image or image component; (iv) encoding of one or more auxiliary or subsidiary images or image components that require specific lighting and/or viewing conditions in order to be observable.
In some embodiment forms of the immediately-above auxiliary encoding feature (i), it may be that the spatially-limiting or truncating feature or characteristic may comprise an area of the or each said pixel, to which said pixel the auxiliary encoding is applied and which generates a respective non-planar wavefront component of the output light pattern, being reduced within that pixel so as to delimit a sub-area of that pixel, where the said sub-area has a shape of any desired design, and where the minimum inscribed circle of that sub-area is at least about 5 or 10 microns, and the angular width of the respective truncated non-planar wavefront component generated by that pixel is not less than about 5 or 10° in at least one azimuthal direction.

In some embodiment forms of the immediately-above auxiliary encoding feature (ii), it may be that the phase variation feature or characteristic may comprise an additional phase modulation or phase modification function superimposed onto the phase modulation or modulation function of the respective said non-planar wavefront component and which does not change the main propagation direction of the so modulated/modified non-planar wavefront component by more than about 5 or 10° from the main propagation direction of the non-so-modulated/modified non-planar wavefront component (i.e. prior to its additional modulation/modification), and wherein the additional phase modulation/modification may comprise spatial modulation frequencies producing an auxiliary light pattern component (e.g. by diffraction) with a higher, optionally at least 2 or 3 times higher, angular width than the angular width of the non-so-modulated/modified non-planar wavefront component, and the modulation depth of the additional phase modulation/modification may be less than about half, optionally less than about a quarter, further optionally less than about a tenth, of the pre-selected or design wavelength.
Optionally it may further be that the so-produced auxiliary light pattern component's (e.g. diffracted) light may form a respective auxiliary wavefront component, and yet further optionally it may be that that respective so-produced auxiliary wavefront component may be in the form of a covert or auxiliary image or image component formed on or adjacent to or around the respective said non-planar wavefront component.
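The relationship described above between the spatial-frequency content of the additional phase modulation and the angular width of the auxiliary light pattern component can be sketched numerically. The relation delta(sin theta) ≈ wavelength × delta(spatial frequency) is standard diffraction physics; the specific wavelength, bandwidths and modulation depth below are assumed example values chosen only to satisfy the bounds discussed in the text:

```python
import numpy as np

WL = 532e-9  # assumed design wavelength

def angular_width_deg(spatial_freq_bandwidth, wl=WL):
    # Angular width of a diffracted component from the spatial
    # frequency bandwidth of its phase modulation:
    # delta(sin theta) ~= wl * delta(f)
    return np.degrees(np.arcsin(min(1.0, wl * spatial_freq_bandwidth)))

main_width = angular_width_deg(3.3e5)   # main non-planar component (assumed)
aux_width = angular_width_deg(1.0e6)    # auxiliary modulation (assumed)
ratio = aux_width / main_width          # auxiliary component is ~3x wider

# The shallow auxiliary modulation depth stays well below the
# quarter-wavelength bound discussed in the text.
modulation_depth = WL / 10
```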
In some embodiment forms of the immediately-above encoding feature (iii), it may be that the covert or auxiliary image or image component may comprise any one of the following: (iii)(a) a covert or auxiliary image or image component which is encoded into the optical structure(s) of the or one or more said pixels or each pixel of a group of a plurality of said pixels which can be defined as a light wave pattern, the light wave pattern being superimposed on the or each respective non-planar wavefront component generated by the or each respective said pixel or group of pixels, optionally wherein the encoding of the covert or auxiliary image or image component is superimposed onto the encoding of the or each respective non-planar wavefront component generated by the or each respective said pixel or group of pixels, and further optionally wherein the encoding of the covert or auxiliary image or image component is independent of the main propagation direction(s) of the respective non-planar wavefront component(s) generated by the optical structure(s) of the respective said pixel(s); or (iii)(b) a covert or auxiliary image or image component as defined in (iii)(a) above, and which is itself an optically variable image and is overlaid with or superimposed by, or itself overlays or superimposes or even substitutes (especially partially or fully), the said image to be formed of the three-dimensional object or scene; or (iii)(c) a covert or auxiliary image or image component as defined in (iii)(a) or (iii)(b) above, and which is in the form of a two-dimensional graphic pattern; or (iii)(d) a covert or auxiliary image or image component as defined in (iii)(a) or (iii)(b) or (iii)(c) above, and which is a hologram, optionally a Fourier or Fresnel hologram or a combination thereof.
In some embodiment forms of the immediately-above encoding feature (iv), it may be that the specific lighting and/or viewing conditions, under which - especially only under which - the one or more auxiliary or subsidiary images or image components is/are observable, may comprise one or more of the following: incident light of a specific directionality relative to the device (e.g. light which is substantially unidirectional and/or that propagates in one specific direction relative to the device, or light which is multidirectional but emitted from a point source or a representation thereof), incident light of a specific chromaticity (e.g. light which is substantially monochromatic), incident light of a specific pre-selected or design wavelength, a specific viewing direction or viewing angle relative to the device or a specific range of viewing directions or viewing angles relative to the device.
In some embodiments of the invention, the wavefront which is to be generated by a given pixel may be designed with one or more extension(s) into one or more neighbouring or adjacent pixels (i.e. into a portion of or an entire area delimited by the neighbouring or adjacent pixel or pixels). A practical example of this will be described and explained in detail further below in connection with a specific embodiment shown in the accompanying drawings. Such an extended portion(s) of the wavefront designed for a given pixel may be either (i) encoded into the neighbouring pixel(s) itself/themselves, or (ii) encoded into the neighbouring pixel(s) as an auxiliary or additional or superimposed wavefront component(s) or image(s) (especially that being as defined above).
In practising many embodiments of the invention, the determination and calculation of the encoding in the optical material of the respective pixels' optical structures of the respective phase-modulated non-planar wavefront components forming respective portions of the image to be formed may be based on either a direct or an indirect working from the light scattered from the pre-selected surface of the three-dimensional object or scene, where the encoding is overall determined and calculated based on: either (i) a direct working from real scattered light detected from a real three-dimensional object or scene under a predetermined real optical illumination arrangement, or (ii) an indirect working from a calculated or theoretical record of, or a modelled or simulated approximation of, light scattered from a calculated or theoretical or modelled or simulated three-dimensional object or scene under a predetermined calculated or theoretical or modelled or simulated optical illumination arrangement.
In some such embodiments as in the preceding paragraph which are based on feature (ii), it may be that the pre-selected surface's light scattering properties, or a light scattering pattern it produces under the predetermined calculated, theoretical, modelled or simulated optical illumination arrangement, may be based on a solely calculated, theoretical, modelled or simulated such arrangement determined from purely calculated, theoretical, modelled or simulated technical principles only (such as, for example, by ray-tracing).
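As a minimal illustration of such a ray-traced determination (a sketch only; the incident direction and surface tilt below are invented example values), the main propagation direction of light scattered from a sub-area can be estimated from the law of specular reflection applied to the sub-area's surface normal:

```python
import numpy as np

def main_scatter_direction(incident, normal):
    # Specular reflection r = d - 2 (d . n) n: the simplest
    # ray-traced estimate of the main propagation direction of
    # light scattered from a sub-area with unit normal n.
    d = np.asarray(incident, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# Plane wave travelling straight down (-z) onto a sub-area whose
# normal is tilted 22.5 degrees from vertical in the x-z plane:
# the reflected main direction is tilted 45 degrees from vertical.
incident = np.array([0.0, 0.0, -1.0])
t = np.radians(22.5)
normal = np.array([np.sin(t), 0.0, np.cos(t)])
r = main_scatter_direction(incident, normal)
angle_from_vertical = np.degrees(np.arccos(r[2]))
```

In a fuller ray-tracing model, diffuse or directional scatter distributions could replace the pure mirror rule; the point of the sketch is only that a main direction per sub-area can be computed from geometry alone, with no physical measurement.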
Thus, in practising such embodiments of the invention as in the preceding two paragraphs, the three-dimensional object or scene, whose light scattered therefrom is used to determine and calculate the encoding in the optical material of the respective pixels' optical structures of the respective phase-modulated non-planar wavefront components forming the respective portions of the image to be formed, may be either a real-life object or scene, or a scaled version or a model of a real-life object or scene, or it may be a virtual or theoretical or modelled or simulated object or scene.
Moreover, for practising embodiments of the invention, the object or scene - whose scattered light therefrom is represented by the image to be formed by the device of the invention - may in principle be substantially unlimited in its identity and/or nature, and typically it may be or comprise any three-dimensional object (which term as used herein also encompasses plural objects, e.g. in close proximity to one another) that one may wish to record and re-create an image of, or it may be or comprise any three-dimensional scene (e.g. an indoors or outdoors scene, perhaps including any kind and number of objects or people) again that one may wish to record and re-create an image of.
Accordingly, following the general principles underpinning the present invention, it may be said that, in many embodiments of the invention, a key feature thereof is that the scattered light from the three-dimensional object or scene, and in particular the scattered light from each respective sub-area on the pre-selected surface of the three-dimensional object or scene, is directionally correlated to the respective portion of the output light pattern which is generated by each respective pixel of the device and which constitutes the representation of the scattered light from that respective sub-area on the pre-selected surface of the three-dimensional object or scene - this being true for various configurations of the incident light, in particular for the incident light configured in various ways for the purpose of the encoding, e.g. especially in embodiments where the incident light interacts only with one pre-selected surface of the three-dimensional object or scene.
Thus, in general terms, the present invention brings to the art a new category of optically variable image devices based on a new technical approach and techniques for devising and constructing them. Such new devices, and their associated methods of production and methods of formation of observable optically variable images therefrom, may in many practised embodiments provide optically variable images which have a three-dimensional appearance, and in some embodiments may form achromatic optically variable images.
However, the devices according to the invention are based on a different manner of design and construction from known optically variable image devices that have been used hitherto to form these kinds of images. In the practising of embodiments of the invention, the control of the appearance of the formed image from pixel to pixel of the device is rooted in the design and construction method and how the apportioned image to be formed is recorded in encoded form in the pixels of the device. As a result, the optical structures formed in the optical material of the pixels of devices according to embodiments of the invention are different in their structural or geometric, and thus their optical, characteristics and how they encode the various portions of the image to be formed as compared with equivalent optical structures of known optically variable image devices that rely on known techniques for converting the geometrical surface shape of a three-dimensional object or scene (e.g. a bas-relief type of object) into an encoded recorded, and thus re-creatable, form.
Furthermore, and very usefully, by use of the invention it may now be possible, or possible more easily, to integrate into the optical structures of pixels of optically variable image devices according to embodiments of the invention one or more additional or auxiliary optical features, effects, images or objects -such as various unique overt or covert features, effects, images or objects, e.g. for enhanced security or authentication purposes -within observable optically variable images that are formable by such devices.
The present invention thus offers a new approach to encoding information about a three-dimensional real or virtual object or scene into the overall optical construction of an optically variable image device based on a layer of optical material in defined pixels of which the apportioned image to be formed is recorded in encoded form. Unlike many hitherto known techniques in the art, such as those mentioned in the "Background of the Invention and Prior Art" discussion hereinabove, the new approach provided by the present invention is based on encoding properties of light scattered, e.g. either in reflection or transmission, from a real or virtual three-dimensional object or scene into the optical material of an arrangement of pixels, which upon illumination of those pixels forms scattered light which can be observed as a representation or resemblance or imitation of the original three-dimensional object or scene. However, also unlike a classical known holographic approach to forming/recreating such kinds of images, which can nevertheless provide true three-dimensional perception of an object or a scene in the process of reconstructing recorded light scattered by such an object or scene, the new approach provided by embodiments of the present invention enables the formation of an image which is an imitation or simulation or approximation of a three-dimensional object or scene which may be substantially without viewing parallax, but with variable scatter properties of the optical structures of the device's pixels under various illumination conditions.
Furthermore, the new encoding approach provided by the invention in some embodiments thereof enables one to easily (or more easily) include or incorporate, if it should be desired or advantageous, into the optical material of the pixels of an optically variable image device one or more additional or auxiliary images, objects, image components, features, information or data, e.g. one or more covert or auxiliary images or image components or one or more images or image components that may for instance require specific lighting conditions in order to be observable. Such incorporation of one or more additional or auxiliary images, objects, image components, features, information or data may for instance be effected by either (i) modifying or modulating the calculated encoding of the respective phase-modulated non-planar wavefront component(s) themselves, or (ii) adding to or incorporating or including or introducing into or superimposing onto the calculated encoding of the respective phase-modulated non-planar wavefront component(s) themselves one or more additional or auxiliary light beam or wavefront function(s) which define(s) or carry(ies) or superimpose(s) thereon the one or more additional or auxiliary images, objects, image components, features, information or data.
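The superimposition route of option (ii) above can be illustrated with a short sketch. This is purely illustrative and not part of the disclosure: the function names, the sampled per-pixel phase representation, and the wrapping into one phase period are all assumptions made for the example.

```python
import math

TWO_PI = 2 * math.pi

def wrap_phase(phi):
    """Wrap a phase value (radians) into the interval [0, 2*pi)."""
    return phi % TWO_PI

def superimpose(primary_phase, auxiliary_phase):
    """Superimpose an auxiliary wavefront function onto the calculated
    encoding of a pixel's phase-modulated wavefront component, sample
    by sample, wrapping the sum back into a single phase period.
    Both inputs are equal-length lists of phase samples in radians."""
    return [wrap_phase(p + a) for p, a in zip(primary_phase, auxiliary_phase)]
```

In this reading, the auxiliary image or data is carried entirely in the added phase term, so it perturbs the primary encoding without replacing it.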
For example, by practising some such embodiments of the present invention that involve incorporating into the encoded recording in the pixels of the device the one or more additional or auxiliary images, image components, features, information or data, the novel encoding approach offered by the invention may enable an observer to inspect the authenticity of the optically variable image device more assuredly and unambiguously, since it may yield properties of or information about its overall optical construction with respect to its physical form or optical function(s) which may be significantly different from similar optical structures recorded according to current known encoding methods in comparable devices. This may for example be of particularly advantageous use in forensic authentication analysis scenarios. Therefore, embodiments of the uniquely designed and constructed overall optical construction of the optically variable image device of this invention may generate a unique form of one or more overt images or visual effects -or in some cases even a unique combination of overt and covert images or visual effects -that are viewable either by the naked eye or via a reading device or under specific lighting and/or viewing conditions, thereby significantly increasing levels of security or authentication ability in such applications.
Thus, embodiments of the optically variable image device of the present invention may be used to good and useful effect in a wide variety of end applications, such as brand protection, document or banknote protection, passports, visas and ID cards or documents, driving licences, membership cards, tickets, certificates, packaging, works of art, antiques or other valuable items, etc.

In some embodiments of the invention in its various aspects, the said respective non-planar wavefront components formed by the respective pixels may each be arcuate in shape.
Moreover, and in accordance with many such embodiments of the invention, the said respective arcuate non-planar wavefront components formed by the respective pixels, upon the respective pixels being illuminated under the above-defined pre-selected illumination conditions, may each have a respective main propagation direction that is either substantially equal to or is at least dependent on a respective main propagation direction of the light scattered from the respective sub-area on the pre-selected surface of the object or scene that corresponds by the said mapping to the respective pixel in the pixel arrangement in which is encoded its respective representation, upon the three-dimensional object or scene being illuminated under those said pre-selected illumination conditions.
In some such embodiments of the invention in its various aspects, the said respective arcuate non-planar wavefront components formed by the respective pixels may each be selected from any of the following shapes or geometric functions or forms: spherical, parabolic, hyperbolic, Gaussian, toroidal, cylindrical, concave-or convex-modified versions of any of the aforementioned spherical or parabolic or hyperbolic or Gaussian or toroidal or cylindrical geometric forms, a function which can be described by Zernike polynomials of at least 2nd or higher orders, symmetrical or asymmetrical forms of any of the aforesaid geometric functions or forms, distorted or aberrated or imperfect versions or renderings of any of the aforesaid geometric functions or forms, and combinations of any two or more of any of the aforesaid geometric functions or forms or distorted, aberrated or imperfect versions/renderings thereof.
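As a purely illustrative numerical sketch of how two of the simpler arcuate wavefront shapes listed above might be described, the paraxial (parabolic) and cylindrical phase profiles can be written as follows. The parabolic form as an approximation of a spherical wavefront, and all parameter names and units, are assumptions for the example only:

```python
import math

def parabolic_phase(x_um, y_um, wavelength_um, focal_um):
    """Paraxial (parabolic) phase profile approximating a converging
    spherical wavefront at in-pixel coordinates (x, y); all lengths
    are in micrometres, result in radians."""
    return math.pi * (x_um**2 + y_um**2) / (wavelength_um * focal_um)

def cylindrical_phase(x_um, wavelength_um, focal_um):
    """Cylindrical variant: the same curvature along one axis only."""
    return math.pi * x_um**2 / (wavelength_um * focal_um)
```

The other listed forms (toroidal, Gaussian, higher-order Zernike terms, and distorted or asymmetric variants) would simply substitute different functions of (x, y) in the same per-pixel sampling scheme.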
In some embodiments of the invention in its various aspects, for each portion of the scattered light from the three-dimensional object or scene, i.e. for each portion of the scattered light that is scattered from the respective sub-area located on the pre-selected surface of the object or scene, the said scattered light may have, prior to its scatter from the object or scene, a substantially same-shaped planar wavefront and substantially the same wavelength and angle of incidence on the object or scene as does, prior to its transformation into the respective portion of the output light pattern by a respective pixel in the pixel arrangement, the light incident on that respective pixel in which is encoded the representation of that respective portion of the said scattered light from the object or scene.
In some embodiments of the invention in its various aspects, each said respective non-planar wavefront component formed by each respective pixel, upon that respective pixel being illuminated under the said pre-selected illumination conditions, may have a respective plurality of main propagation directions each of which is either substantially equal to or is at least dependent on a respective one of a corresponding plurality of main propagation directions of the light scattered from the respective sub-area on the pre-selected surface of the three-dimensional object or scene under the said pre-selected illumination conditions that corresponds by mapping to the respective pixel in the pixel arrangement in which is encoded its representation, upon the three-dimensional object or scene being illuminated under the said pre-selected illumination conditions.
In some such embodiments the said plurality of main propagation directions of the light scattered from each respective sub-area on the pre-selected surface of the three-dimensional object or scene may be determined from directional constituents of the said scattered light which have local intensity maxima therein and are separated from each other by one or more other constituents of the said scattered light which have intensities of less than about half the intensity of each of the said directional constituents of the said scattered light having the local intensity maxima therein.
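A simplified one-dimensional reading of this separation criterion can be sketched as follows. This is a rough sketch under assumed sampling of the scattered intensity over angle; the names, the sampling, and the exact form of the half-intensity test are illustrative assumptions, not the disclosure's definition:

```python
def main_directions(angles_deg, intensities):
    """Pick candidate main propagation directions: local intensity
    maxima that are separated from the previously kept maximum by at
    least one angular sample whose intensity falls below half the
    weaker of the two maxima (a simplified 1-D reading of the text)."""
    peaks = [i for i in range(1, len(intensities) - 1)
             if intensities[i] > intensities[i - 1]
             and intensities[i] > intensities[i + 1]]
    kept = []
    for p in peaks:
        if not kept:
            kept.append(p)
            continue
        q = kept[-1]
        valley = min(intensities[q:p + 1])
        if valley < 0.5 * min(intensities[p], intensities[q]):
            kept.append(p)
    return [angles_deg[p] for p in kept]
```

Two nearby maxima whose connecting angular region never drops below the half-intensity level would, under this reading, count as a single main propagation direction.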
In some embodiments of the invention in its various aspects, one or more of the said respective non-planar wavefront components of the output light pattern formed by the respective pixels and each defining a respective main propagation direction of the respective portion of the output light pattern may comprise or may be decomposable into a collection or fan or cone of at least two or more, or a plurality of, discrete propagating constituent light waves each propagating in a respective one of a plurality of discrete constituent output light propagation directions. (It is to be understood that the reference here to a respective discrete "constituent output light propagation direction" means, and may be alternatively termed, a respective discrete "sub-main propagation direction" of that respective propagating constituent light wave in the collection/fan/cone of the plurality thereof. In other words, each such discrete propagating constituent light wave of the collection/fan/cone of the plurality thereof may have its own respective "sub-main propagation direction" which is defined in a directly analogous way to the "main propagation direction" of a non-planar wavefront component itself (that is formed by a respective pixel) or of light scattered from a respective sub-area on the pre-selected surface of the object or scene -as defined as such herein near the beginning of this disclosure.)
In some such embodiments the main propagation direction of the respective portion of the output light pattern comprising the collection or fan or cone of discrete constituent light waves may be either (i) the direction of maximum intensity of those discrete constituent light waves, or (ii) an average propagation direction of those discrete constituent light waves calculated as a weighted average of their discrete constituent output light propagation directions with weighting according to their respective intensities.
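Option (ii) above amounts to an intensity-weighted vector average of the constituent directions, which might be sketched like this (the representation of each direction as a 3-D unit vector, and all names, are assumptions for illustration):

```python
import math

def average_direction(unit_vectors, intensities):
    """Intensity-weighted average propagation direction of a fan of
    discrete constituent light waves: sum the (x, y, z) unit direction
    vectors weighted by intensity, then renormalise to a unit vector."""
    total = sum(intensities)
    mean = [sum(v[i] * w for v, w in zip(unit_vectors, intensities)) / total
            for i in range(3)]
    norm = math.sqrt(sum(c * c for c in mean))
    return tuple(c / norm for c in mean)
```

For two constituents of equal intensity the result lies along their bisector, as one would expect of a weighted average.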
In some embodiments of the invention in its various aspects, the one or more respective non-planar wavefront components, or optionally or especially the one or more respective non-planar wavefront components with continuous wavefront forms, which each defines a respective main propagation direction of the respective portion of the output light pattern, may each have an angular width of at least about 5°, especially at least about 10°, or perhaps even at least about 15°, and in each case the said angular width being in at least one angular (i.e. azimuthal) direction, especially in all angular directions.
In some embodiments of the optically variable image device of the invention, the layer of optical material, which is provided with the arrangement of the plurality of pixels defined or definable thereon and has the said optical properties that are capable of modulating the phase of light incident thereon, may constitute, or may be formed or defined by at least a portion of, optionally by a portion of the thickness thereof that is less than the whole thickness of, a body of the said optical material. Optionally such a body of the optical material may be in the form of a relatively thin layer, plate, sheet, web, foil or film of that optical material. By "relatively thin" here is meant that the body of the optical material has a thickness that is relatively thin in comparison with the width and/or length of its major faces, e.g. its thickness is less than about 0.5 or 0.3 or 0.1 or 0.075 or 0.05 or 0.025 or 0.01 times its major faces' width and/or length.
Thus, in some embodiments of the invention, the "layer" may be formed or defined by or may be constituted by such a relatively thin body of the said optical material, especially such a thin body per se, e.g. such a relatively thin body whose substantially total thickness forms, defines or constitutes the said layer.
However, in other embodiments of the invention, the "layer" may be formed or defined by or may be constituted by just a portion of, i.e. a portion less than the whole of, especially just a portion, less than the whole, of the overall thickness of, a larger (especially a thicker) body of the said optical material. Accordingly, in such embodiment cases, it is to be understood that as the term "layer" is used throughout this disclosure, that term "layer" is to be interpreted in such cases as meaning and referring to only that portion (especially just that thickness portion) of such a larger (especially thicker) body of the optical material which actually bounds or envelopes the actual modifying/modulating portion of the optical material of that body which comprises the respective optical structure(s) (in the respective pixels) with the respective modulated optical properties and which is designed and/or used to effect the modulation of the light's phase properties according to the fundamental principles of the invention.
Thus, it follows that, although broadly speaking the optical material usable in the invention may be any such material which has inherent or applied optical properties (such as refractive index, optical surface relief height/depth and/or width of relief features, light absorption or reflection properties) that can be modified or modulated across and/or along and/or through at least part of or a portion of a body of that material, it is to be understood that in embodiments of the invention such modification(s) or modulation(s) of the material's inherent or applied optical properties may be so modified or modulated across and/or along and/or through either (i) substantially the whole of one or more dimensions (i.e. length and/or width and/or (especially) thickness) of such a body of the optical material, or (ii) only part of or through only one or more portions, less than the whole, of the one or more dimensions (i.e. length and/or width and/or (especially) thickness) of such a body of the optical material. In particular, in some such embodiments, such modification(s) or modulation(s) of the material's inherent or applied optical properties may be so modified or modulated through either (i) substantially the whole of the thickness of such a body of the optical material, or (ii) only part of or through only one or more portions, less than the whole, of the thickness of such a body of the optical material.
Accordingly, in those more specific embodiments as defined in the preceding paragraph in which the modification(s) or modulation(s) of the material's inherent or applied optical properties is/are so modified or modulated through only part of or through only one or more portions, less than the whole, of the thickness of such a body of the optical material, the term "layer" as used herein refers to, and means, just that portion of the body that actually comprises the respective optical structures (in the respective pixels) with the respective modulated optical properties and which is designed and/or used to effect the modulation of the light's phase properties according to the fundamental principles of the invention.
Furthermore, in all such embodiments as in the preceding two paragraphs, it may desirably be that such modification(s) or modulation(s) of the material's inherent or applied optical properties is effectable independently for each pixel in the arrangement.
Thus, in many practical embodiments of the invention, and especially independently within each pixel, the thickness of the "layer" may be understood as being defined by a distance between opposed surfaces -i.e. which may be physical (i.e. real) or virtual (i.e. notional) surfaces -of the body or the relevant part/portion of the body (especially the relevant part/portion of its thickness) which envelope the actual modifying/modulating portion of the optical material which is designed or used to modulate the light's phase properties according to the invention. The layer-defining envelope surface, or portion thereof, of the body which receives the incident light first of all (i.e. the incident light which the layer then goes on to modulate the phase properties of) may be termed the "input" or "entrance" surface. The layer-defining envelope surface, or portion thereof, from which the phase-modulated light emerges may be termed the "output" or "exit" surface. In some embodiment forms, in particular where the layer (especially independently within each pixel) is designed to phase-modulate the incident light upon reflection, the output/exit surface may be the same surface as the input/entrance surface. On the other hand, in other embodiment forms, in particular where the layer (especially independently within each pixel) is designed to phase-modulate the incident light in transmission, the output/exit surface may be a different surface, especially an opposite surface, from the input/entrance surface.
In some embodiments of the invention, the "layer" of optical material -which is provided with the arrangement of the plurality of pixels defined or definable thereon and comprises the respective optical structures (in the respective pixels) with the respective modulated optical properties and which is designed and/or used to effect the modulation of the light's phase properties according to the fundamental principles of the invention -may typically have a thickness of from about 0.5 up to about 50 µm, more especially from about 0.5 up to about 10 or 20 or 30 µm, and even more especially from about 0.5 up to about 3 or 5 µm.
Thus, the above desirable thickness dimensions of the said "layer" are to be understood as referring to the thickness of the actual layer whose surfaces (real or virtual) envelope the actual modifying/modulating portion of the optical material which is designed or used to modulate the light's phase properties according to the invention (e.g. perhaps being part of a larger (especially thicker) body of the optical material).
In many practical embodiments of the invention in its various aspects, the optical material that forms or defines the said layer may be a material that is capable of providing, or being formed into or having formed therein or thereon, in each respective pixel of the arrangement, and optionally independently for each pixel, a respective optical structure that interacts with incident light transmitted therethrough or reflected therefrom and is capable of modulating one or more optical properties of that light, including modulating the phase of the light wave that interacts therewith, whereby the light wavefront becomes distorted or warped or otherwise shape-modified into a non-planar shape, and is capable of having a record of an optical image or portion of an optical image encoded therewithin or thereon; wherein the optical structure within at least a portion of each respective pixel is formed or provided by the optical material that forms or defines the said layer by virtue of either (a) the modulatable inherent optical properties of the said optical material itself in the region thereof defining the respective pixel or the said portion thereof, or (b) the respective optical structure being applied to the optical material (optionally onto or into one or more surface(s) or surface layer(s) of the optical material (i.e. where "surface layer" here and further below means a layer or thickness portion located at or directly beneath or immediately adjacent a surface of the optical material)) in the region thereof that defines the respective pixel or the said portion thereof, or optionally (c) the respective optical structure being formed at least in part by an additional optically functional layer applied to the optical material (optionally onto or into one or more surface(s) or surface layer(s) thereof) in the region thereof that defines the respective pixel or the said portion thereof; and further wherein the optical material that forms or defines the said layer is such a material which has, or has applied thereto or has formed therein or thereon, any one or more of, or any combination of any plurality of, the following: (i) inherent optical properties -especially refractive index -that can be modified or modulated across and/or along and/or through at least part of the thickness of the said layer (i.e. through either substantially the whole thickness of the said layer or alternatively through only part of (i.e. a portion of, less than the whole of) the thickness of the said layer), or (ii) optical surface relief (especially at least partially reflective or refractive optical surface relief) applied to or formed on or in at least one surface of, or within a surface layer of, the said layer, which relief has variations in its surface relief height/depth and/or width of its relief features, and/or variations in its relief layer thickness, or (iii) light absorption or reflection properties which can be varied across and/or along and/or through part of or all of the material of the said layer.
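For option (ii), the first-order relation between surface-relief depth and the phase it imparts in transmission can be sketched as follows. This is an idealised thin-element approximation offered only for illustration; the refractive-index values in the example are assumed, not taken from the disclosure:

```python
import math

def relief_depth_um(phase_rad, wavelength_um, n_relief, n_ambient=1.0):
    """Relief depth (micrometres) giving a phase delay of phase_rad in
    transmission: h = phi * lambda / (2*pi*(n_relief - n_ambient)),
    the standard thin-element model for a refractive surface relief."""
    return phase_rad * wavelength_um / (2 * math.pi * (n_relief - n_ambient))
```

For example, under this model a full 2π phase step at a 0.5 µm wavelength in a resist of assumed refractive index 1.5 against air corresponds to a relief depth of 1 µm, which is consistent with the layer thicknesses of a few micrometres discussed above.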
In some such embodiments as in the preceding paragraph, such variations in optical properties of the optical material that forms or defines the said layer as per any one or more of (i), (ii) or (iii) above may be manifested either (iv) within each of the various pixels over at least a portion of the area of each pixel and independently for each pixel, or (v) over one or more groups of a respective plurality of pixels or a respective plurality of portions of the areas of that plurality's pixels and independently for the or each such group of pixels or pixel portions.
In some such embodiments as in the preceding-but-one paragraph, the said additional optically functional layer may comprise a layer capable of modifying the phase of a light wave interacting therewith and/or of increasing or suppressing the reflectivity and/or transmission -optionally selectively with respect to a wavelength or wavelength range/band, or an angle of incidence, of the interacting light -by its material properties (e.g. refractive index, high refractive index in particular (i.e. higher than standard plastics) and/or its interference properties). Such an additional optically functional layer may in some instances take the form of a coated or deposited layer or even a stack of sub-layers, for example formed using printing techniques or vacuum deposition techniques used for forming thin single-or multilayer films of optically functional material.
It is to be understood in this overall disclosure, that in referring to one or more "pixels" of the plurality thereof in the pixel arrangement, each of which pixels has recorded therein in encoded form a respective portion of the image to be formed (where each respective portion of the image to be formed that is recorded in such a respective pixel is encoded therein as a representation of a respective portion of the said scattered light from the three-dimensional object or scene), what is meant and is/are being referred to is/are that/those pixel(s) of the plurality thereof in the pixel arrangement which are actually "active" pixels -which is to say, that/those pixel(s) which actually have the modulated optical properties which form the characteristic phase-modulated non-planar wavefront component(s) of the output light pattern. Thus, it may be possible within the scope of certain embodiments of the invention for the arrangement of pixels to contain one or more other "inactive" pixels (e.g. in one or more island(s) or patch(es) therein or distributed thereacross) which do not have the said modulated optical properties and so do not form a characteristic said phase-modulated non-planar wavefront component of the output light pattern. Such "inactive" pixel(s) is/are therefore not deemed to be part of the "arrangement" thereof that acts to form the important phase-modulated non-planar wavefront components that are key to the present invention, although such "inactive" pixel(s) could still be present in the device of the invention and perhaps contribute somehow to the overall output light pattern in one or more other ways. 
In other words, in such embodiments which may indeed contain one or more such "inactive" pixels, it may alternatively be said that, within the overall scope of the invention, not all mapped pixels in a given pixel arrangement need necessarily to be used to form portions of the image of the three-dimensional object or scene in question, or certain one(s) of the pixel(s) in the arrangement could be masked, or even interlaced with other pixels with some other optical function, so as to form some other image component/feature or optical effect or element of patterning that still makes a contribution to the overall output light pattern, but in other way(s) from the key phase-modulated non-planar wavefront component(s) that are unique to this invention.
Practical examples of such optical materials suitable for use as the layer in devices according to embodiments of the invention are well-known per se in the art, and may include, especially in the case of optical modulation of the refractive index of the optical material within the layer, typically photopolymers, e.g. Bayfol® HX 200 or DuPont HRF-150-38 photopolymer, as well as materials based on dichromated gelatin (DCG) such as e.g. PFG04 from Slavich. However, as is more commonly used in the art, materials allowing local relief modulation of the material surface -especially photoresists and e-beam resists -may also be used. In practical embodiments of the present invention a particular optical material that is most suited for use may depend on the technology of recording or writing in the pixels themselves. For example, in the case of electron beam writing, electron resists may be most suitable, e.g. based on PMMA (polymethyl methacrylate), whilst in the case of optical beam writing, photoresists may more suitably be used, e.g. Microposit S1818 G2 from Dow Chemicals or Ormoclear from Micro Resist Technology.
In certain other embodiment devices, still other suitable optical materials (besides those listed above, which are primarily photo- or electron-sensitive materials, which are suitable primarily for direct recording by light or electrons of the encoded information), such as, for example, metals (e.g. Al, Au, Cu, Ag, etc) or plastics (e.g. PC, PET, PMMA, etc) or even UV-curable polymers, may form the layer, and these may be further useful in that a previously formed optical surface relief applied to at least one surface thereof may be more readily copied (e.g. by electroforming, embossing, micro-moulding or UV-curing processes) in a replication process. Furthermore, such materials may be able to be more readily combined with other materials (e.g. metals, such as Al, Cu, Au, Ag, etc, or oxides or halides, such as MgF2, SiO2, TiO2, Al2O3, Ta2O5, etc) which may be used to form a thin film layer or a stack of layers (e.g. stacked fully or partially) on one or more faces of the layer by various known techniques (e.g. by PVD or sputtering processes) or on optical surface relief pre-formed on one or more faces of the layer, or which may be used to laminate or fill its formed relief (e.g. coated or uncoated) with other material(s) (e.g. plastics or polymers of various specific optical properties, such as refractive index, colour, etc), such as may be employed in certain embodiments as defined/described further below.
The thickness of the layer of optical material, which is provided with the arrangement of the plurality of pixels defined or definable thereon, may constitute either substantially the whole thickness of the device itself, or alternatively just a portion of (i.e. a portion less than the whole of) the thickness thereof. Likewise the facial surface area of the layer of optical material may constitute either substantially the whole of the facial surface area of the device itself, or alternatively just a portion of (i.e. a portion less than the whole of) the facial surface area thereof. In either case where the layer of optical material constitutes just a portion of the thickness and/or facial surface area of the device itself, the optical material layer may be provided either (i) as a discrete layer (e.g. manufactured separately) which is applied or mounted onto a surface or face of -or alternatively embedded in a facial portion of -a suitable carrier, e.g. a body of an optically non-functional or non-phase-modulating carrier material, or (ii) as an integral, inseparable surface layer of a unitary such carrier, and in such a case the layer thickness may be defined as a distance between opposite faces (i.e. physical or virtual) that envelope the actual portion of the carrier material which exhibits the modulating optical properties (as exploited by the present invention). Optionally, in embodiments where the optical material layer is provided as a discrete layer applied or mounted onto or into a surface or face of a carrier, it may (as desired or as appropriate) be so applied or mounted on/in the carrier either directly or indirectly via at least one intermediate mounting layer which is itself mounted on/in the carrier and serves as an additional structural layer component of the complete device.
As practical examples of suitable carrier materials (and intermediate layer materials, if used) may be mentioned polymeric carriers such as polycarbonate (PC), PET, polypropylene, SAN, etc. If desired or appropriate, deposited thin films with a different refractive index, such as ZnS, SiO2, TiO2, may be used to treat a surface of optical relief or to optically separate surface relief, especially for example before lamination or other encapsulation between subsequent layers (if such is employed in certain embodiments of the invention). Also if desired or appropriate, various overcoat or potting varnishes with increased refractive index achieved by e.g. dispersed TiO2 nanoparticles or sol-gels containing TiO2 particles, e.g. from BASF, may also be suitable for use in certain embodiments of the invention.
In embodiments of the device of the invention, especially independently within each pixel, the layer of optical material, or a body of the optical material comprising the said layer, may be either substantially planar (i.e. substantially flat) or it may be curved or arcuate in one or two or three dimensions. In the case of the layer being applied or mounted (either directly or indirectly via an intermediate layer) onto a surface or face of, or alternatively embedded in a facial portion of, a carrier, the optical material layer may typically take on or adopt the surface or facial shape of the carrier, whereby the optical material layer assumes the same or a geometrically similar general shape as the carrier and thus as the overall optically variable image device itself.
In many embodiments of the invention in its various aspects, the optical material layer, or at least a portion thereof, may be subdivided into one or more groups of adjacent or neighbouring pluralities of sub-areas, each such sub-area being termed a said "pixel", as that term is used herein. In accordance with the invention, each pixel has recorded therein -i.e. has recorded in the optical material of the respective pixel -in encoded form a respective portion of the optically variable image to be formed by the device.
The shape of each pixel, and also the or each group of adjacent or neighbouring pixels, may take any suitable geometric shape, especially when viewed in plan (i.e. normal to a general plane of the layer). A typical pixel may for example be square or rectangular in plan shape.
However, other pixel shapes, both regular or irregular or even mixtures of regular and irregular shapes as between different pixels, may be possible, such as polygons, and especially those polygons which tessellate, e.g. hexagons or triangles. The typical size, i.e. average width, of each pixel may range from a few microns, e.g. from about 3 or 5 or 10 or 20 µm, up to a few or several tens or hundreds of microns, e.g. up to about 50 or 100 or 200 or 300 or 400 or 500 µm. Usually, however, for most regular or irregular shapes of pixels, as a rule of thumb, an inscribed circle area that each pixel occupies may be in a diameter size range of from at least several microns, e.g. at least about 3 or 5 or 10 µm, up to at most a few or several tens or hundreds of microns, e.g. at most about 100 or 200 or 300 µm. More especially in many practical embodiments, a typical diameter size range of an inscribed circle area that each pixel occupies may be in the approximate range of from about 5 or 10 µm up to about 100 or 200 µm, more particularly from about 5 µm up to about 100 µm, e.g. from about 5 or 10 µm up to about 40 or 50 or 60 µm. Within any given group of pixels, the pixels may substantially all be of substantially the same size, or alternatively their sizes may vary. Alternatively or additionally, within any given group of pixels, the pixels may substantially all be of substantially the same geometrical shape, or alternatively their geometrical shapes may vary. Furthermore, in the case of a plurality of groups of pixels, the sizes and/or shapes of the pixels, as between different ones of the groups thereof, may either be substantially the same or alternatively they may vary.
In practising some embodiments of the invention in its various aspects, any given non-planar wavefront component of the output light pattern (which component forms a respective portion of the image to be formed) that is generated by a given pixel under illumination by a collimated beam of incident light or a light beam with a narrow angular width (e.g. typically of or less than about 5 or 10°), where the wavefront of the non-planar wavefront component has an arcuate shape (e.g. substantially spherical in shape) and/or has an angular spread (especially a substantially continuous angular spread) in all azimuthal directions of typically of or greater than about 5 or 10°, may -for the purpose of its observing or viewing -be projected onto a screen of some kind (which "screen" could be a physical screen or detector of some kind or even the retina of a human eye) at a distance which is substantially larger than (e.g. is typically larger than by plural orders of magnitude) the given pixel's size, i.e. the average pixel width. Such a projection may form an image originating from that pixel that is thus magnified in shape and/or area as compared with the actual pixel shape/area. The magnitude of the magnification may be proportional to the angular width (i.e. spread) of the non-planar wavefront component and the observation distance. The pixel shape or pixel area may be observable even by the human naked eye (i.e. projected onto the eye's retina), especially over any area of the layer of the device that contains pixels of substantially the same size and shape, thereby generating respective non-planar wavefront components of substantially the same angular width (i.e. 
spread) as each other, and where that area of the layer has (from the viewing point of the observer) an angular width which is approximately the same as or larger than the angular width of the said non-planar wavefront components, and further where the main propagation directions of the said non-planar wavefront components change only slowly therealong and/or vary only within the range of their angular width (i.e. spread). Such a visual projection of the pixel size or pixel area of the respective pixel may not be possible if the wavefront generated by that pixel is substantially planar, i.e. it has a substantially planar shape.
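The magnification relationship described above can be illustrated numerically. The following sketch is not part of the specification; the function name, pixel size, angular spread and viewing distance are illustrative assumptions, using the small-angle geometry that the projected spot width is roughly twice the observation distance times the tangent of half the angular spread.

```python
import math

def projected_width_m(angular_spread_deg: float, distance_m: float) -> float:
    """Approximate width of the projected spot formed by one pixel's
    non-planar wavefront component on a screen at the given distance."""
    half_angle = math.radians(angular_spread_deg) / 2.0
    return 2.0 * distance_m * math.tan(half_angle)

pixel_width_m = 20e-6      # a 20 um pixel, within the size range given above
spread_deg = 10.0          # assumed angular spread of the non-planar component
viewing_distance_m = 0.3   # assumed naked-eye viewing distance

spot = projected_width_m(spread_deg, viewing_distance_m)
magnification = spot / pixel_width_m
print(f"projected spot ~ {spot*1e3:.1f} mm, magnification ~ {magnification:.0f}x")
```

With these assumed numbers the spot is of the order of tens of millimetres across, i.e. magnified by several orders of magnitude relative to the 20 µm pixel, consistent with the observation-distance argument in the text.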
In typical practical embodiments, the or each group of pixels may comprise a set or array of pixels arranged regularly or uniformly, especially in the form of a generally square or rectangular grid pattern, optionally with one or more corner regions of such a grid pattern which are rounded or bevelled or truncated in form rather than being strictly angular. However, other grid shapes of a pixel set or array may be possible. Thus, in any given group of pixels, the pixels may be arranged regularly or uniformly -e.g. as a square, rectangular, triangular, hexagonal or brick-wall-type arrangement -or alternatively they may be arranged irregularly or non-uniformly, especially for example depending on the shapes of the individual pixels within the or each group.
The subdivisions between immediately adjacent or neighbouring pixels which define their respective boundaries may be either real or notional. By "real" here is meant that the boundaries between immediately adjacent or neighbouring pixels are defined in practical terms, and are discernible or observable upon viewing of the pixels, by a change, especially a marked or stepwise or sudden change, in the optical properties of, or optical functions imparted by, the respective optical structures in those adjacent or neighbouring pixels. Alternatively, by "notional" here is meant that the boundaries between immediately adjacent or neighbouring pixels are defined implicitly either (i) by various projection vectors or projection lines that are used to define those pixels themselves by the said mapping from the boundaries of the corresponding respective sub-areas on the pre-selected surface of the three-dimensional object or scene (into which sub-areas the pre-selected surface is divided), or alternatively (ii) by various projection vectors or projection lines that are used to map the boundaries of the respective sub-areas on the pre-selected surface of the three-dimensional object or scene onto and thereby define the corresponding pixels themselves.
Furthermore, in practising some embodiments of the invention, such projection vectors or projection lines that define the said mapped correspondences between the respective pixels and the corresponding respective sub-areas on the pre-selected surface of the object or scene may be either (i) unidirectional or (ii) multidirectional, i.e. the projection vectors or projection lines either (i) all lie or point in substantially the same direction across the subareas on the pre-selected surface of the object or scene, or (ii) lie or point in varying directions across the sub-areas on the pre-selected surface of the object or scene.
In some such embodiments, in the mapped correspondences between the respective pixels and the corresponding respective sub-areas on the pre-selected surface of the object or scene, the arrangement of the pixels and the pre-selected surface of the three-dimensional object or scene may be either (iii) in substantially equal size proportions relative to one another, or (iv) scaled-up or scaled-down in size proportions relative to one another, wherein one of the arrangement of the pixels and the pre-selected surface of the three-dimensional object or scene is scaled-up or scaled-down in size (especially by an appropriate or suitable multiplier or percentage) relative to the other of the arrangement of the pixels and the preselected surface of the three-dimensional object or scene.
Furthermore, alternatively or additionally in some such embodiments, in the mapped correspondences between the respective pixels and the corresponding respective sub-areas on the pre-selected surface of the object or scene, the said mapping is done either (v) for substantially all the sub-areas on the pre-selected surface of the object or scene, or (vi) for some of, but not all of, the sub-areas on the pre-selected surface of the object or scene, optionally for one or more groups each of one or more adjacent or neighbouring sub-areas which collectively are less than all the sub-areas.
In embodiments of the invention, the image to be formed by the pixels of the device may comprise an image which resembles or imitates or is an approximation or simulation of the scattered light from at least one said pre-selected surface of the original three-dimensional object or scene. In embodiments, that at least one pre-selected surface of the original object or scene may be either (i) an outer surface of the original object or scene, or (ii) an inner surface of the original object or scene. Thus, in many embodiments, the image to be formed by the respective pixels of the device may comprise a plurality of respective image portions which resemble or imitate or are approximations or simulations of portions of scattered light from the respective sub-areas on the pre-selected surface of the original object or scene and which either pass through or are reflected from those sub-areas on that pre-selected surface.
In some such embodiments of the invention, the at least one pre-selected surface of the original object or scene whose light scattered therefrom is to be imitated by formation of an image thereof by the device may be able to receive incident light originating from outside the original object or scene, and may do so either directly or indirectly therefrom. Thus, in many embodiments of the invention, in principle it may be that it is not only that pre-selected surface whose light scattered therefrom is imitated by formation of an image thereof by the device, but it may instead be considered that it is the light scattered by the entire object or scene that is being imitated by formation of an image thereof by the device. The pre-selected surface only defines which part of (i.e. which sub-areas of the relevant surface of) the object or scene the incident light has to pass through or be reflected from during its interaction with the object or scene to be considered for recording into the respective pixels corresponding to those respective sub-areas.
In some embodiments of the invention, the original object or scene may be either (i) transparent or semi-transparent, whereby the scattered light therefrom is produced at least in part upon transmission therethrough, or (ii) optically opaque or reflective, whereby the scattered light therefrom is produced substantially only upon reflection therefrom.
More particularly, in practising some such embodiments of the invention in its various aspects, the optically variable image device may, especially usefully, be designed and constructed for forming an observable optically variable image of light scattered from a three-dimensional object or scene where that three-dimensional object or scene is any of the following: (i) a substantially reflective three-dimensional object or scene, or (ii) a reflective object or scene comprising optical relief with at least some three-dimensional content (e.g. reflective optical relief in the form of an embossed or cast optical relief pattern or motif), or (iii) a reflective three-dimensional object or scene with an entrance surface (i.e. on a face or side of the object which first receives the incident light) in the form of an optical relief pattern or motif and an opposite exit surface (i.e. on the face or side of the object opposite its entrance surface) which is substantially flat or planar, or (iv) a substantially transparent three-dimensional object or scene with an entrance surface (i.e. on a face or side of the object which first receives the incident light) which is substantially flat or planar, and an opposite exit surface (i.e. on the face or side of the object opposite its entrance surface) in the form of an optical relief pattern or motif. Moreover, in those of the above options where an optical relief pattern or motif is employed, that optical relief pattern or motif may be an embossed or cast form thereof.
In some embodiments of the invention in its various aspects, the pre-selected surface of the three-dimensional object or scene may be described by a distance function D=f(s) relative to an entrance surface s of the said layer, the said function f defining a distance D between a selected or given point on the said layer's entrance surface s and a corresponding point on the pre-selected surface of the three-dimensional object or scene, where the distance D between the said points is measured along a respective projection vector or projection line defining a mapping relationship between a respective sub-area located on the pre-selected surface of the three-dimensional object or scene and a respective pixel (that is to record a corresponding respective portion of the image to be formed).
In some such embodiments the distance D between the said points may be measured along a respective projection vector or projection line being a normal to the surface s. Alternatively or additionally in some such embodiments, the function f may be substantially continuous yet non-uniform over either (i) substantially the whole of the said layer's entrance surface s, or alternatively (ii) substantially the whole of each of one or more portions of the said layer's entrance surface s on which are defined a respective group or array of the pixels.
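The distance function D = f(s) described above can be sketched in code. This is an illustration only, not the specification's method: the particular function (a spherical cap standing off a flat entrance surface), the grid size and the pixel pitch are all hypothetical, and the projection vectors are taken as normals to the entrance surface, as in the first option above.

```python
import math

def distance_function(x_m: float, y_m: float) -> float:
    """Hypothetical distance function D = f(s): a spherical cap standing
    1 mm off a flat entrance surface s (purely illustrative)."""
    r = math.hypot(x_m, y_m)
    cap_radius = 2e-3
    return 1e-3 + math.sqrt(max(cap_radius**2 - r**2, 0.0))

def pixel_to_surface_distances(pixel_pitch_m: float, n: int):
    """Sample D at each pixel centre of an n x n grid, i.e. measure the
    distance along the surface normal (the projection vector) from each
    pixel to its corresponding sub-area on the pre-selected surface."""
    half = (n - 1) / 2.0
    return [[distance_function((i - half) * pixel_pitch_m,
                               (j - half) * pixel_pitch_m)
             for i in range(n)] for j in range(n)]

# one distance per pixel, defining the per-pixel mapping relationship
grid = pixel_to_surface_distances(pixel_pitch_m=20e-6, n=5)
```

The resulting table of distances is continuous yet non-uniform over the pixel array, matching option (i)/(ii) of the preceding paragraph.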
In some embodiments of the invention, the phase modulation of the incident light wave or the calculation thereof may comprise the incident light wave's undergoing a phase delay or phase shift (which is to say, a shift of the position of the sinusoidal wave in its propagation direction) as compared with an incident light wave that does not interact with the optical structure in the respective pixel.
In some such embodiments the amount of the phase delay or phase shift may be dependent on the local optical and/or physical properties of the optical structure of the relevant pixel, wherein said local optical and/or physical properties of the optical structure are selected from: (i) its optical material's refractive index, and/or (ii) its optical surface relief profile height or depth and/or width of its relief features applied thereto, and/or (iii) the thickness of a layer thereof in which is formed its optical surface relief profile, and/or (iv) the composition of a single or a stack of a plurality of thin film layers applied to any one or more surfaces of the said optical material forming the layer, whereby the light wave exiting the optical structure of the relevant pixel has its wavefront phase-modified compared to the wavefront of the original incident light wave.
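For the simplest of the listed dependencies, a transmissive surface relief feature open to the ambient medium, the phase delay can be written down directly. The function below is an illustrative sketch (its name and example values are not from the specification), using the standard relation that the extra phase is 2π times the relief height times the refractive index difference, divided by the wavelength.

```python
import math

def phase_shift_transmissive_relief(height_m: float, n_material: float,
                                    wavelength_m: float,
                                    n_ambient: float = 1.0) -> float:
    """Phase delay (radians) added by a transmissive relief feature of the
    given height, relative to light passing through the ambient medium only:
    delta_phi = 2*pi * h * (n_material - n_ambient) / lambda."""
    return 2.0 * math.pi * height_m * (n_material - n_ambient) / wavelength_m

# e.g. a 550 nm-deep relief step in a polymer of n ~ 1.5, at 550 nm (green):
dphi = phase_shift_transmissive_relief(550e-9, 1.5, 550e-9)
```

With these assumed values the step delays the wave by exactly half a cycle (π radians), showing how relief depth translates into wavefront phase modification.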
In some embodiments of the invention, the interaction between the incident light wave and the respective optical structure of each respective pixel may be of the nature of, or may be predominantly of the nature of, transmission, reflection or absorption, or a combination of any two or more thereof, and the respective optical structure of the respective pixel may be constructed or formed so as to impart thereto the required phase modulation capability for the purpose of forming a respective non-planar wavefront component forming a respective portion of the image to be formed by that pixel, wherein the said respective optical structure may comprise any one or more of the following features: (i) localized variation(s) in the refractive index of the optical material forming the layer in the region thereof defining the respective pixel; (ii) variation(s) in the thickness of a transparent optical relief structure or relief layer within the layer in the region thereof defining the respective pixel and which is open to the air to one side thereof, e.g. whereby a variable path length is created within the transparent optical relief structure due to a variable ratio between the air and material thicknesses as the light wave propagates through a specific location or path trajectory within the respective optical structure; (iii) the same feature as feature (ii) above, but modified such that the air is replaced with a different transparent material with a different refractive index from the material in which the optical relief is formed (optionally wherein an additional thin sub-layer of optical material is applied onto the optical relief between (i.e. 
sandwiched between) the two material sub-layers of this option (iii), wherein that sandwiched additional thin sub-layer has yet another different refractive index from either one (only) of or both of those other two material sub-layers therebelow and thereabove), and optionally wherein the refractive indexes of any two (only), but not all, of those sub-layers in this configuration are optionally the same; (iv) the same feature as feature (iii) above, but modified such that the variable optical relief structure is buried within an optical material with a localized modulated refractive index; (v) by arranging for a variable travel length of a forward and a backward path of the incident light beam which is reflected by optical relief, which is open to the air to one side thereof, formed in an optical material (forming the layer) that is reflective or is a material coated with one or more layers (optionally as a monolayer or a layer stack) of a coating material that is at least partially, or acts as, a reflecting surface; (vi) the same feature as feature (v) above, but modified such that the air is replaced with a transparent optical material; (vii) by arranging for the optical material to have variable refractive index by virtue of the provision of any one of the above features (i), (iii), (iv) or (v) above, combined with a reflecting material on the back side of the layer (i.e. on the side opposite to the side through which the incident light first enters the respective optical structure), whereby the combined respective optical structure can phase-modulate the incident light wave on its forward and backward (i.e. reflected) paths through the layer; (viii) by applying an intermediate layer or stack of layers (e.g. of one or more semitransparent metallic material(s) or dielectric material(s)) onto at least one side of (optionally the front (i.e. 
entrance) side of) the layer, and optionally also burying an optical relief layer (as a modification of feature (iii) above) within the layer, whereby the optical properties (optionally the phase properties) of transmitted and/or reflected light are yet further modulated (optionally yet further phase-modulated), optionally wherein, as an optional augmentation of this arrangement, an overcoat on a front (i.e. entrance) side of the layer is present and is optionally designed as an antireflective coating layer (further optionally an antireflective coating layer which is selective with respect to wavelength and/or angle of incidence and/or angle of observation), whilst optionally in the case of a buried optical relief layer the overcoat on the buried optical relief layer is optionally designed with an increased reflectivity (further optionally with an increased wavelength-selective reflectivity and/or angle of incidence-selective reflectivity), in order to yet further modify the overall phase-modulating optical properties of the complete respective optical structure of the respective pixel.
In any of the above embodiment options (i) to (viii) as in the preceding paragraph, it may optionally be that one or more of the following additional features (ix) and/or (x) is/are present: (ix) as an augmentation of the respective arrangement, there is added to the layer a reflective sub-layer or multilayer stack on a back side thereof in order to further modify the overall phase-modulating optical properties of the combined respective optical structure of the respective pixel; and/or (x) an overcoat is applied to any surface of the main layer and/or any exposed or buried optical relief forming part of the main layer in order to yet further modify the interaction of incident light with the respective optical structure, optionally wherein the overcoat is applied fully or selectively to only one or more parts of the main layer, optionally to all of or only to selected one(s) of the respective pixels or to one or more sub-groups of a plurality of the pixels or one or more portion(s) of one or more such sub-groups of a plurality of pixels, optionally disregarding individual boundaries of one or more such pixels in one or more such sub-group(s) thereof, and further optionally wherein the overcoat is reflective, partially reflective or selectively reflective with respect to wavelengths and optionally the angle of incidence of the interacting light.
In some embodiments of the invention, independently for each pixel, the respective optical structure of each respective pixel may have a phase modulation depth limit for a given wavelength and a given angle of incidence of the light interacting with the respective optical structure of the respective pixel, wherein the phase modulation depth limit may be defined as a maximum phase shift (i.e. maximum phase delay) that the respective optical structure can induce in the light wave interacting with the respective optical structure of that respective pixel.
In many such embodiments, each pixel's generated phase-modulated non-planar output light wavefront component, that exits the layer from that pixel and imitates the light scattered by the three-dimensional object or scene in the corresponding respective sub-area on the preselected surface thereof, may be designed -on the basis of the said pre-selected illumination conditions as defined in the first aspect of the invention defined hereinabove -such that it has a phase modulation depth which is at most equal to or is smaller than the said phase modulation depth limit under the said pre-selected illumination conditions.
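The depth-limit check described above amounts to a simple comparison. The sketch below is illustrative only (function names and the relief-based limit formula are assumptions, taking a transmissive relief pixel at normal incidence as the example structure): the limit is the maximum phase delay the structure can induce, and a designed wavefront component must not require more.

```python
import math

def modulation_depth_limit(max_relief_depth_m: float, n_material: float,
                           wavelength_m: float) -> float:
    """Phase modulation depth limit of a transmissive relief pixel: the
    maximum phase delay its structure can induce at the design wavelength
    (normal incidence assumed; purely illustrative)."""
    return 2.0 * math.pi * max_relief_depth_m * (n_material - 1.0) / wavelength_m

def fits_within_limit(required_phase_depth_rad: float, limit_rad: float) -> bool:
    """True if the designed wavefront's phase modulation depth is at most
    equal to the pixel's phase modulation depth limit."""
    return required_phase_depth_rad <= limit_rad

# assumed ~2 um maximum relief depth, n = 1.5, design wavelength 550 nm:
limit = modulation_depth_limit(2e-6, 1.5, 550e-9)
```

Under these assumptions a wavefront component needing a 2π depth fits comfortably, whereas one needing more than the limit would have to be split, as discussed further below in the specification.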
In some embodiments of the invention, the respective modulation function of the respective optical structure of one or more of the pixels (i.e. the function describing the changing optical property or properties of the respective optical structure across the respective pixel's area) may be such that it represents an encoded light wavefront component exiting the layer which imitates the light scattered by the three-dimensional object or scene in the corresponding respective sub-area on the pre-selected surface thereof upon either transmission through and/or reflection from said object or scene.
In some embodiments of the invention, one or more of the pixels may be in effect split into two or more sub-portions -which is to say, the wavefront component to be encoded (i.e. under the said pre-selected illumination conditions) in such a pixel which exceeds a maximum modulation limit of that pixel's optical structure is spatially divided into two or more sub-components, each one not exceeding the maximum modulation limit of the optical structure of that pixel, and each such sub-component of the wavefront component being encoded separately in a respective sub-portion of that pixel's optical structure that spatially corresponds to the said sub-component of the wavefront component to be encoded thereby.
In some such embodiments, it may optionally be that (especially when the smallest dimension of the respective pixel's sub-portions is smaller than about 10 or 5 times the design wavelength) the wavefront sub-components generated by the neighbouring sub-portions of the pixel may remain synchronized (i.e. under the said pre-selected illumination conditions), i.e. the phase offset between the neighbouring wavefront sub-components at the exit of the pixel, if any, is minimized or is equal to 2π or multiples thereof, whereby unwanted wavefront disruptions are suppressed or minimized.
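The pixel-splitting principle described above can be illustrated by phase wrapping. The sketch is not the specification's algorithm: the function name, the quadratic (lens-like) example profile and the assumed modulation limit are all hypothetical. Wrapping the required phase into whole 2π periods yields sub-portions whose mutual offsets are exact multiples of 2π, which is precisely the synchronization condition stated in the text.

```python
import math

def split_wavefront_phase(phase_profile_rad, limit_rad):
    """Wrap a pixel's required phase profile into [0, usable) where 'usable'
    is the largest whole multiple of 2*pi not exceeding the modulation
    limit (the limit is assumed to be at least 2*pi). Each wrapped segment
    becomes one pixel sub-portion; adjacent sub-portions then differ only
    by multiples of 2*pi, so their wavefront sub-components stay in phase."""
    two_pi = 2.0 * math.pi
    usable = math.floor(limit_rad / two_pi) * two_pi  # whole 2*pi periods only
    return [p % usable for p in phase_profile_rad]

# a quadratic (lens-like) required profile exceeding a ~4.5*pi limit:
profile = [0.05 * i * i for i in range(40)]
wrapped = split_wavefront_phase(profile, limit_rad=4.5 * math.pi)
```

Every wrapped value stays below the usable depth (here 4π), and the amount removed at each point is an integer number of 2π cycles, so the recombined sub-components reproduce the intended wavefront without disruptions.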
Within the scope of this specification it is envisaged that the various aspects, embodiments, examples, features and alternatives, and in particular the individual constructional, configurational or operational features thereof, set out in the preceding paragraphs, in the claims and/or in the following description and accompanying drawings, may be taken independently or in any combination of any number of same. For example, individual features described in connection with one particular embodiment, or described singly or in combination with another feature in any one or more embodiments, are applicable on their own or in combination with one or more other features to all embodiments and may be found and used in combination with any other feature in any given embodiment, unless expressly stated otherwise or such features are incompatible.
BRIEF DESCRIPTION OF THE DRAWINGS
Various features and embodiments of the present invention in its various aspects, and how embodiments of the invention may be put into practical effect, will now be described in detail, by way of non-limiting example only, with reference to the accompanying drawings, in which:

FIGURE 1 is a perspective view of one practical embodiment of an optically variable image device according to the invention;

FIGURES 2(a), (b) & (c) are schematic cross-sectional views showing three examples of how sub-areas located on a pre-selected surface of a real or virtual three-dimensional object may be mapped onto the pixels of embodiment devices such as that of FIG. 1 or an alternatively shaped form thereof, for the purpose of recording in those pixels respective mapped portions of the image of light scattered from the pre-selected surface of the object to be formed by the device;

FIGURE 3 shows four example schematic cross-sectional views of how different surfaces of a real or virtual three-dimensional object may be chosen as a pre-selected surface for the purpose of mapping their respective sub-areas onto respective pixels of an optically variable image device according to various embodiments of the invention;

FIGURE 4 is a schematic perspective illustration of an example of how incident light may be scattered by a transparent real or virtual three-dimensional object, which scattered light may then be used to define how it is to be recorded in encoded form in the pixels of an embodiment device according to the invention;

FIGURE 5 is a schematic cross-sectional view showing the manner in which the shape of a pre-selected surface of a real or virtual three-dimensional object may be described by a distance function mapped onto the entrance surface of a layer with an arrangement of pixels thereon in an embodiment device according to the invention;

FIGURE 6A is a schematic cross-sectional view of a layer with an arrangement of pixels thereon in an embodiment device according to the invention, highlighting one of its pixels in particular, and FIG. 6B then goes on to show various examples of that pixel's optical structure which can phase-modulate incident light for the purpose of forming a respective portion of the image to be formed by that pixel;

FIGURES 6B(a), (b), (c), (d), (e), (f), (g) & (h) are schematic cross-sectional views showing eight different examples of the optical structure of the highlighted pixel of the device of FIG. 6A which can phase-modulate incident light for the purpose of forming a respective portion of the image to be formed by that pixel;

FIGURES 7(a) & (b) are explanatory schematic cross-sectional views of one of the pixels from the arrangement thereof on the layer as shown in FIG. 6A and with an optical structure as shown in FIG. 6B(a), illustrating the mechanism of phase modulation of incident light using a geometrical (i.e. ray-tracing) approach, and further illustrating how the phase modulation depth limit may be determined for this particular type of optical structure; and

FIGURES 7(c) & (d) are each an explanatory schematic cross-sectional view of one of the pixels from the arrangement thereof on the layer as shown in FIG. 6A and with an optical structure as shown in either FIG. 6B(a) or FIG. 6B(e), respectively, illustrating how to determine the phase modulation of incident light using a geometrical (i.e. ray-tracing) approach for each of these particular types of optical structure;

FIGURES 8A(a) & (b) are schematic cross-sectional views showing how light scattered from sub-areas located on a pre-selected surface of a real or virtual three-dimensional object may relate to the light formed by the respective pixels in the layer of an embodiment device according to the invention, and in particular how the respective main propagation directions of the light scattered from the respective sub-areas may be assigned to the respective main propagation directions of light formed by the respective pixels;

FIGURES 8A(c) & (d) are schematic cross-sectional views showing how light scattered from sub-areas located on a pre-selected surface of a real or virtual three-dimensional object may relate to the light formed by the respective pixels in the layer of an embodiment device according to the invention, and in particular showing a case of assigning two (or more) main propagation directions to the light scattered from a particular sub-area and their assignment to main propagation directions of light formed by the respective pixels;

FIGURES 8A(e) & (f) are similar to FIGS. 8A(c) & (d) in that they are schematic cross-sectional views showing how light scattered from sub-areas located on a pre-selected surface of a real or virtual three-dimensional object may relate to the light formed by the respective pixels in the layer of the same embodiment device as in FIGS. 8A(c) & (d), but this time showing in particular a case where the light formed by a particular pixel is composed of plural discrete sub-components of various wavefront forms;

FIGURES 8A(g) & (h) are schematic cross-sectional views showing how light scattered from sub-areas located on a pre-selected surface of a real or virtual three-dimensional object may relate to the light formed by the respective pixels in the layer of another embodiment device according to the invention, and in particular showing a case of selecting two (or more) main propagation directions from the light scattered from a particular sub-area, as a representation of such scattered light, and their assignment to main propagation directions of light formed by the respective pixels;

FIGURE 8B(a) is a schematic cross-sectional view showing how light scattered from certain sub-areas of a pre-selected surface of a real or virtual three-dimensional object, for a given incident direction, does not produce scattered light at the observer's side (i.e. at the output or exit side of the object where the majority of the scattered light is observed);

FIGURE 8B(b) is a schematic cross-sectional view of a portion of the layer depicted in FIG. 8A(b) showing light generated by a pixel with extensions to neighbouring pixels;

FIGURES 9A(a) & (b) are, respectively, a schematic cross-sectional view, and a graphical representation of what is happening in FIG. 9A(a), of a pixel such as that shown in FIG. 6B(a), showing the manner in which its optical structure is capable of modulating the phase of an incident light beam with a planar wavefront into a phase-modulated non-planar light beam component with a spherical wavefront by modulation of the refractive index of the optical material forming the optical structure;

FIGURES 9B(a) & (b) correspond to FIGS. 9A(a) & (b), but show a modified way in which the pixel's optical structure can modulate the phase of the incident light beam with a planar wavefront into a phase-modulated non-planar light beam component with a spherical wavefront by modulation of the refractive index of the optical material forming the optical structure, but this time in a case where the pixel is in effect split into two distinct portions, in order to keep the modulation of the refractive index of the material of that pixel's optical structure within the maximum modulation depth limit therefor under given illumination conditions;

FIGURE 10(a) is a simplified schematic cross-sectional view of the arrangement shown in FIG. 9A(a), showing how a converging spherical phase-modulated wavefront component of a certain angular width generated by the pixel has its focal centre located in the direction of propagation of the wavefront;

FIGURE 10(b) corresponds to FIG. 10(a), but shows an alternative arrangement in which an equivalent but diverging spherical phase-modulated wavefront component of the same angular width generated by the pixel has a virtual focal centre located on the opposite side of the pixel's exit side (i.e. opposite to the beam's propagation direction);

FIGURE 11A(a) is a schematic plan view of a portion of an arrangement of pixels of another embodiment device according to the invention, each pixel having an optical structure formed by modulation of the refractive index of its optical material and being capable of generating a non-planar wavefront component with a spherical wavefront, and FIGURE 11A(b) is a graph showing the propagation directions of the spherical wavefront generated by the central pixel in the portion of the arrangement shown in FIG. 11A(a), thereby forming a square in the space of propagation angles;

FIGURE 11B(a) is a schematic plan view of a portion of an arrangement of pixels of yet another embodiment device according to the invention, each pixel having an optical structure formed by modulation of the refractive index of its optical material and being capable of generating a non-planar wavefront component with a spherical wavefront, where a portion of the central pixel is masked-off so as to limit (i.e. truncate) the pixel area that is capable of generating the non-planar wavefront component with the spherical wavefront, and FIGURE 11B(b) is a graph showing the propagation directions of the spherical wavefront generated by the central pixel in the portion of the arrangement shown in FIG. 11B(a), thereby forming a star shape in the space of propagation angles corresponding to the shape of the truncated pixel area that is capable of generating the non-planar wavefront component;

FIGURES 12(a), (b), (c) & (d) are various explanatory views illustrating how, in yet another embodiment device according to the invention, a phase-modulated non-planar output light wavefront component of a spherical wavefront shape generated by a given pixel may have its phase-modulation (as shown in FIG. 12(a)) further modified by a phase variation feature (as shown in FIG. 12(b)), thereby imparting a particular "signature" to the non-planar wavefront component of the spherical wavefront shape generated by that pixel, shown in terms of representations of propagation angles and of spatial frequencies in FIGS. 
12(c) and 12(d), respectively; FIGURES 13(a) & (b) are schematic perspective views of, respectively, a group of pixels of yet another embodiment device according to the invention, and the same group of pixels but with a delimited area for further modification of their optical structures, for the purpose of generating a phase-modulated non-planar output light wavefront component and an additional "superimposed" image by the respective groups of pixels; FIGURES 14(a), (b) & (c) are various views representative of an example of an additional "superimposed" image that may be additionally encoded in the optical material forming the optical structures of a pixel group (such as, for example, the one depicted in FIGS. 13(a) & (b)) in order to generate a non-planar wavefront component with an additional superimposed image, and illustrating, respectively, a graphical representation of the image to be superimposed (FIG. 14(a)), its representation in the form of a phase function of an on-axis phase Fourier hologram (FIG. 14(b)), and the superimposed image itself as a representation of its propagation directions (FIG. 14(c)), reconstructed by a plane wave of normal incidence; FIGURES 15(a), (b), (c), (d), (e) & (f) are various views representative of another example of an additional "superimposed" signature image that may be additionally encoded in the optical material forming the optical structures of the pixels of a given group in yet another embodiment device within the scope of the invention, where FIG. 15(a) shows (including in zoomed-in detail) the group of pixels in question being a selected group of pixels corresponding to a selected group of sub-areas on a front right side portion of the hood of a car as illustrated in FIG. 4, FIG. 
15(b) shows the phase modulation functions of the non-planar wavefront components -or, more simply, "wavefronts", as they may alternatively be termed in the remainder of this description hereinbelow -with a spherical wavefront shape generated by the corresponding pixels in the selected group, FIG. 15(c) shows the various phase modification functions assigned to the pixels of the group (i.e. that are added to the phase modulation functions of the non-planar wavefront components), FIG. 15(d) shows the phase modulation function of the image superimposed onto the phase-modulated non-planar wavefront components, FIG. 15(e) shows schematically the manner in which the encoded phase-modulated non-planar wavefront components with the superimposed image forming the output light pattern propagate at various angles as they are reconstructed from the group of pixels of the device using normal incident light, and FIG. 15(f) is an intensity distribution map of the output light pattern (ignoring zero order intensity) as represented in FIG. 15(e), shown as a representation of the spatial frequencies 1/x and 1/y (i.e. in coordinates of spatial frequencies); FIGURE 16 is a representative view of yet another example of an additional "superimposed" image that may be additionally encoded in a given pixel in yet another embodiment device within the scope of the invention, showing a spherical wavefront component (represented by the black square) generated by the given pixel and auxiliary spherical wavefront components (represented by the lighter grey squares) generated by the same pixel but as superimposed images representing spherical wavefront components of light generated by respective neighbouring pixels with reduced intensity, all shown by way of a representation of their propagation directions (i.e. output angles).
DETAILED DESCRIPTION OF THE INVENTION AND ITS FEATURES AND EMBODIMENTS
The optically variable image device and its relationship with the (real or virtual) three-dimensional object

Referring firstly to FIG. 1, this shows a representative practical example of an optically variable image device 1 according to an embodiment of the invention, which may be used in a wide variety of security or authentication applications, such as brand protection, document or banknote protection, passports, visas and ID cards or documents, driving licences, membership cards, tickets, certificates, packaging, works of art, antiques or other valuable items, etc. The device 1 comprises a thin layer 2 of optical material -namely a material that interacts with incident light transmitted therethrough or reflected therefrom and is capable of imparting a particular optical function to that light by modulating the phase of the light wave that interacts with it and is capable of having a record of an optical image encoded therewithin, especially by virtue of its optical properties being able to be modified or modulated across and/or along and/or through at least part of the layer 2, such as the material's refractive index, optical surface relief height/depth and/or width of relief features, and/or light absorption or reflection properties. Examples of suitable optical materials for forming the thin layer 2 include for instance photopolymers, e.g. Bayfol® HX 200, or in the case of optical surface embossing, the application of a lacquer layer with an imprint of the surface relief, or preferably a thin polymer film based on PC or PET with an imprint of the optical embossing (e.g. microrelief) on its surface. The layer 2 of optical material -or rather, more correctly in some embodiments, the portion of the layer 2 that has the modulated optical properties (since it is possible in some embodiments for those modulated properties to be present in only part of the overall thickness and/or area of the layer 2 of optical material) -is microscopically thin, e.g. 
typically from about 0.5 up to about 50 µm in thickness, more especially from about 0.5 up to about 10 or 25 µm, or even from about 0.5 up to about 3 or 5 µm. The layer 2 may either constitute the whole of the thickness of the device 1, or alternatively the layer 2 may be pre-formed as a discrete patch-like layer 2 of the optical material which is then mounted, as shown in FIG. 1, on a suitable carrier or substrate layer C (e.g. of polycarbonate or PET or a paper carrier or a combination of these carriers formed for example by lamination), e.g. using a suitable adhesive, and optionally via an intermediate mounting layer (not shown) which may itself serve as an additional structural layer component of the complete device.
The thickness of the layer 2 can be understood as meaning the distance between two opposite (especially major) parallel surfaces (which may be real or virtual) which envelope the modulated portion of the optical material of the layer which is used or designed to modulate the phase properties of the light in accordance with the fundamental principle behind the present invention. The envelope surface, or portion(s) thereof, of the layer 2 which receives the incident light first of all (i.e. the incident light which the layer 2 modulates the phase properties of) is referred to as the "input" or "entrance" surface, whereas the layer's envelope surface, or portion(s) thereof, from which the phase-modulated light emerges is referred to as the "output" or "exit" surface. The output surface may be identical to the entrance surface if the layer 2 is designed to phase-modulate the incident light in reflection (either within an entire dimensional extent of the layer 2 or its portion(s) in one or more dimensions, or within only a part thereof), or the output surface may be the opposite envelope surface to the entrance surface if the layer 2 is designed to phase-modulate the incident light in transmission (again, either within an entire dimensional extent of the layer 2 or its portion(s) in one or more dimensions, or within only a part thereof).
The layer 2 can be either (i) substantially flat or planar, or (ii) curved or arcuate in one or two or three dimensions, e.g. depending on (and so as to substantially match) the shape of an article to which the device is designed or intended to be affixed for security or authentication purposes, or more usually in the case of the layer 2 being mounted on a carrier/substrate C the shape of that carrier/substrate C. As shown in FIG. 1, the layer 2, or at least a portion of it, is subdivided into one or more groups of adjacent pluralities of pixels 3 (i.e. adjacent sub-areas of the layer 2). Each pixel 3 has recorded therein in encoded form a respective portion of the optically variable image to be recreated by the device 1. A typical arrangement of the pixels 3 is an array or set of the pixels 3 arranged in a regular or uniform pattern or grid, especially with a rectangular or square shape (such a rectangular shaped array being that shown in FIG. 1) or alternatively in the shape of a polygon (e.g. one or more hexagons or triangles), or possibly in the form of a brick-wall-type arrangement. Typically each pixel 3 is itself square or rectangular in shape (i.e. in plan), although other pixel shapes that tessellate (e.g. hexagons or triangles) are also possible. Mixtures of different pixel shapes may also be used, if desired or appropriate. As an alternative to regular or uniform pixel patterns or grids (and/or also as an alternative to regular pixel shapes), irregular or non-uniform pixel patterns or grids (and/or irregular pixel shapes), such as for example an Escher tiling or Penrose tiling, may be used if desired or appropriate. Each pixel 3 typically has a size (i.e. an average width) ranging from a few or several microns up to several hundreds of microns -e.g. from about 3 or 5 or 10 or 20 µm up to about 200 or 300 or 400 or 500 µm. 
As a rule of thumb, however, for regular or irregular shapes of pixels, the minimum inscribed circle that each pixel occupies is at least a few or several microns in diameter and a maximum of several hundred microns in diameter -e.g. at least about 3 or 5 µm and up to at most about 100 or 200 or 300 µm, more especially from about 5 up to about 100 or 200 µm, even more particularly from about 5 up to about 50 µm, e.g. from about 10 up to about 20 or 30 or 40 µm. The sizes or dimensions of the pixels 3 in any given array, group or grid can be either uniform or in some forms can vary within a given array/group/grid.
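As an editor's illustrative sketch only (not part of the disclosure), the rule-of-thumb constraint above can be expressed as a check on a pixel's minimum inscribed circle; the function names and numeric limits below are hypothetical examples chosen from the indicative ranges given in the description.

```python
# Illustrative sketch: check that a pixel's minimum inscribed circle falls
# within the rule-of-thumb diameter range (values in micrometres).
# For a regular polygon, the inscribed-circle diameter is twice the apothem.
import math

def inscribed_circle_diameter_regular(n_sides: int, side_len_um: float) -> float:
    """Diameter of the inscribed circle of a regular n-gon (2 * apothem)."""
    apothem = side_len_um / (2.0 * math.tan(math.pi / n_sides))
    return 2.0 * apothem

def pixel_size_ok(diameter_um: float, min_um: float = 5.0, max_um: float = 200.0) -> bool:
    """Rule-of-thumb check from the description: about 5 um up to about 200 um."""
    return min_um <= diameter_um <= max_um

# A 20 um square pixel: the inscribed circle diameter equals the side length.
d_square = inscribed_circle_diameter_regular(4, 20.0)
print(round(d_square, 3), pixel_size_ok(d_square))   # 20.0 True

# A hexagonal pixel with 10 um sides.
d_hex = inscribed_circle_diameter_regular(6, 10.0)
print(round(d_hex, 3), pixel_size_ok(d_hex))
```

The same check applies unchanged to irregular pixel shapes once their minimum inscribed circle has been computed by other means.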
The subdivisions between immediately adjacent pixels 3 in any given array/group/grid which define the individual pixels' boundaries can be either real or notional -i.e. the boundaries between immediately adjacent pixels may be either (i) defined in practical terms, so that they are discernible or observable as such upon viewing, or alternatively defined by a change, especially a marked or stepwise or sudden change, in the optical functions imparted by the respective optical structures of adjacent pixels 3, or alternatively (ii) defined implicitly (i.e. in such a way that they may not be explicitly apparent upon viewing) by the various projection vectors or projection lines that are used to define those pixels themselves.
Thus, the boundaries between immediately adjacent pixels 3 may be defined implicitly from the boundaries of corresponding sub-areas 5 delimited on a pre-selected external or internal surface 6 of a virtual three-dimensional object 7 into which the pre-selected surface 6 is subdivided, it being mainly this pre-selected surface 6 -or more precisely the light scattered therefrom -whose image is to be recreated and thus imitated by the pixels 3 in layer 2 of the optically variable device 1 of the invention. The projection vectors or projection lines are thus used to define a projection of the pre-selected surface 6 onto the layer 2, i.e. to map or assign the said sub-areas 5 to the respective pixels 3.
This is illustrated by way of example in FIGS. 2(a), (b) & (c), which show cross-sections of a (real or virtual) three-dimensional object 7 (in this case a motor car, for example -although it could in principle be any object or objects, or even a scene of some kind) and either a planar or a curved layer 2. FIGS. 2(a)-(c) merely illustrate the assigning, i.e. mapping, relationship between the sub-areas 5 on the pre-selected surface 6 of the (real or virtual) three-dimensional object 7 and the pixels 3 on the layer 2 of the device. The pixel boundaries 4 in the layer 2 are created as a projection of the boundaries of the sub-areas 5 on the preselected surface 6 (i.e. the thick solid line) of the three-dimensional object 7. The projection vectors (that is, the projection directions as indicated by the arrows in FIGS. 2(a)-(c)) can be constructed in various ways. For example, they may be unidirectional, as illustrated in FIGS. 2(a) or (c), or they may be multidirectional, i.e. they change direction across the layer 2's curved input surface, as illustrated in FIG. 2(b). As an example, FIG. 2(b) shows the projection directions constructed as each being perpendicular to the layer 2's curved input surface. The unidirectionality or multidirectionality of the projection directions may be at the designer's choice. The projections of the boundaries of the sub-areas 5 on the pre-selected surface 6 of the three-dimensional object 7 thus produce or generate corresponding boundaries 4 of individual pixels 3 in the layer 2 which have the same neighbours as the corresponding sub-areas 5 on the pre-selected surface 6 of the three-dimensional object 7.
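As an editor's illustrative sketch (not part of the disclosure), the unidirectional projection of FIG. 2(a) can be modelled as tracing a boundary point on the pre-selected surface 6 along a fixed direction until it meets the plane of a flat layer 2; the coordinate convention (layer in the plane z = 0) and the function name are hypothetical.

```python
# Illustrative sketch: unidirectional projection (cf. FIG. 2(a)) of a
# boundary point on the pre-selected surface 6 onto the plane z = 0 of a
# flat layer 2, along a fixed projection direction.

def project_to_layer(point, direction):
    """Project a 3-D point onto the plane z = 0 along `direction`.

    Solves point + t * direction for the parameter t at which the
    z-component vanishes, i.e. t = -pz / dz, and returns the (x, y)
    landing position on the layer.
    """
    px, py, pz = point
    dx, dy, dz = direction
    if dz == 0:
        raise ValueError("projection direction is parallel to the layer plane")
    t = -pz / dz
    return (px + t * dx, py + t * dy)

# A sub-area boundary point 3 units above the layer, projected straight down:
print(project_to_layer((1.0, 2.0, 3.0), (0.0, 0.0, -1.0)))   # (1.0, 2.0)
# The same point projected along an oblique direction lands displaced:
print(project_to_layer((1.0, 2.0, 3.0), (1.0, 0.0, -1.0)))   # (4.0, 2.0)
```

For the multidirectional case of FIG. 2(b), the same computation would be repeated with a direction recomputed per point (e.g. the local normal of a curved input surface), and the inverse mapping of the following paragraph simply swaps the roles of the surface point and the layer point.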
It should be understood that the projections, especially the multidirectional projections as in FIG. 2(b), of the boundaries of the sub-areas 5 on the pre-selected surface 6 of the three-dimensional object 7, or those projections on the layer 2's curved input surface, may in some instances result in a distortion of the recreated image of the three-dimensional object 7 that is to be formed by the device according to the invention. It is at the designer's discretion to control and/or allow such a type of image distortion. That being said, not all three-dimensional objects may be best suited to the projection of the sub-areas' boundaries on the object's pre-selected surface into pixels' boundaries in the layer 2 as described above. In other words, a three-dimensional object, and the pre-selected surface thereof in particular, may in some instances need to be chosen carefully (and perhaps even modified if necessary) to avoid ambiguity in the process of assignment or mapping of sub-areas 5 onto pixels 3 and their related boundaries.
It should be appreciated that the mapping (i.e. projection) as described above may if desired be alternatively done in an opposite direction -i.e. the projection vectors have opposite directions from those shown in FIGS. 2(a)-(c). In this case, the pixel boundaries 4 in the layer 2 can be defined first, and then projected onto the three-dimensional object 7 (unidirectionally or multidirectionally) to define the sub-areas 5 on its pre-selected surface 6.
FIGS. 2(a)-(c) show examples of mapping when the arrangement of pixels 3 and the three-dimensional object 7 are in comparable proportions, i.e. scaled as substantially one-to-one. It should be understood, however, that the mapping may if desired be alternatively done on scaled-up or scaled-down versions of the arrangement, i.e. one or more of the layer 2, the arrangement of pixels 3 thereon and the three-dimensional object 7 may be scaled-up or scaled-down to a desired degree in comparison with one or more other one(s) of the layer 2, the arrangement of pixels 3 and the three-dimensional object 7. This scaling-up or -down may even be necessary in certain practical instances where a real three-dimensional object 7 (or scene) is larger than the arrangement of pixels 3 in the layer 2. Once the mapping is done, the arrangement of pixels 3 (or the layer 2) and/or the three-dimensional object 7 may be scaled back to their original or intended proportions relative to the other one(s) thereof.
In some embodiments of the invention, the mapping of the sub-areas 5 of the pre-selected surface 6 onto the pixels 3 in the layer 2 may not be done for each and every sub-area 5. Alternatively the sub-areas 5 may represent isolated islands on the pre-selected surface 6. In such a case the islands of the sub-areas 5 may be mapped onto respective islands of pixels 3 in the arrangement. It is then at the designer's choice to decide what portion(s) of the pre-selected surface 6 of the three-dimensional object 7 (or scene), e.g. represented by islands of the sub-areas 5 thereon, would be mapped onto the pixels 3, e.g. represented by islands of the pixels 3. In typical such embodiments, the islands of the sub-areas 5 may be distributed uniformly or quasi-uniformly across the pre-selected surface 6, covering a substantial proportion of its total area (e.g. at least about 20% or 40% of its total area). This helps to ensure that in such cases the three-dimensional object 7 is still recognizable as such when imaged by the optically variable device 1.
On the other hand, in other embodiments the mapping between the sub-areas 5 on the preselected surface 6 onto the pixels 3 in the layer may alternatively be done using the voids in between the sub-areas 5. Such a case is practically an inversion of the case described in the preceding paragraph above, i.e. an inversion of the area on the pre-selected surface 6 that is subjected to the mapping.
The pixel assignment or mapping described above makes the pre-selected surface 6 of the object 7 spatially related to the geometrical distribution of the pixels 3 in the layer 2 of the optically variable image device 1. The pre-selected surface 6 (or, more precisely, the light scattered by the three-dimensional object 7 mapped onto the layer 2 of the optically variable device 1 via the sub-areas 5 on the pre-selected surface 6) therefore becomes a part of the three-dimensional object 7 (or, more precisely, becomes the light scattered by the three-dimensional object 7 mapped onto the layer 2 via the sub-areas 5) which will be the major subject for (or the focus of) imitation by the optically variable image device 1 according to this invention. It is at the designer's choice which such surface of an object may be selected for imitation by recreation of an image thereof in accordance with the invention. It may be usefully mentioned here by way of reminder that embodiments of the invention focus primarily on creating an image of the light scattered by the three-dimensional object (or scene) 7 rather than an image of the object (or scene) 7 itself, which fact will become more clearly apparent from the further description and more detailed explanations hereinbelow.
FIG. 3 shows four alternative examples of the pre-selection of the three-dimensional object 7's surface 6 for imitation by recreation of an image thereof in accordance with the invention. In the upper two drawings of FIG. 3, the pre-selected surfaces 6 of a motor car as the three-dimensional object 7 are the car body on the left and the car chassis on the right (i.e. highlighted by a thick line contour). These surfaces 6 are outer surfaces of the three-dimensional object 7. However, the pre-selected surface 6 may alternatively be an inner surface of the three-dimensional object 7. The bottom two drawings of FIG. 3 show an example of another three-dimensional object 7 -in the form of a paperweight with an embedded object (either as a full object or as an object's shell, and either in the form of a cavity within the paperweight body (i.e. an inverse definition of the object's exterior) or otherwise formed as a real embedded object's exterior) with highlighted pre-selected surfaces 6, which in this case are inner surfaces of the three-dimensional object 7 (i.e. the paperweight).
Any given pre-selected surface 6 of the three-dimensional object 7 (or the light scattered therefrom) that is to be imitated by recreation of an image thereof should be selected in such a way that it can receive incident light originating from outside the object 7 (which can of course be either real or virtual). In some cases the pre-selected surface 6 may receive the incident light directly, i.e. the surface 6 interacts first with the incident light, whereas in other cases the pre-selected surface 6 may receive the incident light indirectly, i.e. the incident light propagates through other part(s) of the three-dimensional object 7 first before reaching the pre-selected surface 6.
FIG. 4 shows an example of the relative configuration of a (real or virtual) three-dimensional object 7 and incident light 9, and illustrates the manner in which that incident light 9 is scattered by the object 7 (which in this case is optically transparent or semi-transparent). The (real or virtual) three-dimensional object (i.e. car) 7 is illuminated by the incident light 9 from thereabove. Most of the object/car body will receive incoming incident light directly from the incident light 9 itself. Provided the three-dimensional object (i.e. car) 7 is transparent or semi-transparent, as illustrated in FIG. 4, the object/car chassis (i.e. its underside) will receive some incoming incident light indirectly after the main incident light 9 has entered the body of the object/car 7 and propagates through its volume. Generally, therefore, when the incident light 9 hits the object/car 7 it produces scattered light 11 mainly upon reflection, i.e. the scattered light 11 propagates away from the object/car 7, but if the object/car 7 is transparent or semi-transparent then the scattered light 11 is produced also upon transmission, i.e. as it leaves the object/car 7 on the opposite side thereof from the illuminated side thereof where the incident light 9 first entered it. Also, a portion of the light propagating through the volume of the object/car 7 gets reflected back towards its initial receiving surface (i.e. its surface where the incident light 9 first entered the object/car 7) and gets combined with the directly reflected light as that portion of the reflected light passes back through that initial receiving surface. Although in principle there could be several and/or more complex ways in which the incident light 9 interacts overall with the car or other three-dimensional object 7, in general in embodiments of the invention the scattered light 11 may be observed either in reflection, i.e. 
on the incident-light-receiving side of a pre-selected surface of the object 7, or in transmission, i.e. on the side of the object opposite to its incident-light-receiving side of the pre-selected surface of the object 7 (as indicated by the positions of the observers O in FIG. 4). As an alternative to an optically transparent or semi-transparent object/car 7, it may in an alternative embodiment be an optically opaque or reflective object/car 7, in which case the scattered light 11 will be observed substantially only in reflection.
As shown in FIG. 5, if the object 7's pre-selected surface 6 can be described as a function of the layer 2's receiving (entrance) surface s, i.e. each point within a given sub-area 5 on the pre-selected surface 6 has a unique mappable counterpart point on a given pixel 3 of the entrance surface s of the layer 2, and each point within a given sub-area 5 on the pre-selected surface 6 has the same neighbouring points as does the corresponding mapped point and its neighbours on a given pixel 3 on the entrance surface s of the layer 2, then it may be possible to avoid any ambiguity in the assignment (i.e. projection or mapping) of the boundaries of the sub-areas 5 of the pre-selected surface 6 of the object 7 to the boundaries of the pixels 3 in the layer 2. An object 7's pre-selected surface 6 which meets such conditions may be especially desirable in practising some embodiments of the invention.
The assignment or mapping between the above-mentioned respective point pairs -i.e. the corresponding points within a given sub-area 5 on the pre-selected surface 6 of the three-dimensional object 7 and on a given pixel 3 on the entrance surface s of the layer 2 of the optically variable image device 1, which can be used to describe the pre-selected surface 6 as a function of the layer 2's surface s -may be done in various ways. One example thereof is shown in FIG. 5, where the assignment or mapping of the various point pairs is done along projection lines perpendicular to the layer 2's surface s, and the pre-selected surface 6 is then described in the form of a distance function D=f(s) between the corresponding point pairs, where for each point pair the distance D is measured along the projection line, i.e. the direction normal to the surface s, or alternatively along a respective direction defined by a respective projection line, as shown in examples in FIG. 2. In many especially desirable embodiments the pre-selected surface 6 of the object 7 whose sub-areas 5 are mapped onto the corresponding pixels 3 of the layer 2 of the device 1 may be that which has a distance function D=f(s) which is substantially continuous and uniform over either (i) substantially the whole of the layer 2's entrance surface s, or alternatively (ii) substantially the whole of each of one or more portions of the layer 2's entrance surface s that each contain a respective group or array of pixels 3.
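As an editor's illustrative sketch (not part of the disclosure), for a planar entrance surface s the distance function D=f(s) measured along the surface normal reduces to a height map sampled at pixel centres; the "continuity" condition can then be approximated by bounding the jump in D between neighbouring pixels. The surface chosen, the sampling grid and the jump threshold below are all hypothetical.

```python
# Illustrative sketch: sample D = f(s) over a pixel-centre grid for a
# planar entrance surface, and bound the jump in D between neighbours
# as a crude proxy for the "substantially continuous" condition.

def distance_map(surface_height, xs, ys):
    """Sample D = f(x, y) at the given pixel-centre coordinates."""
    return [[surface_height(x, y) for x in xs] for y in ys]

def max_neighbour_jump(D):
    """Largest |D difference| between horizontally/vertically adjacent pixels."""
    jump = 0.0
    for j in range(len(D)):
        for i in range(len(D[0])):
            if i + 1 < len(D[0]):
                jump = max(jump, abs(D[j][i + 1] - D[j][i]))
            if j + 1 < len(D):
                jump = max(jump, abs(D[j + 1][i] - D[j][i]))
    return jump

def dome(x, y):
    """Hypothetical smooth pre-selected surface: a shallow dome over the layer."""
    return 10.0 - 0.1 * (x * x + y * y)

xs = ys = [i * 0.5 for i in range(-4, 5)]   # hypothetical pixel centres
D = distance_map(dome, xs, ys)
print(max_neighbour_jump(D) < 1.0)          # smoothly varying surface -> True
```

A surface with steps or undercuts would instead produce large neighbour jumps (or no single-valued D at all), which is exactly the kind of ambiguity the careful choice of pre-selected surface discussed above is meant to avoid.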
The optical structures of the pixels of the optically variable image device, and how the incident light is modulated by such structures

Turning to FIGS. 6A and 6B, each pixel 3 in the array or grid of pixels 3 in the layer 2 comprises an optical structure that is capable of modulating the phase of the incident light wave. The arrangement shown in FIG. 6A highlights (by being ringed) one such pixel 3, and FIG. 6B then goes on to show various examples of its optical structure which can phase-modulate the incident light for the purpose of recreating a respective portion of the image to be formed by that pixel. In various embodiments, such phase-modulation of the incident light wave may be effected by either (i) modulation of one or more optical properties of the material of the layer 2 itself that forms the optical structure (such as the material's refractive index or its light absorption or reflection properties), or (ii) modulation of surface optical relief properties of the layer 2 (such as surface optical relief height/depth and/or width of relief features), or possibly even (iii) by a combination of both forms of modulation (i) and (ii).
In general, the phase modulation of the incident light wave means that during the interaction of the incident light wave with the optical structure of the relevant pixel 3, the light wavefront undergoes a phase delay (or phase shift -which is to say, a shift of the position of the sinusoidal wave in its propagation direction) compared to an incident light wave that does not interact with the optical structure. The amount of the phase delay (or phase shift), which may typically be expressed as a phase offset of a sinusoidal wave function, may be dependent on the local optical properties of the optical structure of the relevant pixel 3, such as, for example, its refractive index, optical relief profile thickness or width and/or profile depth. The light wave exiting the optical structure of the relevant pixel 3 thus has its wavefront phase-modified compared to the wavefront of the original incident light wave.
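As an editor's illustrative sketch (not part of the disclosure), the phase delay described above can be modelled for the refractive-index case of FIG. 6B(a) as delta_phi = 2*pi * delta_n * d / lambda, where delta_n is the local index offset and d the interaction length; all numeric values below are hypothetical examples.

```python
# Illustrative sketch: phase delay imparted by a pixel's optical structure,
# modelled as delta_phi = 2*pi * delta_n * d / lambda for a local
# refractive-index offset delta_n acting over an interaction length d.
import math

def phase_delay_rad(delta_n: float, thickness_um: float, wavelength_um: float) -> float:
    """Phase delay (radians) from an index offset over a given thickness."""
    return 2.0 * math.pi * delta_n * thickness_um / wavelength_um

# A hypothetical index modulation of 0.02 over a 3 um thick structure at 600 nm:
dphi = phase_delay_rad(0.02, 3.0, 0.6)
print(round(dphi / math.pi, 3), "pi radians")            # 0.2 pi radians

# A full 2*pi of modulation at the same wavelength needs delta_n * d = lambda:
print(round(phase_delay_rad(0.2, 3.0, 0.6) / math.pi, 3))  # 2.0
```

The second print illustrates why a maximum modulation depth limit (as discussed for FIGS. 9B(a) & (b)) matters: if delta_n * d cannot reach the wavelength, the full 2*pi phase range is unavailable in a single pass.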
The interaction between the incident light wave and the optical structure of the relevant pixel 3 may be of the nature of transmission, reflection or absorption, or even a combination of any two or more thereof. FIGS. 6B(a) to (h) depict various examples of optical structures 8 within a given pixel 3 (such as the highlighted (ringed) pixel in FIG. 6A) with various different versions of how its optical structure may be constructed or formed so as to impart thereto the required phase modulation capability for the purpose of recreating a respective portion of the image to be formed by that pixel: FIG. 6B(a) illustrates localized variation(s) in the refractive index of the optical material forming the layer 2 (which is depicted by the variable density of the hatching) as one way of modulating the phase of a transmitting light wave; FIG. 6B(b) illustrates variation(s) in the thickness of a transparent optical relief structure within the layer (i.e. which is open to the air to its lowermost side as shown in FIG. 6B(b)) as another way of modulating the phase of a transmitting light wave by creating a variable path length due to the variable ratio between the air and material thicknesses as the light wave propagates through a specific location (i.e. path trajectory) within the optical structure; FIG. 6B(c) illustrates an alternative modulation option that is equivalent to that of FIG. 6B(b) but replacing air with a different transparent material (shown to its lowermost side as shown in FIG. 6B(c)) with a different refractive index from the material in which the optical relief (to its uppermost side) is formed (and as an optional augmentation of this arrangement it may be possible to add to the layer a reflective sub-layer or multilayer stack on the back side thereof in order to further modify the overall phase-modulating optical properties of the combined optical structure); FIG. 6B(d) is similar to option FIG. 6B(c) but here the variable optical relief structure is buried within an optical material with a localized modulated refractive index (and as an optional augmentation of this arrangement it may be possible to add to the layer a reflective sub-layer or multilayer stack on the back side thereof in order to further modify the overall phase-modulating optical properties of the combined optical structure); FIG. 6B(e) illustrates yet another way of modulating the phase of an incident light wave, this time by arranging for a variable travel length of a forward and a backward path of the incident light beam which is reflected by the optical relief formed in an optical material (forming the layer 2) that is reflective or is a material coated with one or more layers (e.g. as a monolayer or a layer stack) of a reflective coating material; FIG. 6B(f) is a variation of the modulation option of FIG. 6B(e) but replacing air (shown above the optical relief) with a transparent optical material of a higher refractive index than air; FIG. 6B(g) illustrates yet another way of modulating the phase of an incident light wave, this time by arranging for the optical material to have a variable refractive index combined with a reflecting material on the back side of the layer (i.e. on the side opposite to the side through which the incident light first enters the optical structure), whereby the combined optical structure can phase-modulate the incident light wave on its forward and backward (i.e. reflected) paths through the layer; FIG. 6B(h) illustrates yet another way of modulating the phase of an incident light wave, this time by applying an intermediate layer or stack of layers (e.g. of a metallic material such as aluminium, or of a high refractive index material with a refractive index higher than that of the surrounding optical material) onto the front (i.e. entrance) side of the layer and optionally also burying an optical relief layer (as a modification of option FIG. 6B(c)) within the layer, whereby the phase properties of transmitted and/or reflected light are yet further phase-modulated (and as an optional augmentation of this arrangement it may be possible, for example, to design the overcoat on the front (i.e. entrance) side of the layer as an antireflective coating layer, whilst the overcoat on the buried optical relief layer may be designed with an increased reflectivity, in order to yet further modify the overall phase-modulating optical properties of the complete optical structure).
In practical terms, an optical structure with a variable refractive index may typically be originated using, for example, a direct-write laser or electron beam system which enables locally variable dosing of energy in a photopolymer or electron-sensitive polymer. Similarly, the micro-relief of structures with relief modulation may be recorded (i.e. exposed and etched) by direct writing in photoresists or e-beam resists or by ion-beam etching techniques. Any of these originated structures may be further transferred (i.e. copied or replicated or recorded) by photolithography or nanoimprint lithography methods or by embossing, UV casting, etc. into other materials, e.g. photopolymers, metals, plastics, UV-curable polymers, etc. (to name just a few). Practical examples of suitable such systems, and methods and equipment for practising them, are well-known in the art and to the skilled person.
In any or all of the above illustrated cases in FIGS. 6B(a) to (h), if desired or appropriate an overcoat may be applied to any surface of the main layer 2 and/or any exposed optical relief (or a relief before being buried) forming part of the main layer 2 in order to yet further modify the interaction of incident light with the optical structure. Such an overcoat may be applied fully, or selectively to only one or more parts of the main layer 2 (i.e. to all or only selected one(s) of the pixels 3, or perhaps even just to one or more sub-groups of the pixels 3). The selective application of an overcoat may even not have to respect the pixel boundaries, and so may even extend over one or more boundaries between adjacent or neighbouring pixels 3. Such an overcoat may for example be reflective, partially reflective or selectively reflective with respect to wavelengths of the interacting light, which may for example allow for colour effects to be created in the observed light wavefront and/or which may be designed to further phase-modulate the wavefront upon both reflection and transmission thereof within the layer 2. Techniques for designing or applying such layers (whose function(s) may typically be based on principles of interference on thin dielectric layers) are well-known in the art and thus readily available to the skilled person.
Another possible option, not explicitly illustrated in FIGS. 6B(a) to (h), may involve application of an additional thin layer of optical material (i.e. substantially thinner than either of the material sub-layers therebelow and thereabove) onto the optical relief between (i.e. sandwiched between) the two material sub-layers shown in option FIG. 6B(c), wherein that sandwiched additional thin layer has yet another different refractive index from those of the other two material sub-layers therebelow and thereabove.
Such an additional thin layer may be substantially uniform in thickness or its thickness may vary across it. In the case of the sandwiched additional layer having a refractive index higher (or alternatively lower) than those of the other two material sub-layers therebelow and thereabove, the phase modulation of the incident light may occur primarily in reflection thereof, unless the thickness of the sandwiched additional layer varies, which may also lead to phase-modulation of the incident light wave upon transmission.
Other phase modulation options may be achievable by various combinations of any two or more of any of the above-described phase-modulation options shown in, based on, or augmented from FIGS. 6B(a) to (h). Also, various phase modulation types may be applied at different locations on the layer 2, in which case the image of the three-dimensional object (or the scattered light therefrom) may acquire a differentiated appearance at different locations on the layer 2 from which it is being generated. Such a differentiation may not only have aesthetic value, but may also contribute to increased complexity of the optically variable image device, making it much harder to counterfeit.
Practically any optical structure that is capable of modulating the phase of a light wave (including those depicted in FIG. 6B) has a phase modulation depth limit for a given wavelength and a given angle of incidence of the light interacting with the optical structure. Thus, each portion of the wavefront that exits the layer and imitates the light scattered by the three-dimensional object will generally have a phase modulation depth which is at most equal to the phase modulation depth limit of the given structure. In embodiments of this invention the phase modulation depth limit may be defined as the maximum phase shift (i.e. maximum phase delay) that the optical structure can induce in the light wave interacting with the structure. For example, if the optical structure modulates the light wave by means of a varying refractive index within the layer 2 (i.e. as per the modulation option shown in FIG. 6B(a)), then the maximum phase shift will occur between two light rays when one propagates through a location with the lowest refractive index exhibited by the structure and the other propagates through a location with the highest refractive index exhibited by the structure. Such a situation is depicted in FIG. 7.
Turning to FIGS. 7(a), (b), (c) & (d): FIG. 7(a) shows a cross-section of a pixel 3 surrounded by air (i.e. with air thereabove and therebelow) of a width d with refractive index modulated along the layer 2 and two selected rays #1 and #2 of the incident light 9 entering the optical structure at angle αin in air. Ray #1 further propagates through the structure at the location with the lowest, and ray #2 with the highest, refractive index that the pixel's optical structure can have (or can be modulated to), the refractive index having its minimum and maximum values at locations x' and x", respectively, where the rays cross the reference surface within the layer 2. For simplicity of the illustration (and/or as an approximation), the minimum and maximum refractive index values are adjusted to the average refractive index values nmin and nmax averaged along the path of the respective rays within the layer 2, and along such paths the refractive index values are considered constant (this is practically true for angles of incidence near zero). The reference surface (i.e. parallel to the layer 2) is placed at the centre of the layer's thickness; however, its position may be chosen differently as a slightly different approximation. Within the layer 2 the rays will propagate at angles of refraction αn(min) and αn(max), respectively. The rays #1 and #2 are just part of a wider incident light beam (i.e. wave) 9 illuminating pixel 3, which is depicted in FIG. 7(b) (which illustrates the same situation as in FIG. 7(a)) by added parallel rays (i.e. the dashed lines). All these rays are part of a plane wave and its wavefront planes (i.e. equiphase planes) are shown as numbered lines perpendicular to the direction of propagation spaced by a distance equal to the wavelength λ. Such a distance represents a 2π phase shift in the sinusoidal plane wave of the incident light.
As soon as the wave enters the optical structure of the pixel, the phase relations in and/or between the individual rays will be modified by the modulated refractive index.
The 2π phase shift along the rays #1 and #2 propagating in the layer is illustrated by short numbered solid lines perpendicular to their direction of propagation within the layer. Their spacing is still one wavelength, but now the wavelength λ divided by the respective refractive index within the layer. The lines with the same number represent the same phase in the rays #1 and #2 as well as in the incident wave (i.e. undisturbed/unmodified). When rays #1 and #2 reach the exit side of the pixel, it is shown that the wavefront is locally shifted (i.e. delayed) by a distance Δ' and Δ", respectively, which translates to the respective phase shifts (2π/λ)Δ' and (2π/λ)Δ" induced in the wavefront transmitted through the layer, compared with the original undisturbed wavefront of the incident wave. (Note that the equiphase planes of the original undisturbed incident wave beyond the input/entrance side of the pixel are shown as dashed lines perpendicular to the wave's propagation direction.) Thus the wavefront of the incident wave has been modulated. Since the rays #1 and #2 propagated through the locations with the minimum and maximum refractive index the optical structure can have, the difference between the phase shifts (2π/λ)Δ' and (2π/λ)Δ" represents the maximum phase shift the optical structure can induce in the wavefront of an incident wave of a particular wavelength and angle of incidence. Using the geometrical approach describing the ray path in FIG. 7(b), the phase shift between rays #2 and #1 at the output of the pixel can be derived and expressed as Δφmax = (2π/λ)(Δ" − Δ') = (2π/λ) t [nmax cos αn(max) − nmin cos αn(min)], which represents the phase modulation depth limit of the optical structure within pixel 3 comprised in layer 2 of thickness t for the wave 9 incident at the pixel entrance side at angle αin and of wavelength λ.
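As a purely illustrative aid (not forming part of the disclosure), the phase modulation depth limit formula above can be evaluated numerically; the function name and the example layer parameters below are assumptions chosen for the sketch, and the refraction angles are obtained from Snell's law (sin αin = n sin αn):

```python
import math

def phase_modulation_depth_limit(t, n_min, n_max, wavelength, alpha_in_deg=0.0):
    """Phase modulation depth limit (in radians) of a refractive-index-modulated
    layer of thickness t, for light incident from air at alpha_in_deg degrees:
    (2*pi/lambda) * t * (n_max*cos(a_n(max)) - n_min*cos(a_n(min)))."""
    a_in = math.radians(alpha_in_deg)
    a_min = math.asin(math.sin(a_in) / n_min)  # refraction angle at lowest index
    a_max = math.asin(math.sin(a_in) / n_max)  # refraction angle at highest index
    return (2 * math.pi / wavelength) * t * (
        n_max * math.cos(a_max) - n_min * math.cos(a_min))

# Example (assumed values): 10 um layer, index range 1.48-1.52,
# design wavelength 555 nm, normal incidence
depth = phase_modulation_depth_limit(t=10e-6, n_min=1.48, n_max=1.52,
                                     wavelength=555e-9)
print(depth / (2 * math.pi))  # depth limit expressed in waves, approximately 0.72
```

At normal incidence the expression reduces to (2π/λ) t (nmax − nmin), i.e. the depth limit scales simply with the layer thickness and the index contrast.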
The geometrical approach of determining the maximum modulation depth limit of the optical structure described above can actually be used to determine the phase shift of any of the rays within the incident wave (i.e. any ray incident at any location x across the pixel width) passing through the optical structure of a given pixel relative to the corresponding ray of an undisturbed incident wave. It can be derived from the geometries depicted in FIGS. 7(a) & (b) and expressed as a phase shift function Δφ(x) = (2π/λ) [n(x) − cos(αin − αn(x))] t / cos(αn(x)), where n(x) is the refractive index of the optical structure in the pixel assigned to a location x on the reference surface (i.e. in the same manner as described above for the minimum and maximum refractive indices), and αn(x) is the corresponding angle of propagation (i.e. refraction) of the ray in the optical structure.
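By way of a non-limiting numerical sketch (the function name and parameter values are illustrative assumptions), the phase shift function Δφ(x) may be evaluated as follows; note that at normal incidence it reduces to (2π/λ) t (n(x) − 1), and that the difference of its values at the extreme indices reproduces the depth limit formula of the preceding paragraph:

```python
import math

def phase_shift(n_x, t, wavelength, alpha_in_deg=0.0):
    """Phase shift (radians) of a ray crossing the pixel at a location where the
    averaged refractive index is n_x, relative to the corresponding undisturbed
    ray: (2*pi/lambda) * [n(x) - cos(a_in - a_n(x))] * t / cos(a_n(x))."""
    a_in = math.radians(alpha_in_deg)
    a_n = math.asin(math.sin(a_in) / n_x)  # Snell's law at the entrance side
    return (2 * math.pi / wavelength) * (
        n_x - math.cos(a_in - a_n)) * t / math.cos(a_n)

# At normal incidence: (2*pi/lambda) * t * (n - 1)
print(phase_shift(1.5, 10e-6, 555e-9))
```

A simple consistency check is that phase_shift(nmax, ...) − phase_shift(nmin, ...) equals the phase modulation depth limit for the same thickness, wavelength and angle of incidence.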
The two preceding paragraphs apply to modulation of the incident wave interacting with an optical structure which is modulated by a variable refractive index of the layer's medium (i.e. material). An analogous geometrical approach can be used or derived by a person skilled in the art to determine the phase modulation depth limit and phase shift function for other modulation options (such as the examples shown in FIG. 6B). The phase modulation may in practice depend on modulation of properties of the optical structure other than the refractive index. For example, in the case of relief modulation options (b) or (e) in FIG. 6B, the phase modulation would depend on the modulation of the relief thickness or depth, respectively. FIGS. 7(c) & (d) show examples of both modulation options, namely 7(c) on the left showing modulation of the thickness t(x) of material of refractive index n in the layer, and 7(d) on the right showing modulation of the depth h(x) of the reflective relief, indicating how the phase shift can be determined for these types of modulation. In each example, two rays in the light 9 incident on the pixel 3 at the angle αin are shown: one (i.e. solid line) which is modulated by the optical structure 8 and the other (i.e. dashed line) which passes through (i.e. modulation option 7(c)) or is reflected from (i.e. modulation option 7(d)) the layer 2 of thickness t without being affected by the optical structure 8. Both rays intersect at location x within the pixel width on the reference surface placed in both cases at the exit side of the pixel. The common wavefront of both rays is indicated by a fine dashed line perpendicular to the ray propagation direction at the point of entry of the modulated ray into the pixel. The phase shift between the modulated and undisturbed rays can be determined from the difference of their respective optical paths indicated by thicker portions of the ray paths in FIGS. 7(c) & (d). The modulated portion in FIG. 7(c) starts after the modulated ray enters the material of refractive index n, where it propagates through the material thickness at an angle of refraction αn to a location xt where it leaves the material and is refracted into the air at angle α(xt) and further propagates to the pixel exit. In FIG. 7(d) the modulated portion of the ray path starts when the modulated ray enters the pixel and propagates at the angle of incidence αin until it reaches the reflective relief at location xh and is reflected at an angle α(xh) towards the pixel's exit side (note: in this case the pixel's entrance and exit sides are the same).
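As a simplified illustrative sketch (the function names are assumptions, and the formulas use the standard plane-wave slab and specular-reflection results rather than the exact ray construction of FIGS. 7(c) & (d)), the phase shifts for the thickness-modulated and relief-modulated options can be approximated as:

```python
import math

def thickness_modulation_phase(t_x, n, wavelength, alpha_in_deg=0.0):
    """Transmission phase shift for FIG. 7(c)-style modulation: a local material
    thickness t_x of index n, relative to propagation through air over the same
    layer (plane-parallel slab formula: (2*pi/lambda)*t_x*(n*cos(a_n)-cos(a_in)))."""
    a_in = math.radians(alpha_in_deg)
    a_n = math.asin(math.sin(a_in) / n)  # refraction angle inside the material
    return (2 * math.pi / wavelength) * t_x * (
        n * math.cos(a_n) - math.cos(a_in))

def relief_reflection_phase(h_x, wavelength, alpha_in_deg=0.0):
    """Reflection phase shift for FIG. 7(d)-style modulation: specular reflection
    from a relief of local depth h_x shortens the round-trip path by
    2*h_x*cos(a_in) relative to reflection from the reference surface."""
    a_in = math.radians(alpha_in_deg)
    return (2 * math.pi / wavelength) * 2 * h_x * math.cos(a_in)
```

For example, a quarter-wavelength relief depth at normal incidence yields a half-wave (π) phase shift on reflection, which is the familiar round-trip doubling of the relief depth.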
In reality, in some practical embodiments of the invention, a rigorous calculation of the maximum phase modulation depth or phase shift (i.e. modulation) function may be more complex due to other effects, such as, for example, diffraction, polarization, rays propagating along curved trajectories within the layer 2, rays near the pixel boundaries interacting also with neighbouring pixels, etc. However, in many practical implementations of embodiments of the invention the geometrical approach, such as the one described above, may be considered to be accurate enough and an acceptable approximation. This is especially since the main purpose of creating the optically variable image device according to embodiments of the invention is to provide a visual effect for the observer, not necessarily a specific or precisely defined optical function. In this sense it may in many embodiments also be that discrepancies between the designed and the real optical behaviours of the optically variable image device caused by approximations used or by recording or manufacturing imperfections or limitations may not be especially critical.
The phase modulation depth may usually be different upon reflection or transmission of the incident light interacting with the same optical structure in a given pixel 3. In some embodiments of this invention the minimum phase modulation depth may be considered to be of the order of 1/10 of the design wavelength, i.e. the wavelength in respect of which the optical structure is primarily designed and/or the wavelength the optical structure is to be used with. However, in many practical applications the minimum phase modulation depth may be at least about one or two waves of the design wavelength. In other embodiments, the optical structure of a given pixel 3 may easily have a phase modulation depth limit which could be several tens (e.g. from about 20 to about 50 or 60 or 70 or 80 or 90 or 100) or even several hundreds (e.g. from about 200 to about 500 or 600) of wavelengths. Furthermore, the design wavelength may not necessarily be the same as the wavelength(s) of the light scattered by the three-dimensional object or incident light modulated by the layer 2 that is observed by a viewer. Nevertheless, if the viewer is a human, then the wavelength(s) of the observed scattered or modulated light may be part of the visible region of the electromagnetic spectrum. In such cases, therefore, the design wavelength may typically be a part of the visible region of the spectrum, e.g. around 500 nm or 555 nm, or multiples of such a wavelength (e.g. from about 2 to about 10 times such a wavelength). Such an approach for selection of the design wavelength may be applied similarly to other wavelength ranges if the observation is done, for example, in ultraviolet or infrared light.
In more detail: the optically variable image device and its relationship with the three-dimensional object, and how the three-dimensional object is imitated by recreation of its image by the optically variable image device

Turning to FIGS. 8A and 8B, having described the general structure and general optical phase-modulation properties of the layer 2 in the optically variable image device 1 of many embodiments of the invention, FIGS. 8A and 8B illustrate how the light scattered by the (real or virtual) three-dimensional object 7 relates to the light generated by the pixels 3 in the layer 2. The modulation function of the optical structure of a particular pixel 3 (i.e. the function describing the changing optical property or properties of the optical structure across the pixel 3's area) is designed such that it represents an encoded light wavefront 12 exiting the layer 2, which imitates light 11 scattered by the (real or virtual) three-dimensional object 7 upon either transmission through and/or reflection from such an object 7. Prior to its scattering, the light is incident at a predetermined angle of incidence 9xx at sub-area location 5xx of the receiving surface 6 of the three-dimensional object 7, the incident location corresponding to the respective pixel 3xx in the layer 2. The light also has a specific wavelength λ, i.e. the design wavelength. FIGS. 8A(a) to (h) illustrate a situation where the light incident at the specific sub-area locations 503, 512 & 515 of the receiving surface 6 and at specific angles of incidence 903, 912 and 915 is transmitted through the object 7 and exits the object 7 as scattered light 11 or a representation of scattered light 11'.
The incident light may be considered typically to be in the form of a plane wave (i.e. a wave having a planar wavefront), thereby being unidirectional when it is incident at a specific sub-area 5 on the entrance surface 6 of the three-dimensional object 7. In typical embodiment cases, in particular when the layer 2 is substantially planar, the direction of the incident light 9 may be considered to be normal to the layer 2 or otherwise unidirectional across the layer 2's entrance surface (or portion thereof), i.e. also being incident at the same angle on the entire entrance surface 6 of the object 7 (or on a portion thereof). The incident angles of the unidirectional incident light may have values of up to about 10 or 15 or 20 or 30 degrees from normal. In some cases, the direction of incident light may vary across the surface 6, for example imitating directions from a point source illuminating the object 7, or converging towards a predetermined observer's position (such as, for example, at a distance of around 20 to 30 cm from the layer 2 and within about 30° from the normal thereto), although the illumination direction across each of the sub-areas 5 of the receiving surface 6 may typically be considered to be constant (for example, determined as an average illumination direction of incident light varying across the pixel area). In other embodiment cases the incident directions may imitate a finite area light source (e.g. a diffuse light source), and the illumination directions of sub-areas 5 may be randomly chosen as if coming from different points of the source's surface. In a similar manner, the illumination directions may be assigned to the sub-areas 5 from a converging illumination source. In such cases the typical full angular width of the finite area light source may be not larger than about 20° when observed from any point of the layer 2.
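The assignment of per-sub-area illumination directions described above can be sketched as follows; this is an illustrative aid only (the function name, the point-source model and the plain-tuple vector representation are assumptions, not part of the disclosure):

```python
import math

def illumination_directions(subarea_centres, source_point):
    """Unit direction vectors from a point light source to each sub-area centre,
    imitating divergent illumination of the object's receiving surface.
    subarea_centres: list of (x, y, z) tuples; source_point: (x, y, z)."""
    dirs = []
    for centre in subarea_centres:
        v = tuple(c - s for c, s in zip(centre, source_point))
        norm = math.sqrt(sum(x * x for x in v))
        dirs.append(tuple(x / norm for x in v))  # normalize to a unit vector
    return dirs
```

For a diffuse (finite-area) source, the same helper could be called with a randomly sampled point on the source surface per sub-area, in line with the random assignment of directions mentioned above.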
The light 11 or 11' scattered by the three-dimensional object 7 is a result of the interaction of the incident light of propagation direction 9xx with the object's surfaces and the optical material (i.e. the optical material volume), after entering the object's surface 6 at the delimited sub-areas 5xx.
In putting into practical effect embodiments of the invention, the light scattered by the (real or virtual) three-dimensional object 7 may be determined either by measurements or by calculations, based on knowledge of the optical material being used and the surface properties of the three-dimensional object 7. In many embodiment cases a geometrical approach, such as ray-tracing, may be sufficient for determining the directions and/or intensities of the scattered light. A ray-tracing approach may be especially advantageous if the three-dimensional object 7 is defined as a virtual object in a digital form (for example, in the use of 3D optical modelling software).
In practising embodiments of this invention, the process of the encoding of the scattered wavefront 11 (or its representation as a selection of discrete scatter directions 11') in the optical structure of a given pixel 3 does not primarily focus on the wavefront 11 itself, i.e. its shape or phase variation, but rather on the main propagation direction(s) 10xx of its scatter.
In other words, the scattered wavefront (or light pattern) 11 (or its representation 11') which is scattered from a particular sub-area 5 (or selected point locations thereon) on the preselected surface 6 and has the main propagation direction(s) (as defined below) is represented by an encoded wavefront 12 in the respective pixel 3, the encoded wavefront 12 being generated by the respective pixel 3 and having the same main propagation direction(s) as the respective scattered wavefront (or its representation), when the respective pixel 3 is illuminated under the same conditions as the respective sub-area 5 on the preselected surface 6. In general, the main scatter (or propagation) direction can be determined as a weighted average (i.e. weighted with respect to intensity) of the propagation directions (i.e. continuous or discrete) in the wavefront (note: in FIGS. 8A(a)-(h) and 8B(a) & (b) some propagation directions are indicated by the dashed lines (or dashed arrows) perpendicular to the equiphase contours/planes of the wavefront), where each direction has its intensity as determined by measurement or modelling (e.g. ray-tracing). Alternatively, the main scatter direction can be considered to be the scatter or propagation direction with the maximum light intensity. FIGS. 8A(c) & (d) also illustrate an example case where the light entering the object 7's surface at sub-area location 515 is scattered into two distinct main propagation directions 1015a & 1015b (i.e. fully separable at half of the peak intensity of the beams propagating in such directions). In such a case the designer has a choice to either assign one (average) main direction 1015 to the scattered light (as per FIGS. 8A(a) and (b)) or alternatively to assign two (or plural) main directions 1015a & 1015b (as per FIGS. 8A(c) & (d)).
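The intensity-weighted averaging of propagation directions described above can be sketched as follows (an illustrative assumption of this sketch is that directions are represented as 3-component vectors, which need not match any particular modelling tool):

```python
import math

def main_scatter_direction(directions, intensities):
    """Determine the main scatter direction as the intensity-weighted average of
    propagation vectors (e.g. from measurement or ray-tracing), renormalized to
    a unit vector.  directions: list of (x, y, z); intensities: list of floats."""
    wx = sum(d[0] * i for d, i in zip(directions, intensities))
    wy = sum(d[1] * i for d, i in zip(directions, intensities))
    wz = sum(d[2] * i for d, i in zip(directions, intensities))
    norm = math.sqrt(wx * wx + wy * wy + wz * wz)
    return (wx / norm, wy / norm, wz / norm)
```

The alternative mentioned in the text, taking the single direction of maximum intensity, would simply be `directions[intensities.index(max(intensities))]`.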
In yet other embodiments, two or more (e.g. typically up to about 10 or 20) discrete directions within the scattered light from a given sub-area may be selected as main directions. Such directions are defined as propagation directions of light beams (or rays) scattered from particular point locations on a given sub-area (or determined as main directions from the light scattered from elementary areas around such point locations, elementary areas being smaller portions of the given sub-area), and they are a representation of the light scattered from the given sub-area. This is illustrated in FIG. 8A(g), where sub-area 503 is illuminated by incident light 903 and three separate rays are selected from the scattered light which propagate in their respective directions, i.e. main directions 1003a, 1003b and 1003c, as they exit the three-dimensional object. The point locations on the given sub-area from which the discrete rays are propagating and forming main directions as they exit the three-dimensional object may typically be evenly or quasi-evenly (i.e. representatively) distributed across the sub-area. Selecting a relatively small number of representative scatter directions from a small number of discrete points on the sub-area to determine the main direction(s) of the scattered light may be advantageous in speeding up the measurement or modelling (e.g. ray-tracing) process. Optionally, in certain embodiments, the selected discrete scattered representative directions may be averaged into one main direction (as done in the somewhat similar case of light scattered into two distinct directions as shown in FIG. 8A(a), where one main direction 1015 is determined and assigned to sub-area 515).
Once the main scatter directions 10xx associated with sub-areas 5xx of light entry are determined, they are assigned to pixels 3xx in the layer 2 corresponding respectively to the sub-areas 5xx on the object 7's receiving surface 6, optionally disregarding the fact that the scattered light exiting the object 7 may be offset relative to the location of the entry at the object 7's receiving surface 6. This latter issue may lead to the light scattered by the optical structure not necessarily accurately and fully imitating the volume (i.e. the three-dimensional) properties of the three-dimensional object 7 in the way a true holographic projection does. Nevertheless, the light generated by the optical structure will still be directional, resembling the true scattering properties of the object 7 (namely the main directions of scatter) and allowing for a semi-three-dimensional perception of the object 7 to a much higher degree of authenticity and accuracy than, for example, a two-dimensional graphic image using shading techniques to evoke the third dimension. High quality semi-three-dimensional perception of the object 7 may thus still be achieved especially well for reflective objects and/or objects formed as an embossed or cast relief motif, for example, or an object with an entrance surface in the form of a relief and the opposite side (i.e. the side opposite to the incident light) flat, or even vice versa as an option for transparent objects.
Because the object 7's volume (namely, the spatial separation of the receiving and output surfaces of the three-dimensional object 7) is thus ignored, it may be noted that not all three-dimensional objects may be suitable for practising some embodiments of this invention. Therefore, if the designer's goal is to provide an aesthetically pleasing or recognizable imitation of an object by the practising of one or more embodiments of this invention, at least some of the above-described limitations may need to be taken into consideration when choosing a three-dimensional object or its virtual model to be imitated by recreation of its image by the device of the invention.
As mentioned above, the main subject of imitation of light scattered from particular sub-areas 5 on the pre-selected surface 6 by the light generated by respective pixels 3 is the main propagation direction. However, the particular form or shape of the wavefront generated by a particular pixel 3 may be the subject of a design that is independent of the wavefront form or shape of the light scattered from the respective sub-area 5 on the pre-selected surface 6. This is illustrated in several ways in FIGS. 8A(a) to (f). The light scattered from the three-dimensional object 7 in FIGS. 8A(a), (c) and (e), i.e. light wavefronts or patterns 11 (i.e. scattered from sub-areas 512 and 515), is the same. The main propagation directions of the scattered light represented by the numerals 1012 and 1015 (optionally 1015a and 1015b) are the same as the main propagation directions of the light wavefronts or patterns generated by corresponding pixels 3 -see the corresponding respective FIGS. 8A(b), (d) and (f).
However, the shapes of these wavefronts are shown here with multiple options. One option is shown in FIGS. 8A(b) and (d) where the wavefronts 1012 generated by pixel 312 are diverging spherical wavefronts. Quite a different option is shown in FIG. 8A(f), where the light pattern generated by pixel 312 is a combination of three separate/discrete subcomponents or light wave constituents characterized by their own discrete wavefronts, one spherical and two planar, and their own respective discrete "sub-main" propagation directions 1012-1, 1012-2 and 1012-3, thereby collectively forming a complex wavefront (i.e. comprising discontinuities) or a light pattern having a main propagation direction 1012. In the case illustrated in FIGS. 8A(g) and (h), the complex wavefront is also constructed from discrete sub-components, although the main directions of these sub-components 1003a, 1003b and 1003c are dictated by the same main directions selected as the representation 11' of the light scattered from the sub-area 503. The design of the wavefront shape (or wavefront composition) generated by a particular pixel 3 in any given practical embodiment is at the designer's choice, which allows control of the appearance of the imitated object 7 and/or the creation of additional visual effects (such as diffuseness, contrast, shine, sparkliness, etc). FIGS. 8A(b), (d), (f), (h) and 8B(a) show examples of some such constructed wavefronts 12, some of them being curved (i.e. spherical) (as generated by pixels 312 and 315 in FIG. 8A(b) and by pixel 312 in FIGS. 8A(d) and 8B(b)) and some being a combination of plural discrete wavefront sub-components, e.g. curved (i.e. spherical) (as generated by pixel 315 in FIG. 8A(d)), planar (as generated by pixel 303 in FIG. 8A(h)), or curved (i.e. spherical) and planar (as generated by pixel 312 in FIG. 8A(f)).
All these constructions of wavefront 12 have, by nature or as an intended common property, a non-planar character in general, which allows observation at a wider range of angles even under a point source of illumination. The wavefront components 12 generated by individual pixels 3 which imitate the light scattered from corresponding sub-areas 5 on the pre-selected surface 6 may therefore be alternatively termed scatter imitating wavefronts or non-planar wavefront components.
Going further, some (real or virtual) three-dimensional objects may have locations on their surface 6, e.g. location 501 in FIG. 8B(a), which for a given incident direction 901 may not produce scattered light at the observer's side, i.e. at the output or exit side of the object 7 where the majority of the scattered light is observed. In this illustrated case (in FIG. 8B(a)) the light incident at direction 901 is actually totally internally reflected within the volume of the object. Thus, scattered light may exit the object on the opposite side to the observation side or even remain trapped within the object itself. In other cases the scattered light may exit at an especially wide angle, and the main scatter direction may not be obvious or may be ambiguous. Scatter with a not well-defined main scatter direction may also be produced at locations on the object 7 where the pre-selected receiving surface exhibits a discontinuity or a sharp change. In all such or other similar cases, it may thus be desirable or appropriate, at the discretion of the designer, to fill in or smooth out the main directions of scatter, considering the scatter directions from adjacent/neighbouring locations, in order to produce a visual imitation of the scattered light which is more pleasing to the observer, in comparison with an imitation resulting strictly only from measured or calculated properties of the scattered light.
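The filling-in or smoothing of undefined main scatter directions from neighbouring locations, as described above, can be sketched as follows (the grid representation, the 4-neighbour choice and the single-pass approach are illustrative assumptions of this sketch, at the designer's discretion):

```python
import math

def fill_missing_directions(grid):
    """grid: 2-D list of unit main-direction vectors (3-tuples), or None where no
    well-defined main scatter direction exists (e.g. total internal reflection or
    a surface discontinuity).  Each None is filled with the normalized average of
    its defined 4-neighbours (one smoothing pass; repeat for larger gaps)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] is not None:
                continue
            acc = [0.0, 0.0, 0.0]
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] is not None:
                    for k in range(3):
                        acc[k] += grid[rr][cc][k]
            norm = math.sqrt(sum(a * a for a in acc))
            if norm > 0.0:
                out[r][c] = tuple(a / norm for a in acc)
    return out
```

Averaging the neighbouring unit vectors and renormalizing gives a direction that blends smoothly into its surroundings, which matches the stated aim of a visually pleasing imitation rather than a strictly measured one.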
In some embodiments, the wavefront which is to be generated by a given pixel may be designed with an extension (or possibly even more than one extension) into one or more neighbouring or adjacent pixels (i.e. into a portion of or an entire area delimited by the neighbouring or adjacent pixel or pixels). In such a case, as illustrated by way of example in FIG. 8B(b), the light incident on a given pixel 312 having direction 912 is also extended beyond the boundaries of the given pixel into the neighbouring pixels 311 and 313 (i.e. the extension of incident light corresponds to the extension of the wavefront to be encoded in the given pixel). The wavefront designed for encoding into pixel 312 has a spherical wavefront 12 and the same wavefront form is kept even in its extended portions 1211 and 1213. The continuous equiphase planes of the spherical wavefront are shown as dotted curved lines in the extended portions and as solid curved lines in the portion which is to be generated by the pixel 312. In this particular example the propagation directions in the extended portions of the designed wavefront are not the same as the main direction 1012 of the portion 12 which corresponds to the main direction of scattered light 11 from the subarea 512 on the pre-selected surface 6 shown in FIG. 8A(a). However, in other cases, for example, when the wavefront designed for a given pixel is a composition (or a combination) of plural planar waves (as shown, for example, in FIG 8A(h)), the main directions of the extended wavefront portions may be the same as the main direction of the wavefront designed for the given pixel. In any event, the extended portion(s) of the wavefront designed for a given pixel may be either (i) encoded into the neighbouring pixel(s), for example, so as to be able to generate light from such pixel(s) provided it does not have a definable main direction due to a lack of existence of scattered light at the exit of the object, i.e. 
light scattered from a corresponding sub-area not being formed at the exit from the object, as described in the case shown in FIG. 8B(a), or (ii) the extended portion(s) can be encoded into neighbouring pixel(s) as auxiliary or additional or superimposed wavefront component(s) or image(s), described in more detail later below.
As discussed above and shown in FIGS. 8A(a) to (h) by way of example, the subject of the imitation of a given portion of the three-dimensional object 7, in the recreation of that portion's image recorded in a corresponding pixel 3 of the optically variable image device 1, is the main direction 10xx of scattered light exiting the object 7, which, prior to its scatter, enters the object 7's receiving surface 6 at a particular sub-area 5. Such an imitation may be considered to be sufficient when an observer of the light collectively scattered by the optical structures in the arrangement of pixels 3 within the device 1 can recognize the resemblance of the object 7 and/or the way in which it scatters the light, in particular when the object 7 includes a form of cast or embossed relief and is either a reflective or a transmissive object (or a combination of both).
FIGS. 8A(a) to (h) show just one example of light being scattered by the three-dimensional object 7 upon transmission of the light through the object 7. In other embodiments, however, the incident light may be scattered upon reflection instead. Embodiments of the present invention may employ both mechanisms of light scattering, either separately or simultaneously.
How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device

Referring now to FIGS. 9A & 9B, we turn to the basic wavefront properties of the light generated by each pixel's optical structure and the way in which each such wavefront is encoded into the optical structure of each respective pixel.
The main scatter directions 10 for the various sub-areas 5 of the object 7 are determined for the respective incident directions 9 as described above in relation to FIG. 8A. The same incident directions 9 apply to the corresponding pixels 3 when it comes to encoding in the respective optical structures of those pixels the respective wavefronts to be generated thereby. In many embodiments the generated wavefronts will be substantially non-planar.
As shown in FIGS. 9A & 9B, the main scatter directions 10 associated with the sub-areas 5 on the pre-selected surface 6 of the (real or virtual) three-dimensional object 7 as described above (in relation to FIGS. 8A and 8B) are applicable to the main directions of propagation of the wavefronts 12 generated by the optical structures of the respective pixels 3 upon illumination thereof by plane waves of respective incident directions 9, i.e. the main scatter directions 10 generated by the three-dimensional object 7 and by the corresponding pixels remain the same. However, the other properties of each propagating wavefront 11 generated by the object 7, such as its shape, divergence, etc., may be discarded and substituted by new properties in order to generate a specifically shaped and/or composed wavefront 12 in accordance with the principles of this invention. Such substitution may give the designer freedom to control the visual appearance of the light imitating the light scattered from the three-dimensional object 7, for example by virtue of its degree of shine, glossiness, brilliance, diffuseness, smoothness, etc. One of the key parameters of the wavefront 12 of the beam generated by the optical structure within a given pixel 3 is its divergence (alternatively its convergence) or just its angular width w. The angular width w is defined as the full apex angle of a cone inscribed within the beam at half of its intensity, and it can vary in various azimuthal directions (i.e. across the beam's cross-section). The light beams with wavefronts (optionally wavefronts with continuous forms, such as, for example, a spherical form) generated by the optical structures of the majority of the pixels 3 in the pixel arrangement used in accordance with various embodiments of this invention may have a full angular width of at least about 5° (or possibly even at least about 10°, or perhaps even at least about 15°).
Beams with angular widths greater than this lower limit may ensure greater visibility of the generated or scattered light over a wider range of observation angles, which may give a certain degree of smoothness or diffuseness to the observed scattered light, whereas narrower beams may provide more glossiness or brilliance of the observed scattered light. Both types of beams may provide desirable visual effects or appearances of the observed scattered light imitating the three-dimensional object 7 which are easily controllable by the designer. Larger values of the angular width w (e.g. up to about 10 or 20 or 40 or 60 or even 70°) may also help to better cover (or conceal) the unwanted scatter directions (for example, caused by a diffraction due to the limited size of the pixels 3) that are generated by a pixel's inner structure or its boundaries. The unwanted scatter directions may be a part of the scatter-imitating wavefront, but they generally are not taken to be a part of the intended wavefront design that is fundamental to embodiments of the invention.
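By way of an illustrative numerical sketch of the definition above, the full angular width w at half intensity can be measured from a beam's intensity profile. The Gaussian profile, the 8° spread parameter and the sampling below are assumptions chosen purely for illustration:

```python
import numpy as np

def angular_width_fwhm(theta_deg, intensity):
    """Full angular width w: the full apex angle over which the beam
    intensity stays at or above half of its peak value."""
    half = intensity.max() / 2.0
    above = theta_deg[intensity >= half]
    return above.max() - above.min()

# Example: a Gaussian-like beam profile centred on its main direction.
theta = np.linspace(-30.0, 30.0, 6001)        # observation angles in degrees
sigma = 8.0                                   # assumed beam spread parameter
I = np.exp(-theta**2 / (2.0 * sigma**2))      # normalized intensity profile

w = angular_width_fwhm(theta, I)              # roughly 2.35 * sigma degrees
```

For a Gaussian this reproduces the usual full width at half maximum; the same function applies unchanged to a measured or simulated intensity profile.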
The diverging (or converging) wavefronts generated by the optical structures of the pixels 3 may have any of various basic shapes (i.e. phase profiles or forms), such as spherical, parabolic (either rotationally symmetric or asymmetric, e.g. with an elliptical cross-section or a cross-section of some other regular or irregular shape), or an otherwise continuous non-planar shape, or the wavefront may be formed as a composition of a plurality of overlapping (especially interfering) discrete wavefronts (e.g. planar or spherical) that propagate in plural directions over the intended angular width w.
FIG. 9A(a) shows a two-dimensional illustration (namely, a cross-section of a pixel 3's optical structure and the involved light beams) of how a spherical wavefront, as an example of a converging wavefront, can be designed and encoded into the optical structure of a given pixel 3. The incident light with a planar wavefront enters the pixel 3's input surface at direction 9 (in this case a direction perpendicular to the pixel's input surface in the z = 0 plane associated with the output surface of the pixel within the layer) and interacts with the pixel's optical structure of a variable refractive index n(x), whose profile across the pixel width is shown in FIG. 9A(b). As the light transmits through the optical structure of thickness t, its wavefront is modified (i.e. modulated) in accordance with the optical path length it undergoes at different locations on/in the optical structure. At the output of the pixel the interacting wave has a spherical wavefront with main propagation direction 10, i.e. the main propagation angle αout. The spherical wave (or rather its portion with angular width w) can be mathematically described in a complex form as uout = (Aout/r(x,z))exp[i(kr(x,z) + Φout)], where Aout/r(x,z) is the amplitude of the spherical wave and φout(x,z) = kr(x,z) is its phase modulation function (the time character of the propagating wave is ignored for simplicity and without having any significant effect on the encoding process). Φout is a phase offset constant which may be derived from the phase of the incident wave or set arbitrarily, since the absolute phase of the incident wave may be arbitrary as well. The amplitude of the spherical wave changes with the distance r(x,z) from the point source S (i.e. the centre of the spherical wavefront or a focal point), and Aout is the amplitude of the input wave at plane z = 0 reduced by interaction with the optical structure (typically due to absorption, reflection or scatter losses).
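The spherical-wave phase modulation function φout(x,z) = kr(x,z) can be evaluated numerically at the pixel exit plane z = 0. In this sketch the wavelength, pixel width and focal point S = (xs, zs) are illustrative assumptions; the sketch also checks that expressing the phase via the local propagation angle, using sin αout(x) = (xs − x)/r, gives the same values:

```python
import numpy as np

# Phase of a spherical wave at the pixel exit plane z = 0 for a point
# source / focal point S = (x_s, z_s).  All numeric values are
# illustrative assumptions, not taken from the patent's figures.
lam = 550e-9                           # assumed design wavelength
k = 2 * np.pi / lam                    # wave number k = 2*pi/lambda
x_s, z_s = 2e-6, 40e-6                 # assumed focal point S
phi_offset = 0.0                       # arbitrary phase offset constant

x = np.linspace(-5e-6, 5e-6, 1001)     # positions across the pixel width d
r = np.hypot(x - x_s, z_s)             # distance from S to the exit plane
phi_out = k * r + phi_offset           # phase distribution phi_out(x, 0)

# Same phase via the local propagation angle: sin(alpha_out(x)) = (x_s - x)/r
sin_a = (x_s - x) / r
mask = np.abs(sin_a) > 1e-9            # avoid division by zero under S
phi_alt = k * (x_s - x[mask]) / sin_a[mask] + phi_offset
```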
In practising embodiments of the invention, the variable character of the spherical wave amplitude at the z = 0 plane may be ignored, especially when considering the size of the pixel being much smaller than a typical observation distance (usually by plural orders of magnitude). Thus, the variable amplitude Aout/r(x,z) may be substituted by a constant value Aout. Wave vectors kout(x), representing the propagation directions of the wavefront, are designed such that they vary across the width d of the pixel, covering a range of angles of width w around the main propagation direction 10 and pointing to a common focal point, which may also be understood as a point source S having coordinates (xs, zs). The absolute value of each kout vector is k = 2π/λ, where λ is the design wavelength. Calculating the product of the wave number k and distance r in the z = 0 plane yields a phase function of the spherical wave at the output of the pixel which is φout(x,0) = (2π/λ)(xs − x)/sin αout(x) + Φout, where αout(x) covers the interval of propagation angles (αout − w/2, αout + w/2) and sin αout(x) = (xs − x)/r.

The properties of the optical structure need to be designed such that they induce a phase shift (i.e. a shift of the position of the sinusoidal wave in its propagation direction) in the incident wave upon transmission through the optical structure of the pixel, so that the spherical output wavefront, propagating at main direction 10 and having phase distribution φout(x,0), is produced at the output of the pixel (i.e. at the z = 0 plane).

In practising many embodiments of the invention, many different calculation approaches may be taken to determining the respective phase modulating properties of the respective optical structures of the respective pixels 3. For example, solving the wave equation, using ray-tracing software, or alternatively using a thin lens (or thin transparent) approximation approach, may be used for this purpose, to name but a few possible methods.
Any of these approaches may be suitable for calculating the optical properties of the optical structures of the various pixels. However, in most practical cases a simple geometrical approximation, such as, for example, a thin transparent approximation, which may be very efficient from a design or computing-time standpoint, may yield optical structures for the pixels which can generate a desired overall output light pattern. In spite of various distortions or aberrations which may be induced in any given wavefront (e.g. a spherical wavefront) of the output light by the chosen calculation approach, such an output light pattern may still meet the requirements for imitating the light scattered by the three-dimensional object in accordance with the fundamental principle behind the present invention.

As an example, using the approach of a thin transparent, the optical structure may be approximated by an infinitesimally thin transparent, which locally changes the phase of an incident light wave. The incident wave at the input side of the transparent has a particular phase modulation function and the wave at the output side of the transparent has a different phase modulation function. Mathematically the output wave is defined as uout = uin·T, where T represents the thin transparent's transmission function. Thus, the transmission function can be calculated as T = uout/uin. Using FIG. 9A as an example, the incident plane wave that propagates along the direction of the wave vector kin can be mathematically described as uin = Ain·exp[i(kin·r(x,z) + Φin)], where Ain is the wave amplitude and Φin is an arbitrary phase offset constant. If the thin transparent (as an approximation of the pixel's structure 8) is placed into plane z = 0, then T = (Aout/Ain)exp[i((2π/λ)((xs − x)/sin αout(x) − x·sin αin) + Φout − Φin)].
Considering angle αin = 0 and no transmission losses (as an example), then T = exp[i((2π/λ)(xs − x)/sin αout(x) + ΦT)], where ΦT = Φout − Φin is a phase offset constant which can be chosen arbitrarily, since the phase offset of the incident wave can be arbitrary as well. The argument of the exponent, φT(x,0) = (2π/λ)(xs − x)/sin αout(x) + ΦT, is the desired phase modulation function of the transparent.

Now, the phase function of the transparent has to be translated into a mechanism by which a real optical structure changes the phase of the interacting incident wave. The optical structure 8 in FIGS. 9A & 9B is capable of modulating the phase of the incident light wave by modulation of the refractive index n(x). Light passing through a medium of a particular refractive index will undergo an optical path which is a product of the length of travel and the refractive index, and when multiplied by the wavenumber the product describes the phase of the light collected along that path. By modifying (i.e. modulating) the refractive index within the optical structure of a given pixel, different portions of the light (i.e. different rays within the incident beam) will collect different amounts of phase when they reach the exit side of the pixel, and thus the wavefront of the light passing through the optical structure of the pixel will be modulated (i.e. modified) compared to its original (planar) form, or, in other words, it will be transformed from one wavefront form to another. An example of such a modulation mechanism is part of the description of FIGS. 7(a) & (b) above.
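The thin-transparent relation T = uout/uin can be sketched numerically for a simple normal-incidence case: a unit-amplitude plane wave in, a portion of a spherical wave out. The wavelength, geometry and loss factor below are illustrative assumptions:

```python
import numpy as np

# Thin-transparent sketch: the pixel's structure is modelled as an
# infinitesimally thin element at z = 0 with transmission
# T(x) = u_out(x)/u_in(x).  All numeric values are illustrative.
lam = 550e-9
k = 2 * np.pi / lam
x = np.linspace(-5e-6, 5e-6, 1001)

# Incident plane wave at normal incidence: constant phase at z = 0.
A_in = 1.0
u_in = A_in * np.exp(1j * 0.0) * np.ones_like(x)

# Desired output: portion of a spherical wave centred on S = (x_s, z_s).
x_s, z_s = 0.0, 40e-6
A_out = 0.9                                    # amplitude reduced by losses
u_out = A_out * np.exp(1j * k * np.hypot(x - x_s, z_s))

T = u_out / u_in                               # transmission function T(x)
```

The modulus of T carries the (assumed) loss and its argument carries the desired phase modulation function of the transparent.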
Mathematically the phase of the light wave exiting the pixel can be expressed as φ(x) = Δφ(x) + φ0(x), where Δφ(x) = (2π/λ)[n(x) − cos(αin − αr(x))]·t/cos(αr(x)) is a phase shift function describing the phase shift induced in the incident wave interacting with the optical structure at different locations x of the thin transparent, λ is the wavelength, αin is the angle of incidence of the incoming plane wave, αr(x) is the angle of refraction depending on the actual refractive index at location x (e.g. at the reference surface, see FIG. 7(a)), t is the actual (i.e. real) thickness of the layer 2 containing the optical structure of the pixel 3, and φ0(x) is the initial phase of the wave prior to entering the pixel, applied to the x location of the thin transparent. Considering the configuration in the example depicted in FIGS. 9A & 9B, with incident wave 9 normal to the pixel surface, and using both formulae for the phase function φ, defining the phase modulation function of the transparent (determined as above), the refractive index properties of the optical structure can be determined and expressed as n(x) = (xs − x)/[t·sin αout(x)] + n0. The modulation of refractive index n(x) in the optical structure within the layer of thickness t represents the encoded wavefront of the output light, which in this example case is a spherical wavefront of angular width w, propagating at a main (i.e. central) output angle αout for normal incidence of the incident light of wavelength λ. The refractive index offset n0 = ΦT·λ/(2πt) can be chosen arbitrarily, since the ΦT phase constant can also be chosen arbitrarily. However, since we are considering real materials, n0 can be used to offset the n(x) values, so that they fall within a range of refractive index values nmin to nmax that the optical structure can have or can be modulated to have.
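Converting a desired exit phase into a refractive index profile can be sketched as follows. For normal incidence, and ignoring ray bending inside the layer, the collected phase is (2π/λ)·n(x)·t, so n(x) follows from the target phase up to the free offset n0, chosen here so that n(x) starts at nmin. All material and geometry values are illustrative assumptions:

```python
import numpy as np

# From phase to refractive index: sketch of n(x) for a layer of real
# thickness t at normal incidence, with collected phase (2*pi/lam)*n(x)*t.
# Material range and geometry are illustrative assumptions.
lam = 550e-9
t = 10e-6                                   # assumed layer thickness
k = 2 * np.pi / lam
x = np.linspace(-5e-6, 5e-6, 1001)

# Desired exit phase: spherical wavefront with centre S = (x_s, z_s).
x_s, z_s = 0.0, 40e-6
phi_out = k * np.hypot(x - x_s, z_s)

# Raw index profile and the offset n0 placing n(x) at the bottom of the
# achievable range [n_min, n_max] of the material.
n_min, n_max = 1.50, 1.70
n_raw = phi_out / (k * t)                   # index needed for the full phase
n0 = n_min - n_raw.min()
n = n_raw + n0
```

With this gentle wavefront the swing of n(x) stays below nmax − nmin; for steeper wavefronts or thinner layers it would exceed the limit, which is the situation addressed by splitting the pixel into portions.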
The value Δnmax = nmax − nmin constitutes the maximum modulation depth limit of the refractive index for the optical structure, which consequently affects the maximum phase modulation depth which can be induced in the phase of the wavefront of the light interacting with the optical structure. If necessary, i.e. if the function n(x), even (optionally) with the application of an appropriate value n0, exceeds the maximum or minimum limits of the refractive index, then the given pixel can be split into two or more portions within which the refractive index values stay within the minimum to maximum limits. Such a situation is illustrated in FIGS. 9A & 9B. The graph in FIG. 9A(b) shows the refractive index of the optical structure n(x) as a function of coordinate x along the pixel width d, whilst the range of the calculated refractive index modulation function n(x) exceeds the maximum modulation limit Δnmax required to produce the spherical wave of angular width w and main propagation direction 10 at the output of the pixel.

Therefore, the pixel is split into two portions, as shown in FIG. 9B(a) & (b), the first portion being of a width d1 and the second portion being of a width d2 − d1, such that the refractive index function n1(x) in the first portion ranges across the entire maximum modulation depth limit Δnmax, and the refractive index function n2(x) in the second portion stays within the minimum/maximum limits for the refractive index. The refractive index function in each portion of the pixel has its own refractive index offset value n10 and n20, respectively.

Splitting the pixel into portions like this results in splitting the encoded wavefront into its own respective portions, each wavefront portion having its own phase offset, which can be determined from the formulae for the phase function based on the modulation function of the optical structure, i.e. the refractive index, within each pixel portion.
The wavefront portions at the output of the pixel will spatially correspond to the pixel portions. Collectively the wavefront portions will represent the wavefront encoded in the optical structure of the whole pixel, i.e. the overall optical construction composed of the optical structures of the respective pixel portions.

Going further in relation to the pixel being split into two portions: the discontinuities in the modulation of the optical structure (i.e. at the boundaries between the pixel portions) will cause corresponding discontinuities or disruptions in the wavefront exiting the whole pixel, which typically manifest themselves as a scatter or diffraction pattern associated with that pixel. In other words, other propagation directions of various intensities will occur in the wavefront at the exit of the pixel. The wavefront disruptions, although present in the wavefront, are not a part of the original or intended wavefront design, and may affect the quality of the visual perception of the light imitating the light scattered by the three-dimensional object (e.g. by adding optical noise into the observed image). Such wavefront disruption or degradation may occur especially when the widths of the pixel portions are not substantially larger than the wavelength of the interacting light. In order to suppress such unwanted wavefront disruptions, it may be desirable that the wavefronts generated by the neighbouring portions of the pixel remain synchronized, i.e. that the phase offset between neighbouring wavefront portions at the exit of the pixel, if any, is minimized. This may typically be achieved by defining a proper offset of the modulation function of the optical structure in the respective portions of the pixel (which in the case of the example in FIGS. 9B(a) & (b) means the refractive index modulation function of the pixel portions).
The offset in the modulation function of the optical structure of the neighbouring pixel portions should induce a phase offset in the respective wavefront portions (typically for the light of the design wavelength) at the respective pixel portion boundaries of the order of multiples of 2π (which in the case of the example situation in FIGS. 9B(a) & (b) means that the phase difference between the neighbouring wavefront portions |φ2(d1) − φ1(d1)| at point d1 should be equal to 2π or a multiple of 2π). Otherwise the desynchronized light waves may produce additional unwanted (and possibly excessive) scatter and/or cause attenuation of the wave amplitude propagating in the desired directions.

It may be noted that it may not be possible for every type of optical structure and/or configuration of the incident direction and design and/or observation wavelength that the phase offset at the boundary of the adjacent wavefront components can be minimized, i.e. set to multiples of 2π at the output of the pixel. It may thus be at the designer's discretion to decide or determine whether and to what degree the selected three-dimensional object, the configuration of incident light and the type of optical structure in the pixels, and even the proportions of the pixels and/or the layer thickness, may be properly matched so as to create an imitation of the scattered light of the three-dimensional object in such a way that it can bear sufficiently close resemblance to the light actually or virtually scattered by the three-dimensional object according to this invention. Even a chosen method or approximation for calculating the modulation of the optical structure to generate a desired wavefront may require a fine selection of optical properties of the optical structure (e.g. maximum modulation limit) and/or geometrical properties of the layer or pixel (e.g. thickness, dimensions) to be compatible with the design approach.
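The splitting of a pixel into synchronized portions can be sketched by wrapping the index profile in steps of λ/t: each step of λ/t in refractive index changes the exit phase by exactly 2π, so the wavefront portions on either side of each boundary remain synchronized. The geometry below is an illustrative assumption chosen so that the raw profile exceeds the modulation limit Δnmax:

```python
import numpy as np

# Sketch of splitting a pixel into portions when the required index
# profile exceeds dn_max = n_max - n_min.  Wrapping by steps of lam/t
# induces phase jumps of exactly 2*pi at the portion boundaries.
lam = 550e-9
t = 5e-6                                    # assumed layer thickness
k = 2 * np.pi / lam
x = np.linspace(-5e-6, 5e-6, 2001)

x_s, z_s = 0.0, 8e-6                        # steep wavefront: close focal point
phi_out = k * np.hypot(x - x_s, z_s)        # desired exit phase
n_min, n_max = 1.50, 1.70

n_raw = phi_out / (k * t)                   # unwrapped index profile
step = lam / t                              # index jump giving a 2*pi phase jump
m = np.floor((n_raw - n_raw.min()) / step)  # portion index per location
n = n_raw - m * step                        # wrapped profile
n = n - n.min() + n_min                     # offset into the achievable range
```

Every discontinuity in n corresponds to a pixel-portion boundary, and each removes an integer multiple of 2π from the exit phase, leaving the wavefront portions synchronized.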
For example, a thin transparent approximation (used in the description above) is usually an adequate tool for the design of optical elements or structures which are substantially thinner (i.e. at least about 5 or 10 times thinner) than their lateral dimensions. The limitations of various optical design approaches and their effect(s) on the optical function of microstructure-based optically variable image devices according to embodiments of the invention will be well known to a person skilled in the design of such optical microstructures.

It should be understood that the above description referring to FIGS. 9A & 9B principally describes in predominantly mathematical or theoretical terms the process of encoding the wavefront imitating the light scattered by a given sub-area of the pre-selected surface of the three-dimensional object into an optical structure of a given pixel, but by way of example only. Various other approaches for determining the modulation properties of the optical structures of the pixels of various other types (of which some are mentioned in relation to the description of FIG. 5) may be employed instead, as will be readily understood by a person skilled in the art, and this applies to various configurations of the incident light and forms and/or shapes of the encoded wavefronts created by the pixels.

As mentioned earlier, in practical terms, an optical structure with variable refractive index may typically be originated using for example a direct writing laser or electron beam system which enables locally variable dosing of energy in a photopolymer. Practical examples of suitable such systems, and methods and equipment for practising them, are well-known in the art and to the skilled person. Planar structures with a surface micro-relief may typically be formed and recorded using for example a direct-writing laser and/or electron and/or ion beam technique, and optionally subsequent replication techniques (e.g.
electroforming, embossing, UV casting, etc.). Again, practical examples of suitable such techniques, and equipment for practising them, are well-known in the art and to the skilled person.

It should also be noted that the overall wavefront generated collectively by the group or array or grid arrangement of pixels into which the wavefront has been encoded according to embodiments of this invention may provide the most accurate imitation of light scattered by the (real or virtual) three-dimensional object under the conditions of the incident light under which it was encoded into the optical structures of the pixels. Conditions of the incident light which deviate from the conditions used during the encoding process (namely, wavelength and angle of incidence) may in some cases produce a less accurate degree of imitation. The scope of the present invention in terms of its presented example embodiments as described herein may enable the designer to choose and/or tune many different parameters of the optical structures of the pixels and of the encoding process in order to achieve a desirable or even broad spectrum of various imitation outcomes, requiring either more specific illumination conditions (e.g. monochromatic, directional light, etc) or allowing a wider range of illumination conditions (e.g. broadband light source, multidirectional light source, ambient illumination conditions, etc.).

How to additionally modify, modulate or shape the wavefront imitating the light scattered by the three-dimensional object

Referring now to FIGS. 10 et seq, we now turn to how the wavefront imitating the light scattered by the three-dimensional object may in some further embodiments of the invention be additionally modified, modulated or shaped, so that such an additional modification or modulation is also encoded into the optical structures of the individual pixels or group(s) of pixels, and so that it can be decoded or visualized under particular illumination and observation conditions.

FIGS. 9A & 9B described a situation where the wavefront generated by a pixel 3 in the layer 2, imitating the light scattered by the three-dimensional object which has entered the object 7 at the sub-area 5 corresponding to the said pixel 3, has a spherical form, with angular width w and its centre located in the direction of the wavefront propagation. As the spherical light beam propagates away from the pixel's exit, it converges until it reaches the centre of the spherical wavefront, i.e. the focal point thereof, and then it propagates as a diverging spherical beam. However, an equivalent spherical beam with the same main propagation direction and the same angular width w can be generated if the centre of the wavefront of the spherical beam is placed on the opposite side of the pixel's exit side, i.e. the side opposite to the beam propagation direction. Such a centre would be only a virtual centre, and the beam exiting the pixel would not converge to it but would propagate therefrom as a diverging beam only. FIGS. 10(a) & (b) illustrate both cases, where a spherical beam of the same angular width w leaving a pixel 3 has a real and a virtual centre of its spherical wavefront, respectively. Although the behaviour (i.e. the propagation direction and the divergence) of both beams may appear substantially the same to an observer, their wavefronts are encoded differently into the optical structure of the pixel 3.
This may be used as a way of encoding additional information into the layer 2 of the optically variable image device 1, when one or more particular individual pixels or group(s) of pixels have optical structures with encoded spherical wavefronts whose centres are either real or virtual, i.e. are located at the exit side or the opposite side of the pixel 3. Using an appropriate method of detection of the focal points, such as for example a projection screen or a photosensitive device (e.g. a CCD detector) placed at the appropriate (focal) distance from the layer 2, the real centres of the spherical wavefronts may be visualized, in particular when illuminating the layer 2 by a light source having the same or similar properties as the light used as the incident light of a specific direction and wavelength during the encoding process.

The spherical wavefronts are the natural wavefronts which create focal points according to the description above. However, it should be understood that other wavefront geometric forms (e.g. Gaussian, parabolic, etc.) or even aberrated spherical wavefronts (i.e. predominantly spherical wavefronts but with various aberrations, e.g. due to diffraction effects or having an imperfect form due to approximations used in calculating the properties of the optical structure) may also produce focal points according to the description above. If desired or appropriate, in some more specific embodiments, this feature may be used only at the areas of the layer 2 where the focusing wavefronts generated by the optical structures in the pixels are disturbed the least, as discussed (for example) in the above section headed "How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device".
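The distinction between a real and a virtual centre can be sketched via the local propagation direction at the pixel exit, sin α(x) = (dφ/dx)/k: for a real centre the local rays intersect at a common focal point in front of the pixel, and for a virtual centre they trace back to a point behind it. All values below are illustrative assumptions:

```python
import numpy as np

# Real vs virtual centre of a spherical wavefront, sketched from the
# phase gradient at the pixel exit plane.  Geometry is illustrative.
lam = 550e-9
k = 2 * np.pi / lam
x = np.linspace(-4e-6, 4e-6, 4001)
z_f = 30e-6                                  # distance of the wavefront centre

phi_real = -k * np.hypot(x, z_f)             # converging: phase decreases toward S
phi_virt = +k * np.hypot(x, z_f)             # diverging: virtual centre behind pixel

def local_sin_alpha(phi):
    # local propagation direction sin(alpha(x)) = (d phi / d x) / k
    return np.gradient(phi, x) / k

sa_real = local_sin_alpha(phi_real)
sa_virt = local_sin_alpha(phi_virt)
```

Tracing each local ray from (x, 0) along its angle, the converging set crosses the axis at about +z_f (a real, detectable focus), while the diverging set extrapolates back to about −z_f (a virtual centre producing no real focal spot).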
Another possible modification of the wavefront generated by an optical structure in a pixel 3 relates to the spatial limitation of the wavefront, in particular a continuous wavefront. In general, a cross-section of the wavefront propagating away from a pixel as a diverging beam, e.g. a spherical beam, takes on the shape of the actual pixel (i.e. as defined by the pixel's boundary).

If the basic shape of the pixel is, for example, square, then the cross-section of the wavefront will also be approximately square. This is illustrated in FIGS. 11A(a) & (b). The central pixel 3 of a section of an arrangement or group/array/grid of square pixels (as viewed from above, in plan) contains an optical structure (in the optical material with modulated refractive index n, the modulation being indicated by the varying density of the hatching lines) which generates a spherical wavefront of angular width w and main propagation angle αout having components αout,x and αout,y. The angular distribution of the propagation directions within the spherical wavefront is shown schematically as a black square in FIG. 11A(b). The axes on the graph in that FIG. 11A(b) represent the propagation angles in the x and y directions.

On the other hand, if a portion of the pixel 3 is blocked or masked or substituted by a different optical structure, then the originally square cross-section of the spherical wavefront can be truncated so as to take on a different shape, as illustrated in FIGS. 11B(a) & (b). The truncation areas 13 in the central pixel 3 are shown as the black area delimiting a star-shaped optical structure in the pixel 3. The wavefront cross-section generated by the truncated optical structure is converted from a square shape into the shape of a star. The resulting distribution of propagation directions within the truncated wavefront is shown schematically in FIG. 11B(b).
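The projection of a truncation shape can be sketched geometrically: each unmasked aperture point maps through the focal point onto a screen, giving an inverted, scaled copy of the shape. The cross-shaped mask (standing in for the star shape of the figures) and the distances below are illustrative assumptions:

```python
import numpy as np

# Geometric sketch of wavefront truncation: a converging beam from a
# partially masked square pixel passes through its focus and lands on a
# screen as an inverted, magnified copy of the truncation shape.
n = 201
half = 5e-6                                   # half-width of a square pixel
xs = np.linspace(-half, half, n)
X, Y = np.meshgrid(xs, xs)

# Cross-shaped optical structure: only the cross-shaped region transmits.
mask = (np.abs(X) < 1.5e-6) | (np.abs(Y) < 1.5e-6)

z_f, z_sc = 30e-6, 130e-6                     # focus and screen distances
mag = -(z_sc - z_f) / z_f                     # geometric magnification (inverted)
Xp, Yp = mag * X[mask], mag * Y[mask]         # projected spot positions
```

The set of points (Xp, Yp) reproduces the cross at a magnification of |mag|, which is how the truncation shape becomes visible on a screen or detector.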
Truncation of the wavefront generated by the optical structure within a pixel may be used as an authentication feature in some further embodiments of the invention. Truncation shapes which are the same as or different from each other may be applied to individual ones of one or more, or of a plurality of, pixels, or even to different groups/arrays/grids of pixels, in a given optically variable image device 1. Upon illumination of such pixels by a light source, in particular a predominantly directional and/or monochromatic light source, the beam or beams generated by truncated pixels can be projected, for example onto a screen or a detector, and the particular truncation shape can be visualized. The truncation may even be viewed by the naked eye of a human observer (i.e. projected onto a human retina), in particular if the same truncation area is applied to pixels covering a larger area of the optically variable image device and the main directions of the non-planar wavefront components are similar.

It should also be understood that, in certain embodiments, a specifically shaped cross-section of the wavefront generated by a pixel, which can be projected onto a screen in the same or similar manner as described above, may be achieved by a specific shape of the pixel itself. Since, for example, a square pixel generating a wavefront (e.g. a spherical wavefront) may be projected as a square, a pixel of a different shape (e.g. a star shape) may generate a spherical wavefront of that particular shape. Even a group of adjacent (e.g. tessellated) pixels of the same shape and orientation as each other, when illuminated by a directional and/or monochromatic light source with a narrow angular width (e.g. typically of about 5 or 10° or less), may create overlapping projections of the common shape of the individual pixels. The size of the projected shape may be proportional to the angular width (i.e.
spread) of the wavefront generated by the pixels and the observation distance. If the group of pixels has a much smaller size than the size of the projected pixel shape, then the blur of the projected shape may be minimal, provided the main propagation directions of the individual wavefronts generated by the group of pixels change slowly therealong, typically within the range of their angular width (i.e. spread), and the angular widths (i.e. spreads) of all of (or nearly/substantially all of) or a majority of the wavefronts generated by the group of pixels are substantially the same. Under such conditions the projection of the pixel shape may be visible also to the naked human eye (i.e. it can be projected onto the eye's retina), partially or fully depending on the angular width of the group of pixels from the observer's observation point. Such a visual projection of the pixel shape or pixel area, i.e. a "visual signature" of the pixel, may not be possible if the wavefront generated by the pixel is substantially planar, or it has an irregular shape but is on average substantially planar.

A different type of modification of light generated by the optical structure(s) of one or more pixels of a device according to another embodiment of the present invention is based on a direct modification of the phase modulation function of the light wave imitating the light scattered by the three-dimensional object 7. The modification of the phase function may be applied individually to one or more light waves exiting the corresponding pixel 3. A phase modification function φmod may be added to the phase modulation function φout which describes the phase of the light wave exiting a particular pixel 3 prior to its modification.
The modified phase modulation function φout + φmod is then encoded into the optical structure of the relevant pixel in a corresponding manner to that described hereinabove in the preceding section headed "How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device".

In general, the phase modification function creates a disturbance in the wavefront of the light wave designed to imitate the light scattered by the three-dimensional object 7 (i.e. the "scatter-imitating wavefront") associated with a particular pixel 3. Such a disturbance causes the scatter-imitating wavefront to propagate also in one or more directions other than the originally designed propagation direction(s), i.e. the propagation direction(s) designed according to the principles and methodologies described hereinabove prior to application of the phase modification function. The phase modification function should be designed in such a way that it does not change the main direction of the scatter-imitating wavefront by more than a few degrees (e.g. it may change the main direction of the scatter-imitating wavefront typically by less than about 5° or 10°), nor does it decrease its divergence angle below about 5°.

The main purpose of applying a phase modification function to the phase modulation function of the scatter-imitating wavefront is to create a specific pattern (i.e. auxiliary image or wavefront component) in the spectrum of the propagation angles, which is distinguishable from the original angular spectrum of the light exiting the relevant pixel prior to the modification.

This typically requires the phase modification function to exhibit spatial frequencies which can produce scatter with a higher angular width than the angular width of the scatter-imitating light beam. FIG.
12(a) shows an example of a phase modulation function φout of a spherical wavefront of a scatter-imitating light beam, and FIG. 12(b) shows a phase modification function φmod in the form of a combination of concentric elliptical patterns. (Both of these illustrated examples are functions of the pixel coordinates x and y.) When both functions are combined, and the resulting wavefront phase function is encoded into the optical structure of a pixel according to embodiments of this invention, then upon illumination of the optical structure, for example by an incident light beam of the same or similar properties as the incident light used for the encoding (i.e. under the particular design conditions), the optical structure of the pixel produces a pattern in the angular spectrum of the propagating light beam which is illustrated in FIG. 12(c). The central square 15 in the pattern represents the angular spectrum of the spherical wave of angular width w (in both the x and y angular directions), i.e. representing a light beam which imitates the scatter of the three-dimensional object and which is generated by an optical structure of a pixel that corresponds to an area on the pre-selected surface of the three-dimensional object according to the principles of this invention. The central square 15 is surrounded by a mirrored pair of crescent moon-shaped angular patterns 14. The separation of the patterns 14 from the central light beam is driven by the spatial frequency of the elliptical ripples in the phase modification function pattern.

Since the ripples have a periodic sinusoidal character, the central pair of crescent moon patterns 14 is accompanied by a distinct surrounding secondary pattern (i.e. of the second order) of the same shape, but of lower intensity. Patterns of higher than second order may also appear, but with much lower intensity(ies).
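The redistribution of energy from the central beam into such side patterns can be illustrated with a short numerical sketch: a phase-only exit wave exp[i(φout + φmod)] is decomposed into plane waves by a 2-D FFT, and the fraction of energy remaining in a window around the central scatter-imitating beam is compared with and without the ripple modification. The curvature, ripple frequency, ellipticity and window size below are arbitrary illustrative values, not design values from this disclosure.

```python
import numpy as np

n = 256
coords = np.arange(n) / n - 0.5                  # pixel-local coordinates
X, Y = np.meshgrid(coords, coords)

phi_out = 2 * np.pi * 10.0 * (X**2 + Y**2)       # paraxial stand-in for the spherical phase
depth = 2 * np.pi / 10                           # modulation depth ~1/10 of the design wavelength
phi_mod = depth * np.sin(2 * np.pi * 30 * np.sqrt(X**2 + (0.8 * Y)**2))  # elliptical ripples

u_plain = np.exp(1j * phi_out)                   # unmodified exit wave
u_mod = np.exp(1j * (phi_out + phi_mod))         # modified exit wave (unit amplitude)

spec_plain = np.abs(np.fft.fftshift(np.fft.fft2(u_plain)))**2
spec_mod = np.abs(np.fft.fftshift(np.fft.fft2(u_mod)))**2

c = n // 2
win = slice(c - 15, c + 16)                      # window covering the central beam
frac_plain = spec_plain[win, win].sum() / spec_plain.sum()
frac_mod = spec_mod[win, win].sum() / spec_mod.sum()
```

Because the structure is phase-only, the total energy is unchanged (Parseval), but part of it is redistributed from the central beam into side patterns whose offset is set by the ripple frequency, with weaker higher-order replicas, in the manner of the patterns 14 around the central square 15.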
In practising some such embodiments as in the preceding paragraph, it may be important not to disturb significantly the visual perception of the imitation of the three-dimensional object. If that is the case (i.e. typically when the patterns 14 spread widely in the angular spectrum), the intensity of the portion of the light 15 produced by the optical structure of the relevant pixel may need to be considerably higher than the intensity of the pattern portions 14 of the light beam which has been generated by the optical structure of the pixel due to the modification of the phase modulation function of the scattered light-imitating wavefront. This may be controlled by the phase modulation depth of the phase modification function, i.e. the difference between the maximum and minimum phase shift that the phase modification function may induce in the interacting wave of the design wavelength. The lower the modulation depth, the less intense the pattern around the central beam will become. There may be no explicit limit on how small the modulation depth should or may be, but it may be as low as 1/20 of the design wavelength in order to produce a pattern bright enough to ensure its visibility is sufficient, especially when it is projected onto a projection screen, typically using a laser beam. This is ultimately the designer's choice. The intensity ratio between pattern portions 15 and 14 of the light produced by the optical structure may be tuned either experimentally or by simulations. The wave exiting the relevant pixel can be expressed as uout,mod(x,y) = Aout(x,y)·exp[iφout(x,y) + iφmod(x,y)] at the exit plane of the pixel, and it can be decomposed into a spectrum of plane waves propagating at various propagation directions (for example, by a Fourier transform), which can be easily visualized or simulated as a far field projection. An example of such decomposition is illustrated in FIG.
12(d), which shows the intensity distribution of plane waves propagating at various propagation directions represented by spatial frequencies 1/x and 1/y. The central peak 15 represents the scatter-imitating light beam of angular width 5°, and the side patterns 14 represent the light formed around the central peak due to the applied phase modification to the spherical wavefront. The phase modulation depth of the phase modification function φmod is of the order of 1/10 of the design wavelength.

The design of the phase modification function and the pattern it produces around the scatter-imitating light beam is the designer's choice. Furthermore, and as discussed above, such a pattern may be used in some embodiments of the invention to create a specific image "signature" associated with a particular pixel. Such signatures can be encoded with the main scatter-imitating wavefront into the optical structure of the relevant pixel(s), or even into a selection of plural pixels, as appropriate. The signatures may even vary from pixel to pixel or they may be the same for all the pixels to which they are applied. The signatures and their distribution among the pixels represent a new and higher level of authentication and security features for implementation in an optically variable image device than has been possible in the art hitherto.

Using a "superimposed" image as an additional modification of the scattered light-imitating wavefront

The above-described "signature" of the light generated by a pixel imitating the light scattered by the three-dimensional object associated with a particular pixel is formed around the main propagation direction, i.e. the signatures associated with different pixels will be formed around different main directions into which the optical structures of different pixels direct the light.
Yet another type of modification of light generated by the optical structure(s) of one or more pixels of a device according to another embodiment of the present invention provides at least one additional or auxiliary image which is not dependent on the main propagation directions of a particular scatter-imitating output light pattern generated by the optical structures of the various pixels and is superimposed on such an output light pattern.

As described below, such an additional or auxiliary image (i.e. a superimposed image) may be encoded into the optical structure of a given pixel or a group/array/grid of adjacent pixels (or one or more portions thereof) together with the scatter-imitating beam(s) generated by such a pixel or pixel group/array/grid (or portion thereof). The light waves of the scatter-imitating beams (with or without modification of their phase function(s) or cross-section(s) as described above) at the exit of the associated pixels in the group can be described in the form of a sum: u(x,y,0) = Σ(k,l) u(k,l)(x(k,l), y(k,l), 0), where (k,l) are indices of pixels in the group, x(k,l), y(k,l) are coordinates of points of the respective pixel's area in the plane z=0, and the u(k,l) are waves associated with pixels (k,l) defined in z=0. The said superimposed image can also be represented by an optical wave uh(xh,yh,0) at the exit plane of the pixels, where the coordinates xh, yh include all points of a compact area overlapping fully or partially an area covered by the group of pixels described by coordinates (i.e. points) x(k,l), y(k,l) for all (k,l) indices of pixels in the group. This is illustrated in FIGS. 13(a) & (b). FIG. 13(a) shows a group of twenty square pixels 3 with (k,l) indices equal to (1,1), (2,1) ... (5,4). The area of each pixel (k,l) is delimited by coordinates xk, xk+1, yl, yl+1. FIG.
13(b) shows the same group of pixels and a rectangular area in the plane z=0, delimited by coordinates xh1, xh2, yh1, yh2, in which the light wave representing a superimposed image is defined. It should be understood that FIGS. 13(a) & (b) show just one example of the overlap of the area of a group of adjacent pixels and the area in which the light wave of the superimposed image is defined. The pixels' shape may in some cases be more complex and variable (as described hereinabove), and so may also be the area associated with the superimposed image. In any case, the boundary of the area in which the light wave of the superimposed image is defined may be independent of the boundary of the group of pixels which it overlays.

The light wave representing the superimposed image uh in the plane z=0 is then divided into portions along the boundaries of the pixels it overlays. Each portion of the wave uh,(k,l) corresponding to a specific pixel (k,l) in the group is then added to the scatter-imitating light wave associated with the same pixel u(k,l). The resulting wave uout/h,(k,l) = u(k,l) + uh,(k,l) associated with the pixel (k,l) is then encoded into the optical structure in accordance with the principles of this invention and as described hereinabove in the above section headed "How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device". Using, for example, a thin transparent approximation approach, this would mean one has to calculate a transmission and/or reflection modulation function of the thin transparent associated with pixel (k,l) expressed as Th,(k,l) = uout/h,(k,l)/uin, where uin is the incident light wave according to the principles of the invention as described hereinabove, assuming the incident light wave is the same for all pixels (k,l) as well as for the superimposed image (i.e.
having the same propagation direction, amplitude, wavefront and design wavelength).

In general, Th,(k,l) is a complex function and it can be expressed as Th,(k,l) = Ah,(k,l)·exp(iφh,(k,l)), where Ah,(k,l) and φh,(k,l) are its amplitude and phase function, respectively. In many practical embodiments of the invention the amplitude may usually be ignored (i.e. assumed to be equal to 1), and only the phase function considered for the encoding into the optical structure of the relevant pixel. This is due to the fact that the optical structures of the pixels usually modulate primarily (by design) only the phase of the interacting light wave. Once the phase function of the thin transparent is determined for each pixel (k,l) then it is encoded into the optical structure of the relevant pixel according to the principles of this invention and as described hereinabove in the above section headed "How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device".

Alternatively to the above-described approach, the transmission and/or reflection modulation function of the thin transparent associated with pixel (k,l) may instead be defined as a separate modulation function for modulating the incident light wave into the scatter-imitating wave, τ(k,l) = u(k,l)/uin,(k,l), and for modulating the incident light wave into the light wave of the superimposed image, Th,(k,l) = uh,(k,l)/uin,h. In this case, the incident light wave uin,(k,l) associated with pixel (k,l) as described hereinabove is a planar wave and its direction may be different for each pixel. The incident light wave associated with the superimposed image is defined across the entire area delimited by coordinates xh1, xh2, yh1, yh2, and only its respective portion is applied to calculate Th,(k,l) associated with a given pixel.
Moreover, the incident wave uin,h may be different from any of the incident waves uin,(k,l) associated with pixels (k,l), i.e. it does not have to propagate in the same propagation direction, nor does it have to be planar. Both modulation functions of the thin transparent, which can be expressed in complex form as τ(k,l) = Aτ,(k,l)·exp(iφ(k,l)) and Th,(k,l) = Ah,(k,l)·exp(iφh,(k,l)), where A and φ are the respective amplitude and phase modulation functions of the respective thin transparent, are then combined (i.e. added) and the resulting modulation functions τ/h,(k,l) associated with pixels (k,l) are encoded into the optical structure of the respective pixels in accordance with the principles of this invention and as described hereinabove (i.e. in the preceding paragraph and in the above section headed "How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device").

FIG. 14(a) shows an example of such a superimposed image, whilst FIG. 14(b) shows its representation as an on-axis phase Fourier hologram designed to create an image in the far field, where φh is the phase function of the thin transmission function of the hologram. The reconstruction of the image 16 using a plane light wave of normal incidence is illustrated in the graph of FIG. 14(c) as a representation of its propagation directions (the signal propagating in the normal direction, i.e. at zero order, being ignored). The intensity of the reconstructed superimposed image, i.e. the portion of the incident light which forms the reconstructed superimposed image, may be controlled by the modulation depth of the phase function φh, typically between a theoretical maximum limit that the type of hologram structure can provide and zero, and/or by scaling the amplitude Ah of the thin transparent Th of the superimposed image. The reconstructed image may for instance be projected (i.e.
visualized) on a screen in the far field.

FIGS. 14(a)-(c) furthermore illustrate an example of a superimposed image which can be a type of covert image in the form of a Fourier hologram which may be visualized (e.g. projected onto a screen) for example by monochromatic and directional (i.e. collimated) light.

FIGS. 15(a) to (f) show an example of the implementation of a superimposed image as a modification of the scatter-imitating light. FIG. 15(a) depicts a portion of the front right side of the hood of the car shown in FIG. 4. A zoomed-in detail of the receiving pre-selected surface 6 shows the boundaries of sub-areas 5x into which the surface 6 is divided. FIG. 15(a) further shows the projection of the sub-areas 5x onto a flat surface s of the layer 2 of the optically variable image device 1 forming an arrangement of pixels 3x. A group of pixels 3a-3d is selected for modification of the imitation of the light scattered by the three-dimensional object 7 after the light of normal incidence is received by the pre-selected surface 6 at sub-areas 5a-5d and for encoding the additional (superimposed) image. The phase modulation functions φout(j) (where j represents the indices a to d of the pixels in the group) of the scatter-imitating wavefront components generated by the corresponding pixels 3a-3d in the selected group are shown in FIG. 15(b). Each of the scatter-imitating wavefront components of the light at the outputs of the pixels 3a-3d in the group has a spherical form and an angular width of about 5°. The graph in FIG. 15(c) shows the various phase modification functions φmod(j) assigned to the pixels 3a-3d in the group. While the modification function φmod(d) has the same character as the one in FIG.
12(b), generating mirrored crescent moons 14d around the main propagation direction of the corresponding spherical wave 15d, the modification function φmod(c) is a cross-sinusoidal pattern creating higher diffraction orders 14c around the main propagation direction of the spherical wave 15c, and the modification function φmod(b) is a Fourier phase structure (i.e. hologram) generating a triangular pattern 14b around the spherical wave 15b. The patterns 14b, 14c, 14d shown in FIG. 15(e) in the representation of the angular spectrum of propagation directions of the light generated by the group of pixels 3a-3d are "signatures" (or auxiliary wavefront components) of the corresponding beams 15b, 15c, 15d, which propagate in directions corresponding to the main scatter direction of light incident at sub-area locations 5a-5d on the pre-selected surface 6 of the object 7. Note that no signature, i.e. no phase modification function, is applied to the light wave generated by pixel 3a. The modulation depths of the phase modification functions φmod(b), φmod(c) and φmod(d) are about 1/3, 1/5 and 1/8 of 2π, respectively. The phase modulation function of the modified scatter-imitating light beam associated with a given pixel in the group can be expressed as φout(j) + φmod(j). Assuming unidirectional light of normal incidence and unit intensity and no losses of light interacting with the optical structure of the pixels, a thin transparent modulation function associated with a particular pixel in the group can be expressed as τmod(j) = exp(iφout(j) + iφmod(j)). The phase modulation function of the superimposed image φh, which further modifies the scatter-imitating light beam (or rather is added or superimposed as an additional or auxiliary image or image component), is shown in FIG. 15(d). Its definition is identical to the one described in FIG. 14. It extends across the entire area of the group of pixels 3a-3d and its modulation depth is about 2π.
A portion of the thin transparent modulation function of the superimposed image associated with each pixel in the group (under the same conditions as in the preceding paragraph above) can be expressed as τh(j) = exp(iφh(j)). When combined with the thin transparent modulation function of the scatter-imitating light beam with modified phase, it yields the final thin transparent modulation function associated with a pixel in the group, τmod/h(j) = exp(iφout(j) + iφmod(j)) + exp(iφh(j)). Such a modulation function is then encoded into the optical structure of the respective pixels in accordance with the principles of this invention and as described hereinabove in the above section headed "How the recreated light wavefront imitating the light scattered by the three-dimensional object is encoded into the optical structure of each pixel of the optically variable image device".
When the encoded wavefront (comprising the scatter-imitating wavefront component, auxiliary "signature" wavefront components and the additional auxiliary "superimposed" image component) is reconstructed from the group of pixels 3a-3d using normal incident light, it propagates at various angles as shown in FIG. 15(e). The light wave exiting the pixels, umod/h(x,y,0) = Σj [ exp(iφout(j)(xj,yj,0) + iφmod(j)(xj,yj,0)) + exp(iφh(j)(xj,yj,0)) ], where (xj, yj) represents the coordinates within a respective pixel area, may be decomposed, for example using a Fourier transform, into a spectrum of plane light waves propagating at various propagation directions, which can for example be visualized or simulated as a far field projection. Such decomposition is illustrated in FIG. 15(f), which shows the intensity distribution of plane light waves propagating at various propagation directions represented by spatial frequencies 1/x and 1/y (ignoring zero order intensity).
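A numerical sketch of this reconstruction step may be illustrative: each pixel in a small group is given a phase-only exit wave standing in for exp(i(φout(j) + φmod(j))), the superimposed-image term exp(iφh) is defined over the whole group area and added per pixel, and the resulting exit wave is decomposed into plane waves with a 2-D FFT as a far-field approximation. The 2×2 group, the per-pixel tilts and the sinusoidal φh below are hypothetical illustrative choices, not the structures of pixels 3a-3d.

```python
import numpy as np

npix, pix = 2, 64                    # 2x2 group of pixels, `pix` samples per pixel side
N = npix * pix
u = np.zeros((N, N), dtype=complex)

x = np.arange(pix) / pix - 0.5       # pixel-local coordinates
Xp, Yp = np.meshgrid(x, x)
tilts = {(0, 0): (4, 0), (1, 0): (0, 4), (0, 1): (-4, 0), (1, 1): (0, -4)}
for (k, l), (tx, ty) in tilts.items():
    # per-pixel tilt standing in for the scatter-imitating phase, giving
    # each beam its own main propagation direction
    phi = 2 * np.pi * (tx * Xp + ty * Yp)
    u[l*pix:(l+1)*pix, k*pix:(k+1)*pix] = np.exp(1j * phi)

# superimposed-image term exp(i*phi_h), defined across the whole group area
# and added portion-by-portion to every pixel's exit wave
g = np.arange(N) / N - 0.5
Xg, Yg = np.meshgrid(g, g)
phi_h = 2 * np.pi * np.sin(2 * np.pi * 2 * (Xg + Yg))
u = u + np.exp(1j * phi_h)

spectrum = np.abs(np.fft.fftshift(np.fft.fft2(u)))**2   # plane-wave decomposition
```

Displaying `spectrum` shows the main beams at their tilt-determined directions together with the diffraction orders contributed by the φh term, analogous in spirit to the intensity distribution of FIG. 15(f).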
In yet another embodiment of the invention, it may be possible for an auxiliary or additional superimposed image encoded in a given pixel to take the form of an encoded light pattern which is composed of light that is a representation of at least part of (and in some such cases only a part of, less than the whole of) either (i) the scattered light from an adjacent or neighbouring sub-area to the respective sub-area that corresponds by mapping to the given pixel in question, or alternatively (ii) the portion of the output light pattern that is generated by an adjacent or neighbouring pixel to the given pixel in question. An example of this is shown in FIG. 16. As shown here, the auxiliary or additional superimposed image can take the form of a light pattern generated at least in part by a neighbouring pixel (or perhaps even by part of a neighbouring pixel). In other words, an optical structure in a given pixel which generates a portion of the image of the light scattered from a corresponding sub-area on the pre-selected surface of the three-dimensional object or scene may have encoded therein also a portion of the image of the light scattered from a sub-area neighbouring (i.e. adjacent to) the sub-area corresponding by mapping to the given pixel in question. To illustrate this case by way of example, the configuration of the sub-areas 5x and corresponding pixels 3x and the respective scatter-imitating light beams with spherical wavefronts 15x they generate that are shown in FIG. 15 (in particular FIG. 15(a) and FIG. 15(e)) can be used. FIG. 16 shows a scatter-imitating light beam 15a as a representation of the propagation directions (i.e. output angles) generated by pixel 3a, which also generates additional or auxiliary (i.e. superimposed) images 16b, 16c and 16d, which represent portions of images of light scattered from sub-areas 5b, 5c, 5d adjacent the sub-area 5a that corresponds by mapping to the pixel 3a. 
In this particular case, the images 16b, 16c and 16d are identical to the respective scatter-imitating light beams 15b, 15c and 15d, except that they are superimposed on the scatter-imitation beam 15a with a lower weight (namely the intensity weight), which is indicated by the lighter fill pattern in FIG. 16, and which is uniform across the entire angular range of each image 16x. In still other embodiments, such a weight may vary, and so may the number or wavefront forms (i.e. shapes, types, etc) of the superimposed image(s) representing portion(s) of image(s) of light scattered from neighbouring sub-area(s) (i.e. sub-area(s) adjacent the sub-area mapped to the given pixel). In some such practical embodiments, the pixel's optical structure may even comprise encoded superimposed images representing scattered light from substantially all sub-areas on the pre-selected surface that are adjacent to (i.e. have a boundary with) the sub-area mapped to a given pixel, and such encoding may be applied to a plurality of, or optionally even to all of, the pixels in the layer 2. Furthermore, such encoding may constitute a tool in the designer's hands which may be used to refine the appearance of the overall output light pattern generated by the device (e.g. to produce smoother transitions in viewing light generated by neighbouring pixels in the layer 2 by application of fine-tuned intensity weights, wavefront forms, etc), and which may also contribute to hiding or suppressing noise generated in the output light pattern due to interaction of incident light with the pixel boundaries.
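The weighted superposition described above can be sketched numerically: a pixel's own beam and three neighbour-derived beams are added at the pixel's exit plane, with the neighbour beams scaled by √w in amplitude so that they carry a uniform intensity weight w relative to the own beam. The tilt values and the weight w = 0.3 are arbitrary illustrative choices, not values from the disclosure.

```python
import numpy as np

n = 64
x = np.arange(n) / n - 0.5
X, Y = np.meshgrid(x, x)

def beam(tx, ty):
    # plane-wave stand-in for a scatter-imitating beam with main direction (tx, ty)
    return np.exp(2j * np.pi * (tx * X + ty * Y))

w = 0.3                                      # uniform intensity weight of neighbour images
u_3a = beam(2, 0)                            # the pixel's own beam (cf. 15a)
for tilt in [(0, 2), (-2, 0), (0, -2)]:      # beams of three neighbouring sub-areas
    u_3a = u_3a + np.sqrt(w) * beam(*tilt)   # amplitude sqrt(w) -> intensity weight w

spec = np.abs(np.fft.fftshift(np.fft.fft2(u_3a)))**2
c = n // 2
own_peak = spec[c, c + 2]                    # own beam at frequency (2, 0)
nbr_peak = spec[c + 2, c]                    # one neighbour image at frequency (0, 2)
```

Because the tilts fall on exact FFT bins, each neighbour peak comes out at w times the intensity of the own peak, reproducing the lower-weight superposition of the images 16b-16d on the beam 15a.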
In some special cases, when the scatter-imitating wavefronts generated by individual pixels in a compact group of pixels are designed as a combination (or a cone or a fan) of planar wavefront sub-components propagating in multiple directions, they may be designed over an area larger than a given pixel, i.e. extending into the area of one or more neighbouring pixels (as described in the example illustrated in FIG. 8B(b) above), and these extensions may be added to or superimposed onto the scatter-imitating wavefront of the respective neighbour as an auxiliary image (or additional or superimposed image or wavefront component). Such extensions may be added with a different intensity weight (optionally variable, e.g. decreasing as they extend further into the area of the pixel into which they extend) compared to the intensity of the original scatter-imitating wavefront of a given pixel.
Such an approach, with proper application of the extensions and their intensity weights when applied to a compact group of pixels or even all pixels in a layer, may help to suppress discontinuities at the pixel boundaries (i.e. making them less pronounced or harder to identify, especially visually) or even to eliminate them altogether, which may significantly reduce the noise generated by scatter and/or diffraction of light due to its interaction with pixel boundaries at which the wavefronts are not continuous.
Other examples of superimposed images may also include other types of images, such as images visible to the naked eye in the form of other optically variable images, which may be two- or three-dimensional (especially holographic) images, and which overlay (or even substitute fully or partially) the main image of the three-dimensional object or scene formed by the non-planar wavefront components according to this invention. Such other optically variable images may even comprise various kinetic and/or colour effects, e.g. variable graphic or holographic patterns that change upon a change occurring in the illumination or observation conditions or angles.
It should be understood that all the parameters used in the various drawing FIGURES referred to in the descriptions of embodiments and features of the invention hereinabove were chosen for illustration and example only. They may not be shown at a proper scale and their values, although within the scope of this invention, may need further refinement according to the designer's intentions if used in real embodiments.
Throughout the description and claims of this specification, the words "comprise" and "contain" and linguistic variations of those words, for example "comprising" and "comprises", mean "including but not limited to", and are not intended to (and do not) exclude other moieties, additives, components, elements, integers or steps.
Throughout the description and claims of this specification, the singular encompasses the plural unless expressly stated otherwise or the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless expressly stated otherwise or the context requires otherwise.
Throughout the description and claims of this specification, features, components, elements, integers, characteristics, properties, compounds, chemical moieties or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith or expressly stated otherwise.
Furthermore, it is expressly envisaged in this disclosure of the present invention that the various aspects, embodiments, examples, features and alternatives, and in particular the individual constructional, configurational or operational features thereof, set out in the preceding paragraphs, in the claims and/or in the following description and accompanying drawings, may be taken independently or in any combination of any number of same. For example, individual features described in connection with one particular embodiment, or described singly or in combination with another feature in any one or more embodiments, are applicable on their own or in combination with one or more other features to all embodiments and may be found and used in combination with any other feature in any given embodiment, unless expressly stated otherwise or such features are incompatible.

Claims (37)

  1. An optically variable image device for forming an observable optically variable image of light scattered from a three-dimensional object or scene upon illumination of the device with incident light, the image to be formed comprising an output light pattern, the device comprising a layer of optical material with an arrangement of a plurality of pixels defined or definable thereon, each pixel having recorded therein in encoded form a respective portion of the image to be formed, wherein: each respective portion of the image to be formed that is recorded in a respective pixel is encoded therein as a representation of a respective portion of the said scattered light from the three-dimensional object or scene, where that respective portion of the said scattered light from the three-dimensional object or scene is scattered from a respective sub-area located on a pre-selected surface of the three-dimensional object or scene, and which respective sub-area corresponds by mapping to the respective pixel in the pixel arrangement in which is encoded its representation; each pixel comprises a respective optical structure with optical properties capable of modulating the phase of light incident thereon and forming a respective phase-modulated non-planar wavefront component of the output light pattern, which respective non-planar wavefront component formed by that respective pixel (i) corresponds to the respective portion of the image to be formed that is recorded in that respective pixel, and (ii) has phase modulation that is characteristic of or a function of the modulating optical properties of the respective optical structure of that respective pixel, where a shape of the said respective non-planar wavefront component is independent of the shape of a wavefront of the said scattered light from the respective sub-area on the pre-selected surface of the three-dimensional object or scene that corresponds by mapping to the respective pixel in the pixel arrangement in 
which is encoded its representation, and where upon illumination of each respective pixel of the layer of the device with incident light, each said pixel forms a respective portion of the observable optically variable image in the form of the said respective non-planar wavefront component of the output light pattern, and where the said respective non-planar wavefront component formed by each respective pixel, upon that respective pixel being illuminated under pre-selected illumination conditions, has at least one respective main propagation direction that is either substantially equal to or at least is dependent on a respective at least one main propagation direction of light scattered from the respective sub-area on the pre-selected surface of the three-dimensional object or scene that corresponds by mapping to the respective pixel in the pixel arrangement in which is encoded its representation, upon the three-dimensional object or scene being illuminated under the said pre-selected illumination conditions, where the said pre-selected illumination conditions comprise, for each respective pixel and each respective sub-area on the pre-selected surface of the three-dimensional object or scene, illumination by a planar light wave of a pre-selected or design wavelength and at a respective pre-selected angle of incidence on the respective pixel and on the three-dimensional object or scene, as the case may be, and in the case of the respective sub-area the said planar light wave originating outside the three-dimensional object or scene and being incident on a receiving side of the respective sub-area as it interacts with the three-dimensional object or scene; whereby upon collective illumination of the arrangement of pixels of the layer of the device with incident light, the plurality of pixels collectively form the observable optically variable image in the form of a plurality of portions thereof in the form of a plurality of said respective non-planar wavefront 
components of the output light pattern, which plurality of said respective non-planar wavefront components of the output light pattern collectively correspond to the said scattered light from the three-dimensional object or scene whose image is formed by the device.
  2. An optically variable image device according to claim 1, wherein the observable optically variable image of the light scattered by the three-dimensional object or scene that is formed by the optically variable image device upon illumination thereof with incident light comprises an achromatic such image.
  3. An optically variable image device according to claim 1 or claim 2, wherein the representational relationship between the respective portions of the output light pattern formed by the device (and comprising the phase-modulated non-planar wavefront components thereof) and the respective portions of the scattered light from the three-dimensional object or scene is such that the output light pattern and the scattered light from the three-dimensional object or scene merely resemble each other or are similar to or are an imitation or approximation of each other, instead of strictly being a substantially exact or near-substantially-exact reproduction or duplicate thereof, provided that those resembling/similar/imitating/approximating respective output light pattern portions (comprising the phase-modulated non-planar wavefront components thereof) and the respective portions of the scattered light from the three-dimensional object or scene have or retain as between them generally similar or approximately or roughly or generally in the vicinity of the same respective main propagation directions, whereby the optically variable image device acts or functions to form an observable optically variable image that has or provides approximately or roughly or generally in the vicinity of the same or a merely similar perception (optionally when perceived by an observer) as the scattered light from the three-dimensional object or scene.
  4. An optically variable image device according to any one of claims 1 to 3, wherein the said encoding by which the respective portions of the image to be formed are recorded in the respective pixels includes not only the encoding of the modulated optical properties of the optical material in the respective pixels that record therein in that encoded form the respective phase-modulated non-planar output light wavefront components, but the said encoding also includes additional encoding as defined in one of the following (a), (b) or (c): (a) auxiliary encoding in the optical material in one or more of, optionally substantially all of, the pixels which modifies or modulates the said encoding (that records in the respective pixels the respective phase-modulated non-planar output light wavefront components) so as to encode therein at least one modification to or modulation of the phase-modulated non-planar output light wavefront components which defines, and manifests itself in the output light pattern as, at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component of the output light pattern; or (b) encoding in the optical material in one or more of, optionally substantially all of, the pixels of at least one additional or auxiliary wavefront component which defines, and manifests itself in the output light pattern as, at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component of the output light pattern; or (c) a combination of (a) and (b); wherein in either case (a), (b) or (c) the additional encoding of the said at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component is manifested in the output light pattern that is formable by the device as a modification or modulation of, or as an additional or auxiliary feature, 
characteristic, image or image component of, or as a subtraction of a portion from, one or more portions of the image to be formed by the device; optionally wherein the additional encoding of the said at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component is so additionally encoded so as to be included in or present within or across or superimposed onto or subtracted from either (i) a respective single one of the phase-modulated non-planar output light wavefront components, or one or more portions thereof, that is formable by the device, or (ii) collectively a plurality of, or optionally substantially all of, the phase-modulated non-planar output light wavefront components, or one or more portions of each of one or more of said phase-modulated non-planar output light wavefront components of the plurality, that are formable by the device; and further optionally wherein the maximum or average intensity of the or each one of the at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component(s) is less than half of, optionally less than a quarter of, further optionally less than a tenth of, the maximum or average (as the respective case may be) intensity of each respective phase-modulated non-planar output light wavefront component forming a respective portion of the image to be formed of the three-dimensional object or scene.
  5. An optically variable image device according to claim 4, wherein the said additional encoding as defined by (a), (b) or (c) is so additionally encoded in a given pixel so as to comprise encoding of either: (i) a representation of at least part of (especially a part less than the whole of) the scattered light from a sub-area located on the pre-selected surface of the three-dimensional object or scene that is an adjacent or neighbouring sub-area to the respective sub-area thereon that corresponds by mapping to the said given pixel, or (ii) a representation of at least part of (especially a part less than the whole of) the portion of the output light pattern that is generated by an adjacent or neighbouring pixel to the said given pixel.
  6. An optically variable image device according to claim 4 or claim 5, wherein the said additional encoding as defined in (a), (b) or (c) of the said at least one additional or auxiliary modification or modulation feature or characteristic or auxiliary or subsidiary image or image component comprises one or more of, or any combination of any plurality of, any of the following: (i) auxiliary encoding of a spatially-limiting or truncating feature or characteristic which limits or restricts or truncates the spatial extent and/or shape of one or more of the phase-modulated non-planar output light wavefront components, optionally wherein the said spatially-limiting or truncating feature or characteristic comprises an area of the or each said pixel, to which said pixel the auxiliary encoding is applied and which generates a respective non-planar wavefront component of the output light pattern, being reduced within that pixel so as to delimit a sub-area of that pixel, where the said sub-area has a shape of any desired design, and where the minimum inscribed circle of that sub-area is at least 5 or 10 microns, and the angular width of the respective truncated non-planar wavefront component generated by that pixel is not less than 5 or 10° in at least one azimuthal direction; (ii) auxiliary encoding of a phase variation feature or characteristic which further varies the phase-modulation of one or more of the non-planar output light wavefront components, optionally wherein the said phase variation feature or characteristic comprises an additional phase modulation or phase modification function superimposed onto the phase modulation or modulation function of the respective said non-planar wavefront component and which does not change the main propagation direction of the so modulated/modified non-planar wavefront component by more than 5 or 10° from the main propagation direction of the non-so-modulated/modified non-planar wavefront component (i.e. 
prior to its additional modulation/modification), and wherein the additional phase modulation/modification comprises spatial modulation frequencies producing an auxiliary light pattern component (e.g. by diffraction) with a higher, optionally at least 2 or 3 times higher, angular width than the angular width of the non-so-modulated/modified non-planar wavefront component, and the modulation depth of the additional phase modulation/modification is less than half, optionally less than a quarter, further optionally less than a tenth, of the pre-selected or design wavelength, and optionally further wherein the so-produced auxiliary light pattern component's (e.g. diffracted) light forms a respective auxiliary wavefront component, and further optionally wherein that respective so-produced auxiliary wavefront component is in the form of a covert or auxiliary image or image component formed on or adjacent to or around the respective said non-planar wavefront component; (iii) encoding of a covert or auxiliary image or image component, optionally wherein the said covert or auxiliary image or image component comprises any one of the following: (iii)(a) a covert or auxiliary image or image component which is encoded into the optical structure(s) of the or one or more said pixels or each pixel of a group of a plurality of said pixels which can be defined as a light wave pattern, the light wave pattern being superimposed on the or each respective non-planar wavefront component generated by the or each respective said pixel or group of pixels, optionally wherein the encoding of the covert or auxiliary image or image component is superimposed onto the encoding of the or each respective non-planar wavefront component generated by the or each respective said pixel or group of pixels, and further optionally wherein the encoding of the covert or auxiliary image or image component is independent of the main propagation direction(s) of the respective non-planar wavefront component(s) 
generated by the optical structure(s) of the respective said pixel(s); or (iii)(b) a covert or auxiliary image or image component as defined in (iii)(a) above, and which is itself an optically variable image and is overlaid with or superimposed by, or itself overlays or superimposes or substitutes (optionally partially or fully), the said image to be formed of the three-dimensional object or scene; or (iii)(c) a covert or auxiliary image or image component as defined in (iii)(a) or (iii)(b) above, and which is in the form of a two-dimensional graphic pattern; or (iii)(d) a covert or auxiliary image or image component as defined in (iii)(a) or (iii)(b) or (iii)(c) above, and which is a hologram, optionally a Fourier or Fresnel hologram or a combination thereof; (iv) encoding of one or more auxiliary or subsidiary images or image components that require specific lighting and/or viewing conditions in order to be observable, optionally wherein the said specific lighting and/or viewing conditions, under which, optionally only under which, the said one or more auxiliary or subsidiary images or image components is/are observable, comprise one or more of the following: incident light of a specific directionality relative to the device (e.g. light which is substantially unidirectional and/or that propagates in one specific direction relative to the device, or light which is multidirectional but emitted from a point source or a representation thereof), incident light of a specific chromaticity (e.g. light which is substantially monochromatic), incident light of a specific pre-selected or design wavelength, a specific viewing direction or viewing angle relative to the device or a specific range of viewing directions or viewing angles relative to the device.
  7. An optically variable image device according to any preceding claim, wherein a wavefront which is to be generated by a given pixel is designed with one or more extension(s) into one or more neighbouring or adjacent pixels (i.e. into a portion of or an entire area delimited by the neighbouring or adjacent pixel or pixels), and wherein the extended portion(s) of the wavefront designed for the given pixel is/are either (i) encoded into the neighbouring pixel(s) itself/themselves, or (ii) encoded into the neighbouring pixel(s) as an auxiliary or additional or superimposed wavefront component(s) or image(s).
  8. An optically variable image device according to any preceding claim, wherein the determination and calculation of the encoding in the optical material of the respective pixels' optical structures of the respective phase-modulated non-planar wavefront components forming respective portions of the image to be formed is based on either a direct or an indirect working from the light scattered from the pre-selected surface of the three-dimensional object or scene, and wherein the encoding is overall determined and calculated based on: either (i) a direct working from real scattered light detected from a real three-dimensional object or scene under a predetermined real optical illumination arrangement, or (ii) an indirect working from a calculated or theoretical record of, or a modelled or simulated approximation of, light scattered from a calculated or theoretical or modelled or simulated three-dimensional object or scene under a predetermined calculated or theoretical or modelled or simulated optical illumination arrangement, optionally wherein, in said option (ii), its pre-selected surface's light scattering properties or a light scattering pattern it produces under the predetermined calculated, theoretical, modelled or simulated optical illumination arrangement is based on a solely calculated, theoretical, modelled or simulated such arrangement determined from purely calculated, theoretical, modelled or simulated technical principles only.
  9. An optically variable image device according to any preceding claim, wherein the three-dimensional object or scene, whose light scattered therefrom is used to determine and calculate the encoding in the optical material of the respective pixels' optical structures of the respective phase-modulated non-planar wavefront components forming the respective portions of the image to be formed, is either a real-life object or scene, or a scaled version or a model of a real-life object or scene, or it is a virtual or theoretical or modelled or simulated object or scene.
  10. An optically variable image device according to any preceding claim, wherein the said respective non-planar wavefront components formed by the respective pixels are each arcuate in shape.
  11. An optically variable image device according to claim 10, wherein the said respective arcuate non-planar wavefront components formed by the respective pixels are each selected from any of the following shapes or geometric functions or forms: spherical, parabolic, hyperbolic, Gaussian, toroidal, cylindrical, concave- or convex-modified versions of any of the aforementioned spherical or parabolic or hyperbolic or Gaussian or toroidal or cylindrical geometric forms, a function which can be described by Zernike polynomials of at least 2nd or higher orders, symmetrical or asymmetrical forms of any of the aforesaid geometric functions or forms, distorted or aberrated or imperfect versions or renderings of any of the aforesaid geometric functions or forms, and combinations of any two or more of any of the aforesaid geometric functions or forms or distorted, aberrated or imperfect versions/renderings thereof.
  12. An optically variable image device according to any preceding claim, wherein, for each portion of the scattered light from the three-dimensional object or scene, which is to say for each portion of the scattered light that is scattered from the respective sub-area located on the pre-selected surface of the object or scene, the said scattered light has, prior to its scatter from the object or scene, a substantially same-shaped planar wavefront and substantially the same wavelength and angle of incidence on the object or scene as does, prior to its transformation into the respective portion of the output light pattern by a respective pixel in the pixel arrangement, the light incident on that respective pixel in which is encoded the representation of that respective portion of the said scattered light from the object or scene.
  13. An optically variable image device according to any preceding claim, wherein each said respective non-planar wavefront component formed by each respective pixel, upon that respective pixel being illuminated under the said pre-selected illumination conditions, has a respective plurality of main propagation directions each of which is either substantially equal to or at least is dependent on a respective one of a corresponding plurality of main propagation directions of the light scattered from the respective sub-area on the pre-selected surface of the three-dimensional object or scene under the said pre-selected illumination conditions that corresponds by mapping to the respective pixel in the pixel arrangement in which is encoded its representation, upon the three-dimensional object or scene being illuminated under the said pre-selected illumination conditions; optionally wherein the said plurality of main propagation directions of the light scattered from each respective sub-area on the pre-selected surface of the three-dimensional object or scene are determined from directional constituents of the said scattered light which have local intensity maxima therein and are separated from each other by one or more other constituents of the said scattered light which have intensities of less than half the intensity of each of the said directional constituents of the said scattered light having the local intensity maxima therein.
  14. An optically variable image device according to any preceding claim, wherein one or more of the said respective non-planar wavefront components of the output light pattern formed by the respective pixels and each defining a respective main propagation direction of the respective portion of the output light pattern comprise(s) or is/are decomposable into a collection or fan or cone of at least two or more, or a plurality of, discrete propagating constituent light waves each propagating in a respective one of a plurality of discrete constituent output light propagation directions; optionally wherein the main propagation direction of the respective portion of the output light pattern comprising the collection or fan or cone of discrete constituent light waves is either (i) the direction of maximum intensity of those discrete constituent light waves, or (ii) an average propagation direction of those discrete constituent light waves calculated as a weighted average of their discrete constituent output light propagation directions with weighting according to their respective intensities.
  15. An optically variable image device according to any preceding claim, wherein the one or more respective non-planar wavefront components, or optionally the one or more respective non-planar wavefront components with continuous wavefront forms, which each defines a respective main propagation direction of the respective portion of the output light pattern, each has an angular width of at least 5°, optionally at least 10°, further optionally at least 15°, and in each case the said angular width being in at least one angular (i.e. azimuthal) direction, optionally in all angular directions.
  16. An optically variable image device according to any preceding claim, wherein the layer of optical material, which is provided with the arrangement of the plurality of pixels defined or definable thereon and has the said optical properties that are capable of modulating the phase of light incident thereon, constitutes, or is formed or defined by at least a portion of, optionally by a portion of the thickness of that is less than the whole thickness of, a body of the said optical material, optionally wherein the said body is in the form of a relatively thin layer, plate, sheet, web, foil or film of that optical material, where by "relatively thin" here is meant that the body of the optical material has a thickness that is relatively thin in comparison with the width and/or length of its major faces, optionally its thickness is less than 0.5 or 0.3 or 0.1 or 0.075 or 0.05 or 0.025 or 0.01 times its major faces' width and/or length.
  17. An optically variable image device according to claim 16, wherein: either (i) the layer is formed or defined by or is constituted by such a relatively thin body of the said optical material, optionally such a thin body per se whose substantially total thickness forms, defines or constitutes the said layer; or (ii) the layer is formed or defined by or is constituted by just a portion of, i.e. a portion less than the whole of, optionally just a portion, less than the whole, of the overall thickness of, a larger (optionally a thicker) body of the said optical material, whereby the layer is only that portion (optionally just that thickness portion) of such a larger (optionally thicker) body of the optical material which actually bounds or envelopes the actual modifying/modulating portion of the optical material of that body which comprises the respective optical structure(s) (in the respective pixels) with the respective modulated optical properties and which is designed and/or used to effect the modulation of the light's phase properties; and wherein, in either case (i) or (ii), the optical material is any such material which has inherent or applied optical properties (optionally refractive index, optical surface relief height/depth and/or width of relief features, light absorption or reflection properties) that can be modified or modulated across and/or along and/or through at least part of or a portion of a body of that material, and such modification(s) or modulation(s) of the material's inherent or applied optical properties is/are so modified or modulated across and/or along and/or through either (iii) substantially the whole of one or more dimensions (i.e. length and/or width and/or thickness) of such a body of the optical material, or (iv) only part of or through only one or more portions, less than the whole, of the one or more dimensions (i.e. 
length and/or width and/or thickness) of such a body of the optical material, optionally wherein such modification(s) or modulation(s) of the material's inherent or applied optical properties is/are so modified or modulated through either (i) substantially the whole of the thickness of such a body of the optical material, or (ii) only part of or through only one or more portions, less than the whole, of the thickness of such a body of the optical material, whereby in the latter case (ii) the layer constitutes just that portion of the body that actually comprises the respective optical structures (in the respective pixels) with the respective modulated optical properties and which is designed and/or used to effect the modulation of the light's phase properties; and optionally further wherein such modification(s) or modulation(s) of the material's inherent or applied optical properties is effectable independently for each pixel in the arrangement.
  18. An optically variable image device according to claim 16 or claim 17, wherein, optionally independently within each pixel, the thickness of the layer is defined by a distance between opposed surfaces (optionally physical (i.e. real) or virtual (i.e. notional) surfaces) of the body or the relevant part/portion of the body (optionally the relevant part/portion of its thickness) which envelope the actual modifying/modulating portion of the optical material which is designed or used to modulate the light's phase properties, and wherein the layer, which is provided with the arrangement of the plurality of pixels defined or definable thereon and comprises the respective optical structures (in the respective pixels) with the respective modulated optical properties and which is designed and/or used to effect the modulation of the light's phase properties, has a said thickness of from 0.5 up to 50 µm, optionally from 0.5 up to 10 or 20 or 30 µm, and further optionally from 0.5 up to 3 or 5 µm.
  19. An optically variable image device according to any preceding claim, wherein: the optical material that forms or defines the said layer is a material that is capable of providing, or being formed into or having formed therein or thereon, in each respective pixel of the arrangement, and optionally independently for each pixel, a respective optical structure that interacts with incident light transmitted therethrough or reflected therefrom and is capable of modulating one or more optical properties of that light, including modulating the phase of the light wave that interacts therewith, whereby the light wavefront becomes distorted or warped or otherwise shape-modified into a non-planar shape, and is capable of having a record of an optical image or portion of an optical image encoded therewithin or thereon; and wherein the optical structure within at least a portion of each respective pixel is formed or provided by the optical material that forms or defines the said layer by virtue of either (a) the modulatable inherent optical properties of the said optical material itself in the region thereof defining the respective pixel or the said portion thereof, or (b) the respective optical structure being applied to the optical material (optionally onto or into one or more surface(s) or surface layer(s) of the optical material) in the region thereof that defines the respective pixel or the said portion thereof, or optionally (c) the respective optical structure being formed at least in part by an additional optically functional layer applied to the optical material (optionally onto or into one or more surface(s) or surface layer(s) thereof) in the region thereof that defines the respective pixel or the said portion thereof; and further wherein the optical material that forms or defines the said layer is such a material which has, or has applied thereto or has formed therein or thereon, any one or more of, or any combination of any plurality of, the following: (i) 
inherent optical properties, optionally refractive index, that can be modified or modulated across and/or along and/or through at least part of the thickness of the said layer (i.e. through either substantially the whole thickness of the said layer or alternatively through only part of (i.e. a portion of, less than the whole of) the thickness of the said layer), or (ii) optical surface relief (optionally at least partially reflective or refractive optical surface relief) applied to or formed on or in at least one surface of, or within a surface layer of, the said layer, which relief has variations in its surface relief height/depth and/or width of its relief features, and/or variations in its relief layer thickness, or (iii) light absorption or reflection properties which can be varied across and/or along and/or through part of or all of the material of the said layer; optionally wherein such variations in optical properties of the optical material that forms or defines the said layer as per any one or more of (i), (ii) or (iii) above are manifested either (iv) within each of the various pixels over at least a portion of the area of each pixel and independently for each pixel, or (v) over one or more groups of a respective plurality of pixels or a respective plurality of portions of the areas of that plurality's pixels and independently for the or each such group of pixels or pixel portions.
20. An optically variable image device according to claim 19, wherein the modification(s) or modulation(s) of the optical material's inherent or applied optical properties are either (i) so modified or modulated across and/or along and/or through substantially the whole of one or more dimensions of (i.e. length and/or width and/or thickness of) the said layer, or (ii) so modified or modulated across and/or along and/or through only part of or through only one or more portions, less than the whole, of the one or more dimensions of (i.e. length and/or width and/or thickness of) the said layer; optionally wherein the inherent or applied optical properties of the optical material forming the said layer vary throughout the layer or the relevant part/portion thereof by virtue of the layer or its relevant part/portion itself comprising a plurality of portions, optionally a plurality of thickness portions, each within and less than the layer's or its relevant part/portion's overall thickness, wherein at least one of those thickness portions has one or more different inherent or applied optical properties from the other thickness portion(s) thereof.
21. An optically variable image device according to any preceding claim, wherein: (i) the thickness of the layer of optical material, which is provided with the arrangement of the plurality of pixels defined or definable thereon, constitutes either substantially the whole thickness of the device itself or alternatively just a portion of (i.e. a portion less than the whole of) the thickness thereof; and/or (ii) the facial surface area of the layer of optical material constitutes either substantially the whole of the facial surface area of the device itself or alternatively just a portion of (i.e. a portion less than the whole of) the facial surface area thereof; optionally wherein, in either case (i) or (ii), where the layer of optical material constitutes just a portion of the thickness and/or facial surface area (as the case may be) of the device itself, the optical material layer is provided either (iii) as a discrete layer which is applied or mounted onto a surface or face of, or alternatively embedded in a facial portion of, a carrier of an optically non-functional or non-phase-modulating carrier material, or (iv) as an integral, inseparable surface layer of a unitary such carrier.
22. An optically variable image device according to any preceding claim, wherein: either (i) the layer of optical material, or a body of the optical material comprising the said layer, is substantially planar or flat, or (ii) the layer of optical material, or a body of the optical material comprising the said layer, is curved or arcuate in one or two or three dimensions.
23. An optically variable image device according to any preceding claim, wherein each pixel has an average width of from 3 or 5 or 10 or 20 μm up to 50 or 100 or 200 or 300 or 400 or 500 μm, optionally further wherein an inscribed circle area that each pixel occupies is in a diameter size range of from at least 3 or 5 or 10 μm up to at most 100 or 200 or 300 μm, further optionally from at least 5 or 10 μm up to 100 or 200 μm, yet further optionally from 5 μm up to 100 μm.
24. An optically variable image device according to any preceding claim, wherein projection vectors or projection lines that define the said mapped correspondences between the respective pixels and the corresponding respective sub-areas on the pre-selected surface of the object or scene are either (i) unidirectional or (ii) multidirectional, which is to say that the projection vectors or projection lines either (i) all lie or point in substantially the same direction across the sub-areas on the pre-selected surface of the object or scene, or (ii) lie or point in varying directions across the sub-areas on the pre-selected surface of the object or scene; optionally wherein, in the mapped correspondences between the respective pixels and the corresponding respective sub-areas on the pre-selected surface of the object or scene, the arrangement of the pixels and the pre-selected surface of the three-dimensional object or scene are either (iii) in substantially equal size proportions relative to one another, or (iv) scaled-up or scaled-down in size proportions relative to one another, wherein one of the arrangement of the pixels and the pre-selected surface of the three-dimensional object or scene is scaled-up or scaled-down in size relative to the other of the arrangement of the pixels and the pre-selected surface of the three-dimensional object or scene; further optionally wherein, in the mapped correspondences between the respective pixels and the corresponding respective sub-areas on the pre-selected surface of the object or scene, the said mapping is done either (v) for substantially all the sub-areas on the pre-selected surface of the object or scene, or (vi) for some of, but not all of, the sub-areas on the pre-selected surface of the object or scene, optionally one or more groups each of one or more adjacent or neighbouring sub-areas which collectively are less than all the sub-areas.
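Outside the claim language, the mapped correspondence of claim 24 can be sketched numerically. The following is an illustrative model only, not the patent's method: it realizes option (i), a unidirectional projection from pixel indices to sub-area centres, with option (iv)'s scaling as a single factor. The pitch, scale and offset names are assumptions introduced here for the sketch.

```python
def pixel_to_subarea(ix, iy, pixel_pitch_um=20.0, scale=1.0, offset_um=(0.0, 0.0)):
    """Map a pixel's indices (ix, iy) to the centre of its corresponding
    sub-area on the pre-selected surface: a unidirectional projection
    (all projection lines parallel), optionally scaled up or down and
    shifted by a fixed offset.  All units are micrometres."""
    x = (ix + 0.5) * pixel_pitch_um
    y = (iy + 0.5) * pixel_pitch_um
    return (offset_um[0] + scale * x, offset_um[1] + scale * y)

# equal proportions (scale = 1): pixel (0, 0) maps to the point (10, 10)
centre = pixel_to_subarea(0, 0)
```

A multidirectional mapping (option (ii)) would replace the fixed scale and offset with a per-pixel projection vector; the structure of the function stays the same.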
25. An optically variable image device according to any preceding claim, wherein the image to be formed by the pixels of the device comprises an image which resembles or imitates or is an approximation or simulation of the scattered light from the three-dimensional object or scene, and wherein the pre-selected surface of the three-dimensional object or scene is either (i) an outer surface of the three-dimensional object or scene, optionally an outer surface of the three-dimensional object or scene which faces the light incident on the three-dimensional object or scene, or (ii) an inner surface of the three-dimensional object or scene, whereby the image to be formed by the respective pixels of the device comprises a plurality of respective image portions which resemble or imitate or are approximations or simulations of portions of scattered light from the respective sub-areas on the pre-selected surface of the three-dimensional object or scene and which either pass through or are reflected from those sub-areas on that pre-selected surface; optionally wherein the pre-selected surface of the three-dimensional object or scene whose light scattered therefrom is to be imitated by formation of an image thereof by the device is able to receive incident light originating from outside the three-dimensional object or scene, and does so either directly or indirectly from outside the three-dimensional object or scene; and further optionally wherein the three-dimensional object or scene and/or the pre-selected surface thereof is either (iii) transparent or semi-transparent, whereby the scattered light therefrom is produced at least in part upon transmission therethrough, or (iv) optically opaque or reflective, whereby the scattered light therefrom is produced substantially only upon reflection therefrom.
26. An optically variable image device according to any preceding claim, wherein the optically variable image device is designed and constructed for forming an observable optically variable image of light scattered from a three-dimensional object or scene where that three-dimensional object or scene is any of the following: (i) a substantially reflective three-dimensional object or scene; or (ii) a reflective object or scene comprising a relief pattern or motif with at least some three-dimensional content; or (iii) a reflective three-dimensional object or scene with an entrance surface (i.e. on a face or side of the object which first receives the incident light) in the form of a relief pattern or motif and an opposite exit surface (i.e. on the face or side of the object opposite its entrance surface) which is substantially flat or planar; or (iv) a substantially transparent three-dimensional object or scene with an entrance surface (i.e. on a face or side of the object which first receives the incident light) which is substantially flat or planar, and an opposite exit surface (i.e. on the face or side of the object opposite its entrance surface) in the form of an optical relief pattern or motif; optionally wherein the said relief pattern or motif is an embossed or cast form thereof.
27. An optically variable image device according to any preceding claim, wherein the said pre-selected surface of the three-dimensional object or scene is described by a distance function D=f(s) relative to an entrance surface s of the said layer, the said function f defining a distance D between a selected or given point on the said layer's entrance surface s and a corresponding point on the pre-selected surface of the three-dimensional object or scene, where the distance D between the said points is measured along a respective projection vector or projection line defining a mapping relationship between a respective sub-area located on the pre-selected surface of the three-dimensional object or scene and a respective pixel (that is to record a corresponding respective portion of the image to be formed), optionally wherein the distance D between the said points is measured along a respective projection vector or projection line being a normal to the surface s; further optionally wherein the function f is substantially continuous yet non-uniform over either (i) substantially the whole of the said layer's entrance surface s, or alternatively (ii) substantially the whole of each of one or more portions of the said layer's entrance surface s on which are defined a respective group or array of the pixels.
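The distance function D=f(s) of claim 27 can be sampled at the pixel centres of the arrangement. The sketch below is illustrative only: the grid sizes, the 20 μm pitch and the example "dome" surface are assumptions made for the demonstration, and D is measured along the normal to the entrance surface s, which the claim names as the optional case.

```python
import numpy as np

def sample_distance_function(f, pixel_pitch_um, nx, ny):
    """Sample the distance function D = f(s) at each pixel centre of an
    nx-by-ny pixel arrangement with the given pitch, D being measured
    along the normal to the entrance surface s."""
    xs = (np.arange(nx) + 0.5) * pixel_pitch_um
    ys = (np.arange(ny) + 0.5) * pixel_pitch_um
    X, Y = np.meshgrid(xs, ys)
    return f(X, Y)

# illustrative pre-selected surface: a shallow dome at most 200 um behind s,
# giving a continuous yet non-uniform f over the whole entrance surface
dome = lambda x, y: 200.0 - 1e-4 * ((x - 500.0) ** 2 + (y - 500.0) ** 2)
D = sample_distance_function(dome, 20.0, 50, 50)  # 50 x 50 pixels, 20 um pitch
```

Each entry of `D` is then the distance, along the corresponding projection line, from a pixel centre to the mapped point on the pre-selected surface.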
28. An optically variable image device according to any preceding claim, wherein the phase modulation of the incident light wave or the calculation thereof comprises the incident light wave's undergoing a phase delay or phase shift (which is to say, a shift of the position of the sinusoidal wave in its propagation direction) as compared with an incident light wave that does not interact with the optical structure in the respective pixel; optionally wherein the amount of the phase delay or phase shift is dependent on the local optical and/or physical properties of the optical structure of the relevant pixel, wherein said local optical and/or physical properties of the optical structure are selected from: (i) its optical material's refractive index, and/or (ii) its optical surface relief profile height or depth and/or width of its relief features applied thereto, and/or (iii) the thickness of a layer thereof in which is formed its optical surface relief profile, and/or (iv) the composition of a single or a stack of a plurality of thin film layers applied to any one or more surfaces of the said optical material forming the layer, whereby the light wave exiting the optical structure of the relevant pixel has its wavefront phase-modified compared to the wavefront of the original incident light wave.
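The dependence of the phase delay on refractive index and relief height (options (i) and (ii) of claim 28) can be sketched with the standard thin-element approximation, under which a transparent relief of height h in a material of index n delays the transmitted wave by 2π(n−1)h/λ relative to the surrounding medium. This is a general optics relation, not a formula taken from the patent, and the numeric values (532 nm wavelength, n = 1.5) are illustrative assumptions.

```python
import numpy as np

def transmission_phase_delay(relief_height_um, n_material,
                             n_surround=1.0, wavelength_um=0.532):
    """Phase delay (radians) imparted on transmission by a transparent
    relief of the given height, relative to light traversing the same
    thickness of the surrounding medium (air by default): the delay grows
    with both the relief height and the refractive-index contrast."""
    optical_path_difference = (n_material - n_surround) * relief_height_um
    return 2.0 * np.pi * optical_path_difference / wavelength_um

# e.g. a 1 um step in a polymer (n ~ 1.5) at a 532 nm design wavelength
delta_phi = transmission_phase_delay(1.0, 1.5)
```

Replacing the air with a different transparent material (option (iii) of claim 29) corresponds here to changing `n_surround`, which reduces the contrast and therefore the delay for the same relief height.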
29. An optically variable image device according to any preceding claim, wherein an interaction between the incident light wave and the respective optical structure of each respective pixel is of the nature of, or is predominantly of the nature of, transmission, reflection or absorption, or a combination of any two or more thereof, and the respective optical structure of the respective pixel is constructed or formed so as to impart thereto the required phase modulation capability for the purpose of forming a respective non-planar wavefront component forming a respective portion of the image to be formed by that pixel, wherein the said respective optical structure comprises any one or more of the following features: (i) localized variation(s) in the refractive index of the optical material forming the layer in the region thereof defining the respective pixel; (ii) variation(s) in the thickness of a transparent optical relief structure or relief layer within the layer in the region thereof defining the respective pixel and which is open to the air to one side thereof; (iii) the same feature as feature (ii) above, but modified such that the air is replaced with a different transparent material with a different refractive index from the material in which the optical relief is formed (optionally wherein an additional thin sub-layer of optical material is applied onto the optical relief between (i.e. 
sandwiched between) the two material sub-layers of this option (iii), wherein that sandwiched additional thin sub-layer has yet another different refractive index from either one (only) of or both of those other two material sub-layers therebelow and thereabove), and optionally wherein the refractive indexes of any two (only), but not all, of those sub-layers in this configuration are optionally the same; (iv) the same feature as feature (iii) above, but modified such that the variable optical relief structure is buried within an optical material with a localized modulated refractive index; (v) by arranging for a variable travel length of a forward and a backward path of the incident light beam which is reflected by optical relief, which is open to the air to one side thereof, formed in an optical material (forming the layer) that is reflective or is a material coated with one or more layers (optionally as a monolayer or a layer stack) of a coating material that is at least partially, or acts as, a reflecting surface; (vi) the same feature as feature (v) above, but modified such that the air is replaced with a transparent optical material; (vii) by arranging for the optical material to have variable refractive index by virtue of the provision of any one of the above features (i), (iii), (iv) or (v) above, combined with a reflecting material on the back side of the layer (i.e. on the side opposite to the side through which the incident light first enters the respective optical structure), whereby the combined respective optical structure can phase-modulate the incident light wave on its forward and backward (i.e. reflected) paths through the layer; (viii) by applying an intermediate layer or stack of layers (optionally of one or more semi-transparent metallic material(s) or dielectric material(s)) onto at least one side of (optionally the front (i.e. 
entrance) side of) the layer, and optionally also burying an optical relief layer (as a modification of feature (iii) above) within the layer, whereby the optical properties (optionally the phase properties) of transmitted and/or reflected light are yet further modulated (optionally yet further phase-modulated), optionally wherein, as an optional augmentation of this arrangement, an overcoat on a front (i.e. entrance) side of the layer is present and is optionally designed as an antireflective coating layer (further optionally an antireflective coating layer which is selective with respect to wavelength and/or angle of incidence and/or angle of observation), whilst optionally in the case of a buried optical relief layer the overcoat on the buried optical relief layer is optionally designed with an increased reflectivity (further optionally with an increased wavelength-selective reflectivity and/or angle of incidence-selective reflectivity), in order to yet further modify the overall phase-modulating optical properties of the complete respective optical structure of the respective pixel.
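For the reflective options (v) and (vi) of claim 29, the variable travel length covers both the forward and the backward pass, so the path difference for a relief of height h is 2h in the covering medium. The sketch below states that doubling explicitly; it is a general optics relation rather than the patent's own formula, and the default values (air cover, 532 nm wavelength) are illustrative assumptions.

```python
import numpy as np

def reflection_phase_delay(relief_height_um, n_medium=1.0, wavelength_um=0.532):
    """Phase delay (radians) for light reflected off a relief surface:
    the forward and backward passes double the travelled path, with
    n_medium = 1.0 for a relief open to the air (option (v)) or the index
    of a transparent fill replacing the air (option (vi))."""
    return 2.0 * np.pi * (2.0 * n_medium * relief_height_um) / wavelength_um
```

Compared with the transmissive case, a reflective relief therefore needs roughly half the relief depth for the same phase swing, which is one practical reason for combining a relief with a back-side reflector as in option (vii).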
30. An optically variable image device according to claim 29, wherein in the provision of the feature(s) of any of the said options (i) to (viii), one or more of the following additional features (ix) and/or (x) is/are present: (ix) as an augmentation of the respective arrangement, there is added to the layer a reflective sub-layer or multilayer stack on a back side thereof in order to further modify the overall phase-modulating optical properties of the combined respective optical structure of the respective pixel; and/or (x) an overcoat is applied to any surface of the main layer and/or any exposed or buried optical relief forming part of the main layer in order to yet further modify the interaction of incident light with the respective optical structure, optionally wherein the overcoat is applied fully or selectively to only one or more parts of the main layer, optionally to all of or only to selected one(s) of the respective pixels or to one or more sub-groups of a plurality of the pixels or one or more portion(s) of one or more such sub-groups of a plurality of pixels, optionally disregarding individual boundaries of one or more such pixels in one or more such sub-group(s) thereof, and further optionally wherein the overcoat is reflective, partially reflective or selectively reflective with respect to wavelengths and optionally the angle of incidence of the interacting light.
31. An optically variable image device according to any preceding claim, wherein independently for each pixel, the respective optical structure of each respective pixel has a phase modulation depth limit for a given wavelength and a given angle of incidence of the light interacting with the respective optical structure of the respective pixel, wherein the phase modulation depth limit is defined as a maximum phase shift (i.e. maximum phase delay) that the respective optical structure can induce in the light wave interacting with the respective optical structure of that respective pixel, and optionally wherein each pixel's generated phase-modulated non-planar output light wavefront component that exits the layer from that pixel and imitates the light scattered by the three-dimensional object or scene in the corresponding respective sub-area on the pre-selected surface thereof, is designed (on the basis of the said pre-selected illumination conditions as defined in claim 1) such that it has a phase modulation depth which is at most equal to or smaller than the said phase modulation depth limit under the said pre-selected illumination conditions.
32. An optically variable image device according to claim 31, wherein one or more of the pixels is in effect split into two or more sub-portions, which is to say, the wavefront component to be encoded (i.e. under the said pre-selected illumination conditions) in such a pixel which exceeds a maximum modulation limit of that pixel's optical structure is spatially divided into two or more sub-components, each one not exceeding the maximum modulation limit of the optical structure of that pixel, and each such sub-component of the wavefront component being encoded separately in a respective sub-portion of that pixel's optical structure that spatially corresponds to the said sub-component of the wavefront component to be encoded thereby; optionally (and especially optionally when the smallest dimension of the respective pixel's sub-portions is smaller than 10 or 5 times the design wavelength) wherein the wavefront sub-components generated by the neighbouring sub-portions of the pixel remain synchronized (i.e. under the said pre-selected illumination conditions), i.e. the phase offset between the neighbouring wavefront sub-components at the exit of the pixel, if any, is minimized or is equal to 2π or multiples thereof, whereby unwanted wavefront disruptions are suppressed or minimized.
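The splitting of claims 31 and 32 can be sketched as ordinary phase wrapping: where a pixel's designed phase profile exceeds the modulation depth limit, it is divided into sub-portions whose offsets are integer multiples of 2π, so neighbouring sub-components stay synchronized at the pixel exit. The sketch below assumes a limit of at least 2π and a one-dimensional profile; both are simplifications for illustration, not the patent's procedure.

```python
import numpy as np

def wrap_to_modulation_limit(phase):
    """Divide a wavefront phase profile that exceeds a pixel's modulation
    depth limit into sub-portions, each offset from the design profile by
    an integer multiple of 2*pi; with a limit of 2*pi this is ordinary
    phase wrapping, and the complex field exp(i*phase) is unchanged."""
    return np.mod(phase, 2.0 * np.pi)

# a linear phase ramp (tilted wavefront) spanning 6*pi becomes three
# sawtooth segments, each within a 2*pi modulation depth
ramp = np.linspace(0.0, 6.0 * np.pi, 601)
wrapped = wrap_to_modulation_limit(ramp)
```

Because the sub-portion offsets are exact multiples of 2π, the wrapped profile generates the same output field as the unwrapped one under the design wavelength, which is the synchronization condition the claim states.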
33. A method for producing an optically variable image device for forming an observable optically variable image of light scattered from a three-dimensional object or scene upon illumination of the device with incident light, the image to be formed comprising an output light pattern, and the optically variable image device being a device according to any one of claims 1 to 32, wherein the method comprises: (1) either before or after step (2), pre-selecting a surface of the three-dimensional object or scene for the purpose of the following steps of the method; (2) either after or before step (1), providing a layer of optical material with an arrangement of a plurality of pixels defined or definable thereon, each respective pixel being for forming a respective optical structure therein with optical properties capable of modulating the phase of the light incident thereon, where each respective pixel is for forming a said respective optical structure which is capable of modulating the phase of light incident thereon so as to form a respective phase-modulated non-planar wavefront component of the output light pattern, which respective non-planar wavefront component to be formed by that respective pixel (i) corresponds to a respective portion of the image to be formed that is to be recorded in that respective pixel, and (ii) has phase modulation that is characteristic of or a function of the modulating optical properties of the respective optical structure to be formed in that respective pixel; (3) mapping each respective one of a plurality of sub-areas located on the preselected surface of the three-dimensional object or scene onto each corresponding respective pixel that is to record the corresponding respective portion of the image to be formed, or alternatively mapping each respective pixel that is to record a respective portion of the image to be formed onto each corresponding respective one of a plurality of sub-areas located on the pre-selected surface of 
the three-dimensional object or scene; (4) for the purpose of subsequent encoding in the respective pixels of the respective recorded portions of the image to be formed, defining pre-selected illumination conditions comprising, for each respective pixel and each respective sub-area on the pre-selected surface of the three-dimensional object or scene, illumination by a planar light wave of a preselected or design wavelength and at a respective pre-selected angle of incidence on the respective pixel and on the three-dimensional object or scene, as the case may be, and in the case of the respective sub-area the said planar light wave originating outside the three- dimensional object or scene and being incident on a receiving side of the respective sub-area as it interacts with the three-dimensional object or scene; (5) for each respective sub-area located on the pre-selected surface of the three-dimensional object or scene, and for the pre-selected illumination conditions defined in step (4), determining, especially from calculation or measurement, at least one respective main propagation direction of light scattered from the respective sub-area on the pre-selected surface of the three-dimensional object or scene that corresponds by the said mapping to the respective mapped pixel in the pixel arrangement in which is to be encoded its representation, and therefrom assigning that respective at least one main propagation direction of the scattered light from the respective sub-area to a respective at least one main propagation direction of the respective non-planar wavefront component of the output light pattern to be formed by that respective pixel after interaction therewith of the incident light thereon under the pre-selected illumination conditions, where that determining and assigning are such that the said respective non-planar wavefront component to be formed by each respective pixel, upon said respective pixel being illuminated under the said pre-selected 
illumination conditions, has at least one said respective main propagation direction that is either substantially equal to or at least is dependent on a respective at least one said main propagation direction of the said scattered light from the respective sub-area on the pre-selected surface of the three-dimensional object or scene; (6) for each respective mapped pixel, using the respective at least one main propagation direction of the respective non-planar wavefront component to be formed by that respective pixel as determined and assigned in step (5), designing a shape of the said respective non-planar wavefront component to be formed by that respective pixel, which respective non-planar wavefront component to be formed by that respective pixel is independent of the shape of a wavefront of the said scattered light from the respective subarea on the pre-selected surface of the three-dimensional object or scene that corresponds by the said mapping to the respective pixel in the pixel arrangement in which is to be encoded its representation; (7) for each respective pixel in the pixel arrangement, calculating the modulating optical properties of the optical material that is to form the respective optical structure thereof and which is capable of transforming a planar wavefront of the light incident on that respective pixel into a said designed respective phase-modulated non-planar wavefront component of the output light pattern that is characteristic of or a function of those modulating optical properties of the respective optical structure to be formed in that respective pixel; and (8) recording in the optical material of each respective pixel each respective optical structure thereof that is capable of forming each respective phase-modulated non-planar wavefront component of the output light pattern of the image to be formed, where each respective phase-modulated non-planar wavefront component is so recorded in the respective pixel's optical structure in encoded 
form as said modulating optical properties of the optical material of the respective optical structure as calculated in step (7), whereby each respective encoded record of each respective portion of the image to be formed is so recorded in the respective optical structure of the respective pixel as a representation of the corresponding respective portion of the said scattered light from the respective sub-area on the pre-selected surface of the three-dimensional object or scene.
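Steps (4) to (8) of claim 33 can be sketched for a single pixel. The sketch below is an illustrative simplification, not the claimed method: it assumes the assigned main propagation direction from step (5) is encoded as a plane-tilted wavefront component (one possible shape for step (6)), uses the thin-element relation h = φλ/(2π(n−1)) to convert the calculated phase of step (7) into transmission relief heights for step (8), and all numeric constants (532 nm design wavelength, n = 1.5, 20 μm pixel, 64 samples) are assumptions.

```python
import numpy as np

WAVELENGTH_UM = 0.532   # design wavelength of step (4), illustrative
N_MATERIAL = 1.5        # refractive index of the optical material, illustrative

def design_pixel_relief(theta_x_rad, theta_y_rad, pixel_um=20.0, samples=64):
    """One pixel, steps (5)-(8) sketched: encode a tilted wavefront
    component whose main propagation direction (theta_x, theta_y) was
    assigned from the scattered light of the mapped sub-area, wrap the
    phase to the structure's 2*pi modulation depth, and convert it to
    transmission relief heights (micrometres)."""
    u = (np.arange(samples) + 0.5) * pixel_um / samples
    X, Y = np.meshgrid(u, u)
    k = 2.0 * np.pi / WAVELENGTH_UM
    phase = k * (np.sin(theta_x_rad) * X + np.sin(theta_y_rad) * Y)  # step (6)
    phase = np.mod(phase, 2.0 * np.pi)                               # step (7)
    return phase * WAVELENGTH_UM / (2.0 * np.pi * (N_MATERIAL - 1.0))  # step (8)

# a pixel steering its wavefront component 10 degrees off-axis in x
relief = design_pixel_relief(np.deg2rad(10.0), 0.0)
```

Repeating this per pixel, with the direction pair taken from the mapped sub-area's scattered light, yields the full arrangement of optical structures whose encoded records collectively represent the image to be formed.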
34. A method according to claim 33, wherein the method is a method for producing an optically variable image device in which the produced layer following step (8) (i.e. with its arrangement of pixels with the respective portions of the image to be formed recorded in encoded form in the respective optical structures thereof) does not itself inherently constitute the final optically variable image device being produced, and wherein the method further includes, after step (8), a step of: (9) assembling or incorporating the layer with its arrangement of pixels (with the respective portions of the image to be formed recorded in encoded form in the respective optical structures thereof) into the final optically variable image device being produced, optionally by application of the layer onto, or by embedding of the layer into, or by combining the layer with (optionally by lamination), one or more carrier layers or other structural or optically functional or non-functional layers of the produced device.
35. A method according to claim 33 or claim 34, which method is carried out in either one of the following two manners (i) or (ii): (i) the complete method as defined in claim 33, comprising the defined steps (1) to (7) as well as the "recording" step (8), and optionally also the additional "assembling or incorporating" step (9) as defined in claim 34 (as the case may be), is designed and carried out either once only or a plurality of times, so as to produce one or more respective final specific embodiment versions of an optically variable image device according to any one of claims 1 to 32 upon the or each respective rendering of that complete method; or (ii) only that part of the method as defined in claim 33 comprising just the defined steps (1) to (7) only is designed and carried out once only, for defining the encoding of each respective portion of the image to be formed that is to be recorded in the respective pixels of a single specific embodiment version of an optically variable image device according to any one of claims 1 to 32, and then only the remaining defined "recording" step (8), and optionally also the additional "assembling or incorporating" step (9) as defined in claim 34 (as the case may be), is carried out a plural number of times on each respective one of a plurality of substantially identical discrete layers of optical material, so as to produce a plurality of final replicated or derivative devices all being according to that single specific embodiment version thereof but which are substantially identical to, or are substantial clones of, each other.
36. A method of forming an observable optically variable image of light scattered from a three-dimensional object or scene, the image being formed from an encoded record thereof in an optically variable image device, wherein the method comprises: (1) providing said optically variable image device being an optically variable image device according to any one of claims 1 to 32 or an optically variable image device produced by a method according to any one of claims 33 to 35; and (2) illuminating the arrangement of pixels of the layer of the device collectively with incident light, optionally under the said pre-selected illumination conditions; whereby upon said illumination of each respective pixel of the layer of the device with the incident light, each said pixel forms a respective portion of the observable optically variable image in the form of the said respective non-planar wavefront component of the output light pattern, and whereby upon said collective illumination of the arrangement of pixels of the layer of the device with the incident light, the plurality of pixels collectively form the observable optically variable image in the form of a plurality of portions thereof in the form of a plurality of said respective non-planar wavefront components of the output light pattern, which plurality of said respective non-planar wavefront components of the output light pattern collectively correspond to the said scattered light from the three-dimensional object or scene whose image is formed by the device.
37. An article or product having applied or affixed thereto or incorporated therein an optically variable image device according to any one of claims 1 to 32 or an optically variable image device produced by a method according to any one of claims 33 to 35, optionally wherein the article or product is any one of the following: a document, a banknote, a passport, a visa, an ID card or document, a driving licence, a membership card, a ticket, a certificate, packaging, a work of art, an antique or other valuable item.
GB2406660.7A 2024-05-10 2024-05-10 Optically variable image device Pending GB2640972A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2406660.7A GB2640972A (en) 2024-05-10 2024-05-10 Optically variable image device
PCT/EP2025/062925 WO2025233536A1 (en) 2024-05-10 2025-05-12 Optically variable image device

Publications (2)

Publication Number Publication Date
GB202406660D0 GB202406660D0 (en) 2024-06-26
GB2640972A true GB2640972A (en) 2025-11-12

Family

ID=91581574

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2406660.7A Pending GB2640972A (en) 2024-05-10 2024-05-10 Optically variable image device

Country Status (2)

Country Link
GB (1) GB2640972A (en)
WO (1) WO2025233536A1 (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5105306A (en) 1989-01-18 1992-04-14 Ohala John J Visual effect created by an array of reflective facets with controlled slopes
WO2004048119A1 (en) 2002-11-22 2004-06-10 Ovd Kinegram Ag Optically variable element and the use thereof
EP1782108B1 (en) * 2004-08-06 2011-02-16 Optaglio sro A method of creating a three-dimensional image, a diffractive element and method of creating the same
US20130093172A1 (en) 2009-12-04 2013-04-18 Giesecke & Devrient Gmbh Security element, value document comprising such a security element, and method for producing such a security element
EP3059093A1 (en) 2009-12-04 2016-08-24 Giesecke & Devrient GmbH Security element, valuable document comprising such a security element and method for producing such a security element
US20150070350A1 (en) * 2012-04-25 2015-03-12 Visual Physics, Llc Security device for projecting a collection of synthetic images
EP2955564B1 (en) * 2014-06-13 2023-08-09 OpSec Security Limited Optically variable element
US10688822B1 (en) * 2014-12-30 2020-06-23 Morphotrust Usa, Llc Embedding 3D information in documents
US20180015771A1 (en) * 2015-01-09 2018-01-18 Ovd Kinegram Ag Method for Producing Security Elements, and Security Elements
EP3339048B1 (en) * 2016-12-22 2020-11-04 Giesecke+Devrient Currency Technology GmbH Security element having reflective surface area
WO2018201208A1 (en) 2017-05-03 2018-11-08 Demax Holograms Ad Optical variable device
WO2019077419A1 (en) * 2017-10-20 2019-04-25 Wavefront Technology, Inc. Optical switch devices
US20220111676A1 (en) * 2018-09-24 2022-04-14 Ovd Kinegram Ag Optically variable element, security document, method for producing an optically variable element, method for producing a security document
US20220063318A1 (en) * 2018-12-20 2022-03-03 Giesecke+Devrient Currency Technology Gmbh Optically variable security element
CN117241947A (en) * 2021-05-03 2023-12-15 Giesecke+Devrient Currency Technology GmbH Optically variable security elements

Also Published As

Publication number Publication date
GB202406660D0 (en) 2024-06-26
WO2025233536A1 (en) 2025-11-13

Similar Documents

Publication Publication Date Title
CN102905909B (en) Security element, value document comprising such a security element, and method for producing such a security element
EP1407419B1 (en) Diffractive optical device and method of manufacture
CN104718469B (en) Optically Variable Surface Graphics
JP4601008B2 (en) Optical security element
EP1782108B1 (en) A method of creating a three-dimensional image, a diffractive element and method of creating the same
JP5431363B2 (en) Film element
CN102725148B (en) Multilayer
CN100489568C (en) Grid image with one or several grid fields
AU2009250051B8 (en) Optical security element
KR20180029062A (en) Optical products, masters for making optical products, and methods for manufacturing master and optical products
JP2016505161A (en) Security document with decorative elements and decorative elements
JPH02165987A (en) Optically variable face pattern
CN107107646B (en) Method for producing a secure element and secure element
AU2016101590B4 (en) A 3d micromirror device
CN115230363B (en) Optical anti-counterfeiting element, design method thereof and anti-counterfeiting product
CN113056376B (en) Optically variable element, security document, method for producing an optically variable element, method for producing a security document
CN110832366B (en) Optical structure and authentication body
AU2011101251A4 (en) Optically variable device
JP5251236B2 (en) Diffraction structure having fine uneven diffraction structure
GB2640972A (en) Optically variable image device
JP2011095465A (en) Display body
JP2024502477A (en) Optical device and its manufacturing method
Goncharsky et al. Synthesis of Nano-Optical Elements for Forming 3D Images
BG67098B1 (en) Optical variable element
JP2007334076A (en) Light diffraction structure