GB2399248A - Projection of supplementary image data onto a studio set - Google Patents

Info

Publication number: GB2399248A (application GB0305294A)
Authority: GB (United Kingdom)
Prior art keywords: image, camera, production, projector, tracking camera
Legal status: Granted; currently Expired - Fee Related
Other versions: GB0305294D0 (en), GB2399248B (en)
Inventor: Vali Lalioti
Current Assignee: British Broadcasting Corp
Original Assignee: British Broadcasting Corp
Application filed by British Broadcasting Corp
Priority to GB0305294A (patent GB2399248B/en)
Publication of GB0305294D0
Publication of GB2399248A
Application granted
Publication of GB2399248B

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 — Details of television systems
    • H04N5/222 — Studio circuitry; Studio devices; Studio equipment
    • H04N5/2222 — Prompting
    • H04N5/2224 — Studio circuitry, devices and equipment related to virtual studio applications
    • H04N5/262 — Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272 — Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H04N5/275 — Generation of keying signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Processing Or Creating Images (AREA)
  • Studio Circuits (AREA)

Abstract

Apparatus for tracking the position of one or more selected objects, having identifiable patterns, and selectively projecting a visual cue onto one or more of the objects. The apparatus preferably comprises an integrated unit, and does not require a complex set-up and calibration process. The image projected onto the objects may be controlled and adapted to have a desired appearance on the object. A scene containing the patterns may be filmed, and the resulting video sequence processed to key out the patterns, which can be replaced with alternative image data. Interaction of objects can be monitored and events in the production video sequence can be triggered according to certain types of interaction. The apparatus allows for direct viewing of a potential scene in virtual reality studios, which would otherwise only be accessible (to directors etc) after subsequent image processing.

Description

VIDEO PRODUCTION
This invention relates to video production, and more specifically to mixed reality (virtual reality) systems. This invention is particularly applicable in the field of television or motion picture production, and may be especially useful in the context of a virtual studio, a traditional studio or an outdoor production.
In the past few decades, the use of digital effects has become more and more common in film production, not just for the industry of motion pictures but also for the production of TV programs and commercial video clips. The latest advances in the fields of image processing, computer vision and computer graphics have enabled the filmmaking industry to merge synthetic content and natural content with increasing levels of realism and progressively decreasing costs.
Examples of television systems in which digital effects are extensively used include Virtual Studio, Augmented Reality and Mixed Reality systems. Virtual Studio systems allow merging of people and real objects within a fully synthetic 3D set. Virtual studios typically require a purpose-built space covered with material which permits chroma-keying (blue, green, or our Truematt(RTM) retro-reflective material). Within a virtual studio the positions of cameras, actors and other real objects can be monitored, for example, with the use of retro-reflective markers.
Applicant's GB-A-2,325,807, incorporated herein by reference, describes such an object tracking system. GB-A-2,352,899, also incorporated herein by reference, discloses a method of determining the position of an actor by camera imaging. The set-up and calibration process, however, in conjunction with the cost of the relevant systems, inhibits ad-hoc use of virtual reality technology in a conventional studio or outdoors.
Augmented Reality, while not providing all of the functionality of a virtual studio, can enhance a real studio environment with some virtual reality features. It has been proposed to use inexpensive web cameras for visual tracking of patterns or specific features of the physical environment, subsequently superimposing synthetic information on the video image at the location of these tracked elements. Augmented Reality also allows for a more flexible and physical interaction between the real and the virtual elements (e.g. ARToolkit and its use in applications of the HITLab). Visual feedback to the user is typically provided via monitors or head-mounted displays.
By way of further background, projection-based user feedback systems are known, primarily for office-type environments, for example as disclosed in Raskar et al., "The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays", Proceedings of SIGGRAPH '98, Orlando, Fla., pp. 179-188, July 1998, where images are projected onto non-coplanar surfaces. A further example is US Patent No. 6,431,711, "Multiple-surface display projector with interactive input capability", in which real objects can be enhanced by projection, and interaction via hand gestures is possible within the projected area. An example of this system can be viewed at http://www.research.ibm.com/ed/. Pursuant to the invention, it has been appreciated that, whilst such techniques are not intended (or even suited) for production of studio quality images, they can nonetheless be adapted to provide a useful role in a video production system.
Many disadvantages, however, often have to be accommodated by production staff and performers using digital effects systems such as those described above.
One limitation is that digital effects for television and film making can usually only be manufactured and applied in an off-line postproduction phase that makes massive use of computational power and requires highly specialized technicians. This makes it difficult for actors and directors to gain an accurate picture of what the completed picture will look like. It also increases production costs dramatically, thus making the film less competitive.
Film and television directors who make use of digital effects are often forced to accept changes in their way of working. During the actual filming they cannot easily have a preview of what the result will look like at the end of the postproduction work.
The lack of a WYSIWYG (What-You-See-Is-What-You-Get) approach to the filming of live action in, for example, a virtual studio where actors perform in front of a reflective screen for chroma-keying purposes, can make it difficult to plan camera trajectories, object and actor positioning and even dialogue.
A film director may ultimately be forced to accept sub-optimal results, or have to pay for a number of expensive screen tests or prototypes of digital effects.
Similar problems are encountered by the actors in a virtual studio. Once again, they lack visual feedback of the key elements of the scene, not to mention the necessary positional and synchronization cues, which are usually replaced by rough marks on the floor and timed gestures made by the crew. All such problems have a severe impact on the final production costs, as it is difficult to predict how much postproduction work will be necessary.
Efforts have been made to provide improved user feedback using projection-based technology. Approaches include "Prompting Guide for Chroma-Keying", US Patent No. 5,886,747, wherein an outline of the objects, or their points of intersection with the walls of the studio, are projected, and Applicant's UK Patent Application No. GB0206214, incorporated herein by reference, which discloses a system whereby a representation of a scene can be projected into a virtual studio to provide an image rendered from the perspective of an actor. In EP-A-0993204 projection of virtual objects into a virtual set is synchronized with screen blanking intervals so that the projection is not recorded by the studio cameras. These approaches effectively project a representation of a rendered image into the whole of the virtual studio space.
Various systems exist to monitor the position of objects and cameras in a number of production applications. These include US patent No. 6,181,345, "Method and apparatus for replacing target zones in a video sequence"; US patent No. 5,515,485 "Method and device for modifying a zone in successive images"; and US patent No. 5,436,672 "Video processing system for modifying a zone in successive images".
In a first aspect, the invention aims to produce a tool for providing visual feedback to a participant in a studio concerning an augmented object without some of the cost and complexity of a typical virtual studio approach.
In a first aspect therefore, the invention provides apparatus for providing a visual cue concerning a selected object to a participant in a scene, the selected object having an identifiable pattern, the apparatus comprising: a tracking camera arranged to obtain an image of the scene including an image of the selected object; digital video processing means for analysing said image to detect said identifiable pattern in the image; a projector arranged to project an image onto a selectable portion within a possible projection space, which possible projection space includes at least a portion of the scene which includes the object; means for providing image data for a visual cue to the projector; means for storing camera parameters for said tracking camera and positional information relating the position of the tracking camera to the position of the projector; position processing means arranged to provide position selection information for the projector based on the position of the detected identifiable pattern in the tracking camera image and the stored tracking camera parameters and positional information to control the projector to project the visual cue selectively onto the selected object.
In this way, a visual cue can be selectively presented without requiring the complex set-up and calibration of a virtual studio. In particular, it is not necessary to know the absolute positions of the projector and object in a defined space, merely to know the relative positions of the camera and the projector.
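The mapping from a pattern position detected in the tracking-camera image to a target position for the projector can be illustrated with a minimal sketch. For a (near-)planar scene, the stored camera parameters and relative camera/projector position can be collapsed into a single 3x3 homography between the two image planes; the matrix values, variable names and function below are purely illustrative assumptions, not calibration data from the patent.

```python
def apply_homography(H, x, y):
    """Apply a 3x3 homography to an image point (x, y) in homogeneous form."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    px = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    py = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return px, py

# Hypothetical identity-plus-offset homography standing in for the stored
# tracking-camera/projector calibration of an integral unit.
H = [[1.0, 0.0, 40.0],
     [0.0, 1.0, -25.0],
     [0.0, 0.0, 1.0]]

pattern_cam = (320.0, 240.0)               # pattern centre in tracking-camera pixels
pattern_proj = apply_homography(H, *pattern_cam)
print(pattern_proj)                         # projector pixels at which to centre the cue
```

In a real unit the homography would be derived from the stored tracking-camera parameters and the fixed relative position of camera and projector, so only relative (not absolute) positions are needed, as stated above.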
Preferably the tracking camera and the projector are coupled (e.g. rigidly connected) in a known (either predetermined or selectable) spatial configuration, and more preferably they are integral, forming a single unit. This allows the relative positions of the tracking camera and projector to be easily selected from a number of predetermined values (which selection may be achieved automatically according to the rigid connection) or to be stored permanently in the case of an integral unit.
The image data is desirably provided by a graphics rendering tool, preferably via a data interface.
In preferred embodiments, the scene will be filmed by one or more production cameras, and the identifiable patterns will be keyed out and replaced by virtual images representing the selected objects, or alternatively by a real still picture or an alternative video feed. The invention therefore allows the production of augmented video sequence output at reduced expense and with reduced set-up work. It may be desired that the data for the visual cue is produced from an image or virtual object to be inserted in the video output; however, it may be advantageous to create the two sets of data independently. The images may broadly correspond but may differ in form. The images may be rendered from different perspectives; for example the visual cue may be rendered from the perspective of a participant in the studio or an approximation thereof, for example based on the technology disclosed in our earlier UK patent application no. 0206214.9, the entire disclosure of which is incorporated herein by reference. The two images may differ in resolution or degree of rendering accuracy; for example a low resolution visual cue may be used which is sufficient to impart the desired information to the actor, while the image in the video output can be of broadcast quality. The visual cue may be crudely rendered, for example without taking into account perspective, lighting, shading or texture, or may use simplified algorithms for such factors, whereas the broadcast image may be more accurately rendered. Of course the data for the visual cue and the rendered output image need not be related at all; for example the cue may simply have a marker or descriptive text while the broadcast image has a complex graphic.
In an alternative embodiment, a visual cue can be provided concerning a selected object and at least one further selected object, each object having an identifiable pattern. The patterns can be tracked by apparatus substantially as described above and the visual cue can be selectively projected onto one or more of the selected objects.
In such an embodiment, a control means is preferably provided to trigger a production event based on relative position information for two or more selected objects, derived by an object interaction processing means. Images representing selected objects can therefore be updated automatically according to interactions between the objects.
In a second aspect the invention seeks to provide a method of providing a visual cue concerning a selected object to a participant in a scene, the selected object having an identifiable pattern, the method comprising: producing an image of the scene including said identifiable pattern with a tracking camera; processing said image to detect said identifiable pattern; determining camera parameters and positional information relating the position of the tracking camera and the position of a projector; providing image data for a visual cue to the projector; controlling the projector to project the visual cue selectively onto the selected object, based on the position of the identifiable pattern in the camera image and said camera parameters and positional information.

Preferably the tracking camera is an auxiliary camera which may be positioned for tracking convenience; however a production camera may also be used to derive position information for controlling the projector. The tracking camera may be fixed, have limited degrees of freedom or have substantially unlimited freedom of movement.
The visual cue may be either a front or rear projection, and may be paused or masked in response to objects sensed intervening between the projector and the selected object.
The visual cue may desirably be adapted to eliminate errors due to keystone distortion and projection onto non-coplanar surfaces, and may be projected so as to create an image from the perspective of an actor in the filming space.
Preferably more than one selected object may be tracked by, and projected onto by, the apparatus.
In a further aspect the invention provides apparatus for augmenting a motion video sequence of images of a scene containing at least first and second selected objects, each object having a respective identifiable pattern, the apparatus comprising: a tracking camera arranged to provide a sequence of images of the scene including images of said selected objects; digital video processing means for analysing said sequence of images to detect said identifiable patterns in said tracking camera image; object interaction processing means arranged to provide relative position information for selected objects having patterns detected in the tracking camera images, based on the position of the patterns in said sequence of images, and tracking camera parameters; a graphics processor arranged to recognise and replace said identifiable patterns in the motion video sequence with alternative image data; and an event processor for triggering an event in said motion video sequence based on said relative position information.
Preferably the event will comprise altering the image used to replace one or more of the patterns in the motion video sequence. The interaction of selected objects can therefore cause their appearance or appearances on screen to be affected automatically (e.g. a missile turning into a crater). In some situations the appearance of each object can be changed, or alternatively it may be desirable to merge or morph two or more objects into a third object (e.g. a missile and a tank becoming a crater when brought into close proximity) in the output video sequence. Where it is desired to create a new 'virtual' object, or to change the output sequence more fundamentally, the event may also comprise the insertion of an external graphic into the video sequence. An event may also cause a sound effect or music to be added to the video output (e.g. an explosion for the examples provided above).
In the above example of a missile and a tank, it is desired to trigger an event when two selected objects are brought into proximity. This is provided for in one embodiment of this aspect of the invention by determining the relative separation of the two objects, and comparing this distance to a threshold to trigger the event. Alternatively the spatial arrangement or orientation of the selected objects relative to each other may be determined and compared to a predetermined arrangement in order to trigger an event. This could be when two objects face one another, or when three or more objects are arranged in a straight line, for example.
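The distance-threshold trigger just described can be sketched in a few lines. The object names, coordinates and threshold below are invented for illustration; tracked positions would in practice come from the object interaction processing means.

```python
import math

def check_proximity(objects, threshold):
    """Return pairs of tracked objects whose separation is below `threshold`,
    each pair representing an interaction that would trigger a production
    event (e.g. missile + tank replaced by a crater in the output)."""
    events = []
    names = sorted(objects)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = objects[a]
            bx, by = objects[b]
            if math.hypot(ax - bx, ay - by) < threshold:
                events.append((a, b))
    return events

# Hypothetical tracked positions in tracking-camera pixel coordinates.
positions = {"missile": (100.0, 80.0), "tank": (112.0, 85.0), "tree": (400.0, 300.0)}
print(check_proximity(positions, threshold=20.0))   # [('missile', 'tank')]
```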
The positional information for the selected objects need not be relative to one another, however. A preferred embodiment of the invention locates the position of the objects in a reference frame superimposed on the tracking camera image and triggers an event based on an evaluation of the location of certain objects within this reference frame.
This allows an event to be triggered when one or more selected objects moves into a particular portion of the image, such as a corner or a central area. This may represent an object making a land/sea transition for example, and trigger an associated change in the representation of that object in the augmented output video sequence.
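A reference-frame trigger of this kind might be sketched as follows. The frame size, region names and coordinates are illustrative assumptions; any division of the tracking-camera image into regions would do.

```python
def region_of(point, regions):
    """Return the name of the first reference-frame region containing `point`,
    or None if the point lies outside every region."""
    x, y = point
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

# Hypothetical split of a 640x480 tracking-camera frame into 'sea' and 'land'.
regions = {"sea": (0, 0, 320, 480), "land": (320, 0, 640, 480)}

prev = region_of((300, 200), regions)   # object previously over 'sea'
curr = region_of((340, 200), regions)   # object now over 'land'
if prev != curr:
    # A land/sea transition: trigger a change in the object's representation
    # in the augmented output video sequence.
    print(f"transition: {prev} -> {curr}")
```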
The invention will now be described by way of example with reference to the accompanying figures in which: Figure 1 depicts a typical prior art Augmented Reality arrangement.
Figure 2 depicts an embodiment of a prior art arrangement wherein real and virtual images can be combined for view on a screen.
Figure 3 depicts an embodiment of a prior art arrangement wherein real and virtual images can be combined for view in a head mounted display.
Figure 4 shows a schematic of a simple embodiment of the present invention.

Figure 5 shows an embodiment of the present invention in which a target object is static.
Figure 6 shows an embodiment of the present invention with a dynamic target object.
Figure 7 illustrates one embodiment of an apparatus according to the present invention.
Figure 8 depicts a flexible studio and production arrangement according to an embodiment of the present invention.
In Figure 1, a camera 102 takes real images of a real scene 101. A virtual object 103 for viewing with the real scene is created by a graphics rendering module 104. In order to combine the real and virtual images, an alignment module calculates and aligns the real image coordinates and the virtual object coordinates. This can be achieved by taking information from the position and pose of the camera, deriving position information from the real scene, for example through the use of recognisable markers, and combining these with the virtual object coordinates of the computer-rendered virtual image. Finally, the virtual and real images, having been aligned, are combined to form an augmented video image 106.
One possible way of viewing an augmented video image, such as that created by an embodiment as illustrated in Figure 1, is to use a monitor display, as depicted in Figure 2. Here the real image from camera 201 and the virtual image from graphics system 202 are merged at stage 203, and displayed on monitor 204.
A more realistic and "immersive" display method is illustrated in Figure 3. A head mounted display, generally designated by numeral 301, comprises a monitor 302 and a semi-reflective mirror 303. In the system of Figure 3, a camera to capture real images of a scene 304 is not required, as the real part of the image is viewed directly through semi-reflective mirror 303. The virtual image is displayed on monitor 302 and is merged with the real image through the use of mirror 303, such that both real and virtual images can be seen by the user. In this embodiment it is necessary to provide position information 305 for the head mounted display to the graphics rendering unit 306, so that the rendered images can be created to align correctly with the real images viewed by the user. It should be noted that in such a system, any changes in scene coordinates, such as the movement of a particular real object, would have to be provided by additional means if required.
Figure 4 is a simplified schematic of one aspect of the present invention. A real scene 401 contains a target pattern 402. The pattern may be applied to a selected object within the scene, for example a chair or a hand held object such as a sword, or may simply be applied to a marker card. Images of pattern 402 are captured by camera 403. Pattern 402 is identifiable in a camera image and the position of the pattern or patterned object may be obtained by processing its camera-derived image. A suitable method of camera based position tracking is described in our GB-A-2,325,807, incorporated herein by reference. A virtual object 404 is rendered by a graphics unit 405, and the virtual image thus derived, together with the real image from camera 403, are passed to an image processor 406. Here the real and virtual images can be correctly aligned and combined to render (in real time) the virtual elements at locations indicated by target patterns. This creates an augmented video output 407. Processor 406 includes a visual tracking system to analyse the production camera (403) image and track target patterns 402. It is desirable to allow for this information not only to be superimposed on the video image, as is the case with Augmented Reality, but also to appear behind or in between real objects. This involves generating an alpha signal (pixel transparency) together with the rendered virtual element and use of a keyer (preferably a chroma keyer) and video mixer.
Where studio cameras operate zoom lenses, it is desirable to distinguish between zooming of the camera and bringing a pattern closer to the camera. This is achieved by incorporating data from the zoom sensor of the camera into the target object tracking algorithm. Finally, the video and rendered image are combined by a video mixing or keying system, taking into account the alpha channel if present, to provide the final broadcast or recorded picture.
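The alpha-channel mixing step can be illustrated per pixel. This is a generic alpha blend, not the patent's specific keyer implementation; the pixel values are invented for the example.

```python
def blend_pixel(virtual, alpha, camera):
    """Key a rendered virtual pixel over the production-camera pixel using the
    alpha (transparency) signal generated with the virtual element.
    alpha = 1.0 places the virtual element in front; alpha = 0.0 lets the
    real object show through (virtual element appearing behind it)."""
    return tuple(alpha * v + (1.0 - alpha) * c for v, c in zip(virtual, camera))

virtual = (200.0, 40.0, 40.0)    # rendered virtual element (RGB)
camera = (10.0, 10.0, 10.0)      # production-camera pixel (RGB)
print(blend_pixel(virtual, 1.0, camera))   # fully opaque virtual element
print(blend_pixel(virtual, 0.0, camera))   # fully transparent: real object visible
```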
Graphics processor 406 also creates an image 408 for projection by a projector 409.
By using information concerning the position of pattern 402, which may be derived from a camera image as described above, image 408 can be projected onto pattern 402 in real time. In this way a real representation of augmented output 407 is created in scene 401, which allows actors, director etc. to gain a real time "immersive" visualization of the augmented output without the need for a monitor display or for a head mounted apparatus.
The target pattern must be identifiable in the camera image, both to determine the position of the pattern, and preferably to allow the pattern to be keyed out of the production image. In some applications it may be possible to use parts of real objects to be included in the production video output as target patterns, e.g. recognizable corners or edges. Preferably however, the target pattern is keyed out of the production image. In this case the target pattern may be a specific colour, or may be made of a retro-reflective material such as that described in GB-A-2,321,815 and GB-A-2,321,814, incorporated herein by reference. This allows a chroma-keying process to be used. This allows the position of the pattern to be determined from a camera image, and subsequently for the pattern to be keyed out and replaced with virtual graphics in that image.
Preferably, the pattern can be keyed out of the camera image irrespective of projected light falling on the pattern. This may be achieved through the use of retro reflective patterns which diffusely reflect light from a projected image, but which strongly retro reflects key light from a direction substantially coincident with the camera. This allows the projected image to be seen on the pattern, but at the same time maintains a good clean key colour to be viewed by the camera. Alternatively projection may be timed with camera blanking periods to avoid any projected light appearing on the production images.
In many embodiments the target pattern is fixed in relation to the real scene. This occurs when the pattern is applied to an area of the wall or floor, or of a table top, of a virtual studio. An example of such an embodiment is illustrated in Figure 5. Here the target pattern 501 is applied to a virtual video wall. Production camera 502 captures real images of the real scene 505 and passes them to image processor 506. Where the target pattern appears in the images it is keyed out and replaced by a virtual image from graphics rendering engine 508 in order to create a video output 510. Processor 506 additionally outputs an image for real time projection onto target pattern 501 by projector 503.
It should be appreciated that the camera may be free to move. This might involve only simple operations such as pan and tilt, but might include a freely roving camera position, such as with a hand held camera. Where the camera is free to move it may be advantageous to monitor the position and pose of the camera. Where this can be done with sufficient accuracy it may no longer be necessary to identify the target pattern in the camera image, but rather to calculate its position and appearance 'on screen' from the known location of the target pattern and camera based information (position, tilt, zoom etc.). Furthermore, camera parameters can be used to influence the image projected onto the pattern. In one example, real time perspective changes in a virtual scene projected onto the pattern may be produced in response to camera movements, allowing actors to see a representation of the target area as seen in the video output (i.e. from the point of view of the camera).
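Computing the pattern's on-screen position from its known location and the camera parameters amounts to a standard camera projection. The sketch below uses a deliberately simplified pinhole model with the camera looking along +Z and no rotation; the focal length (which would vary with the zoom sensor reading), principal point and coordinates are all hypothetical.

```python
def project_point(pattern_world, cam_pos, focal_px, principal):
    """Pinhole projection of a known pattern position into camera pixels.
    Assumes the camera looks along +Z with no rotation, for brevity;
    a full implementation would include the camera's pose (rotation)."""
    x = pattern_world[0] - cam_pos[0]
    y = pattern_world[1] - cam_pos[1]
    z = pattern_world[2] - cam_pos[2]
    u = principal[0] + focal_px * x / z
    v = principal[1] + focal_px * y / z
    return u, v

# Hypothetical numbers: pattern 4 m in front of the camera, 1 m to the right;
# focal_px would be updated from the zoom-sensor data.
print(project_point((1.0, 0.0, 4.0), (0.0, 0.0, 0.0), 800.0, (320.0, 240.0)))
```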
In certain embodiments, the target pattern may be required to move in relation to the rest of the real scene. These embodiments are greatly simplified in variations where the camera is locked off (i.e. fixed). In this case, the position of the target pattern can be located from the camera image relatively easily, as discussed previously, with the use of identifiable markers or patterns. Position information concerning the pattern can then be used to create a projected image for real time projection onto the target pattern.
Although the target pattern may move within a real scene fixed coordinate system, the camera may still be allowed to move in either a constrained, or free fashion. Where the position of the pattern is tracked by analysing the image of the object from the camera this will inevitably be at the expense of computational complexity, since it is necessary to derive the position and possibly also orientation of the pattern relative to a real scene coordinate system (for the purpose of projection). In such an embodiment it would be necessary to determine the position of the target pattern in relation to the (moving) camera, and then to determine the position of the pattern in relation to the real scene.
In one embodiment, tracking of the target pattern is desirably performed by means other than a production camera. This is particularly applicable in the case where both the studio camera and the target pattern are required to move independently.
Preferably the target pattern is tracked by an auxiliary tracking camera. An example of such an embodiment is shown in Figure 6. Target pattern 601 may be freely moved within the real scene 605. Production camera 602 captures real images of the scene which are passed to image processor 606 where they are combined with virtual images from rendering engine 608 to provide a merged video output 610, as described previously. Target patterns may be keyed out of the production camera images during processing. The position of pattern 601 is monitored by auxiliary tracking camera 604 in substantially the same fashion as described above. This position information, along with the virtually rendered graphics is used to create a projection image for real time projection onto pattern 601 by projector 603. Projection may be continuously guided onto the target pattern using positional information obtained from images from auxiliary camera 604.
An auxiliary tracking camera can be fixed, and can be positioned substantially independently of the studio image camera. Where the movement of the target pattern is approximately known, or is constrained, the position of the tracking camera can advantageously be chosen to allow greater accuracy when tracking the position of the pattern. For example, if the target pattern is constrained to motion in the plane of a table top, an auxiliary tracking camera may be positioned directly above the table top looking down onto the table, while the production camera can maintain a conventional view 'across' the table. In such an embodiment, information concerning the position and pose of the studio camera may not be required. This adds a great deal of flexibility to the system and allows the present invention to have an expanded range of applications outside of a virtual studio environment, e.g. outdoor productions.
In some circumstances it may be desirable to provide more than one auxiliary camera to track the position of a single target pattern. This may be the case when movement of the pattern results in it being obscured from the view of one auxiliary camera.
Another possibility is that one auxiliary camera may be used for low resolution, wide area tracking, while a second auxiliary camera is used for higher resolution tracking in a narrower field of view.

In an embodiment where the projector is fixed, a degree of motion of the projected image may be achieved by 'moving' an image within the bounds of the total projected area. This simplistic approach is obviously very limited, and the target pattern may only move within the fixed field of projection. There is the further disadvantage that light is inevitably projected over the entire projection area, irrespective of the location of the target pattern, where it is desired to project only a 'local' image.
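The fixed-projector approach amounts to clamping the moved sub-image so that it never leaves the projector raster. A minimal sketch, with assumed raster and image dimensions:

```python
# Sketch: with a fixed projector, a sub-image can only be 'moved' within the
# total projected area. Clamp its top-left corner to keep it fully inside
# the raster. The resolutions used here are assumptions.

RASTER_W, RASTER_H = 1024, 768   # assumed projector resolution
IMG_W, IMG_H = 200, 150          # assumed size of the projected sub-image

def clamp_position(x, y):
    """Keep the sub-image fully inside the fixed field of projection."""
    x = max(0, min(x, RASTER_W - IMG_W))
    y = max(0, min(y, RASTER_H - IMG_H))
    return x, y

print(clamp_position(900, -20))  # -> (824, 0)
```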
Preferably, the field of projection may be moved across a given scene. In one embodiment therefore, the projector may be combined with a moveable mirror, in an arrangement similar to that described in US 6,431,711 "multiple surface display projector with interactive input capability" to direct the projection to a specific location. Preferably an auxiliary tracking camera is used in conjunction with the projector-mirror arrangement. More preferably, the auxiliary tracking camera, projector, mirror and associated drive means are combined in a single apparatus as shown in Figure 7. An auxiliary tracking camera 701, a projector 702, and a mirror 703 with a driveable mounting 704 are provided. Camera 701 tracks a target pattern 705, and derives its position through image analysis. This tracking information is transformed for the perspective of the projector 702, and subsequently the necessary pan and tilt of the mirror 703 is calculated and updated to guide the projected image onto pattern 705 in real time.
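The pan and tilt update for the mirror can be sketched as aiming the reflected beam at the tracked target's position. The following Python fragment is an illustrative simplification, not the patented method: it assumes the target position is already expressed in the mirror's own coordinate frame, and ignores the projector-to-mirror offset that a real calibration would fold in.

```python
import math

# Sketch: pan/tilt needed to steer the projected beam onto a target at
# (x, y, z) in the mirror's frame (x right, y up, z along the optical axis).
# A flat mirror rotates through half the required beam deflection.

def mirror_angles(x, y, z):
    """Return (pan, tilt) mirror rotations in degrees to hit (x, y, z)."""
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
    return pan / 2.0, tilt / 2.0

pan, tilt = mirror_angles(1.0, 0.0, 1.0)
print(round(pan, 1), round(tilt, 1))  # -> 22.5 0.0
```

In operation these angles would be recomputed every field from the tracking camera's latest position estimate and sent to the mirror's drive means.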
Use of a driven mirror to guide the projected image over a wide field of projection allows a number of different aspects of the invention to be used in a variety of applications.
In many situations it may be desirable to project onto non-coplanar surfaces from a single projector. Accordingly, in certain embodiments of the invention, perspective or keystone distortion of images projected onto non-coplanar surfaces is eliminated by adapting the image to be projected. Techniques for performing such image adaptation are well known. Examples may be found in Raskar et al., "The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays", Proceedings of SIGGRAPH '98, Orlando, FL, pp. 179-188, July 1998.
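One common form of such adaptation is to pre-warp the image: given a homography mapping surface coordinates to projector pixels (assumed known from calibration), the corners of the desired on-surface rectangle are mapped into the projector frame, and the image is rendered into the resulting quadrilateral so that it appears undistorted on the surface. The homography values below are illustrative assumptions.

```python
# Sketch: keystone pre-correction. P maps surface coordinates to projector
# pixels; rendering into the quadrilateral P(rect) makes the projection
# appear as a true rectangle on the surface. P here is illustrative only.

def project_point(P, x, y):
    """Apply 3x3 homography P to the point (x, y), with perspective divide."""
    w = P[2][0] * x + P[2][1] * y + P[2][2]
    return ((P[0][0] * x + P[0][1] * y + P[0][2]) / w,
            (P[1][0] * x + P[1][1] * y + P[1][2]) / w)

# Assumed calibration result with a mild perspective (keystone) term.
P = [[1.0,   0.0, 0.0],
     [0.0,   1.0, 0.0],
     [0.001, 0.0, 1.0]]

rect = [(0, 0), (100, 0), (100, 50), (0, 50)]
quad = [project_point(P, x, y) for (x, y) in rect]
print(quad)
```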
In a further aspect of the present invention the image projected onto a pattern need not be a representation of a virtual image to be viewed in the video output, but might be instructions from the director or other information in the form of projected text, for example. A continuous stream of instructions or a script may be provided to the actor, or alternatively such instructions may be inserted intermittently into a projection of virtual objects to be included in the video output, as described above. Information may be provided to the actor via the content of the projected image (e.g. text, arrows, colour codes etc.) or by the location of the projected image (e.g. guiding the actor to a particular location by projecting to that location). A preferred feature of the invention, which may be of particular use when projecting instructions to an actor, is that virtual elements can be projected onto a pattern so as to appear correct from the perspective of the actor. In one embodiment this may be achieved by tracking the actor with a six degrees of freedom sensor and using the position and orientation of the actor to make the necessary transformations to the projected image.
The methods disclosed in our GB-A-2,352,899, incorporated herein by reference, may be used as preferred methods of estimating the position of an actor. A less accurate, but nevertheless effective embodiment is to combine a three degrees of freedom sensor that can determine an actor's orientation with location information of a target pattern, which is held by or near to the actor, to render the projected images. Methods for projecting an image adapted for the perspective of a participant are described in the Applicant's UK Patent Application No. GB 0206214.9, incorporated herein by reference.
Either front or rear projection may be used. Rear projection would be suited to applications where the target pattern is fixed, for example to a virtual video wall, while front projection is better suited where moving target patterns are used. In the case of front projection, there is the possibility that projected light will cause unwanted illumination of actors or other objects. In order to avoid such undesirable illumination being present in the production image captured by the studio camera, the projection may be synchronised to signal blanking intervals, as described in EP-A-0 993 204. Alternatively steps can be taken to ensure that projection only occurs on areas which will subsequently be edited out, e.g. chroma-keyed target objects. This may simply be achieved by careful scripting of movements to ensure nothing comes between the projector and the target object. In a preferred arrangement however, an additional actor/object sensing system or technique monitors the positions of real objects which it is desired to appear in the final output video, to ensure that they are not illuminated by the projector. Examples of preferred actor/object sensing systems and techniques can be found in GB-A-2,325,807 and GB-A-2,352,899, the contents of which are incorporated herein by reference. If, for example, an actor is sensed to be within a projected area, projection may be paused, or masked to avoid that actor being illuminated by a projected image. A method of masking a projection is described in our UK Patent Application No. GB 0206214.9.
Further embodiments of the invention may use more than one projector to project onto a target pattern. This may allow greater freedom of movement of the target pattern and of actors and other real objects when using the invention.
In a related aspect of the present invention, more than one target pattern may be tracked within an area of projection. Tracked patterns may comprise a mixture of fixed location and moving patterns. Each pattern may be tracked from images taken from a separate auxiliary tracking camera; however, two or more patterns may be tracked from the same auxiliary camera. Alternatively, the position of one or more patterns may be tracked from images from a production camera. An embodiment of this aspect of the invention is illustrated schematically in Figure 8. Here, three independent target patterns 801, 802 & 803 are tracked by auxiliary tracking cameras 811, 812 & 813. These patterns may be replaced, enhanced or extended virtually at stage 805 for merging (at stage 806) with images from a studio camera 808. Pre-visualisation unit 822 receives inputs from stages 805 and 806, and provides an output for projector 820. Mirror 822 guides the projected image onto one or more of the target patterns in real time. Real time visualisation unit 824 produces video output for broadcast or recording.
Figure 8 shows that many different routes to production may be taken with varying amounts of virtual processing. It can be seen therefore, that the invention allows for a flexible system capable of tracking and projecting onto multiple target patterns, which can be tailored to the production scenario and budget.
In a further embodiment the invention can also deliver active feedback on a pocket PC equipped with an attached camera via a wireless connection. The camera attached to the device provides the video image, which is then processed on a PC server that tracks target objects in the image and renders the relevant virtual elements. The image is then combined on the server and sent to the pocket PC for display. This allows the director to use the pocket PC as a window into the studio and plan the production.
A further aspect of the invention is concerned with the interaction of real and virtual objects. In one embodiment the position of an actor, or of part of an actor (e.g. an actor's hand) may be tracked along with one or more target patterns. The relative positions of the actor and of a tracked pattern may be used to trigger certain events.
For example, when an actor approaches a certain tracked pattern, a pre-scripted animation may be rendered for that object and merged with images from the studio camera for output. In order to provide real time user feedback, a projected image representative of the animation may be projected onto the target pattern in real time.

An extension of this aspect of the invention is concerned with the interaction between virtual objects. In embodiments according to this aspect of the invention, the relative positions of two or more target patterns are used to trigger certain events. In one example, when two target patterns are brought sufficiently close together, a pre-scripted computer rendered animation is performed at their locations. If, however, one of those target patterns is brought into proximity with a third pattern, a different animation is rendered and merged with the image from the studio camera. In both cases, a representation of the animations can be projected onto the target patterns to provide real time user feedback.
The positions and interactions of the tracked patterns can be used to trigger events in a number of different ways. One of the simplest ways of monitoring the interaction of two tracked patterns or objects is to determine the distance between them. The result of comparing this distance to a threshold or set value can be used to trigger an event.
An event might also be triggered by a certain number of patterns being located within a certain proximity of each other.
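The two triggering rules just described can be sketched in a few lines. The following Python fragment is illustrative only; the threshold, radius and count values are assumptions, not values taught by the specification.

```python
import math

# Sketch of the triggering rules described above: an event fires when the
# distance between two tracked patterns falls below a threshold, or when a
# certain number of patterns lie within a given radius of one another.
# All numeric values are illustrative assumptions.

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def proximity_trigger(a, b, threshold=0.3):
    """Fire when two patterns come closer than `threshold` (units assumed)."""
    return distance(a, b) < threshold

def count_trigger(patterns, radius=0.5, count=3):
    """Fire when `count` or more patterns fall within `radius` of any pattern."""
    for p in patterns:
        near = sum(1 for q in patterns if distance(p, q) <= radius)
        if near >= count:
            return True
    return False

print(proximity_trigger((0.0, 0.0), (0.2, 0.1)))            # -> True
print(count_trigger([(0, 0), (0.3, 0), (0, 0.3), (5, 5)]))  # -> True
```

The boolean output would typically be fed to the rendering engine to start a pre-scripted animation, or to non-virtual event handlers such as lighting or sound cues.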
Similarly it should be understood that a wide range of events might be triggered by monitoring the positions of tracked patterns. Events will typically be the rendering of certain virtual graphics or graphics sequences for merging with real video input. For example a virtual object associated with a particular pattern could be altered when it is brought into proximity with another virtual object (associated with another pattern).
This or these alterations can be reflected in real time in the associated images projected onto the patterns. Events triggered by the positioning of tracked patterns might also or alternatively be non virtual. Examples include changes in sounds or music, lighting, camera positions or editing decisions.
It should be understood that a variety of real and virtual, and virtual and virtual interactions can be produced according to the present invention, which interactions can be used to control computer generated effects for merging with images from the studio camera. Projected images representative of these effects can be projected onto relevant target pattern to provide real time user feedback of these effects.
A number of possible applications of the invention will now be described, purely by way of example:
o Virtuality News In the studio, a tabletop is covered with a pattern representing a virtual landscape, and virtual models in the landscape are represented by tracked target patterns which can be moved/placed on the landscape by a Newsperson to show a current or hypothetical situation in, say, a fictional country. In this example the production camera could also move completely into the virtual landscape, while the Newsperson is guiding the viewer through it (e.g. audio narration and virtual tour of the landscape, with a final view of the Newsperson from within the landscape). Personalised news could be provided through different patterned targets replaced by additional information in 2-D and 3-D, or other media. An animation may simulate movement from a real camera through a sequence of virtual camera positions, for example zooming or assuming the perspective of a virtual object.
o Animal interaction scenario In this example 3D models of animals and actors are seamlessly mixed and interact. Actors or audience members may manipulate target patterns representing specific animals. Interaction between studio members and patterns or between patterns can trigger animations of animal behaviours. For example if a presenter brings two tracked elements close together, or if a member of the audience approaches a tracked pattern a certain animal response can be animated.
o Build Your Own Set This example allows intuitive pre-production design and editing of a virtual and mixed reality set as a physical walk-through. A producer/set designer or other relevant person can select from a database of 3D models and insert them into a virtual set by direct manipulation, e.g. positioning tracked patterns representing the selected models in the studio space and viewing the result in real time. Sizing, removal or other operations are also performed using tracked patterns and manipulations thereof representing commands.
o Beyond the Dancer This example is a dancing scenario, where virtual content and interaction are used in a performance context, for example tracking static patterns on surfaces or patterns on the outfit of a dancer. Animations and other virtual effects can be triggered by the performer's movement around the tracked patterns.
o Gaming show Real objects and/or contestants are enhanced, extended, or replaced in this example to create a mixed reality game, and personalised information or virtual content is provided to them as the game progresses.
It should be understood that although specific examples of embodiments have been provided, these are in no way limiting. For example, in this specification, the term "actor" is intended to encompass without limitation any actor, presenter or other participant involved with the production, or indeed any other animate entity which might interact with target patterns or projections representative of virtual content. The phrase "real time" is intended to distinguish from post production processes, or other processes which are incompatible with live action elements. Preferably real time processes are updated at least ten times each second. Whereas many examples of this invention relate to use in a studio environment, it should be appreciated that the invention provides methods and apparatus which are sufficiently flexible to be used in a number of alternative scenarios, for example outdoor use, and sports.

Claims (46)

1. Apparatus for providing a visual cue concerning a selected object to a participant in a scene, the selected object having an identifiable pattern, the apparatus comprising: a tracking camera arranged to obtain an image of the scene including an image of the selected object; digital video processing means for analysing said image to detect said identifiable pattern in the image; a projector arranged to project an image onto a selectable portion within a possible projection space, which possible projection space includes at least a portion of the scene which includes the object; means for providing image data for a visual cue to the projector; means for storing camera parameters for said tracking camera and positional information relating the position of the tracking camera to the position of the projector; position processing means arranged to provide position selection information for the projector based on the position of the detected identifiable pattern in the tracking camera image and the stored tracking camera parameters and positional information to control the projector to project the visual cue selectively onto the selected object.
2. Apparatus according to Claim 1, wherein said tracking camera and said projector are coupled in a determined spatial configuration.
3. Apparatus according to Claim 1 or Claim 2, wherein said tracking camera and said projector are integral.
4. Apparatus according to any one of Claims 1 to 3, wherein the camera parameters for said tracking camera include zoom information.
5. Apparatus according to any preceding claim, wherein said means for providing image data comprises an interface for receiving data from a graphics rendering tool.
6. Apparatus according to any preceding claim, further including a production camera arranged to obtain a production image of a scene, to form at least part of a video output.
7. Apparatus according to Claim 6, wherein said tracking camera is a production camera.
8. Apparatus according to Claim 6 or Claim 7, further comprising means for detecting said identifiable pattern in a production image of the scene.
9. Apparatus according to Claim 8, further comprising means for keying out said identifiable pattern in a production image of the scene.
10. Apparatus according to Claim 8 or Claim 9, further comprising means for rendering a virtual image to replace said identifiable pattern in a production image of the scene.
11. Apparatus according to Claim 10, wherein said virtual image is rendered independently from said visual cue image.
12. Apparatus according to Claim 10 or Claim 11, wherein said virtual image is of a higher resolution than said visual cue image.
13. Apparatus according to any preceding claim, wherein a visual cue can be provided concerning a selected object and at least one further selected object, each said object having an identifiable pattern; wherein said digital video processing means is arranged to detect said identifiable patterns, and wherein the visual cue can be selectively projected onto one or more of the selected objects.
14. Apparatus according to Claim 13, further comprising object interaction processing means arranged to provide relative position information for two or more selected objects based on the position of the identifiable patterns in the tracking camera image, and further comprising control means to trigger a production event based on said relative position information.
15. Apparatus according to Claim 14, wherein said relative position information includes the distance between two selected objects, and said control means triggers an event based on a comparison between said distance and a threshold distance.
16. Apparatus according to Claim 14, wherein said control means triggers an event based on a comparison between the relative positions of selected objects and a predetermined positional arrangement.
17. Apparatus according to any one of Claims 13 to 16, wherein said production event comprises the insertion of a graphic into a production image of the scene.
18. Apparatus according to any one of Claims 11 to 13, wherein said production event comprises one or more of the following: playing of a prescripted animation to be merged into a production image of the scene; an audio cue; an audio signal to be included with a production image; a production camera/view switch.
19. A method of providing a visual cue concerning a selected object to a participant in a scene, the selected object having an identifiable pattern, the method comprising: producing an image of the scene including said identifiable pattern with a tracking camera; processing said image to detect said identifiable pattern; determining camera parameters and positional information relating the position of the tracking camera and the position of a projector; providing image data for a visual cue to the projector; controlling the projector to project the visual cue selectively onto the selected object, based on the position of the identifiable pattern in the camera image and said camera parameters and positional information.
20. A method according to Claim 19 further comprising producing, with a production camera, a production image of the scene to form at least part of a video output.
21. A method according to Claim 20, further comprising detecting and keying out said identifiable pattern in said production image.
22. A method according to Claim 20 or Claim 21, further comprising the steps of rendering virtual graphics and merging said virtual graphics with said production image.
23. A method according to any one of Claims 20 to 22, wherein said identifiable patterns in the production image are replaced by rendered graphics in the video output.
24. A method according to any one of Claims 19 to 23, wherein said tracking camera has a fixed position, and wherein said selected object is moveable.
25. A method according to any one of Claims 19 to 23, wherein said selected object has a fixed position, and wherein said tracking camera is moveable.
26. A method according to any one of Claims 19 to 25, wherein said tracking camera is a production camera.
27. A method according to any one of Claims 19 to 25, wherein said camera is an auxiliary camera.
28. A method according to any one of Claims 19 to 27, wherein said projected image may be either a front or a rear projected image.
29. A method according to any one of Claims 19 to 28, wherein projection can be paused or masked on sensing an intervening object between the projector and the selected object to be projected onto.
30. A method according to any one of Claims 19 to 29, wherein the projected image is adapted to eliminate distortions due to projection onto non coplanar surfaces.
31. A method according to any one of Claims 19 to 30, wherein the projected image is rendered from the perspective of a participant in the scene.
32. A method according to Claim 31, wherein the perspective of said participant is determined by a six degrees of freedom sensor arrangement monitoring said participant.
33. A method according to Claim 31, wherein the perspective of said participant is determined by a three degrees of freedom sensor arrangement monitoring said participant in conjunction with the position of the selected object in the tracking camera image.
34. A method according to any one of Claims 19 to 33, wherein a visual cue can be provided concerning a selected object and at least one further selected object, each said object having an identifiable pattern; wherein said identifiable patterns are detected in said tracking camera image; and wherein said visual cue can be selectively projected onto one or more of the selected objects.
35. A method according to Claim 34, further comprising the steps of determining relative position information for two or more selected objects based on the position of the identifiable patterns in the tracking camera image, and triggering a production event based on said relative position information.
36. Apparatus for augmenting a motion video sequence of images obtained by a production camera of a scene containing at least first and second selected objects, each object having a respective identifiable pattern, the apparatus comprising: at least one tracking camera arranged to provide a sequence of images of the scene including images of said selected objects; digital video processing means for analysing said sequence of images to detect said identifiable patterns; object interaction processing means arranged to provide position information for said selected objects having patterns detected in the tracking camera images, based on the positions of the patterns in said sequence of images, and tracking camera parameters; a graphics processor arranged to recognise and replace said identifiable patterns in the motion video sequence produced by the production camera with alternative image data; an event processor for triggering an event in said motion video sequence based on said relative position information.
37. Apparatus according to Claim 36, wherein said event comprises altering the image data used to replace one or more of the identifiable patterns.
38. Apparatus according to Claim 36, wherein said event comprises the insertion of an external graphic into the motion video sequence.
39. Apparatus according to Claim 36, wherein said event comprises inserting a sound effect into the audio accompanying the motion video sequence.
40. Apparatus according to any one of Claims 36 to 39, wherein said position information includes the distance between two selected objects, and said event processor triggers an event based on a comparison between said distance and a threshold distance.
41. Apparatus according to any one of Claims 36 to 40, wherein said position information includes the relative positions of the selected objects, and said event processor triggers an event based on a comparison between said relative positions and a predetermined positional arrangement.
42. Apparatus according to any one of Claims 36 to 39, wherein said position information for the selected objects is determined within the reference frame of the tracking camera image, and said event processor triggers an event based on an evaluation of the locations of selected objects within said reference frame.
43. Apparatus according to any one of Claims 36 to 42, wherein said graphics processor includes a keyer to key out said patterns in the motion video.
44. Apparatus according to any one of Claims 36 to 43, further comprising: a projector arranged to project onto a selectable portion within a possible projection space, which projection space includes at least one of said selected objects; and position processing means arranged to provide position selection information for the projector based on the position of the detected identifiable patterns in the tracking camera image and tracking camera parameters and positional information relating the position of the tracking camera to the position of the projector; said apparatus arranged to control the projector to project a visual cue selectively onto at least a first selected object.
45. Apparatus according to Claim 44, wherein said tracking camera and said projector are coupled in a determined spatial configuration.
46. Apparatus according to Claim 44 or Claim 45, wherein said tracking camera and said projector are integral.
GB0305294A 2003-03-07 2003-03-07 Video production Expired - Fee Related GB2399248B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0305294A GB2399248B (en) 2003-03-07 2003-03-07 Video production

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0305294A GB2399248B (en) 2003-03-07 2003-03-07 Video production

Publications (3)

Publication Number Publication Date
GB0305294D0 GB0305294D0 (en) 2003-04-09
GB2399248A true GB2399248A (en) 2004-09-08
GB2399248B GB2399248B (en) 2006-03-29

Family

ID=9954357

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0305294A Expired - Fee Related GB2399248B (en) 2003-03-07 2003-03-07 Video production

Country Status (1)

Country Link
GB (1) GB2399248B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1480450A2 (en) 2003-05-20 2004-11-24 British Broadcasting Corporation Automated video production
WO2012106366A3 (en) * 2011-01-31 2012-10-26 Qualcomm Incorporated Context aware augmentation interactions
CN102959938A (en) * 2010-06-30 2013-03-06 富士胶片株式会社 Image processing method and apparatus
CN104168406A (en) * 2014-09-02 2014-11-26 北京中科大洋科技发展股份有限公司 Studio linkage broadcasting controlling system and method
EP3059886A1 (en) * 2011-07-29 2016-08-24 Music Mastermind, Inc. System and method for producing a more harmonious musical accompaniment and for applying a chain of effects to a musical composition
WO2019073245A1 (en) * 2017-10-13 2019-04-18 Mo-Sys Engineering Limited Lighting integration
EP4020973A1 (en) * 2020-12-23 2022-06-29 Arnold & Richter Cine Technik GmbH & Co. Betriebs KG Background reproduction device, background reproduction system, recording system, camera system, digital camera, and method for controlling a background reproduction device
EP4419964A4 (en) * 2021-10-20 2025-08-13 Nantstudios Llc CAMERA TRACKING VIA DYNAMIC PERSPECTIVES

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101674419B (en) * 2008-09-10 2013-01-02 新奥特(北京)视频技术有限公司 Method for editing template in real time in virtual studio system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5353392A (en) * 1990-04-11 1994-10-04 Multi Media Techniques Method and device for modifying a zone in successive images
GB2323733A (en) * 1997-03-25 1998-09-30 Orad Hi Tech Systems Limited Virtual studio projection system
JPH10327350A (en) * 1997-05-23 1998-12-08 Sony Corp Image generation apparatus and image generation method
US5886747A (en) * 1996-02-01 1999-03-23 Rt-Set Prompting guide for chroma keying

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1480450A2 (en) 2003-05-20 2004-11-24 British Broadcasting Corporation Automated video production
CN102959938A (en) * 2010-06-30 2013-03-06 富士胶片株式会社 Image processing method and apparatus
EP2590395A4 (en) * 2010-06-30 2014-02-19 Fujifilm Corp IMAGE PROCESSING AND DEVICE
US8767096B2 (en) 2010-06-30 2014-07-01 Fujifilm Corporation Image processing method and apparatus
CN102959938B (en) * 2010-06-30 2016-06-22 富士胶片株式会社 Image processing method and equipment
WO2012106366A3 (en) * 2011-01-31 2012-10-26 Qualcomm Incorporated Context aware augmentation interactions
US8509483B2 (en) 2011-01-31 2013-08-13 Qualcomm Incorporated Context aware augmentation interactions
CN103443743A (en) * 2011-01-31 2013-12-11 高通股份有限公司 Context aware augmentation interactions
JP2014503923A (en) * 2011-01-31 2014-02-13 クアルコム,インコーポレイテッド Context-aware extended interaction
CN103443743B (en) * 2011-01-31 2016-06-29 高通股份有限公司 For the method and apparatus that the enhancing of context-aware is mutual
EP3059886A1 (en) * 2011-07-29 2016-08-24 Music Mastermind, Inc. System and method for producing a more harmonious musical accompaniment and for applying a chain of effects to a musical composition
CN104168406A (en) * 2014-09-02 2014-11-26 北京中科大洋科技发展股份有限公司 Studio linkage broadcasting controlling system and method
CN104168406B (en) * 2014-09-02 2017-08-15 北京中科大洋科技发展股份有限公司 Link broadcast control method for a kind of studio
WO2019073245A1 (en) * 2017-10-13 2019-04-18 Mo-Sys Engineering Limited Lighting integration
US11189101B2 (en) 2017-10-13 2021-11-30 Mo-Sys Engineering Limited Lighting integration
EP4020973A1 (en) * 2020-12-23 2022-06-29 Arnold & Richter Cine Technik GmbH & Co. Betriebs KG Background reproduction device, background reproduction system, recording system, camera system, digital camera, and method for controlling a background reproduction device
US12075182B2 (en) 2020-12-23 2024-08-27 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Background display device, background display system, recording system, camera system, digital camera and method of controlling a background display device
EP4419964A4 (en) * 2021-10-20 2025-08-13 Nantstudios Llc CAMERA TRACKING VIA DYNAMIC PERSPECTIVES

Also Published As

Publication number Publication date
GB0305294D0 (en) 2003-04-09
GB2399248B (en) 2006-03-29

Similar Documents

Publication Publication Date Title
US6724386B2 (en) System and process for geometry replacement
Gibbs et al. Virtual studios: An overview
US5963247A (en) Visual display systems and a system for producing recordings for visualization thereon and methods therefor
KR100271384B1 (en) Video merging employing pattern-key insertion
CN102726051B (en) Virtual plug-in unit in 3D video
US20130278727A1 (en) Method and system for creating three-dimensional viewable video from a single video stream
EP1798691A2 (en) Method and apparatus for generating a desired view of a scene from a selected viewpoint
US8130330B2 (en) Immersive surround visual fields
KR20070119018A (en) Auto scene modeling for 3D cameras and 3D video
US11335039B2 (en) Correlation of multiple-source image data
CA2244467C (en) Chroma keying studio system
GB2399248A (en) Projection of supplementary image data onto a studio set
US11615755B1 (en) Increasing resolution and luminance of a display
GB2436921A (en) Methods and apparatus providing central, primary displays with surrounding display regions
Wojdala Challenges of virtual set technology
Fukui et al. A virtual studio system for TV program production
CN112189340A (en) 3DAR content creation device, 3DAR content reproduction device, and 3DAR content creation system
Thomas et al. Virtual graphics for broadcast production
Gibbs et al. Interaction in the virtual studio
Grau Studio production system for dynamic 3D content
KR101843024B1 (en) System and Computer Implemented Method for Playing Compoiste Video through Selection of Environment Object in Real Time Manner
Vanherle et al. Automatic Camera Control and Directing with an Ultra-High-Definition Collaborative Recording System
Wojdala Virtual set: The state of the art
Bazzoni et al. The origami project: advanced tools and techniques for high-end mixing and interaction between real and virtual content
Fukui et al. Virtual studio system for TV program production

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20180307