
US20220300145A1 - Media content planning system - Google Patents

Media content planning system

Info

Publication number
US20220300145A1
Authority
US
United States
Prior art keywords
experience
location
time
user
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/835,753
Inventor
Lucy Cooke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spacedraft Pty Ltd
Original Assignee
Spacedraft Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from AU2018901016A
Application filed by Spacedraft Pty Ltd
Priority to US17/835,753
Assigned to SPACEDRAFT PTY LTD. Assignment of assignors interest (see document for details). Assignors: Cooke, Lucy
Publication of US20220300145A1
Legal status: Abandoned

Classifications

    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 16/444: Spatial browsing, e.g. 2D maps, 3D or virtual spaces
    • A63F 13/63: Generating or modifying game content before or while executing the game program, by the player, e.g. authoring using a level editor
    • A63F 13/25: Output arrangements for video game devices
    • A63F 2300/8082: Virtual reality
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G06T 2200/24: Indexing scheme for image data processing or generation, in general, involving graphical user interfaces [GUIs]
    • G06T 2219/024: Multi-user, collaborative environment
    • G11B 27/002: Programmed access in sequence to a plurality of record carriers or indexed parts, e.g. tracks, thereof, e.g. for editing
    • G11B 27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/34: Indicating arrangements
    • H04N 13/275: Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/279: Image signal generators from 3D object models, the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]

Definitions

  • the present invention relates to a media content planning system.
  • the content planning system has particular application for virtual reality, augmented reality and mixed reality content, and video game development.
  • this ‘pre-visualization’ step is achieved by first building a ‘blueprint’ of the movie using a script and/or a computer-generated animation.
  • VR (virtual reality), AR (augmented reality), MR (mixed reality)
  • a media content planning system comprising:
  • a data storage device arranged to store information indicative of scenes of a media content project
  • a user interface arranged to:
  • the system may be arranged to facilitate selection by a user of an open world project or a closed world project.
  • the system may be arranged to facilitate selection by a user of a journey open world project or a free roaming open world project.
  • the media content is a closed world project and the user interface is arranged to facilitate selection by a user of scenes to form part of the media content and the order of presentation of the scenes to an observer.
  • the user interface may also be arranged to facilitate selection by a user of the timing of presentation of scenes of the media content.
  • the media content is an open world project and the user interface is arranged to facilitate selection by a user of scenes to form part of the media content.
  • system is arranged to display a world overview, the world overview comprising a world space including at least one scene icon indicative of at least one scene at a location on the world space representative of the desired location of the scene in the media content.
  • the world space comprises a defined shape to represent a world overview.
  • the world space is defined according to a computer-generated space mesh.
  • the space mesh may represent an actual real-world space, and the space mesh may be generated using a LIDAR, matterport scanner or any other scanning device.
  • the system enables the user to add a scene icon to the world space at a location representative of the desired location of a scene in the media content, and to enable the user to select at least one observer experience for association with the scene icon.
  • the system enables the user to select the type of observer experience associated with the experience icon.
  • the experience space comprises at least one annular portion surrounding the representation of the observer, the annular portion usable by the user to add an experience icon to the experience space at a location in 2 dimensional space relative to the observer representative of the desired location in at least 2 dimensional space of an observer experience in the scene.
  • the experience space comprises a sphere surrounding the representation of the observer, the sphere usable by the user to add an experience icon to the experience space at a location in 3 dimensional space relative to the observer representative of the desired location in at least 2 dimensional space of an observer experience in the scene.
  • the system is arranged to enable a user to add notes to a scene.
  • the system is arranged to enable a user to add a representative image to the media content project.
  • the representative image may be a 360° image or a 2D image.
  • system is arranged to enable a user to select a point of view for the media content project.
  • system is arranged to enable a user to:
  • system is arranged to enable a user to share the media content project with a selected other user.
  • the observer experience includes any one or more of a haptic experience, a visual experience and/or an audio experience.
  • the system includes a user interface device and a remote computing device in communication with the user interface device.
  • the remote computing device comprises a server arranged to serve data indicative of the user interface to the user interface device.
  • the data indicative of the user interface may comprise data indicative of web pages.
  • the observer experience data may be stored at the remote computing device and/or at the user interface device.
  • the user interface device includes a head mounted device (HMD) that may include a tool that supports WebVR.
  • the experience space associated with a defined time and including a representation of an observer
  • each experience icon enabling the user to select a location on the experience space at which to dispose each experience icon, and to display each experience icon on the experience space at the respective selected location in response to selection of a location on the experience space for each experience icon, each selected location representative of a desired location in at least 2 dimensional space relative to the observer of an observer experience in the scene at the defined time;
  • each further experience space associated with a further defined time corresponding to a later time in the scene than the defined time;
  • the user to select a further location on the experience space at which to dispose an experience icon, and to display the experience icon on the further experience space at the selected further location in response to selection of a location on the further experience space, the selected location representative of a desired further location in at least 2 dimensional space relative to the observer of the observer experience associated with the experience icon at the further defined time;
  • a user interface for a system for planning media content, the user interface arranged to:
  • an experience space for display, the experience space associated with a defined time and including a representation of an observer
  • each experience icon is displayed on the experience space at the respective selected location in response to selection of a location on the experience space for each experience icon, each selected location representative of a desired location in at least 2 dimensional space relative to the observer of an observer experience in the scene at the defined time;
  • each further experience space associated with a further defined time corresponding to a later time in the scene than the defined time;
  • the selected further location representative of a desired further location in at least 2 dimensional space relative to the observer of the observer experience associated with the experience icon at the further defined time.
  • a media content planning system comprising:
  • a data storage device arranged to store information indicative of scenes of a media content project
  • the system arranged to communicate information to a user interface device for display at the user interface device, the information indicative of:
  • the system arranged to receive information indicative of selection by a user of an experience space associated with a defined time, to receive information indicative of selection by the user of an observer experience to be associated with the scene, and to receive information indicative of a selection by the user of a location on the experience space at which to dispose an experience icon associated with the observer experience in at least 2 dimensional space relative to the observer, wherein in response to selection of a location on the experience space for each experience icon, the system is arranged to communicate to the user interface device information usable by the user interface device to display each experience icon on the experience space at the respective selected location;
  • the system arranged to receive information indicative of selection by a user of at least one further experience space associated with a further defined time corresponding to a later time in the scene than the defined time, and to receive information indicative of selection by the user of a further location on the experience space at which to dispose the experience icon associated with the observer experience in at least 2 dimensional space relative to the observer, wherein in response to selection of a further location on the experience space for the experience icon, the system is arranged to communicate to the user interface device information usable by the user interface device to display the experience icon on the further experience space at the respective selected further location; and
  • system arranged to store data indicative of:
  • Also disclosed is a project planning system comprising:
  • a data storage device arranged to store information indicative of a project
  • a user interface arranged to:
  • system arranged to store data indicative of:
  • a project planning system comprising:
  • a data storage device that stores information indicative of a project
  • a user interface that displays an experience location screen, the experience location screen including a time selector and an experience space corresponding to a time selected using the time selector;
  • the experience location screen is usable to plan a location and time of occurrence at the location of at least one experience associated with the project by:
  • the experience location screen enables the user to view an experience space associated with a selected time, the experience space displaying the planned location of the or each experience at the selected time.
  • a method of planning a project comprising:
  • the experience location screen including a time selector and an experience space corresponding to a time selected using the time selector;
  • a project planning system for planning location and time of events, the system comprising:
  • a data storage device that stores information indicative of a project
  • a user interface that displays a time and location screen, the time and location screen including a time selector and a location space corresponding to a time selected using the time selector;
  • time and location screen is usable to plan a location and time of occurrence at the location of at least one event associated with the project by:
  • time and location screen enables the user to view a location space associated with a selected time, the location space displaying the planned location of the or each event at the selected time on the location space.
  • FIG. 1 is a schematic block diagram of a media content planning system in accordance with an embodiment of the present invention
  • FIG. 2 is a schematic block diagram of functional components of a user computing device for use with the system shown in FIG. 1 ;
  • FIGS. 3 to 13 are diagrammatic representations of screens presented to a user on a user computing device by the system shown in FIG. 1 .
  • a “closed world” is a defined space environment wherein an observer is not able to roam freely and scenes of the world are presented to the observer in a defined structure, such as for example in 360° video; and an open world is an environment wherein an observer is able to roam, either in accordance with a defined journey or freely in any direction.
  • a media content planning system 10 arranged to facilitate creation of a pre-production blueprint of media content, in particular virtual reality (VR), augmented reality (AR) and mixed reality (MR) content, that can be used by media creators to conceptualise and plan an immersive media experience prior to creation of the actual media content.
  • the system may be applied to a non-linear mixed reality experience.
  • the system is arranged to facilitate mapping of 3D ideas in order to represent the ideas as they would appear to a VR/AR/MR participant within space and time by creating a blueprint for VR/AR/MR content.
  • the system allows a user to spend time developing a representation of the structure and desired content of a VR/AR/MR world, and to share the intended experience of the created VR/AR/MR world with others for collaboration purposes.
  • the system facilitates creation of an ordered scene sequence, and enables a user to plot the relative locations of observer experiences in each scene and to determine the particular observer experiences that occur at the respective locations, such as audio, visual and/or haptic experiences. For example, in each scene the system enables a user to plot the relative locations of audio, visual and/or haptic experiences for an observer in 2D or 3D space.
  • the system facilitates creation of a 3D world space, enables a user to plot the relative locations of scenes in the world space, enables a user to plot the relative locations of observer experiences at each of the scene locations in 2D or 3D space, and enables a user to determine the particular observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences.
  • the system maps content ideas as they would appear to a VR/AR/MR observer within the space and time of the experience.
  • Each observer may represent a character and therefore the system may be used to define different experiences and/or different points of view for each character.
  • the system 10 is implemented using a remote computing device in the form of a server 20 accessible by user computing devices that include a smartphone 12 , a tablet computer 14 and a personal computing device 16 arranged to communicate through a communications network 18 .
  • the user computing devices 12 , 14 , 16 serve to provide a user interface arranged to present screens associated with the system 10 to a user and facilitate reception of inputs from the user, with functional components 22 of the system substantially implemented at the server 20 .
  • a user computing device 12 , 14 , 16 may be arranged to substantially implement functional components 22 of the system as a stand-alone device, for example by downloading or otherwise installing an application on the user device 12 , 14 , 16 , or functional components 22 of the system 10 may be implemented partly by a user computing device 12 , 14 , 16 and partly by the server 20 .
  • the communications network 18 includes the Internet, although it will be understood that any suitable communications network that includes wired and/or wireless communication paths is envisaged. It will also be understood that any suitable computing device capable of executing programs, displaying information to a user and receiving inputs from the user is envisaged.
  • the server 20 is arranged to include at least one dedicated software application, although it will be understood that functionality may be implemented using dedicated hardware or a combination of dedicated hardware and software.
  • the functional components 22 implemented by the server 20 include a database management system (DBMS) 24 arranged to manage data stored in a data storage device 26 that may include a local data storage device, for example implemented using SQL protocols, and/or cloud based data storage; a login application 28 arranged to manage a user login process, for example by receiving user login details from a user computing device 12, 14, 16 and verifying the received login details with reference login details 30 stored in the data storage device 26; a closed world application 32 arranged to implement functionality for a closed world project; an open world application 34 arranged to implement functionality for an open world project; a 2D pin application 36 that enables a user to select the relative locations of observer experiences for an observer in a scene in 2D space; and a 3D pin application 38 that enables a user to select the relative locations of observer experiences for an observer in a scene in 3D space.
  • the data storage device 26 is arranged to store data used by the system 10 in multiple relational databases that may be configured according to SQL protocols.
  • the databases include:
  • a projects database 40 arranged to store data indicative of VR/AR/MR projects including the project name, project type (closed world, open world journey, or open world free roaming) and scene locations;
  • an experiences database 42 arranged to store data indicative of the relative locations of observer experiences at each of the scene locations, and the types of observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences;
  • a haptic database 44 arranged to store data indicative of haptic information, such as touch or smell, associated with observer haptic experiences linked to the scene locations;
  • a video database 46 arranged to store data indicative of video information associated with observer video experiences linked to the scene locations;
  • an audio database 48 arranged to store data indicative of audio information associated with observer audio experiences linked to the scene locations, for example traditional audio and/or ambisonic/spatial audio;
  • an images database 50 arranged to store data indicative of image information associated with observer image experiences linked to the scene locations;
  • a users database 52 arranged to store data indicative of registered users associated with the system
  • a 3D files database 53 arranged to store 3D files, for example in .OBJ format.
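The disclosure describes these stores only at the level of the databases listed above. As a loose illustration of how the stored records might relate to one another, the following TypeScript sketch models projects, scenes and experience pins; every type and field name here is an assumption made for illustration, not part of the disclosed system.

```typescript
// Hypothetical sketch of the planning data model; names are illustrative only.
type WorldType = "closed" | "journey-open" | "free-roaming-open"; // world type selection field 96
type ExperienceKind = "video" | "image" | "audio" | "haptic";     // pin types on the plate screen 150

interface Project {
  id: string;
  name: string;          // project name field 100
  tagline?: string;      // project tagline field 102
  description?: string;  // description field 104
  worldType: WorldType;
  sceneIds: string[];    // ordered for a closed world project
  sharedWith: string[];  // users invited via the share icon 88
}

interface Scene {
  id: string;
  projectId: string;
  title: string;
  group?: string;        // e.g. "Setup", "Confrontation", "Resolution"
  worldLocation?: { x: number; y: number; z?: number }; // scene icon position on the world space
  notes?: string;        // content entered on the notes screen 140
}

interface ExperiencePin {
  id: string;
  sceneId: string;
  kind: ExperienceKind;
  timeSeconds: number;   // time on the scene time line 184
  // Location of the experience relative to the observer; z is used by the 3D sphere model.
  location: { x: number; y: number; z?: number };
  mediaUrl?: string;     // link entered via the add pin window 190
}
```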
  • An example configuration of a user computing device 12, 14, 16, in this example a tablet computer 14, is shown in FIG. 2.
  • the user computing device 14 includes a processor 60 arranged to control and coordinate operations in the user computing device 14 , a data storage device 62 arranged to store programs and data used by the processor 60 to implement the desired functionality, and a memory 64 used by the processor to temporarily store programs and/or data during use.
  • the user computing device 14 also includes a display 68 and a user interface 70 , in this example in the form of a touch screen, arranged to enable the user computing device 14 to receive inputs from a user.
  • An example implementation with reference to screens displayed to a user on the user computing device 12, 14, 16 is shown in FIGS. 3 to 13.
  • the computing device 12 , 14 , 16 is a tablet computer 14 having a user interface in the form of a touch screen 70 overlaid on the display 68 .
  • inputs to the computing device 14 are primarily effected by touching the touch screen 70 using taps, swipes and any other device recognizable gestures.
  • the example is equally applicable to implementations on other computing devices.
  • the user computing device may include a head mounted device (HMD) and a tool that supports WebVR.
  • a user first logs into the system 10 by entering user login details at the user interface implemented by the user computing device 14 , and the system 10 verifies the entered login details by communicating the login details to the login application 28 and comparing the entered login details at the login application 28 with stored reference login details 30 associated with the user.
  • the user interface may be implemented on the user computing device 12 , 14 , 16 by installing an interface application on the user computing device 12 , 14 , 16 arranged to communicate with the server 20 , the user interface may be implemented through a web browser, for example by serving web pages corresponding to the screens shown in FIGS. 3 to 13 to the user interface device as required, or the user interface may be implemented in any other way.
  • Each displayed project 82 includes a project name 84 , world type indicia 86 indicative of the type of world environment associated with the project (closed world or open world), and a share icon 88 usable to provide a selected user with a link to the project so that the selected user is able to collaborate in the project creation process.
  • the home page 80 also includes a create new project button 90 usable to create a new project.
  • Activation of the create new project button 90 causes a create new project screen 94 to be displayed, as shown in FIG. 4 .
  • Like and similar features are indicated with like reference numerals.
  • the create new project screen 94 includes a world type selection field 96 that enables a user to select the type of world environment associated with the project, that is, a closed world environment, a journey open world environment or a free roaming open world environment; a world type icon 98 representative of the type of world selected; a project name field 100 for receiving a project title; a project tagline field 102 for receiving a project tagline; and a description field 104 for receiving descriptive information associated with the project.
  • the create new project screen 94 also includes a reference image field 106 usable to facilitate selection and display of an image that is representative of the project, and a create button 110 that when activated causes a new project record to be created in the projects database 40 . Activation of the create button 110 also causes a relevant project overview screen 120 , 230 , 260 associated with the project to be displayed as shown in FIG. 6, 12 or 13 .
  • Selection of the share icon 88 on the home page 80 or on the create new project screen 94 causes a project share screen 111 to be displayed on a user interface of the user selected to collaborate on the project, as shown in FIG. 5 .
  • the shared project screen 111 includes an open project button 112 that when activated causes the relevant project overview screen 120 , 230 , 260 associated with the project to be displayed as shown in FIG. 6, 12 or 13 .
  • a user has selected a closed world environment and as such a closed project overview screen 120 associated with the project is displayed, as shown in FIG. 6 .
  • the closed project overview screen 120 includes world type selectors 113, namely a closed world selector 114, a journey open world selector 116 and a free roaming open world selector 117, that enable a user to switch between world types, and a point of view selector 118 arranged to facilitate selection of the observer point of view, in this example a first person point of view.
  • the closed project overview screen 120 shows a time line 122 defining timing for a sequence of defined scenes 128 to be presented to an observer during the closed world experience.
  • the time line may be displayed or hidden using a hide/show button 124 .
  • the scenes 128 are organized in scene groups 126 , each scene group 126 representing a different part of the story associated with the project, in this example “Setup”, “Confrontation” and “Resolution”. New scene groups 126 are added using an add scene group button 130 . Similarly, new scenes 128 are added using an add scene button 132 .
  • Each scene 128 has an associated notes icon 134 that when selected causes a notes screen 140 as shown in FIG. 7 to be displayed.
  • the notes screen 140 includes a scene title 142 and is usable to add notes for a scene into a content field 146 , for example using edit tools 144 .
  • Selection of a scene 128 causes a 2D experience location screen 150 (hereinafter a “plate screen”) to be displayed, as shown in FIG. 8 .
  • the plate screen 150 is usable to select the locations of experiences relative to an observer that can occur at the scene 128 , and the types of observer experiences that occur, such as audio, video and/or haptic experiences.
  • the plate screen 150 includes an experience space, in this example a plate area 152 , that has several concentric annular portions 154 surrounding an observer icon 156 . Disposable on the plate area 152 are pins 158 that represent observer experiences relative to the observer 156 . Using the plate screen 150 , a user is able to select the desired location of an experience relative to the observer 156 and the type of observer experience. In this example, available observer experiences include video, image, audio and haptic experiences.
  • Each pin 158 includes a pin type icon, for example a visual pin icon 160 , an audio pin icon 162 or a haptic pin icon 164 .
  • the type of pin is selected using pin type selectors 166 , in this example a visual pin selector 168 , an audio pin selector 170 and a haptic pin selector 172 .
  • the plate screen 150 also includes an experience model selector 174 that can be used to select the type of experience selection model, in this example a 2D experience selection model, as shown in FIGS. 8 and 9 , wherein a user is able to select in 2D the locations of experiences relative to an observer that can occur at a scene, and a 3D experience selection model, as shown in FIG. 11 , wherein a user is able to select in 3D the locations of experiences relative to an observer that can occur at a scene.
  • the plate screen 150 also includes a pin view selector 176 usable to select the type of pins 158 that are displayed on the plate area 152 , for example all pins 158 , only haptic pins, only audio pins or only video pins.
  • the plate screen 150 also includes a scene identifier 178 that identifies the title of the scene associated with the displayed plate area 152 , in this example a scene titled “speak to an old friend”; a previous scene navigation button 180 usable to navigate to a previous scene in the story timeline; and a next scene navigation button 182 usable to navigate to a subsequent scene in the story timeline.
  • the plate screen 150 also includes a time line 184 that includes a current time marker 186 to indicate the relevant time in the scene that corresponds to the experiences and relative locations of the experiences represented by the pins 158 on plate area 152 .
  • each plate screen 150 corresponding to a different time in the scene and each plate screen 150 potentially including different pins 158 and/or different pin locations relative to the observer 156 .
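The document does not specify how pin placements are keyed to the times shown on the time line 184; one minimal way to model the relationship between plate screens and times is a map from a scene time to the pins shown at that time. The TypeScript sketch below (all names hypothetical) illustrates this, with each pin placed on a ring of the plate at an angle around the observer.

```typescript
// Sketch of time-indexed pin placement for one scene (hypothetical API).
interface PlatePin {
  pinId: string;
  kind: "visual" | "audio" | "haptic";
  ring: number;      // which concentric annular portion 154 the pin sits on (0 = innermost)
  angleRad: number;  // bearing around the observer icon 156; 0 rad = directly ahead
}

// All plate screens for a scene, keyed by the time selected on the time line 184.
type ScenePlates = Map<number, PlatePin[]>;

// Place (or move) a pin on the plate associated with a given time.
function placePin(plates: ScenePlates, timeSeconds: number, pin: PlatePin): void {
  const pins = plates.get(timeSeconds) ?? [];
  // A pin keeps one location per time; a later time can hold a different location.
  const others = pins.filter(p => p.pinId !== pin.pinId);
  plates.set(timeSeconds, [...others, pin]);
}

// Return the pins shown when the current time marker 186 is at `timeSeconds`.
function pinsAt(plates: ScenePlates, timeSeconds: number): PlatePin[] {
  return plates.get(timeSeconds) ?? [];
}
```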
  • a further plate screen 150 associated with the scene “speak to old friend” is shown, with the further plate screen 150 representing a later time 188 in the scene than the plate screen 150 shown in FIG. 8 .
  • an observer is able to have different experiences that are linked to different locations relative to the observer.
  • Selection of a pin selector 166 by a user and subsequent selection of a location on the plate area 152 causes a pin 158 of the type that has been selected to be added to the plate area 152 .
  • Subsequent selection of the pin causes an add pin window 190 to be displayed over the plate screen 150 .
  • the add pin window 190 is used to add information indicative of the relevant experience or to add a link to information indicative of the relevant experience.
  • the add pin screen 190 includes a video link box 194 usable to add information indicative of the location of a selected video to be associated with the pin 158 , an image link box 196 usable to add information indicative of the location of a selected image to be associated with the pin 158 , an audio link box 198 usable to add information indicative of the location of selected audio to be associated with the pin 158 , and a document link box 200 usable to add information indicative of the location of a selected document to be associated with the pin 158 .
  • the add pin screen 190 also includes an add note field 202 usable to add a note to the pin 158 , an action location field 204 , a character encounters field 206 and a next scene trigger point field 208 .
  • a scene link indicator 210 may also be included to link the scene to other scenes.
  • the action location field 204 , character encounters field 206 and next scene trigger point field 208 enable a user to track, log and group encounters and interactions that are non-linear within an experience. For example, a user can create an opportunity for users to link to other worlds and scenes that are non-chronological, or the user may define different points of view and/or different experiences for different characters associated with the media content.
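The add pin window 190 is described only in terms of its on-screen boxes and fields; as a hedged illustration, the information it captures could be represented by a record such as the following (field names are assumptions, not terms from the disclosure).

```typescript
// Hypothetical shape of the information captured by the add pin window 190.
interface PinDetails {
  videoLink?: string;             // video link box 194
  imageLink?: string;             // image link box 196
  audioLink?: string;             // audio link box 198
  documentLink?: string;          // document link box 200
  note?: string;                  // add note field 202
  actionLocation?: string;        // action location field 204
  characterEncounters?: string[]; // character encounters field 206
  nextSceneTriggerPoint?: string; // next scene trigger point field 208
  linkedSceneIds?: string[];      // scene link indicator 210, for non-linear links to other scenes
}
```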
  • a 3D experience selection model may be used, as shown in FIG. 11 , wherein a user is able to select in 3D the locations of experiences relative to an observer that can occur at a scene.
  • Like and similar features are indicated with like reference numerals.
  • the 3D experience selection model may be selected by selecting “3D sphere” instead of “closed” using the experience model selector 174, which causes a 3D experience location screen 220 (hereinafter a “spherical space screen”) to be displayed.
  • the spherical space screen 220 is usable to select the locations of experiences in 3D relative to an observer that can occur at a scene, and the types of observer experiences that occur, such as audio, video and/or haptic experiences.
  • a spherical experience space 222 is provided to represent the locations of experiences in 3D relative to an observer.
  • the spherical space screen 220 includes a navigation tool 224 .
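The document does not state how a selection on the spherical experience space 222 is translated into a 3D position; one plausible reading, sketched below in TypeScript, treats the selection as spherical coordinates around the observer (the azimuth, elevation and distance parameters are assumptions, not terms used in the disclosure).

```typescript
// Hypothetical conversion from a selection on the spherical experience space 222
// (azimuth/elevation around the observer, plus distance) to a 3D offset.
interface SphericalSelection {
  azimuthRad: number;   // rotation around the observer; 0 = straight ahead
  elevationRad: number; // 0 = observer's eye level, +PI/2 = directly above
  distance: number;     // radial distance from the observer
}

function toObserverOffset(s: SphericalSelection): { x: number; y: number; z: number } {
  const horizontal = s.distance * Math.cos(s.elevationRad);
  return {
    x: horizontal * Math.sin(s.azimuthRad),   // to the observer's right
    y: s.distance * Math.sin(s.elevationRad), // above the observer
    z: horizontal * Math.cos(s.azimuthRad),   // in front of the observer
  };
}

// Example: an experience placed directly behind and slightly above the observer.
const behind = toObserverOffset({ azimuthRad: Math.PI, elevationRad: 0.3, distance: 2 });
console.log(behind);
```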
  • if a journey open world is selected using the world type selection field 96 on the create new project screen 94, or if the journey open world selector 116 is selected on the closed project overview screen 120, a journey open world overview screen 230 is displayed instead of the closed project overview screen 120.
  • the journey open world overview screen 230 is used to define the relative locations of scenes in a world space that is structured as a journey in the sense that an underlying direction for the observer is defined but the observer is able to roam within the journey, to define the relative locations of observer experiences at each of the scene locations, and to define the particular observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences.
  • an observer has at least some control over movement of the observer and therefore the location of the observer relative to the available scenes, and over the consequent experiences provided to the observer at the scenes.
  • the journey open world overview screen 230 includes a journey space 232 that represents the available roaming space of an observer 156 .
  • a user is able to add scenes by adding scene icons to the journey space 232 at locations relative to the observer 156 that correspond to the desired locations of scenes, for example by selecting the locations on a touch screen.
  • the scenes may be grouped into several scene groups 240 , 242 , 244 with each scene group allocated a different scene icon.
  • the scene icons include a main scene icon 234 , a side mission icon 236 and a photo mission icon 238 .
  • the scene titles 246, 248, 250 of the available scenes may be shown on the journey open world overview screen 230 in a plurality of scene groups 240, 242, 244, and the scene groups may be hidden or displayed using a hide/show button 254.
  • the journey open world overview screen 230 also includes an add scene button 256 that may be used to add a scene to a scene group 240 , 242 , 244 .
  • selection of a scene icon 234 , 236 , 238 on the journey space 232 causes the relevant scene title 246 , 248 , 250 to be highlighted in the relevant scene group 240 , 242 , 244 .
  • Selection of a scene title 246 , 248 , 250 causes the plate screen 150 shown in FIGS. 8 and 9 to be displayed to enable the user to define the desired locations of experiences relative to the observer 156 for the scene and the type of observer experiences.
  • the user may select “3D sphere” instead of “closed” using the experience model selector 174 if it is desired to define the locations of experiences in 3D relative to an observer instead of 2D.
  • if a free roaming open world is selected using the world type selection field 96 on the create new project screen 94, or if the free roaming open world selector 117 is selected on the closed project overview screen 120, a free roaming open world overview screen 260 as shown in FIG. 13 is displayed.
  • Like and similar features are indicated with like reference numerals.
  • the free roaming open world overview screen 260 is similar to the journey open world overview screen 230 except that the free roaming open world overview screen 260 is used to define the relative locations of scenes in a world space that is structured as a free roaming space instead of a structured journey.
  • the free roaming open world overview screen 260 is used to define the locations of scenes relative to an observer and, through the plate screen 150 or the sphere screen 220 shown in FIGS. 8, 9 and 11 , the relative locations of observer experiences at each of the scene locations, and the particular observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences.
  • the free roaming open world overview screen 260 includes a free roaming space 262 that represents the available roaming space of an observer 156, in this example shown as a cube.
  • a user is able to add scenes by adding scene icons to the free roaming space 262 at locations that correspond to the desired locations of scenes, for example by selecting the location on a touch screen.
  • the free roaming space may be defined according to a computer-generated space mesh that can have any shape.
  • the space mesh may represent an actual real world space, and the space mesh may be generated using a LIDAR or matterport scanner.
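Since 3D files may be stored in .OBJ format (the 3D files database 53) and the space mesh may represent a scanned real-world space, one hedged illustration of using such a mesh is to derive the extent of the free roaming space from the mesh's vertices. The minimal TypeScript sketch below parses only plain "v x y z" vertex lines and is not a full OBJ loader.

```typescript
// Sketch: derive the extent of a free roaming space from an .OBJ space mesh.
interface Bounds { min: [number, number, number]; max: [number, number, number]; }

function objBounds(objText: string): Bounds | null {
  let bounds: Bounds | null = null;
  for (const line of objText.split("\n")) {
    const parts = line.trim().split(/\s+/);
    if (parts[0] !== "v" || parts.length < 4) continue; // geometric vertex lines only
    const [x, y, z] = parts.slice(1, 4).map(Number);
    if (![x, y, z].every(Number.isFinite)) continue;
    if (!bounds) {
      bounds = { min: [x, y, z], max: [x, y, z] };
    } else {
      bounds.min = [Math.min(bounds.min[0], x), Math.min(bounds.min[1], y), Math.min(bounds.min[2], z)];
      bounds.max = [Math.max(bounds.max[0], x), Math.max(bounds.max[1], y), Math.max(bounds.max[2], z)];
    }
  }
  return bounds;
}

// A scene icon dropped on the free roaming space could then be kept within these bounds
// so that planned scene locations stay inside the scanned real-world space.
```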
  • a user desires to create a 360° video with defined video, audio and/or haptic responses in defined scenes and at defined times of the video.
  • the user first adds haptic, video, audio and image experience information to the haptic, video, audio and/or images databases 44 , 46 , 48 , 50 , then creates a closed world project by selecting the create new project button 90 on the home page 80 , and selecting “closed” in the world type selection field 96 on the create new project screen 94 shown in FIG. 4 .
  • the user can also add a project name, tagline, description and reference image using project name, tagline, description and reference image fields 100 , 102 , 104 , 106 on the create new project screen 94 .
  • the closed world project overview screen 120 is displayed as shown in FIG. 6 .
  • the user is able to define scene groups 126 , scene titles 128 , the point of view of the observer using the point of view selector 118 , and the order and timing of the scene groups 126 and scenes 128 .
  • the user is also able to add notes to the scenes 128 using a notes icon 134 and notes screen 140 .
  • the user is also able to define the observer experiences that occur at a scene 128 and the locations of the experiences relative to the observer using the plate screen 150 as shown in FIGS. 8 and 9. If the user wishes to define observer experiences in 3D space, the user selects 3D sphere in the experience model selector 174, which causes the sphere screen 220 to be displayed as shown in FIG. 11.
  • Using either the plate screen 150 or the sphere screen 220, the user adds pins 158 to the relevant plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of the observer experiences at a defined time indicated by the time marker 186, and the user selects the type of observer experience corresponding to each pin 158 using the add pin window 190.
  • an experience may be an explosion that occurs behind the observer 156 in a defined scene at a defined time in the scene.
  • the user would add a pin to the plate area 152 or sphere space 222 at a location that corresponds to a location behind the observer 156, and the user would link the relevant video, and optionally a haptic response, associated with the explosion to the pin 158 using the add pin window 190.
  • Observer experiences occurring later in the scene can be added by selecting a different time on the time line 184 and adding pins 158 to a further plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of those experiences at the different time.
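As a hedged illustration of this workflow, and reusing the hypothetical ScenePlates, placePin and pinsAt sketch shown earlier for the plate screen, the explosion example could be recorded as follows (the times, ring numbers and angles are invented for illustration).

```typescript
// Illustrative use of the placePin sketch above for the explosion example:
// a visual experience behind the observer, which is planned at a new location at a later time.
const scenePlates: ScenePlates = new Map();

// At 12 s into the scene, the explosion occurs directly behind the observer
// (angle PI relative to the observer's facing direction), on the outer ring.
placePin(scenePlates, 12, { pinId: "explosion", kind: "visual", ring: 3, angleRad: Math.PI });

// At 15 s the same experience is planned to the observer's right,
// so the pin is placed again on the plate associated with the later time.
placePin(scenePlates, 15, { pinId: "explosion", kind: "visual", ring: 2, angleRad: Math.PI / 2 });

console.log(pinsAt(scenePlates, 12)); // pin behind the observer at 12 s
console.log(pinsAt(scenePlates, 15)); // pin to the observer's right at 15 s
```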
  • a user desires to create a free roaming game world with defined video, audio and/or haptic responses occurring in defined scenes at defined scene locations in the game world and at defined locations in the scenes relative to the observer 156 .
  • the user first adds haptic, video, audio and image experience information to the haptic, video, audio and/or images databases 44, 46, 48, 50, then creates a free roaming open world project by selecting the create new project button 90 on the home page 80, and selecting free roaming open world in the world type selection field 96 on the create new project screen 94 shown in FIG. 4.
  • the user can also add a project name, tagline, description and reference image using project name, tagline, description and reference image fields 100 , 102 , 104 , 106 on the create new project screen 94 .
  • the free roaming open world overview screen 260 is displayed as shown in FIG. 13 .
  • the user is able to define scene types and group the scene types into scene groups 240, 242, 244.
  • the user is also able to define the observer experiences that occur at each scene and the locations of the experiences relative to the observer by selecting a scene 246 , 248 , 250 which causes the plate screen 150 to be displayed, as shown in FIGS. 8 and 9 . If the user wishes to define observer experiences in 3D space, the user selects 3D sphere in the experience model selector 174 which causes the sphere screen 220 to be displayed as shown in FIG. 11 .
  • Using either the plate screen 150 or the sphere screen 220, the user adds pins 158 to the relevant plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of the desired observer experiences at a defined time indicated by the time marker 186, and the user selects the type of observer experience corresponding to each pin 158 using the add pin window 190.
  • Subsequent observer experiences in the scene occurring at a later time can be added by selecting a different time on the time line 184 and adding pins 158 to a further plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of the desired observer experiences at the different time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A project planning system is disclosed that comprises a data storage device that stores information indicative of a project, and a user interface that displays an experience location screen including a time selector and an experience space corresponding to a time selected using the time selector. The experience location screen is usable to plan a location and time of occurrence of at least one experience associated with the project by enabling a user to select the time of occurrence of an experience using the time selector, and enabling the user to select the location of the experience by selecting the location of an icon on the experience space, and in response the system disposing the icon on the experience space at the selected location. The experience location screen enables the user to view an experience space displaying the planned location of the or each experience at the selected time.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application is a continuation application of U.S. patent application Ser. No. 17/030,161 entitled “MEDIA CONTENT PLANNING SYSTEM,” filed on Sep. 23, 2020, which is a continuation application of International Patent Application No. PCT/AU2019/050274 entitled “A MEDIA CONTENT PLANNING SYSTEM,” filed on Mar. 27, 2019, which claims priority to Australian Patent Application No. 2018901016, filed on Mar. 27, 2018, all of which are herein incorporated by reference in their entirety for all purposes.
  • FIELD OF THE INVENTION
  • The present invention relates to a media content planning system. The content planning system has particular application for virtual reality, augmented reality and mixed reality content, and video game development.
  • BACKGROUND OF THE INVENTION
  • In the creation of video content, such as a movie, it is desirable to conceptualise and plan the movie prior to creating it, as this is cheaper than producing the movie only to later determine that the story associated with the movie does not actually work. Typically, this ‘pre-visualization’ step is achieved by first building a ‘blueprint’ of the movie using a script and/or a computer-generated animation.
  • However, it is difficult for creators of virtual reality (VR), augmented reality (AR) and mixed reality (MR) content to appropriately conceptualise and plan VR/AR/MR content because observer experiences typically do not occur at defined times, are typically dependent on the observer location which is controlled by a user, and can emanate from any location relative to the observer.
  • BRIEF SUMMARY OF THE INVENTION
  • Disclosed is a media content planning system comprising:
  • a data storage device arranged to store information indicative of scenes of a media content project;
  • a user interface arranged to:
      • display scene indicia indicative of locations of scenes of a media content project and/or respective timings of occurrence of the scenes in the media content project;
      • enable a user to select an experience space for display, the experience space associated with a defined time and including a representation of an observer;
      • enable the user to select at least one experience icon indicative of an observer experience associated with the scene;
      • enable the user to select a location on the experience space at which to dispose each experience icon, and to display each experience icon on the experience space at the respective selected location in response to selection of a location on the experience space for each experience icon, each selected location representative of a desired location in at least 2 dimensional space relative to the observer of an observer experience in the scene at the defined time;
      • enable the user to select at least one further experience space for display, each further experience space associated with a further defined time corresponding to a later time in the scene than the defined time;
      • enable the user to select a further location on the experience space at which to dispose an experience icon, and to display the experience icon on the further experience space at the selected further location in response to selection of a further location on the further experience space, the selected further location representative of a desired further location in at least 2 dimensional space relative to the observer of the observer experience associated with the experience icon at the further defined time; and
      • the system arranged to store data indicative of:
      • the or each selected observer experience;
      • the scene associated with each observer experience, and for each experience space, the selected location in at least 2 dimensional space of each observer experience relative to the observer.
  • The system may be arranged to facilitate selection by a user of an open world project or a closed world project.
  • For an open world project, the system may be arranged to facilitate selection by a user of a journey open world project or a free roaming open world project.
  • In an embodiment, the media content is a closed world project and the user interface is arranged to facilitate selection by a user of scenes to form part of the media content and the order of presentation of the scenes to an observer. The user interface may also be arranged to facilitate selection by a user of the timing of presentation of scenes of the media content.
  • In an embodiment, the media content is an open world project and the user interface is arranged to facilitate selection by a user of scenes to form part of the media content.
  • In an embodiment, the system is arranged to display a world overview, the world overview comprising a world space including at least one scene icon indicative of at least one scene at a location on the world space representative of the desired location of the scene in the media content.
  • In an embodiment, the world space comprises a defined shape to represent a world overview.
  • In an alternative embodiment, the world space is defined according to a computer-generated space mesh. The space mesh may represent an actual real-world space, and the space mesh may be generated using a LIDAR scanner, a Matterport scanner or any other scanning device.
  • In an embodiment, the system enables the user to add a scene icon to the world space at a location representative of the desired location of a scene in the media content, and to enable the user to select at least one observer experience for association with the scene icon.
  • In an embodiment, the system enables the user to select the type of observer experience associated with the experience icon.
  • In an embodiment, the experience space comprises at least one annular portion surrounding the representation of the observer, the annular portion usable by the user to add an experience icon to the experience space at a location in 2 dimensional space relative to the observer representative of the desired location in at least 2 dimensional space of an observer experience in the scene.
  • In an embodiment, the experience space comprises a sphere surrounding the representation of the observer, the sphere usable by the user to add an experience icon to the experience space at a location in 3 dimensional space relative to the observer representative of the desired location in at least 2 dimensional space of an observer experience in the scene.
  • In an embodiment, the system is arranged to enable a user to add notes to a scene.
  • In an embodiment, the system is arranged to enable a user to add a representative image to the media content project. The representative image may be a 360° image or a 2D image.
  • In an embodiment, the system is arranged to enable a user to select a point of view for the media content project.
  • In an embodiment, the system is arranged to enable a user to:
  • create a plurality of characters; and
  • create observer experience data for each character.
  • In an embodiment, the system is arranged to enable a user to share the media content project with a selected other user.
  • In an embodiment, the observer experience includes any one or more of a haptic experience, a visual experience and/or an audio experience.
  • In an embodiment, the system includes a user interface device and a remote computing device in communication with the user interface device.
  • In an embodiment, the remote computing device comprises a server arranged to serve data indicative of the user interface to the user interface device. The data indicative of the user interface may comprise data indicative of web pages.
  • The observer experience data may be stored at the remote computing device and/or at the user interface device.
  • In an embodiment, the user interface device includes a head mounted device (HMD) that may include a tool that supports WebVR.
  • Also disclosed is a method of planning media content, the method comprising:
  • storing information indicative of scenes of a media content project;
  • displaying scene indicia indicative of locations of scenes of the media content project and/or respective timings of occurrence of the scenes in the media content project;
  • enabling a user to select scene indicia representative of a scene;
  • enabling the user to select an experience space for display, the experience space associated with a defined time and including a representation of an observer;
  • enabling the user to select at least one experience icon indicative of an observer experience associated with the scene;
  • enabling the user to select a location on the experience space at which to dispose each experience icon, and to display each experience icon on the experience space at the respective selected location in response to selection of a location on the experience space for each experience icon, each selected location representative of a desired location in at least 2 dimensional space relative to the observer of an observer experience in the scene at the defined time;
  • enabling the user to select at least one further experience space for display, each further experience space associated with a further defined time corresponding to a later time in the scene than the defined time;
  • enabling the user to select a further location on the experience space at which to dispose an experience icon, and to display the experience icon on the further experience space at the selected further location in response to selection of a location on the further experience space, the selected location representative of a desired further location in at least 2 dimensional space relative to the observer of the observer experience associated with the experience icon at the further defined time;
  • storing data indicative of:
      • the or each selected observer experience;
      • the scene associated with each observer experience, and
      • for each experience space, the selected location in at least 2 dimensional space relative to the observer of each observer experience.
  • Also disclosed is a user interface for a system for planning media content, the user interface arranged to:
  • display scene indicia indicative of locations of scenes of a media content project and/or respective timings of occurrence of the scenes in the media content project;
  • enable a user to select an experience space for display, the experience space associated with a defined time and including a representation of an observer;
  • enable the user to select at least one experience icon indicative of an observer experience associated with a scene; and
  • enable the user to select a location on the experience space at which to dispose each experience icon, and to display each experience icon on the experience space at the respective selected location in response to selection of a location on the experience space for each experience icon, each selected location representative of a desired location in at least 2 dimensional space relative to the observer of an observer experience in the scene at the defined time;
  • enable the user to select at least one further experience space for display, each further experience space associated with a further defined time corresponding to a later time in the scene than the defined time; and
  • enable the user to select a further location on the experience space at which to dispose an experience icon, and to display each experience icon on the further experience space at the respective selected further location in response to selection of a further location on the experience space for each experience icon, the selected further location representative of a desired further location in at least 2 dimensional space relative to the observer of the observer experience associated with the experience icon at the further defined time.
  • Also disclosed is a media content planning system comprising:
  • a data storage device arranged to store information indicative of scenes of a media content project;
  • the system arranged to communicate information to a user interface device for display at the user interface device, the information indicative of:
      • scene indicia indicative of locations of scenes of a media content project and/or respective timings of occurrence of the scenes of the media content project;
      • an experience space associated with a scene and including a representation of an observer, the experience space associated with a defined time; and
      • at least one experience icon in the experience space, each experience icon indicative of the location in at least 2 dimensional space relative to an observer of an observer experience, the location of each experience icon in the experience space being representative of a desired location in 2 dimensional space relative to the observer of the observer experience in the scene;
  • the system arranged to receive information indicative of selection by a user of an experience space associated with a defined time, to receive information indicative of selection by the user of an observer experience to be associated with the scene, and to receive information indicative of a selection by the user of a location on the experience space at which to dispose an experience icon associated with the observer experience in at least 2 dimensional space relative to the observer, wherein in response to selection of a location on the experience space for each experience icon, the system is arranged to communicate to the user interface device information usable by the user interface device to display each experience icon on the experience space at the respective selected location;
  • the system arranged to receive information indicative of selection by a user of at least one further experience space associated with a further defined time corresponding to a later time in the scene than the defined time, and to receive information indicative of selection by the user of a further location on the experience space at which to dispose the experience icon associated with the observer experience in at least 2 dimensional space relative to the observer, wherein in response to selection of a further location on the experience space for the experience icon, the system is arranged to communicate to the user interface device information usable by the user interface device to display the experience icon on the further experience space at the respective selected further location; and
  • the system arranged to store data indicative of:
      • the selected observer experience;
      • the scene associated with each observer experience, and
      • for each experience space, the selected location in at least 2 dimensional space relative to the observer of each observer experience.
  • Also disclosed is a project planning system comprising:
  • a data storage device arranged to store information indicative of a project;
  • a user interface arranged to:
      • enable a user to select an experience space for display, the experience space associated with a defined time;
      • enable the user to select at least one experience icon indicative of an experience in the project;
      • enable the user to select a location on the experience space at which to dispose each experience icon, and to display each experience icon on the experience space at the respective selected location in response to selection of a location on the experience space for each experience icon, each selected location representative of a desired location in at least 2 dimensional space of an experience at the defined time;
      • enable the user to select at least one further experience space for display, each further experience space associated with a further defined time corresponding to a later time than the defined time;
        • each further experience space enabling the user to select a further location on the experience space at which to dispose an experience icon, and to display the experience icon on the further experience space at the selected further location in response to selection of a further location on the further experience space, the selected further location representative of a desired further location in at least 2 dimensional space of the experience associated with the experience icon at the further defined time; and
  • the system arranged to store data indicative of:
      • the or each selected experience;
      • the scene associated with each experience, and
      • for each experience space, the selected location in at least 2 dimensional space of each experience.
  • In accordance with a first aspect of the present invention, there is provided a project planning system comprising:
  • a data storage device that stores information indicative of a project;
  • a user interface that displays an experience location screen, the experience location screen including a time selector and an experience space corresponding to a time selected using the time selector;
  • wherein the experience location screen is usable to plan a location and time of occurrence at the location of at least one experience associated with the project by:
      • enabling a user to select the time of occurrence of an experience associated with the project using the time selector; and
      • enabling the user to select the location of the experience at the selected time by selecting the location of an icon on the experience space, and in response the system disposing the icon on the experience space at the selected location; and
  • wherein the experience location screen enables the user to view an experience space associated with a selected time, the experience space displaying the planned location of the or each experience at the selected time.
  • In accordance with a second aspect of the present invention, there is provided a method of planning a project, the method comprising:
  • storing information indicative of a project in a data storage device;
  • displaying an experience location screen, the experience location screen including a time selector and an experience space corresponding to a time selected using the time selector;
  • using the experience location screen to plan a location and time of occurrence at the location of at least one experience associated with the project by:
      • selecting the time of occurrence of an experience associated with the project using the time selector; and
      • selecting the location of the experience at the selected time by selecting the location of an icon on the experience space, and in response disposing the icon on the experience space at the selected location; and
      • using the experience location screen to view an experience space associated with a selected time, the experience space displaying the planned location of the or each experience at the selected time.
  • In accordance with a third aspect of the present invention, there is provided a project planning system for planning location and time of events, the system comprising:
  • a data storage device that stores information indicative of a project;
  • a user interface that displays a time and location screen, the time and location screen including a time selector and a location space corresponding to a time selected using the time selector;
  • wherein the time and location screen is usable to plan a location and time of occurrence at the location of at least one event associated with the project by:
      • enabling a user to select the time of occurrence of an event associated with the project using the time selector; and
      • enabling the user to select the location of the event at the selected time by selecting the location of an icon on the location space, and in response the system disposing the icon on the location space at the selected location; and
  • wherein the time and location screen enables the user to view a location space associated with a selected time, the location space displaying the planned location of the or each event at the selected time on the location space.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
  • FIG. 1 is a schematic block diagram of a media content planning system in accordance with an embodiment of the present invention;
  • FIG. 2 is a schematic block diagram of functional components of a user computing device for use with the system shown in FIG. 1; and
  • FIGS. 3 to 13 are diagrammatic representations of screens presented to a user on a user computing device by the system shown in FIG. 1.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In this specification, it will be understood that a "closed world" is a defined space environment wherein an observer is not able to roam freely and scenes of the world are presented to the observer in a defined structure, for example 360° video; and an "open world" is an environment wherein an observer is able to roam, either in accordance with a defined journey or freely in any direction.
  • Referring to FIGS. 1 and 2 of the drawings, there is shown a media content planning system 10 arranged to facilitate creation of a pre-production blueprint of media content, in particular virtual reality (VR), augmented reality (AR) and mixed reality (MR) content, that can be used by media creators to conceptualise and plan an immersive media experience prior to creation of the actual media content. The system may be applied to a non-linear mixed reality experience.
  • The system is arranged to facilitate mapping of 3D ideas in order to represent the ideas as they would appear to a VR/AR/MR participant within space and time by creating a blueprint for VR/AR/MR content. The system allows a user to spend time developing a representation of the structure and desired content of a VR/AR/MR world, and to share the intended experience of the created VR/AR/MR world with others for collaboration purposes.
  • In a closed world implementation, the system facilitates creation of an ordered scene sequence, and enables a user to plot the relative locations of observer experiences in each scene and to determine the particular observer experiences that occur at the respective locations, such as audio, visual and/or haptic experiences. For example, in each scene the system enables a user to plot the relative locations of audio, visual and/or haptic experiences for an observer in 2D or 3D space.
  • In an open world implementation, the system facilitates creation of a 3D world space, enables a user to plot the relative locations of scenes in the world space, enables a user to plot the relative locations of observer experiences at each of the scene locations in 2D or 3D space, and enables a user to determine the particular observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences.
  • In this way, the system maps content ideas as they would appear to a VR/AR/MR observer within the space and time of the experience. Each observer may represent a character and therefore the system may be used to define different experiences and/or different points of view for each character.
  • In this example, the system 10 is implemented using a remote computing device in the form of a server 20 accessible by user computing devices that include a smartphone 12, a tablet computer 14 and a personal computing device 16 arranged to communicate through a communications network 18.
  • In this example, the user computing devices 12, 14, 16 serve to provide a user interface arranged to present screens associated with the system 10 to a user and facilitate reception of inputs from the user, with functional components 22 of the system substantially implemented at the server 20. However, it will be understood that other implementations are possible. For example, a user computing device 12, 14, 16 may be arranged to substantially implement functional components 22 of the system as a stand-alone device, for example by downloading or otherwise installing an application on the user device 12, 14, 16, or functional components 22 of the system 10 may be implemented partly by a user computing device 12, 14, 16 and partly by the server 20.
  • In this example, the communications network 18 includes the Internet, although it will be understood that any suitable communications network that includes wired and/or wireless communication paths is envisaged. It will also be understood that any suitable computing device capable of executing programs, displaying information to a user and receiving inputs from the user is envisaged.
  • In order to implement desired functionality at the server 20, in this example the server 20 is arranged to include at least one dedicated software application, although it will be understood that functionality may be implemented using dedicated hardware or a combination of dedicated hardware and software.
  • The functional components 22 implemented by the server 20 include a database management system (DBMS) 24 arranged to manage data stored in a data storage device 26 that may include a local data storage device, for example implemented using SQL protocols, and/or cloud based data storage; a login application 28 arranged to manage a user login process, for example by receiving user login details from a user computing device 12, 14, 16 and verifying the received login details with reference login details 30 stored in the data storage device 26; a closed world application 32 arranged to implement functionality for a closed world project; an open world application 34 arranged to implement functionality for an open world project; a 2D pin application 36 that enables a user to select the relative locations of observer experiences for an observer in a scene in 2D space; and a 3D pin application 38 that enables a user to select the relative locations of observer experiences for an observer in a scene in 3D space.
  • In this example, the data storage device 26 is arranged to store data used by the system 10 in multiple relational databases that may be configured according to SQL protocols. The databases include:
  • a projects database 40 arranged to store data indicative of VR/AR/MR projects including the project name, project type (closed world, open world journey, or open world free roaming) and scene locations;
  • an experiences database 42 arranged to store data indicative of the relative locations of observer experiences at each of the scene locations, and the types of observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences;
  • a haptic database 44 arranged to store data indicative of haptic information, such as touch or smell, associated with observer haptic experiences linked to the scene locations;
  • a video database 46 arranged to store data indicative of video information associated with observer video experiences linked to the scene locations;
  • an audio database 48 arranged to store data indicative of audio information associated with observer audio experiences linked to the scene locations, for example traditional audio and/or ambisonic/spatial audio;
  • an images database 50 arranged to store data indicative of image information associated with observer image experiences linked to the scene locations;
  • a users database 52 arranged to store data indicative of registered users associated with the system; and
  • a 3D files database 53 arranged to store 3D files, for example in .OBJ format.
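  • By way of illustration only, the records held in these databases might be modelled along the following lines; the type and field names in this sketch are assumptions made for clarity and do not form part of the described system.

```typescript
// Illustrative TypeScript model of the project and experience records described
// above. All type and field names are assumptions made for this sketch.

type WorldType = "closed" | "openJourney" | "openFreeRoaming";
type ExperienceType = "haptic" | "video" | "audio" | "image";

interface ProjectRecord {
  id: string;
  name: string;          // project name
  tagline?: string;
  description?: string;
  worldType: WorldType;  // closed world, open world journey or open world free roaming
}

interface SceneRecord {
  id: string;
  projectId: string;
  title: string;
  // Desired location of the scene in the world space (open world projects only).
  worldLocation?: { x: number; y: number; z?: number };
}

interface ExperienceRecord {
  id: string;
  sceneId: string;
  type: ExperienceType;
  timeInScene: number;   // defined time within the scene, e.g. in seconds
  // Desired location of the experience relative to the observer, in 2D or 3D.
  locationRelativeToObserver: { x: number; y: number; z?: number };
  assetRef?: string;     // link into the haptic, video, audio or images store
}

// Example: an audio experience two units to the observer's left, ten seconds in.
const example: ExperienceRecord = {
  id: "exp-1",
  sceneId: "scene-1",
  type: "audio",
  timeInScene: 10,
  locationRelativeToObserver: { x: -2, y: 0 },
};
```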
  • An example configuration of a user computing device 12, 14, 16, in this example a tablet computer 14, is shown in FIG. 2.
  • The user computing device 14 includes a processor 60 arranged to control and coordinate operations in the user computing device 14, a data storage device 62 arranged to store programs and data used by the processor 60 to implement the desired functionality, and a memory 64 used by the processor to temporarily store programs and/or data during use.
  • The user computing device 14 also includes a display 68 and a user interface 70, in this example in the form of a touch screen, arranged to enable the user computing device 14 to receive inputs from a user.
  • An example implementation with reference to screens displayed to a user on the user computing device 12, 14, 16 is shown in FIGS. 3 to 13. In this example, the computing device 12, 14, 16 is a tablet computer 14 having a user interface in the form of a touch screen 70 overlaid on the display 68. As such, inputs to the computing device 14 are primarily effected by touching the touch screen 70 using taps, swipes and any other gestures recognizable by the device. However, it will be understood that the example is equally applicable to implementations on other computing devices. For example, the user computing device may include a head mounted device (HMD) and a tool that supports WebVR.
  • A user first logs into the system 10 by entering user login details at the user interface implemented by the user computing device 14, and the system 10 verifies the entered login details by communicating the login details to the login application 28 and comparing the entered login details at the login application 28 with stored reference login details 30 associated with the user.
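  • A minimal sketch of the reference-details comparison performed by the login application 28 is given below; the hashing scheme, storage format and function names are assumptions, as the system does not prescribe a particular credential scheme.

```typescript
// Minimal sketch of checking entered login details against stored reference
// details. The use of scrypt and the hex storage format are assumptions; the
// described system does not specify how credentials are held.
import { scryptSync, timingSafeEqual } from "crypto";

function verifyLogin(enteredPassword: string, saltHex: string, referenceHashHex: string): boolean {
  const reference = Buffer.from(referenceHashHex, "hex");
  // Derive a hash of the entered password of the same length as the stored reference.
  const candidate = scryptSync(enteredPassword, Buffer.from(saltHex, "hex"), reference.length);
  // Constant-time comparison against the stored reference login details.
  return timingSafeEqual(candidate, reference);
}
```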
  • It will be understood that the user interface may be implemented on the user computing device 12, 14, 16 by installing an interface application on the user computing device 12, 14, 16 arranged to communicate with the server 20, the user interface may be implemented through a web browser, for example by serving web pages corresponding to the screens shown in FIGS. 3 to 13 to the user interface device as required, or the user interface may be implemented in any other way.
  • As shown in FIG. 3, after a user has successfully logged in, the user is presented with a home page 80 that displays information indicative of created projects 82 to the user. Each displayed project 82 includes a project name 84, world type indicia 86 indicative of the type of world environment associated with the project (closed world or open world), and a share icon 88 usable to provide a selected user with a link to the project so that the selected user is able to collaborate in the project creation process.
  • The home page 80 also includes a create new project button 90 usable to create a new project.
  • Activation of the create new project button 90 causes a create new project screen 94 to be displayed, as shown in FIG. 4. Like and similar features are indicated with like reference numerals.
  • The create new project screen 94 includes a world type selection field 96 that enables a user to select the type of world environment associated with the project, that is, a closed world environment, a journey open world environment or a free roaming open world environment; a world type icon 98 representative of the type of world selected; a project name field 100 for receiving a project title; a project tagline field 102 for receiving a project tagline; and a description field 104 for receiving descriptive information associated with the project.
  • The create new project screen 94 also includes a reference image field 106 usable to facilitate selection and display of an image that is representative of the project, and a create button 110 that when activated causes a new project record to be created in the projects database 40. Activation of the create button 110 also causes a relevant project overview screen 120, 230, 260 associated with the project to be displayed as shown in FIG. 6, 12 or 13.
  • Selection of the share icon 88 on the home page 80 or on the create new project screen 94 causes a project share screen 111 to be displayed on a user interface of the user selected to collaborate on the project, as shown in FIG. 5.
  • The project share screen 111 includes an open project button 112 that when activated causes the relevant project overview screen 120, 230, 260 associated with the project to be displayed as shown in FIG. 6, 12 or 13.
  • In this example, a user has selected a closed world environment and as such a closed project overview screen 120 associated with the project is displayed, as shown in FIG. 6.
  • The closed project overview screen 120 includes world type selectors 113, namely a closed world selector 114, a journey open world selector 116 and a free roaming open world selector 117, that enable a user to switch between world types, and a point of view selector 118 arranged to facilitate selection of the observer point of view, in this example a first person point of view.
  • Since the present project is a closed world project, the closed project overview screen 120 shows a time line 122 defining timing for a sequence of defined scenes 128 to be presented to an observer during the closed world experience. The time line may be displayed or hidden using a hide/show button 124.
  • The scenes 128 are organized in scene groups 126, each scene group 126 representing a different part of the story associated with the project, in this example “Setup”, “Confrontation” and “Resolution”. New scene groups 126 are added using an add scene group button 130. Similarly, new scenes 128 are added using an add scene button 132.
  • Each scene 128 has an associated notes icon 134 that when selected causes a notes screen 140 as shown in FIG. 7 to be displayed.
  • The notes screen 140 includes a scene title 142 and is usable to add notes for a scene into a content field 146, for example using edit tools 144.
  • Selection of a scene 128 causes a 2D experience location screen 150 (hereinafter a “plate screen”) to be displayed, as shown in FIG. 8. The plate screen 150 is usable to select the locations of experiences relative to an observer that can occur at the scene 128, and the types of observer experiences that occur, such as audio, video and/or haptic experiences.
  • The plate screen 150 includes an experience space, in this example a plate area 152, that has several concentric annular portions 154 surrounding an observer icon 156. Disposable on the plate area 152 are pins 158 that represent observer experiences relative to the observer 156. Using the plate screen 150, a user is able to select the desired location of an experience relative to the observer 156 and the type of observer experience. In this example, available observer experiences include video, image, audio and haptic experiences.
  • Each pin 158 includes a pin type icon, for example a visual pin icon 160, an audio pin icon 162 or a haptic pin icon 164. The type of pin is selected using pin type selectors 166, in this example a visual pin selector 168, an audio pin selector 170 and a haptic pin selector 172.
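  • One possible way to record a pin dropped on the plate area 152 is as an angle around the observer 156 plus the annular portion 154 it sits in; the sketch below illustrates such a representation and is an assumption rather than a prescribed format.

```typescript
// One possible representation of a pin on the plate area: an angle around the
// observer plus the index of the annular portion it occupies. Illustrative only.
interface PlatePin {
  kind: "visual" | "audio" | "haptic";
  angleDeg: number; // 0 = directly ahead of the observer, 90 = to the observer's right
  ring: number;     // 0 = innermost annular portion (closest to the observer)
}

// Convert a pin to x/y coordinates on the plate for display, assuming evenly
// spaced rings; the y axis points ahead of the observer.
function platePinToXY(pin: PlatePin, ringSpacing = 1): { x: number; y: number } {
  const r = (pin.ring + 1) * ringSpacing;
  const a = (pin.angleDeg * Math.PI) / 180;
  return { x: r * Math.sin(a), y: r * Math.cos(a) };
}

// Example: a haptic experience directly behind the observer on the second ring.
const rumbleBehind = platePinToXY({ kind: "haptic", angleDeg: 180, ring: 1 });
```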
  • The plate screen 150 also includes an experience model selector 174 that can be used to select the type of experience selection model, in this example a 2D experience selection model, as shown in FIGS. 8 and 9, wherein a user is able to select in 2D the locations of experiences relative to an observer that can occur at a scene, and a 3D experience selection model, as shown in FIG. 11, wherein a user is able to select in 3D the locations of experiences relative to an observer that can occur at a scene.
  • The plate screen 150 also includes a pin view selector 176 usable to select the type of pins 158 that are displayed on the plate area 152, for example all pins 158, only haptic pins, only audio pins or only video pins.
  • The plate screen 150 also includes a scene identifier 178 that identifies the title of the scene associated with the displayed plate area 152, in this example a scene titled “speak to an old friend”; a previous scene navigation button 180 usable to navigate to a previous scene in the story timeline; and a next scene navigation button 182 usable to navigate to a subsequent scene in the story timeline.
  • The plate screen 150 also includes a time line 184 that includes a current time marker 186 to indicate the relevant time in the scene that corresponds to the experiences and relative locations of the experiences represented by the pins 158 on plate area 152.
  • Within a scene, it is possible to create multiple plate screens 150, each plate screen 150 corresponding to a different time in the scene and each plate screen 150 potentially including different pins 158 and/or different pin locations relative to the observer 156. For example, as shown in FIG. 9, a further plate screen 150 associated with the scene "speak to an old friend" is shown, with the further plate screen 150 representing a later time 188 in the scene than the plate screen 150 shown in FIG. 8. In this way, during a scene an observer is able to have different experiences that are linked to different locations relative to the observer.
  • Selection of a pin selector 166 by a user and subsequent selection of a location on the plate area 152 causes a pin 158 of the type that has been selected to be added to the plate area 152. Subsequent selection of the pin causes an add pin window 190 to be displayed over the plate screen 150. The add pin window 190 is used to add information indicative of the relevant experience or to add a link to information indicative of the relevant experience.
  • In this example, the add pin window 190 includes a video link box 194 usable to add information indicative of the location of a selected video to be associated with the pin 158, an image link box 196 usable to add information indicative of the location of a selected image to be associated with the pin 158, an audio link box 198 usable to add information indicative of the location of selected audio to be associated with the pin 158, and a document link box 200 usable to add information indicative of the location of a selected document to be associated with the pin 158.
  • The add pin window 190 also includes an add note field 202 usable to add a note to the pin 158, an action location field 204, a character encounters field 206 and a next scene trigger point field 208. A scene link indicator 210 may also be included to link the scene to other scenes.
  • The action location field 204, character encounters field 206 and next scene trigger point field 208 enable a user to track, log and group encounters and interactions that are non-linear within an experience. For example, a user can create an opportunity for users to link to other worlds and scenes that are non-chronological, or the user may define different points of view and/or different experiences for different characters associated with the media content.
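  • The information captured by the add pin window 190 might be represented along the following lines; the field names in this sketch are illustrative assumptions that mirror the boxes and fields described above.

```typescript
// Illustrative shape of the information gathered by the add pin window;
// all field names are assumptions that mirror the boxes and fields above.
interface PinDetails {
  videoLink?: string;             // location of a selected video
  imageLink?: string;             // location of a selected image
  audioLink?: string;             // location of selected audio
  documentLink?: string;          // location of a selected document
  note?: string;                  // free-text note attached to the pin
  actionLocation?: string;        // action location field
  characterEncounters?: string[]; // character encounters field
  nextSceneTriggerPoint?: string; // next scene trigger point field
  linkedSceneIds?: string[];      // scenes linked via the scene link indicator
}
```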
  • As an alternative to a 2D experience selection model, as shown in FIGS. 8 and 9, a 3D experience selection model may be used, as shown in FIG. 11, wherein a user is able to select in 3D the locations of experiences relative to an observer that can occur at a scene. Like and similar features are indicated with like reference numerals.
  • The 3D experience selection model may be selected by selecting “3D sphere” instead of “closed” using the experience model selector 174 and this causes a 3D experience location screen 220 (hereinafter a “spherical space screen”) to be displayed. The spherical space screen 220 is usable to select the locations of experiences in 3D relative to an observer that can occur at a scene, and the types of observer experiences that occur, such as audio, video and/or haptic experiences.
  • Instead of a plate area 152 to represent the locations of experiences in 2D relative to an observer, a spherical experience space 222 is provided to represent the locations of experiences in 3D relative to an observer.
  • In a similar way to the plate screen 150, using the spherical space screen 220 a user is able to select the desired location of an experience relative to the observer 156 and the type of observer experience. In this example, available observer experiences include video, image, audio and haptic experiences. In order to facilitate addition of pins 158 at desired 3D locations, the spherical space screen 220 includes a navigation tool 224.
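  • A pin placed on the spherical experience space 222 could, for example, be stored as an azimuth, elevation and radius relative to the observer 156 and converted to Cartesian coordinates for display; the sketch below illustrates one such conversion and is an assumption rather than part of the described system.

```typescript
// A pin in the spherical experience space expressed as azimuth, elevation and
// radius relative to the observer, with a helper that converts it to Cartesian
// coordinates for display. The representation is an assumption for this sketch.
interface SphericalPinLocation {
  azimuthDeg: number;   // 0 = directly ahead of the observer, 90 = to the observer's right
  elevationDeg: number; // 0 = observer eye level, 90 = directly above
  radius: number;       // distance from the observer
}

function toCartesian(p: SphericalPinLocation): { x: number; y: number; z: number } {
  const az = (p.azimuthDeg * Math.PI) / 180;
  const el = (p.elevationDeg * Math.PI) / 180;
  return {
    x: p.radius * Math.cos(el) * Math.sin(az), // to the observer's right
    y: p.radius * Math.sin(el),                // above the observer
    z: p.radius * Math.cos(el) * Math.cos(az), // ahead of the observer
  };
}

// Example: an experience two units behind and slightly above the observer.
const behindAndAbove = toCartesian({ azimuthDeg: 180, elevationDeg: 20, radius: 2 });
```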
  • Referring to FIGS. 4 and 6, instead of a closed world, if a journey open world is selected using the world type selection field 96 on the create new project screen 94, or if a journey open world selector 116 is selected on the closed project overview screen 120, a journey open world overview screen 230 as shown in FIG. 12 is displayed. Like and similar features are indicated with like reference numerals.
  • The journey open world overview screen 230 is used to define the relative locations of scenes in a world space that is structured as a journey in the sense that an underlying direction for the observer is defined but the observer is able to roam within the journey, to define the relative locations of observer experiences at each of the scene locations, and to define the particular observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences.
  • It will be understood that unlike the closed world project described in relation to FIGS. 6 to 11 wherein the scenes are presented to an observer according to a defined structure and timing, with the journey open world represented in FIG. 12, an observer has at least some control over movement of the observer and therefore the location of the observer relative to the available scenes, and over the consequent experiences provided to the observer at the scenes.
  • As shown in FIG. 12, the journey open world overview screen 230 includes a journey space 232 that represents the available roaming space of an observer 156. A user is able to add scenes by adding scene icons to the journey space 232 at locations relative to the observer 156 that correspond to the desired locations of scenes, for example by selecting the locations on a touch screen.
  • The scenes may be grouped into several scene groups 240, 242, 244 with each scene group allocated a different scene icon. In this example, the scene icons include a main scene icon 234, a side mission icon 236 and a photo mission icon 238.
  • The scene titles 246, 248, 250 of the available scenes may be shown on the journey open world overview screen 230 in a plurality of scene groups 240, 242, 244, and the scene groups may be hidden or displayed using a hide/show button 254.
  • The journey open world overview screen 230 also includes an add scene button 256 that may be used to add a scene to a scene group 240, 242, 244.
  • In this example, selection of a scene icon 234, 236, 238 on the journey space 232 causes the relevant scene title 246, 248, 250 to be highlighted in the relevant scene group 240, 242, 244. Selection of a scene title 246, 248, 250 causes the plate screen 150 shown in FIGS. 8 and 9 to be displayed to enable the user to define the desired locations of experiences relative to the observer 156 for the scene and the type of observer experiences. At the plate screen 150, the user may select “3D sphere” instead of “closed” using the experience model selector 174 if it is desired to define the locations of experiences in 3D relative to an observer instead of 2D.
  • Referring to FIGS. 4 and 6, if a free roaming open world is selected using the world type selection field 96 on the create new project screen 94, or if a free roaming world selector 117 is selected on the closed project overview screen 120, a free roaming open world overview screen 260 as shown in FIG. 13 is displayed. Like and similar features are indicated with like reference numerals.
  • The free roaming open world overview screen 260 is similar to the journey open world overview screen 230 except that the free roaming open world overview screen 260 is used to define the relative locations of scenes in a world space that is structured as a free roaming space instead of a structured journey. As with the journey open world overview screen 230, the free roaming open world overview screen 260 is used to define the locations of scenes relative to an observer and, through the plate screen 150 or the sphere screen 220 shown in FIGS. 8, 9 and 11, the relative locations of observer experiences at each of the scene locations, and the particular observer experiences that occur at the respective scene locations, such as audio, visual and/or haptic experiences.
  • It will be understood that with a free roaming open world project, the scenes are not presented to an observer according to a defined structure, and instead an observer has full control over movement of the observer and therefore the location of the observer relative to the available scenes and the consequent experiences provided to the observer at the scenes.
  • As shown in FIG. 13, the free roaming open world overview screen 260 includes a free roaming space 262 that represents the available roaming space of an observer 156, in this example shown as a cube. A user is able to add scenes by adding scene icons to the free roaming space 262 at locations that correspond to the desired locations of scenes, for example by selecting the location on a touch screen.
  • In a variation, instead of using a defined shape to represent the free roaming space, the free roaming space may be defined according to a computer-generated space mesh that can have any shape. For example, the space mesh may represent an actual real world space, and the space mesh may be generated using a LIDAR or matterport scanner.
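  • As a sketch of how a scanned space mesh, for example a 3D file held in the 3D files database 53, might be imported, the following reads vertex positions from an .OBJ file; it ignores faces, normals and textures, and the function name and return shape are assumptions for illustration only.

```typescript
// Sketch of importing vertex positions from a scanned-space .OBJ file.
// Faces, normals and textures are ignored; illustrative only.
import { readFileSync } from "fs";

function readObjVertices(path: string): Array<[number, number, number]> {
  const vertices: Array<[number, number, number]> = [];
  for (const line of readFileSync(path, "utf8").split(/\r?\n/)) {
    const parts = line.trim().split(/\s+/);
    // Vertex lines in the OBJ format look like: "v x y z"
    if (parts[0] === "v" && parts.length >= 4) {
      vertices.push([Number(parts[1]), Number(parts[2]), Number(parts[3])]);
    }
  }
  return vertices;
}
```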
  • Examples of the media content planning system in use will now be described.
  • In a first example, a user desires to create a 360° video with defined video, audio and/or haptic responses in defined scenes and at defined times of the video.
  • Using the system 10, if necessary, the user first adds haptic, video, audio and image experience information to the haptic, video, audio and/or images databases 44, 46, 48, 50, then creates a closed world project by selecting the create new project button 90 on the home page 80, and selecting “closed” in the world type selection field 96 on the create new project screen 94 shown in FIG. 4. The user can also add a project name, tagline, description and reference image using project name, tagline, description and reference image fields 100, 102, 104, 106 on the create new project screen 94.
  • After selecting the create button 110, the closed world project overview screen 120 is displayed as shown in FIG. 6.
  • Using the closed world project overview screen 120, the user is able to define scene groups 126, scene titles 128, the point of view of the observer using the point of view selector 118, and the order and timing of the scene groups 126 and scenes 128. The user is also able to add notes to the scenes 128 using a notes icon 134 and notes screen 140.
  • The user is also able to define the observer experiences that occur at a scene 128 and the locations of the experiences relative to the observer using the plate screen 150 as shown in FIGS. 8 and 9. If the user wishes to define observer experiences in 3D space, the user selects 3D sphere in the experience model selector 174 which causes the sphere screen 220 to be displayed as shown in FIG. 11.
  • Using either the plate screen 150 or the sphere screen 220, the user adds pins 158 to the relevant plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of the observer experiences at a defined time indicated by a time marker 186, and the user selects the type of observer experience corresponding to each pin 158 using the add pin window 190. For example, an experience may be an explosion that occurs behind the observer 156 in a defined scene at a defined time in the scene. For this experience, the user would add a pin 158 to the plate area 152 or spherical space 222 at a location that corresponds to a location behind the observer 156, and would link the relevant video and, optionally, a haptic response associated with the explosion to the pin 158 using the add pin window 190.
  • Observer experiences occurring later in the scene can be added by selecting a later time on the time line 184 and adding pins 158 to a further plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of those observer experiences at that later time.
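  • The multiple plate screens 150 within a scene, each tied to a time on the time line 184, can be thought of as keyframes; the sketch below shows one way such keyframes might be resolved for a given time, with the lookup scheme and all names being illustrative assumptions.

```typescript
// Sketch of resolving which plate applies at a queried time within a scene:
// pick the plate whose time marker is the latest one not after the query.
// The keyframe-style lookup and all names are assumptions for this sketch.
interface PlateKeyframe {
  timeInScene: number; // time marker on the scene time line, e.g. in seconds
  pins: Array<{ kind: "visual" | "audio" | "haptic"; x: number; y: number }>;
}

function plateAt(keyframes: PlateKeyframe[], time: number): PlateKeyframe | undefined {
  return [...keyframes]
    .sort((a, b) => a.timeInScene - b.timeInScene)
    .filter((k) => k.timeInScene <= time)
    .pop();
}

// Example: an explosion behind the observer is introduced at t = 12 s.
const scenePlates: PlateKeyframe[] = [
  { timeInScene: 0, pins: [{ kind: "audio", x: 0.5, y: 0.2 }] },
  { timeInScene: 12, pins: [{ kind: "visual", x: 0, y: -1 }, { kind: "haptic", x: 0, y: -1 }] },
];
const plateAt15s = plateAt(scenePlates, 15); // the plate defined at t = 12 s
```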
  • In a second example, a user desires to create a free roaming game world with defined video, audio and/or haptic responses occurring in defined scenes at defined scene locations in the game world and at defined locations in the scenes relative to the observer 156.
  • Using the system 10, if necessary, the user first adds haptic, video, audio and image experience information to the haptic, video, audio and/or images databases 44, 46, 48, 50, then creates a free roaming open world project by selecting the create new project button 90 on the home page 80, and selecting "free roaming open world" in the world type selection field 96 on the create new project screen 94 shown in FIG. 4. The user can also add a project name, tagline, description and reference image using project name, tagline, description and reference image fields 100, 102, 104, 106 on the create new project screen 94.
  • After selecting the create button 110, the free roaming open world overview screen 260 is displayed as shown in FIG. 13.
  • Using the free roaming open world overview screen 260, the user is able to define:
      • scene types, grouping the scene types into scene groups 240, 242, 244;
      • scene titles 246, 248, 250;
      • the point of view of the observer using the point of view selector 118; and
      • the locations of the scenes relative to the free roaming space 262 by adding scene icons 234, 236, 238 at relevant locations on the free roaming space 262.
  • The user is also able to define the observer experiences that occur at each scene and the locations of the experiences relative to the observer by selecting a scene 246, 248, 250 which causes the plate screen 150 to be displayed, as shown in FIGS. 8 and 9. If the user wishes to define observer experiences in 3D space, the user selects 3D sphere in the experience model selector 174 which causes the sphere screen 220 to be displayed as shown in FIG. 11.
  • Using either the plate screen 150 or the sphere screen 220, the user adds pins 158 to the relevant plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of the observer experiences at a defined time indicated by a time marker 186, and the user selects the type of observer experience corresponding to each pin 158 using the add pin window 190.
  • Observer experiences occurring at a later time in the scene can be added by selecting a later time on the time line 184 and adding pins 158 to a further plate area 152 or spherical space 222 at locations relative to the observer 156 that correspond to the desired locations of those observer experiences at that later time.
  • It is to be understood that, if any prior art publication is referred to herein, such reference does not constitute an admission that the publication forms a part of the common general knowledge in the art, in Australia or any other country.
  • In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word “comprise” or variations such as “comprises” or “comprising” is used in an inclusive sense, i.e. to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.
  • Modifications and variations as would be apparent to a skilled addressee are determined to be within the scope of the present invention.

Claims (17)

What is claimed is:
1. A project planning system comprising:
a data storage device that stores information indicative of a project;
a user interface that displays an experience location screen, the experience location screen including a time selector and an experience space corresponding to a time selected using the time selector;
wherein the experience location screen is usable to plan a location and time of occurrence at the location of at least one experience associated with the project by:
enabling a user to select the time of occurrence of an experience associated with the project using the time selector; and
enabling the user to select the location of the experience at the selected time by selecting the location of an icon on the experience space, and in response the system disposing the icon on the experience space at the selected location; and
wherein the experience location screen enables the user to view an experience space associated with a selected time, the experience space displaying the planned location of the or each experience at the selected time.
2. A system as claimed in claim 1, wherein the system enables the user to select a type of experience associated with the icon.
3. A system as claimed in claim 1, wherein the experience space comprises at least one annular or spherical portion, the annular or spherical portion usable by the user to add an icon to the experience space at a location representative of a desired location of an experience.
4. A system as claimed in claim 1, wherein the experience includes any one or more of a haptic experience, a visual experience and/or an audio experience.
5. A system as claimed in claim 1, wherein the system is arranged to enable the user to add notes.
6. A system as claimed in claim 1, wherein the time selector comprises a timeline having a user manipulatable time marker.
7. A method of planning a project, the method comprising:
storing information indicative of a project in a data storage device;
displaying an experience location screen, the experience location screen including a time selector and an experience space corresponding to a time selected using the time selector;
using the experience location screen to plan a location and time of occurrence at the location of at least one experience associated with the project by:
selecting the time of occurrence of an experience associated with the project using the time selector; and
selecting the location of the experience at the selected time by selecting the location of an icon on the experience space, and in response disposing the icon on the experience space at the selected location; and
using the experience location screen to view an experience space associated with a selected time, the experience space displaying the planned location of the or each experience at the selected time.
8. A method as claimed in claim 7, comprising facilitating selection by a user of a type of experience associated with the icon.
9. A method as claimed in claim 7, wherein the experience space comprises at least one annular or spherical portion, the method comprising using the annular or spherical portion to add an icon to the experience space at a location representative of a desired location of an experience.
10. A method as claimed in claim 7, wherein the experience includes any one or more of a haptic experience, a visual experience and/or an audio experience.
11. A method as claimed in claim 7, wherein the time selector comprises a timeline having a user manipulatable time marker.
12. A project planning system for planning location and time of events, the system comprising:
a data storage device that stores information indicative of a project;
a user interface that displays a time and location screen, the time and location screen including a time selector and a location space corresponding to a time selected using the time selector;
wherein the time and location screen is usable to plan a location and time of occurrence at the location of at least one event associated with the project by:
enabling a user to select the time of occurrence of an event associated with the project using the time selector; and
enabling the user to select the location of the event at the selected time by selecting the location of an icon on the location space, and in response the system disposing the icon on the location space at the selected location; and
wherein the time and location screen enables the user to view a location space associated with a selected time, the location space displaying the planned location of the or each event at the selected time on the location space.
13. A system as claimed in claim 12, wherein the system enables the user to select a type of event associated with the icon.
14. A system as claimed in claim 12, wherein the location space comprises at least one annular or spherical portion, the annular or spherical portion usable by the user to add an icon to the location space at a location representative of a desired location of an event.
15. A system as claimed in claim 12, wherein the event includes an interaction, a haptic experience, a visual experience and/or an audio experience.
16. A system as claimed in claim 12, wherein the system is arranged to enable the user to add notes.
17. A system as claimed in claim 12, wherein the time selector comprises a timeline having a user manipulatable time marker.
US17/835,753 2018-03-27 2022-06-08 Media content planning system Abandoned US20220300145A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/835,753 US20220300145A1 (en) 2018-03-27 2022-06-08 Media content planning system

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
AU2018901016A AU2018901016A0 (en) 2018-03-27 A media content planning system
AU2018901016 2018-03-27
PCT/AU2019/050274 WO2019183676A1 (en) 2018-03-27 2019-03-27 A media content planning system
US17/030,161 US11360639B2 (en) 2018-03-27 2020-09-23 Media content planning system
US17/835,753 US20220300145A1 (en) 2018-03-27 2022-06-08 Media content planning system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/030,161 Continuation US11360639B2 (en) 2018-03-27 2020-09-23 Media content planning system

Publications (1)

Publication Number Publication Date
US20220300145A1 true US20220300145A1 (en) 2022-09-22

Family

ID=68062393

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/030,161 Active US11360639B2 (en) 2018-03-27 2020-09-23 Media content planning system
US17/835,753 Abandoned US20220300145A1 (en) 2018-03-27 2022-06-08 Media content planning system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/030,161 Active US11360639B2 (en) 2018-03-27 2020-09-23 Media content planning system

Country Status (6)

Country Link
US (2) US11360639B2 (en)
EP (1) EP3776491A4 (en)
JP (1) JP7381556B2 (en)
AU (1) AU2019240763B2 (en)
CA (1) CA3092884A1 (en)
WO (1) WO2019183676A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11074400B2 (en) * 2019-09-30 2021-07-27 Dropbox, Inc. Collaborative in-line content item annotations
JP7407150B2 (en) * 2021-08-17 2023-12-28 任天堂株式会社 Game program, game device, game system, and game processing method

US20100214284A1 (en) * 2009-02-24 2010-08-26 Eleanor Rieffel Model creation using visual markup languages
US7788323B2 (en) * 2000-09-21 2010-08-31 International Business Machines Corporation Method and apparatus for sharing information in a virtual environment
US7804507B2 (en) * 2006-07-27 2010-09-28 Electronics And Telecommunications Research Institute Face-mounted display apparatus for mixed reality environment
US7814429B2 (en) * 2006-06-14 2010-10-12 Dassault Systemes Computerized collaborative work
US7817150B2 (en) * 2005-09-30 2010-10-19 Rockwell Automation Technologies, Inc. Three-dimensional immersive system for representing an automation control environment
US20100274567A1 (en) * 2009-04-22 2010-10-28 Mark Carlson Announcing information about payment transactions of any member of a consumer group
US20100274627A1 (en) * 2009-04-22 2010-10-28 Mark Carlson Receiving an announcement triggered by location data
US7844724B2 (en) * 2007-10-24 2010-11-30 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US20110010636A1 (en) * 2009-07-13 2011-01-13 International Business Machines Corporation Specification of a characteristic of a virtual universe establishment
US20110041083A1 (en) * 2007-12-12 2011-02-17 Oz Gabai System and methodology for providing shared internet experience
US20170185261A1 (en) * 2015-12-28 2017-06-29 Htc Corporation Virtual reality device, method for virtual reality
US9696795B2 (en) * 2015-02-13 2017-07-04 Leap Motion, Inc. Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments
US9996797B1 (en) * 2013-10-31 2018-06-12 Leap Motion, Inc. Interactions with virtual objects for machine control
US20180276221A1 (en) * 2017-03-21 2018-09-27 EarthX, Inc. Geostory method and apparatus
US20190197785A1 (en) * 2017-12-22 2019-06-27 Magic Leap, Inc. Methods and system for managing and displaying virtual content in a mixed reality system

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997042601A1 (en) * 1996-05-06 1997-11-13 Sas Institute, Inc. Integrated interactive multimedia process
CA2587644C (en) * 2004-11-12 2015-01-13 Mok3, Inc. Method for inter-scene transitions
US20070162854A1 (en) * 2006-01-12 2007-07-12 Dan Kikinis System and Method for Interactive Creation of and Collaboration on Video Stories
US8271962B2 (en) * 2006-09-12 2012-09-18 Brian Muller Scripted interactive screen media
US20090113305A1 (en) * 2007-03-19 2009-04-30 Elizabeth Sherman Graif Method and system for creating audio tours for an exhibition space
US8443284B2 (en) * 2007-07-19 2013-05-14 Apple Inc. Script-integrated storyboards
US20090219291A1 (en) * 2008-02-29 2009-09-03 David Brian Lloyd Movie animation systems
US8531522B2 (en) * 2008-05-30 2013-09-10 Verint Systems Ltd. Systems and methods for video monitoring using linked devices
US8631334B2 (en) * 2009-12-31 2014-01-14 International Business Machines Corporation Virtual world presentation composition and management
US20160182971A1 (en) * 2009-12-31 2016-06-23 Flickintel, Llc Method, system and computer program product for obtaining and displaying supplemental data about a displayed movie, show, event or video game
KR101721539B1 (en) * 2010-02-11 2017-03-30 삼성전자주식회사 Method and apparatus for providing user interface in mobile terminal
US20110210962A1 (en) * 2010-03-01 2011-09-01 Oracle International Corporation Media recording within a virtual world
US9378296B2 (en) * 2010-08-24 2016-06-28 International Business Machines Corporation Virtual world construction
EP2469474B1 (en) * 2010-12-24 2020-02-12 Dassault Systèmes Creation of a playable scene with an authoring system
US20120198319A1 (en) * 2011-01-28 2012-08-02 Giovanni Agnoli Media-Editing Application with Video Segmentation and Caching Capabilities
US8606611B1 (en) * 2011-10-13 2013-12-10 Intuit Inc. Scheduling via multiple dimensions including worker, time, and location
US9632685B2 (en) * 2012-05-31 2017-04-25 Eric Qing Li Method of navigating through a media program displayed on a portable electronic device in a magnified time scale
WO2014085823A1 (en) * 2012-12-02 2014-06-05 Bachir Babale Virtual decals for precision alignment and stabilization of motion graphics on mobile video
CN104102678B (en) * 2013-04-15 2018-06-05 腾讯科技(深圳)有限公司 The implementation method and realization device of augmented reality
WO2015006784A2 (en) * 2013-07-12 2015-01-15 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9952042B2 (en) * 2013-07-12 2018-04-24 Magic Leap, Inc. Method and system for identifying a user location
JP6259368B2 (en) * 2013-11-28 2018-01-10 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Control method of mobile terminal
US9332285B1 (en) * 2014-05-28 2016-05-03 Lucasfilm Entertainment Company Ltd. Switching modes of a media content item
KR20160001266A (en) * 2014-06-27 2016-01-06 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9817627B2 (en) * 2014-08-04 2017-11-14 At&T Intellectual Property I, L.P. Method and apparatus for presentation of media content
US20160330522A1 (en) * 2015-05-06 2016-11-10 Echostar Technologies L.L.C. Apparatus, systems and methods for a content commentary community
US10235808B2 (en) * 2015-08-20 2019-03-19 Microsoft Technology Licensing, Llc Communication system
US9928656B2 (en) * 2015-09-11 2018-03-27 Futurewei Technologies, Inc. Markerless multi-user, multi-object augmented reality on mobile devices
EP3151243B1 (en) * 2015-09-29 2021-11-24 Nokia Technologies Oy Accessing a video segment
US9824500B2 (en) * 2016-03-16 2017-11-21 Microsoft Technology Licensing, Llc Virtual object pathing
US20180096532A1 (en) * 2016-10-03 2018-04-05 Honeywell International Inc. System and method for virtual reality simulation of vehicle travel
US20180096506A1 (en) * 2016-10-04 2018-04-05 Facebook, Inc. Controls and Interfaces for User Interactions in Virtual Spaces
CN109143576B (en) * 2017-06-27 2021-01-22 京东方科技集团股份有限公司 Display system, display method thereof and vehicle
US10901687B2 (en) * 2018-02-27 2021-01-26 Dish Network L.L.C. Apparatus, systems and methods for presenting content reviews in a virtual world

Patent Citations (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5689669A (en) * 1994-04-29 1997-11-18 General Magic Graphical user interface for navigating between levels displaying hallway and room metaphors
US6573903B2 (en) * 1995-05-08 2003-06-03 Autodesk, Inc. Determining and displaying geometric relationships between objects in a computer-implemented graphics system
US6570563B1 (en) * 1995-07-12 2003-05-27 Sony Corporation Method and system for three-dimensional virtual reality space sharing and for information transmission
US6002853A (en) * 1995-10-26 1999-12-14 Wegener Internet Projects Bv System for generating graphics in response to a database search
US6219045B1 (en) * 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
US6179619B1 (en) * 1997-05-13 2001-01-30 Shigenobu Tanaka Game machine for moving object
US6271843B1 (en) * 1997-05-30 2001-08-07 International Business Machines Corporation Methods systems and computer program products for transporting users in three dimensional virtual reality worlds using transportation vehicles
US6243091B1 (en) * 1997-11-21 2001-06-05 International Business Machines Corporation Global history view
US6079982A (en) * 1997-12-31 2000-06-27 Meader; Gregory M Interactive simulator ride
US6362817B1 (en) * 1998-05-18 2002-03-26 In3D Corporation System for creating and viewing 3D environments using symbolic descriptors
US6119147A (en) * 1998-07-28 2000-09-12 Fuji Xerox Co., Ltd. Method and system for computer-mediated, multi-modal, asynchronous meetings in a virtual space
US6414679B1 (en) * 1998-10-08 2002-07-02 Cyberworld International Corporation Architecture and methods for generating and displaying three dimensional representations
US6396522B1 (en) * 1999-03-08 2002-05-28 Dassault Systemes Selection navigator
US7119819B1 (en) * 1999-04-06 2006-10-10 Microsoft Corporation Method and apparatus for supporting two-dimensional windows in a three-dimensional environment
US6590593B1 (en) * 1999-04-06 2003-07-08 Microsoft Corporation Method and apparatus for handling dismissed dialogue boxes
US6690393B2 (en) * 1999-12-24 2004-02-10 Koninklijke Philips Electronics N.V. 3D environment labelling
US6621508B1 (en) * 2000-01-18 2003-09-16 Seiko Epson Corporation Information processing system
US20100115428A1 (en) * 2000-02-04 2010-05-06 Browse3D Corporation System and method for web browsing
US20010018667A1 (en) * 2000-02-29 2001-08-30 Kim Yang Shin System for advertising on a network by displaying advertisement objects in a virtual three-dimensional area
US7653877B2 (en) * 2000-04-28 2010-01-26 Sony Corporation Information processing apparatus and method, and storage medium
US20020095463A1 (en) * 2000-04-28 2002-07-18 Sony Corporation Information processing apparatus and method, and storage medium
US6784901B1 (en) * 2000-05-09 2004-08-31 There Method, system and computer program product for the delivery of a chat message in a 3D multi-user environment
US7788323B2 (en) * 2000-09-21 2010-08-31 International Business Machines Corporation Method and apparatus for sharing information in a virtual environment
US20020113820A1 (en) * 2000-10-10 2002-08-22 Robinson Jack D. System and method to configure and provide a network-enabled three-dimensional computing environment
US7663625B2 (en) * 2001-03-23 2010-02-16 Dassault Systemes Collaborative design
US6961055B2 (en) * 2001-05-09 2005-11-01 Free Radical Design Limited Methods and apparatus for constructing virtual environments
US7414629B2 (en) * 2002-03-11 2008-08-19 Microsoft Corporation Automatic scenery object generation
US20040113887A1 (en) * 2002-08-27 2004-06-17 University Of Southern California partially real and partially simulated modular interactive environment
US20040193441A1 (en) * 2002-10-16 2004-09-30 Altieri Frances Barbaro Interactive software application platform
US20050128212A1 (en) * 2003-03-06 2005-06-16 Edecker Ada M. System and method for minimizing the amount of data necessary to create a virtual three-dimensional environment
US7467356B2 (en) * 2003-07-25 2008-12-16 Three-B International Limited Graphical user interface for 3d virtual display browser using virtual display windows
US20050093719A1 (en) * 2003-09-26 2005-05-05 Mazda Motor Corporation On-vehicle information provision apparatus
US7382288B1 (en) * 2004-06-30 2008-06-03 Rockwell Collins, Inc. Display of airport signs on head-up display
US7542040B2 (en) * 2004-08-11 2009-06-02 The United States Of America As Represented By The Secretary Of The Navy Simulated locomotion method and apparatus
US7746343B1 (en) * 2005-06-27 2010-06-29 Google Inc. Streaming and interactive visualization of filled polygon data in a geographic information system
US7817150B2 (en) * 2005-09-30 2010-10-19 Rockwell Automation Technologies, Inc. Three-dimensional immersive system for representing an automation control environment
US7814429B2 (en) * 2006-06-14 2010-10-12 Dassault Systemes Computerized collaborative work
US7804507B2 (en) * 2006-07-27 2010-09-28 Electronics And Telecommunications Research Institute Face-mounted display apparatus for mixed reality environment
US20080246693A1 (en) * 2006-08-07 2008-10-09 International Business Machines Corporation System and method of enhanced virtual reality
US20080030429A1 (en) * 2006-08-07 2008-02-07 International Business Machines Corporation System and method of enhanced virtual reality
US20080235570A1 (en) * 2006-09-15 2008-09-25 Ntt Docomo, Inc. System for communication through spatial bulletin board
US20080125218A1 (en) * 2006-09-20 2008-05-29 Kelly James Collins Method of use for a commercially available portable virtual reality system
US20090300528A1 (en) * 2006-09-29 2009-12-03 Stambaugh Thomas M Browser event tracking for distributed web-based processing, spatial organization and display of information
US20090076791A1 (en) * 2007-09-18 2009-03-19 Disney Enterprises, Inc. Method and system for converting a computer virtual environment into a real-life simulation environment
US20090091583A1 (en) * 2007-10-06 2009-04-09 Mccoy Anthony Apparatus and method for on-field virtual reality simulation of US football and other sports
US7844724B2 (en) * 2007-10-24 2010-11-30 Social Communications Company Automated real-time data stream switching in a shared virtual area communication environment
US20110041083A1 (en) * 2007-12-12 2011-02-17 Oz Gabai System and methodology for providing shared internet experience
US20090287728A1 (en) * 2008-05-15 2009-11-19 International Business Machines Corporation Tag along shopping
US20100070378A1 (en) * 2008-09-13 2010-03-18 At&T Intellectual Property I, L.P. System and method for an enhanced shopping experience
US20100205541A1 (en) * 2009-02-11 2010-08-12 Jeffrey A. Rapaport social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic
US20100214284A1 (en) * 2009-02-24 2010-08-26 Eleanor Rieffel Model creation using visual markup languages
US20100274567A1 (en) * 2009-04-22 2010-10-28 Mark Carlson Announcing information about payment transactions of any member of a consumer group
US20100274627A1 (en) * 2009-04-22 2010-10-28 Mark Carlson Receiving an announcement triggered by location data
US20110010636A1 (en) * 2009-07-13 2011-01-13 International Business Machines Corporation Specification of a characteristic of a virtual universe establishment
US9996797B1 (en) * 2013-10-31 2018-06-12 Leap Motion, Inc. Interactions with virtual objects for machine control
US9696795B2 (en) * 2015-02-13 2017-07-04 Leap Motion, Inc. Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments
US20170185261A1 (en) * 2015-12-28 2017-06-29 Htc Corporation Virtual reality device, method for virtual reality
US20180276221A1 (en) * 2017-03-21 2018-09-27 EarthX, Inc. Geostory method and apparatus
US20190197785A1 (en) * 2017-12-22 2019-06-27 Magic Leap, Inc. Methods and system for managing and displaying virtual content in a mixed reality system

Also Published As

Publication number Publication date
AU2019240763B2 (en) 2024-09-19
AU2019240763A1 (en) 2020-10-01
JP2021519481A (en) 2021-08-10
WO2019183676A1 (en) 2019-10-03
JP7381556B2 (en) 2023-11-15
EP3776491A1 (en) 2021-02-17
US20210004143A1 (en) 2021-01-07
EP3776491A4 (en) 2021-07-28
US11360639B2 (en) 2022-06-14
CA3092884A1 (en) 2019-10-03

Similar Documents

Publication Publication Date Title
US11275481B2 (en) Collaborative augmented reality system
US20230062951A1 (en) Augmented reality platform for collaborative classrooms
CN114443945B (en) Display method of application icons in virtual user interface and three-dimensional display device
US8271905B2 (en) Information presentation in virtual 3D
US20050193333A1 (en) Survey generation system
CN108961418A (en) A kind of Knowledge Visualization interface system and method based on virtual three-dimensional space
CN103650013A (en) Methods and systems for browsing heterogeneous map data
US20220300145A1 (en) Media content planning system
Batch et al. Evaluating View Management for Situated Visualization in Web‐based Handheld AR
CN103631477B (en) A kind of device and method for representing financial transaction peration data with dynamic image
WO2019190722A1 (en) Systems and methods for content management in augmented reality devices and applications
CN114344898A (en) A method and device for marking virtual objects in games
JP2004184650A (en) Virtual reality space system
GB2425878A (en) Electronic learning environment
JP3413145B2 (en) Virtual space editing method and virtual space editing device
Gonzalez Calleros et al. Is natural user interaction really natural? An evaluation of gesture-based navigating techniques in virtual environments
JPH04288674A (en) Hypertext device
JP2006163721A (en) Method for simulating furniture layout
Shepard Map-based Input with Google Fusion Tables
JP3312688B2 (en) Hypertext device
Cartwright Using 3D models for visualizing “The city as it might be”
Tickoo Exploring Autodesk Navisworks 2019
JP2007249561A (en) Display system and program of screen transition diagram
dos Santos Augmented Reality Application for Smart Cities
KR20240087499A (en) Metaverse based engineering system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SPACEDRAFT PTY LTD, AUSTRALIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:COOKE, LUCY;REEL/FRAME:060150/0213

Effective date: 20200828

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION