US12229800B2 - Augmented reality guest recognition systems and methods
- Publication number
- US12229800B2 (Application No. US17/180,551; US202117180551A)
- Authority
- US
- United States
- Prior art keywords
- guest
- augmented reality
- guests
- reality display
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0282—Rating or review of business operators or products
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Description
- The present disclosure relates generally to systems and methods for the creation of an augmented reality environment that helps employees of a venue (e.g., an amusement or theme park) enhance the experiences of guests of the venue. More specifically, embodiments of the present disclosure relate generally to systems and methods for an augmented reality environment based on automated recognition of guests of the venue.
- Amusement parks and/or theme parks may include various entertainment attractions, restaurants, souvenir shops, and rides useful in providing enjoyment to guests (e.g., families and/or people of all ages) of the venue.
- Areas of the venue may have different themes that are specifically targeted to certain audiences. For example, certain areas may include themes that are traditionally of interest to children, while other areas may include themes that are traditionally of interest to more mature audiences.
- Locations having themes associated with such a venue may be referred to as attractions or themed attractions.
- These themed attractions may be established using fixed equipment, building layouts, props, decorations, and so forth, most of which may generally relate to a certain theme.
- Employees of the venue may help enhance the experiences of the guests. As such, the employees of the venue may be assisted with more up-to-date information relating to the previous and current states of the experiences of the guests in order to enhance the future experiences of the guests.
- In certain embodiments, a system includes a guest recognition system configured to recognize one or more guests in a venue.
- The system also includes a guest experience analysis system configured to receive data relating to the recognized one or more guests from the guest recognition system, to generate guest experience information relating to the recognized one or more guests based at least in part on the received data, and to transmit the guest experience information relating to the recognized one or more guests to an augmented reality display device for display on an augmented reality display of the augmented reality display device.
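The paragraphs above describe a three-part pipeline: a guest recognition system produces recognition data, a guest experience analysis system turns that data into guest experience information, and an augmented reality display device displays it. The following Python sketch illustrates one way that hand-off could be organized; every class, method, and field name here is an illustrative assumption, not something specified by the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# All names below are illustrative assumptions, not terms defined by the patent.

@dataclass
class RecognizedGuest:
    guest_id: str                  # identity resolved by the guest recognition system
    confidence: float              # recognition confidence, 0.0-1.0
    location: Tuple[float, float]  # (x, y) position within the venue

@dataclass
class GuestExperienceInfo:
    guest_id: str
    summary_lines: List[str] = field(default_factory=list)  # text to superimpose on the display

class GuestRecognitionSystem:
    def recognize(self, sensor_frame) -> List[RecognizedGuest]:
        """Run recognition algorithms on one sensor frame (stubbed here)."""
        raise NotImplementedError

class GuestExperienceAnalysisSystem:
    def analyze(self, guests: List[RecognizedGuest]) -> List[GuestExperienceInfo]:
        """Turn recognition results into displayable guest experience information."""
        return [GuestExperienceInfo(g.guest_id, [f"Guest {g.guest_id}"]) for g in guests]

class AugmentedRealityDisplayDevice:
    def display(self, info: List[GuestExperienceInfo]) -> None:
        """Superimpose the received information on the AR display (stubbed here)."""
        for item in info:
            print(item.guest_id, *item.summary_lines)
```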
- A method includes recognizing, via a guest recognition system, one or more guests in a venue.
- The method also includes generating, via a guest experience analysis system, guest experience information relating to the recognized one or more guests.
- The method further includes transmitting, via the guest experience analysis system, the guest experience information relating to the recognized one or more guests to an augmented reality display device for display on an augmented reality display of the augmented reality display device.
- An augmented reality display device includes an augmented reality display configured to pass through images of one or more guests of a venue.
- The augmented reality display device also includes one or more non-transitory, computer-readable media storing instructions which, when executed by at least one processor, cause the at least one processor to perform operations.
- The operations include receiving guest experience information relating to a targeted guest of the one or more guests from a guest experience analysis system.
- The operations also include superimposing the guest experience information relating to the targeted guest on the augmented reality display near pass-through images of the targeted guest.
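The superimposing operation described above needs a screen position "near" the targeted guest. The patent does not spell out how that position is chosen; the sketch below shows one plausible approach, assuming the display device can obtain a bounding box for the targeted guest from its tracking pipeline.

```python
from typing import Tuple

def overlay_anchor(bbox: Tuple[int, int, int, int],
                   display_size: Tuple[int, int],
                   text_width: int = 150,
                   margin: int = 12) -> Tuple[int, int]:
    """Choose a position for guest experience text beside a guest's bounding
    box (x, y, width, height) so the text sits near, but not on top of, the
    pass-through image of the targeted guest. The bounding box is assumed to
    come from whatever tracker localizes the guest on the display."""
    x, y, w, h = bbox
    disp_w, disp_h = display_size
    anchor_x = x + w + margin                 # prefer the space to the guest's right
    if anchor_x + text_width > disp_w:        # not enough room: fall back to the left side
        anchor_x = max(0, x - text_width - margin)
    anchor_y = max(0, min(y, disp_h - 1))     # keep the label on screen vertically
    return anchor_x, anchor_y
```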
- FIG. 1 illustrates features of a venue including one or more attractions, in accordance with embodiments of the present disclosure
- FIG. 2 is a schematic diagram of an augmented reality guest recognition system for providing an enhanced guest experience for the venue features illustrated in FIG. 1 , in accordance with embodiments of the present disclosure
- FIG. 3 illustrates an exemplary portion of the augmented reality guest recognition system of FIG. 2 , in accordance with embodiments of the present disclosure
- FIG. 4 illustrates the guest recognition sensors and the augmented reality display endpoint of FIG. 3 from the point-of-view of a venue employee, illustrating exemplary guest experience information relating to guests of the venue, in accordance with embodiments of the present disclosure
- FIG. 5 illustrates wearable augmented reality display devices (e.g., augmented reality glasses, augmented reality goggles, other augmented reality headgear, and so forth), in accordance with embodiments of the present disclosure
- FIG. 6 is a flow diagram of a method of use of the augmented reality guest recognition system, in accordance with embodiments of the present disclosure.
- Embodiments of the present disclosure enable a more dynamic interaction experience for guests of venues, such as amusement or theme parks, in which information relating to previous and current experiences of the guests within a venue is automatically (e.g., without any human intervention) provided to employees of the venue for the purpose of enabling the employees to enhance future experiences of the guests. More specifically, embodiments of the present disclosure enable the guests to be automatically recognized (e.g., by a computer-based augmented reality guest recognition system, as described in greater detail herein) during their time in the venue, and information related to the guests' experiences within the venue may be automatically (e.g., without any human intervention) presented to the employees of the venue to help the employees enhance future experiences of the guests within the venue.
- The guests of the venue may be automatically recognized using facial recognition techniques, clothing recognition techniques, movement recognition techniques (e.g., to detect identifiable gaits of particular guests), and/or other guest recognition techniques, as described in greater detail herein, to track the previous and current experiences of the guests within the venue, and the information relating to such experiences may be automatically (e.g., without any human intervention) provided to the employees of the venue based at least in part on this automated guest recognition, as also described in greater detail herein.
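Facial, clothing, and gait recognition are not detailed in the text, but a common pattern for this kind of re-identification is to compare an embedding vector computed from the sensor data against embeddings enrolled for known guests. The sketch below assumes such embeddings already exist; the embedding model itself is outside the scope of this illustration, and all names are hypothetical.

```python
from typing import Dict, Optional
import numpy as np

def best_match(query_embedding: np.ndarray,
               enrolled: Dict[str, np.ndarray],
               threshold: float = 0.7) -> Optional[str]:
    """Return the guest_id whose enrolled embedding is most similar (cosine
    similarity) to the query embedding, or None if nothing clears the
    threshold. The embeddings are assumed to come from whatever facial,
    clothing, or gait model the recognition system uses."""
    best_id, best_score = None, threshold
    q = query_embedding / np.linalg.norm(query_embedding)
    for guest_id, emb in enrolled.items():
        score = float(np.dot(q, emb / np.linalg.norm(emb)))
        if score > best_score:
            best_id, best_score = guest_id, score
    return best_id
```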
- The venue 10 may include thrill rides 12 , venue facilities 14 (e.g., restaurants, souvenir shops, and so forth), additional venue attractions 16 , and venue employees 18 (e.g., acting as themed characters, in certain embodiments).
- experiences of guests 20 of the venue 10 may be enhanced by the venue employees 18 based at least in part on information relating to previous and current experiences of the guests 20 within the venue 10 , which may be automatically (e.g., without any human intervention) determined based at least in part on automated recognition of the guests 20 within the venue 10 , and which may be presented to the venue employees 18 via an augmented reality system.
- a guest recognition system 22 may automatically (e.g., without any human intervention) recognize the guests 20 for the purpose of tracking previous and current experiences of the guests 20 within the venue 10 using, for example, facial recognition techniques, movement recognition techniques (e.g., to detect identifiable gaits of particular guests 20 ), and/or other guest recognition techniques, as described in greater detail herein.
- the information relating to the guest recognition performed by the guest recognition system 22 may be analyzed by a guest experience analysis system 24 to determine information relating to previous and current experiences of the guests 20 within the venue 10 , and this information may be provided to the venue employees 18 by an augmented reality system, as described in greater detail herein, to enable the venue employees 18 to enhance future experiences of the guests 20 within the venue 10 .
- the augmented reality system may include one or more augmented reality display endpoints 26 that are configured to display augmented reality information with respect to guests 20 that are also displayed via the augmented reality display endpoints 26 .
- the one or more augmented reality display endpoints 26 may be, for example, relatively stationary kiosks disposed at certain locations within the venue 10 .
- The augmented reality display endpoints 26 may include transparent displays that pass through images of the guests 20 while also augmenting the display with information relating to previous and current experiences of the guests 20, which is determined by the guest experience analysis system 24.
- the augmented reality system may include one or more wearable augmented reality display devices 28 (e.g., augmented reality glasses, augmented reality goggles, other augmented reality headgear, and so forth) that are configured to display augmented reality information with respect to guests 20 that are also displayed via the wearable augmented reality display devices 28 .
- The wearable augmented reality display devices 28 also may include transparent displays that pass through images of the guests 20 while also augmenting the display with information relating to previous and current experiences of the guests 20, which is determined by the guest experience analysis system 24.
- the information relating to previous and current experiences of a particular guest 20 may be superimposed onto a transparent display near a pass-through image of the particular guest 20 so that a venue employee 18 may easily recognize the information as being associated with the particular guest 20 , thereby facilitating the venue employee 18 to help improve future experiences of the particular guest 20 .
- the augmented reality display endpoints 26 and/or wearable augmented reality display devices 28 described herein may be configured to provide audio cues and/or haptic feedback indicative of information relating to guests 20 displayed via the augmented reality display endpoints 26 and/or wearable augmented reality display devices 28 .
- the information relating to previous and current experiences of the guests 20 may also be presented to users (e.g., venue employees 18 ) via one or more wearable devices 30 , one or more mobile devices 32 , and/or other one or more themed devices 34 .
- the wearable devices 30 may be watch-like electronic devices, bracelets, amulets, rings, headbands, hats, helmets, t-shirts, jackets, coats, shorts, pants, shoes, boots, or any other conveniently wearable items.
- the mobile devices 32 may be mobile phones (e.g., smartphones), tablet computers, or any other suitable devices that can be carried around the venue 10 by a guest 20 .
- the themed devices 34 may be venue theme-related objects, such as toy guns, swords, flags, wands, and so forth.
- the wearable devices 30 and/or the themed devices 34 may either include circuitry (e.g., small chips) disposed within them (e.g., sewn within clothing material, and so forth) or may include unique patterns (e.g., images, and so forth) that may be passively tracked by the guest recognition system 22 to recognize the guests 20 and/or experiences of the guests 20 .
- one or more physical objects 36 disposed within a real-world environment of the venue 10 may be configured to generate one or more physical effects 38 (e.g., generation of sparks, generation of fire, generation of wind, movement, and so forth) based at least in part on control signals received from one or more of the devices 26 , 28 , 30 , 32 , 34 described herein, which may be caused to be transmitted by users (e.g., venue employees 18 ) based on the augmented reality information presented to the users (e.g., venue employees 18 ) via the devices 26 , 28 , 30 , 32 , 34 .
- the venue employee 18 may determine that a particular physical effect 38 may enhance an experience of the particular guest 20 , and the venue employee 18 may initiate a control signal being transmitted to a particular physical object 36 to generate the particular physical effect 38 .
- FIG. 2 is a schematic diagram of an augmented reality guest recognition system 40 for providing an enhanced guest experience for a venue 10 , in accordance with embodiments of the present disclosure.
- the augmented reality guest recognition system 40 may include a guest recognition system 22 and a guest experience analysis system 24 that are configured to cooperate together to recognize guests 20 while they are experiencing various features of the venue 10 , for example, as illustrated in FIG. 1 , and to provide users (e.g., venue employees 18 ) with augmented reality views of information relating to previous and current experiences of the guests 20 to enable the users (e.g., venue employees 18 ) to further enhance future experiences of the guests 20 .
- one or more augmented reality display endpoints 26 and/or one or more wearable augmented reality display devices 28 may be configured to display the augmented reality views of the information relating to the previous and current experiences of the guests 20 within the venue 10 , which is generated by the guest experience analysis system 24 based at least in part on the information tracked by the guest recognition system 22 .
- one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 may also be configured to present the augmented reality views of the information relating to the previous and current experiences of the guests 20 within the venue 10 , which is generated by the guest experience analysis system 24 based at least in part on the information tracked by the guest recognition system 22 .
- Guests 20 of the venue 10 may be able to view at least a portion of the augmented reality views of the information relating to the previous and current experiences of guests 20 within the venue 10 via one or more wearable devices 30, one or more mobile devices 32, and/or one or more themed devices 34, and may use those devices to provide inputs similar to those that may be received from users (e.g., venue employees 18) via the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28, as described in greater detail herein.
- data from one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 may be used by the guest experience analysis system 24 to determine certain experiences of certain guests 20 , as described in greater detail herein.
- the augmented reality guest recognition system 40 may also include one or more physical objects 36 that are configured to generate one or more physical effects 38 , as described in greater detail herein (e.g., with respect to FIG. 1 ).
- the augmented reality display endpoints 26 , the wearable augmented reality display devices 28 , the wearable devices 30 , the mobile devices 32 , the themed devices 34 , and/or the physical objects 36 may be communicatively coupled to the guest recognition system 22 and/or the guest experience analysis system 24 (e.g., within the venue 10 ) via a wireless network 42 (e.g., wireless local area networks (WLANs), wireless wide area networks (WWANs), near field communication (NFC) networks, or any other suitable wireless networks).
- the augmented reality display endpoints 26 and/or the physical objects 36 may be communicatively coupled to the guest recognition system 22 and/or the guest experience analysis system 24 via direct physical connection 44 (e.g., using communication cables).
- the guest experience analysis system 24 may generate guest experience information 46 (e.g., metadata) to be displayed as augmenting information via, for example, displays 48 , 50 of the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 , respectively, based at least in part on guest recognition data detected by the guest recognition system 22 .
- the guest recognition system 22 may include one or more guest recognition sensors 52 configured to collect data, which may be used by the guest recognition system 22 to recognize guests 20 within the venue 10 .
- the guest recognition sensors 52 may include cameras configured to collect images (and/or video streams) in substantially real time, and the guest recognition system 22 may execute facial recognition algorithms, clothing recognition algorithms, movement recognition algorithms (e.g., to detect identifiable gaits of particular guests 20 ), and other types of recognition algorithms, to detect identities of certain guests 20 (or groups of guests 20 , as described in greater detail herein) within the venue 10 .
- activity of guests 20 (or groups of guests 20 , as described in greater detail herein) of the venue 10 may be tracked over time by the guest recognition system 22 , and information relating to this tracked activity may be transmitted to the guest experience analysis system 24 via one or more communications interfaces 54 of the guest recognition system 22 for analysis and generation of the guest experience information 46 , which relates to previous and current experiences of the guests 20 within the venue 10 .
- The terms "real time" and "substantially real time" as used herein indicate that images (or frames of the video streams) are obtained and/or provided in a timeframe substantially close to the time of actual observation (e.g., the images may be obtained and/or provided once every 1/10th of a second, every 1/15th of a second, every 1/20th of a second, every 1/30th of a second, every 1/60th of a second, or even more frequently).
- the guest experience information 46 may include any information relating to guests 20 (or groups of guests 20 ), including activity of the guests 20 (or groups of guests 20 ) within the venue 10 .
- the guest experience information 46 may include personal information about the guests 20 , such as name, age, gender, height, weight, hometown, languages spoken, themed sections of the venue 10 in which the guests 20 are credentialed (e.g., entitled) to be, classifications for priority relating to particular themed sections or particular thrill rides 12 , venue facilities 14 , and/or other venue attractions 16 of the venue 10 , whether the guests 20 are annual pass holders for the venue 10 , number of times the guests 20 have visited the venue 10 , associations with certain themed groups of the venue 10 , and so forth.
- this personal information may be used by the guest recognition system 22 to recognize the guests 20 (or groups of guests 20 ).
- the guest experience information 46 may include activity of the guests 20 (or groups of guests 20 ) within the venue 10 , for example, number of times (and accumulated duration of time) the guests 20 (or groups of guests 20 ) rode, or otherwise visited, certain thrill rides 12 , venue facilities 14 , and/or other venue attractions 16 of the venue 10 , what the guests 20 (or groups of guests 20 ) are currently doing in the context of a particular themed section of the venue 10 within which the guests 20 (or group of guests 20 ) are currently located, and so forth.
- the guest experience information 46 may also include interaction of the guests 20 with particular wearable devices 30 , mobile devices 32 , and/or themed devices 34 associated with the guests 20 .
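Taken together, the last few paragraphs list the kinds of fields guest experience information 46 might carry: personal details, credentials and priority classifications, visit history, current activity, and device interactions. One plausible (and purely illustrative) way to bundle those fields is shown below; none of these field names come from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class GuestExperienceRecord:
    """One plausible shape for guest experience information 46; the field
    names are illustrative assumptions based on the categories listed above."""
    guest_id: str
    name: str = ""
    languages_spoken: List[str] = field(default_factory=list)
    credentialed_sections: List[str] = field(default_factory=list)  # themed sections the guest may enter
    annual_pass_holder: bool = False
    visit_count: int = 0
    ride_visits: Dict[str, int] = field(default_factory=dict)       # attraction name -> times visited
    current_activity: str = ""                                      # what the guest is doing right now
    device_interactions: List[str] = field(default_factory=list)    # wearable/mobile/themed device events
```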
- The guest recognition system 22 may include processing circuitry, such as a processor 56 (e.g., general purpose processor or other processor) and a memory 58, and may process the data collected by the one or more guest recognition sensors 52 to, for example, detect identities of certain guests 20 (or groups of guests 20, as described in greater detail herein) within the venue 10, track activity of the identified guests 20, and convert the data into a form suitable for processing by the guest experience analysis system 24.
- The guest recognition system 22 may be configured to identify when certain guests 20 are in relatively close proximity (e.g., within 30 feet, within 25 feet, within 20 feet, within 15 feet, within 10 feet, within 5 feet, or even closer) to other guests 20 for certain non-negligible amounts of time (e.g., greater than 10 minutes/day, greater than 20 minutes/day, greater than 30 minutes/day, or even longer) such that the guests 20 may be identifiable by the guest recognition system 22 as a group of guests 20, allowing the guest experience analysis system 24 to analyze the experiences of the group of guests 20 as opposed to, or in addition to, experiences of the individual guests 20 themselves.
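The grouping behavior described above amounts to accumulating co-location time between pairs of guests and declaring a group once a proximity threshold and a minimum duration are both met. A minimal sketch of that bookkeeping, with example thresholds drawn from the ranges mentioned above, might look like this (the positions, update rate, and class names are assumptions):

```python
from collections import defaultdict
from itertools import combinations
from math import dist
from typing import Dict, List, Tuple

PROXIMITY_FEET = 10.0            # example threshold from the range above
MIN_SECONDS_TOGETHER = 20 * 60   # example: 20 minutes per day

class GroupDetector:
    """Accumulates how long pairs of guests stay near each other and reports
    pairs that qualify as a group. Positions are assumed to be (x, y) in feet."""

    def __init__(self) -> None:
        self.seconds_together: Dict[Tuple[str, str], float] = defaultdict(float)

    def update(self, positions: Dict[str, Tuple[float, float]], dt_seconds: float) -> None:
        """Call once per tracking interval with the latest guest positions."""
        for a, b in combinations(sorted(positions), 2):
            if dist(positions[a], positions[b]) <= PROXIMITY_FEET:
                self.seconds_together[(a, b)] += dt_seconds

    def current_groups(self) -> List[Tuple[str, str]]:
        return [pair for pair, secs in self.seconds_together.items()
                if secs >= MIN_SECONDS_TOGETHER]
```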
- Instructions (e.g., facial recognition algorithms, clothing recognition algorithms, movement recognition algorithms, and other types of algorithms) may be encoded in programs or code stored in tangible non-transitory computer-readable media, such as the memory 58 and/or other storage, and executed by the processor 56.
- The processor 56 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. In certain embodiments, the processor 56 may include more than one processor.
- the guest experience analysis system 24 may include processing circuitry, such as a processor 60 (e.g., general purpose processor or other processor) and a memory 62 , which may process the data relating to the guest recognition performed by the guest recognition system 22 , which may be received from the guest recognition system 22 via one or more communications interfaces 64 of the guest experience analysis system 24 , to generate the guest experience information 46 for individual guests 20 (or groups of guests 20 ), which may be displayed as augmenting information via, for example, displays 48 , 50 of the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 , respectively.
- the guest experience information 46 generated by the guest experience analysis system 24 may include recommendations for individual guests 20 (or groups of guests 20 ) based at least in part on the other guest experience information 46 for the individual guests 20 (or groups of guests 20 ). For example, if a particular guest 20 has visited a particular subset of venue attractions 16 within the venue 10 a relatively large number of times, the guest experience analysis system 24 may determine that a recommendation to visit a similar venue attraction 16 within the venue 10 should be provided to the particular guest 20 . In certain embodiments, the guest experience analysis system 24 may store the generated guest experience information in a database, for example, in the memory 62 of the guest experience analysis system 24 .
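The recommendation example above (suggesting an attraction similar to ones a guest visits often) can be expressed as a simple rule over visit counts and an attraction catalog. The sketch below is one hypothetical implementation of that rule; the data shapes are assumptions, not the patent's.

```python
from collections import Counter
from typing import Dict, Optional

def recommend_attraction(visit_counts: Dict[str, int],
                         catalog: Dict[str, str]) -> Optional[str]:
    """Suggest an attraction similar to what a guest already favors.

    visit_counts: attraction name -> number of visits for this guest.
    catalog: attraction name -> theme/category for every attraction in the venue.
    Both inputs are assumptions about how the analysis system might store data.
    """
    category_totals = Counter()
    for attraction, count in visit_counts.items():
        category_totals[catalog.get(attraction, "unknown")] += count
    if not category_totals:
        return None
    favorite_category = category_totals.most_common(1)[0][0]
    for attraction, category in catalog.items():
        if category == favorite_category and attraction not in visit_counts:
            return attraction
    return None
```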
- the guest experience information 46 may be generated based at least in part on guest activity data collected from one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 associated with the guests 20 .
- certain activity of the guests 20 within the venue 10 may be directly tracked by the one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 associated with the guests 20 , and transmitted to the one or more communications interfaces 64 of the guest experience analysis system 24 to help the guest experience analysis system 24 generate the guest experience information 46 for the guests 20 .
- The processor 60 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. In certain embodiments, the processor 60 may include more than one processor.
- the guest experience analysis system 24 may transmit the generated guest experience information 46 to the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 such that the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 may display the guest experience information 46 as augmenting information on one or more displays 48 , 50 of the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 , respectively.
- the guest experience analysis system 24 may transmit the generated guest experience information 46 to one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 such that the one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 may display the guest experience information 46 as augmenting information on displays 108 , 110 , 112 of the one or more wearable devices 30 , one or more mobile devices 32 , and/or one or more themed devices 34 , respectively, similar to the functionality of the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 described herein.
- Guests 20 of the venue 10 may be able to view at least a portion of the augmented reality views of the guest experience information 46 relating to the previous and current experiences of guests 20 within the venue 10 via one or more wearable devices 30, one or more mobile devices 32, and/or one or more themed devices 34, and may use those devices to provide inputs similar to those that may be received from users (e.g., venue employees 18) via the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28, as described in greater detail herein.
- the augmented reality display endpoints 26 may include processing circuitry, such as a processor 72 and a memory 74 .
- the processor 72 may be operatively coupled to the memory 74 to execute instructions for at least partially carrying out the presently disclosed techniques of displaying guest experience information 46 as augmenting information on the one or more displays 48 of the augmented reality display endpoints 26 , to enable users (e.g., venue employees 18 ) to enhance the experiences of the guests 20 , as described in greater detail herein.
- These instructions may be encoded in programs or code stored in tangible non-transitory computer-readable media, such as the memory 74 and/or other storage.
- the processor 72 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. In certain embodiments, the processor 72 may include more than one processor. Furthermore, in certain embodiments, the augmented reality display endpoints 26 may also receive the guest experience information 46 from the guest experience analysis system 24 via one or more communications interfaces 76 of the augmented reality display endpoints 26 .
- the one or more displays 48 of the augmented reality display endpoints 26 may each include a liquid crystal display (LCD), an organic light emitting diode (OLED) display, or other similar display configured to display the guest experience information 46 as augmenting information.
- the one or more displays 48 of the augmented reality display endpoints 26 may each include an opaque or see-through LCD or an opaque or see-through OLED display configured to allow, for example, venue employees 18 to view the guest experience information 46 appearing on the displays 48 while preserving the ability to see through the respective displays 48 to the actual and physical real-world environment of the venue 10 (e.g., to pass-through images of guests 20 to which the guest experience information relates).
- guest experience information 46 for a targeted guest 20 may be superimposed on a display 48 of the augmented reality display endpoints 26 near (e.g., adjacent or within relatively close proximity to) pass-through images of the targeted guest 20 (or targeted group of guests 20 ).
- the augmented reality display endpoints 26 may be configured to provide audio cues and/or haptic feedback indicative of the guest experience information 46 relating to guests 20 displayed via the one or more displays 48 of the augmented reality display endpoints 26 .
- the augmented reality display endpoints 26 may include one or more output devices 132 configured to generate audio cues and/or haptic feedback indicative of the guest experience information 46 relating to guests 20 displayed via the one or more displays 48 of the augmented reality display endpoints 26 .
- the one or more output devices 132 of the augmented reality display endpoints 26 may include audio speakers configured to output audio cues and/or haptic devices configured to output haptic feedback.
- the guest experience information 46 displayed on the one or more displays 48 of the augmented reality display endpoints 26 may include recommendations for targeted guests 20 (or targeted groups of guests 20 ) based at least in part on the other guest experience information 46 for the targeted guests 20 (or targeted groups of guests 20 ).
- the guest experience analysis system 24 may transmit the determined recommendation to an augmented reality display endpoint 26 , and the augmented reality display endpoint 26 may display the determined recommendation on a display 48 of the augmented reality display endpoint 26 , and may also generate an alert (e.g., flashing light proximate the determined recommendation and/or flashing text of the determined recommendation) on the display 48 , for example, to draw the attention of a venue employee 18 .
- the guest experience analysis system 24 may transmit the determined recommendation to an augmented reality display endpoint 26 , and the augmented reality display endpoint 26 may switch focus (e.g., highlighting or other indication) from a first targeted guest 20 (or group of guests 20 ) to a second targeted guest 20 (or group of guests 20 ) to which the determined recommendation relates.
- the augmented reality display endpoints 26 may include one or more input devices 78 configured to receive inputs from users (e.g., venue employees 18 ) of the augmented reality display endpoints 26 , which may be transmitted back to the guest experience analysis system 24 via the one or more communications interfaces 76 of the augmented reality display endpoints 26 .
- The one or more input devices 78 of the augmented reality display endpoints 26 may include audio sensors (e.g., microphones), cameras (e.g., to capture images for the purpose of recognizing gestures), touch screens (e.g., incorporated into the one or more displays 48), joysticks, trackballs, buttons, and/or other input devices suitable for receiving inputs from users (e.g., venue employees 18) of the augmented reality display endpoints 26.
- the one or more input devices 78 of the augmented reality display endpoints 26 may include one or more audio sensors configured to capture audio (e.g., voice commands) generated by users (e.g., venue employees 18 ), which may be processed by the processing circuitry of the augmented reality display endpoints 26 to generate information that may be transmitted to the guest experience analysis system 24 .
- the one or more input devices 78 of the augmented reality display endpoints 26 may include one or more cameras for capturing images of users (e.g., venue employees 18 ) and/or body features of users (e.g., venue employees 18 ), which may be processed by the processing circuitry of the augmented reality display endpoints 26 to recognize gestures of the users (e.g., venue employees 18 ), which may be converted into information that may be transmitted to the guest experience analysis system 24 .
- The one or more input devices 78 of the augmented reality display endpoints 26 may include one or more touch screens (e.g., incorporated into the one or more displays 48 of the augmented reality display endpoints 26) with which users (e.g., venue employees 18) may interact to enter information (for example, via a context-sensitive menu displayed via the one or more displays 48), which may be transmitted to the guest experience analysis system 24.
- the one or more input devices 78 of the augmented reality display endpoints 26 may receive modified and/or additional guest experience information 46 relating to a targeted guest 20 (or group of guests 20 ), and the modified and/or additional guest experience information 46 may be transmitted back to the guest experience analysis system 24 for analysis by the guest experience analysis system 24 .
- the one or more input devices 78 of the augmented reality display endpoints 26 may receive a command to switch focus (e.g., highlighting or other indication) from a first targeted guest 20 (or group of guests 20 ) to a second targeted guest 20 (or group of guests 20 ), and the processing circuitry of the respective augmented reality display endpoint 26 may cause the focus to be switched on a display 48 of the respective augmented reality display endpoint 26 in accordance with the command.
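Switching focus from one targeted guest to another is essentially a small piece of display state plus a redraw. The sketch below models that state under the assumption that the endpoint exposes some callback for drawing the highlight; both names are hypothetical.

```python
from typing import Callable, Iterable, Optional

class FocusController:
    """Tracks which guest (or group) is currently highlighted on a display and
    applies a switch-focus command. The render_highlight callback is an
    assumption standing in for the endpoint's actual drawing code."""

    def __init__(self, render_highlight: Callable[[str], None]) -> None:
        self.render_highlight = render_highlight
        self.focused_id: Optional[str] = None

    def switch_focus(self, new_target_id: str, visible_ids: Iterable[str]) -> bool:
        if new_target_id not in set(visible_ids):
            return False                      # target not currently on the display
        self.focused_id = new_target_id
        self.render_highlight(new_target_id)  # e.g., draw a glow around the new target
        return True
```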
- the one or more input devices 78 of the augmented reality display endpoints 26 may receive information relating to a command to implement one or more physical effects 38 via one or more physical objects 36 disposed within the venue 10 , and the command may be transmitted to the guest experience analysis system 24 , which may in turn generate a command signal to be sent to the one or more physical objects 36 to implement the one or more physical effects 38 .
- the one or more input devices 78 of the augmented reality display endpoints 26 may receive information relating to a command to implement one or more actions (e.g., physical effects 38 , information alerts, and so forth) to occur for one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 associated with one or more guests 20 , and the command may be transmitted to the guest experience analysis system 24 , which may in turn generate a command signal to be sent to the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 to implement the one or more actions.
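The command path described in the preceding paragraphs is the same in each case: an input device on an endpoint or headset captures an employee's command, the guest experience analysis system 24 receives it, and the system issues a command signal to a physical object 36 or to a guest's wearable, mobile, or themed device. A minimal relay sketch is shown below; the message fields and the transport interface are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class EffectCommand:
    """A hypothetical command message; field names are assumptions."""
    source_device_id: str   # endpoint or headset that captured the employee's input
    target_id: str          # physical object 36, or a wearable/mobile/themed device
    effect: str             # e.g., "sparks", "wind", "info_alert"

class GuestExperienceAnalysisRelay:
    """Minimal relay mirroring the described flow: an input device sends a
    command to the analysis system, which issues a command signal to the target."""

    def __init__(self, transport):
        self.transport = transport   # assumed to expose send(target_id, payload)

    def handle_command(self, command: EffectCommand) -> None:
        # Validation/authorization of the requesting employee would go here.
        self.transport.send(command.target_id,
                            {"effect": command.effect,
                             "requested_by": command.source_device_id})
```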
- the functionality of the wearable augmented reality display devices 28 described herein may be substantially similar to the functionality of the augmented reality display endpoints 26 described herein.
- the wearable augmented reality display devices 28 may include processing circuitry, such as a processor 80 and a memory 82 .
- the processor 80 may be operatively coupled to the memory 82 to execute instructions for at least partially carrying out the presently disclosed techniques of displaying guest experience information 46 as augmenting information on the one or more displays 50 of the wearable augmented reality display devices 28 , to enable users (e.g., venue employees 18 ) to enhance the experiences of the guests 20 , as described in greater detail herein.
- the processor 80 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. In certain embodiments, the processor 80 may include more than one processor.
- the wearable augmented reality display devices 28 may also receive the guest experience information 46 from the guest experience analysis system 24 via one or more communications interfaces 84 of the wearable augmented reality display devices 28 .
- the one or more displays 50 of the wearable augmented reality display devices 28 may each include a liquid crystal display (LCD), an organic light emitting diode (OLED) display, or other similar display configured to display the guest experience information 46 as augmenting information.
- the one or more displays 50 of the wearable augmented reality display devices 28 may each include an opaque or see-through LCD or an opaque or see-through OLED display configured to allow, for example, venue employees 18 to view the guest experience information 46 appearing on the displays 50 while preserving the ability to see through the respective displays 50 to the actual and physical real-world environment of the venue 10 (e.g., to pass-through images of guests 20 to which the guest experience information relates).
- guest experience information 46 for a targeted guest 20 may be superimposed on a display 50 of the wearable augmented reality display devices 28 near (e.g., adjacent or within relatively close proximity to) pass-through images of the targeted guest 20 (or targeted group of guests 20 ).
- the wearable augmented reality display devices 28 may be configured to provide audio cues and/or haptic feedback indicative of the guest experience information 46 relating to guests 20 displayed via the one or more displays 50 of the wearable augmented reality display devices 28 .
- the wearable augmented reality display devices 28 may include one or more output devices 134 configured to generate audio cues and/or haptic feedback indicative of the guest experience information 46 relating to guests 20 displayed via the one or more displays 50 of the wearable augmented reality display devices 28 .
- the one or more output devices 134 of the wearable augmented reality display devices 28 may include audio speakers configured to output audio cues and/or haptic devices configured to output haptic feedback.
- the guest experience information 46 displayed on the one or more displays 50 of the wearable augmented reality display devices 28 may include recommendations for targeted guests 20 (or targeted groups of guests 20 ) based at least in part on the other guest experience information 46 for the targeted guests 20 (or targeted groups of guests 20 ).
- the guest experience analysis system 24 may transmit the determined recommendation to a wearable augmented reality display device 28 , and the wearable augmented reality display device 28 may display the determined recommendation on a display 50 of the wearable augmented reality display device 28 , and may also generate an alert (e.g., flashing light proximate the determined recommendation and/or flashing text of the determined recommendation) on the display 50 , for example, to draw the attention of a venue employee 18 .
- the guest experience analysis system 24 may transmit the determined recommendation to a wearable augmented reality display device 28 , and the wearable augmented reality display device 28 may switch focus (e.g., highlighting or other indication) from a first targeted guest 20 (or group of guests 20 ) to a second targeted guest 20 (or group of guests 20 ) to which the determined recommendation relates.
- the wearable augmented reality display devices 28 may include one or more input devices 86 configured to receive inputs from users (e.g., venue employees 18 ) of the wearable augmented reality display devices 28 , which may be transmitted back to the guest experience analysis system 24 via the one or more communications interfaces 84 of the wearable augmented reality display devices 28 .
- The one or more input devices 86 of the wearable augmented reality display devices 28 may include audio sensors (e.g., microphones), cameras (e.g., to capture images for the purpose of recognizing gestures), trackballs, buttons, and/or other input devices suitable for receiving inputs from users (e.g., venue employees 18) of the wearable augmented reality display devices 28.
- the one or more input devices 86 of the wearable augmented reality display devices 28 may include one or more audio sensors configured to capture audio (e.g., voice commands) generated by users (e.g., venue employees 18 ), which may be processed by the processing circuitry of the wearable augmented reality display devices 28 to generate information that may be transmitted to the guest experience analysis system 24 .
- the one or more input devices 86 of the wearable augmented reality display devices 28 may include one or more cameras for capturing images of users (e.g., venue employees 18 ) and/or body features of users (e.g., venue employees 18 ), which may be processed by the processing circuitry of the wearable augmented reality display devices 28 to recognize gestures of the users (e.g., venue employees 18 ), which may be converted into information that may be transmitted to the guest experience analysis system 24 .
- the one or more input devices 86 of the wearable augmented reality display devices 28 may receive modified and/or additional guest experience information 46 relating to a targeted guest 20 (or group of guests 20 ), and the modified and/or additional guest experience information 46 may be transmitted back to the guest experience analysis system 24 for analysis by the guest experience analysis system 24 .
- the one or more input devices 86 of the wearable augmented reality display devices 28 may receive a command to switch focus (e.g., highlighting or other indication) from a first targeted guest 20 (or group of guests 20 ) to a second targeted guest 20 (or group of guests 20 ), and the processing circuitry of the respective wearable augmented reality display device 28 may cause the focus to be switched on a display 50 of the respective wearable augmented reality display device 28 in accordance with the command.
- the one or more input devices 86 of the wearable augmented reality display devices 28 may receive information relating to a command to implement one or more physical effects 38 via one or more physical objects 36 disposed within the venue 10 , and the command may be transmitted to the guest experience analysis system 24 , which may in turn generate a command signal to be sent to the one or more physical objects 36 to implement the one or more physical effects 38 .
- the one or more input devices 86 of the wearable augmented reality display devices 28 may receive information relating to a command to implement one or more actions (e.g., physical effects 38 , information alerts, and so forth) to occur for one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 associated with one or more guests 20 , and the command may be transmitted to the guest experience analysis system 24 , which may in turn generate a command signal to be sent to the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 to implement the one or more actions.
- the guest experience analysis system 24 may receive commands from the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 to implement physical effects 38 via physical objects 36 disposed within the real-world environment of the venue 10 to, for example, further enhance the experiences of guests 20 .
- the physical objects 36 may include processing circuitry, such as a processor 88 and a memory 90 .
- the processor 88 may be operatively coupled to the memory 90 to execute instructions for implementing physical effects 38 based on such commands received from the guest experience analysis system 24 via one or more communications interfaces 92 of the physical objects 36 .
- the processor 88 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. In certain embodiments, the processor 88 may include more than one processor.
- the physical effects 38 may be implemented, for example, via physical actuation mechanisms 94 that are associated with the physical objects 36 .
- the physical effects 38 may be electrical sparks emanating from the physical object 36 as generated by an electrical power source, flames emanating from the physical object 36 as generated by an ignition system, wind emanating from the physical object 36 as generated by a wind system, movement of a portion of the physical object 36 , and so forth.
- similar actions may be implemented via one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 associated with one or more guests 20 based at least in part on commands transmitted to the guest experience analysis system 24 by the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 , which may in turn generate a command signal sent to the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 to implement the one or more actions.
- certain functionality of the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 may be replicated by the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 described herein. Indeed, in certain embodiments, both venue employees 18 and/or guests 20 may be capable of experiencing at least a portion of the functionality of the augmented reality display endpoints 26 and/or the wearable augmented reality display devices 28 via the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 described herein.
- the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 may include respective processing circuitry, such as a processor 96 , 98 , 100 and a memory 102 , 104 , 106 .
- the respective processor 96 , 98 , 100 may be operatively coupled to the respective memory 102 , 104 , 106 to execute instructions for at least partially carrying out the presently disclosed techniques of displaying guest experience information 46 as augmenting information on one or more respective displays 108 , 110 , 112 of the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 , to enable users (e.g., venue employees 18 ) to enhance the experiences of the guests 20 , as described in greater detail herein.
- These instructions may be encoded in programs or code stored in tangible non-transitory computer-readable media, such as the memory 102 , 104 , 106 and/or other storage.
- the processors 96 , 98 , 100 may be general-purpose processors, system-on-chip (SoC) devices, application-specific integrated circuits (ASICs), or some other similar processor configurations. In certain embodiments, the processors 96 , 98 , 100 may include more than one processor. Furthermore, in certain embodiments, the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 may also receive the guest experience information 46 from the guest experience analysis system 24 via one or more respective communications interfaces 114 , 116 , 118 of the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 .
- the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 may be used to aid the guest recognition performed by the guest recognition system 22 .
- the guest recognition sensors 52 may include sensors that are configured to track activity of one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 associated with certain guests 20 , which may be used by the guest recognition system 22 to determine previous and current experiences of the guests 20 within the venue 10 .
- the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 may include respective sets of features 120 , 122 , 124 (e.g., geometric aspects or markings) that may be passively monitored by the guest recognition system 22 (e.g., a camera system, such as a light detection and ranging (LiDAR) system) to track activity of the guests 20 to which the particular devices 30 , 32 , 34 are associated.
- the one or more wearable devices 30 , mobile devices 32 , and/or themed devices 34 may include respective sets of input devices 126 , 128 , 130 with which guests 20 associated with a particular device 30 , 32 , 34 may interact to enter information, which may be transmitted to the guest recognition system 22 and/or the guest experience analysis system 24 .
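One plausible way to use passively monitored device features for recognition is to keep a lookup from detected markers to guests and accumulate sightings over time, as in the hypothetical sketch below; the marker names and data layout are assumptions made for illustration only.

```python
# Minimal sketch, assuming each wearable/mobile/themed device carries a distinguishable
# marker (its set of features 120/122/124) that the sensors can detect passively.
from typing import Dict, List, Optional

# Assumed association between a device marker and the guest carrying that device.
MARKER_TO_GUEST: Dict[str, str] = {
    "retroreflective-ring-07": "guest-20A",
    "themed-wand-pattern-3": "guest-20B",
}


def resolve_guest_from_marker(detected_marker: str) -> Optional[str]:
    """Return the guest associated with a passively detected device marker, if any."""
    return MARKER_TO_GUEST.get(detected_marker)


def record_sighting(detected_marker: str, location: str,
                    sightings: Dict[str, List[str]]) -> None:
    # The guest recognition system (22) can accumulate sightings over time to build up
    # the guest's activity history within the venue (10).
    guest_id = resolve_guest_from_marker(detected_marker)
    if guest_id is not None:
        sightings.setdefault(guest_id, []).append(location)


sightings: Dict[str, List[str]] = {}
record_sighting("retroreflective-ring-07", "carousel-queue", sightings)
record_sighting("retroreflective-ring-07", "gift-shop", sightings)
print(sightings)  # {'guest-20A': ['carousel-queue', 'gift-shop']}
```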
- the guest recognition system 22 of the augmented reality guest recognition system 40 includes one or more guest recognition sensors 52 configured to collect data, which may be used by the guest recognition system 22 to recognize guests 20 (or groups of guests 20 ) within the venue 10 .
- the guest experience analysis system 24 of the augmented reality guest recognition system 40 is configured to analyze information relating to the guest recognition performed by the guest recognition system 22 , and received from the guest recognition system 22 , to determine guest experience information 46 relating to previous and current experiences of the guests 20 (or groups of guests 20 ) within the venue 10 .
- this guest experience information 46 may be provided to one or more augmented reality display endpoints 26 and/or one or more wearable augmented reality display devices 28 to enable users (e.g., venue employees 18 ) to enhance future experiences of the guests 20 (or groups of guests 20 ) within the venue 10 .
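The end-to-end flow summarized above (sensor data to recognition, recognition to analysis, analysis to an augmented reality overlay) can be pictured with the following simplified Python sketch. The data shapes, the visit-count heuristic, and the recommendation strings are invented for illustration and do not reflect the actual system logic.

```python
# Illustrative end-to-end flow under assumed data shapes; not the patented implementation.
from typing import Dict, List


def recognize_guests(sensor_frames: List[dict]) -> List[str]:
    # Guest recognition system (22): reduce raw sensor data to recognized guest ids.
    return sorted({frame["guest_id"] for frame in sensor_frames if "guest_id" in frame})


def analyze_guest_experience(guest_ids: List[str],
                             history: Dict[str, dict]) -> Dict[str, dict]:
    # Guest experience analysis system (24): look up prior/current experiences and
    # attach a recommendation for each recognized guest.
    info: Dict[str, dict] = {}
    for guest_id in guest_ids:
        record = history.get(guest_id, {"visits": 0})
        info[guest_id] = {
            "visits": record["visits"],
            "recommendation": "offer fast-lane pass" if record["visits"] > 3 else "offer park map",
        }
    return info


def display_on_endpoint(guest_experience_info: Dict[str, dict]) -> None:
    # AR display endpoint (26) or wearable AR display device (28): overlay the info.
    for guest_id, info in guest_experience_info.items():
        print(f"[overlay] {guest_id}: {info['recommendation']} (visits={info['visits']})")


frames = [{"guest_id": "guest-20A"}, {"guest_id": "guest-20B"}]
history = {"guest-20A": {"visits": 5}}
display_on_endpoint(analyze_guest_experience(recognize_guests(frames), history))
```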
- FIG. 3 illustrates an exemplary portion of the augmented reality guest recognition system 40 of FIG. 2 .
- FIG. 3 illustrates a first guest 20 A, a second guest 20 B, and a venue employee 18 in relatively close proximity to guest recognition sensors 52 and an augmented reality display endpoint 26 .
- the guest recognition sensors 52 and the augmented reality display endpoint 26 are integrated into a single relatively stationary structure.
- guest recognition sensors 52 may not be integrated with augmented reality display endpoints 26 .
- FIG. 4 illustrates the guest recognition sensors 52 and the augmented reality display endpoint 26 of FIG. 3 from the point-of-view of the venue employee 18 , illustrating exemplary guest experience information 46 relating to the first guest 20 A and the second guest 20 B.
- the first guest 20 A is a currently targeted guest, indicated, for example, by a glow 136 around the pass-through image of the first guest 20 A (or another visual indication, in other embodiments) showing that the first guest 20 A is being focused on by the display 48 of the augmented reality display endpoint 26 , whereas the pass-through image of the second guest 20 B has no such visual indication.
- the focus on the first guest 20 A may be switched to the second guest 20 B based on a command received by the augmented reality display endpoint 26 from the venue employee 18 .
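A minimal sketch of this focus-switching behavior is shown below, assuming the endpoint tracks the guests currently in view and cycles the highlight in response to an employee command; the class and method names are hypothetical.

```python
# Hypothetical focus selector for the AR display endpoint (26): keeps track of which
# recognized guest currently carries the highlight (glow 136) and cycles it on command.
from typing import List, Optional


class TargetSelector:
    def __init__(self, guests_in_view: List[str]) -> None:
        self.guests_in_view = guests_in_view
        self.focused: Optional[str] = guests_in_view[0] if guests_in_view else None

    def focus_next(self) -> Optional[str]:
        """Move the highlight to the next guest in view, wrapping around at the end."""
        if not self.guests_in_view or self.focused is None:
            return None
        index = self.guests_in_view.index(self.focused)
        self.focused = self.guests_in_view[(index + 1) % len(self.guests_in_view)]
        return self.focused


selector = TargetSelector(["guest-20A", "guest-20B"])
print(selector.focused)       # guest-20A is highlighted first
print(selector.focus_next())  # an employee command moves the highlight to guest-20B
```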
- FIG. 5 illustrates wearable augmented reality display devices 28 (e.g., augmented reality glasses, augmented reality goggles, other augmented reality headgear, and so forth), which may include functionality substantially similar to the functionality of the augmented reality display endpoints 26 described herein.
- FIG. 6 is a flow diagram of a method 138 of use of the augmented reality guest recognition system 40 described herein.
- the method 138 may include recognizing, via the guest recognition system 22 , one or more guests 20 in the venue (block 140 ).
- recognizing the one or more guests 20 may include utilizing, via the guest recognition system 22 , facial recognition algorithms, clothing recognition algorithms, or movement recognition algorithms to detect identities of the one or more guests 20 , to detect activity of the one or more guests 20 within the venue 10 , or some combination thereof.
- the method 138 may include generating, via the guest experience analysis system 24 , guest experience information 46 (e.g., including recommendations) relating to the recognized one or more guests 20 (block 142 ).
- the method 138 may include transmitting, via the guest experience analysis system 24 , the guest experience information 46 (e.g., including recommendations) relating to the recognized one or more guests 20 to an augmented reality display device (e.g., one or more augmented reality display endpoints 26 and/or one or more wearable augmented reality display devices 28 ) for display on an augmented reality display (e.g., displays 48 , 50 ) of the one or more augmented reality display endpoints 26 and/or the one or more wearable augmented reality display devices 28 (block 144 ).
- the method 138 may also include identifying, via the guest recognition system 22 , one or more groups of guests 20 based at least in part on an amount of time that individual guests 20 of the one or more guests 20 remain in proximity with each other.
- the method 138 may also include generating, via the guest experience analysis system 24 , guest group experience information 46 (e.g., including recommendations) relating to the identified one or more groups of guests 20 , and transmitting, via the guest experience analysis system 24 , the guest group experience information 46 (e.g., including recommendations) relating to the identified one or more groups of guests 20 to an augmented reality display device (e.g., one or more augmented reality display endpoints 26 and/or one or more wearable augmented reality display devices 28 ) for display on an augmented reality display (e.g., displays 48 , 50 ) of the one or more augmented reality display endpoints 26 and/or the one or more wearable augmented reality display devices 28 .
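One simple way to realize the proximity-based grouping step is sketched below; the 30-minute threshold, the pairwise proximity input, and the merge logic are assumptions chosen only to illustrate the idea.

```python
# Sketch of identifying guest groups from time spent in proximity; threshold and data
# layout are illustrative assumptions, not values from the patent text.
from typing import Dict, List, Set, Tuple

GROUP_THRESHOLD_MINUTES = 30.0  # assumed cutoff for treating guests as a group


def identify_groups(proximity_minutes: Dict[Tuple[str, str], float]) -> List[Set[str]]:
    """Merge guests into groups when their pairwise proximity time exceeds the threshold."""
    groups: List[Set[str]] = []
    for (guest_a, guest_b), minutes in proximity_minutes.items():
        if minutes < GROUP_THRESHOLD_MINUTES:
            continue
        # Merge into an existing group if either guest already belongs to one.
        for group in groups:
            if guest_a in group or guest_b in group:
                group.update({guest_a, guest_b})
                break
        else:
            groups.append({guest_a, guest_b})
    return groups


observed = {
    ("guest-20A", "guest-20B"): 42.0,  # together long enough to count as a group
    ("guest-20A", "guest-20C"): 5.0,   # brief overlap only
}
print(identify_groups(observed))  # one group containing guest-20A and guest-20B
```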
- the method 138 may also include receiving, via the guest experience analysis system 24 , modified and/or additional guest experience information 46 (e.g., including recommendations) relating to the recognized one or more guests 20 from one or more augmented reality display endpoints 26 and/or one or more wearable augmented reality display devices 28 .
- the method 138 may also include receiving, via the guest experience analysis system 24 , a command from an augmented reality display endpoint 26 or a wearable augmented reality display device 28 to implement one or more physical effects 38 via one or more physical objects 36 disposed within the venue 10 , and transmitting, via the guest experience analysis system 24 , a control signal to the one or more physical objects 36 to implement the one or more physical effects 38 .
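To round out the method, the following sketch shows one possible receiving side of this last step: the analysis system validates an incoming physical-effect command from an augmented reality display device and forwards a control signal to the targeted physical object. The handler shape, validation rule, and names are illustrative assumptions.

```python
# Hypothetical server-side handler: the guest experience analysis system (24) receives
# a physical-effect request and emits a control signal to the named physical object (36).
from typing import Callable, Dict, Set


def make_effect_dispatcher(send_control_signal: Callable[[str, str], None],
                           known_objects: Dict[str, Set[str]]) -> Callable[[dict], bool]:
    """Return a handler that validates an incoming command and emits a control signal."""

    def handle(command: dict) -> bool:
        object_id = command.get("object_id")
        effect = command.get("effect")
        # Only forward effects that the targeted physical object actually supports.
        if object_id in known_objects and effect in known_objects[object_id]:
            send_control_signal(object_id, effect)
            return True
        return False

    return handle


dispatch = make_effect_dispatcher(
    send_control_signal=lambda obj, eff: print(f"control signal -> {obj}: {eff}"),
    known_objects={"volcano-prop": {"flames", "sparks"}},
)
dispatch({"object_id": "volcano-prop", "effect": "flames"})  # prints the forwarded signal
```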
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- Strategic Management (AREA)
- Development Economics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Marketing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Entrepreneurship & Innovation (AREA)
- Game Theory and Decision Science (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Data Mining & Analysis (AREA)
- Optics & Photonics (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- User Interface Of Digital Computer (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Processing Or Creating Images (AREA)
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/180,551 US12229800B2 (en) | 2020-02-27 | 2021-02-19 | Augmented reality guest recognition systems and methods |
CA3166372A CA3166372A1 (en) | 2020-02-27 | 2021-02-25 | Augmented reality guest recognition systems and methods |
JP2022551705A JP7646692B2 (en) | 2020-02-27 | 2021-02-25 | Augmented reality guest recognition system and method |
EP21713501.1A EP4111402A1 (en) | 2020-02-27 | 2021-02-25 | Augmented reality guest recognition systems and methods |
KR1020227033398A KR20220145893A (en) | 2020-02-27 | 2021-02-25 | Augmented Reality Guest Recognition System and Method |
CN202180017143.1A CN115136174A (en) | 2020-02-27 | 2021-02-25 | Augmented reality customer identification system and method |
PCT/US2021/019594 WO2021173789A1 (en) | 2020-02-27 | 2021-02-25 | Augmented reality guest recognition systems and methods |
US19/055,211 US20250191025A1 (en) | 2020-02-27 | 2025-02-17 | Augmented reality guest recognition systems and methods |
JP2025034478A JP2025096277A (en) | 2020-02-27 | 2025-03-05 | Augmented reality guest recognition system and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202062982528P | 2020-02-27 | 2020-02-27 | |
US17/180,551 US12229800B2 (en) | 2020-02-27 | 2021-02-19 | Augmented reality guest recognition systems and methods |
Related Child Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US19/055,211 Continuation US20250191025A1 (en) | 2020-02-27 | 2025-02-17 | Augmented reality guest recognition systems and methods |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210271881A1 (en) | 2021-09-02 |
US12229800B2 (en) | 2025-02-18 |
Family
ID=77462852
Family Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/180,551 Active 2041-07-01 US12229800B2 (en) | 2020-02-27 | 2021-02-19 | Augmented reality guest recognition systems and methods |
US19/055,211 Pending US20250191025A1 (en) | 2020-02-27 | 2025-02-17 | Augmented reality guest recognition systems and methods |
Family Applications After (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US19/055,211 Pending US20250191025A1 (en) | 2020-02-27 | 2025-02-17 | Augmented reality guest recognition systems and methods |
Country Status (7)
Country | Link |
---|---|
US (2) | US12229800B2 (en) |
EP (1) | EP4111402A1 (en) |
JP (2) | JP7646692B2 (en) |
KR (1) | KR20220145893A (en) |
CN (1) | CN115136174A (en) |
CA (1) | CA3166372A1 (en) |
WO (1) | WO2021173789A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105988776B (en) * | 2015-01-27 | 2019-11-26 | 阿里巴巴集团控股有限公司 | Release processing method and processing device |
WO2023220310A1 (en) * | 2022-05-11 | 2023-11-16 | Universal City Studios Llc | Guest-specific artificial intelligence entity systems and methods |
KR20250005499A (en) * | 2022-05-11 | 2025-01-09 | 유니버셜 시티 스튜디오스 엘엘씨 | Ride vehicle artificial intelligence entity system and method |
WO2025128734A1 (en) * | 2023-12-11 | 2025-06-19 | Universal City Studios Llc | System and method for temporary device pairing |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100130296A1 (en) | 2008-11-24 | 2010-05-27 | Disney Enterprises, Inc. | System and method for providing an augmented reality experience |
US20130018661A1 (en) * | 2011-07-11 | 2013-01-17 | Disney Enterprises, Inc. | Guest experience management system and method |
US20130083003A1 (en) * | 2011-09-30 | 2013-04-04 | Kathryn Stone Perez | Personal audio/visual system |
US8438110B2 (en) | 2011-03-08 | 2013-05-07 | Bank Of America Corporation | Conducting financial transactions based on identification of individuals in an augmented reality environment |
US20130147686A1 (en) * | 2011-12-12 | 2013-06-13 | John Clavin | Connecting Head Mounted Displays To External Displays And Other Communication Networks |
US20140192085A1 (en) | 2013-01-04 | 2014-07-10 | Lg Electronics Inc. | Head mounted display and method for controlling the same |
KR20140136288A (en) | 2013-05-20 | 2014-11-28 | 노영희 | Thema park video making service pproviding system and method by user's participation |
US9028325B2 (en) | 2011-02-23 | 2015-05-12 | Disney Enterprises, Inc. | Number of players determined using facial recognition |
US20150178558A1 (en) * | 2012-08-23 | 2015-06-25 | Sony Corporation | Control system, control method and computer program product |
US20150294322A1 (en) | 2014-04-11 | 2015-10-15 | Bank Of America Corporation | Security-monitoring implementing customer recognition via an augmented reality display |
US20160026253A1 (en) * | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US20170351909A1 (en) | 2016-06-03 | 2017-12-07 | Magic Leap, Inc. | Augmented reality identity verification |
US20180040044A1 (en) | 2016-08-04 | 2018-02-08 | Wal-Mart Stores, Inc. | Vector-based characterizations of products and individuals with respect to personal partialities |
US20180129984A1 (en) * | 2016-11-09 | 2018-05-10 | Universal City Studios Llc | Virtual queuing techniques |
US20180250589A1 (en) | 2017-03-06 | 2018-09-06 | Universal City Studios Llc | Mixed reality viewer system and method |
US20180284453A1 (en) * | 2017-04-03 | 2018-10-04 | Walmart Apollo, Llc | Customer interaction system |
US20180350171A1 (en) * | 2017-06-02 | 2018-12-06 | Hospitality Engagement Corporation | Method and systems for event entry with facial recognition |
WO2019082687A1 (en) | 2017-10-27 | 2019-05-02 | ソニー株式会社 | Information processing device, information processing method, program, and information processing system |
US20190206132A1 (en) | 2018-01-04 | 2019-07-04 | Universal City Studios Llc | Systems and methods for textual overlay in an amusement park environment |
US20190258313A1 (en) * | 2016-11-07 | 2019-08-22 | Changchun Ruixinboguan Technology Development Co., Ltd. | Systems and methods for interaction with an application |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5248273B2 (en) * | 2008-11-12 | 2013-07-31 | 株式会社三共 | Store system |
JP6596452B2 (en) * | 2017-01-23 | 2019-10-23 | ティフォン インコーポレーテッド | Display device, display method and display program thereof, and entertainment facility |
2021
- 2021-02-19 US US17/180,551 patent/US12229800B2/en active Active
- 2021-02-25 JP JP2022551705A patent/JP7646692B2/en active Active
- 2021-02-25 CN CN202180017143.1A patent/CN115136174A/en active Pending
- 2021-02-25 EP EP21713501.1A patent/EP4111402A1/en active Pending
- 2021-02-25 WO PCT/US2021/019594 patent/WO2021173789A1/en active Application Filing
- 2021-02-25 CA CA3166372A patent/CA3166372A1/en active Pending
- 2021-02-25 KR KR1020227033398A patent/KR20220145893A/en active Pending
2025
- 2025-02-17 US US19/055,211 patent/US20250191025A1/en active Pending
- 2025-03-05 JP JP2025034478A patent/JP2025096277A/en active Pending
Patent Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100130296A1 (en) | 2008-11-24 | 2010-05-27 | Disney Enterprises, Inc. | System and method for providing an augmented reality experience |
US9028325B2 (en) | 2011-02-23 | 2015-05-12 | Disney Enterprises, Inc. | Number of players determined using facial recognition |
US8438110B2 (en) | 2011-03-08 | 2013-05-07 | Bank Of America Corporation | Conducting financial transactions based on identification of individuals in an augmented reality environment |
US20130018661A1 (en) * | 2011-07-11 | 2013-01-17 | Disney Enterprises, Inc. | Guest experience management system and method |
US20130083003A1 (en) * | 2011-09-30 | 2013-04-04 | Kathryn Stone Perez | Personal audio/visual system |
US20130147686A1 (en) * | 2011-12-12 | 2013-06-13 | John Clavin | Connecting Head Mounted Displays To External Displays And Other Communication Networks |
US20150178558A1 (en) * | 2012-08-23 | 2015-06-25 | Sony Corporation | Control system, control method and computer program product |
US20140192085A1 (en) | 2013-01-04 | 2014-07-10 | Lg Electronics Inc. | Head mounted display and method for controlling the same |
KR20140136288A (en) | 2013-05-20 | 2014-11-28 | 노영희 | Thema park video making service pproviding system and method by user's participation |
US20160026253A1 (en) * | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US20150294322A1 (en) | 2014-04-11 | 2015-10-15 | Bank Of America Corporation | Security-monitoring implementing customer recognition via an augmented reality display |
US20170351909A1 (en) | 2016-06-03 | 2017-12-07 | Magic Leap, Inc. | Augmented reality identity verification |
US20180040044A1 (en) | 2016-08-04 | 2018-02-08 | Wal-Mart Stores, Inc. | Vector-based characterizations of products and individuals with respect to personal partialities |
US20190258313A1 (en) * | 2016-11-07 | 2019-08-22 | Changchun Ruixinboguan Technology Development Co., Ltd. | Systems and methods for interaction with an application |
US20180129984A1 (en) * | 2016-11-09 | 2018-05-10 | Universal City Studios Llc | Virtual queuing techniques |
US20180250589A1 (en) | 2017-03-06 | 2018-09-06 | Universal City Studios Llc | Mixed reality viewer system and method |
US20180255285A1 (en) * | 2017-03-06 | 2018-09-06 | Universal City Studios Llc | Systems and methods for layered virtual features in an amusement park environment |
US20180284453A1 (en) * | 2017-04-03 | 2018-10-04 | Walmart Apollo, Llc | Customer interaction system |
US20180350171A1 (en) * | 2017-06-02 | 2018-12-06 | Hospitality Engagement Corporation | Method and systems for event entry with facial recognition |
WO2019082687A1 (en) | 2017-10-27 | 2019-05-02 | ソニー株式会社 | Information processing device, information processing method, program, and information processing system |
US20190206132A1 (en) | 2018-01-04 | 2019-07-04 | Universal City Studios Llc | Systems and methods for textual overlay in an amusement park environment |
Non-Patent Citations (6)
Title |
---|
AE Office Action for United Arab Emirates Application No. P6001652/22 mailed Nov. 23, 2024. |
Ellis, Cat, "How augmented reality could transform the lives of people with face blindness," TechRadar, Oct. 16, 2018, 12 pgs., https://www.techradar.com/news/how-augmented reality-could-transform-the-lives-of-people-with-face-blindness. |
JP Office Action for Japanese Application No. 2022-551705 mailed Sep. 2, 2024. |
Morozova, Anastasia, "Commercial use Cases of AR Face Recognition and Facial Tracking Apps," Jasoren, 15 pgs, retrieved Feb. 19, 2021, https://jasoren.com/commercial-use-cases-of-ar-face-recognition-and-facial-tracking-apps/. |
PCT/US2021/019594 International Search Report and Written Opinion mailed Jun. 11, 2021. |
Starner, Thad et al., "Augmented Reality Through Wearable Computing," The Media Laboratory, Massachusetts Institute of Technology, 24 pgs. |
Also Published As
Publication number | Publication date |
---|---|
JP7646692B2 (en) | 2025-03-17 |
US20210271881A1 (en) | 2021-09-02 |
KR20220145893A (en) | 2022-10-31 |
US20250191025A1 (en) | 2025-06-12 |
JP2023515988A (en) | 2023-04-17 |
CN115136174A (en) | 2022-09-30 |
EP4111402A1 (en) | 2023-01-04 |
JP2025096277A (en) | 2025-06-26 |
WO2021173789A1 (en) | 2021-09-02 |
CA3166372A1 (en) | 2021-09-02 |
Similar Documents
Publication | Title |
---|---|
US12229800B2 (en) | Augmented reality guest recognition systems and methods |
US12112012B2 (en) | User-customized location based content presentation |
US10474336B2 (en) | Providing a user experience with virtual reality content and user-selected, real world objects |
US10269180B2 (en) | Information processing apparatus and information processing method, display apparatus and display method, and information processing system |
JP7736269B2 (en) | Amusement park system and method |
KR101894573B1 (en) | Smart phone interface management system by 3D digital actor |
Cucchiara et al. | Visions for augmented cultural heritage experience |
JP2020518929A (en) | Feedback on emotion-based experiences |
JPWO2018142756A1 (en) | Information processing apparatus and information processing method |
WO2016209437A1 (en) | Facilitating media play and real-time interaction with smart physical objects |
CN109074679A (en) | The Instant Ads based on scene strengthened with augmented reality |
CN116841436A (en) | Video-based interaction method, apparatus, device, storage medium, and program product |
HK40081741A (en) | Augmented reality guest recognition systems and methods |
JP6857537B2 (en) | Information processing device |
KR20210116838A (en) | Electronic device and operating method for processing a voice input based on a gesture |
Legal Events
Code | Title | Description |
---|---|---|
FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
AS | Assignment | Owner name: UNIVERSAL CITY STUDIOS LLC, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TRAYNOR, MARK JAMES;ALTER, DAVID JOHN-LOUIS;HANLEY, KYLE PATRICK;AND OTHERS;SIGNING DATES FROM 20210119 TO 20210218;REEL/FRAME:055356/0627 |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
ZAAB | Notice of allowance mailed | Free format text: ORIGINAL CODE: MN/=. |
STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |