WO2024034053A1 - Information processing device and information processing method - Google Patents
Information processing device and information processing method
- Publication number
- WO2024034053A1 (application PCT/JP2022/030568)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- user
- walking
- information processing
- processing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Definitions
- the present invention relates to an information processing device and an information processing method that display virtual objects together with objects in real space.
- Augmented Reality (AR) is a technology that visually augments the real world by superimposing virtual objects, in a virtual space made of digital information created with CG (Computer Graphics) and the like, onto the real world.
- In recent years, information processing devices that allow the user to easily recognize virtual objects while recognizing objects in real space have become widespread.
- Examples include a head mounted display (HMD) that is worn on the head, and AR glasses, which are glasses-type digital devices that are a form of HMD.
- In HMDs and AR glasses, images of real objects in real space captured by a camera are displayed together with virtual objects on a display section installed in front of the eyes (the so-called video see-through type), or virtual objects are displayed on a transparent display section while the objects in front of the eyes remain directly visible (the so-called optical see-through type).
- a virtual object is generated from digital information consisting of various types of display items such as text characters, icons, and moving images, and is visualized by being displayed in front of the user's eyes.
- Patent Document 1 describes a method for "displaying display items determined according to environmental information or user information, using a change in the user's behavior as a trigger."
- Patent Document 1 discloses an information processing apparatus including a display control section that controls the presence or absence of the display item on the display section based on a priority order.
- Patent Document 1 shows that the influence on the user's field of vision is reduced by controlling whether or not display items, which are virtual objects, are displayed while walking. However, display items may then not be displayed at all while walking, which poses the problem that it becomes difficult for the user to use the AR technology at all times.
- The present invention takes the above-mentioned problems into consideration and provides an information processing device and an information processing method that display display items, as virtual objects, within the visual field display range in an easy-to-see display mode together with the scenery in front of the user while walking.
- To this end, the present invention is an information processing apparatus worn by a user, which includes a display processing device that displays an augmented reality object within a visual field display range, a walking detection sensor that detects the walking state of the user, and a processor.
- The processor determines whether the user is walking based on sensor information from the walking detection sensor and, when determining that the user is walking, executes display control to enlarge the augmented reality object and display the enlarged augmented reality object within the visual field display range.
- The present invention is also an information processing apparatus worn by a user, which includes a display processing device that displays an augmented reality object within a visual field display range, a walking detection sensor that detects the walking state of the user, and a processor.
- The visual field display range includes a central area including the center point of the visual field display range and a peripheral area located at the outer edge of the central area. The processor determines whether the user is walking based on sensor information from the walking detection sensor and, if it determines that the user is walking, executes display control to move the augmented reality object displayed in the central area to the peripheral area.
- The present invention is also an information processing method executed by an information processing device worn by a user, the method including: a walking determination step of determining whether the user is walking based on sensor information from a walking detection sensor that detects the walking state of the user; an enlarging step of enlarging the augmented reality object, when it is determined that the user is walking, compared to when the user is stationary; and a display step of displaying the enlarged augmented reality object on a display processing device.
- FIG. 1 is an example of a diagram schematically illustrating the appearance of an embodiment of the information processing apparatus according to the present embodiment.
- FIG. 2 is an example of a diagram illustrating the front landscape screen and the visual field display range when the user is stationary in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 3 is an example of a diagram illustrating a case where all display items are displayed in an enlarged manner when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 4 is an example of a diagram illustrating a case where display items in the peripheral area are greatly enlarged and displayed when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 5 is an example of a diagram illustrating a case where display items in the peripheral area are greatly enlarged and display items in the central area are slightly enlarged when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 6A is an example of a diagram illustrating a case where a display item that has moved to the peripheral area is displayed greatly enlarged when walking diagonally to the left in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 6B is an example of a diagram illustrating a case where a display item that has moved to the peripheral area is displayed greatly enlarged when walking diagonally to the right in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 7A is an example of a diagram illustrating a case where the user turns his or her head to the left side while walking forward in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 7B is another example of a diagram illustrating a case where the user turns his or her head to the left side while walking forward in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 7C is an example of a diagram illustrating a case where the user turns his or her head toward the upper left while walking diagonally to the right in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 8A is an example of a diagram illustrating the relationship between the front landscape screen, the visual field display range, and the central area, first peripheral area, and second peripheral area when the user is stationary in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 8B is an example of a diagram illustrating a case where a display item in the central area is moved to the first peripheral area when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 8C is an example of a diagram illustrating a case where a display item in the central area is moved to the first peripheral area and enlarged when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 8D is an example of a diagram illustrating a case where a display item in the central area is moved to the second peripheral area and enlarged when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 9A is an example of a diagram showing the relationship between walking speed and the display size of display items in the information processing apparatus according to the embodiment described in FIG. 1.
- FIG. 9B is an example of a diagram showing the relationship between walking speed and the movement destination of a display item in the central area in the information processing apparatus according to the embodiment described in FIG. 1.
- An example of a diagram illustrating a case where an operation menu is displayed in a stationary state in the information processing apparatus according to the embodiment described in FIG. 1.
- An example of a flowchart (first half) explaining the operation of the information processing apparatus according to the present embodiment.
- An example of a flowchart (second half) explaining the operation of the information processing apparatus according to the present embodiment.
- A block diagram showing an example of the configuration of AR glasses as an example of the information processing device according to the present embodiment.
- An example of a diagram illustrating a case where the screen of a smartphone or a smart watch is displayed as a display item when the user is stationary in the information processing apparatus according to the embodiment described in FIG. 1.
- An example of a diagram illustrating a case where the screen of a smartphone or a smart watch is displayed as a display item when the user walks in the information processing apparatus according to the embodiment described in FIG. 1.
- The present invention can provide an information processing device and an information processing method that contribute to both improving the visibility of augmented reality objects while walking and maintaining the visibility of the scenery ahead. It can therefore be expected to contribute to increasing economic productivity through diversification, technological upgrading, and innovation, mainly in industries that increase the value of products and services and in labor-intensive industries, as advocated in target 8.2 of the Sustainable Development Goals (SDGs) adopted by the United Nations.
- FIG. 1 is an example of a diagram schematically explaining the external appearance of an embodiment of the information processing apparatus according to the present embodiment, and the subsequent figures are examples of diagrams explaining the display screen within the visual field display range.
- AR glasses 100 are attached to the head of a user 10. The AR glasses 100 are equipped with a left-eye line-of-sight sensor 101 and a right-eye line-of-sight sensor 102 that detect the line of sight of the left and right eyes of the user 10, a camera 103 that photographs the outside world, and an acceleration sensor 104 that detects acceleration, that is, the change in speed per unit time.
- In the case of the optical see-through type, the user 10 directly views real objects within the field of view around the front with his or her own eyes, while in the case of the video see-through type, the camera 103 photographs real objects within the field of view around the front and the image of those objects is shown on the display.
- the AR glasses 100 generate various types of display items such as text characters, icons, and images as augmented reality objects, and display them by placing them within the visual display range of the user 10.
- FIG. 1 shows a case where display items 111, 112, and 113, represented by text characters, icons, and the like such as "The train to Tokyo is delayed. It will rain tonight", "** Teacher", and "Turn right at the station 8 m ahead", are virtually arranged in a real space consisting of real objects such as buildings, roads, and passersby.
- the field of view display range is a range in which the display items 111, 112, and 113 of real objects and virtual objects are displayed in an overlapping manner through the AR glasses 100.
- the user's line of sight may be detected from the movement of the user's eyeballs, and the user's visual field may be set as the visual field display range.
- the visual field display range may be an area that includes the visual field when the user is looking at a far distance in front of the user, as well as the visual field when the user moves the eyeballs up and down and left and right.
- The AR glasses 100 photograph the front and the left and right feet 11 and 12 of the user 10 using the camera 103 included in the AR glasses 100, capture the user's walking state from the photographed feet, and acquire user walking information such as the walking motion, walking direction, and walking speed, as well as the relative positional relationship between the feet and body of the user 10 and the AR glasses 100.
- The AR glasses 100 measure acceleration in three-dimensional directions using the acceleration sensor 104 included in the AR glasses 100, and analyze and detect movement information such as the position, direction, tilt, and vibration of the head of the user wearing the AR glasses 100.
- the AR glasses 100 can capture the user's walking state from the analyzed and detected movement information of the user's head, and can acquire user walking information such as walking motion, walking direction, and walking speed.
- In this way, the camera 103 that photographs the user's feet and the acceleration sensor 104 that measures the three-dimensional acceleration of the user's head constitute a walking detection sensor that detects user walking information such as the walking motion, walking direction, and walking speed of the user wearing the AR glasses 100.
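As an illustration of the walking-detection role described above, the following is a minimal sketch (not taken from the publication; the function name, thresholds, and window handling are assumptions) of how accelerometer samples alone could be used to decide whether the wearer is walking, by counting peaks in acceleration magnitude within an analysis window:

```python
import math

# Hypothetical walking detector: the publication combines a foot-facing camera
# with a 3-axis accelerometer; this sketch uses only the accelerometer,
# counting peaks of the normalised acceleration magnitude in a sample window.
def is_walking(samples, g=9.8, peak_thresh=1.3, min_steps=2):
    """samples: list of (ax, ay, az) tuples in m/s^2 over an analysis window."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) / g for ax, ay, az in samples]
    peaks = 0
    for i in range(1, len(mags) - 1):
        # a local maximum above the threshold counts as one step impact
        if mags[i] > peak_thresh and mags[i] >= mags[i - 1] and mags[i] > mags[i + 1]:
            peaks += 1
    return peaks >= min_steps
```

A real implementation would fuse this with the foot observations from the camera 103 and additionally estimate the walking direction and speed.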
- The AR glasses 100 also include a head movement detection sensor that detects movements of the head of the user wearing the AR glasses 100, using the acceleration sensor 104 that measures the acceleration of the user's head in three-dimensional directions.
- The line of sight of the user 10 can be detected by the left-eye line-of-sight sensor 101 and the right-eye line-of-sight sensor 102.
- FIG. 2 schematically shows the front scenery 201, the visual field display range 202 of the AR glasses 100, and the user 10 wearing the AR glasses 100 when the user 10 wearing the AR glasses 100 is stationary in the state shown in FIG. 1.
- When the user is stationary, the display items 111, 112, and 113 can be visually recognized as display items 211, 212, and 213 within the visual field display range without becoming difficult to read, even if their display size is small.
- Movements of the head in the up, down, left, and right directions can be grasped by the head movement detection sensor.
- FIG. 3 shows an example of a display when the user 10 starts walking from the resting state shown in FIG. 2.
- When the walking detection sensor, which analyzes the output of the camera 103 and the acceleration sensor 104, determines that the user 10 is walking, the display items 211, 212, and 213, which are virtual objects, are enlarged and displayed as display items 311, 312, and 313, as shown in FIG. 3.
- the user can easily see and read the display item that is displayed in a large size, and the visibility of the display item, which is a virtual object, can be greatly improved.
- The display item may be enlarged by enlarging the display item as a whole, or by enlarging the characters or icons within the display item. If enlarging the display item would occlude too large an area of the field of view, the amount of displayed information may be reduced by limiting the number of characters, or the character spacing may be narrowed.
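The enlargement-with-truncation rule described above can be sketched as follows; the scale factor, area limit, and per-character width are illustrative assumptions, not values from the publication:

```python
# Hypothetical sketch: enlarge a display item while walking, but if the
# enlarged footprint would occlude too much of the view, keep the original
# footprint and limit the number of characters shown instead.
def enlarge_item(text, base_w, base_h, scale=1.5, max_area=20000, char_w=10):
    w, h = base_w * scale, base_h * scale
    if w * h > max_area:
        w, h = base_w, base_h            # fall back to the original footprint
        max_chars = int(base_w // char_w)
        if len(text) > max_chars:        # limit the number of characters shown
            text = text[:max_chars - 1] + "…"
    return text, w, h
```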
- A second embodiment will be described using FIG. 4. Components having the same features as those in FIGS. 2 and 3 are given the same numbers, and detailed explanations are omitted.
- While walking, the user's line of sight is directed mainly toward the central region 250 of the visual field display range 202 rather than the periphery, in consideration of safety ahead in the walking direction and at the feet. Therefore, within the visual field display range 202, the central region 250 has relatively good user visibility, but display items in the peripheral region 260 are more difficult to see than display items in the central region 250.
- In this embodiment, the display items 212 and 213 of FIG. 2 in the central area 250 of the visual field display range 202 are left as they are, and the display item 211 of FIG. 2 in the peripheral area 260 of the visual field display range 202 is enlarged and displayed as display item 311, as in FIG. 3.
- This improves the visibility of display items in the peripheral area 260, which the user's line of sight reaches less easily than the central area 250 of the visual field display range 202 while walking, and makes it possible to visually recognize them on a par with the display items in the central area 250.
- As for the central region 250 and the peripheral region 260, a predetermined range including the center point of the visual field display range 202 is defined as the central region 250, and the region around the central region 250 is defined as the peripheral region 260.
- The above-mentioned "predetermined range" may be determined in advance, or its width may be changed depending on the walking state. For example, if the walking speed is relatively fast, the "predetermined range" may be made relatively narrow, and if the walking speed is relatively slow, the "predetermined range" may be made relatively wide.
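One possible reading of this speed-dependent "predetermined range" is sketched below; the radii, speed bound, and linear interpolation are assumptions for illustration only:

```python
# Hypothetical sketch: the central region is a disc around the view centre
# whose radius shrinks linearly as walking speed rises.
def central_radius(speed_mps, r_slow=0.4, r_fast=0.2, v_max=2.0):
    """Radius as a fraction of the half-width of the visual field display range."""
    t = min(max(speed_mps / v_max, 0.0), 1.0)
    return r_slow + (r_fast - r_slow) * t

def region_of(item_xy, speed_mps):
    """Classify an item position (normalised to [-1, 1]) as central or peripheral."""
    x, y = item_xy
    r = (x * x + y * y) ** 0.5
    return "central" if r <= central_radius(speed_mps) else "peripheral"
```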
- If the display items in the central area 250 were enlarged, the large display items 312 and 313 would largely block the front view while walking, and the user might find it difficult to see.
- Therefore, it is effective to keep the size of the display items in the central area 250 unchanged to minimize the influence on the front view, while enlarging the display items in the peripheral area 260 to improve their visibility.
- A third embodiment will be described using FIG. 5. Components having the same features as those in FIGS. 2, 3, and 4 are given the same numbers, and detailed explanations are omitted.
- The difference from the embodiments of FIGS. 3 and 4 lies in the display size of the display items 212 and 213 of FIG. 2 located in the central area 250 within the visual field display range 202: in FIG. 3 they are greatly enlarged as display items 312 and 313, in FIG. 4 they are not enlarged, and in FIG. 5 they are slightly enlarged.
- Although FIG. 5 shows two-stage switching of the display item size between the central area and the peripheral area, the size may also be switched in multiple stages from the center to the periphery within the visual field display range, or the display size may be controlled smoothly. Multi-stage and smooth display size control can further reduce the occlusion of the scenery while ensuring visibility.
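The smooth display-size control suggested above could, for example, interpolate the enlargement factor continuously with distance from the center of the visual field display range; the end-point factors below are assumed values:

```python
# Hypothetical sketch: the enlargement factor grows continuously with radial
# distance from the view centre instead of switching between two sizes.
def scale_for_position(x, y, s_center=1.0, s_edge=1.6):
    """x, y normalised to [-1, 1] within the visual field display range."""
    r = min((x * x + y * y) ** 0.5, 1.0)   # clamp to the edge of the range
    return s_center + (s_edge - s_center) * r
```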
- FIG. 6A shows the visual field display range 602 of the AR glasses 100 and the display state of display items when the user walks diagonally to the left, and FIG. 6B shows the visual field display range 604 of the AR glasses 100 and the display state of display items when the user walks diagonally to the right.
- The parts shown in FIG. 4 and given the same reference numerals operate as already explained in FIG. 4, so their detailed explanation is partially omitted.
- As shown in FIG. 6A, when walking with the head turned diagonally to the left as indicated by the arrow 601, the visual field display range of the AR glasses 100 moves to the left by an amount corresponding to the diagonally-left walking direction, becoming the visual field display range 602.
- The positions of the central area 650 and peripheral area 660 within the visual field display range 602 also change accordingly. Therefore, the display item 213 of FIGS. 2 and 4, which was in the central area 250 of the visual field display range 202 when standing still or walking forward (for example, FIG. 5), moves to the peripheral area 660 within the visual field display range 602, which the user's line of sight reaches less easily.
- The display item 213, which has come to be located in the peripheral area 660 when walking diagonally to the left, is greatly enlarged and displayed as display item 613. That is, when walking diagonally to the left, display items that have moved to the peripheral area 660 can be greatly enlarged and displayed, thereby improving their visibility.
- FIG. 6B shows the case of walking diagonally to the right.
- The visual field display range of the AR glasses 100 moves to the right by an amount corresponding to the diagonally-right walking direction, becoming the visual field display range 604.
- The positions of the central area 651 and peripheral area 661 within the visual field display range 604 also change. Therefore, the display item 212, which has come to be located in the peripheral area 661 when walking diagonally to the right, is greatly enlarged and displayed as display item 612. This makes it possible to improve the visibility of display items that have moved to the peripheral area 661 when walking diagonally to the right.
- As shown in FIGS. 6A and 6B, just as when standing still or walking forward, the display item 211 in the peripheral areas 660 and 661 of the visual field display ranges 602 and 604 is greatly enlarged and displayed as display item 311, improving the visibility of this display item.
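The diagonal-walking behaviour of FIGS. 6A and 6B amounts to shifting the center of the central area in the walking direction and reclassifying the display items against the shifted center. A hedged sketch, in which the shift gain and radius are assumptions:

```python
# Hypothetical sketch: shift the centre of the central area horizontally in
# the walking direction, then reclassify each item against the new centre.
def classify_with_walk_direction(items, walk_dx, radius=0.4, gain=0.5):
    """items: {name: (x, y)} in normalised view coordinates; walk_dx in [-1, 1]
    is the horizontal walking direction (negative = diagonally left)."""
    cx = gain * walk_dx                  # shifted centre of the central area
    out = {}
    for name, (x, y) in items.items():
        r = ((x - cx) ** 2 + y ** 2) ** 0.5
        out[name] = "central" if r <= radius else "peripheral"
    return out
```

With a leftward walk (`walk_dx=-1`), an item that sat at the old center is reclassified as peripheral, mirroring how display item 213 moves to the peripheral area 660 in FIG. 6A.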
- FIG. 7A is an example in which the user 10 walks forward and turns his or her head to the left. The visual field display range has moved to the left, from the visual field display range 202 in FIG. 4 to the visual field display range 701. Therefore, the display item 713 in the peripheral area 760 of the visual field display range 701 is displayed at a larger display size than the display item 213 in the central area 250 of the visual field display range 202 in FIG. 4.
- Since the display item 212 was in the central area 250 of the visual field display range 202 in FIG. 4 and remains in the central area 750 of the visual field display range 701 in FIG. 7A, its display size does not change.
- FIG. 7A shows an example in which the display is controlled as if the user 10 had stopped and only turned his or her head to the left; however, another display control may be performed if the user changes the direction of the head while walking. FIG. 7B is such an example.
- When the user 10 turns the head left, right, up, or down while walking, the user has become interested in something and turned the head, for example, to the left. At this time, the user's intention to view, read, and visually recognize display items located in the right peripheral area 760, on the side opposite to the movement direction of the central area 750 within the visual field display range 701, can be said to be very low. Therefore, there is considered to be little need to intentionally enlarge and display a display item that has moved to the peripheral area 760 within the visual field display range 701 as a result of the head turn.
- If the display state of display items, such as display size and display position, changes according to the movement of the head, the user's attention may be drawn to the change, and the user may be distracted from the direction in which the head is turned.
- Moreover, if the display state of display items changes whenever the head moves, the screen within the field of view becomes noisy even though the scenery in the field of view does not change, which adversely affects the user's visual recognition.
- Therefore, in FIG. 7B, by maintaining the central area 250 and the display state of the display items (that is, the same central area and display state as in FIG. 4), changes in the display state of the display items are suppressed.
- When the user 10 turns his or her head to the left with respect to the front direction of walking, the visual field display range moves from the visual field display range 202 of the embodiment shown in FIG. 4 to the visual field display range 701.
- While the display item 713 is displayed enlarged in FIG. 7A, the display item 213 is displayed at the same display size as in FIG. 4 in FIG. 7B.
- Since the display size is not changed while the walking direction is maintained, a stable display can be provided without drawing unnecessary attention from the user due to a change in display size.
- FIG. 7C shows an example in which the user 10 turns his head toward the upper left while walking diagonally to the right.
- The visual field display range moves from the visual field display range 604 toward the upper center of the front scenery 201, resulting in the visual field display range 702 shown in FIG. 7C.
- display items 311, 612, and 213 in the peripheral area 761 of the visual field display range 702 may be displayed in an enlarged manner.
- In FIG. 7C, when the head is turned away from the diagonal-right walking direction, the display sizes of FIG. 6B are maintained, and only the visual field display range is moved from the visual field display range 604 of FIG. 6B to the visual field display range 702 of FIG. 7C. Therefore, the display items 311 and 612, which were located in the peripheral area 661 within the visual field display range 604 when the head was facing the walking direction, are displayed enlarged, and the display item 213, which was located in the central area 651 of the visual field display range 604 in FIG. 6B, is not enlarged.
- The criterion in FIGS. 7B and 7C is the front visual field display range when the head is facing the walking direction.
- In FIGS. 4, 5, 6A, and 6B, the head is facing the walking direction, so the actual visual field display range and the front visual field display range match.
- the user may be able to switch whether the display size determination criterion is the actual visual field display range or the front visual field display range.
- Basically, the front visual field display range is used as the criterion; if there is a mismatch, for example during "crab walking" (moving the body sideways while facing forward), automatic switching to the actual visual field display range as the criterion may be performed.
- the predetermined time may be controlled to become shorter as the predetermined angle becomes larger.
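The angle-dependent switching time described above could be realized as in the following sketch, where the wait before switching to the actual visual field display range shrinks as the angle between head and walking direction grows; all constants and names are assumptions:

```python
# Hypothetical sketch: use the front (walking-direction) visual field range by
# default, and switch to the actual head range only after the head/walk angle
# has exceeded a base angle for long enough; larger angles switch sooner.
def reference_range(angle_deg, held_s, base_angle=30.0, base_time=2.0):
    if angle_deg < base_angle:
        return "front"
    required = base_time * base_angle / angle_deg  # shorter wait for larger angles
    return "actual" if held_s >= required else "front"
```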
- Next, an information processing device that divides the peripheral area within the visual field display range into multiple peripheral areas and, while walking, moves the display items in the central area of the visual field display range to one of the divided peripheral areas and displays them enlarged will be described.
- the display item display operation will be explained using FIGS. 8A, 8B, 8C, and 8D.
- the parts shown in FIGS. 3 and 4 and given the same reference numerals have the same operations as those already described, so detailed explanation thereof will be omitted.
- As an example of dividing the peripheral area, FIG. 8A shows a central area 850, which is the area enclosed by the broken-line frame 801 closest to the center; a first peripheral area 860, which is the area outside the broken-line frame 801 and inside the broken-line frame 802; and a second peripheral area 870, which is the area outside the broken-line frame 802 and inside the visual field display range 202.
- the display item 212 is displayed in the central area 850, the display item 813 is displayed in the first peripheral area 860, and the display item 211 is displayed in the second peripheral area 870.
- In FIG. 8B, the display item 212 that was in the central area 850 of FIG. 8A is moved to the first peripheral area 860 and displayed as display item 812, thereby ensuring visibility in front of the user while walking.
- The display item 813 in the first peripheral area 860 is displayed without being enlarged, as in FIG. 8A, and the display item 211 in the second peripheral area 870 of FIG. 8A is enlarged and displayed as display item 511. In this way, while the visibility of the scenery in the walking direction is ensured, the surrounding display items are enlarged to improve their visibility.
- FIG. 8C is an example in which the display items located in the first peripheral area 860 from FIG. 8B are displayed in an enlarged manner to ensure visibility in front of the user while walking and to improve visibility of the display items.
- the display item 212 that was in the central area 850 of FIG. 8A is moved to the first peripheral area 860 and further enlarged to the display item 822.
- the display item 813 that was displayed in the first peripheral area 860 in FIG. 8A is enlarged to the display item 823, and the display item 211 that was displayed in the second peripheral area 870 in FIG. 8A is enlarged to the display item 511.
- By making the display items 822 and 823 in the first peripheral area 860 larger than the display item 511 in the second peripheral area 870, a balance is struck between the visibility of the scenery in the walking direction and the visibility of large display items nearby.
- FIG. 8D is an example in which the display item 212 in the central area 850 of FIG. 8A is moved to the second peripheral area 870 and further enlarged to the display item 832.
- This is an example in which the display item 832 is displayed in the same size as the display item 511 displayed in the second peripheral area 870.
- Thus, the size of the display item may be changed in stages depending on the peripheral area of the movement destination, thereby balancing the visibility of the scenery in the walking direction with the visibility of large display items nearby.
- FIG. 9A is an example of a diagram showing the relationship between walking speed and display size of display items.
- the faster the walking speed the harder it becomes to see and read the display items displayed as virtual objects within the visual display range, leading to a decrease in the visibility of the display items.
- Therefore, the faster the walking speed of the user 10, the more the display items within the visual field display range are enlarged and displayed, thereby improving their visibility.
- the enlarged display size of the display item may be gradually increased according to the walking speed of the user 10 when walking.
- Alternatively, the enlarged display size of the display item may be switched in multiple stages according to the walking speed of the user 10. When switched in multiple stages, the display size does not change even if the walking speed fluctuates slightly within a given stage, which has the effect of allowing the user 10 to view the display item stably.
- With multi-stage switching, the display tends to become unstable when the walking speed hovers near a switching threshold, so it is preferable to provide hysteresis, as shown by the broken line 1003, to prevent the display size from switching back and forth near a threshold.
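- The hysteresis indicated by the broken line 1003 can be sketched as a stateful quantizer whose stage changes only when the walking speed clears a threshold by a margin. The thresholds and margin below are illustrative assumptions:

```python
class HysteresisStage:
    """Map walking speed (km/h) to a discrete display-size stage,
    with hysteresis so the stage does not chatter near a threshold."""

    def __init__(self, thresholds=(1.0, 3.0, 5.0), margin=0.3):
        self.thresholds = thresholds  # stage boundaries, illustrative
        self.margin = margin          # hysteresis width, illustrative
        self.stage = 0

    def update(self, speed: float) -> int:
        # move up only when clearly above the next boundary
        while (self.stage < len(self.thresholds)
               and speed > self.thresholds[self.stage] + self.margin):
            self.stage += 1
        # move down only when clearly below the previous boundary
        while (self.stage > 0
               and speed < self.thresholds[self.stage - 1] - self.margin):
            self.stage -= 1
        return self.stage
```

A speed oscillating around 1.0 km/h (for example between 0.8 and 1.2) then never changes the stage, which is exactly the stabilizing effect the hysteresis is meant to provide.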
- FIG. 9B is an example of a diagram showing the relationship between walking speed and the movement destination of display items in the central area.
- As shown in FIG. 8C, when the user's walking speed is slow, the display item in the central area 850 is moved to the first peripheral area 860; when the user's walking speed is fast, the display item in the central area 850 is moved to the second peripheral area 870, as shown in FIG. 8D. The faster the walking speed, the more a display item near the center of the user's field of vision tends to obstruct the user's view of the scenery ahead.
- By moving display items according to walking speed as in FIG. 9B, it is possible to reduce the visual obstruction of the scenery ahead caused by display items as the walking speed of the user 10 increases.
- The switching in FIG. 9B is also preferably provided with hysteresis to prevent display items from moving back and forth near a threshold.
- The display position of the display item may also be switched in multiple stages according to the walking speed of the user 10, that is, the movement destination may be subdivided more finely than the movement among the central area 850, the first peripheral area 860, and the second peripheral area 870. When the display position is switched in multiple stages, it does not change even if the walking speed fluctuates slightly within a given stage, which has the effect of allowing the user 10 to view the display item stably.
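- As a sketch of the relationship in FIG. 9B, the movement destination of a central-area display item might be chosen from the walking speed as follows; the speed bands are illustrative assumptions:

```python
def destination_area(speed_kmh: float) -> str:
    """Choose the movement destination of a central-area display item
    from the walking speed (illustrative speed bands)."""
    if speed_kmh < 1.0:         # effectively standing still
        return "central"
    if speed_kmh < 4.0:         # slow walking -> FIG. 8C behaviour
        return "first_peripheral"
    return "second_peripheral"  # fast walking -> FIG. 8D behaviour
```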
- The size of the central area 850 of the visual field display range may also be changed depending on the walking speed or moving speed.
- the size of this central region 850 may also be controlled in the same way as the walking speed vs. display size in FIG. 9A.
- For example, the central area 850 may be a trapezoidal area extending about 3 m ahead from the shoulder width of the user 10 with an opening of about 10 degrees to the left and right; when the user is walking at 5 km/h it may be widened to about 5 m ahead, and when riding a bicycle at 10 km/h the forward distance may be increased to about 10 m, depending on the moving speed.
- the front left and right opening angles may be widened depending on the traveling speed.
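- The speed-dependent widening of the central area 850 can be sketched by interpolating between the example values given above (about 3 m at rest, about 5 m at 5 km/h, about 10 m at 10 km/h); the linear interpolation between those points is an assumption for illustration:

```python
def central_area_depth(speed_kmh: float) -> float:
    """Forward depth (m) of the trapezoidal central area 850.
    Breakpoints follow the examples in the text; the interpolation
    between them is an assumption."""
    points = [(0.0, 3.0), (5.0, 5.0), (10.0, 10.0)]  # (km/h, m)
    if speed_kmh >= points[-1][0]:
        return points[-1][1]
    for (s0, d0), (s1, d1) in zip(points, points[1:]):
        if speed_kmh <= s1:
            return d0 + (d1 - d0) * (speed_kmh - s0) / (s1 - s0)
    return points[-1][1]
```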
- FIG. 10 is an example of a case where the operation menu 1101 is displayed within the visual display range when the user is stationary.
- the parts shown in FIG. 2 and given the same reference numerals have the same operations as those already explained in FIG. 2, so a detailed explanation thereof will be partially omitted.
- An operation menu 1101 for setting and inputting the display state, such as the display size and display position of display items, is displayed within the visual field display range 202. The user 10 sets and inputs the desired display size and display position of display items from the operation menu 1101, and when the user 10 walks, the display size and display position of display items are controlled according to the setting input information set and input in this way.
- By operating the operation menu 1101 with the hand or line of sight, the user 10 can set and input the display size, display position, and other settings of display items that are optimal for the user 10.
- The input setting operation using the line of sight is performed by detecting the line of sight of the user 10 with the left eye line of sight sensor 101 and the right eye line of sight sensor 102 and directing the line of sight to an input setting button position on the operation menu 1101. Thus, the left eye line of sight sensor 101 and the right eye line of sight sensor 102 constitute a line of sight input device that selects items on the operation menu 1101.
- Manual input setting operations are also possible by pointing a hand at an input setting button position on the operation menu 1101. For example, if a user 10 with poor eyesight sets the display size in the operation menu 1101 so that display items are enlarged beyond the standard size, the visibility of display items when walking can be improved.
- the optimal display position that matches the visual acuity of the user 10 may be set and input in the operation menu 1101, and the display item may be moved from the central area to the peripheral area of the visual field display range 202.
- In this way, the visibility of the front scenery through the central area vacated by the movement, and of the display item itself, can be improved.
- When the user is walking, input setting operations using the line of sight may be restricted by erasing the display of the operation menu 1101 or by disabling line-of-sight input operations on the operation menu 1101. This prevents the user 10 from directing the line of sight toward the operation menu 1101, and allows the user 10 to concentrate on the front scenery or on the display items, which are virtual objects, within the visual field display range 202.
- The display of the operation menu 1101, including operation menus other than those for input settings, may be suppressed when the walking speed exceeds a predetermined speed, such as 1 km/h, so that the user's attention is not diverted from the walking direction by line-of-sight input while walking.
- the user 10 wears the AR glasses 100 and is about to start walking. Before the user starts walking, that is, in a stationary state (S1201), a display item is displayed within the visual display range (S1202: Yes). In this state, the processor 1320 determines whether the user is walking based on sensor information from the walking detection sensor (S1203: walking determination step).
- When walking is detected (S1203: Yes), the AR glasses 100 control the display size of the display item, the display position of the display item according to its position within the visual field display range, or a combination thereof (S1204). For example, display items located in the peripheral area within the visual field display range may be enlarged and displayed (corresponding to an enlargement step), or display items located in the central area of the visual field display range may be moved to the peripheral area (a display position movement step) and displayed (a display step).
- the AR glasses 100 maintain the controlled display size and display position of the display item for the time being, as shown in FIG. 11B. (S1206).
- Although step S1208 is shown as being executed following step S1207, steps S1207 and S1208 may be performed in the reverse order or processed in parallel, and the configuration may be such that the process proceeds to step S1209 when an affirmative determination is made in either step.
- Even when the user stops walking, the display state of the display item during walking is maintained for the time being, and after a predetermined control maintenance period or upon an input operation by the user, the display state is returned to the original display state of the display item at rest.
- This reduces undesirable situations in which the display state, consisting of the display size, the display position, or a combination thereof, changes suddenly the moment the user stops, making the display item difficult to see or read and giving the user a sense of strangeness or abruptness.
- After the predetermined control maintenance period has elapsed, the display state of the display item may be returned gradually, at a gentle pace, to the original display state at rest; in this case, it becomes possible to further reduce the sense of strangeness and abruptness felt by the user.
- Further, when the user repeatedly stops and resumes walking, the display state of display items used during the previous walking may be maintained as is. This makes it possible to suppress frequent changes in display size and display position, and to reduce or eliminate difficulty in viewing display items.
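- The maintain-then-revert behavior after walking stops can be sketched as a small state machine driven by a timer (corresponding to the timer 1345) and user input; the 5-second maintenance period and the state names are illustrative assumptions:

```python
import time

class DisplayStateController:
    """Sketch of the flow around S1206-S1209: when walking stops, keep
    the walking-time display state for a control maintenance period,
    then revert to the at-rest state (or revert early on user input)."""

    def __init__(self, maintenance_period_s=5.0, clock=time.monotonic):
        self.period = maintenance_period_s  # illustrative value
        self.clock = clock
        self.state = "rest"                 # "rest" / "walking" / "maintaining"
        self._stopped_at = None

    def on_walk(self):
        self.state = "walking"

    def on_stop(self):
        if self.state == "walking":
            self.state = "maintaining"
            self._stopped_at = self.clock()

    def on_user_input(self):
        # the user may revert to the at-rest display state immediately
        if self.state == "maintaining":
            self.state = "rest"

    def tick(self):
        # revert once the control maintenance period has elapsed
        if (self.state == "maintaining"
                and self.clock() - self._stopped_at >= self.period):
            self.state = "rest"
        return self.state
```

Injecting the clock makes the period testable without real waiting, and mirrors how the timer 1345 measures the elapsed period after a stop is detected.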
- FIG. 12 is a block diagram illustrating a configuration example of AR glasses, which are a form of HMD, as an example of the information processing device according to the present embodiment.
- the parts shown in FIG. 1 etc. and given the same reference numerals have the same operations as those already explained in FIG. 1 etc., so a detailed explanation thereof will be partially omitted.
- The AR glasses 100 include a left eye line of sight sensor 101, a right eye line of sight sensor 102, a camera 103, an acceleration sensor 104, a geomagnetic sensor 1304, a gyro sensor 1305, a ranging sensor 1306, an operation input interface 1307, a display processing device 1308, a processor 1320, a storage 1330 that stores programs 1331 and information data 1332, a vibrator 1341, a microphone 1342, a speaker 1343, a communication device 1344, a timer 1345, and a RAM 1325. The components are interconnected via a bus 1350.
- the camera 103 is for photographing the field of view around the front of the camera, and converts the light incident from the lens into an electrical signal using an image sensor to obtain a camera-captured image.
- The camera 103 photographs not only the scenery in front of the user 10 but also the feet of the user 10. The walking state of the user 10 can therefore be analyzed from the photographed state of the feet, and the camera is used as a walking detection sensor that detects the user's walking motion, walking direction, and walking speed. Note that by photographing the user's feet with the camera, it is possible to analyze not only walking but also foot movement while riding a bicycle or kickboard, making it possible to control the display state, such as enlarging display items, in those cases as well.
- The camera 103 also photographs real objects in the real space, i.e., the scenery in front of the user, and the captured image of the real objects is displayed by the display processing device 1308.
- the left-eye line-of-sight sensor 101 and the right-eye line-of-sight sensor 102 are capable of detecting the movement and direction of the left eye and right eye, respectively, and capturing the line of sight of the user 10.
- the process of detecting the movement of the eyeballs can be performed by using a well-known technique that is generally used as an eye tracking process.
- For example, a well-known method may be used in which the eye is irradiated with light from an infrared LED (Light Emitting Diode) and the line of sight is detected from the reflection captured by an infrared camera.
- the acceleration sensor 104 is a sensor that detects acceleration, which is a change in speed per unit time, and can detect movements, vibrations, shocks, etc.
- Since the AR glasses 100 are worn on the head of the user 10, movement information such as rotation, tilt, and vibration of the head of the user 10 can be analyzed and acquired from the acceleration data detected by the acceleration sensor 104 provided in the AR glasses 100.
- The acceleration sensor 104 is used as a head movement detection sensor that detects the movement of the user's head, and can detect the state in which the user 10 turns the head left, right, up, down, and so on. Note that the relative relationship between the head and the body (the orientation of the feet) cannot be directly detected with only an acceleration sensor attached to the head, so it may be obtained by attaching an acceleration sensor to the body, or by photographing the body and feet with the camera built into the AR glasses 100.
- sensor information from the head movement detection sensor is analyzed to detect the movement of the user's head, and the walking direction and walking speed of the user wearing the AR glasses 100 are also detected. Therefore, it can also be used as a walking detection sensor.
- Although the AR glasses 100 alone can only detect an average walking direction and walking speed, that level of accuracy is sufficient here.
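- As a rough sketch of how such an average walking cadence might be estimated from head-mounted acceleration data, upward crossings of the mean vertical acceleration can be counted; the method and the sample values below are assumptions for illustration, not part of the embodiment:

```python
def estimate_cadence(accel_z, sample_rate_hz):
    """Estimate steps per second from a vertical-acceleration trace by
    counting upward crossings of the mean (a simple peak-rate proxy)."""
    mean = sum(accel_z) / len(accel_z)
    crossings = sum(
        1 for a, b in zip(accel_z, accel_z[1:]) if a < mean <= b
    )
    duration_s = len(accel_z) / sample_rate_hz
    return crossings / duration_s
```

A real implementation would filter the signal first; this sketch only illustrates why an average cadence, rather than an instantaneous one, is what head-mounted sensing readily provides.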
- the gyro sensor 1305 is a sensor that detects the angular velocity in the rotational direction, and can capture the state of vertical, horizontal, and diagonal postures.
- the acceleration sensor 104 and gyro sensor 1305 of a smartphone or smart watch placed in the user's pocket or the like may be used together to improve accuracy.
- the geomagnetic sensor 1304 is a sensor that detects the earth's magnetic force, and detects the direction in which the AR glasses 100 are facing. It is also possible to detect the movement of the AR glasses 100 by using a three-axis type device that detects geomagnetism in the vertical direction as well as in the front-back and left-right directions, and by capturing changes in the geomagnetic field with respect to the movement of the AR glasses 100. As a result, it is possible to more accurately analyze and acquire the movement of the head of the user wearing the AR glasses 100 and the walking state of the user.
- The ranging sensor 1306 is a sensor that can measure the distance and angle to an object and capture the shape of an object three-dimensionally.
- For example, a LiDAR (Light Detection and Ranging) sensor, a TOF (Time of Flight) sensor that measures distance by measuring, for each pixel, the reflection time of irradiated pulsed light, a millimeter wave radar that emits millimeter wave radio waves and captures the reflected waves to measure the distance to the reflecting object and detect its state, or the like is used.
- The ranging sensor 1306 measures the distance and direction to a real object in the scenery ahead, and based on this measurement information it becomes possible, for example, to virtually place display items related to the real object in the space near that object.
- the processor 1320 is composed of a CPU and the like.
- the storage 1330 is composed of a non-volatile storage device, etc., and stores various programs 1331 and information data 1332 handled by the processor 1320 and the like. As the information data 1332, display item information 1335, walking information 1336, head movement information 1337, input setting information 1338, etc. are stored.
- The processor 1320 loads the operating system (OS) 1333 and the application program 1334 for operation control stored in the storage 1330 into the RAM 1325 and executes them, thereby configuring a walking state detection processing unit 1321, a head movement detection processing unit 1322, a display item display form control processing unit 1323, and an operation menu processing unit 1324, and realizing functions such as the OS, middleware, and applications.
- The display processing device 1308 includes a projection device that projects display items and notification information to the user, and a transmissive half mirror that forms an image of the projected display items and the like in front of the user's eyes.
- Thereby, the user 10 can visually recognize the imaged display item as if it were floating, together with the real objects in the field of view ahead.
- Alternatively, the display processing device 1308 may be a display such as a liquid crystal panel that displays the real objects in front of the eyes photographed by the camera 103 combined with display items. In this case, the user 10 can visually recognize the real objects in the forward field-of-view image and the display items in an overlapping manner.
- the operation input interface 1307 is an input means using, for example, a line of sight, a hand, a pointer, a keyboard, a key button, a touch key, etc., and is used to set and input information that the user 10 wants to input.
- For example, an input operation screen such as the operation menu 1101 may be displayed on the display screen of the display processing device 1308, and input operation information may be captured from the position on the input operation screen to which the line of sight is directed, or by operating a hand or pointer on the input operation screen via the operation input interface 1307.
- In the case of input using a keyboard, key buttons, touch keys, or the like, the input device may be provided on the AR glasses 100 in a position and form that makes input operations easy for the user 10, or it may be separate from the main body of the AR glasses 100 and connected by wire or wirelessly.
- the user 10 may utter a voice indicating an input operation, and the microphone 1342 may collect the sound to capture the input operation information.
- The microphone 1342 collects external sounds and the user's own utterances and converts them into audio data consisting of digital data using an A/D converter (not shown). This allows spoken instruction information from the user 10 to be taken into the AR glasses 100, and operations corresponding to the instruction information to be executed easily.
- The speaker 1343 outputs audio based on audio data and can convey notification information to the user 10 by voice. For example, information such as the walking direction of the user 10, the direction in which the user has turned the head, and the enlargement or positional movement of display items can be announced by voice.
- the vibrator 1341 generates vibrations under the control of the processor 1320, and converts the notification instruction information sent by the AR glasses 100 to the user 10 into vibrations.
- the vibrator 1341 transmits vibrations to the user's head wearing the AR glasses 100 to notify the user 10 of the notification instruction information, thereby improving usability.
- the communication device 1344 is a communication interface that performs wireless communication with other devices using short-range wireless communication such as Bluetooth (registered trademark) or ZigBee (registered trademark).
- the communication device 1344 includes a communication processing circuit, an antenna, etc. corresponding to various predetermined communication interfaces, and transmits and receives various information, control signals, and the like. Note that a telephone communication network may also be included.
- The timer 1345 measures time, and measures the elapse of the predetermined control maintenance period after a stop of walking is detected.
- the walking state detection processing unit 1321 analyzes the walking state of the user 10 from the image taken by the camera 103 of the feet, and detects the walking direction and walking speed. Furthermore, the walking state of the user 10 is analyzed from the head movement information detected by the acceleration sensor 104, and the walking direction and walking speed are detected. In addition to the acceleration sensor 104, a geomagnetic sensor 1304 and a gyro sensor 1305 may also be used.
- the head movement detection processing unit 1322 detects the movement of the user's 10 head using the acceleration sensor 104, and analyzes and obtains the state in which the user 10 turns his or her head left, right, up, down, etc.
- a geomagnetic sensor 1304 and a gyro sensor 1305 may also be used.
- The display item display form control processing unit 1323 controls the display size of the display item, the display position of the display item according to its position as a virtual object within the visual field display range 202, 602, 604, 701, 702, or a combination thereof.
- For example, display items located in the peripheral areas 660, 760, 761, 860, 870 within the visual field display range may be enlarged and displayed, or display items located in the central areas 650, 750, 751, 850 of the visual field display range may be moved to the peripheral areas 660, 760, 761, 860, 870 and displayed in an enlarged manner.
- The operation menu processing unit 1324 displays an operation menu 1101 for inputting display conditions, such as the display size and display position of display items, and operation instructions for the AR glasses 100, and the information that the user 10 wants to input is set and input with the line of sight, hands, or the like.
- When the walking state detection processing unit 1321 detects walking of the user 10, the display item display form control processing unit 1323 can enlarge and display the display items generated and displayed within the user's visual field display range 202, 602, 604, 701, 702. In particular, the display items in the peripheral areas 660, 760, 761, 860, 870 can be enlarged and displayed larger than the enlarged display items in the central areas 650, 750, 751, 850.
- When the walking state detection processing unit 1321 detects walking of the user 10 and the head movement detection processing unit 1322 detects movement of the user's head, the display item display form control processing unit 1323 can maintain the display size and display position of the display item from before the head movement occurred, and display the display item in the maintained display state.
- The display item display form control processing unit 1323 can also move a display item in the central area 650, 750, 751, 850 of the visual field display range to a peripheral area outside the central area and display it in an enlarged manner.
- The visual field display range 202 is divided into the central area 850, a first peripheral area 860 that adjoins the outside of the central area 850 and forms part of the periphery, and a second peripheral area 870 that adjoins the outside of the first peripheral area 860 and forms part of the periphery.
- In this case, the display item display form control processing unit 1323 can move a display item in the central area 850 to the first peripheral area 860 and display it at a slightly enlarged or unchanged display size, or move a display item in the central area 850 to the second peripheral area 870 and display it at a greatly enlarged display size.
- display items in the central area 850 can also be moved outside the visual field display range 202.
- The walking state detection processing unit 1321 detects the walking speed of the user 10 while walking, and the display item display form control processing unit 1323 can control the display size of display items, their display position, or a combination thereof according to the detected walking speed of the user 10. In this case, the display size or display position of the display item, or a combination thereof, can also be switched and controlled in multiple stages according to the user's walking speed.
- The operation menu processing unit 1324 sets and inputs information that the user 10 wants to input through gaze or hand operations on the displayed operation menu 1101, and the display item display form control processing unit 1323 can control the display size and display position of display items according to this input setting information.
- Further, when walking of the user 10 is detected, the operation menu processing unit 1324 can be stopped, and input operations to the operation menu using the user's line of sight can be prohibited.
- Even when the walking state detection processing unit 1321 detects that the user has stopped walking, the display item display form control processing unit 1323 maintains the display size and display position of the display item, and after the predetermined control maintenance period or upon an input operation by the user, can return the maintained display size and display position of the display item during walking to the original display size and display position of the display item at rest.
- a screen transmitted from a smartphone or smart watch may be displayed on the AR glasses 100 as a display item.
- For example, when the smartphone screen 1401 and the smart watch screen 1402 are displayed as display items in the peripheral area 260 within the visual field display range 202 and the user starts walking, each screen is enlarged to a larger display size and displayed within the visual field display range 202 as the smartphone screen 1411 and the smart watch screen 1412, as shown in FIG. 13B.
- the parts shown in FIGS. 2 and 4 and labeled with the same reference numerals have the same operations as those already described, so a detailed explanation thereof will be partially omitted.
- In this case, the AR glasses 100 may enlarge the screen transmitted from the smartphone or smart watch and display it, or may detect that the user 10 has started walking, request an enlarged screen from the smartphone or smart watch, receive the enlarged screen, and display it on the AR glasses 100.
- In the latter case, the smartphone or smart watch needs a function for receiving an enlarged-screen instruction and transmitting the enlarged screen, and it is preferable that the AR glasses 100 confirm in advance, through authentication, that the device has this function.
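- The handshake in which the AR glasses 100 request an enlarged screen from a companion device can be sketched as follows; the class and method names are hypothetical, and only the capability requirement and the advance check come from the text:

```python
class CompanionDevice:
    """Hypothetical smartphone/smart watch that can transmit an
    enlarged version of its screen on request."""

    def __init__(self, supports_enlarged_screen: bool):
        self.supports_enlarged_screen = supports_enlarged_screen

    def request_enlarged_screen(self, scale: float) -> str:
        # stand-in for transmitting an actual enlarged screen image
        return f"screen@{scale}x"

class ARGlasses:
    def __init__(self, companion: CompanionDevice):
        # confirm the capability in advance, as the text suggests
        if not companion.supports_enlarged_screen:
            raise ValueError("companion cannot transmit enlarged screens")
        self.companion = companion

    def on_walking_started(self) -> str:
        # request, receive, and display the enlarged screen
        return self.companion.request_enlarged_screen(scale=1.5)
```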
- Further, the text characters of display items may be made bold in bright places to increase their apparent brightness, and made thin in dark places to reduce glare.
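- The brightness-dependent choice of text weight can be sketched as a simple thresholding of ambient illuminance; the lux thresholds below are illustrative assumptions:

```python
def text_weight(ambient_lux: float) -> str:
    """Choose a text weight for display items from ambient brightness:
    bold in bright places, thin in dark places (illustrative thresholds)."""
    if ambient_lux > 1000.0:  # bright outdoor scene
        return "bold"
    if ambient_lux < 50.0:    # dark scene: thin strokes reduce glare
        return "thin"
    return "regular"
```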
- In the above, the AR glasses 100, which are a form of HMD, have been described as a specific example of the information processing device, but the invention is applicable to any device having similar functions, and it goes without saying that similar actions and effects can be obtained with any information processing device that displays display items as virtual objects in the user's field of view so that they are visually recognized together with the scenery ahead.
- the present invention is not limited to the embodiments described above, and includes various modifications.
- the above-described embodiments have been described in detail to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to having all the configurations described.
- each of the above-mentioned configurations, functions, processing units, processing means, etc. may be partially or entirely realized in hardware by, for example, designing an integrated circuit, a general-purpose processor, or a specific-purpose processor.
- a processor includes transistors and other circuits and is considered a circuit or processing circuit.
- each of the configurations, functions, etc. described above may be realized by software by a processor interpreting and executing programs for realizing the respective functions.
- Information such as programs, tables, and files that realize each function may be stored in a memory, a recording device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD, or may be stored in a device on a communication network.
- the control lines and information lines are shown to be necessary for explanation purposes, and not all control lines and information lines are necessarily shown in the product. In reality, almost all components may be considered to be interconnected.
- the embodiment includes the following aspects.
- An information processing device worn on the user's head that has a function of allowing the user to see the field of view in front of the user and generating and displaying display items such as text characters and icons as virtual objects.
- the device is equipped with a walking detection sensor that detects walking conditions such as the user's walking motion, walking direction, and walking speed,
- the information processing apparatus is characterized in that when the walking detection sensor detects the user's walking, the display item generated and displayed within the user's visual display range is enlarged and displayed.
- In the information processing device described above, when the walking detection sensor detects the user's walking, display items in the peripheral area within the visual field display range are enlarged and displayed.
- In the information processing device described above, when the walking detection sensor detects the user's walking, display items in the peripheral area within the visual field display range are enlarged and displayed larger than the enlarged display items in the central area of the visual field display range.
- The information processing device described above includes a head movement detection sensor that detects movement of the head of the user wearing the information processing device, and if the head movement detection sensor detects movement of the user's head while the walking detection sensor detects the user's walking, the display state, such as the display size, of the display item from before the head movement was detected is maintained as it is, and the display item is displayed in that maintained display state.
- The information processing device described above detects the walking speed of the user with the walking detection sensor, and controls the display size of the enlarged display item according to the detected walking speed of the user.
- The information processing device described above switches the display size of the enlarged display item in multiple stages according to the walking speed of the user detected by the walking detection sensor.
- The information processing device described above includes an operation menu through which the user sets and inputs the display state, such as the display size and display position, of display items during walking, and controls the display size of display items during walking according to the setting input information input through the operation menu.
- The information processing device described above includes an operation menu and a line-of-sight input device for selecting the operation menu, and when the walking detection sensor detects the user's walking, input operations to the operation menu by the user's line of sight are restricted.
- The information processing device described above maintains the display size of the display item during walking even if the walking detection sensor detects that the user has stopped walking, and after a predetermined control maintenance period after stopping or upon an input operation by the user, returns the maintained display size of the display item during walking to the original display size of the display item at rest.
- An information processing device worn on the user's head that has a function of allowing the user to view the field of view in front of the user and of generating and displaying display items such as text characters and icons as virtual objects, the device being equipped with a walking detection sensor that detects the user's walking state, such as walking motion, walking direction, and walking speed, wherein when the walking detection sensor detects that the user is walking, a display item located in the central area of the visual field display range is moved to the outside of the central area of the visual field display range.
- When the user is walking, the information processing device moves a display item located in the central area of the visual field display range to a peripheral area within the visual field display range and displays it in an enlarged manner.
- The peripheral area within the visual field display range is divided into a first peripheral area adjacent to the outside of the central area and a second peripheral area adjacent to the outside of the first peripheral area, and when the walking detection sensor detects that the user is walking, a display item moved to the second peripheral area is enlarged at a larger magnification ratio than a display item moved to the first peripheral area.
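The two-tier peripheral magnification can be sketched as a lookup by region. Region names and ratio values are illustrative assumptions, not values from the publication.

```python
def magnification(region, base=1.2, outer_boost=1.5):
    """Magnification ratio for a display item moved into a peripheral
    region while walking: items moved to the second (outer) peripheral
    area get a larger ratio than items moved to the first, because they
    sit farther from the line of sight and need to remain legible."""
    if region == "first":
        return base
    if region == "second":
        return base * outer_boost
    return 1.0  # central area: no enlargement
```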
- The information processing device detects the user's walking speed with the walking detection sensor and controls the display position of the enlarged display item according to the detected walking speed.
- The information processing device switches the display position of the display item in multiple stages according to the user's walking speed detected by the walking detection sensor.
- The information processing device provides an operation menu through which the user sets the display state of display items during walking, such as their display size and display position, and controls the display position of display items during walking according to the settings entered through the operation menu.
- The information processing device includes an operation menu and a line-of-sight input device for making selections from the operation menu, and when the walking detection sensor detects that the user is walking, input operations to the operation menu by the user's line of sight are restricted.
- Even after the walking detection sensor detects that the user has stopped walking, the display position of the display item during walking is maintained, and then, after a predetermined control maintenance period following the stop or upon an input operation by the user, the maintained walking-time display position is restored to the original stationary display position of the display item.
- An information processing method in an information processing device worn on a user's head, the device having a function of allowing the user to view the field of view in front of the user and of generating and displaying display items such as text characters and icons as virtual objects, and being equipped with a walking detection sensor that detects the user's walking state, such as walking motion, walking direction, and walking speed, the method comprising: detecting the user's walking with the walking detection sensor; and, when the walking detection sensor detects that the user is walking, enlarging a display item generated and displayed within the user's visual field display range or moving it to the outside of the central area within the visual field display range.
- An information processing device worn by a user, comprising: a display processing device that displays an augmented reality object within a visual field display range; a walking detection sensor that detects the user's walking state; and a processor, wherein the processor determines whether the user is walking based on sensor information from the walking detection sensor and, upon determining that the user is walking, enlarges the augmented reality object and executes control to display the enlarged augmented reality object within the visual field display range.
- An information processing device worn by a user, comprising: a display processing device that displays an augmented reality object within a visual field display range; a walking detection sensor that detects the user's walking state; and a processor, wherein the visual field display range includes a central area containing the center point of the visual field display range and a peripheral area located outside the central area, and the processor determines whether the user is walking based on sensor information from the walking detection sensor and, upon determining that the user is walking, executes display control to move the augmented reality object displayed in the central area to the peripheral area.
- An information processing method executed by an information processing device worn by a user, the method including: a determining step of determining whether the user is walking based on sensor information from a walking detection sensor that detects the user's walking state; an enlarging step of, when it is determined that the user is walking, enlarging an augmented reality object compared to when the user is stationary; and a displaying step of displaying the enlarged augmented reality object on a display processing device.
Description
Embodiments of the present invention will be described below with reference to the drawings. As a concrete example of the information processing device according to this embodiment, glasses-type AR glasses are used: a form of HMD that is worn on the user's head and displays information from real and virtual space for viewing. FIG. 1 is an example of a diagram schematically illustrating the external appearance of an embodiment of the information processing device according to this embodiment, and FIGS. 2 and 3 are examples of diagrams illustrating the forward scenery and the display screen within the visual field display range of the information processing device according to this embodiment.
A second embodiment will be described with reference to FIG. 4. Elements having the same features as in FIGS. 2 and 3 are given the same reference numerals, and their detailed description is omitted.
A third embodiment will be described with reference to FIG. 5. Elements having the same features as in FIGS. 2, 3, and 4 are given the same reference numerals, and their detailed description is omitted.
In contrast to the embodiment of FIG. 4, which covers walking straight ahead, the display-item display operation of the information processing device when the user is stationary, or walks diagonally to the left or right after starting to walk straight ahead, will be described with reference to FIGS. 6A and 6B.
Next, the display operation within the visual field display range of the information processing device when the user turns his or her head left, right, up, or down while walking will be described with reference to FIGS. 7A, 7B, and 7C. In FIGS. 7A, 7B, and 7C, parts given the same reference numerals as in FIGS. 4 and 6B have the same operations as those already described with reference to those figures, so some of their detailed descriptions are omitted.
The display-item display operation of the information processing device in the case where the peripheral area within the visual field display range is divided into multiple peripheral areas, and a display item in the central area of the visual field display range is moved into one of the divided peripheral areas and displayed enlarged during walking, will be described with reference to FIGS. 8A, 8B, 8C, and 8D. In these figures, parts given the same reference numerals as in FIGS. 3 and 4 have the same operations as those already described, so their detailed descriptions are omitted.
The display-item display operation of the information processing device in the case where the enlarged display size of display items within the visual field display range is controlled according to the user's walking speed will be described with reference to FIGS. 9A and 9B. FIG. 9A is an example of a diagram showing the relationship between walking speed and display-item display size.
The input operation of the information processing device in the case where the user sets the display state, such as the display size and display position of display items, for when the user 10 walks will be described with reference to FIG. 10. FIG. 10 is an example in which an operation menu 1101 is displayed within the visual field display range while the user is stationary. In FIG. 10, parts given the same reference numerals as in FIG. 2 have the same operations as those already described for FIG. 2, so some of their detailed descriptions are omitted.
The display operation of display items at the start and stop of walking will be described using the flowcharts shown in FIGS. 11A and 11B.
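The start/stop flow of FIGS. 11A and 11B can be sketched as a minimal controller that switches the display size on each sensor update. Class and parameter names are illustrative assumptions.

```python
class WalkDisplayController:
    """Minimal sketch of the walking start/stop display flow: switch to
    the walking-time display size when walking is detected, and back to
    the stationary size when walking stops."""

    def __init__(self, walk_scale=1.5):
        self.walk_scale = walk_scale
        self.scale = 1.0  # stationary display size

    def update(self, is_walking):
        # Called on each walking-detection-sensor update.
        self.scale = self.walk_scale if is_walking else 1.0
        return self.scale
```

In the publication's flow, the return to the stationary size is additionally delayed by a control maintenance period; that refinement is omitted here for brevity.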
A screen transmitted from a smartphone or smartwatch may also be displayed on the AR glasses 100 as a display item. For example, when the user is stationary, as shown in FIG. 13A, a smartphone screen 1401 and a smartwatch screen 1402 are displayed as display items in the peripheral area 260 within the visual field display range 202; when the user is walking, as shown in FIG. 13B, they are enlarged to a larger display size and displayed within the visual field display range 202 as a smartphone screen 1411 and a smartwatch screen 1412. In FIGS. 13A and 13B, parts given the same reference numerals as in FIGS. 2 and 4 have the same operations as those already described, so some of their detailed descriptions are omitted.
(1) An information processing device worn on a user's head, having a function of allowing the user to view the field of view in front of the user and of generating and displaying display items such as text characters and icons as virtual objects, the information processing device comprising a walking detection sensor that detects the user's walking state, such as walking motion, walking direction, and walking speed,
wherein when the walking detection sensor detects that the user is walking, the information processing device enlarges and displays a display item generated and displayed within the user's visual field display range.
When the walking detection sensor detects that the user is walking and the head movement detection sensor detects movement of the user's head, the information processing device maintains the display state of the display item, such as its display size, as it was before the head movement was detected, and displays the display item in that maintained display state.
The information processing device controls the display size of display items during walking according to the settings entered through the operation menu.
When the walking detection sensor detects that the user is walking, the information processing device restricts input operations to the operation menu by the user's line of sight.
When the walking detection sensor detects that the user is walking, the information processing device moves a display item located in the central area of the visual field display range to the outside of the central area of the visual field display range.
The information processing device controls the display position of display items during walking according to the settings entered through the operation menu.
When the walking detection sensor detects that the user is walking, the information processing device restricts input operations to the operation menu by the user's line of sight.
a step of detecting the user's walking with the walking detection sensor; and
a step of, when the walking detection sensor detects that the user is walking, enlarging and displaying a display item generated and displayed within the user's visual field display range or moving it to the outside of the central area within the visual field display range; an information processing method comprising these steps.
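The method steps above can be sketched as a single update function applied to a display item on each sensor reading. The dictionary layout, field names, and the 1.5× factor are illustrative assumptions.

```python
def apply_walking_display(item, is_walking):
    """Sketch of the method: when walking is detected, enlarge the
    display item relative to its stationary size and move it out of the
    central area of the visual field display range; otherwise return
    the item unchanged."""
    if not is_walking:
        return item
    moved = dict(item)                    # leave the stationary item intact
    moved["scale"] = item["scale"] * 1.5  # enlarge compared to stationary
    moved["region"] = "peripheral"        # move out of the central area
    return moved
```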
11: foot
12: foot
100: AR glasses
101: left-eye gaze sensor
102: right-eye gaze sensor
103: camera
104: acceleration sensor
111: display item
112: display item
113: display item
201: forward scenery
202: visual field display range
211: display item
212: display item
213: display item
250: central area
260: peripheral area
300: arrow
311: display item
312: display item
313: display item
511: display item
512: display item
601: arrow
602: visual field display range
603: arrow
604: visual field display range
612: display item
613: display item
650: central area
651: central area
660: peripheral area
661: peripheral area
701: visual field display range
702: visual field display range
713: display item
750: central area
751: central area
760: peripheral area
761: peripheral area
801: dashed frame
802: dashed frame
812: display item
813: display item
822: display item
823: display item
832: display item
850: central area
860: first peripheral area
870: second peripheral area
1002: solid line
1003: dashed line
1101: operation menu
1304: geomagnetic sensor
1305: gyro sensor
1306: distance measurement sensor
1307: operation input interface
1308: display processing device
1320: processor
1321: walking state detection processing unit
1322: head movement detection processing unit
1323: display item display form control processing unit
1324: operation menu processing unit
1325: RAM
1330: storage
1331: program
1332: information data
1334: application program
1335: display item information
1336: walking information
1337: head movement information
1338: input setting information
1341: vibrator
1342: microphone
1343: speaker
1344: communication device
1345: timer
1350: bus
1401: smartphone screen
1402: smartwatch screen
1411: smartphone screen
1412: smartwatch screen
Claims (15)
- An information processing device worn by a user, comprising:
a display processing device that displays an augmented reality object within a visual field display range;
a walking detection sensor that detects the user's walking state; and
a processor,
wherein the processor determines whether the user is walking based on sensor information from the walking detection sensor and, upon determining that the user is walking, enlarges the augmented reality object,
and executes control to display the enlarged augmented reality object within the visual field display range.
- The information processing device according to claim 1, wherein
the visual field display range includes a central area containing the center point of the visual field display range and a peripheral area located at the outer edge of the central area, and
the processor enlarges the augmented reality object displayed in the peripheral area and does not enlarge the augmented reality object displayed in the central area.
- The information processing device according to claim 2, wherein
the processor enlarges the display size of the augmented reality object displayed in the peripheral area to a size larger than the display size of the augmented reality object displayed in the central area.
- The information processing device according to claim 2, further comprising
a head movement detection sensor that detects movement of the head of the user wearing the information processing device,
wherein, when the processor determines based on the sensor information from the walking detection sensor that the user is stationary and detects movement of the user's head based on sensor information from the head movement detection sensor, the processor moves the position of the visual field display range so as to follow the head movement, enlarges the augmented reality object displayed in the peripheral area of the moved visual field display range, and does not enlarge the augmented reality object displayed in the central area of the moved visual field display range.
- The information processing device according to claim 4, wherein
when the processor determines based on the sensor information from the walking detection sensor that the user is walking and detects movement of the user's head based on the sensor information from the head movement detection sensor, the processor does not change the display size of the augmented reality object between before and after the head movement is detected.
- The information processing device according to claim 2, wherein
the processor detects the user's walking speed based on the sensor information from the walking detection sensor and controls the display size to which the augmented reality object is enlarged according to the walking speed.
- The information processing device according to claim 1, wherein
the processor controls display, within the visual field display range, of an operation menu for setting the display size of the augmented reality object.
- An information processing device worn by a user, comprising:
a display processing device that displays an augmented reality object within a visual field display range;
a walking detection sensor that detects the user's walking state; and
a processor,
wherein the visual field display range includes a central area containing the center point of the visual field display range and a peripheral area located at the outer edge of the central area, and
the processor determines whether the user is walking based on sensor information from the walking detection sensor and, upon determining that the user is walking, executes display control to move the augmented reality object displayed in the central area to the peripheral area.
- The information processing device according to claim 8, wherein
the processor executes display control to enlarge the augmented reality object displayed in the central area and move it to the peripheral area.
- The information processing device according to claim 9, wherein
the peripheral area within the visual field display range is divided into a first peripheral area adjacent to the outside of the central area and a second peripheral area adjacent to the outside of the first peripheral area, and
when the processor determines based on the sensor information from the walking detection sensor that the user is walking, the magnification ratio of the augmented reality object moved to the second peripheral area is made larger than the magnification ratio of the augmented reality object moved to the first peripheral area.
- The information processing device according to claim 10, wherein
the processor detects the user's walking speed based on the sensor information from the walking detection sensor and controls the display position of the augmented reality object according to the walking speed.
- The information processing device according to claim 8, wherein
the processor controls display, within the visual field display range, of an operation menu for setting the display size of the augmented reality object.
- The information processing device according to claim 1, wherein
the walking detection sensor is at least one of a camera that photographs the user's feet, an acceleration sensor, a geomagnetic sensor, and a gyro sensor, and
the head movement detection sensor is at least one of the acceleration sensor, the geomagnetic sensor, and the gyro sensor.
- An information processing method executed by an information processing device worn by a user, the method comprising:
a walking determination step of determining whether the user is walking based on sensor information from a walking detection sensor that detects the user's walking state;
an enlarging step of, when it is determined that the user is walking, enlarging an augmented reality object compared to when the user is stationary; and
a displaying step of causing a display processing device to display the enlarged augmented reality object.
- The information processing method according to claim 14, further comprising
a display position moving step of, when it is determined that the user is walking, moving the display position of the augmented reality object within the visual field display range of the display processing device,
wherein the visual field display range includes a central area containing the center point of the visual field display range and a peripheral area located at the outer edge of the central area, and
in the display position moving step, the augmented reality object displayed in the central area is moved to the peripheral area.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024540154A JPWO2024034053A1 (ja) | 2022-08-10 | 2022-08-10 | |
| CN202280098964.7A CN119678119A (zh) | 2022-08-10 | 2022-08-10 | 信息处理装置以及信息处理方法 |
| PCT/JP2022/030568 WO2024034053A1 (ja) | 2022-08-10 | 2022-08-10 | 情報処理装置および情報処理方法 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/030568 WO2024034053A1 (ja) | 2022-08-10 | 2022-08-10 | 情報処理装置および情報処理方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024034053A1 true WO2024034053A1 (ja) | 2024-02-15 |
Family
ID=89851271
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/030568 Ceased WO2024034053A1 (ja) | 2022-08-10 | 2022-08-10 | 情報処理装置および情報処理方法 |
Country Status (3)
| Country | Link |
|---|---|
| JP (1) | JPWO2024034053A1 (ja) |
| CN (1) | CN119678119A (ja) |
| WO (1) | WO2024034053A1 (ja) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015213226A (ja) * | 2014-05-02 | 2015-11-26 | コニカミノルタ株式会社 | ウエアラブルディスプレイ及びその表示制御プログラム |
| JP2019038413A (ja) * | 2017-08-25 | 2019-03-14 | アルパイン株式会社 | 表示制御装置および表示制御方法 |
| JP2019217790A (ja) * | 2016-10-13 | 2019-12-26 | マクセル株式会社 | ヘッドアップディスプレイ装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN119678119A (zh) | 2025-03-21 |
| JPWO2024034053A1 (ja) | 2024-02-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9552676B2 (en) | Wearable computer with nearby object response | |
| US11194388B2 (en) | Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device | |
| CN111886564B (zh) | 信息处理装置、信息处理方法和程序 | |
| KR102058891B1 (ko) | 헤드 장착식 디스플레이를 위한 반응성 사용자 인터페이스 | |
| US10082940B2 (en) | Text functions in augmented reality | |
| US9164588B1 (en) | Wearable computing device with gesture recognition | |
| US10114466B2 (en) | Methods and systems for hands-free browsing in a wearable computing device | |
| US9041741B2 (en) | User interface for a head mounted display | |
| US20160252956A1 (en) | Imaging Method | |
| CN112213856A (zh) | 可穿戴眼镜和经由可穿戴眼镜显示图像的方法 | |
| WO2019244670A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| JP6339887B2 (ja) | 画像表示装置 | |
| KR20200040716A (ko) | 시선 추적을 이용한 시인성 개선 방법, 저장 매체 및 전자 장치 | |
| KR20180045644A (ko) | 머리 착용형 디스플레이 장치 및 그의 제어 방법 | |
| KR20250021572A (ko) | 헤드 마운트 웨어러블 디바이스에서의 콘텐츠 출력 관리 | |
| JP2025076493A (ja) | 情報処理システム、情報処理装置及び画像表示装置 | |
| WO2024034053A1 (ja) | 情報処理装置および情報処理方法 | |
| US20260038209A1 (en) | Information processing device and information processing method | |
| JP7538886B2 (ja) | 画像表示装置および画像表示方法 | |
| US20230376109A1 (en) | Image processing apparatus, image processing method, and storage device | |
| WO2023248381A1 (ja) | 映像表示システム、映像制御方法、及び映像制御プログラム | |
| KR20240028897A (ko) | HMD(head mounted display) 장치에서 가상키보드를 표시하는 방법 및 장치 | |
| KR20250048040A (ko) | 가상 마커들과 상관되는 확대된 오버레이들 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22954970 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024540154 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202280098964.7 Country of ref document: CN |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWP | Wipo information: published in national office |
Ref document number: 202280098964.7 Country of ref document: CN |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 22954970 Country of ref document: EP Kind code of ref document: A1 |