
WO2011013245A1 - Position estimating device - Google Patents


Info

Publication number
WO2011013245A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
scene
probability
estimation
facility
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2009/063664
Other languages
French (fr)
Japanese (ja)
Inventor
直紀 池谷
健太 長
久雄 瀬戸口
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Priority to PCT/JP2009/063664
Publication of WO2011013245A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0284Relative positioning

Definitions

  • the present invention relates to estimation of a position where a user is located.
  • the frequency of arrival at a destination (arrival frequency) and the frequency of appearance on each day of the week within a predetermined period (appearance frequency) are calculated from the accumulated movement history, and based on the calculated arrival frequency and appearance frequency,
  • an action class is built by combining destinations whose arrival frequency is equal to or higher than a threshold with days of the week whose appearance frequency is higher than a threshold. Then, based on the accumulated movement history, the calculated action class, and the user's current position information, the destination to which the user is heading is predicted.
  • the present invention has been made in view of the above, and its object is to obtain a position estimation device that, by using a life scene, which is an action scene in the user's life, can estimate the position where the user will be located after a predetermined time even at a place visited for the first time.
  • to achieve this object, the present invention provides: a first estimation storage unit that stores first estimation information associating a life scene, which is an action scene in the user's life, with the time zone of the life scene, the position of the life scene, and the user's operation state in the life scene; a first correspondence storage unit that stores first correspondence information in which a life scene is associated with the facility type where the user is located in that life scene; a second estimation storage unit that stores second estimation information associating a life scene with a next scene representing the scene of the action the user performs after that life scene; a positioning unit that measures the current position; a detection unit that detects the user's current operation state; a first scene estimation unit that estimates the user's current life scene from the first estimation information, using the life scene corresponding to the current position, the current operation state, and the current time; and a second scene estimation unit that estimates the user's life scene after a predetermined time from the next scene corresponding to the current life scene in the second estimation information.
  • according to the present invention, it is possible to estimate the position where the user will be located after a predetermined time, even at a place visited for the first time, by using a life scene, which is an action scene in the user's life.
  • An external view showing an example of a mobile terminal.
  • A block diagram showing an example of the functional configuration of a mobile terminal.
  • A flowchart showing the flow of position estimation processing by a mobile terminal.
  • A flowchart showing the flow of position estimation processing by a mobile terminal.
  • A block diagram showing an example of the functional configuration of a mobile terminal.
  • A flowchart showing the flow of position estimation processing by a mobile terminal.
  • A block diagram showing an example of the functional configuration of a mobile terminal.
  • A diagram showing an example of a movement history.
  • A diagram showing an example of a movement pattern.
  • A flowchart showing the flow of position estimation processing by a mobile terminal.
  • FIG. 1 is an external configuration diagram illustrating an example of a mobile terminal according to the first embodiment. At least one positioning sensor and at least one body motion sensor are connected to one mobile terminal 100.
  • the positioning sensor 101 is a sensor that receives radio waves from GPS satellites at predetermined time intervals in order to measure the current position of the mobile terminal 100.
  • the positioning sensor 101 uses a GPS function as a typical configuration.
  • the positioning is not limited to the positioning using the GPS function.
  • for example, when the mobile terminal 100 is a mobile phone, the positioning sensor 101 may be configured as an antenna that receives radio waves from communication base stations, and positioning may be performed by identifying the base station that emitted the received radio waves.
  • alternatively, the positioning sensor 101 may be configured as an antenna that receives radio waves from wireless LAN (Local Area Network) access points, and positioning may be performed by identifying the access point that emitted the received radio waves.
  • the positioning result does not necessarily need to be acquired by the mobile terminal 100; the user may carry an RFID (Radio Frequency IDentification) tag, and positioning may be performed by an RFID antenna belonging to an external system.
  • the body motion sensor 102 is a sensor that detects a signal corresponding to the operation of the user holding the mobile terminal 100 at a predetermined time interval.
  • as a typical configuration, the body motion sensor 102 is an acceleration sensor that detects three-axis acceleration data corresponding to the user's motion.
  • the present invention is not limited to this, and the body motion sensor 102 may be constituted by, for example, a gyroscope, a direction sensor, or the like.
  • although the description assumes that one mobile terminal is used by one user, the present invention is not limited to this, and a single mobile terminal may be used by a plurality of users.
  • FIG. 2 is a block diagram of an example of a functional configuration of the mobile terminal according to the first embodiment.
  • the mobile terminal 100 mainly includes a positioning unit 111, a detection unit 112, a first scene estimation unit 113, a storage unit 114, a second scene estimation unit 115, a first position estimation unit 116, a scene update unit 117, an output unit 118, a first estimation storage unit 131, a first correspondence storage unit 142, a second estimation storage unit 132, and a second correspondence storage unit 141.
  • the mobile terminal 100 according to the first embodiment further has a hardware configuration such as a CPU, a storage medium such as an HDD or a memory (not shown).
  • the positioning unit 111 measures the current position of the mobile terminal 100 using the radio wave received from the GPS satellite by the positioning sensor 101.
  • the detection unit 112 acquires current acceleration data from the body motion sensor 102 and detects the current operation state (body movement) of the user holding the mobile terminal 100 from the acquired data. Specifically, the detection unit 112 first estimates the direction of gravity from the acceleration data: it averages the acceleration data over a certain time as a three-dimensional vector and normalizes the result to a magnitude of 1 G. The detection unit 112 then removes the gravity component by subtracting this gravity vector from the acceleration data. From the result, the detection unit 112 calculates three feature amounts: the length of the gravity-free acceleration vector, its inner product with the gravity vector, and its outer product with the gravity vector. Because these feature amounts are unaffected by the orientation of the mobile terminal 100, they can be used to infer the user's context, such as the means of movement, regardless of how the user holds the terminal.
  • the detection unit 112 calculates a statistical feature value from these three feature values.
  • the statistical feature amounts are the maximum value, minimum value, average, and variance of each feature amount over a certain period of time.
  • since four statistics are computed for each of the three feature amounts, a 12-dimensional value is acquired.
  • This statistical feature quantity can be used as a feature quantity that takes into account the temporal transition of acceleration.
  • the detection unit 112 applies a neural network to the obtained 12-dimensional value to infer which of four states the user is currently in: running, walking, riding a vehicle, or stationary. The neural network is assumed to have been trained in advance on statistical feature data acquired in each of these four states as labeled correct data.
  • Each of the four states is represented by a value from 0 to 1, and the larger the numerical value, the higher the degree that the state is estimated.
  • in this way, the detection unit 112 can detect the four states "running", "walking", "riding a vehicle", and "still" based on the acceleration data from the acceleration sensor (body motion sensor 102).
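The orientation-invariant feature extraction described above can be sketched as follows. This is a minimal Python illustration, assuming (N, 3) acceleration samples in m/s²; the function name and window handling are not from the patent, and the neural-network classification step is omitted.

```python
import numpy as np

def motion_features(accel, g=9.8):
    """12-dimensional statistical features from 3-axis acceleration samples.

    accel: (N, 3) array of raw acceleration vectors over one time window.
    """
    # Estimate gravity: average over the window, normalized to 1 G.
    gravity = accel.mean(axis=0)
    gravity = gravity / np.linalg.norm(gravity) * g

    # Remove the gravity component from each sample.
    motion = accel - gravity

    # Three feature amounts that do not depend on terminal orientation:
    length = np.linalg.norm(motion, axis=1)                    # magnitude
    inner = motion @ gravity                                   # inner product with gravity
    outer = np.linalg.norm(np.cross(motion, gravity), axis=1)  # outer product with gravity

    # Max, min, mean, and variance of each feature amount -> 12 values.
    feats = []
    for f in (length, inner, outer):
        feats += [f.max(), f.min(), f.mean(), f.var()]
    return np.array(feats)
```

The resulting 12-dimensional vector would then be fed to a classifier trained on labeled data for the four states (running, walking, riding a vehicle, stationary).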
  • the user's operation state is detected based on the acceleration data detected by the body motion sensor 102.
  • however, the present invention is not limited to this; the operation state of the user holding the mobile terminal 100 may instead be estimated with a certain accuracy based on other signals, such as an image pickup signal from a camera.
  • the first estimation storage unit 131 is a storage medium such as an HDD (Hard Disk Drive) or a memory that stores first estimation information for estimating the current life scene of the user.
  • a life scene is, for example, a scene of an action performed by the user in daily life, such as "work", "business trip", "dinner", "shopping", or "on the way home"; that is, it indicates a situation of action in the user's life. The first estimation information is registered in advance in the mobile terminal 100.
  • FIG. 3 is a diagram illustrating an example of first estimation information for estimating a user's life scene.
  • the first estimation information shown in FIG. 3 is a table generated by registering, for each individual user, positions such as the user's home, workplace, and frequently visited stores, together with the time zones in which the user is located there.
  • the first estimation information is stored in a table format, but the present invention is not limited to this and may be stored in another format.
  • as shown in FIG. 3, the first estimation information associates a life scene, which is an action scene in the user's life, with the time zone of that life scene, the position of the life scene, and the user's operation state in the life scene; entries 1 to 12 are stored in descending order of priority. For example, if the life scene is "working", it can be seen that the user is "stationary" or "walking" at the position "35.495500, 139.593122" in the time zone "8:00 to 22:00".
  • the position indicates the central latitude and longitude; the distance from the user's actual position is calculated, and if it is within a certain range, for example within 100 m, the user is assumed to be located at that position.
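A lookup over such a table can be sketched as follows, assuming hypothetical row contents; the haversine check implements the 100 m range mentioned above, and all names are illustrative rather than taken from the patent.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class SceneRow:
    scene: str
    start_hour: int   # start of the time zone
    end_hour: int     # end of the time zone
    lat: float
    lon: float
    states: tuple     # operation states observed in this scene
    priority: int     # lower number = higher priority

def within_range(lat1, lon1, lat2, lon2, limit_m=100.0):
    """Haversine distance check against the 100 m range in the text."""
    r = 6371000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat/2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon/2)**2
    return 2 * r * asin(sqrt(a)) <= limit_m

def estimate_scene(rows, lat, lon, state, hour):
    """Return the highest-priority life scene matching position, state, and time."""
    matches = [r for r in rows
               if r.start_hour <= hour < r.end_hour
               and state in r.states
               and within_range(r.lat, r.lon, lat, lon)]
    if not matches:
        return "estimation impossible"
    return min(matches, key=lambda r: r.priority).scene
```

When no row matches, the function returns "estimation impossible", mirroring the behavior of the first scene estimation unit described below.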
  • the second correspondence storage unit 141 is a storage medium such as an HDD or a memory that stores second correspondence information for estimating the position where the user is located.
  • FIG. 4 is a diagram illustrating an example of second correspondence information in which a facility type is associated with a position of the facility type.
  • the second correspondence information is a table in which the facility type indicating the type of facility, the position of the facility type, and the probability that the user is located at the position of the facility type are associated with each other. .
  • the probability that the user is located in the “company” of the facility type at the position where the latitude and longitude are “35.495500, 139.593122” is “17/20”.
  • the second correspondence information is stored in a table format, but the present invention is not limited to this and may be stored in another format.
  • the first correspondence storage unit 142 is a storage medium such as an HDD or a memory that stores first correspondence information for estimating the facility type where the user is located.
  • FIG. 5 is a diagram illustrating an example of first correspondence information in which a life scene is associated with a facility type.
  • as shown in FIG. 5, the first correspondence information is a table in which a life scene, the facility type where the user is located in that life scene, and the probability that the user is located in that facility type are associated with each other. For example, when the life scene is "dinner", the probability that the user is located in the facility type "restaurant" is "12/20".
  • the first correspondence information is stored in a table format, but the present invention is not limited to this and may be stored in another format.
  • the first correspondence information described above is preferably constructed and updated from the user's personal history, but this is not a limitation.
  • if correspondence information is registered in advance instead, a future position can be estimated with a certain accuracy from the beginning.
  • the second estimation storage unit 132 is a storage medium such as an HDD or a memory that stores second estimation information for estimating a future life scene that is a life scene after a predetermined time of the user.
  • FIG. 6 is a diagram illustrating an example of second estimation information in which the current life scene is associated with the next scene, that is, the next life scene. As shown in FIG. 6, the second estimation information is a table having a life scene; the shortest, average, and longest times for which that life scene continues; and a next scene representing the scene of the action the user performs after the life scene. Each next scene carries the probability that the user's action actually becomes that scene.
  • the second estimation information is stored in a table format, but the present invention is not limited to this and may be stored in another format. Further, the second estimation information is registered in advance in the mobile terminal 100.
  • the first scene estimation unit 113 estimates the current life scene of the user who owns the mobile terminal 100. Specifically, using the first estimation information, the first scene estimation unit 113 selects, among the life scenes corresponding to the current position measured by the positioning unit 111, the current operation state detected by the detection unit 112, and the current time counted by a time measurement unit (not shown), the life scene with the highest priority as the user's current life scene.
  • because the first scene estimation unit 113 uses the current operation state to estimate the life scene, it can distinguish operation states such as walking or stationary that cannot be determined by the positioning function alone, and can estimate, for example, that the user is on board a vehicle even in situations where positioning is difficult. When no life scene applies, the first scene estimation unit 113 returns "estimation impossible".
  • the current life scene is estimated from the current position measured by the positioning unit 111 and the operation state detected by the detection unit 112 using the first estimation information.
  • however, the present invention is not limited to this; the current life scene may also be estimated from electronic schedule data, or from the operation of the mobile terminal and its logs.
  • the storage unit 114 acquires the facility type corresponding to the current position measured by the positioning unit 111 from the second correspondence information, and updates the first correspondence information in the first correspondence storage unit 142 by storing the current life scene estimated by the first scene estimation unit 113 in association with the acquired facility type.
  • the storage unit 114 also updates, based on transitions of the user's life scene, the probability of being located in each facility type in the first correspondence information and the probability of being located at each facility-type position in the second correspondence information.
  • the second scene estimation unit 115 estimates the future life scene, that is, the life scene of the user owning the mobile terminal 100 after the predetermined time, using the next scene in the second estimation information corresponding to the current life scene estimated by the first scene estimation unit 113.
  • the second scene estimation unit 115 estimates the end time of the current life scene by storing the current life scene and the time when the life scene started.
  • the average time in FIG. 6 is applied as the duration of the life scene. For example, if the current life scene is "working" and 8 hours and 40 minutes have already passed since it started, then, referring to the average time of 9 hours, it can be estimated that the "working" life scene will end in about 20 minutes.
  • in the example of FIG. 6, after the end of the "working" life scene, the probability of transitioning to the "dinner" life scene is 6/16, the probability of transitioning to the "on the way home" life scene is 4/16, and the probability of transitioning to the "shopping" life scene is 2/16.
  • the probability can be calculated more accurately by correcting it using the shortest and longest times, or by additionally holding end times and their distribution in the second estimation information.
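The remaining-time estimate in the example (a 9-hour average duration for "working", 8 hours 40 minutes already elapsed, about 20 minutes remaining) can be reproduced directly; the average-duration table below contains only the one value given in the text, and the names are illustrative.

```python
from datetime import datetime, timedelta

# Average duration per life scene; "working" = 9 hours is the value used
# in the example. Other scenes would be filled in from FIG. 6.
AVERAGE_DURATION = {"working": timedelta(hours=9)}

def estimated_end(scene, started_at):
    """Estimate when the current life scene ends from its average duration."""
    return started_at + AVERAGE_DURATION[scene]

start = datetime(2009, 7, 30, 8, 0)           # "working" began at 8:00
now = start + timedelta(hours=8, minutes=40)  # 8 h 40 min have passed
remaining = estimated_end("working", start) - now
```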
  • the first position estimation unit 116 estimates the facility type corresponding to the future life scene from the first correspondence information as the facility type where the user holding the mobile terminal 100 will be located after the predetermined time. Then, the first position estimation unit 116 calculates, as the probability that the user is located in the estimated facility type, the product of the probability in the second estimation information of the next scene corresponding to the current life scene and the probability in the first correspondence information of being located in the facility type corresponding to the future life scene.
  • for example, suppose the probability that the user's future life scene 30 minutes from now is "dinner" is 6/16, the probability that it is "on the way home" is 4/16, and the probability that it is "shopping" is 2/16.
  • in this case, the first position estimation unit 116 calculates the probability P_type of being located in each facility type corresponding to those future life scenes after the predetermined time by the following equation:

    P_type = P_scene × P(scene, type)

    where P_type is the probability that the user is located in facility type "type" after the predetermined time, P_scene is the probability that the future life scene "scene" occurs after the predetermined time, and P(scene, type) is the probability that the facility type "type" corresponding to that future life scene is used.
  • accordingly, the first position estimation unit 116 calculates that, 30 minutes from now, the probability that the user who owns the mobile terminal 100 is located in the facility type "railway" with the life scene "on the way home" is 1/4; the probability of being in the facility type "restaurant" with the life scene "dinner" is 9/40; the probability of being in the facility type "company" with the life scene "dinner" is 3/20; and the probability of being in the facility type "shop" with the life scene "shopping" is 1/8.
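These figures follow from the multiplication P_type = P_scene × P(scene, type). In the sketch below, the 12/20 restaurant probability comes from FIG. 5; the other P(scene, type) values are back-derived assumptions chosen so the products match the text.

```python
from fractions import Fraction as F

# P_scene: probability of each future life scene 30 minutes from now
# (values from the example in the text).
p_scene = {"on the way home": F(4, 16), "dinner": F(6, 16), "shopping": F(2, 16)}

# P(scene, type): probability of using each facility type in that scene.
# Only 12/20 (dinner -> restaurant) appears in FIG. 5; the rest are
# assumed values that reproduce the probabilities in the text.
p_scene_type = {
    ("on the way home", "railway"): F(1, 1),
    ("dinner", "restaurant"): F(12, 20),
    ("dinner", "company"): F(2, 5),
    ("shopping", "shop"): F(1, 1),
}

# P_type = P_scene * P(scene, type)
p_type = {(s, t): p_scene[s] * p for (s, t), p in p_scene_type.items()}
```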
  • the first position estimation unit 116 further estimates, from the second correspondence information, the position of the facility type where the user holding the mobile terminal 100 will be located after the predetermined time. It then calculates, as the probability that the user is located at the estimated position, the product of three probabilities: the probability in the second estimation information of the next scene corresponding to the current life scene, the probability in the first correspondence information of being located in the facility type corresponding to the future life scene, and the probability in the second correspondence information that the user is located at that facility type's position. In other words, by using the second correspondence information shown in FIG. 4, the first position estimation unit 116 can calculate a specific position from the estimated facility type together with the probability that the user is located there.
  • for example, suppose the probability of using the restaurant at the position "35.551331, 139.6675131" is 6/8 and the probability of using the restaurant at the position "35.552345, 139.6671251" is 2/8. The probability of each facility-type position is calculated by multiplying these probabilities by the probability of being located in the facility type. That is, in the above example, the probability that the user is located in the former restaurant after 30 minutes is 27/160, obtained by multiplying the probability 9/40 of being located in the "restaurant" facility type by 6/8. Similarly, the probability of being in the latter restaurant is 9/160, obtained by multiplying 9/40 by 2/8.
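The final step multiplies the facility-type probability by the per-position use probability from the second correspondence information. A sketch with the values from this example:

```python
from fractions import Fraction as F

# Probability of being in the "restaurant" facility type after 30 minutes,
# from the running example.
p_restaurant = F(9, 40)

# Probability of each concrete restaurant position given the facility type
# (6/8 and 2/8 across the two known restaurants, as in the text).
p_position = {
    "35.551331, 139.6675131": F(6, 8),
    "35.552345, 139.6671251": F(2, 8),
}

# Final probability per position = P_type * P(position | type).
p_user_at = {pos: p_restaurant * p for pos, p in p_position.items()}
```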
  • the first position estimation unit 116 may be configured to correct the probability calculated according to the current position.
  • the scene update unit 117 updates the first estimation information registered in advance, according to the positions measured, the operation states detected, and the time zones counted while the user actually goes about daily life.
  • for example, suppose that in the first estimation information the time zone corresponding to the life scene "work" is 8:00 to 22:00, but the user's operation state remains "still" and continues past 22:00 until 23:00. Since the operation state continues unchanged, it can be presumed that the previous life scene also continues; that is, in this case it is presumed that "work" continued until 23:00, and the time zone corresponding to "work" in the first estimation information is updated to 8:00 to 23:00.
  • the scene update unit 117 also updates the next-scene probabilities registered in advance in the second estimation information, based on the life scene transitions that occur as the user actually lives.
  • for example, suppose that after the life scene is estimated to be "working" at a certain time, the scene update unit 117 determines that the life scene has changed to "on the way home".
  • in this case, the probabilities of the next scenes corresponding to the "working" life scene in the second estimation information are updated.
  • specifically, the denominator of the probability of every next scene corresponding to "working" is increased by 1 (to 17), and for the next scene "on the way home", to which the transition actually occurred, the numerator is also increased by 1, updating it from 4/16 to 5/17.
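Keeping the probabilities as transition counts makes this update natural. In the sketch below, the counts reproduce the 6/16, 4/16, and 2/16 probabilities from the example; the remaining 4/16 is lumped into an assumed "other" bucket, and the names are illustrative.

```python
from fractions import Fraction

# Next-scene transition counts observed after the "working" life scene.
counts = {"dinner": 6, "on the way home": 4, "shopping": 2, "other": 4}

def observe_transition(counts, next_scene):
    """Record one observed transition. Every probability's denominator
    grows by 1 via the total, and the observed scene's numerator grows by 1."""
    counts[next_scene] += 1

def probability(counts, next_scene):
    return Fraction(counts[next_scene], sum(counts.values()))

observe_transition(counts, "on the way home")
```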
  • the output unit 118 outputs, as the estimation result, the facility type estimated by the first position estimation unit 116 and the certainty factor, that is, the probability calculated by the first position estimation unit 116 that the user is in that facility type, to a display unit (not shown).
  • FIG. 7 is a diagram illustrating an example of the estimation result display. As illustrated in FIG. 7, the certainty factor, that is, the probability of being in the facility type, is displayed in association with each facility type estimated by the first position estimation unit 116. For example, when the probability that the user is located on a railway is estimated to be 1/4, the certainty factor "1/4" is displayed in association with the facility type "railway".
  • the output unit 118 further outputs, as the estimation result, the position of the facility type estimated by the first position estimation unit 116 and the certainty factor, that is, the probability calculated by the first position estimation unit 116 that the user is present at that position, to the display unit.
  • FIG. 8 is a diagram illustrating another example of the estimation result display. As illustrated in FIG. 8, for each facility type estimated by the first position estimation unit 116, the name of the facility, its position, and the certainty factor, that is, the probability of being present at that position, are displayed.
  • for example, the facility type "restaurant" is displayed in association with the restaurant's name, its position "35.551331, 139.675131", and the probability (certainty factor) "27/160" of being located at that restaurant.
  • FIG. 9 is a flowchart of a position estimation process performed by the mobile terminal according to the first embodiment. Below, the process in the case of calculating the probability that a user is located in the facility type estimated after the predetermined time will be described.
  • first, the positioning unit 111 measures the current position of the mobile terminal 100 (step S11). Then, the detection unit 112 detects the current operation state of the user holding the mobile terminal 100 (step S12).
  • next, the first scene estimation unit 113 estimates the current life scene from the current position, the current operation state, and the current time, using the first estimation information (step S13).
  • the storage unit 114 updates the first correspondence information with the facility type corresponding to the current position in the second correspondence information and the current life scene (step S14).
  • next, the second scene estimation unit 115 estimates the user's future life scene after the predetermined time from the current life scene, using the second estimation information (step S15).
  • the first position estimation unit 116 then estimates, from the future life scene using the first correspondence information, the facility type where the user will be located after the predetermined time, and calculates the probability that the user is located in that facility type (step S16).
  • the output unit 118 outputs the estimated facility type and the calculated probability to the display unit as an estimation result (step S17).
  • FIG. 10 is a flowchart of another position estimation process performed by the mobile terminal according to the first embodiment.
  • whether the position estimation process of FIG. 9 or that of FIG. 10 is performed is determined by a setting: for example, the mobile terminal 100 receives the user's setting input from an operation unit (not shown), and one of the two processes is performed based on that setting.
  • in steps S31 to S35, the process from measuring the current position to estimating the future life scene is the same as steps S11 to S15 in FIG. 9.
  • the first position estimation unit 116 estimates, from the future life scene using the first correspondence information, the facility type where the user will be located after the predetermined time, and calculates the probability that the user is located in that facility type (step S36). Then, the first position estimation unit 116 estimates the position of the facility type using the second correspondence information, and calculates the probability that the user is located at that position (step S37). The output unit 118 outputs the estimated position of the facility type and the calculated probability to the display unit as the estimation result (step S38).
  • as described above, in the mobile terminal 100 according to the first embodiment, the user's current life scene is estimated from the current position, the user's current operation state, and the current time, and the future life scene after the predetermined time is estimated from the estimated current life scene. The mobile terminal 100 then estimates, from the estimated future life scene, the facility type or the position of the facility type where the user will be located after the predetermined time, calculates the probability that the user is located there, and outputs the result to the display unit. Because the current life scene is used, the likely facility type or position after the predetermined time is estimated from the continuity of life scenes rather than from the continuity of the user's positions, so the user's facility type or position after the predetermined time can be estimated even at a place visited for the first time.
  • furthermore, the position after the predetermined time can be estimated in consideration of the user's behavior pattern, because a given person's behavior in daily life is likely to be similar even in a different place. For example, it can be presumed that, even at a place visited for the first time, a business trip scene will be followed by a dinner scene, and that this user uses a canteen restaurant with a probability of 50% and a ramen restaurant with a probability of 30%; based on these probabilities, the position where the user will be located after the predetermined time can be estimated.
  • The life scenes of the above embodiment are not limited to the above examples.
  • For example, the life scene “dinner” can be further subdivided into “dinner alone”, “dinner meeting”, “banquet”, and the like.
  • If the future life scene is estimated to be “banquet”, for example, the first position estimation unit 116 can estimate that the user is highly likely to be located in an “izakaya” or a “yakitori restaurant” after the predetermined time. In other words, it is possible to estimate the future position in consideration of action states such as “the presence or absence of a companion”.
  • The estimation of the future life scene has been described above.
  • When estimating the future life scene, it is effective to consider not only the current life scene but also the current action state, the current position, the current time, the day of the week, and the like.
  • To that end, a method of additionally retaining the current action state, the current position, the facility type currently located, the current time, the day of the week, and the like in the second estimation information is effective.
  • For this estimation, other techniques such as a Bayesian network or a hidden Markov model may also be used. In this case, as described above, not only the current life scene but also the current action state, the current position, the facility type currently located, the current time, the day of the week, and the like are used as behavior information, state information, and environment information to estimate the future life scene after the predetermined time.
  • When estimating the location of the facility type where the user will be located after a predetermined time, the measured current position may also be used to perform an estimation that considers whether each candidate location is reachable. Specifically, for example, when the current life scene is estimated to be “working” and the future life scene after 30 minutes is estimated to be “dinner” with a probability of 6/16, in the above embodiment the position where the user takes dinner with high probability was calculated and output as the estimation result.
  • However, if the relationship between the current position and an estimated position is such that the estimated position is difficult to reach within 30 minutes, it is appropriate to perform a different estimation. That is, the possibility of moving within the specified time (30 minutes in this example) is calculated from the positional relationship between the latitude and longitude of the current position and each estimated position, and unreachable positions are removed.
  • For example, suppose the estimated travel time to the restaurant at “35.551331, 139.675131” is 50 minutes, so that it cannot be reached within 30 minutes.
  • In that case, this candidate is removed, the probabilities are similarly calculated using the remaining positions, and the estimation result is output.
  • The estimated travel time at this point may be calculated using past travel times, or using an existing calculation method such as a navigation service.
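The reachability filter described above can be sketched as follows. This is an assumed, minimal implementation: the travel-time function is a placeholder for a past-travel-time or navigation-service estimate, and the candidate positions and times (including the 50-minute restaurant from the example) are hypothetical.

```python
def filter_reachable(candidates, travel_time_min, window_min=30):
    """Remove candidate positions that cannot be reached within the window,
    then renormalize the remaining probabilities.

    candidates: dict mapping position (lat, lon) -> probability
    travel_time_min: function (lat, lon) -> estimated travel time in minutes
    """
    reachable = {pos: p for pos, p in candidates.items()
                 if travel_time_min(pos) <= window_min}
    total = sum(reachable.values())
    if total == 0:
        return {}
    # renormalize so the remaining candidates again sum to 1
    return {pos: p / total for pos, p in reachable.items()}

# Hypothetical candidates; (35.551331, 139.675131) is the restaurant from the
# example, assumed to take 50 minutes and hence removed for a 30-minute window.
est_times = {(35.551331, 139.675131): 50, (35.6, 139.7): 10}
cands = {(35.551331, 139.675131): 0.6, (35.6, 139.7): 0.4}
result = filter_reachable(cands, lambda pos: est_times[pos])
```

After filtering, only the reachable position remains and carries the full probability mass, matching the behavior described in the text.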
  • In the first embodiment described above, the facility type where the user is located after a predetermined time is estimated using first correspondence information in which each life scene is associated with a facility type.
  • In the second embodiment, the facility type where the user is located after a predetermined time is estimated based on first correspondence information in which each life scene is hierarchically associated with a plurality of facility types.
  • FIG. 11 is a block diagram of an example of a functional configuration of the mobile terminal according to the second embodiment.
  • As illustrated in FIG. 11, the mobile terminal 200 mainly includes a positioning unit 111, a detection unit 112, a first scene estimation unit 113, a storage unit 114, a second scene estimation unit 115, a first position estimation unit 216, a scene update unit 117, an output unit 118, a first estimation storage unit 131, a second estimation storage unit 132, a second correspondence storage unit 141, and a first correspondence storage unit 242.
  • The portable terminal 200 according to the second embodiment further has a hardware configuration including a CPU (not shown) and a storage medium such as an HDD or a memory, as in the first embodiment.
  • Components given the same reference numerals as in the first embodiment are the same as in the first embodiment, and their description is omitted.
  • The external configuration of the mobile terminal 200 according to the second embodiment is also the same as that of the first embodiment.
  • the first correspondence storage unit 242 is a storage medium such as an HDD or a memory that stores first correspondence information for estimating the type of facility where the user is located.
  • FIG. 12 is a diagram illustrating an example of first correspondence information in which a life scene is associated with facility types. As shown in FIG. 12, the first correspondence information associates each life scene with the name of the facility, within the facility types where the user is located in that life scene, and the probability that the user is located in the facility type indicated by that facility name, as well as with the category of the facility where the user is located in that life scene and the probability that the user is located in the facility type indicated by that facility category.
  • Here, the facility name is a facility type that is included in a facility category and classifies that category in more detail.
  • For example, it can be seen that the probability that the user is located in the facility type indicated by the facility name “second factory” is “3/4”, the probability that the user is located in the facility type indicated by the facility name “third factory” is “1/4”, and the probability that the user is located in the facility type indicated by the facility category “factory” is “4/4”.
  • Although the facility name and the facility category are described here as two layers, three or more layers can also be realized.
  • By attaching a label called a facility type tag to each facility, it is also possible to use a label group that is not a complete tree structure and whose hierarchy depth is not constant.
  • Here, the first correspondence information is stored in a table format, but the present invention is not limited to this, and it may be stored in another format.
  • the first position estimation unit 216 estimates the facility type corresponding to the future life scene from the first correspondence information as the facility type where the user who owns the mobile terminal 200 is located after a predetermined time. Then, the probability that the user exists in the estimated facility type is calculated. Since this calculation method is the same as that of the first embodiment, description thereof is omitted.
  • When estimating the facility type where the user will be located after a predetermined time, the first position estimation unit 216 first estimates the facility name, which classifies the facility type in detail. The first position estimation unit 216 then calculates the product of the probability of the next scene corresponding to the current life scene in the second estimation information and the probability of being located in the facility type indicated by the facility name corresponding to the future life scene in the first correspondence information, as the probability that the user is located in the facility type indicated by that facility name after the predetermined time. If the probability that the user is located in the facility type indicated by the estimated facility name is equal to or greater than a predetermined threshold, the first position estimation unit 216 uses that probability as the estimation result.
  • On the other hand, suppose the first position estimation unit 216 determines that the probability that the user is located in the facility type indicated by the estimated facility name is less than the predetermined threshold.
  • In that case, the first position estimation unit 216 estimates the facility category that includes the facility name where the user will be located after the predetermined time. It then calculates the product of the probability of the next scene corresponding to the current life scene in the second estimation information and the probability of being located in the facility type indicated by the facility category corresponding to the future life scene in the first correspondence information, as the probability that the user is located in the facility type indicated by that facility category after the predetermined time, and uses this probability as the estimation result.
  • For example, suppose the first position estimation unit 216 sets the threshold for the probability of the facility type where the user is located to 0.5, and the second scene estimation unit 115 estimates the future life scene as “dinner”.
  • Among the facility names, the probability 4/10 of using the “A set-meal restaurant” is the mode value, but this does not exceed the threshold of 0.5. Therefore, this estimation result is not used; referring instead to the next facility type level, that is, the facility category, the probability of using a set-meal restaurant is 5/10, which is adopted because it is equal to or greater than the threshold of 0.5.
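The fallback from the detailed level (facility names) to the coarser level (facility categories) can be sketched as follows. The probability tables are hypothetical and merely mirror the set-meal example above (threshold 0.5, facility-name mode 4/10, category probability 5/10); the exact entries are assumptions, not data from the patent.

```python
def estimate_with_fallback(name_probs, category_probs, threshold=0.5):
    """Try the detailed level (facility names) first; if the most probable
    name does not reach the threshold, fall back to facility categories."""
    best_name = max(name_probs, key=name_probs.get)
    if name_probs[best_name] >= threshold:
        return best_name, name_probs[best_name]
    # mode at the name level is below the threshold: use the category level
    best_cat = max(category_probs, key=category_probs.get)
    return best_cat, category_probs[best_cat]

# Hypothetical tables for the future life scene "dinner"
name_probs = {"A set-meal restaurant": 4/10, "B set-meal restaurant": 1/10,
              "ramen restaurant": 3/10, "izakaya": 2/10}
category_probs = {"set-meal restaurant": 5/10, "ramen restaurant": 3/10,
                  "izakaya": 2/10}
result, prob = estimate_with_fallback(name_probs, category_probs)
```

With these tables, the name-level mode (4/10) is below the threshold, so the category "set-meal restaurant" (5/10) is adopted, as in the example in the text.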
  • FIG. 13 is a flowchart of a position estimation process performed by the mobile terminal according to the second embodiment.
  • Since the process from the current position measurement to the estimation of the future life scene (steps S51 to S55) is the same as the process (steps S11 to S15) in FIG. 9 of the first embodiment, a description thereof is omitted.
  • Next, the first position estimation unit 216 estimates, from the first correspondence information, the name of the facility where the user will be located after a predetermined time based on the future life scene, and calculates the probability that the user is located in the facility type indicated by that facility name (step S56).
  • the first position estimating unit 216 determines whether or not the calculated probability is equal to or greater than a predetermined threshold (step S57). If it is equal to or greater than the predetermined threshold (step S57: Yes), the output unit 118 outputs the estimated facility name and the calculated probability to the display unit as an estimation result (step S58).
  • On the other hand, when the probability is less than the predetermined threshold (step S57: No), the first position estimation unit 216 estimates, from the first correspondence information, the category of the facility where the user will be located after the predetermined time based on the future life scene, and calculates the probability that the user is located in the facility type indicated by that facility category (step S59). The output unit 118 then outputs the estimated facility category and the calculated probability to the display unit as the estimation result (step S60).
  • As described above, in the second embodiment, the current life scene of the user is estimated from the current position, the current action state of the user, and the current time, and a future life scene after a predetermined time is estimated based on the estimated current life scene. Then, the mobile terminal 200 estimates the facility name or the facility category where the user will be located after the predetermined time based on the estimated future life scene, calculates the probability that the user is located in the facility type indicated by the estimated facility name or facility category, and outputs them to the display unit.
  • In the first and second embodiments described above, the future life scene is estimated from the user's current life scene, and the facility type where the user is located after a predetermined time is estimated based on the future life scene.
  • In the third embodiment, a technique is added that further stores the user's movement history and estimates the position where the user will be located after a predetermined time based on a movement pattern extracted from the stored movement history. Below, a configuration in which this technique is added to Embodiment 2 is described.
  • FIG. 14 is a block diagram of an example of a functional configuration of the mobile terminal according to the third embodiment.
  • As illustrated in FIG. 14, the mobile terminal 300 includes a positioning unit 111, a detection unit 112, a first scene estimation unit 113, a storage unit 114, a second scene estimation unit 115, and a first position estimation unit 216, and further includes an extraction unit 321, a second position estimation unit 322, a determination unit 323, a history storage unit 351, and a pattern storage unit 352.
  • The portable terminal 300 according to the third embodiment further has a hardware configuration including a CPU (not shown) and a storage medium such as an HDD or a memory, as in the first and second embodiments.
  • Components given the same reference numerals as in the first embodiment are the same as in the first embodiment, and their description is omitted.
  • The external configuration of the mobile terminal 300 according to the third embodiment is also the same as that of the first embodiment.
  • In addition to the estimation of the future position based on life scene transitions performed by the mobile terminal 200 of the second embodiment, the mobile terminal 300 stores a movement history of the current positions measured by the positioning unit 111.
  • The future position is then also estimated based on this movement history, and the estimated future position is corrected accordingly.
  • the history storage unit 351 is a storage medium such as an HDD or a memory that stores a movement history in which the current position measured by the positioning unit 111 and the current time are recorded over time.
  • FIG. 15 is a diagram illustrating an example of the movement history. As shown in FIG. 15, the movement history records, in association with the date and time when each current position was measured, the current position (a position indicated by latitude and longitude), a label identifying the current position, and the name of the facility at the current position.
  • The pattern storage unit 352 is a storage medium such as an HDD or a memory that stores movement patterns in which a current position measured by the positioning unit 111 is associated with the probability that the user is located at each other position when moving from that current position.
  • FIG. 16 is a diagram illustrating an example of the movement pattern. As shown in FIG. 16, the movement pattern associates labels of current positions, such as “position 01” and “position 02”, with their next positions, such as “position 02”, together with the probabilities, such as “23/46” and “23/25”, that the user is located at each next position.
  • The extraction unit 321 generates the movement history from the current position and current time measured by the positioning unit 111 and stores it in the history storage unit 351. It then extracts movement patterns based on the temporal transition of the current position in the stored movement history, and stores the extracted movement patterns in the pattern storage unit 352. Specifically, the extraction unit 321 registers a location as a stay location when the user remains within a predetermined radius of it for a predetermined time or more. When the user later passes a registered location, the extraction unit 321 regards the passage as one stay. For example, the predetermined radius is set to 100 m and the predetermined time to 5 minutes. The extraction unit 321 then extracts the movement patterns by calculating, for each stay location, the frequency of moving to each other stay location, and thereby the probability that the user is located at each next location.
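The stay-location and movement-pattern extraction can be sketched as below. This is an illustrative simplification, not the patented algorithm: the distance function is a flat-earth approximation, the sample position history is invented, and the "passing a registered location counts as one stay" rule is omitted for brevity.

```python
import math

def dist_m(p, q):
    """Approximate distance in metres between two (lat, lon) points."""
    lat = math.radians((p[0] + q[0]) / 2)
    dy = (p[0] - q[0]) * 111_320
    dx = (p[1] - q[1]) * 111_320 * math.cos(lat)
    return math.hypot(dx, dy)

def extract_stays(history, radius_m=100, min_minutes=5):
    """history: list of (minutes, (lat, lon)) samples in time order.
    A stay is registered when the user remains within radius_m of a point
    for at least min_minutes; the anchor point of each stay is returned."""
    stays, i = [], 0
    while i < len(history):
        j = i
        while (j + 1 < len(history)
               and dist_m(history[i][1], history[j + 1][1]) <= radius_m):
            j += 1
        if history[j][0] - history[i][0] >= min_minutes:
            stays.append(history[i][1])
            i = j + 1
        else:
            i += 1
    return stays

def transition_probs(stay_sequence):
    """Probability of each next stay location given the current one,
    computed from transition frequencies in the sequence."""
    counts = {}
    for a, b in zip(stay_sequence, stay_sequence[1:]):
        counts.setdefault(a, {}).setdefault(b, 0)
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

# Invented sample history: ~6 minutes near one point, then a jump away
sample = [(0, (35.0, 139.0)), (3, (35.0001, 139.0)),
          (6, (35.0002, 139.0)), (10, (35.1, 139.1))]
stays = extract_stays(sample)

# Hypothetical labelled stay sequence observed over several days
pattern = transition_probs(["home", "office", "restaurant", "office",
                            "home", "office", "home"])
```

The resulting `pattern` dictionary plays the role of the movement pattern in FIG. 16: for each stay location it gives the probability of each next location.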
  • The second position estimation unit 322 calculates, from the movement patterns, the probability that the user who owns the mobile terminal 300 is located at the next position corresponding to the current position after a predetermined time. For example, when the user is currently staying at “position 04”, the second position estimation unit 322 estimates from the movement pattern that the user will move to the next position “position 03” with a probability of 13/18 and to the next position “position 05” with a probability of 5/18 after the predetermined time.
  • Here, an estimation method that extracts movement patterns from the movement history has been described; however, a method using a Bayesian network or a hidden Markov model may also be used.
  • The determination unit 323 determines the final estimation result by comparing the next position where the user will be located after a predetermined time, estimated by the second position estimation unit 322 based on the movement pattern, with the facility type where the user will be located after the predetermined time, estimated by the first position estimation unit 216. That is, when the probability of being located at the next position estimated by the second position estimation unit 322 is equal to or greater than a predetermined threshold, the determination unit 323 uses the next position estimated by the second position estimation unit 322 and the calculated probability as the estimation result. On the other hand, when the probability of being located at the next position estimated by the second position estimation unit 322 is less than the predetermined threshold, the determination unit 323 uses the facility type estimated by the first position estimation unit 216 and the calculated probability as the estimation result.
  • For example, suppose the predetermined threshold is 0.3. When the probability calculated by the second position estimation unit 322 is 0.3 or more, the determination unit 323 determines the next position estimated by the second position estimation unit 322 as the final estimation result, and it is output by the output unit 118. On the other hand, when the probability calculated by the second position estimation unit 322 is less than 0.3, the determination unit 323 determines the facility type estimated by the first position estimation unit 216 as the final estimation result, and it is output by the output unit 118.
  • In this way, for a place where a sufficient movement history has been accumulated, an estimation using the movement history (estimation by the second position estimation unit 322) is output, while for a place the user has visited for the first time, the estimation based on the life scene described in Embodiment 2 (estimation by the first position estimation unit 216) can be output.
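The determination step can be sketched as follows, using the 0.3 threshold from the example. The two input estimates are hypothetical (label, probability) pairs standing in for the outputs of the second and first position estimation units, respectively.

```python
def decide(history_estimate, scene_estimate, threshold=0.3):
    """Prefer the movement-history estimate (second position estimation unit)
    when its probability reaches the threshold; otherwise fall back to the
    life-scene estimate (first position estimation unit).

    Each estimate is a (label, probability) pair; history_estimate may be
    None at a first-visit place with no movement pattern."""
    if history_estimate is not None and history_estimate[1] >= threshold:
        return history_estimate
    return scene_estimate

# At a familiar place the movement pattern is confident enough:
r1 = decide(("position 03", 13/18), ("set-meal restaurant", 0.5))
# At a first-visit place there is no usable movement pattern:
r2 = decide(None, ("set-meal restaurant", 0.5))
```

This reflects the design choice above: the history-based estimate is used when available and confident, and the scene-based estimate guarantees an output even at places visited for the first time.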
  • FIG. 17 is a flowchart of a position estimation process performed by the mobile terminal according to the third embodiment. It is assumed that portable terminal 300 performs the position estimation process (FIG. 13) in the second embodiment in parallel with the following process.
  • the positioning unit 111 measures the current position of the mobile terminal 300 (step S71). Then, the extraction unit 321 generates a movement history from the current position and the current time and stores the movement history in the history storage unit 351 (step S72).
  • the extraction unit 321 extracts a movement pattern based on the stored movement history, and stores the extracted movement pattern in the pattern storage unit 352 (step S73).
  • the second position estimation unit 322 calculates the probability that the user is located at the next position after a predetermined time based on the movement pattern (step S74).
  • Next, the determination unit 323 determines whether the probability calculated by the second position estimation unit 322 is equal to or greater than a predetermined threshold (step S75). When it is equal to or greater than the predetermined threshold (step S75: Yes), the determination unit 323 determines the next position estimated by the second position estimation unit 322 and the calculated probability as the estimation result, and the output unit 118 outputs the estimation result to the display unit (step S77). On the other hand, when it is less than the predetermined threshold (step S75: No), the determination unit 323 determines the facility type estimated by the first position estimation unit 216 and the calculated probability as the estimation result, and the output unit 118 outputs the estimation result to the display unit (step S76).
  • As described above, in the third embodiment, by comparing the facility type estimated from the user's current life scene together with its probability against the next position estimated from the movement history of where the user actually moved together with the probability that the user is located at that next position, an estimation result suited to the user's situation can be output to the display unit. Accordingly, as in the second embodiment, the facility type where the user is located after a predetermined time can be estimated even at a place visited for the first time, and in addition the position can be estimated based on the movement history.
  • Note that the present invention is not limited to the above-described embodiments as they are, and in the implementation stage the constituent elements can be modified and embodied without departing from the scope of the invention.
  • In addition, various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments. For example, some components may be deleted from all the components shown in an embodiment. Furthermore, constituent elements across different embodiments may be appropriately combined.
  • As described above, the position estimation device according to the present invention is useful for estimating the position where a user is located after a predetermined time, and is particularly suitable for estimating the position where the user is located after a predetermined time at a place the user has visited for the first time.


Abstract

A position estimating device is provided with: a positioning section (111) which measures a current position; a detecting section (112) which detects the current action state of a user; a first scene estimating section (113) which estimates the current life scene of the user by means of the life scene corresponding to the current position, the current action state, and the current time, from first estimation information containing life scenes, which are action scenes in the life of the user, time zones, positions, and action states of the user; a second scene estimating section (115) which estimates a future life scene of the user by means of the next scene corresponding to the current life scene, from second estimation information containing the life scenes and next scenes; and a first position estimating section (116) which estimates the kind of facility where the user will be present a predetermined time later by means of the kind of facility corresponding to the future life scene, from first association information associating the life scenes with the kinds of facilities where the user is present in those life scenes.

Description

Position estimation device

The present invention relates to estimation of a position where a user is located.

Conventionally, a technique is known for estimating the position where a user will be located after a predetermined time, using the current position of the user who carries a position estimation device. For example, a destination prediction apparatus that predicts a destination by reflecting the user's behavior characteristics has been disclosed (see Patent Document 1). The destination prediction apparatus of Patent Document 1 detects the user's position information and date/time and accumulates the user's movement history. It then calculates, from the accumulated movement history, the frequency of arrival at a destination (arrival frequency) and the frequency of arrival at the destination on each day of the week within a predetermined period (appearance frequency), and based on these, calculates action classes, each combining an arrival frequency equal to or above a threshold with one day of the week whose appearance frequency is equal to or above a threshold. Based on the accumulated movement history, the calculated action classes, and the user's current position information, the destination to which the user is heading is then predicted.

JP 2009-36594 A

However, since the destination prediction apparatus of Patent Document 1 predicts the destination using a movement history and action classes accumulated in advance, it cannot estimate the position where the user will be located after a predetermined time at a place visited for the first time.

The present invention has been made in view of the above, and an object thereof is to obtain a position estimation device that can estimate the position where a user will be located after a predetermined time, even at a place visited for the first time, by using life scenes that are scenes of actions in the user's life.

In order to solve the above-described problems and achieve the object, the present invention comprises: a first estimation storage unit that stores first estimation information including life scenes, which are scenes of actions in a user's life, the time zone of each life scene, the position of each life scene, and the action state of the user in each life scene; a first correspondence storage unit that stores first correspondence information in which each life scene is associated with a facility type where the user is located in that life scene; a second estimation storage unit that stores second estimation information including each life scene and a next scene representing the scene of the action the user performs after that life scene; a positioning unit that measures a current position; a detection unit that detects the current action state of the user; a first scene estimation unit that estimates the current life scene of the user from the first estimation information, using the life scene corresponding to the current position, the current action state, and the current time; a second scene estimation unit that estimates a future life scene, which is the life scene of the user after a predetermined time, from the second estimation information, using the next scene corresponding to the current life scene; and a first position estimation unit that estimates the facility type where the user will be located after the predetermined time from the first correspondence information, using the facility type corresponding to the future life scene.

According to the present invention, it is possible to estimate the position where the user is located after a predetermined time, even in a place visited for the first time, by using a life scene that is an action scene in the life of the user.

FIG. 1 is an external configuration diagram illustrating an example of a mobile terminal.
FIG. 2 is a block diagram illustrating an example of a functional configuration of the mobile terminal.
FIG. 3 is a diagram illustrating an example of first estimation information.
FIG. 4 is a diagram illustrating an example of second correspondence information.
FIG. 5 is a diagram illustrating an example of first correspondence information.
FIG. 6 is a diagram illustrating an example of second estimation information.
FIG. 7 is a diagram illustrating an example of display of an estimation result.
FIG. 8 is a diagram illustrating an example of display of an estimation result.
FIG. 9 is a flowchart illustrating the flow of position estimation processing by the mobile terminal.
FIG. 10 is a flowchart illustrating the flow of position estimation processing by the mobile terminal.
FIG. 11 is a block diagram illustrating an example of a functional configuration of the mobile terminal.
FIG. 12 is a diagram illustrating an example of first correspondence information.
FIG. 13 is a flowchart illustrating the flow of position estimation processing by the mobile terminal.
FIG. 14 is a block diagram illustrating an example of a functional configuration of the mobile terminal.
FIG. 15 is a diagram illustrating an example of a movement history.
FIG. 16 is a diagram illustrating an example of a movement pattern.
FIG. 17 is a flowchart illustrating the flow of position estimation processing by the mobile terminal.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS: Hereinafter, preferred embodiments of a position estimation device according to the present invention will be described in detail with reference to the accompanying drawings. In the following embodiments, the position estimation device of the present invention is applied to a mobile terminal such as a mobile phone or a PDA (Personal Digital Assistant); however, the present invention is not limited to this and is applicable to any portable device that can be equipped with a positioning function and a function for detecting the action state of its owner.

(Embodiment 1)
FIG. 1 is an external configuration diagram illustrating an example of the mobile terminal according to the first embodiment. At least one positioning sensor and at least one body motion sensor are connected to the mobile terminal 100.

The positioning sensor 101 is a sensor that receives radio waves from GPS satellites at predetermined time intervals in order to measure the current position of the mobile terminal 100. In the present embodiment, the positioning sensor 101 uses a GPS function as a typical configuration. However, positioning is not limited to the GPS function. For example, when the mobile terminal 100 is a mobile phone, the positioning sensor 101 may be configured as an antenna that receives radio waves from communication base stations, and positioning may be performed by identifying the communication base station that emitted the received radio waves. The positioning sensor 101 may also be configured as an antenna that receives radio waves from wireless LAN (Local Area Network) access points, with positioning performed by identifying the access point that emitted the received radio waves. Furthermore, the positioning result does not necessarily need to be acquired by the mobile terminal 100; the user may carry an RFID (Radio Frequency IDentification) tag, and positioning may be performed by an external system consisting of an RFID antenna and its associated system.

The body motion sensor 102 detects, at predetermined time intervals, a signal corresponding to the movement of the user carrying the mobile terminal 100. In the present embodiment, the body motion sensor 102 is configured, as a typical example, as a three-axis acceleration sensor that detects acceleration data corresponding to the user's movement. However, the sensor is not limited to this, and the body motion sensor 102 may instead be constituted by, for example, a gyroscope, a direction sensor, or the like.

In the present embodiment, a case where a single user uses the mobile terminal 100 will be described, but the embodiment is not limited to this; a single mobile terminal may also be shared by a plurality of users.

FIG. 2 is a block diagram of an example of a functional configuration of the mobile terminal according to the first embodiment. As illustrated in FIG. 2, the mobile terminal 100 mainly includes a positioning unit 111, a detection unit 112, a first scene estimation unit 113, a storage unit 114, a second scene estimation unit 115, a first position estimation unit 116, a scene update unit 117, an output unit 118, a first estimation storage unit 131, a first correspondence storage unit 142, a second estimation storage unit 132, and a second correspondence storage unit 141. The mobile terminal 100 according to the first embodiment further has a hardware configuration including a CPU and storage media such as an HDD and a memory (not shown).

The positioning unit 111 measures the current position of the mobile terminal 100 using the radio waves received from GPS satellites by the positioning sensor 101.

The detection unit 112 acquires current acceleration data from the body motion sensor 102 and detects, from the acquired acceleration data, the current motion state (body movement) of the user carrying the mobile terminal 100. Specifically, the detection unit 112 first estimates the direction of gravity from the acquired acceleration data. The direction of gravity is estimated by computing the average of the acceleration data over a fixed period as a three-dimensional vector and then normalizing that vector to a magnitude of 1 G. The detection unit 112 then subtracts the gravity vector from the acceleration data to obtain acceleration data with the gravity component removed. Furthermore, the detection unit 112 calculates three feature values: the length of the resulting acceleration vector, its inner product with the gravity vector, and its outer product with the gravity vector. These feature values are unaffected by the orientation of the mobile terminal 100 containing the acceleration sensor, so regardless of how the user holds the mobile terminal 100, they can be used as features for obtaining the context of the user's means of movement.
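The gravity removal and orientation-invariant feature computation described above can be sketched as follows. This is an illustrative reconstruction under assumed window handling, not the patented implementation itself.

```python
import numpy as np

def motion_features(window: np.ndarray) -> np.ndarray:
    """window: (N, 3) raw 3-axis acceleration samples in units of g.
    Returns (N, 3) features [magnitude, inner product, outer-product
    magnitude] that do not depend on how the terminal is oriented."""
    g = window.mean(axis=0)                # average over the fixed period
    g = g / np.linalg.norm(g)              # normalize to a 1-G vector
    linear = window - g                    # remove the gravity component
    mag = np.linalg.norm(linear, axis=1)   # length of each residual vector
    inner = linear @ g                     # inner product with gravity
    outer = np.linalg.norm(np.cross(linear, g), axis=1)  # outer product
    return np.column_stack([mag, inner, outer])
```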

Next, the detection unit 112 calculates statistical feature values from these three feature values. The statistical feature values are the maximum, minimum, mean, and variance of each feature value over a fixed period. Calculating these four values for each of the three features yields a 12-dimensional value, which captures the temporal transition of the acceleration. The detection unit 112 then applies a neural network to the 12-dimensional value to infer which of four states the user is currently in: running, walking, riding a vehicle, or stationary. The neural network is assumed to have been trained in advance using statistical feature values acquired in each of the four states as labeled correct data. Each of the four states is represented by a value from 0 to 1, with a larger value meaning a higher degree of confidence in that state. In this way, the detection unit 112 can detect the four states "running", "walking", "riding a vehicle", and "stationary" based on the acceleration data from the acceleration sensor (body motion sensor 102).
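The 12-dimensional statistical feature vector fed to the classifier can be sketched as follows; the ordering of the twelve components is an assumption, as the text does not fix it.

```python
import numpy as np

def statistical_features(features: np.ndarray) -> np.ndarray:
    """features: (N, 3) per-sample feature values over a fixed window.
    Returns the 12-dimensional vector [max, min, mean, variance] x 3
    features that is input to the neural-network state classifier."""
    return np.concatenate([
        features.max(axis=0),
        features.min(axis=0),
        features.mean(axis=0),
        features.var(axis=0),
    ])
```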

In the present embodiment, the user's motion state is detected from the acceleration data detected by the body motion sensor 102, but the embodiment is not limited to this. The sensor may instead be configured as a microphone, a camera, or the like, and the motion state of the user carrying the mobile terminal 100 may be detected by estimating it with a certain accuracy from the audio signal from the microphone or the imaging signal from the camera.

The first estimation storage unit 131 is a storage medium such as an HDD (Hard Disk Drive) or a memory that stores first estimation information for estimating the user's current life scene. Here, a life scene is a scene of an activity the user performs in daily life, such as "work", "business trip", "dinner", "shopping", or "homebound ride"; that is, it indicates the user's behavioral situation in daily life. The first estimation information is registered in the mobile terminal 100 in advance.

FIG. 3 is a diagram illustrating an example of the first estimation information for estimating a user's life scene. The first estimation information shown in FIG. 3 is a table generated for an individual user by registering the locations of the user's home, workplace, frequently visited stores, and the like, together with the time zones during which the user is at those locations. Although the first estimation information is stored here in table format, it is not limited to this and may be stored in other formats.

As illustrated in FIG. 3, the first estimation information associates a life scene, which is a scene of an activity in the user's life, with the time zone of that life scene, the position of that life scene, and the user's motion state in that life scene, and stores entries 1 to 12 in descending order of priority. For example, if the life scene is "work", the user is "stationary" or "walking" during the time zone "8:00 to 22:00" at the position with latitude and longitude "35.495500, 139.593122". Since the position in the first estimation information of FIG. 3 indicates a central latitude and longitude, the distance from the user's actual position is calculated, and the user is considered to be at that location if the distance is within a certain range, for example within 100 m.
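The "within 100 m" test can be sketched as below. The haversine formula is an assumed choice; the text only requires that the distance from the registered central latitude and longitude fall within a certain range.

```python
import math

def is_at_location(lat1, lon1, lat2, lon2, radius_m=100.0):
    """True when the measured position (lat2, lon2) lies within
    radius_m metres of the registered position (lat1, lon1),
    using the great-circle (haversine) distance."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m
```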

The second correspondence storage unit 141 is a storage medium such as an HDD or a memory that stores second correspondence information for estimating the position where the user is located. FIG. 4 is a diagram illustrating an example of the second correspondence information, which associates facility types with their positions. As shown in FIG. 4, the second correspondence information is a table that associates a facility type indicating the kind of facility, the position of that facility, and the probability that the user was located at that position of that facility type. For example, the probability that the user is at the "company" facility at latitude and longitude "35.495500, 139.593122" is "17/20". Although the second correspondence information is stored here in table format, it is not limited to this and may be stored in other formats.

The first correspondence storage unit 142 is a storage medium such as an HDD or a memory that stores first correspondence information for estimating the facility type where the user is located. FIG. 5 is a diagram illustrating an example of the first correspondence information, which associates life scenes with facility types. As shown in FIG. 5, the first correspondence information is a table that associates a life scene, the facility type where the user is located in that life scene, and the probability that the user was located at that facility type. For example, when the life scene is "dinner", the probability that the user is at a facility of type "restaurant" is "12/20". Although the first correspondence information is stored here in table format, it is not limited to this and may be stored in other formats.

The first correspondence information described above is preferably constructed and updated from the individual user's history, but this is not a requirement. By using general-purpose data prepared in advance by the creator of the mobile terminal 100, the future position can be estimated with a certain accuracy from the outset.

The second estimation storage unit 132 is a storage medium such as an HDD or a memory that stores second estimation information for estimating a future life scene, that is, the user's life scene a predetermined time later. FIG. 6 is a diagram illustrating an example of the second estimation information, which associates the current life scene with the next scene, i.e., the following life scene. As shown in FIG. 6, the second estimation information is a table that associates a life scene with the shortest, average, and longest durations for which that life scene has continued, and with the next scenes representing the activities the user performs after that life scene. Each next scene is accompanied by the probability that the user's activity actually became that next scene. For example, when the current life scene is "work", the shortest duration of work is "4 hours", the average is "9 hours", and the longest is "14 hours"; the probability that the next life scene is "dinner" is 6/16, "homebound ride" 4/16, and "shopping" 2/16. Although the second estimation information is stored here in table format, it is not limited to this and may be stored in other formats. The second estimation information is registered in the mobile terminal 100 in advance.

The first scene estimation unit 113 estimates the current life scene of the user carrying the mobile terminal 100. Specifically, from the first estimation information, the first scene estimation unit 113 selects, among the life scenes corresponding to the current position measured by the positioning unit 111, the current motion state detected by the detection unit 112, and the current time kept by a clock unit (not shown), the life scene with the highest priority as the user's current life scene.

By using the current motion state in estimating the life scene, the first scene estimation unit 113 can identify motion states such as walking or being stationary that cannot be determined by the positioning function alone; for example, it can estimate that the user is riding a vehicle even in situations where positioning is difficult, such as inside a train. If no life scene applies, the first scene estimation unit 113 returns "estimation impossible". Also, although the current life scene is estimated here from the current position measured by the positioning unit 111 and the motion state detected by the detection unit 112 using the first estimation information, the estimation is not limited to this and may instead be based on electronic schedule data, the operation history of the mobile terminal, or the communication history.

The storage unit 114 acquires from the second correspondence information the facility type corresponding to the current position measured by the positioning unit 111, and updates the first correspondence information in the first correspondence storage unit 142 by storing the current life scene estimated by the first scene estimation unit 113 in association with the acquired facility type. The storage unit 114 also updates, based on the transitions of the user's life scenes, the probability of being at each facility type in the first correspondence information and the probability of being at each facility position in the second correspondence information.

Using the next scenes that the second estimation information associates with the current life scene estimated by the first scene estimation unit 113, the second scene estimation unit 115 estimates the future life scene, that is, the life scene of the user carrying the mobile terminal 100 a predetermined time later.

Specifically, the second scene estimation unit 115 first estimates the end time of the current life scene by recording the current life scene and the time at which it began. Here, the average duration in FIG. 6 is used as the duration of a life scene. For example, if the current life scene is "work" and 8 hours 40 minutes have already elapsed since it began, then referring to the average duration, the "work" life scene continues for 9 hours, so it can be estimated to end in about 20 minutes.
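The remaining-time estimate in this example reduces to a simple subtraction against the recorded start time, which can be sketched as:

```python
from datetime import timedelta

def estimated_time_remaining(average: timedelta, elapsed: timedelta) -> timedelta:
    """Expected time until the current life scene ends, based on its
    average duration from the second estimation information
    (clamped so it never goes negative)."""
    return max(average - elapsed, timedelta(0))
```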

Accordingly, when the mobile terminal 100 estimates the future life scene 30 minutes ahead, the probability of having transitioned to the "dinner" life scene is 6/16, to the "homebound ride" life scene 4/16, and to the "shopping" life scene 2/16. Although omitted in FIG. 6, the same applies when there are four or more next scenes. The estimation can also be made more accurate by correcting these probabilities using the shortest and longest durations, or by holding end times and their distributions in the second estimation information.

The first position estimation unit 116 estimates, from the first correspondence information, the facility type corresponding to the future life scene as the facility type where the user carrying the mobile terminal 100 will be located after the predetermined time. The first position estimation unit 116 then calculates, as the probability that the user will be at the estimated facility type, the product of the probability in the second estimation information of transitioning to the next scene corresponding to the current life scene and the probability in the first correspondence information of being at the facility type corresponding to the future life scene.

For example, in the example described above, the user's future life scene 30 minutes ahead is "dinner" with probability 6/16, "homebound ride" with probability 4/16, and "shopping" with probability 2/16. Using the first correspondence information of FIG. 5, the first position estimation unit 116 calculates the probability P_type that the user will be at each facility type corresponding to these future life scenes after the predetermined time, according to the following equation.

P_type = Σ_scene ( P_scene × P(scene, type) )

 P_type: probability that the user is located at facility type "type" after the predetermined time
 P_scene: probability of occurrence of the future life scene "scene" after the predetermined time
 P(scene, type): probability that facility type "type" is used during the future life scene "scene"

In the example above, the first position estimation unit 116 thus calculates that, 30 minutes later, the probability that the user carrying the mobile terminal 100 will be at a facility of type "railway" in the "homebound ride" life scene is 1/4, at a "restaurant" in the "dinner" life scene 9/40, at a "company" in the "dinner" life scene 3/20, and at a "store" in the "shopping" life scene 1/8.
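These figures follow directly from the equation P_type = Σ P_scene × P(scene, type), as the sketch below shows. Only the "dinner"/"restaurant" value (12/20) is given in the text; the other P(scene, type) values here are assumptions chosen to reproduce the stated results.

```python
from fractions import Fraction as F

# P_scene: next-scene probabilities 30 minutes ahead (FIG. 6)
p_scene = {"dinner": F(6, 16), "homebound ride": F(4, 16), "shopping": F(2, 16)}

# P(scene, type): "dinner"/"restaurant" is from FIG. 5; the rest are
# assumed values consistent with the worked example.
p_scene_type = {
    ("homebound ride", "railway"): F(1, 1),
    ("dinner", "restaurant"): F(12, 20),
    ("dinner", "company"): F(8, 20),
    ("shopping", "store"): F(1, 1),
}

# P_type = sum over scenes of P_scene * P(scene, type)
p_type = {}
for (scene, ftype), p in p_scene_type.items():
    p_type[ftype] = p_type.get(ftype, F(0)) + p_scene[scene] * p
```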

The first position estimation unit 116 further estimates, from the second correspondence information, the position corresponding to the facility type computed for the predetermined time later as the position where the user carrying the mobile terminal 100 will be located after the predetermined time. The first position estimation unit 116 then calculates, as the probability that the user will be at the estimated facility position, the product of the probability in the second estimation information of transitioning to the next scene corresponding to the current life scene, the probability in the first correspondence information of being at the facility type corresponding to the future life scene, and the probability in the second correspondence information of being at the position corresponding to that facility type. In other words, by using the second correspondence information shown in FIG. 4, the first position estimation unit 116 can determine the specific position from the estimated facility type and calculate the probability that the user will be at that position.

For example, in the second correspondence information of FIG. 4, the probability of using the restaurant at position "35.551331, 139.675131" is 6/8 and the probability of using the restaurant at position "35.552345, 139.671251" is 2/8, so the probability of being at each facility position is calculated by multiplying these probabilities by the probability of being at the facility type. That is, in the example above, the probability that the user will be at the former restaurant 30 minutes later is the probability 9/40 of being at the "restaurant" facility type multiplied by 6/8, i.e., 27/160, and the probability of being at the latter restaurant is likewise 9/40 multiplied by 2/8, i.e., 9/160. The first position estimation unit 116 may also be configured to correct the calculated probabilities according to the current position.
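The per-position step is one more multiplication over the facility-type probability, sketched here with the two restaurant entries from FIG. 4:

```python
from fractions import Fraction as F

p_restaurant_type = F(9, 40)  # probability of being at a "restaurant"
# Location probabilities for the "restaurant" type (FIG. 4)
p_location = {
    "35.551331,139.675131": F(6, 8),
    "35.552345,139.671251": F(2, 8),
}
# Probability of being at each concrete position
p_at_position = {pos: p_restaurant_type * p for pos, p in p_location.items()}
```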

The scene update unit 117 updates the first estimation information registered in advance, using the positions measured, the motion states detected, and the time zones recorded while the user actually goes about daily life.

For example, when updating the first estimation information of FIG. 3, suppose that the time zone corresponding to the life scene "work" in the first estimation information is 8:00 to 22:00, but the user's motion state remains "stationary" until 23:00 even after "work" has continued to 22:00. When the user's motion state continues unchanged in this way, it can be presumed that the preceding life scene is still continuing. In this case, therefore, the scene update unit 117 presumes that the user continued "work" until 23:00 and updates the time zone corresponding to "work" in the first estimation information to 8-23 (8:00 to 23:00).

The scene update unit 117 also updates the next-scene probabilities in the second estimation information registered in advance, using the life scenes observed while the user actually goes about daily life.

For example, when updating the second estimation information of FIG. 6, if the life scene changes to "homebound ride" after having been estimated as "work" at some point, the scene update unit 117 updates the probabilities of the next scenes corresponding to the "work" life scene in the second estimation information. That is, the denominator of the probability of every next scene corresponding to "work" is incremented by 1 to 17, and for the "homebound ride" next scene to which the transition actually occurred, the numerator is also incremented by 1, updating it from 4/16 to 5/17.
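The count update above can be sketched as follows; representing each probability as a (numerator, denominator) pair of observation counts is an assumed encoding.

```python
def update_next_scene_counts(counts: dict, observed: str) -> None:
    """counts maps each next scene to a (numerator, denominator) pair.
    On an observed transition, every denominator is incremented, and the
    numerator of the next scene that actually occurred is incremented."""
    for scene, (num, den) in counts.items():
        counts[scene] = (num + (1 if scene == observed else 0), den + 1)
```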

The output unit 118 outputs, as an estimation result to a display unit (not shown), the facility type estimated by the first position estimation unit 116 and the certainty factor, calculated by the first position estimation unit 116, that is the probability that the user will be at that facility type. FIG. 7 is a diagram illustrating an example of the display of an estimation result. As illustrated in FIG. 7, the certainty factor, i.e., the probability of being at each facility type, is displayed in association with each facility type estimated by the first position estimation unit 116. For example, when the probability that the user will be at a railway is estimated to be 1/4, the probability (certainty factor) "1/4" of being at a "railway" is displayed in association with the facility type "railway".

The output unit 118 further outputs, as an estimation result to the display unit (not shown), the facility position estimated by the first position estimation unit 116 and the certainty factor, calculated by the first position estimation unit 116, that is the probability that the user will be at that position. FIG. 8 is a diagram illustrating an example of the display of an estimation result. As illustrated in FIG. 8, the name of the facility, its position, and the certainty factor, i.e., the probability of being at that position, are displayed in association with each facility type estimated by the first position estimation unit 116. For example, when the probability that the user will be at the restaurant at position "35.551331, 139.675131" is estimated to be 27/160, the restaurant name "Restaurant A", the restaurant position "35.551331, 139.675131", and the probability (certainty factor) "27/160" of being at that restaurant are displayed in association with the facility type "restaurant".

Next, position estimation processing by the mobile terminal 100 according to the present embodiment configured as described above will be described. FIG. 9 is a flowchart of the position estimation processing performed by the mobile terminal according to the first embodiment. The following describes the processing for calculating the probability that the user will be at the facility type estimated for the predetermined time later.

First, the positioning unit 111 measures the current position of the mobile terminal 100 (step S11). The detection unit 112 then detects the current motion state of the user carrying the mobile terminal 100 (step S12).

Next, the first scene estimation unit 113 estimates the current life scene from the first estimation information based on the current position, the current motion state, and the current time (step S13). The storage unit 114 updates the first correspondence information using the facility type corresponding to the current position in the second correspondence information and the current life scene (step S14). The second scene estimation unit 115 then estimates, from the second estimation information, the user's future life scene a predetermined time later based on the current life scene (step S15).

Next, the first position estimation unit 116 estimates, from the first correspondence information, the facility type where the user will be located after the predetermined time based on the future life scene, and calculates the probability that the user will be at that facility type (step S16). The output unit 118 outputs the estimated facility type and the calculated probability to the display unit as an estimation result (step S17).

The following describes the processing for calculating the probability that the user will be at the position of the estimated facility type after the predetermined time. FIG. 10 is a flowchart of this position estimation processing performed by the mobile terminal according to the first embodiment. For example, the mobile terminal 100 accepts, through an operation unit (not shown), a setting input from the user indicating which of the two position estimation processes to perform, and carries out either the position estimation processing of FIG. 9 or that of FIG. 10 based on the accepted setting.

 First, the processing from measuring the current position through estimating the future life scene (steps S31 to S35) is the same as the processing in FIG. 9 (steps S11 to S15), and its description is therefore omitted.

 Next, the first position estimation unit 116 uses the first correspondence information to estimate, from the future life scene, the facility type at which the user will be located after the predetermined time, and calculates the probability that the user will be located at a facility of that type (step S36). The first position estimation unit 116 then estimates the position of the facility type from the second correspondence information and calculates the probability that the user will be located at that position (step S37). The output unit 118 outputs the estimated position of the facility type and the calculated probability to the display unit as the estimation result (step S38).

 As described above, the mobile terminal 100 according to Embodiment 1 estimates the user's current life scene from the current position, the user's current operation state, and the current time, and from the estimated current life scene estimates the future life scene a predetermined time ahead. From the estimated future life scene, the mobile terminal 100 then estimates the facility type, or the position of the facility type, at which the user will be located after the predetermined time, calculates the probability that the user will be located there, and outputs these to the display unit. Because the user's current life scene is used, the facility type, or its position, at which the user is likely to be located after the predetermined time can be estimated from the continuity of life scenes rather than from the continuity of the user's positions. The user's facility type or facility position after the predetermined time can therefore be estimated even at a place the user is visiting for the first time.

 In other words, even when no movement history is stored for a place the user is visiting for the first time, the position after the predetermined time can be estimated in a way that reflects the user's behavior pattern. This is because the patterns in a person's daily life are likely to carry over even to unfamiliar places. For example, even at a first-visit location it can be estimated that a business-trip scene will be followed by a dinner scene; if, in the dinner scene, the user goes to a set-meal restaurant with a probability of 50% and to a ramen shop with a probability of 30%, the position where the user will be after the predetermined time can be estimated from these probabilities.
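The chain of estimates in this example (a business-trip scene followed by a dinner scene, then a 50%/30% split over facility types) can be sketched as two table lookups combined by multiplication, in the spirit of steps S15 and S16. All table contents below, including the 0.6 scene-transition probability and the English facility labels, are invented for illustration; they are not data from the embodiment.

```python
# Hypothetical per-user statistics: scene -> next-scene probabilities
# (second estimation information) and scene -> facility-type
# probabilities (first correspondence information).
NEXT_SCENE = {"business_trip": {"dinner": 0.6, "return_home": 0.4}}
SCENE_TO_FACILITY = {"dinner": {"set_meal_restaurant": 0.5,
                                "ramen_shop": 0.3,
                                "family_restaurant": 0.2}}

def estimate_future_facilities(current_scene):
    """Return {facility_type: probability} for the predetermined time ahead."""
    result = {}
    for scene, p_scene in NEXT_SCENE.get(current_scene, {}).items():
        for facility, p_fac in SCENE_TO_FACILITY.get(scene, {}).items():
            # Scene-transition probability times facility probability
            # given that scene, summed over possible next scenes.
            result[facility] = result.get(facility, 0.0) + p_scene * p_fac
    return result

probs = estimate_future_facilities("business_trip")
best = max(probs, key=probs.get)  # most likely facility type
```

A scene with no facility entries (such as the hypothetical "return_home" above) simply contributes nothing to the result.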

 The life scenes of the above embodiment are not limited to the examples given. For example, the life scene "dinner" can be further subdivided into "dinner", "business dinner", "banquet", and so on. In that case, by having the detection unit 112 detect a companion, for example, it can be estimated that a "banquet" life scene is likely to occur after the predetermined time, and on that basis the first position estimation unit 116 can estimate that the user is likely to be at an izakaya (Japanese pub) or a yakitori restaurant. That is, the future position can also be estimated with behavioral states such as the presence or absence of a companion taken into account.

 In the above embodiment, the future life scene is estimated as described above; to estimate it with higher accuracy, however, it is effective to base the estimate not only on the current life scene but also on the current behavioral state, the current position, the current time, the day of the week, and so on. To this end, it is effective to have the second estimation information hold the current behavioral state, the current position, the facility type at which the user is currently located, the current time, the day of the week, and the like. Other techniques, such as a Bayesian network or a hidden Markov model, may also be used to estimate the future life scene. In that case, as described above, the future life scene after the predetermined time is estimated using not only the current life scene but also behavior information, state information, and environment information about the user carrying the mobile terminal, such as the current behavioral state, the current position, the facility type at which the user is currently located, the current time, and the day of the week.
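As one minimal illustration of conditioning the estimate on more than the current scene, the sketch below keys the second estimation information on a (current scene, day of week) pair rather than on the scene alone. The table keys and probabilities are entirely hypothetical, and a real implementation might instead use one of the probabilistic models mentioned above.

```python
# Hypothetical second estimation information extended with an extra
# feature (day of week); all keys and probabilities are invented.
NEXT_SCENE_BY_DAY = {
    ("work", "Fri"): {"banquet": 0.5, "dinner": 0.3, "return_home": 0.2},
    ("work", "Mon"): {"dinner": 0.6, "return_home": 0.4},
}

def estimate_future_scene(current_scene, weekday):
    """Most likely life scene a predetermined time ahead, or None when
    the (scene, weekday) pair has never been observed."""
    table = NEXT_SCENE_BY_DAY.get((current_scene, weekday))
    if table is None:
        return None  # in practice, fall back to a scene-only table
    return max(table, key=table.get)

friday_scene = estimate_future_scene("work", "Fri")
```

Adding the weekday lets the same "work" scene lead to different predictions on different days, which is the gain the paragraph above describes.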

 Furthermore, when estimating the position of the facility type at which the user will be located after the predetermined time, the measured current position may be used so that the estimate takes reachability into account. Specifically, suppose the current life scene is estimated to be "work" and the future life scene 30 minutes later is estimated to be "dinner" with a probability of 6/16. In the above embodiment, the positions at which the user is likely to have dinner were calculated and output as the estimation result. However, if the relationship between the current position and an estimated future position makes it difficult to get there within 30 minutes, it is reasonable to estimate differently: from the positional relationship between the latitude and longitude of the current position and each estimated future position, the feasibility of moving there within the specified time (30 minutes in this example) is calculated, and unreachable positions are removed.

 For example, when the current position is (35.495500, 139.593122) and the estimated travel time to the restaurant at (35.551331, 139.675131) is 50 minutes, that restaurant cannot be reached within 30 minutes; this candidate is therefore removed, the probabilities are recalculated in the same way using the remaining positions, and the estimation result is output. The estimated travel time here may be calculated from past travel times, or with an existing calculation method such as a navigation service.
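The reachability filter described here can be sketched as follows. Since the embodiment leaves the travel-time calculation open, the sketch substitutes a rough stand-in (great-circle distance at an assumed fixed average speed) for a past-history or navigation-service estimate; the two nearby candidate coordinates and all probabilities are invented, while the far coordinate is the one quoted above.

```python
import math

def travel_minutes(a, b, speed_km_h=10.0):
    """Stand-in travel time: great-circle distance at an assumed speed."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    d_km = 2 * 6371 * math.asin(math.sqrt(
        math.sin((lat2 - lat1) / 2) ** 2 +
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2))
    return d_km / speed_km_h * 60

def filter_reachable(current, candidates, limit_min=30):
    """Drop candidates unreachable within limit_min, then renormalize."""
    reachable = {pos: p for pos, p in candidates.items()
                 if travel_minutes(current, pos) <= limit_min}
    total = sum(reachable.values())
    return {pos: p / total for pos, p in reachable.items()} if total else {}

current = (35.495500, 139.593122)
candidates = {(35.551331, 139.675131): 0.5,  # too far at the assumed speed
              (35.500000, 139.600000): 0.3,  # nearby, reachable
              (35.505000, 139.590000): 0.2}  # nearby, reachable
result = filter_reachable(current, candidates)
```

Renormalizing after removal keeps the remaining candidates a proper probability distribution, matching "the probabilities are recalculated using the remaining positions".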

(Embodiment 2)
 In Embodiment 1, the facility type at which the user will be located after the predetermined time is estimated using first correspondence information that associates each life scene with a facility type. In the present embodiment, by contrast, that facility type is estimated using first correspondence information that hierarchically associates each life scene with a plurality of facility types.

 FIG. 11 is a block diagram of an example of the functional configuration of the mobile terminal according to Embodiment 2. As shown in FIG. 11, the mobile terminal 200 mainly includes a positioning unit 111, a detection unit 112, a first scene estimation unit 113, a storage unit 114, a second scene estimation unit 115, a first position estimation unit 216, a scene update unit 117, an output unit 118, a first estimation storage unit 131, a second estimation storage unit 132, a second correspondence storage unit 141, and a first correspondence storage unit 242. As in Embodiment 1, the mobile terminal 200 according to Embodiment 2 has a hardware configuration including a CPU and storage media such as an HDD and memory (not shown). Components given the same reference numerals as in Embodiment 1 are the same as in Embodiment 1, and their description is omitted. The external configuration of the mobile terminal 200 according to Embodiment 2 is also the same as in Embodiment 1.

 The first correspondence storage unit 242 is a storage medium, such as an HDD or memory, that stores the first correspondence information used to estimate the facility type at which the user is located. FIG. 12 is a diagram of an example of first correspondence information that associates life scenes with facility types. As shown in FIG. 12, the first correspondence information is a table that associates each life scene with the names of the facilities, within the facility types, at which the user has been located in that life scene; the probability that the user was located at the facility type indicated by each facility name; the facility categories within the facility types for that life scene; and the probability that the user was located at the facility type indicated by each facility category. A facility name is a facility type contained within a facility category, classifying that category in finer detail. For example, when the life scene is "business trip", the probability that the user is located at the facility type "second factory" indicated by a facility name is 3/4, the probability for the facility type "third factory" indicated by a facility name is 1/4, and the probability for the facility type "factory" indicated by the facility category is 4/4.

 Although facility names and facility categories are described here as two layers, three or more layers are also possible. Moreover, by assigning each facility a label called a facility type tag, a group of labels whose hierarchy levels are not uniform can be used in the same way, rather than a complete tree structure. The first correspondence information is stored here in table form, but it is not limited to this and may be stored in another form.

 As in Embodiment 1, the first position estimation unit 216 estimates, from the first correspondence information, the facility type corresponding to the future life scene as the facility type at which the user carrying the mobile terminal 200 will be located after the predetermined time, and calculates the probability that the user will be located at that facility type. This calculation is the same as in Embodiment 1, so its description is omitted.

 When estimating the facility type at which the user will be located after the predetermined time, the first position estimation unit 216 first estimates the facility name, which classifies the facility type in detail. It then calculates, as the probability that the user will be located after the predetermined time at the facility type indicated by that facility name, the product of the probability, in the second estimation information, that the scene becomes the next scene corresponding to the current life scene, and the probability, in the first correspondence information, that the user was located at the facility type indicated by the facility name corresponding to the future life scene. If the calculated probability for the facility type indicated by the estimated facility name is equal to or greater than a predetermined threshold, the first position estimation unit 216 adopts it as the estimation result.

 If, on the other hand, the probability that the user will be located at the facility type indicated by the estimated facility name is below the predetermined threshold, the first position estimation unit 216 does not adopt that facility name as the estimation result. It instead estimates the facility category containing the facility name at which the user would be located after the predetermined time. It then calculates, as the probability that the user will be located at the facility type indicated by that facility category, the product of the probability, in the second estimation information, that the scene becomes the next scene corresponding to the current life scene, and the probability, in the first correspondence information, that the user was located at the facility type indicated by the facility category corresponding to the future life scene, and adopts this probability as the estimation result.

 For example, suppose the threshold on the probability of the facility type at which the user is located is set to 0.5 in the first position estimation unit 216, and the second scene estimation unit 115 estimates the future life scene to be "dinner". Referring to the first correspondence information of FIG. 12, at the most detailed facility-type level for "dinner", i.e. the facility names, the mode is the probability 4/10 of using set-meal restaurant A, which does not reach the threshold of 0.5. This estimate is therefore not used, and the next facility-type level, i.e. the facility categories, is consulted: the probability of using a set-meal restaurant is 5/10, which meets the threshold of 0.5 and is therefore adopted. In other words, the user is estimated to transition to the "dinner" life scene after the predetermined time (here, for example, 30 minutes); there is not enough confidence to conclude that the user will then be at "set-meal restaurant A", but it can be estimated with 50% probability that the user will be at a facility of the type "set-meal restaurant". Likewise, the other facility categories can be estimated at 30% for "ramen shop" and 20% for "family restaurant".
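The fallback from the facility-name layer to the facility-category layer in this example can be sketched as a scan over layers, using the FIG. 12 figures quoted above (set-meal restaurant A at 4/10 among names; set-meal restaurant 5/10, ramen shop 3/10, family restaurant 2/10 among categories) and a threshold of 0.5. The two-level dictionary below is a stand-in for the hierarchical first correspondence information, with invented names for the remaining facilities.

```python
# Hypothetical hierarchical first correspondence information for the
# "dinner" scene: most detailed layer (facility names) first, then the
# coarser layer (facility categories). Values echo FIG. 12's example.
FIRST_CORRESPONDENCE = {
    "dinner": [
        {"set-meal restaurant A": 0.4, "set-meal restaurant B": 0.1,
         "ramen shop C": 0.3, "family restaurant D": 0.2},
        {"set-meal restaurant": 0.5, "ramen shop": 0.3,
         "family restaurant": 0.2},
    ]
}

def estimate_facility(scene, threshold=0.5):
    """Return (facility, prob) from the most detailed layer whose mode
    meets the threshold, falling back to coarser layers (steps S56-S59)."""
    for layer in FIRST_CORRESPONDENCE[scene]:
        best = max(layer, key=layer.get)
        if layer[best] >= threshold:
            return best, layer[best]
    # Last resort: the mode of the coarsest layer even below threshold.
    coarsest = FIRST_CORRESPONDENCE[scene][-1]
    best = max(coarsest, key=coarsest.get)
    return best, coarsest[best]

facility, prob = estimate_facility("dinner")
```

With these numbers the name layer's mode (0.4) fails the 0.5 threshold, so the category layer's mode is adopted instead.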

 Next, the position estimation process performed by the mobile terminal 200 according to the present embodiment, configured as described above, is explained. FIG. 13 is a flowchart of the position estimation process performed by the mobile terminal according to Embodiment 2.

 First, the processing from measuring the current position through estimating the future life scene (steps S51 to S55) is the same as the processing in FIG. 9 of Embodiment 1 (steps S11 to S15), and its description is therefore omitted.

 Next, the first position estimation unit 216 uses the first correspondence information to estimate, from the future life scene, the name of the facility at which the user will be located after the predetermined time, and calculates the probability that the user will be located at the facility type indicated by that name (step S56). The first position estimation unit 216 then determines whether the calculated probability is equal to or greater than the predetermined threshold (step S57). If it is (step S57: Yes), the output unit 118 outputs the estimated facility name and the calculated probability to the display unit as the estimation result (step S58).

 If it is not (step S57: No), the first position estimation unit 216 uses the first correspondence information to estimate, from the future life scene, the facility category in which the user will be located after the predetermined time, and calculates the probability that the user will be located at the facility type indicated by that category (step S59). The output unit 118 then outputs the estimated facility category and the calculated probability to the display unit as the estimation result (step S60).

 As described above, the mobile terminal 200 according to Embodiment 2 estimates the user's current life scene from the current position, the user's current operation state, and the current time, and from the estimated current life scene estimates the future life scene a predetermined time ahead. From the estimated future life scene, the mobile terminal 200 then estimates the facility name or facility category at which the user will be located after the predetermined time, calculates the probability that the user will be located at the facility type it indicates, and outputs these to the display unit. Because the user's current life scene is used, the facility name or facility category at which the user is likely to be located after the predetermined time can be estimated from the continuity of life scenes rather than from the continuity of the user's positions, so the user's future facility name or facility category can be estimated even at a place visited for the first time. In addition, by using first correspondence information that hierarchically associates life scenes with facility types, the user's position can be estimated in finer detail.

(Embodiment 3)
 In Embodiments 1 and 2, the future life scene is estimated from the user's current life scene, and the facility type or the like at which the user will be located after the predetermined time is estimated from that future life scene. The present embodiment additionally stores the user's movement history and estimates the position at which the user will be located after the predetermined time from movement patterns extracted from the stored history. A configuration that adds this technique to Embodiment 2 is described below.

 FIG. 14 is a block diagram of an example of the functional configuration of the mobile terminal according to Embodiment 3. As shown in FIG. 14, the mobile terminal 300 mainly includes a positioning unit 111, a detection unit 112, a first scene estimation unit 113, a storage unit 114, a second scene estimation unit 115, a first position estimation unit 216, a scene update unit 117, an output unit 118, an extraction unit 321, a second position estimation unit 322, a determination unit 323, a first estimation storage unit 131, a second estimation storage unit 132, a second correspondence storage unit 141, a first correspondence storage unit 242, a history storage unit 351, and a pattern storage unit 352. As in Embodiments 1 and 2, the mobile terminal 300 according to Embodiment 3 has a hardware configuration including a CPU and storage media such as an HDD and memory (not shown). Components given the same reference numerals as in Embodiment 1 are the same as in Embodiment 1, and their description is omitted. The external configuration of the mobile terminal 300 according to Embodiment 3 is also the same as in Embodiment 1.

 In addition to the future-position estimation based on life-scene transitions performed by the mobile terminal 200 of Embodiment 2, the mobile terminal 300 according to Embodiment 3 is configured to store a movement history of the current positions measured by the positioning unit 111, to estimate the future position from the stored movement history, and further to correct the estimated future position.

 The history storage unit 351 is a storage medium, such as an HDD or memory, that stores a movement history in which the current position measured by the positioning unit 111 and the current time are recorded over time. FIG. 15 is a diagram of an example of the movement history. As shown in FIG. 15, the movement history records, for each date and time at which the current position was measured, the measured current position (indicated by latitude and longitude), a label identifying the current position, and the name of the facility at the current position, in association with one another.

 The pattern storage unit 352 is a storage medium, such as an HDD or memory, that stores movement patterns associating a current position measured by the positioning unit 111 with the probability that, on moving from that position to another position, the user was located at that other position. FIG. 16 is a diagram of an example of the movement patterns. As shown in FIG. 16, a movement pattern associates a label of the current position, such as "position 01" or "position 02", with the labels of the positions to which the user moves next from the current position (the next-position labels), such as "position 01" or "position 02", and further with the probability that the user was located at each next position, such as 23/46 or 23/25.

 The extraction unit 321 generates a movement history from the current position and current time measured by the positioning unit 111 and stores it in the history storage unit 351. It then extracts movement patterns from the temporal transitions of the current position in the stored movement history and stores the extracted patterns in the pattern storage unit 352. Specifically, when the user has remained within a predetermined radius for at least a predetermined time, the extraction unit 321 registers that place as a stay place; when the user later passes through a registered place, that passage is counted as one stay. For example, the predetermined radius may be set to 100 m and the predetermined time to 5 minutes. The extraction unit 321 then calculates, for each stay place, how frequently the user moved from it to each other stay place, thereby computing the probability that the user was located at each next place, and extracts the movement patterns.
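The stay-place registration described here (a place counts as a stay when the user remains within a 100 m radius for at least 5 minutes) can be sketched as a scan over timestamped position fixes. The distance check uses a flat-earth approximation, which is adequate at this scale; the fix data below is invented, and real logic would also need the later "passing through a registered place" rule, which is omitted for brevity.

```python
import math

RADIUS_M = 100.0
MIN_STAY_S = 5 * 60

def flat_distance_m(a, b):
    """Approximate metres between two (lat, lon) fixes at ~100 m scales."""
    dlat = (b[0] - a[0]) * 111_320.0
    dlon = (b[1] - a[1]) * 111_320.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def extract_stays(fixes):
    """fixes: list of (t_seconds, lat, lon). Return anchor fixes of stays."""
    stays, i = [], 0
    while i < len(fixes):
        j = i
        # Extend the window while each fix stays within RADIUS_M of fixes[i].
        while j + 1 < len(fixes) and flat_distance_m(
                fixes[i][1:], fixes[j + 1][1:]) <= RADIUS_M:
            j += 1
        if fixes[j][0] - fixes[i][0] >= MIN_STAY_S:
            stays.append(fixes[i][1:])
            i = j + 1
        else:
            i += 1
    return stays

fixes = [(0, 35.5000, 139.6000), (200, 35.5001, 139.6001),
         (400, 35.5000, 139.6000),   # 400 s within 100 m: a stay
         (500, 35.5100, 139.6100)]   # far away: not part of the stay
stays = extract_stays(fixes)
```

Counting subsequent transitions between the stay places found this way yields the next-place frequencies that the movement patterns store.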

 The second position estimation unit 322 calculates, from the movement patterns, the probability that the user carrying the mobile terminal 300 will be located, after the predetermined time, at each next position corresponding to the current position. For example, when the movement patterns indicate that the user is currently staying at "position 04", the second position estimation unit 322 estimates that after the predetermined time the user will move to the next position "position 03" with probability 13/18 and to the next position "position 05" with probability 5/18. Although the present embodiment describes an estimation method that extracts movement patterns from the movement history, a method using a Bayesian network or a hidden Markov model may also be employed.
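The lookup performed by the second position estimation unit 322 can be sketched directly from FIG. 16-style transition counts; the counts below reproduce the 13/18 and 5/18 example just quoted, with the position labels written as short identifiers.

```python
# Movement pattern stored as next-place transition counts, per the
# "position 04" example above (labels abbreviated to pos03 etc.).
MOVE_COUNTS = {"pos04": {"pos03": 13, "pos05": 5}}

def next_position_probs(current):
    """Probability of each next stay place given the current one."""
    counts = MOVE_COUNTS.get(current, {})
    total = sum(counts.values())
    return {nxt: c / total for nxt, c in counts.items()} if total else {}

probs = next_position_probs("pos04")
```

For a place never seen in the history (a first visit), the lookup returns an empty distribution, which is exactly the case where the scene-based estimate of the first position estimation unit 216 takes over.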

 The determination unit 323 compares the next position at which the user is estimated to be located after the predetermined time, as estimated by the second position estimation unit 322 from the movement patterns, with the facility type at which the user is estimated to be located after the predetermined time, as estimated by the first position estimation unit 216, and determines the final estimation result. That is, when the probability of the next position estimated by the second position estimation unit 322 is equal to or greater than a predetermined threshold, the determination unit 323 adopts that next position and its calculated probability as the estimation result. When that probability is below the threshold, it adopts the facility type estimated by the first position estimation unit 216 and its calculated probability as the estimation result.

 For example, with the predetermined threshold set to 0.3, when the probability calculated by the second position estimation unit 322 is 0.3 or greater, the determination unit 323 decides on the next position estimated by the second position estimation unit 322 as the final estimation result, which is output by the output unit 118. When the probability calculated by the second position estimation unit 322 is below 0.3, the determination unit 323 decides on the facility type estimated by the first position estimation unit 216 as the final estimation result, which is output by the output unit 118.
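The decision rule of the determination unit 323, with the threshold of 0.3 from the example above, can be sketched as a single comparison; the estimate pairs used below are illustrative values.

```python
def decide(history_estimate, scene_estimate, threshold=0.3):
    """Each estimate is a (place_or_facility, probability) pair.
    Prefer the movement-history estimate when it is confident enough
    (step S75); otherwise fall back to the scene-based estimate."""
    return history_estimate if history_estimate[1] >= threshold else scene_estimate

# Familiar area: the history-based estimate is confident enough.
familiar = decide(("pos03", 13 / 18), ("set-meal restaurant", 0.5))
# First visit: history probability is low, so the scene-based result wins.
first_visit = decide(("pos09", 0.1), ("set-meal restaurant", 0.5))
```

This single threshold is what switches the terminal between the two estimators depending on how familiar the current area is.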

 With this configuration, when the user is at a place visited many times in the past, the estimate based on the movement history (the estimate by the second position estimation unit 322) is output, and when the user is at a place visited for the first time, the estimate based on life scenes described in Embodiment 2 (the estimate by the first position estimation unit 216) is output.

 Next, the position estimation process performed by the mobile terminal 300 according to the present embodiment, configured as described above, is explained. FIG. 17 is a flowchart of the position estimation process performed by the mobile terminal according to Embodiment 3. The mobile terminal 300 is assumed to perform the position estimation process of Embodiment 2 (FIG. 13) in parallel with the following process.

 First, the positioning unit 111 measures the current position of the mobile terminal 300 (step S71). Then, the extraction unit 321 generates a movement history from the current position and the current time and stores it in the history storage unit 351 (step S72).

 Next, the extraction unit 321 extracts a movement pattern based on the stored movement history and stores the extracted pattern in the pattern storage unit 352 (step S73). The second position estimation unit 322 then uses the movement pattern to calculate the probability that the user will be located at the next position after the predetermined time (step S74).
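 The description does not spell out how the movement pattern is computed from the movement history. One plausible sketch of steps S73 and S74 counts transitions between consecutive positions in the time-ordered history and normalizes the counts into probabilities; all names here are illustrative, and the actual device may also condition on the time of day.

```python
from collections import Counter, defaultdict

def extract_movement_pattern(history):
    """Turn a time-ordered list of visited places into transition probabilities.

    history: e.g. ["home", "station", "office", ...].
    Returns {origin: {destination: probability}}, a simple first-order
    movement pattern associating each position with the probability of
    the next position, as stored in the pattern storage unit.
    """
    counts = defaultdict(Counter)
    # Count each consecutive (origin -> destination) transition.
    for origin, destination in zip(history, history[1:]):
        counts[origin][destination] += 1
    # Normalize counts per origin into probabilities.
    pattern = {}
    for origin, dests in counts.items():
        total = sum(dests.values())
        pattern[origin] = {d: n / total for d, n in dests.items()}
    return pattern

history = ["home", "station", "office", "station", "home", "station", "office"]
pattern = extract_movement_pattern(history)
# From "station" the user moved to "office" 2 of 3 times and "home" 1 of 3 times.
print(pattern["station"])
```

Step S74 then simply looks up `pattern[current_position]` to obtain the probability for each candidate next position.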

 Then, the determination unit 323 judges whether the probability calculated by the second position estimation unit 322 is equal to or greater than the predetermined threshold (step S75). If it is (step S75: Yes), the determination unit 323 selects the next position estimated by the second position estimation unit 322 and the calculated probability as the estimation result, and the output unit 118 outputs the estimation result to the display unit (step S77). If it is not (step S75: No), the determination unit 323 selects the facility type estimated by the first position estimation unit 216 and the calculated probability as the estimation result, and the output unit 118 outputs the estimation result to the display unit (step S76).

 As described above, in the mobile terminal 300 according to the third embodiment, when both the facility type estimated from the user's current life scene together with the probability that the user will be located at that facility type, and the next position estimated from the user's actual movement history together with the probability that the user will be located at that next position, have been calculated, the estimation result better suited to the user's situation can be output to the display unit. Accordingly, as in the second embodiment, the facility type where the user will be located after the predetermined time can be estimated even at a place visited for the first time, while position estimation based on the movement history remains available.

 Note that the present invention is not limited to the above embodiments as they are; at the implementation stage, the constituent elements can be modified and embodied without departing from the gist of the invention. Various inventions can also be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments. For example, some constituent elements may be deleted from all the constituent elements shown in an embodiment, and constituent elements across different embodiments may be combined as appropriate.

 As described above, the position estimation apparatus according to the present invention is useful for estimating the position where a user will be located after a predetermined time, and is particularly suitable for estimating that position at a place the user is visiting for the first time.

 100, 200, 300 Mobile terminal
 101 Positioning sensor
 102 Body motion sensor
 111 Positioning unit
 112 Detection unit
 113 First scene estimation unit
 114 Storage unit
 115 Second scene estimation unit
 116, 216 First position estimation unit
 117 Scene update unit
 118 Output unit
 118 Display unit
 131 First estimation storage unit
 132 Second estimation storage unit
 141 Second correspondence storage unit
 142, 242 First correspondence storage unit
 321 Extraction unit
 322 Second position estimation unit
 323 Determination unit
 351 History storage unit
 352 Pattern storage unit

Claims (10)

 1. A position estimation apparatus comprising:
 a first estimation storage unit that stores first estimation information including a life scene, which is a scene of an action in a user's life, a time zone of the life scene, a position of the life scene, and an operation state of the user in the life scene;
 a first correspondence storage unit that stores first correspondence information associating the life scene with a facility type at which the user is located in the life scene;
 a second estimation storage unit that stores second estimation information including the life scene and a next scene representing a scene of an action that the user performs after the life scene;
 a positioning unit that measures a current position;
 a detection unit that detects a current operation state of the user;
 a first scene estimation unit that estimates a current life scene of the user using, from the first estimation information, the life scene corresponding to the current position, the current operation state, and a current time;
 a second scene estimation unit that estimates a future life scene, which is the life scene of the user after a predetermined time, using, from the second estimation information, the next scene corresponding to the current life scene; and
 a first position estimation unit that estimates the facility type at which the user will be located after the predetermined time, using, from the first correspondence information, the facility type corresponding to the future life scene.

 2. The position estimation apparatus according to claim 1, further comprising a second correspondence storage unit that stores second correspondence information associating a facility type indicating a kind of facility with a position of the facility type,
 wherein the first position estimation unit further estimates the position at which the user will be located after the predetermined time, using, from the second correspondence information, the position of the facility type corresponding to the facility type at which the user will be located after the predetermined time.

 3. The position estimation apparatus according to claim 2, further comprising a storage unit that stores, in the first correspondence storage unit, the first correspondence information associating the current life scene with the facility type corresponding to the current position in the second correspondence information.

 4. The position estimation apparatus according to claim 3, wherein
 the second estimation information includes the life scene and the next scene accompanied by a probability that an action of the user became the next scene,
 the first correspondence information further associates a probability that the user is located at the facility type corresponding to the life scene, and
 the first position estimation unit further calculates, as a probability that the user will be located at the estimated facility type, a product of the probability, in the second estimation information, of becoming the next scene corresponding to the current life scene and the probability, in the first correspondence information, of being located at the facility type corresponding to the future life scene.

 5. The position estimation apparatus according to claim 4, wherein
 the second correspondence information further associates a probability that the user is located at the position of the facility type corresponding to the facility type, and
 the first position estimation unit further calculates, as a probability that the user will be located at the estimated position, a product of the probability, in the second estimation information, of becoming the next scene corresponding to the current life scene, the probability, in the first correspondence information, of being located at the facility type corresponding to the future life scene, and the probability, in the second correspondence information, of being located at the position of the facility type corresponding to the facility type at which the user will be located after the predetermined time.

 6. The position estimation apparatus according to claim 5, further comprising a scene update unit that updates, based on an action of the user, the probability of becoming the next scene in the second estimation information.

 7. The position estimation apparatus according to claim 6, wherein
 the facility type includes a category of a facility and a name of a facility included in the category of the facility,
 the first correspondence information associates the life scene, the facility type, and a probability of having been located at the facility type indicated by the name of the facility, and
 the first position estimation unit further calculates, as a probability that the user will be located after the predetermined time at the facility type indicated by the name of the facility, a product of the probability, in the second estimation information, of becoming the next scene corresponding to the current life scene and the probability, in the first correspondence information, of having been located at the facility type indicated by the name of the facility corresponding to the future life scene.

 8. The position estimation apparatus according to claim 7, wherein
 the first correspondence information further associates the life scene with a probability of having been located at the facility type indicated by the category of the facility, and
 the first position estimation unit further calculates, when the calculated probability of being located at the facility type indicated by the name of the facility is less than a predetermined threshold, as a probability that the user will be located after the predetermined time at the facility type indicated by the category of the facility, a product of the probability, in the second estimation information, of becoming the next scene corresponding to the current life scene and the probability, in the first correspondence information, of having been located at the facility type indicated by the category of the facility corresponding to the future life scene.

 9. The position estimation apparatus according to claim 8, further comprising:
 a pattern storage unit that stores a movement pattern associating the current position with a probability that a user who moved from the current position to another position was located at the other position;
 a second position estimation unit that calculates, from the movement pattern, a probability that the user will be located after the predetermined time at the other position corresponding to the current position; and
 a determination unit that takes the probability calculated by the second position estimation unit as an estimation result when the probability of being located at the other position is equal to or greater than a predetermined threshold, and takes the probability calculated by the first position estimation unit as the estimation result when the probability of being located at the other position is less than the predetermined threshold.

 10. The position estimation apparatus according to claim 9, further comprising:
 a history storage unit that stores a movement history in which the current position and the current time are recorded over time; and
 an extraction unit that extracts the movement pattern based on a temporal transition of the current position in the movement history and stores the extracted movement pattern in the pattern storage unit.
PCT/JP2009/063664 2009-07-31 2009-07-31 Position estimating device Ceased WO2011013245A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/063664 WO2011013245A1 (en) 2009-07-31 2009-07-31 Position estimating device


Publications (1)

Publication Number Publication Date
WO2011013245A1 true WO2011013245A1 (en) 2011-02-03

Family

ID=43528921




Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009026305A (en) * 2007-06-21 2009-02-05 Mitsubishi Electric Corp Living behavior estimation device, device state detection device, living behavior estimation method
JP2009036594A (en) * 2007-07-31 2009-02-19 Panasonic Corp Destination prediction apparatus and destination prediction method
JP2009176130A (en) * 2008-01-25 2009-08-06 Olympus Corp Information presentation system, program, and information storage medium


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8600918B2 (en) 2010-12-13 2013-12-03 Kabushiki Kaisha Toshiba Action history search device
JP2012173982A (en) * 2011-02-21 2012-09-10 Nippon Telegr & Teleph Corp <Ntt> Action prediction device, action prediction method and action prediction program
JP2016154019A (en) * 2012-12-20 2016-08-25 フェイスブック,インク. Estimation of user state and duration time on context
KR20170015526A (en) * 2012-12-20 2017-02-08 페이스북, 인크. Inferring contextual user status and duration
KR102007190B1 (en) * 2012-12-20 2019-08-05 페이스북, 인크. Inferring contextual user status and duration
JP7201783B1 (en) 2021-12-15 2023-01-10 Kddi株式会社 Information processing device and information processing method
JP2023088779A (en) * 2021-12-15 2023-06-27 Kddi株式会社 Information processing device and information processing method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 09847836; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 09847836; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)