
WO2022019001A1 - Evaluation device, evaluation method, and program - Google Patents

Evaluation device, evaluation method, and program

Info

Publication number
WO2022019001A1
WO2022019001A1 (PCT/JP2021/022465)
Authority
WO
WIPO (PCT)
Prior art keywords
person
time
distance
determination
evaluation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2021/022465
Other languages
French (fr)
Japanese (ja)
Inventor
彰 内山
輝夫 東野
研 中田
凌佑 長谷川
裕美 高畑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Osaka NUC
Original Assignee
Osaka University NUC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Osaka University NUC filed Critical Osaka University NUC
Priority to JP2022538628A priority Critical patent/JPWO2022019001A1/ja
Publication of WO2022019001A1 publication Critical patent/WO2022019001A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/60 Analysis of geometric attributes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention mainly relates to a technique for evaluating social distancing during exercise, for the prevention of infection with infectious diseases.
  • Non-Patent Document 1 proposes an evaluation device that uses short-range Bluetooth wireless communication built into mobile terminals and judges whether close contact has occurred based on whether a person has remained within 2 m (social distance) of another person continuously for 15 minutes.
  • On the other hand, physical activity may decrease due to self-restraint in social activities (staying home), resulting in a lack of exercise. Prolonged lack of exercise raises the risk of non-communicable diseases (NCDs) such as heart disease, as well as mental health problems. It is therefore important to maintain physical activity that is sufficient both qualitatively and quantitatively while avoiding the "three Cs" (closed spaces, crowded places, close-contact settings).
  • The present invention has been made in view of the above, and provides an evaluation device, evaluation method, and program that enable highly accurate close-contact determination or evaluation by applying a time condition that takes into account the particular characteristics of human movement during exercise.
  • The evaluation device and method according to the present invention include an image reading means, a position detecting means, a distance calculating means, a timekeeping means, a determination means, an aggregation means, and an evaluation means.
  • The image reading means reads out captured images of a plurality of people in motion captured by an imaging means.
  • The position detecting means periodically detects the position coordinates of each of the plurality of people from the read captured images.
  • The distance calculating means periodically calculates, for each person, the distance between that person and each other person among the plurality of people.
  • The timekeeping means measures the passage of time.
  • The determination means periodically determines, for each person, whether the inter-person distance has remained within the proximity distance continuously for the reference time.
  • Each time this determination is affirmed for a person, the aggregation means adds at least the reference time to that person's previously accumulated total and stores the result as the new total. The evaluation means then evaluates close contact for each person according to the magnitude of the time accumulated by the aggregation means.
  • The present invention is also a program for causing a computer to function as the above evaluation device.
  • According to the present invention, position coordinates are periodically detected for the captured person images, and the inter-person distance is calculated for each person. It is then periodically determined, for each person, whether the inter-person distance has remained within the proximity distance continuously for the reference time.
  • When this determination is continuously affirmed, the reference time, or the duration exceeding it, is sequentially aggregated for each person by the aggregation means.
  • The evaluation means then evaluates close contact according to the magnitude of the time aggregated for each person by the aggregation means.
  • The time required for one breath (3 seconds) is set as the reference time, and when people remain within the proximity distance beyond this reference time, that time is totaled as evaluation information.
  • The reference time is not limited to the breathing time; it may be set shorter, or conversely longer, for example two breaths (6 seconds).
  • The exercise environment (temperature, sunshine, indoor or outdoor) can be measured with sensors, and a person's physical condition and movement state (body temperature, heart rate, acceleration) with a wearable device; the results can be reflected in the adjustment of the reference time as parameters for changes in the respiratory cycle, as sketched below.
  • An appropriate reference time may also be set depending on the type of exercise. Besides the reference time, these adjustments can likewise be applied to the proximity distance. Since each person's accumulated time is acquired with respect to each other person, risk information for each individual other person can be obtained.
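  • As an illustration only, the following Python sketch shows how such sensor readings might parameterize the thresholds. The text does not give concrete formulas, so the function name, the heart-rate rule, and the coefficients are assumptions; only the default values (3 seconds, 2 m, 3 m outdoors) come from the text.

```python
def adjust_thresholds(base_reference_s=3.0, base_distance_m=2.0,
                      heart_rate_bpm=None, outdoors=False):
    """Hypothetical adjustment of the reference time and proximity distance.

    The text only states that the exercise environment (temperature, sunshine,
    indoor/outdoor) and body state (temperature, heart rate, acceleration)
    may be reflected in these thresholds; the rules below are illustrative.
    """
    reference_s = base_reference_s
    distance_m = base_distance_m

    # Assumed rule: a higher heart rate implies faster breathing,
    # so the reference time (one breath) is shortened proportionally.
    if heart_rate_bpm is not None and heart_rate_bpm > 100:
        reference_s = max(1.5, base_reference_s * 100.0 / heart_rate_bpm)

    # The text suggests a larger proximity distance outdoors, e.g. 3 m.
    if outdoors:
        distance_m = 3.0

    return reference_s, distance_m
```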
  • FIG. 1 is a block diagram showing an embodiment of the evaluation device according to the present invention. FIG. 2 explains the position of a person in a captured image and the distance between people: (A) shows the imaging environment from the side, (B) shows an image captured by the camera, and (C) explains that the two-dimensional plane imaged by the camera is converted into a horizontal plane onto which the people are re-plotted.
  • In FIG. 3, (A) shows an example of a skeleton image created from a captured image of a person, (B) shows that the height of the horizontal plane onto which people are plotted is set to waist height, and (C) explains the outline of the homography transformation. FIG. 4 is a flowchart showing an example of the monitoring process executed by the control unit, and FIG. 5 is a flowchart showing an example of the evaluation process executed by the control unit.
  • FIG. 1 is a block diagram showing an embodiment of the evaluation device according to the present invention.
  • the evaluation device 1 includes a control unit 10 including a computer, a storage unit 101, and a touch panel 102.
  • The storage unit 101 stores a control program for executing the evaluation process according to the present invention, a machine learning model (program and learned parameters) for operating the AI described later, and a processing program for converting the imaging surface of the camera 30 into a horizontal plane. As will be described later, the storage unit 101 also stores various processing programs, such as for detecting a human body in an image, labeling, detecting positions, and creating a skeleton image from a person image.
  • In the touch panel 102, a transparent pressure-sensitive sheet is laminated on a liquid crystal display panel (display unit 1021), and the display coordinates of buttons and the like are associated in advance with the coordinates on the pressure-sensitive sheet (input unit 1022), which enables selection of the button at the pressed position.
  • the input unit 1022 is provided with a keyboard and a mouse (not shown) so that necessary information can be input.
  • At least one camera 30 and, if necessary, a sensor unit 31 are connected to the control unit 10.
  • the control unit 10 is configured to be able to receive the image captured by the camera 30 by wire or wirelessly.
  • the sensor unit 31 obtains information on the exercise environment, for example, receives information from a thermometer by wire or wirelessly.
  • the sensor unit 31 receives sensing information from a wearable small thermometer, heart rate monitor, accelerometer, etc. that can be carried around each person, for example, by wire or wirelessly. In a mode other than real-time evaluation, it may be temporarily stored in a storage unit in the sensor so that it can be read out later.
  • the camera 30 is a digital video camera capable of acquiring time-series images (moving images).
  • the camera 30 may be a camera mounted on a smartphone or a personal computer in addition to a normal digital camera.
  • the image output from the camera 30 to the control unit 10 may be a recorded image temporarily stored in the internal storage unit (not shown) of the camera 30 in addition to the real-time data.
  • the image may be data stored in a server on the network.
  • the camera 30 is supported by a support tool at an appropriate position on the side wall of the gymnasium, for example, in a posture of maintaining a predetermined depression / elevation angle so as to overlook the entire exercise area.
  • In a configuration using a plurality of cameras, images are taken from different directions, for example from the front-back and left-right directions, or from the side and from the ceiling, and the images are combined, which enables identification of the people in the image and high-accuracy position detection.
  • By executing the programs in the storage unit 101, the control unit 10 functions as an initial setting unit 11, a captured image reading unit 12, a position detection unit 13, a distance calculation unit 14, a proximity distance determination unit 15, a time integration unit 16, a proximity time determination unit 17, an alarm processing unit 18, a proximity time totaling unit 19, an evaluation unit 20, a timer 21, an image display processing unit 22, and a search unit 23.
  • the position detection unit 13 includes a human body detection unit 131, a labeling processing unit 132, a skeleton image creation unit 133, and a coordinate conversion unit 134.
  • the initial setting unit 11 sets various initial conditions according to the situation of the observation target.
  • Assumed setting items include, for example, the viewpoint and line of sight of the camera 30, the reference marks, and the close-contact conditions.
  • The captured image reading unit 12 reads real-time captured images from the camera 30 into the position detection unit 13, or reads recorded images stored in memory. Specifically, the read captured images (moving images) are passed to the running control program or the like and used for image processing. The read images may also be output (displayed) on the display unit 1021 as needed.
  • FIGS. 2 and 3 are views for explaining the position of a person in a captured image and the distance between people.
  • FIG. 2(A) shows the imaging environment from the side, FIG. 2(B) shows an image captured by the camera, and FIG. 2(C) explains that the two-dimensional plane imaged by the camera is converted into a horizontal plane onto which the people are plotted.
  • In FIG. 3, (A) shows an example of a skeleton image created from a captured image of a person, (B) shows that the height of the horizontal plane onto which people are plotted is set to waist height, and (C) explains the outline of the homography transformation.
  • The human body detection unit 131 detects the players A to E; in the present embodiment it acquires time-series two-dimensional position information for a plurality of specific parts of each moving human body from the image captured by the camera 30. The human body detection unit 131 repeatedly detects the position information of each joint from the acquired two-dimensional image of the human body at a predetermined cycle, for example 20 Hz.
  • the human body detection unit 131 may detect the human body by performing tracking processing at the predetermined cycle once the human body is detected.
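  • A minimal sketch of this periodic detection loop is shown below. It assumes a hypothetical detect_joint_keypoints() wrapper around a pose estimator such as OpenPose; only the 20 Hz sampling structure comes from the text, and OpenCV is used here purely to read the video stream.

```python
import cv2  # OpenCV, used here only to read the video stream

DETECTION_HZ = 20  # detection cycle described in the text


def detect_joint_keypoints(frame):
    """Placeholder for a pose estimator such as OpenPose.

    Expected to return, for each detected person, a mapping of
    joint name -> (x, y) pixel coordinates. Implementation assumed.
    """
    raise NotImplementedError


def joint_stream(video_source=0):
    """Yield joint detections spaced at roughly DETECTION_HZ."""
    cap = cv2.VideoCapture(video_source)
    fps = cap.get(cv2.CAP_PROP_FPS) or DETECTION_HZ
    step = max(1, round(fps / DETECTION_HZ))  # process every 'step'-th frame
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            yield detect_joint_keypoints(frame)
        index += 1
    cap.release()
```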
  • The labeling processing unit 132 labels the person images detected by the human body detection unit 131, for example as players A, B, C, D, and E. The labels may be personal names or pseudonyms (for example A, B, ...). When personal identification is required, it can be performed by, for example, face recognition, uniform-number recognition, or by having the players wear differently colored wear and identifying those colors.
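  • One simple way to realize the colour-based identification mentioned above is to classify the dominant hue inside each detected person region; the OpenCV sketch below is illustrative only, and the hue ranges per player are assumed values.

```python
import cv2
import numpy as np

# Hypothetical hue ranges (OpenCV hue runs 0-179) for each player's wear colour.
HUE_RANGES = {"A": (0, 10), "B": (50, 70), "C": (100, 130)}


def label_by_wear_colour(frame_bgr, bbox):
    """Classify a person crop by the median hue of its pixels."""
    x, y, w, h = bbox
    hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hue = int(np.median(hsv[:, :, 0]))
    for label, (lo, hi) in HUE_RANGES.items():
        if lo <= hue <= hi:
            return label
    return None  # unknown colour; fall back to face or number recognition
```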
  • The skeleton image creation unit 133 creates skeleton images from the labeled moving images of players A, B, C, D, and E; here, as shown in FIG. 3(A), a known skeleton detection algorithm (OpenPose) that extracts feature points (joint positions: nodes) and feature directions (skeleton Sk) is given as an example.
  • the positions Pa to Pe of the players A to E indicate the absolute or relative positions with respect to the floor surface FL in the gymnasium, which is an exercise space. Since various methods are known for detecting the position of the player, a brief description thereof will be given.
  • In the playing space captured by the camera 30, the displayed image is distorted into a trapezoidal shape along the depth direction due to perspective. Therefore, the known line mark 40, or separately placed known marks 411 and 412, are imaged, and a coordinate transformation is performed so that the imaging plane is projected onto another plane as described later.
  • Besides attaching marks to the floor, a method can also be adopted in which scale bars of a predetermined height are erected at at least four places and the coordinate transformation of the imaging plane is derived from the lengths of the scale bars as imaged by the camera 30.
  • When a plurality of cameras 30, for example two, are used, a method of identifying each player by cross-referencing the two two-dimensional images and obtaining the position coordinates may also be used.
  • As shown in FIG. 3(C), the coordinate conversion unit 134 converts the coordinates of the trapezoidal imaging plane PL1, viewed from the camera 30 at a predetermined depression/elevation angle θ, into a rectangular bird's-eye view plane PL2 viewed from directly above.
  • For this coordinate transformation, a known bird's-eye view transformation, or a transformation using a homography matrix, is applied utilizing the mark information. This makes it possible to convert each coordinate position on the plane PL1 into a coordinate position on the plane PL2.
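  • A minimal sketch of such a homography-based conversion using OpenCV is given below; the four floor-mark coordinates are hypothetical placeholders standing in for the known line mark 40 or marks 411 and 412.

```python
import cv2
import numpy as np

# Four floor marks whose real-world positions are known (placeholder values).
# src: pixel coordinates on the camera's trapezoidal imaging plane PL1
# dst: the same points in metres on the bird's-eye plane PL2
src = np.float32([[220, 610], [1060, 600], [820, 300], [420, 305]])
dst = np.float32([[0.0, 0.0], [18.0, 0.0], [18.0, 9.0], [0.0, 9.0]])

H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography matrix


def to_birdseye(points_px):
    """Map pixel coordinates (e.g. detected waist positions) onto the
    bird's-eye plane, in metres, using the homography H."""
    pts = np.float32(points_px).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```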
  • As shown in FIG. 3(B), on the assumption that the height of the waist does not change significantly even while a player performs various movements, the coordinate conversion unit 134 produces the bird's-eye view by cutting the skeleton image Hu at the height h of a main part, for example the waist part Wa, with a horizontal plane PL.
  • The coordinate conversion unit 134 takes the in-camera-image coordinates of the waist part Wa detected by the skeleton detection algorithm (OpenPose), converts them into the bird's-eye view, and uses the converted coordinates as the player's position information.
  • the distance calculation unit 14 calculates the distance between the players for all the players A to E based on the converted bird's-eye view.
  • the lengths of the line mark 40 or the separately set marks 411 and 412 are known, and the distance between the players is calculated based on the known lengths.
  • FIG. 2C describes a case where the distance between the player B and another player on the horizontal plane PL is calculated.
  • In this figure, the distance between players B and A is Dba, the distance between players B and C is Dbc, the distance between players B and D is Dbd, and the distance between players B and E is Dbe.
  • The inter-player distances only need to be calculated once per combination; for example, if the distance Dba between players B and A is measured, the distance Dab between players A and B does not need to be measured.
  • Alternatively, instead of calculating the inter-player distances after conversion to the bird's-eye plane, the distance calculation unit 14 may calculate distances in three-dimensional space, taking height into account, so that the distance is measured in three dimensions.
  • For example, the distance between heads may be used as the distance between people.
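  • The per-combination computation described above can be sketched as follows; the player labels and bird's-eye coordinates are illustrative values.

```python
from itertools import combinations
from math import hypot


def pairwise_distances(positions):
    """positions: dict mapping a player label to its (x, y) bird's-eye
    coordinates in metres. One distance is computed per unordered pair,
    so Dba is computed once and reused as Dab."""
    return {
        (a, b): hypot(positions[a][0] - positions[b][0],
                      positions[a][1] - positions[b][1])
        for a, b in combinations(sorted(positions), 2)
    }


# Example with illustrative coordinates for players A-E:
positions = {"A": (2.0, 1.0), "B": (3.2, 1.5), "C": (7.0, 4.0),
             "D": (3.0, 3.0), "E": (9.5, 2.0)}
distances = pairwise_distances(positions)  # e.g. distances[("A", "B")] is 1.3 m
```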
  • The proximity distance determination unit 15 handles one of the conditions for close contact: it determines whether the distance between players is within a predetermined proximity distance.
  • The proximity distance is set here to 2 m from the viewpoint of preventing droplet infection; outdoors it may be set larger, for example 3 m.
  • While the proximity distance determination unit 15 determines that the distance between players is within the proximity distance, the time integration unit 16 integrates the elapsed duration for each other player. In this example, the time integration unit 16 repeatedly adds the period corresponding to the 20 Hz cycle.
  • The proximity time determination unit 17 handles the other condition for close contact: it determines whether players have remained within the proximity distance for a predetermined time (the proximity time) or longer.
  • the proximity time is set to, for example, about 3 seconds in consideration of the respiratory cycle.
  • the alarm processing unit 18 is provided as needed, and issues an alarm when the determination results of the proximity distance determination unit 15 and the proximity time determination unit 17 are both affirmative.
  • Alarms may be visual (images), voice, or sound.
  • When the determination results of the proximity distance determination unit 15 and the proximity time determination unit 17 are both affirmative, the proximity time totaling unit 19 adds the current duration to the total time accumulated so far, stores the result as the new total time, and keeps it as the cumulative target.
  • The evaluation unit 20 determines, for each player, whether the total time exceeds the time corresponding to close contact and, if it does, evaluates that player as a close contact.
  • The evaluation unit 20 can also assign points, such as a close-contact risk score, according to the magnitude of the time totaled over the monitored series of exercise or practice time.
  • the timer 21 is a timekeeping means composed of software for measuring the passage of time.
  • The image display processing unit 22 causes the display unit 1021 to display, in a predetermined format, the state of the evaluation process while it is running or the result at its end; for example, the totaled time is displayed for each player. In these displays, it is preferable to show the accumulation of the total time over time in two dimensions, plotting elapsed time on the horizontal axis and the aggregated time on the vertical axis.
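  • A small matplotlib sketch of the suggested display (elapsed time on the horizontal axis, aggregated close-contact time on the vertical axis) might look like the following; the sample data are invented.

```python
import matplotlib.pyplot as plt

# Invented sample: elapsed practice time (s) vs. aggregated close-contact
# time (s) for two players, updated as the totaling unit accumulates it.
elapsed = [0, 60, 120, 180, 240, 300]
totals = {"A": [0, 0, 3, 3, 9, 12], "B": [0, 3, 3, 6, 6, 6]}

for player, series in totals.items():
    plt.step(elapsed, series, where="post", label=f"player {player}")
plt.xlabel("elapsed time [s]")
plt.ylabel("aggregated close-contact time [s]")
plt.legend()
plt.show()
```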
  • The search unit 23 activates a search program to check the evaluation results, in particular the totaled times, of the players who exercised together with an infected person.
  • The search unit 23 searches for (extracts) the player who has the longest accumulated proximity time with the player confirmed to be infected, as well as any players whose time corresponds to close contact or longer.
  • FIG. 4 is a flowchart showing an example of the procedure of the monitoring process executed by the control unit 10.
  • First, an initial setting process is performed to set various initial conditions, including input of the shape information of the marks used as references for position detection (step S1).
  • Next, each time it is determined that the predetermined time (the 20 Hz cycle described above) has elapsed (step S3), the two-dimensional image captured by the camera 30, including the images of the moving human bodies to be measured, is acquired (step S5).
  • a human body image is detected from the acquired two-dimensional image of the human body, and labeling processing and skeleton image creation processing are performed on the image (step S7).
  • Next, the position of each player image in the imaging space is calculated (step S9), and coordinate conversion to the bird's-eye view plane is performed (step S11). After that, the evaluation process (FIG. 5) is executed each time the predetermined time has elapsed (Yes in step S3) (step S13). Then, until the monitoring process is completed (step S15), the process returns to step S3 and the same processing is repeated.
  • FIG. 5 is a flowchart showing an example of the procedure of the evaluation process executed by the control unit 10.
  • The control unit 10 sequentially selects each pair from the combinations of players A to E and calculates the distance between the selected pair (step S21). It then determines whether the calculated distance is within the proximity distance (step S23); if it is, a predetermined time (the time corresponding to 20 Hz in this example) is added to the pair's integrated time (step S25). Next, it is determined whether the pair's integrated time exceeds the reference time (typically 3 seconds) (step S27).
  • If the reference time is exceeded, an alarm is instructed as necessary (step S29), and it is then determined whether any pairs remain (step S31). If a pair remains, the process returns to step S21 and the same processing is performed for the next pair. If no pairs remain and processing for all pairs is complete, the process proceeds to step S15.
  • If it is determined in step S23 that the distance is not within the proximity distance, it is determined whether the pair's integrated time is equal to or longer than the reference time (step S33). If it is, the integrated time of this occasion is added to the pair's total time and the integrated time is reset (step S35). The time added on this occasion is stored as the pair's new total time and becomes the subject of further addition in later evaluation cycles.
  • If the pair's integrated time does not reach the reference time in step S33, it is considered that no close contact has occurred, and the pair's integrated time is reset without being totaled (step S37).
  • The time to be totaled may be counted so as to include the reference time itself at the moment the reference time is reached, or as the duration that exceeds the reference time.
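  • A condensed Python sketch of this per-pair evaluation cycle (steps S21 to S37) is shown below; the constant names and data structures are assumptions, and only the integrate / aggregate / reset logic follows the flowchart.

```python
REFERENCE_S = 3.0     # reference time (about one breath), steps S27 / S33
PROXIMITY_M = 2.0     # proximity distance checked in step S23
CYCLE_S = 1.0 / 20.0  # evaluation cycle corresponding to 20 Hz

integrated = {}  # per-pair running duration spent within the proximity distance
aggregated = {}  # per-pair total of close-contact time (aggregation time)


def evaluate_cycle(distances, alarm=None):
    """distances: {(a, b): metres} for every pair in this cycle (step S21)."""
    for pair, d in distances.items():
        if d <= PROXIMITY_M:                           # step S23: within range
            integrated[pair] = integrated.get(pair, 0.0) + CYCLE_S     # S25
            if integrated[pair] >= REFERENCE_S and alarm is not None:  # S27
                alarm(pair)                                            # S29
        else:                                          # step S23: out of range
            if integrated.get(pair, 0.0) >= REFERENCE_S:               # S33
                aggregated[pair] = (aggregated.get(pair, 0.0)
                                    + integrated[pair])                # S35
            integrated[pair] = 0.0                     # reset (S35 / S37)
```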
  • The present invention is also applicable to other diseases that are readily transmitted through close contact, including by droplet infection.
  • Preferably, the number of imaging means is at least one. With this configuration, in the case of a single unit, a method can be applied in which a two-dimensional coordinate system is constructed on the captured image plane using marks (mark size information and height information) so that the coordinate position of a person image can be calculated. With two or more units, on the other hand, the captured person images can be identified by collating the plurality of captured images.
  • Preferably, the position detection means and the distance calculation means execute position detection and distance calculation in a short cycle that is shorter than the reference time, and the determination means performs its determination in each short cycle.
  • The short cycle is not limited to the 20 Hz period and may be even shorter; it need only be shorter than the reference time.
  • Preferably, the determination means includes a first determination means that determines whether the distance between persons is within the proximity distance, and a second determination means that determines whether the affirmative determination by the first determination means has continued for the reference time. With this configuration, a determination is made for each of the two conditions, the proximity distance and the reference time.
  • Preferably, the aggregation means sequentially totals the reference time each time the determination by the determination means is affirmed. With this configuration the reference time itself is totaled; when the integration continues beyond the reference time, the aggregation for that reference time is completed once the integration reaches it, after which the integrated time is reset and continuous integration is repeated. Aggregation in units of the reference time thus becomes possible.
  • Preferably, the present invention includes a sensor that measures at least one of the physical condition of an exercising person and the exercise environment, and the determination means adjusts at least one of the reference time and the proximity distance using the sensor's measurement results as parameters. With this configuration, since physical condition and the state of the exercise environment affect the respiratory rate, high-accuracy evaluation is possible by adjusting the determination conditions with the measurement results as parameters.
  • Preferably, the present invention includes a sensor for measuring at least one of the physical condition of an exercising person and the exercise environment, and the evaluation means adjusts the evaluation of close contact using the sensor's measurement results as parameters.
  • In this way the parameters can be applied to the evaluation as well as to the determination.
  • Preferably, the aggregation means further sums the total times of the individual persons, and the evaluation means evaluates close contact for the exercise as a whole. With this configuration, close-contact opportunities can be presented for a series of exercises as a whole, separately from the per-person results, which makes it possible to adjust toward safer exercise and practice methods.
  • Preferably, the device includes an input unit for inputting information on a person infected with the infectious disease, and a search means for searching, among the plurality of persons, for those whose total time with the entered infected person exceeds the close-contact standard.
  • With this configuration, when an infected person is identified, close contacts who exceed the standard (for example, 15 minutes) can be found quickly.
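  • Given per-pair aggregation times like those produced above, the search described here could be sketched as follows; the 15-minute figure is the example standard mentioned in the text.

```python
CLOSE_CONTACT_STANDARD_S = 15 * 60  # example criterion: 15 minutes


def find_close_contacts(aggregated, infected_player):
    """Return the other players whose aggregated close-contact time with the
    reported infected player meets or exceeds the standard, longest first."""
    hits = []
    for (a, b), total in aggregated.items():
        if infected_player in (a, b) and total >= CLOSE_CONTACT_STANDARD_S:
            other = b if a == infected_player else a
            hits.append((other, total))
    return sorted(hits, key=lambda item: item[1], reverse=True)
```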
  • 10 Control unit, 101 Storage unit, 102 Touch panel (input unit), 12 Captured image reading unit (image reading means), 13 Position detection unit (position detection means), 131 Human body detection unit, 132 Labeling processing unit, 133 Skeleton image creation unit, 134 Coordinate conversion unit, 14 Distance calculation unit (distance calculation means), 15 Proximity distance determination unit (determination means, first determination means), 16 Time integration unit, 17 Proximity time determination unit (determination means, second determination means), 18 Alarm processing unit, 19 Proximity time totaling unit (aggregation means), 20 Evaluation unit (evaluation means), 21 Timer (timekeeping means), 22 Image display processing unit, 23 Search unit (search means), 30 Camera, 31 Sensor unit (sensor)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Geometry (AREA)
  • Signal Processing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

An evaluation device (1) comprises: an image readout unit (12) for reading out an image; a position detection unit (13) for periodically detecting the position coordinate of a human image in the read out image; a distance calculation unit (14) for periodically calculating, for each person, the distance between one person and another person out of multiple persons; a timer (21); determination units (15,17) for periodically determining, for each person, whether the distance between persons is 2 m or less continuously for 3 seconds or not; a totalization unit (19) for successively adding, for each person, a period of at least 3 seconds in the case where the determination by the determination units is continuously affirmed to the previous totalized time and obtaining a new totalized time; and an evaluation unit (20) for performing, for each person, an evaluation on close contact according to the length of the time period totalized by the totalization unit. Accordingly, high accuracy close contact determination is performed on a person in motion.

Description

Evaluation device, evaluation method, and program

The present invention mainly relates to a technique for evaluating social distancing during exercise, for the prevention of infection with infectious diseases.

Today, it is said that preventing the "three Cs" (closed spaces, crowded places, and close-contact settings) is effective in preventing droplet transmission of, for example, the novel coronavirus. Non-Patent Document 1 proposes an evaluation device that uses short-range Bluetooth wireless communication built into mobile terminals and judges whether close contact has occurred based on whether a person has remained within 2 m (social distance) of another person continuously for 15 minutes. On the other hand, in order to avoid close contact, physical activity may decrease due to self-restraint in social activities (staying home), leading to a lack of exercise. Prolonged lack of exercise raises the risk of non-communicable diseases (NCDs) such as heart disease, as well as mental health problems. It is important to maintain physical activity that is sufficient both qualitatively and quantitatively while avoiding the three Cs.

In addition, various techniques exist that take an image (moving image) captured by a camera as input, recognize the people in the image by deep learning, and determine the positions of those person images. Systems that apply such technology as a countermeasure against, for example, the novel coronavirus and issue a warning when the distance between people falls within a certain range are currently being developed around the world.

"About the novel coronavirus contact confirmation app", released June 19, 2020, Ministry of Health, Labour and Welfare Novel Coronavirus Infectious Disease Control Promotion Headquarters and Cabinet Secretariat Novel Coronavirus Infectious Disease Control Tech Team Secretariat, [retrieved July 13, 2020], Internet <https://www.mhlw.go.jp/content/10900000/000641655.pdf>

Exercise includes not only physical activity that compensates for everyday lack of exercise but also professional sports, school sports, club activities, private and public sports, and exercise by children and the elderly, and it is desirable to carry out all of these, indoors or outdoors, while avoiding the three-Cs environment. However, since a mobile terminal cannot be carried while exercising, it is difficult to apply a social-distance determination technique that relies on mobile terminals, as in the conventional art. In addition, systems that apply the technology of locating person images captured by a camera to novel coronavirus countermeasures currently only issue a warning when people come within a certain distance, and they take into account neither a time condition nor the characteristics specific to human movement in sports.

The present invention has been made in view of the above, and provides an evaluation device, evaluation method, and program that enable highly accurate close-contact determination or evaluation by applying a time condition that takes into account the particular characteristics of human movement during exercise.

The evaluation device and method according to the present invention include an image reading means, a position detecting means, a distance calculating means, a timekeeping means, a determination means, an aggregation means, and an evaluation means. The image reading means reads out captured images of a plurality of people in motion captured by an imaging means. The position detecting means periodically detects the position coordinates of each of the plurality of people from the read captured images. The distance calculating means periodically calculates, for each person, the distance between that person and each other person among the plurality of people. The timekeeping means measures the passage of time. The determination means periodically determines, for each person, whether the inter-person distance has remained within the proximity distance continuously for the reference time. Each time this determination is affirmed for a person, the aggregation means adds at least the reference time to that person's previously accumulated total and stores the result as the new total. The evaluation means then evaluates close contact for each person according to the magnitude of the time accumulated by the aggregation means.

Further, the present invention is a program for causing a computer to function as the above evaluation device.

According to the present invention, position coordinates are periodically detected for the captured person images, and the inter-person distance is calculated for each person. It is then periodically determined, for each person, whether the inter-person distance has remained within the proximity distance continuously for the reference time. When this determination is continuously affirmed, the reference time, or the duration exceeding it, is sequentially aggregated for each person by the aggregation means. The evaluation means then evaluates close contact according to the magnitude of the time aggregated for each person.

In sports, players often come close to and move away from each other for very short periods during play. Considering droplet transmission of the novel coronavirus, what matters is whether a person breathed while close to another; close contact, and hence infection risk, cannot be evaluated by distance alone. On average, a person is said to breathe from somewhat more than ten to about twenty times per minute. Taking the higher figure, one breath takes about 3 seconds. Whether a person breathed while in proximity is therefore preferably judged by whether the proximity state lasted for 3 seconds. In the present invention, the time required for one breath (3 seconds) is accordingly set as the reference time, and when people remain in proximity beyond this reference time, that time is aggregated as evaluation information. The reference time is not limited to the breathing time; it may be set shorter, or conversely longer, for example two breaths (6 seconds). The exercise environment (temperature, sunshine, indoor or outdoor) can be measured with sensors, and a person's physical condition and movement state (body temperature, heart rate, acceleration) with a wearable device, and the results can be reflected in the adjustment of the reference time as parameters for changes in the respiratory cycle. An appropriate reference time may also be set depending on the type of exercise. These adjustments can likewise be applied to the proximity distance. Since each person's accumulated time is acquired with respect to each other person, risk information for each individual other person is obtained.

According to the present invention, highly accurate close-contact determination can be performed for people who are exercising.

FIG. 1 is a block diagram showing an embodiment of the evaluation device according to the present invention. FIG. 2 explains the position of a person in a captured image and the distance between people: (A) shows the imaging environment from the side, (B) shows an image captured by the camera, and (C) explains that the two-dimensional plane imaged by the camera is converted into a horizontal plane onto which the people are re-plotted. In FIG. 3, (A) shows an example of a skeleton image created from a captured image of a person, (B) shows that the height of the horizontal plane onto which people are plotted is set to waist height, and (C) explains the outline of the homography transformation. FIG. 4 is a flowchart showing an example of the monitoring process executed by the control unit, and FIG. 5 is a flowchart showing an example of the evaluation process executed by the control unit.

FIG. 1 is a block diagram showing an embodiment of the evaluation device according to the present invention. The evaluation device 1 includes a control unit 10 comprising a computer, a storage unit 101, and a touch panel 102. The storage unit 101 stores a control program for executing the evaluation process according to the present invention, a machine learning model (program and learned parameters) for operating the AI described later, and a processing program for converting the imaging surface of the camera 30 into a horizontal plane. As will be described later, the storage unit 101 also stores various processing programs, such as for detecting a human body in a person image, labeling, detecting positions, and creating a skeleton image from a person image. In the touch panel 102, a transparent pressure-sensitive sheet is laminated on a liquid crystal display panel (display unit 1021), and the display coordinates of buttons and the like are associated in advance with the coordinates on the pressure-sensitive sheet (input unit 1022), which enables selection of the button at the pressed position. The input unit 1022 is also provided with a keyboard and a mouse (not shown) so that necessary information can be entered.

At least one camera 30 and, as needed, a sensor unit 31 are connected to the control unit 10. The control unit 10 is configured to receive the images captured by the camera 30 by wire or wirelessly. The sensor unit 31 obtains information on the exercise environment, for example receiving information from a thermometer by wire or wirelessly. The sensor unit 31 also receives, by wire or wirelessly, sensing information from small wearable devices carried by each person, for example a thermometer, heart rate monitor, or accelerometer that can be worn on the wrist. In a mode other than real-time evaluation, the information may be temporarily stored in a storage unit inside the sensor and read out later.

The camera 30 is a digital video camera capable of acquiring time-series images (moving images). Besides an ordinary digital camera, the camera 30 may be a camera mounted on a smartphone or a personal computer. The images output from the camera 30 to the control unit 10 may be real-time data or recorded images temporarily stored in an internal storage unit (not shown) of the camera 30; the images may also be data stored on a server on a network. In the present embodiment, the camera 30 is supported via a bracket at a suitable position, for example on a side wall of the gymnasium, in a posture that maintains a predetermined depression/elevation angle so as to overlook the entire exercise area. In a configuration using a plurality of cameras, images are taken from different directions, for example from the front-back and left-right directions, or from the side and from the ceiling, and the images are combined, which enables high-accuracy identification of the people in the image and position detection.

By executing the programs in the storage unit 101, the control unit 10 functions as an initial setting unit 11, a captured image reading unit 12, a position detection unit 13, a distance calculation unit 14, a proximity distance determination unit 15, a time integration unit 16, a proximity time determination unit 17, an alarm processing unit 18, a proximity time totaling unit 19, an evaluation unit 20, a timer 21, an image display processing unit 22, and a search unit 23. In the present embodiment, the position detection unit 13 includes a human body detection unit 131, a labeling processing unit 132, a skeleton image creation unit 133, and a coordinate conversion unit 134.

The initial setting unit 11 sets various initial conditions according to the circumstances of the observation target. Assumed setting items include, for example, the viewpoint and line of sight of the camera 30, the reference marks, and the close-contact conditions.

The captured image reading unit 12 reads real-time captured images from the camera 30 into the position detection unit 13, or reads recorded images stored in memory. Specifically, the read captured images (moving images) are passed to the running control program or the like and used for image processing. The read images may also be output (displayed) on the display unit 1021 as needed.

Next, the processing of the position detection unit 13 will be described with reference to FIGS. 2 and 3. FIG. 2 explains the position of a person in a captured image and the distance between people: (A) shows the imaging environment from the side, (B) shows an image captured by the camera, and (C) explains that the two-dimensional plane imaged by the camera is converted into a horizontal plane onto which the people are plotted. In FIG. 3, (A) shows an example of a skeleton image created from a captured image of a person, (B) shows that the height of the horizontal plane onto which people are plotted is set to waist height, and (C) explains the outline of the homography transformation.

FIG. 2 assumes that five players A to E are exercising in the imaging space, for example inside a gymnasium. In reality the players A to E are moving, but for convenience of illustration FIG. 2 shows them as similar standing figures. The human body detection unit 131 detects the players A to E; in the present embodiment it acquires time-series two-dimensional position information for a plurality of specific parts of each moving human body from the image captured by the camera 30. The human body detection unit 131 repeatedly detects the position information of each joint from the acquired two-dimensional image of the human body at a predetermined cycle, for example 20 Hz. Various methods for detecting two-dimensional position information associated with joints have been proposed; for example, machine learning, deep learning, or image recognition processing may be used. Once a human body has been detected, the human body detection unit 131 may continue to detect it by performing tracking processing at the predetermined cycle.

The labeling processing unit 132 labels the person images detected by the human body detection unit 131, for example as players A, B, C, D, and E. The labels may be personal names or pseudonyms (for example A, B, ...). When personal identification is required, it can be performed by, for example, face recognition, uniform-number recognition, or by having the players wear differently colored wear and identifying those colors.

The skeleton image creation unit 133 creates skeleton images from the labeled moving images of players A, B, C, D, and E; here, as shown in FIG. 3(A), a known skeleton detection algorithm (OpenPose) that extracts feature points (joint positions: nodes) and feature directions (skeleton Sk) is given as an example.

For the position information in human body detection, the two-dimensional positions of players A to E on the image plane of the camera 30 must be replaced with position information corresponding to the floor surface. The positions Pa to Pe of the players A to E indicate absolute or relative positions with respect to the floor surface FL of the gymnasium, which is the exercise space. Since various methods of detecting a player's position are known, only a brief description is given. In the playing space captured by the camera 30, the displayed image is distorted into a trapezoidal shape along the depth direction due to perspective. Therefore, the known line mark 40, or separately placed known marks 411 and 412, are imaged, and a coordinate transformation is performed so that the imaging plane is projected onto another plane as described later. Besides attaching marks to the floor, a method can also be adopted in which scale bars of a predetermined height are erected at at least four places and the coordinate transformation of the imaging plane is derived from the lengths of the scale bars as imaged by the camera 30. When a plurality of cameras 30, for example two, are used, a method of identifying each player by cross-referencing the two two-dimensional images and obtaining the position coordinates may also be used.

As shown in FIG. 3(C), the coordinate conversion unit 134 converts the coordinates of the trapezoidal imaging plane PL1, viewed from the camera 30 at a predetermined depression/elevation angle θ, into a rectangular bird's-eye view plane PL2 viewed from directly above. For this coordinate transformation, a known bird's-eye view transformation, or a transformation using a homography matrix, is applied utilizing the mark information. This makes it possible to convert each coordinate position on the plane PL1 into a coordinate position on the plane PL2.

As shown in FIG. 3(B), on the assumption that the height of the waist does not change significantly even while a player performs various movements, the coordinate conversion unit 134 produces the bird's-eye view by cutting the skeleton image Hu at the height h of a main part, for example the waist part Wa, with a horizontal plane PL. The coordinate conversion unit 134 takes the in-camera-image coordinates of the waist part Wa detected by the skeleton detection algorithm (OpenPose), converts them into the bird's-eye view, and uses the converted coordinates as the player's position information.

The distance calculation unit 14 calculates the distances between players for all of the players A to E based on the converted bird's-eye view. In this case, the length of the line mark 40, or of the separately set marks 411 and 412, is known, and the distances between players are calculated on that basis. FIG. 2(C) illustrates the calculation of the distances, on the horizontal plane PL, between player B and the other players. In this figure, the distance between players B and A is Dba, the distance between players B and C is Dbc, the distance between players B and D is Dbd, and the distance between players B and E is Dbe. The distances only need to be computed once per combination of players; for example, once the distance Dba between players B and A has been measured, the distance Dab between players A and B need not be measured.

Alternatively, instead of calculating the distance between players after conversion to the bird's-eye-view plane, the distance calculation unit 14 may calculate the distance in three-dimensional space, taking height into account. In this way, the distance can be measured in three dimensions; for example, the distance between the players' heads may be used as the distance between people.
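A minimal sketch of the pairwise distance computation over each combination of players is shown below; the player positions are hypothetical bird's-eye coordinates in metres.

```python
from itertools import combinations
from math import dist

# Hypothetical bird's-eye positions (metres) of players A-E on plane PL2.
positions = {"A": (2.1, 3.0), "B": (3.2, 3.4), "C": (7.5, 1.2),
             "D": (9.0, 6.8), "E": (4.0, 6.1)}

# One distance per unordered pair: Dba and Dab are the same value,
# so only combinations (not permutations) are computed.
pair_distances = {(p, q): dist(positions[p], positions[q])
                  for p, q in combinations(positions, 2)}

for (p, q), d in pair_distances.items():
    print(f"D{p.lower()}{q.lower()} = {d:.2f} m")
```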

The proximity distance determination unit 15 handles one of the conditions for close contact: it determines whether the distance between players is within a predetermined proximity distance. From the viewpoint of preventing droplet infection, the proximity distance is set here to 2 m. Outdoors, a larger value, for example 3 m, may be used.

The time integration unit 16 integrates the duration, for each other player, while the proximity distance determination unit 15 determines that the distance between the players is within the proximity distance. In this example, the time integration unit 16 repeatedly adds the period corresponding to 20 Hz.

The proximity time determination unit 17 handles another condition for close contact: it determines whether players have remained within the proximity distance for a predetermined time (the proximity time or a longer duration). The proximity time is set to, for example, about 3 seconds in consideration of the breathing cycle.

The alarm processing unit 18 is provided as needed and issues an alarm when the determination results of the proximity distance determination unit 15 and the proximity time determination unit 17 are both affirmative. The alarm may be an image, a voice, or a sound.

When the determination results of the proximity distance determination unit 15 and the proximity time determination unit 17 are both affirmative, the proximity time aggregation unit 19 adds the current duration to the aggregated time accumulated so far, stores the result as the new aggregated time, and treats it as the target of further accumulation.

The evaluation unit 20 determines, for each player, whether the aggregated time exceeds the time corresponding to close contact and, if so, evaluates that player as a close contact. The evaluation unit 20 can also assign points, such as close-contact risk points, according to the magnitude of the time aggregated over the monitored series of exercise or practice.
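For example, the evaluation against a close-contact criterion could be sketched as follows; the 15-minute figure follows the example criterion given later in this description, while the risk-point scale is purely a hypothetical choice for illustration.

```python
CLOSE_CONTACT_SECONDS = 15 * 60   # example criterion from this description (15 minutes)

def evaluate(aggregated_seconds: dict) -> dict:
    """Return, per player, the total aggregated proximity time, whether it exceeds
    the close-contact criterion, and a hypothetical risk-point score."""
    report = {}
    for player, total in aggregated_seconds.items():
        report[player] = {
            "total_s": total,
            "close_contact": total > CLOSE_CONTACT_SECONDS,
            "risk_points": round(total / 60),   # assumption: one point per minute
        }
    return report

print(evaluate({"A": 420.0, "B": 1020.0, "C": 95.0}))
```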

The timer 21 is a timekeeping means, implemented in software, that measures the passage of time. The image display processing unit 22 causes the display unit 1021 to display, in a predetermined format, the state of the evaluation process while it is running, or its result when it ends. For example, it displays the aggregated time of each player and the total obtained by summing the aggregated times of all players. In these displays, it is preferable to plot time on the horizontal axis and aggregated time on the vertical axis, giving a two-dimensional view of how the aggregated time accumulates over time.
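Such a display could be produced, for example, with matplotlib; this library is not named in the description and is assumed here only for illustration, and the sample values are hypothetical.

```python
import matplotlib.pyplot as plt

# Hypothetical monitoring samples: elapsed session time (s) and the aggregated
# proximity time (s) of one player at each sample.
elapsed = [0, 300, 600, 900, 1200, 1500, 1800]
aggregated = [0, 12, 45, 45, 120, 240, 300]

plt.plot(elapsed, aggregated, drawstyle="steps-post", label="Player B")
plt.xlabel("Session time [s]")
plt.ylabel("Aggregated proximity time [s]")
plt.legend()
plt.show()
```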

When it later turns out that one of the players A to E has been infected with the relevant infectious disease, the search unit 23 starts a search program and checks the evaluation results, in particular the aggregated times, of the players who exercised with the infected person. The search unit 23 searches for (extracts) the player whose aggregated proximity time with the player confirmed to be infected is the longest, and any players whose aggregated time reaches or exceeds the time corresponding to close contact.

FIG. 4 is a flowchart showing an example of the procedure of the monitoring process executed by the control unit 10. First, when the monitoring process is started, an initial setting process is performed that sets various initial conditions, including the input of the shape information of the marks used as the reference for position detection (step S1). When monitoring then begins, it is determined whether the predetermined time (the 20 Hz period mentioned above) measured by the timer 21 has elapsed (step S3). When the predetermined time has elapsed, the two-dimensional image captured by the camera 30, including the images of the moving human bodies to be measured, is read in (step S5). Next, human body images are detected in the acquired two-dimensional image, and labeling processing and skeleton image creation processing are performed on them (step S7).

Next, the position of each player image in the imaging space is calculated (step S9), and the coordinates are transformed onto the bird's-eye-view plane (step S11). After that, the evaluation process (FIG. 5) is executed each time the predetermined time has elapsed (Yes in step S3) (step S13). The processing then returns to step S3 and is repeated until the monitoring process ends (step S15).
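A minimal sketch of this monitoring loop is given below. The `camera` object and the `pipeline` callbacks are hypothetical stand-ins for the image reading, detection, coordinate-conversion and evaluation units described above, not an actual API of this device.

```python
import time

PERIOD_S = 1.0 / 20.0   # the 20 Hz monitoring period described above

def run_monitoring(camera, stop_event, pipeline):
    """Sketch of FIG. 4: capture, detect, transform and evaluate at 20 Hz."""
    pipeline.initialize()                      # step S1: marks, thresholds, etc.
    next_tick = time.monotonic()
    while not stop_event.is_set():             # step S15: end of monitoring
        now = time.monotonic()
        if now < next_tick:                    # step S3: wait for the period to elapse
            time.sleep(next_tick - now)
        next_tick += PERIOD_S
        frame = camera.read()                  # step S5: 2-D image from camera 30
        people = pipeline.detect_and_label(frame)              # step S7
        pixel_positions = pipeline.locate(people)              # step S9
        floor_positions = pipeline.to_birdseye(pixel_positions)  # step S11
        pipeline.evaluate(floor_positions)                     # step S13 (FIG. 5)
```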

FIG. 5 is a flowchart showing an example of the procedure of the evaluation process executed by the control unit 10. The evaluation process is described using the example of FIG. 2. First, the control unit 10 sequentially selects each pair from the combinations of players A to E and calculates the distance between the two people of the selected pair (step S21). It is then determined whether the calculated distance is within the proximity distance (step S23); if so, the predetermined time (in this example the period corresponding to 20 Hz) is added to the integrated time of that pair (step S25). Next, it is determined whether the integrated time of the pair exceeds the reference time (typically 3 seconds) (step S27). If the reference time has been exceeded, an alarm instruction is issued as necessary (step S29). It is then determined whether any pairs remain (step S31); if so, the processing returns to step S21 and the same processing is executed for the next pair. If no pairs remain and all pairs have been processed, the processing proceeds to step S15.

If it is determined in step S23 that the distance is not within the proximity distance, it is determined whether the integrated time of the pair has reached the reference time (step S33). If it has, the current integrated time is added to each of the pair's aggregated times, and the integrated time is reset (step S35). The time added here is temporarily stored as the new aggregated time of the pair and becomes the target of further addition in later evaluations (step S35 reached via steps S23 and S33).

On the other hand, if the integrated time of the pair has not reached the reference time in step S33, it is regarded that no close proximity occurred, and the integrated time of the pair is reset without being aggregated (step S37).
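The per-pair logic of steps S21 to S37 can be condensed into the following sketch; it assumes simple in-memory dictionaries for the integrated and aggregated times, and the variable names are illustrative only.

```python
from itertools import combinations
from math import dist

PROXIMITY_M = 2.0          # proximity distance (2 m)
REFERENCE_S = 3.0          # reference time (about one breath)
PERIOD_S    = 1.0 / 20.0   # 20 Hz evaluation period

integrated = {}   # per-pair running integration time (steps S25/S35/S37)
aggregated = {}   # per-pair aggregated close-proximity time (step S35)

def evaluate_pairs(positions):
    """One pass of FIG. 5 over every pair, given bird's-eye positions in metres."""
    for pair in combinations(sorted(positions), 2):              # step S21
        d = dist(positions[pair[0]], positions[pair[1]])
        if d <= PROXIMITY_M:                                      # step S23: within 2 m
            integrated[pair] = integrated.get(pair, 0.0) + PERIOD_S   # step S25
            if integrated[pair] >= REFERENCE_S:                   # step S27
                print(f"alarm: {pair} close for {integrated[pair]:.1f} s")  # step S29
        else:
            t = integrated.pop(pair, 0.0)
            if t >= REFERENCE_S:                                  # step S33
                aggregated[pair] = aggregated.get(pair, 0.0) + t  # step S35
            # otherwise the integrated time is simply discarded   # step S37
```

Calling `evaluate_pairs` once per 20 Hz sample with the latest bird's-eye positions reproduces the accumulation and aggregation behaviour described above.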

By acquiring the reference time or duration for each pair in this way, even when close contact has not occurred, the exercise or practice method can be changed so as to further reduce these times, making it possible to propose and adopt safer, more reassuring ways of exercising and practicing.

In the present embodiment, it is also possible to sum the reference times or aggregated times between players, evaluate the exercise or practice according to the magnitude of that total, and change the exercise method as appropriate. In this case, the integrated time to be aggregated may be only the reference time itself at the moment it is reached, or may be aggregated so as to include the duration beyond the reference time.

The present invention is also applicable to diseases that are more readily transmitted by close contact, including droplet infection.

As described above, the evaluation device and method according to the present invention preferably include an image reading means, a position detection means, a distance calculation means, a timekeeping means, a determination means, an aggregation means, and an evaluation means. The image reading means reads out captured images, captured by an imaging means, of a plurality of people in motion. The position detection means periodically detects, from the read captured images, the position coordinates of each of the plurality of people. The distance calculation means periodically calculates, for each person, the distance between that person and each other person among the plurality of people. The timekeeping means measures the passage of time. The determination means periodically determines, for each person, whether the distance between people has remained within the proximity distance continuously for the reference time. The aggregation means, whenever the determination by the determination means is continuously affirmed, successively adds at least the reference time, for each person, to the aggregated time accumulated so far and takes the result as the new aggregated time. The evaluation means then evaluates close contact for each person according to the magnitude of the time aggregated by the aggregation means.

The present invention is also preferably a program for causing a computer to function as the evaluation device.

In sports, people often approach and move away from each other for very short periods during play (exercise). Considering droplet infection with the novel coronavirus, what matters is whether a person breathed while people were close to each other; close contact, and hence infection risk, cannot be evaluated from distance alone. On average, a person is said to breathe between somewhat more than ten and twenty times per minute. Taking the higher figure, one breath takes about 3 seconds. It is therefore preferable to judge whether breathing occurred during proximity by whether the proximity state lasted for 3 seconds. Accordingly, in the present invention, the time required for one breath (3 seconds) is used as the reference time, and when people remain close for longer than this reference time, that time is aggregated as evaluation information. The reference time is not limited to this breathing time; it may be set shorter, as a probabilistic choice, or conversely longer, for example two breaths (6 seconds). In addition, the exercise environment (temperature, sunshine, indoors or outdoors) can be measured with sensors, and a person's physical condition and state of motion (body temperature, heart rate, acceleration) can be measured with a wearable terminal, and the results can be reflected in the adjustment of the reference time as parameters for changes in the breathing cycle. An appropriate reference time may also be set according to the type of exercise. These adjustments can likewise be applied to the proximity distance as well as to the reference time. Since the aggregated time of each person is acquired with respect to each other person, risk information with respect to each individual other person is obtained.
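The arithmetic behind the 3-second default and a sensor-based adjustment could be sketched as follows; the heart-rate correction is purely a hypothetical example of feeding a wearable-sensor reading into the adjustment, since no specific formula is given in this description.

```python
from typing import Optional

def reference_time_s(breaths_per_min: float = 20.0,
                     heart_rate_bpm: Optional[float] = None) -> float:
    """One breath at 20 breaths/min takes 60 / 20 = 3 s, the default reference time.
    The heart-rate branch is an assumed adjustment, not a formula from the text."""
    t = 60.0 / breaths_per_min
    if heart_rate_bpm is not None and heart_rate_bpm > 100.0:
        t *= 0.8   # assume faster breathing under load, so shorten the reference time
    return t

print(reference_time_s())                      # 3.0 s
print(reference_time_s(heart_rate_bpm=150.0))  # shortened under exertion
```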

The number of imaging means is preferably at least one. With this configuration, when a single camera is used, a method can be applied in which a two-dimensional coordinate system is constructed on the captured-image plane using markers (mark size information or height information) so that the coordinate positions of the human images can be calculated. With two or more cameras, the captured human images can be identified by comparing the plurality of captured images.

It is also preferable that the position detection means and the distance calculation means perform the position detection and distance calculation at a short period shorter than the reference time, and that the determination means performs the determination at each short period. With this configuration, by repeating position detection and distance calculation at a short period corresponding to, for example, 20 Hz, it can be monitored accurately whether the determination condition has been continuously affirmed within the reference time, such as 3 seconds. The short period is not limited to the 20 Hz period; it may be shorter, and it suffices that it is at least shorter than the reference time.

It is also preferable that the determination means includes a first determination means that determines whether the distance between people is within the proximity distance, and a second determination unit that determines whether the affirmative determination by the first determination means continues for the reference time. With this configuration, the determination is made for each of the proximity distance and reference time conditions.

It is also preferable that the aggregation means sequentially aggregates the reference time whenever the determination by the determination means is affirmed. With this configuration, when so-called proximity is determined, an amount equal to the reference time is aggregated. Thus, in this mode, when the integration continues for longer than the reference time, the aggregation of one reference time can be completed at the moment the integrated time reaches the reference time; the integrated time is then reset and the continuous integration is repeated. In this case, aggregation can be performed for each reference time.

The present invention also preferably includes a sensor that measures at least one of the physical condition of the exercising person and the exercise environment, and the determination means adjusts at least one of the reference time and the proximity distance using the measurement result of the sensor as a parameter. Since physical condition and the exercise environment affect the breathing rate, adjusting the conditions for evaluation using the measurement results as parameters enables highly accurate evaluation.

The present invention also preferably includes a sensor that measures at least one of the physical condition of the exercising person and the exercise environment, and the evaluation means adjusts the evaluation of close contact using the measurement result of the sensor as a parameter. With this configuration, the parameters can be applied not only to the determination but also to the evaluation.

It is also preferable that the aggregation means further totals the aggregated times of all persons, and that the evaluation means evaluates close contact for the exercise as a whole. With this configuration, for a series of exercises, the overall opportunity for close contact can be presented separately from that of each person, allowing safer exercise and practice methods to be devised.

The present invention also preferably includes an input unit for inputting information on an infected person, among the plurality of people, who has been infected with an infectious disease, and a search means for searching for people whose aggregated time with the input infected person exceeds the close-contact criterion. With this configuration, when an infected person is identified, close contacts exceeding the criterion (for example, 15 minutes) can be found quickly.
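For example, the search over the stored per-pair aggregated times could be sketched as follows; the data layout and names are illustrative assumptions.

```python
CLOSE_CONTACT_SECONDS = 15 * 60   # example criterion (15 minutes)

def find_close_contacts(aggregated, infected):
    """aggregated maps unordered pairs (p, q) to aggregated seconds; `infected` is
    the player entered via the input unit. Returns the players whose aggregated
    time with the infected player exceeds the criterion, longest time first."""
    with_infected = {}
    for (p, q), t in aggregated.items():
        if infected in (p, q):
            other = q if p == infected else p
            with_infected[other] = with_infected.get(other, 0.0) + t
    ranked = sorted(with_infected.items(), key=lambda kv: kv[1], reverse=True)
    return [item for item in ranked if item[1] > CLOSE_CONTACT_SECONDS]

# Example with hypothetical data: player C is reported infected.
print(find_close_contacts({("A", "C"): 1100.0, ("B", "C"): 200.0, ("A", "B"): 40.0}, "C"))
```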

1 Evaluation device
10 Control unit
101 Storage unit
102 Touch panel (input unit)
12 Captured image reading unit (image reading means)
13 Position detection unit (position detection means)
131 Human body detection unit
132 Labeling processing unit
133 Skeleton image creation unit
134 Coordinate conversion unit
14 Distance calculation unit (distance calculation means)
15 Proximity distance determination unit (determination means, first determination means)
16 Time integration unit
17 Proximity time determination unit (determination means, second determination means)
18 Alarm processing unit
19 Proximity time aggregation unit (aggregation means)
20 Evaluation unit (evaluation means)
21 Timer (timekeeping means)
22 Image display processing unit
23 Search unit (search means)
30 Camera
31 Sensor unit (sensor)

Claims (11)

1. An evaluation device comprising:
an image reading means that reads out captured images, captured by an imaging means, of a plurality of people in motion;
a position detection means that periodically detects, from the read captured images, the position coordinates of each of the plurality of people;
a distance calculation means that periodically calculates, for each person, the distance between that person and each other person among the plurality of people;
a timekeeping means that measures the passage of time;
a determination means that periodically determines, for each person, whether the distance between people has remained within a proximity distance continuously for a reference time;
an aggregation means that, whenever the determination by the determination means is continuously affirmed, successively adds at least the reference time, for each person, to the aggregated time up to the previous time and takes the result as a new aggregated time; and
an evaluation means that evaluates close contact for each person according to the magnitude of the time aggregated by the aggregation means.

2. The evaluation device according to claim 1, wherein the number of imaging means is at least one.

3. The evaluation device according to claim 1 or 2, wherein the position detection means and the distance calculation means perform the position detection and the distance calculation at a short period shorter than the reference time, and the determination means performs the determination at each short period.

4. The evaluation device according to any one of claims 1 to 3, wherein the determination means comprises a first determination means that determines whether the distance between people is within the proximity distance, and a second determination unit that determines whether the affirmative determination by the first determination means continues for the reference time.

5. The evaluation device according to any one of claims 1 to 4, wherein the aggregation means sequentially aggregates the reference time whenever the determination by the determination means is affirmed.

6. The evaluation device according to any one of claims 1 to 5, further comprising a sensor that measures at least one of the physical condition of an exercising person and the exercise environment, wherein the determination means adjusts at least one of the reference time and the proximity distance using the measurement result of the sensor as a parameter.

7. The evaluation device according to any one of claims 1 to 5, further comprising a sensor that measures at least one of the physical condition of an exercising person and the exercise environment, wherein the evaluation means adjusts the evaluation of close contact using the measurement result of the sensor as a parameter.

8. The evaluation device according to any one of claims 1 to 7, wherein the aggregation means further totals the aggregated time of each person, and the evaluation means evaluates close contact for the exercise as a whole.

9. The evaluation device according to any one of claims 1 to 8, further comprising an input unit for inputting information on an infected person, among the plurality of people, who has been infected with an infectious disease, and a search means that searches for a person whose aggregated time with the input infected person exceeds a close-contact criterion.

10. An evaluation method in which:
an image reading means reads out captured images, captured by an imaging means, of a plurality of people in motion;
a position detection means periodically detects, from the read captured images, the position coordinates of each of the plurality of people;
a distance calculation means periodically calculates, for each person, the distance between that person and each other person among the plurality of people;
a timekeeping means measures the passage of time;
a determination means periodically determines, for each person, whether the distance between people has remained within a proximity distance continuously for a reference time;
an aggregation means, whenever the determination by the determination means is continuously affirmed, successively adds at least the reference time, for each person, to the aggregated time up to the previous time and takes the result as a new aggregated time; and
an evaluation means evaluates close contact for each person according to the magnitude of the time aggregated by the aggregation means.

11. A program for causing a computer to function as the evaluation device according to any one of claims 1 to 9.
PCT/JP2021/022465 2020-07-20 2021-06-14 Evaluation device, evaluation method, and program Ceased WO2022019001A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022538628A JPWO2022019001A1 (en) 2020-07-20 2021-06-14

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020123769 2020-07-20
JP2020-123769 2020-07-20

Publications (1)

Publication Number Publication Date
WO2022019001A1 (en)

Family

ID=79729333

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/022465 Ceased WO2022019001A1 (en) 2020-07-20 2021-06-14 Evaluation device, evaluation method, and program

Country Status (2)

Country Link
JP (1) JPWO2022019001A1 (en)
WO (1) WO2022019001A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025052496A1 (en) * 2023-09-04 2025-03-13 株式会社RedDotDroneJapan Action determination system, action determination method, and program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011248802A (en) * 2010-05-31 2011-12-08 Michito Miyazaki Viral infection hazard system using gps function
JP2019083395A (en) * 2017-10-30 2019-05-30 パナソニックIpマネジメント株式会社 Infectious substance monitoring system, and infectious substance monitoring method


Also Published As

Publication number Publication date
JPWO2022019001A1 (en) 2022-01-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21846499

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022538628

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21846499

Country of ref document: EP

Kind code of ref document: A1