WO2019045091A1 - Information processing device, counting system, counting method, and program recording medium - Google Patents
Information processing device, counting system, counting method, and program recording medium
- Publication number
- WO2019045091A1 (PCT/JP2018/032538)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- objects
- measured
- learning model
- fish
- feature amount
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06M—COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
- G06M11/00—Counting of objects distributed at random, e.g. on a surface
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- The present invention relates to a technique for measuring the number of objects using an imaging device.
- Patent Document 1 discloses a configuration for measuring the number of fish. In that configuration, a camera captures the fish in a fish preserve from the side of the preserve, and an information processing apparatus tracks the fish by analyzing the captured images and counts the fish in the preserve using the tracking result.
- In another disclosed configuration, a camera captures the fish from above the tank containing them, a computer detects fish shadows in the captured image, and the presence and number of fish shadows are determined based on the area of the regions detected as fish shadows.
- In Patent Document 1, fish are detected by analyzing a captured image, and the fish in the preserve are counted using the result of tracking them.
- In the other configuration, fish shadows are detected from a captured image, and the number of fish shadows is determined based on the area of the regions detected as fish shadows in the captured image.
- In either case, fish are detected and the number of fish is measured using the detection result.
- The present invention has been devised to solve the above problems. That is, the main object of the present invention is to provide a technique that, when the number of objects is measured using images captured by an imaging device, suppresses the increase in time required for the process of measuring the number of objects even when the number of objects to be measured is large, and that improves measurement accuracy.
- One aspect of the information processing apparatus according to the present invention includes: an extraction unit that extracts, from a captured image in which objects to be measured are captured, a predetermined feature amount that changes according to the number of the objects; and a counting unit that detects the number of the objects from a learning model by collating the extracted feature amount with the learning model, which is relationship data, obtained by machine learning, between the feature amount and the number of objects.
- One aspect of the counting system according to the present invention includes: a photographing device that photographs a photographing space in which objects to be measured exist; and an information processing apparatus that measures the number of the objects in the photographing space based on images captured by the photographing device.
- The information processing apparatus includes an extraction unit that extracts, from a captured image in which the objects to be measured are captured, a predetermined feature amount that changes according to the number of the objects, and a counting unit that detects the number of the objects from a learning model by collating the extracted feature amount with the learning model, which is relationship data, obtained by machine learning, between the feature amount and the number of objects.
- One aspect of the counting method according to the present invention includes: extracting, from a captured image in which objects to be measured are captured, a predetermined feature amount that changes according to the number of the objects; and detecting the number of the objects from a learning model by collating the extracted feature amount with the learning model, which is relationship data, obtained by machine learning, between the feature amount and the number of objects.
- One aspect of the program storage medium according to the present invention stores a computer program that causes a computer to execute: a process of extracting, from a captured image in which objects to be measured are captured, a predetermined feature amount that changes according to the number of the objects; and a process of detecting the number of the objects from a learning model by collating the extracted feature amount with the learning model, which is relationship data, obtained by machine learning, between the feature amount and the number of objects.
- According to the present invention, when the number of objects is measured using images captured by an imaging device, it is possible to suppress an increase in the time required for the measurement process even when the number of objects to be measured is large, and to improve measurement accuracy.
- FIG. 1 is a block diagram showing a simplified configuration of a counting system according to a first embodiment of the present invention.
- The counting system 10 of the first embodiment is a system that measures the number of fish, which are the objects to be measured, in a fish preserve.
- The counting system 10 includes a camera 11, which is a photographing device, and an information processing device 12.
- The camera 11 has a waterproof function and, as shown in FIG. 2, is disposed in the fish preserve 25 in which the fish 26 are cultured.
- The camera 11 is disposed at the center of the water bottom (including a position near the bottom) of the fish preserve 25, with the lens of the camera 11 directed toward the water surface (upward).
- The camera 11 has a shooting range (field of view) as shown in FIG. That is, the shooting range of the camera 11 is set, in consideration of the size of the fish preserve 25, so that the side (peripheral edge) of the preserve 25 can be photographed, and the camera 11 photographs the water in the preserve 25 as the photographing space.
- The camera 11 is supported and fixed on a metal plate 20, which serves as a supporting member, as shown in FIG. 4, so that the lens faces upward (the direction facing the upper side of the plate surface of the metal plate 20).
- The metal plate 20 on which the camera 11 is supported and fixed is connected to a buoy 22, which is a floating body, by a plurality of (four) ropes 21, which are wire members, and the end of each rope 21 opposite the buoy 22 is connected to a weight 23.
- The buoy 22 floats at the water surface and the weight 23 sinks toward the water bottom, whereby the metal plate 20 is suspended in the water by the ropes 21. The weight 23 also prevents the position of the metal plate 20 (in other words, the camera 11) from changing greatly.
- This method of arranging the camera 11 in the water by means of the metal plate 20, the ropes 21, the buoy 22, and the weight 23 has a simple structure that is easy to make compact and lightweight, which makes it easy to move the camera 11 to another fish preserve 25.
- The camera 11 thus disposed in the water photographs the fish 26 in the fish preserve 25 from the ventral side.
- The fish 26 appear in the images captured by the camera 11 as shown in FIGS. 5A to 5C.
- The camera 11 is a photographing device having a function of capturing moving images. Alternatively, a photographing device that intermittently captures still images at set time intervals, without a moving-image capturing function, may be adopted as the camera 11.
- The calibration of the camera 11 is performed by an appropriate calibration method that takes into consideration the environment of the fish preserve 25, the type of fish to be measured, and the like.
- A description of the calibration method is omitted here.
- As the methods of starting and stopping photographing by the camera 11, appropriate methods are adopted in consideration of the performance of the camera 11, the environment of the fish preserve 25, and the like.
- For example, an observer (operator) of the fish manually starts shooting before the camera 11 enters the fish preserve 25 and manually stops shooting after the camera 11 is withdrawn from the preserve 25.
- Alternatively, if the camera 11 has a wireless or wired communication function, the camera 11 may be communicably connected to an operating device capable of transmitting information for controlling the start and stop of shooting. The start and stop of shooting by the camera 11 in the water may then be controlled by the observer operating that device.
- The images captured by the camera 11 as described above may be taken into the information processing apparatus 12 by wired or wireless communication, or may first be stored in a portable storage medium and then loaded into the information processing apparatus 12 from that medium.
- The information processing apparatus 12 generally includes a control device 13 and a storage device 14, as shown in FIG.
- The information processing apparatus 12 is connected to an input device 16 (for example, a keyboard or a mouse) with which the observer inputs information to the information processing apparatus 12, and to a display device 17 that displays information.
- The information processing device 12 may also be connected to an external storage device 15 separate from the information processing device 12.
- The storage device 14 has a function of storing various data and computer programs (hereinafter also referred to as programs), and is realized by a storage medium such as a hard disk device or a semiconductor memory, for example.
- The storage device 14 included in the information processing device 12 is not limited to a single device; the information processing device 12 may include multiple types of storage devices, in which case they are collectively referred to as the storage device 14.
- The storage device 15, like the storage device 14, has a function of storing various data and computer programs, and is realized by a storage medium such as a hard disk device or a semiconductor memory, for example.
- The information processing apparatus 12 writes information to and reads information from the storage device 15 as appropriate, but the storage device 15 is omitted from the following description.
- The storage device 14 stores the images captured by the camera 11 in association with information identifying the camera that captured them, the shooting date and time, and information related to the shooting conditions such as weather information.
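As a simple illustration of this association (a minimal sketch; the record layout, field names, and sample values below are our own, not from the patent):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CapturedImageRecord:
    """One stored image together with the metadata the text mentions."""
    image_path: str          # where the captured image is stored
    camera_id: str           # identifies the camera that captured it
    captured_at: datetime    # shooting date and time
    weather: str             # shooting-condition information

record = CapturedImageRecord(
    image_path="images/cam11_0001.png",
    camera_id="camera-11",
    captured_at=datetime(2018, 9, 3, 10, 30),
    weather="sunny",
)
print(record.camera_id, record.captured_at.isoformat())
```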
- The control device 13 is configured by a processor such as a CPU (Central Processing Unit), for example.
- The control device 13 can provide the following functions, for example, by the CPU executing a computer program stored in the storage device 14. That is, the control device 13 includes an extraction unit 30, a counting unit 31, and a display control unit 32 as functional units.
- The display control unit 32 has a function of controlling the display operation of the display device 17. For example, when the display control unit 32 receives, from the input device 16, a request to reproduce images captured by the camera 11, it reads the requested images from the storage device 14 and displays them on the display device 17.
- The extraction unit 30 has a function of extracting a predetermined feature amount from the photographed image.
- The feature amount is the feature amount used in a learning model for number detection stored in the storage device 14.
- The learning model for number detection is a model used in the process of detecting (counting) the number of fish in the fish preserve 25 from the images captured by the camera 11 disposed as described above; it is relationship data between the feature amount and the number of fish.
- The learning model for number detection is generated by machine learning using, as teacher images, a large number of images obtained by photographing the fish in the fish preserve 25 with the camera 11 while the number of fish is known.
- In FIGS. 5A to 5C, examples of images captured in the fish preserve 25 by the camera 11 are shown in association with the number of fish in the preserve 25.
- As shown in FIGS. 5A to 5C, the brightness of the captured image, the number of fish shadows, and the like change according to the number of fish. An element that changes in this way according to the number of fish is set as the feature amount.
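As a concrete illustration only (not taken from the patent), the sketch below computes two such candidate feature amounts, mean brightness and a count of dark blobs; the feature choices, the threshold value, and the OpenCV-based implementation are all assumptions.

```python
import cv2
import numpy as np

def extract_features(image_path: str) -> np.ndarray:
    """Compute simple image features that vary with the number of fish:
    mean brightness and the number of dark blobs (fish shadows)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    mean_brightness = float(gray.mean())

    # Treat sufficiently dark pixels as fish shadows (threshold is illustrative).
    _, shadow_mask = cv2.threshold(gray, 80, 255, cv2.THRESH_BINARY_INV)
    num_labels, _ = cv2.connectedComponents(shadow_mask)
    num_shadow_blobs = num_labels - 1  # subtract the background label

    return np.array([mean_brightness, num_shadow_blobs], dtype=np.float32)
```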
- The learning model for number detection may take, for example, the form shown in FIG. 6A or the form shown in FIG. 7, and its form is set appropriately together with the method of generating the learning model described above.
- The learning model in FIG. 6A is a model in which the number of fish changes continuously according to changes in the feature value extracted from the teacher data. Such a model is obtained, for example, by regression analysis using measurement points P, obtained from the teacher data, that represent the relationship between a large number of feature amounts and the corresponding numbers of fish, as shown in FIG. 6B.
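A minimal sketch of such a regression fit, assuming a simple linear model and made-up teacher data (the patent does not specify the regression form or any numbers):

```python
import numpy as np

# Measurement points P from the teacher data: (feature value, known fish count).
# All values here are invented purely for illustration.
features = np.array([0.12, 0.25, 0.38, 0.51, 0.64, 0.80])
fish_counts = np.array([50, 110, 160, 220, 270, 340])

# Fit a degree-1 polynomial (linear regression) relating feature to count.
slope, intercept = np.polyfit(features, fish_counts, deg=1)

def predict_count(feature_value: float) -> int:
    """Collate an extracted feature value with the regression model."""
    return round(slope * feature_value + intercept)

print(predict_count(0.45))
```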
- The learning model in FIG. 7 is a model in which combinations of a plurality of different feature amounts M and N extracted from the teacher images are grouped (classified into classes) according to the number of fish in the fish preserve 25 in the teacher image from which each combination of the feature amounts M and N was extracted.
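The classification variant could, under our own assumptions, look like the nearest-neighbour sketch below; the feature pairs, the class labels, and the use of nearest-neighbour matching are illustrative, not specified by the patent.

```python
import numpy as np

# Teacher data: each row is a (feature M, feature N) pair together with the
# fish-count class it was observed for. The numbers are illustrative only.
teacher_features = np.array([
    [0.10, 0.20], [0.12, 0.22],   # class "about 100 fish"
    [0.30, 0.45], [0.33, 0.48],   # class "about 200 fish"
    [0.55, 0.70], [0.58, 0.74],   # class "about 300 fish"
])
teacher_classes = np.array([100, 100, 200, 200, 300, 300])

def classify_count(feature_m: float, feature_n: float) -> int:
    """Assign a new (M, N) combination to the class of the nearest teacher point."""
    query = np.array([feature_m, feature_n])
    distances = np.linalg.norm(teacher_features - query, axis=1)
    return int(teacher_classes[np.argmin(distances)])

print(classify_count(0.31, 0.46))  # nearest teacher points belong to the 200-fish class
```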
- The extraction unit 30 extracts, from the captured image, the feature amount used in the above-described learning model for number detection.
- As the extractor, for example, a deep learning framework used for generating the learning model is used.
- The counting unit 31 has a function of collating the feature amount extracted from the captured image by the extraction unit 30 with the learning model for number detection in the storage device 14, and of counting the number of fish corresponding to the extracted feature amount in that learning model as the number of fish to be measured.
- The counting unit 31 may externally receive information (environmental information) about the photographing environment, such as the season and the weather, and may have a function of correcting the number of fish detected from the learning model according to the photographing environment, using the received environmental information and correction data stored in advance in the storage device 14.
- Alternatively, a learning model may be generated for each shooting environment and stored in the storage device 14 in association with information on that environment. In this case, the counting unit 31 detects the number of fish using the learning model corresponding to the received environmental information.
- The counting unit 31 may also detect the number of fish using a learning model selected according to externally received information such as fish age (fish size) and fish preserve size.
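One way to organise such per-environment models is sketched below; the dictionary keys, the linear models, and their parameters are placeholders of our own, not values from the patent.

```python
from typing import Callable, Dict, Tuple

# One simple model per shooting environment, keyed by (season, weather).
ModelKey = Tuple[str, str]
models: Dict[ModelKey, Callable[[float], float]] = {
    ("summer", "sunny"):  lambda f: 420.0 * f + 15.0,
    ("summer", "cloudy"): lambda f: 460.0 * f + 10.0,
    ("winter", "sunny"):  lambda f: 390.0 * f + 20.0,
}

def count_fish(feature_value: float, season: str, weather: str) -> int:
    """Pick the learning model matching the reported environment and collate
    the extracted feature value with it."""
    model = models[(season, weather)]
    return round(model(feature_value))

print(count_fish(0.5, "summer", "cloudy"))
```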
- The counting system 10 of the first embodiment is configured as described above. Next, an example of the counting process of the information processing apparatus 12 in the first embodiment will be described based on the flowchart of FIG.
- When the extraction unit 30 of the information processing device 12 detects that a request to count the fish to be measured has been input by the observer operating the input device 16 (step S101), it acquires the image captured by the camera 11 from the storage device 14. The extraction unit 30 then extracts the preset feature amount from the acquired image (step S102).
- The counting unit 31 collates the extracted feature amount with the learning model for number detection and detects (counts) the number of fish corresponding to the extracted feature amount in the learning model as the number of fish to be measured (step S103). The counting unit 31 then, for example, causes the display control unit 32 to display the measured number of fish.
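A compact, illustrative version of steps S102 and S103, assuming mean brightness as the feature amount and a linear learning model (both are our own stand-ins, not the patent's specification):

```python
import numpy as np

def count_fish_from_image(image: np.ndarray,
                          model_slope: float,
                          model_intercept: float) -> int:
    """S102: extract a feature amount; S103: collate it with the learning model."""
    feature_value = float(image.mean()) / 255.0  # preset feature amount
    return round(model_slope * feature_value + model_intercept)

# Usage with a dummy grayscale image and made-up model parameters.
captured = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(count_fish_from_image(captured, model_slope=500.0, model_intercept=10.0))
```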
- Through such processing, the information processing apparatus 12 can measure the number of fish to be measured from the captured image and output the measured number.
- The extraction unit 30 and the counting unit 31 of the information processing device 12 detect the number of fish to be measured from the captured image by using a learning model for number detection obtained by machine learning. Since the detection accuracy of a machine-learned model can be improved by increasing the amount of teacher data, the information processing apparatus 12 can easily enhance the accuracy of counting objects by using such a model. In addition, since the information processing apparatus 12 extracts the feature amount of the captured image and detects the number of objects (fish) based on that feature amount, it can suppress an increase in the time required for the measurement process even when the number of fish increases, compared with measuring the fish in the captured image one by one.
- Furthermore, because the information processing apparatus 12 measures the number of fish based on the feature amount extracted from the captured image rather than counting the fish one by one, it can detect (count) the number of fish with high accuracy even if some fish cannot be detected in the captured image due to overlapping of the fish or turbidity of the water.
- In the above, the camera 11 is disposed at the water bottom (a position near the water bottom).
- Alternatively, a camera 35 disposed as shown in FIG. 9 may be used. That is, as shown in FIG. 9, the camera 35 is fixed to the metal plate 20, which serves as the support member, in a horizontal orientation such that the lens axis is parallel to the plate surface. In other words, when introduced into the water of the fish preserve 25, the camera 35 is disposed sideways so as to be parallel to the water surface.
- FIG. 10 illustrates an example of the relationship between the shooting range of the horizontally oriented camera 35 and the fish preserve 25 when the preserve 25 is viewed from above. As shown in FIG. 10, the horizontally oriented camera 35 is disposed at the periphery of the fish preserve 25 and photographs the fish in the preserve 25 from its periphery (side surface).
- The camera 35 has a field of view in which almost the entire fish preserve 25 can be photographed.
- The depth position (height position) of the horizontally oriented camera 35 in the water is, for example, an intermediate position between the water bottom and the water surface, or a depth at which the fish tend to gather depending on their type, that is, a depth position considered suitable for detecting the number of fish.
- In FIGS. 11A to 11C, examples of images of fish captured by the horizontally oriented camera 35 are shown for different numbers of fish.
- In this case, the learning model for number detection is generated using, as teacher data, images captured by a camera 35 disposed in the same manner.
- The extraction unit 30 extracts, from the image captured by the horizontally oriented camera 35, the feature amount used in the learning model for number detection generated in this way.
- The counting unit 31 detects the number of fish by collating the feature amount extracted by the extraction unit 30 with the learning model for number detection based on images captured by the horizontally oriented camera 35.
- With this arrangement as well, the counting system 10 can obtain the same effects as described above.
- FIG. 12 is a block diagram schematically showing the configuration of the counting system of the second embodiment.
- The counting system 10 of the second embodiment is provided with a camera 35 in addition to the configuration of the first embodiment. That is, the counting system 10 of the second embodiment uses a plurality of cameras 11 and 35.
- One camera 11 is disposed facing upward (toward the water surface) at the center of the water bottom of the fish preserve 25, and the other camera 35 is disposed facing sideways at the periphery of the preserve 25.
- The depth position of the horizontally oriented camera 35 is, for example, an intermediate position between the water bottom and the water surface, or a depth at which the fish tend to gather depending on their type, that is, a depth position appropriate for number detection.
- The cameras 11 and 35 are disposed in the water using the ropes 21 and the buoy 22 as in the first embodiment, although the ropes 21 and the buoy 22 are not illustrated in FIG.
- Each of the cameras 11 and 35 has a field of view capable of capturing the entire fish preserve 25, and the cameras capture the inside of the preserve 25 from different directions.
- The cameras 11 and 35 are imaging devices each having a function of capturing the fish as a moving image, but, as described in the first embodiment, imaging devices that intermittently capture still images at set time intervals, without a moving-image capturing function, may be adopted as the cameras 11 and 35.
- The information processing apparatus 12 detects the number of fish using images from the cameras 11 and 35 that were captured at the same time. In view of this, it is preferable to photograph in a way that makes it easy to obtain an image from the camera 11 and an image from the camera 35 captured at the same time. For example, light emitted for a short time, by automatic control or manually by the observer (worker), may be used as a mark for time alignment, and the cameras 11 and 35 may capture that light. Based on the light appearing in the images captured by the cameras 11 and 35, it then becomes easy to align (synchronize) the images captured by the camera 11 with those captured by the camera 35 in time.
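A rough sketch of such time alignment, assuming per-frame mean-brightness traces for the two cameras and treating the largest brightness jump as the alignment light; the detection rule and all numbers are our own assumptions.

```python
import numpy as np

def flash_frame(frame_brightness: np.ndarray) -> int:
    """Index of the frame whose mean brightness jumps the most, taken here
    as the frame in which the alignment light appears."""
    return int(np.argmax(np.diff(frame_brightness)) + 1)

def frame_offset(brightness_cam11: np.ndarray,
                 brightness_cam35: np.ndarray) -> int:
    """Offset to add to camera 35 frame indices to match camera 11."""
    return flash_frame(brightness_cam11) - flash_frame(brightness_cam35)

# Usage with made-up per-frame brightness traces (the flash is the jump).
b11 = np.array([50, 51, 52, 200, 60, 58, 57], dtype=float)
b35 = np.array([48, 49, 210, 55, 54, 53, 52], dtype=float)
print(frame_offset(b11, b35))  # -> 1
```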
- The display control unit 32 of the control device 13 in the information processing device 12 has a function of controlling the display device 17 so that the images captured by the cameras 11 and 35 are displayed side by side in a two-screen display. The display control unit 32 also has a function that allows the observer to adjust the playback frame of each captured image, using the time-alignment mark described above, so that frames photographed simultaneously by the cameras 11 and 35 are displayed. Furthermore, the display control unit 32 may have a function of switching between a one-screen display showing the image from one of the cameras 11 and 35 and a two-screen display showing the images from both.
- In this case, the display control unit 32 further has a function of causing the display device 17 to display an element for receiving the observer's switching instruction.
- The display element for receiving the switching instruction may take various forms, for example an icon, text, or a radio button.
- In the second embodiment, the learning model for number detection is generated by machine learning using both of the images captured simultaneously by the cameras 11 and 35. That is, the learning model is generated by machine learning using, as teacher images, a large number of image pairs captured simultaneously by the cameras 11 and 35 while the number of fish in the fish preserve 25 is known.
- This learning model for number detection is relationship data between the feature amounts extracted from the captured images and the number of fish, and is stored in the storage device 14.
- The extraction unit 30 has a function of extracting the feature amounts used in the learning model for number detection from both the image captured by the camera 11 and the image captured by the camera 35 at the same time.
- The counting unit 31 has a function of collating the feature amounts extracted by the extraction unit 30 from the images captured simultaneously by the cameras 11 and 35 with the learning model for number detection in the storage device 14, and of detecting the number of fish corresponding to the extracted feature amounts in the learning model as the number of fish to be measured.
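As an illustration of combining information from the two cameras (the concatenation of one feature per camera and the linear two-camera model below are our own assumptions, not the patent's specification):

```python
import numpy as np

def combined_feature(image_cam11: np.ndarray,
                     image_cam35: np.ndarray) -> np.ndarray:
    """Concatenate one feature per camera (mean brightness here) into a
    single feature vector for the two-camera learning model."""
    return np.array([image_cam11.mean(), image_cam35.mean()]) / 255.0

def count_fish(feature_vec: np.ndarray, weights: np.ndarray, bias: float) -> int:
    """Collate the combined feature vector with a linear two-camera model."""
    return round(float(feature_vec @ weights) + bias)

# Usage with dummy images and made-up model parameters.
img11 = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
img35 = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(count_fish(combined_feature(img11, img35),
                 weights=np.array([250.0, 250.0]), bias=5.0))
```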
- The remaining configuration of the counting system 10 of the second embodiment is similar to that of the counting system 10 of the first embodiment.
- By using the images captured by the plurality of cameras 11 and 35, the counting system 10 of the second embodiment can increase the amount of information used when detecting the number of fish in the fish preserve 25, so the detection accuracy of the number of fish can be further improved.
- The counting system 10 may also have the following configuration. That is, the counting system 10 may be configured to switch between mode 1, in which the number of fish in the fish preserve 25 is detected using the image captured by the camera 11 at the water bottom, and mode 2, in which the number is detected using the images captured by the cameras 11 and 35. In that case, the storage device 14 stores a learning model for number detection (the learning model for mode 1) generated by machine learning using images captured by the camera 11 at the water bottom.
- The storage device 14 also stores a learning model for number detection (the learning model for mode 2) generated by machine learning using images captured by the camera 11 at the water bottom and the camera 35 at the periphery of the fish preserve.
- The control device 13 further includes, as a functional unit, a switching unit 34, represented by a dotted line in FIG. 12.
- The switching unit 34 switches the number detection mode from mode 1 to mode 2 or vice versa in response to a request.
- The extraction unit 30 and the counting unit 31 operate in mode 1 or mode 2 in accordance with an instruction from the switching unit 34. In mode 1, they function as described in the first embodiment; in mode 2, they function as described in the second embodiment.
- Because the counting system 10 can switch between mode 1 and mode 2, the observer can, for example, switch the number detection mode from mode 2 to mode 1 if the image from the camera 35 is unsuitable for fish-number detection due to some problem. Thereby, even when the image from the camera 35 is not usable for number detection, the counting system 10 can still detect the number of fish in the fish preserve 25.
- Alternatively, the counting system 10 may have the following configuration. That is, the counting system 10 may be configured to operate in one mode selected from the following three modes 1-1, 1-2, and 2.
- Mode 1-1 is a mode in which the number of fish in the fish preserve 25 is detected using an image captured by the camera 11 at the water bottom.
- Mode 1-2 is a mode in which the number of fish in the fish preserve 25 is detected using an image captured by the camera 35 at the periphery of the preserve.
- Mode 2 is a mode in which the number of fish in the fish preserve 25 is detected using images captured by the camera 11 at the water bottom and the camera 35 at the periphery of the preserve.
- The storage device 14 stores a learning model for mode 1-1, a learning model for mode 1-2, and a learning model for mode 2.
- The learning model for mode 1-1 is a learning model for number detection generated by machine learning using images captured by the camera 11 at the water bottom.
- The learning model for mode 1-2 is a learning model for number detection generated by machine learning using images captured by the camera 35 at the periphery of the fish preserve.
- The learning model for mode 2 is a learning model for number detection generated by machine learning using images captured by the camera 11 at the water bottom and the camera 35 at the periphery of the fish preserve.
- The control device 13 is provided with the switching unit 34.
- The switching unit 34 has a function of, when information specifying the mode to switch to is received, for example through the observer's operation of the input device 16, instructing the extraction unit 30 and the counting unit 31 to switch the number detection mode to the specified mode.
- The extraction unit 30 and the counting unit 31 then detect the number of fish in the fish preserve 25 using the learning model corresponding to the mode instructed by the switching unit 34.
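A small sketch of dispatching to the per-mode learning models; the enum, the linear models, and their coefficients are placeholders we introduce purely for illustration.

```python
from enum import Enum

class Mode(Enum):
    MODE_1_1 = "bottom camera only"
    MODE_1_2 = "side camera only"
    MODE_2 = "both cameras"

# One illustrative model per mode; the coefficients are made up.
models = {
    Mode.MODE_1_1: lambda f: 500.0 * f[0] + 10.0,
    Mode.MODE_1_2: lambda f: 480.0 * f[1] + 12.0,
    Mode.MODE_2:   lambda f: 250.0 * f[0] + 250.0 * f[1] + 5.0,
}

def count_fish(features: tuple, mode: Mode) -> int:
    """Collate the extracted feature(s) with the learning model for the mode
    instructed by the switching unit."""
    return round(models[mode](features))

print(count_fish((0.4, 0.5), Mode.MODE_2))
```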
- In this way, the counting system 10 can detect the number of fish in the fish preserve 25 in the mode suited to the situation, which improves convenience.
- The present invention is not limited to the above-described embodiments, and various embodiments can be adopted.
- In the above description, the value calculated by the counting unit 31 is output as it is as the number of objects to be measured (the determined value).
- Alternatively, the counting unit 31 may store in the storage device 14, as a measurement value of the number of fish, the number of fish (the number of objects to be measured) calculated based on one captured image (or, in the second embodiment, one set of captured images). The counting unit 31 may further have a function of reading a predetermined number of measurement values (for example, a plurality of measurement values based on multiple captured images, or multiple sets of captured images, taken within a set imaging period) from the storage device 14 and calculating, for example, the average of those measurement values as the determined value of the number of fish.
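For instance, averaging several per-image measurement values into a determined value could look like this minimal sketch (the sample counts are invented):

```python
import numpy as np

def determined_fish_count(measurement_values: list) -> int:
    """Average a series of per-image measurement values to obtain the
    determined value of the number of fish."""
    return round(float(np.mean(measurement_values)))

# Usage: counts measured from several captured images within one period.
print(determined_fish_count([312, 305, 298, 321, 309]))  # -> 309
```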
- FIG. 14 shows a simplified configuration of the information processing apparatus according to another embodiment of the present invention.
- The information processing apparatus 40 in FIG. 14 includes an extraction unit 41 and a counting unit 42 as functional units.
- The information processing device 40 constitutes a counting system 50 together with a photographing device 51, as shown in FIG.
- The photographing device 51 is configured to photograph a photographing space in which objects to be measured exist.
- The extraction unit 41 of the information processing apparatus 40 has a function of extracting, from a captured image in which the objects to be measured are captured, a predetermined feature amount that changes according to the number of the objects.
- The counting unit 42 has a function of detecting the number of objects to be measured from a learning model by collating the extracted feature amount with the learning model, which is relationship data, obtained by machine learning, between the feature amount and the number of objects.
- With this configuration, even if the number of objects to be measured is large, the information processing device 40 and the counting system 50 can suppress an increase in the time required for the measurement process. Moreover, because they measure the number of objects based on the feature amount extracted from the captured image, the information processing apparatus 40 and the counting system 50 can accurately detect (count) the number of objects to be measured even if some objects cannot be detected in the captured image due to overlapping of objects or the like.
- Reference Signs List: 10, 50 counting system; 11, 35 camera; 12, 40 information processing device; 30, 41 extraction unit; 31, 42 counting unit; 51 photographing device
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
Abstract
In order to provide a technology with which it is possible to suppress an increase in the time required for the process of measuring the number of objects even when the number of objects to be measured is large, the present invention provides an information processing device (40) comprising an extraction unit (41) and a counting unit (42). The extraction unit (41) extracts, from a captured image in which the objects to be measured are captured, a predetermined feature amount that changes according to the number of objects to be measured. The counting unit (42) collates the extracted feature amount against a learning model, which is relationship data, obtained by machine learning, between the feature amount and the number of objects, and thereby detects the number of objects to be measured from the learning model.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019539698A JP6849081B2 (ja) | 2017-09-04 | 2018-09-03 | 情報処理装置、計数システム、計数方法およびコンピュータプログラム |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017-169793 | 2017-09-04 | ||
| JP2017169793 | 2017-09-04 | ||
| JP2017-224834 | 2017-11-22 | ||
| JP2017224834 | 2017-11-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019045091A1 true WO2019045091A1 (fr) | 2019-03-07 |
Family
ID=65525587
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/032538 Ceased WO2019045091A1 (fr) | 2017-09-04 | 2018-09-03 | Dispositif de traitement d'informations, système de comptage, procédé de comptage et support d'enregistrement de programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6849081B2 (fr) |
| WO (1) | WO2019045091A1 (fr) |
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110288623A (zh) * | 2019-06-18 | 2019-09-27 | 大连海洋大学 | 无人机海上网箱养殖巡检图像的数据压缩方法 |
| WO2020183837A1 (fr) * | 2019-03-08 | 2020-09-17 | 三菱電機株式会社 | Système de comptage, dispositif de comptage, dispositif d'apprentissage automatique, procédé de comptage, procédé d'agencement de composants et programme |
| JP2021152782A (ja) * | 2020-03-24 | 2021-09-30 | 国立大学法人愛媛大学 | 個体検出システム、撮影ユニット、個体検出方法、およびコンピュータプログラム |
| JP2021190040A (ja) * | 2020-06-05 | 2021-12-13 | 株式会社日立製作所 | 水中目標物検知用学習装置および水中目標物検知装置 |
| CN113812370A (zh) * | 2021-11-02 | 2021-12-21 | 广东科贸职业学院 | 基于视觉识别技术的虾苗计数装置及方法 |
| JP7004375B1 (ja) * | 2020-09-16 | 2022-01-21 | Fsx株式会社 | 携帯端末及びおしぼり管理システム |
| WO2022059404A1 (fr) * | 2020-09-16 | 2022-03-24 | Fsx株式会社 | Terminal mobile et système de gestion d'essuie-mains humides |
| WO2022065020A1 (fr) | 2020-09-28 | 2022-03-31 | ソフトバンク株式会社 | Procédé de traitement d'informations, programme et dispositif de traitement d'informations |
| JPWO2022080407A1 (fr) * | 2020-10-14 | 2022-04-21 | ||
| WO2024105963A1 (fr) | 2022-11-17 | 2024-05-23 | ソフトバンク株式会社 | Système d'imagerie |
| WO2024142701A1 (fr) | 2022-12-27 | 2024-07-04 | ソフトバンク株式会社 | Programme de traitement d'informations, dispositif de traitement d'informations et procédé de traitement d'informations |
| US12382936B2 (en) | 2022-07-28 | 2025-08-12 | Softbank Corp. | Information processing apparatus and information processing method |
| US12514236B2 (en) | 2021-10-28 | 2026-01-06 | Softbank Corp. | Information processing method, non-transitory computer-readable recording medium, and information processor |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5875917B2 (ja) * | 2012-03-26 | 2016-03-02 | 一般財団法人電力中央研究所 | 移動体の画像判別装置および移動体の画像判別方法 |
| JP6343874B2 (ja) * | 2013-05-10 | 2018-06-20 | 株式会社ニコン | 観察装置、観察方法、観察システム、そのプログラム、および細胞の製造方法 |
| JP2017051162A (ja) * | 2015-09-11 | 2017-03-16 | 国立研究開発法人農業・食品産業技術総合研究機構 | 検体表面の生菌数推定方法及び装置、その装置に搭載されるプログラム |
-
2018
- 2018-09-03 WO PCT/JP2018/032538 patent/WO2019045091A1/fr not_active Ceased
- 2018-09-03 JP JP2019539698A patent/JP6849081B2/ja active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010121970A (ja) * | 2008-11-17 | 2010-06-03 | Chugoku Electric Power Co Inc:The | 移動体認識システム及び移動体認識方法 |
| WO2015122161A1 (fr) * | 2014-02-14 | 2015-08-20 | 日本電気株式会社 | Système d'analyse vidéo |
| WO2015146113A1 (fr) * | 2014-03-28 | 2015-10-01 | 日本電気株式会社 | Système d'apprentissage de dictionnaire d'identification, procédé d'apprentissage de dictionnaire d'identification, et support d'enregistrement |
| WO2016136214A1 (fr) * | 2015-02-27 | 2016-09-01 | 日本電気株式会社 | Dispositif d'apprentissage d'identifiant, système de détection d'objet restant, procédé d'apprentissage d'identifiant, procédé de détection d'objet restant, et support d'enregistrement de programme |
Cited By (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7134331B2 (ja) | 2019-03-08 | 2022-09-09 | 三菱電機株式会社 | 計数システム、計数装置、機械学習装置、計数方法、部品配置方法、および、プログラム |
| WO2020183837A1 (fr) * | 2019-03-08 | 2020-09-17 | 三菱電機株式会社 | Système de comptage, dispositif de comptage, dispositif d'apprentissage automatique, procédé de comptage, procédé d'agencement de composants et programme |
| CN113518998B (zh) * | 2019-03-08 | 2024-04-16 | 三菱电机株式会社 | 计数系统、计数装置、机器学习装置、计数方法、零件配置方法以及记录介质 |
| CN113518998A (zh) * | 2019-03-08 | 2021-10-19 | 三菱电机株式会社 | 计数系统、计数装置、机器学习装置、计数方法、零件配置方法以及程序 |
| JPWO2020183837A1 (ja) * | 2019-03-08 | 2021-10-28 | 三菱電機株式会社 | 計数システム、計数装置、機械学習装置、計数方法、部品配置方法、および、プログラム |
| CN110288623B (zh) * | 2019-06-18 | 2023-05-23 | 大连海洋大学 | 无人机海上网箱养殖巡检图像的数据压缩方法 |
| CN110288623A (zh) * | 2019-06-18 | 2019-09-27 | 大连海洋大学 | 无人机海上网箱养殖巡检图像的数据压缩方法 |
| JP2021152782A (ja) * | 2020-03-24 | 2021-09-30 | 国立大学法人愛媛大学 | 個体検出システム、撮影ユニット、個体検出方法、およびコンピュータプログラム |
| JP7480984B2 (ja) | 2020-03-24 | 2024-05-10 | 国立大学法人愛媛大学 | 個体検出システム、撮影ユニット、個体検出方法、およびコンピュータプログラム |
| JP2021190040A (ja) * | 2020-06-05 | 2021-12-13 | 株式会社日立製作所 | 水中目標物検知用学習装置および水中目標物検知装置 |
| JP7579068B2 (ja) | 2020-06-05 | 2024-11-07 | 株式会社日立製作所 | 水中目標物検知用学習装置および水中目標物検知装置 |
| JP7004375B1 (ja) * | 2020-09-16 | 2022-01-21 | Fsx株式会社 | 携帯端末及びおしぼり管理システム |
| WO2022059404A1 (fr) * | 2020-09-16 | 2022-03-24 | Fsx株式会社 | Terminal mobile et système de gestion d'essuie-mains humides |
| US11854250B2 (en) | 2020-09-16 | 2023-12-26 | Fsx, Inc. | Portable terminal and oshibori management system |
| WO2022065020A1 (fr) | 2020-09-28 | 2022-03-31 | ソフトバンク株式会社 | Procédé de traitement d'informations, programme et dispositif de traitement d'informations |
| JPWO2022080407A1 (fr) * | 2020-10-14 | 2022-04-21 | ||
| JP7287734B2 (ja) | 2020-10-14 | 2023-06-06 | 国立研究開発法人海洋研究開発機構 | 魚数算出方法、魚数算出プログラム、及び、魚数算出装置 |
| US12514236B2 (en) | 2021-10-28 | 2026-01-06 | Softbank Corp. | Information processing method, non-transitory computer-readable recording medium, and information processor |
| CN113812370A (zh) * | 2021-11-02 | 2021-12-21 | 广东科贸职业学院 | 基于视觉识别技术的虾苗计数装置及方法 |
| US12382936B2 (en) | 2022-07-28 | 2025-08-12 | Softbank Corp. | Information processing apparatus and information processing method |
| JP7556926B2 (ja) | 2022-11-17 | 2024-09-26 | ソフトバンク株式会社 | 撮影システム |
| JP2024073313A (ja) * | 2022-11-17 | 2024-05-29 | ソフトバンク株式会社 | 撮影システム |
| WO2024105963A1 (fr) | 2022-11-17 | 2024-05-23 | ソフトバンク株式会社 | Système d'imagerie |
| WO2024142701A1 (fr) | 2022-12-27 | 2024-07-04 | ソフトバンク株式会社 | Programme de traitement d'informations, dispositif de traitement d'informations et procédé de traitement d'informations |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019045091A1 (ja) | 2020-09-17 |
| JP6849081B2 (ja) | 2021-03-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019045091A1 (fr) | Dispositif de traitement d'informations, système de comptage, procédé de comptage et support d'enregistrement de programme | |
| EP3248374B1 (fr) | Procédé et appareil pour acquisition et fusion de cartes de profondeur à technologies multiples | |
| US10225445B2 (en) | Methods and apparatus for providing a camera lens or viewing point indicator | |
| US10191356B2 (en) | Methods and apparatus relating to detection and/or indicating a dirty lens condition | |
| TWI527448B (zh) | 成像設備,影像處理方法,及用以記錄程式於其上之記錄媒體 | |
| JP2021060421A (ja) | 魚体長さ測定システム、魚体長さ測定方法および魚体長さ測定プログラム | |
| US20140300779A1 (en) | Methods and apparatuses for providing guide information for a camera | |
| US11328439B2 (en) | Information processing device, object measurement system, object measurement method, and program storage medium | |
| JP2016114668A (ja) | 撮像装置および制御方法とプログラム | |
| JP6096298B2 (ja) | 逆光補正方法、装置及び端末 | |
| CN102550015A (zh) | 多视点拍摄控制装置、控制方法以及控制程序 | |
| CN107295252A (zh) | 对焦区域显示方法、装置及终端设备 | |
| JP6981531B2 (ja) | 物体同定装置、物体同定システム、物体同定方法およびコンピュータプログラム | |
| US20170118394A1 (en) | Autofocusing a macro object by an imaging device | |
| US20160353021A1 (en) | Control apparatus, display control method and non-transitory computer readable medium | |
| US20200077059A1 (en) | Display apparatus, display system, and method for controlling display apparatus | |
| JP2017192114A5 (fr) | ||
| JP6583565B2 (ja) | 計数システムおよび計数方法 | |
| JP7007280B2 (ja) | 情報処理装置、計数システム、計数方法およびコンピュータプログラム | |
| US9955130B2 (en) | Image display control device, image display system, and image display control method | |
| JP2017072945A (ja) | 画像処理装置、画像処理方法 | |
| JP6956894B2 (ja) | 撮像素子、撮像装置、画像データ処理方法、及びプログラム | |
| CN102843512A (zh) | 摄像装置、摄像方法 | |
| JP2008252304A (ja) | 動画像処理装置及び方法 | |
| JP2014241160A (ja) | 光照射装置およびプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18852495; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2019539698; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18852495; Country of ref document: EP; Kind code of ref document: A1 |