WO1999023576A1 - Procede et dispositif pour la classification d'une premiere serie chronologique et au moins d'une deuxieme serie chronologique - Google Patents
Method and device for the classification of a first time series and at least one second time series
- Publication number
- WO1999023576A1 WO1999023576A1 PCT/DE1998/003184 DE9803184W WO9923576A1 WO 1999023576 A1 WO1999023576 A1 WO 1999023576A1 DE 9803184 W DE9803184 W DE 9803184W WO 9923576 A1 WO9923576 A1 WO 9923576A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time series
- statistical
- samples
- surrogate
- measure
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/18—Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
Definitions
- the invention relates to the classification of a first time series and at least one second time series.
- a given measurement signal x has an arbitrary number of samples x_t, which are sampled with a step size w (see FIG. 2). The aim is to determine linear and nonlinear statistical dependencies between the samples x_t. Based on a predetermined number v of past samples, which are analyzed with regard to their statistical dependency, the information obtained by the analysis is used to predict a number z of future values.
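The sampling step described above can be illustrated with a short Python sketch (not part of the patent; the function name is purely illustrative): a signal f is evaluated at multiples of the step size w to produce the time series x_t.

```python
import math

def sample_signal(f, w, count):
    """Sample a signal f at step size w: x_t = f(t * w) (cf. FIG. 2)."""
    return [f(t * w) for t in range(count)]

# Sample a sine wave with step size w = 0.5.
xs = sample_signal(math.sin, w=0.5, count=10)
```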
- a surrogate for a given time series is to be understood as a time series that has certain statistical properties that are the same as the given time series.
- [6] describes the training of a neural network according to the maximum likelihood principle.
- a Markov process of order m is to be understood as a time-discrete random process in which a future value depends only on the values that lie m steps in the past.
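The defining property of an order-m Markov process can be sketched in Python with a toy generator (an illustration only, not the patent's process): each new value is computed from the last m values alone, so the context window never looks further back.

```python
import random

def simulate_markov(m, steps, seed=0):
    """Toy non-linear Markov process of order m: the next value depends
    only on the last m values (here via their mean) plus noise."""
    rng = random.Random(seed)
    series = [rng.random() for _ in range(m)]   # m initial values
    for _ in range(steps):
        context = series[-m:]                   # only m past steps matter
        series.append(0.5 * sum(context) / m + 0.1 * rng.random())
    return series

series = simulate_markov(m=3, steps=100)
```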
- the rank of a time series is further understood to mean the order of the samples of a time series according to the size of the samples.
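The rank in this sense (ordering the samples by size) can be computed with a few lines of Python; this is a generic illustration, not code from the patent:

```python
def ranks(samples):
    """Return the rank of each sample: 0 for the smallest value,
    1 for the next smallest, and so on."""
    order = sorted(range(len(samples)), key=lambda i: samples[i])
    r = [0] * len(samples)
    for rank, idx in enumerate(order):
        r[idx] = rank
    return r

print(ranks([3.2, 1.1, 2.7]))  # -> [2, 0, 1]
```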
- a method for classifying a time series is known from [9], in which a predeterminable number of surrogates is determined for the time series.
- non-linear correlations between the values of the time series and the values of the surrogates are determined using a cumulant-based method.
- the time series is classified based on the non-linear correlations.
- Another method for classifying a time series is known from [10]. In this method, a dynamical system is modeled according to its probability density. A neural network is trained on the probabilities of a non-linear Markov process of order m according to the maximum likelihood principle.
- the invention is based on the problem of creating a method and a device with which a classification of a plurality of time series with regard to their statistical dependency of the samples is possible.
- a nonlinear Markov process is modeled for a first time series by a first statistical estimator.
- a nonlinear Markov process for the second time series is modeled by a second statistical estimator.
- At least one surrogate time series is formed for the first time series using the first statistical estimator.
- At least one surrogate time series is formed for the second time series using the second statistical estimator.
- a first measure for the statistical dependency of the samples of the first time series and the samples of the second time series is formed for a predetermined number of future samples.
- a second measure for the statistical dependence of the values of the surrogate time series on one another is formed for a predetermined number of samples lying in the future.
- a difference measure is formed from the first measure and the second measure.
- the classification is such that
- the first time series and the second time series are assigned to a first group,
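The claimed decision step, forming a difference measure from the first and second measure and assigning the series accordingly, can be summarized in a hypothetical sketch. The function name and the threshold are illustrative assumptions, not taken from the patent:

```python
def classify(measure_original, measure_surrogate, threshold=1.0):
    """Form the difference measure from the first measure (original series)
    and the second measure (surrogates), and assign the pair of series to
    the first group when the difference is small, i.e. when the surrogates
    reproduce the observed statistical dependency."""
    difference = abs(measure_original - measure_surrogate)
    return "first group" if difference < threshold else "second group"

print(classify(0.8, 0.7))  # -> first group
```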
- the device has a processor unit which is set up in such a way that a non-linear Markov process is modeled for a first time series by a first statistical estimator. A non-linear Markov process for the second time series is modeled by a second statistical estimator. At least one surrogate time series is formed for the first time series using the first statistical estimator. At least one surrogate time series is formed for the second time series using the second statistical estimator. A first measure for the statistical dependency of the samples of the first time series and the samples of the second time series is formed for a predetermined number of future samples. Furthermore, a second measure for the statistical dependence of the values of the surrogate time series on one another is formed for a predetermined number of future samples. A difference measure is formed from the first measure and the second measure. The classification is such that
- the first time series and the second time series are assigned to a first group,
- the invention makes it possible for the first time to determine statistical dependencies between multidimensional time series, i.e. between several time series.
- it is advantageous to use a non-linear neural network as the statistical estimator, since a neural network is very well suited for estimating probability densities.
- the invention can be used in various fields of application. Statistical dependencies between measured signals of an electroencephalogram (EEG) or an electrocardiogram (EKG) can be determined.
- EEG electroencephalogram
- EKG electrocardiogram
- the invention can also be used very advantageously for analyzing a financial market, in which case the signal describes, for example, the price of a share or a foreign exchange rate.
- FIG. 1 shows a sketch in which the invention is shown in its individual elements
- FIG. 2 is a sketch showing the course of a measurement signal f, which is converted into a time series {x_t} by sampling with a step size w;
- FIG. 3 is a sketch showing a computer with which the invention is carried out.
- a first time series {x_t} and a second time series {y_t} each have a predeterminable number of samples x_t, y_t of a signal, in particular an electrical signal.
- the signals are measured by a measuring device MG (see FIG. 3) and fed to a computer R.
- the method described below is carried out in the computer and the results are fed to a means for further processing WV.
- a non-linear Markov process of order n_x or n_y is modeled using a neural network NN_x, NN_y.
- the information flow of the first time series {x_t} and the second time series {y_t} is thereby approximated.
- a time series is identified by the following designation:
- Each neural network NN_x, NN_y is trained to approximate a non-linear Markov process of order n_x or n_y using the maximum likelihood principle, in which the learning rule maximizes the product of the probabilities.
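Maximizing the product of probabilities, as the maximum likelihood principle prescribes, is equivalent to minimizing the sum of negative log-probabilities, which is the numerically stable form usually optimized in practice. A generic Python illustration (not the patent's specific learning rule):

```python
import math

def negative_log_likelihood(probabilities):
    """Maximum likelihood: maximizing the product of probabilities is
    equivalent to minimizing the sum of their negative logarithms."""
    return -sum(math.log(p) for p in probabilities)

loss = negative_log_likelihood([0.9, 0.8, 0.95])
```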
- the respective neural network NN_x, NN_y is thus intended to estimate the conditional probability
- ⁇ , ⁇ i: k +1 ) i expM vk, rs + 1 K s + 1 k, rl v s + 1, dv s + 1
- ⁇ denotes the Fourier transform of the probability densities and K_ denotes the variables of the function ⁇ l ..., K ⁇ , ... I in Fourier space.
- i V- ⁇ .
- rule (14) can be simplified to the following rule:
- a measure is formed for the statistical dependency between the sample values of the respective time series.
- the measure m_j(r) represents a cumulant-based characterization of the information flow of the dynamic system from which the time series {x_t}_j are determined, and quantifies the dependencies of the values {x_{t+r}}_k, k ∈ J, taking into account the n_j past values of the time series {x_t}_j.
- a 2-layer feed-forward neural network is trained to estimate the conditional probability densities p({x_{t+1}}_k | {x_t}_1, …, {x_{t−n_1+1}}_1, …, {x_t}_N, …, {x_{t−n_N+1}}_N).
- a neural network is trained for each time series {x_t}_k in such a way that the neural network estimates the respective probability density of the Markov process of order (n_1, …, n_N):
- L(d) is defined according to rule (19).
- the conditional probability density is therefore described by a weighted sum of normal distributions, whose weights u_h, mean values and variances are determined by a multilayer perceptron. The perceptron receives the first s components of the vector v' as an s-dimensional input variable.
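Such a weighted sum of normal distributions can be evaluated with a few lines of Python. In the patent the weights u_h, means, and variances come from the multilayer perceptron; here they are passed in as plain arguments, so this is only an illustration of the density form, with illustrative names throughout:

```python
import math

def mixture_density(v, weights, means, sigmas):
    """Evaluate a density modeled as a weighted sum of normal
    distributions (weights u_h, mean values, standard deviations).
    In the patent these parameters are produced by a multilayer
    perceptron; here they are fixed inputs for illustration."""
    total = 0.0
    for u, mu, s in zip(weights, means, sigmas):
        total += u * math.exp(-0.5 * ((v - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    return total

# Two equally weighted Gaussians centered at -1 and +1.
p = mixture_density(0.0, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```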
- the training takes place according to the maximum likelihood principle, which is described in [7].
- N neural networks are trained, and N conditional probability densities for an N-dimensional Markov process of order (n_1, …, n_N) are estimated.
- the neural networks are able to generate new time series according to the Markov processes of the original time series {x_t}_k, starting with the first n_k values of each time series {x_t}_k.
- the first values {x_1}_1, …, {x_{n_1}}_1, …, {x_1}_N, …, {x_{n_N}}_N are fed to the neural networks, each of which simulates a conditional probability density. According to the so-called Monte Carlo method, new values {x_t}_1, …, {x_t}_N are formed.
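The Monte Carlo generation of a surrogate series can be sketched as follows. A toy Gaussian conditional model stands in for the trained neural network, so this only illustrates the seeding-and-sampling loop, not the patent's estimator:

```python
import random

def generate_surrogate(seed_values, n, steps, rng):
    """Generate a surrogate time series: start from the first n values of
    the original series, then repeatedly draw a new value from a model of
    the conditional distribution given the last n values (Monte Carlo).
    The toy model here is a Gaussian around the context mean; the patent
    uses the trained neural network instead."""
    surrogate = list(seed_values[:n])
    for _ in range(steps):
        context = surrogate[-n:]
        mu = sum(context) / n              # stand-in for the NN's estimate
        surrogate.append(rng.gauss(mu, 0.1))
    return surrogate

rng = random.Random(42)
s = generate_surrogate([0.1, 0.2, 0.3], n=3, steps=50, rng=rng)
```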
- the time series {x_t}_k and the surrogate time series {x̃_t}_k are subjected to a statistical test, which is described in [6] (step 102).
- the null hypothesis states that the respective dynamic system can be described with sufficient accuracy by an N-dimensional Markov process of order (n_1, …, n_N).
- the so-called Student's t-test is used to test the null hypothesis.
- for each surrogate time series, a surrogate measure m̃_k(r) is determined.
- the surrogate measures m̃_k(r) are determined in the same way as the measure m_k(r) for the statistical dependency of the time series {x_t}_k.
- the dependence on r results from the last component of the vector v '.
- the statistical dependencies between the last n_k values of each time series {x_t}_k and the value that lies r steps in the future are measured.
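The indexing involved, pairing the last n_k values with the value r steps ahead, can be made concrete in Python. This sketch only collects the (context, future) pairs over which such a measure would be computed; the cumulant-based measure itself is not reproduced:

```python
def lagged_pairs(series, n, r):
    """Collect (context, future) pairs: the last n values of the series
    together with the value lying r steps in the future, as used when
    measuring the statistical dependency m_k(r)."""
    pairs = []
    for t in range(n - 1, len(series) - r):
        context = series[t - n + 1 : t + 1]   # the last n values up to t
        future = series[t + r]                # the value r steps ahead
        pairs.append((context, future))
    return pairs

pairs = lagged_pairs([1, 2, 3, 4, 5, 6], n=2, r=2)
```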
- Surrogate mean values μ̃_k(r) and surrogate standard deviations σ̃_k(r) are determined according to the following rules:
- the Student's t-test is carried out in accordance with the following rule:
- by forming a test value t_k(r), a value is determined with which the surrogate time series are compared against the measure m_k(r) of the original time series {x_t}_k.
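The comparison of m_k(r) against the surrogate mean μ̃_k(r) and standard deviation σ̃_k(r) can be sketched as a one-sample t-value. The exact normalization of the patent's rule is not given in this text, so the following uses the textbook form with illustrative names:

```python
import math

def surrogate_t_value(m_original, surrogate_measures):
    """Compare the measure of the original series with the distribution of
    the same measure over the surrogates (one-sample t-test form).
    The patent's rule may normalize differently; this is the
    textbook version, shown for illustration only."""
    n = len(surrogate_measures)
    mean = sum(surrogate_measures) / n                      # μ̃_k(r)
    var = sum((m - mean) ** 2 for m in surrogate_measures) / (n - 1)
    std = math.sqrt(var)                                    # σ̃_k(r)
    return (m_original - mean) / (std / math.sqrt(n))

t = surrogate_t_value(2.0, [1.0, 2.0, 3.0])
```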
- the time series {x_t}_k can be described by a multi-dimensional Markov process in which the last n_1 values of the time series {x_t}_1, …, and the last n_N values of the time series {x_t}_N are taken into account. If the null hypothesis is accepted, the examined time series {x_t}_k are classified as time series of a first group, which can be described by the Markov process of the respective order (n_1, …, n_N) (step 103), and the method ends.
- if the null hypothesis is rejected, the order (n_1, …, n_N) of the Markov process is increased and the method is repeated, starting with the training of the neural networks (step 104).
- the number of time series examined is arbitrary; with two time series, only two neural networks are required.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Optimization (AREA)
- Pure & Applied Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Computational Mathematics (AREA)
- Mathematical Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Operations Research (AREA)
- Probability & Statistics with Applications (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Algebra (AREA)
- Evolutionary Biology (AREA)
- Databases & Information Systems (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Computational Biology (AREA)
- Complex Calculations (AREA)
Abstract
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2000519368A JP2001522094A (ja) | 1997-11-04 | 1998-10-30 | 第1の時系列および少なくとも1つの第2の時系列を分類する方法および装置 |
| EP98961058A EP1027663A1 (fr) | 1997-11-04 | 1998-10-30 | Procede et dispositif pour la classification d'une premiere serie chronologique et au moins d'une deuxieme serie chronologique |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE19748676.2 | 1997-11-04 | ||
| DE19748676 | 1997-11-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO1999023576A1 true WO1999023576A1 (fr) | 1999-05-14 |
Family
ID=7847569
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/DE1998/003184 Ceased WO1999023576A1 (fr) | 1997-11-04 | 1998-10-30 | Procede et dispositif pour la classification d'une premiere serie chronologique et au moins d'une deuxieme serie chronologique |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP1027663A1 (fr) |
| JP (1) | JP2001522094A (fr) |
| WO (1) | WO1999023576A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2001085018A3 (fr) * | 2000-05-09 | 2003-02-27 | Siemens Ag | Procede et dispositif de classification de valeurs de serie, support d'enregistrement lisible par ordinateur, et element de programme informatique |
| RU2268485C2 (ru) * | 2003-05-20 | 2006-01-20 | Войсковая часть 45807 | Устройство для классификации последовательности цифровых сигналов |
| CN110462629A (zh) * | 2017-03-30 | 2019-11-15 | 罗伯特·博世有限公司 | 用于识别眼睛和手的系统和方法 |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO1997033237A1 (fr) * | 1996-03-06 | 1997-09-12 | Siemens Aktiengesellschaft | Procede de classification par un ordinateur d'une serie chronologique presentant un nombre preselectionnable de valeurs echantillonnees, en particulier d'un signal electrique |
1998
- 1998-10-30 WO PCT/DE1998/003184 patent/WO1999023576A1/fr not_active Ceased
- 1998-10-30 JP JP2000519368A patent/JP2001522094A/ja not_active Withdrawn
- 1998-10-30 EP EP98961058A patent/EP1027663A1/fr not_active Withdrawn
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO1997033237A1 (fr) * | 1996-03-06 | 1997-09-12 | Siemens Aktiengesellschaft | Procede de classification par un ordinateur d'une serie chronologique presentant un nombre preselectionnable de valeurs echantillonnees, en particulier d'un signal electrique |
Non-Patent Citations (2)
| Title |
|---|
| DECO G ET AL: "LEARNING TIME SERIES EVOLUTION BY UNSUPERVISED EXTRACTION OF CORRELATIONS", PHYSICAL REVIEW E. STATISTICAL PHYSICS, PLASMAS, FLUIDS, AND RELATED INTERDISCIPLINARY TOPICS, vol. 51, no. 3, March 1995 (1995-03-01), pages 1780 - 1790, XP000677868 * |
| SILIPO R ET AL: "Dynamics modelling in brain circulation", NEURAL NETWORKS FOR SIGNAL PROCESSING VII. PROCEEDINGS OF THE 1997 IEEE SIGNAL PROCESSING SOCIETY WORKSHOP (CAT. NO.97TH8330), NEURAL NETWORKS FOR SIGNAL PROCESSING VII. PROCEEDINGS OF THE 1997 IEEE SIGNAL PROCESSING SOCIETY WORKSHOP, AMELIA ISLAND,, ISBN 0-7803-4256-9, 1997, New York, NY, USA, IEEE, USA, pages 162 - 171, XP002094263 * |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2001085018A3 (fr) * | 2000-05-09 | 2003-02-27 | Siemens Ag | Procede et dispositif de classification de valeurs de serie, support d'enregistrement lisible par ordinateur, et element de programme informatique |
| RU2268485C2 (ru) * | 2003-05-20 | 2006-01-20 | Войсковая часть 45807 | Устройство для классификации последовательности цифровых сигналов |
| CN110462629A (zh) * | 2017-03-30 | 2019-11-15 | 罗伯特·博世有限公司 | 用于识别眼睛和手的系统和方法 |
| CN110462629B (zh) * | 2017-03-30 | 2024-04-02 | 罗伯特·博世有限公司 | 用于识别眼睛和手的系统和方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2001522094A (ja) | 2001-11-13 |
| EP1027663A1 (fr) | 2000-08-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP0934567B1 (fr) | Procede de classification de la dependance statistique d'une serie chronologique mesurable | |
| Hyvärinen et al. | Independent component analysis by general nonlinear Hebbian-like learning rules | |
| DE60208223T2 (de) | Anordnung und verfahren zur gesichtserkennung unter verwendung von teilen des gelernten modells | |
| DE69324052T2 (de) | Neuronalnetzwerk-Lernsystem | |
| DE69607460T2 (de) | Neuronales netzwerk | |
| DE69527523T2 (de) | Verfahren und apparat zum auffinden und identifizieren eines gesuchten objekts in einem komplexen bild | |
| DE112018006885B4 (de) | Trainingsvorrichtung,sprachaktivitätsdetektor und verfahren zur erfassung einer sprachaktivität | |
| EP3736817A1 (fr) | Vérification et/ou amélioration de la cohérence des identifications de données lors du traitement des images médicales | |
| DE102014223226A1 (de) | Diskriminator, Unterscheidungsprogramm und Unterscheidungsverfahren | |
| DE112018000723T5 (de) | Aktualisierungsverwaltung für eine RPU-Anordnung | |
| DE112017005640T5 (de) | Informationsverarbeitungsvorrichtung und Informationsverarbeitungsverfahren | |
| EP0925541B1 (fr) | Dispositif et procede de production assistee par ordinateur d'au moins un vecteur de donnees d'entrainement artificiel pour un reseau neuronal | |
| WO1999023576A1 (fr) | Procede et dispositif pour la classification d'une premiere serie chronologique et au moins d'une deuxieme serie chronologique | |
| DE69619154T2 (de) | Verfahren und Vorrichtung zur Mustererkennung | |
| EP0978052B1 (fr) | Selection assistee par ordinateur de donnees d'entrainement pour reseau neuronal | |
| EP1359539A2 (fr) | Modèle neurodynamique de traitement d'informations visuelles | |
| DE69030301T2 (de) | System zur Erzeugung von Referenzmustern | |
| EP0885423B1 (fr) | Procede de et appareil pour classification d'une serie chronologique d'un signal electrique par un ordinateur | |
| EP1114398B1 (fr) | Procede pour entrainer un reseau neuronal, procede de classification d'une sequence de grandeurs d'entree au moyen d'un reseau neuronal, reseau neuronal et dispositif pour l'entrainement d'un reseau neuronal | |
| DE112022001967T5 (de) | Klassifizierung von zellkernen mit vermeidung von artefakten | |
| DE19549300C1 (de) | Verfahren zur rechnergestützten Ermittlung einer Bewertungsvariablen eines Bayesianischen Netzwerkgraphen | |
| EP1170678B1 (fr) | Méthode et appareil de recouvrement automatique d'ensembles pertinents d'images | |
| EP1254415A1 (fr) | Dispositif, support d'informations et procede pour trouver des objets presentant une grande similitude par rapport a un objet predetermine | |
| EP4200737B1 (fr) | Procédé de détection de manipulation de données relative à des valeurs de données numériques | |
| DE60204693T2 (de) | Signalsverarbeitungssystem mit einer Vorrichtung zum Korregieren einer Kovarianzmatrix |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): JP US |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 1998961058 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 09530711 Country of ref document: US |
|
| WWP | Wipo information: published in national office |
Ref document number: 1998961058 Country of ref document: EP |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 1998961058 Country of ref document: EP |