Hi, it should be working well now! Thanks for finding the bug! K.
trunk,nnet1,mmi : bugfix in initial data filteri...
Yes it is. We should both thank 'Lukas Burget'; he is the original author of the...
Ok, I'll fix that. Thanks for finding the bug! K. On 15. 7. 2015 at 13:32 Daniel...
Hi, the problem is that you pre-trained only 5 layers, while the DNN training script...
trunk,nnet1: fixing 'blocksoftmax' example, add...
trunk,nnet1: updating 'blocksoftmax' example (r...
Hi, Yes it should work like that. K. On 4. 7. 2015 at 19:56 tfpeach wrote: Hi,...
trunk,ami : refactoring the ami recipe which wi...
trunk,nnet1 : bugfix in cuda-gpu detection (mmi...
Hi, no, the adaptation of NN weights to individual speakers is not supported. The...
trunk,nnet1: cosmetic changes, allowing initial...
Hi, you need to put the whole <ParallelComponent> description on a single line in the config...
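For illustration, a minimal sketch of such a config; the <ParallelComponent> token names follow my reading of nnet-parallel-component.h, and the dimensions and sub1.proto/sub2.proto names are invented for this example. Note the whole <ParallelComponent> entry stays on one line of the proto file:
# sketch only: token names and dims are assumptions, check nnet-parallel-component.h
cat > parallel.proto <<'EOF'
<NnetProto>
<ParallelComponent> <InputDim> 80 <OutputDim> 200 <NestedNnetProto> sub1.proto sub2.proto </NestedNnetProto>
</NnetProto>
EOF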
trunk,nnet1 : adding BLSTM code from Ni Chongji...
Hi, most likely you have run out of RAM (you'll see this with the 'top' utility), by...
trunk,nnet1: adding tool 'cuda-gpu-available', ...
trunk,nnet1 : lstm, changing indentation to be ...
trunk,nnet1 : cosmetic changes to 2D-CNN script,
trunk,nnet1: updating the 'usage' messages in s...
Hello, no, in order to put 2 NNs in parallel you'll need to create a "prototype"...
trunk,nnet1 : cosmetic change in MultiTaskLoss
You don't need to split the input features; '<ParallelComponent>' will do it for you....
Hi, this can be done easily with nnet1, there is a Component '<ParallelComponent>'...
trunk,nnet1 : implementing multi-task loss func...
Hi, most likely the input-feature dimension is wrong, i.e. different from the RM...
trunk,nnet1: adding the LSTM equation diagram f...
sandbox/cudnn : syncing branch with trunk, will...
Hi, I have noticed the pipeline with 'paste-post' consumed quite a lot of memory,...
trunk,nnet1: updating paste-post.cc, improving ...
OK, I have tried to fix it, but I haven't run the example script to check. Dan
No, there has not, and since it relates to the nnet1 setup which Karel maintains...
It looks like that program was added recently by Karel (Karel, please run cpplint.py...
Hi, the variable 'cmvn_opts' contains options for the tool 'apply-cmvn'. If you go...
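As a rough illustration (the option values and data-directory paths are hypothetical), 'cmvn_opts' is simply forwarded to 'apply-cmvn', roughly like:
  # sketch: per-speaker CMVN applied to the training features, then print the dim
  cmvn_opts="--norm-means=true --norm-vars=false"
  apply-cmvn $cmvn_opts --utt2spk=ark:data/train/utt2spk \
    scp:data/train/cmvn.scp scp:data/train/feats.scp ark:- | feat-to-dim ark:- -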
You will have to see how data-fbank-multisoftmax/train/ was created, but IIRC in...
trunk,tools: beamformit, updating the patch (th...
trunk,tools : adding patch to beamformit tool, ...
Hi, hard to tell where the problem is; the assert in 'nnet-forward.cc:158' is right...
trunk,cudamatrix : removing compute capability ...
How did this happen? Someone else gave you a trained neural network or did you train...
trunk,nnet1 : changing magic constant in 'nnet-...
Hi, the affine transforms contain the variable "<LearnRateCoef> 1"; you can set it to...
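One way to do it, as a sketch (file names are placeholders): dump the network as text, set the coefficient to 0 so those weights are no longer updated, and convert back:
  # sketch: freezes every component that carries <LearnRateCoef> 1
  nnet-copy --binary=false final.nnet - | \
    sed 's/<LearnRateCoef> 1/<LearnRateCoef> 0/g' | \
    nnet-copy - final_frozen.nnet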
Hi, do you need to use the nnet2 setup? If not, it should be easy, you build an RBM...
trunk,nnet1 : adding prior re-estimation after ...
trunk,nnet1 : sMBR bugfix (accidentally flipped...
trunk,nnet1: tedlium, adding DNN baseline for L...
trunk,nnet1 : tedlium, updating DNN recipe with...
trunk,nnet1: removing script (use run_dnn_bn.sh)
trunk,nnet1 : bugfix in mmi/mpe script (unshuff...
trunk,nnet1 : adding support to --one-silence-c...
trunk,nnet1: various enhancements, minor update...
dummy : just a test if svn commit works
trunk,nnet1 : bugfix in LSTM forwarding (missin...
Hi, it looks like lots of insertions appeared; hard to say why. You can try: steps/nnet/train_mpe.sh...
Hi Lucian, something went wrong, the difference should be (<0.5%), typically (<0.2%)....
trunk,nnet1 : adding lstm example for tedlium (...
trunk,nnet1 : adding --time-shift to 'nnet-forw...
trunk,nnet1 : adding lstm example (needs to be ...
trunk,nnet1: adding the LSTM C++ code from Jiay...
trunk,nnet1 : refactoring nnet/nnet-loss.{h,cc}...
trunk,dnn1.dox : updating the doc based on the ...
trunk,dnn1.dox : updating the docs (incorporate...
trunk,nnet1: steps/nnet/make_bn_feats.sh, HTK f...
utils/copy_data_dir.sh - suppress message when ...
trunk,nnet1 : bugfix
Hi, it seems you are using a very old GPU, which has 10x lower computing power than...
Okay, I have the code already; it was in an email from the author... Thanks, K. On 01/22/2015...
Hi Alim, thanks for the info, with classical feedforward DNNs, the frame accuracy...
trunk,nnet1: moving run_dnn.sh scripts to be co...
trunk,nnet1 : adding the tandem recipe,
trunk,bugfix : fixing bug in decode script with...
trunk,steps: adding the 'delta_opts' so there c...
trunk,nnet1: adding some backward-compatibility ...
Hi, Gaussians with a diagonal covariance matrix cannot fit data with correlations well,...
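In equations: a diagonal-covariance Gaussian factorizes over dimensions,
  p(\mathbf{x}) = \prod_{d} \mathcal{N}(x_d ;\, \mu_d, \sigma_d^2),
so any off-diagonal correlation in the data has to be approximated with extra mixture components or removed by decorrelating the features beforehand.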
Hi, the autoencoder example is in the trunk: trunk/egs/timit/s5/local/nnet/run_autoencoder.sh...
trunk,nnet1: adding autoencoder example, demons...
trunk,nnet1: adding tool 'feat-to-post', Conver...
Hi, thanks to Dan for forwarding. What you need to do is very similar to training...
... Also you may have to replace the softmax nonlinearity with another one, such...
In nnet-loss.h, there are child classes of Component called Xent and Mse. You have...
Look for the functions Propagate and Backpropagate of virtual class Component in...
Karel would have to answer, since you're using the nnet1 setup. Dan
Thanks for your reply. Yes, maximizing the -MSE is also one method for minimization...
Instead of modifying the update rule to allow for minimization, you can maximize -1...
I think the toolkit is probably set up to maximize the objective function, not minimize...
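Written out, the trick from this thread is just
  \arg\max_{\theta} \big( -\mathrm{MSE}(\theta) \big) = \arg\min_{\theta} \mathrm{MSE}(\theta),
so the existing maximizing update rule can be reused without modification.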
It is done beforehand, when preparing the input features; there are long vectors...
WARNING, TIMIT BASELINES CHANGE BY THIS COMMIT!
Dear Xinquan, which script do you refer to? If nnet1, there is an example in ^/trunk/egs/rm/s5/local/nnet/run_cnn.sh...
Dear Ben, please have a look at the tool nnet1-to-raw-nnet and the script "steps/nnet2/convert_nnet1_to_nnet2.sh"....
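For example (the directory names are placeholders and I have not re-checked the exact argument order, so consult the script's usage message):
  # sketch: convert an nnet1 model directory into an nnet2-style one
  steps/nnet2/convert_nnet1_to_nnet2.sh exp/dnn4_pretrain-dbn_dnn exp/dnn4_nnet2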
trunk,openblas : sync of linux_openblas.mk with...
trunk,matrix: adding speed test which can be us...
trunk,nnet1: adding workaround to disable GPU c...
Hi Rayava, sorry, I try not to focus on TNet anymore, and there was no conversion...
sandbox/cudnn : adding demonstration of Sigmoid...
Creating branch for cuDNN and Thrust code examp...
Yes, the tool was long ago replaced by "nnet-copy", which accepts options --remove-{first,last}-layers,...
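For instance, a sketch with placeholder file names (assuming the option counts components from the end, so removing 2 strips the softmax and the last affine transform, leaving a bottleneck):
  nnet-copy --remove-last-layers=2 final.nnet bottleneck.nnet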
trunk,nnet: updating the training scripts,
trunk,nnet1: changing comment to the --label op...