experiment for numerical stability
add FilterCSVWithNumbersUpToMax
experimental transformer code
fixes ReverseGroupedOneHotEncoding
fixes TNNetNeuron.InitAdam
fixes GroupedOneHotEncoding
fixes access violation error
fixes access violation error
NLP coding
New experimental CAI self attention
MulWeights is now a function
fixes TNNetSigmoid.Backpropagate()
InitHeUniformForAllDenseLayers
Adding InitGlorotBengioUniformForAllConvLayers
CAI is again using He for convolutional layers
Convolutional layers got the same initialization as Keras
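The initialization commits above reference He uniform and Glorot/Bengio uniform schemes. As a generic illustration of those two formulas (not the repository's Pascal code; function names here are my own), a minimal sketch:

```python
import math
import random

def he_uniform(fan_in):
    """He uniform init: sample from [-limit, limit] with limit = sqrt(6 / fan_in).
    Commonly paired with ReLU activations."""
    limit = math.sqrt(6.0 / fan_in)
    return random.uniform(-limit, limit)

def glorot_uniform(fan_in, fan_out):
    """Glorot/Bengio uniform init: limit = sqrt(6 / (fan_in + fan_out)).
    This is the Keras default for conv and dense layers."""
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return random.uniform(-limit, limit)
```

For a convolutional layer, `fan_in` is typically `kernel_height * kernel_width * input_channels`.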
Updating CAI Transformer
Updating CAI Transformer
Updating CAI Transformer
fixes pointwise softmax with no forward and skip derivative
adding debug code
adding debug code
adding debug code
better normalization methods
fixes backpropagation with branching
fixes backpropagation with branching
adds plenty of self testing
fixes backpropagation with branching
New TNNetMovingStdNormalization learning algorithm
New TNNetMovingStdNormalization learning algorithm
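One plausible reading of a "moving std normalization" layer, sketched in Python for illustration only (the class name and momentum/eps defaults are assumptions, not taken from TNNetMovingStdNormalization itself): divide activations by a running estimate of their standard deviation.

```python
class MovingStdNormalization:
    """Normalizes inputs by a running (exponential moving average) estimate
    of their standard deviation. Illustrative sketch, not CAI's Pascal code."""

    def __init__(self, momentum=0.99, eps=1e-8):
        self.momentum = momentum
        self.eps = eps
        self.running_var = 1.0  # running second moment, starts at 1

    def __call__(self, xs, training=True):
        if training:
            # update running variance from the current batch's second moment
            batch_var = sum(x * x for x in xs) / len(xs)
            self.running_var = (self.momentum * self.running_var
                                + (1 - self.momentum) * batch_var)
        std = (self.running_var + self.eps) ** 0.5
        return [x / std for x in xs]
```

Unlike batch normalization, this variant only rescales; it does not subtract a mean or learn affine parameters.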
better delta clip
inertia coding - better debug
inertia coding - better debug
Clear delta and inertia at loading
More debug info
Save best accuracy x loss
Save best accuracy x loss
Better properties usage
adding validation error and loss properties to NFIT
better initialization parameters for NLP
updating default optimizer values
Adding local connect linear
fixes pixel class reading
new loss functions
better log messages
fixing one-hot encoding
small optimization for noforward in dotproducts
fixes noforward parameter in dotproducts
optimizing dotproducts backprop
coding grouped pointwise softmax
Grouped one-hot encoding
testing transformers
coding NLP support
coding NLP support
No forward calculation at softmax
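"Pointwise softmax" here refers to applying softmax across the channel (depth) dimension independently at each spatial position or token. A minimal, numerically stable sketch of that per-position operation (illustrative Python, not the repository's implementation):

```python
import math

def pointwise_softmax(channels):
    """Softmax over the channel values at a single spatial position.
    Subtracting the max before exponentiating avoids overflow."""
    mx = max(channels)
    exps = [math.exp(x - mx) for x in channels]
    total = sum(exps)
    return [e / total for e in exps]
```

In a full layer this would be applied once per (x, y) position of the activation volume, or once per token in a sequence.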
adding debug functions
More safety checks
Optimizing TVolume.CopyTransposingAs2D
Fixing departing branches count
Fixing departing branches count
Fixing DotProducts departing branches
Fixing DotProducts departing branches
closer implementation of transformer decoder
closer implementation of transformer encoder
fixes embedding learning direction
fixes embedding learning direction
removing min learning rate
removing extra transpose from transformers
experimental max norm in CAI transformer
In testing we trust - fixing initialization of embedding
In testing, we trust: fixes PointwiseSoftMax
In testing, we trust: fixes PointwiseSoftMax
protects against overflow with adam
protects against overflow with adam
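The Adam commits above mention overflow protection. A minimal sketch of a single-weight Adam update with a clamped step, one common way to guard against overflow (generic illustration with assumed defaults, not the Pascal code in TNNetNeuron.InitAdam):

```python
import math

def adam_step(w, g, m, v, t,
              lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8, max_step=1.0):
    """One Adam update for a single weight w with gradient g at step t >= 1.
    Returns the new (w, m, v). The step is clamped to [-max_step, max_step]
    so a pathological gradient cannot blow up the weight."""
    m = beta1 * m + (1 - beta1) * g          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    step = lr * m_hat / (math.sqrt(v_hat) + eps)
    step = max(-max_step, min(max_step, step))  # overflow protection
    return w - step, m, v
```

The `eps` in the denominator also prevents division by zero when the second-moment estimate is tiny.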
in code review we trust
in code review we trust
updating SGD optimizer
embedding now uses uniform initialization
fixing broken softmax
in testing/replication we trust
coding adam
coding adam
coding adam
fixing introduced bug and speeding up
experimenting delta norm
experimenting delta norm
in testing we trust
in testing we trust
OneHotEncodingReversed
testing/bug fixing transformers
testing/bug fixing transformers
Coding transformers - missed debug
Coding transformers
coding embedding
coding token embedding
Adding Transformer Example
coding TNNetTokenAndPositionalEmbedding
coding TNNetTokenAndPositionalEmbedding
coding token embedding
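A token-and-positional embedding, as the TNNetTokenAndPositionalEmbedding commits suggest, typically sums a learned per-token vector with a learned per-position vector. A minimal sketch of that combination (illustrative only; the data layout is an assumption):

```python
def token_and_positional_embedding(tokens, token_emb, pos_emb):
    """For each token id at position i, return token_emb[id] + pos_emb[i].
    tokens: list of token ids; token_emb: id -> vector; pos_emb: list of vectors."""
    return [[t + p for t, p in zip(token_emb[tok], pos_emb[i])]
            for i, tok in enumerate(tokens)]
```

In a transformer, this combined embedding is what feeds the first attention block, so both token identity and position are visible to the model.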
updating source code comments