transformer-poet


Character-, ngram-, or word-based transformer model for text generation. Uses ml-indie-tools to run locally on M1 Macs and Nvidia GPUs, or remotely on Colab, from a single code base.
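As a rough illustration of what character-based tokenization involves (this is a generic sketch, not the actual ml-indie-tools API; all function names here are hypothetical):

```python
# Hypothetical sketch of character-level tokenization for a text generator.
# Not the ml-indie-tools implementation; names are illustrative only.

def build_char_vocab(text: str):
    """Map each unique character to an integer id, and back."""
    chars = sorted(set(text))
    stoi = {c: i for i, c in enumerate(chars)}   # string -> id
    itos = {i: c for c, i in stoi.items()}       # id -> string
    return stoi, itos

def encode(text: str, stoi: dict) -> list:
    """Turn text into a list of token ids."""
    return [stoi[c] for c in text]

def decode(ids: list, itos: dict) -> str:
    """Turn a list of token ids back into text."""
    return "".join(itos[i] for i in ids)

corpus = "hello world"
stoi, itos = build_char_vocab(corpus)
ids = encode(corpus, stoi)
assert decode(ids, itos) == corpus  # round-trip is lossless
```

An ngram tokenizer works the same way, except the vocabulary entries are short character sequences rather than single characters.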

You can find the transformer attention implementation in the ml-indie-tools project; it is minimal and well documented, and can serve as a base for further experimentation with transformer-like architectures.
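For orientation, the core of any such implementation is scaled dot-product self-attention. Below is a minimal NumPy sketch of a single causal attention head; it is an independent illustration under generic assumptions, not the ml-indie-tools code, and all names are illustrative:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """One attention head.

    x:  (seq_len, d_model) input token embeddings
    W*: (d_model, d_head) projection matrices for query, key, value
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # similarity of every query with every key, scaled by sqrt(d_head)
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # causal mask: each position may attend only to itself and the past
    mask = np.triu(np.ones(scores.shape, dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    # weighted sum of values, weights sum to 1 per query position
    return softmax(scores) @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
assert out.shape == (5, 4)
```

Because of the causal mask, the first output position can only attend to itself, so its output equals its own value projection.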

History

  • 2022-12-13: ml-indie-tools 0.4.0 removed all recurrence and gated memory, since they didn't improve results. Added a work-around for M1 TensorFlow 2.11 problems with Adam and XLA (crash on training), fixed by using the legacy Adam optimizer.
  • 2022-12-11: ml-indie-tools 0.3.17 has a new RecurrentSelfAttention layer that introduces a state, similar to RNNs, into the key matrix of the attention.
  • 2022-11-21: Added ngram support to ml-indie-tools.
  • 2022-06-16: Tests with autoencoder-like bottlenecks in multi-head attention: in the middle of the layer stack, decrease the number of attention units and increase the number of attention heads.
  • 2022-01-13: Project split from the LSTM version tensor-poet. Further project siblings are torch-poet, implementing char-based text generation with PyTorch, and syncognite's rnnreader, implementing char-based text generation completely from scratch in C++.

About

Character based transformer model for text generation
