5bdcb56725 | new(transformer): implement TransformerEncoderLayer and add test | 2025-02-15 18:12:55 +09:00
e3765c80d7 | update: dependency | 2025-02-15 18:12:40 +09:00
8c14f7bb64 | refactor: rename test_transformer to test_multihead_attention | 2025-02-15 18:12:03 +09:00
de6634e3f7 | update doc on MultiheadAttention | 2025-02-03 22:12:34 +09:00
c2c2872b46 | complete MultiheadAttention | 2025-02-03 22:09:23 +09:00
c07d9c30cc | (WIP): implement MultiheadAttention | 2025-01-28 22:19:46 +09:00
6a91ef35e1 | update Manifest.toml | 2025-01-28 22:19:32 +09:00
d6b1d5b30f | add formatter config | 2025-01-28 22:19:15 +09:00
e41f6eafb3 | update | 2025-01-08 22:12:55 +09:00 | testuser
baa7d5a9d6 | updated to 1.11.2 | 2024-12-04 22:23:36 +09:00
dc399629ba | done: training a simple LSTM | 2024-11-22 21:30:46 +09:00