Missing tie_word_embeddings in config.json causes incorrect weight tying

#1

This fixes an issue with transformers >=4.54, <5, which defaults `tie_word_embeddings` to `True` when the key is missing from `config.json`; that is not the expected behavior for this model.
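A minimal sketch of the kind of change involved: declaring the key explicitly in `config.json` stops transformers from applying its own default. The `false` value is an assumption inferred from the description (the model's embeddings should not be tied); the actual diff should be checked for the exact value.

```json
{
  "tie_word_embeddings": false
}
```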

louis-jan changed pull request title from Update config.json to Missing tie_word_embeddings in config.json causes incorrect weight tying
jan-hq changed pull request status to merged