fzengin18/multrenizer
Library: Transformers
Datasets: wikimedia/wikipedia, Helsinki-NLP/opus-100
Languages: Turkish, English
Tags: tokenizer, tokenizers, unigram, turkish, english, bilingual
arXiv: 2508.08424
License: apache-2.0
Branch: main · repository size: 1.86 MB
1 contributor · History: 7 commits
Latest commit: fzengin18, "Add dataset metadata to model card" (e1fe4bd, verified), about 14 hours ago
File                      Size      Last commit message                    Last updated
.gitattributes            1.52 kB   initial commit                         about 14 hours ago
LICENSE                   10.8 kB   Upload LICENSE with huggingface_hub    about 14 hours ago
README.md                 14.9 kB   Add dataset metadata to model card     about 14 hours ago
special_tokens_map.json   16.2 kB   Upload folder using huggingface_hub    about 14 hours ago
tokenizer.json            1.8 MB    Upload folder using huggingface_hub    about 14 hours ago
tokenizer_config.json     16.9 kB   Upload folder using huggingface_hub    about 14 hours ago
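The tags describe a bilingual Turkish/English Unigram tokenizer, serialized to tokenizer.json in the format the Hugging Face tokenizers library produces. As a minimal sketch of that workflow (not this repo's actual training setup: the toy corpus, vocabulary size, and special tokens below are illustrative assumptions, standing in for the wikimedia/wikipedia and Helsinki-NLP/opus-100 data listed above), a comparable Unigram tokenizer can be trained, saved, and reloaded like this:

```python
import os
import tempfile

from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Build an empty Unigram model and split on whitespace/punctuation first.
tokenizer = Tokenizer(models.Unigram())
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()

# Illustrative trainer settings; the real repo's vocabulary size and
# special tokens live in its tokenizer_config.json / special_tokens_map.json.
trainer = trainers.UnigramTrainer(
    vocab_size=200,
    special_tokens=["<unk>", "<s>", "</s>"],
    unk_token="<unk>",
)

# Toy bilingual corpus standing in for the Wikipedia and OPUS-100 datasets.
corpus = [
    "Merhaba dünya, bu bir deneme cümlesidir.",
    "Hello world, this is a test sentence.",
    "Tokenizer iki dilde eğitilir.",
    "The tokenizer is trained on two languages.",
]
tokenizer.train_from_iterator(corpus, trainer)

# Serialize to a tokenizer.json (same file format as in this repo) and reload.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "tokenizer.json")
    tokenizer.save(path)
    reloaded = Tokenizer.from_file(path)

print(reloaded.encode("Merhaba world").tokens)
```

A tokenizer published this way can then be pulled straight from the Hub, e.g. with Transformers' AutoTokenizer pointed at the repo id.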