LongFinBERT-base / config.json
Upload folder using huggingface_hub
59712b7 verified
{
  "attention_probs_dropout_prob": 0.1,
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "max_position_embeddings": 250000,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "vocab_size": 30873,
  "segment_size": [16, 128, 512, 1024, 2048, 4096, 8192],
  "dilated_rate": [1, 16, 64, 256, 512, 1024, 2048]
}
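A minimal sketch of sanity-checking this config with only the Python standard library. It assumes (not confirmed by the repo) that `segment_size[i]` pairs with `dilated_rate[i]` per attention level; the pairing and the check are illustrative, not the model's actual loading code.

```python
import json

# config.json contents reproduced from the repo above.
CONFIG_JSON = """
{"attention_probs_dropout_prob": 0.1, "hidden_dropout_prob": 0.1,
 "hidden_size": 768, "max_position_embeddings": 250000,
 "num_attention_heads": 12, "num_hidden_layers": 12,
 "pad_token_id": 0, "vocab_size": 30873,
 "segment_size": [16, 128, 512, 1024, 2048, 4096, 8192],
 "dilated_rate": [1, 16, 64, 256, 512, 1024, 2048]}
"""

config = json.loads(CONFIG_JSON)

# Hidden size must divide evenly across attention heads.
assert config["hidden_size"] % config["num_attention_heads"] == 0

# Assumed pairing: one dilation rate per segment size (same list length).
assert len(config["segment_size"]) == len(config["dilated_rate"])

# Every segment must fit within the position-embedding budget.
assert max(config["segment_size"]) <= config["max_position_embeddings"]

for seg, dil in zip(config["segment_size"], config["dilated_rate"]):
    print(f"segment_size={seg:5d}  dilated_rate={dil:4d}")
```

With `max_position_embeddings` at 250,000 and segments up to 8,192, all checks pass on the values above.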