This model is a fine-tuned version of google/gemma-2-9b on the oxford-llms/ultrachat_filtered, oxford-llms/Magpie-Qwen2.5-Pro-1M-v0.1-filtered, oxford-llms/european_social_survey_2020_sft, oxford-llms/european_social_survey_2023_sft, and oxford-llms/world_values_survey_2017_2022_sft datasets. Its results on the evaluation set are reported in the training results table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.7037 | 1.0 | 7750 | 0.8465 |
| 0.5064 | 2.0 | 15500 | 0.8087 |
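For context, if the validation loss above is the mean token-level cross-entropy (the usual quantity reported by the Hugging Face Trainer, though this is an assumption), it maps directly to perplexity via exp(loss). A quick sketch using the values from the table:

```python
import math

# Validation losses copied from the training results table above
val_loss = {1: 0.8465, 2: 0.8087}

# Perplexity is exp(cross-entropy loss), assuming the reported loss
# is mean per-token cross-entropy in nats
perplexity = {epoch: math.exp(loss) for epoch, loss in val_loss.items()}

for epoch, ppl in sorted(perplexity.items()):
    print(f"epoch {epoch}: perplexity = {ppl:.3f}")
```

The second epoch's lower loss corresponds to a modest drop in validation perplexity.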