Uploaded finetuned model
- Developed by: AhmetSemih
- License: apache-2.0
- Finetuned from model: unsloth/gemma-3-4b-it-unsloth-bnb-4bit
This model was trained on the following data (a loading sketch follows the list):
- Wikipedia (Turkish) – 21,399 articles
  Source: wikimedia/wikipedia (20231101.tr)
- MMLU-style QA dataset in Turkish – 5,000 examples
  Source: alibayram/turkish_mmlu
  Format: multiple-choice questions
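Both sources are available on the Hugging Face Hub. Below is a minimal sketch of pulling them with the `datasets` library; the split names and field accesses are assumptions for illustration and do not reproduce the exact preprocessing recipe used for training.

```python
from datasets import load_dataset

# Turkish Wikipedia dump (20231101.tr configuration of wikimedia/wikipedia);
# 21,399 articles were sampled from it for training.
wiki_tr = load_dataset("wikimedia/wikipedia", "20231101.tr", split="train")

# Turkish MMLU-style multiple-choice questions; 5,000 examples were used.
turkish_mmlu = load_dataset("alibayram/turkish_mmlu", split="train")

# Inspect one record from each source (field names follow each dataset's schema).
print(wiki_tr[0]["text"][:200])
print(turkish_mmlu[0])
```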
Model tree for AhmetSemih/gemma-3-4b-finetuned
- Base model: google/gemma-3-4b-pt
- Finetuned: google/gemma-3-4b-it
- Quantized: unsloth/gemma-3-4b-it-unsloth-bnb-4bit
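Since the weights were trained from an Unsloth 4-bit Gemma 3 checkpoint, the most direct way to try the model is through Unsloth. The sketch below is a minimal example; it assumes the uploaded repository can be reloaded with `FastModel.from_pretrained`, and the prompt and generation settings are purely illustrative.

```python
from unsloth import FastModel

# Load the fine-tuned checkpoint in 4-bit (assumes the repo reloads this way).
model, tokenizer = FastModel.from_pretrained(
    model_name="AhmetSemih/gemma-3-4b-finetuned",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Build a Gemma-style chat prompt (the Turkish question below is just an example).
messages = [{"role": "user", "content": "Türkiye'nin başkenti neresidir?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```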