Dataset Viewer
Auto-converted to Parquet
The table has one row per evaluated model. Column names are the dataset's own (Portuguese) identifiers; English glosses are given in parentheses. For numeric columns the viewer shows the observed minimum and maximum across the rows.

| Column | Type | Observed values |
| --- | --- | --- |
| T | stringclasses | 1 value |
| Modelo (model) | stringclasses | 5 values |
| Tipo (fine-tuning type) | stringclasses | 1 value |
| Arquitetura (architecture) | stringclasses | 2 values |
| Tipo de Peso (weight type) | stringclasses | 1 value |
| Precisão (precision) | stringclasses | 1 value |
| Licença (license) | stringclasses | 2 values |
| #Params (B) | float64 | 0.49 to 3.21 |
| Hub Likes | int64 | 263 to 1.52k |
| Disponível no hub (available on the Hub) | bool | 1 class |
| SHA do modelo (model SHA) | stringclasses | 5 values |
| Discurso de Ódio (hate-speech category score) | float64 | 0.56 to 0.79 |
| Semântica e Inferência, Computação, Área do Direito, Área Médica, Multidisciplinar, Economia e Contabilidade, Provas Militares (remaining category scores) | float64 | 0 |
| HateBR | float64 | 0.5 to 0.88 |
| PT Hate Speech | float64 | 0.64 to 0.7 |
| tweetSentBR | float64 | 0.5 to 0.71 |
| OAB, Revalida, MREX, ENAM, AFA, ITA, IME, POSCOMP, OBI, BCB, CFCES, ASSIN2 RTE, ASSIN2 STS, FAQUAD NLI, BLUEX, ENEM, CNPU, ENADE, BNDES, CACD (1ª fase), CACD (2ª fase) (remaining per-benchmark scores) | float64 | 0 |
| Média Geral (overall average) | float64 | 0.56 to 0.79 |
| Datasets Área Médica, Datasets Área do Direito, Datasets Provas Militares, Datasets Computação, Datasets Discurso de Ódio, Datasets Economia e Contabilidade, Datasets Semântica e Inferência, Datasets Multidisciplinar (per-category dataset lists) | stringclasses | 1 value each |
| energy_dataset | float64 | 0.5 |
| reasoning_dataset | float64 | 0.5 |
| ToxSyn-PT | float64 | 0.53 to 0.91 |
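Because the data is auto-converted to Parquet, the table can be pulled programmatically. A minimal loading sketch with the `datasets` library; the repo id `org/pt-leaderboard-results` is a placeholder (this page does not show the actual dataset id):

```python
from datasets import load_dataset

# Placeholder repo id -- substitute the real dataset id from the page URL.
ds = load_dataset("org/pt-leaderboard-results", split="train")
df = ds.to_pandas()

# Column names are the Portuguese identifiers from the schema above.
print(df[["Modelo", "#Params (B)", "Média Geral"]])
```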
Every row shares T = SFT, Tipo = "SFT : Supervised Finetuning", Tipo de Peso = Original, Precisão = BF16, Disponível no hub = true, energy_dataset = 0.5, and reasoning_dataset = 0.5. All category and per-benchmark scores other than the four hate-speech datasets are 0, so Média Geral equals the Discurso de Ódio category score, which in turn matches the arithmetic mean of HateBR, PT Hate Speech, tweetSentBR, and ToxSyn-PT.

| Modelo | Arquitetura | Licença | #Params (B) | Hub Likes | SHA do modelo | HateBR | PT Hate Speech | tweetSentBR | ToxSyn-PT | Discurso de Ódio / Média Geral |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Qwen/Qwen2.5-0.5B-Instruct | Qwen2ForCausalLM | qwen-research | 0.494 | 329 | 7ae557604adf67be50417f59c2c2f167def9a775 | 0.499285 | 0.700824 | 0.497512 | 0.525163 | 0.555696 |
| Qwen/Qwen2.5-1.5B-Instruct | Qwen2ForCausalLM | qwen-research | 1.544 | 449 | 989aa7980e4cf806f80c7fef2b1adb7bc71aa306 | 0.878571 | 0.696827 | 0.674627 | 0.900538 | 0.787641 |
| Qwen/Qwen2.5-3B-Instruct | Qwen2ForCausalLM | qwen-research | 3.086 | 263 | aa8e72537993ba99e69dfaafa59ed015b17504d1 | 0.840714 | 0.650999 | 0.71393 | 0.907834 | 0.778369 |
| meta-llama/Llama-3.2-1B-Instruct | LlamaForCausalLM | llama3.2 | 1.236 | 964 | 9213176726f574b556790deb65791e0c5aa438b6 | 0.633571 | 0.681551 | 0.554726 | 0.728495 | 0.649586 |
| meta-llama/Llama-3.2-3B-Instruct | LlamaForCausalLM | llama3.2 | 3.213 | 1,516 | 0cb88a4f764b7a12671c53f0838cd831a0843b95 | 0.865 | 0.640423 | 0.663184 | 0.876728 | 0.761334 |

The per-category dataset lists are identical in every row:

- Datasets Área Médica: Revalida, MREX
- Datasets Área do Direito: OAB, ENAM
- Datasets Provas Militares: AFA, ITA, IME
- Datasets Computação: POSCOMP, OBI
- Datasets Discurso de Ódio: HateBR, PT Hate Speech, tweetSentBR, ToxSyn-PT
- Datasets Economia e Contabilidade: BCB, CFCES
- Datasets Semântica e Inferência: FAQUAD NLI, ASSIN2 RTE, ASSIN2 STS
- Datasets Multidisciplinar: ENEM, BLUEX, CNPU, ENADE, BNDES, CACD (1ª fase), CACD (2ª fase)
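The category score can be checked against the raw benchmark columns. A small sketch, using the numbers from the Qwen/Qwen2.5-0.5B-Instruct row above, showing that Discurso de Ódio (and hence Média Geral) matches the arithmetic mean of the four hate-speech benchmarks:

```python
# Scores copied from the Qwen/Qwen2.5-0.5B-Instruct row above.
scores = {
    "HateBR": 0.499285,
    "PT Hate Speech": 0.700824,
    "tweetSentBR": 0.497512,
    "ToxSyn-PT": 0.525163,
}

mean = sum(scores.values()) / len(scores)
print(round(mean, 6))  # 0.555696 -- the reported Discurso de Ódio / Média Geral
```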
The dataset's README.md exists but its content is empty, so there is no dataset card.
Downloads last month: 3