Get a closer look at DistilBERT by accessing [DistilBertConfig] to inspect its attributes:
```py
>>> from transformers import DistilBertConfig

>>> config = DistilBertConfig()
>>> print(config)
DistilBertConfig {
  "activation": "gelu",
  "attention_dropout": 0.1,
  "dim": 768,
  "dropout": 0.1,
  "hidden_dim": 3072,
  "initializer_range": 0.02,
  "max_position_embeddings": 512,
  "model_type": "distilbert",
  "n_heads": 12,
  "n_layers": 6,
  "pad_token_id": 0,
  "qa_dropout": 0.1,
  "seq_classif_dropout": 0.2,
  "sinusoidal_pos_embds": false,
  "transformers_version": "4.16.2",
  "vocab_size": 30522
}
```
[DistilBertConfig] displays all the default attributes used to build a base [DistilBertModel].
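As a quick illustration of how these attributes can be customized, a minimal sketch is shown below: any of the defaults can be overridden by passing keyword arguments to [DistilBertConfig], and the resulting configuration can then be used to build a randomly initialized [DistilBertModel]. The specific attribute values chosen here are arbitrary examples, not recommendations.

```py
>>> from transformers import DistilBertConfig, DistilBertModel

>>> # Override a couple of default attributes (example values only)
>>> my_config = DistilBertConfig(activation="relu", attention_dropout=0.4)

>>> # Build a model with randomly initialized weights from the customized configuration
>>> model = DistilBertModel(my_config)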