FineWeb GPT-2 Tokens
This is the full 10B-token sample of FineWeb,
split into 99% training data and 1% validation data, each converted to a single uint16 tensor of GPT-2 tokens.
The text of each record in the original dataset was tokenized using tiktoken,
then all token sequences were concatenated together, with <|endoftext|> tokens separating documents.
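A minimal sketch of this step, assuming the standard tiktoken GPT-2 encoding; the function and variable names are illustrative, not the actual preprocessing script:

```python
import numpy as np
import tiktoken

enc = tiktoken.get_encoding("gpt2")
eot = enc.eot_token  # id of <|endoftext|> (50256 for GPT-2)

def tokenize_documents(texts):
    """Tokenize each document and join them with <|endoftext|> separators."""
    tokens = []
    for text in texts:
        tokens.extend(enc.encode_ordinary(text))  # encode without special tokens
        tokens.append(eot)                        # separate documents
    # The GPT-2 vocabulary (50257 ids) fits in uint16
    return np.array(tokens, dtype=np.uint16)
```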
Code:
Based on
This dataset is derived from the original FineWeb dataset:
- Source: FineWeb (by Hugging Face)
- Processing:
  - Input text from the FineWeb 10B sample
  - Tokenized using tiktoken with the GPT-2 encoding
  - <|endoftext|> inserted between documents
  - Stored as uint16 safetensors (see the sketch below)
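A sketch of how the concatenated token stream might be split and written to safetensors files. The file and key names match the layout described under "File format"; the 99% / 1% split point and the use of the safetensors numpy API are assumptions, not the actual script:

```python
import numpy as np
from safetensors.numpy import save_file

# `tokens` is the concatenated uint16 token stream from the sketch above.
tokens = tokenize_documents(texts)

# Assumed 99% / 1% split into training and validation streams.
split = int(0.99 * len(tokens))
save_file({"tokens": np.ascontiguousarray(tokens[:split])}, "train.safetensors")
save_file({"tokens": np.ascontiguousarray(tokens[split:])}, "validation.safetensors")
```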
File format
- train.safetensors – token stream for training
- validation.safetensors – token stream for validation
Each file contains a single tensor:
{"tokens": torch.uint16[...]}