Quantizations of https://huggingface.co/bralynn/pydevmini1

Open source inference clients/UIs

Closed source inference clients/UIs


From the original README

🚀 Try It Yourself (for free)

Don't just take my word for it. Test the model right now under the exact conditions shown in the video demonstration.

Open In Colab


Model Details

  • Model Type: Causal Language Model
  • Number of Parameters: 4.0B
  • Number of Parameters (Non-Embedding): 3.6B
  • Number of Layers: 36
  • Number of Attention Heads (GQA): 32 for Q, 8 for KV
  • Context Length: 262,144 tokens (native)
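
If you want to sanity-check these figures, the configuration of the base (unquantized) checkpoint can be inspected with transformers. This is a minimal sketch, assuming the standard Qwen3-style config attribute names; it reads the original repository, not the GGUF files here.

    # Minimal sketch, assuming the transformers library is installed.
    from transformers import AutoConfig

    cfg = AutoConfig.from_pretrained("bralynn/pydevmini1")
    print(cfg.num_hidden_layers)        # expected: 36 layers
    print(cfg.num_attention_heads)      # expected: 32 query heads
    print(cfg.num_key_value_heads)      # expected: 8 KV heads (GQA)
    print(cfg.max_position_embeddings)  # expected: 262144 tokens of native context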

Recommended Inference Parameters

For best results, I suggest using the following generation parameters:

  • Temperature: 0.7
  • Top P: 0.8
  • Top K: 20
  • Min P: 0.0
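
To apply these settings to one of the GGUF files in this repository, a call along the following lines should work. This is a minimal sketch, assuming llama-cpp-python (a recent version that supports min_p) and a locally downloaded quant; the filename is a placeholder, not an actual file in this repo.

    # Minimal sketch, assuming llama-cpp-python is installed and a GGUF quant
    # has been downloaded locally (the filename below is a placeholder).
    from llama_cpp import Llama

    llm = Llama(model_path="pydevmini1-Q4_K_M.gguf", n_ctx=8192)

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
        temperature=0.7,  # recommended settings from the list above
        top_p=0.8,
        top_k=20,
        min_p=0.0,
        max_tokens=512,
    )
    print(out["choices"][0]["message"]["content"])

The same values map directly onto llama.cpp's --temp, --top-p, --top-k, and --min-p command-line flags if you prefer the CLI.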
GGUF Details

  • Format: GGUF
  • Architecture: qwen3
  • Model size: 4B params
  • Downloads last month: 260
  • Available quantizations: 1-bit, 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit
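
Individual quant files can be fetched with huggingface_hub. This is a minimal sketch; both the repository ID and the filename below are hypothetical placeholders, so check the repository's file listing for the actual GGUF names.

    # Minimal sketch, assuming the huggingface_hub library is installed.
    # NOTE: repo_id and filename are hypothetical placeholders.
    from huggingface_hub import hf_hub_download

    gguf_path = hf_hub_download(
        repo_id="someuser/pydevmini1-GGUF",  # placeholder quant repository
        filename="pydevmini1-Q4_K_M.gguf",   # placeholder 4-bit quant file
    )
    print(gguf_path)  # local cache path of the downloaded GGUF file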
