---
language: en
license: mit
tags:
- code
- python
- assistant
- causal-lm
- streamlit
pipeline_tag: text-generation
---

# 🧠 Python Code Assistant (Fine-tuned CodeGen 350M)

This model is a fine-tuned version of `Salesforce/codegen-350M-multi` designed to generate Python code from natural-language prompts.

## 🧪 Example Prompt

```
Write a Python function to check if a number is prime.
```

## ✅ Example Output

```python
def is_prime(n):
    if n < 2:
        return False
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True
```
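Since model output is unreviewed code, it is worth syntax-checking a completion before running it. A minimal sketch (the `run_generated` helper is hypothetical, not part of this model) that compiles a snippet and executes it in an isolated namespace:

```python
def run_generated(code: str) -> dict:
    # compile() raises SyntaxError if the generated code is malformed
    compiled = compile(code, "<generated>", "exec")
    namespace = {}
    # Note: exec is NOT a sandbox; only run code you have inspected
    exec(compiled, namespace)
    return namespace

ns = run_generated("""
def is_prime(n):
    if n < 2:
        return False
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True
""")
print(ns["is_prime"](7))  # → True
print(ns["is_prime"](9))  # → False
```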

## 🛠️ Intended Use

- Educational coding help
- Rapid prototyping in notebooks or IDEs
- Integration with Streamlit apps

> 🚫 Not intended to replace formal code review or secure programming practices.

## 🔍 Model Details

- Base model: `Salesforce/codegen-350M-multi`
- Training: fine-tuned on 500+ Python instruction–completion pairs
- Objective: causal language modeling

## 🧰 How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("AhsanFarabi/python-assistant")
tokenizer = AutoTokenizer.from_pretrained("AhsanFarabi/python-assistant")

prompt = "Write a function to reverse a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
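As with most causal LMs, `generate` returns the prompt tokens followed by the completion, so the decoded string begins with the prompt itself. A minimal sketch of stripping it off (the `text` value below is a stand-in for the decoded output, not a real model call):

```python
prompt = "Write a function to reverse a string."
# Stand-in for tokenizer.decode(...) output, which echoes the prompt
text = prompt + "\ndef reverse_string(s):\n    return s[::-1]"

# Drop the echoed prompt, keeping only the generated completion
completion = text[len(prompt):].lstrip()
print(completion.splitlines()[0])  # → def reverse_string(s):
```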

---