Final Exam Issue

In the LLM course's "Fine-tuning large language models" exam, I get this error, and I would like to resolve it so I can obtain my certificate.


The error you are seeing (410 Gone) is not a bug in your code logic, but a result of Hugging Face’s recent infrastructure migration. The old inference endpoint has been decommissioned in favor of a new routing system.

To fix this and successfully complete your exam, follow these steps:

1. Update the Base URL

You must replace the deprecated URL with the new “Router” address in your configuration.

  • Old URL: https://api-inference.huggingface.co/...

  • New URL: https://router.huggingface.co/hf-inference/v1
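If you are not sure where the old host is hard-coded in your project, a small sketch like this can normalize whatever value your configuration currently holds (the helper name and constants here are illustrative, not part of any Hugging Face SDK):

Python

```python
# Hypothetical helper: rewrite a config value that still points at the
# decommissioned Inference API host. Names are illustrative only.
OLD_HOST = "https://api-inference.huggingface.co"
NEW_BASE_URL = "https://router.huggingface.co/hf-inference/v1"

def migrate_base_url(url: str) -> str:
    """Return the new router base URL if `url` targets the old host."""
    if url.startswith(OLD_HOST):
        return NEW_BASE_URL
    return url
```

Calling `migrate_base_url` on any URL that begins with the old host returns the new router base URL, while unrelated URLs pass through unchanged.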

2. Upgrade the Model Version

Since you are currently pointing to a Qwen 2.5 checkpoint, this is also a good opportunity to switch to a Coder variant such as Qwen2.5-Coder (or whatever the most recent Coder release on the Hub is), which offers noticeably better reasoning and syntax handling for LLM fine-tuning tasks.

3. Implementation Example (Python)

If you are using the OpenAI SDK or a similar client to connect to Hugging Face, update your initialization as follows:

Python

from openai import OpenAI

client = OpenAI(
    base_url="https://router.huggingface.co/hf-inference/v1",  # Update this line
    api_key="YOUR_HF_TOKEN"
)

# Use the latest model version
response = client.chat.completions.create(
    model="Qwen/Qwen2.5-Coder-32B-Instruct",  # check the Hub for newer Coder releases
    messages=[{"role": "user", "content": "Explain your fine-tuning process."}]
)


Note: Once you switch to the router.huggingface.co endpoint, the 410 error should disappear, and the automated grader will be able to verify your code and issue your certificate.