Model not loaded on the server
#28
by divakaivan
I am getting:
Model not loaded on the server: https://api-inference.huggingface.co/models/codellama/CodeLlama-34b-Instruct-hf. Please retry with a higher timeout (current: 120).
I used this model in my code a month ago and it worked. I checked it now and it no longer works :(
code: https://github.com/divakaivan/text2chart/blob/main/app.py
edit: I am new to Hugging Face, so I am not sure what is happening
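For reference, a workaround that sometimes helps with this error (a sketch, not taken from the linked repo): the Inference API accepts a `wait_for_model` option that queues the request until the model finishes loading, and `requests` lets you raise the client-side timeout past the default 120 s. The helper names (`build_payload`, `query`) are mine, not from the original code.

```python
import requests

# Public Inference API endpoint for the model from the error message
API_URL = "https://api-inference.huggingface.co/models/codellama/CodeLlama-34b-Instruct-hf"

def build_payload(prompt):
    # wait_for_model asks the API to hold the request until the model is loaded,
    # instead of failing immediately with "Model not loaded on the server"
    return {"inputs": prompt, "options": {"wait_for_model": True}}

def query(prompt, token, timeout=300):
    # timeout=300 raises the client-side limit well past the 120 s in the error
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        json=build_payload(prompt),
        timeout=timeout,
    )
    resp.raise_for_status()
    return resp.json()
```

Large models like CodeLlama-34b can take several minutes to cold-start on the serverless API, so even this may need a retry or two.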