Just download the model and run it on your local machine, or use the Inference API.
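A minimal sketch of both options using the `huggingface_hub` library; `sshleifer/tiny-gpt2` is a stand-in repo id here, so swap in the id of the model on this page:

```python
# Assumed example repo id "sshleifer/tiny-gpt2" -- replace with the actual model id.
from huggingface_hub import hf_hub_download

# Option 1: download the model files to the local cache
# (shown here with just the config file as a small demo).
path = hf_hub_download(repo_id="sshleifer/tiny-gpt2", filename="config.json")
print(path)  # local filesystem path of the cached file

# Option 2: instead of running locally, call the hosted Inference API,
# e.g. via huggingface_hub.InferenceClient (requires a Hugging Face access token).
```

Downloading this way caches the files under `~/.cache/huggingface` by default, so repeated runs don't re-download.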