Inference on a $200 SBC
I’ve long wanted a server I can use to host a quantized model for personal purposes: demoing a model, using it in a Hugging Face Space, and so on. There is a huge variety of options for doing this.…
Hugging Face Spaces are an amazing tool for demoing a model you’ve built; the only problem is that the free tier (CPU Basic, with 2 vCPUs and 16 GB of RAM) isn’t really capable of running any bigger model. Here I want to show…
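As a rough sketch of why that 16 GB ceiling matters: the weights of a model take up roughly (parameter count × bits per weight) of memory, and quantization shrinks the bits-per-weight term. The helper below is a back-of-the-envelope estimate only, ignoring activations, KV cache, and runtime overhead, so real usage will be higher.

```python
def weight_gib(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate memory needed for model weights alone, in GiB.

    Ignores activations, KV cache, and framework overhead, so treat
    the result as a lower bound on real memory usage.
    """
    return n_params_billion * 1e9 * bits_per_weight / 8 / 2**30

# A 7B model at 4-bit quantization fits comfortably in 16 GB of RAM,
# while a 70B model at the same quantization does not.
print(f"7B  @ 4-bit: {weight_gib(7, 4):.1f} GiB")
print(f"70B @ 4-bit: {weight_gib(70, 4):.1f} GiB")
print(f"7B  @ fp16:  {weight_gib(7, 16):.1f} GiB")
```

Even at fp16 a 7B model’s weights alone come close to the 16 GB limit once overhead is added, which is what makes 4-bit quantization attractive on small CPU-only boxes.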