Yes, to use it with the Inference API you need a Pro subscription, since the model is too large (roughly 13 GB, above the 10 GB free-tier limit). Of course, you can still run it locally without any issue.