How to download a model and run it with Ollama locally?

May I ask: after downloading a model, how can I use Ollama to load it?
The download is in Transformers format, with files like config.json, model-00001-of-00002.safetensors, pytorch_model-00003-of-00003.bin, special_tokens_map.json, and tokenizer.model.
There is no Modelfile, so how can I load it with Ollama?
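For reference, here is a minimal sketch of the usual import path: Ollama can build a model from a local weights directory if you write a Modelfile whose `FROM` line points at it, then run `ollama create`. The directory name `./my-model` below is a placeholder for wherever the downloaded files live; note that safetensors import only works for architectures Ollama supports, and otherwise the weights would first need converting to GGUF (e.g. with llama.cpp's `convert_hf_to_gguf.py`) before pointing `FROM` at the resulting `.gguf` file.

```shell
# Sketch: importing a downloaded Transformers model into Ollama.
# "./my-model" is a hypothetical path to the directory holding
# config.json, the *.safetensors shards, tokenizer files, etc.

# 1. Write a minimal Modelfile pointing at the model directory.
cat > Modelfile <<'EOF'
FROM ./my-model
EOF

# 2. Build an Ollama model from it (done once):
#    ollama create my-model -f Modelfile
# 3. Then run it interactively:
#    ollama run my-model
```

The `ollama create` and `ollama run` steps are left as comments here since they require the Ollama daemon to be running; the Modelfile itself is the only piece the downloaded files are missing.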