How to use multithreading on a CPU

I tried (with Copilot's help) to convert the model to the ONNX format, but I got this error message:

```
python3 conv_moon.py
Traceback (most recent call last):
  File "/home/martin/esn_vqa/conv_moon.py", line 34, in <module>
    convert_to_onnx(model, tokenizer)
  File "/home/martin/esn_vqa/conv_moon.py", line 10, in convert_to_onnx
    torch.onnx.export(
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/onnx/__init__.py", line 375, in export
    export(
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/onnx/utils.py", line 502, in export
    _export(
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/onnx/utils.py", line 1564, in _export
    graph, params_dict, torch_out = _model_to_graph(
                                    ^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/onnx/utils.py", line 1113, in _model_to_graph
    graph, params, torch_out, module = _create_jit_graph(model, args)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/onnx/utils.py", line 997, in _create_jit_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/onnx/utils.py", line 904, in _trace_and_get_graph_from_model
    trace_graph, torch_out, inputs_states = torch.jit._get_trace_graph(
                                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/jit/_trace.py", line 1500, in _get_trace_graph
    outs = ONNXTracedModule(
           ^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/jit/_trace.py", line 139, in forward
    graph, out = torch._C._create_graph_by_tracing(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/jit/_trace.py", line 130, in wrapper
    outs.append(self.inner(*trace_inputs))
                ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1726, in _slow_forward
    result = self.forward(*input, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/martin/esn_vqa/lib/python3.12/site-packages/torch/nn/modules/module.py", line 394, in _forward_unimplemented
    raise NotImplementedError(
NotImplementedError: Module [HfMoondream] is missing the required "forward" function
```
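For reference, the error can be reproduced without Moondream at all: `torch.onnx.export` traces the module's `forward()`, so any `nn.Module` whose top-level wrapper never defines one dies with this exact `NotImplementedError`. This is a minimal sketch of that failure mode and of the usual workaround, wrapping the model in a small adapter module that defines `forward()` and delegates to whatever callable the inner model actually exposes. `NoForward` and `ExportWrapper` are hypothetical names for illustration, not Moondream's actual API:

```python
import torch
import torch.nn as nn


class NoForward(nn.Module):
    # Mimics a top-level wrapper (like HfMoondream) that never defines forward().
    pass


class ExportWrapper(nn.Module):
    # Hypothetical adapter: gives the ONNX tracer a real forward() that
    # delegates to a callable the underlying model does expose.
    def __init__(self, inner):
        super().__init__()
        self.inner = inner

    def forward(self, x):
        return self.inner(x)


x = torch.zeros(1, 4)

try:
    NoForward()(x)  # same NotImplementedError that torch.onnx.export hits while tracing
except NotImplementedError:
    print("reproduced NotImplementedError")

# A stand-in for the real model's callable; the wrapper is now traceable.
wrapped = ExportWrapper(nn.Linear(4, 2))
print("wrapped output shape:", tuple(wrapped(x).shape))  # → (1, 2)
```

For a real model you would delegate to whichever method generates the output (and likely export the vision and text parts separately), so the hard part is picking the right inner callable and example inputs, not the wrapper itself.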

Sadly, I found very little about this on Google. Is there anything I can do about it?
