I’m trying to get started with the example extension (transformers.js/examples/extension at main · huggingface/transformers.js · GitHub). I’d like to switch the model that’s used to the google/gemma-2b model. Which files in the extension do I edit to change the model that is compiled into the extension?
Hi, @ai-driana!

To switch the model used in the example extension to `google/gemma-2b`, you’ll need to make changes in specific files of the extension project. Here’s a step-by-step guide:
1. Locate the model configuration file
   - In the example extension, the model is typically defined in a configuration file or directly in the source code where the model is loaded. Look for a file like `config.js`, `model.js`, or similar.
2. Update the model name
   - Change the model name from the default to `google/gemma-2b`. For example, if the model is specified like this:

     ```js
     const modelName = 'default-model-name';
     ```

     Update it to:

     ```js
     const modelName = 'google/gemma-2b';
     ```
3. Verify the model initialization
   - If there’s a section where the model is initialized, ensure it references the new model and uses the appropriate task (for Gemma, text generation). For example:

     ```js
     import { pipeline } from '@xenova/transformers';

     const generator = await pipeline('text-generation', 'google/gemma-2b');
     ```
4. Update any preloaded model assets (if applicable)
   - If the example extension preloads model weights or tokenizer files, you might need to replace these assets with the ones corresponding to `google/gemma-2b`. Check the folder structure for directories like `models` or `assets`.
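If you do ship the model files with the extension, transformers.js can be pointed at them instead of fetching from the Hugging Face Hub. A minimal configuration sketch, assuming the `@xenova/transformers` package and an illustrative `./models/` folder (your folder name may differ):

```javascript
// Sketch: load model files bundled with the extension rather than
// downloading them from the Hub. The path below is illustrative.
import { env, pipeline } from '@xenova/transformers';

env.allowRemoteModels = false;     // never fetch from the Hub
env.localModelPath = './models/';  // folder shipped with the extension

// transformers.js will now resolve files under ./models/google/gemma-2b/
const generator = await pipeline('text-generation', 'google/gemma-2b');
```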
5. Test and build the extension
   - Run the extension locally to verify that the new model loads correctly. You might need to install dependencies or ensure the model is compatible with the transformers.js library. Note that transformers.js runs models in ONNX format, so `google/gemma-2b` may not load as-is; you may need an ONNX-converted variant of the model.
6. Troubleshoot potential issues
   - If the model requires additional configuration (e.g., specific tokenizers or parameters), ensure these are also updated in the code.

If you’re unfamiliar with the project structure, start by looking at the main entry points like `index.js` or `app.js` and follow the imports to find where the model is defined and used.
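To make that lazy-loading structure concrete: extensions built on transformers.js commonly wrap the pipeline in a singleton so the model is loaded once and reused across requests. Below is a dependency-free sketch of that pattern; `createPipeline` is a stand-in for transformers.js’s `pipeline()`, and the class and field names are illustrative, not necessarily the extension’s actual ones:

```javascript
// Dependency-free sketch of lazy, load-once model initialization.
// `createPipeline` stands in for transformers.js's pipeline(task, model).
class ModelSingleton {
  static task = 'text-generation';
  static model = 'google/gemma-2b'; // the model to swap in
  static instance = null;

  static getInstance(createPipeline) {
    if (this.instance === null) {
      // First call: start loading. The *promise* is cached, so concurrent
      // callers share one load instead of triggering several.
      this.instance = createPipeline(this.task, this.model);
    }
    return this.instance;
  }
}

// Stub factory so the sketch runs without downloading anything.
let loads = 0;
const stubPipeline = async (task, model) => {
  loads += 1;
  return { task, model };
};

(async () => {
  const a = await ModelSingleton.getInstance(stubPipeline);
  const b = await ModelSingleton.getInstance(stubPipeline);
  console.log(loads === 1 && a === b); // true: loaded once, instance reused
})();
```

Caching the promise (rather than the resolved pipeline) is the key design choice: it prevents a double load if two messages arrive while the model is still initializing.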
Hope this helps!