Text_encoder_2, local model, not working

Hey. I am trying to use a specific model, "protovisionXLHighFidelity3D_release0630Bakedvae.safetensors", locally. It downloads most of the required files, but then fails with this error:

Traceback (most recent call last):
  File "f:\Disk 2 (D)\CODING\PROJECTS\PYTHON\InstaLessReels\Diffusers.py", line 9, in <module>
    pipe = StableDiffusionPipeline.from_single_file(
  File "C:\Users\Alexa\AppData\Local\Programs\Python\Python310\lib\site-packages\huggingface_hub\utils\_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\Alexa\AppData\Local\Programs\Python\Python310\lib\site-packages\diffusers\loaders\single_file.py", line 261, in from_single_file
    pipe = download_from_original_stable_diffusion_ckpt(
  File "C:\Users\Alexa\AppData\Local\Programs\Python\Python310\lib\site-packages\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py", line 1749, in download_from_original_stable_diffusion_ckpt
    pipe = pipeline_class(
TypeError: StableDiffusionPipeline.__init__() got an unexpected keyword argument 'text_encoder_2'

It looks like a problem with text_encoder_2. Does anyone know what the solution to this is?

Here is my code:

import requests
from io import BytesIO

from PIL import Image
from diffusers import StableDiffusionPipeline, StableDiffusionUpscalePipeline
import torch

pipe = StableDiffusionPipeline.from_single_file(
    "protovisionXLHighFidelity3D_release0630Bakedvae.safetensors",
    local_files_only=False,
    torch_dtype=torch.float16,
    safety_checker=None,
    requires_safety_checker=False,
).to("cuda")

# Prompt
prompt = "charturnerv2, multiple views of the same character in the same outfit, a character turnaround of a woman wearing a black jacket and red shirt, best quality, intricate details."

image = pipe(prompt, height=720, width=408, num_inference_steps=24, guidance_scale=4).images[0]

image.save("Doesitwork.png")

print("Finished")

An XL model should be loaded with StableDiffusionXLPipeline instead of StableDiffusionPipeline. The "XL" in the filename indicates an SDXL checkpoint, which has two text encoders; the converter therefore passes a text_encoder_2 argument that the plain StableDiffusionPipeline does not accept, which is exactly the TypeError you are seeing.

It's been a long time since you posted the question, but I hope this helps others who run into the same error.