OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'.
The full error message is as follows:
OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'openai/clip-vit-large-patch14' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.
This is caused by the huggingface.co site being unreachable.
Create a local folder openai/clip-vit-large-patch14 and download all the files from the corresponding page on the official site into it [1][2][3].
Official URL: https://huggingface.co/openai/clip-vit-large-patch14/tree/main
Domestic mirror: the ModelScope repository cloned below. Alternatively, Baidu Netdisk:
Link: https://pan.baidu.com/s/1pmOuyaRnLcc8ee-8_jtb1g?pwd=ukyi  extraction code: ukyi
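Before retrying, it can help to sanity-check that the local folder actually contains the files the tokenizer needs. A minimal sketch, assuming the standard file names listed in the openai/clip-vit-large-patch14 repo:

```python
import os

def missing_tokenizer_files(local_dir):
    """Return the CLIPTokenizer files that are absent from local_dir.

    The file names below match the tokenizer files listed in the
    openai/clip-vit-large-patch14 repo on Hugging Face.
    """
    required = [
        "vocab.json",
        "merges.txt",
        "tokenizer_config.json",
        "special_tokens_map.json",
    ]
    return [f for f in required if not os.path.exists(os.path.join(local_dir, f))]

# Example: check the local folder that mirrors the repo id.
print(missing_tokenizer_files("openai/clip-vit-large-patch14"))
```

If this prints any file names, download those files again before calling from_pretrained.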
Alternatively, create a local folder openai and, inside that path, run git clone https://www.modelscope.cn/AI-ModelScope/clip-vit-large-patch14.git to download the files automatically [4]. Note, however, that this may fail with safetensors_rust.SafetensorError, as follows:
File "/home/xxx/.conda/envs/xxx/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3503, in from_pretrained
with safe_open(resolved_archive_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
The cause is that some of the larger files were not fully downloaded (the clone may have left them as Git LFS pointer files rather than the real weights); re-download them manually and overwrite the incomplete copies.
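The HeaderTooLarge error typically appears when the .safetensors file on disk is a tiny Git LFS pointer (a text file starting with "version https://git-lfs...") instead of the actual weights, so safetensors misreads the pointer text as an absurdly large header length. A quick check, assuming that cause:

```python
import os

def is_lfs_pointer(path):
    # A Git LFS pointer is a tiny text file whose first line starts with
    # "version https://git-lfs.github.com/spec/v1"; real safetensors
    # weights begin with an 8-byte binary header length instead.
    if os.path.getsize(path) > 1024:
        return False
    with open(path, "rb") as f:
        return f.read(12).startswith(b"version http")
```

Run this over each large file in the cloned repo and manually re-download any file it flags.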
Another option is the mirror site https://hf-mirror.com [5] (this did not work when I tested it). Run pip install -U huggingface_hub, then set the HF_ENDPOINT environment variable [6] (unverified):
export HF_ENDPOINT=https://hf-mirror.com  # Linux
set HF_ENDPOINT=https://hf-mirror.com     # Windows
huggingface-cli download --resume-download InstantX/InstantID --local-dir checkpoints
huggingface_hub.utils._errors.LocalEntryNotFoundError:
The following error appears:
File "/home/xxx/.conda/envs/xxx/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1371, in hf_hub_download
raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
This is again a network problem. Following the blog post 【秒解决!!huggingface_hub.utils._errors.LocalEntryNotFoundError】, switch the huggingface endpoint to its mirror site:
import os
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # must run before importing transformers/huggingface_hub
or set it on the command line:
HF_ENDPOINT=https://hf-mirror.com python xxx.py
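To illustrate what this switch changes: huggingface_hub reads HF_ENDPOINT once at import time and builds download URLs of the form <endpoint>/<repo_id>/resolve/<revision>/<filename>, a path layout hf-mirror.com shares. A small sketch under that assumption:

```python
import os

# Must be set before huggingface_hub is imported, since the library
# reads HF_ENDPOINT at import time.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

def resolve_url(repo_id, filename, revision="main"):
    # Standard Hub "resolve" URL layout, also served by hf-mirror.com.
    return f"{os.environ['HF_ENDPOINT']}/{repo_id}/resolve/{revision}/{filename}"

print(resolve_url("openai/clip-vit-large-patch14", "vocab.json"))
# → https://hf-mirror.com/openai/clip-vit-large-patch14/resolve/main/vocab.json
```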
[1] OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'
[2] Stable-diffusion安装时Can't load tokenizer for 'openai/clip-vit-large-patch14'问题解决
[3] 【debug】OSError: Can't load tokenizer for 'XXX'. If you were trying to load it from 'https://huggingf
[4] StableDiffusion搭建[报错] OSError openai/clip-vit-large-patch14
[5] 解决diffusion部署时,无法从'huggingface.co'下载'openai/clip-vit-large-patch14'导致的报错