
transformers + peft: loading a LoRA model; streaming output with TextStreamer; using the KV cache


1. Loading a LoRA model with transformers + peft

Reference: https://github.com/hiyouga/LLaMA-Factory/blob/cae47379079ff811aa385c297481a27020a8da6b/scripts/loftq_init.py#L13


Code:

from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("/ai/loong/Qwen1.5-7B-Chat")

# Load the base model first ...
model = AutoModelForCausalLM.from_pretrained(
    "/ai/loong/Qwen1.5-7B-Chat", trust_remote_code=True, device_map="auto"
)

# ... then attach the LoRA adapter from the fine-tuning checkpoint on top of it
model = PeftModel.from_pretrained(model, "/ai/loong/output/checkpoint-300", offload_folder="./")

model.eval()
inputs = tokenizer("你是谁", return_tensors="pt").to(model.device)  # "Who are you?"

outputs = model.generate(input_ids=inputs["input_ids"], max_new_tokens=500)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
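If the adapter no longer needs to be hot-swappable, it can also be merged into the base weights so that inference requires no peft at all. A minimal sketch using peft's standard merge_and_unload() call on the PeftModel from above; the output path here is a made-up example:

# Merge the LoRA weights into the base model and drop the peft wrappers.
# The save path below is hypothetical; choose your own.
merged_model = model.merge_and_unload()
merged_model.save_pretrained("/ai/loong/output/qwen1.5-7b-merged")
tokenizer.save_pretrained("/ai/loong/output/qwen1.5-7b-merged")

The merged checkpoint can then be loaded later with a plain AutoModelForCausalLM.from_pretrained, with no adapter step.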

2. Streaming output with TextStreamer

Reference: https://zhuanlan.zhihu.com/p/694576810

from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
import torch

tokenizer = AutoTokenizer.from_pretrained("/ai/loong/Qwen1.5-7B-Chat")

model = AutoModelForCausalLM.from_pretrained(
    "/ai/loong/Qwen1.5-7B-Chat", trust_remote_code=True, device_map="auto"
)
model = PeftModel.from_pretrained(model, "/ai/loong/output/checkpoint-300", offload_folder="./")
model.eval()

inputs = tokenizer("听说你以前叫通义千问", return_tensors="pt").to(model.device)  # "I heard you used to be called Tongyi Qianwen"
streamer = TextStreamer(tokenizer)

# Despite returning the usual output, the streamer will also print the generated text to stdout, token by token.
model.generate(**inputs, streamer=streamer, max_new_tokens=20)
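TextStreamer prints straight to stdout. If the tokens need to be consumed programmatically (for a web UI, say), transformers also provides TextIteratorStreamer, which exposes the generated text as an iterator; generate() then has to run in a background thread, since it blocks until decoding finishes. A minimal sketch reusing the model and tokenizer from above:

from threading import Thread
from transformers import TextIteratorStreamer

inputs = tokenizer("听说你以前叫通义千问", return_tensors="pt").to(model.device)
streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# generate() blocks until decoding finishes, so run it in a worker thread
# and drain the streamer on the main thread
thread = Thread(target=model.generate, kwargs=dict(**inputs, streamer=streamer, max_new_tokens=20))
thread.start()
for chunk in streamer:
    print(chunk, end="", flush=True)
thread.join()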

3. Using the KV cache

Passing use_cache=True to generate() enables the KV cache (it is already the default for most transformers models): each attention layer's key/value tensors are stored after every step, so each new token only attends over the cached prefix instead of re-encoding the whole sequence.

model.eval()
inputs = tokenizer("你是谁", return_tensors="pt").to(model.device)

# use_cache=True reuses the cached key/value tensors between decoding steps
outputs = model.generate(input_ids=inputs["input_ids"], max_new_tokens=500, use_cache=True)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
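To see what the cache buys, the same decoding can be written by hand with the model's forward pass: on every step after the first, only the newest token is fed in, together with the past_key_values returned by the previous step. A greedy-decoding sketch, reusing the model and tokenizer from above:

import torch

model.eval()
input_ids = tokenizer("你是谁", return_tensors="pt").to(model.device)["input_ids"]

past_key_values = None
generated = input_ids
with torch.no_grad():
    for _ in range(20):
        if past_key_values is None:
            # first step: run the whole prompt and build the cache
            out = model(input_ids=generated, use_cache=True)
        else:
            # later steps: feed only the newest token and reuse the cached K/V
            out = model(input_ids=generated[:, -1:], past_key_values=past_key_values, use_cache=True)
        past_key_values = out.past_key_values
        next_token = out.logits[:, -1, :].argmax(dim=-1, keepdim=True)
        generated = torch.cat([generated, next_token], dim=-1)

print(tokenizer.decode(generated[0], skip_special_tokens=True))

Without the cache, each step would re-encode the entire prefix from scratch, so generation cost grows much faster with output length.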