
LlamaForCausalLM.from_pretrained in the Hugging Face transformers library

Beginner's notes.

A usage example for LlamaForCausalLM; this is an inference snippet.

from transformers import AutoTokenizer, LlamaForCausalLM

# PATH_TO_CONVERTED_WEIGHTS / PATH_TO_CONVERTED_TOKENIZER are placeholders
# for the local directories holding the converted LLaMA checkpoint and tokenizer.
model = LlamaForCausalLM.from_pretrained(PATH_TO_CONVERTED_WEIGHTS)
tokenizer = AutoTokenizer.from_pretrained(PATH_TO_CONVERTED_TOKENIZER)

prompt = "Hey, are you conscious? Can you talk to me?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 30 tokens (prompt included), then decode back to text
generate_ids = model.generate(inputs.input_ids, max_length=30)
tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]
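With default settings, model.generate above performs greedy decoding: it repeatedly runs a forward pass, appends the highest-scoring next token, and stops at max_length or an end-of-sequence token. A toy sketch of that loop, using a fake next-token scorer in place of the real LLaMA network (fake_next_token_scores and its vocabulary are invented here purely for illustration):

```python
def fake_next_token_scores(token_ids):
    # Hypothetical stand-in for a model forward pass over a 10-token
    # vocabulary: the highest score goes to (last token + 1) % 10.
    vocab_size = 10
    scores = [0.0] * vocab_size
    scores[(token_ids[-1] + 1) % vocab_size] = 1.0
    return scores

def greedy_generate(input_ids, max_length, eos_token_id=9):
    # Greedy decoding loop: append the argmax token until max_length
    # (prompt tokens count toward the limit) or the EOS token appears.
    ids = list(input_ids)
    while len(ids) < max_length:
        scores = fake_next_token_scores(ids)
        next_id = max(range(len(scores)), key=scores.__getitem__)  # argmax
        ids.append(next_id)
        if next_id == eos_token_id:
            break
    return ids

print(greedy_generate([3, 4], max_length=6))  # → [3, 4, 5, 6, 7, 8]
```

The real generate call also handles attention masks, batching, and sampling strategies (temperature, top-k, beam search), but the stopping behavior of max_length is the same: it bounds the total sequence length, prompt included.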

Reference:

Llama2
https://huggingface.co/docs/transformers/v4.32.1/en/model_doc/llama2#transformers.LlamaForCausalLM
