Published 2024-11-23 21:35
When training the llama3.1-8B-Instruct model on Amazon SageMaker, the training job fails with the following output:
/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1150: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
Traceback (most recent call last):
File "/workspace/train.py", line 85, in <module>
main()
File "/workspace/train.py", line 48, in main
config = AutoConfig.from_pretrained(model_name, token=use_auth_token)
File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 1124, in from_pretrained
return config_class.from_dict(config_dict, **unused_kwargs)
File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 764, in from_dict
config = cls(**config_dict)
File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/configuration_llama.py", line 160, in __init__
self._rope_scaling_validation()
File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/configuration_llama.py", line 180, in _rope_scaling_validation
raise ValueError(
ValueError: `rope_scaling` must be a dictionary with with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
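The message shows why the job fails: the Llama 3.1 config ships a five-key `rope_scaling` dict, while the validator in this older transformers build only accepts a two-field `{type, factor}` dict. A minimal sketch of that check, reconstructed from the error message above (not the library's exact code):

```python
def validate_rope_scaling(rope_scaling):
    """Sketch of the pre-upgrade check implied by the traceback above."""
    if rope_scaling is None:
        return
    # Old-style rule: exactly two fields, `type` and `factor`
    if not isinstance(rope_scaling, dict) or len(rope_scaling) != 2:
        raise ValueError(
            "`rope_scaling` must be a dictionary with two fields, `type` and `factor`, "
            f"got {rope_scaling}"
        )

# The dict Llama 3.1 actually ships in its config.json:
llama3_cfg = {"factor": 8.0, "low_freq_factor": 1.0, "high_freq_factor": 4.0,
              "original_max_position_embeddings": 8192, "rope_type": "llama3"}
try:
    validate_rope_scaling(llama3_cfg)
except ValueError as e:
    print("rejected:", e)
```

The five-key dict is rejected before the model ever loads, which is exactly the failure in the traceback.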
I tried changing config.rope_scaling and applying it to the model, but it had no effect. Here is the snippet where I modify the config:
# Load the model configuration
# Note: per the traceback above, the ValueError is already raised inside
# this call, so the rope_scaling override below never gets a chance to run
config = AutoConfig.from_pretrained(model_name, token=use_auth_token)

# Modify the rope_scaling config
config.rope_scaling = {
    "type": "llama3",
    "factor": 8.0
}

# Initialize the model with the modified config
model = LlamaForCausalLM.from_pretrained(model_name, token=use_auth_token, config=config)
I just hit the same error with Llama 3.1 on transformers-4.41.0. Upgrading solved it:

pip install --upgrade transformers

With transformers-4.44.2 everything runs fine. For more details, see the related GitHub issue.
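A quick way to confirm the fix took before relaunching the SageMaker job: compare the installed version against the 4.44.2 reported as working above. A minimal sketch using a plain tuple comparison (assumes numeric dotted versions with no rc/dev suffixes; in a real script you would pass `transformers.__version__` as the argument):

```python
def at_least(installed: str, minimum: str = "4.44.2") -> bool:
    """True if `installed` is at least `minimum` (plain numeric dotted versions only)."""
    return tuple(map(int, installed.split("."))) >= tuple(map(int, minimum.split(".")))

# The failing and working versions from this thread:
print(at_least("4.41.0"))  # False: the version that raised the rope_scaling error
print(at_least("4.44.2"))  # True: the version confirmed working
```

For anything fancier than numeric dotted versions, `packaging.version.parse` is the more robust choice.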
Author: 黑洞官方问答小能手
Link: https://www.pythonheidong.com/blog/article/2045463/1c5a6bc1a36ebda6e9cf/
Source: python黑洞网