Model:
mayaeary/pygmalion-6b-4bit-128g
GPTQ quantization of https://huggingface.co/PygmalionAI/pygmalion-6b/commit/b8344bb4eb76a437797ad3b19420a13922aaabe1
Quantized with this repository: https://github.com/mayaeary/GPTQ-for-LLaMa/tree/gptj-v2
Command:
python3 gptj.py models/pygmalion-6b_b8344bb4eb76a437797ad3b19420a13922aaabe1 c4 --wbits 4 --groupsize 128 --save_safetensors models/pygmalion-6b-4bit-128g.safetensors
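Below is a minimal sketch of loading the resulting 4-bit safetensors checkpoint for inference. It uses the AutoGPTQ library rather than the GPTQ-for-LLaMa fork's own loader, so it is an assumption, not the workflow used above; the local path and basename are hypothetical and must point at wherever the .safetensors file was saved, with the base model's config and tokenizer files available alongside it.

```python
# Sketch: load the 4-bit GPTQ checkpoint with AutoGPTQ (assumed loader, not the original fork's).
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
from transformers import AutoTokenizer

# Hypothetical locations -- adjust to where the .safetensors file and model config actually live.
model_dir = "models/pygmalion-6b-4bit-128g"
basename = "pygmalion-6b-4bit-128g"  # file name without the .safetensors extension

# Quantization parameters mirror the command above: 4-bit weights, group size 128.
quantize_config = BaseQuantizeConfig(bits=4, group_size=128)

tokenizer = AutoTokenizer.from_pretrained("PygmalionAI/pygmalion-6b")
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    model_basename=basename,
    quantize_config=quantize_config,
    use_safetensors=True,
    device="cuda:0",
)

prompt = "You are a friendly chatbot.\nYou: Hello!\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Whichever loader is used, the bit width and group size passed at load time have to match the values used during quantization (4 and 128 here), since they determine how the packed weights are unpacked.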