Model:
lambdalabs/pythia-1.4b-deduped-synthetic-instruct
This model is fine-tuned on Dahoas/synthetic-instruct-gptj-pairwise.
You can try the model in a demo hosted on Lambda Cloud.
Running inference with this model requires about 4 GB of GPU memory.
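That figure is roughly what the fp16 weights alone would suggest; the back-of-the-envelope check below is a rough estimate, not a measurement from the model card:

```python
# Rough estimate only: 1.4B parameters stored as float16 take 2 bytes each.
n_params = 1.4e9
weight_gb = n_params * 2 / 1e9
print(f"~{weight_gb:.1f} GB for the weights alone")  # ~2.8 GB

# Activations, the KV cache built up during generation, and CUDA overhead
# push the practical requirement to roughly 4 GB.
```

The usage example below accordingly loads the model with torch_dtype=torch.float16.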
```python
import torch
from transformers import AutoTokenizer, pipeline, StoppingCriteria, StoppingCriteriaList

device = torch.device("cuda:0") if torch.cuda.is_available() else torch.device("cpu")

model_name = "lambdalabs/pythia-1.4b-deduped-synthetic-instruct"
max_new_tokens = 2048
stop_token = "<|stop|>"


# Stop generation as soon as the model emits the <|stop|> token.
class KeywordsStoppingCriteria(StoppingCriteria):
    def __init__(self, keywords_ids: list):
        self.keywords = keywords_ids

    def __call__(
        self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs
    ) -> bool:
        if input_ids[0][-1] in self.keywords:
            return True
        return False


tokenizer = AutoTokenizer.from_pretrained(
    model_name,
)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.add_tokens([stop_token])

# Resolve the id of the stop token so the stopping criteria can match it.
stop_ids = [tokenizer.encode(w)[0] for w in [stop_token]]
stop_criteria = KeywordsStoppingCriteria(stop_ids)

generator = pipeline(
    "text-generation",
    model=model_name,
    device=device,
    max_new_tokens=max_new_tokens,
    torch_dtype=torch.float16,
    stopping_criteria=StoppingCriteriaList([stop_criteria]),
)

# Prompts follow the "Question: ...\nAnswer:" format used for fine-tuning.
example = "Can you give me some tips on how to save money every month."
text = "Question: {}\nAnswer:".format(example)
result = generator(
    text,
    num_return_sequences=1,
)
output = result[0]["generated_text"]
print(output)
```
Output:
```
Question: Can you give me some tips on how to save money every month. Answer:Create a budget and track your spending. 2. Cut down on unnecessary expenses, such as eating out, shopping, and entertainment. 3. Make a list of your monthly expenses and stick to it. 4. Take advantage of discounts and coupons when shopping. 5. Make sure to pay your bills on time to avoid late fees. 6. Save a portion of your income each month by investing it in a high-yield savings account. 7. Consider automating your savings by setting up a recurring transfer from your checking to a savings account. 8. Take advantage of free entertainment opportunities, such as going to the park or museum. 9. Look for ways to save on utilities, such as installing energy-efficient appliances. 10. Research and use public transportation to save on gas.<|stop|>
```
The model was trained on Dahoas/synthetic-instruct-gptj-pairwise. We split the original dataset into a training set (the first 32,000 examples) and a validation set (the remaining 1,144 examples).
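As an illustration only, a split like this can be reproduced with the Hugging Face datasets library; the sketch below assumes the dataset exposes a single train split and is not the actual preprocessing code used for training:

```python
from datasets import load_dataset

# Assumes the dataset ships a single "train" split, as on the dataset card.
ds = load_dataset("Dahoas/synthetic-instruct-gptj-pairwise", split="train")

# First 32,000 examples for training, the remaining 1,144 for validation.
train_ds = ds.select(range(32000))
val_ds = ds.select(range(32000, len(ds)))

print(len(train_ds), len(val_ds))
```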
We fine-tuned the model for 4 epochs. This took 2 hours on 8x A100 80GB GPUs, with batch_size_per_gpu set to 8 (so a global batch size of 64) and a learning rate of 0.00002 (decayed linearly to zero by the last training step). You can find more details in the Weights and Biases record here.
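For reference, those hyperparameters map onto a Hugging Face TrainingArguments configuration roughly as sketched below; the base checkpoint name and the use of the Trainer API are assumptions, not the actual training script:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments

base_model = "EleutherAI/pythia-1.4b-deduped"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# Values taken from the description above; everything else is illustrative.
training_args = TrainingArguments(
    output_dir="pythia-1.4b-deduped-synthetic-instruct",
    num_train_epochs=4,
    per_device_train_batch_size=8,  # 8 GPUs x 8 = global batch size 64
    learning_rate=2e-5,
    lr_scheduler_type="linear",     # decays to zero by the last step
    report_to="wandb",              # matches the Weights and Biases logging
)

# A Trainer would then be constructed with training_args plus the tokenized
# train/validation datasets from the split shown earlier, e.g.:
# Trainer(model=model, args=training_args,
#         train_dataset=..., eval_dataset=...).train()
```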