Category: "Natural Language Processing: From Beginner to Application" (full table of contents)

Using Few-Shot Examples
This section describes how to use few-shot examples with chat models (Chat Models). There is no firm consensus yet on the best way to do few-shot prompting, so we have not committed to any abstraction for it; instead, we use the existing abstractions.
Alternating AI/Human Messages

The first way to do few-shot prompting is with alternating AI/human messages. Here is an example:
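Before the LangChain code, it can help to see what alternating few-shot turns look like as raw chat messages. The sketch below uses plain dicts in the OpenAI chat format; the helper name `few_shot_messages` is hypothetical, for illustration only:

```python
# A minimal sketch of alternating few-shot turns as raw chat messages,
# using plain dicts in the OpenAI chat format (not LangChain objects).
def few_shot_messages(text):
    return [
        # System message setting the assistant's behavior
        {"role": "system",
         "content": "You are a helpful assistant that translates english to pirate."},
        # One example exchange: a human turn followed by an AI turn
        {"role": "user", "content": "Hi"},
        {"role": "assistant", "content": "Argh me mateys"},
        # The real input fills the {text} slot
        {"role": "user", "content": text},
    ]

print(few_shot_messages("I love programming.")[-1])
```

The LangChain message prompt templates in the example below build this same alternating structure, substituting `{text}` at run time.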
from langchain.chat_models import ChatOpenAI
from langchain import LLMChain
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

chat = ChatOpenAI(temperature=0)

template = "You are a helpful assistant that translates english to pirate."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
example_human = HumanMessagePromptTemplate.from_template("Hi")
example_ai = AIMessagePromptTemplate.from_template("Argh me mateys")
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

chat_prompt = ChatPromptTemplate.from_messages(
    [system_message_prompt, example_human, example_ai, human_message_prompt]
)
chain = LLMChain(llm=chat, prompt=chat_prompt)
# Get a chat completion from the formatted messages
chain.run("I love programming.")

Output:
"I be lovin' programmin', me hearty!"

System Messages
OpenAI provides an optional name parameter, which we also recommend using together with system messages for few-shot prompting. Here is an example of how to use this feature:
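In terms of the raw OpenAI chat payload, this variant sends the example turns as system messages tagged with the optional name field. A minimal sketch with plain dicts (example_user / example_assistant are just conventional values; the helper name is hypothetical):

```python
# A minimal sketch of few-shot examples carried in system messages
# via OpenAI's optional "name" field (plain dicts, not LangChain objects).
def named_few_shot_messages(text):
    return [
        {"role": "system",
         "content": "You are a helpful assistant that translates english to pirate."},
        # Example turns are system messages distinguished by "name"
        {"role": "system", "name": "example_user", "content": "Hi"},
        {"role": "system", "name": "example_assistant", "content": "Argh me mateys"},
        # The real input is an ordinary user message
        {"role": "user", "content": text},
    ]
```

In the LangChain code below, additional_kwargs={"name": ...} is what attaches this field to each system message.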
template = "You are a helpful assistant that translates english to pirate."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
example_human = SystemMessagePromptTemplate.from_template(
    "Hi", additional_kwargs={"name": "example_user"}
)
example_ai = SystemMessagePromptTemplate.from_template(
    "Argh me mateys", additional_kwargs={"name": "example_assistant"}
)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

chat_prompt = ChatPromptTemplate.from_messages(
    [system_message_prompt, example_human, example_ai, human_message_prompt]
)
chain = LLMChain(llm=chat, prompt=chat_prompt)
# Get a chat completion from the formatted messages
chain.run("I love programming.")

Output:
"I be lovin' programmin', me hearty!"

Streaming Responses
This section describes how to use streaming with chat models:
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

chat = ChatOpenAI(
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()],
    temperature=0,
)
resp = chat([HumanMessage(content="Write me a song about sparkling water.")])

Output:
Verse 1:
Bubbles rising to the top
A refreshing drink that never stops
Clear and crisp, it's pure delight
A taste that's sure to excite

Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe

Verse 2:
No sugar, no calories, just pure bliss
A drink that's hard to resist
It's the perfect way to quench my thirst
A drink that always comes first

Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe

Bridge:
From the mountains to the sea
Sparkling water, you're the key
To a healthy life, a happy soul
A drink that makes me feel whole

Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe

Outro:
Sparkling water, you're the one
A drink that's always so much fun
I'll never let you go, my friend
Sparkling
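The token-by-token printing above comes from LangChain's callback mechanism: StreamingStdOutCallbackHandler implements on_llm_new_token, which the chat model calls for each new token as it arrives. The pattern can be sketched without LangChain as follows (fake_stream is a hypothetical stand-in for a streaming model, not real library code):

```python
# A minimal sketch of the streaming-callback pattern: the producer pushes
# each token to a handler's on_llm_new_token method as it is generated.
class StdOutTokenHandler:
    def on_llm_new_token(self, token):
        # Print immediately, without buffering a full line
        print(token, end="", flush=True)

def fake_stream(text, handler):
    # Hypothetical stand-in for a streaming LLM: emit the reply word by word.
    for word in text.split():
        handler.on_llm_new_token(word + " ")

fake_stream("Bubbles rising to the top", StdOutTokenHandler())
```

Because the handler receives tokens as they are produced, output appears incrementally instead of all at once when the response finishes.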