olivebot/utils/openai_api/langchain_openai_api.py
guo zebin 4cfad5ae0f Annual overhaul update
- Brand-new UI
- Overhauled the WebSocket logic, improving the stability of the digital-human/UI connection and reducing resource overhead
- Overhauled the wake-up logic, providing a stable normal wake-up mode and a prefix-word wake-up mode
- Improved audio capture quality, with support for multi-channel microphone input
- Improved the integration mechanism for the auto-play server, providing a stable mode that is also compatible with older UE projects
- The digital-human interface now outputs robot expressions, to support expression output in the new Fay UI and on microcontrollers
- Adopted a more precise audio-duration calculation, allowing finer control of the logic that runs after playback finishes
- Fixed a bug where clicking the close button would cause the program to exit
- Fixed an error when enabling the microphone on devices that have none
- Added a configuration option for the server host address to simplify server deployment
2024-10-26 11:34:55 +08:00


"""
This script is designed for interacting with a local GLM3 AI model using the `ChatGLM3` class
from the `langchain_community` library. It facilitates continuous dialogue with the GLM3 model.
1. Start the Local Model Service: Before running this script, you need to execute the `api_server.py` script
to start the GLM3 model's service.
2. Run the Script: The script includes functionality for initializing the LLMChain object and obtaining AI responses,
allowing the user to input questions and receive AI answers.
"""
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain.schema.messages import HumanMessage, SystemMessage, AIMessage
from langchain_community.llms.chatglm3 import ChatGLM3

def initialize_llm_chain(messages: list):
    """Build an LLMChain backed by the local ChatGLM3 endpoint, seeded with the conversation history."""
    template = "{input}"
    prompt = PromptTemplate.from_template(template)
    endpoint_url = "http://127.0.0.1:8000/v1/chat/completions"
    llm = ChatGLM3(
        endpoint_url=endpoint_url,
        max_tokens=8096,
        prefix_messages=messages,  # prior turns are passed to the model as prefix messages
        top_p=0.9,
    )
    return LLMChain(prompt=prompt, llm=llm)

def get_ai_response(llm_chain, user_message):
    """Invoke the chain with the user's message and return the raw response dict."""
    ai_response = llm_chain.invoke({"input": user_message})
    return ai_response

def continuous_conversation():
    """Run an interactive loop, rebuilding the chain each turn with the updated history."""
    messages = [
        SystemMessage(content="You are an intelligent AI assistant, named ChatGLM3."),
    ]
    while True:
        user_input = input("Human (or 'exit' to quit): ")
        if user_input.lower() == 'exit':
            break
        # Re-initialize the chain so the growing history is passed as prefix messages.
        llm_chain = initialize_llm_chain(messages=messages)
        ai_response = get_ai_response(llm_chain, user_input)
        print("ChatGLM3: ", ai_response["text"])
        messages += [
            HumanMessage(content=user_input),
            AIMessage(content=ai_response["text"]),
        ]

if __name__ == "__main__":
    continuous_conversation()
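The per-turn history accumulation in `continuous_conversation` can be sketched without langchain, using the plain role/content dicts that an OpenAI-compatible `/v1/chat/completions` endpoint ultimately receives. This is a minimal illustration only; `append_turn` is a hypothetical helper, not part of this module:

```python
def append_turn(history, user_text, ai_text):
    """Record one completed exchange in OpenAI chat-message format.

    Mirrors the HumanMessage/AIMessage pair appended to `messages` above,
    but with the raw dict shape the endpoint consumes.
    """
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": ai_text})
    return history


# Seed with the same system prompt used in continuous_conversation().
history = [{"role": "system", "content": "You are an intelligent AI assistant, named ChatGLM3."}]
append_turn(history, "Hello", "Hi! How can I help?")
```

Because the script rebuilds the chain every turn with the full `messages` list, the model always sees the complete conversation so far; the cost is that request size grows with every exchange.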