Conversational AI Development Tutorials, Guides & Insights
Unlock 1+ expert-curated conversational AI tutorials, real-world code snippets, and modern dev strategies. From fundamentals to advanced topics, boost your conversational AI skills on DeveloperBreeze.
Tutorial
python
Build a Simple AI Chatbot with Python
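This example relies on Hugging Face's transformers library and PyTorch, as the imports below show. If they are not already available in your environment, they can usually be installed with pip (the package names here are the standard ones; pin versions as needed for your setup):

pip install transformers torch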
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the pre-trained model and tokenizer
model_name = "microsoft/DialoGPT-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Initialize chat history
chat_history_ids = None

def chat_with_bot(user_input):
    global chat_history_ids

    # Encode the new user input, append the eos_token, and return a PyTorch tensor
    new_user_input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors='pt')

    # Append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if chat_history_ids is not None else new_user_input_ids

    # Generate a response
    chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)

    # Decode only the newly generated tokens (the bot's latest reply)
    bot_response = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    return bot_response

if __name__ == "__main__":
    print("Start chatting with the AI chatbot (type 'exit' to stop)!")
    while True:
        user_input = input("You: ")
        if user_input.lower() == "exit":
            break
        bot_response = chat_with_bot(user_input)
        print(f"Bot: {bot_response}")

Run the script in your terminal or command prompt.
Aug 04, 2024