Extract the main point of a conversation
Summarize this dialogue:
Customer: Please connect me with a support agent.
AI: Hi there, how can I assist you today?
Customer: I forgot my password and lost access to the email affiliated to my account. Can you please help me?
AI: Yes of course. First I'll need to confirm your identity and then I can connect you with one of our support agents.
TLDR: A customer lost access to their account.
--
Summarize this dialogue:
AI: Hi there, how can I assist you today?
Customer: I want to book a product demo.
AI: Sounds great. What country are you located in?
AI: I'll connect you with a support agent who can get something scheduled for you.
TLDR: A customer wants to book a product demo.
--
Summarize this dialogue:
AI: Hi there, how can I assist you today?
Customer: I want to get more information about your pricing.
AI: I can pull this for you, just a moment.
TLDR:
A customer wants to get more information about pricing.
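The three examples above all follow one template: a `Summarize this dialogue:` header, the dialogue turns, a `TLDR:` line, and a `--` delimiter between examples, with the final block's `TLDR:` left blank for the model to complete. A minimal sketch of a helper that assembles such a few-shot prompt (the function name and signature are illustrative, not part of the Cohere SDK):

```python
def build_summarization_prompt(examples, new_dialogue):
    """Assemble a few-shot summarization prompt.

    examples: list of (dialogue, summary) pairs used as demonstrations.
    new_dialogue: the dialogue the model should summarize.
    """
    blocks = [
        "Summarize this dialogue:\n{}\nTLDR: {}".format(dialogue, summary)
        for dialogue, summary in examples
    ]
    # The final block leaves TLDR: empty so the model fills it in.
    blocks.append("Summarize this dialogue:\n{}\nTLDR:".format(new_dialogue))
    # Examples are separated by the same "--" used as the stop sequence.
    return "\n--\n".join(blocks)
```

The resulting string can be passed directly as the `prompt` argument of the generate call shown below.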
```python
import cohere

co = cohere.Client('{apiKey}')

response = co.generate(
    model='xlarge',
    prompt='Summarize this dialogue:\nCustomer: Please connect me with a support agent.\nAI: Hi there, how can I assist you today?\nCustomer: I forgot my password and lost access to the email affiliated to my account. Can you please help me?\nAI: Yes of course. First I\'ll need to confirm your identity and then I can connect you with one of our support agents.\nTLDR: A customer lost access to their account.\n--\nSummarize this dialogue:\nAI: Hi there, how can I assist you today?\nCustomer: I want to book a product demo.\nAI: Sounds great. What country are you located in?\nAI: I\'ll connect you with a support agent who can get something scheduled for you.\nTLDR: A customer wants to book a product demo.\n--\nSummarize this dialogue:\nAI: Hi there, how can I assist you today?\nCustomer: I want to get more information about your pricing.\nAI: I can pull this for you, just a moment.\nTLDR:',
    max_tokens=20,
    temperature=0.6,
    k=0,
    p=1,
    frequency_penalty=0,
    presence_penalty=0,
    stop_sequences=["--"],
    return_likelihoods='NONE')

print('Prediction: {}'.format(response.generations[0].text))
```
| Parameter         | Value  |
| ----------------- | ------ |
| model_size        | xlarge |
| frequency_penalty | 0      |
| k                 | 0      |
| max_tokens        | 10     |
| p                 | 1      |
| presence_penalty  | 0      |
| stop_sequence     | --     |
| temperature       | 0.6    |
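Setting the stop sequence to `--` mirrors the delimiter between the few-shot examples, so generation halts as soon as the model tries to start a new example block after the summary. A minimal illustration of how a stop sequence truncates text (an illustrative helper, not the API's internal implementation):

```python
def apply_stop_sequence(text, stop="--"):
    """Cut generated text at the first occurrence of the stop sequence."""
    idx = text.find(stop)
    # If the stop sequence never appears, return the text unchanged.
    return text if idx == -1 else text[:idx]
```

In practice the API performs this truncation server-side, so the returned generation already ends before the `--` delimiter.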