Answer out of context #2
Open
Description
Hello.

I'm using your PHP library to connect to the Replicate API. I send prompts with the create method to create a prediction. The problem appears when I want to send a second prompt such as "continue" to get a longer answer, the way it works on https://llama.replicate.dev/.

Even when I start the session with the same ID parameters in the input, the model answers out of context, as if it had no memory of the previous prompt.

What do I need to do so that the answers stay in the context of my earlier prompts?
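For context: Replicate predictions are stateless, so each create call is independent and reusing an ID does not restore an earlier conversation. Chat frontends like llama.replicate.dev keep context by resending the whole conversation inside each new prompt. A minimal sketch of that idea (the `User:`/`Assistant:` turn format and the `build_prompt` helper are illustrative assumptions, not part of any library's API):

```python
def build_prompt(history, new_message):
    """Flatten prior (user, assistant) turns plus the new message
    into a single prompt string that carries the full context."""
    lines = []
    for user_msg, assistant_msg in history:
        lines.append(f"User: {user_msg}")
        lines.append(f"Assistant: {assistant_msg}")
    # Append the new user turn and leave an open slot for the model's reply.
    lines.append(f"User: {new_message}")
    lines.append("Assistant:")
    return "\n".join(lines)

# Example: the second request includes the first exchange, so a prompt
# like "continue" is interpreted in context.
history = [("Tell me a story about a dragon.", "Once upon a time...")]
prompt = build_prompt(history, "continue")
print(prompt)
```

You would then pass this combined string as the prompt input of each new prediction, appending the model's reply to the history after every call.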
Metadata
Assignees
Labels
No labels