feat: add troubleshooting system prompt #2809
Signed-off-by: Alberto Falossi <afalossi@redhat.com>
Skipping CI for Draft Pull Request.

[APPROVALNOTIFIER] This PR is NOT APPROVED. This pull request has been approved by: (no approvers yet). The full list of commands accepted by this bot can be found here. Needs approval from an approver in each of these files. Approvers can indicate their approval by writing `/approve` in a comment.
ols/customize/ols/prompts.py
> the provided context.
> # OPENSHIFT CONTEXT
> - You operate in OpenShift, not plain Kubernetes. Use OpenShift-specific resources when appropriate.
> - Cluster version: OpenShift 4.20.13
We already have a mechanism for pulling the cluster version in code, so it should be possible to auto-inject the right version into the prompt too.
The RAG is already providing docs for the particular version, though. Did you observe better results when the version is also in the prompt?
Hopefully this avoids questions like "Are you running Kubernetes or OpenShift?" or "If you tell me which version you are running...". It doesn't happen often, but when it does, we don't exactly come across well.
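The auto-injection idea above could look something like this: render the `{version}` placeholder at prompt-build time instead of hard-coding "4.20.13". A minimal sketch; `get_cluster_version` stands in for whatever version-lookup mechanism the code already has and is not the real function name:

```python
# Hypothetical sketch: fill the {version} placeholder in the system prompt
# at render time, so the prompt always reflects the running cluster.

SYSTEM_PROMPT_TEMPLATE = """\
# OPENSHIFT CONTEXT
- You operate in OpenShift, not plain Kubernetes. Use OpenShift-specific resources when appropriate.
- Cluster version: OpenShift {version}
"""


def get_cluster_version() -> str:
    """Placeholder for the existing cluster-version lookup mechanism."""
    return "4.20.13"


def render_system_prompt() -> str:
    # Re-rendered on every turn, so an upgraded cluster is reflected
    # without restarting the conversation.
    return SYSTEM_PROMPT_TEMPLATE.format(version=get_cluster_version())
```

Since the prompt is reconstructed each turn anyway, the lookup result could also be cached with a short TTL to avoid hitting the API on every request.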
> # OPENSHIFT CONTEXT
> - You operate in OpenShift, not plain Kubernetes. Use OpenShift-specific resources when appropriate.
> - Cluster version: OpenShift 4.20.13
> - Current time: {time}
Having a reference time is a pretty common pattern. The LLM can better correlate past events with the current state and answer things like "2 hours ago".
But we need to be careful, especially when it comes to future developments. What happens if a conversation is archived and then retrieved? I checked, and the prompt is reconstructed and sent to the LLM on every turn, @onmete can you confirm? If so, it should be fine.
Security: if someone manages prompt injection through tool output or user input, having the exact server time in the system prompt gives them one more piece of information. But the time is already visible in API response headers and tool outputs anyway.
Maybe I will add some rounding, but it's probably overkill, wdyt?
This would need to surface on multiple levels. I would skip it unless we have a concrete reason or a measurable improvement to justify it.
- Emphasize troubleshooting and diagnostics with structured response formats (diagnosis, assessment, question)
- Add investigation protocol and metrics workflow
- Inject {version} template variable into prompt context
PR needs rebase. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
Description
I am opening this PR to test, brainstorm, and refine the prompt.
I have currently overwritten the existing system prompt to facilitate testing. Once the content has been decided, I will update the PR with the correct implementation approach.
Type of change
Related Tickets & Documents
Checklist before requesting a review
Testing