Hacker News
__turbobrew__
3 days ago
on: The "confident idiot" problem: Why AI needs hard r...
The day an LLM responds to my question with another question will be quite interesting. At work especially, when someone asks me a question, I often need to ask for clarifying information before I can answer the original question fully.
djeastm
3 days ago
Have you tried adding a system prompt asking for this behavior? They seem to readily oblige when I ask for it (e.g., for brainstorming).
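A minimal sketch of what such a system prompt might look like, using the common chat-message format (`system`/`user` roles). The prompt wording and the `build_messages` helper are illustrative assumptions, not a quote from any particular provider's docs:

```python
# Hypothetical system prompt nudging the model to ask clarifying
# questions before answering, instead of guessing at intent.
SYSTEM_PROMPT = (
    "Before answering, decide whether the question is ambiguous or "
    "missing key details. If it is, respond only with one or two "
    "clarifying questions. Once you have enough context, answer fully."
)

def build_messages(user_question: str) -> list[dict]:
    """Assemble a chat request that encourages clarifying questions."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_question},
    ]

messages = build_messages("Can you make the service faster?")
```

The resulting `messages` list can be passed to any chat-completion-style API; the key idea is simply that the instruction lives in the system role, so it applies to every user turn.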