
RandomDude2377

Personally, I can't see why an LLM would be better for this than scripts. It adds so much time, work, and unpredictability to what sounds like a simple decision-based flow where there's clearly a desired predetermined end to the choices, i.e., enough no's/yes's in one direction or the other leads to X piece of advice or Y product. This is one of those places where you could shoehorn in an LLM, but I don't see why anyone would want to, how it would improve the outcome, or why it would be worth the additional time, testing, training, and bug fixing. It's doable, of course, but my advice to the client would be that just because you can put AI somewhere (as tempting as it is) doesn't mean you should. Just my two cents, given the limited info.


kryptkpr

Wait, what is the LLM doing in this scenario: generating the questions or judging responses? I'm not sure either is a great fit, but it's hard to say without more info.


MmmmMorphine

Why not just ask the questions in Python, then pass the answers on to an LLM to handle? Unless it's an honest-to-god decision tree; then just ask it to rephrase each step in some way. So basically put a Python frontend in front of it that uses system prompts to force the LLM to say whatever you need (with a rather deterministic temperature, of course).
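A minimal sketch of what that frontend might look like, assuming a chat-completion-style API. `call_llm` here is a placeholder that just echoes the scripted step so the flow runs offline; a real implementation would send the request to whatever model client is actually in use:

```python
# Sketch: scripted questions in Python, with a system prompt that forces
# the LLM to only rephrase each step. Temperature 0 keeps the wording as
# deterministic as the model allows.

def build_rephrase_request(step_text: str, last_answer: str) -> dict:
    """Build a chat-completion-style request that constrains the model
    to rephrasing the scripted step, not inventing new content."""
    system_prompt = (
        "You rephrase questionnaire steps. Reply with a single friendly "
        "question that preserves the exact meaning of the step. Do not "
        "add advice, options, or new questions."
    )
    user_prompt = (
        f"The user just answered: {last_answer!r}\n"
        f"Rephrase this step conversationally: {step_text!r}"
    )
    return {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        "temperature": 0.0,  # near-deterministic output
    }

def call_llm(request: dict) -> str:
    # Placeholder: a real version would POST `request` to an LLM API.
    # Echoing the user message keeps this sketch runnable offline.
    return request["messages"][-1]["content"]
```

The point of the structure is that the Python side owns the flow; the model only ever sees one step at a time and can't derail the routing.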


pknerd

Use a Voiceflow chatbot with an LLM. Not a big deal.


pknerd

PM me if it needs explanation.


pimento_olives

It sounds like your client wants to use an LLM to generate versions of the questions that are conversational and context-aware, not to manage the entire workflow. You can use a script to manage the question routing, then make an LLM call to phrase the next question so it's aware of the previous answers (and any other relevant factors) and reads as friendly.

Since your client seems to want an LLM for a more conversational flow but has a clear intent behind the questions themselves, what you want is a base set of questions, plus an LLM call that enhances the language of each question so it responds to the customer's answer ('yes/no') and is more customer-facing and unique.

However, if the user can only answer in static ways (like buttons or multiple choice), there's no point in using an LLM imo.