Conditional branching. Is there a better way? #3293
Replies: 3 comments
-
Hi @ArmykOliva, first I want to confirm with you: is the requirement to merge the above 6 nodes into 1 Python node? I.e., you don't need to see the logic structure in the flow graph.
-
Hi, yes, my idea would be to have one node where I define two prompts: one to check whether edits need to be made, and the other to make the edit. Basically, to save some output tokens and time, I just want to first check whether I need to make any changes with the LLM at all. If no changes are needed, return the original input; if changes were made, return the edited LLM output. If I were to just create a Python function for this, how would I pass the required prompts to the node? And how would I keep it compatible with dynamically setting the custom connection and LLM in the DAG editor? Also, what if it weren't a simple if/else but rather a large switch?
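Something like the sketch below is what I have in mind: the two prompts would be `PromptTemplate` inputs so they stay editable, and the connection would still be selectable in the DAG editor. This is only a rough sketch assuming the custom LLM tool pattern; `call_llm` is a hypothetical placeholder for the actual OpenAI/Azure OpenAI call, and the input names (`resume`, `requirements`) are just examples.

```python
# Rough sketch of a single "check, then maybe edit" node.
# Assumes the custom LLM tool pattern (PromptTemplate inputs + a connection);
# call_llm is a hypothetical helper standing in for the real chat completion call.
from jinja2 import Template
from promptflow import tool
from promptflow.connections import CustomConnection
from promptflow.contracts.types import PromptTemplate


def call_llm(connection: CustomConnection, prompt: str, model: str) -> str:
    """Hypothetical helper: send `prompt` to the model behind `connection`."""
    raise NotImplementedError


@tool
def check_then_edit(
    connection: CustomConnection,
    check_prompt: PromptTemplate,  # prompt that decides whether edits are needed
    edit_prompt: PromptTemplate,   # prompt that performs the edit
    resume: str,
    requirements: str,
    model: str = "gpt-4o",
) -> str:
    # Run the cheap "are edits needed?" check first.
    check_text = Template(
        check_prompt, trim_blocks=True, keep_trailing_newline=True
    ).render(resume=resume, requirements=requirements)
    verdict = call_llm(connection, check_text, model)

    # If no edits are needed, skip the expensive edit call and return the input unchanged.
    if "no" in verdict.strip().lower():
        return resume

    # Otherwise render the edit prompt and return the edited output.
    edit_text = Template(
        edit_prompt, trim_blocks=True, keep_trailing_newline=True
    ).render(resume=resume, requirements=requirements)
    return call_llm(connection, edit_text, model)
```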
-
If you look closely at my promptflow: I first evaluate with an LLM whether edits need to be made to a resume based on formatting requirements, or whether no edits are required. If edits are required, the LLM edits the template; if not, I just want to pass through the original data.
With the current promptflow, I first have to check the LLM's answer and pass it to a function, which then either calls the LLM or only parses the data. Then I have to join the two inputs and return whichever value is not None to my final node (roughly the helper sketched below).
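For context, the junction node is essentially just this (a rough sketch; it assumes the branch that didn't run produces None):

```python
# Minimal sketch of the "junction" node: the skipped branch yields None,
# so we simply forward whichever value actually exists.
from promptflow import tool


@tool
def pick_non_none(edited: str = None, original: str = None) -> str:
    return edited if edited is not None else original
```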
Is there a better way to do this? Is there a way to create a promptflow and then add it to a project as a single node, similar to how functions in programming let us avoid repeating code?
Maybe code a tool for that?
Thanks