
Expose internal ReqLLM data in action response #182

@sezaru

Description

Code of Conduct

  • I agree to follow this project's Code of Conduct

AI Policy

  • I agree to follow this project's AI Policy, or I agree that AI was not used while creating this issue.

Is your feature request related to a problem? Please describe.

Sometimes there is information that can only be retrieved directly from the ReqLLM data, and AshAI currently discards it.

For example, if I set a token limit, I may want to check whether the LLM generation was cut off by inspecting the finish_reason field in the ReqLLM metadata.
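For reference, this is roughly what checking that field looks like when calling ReqLLM directly (a minimal sketch: the model string and prompt are placeholders, and reading finish_reason straight off the response struct is an assumption based on the metadata field named above):

```elixir
# Minimal sketch of checking why generation stopped when using ReqLLM
# directly. Model string and prompt are placeholders; reading
# `finish_reason` straight off the response is an assumption based on
# the metadata field this issue refers to.
{:ok, response} =
  ReqLLM.generate_text("openai:gpt-4o", "Summarize the report...", max_tokens: 50)

case response.finish_reason do
  # The token limit cut generation short; this is exactly the signal
  # AshAI should surface to the caller.
  :length -> {:truncated, response}
  # The model stopped on its own.
  _ -> {:complete, response}
end
```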

Describe the solution you'd like

I'm not sure where this data should be stored; maybe in the context? Regardless of where it lives, giving the user a way to retrieve it would be great, either by somehow returning it with the action's return value or, if that is not possible, by providing some way to handle it inside the action itself. In the latter case, it would also be great if we could update the input data and request a retry: in the max-tokens example above, I could check finish_reason and, if generation stopped midway because it ran out of tokens, increase the token limit and retry.
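To make that concrete, here is one possible shape for such a hook (entirely hypothetical: the :on_response option and the {:retry, new_opts} return value do not exist in AshAI today, and the run prompt syntax is only loosely modeled on AshAI's prompt-backed actions):

```elixir
# Entirely hypothetical sketch of the requested feature. The :on_response
# option and the {:retry, new_opts} return value do not exist in AshAI
# today; they only illustrate what this issue asks for.
run prompt(
      "openai:gpt-4o",
      max_tokens: 512,
      # Hypothetical callback invoked with the ReqLLM metadata after each call.
      on_response: fn metadata, opts ->
        if metadata.finish_reason == :length do
          # Generation was truncated by the token limit: double the budget
          # and ask AshAI to retry the ReqLLM call with the updated options.
          {:retry, Keyword.put(opts, :max_tokens, opts[:max_tokens] * 2)}
        else
          # Accept the result; the metadata would also be exposed to the caller.
          :ok
        end
      end
    )
```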

Describe alternatives you've considered

No response

Additional context

No response

Labels

enhancement (New feature or request)
