Merged
Feat/llm responses #376
234 commits
f15be68
Started working on llm_responses
NotBioWaste905 56b7789
Created class, created 1st tutorial
NotBioWaste af60115
Added dependencies for langchain
NotBioWaste b3b79a5
Fixed adding custom prompt for each node
NotBioWaste 6eb910d
Added image processing, updated tutorial
NotBioWaste 1f8cddc
Added typehint
NotBioWaste 74cd954
Added llm_response, LLM_API, history management
NotBioWaste 1fd31a2
Fixed image reading
NotBioWaste 2c48490
Started llm condition
NotBioWaste a1884e5
Added message_to_langchain
NotBioWaste 61f302e
Implementing deepeval integration
NotBioWaste 38a8f8f
Figured out how to implement DeepEval functions
NotBioWaste905 592267f
Adding conditions
NotBioWaste baccc47
Implemented simple conditions call, added BaseMethod class, renaming,…
NotBioWaste 8e84ba1
Fixed history extraction
NotBioWaste 2b2847b
Delete test_bot.py
NotBioWaste905 7e336ac
Fixed prompt handling, switched to AIMessage in LLM response
NotBioWaste 71babbf
Merge branch 'feat/llm_responses' of https://github.com/deeppavlov/di…
NotBioWaste 351ae06
Fixed conditions call
NotBioWaste e3d0d15
Working on autotesting
NotBioWaste 0405998
Added tests
NotBioWaste 3dbfd0c
Removed unused method
NotBioWaste 5c876ba
Added annotations
NotBioWaste 8f1932c
Added structured output support, tweaked tests
NotBioWaste aedf47e
Reworking tutorials
NotBioWaste adadb05
Reworked prompt usage and hierarchy, reworked filters and methods
NotBioWaste 0288896
No idea how to make script smaller in tutorials
NotBioWaste 67e2758
Small fixes in tutorials and structured generation
NotBioWaste 428a9f0
Working on user guide
NotBioWaste 5e26b4b
Fixed some tutorials, finished user guide
NotBioWaste 5dbb6cd
Bugfixes in docs
NotBioWaste db63d1a
Lint
NotBioWaste 2b9080f
Removed type annotation that broke docs building
NotBioWaste 2bcda71
Tests and bugfixes
NotBioWaste d2f28ed
Deleted DeepEval references
NotBioWaste 7318c91
Numpy versions trouble
NotBioWaste 27eae27
Fixed dependencies
NotBioWaste 3fed1fc
Made everything asynchronous
NotBioWaste 30862ca
Added and unified docstring
NotBioWaste 06ab5bc
Added 4th tutorial, fixed message_schema parameter passing
NotBioWaste 798a77b
Bugfix, added max_size to the message_to_langchain function
NotBioWaste 3343159
Made even more things asynchronous
NotBioWaste 014ff7e
Remade condition, added logprob check
NotBioWaste 761bd81
Async bugfix, added model_result_to_text, working on message_schema f…
NotBioWaste 90a811e
Minor fixes, tinkering tests
NotBioWaste 5bff191
Merge branch 'refs/heads/dev' into feat/llm_responses
RLKRo 8b88ba6
update lock file
RLKRo 20c4afd
Merge remote-tracking branch 'origin/feat/llm_responses' into feat/ll…
RLKRo 0139421
Merge remote-tracking branch 'origin/master' into feat/llm_responses
NotBioWaste905 9bb0cba
Updating to v1.0
NotBioWaste905 f2d6b68
Finished tests, finished update
NotBioWaste905 6fddaea
lint
NotBioWaste905 e06bc2b
Started working on llm slots
NotBioWaste905 22d8efc
Resolving pydantic errors
NotBioWaste905 aa735b5
Delete llmslot_test.py
NotBioWaste905 cc91133
Finished LLMSlot, working on LLMGroupSlot
NotBioWaste905 8756838
Merge remote-tracking branch 'origin/feat/llm_responses' into feat/ll…
NotBioWaste905 f1857f6
Added flag to
NotBioWaste905 c334ff5
First test attempts
NotBioWaste905 8306bbb
linting
NotBioWaste905 f842776
Merge branch 'feat/slots_extraction_update' into feat/llm_responses
NotBioWaste905 ada17ca
Merge remote-tracking branch 'origin/feat/llm_responses' into feat/ll…
NotBioWaste905 a45f653
File structure fixed
NotBioWaste905 3838d30
Fixed naming
NotBioWaste905 0e650f8
Create LLMCondition and LLMResponse classes
NotBioWaste905 015cb4f
Debugging flattening
NotBioWaste905 b6e5eeb
Bugfix
NotBioWaste905 b20137e
Added return_type property for LLMSlot
NotBioWaste905 25f5b04
Changed return_type from Any to type
NotBioWaste905 b651087
lint
NotBioWaste905 1b5a77b
removed deprecated from_script from tutorials
NotBioWaste905 c18d375
Fixed LLMCondition class
NotBioWaste905 459f7fc
Fixed missing 'models' field in Pipeline, updated tutorials
NotBioWaste905 24300e8
create __get_llm_response method in LLM_API, refactoring LLM Conditio…
NotBioWaste905 03b02be
Merge branch 'refs/heads/dev' into feat/llm_responses
RLKRo e6663b3
update lock file
RLKRo 2e1c190
remove outdated entries from conf.py
RLKRo 859c57a
small fixes to user guide
RLKRo fb3142b
minor tutorial changes
RLKRo ff81267
Moved docstring, removed pipeline parameter
NotBioWaste905 7518259
Fixed type annotation for models field in Pipeline
NotBioWaste905 ac28d78
removed unused imports from llm/__init__.py
NotBioWaste905 2d4998c
Fix redundancy in chatsky/slots/llm.py
NotBioWaste905 23d6a31
Fixed circular LLM_API<=>Pipeline import
NotBioWaste905 ef9baa3
Merge remote-tracking branch 'origin/feat/llm_responses' into feat/ll…
NotBioWaste905 4bf5bba
Update import order chatsky/llm/filters.py
NotBioWaste905 9188b89
Fixes in filters
NotBioWaste905 02894f0
Fixes of LLM_API annotations and docs
NotBioWaste905 8e839a1
Removed __get_llm_response, lint
NotBioWaste905 210b10a
Added context_to_history util, some tweaks in responses
NotBioWaste905 784f323
remove llm_response object initialization from tutorials
RLKRo 042d256
fix imports in __init__ files:
RLKRo 10533ed
fix: rename llm_response to LLMResponse, rename llm_condition to LLMC…
RLKRo 8f21069
fix codeblocks in user guide
RLKRo 95e2418
fix: message_to_langchain accepts context instead of pipeline
RLKRo 934a0b8
remove defaults from filter definitions
RLKRo 1be58a0
check field not none in filters
RLKRo 4d68a29
remove model_name from LLM_API.respond
RLKRo fa0ae70
make LLMResponse prompt AnyResponse, remove __prompt_to_message
RLKRo 8778637
fix return style in LLM_API.respond
RLKRo d4b67a1
fix LLM_API.condition signature
RLKRo 4a29687
some doc fixes
RLKRo 37aafb3
fix message schema json dumping
RLKRo 54a7376
remove unused imports
RLKRo 86da03e
fix circular import
RLKRo eac43e0
fix tests
RLKRo 51c66a8
remove cnd.true()
RLKRo 33242ca
Fixed empty prompt popping up
NotBioWaste905 65f7c8f
Format
NotBioWaste905 dc92132
Switched model from 3.5-turbo to 4o-mini
NotBioWaste905 020a7ef
Updated all of the models
NotBioWaste905 c9891f6
Fixes and logging
NotBioWaste905 c678f89
Codestyle
NotBioWaste905 f2df441
update lock file
RLKRo f20d463
simplify history text
RLKRo 44e5571
fix codestyle
RLKRo 9f97ce2
fix doc building
RLKRo b9e738a
Merge branch 'refs/heads/dev' into feat/llm_responses
RLKRo 39750ba
update lock file
RLKRo 6603f7d
remove unnecessary langchain extras
RLKRo 3827462
update lock file
RLKRo f7e7684
protect langchain imports & sort imports in modules
RLKRo a4e0462
skip llm tests on missing langchain
RLKRo 13923ab
Added docstrings in llm/methods.py
NotBioWaste905 537d8cc
Docstring fixes
NotBioWaste905 35d9d7d
Fixes in message_to_langchain
NotBioWaste905 e5c83fb
lint
NotBioWaste905 5a7313f
Fixed overlooked raise condition
NotBioWaste905 0000414
Signature fixes
NotBioWaste905 36a9f54
Responses related fixes
NotBioWaste905 ba95767
Slot related fixes + lint
NotBioWaste905 3d79cec
Fixed abstract call
NotBioWaste905 8e22b97
Adding tests
NotBioWaste905 b8de244
Bunch of documentation fixes, removed attachment_to_content
NotBioWaste905 bfba582
Added tests, need fix
NotBioWaste905 2b3c02b
Renamed FromTheModel to FromModel
NotBioWaste905 47f3855
Changes in BaseFilter class
NotBioWaste905 248d77f
Switched to localhost models in tutorials
NotBioWaste905 b5ecc1a
Renamed BaseFilter into BaseHistoryFilter, added API reference
NotBioWaste905 34e5536
Lint
NotBioWaste905 60c7c97
Slots and tutorials update
NotBioWaste905 3cf1df7
Tutorials and structured output update
NotBioWaste905 7f00028
More clear instructions in tutorial
NotBioWaste905 513eb19
Fixes in llm slots and tutorial
NotBioWaste905 2cd5d41
lint
NotBioWaste905 6a0845d
Finalizing tweaks
NotBioWaste905 81a86e9
Lint
NotBioWaste905 24e65c5
Removed import test
NotBioWaste905 b6af8f5
Removed dotenv, fixed Union
NotBioWaste905 ee5f643
Conditions cleanup
NotBioWaste905 1ff7020
Switched to the '|' operator, IsImportant and FromModel are now inher…
NotBioWaste905 2f65265
Added partial extraction to the tutorial
NotBioWaste905 04c5b54
Moved history flag annotation to another tutorial
NotBioWaste905 0d56e75
Fixed docstrings
NotBioWaste905 74c6d5e
Quickfix for message_to_langchain
NotBioWaste905 7e2da91
Fixed signatures in filters, lint
NotBioWaste905 7a313d1
Fixed tutorial link
NotBioWaste905 9b31ac9
Actually fixed tutorial link
NotBioWaste905 1c4aa24
Fixed split lines in tutorials, reworked system prompt handling af…
NotBioWaste905 419ab8d
Added missing docstrings for LLM_API
NotBioWaste905 e723334
Small docstring fix
NotBioWaste905 6b1ffed
Added test for conditions + fixed some bugs
NotBioWaste905 2a7bd4f
Removed return_schema from condition due to not using it for now
NotBioWaste905 e25e2f8
Experiencing issues with slot testing
NotBioWaste905 8e553bd
lint
NotBioWaste905 fea185c
Fixes in LLM Slot testing
NotBioWaste905 968fe75
Refactor context_to_history function to streamline filtering of dialo…
NotBioWaste905 8bc71ce
Working on Prompt rework
NotBioWaste905 e27d85f
Returned test case
NotBioWaste905 13e6a31
Started working on get_langchain_context
NotBioWaste905 93412e8
Working on prompt processing
NotBioWaste905 3b6f941
Resolved typechecking issues in Pipeline
NotBioWaste905 24237fb
Added some logging, WIP
NotBioWaste905 f4d1852
Renamed `model_name` parameter into `llm_model_name`
NotBioWaste905 f0f0e2d
Update LLMResponse
NotBioWaste905 8b8085f
Update LLM_API to work with LLM Response
NotBioWaste905 09b0487
Renamed DesaultPositionConfig to PositionConfig
NotBioWaste905 6eb50e7
Reworked context related functions
NotBioWaste905 bba5178
Added bunch of TODOs
NotBioWaste905 d1063b9
Made request and response optional for history filters, renamed field…
NotBioWaste905 cee86a9
Removed deprecated TODO
NotBioWaste905 3c0fe22
Updated conditions.llm to use get_langchain_context
NotBioWaste905 44935ff
Added docstring for get_langchain_context, lint
NotBioWaste905 2b37d59
Fixed renaming issue
NotBioWaste905 32eae7d
Fixing tests
NotBioWaste905 6440eaf
Fixed appending empty strings + wrong prompt positions in tests
NotBioWaste905 b9f3925
Added missing PositionConfig
NotBioWaste905 d43b468
Added de-flattening func to slots.llm
NotBioWaste905 4968907
Update prompt handling in LLM conditions and tests
NotBioWaste905 42b6ced
Refactor Prompt model to use float for position attribute, not BasePr…
NotBioWaste905 b11f44b
Added tests for get_langchain_context
NotBioWaste905 5bedd3f
lint
NotBioWaste905 c792f93
Modified tutorial to include prompt positioning
NotBioWaste905 c8dc417
Added mock OPENAI_API_KEY for tutorials to be tested
NotBioWaste905 1f31292
removed pipe symbol from union
NotBioWaste905 bf8f7cf
lint
NotBioWaste905 6204b85
Added actual Union
NotBioWaste905 b7f1cd7
Fixed wrong method override
NotBioWaste905 65e24f9
Updated tutorial
NotBioWaste905 8f0587f
lint
NotBioWaste905 94aa660
Added missing mock ANTHROPIC_API_KEY
NotBioWaste905 2a21f5a
Trying to fix escape sequence
NotBioWaste905 be76bf1
Okay this breaks everything
NotBioWaste905 5b120c6
Fixed typo
NotBioWaste905 37ae4ae
Updated userguide
NotBioWaste905 db4f8e1
readability improvements
NotBioWaste905 5d9681b
tests are grouped into classes
NotBioWaste905 ed7ba23
Fixed formatting
NotBioWaste905 aed1af8
Updated tutorials
NotBioWaste905 fc706f9
Updated tutorial via llm
NotBioWaste905 0bd0e15
Formating fixes and docstrings
NotBioWaste905 0ee3a8c
Reformatted and improved readability for tutorials
NotBioWaste905 e25921e
Fixed some hallucinations
NotBioWaste905 62f0bdd
And once more
NotBioWaste905 5dd6fc0
Deleted new line
NotBioWaste905 6570738
Trying to fix doc building
NotBioWaste905 0d2d5a1
Updated user guide
NotBioWaste905 6f64e24
Merge branch 'refs/heads/dev' into feat/llm_responses
RLKRo cb3cd70
fix llm tests after #93
RLKRo 41978a4
update history extraction after #93
RLKRo 994b32c
change last_request in context history to last_turn
RLKRo 98bb5d8
raise max size to 5000 and update its docstring
RLKRo f844f0f
improve first llm tutorial
RLKRo 873e3ef
Merge branch 'refs/heads/dev' into feat/llm_responses
RLKRo 4a54f53
lint first tutorial
RLKRo 35cddbd
add loggers
RLKRo cfdc261
improve docs for condition and response
RLKRo 3666b9a
move check_langchain_available from response to langchain_context
RLKRo 54e3ba3
improve pipeline docs
RLKRo 997e0be
add more entries to llm init
RLKRo 315e697
remove unnecessary logs
RLKRo d043e7f
fix filters (they did not work at all) & improve documentation
RLKRo 93abd3b
documentation improvements & a few code improvements
RLKRo 6be72b9
move slots to the status of experimental feature
RLKRo File filter
chatsky/conditions/llm.py:

```python
"""
LLM Conditions
--------------
This module provides LLM-based conditions.
"""

from pydantic import Field
from typing import Optional

from chatsky.core import BaseCondition, Context
from chatsky.core.script_function import AnyResponse
from chatsky.llm.methods import BaseMethod
from chatsky.llm.langchain_context import get_langchain_context
from chatsky.llm.filters import BaseHistoryFilter, DefaultFilter
from chatsky.llm.prompt import PositionConfig, Prompt


class LLMCondition(BaseCondition):
    """
    LLM-based condition.
    Uses the prompt to produce a result from the model and evaluates that result with the given method.
    """

    llm_model_name: str
    """
    Key of the model in the :py:attr:`~chatsky.core.pipeline.Pipeline.models` dictionary.
    """
    prompt: AnyResponse = Field(default="", validate_default=True)
    """
    Condition prompt.
    """
    history: int = 1
    """
    Number of dialogue turns aside from the current one to keep in history. ``-1`` for full history.
    """
    filter_func: BaseHistoryFilter = Field(default_factory=DefaultFilter)
    """
    Filter function to filter messages in history.
    """
    prompt_misc_filter: str = Field(default=r"prompt")
    """
    Regular expression to find prompts by key names in the MISC dictionary.
    """
    position_config: Optional[PositionConfig] = None
    """
    Config for positions of prompts and messages in history.
    """
    max_size: int = 5000
    """
    Maximum size of any message in the chat, in characters.
    If a message exceeds the limit it will not be sent to the LLM and a warning
    will be produced.
    """
    method: BaseMethod
    """
    Method that takes the model's output and returns a boolean.
    """

    async def call(self, ctx: Context) -> bool:
        # Look up the model by its key in Pipeline.models.
        model = ctx.pipeline.models[self.llm_model_name]

        # Assemble the langchain message list: system prompt, filtered dialogue
        # history, MISC prompts and the condition prompt, ordered by the position config.
        history_messages = []
        history_messages.extend(
            await get_langchain_context(
                system_prompt=await model.system_prompt(ctx),
                ctx=ctx,
                call_prompt=Prompt(message=self.prompt),
                prompt_misc_filter=self.prompt_misc_filter,
                position_config=self.position_config or model.position_config,
                length=self.history,
                filter_func=self.filter_func,
                llm_model_name=self.llm_model_name,
                max_size=self.max_size,
            )
        )

        # Let the model respond and map the output to a boolean with the given method.
        return await model.condition(history_messages, self.method)
```
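For context, a minimal usage sketch (not part of the diff): the condition is constructed with the key of a model registered on the pipeline and a method that maps the raw model output to a boolean. The `ChatOpenAI` wrapper and the exact `Pipeline` wiring are assumptions here; only `LLMCondition`, `Contains`, and `LLM_API` appear in this PR.

```python
# Hedged usage sketch: assumes an OpenAI-backed langchain chat model and the
# Contains method exported from chatsky.llm; pipeline wiring is illustrative.
from langchain_openai import ChatOpenAI

from chatsky.llm import LLM_API, Contains
from chatsky.conditions.llm import LLMCondition

user_agreed = LLMCondition(
    llm_model_name="my_model",  # key in Pipeline.models
    prompt="Did the user agree to proceed? Answer TRUE or FALSE.",
    method=Contains(pattern="TRUE"),  # True if the model output contains "TRUE"
    history=2,  # also send the two previous dialogue turns
)

# The model itself is registered once on the pipeline (illustrative):
# pipeline = Pipeline(..., models={"my_model": LLM_API(ChatOpenAI(model="gpt-4o-mini"))})
```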
chatsky/llm/__init__.py:

```python
from chatsky.llm.filters import BaseHistoryFilter, FromModel, IsImportant, MessageFilter, Return
from chatsky.llm.methods import BaseMethod, LogProb, Contains
from chatsky.llm.llm_api import LLM_API
from chatsky.llm.prompt import Prompt, PositionConfig
```
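Since the package init re-exports these names, downstream code can import the public LLM surface from the package root directly:

```python
# Grounded in the __init__ above: all of these names are re-exported there.
from chatsky.llm import LLM_API, Prompt, PositionConfig, BaseHistoryFilter, Contains
```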
Langchain import guard (module path is not shown in this view):

```python
from typing import Any

try:
    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.language_models.chat_models import BaseChatModel
    from langchain_core.messages.base import BaseMessage
    from langchain_core.messages import HumanMessage, SystemMessage, AIMessage
    from langchain_core.outputs.llm_result import LLMResult

    langchain_available = True
except ImportError:  # pragma: no cover
    # Fall back to Any so type annotations still resolve when langchain is
    # not installed; actual use is prevented by check_langchain_available.
    StrOutputParser = Any
    BaseChatModel = Any
    BaseMessage = Any
    HumanMessage = Any
    SystemMessage = Any
    AIMessage = Any
    LLMResult = Any

    langchain_available = False


def check_langchain_available():  # pragma: no cover
    if not langchain_available:
        raise ImportError("Langchain is not available. Please install it with `pip install chatsky[llm]`.")
```
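A sketch of how the guard is intended to be used by dependent modules: call `check_langchain_available()` at the top of any function that touches langchain types, so the error surfaces with an install hint only when the feature is actually used. The import path and the helper below are assumptions for illustration; the real helpers such as `message_to_langchain` live elsewhere in this PR.

```python
# Illustrative only: the guard pattern from the module above applied to a
# hypothetical helper. The module path chatsky.llm._langchain_imports is assumed.
from chatsky.llm._langchain_imports import HumanMessage, check_langchain_available


async def text_to_langchain(text: str) -> "HumanMessage":
    # Raises ImportError with an install hint if langchain is missing.
    check_langchain_available()
    return HumanMessage(content=[{"type": "text", "text": text}])
```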
chatsky/llm/filters.py:

```python
"""
Filters
-------
This module contains a collection of basic history filters to avoid cluttering the LLM's context window.
"""

import abc
from enum import Enum
from logging import Logger
from typing import Union, Optional

from pydantic import BaseModel

from chatsky.core.message import Message
from chatsky.core.context import Context


logger = Logger(name=__name__)


class Return(Enum):
    """
    Enum that defines options for filtering turns.
    The values are bit flags: ``Turn`` (3) combines ``Request`` (1) and ``Response`` (2).
    """

    NoReturn = 0
    """
    Do not include the turn.
    """
    Request = 1
    """
    Include request only.
    """
    Response = 2
    """
    Include response only.
    """
    Turn = 3
    """
    Include the entire turn (both request and response).
    """


class BaseHistoryFilter(BaseModel, abc.ABC):
    """
    Base class for all message history filters.
    """

    @abc.abstractmethod
    def call(
        self, ctx: Context, request: Optional[Message], response: Optional[Message], llm_model_name: str
    ) -> Union[Return, int]:
        """
        Decide whether to include the request, the response, or both in the context history from
        a single turn.

        The filter function is called repeatedly over all turns in context (up to the history length limit in
        :py:func:`~chatsky.llm.langchain_context.context_to_history`) to determine which parts of each turn
        to include.

        Both request and response may be ``None``. Even if such messages are not filtered out by this filter,
        they won't be included in history.

        :param ctx: Context object.
        :param request: Request message.
        :param response: Response message.
        :param llm_model_name: Name of the model (key in ``Pipeline.models``) that calls this filter.

        :return: Instance of the Return enum or a corresponding int value.
        """
        raise NotImplementedError()

    def __call__(
        self, ctx: Context, request: Optional[Message], response: Optional[Message], llm_model_name: str
    ) -> Return:
        """
        Wrapper around :py:meth:`call` that catches exceptions and excludes the turn entirely if one occurs.

        :param ctx: Context object.
        :param request: Request message.
        :param response: Response message.
        :param llm_model_name: Name of the model (key in ``Pipeline.models``) that calls this filter.

        :return: Instance of the Return enum.
        """
        try:
            result = self.call(ctx, request, response, llm_model_name)

            # Allow implementations to return a plain int flag value.
            if isinstance(result, int):
                result = Return(result)

            return result
        except Exception as exc:
            logger.warning(exc)
            return Return.NoReturn


class MessageFilter(BaseHistoryFilter):
    """
    Variant of history filter that allows defining simple filters that do not
    differentiate between requests and responses.
    """

    @abc.abstractmethod
    def single_message_filter_call(self, ctx: Context, message: Optional[Message], llm_model_name: str) -> bool:
        """
        Determine based on a single message (which may be either a request or a response)
        whether to include the message in history.

        :param ctx: Context object.
        :param message: Either request or response message.
        :param llm_model_name: Name of the model (key in ``Pipeline.models``) that calls this filter.

        :return: Whether the `message` should be included in history.
        """
        raise NotImplementedError()

    def call(
        self, ctx: Context, request: Optional[Message], response: Optional[Message], llm_model_name: str
    ) -> Union[Return, int]:
        # Request (1) and Response (2) are bit flags: OR-ing both yields Turn (3),
        # neither yields NoReturn (0).
        return (
            int(self.single_message_filter_call(ctx, request, llm_model_name)) * Return.Request.value
            | int(self.single_message_filter_call(ctx, response, llm_model_name)) * Return.Response.value
        )


class DefaultFilter(BaseHistoryFilter):
    """
    Filter used by default.
    Never filters out messages.
    """

    def call(
        self, ctx: Context, request: Optional[Message], response: Optional[Message], llm_model_name: str
    ) -> Union[Return, int]:
        return Return.Turn


class IsImportant(MessageFilter):
    """
    Filter that checks if the "important" field in a Message.misc is True.
    """

    def single_message_filter_call(self, ctx: Context, message: Optional[Message], llm_model_name: str) -> bool:
        if message is not None and message.misc is not None and message.misc.get("important", None):
            return True
        return False


class FromModel(BaseHistoryFilter):
    """
    Filter that checks if the response of the turn was generated by the model currently calling this filter.
    """

    def call(
        self, ctx: Context, request: Optional[Message], response: Optional[Message], llm_model_name: str
    ) -> Union[Return, int]:
        if (
            response is not None
            and response.annotations is not None
            and response.annotations.get("__generated_by_model__") == llm_model_name
        ):
            return Return.Turn
        return Return.NoReturn
```
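As an illustration of the extension point (not part of the diff), a custom filter only needs to subclass `MessageFilter` and implement `single_message_filter_call`; the class and the length threshold below are hypothetical:

```python
# Hypothetical custom filter: keep only reasonably short messages in history.
from typing import Optional

from chatsky.core.context import Context
from chatsky.core.message import Message
from chatsky.llm.filters import MessageFilter


class ShortMessages(MessageFilter):
    """Keep only messages at most `max_len` characters long."""

    max_len: int = 500  # pydantic field, configurable per instance

    def single_message_filter_call(self, ctx: Context, message: Optional[Message], llm_model_name: str) -> bool:
        # None messages are excluded from history regardless of the return value.
        return message is not None and message.text is not None and len(message.text) <= self.max_len
```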