24 changes: 24 additions & 0 deletions docs/README.md
@@ -0,0 +1,24 @@
# Documentation

## Building the Documentation

1. Install dependencies:

```console
python3 -m pip install -r requirements.txt
python3 -m pip install -r docs/requirements-docs.txt
```

1. Build the documentation:

```console
make -C docs/source doc
```

The HTML is created in the `docs/source/html` directory.

## Publishing the Documentation

Tag the commit to publish with `docs-v<semver>`.

To avoid publishing the documentation as the latest, ensure the commit message contains `/not-latest` on a line of its own, tag that commit, and push to GitHub.
45 changes: 24 additions & 21 deletions docs/source/configurable.rst
@@ -1,16 +1,17 @@
.. headings: = - ^ "
Configuring ``garak``
=====================

Configuring garak
=================

Beyond the standard CLI options, garak is highly configurable.
You can use YAML files to configure a garak run, down to the level
of exactly how each plugin behaves.


Specifying custom configuration
Specifying Custom Configuration
-------------------------------

``garak`` can be configured in multiple ways:
garak can be configured in multiple ways:

* Via command-line parameters
* Using YAML configs
@@ -19,8 +20,8 @@ Specifying custom configuration
The easiest way is often to use a YAML config, and how to do that is
described below.

Garak's config hierarchy
^^^^^^^^^^^^^^^^^^^^^^^^
Garak Config Hierarchy
^^^^^^^^^^^^^^^^^^^^^^

Configuration values can come from multiple places. At garak load, the
``_config`` module manages parsing configuration. This includes determining
@@ -90,8 +91,8 @@ Here we can see many entries that correspond to command line options, such as
such as ``show_100_pass_modules``.

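The override behaviour can be pictured as a key-by-key merge in which later configuration sources win. The following is a minimal illustrative sketch (hypothetical option names; not garak's actual merging code):

```python
# Illustrative sketch only: layered configuration sources combined
# key-by-key, with later sources overriding earlier ones.
def deep_merge(base, override):
    """Merge override into base recursively; override wins on conflicts."""
    merged = dict(base)
    for key, value in override.items():
        if key in merged and isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# hypothetical layers, from lowest to highest precedence
core_defaults = {"system": {"verbose": 0, "narrow_output": False}}
site_config = {"system": {"verbose": 1}}
cli_options = {"system": {"narrow_output": True}}

config = core_defaults
for layer in (site_config, cli_options):
    config = deep_merge(config, layer)
# config now holds verbose from site_config and narrow_output from the CLI
```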

``system`` config items
"""""""""""""""""""""""
System Config Items
"""""""""""""""""""

* ``parallel_requests`` - For generators not supporting multiple responses per prompt: how many requests to send in parallel with the same prompt? (raising ``parallel_requests`` generally yields higher performance, depending on how high ``generations`` is set)
* ``parallel_attempts`` - For parallelisable generators, how many attempts should be run in parallel? Raising this is a great way of speeding up garak runs for API-based models.
@@ -102,8 +103,8 @@ such as ``show_100_pass_modules``.
* ``enable_experimental`` - Enable experimental function CLI flags. Disabled by default. Experimental functions may disrupt your installation and provide unusual/unstable results. Can only be set by editing core config, so a git checkout of garak is recommended for this.
* ``max_workers`` - Cap on how many parallel workers can be requested. When raising this in order to use higher parallelisation, keep an eye on system resources (e.g. ``ulimit -n 4096`` on Linux)

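As a rough illustration of how a parallelism request can interact with the ``max_workers`` cap (hypothetical values and a stand-in generator; not garak's internals):

```python
# Sketch: a run's requested parallelism is bounded by a max_workers cap.
from concurrent.futures import ThreadPoolExecutor

MAX_WORKERS = 4          # stands in for the max_workers cap
parallel_attempts = 16   # what the run asks for

def fake_generate(prompt):
    # stand-in for a call to an API-based generator
    return prompt.upper()

workers = min(parallel_attempts, MAX_WORKERS)  # the cap wins
prompts = [f"prompt {i}" for i in range(8)]
with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(fake_generate, prompts))  # order is preserved
```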
``run`` config items
""""""""""""""""""""
Run Config Items
""""""""""""""""

* ``system_prompt`` - If given and not overridden by the probe itself, probes will pass the specified system prompt when possible for generators that support chat modality.
* ``probe_tags`` - If given, the probe selection is filtered according to these tags; probes that don't match the tags are not selected
@@ -116,8 +117,9 @@ such as ``show_100_pass_modules``.
* ``target_lang`` - A single language (as BCP47) that the target application or LLM accepts as prompt and output
* ``langproviders`` - A list of configurations representing providers for converting from probe language to lang_spec target languages (BCP47)

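As an illustration of the kind of tag filtering ``probe_tags`` performs (hypothetical probe names and tags; not garak's selection code):

```python
# Sketch: keep only probes whose tags match a requested tag prefix.
probes = {
    "dan.Dan_11_0": ["avid-effect:security:S0403"],
    "encoding.InjectBase64": ["owasp:llm01"],
}
probe_tags = ["owasp:llm01"]  # hypothetical filter from the run config

selected = [
    name
    for name, tags in probes.items()
    if any(tag.startswith(wanted) for tag in tags for wanted in probe_tags)
]
# probes whose tags don't match are not selected
```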
``plugins`` config items
""""""""""""""""""""""""
Plugins Config Items
""""""""""""""""""""

* ``model_type`` - The generator model type, e.g. "nim" or "huggingface"
* ``model_name`` - The name of the model to be used (optional - if blank, type-specific default is used)
* ``probe_spec`` - A comma-separated list of probe modules or probe classnames (in ``module.classname`` format) to be used. If a module is given, only ``active`` plugins in that module are chosen; this is equivalent to passing ``-p`` to the CLI
@@ -135,8 +137,9 @@ such as ``show_100_pass_modules``.
For an example of how to use the ``detectors``, ``generators``, ``buffs``,
``harnesses``, and ``probes`` root entries, see :ref:`Configuring plugins with YAML <config_with_yaml>` below.

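As a sketch of how a comma-separated ``probe_spec`` such as the one described above might be split into modules and ``module.classname`` entries (hypothetical helper, not garak's parser):

```python
# Sketch: split a probe_spec string into its entries; bare module names
# select all active plugins in that module, dotted names select one class.
def parse_probe_spec(spec):
    return [entry.strip() for entry in spec.split(",") if entry.strip()]

entries = parse_probe_spec("dan, encoding.InjectBase64")
modules_only = [e for e in entries if "." not in e]  # whole-module entries
```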
``reporting`` config items
""""""""""""""""""""""""""
Reporting Config Items
""""""""""""""""""""""

* ``report_dir`` - Directory for reporting; defaults to ``$XDG_DATA/garak/garak_runs``
* ``report_prefix`` - Prefix for report files. Defaults to ``garak.$RUN_UUID``
* ``taxonomy`` - Which taxonomy to use to group probes when creating HTML report
@@ -146,7 +149,7 @@ For an example of how to use the ``detectors``, ``generators``, ``buffs``,
* ``show_top_group_score`` - Should the aggregated score be shown as a top-level figure in report concertinas?


Bundled quick configs
Bundled Quick Configs
^^^^^^^^^^^^^^^^^^^^^

Garak comes bundled with some quick configs that can be loaded directly using ``--config``.
@@ -163,7 +166,7 @@ These are great places to look at to get an idea of how garak YAML configs can l
Quick configs are stored under ``garak/configs/`` in the source code/install.


Using a custom config
Using a Custom Config
^^^^^^^^^^^^^^^^^^^^^

To override values in this we can create a new YAML file and point to it from the
@@ -183,7 +186,7 @@ If we save this as ``latent1.yaml`` somewhere, then we can use it with ``garak -



Using a custom JSON config
Using a Custom JSON Config
^^^^^^^^^^^^^^^^^^^^^^^^^^

Some plugins can take a JSON config specified on the command line. This config
@@ -203,7 +206,7 @@ the ``generators`` that interface with models, and even the ``harnesses``
that manage run orchestration. Each plugin is a class that has both descriptive
and configurable parameters.

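The descriptive/configurable split can be sketched as follows (a hypothetical plugin, not garak's base class):

```python
# Sketch: a plugin class with fixed descriptive metadata and configurable
# parameters whose defaults a config dict can override.
class ExamplePlugin:
    description = "A toy plugin"          # descriptive: fixed metadata
    DEFAULT_PARAMS = {"threshold": 0.5}   # configurable: overridable defaults

    def __init__(self, config=None):
        params = {**self.DEFAULT_PARAMS, **(config or {})}
        self.threshold = params["threshold"]

configured = ExamplePlugin({"threshold": 0.9})  # config overrides default
default = ExamplePlugin()                       # falls back to DEFAULT_PARAMS
```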
Viewing plugin parameters
Viewing Plugin Parameters
^^^^^^^^^^^^^^^^^^^^^^^^^

You can see the parameters for any given plugin using garak ``--plugin_info``.
@@ -241,7 +244,7 @@ config, or the default.

.. _config_with_yaml:

Configuring plugins with YAML
Configuring Plugins with YAML
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Plugin config happens inside the ``plugins`` block. Multiple plugins can be
@@ -340,7 +343,7 @@ This defines a REST endpoint where:
This should be written to a file, and the file's path passed on the command
line with ``-G``.

Configuration in code
Configuration in Code
---------------------

The preferred way to instantiate a plugin is using ``garak._plugins.load_plugin()``.
@@ -351,7 +354,7 @@ This function takes two parameters:

``load_plugin()`` returns a configured instance of the requested plugin.

OpenAIGenerator config with dictionary
OpenAIGenerator Config with Dictionary
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. code-block:: python
32 changes: 16 additions & 16 deletions docs/source/translation.rst
@@ -1,4 +1,4 @@
Translation support
Translation Support
===================

Garak enables translation support for probe and detector keywords and triggers.
@@ -13,18 +13,18 @@ Limitations
- If probes or detectors fail to load, you may need to choose a smaller local translation model or use a remote service.
- Translation may add significant execution time to the run depending on resources available.

Supported translation services
Supported Translation Services
------------------------------

- Huggingface: This project supports usage of the following translation models:
- Hugging Face: This project supports usage of the following translation models:
- `Helsinki-NLP/opus-mt-{<source_lang>-<target_lang>} <https://huggingface.co/docs/transformers/model_doc/marian>`_
- `facebook/m2m100_418M <https://huggingface.co/facebook/m2m100_418M>`_
- `facebook/m2m100_1.2B <https://huggingface.co/facebook/m2m100_1.2B>`_
- `DeepL <http://www.deepl.com>`_
- `NVIDIA Riva for Developers <https://developer.nvidia.com/riva>`_
- `Google Cloud Translation API <https://cloud.google.com/translate/docs/reference/api-overview>`_

API KEY Requirements
API Key Requirements
--------------------

To use the DeepL API, Riva API, or Google Cloud Translation to translate probe and detector keywords and triggers, an API key must be supplied.
@@ -64,7 +64,7 @@ Google Cloud Translation

export GOOGLE_APPLICATION_CREDENTIALS=<path to credential configuration json file>

Configuration file
Configuration File
------------------

The translation function is configured in the ``run`` section of a configuration with the following keys:
@@ -107,7 +107,7 @@ An example template is provided below.

* Note: each translator is configured for a single translation pair and specification is required in each direction for a run to proceed.

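The both-directions requirement can be checked mechanically; a sketch (hypothetical provider entries, not garak's validation code):

```python
# Sketch: verify that every translation pair also appears in the reverse
# direction, since a run needs both directions specified.
langproviders = [
    {"language": "en,ja", "model_type": "local"},
    {"language": "ja,en", "model_type": "local"},
]

pairs = {tuple(p["language"].split(",")) for p in langproviders}
missing = {(dst, src) for (src, dst) in pairs} - pairs  # reverses not covered
ok = not missing  # True only when each direction is present
```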
Examples for translation configuration
Examples for Translation Configuration
--------------------------------------

DeepL
@@ -130,7 +130,7 @@
.. code-block:: bash

export DEEPL_API_KEY=xxxx
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config {path to your yaml config file}
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config <path-to-your-yaml-config-file>


Riva
@@ -152,7 +152,7 @@
.. code-block:: bash

export RIVA_API_KEY=xxxx
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config {path to your yaml config file}
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config <path-to-your-yaml-config-file>


Google Cloud Translation
@@ -161,21 +161,21 @@
For Google Cloud Translation, use the following YAML config:

.. code-block:: yaml
.. code-block:: yaml

run:
target_lang: {target language code}
target_lang: <target-language-code>
langproviders:
- language: {source language code},{target language code}
- language: <source-language-code>,<target-language-code>
model_type: remote.GoogleTranslator
- language: {target language code},{source language code}
- language: <target-language-code>,<source-language-code>
model_type: remote.GoogleTranslator


.. code-block:: bash

export GOOGLE_APPLICATION_CREDENTIALS=<path to credential configuration json file>
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config {path to your yaml config file}
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config <path-to-your-yaml-config-file>


Local
@@ -196,11 +196,11 @@ You use the following yaml config.

.. code-block:: bash

python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config {path to your yaml config file}
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config <path-to-your-yaml-config-file>

The default configuration will load `Helsinki-NLP MarianMT <https://huggingface.co/docs/transformers/model_doc/marian>`_ models for local translation.

Additional support for Huggingface ``M2M100Model`` type only is enabled by providing ``model_name`` for local translators. The model name provided must
Additional support for Hugging Face ``M2M100Model`` type only is enabled by providing ``model_name`` for local translators. The model name provided must
contain ``m2m100`` to be loaded by garak.

.. code-block:: yaml
@@ -218,4 +218,4 @@

.. code-block:: bash

python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config {path to your yaml config file}
python3 -m garak --model_type nim --model_name meta/llama-3.1-8b-instruct --probes encoding --config <path-to-your-yaml-config-file>
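
The ``m2m100`` naming rule above amounts to a simple substring check; as an illustrative sketch (not garak's actual loader):

```python
# Sketch: a local translator model_name must contain "m2m100" for the
# M2M100 path to be taken; other names fall back to the default models.
def looks_like_m2m100(model_name):
    return "m2m100" in model_name

m2m = looks_like_m2m100("facebook/m2m100_418M")          # matches
marian = looks_like_m2m100("Helsinki-NLP/opus-mt-en-ja") # does not match
```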
2 changes: 1 addition & 1 deletion garak/attempt.py
@@ -129,7 +129,7 @@ def last_message(self, role=None) -> Message:
"""The last message exchanged in the conversation

:param role: Optional, role to search for
type: str
:type role: str
"""
if len(self.turns) < 1:
raise ValueError("No messages available")