fastai_inference.py is non-deterministic #390

@dthaler

Description

Running the inference script on the same input yields non-deterministic results, because the method used consumes randomness.
Repro:

```yaml
model_type: "FastAI"
model_local_threshold: 0.5
model_global_threshold: 3
model_path: "./model"
model_name: "model.pkl"
hls_stream_type: "DateRangeHLS"
hls_polling_interval: 60
hls_start_time_pst: "2025-12-27 23:09"
hls_end_time_pst: "2025-12-27 23:10"
hls_hydrophone_id: "rpi_sunset_bay"
upload_to_azure: False
delete_local_wavs: True
```

Running:

```shell
python src/LiveInferenceOrchestrator.py --config ./config/Test/Positive/FastAI_DateRangeHLS_SunsetBay.yml --max_iterations 1
```

reports "Found" about 50% of the time and "Not found" the other 50% of the time.
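If the non-determinism comes from unseeded random number generators (for example, random augmentation or dropout still active at inference time), one common fix is to seed every RNG source before running the model; fastai exposes `set_seed(s, reproducible=True)` for this. The stdlib sketch below illustrates the idea only; `predict_with_noise` and `run_inference` are hypothetical stand-ins, not functions from this repo.

```python
import random

def predict_with_noise(x, threshold=0.5):
    # Hypothetical stand-in for a model step that consumes randomness;
    # without a fixed seed, repeated calls can flip across the threshold.
    return (x + random.random() * 0.1) >= threshold

def run_inference(x, seed=42):
    # Seed the RNG immediately before inference so every run of the
    # script draws the same random values and yields the same verdict.
    random.seed(seed)
    return predict_with_noise(x)

# Same seed, same input -> same result on every invocation.
assert run_inference(0.45) == run_inference(0.45)
```

In the real script, the equivalent step would be seeding Python, NumPy, and PyTorch (or calling fastai's `set_seed`) once before the prediction loop, so identical inputs always produce identical "Found"/"Not found" results.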

Metadata

    Labels

    inference system — Code to perform inference with the trained model(s)
