
fix: Return 400 instead of 500 for raw bytes sent as base64 image #2016

Merged
grzegorz-roboflow merged 2 commits into main from
fix/unicode-errors-handled-gracefully
Feb 19, 2026

Conversation

@bigbitbus
Contributor

@bigbitbus bigbitbus commented Feb 18, 2026

Summary

  • When a client sends raw image bytes (e.g. a JPEG starting with `0xFF 0xD8`) with `type: "base64"`, `load_image_base64` calls `value.decode("utf-8")`, which raises an unhandled `UnicodeDecodeError`
  • The error falls through to the catch-all exception handler, which returns a 500 Internal Server Error with no useful message
  • This fix catches the `UnicodeDecodeError` and raises `InputImageLoadError`, which the existing error handler turns into a 400 with a clear message telling the client to base64-encode the image
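The shape of the fix can be sketched as follows. This is a standalone illustration only: the real `InputImageLoadError` and `load_image_base64` live in the inference codebase (the latter ultimately decodes to an image, not raw bytes), so the class below is a hypothetical minimal stand-in showing just the error-handling path.

```python
import base64


class InputImageLoadError(ValueError):
    """Hypothetical stand-in for inference's InputImageLoadError.

    In the real codebase this maps to an HTTP 400 response.
    """


def load_image_base64(value):
    # Clients may send either a base64 string or bytes. A legitimate base64
    # payload sent as bytes is plain ASCII, so it must decode as UTF-8.
    if isinstance(value, bytes):
        try:
            value = value.decode("utf-8")
        except UnicodeDecodeError as error:
            # Raw image bytes (e.g. a JPEG starting with 0xFF 0xD8) are not
            # valid UTF-8. Instead of letting the error escape to the
            # catch-all 500 handler, raise the 400-mapped error with a
            # message that tells the client what to fix.
            raise InputImageLoadError(
                "Could not decode input as a base64 string. If you are "
                "sending raw image bytes, base64-encode them first."
            ) from error
    # Simplified: the real function goes on to decode the image itself.
    return base64.b64decode(value)
```

With this in place, raw bytes produce a descriptive `InputImageLoadError` rather than an unhandled `UnicodeDecodeError`, while valid base64 input (str or bytes) is unaffected.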

Example stack trace we are fixing

```
\xff\\xd9', 0, 1, 'invalid start byte')"], "event": "%s: %s",
"request_id": "18e5e45cade4464e9ca3d81481667ac4", "timestamp": "2026-02-18 09:45.08",
"exception": {"type": "UnicodeDecodeError",
  "message": "'utf-8' codec can't decode byte 0xff in position 0: invalid start byte",
  "stacktrace": [
    {"filename": "/var/task/inference/core/interfaces/http/error_handlers.py", "lineno": 83, "function": "wrapped_route", "code": "return route(*args, **kwargs)"},
    {"filename": "/var/task/inference/usage_tracking/collector.py", "lineno": 715, "function": "sync_wrapper", "code": "res = func(*args, **kwargs)"},
    {"filename": "/var/task/inference/core/interfaces/http/http_api.py", "lineno": 3136, "function": "legacy_infer_from_request", "code": "inference_response = self.model_manager.infer_from_request_sync("},
    {"filename": "/var/task/inference/core/managers/decorators/fixed_size_cache.py", "lineno": 177, "function": "infer_from_request_sync", "code": "return super().infer_from_request_sync(model_id, request, **kwargs)"},
    {"filename": "/var/task/inference/core/managers/decorators/base.py", "lineno": 106, "function": "infer_from_request_sync", "code": "return self.model_manager.infer_from_request_sync(model_id, request, **kwargs)"},
    {"filename": "/var/task/inference/core/managers/active_learning.py", "lineno": 54, "function": "infer_from_request_sync", "code": "prediction = super().infer_from_request_sync("},
    {"filename": "/var/task/inference/core/managers/base.py", "lineno": 265, "function": "infer_from_request_sync", "code": "rtn_val = self.model_infer_sync("},
    {"filename": "/var/task/inference/core/managers/base.py", "lineno": 338, "function": "model_infer_sync", "code": "return model.infer_from_request(request)"},
    {"filename": "/var/task/inference/core/models/base.py", "lineno": 134, "function": "infer_from_request", "code": "responses = self.infer(**request.dict(), return_image_dims=False)"},
    {"filename": "/var/task/inference/core/models/instance_segmentation_base.py", "lineno": 97, "function": "infer", "code": "return super().infer("},
    {"filename": "/var/task/inference/core/models/roboflow.py", "lineno": 821, "function": "infer", "code": "return super().infer(image, **kwargs)"},
    {"filename": "/var/task/inference/usage_tracking/collector.py", "lineno": 715, "function": "sync_wrapper", "code": "res = func(*args, **kwargs)"},
    {"filename": "/var/task/inference/core/models/base.py", "lineno": 25, "function": "infer", "code": "preproc_image, returned_metadata = self.preprocess(image, **kwargs)"},
    {"filename": "/var/task/inference/core/models/instance_segmentation_base.py", "lineno": 200, "function": "preprocess", "code": "img_in, img_dims = self.load_image("},
    {"filename": "/var/task/inference/core/models/roboflow.py", "lineno": 1042, "function": "load_image", "code": "img_in, img_dims = self.preproc_image("},
    {"filename": "/var/task/inference/core/models/roboflow.py", "lineno": 514, "function": "preproc_image", "code": "np_image, is_bgr = load_image("},
    {"filename": "/var/task/inference/core/utils/image_utils.py", "lineno": 93, "function": "load_image", "code": "np_image, is_bgr = load_image_with_known_type("},
    {"filename": "/var/task/inference/core/utils/image_utils.py", "lineno": 176, "function": "load_image_with_known_type", "code": "image = loader(value, cv_imread_flags)"},
    {"filename": "/var/task/inference/core/utils/image_utils.py", "lineno": 271, "function": "load_image_base64", "code": "value = value.decode(\"utf-8\")"}
  ]},
"filename": "error_handlers.py", "func_name": "wrapped_route", "lineno": 382}
```

## Test plan
- [ ] Send raw JPEG bytes with `type: "base64"` to a legacy inference endpoint — should now return 400 with descriptive error instead of 500
- [ ] Send a valid base64-encoded image — should continue to work as before
- [ ] Send a malformed base64 string — should still return the existing 400 error from the `binascii.Error` handler
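For reference, the client-side behavior the new 400 message asks for can be sketched as follows. The `type`/`value` payload shape follows the summary above; the fake JPEG bytes and the surrounding dict are illustrative, not the exact request schema.

```python
import base64

# Raw JPEG bytes begin with the SOI marker 0xFF 0xD8 and are not valid
# UTF-8, which is exactly what triggered the original 500.
raw = b"\xff\xd8\xff\xe0" + b"\x00" * 16  # illustrative fake JPEG header

payload = {
    "image": {
        "type": "base64",
        # The value must be a base64 *string*, never the raw bytes themselves.
        "value": base64.b64encode(raw).decode("ascii"),
    }
}

# The server can recover the original bytes from the encoded string.
assert base64.b64decode(payload["image"]["value"]) == raw
```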

🤖 Generated with [Claude Code](https://claude.com/claude-code)

When a client sends raw image bytes (e.g. a JPEG) with type "base64"
instead of an actual base64-encoded string, the server crashes with a
UnicodeDecodeError and returns a generic 500 Internal Server Error.
This change catches that error and raises InputImageLoadError instead,
giving the client a 400 with a clear message about what went wrong.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@grzegorz-roboflow grzegorz-roboflow merged commit b6318de into main Feb 19, 2026
51 checks passed
@grzegorz-roboflow grzegorz-roboflow deleted the fix/unicode-errors-handled-gracefully branch February 19, 2026 08:59
