
Conversation

@hertschuh
Collaborator

There were several issues with `keras.utils.plot_model()` / `keras.utils.model_to_dot()` when rendering nested models with `expand_nested=True`:

  • Sequential models were not expanded
  • Edges going into nested models would always connect to the first box in the subgraph, which is incorrect because:
    • Functional models can have multiple inputs and the edge needs to point to the correct input
    • The destination of the edge can be further nested in sub-models, such as Sequential sub-models
  • Edges going out of nested models would always connect from the last box in the subgraph, which is incorrect because:
    • Functional models can have multiple outputs and the edge needs to come from the correct layer
    • The source of the edge can be further nested in sub-models since there is no "output" box

This adds unit tests, which check that the graph has the expected topology (correct nodes and edges).
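
For illustration, here is a minimal sketch of the kind of nested model affected (the layer and model names are hypothetical, not taken from the tests):

```python
import keras

# A Sequential sub-model nested inside a functional model with two
# inputs -- previously the Sequential was not expanded, and edges into
# the subgraph attached to the wrong boxes.
inner = keras.Sequential(
    [keras.layers.Dense(8, activation="relu"), keras.layers.Dense(4)],
    name="inner_sequential",
)

x1 = keras.Input(shape=(16,), name="input_a")
x2 = keras.Input(shape=(16,), name="input_b")
merged = keras.layers.Concatenate()([x1, x2])
model = keras.Model([x1, x2], inner(merged), name="outer")

# With expand_nested=True, the Sequential's layers should be drawn
# inside a subgraph, with edges attached to the correct inner layers.
keras.utils.plot_model(model, "nested.png", expand_nested=True)
```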

Visual tests: https://colab.research.google.com/gist/hertschuh/fb7dfdbf4fb31139e33e750ab10aaad4/plot_model-testing.ipynb

Fixes #21119

@codecov-commenter

codecov-commenter commented May 2, 2025

Codecov Report

Attention: Patch coverage is 0% with 32 lines in your changes missing coverage. Please review.

Project coverage is 82.61%. Comparing base (48a6692) to head (b97b5ab).
Report is 6 commits behind head on master.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| keras/src/utils/model_visualization.py | 0.00% | 32 Missing ⚠️ |
Additional details and impacted files
@@           Coverage Diff           @@
##           master   #21243   +/-   ##
=======================================
  Coverage   82.60%   82.61%           
=======================================
  Files         564      564           
  Lines       54501    54581   +80     
  Branches     8469     8481   +12     
=======================================
+ Hits        45020    45091   +71     
- Misses       7397     7403    +6     
- Partials     2084     2087    +3     
| Flag | Coverage Δ |
|---|---|
| keras | 82.42% <0.00%> (+<0.01%) ⬆️ |
| keras-jax | 63.67% <0.00%> (-0.01%) ⬇️ |
| keras-numpy | 58.81% <0.00%> (+<0.01%) ⬆️ |
| keras-openvino | 32.99% <0.00%> (-0.01%) ⬇️ |
| keras-tensorflow | 64.10% <0.00%> (+0.01%) ⬆️ |
| keras-torch | 63.76% <0.00%> (-0.01%) ⬇️ |


@fchollet fchollet left a comment
Collaborator

Thanks for the PR!

`ImportError: You must install pydot (pip install pydot) for model_to_dot to work.`

If we want these tests on CI (questionable) we need to add pydot to requirements.

@fchollet
Collaborator

fchollet commented May 3, 2025

There is currently a manually-run integration test for plots. It's manually run because the idea is to manually inspect the generated image files after running it.

@hertschuh
Collaborator Author

> There is currently a manually-run integration test for plots. It's manually run because the idea is to manually inspect the generated image files after running it.

Oh, I didn't realize that.

@hertschuh
Collaborator Author

> Thanks for the PR!
>
> `ImportError: You must install pydot (pip install pydot) for model_to_dot to work.`
>
> If we want these tests on CI (questionable) we need to add pydot to requirements.

Hmm... it's easy enough to install. But let me try something; maybe I should just mock it, since we're not rendering anyway.

It does feel wasteful to run these every time for every backend, but then, when else would we run them?

@fchollet
Collaborator

fchollet commented May 3, 2025

> when else would we run them?

When we modify the plotting logic -- I feel like this strictly requires a step of manually inspecting the visual effect of a change across all kinds of models (which is what the current integration test seeks to achieve). It's basically impossible to catch issues without actually looking at the images that come out.

We rarely ever modify this logic, so it seems fine.

@hertschuh
Collaborator Author

hertschuh commented May 4, 2025

> We rarely ever modify this logic, so it seems fine.

I'll move the tests I created to the integration test file.

> I feel like this strictly requires a step of manually inspecting the visual effect of a change across all kinds of models (which is what the current integration test seeks to achieve). It's basically impossible to catch issues without actually looking at the images that come out.

Actually, the unit tests get you most of the way there (the existing ones and the ones I added). They verify that at least the topology is correct: the existence of nodes, the nesting of nodes in subgraphs, and the edges between nodes.

In fact, I found a bug that I didn't notice visually.
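
For instance, a topology check on the pydot graph returned by `model_to_dot` might look roughly like this (a sketch, not the actual test code; assumes pydot is installed):

```python
import keras
from keras.utils import model_to_dot

model = keras.Sequential(
    [keras.Input(shape=(4,)), keras.layers.Dense(2, name="dense")],
    name="seq",
)
dot = model_to_dot(model, expand_nested=True)

# Every edge must reference a node id that exists in the graph;
# otherwise it renders as a dangling arrow. (Top-level nodes only --
# layers of nested sub-models live inside subgraphs.)
node_ids = {node.get_name() for node in dot.get_nodes()}
for edge in dot.get_edges():
    assert edge.get_source() in node_ids, "dangling edge source"
    assert edge.get_destination() in node_ids, "dangling edge destination"
```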

This adds to the integration tests. In particular, it augments the way the topology is verified:
- it provides a detailed message for dangling edges
- it can inspect nested subgraphs (see the sketch below)
- it verifies that there are no extra edges beyond the ones expected
- it can verify splits in the graph

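The nested-subgraph inspection could be done with a small recursive helper along these lines (a sketch; the helper name is hypothetical, not the one in the test file):

```python
# Collect node names recursively: with expand_nested=True the layers of
# sub-models live inside pydot subgraphs (clusters), so a flat
# dot.get_nodes() call would miss them.
def collect_node_names(graph):
    names = {node.get_name() for node in graph.get_nodes()}
    for subgraph in graph.get_subgraphs():
        names |= collect_node_names(subgraph)
    return names
```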
@hertschuh hertschuh force-pushed the plot_model_fixes branch from a59e204 to b97b5ab on May 6, 2025 00:36
@hertschuh
Collaborator Author

@fchollet ready for review.

@fchollet fchollet left a comment
Collaborator

LGTM, thank you!

@google-ml-butler google-ml-butler bot added the kokoro:force-run and ready to pull labels May 7, 2025
@fchollet fchollet merged commit f98b91f into keras-team:master May 7, 2025
7 checks passed
@hertschuh hertschuh deleted the plot_model_fixes branch May 7, 2025 22:02
@miticollo
Contributor

miticollo commented May 21, 2025

Hey @hertschuh! Thank you for this PR. I have a question: can you look at this gist? Is this behavior intended?
I created two identical models, but I changed a line in the decoder:

`keras.layers.Input(shape=(30,), dtype='float32')`

into this:

`keras.layers.Input(tensor=encoder.outputs[0])`

IMO, the last plot is correct.
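
For context, a minimal sketch of the two variants (the encoder here is a hypothetical stand-in for the one in the gist):

```python
import keras

# A hypothetical encoder standing in for the one in the gist.
encoder_in = keras.Input(shape=(784,))
encoded = keras.layers.Dense(30, activation="relu")(encoder_in)
encoder = keras.Model(encoder_in, encoded, name="encoder")

# Variant 1: a fresh symbolic input with the matching shape.
decoder_in = keras.layers.Input(shape=(30,), dtype="float32")

# Variant 2: an input bound directly to the encoder's output tensor.
decoder_in = keras.layers.Input(tensor=encoder.outputs[0])
```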

@hertschuh
Collaborator Author

@miticollo

> IMO, the last plot is correct.

Correct, they should look the same. That's a bug.
