Fix #8462: embed patch sizes in einops pattern for einops >= 0.8 compatibility #8834

Open
williams145 wants to merge 5 commits into Project-MONAI:dev from williams145:fix/issue-8462-patchembedding-einops-compat

Conversation

@williams145
Contributor

Fixes #8462

Description

PatchEmbeddingBlock with proj_type="perceptron" raises TypeError on einops >= 0.8.x because Rearrange.__init__() no longer accepts arbitrary keyword arguments.

The current code builds an axes_len dict and passes it as **axes_len to Rearrange(pattern, **axes_len). einops 0.8.x removed support for this.

Fix: embed the patch size values directly as integer literals in the einops pattern string — e.g. "b c (h 16) (w 16) (d 16)" instead of using kwargs. einops supports integer literals in patterns across all versions including 0.8.x. The change is semantically identical.
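For context, here is a minimal 2D sketch of the construction involved, mirroring the MONAI snippet quoted in the review threads below (patch_size=(16, 16) is an illustrative choice, and the final line is where affected builds reportedly raise):

from einops.layers.torch import Rearrange

patch_size = (16, 16)
chars = (("h", "p1"), ("w", "p2"))
from_chars = "b c " + " ".join(f"({k} {v})" for k, v in chars)  # "b c (h p1) (w p2)"
to_chars = f"b ({' '.join(c[0] for c in chars)}) ({' '.join(c[1] for c in chars)} c)"  # "b (h w) (p1 p2 c)"
axes_len = {f"p{i + 1}": p for i, p in enumerate(patch_size)}  # {"p1": 16, "p2": 16}
# reported failure: TypeError: Rearrange.__init__() got an unexpected keyword argument 'p1'
layer = Rearrange(f"{from_chars} -> {to_chars}", **axes_len)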

Types of changes

  • Non-breaking change (fix or new feature that would not break existing functionality).
  • Breaking change
  • New tests added
  • Quick tests passed locally by running ./runtests.sh --quick --unittests --disttests.
  • In-line docstrings updated.

…s >= 0.8 compatibility

einops 0.8.x removed support for arbitrary kwargs in Rearrange.__init__().
Replace axes_len dict and **axes_len kwarg with integer literals embedded
directly in the pattern string. Semantically identical, compatible with all
einops versions.

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
@coderabbitai
Contributor

coderabbitai Bot commented Apr 18, 2026

📝 Walkthrough

Adds a fallback rearrangement path for proj_type == "perceptron" in PatchEmbeddingBlock: the initializer first tries to instantiate einops.layers.torch.Rearrange with axes_len; if that raises TypeError, it imports einops.rearrange and uses a small _RearrangeFn wrapper that calls the functional API with the same pattern and axis-length mapping. The rest of the perceptron branch (the nn.Linear(self.patch_dim, hidden_size) projection and subsequent embedding/position/dropout logic) is unchanged.
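A minimal sketch of the wrapper just described, assuming a plain functional import rather than MONAI's optional_import plumbing; it mirrors the PR's _RearrangeFn but is not the verbatim diff:

import torch
import torch.nn as nn
from einops import rearrange as einops_rearrange

class _RearrangeFn(nn.Module):
    """Functional-API fallback used when the Rearrange layer rejects axis kwargs."""

    def __init__(self, pattern: str, axes_lengths: dict) -> None:
        super().__init__()
        self.pattern = pattern
        self.axes_lengths = axes_lengths

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # einops.rearrange (the function) takes axis lengths as keyword arguments
        out: torch.Tensor = einops_rearrange(x, self.pattern, **self.axes_lengths)
        return out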

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning: docstring coverage is 33.33%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (4 passed)

  • Title check ✅ Passed: title directly references issue #8462 and clearly describes the main fix: embedding patch sizes in einops patterns for version 0.8+ compatibility.
  • Description check ✅ Passed: description covers the bug, the root cause (einops 0.8.x removed kwargs support), and the solution (embedding patch sizes as literals). Template sections are mostly complete.
  • Linked Issues check ✅ Passed: the PR addresses #8462 by fixing the TypeError raised when PatchEmbeddingBlock uses proj_type='perceptron' with einops ≥ 0.8, embedding patch sizes as integer literals in the pattern string instead of passing them as kwargs.
  • Out of Scope Changes check ✅ Passed: changes are scoped to PatchEmbeddingBlock's Rearrange initialization logic only, with a fallback wrapper for edge cases. No unrelated modifications present.



Contributor

@coderabbitai coderabbitai Bot left a comment


🧹 Nitpick comments (1)
monai/networks/blocks/patchembedding.py (1)

100-106: Fix looks correct and semantically equivalent.

Embedding integer literals in the einops pattern cleanly sidesteps the removed kwargs path in einops ≥ 0.8. Existing shape tests parameterized over spatial_dims ∈ {2, 3} and patch_size ∈ {8, 16} cover this path.

Minor nit (ruff B905): the zip on Line 102 could use strict=True, since dim_names and patch_size are both of length spatial_dims by construction; this would also surface any future spatial_dims > 3 misuse (currently silently truncated via ("h", "w", "d")[:spatial_dims]).

♻️ Optional diff
-            from_chars = "b c " + " ".join(f"({name} {psize})" for name, psize in zip(dim_names, patch_size))
+            from_chars = "b c " + " ".join(
+                f"({name} {psize})" for name, psize in zip(dim_names, patch_size, strict=True)
+            )


In some einops 0.8.x builds, Rearrange.__init__() does not accept
**kwargs for axis sizes, raising TypeError. Add _PatchRearrange as a
pure-PyTorch fallback that produces identical output. The primary path
still uses Rearrange with named axes (**axes_len); the fallback only
activates if that call raises TypeError.

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
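For reference, a hedged reconstruction of such a pure-PyTorch fallback (the class name _PatchRearrange comes from this commit; the body below is an illustrative sketch, not the verbatim diff). For a 2D input of shape (b, c, H, W) with patch size (p1, p2), it matches the einops pattern "b c (h p1) (w p2) -> b (h w) (p1 p2 c)":

from math import prod

import torch
import torch.nn as nn

class _PatchRearrange(nn.Module):
    def __init__(self, spatial_dims: int, patch_size: tuple) -> None:
        super().__init__()
        self.spatial_dims = spatial_dims
        self.patch_size = patch_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c = x.shape[0], x.shape[1]
        grid = tuple(s // p for s, p in zip(x.shape[2:], self.patch_size))
        # split each spatial axis into (grid, patch): b, c, g1, p1, g2, p2, ...
        shape = [b, c]
        for g, p in zip(grid, self.patch_size):
            shape += [g, p]
        x = x.reshape(shape)
        # bring grid axes forward, then patch axes, then channels last
        grid_axes = [2 + 2 * i for i in range(self.spatial_dims)]
        patch_axes = [3 + 2 * i for i in range(self.spatial_dims)]
        x = x.permute(0, *grid_axes, *patch_axes, 1)
        # reshape (not view), since the permuted tensor is non-contiguous
        return x.reshape(b, prod(grid), prod(self.patch_size) * c)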
Contributor

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3



Comment thread monai/networks/blocks/patchembedding.py Outdated
Comment on lines +126 to +135
            # for 3d: "b c (h p1) (w p2) (d p3) -> b (h w d) (p1 p2 p3 c)"
            chars = (("h", "p1"), ("w", "p2"), ("d", "p3"))[:spatial_dims]
            from_chars = "b c " + " ".join(f"({k} {v})" for k, v in chars)
            to_chars = f"b ({' '.join([c[0] for c in chars])}) ({' '.join([c[1] for c in chars])} c)"
            axes_len = {f"p{i + 1}": p for i, p in enumerate(patch_size)}
-           self.patch_embeddings = nn.Sequential(
-               Rearrange(f"{from_chars} -> {to_chars}", **axes_len), nn.Linear(self.patch_dim, hidden_size)
-           )
+           try:
+               rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len)
+           except TypeError:
+               rearrange_layer = _PatchRearrange(spatial_dims, tuple(int(p) for p in patch_size))
+           self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size))

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

einops Rearrange pattern integer literals support 0.8

💡 Result:

No, einops Rearrange patterns do not support standalone integer literals (e.g., 'b 2 h w') as axis specifications. Patterns use symbolic axis names (letters like 'b', 'h'), anonymous axes (numbers in parentheses like '(b1 b2)'), ellipsis (...), or underscores (_ for skipping). Integer values are provided via the axes_lengths keyword argument (e.g., rearrange(x, 'b (h1 h) w c -> ...', h1=2)) or parsed from tensor shapes via parse_shape. This syntax has been consistent across versions, including 0.8.0+ (released 2024-04-28), with no changes introducing integer literal support in patterns per release notes, changelogs, and docs. Examples in official documentation and tutorials confirm named axes only.


🏁 Script executed:

# Get the file and examine _PatchRearrange class and test coverage
head -150 monai/networks/blocks/patchembedding.py | tail -100

Repository: Project-MONAI/MONAI

Length of output: 4640


🏁 Script executed:

# Check for existing tests covering the fallback path
find . -name "*.py" -path "*/test*" -type f | xargs grep -l "patchembedding" | head -5

Repository: Project-MONAI/MONAI

Length of output: 130


🏁 Script executed:

# Get the _PatchRearrange class definition (around lines 32-55)
sed -n '32,56p' monai/networks/blocks/patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 958


🏁 Script executed:

# Check the test file for coverage of _PatchRearrange and the fallback path
cat monai/tests/networks/blocks/test_patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 145


🏁 Script executed:

# Find the test file with correct path
find . -name "test_patchembedding.py" -type f 2>/dev/null

Repository: Project-MONAI/MONAI

Length of output: 109


🏁 Script executed:

# Also check for any tests that import or test _PatchRearrange
rg "_PatchRearrange" --type py

Repository: Project-MONAI/MONAI

Length of output: 275


🏁 Script executed:

# Read the test file
cat ./tests/networks/blocks/test_patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 7510


🏁 Script executed:

# Check einops version constraints and any comments about the fallback logic
head -30 monai/networks/blocks/patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 1308


🏁 Script executed:

# Verify if x.view() in _PatchRearrange could fail on non-contiguous tensors
rg "x\.view\(" monai/networks/blocks/patchembedding.py -A 2 -B 2

Repository: Project-MONAI/MONAI

Length of output: 233


Retract the simplification suggestion; the current try/except approach is necessary.

einops Rearrange does not support integer literals in patterns—numeric axis values must be passed via axes_lengths (the current axes_len dict). The suggested approach of embedding integers as literals (e.g., "b c (h 16) (w 16) (d 16)") is not feasible. The fallback with _PatchRearrange is the correct strategy for version compatibility.
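To make the supported form concrete, a minimal example with named axes and lengths passed separately (assuming einops is installed):

import torch
from einops import rearrange

x = torch.rand(2, 3, 32, 32)
# named axes with lengths supplied as kwargs; bare "(h 16)" decompositions are rejected
y = rearrange(x, "b c (h p1) (w p2) -> b (h w) (p1 p2 c)", p1=16, p2=16)
print(y.shape)  # torch.Size([2, 4, 768])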

However, address these remaining issues in _PatchRearrange:

  1. Missing Google-style docstrings: Add docstrings to __init__ and forward methods describing arguments, return values, and behavior per coding guidelines.
  2. Type hint specificity: Change patch_size: tuple to patch_size: tuple[int, ...].
  3. Use reshape() instead of view(): Line 47 uses x.view(*v), which fails on non-contiguous tensors; reshape() is safer.
  4. Incomplete test coverage: The test suite only exercises the Rearrange path (since einops is installed). The fallback is never deterministically validated. Add a test that directly instantiates and tests _PatchRearrange independently.

Comment on lines +131 to +135
+           try:
+               rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len)
+           except TypeError:
+               rearrange_layer = _PatchRearrange(spatial_dims, tuple(int(p) for p in patch_size))
+           self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size))

⚠️ Potential issue | 🟡 Minor

Fallback path isn't deterministically covered by tests.

_PatchRearrange only executes when Rearrange(..., **axes_len) raises TypeError, i.e. only on einops ≥ 0.8. Existing test_shape cases (tests/networks/blocks/test_patchembedding.py) exercise whichever branch the installed einops version selects, so CI never compares both paths in the same run. Suggest a targeted test that forces the fallback — e.g. monkey-patch Rearrange to raise TypeError, or instantiate _PatchRearrange directly and compare its output against the Rearrange path for a known input.

As per coding guidelines: "Ensure new or modified definitions will be covered by existing or new unit tests."
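A hedged sketch of such a test; the import path and assertions are illustrative, with _PatchRearrange taken from this PR's diff:

import torch
from einops.layers.torch import Rearrange
from monai.networks.blocks.patchembedding import _PatchRearrange

def test_patch_rearrange_matches_einops():
    x = torch.rand(2, 3, 32, 32)
    reference = Rearrange("b c (h p1) (w p2) -> b (h w) (p1 p2 c)", p1=16, p2=16)
    fallback = _PatchRearrange(spatial_dims=2, patch_size=(16, 16))
    # the fallback must be numerically identical to the einops path
    torch.testing.assert_close(fallback(x), reference(x))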


Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
Contributor

@coderabbitai coderabbitai Bot left a comment


♻️ Duplicate comments (2)
monai/networks/blocks/patchembedding.py (2)

131-135: ⚠️ Potential issue | 🟡 Minor

Add deterministic coverage for the fallback branch.

The fallback only runs when Rearrange(...) raises TypeError, so normal shape tests cover whichever branch the installed dependency happens to take. Add a direct _PatchRearrange test or monkey-patch Rearrange to force this path.

As per coding guidelines: "Ensure new or modified definitions will be covered by existing or new unit tests."

#!/bin/bash
# Description: Check whether tests directly exercise _PatchRearrange or force the Rearrange fallback.

fd -i '^test_patchembedding\.py$' -x sh -c '
  printf "\n== %s ==\n" "$1"
  rg -n -C3 "_PatchRearrange|monkeypatch|Rearrange|patch_embeddings" "$1" || true
' sh {}

32-55: ⚠️ Potential issue | 🟡 Minor

Carry forward the fallback hardening fixes.

forward() still uses view() on user tensors, so non-contiguous inputs can fail. Also tighten patch_size, add Google-style method docstrings, and address Ruff B905 with explicit zip(strict=...) if the Python target supports it.

As per coding guidelines: "Docstrings should be present for all definition which describe each variable, return value, and raised exception in the appropriate section of the Google-style of docstrings."

Proposed patch
 class _PatchRearrange(nn.Module):
-    """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""
+    """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""
 
-    def __init__(self, spatial_dims: int, patch_size: tuple) -> None:
+    def __init__(self, spatial_dims: int, patch_size: tuple[int, ...]) -> None:
+        """Initialize the patch rearrangement layer.
+
+        Args:
+            spatial_dims: Number of spatial dimensions.
+            patch_size: Patch size for each spatial dimension.
+        """
         super().__init__()
         self.spatial_dims = spatial_dims
         self.patch_size = patch_size
 
     def forward(self, x: torch.Tensor) -> torch.Tensor:
+        """Rearrange image patches into a patch sequence.
+
+        Args:
+            x: Input tensor with shape ``(batch, channels, *spatial)``.
+
+        Returns:
+            Tensor with shape ``(batch, n_patches, patch_dim)``.
+
+        Raises:
+            RuntimeError: If ``x`` cannot be reshaped into the configured patch grid.
+        """
         batch, channels = x.shape[0], x.shape[1]
         sp = x.shape[2:]
-        g = tuple(s // p for s, p in zip(sp, self.patch_size))
+        g = tuple(s // p for s, p in zip(sp, self.patch_size, strict=True))
         v: list[int] = [batch, channels]
-        for gi, pi in zip(g, self.patch_size):
+        for gi, pi in zip(g, self.patch_size, strict=True):
             v += [gi, pi]
-        x = x.view(*v)
+        x = x.reshape(*v)

Verify the Python target before applying zip(strict=True):

#!/bin/bash
# Description: Check project Python target metadata for zip(strict=True) support.

fd '^(pyproject\.toml|setup\.cfg|setup\.py)$' --max-depth 2 -x sh -c '
  printf "\n== %s ==\n" "$1"
  grep -nEi "requires-python|python_requires|target-version" "$1" || true
' sh {}

python - <<'PY'
import sys
print("sandbox_python:", sys.version)
print("zip_strict_supported:", sys.version_info >= (3, 10))
PY


@williams145
Contributor Author

@ericspod: CI is green and the branch is up to date. Happy to answer any questions if helpful.

@ericspod
Member

Hi @williams145 the following works for me with einops 0.8.1:

import torch
from einops.layers.torch import Rearrange
images = torch.rand(3, 30, 40, 3)
kwargs = {"h1": 2, "w1": 2}
r = Rearrange('b (h h1) (w w1) c -> b h w (c h1 w1)', **kwargs)
print(r(images).shape)  # torch.Size([3, 15, 20, 12])

I think this replicates the use case in which you're seeing an error, can you check this code for yourself? Which version of einops specifically are you using?

Even if this is a real issue needing to be fixed, I would not recommend re-implementing a rearranging algorithm, which is going to be error-prone and hard to test, as the CodeRabbit comment above suggests. I would look into using the einops.rearrange function in your class regardless; we should use the einops function, which has been well tested.

…tead of re-implementation

Replace the pure-torch _PatchRearrange class with _RearrangeFn, a thin
nn.Module wrapper around einops.rearrange (the function). The function's
**kwargs signature is stable across einops 0.8.x versions, so it
correctly handles axis-size kwargs even on builds where the Rearrange
layer's __init__ rejects them.

Avoids re-implementing the rearrangement logic in pure torch, per
review feedback.

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
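A sketch of how this wires into the perceptron branch, based on the try/except fragment quoted earlier with _PatchRearrange swapped for the new wrapper (a fragment of __init__, not a standalone snippet):

try:
    rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len)
except TypeError:
    # same pattern and axis-length mapping, via the functional API
    rearrange_layer = _RearrangeFn(f"{from_chars} -> {to_chars}", axes_len)
self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size))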
@williams145
Contributor Author

Thanks for taking a close look. You're right — I tried reproducing on my end with a fresh pip install einops==0.8.1 and the kwargs path works fine, just like your example. The reporter listed einops==0.8.1 in their environment but the failure mode (Rearrange.init() got an unexpected keyword argument 'p1' — note init not __init__) suggests something non-standard in their setup, possibly a custom build or namespace conflict.

Your point about avoiding a re-implementation is well taken. I've dropped the pure-torch fallback and replaced it with a thin wrapper around einops.rearrange (the function), so even the fallback path uses einops' tested rearrangement engine. The primary path still uses Rearrange as before — the wrapper only kicks in if Rearrange.__init__ raises TypeError on a particular build.

Let me know if you'd prefer a different approach, or if you'd rather close this since the bug isn't reliably reproducible.

While you're here — would also appreciate a look at #8835 when you get a chance. It's a small, green, focused fix for a LoadImage crash when set_track_meta(False) is active.

einops_rearrange comes from optional_import which returns Any, so the
forward return type was leaking Any. Annotate the local variable as
torch.Tensor to satisfy mypy.

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
Contributor

@coderabbitai coderabbitai Bot left a comment


🧹 Nitpick comments (1)
monai/networks/blocks/patchembedding.py (1)

33-43: ⚡ Quick win

Add Google-style method docstrings and tighten axes_lengths typing.

_RearrangeFn.__init__ and forward are missing required Google-style docstrings, and axes_lengths: dict is too generic.

♻️ Proposed patch
 class _RearrangeFn(nn.Module):
@@
-    def __init__(self, pattern: str, axes_lengths: dict) -> None:
+    def __init__(self, pattern: str, axes_lengths: dict[str, int]) -> None:
+        """Initialize fallback functional rearrange module.
+
+        Args:
+            pattern: Einops rearrangement pattern.
+            axes_lengths: Named axis lengths consumed by ``einops.rearrange``.
+        """
         super().__init__()
         self.pattern = pattern
         self.axes_lengths = axes_lengths
 
     def forward(self, x: torch.Tensor) -> torch.Tensor:
+        """Rearrange an input tensor.
+
+        Args:
+            x: Input tensor.
+
+        Returns:
+            Rearranged tensor.
+        """
         out: torch.Tensor = einops_rearrange(x, self.pattern, **self.axes_lengths)
         return out

As per coding guidelines, "Docstrings should be present for all definition which describe each variable, return value, and raised exception in the appropriate section of the Google-style of docstrings."




Development

Successfully merging this pull request may close these issues.

[BUG] einops.layer.torch.Rearrange got an unexpected keyword argument
