Detect used Python packages and add them automatically when registering the Python module #8
DevTGHa wants to merge 4 commits into `jupyter-support`
Conversation
Walkthrough

Implements lazy module registration on first execute/rpc call, adds AST-based per-function import discovery and PackageInfo models, renames plugin API functions, changes session context modules from lists to sets, updates client/session call sites, expands CI workflows, and adds/updates tests and documentation for these behaviors.
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant Session as D3Session
    participant API as Designer API
    participant Registry as Module Registry

    rect rgba(255, 200, 100, 0.5)
        Note over Client,Registry: Old Flow (Explicit Registration)
        Client->>Session: with Session(context_modules={module_x})
        Session->>API: register_module(module_x)
        API->>Registry: Register Module
        Client->>Session: execute(payload)
        Session->>API: d3_api_execute(payload)
    end

    rect rgba(100, 200, 255, 0.5)
        Note over Client,Registry: New Flow (Lazy Registration)
        Client->>Session: execute(payload(moduleName="module_x"))
        Session->>Session: Check: "module_x" in registered_modules?
        alt Not registered
            Session->>API: register_module(module_x)
            API->>Registry: Register Module
            Session->>Session: Add "module_x" to registered_modules
        end
        Session->>API: d3_api_execute(payload)
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 6
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (5)
src/designer_plugin/d3sdk/client.py (1)
86-89: ⚠️ Potential issue | 🟡 Minor. Update stale docstring references to old function names.

The docstring still references `d3_api_plugin` and `d3_api_aplugin` (line 88), but these have been renamed to `d3_api_execute` and `d3_api_aexecute`.

📝 Proposed fix

```diff
- 5. Sends it to Designer via d3_api_plugin or d3_api_aplugin
+ 5. Sends it to Designer via d3_api_execute or d3_api_aexecute
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/designer_plugin/d3sdk/client.py` around lines 86 - 89, Update the stale docstring in client.py that still mentions d3_api_plugin and d3_api_aplugin: change those references to the new function names d3_api_execute and d3_api_aexecute and ensure any explanatory text that describes synchronous vs asynchronous calls matches the current behavior of d3_api_execute/d3_api_aexecute; verify the docstring around the PluginPayload/script creation (mentions of "return plugin.{method_name}({args})" and the call flow) uses the new names so readers can find the correct functions.

.github/workflows/ci.yml (1)
20-41: ⚠️ Potential issue | 🟠 Major. Wire the matrix Python into uv itself.

The current workflow doesn't properly configure uv to use the matrix Python version. The `uv python install` step installs the version but doesn't guarantee subsequent `uv sync` and `uv run` commands use it. Per official uv documentation, use the `python-version` parameter in the `astral-sh/setup-uv` action, which sets the `UV_PYTHON` environment variable for all downstream commands. Without this, all matrix jobs may collapse to the same Python interpreter.

Suggested fix

```diff
-      - name: Install uv
+      - name: Install uv and set Python ${{ matrix.python-version }}
         uses: astral-sh/setup-uv@v7
         with:
           enable-cache: true
+          python-version: ${{ matrix.python-version }}
-
-      - name: Set up Python ${{ matrix.python-version }}
-        run: uv python install ${{ matrix.python-version }}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/ci.yml around lines 20 - 41, The workflow currently installs the matrix Python with a separate "Set up Python" step but doesn't ensure uv uses that interpreter; update the "Install uv" step that uses astral-sh/setup-uv@v7 to include the python-version input set to ${{ matrix.python-version }} (this sets UV_PYTHON for downstream steps) and remove or make the "Set up Python" `uv python install` step redundant; ensure step names ("Install uv", action astral-sh/setup-uv@v7) are the ones updated so subsequent `uv sync` and `uv run` commands use the matrix Python.

src/designer_plugin/d3sdk/session.py (3)
32-43: ⚠️ Potential issue | 🟡 Minor. Docstring inconsistency: parameter is `set[str]`, not `list[str]`.

The docstring at line 38 says "List of module names" but the parameter type is `set[str]`.

📝 Proposed fix

```diff
 def __init__(self, hostname: str, port: int, context_modules: set[str]) -> None:
     """Initialize base session with connection details and module context.

     Args:
         hostname: The hostname of the Designer instance.
         port: The port number of the Designer instance.
-        context_modules: List of module names to register when entering session context.
+        context_modules: Set of module names to register when entering session context.
     """
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/designer_plugin/d3sdk/session.py` around lines 32 - 43, Docstring for __init__ incorrectly describes context_modules as a "List of module names" while the parameter type is set[str]; update the docstring for the __init__ method (the constructor that assigns self.hostname, self.port, self.context_modules, self.registered_modules) to refer to "set of module names" (or "collection/set of module names") so the parameter description matches the type annotation.
57-66: ⚠️ Potential issue | 🟡 Minor. Docstring inconsistency: type is `set[str]`, not `list[str]`.

Line 64 says "Optional list of module names" but the parameter type is `set[str] | None`.

📝 Proposed fix

```diff
 """Initialize synchronous Designer session.

 Args:
     hostname: The hostname of the Designer instance.
     port: The port number of the Designer instance.
-    context_modules: Optional list of module names to register when entering session context.
+    context_modules: Optional set of module names to register when entering session context.
 """
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/designer_plugin/d3sdk/session.py` around lines 57 - 66, The docstring for the __init__ method incorrectly calls context_modules an "Optional list of module names" while the parameter type is set[str] | None; update the docstring in the __init__ (the constructor that accepts context_modules) to describe it as an "optional set of module names" (or "optional collection/set of module names") so the text matches the type annotation for context_modules and adjust any wording around registration when entering session context accordingly.
194-203: ⚠️ Potential issue | 🟡 Minor. Docstring inconsistency: type is `set[str]`, not `list[str]`.

Line 201 says "Optional list of module names" but the parameter type is `set[str] | None`.

📝 Proposed fix

```diff
 """Initialize asynchronous Designer session.

 Args:
     hostname: The hostname of the Designer instance.
     port: The port number of the Designer instance.
-    context_modules: Optional list of module names to register when entering session context.
+    context_modules: Optional set of module names to register when entering session context.
 """
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/designer_plugin/d3sdk/session.py` around lines 194 - 203, The docstring for the session initializer incorrectly refers to "Optional list of module names" while the parameter context_modules is typed as set[str] | None; update the docstring in the __init__ method (the session initializer) to reference a set (e.g., "Optional set of module names to register when entering session context") or otherwise align the type and description (change the type to list[str] if you intend a list) so the parameter description matches the declared type context_modules.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@README.md`:
- Around line 86-91: The README currently shows an unconditional import of
designer_plugin.pystub which makes the stub a runtime dependency and causes
automatic import detection (see ast_utils.py) to include it in remote payloads;
change the examples to use typing.TYPE_CHECKING and wrap the import in an if
TYPE_CHECKING: block (i.e., reference TYPE_CHECKING and import
designer_plugin.pystub only inside that conditional) so the stubs remain
available for IDE/type hints but are excluded from runtime/import detection used
for remote execution.
In `@src/designer_plugin/d3sdk/ast_utils.py`:
- Around line 56-66: The mypy error is caused by assigning different AST node
types to the same variable `node` in `to_import_statement`; update `node` with
an explicit union type annotation (e.g., ast.Import | ast.ImportFrom) at its
declaration inside the `to_import_statement` method so both `ast.ImportFrom` and
`ast.Import` assignments are type-compatible for `ast.unparse`.
- Around line 446-453: The type-narrowing issue comes from assigning an
ast.Attribute to `root` then reassigning it to `root.value` (an ast.expr),
confusing mypy; update the declaration so `root` is typed as `ast.expr` (e.g.,
replace `root = node` with an explicit annotation `root: ast.expr = node`) so
the while loop `while isinstance(root, ast.Attribute): root = root.value` and
the subsequent `isinstance(root, ast.Name)` check are type-correct for mypy in
the function that processes `ast.Attribute` nodes.
In `@src/designer_plugin/d3sdk/function.py`:
- Around line 271-281: The current replacement of D3Function only updates the
in-process registry and doesn’t force re-registration in existing
D3Session/D3AsyncSession instances that cache modules in registered_modules;
modify the design to track a module version or dirty flag on D3Function (e.g.,
bump a version or set is_dirty when replaced) and store that version in each
session’s registered_modules map, then change D3Session.register_module (or the
callers execute()/rpc()) to compare the local D3Function module version against
the session’s cached version and force re-registration when they differ; update
D3Function._available_d3functions management to increment the module
version/mark dirty on replacement and add a regression test that opens a
session, calls execute(), redefines the function locally, and verifies
subsequent execute()/rpc() triggers re-registration and runs the new code.
- Around line 121-130: find_imports_for_function currently inspects only names
referenced in the function body, missing imports used in defaults/kw-defaults at
definition time; update find_imports_for_function (and any callers that build
packages for FunctionInfo) to traverse the entire AST FunctionDef node including
args.defaults, args.kw_defaults, and decorator_list to collect imports required
at definition time, then include those packages in the
FunctionInfo(packages=...) payload; add a regression test that defines a
function using an import in a default (e.g., def f(now=datetime.datetime.now()):
...) and asserts registration succeeds and the discovered packages include the
datetime import.
- Around line 266-281: When replacing a D3Function you currently discard the old
function but never remove its package imports, so
_available_packages[module_name] grows stale; fix by rebuilding
D3Function._available_packages[module_name] from the current set of registered
functions for that module: after you discard the old function and add the new
one to D3Function._available_d3functions[module_name] (using the existing
module_name and self._function_info.packages symbols), recompute
D3Function._available_packages[module_name] as the union of
pkg.to_import_statement() for every pkg in every f._function_info.packages for
each f in D3Function._available_d3functions[module_name]; assign that rebuilt
set back to D3Function._available_packages[module_name] so removed imports are
no longer present.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 2712682c-bc61-44e0-941c-dd73435d3adb
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (18)
.github/workflows/ci.yml
.github/workflows/test-publish.yml
CHANGELOG.md
CONTRIBUTING.md
README.md
pyproject.toml
src/designer_plugin/api.py
src/designer_plugin/d3sdk/__init__.py
src/designer_plugin/d3sdk/ast_utils.py
src/designer_plugin/d3sdk/builtin_modules.py
src/designer_plugin/d3sdk/client.py
src/designer_plugin/d3sdk/function.py
src/designer_plugin/d3sdk/session.py
tests/test_ast_utils.py
tests/test_client.py
tests/test_core.py
tests/test_session.py
tests/test_supported_modules.py
Once installed, import the stubs.

> **Important:** `pystub` provides type hints for Designer's API objects but not their implementations. These objects only exist in Designer's runtime and cannot be used in local Python code. They must only be referenced in code that will be executed remotely in Designer.

```python
from typing import TYPE_CHECKING
if TYPE_CHECKING:
    from designer_plugin.pystub.d3 import *
    from designer_plugin.pystub import *
```
Move designer_plugin.pystub imports behind TYPE_CHECKING blocks.
The README currently recommends unconditional imports of pystub, but this makes the stub package a runtime dependency of code sent for remote execution. The codebase's automatic import detection explicitly includes unconditional imports in registered modules (see ast_utils.py), while specifically excluding imports in TYPE_CHECKING blocks. Since pystub provides only type hints and not implementations, unconditional imports should be avoided.
Use this pattern instead:
Suggested import pattern

```python
from typing import TYPE_CHECKING
if TYPE_CHECKING:
    from designer_plugin.pystub import *
```

This allows IDE autocomplete while preventing the stub package from being included in remote execution payloads.
Applies to: lines 86–91, 103–104, 195–196
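The exclusion the review relies on can be sketched with the standard `ast` module. This toy `runtime_imports` mirrors the described behavior (skipping `if TYPE_CHECKING:` blocks during import discovery) but is not the project's actual implementation in ast_utils.py:

```python
import ast

SOURCE = """
from typing import TYPE_CHECKING
import json

if TYPE_CHECKING:
    from designer_plugin.pystub import something
"""


def runtime_imports(source: str) -> list[str]:
    """Collect top-level imports, skipping `if TYPE_CHECKING:` blocks."""
    tree = ast.parse(source)
    found: list[str] = []
    for node in tree.body:
        if (isinstance(node, ast.If)
                and isinstance(node.test, ast.Name)
                and node.test.id == "TYPE_CHECKING"):
            continue  # type-only imports never reach the remote payload
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            found.append(ast.unparse(node))
    return found


print(runtime_imports(SOURCE))
# → ['from typing import TYPE_CHECKING', 'import json']
```

Because the `pystub` import sits inside the `TYPE_CHECKING` block, it is invisible to this kind of discovery, which is exactly why the README examples should use that pattern.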
```python
packages = find_imports_for_function(func)

return FunctionInfo(
    source_code=source_code_py3,
    source_code_py27=source_code_py27,
    name=function_name,
    body=body.strip(),
    body_py27=body_py27.strip(),
    args=args,
    packages=packages,
```
Definition-time imports are still missed.
find_imports_for_function() only tracks names referenced from the function body, but the register payload executes the whole def. Imports used only in defaults / kw-defaults are still required when Designer evaluates the function definition, so cases like def f(now=datetime.datetime.now()) will still fail to register. Please widen discovery to the full function node and add a regression test.
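The gap described above is easy to demonstrate with the standard `ast` module: collecting names only from the body statements misses `datetime`, while walking the full `FunctionDef` node (which includes `args.defaults`, `args.kw_defaults`, and `decorator_list`) picks it up. This is a standalone illustration of the problem, not the project's discovery code:

```python
import ast
import textwrap

source = textwrap.dedent("""
    def f(now=datetime.datetime.now()):
        return now
""")

func_node = ast.parse(source).body[0]

# Body-only discovery: the default expression is evaluated at definition
# time, outside the body statements, so `datetime` is never seen.
body_names = {n.id for stmt in func_node.body
              for n in ast.walk(stmt) if isinstance(n, ast.Name)}

# Walking the whole FunctionDef (body, defaults, kw_defaults, decorators)
# also visits the default expression.
all_names = {n.id for n in ast.walk(func_node) if isinstance(n, ast.Name)}

print("datetime" in body_names, "datetime" in all_names)
# → False True
```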
```python
# Auto-register packages used by this function
D3Function._available_packages[module_name].update(
    pkg.to_import_statement() for pkg in self._function_info.packages
)

# Update the function in case the function was updated in same session.
# For example, jupyter notebook server can be running, but function signature can
# change constantly.
if self in D3Function._available_d3functions[module_name]:
    logger.warning(
        "Function '%s' in module '%s' is being replaced.",
        self.name,
        module_name,
    )
    D3Function._available_d3functions[module_name].discard(self)
D3Function._available_d3functions[module_name].add(self)
```
Recompute module imports after a replacement.
On redefinition the old D3Function is removed, but _available_packages[module_name] only ever grows. If the previous version imported something unavailable on Designer and the new version removes it, that stale import stays in every later register payload.
Suggested fix

```diff
-        # Auto-register packages used by this function
-        D3Function._available_packages[module_name].update(
-            pkg.to_import_statement() for pkg in self._function_info.packages
-        )
-
         # Update the function in case the function was updated in same session.
         # For example, jupyter notebook server can be running, but function signature can
         # change constantly.
         if self in D3Function._available_d3functions[module_name]:
             logger.warning(
                 "Function '%s' in module '%s' is being replaced.",
                 self.name,
                 module_name,
             )
             D3Function._available_d3functions[module_name].discard(self)
         D3Function._available_d3functions[module_name].add(self)
+        D3Function._available_packages[module_name] = {
+            pkg.to_import_statement()
+            for fn in D3Function._available_d3functions[module_name]
+            for pkg in fn.function_info.packages
+        }
```
```python
# Update the function in case the function was updated in same session.
# For example, jupyter notebook server can be running, but function signature can
# change constantly.
if self in D3Function._available_d3functions[module_name]:
    logger.warning(
        "Function '%s' in module '%s' is being replaced.",
        self.name,
        module_name,
    )
    D3Function._available_d3functions[module_name].discard(self)
D3Function._available_d3functions[module_name].add(self)
```
Function replacement does not invalidate live sessions.
This only updates the in-process registry. Once a D3Session / D3AsyncSession has cached the module in registered_modules, later execute() / rpc() calls will skip register_module() and keep running the old remote code, so notebook redefinitions do not take effect in an already-open session. Consider tracking a module version/dirty flag and forcing re-registration when the local definition changes, plus a regression test that redefines a function after the first execute().
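The suggested version-tracking scheme might look like this sketch (hypothetical `VersionedRegistry` and `Session` classes standing in for D3Function's registry and D3Session; not the real types or API):

```python
class VersionedRegistry:
    """Toy module registry that bumps a version on every redefinition."""

    def __init__(self) -> None:
        self.versions: dict[str, int] = {}

    def replace(self, module: str) -> None:
        # Bump the module version whenever a function in it is redefined.
        self.versions[module] = self.versions.get(module, 0) + 1


class Session:
    """Toy session that caches the module version it last registered."""

    def __init__(self, registry: VersionedRegistry) -> None:
        self.registry = registry
        self.registered: dict[str, int] = {}  # module -> version seen
        self.registrations = 0  # stand-in for register_module API calls

    def execute(self, module: str) -> None:
        current = self.registry.versions.get(module, 0)
        # Re-register when the module is new OR its local version changed.
        if self.registered.get(module) != current:
            self.registrations += 1
            self.registered[module] = current


registry = VersionedRegistry()
session = Session(registry)
session.execute("m")     # first call registers
session.execute("m")     # cached, no re-registration
registry.replace("m")    # notebook redefines a function in "m"
session.execute("m")     # version changed -> re-register
print(session.registrations)
# → 2
```

Storing a version per module rather than a plain membership set is the minimal change that lets already-open sessions notice notebook redefinitions.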
Force-push: 9f76c63 to 6e33aa4
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/designer_plugin/d3sdk/ast_utils.py`:
- Around line 505-523: The current import collection only iterates
module_tree.body and therefore misses imports inside the function body and
imports nested inside module-level control blocks (If/Try) or inside the
function node itself; update the import-discovery to walk the entire AST subtree
for relevant Import and ImportFrom nodes (e.g., use ast.walk on module_tree and
also inspect func_node for local imports), then filter those imports by
used_names from _collect_used_names and populate packages (PackageInfo)
accordingly; ensure you still skip TYPE_CHECKING blocks via
_is_type_checking_block and preserve existing behavior for async vs sync
function nodes.
- Around line 524-531: The import handling currently treats the full dotted
module name (alias.name) as the bound name, which is wrong for statements like
"import sklearn.linear_model" where Python binds only "sklearn"; update the
binding logic in the ast.Import branch (inside the function processing
node/alias) so that effective_name = alias.asname if alias.asname else
alias.name.split(".")[0] (i.e., take the top-level package when there is no
alias), then use that effective_name when checking membership in used_names and
for downstream dependency detection; keep the existing _is_builtin_package check
and other logic unchanged.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 0282693f-e5a8-4d1b-b4cf-990554e0be5a
📒 Files selected for processing (1)
src/designer_plugin/d3sdk/ast_utils.py
```diff
-source: str = inspect.getsource(modules)
-
-# Parse the source code
-tree = ast.parse(source)
-
-# Get the name of this function to filter it out
-# For example, we don't want `from core import find_packages_in_current_file`
-function_name: str = current_frame.f_code.co_name
-# Skip any package from d3blobgen
-d3blobgen_package_name: str = "d3blobgen"
-# typing not supported in python2.7
-typing_package_name: str = "typing"
-
-def is_type_checking_block(node: ast.If) -> bool:
-    """Check if an if statement is 'if TYPE_CHECKING:'"""
-    return isinstance(node.test, ast.Name) and node.test.id == "TYPE_CHECKING"
-
-imports: list[str] = []
-for node in tree.body:
-    # Skip TYPE_CHECKING blocks entirely
-    if isinstance(node, ast.If) and is_type_checking_block(node):
-        continue
+# --- 2. Collect names used inside the function body ---
+func_source = textwrap.dedent(inspect.getsource(func))
+func_tree = ast.parse(func_source)
+if not func_tree.body:
+    return []
+
+func_node = func_tree.body[0]
+if not isinstance(func_node, (ast.FunctionDef, ast.AsyncFunctionDef)):
+    return []
+
+used_names = _collect_used_names(func_node)
+
+# --- 3. Parse file-level imports and filter to used ones ---
+packages: list[PackageInfo] = []
+for node in module_tree.body:
+    # Skip TYPE_CHECKING blocks
+    if isinstance(node, ast.If) and _is_type_checking_block(node):
+        continue
```
Broaden discovery beyond direct module-body imports.
Lines 517–530 only walk module_tree.body directly, missing imports declared inside the target function itself and imports wrapped in module-level if/try blocks. Functions can depend on packages through either pattern, but they will not be auto-registered. For example:
- Imports inside the function: `def my_func(): from pathlib import Path; ...`
- Conditional imports: `if sys.version_info >= (3, 9): from importlib import xyz`
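The difference between iterating `module_tree.body` and walking the whole subtree is easy to show with the standard `ast` module (a standalone illustration of the gap, not the project's discovery code):

```python
import ast

SOURCE = """
import sys

if sys.version_info >= (3, 9):
    import zoneinfo

def my_func():
    from pathlib import Path
    return Path(".")
"""

tree = ast.parse(SOURCE)

# Iterating only the module body sees just the top-level `import sys`.
top_level = [n for n in tree.body
             if isinstance(n, (ast.Import, ast.ImportFrom))]

# ast.walk visits every node in the subtree, including the `if` block
# and the function body, so it also finds zoneinfo and pathlib.
everywhere = [n for n in ast.walk(tree)
              if isinstance(n, (ast.Import, ast.ImportFrom))]

print(len(top_level), len(everywhere))
# → 1 3
```

A walk-based collector would still need to special-case `TYPE_CHECKING` blocks, since `ast.walk` descends into them as well.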
```diff
 if isinstance(node, ast.Import):
-    imported_modules: list[str] = [alias.name for alias in node.names]
-    # Skip imports that include d3blobgen
-    if any(d3blobgen_package_name in module for module in imported_modules):
-        continue
-    if any(typing_package_name in module for module in imported_modules):
-        continue
-    import_text: str = f"import {', '.join(imported_modules)}"
-    imports.append(import_text)
+    for alias in node.names:
+        if not _is_builtin_package(alias.name):
+            continue
+
+        # The name used in code is the alias if present, otherwise the module name
+        effective_name = alias.asname if alias.asname else alias.name
+        if effective_name in used_names:
```
🧩 Analysis chain
🏁 Script executed:
# First, let's examine the file and the specific lines
head -540 src/designer_plugin/d3sdk/ast_utils.py | tail -30
Length of output: 1192
🏁 Script executed:
# Let's see more context around those lines
sed -n '510,540p' src/designer_plugin/d3sdk/ast_utils.py
Length of output: 1193
🏁 Script executed:
# Check if there are tests for this function
find . -name "*test*" -type f -exec grep -l "ast_utils\|_is_builtin_package" {} \;
Length of output: 118
🏁 Script executed:
# Check the test file for relevant tests
cat ./tests/test_ast_utils.py
Length of output: 41997
🏁 Script executed:
# Let's also verify Python's import behavior with a quick test
python3 << 'EOF'
import ast
# Test 1: import sklearn.linear_model
code1 = "import sklearn.linear_model"
tree1 = ast.parse(code1)
import_node = tree1.body[0]
alias = import_node.names[0]
print(f"Test 1: {code1}")
print(f" alias.name: {alias.name}")
print(f" alias.asname: {alias.asname}")
# What gets bound in the namespace?
import sys
namespace = {}
exec(code1, namespace)
print(f" Names in namespace: {[k for k in namespace.keys() if not k.startswith('_')]}")
# Test 2: import sklearn.linear_model as slm
code2 = "import sklearn.linear_model as slm"
tree2 = ast.parse(code2)
import_node2 = tree2.body[0]
alias2 = import_node2.names[0]
print(f"\nTest 2: {code2}")
print(f" alias.name: {alias2.name}")
print(f" alias.asname: {alias2.asname}")
namespace2 = {}
try:
exec(code2, namespace2)
print(f" Names in namespace: {[k for k in namespace2.keys() if not k.startswith('_')]}")
except:
print(f" Names in namespace: Would be ['slm'] if sklearn were available")
# Test 3: What about usage?
code3 = """
import sklearn.linear_model
x = sklearn.linear_model.LogisticRegression()
used_names = set()
for node in ast.walk(tree):
if isinstance(node, ast.Name):
used_names.add(node.id)
"""
print(f"\nTest 3: When using sklearn.linear_model in code, what name is accessed?")
print(f" You would use 'sklearn' first, so 'sklearn' would be in used_names")
print(f" But the code checks if 'sklearn.linear_model' is in used_names - this would FAIL")
EOF
Length of output: 338
🏁 Script executed:
# Search for tests that use dotted imports
grep -n "import.*\." ./tests/test_ast_utils.py | head -20
Length of output: 474
🏁 Script executed:
# Let's check if there are any tests of find_imports_for_function with dotted imports
grep -A 20 "class TestFindImportsForFunction" ./tests/test_ast_utils.py | grep -E "(def test|import.*\.)"
Length of output: 265
🏁 Script executed:
# Now let's verify Python's actual import binding behavior more thoroughly
python3 << 'EOF'
import ast
# Demonstrate the issue clearly
test_cases = [
"import os",
"import os.path",
"import sklearn.linear_model",
"import numpy as np",
"import sklearn.linear_model as skm",
]
print("=== Import Binding Behavior ===\n")
for code in test_cases:
tree = ast.parse(code)
alias = tree.body[0].names[0]
# What gets bound?
if alias.asname:
bound_name = alias.asname
else:
# For dotted imports, Python binds only the first component
bound_name = alias.name.split(".", 1)[0]
print(f"Code: {code}")
print(f" alias.name: {alias.name}")
print(f" alias.asname: {alias.asname}")
print(f" Name bound in namespace: {bound_name}")
print()
# Now demonstrate the bug scenario
print("=== Bug Scenario ===\n")
code = """
import sklearn.linear_model
model = sklearn.linear_model.LogisticRegression()
"""
tree = ast.parse(code)
# Get the import
import_node = tree.body[0]
alias = import_node.names[0]
# Current buggy approach
buggy_name = alias.asname if alias.asname else alias.name
print(f"Current code checks for: '{buggy_name}' in used_names")
# Collect used names
used_names = set()
for node in ast.walk(tree):
if isinstance(node, ast.Name):
used_names.add(node.id)
print(f"Actually used names: {used_names}")
print(f"Would '{buggy_name}' match? {buggy_name in used_names}")
# Fixed approach
fixed_name = alias.asname or alias.name.split(".", 1)[0]
print(f"\nFixed code checks for: '{fixed_name}' in used_names")
print(f"Would '{fixed_name}' match? {fixed_name in used_names}")
EOF
Length of output: 875
Fix dotted import binding to match Python's import semantics.
When import sklearn.linear_model is used without an alias, Python binds only sklearn to the namespace, not the full path. The current code incorrectly checks if sklearn.linear_model is in used_names, which will never match the actual binding. This causes valid dotted imports to be silently skipped from dependency detection.
Suggested fix
- # The name used in code is the alias if present, otherwise the module name
- effective_name = alias.asname if alias.asname else alias.name
+ # For `import pkg.submodule`, Python binds `pkg` unless an alias is used.
+ effective_name = alias.asname or alias.name.split(".", 1)[0]
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/designer_plugin/d3sdk/ast_utils.py` around lines 524 - 531: the import
handling currently treats the full dotted module name (alias.name) as the bound
name, which is wrong for statements like "import sklearn.linear_model", where
Python binds only "sklearn". Update the binding logic in the ast.Import branch
so that effective_name = alias.asname if alias.asname else
alias.name.split(".")[0] (i.e., take the top-level package when there is no
alias), then use that effective_name when checking membership in used_names and
for downstream dependency detection; keep the existing _is_builtin_package check
and other logic unchanged.
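To see the corrected binding rule in isolation, here is a small standalone sketch (the `resolve_bound_name` and `detected` helpers are hypothetical, introduced only for this example):

```python
import ast

def resolve_bound_name(alias: ast.alias) -> str:
    # `import pkg.sub` binds only `pkg`; `import pkg.sub as x` binds `x`.
    return alias.asname or alias.name.split(".", 1)[0]

def detected(source: str) -> bool:
    # Report whether the first import's bound name is actually used.
    tree = ast.parse(source)
    alias = tree.body[0].names[0]
    used = {n.id for n in ast.walk(tree) if isinstance(n, ast.Name)}
    return resolve_bound_name(alias) in used

# With the buggy version, the first case would look up the full dotted
# path "sklearn.linear_model" in used_names and never match "sklearn".
print(detected("import sklearn.linear_model\nm = sklearn.linear_model.Ridge()"))  # True
print(detected("import sklearn.linear_model as skm\nm = skm.Ridge()"))            # True
print(detected("import sklearn.linear_model\nx = 1"))                             # False
```

Nothing is actually imported here; `ast.parse` only inspects the source text, so sklearn need not be installed.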
Detect the used Python packages and add them automatically when registering the Python module
Summary
tbd
Summary by CodeRabbit
- New Features
- Removals
- API Changes
- Documentation
- CI & Release
- Tests