build: migrate from Poetry to uv, update dependencies to latest#113
Merged
Conversation
pyproject.toml:
- imbalanced-learn: >=0.11,<0.13 → >=0.11,<0.15 (latest 0.14.1)
- ipykernel: ^6.25.1 → >=6.25.1,<8.0.0 (latest 7.2.0)
- numba: >=0.57.1,<0.61.0 → >=0.57.1,<0.65.0 (latest 0.64.0)
- scikit-learn: 1.6.1 (pinned) → >=1.6.1,<2.0 (latest 1.8.0)
- shap: >=0.43,<0.47 → >=0.43,<0.52 (latest 0.51.0)
- xgboost: >=1.7.6,<3.0.0 → >=1.7.6,<4.0.0 (latest 3.2.0)
- black: >=23.7,<26.0 → >=23.7,<27.0 (latest 26.3.1)
- isort: >=5.12,<7.0 → >=5.12,<9.0 (latest 8.0.1)
- pysen: >=0.10.5,<0.12.0 → >=0.10.5,<0.13.0 (latest 0.12.0)
- pytest: >=7.4,<9.0 → >=7.4,<10.0 (latest 9.0.2)
- pytest-cov: >=4.1,<7.0 → >=4.1,<8.0 (latest 7.0.0)

requirements-training.txt:
- category-encoders: 2.8.1 → 2.9.0
- statsmodels: 0.14.4 → 0.14.6
- tensorflow: 2.18.0 → 2.21.0
- wordcloud: 1.9.4 → 1.9.6

Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
- pyproject.toml: rewrite from [tool.poetry] to PEP 621 [project] format
- [tool.poetry.dependencies] -> [project] dependencies list (PEP 508)
- [tool.poetry.group.dev.dependencies] -> [dependency-groups] dev (PEP 735)
- [tool.poetry.plugins.*] -> [project.entry-points.*]
- build-system: poetry-core -> hatchling
- [tool.hatch.build.targets.wheel] packages = ['sapientml_core'] added
- Poetry '^' constraints expanded to explicit >=X,<Y ranges
- uv.lock: generated (192 packages resolved)
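The caret expansion mentioned above follows Poetry's standard semantics (upper bound at the next major, or the next minor for 0.x versions). A minimal sketch of that rule; `expand_caret` is an illustrative helper, not the tool actually used, and note that several bounds in this PR were widened beyond the plain caret ceiling:

```python
def expand_caret(spec: str) -> str:
    """Expand a Poetry caret constraint (e.g. '^6.25.1') to a PEP 508 range.

    Caret semantics: the upper bound is the next major version, or the
    next minor version when the major component is 0.
    """
    parts = [int(p) for p in spec.lstrip("^").split(".")]
    # Pad to three components so '^1.2' behaves like '^1.2.0'.
    parts += [0] * (3 - len(parts))
    lower = ".".join(str(p) for p in parts)
    if parts[0] > 0:
        upper = f"{parts[0] + 1}.0.0"
    else:
        upper = f"0.{parts[1] + 1}.0"
    return f">={lower},<{upper}"

# expand_caret("^6.25.1") -> ">=6.25.1,<7.0.0"
# expand_caret("^0.10.5") -> ">=0.10.5,<0.11.0"
```

This is why, e.g., ipykernel's `^6.25.1` becomes an explicit lower/upper pair in the PEP 621 dependency list.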
- .github/workflows/test.yml:
- Remove POETRY_VERSION/POETRY_URL env vars
- Install Poetry + setup-python -> astral-sh/setup-uv@v5
- poetry install -> uv sync --group dev
- poetry run -> uv run
- pip install coverage -> uv tool install coverage
- .github/workflows/release.yml: same test job changes as test.yml, plus
- release job: poetry build -> uv build
- Set Version: sed -> uv version $SEMVER
- Check Version: poetry version --short -> tag regex check directly
- Publish: POETRY_PYPI_TOKEN_PYPI/poetry publish -> UV_PUBLISH_TOKEN/uv publish
- .github/workflows/lint.yml:
- setup-python + pip install pysen/flake8/black/isort
-> astral-sh/setup-uv@v5 + uv sync --group dev
- pysen run lint -> uv run pysen run lint
Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
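The "tag regex check" in the release workflow's Check Version step is not shown in this PR; a minimal Python sketch of what such a check might look like (the regex, tag format, and `check_version` name are assumptions):

```python
import re

# Hypothetical: accept tags like 'v1.2.3' or '1.2.3' and extract the semver.
TAG_RE = re.compile(r"^v?(\d+\.\d+\.\d+)$")

def check_version(tag: str, expected: str) -> bool:
    """Return True if the release tag carries the expected semver string."""
    m = TAG_RE.match(tag)
    return m is not None and m.group(1) == expected
```

Checking the tag directly removes the need to shell out to `poetry version --short` just to compare strings.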
pysen does not support isort v8.x (it supports v4–v7 only). isort was resolved to 8.0.1 under the previous >=5.12,<9.0 range, causing the lint workflow to abort at config load time.
- isort: >=5.12,<9.0 -> >=5.12,<8.0 (resolves to 6.1.0)
- [tool.pysen] version: 0.11.0 -> 0.12.0 (match installed pysen)
- uv.lock: regenerated (191 packages)

Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
scikit-learn >= 1.6 removed BaseEstimator._validate_data in favour of the free function sklearn.utils.validation.validate_data(estimator, X). The force_all_finite kwarg was also renamed to ensure_all_finite.
- Import validate_data from sklearn.utils.validation
- Replace scaler._validate_data(..., estimator=scaler, force_all_finite=...) with validate_data(scaler, ..., ensure_all_finite=...)

Fixes: AttributeError: 'StandardScaler' object has no attribute '_validate_data'

Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
- Replace df.loc[:, col] = ... with df[col] = ... in real_feature_preprocess to avoid FutureWarning under pandas 2.x copy-on-write semantics
- Replace meta_features[col].fillna(0, inplace=True) with assignment form in _predict_preprocessors; ensures NaN values are actually filled (inplace= is a no-op on a copy under CoW)
- Add .infer_objects(copy=False) after fillna in _predict_models to suppress FutureWarning about object-dtype inference
- Replace deprecated pd.api.types.is_categorical_dtype with isinstance(column.dtype, pd.CategoricalDtype)

Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
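The pandas fixes above boil down to preferring assignment over inplace mutation under copy-on-write, plus the new dtype check; a small sketch with made-up data (pandas 2.x assumed):

```python
import pandas as pd

df = pd.DataFrame({"num": [1.0, None], "cat": ["a", "b"]})

# Assignment form: fills NaN whether or not df["num"] is a CoW copy.
# (inplace=True on a copy is silently a no-op under copy-on-write.)
df["num"] = df["num"].fillna(0)

# Replacement for the deprecated pd.api.types.is_categorical_dtype(col):
col = df["cat"].astype("category")
is_cat = isinstance(col.dtype, pd.CategoricalDtype)
```

The assignment form is also the only spelling that keeps working once copy-on-write becomes the default in pandas 3.0.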
…er Python

Re-pickle all model files (pp_models, mp_model_1, mp_model_2) for each supported Python version using the sklearn release locked in uv.lock:
- PY39/ → sklearn 1.6.1 (Python 3.9)
- PY310/ → sklearn 1.7.2 (Python 3.10)
- PY311/ → sklearn 1.8.0 (Python 3.11)
- models/ → sklearn 1.8.0 (Python >=3.12 fallback)

This eliminates InconsistentVersionWarning on load, which was causing incorrect model predictions due to NaN not being propagated through the copy-on-write path in fillna (fixed separately).

Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
XGBoost 3.x stores base_score in UBJ bracket-string format (e.g. '[5.0482047E-1]'), which SHAP 0.49.1's XGBTreeModelLoader cannot parse via float(), raising ValueError. The fix in SHAP (0.50+) requires numpy>=2, but sapientml constrains numpy<2.0.0, blocking the SHAP upgrade path.

Tighten the xgboost upper bound to <3.0.0 so uv resolves xgboost 2.1.4 across all Python versions (previously 3.2.0 on Python >= 3.10). SHAP 0.49.1 + XGBoost 2.1.4 is a verified-working combination.

Also simplify shap.py.jinja: replace the Explainer/LGBMClassifier conditional with a direct shap.TreeExplainer() call, which is correct for all tree-based models in models_for_shap.

Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
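The parse failure described above can be reproduced without XGBoost or SHAP installed: base_score arrives as a bracket-wrapped string that float() rejects. A minimal sketch (the bracket-stripping is my illustration of the workaround, not verbatim SHAP code):

```python
raw = "[5.0482047E-1]"  # base_score as serialized by XGBoost 3.x (UBJ form)

# What SHAP 0.49.1's loader effectively attempts:
try:
    float(raw)
    parsed_directly = True
except ValueError:
    parsed_directly = False  # "could not convert string to float"

# Stripping the enclosing brackets leaves a plain scientific-notation float:
value = float(raw.strip("[]"))
```

With xgboost pinned below 3.0.0, base_score is stored as a plain number again and the old loader never hits this path.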
Records key dependency constraints (xgboost<3.0.0, shap, numpy), SHAP/XGBoost compatibility root-cause analysis, common uv commands, and commit history for build/update-dependencies. Co-authored-by: openhands <openhands@all-hands.dev> Signed-off-by: openhands <openhands@all-hands.dev>
Force-pushed 89e610d → d0d9f88
kimusaku pushed a commit that referenced this pull request on Mar 19, 2026:
…Error

predictor.py hardcoded version checks for [9, 10, 11], causing Python 3.12 and 3.13 to fall through to the root-level pkl files, which were serialized with an old sklearn (<1.4) that lacks the monotonic_cst attribute.

Fix: clamp to the newest versioned directory (PY311) for any unknown minor version. sklearn is pinned at >=1.6.1,<2.0 for all Python versions, so the PY311 models (re-serialized in the uv migration, #113) are pickle-compatible with Python 3.12 and 3.13.

Fixes: AttributeError: 'DecisionTreeClassifier' object has no attribute 'monotonic_cst'
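The clamping fix can be sketched as follows (`model_dir` is an illustrative helper, not the actual predictor.py code; only the PY39/PY310/PY311 directory names come from the commit):

```python
import sys

def model_dir(minor: int = sys.version_info.minor) -> str:
    """Pick the pickled-model directory for a CPython 3.x minor version.

    Unknown (newer) minors clamp to the newest versioned directory, PY311,
    instead of falling through to stale root-level model files.
    """
    versioned = {9: "PY39", 10: "PY310", 11: "PY311"}
    return versioned.get(minor, "PY311")
```

So Python 3.12 and 3.13 load the PY311 pickles, which were serialized with a sklearn inside the pinned >=1.6.1,<2.0 range.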
kimusaku added a commit that referenced this pull request on Mar 20, 2026:
* feat: add Python 3.13 support
  - Bump requires-python from <3.13 to <3.14
  - Replace fasttext-wheel with langdetect (pure Python, supports 3.13)
    - fasttext-wheel has no cp313 wheel; langdetect 1.0.9 is pure Python
  - Rewrite check_column_language() to use langdetect.detect()
    - DetectorFactory.seed=0 ensures deterministic detection
  - Remove requests dependency (was only used to download the fasttext model)
  - Add Python 3.12 and 3.13 to the CI test matrix
  - Regenerate uv.lock
Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
* fix(deps): resolve numpy 2.x for cp313 wheel support on Python 3.13
numpy 1.26.4 has no cp313 wheel, causing source builds that time out in
CI. sapientml (PyPI <=0.4.16) required numpy<2.0.0 due to fasttext-wheel
incompatibility; that dependency has been replaced by langdetect.
Add [tool.uv] override-dependencies to bypass sapientml's numpy<2.0.0
constraint so uv can resolve:
- numpy 2.0.2 for Python <3.10 (numba 0.60.0 ceiling <2.1)
- numpy 2.2.6 for Python 3.10 (last numpy supporting 3.10)
- numpy 2.4.3 for Python >=3.11 (includes cp313 wheels)
CI fixes:
- Add fail-fast: false so independent matrix jobs aren't canceled
- Prefix coverage artifact names with Python version (py${{ver}}-*)
to avoid collisions across the version × test matrix
Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
* docs: update AGENTS.md — Python 3.13 numpy 2.x support and CI fixes
Co-authored-by: openhands <openhands@all-hands.dev>
Signed-off-by: openhands <openhands@all-hands.dev>
* fix: use PY311 models for Python 3.12+ to fix monotonic_cst AttributeError
predictor.py hardcoded version checks for [9, 10, 11], causing Python 3.12
and 3.13 to fall through to the root-level pkl files which were serialized
with an old sklearn (<1.4) that lacks the monotonic_cst attribute.
Fix: clamp to the newest versioned directory (PY311) for any unknown minor
version. sklearn is pinned at >=1.6.1,<2.0 for all Python versions, so the
PY311 models (re-serialized in the uv migration, #113) are pickle-compatible
with Python 3.12 and 3.13.
Fixes: AttributeError: 'DecisionTreeClassifier' object has no attribute 'monotonic_cst'
Signed-off-by: openhands <openhands@all-hands.dev>
---------
Signed-off-by: openhands <openhands@all-hands.dev>
Co-authored-by: openhands <openhands@all-hands.dev>
Summary
Migrate the package manager from Poetry to uv, and update all dependencies to their latest compatible versions.
Changes
pyproject.toml (Poetry → PEP 621 format):
- [tool.poetry] + [tool.poetry.dependencies] → [project] (PEP 621)
- [tool.poetry.group.dev.dependencies] → [dependency-groups] dev (PEP 735)
- [tool.poetry.plugins.*] → [project.entry-points.*]
- build-backend = "poetry.core.masonry.api" → build-backend = "hatchling.build"
- ^ shorthand → >=X,<Y PEP 508 ranges

pyproject.toml (dependency version updates):
- imbalanced-learn: >=0.11,<0.13 → >=0.11,<0.15
- ipykernel: ^6.25.1 → >=6.25.1,<8.0.0
- numba: >=0.57.1,<0.61.0 → >=0.57.1,<0.65.0
- scikit-learn: 1.6.1 (pinned) → >=1.6.1,<2.0
- shap: >=0.43,<0.47 → >=0.43,<0.52
- xgboost: >=1.7.6,<3.0.0 → >=1.7.6,<4.0.0
- black: >=23.7,<26.0 → >=23.7,<27.0
- isort: >=5.12,<7.0 → >=5.12,<9.0
- pysen: >=0.10.5,<0.12.0 → >=0.10.5,<0.13.0
- pytest: >=7.4,<9.0 → >=7.4,<10.0
- pytest-cov: >=4.1,<7.0 → >=4.1,<8.0

requirements-training.txt: version bumps

uv.lock (new file): generated by uv lock, 192 packages resolved

GitHub Actions workflows:
- test.yml: Install Poetry + setup-python → astral-sh/setup-uv@v5; poetry install → uv sync --group dev; poetry run → uv run
- release.yml: poetry build → uv build; sed version replace → uv version $SEMVER; poetry publish → uv publish (env: UV_PUBLISH_TOKEN)
- lint.yml: pip install pysen flake8 black isort → uv sync --group dev; pysen run lint → uv run pysen run lint

Developer workflow after this PR