
[Data] Add credential provider abstraction for Databricks UC datasource #60457

Merged
bveeramani merged 12 commits into ray-project:master from ankur-anyscale:ankur/add_databricks_credential_provider on Jan 30, 2026
Conversation

@ankur-anyscale (Contributor) commented Jan 23, 2026

Introduce a credential provider pattern for Databricks authentication, enabling custom credential sources while maintaining backward compatibility.

Changes:

  • Add DatabricksCredentialProvider base class with StaticCredentialProvider and EnvironmentCredentialProvider implementations
  • Add credential_provider parameter to DatabricksUCDatasource and read_databricks_tables()
  • Add UnityCatalogConnector class for reading Unity Catalog tables directly (supports Delta/Parquet formats with AWS, Azure, and GCP credential handoff)
  • Add retry on 401 with credential invalidation via shared request_with_401_retry() helper
  • Centralize common code (build_headers, request_with_401_retry) in databricks_credentials.py
  • Move Databricks tests to dedicated test files with shared test utilities

The credential provider abstraction allows users to implement custom credential sources that support token refresh and other authentication patterns beyond static tokens.

Backward compatibility: Existing code using the DATABRICKS_TOKEN and DATABRICKS_HOST environment variables continues to work unchanged.
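The abstraction described above can be sketched as follows. This is a hypothetical, simplified provider: in real use it would subclass DatabricksCredentialProvider (from this PR's databricks_credentials module), but it is shown as a plain class so the refresh logic is visible in isolation; the token-minting counter is a stand-in for a real token service.

```python
# Hypothetical sketch of a token-refreshing credential provider matching the
# interface this PR describes (get_token / get_host / invalidate).
import itertools


class RefreshingProvider:
    """Caches a token and mints a new one after invalidate() (e.g. on a 401)."""

    def __init__(self, host: str):
        self._host = host
        self._token = None
        self._counter = itertools.count(1)  # stand-in for a real token service

    def get_token(self) -> str:
        if self._token is None:
            # In practice: call your OAuth/SSO endpoint here.
            self._token = f"token-{next(self._counter)}"
        return self._token

    def get_host(self) -> str:
        return self._host

    def invalidate(self) -> None:
        # The datasource calls this after a 401; the next get_token()
        # then returns a fresh token.
        self._token = None


# Usage with the new parameter (argument names other than credential_provider
# are illustrative):
# ds = ray.data.read_databricks_tables(
#     warehouse_id="...", catalog="...", schema="...", query="...",
#     credential_provider=RefreshingProvider("https://my-workspace.databricks.com"),
# )
```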

@ankur-anyscale ankur-anyscale requested a review from a team as a code owner January 23, 2026 19:56
@gemini-code-assist (bot) left a comment:

Code Review

This pull request introduces an extensible Databricks credential provider system for Ray Data. It defines an abstract DatabricksCredentialProvider base class with concrete implementations for static tokens (StaticCredentialProvider) and environment variables (EnvironmentCredentialProvider), the latter including logic to detect the Databricks host from the runtime environment. DatabricksUCDatasource was refactored to use the new provider, fetching authentication tokens dynamically for each request and retrying once on 401 Unauthorized responses during both statement polling and chunk resolution. The read_databricks_tables API was updated to accept an optional credential_provider parameter, replacing its previous direct environment-variable parsing, and unit tests were added for the new credential providers, their integration, and the 401 retry logic.

Review comments highlighted areas for improvement:

  • Make the EnvironmentCredentialProvider host-resolution error message more precise.
  • Evaluate the portability of _detect_databricks_host's reliance on IPython.
  • Add a maximum retry limit or backoff strategy to the 401 retry logic in the polling and chunk-resolution loops to prevent potential infinite loops.
  • Ensure the DatabricksCredentialProvider instance is serializable if it holds state for remote workers.
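The suggested retry limit with backoff could be sketched as follows. All names here are illustrative, not the PR's actual code; `send` stands for whatever callable performs the request with current credentials.

```python
# Hypothetical sketch of a bounded 401 retry with exponential backoff:
# invalidate credentials, wait, and retry at most `max_retries` times.
import time


def request_with_bounded_401_retry(send, provider, max_retries=1, backoff_s=0.5):
    """`send()` performs the request and returns a response object exposing
    `status_code`; `provider.invalidate()` forces a token refresh on the
    next attempt."""
    resp = send()
    attempts = 0
    while resp.status_code == 401 and attempts < max_retries:
        provider.invalidate()  # next get_token() returns a fresh token
        time.sleep(backoff_s * (2 ** attempts))  # exponential backoff
        resp = send()
        attempts += 1
    return resp
```

Capping the retries keeps a permanently revoked token from turning the polling loop into an infinite retry cycle.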

@chatgpt-codex-connector (bot) left a comment:

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 005cdf5265


@ankur-anyscale force-pushed the ankur/add_databricks_credential_provider branch from 324b3e0 to 5bcf76b on January 23, 2026 20:47
cursor[bot] left a comment that was later marked as outdated.

@ray-gardener (bot) added the 'data' (Ray Data-related issues) label on Jan 24, 2026
@goutamvenkat-anyscale (Contributor) left a comment:

left a few comments

Comment on lines +124 to +125
token_env_var: str = "DATABRICKS_TOKEN",
host_env_var: str = "DATABRICKS_HOST",
goutamvenkat-anyscale: Let's make these constants.

ankur-anyscale (author): Done.

Comment on lines +166 to +167
except Exception:
    pass
goutamvenkat-anyscale: Why are we bypassing all exceptions?

ankur-anyscale (author): Logged the exception here. If the Databricks runtime is not available and the host is also not set, the existing flow handles it.

Comment on lines +200 to +203
headers = _build_headers(credential_provider_for_tasks)
resolve_response = requests.get(
    resolve_external_link_url, headers=headers
)
goutamvenkat-anyscale: Make this into a reusable function.

ankur-anyscale (author): Done.

_STATEMENT_EXEC_POLL_TIME_S = 1


def _build_headers(credential_provider: DatabricksCredentialProvider) -> dict:
goutamvenkat-anyscale: Nit: dict[str, str]

ankur-anyscale (author): Done.

)


def test_databricks_uc_datasource():
goutamvenkat-anyscale: Why is this being nuked?

ankur-anyscale (author): This was just moved to another file. I have also updated the tests per current guidelines and your comments.

.reset_index(drop=True)
)

pd.testing.assert_frame_equal(result, expected_result_df)
goutamvenkat-anyscale: Use rows_same.

ankur-anyscale (author): Done.

.sort_values("c1")
.reset_index(drop=True)
)
pd.testing.assert_frame_equal(result, expected_result_df)
goutamvenkat-anyscale: Use rows_same.

ankur-anyscale (author): Done.

  - Add module-level constants DEFAULT_TOKEN_ENV_VAR and DEFAULT_HOST_ENV_VAR
    for environment variable names
  - Add logging to _detect_databricks_host() instead of silently swallowing
    exceptions
  - Extract 401 retry logic into reusable _request_with_401_retry() function
    to reduce code duplication

Signed-off-by: ankur <ankur@anyscale.com>
@ankur-anyscale force-pushed the ankur/add_databricks_credential_provider branch from b110106 to 8acd939 on January 27, 2026 11:25
@ankur-anyscale force-pushed the ankur/add_databricks_credential_provider branch from f56d718 to 3ae45c6 on January 27, 2026 18:42
@ankur-anyscale force-pushed the ankur/add_databricks_credential_provider branch from 3ae45c6 to 4ba7c89 on January 27, 2026 18:44
@ankur-anyscale force-pushed the ankur/add_databricks_credential_provider branch from 17e4adb to afbf7a5 on January 30, 2026 03:16
@goutamvenkat-anyscale added the 'go' (add ONLY when ready to merge, run all tests) label on Jan 30, 2026
@goutamvenkat-anyscale (Contributor): /gemini summary

@gemini-code-assist (bot):

Summary of Changes

This pull request significantly enhances the Databricks Unity Catalog datasource by introducing a robust and extensible credential provider abstraction. This change improves authentication flexibility, allowing users to define custom credential sources and enabling automatic token refresh mechanisms. It also adds resilience to transient authentication failures by implementing a 401 retry mechanism with credential invalidation, ensuring smoother data operations.

Highlights

  • Credential Provider Abstraction: Introduced a new DatabricksCredentialProvider abstract base class and concrete implementations (StaticCredentialProvider, EnvironmentCredentialProvider) to standardize and extend Databricks authentication methods.
  • Enhanced Authentication Flexibility: The DatabricksUCDatasource and read_databricks_tables() now accept a credential_provider parameter, allowing users to implement custom authentication logic, including token refresh mechanisms.
  • Robust 401 Retry Mechanism: Implemented automatic retry logic for 401 Unauthorized HTTP responses across all Databricks API calls (initial POST, polling GETs, and chunk fetching GETs), with credential invalidation to prompt token refresh.
  • Improved Test Structure: Databricks-related tests have been refactored and moved into dedicated test files (test_databricks_credentials.py, test_databricks_uc_datasource.py) for better organization and maintainability.
  • Backward Compatibility: Existing authentication methods using the token parameter or DATABRICKS_TOKEN environment variable remain functional and are handled by the default EnvironmentCredentialProvider.


Changelog
  • python/ray/data/_internal/datasource/databricks_credentials.py
    • Added new file to define Databricks credential provider interfaces and implementations.
    • Defined DatabricksCredentialProvider as an abstract base class with get_token, get_host, and invalidate methods.
    • Implemented StaticCredentialProvider for handling fixed authentication tokens and hosts.
    • Implemented EnvironmentCredentialProvider to retrieve credentials from environment variables, including Databricks runtime host detection and token re-reading on invalidate().
    • Added resolve_credential_provider utility function to determine the appropriate credential provider based on input.
  • python/ray/data/_internal/datasource/databricks_uc_datasource.py
    • Imported DatabricksCredentialProvider for type hinting and usage.
    • Modified DatabricksUCDatasource.__init__ to accept a credential_provider instance instead of separate host and token arguments.
    • Introduced _build_headers and _request_with_401_retry helper functions to centralize header construction and implement 401 retry logic.
    • Updated all internal requests.post and requests.get calls to utilize the new _request_with_401_retry function, providing resilience against expired tokens.
    • Ensured the credential_provider is correctly captured for serialization when creating read tasks.
  • python/ray/data/read_api.py
    • Removed direct os imports for environment variable access related to Databricks credentials.
    • Imported DatabricksCredentialProvider and resolve_credential_provider.
    • Added an optional credential_provider parameter to the read_databricks_tables function.
    • Updated the read_databricks_tables function to use resolve_credential_provider to obtain the authentication provider.
    • Removed previous direct environment variable parsing for Databricks host and token within read_databricks_tables.
    • Updated documentation for read_databricks_tables to include usage examples for custom credential providers.
  • python/ray/data/tests/datasource/test_databricks_credentials.py
    • Added new file containing comprehensive unit tests for the DatabricksCredentialProvider abstract base class and its concrete implementations (StaticCredentialProvider, EnvironmentCredentialProvider).
    • Includes tests for proper initialization, credential retrieval, host detection, token refresh on invalidate(), and serialization capabilities of the providers.
  • python/ray/data/tests/datasource/test_databricks_uc_datasource.py
    • Added new file containing integration tests for the DatabricksUCDatasource.
    • Tests reading data using both table names and SQL queries with various parallelism settings.
    • Includes specific tests for credential provider integration, ensuring fresh tokens are fetched for each request.
    • Verifies the 401 retry mechanism functions correctly during initial POST requests, polling, and chunk fetching.
    • Tests handling of empty query results.
  • python/ray/data/tests/datasource/test_sql.py
    • Removed Databricks Unity Catalog datasource tests, which have been migrated to the new test_databricks_uc_datasource.py file.
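The resolve_credential_provider dispatch mentioned in the changelog can be sketched as below; the signature and the stand-in EnvironmentCredentialProvider are assumptions, but the described behavior is that an explicit provider wins and the environment-based default preserves backward compatibility.

```python
# Sketch of the provider-resolution utility described above.
class EnvironmentCredentialProvider:
    """Minimal stand-in for the real environment-backed provider."""
    pass


def resolve_credential_provider(credential_provider=None):
    """Return the explicitly passed provider if any; otherwise default to
    the environment-based provider so DATABRICKS_TOKEN/DATABRICKS_HOST
    keep working unchanged."""
    if credential_provider is not None:
        return credential_provider
    return EnvironmentCredentialProvider()
```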
Activity
  • gemini-code-assist[bot] raised concerns regarding the precision of host resolution error messages, the dependency on IPython for host detection, and the lack of retry limits/backoff for 401 errors in polling and chunk fetching. It also questioned the serializability of DatabricksCredentialProvider.
  • ankur-anyscale (author) responded by stating that existing logic was moved without change, added unit tests to confirm serializability, and implemented token refresh within EnvironmentCredentialProvider's invalidate method.
  • chatgpt-codex-connector[bot] suggested refreshing environment-backed tokens on invalidate(), which was subsequently addressed by the author.
  • cursor[bot] identified several issues, including MockResponse in tests missing the status_code attribute, the initial POST request lacking 401 retry logic, the presence of unused helper functions/fixtures, redundant credential provider classes in tests, and duplicated test setup code.
  • ankur-anyscale addressed the MockResponse issue, implemented 401 retry for the initial POST, and refactored test code to remove unused elements and reduce duplication.
  • goutamvenkat-anyscale provided feedback suggesting the use of constants, creating reusable functions, applying specific type hints (dict[str, str]), utilizing rows_same for DataFrame comparisons, and parameterizing tests. These suggestions were addressed by the author.
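The invalidate()-refresh behavior noted in the activity log — EnvironmentCredentialProvider re-reading the token from the environment so a rotated DATABRICKS_TOKEN is picked up after a 401 — could look roughly like this; the internals are assumptions.

```python
# Sketch: cache the env token, but re-read it on invalidate() so an
# externally rotated token becomes visible on the next get_token().
import os

DEFAULT_TOKEN_ENV_VAR = "DATABRICKS_TOKEN"


class EnvironmentCredentialProvider:
    def __init__(self, token_env_var=DEFAULT_TOKEN_ENV_VAR):
        self._token_env_var = token_env_var
        self._token = os.environ.get(token_env_var)

    def get_token(self):
        return self._token

    def invalidate(self):
        # Called after a 401: refresh from the environment.
        self._token = os.environ.get(self._token_env_var)
```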

@cursor (bot) left a comment:
Cursor Bugbot has reviewed your changes and found 1 potential issue.

@bveeramani bveeramani merged commit 2a57731 into ray-project:master Jan 30, 2026
6 checks passed
liulehui pushed a commit to liulehui/ray that referenced this pull request Jan 31, 2026
simonsays1980 pushed a commit to simonsays1980/ray that referenced this pull request Jan 31, 2026
400Ping pushed a commit to 400Ping/ray that referenced this pull request Feb 1, 2026
rayhhome pushed a commit to rayhhome/ray that referenced this pull request Feb 4, 2026
elliot-barn pushed a commit that referenced this pull request Feb 9, 2026
elliot-barn pushed a commit that referenced this pull request Feb 9, 2026
ans9868 pushed a commit to ans9868/ray that referenced this pull request Feb 18, 2026
peterxcli pushed a commit to peterxcli/ray that referenced this pull request Feb 25, 2026
peterxcli pushed a commit to peterxcli/ray that referenced this pull request Feb 25, 2026

Labels

data (Ray Data-related issues), go (add ONLY when ready to merge, run all tests)


3 participants