
Add bias to scores pandas api #992

Open

wuxx66 wants to merge 6 commits into nci:develop from wuxx66:add_bias_to_scores_pandas_api
Conversation

@wuxx66
Collaborator

@wuxx66 wuxx66 commented Feb 4, 2026

No description provided.

) -> PandasType:
"""Calculates the additive bias (mean error) from forecast and observed data.

Bias is commonly defined as the mean signed difference between forecast and observed values. A detailed explanation is on https://scores.readthedocs.io/en/stable/tutorials/Additive_and_multiplicative_bias.html
Contributor


I'm not sure a general definition of Bias is needed here. Suggest deleting the first sentence to make this consistent with other docstrings in this file.

Suggested change
Bias is commonly defined as the mean signed difference between forecast and observed values. A detailed explanation is on https://scores.readthedocs.io/en/stable/tutorials/Additive_and_multiplicative_bias.html
A detailed explanation is on https://scores.readthedocs.io/en/stable/tutorials/Additive_and_multiplicative_bias.html
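For context, the mean-error computation this docstring describes can be sketched for pandas inputs. This is a minimal sketch; `additive_bias_sketch` is a hypothetical stand-in for illustration, not the actual `scores` implementation:

```python
import pandas as pd

def additive_bias_sketch(fcst: pd.Series, obs: pd.Series) -> float:
    """Additive bias (mean error): the mean signed difference fcst - obs."""
    return float((fcst - obs).mean())

# Same sample data as the xarray test suggested later in this thread.
fcst = pd.Series([1, 3, 1, 3, 2, 2, 2, 1, 1, 2, 3])
obs = pd.Series([1, 1, 1, 2, 1, 2, 1, 1, 1, 3, 1])
print(round(additive_bias_sketch(fcst, obs), 4))  # 0.5455
```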

Collaborator Author


Removed first line definition

# Additive Bias (Mean Error)


def test_additive_bias_pandas_series():
Contributor


Since you're expanding the input types for additive_bias, you need to test for xarray inputs as well.

For 100% test coverage you'll also need to check the exception is correctly raised.

Something like:

with pytest.raises(ValueError):
    scores.continuous.additive_bias(fcst=fcst, obs=obs, weights=xr.DataArray([1, 2, 3]))
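A self-contained version of that exception check can be sketched with a hypothetical stand-in for `scores.continuous.additive_bias` that validates the weights length. This is illustrative only; the library's actual validation logic may differ:

```python
import pandas as pd
import pytest

def additive_bias_stub(fcst, obs, weights=None):
    # Hypothetical stand-in: reject weights that don't align with fcst.
    if weights is not None and len(weights) != len(fcst):
        raise ValueError("weights must have the same length as fcst")
    err = fcst - obs
    if weights is not None:
        return float((err * weights).sum() / weights.sum())
    return float(err.mean())

fcst = pd.Series([1.0, 2.0, 3.0, 4.0])
obs = pd.Series([0.0, 2.0, 2.0, 5.0])

# Mismatched weights exercise the error branch, as the review suggests.
with pytest.raises(ValueError):
    additive_bias_stub(fcst, obs, weights=pd.Series([1, 2, 3]))
```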

Collaborator Author


Done; added the test.

Collaborator


@John-Sharples why do we need this in the pandas-specific API? Is the idea it's xarray plus pandas, or just pandas only?

Contributor

@John-Sharples John-Sharples Feb 13, 2026


My mistake in putting it here; it probably belongs in tests/continuous/test_standard.py.

@John-Sharples
Contributor

John-Sharples commented Feb 13, 2026

We still need a test checking we get the expected answer when the inputs are xarray-like objects. Something like:

def test_additive_bias_xarray():
    fcst = xr.DataArray([1, 3, 1, 3, 2, 2, 2, 1, 1, 2, 3])
    obs = xr.DataArray([1, 1, 1, 2, 1, 2, 1, 1, 1, 3, 1])

    expected = 0.5455

    result = scores.continuous.additive_bias(fcst=fcst, obs=obs)

    assert isinstance(result, numbers.Real)
    assert round(float(result), PRECISION) == expected

    # And possibly with weights and reduce_dims
    result = scores.continuous.additive_bias(fcst=fcst, obs=obs, reduce_dims=...., weights=....)
    assert result == ....
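As a quick sanity check of the expected value above, the mean error for that sample data can be computed with plain Python (no xarray needed):

```python
# Mean signed difference over the sample data from the suggested test.
fcst = [1, 3, 1, 3, 2, 2, 2, 1, 1, 2, 3]
obs = [1, 1, 1, 2, 1, 2, 1, 1, 1, 3, 1]
mean_error = sum(f - o for f, o in zip(fcst, obs)) / len(fcst)
print(round(mean_error, 4))  # 0.5455
```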
    

@tennlee
Collaborator

tennlee commented Feb 13, 2026

Do we? This is for the pandas API, where xarray-like objects aren't expected. Do we need the pandas API to also handle xarray-like inputs?

@John-Sharples
Contributor

John-Sharples commented Feb 13, 2026

Do we? This is for the pandas API, where xarray-like objects aren't expected. Do we need the pandas API to also handle xarray-like inputs?

Apologies, I was mistaken about which function we were testing here. The current test looks good.

Happy to approve once conflicts are resolved.

@wuxx66 wuxx66 force-pushed the add_bias_to_scores_pandas_api branch from 1d9ac37 to 5c14b3c on February 13, 2026 03:02
@John-Sharples
Contributor

@tennlee
@wuxx66 and I had a call to sort out the confusion I created. We now have appropriate tests in the right place, maintaining 100% coverage.



3 participants