BUG: assign doesn't cast SparseDataFrame to DataFrame #19178


Merged 13 commits on Feb 12, 2018
Changes from 9 commits
1 change: 1 addition & 0 deletions doc/source/whatsnew/v0.23.0.txt
@@ -555,6 +555,7 @@ Sparse

- Bug in which creating a ``SparseDataFrame`` from a dense ``Series`` or an unsupported type raised an uncontrolled exception (:issue:`19374`)
- Bug in :class:`SparseDataFrame.to_csv` causing exception (:issue:`19384`)
- Bug in constructing a :class:`SparseArray`: if `data` is a scalar and `index` is defined it will coerce to float64 regardless of scalar's dtype. (:issue:`19163`)
Contributor:
Whoops, just noticed we don't have SparseArray in our API docs, so this link won't work. Just use

``SparseArray``

until we get that sorted out. Also put double backticks around ``data`` and ``index``.

-

Reshaping
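To make the changelog entry concrete, here is a numpy-only sketch (not pandas code) of the behavior being fixed: before the patch the backing array's dtype was hardcoded, so a bool scalar was coerced to float64; after it, the dtype follows the scalar.

```python
import numpy as np

# Pre-fix behavior: dtype hardcoded to float64, so a bool scalar
# silently becomes 0.0/1.0 floats.
old = np.empty(3, dtype='float64')
old.fill(False)

# Post-fix behavior: the dtype is inferred from the scalar itself.
new = np.empty(3, dtype=np.array(False).dtype)
new.fill(False)

print(old.dtype, new.dtype)  # float64 vs. bool
```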
5 changes: 3 additions & 2 deletions pandas/core/sparse/array.py
@@ -26,7 +26,7 @@
is_scalar, is_dtype_equal)
from pandas.core.dtypes.cast import (
maybe_convert_platform, maybe_promote,
astype_nansafe, find_common_type)
astype_nansafe, find_common_type, infer_dtype_from_scalar)
from pandas.core.dtypes.missing import isna, notna, na_value_for_dtype

import pandas._libs.sparse as splib
@@ -161,7 +161,8 @@ def __new__(cls, data, sparse_index=None, index=None, kind='integer',
data = np.nan
if not is_scalar(data):
raise Exception("must only pass scalars with an index ")
values = np.empty(len(index), dtype='float64')
values = np.empty(len(index),
Contributor:
we have a routine for this: ``construct_1d_from_scalar`` in ``pandas.core.dtypes.cast``

Contributor:
for the empty/fill step

dtype=infer_dtype_from_scalar(data)[0])
values.fill(data)
data = values

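The hunk above replaces the hardcoded ``float64`` with a dtype inferred via ``infer_dtype_from_scalar``. A standalone sketch of that empty-then-fill pattern (the helper name ``fill_from_scalar`` is ours, not pandas API, and the string-to-object special case only mirrors pandas' inference):

```python
import numpy as np

def fill_from_scalar(scalar, n):
    """Allocate an array of length n filled with `scalar`, inferring
    the dtype from the scalar instead of hardcoding float64."""
    if isinstance(scalar, str):
        # pandas infers object dtype for strings; np.array('z') would
        # instead give a fixed-width unicode dtype like '<U1'
        dtype = np.dtype(object)
    else:
        dtype = np.array(scalar).dtype
    values = np.empty(n, dtype=dtype)
    values.fill(scalar)
    return values
```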
11 changes: 11 additions & 0 deletions pandas/tests/sparse/frame/test_frame.py
@@ -1303,3 +1303,14 @@ def test_quantile_multi(self):

tm.assert_frame_equal(result, dense_expected)
tm.assert_sp_frame_equal(result, sparse_expected)

def test_assign_with_sparse_frame(self):
# GH 19163
df = pd.DataFrame({"a": [1, 2, 3]})
res = df.to_sparse(fill_value=False).assign(newcol=False)
exp = df.assign(newcol=False).to_sparse(fill_value=False)

tm.assert_sp_frame_equal(res, exp)

for column in res.columns:
assert type(res[column]) is SparseSeries
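``SparseDataFrame`` and ``to_sparse`` were later removed (pandas 1.0); an adaptation of what this test checks for current pandas, using sparse-dtype columns, might look like the following (our sketch, not code from the PR):

```python
import pandas as pd

# Modern analogue of a SparseDataFrame: a DataFrame whose columns
# carry a SparseDtype.
df = pd.DataFrame({"a": [1, 2, 3]}).astype(pd.SparseDtype("int64", 0))

# assign adds a dense bool column; the pre-existing sparse columns
# keep their SparseDtype rather than being silently densified.
res = df.assign(newcol=False)
print(res.dtypes)
```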
15 changes: 15 additions & 0 deletions pandas/tests/sparse/test_array.py
@@ -113,6 +113,21 @@ def test_constructor_spindex_dtype(self):
assert arr.dtype == np.int64
assert arr.fill_value == 0

@pytest.mark.parametrize('scalar,dtype', [
(False, bool),
(0.0, 'float64'),
(1, 'int64'),
('z', 'object')])
def test_scalar_with_index_infer_dtype(self, scalar, dtype):
# GH 19163
arr = SparseArray(scalar, index=[1, 2, 3], fill_value=scalar)
exp = SparseArray([scalar, scalar, scalar], fill_value=scalar)

tm.assert_sp_array_equal(arr, exp)

assert arr.dtype == dtype
assert exp.dtype == dtype

def test_sparseseries_roundtrip(self):
# GH 13999
for kind in ['integer', 'block']: