
Python TypeError: cannot convert the series to <class 'float'> when calculating the standard deviation

Problem description

I want to calculate the standard deviation from scratch, but my code raises TypeError: cannot convert the series to <class 'float'>.

I tried casting to a list with .tolist() for diff_squared, but that raised TypeError: unsupported operand type(s) for /: 'list' and 'int' instead.

import pandas as pd
from math import sqrt

df = pd.read_csv('C:/Users/User/Downloads/Admission_Predict.csv')

# Mean
sums = 0
for m in range(len(df)):
    sums += df.iloc[m]
mean = sums / len(df)

# Square of difference of mean and each value
diff_squared = 0
for n in range(len(df)):
    diff_squared += (df.iloc[n].tolist() - mean) ** 2

# Standard deviation
stdv = sqrt(diff_squared / ((len(df)) - 1))

Full traceback

Traceback (most recent call last):
  File "C:\Users\User\PycharmProjects\algorithms\Fibonacci recursive.py", line 18, in <module>
    stdv = sqrt(diff_squared / ((len(df)) - 1))
  File "C:\Users\User\PycharmProjects\algorithms\venv\lib\site-packages\pandas\core\series.py", line 141, in wrapper
    raise TypeError(f"cannot convert the series to {converter}")
TypeError: cannot convert the series to <class 'float'>

Process finished with exit code 1

Tags: python, pandas, statistics, standard-deviation

Solution
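The traceback points at the real problem: df.iloc[m] selects an entire row, which pandas returns as a Series. That makes sums, mean and diff_squared Series objects rather than single numbers, and math.sqrt() only accepts something it can convert to one float, hence "cannot convert the series to <class 'float'>". Converting a row to a list with .tolist() does not help either, because Python cannot divide a plain list by an integer, which is where the second error comes from. The usual fix is to apply the from-scratch sample formula s = sqrt(Σ(xᵢ − x̄)² / (n − 1)) to a single numeric column. Below is a minimal sketch under that assumption; the column name 'GRE Score' is only a placeholder, since the question does not show the CSV's columns, so substitute whichever column you actually want.

import pandas as pd
from math import sqrt

df = pd.read_csv('C:/Users/User/Downloads/Admission_Predict.csv')

# Pick one numeric column; 'GRE Score' is a placeholder column name
values = df['GRE Score']

# Mean: each element is now a plain number, so sums stays a scalar
sums = 0
for m in range(len(values)):
    sums += values.iloc[m]
mean = sums / len(values)

# Sum of squared differences from the mean
diff_squared = 0
for n in range(len(values)):
    diff_squared += (values.iloc[n] - mean) ** 2

# Sample standard deviation (n - 1 in the denominator)
stdv = sqrt(diff_squared / (len(values) - 1))
print(stdv)

# Cross-check against pandas' built-in sample standard deviation
print(values.std(ddof=1))

If you need the standard deviation of every numeric column at once, the same arithmetic works on whole Series without the explicit loops, and df.std(ddof=1) gives per-column sample standard deviations you can compare your result against.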

