Python Pandas: Counting the frequency of a specific value in each row of a dataframe?
Question
I have a dataframe df:
domain                country  out1  out2  out3
oranjeslag.nl         NL       1     0     NaN
pietervaartjes.nl     NL       1     1     0
andreaputting.com.au  AU       NaN   1     0
michaelcardillo.com   US       0     0     NaN
I would like to define two columns sum_0 and sum_1 and count the number of 0s and 1s in columns (out1, out2, out3), per row. So the expected result would be:
domain                country  out1  out2  out3  sum_0  sum_1
oranjeslag.nl         NL       1     0     NaN   1      1
pietervaartjes.nl     NL       1     1     0     1      2
andreaputting.com.au  AU       NaN   1     0     1      1
michaelcardillo.com   US       0     0     NaN   2      0
I have this code for counting the number of 1s, but I do not know how to count the number of 0s.
df['sum_1'] = df[['out1','out2','out3']].sum(axis=1)
Can anybody help?
Answer
You can call sum for each condition. The 1 condition is simple, just a straight sum on axis=1; for the second you can compare the df against the value 0 and then call sum as before:
In [102]:
df['sum_1'] = df[['out1','out2','out3']].sum(axis=1)
df['sum_0'] = (df[['out1','out2','out3']] == 0).sum(axis=1)
df
Out[102]:
                 domain country  out1  out2  out3  sum_0  sum_1
0         oranjeslag.nl      NL     1     0   NaN      1      1
1     pietervaartjes.nl      NL     1     1     0      1      2
2  andreaputting.com.au      AU   NaN     1     0      1      1
3   michaelcardillo.com      US     0     0   NaN      2      0
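Note that the straight sum only counts 1s correctly because these columns hold nothing but 0, 1, and NaN (sum skips NaN by default). A minimal, self-contained sketch of the comparison-based approach, which generalizes to counting any specific value; it rebuilds the dataframe from the question's data rather than assuming anything beyond it:

import numpy as np
import pandas as pd

# Rebuild the example dataframe from the question.
df = pd.DataFrame({
    'domain': ['oranjeslag.nl', 'pietervaartjes.nl',
               'andreaputting.com.au', 'michaelcardillo.com'],
    'country': ['NL', 'NL', 'AU', 'US'],
    'out1': [1, 1, np.nan, 0],
    'out2': [0, 1, 1, 0],
    'out3': [np.nan, 0, 0, np.nan],
})

cols = ['out1', 'out2', 'out3']

# A boolean comparison never matches NaN, so missing values are
# simply not counted; summing the booleans along axis=1 gives the
# per-row count of the target value.
df['sum_1'] = (df[cols] == 1).sum(axis=1)
df['sum_0'] = (df[cols] == 0).sum(axis=1)
print(df)

Counting 1s by comparison rather than by summing is also the safer choice if the out columns could ever contain values other than 0 and 1.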