"list" object has no attribute "isEmpty" when I want to check whether a DataFrame is empty

Problem description

I created a DataFrame df2 and want to apply a function to check whether it is empty. I get the error: 'list' object has no attribute 'isEmpty'

import pyspark
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.functions import col, array_contains, monotonically_increasing_id, when, round, lit
from pyspark.sql.types import DecimalType, FloatType, StructType, StructField, StringType, IntegerType
from pyspark.sql.types import ArrayType, DoubleType, BooleanType
from pyspark.sql.window import Window as W


sc = SparkSession.builder.appName('SparkByExamples.com').getOrCreate()

columns2 = ["Java","Python"]
data2 = [("Java", "20000"), ("Hello", "100000"), ("Scala", "3000")]
df2 = sc.createDataFrame(data2).toDF(*columns2)

print(df2.head(1).isEmpty)
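# The line above raises: AttributeError: 'list' object has no attribute 'isEmpty'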

I get an error on the last line. Can anyone help me?

Tags: python, pandas, pyspark, apache-spark-sql

Solution


isEmpty is a method that belongs to the PySpark DataFrame. .head(1) returns a list of Row objects, as stated in the documentation, and a Python list object simply has no isEmpty method to call, which is why the last line fails.
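
For completeness, here is a minimal sketch of how the emptiness check could be written instead, assuming the df2 created above (note that DataFrame.isEmpty() is only available in PySpark 3.3 and later):

# Option 1: head(1) returns a (possibly empty) list of Row objects,
# so test its length rather than looking for an isEmpty attribute.
print(len(df2.head(1)) == 0)

# Option 2: the underlying RDD does provide an isEmpty() method.
print(df2.rdd.isEmpty())

# Option 3: call isEmpty() on the DataFrame itself (PySpark 3.3+).
print(df2.isEmpty())

All three only need to fetch at most one row, so they are much cheaper than comparing df2.count() to zero on a large DataFrame.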

