Unable to register UDF in Spark SQL

Problem Description

I am trying to register my UDF so that I can use it in my Spark SQL queries, but registration fails with the following error:

    val squared = (s: Column) => { 
    concat(substring(s,4,2),year(to_date(from_unixtime(unix_timestamp(s,"dd-MM-yyyy")))))
    }
    squared: org.apache.spark.sql.Column => org.apache.spark.sql.Column = <function1>

    scala> sqlContext.udf.register("dc",squared)
    java.lang.UnsupportedOperationException: Schema for type org.apache.spark.sql.Column is not supported
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:733)
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:671)
    at org.apache.spark.sql.UDFRegistration.register(UDFRegistration.scala:143)
    ... 48 elided

I tried changing Column to String, but then I get the error below:

    val squared = (s: String) => { 
    | concat(substring(s,4,2),year(to_date(from_unixtime(unix_timestamp(s,"dd-MM-yyyy")))))
    | }
    <console>:28: error: type mismatch;
    found   : String
    required: org.apache.spark.sql.Column
   concat(substring(s,4,2),year(to_date(from_unixtime(unix_timestamp(s,"dd-MM-yyyy")))))


Can someone please guide me on how I should implement this?

Tags: apache-spark, hive, apache-spark-sql

Solution


None of the Spark functions in the org.apache.spark.sql.functions._ package can be used inside a UDF. Those functions operate on Column expressions, while a UDF receives plain Scala values (such as String), which is why registration fails with "Schema for type org.apache.spark.sql.Column is not supported".

Instead of calling the built-in Spark functions inside a UDF, you can either write the UDF body in plain Scala (see the sketch after the code below) or skip the UDF entirely and apply the Column functions directly to the DataFrame:

    import org.apache.spark.sql.Column
    import org.apache.spark.sql.functions._
    import spark.implicits._ // for the $"colname" syntax (already in scope in spark-shell)

    val df = spark.sql("select * from your_table")

    // A regular function on Columns (not a UDF), usable with the DataFrame API:
    def date_concat(date: Column): Column = {
      concat(substring(date, 4, 2), year(to_date(from_unixtime(unix_timestamp(date, "dd-MM-yyyy")))))
    }

    df.withColumn("date_column_name", date_concat($"date_column_name")) // with the helper function
    df.withColumn("date_column_name", concat(substring($"date_column_name", 4, 2), year(to_date(from_unixtime(unix_timestamp($"date_column_name", "dd-MM-yyyy")))))) // inline, without the helper
    df.createOrReplaceTempView("table_name")
    spark.sql("[...]") // write your further logic in SQL if you want

