Error running a Spark Glue job after creating a tempView from a DF

Problem Description

When I create a DF from a dynamic frame it works fine, and I am able to write the data frame back to a dynamic frame. But as soon as I register the data frame as a temp view with createOrReplaceTempView, the job throws the error below. The column count is the same, and nothing changes from source to target.

Please help me figure out what is going wrong.

ERROR

IllegalArgumentException: "requirement failed: The number of columns doesn't match.
Old column names (2): name, id
New column names (0): "

PySpark Glue code

import sys
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.sql.types import *
from pyspark.sql.functions import *

args = getResolvedOptions(sys.argv, ['TempDir','JOB_NAME'])

sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args['JOB_NAME'], args)

datasource0 = glueContext.create_dynamic_frame.from_catalog(database =   
"mygluedatabaseoregon", table_name = "dev_glue_poc_gluetable", redshift_tmp_dir =  
args["TempDir"], transformation_ctx = "datasource0")

applymapping1 = ApplyMapping.apply(frame = datasource0, mappings = [("name",  
"string", "name", "string"), ("id", "int", "id", "int")], transformation_ctx =   
"applymapping1")

selectfields2 = SelectFields.apply(frame = applymapping1, paths = ["name", "id"], 
transformation_ctx = "selectfields2")

getOutput = selectfields2.toDF() #.select('name','id')

getOutput.createOrReplaceTempView("info")

sqlData = spark.sql("select name,id from info")

outputSql = sqlData.toDF()

getOutputDFY = DynamicFrame.fromDF(outputSql,glueContext,"getOutputDFY")


resolvechoice3 = ResolveChoice.apply(frame = getOutputDFY, choice = 
"MATCH_CATALOG", database = "mygluedatabaseoregon", table_name = 
"dev_glue_poc_importfromglue", transformation_ctx = "resolvechoice3")



resolvechoice4 = ResolveChoice.apply(frame = resolvechoice3, choice = "make_cols", 
transformation_ctx = "resolvechoice4")



datasink5 = glueContext.write_dynamic_frame.from_catalog(frame = resolvechoice4,
database = "mygluedatabaseoregon", table_name = "dev_glue_poc_importfromglue",
redshift_tmp_dir = args["TempDir"], transformation_ctx = "datasink5")

job.commit()

Tags: amazon-web-services, apache-spark, aws-glue, aws-glue-data-catalog, aws-glue-spark

Solution
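
The exception comes from this line of the job:

outputSql = sqlData.toDF()

spark.sql() already returns a DataFrame, so no conversion is needed at this point. On a Glue DynamicFrame, toDF() converts to a Spark DataFrame, but on a DataFrame, toDF(*cols) means something different: it renames the columns to the names passed in. Called with no arguments, it tries to rename the view's two columns to an empty list of names, which is exactly what the error reports: old column names (2): name, id; new column names (0).

A minimal sketch that reproduces the error outside Glue (the sample row is made up for illustration):

df = spark.createDataFrame([("a", 1)], ["name", "id"])
df.toDF()                # raises IllegalArgumentException: The number of columns doesn't match.
df.toDF("name", "id")    # fine: two new names for two columns

Assuming the target table really does take the same two columns, the fix is to drop the redundant call and pass the result of spark.sql() straight to DynamicFrame.fromDF:

sqlData = spark.sql("select name, id from info")

# sqlData is already a DataFrame; no toDF() needed before converting back
getOutputDFY = DynamicFrame.fromDF(sqlData, glueContext, "getOutputDFY")

The rest of the job (the ResolveChoice steps and the catalog sink) can stay as written.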

