apache-spark - Spark does not delete old data in MemSQL when using Overwrite mode
Question
I am running a Spark job in Overwrite mode. I expected it to delete the existing data in the table and insert the new data. However, it just appends the data instead. I expected the same behavior as when using SaveMode.Overwrite against the file system.
import org.apache.log4j.Logger
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.catalyst.TableIdentifier
// MemSQL Spark connector imports (package names as of memsql-spark-connector 2.x)
import com.memsql.spark.connector._
import com.memsql.spark.connector.MemSQLConnectionInfo

object HiveToMemSQL {
  def main(args: Array[String]): Unit = {
    val log = Logger.getLogger(HiveToMemSQL.getClass)

    // MemSQL destination DB Master Aggregator, port, username and password
    val destHostName = "localhost"
    val destDBName = "tsg"
    val destTable = "ORC_POS_TEST"
    val destPort = 3308
    val destConnInfo = MemSQLConnectionInfo(destHostName, destPort, "root", "", destDBName)

    val spark = SparkSession.builder()
      .appName("Hive To MemSQL")
      .config("maxRecordsPerBatch", "100")
      .config("spark.memsql.host", destConnInfo.dbHost)
      .config("spark.memsql.port", destConnInfo.dbPort.toString)
      .config("spark.memsql.user", destConnInfo.user)
      .config("spark.memsql.password", destConnInfo.password)
      .config("spark.memsql.defaultDatabase", destConnInfo.dbName)
      .config("spark.memsql.defaultSaveMode", "Overwrite")
      .master("local[*]")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    // Queries are expressed in HiveQL
    val sqlDF = spark.sql("select * from tsg.v_pos_krogus_wk_test")
    log.info("Successfully read data from source")
    sqlDF.printSchema()

    // Disabling writing to leaf nodes directly; request Overwrite explicitly
    var saveConf = SaveToMemSQLConf(spark.memSQLConf,
      params = Map("useKeylessShardingOptimization" -> "false",
                   "writeToMaster" -> "false",
                   "saveMode" -> SaveMode.Overwrite.toString))
    log.info("Save mode before: " + saveConf.saveMode)
    saveConf = saveConf.copy(saveMode = SaveMode.Overwrite)
    log.info("Save mode after: " + saveConf.saveMode)

    val tableIdent = TableIdentifier(destDBName, destTable)
    sqlDF.saveToMemSQL(tableIdent, saveConf)
    log.info("Successfully completed writing to MemSQL DB")
  }
}
Answer
With this setting, the MemSQL Spark connector issues REPLACE statements. REPLACE works exactly like INSERT, except that if an old row in the table has the same value as a new row for a PRIMARY KEY, the old row is deleted before the new row is inserted. It never truncates the table. See https://docs.memsql.com/sql-reference/v6.0/replace/
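If you need true overwrite semantics (clear the table, then load), one option is to issue an explicit TRUNCATE over JDBC before calling saveToMemSQL. This is a minimal sketch, not part of the connector API; it assumes the same host, port, credentials and table names used in the question, and that the MySQL JDBC driver (which MemSQL speaks) is on the classpath:

```scala
import java.sql.DriverManager

// Hypothetical values matching the question's configuration
val destHostName = "localhost"
val destPort = 3308
val destDBName = "tsg"
val destTable = "ORC_POS_TEST"

// Clear the destination table before the connector write,
// since the connector's "Overwrite" mode emits REPLACE statements
// (per-row upserts by PRIMARY KEY) rather than truncating the table.
val url = s"jdbc:mysql://$destHostName:$destPort/$destDBName"
val conn = DriverManager.getConnection(url, "root", "")
try {
  conn.createStatement().execute(s"TRUNCATE TABLE $destTable")
} finally {
  conn.close()
}

// ...then call sqlDF.saveToMemSQL(tableIdent, saveConf) as before.
```

Note that this makes the load non-atomic: readers can observe an empty table between the TRUNCATE and the completion of the write.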