Writing data to Doris with Spark fails with: Failed to commit txn 8514725, cause tablet 4846172 succ replica num 0 < load required replica num 2


Writing data to Doris with Spark keeps failing. The total data volume is over two billion rows.
org.apache.doris.spark.exception.StreamLoadException: stream load error, load status:Fail, response:StreamLoadResponse(200,OK,{
"TxnId": 8514725,
"Label": "spark_streamload_20240808_202529_57ee3076cc164fdc9ecf4983de637338",
"Comment": "",
"TwoPhaseCommit": "false",
"Status": "Fail",
"Message": "[ANALYSIS_ERROR]TStatus: errCode = 2, detailMessage = Failed to commit txn 8514725, cause tablet 4846172 succ replica num 0 < load required replica num 2. table 4199922, partition: [ id=4846159, commit version 13796, visible version 13796 ], this tablet detail: 3 replicas write data failed: { [replicaId=4846174, backendId=13376, backendAlive=true, version=13796, state=NORMAL], [replicaId=4846175, backendId=12120, backendAlive=true, version=13796, state=NORMAL], [replicaId=4846173, backendId=13375, backendAlive=true, version=13796, state=NORMAL] }; ",
"NumberTotalRows": 500000,
"NumberLoadedRows": 500000,
"NumberFilteredRows": 0,
"NumberUnselectedRows": 0,
"LoadBytes": 688752089,
"LoadTimeMs": 17367,
"BeginTxnTimeMs": 0,
"StreamLoadPutTimeMs": 3,
"ReadDataTimeMs": 1000,
"WriteDataTimeMs": 17360,
"CommitAndPublishTimeMs": 0
}
)
at org.apache.doris.spark.load.StreamLoader.org$apache$doris$spark$load$StreamLoader$$handleStreamLoadResponse(StreamLoader.scala:482)
at org.apache.doris.spark.load.StreamLoader$$anonfun$1.apply$mcV$sp(StreamLoader.scala:102)
at org.apache.doris.spark.load.StreamLoader$$anonfun$1.apply(StreamLoader.scala:99)
at org.apache.doris.spark.load.StreamLoader$$anonfun$1.apply(StreamLoader.scala:99)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.doris.spark.load.StreamLoader.load(StreamLoader.scala:99)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$write$1.apply(DorisWriter.scala:78)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$write$1.apply(DorisWriter.scala:78)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$doWrite$1$$anonfun$3.apply(DorisWriter.scala:98)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$doWrite$1$$anonfun$3.apply(DorisWriter.scala:98)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.doris.spark.sql.Utils$.retry(Utils.scala:182)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$doWrite$1$$anonfun$2$$anonfun$apply$1.apply(DorisWriter.scala:97)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$doWrite$1$$anonfun$2$$anonfun$apply$1.apply(DorisWriter.scala:97)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$doWrite$1.apply(DorisWriter.scala:98)
at org.apache.doris.spark.writer.DorisWriter$$anonfun$doWrite$1.apply(DorisWriter.scala:94)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

1 Answer

Could you please share the parameter configuration you use when writing to Doris, along with the write frequency and the size of each batch?
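For reference when gathering those settings: a Spark Doris Connector write is typically configured through `.option(...)` keys on the DataFrame writer. The sketch below is illustrative only (the FE host, credentials, and values are assumptions, not the poster's actual configuration); the batch-size knob is worth checking because the failing load above pushed 500,000 rows (~657 MB, per `LoadBytes`) in a single stream load transaction.

```scala
// Minimal sketch of a Spark Doris Connector write, assuming a DataFrame `df`.
// Hosts, credentials, and values are placeholders, not recommendations.
df.write
  .format("doris")
  .option("doris.fenodes", "fe_host:8030")        // FE HTTP address (assumed host/port)
  .option("doris.table.identifier", "db.table")   // target database.table (placeholder)
  .option("user", "root")                         // assumed credentials
  .option("password", "")
  // Rows buffered per stream load transaction; smaller batches reduce
  // the data each tablet replica must apply per txn.
  .option("doris.sink.batch.size", "100000")
  // Retries on a failed stream load before the Spark task fails.
  .option("doris.sink.max-retries", "3")
  .save()
```

Reporting the values of these sink options (or their Flink/stream-load equivalents if a different path is used), together with how often batches are flushed, would help narrow down why all three replicas of tablet 4846172 failed to apply the write.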