What causes this error when updating a dataset?

java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.finebi.exception.DSInterruptedException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: com.finebi.spider.sparksql.exception.SpiderWrapSparkException: com.finebi.exception.DSInterruptedException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 969378.0 failed 1 times, most recent failure: Lost task 0.0 in stage 969378.0 (TID 981867) (WIN-CA0PST2MGP0 executor driver): org.apache.spark.SparkException: Failed to execute user defined function(functions$$$Lambda$9680/1758078813: (int, string) => bigint)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage16.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:755)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
    at org.apache.spark.util.random.SamplingUtils$.reservoirSampleAndCount(SamplingUtils.scala:41)
    at org.apache.spark.RangePartitioner$.$anonfun$sketch$1(Partitioner.scala:306)
    at org.apache.spark.RangePartitioner$.$anonfun$sketch$1$adapted(Partitioner.scala:304)
    at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:915)
    at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:915)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
    at org.apache.spark.scheduler.Task.run(Task.scala:131)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
    ... 21 more

Driver stacktrace:

FineBI · VincentYph · Posted 2023-5-31 15:40
1 answer in total
Best answer
Z4u3z1 (Lv6 · Expert Helper)
Posted 2023-5-31 15:42

Looks like a string-to-number conversion error. The root cause line in the trace is `java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long` — the user-defined function `(int, string) => bigint` expected a `bigint` (Long) but received an `int` (Integer), which typically points to a field-type mismatch (e.g. a text field being converted to a number) in the dataset or a calculated field.
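The failure mode above can be reproduced outside Spark. A minimal Java sketch (the value `42` and the variable names are hypothetical, not taken from the actual dataset) showing why a boxed `Integer` cannot be cast directly to `Long`, even though the primitive `int` widens to `long`:

```java
public class CastDemo {
    public static void main(String[] args) {
        // A numeric column value that arrives boxed as Integer (hypothetical).
        Object fieldValue = Integer.valueOf(42);

        // Direct reference cast between the two boxed types fails:
        try {
            Long bad = (Long) fieldValue; // throws ClassCastException
            System.out.println(bad);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: " + e.getMessage());
        }

        // Safe path: unbox to the primitive first, then widen to long.
        long good = ((Number) fieldValue).longValue();
        System.out.println(good);
    }
}
```

In practice this usually means the runtime type of a column does not match the type the generated code expects, so aligning the field types (keeping both sides `int` or both `bigint`, or converting the text field explicitly) is the usual fix.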


  • 1 follower
  • 355 views
  • Last answered: 2023-5-31 15:42