Calling Linkis to execute a task fails with "Table or view not found" #186

Closed
penny321 opened this issue Mar 13, 2024 · 1 comment
Comments

@penny321

2024-03-13 11:16:19.016 INFO Program is substituting variables for you
2024-03-13 11:16:19.016 INFO Variables substitution ended successfully
Job with jobId : IDE_huilan_spark_0 and execID : IDE_huilan_spark_0 submitted
2024-03-13 11:16:19.016 INFO You have submitted a new job, script code (after variable substitution) is
SCRIPT CODE
import java.sql.{Connection, DriverManager}
val prop = new java.util.Properties;
prop.setProperty("user", "");
prop.setProperty("password", "");
val UUID = java.util.UUID.randomUUID.toString
val tmp1 = spark.sql("select * from media_db.t_report where (1=1) and (target_name is null)");
val schemas = tmp1.schema.fields.map(f => f.name).toList
val newSchemas = schemas.map(s => s.replaceAll("[()]", "")).toList
val tmp2 = tmp1.toDF(newSchemas: _*)
spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
spark.conf.set("spark.sql.sources.partitionOverwriteMode","dynamic")
if (spark.catalog.tableExists("huilan_ind.check_test_test_null")) {
tmp2.withColumn("qualitis_partition_key", lit("20240313")).write.mode("overwrite").insertInto("huilan_ind.check_test_test_null");
} else {
tmp2.withColumn("qualitis_partition_key", lit("20240313")).write.mode("append").partitionBy("qualitis_partition_key").format("hive").saveAsTable("huilan_ind.check_test_test_null");
}
tmp2.selectExpr("count(*) as value", "'QUALITIS20240313111618596_621728' as application_id", "'Long' as result_type", "'7' as rule_id", "'-1' as rule_metric_id", "'-1' as run_date", "'2024-03-13 11:16:18' as create_time").write.mode(org.apache.spark.sql.SaveMode.Append).jdbc("jdbc:mysql://172.20.0.71:3306/qualitis?useSSL=false&createDatabaseIfNotExist=true&useUnicode=true&characterEncoding=utf-8", "qualitis_application_task_result", prop)
SCRIPT CODE
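As an aside, the generated script renames columns by stripping parentheses (so aggregate columns like `count(id)` become Hive-safe names) before writing. That renaming step can be exercised in plain Scala, independent of Spark (a minimal sketch; `sanitize` is a hypothetical helper name, not part of the script):

```scala
object ColumnRenameDemo {
  // Mirror of the script's renaming step: strip "(" and ")" from each
  // column name so Hive accepts the schema.
  def sanitize(names: List[String]): List[String] =
    names.map(_.replaceAll("[()]", ""))

  def main(args: Array[String]): Unit = {
    println(sanitize(List("count(id)", "avg(score)", "target_name")))
    // prints List(countid, avgscore, target_name)
  }
}
```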
2024-03-13 11:16:19.016 INFO Your job is accepted, jobID is IDE_huilan_spark_0 and taskID is 13 in ServiceInstance(linkis-cg-entrance, huilan71:6104). Please wait it to be scheduled
job is scheduled.
2024-03-13 11:16:19.016 INFO Your job is Scheduled. Please wait it to run.
Your job is being scheduled by orchestrator.
2024-03-13 11:16:19.016 INFO job is running.
2024-03-13 11:16:19.016 INFO Your job is Running now. Please wait it to complete.
2024-03-13 11:16:19.016 INFO Job with jobGroupId : 13 and subJobId : 11 was submitted to Orchestrator.
2024-03-13 11:16:19.016 INFO Background is starting a new engine for you,execId astJob_1_codeExec_1 mark id is mark_1, it may take several seconds, please wait
2024-03-13 11:17:29.017 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, huilan71:37767) /home/huilan/linkis/back/appcom/tmp/huilan/workDir/c1341aca-1c90-44d6-b492-b49581f51f18/logs
2024-03-13 11:17:29.017 INFO yarn application id: application_1710127870671_0002
scala> import java.sql.{Connection, DriverManager}
2024-03-13 11:17:47.017 INFO yarn application id: application_1710127870671_0002
scala> val prop = new java.util.Properties;
2024-03-13 11:17:48.017 INFO yarn application id: application_1710127870671_0002
scala> prop.setProperty("user", "");
2024-03-13 11:17:48.017 INFO yarn application id: application_1710127870671_0002
scala> prop.setProperty("password", "");
2024-03-13 11:17:48.017 INFO yarn application id: application_1710127870671_0002
scala> val UUID = java.util.UUID.randomUUID.toString
2024-03-13 11:17:49.017 INFO yarn application id: application_1710127870671_0002
2024-03-13 11:17:29.071 WARN [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineconn.computation.executor.hook.executor.ExecuteOnceHook 50 warn - execute once become effective, register lock listener
2024-03-13 11:17:29.164 WARN [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor 50 warn - Start to init sparkILoop cost 2.
2024-03-13 11:17:47.246 WARN [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor 50 warn - Finished to init sparkILoop cost 18083.
2024-03-13 11:18:06.520 ERROR [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor 62 error - Execute code error for org.apache.spark.sql.AnalysisException: Table or view not found: media_db.t_report; line 1 pos 14;
'Project [*]
+- 'Filter ((1 = 1) && isnull('target_name))
+- 'UnresolvedRelation media_db.t_report
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:90)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
... 82 elided
2024-03-13 11:18:06.524 ERROR [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor 58 error - execute code failed! org.apache.linkis.engineplugin.spark.exception.ExecuteError: errCode: 40005 ,desc: execute sparkScala failed! ,ip: huilan71 ,port: 37767 ,serviceKind: linkis-cg-engineconn
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$1.apply(SparkScalaExecutor.scala:192) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$1.apply(SparkScalaExecutor.scala:156) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) ~[scala-library-2.11.12.jar:?]
at scala.Console$.withOut(Console.scala:65) ~[scala-library-2.11.12.jar:?]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor.executeLine(SparkScalaExecutor.scala:155) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$runCode$1.apply$mcV$sp(SparkScalaExecutor.scala:130) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$runCode$1.apply(SparkScalaExecutor.scala:130) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$runCode$1.apply(SparkScalaExecutor.scala:130) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor.runCode(SparkScalaExecutor.scala:131) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2$$anonfun$2.apply(SparkEngineConnExecutor.scala:83) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2$$anonfun$2.apply(SparkEngineConnExecutor.scala:83) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2.apply(SparkEngineConnExecutor.scala:83) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2.apply(SparkEngineConnExecutor.scala:64) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor.executeLine(SparkEngineConnExecutor.scala:91) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10$$anonfun$apply$11.apply(ComputationExecutor.scala:180) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10$$anonfun$apply$11.apply(ComputationExecutor.scala:179) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10.apply(ComputationExecutor.scala:181) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10.apply(ComputationExecutor.scala:175) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at scala.collection.immutable.Range.foreach(Range.scala:160) [scala-library-2.11.12.jar:?]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:174) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:150) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) [linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.toExecuteTask(ComputationExecutor.scala:227) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:242) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:242) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) [linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:55) [linkis-accessible-executor-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:49) [linkis-accessible-executor-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.ensureOp(ComputationExecutor.scala:134) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.execute(ComputationExecutor.scala:241) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.org$apache$linkis$engineconn$computation$executor$service$TaskExecutionServiceImpl$$executeTask(TaskExecutionServiceImpl.scala:240) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1$$anonfun$run$1.apply$mcV$sp(TaskExecutionServiceImpl.scala:173) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1$$anonfun$run$1.apply(TaskExecutionServiceImpl.scala:171) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1$$anonfun$run$1.apply(TaskExecutionServiceImpl.scala:171) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) [linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:69) [linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1.run(TaskExecutionServiceImpl.scala:171) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_271]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_271]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_271]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_271]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_271]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_271]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_271]
2024-03-13 11:18:06.585 ERROR [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl 58 error - org.apache.spark.sql.AnalysisException: Table or view not found: media_db.t_report; line 1 pos 14;
'Project [*]
+- 'Filter ((1 = 1) && isnull('target_name))
+- 'UnresolvedRelation media_db.t_report
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:90)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
... 82 elided
org.apache.linkis.engineplugin.spark.exception.ExecuteError: errCode: 40005 ,desc: execute sparkScala failed! ,ip: huilan71 ,port: 37767 ,serviceKind: linkis-cg-engineconn
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$1.apply(SparkScalaExecutor.scala:192) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$1.apply(SparkScalaExecutor.scala:156) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) ~[scala-library-2.11.12.jar:?]
at scala.Console$.withOut(Console.scala:65) ~[scala-library-2.11.12.jar:?]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor.executeLine(SparkScalaExecutor.scala:155) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$runCode$1.apply$mcV$sp(SparkScalaExecutor.scala:130) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$runCode$1.apply(SparkScalaExecutor.scala:130) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor$$anonfun$runCode$1.apply(SparkScalaExecutor.scala:130) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkScalaExecutor.runCode(SparkScalaExecutor.scala:131) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2$$anonfun$2.apply(SparkEngineConnExecutor.scala:83) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2$$anonfun$2.apply(SparkEngineConnExecutor.scala:83) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2.apply(SparkEngineConnExecutor.scala:83) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor$$anonfun$executeLine$2.apply(SparkEngineConnExecutor.scala:64) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineplugin.spark.executor.SparkEngineConnExecutor.executeLine(SparkEngineConnExecutor.scala:91) ~[linkis-engineplugin-spark-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10$$anonfun$apply$11.apply(ComputationExecutor.scala:180) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10$$anonfun$apply$11.apply(ComputationExecutor.scala:179) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10.apply(ComputationExecutor.scala:181) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10.apply(ComputationExecutor.scala:175) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at scala.collection.immutable.Range.foreach(Range.scala:160) ~[scala-library-2.11.12.jar:?]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:174) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:150) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.toExecuteTask(ComputationExecutor.scala:227) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:242) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:242) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:55) ~[linkis-accessible-executor-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:49) ~[linkis-accessible-executor-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.ensureOp(ComputationExecutor.scala:134) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.execute(ComputationExecutor.scala:241) ~[linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.org$apache$linkis$engineconn$computation$executor$service$TaskExecutionServiceImpl$$executeTask(TaskExecutionServiceImpl.scala:240) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1$$anonfun$run$1.apply$mcV$sp(TaskExecutionServiceImpl.scala:173) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1$$anonfun$run$1.apply(TaskExecutionServiceImpl.scala:171) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1$$anonfun$run$1.apply(TaskExecutionServiceImpl.scala:171) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) [linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:69) [linkis-common-1.0.3.jar:1.0.3]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1.run(TaskExecutionServiceImpl.scala:171) [linkis-computation-engineconn-1.0.3.jar:1.0.3]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_271]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_271]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_271]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_271]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_271]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_271]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_271]
Job with execId-IDE_huilan_spark_0 and subJobId : 11 from orchestrator completed with state ErrorExecuteResponse(21304, Task is Failed,errorMsg: org.apache.spark.sql.AnalysisException: Table or view not found: media_db.t_report; line 1 pos 14;
'Project [*]
+- 'Filter ((1 = 1) && isnull('target_name))
+- 'UnresolvedRelation media_db.t_report
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:90)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
... 82 elided
,null)
2024-03-13 11:18:07.018 INFO job is completed.
2024-03-13 11:18:07.018 INFO Task creation time: 2024-03-13 11:16:19, Task scheduling time: 2024-03-13 11:16:19, Task start time: 2024-03-13 11:16:19, Task end time: 2024-03-13 11:18:07
2024-03-13 11:18:07.018 INFO Your task 13 took 1.8 minutes in total.
2024-03-13 11:18:07.018 INFO Sorry. Your job completed with a status Failed. You can view logs for the reason.

@Tangjiafeng
Contributor

Confirm on the cluster that the database and table actually exist, e.g. verify with the Hive CLI: `desc your_table`.
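The same check can be run from a spark-shell (or a Linkis Scala task) on the cluster. A sketch, assuming a live SparkSession named `spark` and the names from the failing job; this is not standalone-runnable code:

```scala
// Run inside spark-shell on the cluster; `spark` is the session it provides.
spark.sql("SHOW DATABASES").show()                          // is media_db listed at all?
println(spark.catalog.tableExists("media_db", "t_report"))  // the same check the generated script relies on
// If the database is missing, the engine may not be wired to the Hive metastore:
println(spark.conf.get("spark.sql.catalogImplementation", "in-memory"))
```

If `spark.sql.catalogImplementation` is not `hive`, the Spark engine is likely resolving `media_db.t_report` against an empty in-memory catalog, typically because `hive-site.xml` is not on the engine's classpath.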
