Airflow spark-submit task log, July 27, 2023: a SQLMesh-managed INSERT OVERWRITE on Spark 3.1 fails with "Dynamic partition strict mode requires at least one static partition column."
[2023-07-27T22:57:23.296000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:23 INFO InsertIntoHiveTable: [AIRBNB] Overriding hive.exec.scratchdir to hdfs://ha-nn-uri/tmp/svc_di_data_infra/hive_staging
[2023-07-27T22:57:23.298000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:23 INFO FileUtils: Creating directory if it doesn't exist: hdfs://ha-nn-uri/tmp/svc_di_data_infra/hive_staging_hive_2023-07-27_22-57-23_296_8878419864542981198-1
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - Traceback (most recent call last):
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/engines/spark/app.py", line 98, in <module>
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - main(args.dialect, args.command_type, args.ddl_concurrent_tasks, args.payload_path)
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/engines/spark/app.py", line 70, in main
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - command_handler(evaluator, command_payload)
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/engines/commands.py", line 80, in evaluate
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - is_dev=command_payload.is_dev,
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/snapshot/evaluator.py", line 184, in evaluate
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - apply(query_or_df, index=0)
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/snapshot/evaluator.py", line 124, in apply
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - model, table_name, query_or_df, snapshots, is_dev, start=start, end=end
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/snapshot/evaluator.py", line 754, in insert
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - **kwargs,
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/engine_adapter/base.py", line 628, in insert_overwrite_by_time_partition
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - return self._insert_overwrite_by_condition(table_name, query_or_df, where, columns_to_types)
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/engine_adapter/spark.py", line 75, in _insert_overwrite_by_condition
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - super()._insert_overwrite_by_condition(table_name, query_or_df, where, columns_to_types)
[2023-07-27T22:57:23.388000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/engine_adapter/base.py", line 686, in _insert_overwrite_by_condition
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - self.execute(insert_exp)
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/core/engine_adapter/base.py", line 910, in execute
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - self.cursor.execute(sql, **kwargs)
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - File "/usr/local/lib/python3.7/dist-packages/sqlmesh/engines/spark/db_api/spark_session.py", line 21, in execute
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - self._last_df = self._spark.sql(query)
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - File "/srv/emr-spark/spark3.1-bin-hadoop2.8.5-prod-5-3.1.1.5/python/lib/pyspark.zip/pyspark/sql/session.py", line 723, in sql
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - File "/srv/emr-spark/spark3.1-bin-hadoop2.8.5-prod-5-3.1.1.5/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1305, in __call__
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - File "/srv/emr-spark/spark3.1-bin-hadoop2.8.5-prod-5-3.1.1.5/python/lib/pyspark.zip/pyspark/sql/utils.py", line 111, in deco
[2023-07-27T22:57:23.389000 UTC] {spark_submit.py:521} INFO - File "/srv/emr-spark/spark3.1-bin-hadoop2.8.5-prod-5-3.1.1.5/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 328, in get_return_value
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - py4j.protocol.Py4JJavaError: An error occurred while calling o94.sql.
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - : org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.processInsert(InsertIntoHiveTable.scala:165)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.run(InsertIntoHiveTable.scala:106)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:120)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3687)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at java.lang.reflect.Method.invoke(Method.java:498)
[2023-07-27T22:57:23.390000 UTC] {spark_submit.py:521} INFO - at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO - at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO - at py4j.Gateway.invoke(Gateway.java:282)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO - at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO - at py4j.commands.CallCommand.execute(CallCommand.java:79)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO - at py4j.GatewayConnection.run(GatewayConnection.java:238)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO - at java.lang.Thread.run(Thread.java:750)
[2023-07-27T22:57:23.391000 UTC] {spark_submit.py:521} INFO -
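For reference: the failure above is Hive's strict dynamic-partition check rejecting the INSERT OVERWRITE that SQLMesh's _insert_overwrite_by_condition generated, because every partition column in the statement is dynamic. Below is a minimal sketch of the workaround named in the exception message, assuming a standalone PySpark session with Hive support; the db.events table and ds partition column are hypothetical, not taken from this log.

```python
from pyspark.sql import SparkSession

# Hypothetical standalone reproduction of the workaround; table and
# partition names are illustrative only.
spark = (
    SparkSession.builder
    .appName("dynamic-partition-nonstrict-sketch")
    .enableHiveSupport()
    # Relax Hive's strict mode so an INSERT OVERWRITE may use only
    # dynamic partition columns, as the SparkException suggests.
    .config("hive.exec.dynamic.partition", "true")
    .config("hive.exec.dynamic.partition.mode", "nonstrict")
    .getOrCreate()
)

# With nonstrict mode, an all-dynamic partition insert like the one that
# failed above is accepted:
spark.sql("""
    INSERT OVERWRITE TABLE db.events PARTITION (ds)
    SELECT id, payload, ds FROM db.events_staging
""")
```

The same setting can also be applied inside an existing session with spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict") before the insert. The alternative the error message implies, keeping strict mode and pinning at least one static partition (e.g. PARTITION (ds='2023-07-27')), would require changing the generated statement itself; since SQLMesh builds that statement, the session-level conf is likely the more practical lever here.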
[2023-07-27T22:57:23.619000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:23 INFO SparkExecutionPlanProcessor: Shutting down SAC (SparkExecutionPlanProcessor) executor
[2023-07-27T22:57:26.057000 UTC] {base_job.py:248} INFO - Job 625076 on host i-09ccdc61677067624.inst.aws.us-east-1.prod.musta.ch heartbeat with 0 s
[2023-07-27T22:57:28.620000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:28 INFO SparkExecutionPlanProcessor: SAC (SparkExecutionPlanProcessor) executor terminated with all tasks have completed execution.
[2023-07-27T22:57:28.620000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:28 INFO MLPipelineEventProcessor: Shutting down SAC (MLPipelineEventProcessor) executor
[2023-07-27T22:57:33.620000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:33 INFO MLPipelineEventProcessor: SAC (MLPipelineEventProcessor) executor terminated with all tasks have completed execution.
[2023-07-27T22:57:33.621000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:33 INFO SparkExecutionPlanProcessor: Shutting down SAC (SparkExecutionPlanProcessor) executor
[2023-07-27T22:57:38.621000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:38 INFO SparkExecutionPlanProcessor: SAC (SparkExecutionPlanProcessor) executor terminated with all tasks have completed execution.
[2023-07-27T22:57:38.621000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:38 INFO SparkStageSubmittedEventProcessor: Shutting down SAC (SparkStageSubmittedEventProcessor) executor
[2023-07-27T22:57:43.622000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:43 INFO SparkStageSubmittedEventProcessor: SAC (SparkStageSubmittedEventProcessor) executor terminated with all tasks have completed execution.
[2023-07-27T22:57:43.622000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:43 INFO MLPipelineEventProcessor: Shutting down SAC (MLPipelineEventProcessor) executor
[2023-07-27T22:57:48.622000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:48 INFO MLPipelineEventProcessor: SAC (MLPipelineEventProcessor) executor terminated with all tasks have completed execution.
[2023-07-27T22:57:48.622000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:48 INFO SparkCatalogEventProcessor: Shutting down SAC (SparkCatalogEventProcessor) executor
[2023-07-27T22:57:53.623000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO SparkCatalogEventProcessor: SAC (SparkCatalogEventProcessor) executor terminated with all tasks have completed execution.
[2023-07-27T22:57:53.624000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO SparkContext: Invoking stop() from shutdown hook
[2023-07-27T22:57:53.634000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO AbstractConnector: Stopped Spark@7a4c8041{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
[2023-07-27T22:57:53.636000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO SparkUI: Stopped Spark web UI at http://i-09ccdc61677067624.inst.aws.us-east-1.prod.musta.ch:4040
[2023-07-27T22:57:53.639000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO YarnClientSchedulerBackend: Interrupting monitor thread
[2023-07-27T22:57:53.666000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO YarnClientSchedulerBackend: Shutting down all executors
[2023-07-27T22:57:53.670000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
[2023-07-27T22:57:53.674000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO YarnClientSchedulerBackend: YARN client scheduler backend Stopped
[2023-07-27T22:57:53.707000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
[2023-07-27T22:57:53.718000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO MemoryStore: MemoryStore cleared
[2023-07-27T22:57:53.718000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO BlockManager: BlockManager stopped
[2023-07-27T22:57:53.729000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO BlockManagerMaster: BlockManagerMaster stopped
[2023-07-27T22:57:53.732000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
[2023-07-27T22:57:53.738000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO SparkContext: Successfully stopped SparkContext
[2023-07-27T22:57:53.739000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO ShutdownHookManager: Shutdown hook called
[2023-07-27T22:57:53.739000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO ShutdownHookManager: Deleting directory /tmp/spark-d2cf3b0b-dbfb-4415-b169-fd664746e5ab
[2023-07-27T22:57:53.743000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO ShutdownHookManager: Deleting directory /tmp/spark-0fb53e34-7f81-4a82-bc1b-7049409f3f19
[2023-07-27T22:57:53.746000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO ShutdownHookManager: Deleting directory /tmp/spark-0fb53e34-7f81-4a82-bc1b-7049409f3f19/pyspark-e3b3db31-1e66-4fe4-9946-c2fba5bd3a58
[2023-07-27T22:57:53.750000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO AtlasHook: ==> Shutdown of Atlas Hook
[2023-07-27T22:57:53.750000 UTC] {spark_submit.py:521} INFO - 23/07/27 22:57:53 INFO AtlasHook: <== Shutdown of Atlas Hook |