++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z ']'
+ '[' -n '' ']'
+ '[' -z ']'
+ '[' -z x ']'
+ SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
+ case "$1" in
+ shift 1
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=192.168.79.167 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.JavaWordCount hdfs://10.129.2.179:9000/Pritam/wc_new.jar hdfs://10.129.2.179:9000/input-data/WC/wc-data-500mb.txt
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.12-3.2.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
22/05/18 14:47:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/05/18 14:47:53 INFO SparkContext: Running Spark version 3.2.1
22/05/18 14:47:53 INFO ResourceUtils: ==============================================================
22/05/18 14:47:53 INFO ResourceUtils: No custom resources configured for spark.driver.
22/05/18 14:47:53 INFO ResourceUtils: ==============================================================
22/05/18 14:47:53 INFO SparkContext: Submitted application: JavaWordCount
22/05/18 14:47:53 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
22/05/18 14:47:53 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor
22/05/18 14:47:53 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/05/18 14:47:53 INFO SecurityManager: Changing view acls to: 185,spark
22/05/18 14:47:53 INFO SecurityManager: Changing modify acls to: 185,spark
22/05/18 14:47:53 INFO SecurityManager: Changing view acls groups to:
22/05/18 14:47:53 INFO SecurityManager: Changing modify acls groups to:
22/05/18 14:47:53 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(185, spark); groups with view permissions: Set(); users with modify permissions: Set(185, spark); groups with modify permissions: Set()
22/05/18 14:47:53 INFO Utils: Successfully started service 'sparkDriver' on port 7078.
22/05/18 14:47:53 INFO SparkEnv: Registering MapOutputTracker
22/05/18 14:47:53 INFO SparkEnv: Registering BlockManagerMaster
22/05/18 14:47:53 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/05/18 14:47:53 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/05/18 14:47:53 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/05/18 14:47:53 INFO DiskBlockManager: Created local directory at /var/data/spark-9f39ddea-4d18-41e6-a52e-757ea3b1ad53/blockmgr-8bff0882-bd7b-46ef-8708-2bad747c6094
22/05/18 14:47:53 INFO MemoryStore: MemoryStore started with capacity 413.9 MiB
22/05/18 14:47:53 INFO SparkEnv: Registering OutputCommitCoordinator
22/05/18 14:47:54 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/05/18 14:47:54 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc:4040
22/05/18 14:47:54 INFO SparkContext: Added JAR file:/tmp/spark-ff448a6b-5869-4dba-8d97-052d27ba82dd/sparklens-0.3.2-s_2.11.jar at spark://utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc:7078/jars/sparklens-0.3.2-s_2.11.jar with timestamp 1652885273112
22/05/18 14:47:54 INFO SparkContext: Added JAR file:/tmp/spark-ff448a6b-5869-4dba-8d97-052d27ba82dd/wc_new.jar at spark://utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc:7078/jars/wc_new.jar with timestamp 1652885273112
22/05/18 14:47:54 INFO SparkContext: Added JAR hdfs://10.129.2.179:9000/Pritam/wc_new.jar at hdfs://10.129.2.179:9000/Pritam/wc_new.jar with timestamp 1652885273112
22/05/18 14:47:54 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
22/05/18 14:47:56 INFO ExecutorPodsAllocator: Going to request 5 executors from Kubernetes for ResourceProfile Id: 0, target: 10, known: 0, sharedSlotFromPendingPods: 2147483647.
22/05/18 14:47:56 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:47:56 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 7079.
22/05/18 14:47:56 INFO NettyBlockTransferService: Server created on utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc:7079
22/05/18 14:47:56 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/05/18 14:47:56 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc, 7079, None)
22/05/18 14:47:56 INFO BlockManagerMasterEndpoint: Registering block manager utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc:7079 with 413.9 MiB RAM, BlockManagerId(driver, utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc, 7079, None)
22/05/18 14:47:56 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc, 7079, None)
22/05/18 14:47:56 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, utkarsh-wc-test-lens-1-e879dc80d7a2fb47-driver-svc.default.svc, 7079, None)
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
	at com.qubole.sparklens.common.ApplicationInfo.<init>(ApplicationInfo.scala:22)
	at com.qubole.sparklens.QuboleJobListener.<init>(QuboleJobListener.scala:42)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
	at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
	at org.apache.spark.util.Utils$.$anonfun$loadExtensions$1(Utils.scala:2876)
	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
	at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2868)
	at org.apache.spark.SparkContext.$anonfun$setupAndStartListenerBus$1(SparkContext.scala:2538)
	at org.apache.spark.SparkContext.$anonfun$setupAndStartListenerBus$1$adapted(SparkContext.scala:2537)
	at scala.Option.foreach(Option.scala:407)
	at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2537)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:641)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
	at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:47)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
	at java.base/java.lang.reflect.Method.invoke(Unknown Source)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
	at java.base/java.net.URLClassLoader.findClass(Unknown Source)
	at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
	at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
	... 37 more
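Note on the failure above: `NoClassDefFoundError: scala/Product$class` is the classic symptom of a Scala binary-version mismatch. The `scala.Product$class` trait-implementation class exists only in bytecode compiled for Scala 2.11, while this Spark 3.2.1 distribution runs on Scala 2.12 (see `spark-unsafe_2.12-3.2.1.jar` in the reflective-access warning earlier in the log). The crash happens while `Utils.loadExtensions` instantiates the `com.qubole.sparklens.QuboleJobListener` registered via `spark.extraListeners`, and the jar being loaded is `sparklens-0.3.2-s_2.11.jar`, i.e. a Scala 2.11 build. A sketch of the resubmission, assuming a sparklens jar built for Scala 2.12 is available (the jar path below is a placeholder, not a published coordinate):

```shell
# Resubmit with a sparklens build that matches Spark 3.2.1's Scala 2.12,
# instead of sparklens-0.3.2-s_2.11.jar. The /path/to jar is a placeholder:
# substitute whatever Scala 2.12 build of sparklens you have.
$SPARK_HOME/bin/spark-submit \
  --deploy-mode client \
  --conf spark.extraListeners=com.qubole.sparklens.QuboleJobListener \
  --jars /path/to/sparklens_2.12.jar \
  --class org.apache.spark.examples.JavaWordCount \
  hdfs://10.129.2.179:9000/Pritam/wc_new.jar \
  hdfs://10.129.2.179:9000/input-data/WC/wc-data-500mb.txt
```

Alternatively, to confirm the word-count job itself is healthy, remove the sparklens listener (drop `spark.extraListeners` and the `--jars` sparklens entry from `spark.properties`) and rerun; the executors that register below show the cluster side was coming up normally.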
22/05/18 14:47:56 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:47:56 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:47:57 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:47:58 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:47:59 INFO ExecutorPodsAllocator: Going to request 5 executors from Kubernetes for ResourceProfile Id: 0, target: 10, known: 5, sharedSlotFromPendingPods: 2147483642.
22/05/18 14:47:59 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:48:00 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:48:01 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:48:02 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:48:02 INFO BasicExecutorFeatureStep: Decommissioning not enabled, skipping shutdown script
22/05/18 14:48:38 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.84.44:43902) with ID 4, ResourceProfileId 0
22/05/18 14:48:38 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.84.44:39033 with 413.9 MiB RAM, BlockManagerId(4, 192.168.84.44, 39033, None)
22/05/18 14:48:41 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.79.168:41938) with ID 2, ResourceProfileId 0
22/05/18 14:48:41 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.79.168:37999 with 413.9 MiB RAM, BlockManagerId(2, 192.168.79.168, 37999, None)
22/05/18 14:48:42 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.250.173:37326) with ID 1, ResourceProfileId 0
22/05/18 14:48:42 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.250.173:42451 with 413.9 MiB RAM, BlockManagerId(1, 192.168.250.173, 42451, None)
22/05/18 14:48:42 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.84.45:39364) with ID 9, ResourceProfileId 0
22/05/18 14:48:42 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.84.45:45931 with 413.9 MiB RAM, BlockManagerId(9, 192.168.84.45, 45931, None)
22/05/18 14:49:04 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.79.169:57970) with ID 7, ResourceProfileId 0
22/05/18 14:49:04 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.79.169:40811 with 413.9 MiB RAM, BlockManagerId(7, 192.168.79.169, 40811, None)
22/05/18 14:49:20 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.250.174:46390) with ID 8, ResourceProfileId 0
22/05/18 14:49:21 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.250.174:44595 with 413.9 MiB RAM, BlockManagerId(8, 192.168.250.174, 44595, None)
22/05/18 14:49:22 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.250.175:48854) with ID 5, ResourceProfileId 0
22/05/18 14:49:22 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.250.175:39033 with 413.9 MiB RAM, BlockManagerId(5, 192.168.250.175, 39033, None)
22/05/18 14:49:30 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.237.159:54016) with ID 10, ResourceProfileId 0
22/05/18 14:49:30 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.237.159:35809 with 413.9 MiB RAM, BlockManagerId(10, 192.168.237.159, 35809, None)
22/05/18 14:49:30 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.79.170:48378) with ID 6, ResourceProfileId 0
22/05/18 14:49:31 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.79.170:44125 with 413.9 MiB RAM, BlockManagerId(6, 192.168.79.170, 44125, None)
22/05/18 14:49:47 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.250.176:37284) with ID 3, ResourceProfileId 0
22/05/18 14:49:47 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.250.176:35971 with 413.9 MiB RAM, BlockManagerId(3, 192.168.250.176, 35971, None)