
ERROR yarn.Client: Failed to contact YARN for application

Created: Jun 28, 2022 06:22:31 | Latest reply: Jun 28, 2022 06:27:59 (problem resolved)

Hello, friend!

When the spark-submit task is submitted, the following error message is displayed:


2022-06-01 09:33:44 INFO BlockManagerInfo:54 - Updated broadcast_0_piece0 in memory on test1:48384 (current size: 1181.0 B, original size: 1181.0 B, free: 413.9 MB)
2022-06-01 09:33:44 ERROR Client:91 - Failed to contact YARN for application application_1541034627389_0001.
java.io.InterruptedIOException: Call interrupted
    at org.apache.hadoop.ipc.Client.call(Client.java:1469)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy15.getApplicationReport(Unknown Source)
    at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getApplicationReport(ApplicationClientProtocolPBClientImpl.java:191)
    at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy16.getApplicationReport(Unknown Source)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getApplicationReport(YarnClientImpl.java:430)
    at org.apache.spark.deploy.yarn.Client.getApplicationReport(Client.scala:300)
    at org.apache.spark.deploy.yarn.Client.monitorApplication(Client.scala:1059)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:109)


Do you know how to solve the error?

I need your help, thanks in advance!


Featured Answers
olive.zhao
Admin Created Jun 28, 2022 06:27:59

Hello, friend!

Check whether the environment variable specifies the Hadoop configuration file directory.

When starting Spark, you need to set the Hadoop installation directory, the Hadoop configuration file directory, and the YARN configuration file directory. Check whether these three parameters are correctly set as environment variables.

HADOOP_HOME, HADOOP_CONF_DIR, YARN_CONF_DIR
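For example, the three variables can be exported before running spark-submit (a minimal sketch; /opt/hadoop is an assumed install path, adjust it to your cluster):

```shell
# Hadoop installation directory (assumed path for illustration)
export HADOOP_HOME=/opt/hadoop
# Hadoop configuration file directory (core-site.xml, hdfs-site.xml)
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
# YARN configuration file directory (yarn-site.xml); often the same as HADOOP_CONF_DIR
export YARN_CONF_DIR=$HADOOP_CONF_DIR
```

These can also be placed in conf/spark-env.sh so every Spark launch picks them up.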

Hope this helps!


