
Spark service fails to start

Created: May 21, 2021 08:52:02 | Latest reply: Feb 21, 2022 19:42:56
HiCoins as reward: 0 (problem unresolved)

Hello, team!

While installing the FusionInsight cluster, an error was generated.

Error information:

Roleinstance validity check failure {ScriptExecutionResult=ScriptExecutionResult{exitCode=1, output=, errmsg: Running example [SparkPi] failed! Spark is not available!}}

Please help, thanks!


Featured Answers
olive.zhao
Admin Created May 21, 2021 08:59:51

Posted by Helen.F at 2021-05-21 08:56 FusionInsight HD 6.5.1

Helen,

The possible cause is that other similar software was installed on the cluster nodes and its residual files were not completely deleted.

Troubleshooting Methods

Check whether the libhadoop.so file of another version exists in /usr/lib64/ of each node. If the file exists, delete it and restart the Spark service.

Procedure

1. Use PuTTY to log in to each node as user root.

2. Run the following commands to go to /usr/lib64/ and check whether the libhadoop.so file exists:

cd /usr/lib64

find libhadoop*

If the following information is displayed, the file exists:

libhadoop.so
libhadoop.so.1.0.0

3. Run the following command to delete the file:

rm -f libhadoop.so.1.0.0

rm -f libhadoop.so

4. Restart the Spark service. The fault is rectified.
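The per-node check and cleanup in steps 2 and 3 can be sketched as a small shell function. This is a minimal sketch, not an official FusionInsight tool: the directory is passed as a parameter so you can try it on a scratch directory before pointing it at /usr/lib64 on a real node.

```shell
#!/bin/sh
# check_libhadoop DIR - report and remove stale libhadoop.so copies in DIR.
check_libhadoop() {
    dir="$1"
    found=0
    # Cover both the bare symlink/name and versioned copies such as
    # libhadoop.so.1.0.0. An unmatched glob stays literal and fails -e.
    for f in "$dir"/libhadoop.so "$dir"/libhadoop.so.*; do
        [ -e "$f" ] || continue
        echo "removing stale file: $f"
        rm -f "$f"
        found=1
    done
    if [ "$found" -eq 1 ]; then
        echo "cleanup done in $dir; restart the Spark service"
    else
        echo "no stale libhadoop files in $dir"
    fi
}

# Demo on a scratch directory (on a real node you would pass /usr/lib64):
demo=$(mktemp -d)
touch "$demo/libhadoop.so" "$demo/libhadoop.so.1.0.0"
check_libhadoop "$demo"
rm -rf "$demo"
```

Run it as root on each node in turn; it only reports "cleanup done" when something was actually removed, so nodes that print "no stale libhadoop files" need no Spark restart on their account.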

Hope this helps!



All Answers
olive.zhao Admin Created May 21, 2021 08:54:19

Hello, friend!

Which version of FusionInsight are you using?

Waiting for your feedback!



Helen.F Created May 21, 2021 08:56:16

FusionInsight HD 6.5.1


olive.zhao
olive.zhao Admin Created May 29, 2021 08:37:31

Hello, friend!

Is the problem resolved?

Saqibaz
Saqibaz Created Feb 21, 2022 19:42:56

Thanks for sharing
