Livy installation problem on Cloudera CDH

#1

Hello, guys.
I am using cloudera quickstart vm.
The hadoop version is 2.6.0-cdh5.13.0 and the spark version is 2.4.3.
I referenced https://docs.knime.com/2018-12/bigdata_spark_installation_guide/index.html#livy to install livy.

  1. First, I downloaded the ‘CSD for CDH 5’ and copied it to /opt/cloudera/csd, then downloaded the RHEL 6 parcel and its .sha file to /opt/cloudera/parcel-repo.
  2. I ran service cloudera-scm-server restart and then opened Cloudera Manager.
  3. An error occurred when I added and started the Livy service in the Cloudera Manager web UI.
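For reference, steps 1 and 2 correspond roughly to the following commands (a sketch; the jar and parcel file names below are placeholders, not the exact names of the downloaded files):

```shell
# Copy the Livy CSD jar into Cloudera Manager's CSD directory
# (placeholder jar name; use the CSD file you downloaded)
cp LIVY-CSD-FOR-CDH5.jar /opt/cloudera/csd/

# Copy the RHEL 6 parcel and its checksum into the local parcel repository
# (placeholder file names)
cp LIVY-el6.parcel /opt/cloudera/parcel-repo/
cp LIVY-el6.parcel.sha /opt/cloudera/parcel-repo/

# Restart the Cloudera Manager server so it picks up the new CSD
service cloudera-scm-server restart
```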

$> csd/csd.sh ["start"] stderr (7.1 KB)

Please check which part is the problem.


#2

Hi,

you can use the Create Local Big Data Environment node if you want to try the Spark nodes without any special setup.

The latest VirtualBox version of the Cloudera Quickstart VM contains CDH 5.13 with Spark 1.6 and uses RPMs. RPMs can’t be mixed with parcels (Spark 2 and Livy on CDH 5.x). How did you get Spark 2.4.3 running in the VM, and how did you install Livy? I tried to install the Livy parcel using the Cloudera Manager web UI, and it complains that the Spark 2 parcel (2.2.0.cloudera1 or higher) is required.

best regards
Sascha


#4

Thank you for the guidance.
So I installed CDH 6.3.0; the Spark version is 2.4.0-cdh6.3.0.
Then I installed and activated Livy, and I created a livy account on HDFS.
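Creating the livy account’s home directory in HDFS can be sketched like this (assuming a user with HDFS superuser rights, here the hdfs user):

```shell
# Create the livy user's HDFS home directory and make livy its owner
sudo -u hdfs hdfs dfs -mkdir -p /user/livy
sudo -u hdfs hdfs dfs -chown livy:livy /user/livy
```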

However, when I run the “Create Spark Context (Livy)” node, I get an “Execute failed: Failed to create Livy session. Please consult the Livy server log. (Exception)” error.

The Livy log follows:

19/09/04 08:50:32 INFO client.RMProxy: Connecting to ResourceManager at cmaster/192.168.200.99:8032
19/09/04 08:50:32 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
19/09/04 08:50:32 INFO conf.Configuration: resource-types.xml not found
19/09/04 08:50:32 INFO resource.ResourceUtils: Unable to find ‘resource-types.xml’.
19/09/04 08:50:32 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2048 MB per container)
19/09/04 08:50:32 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
19/09/04 08:50:32 INFO yarn.Client: Setting up container launch context for our AM
19/09/04 08:50:32 INFO yarn.Client: Setting up the launch environment for our AM container
19/09/04 08:50:32 INFO yarn.Client: Preparing resources for our AM container
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/rsc-jars/livy-api-0.5.0.knime3.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/livy-api-0.5.0.knime3.jar
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/rsc-jars/livy-rsc-0.5.0.knime3.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/livy-rsc-0.5.0.knime3.jar
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/rsc-jars/netty-all-4.0.37.Final.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/netty-all-4.0.37.Final.jar
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/repl_2.11-jars/livy-repl_2.11-0.5.0.knime3.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/livy-repl_2.11-0.5.0.knime3.jar
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/repl_2.11-jars/commons-codec-1.9.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/commons-codec-1.9.jar
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/repl_2.11-jars/livy-core_2.11-0.5.0.knime3.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/livy-core_2.11-0.5.0.knime3.jar
19/09/04 08:50:32 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/jars/datanucleus-core-4.1.6.jar -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/datanucleus-core-4.1.6.jar
19/09/04 08:50:33 INFO yarn.Client: Uploading resource file:/run/cloudera-scm-agent/process/102-livy-LIVY_SERVER/hive-conf/hive-site.xml -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/hive-site.xml
19/09/04 08:50:33 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/python/lib/pyspark.zip -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/pyspark.zip
19/09/04 08:50:33 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/CDH-6.3.0-1.cdh6.3.0.p0.1279813/lib/spark/python/lib/py4j-0.10.7-src.zip -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/py4j-0.10.7-src.zip
19/09/04 08:50:33 INFO yarn.Client: Uploading resource file:/tmp/spark-b1bd1ca4-4bbe-42c3-8ff0-7eb19424fc83/__spark_conf__1584444046197881787.zip -> hdfs://cmaster:8020/user/livy/.sparkStaging/application_1567502852640_0015/spark_conf.zip
19/09/04 08:50:33 INFO spark.SecurityManager: Changing view acls to: livy
19/09/04 08:50:33 INFO spark.SecurityManager: Changing modify acls to: livy
19/09/04 08:50:33 INFO spark.SecurityManager: Changing view acls groups to:
19/09/04 08:50:33 INFO spark.SecurityManager: Changing modify acls groups to:
19/09/04 08:50:33 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy); groups with view permissions: Set(); users with modify permissions: Set(livy); groups with modify permissions: Set()
19/09/04 08:50:33 INFO conf.HiveConf: Found configuration file file:/run/cloudera-scm-agent/process/102-livy-LIVY_SERVER/hive-conf/hive-site.xml
19/09/04 08:50:33 INFO yarn.Client: Submitting application application_1567502852640_0015 to ResourceManager
19/09/04 08:50:33 INFO impl.YarnClientImpl: Submitted application application_1567502852640_0015
19/09/04 08:50:33 INFO yarn.Client: Application report for application_1567502852640_0015 (state: ACCEPTED)
19/09/04 08:50:33 INFO yarn.Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: root.users.livy
start time: 1567554633566
final status: UNDEFINED
tracking URL: http://cmaster:8088/proxy/application_1567502852640_0015/
user: livy
19/09/04 08:50:33 INFO util.ShutdownHookManager: Shutdown hook called
19/09/04 08:50:33 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b1bd1ca4-4bbe-42c3-8ff0-7eb19424fc83
19/09/04 08:50:33 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-bd5ef747-5c94-435d-9d68-1aa1f00f5bf1

Looking at this log, I can’t tell what is causing the failure.
Is there anything else I need to do to use Livy?


#5

Hi, you can find more information in the YARN container logs. Have a look at the YARN web UI, or in /yarn/container-logs if log aggregation is disabled in YARN. The first container holds the logs of the application master, e.g. application_1567502852640_0015_01_000001. Did you find the logs?
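If log aggregation is enabled, the same container logs can also be fetched on the command line (a sketch using the application ID from the log above):

```shell
# Print all aggregated container logs for the application, including
# the application master container (..._01_000001)
yarn logs -applicationId application_1567502852640_0015
```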


#6

No logs for the application ID were generated in the /yarn/container-logs directory or in /var/log/hadoop-yarn. The log-aggregation setting does not let me find the application’s logs either. Is it a Cloudera configuration problem?


#7

There is a Logs link at the bottom near the center of the page (in your screenshot). Click it, then on the next page click the stderr… link, and then Click here for full log at the top of the page. The files in /yarn/container-logs are removed after a short time.


#8

The screenshot shows the page that appears when I click the Logs link.
The URL of the Logs link at the bottom center is the URL of that screenshot.
Is something wrong?


#9

The Logs link looks strange; it does not contain any hostname. The application consumes 0 vCores and 0 memory. Does your test cluster have enough running nodes and resources to execute the Livy containers? (See Nodes on the left side.)
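As a quick check, the NodeManagers and their states can also be listed on the command line (a sketch):

```shell
# List all NodeManagers known to the ResourceManager, with their state
yarn node -list -all
```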


#10

Yes.
A screenshot of the Nodes page follows.
A different version of CDH is installed now, so the hostname is different, but the resources are the same.

Below is /var/log/livy/livy.log for one request:

2019-09-09 17:20:30,405 WARN org.apache.livy.server.interactive.InteractiveSession$: sparkr.zip not found; cannot start R interpreter.
2019-09-09 17:20:30,411 INFO org.apache.livy.server.interactive.InteractiveSession$: Creating Interactive session 2: [owner: null, request: [kind: shared, proxyUser: None, conf: spark.driver.memory -> 1g,spark.executor.instances -> 1,spark.driver.cores -> 1,livy.uri -> http://192.168.200.99:8998/,spark.executor.memory -> 1g,spark.dynamicAllocation.enabled -> false,spark.executor.cores -> 1, heartbeatTimeoutInSecond: 0]]
2019-09-09 17:20:30,430 INFO org.apache.livy.rsc.rpc.RpcServer: Connected to the port 10001
2019-09-09 17:20:30,431 WARN org.apache.livy.rsc.RSCConf: Your hostname, cdh600, resolves to a loopback address, but we couldn’t find any external IP address!
2019-09-09 17:20:30,431 WARN org.apache.livy.rsc.RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
2019-09-09 17:20:30,453 INFO org.apache.livy.sessions.InteractiveSessionManager: Registering new session 2
2019-09-09 17:20:34,045 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO client.RMProxy: Connecting to ResourceManager at cdh600/192.168.200.99:8032
2019-09-09 17:20:34,234 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
2019-09-09 17:20:34,285 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO conf.Configuration: resource-types.xml not found
2019-09-09 17:20:34,286 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO resource.ResourceUtils: Unable to find ‘resource-types.xml’.
2019-09-09 17:20:34,293 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (3017 MB per container)
2019-09-09 17:20:34,294 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
2019-09-09 17:20:34,295 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Setting up container launch context for our AM
2019-09-09 17:20:34,298 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Setting up the launch environment for our AM container
2019-09-09 17:20:34,322 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Preparing resources for our AM container
2019-09-09 17:20:34,373 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/rsc-jars/livy-api-0.5.0.knime3.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/livy-api-0.5.0.knime3.jar
2019-09-09 17:20:34,588 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/rsc-jars/livy-rsc-0.5.0.knime3.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/livy-rsc-0.5.0.knime3.jar
2019-09-09 17:20:34,631 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/rsc-jars/netty-all-4.0.37.Final.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/netty-all-4.0.37.Final.jar
2019-09-09 17:20:34,677 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:34 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/repl_2.11-jars/livy-repl_2.11-0.5.0.knime3.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/livy-repl_2.11-0.5.0.knime3.jar
2019-09-09 17:20:35,148 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/repl_2.11-jars/commons-codec-1.9.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/commons-codec-1.9.jar
2019-09-09 17:20:35,194 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/LIVY-0.5.0.knime3-cdh6/repl_2.11-jars/livy-core_2.11-0.5.0.knime3.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/livy-core_2.11-0.5.0.knime3.jar
2019-09-09 17:20:35,233 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Uploading resource file:/opt/cloudera/parcels/CDH-6.0.1-1.cdh6.0.1.p0.590678/lib/spark/jars/datanucleus-core-4.1.6.jar -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/datanucleus-core-4.1.6.jar
2019-09-09 17:20:35,265 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Uploading resource file:/run/cloudera-scm-agent/process/89-livy-LIVY_SERVER/hive-conf/hive-site.xml -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/hive-site.xml
2019-09-09 17:20:35,356 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Uploading resource file:/tmp/spark-7224ae72-bbd5-444b-8841-645d13c051ff/__spark_conf__5594470883316094049.zip -> hdfs://cdh600:8020/user/livy/.sparkStaging/application_1568015477136_0003/spark_conf.zip
2019-09-09 17:20:35,422 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO spark.SecurityManager: Changing view acls to: livy
2019-09-09 17:20:35,424 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO spark.SecurityManager: Changing modify acls to: livy
2019-09-09 17:20:35,425 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO spark.SecurityManager: Changing view acls groups to:
2019-09-09 17:20:35,426 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO spark.SecurityManager: Changing modify acls groups to:
2019-09-09 17:20:35,427 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy); groups with view permissions: Set(); users with modify permissions: Set(livy); groups with modify permissions: Set()
2019-09-09 17:20:35,448 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Submitting application application_1568015477136_0003 to ResourceManager
2019-09-09 17:20:35,493 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO impl.YarnClientImpl: Submitted application application_1568015477136_0003
2019-09-09 17:20:35,498 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client: Application report for application_1568015477136_0003 (state: ACCEPTED)
2019-09-09 17:20:35,504 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO yarn.Client:
2019-09-09 17:20:35,504 INFO org.apache.livy.utils.LineBufferedStream: stdout: client token: N/A
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: diagnostics: N/A
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: ApplicationMaster host: N/A
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: ApplicationMaster RPC port: -1
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: queue: root.users.livy
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: start time: 1568017235465
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: final status: UNDEFINED
2019-09-09 17:20:35,505 INFO org.apache.livy.utils.LineBufferedStream: stdout: tracking URL: http://cdh600:8088/proxy/application_1568015477136_0003/
2019-09-09 17:20:35,506 INFO org.apache.livy.utils.LineBufferedStream: stdout: user: livy
2019-09-09 17:20:35,521 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO util.ShutdownHookManager: Shutdown hook called
2019-09-09 17:20:35,523 INFO org.apache.livy.utils.LineBufferedStream: stdout: 19/09/09 17:20:35 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7224ae72-bbd5-444b-8841-645d13c051ff
2019-09-09 17:22:00,537 ERROR org.apache.livy.rsc.RSCClient: Failed to connect to context.
java.util.concurrent.TimeoutException: Timed out waiting for context to start.
at org.apache.livy.rsc.ContextLauncher.connectTimeout(ContextLauncher.java:134)
at org.apache.livy.rsc.ContextLauncher.access$300(ContextLauncher.java:63)
at org.apache.livy.rsc.ContextLauncher$2.run(ContextLauncher.java:122)
at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:358)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:394)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112)
at java.lang.Thread.run(Thread.java:748)
2019-09-09 17:22:00,542 INFO org.apache.livy.rsc.RSCClient: Failing pending job 1ac0d720-4930-4274-b57b-300065003b52 due to shutdown.
2019-09-09 17:22:00,549 INFO org.apache.livy.server.interactive.InteractiveSession: Failed to ping RSC driver for session 2. Killing application.
2019-09-09 17:22:00,549 INFO org.apache.livy.server.interactive.InteractiveSession: Stopping InteractiveSession 2…
2019-09-09 17:22:00,759 INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl: Killed application application_1568015477136_0003
2019-09-09 17:22:00,771 INFO org.apache.livy.server.interactive.InteractiveSession: Stopped InteractiveSession 2.
2019-09-09 17:22:00,771 WARN org.apache.livy.server.interactive.InteractiveSession: Fail to get rsc uri
java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for context to start.
at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
at org.apache.livy.server.interactive.InteractiveSession$$anonfun$22.apply(InteractiveSession.scala:406)
at org.apache.livy.server.interactive.InteractiveSession$$anonfun$22.apply(InteractiveSession.scala:406)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for context to start.
at org.apache.livy.rsc.ContextLauncher.connectTimeout(ContextLauncher.java:134)
at org.apache.livy.rsc.ContextLauncher.access$300(ContextLauncher.java:63)
at org.apache.livy.rsc.ContextLauncher$2.run(ContextLauncher.java:122)
at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:358)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:394)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112)
at java.lang.Thread.run(Thread.java:748)


#11

The application master seems to never start. Does your test VM have enough memory?

Can you have a look at the tracking URL (http://cdh600:8088/proxy/application_… or http://cmaster:8088/proxy/application_… in the logs above) and then click on Logs?

The Cloudera test VM is a very restricted test setup. You might consider installing a recent version of CDH (6.3) on a fresh CentOS virtual machine or a real server if the local big data environment is not an option for you.


#12

Thank you for your reply.
I’m running the VM on a workstation with 64 GB of memory and have allocated 50 GB to the VM.
I have now installed CDH 6.3.0 on the VM.

When Livy is added as a Cloudera service (i.e. when the Livy server runs as a service), the Livy connector cannot create a Livy session normally.
But with the Livy parcel activated in Cloudera Manager and without adding the Livy service, I ran ‘/opt/cloudera/parcels/LIVY/bin/livy_server start’ directly with root privileges and gave root access to /user/livy in HDFS. This confirms that the Livy server itself works properly.
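For reference, the manual workaround described above amounts roughly to this (a sketch based on the paths in this thread; giving ownership to the livy user rather than root would be the cleaner fix):

```shell
# Start the Livy server directly from the activated parcel, as root
/opt/cloudera/parcels/LIVY/bin/livy_server start

# Allow the root user to write to /user/livy in HDFS
sudo -u hdfs hdfs dfs -chown root /user/livy
```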

I don’t know why I can’t create a Livy session when Livy runs as a service, so for now I will run it directly in order to use the Spark context. If you find out why it fails when run as a service, please answer.


#13

The YARN screenshot above shows only the host cdh600 with 2.95 GB of memory available. Maybe this was the problem?
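The sizes in the Livy log above can be cross-checked with a quick calculation: Spark requests each YARN container with the configured memory plus an overhead of max(384 MB, 10% of the memory), which reproduces the reported 1408 MB AM container. A sketch of that rule (not Spark’s actual code):

```shell
# Spark-on-YARN sizes a container as the requested memory plus
# max(384 MB, 10% of the requested memory) overhead.
container_mb() {
  mem=$1
  overhead=$(( mem / 10 ))
  if [ "$overhead" -lt 384 ]; then overhead=384; fi
  echo $(( mem + overhead ))
}

am_mb=$(container_mb 1024)    # driver/AM memory: 1g, as in the Livy request
exec_mb=$(container_mb 1024)  # one executor with 1g
echo "AM container:  ${am_mb} MB"
echo "Session total: $(( am_mb + exec_mb )) MB"
```

With both containers the session needs 2816 MB, uncomfortably close to the 2.95 GB the single node offers once scheduler rounding is taken into account.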

We run Livy as a service without any problems. Does the livy user have write permission to /user/livy in HDFS?
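The permission question can be checked directly (a sketch):

```shell
# Show owner and permissions of the livy home directory itself
hdfs dfs -ls -d /user/livy
```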

Debugging those problems without any logs is very difficult. Did you find some logs?


#14

Oh, this was exactly the point.
I set yarn.nodemanager.resource.memory-mb and yarn.scheduler.maximum-allocation-mb to 4 GB,
and the Livy session is now created normally.
Thank you for your support.


#15

Good to hear that Livy now works as expected. Can you mark the post as the solution? (Click the green check box below the post.)
