Hail notebook error: Consider boosting spark.yarn.executor.memoryOverhead
I am trying to write a Hail MatrixTable to my workspace bucket, but the job is not completing. I am getting this error:
Hail version: 0.2.11-daed180b84d8
Error summary: SparkException: Job aborted due to stage failure: Task 87 in stage 13.0 failed 4 times, most recent failure: Lost task 87.3 in stage 13.0 (TID 280, saturn-aac178b8-a115-47fb-9d73-96e89305ec9f-w-2.c.anvil-stage-demo.internal, executor 25): ExecutorLostFailure (executor 25 exited caused by one of the running tasks) Reason: Container killed by YARN for exceeding memory limits. 5.0 GB of 5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
I don't believe I have access to edit the container, correct? I found this thread discussing the issue, but I am unsure how to apply the fix in Terra: https://stackoverflow.com/questions/38101857/boosting-spark-yarn-executor-memoryoverhead
What container are you using?
You probably won't be able to edit the default Docker containers, but you may be able to create a custom container based on the defaults and add the required changes.
Hi Beri, can you elaborate on creating a custom Docker container with a custom Spark configuration? I am hitting the same error as Elizabeth, triggered by a different function, but the upshot is that I need more Spark executor cores (the default is 2 and I need to set it to 4).
What is the best way to do this: should I create my own Docker container as you suggest, or is there a way to modify the Spark context after running hl.init()?
Hi Dan Spagnolo,
For compute-resource error messages, try increasing your notebook runtime's resources. This article has the details: Understanding-and-Customizing-your-Notebook-Runtime.
If you think the Docker container being used doesn't have the correct versions of the tools you need, try creating your own (being familiar with Docker containers helps). Instructions are described here: https://github.com/databiosphere/terra-docker#terra-base-images
Another helpful article: Docker-tutorial-Custom-runtime-environments-for-Jupyter-Notebooks
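As a supplement to the answers above: rather than building a custom container, newer Hail releases let you pass extra Spark properties directly to hl.init() via its spark_conf argument (check the hl.init docs for your installed version, since older releases like 0.2.11 may not support it). A minimal sketch, assuming that parameter is available; the property names are standard Spark settings, but the 2g/4 values here are illustrative, not recommendations:

```python
import json

# Hypothetical overrides for the errors discussed in this thread.
# Values depend on your cluster's machine types; these are examples only.
spark_conf = {
    "spark.executor.memoryOverhead": "2g",  # raise YARN off-heap overhead ("Container killed by YARN")
    "spark.executor.cores": "4",            # more cores per executor (default was 2)
}

# If your Hail version's hl.init accepts spark_conf, pass the dict at startup,
# before any other Hail call creates the Spark context:
#
#   import hail as hl
#   hl.init(spark_conf=spark_conf)

print(json.dumps(spark_conf, indent=2))
```

Note that Spark configuration is fixed once the context exists, so these overrides must be supplied before Hail (or anything else) starts Spark; restarting the notebook kernel first is the safest route.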