
I am new to Minikube and Docker. I have a Minikube setup with three Apache Spark pods:
one Spark master and two Spark workers. My Dockerfile for the Spark master is as follows:

# base image
FROM openjdk:11

# define spark and hadoop versions
ENV SPARK_VERSION=3.2.0
ENV HADOOP_VERSION=3.3.1

# download and install hadoop
RUN mkdir -p /opt && \
    cd /opt && \
    curl http://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz | \
        tar -zx hadoop-${HADOOP_VERSION}/lib/native && \
    ln -s hadoop-${HADOOP_VERSION} hadoop && \
    echo Hadoop ${HADOOP_VERSION} native libraries installed in /opt/hadoop/lib/native

# download and install spark
RUN mkdir -p /opt && \
    cd /opt && \
    curl http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz | \
        tar -zx && \
    ln -s spark-${SPARK_VERSION}-bin-hadoop2.7 spark && \
    echo Spark ${SPARK_VERSION} installed in /opt

# add scripts and update spark default config
ADD common.sh spark-master spark-worker /
ADD spark-defaults.conf /opt/spark/conf/spark-defaults.conf
ENV PATH $PATH:/opt/spark/bin

When I deploy the pods, I get the following error:

Events:
  Type     Reason     Age                   From               Message
  ----     ------     ----                  ----               -------
  Warning  Failed     25m (x5 over 26m)     kubelet            Error: failed to start container "spark-master": Error response from daemon: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/spark-master": permission denied: unknown
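
This "permission denied" from runc usually means the entrypoint file lacks the executable bit, not that the script itself is faulty. A minimal local reproduction (using a hypothetical /tmp path, unrelated to the image):

```shell
# Create a script WITHOUT the executable bit and try to run it directly.
printf '#!/bin/bash\necho started\n' > /tmp/spark-master-demo
/tmp/spark-master-demo 2>/dev/null || echo "permission denied"

# Add the executable bit and the same invocation now works.
chmod +x /tmp/spark-master-demo
/tmp/spark-master-demo
```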

The contents of the spark-master script are:

#!/bin/bash

. /common.sh

echo "$(hostname -i) spark-master" >> /etc/hosts

/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --ip spark-master --port 7077 --webui-port 8080

Please help me solve this issue.
My Docker version is:
Docker version 20.10.18, build b40c2f6

2 Answers


  1. Chosen as BEST ANSWER

    In the Dockerfile, I commented out the following line:

    #ADD common.sh spark-master spark-worker /
    

    and replaced it with the lines below, which resolved the permission error:

    COPY common.sh spark-master spark-worker  /
    RUN chmod +x /common.sh /spark-master /spark-worker
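
    As a side note: on Docker 20.10 with BuildKit enabled (`DOCKER_BUILDKIT=1`), the same fix can be folded into the COPY itself, avoiding the extra RUN layer. A sketch, assuming BuildKit is available:

    ```dockerfile
    # syntax=docker/dockerfile:1
    # BuildKit-only alternative: set the file mode during COPY instead of
    # running a separate chmod afterwards.
    COPY --chmod=0755 common.sh spark-master spark-worker /
    ```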
    

  2. The contents of the script spark-master are as follows:

    #!/bin/bash
    
    . /common.sh
    
    echo "$(hostname -i) spark-master" >> /etc/hosts
    
    /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --ip spark-master --p spark-master-ui-port >> /var/log/spark-master.log 2>&1
    