I am new to Minikube and Docker. I have a Minikube setup with three Apache Spark pods:
one Spark master and two Spark workers. My Dockerfile for the Spark master is as follows:
# base image
FROM openjdk:11
# define spark and hadoop versions
ENV SPARK_VERSION=3.2.0
ENV HADOOP_VERSION=3.3.1
# download and install hadoop
RUN mkdir -p /opt && \
    cd /opt && \
    curl http://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz | \
        tar -zx hadoop-${HADOOP_VERSION}/lib/native && \
    ln -s hadoop-${HADOOP_VERSION} hadoop && \
    echo Hadoop ${HADOOP_VERSION} native libraries installed in /opt/hadoop/lib/native
# download and install spark
RUN mkdir -p /opt && \
    cd /opt && \
    curl http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz | \
        tar -zx && \
    ln -s spark-${SPARK_VERSION}-bin-hadoop2.7 spark && \
    echo Spark ${SPARK_VERSION} installed in /opt
# add scripts and update spark default config
ADD common.sh spark-master spark-worker /
ADD spark-defaults.conf /opt/spark/conf/spark-defaults.conf
ENV PATH $PATH:/opt/spark/bin
When I deploy the pods, I get the following error:
Events:
  Type     Reason  Age                From     Message
  ----     ------  ----               ----     -------
  Warning  Failed  25m (x5 over 26m)  kubelet  Error: failed to start container "spark-master": Error response from daemon: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/spark-master": permission denied: unknown
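This runc message is the generic execve failure for a file that lacks the execute bit. A minimal local reproduction of the same error class (plain bash, no Docker or Minikube assumed; the file name is illustrative):

```shell
# Create a script WITHOUT the execute bit, mirroring how ADD can place
# /spark-master into an image without making it executable.
tmp=$(mktemp -d)
printf '#!/bin/bash\necho hello from spark-master\n' > "$tmp/spark-master"

# Executing it directly fails with the same "Permission denied" as runc:
"$tmp/spark-master" 2>&1 | grep -o 'Permission denied'

# Setting the execute bit makes the exec succeed:
chmod +x "$tmp/spark-master"
"$tmp/spark-master"   # prints: hello from spark-master

rm -rf "$tmp"
```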
The contents of the script spark-master,
#!/bin/bash
. /common.sh
echo "$(hostname -i) spark-master" >> /etc/hosts
/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --ip spark-master --port 7077 --webui-port 8080
Please help me solve this issue.
My Docker version is:
Docker version 20.10.18, build b40c2f6
2 Answers
In the Dockerfile, I commented out the following line and replaced it with the lines below, which resolved the permission error:
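The answer's exact lines are not preserved above. A likely fix, assuming the line being replaced is the ADD that copies the startup scripts into the image root, is to copy the scripts and set the execute bit explicitly:

```dockerfile
# ADD common.sh spark-master spark-worker /
COPY common.sh spark-master spark-worker /
RUN chmod +x /common.sh /spark-master /spark-worker
```

With BuildKit enabled (available in Docker 20.10), `COPY --chmod=0755 common.sh spark-master spark-worker /` achieves the same in a single instruction.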
The contents of the script spark-master are unchanged from the question:
#!/bin/bash
. /common.sh
echo "$(hostname -i) spark-master" >> /etc/hosts
/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --ip spark-master --port 7077 --webui-port 8080