I have three hosts, each with Docker installed. I want a distributed file system (HDFS) shared across three containers, so I need to build a Hadoop cluster. I built the Hadoop image with this Dockerfile:
FROM ubuntu_mesos
ENV HADOOP_HOME /opt/hadoop
ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64
RUN apt-get update && apt-get install -y ssh rsync vim openjdk-8-jdk
# download and extract hadoop, set JAVA_HOME in hadoop-env.sh, update path
RUN wget https://archive.apache.org/dist/hadoop/core/hadoop-3.1.0/hadoop-3.1.0.tar.gz && \
    tar -xzf hadoop-3.1.0.tar.gz && \
    mv hadoop-3.1.0 $HADOOP_HOME && \
    echo "export JAVA_HOME=$JAVA_HOME" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh && \
    echo "PATH=$PATH:$HADOOP_HOME/bin" >> ~/.bashrc
# create ssh keys
RUN ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa && \
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys && \
    chmod 0600 ~/.ssh/authorized_keys
ADD core-site.xml $HADOOP_HOME/etc/hadoop/
ADD hdfs-site.xml $HADOOP_HOME/etc/hadoop/
ADD mapred-site.xml $HADOOP_HOME/etc/hadoop/
ADD ssh_config /root/.ssh/config
COPY start-hadoop.sh start-hadoop.sh
EXPOSE 22 9000 8088 50070 50075 50030 50060
RUN echo "export HDFS_NAMENODE_USER=root" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
RUN echo "export HDFS_DATANODE_USER=root" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
RUN echo "export HDFS_SECONDARYNAMENODE_USER=root" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
RUN echo "export YARN_RESOURCEMANAGER_USER=root" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
RUN echo "export YARN_NODEMANAGER_USER=root" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
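The ADD lines above assume that core-site.xml, hdfs-site.xml, mapred-site.xml, ssh_config, and start-hadoop.sh sit next to the Dockerfile; none of them are shown in the question. As a rough sketch only (the NameNode hostname hadoop-master is my assumption; port 9000 matches the EXPOSEd port), a minimal core-site.xml could look like:

```shell
# Sketch of a minimal core-site.xml for the "ADD core-site.xml" step above.
# "hadoop-master" is an assumed NameNode hostname, not taken from the question.
cat > core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop-master:9000</value>
  </property>
</configuration>
EOF
# sanity check: the default filesystem entry is present
grep 'fs.defaultFS' core-site.xml
```

All three containers would need to resolve the same NameNode hostname, which is one reason the containers are attached to a shared network below.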
Then, to connect containers across the three hosts, I created a weave network:

docker plugin install weaveworks/net-plugin:latest_release
docker network create --driver=weaveworks/net-plugin:latest_release --attachable mycontainernetwork
sudo weave launch <IP address of first host> <IP address of second host>

and ran the container on that network:

sudo docker run -it --net mycontainernetwork my-hadoop

The containers can ping each other:

ping -c 1 -q ContainerID

but ssh from one container to another fails:

ssh e121a0ef81ef
ssh: connect to host e121a0ef81ef port 22: Connection refused
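"Connection refused" means the TCP connection reached the container but nothing was listening on port 22; notably, nothing in the Dockerfile above ever starts sshd. A quick way to check whether anything is listening on a given host/port pair is a plain-Python probe (the helper name is mine, a sketch rather than a standard tool):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # covers ConnectionRefusedError, timeouts, unreachable hosts
        return False

# Demo: open a throwaway listener so the probe has something to hit.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
print(port_open("127.0.0.1", port))  # True: a listener is present
srv.close()
```

Running the same probe against the container's port 22 would return False here, which is consistent with sshd simply not being started.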
Here "e121a0ef81ef" is my container ID.
I am stuck and don't know how to solve this. Can you help me?
Any help would be appreciated.
Best Answer
Problem solved. I went through these steps:
First, I set up passwordless ssh between the three hosts. On each of the three hosts:

ssh-keygen
ssh-copy-id user@ip

Then I ran the container with the host's ssh keys mounted into it:

sudo docker run -it -v /home/user/.ssh:/root/.ssh --net mycontainernetwork --privileged my-hadoop

and verified the connection:

ssh -v user@ip

If ssh still refuses the connection, restart the ssh daemon inside the container:

/etc/init.d/ssh restart

or:

service ssh restart

and make sure the key files have the right permissions:

chmod 600 ~/.ssh/*
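One more piece that matters for non-interactive ssh between the containers is the ssh_config file that the Dockerfile copies to /root/.ssh/config: without it, the first connection to each host blocks on a host-key prompt. Its contents are never shown in the question, so the following is only a hypothetical version that disables that prompt:

```shell
# Hypothetical contents for "ADD ssh_config /root/.ssh/config";
# the real file is not shown in the question, so this is an assumption.
cat > ssh_config <<'EOF'
Host *
  StrictHostKeyChecking no
  UserKnownHostsFile /dev/null
EOF
# sanity check: the prompt-disabling option is present
grep 'StrictHostKeyChecking' ssh_config
```

Disabling strict host-key checking is convenient inside a trusted private container network, but it trades away protection against man-in-the-middle attacks, so it should not be copied to machines reachable from untrusted networks.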
A similar question about "docker - ssh: connect to host e121a0ef81ef (container ID) port 22: Connection refused in docker" can be found on Stack Overflow: https://stackoverflow.com/questions/55119564/