"Permission denied" errors when starting a single-node Hadoop cluster

10

I'm running Ubuntu 10.10 and trying to start a single-node Hadoop cluster.

hadoop@abraham-Dimension-3000:/usr/local/hadoop$ bin/start-all.sh
mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out: No such file or directory
head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
localhost: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-abraham-Dimension-3000.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-abraham-Dimension-3000.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-datanode-abraham-Dimension-3000.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
localhost: starting secondarynamenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-abraham-Dimension-3000.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-abraham-Dimension-3000.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-abraham-Dimension-3000.out' for reading: No such file or directory
mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
starting jobtracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-jobtracker-abraham-Dimension-3000.out
/usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-jobtracker-abraham-Dimension-3000.out: No such file or directory
head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-jobtracker-abraham-Dimension-3000.out' for reading: No such file or directory
localhost: mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied
localhost: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-tasktracker-abraham-Dimension-3000.out
localhost: /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-tasktracker-abraham-Dimension-3000.out: No such file or directory
localhost: head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-tasktracker-abraham-Dimension-3000.out' for reading: No such file or directory
hadoop@abraham-Dimension-3000:/usr/local/hadoop$ 

I have no idea what I'm doing wrong or what some of these errors mean.

hadoop@abraham-Dimension-3000:/usr/local/hadoop$ jps
5099 Jps

Can anyone diagnose the problem?

ATMathew
source

Answers:

20

The errors suggest a permissions problem.

Make sure the hadoop user has the proper privileges on /usr/local/hadoop. Try:

sudo chown -R hadoop /usr/local/hadoop/
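
To verify the fix, a short sketch (assuming the default layout from the error output above, where Hadoop is installed in /usr/local/hadoop and writes its logs to /usr/local/hadoop/logs):

ls -ld /usr/local/hadoop          # the owner should now be the hadoop user
mkdir -p /usr/local/hadoop/logs   # should succeed with no "Permission denied"
bin/start-all.sh                  # start the daemons again
jps                               # the Hadoop daemons should now be listed alongside Jps
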
Mark Russell
source
I spent a few hours trying to figure out which settings I needed to change to get my Hadoop working, and this finally turned out to be the fix. Thanks! :)
jjankowiak
1

You have specified a working directory for the Hadoop filesystem that is owned by a user other than the hadoop user, so either grant the hadoop user read and write permissions on that directory or change the directory's owner.
Try:

sudo chown -R hadoop-user /user/hadoop_project/

or

sudo chmod 777 /user/hadoop_project

where 'hadoop-user' is the username for your Hadoop environment and 'hadoop_project' is the working directory specified for the Hadoop filesystem. (Substitute your own login and the directory you are using for your Hadoop installation.)
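
Note that chmod 777 makes the directory writable by everyone; if that is broader than you need, a narrower sketch (using the same hypothetical 'hadoop-user' name and '/user/hadoop_project' path as above) is to check the current owner first and grant access only to that user:

ls -ld /user/hadoop_project                      # show the current owner and permissions
sudo chown -R hadoop-user /user/hadoop_project/
sudo chmod -R u+rwX /user/hadoop_project         # read/write for the owner; execute bit only on directories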

sudhakara.st
source