...
- my shell was not bash; I switched to it by typing bash -l (check with echo $SHELL)
- module load tig hadoop
- generic hadoop command: hadoop command [genericOptions] [commandOptions] (illustrated right after this list)
- Create my hadoop FS home directory: hadoop fs -mkdir /user/balewski
- List its content (should be empty now, but no error): hadoop fs -ls
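To make the generic layout concrete: genericOptions such as -D property=value go right after the command name, before the command's own options. A minimal sketch (the file notes.txt is only an illustration, not part of this tutorial):
# genericOption -D first, then the commandOptions of -put
hadoop fs -D dfs.replication=2 -put notes.txt /user/balewski/
# commandOptions only: confirm the file landed in HDFS
hadoop fs -ls /user/balewski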
Exercise 1: create, load, and read back a text file to HDFS
...
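The exercise steps are not written out above; a minimal sketch of what they could look like (the file name and contents are placeholders):
# 1. create a small text file locally
echo "hello hadoop" > mytest.txt
# 2. load it into my HDFS directory
hadoop fs -put mytest.txt /user/balewski/mytest.txt
# 3. read it back from HDFS
hadoop fs -ls /user/balewski
hadoop fs -cat /user/balewski/mytest.txt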
To re-run a job you must first clean up the old output files: hadoop dfs -rmr wordcount-op
Next, run Hadoop with 4 reducers: hadoop jar /usr/common/tig/hadoop/hadoop-0.20.2+228/hadoop-0.20.2+228-examples.jar wordcount -Dmapred.reduce.tasks=4 wordcount-in wordcount-op
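Put together, a re-run cycle looks roughly like this; the part-* output file names depend on the Hadoop version and reducer count, so treat them as an assumption:
# remove the old output directory, otherwise the new job aborts
hadoop dfs -rmr wordcount-op
# run wordcount with 4 reducers
hadoop jar /usr/common/tig/hadoop/hadoop-0.20.2+228/hadoop-0.20.2+228-examples.jar \
    wordcount -Dmapred.reduce.tasks=4 wordcount-in wordcount-op
# inspect the result: expect one part-* file per reducer (4 here)
hadoop fs -ls wordcount-op
hadoop fs -cat wordcount-op/part-* | head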
Suggestion: change user permissions so I can read the Hadoop output on the scratch disk, since by default Hadoop owns all of it.
Or use the provided script: fixperms.sh /global/scratch/sd/balewski/hadoop/wordcount-gpfs/