bin/hdfs dfs -ls -R / | sort -r -n -k 5
I used the following command. It sorts by file size, but it also lists all the subdirectories: hadoop fs -ls -S -h
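One common workaround for the "subdirectories included" problem is to filter on the permissions column of the listing: directories start with `d`, regular files with `-`. Below is a minimal sketch; the here-string simulates `hadoop fs -ls -R` output so the pipeline can be run without a cluster, and the paths are made up for illustration.

```shell
# Simulated `hadoop fs -ls -R /data` output; on a real cluster, replace
# the printf with the actual command.
listing='drwxr-xr-x   - hdfs hdfs          0 2023-01-01 10:00 /data/logs
-rw-r--r--   3 hdfs hdfs    5242880 2023-01-02 11:00 /data/logs/a.log
-rw-r--r--   3 hdfs hdfs     102400 2023-01-03 12:00 /data/logs/b.log'

# Keep only regular files (permission string starts with "-"),
# then sort numerically on column 5 (the size), largest first.
printf '%s\n' "$listing" | grep '^-' | sort -k5,5 -n -r
```

The directory line is dropped by the `grep`, and the two files come out largest first.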
Is there a script I can refer to if I need to recursively list files in an HDFS folder, ordered by file size?
I tried a workaround with hdfs dfs -ls /tmp | sort -k6,7 (sorting on the date and time columns).
An HDFS file or directory such as /parent/child can be specified as hdfs://namenodehost/parent/child or, if your configuration points at hdfs://namenodehost, simply as /parent/child.
The chapter also shows how to manage HDFS file permissions and create HDFS users.
This command will list only the files and directories in your current working directory.
To check for the file, use the ls command to list the files and directories.
You can call du without -h to get numerically sortable output, and then produce human-friendly output with a little post-processing.
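That idea can be sketched as follows, assuming GNU coreutils `numfmt` is available for the humanizing step. The here-string stands in for `hdfs dfs -du /data` output (byte count, then path); the paths are invented for the example.

```shell
# Simulated `hdfs dfs -du /data` output: raw byte counts sort reliably,
# unlike the mixed K/M/G units that -h would print.
du_out='5242880 /data/a
1048576 /data/b
734003200 /data/c'

# Sort numerically on the byte count (largest first), then convert
# the first field to a human-readable IEC size with numfmt.
printf '%s\n' "$du_out" | sort -k1,1 -n -r | numfmt --to=iec --field=1
```

The sort happens on exact byte counts, and only the final display pass rewrites them as sizes like 700M or 5.0M.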
You could also append | head -10 to show only the top 10 (or any number of) sub-folders in the specified directory.
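Putting the pieces together, a top-N pipeline looks like this. The loop fabricates fifteen "size path" lines so the example is self-contained; on a real cluster the input would come from something like `hdfs dfs -du` instead.

```shell
# Generate 15 fake "size path" lines in place of real `du` output.
sizes=$(for i in $(seq 1 15); do echo "$((i * 1000)) /data/dir$i"; done)

# Sort descending by the size column, then keep only the ten largest.
printf '%s\n' "$sizes" | sort -k1,1 -n -r | head -10
```

Only ten lines survive the `head`, starting with the largest entry.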