Monday, December 15, 2014

Annoyed by retyping - "hdfs..."

Hi to all,
Recently I’ve been annoyed by retyping the same commands over and over while viewing and modifying files on my HDFS.
Don’t you hate it when you have to type “hadoop fs …” or, in Hadoop 2, “hdfs dfs …”?

So I’ve created some useful alias commands (that you can use on Linux/Unix/OS X); just put them in the “.bashrc” file in your home directory (or the equivalent startup file for whatever shell you are running).
The commands are easy to remember because they are the same as the Linux commands, just with an “h” prefix.
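A minimal sketch of such aliases, assuming Hadoop 2’s “hdfs dfs” syntax (swap in “hadoop fs” on Hadoop 1); apart from “hls”, which appears in the S3 example below, the exact names here are illustrative:

# HDFS aliases -- named after the matching Linux commands, prefixed with "h"
alias hls='hdfs dfs -ls'
alias hcat='hdfs dfs -cat'
alias hmkdir='hdfs dfs -mkdir'
alias hrm='hdfs dfs -rm'
alias hcp='hdfs dfs -cp'
alias hmv='hdfs dfs -mv'
alias hput='hdfs dfs -put'
alias hget='hdfs dfs -get'
alias hdu='hdfs dfs -du -h'
alias htail='hdfs dfs -tail'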



Another perk of the whole deal is that you can use these aliases to work with Amazon’s S3 as well. The usual way to work with S3 is the “s3cmd” tool (we have it installed on the server that is connected to the HDFS), but it is just as easy to go through Hadoop’s native file system implementation.
The only thing you have to do is prefix the path with the protocol of the file system you are trying to access, for example:

“hls s3n://some-bucket-name/key/file.txt”
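Because the aliases just delegate to “hdfs dfs”, the same commands can mix S3 and HDFS paths; here is a small sketch, assuming the cluster’s core-site.xml already carries the s3n credentials (fs.s3n.awsAccessKeyId / fs.s3n.awsSecretAccessKey) and that the bucket and paths below are illustrative:

# List a prefix, peek at a file, and copy it into HDFS (the default file system)
hls s3n://some-bucket-name/key/
hcat s3n://some-bucket-name/key/file.txt
hcp s3n://some-bucket-name/key/file.txt /user/myuser/file.txt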


If you have some more useful commands that you use a lot while working, I would love to be enlightened.

Enjoy typing less :)

