hadoop - Running a custom shell script with Distributed Shell on Apache YARN


I have been going through the Apache Hadoop YARN book from Hortonworks, which explains two ways of running a YARN task.

My intent is to run a shell script (which compiles and runs various Java and Python scripts) against a set of folders, applying these scripts/patches to each one. An easy metaphor: "unzipping 100 folders and logging `ls` for each".
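To make the metaphor concrete, here is a minimal sketch of what such a per-batch worker script could look like. The function name `process_folders` and the paths are my own placeholders, not anything from the book; the real script would do the actual compile/run work instead of `ls`.

```shell
#!/usr/bin/env bash
# process_folders: hypothetical worker (name is an assumption).
# Takes a base path and a log file; for each sub-folder under the base
# path it appends a header and an `ls` listing to the log, standing in
# for the real per-folder work (unzip, compile, run scripts).
process_folders() {
  local base_dir="$1"
  local log_file="$2"
  for dir in "$base_dir"/*/; do
    [ -d "$dir" ] || continue          # skip if the glob matched nothing
    echo "== $dir ==" >> "$log_file"
    ls "$dir" >> "$log_file"
  done
}

# Example (assumed paths): process one batch of folders
# process_folders /data/batch01 /tmp/batch01.log
```

Each container would then be handed a different base path (one batch of 1-2 folders) as its argument.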

Now I want to parallelize this flow, so that each container handles 1-2 folders, and I request 50 such containers.

How do I do this using Distributed Shell? The examples I have seen only run `ls` / `whoami` / `uptime` / `hostname`, which is not what I want. I want to run a script that takes a path as an argument and iterates over it, and I want to run it in a distributed fashion on YARN. How?
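For reference, the Distributed Shell client that ships with Hadoop does accept a custom script via `-shell_script` and arguments via `-shell_args`, so an invocation could look roughly like the sketch below. The jar path and argument values are assumptions for illustration, and this is not a verified command line for any particular Hadoop version:

```
yarn org.apache.hadoop.yarn.applications.distributedshell.Client \
  -jar $HADOOP_HOME/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-*.jar \
  -shell_script process_folders.sh \
  -shell_args "/data/batch01" \
  -num_containers 50 \
  -container_memory 1024 \
  -container_vcores 1
```

One caveat: as far as I understand, Distributed Shell passes the same `-shell_args` to every container, so per-container partitioning (container N takes folders N*2 and N*2+1) would have to be done inside the script itself, e.g. by deriving an index from the container's environment, rather than by the client.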

