Proper way to pass a Spark application a parameter/arg with spaces in spark-submit


I have a shell script with something like the following:

FOO="bar gum"
spark-submit --verbose --class SomeApp some.jar "$FOO"

However, this would result in:

Main class:
SomeApp
Arguments:
bar
gum

Whereas what I expected was a single argument, 'bar gum'.

Update:

So much for dumbing down this question.

What I really had was:

FOO="bar gum"
ssh host spark-submit --verbose --class SomeApp some.jar "$FOO"

This should've been:

FOO="bar gum"
ssh host spark-submit --verbose --class SomeApp some.jar \"$FOO\"
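Escaping the quotes by hand works here, but it gets fragile if the value can itself contain quotes or other shell metacharacters. A more robust sketch (bash-specific; host, SomeApp and some.jar are the same placeholders as above) lets printf '%q' generate the remote-safe quoting:

FOO="bar gum"
# printf '%q' escapes the value (here it yields bar\ gum), so the
# command string the remote shell re-parses contains exactly one word
ssh host spark-submit --verbose --class SomeApp some.jar "$(printf '%q' "$FOO")"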

Best answer:

When you send a command through ssh, any unescaped double quotes are interpreted by your local shell before the command string is sent to the remote host.
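In other words, ssh joins all of its remote-command arguments with single spaces into one string and hands that string to the remote login shell, which parses it again. A minimal illustration (assuming host is any reachable machine):

# Both print "a b": the remote shell receives the identical string
# "echo a b" either way, because the local shell strips the quotes
# before ssh joins its arguments with single spaces.
ssh host echo "a b"
ssh host echo a b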

Say you have a simple script, print-first-arg.sh, on the remote:

#!/bin/bash
# Print the first positional argument
echo "$1"

Then

$ ssh host print-first-arg.sh "hello world"

results in:

$ print-first-arg.sh hello world
hello

on the remote: the local shell removed the double quotes before ssh sent the command string, so the remote shell split it into two arguments and the script printed only the first.

Instead, use:

$ ssh host print-first-arg.sh \"hello world\"

and it results in:

$ print-first-arg.sh "hello world"
hello world

on the remote: the escaped quotes survive the local shell, so the remote shell sees a single argument.
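A diagnostic script can make the splitting visible at a glance. A small sketch (print-args.sh is a hypothetical name; place it on the remote like the script above):

#!/bin/bash
# print-args.sh: print each argument on its own line, bracketed,
# so you can see exactly how many arguments arrived
for arg in "$@"; do
  printf '[%s]\n' "$arg"
done

With this in place, ssh host print-args.sh "hello world" prints [hello] and [world] on separate lines, while ssh host print-args.sh \"hello world\" prints a single [hello world].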