bash versus the space character

Does anyone know how to get bash to split the output of a command enclosed in `` the same way it would split the command line if typed in? Everything I do to try to escape spaces ends up not working. E.g., this command (for sorting the songs from _The Dark Side Of Phobos_ in level order) produces a string that can be cut'n'pasted to bash, but if I change the echo command to mplayer, it doesn't work because bash has split on spaces.
echo `find . -name '*E1M*' | awk '{print $2,$0}' FS='[()]' | sort | sed -e 's/[^ ]* \(.*\)/\1/' -e 's/\([ ()]\)/\\\\\1/g'`
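For anyone without the files handy, a minimal illustration of the splitting being described, using made-up filenames:

    touch 'track 1' 'track 2'
    for f in `ls` ; do echo "arg: $f" ; done
    # prints arg: track / arg: 1 / arg: track / arg: 2 -- word splitting happens
    # after the backticks are expanded, so quotes or backslashes produced inside
    # the backticks are no longer honoured as quoting by that point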

On Fri, 12 Aug 2005 08:28, Jonathan Purvis wrote:
Does anyone know how to get bash to split the output of a command enclosed in `` the same way it would split the command line if typed in? Everything I do to try to escape spaces ends up not working. E.g., this command (for sorting the songs from _The Dark Side Of Phobos_ in level order) produces a string that can be cut'n'pasted to bash, but if I change the echo command to mplayer, it doesn't work because bash has split on spaces.
echo `find . -name '*E1M*' | awk '{print $2,$0}' FS='[()]' | sort | sed -e 's/[^ ]* \(.*\)/\1/' -e 's/\([ ()]\)/\\\\\1/g'`
Make sure you are using the version of echo you need, ie the bash builtin command or one installed on your filesystem in eg /bin. -- Professor Farnsworth: "He may have ocean madness, but that's no excuse for ocean rudeness."
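If it's unclear which echo is actually being run, a quick check (assuming /bin/echo exists on the system):

    type -a echo        # lists the builtin and any echo binaries on $PATH
    builtin echo hello  # forces the bash builtin
    /bin/echo hello     # forces the external binary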

* Jonathan Purvis <jon(a)purvis.co.nz> [2005-08-11 22:35]:
Eg, this command (for sorting the songs from _The Dark Side Of Phobos_ in level order) produces a string that can be cut'n'pasted to bash, but if i change the echo command to mplayer, it doesn't work as bash split on spaces.
echo `find . -name '*E1M*' | awk '{print $2,$0}' FS='[()]' | sort | sed -e 's/[^ ]* \(.*\)/\1/' -e 's/\([ ()]\)/\\\\\1/g'`
That’s not very helpful to people who have no set of suitable files to check the output… can you give examples of what you expect and what you are getting? Regards, -- Aristotle Pagaltzis // <http://plasmasturm.org/>

A. Pagaltzis wrote:
That’s not very helpful to people who have no set of suitable files to check the output… can you give examples of what you expect and what you are getting?
Make yourself an empty directory and try these 3 commands:
touch file\ 1 file\ 2 file\ 3
for name in * ; do echo $name ; done
for name in `ls -1 | sort -r` ; do echo $name ; done
I'd like some way of making the output of the 3rd command be the output of the 2nd, only reversed. Instead it prints a line break where there should be a space, because bash has split the arguments given to for on spaces. Trying various combinations of quoting or escaping doesn't help:
for name in `ls -1 | sort -r | sed 's/ /\\\\ /g'` ; do echo $name ; done
for name in `ls -1 | sort -r | sed -e 's/^/"/' -e 's/$/"/'` ; do echo $name ; done
for name in `ls -1 | sort -r | sed -e 's/ /\\\\ /g' -e 's/^/"/' -e 's/$/"/'` ; do echo $name ; done
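One workaround that sidesteps the word splitting entirely (a sketch only, not an answer to the backtick question itself) is to read the lines in a while loop rather than a for loop:

    /bin/ls -1 | sort -r | while IFS= read -r name ; do
        echo "$name"    # each line arrives intact, embedded spaces and all
    done
    # still falls over on filenames that contain newlines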

On Fri, 2005-08-12 at 11:35 +1200, Jonathan Purvis wrote:
A. Pagaltzis wrote:
That’s not very helpful to people who have no set of suitable files to check the output… can you give examples of what you expect and what you are getting?
Make yourself an empty directory and try these 3 commands:
touch file\ 1 file\ 2 file\ 3
for name in * ; do echo $name ; done
for name in `ls -1 | sort -r` ; do echo $name ; done
I'd like some way of making the output of the 3rd command be the output of the 2nd, only reversed. Instead it prints a line break where there should be a space, because bash has split the arguments given to for on spaces.
You can get what you want by changing $IFS (bash's internal field separator):
IFS=; for name in `/bin/ls -1 | sort -r` ; do echo $name ; done
-- Colin Palmer <colinp(a)waikato.ac.nz> University of Waikato, ITS Division
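A variation on the same idea that splits only on newlines and restores $IFS afterwards (a sketch using bash's $'\n' quoting):

    OLDIFS=$IFS
    IFS=$'\n'           # backtick output is now split only at line ends
    for name in `/bin/ls -1 | sort -r` ; do echo "$name" ; done
    IFS=$OLDIFS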

Colin Palmer wrote:
On Fri, 2005-08-12 at 11:35 +1200, Jonathan Purvis wrote:
Make yourself an empty directory and try these 3 commands:
touch file\ 1 file\ 2 file\ 3
for name in * ; do echo $name ; done
for name in `ls -1 | sort -r` ; do echo $name ; done
I'd like some way of making the output of the 3rd command be the output of the 2nd, only reversed. Instead it prints a line break where there should be a space, because bash has split the arguments given to for on spaces.
You can get what you want by changing $IFS (bash's internal field separator):
IFS=; for name in `/bin/ls -1 | sort -r` ; do echo $name ; done
That seemed to produce the correct output, but echo was only called once. But with your idea about using IFS, I managed to get this, which does work:
IFS="
"; for name in `/bin/ls -1 | sort -r` ; do echo line: $name ; done
Thanks for your help.

IFS=" "; for name in `/bin/ls -1 | sort -r` ; do echo line: $name ; done
ls -1 | sort -r | xargs echo
I'm not sure why the reverse was important, but you can always use find's -exec option to perform actions on files with spaces in their names.
[oliver] mobility:~/1$ find . -name "file*" -exec echo "'{}'" \;
'./file 3'
'./file 2'
'./file 1'
Or if you just want a delimited filename string then this:
[oliver] mobility:~/1$ find . -name "file*" -printf "'%f'\n"
'file 3'
'file 2'
'file 1'
Find is your friend. Regards -- Oliver Jones <oliver(a)deeperdesign.com> Deeper Design

* Oliver Jones <oliver(a)deeperdesign.com> [2005-08-15 15:35]:
Or if you just want a delimited filename string then this:
[oliver] mobility:~/1$ find . -name "file*" -printf "'%f'\n"
'file 3'
'file 2'
'file 1'
Except that will fall over on filenames with spaces in them. You want to pipe `find ... -print0` to something, most of the time. (Usually xargs(1). That’s much faster than spawning one process for every single file, too, even for a pretty small number of files.) Regards, -- Aristotle Pagaltzis // <http://plasmasturm.org/>
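Applied to the example files from earlier in the thread, that looks something like this (assuming GNU find and xargs, which support -print0 and -0):

    find . -name 'file*' -print0 | xargs -0 ls -l
    # the names travel NUL-separated, so embedded spaces (or newlines) are harmless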

[oliver] mobility:~/1$ find . -name "file*" -printf "'%f'\n"
'file 3'
'file 2'
'file 1'
Except that will fall over on filenames with spaces in them. You want to pipe `find ... -print0` to something, most of the time. (Usually xargs(1). That’s much faster than spawning one process for every single file, too, even for a pretty small number of files.)
xargs is only useful for feeding commands that accept multiple files as arguments and do the same thing to each file (eg cat). All xargs does is take a large set of filenames/lines on standard input and break them up into smaller chunks to execute on the command line of the specified command. On a small number of files/lines xargs is likely to only execute the target command once. xargs can also suffer (as stated in the man page) from the same space/newline issues as bash, though you can tell it to split on NUL, which is nice as it plays well with find -print0 as you suggest. Regards -- Oliver Jones <oliver(a)deeperdesign.com> Deeper Design
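The NUL splitting and the chunking can both be seen without find at all (GNU xargs assumed):

    printf 'file 1\0file 2\0file 3\0' | xargs -0 echo
    # one echo, given the three names as three separate arguments
    printf 'file 1\0file 2\0file 3\0' | xargs -0 -n 1 echo
    # -n 1 forces a separate echo per name instead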

* Oliver Jones <oliver(a)deeperdesign.com> [2005-08-15 16:10]:
You want to pipe `find ... -print0` to something, most of the time. (Usually xargs(1). That’s much faster than spawning one process for every single file, too, even for a pretty small number of files.)
xargs is only useful for feeding commands that accept multiple files as arguments and do the same thing to each file (eg cat).
That’s why I said “usually.” :-) Sometimes, you want to pipe the input to something else. What I was emphasizing is that find(1)’s `-print0` predicate avoids any quoting problems simply by delimiting filenames using the only character that can’t be part of a filename. You can then do anything you want with the output without running into quoting problems; xargs(1) is only one choice for processing it, but the most common because it covers many simple uses.
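For instance, even something as simple as counting the matches needs no quoting care once the names are NUL-separated (GNU tools assumed):

    find . -name 'file*' -print0 | tr -cd '\0' | wc -c
    # keeps only the NUL separators and counts them: one per file, regardless
    # of spaces, quotes or newlines in the names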
On a small number of files/lines xargs is likely to only execute the target command once.
Exactly, that’s why it’s useful to reduce the overhead as compared to find(1)’s `-exec` predicate, which always executes the command as many times as there are files. Regards, -- Aristotle Pagaltzis // <http://plasmasturm.org/>
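The difference is easy to see by making each invocation visible:

    find . -name 'file*' -exec echo run: {} \;         # one echo process per matching file
    find . -name 'file*' -print0 | xargs -0 echo run:  # typically a single echo for the lot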

Oliver Jones wrote:
[oliver] mobility:~/1$ find . -name "file*" -printf "'%f'\n"
'file 3'
'file 2'
'file 1'
Except that will fall over on filenames with spaces in them. You want to pipe `find ... -print0` to something, most of the time. (Usually xargs(1). That’s much faster than spawning one process for every single file, too, even for a pretty small number of files.)
xargs is only useful for feeding commands that accept multiple files as arguments and do the same thing to each file (eg cat). All xargs does is take a large set of filenames/lines on standard input and break them up into smaller chunks to execute on the command line of the specified command. On a small number of files/lines xargs is likely to only execute the target command once.
you can use "-n 1" to only do one file, and -istr to make it insert the file in instead of str, so you can do ls * | xargs -ixxx cp xxx /some/other/dir (if for some reason cp * /some/other/dir wasn't desirable). And combine xargs with -0 for extra delimiting goodness.

* Perry Lorier <perry(a)coders.net> [2005-08-16 00:55]:
you can use "-n 1" to only do one file, and -istr to make it insert the file in instead of str, so you can do
Of course, in that case, when combining with find(1), you don’t gain anything over `find ... -exec ...`. Regards, -- Aristotle Pagaltzis // <http://plasmasturm.org/>

* A. Pagaltzis <pagaltzis(a)gmx.de> [2005-08-15 16:00]:
* Oliver Jones <oliver(a)deeperdesign.com> [2005-08-15 15:35]:
Or if you just want a delimited filename string then this:
[oliver] mobility:~/1$ find . -name "file*" -printf "'%f'\n"
'file 3'
'file 2'
'file 1'
Except that will fall over on filenames with spaces in them.
Errr, with quotes in them, of course. Regards, -- Aristotle Pagaltzis // <http://plasmasturm.org/>

Advance warning: this mail ended up being pretty academic in nature…
* Jonathan Purvis <jon(a)purvis.co.nz> [2005-08-12 02:50]:
But with your idea about using IFS, i managed to get this, which does work:
Filenames can legally contain newlines. (Yes, it’s rare, but it’s also possible.) The canonical way to go about this is passing `-print0` to find(1) so it terminates filenames with a NUL rather than a LF, and splitting on NULs in whatever processes the filenames next.
echo `find . -name '*E1M*' | awk '{print $2,$0}' FS='[()]' | sort | sed -e 's/[^ ]* \(.*\)/\1/' -e 's/\([ ()]\)/\\\\\1/g'`
After squinting at this command for a while, I think what you’re doing is massaging the filenames to properly sort them, then putting them back together, and finally escaping shell metachars. Is that right? This is really a task sort(1) should be up to… unfortunately it’s not, since it can’t use multiple delimiter characters and can’t cope with NUL-separated input either (even though it can produce it, sigh). awk is useless, too. Escaping metachars wouldn’t be necessary if you kept the shell out of the picture, ie no backticks to invoke mplayer, but that’s not doable with sed(1) in the picture, since that can’t split on NULs at all… yuck. I don’t see a way to do this *well* without Perl.
find . -name '*E1M*' -print0 | perl -naF'\0' -0777l0e'
    @k = map join( "", ( split /[()]/ )[2, 0] ), @F;
    print for @F[ sort { $k[$a] cmp $k[$b] } 0 .. $#k ]
' | xargs -0 mplayer
Quite a mouthful. :-( This really is a task that sort(1) *should* be fully up to… splitting on NULs and being able to use multiple different delimiter characters would make it a simple matter of saying something like
find . -name '*E1M*' -print0 | sort -0z -t \( -t \) -k 3,1 | xargs -0 mplayer
Alas. Regards, -- Aristotle Pagaltzis // <http://plasmasturm.org/>
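For completeness, the NUL-splitting half of this (without the sorting) can also be done in bash itself, using read -d '' and process substitution (a sketch, assuming a reasonably recent bash):

    while IFS= read -r -d '' name ; do
        echo "$name"    # or mplayer "$name", etc.
    done < <(find . -name '*E1M*' -print0)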
participants (6):
- A. Pagaltzis
- Colin Palmer
- Glenn Enright
- Jonathan Purvis
- Oliver Jones
- Perry Lorier