I created a simple alias for xargs, with the intent of piping into it when needed. It simply runs a command for each line of its input. My question to you: is this useful, or are there better ways of doing this? This is basically just a bit of brainstorming. Maybe I have a knot in my head.

# Pipe each line and execute a command. The "{}" will be replaced by the line.
# Example:
#   find . -maxdepth 2 -type f -name 'M*' | foreach grep "USB" {}
alias foreach='xargs -d "\n" -I{}'

For commands that already operate on every line from stdin, this isn't of much use. But in other cases, it might be. A simpler (and admittedly useless) usage example would be:

find . -maxdepth 1 | foreach echo "File" {}

It’s important to use {} as a placeholder for the “current line” being processed. What do you think about its usefulness? Do you have any ideas for how to use it?
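
For a concrete sense of what the alias expands to, here is the same idea written out as the raw xargs command (aliases don’t expand in scripts, and -d is a GNU xargs extension; the sample input is made up):

```shell
# What "foreach echo 'line:' {}" expands to: xargs reads stdin,
# splits on newlines (-d '\n'), and substitutes each line for {}.
printf 'one\ntwo\n' | xargs -d '\n' -I{} echo "line: {}"
# prints:
# line: one
# line: two
```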

  • kittenroar@beehaw.org · 3 points · 1 day ago

    Great minds, lol. I have almost the exact same command set up as a little script. Mine has an extra modification for my use case, and I named mine iter, but foreach is a good name for it too.

  • gnuhaut · 5 points · 2 days ago

    Don’t use ls if you want to get filenames; it does a bunch of stuff to them. Use a shell glob or find.

    Also, because filenames can contain newlines, if you want to loop over them it’s best to use one of these:

    for x in *; do do_stuff "$x"; done # can be any shell glob, not just *
    find . -exec do_stuff {} \;
    find . -print0 | xargs -0 do_stuff # not POSIX but widely supported
    find . -print0 | xargs -0 -n1 do_stuff # same, but for single arg command
    

    When reading newline-delimited stuff with while read, you want to use:

    cat filenames.txt | while IFS= read -r x; do do_stuff "$x"; done
    

    The IFS= prevents trimming of whitespace, and -r prevents interpretation of backslashes.
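
    A small demonstration of the difference (the sample string is made up for illustration):

```shell
# A line with leading whitespace and a backslash sequence.
sample='  a\tb'

# Plain read: default IFS trims the leading spaces, and without -r
# the backslash is treated as an escape character and disappears.
printf '%s\n' "$sample" | { read line; printf '%s\n' "$line"; }
# prints: atb

# IFS= read -r: the line comes through unchanged.
printf '%s\n' "$sample" | { IFS= read -r line; printf '%s\n' "$line"; }
# prints:   a\tb
```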

    • thingsiplay@beehaw.org (OP) · 2 points · edited · 15 hours ago

      Some additional thoughts to be aware of, from looking closer at each line (previously I just glanced over it).

      This point doesn’t directly affect your example, but I want to make you aware of something I’ve fallen into myself. It’s one of those Bash quirks; other shells might handle it differently, I’m only speaking about Bash here. For a regular for loop over files, it’s important to note that if no file matches, the variable will be set to the search pattern itself. For example, with for x in *.png; do, if no .png file is found, then x will be set to *.png literally. Depending on what you do in the loop, this could be catastrophic. But Bash has an option specifically for this: shopt -s nullglob . With this option, a pattern that matches nothing expands to nothing, so the loop body simply never runs. More about Bash options: https://www.gnu.org/software/bash/manual/html_node/The-Shopt-Builtin.html

      for x in *.abcdefg; do echo "$x"; done   # prints: *.abcdefg
      shopt -s nullglob
      for x in *.abcdefg; do echo "$x"; done   # prints nothing
      

      BTW, one can also read line by line without cat, by redirecting the file directly:

      while IFS= read -r line; do echo "Line: ${line}"; done < filenames.txt
      
    • thingsiplay@beehaw.org (OP) · 2 points · 2 days ago

      Those find and ls commands were just there to illustrate how the actual alias works, to get a sense of it. I’m aware of those issues with filenames. It’s not about ls or find here.

      • gnuhaut · 3 points · 1 day ago

        Yeah, sorry then. It would still be good not to use ls in your example though; someone who doesn’t know about that might read this discussion and think it’s reasonable.

        As for your original question, using foreach as a personal alias is fine. I wouldn’t use it in any script, since anyone else reading it probably already knows about xargs, so your foreach would just be more confusing to a potential reader, I think.

        • thingsiplay@beehaw.org (OP) · 2 points · 1 day ago

          I guess you are right. I even point such things out to others, so fair enough. I will update the example, as I don’t want someone to see this and take it as a good example. As for the alias, I never use aliases in scripts anyway; those are always for interactive usage, for me at least. In scripts I follow some other rules, such as using long option names (unless it is a really common one).

  • davelA · 3 points · 2 days ago

    A bit of a tangent, but I almost never use xargs in the shell anymore, and instead use “while read line; do *SOMETHING* "$line"; done”, because xargs doesn’t have access to the shell’s local variables, aliases, or functions.
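
    A sketch of that difference (the function name shout is made up): a shell function is visible to a while-read loop, but not to a child process like xargs unless you export it, which is a bash-specific workaround:

```shell
#!/bin/bash
# "shout" is a hypothetical shell function; xargs alone can't call it,
# because it is not an external command on PATH.
shout() { printf '%s!\n' "$1"; }

# Works: the while loop runs inside a shell that knows the function.
printf 'a\nb\n' | while IFS= read -r line; do shout "$line"; done
# prints:
# a!
# b!

# Workaround for xargs: export the function (bash-only) and go via bash -c.
export -f shout
printf 'a\nb\n' | xargs -I{} bash -c 'shout "$1"' _ {}
# prints the same two lines
```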

    • dcdc@programming.dev · 1 point · 2 days ago

      True, the loop is easier to work with, though you can still pass args/env into the subshell, and xargs’ -P is one of my favorites depending on the task (it may not be desired in this case). Sometimes I’ve done both: echo assembled commands in a loop or find -exec, sanity-check the output, then pipe to xargs … bash -c to run in parallel.
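
      A minimal sketch of the -P pattern (-P is a GNU/BSD xargs extension; the job count and commands here are made up, and with parallel workers the output order is not guaranteed):

```shell
# Run one job per input line, up to 2 in parallel (-P 2).
# Passing the line as "$1" instead of splicing {} into the command
# string avoids quoting problems with odd input.
printf '%s\n' a b c | xargs -P 2 -I{} sh -c 'echo "done $1"' _ {}
# all three "done ..." lines appear, though possibly out of order
```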

    • thingsiplay@beehaw.org (OP) · 1 point · 2 days ago

      Good point! A while loop is probably more flexible here and easier to expand too. I will experiment a bit more, and maybe I’ll change to a while read loop implemented as a Bash function.
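
      Such a function could look roughly like this (a sketch; the name foreach and the append-the-line-as-last-argument behavior are my assumptions, and the alias’ {} substitution is not replicated here):

```shell
# Hypothetical while-read version of the alias: runs the given
# command once per input line, appending the line as the last argument.
foreach() {
  local line
  while IFS= read -r line; do
    "$@" "$line"
  done
}

printf 'a\nb\n' | foreach echo "File:"
# prints:
# File: a
# File: b
```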

  • F04118F@feddit.nl · 2 points · edited · 2 days ago

    How to call xargs is one of those things I always forget. The foreach alias is a great solution!

    My current solution is to use tldr for all of these tools, but yeah, if I find myself needing a for-each-line, I’ll definitely steal your alias.

    Luckily (knocks on wood) I almost exclusively work with yaml and json nowadays, so I should just learn yq.