Shell — What is the size limit on shuf in FreeBSD? Getting an error.

I get an error on this script.
Code:
$ ranImg
/home/userx/bin/ranImg: line 69: /usr/local/bin/shuf: Argument list too long
Line 69 is:
Code:
temp4=( $( shuf -e -n "$smallest" "${temp1[@]}" ) )
As you can see, the script fills arrays with paths to images, combines them all into one array (mixing them up along the way), then uses that final array to pick a random image to set in mrxvt. On FreeBSD I get the error posted above, but not on Linux.

So to me that says FreeBSD has a size limit on how much shuf can shuffle?

Is there a fix, or should I reduce the number of images I am taking in to work around this?

Or is that message telling me something different? In which case I'm in the "I'm lost" dept.
Code:
#!/usr/bin/env bash
#Jun. 11, 2019
#Michael Heras

#to give a random background image to
#mrxvt terminal
#by changing its resource file
#.mrxvtrc

wd1=$HOME/data/ScreenResizedImages
wd2=$HOME/data/wallpapers
wd3=$HOME/data/wallhaven-papers
SysColors=$HOME/bin/colorsOfASystem

#echo "
#wd1Count=$( find $wd1 -type f | wc -l )
#wd2Count=$( find $wd2 -type f | wc -l )
#wd3Count=$( find $wd3 -type f | wc -l )
#"
#get count of totals in the dir, put in an array
totals=(
$( find $wd3 -type f | wc -l )
$( find $wd1 -type f | wc -l )
$( find $wd2 -type f | wc -l )
)

smallest=${totals[0]}

for((i=0;i<${#totals[@]};i++))
do
#logic for smallest number
if [ ${totals[$i]} -lt $smallest ]; then
    smallest=${totals[$i]}
fi
done
#echo $smallest

#logic for greatest number
#elif [ ${nos[$i]} -gt $greatest ]; then
#greatest=${nos[$i]}
##fi
#done

RandomColor()
{
    cat "$SysColors" | shuf | shuf > tempColors
    mv tempColors "$SysColors"
    mapfile -t colors < "$SysColors"
}


MixRandomImages()
{
    #Resized  Images
    mapfile -t temp1 < <(find "$wd1" -type f | shuf )
    #Wallpaper Images
    mapfile -t temp2 < <(find "$wd2" -type f -name "*.jpg" | shuf )
    #wallhaven-papers
    mapfile -t temp3 < <(find "$wd3" -type f -name "*.jpg" | shuf )
 
 
    #takes half of the amount of images in wd2 and adds that
    #and temp1 into an array.
    #-e, --echo treat each ARG as an input line
    #-n, --head-count=COUNT output at most COUNT lines
    #    temp5=( $( shuf -e -n "$(($(ls "$wd2" | wc -l)/2))" "${temp1[@]}" ) )
    #take the lowest amount in all three dir, and give same amount
    #of other images into the mix
    temp4=( $( shuf -e -n "$smallest" "${temp1[@]}" ) )
    temp5=( $( shuf -e -n "$smallest" "${temp2[@]}" ) )
    temp6=( $( shuf -e -n "$smallest" "${temp3[@]}" ) )
 
 
    #add all 3 image arrays into one array
    ranArray1=( "${temp5[@]}" "${temp4[@]}"  "${temp6[@]}" ) 
 
    #shuffle them up a few times
    for i in {1..4}
    do
        for g in 1
        do
            sh1=( $(shuf -e "${ranArray1[@]}") )
        done
        unset ranArray1
        ranArray1=( ${sh1[@]} )
        unset sh1 
    done
 
 
    #shuffle them up once
    #ranArray2=( $( shuf -e "${ranArray1[@]}") )
    #shuffle them up twice
#    ranArray3=( $( shuf -e "${ranArray2[@]}")  )
 
    #shuffle array up to get a good mix
    ImageArray=( "${ranArray1[@]}" )
 
 
    #echo ${ImageArray[@]}
}
#write changes to mrxvt terminal config file
UpdateMrxvtConfig()
{
    sed -ibak 's|.*mrxvt.background:.*|mrxvt.background: '"$( echo -e ${colors[ $RANDOM % ${#colors[@]} ]})"'|' $HOME/.mrxvtrc
    sed -ibak 's|mrxvt.tabBackground:.*|mrxvt.tabBackground: '"$( echo -e ${colors[ $RANDOM % ${#colors[@]} ]})"'|' $HOME/.mrxvtrc
    sed -ibak 's|mrxvt.cursorColor: .*|mrxvt.cursorColor: '"$( echo -e ${colors[ $RANDOM % ${#colors[@]} ]})"'|' $HOME/.mrxvtrc
    sed -ibak 's|mrxvt.foreground: .*|mrxvt.foreground:  '"$( echo -e ${colors[ $RANDOM % ${#colors[@]} ]})"'|' $HOME/.mrxvtrc
    sed -ibak 's|.*mrxvt.Pixmap:.*|mrxvt.Pixmap: '"$( echo -e ${ImageArray[ $RANDOM % ${#ImageArray[@]} ]})"';80x80|' $HOME/.mrxvtrc
}
#run functions
RandomColor
MixRandomImages
UpdateMrxvtConfig

#start the terminal.
mrxvt &
 
The error "Argument list too long" doesn't come from shuf, it probably comes from the shell. You know how shell globbing works? If you say "ls *", and the directory contains a, b and c, then the shell turns the * into "ls a b c" (except it doesn't build up a string with blanks, it turns this into an array of short strings). The shell then takes that command line, and starts the program (ls in my example, shuf in yours) using one of the exec... calls, which takes the arguments as an array. It is in this process that there is a limitation.

Now, I think (but I'm not 100% sure) that there are two sets of limitations. First, the shell has to allocate memory to process all the strings, and it may have an explicit limit there, or it might just run out of memory. Second, the underlying execve call has a limitation on how long that array is allowed to be, and that limit is cunningly called ARG_MAX. You can check how big that limit is with getconf.

If you feel like debugging this, then use the find command to count how many files there are, and add debugging printouts to your script. I don't know whether ARG_MAX can be increased without a full recompile, never checked.

In general, it is considered silly to use the normal argument passing mechanism for insanely large numbers of arguments, and tens of thousands or hundreds of thousands is just crazy. This is not at all efficient. The preferred technique here is to use the find command, and process the arguments in groups. For example: "find . -type f | xargs -n 10 ls -l" will run the ls command on 10 files at a time. But that technique doesn't work if the command has to see the whole list at once. In that case, a better technique is to either feed the list of arguments into the command via stdin, or via a special input file, or have the command itself do the glob expansion or directory walking.
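For shuf in particular, the stdin technique mentioned above makes the whole problem disappear: shuf reads lines from standard input by default, so the file list never has to pass through execve at all. A minimal sketch (the array contents here are hypothetical stand-ins for the script's real paths):

```shell
#!/usr/bin/env bash
# Instead of: temp4=( $( shuf -e -n "$smallest" "${temp1[@]}" ) )
# pipe the array through stdin -- shuf reads lines, so ARG_MAX never applies.
temp1=( /img/a.jpg /img/b.jpg /img/c.jpg )   # hypothetical paths
smallest=2
mapfile -t temp4 < <(printf '%s\n' "${temp1[@]}" | shuf -n "$smallest")
echo "${#temp4[@]}"   # → 2
```

mapfile also keeps paths containing spaces intact, which the original `temp4=( $( shuf ... ) )` word-splitting would not.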

Where this gets interesting is on supercomputers with cluster filesystems. Users there learn pretty quickly that doing "ls *" in a directory with a billion files is not practical.
 
What is the maximum character length of arguments in a shell command? How do I find out the maximum length of arguments for a new process under Linux or Unix like operating systems?

If you get an error that reads "command: Argument list too long", it is due to the limit on command line length.
UNIX / Linux / BSD systems have a limit on how many bytes can be used for command line arguments and
environment variables. You can use the getconf command to query the system configuration variable called
ARG_MAX.

Code:
$ getconf ARG_MAX 
262144

$ echo $(( $(getconf ARG_MAX) - $(env | wc -c) ))
 261372

$ expr `getconf ARG_MAX` - `env|wc -c` - `env|wc -l` \* 4 - 2048
259204
source https://www.cyberciti.biz/faq/linux-unix-arg_max-maximum-length-of-arguments/

So then I'd have to concede that yes, the length of the paths to these files plus their file names, times the total number of files in the parent's sub-dirs, has to be going over FreeBSD's limit.
Code:
$ ranImg

    Temp1
    15980


/home/userx/bin/ranImg: line 82: /usr/local/bin/shuf: Argument list too long
That being the number of files taken into the array named temp1. I'd need to figure out how many characters those paths total to see how far over I am going, which is somewhat redundant. But yeah, that is why I am getting the "hey, that is way too many for me to deal with" message.
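Tallying those characters is actually a one-liner, since find's output is exactly the set of strings that would become arguments (the newline after each path stands in for the NUL terminator execve stores per argument). A sketch, with `dir` as a hypothetical directory to measure:

```shell
#!/usr/bin/env bash
# Rough tally of the bytes a directory's file paths would occupy as an
# argument list, next to the system's ARG_MAX for comparison.
dir=${1:-.}                           # hypothetical directory to measure
total=$(find "$dir" -type f | wc -c)  # sum of path lengths + 1 byte each
limit=$(getconf ARG_MAX)
echo "paths: $total bytes, ARG_MAX: $limit"
```

This undercounts slightly (execve also charges per-argument pointer overhead and the environment), but it shows at a glance whether a directory is anywhere near the limit.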

I'll just need to adjust how I pick the images evenly, or limit the amount.

Thanks, I've definitely learned something today.

I'll just go compare that to Linux for a look-see.

Yep, Linux's limit is quite a bit larger than FreeBSD's (ArcoLinux):
Code:
[userx@arcomeo ~]
$ getconf ARG_MAX
2097152

$ echo $(( $(getconf ARG_MAX) - $(env | wc -c) ))
2095916

$ expr `getconf ARG_MAX` - `env|wc -c` - `env|wc -l` \* 4 - 2048
2093688
 
You have to be careful when you say "Linux" here. Linux is a kernel, and the kernel is just one component of execve. The answer depends crucially on the distribution, and what defaults they used when compiling userland and matching kernel.

In theory, you could find the source for FreeBSD (it's trivial to find and install, the handbook explains how), find the place where ARG_MAX is set (perhaps indirectly), increase it, and recompile. I think that would be a bad idea; instead you should examine the way you are trying to solve this problem, and find a more efficient and elegant technique.
 
Quote:
You have to be careful when you say "Linux" here. Linux is a kernel, and the kernel is just one component of execve. The answer depends crucially on the distribution, and what defaults they used when compiling userland and matching kernel.

In theory, you could find the source for FreeBSD (it's trivial to find and install, the handbook explains how), find the place where ARG_MAX is set (perhaps indirectly), increase it, and recompile. I think that would be a bad idea; instead you should examine the way you are trying to solve this problem, and find a more efficient and elegant technique.
That is why I referenced the GNU/Linux distro I checked it on:
Yep, Linux's limit is quite a bit larger than FreeBSD's (ArcoLinux).

As far as recompiling to get a larger constant for the max: I wouldn't want to do that to an OS either.

I am working on a solution, but I'm having an issue with dynamically numbering array names as I fill each one to the max allowed. I have changed it to this thus far:
Code:
MAX_ARG=$( expr `getconf ARG_MAX` - `env|wc -c` - `env|wc -l` \* 4 - 4096 ) # 2048 )
echo "1: MAX_ARG=$MAX_ARG"
 
totalARGMAX=0
cnt=0
while read -r d
do
    #keep a running total of chars (+1 per arg for its NUL terminator)
    totalARGMAX=$(( totalARGMAX + ${#d} + 1 ))
    #echo "$totalARGMAX"
    #echo "MAX_ARG $MAX_ARG"

    if [[ "$totalARGMAX" -ge "$MAX_ARG" ]] ; then
        ((cnt++))
        #totalARGMAX=0
        echo;echo
        echo "$cnt :: $totalARGMAX"
        echo;echo
        sleep 1
        echo "Breaking "
        break
    else
        myarray+=( "$d" )
        #does not work, bad substitution on $cnt
        #myarray"$cnt"+=( "$d" )
    fi
done < <(find "$wd1" -type f)

That eliminates the error, but I will always get the same images, leaving the rest behind unpicked. (Not taking into consideration the overkill of the multitude of images to be used; that is a moot point. ;) )
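On the "bad substitution" note in the code above: `myarray"$cnt"+=( ... )` isn't valid bash, but namerefs (`declare -n`, bash 4.3+) are one way to append to dynamically numbered arrays. A minimal sketch with hypothetical paths:

```shell
#!/usr/bin/env bash
# Sketch of the "bad substitution" fix: a nameref aliases whatever array
# name it is pointed at, so myarray0, myarray1, ... can be filled in a loop.
cnt=0
for d in /img/a.jpg /img/b.jpg /img/c.jpg; do   # hypothetical paths
    unset -n ref                   # drop the previous target, if any
    declare -n ref="myarray$cnt"   # ref now aliases the array myarray$cnt
    ref+=( "$d" )                  # appends to myarray$cnt through the alias
    ((cnt++))
done
echo "${myarray0[0]} ${myarray1[0]} ${myarray2[0]}"
```

In the real script the loop body would append to the current `myarray$cnt` until the running byte total hits MAX_ARG, then bump `cnt` and keep going instead of breaking, so no images get left behind.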

Anyways, that is a work in progress.
 