Shell [SOLVED] Which tool is best for aggregating bash/sh scripts into a library?

Dear FreeBSD Gurus!

After 20+ years, the number of sh/bash scripts any SysAdmin has accumulated reaches 300-500:
- some are obsolete because the OS changed significantly, but the task the script solved never came up again, so the script was never updated;
- some were collected from forum threads like this one and never tested on the actual configuration;
- some required special configurations (which existed only on some old projects);
- etc., etc.

At some point the situation reaches a state where asking a new question on a user forum, asking an LLM like Claude/GPT, or asking on StackExchange is the quickest way to complete the task and get the job done well.

On Apple's macOS there are several apps for keeping script and code-snippet libraries; the best of them are Dash (which also bundles downloadable reference documentation for bash/sh and many other languages) and SnippetsLab.

Which tools do you use to organize your scripts/code-snippet library?
 
Which tools do you use to organize your scripts/code-snippet library?
Just one, actually... devel/git; ever since I discovered this tool several years ago I have been quite an avid user and advocate.

And not just for development purposes, but also as a means of "history storage". For example: on my system /usr/local/etc is actually a Git repository, which in turn is part of a main Git repository within /root in the form of a worktree. And not just /usr/local/etc; many other locations as well.

This allows me to keep full control over all my config files, my shell scripts, everything, all from a central(ized) location. The best part is that it also gives me full history access, despite the limited retention of my system backups.
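The idea of a Git-tracked config directory can be sketched in a few commands. This is a minimal illustration, not ShelLuser's exact setup: the directory and file names are examples, and a throwaway temp directory stands in for /usr/local/etc.

```shell
# Minimal sketch: track a config directory with Git.
# mktemp stands in for a real path such as /usr/local/etc.
repo=$(mktemp -d)
cd "$repo"
git init -q .
printf 'hostname="example.local"\n' > rc.conf.local
git add rc.conf.local
git -c user.name=admin -c user.email=admin@example.local \
    commit -q -m 'snapshot: initial config'
# Full history of every config change is now one command away:
git log --oneline
```

From here, each config edit becomes a commit, and `git log -p rc.conf.local` replays the entire change history of that one file, independent of backup retention.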
 
Which tools do you use to organize your scripts/code-snippet library?
Well, there are many reasons to organize the scripts and code snippets that one writes for oneself.

Sometimes, those scripts/snippets accomplish a task that is not normally done by the system.
Sometimes, one discovers that they reinvented the wheel, and there was no real need to write that script.
Sometimes, those one-liners are actually reused in a project for an app.
Sometimes, that same task is better off being re-written as a C++/Ruby/Java/Rust/whatever module.

My point is, there's no perfect, one-size-fits-all solution.

- If you're keeping obsolete stuff around for reference, just dump everything into one subfolder, and use a file manager to organize it. Maybe include a README that spells out the story behind each and every script...
- If you have a collection of bash/sh scripts that you downloaded from random places and never used - time to delete them. I sometimes keep random Windows software around - offline on a USB stick, because I know it would be time-consuming to obtain an up-to-date copy all over again. I haven't installed any of that stuff in ages, because I found replacements in Ports that work just as well. But what I do for Windows software is quite different from what I do with UNIX scripts. I just don't see it as worthwhile to collect and archive bash/sh scripts. They normally accomplish too little by themselves, and the functionality they offer is easily replaced.
- If a bash/sh script requires special config, I just look elsewhere to get the task done.

Having said that, SnippetsLab does look interesting. As for what's available in FreeBSD, there's VSCode (editors/vscode), and Obsidian Notes. Both are based on Electron, and they do offer pretty powerful offline text-search capabilities. Not impossible to use those to manage code snippets and notes, but it takes a bit of time and effort to create and maintain an organizational structure that you're happy with.
 
I exclude myself from "Gurus", but possibly my answer may contain some help anyway.
Which tools do you use to organize your scripts/code-snippet library?
Before we can start talking tools, we need to define in detail what exactly you mean by "organize" and "library".

Every tool is meant for a specific job based on a certain concept. So any tool is useless unless you define exactly what you want to do. You can pick a tool and adopt its concept, but in my eyes you'd better define your concept first and then look for suitable tools, instead of starting with the tool, reorganizing your machine and yourself to fit it, and potentially finding out it was not what you wanted.

To me it seems there are at least 2 tasks:

1. You want to keep track of the different development stages of your scripts.
As ShelLuser already pointed out, that is exactly what version control systems are for: Git, Subversion, Mercurial, ... - you need to pick one for yourself.

2. You want to get an overview, keep track, and organize which scripts there are: which are in use, which are not in use but somehow still useful, which need to be updated, which are completely obsolete... (definitions)

The simplest way, which is what I do since I have only a few dozen self-written scripts in actual use: directories and links. My core tools for organizing are cp, mv, and rm, plus comprehensible filenames, even if they are long. For things frequently used in my daily production I define short aliases within my ~/.cshrc, pointing at files whose (path)names tell me what they are.
I keep my 'development & experimentation area' strictly separate from the directories that exclusively contain stuff in use, in the form of one release version each, only: ~/scripts_in_use/ and ~/bin
Everything I don't need/use anymore I move into a directory named 'archive' (for which you may also use tar and compression).
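That layout can be sketched in a few lines of shell. All directory and script names below are illustrative examples, with a temp directory standing in for $HOME:

```shell
# Sketch of the layout described above; all names are examples.
base=$(mktemp -d)                      # stand-in for $HOME
mkdir -p "$base/scripts_in_use" "$base/archive"
# A long, self-describing filename:
printf '#!/bin/sh\necho backing up...\n' \
    > "$base/scripts_in_use/backup_home_to_nas.sh"
chmod +x "$base/scripts_in_use/backup_home_to_nas.sh"
# In ~/.cshrc one might add a short alias for it, e.g.:
#   alias bak $HOME/scripts_in_use/backup_home_to_nas.sh
# Retiring an obsolete script: move it to the archive, optionally tar it up.
mv "$base/scripts_in_use/backup_home_to_nas.sh" "$base/archive/"
tar -czf "$base/archive.tgz" -C "$base" archive
ls "$base/archive"
```

The point is that plain cp/mv/rm plus descriptive names already give a searchable structure with zero extra tooling.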

Organizing your directories manually presupposes some discipline, but in return it brings a lot of basic order. I know what I have and where I have it, or at least where to look for it, which limits searches; I can be sure I don't have something if it's not found in certain places; and above all I have no arguments with automatic facilities that want to delete things because the computer thinks it knows better than I do what I should get rid of.

One of the most important things I learned in my early computing days was:
Removing things you don't need anymore is the second most important habit, after making backups.

MS Windows especially teaches "every sperm is sacred" (Monty Python's 'The Meaning of Life'), asking every time if you're really, really sure you don't want to save that empty default sheet you created by accident by starting a program by mistake: "It may be of some use in the future. And be warned, you will lose it forever, no way to ever get it back, if you delete it now!" Delete! "Okay, I will put it in the wastebasket, in case you change your mind later." No! Remove it, tracelessly! Kill it! Wipe it out!
I bet a sixpack of Bavarian beer there are computers out there with directories full of identical empty files named 'untitled', 'untitled1', 'untitled2', 'untitled3', ...

Getting rid of any useless crap the very moment you realize you don't need it anymore, or stumble over it, doesn't just massively reduce the effort needed to organize your machine - do you really need to order, sort, organize, and archive garbage?
Otherwise, sooner or later you will drown in garbage. Not being able to find the things you have, because of all the useless garbage around and on top of them, is much the same as not having them at all.

Anyway, cleaning up is always the very first step, and it's needed anyway for a fresh start with a new tool or concept. Maybe the job is already done with that alone.
If not, a more sophisticated way would be to use version control for organizing those directories as well.
Since version control keeps track not only of the different stages of single files but of whole directories, it can help you organize certain directories too, e.g. ~/my_scripts ~/bin /etc /usr/local/etc

For more complex things, more detailed definitions are needed of what shall be organized, and how, in detail.
Some content management systems could also help, and are worth a closer look.
Maybe you want/need to combine those, or combine version control with a database. Or maybe a simple spreadsheet already does the job for you, or you want your overview in another form...
What do you want automated, on what criteria? What can be automated?
 
- If you're keeping obsolete stuff around for reference, just dump everything into one subfolder, and use a file manager to organize it. Maybe include a README that spells out the story behind each and every script…

Thank you, I appreciate your answers!

But a collection/library WITHOUT TAGGING is NOT USABLE AT ALL.
And it doesn't matter whether the library holds scripts or useful "one-line shell commands" (that depends on your everyday work needs - many different small tasks as a server TechSupport/SysAdmin, or much more complex scripting as a DevOps).

Is using a readme.txt comfortable? Definitely NOT!
Because you need to write at least a short description of the code/script, and no one does that. We all hate spending time on it.
By contrast, picking a few short tags is FAST, INFORMATIVE, and USABLE.

Writing my initial question, I clearly understood that, unlike DevOps folks, around 97% of *nix users DO NOT HAVE USEFUL TOOLS for maintaining scripts/code snippets. And Git is the only tool that is more or less usable.

But anyway, without a specialized separate app with a graphical UI (better still, with integrations with NeoVim/VSCode and the Terminal) AND TAGGING (for quick access to a particular piece of code/script), it is all VERY UNUSABLE = nearly impossible.
Today most of us use iPads/iPhones to access our servers and active network equipment everywhere: from the home kitchen table to a public cafe on the road. So having useful tools with cloud sync to our desktop at work/home is a MUST HAVE. Not a need, not a want - a MUST HAVE.

Where am I wrong?
 
But anyway, without a specialized separate app with a graphical UI (better still, with integrations with NeoVim/VSCode and the Terminal) AND TAGGING (for quick access to a particular piece of code/script), it is all VERY UNUSABLE = nearly impossible.

Where am I wrong?
Well, if you want a system that allows you to tag those one-liners, then I'd recommend VSCode or Obsidian Notes. They do support tags as an organizational tool. But there's still some up-front effort required to learn how to use those tags and how to organize them to fit your needs.
 
Where am I wrong?
You lost me.
On the one hand you're asking an open question vaguely; on the other hand you are pretty clear about what's mandatory to get things done the way you like, while at the same time implying it doesn't exist, even though there are tools offering such things...

Maybe I misunderstood, and you're not looking for a tool/solution/way, but want to know whether it would be worth the effort to write such a tool?
 
Which tools do you use to organize your scripts/code-snippet library?
DokuWiki!
  • Want to add kmods_latest repo on FreeBSD? Notes
  • Random FlightGear flags for performance? Notes
  • Taming Microsoft Edge on Windows? Notes
  • Max performance turn-key scripts for AMD/NVIDIA GPUs on Linux? Notes
  • Solution to Fedora trying to erase Xorg? Notes
  • Coreboot on ASUS KCMA-D8? Notes
  • Playing RuneScape on a Peloton bike? Believe it or not, notes :p
  • And notes on how I host the notes, 3 ways Linux and FreeBSD too
If it's interesting, I try it, take notes along the way, and improve them later! Oblivion's been so much fun lately that I still have to clean this up a bit, but it's all noted :p

But a collection/library WITHOUT TAGGING is NOT USABLE AT ALL.
If anything is too non-functional I just delete the text or the whole page, but I roll with "if I wrote it, it most likely worked at the time" :p
 
Thank you, I appreciate your answers!

I exclude myself from "Gurus", but possibly my answer may contain some help anyway.
:)
Before we can start talking tools, we need to define in detail what exactly you mean by "organize" and "library".

Every tool is meant for a specific job based on a certain concept. So any tool is useless unless you define exactly what you want to do. You can pick a tool and adopt its concept, but in my eyes you'd better define your concept first and then look for suitable tools, instead of starting with the tool, reorganizing your machine and yourself to fit it, and potentially finding out it was not what you wanted.
Absolutely agree. I am more than sure the need to organize scripts/shell code arises ONLY for people who WORK WITH SCRIPTS 5-7 DAYS A WEEK. And probably on different *nix systems and different versions of those systems.

For ordinary FreeBSD enthusiasts this organizing is not needed: a problem/issue comes up -> they ask on this forum, or Claude/ChatGPT, or StackExchange -> try & edit the script -> done.
And then they never touch the scripts for 2-3 months or so.

To me it seems there are at least 2 tasks:

1. You want to keep track of the different development stages of your scripts.
As ShelLuser already pointed out, that is exactly what version control systems are for: Git, Subversion, Mercurial, ... - you need to pick one for yourself.
Today only Git has great implementations, with useful graphical UI clients.

Another point is: no one loves learning yet another (complex!) tool like CVS on top of learning the scripting language itself!
Especially if there is a huge dependency on which command interpreter (zsh, zsh+fish, csh, ksh, ash, bash, sh, ...) will be used.

2. You want to get an overview, keep track, and organize which scripts there are: which are in use, which are not in use but somehow still useful, which need to be updated, which are completely obsolete... (definitions)
I agree with you.

But only a few of us have the habit of constantly updating scripts (optimizing for speed, improving the algorithm, increasing security, etc.).
Especially if we used particular scripts on one project, then changed workplaces, and no one pays us to improve 2-3-year-old scripts or cron jobs.

In any case, using tags (#needupdsec, #needoptimise, ...) is better than nothing. Isn't it?
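Even without any dedicated app, such tags work if they are just greppable comments inside the scripts. A small sketch, where the filenames and tag names are invented examples:

```shell
# Sketch: comment tags like #needoptimise are plain text, so grep finds them.
dir=$(mktemp -d)
cat > "$dir/rotate_logs.sh" <<'EOF'
#!/bin/sh
# tags: #needoptimise #needupdsec
echo rotating...
EOF
cat > "$dir/check_disk.sh" <<'EOF'
#!/bin/sh
# tags: #stable
echo checking...
EOF
# List every script still flagged for optimisation:
grep -rl '#needoptimise' "$dir"
```

A convention like a single `# tags:` line per script gives most of the benefit of a tagging app, at the cost of a little discipline.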

The simplest way, which is what I do since I have only a few dozen self-written scripts in actual use: directories and links. My core tools for organizing are cp, mv, and rm, plus comprehensible filenames, even if they are long. For things frequently used in my daily production I define short aliases within my ~/.cshrc, pointing at files whose (path)names tell me what they are.
I keep my 'development & experimentation area' strictly separate from the directories that exclusively contain stuff in use, in the form of one release version each, only: ~/scripts_in_use/ and ~/bin
Everything I don't need/use anymore I move into a directory named 'archive' (for which you may also use tar and compression).

Organizing your directories manually presupposes some discipline, but in return it brings a lot of basic order. I know what I have and where I have it, or at least where to look for it, which limits searches; I can be sure I don't have something if it's not found in certain places; and above all I have no arguments with automatic facilities that want to delete things because the computer thinks it knows better than I do what I should get rid of.

One of the most important things I learned in my early computing days was:
Removing things you don't need anymore is the second most important habit, after making backups.

MS Windows especially teaches "every sperm is sacred" (Monty Python's 'The Meaning of Life'), asking every time if you're really, really sure you don't want to save that empty default sheet you created by accident by starting a program by mistake: "It may be of some use in the future. And be warned, you will lose it forever, no way to ever get it back, if you delete it now!" Delete! "Okay, I will put it in the wastebasket, in case you change your mind later." No! Remove it, tracelessly! Kill it! Wipe it out!
I bet a sixpack of Bavarian beer there are computers out there with directories full of identical empty files named 'untitled', 'untitled1', 'untitled2', 'untitled3', ...

Getting rid of any useless crap the very moment you realize you don't need it anymore, or stumble over it, doesn't just massively reduce the effort needed to organize your machine - do you really need to order, sort, organize, and archive garbage?
Otherwise, sooner or later you will drown in garbage. Not being able to find the things you have, because of all the useless garbage around and on top of them, is much the same as not having them at all.

Anyway, cleaning up is always the very first step, and it's needed anyway for a fresh start with a new tool or concept. Maybe the job is already done with that alone.
If not, a more sophisticated way would be to use version control for organizing those directories as well.
Since version control keeps track not only of the different stages of single files but of whole directories, it can help you organize certain directories too, e.g. ~/my_scripts ~/bin /etc /usr/local/etc

For more complex things, more detailed definitions are needed of what shall be organized, and how, in detail.
Some content management systems could also help, and are worth a closer look.
Maybe you want/need to combine those, or combine version control with a database. Or maybe a simple spreadsheet already does the job for you, or you want your overview in another form...
What do you want automated, on what criteria? What can be automated?
Thank you for the detailed explanation.

But what is the conclusion from this? Again: for simple LACK OF A PROPER TOOL, everyone from the "pure *nix world" tries to solve the organizing problem with well-known tools - txt files (README), folders, even a VCS - and in the end arrives at a minimizing strategy (all this "cleaning up", "deleting the unnecessary", etc.), hypnotizing themselves with the "I don't really need this" mantra...

But none of these "tools from the *nix world" really solves the organizing problem.
Just take a look at Dash and SnippetsLab to understand what "code/snippet organizing" really means.

Happy to read your opinion on this. :)
 
You lost me.
On the one hand you're asking an open question vaguely; on the other hand you are pretty clear about what's mandatory to get things done the way you like, while at the same time implying it doesn't exist, even though there are tools offering such things...

Maybe I misunderstood, and you're not looking for a tool/solution/way, but want to know whether it would be worth the effort to write such a tool?
You are absolutely right: I am not someone who *just has a ton of scripts - useful, old and new, underdeveloped and polished, etc.* I am someone who has found a great tool for keeping a script (and code-snippet) library organized, BUT IS TRYING TO FIND A BETTER OR MORE USEFUL TOOL, because there is 101% certainly someone smarter (or with much bigger experience) than I am out there.

Really, I don't need *just one useful tool*; I NEED A COMPLETE, USEFUL WORKFLOW: from idea to writing, to testing, to deployment, and then to keeping the scripts/code organized.
And of course with cloud sync (because I need it on all my devices, mobile and desktop, home and work, ...) and integration with a VCS (meaning Git) and editors (like VSCode and NeoVim, since here we discuss everything related to FreeBSD).

Without all of this, any "just one great tool" would be incomplete and unusable. Like most of us here, I hate spending my lifetime improving the tool that is supposed to speed up my work. :)
 
But none of these "tools from the *nix world" really solves the organizing problem.
On DokuWiki, stuff is organized under namespaces. I have Linux distro notes under linux:distros, so Fedora's is linux:distros:fedora_workstation_gnome.
I named the namespaces and page names myself, and I know how to reference them quickly when I'm looking for something :p

I feel that if you sort the stuff yourself, you won't have a problem organizing it or knowing where things are. Dash and SnippetsLab look much more confusing to organize stuff under (it looks like they want to impose a certain style of categorization).

My thing is just notes (basically a DIY cookbook), but actual software projects with Git and multiple users might benefit from something different. For tags, putting spoiler/hidden text like #needoptimise in pages would allow them to be found in quick text searches, but I'm sure there's something more nuanced in DokuWiki for that!

And then they never touch the scripts for 2-3 months or so.
I make my scripts robust, to avoid rot or flimsy solutions that break on an update :cool:

Not a script exactly, but 3 commands for osu! work on any Linux distro, while the notes were made on FreeBSD. And those commands have worked for years. Every distro has Wine, and for winetricks a drop-in of the latest src also works on distros that don't package it (the distros I run do; anyone else can likely figure it out :p)

This is the largest desktop launcher I made for a game on Linux, and it was mostly copy/paste to FreeBSD (Java 8, no HiDPI), with the only really breakable things being upstream/their code, or FreeBSD unexpectedly changing JAVA_HOME='/usr/local/openjdk8' java (and if that happened in the future, someone could read the notes and change that path easily, with the rest of the scripts likely fine). I provide clean working notes, and later updates might need minor changes.
 
Really, I don't need *just one useful tool*; I NEED A COMPLETE, USEFUL WORKFLOW: from idea to writing, to testing, to deployment, and then to keeping the scripts/code organized.
And of course with cloud sync (because I need it on all my devices, mobile and desktop, home and work, ...) and integration with a VCS (meaning Git) and editors (like VSCode and NeoVim, since here we discuss everything related to FreeBSD).

Without all of this, any "just one great tool" would be incomplete and unusable. Like most of us here, I hate spending my lifetime improving the tool that is supposed to speed up my work.
If you want a complete, useful workflow, you might want to research Jupyter Notebooks. It's available on FreeBSD as well. Dunno about deploying, but it definitely supports writing and testing code, and a bit of organizing, too. Jupyter offers plenty of add-ons to help organize the workflow, and it supports lots of languages.

I just don't think anyone is trying to get you to write a new tool, but rather to string together a few useful tools into a workflow.

Sometimes, an 'All-In-One' tool is exactly what's needed, and sometimes, it's unnecessary complexity. Sometimes, you need a whole Content Management System, and sometimes, just a simple blogging platform is sufficient. Those Content Management Systems have perks and drawbacks.

Useful, Simple, Low-cost. Pick two, you can't have all three.
 
Tools? I think the base OS gives you all you need, if you use a little discipline. Here's what I do:

The login shell config files (for me, .bashrc) are shared across all systems I use, both as a regular user and as a sysadmin. They are all generated from one master copy, which has quite a few ifdefs in it: for OS-specific stuff (Linux vs. macOS vs. FreeBSD), host-specific stuff (certain things are only needed on my personal desktop, others on every machine in my home domain), personal use versus my place of employment, user versus root, and so on. There are 3 or 4 of these files (including .emacs and such), and they are under source control. Whenever I create a new home directory or set up a new computer, I copy them from source control. A lot of everyday-use scripts are actually stored in there, as either aliases or shell functions. Examples: sgrep (which greps everything in a source tree), or lower_well_fill_off (a single-line command to be used when the lower well water tank overflows, which it sometimes does).
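The per-OS and per-host branching described above can be sketched directly in sh. The specific aliases and the hostname are invented examples, not the author's actual settings:

```shell
# Sketch: one shared rc file branching per OS and per host.
# The aliases and hostname below are illustrative, not real settings.
os=$(uname -s)
case "$os" in
  FreeBSD|Darwin) alias ll='ls -lhG' ;;          # BSD ls color flag
  Linux)          alias ll='ls -lh --color=auto' ;;
  *)              alias ll='ls -lh' ;;
esac
# Host-specific block, analogous to "only on my personal desktop":
if [ "$(hostname 2>/dev/null)" = "mydesktop" ]; then
  export EDITOR=emacs
fi
echo "configured for $os"
```

The alternative the post mentions, generating the file from a master copy with ifdef-style preprocessing, trades this runtime branching for a per-machine build step; both approaches keep a single source of truth under version control.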

Several handfuls of scripts are stored in a "ralph_utils" source directory. The scripts have long and descriptive names, such as rename_scans (for mass renaming of documents that have been scanned into PDF files), find_unlinked (for directories that are supposed to be served by a web server: it finds files that aren't linked to by any .html files, and finds links in .html files for which the corresponding file is missing), copy_from_sd_card (copies pictures from the SD card of a camera, places them in the correct directories, groups them into lists for review, and backs them up to the server), and spread_photos (takes many images from a camera, spreads them into a directory hierarchy with one directory per year, month, and day, downsamples them to thumbnails, and then creates empty HTML indices). There are also system utilities, in particular for embedded Linux RPis, such as one for powering them off remotely, called poweroff. The reason for the long file names: most of the time, doing "ls" in those directories already tells me what each script does.

The next step: every script has a paragraph or two of description at the top of the file: why it exists, when you would use it, how it works internally, how you control it, what assumptions it makes. For example: "Script to copy all the photos from the security camera SD card at the gate. This only works on a Mac. The SD card has to be mounted already, at /Volumes/GATE. A previous copy of the content already has to be stored in ~/Desktop/GATE. The copying is done using rsync, which has to be installed. It aborts if the directory structure on the SD card looks wrong, but if you aren't sure, run it with the -n option first to get a list of what would have been copied. After copying, it groups the photos into lists of at most 100, which are left in /tmp/GATE.list.aa, .ab, and so on. The newly copied files are automatically backed up to the home server; this is done last, so you can start reviewing the images from the lists while the backup is running."

The reason each file has such an elaborate comment: if you can't find the script, just grep the directory where all scripts are stored for keywords, and you will likely find something.
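The header-plus-grep convention can be shown in a few lines. The script name below happens to match one from the post, but the header text and directory are invented for illustration:

```shell
# Sketch: a descriptive header comment makes a script findable by keyword.
bin=$(mktemp -d)                     # stand-in for a scripts directory
cat > "$bin/copy_from_sd_card" <<'EOF'
#!/bin/sh
# Copy all photos from the camera SD card into per-date directories,
# then back them up to the home server with rsync.
# Assumes the card is already mounted; use -n for a dry run.
EOF
# Can't remember the script's name? Grep the headers for a keyword:
grep -ril 'sd card' "$bin"
```

The -l flag prints only matching filenames, so the keyword search doubles as a catalog lookup over the whole scripts directory.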

The source code directory should have an elaborate makefile. It doesn't; there is usually just a file called "install.txt" which shows which commands should be used to install things. I just cut and paste the necessary commands.

So ultimately, instead of using a tool, just use the existing infrastructure of directories, source control, file names, and comments in files.
 
Really, I don't need *just one useful tool*; I NEED A COMPLETE, USEFUL WORKFLOW: from idea to writing, to testing, to deployment, and then to keeping the scripts/code organized.
And of course with cloud sync (because I need it on all my devices, mobile and desktop, home and work, ...) and integration with a VCS (meaning Git) and editors (like VSCode and NeoVim, since here we discuss everything related to FreeBSD).
Now you got me again. (And at least partially I got you right, though) :cool:

What you're describing, or rather searching for a solution to, is not limited to shell scripts;
it has been an issue in professional software development in general since the dawn of computers.

There are ways, and also many discussions, since nobody has yet found The Egg of Columbus; and personally I highly doubt anyone ever will.
Don't be too focused on tools alone. As I already mentioned above: tools support ways. You may learn a way because of a tool, but this doesn't mean tools create ways. It's the other way around.

If you get to a point like yours, it's always a good idea to reflect on your own way(s), look at other ways, and reconsider whether changing your way, or going a completely different way, would be a solution. Which for sure does not mean you have to - absolutely not. Nor does it mean you should always search for other/new tools.
If life has taught me anything, it's that many (most?) problems can be solved by simply changing/adapting one's personal perspective.
Which, again, does not mean you shouldn't keep searching for better ways and tools.
But again: without a new, better, more sophisticated way, any new tool cannot be anything other than another copy of what already exists.

However, the feeling that, after all, 97% of all unix(-like) users know how to deal with this seems proof enough for me that there are ways. 😁
 
Tools? I think the base OS gives you all you need, if you use a little discipline.
Maybe not all, but pretty much already, yes.

Of course you cannot do everything perfectly suited to your own ideas with the basic shell only. On the other hand: what does?

In my eyes, the fathers of Unix (and its philosophy) already faced this issue: it's far better to have a set of smaller tools that you learn, learn to use, and above all learn to combine, than to attempt the impossible task of having one specifically tailored tool for every individual task each user may have. Such a tool would become large, complex, complicated, and usable for that one job only; you either get one large, ugly one-size-fits-nobody, or you have to tailor a personally suiting tool for each user individually, which is simply impossible. So that's the core idea of the Unix philosophy: everybody tailors their own tools. Provide toolsets, not turn-key solutions.

But of course, in a world where GUIs are the default, you have to engage with all those little text-only ("boring!") tools the shell gives you, understand the basic concept, grasp the founding idea, think, analyse your work style, and develop your own concepts and ways - which can itself be seen as a kind of programming. That needs effort, a kind of starting investment, while all those large 'software suites' with their colorful GUIs, containing thousands of buttons, menus, and stuff, promise: "That's tedious effort. Not needed. Just use me!" - overlooking that effort cannot be spared. It is just transformed into another form, and for that price it's more inefficient in the long term, so eventually even more effort is needed.
What large software suites sell you is the promise that you don't need to think for yourself. That you could buy precast ideas, concepts, and ways, and that because of a GUI you don't even need to learn how to use it - which simply isn't true.

This distracts from ways, strategies, and concepts, and can make one focus on tools, with the illusion that anything can be done if only the right tool were there. Or vice versa - even worse:
it cannot be done as long as the perfectly suiting tool does not exist.

Well, some software suites surely have their justification, but in my eyes it's a kind of disease of our modern society to stare at them, feeling one is missing something if no such thing exists. With the Unix philosophy you cannot sell much, while jacks-of-all-trades always sell. That way the belief keeps being nourished that for any problem there simply has to be the proper tool. So the search never ends. So infinite tools can be sold. 😁
 
There's a reason we use git instead of manually creating and naming versioned directories... There's a reason we tell people to learn git, rather than telling them to write their own scripts to read files, traverse directories, and decide what gets pulled together to compile. You can either learn the proper git commands to accomplish a task, or you can write a truckload of bash scripts that need maintenance... Or better yet, use punch cards.

My point is, one needs to ask, "At what point would chore automation make a difference? In what way?".
 
There's a reason we use git instead of manually creating and naming versioned directories... There's a reason we tell people to learn git, rather than telling them to write their own scripts to read files, traverse directories, and decide what gets pulled together to compile. You can either learn the proper git commands to accomplish a task, or you can write a truckload of bash scripts that need maintenance... Or better yet, use punch cards.
With all my appreciation for your efforts and passion here on the forum, let me note:

If you are not just a "home enthusiast spending free time in a basement garage with a DIY rack full of loud used HP servers", you definitely hate spending time on anything outside your particular case: you have a job and a family, and need to spend time exactly on them, not on learning "yet another tech thing from the IT world".

In short: you need to create, test, and deploy scripts/code, or find something in your own library, make small corrections, test, then deploy and close the task.

So, from this point of view, it would be good to use normal CI/CD practice, meaning GitLab/GitHub as the standard (with the excellent Tower Git client) for script/code development.
My point is, one needs to ask, "At what point would chore automation make a difference? In what way?".
For now I prefer VSCode and NeoVim for coding shell scripts (and I am more than sure these are the best editors and the standard for shell scripts today).

But about "how to organize scripts/code snippets" I am not so sure:
GitLab/GitHub (with the advantage of cloud storage)
OR
the stand-alone Dash and SnippetsLab apps

for all scripts/code snippets.

And because of this doubt, I am asking here.
 
OK, if no one proposes more advanced tools, for now the above stack will remain unchanged for a year until the next iteration...

THANK YOU ALL FOR YOUR PASSION AND EFFORT!
I really hope this thread helps someone keep their shell code organized.
 