When I read the headline I thought this was going to be a terrible idea but I quite like it, especially the bit about using tab to list all your tooling.

Anecdotally, I haven't had many namespace collisions recently. I've also let myself go a bit after going into management. My tech skills are 10 years out of date.

Any tips from someone else on where they started to be hip again?

  • L8D · 3 weeks ago
I think the hip thing to do is just use your spare time to build stuff, and organically learn from building stuff. Self-sufficiency, creativity and generally being a self-starter are what's hip. More time spent building gives you an incentive to pick up tricks and to rely on yourself for hip solutions to actual problems.
I'm also in "management", and likewise feel like my skills will atrophy if I don't keep up.

What I do is make tools to make my life easier. For example, if there's a web service at work I use for mundane lookups I'll find out if it has an API and write a CLI for it to speed up my daily grind. Once it's tuned to my liking I'll share it with the team. I do struggle to convince others to try it. Not sure why. But I don't really care because I use the tools every day.

I built my side project in a separate, trendier stack than the one I’m comfortable with at work for this very reason.

Just to see some new perspectives and be conversant in the trends.

Some of the new stack now crosses over to work, and I have a deeper appreciation for some of the older pieces.

> Because my shell script names tended to be short and pithy collections of lowercase characters, just like the default system commands, there was no telling when Linux would add a new command that would happen to have the same name as one of mine.

Not sure I understand this problem. I just put my bin directory at the front of $PATH rather than the end. To browse my commands, I simply `ls ~/bin`.
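For what it's worth, the setup being described is a one-liner in a shell rc file; a minimal sketch, assuming bash/sh and a ~/bin directory:

```shell
# in ~/.bashrc (or ~/.profile): personal bin first, so your scripts win
export PATH="$HOME/bin:$PATH"

# browsing your own commands is then just:
ls "$HOME/bin"
```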

But then you start using some tool that expects $0 to be resolvable via the system path, and it breaks, causing frustration and debugging.

Pick your poison

Fair point. I've been doing this for years and none of my scripts have ever caused any (noticeable) breakage, though.

The danger is also mitigated because I only modify my own user's shell rc file. Any daemons running as root or their own user are unaffected.

The article is from 2009. Plenty of prevalent short-named tools have been released since then, causing exactly this sort of issue. Just thinking about it, "node" and "npm" come to mind.
> causing exactly this sort of issue

POTENTIALLY. This (for me, anyway) is a solution in search of a problem. I've been using some sort of *nix since the early '90s, writing scripts and commands and aliases and functions the whole time, and this has never once happened to me.

It obviously _can_, and PROBABLY has, to some. But it's more a "can" than "will". Maybe I just name my homebrew stuff in a way that's unlikely to collide, dunno.

In 35 years of using Unix I've never come across anything like this. What tool do you use that this is an issue?
Mostly build tools. I also had a custom script called ip from well before Linux shipped theirs, which caused some problems.
I put ~/bin/ at the start of my path as well. It works very well for me, but I also have to admit that I've never experienced a name collision from doing this as far as I'm aware.

I'm not on the comma train here because it makes the name ugly and confusing, and doesn't solve any problem that I have. But it is a clever hack for those who do have this problem.

What is "system path"? Every tool inherits the caller's path. I don't think this could be a problem unless a tool talked directly to systemd to execute itself.
Default PATH minus ~/bin.
There is no "default PATH", the default is inheriting the parent process's.
I’m here to communicate not be pedantic. I’m sure you could understand what I meant.
In the context of "some tool that expects $0 in the system path...", I do not understand either.

If a tool looks up a command name "x" given to it, it just takes $PATH and goes through it: the same $PATH as in your shell when you call "x" directly.

Thinking about it more, you must have been thinking of something like putting "@daily mycommand ..." in a crontab and then being annoyed by it not finding your command. But then the problem is not that some tools expect a "system path"; it's that some tools defiantly override the inherited path of their own accord. Which is totally unnecessary: the environment is called an environment because it is prepared for you (the given program) to run in.
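If the crontab guess above is right, the usual fix is to declare PATH in the crontab itself, since cron gives jobs a minimal environment of its own rather than your interactive shell's (a sketch; the paths are examples, and "mycommand" is the hypothetical script from above):

```shell
# crontab -e : cron jobs don't see your shell's PATH, so set one here
PATH=/home/me/bin:/usr/local/bin:/usr/bin:/bin

@daily mycommand
```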

  • 3 weeks ago
I just… remember the names I give things? Why is this considered a “hack” of some kind?
It may not be useful to you, but it seems a bit much to then jump to the conclusion that it must be useless for everyone.

I forget names all the time. I even forget I wrote entire projects sometimes. That's why I try to organise my systems in a way I can easily stumble upon things I haven't thought about in months, or years.

I find this article's approach actually solves a problem for me. I do find myself going back to the ~/bin folder once in a while to look for some script I use less often. So at least for N=2, it's a cool hack.

  • dizhn · 3 weeks ago
I go to my bin dir and list the files.
One of the benefits is fzf autocompletion. E.g. in fish, type the first character of the command and Tab to launch fzf; `,`+Tab is then a quick way to filter down to just these custom commands. By comparison, `ls ~/bin` is a lot of characters for something I do a lot, or maybe I'd be doing `ls`+UpArrow+UpArrow+UpArrow to recall it first.
The “ls “ part is not necessary for tab completion.

    ~/bin<tab><tab>
is enough. Not that short, but it's something done infrequently in my experience. Maybe I'd do it more if it were easier?
ls ~/bin is wayyyyy slower to type than ,<tab>
You could make a command called , that runs ls ~/bin ...
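As a sketch, such a `,` command could be nothing more than this (assuming a ~/bin directory that's on your PATH):

```shell
#!/bin/sh
# hypothetical ~/bin/, -- list all of your personal commands
ls "$HOME/bin"
```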
Name it ","
Yep, that's what I said.
Dang, sorry, I misread your comment.
I'm not dang.

</s> :)

MISSING START TAG
I use short custom command names like aa, st, di, dp, cm and le in some thin wrappers around git.

One of these names actually collides with a utility that is installed by default on some systems.

Doesn’t matter to me. I have my own bin dirs before the system dirs in my path, so mine “win”, and I’m not really interested at all in the tool that mine has a name collision with.

If someone were to make a tool useful to me that collided with one of my own tools, I'd probably sooner alias that other tool to a new name that didn't collide with mine than change any of my own tool names.

It’s just too comfortable to use these two-character tools of mine.

This approach can lead to issues like `apt-get upgrade` launching Dwarf Fortress:

https://askubuntu.com/questions/938606/dwarf-fortress-starti...

  • owl57 · 3 weeks ago
I hope that tools like apt-get run subcommands in a predictable environment and never leak interactive session's $PATH. Then nothing in $HOME will break them — that user just put his df executable in /usr/bin or somewhere like that.
This. 1-3 character names should be reserved for user aliases/functions/scripts and standard utilities.
  • 3 weeks ago
I tend to make all my git aliases two-letter combos starting with g. So `gs` is `git status`, for example. Once in a while I actually really need Ghostscript though; it's brilliant for e.g. embedding fonts into PDF files. Usually I then go with `env gs`.
I have `alias s='git status'` and I feel like I'm missing a finger when it's not there.
All my personal commands begin with 'j'. Got real fun when java came around. Using commas is a rather interesting idea.

But at least I did not start them with a 'k' (KDE) :)

Hot take: system commands shouldn't be as accessible as user commands. There should be some sort of namespacing. For example, mkfs should be invoked with `sys::mkfs` or something like that.

The line that separates system and user commands may be defined in different ways, and it may be fuzzy in places, but if a user accidentally invokes a command that they don't even know why it's there, and that they didn't explicitly install, then that's clearly a command that shouldn't be directly available in the global namespace.

This is the original reason for having /bin vs /sbin

The sbin directories are supposed to contain "system" (or superuser) commands, and regular users should NOT have those directories in their PATH.

This has been broken for a long time on every distribution I've looked at though.

I'm unsure if RHEL/CentOS did this, but at one of the places I worked you had to type /bin/ls explicitly. I expect this is much more common at places that do lots of acquisitions: that way, if some random cron job relies on UB from some tool in order to not set the DC on fire, it doesn't matter. Admins use the latest tooling; the software gets whatever it came with and was working, until actually competent devs can look at it and move it to Ansible or Puppet/Chef or whatever they all used.
A kinda relevant question.

I use Windows most of the time. Like the author, I have a bunch of CLI scripts (mainly in Python) which I put into my ~/bin/ equivalent.

After setting python.exe as the default program for the `.py` extension, and adding `.py` to `%PATHEXT%`, I can now run my ~/bin/hello.py script from any path by just typing `hello`, which I do hundreds of times a day.

I now use Linux more and more (still a newbie) but I've never gotten it to work similarly here.

Firstly, Linux seems to have no concept of "associated program", so you can never "just" call a .py file and let the shell know to use python to execute it. Sure, you can chmod +x the script, but then you have to add a shebang line directly to the script itself, which I always feel uncomfortable since it's hard-coded (what if in future I don't want to execute my .py script with `/usr/bin/python` but `/usr/bin/nohtyp`?).

Furthermore, I failed to find any way to omit the `.py` part when calling my script.

Again, none of the above is meant to question the design of Linux -- I know it comes with lots of advantages.

But I really, really just want to run `hello` to call a `hello.py` script that is in my $PATH.

> then you have to add a shebang line directly to the script itself, which I always feel uncomfortable since it's hard-coded (what if in future I don't want to execute my .py script with `/usr/bin/python` but `/usr/bin/nohtyp`?)

> But I really, really just want to run `hello` to call a `hello.py` script that is in my $PATH.

On Linux I'd say the shebang is still the right tool for this. If you want a lightweight approach, just have a `my_python` symlink in your path, then your shebang can be `/usr/bin/env my_python` (or heck just `/foo/bar/baz/my_python`, /usr/bin/env is already an abstraction).

If you want a more principled approach, look at the `update-alternatives` tool, which provides this sort of abstraction in a more general way: https://linuxconfig.org/how-to-set-default-programs-using-up...
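A sketch of the symlink indirection described above (`my_python` and the paths are made-up names, and this assumes ~/bin is early on your PATH):

```shell
# one level of indirection that you control:
mkdir -p ~/bin
ln -sf /usr/bin/python3 ~/bin/my_python

# scripts then begin with:   #!/usr/bin/env my_python
# and retargeting every script later is a single command:
#   ln -sf /usr/bin/nohtyp ~/bin/my_python
```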

  • ffsm8 · 3 weeks ago
> /usr/bin/env is already an abstraction.

Isn't that path and the behavior of the binary defined by POSIX though? I thought it's as stable as you can get.

That's why it's usually recommended that you use /usr/bin/env bash vs /bin/bash in the shebang, as the latter isn't defined by POSIX.

> Isn't that path and the behavior of the binary defined by POSIX though? I thought it's as stable as you can get.

I don't see anything about the path being defined. Certainly possible I missed it, though.

I'm not able to check right now but I vaguely recall that I've used a system in the past with env in a location other than /usr/bin.

Yes, env is an abstraction for the sake of portability, but if you're setting up custom indirections then portability probably isn't much of a concern.
  • o11c · 3 weeks ago
Others have already mentioned how to fix your problem, but I just want to mention one thing about why:

On Linux (really, all platforms other than Windows), file extensions are much less of a thing; executables of any kind have no extension, just the +x flag (among other things, this means you can rewrite them in another language without breaking anything).

The .py extension is only relevant for modules meant to be imported; for scripts being run, if you really need to know, you are supposed to look at the shebang (usually #!/usr/bin/env python for externally-distributed scripts; this gets overwritten to #!/usr/bin/python or whatever for scripts packaged by the distro itself).

Note also that, while shebangs don't support multiple arguments, the GNU version of `env` supports a `-S` argument that can emulate them (the argument length problem remains though).

File name extensions.[1]

I'm not saying you're wrong, but let's be clear about what these are. I would point out that Linux inherited some, but not all of its naming conventions from Unix (as did macOS), but at least here, that is a secondary concern.

Carry on...

[1]: https://arstechnica.com/gadgets/2001/08/metadata/

Simply remove the .py from the filename. It's perfectly acceptable to call it "hello".

I can't think of a downside to the shebang. If you really wanted to run the script with a different interpreter, just specify it. "nohtyp hello" or whatever.

If that still bothers you too much, you could define an alias in your shell startup. For example, in bash, you might do:

alias hello="python3 /path/to/hello.py"

If you were so inspired, you could even write a short script to automatically create such aliases for the contents of a directory you specify.
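A sketch of such a generator (all paths and names here are hypothetical): it writes one alias per `.py` script into a file, which you then source from your shell rc.

```shell
# regenerate ~/.py_aliases from the scripts in ~/bin
for f in "$HOME"/bin/*.py; do
  [ -e "$f" ] || continue    # skip if the glob matched nothing
  printf 'alias %s="python3 %s"\n' "$(basename "$f" .py)" "$f"
done > "$HOME/.py_aliases"

# in ~/.bashrc:  source ~/.py_aliases
```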

Node seems to be particular about whether or not the filename ends in .mjs.
> Firstly, Linux seems to have no concept of "associated program", so you can never "just" call a .py file and let the shell know to use python to execute it.

There is, but not in the shell syntax. It's an application concern, normally delegated to the desktop/GUI.

For shell scripts, the executable is usually declared in the script itself, by adding a shebang and making the file executable. Think of the shebang as a file extension of sorts.

If

  chmod +x ./malware.py
  ./malware.py
does not work, check the path the shebang points to.

That being said, as long as an interpreter can execute a script passed as a regular argument, you should be able to get this behavior for xdg-open too:

  xdg-open malware.py
if you really want to do that by default.

This should be equivalent to double-clicking the file in the default file manager, IIRC (I'm on a Mac now).

I had an alias "xop <file>" when using Linux as my primary desktop OS.

But I only used this for data files (images, documents etc) where the default already works.

Wouldn't recommend setting an interpreter as default for executable scripts.

You might want to not execute scripts by default, instead opening them in an editor for example.

xdg-open is a freedesktop.org (xdg-utils) thing, so it's not tied to any one desktop. I know it from Xubuntu (so Xfce).

So I'd really advise against that. But if you want to execute all Python files by default in any GUI context too, you could set that kind of default there.

'man xdg-open' might help, or maybe you could even select a specific Python executable as the default for .py files after double-clicking in the File Manager.

Again, bad advice

On Ubuntu and probably Debian, the 'mailcap' package is (I think) installed by default. It provides the 'see', 'view', 'edit', 'print' and 'compose' commands which open a file in a suitable default program.
> but then you have to add a shebang line directly to the script itself, which I always feel uncomfortable since it's hard-coded

It won’t directly help reach your goal, but it is semi hard-coded. The ‘correct’ (but see https://unix.stackexchange.com/a/29620 for some caveats) way to write a shebang line is

  #!/usr/bin/env python
That will make it run the first python in your path.

> what if in future I don't want to execute my .py script with `/usr/bin/python` but `/usr/bin/nohtyp`?

You could create a symlink called python to /usr/bin/nohtyp on your system in a directory that’s searched before /usr/bin (e.g. by adding ~/myCommandPreferences to the front of your PATH)

To be excruciatingly correct, we should specify python2 or python3, because they can't interoperate and probably never will.
I think at this point we can rest easy that python2 has finally been fully purged from default installs. Heck a few weeks ago I installed Kubuntu 24.04 base and there was no Python at all...
Debian stable doesn't have "python" anywhere on my PATH, for a good reason IMHO. Shebangs say "/usr/bin/env python3" or hardcode "/usr/bin/python3".
> Firstly, Linux seems to have no concept of "associated program", so you can never "just" call a .py file and let the shell know to use python to execute it. Sure, you can chmod +x the script, but then you have to add a shebang line directly to the script itself, which I always feel uncomfortable since it's hard-coded (what if in future I don't want to execute my .py script with `/usr/bin/python` but `/usr/bin/nohtyp`?).

Might be you could use binfmt_misc for that.

https://www.kernel.org/doc/html/latest/admin-guide/binfmt-mi...

Here is an article about using binfmt_misc to make .go files executable. I assume something similar could be done for python:

https://blog.cloudflare.com/using-go-as-a-scripting-language...
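For reference, a binfmt_misc registration for .py files might look like the following (needs root, Linux-only, not persistent across reboots unless your distro provides a config mechanism; `pyscript` is just a made-up rule name):

```shell
# register string fields are :name:type:offset:magic:mask:interpreter:flags
# type E means "match by file extension" (here: .py)
echo ':pyscript:E::py::/usr/bin/python3:' | sudo tee /proc/sys/fs/binfmt_misc/register

# afterwards a chmod +x'ed hello.py runs as ./hello.py even without a shebang
```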

  • o11c · 3 weeks ago
binfmt_misc is only useful when a file format does not allow shebangs.
Or if you don't want to use shebangs, as mentioned.
You can use the /usr/bin/env python shebang line to work across python locations.

I keep all my scripts in ~/git/$Project and symlink them into ~/bin and I've added ~/bin to the end of my path.

The normal way here is to name your script simply `hello`, start it with a shebang reading `#!/usr/bin/env python3`, and mark it executable. This of course makes running it as `hello` work (if you put it in PATH), but also:

- The shebang is only specially interpreted by the Linux loader, i.e. when executing the file directly.

- You can still run it with any other interpreter in the standard way: `nohtyp ~/bin/hello`. Python comments start with `#`, so the shebang does nothing with programs expecting Python code.

- This situation (a script without an extension) is common on Linux, so Linux-aware editors understand the shebang to indicate a file type. At least, vim understands this and automatically detects a python file type without the .py extension.

I get your wish of Windows-like behaviour, and even if you might be able to conspire to have Linux behave the way you want, it's certainly not how people expect it to work, so prefer the above scheme for any software you send to others. :)
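Concretely, the scheme described above looks like this end to end (the file name and location are just examples):

```shell
# create an extensionless script with a shebang and make it executable
mkdir -p ~/bin
cat > ~/bin/hello <<'EOF'
#!/usr/bin/env python3
print("hello")
EOF
chmod +x ~/bin/hello

# with ~/bin on PATH:      hello                 # runs via the shebang
# any other interpreter:   python3 ~/bin/hello   # the shebang is just a comment
```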

You could have a shim script with the shebang that only exists to call the "real" script using Python.

/usr/bin/hello:

  #!/usr/bin/bash
  python3 /usr/bin/hello.py
/usr/bin/hello.py:

  print("Hello, world!")
Console:

  $ chmod +x /usr/bin/hello
  $ hello
  Hello, world!
Why is this response being downvoted? Just off the top of my head, Sublime and VS Code do this. I prefer `/usr/local/bin` since your package manager won't touch it.

Sublime:

  /usr/bin/subl:
  #!/bin/sh

  exec /opt/sublime_text/sublime_text "$@"
VS Code:

  /usr/bin/code:
  ...
  # Launch
  exec /opt/visual-studio-code/bin/code "$@" $CODE_USER_FLAGS
  • jraph · 3 weeks ago
> you have to add a shebang line directly to the script itself, which I always feel uncomfortable since it's hard-coded (what if in future I don't want to execute my .py script with `/usr/bin/python` but `/usr/bin/nohtyp`?).

I believe the elegant solution to this is update-alternatives, which lets you tell the system which actual program to call. I haven't looked into it much, but it seems like it might interest you in particular. That's the closest equivalent to file association for the UNIX shell, I would guess.

You could also have a specific folder that you control in your PATH that symlinks to the Python you want to use.

This handles the default, but you can still call your script with the program you want if you ever wish to bypass that.

Linux can do this. There are probably more ways, but this is off the top of my head.

Use an alias which you set up in your initialization scripts. You alias "hello" to "python3 /yada/yada/hello.py", which is essentially what Windows is doing for you behind the scenes.

  • Dove · 3 weeks ago
You could even write a script to traverse all the files in your bin directory and make all the aliases.
> what if in future I don't want to execute my .py script with `/usr/bin/python` but `/usr/bin/nohtyp`?

You're thinking too much; people have had that shebang for 20 years without any problems.

> But I really, really just want to run `hello` to call a `hello.py` script that is in my $PATH.

I don't really understand why you're so adamant about this. Either make a Python "hello" script with a shebang, or just tab-complete hell<tab>, which you should do with most commands anyway, so the .py doesn't matter.

Another option would be an alias, but you'd have to do that manually for every frequent script you need.

Tab completion does not work if the script isn't in the CWD, which is the case here since all my scripts are in ~/bin/.
Tab completion should still work if ~/bin is in $PATH.
You don't need the file to be named "hello.py". You can name it just "hello", with the right shebang it will work fine.
Can't you use a shebang in Linux with .py files? And as for removing .py, just remove the extension and make it executable, or use a symlink.

from a google search, https://stackoverflow.com/a/19305076

I'm not at the computer now to test though

  • 3 weeks ago
> I always feel uncomfortable since it's hard-coded

I used to think this as well, but I've since come around to the opposite view. Having it as a "requirement" for what's likely the most popular CLI execution strategy enforces a (somewhat disorganised but still useful) de facto standard across all scripts. I can open up any repo/gist/pastebin in the world and chances are, if it's intended to be run, it'll contain this handy little statement of intent on the first line. I might not actually be able to run it (env-dependent), but I'm sure I can make it run.

On the env-sensitivity though, if e.g. you're running nohtyp, as another commenter mentioned, /usr/bin/env has that covered.

  • meken · 3 weeks ago
Your example got me thinking about the difference between how the Windows shell and the Unix shell are designed. Seems like the Windows shell knows about extensions, whereas the Unix shell does not.

That’s an interesting feature for a shell to have. Thanks!

It can be traced back to the roots of each OS. Windows has its heritage in the land of DOS, where files had an 8-character name with a 3-character extension, and that extension carried meaning for the OS.

Linux is of Unix ancestry, which had no such concept as a file extension. It was the responsibility of the application or kernel to discern what type a file was, typically by the first few bytes of the file, and handle it appropriately.

I personally am a fan of the Unix way but I can see why some might prefer the DOS convention.

Speaking of heritage... 8+3 goes back at least to DECSystem-10 on PDP-10s.
Everything goes back to the PDP-10; we are all using the incestuous offspring of DEC.

Whether that is good or bad is left as an exercise for the reader.

A lot of it came from RSX-11 features that got rolled into the PDP-10 OSes.
6+3 on DEC TOPS-10.
  • 3 weeks ago
Linux has a feature called 'binfmt_misc' which allows you to associate an interpreter to run when a file of any arbitrary format is invoked like an executable. You have to tell the kernel which extension or format is associated with which interpreter, but this is easily done in a startup script.

You still have to say '.py' at the end, though.

You can run /usr/bin/nohtyp hello.py even on a script with a shebang specifying a different executable.

To remove the .py just rename the file to “hello”, or keep “hello.py” and create a symlink or a shell alias called “hello” that points to it.

You can utilize `command_not_found_handle()` for the extension-less behavior.
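A sketch of that approach for bash (the hook name `command_not_found_handle` is real bash; everything else here, such as the ~/bin layout and the python3 fallback, is an assumption):

```shell
# in ~/.bashrc: if "foo" is not found, try running ~/bin/foo.py with python3
command_not_found_handle() {
  local script="$HOME/bin/$1.py"
  if [ -f "$script" ]; then
    shift
    python3 "$script" "$@"
    return $?
  fi
  printf '%s: command not found\n' "$1" >&2
  return 127
}
```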
  • 3 weeks ago
An alternative method for avoiding collisions in PATH is to use really long executable names that are unlikely to be used by other executables and then have shorter aliases for them in your bashrc. The aliases won't affect executables called from within scripts and you can still refer to your executables by their long names in your own scripts.

One drawback is that this doesn't have the same tab completion ergonomics, which I have to admit is really nifty.

EDIT: And another is that collisions can still occur in scripts that need to be sourced rather than executed as a sub-process (like Python's venv activation scripts). But those are rare.
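A sketch of the long-name-plus-alias split described above (all names here are made up):

```shell
# the executable on PATH gets a verbose, collision-proof name:
#   ~/bin/my-disk-usage-report
# the shorthand lives only in ~/.bashrc, so scripts never see it:
alias dur='my-disk-usage-report'
# your own scripts keep using the long name:  my-disk-usage-report
```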

> One drawback is that this doesn't have the same tab completion ergonomics, which I have to admit is really nifty.

They do in zsh

I meant specifically that you can quickly see all your custom scripts (and only them) by tab-completing comma itself. Of course aliases have regular tab completion in bash as well.
Ah, I thought you were referring to completion of parameters etc. not working for aliases. I checked again now, and that does not work in bash.

Of course, your own custom scripts usually won't have such fancy completion, and in any case you'd need to configure this; setting it up for both the long and short versions is not that much hassle.

Starting with comma is also a common technique in the text expander / text replacement community.
Yup. Most of my vim mappings begin with ,
I was recently poking around ~/.local/bin/ when I noticed that it had dozens of executables that I don't remember putting there. Mostly pyside things, but some other scripts as well. I really had to open each to jog my memory, especially about which scripts I had written myself and which were by other people.

The idea of starting my own scripts' names with a comma would have made the job go much faster, and I'm sure it would have helped jog some memories about why each script was written, before opening it.

  • o11c · 3 weeks ago
Normally ~/.local/bin/ is only for installed scripts; locally-written ones go in ~/bin/
Thanks. I used to use ~/.bin/ for locally-written scripts because I like that directory to be hidden, maybe I'll go back to doing that.
No thank you. Put your personal bin first in PATH, and use /usr/bin or /bin for referring to the shadowed programs.

You can list your personalized tooling using ~/bin/[Tab] for whatever value there is in that.

I don't understand why you would shadow the system utilities with your own, yet not want to use identical names, so that you have to keep remembering to use the comma.

If you don't like the system's grep (e.g. Solaris grep or whatever) but prefer your own (e.g. GNU grep), why wouldn't you just want that to be "grep"?

Encountering this idea 5 years ago allowed me to bring order to my bag of shell tricks! I have over 50 ,commands between aliases and ~/bin, and my shell life is way smoother than with the previous agglomeration.
  • jwilk · 3 weeks ago
Discussed in 2020:

https://news.ycombinator.com/item?id=22778988 (90 comments)

  • dang · 3 weeks ago
Thanks!

, macroexpand:

Start all of your commands with a comma (2009) - https://news.ycombinator.com/item?id=31846902 - June 2022 (121 comments)

Start all of your commands with a comma (2009) - https://news.ycombinator.com/item?id=22778988 - April 2020 (89 comments)

I see what you did there...
  • sre2 · 3 weeks ago
I've been doing this for at least a decade. Was introduced to the idea by a colleague who might have read this blog post.

I usually do the same with commands where you are able to create sub-commands too, like git-,home (which allows you to run `git ,home add -p`; it conveniently sets GIT_DIR to something and GIT_WORK_TREE to $HOME). Sadly you can't do it with git aliases; I have to live with those starting with a dot (via '[alias ""]' as a section).

This works for text-expansion snippets too. All of the text expansions that I do with Alfred (other tools might work too) are comma phrases. I realize that writing English or even programming scripts, I'd never (not so far) encounter a word or text that starts with a comma immediately followed by anything. I used to use a period, but I have stumbled on instances such as file extensions, where a period can be followed by words.
> I'd never (not so far) encounter a word or text that starts with a comma immediately followed by anything.

Agreed. I prefer using `!bang`s for the same reason for expanding text.

The problem is that *nix has system utilities whose binaries occupy exactly the nice shorthand names that are useful during interactive use, and shadowing them may cause problems during script execution.

If we could go back to the drawing board, I'd say every system utility should have a verbose name, with some kind of aliasing system that provides easy shorthands for them. Then the shorthands could be replaced easily, with the verbose names being used in scripts.

This might seem like a moot point, since we can't go back to the drawing board, but many projects continue to make the problem worse by insisting on naming their binaries like we're still living with the constraints of the '80s. I guess it gives them the flavour of "real" system utilities. It would be nice if projects stopped doing that, but oh well.

Doesn't work with PowerShell (which, to be fair, was quite new at the time this blog post was released).

But honestly, while 2- or 3-letter aliases are tricky, I've very rarely had issues with 4-letter aliases. There are 456k possibilities. On my small openSUSE install, my PATH contains only 105 4-letter executables.

What doesn't work about it? I made ",foo.bar"

and I went into PS and typed ,<tab> and it said:

> PS V:\> & '.\,foo.bar'

The goal is to have short aliases. That explicit call syntax isn't exactly convenient. That's what doesn't work.
I typed a literal comma, and then the Tab key, and it put in the baker's dozen other characters. I only did this to see if "this doesn't work in PowerShell" was true; turns out it is not.

If I have ,foo.exe and ,cuda_install.exe in a directory (or on my path), it's two characters and then a Tab, same as Linux, to run either of them: ,c || ,f

anyhow, it was for my own edification.

  • neilv · 3 weeks ago
With command completion, another option is to use descriptive names.

Just a few examples on this machine: backup-workstation-to-foo, backup-workstation-to-usb-restic, make-label-for-laptop-battery, set-x-keyboard-preferences, update-pocketbook

For one-letter and two-letter commands that might conceivably overlap with some command in some package someday (e.g., `gi` for `grep -i`), I only do those as interactive shell aliases. So they shouldn't break any scripts, and I'll know if someday I'm typing one of those and getting something different than intended.

In a few cases, those one-letter aliases have been for very-often-used scripts of mine.

I'm a fan of using argc for this <https://github.com/sigoden/argc>. I have my `~/.local/bin/.argc` file, which has a bunch of commands that I wrote. The commands have inline documentation and documented parameters. Quite nice for rarely-used scripts!
Great idea! And if for some reason you feel like your filenames should stay as they are (without a comma), you could just add symlinks to all executable files in your bin directory:

  $ cd ~/bin
  $ for x in $(find . -type f -perm /a=x -exec basename {} \;) ; do echo $x ; done
  temps

  $ for x in $(find . -type f -perm /a=x -exec basename {} \;) ; do ln -s $x ,$x ; done

  $ ls -l
  total 4
  lrwxrwxrwx 1 tanel tanel   5 Jun 23 16:38 ,temps -> temps
  -rwxr--r-- 1 tanel tanel 251 May 30 23:26 temps
  • sctb
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
Similarly, some Lisps (like Scheme48 IIRC) use a comma to begin REPL commands (as distinct from Lisp forms) because commas outside of quasiquotation forms are otherwise syntax errors.
I don't really understand the problem it solves. Can't you just put your own bin directory first in the PATH?

I do like the idea of autocompleting your own commands though.

I'm curious how folks manage their important local configurations, e.g.

- is your ~/bin directory a git repo?

- if you use git to manage your dotfiles, do you use hard links or soft links?

I have a custom tool built up over the years that keeps the history of that stuff in repos, but the actual files in ~/bin are usually hardlinks to the repo files (configurable - can be softlink or copy, too).

Every few weeks or months, I run a command on each system that gathers up any accumulated changes I've made to these files and syncs them to common machine that has all the repos. I merge those changes, then run another command to install the updates on all machines, so everything stays in-sync, over time.

I found that these ~/bin scripts and config files fell into a bit of a "donut hole" of development effort, where it was too much bother to maintain a full repo/build/install setup for every single script independently, but I did want to keep changes in sync and track them over time, rather than just having each system diverge forever.

So, my solution was to bundle lots of scripts together, into just a few repos, and sync/merge/etc them in bulk, to streamline the process.

A downside is lots of my commit notes are just a generic "gathering up latest changes" since I'm slurping up lots of edits to unrelated files at once. Hasn't really been a problem for me, though. I mostly just care about having the history.

I didn't invent this, but I have a bare "config" repo and a git alias that sets my home directory as the work tree:

   git init --bare $HOME/.config/repo
   alias config='/usr/bin/git --git-dir=$HOME/.config/repo --work-tree=$HOME'
   config config --local status.showUntrackedFiles no
Then I can do things like "config add", "config commit", and "config push".
There are tools to manage it for you that I'm sure someone will come along and mention, but I've got a repo I check out at `~/.home`, then a shell script that just symlinks everything into place.

So .bashrc is a symlink to ~/.home/.bashrc, ~/.config/nvim to ~/.home/.config/nvim, etc.

It’s simple and only relies on having something sh-compatible available so portable now and in the future.

To manage per-system tweaks, I have places that include an additional file based on hostname. For example my .bashrc has something like:

    if [ -f "$HOME/.bashrc.$HOSTNAME" ]; then
        source "$HOME/.bashrc.$HOSTNAME"
    fi
Which will include a bashrc file specific to that host if it exists.

Been working well for me for… a decade now?

  • wruza
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
My main system is windows, so it's c:/CmdTools repo in PATH. I tried to learn `stow`-likes, but it's not worth it. I just clone/pull/push it in that fixed location. Mostly cmdbash scripts.

More complex (multi-file) tools are usually separate ts or python projects. Node has {npm,yarn} link, which puts a starter .cmd somewhere in PATH, out of box. Python scripts I usually run through .cmd "alias" files in c:/CmdTools, there's no `pip link` afaik.

I always have MSYS2 installed and here's my cmdbash prolog:

  some.cmd:

    :<<BATCH
      @xbash %~dpf0 %*
      @exit /b
    BATCH

    for i in {1..5}; do
      echo "Hello from bash!"
    done
xbash.exe is just MSYS's bash.exe that I copied to avoid collisions with WSL (which is useless PR nonsense). Same with xgrep, xfind, xecho, xmkdir, xsort.

This setup carried me for years and turns out to be a very reasonable Windows/Unix integration. I like the Unix architecture, but can't stand the Linux desktop. This year I got a project related to the Linux desktop, and I literally become so stressed using it sometimes that I have to take a break or vent loudly.

My ~/bin directory is not directly version controlled. It primarily consists of symlinks, often stripping file extensions from shell scripts (e.g., ~/bin/foobar links to ~/src/foobar.sh). I have just enough Python scripts and Go binaries to make me think it's worth separating src and bin.

~/src is a git repo. One script evolved into its own project and became a submodule within ~/src.

For configuration files like ~/.foobar_rc and directories such as ~/.vim/, they again are not directly version controlled but are symlinked into ~/etc, which is. I don't see any reason that ~/.foobar_rc couldn't be a hardlink, but it's not in my setup.

I used to maintain a single repository at ~ that included ~/src and ~/etc as submodules, with a build script for setting up links. Always being within a git repository became cumbersome, so I moved the build tools into their respective directories (~/src and ~/etc) and now clone and build each repository manually.

Lastly, since private repos aren't (weren't?) free, those submodule repos are really just branches of the same repo that share no common ancestors.

I have some essential stuff for zsh config in a git repo.

For my most important custom bins, they are written in Rust and published to crates.io so I cargo install them from there. It’s just one crate, with wrappers for the git commands that I use day to day

In addition to this, I have host specific repos with some scripts. These scripts are not in my path but are instead scripts that run on a schedule from cron. These scripts run and log various facts about the machine such as zpool status and list installed packages, and auto-commit and push those to their repo. And the other kind of script I have invokes zfs send and recv to have automatic backups of data that I care about.

In addition to this I have a couple other git repos for stuff that matters to me, which either runs via cron (retrieving data from 3rd parties on a schedule) or manually (processing some data).

For neovim I stopped caring about having a custom RC file at all. I just use vanilla neovim now on my servers. On my laptop I use full blown IDEs from JetBrains for doing work.

My ~/scripts is a git repo

My dotfiles/configs are a mix of the following setup on boot:

- one-way copy of config file from $repo/$path to $path (prevents apps from modifying my curated config or adding noise)

- or make it a symlink (if I want it mutable)

- or make it a bind mount (if a symlink won't work; can be a file or folder)

- or make it a one way copy but add a file watcher to copy changes back to the repo (if none of the above work. Some programs fail if the file they need is a symlink or is bind mounted)

For dotfiles using a one-way copy, whenever I change a setting I want to persist I have to manually copy or edit the original $repo/$path. I can take a diff against the repo for a hint, or use `inotifywait -r -e modify,create,delete,move . ~/.local ~/.config -m` for something new.

Not using hard links since the dotfiles are likely to be on a different filesystem (or dataset) than their target paths.
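The one-way-copy variant above could be sketched roughly like this (`$REPO` and the `foo` config path are made-up names for illustration, not the parent's actual layout):

```shell
#!/bin/sh
# Sketch of a one-way copy: push the curated config out of the
# repo into place; nothing ever copies back automatically.
REPO="$HOME/dotfiles"
mkdir -p "$HOME/.config/foo"
cp "$REPO/.config/foo/config" "$HOME/.config/foo/config"

# Later, to spot drift before persisting a change into the repo:
diff -u "$REPO/.config/foo/config" "$HOME/.config/foo/config"
```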

I use a git repository `dotfiles` containing several configs in `dotfiles/etc/`.

Since I use `zsh`, I usually only symlink the `dotfiles/etc/zsh/.zshrc` to `$HOME/.zshrc`, while the `.zshrc` loads environment variables setting all required paths for my tools, e.g.:

  export PATH="$HOME/bin:$HOME/dotfiles/scripts:$PATH"
  export STARSHIP_CONFIG="$HOME/dotfiles/etc/starship/starship.toml"
  export GIT_CONFIG_GLOBAL="$HOME/dotfiles/etc/git/.gitconfig"
  export MYVIMRC="$HOME/dotfiles/etc/vim/vimrc"
  export VIMINIT='source $MYVIMRC'
  # ...
The only files of `dotfiles` I copy over are for ssh, because ssh checks file permissions for security reasons and does not allow symlinks.
I keep everything I care about in a dot-files repo that knows what to install where in different places. I wrote a little tool to make it easier for me to do this: https://github.com/oalders/is
Symbolic links set up with dotbot[1].

Since the link directives are idempotent, you can run it on every login shell if you desire. I ended up setting up a shared jumpbox used by some contractors with it so they could work with our internal tooling without requiring excessive setup, and wrapped it into a shell startup script[2] and found it performant enough that I couldn't tell the difference.

1: https://github.com/anishathalye/dotbot

2: https://gist.github.com/RulerOf/f41259f493b965c9354c2564d85b...

I use chezmoi - it handles both elegantly & has no real requirements w.r.t. how you choose to structure your filesystem. It also handles recovery well when you mess things up.
I got this from an old HN comment I can't find anymore. I have this in my .bashrc:

  alias dotfiles='git --git-dir=$HOME/.dotfiles --work-tree=$HOME'
And I init my dotfiles into a fresh home directory like this:

  git clone --bare gitolite3@example.com:dotfiles $HOME/.dotfiles
  git --git-dir=$HOME/.dotfiles --work-tree=$HOME config status.showUntrackedFiles no
  git --git-dir=$HOME/.dotfiles --work-tree=$HOME reset --hard
Re your second question, here’s how I manage my dotfiles:

https://github.com/andersix/dotfiles

Moin! To manage my dotfiles I use git and an alias. See https://www.atlassian.com/git/tutorials/dotfiles
Look into GNU Stow
I use ansible
  • arjie
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
Ah! I called them comma-commands https://wiki.roshangeorge.dev/index.php/Comma_command and I was wondering what the source material was. Now I know!

I feel like these small web people's blogs were so much more accessible before link aggregators got this mainstream.

I’ve used this method for a while now; not sure if it’s from this particular article or if someone else blogged the same idea. I also prefix any aliases or sh functions from my rc file with a comma. Mostly it makes my custom commands easy to find when I “forget what I called that one alias”.
I use different namespaces for different convenience scripts, and use the notation [character].[command] for all of them.

I've used a character for each company I've worked for, and a different one for common scripts. This way is very easy to clean $HOME when I move.

I like the idea of using periods for collections of commands. I have a todo list hacked together with shell scripts and I've been thinking about how I might want to stay away from collisions. I think I'll use do.thing style commands, so my commands will be:

  do.item  Add an item to the bottom of my todo list  
  do.soon  List the items that I need to do soon  
  do.next  List the next item to work on  
  do.mark  Mark the current item done
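One way to sketch those (a guess at an implementation, not the parent's actual scripts) is a single script that dispatches on its own name, with `do.next` and the rest as symlinks to it:

```shell
#!/bin/sh
# Hypothetical "do.item"/"do.next" dispatcher; the ~/.todo path
# and the TODO_FILE override are assumptions for illustration.
TODO="${TODO_FILE:-$HOME/.todo}"
case "$(basename "$0")" in
  do.item) printf '%s\n' "$*" >> "$TODO" ;;  # append an item
  do.next) head -n 1 "$TODO" ;;              # show the next item
esac
```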
How do you implement namespaces? Is it just that you do this with all the commands you create for this company?
I liked the idea of this but it won't work for me in practice. A downside of periods is that you can't use them in an `alias`. Since that is how I configure a lot of these commands, the period is probably not an option I can use consistently.
> How do you implement namespaces? Is it just that you do this with all the commands you create for this company?

Exactly. It's super barebones but it works haha.

Funny. My ~/bin is filled with commands that override the system versions.
Wrappers or replacements? If it’s a wrapper do you remove local bin from path or hard code system path?
Yeah, shell script that runs /usr/bin/x or whatever.
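That pattern can be as small as this (an illustrative wrapper around `ls`, not the commenter's actual script):

```shell
#!/bin/sh
# Hypothetical ~/bin/ls wrapper: add a preferred default flag,
# then delegate to the real binary by absolute path so the
# wrapper doesn't re-find itself through $PATH.
exec /bin/ls -F "$@"
```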
> The lower-case letters are the very characters used in system commands; brackets, backslashes, the colon, the back-tick, and the single-tick all had a special meaning to the shell

Please note that brackets have no special meaning to the shell.

Brackets are used in shell wildcard ("glob") expressions. For example, if you try to use "[bar]" as a command, the shell will first look for files named "b", "a", and "r" in the current directory, and if it finds any it'll use the first one as the command name and any others as arguments to it.

But as far as I can see, using a close-bracket as the first character in a command is safe, since it cannot be treated as part of such a pattern. Open-bracket (without a matching close-bracket) would work in many shells, but will get you a "bad pattern" error in zsh.

Brackets have a special meaning in the UNIX shell since the earliest times.

Together with "*" and "?", the brackets "[" and "]" have been used by the UNIX shell since some of its earliest versions (already many years before the Bourne shell) in pattern matching for pathname expansion (globbing).

For example, if you have in a directory 3 files named "file1", "file2" and "file3", then

"ls file?" will output

file1 file2 file3

while "ls file[13]" will output

file1 file3

Yes, of course I was too quick: I was trying to suggest to use a single bracket char in the command just as TFA uses the comma. But it turned out that an opening bracket wouldn't work with some non-bash shells. So I was doubly wrong.
Cunningham's Law in action lol
I like it, I think. I'd probably be more inclined to add the comma to the end, so that tabbing on "mount" would bring up "mount" and "mount,", the latter being your personal one.
Thought this was going to be about Comma [0]

0: https://github.com/nix-community/comma

Discussed previously:

https://news.ycombinator.com/item?id=22778988 (April 2020, 90 comments)

Those can really be called comma_nds!

Ahem. Nice idea though, I think I'll start using it...

Surely ,nds?
Or maybe ,&
Commaands? I liike it
Commaet(s). & is Et, which means "and" in latin. Your read of it reminds me of the old Jack in the Box signage which, according to a good friend, looked like "Jack in the B-Fish"
And just for compl&eness the symbol "&" is called an ampersand.
I think "comma_nds" ("commands" if you remove the underscore) makes the pun more obvious.

I didn't get it at first thought, thinking of the .nds file extension for Nintendo DS ROMs.

The pun still holds as "comma,nds" though, and might even do so better.
Oh, I thought the user was suggesting using just ",nds" instead of "comma,nds". That makes more sense.
Oh, they might have been. The joke might have gone completely over my head when I said that! Still though, it sounds like "comma,nds" would be a good compromise either way.
Love it!
I think comma is probably one of the most commonly used keys for <leader> in Vim as well, probably for the same sort of reason.
  • c22
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
I guess this works great right up to when the contents of ~/bin/ are added to a CSV for whatever reason.
Use TSV instead.
  • tgv
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
So much better. And you don't have the ridiculous "localization" of csv: over here, the field separator defaults to a semicolon. I suppose someone thought it was so terribly important to have a decimal comma in a csv file that any form of common sense went out of the window.
Good idea overall, I must say.

It's one more key press, but I'm pretty sure I would use underscore for the first character.

  • ·
  • 3 weeks ago
  • ·
  • [ - ]
Best idea I've read in a while, will do
Never thought about that, cool!
Nice
  • klysm
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
Things like this happen all the time when we didn't implement namespacing when we should have.
I dislike this immensely. It costs nothing to type "~/bin" in front of the command each and every time. And it means I can put commas wherever I want in some commands that use them, like SQL ..

I do, however, like to comment my custom commands:

    $ mv ~/Desktop/*pdf ~/Documents/PDF # pdfsync
    $ for i in ~/Documents/Development/JUCE/JUCE_Projects/* ; do cowsay $i ; cd $i ; git pull ; git fetch --all ; git submodule update --init --recursive ; done # updatejuce
CTRL-R "updatejuce" or "pdfsync" .. and off we go ..

A much nicer way of finding my custom commands ..

You can do what you want but it’s not nothing: it costs six keystrokes every time (seven, counting shift for tilde)
[dead]
  • ·
  • 3 weeks ago
  • ·
  • [ - ]
[dead]
[dead]