- cross-posted to:
- linux@programming.dev
I’ve been trying nushell and words fail me. It’s like it was made for actual humans to use! 🤯 🤯 🤯
It even repeats the column headers at the end of the table if the output is taller than your screen…
Trying to think of how to do the same thing with awk/grep/sort/whatever is giving me a headache. Actually just thinking about awk is giving me a headache. I think I might be allergic.
I’m really curious, what’s your favorite shell? Have you tried other shells than your distro’s default one? Are you an awk wizard or do you run away very fast whenever it’s mentioned?
Nushell looks cool but I prefer to stick with the POSIXes so that I know my scripts will always work and the syntax always does what I expect it to. I use zsh as a daily driver, and put up with the various bashes, ashes and dashes that come pre-installed on systems I won’t be using loads (e.g. temporary VMs).
I love NuShell but it is annoying when using LLMs to generate troubleshooting code.
Your scripts should have Bourne shebangs
They have `#!/bin/sh` shebangs. `/bin/sh` is a symlink, in my case to zsh. I like using one language.
Then your hashbangs are bad. Isn’t their point to tell the kernel exactly which interpreter can process it correctly?
They’re POSIX scripts… Any POSIX-compliant `/bin/sh` can interpret them.
To be fair, I’m fairly sure the zsh interpreter has a POSIX sh mode
Hopefully you’re not using the sh language—hopefully you’re restraining yourself from using any of the non-POSIX extensions then
Yeah, there should be a clear separation between scripts, which should have a shebang, and interactive use.
If a script starts acting oddly after someone does a `chsh`, then that script is broken. Hopefully people don’t actually distribute broken script files that have some implicit dependency on an unspecified interpreter in this day and age.
Always confuses me when people say this. You can use multiple different shells / scripting languages, just as you can use multiple programming languages.
I know that. I just don’t have a use case for alternative shells. Zsh works fine for me and I know how it works. I don’t have problems that need fixing, so I don’t need to take the time to learn a new, incompatible shell.
If you want your scripts to “always work” you’ll need to go with the most common/standard language, because the environments you work on might not be able to use all of those languages.
I mean, that’s only if all your scripts are fully general purpose. That just seems really weird to me. I don’t need to run my yt-dlp scripts on the computational clusters I work on.
Moreover, none of this applies to the interactive use of the shell.
It’s not only clusters… I have my shell configuration even in my Android phone, where I often connect to by ssh. And also in my Kobo, and in my small portable console running Knulli.
In my case, my shell configuration is structured in some folders where I can add config specific to each location while still sharing the same base.
Maybe not everything is general, but the things that are general and useful become ingrained in a way that it becomes annoying when you don’t have them. Like specific shortcuts for backwards history search, or even some readline movement shortcuts that apparently are not standard everywhere… or jumping to most ‘frecent’ directory based on a pattern like z does.
If you don’t mind that those scripts won’t always work, and you have the time to maintain 2 separate sets of configuration and initialization scripts, and aliases, etc., then it’s fine.
> those scripts won’t always work
This feels like ragebait. I have multiple devices, use fish whenever that can be installed and zsh/bash when not, and have none of these issues.
EDIT: or some methods to jump to the most recent directory, like z.
Manually downloading the same shell scripts on every machine is just doing what the package manager is supposed to do for you. I did this once to get some rust utils like eza working without sudo. It’s terrible.
> Manually downloading the same shell scripts on every machine is just doing what the package manager is supposed to do for you
If you have a package manager available, and what you need is available there, sure. My Synology NAS, my Knulli, my cygwin installs in Windows, my Android device… they are not so easy to have custom shells in (does fish even have a Windows port?).
I rarely have to manually copy; in many of those environments you can at least `git clone`, or use existing syncing mechanisms. In the ones that don’t even have that… well, at least copying the config works, I just scp it, not a big deal, it’s not like I have to do that so often… I could even script it to make it automatic if it ever became a problem.
Also, note that I do not just use things like `z` straight away… my custom configuration automatically calls `z` as a fallback when I mistype a directory with `cd` (or when I intentionally use `cd` while in a far/wrong location just so I can reach it faster/easier)… I have a lot of things customized, the package install would only be the first step.
So you’re willing to do a lot of manual package managing, in general put a lot of work into optimizing your workflow, adjusting to different package availability, adjusting to different operating systems…
…but not writing two different configs?
That is your prerogative but you’re not convincing me. Though I don’t think I’ll be convincing you either.
I have separate configs/aliases/etc for most of my machines just because, well, they are different machines with different hardware, software, data, operating systems and purposes. Even for those (most) that I can easily install fish on.
I don’t really mind having a non-POSIX shell since it doesn’t prevent bash scripts from working, but I get that if you want portability bash is still best since it’ll work mostly anywhere.
If I can shebang nushell (assuming all the builtins from bash or even sh work) and pass a flag to remove all the fancy UI-for-humans formatting so that piped commands in the scripts work, then I think this is incredible.
Yeah, having this installed alongside other more “standard” shells is fine I guess, but it looks like maybe it has some neat functionality that is more difficult in other shells? I guess I’d need to read up on it more, but having a non-interactive mode for machines to read more easily would be a huge plus for it overall. I suppose that depends on what it offers/what it’s trying to accomplish.
The Unicode bars aren’t actually stored; that’s just the graphical representation of the table datatype which you can think of as JSON
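For instance, a quick way to see that the bars are just rendering and the value underneath is structured data (a minimal sketch, assuming a recent Nushell; exact columns depend on your version):

```nu
ls | get 0.name   # pull a single field out of the table, no text parsing
ls | to json      # serialize the same table as JSON
ls | to csv       # ...or CSV; the box-drawing characters were never part of the data
```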
Like PowerShell does?
exactly
some claim that was the inspiration for nushell: powershell but less verbose and more bashy
I used nushell for a good 6 months, it was nice having structured data, but the syntax difference to bash which I use for my day job was just too jarring to stick with.
Fish was (for me) the right balance of nice syntactic sugar and being able to reasonably expect a bash idiom will work.
So you drive daily with nushell and then script in bash for portability?
Sounds not bad actually…
I’ve been using fish (with starship for prompt) for like a year I think, after having had a self-built zsh setup for … I don’t know how long.
I’m capable of using `awk` but in a very simple way; I generally prefer being able to use `jq`. IMO both awk and perl are sort of remnants of the age before JSON became the standard text-based structured data format. We used to have to write a lot of dinky little regex-based parsers in Perl to extract data. These days we likely get JSON and can operate on actual data structures.
I tried `nu` very briefly but I’m just too used to POSIX-ish shells to bother switching to another model. For scripting I’ll use `bash` with `set -eou pipefail` but very quickly switch to Python if it looks like it’s going to have any sort of serious logic.
My impression is that there’s likely more of us that’d like a less wibbly-wobbly, better shell language for scripting purposes, but that efforts into designing such a language very quickly go in the direction of nu and oil and whatnot.
That’s interesting, I hadn’t thought about the JSON angle! Do you mean that you can actually use `jq` on regular command outputs like `ls -l`?
Oil is an interesting project and the backward compatibility with bash is very neat! I don’t see myself using it though; since its syntax is very close to bash on purpose, I’d probably get oil syntax and bash syntax all mixed up in my head and forget which is which… So I went with nushell because it doesn’t look anything like bash. If you know python what do you think about xonsh?
> That’s interesting, I hadn’t thought about the JSON angle! Do you mean that you can actually use `jq` on regular command outputs like `ls -l`?

No, you need to be using a tool which has JSON output as an option. These are becoming more common, but I think still rare among the GNU coreutils. `ls` output especially is unparseable, as in, there are tons of resources telling people not to do it because it’s pretty much guaranteed to break.
nu's commands also work on JSON, so you don’t really need jq (or xq or yq) any more. It offers a unified set of commands that’ll work on almost any kind of structured data.
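For example, a minimal sketch with a hard-coded JSON string (made up for illustration, not any particular tool’s output):

```nu
# Parse JSON text into a real table, then filter and project it on typed columns.
'[{"name": "alice", "uid": 1000}, {"name": "bob", "uid": 1001}]' |
  from json |
  where uid > 1000 |
  get name
# => a list containing "bob"
```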
I’ve used nushell for several months, and it really is an amazing shell
It feels more like an actual language than arcane runes, and I can easily make chains and pipelines and things that would be difficult in bash
Additionally, it makes a pretty good scripting language
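A rough illustration of the scripting side, as a sketch (the `biggest` command name and its body are made up for the example, assuming a recent Nushell):

```nu
# A small custom command: list the n largest files under the current directory.
def biggest [n: int = 5] {
  ls **/* |
    where type == "file" |
    sort-by size --reverse |
    first $n |
    select name size
}

biggest 3   # the three largest files, as a table with name and size columns
```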
Fish is great.
Sorry I am vegan
Vegans can use fish, as long as they don’t bash
Looks like it’s taken a page from PowerShell in passing structured data rather than just text.
Yeah, it has. I think they started out loving the concepts of PowerShell but hating the implementation, combined with the fact that PowerShell is clearly a Windows-first shell and doesn’t work so well on other OSes (it surprised me a lot to find out that PowerShell even has support for Linux).
`nu` tries to implement these concepts in a way that’s more universal and can work equally well on Linux, macOS or Windows.
PowerShell works really well on other OSes now. I use it on macOS and Linux daily. I might loathe MS but PowerShell is a fantastic shell, and after working with an object-oriented shell I hate going back to anything else.
Oh I didn’t know powershell did that too! It sure beats endless parsing errors
That was the foundational concept in powershell; everything is an object. They then went and ruined it with insane syntax and a somewhat logical, but entirely ~~in practice~~ impractical verb-noun command structure.
Nushell is powershell for humans. And it helps that it runs across all systems. It’s one of the first things I install.
> somewhat logical, but entirely in practice verb-noun command structure.
That’s supposed to be “impractical”, not “in practice”, for others reading along.
For example, the “proper” command to list a directory is:
Get-ChildItem
The “proper” command to fetch a webpage is:
Invoke-WebRequest https://example.com/
In these particular cases, they do have aliases defined, so you can use `ls`, `dir` and `curl` instead, but… yeah, that’s still generally what the command names are like.
It’s partially more verbose than C#, which is one of the most verbose programming languages out there. I genuinely feel like this kind of defeats the point of having a scripting language in the first place, when it isn’t succinct.
Like, you’re hardly going to use it interactively, because it is so verbose, so you won’t know the commands very well. Which means, if you go to write a script with Powershell, you’ll need to look up how to do everything just as much as with a full-fledged programming language. And I do typically prefer the better tooling of a full-fledged programming language…
I love Nushell, it’s so much more pleasant for writing scripts IMO. I know some people say they’d just use Python if they need more than what a POSIX shell offers, but I think Nushell is a perfect option in between.
With Nushell scripts you get types, structured data, and useful commands for working with them, while still being able to easily execute and pipe external commands. I’ve only ever had two very minor gripes with Nushell: the inability to detach a process, and the lack of a `-l` flag for `cp`. Now that uutils supports the `-l` flag, Nushell support is a WIP, and I realized systemd-run is a better option than just detaching processes when SSH’d into a server.
I know another criticism is that it doesn’t work well with external CLI tools, but I’ve honestly never had an issue with any. A ton of CLI tools support JSON output, which can be piped into `from json` to make working with it in Nushell very easy. Simpler tools often just output a basic table, which can be piped into `detect columns` to automatically turn it into a Nushell table. Sometimes strange formatting will make this a little weird, but fixing that formatting with some string manipulation (which Nushell also makes very easy) is usually still easier than trying to parse it in Bash.
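For instance, a rough sketch of the `detect columns` case using a hard-coded, made-up table string (so the output is predictable) rather than a real command:

```nu
"NAME SIZE KIND\nnotes 12 text\nmusic 480 dir" |
  detect columns |
  where KIND == "text"
```

Real command output can often be handled the same way (for example piping `^df -h` into `detect columns`), though headers with spaces in them are exactly the kind of strange formatting that needs a little string cleanup first.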
I’m an absolute Linux tard, so it’s hilarious to me trying to read and understand most of these comments
Everyone was a newbie at one point
Love nushell. It’s just about the most practical functional programming language I’ve ever had the pleasure of using.
I’m using fish as my default shell since it’s more standards-compliant and plays nicer with tools that modify your environment. But any time I need to do more complicated shell scripting, I’m breaking out nushell.
Until you discover nushell’s (lack of) quoting rules
Can you elaborate?
Last I checked, there was no rigorous system for how quoting worked, such as how to escape a quote inside a string.
That looks a lot like PowerShell
PowerShell without the awful syntax
What awful syntax?
Ffs bash uses `echo "${filename%.*}"` and `substring=${string:0:5}` and `lower="${var,,}"` and `title="${var^}"` &c. It doesn’t use `$` for assignment, only in expressions.
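For comparison, a rough sketch of how those same string operations might look in Nushell (the variable values here are made up purely for illustration):

```nu
let filename = "report.tar.gz"
let string = "hello world"

$filename | path parse | get stem   # like "${filename%.*}"  => report.tar
$string | str substring 0..4        # like ${string:0:5}      => hello
"HELLO" | str downcase              # like "${var,,}"         => hello
"hello" | str capitalize            # like "${var^}"          => Hello
```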
I feel my sanity slowly slipping away while reading…
Yeah, why are linebreaks & co. in names even allowed at the file system level? There’s not even something like a restricted-mode mount option for most filesystems.
There’s an argument to be made that system software like filesystems and kernels shouldn’t get too smart about validating or transforming strings, because once you start caring about a string’s meaning, you can no longer treat it as just a byte sequence and instead need to worry about all the complexities of Unicode code points. “Is this character printable” seems like a simple question but it really isn’t.
Now if I were to develop a filesystem from scratch, would I go for the 80% solution of just banning the ASCII newline specifically? Honestly yes, I don’t see a downside. But regardless of how much effort is put into it, there will always be edge cases – either filenames that break stuff, or filenames that aren’t allowed even though they should be.
Oh right, filesystem is initialized before charset & stuff. My bad.
The usual problems with parsing `ls` don’t happen here because Nu’s `ls` builtin returns properly typed data. You can work with it in pretty much the same way you would work with it in Python, except that Nu has a composition operator that doesn’t suck ass (`|`), so you don’t have to write as much imperative boilerplate.
I have a number of reservations regarding Nu (the stability of the scripting language, unintuitive syntax rules, a disappointing standard library) but this particular argument just doesn’t apply.
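A minimal sketch of what that looks like in practice (assuming a recent Nushell, with its standard `ls` columns):

```nu
# Filter and sort on real columns; filenames with spaces or newlines can't break it.
ls |
  where size > 1mb |
  sort-by modified |
  get name
```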
> The usual problems with parsing `ls` don’t happen here because Nu’s `ls` builtin returns properly typed data.
Isn’t that the point that the previous commenter was making by linking that answer? I read their comment as "here is why you should use Nu shell instead of parsing `ls` output."
Nushell is great, I should use it again. I gave up on it after I wrote a thing for converting fish completions to its autocomplete system, and its internal autocomplete didn’t perform anywhere near adequately.
Check out carapace. It takes a bit of setup but basically tries to make all the completions work in almost any shell. For me that solved the big step backwards from fish’s completions that nu’s native completions have.
Yeah, that’s what I’ll do when I get around to checking it out again.
Thanks, just tried this out and it works well.
thanks, good thread.