Linux and Powershell (matteoguadrini.github.io)
82 points by johnjackjames on Jan 19, 2021 | 129 comments



I think the reason I've been avoiding Powershell is specifically the pattern of `Verb-Noun -Option argument` the author brings up.

I'm not saying it's not "better" (however you measure that), but I think it's very wordy, and that's always been a turn-off and a barrier to entry for me.

some examples:

pwd --> Get-Location

ls --> Get-ChildItem

cp --> Copy-Item

I totally understand how bash's sometimes seemingly random shortcuts or acronyms can ALSO be a huge barrier to entry (just as it was for me!), and the straightforwardness of Powershell is a lot clearer, but personally, now that I'm "in" on the short-hand of bash, it's been hard to start using Powershell.


The single most underused command in powershell is: Get-Alias

Essentially every command you'd use has a 2 or 3 character alias that is easy to remember and quick to type.

On top of that, they're almost programmatically named, so if you know a powershell command's full name you can almost certainly guess the alias.

Typing Get-Alias lists them all out.

https://docs.microsoft.com/en-us/powershell/module/microsoft...


I now have a burning temptation to write "Get-Alias Get-Alias"


  PS C:\Users\Kuinox> Get-Alias -D Get-Alias

  CommandType     Name                                               Version    Source
  -----------     ----                                               -------    ------
  Alias           gal -> Get-Alias


and I have been told by at least twenty people never to use aliases in powershell scripts, so aliases are effectively useless.


I think the idea is that in scripts you should use the full name for maximum clarity and self-documentation; in your own terminal invocations, be as terse as you want.


That is precisely the idea. The verbose names are self-documenting. You need documentation (names, types, comments, etc.) when code will be reused and read and altered at a later time.

Interactive use is different. It is write (and tweak) then use once. You never write comments in the interactive terminal. For interactive use you have aliases, plenty of them.


One reason for that is that other people may have different aliases set up, so you cannot count on your aliases working everywhere. Yes, that also means the default aliases, which may (although very rarely) change from version to version.

For PowerShell ISE there was an extension that automatically expanded all aliases; I believe for the current language server there may be similar things. If all else fails, a PowerShell script can do that as well, since PowerShell exposes its own parser in its API, so a script can easily introspect itself or another script.
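
As a sketch of that last point (the Parser API below is real; the script path and output format are illustrative):

    # List alias usages in a script via PowerShell's own parser.
    $tokens = $errors = $null
    $ast = [System.Management.Automation.Language.Parser]::ParseFile(
        './myscript.ps1', [ref]$tokens, [ref]$errors)
    $ast.FindAll({ $args[0] -is [System.Management.Automation.Language.CommandAst] }, $true) |
        ForEach-Object {
            $name = $_.GetCommandName()
            if ($name) {
                $alias = Get-Alias -Name $name -ErrorAction SilentlyContinue
                if ($alias) { 'line {0}: {1} -> {2}' -f $_.Extent.StartLineNumber, $alias.Name, $alias.Definition }
            }
        }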


If you're working on a shared project where people might care, I usually have the [PSScriptAnalyzer][1] just auto-correct any aliases dynamically for me in my IDE.

If you're just doing a quick one-off or interactively, then who cares! Crack on with your incomprehensible one-liners!

[1]: https://github.com/PowerShell/PSScriptAnalyzer
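
(A hedged one-liner for the same auto-fix from a terminal, once the module is installed; the rule name is PSScriptAnalyzer's own:)

    Invoke-ScriptAnalyzer -Path ./script.ps1 -IncludeRule PSAvoidUsingCmdletAliases -Fix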


Powershell gives the best of both worlds: readable command names in scripts, and simple Bash-like shortcuts when you type stuff in the terminal.


Is this really the "best of both worlds", or does this just double the number of commands you have to learn?


Since you don't have to learn the aliases (you can use the full command names in the interactive shell too), it doesn't double anything.


Last I heard (and I haven't had direct visibility into this since about 7 years ago) aliases are preserved between versions for backwards compatibility, but you shouldn't use abbreviated parameter names in scripts because there's no guarantee that they won't become ambiguous if more parameters are added to a command.


That is true; but since the aliases and parameter names are discoverable with introspection, if you are writing a script you can have a tool which expands them all to their full name if you want that.

That said, the PowerShell team do care about not breaking backwards compatibility, so this risk is more likely in a 3rd-party module than in the main PS cmdlets.


Then you just end up with bash commands again, such as "pwd" and "cat"


Yes, you do. With the advantage of the rich object model. Best of both worlds.


And you can still use the full names in scripts (rather than directly in the shell), so you don’t rely on future readers knowing the aliases.


Thanks so much for sharing this!!


Aliases don't have short forms for their arguments, so for any serious command you'd still be typing (tabbing) a lot.


Yes they do. Powershell is smart enough to find the argument as long as it's not ambiguous. For instance:

    Remove-Item -r -fo ./path/to/some/directory
Powershell is smart enough to know that -r means -Recurse and -fo means -Force because the Remove-Item cmdlet has no other parameter that starts with -r. For -f there are two possible parameters, -Filter and -Force, which is why you need to be more specific with -fo.


oh god, fuzzy matching


No it's not fuzzy. It's long-enough-prefix-string-to-be-unambiguous.


But possibly not forward compatible for future parameter addition?


Potentially, if someone adds a new parameter which clashes so the prefix you used is not unique. PowerShell 7.x is introducing support for traditional C-style ternary expression "a ? b : c" and there's a potential syntax clash there because PowerShell has always allowed variable names to end in a question mark, e.g. "$isConfigured?" and with that there's no way to stop "$isConfigured? a : b" parsing as a variable name and then an error. The official way forward is to mandate that variable names used with ternary expressions must use the full-brace names like ${isConfigured?}, which is annoying.
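
A quick sketch of the clash (PowerShell 7 syntax; the variable names are illustrative):

    $isConfigured = $true
    $isConfigured ? 'yes' : 'no'     # ternary, evaluates to 'yes'
    ${isConfigured?} = $false        # a distinct variable literally named "isConfigured?"
    # "$isConfigured? 'yes' : 'no'" would parse "$isConfigured?" as that second
    # variable's name, which is why the brace form is mandated in ternary code.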

People don't want that, but the PowerShell Team are not moving easily to a possible break of backwards compatibility in how the language parses variable names. It's been discussed here: https://github.com/PowerShell/PowerShell/issues/3240 for dozens of comments, and it goes:

"@PowerShell/powershell-committee reviewed this one today, we have a couple thoughts: No matter what we do, we're going to do some analysis of our corpus of scripts to see how often folks use ? in variable names. Some of us have a hypothesis (that others would like to validate) that the users who are using ? in their variable names may be less advanced users (as we agree in this room we'd stay away from it because of the potential problems that could arise). On the other hand, anyone using the functionality described here will be able to understand a slightly more complicated syntax (like ${foo}?.bar). Therefore, we prefer option 3 because it avoids breaking changes on these less experienced users."

The committee ruled to keep the need for ${}; the ternary expression then became a gated experimental feature in PowerShell 7 that you have to opt in to enable. One of the PowerShell developers analysed the PowerShell Corpus of 400k+ scripts collected by Lee Holmes and Daniel Bohannon for security threat research[2][3] and said, in the comments on a pull request[4]: "I found that about 62% of variables that have ? in them use it in the end. That made me lean towards not introducing a breaking change." Then someone else made another analysis of the corpus with regex and came up with 329 out of 22,000,000 variables having a potentially clashing "?" at the end.

It's been reopened as another discussion[5], and another person came up with an analysis of the Corpus with proper tokenising/parsing[6]: 11 variables ending in ? out of 1,896,983 unique variables used, which then narrowed down to 1 that might break in backwards compatibility if this change happens. Coming up on 4 years of back-and-forth discussion, even with that kind of evidential backing, with 2 core developers, the Team Lead, and Bruce Payette (one of the original developers) weighing in against, the issue being brought up in a community call, then flagged for review by the committee, then reopened as a reminder again[7], the team is still not won over on the risk of breaking backward compatibility by changing ? parsing at the end of a variable name.

That is, yes, "possibly not forward compatible for future parameter addition", especially with third-party modules who might not take it as seriously, but it's not treated at all casually by PowerShell's developers.

[2] https://www.fireeye.com/blog/threat-research/2017/07/revoke-...

[3] https://aka.ms/PowerShellCorpus (1.2GB compressed)

[4] https://github.com/PowerShell/PowerShell-RFC/pull/223#discus...

[5] https://github.com/PowerShell/PowerShell/issues/11379

[6] https://github.com/PowerShell/PowerShell/issues/11379#issuec...

[7] https://github.com/PowerShell/PowerShell/issues/14025


For deletion!


You can shorten any argument name in PowerShell, so long as it can be disambiguated:

    PS C:\> ls -di # same as Get-ChildItem -Directory; -d would pass the -Depth parameter instead
    

        Directory: C:\


    Mode                LastWriteTime         Length Name
    ----                -------------         ------ ----
    d-----       2020-07-24     18:09                PerfLogs
    d-r---       2020-12-14     22:45                Program Files
    d-r---       2020-12-24     22:57                Program Files (x86)
    d-----       2020-07-23     11:29                temp
    d-r---       2020-07-24     14:19                Users
    d-----       2021-01-15     21:33                Windows


  > ls -di # same as Get-ChildItem -Directory; -d would pass the -Depth parameter instead
Ouch, that is actually worse than I thought. So not only can you abbreviate parameters, but when the abbreviation is ambiguous Powershell just arbitrarily picks one instead of giving an error?


Aliases are nice, but if the short version is not enabled by default, it is mostly useless outside of your normal workstation. During the course of my day I'll be working in many different containers and VMs, all of them 'cattle', so they will only live hours to weeks at most. It just isn't worth creating aliases if I can't use them most of the time.

A long time ago I spent time making aliases and using custom shells, but over time I just slowly reverted to default bash because it is always there and always works the same.


The aliases he’s talking about are all set by default. Most of them IIRC are just simple abbreviations; for example, Get-ChildItem is gci.


I didn't realize the aliases were pre-defined and not just user-created. Thanks for clearing that up in a reasonable way.


What the heck is going on with people on this site (or the internet in general) speaking with absolutely complete confidence in something while also being completely factually wrong?

It used to be that some minor point might be wrong; now it's the entire comment! Premise, conclusion, everything!

Have we gotten this lazy and this overconfident that we think sounding confident is enough? That if we thought it up, it must be true? In a technical discussion? REALLY?!

...I don't want to live on this plane of existence anymore.


Please don't post unsubstantive comments to HN. If another comment is wrong, the thing to do is respond respectfully with correct information. Then everybody can learn something. Please don't respond by fulminating about the community. HN is an internet forum, and the internet is replete with people speaking confidently about things they don't know about. I'm not saying that was true of the GP, but it's certainly true in general, and frankly that's just human nature.

The internet, including HN, is also a frustrating and activating place which triggers all sorts of untrue overgeneralizations after encountering things one dislikes: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...

https://news.ycombinator.com/newsguidelines.html

(Please see https://news.ycombinator.com/item?id=25862346 downthread also.)


The parent comment may be incorrect (I don’t actually know, more on this in a sec), but your comment is not constructive.

https://news.ycombinator.com/newsguidelines.html

Unfortunately, while accusing the parent of being completely wrong, you have not elaborated on how/why.

Would you explain the issue(s) for the parent commenter’s future reference and for those of us who don’t have a deep knowledge of PowerShell?


The comment you refer to seems like it was posted by a machine, or at least a troll. Note it is very generic and could be inserted into ANY discussion thread. And, cleverly, it claims to criticize "over confidence" when in fact it itself tries to get its point across by being forcefully confident about its own righteousness. That is an old rhetorical trick: accusing others of something you are doing yourself, to take the focus away from your own issues. Reminds me of some politicians.


Please don't post unsubstantive comments, call names, or take HN threads further into flamewar. All that is against the site guidelines: https://news.ycombinator.com/newsguidelines.html.


I'm not a troll, or a machine, thank you very much.

At this point I can't edit the comment in question, but every assumption in the comment it was a reply to was wrong. Every single one.

Aliases in powershell are not turned off by default. Aliases either exist or they don't. When speaking of aliases alone, bash is no simpler nor more complex than powershell.

By the way, this comment (the one I am writing right now) is ALSO replying to a comment written by someone who appears to be overconfident and under-informed. It's a freaking malady, and people are somehow proud of their incorrect blind guesswork.

Incorrect, blind guesswork gets a pass, because the statements made sound correct to a layman, and sound objections don't get a pass. I understand, now. I wasn't clear on how important facts are in a technical discussion.

I stated no facts in my earlier comment, that's true, but neither did the person I was replying to. I thought my point was clear. It was less clear than the comment based on false assumption, I guess.

I fucking give up. You assholes can have this.


Please don't give up! but please also don't break the site guidelines: https://news.ycombinator.com/newsguidelines.html.

Your two comments in this thread contain one fine, substantive statement: "Aliases in powershell are not turned off by default. Aliases either exist or they don't. When speaking of aliases alone, bash is no simpler nor more complex than powershell." If you had simply replied upthread with that in the first place, it would have made a fine comment with a very high signal/noise ratio. Unfortunately the rest of both comments is just noise, tanking the ratio.

As I explained upthread (https://news.ycombinator.com/item?id=25862312), the internet is crazy-making this way, but it's not some property specific to HN and it's not something that's somehow changed lately. It's intrinsic to the medium, and we all need to build our skills in coping with it—which includes not reacting from a provoked place. That's not easy, but it's necessary to the sort of community we're trying for here, and we need everyone to work together on that.


Don't give up don't give up. It's just that if you think about it your original comment is written in a way that it could be injected into most any discussion-thread on social media as is, and therefore I found it quite amazing in its construction :-)


How is my comment wrong? I said `if the short version is not enabled by default..` which suggests that I do not think they are but am not 100% sure. Ok, I was wrong and the rest of my critique doesn't apply, but how does this suddenly translate to `I don't want to live on this plane of existence anymore.`? I was just wrong in assuming that aliases were user-defined like on most shells.


Wordiness for some interesting reason makes it HARDER for me to remember stuff.

Kind of like how I struggled with the PMP exam and memorization, because "Integrated Change Control Management" became "Controlled Management Change Integration" or "Managed Variation Integration" or whatever in my mind.

"Retrieve-Child-Item" vs "Get-ChildItem" or "Send-Command" vs "Invoke-Command" "Remove-Tiingamajiggy" vs "Delete-Whatchamacallit"... my mind is seemingly better equipped to memorize "obscure but unique" gobbledygook rather than "meaningful-but-generic" verbiage :|


Perhaps PowerShell is the new COBOL?


The rigid and obscure syntactic structure also checks out. It's a couple of hours' task for me to define a function, mostly spent on debugging the definition.

(Probably, if I did it all the time, I'd get it right every time. Just like creating and using COBOL variables.)


I have the same problem with bash. I have spent hours trying to write a simple function which in PS would have been 10 minutes.


You mean writing the function body? I was not talking about the function body, but about getting the declaration correct. (The bad error messages and lack of real-time verification surely contribute here.)

PS is more powerful than Bash; it really should be faster to write a function body in it.


Hah, I was thinking of that... I did have to code in COBOL for 6 months in ~2000 and I see some similarities in syntax paradigm (though superficial of course) - but I figured the reference would be too obscure for majority of HN audience :->


Indeed, I have the same reaction to Powershell syntax as I do Enterprise Java and C# --- extremely verbose and hard to actually read beyond the surface --- the fact that it contains English words is in some ways deceptive. The mixed case Reminds Me Of People Who Always Write Like This, and that's as annoying as it looks. My theory of why it is harder to read is because longer sequences of characters are harder to memorise than shorter ones.


I had the same experience, though with more practice powershell eventually ends up second nature. Especially with the IDEs and completion that can come with it.


It also doesn't help that by default (at least in my experience), Powershell's tab completion is slow and annoying to use. I don't know if it's technically inferior, but it feels terrible.

It does the thing where you get to cycle through various options instead of completing to the longest common prefix, which is really hard to get used to after years and years of interfaces that do the other thing.

It's also difficult to form a repertoire of common shortcuts over time because so many commands share a prefix, so most often the shortest you will be typing is 5-6 characters before you get to anything unique.


> "It does the thing where you get to cycle through various options instead of completing to the longest common prefix, which is really hard to get used to after years and years of interfaces that do the other thing."

So change it:

    Set-PSReadLineKeyHandler -Chord Tab -Function Complete
"Bash style completion (optional in Cmd mode, default in Emacs mode)" - https://docs.microsoft.com/en-us/powershell/module/psreadlin...


The defaults come out of old CMD.EXE (bad) habits, so yes, they are missing like a dozen years of Linux terminal UX experimentation/improvements.

One particularly common advice with Powershell is to try PSReadLine for more bash-like versions of tab completion and other line editing things: https://github.com/PowerShell/PSReadLine

As for common shortcuts, as mentioned elsewhere in this thread, Get-Alias (and Set-Alias) is a very useful tool.


  PS C:\Users\Kuinox> pwd

  Path
  ----
  C:\Users\Kuinox

  PS C:\Users\Kuinox> ls


      Directory: C:\Users\Kuinox


  Mode                LastWriteTime         Length Name
  ----                -------------         ------ ----
  d-----       2020-11-30     15:39                .android
  [private info redacted]

  PS C:\Users\Kuinox> Get-Alias cp

  CommandType     Name                                               Version    Source
  -----------     ----                                               -------    ------
  Alias           cp -> Copy-Item
Every command you listed is a default alias in PS.


> "Every command you listed are default alias in ps."

No, ls and cp are not aliases on Linux; the aliases which clashed too much with existing Linux commands were removed in PowerShell Core, so they only still exist on Windows.

    PS /> get-command ls,cp

    CommandType     Name    Source
    -----------     ----    ------
    Application     ls      /bin/ls
    Application     cp      /bin/cp


Are those actual backslashes I spy in there?

Jeezus.


It's on Windows, so yes, paths are displayed with backslashes.


Though I definitely agree that bash's abbreviations are a barrier to entry, I think a bigger impediment not just with bash but many languages (say, Haskell) is ungooglable operator syntax.

If you didn't know what it was, how would you figure out what `$(expression)` means in bash for example?


Haskell has https://hoogle.haskell.org/ which lets you search functions named by symbols, function signatures, etc.


Yea, in that case I just search “bash operators” and hope to find a page that includes an example of the one I’m trying to understand.


I think this one is called "command substitution", not an operator.

At least Bash has a very complete man page; you can just search for `$(` there and you'll find it. But the Bash symbols all have different functions: there are (very few) operators, quotes, variables, and probably more than I can remember.


bash syntax is terse enough that it's practical at the interactive command line. Thus once you take it up, you take it up for scripts AND daily interactive use. For system operators who use both frequently, between these two you quickly internalize the abbreviations. I can see it being an issue for infrequent users.

While it's not terrible, I find powershell pretty frustrating. I started off enthusiastic, especially given how archaic cmd.exe is. As mentioned elsewhere though, the advice not to use aliases, coupled with unbelievably long command names that I hate typing and can't always recall exactly - is it convert-to-csv? to-csv? no, it's convertto-csv - means I can never remember them, and I don't feel I should need to use the ISE to work around this. This utterly prevents me from internalizing.

Even worse, until v3 apparently, iterating over an empty array would fail out (iterate once on $null instead of not iterating at all) and had to be protected with an explicit check. I was on v2, and it was at this point that I completely checked out and decided it wasn't worth learning and that I'd wasted my time. In general I could do what I needed with either cygwin or win32 cpython and those didn't make me feel like clawing my own eyes out.

It seems the situation has improved, but I just don't see any reason to take it up again, ESPECIALLY on linux, unless I _have to_ do dotnet stuff, and even if I do I'll explore every other available option first (ironpython? f#? is there a dotnet tcl?). TBH I avoid dotnet anyway, given Microsoft's past (EEE) and current (telemetry, start menu ads, etc) behavior. Fool me once etc etc etc.


Commonly used terms/functions tend to become shorter and symbolized as some notation, similar to how all the Math notation came to be. A mathematician would be unproductive without those shorthand notations.

But I guess there is a balance. Not everyone wants to write matrix multiplications as A+.×B, though A%*%B seems acceptable to some; most nowadays write it as np.dot(A,B).


Fwiw, those aliases you mention all exist. I always do "ls" and "cd" for example. But what really annoys me is that there's no aliases for arguments. So "find . -name foo" becomes "ls -Recurse -Include foo" which makes my carpal tunnel just a little worse.

It's a shame because the rest of Powershell is so good.


There are aliases for parameters, `-ea` for `-ErrorAction` comes to mind. But those are defined by cmdlets, not the user.

However, you can always shorten parameter names as long as they remain unambiguous with other parameters, so

    Get-ChildItem -Recurse -Include foo
would become

    ls -r -i foo
which just happens to be shorter than your 'find' example.


> "there's no aliases for arguments"

There are (if implemented by the function), and moreover you can just shrink the name up to the point where it would become ambiguous. Your example is the same as "ls -r -i foo".

Also, tab completion. You never type those out in full, no carpal tunnel; it's just "l -r" then TAB and select what you want. Discoverability is everything in PS.



For commands you use frequently you can still use shortforms, positional arguments, and tab completion, though.

For instance in powershell you can also do:

    ls -r foo


More importantly, when you have to adhere to the "Generic Verb-Generic Noun" pattern, the namespace fills up quickly, and now creating a new program and sharing it is stifled by two problems:

1. coming up with a name that won't clash with others in an already-extremely-limited namespace

2. marketing of a generically-named thing.

In the Unix world, let's take top. After top came htop. How would that play out in the Powershell world? List-Processes -> SuperDuperList-Processes?



> "when you have to adhere to the "Generic Verb-Generic Noun" pattern"

a) it's GenericVerb-SpecificNoun usually with a name at the front of the noun like Get-YakubinComments, and you don't /have/ to adhere to it.

b) there isn't a single namespace, you can use the full name style Microsoft.PowerShell.Core\Get-Command if you want to disambiguate namespaces.

c) You can `import-module -prefix ZZ` and then all commands in that module will get your prefix at the start of the noun part, like Get-ZZYakubinComment to avoid clashes.

> "How would that play out in the Powershell world? List-Processes -> SuperDuperList-Processes?"

No, because List- isn't a standard verb, nor is SuperDuperList. Get-EventLog had Get-WinEvent added. Get-WmiObject became Get-CimInstance. The third party NTFSSecurity module added a Get-Item2 to cover for Get-Item, and a Get-NTFSAccess to cover for Get-ACL.

And because it's a shell and can run binaries, there's no problem with you having or running top and htop, or aliasing htop, or calling your own thing htop. The Verb-Noun pattern isn't mandatory; it's recommended for companies developing modules so that administrators will have an idea how to list things, add things, and remove things, without having to read the manual to find that it's mysterymgmtcli --auth-control -login-create instead of New-ToolUser.


Namespaces in a shell gives me all the bad vibes of XML namespaces.


Microsoft may contribute to open source but not all contributions are good. Powershell breaks with every significant open-source language style and is therefore a typically bad investment of time for a non-Windows developer.

MS technologies have sometimes been a bad investment for Windows developers too.


That is ridiculous. Please don't spread FUD.


I'm not sure if this is relevant but I tried all three of the Linux commands in Powershell and they worked as well.


All of those are aliased to the Linux commands and also have their own abbreviations (get-childitem -> gci).


As others have said, aliases do exist, but I don't think you can deny that verb-noun makes learning Powershell a lot easier. For instance, if you want to get a user account, you know that Get-User exists. You can therefore also assume that Set-User exists as a command. Tab completion helps a lot too with figuring out what options there are.
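
That discoverability is easy to sketch with real built-ins (the *User* noun here is just an illustration):

    Get-Verb                             # the approved verb list (Get, Set, New, Remove, ...)
    Get-Command -Noun *User*             # everything that acts on a "User" noun
    Get-Command -Verb Set -Noun *User*   # narrow it down to the setters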


That's what aliases are for. All three examples are there by default for me as aliases.


Lots of people here suggesting the main benefit of PowerShell is its object model and indeed that is very useful, but there are other great features as well. First and foremost, PowerShell basically has a command line parameter framework built in. You also have a runtime backed by one of the best standard libraries out there (.NET) - one in which you can easily reach into anywhere in your PowerShell scripts. It also has a module ecosystem supporting development in either PowerShell or C# proper. And now with PowerShell core it's cross platform. PowerShell also supports pipelines, but I mention this last because it's obviously not a distinguisher for it with bash. It's really not even a contest, PowerShell is way more... well, powerful than bash. Since I've become proficient I would never go back to bash.
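
For flavor, here's a minimal sketch of that built-in parameter framework (the names are illustrative); validation, mandatory prompting, and -WhatIf support come for free:

    [CmdletBinding(SupportsShouldProcess)]
    param(
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [string]$Name,

        [ValidateRange(1, 10)]
        [int]$Count = 1
    )
    if ($PSCmdlet.ShouldProcess($Name, 'Greet')) {
        1..$Count | ForEach-Object { "Hello, $Name" }
    }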


This isn’t necessarily something I’m proud of, but at my old C# job I would do inline C# in Powershell as a very hackish “C# REPL” for prototyping and interactive testing. At the time C# Interactive in Visual Studio was unreliable and I found it easier to just copy-paste C# code into a Powershell script.
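
A minimal sketch of that pattern via Add-Type (the class is illustrative):

    Add-Type -TypeDefinition @"
    public static class Prototype {
        public static int Double(int x) { return x * 2; }
    }
    "@
    [Prototype]::Double(21)   # 42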

It seems that C# Interactive has gotten better (and since leaving that job I have switched to 100% F# for .NET stuff). But a more useful application is using Powershell to bundle a .NET class library into a flexible, low-weight, modular command line application for internal use. For instance, a C# library which does serious analytics on large data, and then a Powershell script that deals with easier annoyances like AWS authentication or FTP access, argument parsing, and so on. Obviously a real .exe is a better long-term solution but I found Powershell worked really well for rapidly sharing compiled .NET code into a tool that data scientists on my team could use.


I don't think it's a question of it being more powerful than bash. Why does it need to be? I'm not going to use shell script to write apps. There are better scripting languages like python and node for scripting non-trivial apps.


I've been a powershell user for about 7 years now, or more. I have a lot of bindings and helpers in my powershell profile, and use it every day for work.

That being said, I'll never use it on Linux as my shell. It's just too slow to start. Fish, nushell, and bash all start instantaneously; powershell (legacy and core) takes more than a second to start, on beefy machines.

I've been looking at the powershell core repo in hopes of fixing this with the .NET Core ready-to-run profile; they seemed to have something like that in place, but it was disabled at the time.

Anyway, PowerShell is good, probably the best you can get on Windows (nushell also takes a bit to start, and it's still new). But on Linux you can do much better, even if that means having to struggle with Bash/Fish scripting.

For more complex scripts, a full language like Lua or Python is most likely better.

Also, last time I checked, the docker container for PowerShell Core was easily over 100MB. That might work well for a dev machine, where you set it up once, but for a CI it's not ideal.


PowerShell 7.2.0-preview.2 starts nearly instantaneously for me on Linux, while it's significantly slower on Windows.

The non-core version on Windows takes ages (up to 10 seconds) to start. It's helped a bit by configuring it with -NoLogo, but still extremely slow, especially as soon as there are any modules loaded in the profile.


Pipes in powershell are just as powerful as in bash.

Every curly braces pair is a lambda function with a single parameter $_.

You can approach powershell very functionally and the abstractions work fine.
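
A small sketch of that functional style:

    1..10 |
        Where-Object { $_ % 2 -eq 0 } |     # filter: keep the evens
        ForEach-Object { $_ * $_ } |        # map: square them
        Measure-Object -Sum |               # reduce: sum
        Select-Object -ExpandProperty Sum   # 220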

It is however a minefield of pitfalls.

Obscure error handling at the script root. Always run code in a main function that is called from the root script.

Permissive access to undefined variables and parameters.

Unwinding nested attributes by referring to them at the root object level: what if you need an attribute from an object, but the attribute collides with a built-in method?

Truthy and equality are not interchangeable. -eq and Object.Equals produce vastly different results, due to the first not being type-safe.

Some operators are syntactically similar to flags and parameters: -join.

Case insensitivity. Seriously...

All in all, if you are mindful of the pitfalls, use Set-StrictMode early on, and don't run code at the script root, you are fine. Better than bash. Well-written powershell code is, in my opinion, way more readable and maintainable than well-written bash code, despite its shortcomings.
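
A minimal defensive prologue along those lines:

    Set-StrictMode -Version Latest    # error on uninitialized variables and bad member names
    $ErrorActionPreference = 'Stop'   # make non-terminating errors terminate

    function Main {
        # real work goes here, not at the script root
    }
    Main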

However, neither of them comes close to what python is in terms of type safety, error handling, legibility, flexibility and maintainability, even for system scripting.


My favorite is that functions don't actually have return statements.

A return is just an exit for the function. If you try to assign the output of a function to a variable, it will essentially be a string containing whatever is printed to the console throughout the function's duration.


No, that's misleading. The result is not typecast to string, it's not combined into a single string, many outputs become an array. And it's not whatever is printed to the console, it's whatever was sent to the pipeline output. Printing to the console is separate in PowerShell. Unlike in Linux world where the main way appears to be abuse of stderr for things which are informational and not errors, PowerShell has many output streams, and only one of them is the output of a function. e.g.

    function test {
      "hello"
      5
      write-host "world"
    }
    $result = test
    
$result contains "hello",5 (the output of the function); 5 is still typed as an integer, not a string and not merged into a single string, and the console prints "world".

And it does this to make it work like a Unix shell, because if you write a command and get some output, then want to batch some commands together in a function and get no output except what you explicitly "return", that would be annoying.

> "functions don't actually have return statements"

They do actually have return statements (for control flow).

(And if you make classes with methods, they have traditional programming language features - called with parens, lexical scope, return statement controls the return value).


Oh man, that can also cause a lot of issues. If you call a function that has a return value you don't actually need, like Invoke-WebRequest, and you use an explicit return statement right thereafter, things go haywire. I have yet to produce a proof of concept to illustrate this issue, but I came across it, and it actually overwrites my return statement with the orphaned return value from the invoke method. It took me ages to figure out what was wrong, because when debugging the application I could see inside the function that the variable I was returning had the expected value, but what was being bound on the outside was something completely different.
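
A minimal repro sketch of that leak, with Get-Date standing in for Invoke-WebRequest:

    function Get-Broken {
        Get-Date            # unconsumed output leaks into the pipeline...
        return 'done'
    }
    (Get-Broken).Count      # 2: the DateTime AND 'done'

    function Get-Fixed {
        $null = Get-Date    # ...so explicitly discard output you don't need
        return 'done'
    }
    Get-Fixed               # just 'done'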


Makes me wonder if there might ever be support for standardizing a json input/output schema for all the typical Linux userland stuff, /proc structures, and so on. Then a "jq" like utility would have more value. Having it co-exist with the current "just a bunch of text" isn't ideal though, as you'd be stuck with something clunky like adding a switch to every command. Or an environment variable that changes everything. Or a "j-" version of each.


The best approach I've heard is to "standardize" on doing it with IO descriptors, so we'd have stdin, stdout, stderr, stdjsin, stdjsout. Individual programs could check which file descriptors are open and read/write the format automatically. This may also be the best way to leverage the benefits of plain text and structured data and move between them quite naturally as the situation demands. It's also really not that hard to write a few basic utils or fork old ones to the new scheme as a proof of concept, but AFAIK no one ever has.

While we're here, I'd also like to see more experimentation on using file descriptors with GUI/TUI tools. I've had good luck using things like vim in the middle of a pipe for data that needs a human touch before being passed through. The suckless world uses dmenu quite a bit.


A piped array of "octets" (C-style character bytes, like void*) of zero or more length is probably the most basic interface possible for relevant hardware.

A structured interface introduces much more complexity. Rather than standardize on any specific 'wire' format a better approach might be to have a bi-directional connection and a standard for negotiating across it.

This might allow different forms of communication (E.G. a shared ringbuffer), negotiation of protocols, signaling when ready / blocked / needing more to proceed, what format(s) to use and if multiple streams should be used (as well as how to multiplex them)... etc. Programs might even be added to the middle to setup the preferences between two applications and then tell the OS that they're done and to just connect the input and output directly.


JFTR, stumbled over this recently: https://github.com/kellyjonbrazil/jc "CLI tool and python library that converts the output of popular command-line tools and file-types to JSON or Dictionaries. This allows piping of output to tools like jq and simplifying automation scripts."


There was an endeavor sometime around 1998-2002 to reimplement a lot of coreutils, textutils, and other GNU-ish *utils in perl with regular, structured output formats (I believe XML at the time). Unfortunately, I don't remember the exact name and I don't think it was ever completed enough to be a contender.


libxo on FreeBSD tries to do this to some extent but a way to turn json back into neat tables is somehow missing (jq isn't great in that regard).


Shameless plug - I created a cli tool called jtbl[0] that converts json output to tables in the terminal.

[0] https://github.com/kellyjonbrazil/jtbl


Ah, that's interesting, thanks for sharing. Their github README does a nice job of showing how it doesn't add a lot of work: https://github.com/Juniper/libxo


I know there's a project, started a few years ago, building something like this for bash, called Relational pipes[1].

It's perennially in my "to check out" list and I haven't actually played around with it, mainly because I'm lazy and it's not in AUR.

So I can't comment on how ready it is to use today. But it looks interesting.

[1] https://relational-pipes.globalcode.info/v_0/index.xhtml


Maybe a better idea would be something like the HTTP Accept header for the shell, where the user can set it once and any supporting application will respond in that format.


Helge Klein, a long time Windows dev, summed up my complaints about using PowerShell as anything other than a nice shell that sorta feels like *nix:

https://helgeklein.com/blog/2014/11/hate-powershell/

I still use CMD.exe, so please get off my lawn ;-)

When I need the power of Powershell, I use C# (or even Python + Py2Exe if I'm deploying).


All of those things are bad, but the worst is the variable scoping.

Did PowerShell devs learn nothing from what was horrid about PHP?


I think for the devops person or someone doing server administration (sysadmins etc.), powershell everywhere must make things easier, even with the trade-offs (as some have mentioned, it does have a bigger memory footprint, which may or may not matter depending on a host of factors).

As a developer? I haven't found PowerShell more useful than zsh/bash or fish. (If you haven't, try fish; it has a lot of the benefits of PowerShell. Fish has its own scripting language that is more "language-like", like some of the simpler constructs of Python, syntax-wise, but via a simple plugin[0] you get bash compatibility too, and it's made for Unix-like environments.) I do like that PowerShell has a rich object data model; I just don't do that kind of thing in my shell. I mostly use aliases, shortcuts, and maybe some grepping. I don't do heavy-duty tasks from the command line where I'm not writing the logic in the first place, and I just find it easier to use the standard that my team does (currently this is JavaScript; with a shebang it executes just like a binary, and we can reliably say everyone has the same version of node).

Maybe in the future this will change, but personally I don't see the win in dividing my attention to it deeply.

[0]: https://github.com/edc/bass


I agree it's more for a devops person, not developers so much. Specifically Windows techs who maybe don't use Linux very often. I've steered a couple of the new IT guys towards it when they ask about how to automate something. It's much easier to learn fresh than shell scripts, batch files or C#.

The only place where I use PowerShell as a developer is in Visual Studio, in the NuGet Package Manager Console: Update-Package, Add-Migration, Update-Database, etc.


I tried Powershell for cross-platform usage. While I didn't put it through its paces, I didn't have any issues on linux. In the end, I ditched it:

- Powershell is too verbose. I might as well use Python or node. Hard to beat lodash and node.js for processing objects.

- Powershell starts slow, almost 2 sec. I use i3 and it's noticeable when I start a terminal with Powershell. Sure it's a one time cost but I'm nerdy like that.

- If I work on Windows I use virtual machines or WSL2 obviating the need for a cross platform shell.


It's funny how the article claims that treating everything like a string is a drawback, when it's often touted as a strength of bash.

The author of this piece claims: "Powershell [...] offers a series of commands useful for developing tools and automatisms that are very difficult to implement with simple strings." but as far as I can tell, they don't go on to actually explain any of these cases where Powershell is a more appropriate tool than traditional string-based shells.


Let me channel Snover really quick:

Imagine you need to get the MAC address of a network adapter in Bash. One way to do it would be...

   ifconfig eth0 | grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'
...another...

   ifconfig eth0 | awk '/^[a-z]/ { iface=$1; mac=$NF; next } /inet addr:/ { print iface, mac }'
Both of these are tied to the way the MAC address is printed in the output of the ifconfig command, and there's no contract that says that can't change. In fact, there are probably versions of it out there where these won't work. In PowerShell you would do this...

    Get-NetAdapter -Name Wi-Fi | select -ExpandProperty MacAddress
This is far more readable if you pass this script on to someone else, it won't break if the way Get-NetAdapter's output gets rendered to the screen changes, and best of all, since I know the discoverability tricks in PowerShell, even though I've never done this before, I didn't have to go to stack overflow to find it.


Right, but the fact that you can parse out something that didn't have a contract is a strength.

Someone had to write the Get-NetAdapter cmdlet to give you a contract, and Linux has those too, so a fairer comparison would be something like:

    addr=$(</sys/class/net/ens33/address)
I think that's just as readable as your example.

The question is how does powershell handle extracting something without a contract, and is that more readable than bash? I'm skeptical :)


> "Someone had to write the Get-NetAdapter cmdlet"

Amusingly no, nobody wrote it. There's a part of PowerShell which can wrap WMI/CIM classes into auto-generated cmdlets, and Get-NetAdapter is one of those. (Which is why it's not available on Linux installs of PowerShell).

Take a look at the XML in C:\Windows\System32\WindowsPowerShell\v1.0\Modules\NetAdapter\ and there's no C# code or .Net DLL for it, just "MSFT_NetAdapter.cmdletDefinition.cdxml" and "MSFT_NetAdapter.Format.ps1xml". Presumably auto-generated, human checked, though I can't prove that.

[1] https://docs.microsoft.com/en-us/previous-versions/windows/d...


It's not like pwsh can't do strings, ffs. It can just do objects on top of that. You can totally go bash-like-crazy in it and parse strings if you like that.


Sure, I'm saying that would be a fairer comparison - getting output from something that has no defined interface.

So why not compare apples to apples, how would powershell extract the MAC from ifconfig output? If that's more flexible or powerful than bash with standard UNIX tools, then that might be impressive!


> how would powershell extract the MAC from ifconfig output?

It could go identically crazy as in bash, as it has its own grep (sls), can use grep itself, etc.

    > (ipconfig /all | sls Physical) -replace ".+: "
    00-FF-84-14-66-D7
    00-FF-B7-06-19-0F
    00-15-5D-00-1F-3C
    ...


Right, that's a fairer comparison. So the question is, why is that better?


It's not. Everything else is.

Although, on more thought, it is better even as a parsing engine, as you have system design rules enforced across tools and OSes. Even a single tool may work differently depending on the *nix flavor.


Here's what I got:

    ifconfig | ?{$_ -like "eth0 *"} | %{($_ -Split " ")[-1]}

In other words, find the line that starts with "eth0 ", then return the last column using " " as a delimiter.


But Get-NetAdapter just works. As for your command, I don't have a /sys/class/net/ens33 on this Ubuntu virtual server. I do have a /sys/class/net/venet0, but /sys/class/net/venet0/address is blank, even though the VM does have an IP address.

Grepping everything under /sys/class/net/* can't find the IP in there. It shows up in `ip addr` as venet0:0 and as a P-t-P link, so I assume that's related; it doesn't work unless you know the adapter name, and still might not work then? What kind of "contract" is that?


Here's the output of the command the parent told me to run on my Windows workstation:

    > Get-NetAdapter -Name Wi-Fi | select -ExpandProperty MacAddress
    Get-NetAdapter : No MSFT_NetAdapter objects found with property 'Name' equal to 'Wi-Fi'.  Verify the value of the property and retry.                                                          
    At line:1 char:1                                                                              
    + Get-NetAdapter -Name Wi-Fi | select -ExpandProperty MacAddress                              
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~                                                                  
        + CategoryInfo          : ObjectNotFound: (Wi-Fi:String) [Get-NetAdapter], CimJobException
        + FullyQualifiedErrorId : CmdletizationQuery_NotFound_Name,Get-NetAdapter               
I know the problem: I don't have a wireless card on this machine. But how is this different from you complaining that your network interface isn't called ens33?


Even better than the ip | jq in my sibling comment.


Your example is relevant to show how powershell can handle some tasks better, but this task is better performed with ip than ifconfig, as it has an option for json output which jq can parse:

  ip --json link show eth0 | jq '.[0].address'
I'll admit that the discovery of this is probably not very great. I wouldn't be surprised if a number of people learned about the json output, and possibly jq, from this post.


PowerShell is simply inverting the defaults: Linux/Bash scripts output formatted text by default, but sometimes have options to return objects (often as JSON text strings). PowerShell cmdlets by default return objects (in a custom binary/in-memory format, but convertible to/from JSON, among other options like CSV and XML), but might optionally return formatted strings. (PowerShell also offers tools out of the box similar to but different from jq for exploring/converting/navigating objects.)

In general when working in PowerShell you don't have to look up if there is a command with a --json flag (or equivalent) that does what you need to do, you can assume that objects are the default response and move on to working with them with PowerShell's jq equivalents. (Obviously, that makes discovery generally easy.)
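
A short sketch of that inversion with the out-of-the-box converters:

    Get-Process | Select-Object -First 3 Name, Id | ConvertTo-Json
    Get-Process | Select-Object -First 3 Name, Id | ConvertTo-Csv -NoTypeInformation
    '{"Name":"pwsh","Id":123}' | ConvertFrom-Json   # and text back to an object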


Now parse the IP address


  ip --json address show eth0 | jq '.[0].addr_info[].local'


Cherry picking. Not a system feature. Good luck with that in general.


I'm not on a Linux box at the moment, but isn't there an eth0 entry in /sys (or maybe /proc) which can give you the mac address in an easier-to-parse way (I think you could just use cat)?

I don't know if they are still in vogue, but /sys and /proc interfaces were originally intended to return clean output which is easy to parse in programs and scripts. I guess the usage never really caught on.


Over the past month I did an experiment of using only powershell, including on my macbook and Linux boxes. I’ve gone back to zsh now though.

The benefit of powershell - the rich object data model - isn’t actually that useful in practice day to day.

The memory usage is ridiculous for a shell.

The killer though is a disregard for economy of expression. The example in the article can be expressed as just “find /path -mtime -3”.

I have kept PS on Windows, I didn't go back to cmd.exe there, but WSL2 is still my default when opening Microsoft Terminal.


> The memory usage is ridiculous for a shell.

I can bring up 10-30 instances in parallel any day without a glitch, still consuming a low amount of memory. It's higher than bash, but then bash sux, and when you have zero features compared to pwsh I guess you can use a lower amount of memory. Bash is ancient; memory was a much bigger problem back then. It's funny that in the age of Electron apps we talk about memory, when your todo app takes more than anything else.

> The killer though is a disregard for economy of expression.

?

> the rich object data model - isn’t actually that useful in practice day to day.

It's the most useful everyday feature for me.


I love powershell because I have a basic working knowledge of Linux commands. In bash this means I have to google a bit for even slightly complex things like 'change the extension of all .swp files under this directory'.

In powershell the short syntax tends to be noisy but I usually can start with the linux commands and muddle through with tab completion:

  ls -r *.swp | % { mv $_ ($_.FullName -replace '\.swp$', '.swap') }


Almost always when I see PowerShell critiqued by people used to Linux and Bash, the criticisms leveled against it often add up to: "I am used to the workarounds for the limitations of my system. Sure, your system does not have these limitations, but what if I need the workarounds, like with my current system?"

This is like... AutoCAD before the year 2000. It was digital paper, and acted exactly like a drafting board with pens, rulers, and protractors. It was better than paper, but not by much! SolidWorks came out and blew it away. It was proper 3D, with constructive solid modelling. It could generate drawings of arbitrary projections or cross sections in under a second, saving months of time. Yet... people complained the same way. What if I need drafting tools? What if I want to draw lines manually? How do I draw "just" a circle? Why do I need to define what I'm doing in 3D? Just give me digital paper! That's what I want!

I made a comment on YC News nearly a year ago that I'm going to partially repeat below: https://news.ycombinator.com/item?id=23257776

PowerShell is more UNIX than UNIX.

Seriously. In UNIX, if you want to sort the output of "ps"... sss... that's hard. Sure, it has some built-in sorting capabilities, but they're not a "sort" command; they're a random addon it has accumulated over time. It can order its output by some fields, but not others. It can't do complex sorts, such as "sort by A ascending, then by B descending". To do that, you'd have to resort to parsing its text output and feeding that into an external tool. Ugh.

Heaven help you if you want to sort the output of several different tools by matching parameters. Some may not have built-in sort capability. Some may. They might have different notions of collations or internationalisation.

In PowerShell, no command has built-in sort, except for "Sort-Object". There are practically none that do built-in grouping, except for "Group-Object". Formatting is external too, with "Format-Table", "Format-List", etc...

So in PowerShell, sorting processes by name is simply:

    ps | sort ProcessName
And never some one-character parameter like it is in UNIX, where every command has different characters for the same concept, depending on who wrote it, when, what order they added features, what conflicting letters they came across, etc...

UNIX commands are more an accident of history than a cohesive, coherent, composable design. PowerShell was designed. It was designed by one person, in one go, and it is beautiful.

The acid test I give UNIX people to see if they really understand how weak their classic bash tools are is this:

Write me a script that takes a CSV file as an input, finds processes being executed by users given their account names and process names from the input file, and then terminates those processes. Export a report of what processes were terminated, with ISO format dates of when the processes were started and how much memory they used into a CSV sorted by memory usage.

Oh, there's a user called "bash", and some of the CSV input fields may contain multiple lines and the comma character. (correctly stored in a quoted string, of course!)

This kind of thing is trivial in PowerShell. See if you can implement this correctly in bash, such that you never kill a process that isn't in the input list.

Give it a go.

...

After I posted the above, "JoshuaDavid" provided a correct Bash solution, which blew my mind because I just assumed it was borderline impossible: https://news.ycombinator.com/item?id=23267901

Note how complex his solution is, and that he had to resort to using "jq" to convert the output of "ps" to JSON for the processing!

Compare to the solution in PowerShell, of which nearly half is just sample data: https://news.ycombinator.com/item?id=23270291

Clear, readable, and easy to modify even for a junior tech.
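
(For a taste, a hedged sketch of that task; the input column names User and Process are assumptions, and Get-Process -IncludeUserName needs elevation on Windows:)

    $killed = foreach ($t in Import-Csv ./input.csv) {
        Get-Process -Name $t.Process -IncludeUserName -ErrorAction SilentlyContinue |
            Where-Object UserName -like "*\$($t.User)" |
            ForEach-Object {
                $row = [pscustomobject]@{
                    Name      = $_.Name
                    StartTime = $_.StartTime.ToString('s')              # ISO 8601
                    MemoryMB  = [math]::Round($_.WorkingSet64 / 1MB, 1)
                }
                Stop-Process -Id $_.Id
                $row
            }
    }
    $killed | Sort-Object MemoryMB -Descending | Export-Csv ./report.csv -NoTypeInformation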

What I didn't say in that thread was this: I didn't actually bother to work out the solution to my toy problem in PowerShell before JoshuaDavid posted his solution.

I made up the problem simply assuming that it's ludicrously difficult in bash -- without checking -- and I similarly assumed that it's trivial in PowerShell -- without bothering to check.

I was that confident.

Are you still that confident that Bash is superior to PowerShell? Or have you internalised its constraints, and are too used to drawing fiddly little lines on digital paper to realise that your tooling is hopelessly outmatched by solid modelling?


I like powershell, but miss

bash's set -e (really, really miss this).

Find it hard to set a script to abort with a stack trace.

Find it hard to deal with relative imports (this script imports a file in the same folder).

explaining the scoping rules

Disklike explains how your array is now a single object when you returned it from a function

Absolutely love powershell JSON support, miss native yaml support.

Love parameter globing

Love integration of parameters with a script; dislike that auto-generated help can't be done via a single-line comment on a function.


> bash set -e.(really really miss this)

    Set-PSBreakpoint -Command Write-Error -Action { break; }
similarly:

    trap { <# IDE breakpoint here #> }
> Find it hard to set a script to abort with a stack trace.

    $script:ErrorActionPreference = 'Stop'
    # or
    throw "Oops!"
> Find it hard to deal with relative imports

That's a very unfortunate limitation that I've never understood myself, to be honest. The typical "best practice" is to not use relative imports, but to use "installed" modules or scripts instead.

> explaining the scoping rules

https://docs.microsoft.com/en-us/powershell/module/microsoft...

> Dislike explaining how your array is now a single object when you return it from a function

This is also "one of those irritations" that tends to bite people when dealing with search results, e.g.: "Get-ADUser". If you always want an array (even an empty or single-valued array) then wrap functions in @(...), e.g.:

    $users = @( Get-ADUser -Filter ... )
This is also the syntax to create an empty array, or an array of one item:

    $empty = @()
    $listOfOne = @( 'foo' )
> miss native yaml support

But this would be trivial to add. Writing a module to provide commandlets such as "ConvertFrom-Yaml" and "ConvertTo-Yaml" is about a day of effort in PowerShell. Good luck doing the same thing in Bash and producing something useful, let alone full-featured!

In fact, someone has done it:

https://github.com/cloudbase/powershell-yaml

> auto-generated help can't be done via a single-line comment on a function

Two lines for automatically generated help is one too many?

    # .SYNOPSIS
    # This works...
    echo 'foo'


`bash set -e`. Your examples don't work (if you have an answer please paste it in https://github.com/PowerShell/PowerShell/issues/3415 )

    Set-PSBreakpoint -Command Write-Error -Action { break; }
    cmd.exe /c "exit 1"
    echo "if this->($LASTEXITCODE) is 1 I shouldn't be here"
Your stack trace code is wrong. What your code is doing is essentially saying: throw an exception when you encounter an error. All I am asking is: when an exception hits the top, dump the stack trace.

    $script:ErrorActionPreference = 'Stop'
    function bar() {
        throw "oops"
    }
    function foo() {
        $a = 123
        bar
    }

    foo

    PS C:\tmp> .\a.ps1

    Exception: C:\tmp\a.ps1:6:5
    Line |
    6 |      throw "oops"
        |      ~~~~~~~~~~~~
        | oops
I use `trap{$_.ScriptStackTrace; break}`, but unfortunately I cannot hide this code in a helper script that I can dot-source... because of scoping rules! Even though I am dot-sourcing a file, which I would think would load it into my current scope according to the dot-sourcing link you sent.

Why this is not the default behavior when an exception leaks to the top and aborts the script, I have no clue.

Enforcing things are arrays:

Yes, you can wrap stuff in arrays, but that is making the call site look ugly for a function-definition problem. The solution "return ,$array" is just weird.

Honestly, I prefer the bash way of $yaml | yaml2json(.exe) | ConvertFrom-Json. It's easier to find cross-platform converters, which your example is not. Takes me like 5 seconds to write; however, then I have to teach others to do it as well, so I want it in the platform. Also, while they're at it, add TOML support as well.

for documentation

  # .SYNOPSIS
  # Why do I need to type .SYNOPSIS above here, and yes that extra line is too many
  function foo(
      # but here its just obviously a comment for a parameter
      [string] $param=""
  ) {
      $param
  }
Also, I didn't mention this before, but I wish there was a version of powershell that was statically checked, like typescript for javascript.


> Honestly, I prefer the bash way of $yaml | yaml2json(.exe) | ConvertFrom-Json

I always prefer native support, but that's just me. I recently even wrote a converter for DNS bind zone files because they're such a pain to deal with as "text" files.

> Why do I need to type .SYNOPSIS above here, and yes that extra line is too many

Because there are other sections as well.

Consider yourself lucky! If you're writing binary modules in C#, the automatic help generation is missing. Instead, you have to use a hideous legacy XML-based help system nobody asked for. There are thankfully generators available now that plug into the Visual Studio build system, but in the past you literally had to author these by hand.

> I wish there was a version of powershell that was static checked like typescript for javascript.

Don't we all?

Set-StrictMode adds some static checks (not enough IMHO), and there are also linters available. If using VS Code, you get a bunch by default.

Fundamentally, once I start getting too frustrated by the weak typing, I realise that I'm writing software, not scripts. I simply crack open Visual Studio and start writing C#...

> Why this is not the default behavior

PowerShell conceptually is nearly perfect at a high level. It was originally called the "Monad shell", and that design pedigree still shines through.

Unfortunately the implementation has many gaps that are as yet unresolved.

I was hoping PowerShell Core would fix everything, but it only fixed a few things (parallel foreach, finally!) while leaving simple things like break-on-exception on the table.

Nonetheless, having worked with both Bash and PowerShell, I hugely prefer the latter because of that purity of vision.


Well... if it's about complex scripting, you have Python or Perl, which are standard, well integrated, lightweight, powerful, and preinstalled on Linux.

If it's for day-to-day usage as a shell, bash / fish / zsh... are more concise and faster.

The example given in the article:

    # Get all files modified in the last 3 days
    Get-ChildItem -Path path -Recurse | Where-Object {
        $_.LastWriteTime -gt (Get-Date).AddDays(-3)
    }

is just:

    find path -mtime -3

in bash...

The object thing is nice, but using strings as output is universal.


I wish there was a sane way to bootstrap powershell without having to use an existing binary release. It is not surprising though; you can't bootstrap bash using .NET Core either.


Telemetry in dotnet is opt-out.



