Translating Unix Shell Processing to PowerShell Equivalents

I frequently find myself comparing data sets from Unix hosts and other systems. My go-to for years has been Linux shell commands: I'd ssh into a system, grab some data, and process it with sed, awk, sort, uniq, perl, etc. These days, however, I find myself working with customers who are more comfortable with PowerShell. It became an interesting exercise to take work expressed in Unix shell terms, which is how I tend to think when given a task, and translate it into PowerShell equivalents.

In this particular case I'm again working with NetApp 7-mode filers. I've received a request to validate that a list of volumes is correct, complete, and represents the full scope of volumes used to present data via SMB from these systems. I was provided a list of volumes to validate, which I simply sorted alphabetically and dropped into a plain text file.

What I need to do is log into the NetApp side, grab a list of CIFS shares, and extract the unique sorted list of volumes to compare against with diff and see what pops out. Using my Linux host I do this:

ssh sharney@xxx-fas3270-a0 cifs shares | grep \/vol | awk -F '/' '{print $3}' \
|cut -f1 -d' '| sort | uniq
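To see how that pipeline behaves, here is a minimal sketch run against canned `cifs shares`-style output instead of a live controller; the share names and paths below are made up for illustration:

```shell
# Simulated "cifs shares" output; the real text comes back over ssh.
# Mount paths are of the form /vol/<volume>/<path>.
cifs_output='share1        /vol/vol_smb01/share1    some comment
share2        /vol/vol_smb01/share2
share3        /vol/vol_smb02/share3'

# Same filter chain as above: keep /vol lines, take the 3rd
# slash-delimited field (the volume name), then sort and de-dupe.
printf '%s\n' "$cifs_output" \
  | grep '/vol' \
  | awk -F '/' '{print $3}' \
  | cut -f1 -d' ' \
  | sort | uniq
```

Run against the sample data this prints `vol_smb01` and `vol_smb02`, one volume per line, which is exactly the shape of list I want to diff.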

I just repeat that for the various other controllers in my list. This filters out the volume info, stripping the third slash-delimited field from the text output (the volume name), and sorts it. I typically don't ssh in to each controller individually; I'll use a parallel ssh tool and grab everything I need in short order, dumping the data into separate files. These days I use ansible in ad-hoc mode as my multiple-ssh session manager, but I've used other tools. You could also just use a bash for loop.
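The bash for loop version might look like the sketch below. The controller names and output directory are placeholders, and `echo` stands in for the real `ssh` invocation so the loop can be dry-run safely:

```shell
# Hypothetical controller names and output directory.
outdir=/tmp/cifs-volumes
mkdir -p "$outdir"

for controller in xxx-fas3270-a0 xxx-fas3270-a1 yyy-fas3270-a0 yyy-fas3270-a1; do
    # Real version (uncomment and drop the echo to run for real):
    # ssh "sharney@$controller" cifs shares \
    #   | grep '/vol' | awk -F '/' '{print $3}' | cut -f1 -d' ' | sort | uniq \
    #   > "$outdir/$controller.txt"
    echo "would collect volumes from $controller into $outdir/$controller.txt"
done
```

One file per controller keeps the later side-by-side comparison simple.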

Now, how would I grab the same data in PowerShell? I don't ssh in to the filer, though there is a PowerShell module for that. Instead I use the NetApp PowerShell Toolkit, which uses the NetApp Data ONTAP API (available for download from support.netapp.com).

PS H:\> Import-Module dataontap
PS H:\> $xxx3270a0 = Connect-NaController -Name xxx-fas3270-a0
PS H:\> $xxx3270a1 = Connect-NaController -Name xxx-fas3270-a1
PS H:\> $yyy3270a0 = Connect-NaController -Name yyy-fas3270-a0
PS H:\> $yyy3270a1 = Connect-NaController -Name yyy-fas3270-a1
PS H:\> $filers3270 = $xxx3270a0, $xxx3270a1, $yyy3270a0, $yyy3270a1
PS H:\> foreach ($filer in $filers3270) {write-output $filer.Name}
xxx-fas3270-a0
xxx-fas3270-a1
yyy-fas3270-a0
yyy-fas3270-a1

I've connected to four controllers (two 7-mode HA pairs) in short order using AD credentials. The Connect-NaController cmdlet will, of course, pop up a credential dialog if the AD credentials you're using don't work.

Now I just need to grab my data. Note that the line splits with \ are for readability in this post.

PS H:\> $xxx3270a0cifsshares = foreach ($share in Get-NaCifsShare -Controller \
$xxx3270a0) { $share.MountPoint.Split('/')[2] }
    # like ssh controller cifs shares | grep /vol | awk -F '/' '{print $3}'
PS H:\> $xxx3270a1cifsshares = foreach ($share in Get-NaCifsShare -Controller \
$xxx3270a1) { $share.MountPoint.Split('/')[2] }
PS H:\> $xxx3270a0cifsshares | Sort-Object | Get-Unique
    # then like | sort | uniq

I could easily foreach loop over the $filers3270 objects, but I wanted to do each one in turn. The latter command, piping my initial filtered object into Sort-Object | Get-Unique, simply prints the result, which I can capture in a text file. Since this list is sorted and the lists I was provided are sorted, lining them up side by side to compare is one way to complete my task. diff will do the job programmatically. My weapon of choice for this is vimdiff, which gives me those results visually as well.
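As a quick sketch of the comparison step, here are two made-up sorted volume lists; the filenames and volume names are illustrative only:

```shell
# Two sorted lists: what I was given vs. what the filer reports.
printf 'vol_smb01\nvol_smb02\nvol_smb03\n' > provided.txt
printf 'vol_smb01\nvol_smb02\nvol_smb04\n' > actual.txt

# Lines unique to either file pop out; identical lists print nothing.
# (diff exits non-zero when the files differ, hence the || true.)
diff provided.txt actual.txt || true

# vimdiff provided.txt actual.txt shows the same differences visually.
```

Here diff flags that `vol_smb03` is only in the provided list and `vol_smb04` is only on the filer, which is precisely the discrepancy the validation is meant to surface.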

So how did I figure out how to translate things from awk, etc. into PowerShell equivalents? If you guessed "google" and "stackoverflow", you would be correct. It didn't take terribly long to find results, experiment a bit, and make the translation. The nice thing about PowerShell is that it doesn't return plain text that I then filter with text-processing tools; instead I obtain objects with attributes. There is more power in this approach, and it's generally more reliable than text processing. Of course, processing text with regexes and pattern matching is always a useful technique, but having defined objects with attributes is even better: CLI output can change and patterns can be unpredictable. As long as the API you're leveraging is stable, it's likely the better approach, and you can fall back on a minimal amount of text parsing where required.

I could instead have used the NetApp SDK with Python (or Perl) to perform these tasks on Unix and likewise received API object values. But there's something to be said for quick and dirty and getting the job done, and what's very nice about PowerShell here is that it lets me operate quick and dirty. The equivalent Python SDK isn't quite as quick to use, though that may be because I'm still ramping up my Python skills. It is also true that Microsoft has made practical ease of use a focus of PowerShell's design.

Making the translation from my Unix shell, text-processing approach to a PowerShell equivalent also has value because I can give colleagues who are more comfortable with Windows and PowerShell something they can use, which lets me collaborate more effectively. It also deepens my understanding of these different toolkits and sparks additional ideas for automating these kinds of day-to-day tasks. This example uses NetApp-provided PowerShell interfaces, but of course the same techniques can be applied with other PowerShell-based tools.
