GET-ACL and “ReadAndExecute” versus List

I find it a lot easier to do virtually all of my work via the keyboard.  Using PS ISE I can essentially make a log of everything I work on during the day. There are a few things where I have to resort to using a GUI but I’m learning how to get around more and more of those.

One of the things I learned a while back was using GET-ACL to find the NTFS security on a shared folder, so I could see what AD group a person would need to be in for access. In case you haven’t used it, it’s essentially something like this:

get-acl $fldrpath | fl AccessToString

It works great – until you hit a situation where the real permission is List. Then it’s confusing:

Everyone Allow ReadAndExecute, Synchronize

Looks just like Read-only access.

After a little searching around I found that there is a way with PowerShell to get the correct List entry – the InheritanceFlags on List and Read-only differ. List has only the inheritance flag “ContainerInherit” while Read has “ContainerInherit, ObjectInherit”. Once I updated my quicky script to include some extra logic to check for that, presto:

Everyone ----------------------------------------> Allow -----> ListDirectory

Much better 🙂
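
In case it helps anyone, here’s a rough sketch of the kind of check I added (not my exact script, and it assumes $fldrpath already holds the folder you’re inspecting):

# Rough sketch - translate ReadAndExecute entries that only have ContainerInherit into List
$acl = Get-Acl $fldrpath
foreach ($ace in $acl.Access)
    {
        $rights = $ace.FileSystemRights.ToString()
        # ReadAndExecute with ContainerInherit only (no ObjectInherit) is really List
        if (($rights -match "ReadAndExecute") -and ($ace.InheritanceFlags -eq "ContainerInherit"))
            {
                $rights = "ListDirectory"
            }
        Write-Host $ace.IdentityReference "---->" $ace.AccessControlType "---->" $rights
    }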

Using PowerShell to get a list of shares from a server

This one is relatively easy at first glance.

$shares = Get-WmiObject -ComputerName SERVERNAME -class win32_Share

The win32_Share class gets the shares as listed by WMI. From here you can normally get the permissions fairly easily. Except in a case like this:

Name Path Description
---- ---- -----------
ADMIN$ C:\Windows Remote Admin
C$ C:\ Default share
E$ E:\ Default share
F$ F:\ Default share
G$ G:\ Default share
\\SERVER-MSDTC\M$ M:\ Cluster Default Share
H$ H:\ Default share
IPC$ Remote IPC
\\SERVERSHR-CLS\ClusterStorage$ C:\ClusterStorage Cluster Shared Volumes Default Share
M$ M:\ Default share
\\SERVERSHR-CLS\Q$ Q:\ Cluster Default Share
O$ O:\ Default share
\\SERVERSHR-SQL\F$ F:\ Cluster Default Share
\\SERVERSHR-SQL\FILES F:\files
Q$ Q:\ Default share
\\SERVERSHR-SQL\G$ G:\ Cluster Default Share
\\SERVERSHR-SQL\H$ H:\ Cluster Default Share
LogFiles C:\Program Files\Microsoft SQL Server\MSRS11.RPT\Reporting Services\LogFiles
\\SERVERSHR-SQL\App F:\app
\\SERVERSHR-SQL\bin F:\\bin
\\SERVERSHR-SQL\O$ O:\ Cluster Default Share
\\SERVERSHR-SQL\Files2 F:\Files2
\\SERVERSHR-SQL\Files3 H:\Files3

If we were looping through trying to do something like this on each share

$ShareSec = Get-WmiObject -Class Win32_LogicalShareSecuritySetting -ComputerName $($ServerReportingOn.DNSHostname) -Filter "Name='$sharetocheck'"

We’d get errors that would look like this.

+ ... $ShareSec = Get-WmiObject -Class Win32_LogicalShareSecuritySetting -C ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Get-WmiObject], ManagementException
+ FullyQualifiedErrorId : GetWMIManagementException,Microsoft.PowerShell.Commands.GetWmiObjectCommand

The solution is simple. Notice how all the cluster shares start with “\\”? In your loop to get the permissions, check the share name for “\\” and skip the “Get-WmiObject -Class Win32_LogicalShareSecuritySetting” line for any share name starting with it.
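
Something along these lines should work (a sketch only, reusing the $ServerReportingOn variable from my larger script):

# Loop over the shares, skipping cluster share names that start with "\\"
$shares = Get-WmiObject -ComputerName $ServerReportingOn.DNSHostname -Class Win32_Share
foreach ($share in $shares)
    {
        $sharetocheck = $share.Name
        if ($sharetocheck.StartsWith("\\"))
            {
                # Cluster shares - Win32_LogicalShareSecuritySetting errors on these, so skip them
                continue
            }
        $ShareSec = Get-WmiObject -Class Win32_LogicalShareSecuritySetting -ComputerName $($ServerReportingOn.DNSHostname) -Filter "Name='$sharetocheck'"
        # ... do whatever you need with $ShareSec here ...
    }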

Just upgraded to Windows 10 on my personal laptop.

I was sort of leery of doing this given the media reports of the multiple locations where Microsoft decided to be helpful with people’s data, notably the telemetry that sends portions of file contents back to Microsoft.

That said, I decided to go ahead and bite the bullet and upgrade.

First thing I did though was blast my personal Lenovo G570 back to the factory image. I figured that would be the safest bet; given I had installed and removed some software, I wanted the cleanest image possible. After downloading a mere 3 GIG of patches to Windows 7 I was finally at a fully patched level. At this point I took the plunge.

So far it has been a fairly pleasant experience. At this point I’m already looking at Windows 8/8.1 as more akin to Windows 95, with the fixed and polished Windows 10 being like Win98.

Indeed, the only problem I have run into was my sound disappearing, even with the driver and hardware saying everything was cool. I wound up having to Google the answer, which was on an MS forum. Short answer to the problem: right-click the speaker in the tray -> Playback devices -> click your speaker -> click Properties -> click the Enhancements tab -> tick ‘Disable all enhancements’.

Just had something weird happen – and the fix wasn’t obvious

This morning when Outlook 2013 loaded on my work PC the Outlook Social Connector got disabled because it was taking too long to load.

So I did the normal routine – go to the FILE tab, then Options, then Add-ins in the Options window, select Disabled Items from the Manage list at the bottom of the window, and click Go. I then selected the Social Connector add-in and clicked the Enable button. After a restart of Outlook the Social Connector was still disabled.

The solution was to go into the registry. The path in question (for 32-bit Office on 64-bit Windows) is

HKLM\SOFTWARE\Microsoft\Office\15.0\ClickToRun\REGISTRY\MACHINE\Software\Wow6432Node\Microsoft\Office\Outlook\Addins\OscAddin.Connect

The LoadBehavior REG_DWORD was still at value 2. Using the value listing from this page I closed Outlook again and updated LoadBehavior to 3. Restarting Outlook I now have my Social Connector and People Pane back.
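
If you would rather script it than use regedit, something like this should do the same thing (a sketch, run from an elevated PowerShell prompt with Outlook closed; the path is the 32-bit-Office-on-64-bit-Windows one from above):

# Set the Social Connector add-in back to "load at startup" (LoadBehavior = 3)
$key = "HKLM:\SOFTWARE\Microsoft\Office\15.0\ClickToRun\REGISTRY\MACHINE\Software\Wow6432Node\Microsoft\Office\Outlook\Addins\OscAddin.Connect"
Set-ItemProperty -Path $key -Name LoadBehavior -Value 3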

Sony, you need to fire the whole IT security department

I’ve read some stories on the Sony hack today that are disturbing.

47000+ SSNs of current and former employees, including celebs like Sylvester Stallone, were breached.

OUCH!

That was the disturbing part. Now comes the mind-numbingly stupid part. Some of the data breached was passwords.

Not a few either.

THOUSANDS.

FOR EVERYTHING FROM FACEBOOK ACCOUNTS TO LEXIS/NEXIS TO AMEX TO FIDELITY!

STORED IN PLAIN TEXT!!


Sony, a word of advice – Fire your whole IT security department. Now. They are obviously grossly incompetent or they would have at least used something like KeePass to somewhat safely vault them. They have made life hell for thousands of current and former Sony staff, wrecked the security of all your data and systems, and destroyed your corporate reputation. It will take years to recover from this.

http://gizmodo.com/sony-pictures-hack-keeps-getting-worse-thousands-of-pa-1666761704

http://www.buzzfeed.com/charliewarzel/it-gets-worse-the-newest-sony-data-breach-exposes-thousands

SUBINACL – and needing to migrate ACLs for a foreign forest

So here I was today, working on a folder on a server in a foreign forest, trying to duplicate the ACLs from the source forest in the target forest. Both forests contained groups with the same RDNs and SAM account names.

My first choice was the obvious and trusty SUBINACL. So I entered the command below expecting it to start chugging away.

subinacl /subdirectories "x:\foldertochange\*.*" /migratetodomain=source=target

Instead of the nice chug of folders being modified I got an error saying the syntax was wrong. Quick checks – yep, using an elevated prompt; spelling good; the trust between the forests was still up. Switching the command to verbose mode I got a different result:

subinacl /verbose=1 /subdirectories "x:\foldertochange\*.*" /migratetodomain=source=target

1722 Unexpected error  NetUserModalsGet on server \\DC-IN-TARGET
Error finding domain name : 1722 The RPC server is unavailable

The fix is simple. On the source data server I opened the HOSTS file and added an entry for the target DC it was trying to talk to. Re-ran the command and 211,000+ objects later everything was golden.
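
If you prefer to script the HOSTS change, something like this would do it (the IP address here is a placeholder; use the real address of the target DC, and run it from an elevated prompt):

# Add a HOSTS entry for the target DC so SUBINACL can resolve it
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "10.1.2.3`tDC-IN-TARGET"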

Office365 and dirsync – the multiple accounts with the same UPN/mail/proxyaddresses value

For everyone who is working on an Office365 email deployment and using dirsync, you are probably familiar with dirsync errors and with hunting down the duplicated proxy addresses within AD. For everyone who hasn’t started one of these: there are several things which must be unique within the FOREST, and proxy addresses are one of them. Dirsync will find instances of duplicated proxy addresses and will error on those objects.

Running into this at work I decided to see if someone had done the heavy-lifting of writing a script before me to find duplicated proxy address, mail, and UPN values. I didn’t find anything doing a quick Google that suited my needs and wants.

What this does is go out and pull all of the user, contact, and group objects within the forest, selecting the canonical name, mail, UPN, and proxy addresses values for each, and put all of that into a single array. It then prompts for the string to look for. Once that is entered, a straight search through all of the collected objects is performed and any matches are displayed. The input loop is then repeated so I don’t have to recollect all the data each time.

# This code is deliberately inefficient on the Get-ADObject command. The purpose being so the script can be adapted for other
# duplicate searches, such as for the mail user object property being non-unique. One advantage to this code is that it looks for
# all user and contact objects and gets their UPN, mail, and proxyaddresses values, rather than just those with homeMDB populated. I have seen some
# accounts that have had the mailboxes rudely disassociated leaving proxyaddresses values that are not searchable via EMC/EMS.
#
$domainlist= (get-adforest).domains
foreach ($d in $domainlist)
    {
        Write-host "Processing domain " $d ". Please be patient. This may take some time depending on the number of user objects."
        $userlist = get-adobject -LDAPFilter "(&(objectClass=User)(objectCategory=person))" -Server $d -properties canonicalname,proxyaddresses,mail,userprincipalname | select canonicalname,proxyaddresses,mail,userprincipalname
        $contactlist = get-adobject -LDAPFilter "objectClass=Contact" -Server $d -properties canonicalname,proxyaddresses,mail,userprincipalname | select canonicalname,proxyaddresses,mail,userprincipalname
        $grouplist = get-adobject -LDAPFilter "objectClass=group" -Server $d -properties canonicalname,proxyaddresses,mail,userprincipalname | select canonicalname,proxyaddresses,mail,userprincipalname
	    foreach ($ul in $userlist)
	        {
		        [array]$allobjs += $ul
	        }
	    foreach ($cl in $contactlist)
	        {
		        [array]$allobjs += $cl
	        }
	    foreach ($gl in $grouplist)
	        {
		    [array]$allobjs += $gl
	        }
    }
$total = $allobjs.count
write-host " "
write-host "-------------------"
write-host "Total user and contact objects collected : " $total
$count = 1
$MatchingObjs = $null
write-host " "
write-host "-------------------"
$address= read-Host "Enter search address. Hit ENTER or type exit to exit. : "
If (($address.Length -gt 0) -and ($address -ne "exit"))
    {
        Do
            {
                foreach ($ao in $allobjs)
                    {
                        Write-Progress -Activity "Scanning for $address" -PercentComplete ($count/$total * 100)
                        $MatchFound = $False
                        ForEach ($pa in $ao.ProxyAddresses) 
                            {
                                If ($pa -Match $address)
                                    {
                                        $MatchFound = $True
                                    }
                            }
                        #add matches to array 
                        If ($ao.mail -Match $address)
                            {
                                $MatchFound = $True
                            }
                        If ($ao.userprincipalname -Match $address)
                            {
                                $MatchFound = $True
                            }
                        If ($MatchFound) 
                            {
                                [array]$MatchingObjs += $ao
                            }
                        $count++
                    }
                write-host " "
                write-host "-------------------"
                Write-host "Matching objects:"
                write-host "-------------------"
                foreach ($mo in $MatchingObjs)
                    {
                        write-host $mo.canonicalname
                    }
                write-host "-------------------"
                write-host " "
                $count = 1
                $MatchingObjs = $null
                $address = read-Host "Enter search address. Hit ENTER or type exit to exit. : "
            }
        Until (($address.Length -eq 0) -or ($address -eq "exit"))
    }

Running a different Linux at home now

For years I’ve been using Ubuntu at home, but I finally decided to switch to something else.

It wasn’t one thing in particular that did it, more a collection of things. Ever since I had upgraded to the 13.04 version I had a problem with the system throwing an error at boot up. After some searching I found this was a common issue and had to do with, IIRC, the Gnome components versus some other things. I’d fix it, and then either the next version or some update would come along and re-introduce the problem. I also got tired of opening the software center and getting ads for programs when all I wanted to do was add something like gimp. Even the system updater was buggy, sometimes having to be manually started in order to get updates.

After doing a little research I decided to go with Linux Mint. So far I’ve been impressed. The install was smooth and the end result has been like a breath of fresh air. I haven’t re-installed everything yet, but what I have installed has not balked or errored out.

The only thing I found frustrating, as I did with Ubuntu, was getting printers installed. I don’t blame either distro for that. Instead it is the fault of the printer manufacturers who want to hide how their things work. On that I really do wish the printer makers (I’m looking at you Epson, HP, and Brother) would get their act together and start publishing either drivers for Linux that are easy to install or the APIs so that the community could do it.

Interlude II : Excel with PowerShell

In my last post I finished with this code:

$xl = New-Object -ComObject "Excel.Application"
$xl.visible = $true
$xlbook = $xl.workbooks.Add()
$xlbook.worksheets.Add() | out-null
$xlsheets = $xlbook.worksheets
$xlsheet1 = $xlsheets.item(1)
$xlsheet1.name = "Report"
$xlsheet1.Cells.Item(1,1) = "Header1"
$xlsheet1.Cells.Item(1,2) = "Header2"
$xlsheet1.Cells.Item(1,3) = "Header3"
$xlbook.SaveAs("REPORT.XLSX")
$xl.quit() | Out-Null

As I said, this can be made much better-looking for a report. In addition, we can do things a bit smarter to make our job easier down the road.

First off, let’s break our code up into functions.

Function NewExcelObj
     {
          Set-Variable -Name xl -Value (New-Object -ComObject "Excel.Application") -Scope 1
          $xl.visible = $true
          Set-Variable -Name xlbook -Value ($xl.workbooks.Add()) -Scope 1
          $xlbook.worksheets.Add() | out-null
     }
Function AddSheets
     {
          Set-Variable -Name xlsheet -Value ($xlbook.worksheets) -Scope 1
          Set-Variable -Name xlsheet1 -Value ($xlsheet.item(1)) -Scope 1
          $xlsheet1.name = "Report"
     }
Function AddHeaders
     {
          $xlsheet1.Cells.Item(1,1) = "Header1"
          $xlsheet1.Cells.Item(1,2) = "Header2"
          $xlsheet1.Cells.Item(1,3) = "Header3"
     }
Function CloseExcel
     {
          $xlbook.SaveAs("REPORT.XLSX")
          $xl.quit() | Out-Null
     }
# Main code
NewExcelObj
AddSheets
AddHeaders
CloseExcel

What we’ve done is moved all of our code into functions. Note we had to change how certain variables are defined.

In PowerShell there are different scopes, and variables, objects, and constants reside within them. Variables (note: I’m using this as shorthand to include objects and constants as well) defined in the main script are available to all levels of the script. A variable defined within a function is normally available only to that function’s scope. When the code returns from the function, the variable defined within the function is lost.

Because of that you’ll see I have redone the definitions of some of the items used previously. Instead of the normal “$Variable = value” methodology I have swapped in “Set-Variable -Name Variable -Value (value) -Scope 1”. Using this format lets me use the Scope parameter to say that the variable is defined one scope level above the function’s scope, in this case the base script scope. With the variables created that way I can proceed to use them in other functions.
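
To make the scope behaviour concrete, here is a tiny standalone sketch (not part of the report script):

Function NormalAssign
     {
          $inner = "set inside the function"     # normal assignment - function scope only
     }
Function ParentAssign
     {
          Set-Variable -Name outer -Value "set inside the function" -Scope 1     # created one scope up
     }
NormalAssign
ParentAssign
Write-Host "inner = '$inner'"     # empty - the variable died with the function scope
Write-Host "outer = '$outer'"     # shows the value - it survived into the calling scope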

This cleans up the code some, but what about the spreadsheet? It’s still as ugly as before.

Let’s do something about that:

Function AddHeaders ($AHhl)
     {
          $col = 1
          Foreach ($t in $AHhl)
               {
                    $xlsheet1.Cells.Item(1,$col) = $t
                    $xlsheet1.Cells.Item(1,$col).Font.Bold = $True
                    $xlsheet1.Cells.Item(1,$col).Interior.ColorIndex = 15
                    $col++
               }
     }
# Main code
NewExcelObj
AddSheets
[array]$headerlist = "Header1","Header2","Header3"
AddHeaders $headerlist
CloseExcel


We are doing a lot of things here at once. First, just before the call of the function AddHeaders we are defining an array that contains our header values. We then pass that variable to AddHeaders – notice the call is now “AddHeaders $headerlist” and our function header is “Function AddHeaders ($AHhl)”, where $AHhl contains the passed array values of $headerlist. We now have a function-local variable called $col to keep track of what column we are in. We go into a standard Foreach loop where we step through each member of $AHhl, populate the header column, and move on by incrementing $col ($col++). Between the two lines populating and incrementing are two lines that contain formatting instructions for the cells we just touched. The first sets the font in the cell to bold, the second sets a light grey fill colour in the cell.

That makes our headers stand out.

There’s still one more thing we should do to our sheet – autofit the columns. There is nothing more annoying to people looking at a spreadsheet than having to change the column width. We can accomplish this with just two lines of code added to our CloseExcel function.

Function CloseExcel
     {
         $UsedRange = $xlsheet1.UsedRange
         [void] $UsedRange.EntireColumn.Autofit()
         $xlbook.SaveAs("REPORT.XLSX")
         $xl.quit() | Out-Null
     }

We define a variable in the scope of the function called $UsedRange and assign the sheet’s UsedRange property to it. After that we use the Autofit() method on the EntireColumn property, forcing all of the used columns to autofit to the data within them. Finally we run our previously defined statements, saving the workbook and closing Excel.