We are considering a webapp to manage some aspects of AD groups that have a managedBy attribute.  That attribute is what lets Outlook Web Access (OWA) accounts get at the group's Members attribute.

The example I found used Get-ADGroup, so I tried it.  No soap.

Went to one of our AD servers and it still didn't work.  Import-Module ActiveDirectory was really fast and let me test in the ISE.  It also means that, with this module loaded, I can run scripts without having to worry about the Active Directory command shell.

get-adgroup -filter {Name -like "Shared*"}

worked as expected, and when I piped the output to Measure-Object, I got the count.

get-adgroup -filter {Name -like "Shared*"} -properties managedby | select name, properties

This code did NOT work (it returned curly braces for every group, which on reflection is expected).  I found that appending managedby to the -Properties switch adds the managedby property to the standard output of

get-adgroup -filter {Name -like "Shared*"} -properties managedby

or to get less, which is more of what we want, to this

get-adgroup -filter {Name -like "Shared*"} -properties managedby | select name, managedby

Some of the managedby entries are blank.  I was not able to use -Filter to distinguish the entries with a blank managedby, so I returned everything, did a Select-Object, filtered with Where-Object, and piped to Measure-Object to get the count.

get-adgroup -filter {Name -like "Shared*"} -properties managedby |
    select name, managedby | where-object {$_.managedby -like "CN*"} | measure-object

You DO remember that in PowerShell a trailing pipe acts as a line continuation, so the code above runs without complaint on two lines, and is considerably easier to read.  The backtick (`) is the formal continuation character, but it is not easy to see in code.
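A quick illustration of the two continuation styles, using the same query as above (note that the backtick must be the very last character on its line):

```powershell
# trailing pipe: the statement continues on the next line
get-adgroup -filter {Name -like "Shared*"} -properties managedby |
    select name, managedby

# backtick: the formal continuation character, easy to miss when reading
get-adgroup -filter {Name -like "Shared*"} `
    -properties managedby
```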


recovery model


Under the SIMPLE recovery model, transactions are logged, but each time a checkpoint occurs and the completed transactions are written to disk, the log is truncated.  You can NOT back up the log file (it throws an error).  You can restore the database from a backup, but you cannot do a point-in-time recovery; the only PIT is the time of the last backup.

The log can still grow under a long-running transaction, because the log cannot truncate past the oldest open transaction; that includes records for transactions started after the oldest open one.


SELECT * FROM sys.databases WHERE name = 'name'  -- view recovery_model_desc


list of end points from GC and array

I have a datafile with a list of "endpoints."  Each endpoint has 8 readings.  They should be nicely arranged, but I still want to build an array of the endpoints from the datafile so that I can scan the data a second time and get the readings for each endpoint.  I guess I could hardcode the list, or put the endpoints into a database or a shared configuration file, but what I did was this:

  1. Get-Content on the remote file and store it in $lines.
  2. For each line, split the line into the date, the endpoint, and the reading.
  3. If the endpoint is already in the array, do nothing; otherwise add it to $endpoints.  I could instead add every endpoint and de-duplicate with Sort-Object -Unique, but I have no idea whether that would be performant.
  4. For each endpoint in $endpoints, write the endpoint to the console, then for each line in $lines, match the endpoint in the line; if found, split the line to get the reading and write it.
  5. Output the last line.  If $line is only used inside the foreach, it will not display after the loop is closed, so declare it at the top and it will be updated each pass.

Finally, we do a lot of 'split'.  We can split into an array variable, split into an implicit array of variables, or split and index straight into the desired item.

$array = $line.split(","); $array[2]      # split to an array variable
($a,$b,$c) = $line.split(","); $c         # split to an implicit array of variables
$line.split(",")[2]                       # split and index the desired item directly

We are reading the same data 21 times.  I BET there is a more efficient way to do this, but this post is mostly about the split options.

$endpoints = @()
$line = $null
$lines = gc "\\db-quartz\c$\scripts\biglogs\BigNeptureB.txt"

foreach ($line in $lines){
    ($a, $b, $c) = $line.split(",")
    if ($endpoints -notcontains $b) {
        $endpoints += $b
    } # end if
} # end foreach lines

foreach ($endpoint in $endpoints){
    $endpoint   # write the endpoint to the console
    foreach ($line in $lines){
        if ($line.Contains($endpoint)){
            ($a, $b, $c) = $line.split(",")
            $c   # write the reading
        } # end if
    } # end foreach line
} # end foreach $endpoint
$line # declared above; holds the last line read
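For what it's worth, those 21 passes could collapse into one by keying a hashtable on the endpoint.  A sketch of that idea, assuming the same date,endpoint,reading layout (not tested against the real file):

```powershell
# one pass over $lines: group the readings by endpoint
$readings = @{}
foreach ($line in $lines){
    ($a, $b, $c) = $line.split(",")
    if (-not $readings.ContainsKey($b)) { $readings[$b] = @() }
    $readings[$b] += $c
} # end foreach line

foreach ($endpoint in $readings.Keys){
    $endpoint              # the endpoint, as before
    $readings[$endpoint]   # all of its readings, in file order
} # end foreach endpoint
```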

windows ports open sesame?

A DEVRANT blog post with a link to more tools for using netstat et al. to look at Windows server ports.

The netstat command is

netstat -an | find /i "listening"

and there is an example of actually getting the application that is running against a port, by using the PID (process ID) from

netstat -ano | find /i "listening"
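To close the loop from PID to application, something like the following works; the 1234 here is a placeholder PID, substitute one from the netstat output:

```powershell
netstat -ano | select-string "LISTENING"   # note the PID in the last column
tasklist /FI "PID eq 1234"                 # classic cmd-style lookup by PID
get-process -Id 1234                       # the PowerShell equivalent
```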




write a function, put it down.  All the day you’ll wear a frown.  Write a function, load it up, all the day you’ll have good luck.

Load the function, and what happens?  Nothing, unless the last line of the script that loads the function calls it, or writes a usage or 'help' message to the console.  The function is (probably) loaded, and we can call it; we could even be doing the loading from the profile so we don't have to do it explicitly.  It will then do whatever we wrote it to do.  And again, and again.

We could have written the function to expect input; if we make a parameter mandatory (in the param block), PowerShell will prompt us for it.  Parameters can be supplied by position or by name, and we can also provide default values for them: "You can tell me the computer name, but if not, I will use localhost as the computer."
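A minimal sketch of both behaviors; Get-Info is a made-up function name:

```powershell
function Get-Info {
    param(
        [Parameter(Mandatory=$true)]
        [string]$Name,                        # PowerShell prompts for this if it is omitted
        [string]$ComputerName = "localhost"   # default value used when none is supplied
    )
    "Looking up $Name on $ComputerName"
}

Get-Info -Name "printers"          # falls back to the localhost default
Get-Info "printers" "db-quartz"    # both parameters supplied by position
```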

We also could write the function to look at the automatic $args array; these are the command-line entries that follow the function call.  They are retrieved as $args[0], $args[1], and so on, and $args.Count tells us how many there are.  Best practice is not to mix named parameters and $args in the same script or function.
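And a sketch of the $args side, again with a made-up function name:

```powershell
function Show-Args {
    "I was given $($args.Count) argument(s)"
    if ($args.Count -gt 0) { "The first one is $($args[0])" }
}

Show-Args red green blue   # $args[0] is 'red', $args.Count is 3
```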


date-stamp switch

I have the date.  I have a number of buckets, and I want to be in each bucket on the right date.  I need a switch statement to assign the bucket membership depending on the date.  That is, on January 1 I want to transition to the new bucket (meaning: change a variable to reflect the bucket, depending on the date).

Get the date.  (Careful: if you specify -Format, Get-Date hands back a string, not a datetime, so skip the format when you need date comparisons.)  Create a number of trigger dates on or after which we want to change buckets, doing the reverse of the above: write the date as formatted text and have PowerShell create a datetime from it.  In the switch statement, compare the current datetime to the switch values.  If the current datetime is greater than a switch value, we have gone past that date; move along.  Because we only test greater-than (not a greater-than/less-than range), we assign the variable in the first matching branch, and in the second, and so on until we are no longer greater than; we just keep matching until we finish the block.  The final value of the variable is the largest "greater than" that was tested.  Might not be ideal, but it works.

I complicated it by also testing for an exact match on the trigger dates; that lets us do something special on the change-over date, such as sending an email to state we are changing buckets.

$today = Get-Date   # no -Format here: -Format returns a string, and we need a [datetime] for the comparisons below

Come to think of it, the format of the today date is not really relevant except as a visual reminder of how the trigger dates are written.

We switch on the date.  If today is greater than a trigger date (we have passed into the next semester), we set the semester bucket.  If we are ON the cusp of a trigger, we change the bucket AND send an email notification.  We do this for all the listed semesters.  This runs each time the script runs; each time, we reset $semester based on the today date.

After we set our semester to the last value (and every day after that change), we also send an email to indicate we are at the end of our configuration and that new dates need to be set.  I guess I could have done that in code so that we are infinitely configured, but I didn't want to go out that far.  Maybe I will be gone by then.

switch ($today){
    {$today -ge (Get-Date "09/01/2013")} {$semester = "13FA"}
    {$today -eq (Get-Date "09/12/2013")} {$semester = "14JA"; $email = $true}
    {$today -gt (Get-Date "01/01/2014")} {$semester = "14JA"}
    {$today -eq (Get-Date "01/25/2015")} {$semester = "15SP"; $email = $true}
    {$today -gt (Get-Date "01/25/2015")} {$semester = "15SP"; $notify = $true}
    default {$semester = "not assigned"}
} # end of switch
If ($email) { write-host "changing semester to $semester"}
If ($notify) { write-host "$semester is the final semester coded into the script at ..."}

law school images

The law school wants a "face-book" with images of each student.  I lost the old script files, and there were some procedure changes (i.e., I no longer have DB access to the LCMS; I was using existing student enrollment in the course as a starting point for the faces).  So I did some rewriting.

First step is to generate an XML file with an entry for each student: their userid from the LCMS files, their name, and the number of their photo image from AD.

Create an array as $array = @(), then get the content of the enrollment file, select the entries with the right courses in them, split each enrollment string into three parts, and save out the userid.  Next, create a new array from the old array, sorted and unique, so we get each student just once.

Open a connection to our LDAP server, get, for each userid, the firstname, lastname, class and finally the numeric id, which is the name of the image file.  Output that to a manually created XML file.

[array]$array= @()
$sorted_array = @()

$array+= (get-content \\server\c$\lcms\apps\snapshot\data\semesterStudents.txt|where-object {$_ -like "*_13LF*"})|foreach {$_.split("|")[1]}
# write-output ("this is the count:" + $array.count) # for testing
$sorted_array = $array|sort -Unique
# write-output ("this is the count sorted:" + $sorted_array.count) # for testing

Set up the connection to AD:

$objDomain = New-Object System.DirectoryServices.DirectoryEntry("LDAP://DC=university,DC=edu")
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
# header for XML file
"<?xml version='1.0'?>"|Out-File C:\users\rbeck\desktop\law\stuLawUsers.XML -Append
"<users>"|Out-File C:\users\rbeck\desktop\law\stuLawUsers.XML -Append
foreach ($user in $sorted_array){
     $strFilter = "(&(objectCategory=User)(samAccountName=" + $user + "))"
     $objSearcher.Filter = $strFilter
     $objSearcher.SearchScope = "Subtree"
     $colResults = $objSearcher.FindAll()

 foreach ($objResult in $colResults){
     $objUser = $objResult.GetDirectoryEntry()
     $fn = $objUser.FirstName
     $ln = $objUser.LastName
     $id = $objUser.samaccountname
     $classyr = $objUser.extensionattribute1
     $idnum = $objUser.employeeID
     $output = @"
<u><userid>$id</userid><num>$idnum</num><firstname>$fn</firstname><lastname>$ln</lastname><class>$classyr</class></u>
"@
     $output | Out-File C:\users\rbeck\desktop\law\stuLawUsers.XML -Append
 } # end foreach result
} # end foreach user
"</users>"|Out-File C:\users\rbeck\desktop\law\stuLawUsers.XML -Append

OK, we have a data file.  The second script does some prep work: it creates some directories (or, if they already exist, removes their contents), then reads the new XML file.  For each entry, it checks for the existence of the image file on a remote server and outputs an error message if the image doesn't exist; otherwise it copies the image down to the local machine into a 'class' directory picked from data in the XML file, renames the file to firstname-lastname.ext, and goes to the next entry.  Zipping the directory saves about 50% of the size.  The operator needs to have admin access to the server, since we are using the admin share for the server and the files.

$path2files = "C:\local\desktop\law\lawstuimages\"
$path2images = "\\serveri\drive$\Data exchange\images\"
$folders = "Law First Year","Law Second Year","Law Third Year","Law Fourth Year","Law Graduated",  "Law Visiting","Other","Faculty","Staff","Grad First Year","Grad Second Year","Grad Third Year"
foreach ($folder in $folders){
    if (test-path ("$path2files$folder")){
        Remove-Item "$path2files$folder\*"   # remove the images
    } else {
        New-Item -ItemType Directory -Force -Path "$path2files$folder"   # create the folder
    } # end else
} # end foreach folder

Read the XML file.  Notice that $path2files is not correct for the XML file, so I have to adjust it with string.replace().

[xml]$file = Get-Content ($path2files.replace("\lawstuimages","")+"stuLawUsers.XML")
foreach( $entry in $file.users.u) { 
      $num = ([int]$entry.num).toString("0000000") # pad number to 7 digits.
      $fn = $entry.firstname
      $ln = $entry.lastname
      $class = $entry.class
           if ($class -eq ""){ $class = "Other" }
# if path not found, no image - write message to console.
 # otherwise, copy the file, put the file into the folder based on class and rename it. 
 if (test-path "$path2images\$($entry.num).jpg"){
     copy-item -path "$path2images\$($entry.num).jpg" -destination $($path2files+"$class\")
     # write-output "copying $num.jpg or $($entry.num) to $class" # for testing
     $z = $($path2files+"$class\"+$entry.num + ".jpg")
     rename-item -path $z -newname "$fn-$ln.jpg"
 }  else {
     write-output ("can't find image for $fn $ln at $num")
 } # end of if/else on test-path
} # end of foreach XML

Reading the XML is really easy; I could probably have used a built-in .NET library to write the XML out, too.  Notice the subexpression operator $() needed at lots of points to get from a variable to the value of its properties.
