Monday, October 30, 2017

Changing AD User Login Hours

Today I received a request to remove the login hour restrictions for all the users in our forest. A little bit of research didn't satisfy my curiosity, so I decided to look into the specifics. First, the script itself is pretty simple - sometime in the organization's past, someone decided to set the logon hours so people could not log in between 02:00 and 04:00. We have a new system, and since students are 24x7, we needed to remove these restrictions. I was asked to simply remove it, so I'm querying all the enabled users and updating each one.

 # Grab every enabled user account in the domain
 $Users = Get-ADUser -Filter {enabled -eq $true}

 # 21 bytes x 8 bits = 168 hours (one full week); 255 in every byte means no restrictions
 [byte[]]$LogonHours = @(255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255)

 $Users | ForEach-Object {
   Set-ADUser $_ -Replace @{logonhours = $LogonHours}
 }

As you can see the script isn't complex and simply does its job. My curiosity was with the logonHours value and why it looked so peculiar.

I opened ADSI Edit and looked at the attribute in question and it appears like this:
ADSI LogonHours Data
As you can see it's a single 21-byte value. Each byte covers eight hours and each bit represents one hour, starting midnight Sunday morning (21 bytes x 8 bits = 168 hours, one full week; keep in mind the value is interpreted in UTC, so the raw bytes can look shifted from your local time). If a bit is set to 0 the user is not able to log in during that hour; if it's set to 1 the user has authority to log in. As an example, if you wanted no restrictions they would all be set to 255. If you wanted to enable 8am until 6pm (08:00 - 18:00) on a given day, that day's three bytes would appear as "00 FF 03" - the last byte having only the bits for the 16:00 and 17:00 hours set. This would allow them to log into the system through 17:59; at 18:00 they would be unable to log in.
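
To make the mapping concrete, here is a small sketch - not part of the original change, just an illustration assuming the documented layout where bit 0 of byte 0 is the first hour of Sunday - that builds a logonHours value allowing 08:00 - 18:00 every day:

# Build a logonHours value that allows logons 08:00-17:59 every day and denies all other hours
[byte[]]$LogonHours = @(0) * 21
for ($day = 0; $day -lt 7; $day++) {
    foreach ($hour in 8..17) {
        $bit  = ($day * 24) + $hour              # absolute hour of the week (0-167)
        $byte = [int][math]::Floor($bit / 8)     # which of the 21 bytes holds this hour
        $LogonHours[$byte] = $LogonHours[$byte] -bor (1 -shl ($bit % 8))
    }
}
($LogonHours | ForEach-Object { $_.ToString('X2') }) -join ' '   # prints "00 FF 03" repeated for each day
# Set-ADUser SOMEUSER -Replace @{logonhours = $LogonHours}       # apply it the same way as above (SOMEUSER is a placeholder)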

It took me a few seconds to figure out what was going on here and why it wasn't in a standard format; however, once I put it together, it actually makes perfect sense. 

Hope this helped someone out. 



The same field in binary, just so you can see it - each bit represents an hour:

logonHours in Binary



Tuesday, June 13, 2017

Another Update & Rename… kinda…

I’ve moved again and am now in Higher Education. A recap of the last several years would be:
  • Death Care Provider
  • Federal Government (DHS)
  • IBM Partner / Contractor
  • Own Business
  • Higher Education
Not a horrible career trajectory. There are a few things I’ve noticed in the industry recently, and I’m hoping to write a few articles about them soon.
At the university that currently sends me money on a regular basis, we have no configuration or standards management. Literally everything we do is ad hoc and non-repeatable. We are now implementing new processes, standards, and security, and a lot of this will be completed using Chef and related components. We are also implementing a new internal cloud infrastructure and looking to be as responsive to our internal customers as the public cloud providers (actually we would like to be better, just sayin’).

I have never had the inclination to learn Git, Ruby, JavaScript, or Python; however, they are all skills I will need to acquire in the next few months while we transition to a more DevOps culture here. I'm looking to start posting additional code on GitHub and will link it here. 

Should be exciting!
Cheers

Monday, August 8, 2016

Housekeeping & Info

First, a little housekeeping - I've started my company, Serion Technology, Inc., and it's been keeping me rather busy recently. I've had a few good ideas for blog topics and issues I've come across, but I have not been able to sit down and write about any of them until now. Please forgive the absence, and I hope you find value in my rants and technical explorations.

Thanks for reading.
-Me-


Wednesday, September 18, 2013

Automatically Restart AIX Printers

I have an issue with printers in AIX, especially if the network connection isn’t 100% reliable or if the users tend to turn the printers off frequently. One of my customers is an emergency first responder organization, so they need to be up and running as close to 100% as possible, and printing is an important piece of their workflow. I created this script years ago and needed to reuse it this morning for them. It simply checks all the printers for ‘DOWN’ status and restarts them. I scheduled it in cron to execute every 5 minutes and everyone is happy.

#!/usr/bin/ksh

# Keep going as long as any queue reports DOWN
while (enq -AW | grep DOWN | grep -v HOST)
do
    # Restart (enq -U -P) the first queue reporting DOWN; the awk/sed pipeline
    # builds the Queue:Device name that enq expects
    enq -U -P `enq -AW | grep DOWN | grep -v HOST | awk '{print $1$2}' | awk 'NR>1{exit};1' | sed 's/@/:@/'`
done




Cheers

Saturday, May 11, 2013

Microsoft Surface

I admit it - I love new toys and gadgets, and this certainly fits into that category. My current title is “Director of Managed Services” and as such I need to go out and actually talk to customers. Now, this is a new role for me, coming from a technical background to one that is customer facing and actually holds the customer’s interests at heart before those of my company. I am really enjoying it; going to customer locations to discuss issues and find solutions is a familiar role in a new and exciting setting. 

So how does Microsoft’s Surface play into this role, and why isn’t it just a toy for me? Well, it’s quite simple: we are a Microsoft Partner and I felt it was best to utilize a Microsoft product to show off some of the features of the tablet and the Windows 8 operating system. Here are the specs:

  • Microsoft Surface Pro
  • 128GB SSD
  • 4GB RAM
  • Intel i5 Processor @ 1.7GHz

Honestly, looking at the specs it seems that it’s more of a very small form factor computer than a tablet, which is one of the problems. It’s not really a tablet… I need to be able to show off the latest and greatest available with Microsoft’s products, but having a full-blown computer in tablet form is cheating a little bit.

The major features that my customers are interested in are management of remote assets, security (think: sales guy at Starbucks), and in some cases slow access to remote offices. The best way to start that discussion is with DirectAccess, BitLocker, and BranchCache, two of which require the Enterprise version of the operating system. So now I need to upgrade the tablet/computer, and Windows RT is completely out of the question.

I do like the fact that I can now go into a meeting and write in OneNote with the customer right there. If I’m connected to the internet it’s automatically syncing to my SharePoint site or SkyDrive. That means that if I lost my bag I would still have all the notes and data that I collected from the customer. Additionally, because all my data is encrypted with BitLocker, I don’t need to worry about confidential customer or business data ending up somewhere it shouldn’t.

With the exception of battery life, I’m so far pretty pleased with the device. I’m sure I can stretch some more life out of it by tweaking the power settings, but I’m not quite ready to do that yet. I’m still getting used to the differences between applications that are published for the Win8 UI and the regular Windows applications; OneNote is a perfect example.

I’ll certainly keep posting as this experiment plays out – Cheers!

Thomas


Thursday, December 27, 2012

Reclaim Underutilized OFFSITE Volumes in TSM

I have a customer with a reasonably sized TSM environment; however, the daily change rate is really low. This causes the DRM media that is sent offsite to be utilized at roughly 1-5%. Since the data doesn’t expire quickly, the volumes aren’t reclaimed and tend to sit offsite without being rotated. Using LTO5 media, you can imagine that it’s a huge waste of media space.

I put this together to be run periodically, either manually or via cron, and allowed to run until completion. It will start with the least utilized volumes and work toward the ones that are the most utilized. Of course, if you use collocation or will never fill a volume to 80%, you will need to change the values accordingly.

#!/usr/bin/perl
use strict;
use warnings;
my $login = "/usr/bin/dsmadmc -id=ADMINUSER -password=ADMINPASS";
my @volumes;
# Collect the offsite volumes that are partially used but under 80% utilized,
# ordered with the least utilized first
sub getVolumeList {
    my @tempVolumes = ();
    @tempVolumes = `$login -dataonly=yes "select volume_name from volumes where access='OFFSITE' and pct_utilized>0 and pct_utilized<80 order by pct_utilized" `;
    foreach (@tempVolumes) {
        chomp;
        push @volumes, substr($_, -9);    # keep just the volume name (last 9 characters - adjust for your naming)
    }
}
getVolumeList;
foreach (@volumes) {
    print "$_\n";
    my $result = `$login -noconfirm "move data $_ wait=yes"`;
}

Monday, August 6, 2012

Schedule Creation of MKSYSB to NIM Server

I have a customer that has a fair number of systems they need to protect. Recently we had a corruption that required a re-install of the operating system. While this is an extremely rare event on modern systems, the problem was compounded by the fact they didn’t have a good mksysb backup. We needed to find a system that was similar (Test/Dev), locate the tape drives, move them to the source, take the backup, move the tape to the target, and re-install. Not a fun evening, and all the trouble could have easily been averted by having a good backup.
If you have more than four AIX systems, I would recommend having a NIM server. This should be your point of administration for everything done in AIX if possible. It also gives you an environment to script, upgrade, and deploy without working on your production systems.
This is a script I wrote to be executed by cron on the NIM server. It does a few things here:
  1. Queries NIM for a list of “standalone” systems
  2. Performs a mksysb backup of each and registers them on the NIM server as a resource
  3. Creates a backup of the volume group that all my NIM data is on. (I know you didn’t put it on rootvg!)
  4. Ejects the tape so it can be brought offsite
  5. Emails the report to the admin
#!/usr/bin/sh
 
DATE=`date +%m%d%Y`
LOGFILE=/tmp/mksysblog_${DATE}
SENDTO="admin@domain.com"
MSGCONTENT=""
 
LOG()
{
    echo "$*" >> $LOGFILE 2>&1
}
 
LOG "------------ MKSYSB LOG FOR ${DATE} ------------------"
TIME=`date`
LOG "Process started at ${TIME}"
 
#for mach in 0; do
for mach in $(lsnim -t standalone | awk '{print $1}'); do
    LOG ""
    LOG ""
    LOG "**************************************************"
    LOG "Starting process for ${mach} "
    LOG "Removing the NIM resources for ${mach}"
 
    nim -o remove ${mach}_mksysb >> $LOGFILE 2>&1
 
    LOG "NIM resources removed for ${mach}"
    LOG ""
    LOG "Starting mksysb backup of ${mach}"
    nim -o define -t mksysb -a server=master -a location=/export/mksysb/${mach}_mksysb -a source=${mach} -a mk_image=yes -a mksysb_flags="-i -m -e -p" -F ${mach}_mksysb >> $LOGFILE 2>&1
    if [ $? != 0 ]
        then
        echo "ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR" > ${MSGCONTENT}
        mail -s "MKSYSB error on ${mach}" "${SENDTO}" < ${MSGCONTENT}
        LOG "  ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR "
        LOG " There was an error of ${mach} "
        echo "" > ${MSGCONTENT}
    fi
    LOG "Completed the mksysb of ${mach}"
done
 
LOG "Starting the SAVEVG of NIM_VG"
/usr/bin/savevg -vmpXf /dev/rmt0 nim_vg >> $LOGFILE 2>&1
if [ $? != 0 ]; then
    echo "ERROR on SAVEVG " > ${MSGCONTENT}
    mail -s "ERROR on SAVEVG" ${SENDTO} < ${MSGCONTENT}
    LOG " ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR ERROR "
    LOG "There was an error on the NIM server while saving to tape"
    echo "" > ${MSGCONTENT}
fi
LOG "Completed the SAVEVG"
LOG "Rewinding and ejecting the tape"
mt -f /dev/rmt0 rewoffl
LOG "Backup process complete"
 
TIME=`date`
LOG "Completed at ${TIME}"
 
mail -s "MKSYSB Process Log" "${SENDTO}" < ${LOGFILE}




As always, use as needed, but please comment if you have any.


Cheers

Wednesday, August 1, 2012

Collect Software Report for AIX

I had a need for a customer to report what software is installed and when it was installed. This included the AIX software sources, efixes, and anything installed via RPM. I was able to throw this together in a few minutes, and it fit the bill.

#!/usr/bin/sh
 
HOSTNAME=`hostname`
DATE=`date +%m%d%Y`
LOGDIR=$HOME/log
REPORTNAME="SOFTWARE_AUDIT_$DATE"
 
MAILTO="someone@domain.com"
 
FULLLOG=$LOGDIR/$REPORTNAME
 
LOG()
{
        echo "$*" >> $FULLLOG 2>&1
}
 
 
if [[ ! -d $LOGDIR ]]; then
        mkdir -p $LOGDIR
fi
 
 
LOG "#######################################"
LOG "# List AIX LPP Software "
LOG "#######################################"
lslpp -L all >> $FULLLOG 2>&1
lslpp -h all >> $FULLLOG 2>&1
 
 
LOG "\n\n\n "
LOG "######################################"
LOG "# List of EFIX"
LOG "######################################"
emgr -l >> $FULLLOG 2>&1
 
LOG "\n\n\n"
LOG "#####################################"
LOG "# List of RPM Software"
LOG "#####################################"
 
# every installed RPM with its install date
rpm -qa --qf '%{installtime:date} Installed: %{name} %{version} \n' |awk '{print $5, $2, $3, $1, $4, $6, $7, $8}'| sort >> $FULLLOG 2>&1
 
mail -s "Software Audit Report for $DATE" $MAILTO < $FULLLOG

Sunday, December 18, 2011

Delete files based on date – ksh bash

Referring to an earlier blog post on how to delete log files or DR plans using PowerShell: I have another TSM server that runs on AIX. It’s new, so I don’t have a bunch of DRM plans in the directory yet. This is also a server I use for my internal testing and development, so it’s not too heavily used by others.

Here is the directory listing before I wrote and ran the script:

[screenshot: drplan directory listing before the cleanup]

After I wrote and ran the script:

[screenshot: drplan directory listing after the cleanup]

I set it to only keep 7 days’ worth of files, and now all I need to do is add it to my crontab to run every day…

Here’s the script – Cheers!


#!/usr/bin/ksh

DRM_Directory=/opt/tivoli/storage/drm/drplan
DaysToKeep=7

find $DRM_Directory -type f -mtime +$DaysToKeep -exec rm -f {} \;

Friday, December 16, 2011

Email Files with PowerShell

I have a need, when dealing with customers and their disaster recovery plans provided by Tivoli Storage Manager (TSM), to get these files offsite on a regular basis - normally every day at about the same time. It's a great idea to email them to yourself, however not such a great idea if the email is on the server you may need to recover. I recommend in most cases that people get an external email account (Gmail, Live, Yahoo, etc.) and have the disaster recovery plans sent to them there. That way they are more likely to be able to retrieve them than if they were on the Exchange or Lotus Notes (yes, people still use Notes for email) server that was in the datacenter that just imploded.
You need to update a few things to make this work:
  • $SMTPServer – Make it your SMTP server
  • $DistributionList – I did it this way so you (or someone else) don’t need to edit the script when the recipients change
  • $SendingAddress – Who is this email going to be coming from?
  • $DataDirectory – What directory are the files kept in that need to be sent?
  • $RequiredFiles – The file names that need to be sent
In this instance the DR plan itself is the last file to be created and has a different name every day, so I'm using the file creation times to find the newest one and add it to the list of files that need to be sent.

# Author: Thomas Wimprine
# Creation Date: Dec 14, 2011
# Filename: SendDRPlan.ps1
# Description: Collect the files needed for TSM DR recovery and email them to a distribution list
 
Function SendEmail {
    param (
        $FilesArray
    )
    $SMTPServer = "mail.domain.com"
    $DistributionList = "DRPlanRecipiants@domain.com"
    $SendingAddress = "TSM@domain.com"
    
    # Create our mail message objects
    $ToAddress = New-Object System.Net.Mail.MailAddress $DistributionList
    $FromAddress = New-object System.Net.Mail.MailAddress $SendingAddress
    $Message = New-Object System.Net.Mail.MailMessage $FromAddress, $ToAddress
    
    $Date = Get-Date
    $Date = $Date.ToShortDateString()
    $Message.Subject = "TSM DR Plan for $Date"
    $Message.Body = @("This is the daily DR plan as created by TSM with the required files to recover. Retain this message with attachments until it is no longer needed")
    
    # Add the attachments we need to the message
    foreach ($File in $FilesArray) {
        $Attachment = New-Object System.Net.Mail.Attachment($File,'text/plain')
        $Message.Attachments.Add($Attachment)
    }
    
    $Client = New-Object System.Net.Mail.SMTPClient $SMTPServer
    
    $Client.Send($Message)
    
}
 
Function GetLatestDRPlan {
    param ($Directory)
    # Return the name of the most recently created file in the directory -
    # in this setup that is the day's DR plan
    foreach ($File in Get-ChildItem $Directory) {
        if ($NewestFile.CreationTime -lt $File.CreationTime) {
            $NewestFile = $File
        }
    }
    $NewestFile.Name
}
 
$DataDirectory = "D:\DRPlanDir"
$RequiredFiles = "devconfig","volhist"
$RequiredFiles += GetLatestDRPlan($DataDirectory)
 
$AttachFiles = @()
foreach ($File in $RequiredFiles) {
    $AttachFiles += "$DataDirectory\$File"
}
 
SendEmail($AttachFiles)