My Reading List

Recently, a colleague asked me for some book recommendations for his career development. I am also often asked for book recommendations for learning PowerShell. Every time this happens I find myself scouring my bookshelves and digging through old emails to find titles to recommend. So I finally decided to create a reading list. I have linked to it here, and it is also available in the top navigation bar of this site.

This is just a start. I am constantly looking for new sources of learning, so I will add more recommendations when I find something worthy of sharing. Also, if you have books to recommend, please comment below.

A note of transparency: The links in my reading list are affiliate links. If you buy one of these books, I would appreciate it if you would use one of these links. The pennies that I earn from your transaction show your appreciation for this blog.

Happy reading!

Posted in Books, Uncategorized

Using Git from PowerShell

This year has been full of changes for me. One of the biggest changes is that my job requires me to use Git and GitHub for almost all of my work. Before this new job, I had never used Git. By default, the Git installer installs the bash command shell, and most of the documentation assumes that you are using bash. However, I prefer to work in PowerShell. In this article I will show how I set up my environment to enable Git functionality in PowerShell. This is not meant to be a tutorial on using Git but, rather, an example of what works for me and for my workflow.

Download and install Git for Windows

The first step is to install Git for Windows.

Download and run the Git for Windows installer. As you step through the installation wizard you are presented with several options. The following is a list of the options on each page of the installation wizard with the reasoning behind my choice.

  • The Select Components page
    • Uncheck Associate .git* configuration files with the default text editor
      The editor I use for writing code is not my default text editor. So I let my code editor register the file extensions.
    • Check Use a TrueType font in all console windows
      I prefer the TrueType font Consolas as my monospaced font for command shells and code editors.
  • The Adjusting your PATH environment page
    • Select Use Git from the Windows Command Prompt
      This adds the Git tools to your PATH so that they work in Cmd, PowerShell, or bash.
  • The Configure the line ending conversions page
    • Select Checkout Windows-style, commit Unix-style line endings
      This is the recommended setting on Windows and provides the most compatibility for cross-platform projects.
  • The Configuring the terminal emulator to use with Git bash page
    • Select Use Windows’ default console window
      This is the console that PowerShell uses and works best with other Windows console-based applications.
  • The Configuring extra options page
    • Check Enable file system caching
      This option is checked by default. Caching improves performance of certain Git operations.
    • Check Enable Git Credential Manager
      The Git Credential Manager for Windows (GCM) provides secure Git credential storage for Windows. GCM provides multi-factor authentication support for Visual Studio Team Services, Team Foundation Server, and GitHub. Enabling GCM prevents the need for Git to continuously prompt for your Git credentials for nearly every operation. For more information see the GCM documentation on GitHub.

These are the options I chose. You may have different requirements in your environment.
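Once the installer finishes, it is worth opening a new PowerShell session and confirming that the PATH option took effect. A quick sanity check (the version reported will vary):

```powershell
# Verify that git.exe is resolvable from PowerShell
Get-Command git | Select-Object Name, Source   # shows the full path to git.exe
git --version                                  # e.g. git version 2.x.windows.1
```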

Install the Posh-Git module

Now that we have the Git client installed, we need to enable Git functionality for PowerShell. The Posh-Git module is available from the PowerShell Gallery. For more information about Posh-Git, see Posh-Git on GitHub.

If you have PowerShellGet installed, just run:

Install-Module posh-git

Alternatively, you can install Posh-Git manually using the instructions in the README.MD in the GitHub repository.

Once Posh-Git is installed you need to integrate Git into your PowerShell environment. Posh-Git includes an example profile script that you can adapt to your needs.

Integrate Git into your PowerShell environment

Integrating Git into PowerShell is simple. There are three main things to do:

  1. Load the Posh-Git module
  2. Start the SSH Agent Service
  3. Configure your prompt to show the Git status

Add the following lines to your PowerShell profile script.

Import-Module posh-git
Start-SshAgent -Quiet
function global:prompt {
    $identity = [Security.Principal.WindowsIdentity]::GetCurrent()
    $principal = [Security.Principal.WindowsPrincipal] $identity
    $name = ($identity.Name -split '\\')[1]
    $path = Convert-Path $executionContext.SessionState.Path.CurrentLocation
    $prefix = "($env:PROCESSOR_ARCHITECTURE)"

    if($principal.IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) { $prefix = "Admin: $prefix" }
    $realLASTEXITCODE = $LASTEXITCODE
    $prefix = "Git $prefix"
    Write-Host ("$prefix[$Name]") -nonewline
    Write-VcsStatus
    ("`n$('*' * (Get-Location -Stack).Count)") + "PS $($path)$('>' * ($nestedPromptLevel + 1)) "
    $global:LASTEXITCODE = $realLASTEXITCODE
    $host.ui.RawUI.WindowTitle = "$prefix[$Name] $($path)"
}

The prompt function integrates Git into your PowerShell prompt to show an abbreviated Git status. See the Posh-Git README for a full explanation of the abbreviated status. I have also customized my prompt to show my user context, whether I am running in a 64-bit or 32-bit shell, and whether I am running elevated. Customize this function to meet your needs or preferences.

At this point you are done. You can use Git from PowerShell. Go forth and clone a repo.
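For example, cloning the Posh-Git repository itself exercises the whole setup; after you change into the clone, the customized prompt should show the branch status (the URL is Posh-Git's public GitHub repository):

```powershell
git clone https://github.com/dahlbyk/posh-git.git
Set-Location posh-git
git status   # the prompt itself now shows an abbreviated version of this
```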

Customize your Git environment

You may want to customize some of the settings of your Git environment, especially if this is a new install of Git. To be a good project contributor, you should identify yourself so that Git knows who to credit (or blame) for your commits. Also, I found that the default colors used by Git in the shell could be hard to read, so I customized the colors to make them more visible. For more information, see the Customizing Git topic in the Git documentation.

The following commands only need to be run once. You are setting global preferences so, once they are set, they are used every time you start a new shell.

# Configure your user information to match your GitHub profile
git config --global user.name "John Doe"
git config --global user.email "alias@example.com"

# Set up the colors to improve visibility in the shell
git config --global color.ui true
git config --global color.status.changed "magenta bold"
git config --global color.status.untracked "red bold"
git config --global color.status.added "yellow bold"
git config --global color.status.unmerged "yellow bold"
git config --global color.branch.remote "magenta bold"
git config --global color.branch.upstream "blue bold"
git config --global color.branch.current "green bold"

As I said at the beginning, this is what works for me. Your mileage may vary. Customize this for your preferences and environmental needs.
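To review what you have set, git can echo the global configuration back to you (these are standard git options; the output reflects your own values):

```shell
# List every global setting, or query a single key
git config --global --list
git config --global user.name
```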

In future articles, I plan to share scripts I have created to help me with my Git workflow. Do you use Git with PowerShell? Share your questions and experiences in the comments.

Posted in Git, GitHub, PowerShell

What happened to 2016?

It has been more than a year since I posted anything. I can’t believe a year has already passed by. I wish I had a good story to tell about why I haven’t blogged, but I don’t. All I have are excuses. It has been a busy year for me personally. I have changed jobs and had other personal drama/excitement that has kept me busy. I also haven’t felt like I had anything new to blog about. I am making an early New Year’s resolution to try to write at least one new post each month. If anyone has any ideas that they would like to see, especially about PowerShell topics, please let me know.

Posted in Other

Opening the door to the Mystery of Dates in PowerShell

Formatting and converting dates can be very confusing. Every programming language, operating system, and runtime environment seems to do it differently. And part of the difficulty in conversion is knowing what units you are starting with.

First, it is helpful to know the Epoch (or starting date) a stored value is based on. Wikipedia has a good article on this. Here is a brief excerpt.

  • January 1, AD 1 – Microsoft .NET. Rationale: Common Era, ISO 2014, RFC 3339.
  • January 1, 1601 – NTFS, COBOL, Win32/Win64. Rationale: 1601 was the first year of the 400-year Gregorian calendar cycle at the time Windows NT was made.
  • January 0, 1900 – Microsoft Excel, Lotus 1-2-3. Rationale: logically, January 0, 1900 is equivalent to December 31, 1899, but these systems do not allow users to specify the latter date.
  • January 1, 1904 – Apple Inc.’s Mac OS through version 9. Rationale: 1904 is the first leap year of the 20th century.
  • January 1, 1970 – The Unix epoch, aka POSIX time. Used by Unix and Unix-like systems (Linux, Mac OS X) and by programming languages: most C/C++ implementations, Java, JavaScript, Perl, PHP, Python, Ruby, Tcl, ActionScript.
  • January 1, 1980 – IBM BIOS INT 1Ah, DOS, OS/2, and the FAT12, FAT16, FAT32, and exFAT filesystems. Rationale: the IBM PC with its BIOS, as well as 86-DOS, MS-DOS, and PC DOS with their FAT12 file system, were developed and introduced between 1980 and 1981.
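The gap between two epochs is itself a useful constant when converting between systems. For example, subtracting the Windows/NTFS epoch from the Unix epoch in PowerShell yields the 11,644,473,600-second offset commonly used when converting FILETIME values to Unix timestamps:

```powershell
# Offset between the Windows FILETIME epoch (1601) and the Unix epoch (1970)
$span = (Get-Date "1/1/1970") - (Get-Date "1/1/1601")
$span.TotalSeconds   # 11644473600
```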

Common Date Conversion Tasks

WMI Dates

PS > $installDate = (Get-WmiObject win32_operatingsystem | select Installdate ).InstallDate
PS > [system.management.managementdatetimeconverter]::ToDateTime($InstallDate)
Friday, September 12, 2008 6:50:57 PM

PS > [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime($(get-date))
20151127144036.886000-480

Excel dates – Excel stores dates as sequential serial numbers so that they can be used in calculations. By default, January 1, 1900, is serial number 1.

PS > ((Get-Date).AddDays(1) - (get-date "12/31/1899")).Days
42335

In this example, the value of Days is 42335, which is the serial number for 11/27/2015 in Excel. The date “12/31/1899” is equivalent to January 0, 1900. The difference between “12/31/1899” and “11/27/2015” is 42334 days but, since the serial numbers start at 1, you need to add 1 day to get the serial number for “11/27/2015”.
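Going the other direction is just date arithmetic: since serial number 1 is January 1, 1900, adding (serial − 1) days to 12/31/1899 recovers the date:

```powershell
# Convert an Excel serial number back to a date
$serial = 42335
(Get-Date "12/31/1899").AddDays($serial - 1)   # Friday, November 27, 2015
```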

Converting from custom string formats

PS > $information = '12Nov(2012)18h30m17s'
PS > $pattern = 'ddMMM\(yyyy\)HH\hmm\mss\s'
PS > [datetime]::ParseExact($information, $pattern, $null)
Monday, November 12, 2012 6:30:17 PM
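The same custom pattern also works in reverse with ToString to produce the string form (the month abbreviation depends on your current culture; this example assumes en-US):

```powershell
$date = Get-Date "11/12/2012 18:30:17"
$date.ToString('ddMMM\(yyyy\)HH\hmm\mss\s')   # 12Nov(2012)18h30m17s
```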

FILETIME conversion – FILETIME is a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 (UTC).

PS > get-aduser username -prop badPasswordTime,lastLogonTimestamp | select badPasswordTime,lastLogonTimestamp
badPasswordTime : 130927962789982434
lastLogonTimestamp : 130931333173599571

PS > [datetime]::fromfiletime(130927962789982434)
Monday, November 23, 2015 3:51:18 PM

PS > [datetime]::fromfiletime(130931333173599571)
Friday, November 27, 2015 1:28:37 PM
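The conversion runs the other way as well: DateTime.ToFileTime returns the 100-nanosecond tick count, which is handy when you need to build an LDAP filter against attributes like lastLogonTimestamp:

```powershell
# Current time as a FILETIME value, and a round trip back
$ft = (Get-Date).ToFileTime()
[datetime]::FromFileTime($ft)   # returns the original local time
```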

CTIME or Unix format – a Unix timestamp is an integer value representing the number of seconds elapsed since 00:00 hours, Jan 1, 1970 UTC.

PS > $epoch = get-date "1/1/1970"
PS > $epoch.AddMilliseconds(1448302797803)
Monday, November 23, 2015 6:19:57 PM

PS > $epoch.AddSeconds(1448302797.803)
Monday, November 23, 2015 6:19:57 PM
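To go from a DateTime back to a Unix timestamp, subtract the epoch and take the total seconds (using UTC on both sides avoids time-zone skew):

```powershell
# Seconds since the Unix epoch for the current moment
$epoch = Get-Date "1/1/1970"
[int64]([datetime]::UtcNow - $epoch).TotalSeconds
```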

References

Standard Date and Time Format Strings in .NET
https://msdn.microsoft.com/en-us/library/az4se3k1(v=vs.110).aspx

Custom Date and Time Format Strings in .NET
https://msdn.microsoft.com/en-us/library/8kb3ddd4(v=vs.110).aspx

Formatting Dates and Times in PowerShell
https://technet.microsoft.com/en-us/library/ee692801.aspx

PowerTip: Use PowerShell to Format Dates
http://blogs.technet.com/b/heyscriptingguy/archive/2013/11/11/powertip-use-powershell-to-format-dates.aspx

Parsing Custom Date and Time Formats
http://community.idera.com/powershell/powertips/b/tips/posts/parsing-custom-date-and-time-formats
https://msdn.microsoft.com/en-us/library/system.datetime.parseexact(v=vs.110).aspx

Wikipedia – Epoch (reference date)
https://en.wikipedia.org/wiki/Epoch_(reference_date)

Posted in PowerShell, Uncategorized

Hard to believe that I have been blogging for 1 year already

Yesterday, October 21st, marked the 1-year anniversary of this blog. When I started, I was not sure what I would write about or whether anyone would really care.

I still struggle with ideas. I want to keep the content focused and relevant, and I don’t want to repeat topics that so many others have already covered. I have tried to write about things that I had to learn the hard way, both to document them and to make it easier for the next person to learn the same thing.

Looking back over the last year, I am amazed at the amount of traffic I have received. I didn’t expect much, and my numbers are a mere drop in the bucket compared to other, more established bloggers. But I am proud of them. In the last year I have had 4838 page views from 3260 unique visitors. I would have been surprised to hit the 1000 mark when I started.

Here is a list of my Top 10 posts ranked by page views:

  1. Using PowerShell and EWS to monitor a mailbox – 1520 views
  2. Understanding Byte Arrays in PowerShell – 1164 views
  3. PowerShell error: NativeCommandFailed after installing KB3000850 — fixed! See KB3062960 – 314 views
  4. Installing BMC’s ARAPI.NET library – 310 views
  5. Simple ARAPI.NET Query Example in C# – 163 views
  6. Script to list the data fields on a Remedy Form – 148 views
  7. Using ARAPI.NET from PowerShell – 114 views
  8. Fixing “The specified module could not be found.” errors when using ARAPI.NET – 91 views
  9. Working with certificates in PowerShell – 76 views
  10. Use PowerShell and EWS to find out who is sending you email – 70 views

The top 2 articles are a bit surprising. I wrote those as NaNoWriMo entries for PowerShell.org last year and they appeared in their newsletter. I got a few incoming views from that but don’t know how many people read these from the newsletter and never visited my site. Nice to see so many hits.

So, if you are reading this, thank you for coming. I hope I have provided some value in the past year. If you have any suggestions for topics I could write about I am always interested.

Thanks for a great year!

Posted in Other

Adding a Contact to a Distribution List with PowerShell

The PowerShell ActiveDirectory module has a lot of great features that I use on a daily basis. However, there is one shortcoming that I have struggled with for a while. I did a lot of internet searching and testing to see if I was missing some hidden secret. But, alas, this is one task that the AD module does not do.

Here is the scenario. We have a lot of AD Groups (Distribution Lists) we use for notification messages. We want to send notifications to mobile devices. We do this by sending an email to the device’s email address (e.g. 2065551212@mobilecarrier.xyz.com). These external email addresses are created as Contact objects in AD.

The problem is that the cmdlets for managing AD group objects only allow you to add objects that have a SamAccountName (and therefore a SID) to a group. This is fine for user and group objects, but Contact objects do not have SIDs. So what do you do?

The answer is to do it the old way, the way you would have done it in VBScript: use ADSI.

$dlGroup = [adsi]'LDAP://CN=DL-Group Name,OU=Corp Distribution Lists,DC=contoso,DC=net'
$dlGroup.Member.Add('CN=mobile-username,OU=Corp Contacts,DC=contoso,DC=net')
$dlGroup.psbase.CommitChanges()
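Verification and removal use the same ADSI object. Member is a property value collection, so reading it lists the current member DNs and Remove undoes the Add (the distinguished names here are placeholders for your environment):

```powershell
$dlGroup = [adsi]'LDAP://CN=DL-Group Name,OU=Corp Distribution Lists,DC=contoso,DC=net'
$dlGroup.Member   # list the DNs of the current members
$dlGroup.Member.Remove('CN=mobile-username,OU=Corp Contacts,DC=contoso,DC=net')
$dlGroup.psbase.CommitChanges()
```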
Posted in PowerShell

Create Remedy Work Orders with PowerShell

In previous posts, I have shown you how to query for data in Remedy using PowerShell and ARAPI.NET. Creating or updating information in Remedy is not much different.

  1. Log into the AR Server
  2. Build a list of Field IDs and values that you want the Work Order to contain
  3. Use the CreateEntry method to create a new Work Order
  4. Create a Qualifier string used to query for the newly created Work Order
  5. Build a list of Field IDs that you want to retrieve from Remedy
  6. Query a Remedy form (passing in the qualifier and field list)
  7. Process the results
  8. Log off the AR Server

The following script is a simple example to create a single Work Order in Remedy. It could easily be adapted to read from a CSV file or other data source to create multiple work orders. In fact, I wrote a tool in C# that is designed to create multiple Work Orders from CSV data. We use it to create the same Work Order task for multiple individual store locations.

The hardest part about creating items in Remedy is knowing which form to use and which data fields are required. It is not always obvious which fields are required. You may have to use Developer Studio to inspect the workflow and filter logic to figure all of that out. Also, fields like Product Categories, Operational Categories, Support Queues, etc. will have values specific to your environment.

param ($appServer = "appserver.contoso.net",
       $svcAccount = "remedyuserid",
       $svcPassword = "remedypassword",
       $remAuthDomain = "",
       $arSrvrPort = 51100,
       $formName = "WOI:WorkOrderInterface_Create")

add-type -path 'C:\Program Files (x86)\BMC Software\ARAPI80.NET\BMC.ARSystem.dll'

$arserver = New-Object -type BMC.ARSystem.Server
$arserver.Login($appServer, $svcAccount, $svcPassword, $remAuthDomain, $arSrvrPort)

$fieldIDs = @{     2="";           #Submitter
                   3="";           #Create Date
          1000000182="";           #Work Order ID
          ### Store identity
          1000000082="Contoso";    #Company
           301593100="s01174";     #RequesterLoginID
          1000000001="HQ";         #Location
          1000000018="Bloggs";     #Last Name
          1000000019="Joe";        #First Name
          ### WO Actions
          1000000000="Summary: short version of description";
          1000000151="Details: long version of description";
          1000000164="Low";
          1000000181="Project";
                   7="Assigned";
          1000000076="CREATE";
          ### Op Cat Tiers 1,2,3
          1000000063="Request";
          1000000064="Add hardware";
          1000000065="Wiring Closet";
          ### Prod Cat Tiers 1,2,3
          1000001270="Hardware Lifecycle";
          1000001271="Hardware replacement";
          1000001272="Internal Project";
          ### Product
          1000002268="Wireless Access";
          ### Manager Support Hierarchy
          1000000014="Infrastructure";  #Manager Support Org
          1000000015="Network";         #Manager Support Group
          1000000251="Contoso";         #Manager Company
          ### Support Hierarchy
          1000003227="Infrastructure";  #Support Org
          1000003228="Network";         #Support Group
          1000003229="Contoso";         #Company
          ### Customer Info Returned
          1000003296=""; #Customer Person ID
          1000003297=""; #Customer First Name
          1000003298=""; #Customer Last Name
          1000003299=""; #Customer Company
          1000003302=""; #Customer Email
          1000003306=""; #Customer Phone Number
         }
try
{

   #Build the list of field values to be used in the Create request - skip blank values
   [BMC.ARSystem.FieldValueList] $woValueList = New-Object -type BMC.ARSystem.FieldValueList
   $fieldIDs.keys | ForEach-Object {
      if ($fieldIDs[$_] -ne "") {
         $woValueList.Add($_,$fieldIDs[$_])
      }
   }
   #Create the new WO with the listed values
   $entryID = $arserver.CreateEntry($formName, $woValueList);

   #Build a new field list containing ALL the fields you want returned
   [BMC.ARSystem.EntryListFieldList] $woEntryFieldList = New-Object -type BMC.ARSystem.EntryListFieldList
   $fieldIDs.Keys | ForEach-Object { $woEntryFieldList.AddField($_); }

   #Query for the newly created WO by its EntryID
   $strSQL = "'1' = {0}" -f $entryID
   [BMC.ARSystem.EntryFieldValueList] $woEntryValueList = New-Object -type BMC.ARSystem.EntryFieldValueList
   $woEntryValueList = $arserver.GetListEntryWithFields($formName, $strSQL, $woEntryFieldList, 0, 50);

   #Output the results
   $fieldIDs.Keys | ForEach-Object { "[{0:0000000000}] {1}" -f $_,$woEntryValueList.fieldvalues[$_] }
}
catch
{
   $_
}
$arserver.Logout()

Updating values on an existing item in Remedy is not much different. The process is mostly the same, but you use the SetEntry method instead of the CreateEntry method. I will try to share an update example in an upcoming post.
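As mentioned above, adapting the script for bulk creation is mostly a matter of wrapping the field-value build and the CreateEntry call in a loop over Import-Csv. A minimal sketch, assuming a hypothetical CSV with Location, Summary, and Details columns and reusing $arserver and $formName from the script above:

```powershell
# Hypothetical stores.csv columns: Location, Summary, Details
Import-Csv .\stores.csv | ForEach-Object {
    $values = New-Object -type BMC.ARSystem.FieldValueList
    $values.Add(1000000001, $_.Location)   # Location
    $values.Add(1000000000, $_.Summary)    # Summary
    $values.Add(1000000151, $_.Details)    # Details
    # ...add the remaining required fields exactly as in the full script...
    $entryID = $arserver.CreateEntry($formName, $values)
    "Created {0} for store {1}" -f $entryID, $_.Location
}
```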

Posted in ARAPI, PowerShell

Getting incident notes from Remedy using ARAPI.NET from PowerShell

Here is another example of using ARAPI.NET from PowerShell. This is a script I wrote and use every day to retrieve notes from a Remedy Incident. As a manager, I rarely use Remedy to work incidents. However, I frequently need to review the notes and ensure that incidents are being resolved efficiently. Logging into Remedy, finding the incident, and reviewing all of the notes can be a time-consuming process. This script makes it very easy for me to quickly see all of the notes and provide status updates to my management.

All I need is the incident number. Using the incident number passed on the command line the script searches the HPD:Search-Worklog form for Work Log entries for the specified Incident.

The output looks like this:

PS E:\> .\get-incident.ps1 IR0000000183197
===============================
incident      : IR0000000183197
submitdate    : 4/28/2015 8:22:47 AM
customer      : serverops
workphone     : 1 206 555-1212
source        : Systems Management
servicetype   : Monitoring Event
status        : Assigned
assignedgroup : Windows OS
assignee      :
opcat1        : Business Process
opcat2        :
opcat3        :
prodcat1      : Software
prodcat2      : Operating System
prodcat3      : Server OS
impact        : Moderate
urgency       : Medium
priority      : Medium
summary       : server1.contoso.net Logical disk C: on host server1.contoso.net has 10 % free space remaining.
notes         :

-------------------------------
date        : 4/28/2015 8:22:42 AM
submitter   : remedyuser15
logtype     : General Information
Attachment1 :
Attachment2 :
Attachment3 :
description : Alert Info for Ticketing:
              host: server1.contoso.net
              Severity: MINOR
              Message: Logical disk C: on host server1.contoso.net has 10 % free space remaining.
              Object: C:
              ITSM Queue: Windows OS
              Modified: 4/28/2015 8:12:19 AM
===============================

The pattern of the script is the same as my previous examples. I set up the parameters (which can be overridden on the command line) to connect to the Remedy application server. Then I define the field IDs that I want to retrieve. I also define some string dictionaries that I use to convert numeric values returned by Remedy into the text values that are displayed by the mid-tier.

param ($searchTerm,
       $fieldID=1000000161,
       $appServer = "appserver.contoso.net",
       $svcAccount = "remedyuserid",
       $svcPassword = "remedypassword",
       $remAuthDomain = "",
       $arSrvrPort = 51100,
       $formName = "HPD:Search-Worklog")

add-type -path 'C:\Program Files (x86)\BMC Software\ARAPI80.NET\BMC.ARSystem.dll'

$incFields = @{         7="Status         ";
                200000003="ProdCat 1      ";
                200000004="ProdCat 2      ";
                200000005="ProdCat 3      ";
               1000000000="Description    ";
               1000000018="Last Name      ";
               1000000019="First Name     ";
               1000000056="Phone Number   ";
               1000000063="OpCat 1        ";
               1000000064="OpCat 2        ";
               1000000065="OpCat 3        ";
               1000000099="Service Type   ";
               1000000151="Description    ";
               1000000161="Incident Number";
               1000000162="Urgency        ";
               1000000163="Impact         ";
               1000000164="Priority       ";
               1000000215="Source         ";
               1000000217="Assigned Group ";
               1000000218="Assignee       ";
               1000000422="Owner Group    ";
               1000000560="Reported Date  ";
               1000000881="Status Reason  ";
               1000005782="Contact Last   ";
               1000005783="Contact First  ";
               1000005785="Contact Phone  ";
              }

$wlFields = @{  301394441="Description    ";
               1000000157="Submit Date    ";
               1000000159="Submitter      ";
               1000000170="Work Log Type  ";
               1000000351="Attachment01   ";
               1000000352="Attachment02   ";
               1000000353="Attachment03   ";
               1000002134="Work Log Date  ";
               1000003610="Summary        ";
             }
$arserver = new-object BMC.ARSystem.Server
$arserver.Login($appServer, $svcAccount, $svcPassword, $remAuthDomain, $arSrvrPort)
$qualifier = "'{0}' = ""{1}""" -f $fieldID,$searchTerm

[BMC.ARSystem.EntryListFieldList] $formEntryFieldList = new-object BMC.ARSystem.EntryListFieldList
$incFields.Keys | %{ $formEntryFieldList.AddField($_) }
$wlFields.Keys  | %{ $formEntryFieldList.AddField($_) }

$impactvalues = @{ 1000="Extensive"; 2000="Significant"; 3000="Moderate"; 4000="Minor" }
$urgencyvalues = @{ 1000="Critical"; 2000="High"; 3000="Medium"; 4000="Low" }
$priorityvalues = @{ 0="Critical"; 1="High"; 2="Medium"; 3="Low" }
$statusvalues = @{ 0="New"; 1="Assigned"; 2="In Progress"; 3="Pending"; 4="Resolved"; 5="Closed"; 6="Cancelled"; }
$servicevalues = @{ 0="User Restoration"; 1="User Service Request"; 2="System Restoration"; 3="Monitoring Event"; }
$sourcevalues =  @{ 1000="Direct Input"; 2000="Email"; 3000="External Escalation"; 4000="Fax"; 4200="Self Service"; 5000="Systems Management"; 6000="Phone"; 7000="Voice Mail"; 8000="Walk In"; 9000="Web"; 10000="Other"; 11000="BMC Impact Manager Event"; }
$worklogtypes = @{
   1000="----- Customer Inbound -----";
   2000="Customer Communication";
   3000="Customer Follow-up";
   4000="Customer Status Update";
   5000="----- Customer Outbound -----";
   6000="Closure Follow Up";
   7000="Detail Clarification";
   8000="General Information";
   9000="Resolution Communications";
   10000="Satisfaction Survey";
   11000="Status Update";
   12000="----- General -----";
   13000="Incident Task / Action";
   14000="Problem Script";
   15000="Working Log";
   16000="Email System";
   17000="Paging System";
   18000="BMC Impact Manager Update";
   35000="Chat";
   36000="B2B Vendor Update";
 }

[BMC.ARSystem.EntryFieldValueList] $fieldValues = $arserver.GetListEntryWithFields($formName, $qualifier, $formEntryFieldList, 0, 0);
$irfields = $fieldValues[0].fieldvalues

$irheader = [ordered]@{
               incident     =$irfields[1000000161];
               customer     =$irfields[1000000019] + " " + $irfields[1000000018];
               workphone    =$irfields[1000000056];
               source       =$sourcevalues[$irfields[1000000215]];
               servicetype  =$servicevalues[$irfields[1000000099]];
               status       =$statusvalues[$irfields[0000000007]];
               assignedgroup=$irfields[1000000217];
               assignee     =$irfields[1000000218];
               opcat1       =$irfields[1000000063];
               opcat2       =$irfields[1000000064];
               opcat3       =$irfields[1000000065];
               prodcat1     =$irfields[0200000003];
               prodcat2     =$irfields[0200000004];
               prodcat3     =$irfields[0200000005];
               impact       =$impactvalues[$irfields[1000000163]];
               urgency      =$urgencyvalues[$irfields[1000000162]];
               priority     =$priorityvalues[$irfields[1000000164]];
               summary      =$irfields[1000000000];
               notes        =$irfields[1000000151];
             }

"==============================="
new-object -type PSObject -prop $irheader             

foreach ($wl in $fieldvalues) {
   "-------------------------------"
   #$wlFields.Keys | %{ "[{0:0000000000}] {1} = {2}" -f $_,$wlFields[$_],$wl.fieldvalues[$_].tostring() }
   $wldata = [ordered]@{
                date        = $wl.fieldvalues[1000002134];
                submitter   = $wl.fieldvalues[1000000159];
                logtype     = $worklogtypes[$wl.fieldvalues[1000000170]];
                Attachment1 = $wl.fieldvalues[1000000351];
                Attachment2 = $wl.fieldvalues[1000000352];
                Attachment3 = $wl.fieldvalues[1000000353];
                description = $wl.fieldvalues[0301394441];
             }
   new-object -type PSObject -prop $wldata
}
"==============================="

$arserver.Logout()
Posted in ARAPI, PowerShell

Fixing “The specified module could not be found.” errors when using ARAPI.NET

We have large projects where we are upgrading systems in our stores and we will schedule the same work for 30+ stores at a time. But to track that work we want a Work Order for each store. Entering all of those into our Remedy system by hand would take forever. So, I wrote a little utility in C# to bulk-create Remedy Work Orders from a CSV data file.

It worked great.

Until I deployed it to the people actually doing the work.

The weird thing was that it worked for some users and not for others. It made no sense to me. I couldn’t figure out which dependent files were missing. The error looked like this:

PS C:\Program Files (x86)\BMC Software\ARAPI80.NET> .\BulkWOCreate.exe /csv .\testdata.csv
Unhandled Exception: System.IO.FileNotFoundException: The specified module could not be found. (Exception from HRESULT: 0x8007007E)
   at BMC.ARSystem.Server._Eval(Object v)
   at BMC.ARSystem.Server._performLogin(String methodName, String server, String user, String password, String authentication, String locale, String charSet, Int32 port, String apiCmdLog, String apiResLog, Boolean logInitAndTerm)
   at BMC.ARSystem.Server.Login(String server, String user, String password, String authentication, String locale, String charSet, Int32 port)
   at BMC.ARSystem.Server.Login(String server, String user, String password, String authentication, Int32 port)
   at BulkWOCreate.Program.Main(String[] args)

I found I had the same problem with my ARAPI.NET PowerShell scripts on the same machines. In PowerShell the error looks like this:

Exception calling "Login" with "5" argument(s): "Could not load file or assembly 'BMC.arnettoc.dll' or one of its dependencies. The specified module could not be found."
At C:\Program Files (x86)\BMC Software\ARAPI80.NET\get-arform.ps1:20 char:4
+    $arserver.Login($ARServerName, $ARSvcAccount, $ARSvcPassword, $ARAuthenticati ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : FileNotFoundException

This error was actually more helpful because it calls out BMC.arnettoc.dll. This gives us a place to start investigating.

So the question is how to figure out what the dependencies are. The DUMPBIN tool from Visual Studio can show us the statically linked dependencies. Let us look at BMC.arnettoc.dll:

C:\Program Files (x86)\BMC Software\ARAPI80.NET> "C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin\dumpbin.exe" /dependents BMC.arnettoc.dll
Microsoft (R) COFF/PE Dumper Version 12.00.21005.1
Copyright (C) Microsoft Corporation.  All rights reserved.

Dump of file BMC.arnettoc.dll

File Type: DLL

  Image has the following dependencies:

    MSVCR71.dll
    KERNEL32.dll
    USER32.dll
    mscoree.dll
    arcni80_build001.dll
    OLEAUT32.dll

  Summary

        3000 .data
        7000 .rdata
        1000 .reloc
        1000 .rsrc
        2000 .text

The next step is to verify that all of the dependent DLLs exist on the system and are accessible to my application. All of these DLLs exist except MSVCR71.DLL. But installing that DLL didn’t fix the problem. So where are the other dependencies hiding? The most likely candidate is another ARAPI.NET DLL. We can see that there is a dependency on arcni80_build001.dll, so let’s look at its dependencies.

C:\Program Files (x86)\BMC Software\ARAPI80.NET> "C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin\dumpbin.exe" /dependents arcni80_build001.dll
Microsoft (R) COFF/PE Dumper Version 12.00.21005.1
Copyright (C) Microsoft Corporation.  All rights reserved.

Dump of file arcni80_build001.dll

File Type: DLL

  Image has the following dependencies:

    arapi80_build001.dll
    cmdbapi75.dll
    KERNEL32.dll
    OLEAUT32.dll
    MSVCP71.dll
    MSVCR71.dll

  Summary

       43000 .data
        B000 .rdata
        C000 .reloc
       97000 .text

Here we can see that we also need MSVCP71.DLL. Installing that DLL resolves the issue. Great, but what are these files and where do they come from? The following article from Microsoft gives us the answer.

Redistribution of the shared C runtime component in Visual C++
http://support.microsoft.com/en-us/kb/326922

So, as you can see, these are C-Runtime DLLs from .NET 1.1.

Unfortunately, BMC's .NET library was built against a very old version of .NET, and they really need to fix this: the .NET 1.1 runtime components no longer ship with the OS. Newer versions of .NET are supposed to be backward compatible, and, for the most part, they are. At a minimum, BMC should have included these DLLs in their distribution, but applications and libraries really should be recompiled to remove dependencies on these very old runtime components.

So, what if you don’t have copies of these DLLs available anywhere? Where can you find this from a reliable source?

After a bit of searching I found:

Microsoft .NET Framework Version 1.1 Redistributable Package
http://www.microsoft.com/en-us/download/details.aspx?id=26

There are a few problems with this package. First, it will not install on anything newer than Windows XP or Server 2003. Second, this package does not contain both DLLs. It only contains MSVCR71.DLL. After more searching, I found the .NET 1.1 SDK.

.NET Framework SDK Version 1.1
http://www.microsoft.com/en-us/download/details.aspx?id=16217

Again, this does not play well with newer OS versions, but the good news is that it contains both DLLs that we need. So now I just need to extract the files somehow.

When you download this package you get a single setup.exe. Looking at the EXE file's detailed properties, I see that it is an IExpress package. IExpress is a setup framework that first shipped in the IEAK to help you package custom-branded installations of Internet Explorer. You can read more about it here: https://technet.microsoft.com/en-us/library/dd346761.aspx

An IExpress package supports the following command-line options:

/Q  -- Quiet mode for the package
/T: -- Specifies the temporary working folder
/C  -- Extract files only to the folder, when used with /T:
/C: -- Override the install command defined by the author

So the next step is to extract the contents of setup.exe to a folder.

D:\Downloads> setup.exe /t:D:\Downloads\dotnet11sdk /c

D:\Downloads\dotnet11sdk> dir
 Volume in drive D has no label.
 Volume Serial Number is 40F1-817D

 Directory of D:\Downloads\dotnet11sdk

04/02/2015  03:59 PM    <DIR>          .
04/02/2015  03:59 PM    <DIR>          ..
02/20/2003  06:48 PM            94,208 Install.exe
09/26/2001  05:07 PM         1,707,856 InstMsi.exe
09/11/2001  02:46 PM         1,821,008 InstMsiW.exe
03/29/2003  02:55 AM        81,380,073 netfxsd1.cab
03/29/2003  02:55 AM        26,264,064 netfxsdk.msi
               5 File(s)    111,267,209 bytes
               2 Dir(s)  64,240,427,008 bytes free

The netfxsd1.cab file contains all of the files to be installed by netfxsdk.msi. All you need to do is open the CAB file using Windows Explorer and extract the following two files:

  • FL_msvcp71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8
  • FL_msvcr71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8

Then just rename the extracted files and copy them to the ARAPI.NET folder.

D:\Downloads\dotnet11sdk> ren FL_msvcp71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8 msvcp71.dll
D:\Downloads\dotnet11sdk> ren FL_msvcr71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8 msvcr71.dll

Now your applications should work.
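If you want to confirm that nothing else is missing, the manual existence checks can be scripted. Here is a small sketch of that idea; the `Find-MissingDll` function name and the folder list are my own, not part of any ARAPI.NET or Visual Studio tooling:

```powershell
# Report which DLLs from a dependency list (e.g. the names printed by
# DUMPBIN /DEPENDENTS) cannot be found in any of the given search folders.
function Find-MissingDll {
    param(
        [string[]]$Name,       # DLL names to look for
        [string[]]$SearchDir   # folders to probe
    )
    foreach ($dll in $Name) {
        $found = $SearchDir | Where-Object { Test-Path (Join-Path $_ $dll) }
        if (-not $found) { $dll }   # emit only the names that were not found
    }
}

# Example: check the ARAPI.NET folder plus the usual system folders
$dirs = 'C:\Program Files (x86)\BMC Software\ARAPI80.NET',
        "$env:windir\System32", "$env:windir\SysWOW64"
Find-MissingDll -Name 'MSVCP71.dll','MSVCR71.dll' -SearchDir $dirs
```

Note that this only catches statically linked dependencies that are missing from the probed folders; it will not detect DLLs loaded dynamically at run time.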

Posted in ARAPI, PowerShell, Remedy

Use PowerShell and EWS to find out who is sending you email

I get a lot of email from many different sources. Much of it is automated alerts generated by service accounts that monitor the various applications my team supports. Each month I like to see how many messages I have gotten from each source. Looking at these numbers over time can help identify trends: if we are suddenly getting more alerts from a particular sender, we may want to look more closely at the health of that system.

Using Outlook’s rules engine I send all of these alert messages to a specific folder. Now I just need an easy way to count them. I created a script that scans that folder and counts the number of messages from each sender. The output looks like this:

Count Name
----- ----
   10 Service Account A <SMTP:svca@contoso.com>
   10 Ops Monitor 2 <SMTP:opsmon2@contoso.com>
    7 Ops Monitor 3 <SMTP:opsmon3@contoso.com>
    6 Service Account D <SMTP:svcd@contoso.com>
    6 Service Account E <SMTP:svce@contoso.com>

The script is pretty simple. I created two functions:

  • one to find the specific folder in the mailbox
  • one to iterate through all the items in the folder

To find the target folder, you must walk the folder tree until you reach your destination. Once you have the target folder, you can create an ItemView and search for all the messages in the folder. PowerShell's Group-Object cmdlet does the work of counting for you.
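The counting itself is all Group-Object. Here is the pattern in miniature, using -NoElement so only the names and tallies are kept (the sender addresses are made up for illustration):

```powershell
# Count occurrences of each sender with Group-Object, keeping just the
# tallies (-NoElement) and sorting the busiest sender to the top.
$senders = 'opsmon2@contoso.com', 'svca@contoso.com', 'opsmon2@contoso.com'
$senders |
    Group-Object -NoElement |
    Sort-Object Count -Descending |
    Format-Table -AutoSize
# opsmon2@contoso.com appears twice, so it sorts first with Count 2
```

The full script below does exactly this, except the objects being grouped are EWS mail items and the grouping property is Sender.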

# Load the EWS dll
Add-Type -Path 'C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll'

#-----------------------------------------------------
function GetTargetFolder {
   param([string]$folderPath)

   $fldArray = $folderPath.Split("\")
   $tfTargetFolder = $MsgRoot

   for ($x = 1; $x -lt $fldArray.Length; $x++)
   {
      #$fldArray[$x]
      $fvFolderView = new-object Microsoft.Exchange.WebServices.Data.FolderView(1)
      $SfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo(
         [Microsoft.Exchange.WebServices.Data.FolderSchema]::DisplayName,
         $fldArray[$x]
      )
      $findFolderResults = $service.FindFolders($tfTargetFolder.Id,$SfSearchFilter,$fvFolderView)
      if ($findFolderResults.TotalCount -gt 0)
      {
         foreach($folder in $findFolderResults.Folders)
         {
             $tfTargetFolder = $folder
         }
      }
      else
      {
         "Error Folder Not Found"
         $tfTargetFolder = $null
         break
      }
   }
   $tfTargetFolder
}
#-----------------------------------------------------
function GetItems {
   param ($targetFolder)
   #Define ItemView to retrieve just 100 items at a time
   $ivItemView = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100) 

   $AQSString = $null  #find all messages
   do
   {
        $fiItems = $service.FindItems($targetFolder.Id,$AQSString,$ivItemView)
        foreach($Item in $fiItems.Items)
        {
            $Item.Load()
            $Item
        }
        $ivItemView.Offset += $fiItems.Items.Count
   }
   while($fiItems.MoreAvailable -eq $true)
}
#-----------------------------------------------------
$ExchangeVersion = [Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP2
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService($ExchangeVersion)

$service.UseDefaultCredentials = $true
$MailboxName = "mymailbox@contoso.com"
$service.AutodiscoverUrl($MailboxName)

#Bind to the Root of the mailbox so I can search the folder namespace for the target
$MsgRootId = [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot
$MsgRoot = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service,$MsgRootId)
$targetFolder = GetTargetFolder '\Inbox\Alert Message\Current'

$itemList = GetItems $targetFolder
$itemList | group-object Sender -noelement | sort Count -desc | ft -a
Posted in PowerShell