Using Git from PowerShell

This year has been full of changes for me. One of the biggest is that my job now requires me to use Git and GitHub for almost all of my work. Before this job, I had never used Git. By default, the Git installer installs the bash command shell, and most of the documentation assumes you are using bash. However, I prefer to work in PowerShell. In this article I show how I set up my environment to enable Git functionality in PowerShell. This is not meant to be a tutorial on using Git but, rather, an example of what works for me and my workflow.

Download and install Git for Windows

The first step is to install Git for Windows.

Download and run the Git for Windows installer. As you step through the installation wizard you are presented with several options. The following is a list of the options on each page of the installation wizard with the reasoning behind my choice.

  • The Select Components page
    • Check Git LFS (Large File Support)
    • Check Associate .git* configuration files with the default text
    • Check Use a TrueType font in all console windows
      I prefer the TrueType font Consolas as my monospaced font for command shells and code editors.
  • The Choosing the default editor used by Git page
    • Select Use Visual Studio Code as Git’s default editor
      VS Code does everything.
  • The Adjusting your PATH environment page
    • Select Use Git from the Windows Command Prompt
      This adds the Git tools to your PATH so that it works for Cmd, PowerShell, or bash.
  • The Choosing HTTPS transport backend page
    • Select Use the native Windows Secure Channel library
  • The Configure the line ending conversions page
    • Select Checkout Windows-style, commit Unix-style line endings
      This is the recommended setting on Windows and provides the most compatibility for cross-platform projects.
  • The Configuring the terminal emulator to use with Git bash page
    • Select Use Windows’ default console window
      This is the console that PowerShell uses and works best with other Windows console-based applications.
  • The Configuring extra options page
    • Check Enable file system caching
      This option is checked by default. Caching improves performance of certain Git operations.
    • Check Enable Git Credential Manager
      The Git Credential Manager for Windows (GCM) provides secure Git credential storage on Windows, including multi-factor authentication support for Visual Studio Team Services, Team Foundation Server, and GitHub. Enabling GCM keeps Git from prompting for your credentials on nearly every operation. For more information, see the GCM documentation on GitHub.
    • Check Enable symbolic links

These are the options I chose. You may have different requirements in your environment.
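Once the installer finishes, open a new PowerShell session (so the PATH change takes effect) and confirm that the Git tools resolve:

```powershell
# Verify that Git is on the PATH and check the installed version
Get-Command git
git --version
```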

Install the Posh-Git module

Now that we have the Git client installed, we need to enable Git functionality for PowerShell. Posh-Git is available from the PowerShell Gallery. For more information about Posh-Git, see Posh-Git on GitHub.

If you have PowerShellGet (or the older PsGet module) installed, just run:

Install-Module posh-git

Alternatively, you can install Posh-Git manually using the instructions in the README.MD in the GitHub repository.

Once Posh-Git is installed you need to integrate Git into your PowerShell environment. Posh-Git includes an example profile script that you can adapt to your needs.
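If you have never created a PowerShell profile, you can create an empty one and open it for editing like this:

```powershell
# Create the profile script if it does not already exist, then open it
if (-not (Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force }
notepad $PROFILE
```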

Integrate Git into your PowerShell environment

Integrating Git into PowerShell is simple. There are three main things to do:

  1. Load the Posh-Git module
  2. Start the SSH Agent Service
  3. Configure your prompt to show the Git status

Add the following lines to your PowerShell profile script.

Import-Module posh-git
Start-SshAgent -Quiet
function global:prompt {
    $identity = [Security.Principal.WindowsIdentity]::GetCurrent()
    $principal = [Security.Principal.WindowsPrincipal] $identity
    $name = ($identity.Name -split '\\')[1]
    $path = Convert-Path $executionContext.SessionState.Path.CurrentLocation
    $prefix = "($env:PROCESSOR_ARCHITECTURE)"

    if($principal.IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) { $prefix = "Admin: $prefix" }
    $realLASTEXITCODE = $LASTEXITCODE
    $prefix = "Git $prefix"
    Write-Host ("$prefix[$Name]") -nonewline
    Write-VcsStatus
    ("`n$('+' * (get-location -stack).count)") + "PS $($path)$('>' * ($nestedPromptLevel + 1)) "
    $global:LASTEXITCODE = $realLASTEXITCODE
    $host.ui.RawUI.WindowTitle = "$prefix[$Name] $($path)"
}

The prompt function integrates Git into your PowerShell prompt to show an abbreviated Git status. See the Posh-Git README for a full explanation of the abbreviated status. I have also customized my prompt to show my user context, whether I am running a 64-bit or 32-bit shell, and whether I am running elevated. Customize this function to meet your needs or preferences.

At this point you are done. You can use Git from PowerShell. Go forth and clone a repo.
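For example, you could clone the Posh-Git repository itself; any HTTPS credential prompts are handled by GCM:

```powershell
git clone https://github.com/dahlbyk/posh-git.git
Set-Location posh-git
git status   # the prompt also shows the abbreviated status for this repo
```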

Customize your Git environment

You may want to customize some of the settings of your Git environment, especially if this is a new installation. To be a good project contributor, you should identify yourself so that Git knows who to credit (or blame) for your commits. I also found that the default colors Git uses in the shell could be hard to read, so I customized them to be more visible. For more information, see the Customizing Git topic in the Git documentation.

The following commands only need to be run once. You are setting global preferences so, once they are set, they are used every time you start a new shell.

# Configure your user information to match your GitHub profile
git config --global user.name "John Doe"
git config --global user.email "alias@example.com"

# Set up the colors to improve visibility in the shell
git config --global color.ui true
git config --global color.status.changed "magenta bold"
git config --global color.status.untracked "red bold"
git config --global color.status.added "yellow bold"
git config --global color.status.unmerged "yellow bold"
git config --global color.branch.remote "magenta bold"
git config --global color.branch.upstream "blue bold"
git config --global color.branch.current "green bold"
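To double-check what you have configured, list your global settings:

```powershell
# Show everything currently set at the global (per-user) level
git config --global --list
```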

As I said at the beginning, this is what works for me. Your mileage may vary. Customize this for your own preferences and environment.

In future articles, I plan to share scripts I have created to help me with my Git workflow. Do you use Git with PowerShell? Share your questions and experiences in the comments.

Posted in Git, GitHub, PowerShell

Opening the door to the Mystery of Dates in PowerShell

Formatting and converting dates can be very confusing. Every programming language, operating system, and runtime environment seems to do it differently. Part of the difficulty in conversion is knowing what units you are starting with.

First, it is helpful to know the Epoch (or starting date) a stored value is based on. Wikipedia has a good article on this. Here is a brief excerpt.

  • January 1, AD 1 – Microsoft .NET. Rationale: the Common Era; also used by ISO 2014 and RFC 3339.
  • January 1, 1601 – NTFS, COBOL, Win32/Win64. Rationale: 1601 was the first year of the 400-year Gregorian calendar cycle in effect when Windows NT was created.
  • January 0, 1900 – Microsoft Excel, Lotus 1-2-3. Rationale: January 0, 1900 is logically equivalent to December 31, 1899, but these systems do not allow users to specify the latter date.
  • January 1, 1904 – Apple's Mac OS through version 9. Rationale: 1904 was the first leap year of the 20th century.
  • January 1, 1970 – The Unix epoch, also known as POSIX time. Used by Unix and Unix-like systems (Linux, Mac OS X) and many programming languages: most C/C++ implementations, Java, JavaScript, Perl, PHP, Python, Ruby, Tcl, ActionScript.
  • January 1, 1980 – IBM BIOS INT 1Ah, DOS, OS/2, and the FAT12, FAT16, FAT32, and exFAT file systems. Rationale: the IBM PC with its BIOS, as well as 86-DOS, MS-DOS, and PC DOS with their FAT12 file system, were developed and introduced between 1980 and 1981.
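The gap between two epochs is easy to compute with DateTime arithmetic. For example, the offset between the NTFS/FILETIME epoch and the Unix epoch works out to the well-known 11,644,473,600 seconds:

```powershell
$ntfsEpoch = Get-Date "1601-01-01"
$unixEpoch = Get-Date "1970-01-01"
($unixEpoch - $ntfsEpoch).Days           # 134774 days
($unixEpoch - $ntfsEpoch).TotalSeconds   # 11644473600 seconds
```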

Common Date Conversion Tasks

WMI Dates

PS > $installDate = (Get-WmiObject win32_operatingsystem | select Installdate ).InstallDate
PS > [system.management.managementdatetimeconverter]::ToDateTime($InstallDate)
Friday, September 12, 2008 6:50:57 PM

PS > [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime($(get-date))
20151127144036.886000-480

Excel dates – Excel stores dates as sequential serial numbers so that they can be used in calculations. By default, January 1, 1900, is serial number 1.

PS > ((Get-Date).AddDays(1) - (get-date "12/31/1899")).Days
42335

In this example, Days is 42335, which is the serial number for 11/27/2015 in Excel. The date 12/31/1899 is equivalent to January 0, 1900. The difference between 12/31/1899 and 11/27/2015 is 42334 days but, since the serial numbers start at 1, you need to add 1 day to get the serial number for 11/27/2015.
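Going the other way, you can turn an Excel serial number back into a date. A sketch: for dates after February 1900 the effective base date is December 30, 1899, because Excel (for Lotus 1-2-3 compatibility) treats 1900 as a leap year:

```powershell
# Excel serial number to date (valid for serials after February 1900)
(Get-Date "12/30/1899").AddDays(42335)   # Friday, November 27, 2015
```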

Converting from custom string formats

PS > $information = '12Nov(2012)18h30m17s'
PS > $pattern = 'ddMMM\(yyyy\)HH\hmm\mss\s'
PS > [datetime]::ParseExact($information, $pattern, $null)
Monday, November 12, 2012 6:30:17 PM
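The same custom pattern works in reverse with ToString(); the escaped literal characters are honored there as well (the "Nov" abbreviation assumes an English culture):

```powershell
$pattern = 'ddMMM\(yyyy\)HH\hmm\mss\s'
$date = [datetime]::ParseExact('12Nov(2012)18h30m17s', $pattern, $null)
$date.ToString($pattern)   # 12Nov(2012)18h30m17s
```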

FILETIME conversion – FILETIME is a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 (UTC).

PS > get-aduser username -prop badPasswordTime,lastLogonTimestamp | select badPasswordTime,lastLogonTimestamp
badPasswordTime : 130927962789982434
lastLogonTimestamp : 130931333173599571

PS > [datetime]::fromfiletime(130927962789982434)
Monday, November 23, 2015 3:51:18 PM

PS > [datetime]::fromfiletime(130931333173599571)
Friday, November 27, 2015 1:28:37 PM
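DateTime converts in the other direction too:

```powershell
(Get-Date).ToFileTime()                           # current local time as a FILETIME value
[datetime]::FromFileTimeUtc(130927962789982434)   # the same timestamp as above, reported in UTC
```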

CTIME or Unix format – an integral value representing the number of seconds elapsed since 00:00:00 on January 1, 1970 UTC (that is, a Unix timestamp).

PS > $epoch = get-date "1/1/1970"
PS > $epoch.AddMilliseconds(1448302797803)
Monday, November 23, 2015 6:19:57 PM

PS > $epoch.AddSeconds(1448302797.803)
Monday, November 23, 2015 6:19:57 PM
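If you are on .NET Framework 4.6 or later, DateTimeOffset has these conversions built in, which also avoids any ambiguity about local versus UTC epochs:

```powershell
[datetimeoffset]::UtcNow.ToUnixTimeSeconds()                             # now, as a Unix timestamp
[datetimeoffset]::FromUnixTimeMilliseconds(1448302797803).LocalDateTime  # back to a local DateTime
```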

References

Standard Date and Time Format Strings in .NET
https://docs.microsoft.com/dotnet/standard/base-types/standard-date-and-time-format-strings

Custom Date and Time Format Strings in .NET
https://docs.microsoft.com/dotnet/standard/base-types/custom-date-and-time-format-strings

Formatting Dates and Times in PowerShell
https://docs.microsoft.com/previous-versions/windows/it-pro/windows-powershell-1.0/ee692801(v=technet.10)

PowerTip: Use PowerShell to Format Dates
https://devblogs.microsoft.com/scripting/powertip-use-powershell-to-format-dates/

Parsing Custom Date and Time Formats
http://community.idera.com/powershell/powertips/b/tips/posts/parsing-custom-date-and-time-formats
https://msdn.microsoft.com/en-us/library/system.datetime.parseexact(v=vs.110).aspx

Wikipedia – Epoch (reference date)
https://en.wikipedia.org/wiki/Epoch_(reference_date)

Posted in PowerShell, Uncategorized

Adding a Contact to a Distribution List with PowerShell

The PowerShell ActiveDirectory module has a lot of great features that I use on a daily basis. However, there is one shortcoming that I have struggled with for a while. I did a lot of internet searching and testing to see if I was missing some hidden secret. But, alas, this is one task that the AD module does not do.

Here is the scenario. We have a lot of AD groups (distribution lists) that we use for notification messages. We want to send notifications to mobile devices, which we do by sending an email to the device's email address. For example:

2065551212@mobilecarrier.xyz.com

These external email addresses are created as Contact objects in AD.

The problem is that the cmdlets for managing AD group objects only allow you to add objects that have a SamAccountName (and therefore a SID) to a group. This is fine for user and group objects, but Contact objects do not have SIDs. So what do you do?

The answer is to do it the way you would have done it in VBScript: use ADSI.

$dlGroup = [adsi]'LDAP://CN=DL-Group Name,OU=Corp Distribution Lists,DC=contoso,DC=net'
$dlGroup.Member.Add('CN=mobile-username,OU=Corp Contacts,DC=contoso,DC=net')
$dlGroup.psbase.CommitChanges()
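Removing a contact works the same way; this sketch reuses the illustrative distinguished names from above:

```powershell
$dlGroup = [adsi]'LDAP://CN=DL-Group Name,OU=Corp Distribution Lists,DC=contoso,DC=net'
$dlGroup.Member.Remove('CN=mobile-username,OU=Corp Contacts,DC=contoso,DC=net')
$dlGroup.psbase.CommitChanges()
```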
Posted in PowerShell

Create Remedy Work Orders with PowerShell

In previous posts, I have shown you how to query for data in Remedy using PowerShell and ARAPI.NET. Creating or updating information in Remedy is not much different.

  1. Log into the AR Server
  2. Build a list of Field IDs and values that you want the Work Order to contain
  3. Use the CreateEntry method to create a new Work Order
  4. Create a Qualifier string used to query for the newly created Work Order
  5. Build a list of Field IDs that you want to retrieve from Remedy
  6. Query a Remedy form (passing in the qualifier and field list)
  7. Process the results
  8. Log off the AR Server

The following script is a simple example to create a single Work Order in Remedy. It could easily be adapted to read from a CSV file or other data source to create multiple work orders. In fact, I wrote a tool in C# that is designed to create multiple Work Orders from CSV data. We use it to create the same Work Order task for multiple individual store locations.

The hardest part about creating items in Remedy is knowing which form to use and which data fields are required. It is not always obvious which fields are required. You may have to use Developer Studio to inspect the workflow and filter logic to figure all of that out. Also, fields like Product Categories, Operational Categories, Support Queues, etc. will have values specific to your environment.

param ($appServer = "appserver.contoso.net",
       $svcAccount = "remedyuserid",
       $svcPassword = "remedypassword",
       $remAuthDomain = "",
       $arSrvrPort = 51100,
       $formName = "WOI:WorkOrderInterface_Create")

add-type -path 'C:\Program Files (x86)\BMC Software\ARAPI80.NET\BMC.ARSystem.dll'

$arserver = New-Object -type BMC.ARSystem.Server
$arserver.Login($appServer, $svcAccount, $svcPassword, $remAuthDomain, $arSrvrPort)

$fieldIDs = @{     2="";           #Submitter
                   3="";           #Create Date
          1000000182="";           #Work Order ID
          ### Store identity
          1000000082="Contoso";    #Company
           301593100="s01174";     #RequesterLoginID
          1000000001="HQ";         #Location
          1000000018="Bloggs";     #Last Name
          1000000019="Joe";        #First Name
          ### WO Actions
          1000000000="Summary: short version of description";
          1000000151="Details: long version of description";
          1000000164="Low";
          1000000181="Project";
                   7="Assigned";
          1000000076="CREATE";
          ### Op Cat Tiers 1,2,3
          1000000063="Request";
          1000000064="Add hardware";
          1000000065="Wiring Closet";
          ### Prod Cat Tiers 1,2,3
          1000001270="Hardware Lifecycle";
          1000001271="Hardware replacement";
          1000001272="Internal Project";
          ### Product
          1000002268="Wireless Access";
          ### Manager Support Hierarchy
          1000000014="Infrastructure";  #Manager Support Org
          1000000015="Network";         #Manager Support Group
          1000000251="Contoso";         #Manager Company
          ### Support Hierarchy
          1000003227="Infrastructure";  #Support Org
          1000003228="Network";         #Support Group
          1000003229="Contoso";         #Company
          ### Customer Info Returned
          1000003296=""; #Customer Person ID
          1000003297=""; #Customer First Name
          1000003298=""; #Customer Last Name
          1000003299=""; #Customer Company
          1000003302=""; #Customer Email
          1000003306=""; #Customer Phone Number
         }
try
{

   #Build the list of field values to be used in the Create request - skip blank values
   [BMC.ARSystem.FieldValueList] $woValueList = New-Object -type BMC.ARSystem.FieldValueList
   $fieldIDs.keys | ForEach-Object {
      if ($fieldIDs[$_] -ne "") {
         $woValueList.Add($_,$fieldIDs[$_])
      }
   }
   #Create the new WO with the listed values
   $entryID = $arserver.CreateEntry($formName, $woValueList);

   #Build a new field list containing ALL the fields you want returned
   [BMC.ARSystem.EntryListFieldList] $woEntryFieldList = New-Object -type BMC.ARSystem.EntryListFieldList
   $fieldIDs.Keys | ForEach-Object { $woEntryFieldList.AddField($_); }

   #Query for the newly created WO by its EntryID
   $strSQL = "'1' = ""{0}""" -f $entryID   # field 1 is the Request ID; quote the entry ID string
   [BMC.ARSystem.EntryFieldValueList] $woEntryValueList = New-Object -type BMC.ARSystem.EntryFieldValueList
   $woEntryValueList = $arserver.GetListEntryWithFields($formName, $strSQL, $woEntryFieldList, 0, 50);

   #Output the results
   $fieldIDs.Keys | ForEach-Object { "[{0:0000000000}] {1}" -f $_,$woEntryValueList.fieldvalues[$_] }
}
catch
{
   $_
}
$arserver.Logout()

Updating values on an existing item in Remedy is not much different. The process is mostly the same, but you use the SetEntry method instead of the CreateEntry method. I will try to share an update example in an upcoming post.
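In the meantime, here is a rough sketch of what an update might look like. The exact SetEntry overload is an assumption on my part (check the ARAPI.NET documentation); field 7 is the Status field used in the create example above:

```powershell
# Sketch only: update the Status of an existing Work Order.
# Assumes SetEntry(formName, entryID, fieldValueList); verify the signature
# against your version of BMC.ARSystem.dll.
[BMC.ARSystem.FieldValueList] $updateList = New-Object -type BMC.ARSystem.FieldValueList
$updateList.Add(7, "In Progress")
$arserver.SetEntry($formName, $entryID, $updateList)
```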

Posted in ARAPI, PowerShell

Getting incident notes from Remedy using ARAPI.NET from PowerShell

Here is another example of using ARAPI.NET from PowerShell. This is a script I wrote and use every day to retrieve notes from a Remedy Incident. As a manager, I rarely use Remedy to work incidents. However, I frequently need to review the notes and ensure that incidents are being resolved efficiently. Logging into Remedy, finding the incident, and reviewing all of the notes can be a time-consuming process. This script makes it very easy for me to quickly see all of the notes and provide status updates to my management.

All I need is the incident number. Using the incident number passed on the command line, the script searches the HPD:Search-Worklog form for work log entries for the specified incident.

The output looks like this:

PS E:\> .\get-incident.ps1 IR0000000183197
===============================
incident      : IR0000000183197
submitdate    : 4/28/2015 8:22:47 AM
customer      : serverops
workphone     : 1 206 555-1212
source        : Systems Management
servicetype   : Monitoring Event
status        : Assigned
assignedgroup : Windows OS
assignee      :
opcat1        : Business Process
opcat2        :
opcat3        :
prodcat1      : Software
prodcat2      : Operating System
prodcat3      : Server OS
impact        : Moderate
urgency       : Medium
priority      : Medium
summary       : server1.contoso.net Logical disk C: on host server1.contoso.net has 10 % free space remaining.
notes         :

-------------------------------
date        : 4/28/2015 8:22:42 AM
submitter   : remedyuser15
logtype     : General Information
Attachment1 :
Attachment2 :
Attachment3 :
description : Alert Info for Ticketing:
              host: server1.contoso.net
              Severity: MINOR
              Message: Logical disk C: on host server1.contoso.net has 10 % free space remaining.
              Object: C:
              ITSM Queue: Windows OS
              Modified: 4/28/2015 8:12:19 AM
===============================

The pattern of the script is the same as my previous examples. I set up the parameters (which can be overridden on the command line) to connect to the Remedy application server. Then I define the field IDs that I want to retrieve. I also define some string dictionaries that I use to convert numeric values returned by Remedy into the text values that are displayed by the mid-tier.

param ($searchTerm,
       $fieldID=1000000161,
       $appServer = "appserver.contoso.net",
       $svcAccount = "remedyuserid",
       $svcPassword = "remedypassword",
       $remAuthDomain = "",
       $arSrvrPort = 51100,
       $formName = "HPD:Search-Worklog")

add-type -path 'C:\Program Files (x86)\BMC Software\ARAPI80.NET\BMC.ARSystem.dll'

$incFields = @{         7="Status         ";
                200000003="ProdCat 1      ";
                200000004="ProdCat 2      ";
                200000005="ProdCat 3      ";
               1000000000="Description    ";
               1000000018="Last Name      ";
               1000000019="First Name     ";
               1000000056="Phone Number   ";
               1000000063="OpCat 1        ";
               1000000064="OpCat 2        ";
               1000000065="OpCat 3        ";
               1000000099="Service Type   ";
               1000000151="Description    ";
               1000000161="Incident Number";
               1000000162="Urgency        ";
               1000000163="Impact         ";
               1000000164="Priority       ";
               1000000215="Source         ";
               1000000217="Assigned Group ";
               1000000218="Assignee       ";
               1000000422="Owner Group    ";
               1000000560="Reported Date  ";
               1000000881="Status Reason  ";
               1000005782="Contact Last   ";
               1000005783="Contact First  ";
               1000005785="Contact Phone  ";
              }

$wlFields = @{  301394441="Description    ";
               1000000157="Submit Date    ";
               1000000159="Submitter      ";
               1000000170="Work Log Type  ";
               1000000351="Attachment01   ";
               1000000352="Attachment02   ";
               1000000353="Attachment03   ";
               1000002134="Work Log Date  ";
               1000003610="Summary        ";
             }
$arserver = new-object BMC.ARSystem.Server
$arserver.Login($appServer, $svcAccount, $svcPassword, $remAuthDomain, $arSrvrPort)
$qualifier = "'{0}' = ""{1}""" -f $fieldID,$searchTerm

[BMC.ARSystem.EntryListFieldList] $formEntryFieldList = new-object BMC.ARSystem.EntryListFieldList
$incFields.Keys | %{ $formEntryFieldList.AddField($_) }
$wlFields.Keys  | %{ $formEntryFieldList.AddField($_) }

$impactvalues = @{ 1000="Extensive"; 2000="Significant"; 3000="Moderate"; 4000="Minor" }
$urgencyvalues = @{ 1000="Critical"; 2000="High"; 3000="Medium"; 4000="Low" }
$priorityvalues = @{ 0="Critical"; 1="High"; 2="Medium"; 3="Low" }
$statusvalues = @{ 0="New"; 1="Assigned"; 2="In Progress"; 3="Pending"; 4="Resolved"; 5="Closed"; 6="Cancelled"; }
$servicevalues = @{ 0="User Restoration"; 1="User Service Request"; 2="System Restoration"; 3="Monitoring Event"; }
$sourcevalues =  @{ 1000="Direct Input"; 2000="Email"; 3000="External Escalation"; 4000="Fax"; 4200="Self Service"; 5000="Systems Management"; 6000="Phone"; 7000="Voice Mail"; 8000="Walk In"; 9000="Web"; 10000="Other"; 11000="BMC Impact Manager Event"; }
$worklogtypes = @{
   1000="----- Customer Inbound -----";
   2000="Customer Communication";
   3000="Customer Follow-up";
   4000="Customer Status Update";
   5000="----- Customer Outbound -----";
   6000="Closure Follow Up";
   7000="Detail Clarification";
   8000="General Information";
   9000="Resolution Communications";
   10000="Satisfaction Survey";
   11000="Status Update";
   12000="----- General -----";
   13000="Incident Task / Action";
   14000="Problem Script";
   15000="Working Log";
   16000="Email System";
   17000="Paging System";
   18000="BMC Impact Manager Update";
   35000="Chat";
   36000="B2B Vendor Update";
 }

[BMC.ARSystem.EntryFieldValueList] $fieldValues = $arserver.GetListEntryWithFields($formName, $qualifier, $formEntryFieldList, 0, 0);
$irfields = $fieldValues[0].fieldvalues

$irheader = [ordered]@{
               incident     =$irfields[1000000161];
               customer     =$irfields[1000000019] + " " + $irfields[1000000018];
               workphone    =$irfields[1000000056];
               source       =$sourcevalues[$irfields[1000000215]];
               servicetype  =$servicevalues[$irfields[1000000099]];
               status       =$statusvalues[$irfields[0000000007]];
               assignedgroup=$irfields[1000000217];
               assignee     =$irfields[1000000218];
               opcat1       =$irfields[1000000063];
               opcat2       =$irfields[1000000064];
               opcat3       =$irfields[1000000065];
               prodcat1     =$irfields[0200000003];
               prodcat2     =$irfields[0200000004];
               prodcat3     =$irfields[0200000005];
               impact       =$impactvalues[$irfields[1000000163]];
               urgency      =$urgencyvalues[$irfields[1000000162]];
               priority     =$priorityvalues[$irfields[1000000164]];
               summary      =$irfields[1000000000];
               notes        =$irfields[1000000151];
             }

"==============================="
new-object -type PSObject -prop $irheader             

foreach ($wl in $fieldvalues) {
   "-------------------------------"
   #$wlFields.Keys | %{ "[{0:0000000000}] {1} = {2}" -f $_,$wlFields[$_],$wl.fieldvalues[$_].tostring() }
   $wldata = [ordered]@{
                date        = $wl.fieldvalues[1000002134];
                submitter   = $wl.fieldvalues[1000000159];
                logtype     = $worklogtypes[$wl.fieldvalues[1000000170]];
                Attachment1 = $wl.fieldvalues[1000000351];
                Attachment2 = $wl.fieldvalues[1000000352];
                Attachment3 = $wl.fieldvalues[1000000353];
                description = $wl.fieldvalues[0301394441];
             }
   new-object -type PSObject -prop $wldata
}
"==============================="

$arserver.Logout()
Posted in ARAPI, PowerShell

Fixing “The specified module could not be found.” errors when using ARAPI.NET

We have large projects where we are upgrading systems in our stores and we will schedule the same work for 30+ stores at a time. But to track that work we want a Work Order for each store. Entering all of those into our Remedy system by hand would take forever. So, I wrote a little utility in C# to bulk-create Remedy Work Orders from a CSV data file.

It worked great.

Until I deployed it to the people actually doing the work.

The weird thing was that it worked for some users and not for others. It made no sense to me. I couldn’t figure out which dependent files were missing. The error looked like this:

PS C:\Program Files (x86)\BMC Software\ARAPI80.NET> .\BulkWOCreate.exe /csv .\testdata.csv
Unhandled Exception: System.IO.FileNotFoundException: The specified module could not be found. (Exception from HRESULT:0x8007007E)
   at BMC.ARSystem.Server._Eval(Object v)
   at BMC.ARSystem.Server._performLogin(String methodName, String server, String user, String password, String authentication, String locale, String charSet, Int32 port, String apiCmdLog, String apiResLog, Boolean logInitAndTerm)
   at BMC.ARSystem.Server.Login(String server, String user, String password, String authentication, String locale, String charSet, Int32 port)
   at BMC.ARSystem.Server.Login(String server, String user, String password, String authentication, Int32 port)
   at BulkWOCreate.Program.Main(String[] args)

I found I had the same problem with my ARAPI.NET PowerShell scripts on the same machines. In PowerShell the error looks like this:

Exception calling "Login" with "5" argument(s): "Could not load file or assembly 'BMC.arnettoc.dll' or one of its dependencies. The specified module could not be found."
At C:\Program Files (x86)\BMC Software\ARAPI80.NET\get-arform.ps1:20 char:4
+    $arserver.Login($ARServerName, $ARSvcAccount, $ARSvcPassword, $ARAuthenticati ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : FileNotFoundException

This error was actually more helpful because it calls out BMC.arnettoc.dll. This gives us a place to start investigating.

So the question is how to figure out what the dependencies are. The DUMPBIN tool from Visual Studio can show us the statically linked dependencies. Let's look at BMC.arnettoc.dll:

C:\Program Files (x86)\BMC Software\ARAPI80.NET> "C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin\dumpbin.exe" /dependents BMC.arnettoc.dll
Microsoft (R) COFF/PE Dumper Version 12.00.21005.1
Copyright (C) Microsoft Corporation.  All rights reserved.

Dump of file BMC.arnettoc.dll

File Type: DLL

  Image has the following dependencies:

    MSVCR71.dll
    KERNEL32.dll
    USER32.dll
    mscoree.dll
    arcni80_build001.dll
    OLEAUT32.dll

  Summary

        3000 .data
        7000 .rdata
        1000 .reloc
        1000 .rsrc
        2000 .text

The next step is to verify that all of the dependent DLLs exist on the system and are accessible to my application. All of these DLLs exist except MSVCR71.DLL. But installing that DLL didn't fix the problem. So where are the other dependencies hiding? The most likely candidate is another ARAPI.NET DLL. We can see a dependency on arcni80_build001.dll, so let's look at its dependencies.
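A quick way to hunt for a missing DLL from PowerShell is to check the application folder and each PATH directory for it by name (a sketch; the Windows loader checks other locations as well):

```powershell
# Look for the two C runtime DLLs in the current folder and along PATH
$dirs = @("$PWD") + ($env:Path -split ';' | Where-Object { $_ })
foreach ($dll in 'msvcr71.dll', 'msvcp71.dll') {
    $hits = $dirs | Where-Object { Test-Path (Join-Path $_ $dll) }
    '{0} : {1}' -f $dll, $(if ($hits) { $hits -join '; ' } else { 'NOT FOUND' })
}
```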

C:\Program Files (x86)\BMC Software\ARAPI80.NET> "C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\bin\dumpbin.exe" /dependents arcni80_build001.dll
Microsoft (R) COFF/PE Dumper Version 12.00.21005.1
Copyright (C) Microsoft Corporation.  All rights reserved.

Dump of file arcni80_build001.dll

File Type: DLL

  Image has the following dependencies:

    arapi80_build001.dll
    cmdbapi75.dll
    KERNEL32.dll
    OLEAUT32.dll
    MSVCP71.dll
    MSVCR71.dll

  Summary

       43000 .data
        B000 .rdata
        C000 .reloc
       97000 .text

Here we can see that we also need MSVCP71.DLL. Installing that DLL resolves the issue. Great, but what are these files and where do they come from? The following article from Microsoft gives us the answer.

Redistribution of the shared C runtime component in Visual C++
https://support.microsoft.com/en-us/help/326922/redistribution-of-the-shared-c-runtime-component-in-visual-c

So, as you can see, these are C-Runtime DLLs from .NET 1.1.

Unfortunately, BMC's .NET library was built against a very old version of .NET, and the .NET 1.1 runtime components no longer ship in the OS. Newer versions of .NET are supposed to be backward compatible and, for the most part, they are. BMC should have included these DLLs in their distribution but, really, applications and libraries should be recompiled to remove dependencies on such old runtime components.

So, what if you don’t have copies of these DLLs available anywhere? Where can you find this from a reliable source?

After a bit of searching I found:

Microsoft .NET Framework Version 1.1 Redistributable Package
https://www.microsoft.com/download/details.aspx?id=26

There are a few problems with this package. First, it will not install on anything newer than Windows XP or Server 2003. Second, it does not contain both DLLs; it only contains MSVCR71.DLL. After more searching, I found the .NET 1.1 SDK.

.NET Framework SDK Version 1.1
https://www.microsoft.com/download/details.aspx?id=16217

Again, this does not play well with newer OS versions, but the good news is that it contains both DLLs we need. Now I just need to extract the files somehow.

When you download this package you get a single setup.exe. Looking at the EXE file's detailed properties, I see that it is an IExpress package. IExpress is a setup framework that first shipped in the IEAK to help package custom-branded installations of Internet Explorer. You can read more about it here: https://docs.microsoft.com/en-us/internet-explorer/ie11-ieak/iexpress-command-line-options

An IExpress package has the following command-line options:

/Q  -- Quiet mode for the package
/T: -- Specifies the temporary working folder
/C  -- Extract files only to the folder (used with /T:)
/C: -- Override the install command defined by the author

So the next step is to extract the contents of setup.exe to a folder.

D:\Downloads> setup.exe /t:D:\Downloads\dotnet11sdk /c

D:\Downloads\dotnet11sdk> dir
 Volume in drive D has no label.
 Volume Serial Number is 40F1-817D

 Directory of D:\Downloads\dotnet11sdk

04/02/2015  03:59 PM    &lt;DIR&gt;          .
04/02/2015  03:59 PM    &lt;DIR&gt;          ..
02/20/2003  06:48 PM            94,208 Install.exe
09/26/2001  05:07 PM         1,707,856 InstMsi.exe
09/11/2001  02:46 PM         1,821,008 InstMsiW.exe
03/29/2003  02:55 AM        81,380,073 netfxsd1.cab
03/29/2003  02:55 AM        26,264,064 netfxsdk.msi
               5 File(s)    111,267,209 bytes
               2 Dir(s)  64,240,427,008 bytes free

The netfxsd1.cab file contains all of the files to be installed by netfxsdk.msi. All you need to do is open the CAB file using Windows Explorer and extract the following two files:

  • FL_msvcp71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8
  • FL_msvcr71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8

Then just rename the extracted files and copy them to the ARAPI.NET folder.

D:\Downloads\dotnet11sdk> ren FL_msvcp71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8 msvcp71.dll
D:\Downloads\dotnet11sdk> ren FL_msvcr71_dll_____X86.3643236F_FC70_11D3_A536_0090278A1BB8 msvcr71.dll
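
The whole extraction can also be scripted. Here is a sketch using expand.exe (which ships with Windows) to pull the two files out of the CAB; the working folder and the ARAPI.NET destination path are examples, so adjust them for your system:

```powershell
# Extract only the two runtime DLLs from the SDK's CAB file.
$work = 'D:\Downloads\dotnet11sdk'
expand.exe "$work\netfxsd1.cab" -F:FL_msvc*71_dll* $work

# Strip the MSI component suffix to recover the real DLL names,
# then copy both files next to the ARAPI.NET libraries.
$arapiPath = 'C:\Program Files (x86)\BMC Software\ARAPI.NET'  # example path
Get-ChildItem "$work\FL_msvc*71_dll*" | ForEach-Object {
    $dllName = $_.Name -replace '^FL_(msvc[pr]71)_dll.*$', '$1.dll'
    Rename-Item $_.FullName $dllName
    Copy-Item (Join-Path $work $dllName) $arapiPath
}
```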

Now your applications should work.

Tagged with:
Posted in ARAPI, PowerShell, Remedy

Use PowerShell and EWS to find out who is sending you email

I get a lot of email from a lot of different sources. Much of it consists of automated alerts generated by service accounts that monitor the various applications my team supports. Each month I like to see how many messages I have received from each source. Looking at these numbers over time can help identify trends: if we are suddenly getting more alerts from a particular sender, we may want to look more closely at the health of that system.

Using Outlook’s rules engine I send all of these alert messages to a specific folder. Now I just need an easy way to count them. I created a script that scans that folder and counts the number of messages from each sender. The output looks like this:

Count Name
----- ----
   10 Service Account A <SMTP:svca@contoso.com>
   10 Ops Monitor 2 <SMTP:opsmon2@contoso.com>
    7 Ops Monitor 3 <SMTP:opsmon3@contoso.com>
    6 Service Account D <SMTP:svcd@contoso.com>
    6 Service Account E <SMTP:svce@contoso.com>

The script is pretty simple. I created two functions:

  • one to find the specific folder in the mailbox
  • one to iterate through all the items in the folder

To find the target folder you must walk the folder tree until you reach your destination. Once you have the target folder you can create an ItemView and search for all the messages in the folder. PowerShell’s Group-Object cmdlet does the work of counting for you.

# Load the EWS dll
Add-Type -Path 'C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll'

#-----------------------------------------------------
function GetTargetFolder {
   param([string]$folderPath)

   $fldArray = $folderPath.Split("\")
   $tfTargetFolder = $MsgRoot

   for ($x = 1; $x -lt $fldArray.Length; $x++)
   {
      #$fldArray[$x]
      $fvFolderView = new-object Microsoft.Exchange.WebServices.Data.FolderView(1)
      $SfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo(
         [Microsoft.Exchange.WebServices.Data.FolderSchema]::DisplayName,
         $fldArray[$x]
      )
      $findFolderResults = $service.FindFolders($tfTargetFolder.Id,$SfSearchFilter,$fvFolderView)
      if ($findFolderResults.TotalCount -gt 0)
      {
         foreach($folder in $findFolderResults.Folders)
         {
             $tfTargetFolder = $folder
         }
      }
      else
      {
         "Error Folder Not Found"
         $tfTargetFolder = $null
         break
      }
   }
   $tfTargetFolder
}
#-----------------------------------------------------
function GetItems {
   param ($targetFolder)
   #Define ItemView to retrieve just 100 items at a time
   $ivItemView = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)

   $AQSString = $null  #find all messages
   do
   {
        $fiItems = $service.FindItems($targetFolder.Id,$AQSString,$ivItemView)
        foreach($Item in $fiItems.Items)
        {
            $Item.Load()
            $Item
        }
        $ivItemView.Offset += $fiItems.Items.Count
   }
   while($fiItems.MoreAvailable -eq $true)
}
#-----------------------------------------------------
$ExchangeVersion = [Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP2
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService($ExchangeVersion)

$service.UseDefaultCredentials = $true
$MailboxName = "mymailbox@contoso.com"
$service.AutodiscoverUrl($MailboxName)

#Bind to the Root of the mailbox so I can search the folder namespace for the target
$MsgRootId = [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot
$MsgRoot = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service,$MsgRootId)
$targetFolder = GetTargetFolder '\Inbox\Alert Message\Current'

$itemList = GetItems $targetFolder
$itemList | Group-Object Sender -NoElement | Sort-Object Count -Descending | Format-Table -AutoSize
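
Since I track these counts month over month, a variation of that last pipeline can write the results to a dated CSV for later trend analysis (the file name here is just an example):

```powershell
# Save this month's per-sender counts for trend analysis.
$itemList |
    Group-Object Sender -NoElement |
    Sort-Object Count -Descending |
    Export-Csv "AlertCounts-$(Get-Date -Format 'yyyy-MM').csv" -NoTypeInformation
```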

PowerShell error: NativeCommandFailed after installing KB3000850 — fixed! See KB3062960

Recently we went through the exercise of updating our Windows Server 2012 R2 deployment image to include a batch of updates. Afterward, we noticed that several PowerShell scripts were failing. I also noticed that some commands stopped working properly, including PowerShell’s own Get-Help cmdlet.

PS C:\temp> help
Program 'more.com' failed to run: Object reference not set to an instance of an object.
At line:14 char:14
+     $input | more.com
+              ~~~~~~~~
At line:14 char:5
+     $input | more.com
+     ~~~~~~~~~~~~~~~~~
+ CategoryInfo          : ResourceUnavailable: (:) [], ApplicationFailedException
+ FullyQualifiedErrorId : NativeCommandFailed

In fact, you will get the same error any time you attempt to use a native command (EXE, etc.) in the pipeline. The problem occurs after KB3000850 is installed, but it is related to having PowerShell Module Logging and Process Creation auditing enabled. See the following forum post for another reported example: https://community.idera.com/database-tools/powershell/ask_the_experts/f/learn_powershell_from_don_jones-24/19217/strange-program-more-com-failed-to-run-object-reference-not-set-to-an-instance-of-an-object#pi565=2

Here are the relevant policy settings in our environment.

Computer Settings
  Administrative Templates
    Windows Components/Windows PowerShell
      Turn on Module Logging = Enabled
      Module Names = *
      Turn on Script Execution = Enabled
      Execution Policy = Allow local scripts and remote signed scripts

The forum post on powershell.com seems to indicate that you must also have “Include command line in process creation events” enabled, but that is not the case in our environment. You can work around the problem by disabling module logging for all modules for the duration of the PowerShell session or script.

Get-Module | ForEach-Object { $_.LogPipelineExecutionDetails = $false }

This is not a viable solution for the existing scripts and automation that we use. Besides, process creation auditing and module logging are enabled by policy for security purposes. Removing KB3000850 makes the problem go away, but the rollup contains many fixes that we want in our environment. I also noticed that the file list attached to the KB article includes 60 files related to PowerShell, yet none of the articles listed in the rollup mention those files, so the PowerShell changes included in this rollup are undocumented. I have opened a case with Microsoft and will post updates as I learn more.
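
If you are not sure whether a given machine has the rollup installed, a quick way to check:

```powershell
# Returns the hotfix record if KB3000850 is installed; nothing otherwise.
Get-HotFix -Id KB3000850 -ErrorAction SilentlyContinue
```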

Update 26-March-2015
We did some more testing and ruled out the Process Creation auditing. Having that auditing enabled or disabled does not affect the problem. You only need Module Logging enabled for PowerShell. Microsoft is still investigating the issue.

Update 22-April-2015
Microsoft has finally been able to isolate the problem and has declared this to be a bug. Now we are waiting for them to decide how and when to fix the problem.

Update 24-April-2015
Found another side effect of this update. With the update installed, you may experience errors adding Roles and Features to your server. I was trying to add the Desktop Experience feature to my server, and it failed with the message: “Object reference not set to an instance of an object.” If you look in the event log you will find the following event:

Log Name:      Microsoft-Windows-ServerManager-MultiMachine/Operational
Source:        Microsoft-Windows-ServerManager-MultiMachine
Date:          4/24/2015 11:08:43 AM
Event ID:      2014
Task Category: Server manager startup task.
Level:         Information
Keywords:
User:          CONTOSO\adminuser
Computer:      servername.contoso.net
Description:
Deployment Wizard commit action completed.  Target Server: localhost. Job: ID:d80f627f-4fee-4d87-bbe4-7858d64a2265;Feature installation. Status: Failed. Reason Object reference not set to an instance of an object.

So it seems that installing that Feature requires the use of a Native Command and it is failing for the same reason. Who knows how many other Roles and Features may be affected like this. Removing KB3000850 allowed me to add the Desktop Experience feature.

Update 1-May-2015
Good news! We just received a patch to test from Microsoft and have smoke-tested it on several machines. This patch resolves the issue. So now we just need Microsoft to test and package it for public consumption.

Update 10-July-2015
The fix is finally being released as KB3062960. It should be available on the 14th of July (Patch Tuesday).

Update 14-July-2015
The hotfix is now published: https://support.microsoft.com/en-us/kb/3062960
You must download and deploy this hotfix manually; it is not distributed through Windows Update.


Understanding Byte Arrays in PowerShell

In a previous article, I presented a PowerShell script for inspecting and validating certificates stored as PFX files. My goal is to get the data into an X509Certificate2 object so that I can validate the certificate properties. The X509Certificate2 Import() method comes in two sets of overloads: one set takes the filename of the certificate file to be imported, and the other takes a Byte array containing the certificate data.

In this script, I can import PFX certificate files by downloading a byte stream from a web server or by reading a file stored on disk. I want to avoid creating temporary files, and I want a generic import function that can be used independently of the data retrieval method. I settled on an array of bytes as the import format for both scenarios.

To import a PFX file from disk I use the Get-Content cmdlet. Let’s take a closer look at how Get-Content works and what it returns.

PS C:\temp> $pfxbytes = Get-Content .\DEV113.pfx
PS C:\temp> $pfxbytes.GetType().Name
Object[]
PS C:\temp> $pfxbytes[0].GetType().Name
String
PS C:\temp> $pfxbytes[0].length
18
PS C:\temp> $pfxbytes.length
70
PS C:\temp> $pfxbytes | ForEach-Object { $count += $_.length }
PS C:\temp> $count
7489
PS C:\temp> Get-ChildItem .\DEV113.pfx

Directory: C:\temp

Mode              LastWriteTime     Length Name
----              -------------     ------ ----
-a---       9/30/2014   1:03 PM       7558 DEV113.pfx

By default, we see that Get-Content returns an array of String objects. There are two problems with this for my use case.

  1. If you add up the lengths of all 70 strings, you get a total of 7489 characters. But the file size is 7558 bytes, so the counts do not match. The data in a PFX file is not string-oriented; it is binary data.
  2. I need a Byte array to import the data into an X509Certificate2 object.

Fortunately, using the -Encoding parameter you can specify that you want Byte-encoded data returned instead of strings.

PS C:\temp> $pfxbytes = Get-Content .\DEV113.pfx -Encoding Byte
PS C:\temp> $pfxbytes.GetType().Name
Object[]
PS C:\temp> $pfxbytes[0].GetType().Name
Byte
PS C:\temp> $pfxbytes[0].length
1
PS C:\temp> $pfxbytes.length
7558

Notice that Get-Content still returns an array of objects but those objects are Bytes. The total length of $pfxbytes now matches the size on disk.
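
As an aside, .NET offers a shortcut that skips Get-Content entirely: [System.IO.File]::ReadAllBytes() returns a true Byte array directly and is considerably faster for large binary files. (Note also that in PowerShell 6 and later, -Encoding Byte was replaced by the -AsByteStream switch.)

```powershell
# ReadAllBytes returns a Byte[] in one call; no -Encoding parameter needed.
$pfxbytes = [System.IO.File]::ReadAllBytes('C:\temp\DEV113.pfx')
$pfxbytes.GetType().Name   # Byte[]
```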

To download the PFX file from the web server I use the System.Net.WebClient class. WebClient has three main ways of downloading content from a web server:

  • The DownloadString methods are useful when you are only expecting to receive text data (e.g. HTML, XML, or JSON). Since the PFX file format is binary, not text, this will not work as I have already shown above with Get-Content.
  • The DownloadFile methods would work except that I don’t want to have to save the file to disk as required by these methods.
  • The DownloadData methods return a byte array containing the data requested. This is the method that best meets our needs.
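
Putting the pieces together, here is a sketch of the download path; the URL, password, and key-storage flag are placeholders for illustration:

```powershell
# Download the PFX data straight into a byte array -- no temp file needed.
$webClient = New-Object System.Net.WebClient
$pfxbytes  = $webClient.DownloadData('https://certserver.contoso.com/certs/DEV113.pfx')

# The byte[] overload of Import() accepts the data directly.
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
$cert.Import($pfxbytes, 'PfxPassword', 'DefaultKeySet')
```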

But what is a Byte array? How is a Byte array different from a string?
A byte array can contain arbitrary binary data; the data does not have to be character data. Character data is subject to interpretation because it implies an encoding, and there is more than one way to encode a character. Take the following example:

PS C:\temp> $string = 'Hello World'
PS C:\temp> $string.length
11
PS C:\temp> $bytes = [System.Text.Encoding]::Unicode.GetBytes($string)
PS C:\temp> $bytes.length
22

As you can see, the length of $string is 11 characters, but converting it to a byte[] yields 22 bytes of data because the Unicode (UTF-16) encoding uses two bytes per character. It is also important to know the format of the source data when you are converting between encoding schemes. For example:

PS C:\temp> $array = @(72,101,108,108,111,32,87,111,114,108,100)
PS C:\temp> $string = [System.Text.Encoding]::UTF8.GetString($array)
PS C:\temp> $string.length
11
PS C:\temp> $string
Hello World

You can convert the byte[] $array to a UTF8-encoded string because, for these values, each byte represents one character. However, if you try to convert that same array using the Unicode encoding, it will treat each pair of bytes as a single character.

PS C:\temp> $string = [System.Text.Encoding]::Unicode.GetString($array)
PS C:\temp> $string.length
6
PS C:\temp> $string
??????

The result is an unreadable value stored in $string.


Simple ARAPI.NET Query Example in C#

Now that you have ARAPI.NET installed let’s take a look at an example using C#. There are a couple of tricks required to get your C# project to compile and run properly.

  • Be sure the ARAPI.NET library is installed correctly and that the directory path has been added to your System PATH environment variable.
  • Set the “Target Platform” to x86 in the Build properties of your Visual Studio project.
  • Add references for both BMC.ARSystem.dll and BMC.arnettoc.dll to your project.
  • Include the “using BMC.ARSystem;” statement in your code.

In this example I am creating a simple query to find all the users in Remedy that are configured as “Support Staff” and output that list in CSV format. I have used this to perform a periodic audit of our users and remove those who have changed job roles and no longer need “Support Staff” access.

using System;
using System.Collections.Generic;
using BMC.ARSystem;

namespace SupportStaff
{
  class Program
  {
    static void Main(string[] args)
    {
      string arServerName = "arserver.contoso.com";
      string arUsername = "Demo";
      string arUserPassword = "Password";
      string arAuthentication = "";
      Int32 arServerPort = 51100;

      // Log into the AR Server
      Server arServer = new Server();
      arServer.Login(arServerName, arUsername, arUserPassword, arAuthentication, arServerPort);

      // Create list of fields we want to return from the Form
      EntryListFieldList peopleEntryFieldList = new EntryListFieldList();
      peopleEntryFieldList.AddField(1, 50, ""); // Person ID
      peopleEntryFieldList.AddField(4, 50, ""); // Remedy ID
      peopleEntryFieldList.AddField(1000000025, 50, ""); // Support Staff
      peopleEntryFieldList.AddField(1000006694, 50, ""); // Auth Alias
      peopleEntryFieldList.AddField(1764007102, 50, ""); // Network ID

      // Create the Qualifier string used to query the Form
      string qualifier = @"'1000000025' = 0"; // Support Staff value 0 = 'Yes'

      // Execute the query
      EntryFieldValueList entryListWithFields = arServer.GetListEntryWithFields("CTM:People", qualifier, peopleEntryFieldList, 0, 0);

      //Process the results
      if (entryListWithFields.Count != 0)
      {
        Console.WriteLine("\"PPL\",\"RemedyID\",\"AuthAlias\",\"NetworkID\"");
        foreach (var entry in entryListWithFields)
        {
          Console.WriteLine("\"{0}\",\"{1}\",\"{2}\",\"{3}\"",
          entry.FieldValues[1],
          entry.FieldValues[4],
          entry.FieldValues[1000006694],
          entry.FieldValues[1764007102]);
        }
      }
      arServer.Logout();
    }
  }
}