Sunday, September 20, 2020

Azure: Determining the Parameters to Change in Azure Resource Manager (ARM) Templates

Overview

In the post "Azure: Azure Resource Manager (ARM) templates for creating Virtual Machines for Standard Windows SKUs" it was shown how to create an Azure Resource Manager (ARM) template that can be used to create a virtual machine, and how to generate the template's parameters file. There are dozens of parameters, so this post demonstrates how to determine which parameters need to be modified.

Reading the parameters file, it can be seen that one parameter, adminPassword, is assigned null because it is a password. The adminPassword parameter's value was excluded when the parameters file was created, as shown below:


The name of the VM specified when the ARM template was created was Machine04. The parameters tied to this machine name are:

  • networkInterfaceName
  • networkSecurityGroupName
  • publicIpAddressName
  • virtualMachineName
  • virtualMachineComputerName

A simple way to determine the parameters requiring modification is to:

  1. Create a copy of the parameters file
  2. In the copy of the parameters file change the text value of Machine04 to Machine123
  3. Perform a diff on the original parameters file and the modified copy of the parameters file
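The three steps above can be sketched in PowerShell; the parameters file content below is a minimal stand-in for the generated file, and Compare-Object stands in for a visual diff tool:

```powershell
# A minimal stand-in for the generated parameters file
Set-Content 'Parameters.json' `
    '{"virtualMachineName":{"value":"Machine04"},"location":{"value":"eastus"}}'

# Step 1: create a copy of the parameters file
Copy-Item 'Parameters.json' 'ParametersCopy.json'

# Step 2: in the copy, change the text value Machine04 to Machine123
(Get-Content 'ParametersCopy.json' -Raw) -replace 'Machine04', 'Machine123' |
    Set-Content 'ParametersCopy.json'

# Step 3: diff the original against the modified copy; only lines tied
# to the machine name surface as differences
Compare-Object (Get-Content 'Parameters.json') (Get-Content 'ParametersCopy.json')
```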

The remainder of this post is a list of tools that are useful in determining the parameters that need to be updated when deploying an ARM template.

Visual Studio Code: File Compare

Quickdiff.net

A handy online way to diff two text files is to use https://quickdiff.net/. I like this site because it offers diff options defined as follows:


QuickDiff.net shows the differences as follows:


Jsondiff.com

The site, http://www.jsondiff.com/, allows two JSON objects (the parameters files are just JSON objects) to be compared. The site identifies how many differences there are between the JSON objects:

The site, jsondiff.com, also allows navigation between all differences detected:

Saturday, September 19, 2020

Visual Studio Code: Comparing Text Files

Visual Studio Code has built-in, albeit somewhat hidden, file comparison. To diff two files, from Explorer right click on the first file to compare and choose Select for Compare from the context menu:


In the screen snippet above the ParametersW10.json file was clicked on initially. To choose the second file to diff, from Explorer right click on a file and choose Compare with Selected:


In the screen snippet above the 20200919072819442ParametersW10.json file was selected to be compared to ParametersW10.json. 

The difference between the two files is shown as follows:


The upper right corner of the diff provides some handy tools for managing the file compare:


The options are defined as follows:

  • Up arrow: navigate to the previous difference
  • Down arrow: navigate to the next difference
  • Backwards P: show leading/trailing whitespace differences






Sunday, September 6, 2020

Azure/PowerShell: Finding and Removing Orphaned Network Interfaces

When an Azure virtual machine is deleted via the portal (https://portal.azure.com), any network interfaces associated with the VM are not deleted. This can lead to the pool of IP addresses associated with a subnet being exhausted, at which point no new VMs can be created because there are no IP addresses left to assign to them.

The Get-AzNetworkInterface cmdlet returns all network interfaces for an Azure subscription and the Remove-AzNetworkInterface cmdlet removes a specific network interface. The following code uses Get-AzNetworkInterface in conjunction with Where-Object to get all orphaned network interfaces. Each network interface is represented by an instance of type PSNetworkInterface.

[string] $subID = 'put subscription ID here'

# Set-AzContext is the Az module replacement for the retired
# AzureRM Select-AzureRmSubscription cmdlet
Set-AzContext `
  -SubscriptionId $subID | Out-Null

[Microsoft.Azure.Commands.Network.Models.PSNetworkInterface[]] `
  $nics =
    Get-AzNetworkInterface |
    Where-Object {
      ($_.VirtualMachine -eq $null) -And
      (($_.PrivateEndpointText -eq $null) -Or
       ($_.PrivateEndpointText -eq 'null'))}

foreach ($nic in $nics)
{
  Write-Host `
    "Removing Orphaned NIC $($nic.Name) $($nic.ResourceGroupName)"
  Remove-AzNetworkInterface `
    -Name $nic.Name `
    -ResourceGroupName $nic.ResourceGroupName `
    -Force
}

The Get-AzNetworkInterface | Where-Object code returns only network interfaces:

  • Not associated with virtual machines
  • Not associated with private endpoints 

This script snippet detects if a network interface is not associated with a virtual machine:

($_.VirtualMachine -eq $null)

This script snippet detects if a network interface is not associated with a private endpoint:

     (($_.PrivateEndpointText -eq $null) -Or
      ($_.PrivateEndpointText -eq 'null'))

Not being associated with a virtual machine appears to identify a network interface as orphaned, having formerly been assigned to a now-deleted VM. However, some network interfaces were never associated with a virtual machine, such as a network interface associated with a private endpoint. This is why there is an additional check to ensure the PSNetworkInterface.PrivateEndpointText property is not assigned. Private endpoints are ancillary to detecting/removing orphaned network interfaces. More information on private endpoints can be found at What is Azure Private Endpoint?.
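As a cautious variant of the script above (same filtering logic; the Az.Network module and an authenticated session are assumed), the removal can first be dry-run with the -WhatIf switch so nothing is actually deleted:

```powershell
# Requires the Az.Network module and an authenticated Azure session
[Microsoft.Azure.Commands.Network.Models.PSNetworkInterface[]] $nics =
    Get-AzNetworkInterface |
    Where-Object {
        ($_.VirtualMachine -eq $null) -And
        (($_.PrivateEndpointText -eq $null) -Or
         ($_.PrivateEndpointText -eq 'null'))}

# Review the candidates before removing anything
$nics | Select-Object Name, ResourceGroupName

foreach ($nic in $nics)
{
    # -WhatIf reports what would be removed without removing it;
    # replace -WhatIf with -Force once the list has been verified
    Remove-AzNetworkInterface `
        -Name $nic.Name `
        -ResourceGroupName $nic.ResourceGroupName `
        -WhatIf
}
```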


Saturday, September 5, 2020

PowerShell: Expanding Object Properties in Strings using Subexpression Operator

Expanding variables inside a double-quoted string is useful, as shown below, but this approach works with variables, not with object properties:

[string] $hostName = 'Executive Officer Kane'

Write-Output "Host name: $hostName"

The output from the above script snippet is as follows:


The script below expands object properties using the subexpression operator in a string that is passed to the Write-Host cmdlet (see the last line of code, Write-Host, in the script below):


The subexpression operator, $(), means that the code in the parentheses is invoked first; hence the property is evaluated and then expanded in the string:

"Orphaned $($nic.Name) $($nic.resourcegroupname)"
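The behavior can be seen with a self-contained example; the object below is a hypothetical stand-in for the PSNetworkInterface instances used earlier:

```powershell
# A stand-in object with the same property names used above
$nic = [PSCustomObject]@{
    Name              = 'machine04293'
    ResourceGroupName = 'SP2019'
}

# Without the subexpression operator only $nic itself is expanded;
# the trailing .Name is left in the string as literal text
Write-Host "Orphaned $nic.Name"

# With $() the property access is evaluated first, then expanded
Write-Host "Orphaned $($nic.Name) $($nic.ResourceGroupName)"
# Orphaned machine04293 SP2019
```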

The subexpression operator is documented in About Operators as follows:


Double-quoted strings are useful for variable and object/property expansion, but recall the previous post PowerShell: Use Single Quotes Where Possible.



Saturday, August 29, 2020

Azure: Azure Resource Manager (ARM) templates for creating Virtual Machines for Standard Windows SKUs

Creating Virtual Machines (VMs) with the Azure portal is convenient, but it can become tedious and error-prone if numerous VMs are needed. The New-AzVM PowerShell cmdlet can automate the creation of virtual machines. For rudimentary VMs, New-AzVM is straightforward to use from PowerShell. When this cmdlet is used to create virtual machines with complex restrictions that utilize the numerous parameter combinations, coding with New-AzVM can be a daunting task (see "Azure/PowerShell: Cmdlet Parameter and Result Complexity (Get-AzHost, Get-AzVM, New-AzVM)" for an overview of the parameter sets associated with New-AzVM and an example of the elaborate parameters that can be passed to it).

One way to simplify the programmatic creation of a virtual machine is to use Azure Resource Manager (ARM) templates. Using PowerShell in conjunction with the New-AzVM cmdlet or ARM templates is an example of Infrastructure as Code (IaC). Instead of using the portal and manual object creation, the code used to create the infrastructure (Azure objects) can be checked into a source code control repository such as Git and treated like a first-class Git citizen.

ARM templates are JSON files that describe how Azure entities are to be created. ARM templates can be created from the Azure portal: instead of creating an object such as a VM, the configuration specified in the portal can be saved as a template. A saved template does not magically create infrastructure objects; the template and its corresponding parameters are passed as command-line options to the New-AzResourceGroupDeployment cmdlet. The templates and their parameters can also be modified programmatically, for example to create ten virtual machines named VM00 to VM09, which is simply a matter of using PowerShell to update the JSON attribute holding the virtual machine's name.
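The VM00-to-VM09 scenario can be sketched as follows; the parameters file content is a hypothetical minimal stand-in, since a real parameters file carries many more attributes:

```powershell
# Hypothetical minimal parameters file content with a name placeholder
[string] $parameterTemplate =
    '{"parameters":{"virtualMachineName":{"value":"VMNAME"}}}'

0..9 | ForEach-Object {
    [string] $vmName = 'VM{0:D2}' -f $_   # VM00, VM01, ..., VM09
    $parameterTemplate -replace 'VMNAME', $vmName |
        Set-Content "Parameters$vmName.json"
}
# Each ParametersVM00.json .. ParametersVM09.json could now be passed
# to New-AzResourceGroupDeployment in turn
```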

To demonstrate ARM templates, the steps to create a virtual machine with the Azure portal will be shown, but instead of creating a VM, a template and parameters file will be generated for later use by New-AzResourceGroupDeployment.

The standard Azure portal appears as follows with a familiar Virtual machines icon which, when clicked, displays the virtual machine blade:


The virtual machine blade (screen) has an option to add a virtual machine which to no surprise is labeled by the word "Add":


Clicking on the Add option displays the "Create a virtual machine" page:


The desired parameters for a virtual machine can be filled in using the "Create a virtual machine" page. For this scenario a VM is created that uses the image Windows Server 2012 R2 Datacenter - Gen 1. The image name "Windows Server 2012 R2 Datacenter - Gen 1" is a particular Windows SKU available to newly created Azure VMs. Once all the parameters have been specified for the VM to be created, the "Review + create" button can be clicked:

After "Review + create" is clicked, a set of validations is performed on the virtual machine configuration specified. An example validation might be that a virtual machine in a given Resource Group is not permitted to expose a public IP address. Not permitting a public IP address is standard operating procedure in environments that use VPNs to access VMs via private IP addresses. Another validation is the enforcement of password complexity for the default administrator account created.

The following screen is displayed after "Review + create" is clicked and the specified configuration has passed validation:



The command actions on the page above include a link (lower right) labeled as "Download a template for automation" (see below):


Clicking "Download a template for automation" link displays the following:


The Template tab allows the template to be selected in the right pane of the screen (see above). Notice above the options to Download, Add to library (preview), and Deploy are provided. Clicking on the Parameters tab displays the parameters associated with the template (see below):


Clicking on the Scripts tab displays the following:


Clicking on "Start" on the PowerShell tile displays the following (Manage Azure resources by using Azure PowerShell):


The New-AzResourceGroupDeployment cmdlet takes as command-line options a template file and a parameters file and deploys the objects specified by the template.
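A representative invocation might look like the following; the resource group and file names are hypothetical, and a value excluded from the generated parameters file (such as adminPassword) can be supplied at deployment time, since template parameters surface as dynamic parameters of the cmdlet:

```powershell
# Hypothetical names; requires the Az.Resources module and an
# authenticated Azure session
New-AzResourceGroupDeployment `
    -ResourceGroupName 'SP2019' `
    -TemplateFile 'Template.json' `
    -TemplateParameterFile 'Parameters.json' `
    -adminPassword (ConvertTo-SecureString 'put password here' -AsPlainText -Force)
```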

Friday, August 28, 2020

PowerShell: Two Interview Questions

During a recent interview I was asked the following questions.

How do you remove duplicates from an array?

My answer was to perform a Select-Object on the array, as there must be a command-line option to return only unique values. The Select-Object cmdlet behaves like C#'s LINQ, and removing duplicates is something I would do in LINQ.

The answer is Select-Object -Unique:

[int[]] $arrayWithDuplicates = 1, 2, 3, 3, 3, 2, 1, 1, 3, 1, 2, 3

[int[]] $arrayWithUniqueValues = 
    $arrayWithDuplicates | Select-Object -Unique

$arrayWithUniqueValues -join ' '

The output of the above script is:
1 2 3

How do you read the first five lines of a file?

My answer was to perform a Get-Content on the file, as there must be a command-line option to return only a finite number of lines from the file.

The answer is Get-Content <filename> -TotalCount 5 where an example is as follows:

Get-Content 'InterviewQuestions.ps1' -TotalCount 5


Tuesday, August 25, 2020

Azure/PowerShell: Cmdlet Parameter and Result Complexity (Get-AzHost, Get-AzVM, New-AzVM)

Overview

The concept of cmdlet parameter sets was discussed in "PowerShell: Cmdlet Parameter Sets". The motivation for introducing this topic was to show the complexity of certain PowerShell cmdlets and, in future posts, how to escape this complexity (foreshadow: Azure Resource Manager templates). A further complexity is that a cmdlet can have different return values and can return single values or arrays. Azure cmdlets can take dozens of parameters and contain multiple parameter sets. Making cmdlets more complicated is that certain cmdlets use parameters that are created by invoking several different cmdlets, each of which extends the values associated with the parameter. For example, New-AzVM can create an Azure VM using a parameter of type PSVirtualMachine. That PSVirtualMachine parameter could have been set up by invoking multiple cmdlets (New-AzVMConfig, Set-AzVMOperatingSystem, Add-AzVMNetworkInterface, Set-AzVMOSDisk). Debugging an invalid parameter passed to a cmdlet such as New-AzVM can be a nightmare.

In order to better understand cmdlet parameter sets and their complexity, consider the documentation of several Azure-related cmdlets:

  • Get-AzHost: retrieves a host or a list of all hosts if no host name is specified
  • Get-AzVM: retrieves the properties of an Azure VM
  • New-AzVM: creates an Azure VM

The documentation for each of these cmdlets is broken into sections where each shaded rectangle corresponds to a different parameter set. I am color blind, so I have no idea if the shaded rectangles contain a color or are simply shades of gray.

PowerShell cmdlets such as those referenced above return inconsistent types. For example, the Get-AzVM cmdlet can return either of the following types depending on the parameters specified:

  • PSVirtualMachine
  • PSVirtualMachineInstanceView

Even when a PowerShell cmdlet is documented to return a specific type or types this may not be the case. Cmdlets can return a single value or an array of values. 

Get-AzHost

The documentation for Get-AzHost shows two parameter sets each identified by shaded rectangles:
The output for each parameter set appears to be consistent as the return value is documented as follows:
Microsoft.Azure.Commands.Compute.Automation.Models.PSHost

If Get-AzHost returns one value, the return type is PSHost. If Get-AzHost returns multiple values, the return value is of type array.

The first parameter set contains the following unique parameters:

  • ResourceGroupName
  • HostGroupName

The second parameter set contains a unique parameter:

  • ResourceId

The first parameter set also contains a -Name parameter which, when excluded, retrieves a list of all hosts.
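Hedged sketches of the two parameter sets (the resource names and subscription ID are hypothetical):

```powershell
# First parameter set: omitting -Name returns all hosts in the host group
Get-AzHost -ResourceGroupName 'SP2019' -HostGroupName 'HostGroup01'

# First parameter set with -Name: returns a single PSHost
Get-AzHost -ResourceGroupName 'SP2019' -HostGroupName 'HostGroup01' -Name 'Host01'

# Second parameter set: the host is identified by its resource ID alone
Get-AzHost -ResourceId ('/subscriptions/<subscription id>/resourceGroups/SP2019/' +
    'providers/Microsoft.Compute/hostGroups/HostGroup01/hosts/Host01')
```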

    Get-AzVM

    The documentation for the Get-AzVM cmdlet identifies four parameter sets as follows:

    As mentioned above, Get-AzVM will return different types depending on the parameters specified to the cmdlet:

    • PSVirtualMachine
    • PSVirtualMachineInstanceView

    The return values can be a single value or an array of values.

    The first parameter set identifies all parameters as optional, which means no parameter in the set is required. When passed no parameters, Get-AzVM (the first parameter set) returns all virtual machines for a subscription. The second parameter set has the positional parameters ResourceGroupName and Name (or their named counterparts) as required; DisplayHint is also a parameter unique to the second parameter set. The third parameter set has a unique parameter, Location. The fourth parameter set has a lone unique parameter, NextLink.
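Hedged sketches of the parameter sets described above (resource names are hypothetical):

```powershell
# First parameter set: no parameters returns all VMs in the subscription
Get-AzVM

# Second parameter set: ResourceGroupName and Name, named or positional
Get-AzVM -ResourceGroupName 'SP2019' -Name 'Machine04'

# Third parameter set: all VMs in a given location
Get-AzVM -Location 'eastus'

# Specifying -Status changes the return type from PSVirtualMachine to
# PSVirtualMachineInstanceView
Get-AzVM -ResourceGroupName 'SP2019' -Name 'Machine04' -Status
```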

    New-AzVM

    The New-AzVM cmdlet defines the following multiple parameter sets and takes over a dozen potential parameters, including complex types built up by invoking multiple setup cmdlets.

    The unique parameter for this New-AzVm parameter set is -Credential:


    The unique parameter for this New-AzVM parameter set is -VM:


    The data type passed to the -VM parameter is PSVirtualMachine. Setting up a PSVirtualMachine can involve invoking multiple cmdlets in order to assign additional settings to the PSVirtualMachine instance.

    The unique parameter for this New-AzVM parameter set is -DiskFile:



    New-AzVM and PSVirtualMachine Configuration

    The following example from New-AzVM demonstrates the level of complexity that can be associated with setting up a cmdlet's parameters. The script marked in boldface shows how many PowerShell cmdlets have to be invoked simply (or not so simply) to set up the New-AzVM's -VM parameter value:


    Monday, August 17, 2020

    PowerShell: Cmdlet Parameter Sets

    PowerShell cmdlet parameter sets are documented at Cmdlet parameter sets where they are defined as follows:

    Each parameter set must contain a unique parameter that allows the PowerShell runtime to identify which parameter set is being used. To demonstrate parameter sets, consider the Get-Depth function from the post PowerShell: Converting Json-Formatted Strings to/from Json-Objects and Json-Object Depth. The Get-Depth function has been modified as follows, Get-DepthEx, to take two parameters where each parameter belongs to a separate parameter set (see the attribute property, ParameterSetName, below):

    function Get-DepthEx()
    {
      param (
        [Parameter(
          Mandatory=$true,
          ParameterSetName='FormattedJsonString')]
          [string] $json,
        [Parameter(
          Mandatory=$true,
          ParameterSetName='JsonObject')]
          [PSCustomObject] $jsonObject
      )

    The code for Get-DepthEx is the original Get-Depth code except that it has been modified (see text demarcated by boldface) to work with a Json-formatted string parameter (parameter set name, FormattedJsonString) or with a Json-object (parameter set name, JsonObject):

    function Get-DepthEx()
    {
        param (
            [Parameter(
                Mandatory=$true,
                ParameterSetName='FormattedJsonString')]
            [string] $json,
            [Parameter(
                Mandatory=$true,
                ParameterSetName='JsonObject')]
            [PSCustomObject] $jsonObject
        )

        if (($null -ne $json) -and ($json.Length -gt 0))
        {
            # This step verifies that $json is a valid Json object
            $jsonObject = ConvertFrom-Json $json   
            if ($null -eq $jsonObject)
            {
                return 0
            }
        }

        if ($null -ne $jsonObject)
        {
            # As of PowerShell 7.0 the max -Depth is 100
            $json = ConvertTo-Json $jsonObject -Depth 100
        }
       

        [int] $maximumDepth = -1
        [int] $depth = 0
        [char[]] $startingBrackets = '[', '{'
        [char[]] $endingBrackets = @(']', '}')

        foreach ($c in $json.ToCharArray())
        {
            if ($c -in $startingBrackets)
            {
                ++$depth
                $maximumDepth = if ($maximumDepth -ge $depth)
                    { $maximumDepth } else { $depth }
            }

            elseif ($c -in $endingBrackets)
            {
                --$depth
            }
        }

        return $maximumDepth
    }

    The Get-DepthEx function is invoked as follows where each invocation uses a different parameter set, as is shown using the code from the previous post "PowerShell: Reading, Modifying, and Saving Json Files":

    [string] $sourceWindowsVmTemplateFilename =
                 'Template2019.json'
    [string] $destinationWindowsVmTemplateFilename =
                 'TemplateAnyWindowsVM.json'
    [string] $content =
                 Get-Content -Raw -Path $sourceWindowsVmTemplateFilename
    [PSCustomObject] $jsonObject =
        $content |
        ConvertFrom-Json

    [int] $depthParameterSetFormattedJsonString =
            Get-DepthEx -Json $content
    [int] $depthParameterSetJsonObject =
            Get-DepthEx -JsonObject $jsonObject

    There are many permutations to parameter sets. For example, it is possible for a parameter to be included in multiple parameter sets. As a feature of PowerShell, parameter sets are not part of the day-to-day coding performed by most developers. It is more likely that a developer will have to read the documentation associated with a cmdlet that supports multiple parameter sets. Cmdlets that contain multiple parameter sets include:

    • Add-AzVMNetworkInterface
    • Get-AzGallery
    • Get-AzHost
    • Get-AzHostGroup
    • New-AzVM
    • New-AzVMConfig
    • New-AzVMSqlServerAutoBackupConfig
    • Publish-AzVMDscConfiguration
    • Remove-AzGallery
    • Remove-AzHost
    • Remove-AzHostGroup
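Which parameter set the runtime selected can be observed via $PSCmdlet.ParameterSetName; the function below is a minimal sketch (not from any of the cmdlets above):

```powershell
function Get-ParameterSetDemo
{
    param (
        [Parameter(Mandatory=$true, ParameterSetName='ByName')]
        [string] $name,
        [Parameter(Mandatory=$true, ParameterSetName='ById')]
        [int] $id
    )

    # The runtime resolves the set from the unique parameter supplied
    return $PSCmdlet.ParameterSetName
}

Get-ParameterSetDemo -Name 'Machine04'   # ByName
Get-ParameterSetDemo -Id 42              # ById
```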

    Sunday, August 16, 2020

    PowerShell: Reading, Modifying, and Saving Json Files

    In the previous blog entry, PowerShell: Converting Json-Formatted Strings to/from Json-Objects and Json-Object Depth, it was shown how to correctly create Json-objects from Json-formatted strings and how to create Json-formatted strings from Json-objects by taking into account the depth of the underlying Json-object being manipulated. This post expands on the manipulation of Json-objects with PowerShell by demonstrating how to:

    • read a Json-formatted string from a file
    • convert the Json-formatted string to a Json-object
    • modify the Json-object
    • convert the Json-object to a Json-formatted string
    • save the Json-formatted string to a file

    The code used to perform the above tasks is as follows:

    [string] $sourceWindowsVmTemplateFilename = 'Template2019.json'
    [string] $destinationWindowsVmTemplateFilename =
        'TemplateAnyWindowsVM.json'
    [string] $content =
         Get-Content -Raw -Path $sourceWindowsVmTemplateFilename
    [int] $depth = Get-Depth $content
    [PSCustomObject] $jsonObject = $content | 
                                     ConvertFrom-Json

    # $windowsSku is assigned by the Get-WindowsSku function shown later in this post
    if ($jsonObject.resources.properties.
          storageProfile.imageReference.sku -ne $windowsSku)
    {
        $jsonObject.resources.properties.
            storageProfile.imageReference.sku = $windowsSku
        $jsonObject |
            ConvertTo-Json -Depth $depth |
            Set-Content $destinationWindowsVmTemplateFilename
    }

    The Json file is read by the Get-Content cmdlet where the -Raw command-line option causes the entire file to be read as a single string:

    [string] $content = 
      Get-Content -Raw -Path $sourceWindowsVmTemplateFilename

    A culled version of the Json identifying the Windows SKU is as follows:

    <#
      "resources": [
      {
        {
          "properties": {
            "storageProfile": {
              "imageReference": {
                 "sku": "2012-R2-Datacenter",
    #>

    Based on the pseudo-Json above the sku is accessible as follows:

    resources.properties.storageProfile.imageReference.sku


    The Json object's sku property is checked with an if statement and assigned as follows:

    if ($jsonObject.resources.properties.
          storageProfile.imageReference.sku -ne $windowsSku)
    {
      $jsonObject.resources.properties.
          storageProfile.imageReference.sku = $windowsSku

    The Json-object's depth is computed as follows by using the Get-Depth method and the conversion from Json-object to Json-string is performed using the ConvertTo-Json cmdlet and its -Depth parameter:

    [int] $depth = Get-Depth $content
    $jsonObject | ConvertTo-Json -Depth $depth

    The Json-formatted string is committed to a file using the Set-Content cmdlet:
        Set-Content $destinationWindowsVmTemplateFilename

    The Json data being manipulated is based on a real-world situation where Azure virtual machines were being created as hosts for specific versions of SharePoint on-prem. The Json-formatted string read from a file will be an Azure Resource Manager (ARM) template file used in the creation of a new virtual machine image for a standard Azure Windows SKU. Assigning the SKU will be the modification made to the Json-object, for which the supported SKUs are:
    • 2012-R2-Datacenter
    • 2016-Datacenter
    • 2019-Datacenter

    The Windows SKU returned is simply a string that depends on the version of SharePoint to be installed on the virtual machine (SharePoint 2013, SharePoint 2016 or SharePoint 2019). The function used to return the appropriate Windows SKU is Get-WindowsSku:

    function Get-WindowsSku()
    {
      param (
        [ValidateSet(2013, 2016, 2019)]
        [Parameter(Mandatory)]
        [int] $sharePointVersionYear
      )

      [string] $sku = ''  # assignment added; a bare [string] $sku statement would emit an empty string to the output stream

      switch($sharePointVersionYear){
        2013 { $sku = '2012-R2-Datacenter'}
        2016 { $sku = '2016-Datacenter'}
        2019 { $sku = '2019-Datacenter'}
      }

      return $sku
    }





    Saturday, August 15, 2020

    PowerShell: Converting Json-Formatted Strings to/from Json-Objects and Json-Object Depth

    PowerShell has extremely versatile cmdlets for converting Json-formatted strings to Json-objects (ConvertFrom-Json) and converting Json-objects to Json-formatted strings (ConvertTo-Json). A real-world case where such cmdlets come in handy is when working with Azure Resource Manager (ARM) templates (JSON-formatted files) in order to create virtual machines from standard SKUs and from images. The ConvertTo-Json cmdlet has a default behavior that needs to be accounted for when using it in real-world applications. Too many online blogs fail to demonstrate that, by default, ConvertTo-Json only works on JSON objects with a maximum depth of two. This blog post shows the correct use of ConvertTo-Json.

    The PowerShell cmdlet that converts a Json-formatted string to a Json object or, when the -AsHashtable command-line option is specified, to a HashTable, is ConvertFrom-Json:

    The PowerShell cmdlet that converts an object to a Json-formatted string is ConvertTo-Json:

    The ConvertFrom-Json cmdlet has a -Depth command-line option that defaults to a value of 1024, while the ConvertTo-Json cmdlet has a -Depth command-line option that defaults to a value of two. The depth of a Json object is defined as the maximum nesting level of its properties (starting with a { bracket and ending with a } bracket) and arrays (starting with a [ bracket and ending with a ] bracket).

    Consider a case where a Json-formatted string with a depth of nine is converted to a Json-object. There would be no need to set the -Depth parameter when invoking ConvertFrom-Json as the default depth is 1024. If the Json-object is edited and then converted back to a string using the ConvertTo-Json cmdlet, there would be a problem unless the -Depth command-line option was set to at least nine (as computed by the Get-Depth function below). Since ConvertTo-Json's -Depth defaults to two, the resulting Json-formatted string would be malformed because the Json-object being converted has a depth of nine.
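The truncation can be demonstrated with a small nested object; the hashtable below is a stand-in for a real ARM template object:

```powershell
# A depth-four object: level1 { level2 { level3 { value } } }
$nested = @{ level1 = @{ level2 = @{ level3 = @{ value = 42 } } } }

# Default -Depth is two: everything below the second level is flattened
# to its ToString() form, e.g. "System.Collections.Hashtable"
[string] $truncated = $nested | ConvertTo-Json

# An adequate -Depth preserves the full structure, including "value": 42
[string] $intact = $nested | ConvertTo-Json -Depth 4
```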

    The Get-Depth function takes as a parameter a Json-formatted string and returns the Json's depth:

    function Get-Depth()
    {
      param (
        [Parameter(Mandatory=$true)]
        [string] $json
      )

      # This step verifies that $json is a valid Json object
      $jsonObject = ConvertFrom-Json $json
      if ($null -eq $jsonObject)
      {
        return 0
      }

      [int] $maximumDepth = -1
      [int] $depth = 0
      [char[]] $startingBrackets = '[', '{'
      [char[]] $endingBrackets = @(']', '}')

      foreach ($c in $json.ToCharArray())
      {
        if ($c -in $startingBrackets)
        {
          ++$depth
          $maximumDepth = if ($maximumDepth -ge $depth)
            { $maximumDepth } else { $depth }
        }

        elseif ($c -in $endingBrackets)
        {
          --$depth
        }
      }

      return $maximumDepth
    }

    The following snippet of PowerShell shows ConvertFrom-Json being used to convert a Json-formatted string to a Json-object. ConvertTo-Json is then used once with the -Depth command-line parameter assigned a value returned by the Get-Depth function and once without the -Depth command-line parameter:

    [string] $content = <getting content shown in future blog>
    [int] $jsonDepth = Get-Depth $content
    [PSCustomObject] $jsonObject = $content | ConvertFrom-Json
    [string] $updatedContent = $jsonObject |
      ConvertTo-Json -Depth $jsonDepth
    [string] $updatedContentDefaultDepth = $jsonObject | ConvertTo-Json

    Below is an image of the Json-formatted string created by ConvertTo-Json using the default value for -Depth, two: 


    Note that the properties hardwareProfile, storageProfile, networkProfile, osProfile, and diagnosticsProfile are unassigned as these properties exceed the default depth.

    Below is an image of the Json-formatted string created by ConvertTo-Json using the -Depth determined by the Get-Depth function; notice that hardwareProfile, storageProfile, networkProfile, osProfile, and diagnosticsProfile all have assigned values:



    Thursday, August 13, 2020

    Git: Reverting git commit

    Every developer making use of git from the command-line has mistakenly invoked "git commit". The "git commit" command commits staged changes to the local repository. In a previous post it was demonstrated how to rollback a git add (Git: Reverting git add).

    A typical git commit takes the form of a git add (with some command-line options) followed by a git commit -m:
    git add .
    git commit -m "some message related to the commit" 

    The following undoes the last commit and leaves the state of your files on disk unchanged (meaning your working tree is unchanged). The changes previously staged by the commit are reverted to unstaged:
    git reset HEAD~ 

    After invoking "git reset HEAD~" the files on disk can be modified as needed and the git add and git commit commands can be invoked, where "git commit -C ORIG_HEAD" reuses the previous commit message text verbatim (as in "some message related to the commit" from the example above):
    git add .
    git commit -C ORIG_HEAD

    To edit the original commit message use the following sequence of commands:
    git add .
    git commit -c ORIG_HEAD

    When "git commit -c ORIG_HEAD" is invoked an editor will display the log message from the previous commit and the message associated with the previous commit can be edited.

    Sunday, August 9, 2020

    Git: Reverting git add

    Every developer making use of git from the command-line has mistakenly invoked "git add". The "git add" command adds a change or changes from the current working tree to a staging area where the changes can later be committed to the repository.

    To roll back an add associated with a single file, invoke git reset specifying the file to be rolled back (un-staging the file):

    git reset <file> 

    To roll back all files added but not committed, invoke git reset without the <file> parameter specified:

    git reset




    Saturday, August 8, 2020

    Visual Studio: C# projects changed to target .NET Framework 4.7.2 won't build

    While upgrading an existing set of C# projects, the following error was encountered when trying to build:
    your project does not reference ".NETFramework,Version=v4.7.2" framework. Add a reference to ".NETFramework,Version=v4.7.2" in the "TargetFrameworks" property of your project file and then re-run NuGet restore

    The solution is to delete the obj and bin folders of each project which fails to build with the above message.

    Monday, July 20, 2020

    Azure/PowerShell: Virtual Machines created from Image, Adding Applications to the Taskbar

    Once an Azure virtual machine is created from an image, users can be added either as local users or by joining the machine to a domain. As part of user setup, it is convenient to add applications to the taskbar for a user. PowerShell is just the automation tool to handle this task.

    The Taskbar is part of Windows Shell (thinking 1990s COM). It is no surprise that the code to add an application to the Taskbar requires accessing a COM object:
            $shellApplication = New-Object -ComObject shell.application
            $taskbarApplication = $shellApplication.Namespace($directory).ParseName($candidate.Name)
            $taskbarApplication.invokeverb('taskbarpin')


    The PowerShell script below adds the following applications to the Taskbar:
    • Chrome
    • Edge
    • Notepad++
    • Visual Studio Code
    • Visual Studio
    • SQL Server Management Studio
    The PowerShell script is as follows:

    [string] $shellApplicationVerbTaskbarPin = 'taskbarpin'

    function Add-TaskbarApplication() {
        param (
            [Parameter(Mandatory = $true)]
            [string] $applicationExecutable,
            [Parameter(Mandatory = $false)]
            [string] $applicationCandidatePath = ${env:ProgramFiles(x86)}
        )

        $candidates = Get-ChildItem `
            -Path $applicationCandidatePath `
            -Filter $applicationExecutable `
            -Recurse `
            -ErrorAction SilentlyContinue `
            -Force
        if ($null -eq $candidates) {
            # Application not found
            return 
        }

        foreach ($candidate in $candidates) {
            [string] $directory = Split-Path -Path $candidate.FullName

            $shellApplication = New-Object -ComObject shell.application
            $taskbarApplication = $shellApplication.Namespace($directory).ParseName($candidate.Name)
            if ($null -eq $taskbarApplication) {
                continue
            }

            $taskbarApplication.invokeverb($shellApplicationVerbTaskbarPin)
            break # certain files like msedge.exe are in multiple locations so only one pinning
        }
    }

    Add-TaskbarApplication 'chrome.exe' 
    Add-TaskbarApplication 'msedge.exe'
    Add-TaskbarApplication 'notepad++.exe'
    # Visual Studio Code
    Add-TaskbarApplication 'code.exe' $env:LOCALAPPDATA
    # Visual Studio
    Add-TaskbarApplication 'devenv.exe'
    # SQL Server Management Studio
    Add-TaskbarApplication 'ssms.exe'



    Sunday, July 19, 2020

    Azure/PowerShell: Virtual Machines created from Images, Cleaning up cached SQL Server Host Names

    As part of my DevOps role, I have created a set of standard images to be used by our QA engineers when creating test virtual machines on Azure. During setup, the QA lead reported that it was impossible to log in to a SQL Server instance running on the virtual machine. The client being used to log in to the instance of SQL Server was SQL Server Management Studio (SSMS). I investigated and found that SSMS remembered the Server Name (host name) of the virtual machine from which the Azure image was created. The fix was simple: the QA lead needed to enter the current computer name of the virtual machine.

    This was a usability issue that merited fixing. To explain more clearly, assume the virtual machine used to create the Azure image was named SrcHost4Image. This source virtual machine was running SQL Server. If a new virtual machine named NewVM001 is created from the image, SSMS will show the original host name:




    The host, SrcHost4Image, no longer exists, so there is no way to log in with those SQL Server credentials.

    The file where SSMS stores its most recently used (MRU) server names should be updated. In earlier incarnations of SSMS, the MRU server names were stored in a binary file: mru.dat for SQL Server Management Studio 2005 and SqlStudio.bin for later versions. SQL Server Management Studio has since moved to an XML file, UserSettings.xml, to store the MRU server names. On most configurations of Windows this file is found under:

    C:\Users\%username%\AppData\Roaming\Microsoft\
        SQL Server Management Studio\18.0

    The XML to be modified from the UserSettings.xml configuration file was of the form:
    <#
    <ServerConnectionItem>
        <Instance>SrcHost4Image</Instance>
        <AuthenticationMethod>0</AuthenticationMethod>
        <Connections>
        <Element>
            <Time>
            <long>-637307108466504502</long>
            </Time>
            <Item>
            <ServerConnectionSettings>
                <Instance>SrcHost4Image</Instance>
                <UserName>SrcHost4Image\Jan Narkiewicz</UserName>
    #>

    The PowerShell modifying the two <Instance> XML elements and the <UserName> XML element is as follows:

    # $env:APPDATA = C:\Users\Jan Narkiewicz\AppData\Roaming
    # PowerShell 7 style Join-Path
    [string] $userSettingsPath = Join-Path `
        $env:APPDATA 'Microsoft\SQL Server Management Studio\18.0'
    [string] $userSettingsFilename = 'UserSettings.xml'
    [string] $userSettingsFullFilename = 
        Join-Path $userSettingsPath $userSettingsFilename

    function Get-FirstElement($elementCandidate)
    {
        if ($elementCandidate.Count -eq 0)
        {
            return $null
        }
        elseif ($elementCandidate.Count -eq 1)
        {
            # if one element is found then no array is created
            # (just an object)
            return $elementCandidate
        }
        elseif ($elementCandidate.Count -gt 1)
        {
            # if more than one element is found then an array is created
            # (select 0th element)
            return $elementCandidate[0]
        }
    }

    [xml]$settings = Get-Content $userSettingsFullFilename

    <#
    <ServerConnectionItem>
        <Instance>SrcHost4Image</Instance>
        <AuthenticationMethod>0</AuthenticationMethod>
        <Connections>
        <Element>
            <Time>
            <long>-637307108466504502</long>
            </Time>
            <Item>
            <ServerConnectionSettings>
                <Instance>SrcHost4Image</Instance>
                <UserName>SrcHost4Image\Jan Narkiewicz</UserName>
    #>

    $serverTypesElements = $settings.SqlStudio.SSMS.ConnectionOptions.ServerTypes.Element
    if ($serverTypesElements.Count -eq 0) {
        exit
    }

    [System.Xml.XmlElement] $serverTypeItemElement = Get-FirstElement $serverTypesElements[0].Value.ServerTypeItem.Servers.Element
    if ($null -eq $serverTypeItemElement)
    {
        exit
    }
        
    [System.Xml.XmlElement] $serverConnectionItem = $serverTypeItemElement.Item.ServerConnectionItem
     
    $serverConnectionItem.Instance = $env:computername
        
    [System.Xml.XmlElement] $serverConnectionSettings = $serverConnectionItem.Element.ServerConnectionSettings

    $serverConnectionSettings.Instance = $env:computername
    $serverConnectionSettings.UserName = $env:computername + '\'+ $env:UserName

    $settings.Save($userSettingsFullFilename)

    Appendix A: Locations of SSMS User Settings Files

    To force older versions of SSMS to forget cached server names, the user settings file (a binary file) should be deleted. This section lists the legacy configuration files used by SQL Server Management Studio:

    SQL Server Management Studio 2005: 
    C:\Users\%username%\AppData\Roaming\Microsoft\
        Microsoft SQL Server\90\Tools\Shell\mru.dat

    SQL Server Management Studio 2008: 
    C:\Users\%username%\AppData\Roaming\Microsoft\
        Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin

    SQL Server Management Studio 2012: 
    C:\Users\%username%\AppData\Roaming\Microsoft\
        SQL Server Management Studio\11.0\SqlStudio.bin

    SQL Server Management Studio 2014: 
    C:\Users\%username%\AppData\Roaming\Microsoft\
        SQL Server Management Studio\12.0\SqlStudio.bin

    SQL Server Management Studio 2016: 
    C:\Users\%username%\AppData\Roaming\Microsoft\
        SQL Server Management Studio\13.0\SqlStudio.bin

    SQL Server Management Studio 2017:
    C:\Users\%username%\AppData\Roaming\Microsoft\
        SQL Server Management Studio\14.0\SqlStudio.bin

    Sunday, July 12, 2020

    Creating Windows Shortcuts (links) with .NET Core and .NET Standard

    I was tasked with creating a shortcut in a C# application. As a technology goal, it was desired to use .NET Standard so the code could be more readily updated to .NET Core/.NET 5 at a future time.  There are numerous examples of creating and managing file shortcuts programmatically with C# and the .NET Framework. One such example is: Create shortcut programmatically in C#. All such examples use COM and the Windows Script Host Object Model to access the Windows Shell (the system that creates and manages shortcuts). COM interop works for the .NET Framework but does not work for .NET Core or .NET Standard.

    The shortcut was to be the same across every computer on which the software was installed, for example:
    • c:\bin\nuget.exe: executable to invoke when shortcut is clicked on
    • c:\SomeFolder\nuget.lnk: location of the shortcut that invokes c:\bin\nuget.exe
    The problem to be solved is not about creating a shortcut. That can be done manually on Windows (see Windows How to Create/Copy a File Shortcut using File Explorer). The problem is solved by copying the shortcut file, once created, to the new machine. To make life simpler, the shortcut file should be renamed so that what was written with the *.lnk extension can be managed as a generic binary file (extension *.bytes), preventing Windows from treating the file as a shortcut.

    Developers who need to create a custom shortcut (custom target, custom name, etc.) using .NET Standard or .NET Core should stop reading. That functionality is only available using .NET Framework and COM interop and this style of development is not presented as part of this blog.

    Recall in Windows How to Create/Copy a File Shortcut using File Explorer that a shortcut could be uploaded to Google Drive. As Google Drive is not Windows, there was no attempt to follow the shortcut to the target folder, so just the file associated with the shortcut was uploaded:


    Changing the extension from *.lnk to *.bytes on a Windows computer requires custom Windows Shell coding. A simpler approach is to rename the file from nuget.exe.lnk to nuget.exe.bytes while it is stored on Google Drive:


    The nuget.exe.bytes file can be downloaded from Google Drive to the Downloads folder and Windows will not interpret the file as a shortcut. 

    To demonstrate that a .NET Standard assembly can create a shortcut, a class library named ExampleDotNetStandardLibrary was created. The way that the shortcut will be included in this assembly is by using a resource file. A resource file is added to a C# project by right clicking on the project in Solution Explorer and selecting Add | New Item:


    Selecting Add | New Item displays the Add New Item dialog. Once the Add New Item dialog is displayed, the term "resource" can be typed in the textbox used to filter the file types displayed. This will reveal the Resource File file type as follows:


    Click Add, which adds a Resource File named Resource1.resx to the .NET Standard class library project. To rename the Resource File, Resource1.resx, right click on it and select Rename:


    The resource file can be renamed ShortcutManager.resx. Double clicking on the resource file, ShortcutManager.resx, displays a designer that includes a menu entitled Add Resource. Clicking on the down arrow of Add Resource allows the Add Existing File menu item to be selected:


    Invoking the Add Existing File option displays the Add existing file to resource dialog. Using this dialog it is possible to navigate to the Downloads folder where nuget.exe.bytes was downloaded from Google drive:


    Once nuget.exe.bytes is selected, click on the Open button to add the file to the resource file. The nuget.exe.bytes file will be stripped of its extension (*.bytes) and stored in the project's Resources folder as follows:


    Double clicking on ShortcutManager.Designer.cs displays the code generated by Visual Studio that allows the bytes associated with the nuget.exe resource to be accessed (remember, this is a shortcut, not the executable):


    The nuget_exe property can be invoked in order to access the bytes associated with the shortcut. This property has protection level internal so it can only be accessed by code internal to the Standard Library containing the resource file.

    The ShortcutManager class, like the nuget_exe property, is set to protection level internal so the class's methods and properties cannot be invoked from outside its .NET Standard class library. The following class, Shortcut, with protection level public demonstrates how to save the shortcut to a folder, thus showing .NET Standard can be used to create a shortcut:

    namespace ExampleDotNetStandardLibrary
    {
        using System.IO;

        public class Shortcut
        {
            public static void Save()
            {
                File.WriteAllBytes(
                    "C:\\SomeFolder\\nuget.exe.lnk", 
                     ShortcutManager.nuget_exe);
            }
        }
    }







    Saturday, July 11, 2020

    Windows How to Create/Copy a File Shortcut using File Explorer

    As with many blog posts on this site, I create a rudimentary background post for a more complex development or DevOps issue to be addressed in a future post. Today's topic is just such a rudimentary blog post and presents a basic Windows concept, namely how to create a shortcut.

    The file to which the shortcut will link is c:\bin\nuget.exe, which File Explorer shows as having a size of 6,424 KB:


    Shortcuts are not limited to executable files like nuget.exe. A shortcut or link can be created to any file type in Windows or shortcuts can point to folders.

    To create a shortcut using File Explorer, navigate to the folder in which the shortcut is to be created. Right click within the folder and select New | Shortcut from the context menu displayed by the right click action:


    When New | Shortcut is selected, the Create Shortcut wizard will be displayed as follows:



    Click on the Browse button to select the source file via the Browse for Files or folders dialog:


    The title of the dialog, Browse for Files or Folders, is a clear indication that shortcuts can reference files or folders. Using the dialog, navigate to the file that the shortcut will reference:



    Click on OK, which closes the Browse for Files or Folders dialog:


    Click on Next which allows the shortcut to be named:



    Click on Finish which shows the folder (C:\SomeFolder) containing the newly created shortcut:



    Recall that the actual file, c:\bin\nuget.exe, is 6,424 KB. The shortcut is 1 KB (actually less) and is demarcated by an icon indicating it is a link. Under the covers, a shortcut is just a file referencing a file stored in a different location.

    It is possible to copy a shortcut to a different folder either using File Explorer's context menu or using CTRL+C, CTRL+V. Below is an example of the shortcut, C:\SomeFolder\nuget.exe, copied to a different location:


    The shortcut versus the file referenced by the shortcut can even be copied to Google Drive:


    Notice the file size is 866 bytes, indicating the shortcut was copied and not the 6,424 KB original file. Obviously, the shortcut stored on Google Drive's cloud storage can't invoke c:\bin\nuget.exe.

    One reason to copy the shortcut to cloud storage is that the shortcut can be downloaded and used on a different computer. As long as the destination file (c:\bin\nuget.exe) resides in the same location on the other Windows machine, the downloaded shortcut will work there.