Jeroen Swart

.NET Architect

Download the ALM VMs without a download manager

If you don't know Brian Keller and his ALM VMs, you should really check them out. Each VM contains a completely installed environment for a specific (ALM-related) scenario and comes with a comprehensive set of hands-on labs and demos.

This post is not about the VMs themselves. If you want more information about them, go to the overview page and follow the links to the individual VMs.

If you've read the download instructions, you'll know that they tell you to use the 'Free Download Manager' and copy/paste a list of URLs to download all the parts. This works nicely, but the first thing I do after I'm done downloading is remove the download manager, because it intercepts every download, which I find rather annoying.

To download the VMs without the download manager, I first wrote the simple PowerShell script below. Save it as a .ps1 script file and put the URLs from the VM page in a urls.txt file in the same folder as the script; you may want to change the target directory in the script. Then run it. You can restart the script if needed, as it only downloads the files that haven't been downloaded yet. If a file was only partially downloaded, delete it before restarting the script.

# change this to the folder where the files should be downloaded
$targetDirectory = "D:\Downloads\VS13ALMVM"
mkdir $targetDirectory -ErrorAction SilentlyContinue

# read the URLs, one per line, from urls.txt next to the script
$urls = Get-Content (Join-Path $PSScriptRoot "urls.txt")

foreach($url in $urls) {
    # determine the local path and echo it as progress
    $targetPath = Join-Path $targetDirectory ([System.IO.Path]::GetFileName($url))
    $targetPath

    # download the file, if it doesn't exist yet
    if(!(Test-Path $targetPath)) {
        (new-object net.webclient).DownloadFile($url, $targetPath)
    }
}
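For example, assuming the script was saved as download-vm.ps1 (the file name is just an example), usage looks like this:

# urls.txt, in the same folder as the script, contains one URL per line,
# copied from the VM's download page
.\download-vm.ps1    # downloads any missing files
.\download-vm.ps1    # safe to run again; files that already exist are skipped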

This still requires some manual work: creating a file and copying and pasting URLs. Not a huge problem, but inspired by the way Chocolatey installs, I set out to minimize the manual steps. The result is a set of scripts, one for each VM, in a GitHub repository.

As an example, here is the script for the Visual Studio 2013 Update 3 ALM VM:

# VM details
$vmName = "vs13almvm"
$vmUrl = "http://aka.ms/$vmName"
Write-Host "Downloading ALM VM '$vmName' from '$vmUrl'" -ForegroundColor Yellow

# determine file-location
$targetDirectory = "$env:USERPROFILE\Downloads\$vmName"
Write-Host "Writing files to '$targetDirectory'"
mkdir $targetDirectory -ErrorAction SilentlyContinue | Out-Null

# get the HTML containing the URLs for the VM
Write-Host "Downloading web page containing URLs"
$html = (new-object net.webclient).DownloadString($vmUrl)

# get the URLs from the page
$match = [regex]::Match($html, '###[^\n]*Do Not Include[^\n]*###(?<urls>.*)###[^\n]*Do Not Include[^\n]*###')
if ($match.Success) {
    # split the URLs into an array
    $urls = $match.Groups["urls"].Value -split "<br\s*>"
    Write-Host "Found $($urls.Length) url's on the page"
    
    # process each URL
    foreach($url in $urls) {
        $url = $url.Trim()

        # determine the local path
        $targetPath = Join-Path $targetDirectory ([System.IO.Path]::GetFileName($url))

        # download the file, if it doesn't yet exist
        if(!(Test-Path $targetPath)) {
            Write-Host "Downloading '$targetPath'"
            (new-object net.webclient).DownloadFile($url, $targetPath)
        }
    }
}
else {
    Write-Host "Sorry, couldn't find the url's on the web-page" -BackgroundColor Red -ForegroundColor White
}

This script first downloads the web page containing the URLs, then extracts the URLs from the page and splits them into an array. Finally, all URLs are downloaded into a VM-specific folder in the current user's downloads folder. I had a bit of trouble getting the URLs from the HTML using a PowerShell regular expression, so I decided to use the .NET regex class directly. An alternative to getting the URLs from the page would be to put them in the script itself. But since the VMs are sometimes updated, and hands-on labs & demos are added, I would then need to keep updating the list of URLs in the scripts. Now I don't :-).
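For reference, a minimal sketch of the PowerShell-native alternative, using the -match operator and the $Matches automatic variable with the same pattern:

# -match returns $true on a match and fills the $Matches hash-table;
# named groups are available by name, just like with the .NET Match method
if ($html -match '###[^\n]*Do Not Include[^\n]*###(?<urls>.*)###[^\n]*Do Not Include[^\n]*###') {
    $urls = $Matches["urls"] -split "<br\s*>"
}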

Since the script is on GitHub, and this is the Chocolatey-inspired bit, you can run the above script by simply running the following command:

iex ((new-object net.webclient).DownloadString("https://raw.githubusercontent.com/CodeblackNL/ALMVMs/master/vs13almvm.ps1"))

This command downloads the script as a string and then executes that string as a script (iex is short for Invoke-Expression).
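If you'd rather inspect the script before running it, you can download it to a local file first and execute it from there; the local path below is just an example:

$scriptPath = "$env:USERPROFILE\Downloads\vs13almvm.ps1"
(new-object net.webclient).DownloadFile("https://raw.githubusercontent.com/CodeblackNL/ALMVMs/master/vs13almvm.ps1", $scriptPath)
# review the file, then run it
& $scriptPath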

Check the GitHub repository for the commands to download the other VMs.

Executing PowerShell scripts

I'm using PowerShell a lot for performing upgrade actions on an existing SharePoint installation. I try to keep the manual tasks to a minimum, since the exact same actions must be executed on different environments (DTAP: development, test, acceptance and production).

PowerShell scripts don't execute like a command file; you can't just double-click them in Explorer. Instead, you need to open a command line, navigate to the correct folder, start the PowerShell environment and then execute the script. This is not much of an issue when a whole set of scripts needs to be executed, but when I just want to run a single script, it feels like a lot of work. So to make executing a PowerShell script a little easier, I've created a simple command file that executes a single script. It works by simply dropping the desired PowerShell script on it in Explorer.

Create a command file with a name like 'ExecutePowershell.cmd' and put the following code in it:

@echo off
echo Execute PowerShell script:
echo.
echo %1
echo.
pause

rem %~1 strips any quotes Explorer put around the path,
rem so it can be re-quoted consistently with double double quotes
powershell.exe "& ""%~1"""
pause
@echo on
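If script execution is disabled by the execution policy on your machine, the call in the command file can bypass the policy for that single invocation with the -ExecutionPolicy parameter, like so:

powershell.exe -ExecutionPolicy Bypass "& ""%~1"""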

The trick here is of course in the way PowerShell is started. You can use this syntax to run a script from a file, like so:

powershell "& .\scriptfile.ps1"

Don't forget the .\ if you're running a script from the current directory (or PowerShell won't find the script). And when the path to the script contains spaces, place it between double double quotes (because the quotes are used within an already quoted string), like so:

powershell "& ""Installation Scripts\scriptfile.ps1"""

Or run a single statement (or several), like so:

powershell "& get-process"