
Microsoft Azure - Virtual machines with 30 GB disk

At Firstpoint we have been using Azure virtual machines for a while now. If you have just started using Azure, you will probably not run into the issue we have been facing; however, if you deployed VMs earlier, you may not have sized your disk properly.

Big disclaimer: use this procedure at your own risk! I'm not responsible if something breaks! The provided solution is as-is, without warranty! Be warned.


When we deployed VMs in Azure, the default OS disk was a 30 GB disk. With Windows Server 2012, that does not leave much room for anything else. The Azure console (read: web GUI) does not provide any way to expand or shrink a disk beyond the boundaries of your initially sized VHD file. People have therefore been forced to delete the VM and the "disk", and then download the VHD file. After the file has been downloaded, you have to use a VHD tool to increase its size, upload the file to Azure again, create a new VHD disk based on the newly uploaded file, and finally create a new VM which uses the new VHD file. Time consuming and a lot of work.

Microsoft has, as of April 2013, changed the deployment template to a 127 GB boot partition.


The solution is based upon this blog post:

Mr. Balliauw, from the link above, has created a utility where you don't need to download/upload the VHD file to increase or shrink it. Pretty sweet stuff. I have downloaded his repo from GitHub. You will need Visual Studio to compile it after having resolved all the NuGet packages.

Off we go then. I have this VM in Azure:

The VM only has one disk, of type OS disk. Browse to Storage, select your storage account, the container where your VHDs are stored, and the VHD disk file, and click the Edit BLOB button at the bottom:

When you click the Edit BLOB button this screen should show:

As you can see, the disk size is 110 GB (I have shrunk it from 127 GB, which is the maximum size on Azure).

Next we have to power down the VM. Do a proper shutdown from the operating system, and then a shutdown in the Azure web console. When the status is updated to "Stopped (Deallocated)" you are ready to proceed. Make sure you select the deallocated VM and press the Delete button:

When the delete job finishes, we need to go to the "Disks" tab, select the disk of the deallocated VM and press the Delete button. Please make sure you select "Retain the associated VHD"!!!

The next step is to download the repo from GitHub; please see the link posted above. Resolve the NuGet packages in Visual Studio and build the solution.

Open up a new command window and browse to the location of your project. Run the console application and it should print its "how-to-use" help content:

Time to gather up some input data for the console application.

The first one is easy: determine the new size of the disk. I will set this to 120 GB (remember, 127 GB is the maximum for OS disks in Azure). Next we have to find the blob URL. Navigate to Storage, your storage account, and the container for your VHDs. The file listing there should give you the file's URL. Next up are the account name and the key. The account name is found under Storage:
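As a quick sanity check on the size parameter: a fixed-format VHD stored as a page blob is the raw disk size plus the 512-byte footer, and Azure page blobs must be a multiple of 512 bytes long, so the expected blob length for any target size is easy to compute (a small sketch of mine, nothing Azure-specific):

```python
# Expected page-blob length for a fixed VHD of a given size in GB.
GB = 1024 ** 3
FOOTER = 512  # fixed VHD footer

def target_blob_length(disk_size_gb: int) -> int:
    length = disk_size_gb * GB + FOOTER
    assert length % 512 == 0  # page blobs must be 512-byte aligned
    return length
```

For the 120 GB target used in this post, that works out to 120 × 1024³ + 512 bytes, which is handy for verifying the blob size shown in the console afterwards.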

There you will also find the access key, behind the "Manage access keys" button at the bottom.

Running the command:
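The screenshot of the actual run is not reproduced here. From memory, the tool takes the new size, the blob URL, the storage account name, and the access key as positional arguments, so the invocation looked roughly like the sketch below; the account name, container, file name, and key are all placeholders, and the exact argument order may differ in the actual repo.

```
WindowsAzureDiskResizer.exe 120 https://mystorageaccount.blob.core.windows.net/vhds/myvm-osdisk.vhd mystorageaccount <storage-account-key>
```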

Just in case you are wondering: no, that is not my access key or my account name :-).
Good, everything looks okay. Going back to the Azure console, let's edit the blob and check out the details:

Nice, that saved me some time compared to downloading 110 GB and then uploading 120 GB again :-)

Now we need to create a new disk. Navigate to Virtual machines in the console and choose to create a new disk:

Give the disk a name, choose the VHD by browsing the container of your VHDs, and check the option "VHD contains an operating system":

The disk should appear in the disks list under Virtual Machines. Now you are ready to create a new VM using this new disk as the "template". 

Create a new VM:

Select the newly re-sized disk:

That's it, your VM will be deployed using the "new" disk.

And then the disclaimer:

Big disclaimer: use this procedure at your own risk! I'm not responsible if something breaks! The provided solution is as-is, without warranty! Be warned.

