
Hybrid cloud – the missing automation link?





Fair warning: I am probably off on a rant here about the cloud and how “everything is connected” in terms of automation. This post discusses the recent push from Microsoft and their vision of a Hybrid Cloud. I will not do a deep dive into any technical aspects, but rather take an overall look at the vision and how recent changes brought to us by Windows Server 2012 R2 (and Windows Management Framework 4) can affect us going forward.


Background – What do we have and what has happened?


From “nowhere” Azure as a PaaS was born. A while before that, people started talking about “The Cloud”, and things started to move in terms of the Public Cloud (“The Cloud”) and the Private Cloud (on-premises, or “on-prem” as it is called nowadays). Moving forward a bit, Microsoft started to launch their vision of a Hybrid Cloud, a mix of the “best” of the Public and Private cloud. This was supported by the release of the System Center 2012 suite together with Windows Server 2012. A couple of final pieces of the puzzle came around in 2013: Windows Server 2012 R2, PowerShell v4 (Windows Management Framework 4), which is supported on Windows Server 2008 R2/2012/2012 R2, and Azure as an IaaS platform. The puzzle is complete, for now.


A small disclaimer

You will see me talk about a lot of different things regarding System Center in general and certain products in the suite. Needless to say, I am no expert on all of them, except perhaps on OpsMgr, and to some degree PowerShell and Desired State Configuration.


Hybrid – The best from both worlds according to Microsoft

Will Microsoft’s vision prevail? I don’t know for certain, and that is not the point of this post. However, if you look at what the other big players are doing, they are also starting to talk about the hybrid way and launching products that very much support it. Time will tell if it is the “right” way. If you ask me, I think they are spot on, and we will live happily with the hybrid model for years to come.


System Center – Must have?

What a collection of products and what a launch. I think they timed it perfectly. Please don’t get me wrong, there are flaws in the products, but the overall package is impressive compared to the alternatives. In my book, this is a success, measured not in cash flow and number of installations, but in what it enables. Add the Windows Azure Pack for building and connecting clouds, and the world has come to the height of its evolution. To quote Dr. Evil: “You complete me <3”. Please do not write me off as crazy just yet; stay with me.


The evolution and automation – Darwin at its best?

This is where things start to get interesting. Surprisingly, this started way back with Mr Jeffrey Snover, the inventor of PowerShell. More than 10 years ago he wrote a manifesto about PowerShell (just do a search in your favorite search engine for “Monad manifesto”) and described what it would look like and do. With the release of PowerShell v4, the final piece of the manifesto was “complete” (declarative configuration). I am not saying finished, because Microsoft has to upgrade the final component in PowerShell – DSC (Desired State Configuration) – and they probably have other nice features in store for us.

So what is so special about PowerShell DSC, you say? Well, it changes things in a fundamental way. It enables you to configure your datacenter and prevent configuration drift. Mr Snover and some of the PowerShell experts (Don Jones, PowerShell MVP, among others) predict that the role of the IT pro will change dramatically. The prediction is that the future is all about automation. All the low-level work of installing hardware and doing manual installations and upgrades will become extinct, well, almost. Their point is that if you are going to be successful in building a career as an IT pro, you have to start learning PowerShell and/or DSC to implement automation. It may be a brutal statement, but that does not make it less true. If you look at what is happening in the IT department today, it is the logical step forward. I think their prophecy is going to come true; the question is how fast and eventually to what degree.
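To make the declarative idea concrete, here is a minimal sketch of a DSC configuration. The node name, feature, and file path are assumptions for illustration, not from any particular environment:

```powershell
# A minimal DSC configuration sketch (hypothetical node name and paths).
Configuration WebServerBaseline
{
    Node 'SRV01'
    {
        # Ensure IIS is installed
        WindowsFeature IIS
        {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }

        # Ensure a marker file exists; DSC recreates it if someone deletes it
        File VersionFile
        {
            Ensure          = 'Present'
            DestinationPath = 'C:\inetpub\wwwroot\version.txt'
            Contents        = '1.0'
        }
    }
}

# Compiling the configuration produces a MOF file per node
WebServerBaseline -OutputPath 'C:\DSC\WebServerBaseline'

# Push the configuration; the Local Configuration Manager enforces it from then on
Start-DscConfiguration -Path 'C:\DSC\WebServerBaseline' -Wait -Verbose
```

Notice that you describe *what* the server should look like, not *how* to get there; that is the declarative piece of the Monad manifesto the paragraph above refers to.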

That being said, there are some shortcomings in DSC, as I have talked about before (PowerShell – The DSC book). Those are some of the issues that would need to be addressed by someone, either Microsoft or some other cunning people.


The missing link

So is everything nice and dandy? Go ahead, learn PowerShell and/or System Center and you are set for life? Well, a big fat perhaps. One of the most amazing things in System Center is Virtual Machine Manager. You can manage your Hyper-V cluster hosts and your VMware vCenter/vSphere server from a single console. Add in an integration from Citrix for their hypervisor and you have a nice package. But it does not stop there. VMM can be fully managed without using the GUI console at all; everything you can configure in the console you can also configure with PowerShell. Automation then becomes a breeze, you wonder? Yes, quite right; the GUI console even gives you the PowerShell script when you run through the GUI wizards. In VMM you can also create templates that configure your server with the roles and features of the OS you would like to have installed. That is powerful, and it has never been easier. Install the Windows Azure Pack and you can create even more detailed templates using Resource Definitions and Resource Extensions.
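The kind of script the VMM wizard hands you boils down to something like the sketch below. Treat it as a hypothetical outline; the template name, host name, and path are made up, and exact parameter sets vary between VMM versions:

```powershell
# Hypothetical sketch: deploy a VM from a VMM template with PowerShell.
# Template/host/VM names and paths are illustrative assumptions.
Import-Module virtualmachinemanager

# Pick the template the VM should be built from
$template = Get-SCVMTemplate -Name 'W2012R2-Standard'

# Pick a Hyper-V host managed by VMM
$vmHost = Get-SCVMHost -ComputerName 'HV01'

# Create the new VM from the template on that host
New-SCVirtualMachine -Name 'APP01' `
    -VMTemplate $template `
    -VMHost $vmHost `
    -Path 'C:\VMs'
```

In practice you rarely write this by hand; you click through the wizard once, grab the generated script, and parameterize it.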

DSC is just sweet and extremely powerful. Please remember that DSC is version 1.0 and that Microsoft is releasing resources you can use in your DSC configurations reasonably frequently; they have released two waves of resources already since DSC shipped with 2012 R2. The problem with DSC is that we are missing some enterprise features that Microsoft is bound to address in the next release, preferably before the next major release of Windows Server. It just needs some tooling for easy management and compliance. Even though there are reasonably large companies using DSC today, they are using it because they desperately need it and/or they recognize the huge return on investment in implementing it.
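The compliance story today is mostly what the built-in cmdlets give you. A quick sketch of what you can already do per node, short of real enterprise-wide reporting:

```powershell
# Sketch: basic per-node compliance checks with the built-in DSC cmdlets (WMF 4).

# Returns $true if the node currently matches its applied configuration,
# $false if it has drifted
Test-DscConfiguration -Verbose

# Show the configuration currently applied to the node
Get-DscConfiguration
```

This works fine for one box; the missing enterprise features referred to above are about aggregating this across hundreds of nodes.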

So now we have a couple of ways to configure our datacenter and servers. We can use VMM and its features, with or without the Azure Pack, or we can use DSC to create configurations that can be applied to the datacenter. There are even ways to inject a DSC configuration into a virtual machine using both the pull and push model of DSC, however I find it a bit limiting in terms of injecting the configuration file or GUID into the VHD. The point is there should be a way to integrate VMM/Azure Pack and DSC so that your DSC configurations become the “templates” available to you when you deploy a new virtual machine. Going the other way, we can create pristine, fresh virtual machines with VMM and rely on DSC to configure them. We can even have a DSC configuration that creates the VMs for us in VMM and then apply a DSC configuration to those.
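The GUID injection mentioned above is the pull-model handshake: the node's Local Configuration Manager is pointed at a pull server and given a ConfigurationID, and the server hands back the matching MOF. A sketch of such a meta-configuration, WMF 4 style, with a made-up GUID and server URL:

```powershell
# Sketch: LCM meta-configuration for the DSC pull model (WMF 4 syntax).
# The GUID and pull server URL are illustrative assumptions.
Configuration PullClient
{
    Node 'APP01'
    {
        LocalConfigurationManager
        {
            # The GUID that maps this node to <GUID>.mof on the pull server
            ConfigurationID     = '1d545e3b-60c3-47a0-bf65-5afc05182fd0'
            RefreshMode         = 'Pull'
            DownloadManagerName = 'WebDownloadManager'
            DownloadManagerCustomData = @{
                ServerUrl = 'https://pull.contoso.com/PSDSCPullServer.svc'
            }
            # Re-check and auto-correct drift on a schedule
            ConfigurationMode              = 'ApplyAndAutoCorrect'
            ConfigurationModeFrequencyMins = 30
        }
    }
}

PullClient -OutputPath 'C:\DSC\Meta'
Set-DscLocalConfigurationManager -Path 'C:\DSC\Meta'
```

It is exactly this GUID-plus-meta-configuration pairing that has to be injected into the VHD or applied at first boot, which is where a VMM/WAP integration would make life easier.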

Why is this important, you say? Well, templates are good and efficient at configuring a machine the way you want it, but it is a one-time event. To my knowledge, there is no way of reapplying a template to an already created virtual machine, say if you change the template. This, however, is the strength of DSC. It describes the configuration of the machine(s), applies the configuration, and makes it stay that way forever, or until you change the configuration. If you add in scalability, DSC just outruns everything, since it is designed to separate the configuration from the infrastructure. Say you have a finished and tested DSC configuration and you want to create 100 virtual machines and apply the same configuration; how long would it take? Obviously it depends on your hardware configuration and the current strain on your hypervisor/storage, but in general terms it is the time VMM needs to create a new blank VM plus the time DSC needs to process and apply the configuration. That is impressive, but consider the scenario where you need to update a file or install a new component on those 100 servers. You change the configuration and apply it, and DSC makes it so.
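The separation of configuration from infrastructure is what makes the 100-server case cheap: one configuration, compiled against configuration data, yields one MOF per node. A sketch, with generated hypothetical node names:

```powershell
# Sketch: one DSC configuration compiled for 100 nodes via configuration data.
# Node names WEB001..WEB100 are generated for illustration.
Configuration Baseline
{
    Node $AllNodes.NodeName
    {
        WindowsFeature WebServer
        {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}

# Build configuration data describing 100 hypothetical servers
$configData = @{
    AllNodes = 1..100 | ForEach-Object {
        @{ NodeName = ('WEB{0:D3}' -f $_) }
    }
}

# Produces WEB001.mof .. WEB100.mof in one compile
Baseline -ConfigurationData $configData -OutputPath 'C:\DSC\Baseline'
```

Change the `Baseline` configuration, recompile, and the same 100 nodes converge to the new state; that is the "change it once, DSC makes it so" point from the paragraph above.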

Summary

How will you configure your datacenter? Will you rely on templates, or is DSC something you should consider? What if your competitor is implementing DSC; where will they focus the resources freed up by DSC?

The wind of change is coming. How fast and from which direction is too early to tell, but it is coming. The tools to automate and create scalable, easy-to-maintain solutions are here today. Even if you are not able to automate every single bit of the strange and unique monster you call your datacenter, running on-prem or in the cloud, start where the ROI is overwhelming and work your way down towards zero. My prediction is that you will not fail by doing so.

Cheers  
