
Hybrid cloud – the missing automation link?

Fair warning: I am probably off on a rant here about the cloud and how “everything is connected” in terms of automation. This post discusses the recent offensive from Microsoft and their vision of a Hybrid Cloud. I will not do a deep dive into any technical aspects, but rather have a broader discussion about the vision and how the recent changes brought to us by Windows Server 2012 R2 (and Windows Management Framework 4) can affect us going forward.


Background – What do we have and what has happened?


From “nowhere”, Azure as a PaaS was born. A while before that, people had started talking about “The Cloud”, and things started moving in terms of the Public Cloud (“The Cloud”) and the Private Cloud (“On-Premises”, or “On-Prem” as it is called nowadays). Moving forward a bit, Microsoft started to launch their vision of a Hybrid Cloud, a nice mix of the “best” of the Public and Private cloud. This was supported by the release of the System Center 2012 suite together with Windows Server 2012. A couple of final pieces of the puzzle arrived in 2013: Windows Server 2012 R2, PowerShell v4 (Windows Management Framework 4), which is supported on Windows Server 2008 R2/2012/2012 R2, and Azure as an IaaS platform. The puzzle is complete, for now.


A small disclaimer

You will see me talk about a lot of different things regarding System Center in general and certain products in the suite. Needless to say, I am no expert on all of them, except perhaps on OpsMgr and, to some degree, PowerShell and Desired State Configuration.


Hybrid – The best from both worlds according to Microsoft

Will Microsoft’s vision prevail? I don’t know for certain, and that is not the point of this post. However, if you look at what the other big players are doing, they are also starting to talk about the hybrid way and launching products that very much support it. Time will tell if it is the “right” way. If you ask me, I think they are spot on, and we will live happily with the hybrid model for several years to come.


System Center – Must have?

What a collection of products, and what a launch. I think they timed it perfectly. Please don’t get me wrong, there are flaws in the products, but the overall package is impressive compared to the alternatives. In my book, this is a success, measured not in cash flow and number of installations, but in what it enables. Add the Windows Azure Pack for building and connecting clouds, and the world has come to the height of its evolution. To quote Dr. Evil: “You complete me <3”. Please do not write me off as crazy just yet, stay with me.


The evolution and automation – Darwin at its best?

This is where things start to get a little interesting. Surprisingly, this started way back with Mr. Jeffrey Snover, the inventor of PowerShell. More than 10 years ago he wrote a manifesto about PowerShell (just search your favorite search engine for “Monad Manifesto”) describing what it would look like and do. With the release of PowerShell v4, the final piece of the manifesto was “complete”: declarative configuration. I am not saying finished, because Microsoft has to upgrade that final component of PowerShell, DSC (Desired State Configuration), and they probably have other nice features in store for us.

So what is so special about PowerShell DSC, you say? Well, it changes things in a fundamental way. It enables you to configure your datacenter and prevent configuration drift. Mr. Snover and some of the PowerShell experts (Don Jones, PowerShell MVP, among others) predict that the role of the IT pro will change dramatically. The prediction is that the future is all about automation. All the low-level work of installing hardware and doing manual installations and upgrades is extinct, well, almost. Their point is that if you are going to build a successful career as an IT pro, you have to start learning PowerShell and/or DSC to implement automation. It may be a brutal statement, but that does not make it less true. If you look at what is happening in IT departments today, it is the logical step forward. I think their prophecy is going to come true; the questions are how fast and, eventually, to what degree.
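To make the declarative idea concrete, here is a minimal sketch of what a DSC configuration looks like in PowerShell v4. The node name, feature, and file paths are illustrative assumptions on my part, not taken from any particular environment:

```powershell
# A minimal DSC configuration: declare WHAT the node should look like,
# and let the Local Configuration Manager keep it that way.
Configuration WebServerBaseline
{
    Node "SERVER01"
    {
        # Ensure the IIS role is installed
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        # Ensure a configuration file is present, and stays present
        File SiteConfig
        {
            Ensure          = "Present"
            SourcePath      = "\\fileshare\config\web.config"
            DestinationPath = "C:\inetpub\wwwroot\web.config"
        }
    }
}

# Compiling the configuration produces one MOF file per node
WebServerBaseline -OutputPath "C:\DSC\WebServerBaseline"
```

The configuration itself contains no imperative steps; if someone later removes the IIS role or deletes the file, DSC can detect and correct the drift.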

That being said, there are some shortcomings in DSC, as I have talked about before (Powershell – The DSC book). Those are some of the issues that would need to be addressed by someone, either Microsoft or some other cunning people.


The missing link

So, is everything nice and dandy? Go ahead, learn PowerShell and/or System Center and you are set for life? Well, a big fat perhaps. One of the most amazing things in System Center is Virtual Machine Manager. You can manage your Hyper-V cluster hosts and your VMware vCenter/vSphere servers from a single console. Add in Citrix’s integration of their hypervisor and you have a nice package. But it does not stop there. VMM can be fully managed without using the GUI console at all; everything you can configure in the GUI, you can also configure with PowerShell. Automation then becomes a breeze, you wonder? Yes, quite right, the GUI console even gives you the PowerShell script when you run through the GUI wizards. In VMM you can also create templates that configure your servers with the OS roles/features you would like to have installed. That is powerful, and it has never been easier. Install the WAP and you can create even more detailed templates using Resource Definitions and Resource Extensions.
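As a rough sketch of the scripted route, assuming the VMM PowerShell module is installed and using made-up server and template names, deploying a VM from a template could look something like this:

```powershell
# Sketch: deploying a VM from a VMM template without touching the GUI.
# Requires the VMM console/module; server and template names are illustrative.
Import-Module virtualmachinemanager

# Connect to the VMM management server
Get-SCVMMServer -ComputerName "vmm01.contoso.local"

# Build a VM configuration from an existing template and deploy it
$template = Get-SCVMTemplate -Name "W2012R2-Base"
$config   = New-SCVMConfiguration -VMTemplate $template -Name "APP01"

New-SCVirtualMachine -Name "APP01" -VMConfiguration $config
```

This is essentially the same script the GUI wizard would have generated for you at the end of a deployment run.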

DSC is just sweet and extremely powerful. Please remember that DSC is version 1.0 and that Microsoft is releasing new resources you can use in your DSC configurations reasonably frequently; they have already released two waves of resources since DSC shipped with 2012 R2. The problem with DSC is that we are missing some enterprise features, which Microsoft is bound to address in the next release, preferably before the next major release of Windows Server. It just needs some tooling for easy management and compliance. Even though there are reasonably large companies using DSC today, they are using it because they desperately need it and/or they recognize the huge return on investment in implementing it.

So now we have a couple of ways to configure our datacenter and servers. We can use VMM and its features, with or without the Azure Pack, or we can use DSC to create configurations that can be applied to the datacenter. There are even ways to inject a DSC configuration into a virtual machine using both the pull and push models of DSC, however I find it a bit limiting in terms of injecting the configuration file or GUID into the VHD. The point is that there should be a way to integrate VMM/Azure Pack and DSC, so that your DSC configurations become the “templates” available to you when you deploy a new virtual machine. The other way around, we can create pristine, fresh virtual machines with VMM and rely on DSC to configure them. We can even have a DSC configuration that creates the VMs for us in VMM and then applies a DSC configuration to those.
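The push model half of that story is the simplest to sketch. Assuming the MOF files have already been compiled to a local folder and PowerShell remoting is enabled on the target, pushing a configuration is a one-liner:

```powershell
# Sketch: pushing a compiled DSC configuration to a freshly created VM.
# Assumes MOF files exist under C:\DSC\WebServerBaseline (one per node)
# and that WinRM/remoting is enabled on the target server.
Start-DscConfiguration -Path "C:\DSC\WebServerBaseline" `
                       -ComputerName "SERVER01" `
                       -Wait -Verbose
```

In the pull model, the roles reverse: each node's Local Configuration Manager periodically fetches its configuration (identified by a GUID) from a pull server instead of having it pushed.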

Why is this important, you say? Well, templates are good and efficient at configuring a machine the way you want it, however it is a one-time event. To my knowledge, you have no way of reapplying a template to an already created virtual machine, say if you change the template. This, however, is the strength of DSC. It describes the configuration of the machine(s), applies the configuration, and makes it stay that way forever, or until you change the configuration. If you add in scalability, DSC just outruns everything, since it is designed to separate the configuration from the infrastructure. Say you have a finished and tested DSC configuration and you want to create 100 virtual machines and apply the same configuration to them all; how long would it take? Obviously it depends on your hardware and the current strain on your hypervisor/storage, but in general terms: the time VMM needs to create a new blank VM plus the time DSC needs to process and apply the configuration. That is impressive, but consider the scenario where you need to update a file or install a new component on those 100 servers. You change the configuration and apply it, and DSC makes it so.
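Combining the two, the 100-server scenario could be sketched roughly like this. Again, the template name, naming scheme, and MOF folder are assumptions for the sake of illustration, and the DSC configuration is assumed to have been compiled with a node entry per server name:

```powershell
# Sketch: 100 blank VMs from VMM, one DSC configuration applied to all.
# Assumes the VMM module, a "W2012R2-Blank" template, and compiled MOFs
# (one per node name) under C:\DSC\WebServerBaseline.
$template = Get-SCVMTemplate -Name "W2012R2-Blank"

# WEB001 .. WEB100
$names = 1..100 | ForEach-Object { "WEB{0:D3}" -f $_ }

foreach ($name in $names) {
    $config = New-SCVMConfiguration -VMTemplate $template -Name $name
    New-SCVirtualMachine -Name $name -VMConfiguration $config
}

# One declarative configuration, applied to every node in the list.
# Changing the configuration later and re-running this line is the
# whole "update 100 servers" story.
Start-DscConfiguration -Path "C:\DSC\WebServerBaseline" `
                       -ComputerName $names -Wait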

Summary

How will you configure your datacenter? Will you rely on templates, or is DSC something you should consider? What if your competitor is implementing DSC, where will they focus the resources freed up by DSC?

The wind of change is coming. How fast and from which direction is too early to tell, however it is coming. The tools to automate and create scalable, easy-to-maintain solutions are here today. Even if you are not able to automate every single bit of the strange and unique monster you call your datacenter, running on-prem or in the cloud, start where the ROI is overwhelming and work your way down towards zero. My prediction is that you will not fail by doing so.

Cheers  
