
Ignite 2016 summary – Innovate, optimize, manage and empower your business with IT



This year's Microsoft Ignite conference was all about transforming your business with technology. Here is a techy summary for business minds.


Going forward, IT-Pros must prepare both to answer tricky business questions and to leverage new tools to meet business demands. I imagine questions like these:
 
  • What are the needs of our business?
  • How can we empower our users to apply the cloud to gain competitive advantages?
  • How can we innovate with greater agility and optimize our IT resources?
  • How can we migrate from the traditional model, where IT is just a cost center, to a lean, mean machine where IT is the engine that powers our business strategy and drives increased earnings?


A model of the traditional business case

We live in a traditional world with traditional problems. Simplified, a business consists of a few silos:
  • Internal users
  • Your customers
  • Your suppliers and partners
  • The remainder of the universe

All of these are connected, directly and indirectly, through processes, some of them manual and some perhaps automated. The job of the IT department is to deliver services, preferably in the most cost-effective way possible. Generally, if you change a process through a tool or automation (PowerShell) and save time or cost, you become the hero. Cost and time savings are always welcome; however, the possible impact is far greater when IT is driving your revenue, as in the new model.
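As a concrete, hypothetical example of that kind of time-saving automation, here is a short PowerShell sketch that replaces a manual, per-row provisioning routine with a loop over a CSV export (the file name and columns are made up for illustration):

```powershell
# Hypothetical example: automate a manual CSV-driven provisioning task.
# Assumes a CSV with columns Name,Department exported from an HR system.
Import-Csv -Path '.\newhires.csv' | ForEach-Object {
    # In a real environment this step might call New-ADUser,
    # create a mailbox, assign licenses, and so on.
    Write-Output ("Provisioning {0} in {1}" -f $_.Name, $_.Department)
}
```

The point is not the snippet itself, but that a repeatable process encoded once keeps paying back every time it runs.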



The new model for IT

In the new world, everything is about processes, data and applications. In other words, algorithms. Everything is moving and changing at a higher speed than we have ever experienced before. Silos probably still exist; however, they are interconnected and data-aware. Your CRM application will have access to, and understand, other applications and their data structures. It will empower your employees and provide you with just-in-time insights. With the new PowerApps and Flow services, which implement the Common Data Model (CDM), you have this available today as a preview. Throw Azure Functions into the picture, and you have a pretty robust and extensible model that is highly customizable and scalable.

In addition, Azure has implemented predictive analytics and machine learning (ML) in its different APIs, like Storage, Azure SQL, Hadoop and so on. Microsoft is enabling ML for the masses by implementing it across its datacenters and in the Azure model. Your developers are not responsible for implementing intelligence in your application; they consume predictive data from the Azure machine learning APIs, for example through the integration with the Storage API. You no longer consider IT a cost center but a business enabler that helps you increase revenue by analyzing big data with algorithms that are constantly updated to provide perfect information just in time. Theoretically this is possible anywhere; in practice it is immensely difficult to implement if you are not in Azure.



What do you need?



:Speed and agility: If you have a clear understanding of your needs, your market and your competitors, why not move as fast and as nimbly as you can? If you can change faster than your competitors, you have an advantage and a head start. Let me illustrate with an example: you have probably heard about robot trading in the stock market. The robots move fast because the first party that receives and understands a specific piece of market information wins and walks away with the profits. In our business case, it is the same thing. Making rapid changes to your algorithms and IT systems, so that they understand the business and receive correct information just in time, is essential to becoming the leader and increasing profits.

:Scale: Your IT system needs to be able to scale, up and down. You should not have to worry about it, as the cloud does this for you within the limits you have defined. The cloud empowers businesses of all sizes to use scaling technology that was previously the privilege of large enterprises with expensive dedicated appliances. Committing to services and applications that handle scaling is key in the new world. Relying on old legacy applications and services will prevent you from becoming a new force in your market. Startups in your market will become your new IT performance benchmark, and they probably do not consider legacy systems a match for their agile needs.
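To make "within the limits you have defined" concrete, Azure lets you express those limits declaratively. This is a trimmed, illustrative sketch of a Microsoft.Insights/autoscaleSettings resource (the resource name, region and thresholds are made up) that grows a VM scale set by one instance whenever average CPU stays above 75%:

```json
{
  "type": "Microsoft.Insights/autoscaleSettings",
  "apiVersion": "2015-04-01",
  "name": "webfarm-autoscale",
  "location": "westeurope",
  "properties": {
    "enabled": true,
    "targetResourceUri": "[resourceId('Microsoft.Compute/virtualMachineScaleSets', 'webfarm')]",
    "profiles": [
      {
        "name": "default",
        "capacity": { "minimum": "2", "maximum": "10", "default": "2" },
        "rules": [
          {
            "metricTrigger": {
              "metricName": "Percentage CPU",
              "metricResourceUri": "[resourceId('Microsoft.Compute/virtualMachineScaleSets', 'webfarm')]",
              "timeGrain": "PT1M",
              "statistic": "Average",
              "timeWindow": "PT5M",
              "timeAggregation": "Average",
              "operator": "GreaterThan",
              "threshold": 75
            },
            "scaleAction": {
              "direction": "Increase",
              "type": "ChangeCount",
              "value": "1",
              "cooldown": "PT5M"
            }
          }
        ]
      }
    ]
  }
}
```

The platform enforces the minimum/maximum capacity for you; your only job is to pick numbers that match your business, not to build the scaling machinery.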

:Knowledge – Close the gap: The adoption of cloud resources and the hybrid cloud is just the beginning of the disruptive change that is here. Hybrid cloud is just a stepping stone towards the connected cloud, with unlimited resources at your fingertips. That does not imply that private clouds will cease to exist. They just need to be connected to the public cloud and empower it by bringing some added value. A disconnected private cloud, on the other hand, will be a relic, an edge case for very special circumstances. In this scenario, knowledge will be important. New features and services are launched on an almost weekly basis. Products migrate from private preview, to public preview and finally to general availability in a matter of months. If you do not take advantage, someone else will, perhaps your competitors.

:New People and Organization 2.0: In the best case, you need a huge amount of training and design work. If ordering a new web server or virtual machine takes longer than the time needed to create and deploy it automatically, trust me, you have to do something. Your organization is already changing; perhaps you just have not noticed it yet. Ever heard of Shadow IT, the evil from within? If it is not knocking on your door, that is because it is already inside. Shadow IT is a real problem that you need to take seriously. In the emerging world, people want things yesterday, like always. The problem is that if you do not deliver, someone else can, and asking for forgiveness beats asking for permission nine times out of ten, especially if it yielded a positive result. Rules, policies and guidelines are nice; however, immediate results are king.

:DevOps is a “must”: The new world relies on DevOps. DevOps merges the developer and IT-Pro roles, bringing the knowledge of both parties together and applying it to your business and culture in a series of new processes. DevOps is not automation; however, automation is a key part of DevOps.

:Security: You do know that hackers target IT-Pros because they normally have access to everything? The tools to handle this are available and have been for quite some time now. Microsoft Identity Manager comes with Privileged Access Management (PAM), which audits privileged access with time constraints. When your privileged access token expires, your access is revoked. The PowerShell team has created a toolkit called Just Enough Administration (JEA), which is very similar to the Identity Manager solution. Both solutions should be designed with a “break the glass” option for the moment when you really do not care about security but need to fix the issue. If you break the glass, all kinds of things happen, and you should expect to face some sort of hearing where you have to justify the action, which is a good thing.
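A minimal JEA sketch can make this less abstract. Assume a fictitious "DnsOperator" role that may only restart the DNS service; the role name, group and paths below are illustrative, not a production design:

```powershell
# Hypothetical JEA setup: operators may restart the DNS service and nothing else.
# 1. A role capability file limits which cmdlets (and parameter values) are visible.
New-PSRoleCapabilityFile -Path .\DnsOperator.psd1 `
    -VisibleCmdlets @{ Name = 'Restart-Service'; Parameters = @{ Name = 'Name'; ValidateSet = 'Dns' } }

# 2. A session configuration maps an AD group to that role and runs commands
#    under a temporary virtual account instead of the operator's own credentials.
New-PSSessionConfigurationFile -Path .\DnsOps.pssc `
    -SessionType RestrictedRemoteServer `
    -RunAsVirtualAccount `
    -RoleDefinitions @{ 'CONTOSO\DnsOperators' = @{ RoleCapabilityFiles = 'C:\JEA\DnsOperator.psd1' } }

# 3. Register the endpoint; operators then connect with
#    Enter-PSSession -ComputerName dns01 -ConfigurationName DnsOps
Register-PSSessionConfiguration -Name 'DnsOps' -Path .\DnsOps.pssc
```

The operator never holds a standing admin account; the privilege exists only inside the constrained session, which is exactly the PAM idea applied at the PowerShell layer.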

With Windows Server 2016, a new Hyper-V feature was launched: shielded VMs. With a shielded VM, the tenant of a shared resource owns the VM completely. The entity responsible for the platform it runs on can manage it only to a certain degree (start it, stop it and back it up). The backup of a shielded VM is encrypted, in case you were wondering.
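For a feel of the moving parts, here is a hedged lab sketch using the local-guardian shortcut (suitable for a test bench, not a production guarded fabric, which requires a Host Guardian Service; the guardian and VM names are made up):

```powershell
# Hypothetical lab sketch: shield an existing VM using a local guardian
# instead of a full Host Guardian Service deployment.
$guardian  = New-HgsGuardian -Name 'LabGuardian' -GenerateCertificates
$protector = New-HgsKeyProtector -Owner $guardian -AllowUntrustedRoot

# Attach the key protector, enable the virtual TPM and flip the shielded policy.
Set-VMKeyProtector   -VMName 'Tenant01' -KeyProtector $protector.RawData
Enable-VMTPM         -VMName 'Tenant01'
Set-VMSecurityPolicy -VMName 'Tenant01' -Shielded $true
```

After this, the fabric admin can still start, stop and back up Tenant01, but cannot open its console or read its disks, which is the whole point.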

Last but not least, security starts at the operating system level. In general, reducing the attack surface is regarded as the first line of defense. Windows Server 2016 Nano Server is the new operating system for the cloud and will change the way you work with and handle datacenter workloads. Nano Server has a tiny footprint and a small attack surface, and it is blazingly fast, which makes it a perfect match for a fast-moving and agile business.

:Help – Private cloud or hybrid cloud: Even with a new organization and new knowledge, it is highly likely that you will need some consultancy. According to Gartner, 95% of all attempts to create a private cloud fail or fail to yield the expected outcome. Building and implementing a private cloud is very hard, and you should be very confident in your organization’s abilities before you embark on such a journey. Microsoft is the only public cloud provider that will provide you with a turnkey solution to run your hybrid cloud. If you have not heard about Microsoft AzureStack, you should probably read up on it. Basically, it is Azure wrapped up in a hyper-converged, ready-to-deploy solution for your datacenter, delivered by OEM vendors like Dell, Lenovo, HP et al. New features that debut in Azure will most likely migrate to AzureStack, ready for use in your hybrid cloud.

AzureStack is targeted for release sometime in mid-2017 or later that year. That is almost a year away. The good thing is that AzureStack is based upon Azure; it runs the same underlying technology that powers Azure, like the portal and Azure Resource Manager (ARM). Microsoft is delivering a consistent experience across the public and hybrid cloud with the ARM technology. To prepare for AzureStack, you should invest time and effort in learning Azure, and that knowledge will empower you if you decide to implement AzureStack next year.
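The consistency argument is easiest to see in a template. A minimal, illustrative ARM template for a storage account (parameter name chosen here for the example) is the same declarative artifact whether you deploy it to Azure today or, presumably, to AzureStack later:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2016-01-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage",
      "properties": {}
    }
  ]
}
```

You deploy it the same way in both worlds, for example with New-AzureRmResourceGroupDeployment from PowerShell, so the Azure skills you build now carry straight over.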




All in - or not

Do you need to go all in on the private cloud, or should you just integrate with the public cloud? It depends on your organization and your business needs. One thing is certain: you probably have to do something. Implementing your own version of the ready-to-consume features of the public cloud in your own private datacenter is not an option you should consider. It would require a tremendous effort, tie down your resources and, in effect, make you static. You need to rub DevOps and business strategy into your business and culture. There are some really smart people out there who can help you with that, and like everything else, it is an ongoing process that requires your constant attention.

The change is here. How will you empower your organization and become the new star? I am happy to discuss opportunities if you reach out by sending me an email.


Cheers

Tore

