
Ignite 2016 summary – Innovate, optimize, manage and empower your business with IT



This year's Microsoft Ignite conference was all about transforming your business with technology. Here is a techy summary for business minds.


Going forward, IT-Pros must prepare to answer tricky business questions and leverage new tools to meet business demands. I imagine questions like these:
 
  • What are the needs of our business?
  • How can we empower our users to apply the cloud to gain competitive advantages?
  • How can we innovate with greater agility and optimize our IT resources?
  • How can we migrate from the traditional model, where IT is just a cost-center, to a lean, mean machine where IT is the engine that powers our business strategy and increases earnings?


A model of the traditional business case

We live in a traditional world with traditional problems. Simplified, a business consists of a few silos:
  • Internal users
  • Your customers
  • Your suppliers and partners
  • The remainder of the universe

All of these are connected directly and indirectly through processes, some of them manual and some perhaps automated. The job of the IT department is to deliver services, preferably in the most cost-effective way possible. Generally, if you change a process through a tool or automation (PowerShell) and save time or cost, you become the hero. Cost and time savings are always welcome; however, the potential impact is far greater when IT is driving your revenue, as in the new model.
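
To make that concrete, here is a minimal, hypothetical sketch of the kind of automation I am thinking about. The CSV file, column names and OU path are made up for the example; the point is simply that a repetitive manual process becomes a few lines of PowerShell:

```powershell
# Hypothetical example: replacing a manual "create accounts from a spreadsheet" task.
# Requires the ActiveDirectory module; the CSV layout and OU path are made up for this sketch.
Import-Csv -Path 'C:\HR\newhires.csv' | ForEach-Object {
    New-ADUser -Name "$($_.FirstName) $($_.LastName)" `
               -SamAccountName $_.SamAccountName `
               -Department $_.Department `
               -Path 'OU=Users,DC=contoso,DC=com' `
               -Enabled $false   # created disabled; enable once a password has been set
}
```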



The new model for IT

In the new world, everything is about processes, data and applications. In other words, algorithms. Everything is moving and changing at a higher speed than we have ever experienced before. Silos probably still exist; however, they are interconnected and data-aware. Your CRM application will have access to, and understand, other applications and their data structures. It will empower your employees and provide you with just-in-time insights. With the new Azure PowerApps and Flow applications, which implement the CDM (Common Data Model), you have this available today as a preview service. Throw Azure Functions into the picture, and you have a pretty robust and extendable model which is highly customizable and scalable.
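
As a rough sketch of how such a building block might look, here is a minimal HTTP-triggered function using the experimental PowerShell support in Azure Functions at the time of writing. The order fields and the "business logic" are invented for the example:

```powershell
# run.ps1 - minimal HTTP-triggered Azure Function (experimental PowerShell support).
# $req and $res are file paths the runtime provides for the request body and the response.
$body = Get-Content $req -Raw | ConvertFrom-Json

# Pretend business logic: enrich an order record before it is handed on to another app/entity.
$result = @{
    OrderId   = $body.OrderId
    Status    = 'Validated'
    Timestamp = (Get-Date).ToString('o')
}

$result | ConvertTo-Json | Out-File -Encoding ascii -FilePath $res
```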

In addition, Azure has implemented predictive analytics and machine learning (ML) in its different APIs, like Storage, Azure SQL, Hadoop and so on. Microsoft is enabling ML for the masses by implementing it across its datacenters and in the Azure model. Your developer is no longer responsible for implementing intelligence in your application; he consumes predictive data from the Azure machine learning APIs, for instance through the integration with the Storage API. You no longer consider IT a cost-center, but a business enabler that helps you increase revenue by analyzing big data through algorithms that are constantly updated to provide perfect information just in time. Theoretically possible, however immensely difficult to implement in practice if you are not in Azure.
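
To illustrate what "consuming predictive data" can look like in practice, here is a hedged sketch of a client calling a published Azure ML web service from PowerShell. The endpoint URI, API key and input columns are placeholders you would replace with your own:

```powershell
# Sketch: consuming predictions from a published Azure ML web service.
# Endpoint, key and input columns are placeholders for the example.
$endpoint = 'https://ussouthcentral.services.azureml.net/workspaces/<id>/services/<id>/execute?api-version=2.0'
$apiKey   = '<your-api-key>'

$payload = @{
    Inputs = @{
        input1 = @{
            ColumnNames = @('CustomerId', 'LastPurchaseDays', 'TotalSpend')
            Values      = @(, @('42', '12', '1899'))
        }
    }
    GlobalParameters = @{}
} | ConvertTo-Json -Depth 5

$response = Invoke-RestMethod -Uri $endpoint -Method Post -Body $payload `
                -ContentType 'application/json' `
                -Headers @{ Authorization = "Bearer $apiKey" }

$response.Results
```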



What do you need?



Speed and agility: If you have a clear understanding of your needs, your market and your competitors, why not move as fast and agilely as you can? If you can change faster than your competitors, you have an advantage and a head start. Let me illustrate with an example: you have probably heard about robot-trading in the stock market? The robots move very fast because the first person/robot that receives and understands a specific piece of market information is the winning party and walks away with the profits. In our business case, it is the same thing. Rapid changes to your algorithms and IT systems, so that they understand the business and receive correct information just in time, are essential to becoming the leader and increasing profits.

Scale: Your IT system needs to be able to scale, up and down. You should not have to worry about it, as the cloud does this for you within the limits you have defined. The cloud empowers businesses of all sizes to use scaling technology that was previously the privilege of large enterprises with expensive dedicated appliances. Committing to services and applications that handle scaling is key in the new world. Relying on old legacy applications and services will prevent you from becoming a new force in your market. Startups in your market will become your new IT system performance benchmark, and they probably do not consider legacy systems a match for their agile needs.

Knowledge – Close the gap: The adoption of cloud resources and the hybrid cloud is just the beginning of the disruptive change that is here. Hybrid cloud is just a stepping stone towards the connected cloud with unlimited resources at your fingertips. That does not imply that private clouds will not exist. They just need to be connected to the public cloud and empower it by bringing some added value. Otherwise, if a private cloud is not connected, it will be a relic and an edge case for very special circumstances. In this scenario, knowledge will be important. New features and services are launched on an almost weekly basis. Products migrate from private preview, to public preview and finally to general availability in a matter of months. If you do not take advantage, someone else will, perhaps your competitors.

New People and Organization 2.0: Best case scenario, you need a huge amount of training and design work. If ordering a new web server or virtual machine takes longer than the time needed to create and deploy it automatically, trust me, you have to do something. Your organization is already changing, perhaps you just have not noticed it yet. Ever heard about Shadow IT, the evil from within? If it is not knocking on your door, it is because it is already inside. Shadow IT is a real problem that you need to take seriously. In the emerging world, people want things yesterday, like always. The problem is that if you do not deliver, someone else can, and asking for forgiveness beats asking for permission 9 times out of 10, especially if it yielded a positive result. Rules, policies and guidelines are nice; however, immediate results are king.

DevOps is a "must": The new world relies on DevOps. DevOps is a merger between a developer and an IT-Pro, where you bring the knowledge of both parties together and apply that knowledge to your business and culture in a series of new processes. DevOps is not automation; however, automation is a key part of DevOps.
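
As an example of what that automation can look like in practice, here is a minimal Pester sketch where infrastructure expectations are expressed as tests and run on every change. The service name and port are just examples:

```powershell
# Minimal Pester sketch: infrastructure expectations as tests, run from a build/release pipeline.
Describe 'Web front-end server' {

    It 'has the W3SVC service running' {
        (Get-Service -Name 'W3SVC').Status | Should Be 'Running'
    }

    It 'answers on port 443' {
        (Test-NetConnection -ComputerName 'localhost' -Port 443).TcpTestSucceeded |
            Should Be $true
    }
}

# Invoke-Pester   # runs the tests above
```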

Security: You do know that hackers target IT-Pros, since they normally have access to everything? The tools to handle this are available and have been for quite some time now. Microsoft Identity Manager comes with PAM (Privileged Access Management), which audits privileged access with time constraints. When your privileged access token expires, your access is revoked. The PowerShell team has created a toolkit called Just Enough Administration (JEA), which is very similar to the Identity Manager solution. Both solutions should be designed with a "break the glass" option for the time when you really don't care about security but need to fix the issue. If you break the glass, all kinds of things happen, and you should expect to face some sort of hearing where you have to justify the action, which is a good thing.
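
To give you a feel for JEA, here is a minimal sketch of a constrained endpoint. The role name, cmdlet list, group name and paths are assumptions made for the example:

```powershell
# 1. A role capability that only exposes a handful of cmdlets. The .psrc file must live in a
#    RoleCapabilities folder inside a module on the target machine (module name is an example).
New-PSRoleCapabilityFile -Path 'C:\Program Files\WindowsPowerShell\Modules\DnsJea\RoleCapabilities\DnsAdmin.psrc' `
    -VisibleCmdlets 'Restart-Service', 'Get-DnsServerZone'

# 2. A session configuration that maps an AD group to that role and runs under a
#    temporary virtual account instead of the operator's own token.
New-PSSessionConfigurationFile -Path '.\DnsAdmin.pssc' `
    -SessionType RestrictedRemoteServer `
    -RunAsVirtualAccount `
    -RoleDefinitions @{ 'CONTOSO\DnsOperators' = @{ RoleCapabilities = 'DnsAdmin' } }

# 3. Register the endpoint; operators then connect with
#    Enter-PSSession -ComputerName dns01 -ConfigurationName DnsAdmin
Register-PSSessionConfiguration -Name 'DnsAdmin' -Path '.\DnsAdmin.pssc'
```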

With Windows Server 2016, a new Hyper-V feature was launched giving us shielded VMs. With shielded VMs, the tenant of a shared resource owns the VM completely. The entity responsible for the platform it is running on has the ability to manage it to a certain degree (like start, stop and make a backup). The backup of a shielded VM is encrypted, if you were wondering.

Last but not least, security starts at the operating system level. In general, reducing the attack surface is regarded as a first line of defense. Windows Server 2016 Nano Server is the new operating system for the cloud and will change the way you work with and handle datacenter workloads. Nano Server has a tiny footprint and a small attack surface and is blazingly fast, which makes it a perfect match for a fast-moving and agile business.
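
If you want to try it, here is a hedged sketch of building a Nano Server image with the NanoServerImageGenerator module that ships on the Windows Server 2016 media. Paths, package, edition and computer name are examples only:

```powershell
# Sketch: building a Nano Server guest image from the Windows Server 2016 media.
# All paths and names below are examples - adjust to your media and environment.
Import-Module 'D:\NanoServer\NanoServerImageGenerator\NanoServerImageGenerator.psd1'

New-NanoServerImage -MediaPath 'D:\' `
    -BasePath     'C:\NanoBase' `
    -TargetPath   'C:\NanoVMs\web01.vhdx' `
    -ComputerName 'web01' `
    -DeploymentType Guest `
    -Edition Standard `
    -Package 'Microsoft-NanoServer-IIS-Package' `
    -AdministratorPassword (Read-Host -AsSecureString 'Admin password')
```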

Help – Private cloud or hybrid cloud: Even with a new organization and knowledge, it is highly likely that you will need some consultancy. According to Gartner, 95% of all attempts to create a private cloud fail or fail to yield the expected outcome. Building and implementing a private cloud is very hard, and you should be very confident in your organization's abilities before you embark on such a journey. Microsoft is the only public cloud provider that will provide you with a turnkey solution to run your hybrid cloud. If you have not heard about Microsoft AzureStack, you should probably read up on it. Basically, it is Azure wrapped up in a hyper-converged, ready-to-deploy solution for your datacenter, delivered by OEM vendors like Dell, Lenovo, HP et al. New features introduced in Azure will most likely migrate to AzureStack, ready for use in your hybrid cloud.

AzureStack is targeted for release sometime in mid-2017 or later that year. That is almost a year away. The good thing is that AzureStack is based upon Azure. It has the same underlying technology that powers Azure, like the portal and the Azure Resource Manager (ARM). Microsoft is delivering a consistent experience across the public and hybrid cloud with the ARM technology. To prepare yourself for AzureStack, you should invest time and effort into learning Azure, and that knowledge will empower you if you decide to implement AzureStack next year.
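
To show what that consistency means in practice, here is a sketch where the same ARM template is deployed with the same cmdlets and only the login target changes. Resource group, location and template paths are placeholders:

```powershell
# Sketch: one ARM template, two clouds. Only the login target differs.
Login-AzureRmAccount                                          # public Azure
# Login-AzureRmAccount -EnvironmentName 'AzureStackUser'      # AzureStack, once that environment is registered

New-AzureRmResourceGroup -Name 'rg-webapp' -Location 'westeurope'

New-AzureRmResourceGroupDeployment -ResourceGroupName 'rg-webapp' `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterFile '.\azuredeploy.parameters.json'
```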




All in – or not

Do you need to go all in on the private cloud, or should you just integrate with the public cloud? It depends on your organization and your business needs. One thing is for certain: you probably have to do something. Implementing your own version of the ready-to-consume features of the public cloud in your own private datacenter is not an option you should consider. It would require a tremendous effort, tie down your resources and, in effect, make you static. You need to infuse your business and culture with DevOps and business strategy. There are some really smart people out there who can help you with that, and like everything else, it is an ongoing process that requires your constant attention.

The change is here. How will you empower your organization and become the new star? I am happy to discuss opportunities if you reach out by sending me an email.


Cheers

Tore

