Microsoft made a massive announcement at the Windows Developer Day keynote regarding the future of application packaging and installation for Windows 10. Microsoft has a deep history with Windows application installers, and the announcement revealed that they have finally worked out a way for software developers to package legacy and modern Windows applications for the Windows 10 platform with a single installer.

The new package specification, named MSIX, provides a unified packaging format for Win32, Desktop Bridges, and Universal Windows Platform (UWP) applications. MSIX was built from the ground up to deliver complete containerization for Windows 10 applications. The format utilizes the AppX application framework and provides a path to deploy all Windows 10 applications through the Windows Store.

Containers are an intermediate layer that sits between a virtualized operating system and an application, where the container ensures strict isolation between the application and the operating system. It is important not to confuse application virtualization with application containers: the former is mostly a redirection technology, whereas the latter keeps the application isolated from the operating system and other applications.

While the announcement was music to my ears, I sympathize with many of my customers who are struggling to understand the advantages of MSIX for modern application development and why this is a (fingers crossed) new chapter in the modern management story. In this post, I will walk you through the history of Windows application deployment and explain why the MSIX format has the potential to truly unify how Windows applications are packaged and deployed.

In the Beginning, It Was Messy

A software developer would write an application to run on Windows and then compile it, but the installer itself was a separate deliverable that involved a separate development effort. It was essentially the wild west of software installation, and most software vendors had a narrow view of application requirements: all they cared about was something that could install their application, and best practices were mostly a black art that many ignored out of ignorance.

The first few generations of application installers had limited logic and were not very well written, which resulted in poorly installed applications. The installers would blindly overwrite shared system files (e.g., DLLs) without checking file versions. The net result was that application installers continuously overwrote shared system files with older versions (i.e., the versions they were packaged with), causing runtime errors in other applications. This problem became known as “DLL Hell,” and it was part and parcel of installing applications on Windows in the early days.

The need for more sophisticated installers fueled the emergence of application repackaging, where we could take vendor-supplied application installers and make them deployable through new management technologies such as Systems Management Server (SMS).

The point worth noting is that until the advent of deployable scripted installers, a technician or user was physically required to install an application on the user’s machine. With scripted installers, we were finally able to install an application silently through SMS without human interaction.

Repackaging tools were built to monitor and record the changes made to the system by the software installers and generate an automated installer for mass deployment. Those working on repackaging tools hoped that by monitoring changes to the system, they could eliminate the cycle of DLL Hell introduced by the installation process. As smart as it sounded, the technology had shortcomings.

On the upside, repackaging tools were good at capturing the state of system settings when applications were installed. Unfortunately, depending on which repackaging approach was employed (clean machine vs. corporate Windows image), the capture utility could only make a best guess based on the system changes it was privy to.

Those of us who were early adopters first started to repackage against the corporate build of the operating system. Very quickly, we noticed that non-trivial changes to the corporate build (e.g., removing an application) left many of the repackaged installers unable to compensate for files and registry entries that no longer existed.

The situation was made worse by the fact that some of the logic in vendors’ installers could not be captured by the repackaging tools; for example, the capture mechanism could not record API calls or any action more advanced than copying files or setting registry values.

The largest challenge for those creating repackaged software installers was identifying which system changes were relevant. More often than not, they had limited real-world experience and had a hard time discerning whether what was captured was a necessary part of the software installation or just background noise from the operating system or another application.

As we now know, each patch, each new version of Windows, and plenty of third-party applications all changed the user’s machine, in sometimes subtle and sometimes drastic ways. Over time, the complexity of installing applications in this environment eclipsed the capabilities of the installers and inevitably led to operating system instability, which became so ubiquitous that it earned the name Winrot.

The evolution of silent installers led to advancements that reduced user downtime and the effort needed to deploy software in large environments; however, the risk of doing harm remained. Consequently, system administrators needed heaps of experience and skill to ensure that they didn’t make a mess of the environment.

 

The Introduction of the Windows Installer

The next big push towards improving installer capabilities and performance came from the big kahuna, Microsoft, who created and launched the Windows Installer (MSI) format. The MSI package format should not be confused with a message-signaled interrupt, which shares the acronym. Once again, Microsoft’s naming conventions appear to involve a magic 8 ball.

At any rate, the goal of the MSI format was to provide a standardized framework around the logic of the software installer, one better able to negotiate the changes the installer needed to make to the operating system. The format was used by software developers and application repackagers alike; however, instead of scripting an installation, the author had to understand the relational database upon which MSI was built.

One of the new features introduced by the MSI format was the ability to modify an installer without having to repackage it by deploying a transform file with it. Typically, transform files were used to apply local configuration settings (e.g., to disable the automatic update feature of a software product).
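To make that concrete, here is a minimal sketch of a silent install with a transform, driven from Python purely for illustration. The package and transform paths are hypothetical; the msiexec switches (/i to install, TRANSFORMS= to apply the customization, /qn for no UI) are standard Windows Installer options.

```python
import subprocess

# Hypothetical paths; any MSI/MST pair is deployed the same way.
msi_path = r"C:\packages\ExampleApp.msi"
mst_path = r"C:\packages\corp-settings.mst"

# /i installs the package, TRANSFORMS applies the customization at
# install time, and /qn suppresses all UI for a silent deployment.
subprocess.run(
    ["msiexec", "/i", msi_path, f"TRANSFORMS={mst_path}", "/qn"],
    check=True,
)
```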

Microsoft believed that the MSI format would be more user-friendly because a wider range of stakeholders could analyze the installation package and understand what the installer required simply by inspecting the underlying database.

While a relational database is certainly more structured than a script file, in practice it takes more work to extract the information you need because custom tools are required, and it is harder to interpret the result of an installation because the majority of changes are negotiated by the Windows Installer service, which is part of the operating system.

Not surprisingly, more specialized tools were needed to grok these MSI installers, and so Windows Installer rolled out custom actions.

Custom actions allowed software developers and application repackagers to bypass most limitations enforced by Windows Installer so they could run scripts and EXEs. Ideally, custom actions wouldn’t be needed, but not everything an application requires can be installed natively by the Windows Installer service. The downside was that this flexibility let lazy or uneducated packagers break all sorts of best practices when it came to software installation.

The team behind the MSI format thought it would gradually eliminate DLL Hell and Winrot and, by extension, significantly improve the stability of both the operating system and the applications installed on the user’s machine.

What Microsoft didn’t take into account was the scope of developing an ecosystem of packaging tools around the MSI technology, nor did they want to build tools for the new standard themselves; they intended to rely on software vendors to build package creation tools for developers and repackaging tools for IT staff.

For a time, Windows Installer was the go-to solution for installing applications on Windows machines. As the industry evolved, the need to deal with the remnants of DLL Hell and Winrot led to the rise of application virtualization.

 

Application Repackaging Goes Virtual!

In the context of application virtualization, applications are considered “virtual” when they aren’t installed through traditional methods, and they don’t have direct access to the file system or system registry. Instead, virtualized applications either access a synthetic file system and registry, or the virtualization layer redirects application data to alternate locations in the file system and registry.

Application virtualization typically provides COM isolation and a virtual file system to prevent problems such as DLL Hell, and the isolation works so well that you can often run several versions of the same application side by side without issues.

For remote desktop session hosts, this was a huge deal because complex environments could be simplified by allowing more applications to co-exist on the same server. Virtual applications removed the need for groups or silos of servers built for specific application sets and reduced the uncertainty of adding a new application to the environment.

Since remote desktop session hosts are shared machines, the focus was on making sure that they were stable (i.e., produced as few errors as possible) and had no performance quirks that the end user would notice.

Microsoft recognized the value that enterprises were getting out of application virtualization and acquired Softricity in 2006 because its SoftGrid product was the best application virtualization technology on the market. The technology had a large user base and quite a bit of documented knowledge about making applications work with it. Once Microsoft introduced some of its security standards to the product and updated many of the features, it was rebranded as Microsoft Application Virtualization, or App-V.

Even for Microsoft’s engineers, application virtualization was a complex and difficult technology to work with, and although it was painful to watch, Microsoft needed to part ways with some of the design choices that SoftGrid brought with it.

Despite starting from a solid foundation, App-V 5 was almost a complete rewrite of the product, which, not surprisingly, introduced bugs and a steep learning curve that had to be ironed out.

App-V 5 relies on native capabilities of the operating system to redirect file and registry data to other parts of the file system and registry, eliminating the need for synthetic subsystems.

In the end, App-V 5 turned out to be a solid technology, and application repackagers were able to handle the majority of use cases. Having said that, I still work with clients who find that 5-20% of their application inventory fails to run properly when virtualized, which means the search for the perfect installer is far from over.

On the flip side, the methods, technologies, and best practices that software developers used to write their software needed to evolve to keep pace with the new hardware, operating systems (and versions), and devices that their customers were beginning to use at work and remotely.

If You Control the Code, You Control the World

Microsoft understood that the shortcomings of the Win32 application architecture meant that it could not evolve to meet the demands of modern computing or Microsoft’s vision for the future.

This need for modernization was especially true for applications using cloud architectures, where the user’s computer may never touch the corporate network. Office 365 took advantage of App-V by integrating the technology into how Office itself is delivered and run, which gave Microsoft a lot of experience in creating and maintaining a complex virtual application. Given the volume of its customer base, Microsoft was beginning to revolutionize the way businesses approached application packaging and deployment. Add in the burgeoning ecosystem of devices that users instinctively began using in all sorts of locations, and software developers were suddenly faced with new requirements their applications had to meet.

A new way of building and installing easy-to-use applications that could automatically adjust to a wide variety of device form factors was needed. Microsoft narrowed the gap with the Universal Windows Platform (UWP) and a “store” concept for Windows application consumption that mimicked the way applications were installed on mobile devices.

They also needed applications that were more autonomous because Windows was already heading down the road of behaving more like Android or iOS, where applications are installed and removed easily and cleanly without affecting the health of the operating system.

The UWP application format is very strict, and many Win32 APIs have no meaningful UWP equivalent. This poses a serious and costly problem for Win32 developers because they have to rewrite significant parts of their code base to take advantage of UWP and distribute their applications through the Windows Store.

In some cases, features had to be removed entirely because there was no way to implement them with the limited set of APIs in the new UWP framework.

Those lucky few who were able to turn to UWP noticed several key benefits:

  1. UWP applications do not need to be repackaged, which effectively eliminates the workload necessary to prepare them for automated installation on the user’s machine.
  2. Self-initiated installation and customization of an application through the Windows Store almost eliminates the need for an IT technician to intervene on behalf of the user, as users already know how an application store works from their smartphones.
  3. The Windows Store offers significant cost savings, as Microsoft hosts the application deployment infrastructure in the cloud. Companies no longer need to maintain a separate software deployment infrastructure with on-premise servers; Windows devices download applications directly from the Store, and the experience can be managed through Windows Store for Business, which allows organizations to curate available applications and track their usage.

Some software vendors are legitimately annoyed with the Windows Store model because Microsoft’s 30 percent cut of each sale is considerable for expensive applications, making it a significant non-technical barrier to adoption.

If you prefer to deploy your applications manually or with standard electronic software distribution tools, that remains an option, as the technology is not exclusive to the Windows Store.

While the consumer Windows Store offers an appropriate experience, Windows Store for Business includes only a basic set of features for managing Windows Store applications and still lacks business features such as role-based access.

Of course, if your organization cannot use the cloud for application delivery, there are ways to deliver UWP applications using more traditional software distribution technologies.
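As a rough sketch of what that looks like, a signed package can be sideloaded with PowerShell’s built-in Add-AppxPackage cmdlet (wrapped in Python here for consistency with the other examples). The package path is hypothetical, and the device must trust the signing certificate and have sideloading enabled.

```python
import subprocess

# Hypothetical package path; the package must be signed with a
# certificate the device trusts, and sideloading must be enabled.
package = r"C:\packages\ExampleApp.appx"

# Add-AppxPackage installs an app package outside the Windows Store.
subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     f"Add-AppxPackage -Path '{package}'"],
    check=True,
)
```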

Given the high cost of entry, the industry was reluctant to migrate applications to UWP. As a result, Microsoft needed to lower the barrier to entry for Win32 applications and began working on Desktop Bridges (codename Project Centennial) to support both Win32 and UWP applications.

Desktop Bridges loosened the restrictions of the UWP API (a step in the right direction) but required access to the application’s source code to work.

This meant that Desktop Bridges was only useful to application developers, as applications were rarely successful when repackaged with the technology. There were UWP repackaging tools, but they were aimed at developers who could modify the application to overcome problems.

The adoption of modern application formats thus came close to solving the problems inherited from Win32 applications. However, application compatibility and insufficient isolation were areas where UWP fell short.

In my opinion, Microsoft needed a way to bring more legacy applications into the Store to make the migration to modern application formats a success story. More importantly, technologies such as S Mode for Windows 10, where only modern applications can run, created a technical need to complete the modern view of the Windows desktop.

 

Put An X On It

That brings us to Microsoft’s announcement of MSIX and the hope for a unified packaging format that lets everyone run their applications on Windows 10. You may be thinking that Microsoft just refactored its MSI package format and added an ‘X’ to sound hip and cool like Apple, but MSIX has nothing to do with the MSI format. Nor does it have anything to do with MSI-X, the PCI 3.0-compliant message-signaled interrupt. Confused?

MSIX is genuinely different from previous software installation technologies for two reasons. First, it is a container-by-design technology targeted at end-user applications. Second, it is useful for developers and repackagers alike. The industry is working toward less repackaging; however, migrating existing applications into a container technology is an enormous and important goal for Microsoft’s 365 vision.
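Based on the existing AppX tooling, I’d expect a package like this to be assembled with the Windows 10 SDK’s makeappx.exe; here is a minimal sketch under that assumption. The SDK path, layout directory, and output name are all hypothetical, and the layout directory would contain the application payload plus an AppxManifest.xml describing the package.

```python
import subprocess

# makeappx.exe ships with the Windows 10 SDK; the exact install path
# varies by SDK version, so this location is an assumption.
makeappx = r"C:\Program Files (x86)\Windows Kits\10\bin\x64\makeappx.exe"

# Pack a layout directory (application files plus AppxManifest.xml)
# into a single package file.
subprocess.run(
    [makeappx, "pack",
     "/d", r"C:\build\ExampleAppLayout",
     "/p", r"C:\build\ExampleApp.msix"],
    check=True,
)
```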

Windows 10 makes use of containers everywhere, from applications to the operating system itself. Containers are a great way to quickly deliver isolated application environments to different systems. Microsoft also uses containers for a wide range of services, primarily within the new security architecture built on top of Hyper-V.

With traditional Windows applications, application state isn’t easy to manage. Containers, on the other hand, are analogous to application virtualization in that application state is maintained inside the container regardless of which machine you deliver the application to. In theory, containers are that isolated, but I expect there will be compromises to make traditional Windows application formats work with MSIX.

Based on what I’ve read, you can use MSIX to deliver Win32 applications via the Windows Store. I hope that is true because it would help improve the modern management story for Windows 10. The most notable sticking point is the relationship between Windows 10 and Intune, because Intune’s mobile device management capabilities offer little support for legacy software installers. Ideally, the Windows Store would be used to deliver applications, but only modern application formats are allowed in the Store.

Aside from traditional Win32 application consumers, I am curious to see if MSIX opens the door to get more applications onto other Windows devices such as Xbox and the augmented reality realm, which further helps the whole Windows Store story.

 

Where MSIX May Take Us

I believe the traditional approach to applications has been a “good enough” journey, but as we focus on the modern desktop, there needs to be a clean break from the past: traditional software installers must give way to modern formats such as MSIX to simplify software deployment, increase security, and isolate applications from the operating system.