
Microsoft made a massive announcement at the Windows Developer Day keynote regarding the future of application packaging and installation for Windows 10. Microsoft has a deep history with Windows application installers, and the announcement revealed that they have finally worked out a way for software developers to package legacy and modern Windows applications for the Windows 10 platform with a single installer.

The new package specification, named MSIX, provides a unified packaging format for Win32, Desktop Bridge, and Universal Windows Platform (UWP) applications. MSIX was built from the ground up to deliver complete containerization for Windows 10 applications. The format builds on the AppX application framework and provides a path to deploy all Windows 10 applications to the Windows Store.

Containers are an intermediate layer that sits between a virtualized operating system and an application; the container ensures strict isolation between the application and the operating system. It is important not to confuse application virtualization with application containers: the former is mostly a redirection technology, whereas the latter keeps the application isolated from the operating system and from other applications.

While the announcement was music to my ears, I sympathize with many of my customers who are struggling to understand the advantages of MSIX for modern application development and why this is a (fingers crossed) new chapter in the modern management story. In this post, I will walk you through the history of Windows application deployment and explain why the MSIX format has the potential to truly unify how Windows applications are packaged and deployed.

In the Beginning it was Messy

A software developer would write an application to run on Windows and then compile that application, but the software installer itself was a separate step that required its own development effort. Essentially, it was the Wild West of software installation, and most software vendors had a narrow view when it came to application requirements. All they cared about was something that could install their application, and best practices were mostly a black art that many ignored out of ignorance.

The first few generations of application installers had limited logic and were not very well written, which resulted in poorly installed applications. The installers would blindly overwrite shared system files (e.g., DLLs) without checking file versions. The net result was that application installers continuously overwrote shared system files with older versions (i.e., the version of the file they were packaged with), causing runtime errors in other applications. This problem became known as “DLL Hell,” and it became part and parcel of installing applications on Windows in the early days.

The need for more sophisticated installers fueled the emergence of application repackaging, where we could take vendor-supplied application installers and make them deployable through new management technologies such as Systems Management Server (SMS).

The point worth noting is that until the advent of deployable scripted installers, a technician or user was physically required to install an application on the user’s machine. With scripted installers, we were finally able to perform a silent install of the application through SMS without human interaction.
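
To make that concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of unattended install command an SMS package might run on the user's machine. The installer name and its silent switch are hypothetical, since switches vary by vendor.

```python
import subprocess

# Hedged sketch: run a vendor installer silently, the way an SMS package would,
# with no human interaction. "setup.exe" and "/S" are placeholders; real-world
# silent switches vary by vendor (/S, /s, /quiet, /qn, ...).
result = subprocess.run(["setup.exe", "/S"], capture_output=True, text=True)
print("Installer exit code:", result.returncode)  # 0 conventionally means success
```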

Repackaging tools were built to monitor and record the changes made to the system by the software installers and generate an automated installer for mass deployment. Those working on repackaging tools hoped that by monitoring changes to the system, they could eliminate the cycle of DLL Hell introduced by the installation process. As smart as it sounded, the technology had shortcomings.

On the upside, repackaging tools were good at capturing the state of system settings when applications were installed. Unfortunately, depending on which repackaging approach was employed (clean machine vs. corporate Windows image), the capture utility could only make a best guess based on the system changes it was privy to.

Those of us who were early adopters started out repackaging against the corporate build of the operating system. Very quickly, we noticed that non-trivial changes to the corporate build (e.g., removing an application) left many repackaged installers unable to compensate for files and registry entries that no longer existed.

The situation continued to worsen because some of the logic in vendors’ installers could not be captured by the repackaging tools. For example, the capture mechanism was not able to record API calls or any actions more advanced than copying files or setting registry values on the system.

The largest challenge for those creating repackaged software installers was how to identify relevant system changes. More often than not, they had limited real-world experience and had a hard time discerning if what was captured was a necessary part of the software installation or if it was background noise from the operating system or the activity of another application.

As we now know, each patch, each new version of Windows, and plenty of third-party applications all changed the user’s machine, sometimes in subtle ways and sometimes drastically. Over time, the complexity of installing applications in this environment eclipsed the capabilities of the installers and inevitably led to operating system instability, which became so ubiquitous that it earned the name Winrot.

The evolution of silent installers led to advancements that reduced user downtime and the effort needed to deploy software in large environments; however, the risk of doing harm remained. Consequently, system administrators needed heaps of experience and skill to ensure that they didn’t make a mess of the environment.

The Introduction of the Windows Installer

The next big push towards improving installer capabilities and performance came from the big kahuna, Microsoft, who created and launched the Windows Installer (MSI) format. The MSI package format should not be confused with a Message-signaled interrupt. Once again, Microsoft’s naming conventions appear to involve a magic 8 ball.

At any rate, the goal of the MSI format was to develop a standardized framework around the logic of the software installer, one better able to negotiate the changes the installer needed to make to the operating system. The format was used by software developers and application repackagers alike; however, instead of a script-based installation, the author (whatever their role) had to understand the relational database upon which MSI was built.

One of the new features introduced by the MSI format was the ability to modify an installer without having to repackage it by deploying a transform file with it. Typically, transform files were used to apply local configuration settings (e.g., to disable the automatic update feature of a software product).
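
As a rough illustration (my own, not from the original post), deploying an MSI together with a transform typically boils down to passing the .mst file on the msiexec command line. The file names below are hypothetical; /i, TRANSFORMS= and /qn are standard Windows Installer switches.

```python
import subprocess

# Sketch: install "app.msi" silently while applying a corporate transform that
# carries local configuration (e.g., disabling the product's auto-update).
# File names are hypothetical; the msiexec switches are standard:
#   /i          install the package
#   TRANSFORMS= apply one or more .mst transform files
#   /qn         no user interface (silent)
subprocess.run(
    ["msiexec", "/i", "app.msi", "TRANSFORMS=corp-settings.mst", "/qn"],
    check=True,  # raise if msiexec reports a failure
)
```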

Microsoft believed that the MSI format would be more user-friendly because they thought that a wider range of stakeholders could analyze the software installation package and understand what the installer required by inspecting the underlying database.

While a relational database is certainly more structured than a script file, in practice it takes more work to get the information you need because you have to rely on specialized tools to inspect the package, and it is harder to interpret the result of an installation because the majority of changes are negotiated by the Windows Installer service, which is part of the operating system.

Not surprisingly, more specialized tools were needed to grok these MSI installers, and so Windows Installer rolled out custom actions.

Custom actions allowed software developers and application repackagers to bypass most limitations enforced by Windows Installer so they could run scripts and EXEs. Ideally, custom actions wouldn’t be needed, but not everything an application requires can be installed natively by the Windows Installer service. The downside was that this flexibility to bypass installer limitations allowed lazy or uneducated packagers to break all sorts of best practices when it came to software installation.
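
To give a flavor of the tooling involved, the sketch below (an assumption on my part, not something from the original post) uses the Windows Installer automation interface via pywin32 to read the relational table where custom actions live. "app.msi" is a hypothetical path, and the CustomAction table only exists if the package actually defines custom actions.

```python
import win32com.client  # pywin32: pip install pywin32

# Sketch: open an MSI's relational database read-only and list its custom
# actions via the Windows Installer automation object.
installer = win32com.client.Dispatch("WindowsInstaller.Installer")
db = installer.OpenDatabase("app.msi", 0)  # 0 = read-only

view = db.OpenView("SELECT `Action`, `Type`, `Target` FROM `CustomAction`")
view.Execute(None)

record = view.Fetch()
while record is not None:
    # Columns come back in the order they were selected
    print(record.StringData(1), record.StringData(2), record.StringData(3))
    record = view.Fetch()
```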

The team behind the MSI format thought it would gradually eliminate DLL Hell and Winrot and, by extension, significantly improve the stability of both the operating system and the applications installed on the user’s machine.

What Microsoft didn’t take into account was the scope of developing an ecosystem of packaging tools around the MSI technology, nor did they want to build those tools themselves. They intended to rely on software vendors to build package creation tools for developers and repackaging tools for IT staff.

For a time, Windows Installer was the go-to solution for installing applications on Windows machines. As the industry evolved, the need to deal with the remnants of DLL Hell and Winrot led to the rise of application virtualization.

Application Repackaging Goes Virtual!

In the context of application virtualization, applications are considered “virtual” when they aren’t installed through traditional methods, and they don’t have direct access to the file system or system registry. Instead, virtualized applications either access a synthetic file system and registry, or the virtualization layer redirects application data to alternate locations in the file system and registry.

Application virtualization typically provides COM isolation and a virtual file system to prevent problems such as DLL Hell, and the isolation works so well that you can often run several versions of the same application side by side with no issues.

For remote desktop session hosts, this was a huge deal because complex environments could be simplified by allowing more applications to co-exist on the same server. Virtual applications removed the need for groups or silos of servers built for specific application sets and also reduced the uncertainty of adding a new application to the environment.

Since remote desktop session hosts are shared machines, there was a focus on making sure that they were stable (i.e., as few errors as possible), but also had no performance quirks that the end user would notice.

Microsoft recognized the value that enterprises were getting out of application virtualization and in 2006 acquired Softricity, the maker of SoftGrid, which was the best application virtualization technology on the market at the time. The technology had a large user base and quite a bit of documented knowledge about making applications work with it. Once Microsoft introduced some of its security standards to the product and updated many of its features, it was rebranded as Microsoft Application Virtualization, or App-V.

It was obvious back then that application virtualization was a complex and difficult technology for Microsoft’s engineers to work with, and although it was painful to watch, Microsoft needed to part ways with some of the design choices that SoftGrid brought with it.

Despite starting with a solid foundation, App-V 5 was almost a complete rewrite of the product, which, not surprisingly, introduced bugs and a steep learning curve that needed to be ironed out.

App-V 5 relies on native capabilities of the operating system to redirect file and registry data to other parts of the registry and file system, which eliminated the need for synthetic subsystems.

In the end, App-V 5 turned out to be a solid technology, and the application repackagers were able to handle the majority of use cases. Having said that, I’m still working with clients that encounter 5%-20% of their application inventory failing to run properly in virtualized environments, which means the search for the perfect installer is still far from over.

On the flip side, the methods, technologies, and best practices that software developers used to write their software needed to evolve to keep pace with the new hardware, operating systems (and versions), and devices that their customers were beginning to use at work and remotely.

If You Control the Code You Control the World

Microsoft understood that the shortcomings of the Win32 application architecture meant that it could not evolve to meet the demands of modern computing or Microsoft’s vision for the future.

This need for modernization was especially true for applications using cloud architectures, where the user’s computer may never reside on the corporate network. Office 365 took advantage of App-V by integrating it into the way its version of Office is delivered and run, which gave Microsoft a lot of experience in creating and maintaining a complex virtual application. Due to the volume of its customer base, Microsoft was beginning to revolutionize the way businesses approached application packaging and deployment. Add in the burgeoning ecosystem of devices that users instinctively began using in all sorts of locations, and software developers were suddenly faced with new requirements their applications had to meet.

A new way of building and installing easy-to-use applications that could automatically adjust to a wide variety of device form factors was needed. Microsoft closed the gap with the Universal Windows Platform (UWP) and a “store” concept for Windows application consumption that mimicked the way applications were installed on mobile devices.

They also needed applications that were more autonomous because Windows was already heading down the road of behaving more like Android or iOS, where applications are installed and removed easily and cleanly without affecting the health of the operating system.

The UWP application format is very strict, and much of the Win32 API has no UWP equivalent. This poses a serious and costly problem for Win32 developers because they have to rewrite significant parts of their code base to take advantage of UWP and ship their application through the Windows Store.

In some cases, features had to be removed entirely because there was no way to implement them with the limited set of APIs in the new UWP framework.

Those lucky few who were able to turn to UWP noticed several key benefits:

    1. UWP applications do not need to be repackaged, which effectively eliminates the work required to prepare them for automated installation on the user’s machine.
    2. Self-service installation and customization of an application through the Windows Store almost eliminates the need for an IT technician to intervene on behalf of the user, since users already know how an application store works from their smartphones.
    3. The Windows Store also offers significant cost savings, as Microsoft hosts the application deployment infrastructure in the cloud. Companies no longer need to maintain a separate software deployment infrastructure with on-premises servers; Windows devices download applications directly from the Store.
    4. The experience can be managed through Windows Store for Business, which allows organizations to manage which applications are available and track their usage.

Some software vendors are legitimately annoyed with the Windows Store model because Microsoft’s cut of each sale (30 percent under the Store’s 70/30 revenue split) is considerable for expensive applications – a significant non-technical barrier to adoption.

If you prefer to deploy your applications manually or with standard electronic software distribution tools, that remains an option, as this technology is not exclusive to the Windows store.

While the consumer Windows Store offers an adequate experience, Windows Store for Business includes only a basic set of features for managing Windows Store applications and still lacks business features such as role-based access.

Of course, if your organization cannot use the cloud for application delivery, there are ways to deliver UWP applications using more traditional software distribution technologies.
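
For example (a sketch of my own, not from the original post), a UWP package can be sideloaded outside the Store with PowerShell's Add-AppxPackage cmdlet. The package name below is hypothetical, and the device must permit sideloading and trust the package's signing certificate.

```python
import subprocess

# Sketch: sideload a UWP package (.appx / .appxbundle) without the Store by
# calling PowerShell's Add-AppxPackage. "ContosoApp.appx" is a made-up name.
subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Add-AppxPackage -Path .\\ContosoApp.appx"],
    check=True,
)
```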

Given the high cost of entry, the industry was reluctant to migrate applications to UWP. As a result, Microsoft needed to lower the entry barrier for Win32 applications and began working on Desktop Bridges (codename Project Centennial) to support both Win32 and UWP applications.

Desktop Bridges loosened the UWP API restrictions (a step in the right direction), but it required access to the application’s source code to work.

This meant that Desktop Bridges was only useful for application developers, as applications were rarely successful when repackaged using this technology. There were UWP repackaging tools, but they were aimed at developers who could modify the application to overcome problems.

As a result, the adoption of modern application formats came close to solving the problems inherited from Win32 applications. However, application compatibility and the fact that applications were not isolated enough were areas where UWP fell short.

In our opinion, Microsoft needed a way to bring more legacy applications into the Store to make the migration to modern application formats a success story. More importantly, technologies such as S Mode for Windows 10, where only modern applications can run, created a technical need to complete the modern view of the Windows desktop.


Put An X On It

That brings us to Microsoft’s announcement of MSIX and the hope for a unified packaging format that enables everyone to run applications on Windows 10. You may be thinking that Microsoft just refactored their MSI package format and added an ‘X’ to it to sound hip and cool like Apple, but MSIX has nothing to do with the MSI format. Nor does it have anything to do with MSI-X, the PCI 3.0-compliant message-signaled interrupt. Confused?

MSIX is genuinely different from previous software installation technologies for two reasons. First, it is a container-by-design technology targeted at end-user applications. Second, the technology is useful for developers and repackagers alike. The industry is working toward less repackaging; however, migrating existing applications into a container technology is an enormous and important step toward the Microsoft 365 vision.
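
To give a sense of what an MSIX package physically is: like AppX, it builds on a ZIP-based (OPC) layout with an AppxManifest.xml describing the application, so you can peek inside one with ordinary tooling. The sketch below is illustrative only, and "ContosoApp.msix" is a hypothetical file name.

```python
import zipfile

# Sketch: list the contents of an MSIX package and read its manifest.
# MSIX reuses the ZIP-based AppX layout, so the standard zipfile module works.
with zipfile.ZipFile("ContosoApp.msix") as pkg:
    for name in pkg.namelist()[:10]:   # show the first few entries
        print(name)
    manifest = pkg.read("AppxManifest.xml").decode("utf-8")
    print(manifest[:500])              # identity, capabilities, entry points
```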

Windows 10 makes use of containers at every level, from applications to the operating system itself. Containers are a great way to quickly deliver isolated application environments to different systems. Microsoft also uses containers for a wide range of services, primarily within the new security architecture built on top of Hyper-V.

With traditional Windows applications, the application state isn’t easy to manage. Containers, on the other hand, are analogous to application virtualization where application state is maintained inside the container regardless of what machine you deliver the application to. In theory, containers are that isolated, but I expect there will be compromises to make traditional Windows application formats work with MSIX.

Based on what I’ve read, you can use MSIX to deliver Win32 applications via the Windows Store. I hope that is true because it will help improve the modern management story for Windows 10. The most notable weak spot is the relationship between Windows 10 and Intune, because Intune’s mobile device management capabilities do not have a great deal of support for legacy software installers. Ideally, the Windows Store will be used to deliver applications, but only modern application formats are allowed in the Store.

Aside from traditional Win32 application consumers, I am curious to see if MSIX opens the door to get more applications onto other Windows devices such as Xbox and the augmented reality realm, which further helps the whole Windows Store story.

Where MSIX May Take Us

I believe the traditional approach to applications has been a “good enough” journey, but as we focus on the modern desktop there needs to be a clean break from the past: traditional software installers must give way to modern formats such as MSIX that simplify software deployment, increase security, and isolate applications from the operating system.
