Head-to-Head: Parallels Desktop for Mac vs. VMware Fusion
Volume Number: 25 (2009)
Issue Number: 04
Column Tag: Virtualization
How do VMware Fusion and Parallels Desktop for Mac stack up?
by Neil Ticktin, Editor-in-Chief/Publisher
Overview
We won't keep you in suspense. In the majority of the overall averages of our tests, Parallels Desktop is the clear winner, running 14-20% faster than VMware Fusion. The one exception is 32-bit Windows XP on two virtual processors, where VMware Fusion runs about 10% faster than Parallels Desktop.
Here are the overall conclusions, but you should really look at more of the detail to understand what works best for you. (Note: In all cases except battery life, when looking at the graphs in this article, take note that shorter bars are better.)
Figure 1: Overall Results, Parallels Desktop vs. VMware Fusion
For 32-bit Windows OSes, running under a single virtual processor (the default when you create virtual machines in either product, and therefore, the most commonly used configuration), Parallels Desktop runs both XP and Vista 14% faster than VMware Fusion. (Comparing 3 types of VM launch times, compression, transcoding MP3, 7 types of file and network IO, 3 types of application launches, and 3 application performance tests.)
For 32-bit Windows OSes, running under two virtual processors, VMware Fusion runs XP 10% faster than Parallels Desktop, and Parallels Desktop runs Vista 20% faster than VMware Fusion. (Comparing 3 types of VM launch times, compression, transcoding MP3, 3 types of application launches, and 3 application performance tests.)
For 64-bit Windows Vista, running under two virtual processors, Parallels Desktop runs 15% faster than VMware Fusion. (Compares 3 types of VM launch times, compression, and transcoding MP3.)
Another way to look at this is the color-coding on the results matrix. Green cell coloring means Parallels Desktop was faster than VMware Fusion. Blue cell coloring indicates VMware Fusion was faster than Parallels Desktop. Darkest coloring means faster by 10% or more, medium coloring indicates 1-10% difference, and lightest coloring means less than 1% difference. (Note: Not all tests were run on all configurations, hence the empty cells.)
Figure 2: Test Results Matrix with Coloring
(Note: This is not intended to be read, but to give you an overview of results by coloring. See the ftp site for the spreadsheet.)
One thing to note: VMware Fusion was several times slower than Parallels Desktop in the Internet Explorer tests (across the board), so we removed IE from the overall analysis to avoid skewing the overall results. See more on this in the Internet Explorer section below.
The Test Suite and Results
In the sections below, we'll walk you through what we tested, and the results for each. These tests are designed to arm you with information to help you make the best decision for your type of use.
For each set of results, you can see the analysis for each model of computer for XP, and for Vista. If you want to see more detail for multiple processors, 64-bit, or on an individual Mac model, you can review the spreadsheet for those details.
For the launch tests (launching the VM, Windows, and Applications), we had the option of an "Adam" test, and a "Successive" test. Adam tests are when the computer has been completely restarted (hence avoiding caching). Successive tests are repeated tests without restarting the machine in between tests, and can benefit from caching. Both mimic real use situations.
The tests used were selected specifically to give a real-world view of what VMware Fusion and Parallels Desktop are like to run for many users. We eliminated those tests that ran so quickly that we could not produce statistically significant results, or whose differences were imperceptible.
For some of the analysis, we "normalized" results by dividing each result by the fastest result for that test across all the machine configurations. We did this specifically so that we could make comparisons across different groups, and to give you overview results that combine several types of tests and computer models.
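As a minimal sketch of this normalization step (the configuration names and timing values below are hypothetical, for illustration only):

```python
def normalize(results):
    """results maps a machine configuration name to a time in seconds.
    Returns each time divided by the fastest time for the test, so the
    fastest configuration scores 1.0 and larger values mean slower."""
    fastest = min(results.values())
    return {config: t / fastest for config, t in results.items()}

# Hypothetical launch times (seconds) for a single test:
launch_times = {"MacBook/XP": 42.0, "MacBook/Vista": 63.0, "Mac Pro/XP": 35.0}
print(normalize(launch_times))  # Mac Pro/XP -> 1.0, others relative to it
```

Because every normalized value is a ratio to the fastest run, results from tests with very different absolute durations can be averaged together on a common scale.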
Instead of a plain "average" or "mean", overall conclusions are computed using a "geomean", a specific type of average that focuses on the central results and reduces the influence of outliers. Geomean is the same averaging methodology used by SPEC tests, PCMark, Unixbench, and others, and it helps keep minor results from skewing the average. (If you are interested in how it differs from a mean: instead of adding the set of numbers and dividing the sum by the count of numbers in the set, n, the numbers are multiplied together and the nth root of the resulting product is taken.)
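A short sketch of the difference between the two averages (the sample numbers are made up to show the effect of an outlier):

```python
import math

def geomean(values):
    """Geometric mean: multiply the n values and take the nth root.
    Computed here as exp of the mean of the logs, which is the same
    thing but numerically safer for long lists."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# One large outlier pulls the arithmetic mean up far more than the geomean:
times = [1.0, 1.1, 1.2, 10.0]
print(sum(times) / len(times))  # arithmetic mean: 3.325
print(geomean(times))           # geomean: about 1.91
```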
For those interested in the benchmarking methodologies, see the more detailed testing information in Appendix A. For the detailed results of the tests used for the analysis, see Appendix B. Both appendices are available on the MacTech web site.
Launch and CPU Tests
There are three situations in which users commonly launch a virtual machine:
- Launch the virtual machine from "off" mode, including a full Windows boot
- Launch the virtual machine from a suspended state, and resume from suspend (Adam)
- Launch the virtual machine from a suspended state, and resume from suspend (Successive)
For the first test, we started at the Finder and launched the virtualization application, which then immediately launched the virtual machine. The visual feedback is fairly different between Parallels Desktop and VMware Fusion when Windows first starts up. As a result, we focused on timing to the point of actually accomplishing something. In this case, we hovered over the Start button and launched Internet Explorer. The test ended when the home page (a very small locally served page which loaded fast in all environments) was rendered.
The primary difference between the last two types of VM launch test is that the computer is fully rebooted (both the virtual machine as well as Mac OS X) in between the "Adam" tests. The successive tests are launching the virtual machines and restoring them without restarting the Mac in between.
Successive tests benefit from both Mac OS X and possibly virtual machine caching, and are significantly faster. But you will only see this benefit if you are constantly launching and quitting your virtual machine.
We noticed that while doing these tests, results varied wildly even on the same test machines. To be fair, we ran these tests multiple times and took the best result for each product.
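A best-of-N approach like this is easy to sketch; the harness below is a hypothetical illustration, not the tooling used for the article. Taking the minimum discards runs slowed by unrelated background activity:

```python
import time

def time_once(fn):
    """Time a single run of fn in wall-clock seconds."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def best_of(n, fn):
    """Run fn n times and keep the fastest (minimum) time, so a run
    slowed by unrelated background work does not skew the result."""
    return min(time_once(fn) for _ in range(n))
```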
The end result for each of these sets of tests (geomean across Mac models) is:
Figure 3: Windows OS Launch Performance
Clearly, virtual machines configured with more memory took far longer to restore, so if you are going in and out of a VM often, you may want to think about using less RAM, not more. In fact, you should use only as much as you need anyway for the best experience under either virtualized environment. (We suggest 1GB for most people.)
There are two CPU performance tests that are commonly run as part of benchmarks, and we ran them here as well:
- File compression
- Transcoding MP3 audio
As a matter of interest, we used compression instead of decompression because, with today's fast computers, decompression is actually much closer to a file copy than it is to CPU work. Compression requires the system to do a good amount of analysis, and is therefore a better measure of CPU performance.
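The asymmetry is easy to demonstrate. This sketch (using Python's zlib rather than any tool from the actual test suite) times in-memory compression at a high compression level against decompression of the same data:

```python
import time
import zlib

# Roughly 5 MB of repetitive data; real test payloads would differ.
data = b"MacTech benchmark payload " * 200_000

start = time.perf_counter()
compressed = zlib.compress(data, 9)  # level 9: maximum analysis effort
compress_time = time.perf_counter() - start

start = time.perf_counter()
restored = zlib.decompress(compressed)
decompress_time = time.perf_counter() - start

print(f"compress:   {compress_time:.3f}s")
print(f"decompress: {decompress_time:.3f}s")  # typically far less CPU time
```

Decompression is largely a matter of copying bytes back out, while compression must search the input for redundancy, which is why it works the CPU harder.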
Figure 4: Virtual Machine CPU Performance
Application Launch Test
Here, we tested two of the most common applications used in virtualized desktop environments: Microsoft Word and Microsoft Outlook. Most applications, including these, launch very quickly; the worst performance was for Adam launches under Vista.
Similar to the OS launch tests, an Adam launch is one where Windows has been completely rebooted, and then given a few minutes to finish its startup process. A successive launch test is done repeatedly without restarting Windows.
Here are the results:
Figure 5: Windows Application Launch Performance