Recently one of my workmates sent an email round the development community saying that:
If your Visual Studio is taking up too much memory, you can run an empty Visual Studio macro to reduce its footprint.
Some evidence was provided via Task Manager, which showed VS using over 1 GB of memory before the macro was run and only a few megabytes afterwards.
You may call me a sceptic, but I immediately cried foul, thinking it impossible. I decided to try it myself just to see if it was a complete lie.
Before
After
Wow, devenv.exe has had its private memory (which is the value reported by Task Manager) reduced from 1.3 GB to a mere 26 MB. So my workmate wasn't completely wrong. But a look at some of the other values shows the results are less promising. So I wanted to know exactly what each of those memory columns means. Stack Overflow had an answer; the below is copied from here:
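For what it's worth, my guess at the mechanism: running any macro seems to make VS trim its own working set, which is something any Windows process can do to itself through the Win32 API. A minimal sketch of that operation, assuming Windows and the `EmptyWorkingSet` call from psapi.dll (the function name and guard below are mine, not anything from the original email):

```python
import ctypes
import sys

def trim_working_set():
    """Ask Windows to page out as much of this process's working set as it can.

    This mirrors what the empty-macro trick appears to trigger in VS: pages
    are moved out of physical memory, but the virtual memory stays committed.
    Returns True on success, False on failure or on non-Windows platforms.
    """
    if sys.platform != "win32":
        return False  # Win32-only API; nothing to do elsewhere
    psapi = ctypes.WinDLL("psapi")
    kernel32 = ctypes.WinDLL("kernel32")
    # EmptyWorkingSet(HANDLE) removes as many pages as possible from the
    # working set of the given process; here, our own process.
    return bool(psapi.EmptyWorkingSet(kernel32.GetCurrentProcess()))
```

Note that this only evicts pages from RAM; it frees nothing, which is exactly why the other Task Manager columns tell a different story.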
Working set:
Working set is the subset of virtual pages that are resident in physical memory only; this will be a partial amount of pages from that process.
Private working set:
The private working set is the amount of memory used by a process that cannot be shared among other processes.
Commit size:
Amount of virtual memory that is reserved for use by a process.
And at microsoft.com you can find more details about other memory types.
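If you want to check those three numbers for your own process rather than trusting Task Manager, you can query them with `GetProcessMemoryInfo`. A rough sketch, again assuming Windows; the struct layout follows `PROCESS_MEMORY_COUNTERS_EX` from psapi.h, and the helper simply returns None on other platforms:

```python
import ctypes
import sys

class PROCESS_MEMORY_COUNTERS_EX(ctypes.Structure):
    # Field layout of PROCESS_MEMORY_COUNTERS_EX from psapi.h.
    _fields_ = [
        ("cb", ctypes.c_ulong),
        ("PageFaultCount", ctypes.c_ulong),
        ("PeakWorkingSetSize", ctypes.c_size_t),
        ("WorkingSetSize", ctypes.c_size_t),          # working set
        ("QuotaPeakPagedPoolUsage", ctypes.c_size_t),
        ("QuotaPagedPoolUsage", ctypes.c_size_t),
        ("QuotaPeakNonPagedPoolUsage", ctypes.c_size_t),
        ("QuotaNonPagedPoolUsage", ctypes.c_size_t),
        ("PagefileUsage", ctypes.c_size_t),           # commit size
        ("PeakPagefileUsage", ctypes.c_size_t),
        ("PrivateUsage", ctypes.c_size_t),            # private bytes
    ]

def memory_counters():
    """Return (working set, private usage, commit size) in bytes, or None off Windows."""
    if sys.platform != "win32":
        return None
    psapi = ctypes.WinDLL("psapi")
    kernel32 = ctypes.WinDLL("kernel32")
    pmc = PROCESS_MEMORY_COUNTERS_EX()
    pmc.cb = ctypes.sizeof(pmc)
    ok = psapi.GetProcessMemoryInfo(
        kernel32.GetCurrentProcess(), ctypes.byref(pmc), pmc.cb)
    if not ok:
        return None
    return (pmc.WorkingSetSize, pmc.PrivateUsage, pmc.PagefileUsage)
```

Running this before and after a working-set trim is an easy way to see the working set collapse while the commit size barely moves.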
So it looks like the working set and private working set have been cleared down, but all of VS's process data is still committed in virtual memory. It's my understanding that you are probably making VS slower: when VS next needs some of that data, there is a higher chance of it hitting the pagefile. If you're low on memory and need to use other applications, this would have been a good idea, were it not for the fact that Windows will probably do this for you automatically anyway.
Is this a myth busted or have I misunderstood something?