This is part of the function I created. It assigns int values to all of the variables just before freeing them, and it repeats this a large number of times.
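Since the function body itself isn't shown, here is a minimal sketch of what that description could look like, assuming the variables are heap-allocated ints: each slot gets an int value assigned just before the whole batch is freed. `NUM_ALLOCS` and the return value are assumptions added so the sketch is checkable; the original count is not given.

```c
#include <stdlib.h>

#define NUM_ALLOCS 1000   /* made-up batch size; the post doesn't say */

int memory(void)
{
    int *ptrs[NUM_ALLOCS];
    int assigned = 0;

    for (int i = 0; i < NUM_ALLOCS; i++) {
        ptrs[i] = malloc(sizeof(int));
        if (ptrs[i] != NULL) {
            *ptrs[i] = i;   /* assign an int value just before freeing */
            assigned++;
        }
    }
    for (int i = 0; i < NUM_ALLOCS; i++)
        free(ptrs[i]);      /* free(NULL) is a safe no-op */

    return assigned;        /* added only so the sketch can be verified */
}
```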
My main function analyzes the results of calling this simpler function many times. I tracked how long the function takes to run and printed it off after each call. I also calculated the average run time, then used half of that average as a threshold to flag abnormal values, i.e. times more than half the average above or below the average. For my analysis I ran the function ten thousand times to make sure the sample size was large enough. What I found is that, regardless of sample size, the first two calls to my memory() function are the slowest, usually between 80 and 100 milliseconds. The time then drops to somewhere around 30 milliseconds, where it stays until about 6,800 iterations, when it abruptly jumps up to about 60 milliseconds.
Thinking about this, my theory is that garbage collection happens when the file is first run, clearing everything out and making the first few iterations take longer. After that the function becomes far more efficient, before getting bogged down later on as the memory fills up.
Here is the output with the memory() function called only 10 times so it's easier to read. The first two iterations are always the slowest, and you can see that they are the only abnormal values at the bottom of the output.