Not a programming question, but I'm hoping some of you graphics programmers might know an answer.

I use some 3rd-party software that draws line graphs, adding roughly one series per minute as it reads in live data.
The software starts to slow down after about 50 series (which takes about an hour) and then gets progressively worse.
GPU usage is under 1%, CPU is maybe 5%, memory usage creeps up but sits around 60 MB on a 4 GB machine, and the data produced is under 100 KB, so the disk is quiet.
In an effort to reduce this slowdown I remoted into the machine using MSTSC, which cut the slowdown by about 50%, but I can't work out why.
I'm guessing the machine knows it is being remoted into, so when the 3rd-party software asks the graphics drivers to draw the graphs, the local drivers skip the work because no one is going to see it locally. At that point I reran the software while watching a GPU monitor, and GPU usage was still under 1%.
So, as it stands, remoting into the machine that runs the software can save an hour or so while it collects and graphs the data, compared with being logged in locally.

Any suggestions gratefully received.