We have a .NET (VS 2012) 64-bit application, running on Windows Server 2012, that loads a large amount of data into memory and works with it for the next 24 - 48 hours. By design we prefer to load the data upfront by connecting to the data source. The net memory footprint of the process goes up to 7-8 GB, because we replicate the data for 4 independent threads, which represent the parallel clients, so roughly 2 GB of data per client.
The hardware we are using has 128 GB of RAM and 4 CPUs (24 cores in total).
I was wondering whether the memory footprint will have an impact on the overall performance of the process, such as speed of execution, processor utilization, etc.
Even with that much RAM, and with net memory usage never exceeding 60%, will process pages still be swapped out to disk (and so slow us down) when the process runs continuously for 48 hours? Are there server settings, or .NET/C# APIs, that would help us deal with performance issues caused by the memory footprint? Can we explicitly tell the system that our process needs a lot of memory?
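To make that last question concrete, this is roughly the direction I had in mind (server GC enabled in app.config, plus raising the working-set limits and the GC latency mode from code). The sizes are illustrative only, and I am not sure whether any of this actually helps in our scenario or requires extra privileges:

```
using System;
using System.Diagnostics;
using System.Runtime;

class MemoryTuningSketch
{
    static void Main()
    {
        // Hint to the OS that this process should keep a large working set resident.
        // Raising the working-set limits may require elevated privileges; the
        // 4 GB / 8 GB values below are only placeholders for our scenario.
        Process current = Process.GetCurrentProcess();
        current.MaxWorkingSet = (IntPtr)(8L * 1024 * 1024 * 1024);
        current.MinWorkingSet = (IntPtr)(4L * 1024 * 1024 * 1024);

        // Reduce blocking GC pauses for a long-running, memory-heavy process.
        // Server GC itself would be turned on in app.config via <gcServer enabled="true"/>.
        GCSettings.LatencyMode = GCLatencyMode.SustainedLowLatency;

        Console.WriteLine("Working set now: {0:N0} bytes", current.WorkingSet64);
    }
}
```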
Let me also add that each parallel client spawns multiple ThreadPool threads to complete sub-tasks. The average CPU usage of the process is between 50 - 60%. Given the hardware we have, does that mean we should take steps to identify disk I/O bottlenecks that might be limiting performance, or would a server setting help? (A sketch of the only disk check I have tried so far follows.)
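So far the only thing I have done on the disk side is to poll the standard PhysicalDisk queue-length counter while the process runs, along these lines (the "high queue length means disk bottleneck" threshold is just a rule of thumb I have read, not something I have verified for our setup):

```
using System;
using System.Diagnostics;
using System.Threading;

class DiskQueueProbe
{
    static void Main()
    {
        // Standard Windows PhysicalDisk counter; "_Total" aggregates all disks.
        using (var diskQueue = new PerformanceCounter(
            "PhysicalDisk", "Avg. Disk Queue Length", "_Total"))
        {
            diskQueue.NextValue();          // first sample is always 0, prime the counter
            for (int i = 0; i < 30; i++)    // sample for ~30 seconds
            {
                Thread.Sleep(1000);
                float queueLength = diskQueue.NextValue();
                // Rule of thumb: sustained values well above the number of
                // physical disks suggest the disk, not the CPU, is the bottleneck.
                Console.WriteLine("Avg. Disk Queue Length: {0:F2}", queueLength);
            }
        }
    }
}
```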