I'm not sure if this is the right place to post this, but here goes.
I'm using data-analysis software (OriginPro) with large amounts of data (anywhere from 4 to 64 million data points). For the most part it's fine, but a few tasks seem to render the software unresponsive for up to 30 minutes at a time.
I've checked the memory usage via Task Manager, and the maximum the software uses is about 1.75 GB. The maximum RAM used by the whole system was 4.3 GB, which still leaves about 1.6 GB free. I'm not sure if it's an issue with the RAM or something else: CPU usage stays below 40% (with occasional spikes), and the software's disk usage is next to nothing.
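For scale, here's a quick back-of-envelope check on the raw data size (a sketch only, assuming each data point is an 8-byte double, which is typical but not confirmed for OriginPro's internal storage):

```python
# Rough memory footprint of the raw data, assuming 8-byte doubles.
points = 64_000_000          # upper end of the stated range
bytes_per_point = 8          # double-precision float (assumption)
gib = points * bytes_per_point / 2**30
print(f"{gib:.2f} GiB")      # about 0.48 GiB
```

So even the largest dataset is only about half a gigabyte of raw numbers, which is consistent with the ~1.75 GB the process was seen using once working copies and overhead are included.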
My system info:
OS: Windows 10 Home, 64-bit
Processor: Intel(R) Core(TM) i5-3317U CPU @ 1.70GHz
RAM: 6.00 GB
Storage: 116 GB free of 585 GB (I only have the C: drive)
Will increasing the pagefile size make any difference? I wouldn't expect it to, since I'm not even using all of my RAM. Any insight into the matter would be much appreciated.
Well, how many cores does that CPU have? I'm fairly sure the software is designed for parallel computing across multiple cores, so on a chip with only a couple of cores the worker threads can't all run at once and the heavy tasks get processed slowly. A chip that runs at 1.70 GHz is probably at least 8 years old.
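For reference, the i5-3317U is a dual-core chip with Hyper-Threading (2 cores, 4 logical processors), so heavily threaded work has limited headroom. If you want to confirm what your OS actually sees, a minimal check using only the Python standard library:

```python
import os

# Number of logical processors the OS reports
# (physical cores x hyper-threads per core).
logical = os.cpu_count()
print(f"Logical processors: {logical}")
```

Note that `os.cpu_count()` reports logical processors, not physical cores, so on this chip it would report 4 even though only 2 threads can execute truly in parallel.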