I use R a lot, and installed 96 GB of memory on a server board. SAS is different: it is fine as long as you have a large hard drive. Statistical programs are not as CPU-intensive as PC games, but they are I/O-intensive. Make sure you have enough RAM and a fast hard drive. SSDs have proven to be more fragile than traditional HDDs, but one can still be used for the WORK folder in SAS.
[Quoting A***8's post above]
I guess it really depends on the nature of your work. R, like most statistical software, is not optimized for parallel computation and cannot fully utilize a multi-core, multi-CPU machine or a cluster (which is why CPU usage sits at about 25% on a quad-core computer). You would see a big difference if you ran some well-designed multi-threaded programs (close to 100% CPU usage most of the time). But again, as many would argue, statistical "programmers" are not programmers; they are data analysts. So it is good to have more memory/disk space for data if you are not interested in parallel computation.
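That said, R can use the extra cores for embarrassingly parallel work. A minimal sketch with the base `parallel` package (bundled with R; `mclapply` forks workers on Unix and falls back to serial on Windows, where `parLapply` is the alternative) — the bootstrap task here is just an illustration:

```r
library(parallel)

## CPU-bound toy task: bootstrap the mean of a sample
x <- rnorm(1e4)
boot_mean <- function(i) mean(sample(x, length(x), replace = TRUE))

## Serial: one core busy, ~25% total CPU on a quad-core box
serial <- vapply(1:200, boot_mean, numeric(1))

## Parallel: one forked worker per spare core (Unix only)
n_cores <- max(1L, detectCores() - 1L)
par_res <- unlist(mclapply(1:200, boot_mean, mc.cores = n_cores))

length(par_res)   # 200 bootstrap estimates
```

Note that forking only pays off when each task is expensive enough to amortize the worker start-up cost; for tiny per-task work the serial loop can actually be faster.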
z*t
Reply #22
What's so tragic about it? You get fresh work to do every day; how fun is that. Being faculty is a job that makes people happy.
k*5
Reply #23
Priority, high to low:
1. If I/O is the bottleneck, increase I/O bandwidth first.
2. If memory bandwidth is the bottleneck, make sure to use NUMA (may require further tuning).
3. Otherwise, refer to BenW's suggestion.
If 1 and/or 2 apply, definitely go for a cluster + MPI, using an MPI implementation of R.