= Estimating job resource requirements =

== Computation ==

The computational size parameters ('''rsc_fpops_est''', '''rsc_fpops_bound''') are expressed as a number of floating-point operations. For example, suppose a job J takes 1 hour to complete on a machine with a Whetstone benchmark of 1 GFLOPS; then the "size" of J is 3.6e12 FLOPs.

To get an initial estimate of job size, run several typical jobs on your own computer, see how long they take, and multiply by the Whetstone score of the computer (to find this, run BOINC on the computer and look at the event log).

Job sizes may change over time, and job sizes on volunteered computers may differ from those you measure on your own computers. To get estimates of job sizes averaged over recent jobs, use the script html/ops/job_times.php (the '''FLOP count statistics''' link on your project's [HtmlOps administrative web interface]).

'''NOTE''': BOINC maintains various statistics, used for runtime estimation and credit, that are normalized by '''rsc_fpops_est'''. If you change this by a lot (more than a factor of 10) you should reset these statistics. To do so, click '''Manage Applications''' in the administrative web interface; click the application name, then '''reset credit statistics for this application'''.

If job sizes vary in a predictable way, you can supply per-job values for '''rsc_fpops_est''' and '''rsc_fpops_bound'''. Doing so gives more accurate runtime estimates and more consistent credit.

== Memory ==

To get an initial estimate, run a typical job on your own computer and monitor its working-set size (using the Windows Task Manager or the Unix "top" command).

Note: the memory size parameters are not per-platform; do this on all supported platforms and take the maximum.

== Disk ==

To get an initial estimate, run a typical job on your own computer and observe its disk usage, including the input files, temp files, and output files, but not the application files.
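The sketch below ties the three initial estimates above (computation, memory, disk) together. It is a minimal, Unix-only illustration, not a BOINC tool: it assumes your application can be run standalone from the command line in a scratch work directory, and that you pass in the Whetstone score you read from the BOINC event log. The script name, its arguments, and the factor of 10 used for '''rsc_fpops_bound''' are arbitrary choices for the example.

{{{
#!python
"""Rough local estimate of BOINC job size parameters (illustrative sketch only)."""
import os
import resource     # Unix only
import subprocess
import sys
import time

def estimate(cmd, workdir, whetstone_gflops, bound_factor=10):
    # Run one typical job to completion and time it.
    start = time.time()
    subprocess.run(cmd, cwd=workdir, check=True)
    elapsed = time.time() - start

    # Job "size" = elapsed seconds * benchmark speed in FLOPs per second.
    rsc_fpops_est = elapsed * whetstone_gflops * 1e9
    # Generous upper limit; 10x is an arbitrary choice for this sketch.
    rsc_fpops_bound = rsc_fpops_est * bound_factor

    # Peak resident set of the child process.
    # On Linux ru_maxrss is in kilobytes; on macOS it is in bytes.
    peak_rss_bytes = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss * 1024

    # Total size of input, temp, and output files left in the work directory.
    # Keep the application binary outside workdir so it is not counted.
    disk_bytes = sum(
        os.path.getsize(os.path.join(root, name))
        for root, _, files in os.walk(workdir)
        for name in files
    )
    return rsc_fpops_est, rsc_fpops_bound, peak_rss_bytes, disk_bytes

if __name__ == "__main__":
    # Usage (hypothetical): python estimate_job.py ./my_app 3.2
    est, bound, rss, disk = estimate([sys.argv[1]], ".", float(sys.argv[2]))
    print("rsc_fpops_est   %.3e" % est)
    print("rsc_fpops_bound %.3e" % bound)
    print("peak memory     %d bytes" % rss)
    print("disk usage      %d bytes" % disk)
}}}

On Windows, read the memory figure from the Task Manager instead; the resource module is not available there. As noted above, repeat the memory measurement on all supported platforms and take the maximum.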