****************************** Solution B ******************************

Tprime,lab:      raw (real) time for a Linux lab computer to execute the prime-testing program below:      real 0m0.003s
Tgauss,lab:      raw (real) time for a Linux lab computer to execute the Gaussian integral program below:  real 0m3.287s
Tprime,grendel:  raw (real) time for grendel to execute the prime-testing program below:                   real 0m0.001s
Tgauss,grendel:  raw (real) time for grendel to execute the Gaussian integral program below:               real 0m12.239s
Tprime,ozark:    raw (real) time for ozark to execute the prime-testing program below:                     real 0m0.003s
Tgauss,ozark:    raw (real) time for ozark to execute the Gaussian integral program below:                 real 0m7.802s

Ratios relative to the lab machine:

Tprime,grendel / Tprime,lab = 0m0.001s  / 0m0.003s = 0.33
Tprime,ozark   / Tprime,lab = 0m0.003s  / 0m0.003s = 1.00
Tgauss,grendel / Tgauss,lab = 0m12.239s / 0m3.287s = 3.723
Tgauss,ozark   / Tgauss,lab = 0m7.802s  / 0m3.287s = 2.373

To obtain the time (real as well as user) that the prime and Gaussian integral programs take to run, I wrote the two programs in C and checked in the terminal that they behaved as expected. The prime program was compiled with "gcc prime.c" and the Gaussian program with "gcc gaussian.c -lm"; each was executed with "./a.out". I measured the time each program takes on the lab machine, grendel, and ozark with "time ./a.out".

prime.c:

#include <stdio.h>
#include <stdlib.h>

int main(){
    int n = 100;
    int d = 2;
    /* Trial division: test every candidate divisor d up to sqrt(n). */
    while (d * d <= n){
        if (n % d == 0){
            printf("false\n");   /* d divides n, so n is not prime */
            return 0;
        }
        d = d + 1;
    }
    printf("true\n");            /* no divisor found, so n is prime */
    return 1;
}

gaussian.c:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Integrand: the Gaussian e^(-x^2/2). */
double function(double x){
    return exp(-(x * x) / 2);
}

int main(){
    double i;
    double n = 100;               /* number of subintervals */
    double a = 2;                 /* integrate over [-a, a] */
    double delta = 2 * a / n;     /* width of each subinterval */
    /* Trapezoidal rule: endpoints weighted 1/2, interior points weighted 1. */
    double sum = (function(-a) + function(a)) / 2;
    for (i = 1; i < n; i++){
        sum = sum + function(-a + i * delta);
    }
    printf("%f\n", delta * sum);
    return 0;
}
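
For reference, my reading of gaussian.c above (the original write-up does not state the formula explicitly): with f(x) = e^(-x^2/2), a = 2, n = 100, and delta = 2a/n, the program forms the composite trapezoidal approximation

    integral from -2 to 2 of e^(-x^2/2) dx
        ~= delta * [ (f(-2) + f(2)) / 2  +  sum over i = 1 .. n-1 of f(-2 + i*delta) ],

so it should print a value close to sqrt(2*pi) * (Phi(2) - Phi(-2)) ~= 2.39, where Phi denotes the standard normal CDF.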