% homework2problem1partc.tex
\subsection{Part C}
Here we implement a handwritten KS test and compare it to the one available in \texttt{scipy}. We use the numerically efficient form of Equation (14.3.7) from the textbook to compute the significance level of the maximum observed discrepancy $D$ between the empirical distribution of my normal deviates (generated with the same Box--Muller routine from 1B) and the analytic Gaussian cumulative distribution. Initially I computed both distribution functions from histograms, but the theoretical normal distribution can be expressed in terms of the error function; I used the \texttt{scipy} implementation so as to avoid evaluating the error-function integral repeatedly.
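As a minimal sketch of the approach described above (the actual routine appears in the listing below), the significance function $Q_{\mathrm{KS}}(\lambda) = 2\sum_{j=1}^{\infty}(-1)^{j-1}e^{-2j^{2}\lambda^{2}}$ can be summed until its terms become negligible, and applied to $D$ with the effective-$N$ correction $\lambda = (\sqrt{N} + 0.12 + 0.11/\sqrt{N})\,D$ from Numerical Recipes. Function names and tolerances here are illustrative assumptions, not the author's exact code:

```python
import numpy as np
from scipy import stats

def ks_significance(lam, eps1=1e-3, eps2=1e-8):
    """Q_KS(lam) = 2 * sum_{j>=1} (-1)^(j-1) exp(-2 j^2 lam^2),
    summed until terms are negligible relative to the running sum."""
    total, sign, prev = 0.0, 1.0, 0.0
    a2 = -2.0 * lam * lam
    for j in range(1, 101):
        term = 2.0 * sign * np.exp(a2 * j * j)
        total += term
        if abs(term) <= eps1 * prev or abs(term) <= eps2 * total:
            return total
        sign = -sign
        prev = abs(term)
    return 1.0  # series did not converge: distributions indistinguishable

def ks_test_vs_normal(samples):
    """Two-sided KS statistic D and its significance against N(0, 1)."""
    x = np.sort(np.asarray(samples))
    n = len(x)
    cdf = stats.norm.cdf(x)  # analytic Gaussian CDF via the error function
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)   # ECDF above model CDF
    d_minus = np.max(cdf - np.arange(0, n) / n)      # ECDF below model CDF
    d = max(d_plus, d_minus)
    sqrt_n = np.sqrt(n)
    p = ks_significance((sqrt_n + 0.12 + 0.11 / sqrt_n) * d)
    return d, p
```

The statistic $D$ computed this way matches \texttt{scipy.stats.kstest} exactly, while the $p$-value agrees closely for moderate-to-large $N$ thanks to the effective-$N$ correction.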
As expected, the $p$-value increases with the number of sampled numbers, as does the agreement between my KS test and the one provided by \texttt{scipy}; this is, in effect, an indication of the transition from Poissonian to quasi-Gaussian statistics.
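For reference, a minimal sketch of the kind of Box--Muller transform carried over from 1B (the actual routine is in the listing below); the shift of the uniforms into $(0,1]$ guards against $\log 0$, and the generator and interface are illustrative assumptions:

```python
import numpy as np

def box_muller(n, rng):
    """Generate n standard-normal deviates from pairs of uniforms
    via z1 = sqrt(-2 ln u1) cos(2 pi u2), z2 = sqrt(-2 ln u1) sin(2 pi u2)."""
    m = (n + 1) // 2
    u1 = 1.0 - rng.random(m)  # shift [0, 1) -> (0, 1] so log(u1) is finite
    u2 = rng.random(m)
    r = np.sqrt(-2.0 * np.log(u1))
    z = np.concatenate([r * np.cos(2.0 * np.pi * u2),
                        r * np.sin(2.0 * np.pi * u2)])
    return z[:n]
```

Feeding such samples through the KS test at increasing $N$ reproduces the trend shown in Figure \ref{fig:21c1}.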
\lstinputlisting{homework2problem1partc.py}
\clearpage
\begin{figure}[h]
\centering
\includegraphics{homework2problem1partcfigure1.pdf}
\caption{Result of KS tests for $N_{\mathrm{samples}} = 10^{1}, 10^{1.1}, \dots, 10^{5}$.}
\label{fig:21c1}
\end{figure}
\clearpage