Kyle A. Beauchamp
Das and Pande Labs
...EVKMDAEFRHDS... ...EVKMDTEFRHDS... ...EVKMDVEFRHDS...
$\downarrow$
$A = (111222222)$
$\downarrow$
$C = \begin{pmatrix} C_{1\rightarrow 1} & C_{1\rightarrow 2} \\ C_{2 \rightarrow 1} & C_{2 \rightarrow 2} \end{pmatrix} = \begin{pmatrix}2 & 1 \\ 0 & 5\end{pmatrix}$
$\downarrow$
$T = \begin{pmatrix} T_{1\rightarrow 1} & T_{1\rightarrow 2} \\ T_{2 \rightarrow 1} & T_{2 \rightarrow 2} \end{pmatrix} = \begin{pmatrix}\frac{2}{3} & \frac{1}{3} \\ 0 & 1\end{pmatrix}$
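The counting and row-normalization steps above can be sketched in a few lines; the assignment `A` below reproduces the slide's example (a minimal sketch, with illustrative names):

```python
import numpy as np

def transition_matrix(assignments, n_states):
    """Count observed transitions C, then row-normalize into T."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(assignments[:-1], assignments[1:]):
        C[a - 1, b - 1] += 1  # states are 1-indexed on the slide
    T = C / C.sum(axis=1, keepdims=True)
    return C, T

A = [1, 1, 1, 2, 2, 2, 2, 2, 2]
C, T = transition_matrix(A, 2)
# C = [[2, 1], [0, 5]], T = [[2/3, 1/3], [0, 1]]
```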
Assume independent normal errors.
$\underbrace{\log P(\alpha \mid F_1, \ldots, F_n)}_{\text{log posterior}} = \underbrace{-\sum_i \frac{1}{2}\frac{(\langle f_i(x)\rangle_\alpha - F_i)^2}{\sigma_i^2}}_{\text{log likelihood } (\chi^2)} + \underbrace{\log P(\alpha)}_{\text{log prior}}$
Determine $\alpha$ by sampling the posterior.
X-Ray: $\frac{1}{n}\chi^2 = 15.0$ &nbsp; MD: $\frac{1}{n}\chi^2 = 13.7$ &nbsp; BELT: $\frac{1}{n}\chi^2 = 10.4$
Rhiju Das and Vijay Pande
Greg Bowman
Vince Voelz
Robert McGibbon
Christian Schwantes
TJ Lane
Imran Haque
Everyone!
Buzz Baldwin
Dan Herschlag
Pehr Harbury
Xuesong Shi
Laura Wang, Jessica Metzger, Aimee Garza, Crystal Spitale and all the Biochem staff
Kathleen Guan
Everyone!
Todd Martinez
Pehr
Russ Altman
Folding@Home Donors and Forum Volunteers (Bruce Borden)
OpenMM Team: Joy Ku, Peter Eastman, Mark Friedrichs, Yutong Zhao
Thomas Kiefhaber, John Chodera, Frank Noé, Jesús Izaguirre
DE Shaw Research
Susan Marqusee, Laura Rosen
$$P(i\rightarrow j,\ \text{dark} \rightarrow \text{dark}) = P_0(i\rightarrow j)$$
$$P(i\rightarrow j,\ \text{dark} \rightarrow \text{light}) = P_0(i\rightarrow j)$$
$$P(i\rightarrow j,\ \text{light} \rightarrow \text{light}) = (1 - f_i)\, P_0(i\rightarrow j)$$
$$P(i\rightarrow j,\ \text{light} \rightarrow \text{dark}) = \delta_{ij}\, f_i$$
Linear virtual biasing potential (LVBP): $\Delta U(x;\alpha) = \sum_i \alpha_i f_i(x)$
The coefficient $\alpha_i$ sets how strongly the predicted observable $f_i(x)$ shifts the energy of structure $x$.
Populations by Boltzmann: $\pi_j(\alpha) \propto \exp[-\Delta U(x_j;\alpha)]$
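The LVBP reweighting above can be sketched directly, assuming a precomputed per-frame observable matrix `f` (frames × observables); the array names here are illustrative:

```python
import numpy as np

def lvbp_populations(f, alpha):
    """Populations pi_j(alpha) ∝ exp[-Delta U(x_j; alpha)],
    with Delta U(x; alpha) = sum_i alpha_i f_i(x)."""
    delta_u = f @ alpha                      # Delta U for each frame j
    w = np.exp(-(delta_u - delta_u.max()))   # shift for numerical stability
    return w / w.sum()

# Two frames, two observables (toy data):
f = np.array([[1.0, 0.0],
              [0.0, 1.0]])
pi = lvbp_populations(f, np.array([0.0, 0.0]))
# alpha = 0 leaves the ensemble untilted: pi = [0.5, 0.5]
```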
The true ensemble is described by some coefficients $\alpha$.
Let $F_i$ be a noisy measurement of the $i$th experiment.
Let $\langle \rangle_\alpha$ denote an equilibrium expectation in the $\alpha$ ensemble.
$$P(F_i | \alpha) \approx N(\langle f_i(x)\rangle _\alpha, \sigma_i)$$
$$P(\alpha | F_1, ..., F_n) \propto P(F_1, ..., F_n | \alpha) P(\alpha)$$
$$\log P(\alpha| F_1, ..., F_n) = -\sum_j \frac{1}{2}\frac{(\langle f_j(x)\rangle _\alpha - F_j)^2}{\sigma_j^2} + \log P(\alpha)$$
MaxEnt prior: $\log P(\alpha) = \lambda \sum_i \pi_i(\alpha) \log \pi_i(\alpha)$
Use MCMC (via pymc) to sample the posterior.
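As a pymc-free sketch of this sampling step, a plain Metropolis walker over the log posterior defined above (the $\chi^2$ likelihood plus the MaxEnt prior as written on the slide) might look like the following; all names and the toy data are illustrative, not the thesis implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(alpha, f, F, sigma, lam=1.0):
    """chi^2 log likelihood plus the slide's MaxEnt prior over LVBP populations."""
    du = f @ alpha
    w = np.exp(-(du - du.max()))
    pi = w / w.sum()                         # pi_j(alpha)
    pred = pi @ f                            # <f_i(x)>_alpha for each observable i
    chi2 = np.sum(0.5 * (pred - F) ** 2 / sigma ** 2)
    prior = lam * np.sum(pi * np.log(pi))    # log P(alpha) as written on the slide
    return -chi2 + prior

def metropolis(f, F, sigma, n_steps=2000, step=0.1):
    """Random-walk Metropolis sampling of alpha from the posterior."""
    alpha = np.zeros(f.shape[1])
    lp = log_posterior(alpha, f, F, sigma)
    samples = []
    for _ in range(n_steps):
        prop = alpha + step * rng.standard_normal(alpha.shape)
        lp_prop = log_posterior(prop, f, F, sigma)
        if np.log(rng.random()) < lp_prop - lp:  # accept with min(1, ratio)
            alpha, lp = prop, lp_prop
        samples.append(alpha.copy())
    return np.asarray(samples)
```

Posterior means and credible intervals for $\alpha$ then follow from the returned samples (after discarding burn-in).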