About me:

I am passionate about understanding how the human mind works. I work for Astera Institute as a Senior Research Scientist developing artificial intelligence based on human cognition and neuroscience. I have previously worked as a researcher in cognitive science and psychiatry, and as a statistician and methodologist for medical research more broadly.

What does neural computation actually look like?

This app may have performance issues in Firefox. Try a Chromium-based browser.

Full view of app and control panel here

In the above simulation, you can watch and interact with the autonomous, self-organizing behavior of up to 1 million model neurons, roughly the scale of a few millimeters of cortical tissue.

Each neuron is simple and identical, signaling strongly to its immediate neighbors and weakly to all others. This global-local connectivity pattern produces a chaotic tension between small-scale organization and total synchronization. The global connections drive spontaneous activity in quiescent regions, while the local connections expand and organize that activity into waves, spirals, and other transient motifs. When the local connections completely dominate, activity trends toward a kind of entropy. When that entropy reaches its peak, global synchronization emerges once again, restarting the cycle.
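As a rough illustration of that connectivity rule (not the app's actual neuron model), here is a toy sketch in R of how each cell's input can be computed from strong local and weak global connections. The weights, threshold, and one-step binary update are invented for the example.

```r
# Toy sketch of the connectivity rule: each cell gets strong input from its
# four grid neighbors and weak input from every other cell. Parameters and
# dynamics are illustrative only.
n      <- 64                                    # 64 x 64 sheet of cells
active <- matrix(rbinom(n * n, 1, 0.05), n, n)  # current binary activity

shift <- function(a, dr, dc) {                  # circular (torus) shift
  a[(seq_len(n) - 1 + dr) %% n + 1, (seq_len(n) - 1 + dc) %% n + 1]
}

local_input  <- shift(active, 1, 0) + shift(active, -1, 0) +
                shift(active, 0, 1) + shift(active, 0, -1)
global_input <- sum(active) - active - local_input   # all non-neighbor cells

w_local <- 1.0; w_global <- 0.002; threshold <- 1.5
next_active <- 1 * (w_local * local_input + w_global * global_input > threshold)
```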

[Read More]

New position at CCNLab

As of June 21, 2021 I have left my job as a biostatistician at Virginia Tech and joined the UC Davis Computational Cognitive Neuroscience Lab as a postdoc! I cannot express how exciting it has been not only to discover this kind of research but also to land an opportunity to train in it and do it myself. The lab has a variety of ongoing research involving biologically constrained spiking and rate-coded neural networks in the Leabra framework (github). There is an excellent textbook written and published for free online by the PI, Randy O’Reilly.

[Read More]

Binary neural nets in R, part 3: Recurrent Helmholtz Machine

github repository for this project

Part 3 of my neural network project in R involves coding a recurrent version of the Helmholtz machine. A recurrent (i.e., autoregressive or time-lagged) version of the network follows naturally (kind of), and of course, Hinton already published the idea in 1995. I used his paper as a guide to make sure nothing about my approach was markedly different, and coded it following the strategy from my previous posts.

[Read More]

Stochastic resonance

In digging through old models and scripts, I came upon one that illustrates the concept of stochastic resonance. In short, when noise is added to a system with locally stable states that is driven by a weak periodic signal, the noise can cause the system to jump between equilibria at roughly the frequency of that signal:

Stochastic Resonance Animation

We are always looking for interesting statistical phenomena like this for models in computational psychiatry, where symptoms of mania or depression, for example, are often episodic and make large, spontaneous leaps after periods of relative stability.
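The original script isn't shown here, but a minimal double-well version of the effect is easy to reproduce; all constants below are illustrative.

```r
# Stochastic resonance sketch: an overdamped particle in a double-well
# potential (wells at x = -1 and x = 1) with a weak periodic forcing that is
# too small to push it over the barrier on its own. With a suitable amount of
# noise, the state hops between wells at roughly the forcing frequency.
set.seed(1)
dt    <- 0.01
t     <- seq(0, 200, by = dt)
A     <- 0.3          # forcing amplitude (subthreshold)
omega <- 2 * pi / 50  # forcing frequency
sigma <- 0.35         # noise intensity

x <- numeric(length(t))
x[1] <- -1            # start in the left well
for (i in 2:length(t)) {
  drift <- x[i - 1] - x[i - 1]^3 + A * sin(omega * t[i - 1])
  x[i]  <- x[i - 1] + drift * dt + sigma * sqrt(dt) * rnorm(1)
}

plot(t, x, type = "l", xlab = "time", ylab = "state")
lines(t, sin(omega * t), col = "red")   # the forcing, for reference
```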

[Read More]

Discretization in 3 languages

There is an obscure and useful function that makes it easy to fit stochastic differential equations to data, insofar as the model can be linearized without causing too much trouble. The function discretizes the continuous-time (i.e., differential equation) state matrices: A, the drift or state transition matrix; B, the input or covariate coefficient matrix; and Q, the diffusion or noise covariance matrix. In other words, it takes the differential equation in matrix form and solves it for a given time step, and the discretized matrices then function like those of an autoregressive process. Some details of this approach can be found here, but not a complete implementation, particularly for the B matrix. So this is one of those code blocks I just keep backed up in several project folders in various languages.
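For reference, here is a sketch in R of the matrix-exponential (Van Loan) version of this discretization. It assumes a linear SDE dx = (Ax + Bu)dt + dW with diffusion covariance Q, uses the expm package, and is a reconstruction rather than the exact function from my project folders.

```r
# Discretize continuous-time state matrices A (drift), B (input), and
# Q (diffusion covariance) over a time step dt.
library(expm)

discretize_lds <- function(A, B, Q, dt) {
  n <- nrow(A); m <- ncol(B)

  # A_d and B_d from one augmented matrix exponential:
  # expm([[A, B], [0, 0]] * dt) = [[A_d, B_d], [0, I]]
  M_ab <- rbind(cbind(A, B), matrix(0, m, n + m))
  E_ab <- expm(M_ab * dt)
  A_d  <- E_ab[1:n, 1:n, drop = FALSE]
  B_d  <- E_ab[1:n, (n + 1):(n + m), drop = FALSE]

  # Q_d by Van Loan's method:
  # expm([[-A, Q], [0, t(A)]] * dt) = [[*, F12], [0, F22]], Q_d = t(F22) %*% F12
  M_q <- rbind(cbind(-A, Q), cbind(matrix(0, n, n), t(A)))
  E_q <- expm(M_q * dt)
  F12 <- E_q[1:n, (n + 1):(2 * n), drop = FALSE]
  F22 <- E_q[(n + 1):(2 * n), (n + 1):(2 * n), drop = FALSE]
  Q_d <- t(F22) %*% F12

  list(A = A_d, B = B_d, Q = Q_d)
}

# Example: a damped oscillator with one input, discretized at dt = 0.1
A <- matrix(c(0, 1, -1, -0.5), 2, 2, byrow = TRUE)
B <- matrix(c(0, 1), 2, 1)
Q <- diag(c(0, 0.1))
discretize_lds(A, B, Q, dt = 0.1)
```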

[Read More]
R  stan  rcpp  c++  programming 

Understanding MCMC and autodifferentiation

In putting together educational materials related to Stan and posterior sampling, I remembered two good resources.

The MCMC interactive gallery is the best set of MCMC visualizations I’ve found.

Stan uses NUTS, so it has to calculate the gradient of the log posterior many times for each new sample, and it does so by automatic differentiation (autodiff). I recommend this video for understanding autodiff. It helps a lot to know what Stan is doing with the model code to avoid giving it more work than necessary.
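For a flavor of what autodiff does, here is a tiny forward-mode example in R using dual numbers. Stan's math library actually uses reverse mode, but the core idea of propagating derivative information through each operation alongside the value is the same.

```r
# Forward-mode automatic differentiation with dual numbers (illustration only).
dual  <- function(value, deriv = 0) list(value = value, deriv = deriv)
d_add <- function(a, b) dual(a$value + b$value, a$deriv + b$deriv)
d_mul <- function(a, b) dual(a$value * b$value,
                             a$deriv * b$value + a$value * b$deriv)
d_exp <- function(a)    dual(exp(a$value), exp(a$value) * a$deriv)

# d/dx [ x * exp(x) + x ] at x = 1.5, seeding x with derivative 1
x <- dual(1.5, 1)
y <- d_add(d_mul(x, d_exp(x)), x)
y$value  # 1.5 * exp(1.5) + 1.5
y$deriv  # exp(1.5) + 1.5 * exp(1.5) + 1
```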

[Read More]

Stan: Structural Ordinal Item Response Theory with a Latent Interaction

The Non-Gaussian factor model post introduced the idea of using other distributions for a factor analysis. But that code is not very generalized, and in practice we tend to need something more like structural equation modeling with non-Gaussian variables.

The name for a factor model with logit-linked indicators, whether dichotomous or ordinal, is Item Response Theory (IRT). It has been used for decades to develop measurement instruments, and tests in particular. Because of that history, the factor loadings are called the “discrimination” parameters, the intercepts are the item “difficulties”, and the factor scores represent each person’s “ability”.
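To make those terms concrete, here is a small sketch in R that simulates dichotomous (2PL) item responses. The full post deals with ordinal indicators and a latent interaction, but the roles of the parameters are the same; the values below are arbitrary.

```r
# Simulate binary item responses from a 2PL IRT model:
# P(y_ij = 1) = logistic(a_j * (theta_i - b_j))
set.seed(1)
n_person <- 500
n_item   <- 6

theta <- rnorm(n_person)                      # person "ability" (factor score)
a     <- runif(n_item, 0.8, 2.0)              # item "discrimination" (loading)
b     <- seq(-1.5, 1.5, length.out = n_item)  # item "difficulty" (intercept)

eta <- outer(theta, seq_len(n_item), function(th, j) a[j] * (th - b[j]))
p   <- plogis(eta)
y   <- matrix(rbinom(length(p), 1, p), n_person, n_item)
```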

[Read More]

Stan: Non-Normal Factor Model

There are so many factor analyses in the world and so few truly normally distributed variables. I have not seen much careful tailoring of residual distributions in medical or psychological research, probably because most software doesn't support it or make it convenient. It was a revelation to me that you can use Markov chain Monte Carlo (MCMC) to sample the latent variables directly and then do all kinds of things, such as using non-Gaussian distributions and latent variable interactions.
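As a sketch of the kind of data this is aimed at, here is a one-factor simulation in R with non-normal indicators (counts and binary responses). The loadings and intercepts are arbitrary; in Stan, the factor scores would simply be declared as parameters and sampled by MCMC alongside them.

```r
# One latent factor with non-normal indicators: two Poisson counts (log link)
# and two binary items (logit link). Values are illustrative only.
set.seed(2)
n   <- 400
eta <- rnorm(n)                                  # latent factor scores

y1 <- rpois(n, lambda = exp(0.5 + 0.8 * eta))    # count indicator
y2 <- rpois(n, lambda = exp(1.0 + 0.6 * eta))    # count indicator
y3 <- rbinom(n, 1, plogis(-0.2 + 1.2 * eta))     # binary indicator
y4 <- rbinom(n, 1, plogis( 0.4 + 0.9 * eta))     # binary indicator

dat <- data.frame(y1, y2, y3, y4)
```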

[Read More]

Binary neural nets in R, part 2: Helmholtz Machine

github source for this project

In the last post, I looked at coding a restricted Boltzmann machine (RBM) in R. In this one, I will compare the algorithm and its results to the Helmholtz Machine, which uses the wake-sleep algorithm.

The basic idea behind the wake-sleep algorithm is that the model learns in two alternating phases. In the wake phase, values for the hidden nodes are sampled according to the expectations of the recognition network, and the generative network weights are then adjusted.
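A single wake-phase update for a one-hidden-layer binary network looks roughly like the sketch below, following the delta-rule form in Hinton et al. (1995). The dimensions and learning rate are arbitrary, and this is not the code from the repository.

```r
sigmoid <- function(x) 1 / (1 + exp(-x))

n_v <- 8; n_h <- 4; lr <- 0.05
W_rec   <- matrix(rnorm(n_h * n_v, sd = 0.1), n_h, n_v)  # recognition: v -> h
W_gen   <- matrix(rnorm(n_v * n_h, sd = 0.1), n_v, n_h)  # generative:  h -> v
b_rec_h <- rep(0, n_h)   # recognition bias on hidden units
b_gen_v <- rep(0, n_v)   # generative bias on visible units
b_gen_h <- rep(0, n_h)   # generative bias on hidden units (the "prior")

v <- rbinom(n_v, 1, 0.5)                 # one binary data vector

# Wake: sample hidden states from the recognition network
p_h <- sigmoid(W_rec %*% v + b_rec_h)
h   <- rbinom(n_h, 1, p_h)

# Delta rule: nudge the generative weights toward reconstructing v from h
p_v     <- sigmoid(W_gen %*% h + b_gen_v)
W_gen   <- W_gen + lr * (v - p_v) %*% t(h)
b_gen_v <- b_gen_v + lr * as.vector(v - p_v)

# Nudge the generative prior over the hidden units toward the sampled h
b_gen_h <- b_gen_h + lr * (h - sigmoid(b_gen_h))
```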

[Read More]

Binary neural nets in R, part 1: Restricted Boltzmann Machine

github source for this project

For most statistical models, you can verify that they work by simulating data from a set of parameter values, then fitting the model to the simulated data and seeing how well it recovers those values. With artificial neural networks it is common to have many local solutions and a stochastic learning algorithm, so while an ANN may find a good solution on the simulated data, it is far less likely to match the data-generating parameters.
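For contrast, here is what that parameter-recovery check looks like in a case where it is clean: a simple linear regression in R with made-up true values. The post is about why the same check behaves differently for neural networks.

```r
# Parameter recovery check: simulate data from known values, refit, compare.
set.seed(3)
n    <- 1000
beta <- c(2, -1, 0.5)                       # "true" data-generating values
X    <- cbind(1, matrix(rnorm(n * 2), n, 2))
y    <- X %*% beta + rnorm(n)

fit <- lm(y ~ X - 1)
cbind(truth = beta, estimate = coef(fit))   # estimates should sit near truth
```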

[Read More]