
Robotic Microscopy to Track Neurodegeneration and Drug Potency

By Steven Finkbeiner and Eric Christiansen

A new form of robotic microscopy can track cells over unusually long lengths of time to observe the process of neurodegeneration as it unfolds and monitor the effects of drugs and other manipulations designed to stop it.

Neurodegenerative disorders, such as Alzheimer’s and Parkinson’s disease, are devastating: they rob patients of their memories, personalities, ability to move, and eventually their lives. These diseases represent a major unmet medical need, and the problem is only getting worse. Aging is the single most important risk factor, and the average age of people in the U.S., Europe, and Asia is rising rapidly. Worse still, no therapy slows the progression of any neurodegenerative disorder.

Part of the problem is that scientists lack reliable experimental models by which to study the diseases and test potential treatments. Many promising findings in non-human models have failed in human clinical trials. 

We have tried to overcome these problems by combining innovative technologies: new human cell models, a powerful method for examining them, and machine learning to extract as much information as possible from our data.

First, we are using induced pluripotent stem cells to model diseases in human cells. Specifically, we collect skin or blood cells from a patient with a neurodegenerative disease, reprogram them into stem cells, and then coax the stem cells into becoming the same type of neuron that is lost in the disease. These cells obviate some of the concerns about species-specific differences that have plagued translational studies.

Second, we have developed a special form of imaging called robotic microscopy. These microscopes are fully automated, combining a robotic incubator, a robotic arm that passes plates of cells to and from a fully automated microscope, and an environmental chamber that encloses everything. We built and programmed the microscopes with the unique ability to find and track the same individual cells in high throughput, as often and for as long as the investigator wants. This is important: we can track cells over unusually long periods of time (e.g., weeks, months, or longer) to observe the process of neurodegeneration as it unfolds and to monitor the effects of drugs and other manipulations designed to stop it. Importantly, we can effectively treat each cell like a patient in a clinical trial and analyze it with the same statistical tools. We found that the approach is about 100–1,000 times more sensitive than conventional approaches.
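
To make the clinical-trial analogy concrete, the short Python sketch below analyzes hypothetical per-cell records with a standard survival-analysis tool of the kind used in clinical studies. The table layout, the example numbers, and the choice of the lifelines library are illustrative assumptions on our part, not a description of our actual pipeline.

    # Minimal sketch: per-cell survival analysis with clinical-trial statistics.
    # Each row describes one tracked cell: how long it was followed, whether it
    # died during observation, and an experimental covariate (hypothetical data).
    import pandas as pd
    from lifelines import CoxPHFitter  # standard survival-analysis package

    cells = pd.DataFrame({
        "hours_observed": [60, 72, 80, 90, 96, 110, 130, 144],
        "died":           [1, 1, 1, 0, 1, 1, 0, 0],   # 0 = still alive (censored)
        "drug_dose":      [0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0],
    })

    # Cox proportional-hazards regression: does the drug change the risk of death?
    cph = CoxPHFitter()
    cph.fit(cells, duration_col="hours_observed", event_col="died")
    cph.print_summary()  # hazard ratio for drug_dose, confidence interval, p-value

Because every cell has its own follow-up time and outcome, censoring is handled naturally: a cell that is still alive when imaging stops simply contributes partial information, just as a patient who finishes a trial without an event would.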

Each robotic microscope operates around the clock and generates terabytes of data per day. We developed automated analysis software that gives each cell its own unique number and tracks it over time. An array of hundreds of biosensors that we have developed allows us to observe additional structural and functional features. The automated analysis programs extract and quantify critical features from each cell and compare the behavior of individual cells to uncover new insights into diseases and treatments.
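
As a rough illustration of this kind of automated bookkeeping, the Python sketch below labels the cells in a single fluorescence image, extracts a few simple per-cell features, and links cells between consecutive time points by nearest centroid. The Otsu threshold, the chosen features, and the naive matching rule are placeholder choices for illustration, not the algorithms in our software.

    # Sketch: number the cells in one image, measure simple features, and link
    # cells across consecutive time points (illustrative placeholder methods).
    import numpy as np
    from skimage import filters, measure

    def extract_cell_features(image):
        """Return one feature dictionary per candidate cell in a single image."""
        mask = image > filters.threshold_otsu(image)   # separate cells from background
        labels = measure.label(mask)                   # give each region its own number
        features = []
        for region in measure.regionprops(labels, intensity_image=image):
            features.append({
                "cell_id": region.label,                   # unique number for this cell
                "area": region.area,                       # size in pixels
                "mean_intensity": region.mean_intensity,   # reporter brightness
                "centroid": region.centroid,               # position, used for matching
            })
        return features

    def match_cells(prev_cells, curr_cells, max_shift=20.0):
        """Naively link cells between time points by nearest centroid."""
        links = {}
        for prev in prev_cells:
            py, px = prev["centroid"]
            best_id, best_dist = None, max_shift
            for curr in curr_cells:
                cy, cx = curr["centroid"]
                dist = float(np.hypot(cy - py, cx - px))
                if dist < best_dist:
                    best_id, best_dist = curr["cell_id"], dist
            if best_id is not None:
                links[prev["cell_id"]] = best_id   # same cell, next time point
        return links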

Finally, to get the most out of the massive amounts of data we generate, we are working with statisticians to create new analytical approaches, especially ones that rely on Bayesian modeling. Moreover, a few years ago we began collaborating with engineers at Google to use deep learning to improve our ability to extract features. We recently succeeded in training neural networks to accurately predict features and label cells that a human cannot normally see in an image without special techniques.
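
For readers curious about what such a network looks like, here is a minimal Python sketch of an image-to-image convolutional model (written with Keras) that learns to predict a fluorescence-like output from a transmitted-light input. The architecture, layer sizes, and random placeholder data are simplified assumptions for illustration; the networks we actually trained were substantially larger and were fit to real paired images.

    # Sketch: a tiny convolutional network mapping a transmitted-light image to a
    # predicted fluorescence image (image-to-image regression); illustrative only.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers

    def build_label_predictor(patch_size=128):
        inputs = tf.keras.Input(shape=(patch_size, patch_size, 1))        # unlabeled image
        x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
        x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
        x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
        outputs = layers.Conv2D(1, 1, padding="same")(x)                  # predicted label image
        model = tf.keras.Model(inputs, outputs)
        model.compile(optimizer="adam", loss="mse")                       # pixel-wise regression
        return model

    # Placeholder training pairs: transmitted-light inputs and fluorescence targets.
    transmitted = np.random.rand(8, 128, 128, 1).astype("float32")
    fluorescence = np.random.rand(8, 128, 128, 1).astype("float32")
    model = build_label_predictor()
    model.fit(transmitted, fluorescence, epochs=1, batch_size=4)

In a setup like this, the training targets would come from images acquired with the special techniques mentioned above, so that at prediction time the network can estimate that hidden signal from an ordinary image alone.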

We hope that these technologies will help us find desperately needed treatments for neurodegenerative diseases, and that using human neurons made from patient stem cells will improve the reliability of our results. We are also hopeful that sophisticated computational approaches, such as deep learning and artificial intelligence, will provide new tools for examining datasets that are too complex for humans to understand. These approaches may make connections between the laboratory and the clinic that make a real difference for patients.

This article is based on a minisymposium talk entitled "Big Data in High Throughput Screening" at the SIAM Annual Meeting, held in Boston this July.

Steven Finkbeiner, M.D., Ph.D., is a neurologist and neuroscientist, a professor at the University of California, San Francisco, and associate director at the Gladstone Institutes. He directs an academic research laboratory focused on neurodegenerative diseases and mental illnesses, and he directs the Taube/Koret Center for Neurological Disease, which focuses on collaborating with industry to develop neurotherapeutics.

Eric Christiansen, PhDc, is a software engineer with a background in computer vision and machine learning, focused on applying deep learning to microscopy analysis. He is part of Google Accelerated Science, whose mission is to increase the rate of scientific discovery by leveraging Google’s extensive knowledge and technology.


