
Lattice Light-Sheet Microscopy Tool Supports 4D Data Analysis

Researchers at the University of Chicago have designed a multidimensional imaging analysis pipeline for lattice light-sheet microscopy (LLSM). They set out to study T-cell function using high-dimensional microscopy, but first needed an effective method for analyzing the resulting data.

Because LLSM experiments yield a limited number of data points, the researchers lacked an effective way to analyze the data in 4D. To increase the number of data points and enable more sophisticated analyses, researchers Jillian Rosenberg and Guoshuai Cao developed an approach that treats each molecule, rather than each cell, as a data point.
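
The idea can be illustrated with a short, hypothetical sketch: if an experiment images only a handful of cells but each cell carries hundreds of tracked receptor microclusters, building the data table per molecule rather than per cell multiplies the number of usable data points by orders of magnitude. The feature names, counts, and distributions below are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch: treat each tracked molecule, not each cell, as a data point.
# Feature names and values are illustrative only, not drawn from the LaMDA paper.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

n_cells = 10                 # a per-cell analysis would yield only 10 rows
n_clusters_per_cell = 200    # tracked TCR microclusters per cell (hypothetical)

rows = []
for cell_id in range(n_cells):
    for cluster_id in range(n_clusters_per_cell):
        rows.append({
            "cell_id": cell_id,
            "cluster_id": cluster_id,
            # Illustrative per-molecule features (mobility, brightness, lifetime)
            "diffusion_coeff": rng.lognormal(mean=-2.0, sigma=0.5),
            "mean_intensity": rng.normal(loc=1000, scale=150),
            "track_lifetime_s": rng.exponential(scale=20),
        })

per_molecule = pd.DataFrame(rows)
print(per_molecule.shape)  # (2000, 5): 200x more data points than a per-cell table
```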

They developed a pipeline composed of publicly available software packages, called lattice light-sheet microscopy multidimensional analyses (LaMDA). The LaMDA pipeline combines high-spatiotemporal-resolution 4D LLSM, machine learning, and dimensionality reduction to analyze T-cell receptor dynamics and predict T-cell signaling states without the need for complex biochemical measurements.
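
As a rough illustration of how such a pipeline might look, the minimal sketch below applies off-the-shelf dimensionality reduction and a classifier from scikit-learn to a synthetic per-molecule feature matrix. The specific algorithms chosen here (PCA and a random forest) and all variable names are assumptions for demonstration, not the components LaMDA actually uses.

```python
# Minimal sketch of a "dimensionality reduction + machine learning" step on
# per-molecule features; synthetic data stands in for real microcluster tracks.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

# Hypothetical per-molecule feature matrix: rows are TCR microclusters,
# columns are dynamic/spatial features; labels mark each cell's condition.
X = rng.normal(size=(4000, 12))      # 4000 molecules x 12 features (illustrative)
y = rng.integers(0, 2, size=4000)    # 0 = resting, 1 = stimulated (illustrative)

# Reduce the feature space before classification.
X_low = PCA(n_components=3).fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(
    X_low, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a real analysis, the features would come from tracked receptor microclusters rather than random numbers, and the held-out accuracy would indicate how well resting and stimulated cells can be distinguished from receptor dynamics alone.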

The researchers used LaMDA to analyze images of T-cell receptor microclusters on the surface of live primary T cells under resting and stimulated conditions. They observed global spatial and temporal changes of T-cell receptors across the 3D cell surface; differentiated stimulated cells from unstimulated cells; predicted attenuated T-cell signaling after CD4 and CD28 receptor blockades; and reliably discriminated between structurally similar T-cell receptor ligands.

An image of a T cell captured by lattice light-sheet microscopy. Courtesy of Huang Lab.


In addition to helping to expand scientists’ knowledge of T-cell biology, LaMDA could be used for drug testing and vaccine development. According to Rosenberg, one of the most promising aspects of LaMDA is its potential to predict biological responses without the need for complex experiments.

“Researchers or pharmaceutical companies could use LaMDA to determine how certain drugs are resulting in subtle changes in subcellular signaling, which provides information on both drug safety and efficacy,” Rosenberg said. “Our LaMDA pipeline could also be extended to the development of peptide vaccines to treat infection, cancer, and autoimmunity, or be used to study thymic education or peripheral tolerance, two very important topics in T-cell biology.”

The researchers validated LaMDA as an effective analysis pipeline that could be expanded to other fields of study. They designed LaMDA to be easy for other scientists to use, including those who may be unfamiliar with data science techniques. “We believe this analysis pipeline will benefit users of high-dimensional microscopy across all fields of science,” Cao said.

So far, the researchers have tested LaMDA on only a single molecule, under a few different conditions. “To make this pipeline more robust, it should be validated on other cell types, molecules, and conditions to prove its wide applicability and address any potential unforeseen issues,” Cao said. The researchers hope that one day the LaMDA pipeline can be used to study the interaction of multiple molecules.

The research was published in Cell Systems (www.doi.org/10.1016/j.cels.2020.04.006). All instructions needed to implement LaMDA are included in the research paper.

Published: May 2020
