How to study complex therapeutic interactions more efficiently | MIT News



MIT researchers have developed a new theoretical framework for studying the mechanisms of therapeutic interactions. Their approach allows scientists to efficiently estimate how combinations of treatments affect groups of units, such as cells, enabling researchers to run less expensive experiments while gathering more accurate data.

As an example, to study how interconnected genes affect cancer cell growth, a biologist might need to use a combination of treatments to target multiple genes at once. But because there may be billions of potential combinations, choosing a subset of combinations to test in each round of the experiment can bias the data the experiment generates.

In contrast, the new framework considers scenarios where users can efficiently design unbiased experiments by assigning all treatments in parallel, controlling outcomes by tuning the dosage level of each treatment.

The MIT researchers theoretically proved that their strategy is near-optimal within this framework, then verified it in a series of simulations of multi-round experiments. Their method minimized the error rate in every instance.

This technique could one day help scientists better understand the mechanisms of disease and develop new drugs to treat cancer or genetic disorders.

“We introduced a concept that people can think about more when studying the best way to select combination treatments for each round of an experiment. Our hope is that one day this can be used to solve biologically relevant questions,” says Jiaqi Zhang, an MIT graduate student and co-lead author of the paper.

She is joined on the paper by co-lead author Divya Shyamal, an MIT undergraduate. The senior author is Caroline Uhler, the Andrew and Erna Viterbi Professor of Engineering in EECS and the Institute for Data, Systems, and Society (IDSS), who is also director of the Eric and Wendy Schmidt Center at the Broad Institute and a researcher at the Laboratory for Information and Decision Systems (LIDS). The study was recently presented at the International Conference on Machine Learning.

Simultaneous treatments

Treatments can interact with each other in complex ways. For example, a scientist trying to determine whether a certain gene contributes to a particular disease symptom may have to target several genes simultaneously to study the effect.

To do this, scientists use what are known as combinatorial perturbations, where multiple treatments are applied at once to the same group of cells.

“Combinatorial perturbations give a high-level view of how different genes interact and allow us to understand how cells function,” explains Zhang.

Because genetic experiments are expensive and time-consuming, scientists aim to choose the best subset of treatment combinations to test. This is a daunting challenge due to the enormous number of possibilities.
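The scale of the problem is easy to see: if each of n treatments can either be applied or withheld, there are 2^n possible on/off combinations, so a few dozen treatments already yield billions of options. A quick back-of-the-envelope check in Python (the treatment counts here are illustrative, not from the study):

```python
# Each of n treatments can either be applied or withheld, so n treatments
# yield 2**n on/off combinations (ignoring dosage levels entirely).
for n in (4, 10, 30):
    print(f"{n} treatments -> {2**n:,} possible combinations")
```

Even at 30 treatments the count passes one billion, which is why exhaustively testing every combination is infeasible.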

Selecting any such subset can generate biased results, because the experiment focuses only on combinations the user chose in advance.

The MIT researchers approached this problem differently, through a probabilistic framework. Instead of focusing on a selected subset, each unit randomly takes on treatment combinations based on user-specified dosage levels for each treatment.

Users set dosage levels based on the goals of their experiment; perhaps the scientist wants to study the effects of four different drugs on cell growth. Because the probabilistic approach does not limit the experiment to a predetermined subset of treatments, it generates less biased data.

Dosage levels act like probabilities, with each cell receiving a random combination of treatments. If the user sets a high dosage for a treatment, most cells are likely to receive it; if the dosage is low, only a small subset of cells will.
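The dosages-as-probabilities idea can be sketched in a few lines of Python. This is only an illustration of the random assignment described above, not the researchers' code; the drug names and dosage values are made up:

```python
import random

random.seed(0)  # for a reproducible sketch

# Hypothetical dosage levels, one per treatment, interpreted as the
# probability that any given cell receives that treatment.
dosages = {"drug_A": 0.9, "drug_B": 0.5, "drug_C": 0.1}

def assign_treatments(dosages):
    """Independently flip a biased coin per treatment for one cell."""
    return {t: random.random() < p for t, p in dosages.items()}

# Simulate assigning random treatment combinations to 10,000 cells.
cells = [assign_treatments(dosages) for _ in range(10_000)]

for t, p in dosages.items():
    frac = sum(cell[t] for cell in cells) / len(cells)
    print(f"{t}: dosage {p:.1f}, received by {frac:.1%} of cells")
```

Running this shows the behavior the article describes: nearly all cells receive the high-dosage treatment, while only a small fraction receive the low-dosage one, and every cell ends up with some random combination of the three.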

“The question from there is how do you design the dosages so that you can estimate the results as accurately as possible. This is where our theory comes in,” adds Shyamal.

Their theoretical framework identifies the best way to design these dosages, so the experimenter can learn as much as possible about the features and properties under study.

After each round of experiments, the user collects the results and feeds them back into the framework, which outputs the ideal dosage strategy for the next round. In this way, the framework actively adapts the strategy across multiple rounds.
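That adaptive loop can be sketched schematically as follows. Note that everything here is a placeholder: the simulated experiment and the simple "move each dosage toward the observed effect" update rule are invented for illustration, and are not the near-optimal strategy the researchers derived:

```python
import random

random.seed(1)

def run_experiment(dosages):
    """Stand-in for one wet-lab round: return a noisy observed effect
    per treatment (pure simulation, not real data)."""
    return {t: p + random.gauss(0, 0.05) for t, p in dosages.items()}

def update_dosages(dosages, results):
    """Toy adaptive rule: move each dosage halfway toward the observed
    effect, clipped to a valid probability. NOT the paper's criterion."""
    return {t: min(1.0, max(0.0, 0.5 * p + 0.5 * results[t]))
            for t, p in dosages.items()}

dosages = {"drug_A": 0.5, "drug_B": 0.5}  # uninformed initial design
for round_num in range(1, 4):
    results = run_experiment(dosages)           # collect this round's outcomes
    dosages = update_dosages(dosages, results)  # adapt the design for next round
    print(f"round {round_num}: {dosages}")
```

The structure (design, run, observe, redesign) mirrors the multi-round process the article describes, while the paper's contribution is choosing the dosage update so the final estimates are as accurate as possible.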

Dosage optimization, error minimization

The researchers proved that their theoretical approach yields the optimal dosages, even when the dosages are constrained by a limited supply of treatments or when the noise in the experimental outcomes varies from round to round.

In simulations, the new approach had the lowest error rate when the estimated results were compared against actual results across multiple experiments, outperforming two baseline methods.

In the future, the researchers want to extend their framework to account for interference between units and for the fact that certain treatments can lead to selection bias. They also hope to apply the technique in a real experimental setting.

“This is a new approach to a very interesting problem that is hard to solve. Now, with this framework in hand, we can think more about the best way to design experiments for many different applications,” says Zhang.

The study was funded, in part, by the Advanced Undergraduate Research Opportunities Program at MIT, Apple, the National Institutes of Health, the Office of Naval Research, the Department of Energy, the Eric and Wendy Schmidt Center at the Broad Institute, and a Simons Investigator Award.


