Reviewed by Laura Thomson, Dec 9, 2024
In a study published in Communications Biology on December 6, 2024, researchers from Kyushu University and Nagoya University in Japan created a model that uses artificial intelligence (AI) to predict organoid development at an early stage.
Organoids -- miniature, lab-grown tissues that imitate organ function and structure -- are revolutionizing biomedical research. They promise advances in tailored transplants, better modeling of diseases such as Alzheimer's and cancer, and more precise insights into the impacts of medical treatments.
The model, which is faster and more accurate than professional researchers, has the potential to increase efficiency and reduce the cost of organoid culture.
The researchers concentrated on forecasting the emergence of hypothalamic-pituitary organoids. These organoids imitate the pituitary gland's functions, including the generation of adrenocorticotropic hormone (ACTH), which regulates stress, metabolism, blood pressure, and inflammation. ACTH deficiency can cause lethargy, anorexia, and other potentially fatal complications.
In our lab, our studies on mice show that transplanting hypothalamic-pituitary organoids has the potential to treat ACTH deficiency in humans.
Hidetaka Suga, Study Corresponding Author and Associate Professor, Graduate School of Medicine, Nagoya University
However, one major challenge for the researchers is knowing whether the organoids are developing properly. Organoids, formed from stem cells suspended in liquid, are sensitive to minute environmental changes, resulting in variability in their development and final quality.
The researchers discovered that broad expression of a protein called RAX at an early developmental stage indicates favorable advancement, as it frequently results in organoids with significant ACTH secretion later on.
Suga added, "We can track development by genetically modifying the organoids to make the RAX protein fluoresce. However, organoids intended for clinical use, like transplantation, can't be genetically modified to fluoresce. So, our researchers must judge instead based on what they see with their eyes: a time-consuming and inaccurate process."
Suga and his colleagues in Nagoya therefore teamed with Hirohiko Niioka, Professor of the Data-Driven Innovation Initiative at Kyushu University, to develop deep-learning models for the job.
Deep-learning models are a type of AI that mimics the way the human brain processes information, allowing them to analyze and categorize large amounts of data by recognizing patterns.
Hirohiko Niioka, Professor, Data-Driven Innovation Initiative, Kyushu University
At 30 days of development, the Nagoya researchers took both fluorescence and bright-field images of organoids carrying fluorescent RAX proteins; the bright-field images show how the organoids appear under regular white light, without fluorescence.
They divided 1500 bright-field images into three quality categories based on the fluorescent images: A (broad RAX expression, high quality); B (medium RAX expression, medium quality); and C (narrow RAX expression, low quality).
Niioka then trained two sophisticated deep-learning models, EfficientNetV2-S and Vision Transformer, both developed by Google for image recognition, to predict the quality category of the organoids. As the training set, he used 1200 bright-field images, 400 from each category.
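The data handling described here, 1500 labeled images divided into 1200 for training and 300 held out for testing, with equal numbers from each category, amounts to a stratified split. A minimal sketch in Python; the filenames and the fixed seed are illustrative assumptions, not details from the study:

```python
import random

# Hypothetical mapping: 500 bright-field image IDs per quality category (A, B, C).
images_by_category = {cat: [f"{cat}_{i:03d}.png" for i in range(500)] for cat in "ABC"}

random.seed(42)  # fixed seed so the split is reproducible
train_set, test_set = [], []
for cat, images in images_by_category.items():
    shuffled = random.sample(images, len(images))  # shuffle without mutating the source list
    train_set += [(name, cat) for name in shuffled[:400]]  # 400 per category for training
    test_set += [(name, cat) for name in shuffled[400:]]   # remaining 100 per category held out

print(len(train_set), len(test_set))  # 1200 300
```

Stratifying by category keeps the class balance identical in both sets, so the held-out accuracy is not skewed by an over-represented class.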
After training, Niioka integrated the two deep-learning models into one ensemble model to further enhance performance. The study team used the remaining 300 images (100 from each category) to evaluate the optimized ensemble model, which achieved 70% accuracy in classifying the bright-field images of organoids.
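The article does not spell out how the two models were combined; one common way to ensemble two classifiers is soft voting, averaging their predicted class probabilities and choosing the top category. A minimal sketch with hypothetical probability values, not outputs from the study's models:

```python
CATEGORIES = ["A", "B", "C"]  # quality categories from the study

def ensemble_predict(probs_model1, probs_model2):
    """Average two models' class-probability vectors and return the winning category."""
    averaged = [(p1 + p2) / 2 for p1, p2 in zip(probs_model1, probs_model2)]
    return CATEGORIES[averaged.index(max(averaged))]

# Example: one model leans strongly toward A, the other is torn between A and B.
cnn_probs = [0.70, 0.20, 0.10]  # hypothetical EfficientNetV2-S output
vit_probs = [0.45, 0.40, 0.15]  # hypothetical Vision Transformer output
print(ensemble_predict(cnn_probs, vit_probs))  # A
```

Averaging probabilities lets a confident model outvote an uncertain one, which is why an ensemble can beat either member on its own.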
By contrast, researchers with years of experience in organoid cultivation were less than 60% accurate when classifying the same bright-field images.
"The deep-learning models outperformed the experts in all respects: in their accuracy, their sensitivity, and in their speed," added Niioka.
The next step was to see if the ensemble model could accurately classify bright-field images of organoids that had not been genetically modified to make RAX fluoresce.
The researchers tested the trained ensemble model on bright-field images of hypothalamic-pituitary organoids without fluorescent RAX proteins at 30 days of development. Using staining techniques, they confirmed that at 30 days, the organoids classified as A (high quality) showed strong RAX expression.
When cultured further, these organoids secreted a substantial amount of ACTH. Meanwhile, low levels of RAX, and later of ACTH, were observed in organoids classified as C (low quality).
Niioka stated, "Our model can therefore predict at an early stage of development what the final quality of the organoid will be, based solely on visual appearance. As far as we know, this is the first time in the world that deep-learning has been used to predict the future of organoid development."
Moving forward, the researchers intend to increase the deep-learning model's accuracy by training it on a larger dataset. However, even at its current degree of accuracy, the model has significant implications for current organoid research.
Suga concluded, "We can quickly and easily select high-quality organoids for transplantation and disease modeling, and reduce time and costs by identifying and removing organoids that are developing less well. It is a game-changer."
Journal Reference:
Asano, T., et al. (2024). A deep learning approach to predict differentiation outcomes in hypothalamic-pituitary organoids. Communications Biology. doi.org/10.1038/s42003-024-07109-1
Source:
Kyushu University