Using deep learning to improve super-resolution microscopy

By: Guest contributor, Fri Oct 29 2021

Nature Portfolio is committed to publishing research that is rigorous, reproducible and impactful. Our dedicated in-house editorial teams tirelessly curate the best research for their journals, and, in that mission, find papers worthy of a brighter spotlight for how their findings advance their field. Our Game Changers series provides an opportunity to showcase these same developments.

In this blog post, Rita Strack, Handling Editor at Nature Methods for the paper "Deep learning enables fast and dense single-molecule localization with high accuracy", describes how deep learning is being used to improve super-resolution microscopy, and explains the journal's role in improving laboratory techniques and methods.


Q: This paper explores how scientists used machine learning to improve biological image analysis. Can you tell us more about the research and its impact on laboratory practices?

RS: Scientists use super-resolution microscopy to reveal nanometer-scale details inside cells; the method revolutionized light microscopy and earned some of its inventors the 2014 Nobel Prize in Chemistry. For this paper, an international team of AI researchers and microscopists developed an algorithm that uses machine learning to significantly accelerate this technology.

Their method advances a type of super-resolution microscopy called single-molecule localization microscopy (SMLM). SMLM involves labeling proteins of interest with fluorescent molecules and using light to activate only a few molecules at a time. Using this trick, multiple images of the same sample are acquired, and computer algorithms identify the precise positions of the individual molecules. Super-resolution images are then built up as the positions of thousands of emitters are compiled from individual frames.
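To make that pipeline concrete, here is a minimal, hypothetical sketch in Python of the conventional (non-deep-learning) approach: find isolated bright spots in each frame, estimate each spot's sub-pixel position, and pool the positions across frames. The function names and the simple centroid estimator are illustrative assumptions; real SMLM software typically fits a Gaussian model of the microscope's point spread function, and the paper discussed here replaces this analysis step with a deep network.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def localize_frame(frame, threshold, win=3):
    """Return sub-pixel (y, x) positions of isolated emitters in one frame."""
    # Candidate emitters: local intensity maxima above a brightness threshold.
    peaks = (frame == maximum_filter(frame, size=2 * win + 1)) & (frame > threshold)
    positions = []
    for y, x in zip(*np.nonzero(peaks)):
        # Cut a small window around each peak (skip emitters at the border).
        if y < win or x < win or y >= frame.shape[0] - win or x >= frame.shape[1] - win:
            continue
        patch = frame[y - win:y + win + 1, x - win:x + win + 1].astype(float)
        # Intensity-weighted centroid gives a sub-pixel position estimate;
        # production tools fit a Gaussian PSF model here for higher accuracy.
        ys, xs = np.mgrid[-win:win + 1, -win:win + 1]
        total = patch.sum()
        positions.append((y + (ys * patch).sum() / total,
                          x + (xs * patch).sum() / total))
    return positions

def reconstruct(frames, threshold=100):
    """Accumulate localizations from all frames into one point cloud."""
    points = []
    for frame in frames:
        points.extend(localize_frame(frame, threshold))
    # Rendering these points as a fine 2D histogram yields the final image.
    return np.array(points)
```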

Because so many individual emitters are needed to create a final image, imaging speed has been a key limitation for SMLM, especially when considering live-cell imaging, where structures of interest can move rapidly. 

This research advances the field with an algorithm that detects and localizes fluorophores at much higher densities than were previously possible. This means that fewer frames are needed per sample, so imaging speeds can be increased up to tenfold with minimal loss of resolution. The ultimate impact is that the method enables super-resolution imaging of fast processes in living cells.
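As a back-of-the-envelope illustration of where that speedup comes from (the numbers below are hypothetical, not taken from the paper): if an analysis method can handle ten times more active emitters per frame, ten times fewer frames are needed for the same number of localizations.

```python
# Hypothetical numbers for illustration only; not from the paper.
localizations_needed = 100_000  # emitters required for one super-resolution image

sparse_per_frame = 10    # localizations per frame at conventional, sparse activation
dense_per_frame = 100    # localizations per frame with high-density analysis

frames_sparse = localizations_needed / sparse_per_frame  # 10,000 frames
frames_dense = localizations_needed / dense_per_frame    # 1,000 frames

print(f"Acquisition speedup: {frames_sparse / frames_dense:.0f}x")  # -> 10x
```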

Q: Nature Portfolio is unique in that each journal has a dedicated team of editors to handle manuscripts. Can you describe how the editorial team at Nature Methods partnered with authors to get the paper published?

RS: The moment I read this paper I thought it was special. Artificial intelligence, specifically deep learning, has rapidly gained traction as a way to bypass conventional limitations of light microscopy, and I thought this application to SMLM pushed the state of the art forward.

However, there have been numerous algorithms, including other deep learning-based approaches, for identifying individual emitters in densely labeled samples, so I wanted to push the authors to distinguish this work from what had come before it. For this reason, I encouraged them to add at least one demonstration of imaging a dynamic process in living cells. I thought this would not only set the paper apart from others, but would also be important for substantiating the paper's claims about what the method could achieve.

The final results surpassed my expectations, and though the paper is quite recent, I think it is likely to be very impactful. I know independent scientists are already implementing the method in their own labs.

Q: How do you think this method will contribute to future experiments?  

RS: I think the method will contribute in two obvious ways. The first is that other developers will be motivated by its success to find methods that outperform it, further improving super-resolution microscopy. The second is that I think the technology is mature enough for biologists to really benefit from it, meaning that people who have been limited in the past by things like imaging speed, or by phototoxicity, where prolonged light exposure damages samples, will be able to do experiments that were previously out of reach. Given how much super-resolution microscopy of fixed cells has already done for biology, I think routinely extending these very-high-resolution methods to live-cell imaging will be transformative.

Q: At Nature Portfolio, we are increasingly seeing how interdisciplinary work expands the perspectives of research. Do you think this study has the power to bring together researchers from different scientific fields?

RS: At Nature Methods, we believe methods drive biology, and many fields within the life sciences are currently benefiting from collaboration with colleagues from other disciplines, including chemistry, physics, and computer science, to push the limits of what can be observed or measured. Indeed, for microscopy, technological advances have often come from astronomy, optics, and physics, and now more than ever from computational fields. We hope to support and foster these interdisciplinary methods, as they often offer big leaps over the state of the art.

This work exemplifies that joint approach: the authors drew the ideas underlying their machine learning method from a range of different contexts, and by collaborating with experts in computational microscopy, they turned those ideas into powerful methods for analyzing data. The team also built a software package that implements the algorithm; it is simple to install and free to use, so we hope it will be useful to many scientists in the future.

Q: The article is a prime example of how microscopy practices have evolved. Can you tell us more about the role Nature Methods plays in scientific reproducibility?

RS: Our journal strives to publish methods that are immediately practically useful to our broad audience of readers in the life sciences. As such, having reproducible methods is front-of-mind when we review and revise papers. For software papers like this one, we ask that the authors provide code and test data, and we ask the referees to test it and provide feedback. We also ask that software be made freely available upon publication. Along these lines, we offer a service via Code Ocean to help improve the review and dissemination of custom code published in our pages.

For wet or experimental methods, we ask referees to ensure that the method could be repeated based on the information provided, and try to ensure that claims made in our papers are clearly backed up by evidence. Along with other Nature Portfolio papers, we also ask authors to fill out statistics and reproducibility checklists to ensure their papers comply with our requirements. 

Further, we uphold data deposition requirements for various fields, and have developed our own reporting checklists for specific subfields. The end goal is for readers to be confident that if they invest the time, energy, and money to try a new method we've published, it will be worth their resources.


About Rita Strack

Rita Strack is a Senior Editor at Nature Methods and has been with the journal since 2014. She developed fluorescent probes for imaging in live cells during her graduate work at the University of Chicago and during her postdoctoral fellowship at Weill Cornell Medical College. Rita believes methods advances drive discovery, and she is excited about the future of microscopy and imaging for biologists. At home, she is a devoted mother of two children and an avid equestrian.


To license Nature Methods, or any Nature Portfolio journal, please contact your local sales & account development manager.


Author: Guest contributor

Guest Contributors for THE LINK include Springer Nature staff and authors, industry experts, society partners, and many others. If you are interested in being a Guest Contributor, please contact us via email.