View of Field

Building a Fluorescence Imaging System: Part 2 - System Design

CAD model of the demo fluorescence imaging system

We discussed previously how we settled on a design concept for our demo imaging system. To recap, what we ultimately needed was:

  • Simple, real-time NIR fluorescence images without jeopardizing anyone’s health (no lasers)
  • ICG compatibility: ICG phantoms comprise a large part of our product portfolio, making it simpler to demo the range of products we offer without expanding hardware costs.
  • Budget-conscious: built from existing lab inventory for under $500
  • Professional appearance: needed to look like a finished product concept, not a lab prototype run from the command line with a rat’s nest of cables.
  • Conference-ready: standalone, compact, transportable, and operable from a battery pack all day

With these high-level user needs in mind, we could begin to figure out what hardware and software was needed to make it happen.

Hardware Design

We pulled together what we had available in our lab and settled on some of the following hardware:

  • ZWO ASI462MC camera with NIR-sensitive sensor - this little sensor works surprisingly well for NIR imaging despite being a color camera
  • RICOH f/4.0 machine vision lens (23° FOV, 250mm working distance) - it was what we had around the lab, but it turned out to be well suited to the system footprint and our product sizes.
  • Thorlabs M730-L6 LED (730nm @ 800mW) with T-Cube driver - low power draw and simple to control with GPIO triggering.
  • Raspberry Pi 5 (8GB RAM) as the control system - cheap and quick to develop around
  • Raspberry Pi 7.5" touchscreen for integrated display and control - fewer dependencies to fight with on a tight timeline.
  • Custom 3D-printed enclosure to mechanically package everything together.
  • Battery bank (>20Ah for all-day use, <100Wh for air travel compliance)

Camera Selection: ZWO ASI462MC camera

Stock image of the ZWO camera used in this imaging system

We chose this camera for a few reasons - the main one being that we had one available in the lab. But there’s a reason we like this camera.

ZWO sells camera systems for astrophotography. They provide extensive technical specifications on the camera’s performance, making it much easier to ensure we are getting what we need. This particular model uses a Sony IMX462 - a back-illuminated CMOS sensor that drastically improves light sensitivity and was designed with NIR sensitivity in mind. Ideally, we would prefer a monochrome sensor to a color sensor to maximize light detection - but we had this one available and knew we could make it work.

The sensor itself is packaged with ZWO’s custom DSP and firmware - which is specifically designed around low-light imaging applications. This is critical for fluorescence imaging, where we are typically starved for photons.

Technical specifications graphic of the ZWO camera used in this imaging system

Machine vision cameras for low-light applications pay more attention to specifications like dark noise, read noise, and higher bit-depth image readout, which become critical for fluorescence imaging performance. Further, the available mechanical drawings make it easier to mock up mechanical designs for the system in CAD before building.

We were able to fit an 800nm long pass filter behind the camera lens - which is critical for fluorescence imaging: it blocks the excitation light while passing the NIR fluorescence emission. Without it, we would struggle to get any usable contrast in our NIRF images.

Imaging Lens: RICOH f/4.0 12mm machine vision lens

Stock image of the machine vision lens used in this imaging system

We only needed to image a 50-100mm field of view to support most of our reference target products. And we needed a relatively close working distance to maximize fluorescence signal collection. This RICOH f/4.0, 12mm lens was lying around the lab - close but not perfect for what we needed. But it gets the job done in a small form factor.

When picking a lens for fluorescence imaging, you typically want to minimize the f/# and field of view for the camera sensor size being used. Minimizing the f/# of your imaging lens will maximize light collection, but will yield a shallow depth of field. Your device will need to balance light collection with depth of field. Variable aperture lenses let you test this out to strike the right balance for your product design.
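The f/# tradeoff above can be put in rough numbers. Light collection scales as 1/N², while total depth of field at close focus scales roughly linearly with N under the common thin-lens approximation DOF ≈ 2·N·c·(m+1)/m². The circle-of-confusion and magnification values below are illustrative assumptions, not measured values for this system:

```python
def relative_light_collection(f_new, f_old):
    """Light gathered scales as 1/N^2, so opening up from f_old to f_new
    changes collected light by a factor of (f_old / f_new)^2."""
    return (f_old / f_new) ** 2

def depth_of_field_mm(f_number, coc_mm, magnification):
    """Approximate total depth of field at close focus (thin-lens rule of thumb):
    DOF ~= 2 * N * c * (m + 1) / m^2."""
    return 2 * f_number * coc_mm * (magnification + 1) / magnification ** 2

# Opening up from f/4 to f/2 quadruples the light but halves the depth of field.
light_gain = relative_light_collection(2.0, 4.0)  # 4.0x
dof_f4 = depth_of_field_mm(4.0, 0.01, 0.056)      # ~27 mm
dof_f2 = depth_of_field_mm(2.0, 0.01, 0.056)      # ~13 mm
```

This is why a variable aperture is handy during bring-up: you can walk the f-stop until the fluorescence signal and in-focus depth are both acceptable.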

Additionally, smaller fields of view make it easier to optimize illumination uniformity - a critical determinant of fluorescence imaging performance. Typically, longer focal length lenses for a given f/# will narrow your field of view. A large field of view makes illumination uniformity challenging to maintain, while a narrow field of view will limit the size of features you can completely image.
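A quick thin-lens sanity check of field of view is worth doing before committing to a lens. The sketch below assumes an active sensor width of about 5.6mm for the 1/2.8"-format IMX462 (an assumption; check the datasheet for exact dimensions):

```python
import math

def horizontal_fov_mm(sensor_width_mm, focal_length_mm, working_distance_mm):
    """Object-side field width via thin-lens magnification m ~= f / (WD - f)."""
    m = focal_length_mm / (working_distance_mm - focal_length_mm)
    return sensor_width_mm / m

def angular_fov_deg(sensor_width_mm, focal_length_mm):
    """Full angular field of view across the sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 12mm lens at 250mm working distance covers roughly 110mm of object field
# across a ~5.6mm sensor - in the ballpark of the 50-100mm target.
field_mm = horizontal_fov_mm(5.6, 12.0, 250.0)
angle_deg = angular_fov_deg(5.6, 12.0)
```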

In a perfect world, we might source a telecentric lens with a low f/# and 100mm field of view for maximal fluorescence imaging performance. But our timeline, mechanical envelope, and budget did not let us be too picky about our imaging optics here. Ultimately this lens was going to get us most of what we needed.

Embedded Systems: Raspberry Pi 5, 8GB with 7.5” touchscreen display

Stock images of the Raspberry Pi touch display used for this imaging system.

The software development ecosystem and low power demands of Raspberry Pis are hard to compete with on tight timelines and budgets. Other options are available, but this system was going to be easier and quicker to develop around for under $250.

Conveniently, the Python bindings for the camera drivers we use for the ZWO camera support ARM processors like the RPi, which lets us get an operational prototype running quickly.
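A minimal capture loop with the python-zwoasi bindings looks something like the sketch below. The SDK library path, pin-point control values, and the display-stretch helper are assumptions for illustration, not our shipped configuration:

```python
def stretch_to_8bit(values, black, white):
    """Linearly map raw 16-bit intensities onto 0-255 for on-screen display."""
    span = max(white - black, 1)
    return [min(255, max(0, (v - black) * 255 // span)) for v in values]

if __name__ == "__main__":
    # Hardware-dependent portion - requires the ZWO ASI SDK shared library
    # and the python-zwoasi bindings (pip install zwoasi).
    import zwoasi as asi

    asi.init("/usr/local/lib/libASICamera2.so")  # SDK path is an assumption
    cam = asi.Camera(0)
    cam.set_image_type(asi.ASI_IMG_RAW16)
    cam.set_control_value(asi.ASI_EXPOSURE, 100_000)  # exposure in microseconds
    cam.set_control_value(asi.ASI_GAIN, 300)
    frame = cam.capture()  # raw frame as a numpy array
    print(frame.shape)
```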

Further, the Pi has integrated GPIO controls and libraries which we will use to toggle the illumination with TTL communication.
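One simple pattern for the TTL trigger is to wrap the LED in a context manager so it always switches off after a capture, even on an error. The gpiozero device and BCM pin number in the hardware section are illustrative assumptions:

```python
from contextlib import contextmanager

@contextmanager
def illuminated(led):
    """Hold the excitation LED on (TTL high) for the duration of a capture,
    guaranteeing it switches off even if the capture raises an exception."""
    led.on()
    try:
        yield
    finally:
        led.off()

if __name__ == "__main__":
    # On the Pi itself, a gpiozero output device provides on()/off().
    from gpiozero import DigitalOutputDevice
    trigger = DigitalOutputDevice(17)  # BCM pin 17 is an assumption
    with illuminated(trigger):
        pass  # trigger the camera capture here
```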

And one of the key selling points for this architecture was the natively supported touch screens that Raspberry Pi offers. This let us develop a simple control GUI without needing an external computer to run it. That saves on cable management and makes the user experience more robust.

Illumination: Thorlabs M730-L6 LED with T-Cube driver

Stock image of a Thorlabs LED driver

We use a lot of Thorlabs equipment for system design. The nice thing about their LEDs is the flexibility in colors, optical powers, and drive electronics that are cross-compatible with their hardware ecosystem. Choosing Thorlabs hardware makes it simpler to extend the system to more colors in the future.

We like the T-Cube LED drivers specifically because they are TTL-controllable, which integrates seamlessly with the Raspberry Pi GPIO without a bunch of extra drivers and debugging. Also, the manual power control is useful for getting the system dialed in quickly for field use.

We were not completely sure how much illumination power we needed for this system. We opted to design around an 800mW 727nm LED.

Stock image of the Thorlabs LED mount with heat sink and adjustable collimation optics.

This is not the optimal wavelength for ICG - most systems use ~785nm. But we wanted the system to work with other fluorophores in the future, and we knew from experience that 727nm was suitable for ICG - albeit not optimal. This also gave us flexibility in optical filter selection because we did not have to worry about stray illumination bleeding into our fluorescence imaging passband. We will cover filter selection in another post - as it’s a critical but loaded topic.

In hindsight, 800mW was overkill. But that power overhead gave us the peace of mind that we would be able to capture decent quality NIRF images.

We used an adjustable collimating asphere to direct the LED illumination output, which offers some flexibility in physically building the system. Combined with a K-cube dichroic mirror mount, we could design a compact epi-illumination path to integrate with the camera without a ton of issues.

System Enclosure

Component-colored CAD rendering of the fluorescence imaging system

We do a lot of 3D printing. And when we are limited in the hardware we can use, 3D printing becomes a critical tool for prototyping things like this. We use CAD heavily to build custom components, which lets us get creative with fixturing and packaging off-the-shelf components. Further, 3D printing and CAD saves a ton of time in getting an operational prototype working.

The enclosure needed to be compact for transport, but also hold our optomechanical components securely for system operation. Ultimately, we settled on a design that held all of the critical components needed and only needed two externally-run power cables to power the entire system.


Software Stack

Software block diagram for controlling and operating the fluorescence imaging system

Ultimately, the software needed to be simple and robust. To achieve this, we opted for a lightweight, locally hosted web server (via Flask) that used JavaScript to execute Python scripts controlling all of the system hardware and rendered the data to an HTML webpage. In the end, after dependency management, we had a usable software system with a web-based GUI operating in under 500 lines of code.
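The skeleton of that architecture is small. The sketch below is a hypothetical minimal version - the route names and the `/capture` endpoint are illustrative assumptions, not our actual application code:

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/")
def index():
    # In the real system this returned the touch-screen GUI page.
    return "<h1>NIRF Demo</h1>"

@app.route("/capture", methods=["POST"])
def capture():
    # Hypothetical endpoint: the JavaScript capture button POSTs here,
    # which would trigger the LED and camera before returning a result.
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    # Serve locally so the Pi's own browser can drive the GUI.
    app.run(host="127.0.0.1", port=8080)
```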

  • Stock 64-bit Raspbian OS
  • Python bindings for our ZWO camera’s SDK
  • Flask for hosting a simple local web dev server
  • Simple touch-screen compatible web GUI for image capture and contrast adjustment
  • OpenCV (cv2) for some image processing and false color mapping
  • JSON for caching capture parameters
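The JSON parameter cache amounts to a few lines of stdlib code. This is a sketch of the idea, with illustrative default values rather than the actual settings we shipped:

```python
import json
from pathlib import Path

# Illustrative defaults - not the actual values used in the demo system.
DEFAULTS = {"exposure_us": 100_000, "gain": 300, "false_color": True}

def load_settings(path):
    """Load cached capture parameters, falling back to defaults for any
    key missing from the cache file (or when no cache exists yet)."""
    p = Path(path)
    if p.exists():
        return {**DEFAULTS, **json.loads(p.read_text())}
    return dict(DEFAULTS)

def save_settings(path, settings):
    """Persist capture parameters so they survive a reboot at a conference."""
    Path(path).write_text(json.dumps(settings, indent=2))
```

Merging the cached file over the defaults means a new parameter can be added to the code later without invalidating old cache files.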

Close up of the imaging system display showing the GUI layout from a user perspective.

The GUI was simple. It had:

  • A capture button to trigger image acquisition
  • Contrast sliders to adjust the rendered image on the webpage
  • A simple intensity scale bar for image visualization.
  • Some troubleshooting tools for capture setting verification and adjustment
  • A little bit of branding to make it look more polished
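The false color mapping behind the rendered image can be sketched without OpenCV as a simple intensity ramp. This hypothetical helper mimics the spirit of `cv2.applyColorMap`, not our exact colormap:

```python
def false_color(value):
    """Map an 8-bit intensity to an RGB triple using a 'hot'-style ramp
    (black -> red -> yellow -> white), so bright fluorescence pops out
    against a dark background."""
    r = min(255, value * 3)
    g = min(255, max(0, (value - 85) * 3))
    b = min(255, max(0, (value - 170) * 3))
    return (r, g, b)
```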

What is really nice about this setup is that it is extremely configurable and adaptable to future development. It would not produce reference-grade quantitative images, but it did show NIRF images to lend credibility with prospective customers when demoing our products. We plan to extend it in the future to include camera capture settings and a slicker UI with fancier JavaScript, but we could show it as is without too much worry.

Design Tradeoffs

Clear requirements are the easy part. The real work happens when you start making decisions and compromises:

Battery power vs. illumination intensity: Finding high-output LEDs that could run on battery power proved challenging. We accepted some limitations in signal strength and wavelength specifications to maintain portability and extensibility to other fluorophores. In the end, the camera and LED we used were more than suitable for our needs.

Lens optimization: Our existing machine vision lenses weren't NIR-optimized, which could limit system sensitivity. The more wavelengths a system needs to image, the more broadband lens optimization matters. Luckily, it was not an issue for us with monochrome imaging and enough illumination power overhead. But lens selection is worth revisiting in future iterations.

Software deployment: The Linux-based capture engine was simple to develop and easy to operate - perfect for demos. But we knew upfront it wouldn't be suitable for reference measurements without significant additional work to support full bit depth image capture and graphical adjustment of capture settings. It was fine for proof of concept, not ready for the QC station. We were willing to make that compromise here.

Mechanical assembly: The initial design was meant to be a standalone, friction-fit assembly for easy flat-packing. In practice, we needed mechanical fasteners to preserve function and stability. This changed some of the original design intent for the enclosure's mechanical features, which we noted for future improvements.

Summary

Each one of these parts of the system could be a blog post of its own - and some will be! We will focus on the high points in this series. But next up, we will dive deep into our approach to choosing the right camera for NIRF imaging.

Let us know if you want us to dive deeper into a particular topic! Drop us a note at feedback@quelimaging.com.