
Comments (9)

henrypinkard commented on September 3, 2024

Example of FPGA use in micro-manager: https://github.com/jdeschamps/MicroFPGA


nanthony21 commented on September 3, 2024

I often use TTL synchronization between a Hamamatsu camera and various kinds of tunable filters. This involves TTL out of the camera (indicating the end of an exposure) as well as TTL into the camera (triggering a new exposure).

The difficulty in developing a standard API is that there is such huge variety in how much configuration complexity each device supports.


jdeschamps commented on September 3, 2024

Example of FPGA use in micro-manager: https://github.com/jdeschamps/MicroFPGA

We use that. It has a simple layout for synchronization: it receives a camera TTL trigger (high at the start of each frame, low in between frames), processes it into more complex patterns, and redistributes it to all lasers (TTL). We had a similar system with an Arduino before. The reason we switched to the FPGA was that we could have >4 lasers triggered in parallel with pulses as short as 1 µs (by design, not limited by the FPGA), each following an independent complex sequence (a 16-bit sequence of 0 = OFF, 1 = ON) on either the rising or falling edge of the camera trigger.

All that in one board, alongside the control of many other devices (servomotors for filters and lenses, TTL for flip-mirrors and brightfield switches, analog output for AOM/AOTF, PWM for custom laser power, analog read-out to check focus stabilization, temperature, and laser power). It is programmed in an HDL-like language from the manufacturer. In MM, all "subdevices" (laser trigger, servomotors, TTL device, PWM device, analog in) are independent but sit under a shared Hub device.
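
To make the sequencing concrete, here is a plain-Python sketch of the logic described above. This is purely an illustration, not MicroFPGA's actual firmware or API:

```python
# Plain-Python sketch (not MicroFPGA code): each laser follows an
# independent 16-bit pattern, advanced one step per camera frame trigger.

def laser_states(sequences, frame_index):
    """Return the ON/OFF state of each laser for a given camera frame.

    sequences -- dict mapping laser name to a 16-bit int, where bit i
                 (MSB first) gives the laser state on frame i mod 16.
    """
    step = frame_index % 16
    return {name: bool((pattern >> (15 - step)) & 1)
            for name, pattern in sequences.items()}

# Example: the 561 laser fires on every frame, the 488 on every other frame.
sequences = {"488": 0b1010101010101010, "561": 0b1111111111111111}
for frame in range(4):
    print(frame, laser_states(sequences, frame))
```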

Something that I've seen in many microscopes in the context of FPGAs is to use the board to trigger the camera(s) as well. I have no experience with this, so I can't really comment on how much of the "camera API" is then under the responsibility of the triggering device (exposure time, interval between frames). I guess one of the big advantages is that when you have devices other than lasers to synchronize (e.g. galvos for beam positioning), it is easier to maintain synchronization when you also trigger the camera (less delay?).

Finally, we also had a hacky setup with two cameras (different models, different sensor and image pixel sizes): one camera (EMCCD, Photometrics) would trigger the other camera (sCMOS, Hamamatsu), which in turn would send the trigger to be processed by the FPGA. Alternatively, the main camera could also trigger the FPGA in parallel with the second camera. Note that one camera was controlled by MM1.4 and the other by MM2, which highlights an important point for #23.

The difficulty in developing a standard API is that there is such huge variety in how much configuration complexity each device supports.

I also have trouble picturing it. Do you know how it is done in the big Python packages for microscope control that are around? It would be interesting to know what choices were made there.


campagnola commented on September 3, 2024

I have mostly done this with NI DAQ devices plus camera/shutter/etc. TTL lines. Every type of device is going to have its own idiosyncrasies regarding synchronization, but a high-level API should allow the user to specify that they want Device1 to wait for a trigger from Device2, and then tell Device2 to start. The details of that triggering process can be negotiated by the code and configuration supporting the devices. For example, in my configuration I would specify that a camera can trigger the DAQ on a specific PFI line, or that the DAQ can trigger the camera by raising a specific DO line. The supporting code then decides what actions are needed to synchronize the devices.
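
As a sketch of what such a high-level API could look like (all names here are hypothetical; nothing below is existing Micro-Manager code):

```python
# Hypothetical sketch only -- none of these classes exist in Micro-Manager.
# The user declares *that* one device waits for a trigger from another;
# device-support code negotiates which physical line carries the trigger.

class Camera:
    def arm_external_trigger(self):
        print("camera: armed, waiting on external trigger input")

class Daq:
    def start(self, line="PFI0"):
        print(f"daq: starting; raising {line} to trigger the camera")

def synchronize(waiter, starter):
    """Tell `waiter` to wait for a trigger, then tell `starter` to start."""
    waiter.arm_external_trigger()
    starter.start()

synchronize(Camera(), Daq())
```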


kbellve commented on September 3, 2024

Specifically, what devices do you use? (e.g. FPGA, Arduino, National Instruments board)

I programmed the Heka ITC18 Device Adapter specifically for hardware synchronization. It predates µManager's Hardware Synchronization API, and I never adopted that API in the ITC18 Device Adapter due to its limitations.

This is a system I developed in the late '90s to early 2000s, but brought over almost completely when I switched to µManager around 2007/2008. Porting it to µManager didn't take much development time because it was already working and proven in image acquisition software I had written previously.

How do you program that device?

I have a main CSH script that outputs time alongside a 16-bit digital number encoding the TTL states of the ITC18. That script manages everything that needs a TTL signal (camera(s), shutters, laser on/off). I then have other scripts for ±10 V devices (piezos), which output time alongside an analog value, using the same time base for synchronization. Everything gets concatenated into a single file, which I call an imaging protocol file. Repeatable TTL/voltage sequences can be used. The ITC18 Device Adapter uses a thread that processes this file and continually feeds the ITC18's buffer.
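
The actual file layout is not reproduced here, but going by the description (a shared time base, 16-bit TTL words, and analog values), a protocol line and its parser might look like this hypothetical Python sketch:

```python
# Hypothetical reading of the imaging-protocol format described above;
# the real ITC18 file layout may differ. Each line pairs a time with
# either a 16-bit TTL word or an analog voltage, all on one time base.

def parse_protocol(lines):
    """Yield (time_s, kind, value) tuples from protocol-file lines like:
         0.000  TTL  0b0000000000000101   # camera + shutter bits high
         0.010  AO   3.25                 # piezo voltage
    """
    for line in lines:
        line = line.split("#")[0].strip()  # drop comments and blanks
        if not line:
            continue
        t, kind, value = line.split()
        yield (float(t), kind,
               int(value, 0) if kind == "TTL" else float(value))

demo = ["0.000 TTL 0b0000000000000101", "0.010 AO 3.25"]
print(list(parse_protocol(demo)))
```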

I use bsh scripts inside Micro-Manager to load the imaging protocol file into the ITC18 Device Adapter, and to set up and initiate µManager for the acquisition. µManager's role is just to wait for incoming images.

What are the essential features provided by this setup?

Timing precision and accuracy, speed, flexibility, and operation that is asynchronous from the operating system and µManager. µManager (and the ITC18 Device Adapter) is only used to keep the ITC18's internal buffer filled. I can control devices from 12 Hz to 200 kHz with microsecond accuracy.

A huge advantage is that everything is external to the imaging application. Everything is in readable text (scripts and imaging protocols) and can easily be modified or validated with a text editor. Adding another TTL device just means modifying a text script (adding it to the bit pattern). Validating an imaging protocol requires only a text editor rather than a debugger.

And how could this be generalized into a device/type API?

Uhm...

I have to think about this...


kasasxav commented on September 3, 2024
  • We use a National Instruments board

  • The python library nidaqmx (https://nidaqmx-python.readthedocs.io/en/latest/)

  • The essential features are opening and closing tasks, writing/reading analog or digital data on those tasks, and some triggering and sync functions (see the sketch below)
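
For illustration, a minimal nidaqmx sketch of those features; the device and terminal names ("Dev1", "ao0", "PFI0") are placeholders for whatever your wiring uses:

```python
# Minimal nidaqmx sketch: play a finite analog waveform, started by a
# digital edge from a camera's exposure-out line wired to PFI0.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    # 1000 samples at 10 kHz, played once per start trigger.
    task.timing.cfg_samp_clk_timing(
        rate=10_000, sample_mode=AcquisitionType.FINITE, samps_per_chan=1000)
    # Start the waveform on a rising edge arriving at PFI0.
    task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")
    task.write(np.sin(np.linspace(0, 2 * np.pi, 1000)), auto_start=False)
    task.start()  # arm; output begins when the trigger arrives
    task.wait_until_done(timeout=10.0)
```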


palmada commented on September 3, 2024

We use an NI board as the master controller for triggering because we generate a waveform and want to trigger at specific points in it.
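
One way to express that with nidaqmx is to clock a digital task off the analog task's sample clock, so trigger pulses land on exact waveform samples. This is a hedged sketch with placeholder device/line names, not necessarily the setup described above:

```python
# Correlated AO + DO: the digital pulses share the analog sample clock,
# so each trigger fires on a specific sample of the waveform.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

n = 1000
waveform = np.sin(np.linspace(0, 2 * np.pi, n))
pulses = np.zeros(n, dtype=bool)
pulses[[0, 250, 500, 750]] = True  # trigger at these waveform samples

with nidaqmx.Task() as ao, nidaqmx.Task() as do:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(
        rate=10_000, sample_mode=AcquisitionType.FINITE, samps_per_chan=n)
    do.do_channels.add_do_chan("Dev1/port0/line0")
    # Clock the digital pulses off the AO sample clock for exact alignment.
    do.timing.cfg_samp_clk_timing(
        rate=10_000, source="/Dev1/ao/SampleClock",
        sample_mode=AcquisitionType.FINITE, samps_per_chan=n)
    do.write(pulses.tolist(), auto_start=False)
    do.start()  # DO is armed first; it follows AO's clock edges
    ao.write(waveform, auto_start=False)
    ao.start()
    ao.wait_until_done(timeout=10.0)
    do.wait_until_done(timeout=10.0)
```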


jondaniels commented on September 3, 2024

We commonly do hardware triggering with ASI controllers. For anything complicated we use the Tiger controller with a "PLC" card that emulates an FPGA, so it can be programmed to do fairly complicated things. Programming that functionality is the bottleneck: somebody has to grok the signaling scheme well enough to translate it into a logic diagram, that diagram has to be translated into PLC functions, and that in turn has to be converted into a script setting MM properties (or, equivalently, a series of serial commands) to program the functionality into the card. Documentation including examples is at http://asiimaging.com/docs/tiger_programmable_logic_card.
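
For illustration, the "script setting MM properties" step might look like the following pymmcore sketch. The property names follow the ASI PLogic device adapter as best I recall, and the cell-type and config values are purely illustrative placeholders; check the documentation linked above before use:

```python
# Hypothetical sketch using pymmcore; property names and cell values
# below are illustrative, not a verified PLogic program.
import pymmcore

core = pymmcore.CMMCore()
core.loadSystemConfiguration("MMConfig_demo.cfg")  # your own config file

PLC = "PLogic"  # device label assigned in the MM hardware config

def program_cell(addr, cell_type, config, input1):
    """Point at a logic cell, then set its type, config, and first input."""
    core.setProperty(PLC, "PointerPosition", addr)
    core.setProperty(PLC, "EditCellType", cell_type)
    core.setProperty(PLC, "EditCellConfig", config)
    core.setProperty(PLC, "EditCellInput1", input1)

# e.g. make cell 1 a one-shot that stretches the camera trigger pulse
# (the type string and numeric codes here are placeholders)
program_cell(1, "14 - one shot", 10, 33)
```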

Overall this is super flexible and accessible via Micro-Manager properties. If you give us an API we can probably implement it with the PLC, but the card is so flexible that it's probably pointless to base an API on it...


henrypinkard commented on September 3, 2024

A belated thank you to all of you for all this helpful feedback!

We're finally moving forward on new features that address many of these limitations. Please feel free to chime in and get involved at: https://github.com/micro-manager/mmCoreAndDevices/issues

