Speakers:
Isobel Ojalvo
(Princeton University),
Kiley Kennedy
(Princeton University),
Lino Gerlach
(Princeton University),
Mila Bileska
(Princeton University)
- torchlogix is our Python library for training LGNs (logic gate networks)
- has functionality to write out a trained (frozen, discretized) model in C & Verilog
- does this layer-by-layer (only works for sequential models)
- any optimization must happen at a later stage
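The layer-by-layer export above can be sketched as follows. This is a minimal illustration of emitting one C statement per frozen gate, not the actual torchlogix code generator; the gate table and connection format are assumptions.

```python
# Hypothetical sketch of layer-by-layer C code generation for a frozen,
# sequential logic gate network. Each neuron is a fixed 2-input gate wired
# to two outputs of the previous layer; we emit one C statement per gate.

GATE_EXPR = {
    "AND":  "({a} & {b})",
    "OR":   "({a} | {b})",
    "XOR":  "({a} ^ {b})",
    "NAND": "(!({a} & {b}))",
}

def emit_layer(layer_idx, gates, connections):
    """Emit C statements for one layer; inputs come from layer layer_idx-1."""
    lines = []
    for out_idx, (gate, (i, j)) in enumerate(zip(gates, connections)):
        expr = GATE_EXPR[gate].format(a=f"l{layer_idx - 1}[{i}]",
                                      b=f"l{layer_idx - 1}[{j}]")
        lines.append(f"l{layer_idx}[{out_idx}] = {expr};")
    return lines

# Two-gate layer reading from layer 0:
print("\n".join(emit_layer(1, ["AND", "XOR"], [(0, 1), (1, 2)])))
# → l1[0] = (l0[0] & l0[1]);
#   l1[1] = (l0[1] ^ l0[2]);
```

Because each layer is emitted in isolation, cross-layer simplifications (shared subexpressions, dead gates) are invisible at this point, which is why any optimization has to happen later.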
- proper compilers (transpilers): first build a computational graph, then translate it
- allows optimizations before writing out
- also allows arbitrary model structures
- e.g. smartpixels: a conv layer on the charge profile, then combined with high-level features
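A toy version of the graph-first approach, to make the contrast concrete. This is not any library's actual IR; the `Node` class and the dead-node pass are illustrative assumptions showing why having the whole graph enables optimization before code is written out.

```python
# Minimal sketch of the compiler/transpiler approach: record ops as a graph
# first, run an optimization pass (here: dead-node elimination), then emit
# code only for the surviving nodes.

from dataclasses import dataclass

@dataclass(eq=False)
class Node:
    op: str            # "input", "and", "xor", ...
    args: tuple = ()   # predecessor nodes

def live_nodes(outputs):
    """Keep only nodes reachable from the outputs (dead-node elimination)."""
    keep, stack = set(), list(outputs)
    while stack:
        n = stack.pop()
        if id(n) not in keep:
            keep.add(id(n))
            stack.extend(n.args)
    return keep

a, b = Node("input"), Node("input")
x = Node("and", (a, b))
dead = Node("xor", (a, b))   # computed but never used downstream
out = Node("xor", (x, b))

live = live_nodes([out])
print(id(dead) in live)  # → False: the unused gate is dropped before codegen
```

Since the graph is just nodes and edges, it also has no notion of "sequential": branching and merging structures (like a conv branch joined with high-level features) fall out for free.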
- da4ml is such a library: its tracer overloads a NumPy backend
- we are working on integrating PyTorch, so that it works with torchlogix
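The general technique behind such a tracer can be sketched in a few lines: overload the operators on a proxy value, then run the model's ordinary forward function once so every operation records itself into the graph. The class below is an assumption for illustration, not da4ml's API.

```python
# Sketch of an operator-overloading tracer: running a plain Python function
# on Traced values records the computational graph instead of computing.

class Traced:
    def __init__(self, op, args=()):
        self.op, self.args = op, args
    def __and__(self, other):
        return Traced("and", (self, other))
    def __xor__(self, other):
        return Traced("xor", (self, other))

def model(a, b):
    # Any forward function written in terms of these operators is traceable,
    # regardless of its structure (no sequential restriction).
    return (a & b) ^ b

g = model(Traced("input"), Traced("input"))
print(g.op, [n.op for n in g.args])  # → xor ['and', 'input']
```

A PyTorch integration would apply the same idea to tensor ops, so that a torchlogix model's forward pass can be traced into the same graph representation.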