Hi Robert, Carlo, Sal, Pierre,
I think it would be good to follow up on the software project. Pierre has made some progress here, and it would be good to define tasks a little more clearly so we can keep moving…
There is always the potential issue of having “too many cooks in the kitchen” (each with a different recipe for the same thing), something I have noticed can get quite confusing/frustrating when writing software together, so it would be good to assign tasks clearly. I talked to Pierre today and he would be happy to integrate things into the framework we have, to tie things together. What is foremost needed are ways of treating data, i.e. code that takes a raw spectral image plus metadata and converts them into a “standard” format (spectral representation) that can then be fitted. Beyond that, “plugins” that each serve a specific purpose in the analysis/rendering and can be included in the framework.
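To make the plugin idea a bit more concrete, here is a minimal sketch of how such a registry could look. Everything here is purely illustrative (`PLUGINS`, `register_plugin`, the stage names, and `load_tif` are hypothetical, not an agreed API):

```python
# Hypothetical plugin registry sketch -- names are illustrative only,
# not the actual framework API.

PLUGINS = {}

def register_plugin(stage):
    """Register a function as a plugin for a pipeline stage
    (e.g. 'load', 'fit', 'render', 'extract')."""
    def decorator(func):
        PLUGINS.setdefault(stage, {})[func.__name__] = func
        return func
    return decorator

@register_plugin("load")
def load_tif(path, metadata):
    """Placeholder: convert a raw .tif spectral image + metadata
    into the standard spectral representation."""
    ...

# The framework can then discover all loaders at runtime:
print(sorted(PLUGINS["load"]))  # ['load_tif']
```

This way, contributors working on different instruments only need to write a loader for their own format; the framework picks it up without anyone touching the core code.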
The way I see it (and please comment if you see differently), there are ~4 steps here:
Take raw data (in .tif, .dat, .txt, etc. format) and metadata (in .csv, .xlsx, .dat, .txt, etc.) and render a standard spectral representation. Also take the provided instrument response in one of these formats and extract its key parameters.
Fit the data with a drop-down menu of functions that includes different functional dependences as well as functions corrected for the instrument response.
Generate/display a visual representation of the results (frequency shift(s) and linewidth(s)) that is ideally interactive to some extent (and maybe has some funky features, like inspecting the spectra at individual points). These can be spatial maps and/or the evolution with some other parameter (time, temperature, angle, etc.). Also be able to display maps of relative peak intensities in the case of multiple-peak fits, and whatever else useful you can think of.
Extract “mechanical” parameters given assigned refractive indices and densities.
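For step 4, a minimal sketch using the standard backscattering relations V = nu_B * lambda / (2 n sin(theta/2)) and M' = rho * V^2. The function names and the water-like example values are just for illustration:

```python
import numpy as np

def sound_velocity(shift_ghz, wavelength_nm, n, theta_deg=180.0):
    """Acoustic velocity (m/s) from the Brillouin shift (GHz),
    probe wavelength (nm), refractive index n and scattering angle."""
    shift_hz = shift_ghz * 1e9
    wavelength_m = wavelength_nm * 1e-9
    return shift_hz * wavelength_m / (2.0 * n * np.sin(np.radians(theta_deg) / 2.0))

def longitudinal_modulus(shift_ghz, wavelength_nm, n, rho):
    """Longitudinal storage modulus M' (Pa) given density rho (kg/m^3),
    assuming backscattering geometry."""
    v = sound_velocity(shift_ghz, wavelength_nm, n)
    return rho * v**2

# Water-like example: 7.5 GHz shift at 532 nm, n = 1.33, rho = 1000 kg/m^3
v = sound_velocity(7.5, 532.0, 1.33)          # -> 1500.0 m/s
M = longitudinal_modulus(7.5, 532.0, 1.33, 1000.0)  # -> 2.25e9 Pa (2.25 GPa)
```

Since these functions take arrays just as well as scalars, the same code would turn a fitted shift map directly into a modulus map.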
I think fitting modified functions (e.g. corrected based on the instrument response) makes more sense than deconvolving the spectra, as it can account for more complex corrections due to non-optical anomalies in the future (ultimately even functional variations in the vicinity of e.g. phase transitions). It is also less error prone: systematically doing deconvolution with non-ideal registration data can really throw you off the cliff, so to speak.
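As a sketch of what fitting a corrected function could look like: a Lorentzian line convolved with a Gaussian instrument response is a Voigt profile, so the IRF width enters the model directly and the fitted linewidth comes out already corrected. `SIGMA_IRF` and the synthetic spectrum below are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import voigt_profile

# Assumed Gaussian std (GHz) of the instrument response, as would be
# extracted from the registration data in step 1 -- illustrative value.
SIGMA_IRF = 0.10

def corrected_peak(f, amplitude, shift, gamma):
    """Voigt profile: Lorentzian of HWHM `gamma` centred at `shift`,
    convolved with the Gaussian instrument response."""
    return amplitude * voigt_profile(f - shift, SIGMA_IRF, gamma)

# Synthetic test spectrum: one peak at 7.5 GHz with gamma = 0.3 GHz.
rng = np.random.default_rng(0)
f = np.linspace(5.0, 10.0, 400)
data = corrected_peak(f, 1.0, 7.5, 0.3) + rng.normal(0.0, 0.01, f.size)

popt, _ = curve_fit(corrected_peak, f, data, p0=[1.0, 7.0, 0.2])
# popt recovers amplitude, shift and the IRF-corrected linewidth,
# with no deconvolution of the measured spectrum needed.
```

The same approach extends naturally to more complex corrections: whatever the forward model of the instrument is, it goes into the fit function rather than being inverted out of the data.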
My understanding is that we more or less agreed on an initial metadata reporting format. Getting from step 1 to step 2 will no doubt be the most challenging part, as it is very instrument specific, so instructions will need to be written for the different BLS implementations.
This is a huge project if we want it to be all-inclusive, so I would suggest focusing on making it work for just a couple of modalities first (e.g. crossed VIPA, time-resolved, anisotropy, and maybe a time or temperature course of one of these). Extensions should then be easier to navigate. At some point I think it would be good to bring in SBS-specific considerations as well.
I think it would be good to discuss over email for a while to gather thoughts and opinions (and already start sharing code), and then plan a meeting for the beginning of March -- how does the first week of March look for everyone?
I created a mailing list (software(a)biobrillouin.org) we can use for discussion. You should all be able to post to it (and it makes it easier to bring anyone else in along the way).
At the moment this mailing list includes Robert, Carlo, Sal, Pierre, and myself. Let me know if I should add anyone.
All the best,
Kareem