bunpc / homer3
MATLAB application for fNIRS data processing and visualization
Homer3-1.80.2
MATLAB R2024a; Windows 11 Pro 64-bit 23H2
When applying the function ml = GetMeasList(obj, options) to a DataClass containing more than 1 condition the output has one extra row.
Steps to reproduce:
mlAct = dcAvg.GetMeasList();
Where dcAvg is a data class containing more than 1 condition.
This happens because the loop enters the if statement on line 373 (if obj.measurementList(ii).GetCondition()>1) and breaks, but ii has already been incremented, so one extra row survives when the unused rows are removed with
ml(ii+1:end,:) = [];
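A minimal sketch of one way to avoid the off-by-one (variable and field names are assumed from the report and from error traces elsewhere in this thread): count the kept rows separately, so the trim does not depend on where the loop index was at the moment of the break.

```matlab
% Sketch: count kept rows instead of reusing the loop index after break.
nRows = 0;
for ii = 1:length(measurementList)
    if measurementList(ii).GetCondition() > 1
        break;                      % ii now points at the first unwanted row
    end
    nRows = nRows + 1;              % count only the rows actually kept
    ml(nRows,:) = [measurementList(ii).sourceIndex, ...
                   measurementList(ii).detectorIndex];
end
ml(nRows+1:end,:) = [];             % trim on the count, not the loop index
```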
We want to calculate the t-score from the beta values.
See for example - https://www.brainvoyager.com/bv/doc/UsersGuide/StatisticalAnalysis/TheGeneralLinearModel.html
For that we need to extract the regressors for the GLM fit. Where can we find them?
Thank you
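For reference, once the design matrix (the regressors) is available, the t-score follows from the standard GLM formula. This is a generic sketch of that formula, not Homer3's internal API; X and y are assumed names for the design matrix and the measured time series.

```matlab
% Generic GLM t-score sketch (X, y are assumptions, not Homer3 outputs):
% X : [nTimePoints x nRegressors] design matrix, y : measured time series.
beta   = X \ y;                         % least-squares fit
res    = y - X*beta;                    % residuals
dof    = size(X,1) - size(X,2);         % degrees of freedom
sigma2 = sum(res.^2) / dof;             % residual variance estimate
covB   = sigma2 * inv(X'*X);            % covariance of the betas
t      = beta ./ sqrt(diag(covB));      % t-score for each regressor
```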
1.80.2
2017b runtime
Given the error, I figured it was due to 32-bit data in dataTimeSeries / time, and indeed, after converting the data to 64-bit the error is gone. I've attached the original and the converted file in the next section.
Steps to reproduce:
8001_00_24217_resampled.zip
8001_00_24217_original.zip
Expected behavior:
hmrR_MotionCorrectSpline shouldn't throw an error just because dataTimeSeries is stored as 32-bit. I validated the file with pysnirf2, and the SNIRF spec doesn't enforce 64-bit storage:
Actual behavior:
An error is thrown if dataTimeSeries is in 32-bit format.
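As a workaround until this is fixed, the 32-bit arrays can simply be cast to double after loading, instead of rewriting the file. A sketch, assuming the standard SNIRF field names:

```matlab
% Workaround sketch: cast 32-bit (single) arrays to double after loading,
% so downstream functions that expect 64-bit data do not error.
data.dataTimeSeries = double(data.dataTimeSeries);
data.time           = double(data.time);
```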
A user requested that we add axis labels to give the units
Hello,
I'm trying to find a way to add a new function to Homer3. Specifically, I'd like to bring the autoregressive prewhitening algorithm (AR-IRLS) from the NIRS Toolbox into Homer3. The issue is that this algorithm is implemented as a class rather than a function, so I'm not sure how to integrate it properly. Are there any tutorials on this, or has someone else run into a similar issue?
Thanks,
P
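One common pattern is to wrap the class inside a plain hmrR_*-style function, constructing the object internally. The sketch below is only illustrative: the nirs.modules.AR_IRLS class name and the run method are assumptions about the NIRS Toolbox API, and the data types passed in and out would still need converting between Homer3 and NIRS Toolbox formats.

```matlab
% Hypothetical wrapper sketch: Homer3 user functions are plain functions,
% so a class-based algorithm can be hidden behind one. The nirs.* names
% below are assumptions about the NIRS Toolbox API, not verified calls.
function data_out = hmrR_ARIRLS(data_in)
    job      = nirs.modules.AR_IRLS();   % assumed NIRS Toolbox class
    data_out = job.run(data_in);         % assumed method name
end
```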
Homer3, v1.80.2
OS and MATLAB independent
Homer3 assumes that the number of time points is greater than the number of channels. This is not always the case, especially with high-density data. As a consequence, Homer3 fails to load some high-density data. This issue has been addressed before (#79) but was not fully fixed.
Homer3 checks the time series when loading aux data. If the channel count is greater than the number of time points, length(dataTimeSeries) returns the channel count instead of the time-point count, because length() returns the size of the largest array dimension. The code therefore throws an error, and the aux data is incorrectly rejected.
Homer3/DataTree/AcquiredData/Snirf/AuxClass.m
Lines 261 to 263 in 666ab6d
Steps to reproduce:
Attempt to load a SNIRF file where the channel count is greater than the number of time points.
Expected behavior:
Load without error.
Actual behavior:
Homer3 indicates an error (-4), claiming the "aux" field is invalid and could not be loaded. The aux data is not loaded.
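A sketch of the kind of fix needed (variable names assumed): query the time dimension explicitly instead of relying on length(), which returns the largest dimension.

```matlab
% Sketch: use the time dimension explicitly instead of length().
% Assuming dataTimeSeries is [nTimePoints x nChannels]:
nTimePoints = size(dataTimeSeries, 1);
if nTimePoints ~= length(time)
    err = -4;   % only now is the aux field genuinely inconsistent
end
```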
Version of Homer3 you are using
MATLAB Runtime R2021b, MATLAB R2017b; Windows 10 64-bit
function SetLastCheckForUpdates(dt)
if ~exist('dt','var') && ispathvalid([getAppDir, 'LastCheckForUpdates.dat'])
return;
end
if ~exist('dt','var')
try
dt = datetime;
catch
dt = -1;
end
end
fd = fopen([getAppDir, 'LastCheckForUpdates.dat'],'wt');
fprintf(fd, '%s\n', dt);
fclose(fd);
I think fd should become a valid file identifier, but this code makes fd equal to -1, so the fprintf fails. Please fix the program or let me know what I am doing wrong.
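A sketch of a guarded version (the file name is taken from the snippet above; the error-handling style is an assumption). Checking the fopen result makes the failure explicit, and converting the datetime to char avoids fprintf choking on a datetime value.

```matlab
% Sketch: guard the fopen result before writing.
fname = [getAppDir, 'LastCheckForUpdates.dat'];
fd = fopen(fname, 'wt');
if fd < 0
    error('Could not open %s for writing', fname);
end
fprintf(fd, '%s\n', char(dt));   % char() so a datetime prints via %s
fclose(fd);
```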
development
All
To quote @dboas:
"often used length(d) because the number of time points has always been greater than the number of channels... well, we now have [high channel count data] where we have more channels than time points.
We have to fix all the places in the code where we assumed number of time points was greater than number of channels.
I found issues in hmrR_GLM i.e. line 200"
This would streamline stim editing for many users
v1.71.1
MATLAB R2022a, Windows 10 64-bit
I refer to the bug report I filed here: fieldtrip/fieldtrip#2198
It seems to be the same issue with the official Homer3.
Steps to reproduce:
Download e.g. the fnirs_tapping dataset: https://doi.org/10.5281/zenodo.5529797
Run:
snirf=SnirfClass
snirf.Load(snirf-file)
Expected behavior:
Load should return 0, and the object should contain a stimClass, probeClass and auxClass.
Actual behavior:
It will return -1, contain a deleted stimClass and empty probeClass and auxClass. This is a downstream effect of strings not being read properly.
Hi, just a little question here, why choose MATLAB?
If it could be built on a Python environment, what is the main obstacle?
Thanks!
Homer3 v1.80.2; main release
MATLAB Runtime R2017b (9.3); Windows 10 64-bit
When loading a standalone SNIRF file, the command line reports there is an issue with the data field. Compliance with the SNIRF data format was ensured, but the error still exists.
The only difference from a normal SNIRF file is that the dataTimeSeries is not real fNIRS data, but an array of random doubles used for testing the output of an export program.
All the data formats were checked and extra optional fields were included in the file. Also, the pysnirf2 library was used to check the validity of the file and adherence to the specification, and the file has no errors.
Interestingly, the test.snirf file provided in the repository gets loaded correctly, even though pysnirf2 reports multiple FATAL errors.
Steps to reproduce:
snirf_test.zip
Find the attached file (snirf_test.snirf) and include it in your SubjDataSample folder. Then open Homer3 and notice the output in the CLI: it reports an error about the data field.
Expected behavior:
The file should load correctly and without issue.
Actual behavior:
The file does not get loaded, no matter the configuration, included fields and so on.
The error that is visible in the output is not very informative:
DataFilesClass.ErrorCheck - ERROR: In file "snirf_test.snirf" "data" field is invalid.. File will not be added to data set
pysnirf2 output for my file's validity:
//formatVersion WARNING FIXED_LENGTH_STRING
/nirs/metaDataTags/SubjectID WARNING FIXED_LENGTH_STRING
/nirs/metaDataTags/MeasurementDate WARNING FIXED_LENGTH_STRING
/nirs/metaDataTags/MeasurementTime WARNING FIXED_LENGTH_STRING
/nirs/metaDataTags/LengthUnit WARNING FIXED_LENGTH_STRING
/nirs/metaDataTags/TimeUnit WARNING FIXED_LENGTH_STRING
/nirs/metaDataTags/FrequencyUnit WARNING FIXED_LENGTH_STRING
Found 40 OK (hidden)
Found 40 INFO (hidden)
Found 7 WARNING
Found 0 FATAL
and the validity output for the test.snirf file included in the project:
/nirs/data1/measurementList1/sourceIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList1/detectorIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList1/wavelengthIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList1/dataType FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList1/dataTypeIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList1/moduleIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList2/sourceIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList2/detectorIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList2/wavelengthIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList2/dataType FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList2/dataTypeIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList2/moduleIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList3/sourceIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList3/detectorIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList3/wavelengthIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList3/dataType FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList3/dataTypeIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList3/moduleIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList4/sourceIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList4/detectorIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList4/wavelengthIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList4/dataType FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList4/dataTypeIndex FATAL INVALID_DATASET_TYPE
/nirs/data1/measurementList4/moduleIndex FATAL INVALID_DATASET_TYPE
/nirs/stim1/data FATAL INVALID_DATASET_SHAPE
/nirs/aux1/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux2/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux3/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux4/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux5/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux6/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux7/dataTimeSeries FATAL INVALID_DATASET_SHAPE
/nirs/aux8/dataTimeSeries FATAL INVALID_DATASET_SHAPE
Found 53 OK (hidden)
Found 42 INFO (hidden)
Found 0 WARNING
Found 33 FATAL
To return SNIRF to the comfortable "read-only" relationship with Homer it had until recently, I propose rebuilding the stims system on top of the BIDS-specified events.tsv record of stims.
This file can be generated from SNIRF files as necessary by the data-loading processes. The StimEditGUI of the future, incorporating #100 and #99, will edit only the events.tsv file, which will be converted back to a StimClass structure for processing.
We are looking toward a future model where users do not access homerOutput (formerly groupResults.mat) with MATLAB to analyze or visualize data with their own scripts; instead, they will export SNIRF files containing the derived data (HRFs) and open these on the platforms of their choice.
Currently, the tools for renaming conditions, adding conditions, deleting conditions do not work well.
No support exists for adding, deleting, or editing stim/dataLabels entries.
This is unrelated to the BIDS implementation of "Event markers"; it may be worth considering a StimEditGUI 2.0 milestone.
MeasList information was lost from the Probe structure when SnirfClass was used to transform .nirs data to the .snirf format, so
dc = hmrR_OD2Conc(dod, probe, ppf) cannot continue.
Homer3-1.80.2
Issue 1:
In lines 89-111, the variable idxsExcl records source-detector pairs where the time series of any wavelength fails to pass the threshold. However, the exclusion operation (line 110, chanList(lst(idxsExcl)) = 0) happens inside each loop iteration. This causes an asymmetry: (1) when the time series of the 1st wavelength fails to be included (and the 2nd passes), both series are ultimately pruned; (2) when the 2nd wavelength fails (and the 1st passes), only the 2nd series is pruned. This likely leads to misinterpretation of concentration data computed from one-wavelength-only source-detector pairs.
Issue 2:
In lines 115-126, the variable ii is always equal to nLambda after the loop. What information could it convey to users?
MATLAB Runtime 10, MATLAB R2017a; Windows 10 Pro 64-bit
It is impossible to get the block averages for "Session", "Subject" or "Group". I can only compute the block averages for each run.
Steps to reproduce:
Open the test data folder. Select "Session", "Subject" or "Group" at the "Processing level" panel.
Expected behavior:
Block averages obtained for any of this processing level.
Actual behavior:
Matlab does not recognize the variable "fcalls". The error is the following:
Unrecognized function or variable 'fcalls'.
Error in ProcStreamClass/Calc (line 361)
obj.ExportProcStream(filename, fcalls);
Error in TreeNodeClass/Calc (line 1007)
fcalls = obj.procStream.Calc([obj.path, obj.GetOutputFilename()]); %#ok
Error in SessClass/Calc (line 343)
Calc@TreeNodeClass(obj);
Error in DataTreeClass/CalcCurrElem (line 789)
obj.currElem.Calc();
Error in MainGUI>pushbuttonCalcProcStream_Callback (line 593)
maingui.dataTree.CalcCurrElem();
Error in gui_mainfcn (line 95)
feval(varargin{:});
Error in MainGUI (line 21)
gui_mainfcn(gui_State, varargin{:});
Error in matlab.graphics.internal.figfile.FigFile/read>@(hObject,eventdata)MainGUI('pushbuttonCalcProcStream_Callback',hObject,eventdata,guidata(hObject))
Error while evaluating UIControl Callback.
I have had this issue since I updated to the latest available version of Homer3.
Thanks for this amazing toolbox!
I am a Homer learner working through the tutorials provided in the training. I have trouble running the displayStats code and get the error message below; please let me know what the issue is. I have attached the dataset and the command-window environment/error message (notes) at the Google Drive link below, and pasted the error message here for you to look over. It would be a great help if you could guide me through this. Thank you!
https://drive.google.com/file/d/1FF4MWgv4QShONm1XbAmpJ9R826TpX5VX/view?usp=sharing
displayStats('DEMO.mat', 1, 0.05, 1)
Error using matlab.internal.tabular.private.varNamesDim.makeValidName (line 498)
'Source#' is not a valid variable name.
Error in matlab.internal.tabular.private.varNamesDim/validateAndAssignLabels (line 383)
[newLabels,wasMadeValid] = obj.makeValidName(newLabels,exceptionMode);
Error in matlab.internal.tabular.private.tabularDimension/setLabels (line 173)
obj = obj.validateAndAssignLabels(newLabels,indices,fullAssignment,fixDups,fixEmpties,fixIllegal);
Error in matlab.internal.tabular.private.tabularDimension/createLike_impl (line 355)
obj = obj.setLabels(dimLabels,[]);
Error in matlab.internal.tabular.private.varNamesDim/createLike (line 76)
obj = obj.createLike_impl(dimLength,dimLabels);
Error in tabular/initInternals (line 212)
t.varDim = t.varDim.createLike(nvars,varnames); % error if invalid, duplicate, or empty
Error in table.init (line 332)
t = t.initInternals(vars, numRows, rowLabels, numVars, varnames);
Error in array2table (line 64)
t = table.init(vars,nrows,rownames,nvars,varnames);
Error in displayStats (line 26)
T = array2table([ml(lst_thresh,1), ml(lst_thresh,2), tval(lst_thresh)',
output.misc.hmrstatsG_contrast.pval(Hb,lst_thresh)']...
When I try to run the wavelet motion-correction function through the stream-edit GUI, I get an error saying that the intensity_to_delta_OD function is needed first, even though that function is already in the stream before the wavelet function. When I click "continue anyway", Homer gives me an error saying that the filter exceeds the Nyquist frequency.
Hello,
It seems that the molar extinction coefficients in the GetExtinctions.m file do not correspond exactly to the two authors' datasets it references, namely Gratzer and Kollias as compiled by Scott Prahl, which can be found here.
The data in Homer differ from this reference for wavelengths at or above 650 nm, starting at line 365 of the file.
Which molar extinction coefficient dataset is Homer based on for 650 nm and above?
Kind regards
Hello,
Currently, the installation instructions for users without MATLAB state that you need to go to the homer3 Install folder and unzip homer3_install_win.zip. Problem: it is not there.
In my lab, we use both the source and compiled versions of Homer, so here is what I did to get the installation files:
setpaths
createInstallFile
Doing that, I obtained the archive homer3_install_win.zip in the Install folder, which I could then use to follow the installation instructions for the no-MATLAB case.
It would be helpful either to document the process of building the installation files (as I am not sure what I did is correct), or to supply the archive (via releases, maybe?).
v1.80.2 & master
OS and MATLAB independent
The SNIRF specification allows for the time vector to be either equal in length to the associated data time series, or to be of length two, containing the start time and the sample time spacing.
/nirs(i)/data(j)/time
Presence: required
Type: numeric 1-D array
Location: /nirs(i)/data(j)/time

The time variable. This provides the acquisition time of the measurement relative to the time origin. This will usually be a straight line with slope equal to the acquisition frequency, but does not need to be equally spaced. For the special case of equal sample spacing, an array of length <2> is allowed, where the first entry is the start time and the second entry is the sample time spacing, in the TimeUnit specified in metaDataTags. The default time unit is the second ("s"). For example, a time spacing of 0.2 (s) indicates a sampling rate of 5 Hz.
Option 1 - The size of this variable is <number of time points> and corresponds to the sample time of every data point.
Option 2 - The size of this variable is <2> and corresponds to the start time and sample spacing.
However, when loading the data, Homer3 asserts that the time vector corresponds only to option one of the specification, as can be seen here:
Homer3/DataTree/AcquiredData/Snirf/DataClass.m
Lines 299 to 301 in 666ab6d
As such, Homer3 throws an error when loading standards compliant SNIRF files.
(debugged by @ernesto-vidal)
Steps to reproduce:
Attempt to load a SNIRF file with a time vector of length two, corresponding to option two of the SNIRF specification for this field.
Expected behavior:
A standards-compliant implementation will expand the time vector based on the start time and the sample spacing.
Actual behavior:
Homer3 indicates an error (-3) and does not load the data.
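A sketch of an option-2-aware load (variable names assumed from the spec): expand a [start, spacing] time vector to the full time axis before any further processing.

```matlab
% Sketch: expand a length-2 SNIRF time vector (option 2 of the spec)
% into the full per-sample time axis (option 1 form).
if length(time) == 2
    nT   = size(dataTimeSeries, 1);          % number of time points
    time = time(1) + time(2) * (0:nT-1)';    % start + k * spacing
end
```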
Hi
I think the baseline of each trial is set to zero based on the mean of the average baseline. The result is that, for example, when plotting the yTrial results, their baselines are not centered around zero.
Should the mean of the baseline of each trial be subtracted from each trial?
My suggestion, for example (lines 109:111 of hmrR_BlockAvg):
for iBlk = 1:nBlk
    foom_temp = ones(size(yTrials(iC).yblk,1),1) * mean(yTrials(iC).yblk(1:-nPre,:,ii,iBlk),1);
    yTrials(iC).yblk(:,:,ii,iBlk) = yTrials(iC).yblk(:,:,ii,iBlk) - foom_temp;
end
I tried it in Homer2 and it seems to work. I am not sure, though, whether there were other reasons for computing it the original way. Sorry in case I misunderstood.
Thank you
Latest master branch (1.80.2); also tested on the latest development branch, 1.80.4. The error appears to have been present since v1.78.0.
Errors on Ubuntu and macOS (MATLAB 2022b and 2023b prerelease). No errors on Windows 11 (MATLAB 2023a).
The problem lies on the line using the datatype H5T_NATIVE_ULONG, since H5T_NATIVE_ULONG differs among systems [1]: on 64-bit Linux and macOS it denotes a 64-bit unsigned integer, while on Windows and 32-bit Linux it denotes a 32-bit unsigned integer. Thus, on Linux and macOS, I get an error three lines later, as the code essentially tries to write an int32 into a H5T_STD_U64LE dataset. On Windows I get no error.
This is due to this commit, which changed H5T_NATIVE_INT into H5T_NATIVE_ULONG.
Steps to reproduce:
Just try to write any .snirf file to disk on 64-bit Linux/macOS (e.g. run Nirs2Snirf inside Homer3/SubjDataSample).
Steps to fix:
Consider reverting H5T_NATIVE_ULONG back to H5T_NATIVE_INT, or any other compatible, platform-independent type. I am not sure whether there were performance reasons for the change; I also checked H5T_NATIVE_UINT32 and H5T_NATIVE_INT32, and they work.
Actual behavior:
The error message
>> Nirs2Snirf
Converting /home/xxx/Documents/MATLAB/Homer3/SubjDataSample/test.nirs to /home/xxx/Documents/MATLAB/Homer3/SubjDataSample/test.snirf
Error using hdf5lib2
The class of input data must be integer instead of int32 when the HDF5 class is H5T_STD_U64LE.
Error in H5D.write (line 100)
H5ML.hdf5lib2('H5Dwrite', varargin{:});
Error in hdf5write_safe>write_integer (line 101)
H5D.write(dsid, tid, 'H5S_ALL', 'H5S_ALL', 'H5P_DEFAULT', int32(val));
Error in hdf5write_safe (line 52)
write_integer(fid, name, val);
Error in MeasListClass/SaveHdf5 (line 171)
hdf5write_safe(fid, [location, '/sourceIndex'], uint64(obj.sourceIndex));
Error in DataClass/SaveHdf5 (line 248)
obj.measurementList(ii).SaveHdf5(fid, [location, '/measurementList', num2str(ii)]);
Error in SnirfClass/SaveData (line 672)
obj.data(ii).SaveHdf5(fileobj, [obj.location, '/data', num2str(ii)]);
Error in SnirfClass/SaveHdf5 (line 742)
obj.SaveData(obj.fid);
Error in FileLoadSaveClass/Save (line 78)
obj.SaveHdf5(filename, params);
Error in Nirs2Snirf (line 65)
snirf(ii).Save(dst);
[1] https://www.ibm.com/docs/en/ibm-mq/9.1?topic=platforms-standard-data-types-unix-linux-windows
Development branch v1.35.4, latest as of this date (same issue in master, etc.)
All.
Steps to reproduce:
Our NIRS system samples at 10 Hz mostly, but the actual rate can vary a bit between time points, and rarely a sample may even be skipped. E.g., the times reported can be 0.000, 0.110, 0.210, 0.310, ... The issue arises when data with such variations in sampling rate is used in preprocessing (in particular, in the averaging parts).
Expected behavior:
The actual times/variation in sampling rate should be taken into account in the averaging processes.
Actual behavior:
In hmrR_BlockAvg, line 66 (dt = t(2)-t(1)), the sampling rate is assumed constant and computed from the difference of the first two time points, which in some subjects happens to be bigger than normal (0.110 vs 0.100). The data range around the stimulus markers is then selected based on this sampling rate, rather than on the actual times in the time vector (lines 66:70).
At the subject level, this also causes different runs to assume different sampling rates, which results in truncations of the data fed into hmrS_RunAvg; the function itself pays no regard to the differences in assumed sampling rates between runs. This leads to erroneously averaging different time points together, as well as losing the final part of the data time series in runs with the proper assumed sampling rate (because of the truncation).
All this should probably not matter for shorter trials, but in our long-trial design it seems to cause minor to moderate issues.
Possible solution:
I do not know whether our case of small variation in sampling rate is rare (we use the NTS fNIRS system). I am not sure if and how you want to handle this, but the solution I plan to apply in my own script is to interpolate/resample the data time series to a fixed sampling rate before feeding it to hmrR_BlockAvg.
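The workaround described above can be sketched as follows (the target rate fs and the variable names t and d are assumptions): resample onto a uniform time grid before block averaging.

```matlab
% Sketch: resample onto a uniform grid before hmrR_BlockAvg.
% Assumes t is the (possibly irregular) time vector and
% d is the [nTimePoints x nChannels] data matrix.
fs   = 10;                               % target sampling rate, Hz
tUni = (t(1):1/fs:t(end))';              % uniform time grid
dUni = interp1(t, d, tUni, 'linear');    % resample each channel
```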
Hello everyone,
I have a question concerning the validation of the format of a .snirf file in Python. I convert a .nirs file to a .snirf file via Homer3 (v1.80.2). When I use the validateSnirf() function from the snirf package, and then result.display() to check the validity of the converted file, I get 120 WARNING and 2 or 3 FATAL entries (for aux1/dataTimeSeries and stim3/data, for example), depending on the file selected; it tells me there is an INVALID_DATASET_SHAPE. I don't understand, because the file was converted by Homer3, so I expected it to be valid, without errors. Does anyone know where this problem comes from? Is it a problem with Homer3 (the converted file not respecting the official .snirf specification) or with the validateSnirf function of the Python snirf package? This may also be a mistake on my part, as I am not an expert in this area. Sorry if the problem is obvious, or if it has already been solved previously.
Thanks in advance for your time and your help.
Ewan Mahé
Hello,
To my knowledge, Homer is the only open-source fNIRS analysis platform that supports data recorded with COBI Studio.
Currently, I use a (barely working) in-house script based on the page below.
http://www.physiologicalcomputing.net/?page_id=2621
Is there any update/support in Homer3 for COBI .oxy data?
Homer3-1.80.2
MATLAB R2024a; Windows 11 Pro 64-bit 23H2
The issue is related to the channels considered active by the functions hmrR_MotionArtifact (and hmrR_MotionArtifactByChannel).
Actual behavior:
Currently the function does (lines 99-104)
% Get list of active channels
mlActMan{iBlk} = mlAct_Initialize(mlActMan{iBlk}, MeasList);
mlActAuto{iBlk} = mlAct_Initialize(mlActAuto{iBlk}, MeasList);
lstAct1 = mlAct_Matrix2IndexList(mlActAuto{iBlk}, MeasList);
lstAct2 = mlAct_Matrix2IndexList(mlActMan{iBlk}, MeasList);
lstAct = unique([lstAct1(:)', lstAct2(:)']);
This code considers a channel active if it is active in either the manual or the automatic pruning list. For example, if 5 channels were pruned automatically and none were pruned manually, all channels will still be considered active.
Expected behavior:
I think what we want is to consider active only those channels that are active in both mlActMan and mlActAuto. The last three lines should then be:
lstAct1 = mlAct_Matrix2BinaryVector(mlActAuto{iBlk}, MeasList);
lstAct2 = mlAct_Matrix2BinaryVector(mlActMan{iBlk}, MeasList);
lstAct = find(lstAct1 & lstAct2);
The same applies to hmrR_MotionArtifactByChannel on lines 111-113
v1.80.2
MATLAB R2021 on Windows 11 64-bit 21H1
The procedure to reproduce this issue is as follows.
Hello,
On a computer running Homer3 without Matlab (installation was done following the steps described in #14), the function hmrR_MotionCorrectWavelet returned an error because it was unable to find db2.mat. It was a fresh install.
I temporarily solved the issue by copying db2.mat into the folder containing the fNIRS data, which was the only place where Homer3 managed to find it.
I have a snirf data file with
snirf.metaDataTags.tags.TimeUnit = 'ms'
When I load this data into Homer3, the x axis of any channel I select from my probe is plotted as 'Time (s)'. That is, Homer3 doesn't convert my snirf.data.time trace from ms to seconds.
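A sketch of what a unit-aware load might do (the tag path is taken from the report above; looping over the data blocks is an assumption about the object layout):

```matlab
% Sketch: honor metaDataTags TimeUnit by converting ms to s at load time.
if strcmpi(snirf.metaDataTags.tags.TimeUnit, 'ms')
    for ii = 1:length(snirf.data)
        snirf.data(ii).time = snirf.data(ii).time / 1000;  % ms -> s
    end
end
```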
Homer3 v1.80.2
MATLAB Runtime R2021a; Windows 11 64-bit
Homer3 shows "8 files failed to load".
Steps to reproduce:
Expected behavior:
Homer3 should not show "8 files failed to load".
Actual behavior:
Homer3 shows "8 files failed to load".