
idr-metadata's Introduction

IDR studies

Published screens | Published experiments

All metadata associated with published studies in IDR is managed in this repository.

Study name

After acceptance, IDR studies must be named idr<NNNN>-<name>-<description>, where idr<NNNN> is the accession number of the study (an incremental four-digit integer), <name> is the name of one of the authors associated with the publication (usually the first author), and <description> is a short description of the study or the name of the project/consortium. The study name must be lowercase.
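As a quick sanity check, the convention above can be encoded as a regular expression (the helper name and the exact character classes are illustrative assumptions, not part of the convention):

```shell
# Hypothetical check for the idr<NNNN>-<name>-<description> convention.
# The character classes are an assumption; the convention itself only
# requires lowercase, a four-digit accession number, an author name and
# a short description.
check_study_name() {
    echo "$1" | grep -Eq '^idr[0-9]{4}-[a-z0-9]+-[a-z0-9-]+$'
}

check_study_name "idr0000-lastname-example" && echo "ok"
check_study_name "IDR0000-Lastname-Example" || echo "rejected: not lowercase"
```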

Study repository

For each new study, a repository must be created on GitHub under the IDR organization using the study name as defined above. When ready for publication in the IDR, the study repository must be registered in the top-level idr-metadata repository as a submodule.

A study repository contains all original and curated metadata files associated with a study. The idr0000-lastname-example repository contains the templates that should be used by submitters when sending original metadata files for screen or experiment studies. The structure of each study repository should use the following layout:

.travis.yml                                  # Travis CI configuration file, used for validation (mandatory)
bulk.yml                                     # Import configuration file for multi-experiment or multi-screen studies (optional)
experimentA/                                 # Curated metadata for experimentA (if applicable)
    idrNNNN-experimentA-annotation.csv       # Curated annotation file (mandatory)
    idrNNNN-experimentA-assays.txt           # Original annotation file (recommended)
    idrNNNN-experimentA-bulk.yml             # Configuration file for import (mandatory)
    idrNNNN-experimentA-bulkmap-config.yml   # Configuration file for annotation (mandatory)
    idrNNNN-experimentA-filePaths.tsv        # Files/folder to be imported (mandatory)
experimentB/                                 # Curated metadata for experimentB (if applicable)
   ...
idrNNNN-study.txt                            # Top-level metadata file describing the study (mandatory)
screenA/                                     # Curated metadata for screenA (if applicable)
    idrNNNN-screenA-annotation.csv           # Curated annotation file (mandatory)
    idrNNNN-screenA-bulk.yml                 # Configuration file for import (mandatory)
    idrNNNN-screenA-bulkmap-config.yml       # Configuration file for annotation (mandatory)
    idrNNNN-screenA-library.txt              # Original annotation file (recommended)
    idrNNNN-screenA-plates.tsv               # Plates to be imported (mandatory)
screenB/                                     # Curated metadata for screenB (if applicable)
   ...
scripts/                                     # Folder containing custom scripts associated with the study (optional)
README.md                                    # Top-level readme (optional)
requirements.txt                             # Python dependencies used for Travis or scripts (recommended)
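A minimal validation sketch for this layout, checking only the mandatory per-experiment files (the helper name is hypothetical; a real check would also cover screen directories and the top-level study file):

```shell
# Report any missing mandatory curated files for one experiment directory,
# following the layout above.
# Usage: check_experiment_dir experimentA idrNNNN-experimentA
check_experiment_dir() {
    dir="$1"
    prefix="$2"
    status=0
    for suffix in annotation.csv bulk.yml bulkmap-config.yml filePaths.tsv; do
        if [ ! -f "$dir/$prefix-$suffix" ]; then
            echo "missing (mandatory): $dir/$prefix-$suffix"
            status=1
        fi
    done
    return $status
}
```

Run from the study repository root; a non-zero return flags a layout that the Travis validation would likely reject as well.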

idr-metadata's People

Contributors

atarkowska, dependabot[bot], dominikl, eleanorwilliams, francesw, gebailey, jburel, joshmoore, manics, mestrelion, mtbc, sbesson, simleo, snoopycrimecop


idr-metadata's Issues

idr-testing logs analysis from testing on 10th April

TLDR: the ERRORs in the logs seem to be coming from memo file generation (which was cancelled but was still running during web testing), rather than from the web testing itself.

Start by scanning the logs on all 5 servers for ERROR statements from 9am today.
NB: the logs are timestamped in GMT (not BST), so we need to grep for timestamps an hour earlier: '2024-04-10 08...'
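One way to avoid doing the BST-to-GMT shift by hand is to let GNU date compute the grep prefix (assumes GNU date and tzdata; Europe/London stands in for BST):

```shell
# Render a Europe/London wall-clock time (BST in April) as the UTC/GMT
# timestamp prefix used in the Blitz logs.
TZ=UTC date -d 'TZ="Europe/London" 2024-04-10 09:00' '+%Y-%m-%d %H'
# -> 2024-04-10 08
```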

for server in omeroreadwrite omeroreadonly-1 omeroreadonly-2 omeroreadonly-3 omeroreadonly-4; do echo $server && ssh $server "grep ERROR /opt/omero/server/OMERO.server/var/log/Blitz-0.log | grep '2024-04-10 08'"; done

omeroreadwrite
2024-04-10 08:25:44,777 ERROR [        ome.services.util.ServiceHandler] (.Server-19) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7625339
2024-04-10 08:41:10,171 ERROR [        ome.services.util.ServiceHandler] (l.Server-0) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7753026
2024-04-10 08:44:52,401 ERROR [        ome.services.util.ServiceHandler] (.Server-23) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7404650
2024-04-10 08:57:04,335 ERROR [        ome.services.util.ServiceHandler] (.Server-12) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7905672
omeroreadonly-1
omeroreadonly-2
2024-04-10 08:02:30,423 ERROR [        ome.services.util.ServiceHandler] (.Server-12) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8642795
2024-04-10 08:06:20,880 ERROR [        ome.services.util.ServiceHandler] (l.Server-7) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8516806
2024-04-10 08:07:12,305 ERROR [        ome.services.util.ServiceHandler] (.Server-14) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8421699
2024-04-10 08:09:47,515 ERROR [        ome.services.util.ServiceHandler] (.Server-16) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8625053
2024-04-10 08:11:25,710 ERROR [        ome.services.util.ServiceHandler] (.Server-18) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8535615
2024-04-10 08:11:43,066 ERROR [        ome.services.util.ServiceHandler] (l.Server-8) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8596496
2024-04-10 08:12:37,982 ERROR [        ome.services.util.ServiceHandler] (l.Server-6) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8332341
2024-04-10 08:21:45,104 ERROR [        ome.services.util.ServiceHandler] (l.Server-2) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8401669
omeroreadonly-3
2024-04-10 08:58:26,437 ERROR [        ome.services.util.ServiceHandler] (.Server-31) Method interface ome.services.util.Executor$Work.doWork invocation took 1443967
2024-04-10 08:58:26,438 ERROR [        ome.services.util.ServiceHandler] (.Server-31) Method interface omeis.providers.re.RenderingEngine.load invocation took 1443984
omeroreadonly-4
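The "took" values are in milliseconds, so each of these invocations ran for over two hours. A small helper makes the durations readable (a sketch; to_minutes is a hypothetical name, and on the servers you would pipe the grep output above into it):

```shell
# Convert the trailing "took <ms>" field of an ERROR line into minutes.
to_minutes() {
    awk '/took/ { printf "%s %s %.1f min\n", $1, $2, $NF / 60000 }'
}

# Example with the first omeroreadwrite line from above:
echo '2024-04-10 08:25:44,777 ERROR ServiceHandler setPixelsId invocation took 7625339' | to_minutes
# -> 2024-04-10 08:25:44,777 127.1 min
```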

Since idr-testing web tends to use omeroreadonly-3 and omeroreadonly-4, let's check omeroreadonly-3:

ssh omeroreadonly-3

[wmoore@test120-omeroreadonly-3 ~]$ grep -B 10 -A 2 "2024-04-10 08:58:26,437" /opt/omero/server/OMERO.server/var/log/Blitz-0.log
2024-04-10 08:58:21,090 INFO  [                      omero.cmd.SessionI] (.Server-30) Removed servant from adapter: 984148d6-7e5b-469c-b9c2-e3f4b2c4d867omero.api.IConfig
2024-04-10 08:58:25,635 INFO  [ ome.services.blitz.fire.SessionManagerI] (.Server-45) Found session locally: fc82fd8f-f877-449c-9f1c-a6eeac4a3069
2024-04-10 08:58:25,636 INFO  [ ome.services.blitz.fire.SessionManagerI] (.Server-45) Rejoining session ServiceFactoryI(session-cc243189-5076-426a-9d5a-a03e0a312e36/fc82fd8f-f877-449c-9f1c-a6eeac4a3069) (agent=OMERO.web)
2024-04-10 08:58:25,642 INFO  [o.services.sessions.SessionContext$Count] (l.Server-8) -Reference count: fc82fd8f-f877-449c-9f1c-a6eeac4a3069=0
2024-04-10 08:58:25,642 INFO  [                      omero.cmd.SessionI] (l.Server-8) cleanupSelf(ServiceFactoryI(session-cc243189-5076-426a-9d5a-a03e0a312e36/fc82fd8f-f877-449c-9f1c-a6eeac4a3069)).
2024-04-10 08:58:26,432 DEBUG [                   loci.formats.Memoizer] (.Server-31) saved to temp file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/18-54-45.119_mkngff/a78bd2cc-f574-47d9-ae83-e3df322efdda.zarr/OME/.METADATA.ome.xml.bfmemo7903852454255395295
2024-04-10 08:58:26,432 DEBUG [                   loci.formats.Memoizer] (.Server-31) start[1712739504492] time[1939] tag[loci.formats.Memoizer.saveMemo]
2024-04-10 08:58:26,435 DEBUG [                   loci.formats.Memoizer] (.Server-31) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/18-54-45.119_mkngff/a78bd2cc-f574-47d9-ae83-e3df322efdda.zarr/OME/.METADATA.ome.xml.bfmemo (357780 bytes)
2024-04-10 08:58:26,435 DEBUG [                   loci.formats.Memoizer] (.Server-31) start[1712738062477] time[1443958] tag[loci.formats.Memoizer.setId]
2024-04-10 08:58:26,435 INFO  [                ome.io.nio.PixelsService] (.Server-31) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/2016-07/28/18-54-45.119_mkngff/a78bd2cc-f574-47d9-ae83-e3df322efdda.zarr/OME/METADATA.ome.xml Series: 0
2024-04-10 08:58:26,437 INFO  [                 org.perf4j.TimingLogger] (.Server-31) start[1712738062469] time[1443967] tag[omero.call.success.ome.services.RenderingBean$12.doWork]
2024-04-10 08:58:26,437 INFO  [        ome.services.util.ServiceHandler] (.Server-31)  Rslt:	ome.io.bioformats.BfPixelBuffer@1671e72d
2024-04-10 08:58:26,437 ERROR [        ome.services.util.ServiceHandler] (.Server-31) Method interface ome.services.util.Executor$Work.doWork invocation took 1443967

from idr0011 - ScreenB:

less /data/OMERO/ManagedRepository/demo_2/2016-07/28/18-54-45.119_mkngff/a78bd2cc-f574-47d9-ae83-e3df322efdda.zarr/OME/METADATA.ome.xml
...
Name="/uod/idr/filesets/idr0011-thorpe-Dad4/20150826-peter_thorpe/T34 x TS/Plate1-TS/Plate1-TS-Red-B"

Using the same approach we can see...

On omeroreadonly-2, the last error is at 2024-04-10 08:21:45,104 (before we started web testing), coming from the memo file saved for idr0013: plate LT0064_25.

On omeroreadwrite, the error at 2024-04-10 08:41:10,171 (GMT) during testing (9:41 BST) is a "memo file saved" from idr0013: plate LT0065_05.
2024-04-10 08:44:52,401 is from idr0013: plate LT0065_06.
2024-04-10 08:57:04,335 is from idr0013: plate LT0066_02.

All of these happened during testing, using a different server (omeroreadonly) but the same database.

Checking an hour later, we see that memo file generation was still ongoing...

[wmoore@test120-proxy ~]$ for server in omeroreadwrite omeroreadonly-1 omeroreadonly-2 omeroreadonly-3 omeroreadonly-4; do echo $server && ssh $server "grep ERROR /opt/omero/server/OMERO.server/var/log/Blitz-0.log | grep '2024-04-10 09'"; done
omeroreadwrite

2024-04-10 09:01:27,393 ERROR [        ome.services.util.ServiceHandler] (.Server-22) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8237225
2024-04-10 09:09:47,201 ERROR [        ome.services.util.ServiceHandler] (l.Server-6) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8357950
2024-04-10 09:16:02,991 ERROR [        ome.services.util.ServiceHandler] (l.Server-9) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8531641
2024-04-10 09:21:19,547 ERROR [        ome.services.util.ServiceHandler] (.Server-10) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8734811
2024-04-10 09:22:39,273 ERROR [        ome.services.util.ServiceHandler] (.Server-14) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8577378
2024-04-10 09:24:30,872 ERROR [        ome.services.util.ServiceHandler] (.Server-24) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8822350
omeroreadonly-1
2024-04-10 09:39:57,966 ERROR [        ome.services.util.ServiceHandler] (.Server-26) Method interface ome.services.util.Executor$Work.doWork invocation took 1645799
2024-04-10 09:39:57,968 ERROR [        ome.services.util.ServiceHandler] (.Server-26) Method interface omeis.providers.re.RenderingEngine.load invocation took 1645815
2024-04-10 09:45:58,915 ERROR [        ome.services.util.ServiceHandler] (.Server-21) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8039111
2024-04-10 09:46:39,153 ERROR [        ome.services.util.ServiceHandler] (.Server-15) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7571553
2024-04-10 09:47:48,514 ERROR [        ome.services.util.ServiceHandler] (.Server-24) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8129785
omeroreadonly-2
omeroreadonly-3
2024-04-10 09:13:40,629 ERROR [        ome.services.util.ServiceHandler] (.Server-48) Method interface ome.services.util.Executor$Work.doWork invocation took 1861115
2024-04-10 09:13:40,630 ERROR [        ome.services.util.ServiceHandler] (.Server-48) Method interface omeis.providers.re.RenderingEngine.load invocation took 1861135
2024-04-10 09:13:40,667 ERROR [        ome.services.util.ServiceHandler] (2-thread-5) Method interface ome.services.util.Executor$Work.doWork invocation took 22879
2024-04-10 09:13:40,679 ERROR [        ome.services.util.ServiceHandler] (-thread-71) Method interface ome.services.util.Executor$Work.doWork invocation took 80847
2024-04-10 09:13:40,680 ERROR [        ome.services.util.ServiceHandler] (.Server-22) Method interface ome.api.IAdmin.getEventContext invocation took 80852
2024-04-10 09:13:40,691 ERROR [        ome.services.util.ServiceHandler] (.Server-26) Method interface ome.api.IConfig.getConfigValues invocation took 49924
2024-04-10 09:13:40,696 ERROR [        ome.services.util.ServiceHandler] (-thread-70) Method interface ome.services.util.Executor$Work.doWork invocation took 80863
2024-04-10 09:13:40,697 ERROR [        ome.services.util.ServiceHandler] (.Server-30) Method interface ome.api.IAdmin.getEventContext invocation took 80869
2024-04-10 09:13:40,698 ERROR [        ome.services.util.ServiceHandler] (-thread-74) Method interface ome.services.util.Executor$Work.doWork invocation took 80864
2024-04-10 09:13:40,699 ERROR [        ome.services.util.ServiceHandler] (.Server-14) Method interface ome.api.IAdmin.getEventContext invocation took 80869
2024-04-10 09:13:40,703 ERROR [        ome.services.util.ServiceHandler] (-thread-75) Method interface ome.services.util.Executor$Work.doWork invocation took 80868
2024-04-10 09:13:40,704 ERROR [        ome.services.util.ServiceHandler] (.Server-42) Method interface ome.api.IAdmin.getEventContext invocation took 80875
2024-04-10 09:13:40,708 ERROR [        ome.services.util.ServiceHandler] (l.Server-6) Method interface ome.services.util.Executor$Work.doWork invocation took 44333
2024-04-10 09:13:40,709 ERROR [        ome.services.util.ServiceHandler] (.Server-31) Method interface ome.api.IConfig.getConfigValues invocation took 79937
2024-04-10 09:13:40,709 ERROR [        ome.services.util.ServiceHandler] (.Server-29) Method interface ome.services.util.Executor$Work.doWork invocation took 44281
2024-04-10 09:13:40,715 ERROR [        ome.services.util.ServiceHandler] (l.Server-9) Method interface ome.services.util.Executor$Work.doWork invocation took 40783
2024-04-10 09:13:40,715 ERROR [        ome.services.util.ServiceHandler] (.Server-25) Method interface ome.services.util.Executor$Work.doWork invocation took 40782
2024-04-10 09:13:40,715 ERROR [        ome.services.util.ServiceHandler] (.Server-46) Method interface ome.services.util.Executor$Work.doWork invocation took 44250
2024-04-10 09:13:40,716 ERROR [        ome.services.util.ServiceHandler] (.Server-24) Method interface ome.services.util.Executor$Work.doWork invocation took 40775
2024-04-10 09:13:40,729 ERROR [        ome.services.util.ServiceHandler] (.Server-27) Method interface ome.api.IConfig.getConfigValues invocation took 34965
2024-04-10 09:13:40,744 ERROR [        ome.services.util.ServiceHandler] (.Server-52) Method interface ome.services.util.Executor$Work.doWork invocation took 40802
2024-04-10 09:13:41,783 ERROR [        ome.services.util.ServiceHandler] (.Server-38) Method interface ome.api.IConfig.getConfigValues invocation took 66015
2024-04-10 09:40:41,938 ERROR [        ome.services.util.ServiceHandler] (.Server-44) Method interface ome.services.util.Executor$Work.doWork invocation took 1809920
2024-04-10 09:40:41,940 ERROR [        ome.services.util.ServiceHandler] (.Server-44) Method interface omeis.providers.re.RenderingEngine.load invocation took 1809936
2024-04-10 09:43:23,162 ERROR [        ome.services.util.ServiceHandler] (.Server-33) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8133881
2024-04-10 09:47:00,892 ERROR [        ome.services.util.ServiceHandler] (.Server-16) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7794919
omeroreadonly-4
2024-04-10 09:37:05,549 ERROR [        ome.services.util.ServiceHandler] (.Server-92) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7485398
2024-04-10 09:37:33,544 ERROR [        ome.services.util.ServiceHandler] (.Server-86) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 7707792
2024-04-10 09:37:57,370 ERROR [        ome.services.util.ServiceHandler] (.Server-73) Method interface ome.services.util.Executor$Work.doWork invocation took 1585274
2024-04-10 09:37:57,372 ERROR [        ome.services.util.ServiceHandler] (.Server-73) Method interface omeis.providers.re.RenderingEngine.load invocation took 1585286

Most of those are coming from omeroreadonly-3.
The first of these is associated with creating a BfPixelBuffer, which waited on the completion of memo file generation:

[wmoore@test120-proxy ~]$ ssh omeroreadonly-3

[wmoore@test120-omeroreadonly-3 ~]$ grep -A 5 -B 5 "2024-04-10 09:13:40,629" /opt/omero/server/OMERO.server/var/log/Blitz-0.log
2024-04-10 09:13:40,571 DEBUG [                   loci.formats.Memoizer] (.Server-48) saved to temp file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/.METADATA.ome.xml.bfmemo1881140284604001071
2024-04-10 09:13:40,572 DEBUG [                   loci.formats.Memoizer] (.Server-48) start[1712740415493] time[5078] tag[loci.formats.Memoizer.saveMemo]
2024-04-10 09:13:40,627 DEBUG [                   loci.formats.Memoizer] (.Server-48) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/.METADATA.ome.xml.bfmemo (399027 bytes)
2024-04-10 09:13:40,628 DEBUG [                   loci.formats.Memoizer] (.Server-48) start[1712738559523] time[1861105] tag[loci.formats.Memoizer.setId]
2024-04-10 09:13:40,628 INFO  [                ome.io.nio.PixelsService] (.Server-48) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/METADATA.ome.xml Series: 11
2024-04-10 09:13:40,629 INFO  [                 org.perf4j.TimingLogger] (.Server-48) start[1712738559514] time[1861115] tag[omero.call.success.ome.services.RenderingBean$12.doWork]
2024-04-10 09:13:40,629 INFO  [        ome.services.util.ServiceHandler] (.Server-48)  Rslt:	ome.io.bioformats.BfPixelBuffer@218d70be
2024-04-10 09:13:40,629 ERROR [        ome.services.util.ServiceHandler] (.Server-48) Method interface ome.services.util.Executor$Work.doWork invocation took 1861115

This is from idr0011:

less /data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/METADATA.ome.xml
...
 Name="/uod/idr/filesets/idr0011-thorpe-Dad4/20150826-peter_thorpe/T34 x TS/Plate3-TS/Plate3-TS-Blue-B"

It looks like all the other ERRORs from 09:13:40 were from other services that were held up waiting for that one to complete.

The errors at 09:40:41 were for the same idr0011 plate:

[wmoore@test120-omeroreadonly-3 ~]$ grep -A 5 -B 5 "2024-04-10 09:40:41,938" /opt/omero/server/OMERO.server/var/log/Blitz-0.log
2024-04-10 09:40:41,789 DEBUG [                   loci.formats.Memoizer] (.Server-44) start[1712742032139] time[9650] tag[loci.formats.Memoizer.saveMemo]
2024-04-10 09:40:41,935 DEBUG [                   loci.formats.Memoizer] (.Server-44) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/.METADATA.ome.xml.bfmemo (399013 bytes)
2024-04-10 09:40:41,935 DEBUG [                   loci.formats.Memoizer] (.Server-44) start[1712740232034] time[1809901] tag[loci.formats.Memoizer.setId]
2024-04-10 09:40:41,936 INFO  [                ome.io.nio.PixelsService] (.Server-44) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/METADATA.ome.xml Series: 25
2024-04-10 09:40:41,937 INFO  [                 org.perf4j.TimingLogger] (.Server-44) start[1712740232017] time[1809920] tag[omero.call.success.ome.services.RenderingBean$12.doWork]
2024-04-10 09:40:41,938 INFO  [        ome.services.util.ServiceHandler] (.Server-44)  Rslt:	ome.io.bioformats.BfPixelBuffer@5f163467
2024-04-10 09:40:41,938 ERROR [        ome.services.util.ServiceHandler] (.Server-44) Method interface ome.services.util.Executor$Work.doWork invocation took 1809920

NB: we have a repeated generation of the memo file for this fileset, saved at the same location but differing by 14 bytes in size:

2024-04-10 09:13:40,627 DEBUG [                   loci.formats.Memoizer] (.Server-48) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/.METADATA.ome.xml.bfmemo (399027 bytes)

2024-04-10 09:40:41,935 DEBUG [                   loci.formats.Memoizer] (.Server-44) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-07/28/20-51-59.292_mkngff/f63bc331-42b0-4e55-abd3-abf4de843026.zarr/OME/.METADATA.ome.xml.bfmemo (399013 bytes)

Other ERRORs are also associated with memo file generation:

[wmoore@test120-omeroreadonly-3 ~]$ grep -A 5 -B 5 "2024-04-10 09:43:23,162" /opt/omero/server/OMERO.server/var/log/Blitz-0.log
2024-04-10 09:43:22,951 DEBUG [                   loci.formats.Memoizer] (.Server-33) start[1712742197527] time[5424] tag[loci.formats.Memoizer.saveMemo]
2024-04-10 09:43:23,159 DEBUG [                   loci.formats.Memoizer] (.Server-33) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/2016-05/04/03-27-54.535_mkngff/7493b87d-b6f6-48ce-a19d-3741eb11a57f.zarr/OME/.METADATA.ome.xml.bfmemo (420748 bytes)
2024-04-10 09:43:23,159 DEBUG [                   loci.formats.Memoizer] (.Server-33) start[1712734069303] time[8133856] tag[loci.formats.Memoizer.setId]
2024-04-10 09:43:23,160 INFO  [                ome.io.nio.PixelsService] (.Server-33) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/2016-05/04/03-27-54.535_mkngff/7493b87d-b6f6-48ce-a19d-3741eb11a57f.zarr/OME/METADATA.ome.xml Series: 0
2024-04-10 09:43:23,161 INFO  [                 org.perf4j.TimingLogger] (.Server-33) start[1712734069280] time[8133881] tag[omero.call.success.ome.services.RawPixelsBeanReadOnly.setPixelsId]
2024-04-10 09:43:23,162 INFO  [        ome.services.util.ServiceHandler] (.Server-33)  Rslt:	null
2024-04-10 09:43:23,162 ERROR [        ome.services.util.ServiceHandler] (.Server-33) Method interface ome.api.RawPixelsStore.setPixelsId invocation took 8133881

This is for idr0013, plate Name="LT0068_43", from:

less /data/OMERO/ManagedRepository/demo_2/2016-05/04/03-27-54.535_mkngff/7493b87d-b6f6-48ce-a19d-3741eb11a57f.zarr/OME/METADATA.ome.xml

Since we weren't looking at idr0013 during testing, this error likely comes from the parallel memo file generation, which was only cancelled as testing started around 9:35.

This is confirmed by the memo generation logs (times here are GMT, an hour behind BST). At 8:32 (9:32 BST) we see "Killed by signal 15." in stderr.

[wmoore@test120-proxy ~]$ ls -alh /tmp/ngff_cache_20240409_ngff/1/Image\:1557553
total 20K
drwxrwxr-x.   2 wmoore wmoore   45 Apr 10 07:27 .
drwxrwxr-x. 258 wmoore wmoore 8.0K Apr 10 08:25 ..
-rw-rw-r--.   1 wmoore wmoore    3 Apr 10 07:27 seq
-rw-rw-r--.   1 wmoore wmoore   22 Apr 10 08:32 stderr
-rw-rw-r--.   1 wmoore wmoore    0 Apr 10 07:27 stdout
[wmoore@test120-proxy ~]$ cat /tmp/ngff_cache_20240409_ngff/1/Image\:1557553/stdout 
[wmoore@test120-proxy ~]$ cat /tmp/ngff_cache_20240409_ngff/1/Image\:1557553/stderr
Killed by signal 15.

But the memo file generation continued on the server until 09:43:23 (all during testing time).
The 8133856 ms setId duration (logged when the memo was saved) is about 135 minutes, corresponding to a start at 7:27 and the save at 9:43 (GMT).
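That arithmetic can be checked directly in the shell:

```shell
# 8133856 ms (the setId time logged when the memo file was saved), in minutes.
ms=8133856
echo "$((ms / 60000)) min"
# -> 135 min, i.e. roughly the 7:27 -> 9:43 (GMT) window
```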

Errors during NGFF testing on idr-next: 2024-01-16

Look for nginx logs...

ssh idr-next.openmicroscopy.org
[wmoore@prod120-proxy nginx]$ ls -alh /var/log/nginx/ | grep "Jan 1"
drwxr-xr-x.  2 root  root 4.0K Jan 17 03:41 .
drwxr-xr-x. 12 root  root 4.0K Jan 14 03:50 ..
-rw-r-----.  1 nginx adm  823K Jan 17 10:26 access.log
-rw-r-----.  1 nginx adm   89K Jan 10 03:06 access.log-20240110.gz
-rw-r-----.  1 nginx adm  159K Jan 11 03:31 access.log-20240111.gz
-rw-r-----.  1 nginx adm   82K Jan 12 03:22 access.log-20240112.gz
-rw-r-----.  1 nginx adm   86K Jan 13 03:42 access.log-20240113.gz
-rw-r-----.  1 nginx adm   86K Jan 14 03:49 access.log-20240114.gz
-rw-r-----.  1 nginx adm   84K Jan 15 03:48 access.log-20240115.gz
-rw-r-----.  1 nginx adm  162K Jan 16 03:45 access.log-20240116.gz
-rw-r-----.  1 nginx adm  8.4M Jan 17 03:40 access.log-20240117
-rw-r-----.  1 nginx adm   12K Jan 17 09:45 error.log
-rw-r-----.  1 nginx adm  1.9K Jan 10 10:31 error.log-20240111.gz
-rw-r-----.  1 nginx adm  2.4K Jan 15 15:00 error.log-20240116.gz
-rw-r-----.  1 nginx adm   37K Jan 16 10:36 error.log-20240117

sudo less error.log-20240117

...
2024/01/16 09:38:50 [warn] 10236#10236: *539804 upstream server temporarily disabled while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /gallery-api/thumbnails/?screen=3302&screen=3301&project=2501&screen=3452&project=2451&screen=3252&screen=3251&screen=3403&screen=3402&screen=3401 HTTP/2.0", subrequest: "/gallery-api/thumbnails/", upstream: "http://192.168.120.174:80/gallery-api/thumbnails/?screen=3302&screen=3301&project=2501&screen=3452&project=2451&screen=3252&screen=3251&screen=3403&screen=3402&screen=3401", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/"
2024/01/16 09:38:50 [error] 10236#10236: *539804 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /gallery-api/thumbnails/?screen=3302&screen=3301&project=2501&screen=3452&project=2451&screen=3252&screen=3251&screen=3403&screen=3402&screen=3401 HTTP/2.0", subrequest: "/gallery-api/thumbnails/", upstream: "http://192.168.120.174:80/gallery-api/thumbnails/?screen=3302&screen=3301&project=2501&screen=3452&project=2451&screen=3252&screen=3251&screen=3403&screen=3402&screen=3401", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/"
2024/01/16 09:47:05 [warn] 10238#10238: *542626 an upstream response is buffered to a temporary file /var/cache/nginx/proxy_temp/4/62/0000028624 while reading upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webgateway/original_file_paths/1921849/?_=1705397880076 HTTP/2.0", upstream: "http://192.168.120.174:80/webgateway/original_file_paths/1921849/?_=1705397880076", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:03 [warn] 10238#10238: *550125 upstream server temporarily disabled while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862868&_=1705400862932 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862868&_=1705400862932", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:03 [error] 10238#10238: *550125 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862868&_=1705400862932 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862868&_=1705400862932", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:05 [warn] 10236#10236: *548482 upstream server temporarily disabled while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/metadata_preview/image/10648071/?_=1705400875606 HTTP/2.0", subrequest: "/webclient/metadata_preview/image/10648071/", upstream: "http://192.168.120.174:80/webclient/metadata_preview/image/10648071/?_=1705400875606", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:05 [error] 10236#10236: *548482 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/metadata_preview/image/10648071/?_=1705400875606 HTTP/2.0", subrequest: "/webclient/metadata_preview/image/10648071/", upstream: "http://192.168.120.174:80/webclient/metadata_preview/image/10648071/?_=1705400875606", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:05 [warn] 10236#10236: *548482 upstream server temporarily disabled while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:05 [error] 10236#10236: *548482 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:17 [warn] 10238#10238: *550125 upstream server temporarily disabled while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862871&_=1705400862934 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862871&_=1705400862934", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:17 [error] 10238#10238: *550125 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862871&_=1705400862934 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862871&_=1705400862934", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:35 [warn] 10238#10238: *550125 upstream server temporarily disabled while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/metadata_preview/image/10648048/?_=1705400862935 HTTP/2.0", subrequest: "/webclient/metadata_preview/image/10648048/", upstream: "http://192.168.120.174:80/webclient/metadata_preview/image/10648048/?_=1705400862935", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:35 [error] 10238#10238: *550125 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/metadata_preview/image/10648048/?_=1705400862935 HTTP/2.0", subrequest: "/webclient/metadata_preview/image/10648048/", upstream: "http://192.168.120.174:80/webclient/metadata_preview/image/10648048/?_=1705400862935", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:35 [warn] 10238#10238: *550125 upstream server temporarily disabled while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:35 [error] 10238#10238: *550125 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:40 [warn] 10236#10236: *548482 upstream server temporarily disabled while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:40 [error] 10236#10236: *548482 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:35:46 [warn] 10238#10238: *550125 upstream server temporarily disabled while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862940&_=1705400862941 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862940&_=1705400862941", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:35:46 [error] 10238#10238: *550125 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862940&_=1705400862941 HTTP/2.0", upstream: "http://192.168.120.174:80/webclient/imgData/10648047/?callback=jQuery362031136720943034524_1705400862940&_=1705400862941", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:36:05 [warn] 10236#10236: *548482 upstream server temporarily disabled while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607 HTTP/2.0", upstream: "http://192.168.120.132:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:36:05 [error] 10236#10236: *548482 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607 HTTP/2.0", upstream: "http://192.168.120.132:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875557&_=1705400875607", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:36:35 [warn] 10238#10238: *550125 upstream server temporarily disabled while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937 HTTP/2.0", upstream: "http://192.168.120.132:80/webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:36:35 [error] 10238#10238: *550125 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 134.36.251.166, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937 HTTP/2.0", upstream: "http://192.168.120.132:80/webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1"
2024/01/16 10:36:40 [warn] 10236#10236: *548482 upstream server temporarily disabled while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609 HTTP/2.0", upstream: "http://192.168.120.132:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"
2024/01/16 10:36:40 [error] 10236#10236: *548482 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 81.79.160.1, server: prod120-proxy.novalocal, request: "GET /webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609 HTTP/2.0", upstream: "http://192.168.120.132:80/webclient/imgData/10648071/?callback=jQuery36207056919841979_1705400875560&_=1705400875609", host: "idr-next.openmicroscopy.org", referrer: "https://idr-next.openmicroscopy.org/webclient/usertags/"

There's a bunch of gallery-api/thumbnails/ errors to start (around 09:30), then from 10:35 to 10:36 there are 6 imgData/ requests for images, all from idr0091:

  • 10648047
  • 10648071 (x 4)
  • 10648048
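
The affected image IDs can be tallied straight from the error log with a grep pipeline. This is a sketch against a fabricated two-line sample; point it at the real error.log-20240117 instead:

```shell
# Fabricated sample of the nginx error log (real file: error.log-20240117)
cat > /tmp/error.sample <<'EOF'
2024/01/16 10:35:05 [error] 10236#10236: *548482 upstream timed out ... "GET /webclient/imgData/10648071/?callback=x HTTP/2.0" ...
2024/01/16 10:35:17 [error] 10238#10238: *550125 upstream timed out ... "GET /webclient/imgData/10648047/?callback=x HTTP/2.0" ...
EOF
# Pull out each imgData/<id>, keep the id, count occurrences per image
grep -o 'imgData/[0-9]*' /tmp/error.sample | cut -d/ -f2 | sort | uniq -c
```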

sudo less access.log-20240117 gives the corresponding access logs, but no error logging.
We were looking at idr0090 around 10:31, e.g.:

134.36.66.49 - - [16/Jan/2024:10:31:19 +0000] "GET /webclient/render_thumbnail/12549526/?rdefId=12101094&_=random0.2862405070337033 HTTP/2.0" 200 2834 "https://idr-next.openmicroscopy.org/webclient/?show=well-1011236" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.132 MISS 192.168.120.132:80

Look at the times at which the main webclient page was requested, since we expect to see one at the point the browser refreshed due to a "logout". There is one at 10:26.

[wmoore@prod120-proxy nginx]$ sudo grep " /webclient/?" access.log-20240117
134.36.66.49 - - [16/Jan/2024:09:32:38 +0000] "GET /webclient/?show=project-2801 HTTP/2.0" 200 148282 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-" 0.581 - 192.168.120.132:80
134.36.66.49 - - [16/Jan/2024:09:35:13 +0000] "GET /webclient/?show=image-1230281 HTTP/2.0" 200 148346 "https://idr-next.openmicroscopy.org/iviewer/?well=590733" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-" 0.523 - 192.168.120.132:80
134.36.66.49 - - [16/Jan/2024:09:36:53 +0000] "GET /webclient/?show=well-590776 HTTP/2.0" 200 148346 "https://idr-next.openmicroscopy.org/iviewer/?well=590776" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-" 0.409 - 192.168.120.132:80
134.36.66.49 - - [16/Jan/2024:09:37:14 +0000] "GET /webclient/?show=image-1230542 HTTP/2.0" 200 148346 "https://idr-next.openmicroscopy.org/iviewer/?well=590776" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36" "-" 0.400 - 192.168.120.132:80
134.36.250.254 - - [16/Jan/2024:09:37:54 +0000] "GET /webclient/?show=project-2201 HTTP/2.0" 200 148282 "https://idr-next.openmicroscopy.org/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:121.0) Gecko/20100101 Firefox/121.0" "-" 0.488 - 192.168.120.132:80
81.79.160.1 - - [16/Jan/2024:09:37:56 +0000] "GET /webclient/?show=project-2801 HTTP/2.0" 200 148282 "https://idr-next.openmicroscopy.org/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15" "-" 1.864 - 192.168.120.174:80
134.36.251.166 - - [16/Jan/2024:09:37:57 +0000] "GET /webclient/?show=image-1315337 HTTP/2.0" 200 148348 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.963 - 192.168.120.174:80
134.36.66.49 - - [16/Jan/2024:09:42:54 +0000] "GET /webclient/?experimenter=-1 HTTP/2.0" 200 148238 "https://idr-next.openmicroscopy.org/webclient/img_detail/10647647/?dataset=11909" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.262 - 192.168.120.132:80
134.36.250.254 - - [16/Jan/2024:09:52:47 +0000] "GET /webclient/?show=well-1235537 HTTP/2.0" 200 148325 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:121.0) Gecko/20100101 Firefox/121.0" "-" 0.506 - 192.168.120.132:80
81.79.160.1 - - [16/Jan/2024:09:59:56 +0000] "GET /webclient/?show=well-1011236 HTTP/2.0" 200 148325 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15" "-" 0.285 - 192.168.120.174:80
134.36.251.166 - - [16/Jan/2024:09:59:56 +0000] "GET /webclient/?show=well-1011236 HTTP/2.0" 200 148325 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.463 - 192.168.120.174:80
134.36.66.49 - - [16/Jan/2024:10:00:16 +0000] "GET /webclient/?show=well-1011236 HTTP/2.0" 200 148309 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.360 - 192.168.120.132:80
81.79.160.1 - - [16/Jan/2024:10:26:55 +0000] "GET /webclient/?show=project-2801 HTTP/2.0" 200 148282 "https://idr-next.openmicroscopy.org/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15" "-" 0.390 - 192.168.120.174:80
127.0.0.1 - - [16/Jan/2024:11:00:16 +0000] "GET /webclient/?experimenter=-1 HTTP/1.1" 200 149037 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.276 - 192.168.120.37:80
134.36.250.254 - - [16/Jan/2024:11:36:25 +0000] "GET /webclient/?show=screen-3103 HTTP/2.0" 200 148280 "https://idr-next.openmicroscopy.org/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:121.0) Gecko/20100101 Firefox/121.0" "-" 0.291 - 192.168.120.132:80

I took this screenshot to look for Console errors while the Preview panel image viewer was 'collapsed' (zero height):

(Screenshot 2024-01-16 at 10:44:23)

The image 10648048 referred to there is from idr0091 (and corresponds to the error log above):

[wmoore@prod120-proxy nginx]$ sudo grep "/10648048" access.log-20240117
134.36.251.166 - - [16/Jan/2024:10:34:35 +0000] "GET /webclient/render_thumbnail/10648048/?rdefId=10324053 HTTP/2.0" 200 990 "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 0.413 STALE -
134.36.251.166 - - [16/Jan/2024:10:36:35 +0000] "GET /webclient/metadata_preview/image/10648048/?_=1705400862935 HTTP/2.0" 200 16399 "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 120.012 STALE -
134.36.251.166 - - [16/Jan/2024:10:36:35 +0000] "GET /webclient/imgData/10648048/?callback=jQuery362031136720943034524_1705400862936&_=1705400862937 HTTP/2.0" 504 569 "https://idr-next.openmicroscopy.org/webclient/usertags/?experimenter=-1" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" "-" 120.002 - 192.168.120.174:80, 192.168.120.132:80
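
The 120-second upstream timeouts surface as 504s in the access log, so they can be pulled out by filtering on the status field. A sketch against a fabricated sample (the real file is access.log-20240117, where the status is likewise the 9th whitespace-separated field):

```shell
# Fabricated sample of the nginx access log: one 504 and one healthy 200
cat > /tmp/access.sample <<'EOF'
134.36.251.166 - - [16/Jan/2024:10:36:35 +0000] "GET /webclient/imgData/10648048/ HTTP/2.0" 504 569 "-" "Mozilla/5.0" "-" 120.002 - 192.168.120.174:80
134.36.66.49 - - [16/Jan/2024:10:31:19 +0000] "GET /webclient/render_thumbnail/12549526/ HTTP/2.0" 200 2834 "-" "Mozilla/5.0" "-" 0.132 MISS 192.168.120.132:80
EOF
# Field 9 is the HTTP status code in this log format
awk '$9 == 504' /tmp/access.sample
```
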
[wmoore@prod120-omeroreadwrite ~]$ less /opt/omero/server/OMERO.server/var/log/master.err
...
-! 01/06/24 23:23:16.902 OMERO.Glacier2: warning: dispatch exception: ConnectionI.cpp:1573: Ice::MemoryLimitException:
   protocol error: memory limit exceeded:
   requested 1730150430 bytes, maximum allowed is 256000000 bytes (see Ice.MessageSizeMax)
   identity: 42bca4b9-538a-4dfa-96bd-aefacab4dbb8/c15e96ee-ef1a-46c8-b092-525ee48502e5omero.api.RawPixelsStore
   facet: 
   operation: getPlane
   remote host: 127.0.0.1 remote port: 47034
java.io.IOException: '.zarray' expected but is not readable or missing in store.
        at com.bc.zarr.ZarrArray.open(ZarrArray.java:106)
        at com.bc.zarr.ZarrArray.open(ZarrArray.java:99)
        at com.bc.zarr.ZarrArray.open(ZarrArray.java:95)
        at com.bc.zarr.ZarrArray.open(ZarrArray.java:91)
        at loci.formats.services.JZarrServiceImpl.open(JZarrServiceImpl.java:89)
        at loci.formats.in.ZarrReader.openZarr(ZarrReader.java:579)
        at loci.formats.in.ZarrReader.initializeZarrService(ZarrReader.java:467)
        at loci.formats.in.ZarrReader.reopenFile(ZarrReader.java:458)
        at loci.formats.ImageReader.reopenFile(ImageReader.java:869)
        at loci.formats.ReaderWrapper.reopenFile(ReaderWrapper.java:665)
        at loci.formats.ReaderWrapper.reopenFile(ReaderWrapper.java:665)
        at loci.formats.Memoizer.setId(Memoizer.java:712)
        at ome.io.bioformats.BfPixelsWrapper.<init>(BfPixelsWrapper.java:52)
        at ome.io.bioformats.BfPixelBuffer.reader(BfPixelBuffer.java:73)
        at ome.io.bioformats.BfPixelBuffer.setSeries(BfPixelBuffer.java:124)
        at ome.io.nio.PixelsService.createBfPixelBuffer(PixelsService.java:898)
        at ome.io.nio.PixelsService._getPixelBuffer(PixelsService.java:653)
        at ome.io.nio.PixelsService.getPixelBuffer(PixelsService.java:571)
        at ome.services.RenderingBean$12.doWork(RenderingBean.java:2205)
        at jdk.internal.reflect.GeneratedMethodAccessor282.invoke(Unknown Source)

For image 10648048 we can see that the Fileset path contains 418997a378d8.zarr, so let's search the Blitz logs on all nodes for that string for yesterday, 2024-01-16. No errors seen...

[wmoore@prod120-proxy ~]$ cat nodes
omeroreadonly-1
omeroreadonly-2
omeroreadonly-3
omeroreadonly-4
omeroreadwrite

[wmoore@prod120-proxy ~]$ for n in $(cat nodes); do echo $n && ssh $n "grep 418997a378d8 /opt/omero/server/OMERO.server/var/log/Blitz-0.log | grep 2024-01-16"; done
omeroreadonly-1
2024-01-16 10:35:35,553 INFO  [      ome.services.OmeroFilePathResolver] (.Server-25) Metadata only file, resulting path: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 10:35:35,589 INFO  [                loci.formats.ImageReader] (.Server-25) ZarrReader initializing /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 10:35:35,811 INFO  [      ome.services.OmeroFilePathResolver] (.Server-11) Metadata only file, resulting path: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 10:35:35,817 INFO  [                loci.formats.ImageReader] (.Server-11) ZarrReader initializing /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 11:15:16,448 DEBUG [                   loci.formats.Memoizer] (.Server-25) saved to temp file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo16099173764015296735
2024-01-16 11:15:16,483 DEBUG [                   loci.formats.Memoizer] (.Server-25) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo (91345 bytes)
2024-01-16 11:15:16,484 INFO  [                ome.io.nio.PixelsService] (.Server-25) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs Series: 0
2024-01-16 11:15:16,537 DEBUG [                   loci.formats.Memoizer] (.Server-11) saved to temp file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo10540699244396677351
2024-01-16 11:15:16,541 DEBUG [                   loci.formats.Memoizer] (.Server-11) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo (91345 bytes)
2024-01-16 11:15:16,541 INFO  [                ome.io.nio.PixelsService] (.Server-11) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs Series: 0
omeroreadonly-2
omeroreadonly-3
2024-01-16 10:34:35,410 INFO  [      ome.services.OmeroFilePathResolver] (l.Server-5) Metadata only file, resulting path: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 10:34:35,518 INFO  [                loci.formats.ImageReader] (l.Server-5) ZarrReader initializing /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 10:34:35,788 INFO  [      ome.services.OmeroFilePathResolver] (l.Server-4) Metadata only file, resulting path: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 10:34:35,794 INFO  [                loci.formats.ImageReader] (l.Server-4) ZarrReader initializing /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs
2024-01-16 11:14:25,171 DEBUG [                   loci.formats.Memoizer] (l.Server-4) saved to temp file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo2437379478352568002
2024-01-16 11:14:25,268 DEBUG [                   loci.formats.Memoizer] (l.Server-4) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo (91345 bytes)
2024-01-16 11:14:25,269 INFO  [                ome.io.nio.PixelsService] (l.Server-4) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs Series: 0
2024-01-16 11:14:25,564 DEBUG [                   loci.formats.Memoizer] (l.Server-5) saved to temp file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo8461342761922123114
2024-01-16 11:14:25,641 DEBUG [                   loci.formats.Memoizer] (l.Server-5) saved memo file: /data/OMERO/BioFormatsCache/data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/..zattrs.bfmemo (91345 bytes)
2024-01-16 11:14:25,642 INFO  [                ome.io.nio.PixelsService] (l.Server-5) Creating BfPixelBuffer: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-10/2020-10/02/14-51-51.873_mkngff/4ae7771a-20ab-453a-ba15-418997a378d8.zarr/.zattrs Series: 0
omeroreadonly-4
omeroreadwrite

Bio-Formats cache regeneration walkthrough

Following https://github.com/IDR/deployment/blob/master/docs/operating-procedures.md#bio-formats-cache-regeneration with step-by-step notes...

The instructions there distribute the work across all 5 server nodes.

Followed instructions exactly as written on idr-testing until:

for i in $(cat nodes); do ssh $i 'pip install --user omero-cli-render'; done
bash: pip: command not found
bash: pip: command not found

We can probably skip this step if we use the shared Python environment at /opt/omero/server/venv3 for the subsequent steps...
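
A sketch of that alternative, written as a dry run that only prints the per-node command (assumption: /opt/omero/server/venv3 exists on every node; drop the echo to actually execute over ssh):

```shell
# Two-node sample of the nodes file (the real one lists all 5 servers)
cat > /tmp/nodes <<'EOF'
omeroreadonly-1
omeroreadwrite
EOF
# Dry run: print the command that would install into the shared venv on each node
while read -r n; do
  echo ssh "$n" "/opt/omero/server/venv3/bin/pip install omero-cli-render"
done < /tmp/nodes
```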

[wmoore@test120-omeroreadwrite ~]$ sudo rm -rf /data/BioFormatsCache/*

Checking we have same jars on all servers... e.g. ZarrReader...

for i in $(cat nodes); do ssh $i 'md5sum /opt/omero/server/OMERO.server/lib/server/OMEZarr*'; done
ce6e7e67626f8c5cc59085b0282ba9cc  /opt/omero/server/OMERO.server/lib/server/OMEZarrReader.jar
ce6e7e67626f8c5cc59085b0282ba9cc  /opt/omero/server/OMERO.server/lib/server/OMEZarrReader.jar
ce6e7e67626f8c5cc59085b0282ba9cc  /opt/omero/server/OMERO.server/lib/server/OMEZarrReader.jar
ce6e7e67626f8c5cc59085b0282ba9cc  /opt/omero/server/OMERO.server/lib/server/OMEZarrReader.jar
ce6e7e67626f8c5cc59085b0282ba9cc  /opt/omero/server/OMERO.server/lib/server/OMEZarrReader.jar

Deletion has been running for an hour and a half now:

[wmoore@test120-omeroreadwrite ~]$ sudo rm -rf /data/BioFormatsCache/*

idr0008 - 2 plates missing channel-4.TIFFs

As reported in #696 (comment):

  • Image:38517 & Image:40053 - 5th & 7th plates of idr0008 (58 plates)
    Viewing in webclient: message = Error instantiating pixel buffer: /data/OMERO/ManagedRepository/demo_2/2015-09/14/20-06-35.749/005B30_S2R.HTD
2024-06-25 10:13:03,452 ERROR [         ome.io.bioformats.BfPixelBuffer] (.Server-11) Failed to instantiate BfPixelsWrapper with /data/OMERO/ManagedRepository/demo_2/2015-09/14/20-06-35.749/005B30_S2R.HTD
2024-06-25 10:13:03,453 ERROR [                ome.io.nio.PixelsService] (.Server-11) Error instantiating pixel buffer: /data/OMERO/ManagedRepository/demo_2/2015-09/14/20-06-35.749/005B30_S2R.HTD
java.lang.RuntimeException: java.io.FileNotFoundException: /data/OMERO/ManagedRepository/demo_2/2015-09/14/20-06-35.749/005B30_S2R_P24_s2_w4.TIF (No such file or directory)
	at ome.io.bioformats.BfPixelBuffer.reader(BfPixelBuffer.java:79)
	at ome.io.bioformats.BfPixelBuffer.setSeries(BfPixelBuffer.java:124)

The file /data/OMERO/ManagedRepository/demo_2/2015-09/14/20-06-35.749/005B30_S2R_P24_s2_w4.TIF is not found on idr-testing, idr-next, or idr itself!
Image 38517 gives the ResourceError above on idr-testing and idr-next, but not on idr.openmicroscopy.org!?

Actually, for those 2 plates (5th and 7th of idr0008), there are NO ...w4.TIF images and the 4th channel (DAPI) is black (idr.openmicroscopy.org), whereas every other idr0008 plate has 768 ...w4.TIF images and DAPI is viewable.
Images from these 2 plates are not viewable on idr-testing, possibly due to a change in Bio-Formats?
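
With a copy of the plate directory to hand, the missing wavelength can be confirmed by tallying the _w&lt;N&gt; filename suffixes. A sketch using fabricated filenames (the real directory is the ManagedRepository path above):

```shell
# Fabricated plate directory with channels w1-w3 present and w4 missing
cd "$(mktemp -d)"
touch 005B30_S2R_P24_s2_w1.TIF 005B30_S2R_P24_s2_w2.TIF 005B30_S2R_P24_s2_w3.TIF
# Count TIFFs per wavelength suffix; an absent w4 row means the channel files are missing
ls *_w[0-9].TIF | sed 's/.*_\(w[0-9]\)\.TIF$/\1/' | sort | uniq -c
```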

Seb: likely due to ome/bioformats#3806

JM: https://openmicroscopy.slack.com/archives/C0K5WAD8A/p1719395174513279

  • option 1: handle the exception in the rendering engine when the metadata indicates that the image has 4 channels but one channel is missing. Cons: requires a fork of openmicroscopy.
  • option 2: generate a black TIFF for the 4th channel. This TIFF would have to be put under the ManagedRepository. Question: can Bio-Formats handle files composing the same image being split between a mounted volume and the ManagedRepository? This would also require adding entries in the DB.
  • option 3: check whether we can comment out the information about the missing 4th channel in the metadata file. Cons: modifies the original file, with possible side effects, e.g. in the viewer.

idr0036-gustafsdottir-cellpainting S-BIAD855

idr0036-gustafsdottir-cellpainting

Sample plate conversion fails with:

(base) [dlindner@pilot-zarr2-dev idr0036]$ time /home/dlindner/bioformats2raw/bin/bioformats2raw 20608.screen 20608.ome.zarr
OpenJDK 64-Bit Server VM warning: You have loaded library /tmp/opencv_openpnp1378964024917150329/nu/pattern/opencv/linux/x86_64/libopencv_java342.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
[Fatal Error] :1:73: Character reference "&#0" is an invalid XML character.
Exception in thread "main" picocli.CommandLine$ExecutionException: Error while calling command (com.glencoesoftware.bioformats2raw.Converter@63a65a25): java.lang.RuntimeException: org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 73; Character reference "&#0" is an invalid XML character.
        at picocli.CommandLine.executeUserObject(CommandLine.java:1962)
        at picocli.CommandLine.access$1300(CommandLine.java:145)

Already handled by IDR/bioformats#29.

@will-moore successfully exported it using omero-cli-zarr.

Document NGFF Fileset replacement workflow

NGFF generation

Generation takes place on pilot-zarr1-dev or pilot-zarr2-dev machines.

We need to generate NGFF data with https://github.com/IDR/bioformats2raw/releases/tag/v0.6.0-24 which has ZarrReader fixes, including those required for .pattern file data.

Install bioformats2raw via conda:

conda create -n bioformats2raw python=3.9
conda activate bioformats2raw
conda install -c ome bioformats2raw

This is actually just for getting the dependencies installed. Get the actual bioformats2raw from the release link above and unzip it into your home directory.

We need to generate the NGFF Filesets under the /data volume.
Create directories for the idr project and for memo files (if not already there), and change into the project directory. For example, for idr0051:

cd /data
sudo mkdir idr0051
sudo chown yourname idr0051
sudo mkdir memo
sudo chown yourname memo
cd idr0051

Find out where the pattern, screen or companion files are. For example: /nfs/bioimage/drop/idr0051-fulton-tailbudlightsheet/patterns/

Then run the conversion (using the bioformats2raw from above) in a screen session (long-running):

NB: it may be useful to convert a single Fileset to zarr initially, to determine its size on disk and whether you have enough space to convert all the others at once.
If not, you might have to do a smaller number, then zip and upload to BioStudies before deleting to make space available.

NB: please make sure that the --memo-directory specified here is writable by you.

screen -S idr0051ngff

for i in `ls /nfs/bioimage/drop/idr0051-fulton-tailbudlightsheet/patterns/`; do echo $i; ~/bioformats2raw-0.6.0-24/bin/bioformats2raw --memo-directory ../memo /nfs/bioimage/drop/idr0051-fulton-tailbudlightsheet/patterns/$i ${i%.*}.ome.zarr; done

($i is the pattern file, ${i%.*}.ome.zarr strips the .pattern file extension and adds .ome.zarr; this should work for pattern, screen and also companion file extensions)
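
The parameter expansion in isolation, for reference:

```shell
# ${i%.*} removes the shortest trailing .<extension>; .ome.zarr is then appended
i="plate_A1.pattern"
echo "${i%.*}.ome.zarr"   # prints plate_A1.ome.zarr
```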

Upload to EBI s3 for testing

Upload 1 or 2 Plates or Images to EBI's s3, so we can validate that the data can be viewed and imported on s3.

Create a bucket from a local aws install.
Once the AWS CLI is installed, just run aws configure and enter the Access key and Secret key; use the defaults for the other options.

$ aws --endpoint-url https://uk1s3.embassy.ebi.ac.uk s3 mb s3://idr0010
make_bucket: idr0010

And update policy and CORS config as at https://github.com/IDR/deployment/blob/master/docs/object-store.md#policy (NB: replace idr0000 with e.g. idr0010 in the sample config etc)

Upload the data using mc, installed on dev servers where data is generated:

$ ssh pilot-zarr1-dev
$ wget https://dl.min.io/client/mc/release/linux-amd64/mc
$ ./mc config host add uk1s3 https://uk1s3.embassy.ebi.ac.uk
Enter Access Key: X8GE11ZK************
Enter Secret Key: 
Added `uk1s3` successfully.

$ /home/wmoore/mc cp -r idr0010/ uk1s3/idr0010/zarr

You should now be able to view and do some validation of the data with ome-ngff-validator and vizarr, e.g.:
https://ome.github.io/ome-ngff-validator/?source=https://uk1s3.embassy.ebi.ac.uk/idr0025/zarr/10x+images+plate+3.ome.zarr

https://hms-dbmi.github.io/vizarr/?source=https://uk1s3.embassy.ebi.ac.uk/idr0025/zarr/10x+images+plate+3.ome.zarr

Submission to BioStudies

Once the NGFF data has been validated to your satisfaction, we can upload to BioStudies.

We need to create a .zip file for each .ome.zarr Fileset.
Where space is short, it can be useful to use -m, which moves files into the zip and deletes the originals.

For a single zarr, this looks like $ zip -mr image.ome.zarr.zip image.ome.zarr.

To convert all the zarr Filesets for a study, e.g.:

screen -S idr0010_zip
cd idr0010
for i in */; do zip -mr "${i%/}.zip" "$i"; done

This will create the zips in the same directory as the zarrs, but we want a directory containing just the zips for upload...

mkdir idr0010
mv *.zip idr0010/
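
A dry run of the naming in a scratch directory, showing how ${i%/} strips the trailing slash that the */ glob leaves on each directory name:

```shell
# Scratch directory with one fabricated zarr Fileset
cd "$(mktemp -d)"
mkdir -p "Tonsil 1.ome.zarr/0"
# Print the zip command that the loop above would run for each directory
for i in */; do echo "zip -mr \"${i%/}.zip\" \"$i\""; done
```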

Upload via Aspera, using the "secret directory".
Log in to BioStudies with the IDR account.
Click on the FTP/Aspera button at https://www.ebi.ac.uk/biostudies/submissions/files

# install...
$ wget https://ak-delivery04-mul.dhe.ibm.com/sar/CMA/OSA/08q6g/0/ibm-aspera-cli-3.9.6.1467.159c5b1-linux-64-release.sh
$ chmod +x ibm-aspera-cli-3.9.6.1467.159c5b1-linux-64-release.sh 
$ bash ibm-aspera-cli-3.9.6.1467.159c5b1-linux-64-release.sh 
$ cd .aspera/cli/bin

$ ./ascp -P33001 -i ../etc/asperaweb_id_dsa.openssh -d /path/to/idr00xx [email protected]:xx/xxxxxxxxxxxxxxxxxxxxxxx

Some JavaScript you can run in the browser console to get the file names in the submission table:

let names = [];
[].forEach.call(document.querySelectorAll("div [role='row'] .ag-cell[col-id='name']"), function(div) {
  names.push(div.innerHTML.trim());
});
console.log(names.join("\n"));
console.log(names.length);

Create a tsv file that lists all the Filesets for the submission, with the first column named Files. See https://www.ebi.ac.uk/bioimage-archive/help-file-list/

E.g. idr0054_files.tsv:

Files
idr0054/Tonsil 1.ome.zarr.zip
idr0054/Tonsil 2.ome.zarr.zip
idr0054/Tonsil 3.ome.zarr.zip
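
A file list like this can be generated from the directory of zips with something like the following (scratch sketch; the filenames are from the example above):

```shell
# Scratch setup standing in for the real idr0054/ directory of zips
cd "$(mktemp -d)"
mkdir idr0054
touch "idr0054/Tonsil 1.ome.zarr.zip" "idr0054/Tonsil 2.ome.zarr.zip"
# Header row "Files" followed by one relative path per line
{ echo "Files"; ls idr0054/*.zip; } > idr0054_files.tsv
cat idr0054_files.tsv
```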

Upload this to the same location as above (via FTP or using the web UI).
This is used to specify which files are to be used in the submission.
You should be able to see all the uploaded files at https://www.ebi.ac.uk/biostudies/submissions/files

Create a new submission at https://www.ebi.ac.uk/biostudies/submissions/

Once submitted, we need to ask EBI to process the submission, unzip each zarr, and upload the data to s3.
BioStudies will assign a uuid to each.
They will provide a mapping from each zip file to uuid.zarr as csv:

Spreadsheet for keeping track of the submission status:
https://docs.google.com/spreadsheets/d/1P3dn-uL9KzE9O7XAKhpL8fUMTG3LWedMgjzSdnfAjQ4/edit#gid=0

Tonsil 2.ome.zarr.zip, https://uk1s3.embassy.ebi.ac.uk/bia-integrator-data/S-BIAD704/36cb5355-5134-4bdc-bde6-4e693055a8f9/36cb5355-5134-4bdc-bde6-4e693055a8f9.zarr/0
Tonsil 1.ome.zarr.zip, https://uk1s3.embassy.ebi.ac.uk/bia-integrator-data/S-BIAD704/5583fe0a-bbe6-4408-ab96-756e8e96af55/5583fe0a-bbe6-4408-ab96-756e8e96af55.zarr/0
Tonsil 3.ome.zarr.zip, https://uk1s3.embassy.ebi.ac.uk/bia-integrator-data/S-BIAD704/3b4a8721-1a28-4bc4-8443-9b6e145efbe9/3b4a8721-1a28-4bc4-8443-9b6e145efbe9.zarr/0

This needs to be used to create the necessary symlinks below.
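For example, the mapping could be turned into symlinks under the goofys mount with something like this sketch (the target directory and the derived paths are assumptions; adapt to your Fileset layout):

```shell
# link_from_mapping CSV TARGET_DIR: for each "zip, url" line of the
# BioStudies mapping, derive the path under the /bia-integrator-data
# goofys mount and create a symlink named after the original zip
# (minus the .zip suffix) in TARGET_DIR.
link_from_mapping() {
  csv=$1
  target_dir=$2
  while IFS=, read -r zipname url; do
    # trim the leading space, drop the trailing /0, and swap the s3
    # endpoint for the local mount point
    local_path=$(printf '%s' "$url" | sed -e 's/^ *//' -e 's|/0$||' \
      -e 's|https://uk1s3.embassy.ebi.ac.uk||')
    ln -sfn "$local_path" "$target_dir/${zipname%.zip}"
  done < "$csv"
}

# e.g. link_from_mapping idr0054_mapping.csv /ngff/idr0054
```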

If not already done, mount the bia-integrator-data bucket on the server machine and check to see if files are available:

$ sudo mkdir /bia-integrator-data && sudo /opt/goofys --endpoint https://uk1s3.embassy.ebi.ac.uk/ -o allow_other bia-integrator-data /bia-integrator-data

$ ls /bia-integrator-data/S-BIAD704
36cb5355-5134-4bdc-bde6-4e693055a8f9  3b4a8721-1a28-4bc4-8443-9b6e145efbe9  5583fe0a-bbe6-4408-ab96-756e8e96af55

Make NGFF Filesets

Work in progress
Use https://github.com/joshmoore/omero-mkngff to create filesets based on the mounted s3 NGFF Filesets.

See IDR/idr-utils#56 for a script that generates the inputs required for omero-mkngff.

conda create -n mkngff -c conda-forge -c ome omero-py bioformats2raw
conda activate mkngff
pip install 'omero-mkngff @ git+https://github.com/joshmoore/omero-mkngff@main'
omero login demo@localhost

omero mkngff setup > setup.sql
omero mkngff sql --secret=$SECRET 5287125 a.ome.zarr/ > my.sql
sudo -u postgres psql idr < setup.sql
sudo -u postgres psql idr < my.sql

sudo -u omero-server mkdir /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-2/2023-06/22/12-46-39.975_converted/
mv a.ome.zarr /tmp
ln -s /tmp/a.ome.zarr /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-2/2023-06/22/12-46-39.975_converted/a.ome.zarr
omero render test Image:14834721 # Failing here

Validation

See IDR/idr-utils#55
Check out that branch of idr-utils (if not yet merged).

The script there allows us to check the pixel data for the lowest resolution of each image in a study, validating that each plane is identical to the corresponding one in IDR.

This could take a while, so let's run it in a screen session...

sudo -u omero-server -s
screen -S idr0012_check_pixels
source /opt/omero/server/venv3/bin/activate
omero login demo@localhost
cd /uod/idr/metadata/idr-utils/scripts
python check_pixels.py Plate:4299 /tmp/check_pixels_idr0012.log

Archived workflow below

The sections below describe a previous workflow (prior to the omero-mkngff approach)

Make a metadata-only copy of the data

Since we want to import NGFF data without chunks, we need to create a copy of the data without chunks for import. The easiest way to do this is to use aws to sync the data, ignoring chunks.

We want these to be owned by omero-server user in a location they can access, so they can be imported. Location at import time isn't too important.

$ screen -S idr0010_aws_sync      # can take a while if lots of data    
$ mkdir idr0010
$ cd idr0010
$ aws s3 sync --no-sign-request --exclude '*' --include "*/.z*" --include "*.xml" --endpoint-url https://uk1s3.embassy.ebi.ac.uk s3://idr0010/zarr .

$ sudo mv -f ./* /ngff/idr0010/
$ cd /ngff/
$ sudo chown -R omero-server idr0010/

Import metadata-only data

We can now perform a regular import as usual. Instead of creating a bulk import config, use a for loop to iterate through each plate in the directory, setting the import name (with .ome.zarr or .zarr stripped, e.g. for idr0036) so that the data isn't named METADATA.ome.xml and Plate names match the original data. You could also add a target Screen or Dataset (not shown), or move the data into a container with the webclient UI after import:

sudo -u omero-server -s
screen -S idr0010_ngff
source /opt/omero/server/venv3/bin/activate
export OMERODIR=/opt/omero/server/OMERO.server
omero login demo@localhost

cd /ngff/idr0010
for dir in *; do
  omero import --transfer=ln_s --depth=100 --name=${dir/.ome.zarr/} --skip=all $dir --file /tmp/$dir.log  --errs /tmp/$dir.err;
done

Update symlinks

Mount the s3 bucket on IDR server machine: (idr0125-pilot or idr0138-pilot)

sudo mkdir /idr0010 && sudo /opt/goofys --endpoint https://uk1s3.embassy.ebi.ac.uk/ -o allow_other idr0010 /idr0010

See IDR/idr-utils#54
Check out that branch of idr-utils (if not yet merged).

We need to specify the container (e.g. Screen, Plate, Dataset, Image or Fileset) and the path where the data is mounted:
If the path to the data in each Fileset is e.g. filesetPrefix/plate1.zarr/.. and the path to each mounted plate is e.g. /path/to/plates/plate1.zarr, we can run the following command to create one symlink per plate, from /ManagedRepository/filesetPrefix/plate1.zarr to /path/to/plates/plate1.zarr.

The script also renders a single Image from each Fileset before updating symlinks, which avoids subsequent ResourceErrors.
The script can be run repeatedly on the same data without issue, e.g. if it fails part-way through and needs a re-run to complete.

The --repo option defaults to /data/OMERO/ManagedRepository.
You can also use the --dry-run and --report options:

$ sudo -u omero-server -s
$ source /opt/omero/server/venv3/bin/activate
$ omero login demo@localhost
$ python idr-utils/scripts/managed_repo_symlinks.py Screen:123 /path/to/plates/ --report

Fileset: 5286929 /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-6/2023-04/25/13-53-43.777/
fs_contents ['10-34.ome.zarr']
Link from /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-6/2023-04/25/13-53-43.777/10-34.ome.zarr to /idr0010/zarr/10-34.ome.zarr
...

Swap Filesets

See IDR/idr-utils#53
Check out that branch of idr-utils (if not yet merged).

The first Object (Screen, Plate, Image, Fileset) is the original data that we want to update to use NGFF Fileset, and the second is the NGFF data we imported above. In the case of Screens, Filesets are swapped between pairs of Plates matched by name (you should check that Plate names match before running this script).
The 3rd required argument is a file where you can write the sql commands that are required to update Pixels objects (we can't yet update these via the OMERO API).
The script supports --dry-run and --report flags.

$ source /opt/omero/server/venv3/bin/activate
$ omero login demo@localhost
$ python idr-utils/scripts/swap_filesets.py Screen:1202 Screen:3204 /tmp/idr0012_filesetswap.sql --report

This will write a psql command for each Fileset that we then need to execute...

$ export OMERODIR=/opt/omero/server/OMERO.server
$ omero config get --show-password

# Use the password, host etc to run the sql file generated above...
$ PGPASSWORD=****** psql -U omero -d idr -h 192.168.10.102 -f /tmp/idr0012_filesetswap.sql

The generated SQL contains one UPDATE statement per Fileset, like:

UPDATE pixels SET name = '.zattrs', path = 'demo_2/Blitz-0-Ice.ThreadPool.Server-16/2023-04/12/10-20-20.483/10x_images_plate_2.ome.zarr' where image in (select id from Image where fileset = 5286921);

You can then view Images from the original data which is now using an NGFF Fileset!

Cleanup

We can now delete the uk1s3 data and buckets created above for testing.
The original Filesets will remain as "orphans".

idr0012 plates, Row C images have sizeZ=3 sizeC=1 instead of sizeZ=1 sizeC=3

For all plates in idr0012, the Row C thumbnails appear greyscale and the images are not viewable:
They have 1 Channel and sizeZ=3, instead of a single Z - 3-Channel as for the rest of the Images on each Plate.

E.g first Row C image of first plate:
https://idr.openmicroscopy.org/webclient/?show=image-1811462

On idr-testing omeroreadwrite, trying to render tiles, e.g. http://localhost:1080/webgateway/render_image_region/1811462/1/0/?tile=1,0,0,512,512

Gives this in Blitz log

2024-04-19 08:47:19,676 INFO  [        ome.services.util.ServiceHandler] (Server-118)  Rslt:	null
2024-04-19 08:47:19,677 INFO  [        ome.services.util.ServiceHandler] (Server-142)  Meth:	interface omeis.providers.re.RenderingEngine.renderCompressed
2024-04-19 08:47:19,677 INFO  [        ome.services.util.ServiceHandler] (Server-142)  Args:	[Type: XY, z=1, t=0; Region: x=0 y=0 width=512 height=512, renderShapes=false, shapeIds=[]]
2024-04-19 08:47:19,677 INFO  [             omeis.providers.re.Renderer] (Server-142) Using: 'omeis.providers.re.GreyScaleStrategy' rendering strategy.
2024-04-19 08:47:19,678 INFO  [                 org.perf4j.TimingLogger] (Server-142) start[1713516439677] time[0] tag[omero.call.exception]
2024-04-19 08:47:19,679 WARN  [        ome.services.util.ServiceHandler] (Server-142) IllegalArgumentException thrown.

java.lang.IllegalArgumentException: Invalid Z index: 1/1
	at loci.formats.FormatTools.getIndex(FormatTools.java:474)
	at loci.formats.FormatTools.getIndex(FormatTools.java:409)
	at loci.formats.ChannelSeparator.getIndex(ChannelSeparator.java:266)
	at loci.formats.ReaderWrapper.getIndex(ReaderWrapper.java:498)
	at ome.io.bioformats.BfPixelsWrapper.getTile(BfPixelsWrapper.java:350)
	at ome.io.bioformats.BfPixelBuffer.getTile(BfPixelBuffer.java:499)
	at omeis.providers.re.data.PlaneFactory.createPlane(PlaneFactory.java:193)
	at omeis.providers.re.GreyScaleStrategy.renderAsPackedInt(GreyScaleStrategy.java:153)
	at omeis.providers.re.Renderer.renderAsPackedInt(Renderer.java:558)
	at ome.services.RenderingBean.renderAsPackedInt(RenderingBean.java:512)
	at ome.services.RenderingBean.renderCompressed(RenderingBean.java:542)
	at jdk.internal.reflect.GeneratedMethodAccessor1485.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)

Rendering the first plane (Z=0, T=0) works OK:
https://idr.openmicroscopy.org/webclient/render_image/1811462/0/0/

I can't see any pixel data for channels 2 and 3 (all black)
http://localhost:1080/webgateway/render_image/1811462/0/0/?c=2|0:5$fff,3|0:5$fff

idr-testing May 2024

Steps needed on idr-next for NGFF upgrade.
NB: current checklist is for actions on idr-testing (newly redeployed on 21st May 2024)

Detailed workflow is at https://github.com/IDR/mkngff_upgrade_scripts but this is an outline; it also includes study-specific jobs:

Manual Software updates (should be part of the original deployment for idr-next):

  • Update ZarrReader if needed to include recent work
  • Install mkngff in venv, including recent PR branches (if not yet merged)
  • Install latest iviewer 0.14.0

NGFF and other updates:

idr0016-wawer-bioactivecompoundprofiling S-BIAD851

https://github.com/IDR/idr0016-wawer-bioactivecompoundprofiling

Sample plate conversion failed with:

(base) [dlindner@pilot-zarr2-dev idr0016]$ time /home/dlindner/bioformats2raw/bin/bioformats2raw --memo-directory ../memo 24320.screen 24320.ome.zarr
OpenJDK 64-Bit Server VM warning: You have loaded library /tmp/opencv_openpnp6586654250319720590/nu/pattern/opencv/linux/x86_64/libopencv_java342.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
[Fatal Error] :1:84: Character reference "&#0" is an invalid XML character.
Exception in thread "main" picocli.CommandLine$ExecutionException: Error while calling command (com.glencoesoftware.bioformats2raw.Converter@63a65a25): java.lang.RuntimeException: org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 84; Character reference "&#0" is an invalid XML character.
        at picocli.CommandLine.executeUserObject(CommandLine.java:1962)
        at picocli.CommandLine.access$1300(CommandLine.java:145)
        at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2352)
        at picocli.CommandLine$RunLast.handle(CommandLine.java:2346)
        at picocli.CommandLine$RunLast.handle(CommandLine.java:2311)
        at picocli.CommandLine$AbstractParseResultHandler.handleParseResult(CommandLine.java:2172)
        at picocli.CommandLine.parseWithHandlers(CommandLine.java:2550)
        at picocli.CommandLine.parseWithHandler(CommandLine.java:2485)
        at picocli.CommandLine.call(CommandLine.java:2761)

This error (Character reference "&#0" is an invalid XML character) is already referenced by IDR/bioformats#29.

Replace existing Fileset with NGFF Fileset

To avoid re-importing Images when updating data to NGFF, we want to create a new Fileset for the NGFF data, and replace old Filesets.

Testing workflow:

Imported png:
https://merge-ci.openmicroscopy.org/web/webclient/?show=image-257915

Converted same png to NGFF:

$ bioformats2raw OME_screenshot.png OME_screenshot.zarr --tile_width 256

To be able to tell the NGFF image apart, a chunk of the alpha channel was removed:

$ rm OME_screenshot.zarr/0/0/0/3/0/0/0

Edited import.py to upload ALL files in the directory:

import os

def get_files_for_fileset(fs_path):
    # Walk the Fileset directory and collect every file, so that all
    # NGFF chunks and metadata files are uploaded.
    filepaths = []
    for path, subdirs, files in os.walk(fs_path):
        for name in files:
            print(os.path.join(path, name))
            filepaths.append(os.path.join(path, name))
    return filepaths

Import the Zarr to create a Fileset AND import the Fileset

$ python import.py OME_screenshot.zarr --dataset 67919

Re-import this... NGFF_missing_chunk - as expected

https://merge-ci.openmicroscopy.org/web/webclient/?show=image-257917

Fileset ID: 138110

We want to update the Image:257915 (png) above to use Fileset:138110

$ omero obj update Image:257915 fileset=Fileset:138110

The 'png' image now lists NGFF files in its Fileset, but looks the same.

Now try deleting the png Fileset...

$ omero delete Fileset:138108 --report
omero.cmd.Delete2 Fileset:138108 ok
Steps: 6
Elapsed time: 0.674 secs.
Flags: []
Deleted objects
  OriginalFile:1767759,1767760
  Fileset:138108
  FilesetEntry:1058458
  FilesetJobLink:429236-429240
  IndexingJob:498962
  JobOriginalFileLink:216552
  MetadataImportJob:498959
  PixelDataJob:498960
  ThumbnailGenerationJob:498961
  UploadJob:498958

Now, trying to view the 'png' image gives:

File "/home/omero/workspace/OMERO-web/.venv3/lib64/python3.6/site-packages/omero_api_RenderingEngine_ice.py", line 1192, in load
    return _M_omero.api.RenderingEngine._op_load.invoke(self, ((), _ctx))
...
    serverExceptionClass = ome.conditions.ResourceError
    message = Error instantiating pixel buffer: /home/omero/omero-server-data/ManagedRepository/user-3_454/Blitz-0-Ice.ThreadPool.Server-10/2023-02/23/09-01-23.654/OME_screenshot.png
}

To cut a long story short...

@jburel found that you need to update Pixels to point at the new NGFF data:

psql -U postgres -d OMERO-server -c "UPDATE pixels SET name = '.zattrs', path = 'user-3_454/Blitz-0-Ice.ThreadPool.Server-5/2023-02/23/09-26-58.983/OME_screenshot.zarr' where id = 256813"
UPDATE 1


This allows you to view the original `png` using the underlying NGFF Fileset!

idr0043 TIFs detected as SVS

Reported at #696 (comment)

ResourceErrors from idr0043, affecting Datasets from e.g. 9101 -> 14734.
E.g. viewing http://localhost:1080/webclient/img_detail/13004862/ gives:

Stack trace ``` 2024-06-26 13:20:46,228 DEBUG [ loci.formats.Memoizer] (.Server-15) start[1719408046201] time[26] tag[loci.formats.Memoizer.setId] 2024-06-26 13:20:46,228 ERROR [ ome.io.bioformats.BfPixelBuffer] (.Server-15) Failed to instantiate BfPixelsWrapper with /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-13/2021-06/05/12-54-52.237/141893_A_1_1.tif 2024-06-26 13:20:46,228 ERROR [ ome.io.nio.PixelsService] (.Server-15) Error instantiating pixel buffer: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-13/2021-06/05/12-54-52.237/141893_A_1_1.tif java.lang.RuntimeException: java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0 at ome.io.bioformats.BfPixelBuffer.reader(BfPixelBuffer.java:79) at ome.io.bioformats.BfPixelBuffer.setSeries(BfPixelBuffer.java:124) at ome.io.nio.PixelsService.createBfPixelBuffer(PixelsService.java:898) at ome.io.nio.PixelsService._getPixelBuffer(PixelsService.java:653) at ome.io.nio.PixelsService.getPixelBuffer(PixelsService.java:571) at ome.services.RenderingBean$12.doWork(RenderingBean.java:2205) at jdk.internal.reflect.GeneratedMethodAccessor319.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.util.Executor$Impl.execute(Executor.java:392) at ome.services.RenderingBean.getPixelBuffer(RenderingBean.java:2202) at ome.services.RenderingBean.load(RenderingBean.java:417) at jdk.internal.reflect.GeneratedMethodAccessor1342.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy122.load(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor1342.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy122.load(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor1417.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at 
ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.RenderingEngineI.load_async(RenderingEngineI.java:316) at jdk.internal.reflect.GeneratedMethodAccessor1416.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy124.load_async(Unknown Source) at omero.api._RenderingEngineTie.load_async(_RenderingEngineTie.java:248) at omero.api._RenderingEngineDisp.___load(_RenderingEngineDisp.java:1223) at omero.api._RenderingEngineDisp.__dispatch(_RenderingEngineDisp.java:2405) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0 at 
java.base/jdk.internal.util.Preconditions.outOfBounds(Preconditions.java:64) at java.base/jdk.internal.util.Preconditions.outOfBoundsCheckIndex(Preconditions.java:70) at java.base/jdk.internal.util.Preconditions.checkIndex(Preconditions.java:248) at java.base/java.util.Objects.checkIndex(Objects.java:374) at java.base/java.util.ArrayList.get(ArrayList.java:459) at loci.formats.MetadataList.get(MetadataList.java:121) at loci.formats.SubResolutionFormatReader.getCurrentCore(SubResolutionFormatReader.java:238) at loci.formats.FormatReader.getPixelType(FormatReader.java:735) at loci.formats.MetadataTools.populatePixels(MetadataTools.java:149) at loci.formats.MetadataTools.populatePixels(MetadataTools.java:116) at loci.formats.in.BaseTiffReader.initMetadataStore(BaseTiffReader.java:426) at loci.formats.in.SVSReader.initMetadataStore(SVSReader.java:669) at loci.formats.in.BaseTiffReader.initMetadata(BaseTiffReader.java:99) at loci.formats.in.BaseTiffReader.initFile(BaseTiffReader.java:610) at loci.formats.FormatReader.setId(FormatReader.java:1480) at loci.formats.ImageReader.setId(ImageReader.java:865) at ome.io.nio.PixelsService$3.setId(PixelsService.java:869) at loci.formats.ReaderWrapper.setId(ReaderWrapper.java:692) at loci.formats.ChannelFiller.setId(ChannelFiller.java:258) at loci.formats.ReaderWrapper.setId(ReaderWrapper.java:692) at loci.formats.ChannelSeparator.setId(ChannelSeparator.java:317) at loci.formats.ReaderWrapper.setId(ReaderWrapper.java:692) at loci.formats.Memoizer.setId(Memoizer.java:726) at ome.io.bioformats.BfPixelsWrapper.(BfPixelsWrapper.java:52) at ome.io.bioformats.BfPixelBuffer.reader(BfPixelBuffer.java:73) ... 
82 common frames omitted 2024-06-26 13:20:46,229 INFO [ org.perf4j.TimingLogger] (.Server-15) start[1719408046193] time[35] tag[omero.call.exception] 2024-06-26 13:20:46,229 INFO [ ome.services.util.ServiceHandler] (.Server-15) Excp: ome.conditions.ResourceError: Error instantiating pixel buffer: /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-13/2021-06/05/12-54-52.237/141893_A_1_1.tif 2024-06-26 13:20:46,229 INFO [ org.perf4j.TimingLogger] (.Server-15) start[1719408046181] time[47] tag[omero.call.exception] ```
[sbesson@test122-omeroreadwrite ~]$ ./bftools/showinf -version
Version: 7.3.0
Build date: 18 April 2024
VCS revision: acc4b6de79e55275b5e6c8200f8458f6d93c9ba0
[sbesson@test122-omeroreadwrite ~]$ ./bftools/showinf -nopix /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-13/2021-06/05/12-54-52.237/141893_A_1_1.tif
Checking file format [Aperio SVS]
Initializing reader
SVSReader initializing /data/OMERO/ManagedRepository/demo_2/Blitz-0-Ice.ThreadPool.Server-13/2021-06/05/12-54-52.237/141893_A_1_1.tif
Reading IFDs
Populating metadata
Populating OME metadata
Exception in thread "main" java.lang.IllegalArgumentException: Invalid resolution: 0
	at loci.formats.CoreMetadataList.flattenedIndex(CoreMetadataList.java:178)
	at loci.formats.SubResolutionFormatReader.getSeries(SubResolutionFormatReader.java:145)
	at loci.formats.MetadataTools.populatePixels(MetadataTools.java:135)
	at loci.formats.MetadataTools.populatePixels(MetadataTools.java:116)
	at loci.formats.in.BaseTiffReader.initMetadataStore(BaseTiffReader.java:426)
	at loci.formats.in.SVSReader.initMetadataStore(SVSReader.java:669)
	at loci.formats.in.BaseTiffReader.initMetadata(BaseTiffReader.java:99)
	at loci.formats.in.BaseTiffReader.initFile(BaseTiffReader.java:610)
	at loci.formats.FormatReader.setId(FormatReader.java:1480)
	at loci.formats.ImageReader.setId(ImageReader.java:865)
	at loci.formats.ReaderWrapper.setId(ReaderWrapper.java:692)
	at loci.formats.tools.ImageInfo.testRead(ImageInfo.java:1043)
	at loci.formats.tools.ImageInfo.main(ImageInfo.java:1129)

Seb: ome/bioformats#4144

Try to remove need for PSQL step

First step (as discussed in IDR meeting today) is to audit the code and check what uses the path and name fields of the Pixels table.

Testing mkngff on idr-testing

Seb: "test120 now deployed with OMERO.server and available at https://idr-testing.openmicroscopy.org/".
As discussed at IDR meeting 11th September, good to test mkngff with ALL the NGFF filesets:

  • idr0004 mkngff 2.5 hrs (46 x 100-well-Plate - 3mins per Plate) - setId 1 min per Fileset
  • idr0010 - 148 x 384-well-Plates - mkngff 9 mins per plate 142/147 done
  • idr0011 mkngff 3.5 hrs (182 x 48-well-Plate, 1 min per plate) - setId 1 min per Plate. Improved to 40-50 secs
  • idr0012 - 68 x 320-well-Plates (2 fields) - mkngff 6 mins per plate. Memo file took 42 minutes
  • idr0013 - 538 x 384-well-Plates - mkngff 3 mins per plate x 538 => 26 hours. memo file took 11 minutes x 538 = 4 days
  • idr0015 - 84 x 396-well-Plates
  • idr0016 - 413 x 384-well-Plates (6 fields) mkngff 7.5 mins per Plate - x 413 => 51 hours
  • idr0025 mkngff 3 x 96-well-Plates. setId 95 mins for a plate, 4 mins for Plate 2, 12 mins for Plate 3.
  • idr0026 mkngff took 90 mins (111 images). setId took 26 secs for a single image
  • idr0033 - mkngff took 18 hours for 12 x 384-well-plates (9 fields). memo file took 24 hours for a single Plate, then 12 hours for Plate 2 - Now down to 81 minutes
  • idr0035 - mkngff took 6.5 hours (55 plates). setId took 1 hour for a single Plate
  • idr0036 - 20 Plates
  • idr0051 - 5 Images
  • idr0054 - 3 Images
  • idr0090 - 22 Plates
  • idr0091 - 342 Images - on idr0125-pilot mkngff took a few secs each. 28 mins total

View on https://idr-testing.openmicroscopy.org - Updated ZarrReader on 2023.09.25

idr0033-rohban-pathways S-BIAD848

Conversion

  • 41744
  • 41744_illum_corrected
  • 41749
  • 41749_illum_corrected
  • 41754
  • 41754_illum_corrected
  • 41755
  • 41755_illum_corrected
  • 41756
  • 41756_illum_corrected
  • 41757
  • 41757_illum_corrected

NGFF perf testing

Compare formats (on disk)

To compare the performance of NGFF data (ZarrReader) with other formats (both on disk), we want to compare the NGFF version of the data alongside the same data in its original format on the same server.

Choose some data to work with: idr0003 is not too big at 2.3G for a plate. Summary (more details below):

  • Use bioformats2raw to convert a plate from idr0003 to NGFF.
  • zip, copy to idr-testing, unzip and perform regular import (not in-place)
  • Update Plate name and place it in idr0003 Screen
  • With the preview panel enabled, click on 25 Wells of both plates (original and NGFF copy), recording the time for render_image to load the initial plane. Plot the average over the 25 Wells. Times in millisecs; error bars are 1 std dev.
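For reference, given the recorded times collected into a one-value-per-line file, the plotted mean and standard deviation can be computed with a small awk helper (the file name is hypothetical):

```shell
# mean_stddev FILE: print the mean and (population) standard deviation
# of a file containing one time value per line.
mean_stddev() {
  awk '{ sum += $1; sumsq += $1 * $1; n++ }
       END { mean = sum / n
             printf "mean=%.3f stddev=%.3f\n", mean, sqrt(sumsq / n - mean * mean) }' "$1"
}

# e.g. mean_stddev render_times_ngff.txt
```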

Screenshot 2024-03-05 at 11 14 59

Conclusion: NGFF is no slower (maybe faster)?

Compare disk vs s3

We want to test the performance of loading data from s3 compared with loading the same data from local disk.
Use idr0010 data since all plates are identical in terms of size etc:

  • Downloaded plate.ome.zarr.zip data previously uploaded to BioStudies
  • Unzip and place in /ngff dir on each idr-testing server
  • For a plate, replace the symlink from ManagedRepository -> mounted s3 directory with a symlink ManagedRepository -> /ngff/plate.ome.zarr
  • Compare performance loading initial plane for 25 Wells for the plate on disk with 25 Wells from an identical plate using s3 data. Times are in seconds: Std deviation is 0.267 for S3 and 0.096 for Disk (can't seem to plot different error bars on each column in Numbers)!
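The symlink swap in the third step is just an ln -sfn (a sketch; both paths below are illustrative):

```shell
# swap_link NEW_TARGET LINK: repoint LINK at NEW_TARGET.
# -f replaces the existing symlink; -n treats LINK itself as the link
# rather than following it into the directory it points at.
swap_link() {
  ln -sfn "$1" "$2"
}

# e.g. repoint the ManagedRepository entry at the local copy:
# swap_link /ngff/plate.ome.zarr \
#   /data/OMERO/ManagedRepository/filesetPrefix/plate.ome.zarr
```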

Screenshot 2024-03-05 at 11 43 10

Conclusion: Data access via S3 is slower than on disk.

Unused "Homo Sapiens" annotation?

idr=> select distinct annotation_id, name, value from annotation_mapvalue where (name = 'Organism' and value = 'Homo sapiensx' or value = 'Homo Sapiens');
 annotation_id |   name   |    value
---------------+----------+--------------
      16564093 | Organism | Homo Sapiens
(1 row)
[jamoore@test73-omeroreadwrite ~]$ /opt/omero/server/OMERO.server/bin/omero hql 'select el from EventLog el where entityId = 16564093'
Using session for public@localhost:4064. Idle timeout: 10 min. Current group: Public
 # | Class     | Id        | action | entityId | entityType                                | event
---+-----------+-----------+--------+----------+-------------------------------------------+------------------
...
 3 | EventLogI | 391698750 | INSERT | 16564093 | ome.model.annotations.MapAnnotation       | EventI:130976605
[jamoore@test73-omeroreadwrite ~]$ /opt/omero/server/OMERO.server/bin/omero hql 'select l from ome.model.IAnnotationLink l where l.child.id = 16564093'

shows nothing.

The event that created the annotation (EventI:130976605):

 # | Class  | Id        | experimenterGroup    | experimenter    | session          | type         | time
---+--------+-----------+----------------------+-----------------+------------------+--------------+--------------------------
 0 | EventI | 130976605 | ExperimenterGroupI:3 | ExperimenterI:2 | SessionI:4178906 | EventTypeI:4 | Tue Feb  5 13:31:47 2019
(1 row)

idr0001-graml-sysgro to NGFF

idr0001 has 192 96-well plates, 6 acquisitions each.
One plate converted below is 47 GB.
Approx 4.5 TB in total.
bioformats2raw took ~30 min to convert one plate => approx 4 days in total.
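The wall-clock estimate works out as a quick back-of-envelope check:

```python
# Back-of-envelope check of the conversion estimate above.
plates = 192
minutes_per_plate = 30          # observed bioformats2raw time for one plate

total_hours = plates * minutes_per_plate / 60
total_days = total_hours / 24
# 192 plates x 30 min = 96 h, i.e. 4 days of sequential conversion
```

Running plates in parallel would shorten this proportionally.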

NB: multi-acquisition Flex data (idr0001) needs converting because support for it has not been ported from IDR to mainline Bio-Formats: ome/bioformats#3537

idr-next steps for NGFF upgrade

Steps needed on idr-next for NGFF upgrade.
NB: current checklist is for actions on idr-testing (newly redeployed on 10th Oct)

Detailed workflow is at IDR/omero-mkngff#2, but this is an outline; it also includes study-specific jobs:

goofys needs remounting

All the BioStudies s3 data imported to idr0125-pilot is currently giving ResourceError when trying to view images.
This is due to a failure of the goofys-mounted BioStudies s3 bucket.

$ ls /bia-integrator-data 
ls: cannot access /bia-integrator-data: Transport endpoint is not connected

See kahing/goofys#208
Advice is to unmount and re-mount.

Tried unmounting as described at kahing/goofys#77

As omero-server user:

$ fusermount -u /bia-integrator-data 
fusermount: entry for /bia-integrator-data not found in /etc/mtab

This is currently a blocker on my NGFF update work on idr0125-pilot (I can use idr0138-pilot in the meantime), but it also raises the question of how to detect and fix this once we start using it on the production IDR server.

cc @sbesson @joshmoore

broken svs on Bio-Formats update

When testing ome/omero-web#536 on idr-testing, I noticed this issue, which I think Seb mentioned was due to a Bio-Formats update (see below).

Deployed on idr-testing from the branch at #537, this is returning a different zoomLevelScaling than previously!?
On https://idr.openmicroscopy.org/webclient/img_detail/9840218/ that image has:

    "tiles": true,
    "tile_size": {
        "width": 240,
        "height": 240
    },
    "levels": 3,
    "zoomLevelScaling": {
        "0": 1.0,
        "1": 0.2499840367792606,
        "2": 0.029308473277568484
    },

But on idr-testing, a different zoomLevelScaling is causing problems:

Screenshot 2024-02-28 at 13 40 52

From slack idr, 28th Feb
Seb: "SVS in particular is a format which went through several changes in recent versions"

Will: Ah - so that bug may not be due to my PR?

Seb:
14:30
where that bug == the resolution levels are different between deployments or that bug == the resolution levels on idr-testing are incorrect?
14:32
actually the bug is clearly in production IDR (and has been addressed in recent Bio-Formats versions)
14:33
https://idr.openmicroscopy.org/webclient/img_detail/9840219/?dataset=10450 is absolutely not a label image, it's a resolution level of the whole slide image
14:34
so this dataset will partly break as part of the OMERO.server upgrade and we might need to consider reimporting these files unfortunately

idr0009 ScanR missing Wells

Currently on idr-next (prod120) idr0009 Plates are not handling missing Wells correctly.

Seb: “The problem is that this data was originally loaded with a version of the ScanR reader that represents missing wells as black images while the default behavior in Bio-Formats is to represent these as sparse plates. When this was backported to OME Bio-Formats, an option was introduced to support both behaviors - see https://bio-formats.readthedocs.io/en/latest/formats/options.html I think the action is to update all plates in this study to set scanr.skip_missing_wells to false , delete & regenerate the memo files and retest”
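One hedged way to apply the option per plate is a `.bfoptions` sidecar file placed next to each plate's master file (the Bio-Formats mechanism for persisting reader options; the file name below is illustrative), after which the memo files are deleted and regenerated:

```
# experiment_descriptor.dat.bfoptions (name illustrative)
scanr.skip_missing_wells=false
```

Whether IDR applies this via sidecar files or another configuration path would need confirming against the deployment.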

Study publication: metadata unification

Status

The gallery UI work carried out in prod67 (see ome/design#100 and the image.sc post) also drove the re-annotation of published IDR studies. In particular, the Study Type and Study Public Release Date metadata fields were reviewed across all studies, and a new Sample Type field was added to classify each study as cell or tissue.

Metadata that was discussed but not fixed/rationalized in prod67 was the Publication Authors. At the moment, we support different naming schemes, and downstream consumers like the gallery UI need to handle these variants.

Proposal

All IDR studies with an associated peer-reviewed publication have a PubMed ID. A natural proposal is to unify the author naming scheme to comply with what PubMed stores.

To minimize the impact on submitters, templates should be updated with the recommended formatting for Study Author List values, i.e. LastName1 Initials1, LastName2 Initials2, .... The author list should be stored as a comma-separated list of authors, e.g.

Walther N, Hossain MJ, Politi AZ, Koch B, Kueblbeck M, Ødegård-Fougner Ø, Lampe M, Ellenberg J
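A minimal validation sketch for this convention (the regex and function name are illustrative, not part of the IDR tooling):

```python
import re

# One author = "LastName Initials": one or more name words followed by
# 1-3 uppercase letters. Unicode-aware so "Ødegård-Fougner Ø" passes.
AUTHOR = re.compile(r"^[\w'\-]+(?: [\w'\-]+)* [^\W\da-z_]{1,3}$")

def valid_author_list(value):
    """Check a comma-separated 'Study Author List' value."""
    return all(AUTHOR.match(a.strip()) for a in value.split(","))
```

This accepts the PubMed-style example above and rejects "LastName, Initials." variants.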

Validation

The NCBI API can be used to validate much of the publication metadata (title, authors, PMC ID and DOI if applicable) given a PubMed ID:

    # Requires `requests`; assumes `self.study` and `self.log` from the
    # surrounding checker class.
    def validate_publications(self):
        URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"
        QUERY = "?db=pubmed&id=%s&retmode=json"

        for publication in self.study["Publications"]:
            if "PubMed ID" not in publication:
                continue
            json = requests.get(URL + QUERY % publication["PubMed ID"]).json()
            result = json['result'][publication["PubMed ID"]]

            self.log.debug("Validating publication title")
            assert publication["Title"] == result['title'], "%s != %s" % (
                publication["Title"], result['title'])

            self.log.debug("Validating publication authors")
            # Compare against the names returned by esummary; the study
            # key ("Authors") depends on the annotation schema
            authors = ", ".join(a['name'] for a in result['authors'])
            assert publication["Authors"] == authors, "%s != %s" % (
                publication["Authors"], authors)

            # Validate PMC ID and DOI if present
            for articleid in result['articleids']:
                articleids_map = {"pmc": "PMC ID", 'doi': "DOI"}
                if articleid['idtype'] in articleids_map:
                    study_key = articleids_map[articleid['idtype']]
                    self.log.debug("Validating %s" % study_key)
                    assert publication[study_key] == articleid['value'], (
                        "%s != %s" % (
                            publication[study_key], articleid['value']))

Database and UI representation

At the moment, publications are included in the idr.openmicroscopy/study/info annotation as an ordered list of key/value pairs (Title, Authors, PubMed ID, PMC ID if applicable, DOI if applicable), one per publication:

Screen Shot 2019-06-18 at 14 33 46

In order for the gallery or any downstream application to consume this metadata effectively, we might need to rethink how to store and expose the publication metadata:

  • should authors be listed as one key/value pair with comma separated authors or one key/value pair per author?
  • should publications be moved to their own map annotation with an idr.openmicroscopy.org/study/publication namespace? Should multiple publications be combined into one annotation or kept as separate map annotations?
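The two storage options under discussion can be sketched as follows (the namespace is the one proposed above; the "Author"/"Authors" key names are illustrative):

```python
# Sketch of the two alternatives for representing authors as map
# annotation key/value pairs. Key names are illustrative only.
NS = "idr.openmicroscopy.org/study/publication"

authors = ["Walther N", "Hossain MJ", "Politi AZ"]

# Option 1: one key/value pair with a comma-separated author list
combined = [("Authors", ", ".join(authors))]

# Option 2: one key/value pair per author, preserving order
per_author = [("Author", a) for a in authors]
```

Option 1 keeps the annotation compact; option 2 makes individual authors directly queryable.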

idr0035-caie-drugresponse S-BIAD847

idr0035-caie-drugresponse

Sample plate conversion failed with:

(base) [dlindner@pilot-zarr2-dev idr0035]$ time /home/dlindner/bioformats2raw/bin/bioformats2raw Week2_24121.screen Week2_24121.ome.zarr
OpenJDK 64-Bit Server VM warning: You have loaded library /tmp/opencv_openpnp5524631638925736558/nu/pattern/opencv/linux/x86_64/libopencv_java342.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
Exception in thread "main" picocli.CommandLine$ExecutionException: Error while calling command (com.glencoesoftware.bioformats2raw.Converter@63a65a25): java.lang.NullPointerException
        at picocli.CommandLine.executeUserObject(CommandLine.java:1962)
        at picocli.CommandLine.access$1300(CommandLine.java:145)
        at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2352)
        at picocli.CommandLine$RunLast.handle(CommandLine.java:2346)
        at picocli.CommandLine$RunLast.handle(CommandLine.java:2311)
        at picocli.CommandLine$AbstractParseResultHandler.handleParseResult(CommandLine.java:2172)
        at picocli.CommandLine.parseWithHandlers(CommandLine.java:2550)
        at picocli.CommandLine.parseWithHandler(CommandLine.java:2485)
        at picocli.CommandLine.call(CommandLine.java:2761)
        at com.glencoesoftware.bioformats2raw.Converter.main(Converter.java:2192)
Caused by: java.lang.NullPointerException
        at ome.xml.meta.OMEXMLMetadataImpl.getWellSampleImageRef(OMEXMLMetadataImpl.java:5205)
        at com.glencoesoftware.bioformats2raw.Converter.hasValidPlate(Converter.java:2055)
        at com.glencoesoftware.bioformats2raw.Converter.convert(Converter.java:604)
        at com.glencoesoftware.bioformats2raw.Converter.call(Converter.java:516)
        at com.glencoesoftware.bioformats2raw.Converter.call(Converter.java:107)
        at picocli.CommandLine.executeUserObject(CommandLine.java:1953)

That looks unrelated to IDR/bioformats#29, doesn't it @sbesson?
