
Major refactoring in the Isobaric workflow enabling it to work similarly to proteomicsLFQ #692

Draft
ypriverol wants to merge 14 commits into dev from dev_IsobaricWorkflow

Conversation

@ypriverol
Member

@ypriverol ypriverol commented Apr 3, 2026

PR checklist

  • This comment contains a description of changes (with reason).
  • If you've fixed a bug or added code that should be tested, add tests!
  • If you've added a new tool - have you followed the pipeline conventions in the contribution docs
  • If necessary, also make a PR on the bigbio/quantms branch on the nf-core/test-datasets repository.
  • Make sure your code lints (nf-core pipelines lint).
  • Ensure the test suite passes (nextflow run . -profile test,docker --outdir <OUTDIR>).
  • Check for unexpected warnings in debug mode (nextflow run . -profile debug,test,docker --outdir <OUTDIR>).
  • Usage Documentation in docs/usage.md is updated.
  • Output Documentation in docs/output.md is updated.
  • CHANGELOG.md is updated.
  • README.md is updated (including new tool citations and authors/contributors).

Summary by CodeRabbit

  • New Features
    • New isobaric mass spectrometry analysis workflow supporting multiple labeling methodologies.
    • Added configurable analysis type parameter for customizing quantification approach (defaults to itraq4plex).
    • Refactored TMT workflow with optimized processing and result output handling.

@ypriverol ypriverol marked this pull request as draft April 3, 2026 10:15
@coderabbitai
Contributor

coderabbitai bot commented Apr 3, 2026

Important

Review skipped

Draft detected.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 9899373a-6e6d-481b-b369-d74f7186cd92

📝 Walkthrough

Walkthrough

A new ISOBARIC_WORKFLOW process is introduced to handle isobaric-labeled mass spectrometry analysis, replacing the prior feature-mapping and inference pipeline. The TMT workflow is refactored to integrate this module with a MSSTATS_CONVERTER component, and supporting configuration and metadata files are added.

Changes

Cohort / File(s) Summary
Configuration Updates
conf/modules/modules.config, nextflow.config
Modified publishing configuration for ISOBARIC_WORKFLOW process to include quant_tables output; added new params.type parameter with default value 'itraq4plex'.
New ISOBARIC_WORKFLOW Module
modules/local/openms/isobaric_workflow/main.nf, modules/local/openms/isobaric_workflow/meta.yml
Implemented new ISOBARIC_WORKFLOW process that sorts input .mzML and .idXML files by normalized base name and invokes IsobaricWorkflow command with parameterized arguments; produces .mzTab and .consensusXML outputs along with logs and version manifest.
TMT Workflow Refactoring
workflows/tmt.nf
Replaced FILE_MERGE, FEATURE_MAPPER, PROTEIN_INFERENCE, and PROTEIN_QUANT modules with ISOBARIC_WORKFLOW; integrated MSSTATS_CONVERTER to consume ISOBARIC_WORKFLOW consensusXML output; updated pipeline inputs, outputs, and version tracking accordingly.
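
The rewired TMT flow described above can be sketched as follows. This is illustrative wiring only: the channel names and argument lists are assumptions based on the walkthrough, not the exact signatures in workflows/tmt.nf.

```nextflow
// Sketch of the refactored quantification path (names are assumptions).
ISOBARIC_WORKFLOW(ch_mzmls_sorted, ch_id_files, ch_expdesign)

// The consensusXML emitted by ISOBARIC_WORKFLOW feeds the MSstats converter.
MSSTATS_CONVERTER(ISOBARIC_WORKFLOW.out.out_consensusXML, ch_expdesign)

// The converter output becomes the formatted input for (optional) MSstats analysis.
MSSTATS_TMT(MSSTATS_CONVERTER.out.out_msstats)
```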

Sequence Diagram(s)

sequenceDiagram
    participant ch_file_prep as File Preparation
    participant ch_id as ID Module
    participant ch_join as Data Join
    participant isobaric as ISOBARIC_WORKFLOW
    participant converter as MSSTATS_CONVERTER
    participant msstats as MSstats

    ch_file_prep->>ch_join: mzmls (prepared spectra)
    ch_id->>ch_join: id_results (identifications)
    ch_join->>isobaric: joined mzmls & id_files
    ch_join->>isobaric: expdesign (experimental design)
    
    isobaric->>isobaric: sort by normalized base name
    isobaric->>isobaric: run IsobaricWorkflow command
    isobaric->>converter: out_consensusXML (quantification)
    
    converter->>converter: convert consensus XML
    converter->>msstats: out_msstats (formatted input)
    msstats->>msstats: statistical analysis (optional)

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Poem

🐰 A workflow springs to life, labeled and bright,
With isobaric magic, our quantitation takes flight,
New modules integrated with parsimony and grace,
Converting consensus and data apace! ✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: the title accurately describes the main change, a major refactoring of the Isobaric workflow to align its functionality with proteomicsLFQ, which is reflected across the changes (new ISOBARIC_WORKFLOW module, TMT workflow restructuring, and configuration updates).
  • Docstring Coverage ✅ Passed: no functions found in the changed files to evaluate docstring coverage, so the check was skipped.



@qodo-code-review
Contributor

Review Summary by Qodo

Refactor TMT workflow to use unified IsobaricWorkflow module

✨ Enhancement


Walkthroughs

Description
• Refactored TMT workflow to use IsobaricWorkflow instead of separate modules
• Replaced FEATURE_MAPPER, FILE_MERGE, PROTEIN_INFERENCE, PROTEIN_QUANT with unified
  IsobaricWorkflow
• Added MSSTATS_CONVERTER module for consensus XML to mzTab conversion
• Simplified workflow pipeline architecture for isobaric quantification
Diagram
flowchart LR
  ID["ID Subworkflow"]
  ISO["ISOBARIC_WORKFLOW"]
  CONV["MSSTATS_CONVERTER"]
  MSSTATS["MSSTATS_TMT"]
  
  ID -- "id_results" --> ISO
  ISO -- "consensusXML" --> CONV
  CONV -- "msstats_csv" --> MSSTATS
  ISO -- "mzTab" --> RESULT["Final Results"]


File Changes

1. conf/modules/modules.config ⚙️ Configuration changes +1/-1

Add ISOBARIC_WORKFLOW to publishDir configuration

• Added ISOBARIC_WORKFLOW to the publishDir configuration for result tables
• Ensures isobaric workflow outputs are published to quant_tables directory alongside other
 quantification results

conf/modules/modules.config


2. modules/local/openms/isobaric_workflow/main.nf ✨ Enhancement +64/-0

New IsobaricWorkflow process module implementation

• New process implementing OpenMS IsobaricWorkflow tool
• Handles isobaric labeling extraction and normalization from LC-MS/MS experiments
• Accepts mzML spectra, idXML identifications, and experimental design files
• Outputs mzTab and consensusXML files with version tracking
• Includes file sorting logic to ensure consistent input ordering

modules/local/openms/isobaric_workflow/main.nf


3. modules/local/openms/isobaric_workflow/meta.yml 📝 Documentation +42/-0

New IsobaricWorkflow module metadata documentation

• New metadata file documenting IsobaricWorkflow module
• Defines input/output specifications for mzML, idXML, and experimental design files
• Documents output formats including mzTab and consensusXML
• Includes tool documentation links and author attribution

modules/local/openms/isobaric_workflow/meta.yml


4. nextflow.config ⚙️ Configuration changes +3/-0

Add IsobaricWorkflow configuration parameters

• Added new parameter section for IsobaricWorkflow flags
• Introduced type parameter with default value itraq4plex for isobaric labeling type
 configuration

nextflow.config


5. workflows/tmt.nf ✨ Enhancement +20/-25

Refactor TMT workflow to use IsobaricWorkflow module

• Removed imports for FILE_MERGE, FEATURE_MAPPER, PROTEIN_INFERENCE, PROTEIN_QUANT modules
• Added imports for ISOBARIC_WORKFLOW and MSSTATS_CONVERTER modules
• Replaced multi-step quantification pipeline with single ISOBARIC_WORKFLOW call
• Simplified channel operations by removing intermediate processing steps
• Updated workflow outputs to use ISOBARIC_WORKFLOW and MSSTATS_CONVERTER results
• Streamlined post-processing to directly use MSSTATS_CONVERTER output for MSSTATS_TMT

workflows/tmt.nf
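
The import swap described above would look roughly like this. The removed include paths are assumptions inferred from the naming pattern of the new modules; only isobaric_workflow and msstats_converter paths appear in this PR.

```nextflow
// Hedged sketch of the include-line changes in workflows/tmt.nf.
// removed (paths assumed from the module naming convention):
// include { FILE_MERGE        } from '../modules/local/openms/file_merge/main'
// include { FEATURE_MAPPER    } from '../modules/local/openms/feature_mapper/main'
// include { PROTEIN_INFERENCE } from '../modules/local/openms/protein_inference/main'
// include { PROTEIN_QUANT     } from '../modules/local/openms/protein_quant/main'
// added:
include { ISOBARIC_WORKFLOW } from '../modules/local/openms/isobaric_workflow/main'
include { MSSTATS_CONVERTER } from '../modules/local/openms/msstats_converter/main'
```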



@qodo-code-review
Contributor

qodo-code-review bot commented Apr 3, 2026

Code Review by Qodo

🐞 Bugs (2) 📘 Rule violations (0) 📎 Requirement gaps (0) 🎨 UX Issues (0)



Action required

1. Wrong isobaric type default 🐞 Bug ≡ Correctness
Description
The TMT workflow now runs ISOBARIC_WORKFLOW, but the -type passed to OpenMS is taken from
params.type which defaults to itraq4plex, so TMT experiments will be processed with the wrong
isobaric labeling type unless the user overrides it.
Code

nextflow.config[R181-183]

+    // IsobaricWorkflow flags
+    type                     = 'itraq4plex'
+
Evidence
The pipeline routes isobaric-labelled experiments into the TMT workflow based on
meta.labelling_type containing tmt/itraq, but ISOBARIC_WORKFLOW’s -type is not derived from
that value and instead uses a new global default of itraq4plex. The schema already defines the
canonical label choices under labelling_type (including TMT variants), making the current
default/type wiring a high-likelihood silent-results error for TMT runs.

nextflow.config[181-183]
modules/local/openms/isobaric_workflow/main.nf[42-55]
workflows/quantms.nf[70-75]
nextflow_schema.json[898-918]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
The new `ISOBARIC_WORKFLOW` process passes `-type ${params.type}` to OpenMS, while `params.type` defaults to `itraq4plex`. The pipeline selects the isobaric (TMT/iTRAQ) workflow based on `meta.labelling_type`, so TMT runs will be quantified with the wrong label type unless users manually override `params.type`.

### Issue Context
The schema already defines allowed isobaric label types via `labelling_type` (tmt6/10/11/16, itraq4/8). The tool’s `-type` should be driven by that value (or by the SDRF-derived `meta.labelling_type`) to avoid silent scientific mis-configuration.

### Fix Focus Areas
- nextflow.config[181-183]
- modules/local/openms/isobaric_workflow/main.nf[42-55]
- workflows/tmt.nf[44-54]
- nextflow_schema.json[898-918]

### Suggested fix approach
- Prefer using the existing `params.labelling_type` (and/or the SDRF-derived `meta.labelling_type`) as the source of truth for the OpenMS `-type` argument.
- Option A (minimal): rename `params.type` to `params.labelling_type` usage inside `ISOBARIC_WORKFLOW`, and remove/avoid the new `params.type` default.
- Option B (more robust): add an explicit `val label_type` input to `ISOBARIC_WORKFLOW` and pass the experiment’s label type from the workflow (ensuring it matches the single label-type-per-design invariant).
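
A minimal sketch of Option B, assuming an explicit `val` input. The `label_type` name, channel shape, and trimmed argument list below are illustrative, not taken from the PR.

```nextflow
// Option B sketch: make the label type an explicit process input instead of
// reading the global params.type default. Names below are illustrative.
process ISOBARIC_WORKFLOW {
    input:
    path(mzmls)
    path(id_files)
    path(expdes)
    val(label_type)    // e.g. 'tmt10plex', derived from meta.labelling_type

    script:
    """
    IsobaricWorkflow -type ${label_type} ...  # remaining arguments unchanged
    """
}

// In workflows/tmt.nf, pass the SDRF-derived label type once per design, e.g.:
// ISOBARIC_WORKFLOW(ch_mzmls, ch_ids, ch_expdesign, ch_meta.map { it.labelling_type })
```

This keeps the single-label-type-per-design invariant visible at the call site instead of hiding it in a global default.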




Remediation recommended

2. Broken module meta.yml inputs 🐞 Bug ⚙ Maintainability
Description
The new isobaric_workflow meta.yml incorrectly defines input files under the tools section
instead of an input: section, so module metadata consumers (lint/docs) cannot correctly interpret
the module interface.
Code

modules/local/openms/isobaric_workflow/meta.yml[R6-23]

+tools:
+  - IsobaricWorkflow:
+      description: |
+        Extracts and normalizes isobaric labeling information from an LC-MS/MS experiment.
+      homepage: https://abibuilder.cs.uni-tuebingen.de/archive/openms/Documentation/release/latest/html/TOPP_IsobaricWorkflow.html
+      documentation: https://abibuilder.cs.uni-tuebingen.de/archive/openms/Documentation/release/latest/html/TOPP_IsobaricWorkflow.html
+  - mzmls:
+      type: file
+      description: Input Spectra in mzML format
+      pattern: "*.mzML"
+  - id_files:
+      type: file
+      description: Identifications in idXML or mzIdentML format with posterior error probabilities as score type.
+      pattern: "*.idXML"
+  - expdes:
+      type: file
+      description: An experimental design file
+      pattern: "*.tsv"
Evidence
Existing modules in this repo follow the tools: + input: + output: structure. The new
isobaric_workflow meta.yml places mzmls, id_files, and expdes under tools: (after the tool
definition), deviating from the established schema used elsewhere (e.g., MSSTATS_CONVERTER).

modules/local/openms/isobaric_workflow/meta.yml[1-24]
modules/local/openms/msstats_converter/meta.yml[6-22]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
`modules/local/openms/isobaric_workflow/meta.yml` is malformed: it lists `mzmls`, `id_files`, and `expdes` under `tools:` instead of defining them under an `input:` section. This breaks consistency with other module meta.yml files and can break tooling that reads module interfaces.

### Issue Context
Compare with `modules/local/openms/msstats_converter/meta.yml`, which correctly uses `input:` and `output:` sections.

### Fix Focus Areas
- modules/local/openms/isobaric_workflow/meta.yml[1-24]
- modules/local/openms/msstats_converter/meta.yml[6-22]

### Suggested fix approach
- Keep `tools:` containing only the `IsobaricWorkflow` tool metadata.
- Add a top-level `input:` section containing `mzmls`, `id_files`, `expdes` entries.
- Keep `output:` as-is, ensuring it matches what the module emits.
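
Applying the three bullets above, the corrected file would look roughly like this. The descriptions are copied from the diff in this comment; only the section nesting changes.

```yaml
tools:
  - IsobaricWorkflow:
      description: |
        Extracts and normalizes isobaric labeling information from an LC-MS/MS experiment.
      homepage: https://abibuilder.cs.uni-tuebingen.de/archive/openms/Documentation/release/latest/html/TOPP_IsobaricWorkflow.html
      documentation: https://abibuilder.cs.uni-tuebingen.de/archive/openms/Documentation/release/latest/html/TOPP_IsobaricWorkflow.html

input:
  - mzmls:
      type: file
      description: Input Spectra in mzML format
      pattern: "*.mzML"
  - id_files:
      type: file
      description: Identifications in idXML or mzIdentML format with posterior error probabilities as score type.
      pattern: "*.idXML"
  - expdes:
      type: file
      description: An experimental design file
      pattern: "*.tsv"
```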




@codacy-production

codacy-production bot commented Apr 3, 2026

Up to standards ✅

🟢 Issues: 0 new issues

View in Codacy

TIP: This summary will be updated as you push new changes.

ypriverol and others added 2 commits April 3, 2026 11:17
…rection

- Use labelling_type from SDRF meta (auto-detected) instead of
  hardcoded params.type='itraq4plex' default
- Pass isotope correction matrix when params.isotope_correction is
  enabled, matching the IsobaricAnalyzer module behavior
- Pass extraction parameters (min_precursor_purity,
  precursor_isotope_deviation, min_reporter_intensity) that were
  missing from the IsobaricWorkflow invocation
- Remove unused 'type' parameter from nextflow.config

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
fix: IsobaricWorkflow - auto-detect labelling type, pass isotope correction and extraction params
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@modules/local/openms/isobaric_workflow/main.nf`:
- Around line 52-53: The command only forwards -picked_fdr and
-picked_decoy_string but omits the decoy-position option, so non-default
suffix/prefix decoy setups are ignored; update the invocation that sets
-picked_fdr ${params.picked_fdr} and -picked_decoy_string ${params.decoy_string}
to also pass the decoy position parameter (e.g., add the corresponding flag with
${params.decoy_string_position}) so the pipeline honors
params.decoy_string_position when running the picked-FDR step.
- Around line 23-39: The code rebuilds pairings by independently sorting mzmls
and id_files into mzml_sorted and id_sorted using extractBaseName, which can
silently mispair files if normalization misses a name; after computing
mzml_sorted and id_sorted, add a fast-fail check: ensure sizes match and then
iterate index-wise comparing extractBaseName(mzml_sorted[i].name) to
extractBaseName(id_sorted[i].name), and if any mismatch occurs throw an
exception (or call error/exit) with a clear message listing the offending
pair(s) and their original names so the pipeline fails immediately rather than
proceeding with wrong ID/mzML pairing.

In `@nextflow.config`:
- Around line 181-182: The repo adds a separate params.key "type" (value
'itraq4plex') which is unused by existing runs that set params.labelling_type,
because the workflow reads params.type; fix by either removing the duplicated
"type" setting from the config or making it derive from the existing
labelling_type (e.g., set type = params.labelling_type ?: 'itraq4plex'), or
change the workflow to read params.labelling_type instead of params.type; update
references to use the single canonical symbol (labelling_type) so existing CLI
flags continue to work.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 2aafdde8-9873-46f2-8b2e-d277e8e2568d

📥 Commits

Reviewing files that changed from the base of the PR and between e719f43 and fc363bb.

📒 Files selected for processing (5)
  • conf/modules/modules.config
  • modules/local/openms/isobaric_workflow/main.nf
  • modules/local/openms/isobaric_workflow/meta.yml
  • nextflow.config
  • workflows/tmt.nf

Comment on lines +23 to +39
def extractBaseName = { filename ->
    def name = filename.toString()
    name = name.replaceAll(/\.mzML$/, '')

    if (name.endsWith('.idXML')) {
        name = name.replaceAll(/\.idXML$/, '')
        name = name.replaceAll(/_(comet|msgf|sage|consensus)(_perc)?(_filter)?(_fdr)?$/, '')
    }
    return name
}

def mzml_sorted = mzmls.collect().sort{ a, b ->
    extractBaseName(a.name) <=> extractBaseName(b.name)
}
def id_sorted = id_files.collect().sort{ a, b ->
    extractBaseName(a.name) <=> extractBaseName(b.name)
}
Contributor


⚠️ Potential issue | 🔴 Critical

Fail fast if the mzML/idXML pairing drifts.

Upstream these files are already matched by sample, but this process rebuilds that pairing by sorting both lists independently and then passing them positionally to -in / -in_id. If a filename falls outside the current normalization regex, the command still runs but can quantitate against the wrong ID file.

🛠️ Suggested guard
     def mzml_sorted = mzmls.collect().sort{ a, b ->
         extractBaseName(a.name) <=> extractBaseName(b.name)
     }
     def id_sorted = id_files.collect().sort{ a, b ->
         extractBaseName(a.name) <=> extractBaseName(b.name)
     }
+    def mzml_names = mzml_sorted.collect { extractBaseName(it.name) }
+    def id_names   = id_sorted.collect { extractBaseName(it.name) }
+    if (mzml_names != id_names) {
+        throw new IllegalArgumentException("Mismatched IsobaricWorkflow inputs: mzML=${mzml_names} idXML=${id_names}")
+    }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@modules/local/openms/isobaric_workflow/main.nf` around lines 23 - 39, The
code rebuilds pairings by independently sorting mzmls and id_files into
mzml_sorted and id_sorted using extractBaseName, which can silently mispair
files if normalization misses a name; after computing mzml_sorted and id_sorted,
add a fast-fail check: ensure sizes match and then iterate index-wise comparing
extractBaseName(mzml_sorted[i].name) to extractBaseName(id_sorted[i].name), and
if any mismatch occurs throw an exception (or call error/exit) with a clear
message listing the offending pair(s) and their original names so the pipeline
fails immediately rather than proceeding with wrong ID/mzML pairing.

Comment on lines +52 to +53
-picked_fdr ${params.picked_fdr} \\
-picked_decoy_string ${params.decoy_string} \\
Contributor


⚠️ Potential issue | 🟠 Major

Propagate the decoy-position setting in this new path.

The pipeline already exposes params.decoy_string_position, but this command only forwards the decoy string. Any non-default suffix-decoy setup is ignored here, so the isobaric workflow no longer honors the full picked-FDR decoy configuration.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@modules/local/openms/isobaric_workflow/main.nf` around lines 52 - 53, The
command only forwards -picked_fdr and -picked_decoy_string but omits the
decoy-position option, so non-default suffix/prefix decoy setups are ignored;
update the invocation that sets -picked_fdr ${params.picked_fdr} and
-picked_decoy_string ${params.decoy_string} to also pass the decoy position
parameter (e.g., add the corresponding flag with
${params.decoy_string_position}) so the pipeline honors
params.decoy_string_position when running the picked-FDR step.

@github-actions

github-actions bot commented Apr 3, 2026

This PR is against the master branch ❌

  • Do not close this PR
  • Click Edit and change the base to dev
  • This CI test will remain failed until you push a new commit

Hi @ypriverol,

It looks like this pull request has been made against the bigbio/quantms master branch.
The master branch on nf-core repositories should always contain code from the latest release.
Because of this, PRs to master are only allowed if they come from the bigbio/quantms dev branch.

You do not need to close this PR, you can change the target branch to dev by clicking the "Edit" button at the top of this page.
Note that even after this, the test will continue to show as failing until you push a new commit.

Thanks again for your contribution!

@github-actions

github-actions bot commented Apr 3, 2026

nf-core pipelines lint overall result: Passed ✅ ⚠️

Posted for pipeline commit 69d5c61

  • ✅ 112 tests passed
  • ❔ 19 tests were ignored
  • ❗ 3 tests had warnings
Details

❗ Test warnings:

  • pipeline_todos - TODO string in nextflow.config: Specify any additional parameters here
  • pipeline_if_empty_null - ifEmpty(null) found in /home/runner/work/quantms/quantms/subworkflows/local/dda_id/main.nf: ch_software_versions = ch_software_versions.mix(PHOSPHO_SCORING.out.versions.ifEmpty(null))
  • pipeline_if_empty_null - ifEmpty(null) found in /home/runner/work/quantms/quantms/subworkflows/local/id/main.nf: ch_software_versions = ch_software_versions.mix(PHOSPHO_SCORING.out.versions.ifEmpty(null))


Run details

  • nf-core/tools version 3.5.1
  • Run at 2026-04-03 15:30:57

@qodo-code-review
Contributor

qodo-code-review bot commented Apr 3, 2026

CI Feedback 🧐

(Feedback updated until commit 69d5c61)

A test triggered by this PR failed. Here is an AI-generated analysis of the failure:

Action: Docker Tests (latest-everything, test_dda_id_alphapeptdeep)

Failed stage: Run pipeline with test data in docker profile (master branch) [❌]

Failed test name: ""

Failure summary:

The action failed because the Nextflow run exited with code 1 due to a configuration parsing error in modules.config.
  • Nextflow reported: Error modules.config:86:1: If statements cannot be mixed with config statements.
  • The offending code starts at modules.config line 86: if (params.mzml_features) { ... }, which is not allowed at that location in the config file, causing ERROR ~ Config parsing failed.
  • Follow-on steps to collect failed logs also errored because the pipeline never produced ${TEST_PROFILE}_${EXEC_PROFILE}_results/pipeline_info/execution_trace.txt, so grep, ls, and cp could not find expected files.
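
One hedged way to resolve the modules.config:86 error above is to fold the condition into the setting itself rather than using a top-level if block; the selector and directory names below are illustrative, not taken from the repository.

```nextflow
// Instead of:
//   if (params.mzml_features) { process { ... } }
// gate the publish rule inside the process scope, which the strict config
// parser accepts. Selector and path names here are assumptions.
process {
    withName: '.*:FEATURE_EXTRACTION' {
        publishDir = [
            path: { "${params.outdir}/features" },
            enabled: params.mzml_features    // condition moved out of the top-level `if`
        ]
    }
}
```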

Relevant error logs:

2169:  N E X T F L O W  ~  version 26.03.1-edge
2170:  Error modules.config:86:1: If statements cannot be mixed with config statements
2171:  │  84 |
2172:  │  85 | // Set default publish directory for all features tables
2173:  │  86 | if (params.mzml_features) {
2174:  │     | ^^^^^^^^^^^^^^^^^^^^^^^^^^^
2175:  │  87 |     process {
2176:  ╰  88 |
2177:  ERROR ~ Config parsing failed
2178:  -- Check '.nextflow.log' file for details
2179:  ##[error]Process completed with exit code 1.
...

2195:  grep: test_dda_id_alphapeptdeep_docker_results/pipeline_info/execution_trace.txt: No such file or directory
2196:  ls: cannot access 'work/*/*.log': No such file or directory
2197:  cp: missing destination file operand after 'failed_logs/'
...

2217:  ##[warning]No files were found with the provided path: failed_logs. No artifacts will be uploaded.

@jpfeuffer
Collaborator

Why do you merge against master?

@ypriverol ypriverol changed the base branch from master to dev April 4, 2026 05:46