Add editable XML params for pending jobs#115
Closed
martinemnoble1 wants to merge 810 commits into main from
Conversation
- Fix typo: apppendErrorReport → appendErrorReport in acedrg.py and acedrgNew.py
- Add missing return statements in lidiaAcedrgNew.py after failed status

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Update ACR from ccp4acrnekmay (North Europe) to ccp4acrukbwmx (UK South)
- Update azure_extensions path from Docker/azure to Docker/azure-uksouth
- Add BASE_IMAGE_NAME=ccp4i2/base-arpwarp for layered image build
- Update docker-compose.bundled-ccp4.yml with correct build instructions
- Comment out old Azure Files config (no longer needed with bundled CCP4)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add Microsoft Teams SSO authentication support with MSAL
- Add CORP middleware for Cross-Origin headers (COEP/COOP)
- Add supplier management with user ownership tracking
- Extend assay serializers with data series and aggregations
- Add privacy policy and terms of service pages
- Simplify Docker configuration for single-stage builds
- Various frontend improvements for compounds registry

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
When embedded in Teams, client-side navigation doesn't trigger fresh headers. Opening Moorhen in a new tab ensures COOP/COEP headers are applied, enabling SharedArrayBuffer for WebAssembly.

- Change router.push to window.open for all Moorhen links
- Add CORP header to UniProt proxy responses

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Use local state for the text field to prevent re-renders while typing. Server sync is debounced to 500ms after typing stops. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The pattern /:path*.js didn't match root-level files like /moorhen.js. Added explicit patterns for root-level .js and .wasm files, plus .data and .gz files for Moorhen's rotamer data. Also added "All Apps" navigation button to Targets page. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add comprehensive COEP compatibility for Moorhen molecular viewer in web deployment while maintaining Electron app functionality.

Key changes:
- Add API route (/api/moorhen/) to serve moorhen files with CORP headers
- Add middleware for COEP/COOP headers on moorhen-page routes only
- Patch Worker constructor to use blob URLs in COEP context
- Inline moorhen.js into CootWorker.js to avoid importScripts failures
- Provide absolute URLs via locateFile for blob worker context
- Fix Electron detection (electronAPI vs electron)
- Add Moorhen Viewer link to app-selector page
- Update CSP to allow blob: in script-src and worker-src

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The Moorhen page requires COEP/COOP headers which break the opener relationship with the parent page. Opening in a new tab avoids issues when navigating from non-COEP pages. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add theme toggle button to compounds PageHeader (sun/moon icon)
- Create CompoundsThemeProvider for standalone mode with light/dark support
- Create re-export in client/renderer for Docker overlay compatibility
- Fix CCP4i2ThemeProvider to not overwrite localStorage on initial mount
- Add CssBaseline to ensure proper background colors in dark mode
- Remove conflicting CSS color variables from globals.css
- Add Home breadcrumb to supplier detail page
- Update Dockerfile to use parent's theme-provider re-export

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
These legal pages were in the compounds frontend but not being copied during the Docker build process. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Fix file download auth in Azure by checking access_token query param (proxy routes now check Authorization header, Easy Auth header, and query parameter for downloads via anchor clicks)
- Auto-close batch import dialog on success with popcorn notification
- Add atom selection field to coords import dialog for excluding ligands
- Fix SubstituteLigand NCS check with graceful fallback for older coot

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
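The three-way token lookup described above can be sketched as follows. This is a minimal illustration, not the proxy route's actual code: the Easy Auth header name and the helper signature are assumptions modelled on the commit message.

```python
def extract_access_token(headers: dict, query: dict):
    """Token lookup order for proxy download routes (sketch).

    Anchor-click downloads cannot attach headers, hence the
    query-parameter fallback at the end."""
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    # Azure Easy Auth injects the user's token as a request header
    # (assumed header name for illustration):
    easy_auth = headers.get("X-MS-TOKEN-AAD-ACCESS-TOKEN")
    if easy_auth:
        return easy_auth
    return query.get("access_token")
```

Checking the sources in this order means an explicit Authorization header always wins, while plain `<a href>` downloads still authenticate via the URL.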
fit_ligand returns fit_ligand_info_t objects, not raw molecule indices. Extract the imol attribute from each object before passing to merge_molecules. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
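The attribute extraction amounts to this (a sketch using stand-in objects; real `fit_ligand_info_t` instances come from coot's headless API, which isn't imported here):

```python
from types import SimpleNamespace

def imols_from_fit_results(results):
    # fit_ligand returns fit_ligand_info_t objects, not raw molecule
    # indices; pull out .imol before handing the list to merge_molecules.
    return [r.imol for r in results]

# Stand-ins for fit_ligand_info_t objects returned by fit_ligand:
fits = [SimpleNamespace(imol=5), SimpleNamespace(imol=6)]
imols = imols_from_fit_results(fits)
```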
- Add migrate-media.sh script to copy media files from legacy Azure File Share to new Blob storage with path transformations:
  - RegBatchQCFile_NCL-* directories relocated to RegisterCompounds/BatchQCFiles/NCL-*
- Update import_legacy_compounds.py to transform batch QC file paths in database fixtures to match the new storage location
- Update _batch_qc_path() in registry models to use new path pattern for newly uploaded batch QC files: RegisterCompounds/BatchQCFiles/{id}/
- Fix SMILES rendering in campaign member project table by using the proper SmilesView component that accesses RDKit via React context instead of the broken window.RDKit lookup

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add useMemo import to campaigns-api.ts (was causing build error)
- Memoize smilesMap in useSmilesLookup to prevent recreation on every render
- Sort and dedupe regIds for stable SWR cache keys
- Add SMILES tooltip to SmilesView component
- Add NCL ID tooltip to campaign detail rows for debugging SMILES mismatch
- Remove duplicate tooltip wrapper from page.tsx MemberProjectRow

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
SmilesView was using useCCP4i2Window() which returns CCP4i2Context, but that context's rdkitModule is never populated. The RDKit module is loaded by RDKitProvider in the root layout and accessible via the useRDKit hook. This fixes the issue where SMILES molecules showed an animated skeleton placeholder instead of the rendered structure. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Adds a Django management command to import legacy ConstructDatabase
fixtures (JSON format) into the new compounds.constructs schema.
Model mappings:
- ConstructDatabase.project -> constructs.constructproject
- ConstructDatabase.plasmid -> constructs.plasmid
- ConstructDatabase.protein -> constructs.protein
- ConstructDatabase.proteinsynonym -> constructs.proteinsynonym
- ConstructDatabase.proteinuse -> constructs.proteinuse
- ConstructDatabase.cassette -> constructs.cassette
- ConstructDatabase.cassetteuse -> constructs.cassetteuse
- ConstructDatabase.sequencingresult -> constructs.sequencingresult
- ConstructDatabase.expressiontagtype -> constructs.expressiontagtype
- ConstructDatabase.protease -> constructs.protease
- ConstructDatabase.expressiontaglocation -> constructs.expressiontaglocation
- ConstructDatabase.expressiontag -> constructs.expressiontag
Features:
- Transforms camelCase field names to snake_case
- Handles timezone-naive datetime conversion
- Respects FK dependency order during import
- Supports --dry-run and --verbose flags
- Optional --output-dir for fixture inspection
Usage:
python manage.py import_legacy_constructs \
--constructs-fixture ConstructDatabase.json \
--dry-run --verbose
This complements the existing migrate_legacy_constructs.py command
which handles direct SQLite database migration.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
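The camelCase-to-snake_case and timezone-naive datetime handling mentioned in the features list could look like this. It is a sketch of the described behaviour, not the management command's actual code; in particular, assuming UTC for naive timestamps is an illustrative choice.

```python
import re
from datetime import datetime, timezone

def camel_to_snake(name: str) -> str:
    """e.g. 'expressionTagType' -> 'expression_tag_type'"""
    # Insert an underscore before every uppercase letter except the first.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def to_aware_utc(value: str) -> datetime:
    """Legacy fixtures hold timezone-naive timestamps; assume UTC (an
    assumption here, not necessarily what the command does)."""
    dt = datetime.fromisoformat(value)
    return dt if dt.tzinfo else dt.replace(tzinfo=timezone.utc)
```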
Security improvements:
- Add IsAuthenticated to FileViewSet endpoints (download, digest, preview)
- Add middleware marker (_ccp4i2_auth_middleware_ran) to prevent spoofing
- Always enable auth middleware (auto-assigns dev_admin in dev mode)
- Use apiFetch for project campaigns API call (fixes Azure 401 errors)

Admin page reorganization:
- Split admin page into /admin (users), /admin/import, /admin/docs
- Add MarkdownDoc component for rendering admin documentation
- Add admin-guide.md with CLI access instructions for Container Apps

File serving improvements:
- Add generate_download_sas_url() for secure blob downloads
- Update media_views to redirect to SAS URLs when Azure storage configured

Data management:
- Add backup_database.py management command
- Add deploy-backup-job.sh for scheduled Azure backups
- Add migrate-fixtures.sh for legacy data migration
- Update OPERATIONS.md with backup and migration procedures

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add original_filename field to BatchQCFile to preserve upload names
- Fix file downloads to use original filename in Content-Disposition
- Add file validation with extension and size limits:
  - QC files: 50MB, supports PDF/images/NMR/MS formats
  - Documents: 25MB, supports office/PDF/text formats
  - Data files: 100MB, supports Excel/CSV/JSON
  - GenBank: 10MB, supports .gb/.fasta formats
  - Sequencing: 5MB, supports .ab1/.seq formats
- Fix MEDIA_ROOT for local Docker to use mounted volume
- Fix proxy to handle file downloads and Azure SAS redirects
- Add delete functionality for batch QC files and protocol documents
- Fix ProtocolDocument download (removed invalid doc.title reference)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add support for:
- OpenDocument formats (.odt, .ods, .odp, .odg)
- RTF files
- Additional image formats (.bmp, .webp, .svg)
- ZIP archives
- More MS/chromatography formats (.mzml, .mzxml, .cdf, .aia)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…opulate

Phase 1 complete - local cache patching eliminates race conditions:
- Fix CList delete to use valueOfItem() for correct value extraction
- Add removeChildLookupEntries() for proper CList reindexing on delete
- Remove redundant mutateContainer() calls from shelx.tsx
- Add auto-populate wavelength when file is pre-set from pipeline identity
- Update comments to reflect local patching architecture

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The requiredContentFlag qualifier was being silently converted to None when it came as a list (e.g., [1, 2] for IPair/FPair). The old code tried int(list), which threw TypeError, was caught, and set the value to None. This caused IMEAN files (content=3) to be auto-populated into fields requiring only anomalous data (content=1 or 2).

Added _normalize_int_qualifier helper that properly handles:
- Lists (return as-is)
- Comma-separated strings (parse to list)
- Single int/string values (convert to int)
- None/empty (return None)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
With local parameter patching, forceUpdate calls now patch the SWR cache directly. The subsequent mutateContainer() calls were redundant and could cause race conditions by triggering full refetches.

- generic.tsx: Remove unused mutateContainer destructuring
- SubstituteLigand.tsx: Remove unused mutateContainer and api imports
- import_merged.tsx: Remove mutateContainer calls after forceUpdate batches

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…etches

The visibilitychange handler that called mutateContainer() when returning to a tab was causing cascading container refetches in complex task interfaces like import_merged. With local parameter patching, the cache is already kept in sync - we don't need to refetch on visibility change.

This handler was originally added to handle cases where users might edit in multiple tabs, but it was triggering dozens of redundant API calls when combined with task interface effects.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
CUnmergedDataContent.getColumnGroups() was grouping all columns with the same groupIndex (typically 0) into one group, then trying to exact-match the entire signature (e.g., "JQFWFPFPFPFPIRRRR") against patterns like "JQ". This failed for refined MTZ files, which have observations mixed with map coefficients.

Backend fix:
- Delegate CUnmergedDataContent.getColumnGroups() to CMtzData.getColumnGroups(), which uses pattern scanning (finds "JQ" within longer type strings)

Frontend improvements:
- Filter out columnGroups with empty/invalid columnGroupType
- Add fallback to frontend pattern matching if backend misses Obs groups
- Improve warning message to explain what data IS in the file when no observations are found (map coefficients, phases, FreeR)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Frontend:
- Add auto-selection of first observation group when digest is processed
- Sets HKLIN_OBS_COLUMNS and HKLIN_OBS_CONTENT_FLAG automatically

Backend:
- x2mtz: Derive contentFlag from column groups if HKLIN_OBS_CONTENT_FLAG is 0
- CCP4XtalData: Fix columnNames() to handle nested CInt values

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Replace ACR build tasks with local Docker builds using docker-compose for parallel builds. The web image now uses a cross-platform Dockerfile that runs npm install and next build natively on Apple Silicon, with only the final runtime stage targeting linux/amd64.

- Add Dockerfile.cross-platform with 3-stage build (native builder, native deps, emulated runner)
- Add docker-compose.yml for parallel local builds
- Rename build-and-push.sh to use local Docker (was ACR builds)
- Keep build-and-push-acr.sh as fallback for ACR-based builds

This significantly improves build times on Apple Silicon by avoiding QEMU emulation for the heavy npm/webpack operations.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…orhen pages

Replace WASM-based MTZ header parsing with a pure TypeScript implementation that works in all environments (including Azure, where SharedArrayBuffer isn't available). This enables MTZ file uploads on Azure without requiring COEP headers.

Changes:
- Add mtz-parser.ts: Pure TypeScript MTZ header parser handling byte order detection, Fortran 1-indexed offsets, and all standard header keywords
- Add mtz-parser.test.ts: Test suite for the parser
- Remove CootProvider from job and project layouts (keep only in moorhen-page)
- Update mtz-column-dialog.tsx to use native parser exclusively
- Add MTZ Header preview option to file context menu
- Remove cootModule dependency from import_merged, cminimtzdatafile, fetch-file-for-param, and file-preview-context

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
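The byte-order detection step such a parser needs can be sketched as follows (shown in Python for brevity rather than the commit's TypeScript). The "MTZ " magic string and the header-offset word follow the CCP4 MTZ layout; the plausibility heuristic for choosing the endianness is an assumption for illustration, not the mtz-parser.ts implementation.

```python
import struct

def detect_mtz_byte_order(data: bytes):
    """Return (struct byte-order prefix, header offset in 4-byte words)."""
    if data[:4] != b"MTZ ":
        raise ValueError("not an MTZ file")
    for endian in ("<", ">"):
        # Word 2 holds the 1-indexed (Fortran-style) header location.
        (offset,) = struct.unpack(endian + "i", data[4:8])
        if 0 < offset <= len(data) // 4 + 1:   # plausible in-file offset
            return endian, offset
    raise ValueError("could not determine byte order")
```

Reading the same four bytes under both byte orders and keeping the interpretation that yields an offset inside the file is a common trick when a format's machine stamp is awkward to decode directly.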
Add version tracking for both web and server deployments:
- Web: /api/version endpoint returning build timestamp and git commit
- Server: /api/ccp4i2/version/ endpoint with same info
- App-selector: "About" link in footer that toggles version display
- Build scripts: Pass BUILD_TIMESTAMP and GIT_COMMIT to Docker builds

Environment variables set during build:
- NEXT_PUBLIC_BUILD_TIMESTAMP / BUILD_TIMESTAMP
- NEXT_PUBLIC_GIT_COMMIT / GIT_COMMIT

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The deps stage was running on $BUILDPLATFORM (arm64 on Mac), which installed arm64-specific SWC binaries. These don't work when the runner stage targets linux/amd64, causing Next.js to fail at startup trying to download the missing SWC package.

Fix: Change the deps stage to use $TARGETPLATFORM so npm install gets the correct linux/amd64 SWC binaries. This uses QEMU emulation but is fast since it's only npm install, not the full build.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Split package.json modification into its own stage that both builder and deps stages depend on. This allows the deps stage (which runs on the emulated target platform) to be cached independently of source code changes.

Build flow:
1. Package stage: Modifies package.json (cached unless deps change)
2. Builder stage: npm install + next build (rebuilds on code changes)
3. Deps stage: npm install --omit=dev (cached when package.json unchanged)
4. Runner stage: Assembles final image

Previously, deps copied from builder, forcing it to wait for builder to complete. Now deps copies from package, so it can be cached even when source code changes trigger a builder rebuild.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The version endpoint is called from the app-selector page before login, so it needs to bypass authentication. Added PUBLIC_ENDPOINTS list for health and version routes that skip auth checks. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
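The allow-list check reduces to a path-prefix test along these lines (a sketch; the exact paths and the variable name PUBLIC_ENDPOINTS are assumptions based on the commit message):

```python
# Endpoints reachable before login (assumed paths for illustration):
PUBLIC_ENDPOINTS = ("/api/ccp4i2/health/", "/api/ccp4i2/version/")

def requires_auth(path: str) -> bool:
    """Skip auth checks for health and version routes; everything
    else still goes through the normal authentication middleware."""
    return not any(path.startswith(p) for p in PUBLIC_ENDPOINTS)
```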
…n splitHklout

When auto-inferring output column names, splitHklout was always using CONTENT_SIGNATURE_LIST[0] regardless of the output file's contentFlag. For CObsDataFile this meant always selecting the 4-column anomalous format even when the data was 2-column mean (F,SIGF), causing "Input and output column counts must match" errors during import_merged.

Now reads the contentFlag from the output file object and uses it as a 1-indexed offset into CONTENT_SIGNATURE_LIST.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
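The corrected lookup amounts to treating contentFlag as a 1-indexed offset. This sketch uses illustrative column labels ordered IPAIR, FPAIR, IMEAN, FMEAN; the real CONTENT_SIGNATURE_LIST lives on the file class.

```python
# Illustrative signatures, one per contentFlag (1-indexed):
CONTENT_SIGNATURE_LIST = [
    ["Iplus", "SIGIplus", "Iminus", "SIGIminus"],  # 1: IPAIR
    ["Fplus", "SIGFplus", "Fminus", "SIGFminus"],  # 2: FPAIR
    ["I", "SIGI"],                                  # 3: IMEAN
    ["F", "SIGF"],                                  # 4: FMEAN
]

def output_columns(content_flag: int):
    """Select by the file's contentFlag instead of always taking entry 0."""
    if not 1 <= content_flag <= len(CONTENT_SIGNATURE_LIST):
        raise ValueError(f"unexpected contentFlag: {content_flag}")
    return CONTENT_SIGNATURE_LIST[content_flag - 1]
```

With the old `[0]` lookup, a 2-column F,SIGF file would have been assigned the 4-column anomalous signature, which is exactly the column-count mismatch the commit describes.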
The "edit query" button on the target dashboard was only passing the target name to the query builder, losing protocols and other settings. Also added missing include_properties field to backend serializer. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Standardize all ViewSets to use class-level permission_classes = [IsAuthenticated]
- Fix URL prefix mismatch (tests now use /api/ccp4i2/ prefix)
- Add bypass_api_permissions fixture for test authentication
- Add skipUnless decorators for tests requiring test101 data
- Fix mkdir() calls to use exist_ok=True

Working tests:
- test_data_reduction_api.py: 5/5 passed
- test_utilities_api.py: 10/11 passed (1 skip - missing plugin)
- test_project_tag_api.py: 14/14 passed
- test_refinement_api.py: 5/5 passed

Tests skipped (require test101/ProjectZips data):
- test_api.py, test_job_utils.py, test_viewsets_comprehensive.py
- test_parameter_setting_api.py, test_job_execution_via_api.py

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Use BuildKit cache mounts to persist npm cache between builds, significantly reducing npm install time on subsequent builds. Also add git package required by some npm dependencies. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Merge django branch into docker-django-sync, reconciling divergent development since commit 8dbeef1 (Replacing Coot scripts with CHAPI).

Resolution strategy:
- compounds app: Keep docker (active development, django had stale snapshot)
- client/renderer: Keep docker (Paul's cleanup already incorporated)
- auth/settings: Keep docker (production security, backward-compatible)
- prosmart_refmac: Accept django (Paul's cleanup of unused params)
- pyproject.toml: Merged superset (python-docx + openpyxl)
- coot wrappers: Accept django (Paul's coot_headless_api migration)
- worker.py, requirements-azure: Keep docker (Azure deployment)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Belt-and-braces fix for job status tracking, where parent jobs would get stuck at RUNNING status while all child jobs had completed.

Root cause: Many pipelines called reportStatus() but didn't return a value from process(). This caused plugin.get_status() to return None, which track_job didn't handle, skipping the status update.

Infrastructure fixes:
- async_run_job.py: Explicitly update job status to FINISHED after track_job context exits successfully
- async_db_handler.py: Handle None status as SUCCEEDED in track_job

Pipeline fixes (add return statements to process()):
- aimless_pipe, import_merged, import_serial_pipe
- phaser_simple, LidiaAcedrgNew, pisapipe
- SubstituteLigand, PrepareDeposit
- import_xia2 and wrappers (xia2_integration, xia2_aimless, xia2_ctruncate, xia2_pointless)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
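The track_job side of the fix reduces to treating a missing return value as success. A sketch with illustrative stand-ins for the CPluginScript status constants:

```python
SUCCEEDED, FAILED = 0, 1  # illustrative stand-ins, not the real constants

def resolve_final_status(returned_status):
    """process() implementations that only call reportStatus() return
    None; treat that as SUCCEEDED instead of skipping the DB update."""
    if returned_status is None:
        return SUCCEEDED
    return returned_status
```

Mapping None to a definite terminal state is what stops a parent job from staying RUNNING forever when a pipeline forgets its return statement.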
The validity check now accepts MTZ files whose contentFlag can be converted to the required type, not just exact matches. For example, if a task requires FMEAN data but the input file has IPAIR data, validation now passes because IPAIR can be converted to FMEAN via French-Wilson conversion.

Changes:
- CObsDataFile: Add CAN_CONVERT_TO mapping defining conversion paths (IPAIR→FPAIR/IMEAN/FMEAN, FPAIR→FMEAN, IMEAN→FMEAN)
- cdata_file.py: Update validity() to check CAN_CONVERT_TO before reporting content type mismatch errors

This fixes validation failures in pipelines like Parrot that accept convertible input types.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
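The conversion-aware check can be sketched as below. The numeric contentFlag values follow the IPAIR=1 … FMEAN=4 convention implied elsewhere in this PR ("content=3" for IMEAN); treat them and the function name as assumptions.

```python
IPAIR, FPAIR, IMEAN, FMEAN = 1, 2, 3, 4  # assumed contentFlag codes

# Conversion paths from the commit message (e.g. French-Wilson for I -> F):
CAN_CONVERT_TO = {
    IPAIR: {FPAIR, IMEAN, FMEAN},
    FPAIR: {FMEAN},
    IMEAN: {FMEAN},
    FMEAN: set(),
}

def content_flag_valid(have: int, need: int) -> bool:
    """Accept an exact match or any reachable conversion target."""
    return have == need or need in CAN_CONVERT_TO.get(have, set())
```

Note the map is deliberately one-way: mean amplitudes (FMEAN) carry the least information, so nothing converts back out of them.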
The super() call used lowercase 'lidiaAcedrgNew' instead of 'LidiaAcedrgNew', causing NameError when running acedrg tests. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- shelxeMR.process() now returns the result of super().process()
- crank2.process() now returns proper CPluginScript status constants instead of returning the crank2 object or 0

These fixes ensure job status is properly tracked in the database.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The process() method called reportStatus() but didn't return the status value. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Django's TestCase uses database transactions which caused a TransactionManagementError during teardown. Using TransactionTestCase resolves this by not wrapping tests in transactions. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
COMP_BY, WAVELENGTH, LLGC_CYCLES, and ELEMENTS are in inputData, not controlParameters, per the phaser_EP.def.xml structure. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Import crank2 as a package in crank2_script.py (required for relative imports)
- Add fallback imports in crank2 modules for standalone/package compatibility:
  - inout.py: Try processes.convert, fall back to crank2.processes.convert
  - process.py: Try processes.X, fall back to crank2.processes.X
  - program.py: Try programs.X, fall back to crank2.programs.X
- Change verbose DEBUG logging to logger.debug in CCP4PluginScript.py

This allows crank2 to work both standalone (original behavior) and when imported as a package within ccp4i2.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Try standalone import first, fall back to package import for ccp4i2 context. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add pytestmark = pytest.mark.pipeline to all API test files that run actual crystallographic jobs. This allows running fast API/utility tests separately from slow pipeline tests:
- Fast tests only: pytest -m "not pipeline"
- Pipeline tests only: pytest -m pipeline

Marked files:
- test_data_reduction_api.py
- test_ep_pipelines_api.py
- test_model_building_api.py
- test_mr_pipelines_api.py
- test_refinement_api.py
- test_servalcat_async_run.py
- test_utilities_api.py

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove unnecessary ROOT_URLCONF override (already set in test_settings.py)
- Fix SimpleViewSetTests and ProjectExportViewSetTests to use API_PREFIX for URLs instead of bare paths like /fileuses/

This fixes 404 errors in these tests, since all API routes are mounted under /api/ccp4i2/.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Summary
Edit input_params.xml for pending jobs via the "Params as xml" tab in dev mode.

Changes

Server (server/ccp4i2/api/JobViewSet.py):
- Extend the params_xml endpoint to accept PUT requests that write input_params.xml

Client (client/renderer/components/job-view.tsx):
- Trigger a params_xml refresh after a successful save

Test plan
🤖 Generated with Claude Code