fix(rss): skip download when torrent already exists by infohash #715

Merged — s0up4200 merged 12 commits into main from feat/rss-infohash-precheck on Dec 10, 2025

Conversation

s0up4200 (Collaborator) commented Dec 8, 2025

When RSS automation finds candidates, check whether the feed item's infohash already exists on all candidate instances before downloading the torrent file. This prevents unnecessary downloads (and Prowlarr "Grabbed" notifications) when a torrent was already added via another source such as autobrr/IRC.

Also adds fallback extraction of infohash from magneturl attribute.

Indexers providing infohash via Prowlarr:

  • Generic Torznab (infohash attribute)
  • HDBits, BeyondHD, Avistaz/PrivateHD/ExoticaZ, BroadcastheNet
  • Knaben, TorrentsCSV
  • Cardigann (per-definition, with magnet fallback)
  • TorrentRssParser-based indexers (from magnet URL if present)

Not supported (no infohash in feed):

  • PassThePopcorn, UNIT3D-based trackers
  • Anidex, Animedia, SubsPlease, Shizaproject

Summary by CodeRabbit

  • New Features

    • Early pre-checks to detect existing torrents across targets (infohash or comment/URL match) and skip redundant downloads.
    • Improved infohash extraction that accepts magnet-URL sources as a fallback.
  • Bug Fixes

    • More robust validation and error handling for infohash checks, including better handling of cancellations and fallbacks.
  • Tests

    • Expanded unit tests for infohash extraction, magnet fallbacks, comment/URL detection, and cross-instance skip scenarios.
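
The skip decision summarized above boils down to an all-instances check. Here is a minimal Go sketch of that decision, using a stand-in for the service's per-instance HasTorrentByAnyHash lookup; the function names and signatures in this sketch are illustrative, not the repo's code:

```go
package main

import "fmt"

// hasTorrent is a stand-in for qui's per-instance lookup (the real
// service calls HasTorrentByAnyHash on each candidate instance).
type hasTorrent func(instanceID int, infoHash string) (bool, error)

// shouldSkipDownload reports whether the feed item's infohash already
// exists on every candidate instance. Any error or miss falls back to
// the normal download path (returns false), keeping the optimization
// strictly opportunistic.
func shouldSkipDownload(infoHash string, instances []int, check hasTorrent) bool {
	if infoHash == "" || len(instances) == 0 {
		return false
	}
	for _, id := range instances {
		exists, err := check(id, infoHash)
		if err != nil || !exists {
			return false // never skip on uncertainty
		}
	}
	return true
}

func main() {
	all := func(int, string) (bool, error) { return true, nil }
	fmt.Println(shouldSkipDownload("ABCD", []int{1, 2}, all)) // true: skip the download
}
```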


@s0up4200 s0up4200 added enhancement New feature or request cross-seed labels Dec 8, 2025
coderabbitai bot (Contributor) commented Dec 8, 2025

Walkthrough

Adds early pre-download checks in cross-seed to short-circuit downloads when torrents already exist (infohash or comment/URL match), centralizes and extends Jackett infohash extraction (validation + magnet fallback), and adds comprehensive unit tests for both services.

Changes

Cohort / File(s) Summary
Cross-Seed Pre-Download Optimization
internal/services/crossseed/service.go
Adds an early optimization path: checks InfoHashV1 across target instances to skip downloads when present on all; adds UNIT3D/comment-URL extraction and matching (extractTorrentURLForCommentMatch, parseTorrentDetailsURL) to short-circuit when a comment-based torrent URL is present; handles cancellation and partial failures with fallbacks to download.
Cross-Seed Tests & Test Helpers
internal/services/crossseed/crossseed_test.go
Adds TestExtractTorrentURLForCommentMatch, in-memory test doubles (sync manager, instance store), and extensive scenarios covering full/partial presence, error propagation, context cancellation/deadlines, and comment/URL matching behavior.
Jackett Infohash Extraction & Validation
internal/services/jackett/service.go
Adds validateInfoHash and extractInfoHashFromMagnet; refactors extractInfoHashFromAttributes to prefer validated direct infohash attributes and fallback to magnet parsing, replacing ad-hoc inline validation.
Jackett Extraction Tests
internal/services/jackett/service_test.go
Adds TestExtractInfoHashFromAttributes and TestExtractInfoHashFromMagnet covering direct hashes, case-insensitive keys, alternative keys, SHA-256 variants, magnet fallbacks, and invalid inputs.
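
For reference, the magnet-URL fallback described in the table above can be done with the standard library. This is a simplified illustration of the general shape of such an extractor (the repo's extractInfoHashFromMagnet may differ in details like validation and error handling):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// extractFromMagnet pulls the btih infohash out of a magnet link,
// a simplified take on the fallback described above.
func extractFromMagnet(magnet string) string {
	u, err := url.Parse(magnet)
	if err != nil || u.Scheme != "magnet" {
		return ""
	}
	xt := u.Query().Get("xt") // format: urn:btih:<infohash>
	const prefix = "urn:btih:"
	if !strings.HasPrefix(strings.ToLower(xt), prefix) {
		return ""
	}
	// Normalize to uppercase hex; real code would validate length/charset too.
	return strings.ToUpper(xt[len(prefix):])
}

func main() {
	m := "magnet:?xt=urn:btih:c12fe1c06bba254a9dc9f519b335aa7c1367a88a&dn=example"
	fmt.Println(extractFromMagnet(m)) // prints the uppercase 40-char hash
}
```

Using url.Parse handles query splitting and URL encoding automatically, avoiding manual string slicing on & separators.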

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

  • Inspect cross-seed early-exit semantics and edge cases where some instances report errors vs. non-existence.
  • Verify correctness and normalization in parseTorrentDetailsURL / extractTorrentURLForCommentMatch.
  • Validate validateInfoHash and extractInfoHashFromMagnet for SHA-1/SHA-256 handling and case normalization.
  • Review new tests and test doubles for realistic simulation of multi-instance behaviors.

Suggested labels

bugfix, torrent

Suggested reviewers

  • Audionut

Poem

🐰 I sniff the feeds and hashes with delight,
I skip what's found and save the download flight,
I parse a magnet and spot a URL’s trail,
I hop through tests so edge cases won't fail,
A tiny rabbit keeps your seeding light.

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 22.22%, below the required 80.00% threshold. Run @coderabbitai generate docstrings to improve coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed — Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed — The title accurately describes the main change: skipping torrent downloads when they already exist by infohash, which aligns directly with the pre-check functionality's primary objectives.

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (2)
internal/services/crossseed/service.go (1)

1637-1685: Infohash pre-check logic correctly short-circuits redundant downloads

The new InfoHashV1 pre-check looks sound and matches the intended behavior:

  • It only runs when there are viable candidates and the run is not a dry-run, so existing behavior for “no candidates” and dry-runs is preserved.
  • allExist is only left true when every candidate instance reports exists == true from HasTorrentByAnyHash, and any error or miss immediately flips allExist to false and falls back to the existing download+cross-seed path. That keeps the optimization strictly opportunistic/safe.
  • When the short-circuit triggers, you:
    • Increment run.TorrentsSkipped once per candidate instance, mirroring how the later exists branch in the cross-seed path treated per-instance “already exists” outcomes.
    • Append CrossSeedRunResult entries with Status: "exists" and MatchedTorrentHash/Name populated, which aligns with existing result semantics and the CrossSeedRunResult model.
    • Return CrossSeedFeedItemStatusProcessed with a non-nil infoHash so markFeedItem persists the hash for the feed item, keeping feed history consistent with the “downloaded + parsed” flow.

Given the surrounding counters (CandidatesFound, TorrentsSkipped, run status calculation) and the fact that candidatesResp.Candidates only ever contains instances with matching complete torrents, this optimization should behave identically to the old path for “already-have-this-torrent-everywhere” cases while avoiding the unnecessary download and cross-seed roundtrip.

I don’t see any functional bugs or accounting inconsistencies in this block; it’s a clean, conservative optimization.

internal/services/jackett/service.go (1)

3076-3105: Minor: Redundant check after parameter splitting.

The magnet URL parsing is correct, but lines 3097-3099 check for & in hashPart after already splitting by & at line 3091. Each param after splitting should not contain &, making this check redundant. Since validateInfoHash will catch any malformed data anyway, consider simplifying:

 		if strings.HasPrefix(strings.ToLower(param), "xt=urn:btih:") {
 			// Extract the hash part after "xt=urn:btih:"
 			hashPart := param[12:] // len("xt=urn:btih:") == 12
-			// Hash might be followed by other parameters or URL encoding
-			if idx := strings.Index(hashPart, "&"); idx > 0 {
-				hashPart = hashPart[:idx]
-			}
 			return validateInfoHash(hashPart)
 		}

The defensive check is harmless but adds unnecessary complexity.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bd2587b and 8da6203.

📒 Files selected for processing (3)
  • internal/services/crossseed/service.go (1 hunks)
  • internal/services/jackett/service.go (1 hunks)
  • internal/services/jackett/service_test.go (1 hunks)
🧰 Additional context used
🧠 Learnings (6)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: web/src/components/torrents/TorrentTableOptimized.tsx:1510-1515
Timestamp: 2025-11-06T11:59:21.390Z
Learning: In the qui project, the API layer in web/src/lib/api.ts normalizes backend snake_case responses to camelCase for frontend consumption. For CrossSeed search results, the backend's download_url field is transformed to downloadUrl in the searchCrossSeedTorrent method, so frontend code should always use the camelCase variant (result.downloadUrl).
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.

Applied to files:

  • internal/services/crossseed/service.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.

Applied to files:

  • internal/services/crossseed/service.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.

Applied to files:

  • internal/services/crossseed/service.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.

Applied to files:

  • internal/services/crossseed/service.go
🧬 Code graph analysis (1)
internal/services/crossseed/service.go (2)
web/src/types/index.ts (1)
  • CrossSeedRunResult (1363-1371)
internal/models/crossseed.go (2)
  • CrossSeedRunResult (150-158)
  • CrossSeedFeedItemStatusProcessed (233-233)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Run tests
🔇 Additional comments (4)
internal/services/jackett/service.go (2)

3040-3063: Well-structured infohash extraction with fallback.

The two-phase approach (direct attribute lookup first, then magnet URL fallback) is sound and properly handles Prowlarr's magneturl attribute when direct infohash isn't available. Both paths go through validation which ensures data quality.


3065-3074: LGTM - Proper infohash validation.

Correctly validates both SHA-1 (40 chars) and SHA-256 (64 chars) hex-encoded hashes with appropriate normalization.

internal/services/jackett/service_test.go (2)

2008-2094: Comprehensive test coverage for attribute extraction.

The test suite thoroughly validates all extraction paths including direct attributes, alternative keys, validation edge cases, and magnet URL fallback. The priority test (lines 2070-2073) correctly confirms that direct infohash takes precedence over magneturl.


2096-2152: Good coverage of magnet URL parsing scenarios.

The test suite validates the magnet URL parser across different parameter orderings, case variations, and error conditions. This ensures the parser is robust to different magnet URL formats.

For UNIT3D trackers that don't expose infohash in RSS feeds,
check if the RSS GUID/InfoURL matches torrent comment fields
on candidate instances before downloading.
coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (1)
internal/services/crossseed/crossseed_test.go (1)

3283-3352: LGTM! Comprehensive test coverage for URL extraction logic.

The test thoroughly validates the extractTorrentURLForCommentMatch function with excellent coverage of the main scenarios: valid tracker URLs (UNIT3D, BHD, Aither, Blutopia), fallback behavior, security validation (HTTPS-only), and edge cases (empty inputs, invalid URLs). The table-driven approach is idiomatic and makes the test easy to understand and extend.

Optional: Consider adding edge cases for more complete coverage:

You may want to add test cases for:

  1. Both inputs invalid: When both GUID and InfoURL are non-torrent URLs
  2. HTTP InfoURL: When GUID is invalid and InfoURL is HTTP (to verify consistent HTTPS enforcement)
  3. URLs with query parameters: e.g., https://seedpool.org/torrents/607803?key=value

Example additional test case:

{
    name:     "Both GUID and InfoURL invalid",
    guid:     "not-a-url",
    infoURL:  "https://example.com/page/123",
    expected: "",
},

These additions would further strengthen the test suite, but the current coverage is solid for the primary use cases.

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 8da6203 and 8fa696d.

📒 Files selected for processing (2)
  • internal/services/crossseed/crossseed_test.go (1 hunks)
  • internal/services/crossseed/service.go (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • internal/services/crossseed/service.go

@s0up4200 s0up4200 added this to the v1.10.0 milestone Dec 8, 2025
coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (6)
internal/services/jackett/service.go (1)

3078-3109: LGTM! Consider using standard URL parsing for cleaner code.

The magnet URL parsing logic works correctly. For improved maintainability and robustness, consider using url.Parse and url.Query().Get("xt") instead of manual string splitting:

func extractInfoHashFromMagnet(magnetURL string) string {
	magnetURL = strings.TrimSpace(magnetURL)
	if magnetURL == "" || !strings.HasPrefix(strings.ToLower(magnetURL), "magnet:") {
		return ""
	}

	u, err := url.Parse(magnetURL)
	if err != nil {
		return ""
	}

	xt := u.Query().Get("xt")
	if xt == "" {
		return ""
	}

	// The xt parameter format is: urn:btih:<infohash>
	if !strings.HasPrefix(strings.ToLower(xt), "urn:btih:") {
		return ""
	}

	hashPart := xt[9:] // len("urn:btih:") == 9
	return validateInfoHash(hashPart)
}

This approach handles URL encoding automatically and is more maintainable, though the current implementation is functionally correct.

internal/services/crossseed/service.go (2)

1637-1737: RSS infohash & comment-URL pre-checks look correct; only very minor robustness nits

The early-exit paths behave as intended: they only short-circuit when all candidate instances already have the torrent, they degrade gracefully on qB errors, and they keep run counters and feed item status consistent with the existing “exists” semantics.

Two small, non-blocking suggestions:

  • Consider normalizing result.InfoHashV1 before passing it into HasTorrentByAnyHash (e.g., via normalizeHash) to be maximally tolerant of indexer casing/whitespace.
  • In the comment-URL pre-check, you might eventually want to log how often this path triggers (vs. infohash pre-check) to validate its impact and help tune/extend patterns, but that’s an observability refinement rather than a correctness issue.

7428-7469: Comment-URL helpers are clear and safe; you may later widen URL patterns if needed

extractTorrentURLForCommentMatch / parseTorrentDetailsURL are nicely constrained and avoid false positives by requiring HTTPS and specific /torrents/ or /details/ segments, which is appropriate for UNIT3D/BHD-style feeds.

If you ever hit real-world feeds that use http:// or slightly different detail paths (e.g. query-based torrents.php?id=), you can extend parseTorrentDetailsURL to accept those variants, but that’s an easy, incremental change and doesn’t block this PR.
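
A rough sketch of the URL constraint described here, under the stated rules (HTTPS-only plus a /torrents/ or /details/ path segment). The function name below is hypothetical; the real helpers are extractTorrentURLForCommentMatch and parseTorrentDetailsURL, whose exact matching rules may differ:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// acceptTorrentDetailsURL reports whether a GUID/InfoURL looks like a
// UNIT3D/BHD-style torrent details page suitable for comment matching:
// it must be HTTPS and contain a /torrents/ or /details/ path segment.
func acceptTorrentDetailsURL(raw string) bool {
	u, err := url.Parse(raw)
	if err != nil || u.Scheme != "https" || u.Host == "" {
		return false
	}
	return strings.Contains(u.Path, "/torrents/") || strings.Contains(u.Path, "/details/")
}

func main() {
	fmt.Println(acceptTorrentDetailsURL("https://seedpool.org/torrents/607803")) // true
	fmt.Println(acceptTorrentDetailsURL("http://seedpool.org/torrents/607803"))  // false: not HTTPS
}
```

Requiring both HTTPS and a known details-path segment keeps the comment-match path conservative, trading a few missed matches for near-zero false positives.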

internal/services/crossseed/crossseed_test.go (3)

3283-3353: Solid coverage for extractTorrentURLForCommentMatch; consider one InfoURL-driven case

The table-driven test nicely covers:

  • Accepted HTTPS tracker URLs (UNIT3D/BHD-style) in GUID.
  • Fallback to InfoURL.
  • Rejection of HTTP/non-torrent URLs and empty inputs.

If you want slightly tighter integration coverage, you could add a case where GUID is non-URL and only InfoURL carries the valid torrent details URL, mirroring the UNIT3D-style feeds that rely on link rather than guid.


3355-3499: infohash test fakes look correct; could also assert requested hash lists

infohashTestSyncManager / infohashTestInstanceStore match the existing qbittorrentSync test fakes and give fine‑grained control over HasTorrentByAnyHash outcomes per instance. Returning copies from GetTorrents/GetTorrentFilesBatch avoids accidental mutation in tests, which is good.

If you want these tests to also validate that processAutomationCandidate passes the right, normalized hash set into HasTorrentByAnyHash, you could extend the fake to capture the hashes argument per call and assert on it in the tests (e.g., ensure it only contains the validated v1 infohash for now).


3776-3849: Comment-URL match test cleanly validates the non-infohash fallback

TestProcessAutomationCandidate_SkipsWhenCommentURLMatches correctly models the UNIT3D-style scenario where:

  • No infohash is available.
  • The GUID is a tracker details URL.
  • An existing torrent’s Comment embeds that URL.

The test verifies that the helper URL extraction plus comment matching path short-circuits before download and leaves returnedHash nil, which matches the “no hash was trusted here” semantics.

If you later want even more confidence, you could add a variant where the GUID is non-URL and the same torrent URL lives only in InfoURL, to prove that the comment-match fallback works with both feed fields, not just GUID.

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 8fa696d and 6c9f85f.

📒 Files selected for processing (3)
  • internal/services/crossseed/crossseed_test.go (2 hunks)
  • internal/services/crossseed/service.go (2 hunks)
  • internal/services/jackett/service.go (1 hunks)
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-11-21T21:11:50.633Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 625
File: internal/qbittorrent/sync_manager.go:1112-1123
Timestamp: 2025-11-21T21:11:50.633Z
Learning: In internal/qbittorrent/sync_manager.go, the GetCachedFilesBatch interface documentation (around line 39-40) should specify "uppercase hex" instead of "lowercase hex" to match the actual normalization practice used throughout the codebase (e.g., normalizeHash in internal/services/crossseed/service.go uses strings.ToUpper, and hash filtering uses uppercase).

Applied to files:

  • internal/services/jackett/service.go
🧬 Code graph analysis (2)
internal/services/crossseed/service.go (2)
web/src/types/index.ts (2)
  • CrossSeedRunResult (1363-1371)
  • Torrent (216-273)
internal/models/crossseed.go (2)
  • CrossSeedRunResult (150-158)
  • CrossSeedFeedItemStatusProcessed (233-233)
internal/services/crossseed/crossseed_test.go (4)
internal/qbittorrent/sync_manager.go (2)
  • CrossInstanceTorrentView (99-103)
  • TorrentView (93-96)
internal/services/crossseed/service.go (1)
  • Service (236-287)
internal/models/crossseed.go (3)
  • CrossSeedAutomationSettings (20-51)
  • CrossSeedRun (161-177)
  • CrossSeedFeedItemStatusProcessed (233-233)
internal/services/crossseed/models.go (3)
  • CrossSeedRequest (15-38)
  • CrossSeedResponse (41-48)
  • InstanceCrossSeedResult (51-60)
🔇 Additional comments (4)
internal/services/jackett/service.go (2)

3045-3063: LGTM! Well-structured fallback for infohash extraction.

The enhancement adds a sensible fallback to extract infohash from the magneturl attribute when direct infohash attributes aren't available. This broadens indexer support (e.g., Prowlarr) as noted in the PR objectives.


3065-3076: LGTM! Solid validation logic.

The validation correctly checks for hex-encoded SHA1 (40 chars) or SHA256 (64 chars) infohashes using hex.DecodeString, with clear documentation that Base32 encoding is not supported.

internal/services/crossseed/crossseed_test.go (2)

25-25: Jackett import is appropriate for the new test helpers

The added jackett import cleanly supports the new TorrentDownloadRequest and SearchResult usages in the tests; no issues here.


3500-3590: Infohash pre-check tests exercise the key behaviors well

The three processAutomationCandidate tests cover the intended matrix:

  • Skip when the infohash is reported as existing on all target instances.
  • Proceed (download + cross-seed) when only some instances have the infohash.
  • Proceed when HasTorrentByAnyHash returns an error, ensuring graceful degradation.

They also assert TorrentsSkipped in the “all instances have it” case, aligning per-instance accounting with how cross-seed results are tallied elsewhere. The explicit crossSeedInvoker stubs in the “proceeds” cases avoid nil panics and keep the tests focused on decision logic rather than injection details.

No functional concerns here.

Also applies to: 3592-3689, 3691-3774

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (4)
internal/services/crossseed/crossseed_test.go (4)

3283-3353: Comment-URL extraction tests exercise the key UNIT3D/BHD-style patterns

The TestExtractTorrentURLForCommentMatch table nicely covers:

  • Valid HTTPS GUID URLs for several UNIT3D trackers.
  • Fallback to InfoURL when GUID is empty or non-URL.
  • Rejection of plain HTTP and non-torrent URLs.

This gives good confidence in the helper's behavior.

If you want to harden it further, consider an extra case where InfoURL is HTTP (or an unsupported path) to assert that insecure/irrelevant URLs are explicitly rejected in both GUID and InfoURL positions.


3355-3499: infohashTestSyncManager/InstanceStore are well-focused fakes with minor realism gaps

The infohashTestSyncManager and infohashTestInstanceStore give you tight control over:

  • Per-instance torrent listings, files, and properties.
  • Per-instance HasTorrentByAnyHash outcomes via hashResults.

Two minor points to consider:

  • HasTorrentByAnyHash ignores the hashes slice and only keys off instanceID. That’s fine for these tests, but it does diverge from the real API contract and could make future tests less expressive if you ever need to assert behavior that depends on which hashes were passed.
  • Several methods (AddTorrent, BulkAction, GetQBittorrentSyncManager, etc.) are implemented as no-ops that always succeed. That keeps current tests simple, but it also means accidental new call paths won’t fail loudly. Mirroring the pattern from fakeSyncManager/mockRecoverSyncManager (returning an explicit “not implemented in infohashTestSyncManager” error) would make unexpected usage surface quickly during test runs.

Both points are optional; the current fakes are sufficient for the scenarios under test.


3500-3774: Infohash pre-check tests comprehensively cover success, partial, error, and cancellation paths

The four new processAutomationCandidate tests around infohash pre-checks look solid:

  • SkipsWhenInfohashExistsOnAllInstances ensures:

    • Hash presence is checked per instance.
    • The candidate is marked processed with TorrentsSkipped incremented.
    • No torrent download is attempted when all instances already have the infohash.
  • ProceedsWhenInfohashExistsOnSomeInstances validates the “some instances have it” branch, asserting that the download path is still exercised and cross-seed invocation can proceed for missing instances.

  • ProceedsOnHashCheckError confirms graceful degradation: hash-check errors cause a logged warning and fallback to the normal download path instead of failing the candidate outright.

  • PropagatesContextCancellation correctly treats a context.Canceled error from HasTorrentByAnyHash as fatal, returning a failed status and avoiding any download.

As a small enhancement, you could simulate actual context cancellation by using a context.WithCancel/WithTimeout and having the fake hash-check observe ctx.Err(), but that’s strictly optional given you already assert propagation of context.Canceled.


3853-3926: Comment-URL based skip test validates UNIT3D-style fallback behavior

TestProcessAutomationCandidate_SkipsWhenCommentURLMatches nicely exercises the new “comment URL match” shortcut for trackers that don’t expose infohashes:

  • Infohash is empty in the search result.
  • GUID carries the UNIT3D-style HTTPS details URL.
  • The existing torrent’s Comment embeds that URL.
  • The test asserts the candidate is processed as skipped, with no download attempted.

To further lock in the behavior, you could add a complementary case where the torrent URL must be taken from InfoURL instead of GUID (matching your helper’s fallback) to ensure the end-to-end path works for both GUID- and InfoURL-based feeds.
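A minimal sketch of the GUID-first/InfoURL-fallback rule such a complementary case would exercise; pickDetailsURL is hypothetical and far looser than the real extractTorrentURLForCommentMatch:

```go
package main

import (
	"fmt"
	"strings"
)

// pickDetailsURL sketches the fallback order: prefer GUID when it looks
// like an HTTPS torrent-details URL, otherwise try InfoURL. The real
// helper in service.go applies stricter validation.
func pickDetailsURL(guid, infoURL string) string {
	ok := func(u string) bool {
		return strings.HasPrefix(u, "https://") && strings.Contains(u, "/torrents/")
	}
	if ok(guid) {
		return guid
	}
	if ok(infoURL) {
		return infoURL
	}
	return ""
}

func main() {
	// GUID is non-URL, so the helper should fall back to InfoURL.
	fmt.Println(pickDetailsURL("not-a-url", "https://tracker.example/torrents/9"))
}
```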

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6c9f85f and 33a062d.

📒 Files selected for processing (2)
  • internal/services/crossseed/crossseed_test.go (2 hunks)
  • internal/services/crossseed/service.go (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • internal/services/crossseed/service.go
🧰 Additional context used
🧠 Learnings (6)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
📚 Learning: 2025-11-28T20:32:30.126Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
🧬 Code graph analysis (1)
internal/services/crossseed/crossseed_test.go (5)
internal/qbittorrent/sync_manager.go (3)
  • CrossInstanceTorrentView (99-103)
  • TorrentView (93-96)
  • SyncManager (159-182)
internal/services/crossseed/service.go (2)
  • Service (236-287)
  • AutomationRunOptions (410-414)
internal/models/crossseed.go (4)
  • CrossSeedAutomationSettings (20-51)
  • CrossSeedRun (161-177)
  • CrossSeedFeedItemStatusProcessed (233-233)
  • CrossSeedFeedItemStatusFailed (235-235)
internal/services/jackett/models.go (1)
  • SearchResult (67-110)
internal/services/crossseed/models.go (2)
  • CrossSeedResponse (41-48)
  • InstanceCrossSeedResult (51-60)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Run tests
🔇 Additional comments (1)
internal/services/crossseed/crossseed_test.go (1)

23-28: Jackett service import is appropriate and correctly used

The new jackett import is used for TorrentDownloadRequest, SearchResult, and EnabledIndexerInfo in the added tests and keeps the tests aligned with the production APIs. No changes needed here.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
internal/services/crossseed/service.go (1)

1693-1746: Comment-URL pre-check + URL helpers are sound; optional robustness tweaks

The comment-URL pre-check correctly short-circuits only when every candidate instance has a torrent whose Comment includes the extracted URL, matching the “already present on all candidates” requirement, and it safely falls back to the normal download path otherwise. extractTorrentURLForCommentMatch / parseTorrentDetailsURL are conservative (HTTPS + /torrents/ or /details/), so they can only cause missed optimizations, not false positives.

If you want to harden this further without changing behavior:

  • Consider allowing http:// in parseTorrentDetailsURL for trackers that still use plain HTTP, or at least making the scheme check configurable, to avoid silently skipping the optimization on such feeds.
  • Think about normalizing trivial variants (e.g., trimming a trailing / or dropping simple query strings) so that comments containing a slightly decorated details URL still match the GUID/InfoURL-derived string.

These are non-blocking polish items; the current implementation is safe and aligned with the described trackers.
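A rough sketch of the normalization idea, assuming trailing-slash and simple-query trimming are safe for the targeted trackers (the function name is hypothetical):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeDetailsURL trims the trivial variants the review mentions:
// a simple query string and a trailing slash. A comment containing a
// lightly decorated details URL then still matches the GUID/InfoURL-
// derived string. This is a sketch, not the shipped helper.
func normalizeDetailsURL(raw string) string {
	if i := strings.IndexByte(raw, '?'); i >= 0 {
		raw = raw[:i] // drop a simple query string
	}
	return strings.TrimSuffix(raw, "/") // drop one trailing slash
}

func main() {
	fmt.Println(normalizeDetailsURL("https://tracker.example/torrents/123/?hit=1"))
	// https://tracker.example/torrents/123
}
```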

Also applies to: 7437-7477

internal/services/crossseed/crossseed_test.go (1)

2355-2498: LGTM! Well-structured test doubles for multi-instance scenarios.

The infohashTestSyncManager and infohashTestInstanceStore provide comprehensive mocking capabilities with configurable behavior per instance. The hashResults map is a clean way to control HasTorrentByAnyHash responses for different test scenarios.

For future maintainability, if this test double is reused across multiple test files, consider extracting it to a shared test helper package.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 33a062d and 5f1a834.

📒 Files selected for processing (2)
  • internal/services/crossseed/crossseed_test.go (2 hunks)
  • internal/services/crossseed/service.go (2 hunks)
🧰 Additional context used
🧠 Learnings (6)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
📚 Learning: 2025-11-28T20:32:30.126Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.

Applied to files:

  • internal/services/crossseed/service.go
🧬 Code graph analysis (2)
internal/services/crossseed/crossseed_test.go (2)
internal/qbittorrent/sync_manager.go (2)
  • CrossInstanceTorrentView (99-103)
  • TorrentView (93-96)
internal/services/jackett/models.go (1)
  • SearchResult (67-110)
internal/services/crossseed/service.go (3)
web/src/types/index.ts (2)
  • CrossSeedRunResult (1366-1374)
  • Torrent (217-274)
internal/models/crossseed.go (3)
  • CrossSeedRunResult (150-158)
  • CrossSeedFeedItemStatusFailed (235-235)
  • CrossSeedFeedItemStatusProcessed (233-233)
web/src/pages/Torrents.tsx (1)
  • Torrents (28-350)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Run tests
🔇 Additional comments (5)
internal/services/crossseed/crossseed_test.go (5)

2284-2353: LGTM! Comprehensive test coverage for URL extraction.

The test covers all major UNIT3D-style tracker URL formats and properly validates the fallback behavior when GUID is empty or invalid. The HTTP rejection and non-torrent URL filtering are good security practices.


2500-2590: LGTM! Core feature test validates infohash pre-check optimization.

This test correctly verifies the primary optimization: when all target instances already have the torrent by infohash, the download is skipped entirely. The assertion on downloadCalled confirms the download is avoided, and the skip counter properly tracks both instances.


2592-2689: LGTM! Partial match scenario handled correctly.

The test properly validates that when only some instances have the torrent by infohash, the download proceeds for the instances that need it. This ensures the optimization doesn't prevent legitimate cross-seeding opportunities.


2691-2928: LGTM! Excellent error handling test coverage.

The three tests correctly distinguish between:

  1. Generic errors (graceful degradation - proceed with download)
  2. Context cancellation/deadline errors (propagate immediately - don't fallback)

This ensures the optimization doesn't hide critical context signals while still being resilient to transient errors.


2930-4003: LGTM! Fallback mechanism for non-infohash trackers.

This test validates the fallback to comment URL matching for trackers that don't provide infohash in feeds (like UNIT3D-based trackers mentioned in the PR objectives). The test correctly simulates the scenario and verifies the skip behavior works even without infohash.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Nitpick comments (3)
internal/services/crossseed/service.go (3)

1637-1692: Infohash pre-check flow and counters look solid; only nit is pointer helper style

The InfoHashV1 pre-check correctly:

  • short-circuits only when all candidate instances already have the torrent,
  • treats context cancel/timeout as a real failure (run.TorrentsFailed++ + FeedItemStatusFailed),
  • otherwise logs and safely falls back to the normal download/apply path,
  • records per-instance "exists" results and increments run.TorrentsSkipped per instance, consistent with how later apply results are tallied.

One small nit: the MatchedTorrentHash/MatchedTorrentName fields are built via inline func() *string { ... }() closures, which allocate more than necessary and differ from the local-var pattern used elsewhere in this file. Consider the more idiomatic style for consistency and to avoid the tiny closure cost:

-			if exists && existing != nil {
-				existingResults = append(existingResults, models.CrossSeedRunResult{
-					InstanceID:         candidate.InstanceID,
-					InstanceName:       candidate.InstanceName,
-					Success:            false,
-					Status:             "exists",
-					Message:            "Torrent already exists (infohash pre-check)",
-					MatchedTorrentHash: func() *string { h := existing.Hash; return &h }(),
-					MatchedTorrentName: func() *string { n := existing.Name; return &n }(),
-				})
+			if exists && existing != nil {
+				hash := existing.Hash
+				name := existing.Name
+				existingResults = append(existingResults, models.CrossSeedRunResult{
+					InstanceID:         candidate.InstanceID,
+					InstanceName:       candidate.InstanceName,
+					Success:            false,
+					Status:             "exists",
+					Message:            "Torrent already exists (infohash pre-check)",
+					MatchedTorrentHash: &hash,
+					MatchedTorrentName: &name,
+				})

1694-1748: Comment-URL pre-check logic matches goals; same small pointer-style nit as above

The UNIT3D/BHD-style comment-URL pre-check:

  • Extracts a plausible torrent details URL once and reuses it,
  • Requires the URL-backed torrent to exist on all candidate instances before skipping the download, mirroring the infohash pre-check semantics,
  • Properly treats ctx.Err() != nil as a failure (run.TorrentsFailed++ + FeedItemStatusFailed), and
  • Emits per-instance "exists" results and increments run.TorrentsSkipped per instance, which fits existing run summary semantics.

As with the infohash block, you can optionally switch away from the inline closures for MatchedTorrentHash/MatchedTorrentName to a local-var pattern for consistency and slightly less allocation:

-			if found && matchedTorrent != nil {
-				existingResults = append(existingResults, models.CrossSeedRunResult{
-					InstanceID:         candidate.InstanceID,
-					InstanceName:       candidate.InstanceName,
-					Success:            false,
-					Status:             "exists",
-					Message:            "Torrent already exists (comment URL pre-check)",
-					MatchedTorrentHash: func() *string { h := matchedTorrent.Hash; return &h }(),
-					MatchedTorrentName: func() *string { n := matchedTorrent.Name; return &n }(),
-				})
+			if found && matchedTorrent != nil {
+				hash := matchedTorrent.Hash
+				name := matchedTorrent.Name
+				existingResults = append(existingResults, models.CrossSeedRunResult{
+					InstanceID:         candidate.InstanceID,
+					InstanceName:       candidate.InstanceName,
+					Success:            false,
+					Status:             "exists",
+					Message:            "Torrent already exists (comment URL pre-check)",
+					MatchedTorrentHash: &hash,
+					MatchedTorrentName: &name,
+				})

7438-7480: Helper URL extraction is clear; consider minor robustness tweaks only if needed

The new helpers cleanly encapsulate the comment-URL matching rule:

  • prefer GUID, fall back to InfoURL,
  • require https:// and /torrents/ or /details/ path segments, which matches the documented UNIT3D/BHD patterns.

If you ever need to support slightly messier feeds, two optional robustness tweaks would be:

  • strings.TrimSpace(rawURL) before inspection, and
  • scheme check using a lowercased copy (strings.HasPrefix(strings.ToLower(rawURL), "https://")) to tolerate unusual but valid mixed-case schemes.

Not required for the trackers you’re targeting today, but easy to adopt later if feed variance becomes an issue.
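Sketched together, the two tweaks might look like this; looksLikeDetailsURL is hypothetical and only approximates what parseTorrentDetailsURL does:

```go
package main

import (
	"fmt"
	"strings"
)

// looksLikeDetailsURL applies the optional hardening from the review:
// trim surrounding whitespace, then check the scheme on a lowercased
// copy so unusual but valid mixed-case schemes still pass, before the
// existing /torrents/ or /details/ path requirement.
func looksLikeDetailsURL(raw string) bool {
	raw = strings.TrimSpace(raw)
	if !strings.HasPrefix(strings.ToLower(raw), "https://") {
		return false
	}
	return strings.Contains(raw, "/torrents/") || strings.Contains(raw, "/details/")
}

func main() {
	fmt.Println(looksLikeDetailsURL("  HTTPS://tracker.example/torrents/42 ")) // true
	fmt.Println(looksLikeDetailsURL("http://tracker.example/torrents/42"))     // false
}
```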

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5f1a834 and 64734d5.

📒 Files selected for processing (2)
  • internal/services/crossseed/crossseed_test.go (2 hunks)
  • internal/services/crossseed/service.go (2 hunks)
🧰 Additional context used
🧠 Learnings (7)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
📚 Learning: 2025-11-28T20:32:30.126Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.

Applied to files:

  • internal/services/crossseed/crossseed_test.go
  • internal/services/crossseed/service.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.

Applied to files:

  • internal/services/crossseed/service.go
📚 Learning: 2025-11-21T21:11:50.633Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 625
File: internal/qbittorrent/sync_manager.go:1112-1123
Timestamp: 2025-11-21T21:11:50.633Z
Learning: In internal/qbittorrent/sync_manager.go, the GetCachedFilesBatch interface documentation (around line 39-40) should specify "uppercase hex" instead of "lowercase hex" to match the actual normalization practice used throughout the codebase (e.g., normalizeHash in internal/services/crossseed/service.go uses strings.ToUpper, and hash filtering uses uppercase).

Applied to files:

  • internal/services/crossseed/service.go
🧬 Code graph analysis (1)
internal/services/crossseed/service.go (1)
internal/models/crossseed.go (3)
  • CrossSeedRunResult (150-158)
  • CrossSeedFeedItemStatusFailed (235-235)
  • CrossSeedFeedItemStatusProcessed (233-233)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Run tests
🔇 Additional comments (6)
internal/services/crossseed/crossseed_test.go (6)

2284-2353: LGTM! Comprehensive test coverage for URL extraction.

The test covers all major tracker URL formats (UNIT3D, BHD, Aither, Blutopia), fallback behavior, and validation (HTTP rejection, non-torrent URLs). Good edge case coverage including empty inputs.


3355-3498: LGTM! Well-designed test doubles for infohash testing.

The infohashTestSyncManager and supporting types provide configurable mock behavior specifically for testing hash-based pre-checks. While there's some overlap with existing fakeSyncManager, the additional hashResults configuration is necessary for testing scenarios where hash checks succeed on some instances but fail on others.

The implementation correctly:

  • Stores per-instance torrents, files, and properties
  • Provides configurable HasTorrentByAnyHash results via hashResults
  • Copies data properly to avoid mutation issues
  • Implements all required interface methods

3500-3590: LGTM! Test correctly verifies skip behavior when infohash exists on all instances.

The test properly validates the optimization where downloads are skipped when the infohash already exists on all target instances. Key assertions correctly verify:

  • Download is NOT called (downloadCalled = false)
  • Skip counter is incremented for both instances (TorrentsSkipped = 2)
  • Returned hash matches the existing torrent

3592-3774: LGTM! Tests correctly verify proceed behavior for partial matches and errors.

Two well-designed tests:

  1. Partial instance coverage (lines 3592-3689): Correctly verifies that download proceeds when the infohash exists on some but not all instances, ensuring the torrent is added to remaining instances.

  2. Error handling (lines 3691-3774): Properly tests graceful degradation when hash checks fail. The system proceeds with the download rather than blocking on transient errors, which is the correct behavior for resilience.

Both include proper mock crossSeedInvoker setup to prevent nil panics.


3776-3930: LGTM! Critical tests for context error propagation.

These tests ensure that context cancellation and deadline exceeded errors are properly propagated rather than treated as transient failures that trigger fallback behavior. This distinction is important because:

  1. Context cancellation indicates user/system-initiated shutdown, not a transient error
  2. Deadline exceeded indicates timeout, which should fail fast rather than continue processing

Both tests correctly verify:

  • Errors are propagated using assert.ErrorIs
  • TorrentsFailed counter is incremented
  • Download is NOT called
  • Error messages contain appropriate context

3932-4005: LGTM! Test correctly verifies comment URL matching fallback.

This test validates the fallback behavior when infohash is unavailable. The system extracts torrent detail URLs from GUID/InfoURL fields and matches them against existing torrent comments, preventing duplicate downloads from trackers that don't provide infohash in their RSS feeds.

Test correctly:

  • Omits infohash in search result to force fallback
  • Sets up torrent comment with matching URL
  • Doesn't configure hashResults (returns nil, false, nil) to trigger fallback path
  • Verifies download is skipped
  • Verifies returnedHash is nil (appropriate since no infohash available)

@s0up4200 s0up4200 merged commit 3baa007 into main Dec 10, 2025
12 checks passed
@s0up4200 s0up4200 deleted the feat/rss-infohash-precheck branch December 10, 2025 15:18
alexlebens pushed a commit to alexlebens/infrastructure that referenced this pull request Dec 18, 2025
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [ghcr.io/autobrr/qui](https://github.com/autobrr/qui) | minor | `v1.9.1` -> `v1.10.0` |

---

### Release Notes

<details>
<summary>autobrr/qui (ghcr.io/autobrr/qui)</summary>

### [`v1.10.0`](https://github.com/autobrr/qui/releases/tag/v1.10.0)

[Compare Source](autobrr/qui@v1.9.1...v1.10.0)

#### Changelog

##### New Features

- [`f2b17e6`](autobrr/qui@f2b17e6): feat(config): add SESSION\_SECRET\_FILE env var ([#661](autobrr/qui#661)) ([@undefined-landmark](https://github.com/undefined-landmark))
- [`f5ede56`](autobrr/qui@f5ede56): feat(crossseed): add RSS source filters for categories and tags ([#757](autobrr/qui#757)) ([@s0up4200](https://github.com/s0up4200))
- [`9dee7bb`](autobrr/qui@9dee7bb): feat(crossseed): add Unicode normalization for title and file matching ([#742](autobrr/qui#742)) ([@s0up4200](https://github.com/s0up4200))
- [`d44058f`](autobrr/qui@d44058f): feat(crossseed): add skip auto-resume settings per mode ([#755](autobrr/qui#755)) ([@s0up4200](https://github.com/s0up4200))
- [`9e3534a`](autobrr/qui@9e3534a): feat(crossseed): add webhook source filters for categories and tags ([#763](autobrr/qui#763)) ([@s0up4200](https://github.com/s0up4200))
- [`c8bbe07`](autobrr/qui@c8bbe07): feat(crossseed): only poll status endpoints when features are enabled ([#738](autobrr/qui#738)) ([@s0up4200](https://github.com/s0up4200))
- [`fda8101`](autobrr/qui@fda8101): feat(sidebar): add size tooltips and deduplicate cross-seed sizes ([#724](autobrr/qui#724)) ([@s0up4200](https://github.com/s0up4200))
- [`e4c0556`](autobrr/qui@e4c0556): feat(torrent): add sequential download toggles ([#776](autobrr/qui#776)) ([@rare-magma](https://github.com/rare-magma))
- [`2a43f15`](autobrr/qui@2a43f15): feat(torrents): autocomplete paths ([#634](autobrr/qui#634)) ([@rare-magma](https://github.com/rare-magma))
- [`1c07b33`](autobrr/qui@1c07b33): feat(torrents): replace filtered speeds with global ([#745](autobrr/qui#745)) ([@jabloink](https://github.com/jabloink))
- [`cd0deee`](autobrr/qui@cd0deee): feat(tracker): add per-domain stats inclusion toggle for merged trackers ([#781](autobrr/qui#781)) ([@s0up4200](https://github.com/s0up4200))
- [`b6a6200`](autobrr/qui@b6a6200): feat(web): add Size column to Tracker Breakdown table ([#770](autobrr/qui#770)) ([@s0up4200](https://github.com/s0up4200))
- [`560071b`](autobrr/qui@560071b): feat(web): add zebra striping to torrent table ([#726](autobrr/qui#726)) ([@s0up4200](https://github.com/s0up4200))
- [`f8f65a8`](autobrr/qui@f8f65a8): feat(web): improve auto-search on completion UX ([#743](autobrr/qui#743)) ([@s0up4200](https://github.com/s0up4200))
- [`e36312f`](autobrr/qui@e36312f): feat(web): improve torrent selection UX with unified click and escape behavior ([#782](autobrr/qui#782)) ([@s0up4200](https://github.com/s0up4200))
- [`27c1daa`](autobrr/qui@27c1daa): feat(web): napster theme ([#728](autobrr/qui#728)) ([@s0up4200](https://github.com/s0up4200))
- [`e3950de`](autobrr/qui@e3950de): feat(web): new torrent details panel for desktop ([#760](autobrr/qui#760)) ([@s0up4200](https://github.com/s0up4200))
- [`6c66ba5`](autobrr/qui@6c66ba5): feat(web): persist tab state in URL for CrossSeed and Settings pages ([#775](autobrr/qui#775)) ([@s0up4200](https://github.com/s0up4200))
- [`59884a9`](autobrr/qui@59884a9): feat(web): share tracker customizations with filtersidebar ([#717](autobrr/qui#717)) ([@s0up4200](https://github.com/s0up4200))

##### Bug Fixes

- [`fafd278`](autobrr/qui@fafd278): fix(api): add webhook source filter fields to PATCH settings endpoint ([#&#8203;774](autobrr/qui#774)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`bdf0339`](autobrr/qui@bdf0339): fix(api): support apikey query param with custom base URL ([#&#8203;748](autobrr/qui#748)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`c3c8d66`](autobrr/qui@c3c8d66): fix(crossseed): compare Site and Sum fields for anime releases ([#&#8203;769](autobrr/qui#769)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`cb4c965`](autobrr/qui@cb4c965): fix(crossseed): detect file name differences and fix hasExtraSourceFiles ([#&#8203;741](autobrr/qui#741)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`fd9e054`](autobrr/qui@fd9e054): fix(crossseed): fix batch completion searches and remove legacy settings ([#&#8203;744](autobrr/qui#744)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`26706a0`](autobrr/qui@26706a0): fix(crossseed): normalize punctuation in title matching ([#&#8203;718](autobrr/qui#718)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`db30566`](autobrr/qui@db30566): fix(crossseed): rename files before folder to avoid path conflicts ([#&#8203;752](autobrr/qui#752)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`8886ac4`](autobrr/qui@8886ac4): fix(crossseed): resolve category creation race condition and relax autoTMM ([#&#8203;767](autobrr/qui#767)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`f8f2a05`](autobrr/qui@f8f2a05): fix(crossseed): support game scene releases with RAR files ([#&#8203;768](autobrr/qui#768)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`918adee`](autobrr/qui@918adee): fix(crossseed): treat x264/H.264/H264/AVC as equivalent codecs ([#&#8203;766](autobrr/qui#766)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`c4b1f0a`](autobrr/qui@c4b1f0a): fix(dashboard): merge tracker customizations with duplicate displayName ([#&#8203;751](autobrr/qui#751)) ([@&#8203;jabloink](https://github.com/jabloink))
- [`3c6e0f9`](autobrr/qui@3c6e0f9): fix(license): remove redundant validation call after activation ([#&#8203;749](autobrr/qui#749)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`a9c7754`](autobrr/qui@a9c7754): fix(reannounce): simplify tracker detection to match qbrr logic ([#&#8203;746](autobrr/qui#746)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`3baa007`](autobrr/qui@3baa007): fix(rss): skip download when torrent already exists by infohash ([#&#8203;715](autobrr/qui#715)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`55d0ccc`](autobrr/qui@55d0ccc): fix(swagger): respect base URL for API docs routes ([#&#8203;758](autobrr/qui#758)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`47695fd`](autobrr/qui@47695fd): fix(web): add height constraint to filter sidebar wrapper for proper scrolling ([#&#8203;778](autobrr/qui#778)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`4b3bfea`](autobrr/qui@4b3bfea): fix(web): default torrent format to v1 in creator dialog ([#&#8203;723](autobrr/qui#723)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`2d54b79`](autobrr/qui@2d54b79): fix(web): pin submit button in Services sheet footer ([#&#8203;756](autobrr/qui#756)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`2bcd6a3`](autobrr/qui@2bcd6a3): fix(web): preserve folder collapse state during file tree sync ([#&#8203;740](autobrr/qui#740)) ([@&#8203;ewenjo](https://github.com/ewenjo))
- [`57f3f1d`](autobrr/qui@57f3f1d): fix(web): sort Peers column by total peers instead of connected ([#&#8203;759](autobrr/qui#759)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`53a8818`](autobrr/qui@53a8818): fix(web): sort Seeds column by total seeds instead of connected ([#&#8203;747](autobrr/qui#747)) ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`d171915`](autobrr/qui@d171915): fix(web): sort folders before files in torrent file tree ([#&#8203;764](autobrr/qui#764)) ([@&#8203;s0up4200](https://github.com/s0up4200))

##### Other Changes

- [`172b4aa`](autobrr/qui@172b4aa): chore(assets): replace napster.svg with napster.png for logo update ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`dc83102`](autobrr/qui@dc83102): chore(deps): bump the github group with 3 updates ([#&#8203;761](autobrr/qui#761)) ([@&#8203;dependabot](https://github.com/dependabot)\[bot])
- [`75357d3`](autobrr/qui@75357d3): chore: fix napster logo ([@&#8203;s0up4200](https://github.com/s0up4200))
- [`206c4b2`](autobrr/qui@206c4b2): refactor(web): extract CrossSeed completion to accordion component ([#&#8203;762](autobrr/qui#762)) ([@&#8203;s0up4200](https://github.com/s0up4200))

**Full Changelog**: <autobrr/qui@v1.9.1...v1.10.0>

#### Docker images

- `docker pull ghcr.io/autobrr/qui:v1.10.0`
- `docker pull ghcr.io/autobrr/qui:latest`
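
As a sketch, if you run qui via Docker Compose, the new release can be picked up by pinning the tag (the service name and `restart` policy here are illustrative, not from this release):

```yaml
# Hypothetical compose fragment — pin the release tag rather than :latest
# so upgrades stay explicit and reproducible.
services:
  qui:
    image: ghcr.io/autobrr/qui:v1.10.0
    restart: unless-stopped
```

Pinning `v1.10.0` instead of `latest` means the container only changes when you (or Renovate) bump the tag.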

#### What to do next?

- Join our [Discord server](https://discord.autobrr.com/qui)

Thank you for using qui!

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

♻ **Rebasing**: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update again.

---

 - [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate Bot](https://github.com/renovatebot/renovate).
<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiI0Mi4zOS4xIiwidXBkYXRlZEluVmVyIjoiNDIuMzkuMSIsInRhcmdldEJyYW5jaCI6Im1haW4iLCJsYWJlbHMiOlsiaW1hZ2UiXX0=-->

Reviewed-on: https://gitea.alexlebens.dev/alexlebens/infrastructure/pulls/2664
Co-authored-by: Renovate Bot <renovate-bot@alexlebens.net>
Co-committed-by: Renovate Bot <renovate-bot@alexlebens.net>