fix(rss): skip download when torrent already exists by infohash #715
Conversation
When RSS automation finds candidates, check if the feed item's infohash already exists on all candidate instances before downloading the torrent file. This prevents unnecessary downloads (and Prowlarr "Grabbed" notifications) when a torrent was already added via another source like autobrr/IRC. Also adds fallback extraction of the infohash from the magneturl attribute.

Indexers providing infohash via Prowlarr:
- Generic Torznab (infohash attribute)
- HDBits, BeyondHD, Avistaz/PrivateHD/ExoticaZ, BroadcastheNet
- Knaben, TorrentsCSV
- Cardigann (per-definition, with magnet fallback)
- TorrentRssParser-based indexers (from magnet URL if present)

Not supported (no infohash in feed):
- PassThePopcorn, UNIT3D-based trackers
- Anidex, Animedia, SubsPlease, Shizaproject
Walkthrough

Adds early pre-download checks in cross-seed to short-circuit downloads when torrents already exist (infohash or comment/URL match), centralizes and extends Jackett infohash extraction (validation + magnet fallback), and adds comprehensive unit tests for both services.

Changes
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (2)
internal/services/crossseed/service.go (1)
1637-1685: Infohash pre-check logic correctly short-circuits redundant downloads

The new `InfoHashV1` pre-check looks sound and matches the intended behavior:
- It only runs when there are viable candidates and the run is not a dry-run, so existing behavior for "no candidates" and dry-runs is preserved.
- `allExist` is only left `true` when every candidate instance reports `exists == true` from `HasTorrentByAnyHash`, and any error or miss immediately flips `allExist` to `false` and falls back to the existing download+cross-seed path. That keeps the optimization strictly opportunistic/safe.
- When the short-circuit triggers, you:
  - Increment `run.TorrentsSkipped` once per candidate instance, mirroring how the later `exists` branch in the cross-seed path treated per-instance "already exists" outcomes.
  - Append `CrossSeedRunResult` entries with `Status: "exists"` and `MatchedTorrentHash`/`Name` populated, which aligns with existing result semantics and the `CrossSeedRunResult` model.
  - Return `CrossSeedFeedItemStatusProcessed` with a non-nil `infoHash` so `markFeedItem` persists the hash for the feed item, keeping feed history consistent with the "downloaded + parsed" flow.

Given the surrounding counters (`CandidatesFound`, `TorrentsSkipped`, run status calculation) and the fact that `candidatesResp.Candidates` only ever contains instances with matching complete torrents, this optimization should behave identically to the old path for "already-have-this-torrent-everywhere" cases while avoiding the unnecessary download and cross-seed roundtrip.

I don't see any functional bugs or accounting inconsistencies in this block; it's a clean, conservative optimization.
internal/services/jackett/service.go (1)
3076-3105: Minor: Redundant check after parameter splitting.

The magnet URL parsing is correct, but lines 3097-3099 check for `&` in `hashPart` after already splitting by `&` at line 3091. Each param after splitting should not contain `&`, making this check redundant. Since `validateInfoHash` will catch any malformed data anyway, consider simplifying:

```diff
 if strings.HasPrefix(strings.ToLower(param), "xt=urn:btih:") {
 	// Extract the hash part after "xt=urn:btih:"
 	hashPart := param[12:] // len("xt=urn:btih:") == 12
-	// Hash might be followed by other parameters or URL encoding
-	if idx := strings.Index(hashPart, "&"); idx > 0 {
-		hashPart = hashPart[:idx]
-	}
 	return validateInfoHash(hashPart)
 }
```

The defensive check is harmless but adds unnecessary complexity.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- internal/services/crossseed/service.go (1 hunks)
- internal/services/jackett/service.go (1 hunks)
- internal/services/jackett/service_test.go (1 hunks)
🧰 Additional context used
🧠 Learnings (6)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: web/src/components/torrents/TorrentTableOptimized.tsx:1510-1515
Timestamp: 2025-11-06T11:59:21.390Z
Learning: In the qui project, the API layer in web/src/lib/api.ts normalizes backend snake_case responses to camelCase for frontend consumption. For CrossSeed search results, the backend's download_url field is transformed to downloadUrl in the searchCrossSeedTorrent method, so frontend code should always use the camelCase variant (result.downloadUrl).
📚 Learning: 2025-11-28T20:32:30.126Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
Applied to files:
internal/services/crossseed/service.go
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.
Applied to files:
internal/services/crossseed/service.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.
Applied to files:
internal/services/crossseed/service.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.
Applied to files:
internal/services/crossseed/service.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.
Applied to files:
internal/services/crossseed/service.go
🧬 Code graph analysis (1)
internal/services/crossseed/service.go (2)
web/src/types/index.ts (1)
- CrossSeedRunResult (1363-1371)

internal/models/crossseed.go (2)
- CrossSeedRunResult (150-158)
- CrossSeedFeedItemStatusProcessed (233-233)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Run tests
🔇 Additional comments (4)
internal/services/jackett/service.go (2)
3040-3063: Well-structured infohash extraction with fallback.

The two-phase approach (direct attribute lookup first, then magnet URL fallback) is sound and properly handles Prowlarr's magneturl attribute when direct infohash isn't available. Both paths go through validation which ensures data quality.

3065-3074: LGTM - Proper infohash validation.

Correctly validates both SHA-1 (40 chars) and SHA-256 (64 chars) hex-encoded hashes with appropriate normalization.
internal/services/jackett/service_test.go (2)
2008-2094: Comprehensive test coverage for attribute extraction.

The test suite thoroughly validates all extraction paths including direct attributes, alternative keys, validation edge cases, and magnet URL fallback. The priority test (lines 2070-2073) correctly confirms that direct infohash takes precedence over magneturl.

2096-2152: Good coverage of magnet URL parsing scenarios.

The test suite validates the magnet URL parser across different parameter orderings, case variations, and error conditions. This ensures the parser is robust to different magnet URL formats.
For UNIT3D trackers that don't expose infohash in RSS feeds, check if the RSS GUID/InfoURL matches torrent comment fields on candidate instances before downloading.
Actionable comments posted: 0
🧹 Nitpick comments (1)
internal/services/crossseed/crossseed_test.go (1)
3283-3352: LGTM! Comprehensive test coverage for URL extraction logic.

The test thoroughly validates the `extractTorrentURLForCommentMatch` function with excellent coverage of the main scenarios: valid tracker URLs (UNIT3D, BHD, Aither, Blutopia), fallback behavior, security validation (HTTPS-only), and edge cases (empty inputs, invalid URLs). The table-driven approach is idiomatic and makes the test easy to understand and extend.

Optional: Consider adding edge cases for more complete coverage:
- Both inputs invalid: When both GUID and InfoURL are non-torrent URLs
- HTTP InfoURL: When GUID is invalid and InfoURL is HTTP (to verify consistent HTTPS enforcement)
- URLs with query parameters: e.g., `https://seedpool.org/torrents/607803?key=value`

Example additional test case:

```go
{
	name:     "Both GUID and InfoURL invalid",
	guid:     "not-a-url",
	infoURL:  "https://example.com/page/123",
	expected: "",
},
```

These additions would further strengthen the test suite, but the current coverage is solid for the primary use cases.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- internal/services/crossseed/crossseed_test.go (1 hunks)
- internal/services/crossseed/service.go (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- internal/services/crossseed/service.go
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Run tests
Actionable comments posted: 0
🧹 Nitpick comments (6)
internal/services/jackett/service.go (1)
3078-3109: LGTM! Consider using standard URL parsing for cleaner code.

The magnet URL parsing logic works correctly. For improved maintainability and robustness, consider using `url.Parse` and `url.Query().Get("xt")` instead of manual string splitting:

```go
func extractInfoHashFromMagnet(magnetURL string) string {
	magnetURL = strings.TrimSpace(magnetURL)
	if magnetURL == "" || !strings.HasPrefix(strings.ToLower(magnetURL), "magnet:") {
		return ""
	}
	u, err := url.Parse(magnetURL)
	if err != nil {
		return ""
	}
	xt := u.Query().Get("xt")
	if xt == "" {
		return ""
	}
	// The xt parameter format is: urn:btih:<infohash>
	if !strings.HasPrefix(strings.ToLower(xt), "urn:btih:") {
		return ""
	}
	hashPart := xt[9:] // len("urn:btih:") == 9
	return validateInfoHash(hashPart)
}
```

This approach handles URL encoding automatically and is more maintainable, though the current implementation is functionally correct.
internal/services/crossseed/service.go (2)
1637-1737: RSS infohash & comment-URL pre-checks look correct; only very minor robustness nits

The early-exit paths behave as intended: they only short-circuit when all candidate instances already have the torrent, they degrade gracefully on qB errors, and they keep run counters and feed item status consistent with the existing "exists" semantics.

Two small, non-blocking suggestions:
- Consider normalizing `result.InfoHashV1` before passing it into `HasTorrentByAnyHash` (e.g., via `normalizeHash`) to be maximally tolerant of indexer casing/whitespace.
- In the comment-URL pre-check, you might eventually want to log how often this path triggers (vs. the infohash pre-check) to validate its impact and help tune/extend patterns, but that's an observability refinement rather than a correctness issue.

7428-7469: Comment-URL helpers are clear and safe; you may later widen URL patterns if needed

`extractTorrentURLForCommentMatch`/`parseTorrentDetailsURL` are nicely constrained and avoid false positives by requiring HTTPS and specific `/torrents/` or `/details/` segments, which is appropriate for UNIT3D/BHD-style feeds.

If you ever hit real-world feeds that use `http://` or slightly different detail paths (e.g. query-based `torrents.php?id=`), you can extend `parseTorrentDetailsURL` to accept those variants, but that's an easy, incremental change and doesn't block this PR.

internal/services/crossseed/crossseed_test.go (3)
3283-3353: Solid coverage for `extractTorrentURLForCommentMatch`; consider one InfoURL-driven case

The table-driven test nicely covers:
- Accepted HTTPS tracker URLs (UNIT3D/BHD-style) in GUID.
- Fallback to `InfoURL`.
- Rejection of HTTP/non-torrent URLs and empty inputs.

If you want slightly tighter integration coverage, you could add a case where GUID is non-URL and only `InfoURL` carries the valid torrent details URL, mirroring the UNIT3D-style feeds that rely on `link` rather than `guid`.
3355-3499: infohash test fakes look correct; could also assert requested hash lists
`infohashTestSyncManager`/`infohashTestInstanceStore` match the existing `qbittorrentSync` test fakes and give fine-grained control over `HasTorrentByAnyHash` outcomes per instance. Returning copies from `GetTorrents`/`GetTorrentFilesBatch` avoids accidental mutation in tests, which is good.

If you want these tests to also validate that `processAutomationCandidate` passes the right, normalized hash set into `HasTorrentByAnyHash`, you could extend the fake to capture the `hashes` argument per call and assert on it in the tests (e.g., ensure it only contains the validated v1 infohash for now).
3776-3849: Comment-URL match test cleanly validates the non-infohash fallback
`TestProcessAutomationCandidate_SkipsWhenCommentURLMatches` correctly models the UNIT3D-style scenario where:
- No infohash is available.
- The GUID is a tracker details URL.
- An existing torrent's `Comment` embeds that URL.

The test verifies that the helper URL extraction plus comment matching path short-circuits before download and leaves `returnedHash` nil, which matches the "no hash was trusted here" semantics.

If you later want even more confidence, you could add a variant where the GUID is non-URL and the same torrent URL lives only in `InfoURL`, to prove that the comment-match fallback works with both feed fields, not just GUID.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- internal/services/crossseed/crossseed_test.go (2 hunks)
- internal/services/crossseed/service.go (2 hunks)
- internal/services/jackett/service.go (1 hunks)
🧰 Additional context used
🧠 Learnings (7)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
📚 Learning: 2025-11-28T20:32:30.126Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
Applied to files:
- internal/services/crossseed/service.go
- internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.
Applied to files:
- internal/services/crossseed/service.go
- internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.
Applied to files:
internal/services/crossseed/service.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.
Applied to files:
- internal/services/crossseed/service.go
- internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.
Applied to files:
- internal/services/crossseed/service.go
- internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-21T21:11:50.633Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 625
File: internal/qbittorrent/sync_manager.go:1112-1123
Timestamp: 2025-11-21T21:11:50.633Z
Learning: In internal/qbittorrent/sync_manager.go, the GetCachedFilesBatch interface documentation (around line 39-40) should specify "uppercase hex" instead of "lowercase hex" to match the actual normalization practice used throughout the codebase (e.g., normalizeHash in internal/services/crossseed/service.go uses strings.ToUpper, and hash filtering uses uppercase).
Applied to files:
internal/services/jackett/service.go
🧬 Code graph analysis (2)
internal/services/crossseed/service.go (2)
web/src/types/index.ts (2)
- CrossSeedRunResult (1363-1371)
- Torrent (216-273)

internal/models/crossseed.go (2)
- CrossSeedRunResult (150-158)
- CrossSeedFeedItemStatusProcessed (233-233)
internal/services/crossseed/crossseed_test.go (4)
internal/qbittorrent/sync_manager.go (2)
- CrossInstanceTorrentView (99-103)
- TorrentView (93-96)

internal/services/crossseed/service.go (1)
- Service (236-287)

internal/models/crossseed.go (3)
- CrossSeedAutomationSettings (20-51)
- CrossSeedRun (161-177)
- CrossSeedFeedItemStatusProcessed (233-233)

internal/services/crossseed/models.go (3)
- CrossSeedRequest (15-38)
- CrossSeedResponse (41-48)
- InstanceCrossSeedResult (51-60)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Run tests
🔇 Additional comments (4)
internal/services/jackett/service.go (2)
3045-3063: LGTM! Well-structured fallback for infohash extraction.

The enhancement adds a sensible fallback to extract the infohash from the `magneturl` attribute when direct infohash attributes aren't available. This broadens indexer support (e.g., Prowlarr) as noted in the PR objectives.
3065-3076: LGTM! Solid validation logic.

The validation correctly checks for hex-encoded SHA-1 (40 chars) or SHA-256 (64 chars) infohashes using `hex.DecodeString`, with clear documentation that Base32 encoding is not supported.

internal/services/crossseed/crossseed_test.go (2)
25-25: Jackett import is appropriate for the new test helpers

The added `jackett` import cleanly supports the new `TorrentDownloadRequest` and `SearchResult` usages in the tests; no issues here.
3500-3590: Infohash pre-check tests exercise the key behaviors well

The three `processAutomationCandidate` tests cover the intended matrix:
- Skip when the infohash is reported as existing on all target instances.
- Proceed (download + cross-seed) when only some instances have the infohash.
- Proceed when `HasTorrentByAnyHash` returns an error, ensuring graceful degradation.

They also assert `TorrentsSkipped` in the "all instances have it" case, aligning per-instance accounting with how cross-seed results are tallied elsewhere. The explicit `crossSeedInvoker` stubs in the "proceeds" cases avoid nil panics and keep the tests focused on decision logic rather than injection details.

No functional concerns here.
Also applies to: 3592-3689, 3691-3774
Actionable comments posted: 0
🧹 Nitpick comments (4)
internal/services/crossseed/crossseed_test.go (4)
3283-3353: Comment-URL extraction tests exercise the key UNIT3D/BHD-style patterns

The `TestExtractTorrentURLForCommentMatch` table nicely covers:
- Valid HTTPS GUID URLs for several UNIT3D trackers.
- Fallback to `InfoURL` when GUID is empty or non-URL.
- Rejection of plain HTTP and non-torrent URLs.

This gives good confidence in the helper's behavior.

If you want to harden it further, consider an extra case where `InfoURL` is HTTP (or an unsupported path) to assert that insecure/irrelevant URLs are explicitly rejected in both GUID and InfoURL positions.
3355-3499: infohashTestSyncManager/InstanceStore are well-focused fakes with minor realism gaps

The `infohashTestSyncManager` and `infohashTestInstanceStore` give you tight control over:
- Per-instance torrent listings, files, and properties.
- Per-instance `HasTorrentByAnyHash` outcomes via `hashResults`.

Two minor points to consider:
- `HasTorrentByAnyHash` ignores the `hashes` slice and only keys off `instanceID`. That's fine for these tests, but it does diverge from the real API contract and could make future tests less expressive if you ever need to assert behavior that depends on which hashes were passed.
- Several methods (`AddTorrent`, `BulkAction`, `GetQBittorrentSyncManager`, etc.) are implemented as no-ops that always succeed. That keeps current tests simple, but it also means accidental new call paths won't fail loudly. Mirroring the pattern from `fakeSyncManager`/`mockRecoverSyncManager` (returning an explicit "not implemented in infohashTestSyncManager" error) would make unexpected usage surface quickly during test runs.

Both points are optional; the current fakes are sufficient for the scenarios under test.
3500-3774: Infohash pre-check tests comprehensively cover success, partial, error, and cancellation paths

The four new `processAutomationCandidate` tests around infohash pre-checks look solid:
- `SkipsWhenInfohashExistsOnAllInstances` ensures:
  - Hash presence is checked per instance.
  - The candidate is marked processed with `TorrentsSkipped` incremented.
  - No torrent download is attempted when all instances already have the infohash.
- `ProceedsWhenInfohashExistsOnSomeInstances` validates the "some instances have it" branch, asserting that the download path is still exercised and cross-seed invocation can proceed for missing instances.
- `ProceedsOnHashCheckError` confirms graceful degradation: hash-check errors cause a logged warning and fall back to the normal download path instead of failing the candidate outright.
- `PropagatesContextCancellation` correctly treats a `context.Canceled` error from `HasTorrentByAnyHash` as fatal, returning a failed status and avoiding any download.

As a small enhancement, you could simulate actual context cancellation by using `context.WithCancel`/`WithTimeout` and having the fake hash-check observe `ctx.Err()`, but that's strictly optional given you already assert propagation of `context.Canceled`.
3853-3926: Comment-URL based skip test validates UNIT3D-style fallback behavior
`TestProcessAutomationCandidate_SkipsWhenCommentURLMatches` nicely exercises the new "comment URL match" shortcut for trackers that don't expose infohashes:
- Infohash is empty in the search result.
- GUID carries the UNIT3D-style HTTPS details URL.
- The existing torrent's `Comment` embeds that URL.
- The test asserts the candidate is processed as skipped, with no download attempted.

To further lock in the behavior, you could add a complementary case where the torrent URL must be taken from `InfoURL` instead of `GUID` (matching your helper's fallback) to ensure the end-to-end path works for both GUID- and InfoURL-based feeds.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- internal/services/crossseed/crossseed_test.go (2 hunks)
- internal/services/crossseed/service.go (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- internal/services/crossseed/service.go
🧰 Additional context used
🧠 Learnings (6)
📓 Common learnings
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
📚 Learning: 2025-11-28T20:32:30.126Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:209-212
Timestamp: 2025-11-28T20:32:30.126Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The cross-seed recheck-resume worker intentionally runs for the process lifetime and keys pending entries by hash only. This is acceptable under the current constraint that background seeded-search runs operate on a single instance at a time; graceful shutdown and instanceID|hash keying are deferred by design.
Applied to files:
internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-28T22:21:20.730Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go:2415-2457
Timestamp: 2025-11-28T22:21:20.730Z
Learning: Repo: autobrr/qui PR: 641
File: internal/services/crossseed/service.go
Learning: The determineSavePath function intentionally includes a contentLayout string parameter for future/content-layout branching and API consistency. Its presence is by design even if unused in the current body; do not flag as an issue in reviews.
Applied to files:
internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-06T12:11:04.963Z
Learnt from: Audionut
Repo: autobrr/qui PR: 553
File: internal/services/crossseed/service.go:1045-1082
Timestamp: 2025-11-06T12:11:04.963Z
Learning: The autobrr/qui project uses a custom go-qbittorrent client library (github.com/autobrr/go-qbittorrent) that supports both "paused" and "stopped" parameters when adding torrents via the options map. Both parameters should be set together when controlling torrent start state, as seen in internal/services/crossseed/service.go and throughout the codebase.
Applied to files:
internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-25T22:46:03.762Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 632
File: internal/backups/service.go:1401-1404
Timestamp: 2025-11-25T22:46:03.762Z
Learning: In qui's backup service (internal/backups/service.go), background torrent downloads initiated during manifest import intentionally use a fire-and-forget pattern with the shared service context (s.ctx). Per-run cancellation is not needed, as orphaned downloads completing after run deletion are considered harmless and acceptable. This design prioritizes simplicity over per-run lifecycle management for background downloads.
Applied to files:
internal/services/crossseed/crossseed_test.go
📚 Learning: 2025-11-27T15:33:39.007Z
Learnt from: s0up4200
Repo: autobrr/qui PR: 641
File: internal/services/jackett/service.go:309-330
Timestamp: 2025-11-27T15:33:39.007Z
Learning: In internal/services/jackett/service.go, synchronous execution paths (test executors, interactive searches with explicit indexer selection, and scheduler-unavailable fallback) intentionally pass jobID=0 to result callbacks. This is acceptable because outcome tracking via ReportIndexerOutcome is only used for async cross-seed operations that always use the scheduler and receive unique jobIDs. Interactive searches and tests do not use outcome tracking.
Applied to files:
internal/services/crossseed/crossseed_test.go
🧬 Code graph analysis (1)
internal/services/crossseed/crossseed_test.go (5)
internal/qbittorrent/sync_manager.go (3): `CrossInstanceTorrentView` (99-103), `TorrentView` (93-96), `SyncManager` (159-182)
internal/services/crossseed/service.go (2): `Service` (236-287), `AutomationRunOptions` (410-414)
internal/models/crossseed.go (4): `CrossSeedAutomationSettings` (20-51), `CrossSeedRun` (161-177), `CrossSeedFeedItemStatusProcessed` (233-233), `CrossSeedFeedItemStatusFailed` (235-235)
internal/services/jackett/models.go (1): `SearchResult` (67-110)
internal/services/crossseed/models.go (2): `CrossSeedResponse` (41-48), `InstanceCrossSeedResult` (51-60)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Run tests
🔇 Additional comments (1)
internal/services/crossseed/crossseed_test.go (1)
23-28: Jackett service import is appropriate and correctly used
The new `jackett` import is used for `TorrentDownloadRequest`, `SearchResult`, and `EnabledIndexerInfo` in the added tests and keeps the tests aligned with the production APIs. No changes needed here.
Actionable comments posted: 1
🧹 Nitpick comments (2)
internal/services/crossseed/service.go (1)
1693-1746: Comment-URL pre-check + URL helpers are sound; optional robustness tweaks
The comment-URL pre-check correctly short-circuits only when every candidate instance has a torrent whose `Comment` includes the extracted URL, which matches the "already present on all candidates" requirement, and it safely falls back to the normal download path otherwise. `extractTorrentURLForCommentMatch`/`parseTorrentDetailsURL` are conservative (HTTPS + `/torrents/` or `/details/`), so they can only cause missed optimizations, not false positives. If you want to harden this further without changing behavior:
- Consider allowing `http://` in `parseTorrentDetailsURL` for trackers that still use plain HTTP, or at least making the scheme check configurable, to avoid silently skipping the optimization on such feeds.
- Think about normalizing trivial variants (e.g., trimming a trailing `/` or dropping simple query strings) so that comments containing a slightly decorated details URL still match the GUID/InfoURL-derived string.
These are non-blocking polish items; the current implementation is safe and aligned with the described trackers.
Also applies to: 7437-7477
internal/services/crossseed/crossseed_test.go (1)
2355-2498: LGTM! Well-structured test doubles for multi-instance scenarios.
The `infohashTestSyncManager` and `infohashTestInstanceStore` provide comprehensive mocking capabilities with configurable behavior per instance. The `hashResults` map is a clean way to control `HasTorrentByAnyHash` responses for different test scenarios. For future maintainability, if this test double is reused across multiple test files, consider extracting it to a shared test helper package.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
internal/services/crossseed/crossseed_test.go (2 hunks)
internal/services/crossseed/service.go (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
internal/services/crossseed/crossseed_test.go (2)
internal/qbittorrent/sync_manager.go (2): `CrossInstanceTorrentView` (99-103), `TorrentView` (93-96)
internal/services/jackett/models.go (1): `SearchResult` (67-110)
internal/services/crossseed/service.go (3)
web/src/types/index.ts (2): `CrossSeedRunResult` (1366-1374), `Torrent` (217-274)
internal/models/crossseed.go (3): `CrossSeedRunResult` (150-158), `CrossSeedFeedItemStatusFailed` (235-235), `CrossSeedFeedItemStatusProcessed` (233-233)
web/src/pages/Torrents.tsx (1): `Torrents` (28-350)
🔇 Additional comments (5)
internal/services/crossseed/crossseed_test.go (5)
2284-2353: LGTM! Comprehensive test coverage for URL extraction.
The test covers all major UNIT3D-style tracker URL formats and properly validates the fallback behavior when GUID is empty or invalid. The HTTP rejection and non-torrent URL filtering are good security practices.
2500-2590: LGTM! Core feature test validates infohash pre-check optimization.
This test correctly verifies the primary optimization: when all target instances already have the torrent by infohash, the download is skipped entirely. The assertion on `downloadCalled` confirms the download is avoided, and the skip counter properly tracks both instances.
2592-2689: LGTM! Partial match scenario handled correctly.
The test properly validates that when only some instances have the torrent by infohash, the download proceeds for the instances that need it. This ensures the optimization doesn't prevent legitimate cross-seeding opportunities.
2691-2928: LGTM! Excellent error handling test coverage.
The three tests correctly distinguish between:
- Generic errors (graceful degradation - proceed with download)
- Context cancellation/deadline errors (propagate immediately - don't fall back)
This ensures the optimization doesn't hide critical context signals while still being resilient to transient errors.
2930-4003: LGTM! Fallback mechanism for non-infohash trackers.
This test validates the fallback to comment URL matching for trackers that don't provide infohash in feeds (like UNIT3D-based trackers mentioned in the PR objectives). The test correctly simulates the scenario and verifies the skip behavior works even without infohash.
Actionable comments posted: 0
🧹 Nitpick comments (3)
internal/services/crossseed/service.go (3)
1637-1692: Infohash pre-check flow and counters look solid; only nit is pointer helper style
The `InfoHashV1` pre-check correctly:
- short-circuits only when all candidate instances already have the torrent,
- treats context cancel/timeout as a real failure (`run.TorrentsFailed++` + `FeedItemStatusFailed`),
- otherwise logs and safely falls back to the normal download/apply path,
- records per-instance `"exists"` results and increments `run.TorrentsSkipped` per instance, consistent with how later apply results are tallied.

One small nit: the `MatchedTorrentHash`/`MatchedTorrentName` fields are built via inline `func() *string { ... }()` closures, which allocate more than necessary and differ from the local-var pattern used elsewhere in this file. Consider the more idiomatic style for consistency and to avoid the tiny closure cost:

```diff
 if exists && existing != nil {
+	hash := existing.Hash
+	name := existing.Name
 	existingResults = append(existingResults, models.CrossSeedRunResult{
 		InstanceID:   candidate.InstanceID,
 		InstanceName: candidate.InstanceName,
 		Success:      false,
 		Status:       "exists",
 		Message:      "Torrent already exists (infohash pre-check)",
-		MatchedTorrentHash: func() *string { h := existing.Hash; return &h }(),
-		MatchedTorrentName: func() *string { n := existing.Name; return &n }(),
+		MatchedTorrentHash: &hash,
+		MatchedTorrentName: &name,
 	})
```
1694-1748: Comment-URL pre-check logic matches goals; same small pointer-style nit as above
The UNIT3D/BHD-style comment-URL pre-check:
- Extracts a plausible torrent details URL once and reuses it,
- Requires the URL-backed torrent to exist on all candidate instances before skipping the download, mirroring the infohash pre-check semantics,
- Properly treats `ctx.Err() != nil` as a failure (`run.TorrentsFailed++` + `FeedItemStatusFailed`), and
- Emits per-instance `"exists"` results and increments `run.TorrentsSkipped` per instance, which fits existing run summary semantics.

As with the infohash block, you can optionally switch away from the inline closures for `MatchedTorrentHash`/`MatchedTorrentName` to a local-var pattern for consistency and slightly less allocation:

```diff
 if found && matchedTorrent != nil {
+	hash := matchedTorrent.Hash
+	name := matchedTorrent.Name
 	existingResults = append(existingResults, models.CrossSeedRunResult{
 		InstanceID:   candidate.InstanceID,
 		InstanceName: candidate.InstanceName,
 		Success:      false,
 		Status:       "exists",
 		Message:      "Torrent already exists (comment URL pre-check)",
-		MatchedTorrentHash: func() *string { h := matchedTorrent.Hash; return &h }(),
-		MatchedTorrentName: func() *string { n := matchedTorrent.Name; return &n }(),
+		MatchedTorrentHash: &hash,
+		MatchedTorrentName: &name,
 	})
```
7438-7480: Helper URL extraction is clear; consider minor robustness tweaks only if needed
The new helpers cleanly encapsulate the comment-URL matching rule:
- prefer GUID, fall back to `InfoURL`,
- require `https://` and `/torrents/` or `/details/` path segments, which matches the documented UNIT3D/BHD patterns.
If you ever need to support slightly messier feeds, two optional robustness tweaks would be:
- `strings.TrimSpace(rawURL)` before inspection, and
- a scheme check using a lowercased copy (`strings.HasPrefix(strings.ToLower(rawURL), "https://")`) to tolerate unusual but valid mixed-case schemes.
Not required for the trackers you're targeting today, but easy to adopt later if feed variance becomes an issue.
🔇 Additional comments (6)
internal/services/crossseed/crossseed_test.go (6)
2284-2353: LGTM! Comprehensive test coverage for URL extraction.
The test covers all major tracker URL formats (UNIT3D, BHD, Aither, Blutopia), fallback behavior, and validation (HTTP rejection, non-torrent URLs). Good edge case coverage including empty inputs.
3355-3498: LGTM! Well-designed test doubles for infohash testing.
The `infohashTestSyncManager` and supporting types provide configurable mock behavior specifically for testing hash-based pre-checks. While there's some overlap with the existing `fakeSyncManager`, the additional `hashResults` configuration is necessary for testing scenarios where hash checks succeed on some instances but fail on others.
The implementation correctly:
- Stores per-instance torrents, files, and properties
- Provides configurable `HasTorrentByAnyHash` results via `hashResults`
- Copies data properly to avoid mutation issues
- Implements all required interface methods
3500-3590: LGTM! Test correctly verifies skip behavior when infohash exists on all instances.
The test properly validates the optimization where downloads are skipped when the infohash already exists on all target instances. Key assertions correctly verify:
- Download is NOT called (`downloadCalled = false`)
- Skip counter is incremented for both instances (`TorrentsSkipped = 2`)
- Returned hash matches the existing torrent
3592-3774: LGTM! Tests correctly verify proceed behavior for partial matches and errors.
Two well-designed tests:
Partial instance coverage (lines 3592-3689): Correctly verifies that download proceeds when the infohash exists on some but not all instances, ensuring the torrent is added to remaining instances.
Error handling (lines 3691-3774): Properly tests graceful degradation when hash checks fail. The system proceeds with the download rather than blocking on transient errors, which is the correct behavior for resilience.
Both include proper mock `crossSeedInvoker` setup to prevent nil panics.
3776-3930: LGTM! Critical tests for context error propagation.
These tests ensure that context cancellation and deadline exceeded errors are properly propagated rather than treated as transient failures that trigger fallback behavior. This distinction is important because:
- Context cancellation indicates user/system-initiated shutdown, not a transient error
- Deadline exceeded indicates timeout, which should fail fast rather than continue processing
Both tests correctly verify:
- Errors are propagated using `assert.ErrorIs`
- `TorrentsFailed` counter is incremented
- Error messages contain appropriate context
3932-4005: LGTM! Test correctly verifies comment URL matching fallback.This test validates the fallback behavior when infohash is unavailable. The system extracts torrent detail URLs from GUID/InfoURL fields and matches them against existing torrent comments, preventing duplicate downloads from trackers that don't provide infohash in their RSS feeds.
Test correctly:
- Omits infohash in search result to force fallback
- Sets up torrent comment with matching URL
- Doesn't configure
hashResults(returns nil, false, nil) to trigger fallback path- Verifies download is skipped
- Verifies
returnedHashis nil (appropriate since no infohash available)
Referenced by a downstream Renovate update (gitea.alexlebens.dev/alexlebens/infrastructure #2664) bumping ghcr.io/autobrr/qui from `v1.9.1` to `v1.10.0`; this fix shipped in that release as [`3baa007`](autobrr/qui@3baa007): fix(rss): skip download when torrent already exists by infohash ([#715](autobrr/qui#715)).
When RSS automation finds candidates, check if the feed item's infohash already exists on all candidate instances before downloading the torrent file. This prevents unnecessary downloads when a torrent was already added via another source like autobrr.
Also adds fallback extraction of infohash from magneturl attribute.
Indexers providing infohash via Prowlarr:
- Generic Torznab (infohash attribute)
- HDBits, BeyondHD, Avistaz/PrivateHD/ExoticaZ, BroadcastheNet
- Knaben, TorrentsCSV
- Cardigann (per-definition, with magnet fallback)
- TorrentRssParser-based indexers (from magnet URL if present)
Not supported (no infohash in feed):
- PassThePopcorn, UNIT3D-based trackers
- Anidex, Animedia, SubsPlease, Shizaproject