Minor fixes, HA compatibility improvements, and CI integration testing #2
Merged
viperadnan-git merged 13 commits into viperadnan-git:main on Feb 28, 2026
Conversation
Each issue is described inline in the changed code. Summary:
1. signup._complete_tutorial — `headers = Constants.API_HEADERS` was a
reference, not a copy, so adding "Authorization" permanently mutated the
shared class-level dict. Every subsequent _static_request (login, signup)
would then carry a stale signup token. Fixed by using a dict-spread copy:
`headers = {**Constants.API_HEADERS, "Authorization": ...}`.
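The aliasing bug is easy to reproduce in isolation. A minimal sketch, using a stand-in `Constants` class rather than the SDK's real one:

```python
# Minimal reproduction of the header-dict aliasing bug; Constants is a
# stand-in for the SDK's real class-level header dict.
class Constants:
    API_HEADERS = {"Content-Type": "application/json"}

def complete_tutorial_buggy(token: str) -> dict:
    headers = Constants.API_HEADERS      # reference, not a copy
    headers["Authorization"] = token     # mutates the shared class dict
    return headers

def complete_tutorial_fixed(token: str) -> dict:
    # Dict-spread copy: the class-level dict is never touched.
    return {**Constants.API_HEADERS, "Authorization": token}

complete_tutorial_buggy("stale-signup-token")
leaked = "Authorization" in Constants.API_HEADERS   # stale token now leaks into every request

Constants.API_HEADERS = {"Content-Type": "application/json"}  # reset for the fixed path
complete_tutorial_fixed("fresh-token")
clean = "Authorization" not in Constants.API_HEADERS
```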
2. signup.submit_otp — the OTP guard used `and` instead of `or`, so it only
raised when the value failed both checks at once (wrong length and
non-numeric). Inputs failing a single check, such as an all-digit string of
the wrong length or a 6-character non-numeric string, silently passed
through. Corrected to `not otp.isdigit() or len(otp) != 6`.
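The corrected guard can be checked against both failure modes; the `is_invalid_otp_*` names are illustrative, not SDK functions:

```python
def is_invalid_otp_buggy(otp: str) -> bool:
    # Original guard: both checks had to fail before it raised.
    return len(otp) != 6 and not otp.isdigit()

def is_invalid_otp_fixed(otp: str) -> bool:
    # Corrected guard: failing either check is enough to reject.
    return not otp.isdigit() or len(otp) != 6

buggy_accepts_short_digits = not is_invalid_otp_buggy("12345")   # all digits, wrong length
buggy_accepts_six_letters = not is_invalid_otp_buggy("abcdef")   # 6 chars, non-numeric
fixed_rejects_short_digits = is_invalid_otp_fixed("12345")
fixed_rejects_six_letters = is_invalid_otp_fixed("abcdef")
fixed_accepts_valid = not is_invalid_otp_fixed("123456")
```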
3. torrent.add_torrent_file (multipart "path" field) — all non-file multipart
fields must be the tuple form `(None, value)` so httpx treats them as plain
form fields. "path" was the only field passed as a bare string, causing
httpx to encode it as a filename instead of a value. Fixed to
`(None, path.path)`.
4. torrent.add_torrent_file (file handle leak) — `open(local_path, "rb")` was
called inline with no guarantee of closure after the request finished.
Wrapped in `with open(...) as torrent_file:` so the OS handle is released
on both the success and exception paths.
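A sketch of both multipart fixes together; the field names and the in-memory `BytesIO` stand-in (for `open(local_path, "rb")`) are illustrative, not the SDK's exact endpoint shape:

```python
import io

def build_torrent_multipart(path: str, torrent_file) -> dict:
    # Non-file multipart fields use the (None, value) tuple form so httpx
    # encodes them as plain form fields; a bare string would be treated
    # as file content with "path" as the filename.
    return {
        "path": (None, path),
        "torrent": ("upload.torrent", torrent_file),
    }

# The with-block guarantees the handle is closed on both the success and
# exception paths; BytesIO stands in for a real file handle here.
with io.BytesIO(b"d8:announce0:e") as torrent_file:
    files = build_torrent_multipart("/downloads", torrent_file)
    # httpx.Client().post(url, files=files) would run here
    path_is_plain_field = files["path"][0] is None

handle_closed = torrent_file.closed
```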
5. remote_download.add_remote_download — the function is declared `-> bool`
but fell through to an implicit `None` return when success=False and the
response contained no error message. Added explicit `return False`.
Also fixed the `reponse` variable-name typo.
6. remote_task_list.RemoteTaskList.from_response — `response.json()` was
called unguarded. Every other `from_response` in the SDK wraps this in a
JSONDecodeError → InvalidResponseError handler; this one was overlooked
during the refactor. Added the same guard for consistency.
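A minimal sketch of that guard, with a stand-in exception class for the SDK's real `InvalidResponseError`:

```python
import json

class InvalidResponseError(Exception):
    """Stand-in for the SDK's real InvalidResponseError."""

def from_response_body(body: str) -> dict:
    # Same guard the other from_response helpers use: a malformed body
    # surfaces as a clean SDK error rather than a raw JSONDecodeError.
    try:
        return json.loads(body)
    except json.JSONDecodeError as exc:
        raise InvalidResponseError(f"invalid JSON in response: {exc}") from exc

tasks = from_response_body('{"tasks": []}')
try:
    from_response_body("<html>502 Bad Gateway</html>")
    guarded = False
except InvalidResponseError:
    guarded = True
```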
https://claude.ai/code/session_018ExLJ6VSeibzwnYraNK97B
The project had no tests of any kind; the only CI job was a PyPI publish pipeline triggered on version tags. Add a "Local Testing" section to README.md containing:
- Setup instructions (editable install, env-var credentials)
- A self-contained smoke_test.py script that exercises every SDK module against the live API: auth, user details, storage details, file list, torrent add/list/details/delete round-trip, and remote-download add/list/delete round-trip
- Expected output for a passing run
- A note on enabling debug logging to trace raw HTTP calls
The script uses environment variables for credentials (SONICBIT_EMAIL / SONICBIT_PASSWORD) and exits with a non-zero code if any check fails, making it easy to run manually or wire into a future CI job.
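The overall shape of such a script might look like the sketch below; `run_smoke_test` and the commented `check` calls are illustrative assumptions, not the script's literal contents:

```python
import os
import sys

def run_smoke_test(env: dict) -> int:
    # Credentials come from the environment, never from the repo.
    email = env.get("SONICBIT_EMAIL")
    password = env.get("SONICBIT_PASSWORD")
    if not email or not password:
        print("Set SONICBIT_EMAIL and SONICBIT_PASSWORD", file=sys.stderr)
        return 2

    failures: list = []

    def check(name, fn):
        # Run each section independently so one failure doesn't mask the rest.
        try:
            fn()
            print(f"PASS {name}")
        except Exception as exc:
            failures.append(name)
            print(f"FAIL {name}: {exc}")

    # The real script would exercise each module here, e.g.:
    # check("auth", lambda: SonicBit(email=email, password=password))
    # check("user", lambda: client.get_user_details())
    # ...torrent and remote-download round-trips...

    return 1 if failures else 0   # non-zero exit code on any failure

# Entry point in the real script would be:
#     sys.exit(run_smoke_test(dict(os.environ)))
```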
Changes motivated by Home Assistant integration requirements, where blocking the event loop or sharing mutable state across instances are disqualifying problems.
1. base.py — add a default request timeout (REQUEST_TIMEOUT = 15 s). The httpx.Client was created with no timeout, so a silent server hang would block the caller indefinitely. For async runtimes like HA that run SDK calls in an executor thread, an unbounded block starves the thread pool. The timeout is a class variable so integrations can override it without subclassing.
2. base.py — reduce retry wait minimum from 4 s to 1 s (max 5 s). The previous wait_exponential(min=4) meant the first retry always waited at least 4 seconds. With HA update intervals of 30–60 s, a 4-second floor adds unacceptable latency to transient failures.
3. base.py — also retry on httpx.TimeoutException. Now that requests have a timeout, TimeoutException is a retryable transient condition (the previous ConnectError-only filter would have surfaced timeouts as hard errors).
4. base.py — apply REQUEST_TIMEOUT to _static_request as well. One-shot calls (login, signup) now respect the same timeout cap via kwargs.setdefault, so callers can still override per-call if needed.
5. auth.py — replace _refreshing bool with threading.Lock. The boolean had a TOCTOU race: two threads could both read False, both enter the refresh branch, and both issue a new login request simultaneously. A Lock serialises the refresh; concurrent threads that find the lock held will block until the refresh completes and then retry with the already-updated token.
6. client.py — fix mutable default argument (TokenFileHandler). Using `token_handler: TokenHandler = TokenFileHandler()` evaluated the constructor once at class-definition time, so every SonicBit() call that omitted token_handler silently shared the same instance. With multiple accounts this causes credential cross-contamination. Fixed by defaulting to None and constructing a fresh TokenFileHandler inside __init__ when no handler is provided.
7. token_handler.py — replace input()/print() with NotImplementedError. The base TokenHandler used input() and print() as placeholder implementations. Both block the calling thread (and therefore the HA event loop when running in an executor). Replacing them with NotImplementedError makes accidental use of the base class produce an immediate, clear error instead of a silent hang. A docstring now explains how to implement a custom backend (database, secrets manager, etc.).
Note: this SDK remains synchronous. HA integrations must wrap all calls with hass.async_add_executor_job(); a comment in base.py documents this requirement for integration authors.
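Item 5's locking pattern can be sketched as follows; `AuthSketch` and its None-token staleness check are stand-ins for the SDK's real refresh logic:

```python
import threading

class AuthSketch:
    def __init__(self):
        self._refresh_lock = threading.Lock()
        self._token = None
        self.login_count = 0   # counts actual login requests issued

    def _do_login(self) -> str:
        self.login_count += 1
        return f"token-{self.login_count}"

    def refresh_token(self) -> str:
        # Old design: `if not self._refreshing: ...` had a TOCTOU race
        # between the read of the flag and the write that set it.
        with self._refresh_lock:
            # Re-check inside the lock: threads that blocked here while
            # another thread refreshed reuse the freshly written token.
            if self._token is None:
                self._token = self._do_login()
            return self._token

auth = AuthSketch()
threads = [threading.Thread(target=auth.refresh_token) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Exactly one login happens regardless of how many threads raced.
```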
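Item 6's mutable-default pitfall is reproducible in a few lines; `SonicBitBuggy`, `SonicBitFixed`, and the stripped-down `TokenFileHandler` are illustrative stand-ins:

```python
from typing import Optional

class TokenFileHandler:
    # Stand-in for the SDK's file-backed handler; each instance holds
    # its own token state.
    def __init__(self):
        self.token = None

class SonicBitBuggy:
    # The default is evaluated ONCE, when the def statement runs, so every
    # call that omits token_handler shares a single handler instance.
    def __init__(self, token_handler: TokenFileHandler = TokenFileHandler()):
        self.token_handler = token_handler

class SonicBitFixed:
    # Default of None; a fresh handler is constructed per instance.
    def __init__(self, token_handler: Optional[TokenFileHandler] = None):
        self.token_handler = token_handler if token_handler is not None else TokenFileHandler()

shared = SonicBitBuggy().token_handler is SonicBitBuggy().token_handler      # True: cross-contamination
isolated = SonicBitFixed().token_handler is not SonicBitFixed().token_handler  # True: per-instance state
```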
Adds a GitHub Actions workflow that exercises the full torrent round-trip
against the live SonicBit API on every push to main/master/claude/** and
on pull requests targeting main/master.
Workflow (.github/workflows/integration.yml):
- Uses astral-sh/setup-uv@v5 and uv sync --no-dev (reads uv.lock, matching
the existing publish.yml style).
- Job timeout of 25 minutes; concurrency group prevents two runs from
competing over the same test account at the same time.
- Reads SONICBIT_EMAIL and SONICBIT_PASSWORD from repository secrets.
Test script (.github/scripts/ci_torrent_lifecycle.py):
1. Pre-clean — removes the test torrent if a previous run left it behind.
2. Add — adds Big Buck Bunny via magnet link (small, permanently
seeded public-domain torrent; completes in seconds on a
seedbox).
3. Download — polls list_torrents() every 10 s until progress == 100 %
(10-minute timeout).
4. Sync — polls until in_cache is True, confirming the data has been
moved from the seedbox to permanent cloud storage
(5-minute timeout).
5. Verify — calls list_files() and checks the torrent folder is visible
at the root level (warning rather than hard failure, because
some plan tiers may nest or rename the folder).
6. Delete — runs in a finally block so the test account is always left
clean regardless of whether earlier steps passed or failed.
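The Download and Sync steps share one polling shape; this generic sketch (with a hypothetical `fetch_torrent()` in the comments) is an assumption about the script's structure, not its literal code:

```python
import time

def poll_until(check, timeout_s: float, interval_s: float,
               now=time.monotonic, sleep=time.sleep) -> bool:
    # Call `check` every `interval_s` seconds until it returns True or
    # the deadline passes; `now`/`sleep` are injectable for testing.
    deadline = now() + timeout_s
    while now() < deadline:
        if check():
            return True
        sleep(interval_s)
    return False

# Rough usage in the CI script (fetch_torrent is hypothetical):
#   poll_until(lambda: fetch_torrent().progress == 100, 600, 10)  # Download
#   poll_until(lambda: fetch_torrent().in_cache, 300, 10)         # Sync
```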
Required repository secrets:
SONICBIT_EMAIL test account e-mail
SONICBIT_PASSWORD test account password
The magnet link was hardcoded to Big Buck Bunny, meaning changing the
test fixture required a code change rather than a secret rotation.
- SONICBIT_TEST_MAGNET is now a required secret passed to the job.
The script reads it from the environment and fails fast with a clear
error if it (or either credential secret) is missing.
- parse_magnet() extracts the btih info-hash and the dn= display name
from the magnet URI at runtime, so both the torrent-lookup hash and
the file-listing verification are fully driven by the secret value
rather than any hardcoded strings.
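A `parse_magnet()` along these lines would do the job; this is a sketch, not necessarily the script's exact implementation:

```python
from urllib.parse import parse_qs, urlparse

def parse_magnet(magnet: str):
    # Extract the btih info-hash and the dn= display name from a magnet
    # URI, e.g. magnet:?xt=urn:btih:<hash>&dn=Name&tr=...
    params = parse_qs(urlparse(magnet).query)
    xt = params.get("xt", [""])[0]
    if not xt.startswith("urn:btih:"):
        raise ValueError(f"not a btih magnet link: {magnet!r}")
    info_hash = xt.removeprefix("urn:btih:").lower()
    display_name = params.get("dn", [""])[0]   # parse_qs already URL-decodes
    return info_hash, display_name
```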
Required repository secrets are now:
SONICBIT_EMAIL
SONICBIT_PASSWORD
SONICBIT_TEST_MAGNET magnet:?xt=urn:btih:<hash>&dn=Name&tr=...
fix: patch 6 gaps left by the v5→v6 API refactor
fix: relax pydantic constraint from >=2.12.5 to >=2.0.0 for HA compatibility

Home Assistant ships pydantic 2.12.2, which was blocked by the overly tight
>=2.12.5 lower bound set from the dev environment. The SDK only uses
BaseModel, Field, and ConfigDict, all available since pydantic v2.0, so
there is no reason to pin higher than 2.0.0.
The CI job was timing out in the sync step because the API consistently
returns in_cache=False for this account type, so the condition
`bool(t.in_cache)` never became True.
The API actually signals a completed sync through two mechanisms:
- in_cache=True — explicit flag, set on some plan types
- 'c' in status — status code that appears once the seedbox has moved
the data to permanent cloud storage
The logs show status=['sd', 'c', 'i'] appears at the very first sync poll
(immediately after progress reaches 100%), confirming 'c' is the reliable
signal on this account. Both conditions are now ORed together so the
step passes as soon as either signal is present.
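The ORed condition reduces to a one-liner; `sync_complete` is an illustrative helper name:

```python
def sync_complete(in_cache, status) -> bool:
    # Either signal is sufficient: the explicit in_cache flag (set on some
    # plan types) or the 'c' status code that appears once the seedbox has
    # moved the data to permanent cloud storage.
    return bool(in_cache) or "c" in (status or [])

first_poll = sync_complete(False, ["sd", "c", "i"])   # this account: flag never set
flag_plan = sync_complete(True, [])                   # plans that do set the flag
still_syncing = sync_complete(None, ["sd", "i"])
```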
https://claude.ai/code/session_01E5jYKaDuJWCb7LZgyk2u6n
fix: detect cloud-storage sync via status 'c' in addition to in_cache
Previously the CI script exercised only add_torrent / list_torrents /
list_files / delete_torrent. Seven public SDK methods had no CI
coverage at all. This commit adds a section for each:
User module
• get_user_details() — verify email matches login, account not suspended
• get_storage_details() — verify size_byte_limit > 0 and percent in [0,100]
RemoteDownload module (full add → list → delete lifecycle)
• add_remote_download() — accepts a 1 MB public file URL
• list_remote_downloads() — confirms the new task appears by URL
• delete_remote_download() — removes the task; verifies True return
Torrent module
• get_torrent_details() — run after sync; asserts file list is non-empty
and logs each file name/size/progress
File module
• delete_file() — deletes the torrent's cloud folder found via list_files();
sets cloud_files_deleted=True so the finally-block cleanup
uses with_file=False and avoids the spurious "Server Error"
that occurred when the API tried to delete already-gone files
Also adds a pre-clean step for the remote-download task (mirrors the
existing torrent pre-clean) and updates the module docstring step list.
ci: extend lifecycle test to cover all SDK public methods
viperadnan-git (Owner):
I think tests are too niche for your use cases, I will implement a better testing flow. So I merged everything except the CI.
PR author (Contributor):
The tests were quick and dirty. Thanks for merging the rest of the changes! Looking forward to the next release. When you do I'll update the home assistant integration to use your release.
PR Summary
Thank you for this excellent open-source project! I've made a few changes in my fork to support a Home Assistant integration and would love to share them back upstream.
Summary
This PR addresses six categories of changes:
Bug Fixes
sonicbit/modules/remote_download.py
- Fixed the `reponse` → `response` variable-name typo in `add_remote_download()`.
- Implicit `None` return: `add_remote_download()` could return `None` instead of `False` when the API returned a failure with no error message body. Added an explicit `return False`.

sonicbit/modules/signup.py
- `submit_otp()` used `and` (`len != 6 and not digits`), which silently accepted non-6-character inputs if they happened to be all digits. Fixed to `or` so both conditions are independently enforced.
- `_complete_tutorial()`: `headers = Constants.API_HEADERS` was a reference to the shared class-level dict, not a copy. Any subsequent mutation (e.g., adding the `Authorization` header) would permanently alter `Constants.API_HEADERS` for all future requests. Fixed with a shallow copy: `headers = {**Constants.API_HEADERS}`.

sonicbit/models/remote_download/remote_task_list.py
- `response.json()` was not wrapped in exception handling. Added `try/except JSONDecodeError` so a malformed server response raises a clean error.

sonicbit/modules/torrent.py
- Detect completed sync via `'c' in status` in addition to the existing `in_cache` flag, which alone was insufficient to reliably detect when a completed torrent had been synced.

Thread-Safety Fixes
sonicbit/modules/auth.py
- Replaced the `self._refreshing` flag used to prevent concurrent token refreshes: two threads could both read `False`, enter the refresh branch, and issue login requests simultaneously, corrupting the session state.
- Now uses a `threading.Lock()` (`self._refresh_lock`). The lock ensures only one thread executes the refresh; other threads wait on `acquire()` and then retry with the token the first thread already wrote.

Home Assistant Compatibility
sonicbit/client.py
- `token_handler: TokenHandler = TokenFileHandler()` evaluated the handler once at class-definition time. Every `SonicBit` instance shared the same object, mixing credentials for users with multiple accounts. Fixed by defaulting to `None` and constructing a fresh `TokenFileHandler()` inside `__init__`.

sonicbit/handlers/token_handler.py
- `write()` called `print()` and `read()` called `input()`, which hangs in non-interactive runtimes (including HA's executor threads). Converted to a proper abstract base class whose methods raise `NotImplementedError`, forcing callers to supply a concrete implementation.

sonicbit/base.py
- Added `REQUEST_TIMEOUT = 15`. A silent server hang blocks the caller indefinitely, a critical problem for HA's shared thread pool.
- Added `MAX_API_RETRIES = 3`.

Dependency Relaxation
pyproject.toml
- `pydantic>=2.12.5` → `pydantic>=2.0.0`

CI Integration Testing
.github/workflows/integration.yml (new file)
- GitHub Actions workflow that executes the torrent lifecycle test on every PR/push. Requires repository secrets: `SONICBIT_EMAIL`, `SONICBIT_PASSWORD`, and `SONICBIT_TEST_MAGNET`.

.github/scripts/ci_torrent_lifecycle.py (new file)
- End-to-end integration test covering the full SDK lifecycle:
  - `get_user_details()` and `get_storage_details()` checks
  - a `finally` block to ensure the environment is left clean

Documentation
README.md
- Added a "Local Testing" section with setup instructions and a smoke_test.py script.

Files Changed
- sonicbit/handlers/token_handler.py (`print()`/`input()` replaced with `NotImplementedError`)
- sonicbit/client.py (`token_handler` default fixed)
- sonicbit/base.py (timeout and retry tuning)
- sonicbit/modules/auth.py (`threading.Lock` replaces boolean refresh flag)
- sonicbit/modules/torrent.py (multipart, file handle, sync detection)
- sonicbit/modules/remote_download.py (explicit `return False`, typo fix)
- sonicbit/modules/signup.py (OTP guard, header copy)
- sonicbit/models/remote_download/remote_task_list.py (`JSONDecodeError` handling)
- pyproject.toml (`pydantic>=2.12.5` → `>=2.0.0`)
- .github/workflows/integration.yml (new)
- .github/scripts/ci_torrent_lifecycle.py (new)
- README.md (Local Testing section)
- uv.lock