
Add a persistent Parakeet helper for low-latency host integrations#18861

Open
seyeong-han wants to merge 1 commit into main from parakeet-helper-macos-alignment

Conversation

@seyeong-han
Contributor

@seyeong-han seyeong-han commented Apr 14, 2026

Summary

  • Factor the Parakeet transcription core out of parakeet_runner into a shared ParakeetTranscriber class
  • Add a new parakeet_helper binary plus a stdin/stdout helper protocol for long-lived host integrations
  • Build the helper in the existing Parakeet CMake presets and document the helper workflow in the README

Why a helper?

The Voxtral Realtime macOS app (executorch-examples/voxtral_realtime/macos) didn't need any changes to the executorch repo because voxtral_realtime_runner was already designed as a streaming, long-running process — the app just launches it and feeds audio.

parakeet_runner is different: it's a one-shot batch CLI tool that loads the model, transcribes one WAV file, prints the result, and exits. There's no way to send it a second request without restarting the process and paying the ~1.4 s model-load cost again.

The ExecuWhisper macOS app (meta-pytorch/executorch-examples#232) runs repeated record-then-transcribe requests via system dictation, so a fresh process per recording is too slow. parakeet_helper fills that gap — it's the Parakeet equivalent of what the Voxtral Realtime runner already does natively: stay alive, keep the model warm, and accept multiple requests over stdin/stdout.

Test plan

  • cmake --preset llm-metal-stats -DEXECUTORCH_BUILD_MLX=OFF
  • cmake --build --preset llm-metal-stats-install
  • cd examples/models/parakeet && cmake --build --preset parakeet-metal (both parakeet_runner and parakeet_helper link successfully)

Made-with: Cursor

Factor the Parakeet transcription logic out of the one-shot runner so host apps can keep the model warm across requests. Build the new helper alongside the runner and document the helper workflow for app integrations.

@pytorch-bot

pytorch-bot bot commented Apr 14, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18861

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 2 New Failures, 1 Cancelled Job, 1 Unrelated Failure

As of commit b54a81c with merge base 411ede2:

NEW FAILURES - The following jobs have failed:

CANCELLED JOB - The following job was cancelled. Please retry:

BROKEN TRUNK - The following job failed but was also failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Apr 14, 2026
@github-actions

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.
