# synthkit

synthkit is a synth-first Flutter plugin for musical note playback and
simple beat-based scheduling across Flutter platforms.
It gives you one Dart API for:
- creating synths
- configuring waveform, ADSR envelope, and low-pass filtering
- triggering notes by note name, MIDI note, or raw frequency
- scheduling short patterns in beat time
The API is intentionally small. It is designed for metronomes, ear-training apps, music toys, UI sound prototypes, and lightweight sequencer-style interfaces.
Hosted example: taalaydev.github.io/synthkit
## Platform support

| Platform | Supported | Backend |
|---|---|---|
| Web | Yes | Tone.js loaded at runtime |
| iOS | Yes | FFI + native AVAudioEngine synth |
| macOS | Yes | FFI + native AVAudioEngine synth |
| Android | Yes | FFI + native AudioTrack synth |
| Windows | Yes | FFI + native waveOut synth |
| Linux | Yes | FFI + native ALSA synth |
## Features

- Unified `SynthKitEngine`, `SynthKitSynth`, and `SynthKitTransport` API.
- `sine`, `square`, `triangle`, and `sawtooth` waveforms.
- ADSR envelope per synth.
- Optional low-pass filter per synth.
- Per-note velocity and delayed triggering.
- Beat-based transport for scheduling note sequences from Dart.
- Automatic Tone.js bootstrap on web.
- `FFI` native backends for mobile and desktop.
## Installation

Add the package to your app:
```yaml
dependencies:
  synthkit: ^0.1.0
```

Then install dependencies:

```sh
flutter pub get
```

No extra platform setup is required for iOS, macOS, Windows, or web beyond normal Flutter plugin integration.
Android builds now include a native FFI library via the Android NDK.
Make sure your Android toolchain can build CMake-based native targets.
For Linux builds, install ALSA development headers first. On Debian or Ubuntu:
```sh
sudo apt install libasound2-dev
```

When debugging backend selection, synthkit logs the chosen transport once at
startup:

```
[synthkit] transport: FFI
[synthkit] transport: MethodChannel
```
## Quick start

```dart
import 'package:synthkit/synthkit.dart';

final engine = SynthKitEngine();

await engine.initialize(
  bpm: 112,
  masterVolume: 0.7,
);

final synth = await engine.createSynth(
  const SynthKitSynthOptions(
    waveform: SynthKitWaveform.sawtooth,
    envelope: SynthKitEnvelope(
      attack: Duration(milliseconds: 8),
      decay: Duration(milliseconds: 140),
      sustain: 0.65,
      release: Duration(milliseconds: 260),
    ),
    filter: SynthKitFilter.lowPass(cutoffHz: 1600),
    volume: 0.75,
  ),
);

await synth.triggerAttackRelease(
  SynthKitNote.parse('C4'),
  const Duration(milliseconds: 380),
);
```

In most apps the flow looks like this:
- Create one `SynthKitEngine`.
- Call `initialize()` once before creating synths.
- Create one or more synths with `createSynth()`.
- Trigger notes directly or schedule them with `transport`.
- Dispose the engine when the owning widget or service is destroyed.
Example:
```dart
class _ExampleState extends State<Example> {
  final SynthKitEngine _engine = SynthKitEngine();
  SynthKitSynth? _synth;

  Future<void> initializeAudio() async {
    await _engine.initialize(bpm: 120, masterVolume: 0.8);
    _synth ??= await _engine.createSynth();
  }

  Future<void> playA4() async {
    await initializeAudio();
    await _synth!.triggerAttackRelease(
      SynthKitNote.parse('A4'),
      const Duration(milliseconds: 250),
    );
  }

  @override
  void dispose() {
    _engine.dispose();
    super.dispose();
  }
}
```

## SynthKitEngine

Main entry point for audio.
```dart
final engine = SynthKitEngine();
```

Important members:
- `initialize({double bpm = 120, double masterVolume = 0.8, String? webToneJsUrl})`: initializes the platform backend and attaches the transport.
- `createSynth([SynthKitSynthOptions options])`: creates a synth on the active backend.
- `setMasterVolume(double volume)`: sets the global output volume from `0.0` to `1.0`.
- `backendName`: returns a backend identifier such as `tonejs-web`, `ffi-ios`, `ffi-macos`, `ffi-android`, `ffi-linux`, or `ffi-windows`.
- `transport`: beat-based scheduler for short sequences.
- `dispose()`: stops transport playback, disposes synths, and tears down the backend.
Notes:
- `initialize()` must be awaited before `createSynth()` or `setMasterVolume()`.
- Calling `initialize()` more than once is safe; subsequent calls are ignored.
- `dispose()` is idempotent.
## SynthKitSynth

Represents a synth instance created by the engine.
```dart
final synth = await engine.createSynth();
```

Important members:
- `update(SynthKitSynthOptions nextOptions)`: reconfigures the synth waveform, envelope, filter, and volume.
- `triggerAttackRelease(SynthKitNote note, Duration duration, {double velocity = 1, Duration delay = Duration.zero})`: plays a note immediately or after a delay.
- `cancelScheduledNotes()`: cancels queued delayed notes for that synth.
- `dispose()`: disposes the synth and releases its native or web resources.
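Put together, these members can be used like this (a minimal sketch; it assumes `synth` was created as in the earlier examples):

```dart
// Reconfigure the synth in place: new waveform, darker filter, lower volume.
await synth.update(
  const SynthKitSynthOptions(
    waveform: SynthKitWaveform.triangle,
    filter: SynthKitFilter.lowPass(cutoffHz: 900),
    volume: 0.6,
  ),
);

// Trigger a quieter note half a second from now.
await synth.triggerAttackRelease(
  SynthKitNote.parse('E4'),
  const Duration(milliseconds: 300),
  velocity: 0.5,
  delay: const Duration(milliseconds: 500),
);

// Drop the delayed note if it has not fired yet.
await synth.cancelScheduledNotes();
```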
## SynthKitTransport

Schedules notes in beat time from Dart.
```dart
await engine.transport.schedule(
  synth: synth,
  note: SynthKitNote.parse('C4'),
  beat: 0,
  durationBeats: 0.5,
);

await engine.transport.start();
```

Important members:
- `schedule(...)`: adds a note event to the transport sequence.
- `start()`: starts playback of the scheduled sequence.
- `stop({bool clearSequence = false})`: stops playback, clears active notes, and optionally removes the sequence.
- `setBpm(double bpm)`: changes transport tempo. If the transport is already running, remaining notes are rescheduled at the new BPM.
- `clear()`: removes all scheduled note events.
- `unscheduleSynth(SynthKitSynth synth)`: removes all events for a given synth.
- `bpm`: current BPM value.
- `isRunning`: whether the transport is currently active.
Notes:
- The transport is a one-shot scheduler. It does not loop automatically.
- Scheduling is managed from Dart, not from a native sequencer.
- `SynthKitEngine.initialize()` attaches the transport for you.
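For instance, tempo changes and cleanup can be combined like this (a minimal sketch; `engine` and `synth` are assumed to be set up as in the earlier examples):

```dart
// Speed up a running pattern; remaining notes are rescheduled at 140 BPM.
await engine.transport.setBpm(140);

// Remove only this synth's events, keeping the rest of the sequence.
await engine.transport.unscheduleSynth(synth);

// Stop playback and drop every remaining scheduled event in one call.
await engine.transport.stop(clearSequence: true);
```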
## SynthKitNote

You can construct notes in three ways:
```dart
final byName = SynthKitNote.parse('C#4');
final byMidi = SynthKitNote.midi(69); // A4
final byFrequency = SynthKitNote.frequency(440.0);
```

Supported note-name format:
- note letter `A` through `G`
- optional `#` or `b`
- octave number, including negative octaves
Examples:
`C4`, `Bb3`, `F#5`
## SynthKitSynthOptions

Use `SynthKitSynthOptions` to configure a synth:
```dart
const SynthKitSynthOptions(
  waveform: SynthKitWaveform.square,
  envelope: SynthKitEnvelope(
    attack: Duration(milliseconds: 20),
    decay: Duration(milliseconds: 100),
    sustain: 0.5,
    release: Duration(milliseconds: 200),
  ),
  filter: SynthKitFilter.lowPass(cutoffHz: 1200),
  volume: 0.7,
)
```

| Setting | Default |
|---|---|
| `waveform` | `SynthKitWaveform.sine` |
| `envelope.attack` | 10ms |
| `envelope.decay` | 120ms |
| `envelope.sustain` | 0.75 |
| `envelope.release` | 240ms |
| `filter` | `SynthKitFilter.disabled()` |
| `volume` | 0.8 |
Supported waveforms:
- `SynthKitWaveform.sine`
- `SynthKitWaveform.square`
- `SynthKitWaveform.triangle`
- `SynthKitWaveform.sawtooth`
`SynthKitEnvelope` models a standard ADSR envelope:

- `attack`
- `decay`
- `sustain`
- `release`

`SynthKitFilter` currently supports:

- `SynthKitFilter.disabled()`
- `SynthKitFilter.lowPass(cutoffHz: ...)`
## Scheduling a pattern

This schedules a short four-beat phrase:
```dart
await engine.transport.setBpm(112);

await engine.transport.schedule(
  synth: synth,
  note: SynthKitNote.parse('A3'),
  beat: 0,
  durationBeats: 0.5,
);
await engine.transport.schedule(
  synth: synth,
  note: SynthKitNote.parse('C4'),
  beat: 1,
  durationBeats: 0.5,
);
await engine.transport.schedule(
  synth: synth,
  note: SynthKitNote.parse('E4'),
  beat: 2,
  durationBeats: 0.5,
);
await engine.transport.schedule(
  synth: synth,
  note: SynthKitNote.parse('G4'),
  beat: 3,
  durationBeats: 1,
);

await engine.transport.start();
```

To stop and clear the scheduled pattern:

```dart
await engine.transport.stop(clearSequence: true);
```

## Web and Tone.js

On web, synthkit loads Tone.js automatically the first time you initialize
the engine.
Default Tone.js URL:
`https://cdn.jsdelivr.net/npm/tone@15.1.0/build/Tone.js`
If you want to self-host Tone.js or use a different CDN, pass a custom URL:
```dart
await engine.initialize(
  bpm: 120,
  masterVolume: 0.8,
  webToneJsUrl: 'https://your-cdn.example.com/Tone.js',
);
```

Important web note:
- Browser audio usually must be unlocked from a user gesture. In practice, call
  `initialize()` from a button tap or another direct user interaction.
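A minimal sketch of gating initialization behind a tap (the widget wiring and the `engine` variable are illustrative, not part of the plugin):

```dart
// A button that unlocks audio on first tap: the browser's autoplay
// policy is satisfied because initialize() runs inside the gesture.
ElevatedButton(
  onPressed: () async {
    await engine.initialize(bpm: 120, masterVolume: 0.8);
    final synth = await engine.createSynth();
    await synth.triggerAttackRelease(
      SynthKitNote.parse('C4'),
      const Duration(milliseconds: 200),
    );
  },
  child: const Text('Start audio'),
)
```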
## iOS and macOS

- Uses a native `AVAudioEngine`-based synth backend.
- No additional manual setup is required in a standard Flutter app.

## Android

- Uses a native `AudioTrack`-based synth backend.
- Good fit for lightweight synthesis and UI sound playback.

## Windows

- Uses a native `waveOut` backend.
- Best suited to simple synthesis and note triggering.

## Linux

- Uses a native ALSA PCM playback backend.
- Requires ALSA development headers when building the Linux app or example.
The public Dart API is shared, but exact sound character can vary by backend. This is expected because each platform uses its own underlying audio engine.
## Best practices

- Always await `initialize()` before using the engine.
- Dispose the engine when you no longer need audio resources.
- Keep `masterVolume`, synth `volume`, and note `velocity` in the `0.0` to `1.0` range.
- On web, initialize from a user gesture.
- For long-running or advanced sequencing needs, treat the current transport as a lightweight musical scheduler rather than a DAW-style timeline.
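For example, when a volume value comes from UI state, clamping keeps it in range (a small sketch; `sliderValue` is illustrative, and note that Dart's `num.clamp` returns `num`, so a conversion is needed):

```dart
// Clamp an arbitrary UI value into the supported 0.0..1.0 range
// before handing it to the engine.
final double sliderValue = 1.3; // e.g. from a slider that overshoots
await engine.setMasterVolume(sliderValue.clamp(0.0, 1.0).toDouble());
```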
## Limitations

synthkit is intentionally narrow today. It does not currently provide:
- audio file playback or sampling
- MIDI input or output
- recording or offline rendering
- effect chains beyond a simple low-pass filter
- looped transport playback
## Example app

The package includes a runnable Flutter example in `example/` that demonstrates:
- initializing the engine
- creating a synth
- playing a one-shot note
- scheduling a short beat-based pattern
- stopping playback
You can also try the hosted web example here: taalaydev.github.io/synthkit
Run it with:
```sh
cd example
flutter run
```

This package has been verified in this workspace with:

```sh
flutter analyze
flutter build macos --debug
flutter build ios --simulator --debug --no-codesign
flutter drive --driver=test_driver/integration_test.dart --target=integration_test/initialize_test.dart -d macos
flutter drive --driver=test_driver/integration_test.dart --target=integration_test/initialize_test.dart -d <ios-simulator-id>
```