Conversation
---
Woah nice! There are a lot of people who are interested in doing stuff like this! Looking forward to reviewing this when you're ready! cc @pcuenca
---
I'm investigating a performance issue with Whisper tiny.en, and the performance is not as expected in the example. I quickly added logging:

```
# Encode result
LOG ONNX session run finished 767 ms
# Decode result
LOG ONNX session run finished 4518 ms
# ... 4 runs ...
LOG ONNX session run finished 4856 ms
LOG Time: 40016
LOG Result: {"text": " (buzzing)"}
```

The timing code:

```typescript
const t0 = performance.now();
// this.#inferenceSession === NativeModules.Onnxruntime
const results: Binding.ReturnType = await this.#inferenceSession.run(this.#key, input, outputNames, options);
const output = this.decodeReturnType(results);
console.log('ONNX session run finished', performance.now() - t0);
```

Also see the logs from the native side (I added logging to OnnxruntimeModule.java); the native time is significantly less than the time measured on the JS side:

```
# Encode result
09:47:59.203 ONNXRUNTIME_RN run() is finished, time: 273 ms
# Decode result
09:48:00.280 ONNXRUNTIME_RN run() is finished, time: 339 ms
# ... 4 runs ...
09:48:23.807 ONNXRUNTIME_RN run() is finished, time: 541 ms
```

EDIT: It turns out the logger has a bug that led me to think the problem was in the native bridge. I added an awaited timeout around the decodeReturnType call and found the issue actually comes from that function.
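The isolation step described in the EDIT can be sketched as a generic helper. This is a sketch, not the actual patch: `profileRun` is a hypothetical name, and `run`/`decode` stand in for the `inferenceSession.run` and `decodeReturnType` calls from the logs above.

```typescript
// Sketch: time the native bridge call and the JS-side decode separately.
// An awaited timeout between the two lets any pending bridge work flush,
// so the decode measurement is not polluted by deferred native callbacks.
async function profileRun<T, R>(
  run: () => Promise<T>,     // e.g. () => inferenceSession.run(key, input, outputNames, options)
  decode: (results: T) => R, // e.g. decodeReturnType
): Promise<{ output: R; bridgeMs: number; decodeMs: number }> {
  const t0 = performance.now();
  const results = await run();                // native bridge call
  const bridgeMs = performance.now() - t0;
  await new Promise((r) => setTimeout(r, 0)); // let pending bridge work flush
  const t1 = performance.now();
  const output = decode(results);             // JS-side decode
  const decodeMs = performance.now() - t1;
  return { output, bridgeMs, decodeMs };
}
```

Comparing `bridgeMs` against the native-side log then makes it obvious whether the extra seconds live in the bridge or in the decode step.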
---
Hi @hans00! 👋 Thanks so much for your work here, it will be very impactful. I'm finally able to commit more time to this, as I now have the ability to do mobile development in my environment. My main concern with the PR in its current state is the additional dependencies that it introduces; ideally only a single new dependency would be added. What do you think?
Ok, I have removed the JS codecs for RN. For FS access,
---
hey @hans00, sorry to bother you but I've been trying without success to get your changes working for the past day and a half and thought I'd ask... I installed the package and manually placed a model in my app's application folder: https://huggingface.co/distilbert/distilbert-base-uncased-finetuned-sst-2-english/tree/main (the contents are in a local folder). When I try to use it like this:

```typescript
import { cacheDirectory, documentDirectory } from 'expo-file-system';

env.localModelPath = documentDirectory;
env.cacheDir = cacheDirectory;
env.allowRemoteModels = false;

export const sentimentAnalysis = async (text: string) => {
  const classifier = await pipeline(
    'sentiment-analysis',
    'distilbert-base-uncased-finetuned-sst-2-english',
    {
      // progress_callback: (progress) => console.info('progress', progress)
    }
  );
  const result = await classifier(text); // [{'label': 'POSITIVE', 'score': 0.999817686}]
  return result;
};

export const sentimentAnalysisAsString = async (text: string) =>
  sentimentAnalysis(text).then((result) => JSON.stringify(result, null, 2));
```

it gives me 3 warnings. It seems they might come from the bundler. Any idea what could be the fix?
---
@TowhidKashem I ran into the same issue. I found it was because Metro reads the `exports` field of the package and then loads the browser build. You can use my fork for now: https://www.npmjs.com/package/@axe-dev/transformers (I can't guarantee that I won't add my own changes in there), or you can figure out some way to load the build that bundles the universal fs package.
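On the app side, one possible workaround for the `exports`-field behavior described above is to disable Metro's package-exports resolution. This is a sketch under the assumption that your Metro version supports the `unstable_enablePackageExports` resolver flag; verify against your Metro/React Native release.

```javascript
// metro.config.js — sketch: stop Metro from honoring the package's
// "exports" map, so it falls back to the "main"/"react-native" fields
// instead of picking the browser build.
const { getDefaultConfig } = require('@react-native/metro-config');

const config = getDefaultConfig(__dirname);
config.resolver.unstable_enablePackageExports = false;

module.exports = config;
```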
I had the same issue in my recent trials...
I was able to get onnx models working using just a stub file (onnx-web.stub.ts) aliased in babel.config.js: this gets rid of the webpack error and allows u to use onnx models. But the experience isn't as easy as using transformers.js, so that would still be much preferred. Without transformers.js you have to figure out a unique tokenizer for every model, which is a real pain in the butt..
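The stub-alias approach described above could look something like this. It is a sketch under two assumptions: `babel-plugin-module-resolver` is installed, and `onnx-web.stub.ts` exports empty placeholders for whatever the web build would provide; the exact alias target and preset depend on your setup.

```javascript
// babel.config.js — sketch: alias the web build of onnxruntime to a local
// stub so the bundler never reaches the webpack-specific code path.
module.exports = {
  presets: ['babel-preset-expo'],
  plugins: [
    [
      'module-resolver',
      {
        alias: {
          // Hypothetical alias: point the browser package at the stub.
          'onnxruntime-web': './onnx-web.stub.ts',
        },
      },
    ],
  ],
};
```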
Thanks, is this the right repo: https://github.com/axe-me/transformers.js/tree/main and which branch should I be forking?
---
@TowhidKashem same as the fugood one, |
This should be fixed now.
the 3.6.1-beta.1 release works for my app, thanks! |
---
Just curious, is this work integrated into the latest release (I see the PR is still open), or is this PR left hanging?
---
@jhen0409 @kungfooman could one of you kindly merge this, sorry for the ping |
Could any of you help with how you got this working on Expo? I'm a newbie at mobile development and it seemed impossible, but I saw you guys achieved it hahaha
As far as I know, this tutorial may also work with Expo. And you need to run a local dev build.
Make it support React Native.
I also made an example for it, with a guide to get it started: https://github.com/hans00/react-native-transformers-example

TODO:
- Check that models work fine
- Research more efficient image processing