Cache OBJ/WRL models globally to prevent duplicate loading #663
base: main
Changes from all commits: 11975f7, be55cce, 3a04525, dd8171d, 7387735
```diff
@@ -4,33 +4,80 @@ import { OBJLoader } from "three/examples/jsm/loaders/OBJLoader.js"
 import { STLLoader } from "three/examples/jsm/loaders/STLLoader.js"
 import { loadVrml } from "./vrml"
 
+// ✅ Finished model cache
+const modelCache = new Map<string, THREE.Object3D>()
+
+// ✅ In-flight promise cache (prevents duplicate simultaneous loads)
+const loadingPromises = new Map<string, Promise<THREE.Object3D | null>>()
+
 export async function load3DModel(url: string): Promise<THREE.Object3D | null> {
-  if (url.endsWith(".stl")) {
-    const loader = new STLLoader()
-    const geometry = await loader.loadAsync(url)
-    const material = new THREE.MeshStandardMaterial({
-      color: 0x888888,
-      metalness: 0.5,
-      roughness: 0.5,
-    })
-    return new THREE.Mesh(geometry, material)
+  // ✅ Cache HIT (already loaded)
+  if (modelCache.has(url)) {
+    return modelCache.get(url)!.clone()
   }
```
Comment on lines +15 to +17 (Contributor, Graphite Agent):

Race condition: multiple simultaneous requests for the same uncached URL will all trigger separate network requests and parsing operations. What breaks: if 10 components mount simultaneously requesting the same model, all 10 will make separate network requests instead of sharing one. Fix: cache the loading promise, not just the result:

```ts
const modelCache = new Map<string, THREE.Object3D>()
const loadingPromises = new Map<string, Promise<THREE.Object3D | null>>()

export async function load3DModel(url: string): Promise<THREE.Object3D | null> {
  if (modelCache.has(url)) {
    return modelCache.get(url)!.clone()
  }
  if (loadingPromises.has(url)) {
    const result = await loadingPromises.get(url)!
    return result ? result.clone() : null
  }
  const promise = (async () => {
    // ... existing loading logic ...
  })()
  loadingPromises.set(url, promise)
  const result = await promise
  loadingPromises.delete(url)
  return result ? result.clone() : null
}
```

Spotted by Graphite Agent.

Author: Resolved
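The promise-deduplication pattern the reviewer suggests can be exercised in isolation, without three.js. Below is a minimal sketch under stated assumptions: `fakeLoad`, `loadCount`, and the plain-object "clone" via spread are hypothetical stand-ins for the real loaders and `Object3D.clone()`, used only to make the behavior observable.

```typescript
// Sketch of promise-based request deduplication.
// fakeLoad and loadCount are hypothetical stand-ins for the three.js loaders.
type Model = { url: string };

const cache = new Map<string, Model>();
const inFlight = new Map<string, Promise<Model | null>>();
let loadCount = 0; // counts actual "network" loads

async function fakeLoad(url: string): Promise<Model> {
  loadCount++;
  await new Promise((r) => setTimeout(r, 10)); // simulate network latency
  return { url };
}

async function loadModel(url: string): Promise<Model | null> {
  // Cache hit: return a "clone" of the finished model
  if (cache.has(url)) return { ...cache.get(url)! };

  // Already loading: await the same in-flight promise
  if (inFlight.has(url)) {
    const result = await inFlight.get(url)!;
    return result ? { ...result } : null;
  }

  // Start a new load and register the promise immediately
  const promise = fakeLoad(url).catch(() => null);
  inFlight.set(url, promise);
  const result = await promise;
  inFlight.delete(url);
  if (result) cache.set(url, result);
  return result ? { ...result } : null;
}

async function main() {
  // Ten concurrent requests for the same URL share one underlying load.
  const models = await Promise.all(
    Array.from({ length: 10 }, () => loadModel("/model.obj"))
  );
  console.log(loadCount, models.filter(Boolean).length); // → 1 10
}
main();
```

Because all ten calls are issued before the first one resolves, only the first reaches `fakeLoad`; the rest find the entry in `inFlight` and await the same promise, which is the failure mode the comment describes being fixed.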
```diff
-  if (url.endsWith(".obj")) {
-    const loader = new OBJLoader()
-    return await loader.loadAsync(url)
+  // ✅ Already loading → await same promise
+  if (loadingPromises.has(url)) {
+    const result = await loadingPromises.get(url)!
+    return result ? result.clone() : null
   }
 
-  if (url.endsWith(".wrl")) {
-    return await loadVrml(url)
-  }
+  // ✅ Start new load and store promise immediately
+  const promise = (async (): Promise<THREE.Object3D | null> => {
+    try {
+      // STL
+      if (url.endsWith(".stl")) {
+        const loader = new STLLoader()
+        const geometry = await loader.loadAsync(url)
+
+        const material = new THREE.MeshStandardMaterial({
+          color: 0x888888,
+          metalness: 0.5,
+          roughness: 0.5,
+        })
+
+        return new THREE.Mesh(geometry, material)
+      }
+
+      // OBJ
+      if (url.endsWith(".obj")) {
+        const loader = new OBJLoader()
+        return await loader.loadAsync(url)
+      }
+
+      // WRL
+      if (url.endsWith(".wrl")) {
+        return await loadVrml(url)
+      }
+
+      // GLTF / GLB
+      if (url.endsWith(".gltf") || url.endsWith(".glb")) {
+        const loader = new GLTFLoader()
+        const gltf = await loader.loadAsync(url)
+        return gltf.scene
+      }
+
+      console.error("Unsupported file format:", url)
+      return null
+    } catch (err) {
+      console.error("Failed to load model:", url, err)
+      return null
+    }
+  })()
+
+  loadingPromises.set(url, promise)
+
+  // Await load result
+  const result = await promise
+
+  // Remove promise after completion
+  loadingPromises.delete(url)
+
-  if (url.endsWith(".gltf") || url.endsWith(".glb")) {
-    const loader = new GLTFLoader()
-    const gltf = await loader.loadAsync(url)
-    return gltf.scene
+  // Store successful result in final cache
+  if (result) {
+    modelCache.set(url, result)
+    return result.clone()
   }
 
-  console.error("Unsupported file format or failed to load 3D model.")
   return null
 }
```
Comment (Graphite Agent):

Failed loads are cached permanently: when `loadAndParseObj()` returns an Error, the cache entry remains as `{ promise, result: null }` (set on line 103). All subsequent requests for this URL will return the cached error indefinitely. Transient failures (network issues, temporary server errors) can never recover without a page refresh.

Spotted by Graphite Agent.
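One common remedy for the issue this comment raises is to evict the cache entry when a load fails, so the next request retries instead of replaying the cached failure. A minimal sketch, not the PR's code: `fetchModel`, `attempts`, and the `shouldFail` flag are hypothetical, used only to simulate a transient error that later clears.

```typescript
// Sketch: don't cache failures. The cache entry is removed on error so a
// later call retries the load. fetchModel is a hypothetical loader.
type Model = { url: string };

const entries = new Map<string, Promise<Model>>();
let attempts = 0;
let shouldFail = true; // simulates a transient network error

async function fetchModel(url: string): Promise<Model> {
  attempts++;
  if (shouldFail) throw new Error("network error");
  return { url };
}

function loadModelRetryable(url: string): Promise<Model> {
  let entry = entries.get(url);
  if (!entry) {
    entry = fetchModel(url).catch((err) => {
      entries.delete(url); // evict on failure → next call retries
      throw err;
    });
    entries.set(url, entry);
  }
  return entry;
}

async function main() {
  await loadModelRetryable("/a.obj").catch(() => {}); // first try fails
  shouldFail = false; // the "network" recovers
  const model = await loadModelRetryable("/a.obj"); // retried, succeeds
  console.log(attempts, model.url); // → 2 /a.obj
}
main();
```

Successful loads stay cached (the entry is only deleted inside the rejection handler), so steady-state behavior is unchanged; only the failure path becomes retryable.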