feat: Add support for generating background mesh from a brain mask when viewing native space bundles#2

Open
gagnonanthony wants to merge 1 commit into teanijarv:main from gagnonanthony:feat/native_space
Conversation

@gagnonanthony gagnonanthony commented Mar 16, 2026

Hi @teanijarv !

Thank you for developing this nice library! I have a rather specific (or maybe not) use case where I would like to use your library to generate QC screenshots of WM bundles. The thing is, the bundles are in subject-space, so overlaying them with template meshes gives poor results.

I propose adding an option to visualize the bundles with a mesh generated from a brain mask. This way, bundles can be viewed even in subject-space while having a background for a rough estimate of the location within the brain.
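For context, the idea can be sketched roughly like this: blur the binary mask slightly, re-threshold it, and extract the resulting surface for use as the background mesh. The snippet below is a minimal, self-contained illustration using a synthetic spherical mask; the blur parameter and the actual surface extraction in the PR (e.g. marching cubes on the real NIfTI mask) may differ.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, binary_erosion

# Synthetic stand-in for a brain mask: a solid sphere in a 32^3 volume.
grid = np.indices((32, 32, 32)).astype(float)
dist = np.sqrt(((grid - 15.5) ** 2).sum(axis=0))
mask = (dist < 10).astype(float)

# Blur and re-threshold the mask (analogous to a `nifti_mask_blur`-style
# smoothing) so the extracted surface is less blocky.
smooth_mask = gaussian_filter(mask, sigma=1.5) > 0.5

# The background mesh would be built from the mask's surface shell: the
# voxels inside the mask that touch the outside. A real implementation
# would triangulate this boundary (e.g. with marching cubes).
surface = smooth_mask & ~binary_erosion(smooth_mask)
print(int(surface.sum()) > 0)  # the shell is non-empty
```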

Here's a code snippet to test it out:

import os

# NOTE: `yab` is the library's import alias as used here.
p = yab.plot_tracts(
    custom_atlas_path="tracks/",
    nifti_mask="path/to/brain_mask.nii.gz",
    nifti_mask_blur=1.5,
    views=['anterior'],
    bmesh_type=None,
    display_type='object'
)

# open a gif file
gif_path = os.path.join("./", "brain_rotation.gif")
p.open_gif(gif_path)

# rotate the camera a full 360 degrees in 5-degree steps
for angle in range(0, 360, 5):
    p.camera.azimuth += 5
    p.write_frame()
p.close()

from IPython.display import display, Image
display(Image(filename=gif_path))

And here's an example of what it would look like:
[GIF: brain_rotation]

I tried to modify as few lines as possible, feel free to ask for changes if needed. Again, great library!
