12.4K Views • 12:49 min • September 28th, 2019
0:04 Title
1:02 Image Processing Using Fiji
3:58 Segmentation (Semi-Automated) and Reconstruction Using Ilastik 1.3.2
7:43 3D Analysis
9:41 Results: 3D Reconstruction and Virtual Reality Analysis
11:40 Conclusion
Transcript
The advent of automated serial section electron microscopy techniques has made image processing the rate-limiting step in the pipeline leading to the generation of biological three-dimensional models. Our protocol provides a step-by-step guideline to produce a dense reconstruction of image volumes in just a few days.
It is combined with an approach for quantitative measurements using three-dimensional models. With this technique, the 3D structure of tissues other than brain can potentially be analyzed to identify structural impairments typical of certain diseases and to improve diagnostic strategies. Segmentation is a tedious step.
Take the time to make it as accurate as possible, to avoid proofreading a bad segmentation and having to restart the whole process. The design of the software is sometimes not very user-friendly, so it is important to see these steps demonstrated before starting work on the segmentation.
Open the image stack by dragging and dropping the native file from the microscope containing the stack, or by dragging and dropping the folder containing the whole image stack, into the software window. Once the stack is opened, go to 'image > properties' to make sure the voxel size has been read from the metadata. Transform the image into 8-bit by clicking on 'image > type' and selecting '8-bit'. If the original stack was acquired as different tiles, apply stitching within TrakEM2.
Create a new TrakEM2 project using 'New > TrakEM2'. Using TrakEM2's embedded functions, segment structures of interest if needed. In the TrakEM2 GUI, right-click on 'anything' under the template window and select 'add new child > area list'. Drag and drop 'anything' into the folder under 'project objects', and one 'anything' will appear there. Drag and drop the 'area list' from the template to the 'anything' located under 'project objects'. On the image stack viewport, select the 'Z space' with the cursor.
The 'area list' will appear with a unique ID number. Select the 'brush' tool on top and use the mouse to segment a structure by filling its cytosol over the whole Z-stack. Export the segmented mask to be used in Ilastik as a seed for carving.
To do so, right-click either on the area list object in the Z space list or on the mask in the viewport, and select 'export > area list as labels (TIF)'. Depending on the resolution needed for further reconstructions, reduce the pixel size of the image stack by downsampling. Consider the memory requirements of the software that will be used for segmentation and reconstruction.
Ilastik handles stacks of up to 500 pixels in X-Y. Take into account the minimum size at which the objects still appear recognizable and thus can be segmented. Use 'image > adjust > size'. To enhance contrast and help the segmentation, the unsharp mask filter can be applied to make membranes crisper.
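As a rough sanity check before resizing, the downsampling factor implied by this 500-pixel limit, and whether the smallest structure of interest survives it, can be computed (a minimal sketch; the 4-pixel visibility threshold is an illustrative assumption, not from the protocol):

```python
import math

def downsample_factor(width_px, height_px, max_dim=500):
    # smallest integer factor bringing both X and Y under the ~500 px
    # limit that Ilastik handles comfortably (per the protocol)
    return max(1, math.ceil(max(width_px, height_px) / max_dim))

def still_recognizable(object_size_px, factor, min_px=4):
    # after downsampling, the smallest object should still span a few
    # pixels; min_px = 4 is an arbitrary illustrative threshold
    return object_size_px / factor >= min_px

f = downsample_factor(2048, 1536)
print(f, still_recognizable(20, f))  # -> 5 True
```

If the check fails, segmenting the full-resolution stack in smaller crops may be preferable to downsampling further.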
Use 'process > filters > unsharp mask'. Export the image stack as single images for further processing in segmentation software, using 'file > save as > image sequence', and choose the TIFF format. In the main Ilastik GUI, select the 'carving' module. Load the image stack using 'add new' and 'add a single 3D/4D volume from sequence'. Select 'whole directory' and choose the folder containing the image stack saved as single files. At the bottom of the new window, where options for loading the images are present, make sure to keep 'Z' selected.
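The effect of the unsharp mask can be illustrated on a 1-D signal: blur, then add back the weighted difference, which overshoots at edges and makes membrane boundaries crisper (a toy sketch with a box blur; Fiji's filter uses a Gaussian and its own radius and mask-weight parameters):

```python
def unsharp_1d(signal, radius=1, amount=0.6):
    # box-blur the signal, then amplify the (signal - blur) difference;
    # the overshoot at edges is what makes membranes look crisper
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        blur = sum(signal[lo:hi]) / (hi - lo)
        out.append(signal[i] + amount * (signal[i] - blur))
    return out

print(unsharp_1d([0, 0, 1, 1]))  # values near the step overshoot past 0 and 1
```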
For the following steps, all operations and buttons can be found on the left side of the main software GUI. Under the preprocessing tab, use the standard options already checked. Use the 'bright lines' ridge filters and keep the filter scale at 1.6.
This parameter can be modified afterward. Once the preprocessing is finished, select the next page in the drop-down menu of the labeling module. One object seed and one background seed are present by default.
Select the object seed by clicking on it and draw a line on top of the structure of interest. Then select the background seed and draw one or multiple lines outside of the object to be reconstructed. Now click on 'segment' and wait.
Depending on the power of the computer and the size of the stack, segmentation could take from a few seconds to hours. Once it is done, a semi-transparent mask highlighting the segmentation should appear on top of the segmented structure. Scroll through the stack to check the segmentation.
The segmentation might not be accurate if it does not follow the structure of interest or spills out from it. Correct any spillover by placing a background seed on the spilled segmentation, and add an object seed over the non-reconstructed segment of the object of interest.
Accurate visual proofreading of the segmentation with Ilastik might be tedious, but it is essential to make sure that the exported objects do not contain artifacts. If the segmentation is still not correct, try modifying the 'bias' parameter, which will increase or decrease the amount of uncertainly classified pixels accepted. Its value is 0.95 by default.
Decrease it to limit any spillover, or increase it if the segmentation is too conservative. Another possibility is to click on 'preprocessing' and modify the size of the filter. Increasing the value will minimize salt-and-pepper, noise-like effects, but will also make membranes more blurred and smaller details harder to detect.
This might limit spillover. Reiterate as much as needed until all desired objects have been segmented. Once an object is finished, navigate to 'segment' and click on 'save current object'. Two new seeds will appear to start the segmentation of a new object.
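The direction of the bias parameter can be caricatured with a toy threshold rule (this is not Ilastik's actual watershed-based carving, only an illustration of why lowering the bias is more conservative):

```python
def toy_carve(object_probs, bias=0.95):
    # accept a pixel when its "objectness" exceeds (1 - bias): lowering
    # the bias raises the threshold, rejecting uncertain pixels and
    # limiting spillover, while raising it accepts more of them
    return [p > (1.0 - bias) for p in object_probs]

probs = [0.9, 0.4, 0.1, 0.03]
print(toy_carve(probs))            # default bias 0.95 accepts 3 of 4 pixels
print(toy_carve(probs, bias=0.5))  # a stricter bias keeps only the 0.9 pixel
```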
Export surface meshes right away as OBJ files by clicking on 'export all meshes'. To visualize manually segmented models in 3D within TrakEM2, right-click the area list, then select 'show in 3D'; a higher resampling value will generate a lower resolution mesh. Finally, export the 3D mesh as a Wavefront OBJ by choosing 'file > export surfaces > Wavefront' from the menu. After installing the NeuroMorph toolkit as described in the text protocol, open Blender. Import the objects using the NeuroMorph batch import by clicking 'import objects' under the scene menu to import multiple objects at once.
Make sure to activate 'use remesh' and 'smooth shading'. Select the object of interest from the outliner and modify the octree depth of the remesh function under the modifiers menu. Work iteratively to minimize the number of vertices while avoiding loss of resolution of details and correct morphology.
When changing the octree depth, the mesh in the main GUI will change accordingly. Once finished, click on 'apply' to finalize the process. Navigate to the NeuroMorph menu on the left panel and use the image superimposition tool, 'image stack interactions', to load the image stack.
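The octree depth trade-off can be made concrete: a remesh at depth d samples the bounding volume on a grid of 2^d cells per axis, so each increment multiplies the cell count (and, roughly, the vertex budget) by about 8 (a back-of-the-envelope sketch; actual vertex counts depend on the surface):

```python
def remesh_cells(octree_depth):
    # a remesh at depth d samples the bounding volume on a (2**d)^3
    # voxel grid; mesh detail and vertex count grow roughly with it
    return (2 ** octree_depth) ** 3

print([remesh_cells(d) for d in (4, 5, 6)])  # -> [4096, 32768, 262144]
```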
Make sure to enter the physical size of the image stack for X, Y, and Z, and select the path of the stack by clicking on 'source Z'. X and Y are orthogonal planes; they are optional and will be loaded only if the user inserts a valid path. Then select a mesh in the viewport by right-clicking on it.
Enter edit mode by pressing Tab, select one or more vertices using mouse right-click, and finally click on 'show image at vertex'. One or more cut planes with the micrograph will appear superimposed on top of the mesh. Select a cut plane by right-clicking on it.
Then press Ctrl+Y and scroll through the 3D model using the mouse scroll wheel. This can also be used as a proofreading method. Shown here is the segmentation and reconstruction using TrakEM2 and Ilastik.
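The physical size entered for X, Y, and Z above can be derived from the voxel size read in Fiji and the stack dimensions (a minimal helper; the example numbers are made up):

```python
def stack_physical_size(shape_xyz, voxel_size_um):
    # physical extent in micrometers = number of voxels * voxel size,
    # per axis; this is the per-dimension value NeuroMorph asks for
    return tuple(n * v for n, v in zip(shape_xyz, voxel_size_um))

print(stack_physical_size((500, 500, 200), (0.01, 0.01, 0.02)))  # -> (5.0, 5.0, 4.0)
```

Getting these values right matters because they determine whether measurements taken on the exported 3D objects reflect real dimensions.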
The TrakEM2 GUI is shown with objects manually segmented in red. The exported mask can then be used as input for semi-automated segmentation. From Ilastik, masks can be further exported to TrakEM2 for manual proofreading.
Masks can be exported as 3D triangular meshes to reveal reconstructed structures. In this example, four neurons, astrocytes, microglia and pericytes were reconstructed using this process. A 3D analysis of reconstructed morphologies using customized tools is shown.
Here is an isotropic image volume from a focused ion beam scanning electron microscopy dataset. A dense reconstruction of this data reveals axons, astrocytic processes, and dendrites. This micrograph shows examples of targets for quantification, such as synapses and astrocytic glycogen granules.
The mask from that micrograph shows the distribution of glycogen granules around synapses. Shown here is a graphic illustration of the input and output of the graphic visualization of the glycogen-derived lactate absorption model, or GLAM. Here, a user is shown wearing a virtual reality (VR) headset while working on the dense reconstruction from a focused ion beam scanning electron microscopy dataset. From a subset of neurites, an immersive VR scene can be observed.
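A quantification like the granule-to-synapse distribution above boils down to nearest-neighbor distances in 3D; here is a minimal sketch with made-up coordinates (the published analysis uses its own customized tools):

```python
import math

def nearest_synapse_distances(granules, synapses):
    # for each glycogen granule (x, y, z in micrometers), the Euclidean
    # distance to the closest synapse; all coordinates are illustrative
    return [min(math.dist(g, s) for s in synapses) for g in granules]

granules = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
synapses = [(0.0, 0.0, 1.0), (3.0, 0.0, 0.0)]
print(nearest_synapse_distances(granules, synapses))  # -> [1.0, about 1.414]
```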
The green laser is pointing to a GLAM peak. Setting the voxel size correctly is of primary importance, as it determines the correct size of the exported 3D objects and ensures that their measurements reflect the actual size. The concept of serial section imaging, image segmentation, and 3D reconstruction is fairly old.
However, technical acceleration is leading to advances in connectomics and a revision of anatomy textbooks. Although the method was developed for brain research in electron microscopy, it can be generalized to any microscopy technique generating volumetric data. Furthermore, any kind of 3D imaging technique can benefit from such image segmentation and reconstruction techniques, including CT scans and MRI.
The described pipeline is designed for the segmentation of electron microscopy datasets of gigabyte scale and larger, to extract whole-cell morphologies. Once the cells are reconstructed in 3D, customized software designed around individual needs can be used to perform qualitative and quantitative analyses directly in 3D, also using virtual reality to overcome view occlusion.
Copyright © 2023 MyJoVE Corporation. All rights reserved