Create Dataset from Images and Annotations locally

I am trying to create a Dataset to fine-tune a segmentation model. My images are stored in one folder, and so are the annotation images, which I've generated by segmenting the images (I will accept these as my ground truth for this instance). The annotations are organized into folders by label name (root/road/imgs; root/sky/imgs; etc.).
In the docs, and in downloadable segmentation datasets, the "annotation" is always some kind of gray-value image. Do I have to combine my annotation images into grayscale images ("segmentation maps")? How would I do that? Or can I simply load the annotation images into the dataset separately? The docs describe "Folder-based builders" for loading image datasets, but if I load my annotations that way, the dataset doesn't contain the RGB images. If you know what I'm talking about, please help! Thanks in advance.


We have a guide for that in the examples:

I've seen that entry, and it's not what the question is about, but thanks!


Sorry, I misread your question. No, they can also be RGB; you can cast them to the `datasets.Image` feature type. Basically you'd need to create a dataset with two columns ("image" and "annotation"), both of type `Image`.