Dear @John6666
I saw that the images in your dataset stay normalized, but when I save them with matplotlib they get converted back to 0-255 (black-and-white) images. Is there a way to make them stay between 0 and 1 as a JPEG?
Here's the proof. First, opening the image back with PIL after saving:
from PIL import Image as im
import numpy as np

image_normalized = im.open(r"D:\semester_12\Data_Irredeano\img_0.jpg")
npimage = np.array(image_normalized)
print(npimage)
Output:
[[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
...
[ 0 0 0]
[ 0 0 0]
[ 0 0 0]]
[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
...
[ 0 0 0]
[ 0 0 0]
[ 0 0 0]]
[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
...
[ 0 0 0]
[ 0 0 0]
[ 0 0 0]]
...
[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
...
[255 255 255]
[255 255 255]
[255 255 255]]
[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
...
[255 255 255]
[255 255 255]
[255 255 255]]
[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
...
[255 255 255]
[255 255 255]
[255 255 255]]]
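For reference, here is a minimal, self-contained snippet that reproduces what I'm seeing (the array and the repro.jpg filename are placeholders, not my real data):

import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

mask = np.zeros((64, 64))
mask[32:, 32:] = 1.0  # float array with values in [0, 1]

plt.imsave("repro.jpg", mask, cmap="gray")  # same call pattern as my loop below

reloaded = np.array(Image.open("repro.jpg"))
print(reloaded.dtype, reloaded.min(), reloaded.max())
# e.g. uint8 0 255 -- the 0/1 floats come back as 0/255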
And the normalization code from my loop:
import numpy as np
import matplotlib.pyplot as plt

# water_mask is an image from the dataset; a is the loop counter
imar = np.asarray(water_mask) / 255
plt.imsave('D:/' + 'img_' + str(a) + '.jpg', imar, cmap="gray")
Output:
[[0. 0. 0. ... 0. 0. 0.]
[0. 0. 0. ... 0. 0. 0.]
[0. 0. 0. ... 0. 0. 0.]
...
[0. 0. 0. ... 1. 1. 1.]
[0. 0. 0. ... 1. 1. 1.]
[0. 0. 0. ... 1. 1. 1.]]
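The only way I've managed to keep the exact 0-1 floats so far is to bypass JPEG and store the array itself, e.g. with np.save (just a sketch with a made-up filename; I'd still prefer a real image file):

import numpy as np

# imar is the normalized array from the loop above
np.save("D:/img_0.npy", imar)  # preserves the float 0-1 values exactly
restored = np.load("D:/img_0.npy")
print(restored.min(), restored.max())  # 0.0 1.0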
(I'm using images from your dataset on Hugging Face.)
Or, after saving, should I add a .convert('L') call on the PIL.Image before converting it back to an np.array and renormalizing?
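i.e., something like this (just a sketch of what I mean, reusing the path from above):

from PIL import Image
import numpy as np

img = Image.open(r"D:\semester_12\Data_Irredeano\img_0.jpg").convert("L")
npimage = np.array(img) / 255  # renormalize back to floats in [0, 1]
print(npimage.min(), npimage.max())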
Thanks in advance