Read multiple of my own images from a folder and save them as a dataset for training

I am working on training a model on my own images, read from my folders. I would be grateful for any help with this.

I can successfully read all the images from the folder and create my own one-hot-encoded labels. However, every time I run my code, reading all the images from the folders takes a long time. I would therefore like to build a dataset from these images and save it, like MNIST, so that it loads faster and I do not have to read all the images again. Could you please help me with this?

The code is:

    import os
    from os import listdir
    import tensorflow as tf

    path = "D:/cleandata/train_data/"
    loadedImages = []
    labels = []
    sess = tf.InteractiveSession()
    for folder in os.listdir(path):
        imagesList = listdir(path + folder)
        for image in imagesList:
            image_raw_data_jpg = tf.gfile.FastGFile(path + folder + '/' + image, 'rb').read()
            raw_image = tf.image.decode_png(image_raw_data_jpg, 3)
            gray_resize = tf.image.resize_images(raw_image, [28, 28])
            # Evaluate the tensor once so the decoded array can be kept in memory
            image_data = sess.run(gray_resize)
            loadedImages.append(image_data)
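One way to avoid re-reading the images on every run (a sketch, not from the question; the helper names `save_dataset`/`load_dataset` and the choice of NumPy's `.npz` format are my assumptions) is to decode and resize each image once, collect the resulting arrays, and serialize images and labels together into a single compressed file that loads quickly:

    import numpy as np

    def save_dataset(images, labels, out_path):
        # Stack the per-image arrays into one (N, 28, 28, 3) array and
        # store images and labels together in one compressed .npz file.
        np.savez_compressed(out_path,
                            images=np.stack(images),
                            labels=np.stack(labels))

    def load_dataset(out_path):
        # Loading the pre-decoded arrays is much faster than
        # decoding and resizing every image file again.
        data = np.load(out_path)
        return data["images"], data["labels"]

After the loop above has filled `loadedImages` and `labels`, a single `save_dataset(loadedImages, labels, "train_data.npz")` writes the dataset, and later runs can start directly from `load_dataset("train_data.npz")`.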
