Google Colab: RAM fills up when converting CSV pixel data to images

I have Google Colab Pro for a deep learning project I'm doing, so I have 25 GB of RAM available.

I'm converting pixels from a CSV to images. After around 4 minutes my Colab notebook crashes because it runs out of RAM. I believe it's something to do with the resizing, as I've tried models before without resizing and they run with no problem.

Any idea what is causing such high RAM usage, and any possible solutions? Code below:

    import numpy as np
    import pandas as pd
    from skimage.transform import resize  # assuming scikit-image's resize, which matches the order/mode kwargs
    from tensorflow.keras.utils import to_categorical
    # preprocess_input comes from whichever Keras application model is in use,
    # e.g. tensorflow.keras.applications.<model>.preprocess_input

    def get_data(dataset):
        data = pd.read_csv(path + 'fer2013.csv')
        pixels = data['pixels'].tolist()
        images = np.empty((len(data), img_height, img_width, 3))

        for i, pixel_sequence in enumerate(pixels):
            single_image = [float(pixel) for pixel in pixel_sequence.split(' ')]  # extract each pixel value
            single_image = np.asarray(single_image).reshape(48, 48)  # dimensions: 48x48
            single_image = resize(single_image, (img_height, img_width), order=3, mode='constant')  # bicubic resize to img_height x img_width
            ret = np.empty((img_height, img_width, 3))  # replicate grayscale into 3 channels
            ret[:, :, 0] = single_image
            ret[:, :, 1] = single_image
            ret[:, :, 2] = single_image
            images[i, :, :, :] = ret

        images = preprocess_input(images)
        labels = to_categorical(data['emotion'])

        return images, labels
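For reference, here is a rough back-of-the-envelope estimate of how big the preallocated `images` array gets after resizing. This assumes the full FER2013 CSV (approximately 35,887 rows) and `img_height = img_width = 139`; `np.empty` defaults to float64, i.e. 8 bytes per element:

    # Rough memory estimate for the float64 array of shape (n, 139, 139, 3).
    # The row count ~35,887 is an assumption about the full FER2013 dataset.
    n_rows, img_height, img_width, channels = 35887, 139, 139, 3
    bytes_per_float64 = 8
    total_gb = n_rows * img_height * img_width * channels * bytes_per_float64 / 1e9
    print(f"{total_gb:.1f} GB")

That works out to roughly 16-17 GB for the array alone, before `preprocess_input` makes any temporary copies, which may explain why the notebook holds out for a few minutes and then crashes.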