Memory error with TensorFlow GPU

I have an NVIDIA Titan V GPU installed on my remote machine, and I am trying to do simple multi-class classification using a CNN:

 import data_helper  # my own helper module

 train_file = './data/csv_data.csv.zip'
 x_raw, y_raw, df, labels = data_helper.load_data_and_labels(train_file)

Here my csv_data zip file is 65 MB.

 import logging
 import numpy as np
 from tensorflow.contrib import learn

 max_document_length = max([len(x.split(' ')) for x in x_raw])
 logging.info('The maximum length of all sentences: {}'.format(max_document_length))
 vocab_processor = learn.preprocessing.VocabularyProcessor(max_document_length)
 x = np.array(list(vocab_processor.fit_transform(x_raw)))
 y = np.array(y_raw)

I am getting a memory error at:

 vocab_processor = learn.preprocessing.VocabularyProcessor(max_document_length)

I think this code is trying to load all the data into memory at once, which is obviously bad practice. How can I rewrite it so the data is loaded memory-efficiently?
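For example, would a generator-based, multi-pass approach like the sketch below work? This is only a rough sketch of what I have in mind: the chunk size and the `text` column name are placeholders for my data, and it assumes my pandas version can stream the zipped CSV with `chunksize` (otherwise I would point it at the unzipped .csv first).

 import pandas as pd
 from tensorflow.contrib import learn

 TRAIN_FILE = './data/csv_data.csv.zip'
 TEXT_COLUMN = 'text'  # placeholder: whichever column holds the raw sentences

 def iter_documents():
     """Yield one document at a time so the full CSV never sits in RAM."""
     for chunk in pd.read_csv(TRAIN_FILE, chunksize=10000):
         for doc in chunk[TEXT_COLUMN].astype(str):
             yield doc

 # Pass 1: find the maximum sentence length without materializing the corpus.
 max_document_length = max(len(doc.split(' ')) for doc in iter_documents())

 # Pass 2: fit the vocabulary from the generator instead of an in-memory list.
 vocab_processor = learn.preprocessing.VocabularyProcessor(max_document_length)
 vocab_processor.fit(iter_documents())

 # Pass 3: transform lazily; each yielded array has shape (max_document_length,),
 # so mini-batches can be assembled and fed to the CNN one at a time instead of
 # building one giant (num_documents, max_document_length) array up front.
 for token_ids in vocab_processor.transform(iter_documents()):
     pass  # collect into mini-batches and run the training step here

My understanding is that this way only the vocabulary itself grows with the corpus, not the tokenized matrix, but I am not sure this is the idiomatic way to do it.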

Can anyone please point me to a detailed example or tutorial?

Thanks