Using numba to run a loop on the GPU instead of the CPU

I am doing a tuple operation in Python that takes a huge amount of memory, and every run of the process gets killed after a few minutes. Now I want to apply numba to the loop below so that it runs on my GPU instead of the CPU.

import itertools
import pandas as pd

# idendities and samples_list are built earlier in my script.
negatives = []

# Pair every sample of one identity with every sample of every other identity;
# each cross-identity combination becomes a negative pair.
for i in range(0, len(idendities) - 1):
    for j in range(i + 1, len(idendities)):
        # itertools.product yields the cross-identity pairs lazily.
        cross_product = itertools.product(samples_list[i], samples_list[j])

        for cross_sample in cross_product:
            negatives.append([cross_sample[0], cross_sample[1]])
            print(len(negatives))

negatives = pd.DataFrame(negatives, columns=["file_x", "file_y"])
negatives["decision"] = "No"