Python SQL Pandas - Cannot import dataframe larger than 27650 rows into database

I am trying to import a large CSV file (5 million rows) into a local MySQL database using the code below:


import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('mysql+mysqlconnector://[username]:[password]@[host]:[port]/[schema]', echo=False)
df = pd.read_csv('C:/Users/[user]/Documents/Sales_Records.csv')
df = df.head(27650)
df.to_sql(con=engine, name='data', if_exists='replace', chunksize=50000)

If I execute this code, it works as long as the row limit passed to df.head() is 27650 or fewer. However, as soon as I increase this row limit by just a single row, the import fails and no data is transferred to MySQL. Does anyone know why this would happen?
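For context, one common way to work around memory and server-side packet limits when loading a large CSV is to stream it in fixed-size chunks with read_csv's chunksize, writing each chunk to the table as it arrives. A minimal self-contained sketch (SQLite and an in-memory CSV stand in here for the MySQL database and the real file path; swap in your own connection URL and path):

```python
import io
import pandas as pd
from sqlalchemy import create_engine, text

# Stand-ins so the sketch runs anywhere: replace the SQLite URL with your
# MySQL one ('mysql+mysqlconnector://user:pass@host:port/schema') and the
# StringIO with the real CSV path.
csv_file = io.StringIO("id,amount\n" + "\n".join(f"{i},{i * 10}" for i in range(250)))
engine = create_engine('sqlite://', echo=False)

# Stream the CSV in fixed-size chunks instead of loading every row at once.
for i, chunk in enumerate(pd.read_csv(csv_file, chunksize=100)):
    # Replace the table on the first chunk, append the remaining ones.
    chunk.to_sql('data', con=engine, index=False,
                 if_exists='replace' if i == 0 else 'append')

with engine.connect() as conn:
    print(conn.execute(text('SELECT COUNT(*) FROM data')).scalar())  # 250
```

This keeps each INSERT batch small regardless of the total file size, which also makes it easier to spot the exact chunk on which a load fails.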

1 answer