Spark with MySQL: get the count of the records returned by a pushdown query

I have a join query that joins more than 5 tables in a MySQL database. I used the pushdown query method in Spark to read the records into a DataFrame df. However, it is taking a long time to count the number of records with the df.count() method. Since the query has already been executed in the database, the count must be available somewhere on the DB side. Is it possible to get that count instead of calculating it with df.count()?
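For reference, here is a simplified sketch of what I am doing (spark-shell style), along with the kind of count-only read I imagine might work: pushing a SELECT COUNT(*) over the same subquery down to MySQL so that only a single row comes back, instead of calling df.count(). The JDBC URL, credentials, and table names below are placeholders, not my real query.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("PushdownCount").getOrCreate()

    val jdbcUrl = "jdbc:mysql://dbhost:3306/mydb"   // placeholder URL
    // Placeholder join; my real query joins more than 5 tables.
    val innerQuery =
      """SELECT a.id, b.name
        |FROM table_a a
        |JOIN table_b b ON a.id = b.a_id""".stripMargin

    // Pushdown read: the subquery runs inside MySQL and the rows stream into df.
    val df = spark.read
      .format("jdbc")
      .option("url", jdbcUrl)
      .option("dbtable", s"($innerQuery) AS joined")
      .option("user", "user")
      .option("password", "secret")
      .load()

    // What I do today, which is slow because Spark has to pull and scan all the rows:
    // val n = df.count()

    // What I am hoping for instead: MySQL computes the count and returns one row.
    val countDf = spark.read
      .format("jdbc")
      .option("url", jdbcUrl)
      .option("dbtable", s"(SELECT COUNT(*) AS cnt FROM ($innerQuery) t) AS c")
      .option("user", "user")
      .option("password", "secret")
      .load()

    // COUNT(*) comes back from MySQL as a BIGINT, read here as a Long.
    val recordCount = countDf.first().getLong(0)

Is something like the second read above the right approach, or is there a way to reuse the count from the query that Spark already pushed down?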

Any help is greatly appreciated.

Regards

Shakti