Memory leak with Kafka v0.10 and Spark v2.2.1 on Spark Standalone?

I'm currently encountering a problem where the longer the Spark Streaming app runs, the higher its memory usage climbs. Eventually, my Spark job dies!

The Spark Streaming app basically reads records from Kafka every 15 seconds, parses them, and writes the results back to Kafka (on another topic); a stripped-down sketch of the job is included further down. The Spark Standalone cluster has this specification:

Total Workers: 10
Cores: 80
Total Memory: 303.8 GB

And my Spark config is the following:

--executor-cores 1 
--executor-memory 2g 
--driver-memory 2g
--conf "spark.cores.max=60"

The number of records consumed per batch is ~100k-200k, and processing time is ~12-19 seconds.
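
A stripped-down version of the job looks roughly like this (the broker address, topic names, group id, and the parsing step are simplified placeholders for the real ones):

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object ParseAndForward {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("parse-and-forward")
    val ssc = new StreamingContext(conf, Seconds(15)) // 15-second batches

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker:9092", // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "parse-and-forward",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("input-topic"), kafkaParams))

    stream.foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // one producer per partition per batch; the real parsing logic is
        // replaced here with a trivial transform
        val props = new java.util.Properties()
        props.put("bootstrap.servers", "broker:9092")
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)
        records.foreach { r =>
          val parsed = r.value().trim // stand-in for the actual parsing
          producer.send(new ProducerRecord("output-topic", r.key(), parsed))
        }
        producer.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}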

As I mentioned in the first paragraph, I suspect this is a memory leak. Can someone please explain why this is happening? I might be missing something (most likely!).