subclass of multiprocessing.Process never joins

I've made a class that inherits multiprocessing.Process:

from multiprocessing import Process

class Task(Process):
    def run(self):
        print("RUN")

        # doing some work

        self.close()
        print("closed")

And I start it in this way:

proc = Task()
proc.start()
proc.join()
print("Joined")

But it never joins, and the output looks like this:

>> RUN
>> closed

I'm not using any Queues or Pipes. When I ran this on Ubuntu, I tracked the process by its PID: the process still exists even after the self.close() line completes without raising any exception. I also ran it on Windows and watched the process in Task Manager: there the process exits after self.close(), but it still doesn't join. One more point: on Ubuntu, when everything is stuck after self.close() and I press Ctrl + C, I get this:

Traceback (most recent call last):
  File "Scheduler.py", line 164, in <module>
    scheduler.start()
  File "Scheduler.py", line 152, in start
    self.enqueueTask(plan)
  File "Scheduler.py", line 134, in enqueueTask
    proc.join()
  File "/usr/local/lib/python3.8/multiprocessing/process.py", line 149, in join
    res = self._popen.wait(timeout)
  File "/usr/local/lib/python3.8/multiprocessing/popen_fork.py", line 47, in wait
    return self.poll(os.WNOHANG if timeout == 0.0 else 0)
  File "/usr/local/lib/python3.8/multiprocessing/popen_fork.py", line 27, in poll
    pid, sts = os.waitpid(self.pid, flag)

Judging by that last line, I guess the main process is waiting for something, but for what, and why?

Update 1

Thanks to @Darkonaut's comment, I debugged the whole program. The problem is a non-daemon thread that I start in the run() method of Task, so I can say for sure that this thread is preventing my process from exiting even after its MainThread is done. But I'm still confused, because the target function of that non-daemon thread completes successfully. So my more specific question is: why does a non-daemon thread still exist and keep its process from exiting, even after its target finished 30 seconds ago?