ERROR SparkContext: Error initializing SparkContext when running a Python script

I want to run a Python script from the Enthought command prompt, but it gives me the error below:

(User) C:\Users\shatak\Downloads\Compressed\full machine learning course udemy\souce code\DataScience-Python3>spark-submit SparkDecisionTree.py
18/04/17 10:47:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/04/17 10:47:30 ERROR SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File file:/C:/Users/shatak/Downloads/Compressed/full%20machine%20learning%20course%20udemy/souce%20code/DataScience-Python3/SparkDecisionTree.py does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
        at org.apache.spark.SparkContext.addFile(SparkContext.scala:1528)
        at org.apache.spark.SparkContext.addFile(SparkContext.scala:1498)
        at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
        at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:461)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:238)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Unknown Source)
18/04/17 10:47:30 WARN MetricsSystem: Stopping a MetricsSystem that is not running
Traceback (most recent call last):
  File "C:/Users/shatak/Downloads/Compressed/full machine learning course udemy/souce code/DataScience-Python3/SparkDecisionTree.py", line 8, in <module>
    sc = SparkContext(conf = conf)
  File "C:\spark\python\lib\pyspark.zip\pyspark\context.py", line 118, in __init__
  File "C:\spark\python\lib\pyspark.zip\pyspark\context.py", line 180, in _do_init
  File "C:\spark\python\lib\pyspark.zip\pyspark\context.py", line 270, in _initialize_context
  File "C:\spark\python\lib\py4j-0.10.6-src.zip\py4j\java_gateway.py", line 1428, in __call__
  File "C:\spark\python\lib\py4j-0.10.6-src.zip\py4j\protocol.py", line 320, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/C:/Users/shatak/Downloads/Compressed/full%20machine%20learning%20course%20udemy/souce%20code/DataScience-Python3/SparkDecisionTree.py does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
        at org.apache.spark.SparkContext.addFile(SparkContext.scala:1528)
        at org.apache.spark.SparkContext.addFile(SparkContext.scala:1498)
        at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
        at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:461)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:238)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Unknown Source)

I have configured Hadoop by setting the winutils.exe path. I have previously run the pyspark command, which executed fine apart from a warning, but this script fails.

I think there is some error in my Spark configuration. I searched Stack Overflow and found one suggested solution to my problem, so in my spark-defaults.conf I set spark.eventLog.enabled to false, but that didn't solve the issue either.
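One detail that stands out in the log: the file does exist on disk, but the path Spark reports contains %20 sequences where the folder names have spaces. This is just percent-encoding of the local path when it is turned into a file: URI. A minimal sketch of how that encoding arises (this is plain Python, not what Spark or Hadoop do internally, so treat it only as an illustration of the %20 in the error message):

```python
from urllib.parse import quote

# The local path as it appears in the traceback, with literal spaces
path = ("C:/Users/shatak/Downloads/Compressed/"
        "full machine learning course udemy/souce code/"
        "DataScience-Python3/SparkDecisionTree.py")

# Percent-encode everything except "/" and ":" to form a file: URI;
# each space becomes %20, matching the path in the FileNotFoundException
uri = "file:/" + quote(path, safe="/:")
print(uri)
```

If the spaces are the trigger, copying the script to a directory without spaces (e.g. C:\spark-work\SparkDecisionTree.py) and submitting it from there would be a quick way to test that hypothesis.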