How to submit a job to Spark on Kubernetes ("External scheduler cannot be instantiated")

I have started a Minikube cluster, but when I run spark-submit, the job fails.

My command:

bin/spark-submit \
  --master k8s://https://192.168.99.101:8443 \
  --deploy-mode cluster \
  --name spark-test \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=5 \
  --conf spark.kubernetes.container.image=spark \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar
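If the master address is in doubt, it can be checked against the running cluster; for example:

kubectl cluster-info

prints the address of the Kubernetes API server, which should match the host and port in the k8s:// master URL.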

And I get the following error:

2019-03-13 18:26:57 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: External scheduler cannot be instantiated
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2794)
        ... 30 more
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://kubernetes.default.svc/api/v1/namespaces/default/pods/spark-test-1552501611130-driver. Message: Forbidden!Configured service account doesn't have access. Service account may have been revoked. pods "spark-test-1552501611130-driver" is forbidden: User "system:serviceaccount:default:default" cannot get resource "pods" in API group "" in the namespace "default".
        ... 30 more
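For context, the Forbidden message says the driver pod is running under the default service account, which has no RBAC permission to get pods. The Spark on Kubernetes documentation describes creating a dedicated service account with the edit role for this case; a minimal sketch (the spark account name is an arbitrary choice):

# Create a service account for the Spark driver and grant it the edit role,
# so it can create and inspect executor pods in the default namespace.
kubectl create serviceaccount spark
kubectl create clusterrolebinding spark-role --clusterrole=edit \
  --serviceaccount=default:spark --namespace=default

The driver would then be pointed at that account by adding --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark to the spark-submit invocation above.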