scala spark-shell not opening in current directory

scala spark-shell not opening in current directory

Taylor Sansom
Taylor Sansom
Howdy all, I'm new to Scala and Spark but not new to programming, and I'm having a problem I've never encountered before and don't understand. I want to start spark-shell in a specific directory, but when I navigate to the folder, run spark-shell, and check user.dir, it always says C:\. I'm taking some online courses to learn Scala and Spark, but nobody else seems to have this problem. Summary below:

   PS C:\Spark\My_Programs> cd .\Spark_DataFrames
   PS C:\Spark\My_Programs\Spark_DataFrames> spark-shell
   ...
   loading text removed
   ...
   scala> System.getProperty("user.dir")
   res0: String = C:\


If I try to load the file df.scala (which lives in C:\Spark\My_Programs\Spark_DataFrames), it says the file does not exist. I have to specify the full path of the file to load it:

   scala> :load df.scala
   That file does not exist

   scala> :load \Spark\My_Programs\Spark_DataFrames\df.scala
   Loading \Spark\My_Programs\Spark_DataFrames\df.scala...
   I'm working!!!


I assumed that I could just change the user.dir property to the correct path and then load the file, but I get the following:

   scala> System.setProperty("user.dir", "C:\\Spark\\My_Programs\\Spark_DataFrames\\")
   res1: String = C:\

   scala> System.getProperty("user.dir")
   res2: String = C:\Spark\My_Programs\Spark_DataFrames\

   scala> :load df.scala
   That file does not exist


But I can still give the full path to the file and it runs fine (which tells me the shell is still in C:\):

   scala> :load \Spark\My_Programs\Spark_DataFrames\df.scala
   Loading \Spark\My_Programs\Spark_DataFrames\df.scala...
   I'm working!!!


I would really love to be able to run the Spark shell in my current directory (as I can on my Mac and Linux systems) so I don't have to specify the full path to the file every time. Does anyone have suggestions on why this is happening and how to remedy it? Many thanks - Taylor
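For what it's worth, the mismatch above can be reproduced in a plain JVM with no Spark involved. The NIO filesystem layer captures the working directory when it is first used, so overwriting the `user.dir` property afterwards changes what `getProperty` reports but not how relative paths actually resolve. A minimal sketch (the object name and the substitute directory are made up for illustration):

```scala
import java.nio.file.Paths

object UserDirDemo {
  // What the filesystem layer thinks the working directory is.
  // The first NIO call captures user.dir as it was at this point.
  val before: String = Paths.get("").toAbsolutePath.toString

  // Overwrite the property; the process's real cwd is untouched.
  private val saved = System.setProperty("user.dir", "/some/other/dir")

  // NIO still resolves relative paths against the captured directory.
  val after: String = Paths.get("").toAbsolutePath.toString

  // Restore the property so nothing downstream gets confused.
  System.setProperty("user.dir", saved)

  def main(args: Array[String]): Unit =
    println(s"before = $before\nafter  = $after")
}
```

Running this prints the same directory twice, which matches the behavior in the transcript: `setProperty` changed the reported value, but `:load` kept resolving against `C:\`.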
Re: scala spark-shell not opening in current directory

Oliver Ruebenacker

     Hello,

  The current working directory (cwd) at the time the JVM starts is the cwd for that JVM for the rest of its life. It cannot be changed by setting system properties.

  spark-shell is a script that sets the cwd before it launches the JVM. Configure or edit the spark-shell script to run with a different cwd.

     Best, Oliver
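Until the script is fixed, one workaround is to stop depending on the cwd entirely: resolve script names against an explicit base directory once, then hand `:load` the absolute result. A small sketch, with the directory taken from the post above and the object name chosen here for illustration:

```scala
import java.nio.file.Paths

object ScriptPath {
  // Base directory where the course scripts live (from the post above).
  val base = Paths.get("C:\\Spark\\My_Programs\\Spark_DataFrames")

  // Absolute path string to paste after :load, e.g.
  //   :load C:\Spark\My_Programs\Spark_DataFrames\df.scala
  def script(name: String): String = base.resolve(name).toString
}
```

Calling `ScriptPath.script("df.scala")` inside the REPL builds the full path once, so only the base directory ever has to be typed out.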

On Fri, Jan 27, 2017 at 10:37 AM, taylor.sansom <[hidden email]> wrote:
[quoted text of the original message trimmed]

--
You received this message because you are subscribed to the Google Groups "scala-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [hidden email].
For more options, visit https://groups.google.com/d/optout.



--
Oliver Ruebenacker
Senior Software Engineer, Diabetes Portal, Broad Institute

Re: scala spark-shell not opening in current directory

Taylor Sansom
In reply to this post by Taylor Sansom
I was having the same problem when using both spark-shell and pyspark. Try running spark-shell2 instead of spark-shell.

Basically, spark-shell.cmd opens a new command-prompt instance, which then runs spark-shell2.cmd. My computer always opens that new command-prompt instance in the C:\ drive. If I figure out how to change that, I'll let you know. Hope this helps.
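Once the shell is up, a quick way to check whether the fix took is to ask the JVM where it actually started and whether the script is visible from there. Nothing Spark-specific here; the object name and `df.scala` check are just for illustration:

```scala
import java.nio.file.{Files, Paths}

object CwdCheck {
  // Directory the JVM - and therefore the REPL - was started from.
  val cwd = Paths.get(System.getProperty("user.dir")).toAbsolutePath

  // True when a bare `:load name` would find the file from here.
  def canSee(name: String): Boolean = Files.exists(cwd.resolve(name))

  def main(args: Array[String]): Unit = {
    println(s"REPL working directory: $cwd")
    println(s"df.scala visible here:  ${canSee("df.scala")}")
  }
}
```

If the first line prints the folder you launched from and the second prints `true`, the relative `:load df.scala` should work.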

Taylor

On Friday, January 27, 2017 at 3:02:14 PM UTC-6, Taylor Sansom wrote:
[quoted text of the original message trimmed]