I had not coded in Spark for a long while, specifically in PySpark.

Since I decided to use Spark to generate the sample data set for a neural network, I needed to install Spark.
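
Just to show what I mean by generating sample data with PySpark, here is a minimal sketch. The column names, row count, and output path are made up for illustration and are not the actual generation job for this project.

```python
# Minimal sketch of generating a synthetic sample set with PySpark.
# (Assumptions: local Spark, made-up feature/label columns and row count.)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.master("local[*]").appName("sample-data").getOrCreate()

n_rows = 100_000  # assumed size, just for illustration
df = (
    spark.range(n_rows)
    .withColumn("feature_1", F.rand(seed=1))
    .withColumn("feature_2", F.randn(seed=2))
    .withColumn("label", (F.col("feature_1") + F.col("feature_2") > 0.5).cast("int"))
)

# Write the generated samples out so the neural network training code can read them.
df.write.mode("overwrite").parquet("sample_data.parquet")
```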

I was not interested in running it from Docker etc. this time.

Actually, I missed running programs directly, without Docker.

I have not even defined any conda environments. I mean, I am just using the main environment there, with no special environment definition or makefile for the project.

I only need to generate sample data and that's it, and I am too lazy to create a project-specific Python environment, and even too lazy to use the main conda environment to install the libraries. Normally it's better to define a specific environment for the project and have a makefile that activates it, right? But I am lazy right now and don't want to spare time for a makefile or a dedicated conda environment for this sample data generation component of the project.

---------------


Hmm, I tried to install Spark on Windows and the conda PySpark had some errors, then I installed it on Linux/WSL and now it seems to be working, yay. But I lost many hours trying to solve the Spark problem on Windows (the environment variables seemed to be correct, I have no idea why it did not work). Still, it's nice that PySpark is finally working with WSL's Spark. I also needed to install an X server to communicate with desktop applications.
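
For reference, this is the kind of quick sanity check I would run under WSL to confirm the install works. It's just a sketch, assuming SPARK_HOME and JAVA_HOME are already exported in the WSL shell (e.g. in ~/.bashrc); the X server / DISPLAY setup is only needed for GUI applications, not for this.

```python
# Quick smoke test for the WSL-side Spark install (a sketch; assumes the
# SPARK_HOME / JAVA_HOME environment variables are already set in the shell).
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("wsl-smoke-test").getOrCreate()

# A trivial job: if this prints 1000, Spark is reachable from PySpark under WSL.
print(spark.range(1000).count())
print("Spark version:", spark.version)

spark.stop()
```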

