When playing with Structured Streaming, I left many streaming queries active in the background. You can list them with:

spark.streams.active

and you can stop all active queries by running:

spark.streams.active.map(_.stop)
A quick exercise: load a CSV file with a header into an Oracle table using Spark (the spark session is created automatically when launching spark-shell):

val prop = new java.util.Properties
prop.setProperty("user", "username")
prop.setProperty("password", "password")

val vd = spark.read
  .option("header", "true")
  .option("quote", "'")
  .option("escape", "^")
  .csv("file:///C:/tmp/test.csv")

vd.write.mode("overwrite").jdbc("jdbc:oracle:thin:@host:port/SID", "table_name", prop)
Using regexp_replace, you can extract parts of a path stored in an Oracle column using capture groups and backreferences:

select regexp_replace('/tsacrm1/inf/prdcrm1/JEE/CRMProduct/application/attachments/attachments/case/3709/Request to clear.txt.txt',
       '(.+)\/([^\/]+)\/([^\/]+)\/([^\/]+)\.([^\/]+)$', '\3')
from dual;

Here the replacement keeps only the third capture group, the directory that immediately contains the file: 3709.
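The same capture-group idea can be sketched in Python, whose re syntax is close enough to Oracle's pattern here (the escaped slashes are unnecessary and dropped); the path is the one from the query above:

```python
import re

# Same five capture groups as the Oracle example; the replacement keeps only
# group 3, the directory that immediately contains the file.
path = ("/tsacrm1/inf/prdcrm1/JEE/CRMProduct/application/attachments"
        "/attachments/case/3709/Request to clear.txt.txt")
pattern = r"(.+)/([^/]+)/([^/]+)/([^/]+)\.([^/]+)$"

print(re.sub(pattern, r"\3", path))  # → 3709
```

Note that the greedy `(.+)` swallows everything up to the last three path segments, and the greedy `[^/]+` before the literal dot makes group 5 the final extension only.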
After setting up a GitHub account, configure your local repository using Git Bash.

Create a folder for your local repository:

mkdir /c/localrepo
cd /c/localrepo

Configure the local repository:

git config --global user.name "username"
git config --global user.email "youremail@yourdomain"
git init
git remote add origin https://github.com/username/repository.git

Copy or create your file in your local repository and stage it for git:

git add File.txt
git commit -m "Add a comment for your addition/update"

Push your file from your local repository to your GitHub repository:

git push
warning: push.default is unset; its implicit value has changed in Git 2.0 from 'matching' to 'simple'. To squelch this message and maintain the traditional behavior, use:

git config --global push.default matching

To squelch this message and adopt the new behavior now, use: git config --g...
I was in a situation where I had to find a simple way to do a variable mapping. The script whatIs would have to translate the received parameter into one of the predefined values within the script. I wanted something straightforward, and didn't want to write a series of ifs to compare the contents. The easiest solution I found was to simply use eval to dereference the passed parameter and give me the correlated value.

whatIs:

orange="fruit"
potato="vegetable"
cod="fish"
eval "v=\$$1"
echo $v

Call the script whatIs with the parameter orange, and the output will be the value assigned to orange in the script:

whatIs orange
> fruit

As for my use case, I would then use the output (in this example: fruit) as a variable name in my program.
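For comparison, in a language with first-class dictionaries the same name-to-value mapping needs no eval at all; this is a hypothetical Python sketch of the whatIs lookup, not part of the original shell script:

```python
# Hypothetical Python equivalent of the whatIs script: a plain dict lookup
# replaces the eval-based indirection.
MAPPING = {"orange": "fruit", "potato": "vegetable", "cod": "fish"}

def what_is(name):
    # Return the predefined value for the given key, or None if unknown
    # (the shell version would simply echo an empty string).
    return MAPPING.get(name)

print(what_is("orange"))  # → fruit
```

The dict version also fails safely on unexpected input, whereas eval on an arbitrary parameter is risky if the argument is not trusted.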
One can filter data out of the partitioning clause of an analytic function using a CASE expression, e.g.:

select min(case when state <> 'CA' then createdate end) over (partition by id) MINACTIVEDATE
from your_table;
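A minimal sketch of what that conditional MIN computes, using plain Python over hypothetical rows (the column names mirror the query; the data is made up for illustration):

```python
# Hypothetical rows mirroring the query: per id, take the minimum createdate
# among rows whose state is not 'CA'. The CASE yields NULL (None) for 'CA'
# rows, and MIN ignores NULLs, which is what filters them out.
rows = [
    {"id": 1, "state": "CA", "createdate": "2015-01-01"},
    {"id": 1, "state": "NY", "createdate": "2015-03-01"},
    {"id": 1, "state": "TX", "createdate": "2015-02-01"},
    {"id": 2, "state": "CA", "createdate": "2015-05-01"},
]

def min_active_date(rows, row_id):
    dates = [r["createdate"] for r in rows
             if r["id"] == row_id and r["state"] != "CA"]
    # MIN over a partition whose values are all NULL is NULL.
    return min(dates) if dates else None

print(min_active_date(rows, 1))  # → 2015-02-01
print(min_active_date(rows, 2))  # → None
```

In the SQL version every row of the partition receives this same value, since the analytic MIN is computed over the whole partition rather than collapsing it.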