Banking.scala:
-------------
package com.spark.scala.learning

object Banking {
  def main(args: Array[String]): Unit = {
    val ob: BankingTransactions = new BankingTransactions
    while (true) {
      println("1. Deposit\n2. Withdrawal\n3. Balance Check\n4. Close")
      println("Enter your choice:\t")
      val ch: Int = scala.io.StdIn.readInt()
      ch match {
        case 1 =>
          println("Enter the amount to deposit:\t")
          val depositAmount: Long = scala.io.StdIn.readLong()
          val balDeposit: Long = ob.deposit(depositAmount)
          println("The balance after deposit is " + balDeposit)
        case 2 =>
          println("Enter the amount to withdraw:\t")
          val withdrawalAmount: Long = scala.io.StdIn.readLong()
          val balWithdrawn: Long = ob.withdrawal(withdrawalAmount)
          if (balWithdrawn < 0)
            println("Insufficient Funds")
          else
            println("The Balance after Withdrawal is " + balWithdrawn)
        case 3 =>
          val balCheck: Long = ob.balanceCheck()
          println("The balance is " + balCheck)
        case 4 =>
          System.exit(0)
        case _ =>
          println("Bad Choice")
      }
    }
  }
}
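One caveat with the loop above: `scala.io.StdIn.readInt()` throws a `NumberFormatException` if the user types something non-numeric, which would crash the whole program instead of reaching the "Bad Choice" case. A minimal sketch of a safer parse using `scala.util.Try` (the `parseChoice` helper is hypothetical, not part of the original code):

```scala
import scala.util.{Try, Success, Failure}

// Hypothetical helper: parse the raw input line instead of calling
// readInt() directly, so non-numeric input does not throw.
def parseChoice(input: String): Int =
  Try(input.trim.toInt) match {
    case Success(n) => n
    case Failure(_) => -1 // no menu option is -1, so "case _" prints Bad Choice
  }

println(parseChoice("3"))   // 3
println(parseChoice("abc")) // -1
```

In the main loop you would read the line with `scala.io.StdIn.readLine()` and match on `parseChoice(line)` instead of `readInt()`.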
BankingTransactions.scala:
-------------------------
package com.spark.scala.learning
class BankingTransactions {
  var bal: Long = 0

  def deposit(amount: Long): Long = {
    bal = bal + amount
    bal
  }

  def withdrawal(amount: Long): Long = {
    if (bal - amount < 0)
      -1 // signal insufficient funds; the balance is left untouched
    else {
      bal = bal - amount
      bal
    }
  }

  def balanceCheck(): Long = bal
}
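Because the interesting behavior lives in `BankingTransactions`, it can also be exercised directly, without typing at the console. A small self-contained sketch (it repeats the class with an overdraft guard that returns -1 on insufficient funds, so the snippet runs on its own):

```scala
// Self-contained copy of the class with the overdraft guard, so this
// snippet compiles and runs independently of the files above.
class BankingTransactions {
  var bal: Long = 0

  def deposit(amount: Long): Long = {
    bal = bal + amount
    bal
  }

  def withdrawal(amount: Long): Long =
    if (bal - amount < 0) -1 // insufficient funds; balance untouched
    else {
      bal = bal - amount
      bal
    }

  def balanceCheck(): Long = bal
}

val acct = new BankingTransactions
assert(acct.deposit(100L) == 100L)
assert(acct.withdrawal(30L) == 70L)
assert(acct.withdrawal(1000L) == -1L) // overdraft rejected
assert(acct.balanceCheck() == 70L)    // balance unchanged by the failed withdrawal
println("all checks passed")
```

This mirrors the sample session below: deposit 100, withdraw 30, balance 70.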
Output:
-------
1. Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
1
Enter the amount to deposit:
100
The balance after deposit is 100
1. Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
2
Enter the amount to withdraw:
30
The Balance after Withdrawal is 70
1. Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
3
The balance is 70
1. Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
234
Bad Choice
1. Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice: