Banking.scala:
-------------
package com.spark.scala.learning

object Banking {
  def main(args: Array[String]): Unit = {
    val ob: BankingTransactions = new BankingTransactions
    while (true) {
      println("1 . Deposit \n 2. Withdrawal \n 3. Balance Check \n 4. Close")
      println("Enter your choice:\t")
      val ch: Int = scala.io.StdIn.readInt()
      ch match {
        case 1 =>
          println("Enter the amount to deposit:\t")
          val depositAmount: Long = scala.io.StdIn.readLong()
          val bal_deposit: Long = ob.deposit(depositAmount)
          println("The balance after deposit is " + bal_deposit)
        case 2 =>
          println("Enter the amount to withdraw: \t")
          val withdrawalAmount: Long = scala.io.StdIn.readLong()
          val bal_withdrawn: Long = ob.withdrawal(withdrawalAmount)
          if (bal_withdrawn < 0)
            println("Insufficient Funds")
          else
            println("The Balance after Withdrawal is " + bal_withdrawn)
        case 3 =>
          val bal_check: Long = ob.balanceCheck()
          println("The balance is " + bal_check)
        case 4 =>
          System.exit(0)
        case _ =>
          println("Bad Choice")
      }
    }
  }
}
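One thing to keep in mind with this driver: scala.io.StdIn.readInt() throws a NumberFormatException when the input is not a number, which would end the loop with an exception instead of the "Bad Choice" message. A minimal sketch of a safer read, assuming bad input should simply fall through to the "Bad Choice" branch (the readChoice helper is not part of the original listing and would live inside the Banking object):

  import scala.util.Try

  // Hypothetical helper: returns -1 for any non-numeric input so the
  // match expression falls into the "Bad Choice" case.
  def readChoice(): Int =
    Try(scala.io.StdIn.readInt()).getOrElse(-1)

  // In main, the choice would then be read as:
  // val ch: Int = readChoice()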
BankingTransactions.scala:
-------------------------
package com.spark.scala.learning
class BankingTransactions {
  var bal: Long = 0

  def deposit(amount: Long): Long = {
    bal = bal + amount
    bal
  }

  // Return -1 to signal insufficient funds; the balance is only changed
  // when the full amount can be withdrawn.
  def withdrawal(amount: Long): Long = {
    if (bal - amount < 0) {
      -1
    } else {
      bal = bal - amount
      bal
    }
  }

  def balanceCheck(): Long = {
    bal
  }
}
1 . Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
1
Enter the amount to deposit:
100
The balance after deposit is 100
1 . Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
2
Enter the amount to withdraw:
30
The Balance after Withdrawal is 70
1 . Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
3
The balance is 70
1 . Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
234
Bad Choice
1 . Deposit
2. Withdrawal
3. Balance Check
4. Close
Enter your choice:
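The same flow can also be exercised without the interactive loop. A minimal sketch that reproduces the run above, assuming BankingTransactions is on the classpath (the BankingDemo object name is made up for this example):

  package com.spark.scala.learning

  object BankingDemo {
    def main(args: Array[String]): Unit = {
      val acct = new BankingTransactions
      println(acct.deposit(100))     // 100
      println(acct.withdrawal(30))   // 70
      println(acct.withdrawal(500))  // -1: insufficient funds, balance unchanged
      println(acct.balanceCheck())   // 70
    }
  }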