Channel: Recent Questions - Stack Overflow

databricks function - how to test without inputting column


Right now I have an Azure Databricks UDF that works successfully. I feed it a column from a DataFrame as a parameter and can view the results.

However, while developing I would like to feed it a literal value I type in, such as '202401'.

from pyspark.sql.functions import col, substr

def CreateBloombergSymbol(pctym):
    digitMonth = pctym[4:6]
    digitYear1 = pctym[3:4]
    digitYear2 = pctym[2:4]
    match_dict = {
        "01": "F", "02": "G", "03": "H", "04": "J", "05": "K", "06": "M",
        "07": "N", "08": "Q", "09": "U", "10": "V", "11": "X", "12": "Z",
        "foo": "missingValue"
    }
    charMonth = match_dict.get(digitMonth, "foo")
    return digitYear2

varMyBloombergFunction = spark.udf.register("CreateBloombergSymbol", CreateBloombergSymbol)

When I run it like this, it works:

from pyspark.sql.functions import col

dfPos.select(
    #col('*'),
    varMyBloombergFunction(col('pctym')).alias('BloombergSymbol')
).display()

When I run it like this, it does not work. Does anyone know what I am doing wrong, or misunderstanding?

x1 = varMyBloombergFunction('202401')
print(x1)

This is the output, which is not the expected result:

Column<'CreateBloombergSymbol(202401)'>
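(For context on what that output means: the handle returned by `spark.udf.register` does not execute eagerly; calling it only builds an unevaluated `Column` expression, which is what gets printed above. A minimal sketch of testing with a typed literal, assuming the same function body as the question; the Spark-side alternative is shown only as a comment:)

```python
# The UDF wrapper never runs outside a Spark query plan. To test with a
# typed value, call the underlying plain-Python function directly:
def CreateBloombergSymbol(pctym):
    digitMonth = pctym[4:6]                      # '01' from '202401'
    digitYear2 = pctym[2:4]                      # '24' from '202401'
    match_dict = {"01": "F", "02": "G", "03": "H", "04": "J", "05": "K",
                  "06": "M", "07": "N", "08": "Q", "09": "U", "10": "V",
                  "11": "X", "12": "Z"}
    charMonth = match_dict.get(digitMonth, "foo")
    return digitYear2

x1 = CreateBloombergSymbol('202401')             # plain Python call
print(x1)                                        # → 24

# Or, to go through Spark, wrap the literal in lit() and evaluate it
# via a select on a one-row DataFrame:
#   from pyspark.sql.functions import lit
#   spark.range(1).select(varMyBloombergFunction(lit('202401'))).display()
```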

