Hi, I have 14 notebooks that run on Databricks in the following way: the main notebook, named batch_main_dwh_risk, invokes another notebook called functions, and that one in turn invokes the other 12 notebooks that do the data processing. Within the main notebook I extract about 1,000 files of the form "fg000012" from Cosmos DB, and those file names are passed as arguments to the 12 processing notebooks. I then take this to Data Factory and set it up the way shown in the image. My questions are: is this correct? If so, how should it be done? Is there another, better way to set up the notebooks that would let me reduce the time it takes to process the data in Databricks?
I did this: [screenshot of the Data Factory pipeline]
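For reference, the invocation chain looks roughly like this (a minimal sketch of the setup described above; the notebook paths, the "file_name" widget name, and the hard-coded file list are assumptions, since I haven't pasted the full code):

```python
# batch_main_dwh_risk (main notebook) -- hypothetical sketch.
# File names of the form "fg000012" come from the Cosmos DB extraction
# (extraction code omitted; a short literal list stands in for ~1,000 names).
file_names = ["fg000001", "fg000002"]

# Each file name is forwarded to the "functions" notebook, one run at a time,
# so the whole batch executes sequentially.
for name in file_names:
    dbutils.notebook.run("./functions", 3600, {"file_name": name})
```

```python
# functions notebook -- hypothetical sketch.
# Receive the file name passed in by the main notebook.
file_name = dbutils.widgets.get("file_name")

# Invoke the 12 processing notebooks one after another, forwarding the
# same argument (the real notebook paths are assumptions).
processing_notebooks = [f"./processing_{i:02d}" for i in range(1, 13)]
for path in processing_notebooks:
    dbutils.notebook.run(path, 3600, {"file_name": file_name})
```

Note that every dbutils.notebook.run call in this sketch blocks until the child notebook finishes, which is why the total runtime grows with the number of files.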