I was wondering if anyone would be kind enough to share how they have set up their data engineering and automation for submitting to Signals.
I am already struggling with the most basic first step, i.e.:
- pricing data ETL: on a weekly schedule, grab pricing for the Signals universe from some provider endpoint for, say, t-10 years to t, and push it to a storage bucket
I do not want to manage any VMs, so I am trying to do this in a serverless way. I have tried GCP Cloud Functions, which is almost perfect, except the runtime is limited to 9 minutes and it takes about 40 minutes to grab pricing for the full universe. I have another data job that takes 4 hours.
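For concreteness, the step I'm describing is roughly the sketch below. `fetch_prices` is a stand-in for whatever the provider's endpoint returns, and `bucket_write` stands in for the GCS upload (in practice a thin wrapper over `google-cloud-storage`'s `Blob.upload_from_string`); both are hypothetical names, stubbed so the example runs locally:

```python
import csv
import datetime as dt
import io

def fetch_prices(ticker, start, end):
    # Hypothetical provider call -- replace with your vendor's endpoint.
    # Returns a list of (date, close) rows; stubbed here for illustration.
    return [(start.isoformat(), 100.0), (end.isoformat(), 101.0)]

def run_etl(universe, bucket_write):
    end = dt.date.today()
    start = end - dt.timedelta(days=365 * 10)  # t-10 years to t
    for ticker in universe:
        rows = fetch_prices(ticker, start, end)
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["date", "close"])
        writer.writerows(rows)
        # In a real deployment this uploads to a GCS bucket.
        bucket_write(f"prices/{ticker}.csv", buf.getvalue())

# Stub the bucket with a dict so the sketch is runnable locally.
store = {}
run_etl(["AAA", "BBB"], lambda path, data: store.__setitem__(path, data))
```

The whole thing is embarrassingly simple; the only problem is that the loop over the full universe takes longer than the function runtime limit allows.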
How is everyone doing this? This is a common data engineering step that we all need to do.