I am currently designing several dockerized workers that I would like to run from time to time, on demand and recurrently, in my Qovery environment.
I would like to be able to deploy one or several workers in my Qovery environment on a specific external event (I am thinking of SQS for its lightness, one queue per job type) to handle the on-demand deployments.
Do you have any idea what the best way to design this would be? I am thinking of creating a small service that would be up at all times in my Qovery environment, listening to an SQS queue and triggering job deployments with arguments via the Qovery API.
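For what it's worth, that dispatcher could be sketched roughly like this. This is a minimal illustration only, assuming boto3 for SQS; the queue URL, the `QOVERY_API_TOKEN` variable, the `build_job_deployment()` helper, and the job-deploy route are hypothetical names I made up, so the actual Qovery API reference should be checked for the real endpoint and auth scheme:

```python
"""Sketch of a small always-up dispatcher: poll SQS, trigger a Qovery job."""
import json
import os


def build_job_deployment(message_body: str) -> dict:
    """Map an SQS message to a job-deployment payload.

    With one queue per job type, the queue identifies the job and the
    message carries only the job's arguments. (Hypothetical helper.)
    """
    args = json.loads(message_body)
    return {
        "job_id": args["job_id"],                 # which job to start
        "arguments": args.get("arguments", {}),   # forwarded to the worker
    }


def poll_forever(queue_url: str) -> None:
    """Long-poll the queue and trigger one deployment per message."""
    import boto3     # AWS SDK; credentials expected in the environment
    import requests

    sqs = boto3.client("sqs")
    headers = {"Authorization": f"Token {os.environ['QOVERY_API_TOKEN']}"}
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url,
            MaxNumberOfMessages=1,
            WaitTimeSeconds=20,  # long polling keeps the idle loop cheap
        )
        for msg in resp.get("Messages", []):
            payload = build_job_deployment(msg["Body"])
            # Hypothetical route: consult the Qovery API reference for
            # the actual job-deployment endpoint and request body.
            requests.post(
                f"https://api.qovery.com/job/{payload['job_id']}/deploy",
                json={"arguments": payload["arguments"]},
                headers=headers,
            ).raise_for_status()
            # Only acknowledge the message once the deployment was accepted.
            sqs.delete_message(
                QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
            )
```

Deleting the SQS message only after the deploy call succeeds means a crashed dispatcher simply re-receives the message later, which is the usual at-least-once pattern with SQS.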
I considered using n8n at first to achieve this, but I don't like the "no code" bias: workflows cannot be maintained in a git repository for testing and releasing, and they are executed in the same environment, with no containerization.
Concerning your questions:
The service could expose an HTTP API, yes, at least for health checks, but only internally to my namespace; the service would focus solely on worker management.
As for the number of workers, it will grow over time as we gain customers, but at the moment I have identified two types of workers:
the first one is for asset resizing, on demand only; it is difficult to estimate, but roughly two or three times a day, depending on our customers' workload, and not during the weekend;
the second one is for data scraping, on demand two or three times a week, plus a recurring nightly run if the customer chooses to enable it.
Once its execution is done, a worker can be safely dropped.
Hi @qvdp, it's clear to me. What you want to achieve is perfectly doable with Qovery, in either a programmatic or a non-programmatic way. We provide an API and clients you can use to provision services with the appropriate configuration.