1. Yes. I also encourage you to check out incubator-liminal:
https://github.com/apache/incubator-liminal/
Natural Intelligence contributes to Apache open source, and there you can see how we render DAGs.
2. CI - Each build generates DAGs and uploads them under its commit hash to Amazon S3.
The structure in Amazon S3 is per project and is divided into releases (e.g. v1.4) and snapshots (e.g. a commit hash).
CD - Deployment happens by the commit hash or release version.
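The per-project layout described above can be sketched as a simple key scheme. This is only an illustration of the pattern; the bucket layout, prefixes, and helper name are assumptions, not our actual setup:

```python
# Illustrative sketch of the per-project S3 layout described above.
# The prefixes and helper name are assumptions, not the actual setup.

def dag_artifact_key(project: str, ref: str, filename: str, release: bool) -> str:
    """Build the S3 key for a generated DAG file.

    release=True  -> keys like  my-project/release/v1.4/etl_dag.py
    release=False -> keys like  my-project/snapshot/<commitHash>/etl_dag.py
    """
    kind = "release" if release else "snapshot"
    return f"{project}/{kind}/{ref}/{filename}"

# CI uploads under the commit hash (snapshot)...
print(dag_artifact_key("my-project", "9f2c1ab", "etl_dag.py", release=False))
# -> my-project/snapshot/9f2c1ab/etl_dag.py

# ...and CD deploys by commit hash or release version.
print(dag_artifact_key("my-project", "v1.4", "etl_dag.py", release=True))
# -> my-project/release/v1.4/etl_dag.py
```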
FYI: in the old approach, we generated DAGs from the YAML and stored the resulting .py files in Amazon S3.
The liminal project instead creates the DAGs programmatically, so there is no need to store them in Artifactory; GitHub can serve them in the same way.
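To illustrate what "creates the DAGs programmatically" means in general: a single parse-time factory turns declarative pipeline configs into DAG objects, so no generated .py files need to be stored anywhere. This is a hedged, dependency-free sketch of the pattern, not liminal's actual API; the config shape and function names are assumptions, and a plain dict stands in for an airflow.DAG instance:

```python
# Hedged sketch of programmatic DAG generation; the config shape and
# names are illustrative, not liminal's actual API.

def build_dags(configs):
    """Turn declarative pipeline configs into DAG definitions at parse time.

    With Airflow installed, each entry would be an airflow.DAG instance
    registered into globals() so the scheduler discovers it; here a plain
    dict stands in to keep the sketch dependency-free.
    """
    dags = {}
    for cfg in configs:
        dag_id = cfg["name"]
        dags[dag_id] = {
            "dag_id": dag_id,
            "schedule": cfg.get("schedule", "@daily"),
            "tasks": [task["name"] for task in cfg.get("tasks", [])],
        }
    return dags

# Configs would typically be loaded from YAML files in the repository.
pipelines = [
    {"name": "sales_etl", "schedule": "0 3 * * *",
     "tasks": [{"name": "extract"}, {"name": "load"}]},
]
dags = build_dags(pipelines)
# globals().update(dags)  # with real airflow.DAG objects, this exposes them
print(sorted(dags))  # -> ['sales_etl']
```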
In case of disaster recovery: we use EFS and keep a backup, although Airflow production is just the latest state of each repository in GitHub, so we only need to redeploy everything at once.
In addition, we build our own Airflow image, based on the official one, with extra dependencies such as plugins.
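A custom image like that is typically a thin layer on top of the official one. This is a minimal sketch, not our actual Dockerfile; the base tag, requirements file, and plugin path are assumptions:

```dockerfile
# Hedged sketch - base tag, file names, and paths are assumptions.
FROM apache/airflow:2.7.3

# Extra Python dependencies the DAGs need.
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt

# Custom plugins baked into the image.
COPY plugins/ ${AIRFLOW_HOME}/plugins/
```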
Feel free to contribute to incubator-liminal in your spare time.