-
Looking at what you are offering here and I think it's wonderful. Kudos for the level of detail and knowledge sharing put in across the board - it has not gone unnoticed! I am a Director of Data Science & MLOps at an org committed to automating and proactively responding to data incidents. We are a GCP, cloud-first group. My question is: rather than joining DQOps' GCP resources, are we able to set up shop within our own organization? Meaning not just the front end, but everything? I haven't been able to find anything speaking to that in the resources you have shared, but if that's a possibility I'd love to learn more. I'm guessing the initial reaction is something like: it's not just one service or tool, it's integrated widely across the board (compute, messaging, hosting, etc.), which would certainly be understandable. That said, this is what we do on a daily basis (manage massive data and provide it to customers end-to-end), so I'm not worried about having to stand it up, maintain it, and sustain it on our own. If you could share a bit about what that would look like, or perhaps we could jump on a call to do the same, I'd appreciate it! 🔥
-
DQOps has two parts.
The big part, which you can host anywhere, is the DQOps engine itself: it has the UI, the data quality rules, and an offline data lake with Parquet files. You can run that inside your org at any time.
The remaining part, which would be hosted as a commercial SaaS platform, is the data warehouse and the data quality dashboards. We set up a small data quality data lake on GCP for each user. That is: …
Besides that, we have a set of dashboards built with Looker Studio. They connect to the customer's data warehouse using our Looker Studio Community Connector. Given that, there are four ways to host DQOps in your environment: …