Multi-tenant SaaS using GCP as back-end data store and compute engine

by TeeJ   Last Updated August 13, 2019 22:05 PM

I have trawled through the furthest corners of the web for a solution to my problem and have come up with nothing after a week.

I am developing a SaaS product for multiple tenants. Each tenant will be able to upload their data to BigQuery via Google Cloud Storage. For the users, however, it will only be an "Upload" button on the front end, with all the wonders of GCP abstracted away from them. Once they have uploaded their huge datasets (5 GB per file), they will be able to visualise them in my React web app on that and subsequent logins. For each tenant (separate clients of mine) I need to create viewer roles that can view datasets and creator roles that can create datasets for both viewers and creators to see.
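For concreteness, a minimal sketch of the per-tenant storage layout this implies, assuming one BigQuery dataset per tenant and tenant-prefixed paths in a shared GCS bucket (all names here are hypothetical, not from a real project):

```python
def tenant_table_id(project: str, tenant: str, table: str) -> str:
    """Fully-qualified BigQuery table ID, one dataset per tenant (naming is an assumption)."""
    return f"{project}.tenant_{tenant}.{table}"

def upload_uri(bucket: str, tenant: str, filename: str) -> str:
    """GCS URI for a tenant's uploaded file, namespaced by tenant."""
    return f"gs://{bucket}/{tenant}/{filename}"

# The actual load step would then use the google-cloud-bigquery client, e.g.:
#   client = bigquery.Client()
#   job = client.load_table_from_uri(
#       upload_uri("uploads", "client_a", "run1.csv"),
#       tenant_table_id("my-proj", "client_a", "vehicles"),
#       job_config=job_config,
#   )
#   job.result()
```

Keeping tenants in separate datasets (rather than separate tables in one dataset) makes it possible to grant dataset-level viewer/creator access per client later.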

Is this at all possible? I do not believe service accounts are appropriate here, as they would result in everyone having the same authorization (i.e. that of the service account). At the same time, I do not want to force customers to have a Google account, as their preference is to use their enterprise domain email and their own passwords. I do not need access to any of their data either; I just want to authenticate them and assign a role.
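One way to express the "authenticate them and assign a role" step, independent of any Google account, is a mapping from the user's enterprise email domain to a tenant, and from (tenant, user) to a role. A rough sketch, with all domains and names invented for illustration (in practice this registry would live in a database):

```python
# Hypothetical tenant/role registry.
TENANT_BY_DOMAIN = {
    "client-a.example.com": "client_a",
    "client-b.example.com": "client_b",
}

ROLE_BY_USER = {
    ("client_a", "alice@client-a.example.com"): "creator",
    ("client_b", "bob@client-b.example.com"): "viewer",
}

def resolve_user(email: str) -> tuple:
    """Map an already-authenticated email to (tenant, role); default to viewer."""
    domain = email.split("@", 1)[1]
    tenant = TENANT_BY_DOMAIN[domain]
    role = ROLE_BY_USER.get((tenant, email), "viewer")
    return tenant, role
```

The authentication itself (checking the enterprise password) would happen before this step; this only covers the tenant/role lookup.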

Currently I have a mock-up that uses service accounts, and it works fine. But everyone has the same role once they have logged into my app, because the Flask backend just uses a single service account with minimal privileges.

I was thinking of perhaps mapping a service account to each user type for the respective organisations, and then setting the corresponding credentials environment variable in the Flask app that calls the GCP APIs.
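A rough sketch of that idea, assuming one minimally-privileged service-account key file per (tenant, role) pair and using the standard `GOOGLE_APPLICATION_CREDENTIALS` variable that the Google client libraries read (all key paths are hypothetical):

```python
import os

# Hypothetical key files, one service account per tenant+role combination.
SA_KEY_PATHS = {
    ("client_a", "creator"): "/secrets/client-a-creator.json",
    ("client_a", "viewer"): "/secrets/client-a-viewer.json",
    ("client_b", "viewer"): "/secrets/client-b-viewer.json",
}

def use_service_account(tenant: str, role: str) -> str:
    """Point the Google client libraries at the tenant/role's key file."""
    path = SA_KEY_PATHS[(tenant, role)]
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path
```

One caveat: an environment variable is process-wide, so in a Flask process serving many users concurrently it would be safer to pass credentials explicitly per request (e.g. `bigquery.Client.from_service_account_json(path)`) rather than mutate `os.environ`.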

This will be on Kubernetes and multi-tenancy will be achieved through completely separate namespaces.
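A minimal sketch of that namespace-per-tenant setup (tenant and manifest names are illustrative):

```shell
# One Kubernetes namespace per tenant; each runs its own copy of the backend.
kubectl create namespace tenant-client-a
kubectl create namespace tenant-client-b

# Deploy the Flask backend separately into each tenant's namespace.
kubectl -n tenant-client-a apply -f backend.yaml
kubectl -n tenant-client-b apply -f backend.yaml
```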

Two examples of a customer journey to illustrate: 1) A user from client A logs in to my app -> they have access to create datasets, so they upload their vehicle dynamics datasets -> the file is read into a GCS bucket -> the file is then loaded into BigQuery using a schema determined by a Flask backend application I have written -> the client can then navigate the data on a React dashboard, employing aggregations to keep the data within the limits of the client browser.
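The "schema determined by the Flask backend" step could, for instance, be a naive inference pass over the header and first data row before submitting the BigQuery load job. A sketch under the assumption that the uploads are CSV (the details above don't say):

```python
import csv
import io

def infer_bq_schema(csv_text: str) -> list:
    """Guess a BigQuery schema from the header and first data row.

    Naive rule (an assumption): FLOAT64 if the value parses as a number,
    otherwise STRING.
    """
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    first_row = next(reader)
    schema = []
    for name, value in zip(header, first_row):
        try:
            float(value)
            field_type = "FLOAT64"
        except ValueError:
            field_type = "STRING"
        schema.append({"name": name, "type": field_type})
    return schema
```

A real implementation would sample more rows and handle INT64, TIMESTAMP, nulls, etc., or simply let BigQuery's own autodetect do this during the load job.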

2) A user from client B logs in to my app -> they only have viewer access, so they can view datasets that were loaded previously by their dataset-creator colleague -> they observe the data, save some PNGs to a slide pack and then log out of my web app.
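The aggregation step mentioned in journey 1 (keeping result sizes within browser limits) might look like a parameterised GROUP BY with a hard row cap. A sketch with invented column and table names:

```python
def aggregation_query(table_id: str, group_col: str, value_col: str, max_rows: int = 1000) -> str:
    """Build a BigQuery aggregation query capped at max_rows,
    so the React client never receives the raw multi-GB dataset."""
    return (
        f"SELECT {group_col}, AVG({value_col}) AS avg_{value_col} "
        f"FROM `{table_id}` GROUP BY {group_col} "
        f"ORDER BY {group_col} LIMIT {max_rows}"
    )
```

Since column and table names cannot be bound as query parameters, a real version should validate them against an allow-list before interpolation to avoid SQL injection.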

Does anyone have any tips, documentation, or use-cases that could help? I am considering dropping GCP altogether because of this, despite it being very impressive otherwise.
