
Removes support for plio-analytics and BIGQUERY #310

Merged · 26 commits · Jan 27, 2022

Commits
a582ec9
NEW: updated endpoint for listing plios and removed endpoint for list…
dalmia Jan 21, 2022
a988bc0
DEL: removed BIGQUERY support
dalmia Jan 21, 2022
cdb798b
NEW: added endpoint for calculating plio metrics
dalmia Jan 21, 2022
00f9282
FIX: handle the case when only one session
dalmia Jan 21, 2022
713518e
FIX: failing tests
dalmia Jan 24, 2022
3266e24
NEW: test for checking num_views in fetching plio list
dalmia Jan 25, 2022
9ab6179
NEW: more tests for metrics - average watch time and num views
dalmia Jan 25, 2022
4de7586
FIX: merge conflicts with master
dalmia Jan 25, 2022
a8d3111
NEW: tests for retention calculation
dalmia Jan 25, 2022
7184f0f
NEW: tests for question metrics without any answer
dalmia Jan 25, 2022
f21c604
NEW: tests for question metrics when answered have been added
dalmia Jan 25, 2022
fc13c1f
DEL: all traces of plio-analytics
dalmia Jan 25, 2022
eeffa6e
DEL: all traces of plio-analytics
dalmia Jan 25, 2022
8714ec6
DEL: all traces of plio-analytics
dalmia Jan 25, 2022
9fe2426
DEL: all traces of analytics
dalmia Jan 25, 2022
7375ffa
DEL: all traces of analytics
dalmia Jan 25, 2022
9ea8cd5
DOC: added comments
dalmia Jan 25, 2022
41b5f7d
FIX: user IDs hashed
dalmia Jan 25, 2022
d18d62f
FIX: bug when no session with valid retention
dalmia Jan 25, 2022
afc8717
NEW: added test for checking the bug solved in last commit
dalmia Jan 25, 2022
12436ee
merge master
dalmia Jan 25, 2022
09860b4
ENH: moved sql queries to queries.py
dalmia Jan 27, 2022
fad262c
ENH: add unique_viewers directly to the queryset
dalmia Jan 27, 2022
5c3204a
FIX: failing tests
dalmia Jan 27, 2022
79d8d96
FIX: used the correct type + better variable name
dalmia Jan 27, 2022
c6fa4f7
DEL: useless kwargs
dalmia Jan 27, 2022
13 changes: 0 additions & 13 deletions .env.example
@@ -27,12 +27,6 @@ AWS_SECRET_ACCESS_KEY=''
AWS_REGION=''
AWS_STORAGE_BUCKET_NAME=''

# bigquery credentials
BIGQUERY_ENABLED=False
BIGQUERY_PROJECT_ID=''
BIGQUERY_LOCATION=''
BIGQUERY_CREDENTIALS=''

# redis details
REDIS_HOSTNAME='redis'
REDIS_PORT=6379
@@ -44,13 +38,6 @@ DEFAULT_TENANT_NAME=Plio
DEFAULT_TENANT_SHORTCODE=plio
DEFAULT_TENANT_DOMAIN=0.0.0.0

# Analytics Identity Provider (IDP) configurations
ANALYTICS_IDP_TYPE='' # possible values are `cognito` or `auth0`
ANALYTICS_IDP_TOKEN_URL=''
ANALYTICS_IDP_CLIENT_ID=''
ANALYTICS_IDP_CLIENT_SECRET=''
ANALYTICS_IDP_AUDIENCE='' # not needed when IDP is `cognito`

# The driver for sending SMSs. Possible values are `sns` or `log`.
# Use `sns` to have AWS SNS support. The AWS credentials must be present for this.
# Use an empty string to log SMSs into a file instead. Recommended for development mode.
134 changes: 0 additions & 134 deletions .github/workflows/ci.yml
@@ -63,144 +63,10 @@ jobs:
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_REGION: ${{ secrets.AWS_REGION }}
AWS_STORAGE_BUCKET_NAME: ${{ secrets.AWS_STORAGE_BUCKET_NAME }}
ANALYTICS_IDP_TYPE: ${{ secrets.ANALYTICS_IDP_TYPE }}
ANALYTICS_IDP_TOKEN_URL: ${{ secrets.ANALYTICS_IDP_TOKEN_URL }}
ANALYTICS_IDP_CLIENT_ID: ${{ secrets.ANALYTICS_IDP_CLIENT_ID }}
ANALYTICS_IDP_CLIENT_SECRET: ${{ secrets.ANALYTICS_IDP_CLIENT_SECRET }}
REDIS_HOSTNAME: 127.0.0.1
REDIS_PORT: 6379
# command to run tests and generate coverage metrics
run: coverage run manage.py test

- name: Upload coverage to Codecov
uses: codecov/codecov-action@v1

integration-tests:
name: Integration tests
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v2

- name: Set up Plio Frontend
run: |
# clone the project
mkdir -p projects/ && cd projects/
git clone https://github.com/avantifellows/plio-frontend/
cd plio-frontend/

# check branch and switch to branch if exists
if [ `git branch --list --remote origin/${{ github.head_ref }}` ]
then
echo "Switching to branch ${{ github.head_ref }}."
git checkout ${{ github.head_ref }}
git pull origin ${{ github.head_ref }}
else
echo "Branch not found. Going with default branch."
fi

# create the env file
cp .env.example .env

# add env secrets
echo 'VUE_APP_GOOGLE_CLIENT_ID=${{ secrets.GOOGLE_OAUTH2_CLIENT_ID }}' >> .env
echo 'VUE_APP_BACKEND_API_CLIENT_ID=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_ID }}' >> .env
echo 'VUE_APP_BACKEND_API_CLIENT_SECRET=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_SECRET }}' >> .env

# setup docker containers
docker-compose up -d --build

- name: Set up Plio Backend
run: |
# clone the project
mkdir -p projects/ && cd projects/
git clone https://github.com/avantifellows/plio-backend
cd plio-backend/

# check branch and switch to branch if exists
if [ `git branch --list --remote origin/${{ github.head_ref }}` ]
then
echo "Switching to branch ${{ github.head_ref }}."
git checkout ${{ github.head_ref }}
git pull origin ${{ github.head_ref }}
else
echo "Branch not found. Going with default branch."
fi

# create the env file
cp .env.example .env

# add env secrets
echo 'SECRET_KEY=${{ secrets.DJANGO_SECRET_KEY }}' >> .env
echo 'DEFAULT_OAUTH2_CLIENT_SETUP=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_SETUP }}' >> .env
echo 'DEFAULT_OAUTH2_CLIENT_ID=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_ID }}' >> .env
echo 'DEFAULT_OAUTH2_CLIENT_SECRET=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_SECRET }}' >> .env
echo 'GOOGLE_OAUTH2_CLIENT_ID=${{ secrets.GOOGLE_OAUTH2_CLIENT_ID }}' >> .env
echo 'GOOGLE_OAUTH2_CLIENT_SECRET=${{ secrets.GOOGLE_OAUTH2_CLIENT_SECRET }}' >> .env

# setup docker containers
docker-compose up -d --build

- name: Set up Plio Analytics
run: |
# clone the project
mkdir -p projects/ && cd projects/
git clone https://github.com/avantifellows/plio-analytics
cd plio-analytics/

# check branch and switch to branch if exists
if [ `git branch --list --remote origin/${{ github.head_ref }}` ]
then
echo "Switching to branch ${{ github.head_ref }}."
git checkout ${{ github.head_ref }}
git pull origin ${{ github.head_ref }}
else
echo "Branch not found. Going with default branch."
fi

# create the env file
cp .env.example .env

# add env secrets
echo 'CUBEJS_API_SECRET=${{ secrets.ANALYTICS_CUBEJS_API_SECRET }}' >> .env

# setup docker containers
docker-compose up -d --build

- name: Run cypress
run: |
cd projects/plio-frontend/

# delete the node_modules created by the docker
echo 'deleting node modules'
rm -rf node_modules/

# install dependencies from the current shell user
echo 'running npm'
npm install
echo 'finished npm install'

# setup env secrets
export cypress_backend_convert_social_auth_token_url=${{ secrets.CYPRESS_PLIO_BACKEND_CONVERT_SOCIAL_AUTH_TOKEN_URL }}
export cypress_backend_client_id=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_ID }}
export cypress_backend_client_secret=${{ secrets.DJANGO_DEFAULT_OAUTH2_CLIENT_SECRET }}
export cypress_auth_google_refresh_token=${{ secrets.CYPRESS_AUTH_GOOGLE_REFRESH_TOKEN }}
export cypress_auth_google_client_id=${{ secrets.GOOGLE_OAUTH2_CLIENT_ID }}
export cypress_auth_google_client_secret=${{ secrets.GOOGLE_OAUTH2_CLIENT_SECRET }}

# run cypress test cases
npx cypress run --record --key ${{ secrets.CYPRESS_RECORD_KEY }}

- name: Coveralls
uses: coverallsapp/github-action@master
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
path-to-lcov: ./projects/plio-frontend/coverage/lcov.info

- name: Stop the containers
if: always()
run: |
cd projects/
cd plio-frontend/ && docker-compose down
cd ../plio-backend/ && docker-compose down
cd ../plio-analytics/ && docker-compose down
9 changes: 0 additions & 9 deletions docs/DEPLOYMENT.md
@@ -19,7 +19,6 @@ Deploying on AWS requires a basic understanding of the following tools and services:
9. AWS Elastic IPs
10. AWS Identity and Access Management (IAM)
11. AWS Relational Database Service (RDS)
12. Google BigQuery

## Staging

@@ -245,11 +244,6 @@ Follow the steps below to set up the staging environment on AWS.
- AWS_SECRET_ACCESS_KEY
- AWS_REGION
- AWS_STORAGE_BUCKET_NAME
- ANALYTICS_IDP_TYPE
- ANALYTICS_IDP_TOKEN_URL
- ANALYTICS_IDP_CLIENT_ID
- ANALYTICS_IDP_CLIENT_SECRET
- ANALYTICS_IDP_AUDIENCE (optional)

14. We are using Github Actions to trigger deployments. You can find the workflow defined in `.github/workflows/deploy_to_ecs_staging.yml`. It defines a target branch such that a deployment is initiated whenever a change is pushed to the target branch.

@@ -296,6 +290,3 @@ Setting up a production environment on AWS is almost the same as staging. Additi
14. Save the scaling policy.
15. Create or update the service name.
16. Use [k6.io](https://k6.io/) or other load testing tool to check if auto-scaling is working fine or not. You can lower down the target threshold for testing purposes.
5. If you're setting up [Plio Analytics](https://github.com/avantifellows/plio-analytics), also make sure to configure the required environment variables:
1. [Identity Provider for Plio Analytics](./ENV.md#identity-provider-for-plio-analytics).
2. [BigQuery configurations](./ENV.md#bigquery-configurations).
47 changes: 0 additions & 47 deletions docs/ENV.md
@@ -103,53 +103,6 @@ Shortcode for the default tenant (e.g. plio)
The domain for the default tenant (e.g. 0.0.0.0 locally, plio.in on production)


### Identity Provider for Plio Analytics
While setting up Plio Analytics, you need to make sure the following variables are also updated. These are responsible for fetching an access token from the configured Identity Provider.

#### `ANALYTICS_IDP_TYPE`
Plio Analytics supports two identity providers. The possible values for this variable are `cognito` (AWS Cognito) and `auth0` (Auth0).

#### `ANALYTICS_IDP_TOKEN_URL`
The URL to request an access token from the Identity Provider. It generally looks like:
1. When type is `cognito`: `https://<APP-DOMAIN-PREFIX>.auth.<aws-region>.amazoncognito.com/oauth2/token`. This is the same as the Amazon Cognito domain you have configured.
2. When type is `auth0`: `https://<AUTH0-SUBDOMAIN>.<REGION>.auth0.com/oauth/token`

#### `ANALYTICS_IDP_CLIENT_ID`
The client id for your identity provider app.
1. When type is `cognito`: Retrieve this from your User pool's "App clients" page.
2. When type is `auth0`: Retrieve from Auth0 Application settings page.

#### `ANALYTICS_IDP_CLIENT_SECRET`
The client secret for your identity provider app.
1. When type is `cognito`: Retrieve this from your User pool's "App clients" page.
2. When type is `auth0`: Retrieve from Auth0 Application settings page.

#### `ANALYTICS_IDP_AUDIENCE`
Unique Identifier for your Auth0 API.
1. When type is `cognito`: Not needed.
2. When type is `auth0`: Retrieve from Auth0 API settings.
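Both Cognito and Auth0 issue machine-to-machine tokens via the standard OAuth2 client-credentials grant, which is what these variables feed. A minimal sketch of the request body they would produce (the function name and values are illustrative, not part of the codebase):

```python
def build_token_payload(client_id, client_secret, audience=None):
    """Build the form body for an OAuth2 client-credentials token request.

    `audience` is only included when set, matching the note above that
    ANALYTICS_IDP_AUDIENCE is not needed when the IDP is `cognito`.
    """
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    if audience:
        payload["audience"] = audience
    return payload

# this payload would be POSTed to ANALYTICS_IDP_TOKEN_URL
print(build_token_payload("my-client-id", "my-secret"))
```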


### BigQuery configurations
BigQuery settings are needed if Plio Analytics is configured to use BigQuery. We recommend using BigQuery for staging/production setups.

#### `BIGQUERY_ENABLED`
Boolean value. Defaults to `False` if not set.

#### `BIGQUERY_PROJECT_ID`
The BigQuery project id that contains the datasets.

#### `BIGQUERY_LOCATION`
The location of the BigQuery project. All datasets must be in the same location.

#### `BIGQUERY_CREDENTIALS`
This is a base64 encoded value of your Google Cloud Platform's service account. You can learn more about acquiring service account credentials [here](https://cloud.google.com/docs/authentication/getting-started) and [here](https://console.cloud.google.com/projectselector2/iam-admin/serviceaccounts?supportedpurview=project). The service account must have BigQuery admin permissions.

Once you have downloaded the JSON file, run the following commands and use the output for this environment variable:
```sh
cat /path/to/gcp-service-account-filename.json | base64
```
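The same encoding can be done in Python; the credentials dict below is a hypothetical stand-in for a real service-account file:

```python
import base64
import json

# hypothetical minimal service-account payload, for illustration only
creds = {"type": "service_account", "project_id": "my-project"}

# base64-encode the JSON, as expected by BIGQUERY_CREDENTIALS
encoded = base64.b64encode(json.dumps(creds).encode("utf-8")).decode("ascii")

# the application can later recover the original credentials
decoded = json.loads(base64.b64decode(encoded))
assert decoded == creds
```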

### Error monitoring
Plio supports error monitoring on your app with [Sentry](https://sentry.io/). Visit our guide on [Error Monitoring](./docs/../ERROR-MONITORING.md) to enable it for your Plio setup.

1 change: 0 additions & 1 deletion organizations/middleware.py
@@ -37,7 +37,6 @@ def get_schema(self, request):
tenant = self.get_tenant(request)
if tenant:
return tenant.schema_name
# as get_schema is being used when querying BigQuery datasets, we explicitly need to mention `public`
return "public"

def process_request(self, request):
17 changes: 17 additions & 0 deletions plio/migrations/0029_auto_20220124_2224.py
@@ -0,0 +1,17 @@
# Generated by Django 3.1.1 on 2022-01-24 22:24

from django.db import migrations


class Migration(migrations.Migration):

dependencies = [
("plio", "0028_auto_20210902_1120"),
]

operations = [
migrations.AlterModelOptions(
name="plio",
options={"ordering": ["-updated_at"]},
),
]
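The `AlterModelOptions` above makes `-updated_at` the default ordering, i.e. most recently updated plios come first. A plain-Python sketch of that sort order, using made-up records rather than real model instances:

```python
from datetime import datetime

# hypothetical plio records with an updated_at timestamp
plios = [
    {"name": "older", "updated_at": datetime(2022, 1, 1)},
    {"name": "newer", "updated_at": datetime(2022, 1, 25)},
]

# "-updated_at" means descending order on the updated_at field
ordered = sorted(plios, key=lambda p: p["updated_at"], reverse=True)
print([p["name"] for p in ordered])  # ['newer', 'older']
```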
13 changes: 13 additions & 0 deletions plio/migrations/0030_merge_20220125_0510.py
@@ -0,0 +1,13 @@
# Generated by Django 3.1.1 on 2022-01-25 05:10

from django.db import migrations


class Migration(migrations.Migration):

dependencies = [
("plio", "0029_auto_20220124_2224"),
("plio", "0029_auto_20220110_1044"),
]

operations = []
1 change: 1 addition & 0 deletions plio/models.py
@@ -81,6 +81,7 @@ class Plio(SafeDeleteModel):

class Meta:
db_table = "plio"
ordering = ["-updated_at"]

def __str__(self):
return "%d: %s" % (self.id, self.name)
20 changes: 0 additions & 20 deletions plio/ordering.py
@@ -1,7 +1,4 @@
from rest_framework.filters import OrderingFilter
from django.db.models import OuterRef, Subquery, Count
from entries.models import Session
from django.db.models.functions import Coalesce


class CustomOrderingFilter(OrderingFilter):
@@ -50,23 +47,6 @@ def filter_queryset(self, request, queryset, view):
ordering = self.get_ordering(request, queryset, view)

if ordering:
# if the ordering fields contain "unique_viewers"
if any("unique_viewers" in order_by for order_by in ordering):
# prepare a session queryset which has an annotated field "count_unique_users"
# that holds the count of unique users for every plio in the plio's queryset
plio_session_group = Session.objects.filter(
plio__uuid=OuterRef("uuid")
).values("plio__uuid")

plios_unique_users_count = plio_session_group.annotate(
count_unique_users=Count("user__id", distinct=True)
).values("count_unique_users")

# annotate the plio's queryset with the count of unique users
queryset = queryset.annotate(
unique_viewers=Coalesce(Subquery(plios_unique_users_count), 0)
)

return queryset.order_by(*ordering)

return queryset
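The removed filter annotated each plio with a distinct count of session users; commit `fad262c` moves that annotation onto the queryset directly instead. The quantity being computed can be sketched in plain Python with hypothetical session tuples:

```python
from collections import defaultdict

# hypothetical (plio_uuid, user_id) pairs, one per session
sessions = [("p1", 1), ("p1", 2), ("p1", 1), ("p2", 3)]

def unique_viewers(sessions):
    """Count distinct users per plio, mirroring Count("user__id", distinct=True)."""
    viewers = defaultdict(set)
    for plio_uuid, user_id in sessions:
        viewers[plio_uuid].add(user_id)
    return {uuid: len(users) for uuid, users in viewers.items()}

print(unique_viewers(sessions))  # {'p1': 2, 'p2': 1}
```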