
Docker dev fix: Fixes to commit comments and events #2891

Merged: 38 commits, merged Aug 15, 2024 (showing changes from 26 commits)

Commits (38)
c7d44e5
collection tweak
sgoggins Aug 5, 2024
faa9eb1
update to collection
sgoggins Aug 5, 2024
6d4207e
updated docker schema
sgoggins Aug 5, 2024
ca7aeef
update docker config
sgoggins Aug 5, 2024
22c6848
docker updates
sgoggins Aug 5, 2024
5a14c42
got GeckoDriver running in Docker
sgoggins Aug 6, 2024
cd17dd3
path update to docker file
sgoggins Aug 6, 2024
24574eb
dockerfile update
sgoggins Aug 6, 2024
1860246
still fiddling with chromedriver or geckodriver in docker
sgoggins Aug 6, 2024
2e0ab6d
update with new webdriver approach
sgoggins Aug 6, 2024
4cf43c5
still sorting out image generation
sgoggins Aug 6, 2024
8972c72
Handle url not found in contributor breadth worker
ABrain7710 Aug 7, 2024
258bda2
Merge pull request #2883 from chaoss/dev
sgoggins Aug 8, 2024
f39b5c9
Merge pull request #2884 from chaoss/dev-fixes
sgoggins Aug 8, 2024
9eb4b61
docker fun
sgoggins Aug 8, 2024
c066c92
failed docker config
sgoggins Aug 8, 2024
508091d
saving failed config
sgoggins Aug 8, 2024
9b66215
Docker compose script now working
sgoggins Aug 8, 2024
c4c4344
removed old file
sgoggins Aug 8, 2024
86b96bd
debugging a couple errors. One with a missing import, the other tryin…
sgoggins Aug 10, 2024
fe5f4b5
possible fix for commit messages
sgoggins Aug 11, 2024
1d1b53e
changed page_count > 300 to a warning instead of an exception
sgoggins Aug 11, 2024
76391ce
updated unterminated string error
sgoggins Aug 11, 2024
39b5624
Fixing NPM
sgoggins Aug 12, 2024
f1e5b96
events page count seems to top out at 1,000, not 300
sgoggins Aug 12, 2024
7a2b134
update to events to use correct column name
sgoggins Aug 12, 2024
5d93762
update to augur event collection
sgoggins Aug 12, 2024
a49870c
logging event so we know what's supposed to be there
sgoggins Aug 12, 2024
39b8efb
Looking at event logs, it appears that issue is not always contained …
sgoggins Aug 14, 2024
d63b13c
fixing pickling error caused by not consistently importing our Metada…
sgoggins Aug 14, 2024
0aa2662
additional update for when issue is not an object in an event respons…
sgoggins Aug 14, 2024
9601c73
Trying to fix issue when the committer count API returns no value on …
sgoggins Aug 14, 2024
67c13e3
exception handling
sgoggins Aug 14, 2024
e63aa48
trying to fix the empty committer count issue
sgoggins Aug 14, 2024
e879b26
added some comments explaining the query_committers_count method changes
sgoggins Aug 14, 2024
b5af4ac
Making repo GONE log message more clear.
sgoggins Aug 14, 2024
9aee7ab
Working to get more information on this error:
sgoggins Aug 14, 2024
eb4fa84
formatting exception for OpenSSF Scorecard pickling error more typically
sgoggins Aug 14, 2024
7 changes: 7 additions & 0 deletions augur/api/routes/pull_request_reports.py
@@ -21,6 +21,12 @@
from bokeh.models.glyphs import Rect
from bokeh.transform import dodge, factor_cmap, transform

# from selenium.webdriver import Firefox, FirefoxOptions
# options = FirefoxOptions()
# options.headless = True
# webdriver = Firefox(options=options)
#export_png(item, path, webdriver=webdriver)

warnings.filterwarnings('ignore')

from augur.api.routes import AUGUR_API_VERSION
@@ -604,6 +610,7 @@ def average_commits_per_PR():
# opts = FirefoxOptions()
# opts.add_argument("--headless")
# driver = webdriver.Firefox(firefox_options=opts)
# filename = export_png(grid, timeout=180, webdriver=webdriver)
filename = export_png(grid, timeout=180)

return send_file(filename)
2 changes: 1 addition & 1 deletion augur/application/cli/backend.py
@@ -99,7 +99,7 @@
logger.info(f'Augur is running at: {"http" if development else "https"}://{host}:{port}')
logger.info(f"The API is available at '{api_response.json()['route']}'")

processes = start_celery_worker_processes(float(worker_vmem_cap), disable_collection)

[pylint] W0621: augur/application/cli/backend.py:102:4: Redefining name 'processes' from outer scope (line 386) (redefined-outer-name)

if os.path.exists("celerybeat-schedule.db"):
logger.info("Deleting old task schedule")
@@ -185,7 +185,7 @@
sleep_time += 6

#60% of estimate, Maximum value of 45 : Reduced because it can be lower
core_num_processes = determine_worker_processes(.40, 50)
core_num_processes = determine_worker_processes(.40, 90)
logger.info(f"Starting core worker processes with concurrency={core_num_processes}")
core_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency={core_num_processes} -n core:{uuid.uuid4().hex}@%h"
process_list.append(subprocess.Popen(core_worker.split(" ")))
@@ -220,7 +220,7 @@
"""
Sends SIGTERM to all Augur server & worker processes
"""
logger = logging.getLogger("augur.cli")

[pylint] W0621: augur/application/cli/backend.py:223:4: Redefining name 'logger' from outer scope (line 31) (redefined-outer-name)

augur_stop(signal.SIGTERM, logger, ctx.obj.engine)

@@ -233,11 +233,11 @@
"""
Sends SIGKILL to all Augur server & worker processes
"""
logger = logging.getLogger("augur.cli")

[pylint] W0621: augur/application/cli/backend.py:236:4: Redefining name 'logger' from outer scope (line 31) (redefined-outer-name)
augur_stop(signal.SIGKILL, logger, ctx.obj.engine)


def augur_stop(signal, logger, engine):

[pylint] W0621: augur/application/cli/backend.py:240:15: Redefining name 'signal' from outer scope (line 12) (redefined-outer-name)
[pylint] W0621: augur/application/cli/backend.py:240:23: Redefining name 'logger' from outer scope (line 31) (redefined-outer-name)
"""
Stops augur with the given signal,
and cleans up collection if it was running
@@ -253,7 +253,7 @@
cleanup_after_collection_halt(logger, engine)


def cleanup_after_collection_halt(logger, engine):

[pylint] W0621: augur/application/cli/backend.py:256:34: Redefining name 'logger' from outer scope (line 31) (redefined-outer-name)
clear_redis_caches()

connection_string = get_value("RabbitMQ", "connection_string")
@@ -402,7 +402,7 @@
pass
return augur_processes

def _broadcast_signal_to_processes(processes, broadcast_signal=signal.SIGTERM, given_logger=None):

[pylint] W0621: augur/application/cli/backend.py:405:35: Redefining name 'processes' from outer scope (line 386) (redefined-outer-name)
if given_logger is None:
_logger = logger
else:
2 changes: 1 addition & 1 deletion augur/application/cli/collection.py
@@ -56,7 +56,7 @@

worker_vmem_cap = get_value("Celery", 'worker_process_vmem_cap')

processes = start_celery_collection_processes(float(worker_vmem_cap))

[pylint] W0621: augur/application/cli/collection.py:59:4: Redefining name 'processes' from outer scope (line 201) (redefined-outer-name)

if os.path.exists("celerybeat-schedule.db"):
logger.info("Deleting old task schedule")
@@ -125,7 +125,7 @@
sleep_time += 6

#60% of estimate, Maximum value of 45: Reduced because not needed
core_num_processes = determine_worker_processes(.40, 50)
core_num_processes = determine_worker_processes(.40, 90)
logger.info(f"Starting core worker processes with concurrency={core_num_processes}")
core_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency={core_num_processes} -n core:{uuid.uuid4().hex}@%h"
process_list.append(subprocess.Popen(core_worker.split(" ")))
@@ -158,7 +158,7 @@
"""
Sends SIGTERM to all Augur server & worker processes
"""
logger = logging.getLogger("augur.cli")

[pylint] W0621: augur/application/cli/collection.py:161:4: Redefining name 'logger' from outer scope (line 28) (redefined-outer-name)

augur_stop(signal.SIGTERM, logger, ctx.obj.engine)

@@ -169,7 +169,7 @@
"""
Sends SIGKILL to all Augur server & worker processes
"""
logger = logging.getLogger("augur.cli")

[pylint] W0621: augur/application/cli/collection.py:172:4: Redefining name 'logger' from outer scope (line 28) (redefined-outer-name)
augur_stop(signal.SIGKILL, logger, ctx.obj.engine)

@cli.command('repo-reset')
4 changes: 2 additions & 2 deletions augur/application/cli/tasks.py
@@ -36,8 +36,8 @@ def start():
secondary_worker_process = None

scheduling_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency=1 -n scheduling:{uuid.uuid4().hex}@%h -Q scheduling"
core_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency=50 -n core:{uuid.uuid4().hex}@%h"
secondary_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency=50 -n secondary:{uuid.uuid4().hex}@%h -Q secondary"
core_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency=90 -n core:{uuid.uuid4().hex}@%h"
secondary_worker = f"celery -A augur.tasks.init.celery_app.celery_app worker -l info --concurrency=20 -n secondary:{uuid.uuid4().hex}@%h -Q secondary"

scheduling_worker_process = subprocess.Popen(scheduling_worker.split(" "))
core_worker_process = subprocess.Popen(core_worker.split(" "))
@@ -4,7 +4,7 @@
from datetime import datetime

from augur.tasks.init.celery_app import celery_app as celery
from augur.tasks.github.util.github_data_access import GithubDataAccess
from augur.tasks.github.util.github_data_access import GithubDataAccess, UrlNotFoundException
from augur.application.db.models import ContributorRepo
from augur.application.db.lib import bulk_insert_dicts
from augur.tasks.github.util.github_random_key_auth import GithubRandomKeyAuth
@@ -100,17 +100,22 @@ def contributor_breadth_model(self) -> None:


cntrb_events = []
for event in github_data_access.paginate_resource(repo_cntrb_url):
try:
for event in github_data_access.paginate_resource(repo_cntrb_url):

cntrb_events.append(event)
cntrb_events.append(event)

event_age = datetime.strptime(event["created_at"], "%Y-%m-%dT%H:%M:%SZ")
if event_age < newest_event_in_db:
logger.info("Found cntrb events we already have...skipping the rest")
break
event_age = datetime.strptime(event["created_at"], "%Y-%m-%dT%H:%M:%SZ")
if event_age < newest_event_in_db:
logger.info("Found cntrb events we already have...skipping the rest")
break

if len(cntrb_events) == 0:
logger.info("There are no cntrb events, or new events for this user.\n")
if len(cntrb_events) == 0:
logger.info("There are no cntrb events, or new events for this user.\n")
continue

except UrlNotFoundException as e:
logger.warning(e)
continue

events = process_contributor_events(cntrb, cntrb_events, logger, tool_source, tool_version, data_source)
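The hunk above wraps the pagination loop so that a contributor whose events URL 404s is logged and skipped instead of aborting the whole collection run. A minimal sketch of that pattern — the exception name comes from the diff, but the paginator here is a stand-in for `GithubDataAccess.paginate_resource`, not the real client:

```python
class UrlNotFoundException(Exception):
    """Stand-in for the exception exported by github_data_access."""

def paginate_resource(url):
    # Stand-in generator: raises when the underlying request 404s.
    if url.endswith("/gone"):
        raise UrlNotFoundException(f"HTTP 404 for {url}")
    yield {"created_at": "2024-08-01T00:00:00Z"}

cntrb_events = []
for url in ["https://api.github.com/users/alice/events",
            "https://api.github.com/users/bob/gone"]:
    try:
        for event in paginate_resource(url):
            cntrb_events.append(event)
    except UrlNotFoundException as e:
        # Log and move on to the next contributor instead of crashing.
        print(f"skipping: {e}")
        continue

print(len(cntrb_events))  # 1
```

The `continue` is what turns a single deleted account into a skipped iteration rather than a failed task.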
@@ -1,6 +1,6 @@
import requests
import logging
import traceback
import logging
import traceback

[pylint] W0611: Unused import traceback (unused-import)


logger = logging.getLogger(__name__)

@@ -9,87 +9,81 @@ def get_NPM_data(package):
r = requests.get(url)
if r.status_code < 400:
return r.json()
logger.warning(f"Failed to fetch data for package {package}. HTTP Status: {r.status_code}")
return {}


def clean_version(version):
version = [v for v in version if v.isdigit() or v == '.']
return ''.join(version)

def split_version(version):
#Split version string into list separated by .
#assign elements of list to respective variables.
version_list = list(version.split('.'))
patch = version_list.pop(-1)
minor = version_list.pop(-1)
major = version_list[0]

return major,minor,patch


return major, minor, patch

def get_latest_patch(version, data):
if 'versions' not in data:
logger.error(f"'versions' key not found in the NPM data for version {version}. Data: {data}")
raise KeyError("'versions' key not found")

versions = data['versions']
try:
index = list(versions.keys()).index(version)
except ValueError as e:
logger.error(f"Version {version} not found in the 'versions' list. Error: {e}")
raise e

major,minor,patch = split_version(version)
major, minor, patch = split_version(version)
consider_version = version
for v in list(versions.keys())[index:]:
if v.split('.')[0]==major:
if v.split('.')[1]== minor:
if v.split('.')[2]>patch:
if v.split('.')[0] == major:
if v.split('.')[1] == minor:
if v.split('.')[2] > patch:
consider_version = v
return consider_version


def get_lastest_minor(version, data):
try:
versions = data['versions']
except Exception as e:
logger.info(
''.join(traceback.format_exception(None, e, e.__traceback__)))
# raise e

if 'versions' not in data:
logger.error(f"'versions' key not found in the NPM data. Data: {data}")
raise KeyError("'versions' key not found")

versions = data['versions']
try:
index = list(versions.keys()).index(version)
except ValueError as e:
logger.info(f'error is {e} on the NPM. Some kind of value error. Probably a VALUES error for Node, #AmIRight?')
logger.info(f"Version {version} not found in the 'versions' list. Error: {e}")
raise e

major,minor,patch = split_version(version)

major, minor, patch = split_version(version)
consider_version = get_latest_patch(version, data)
for v in list(versions.keys())[index:]:
if v.split('.')[0]==major:
if v.split('.')[1]>minor:
consider_version = v
return consider_version

if v.split('.')[0] == major:
if v.split('.')[1] > minor:
consider_version = v
return consider_version

def get_npm_release_date(data, version):
release_time = data['time'][version]
release_time = data['time'].get(version)
if release_time:
return release_time
logger.warning(f"Release time not found for version {version}")
return None


def get_npm_latest_version(data):
return data['dist-tags']['latest']
return data['dist-tags'].get('latest', 'unknown')

#add code here
def get_npm_current_version(data, requirement):
if requirement[0]=='~':
if requirement[0] == '~':
try:
return get_latest_patch(clean_version(requirement), data)
except ValueError:
return None
elif requirement[0]=='^':
elif requirement[0] == '^':
try:
return get_lastest_minor(clean_version(requirement), data)
except ValueError:
return None
else:
return requirement
return requirement
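The `~`/`^` handling in `get_npm_current_version` above amounts to: `~` accepts newer patch releases within the same major.minor, `^` accepts newer minor and patch releases within the same major, and anything else is treated as an exact pin. A self-contained sketch of that rule, assuming a plain list of published versions — the helper names here are illustrative, not the module's own:

```python
def clean_version(requirement):
    # Strip range operators, keeping only digits and dots: "^1.2.3" -> "1.2.3"
    return ''.join(c for c in requirement if c.isdigit() or c == '.')

def resolve(requirement, available):
    base = clean_version(requirement)
    major, minor, _patch = base.split('.')
    if requirement.startswith('~'):
        # '~' allows newer patch versions within the same major.minor
        candidates = [v for v in available if v.split('.')[:2] == [major, minor]]
    elif requirement.startswith('^'):
        # '^' allows newer minor/patch versions within the same major
        candidates = [v for v in available if v.split('.')[0] == major]
    else:
        return base  # exact pin: use as-is
    return max(candidates, key=lambda v: tuple(map(int, v.split('.'))))

versions = ["1.2.3", "1.2.9", "1.4.0", "2.0.0"]
print(resolve("~1.2.3", versions))  # 1.2.9
print(resolve("^1.2.3", versions))  # 1.4.0
print(resolve("2.0.0", versions))   # 2.0.0
```

Note the module's real code also has to cope with the registry's `versions` dict ordering and missing keys, which is what the added `KeyError`/`ValueError` handling above addresses.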
@@ -179,10 +179,13 @@ def generate_commit_record(repos_id,commit,filename,
#db_local.commit()
execute_sql(store_working_commit)

# commit_message = check_output(
# f"git --git-dir {repo_loc} log --format=%B -n 1 {commit}".split()
# ).strip()

commit_message = check_output(
f"git --git-dir {repo_loc} log --format=%B -n 1 {commit}".split()
).strip()

).decode('utf-8').strip()

msg_record = {
'repo_id' : repo_id,
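The fix above matters because `subprocess.check_output` returns bytes, so the old `.strip()` put a bytes value into `msg_record`; the added `.decode('utf-8')` stores a proper str. Illustrated with a harmless `echo` in place of the real `git log` call:

```python
import subprocess

# check_output returns bytes, which is what the old code stored (after .strip()).
raw = subprocess.check_output(["echo", "fix commit parsing"])
assert isinstance(raw, bytes)

# The fix: decode to str before building the record.
commit_message = raw.decode('utf-8').strip()
print(commit_message)  # fix commit parsing
```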
1 change: 1 addition & 0 deletions augur/tasks/github/contributors.py
@@ -1,5 +1,6 @@
import time
import logging
import traceback

from augur.tasks.init.celery_app import celery_app as celery
from augur.tasks.init.celery_app import AugurCoreRepoCollectionTask
8 changes: 4 additions & 4 deletions augur/tasks/github/events.py
@@ -56,10 +56,10 @@ def bulk_events_collection_endpoint_contains_all_data(key_auth, logger, owner, r

page_count = github_data_access.get_resource_page_count(url)

if page_count > 300:
raise Exception(f"Either github raised the paginator page limit for things like events and messages, or is_pagination_limited_by_max_github_pages is being used on a resource that does not have a page limit. Url: {url}")
if page_count > 1000:
raise Warning(f"Page Count is {page_count}. Either github raised the paginator page limit for things like events and messages, or is_pagination_limited_by_max_github_pages is being used on a resource that does not have a page limit. Url: {url}")

return page_count != 300
return page_count != 1000


def bulk_collect_pr_and_issue_events(repo_git: str, logger, key_auth):
@@ -89,7 +89,7 @@ def collect_pr_and_issues_events_by_number(repo_id, repo_git: str, logger, key_a
query = text(f"""
(select pr_src_number as number from pull_requests WHERE repo_id={repo_id} order by pr_created_at desc)
UNION
(select gh_issues_number as number from issues WHERE repo_id={repo_id} order by created_at desc);
(select gh_issue_number as number from issues WHERE repo_id={repo_id} order by created_at desc);
""")

result = connection.execute(query).fetchall()
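Per the commit messages, GitHub's paginator for events tops out at 1,000 pages, not 300, and exceeding the cap is no longer treated as fatal (the diff raises a `Warning` instance; a logged warning is sketched here instead, as a non-fatal alternative). The guard's return value flags whether the endpoint can still be holding all the data:

```python
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

GITHUB_MAX_PAGES = 1000  # observed paginator cap for event endpoints

def endpoint_contains_all_data(page_count):
    # Above the cap something unexpected happened: warn, but keep collecting.
    if page_count > GITHUB_MAX_PAGES:
        logger.warning("Page count is %s; exceeds the expected paginator cap",
                       page_count)
    # Exactly at the cap means GitHub likely truncated the data.
    return page_count != GITHUB_MAX_PAGES

print(endpoint_contains_all_data(42))    # True
print(endpoint_contains_all_data(1000))  # False
```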
2 changes: 1 addition & 1 deletion docker-compose.yml
@@ -2,7 +2,7 @@
version: '3'
services:
augur-db:
image: postgres:14
image: postgres:16
restart: unless-stopped
environment:
- "POSTGRES_DB=augur"
66 changes: 58 additions & 8 deletions docker/backend/Dockerfile
@@ -1,14 +1,15 @@
#SPDX-License-Identifier: MIT
FROM python:3.10-bookworm
# SPDX-License-Identifier: MIT
FROM python:3.12-bookworm

LABEL maintainer="outdoors@acm.org"
LABEL version="0.76.1"

ENV DEBIAN_FRONTEND=noninteractive
ENV PATH="/usr/bin/:/usr/local/bin:/usr/lib:${PATH}"

RUN set -x \
&& apt-get update \
&& apt-get -y install --no-install-recommends \
&& apt-get -y install \
git \
bash \
curl \
@@ -18,9 +19,55 @@ RUN set -x \
musl-dev \
python3-dev \
python3-distutils \
python3-venv \
wget \
postgresql-client \
&& rm -rf /var/lib/apt/lists/*
libpq-dev \
build-essential \
rustc \
cargo \
chromium \
chromium-driver \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y

# Install Firefox from Debian repositories for ARM64 architecture
RUN set -x \
&& apt-get update \
&& apt-get install -y firefox-esr

# Install Geckodriver
RUN GECKODRIVER_VERSION=$(curl -s https://api.github.com/repos/mozilla/geckodriver/releases/latest | grep 'tag_name' | cut -d\" -f4 | sed 's/v//') \
&& ARCH=$(uname -m) \
&& if [ "$ARCH" = "aarch64" ]; then \
GECKODRIVER_URL="https://github.com/mozilla/geckodriver/releases/download/v${GECKODRIVER_VERSION}/geckodriver-v${GECKODRIVER_VERSION}-linux-aarch64.tar.gz"; \
GECKODRIVER_FILE="geckodriver-v${GECKODRIVER_VERSION}-linux-aarch64.tar.gz"; \
else \
GECKODRIVER_URL="https://github.com/mozilla/geckodriver/releases/download/v${GECKODRIVER_VERSION}/geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz"; \
GECKODRIVER_FILE="geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz"; \
fi \
&& wget $GECKODRIVER_URL \
&& tar -xzf $GECKODRIVER_FILE \
&& mv geckodriver /usr/local/bin/ \
&& rm $GECKODRIVER_FILE

# Verify installations
RUN firefox --version
RUN geckodriver --version

# Ensure Rust directories are writable
RUN mkdir -p /root/.rustup/downloads /root/.cargo/registry && \
chmod -R 777 /root/.rustup /root/.cargo

# Add rust and cargo to PATH
ENV PATH="/root/.cargo/bin:${PATH}"

# Install the specific version of Rust
RUN set -x \
&& rustup install 1.78.0
RUN set -x \
&& rustup default 1.78.0

EXPOSE 5000

@@ -32,7 +79,9 @@ COPY ./metadata.py .
COPY ./setup.py .
COPY ./scripts/ scripts/

#COPY ./docker/backend/docker.config.json .
# Add rust and cargo to PATH
ENV PATH="/usr/bin/:/root/.cargo/bin:/usr/local/bin:${PATH}"

RUN python3 -m venv /opt/venv

RUN set -x \
@@ -43,7 +92,7 @@ RUN set -x \

RUN set -x \
&& /opt/venv/bin/pip install .

RUN set -x \
&& /opt/venv/bin/pip install --upgrade pip \
&& /opt/venv/bin/pip install wheel \
@@ -59,5 +108,6 @@ RUN mkdir -p repos/ logs/ /augur/facade/
COPY ./docker/backend/entrypoint.sh /
COPY ./docker/backend/init.sh /
RUN chmod +x /entrypoint.sh /init.sh
ENTRYPOINT ["/entrypoint.sh"]
CMD /init.sh
ENTRYPOINT ["/bin/bash", "/entrypoint.sh"]
#ENTRYPOINT ["/entrypoint.sh"]
CMD /init.sh
7 changes: 7 additions & 0 deletions docker/backend/init.sh
@@ -19,4 +19,11 @@ if [[ -f /repos.csv ]]; then
augur db add-repos /repos.csv
fi

if [[ -d /augur/logs ]]; then
echo "The directory exists" > /augur/logs/log.holder

fi

echo $PATH

exec augur backend start
2 changes: 1 addition & 1 deletion docker/database/Dockerfile
@@ -1,5 +1,5 @@
#SPDX-License-Identifier: MIT
FROM postgres:14
FROM postgres:16

LABEL maintainer="outdoors@acm.org"
LABEL version="0.76.1"