- Clone the Repository:

  ```bash
  git clone https://github.com/k-zehnder/gophersignal.git
  cd gophersignal
  ```
- Configure Environment:

  Copy the example environment file:

  ```bash
  cp .env.example .env
  ```

  Open the `.env` file and set your environment variables:

  ```env
  # Frontend
  NEXT_PUBLIC_ENV=development

  # Backend
  GO_ENV=development
  SERVER_ADDRESS=0.0.0.0:8080

  # MySQL
  MYSQL_HOST=mysql
  MYSQL_PORT=3306
  MYSQL_DATABASE=gophersignal
  MYSQL_USER=user
  MYSQL_PASSWORD=password
  MYSQL_ROOT_PASSWORD=password

  # Ollama Configuration
  OLLAMA_BASE_URL=http://localhost:11434/api/generate
  OLLAMA_MODEL=llama3:instruct
  ```
- Set Up Ollama:

  The project uses Ollama to summarize articles. Follow these steps to set up Ollama:
  - Install Ollama:

    Visit the Ollama installation page and download the installer for your operating system. Alternatively, for macOS users with Homebrew:

    ```bash
    brew install ollama/tap/ollama
    ```
  - Start the Ollama Server:

    Ollama runs as a local server. Start it by running:

    ```bash
    ollama serve
    ```
  - Download the Required Model:

    The default model used is `llama3:instruct`. Pull the model using:

    ```bash
    ollama pull llama3:instruct
    ```

    If you specified a different model under `OLLAMA_MODEL` in your `.env` file, make sure to pull that model instead.

  Note: Ensure that Ollama is running whenever you run the scraper or the application components that require summarization.
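  Before moving on, you can sanity-check that the server is reachable and the model responds. This is a minimal sketch assuming the default `OLLAMA_BASE_URL` and `OLLAMA_MODEL` values from `.env`:

  ```bash
  # Ask the local Ollama server for a short completion with the configured model.
  # Assumes the defaults from .env: http://localhost:11434/api/generate and llama3:instruct.
  curl http://localhost:11434/api/generate \
    -d '{"model": "llama3:instruct", "prompt": "Say hello in one word.", "stream": false}'
  ```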
- Ensure Docker is Installed and Running:

  Make sure Docker is installed and running on your host machine. You can download Docker Desktop from the Docker website, or install Docker via the command line.

  For Ubuntu:

  ```bash
  sudo apt-get update
  sudo apt-get install -y apt-transport-https ca-certificates curl software-properties-common
  curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
  sudo apt-get update
  sudo apt-get install -y docker-ce
  sudo systemctl status docker
  ```

  For Mac:

  ```bash
  brew install docker
  brew install docker-compose
  ```

  For Windows:

  Download Docker Desktop for Windows from the Docker website and follow the installation instructions provided there.
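  Once Docker is installed, a quick way to confirm the daemon is actually running before continuing:

  ```bash
  # Print client and server versions; the Server section only appears if the daemon is up
  docker version
  # Optional end-to-end check that containers can run
  docker run --rm hello-world
  ```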
- Launch Development Environment with Docker:

  This will build and start all necessary services:

  ```bash
  make dev
  ```
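  To check that the services came up, you can list the running containers. This sketch assumes `make dev` delegates to Docker Compose; if it does not, plain `docker ps` works just as well:

  ```bash
  # List the project's containers with their state and published ports (Docker Compose v2)
  docker compose ps
  # Tail logs from all services if something looks unhealthy
  docker compose logs -f
  ```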
- Populate the Database with Data by Running the Scraper:

  ```bash
  cd hackernews_scraper
  make scrape
  cd ..
  ```

  Your development environment should now be running.
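  If you want to confirm the scraper actually wrote rows, you can query MySQL inside its container. This is a rough sketch only: the Compose service name `mysql` and the table name `articles` are assumptions based on the `.env` values and the project's purpose, so adjust them to match the actual schema:

  ```bash
  # Count scraped rows (service name "mysql" and table "articles" are assumptions)
  docker compose exec mysql \
    mysql -uuser -ppassword gophersignal -e "SELECT COUNT(*) FROM articles;"
  ```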
- Frontend: Visit http://localhost:3000 to view the frontend.
- Swagger UI: Access the API documentation at http://localhost:8080/swagger/index.html
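As a quick smoke test from the command line, you can hit both endpoints listed above; each should typically return an HTTP 200 once the stack is up:

```bash
# Check that the frontend and the Swagger UI respond
curl -I http://localhost:3000
curl -I http://localhost:8080/swagger/index.html
```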