
How to run React app inside Docker with env vars #982

Closed
furlanrapha opened this issue Oct 29, 2016 · 46 comments

Comments

@furlanrapha

furlanrapha commented Oct 29, 2016

Already set the env vars in my Docker on AWS, but the app is not using it when running.

Does anyone know how I can do it?

For development, I'm using the .env file. For the build, I'm putting the .env file in .dockerignore so this development file doesn't go into the build version. My intention is to use the default Env Vars configuration in AWS ECS.

I'm not using docker-compose. I'm just using the Dockerfile.

@jihchi
Contributor

jihchi commented Oct 30, 2016

According to Docker's run document, you could use -e flags to set any environment variable in the container.

For example:

docker run  \
  -d \
  -e "NODE_ENV=production" \
  -e "REACT_APP_APIKEY=foObArBAz" \
  your-image-name

Then, you could get the value from process.env in your JS code:

console.log(process.env.REACT_APP_APIKEY)
// foObArBAz

@furlanrapha
Author

Sorry @jihchi, I will give more context here:

I'm trying to run npm run build and use that build for both the Staging and Production environments.

Since I have two environments, I was trying to use AWS ECS env vars (defined inside the Task Definition) to set the environment variables. The catch is that when I run npm run build, it bakes the local env vars into the build output. For development I use the .env file, but I don't want it included when creating the Docker image, so I created a .dockerignore file to ignore it. That's all working fine: the build is generated, and I can see the env var names such as REACT_APP_APIKEY (following your example) in the minified JS. But when I set the env vars at runtime, the app doesn't pick up my configuration.

My final question is: when I generate the build version (with npm run build), can I still supply custom configuration to that build? Or am I missing something here? If I can't have a build version with custom configuration, do I have to run the "development" version in order to set it?

I will attach here my Dockerfile to give a better view of the situation.

@furlanrapha
Author

Dockerfile

FROM node:6.3.1

# Create app directory
RUN mkdir -p /src/app
WORKDIR /src/app

# Install app dependencies
COPY package.json /src/app/
RUN npm install

# Bundle app source
COPY . /src/app

# Build and optimize react app
RUN npm run build

EXPOSE 3000

# defined in package.json
CMD [ "npm", "run", "start:server" ]

package.json (the rest was omitted for brevity)

...
"start:server": "http-server -p 3000 ./build",
...

@jihchi
Contributor

jihchi commented Oct 31, 2016

Maybe you could use the ARG directive in your Dockerfile.

For example, add the following to your Dockerfile:

ARG NODE_ENV=staging
ENV NODE_ENV=$NODE_ENV

ARG REACT_APP_APIKEY=foObArBAz
ENV REACT_APP_APIKEY=$REACT_APP_APIKEY

Then, execute docker build with additional --build-arg flags:

(Assume that you have environment variables on the host machine called HOST_NODE_ENV and HOST_REACT_APP_APIKEY.)

docker build \
  --build-arg NODE_ENV=$HOST_NODE_ENV \
  --build-arg REACT_APP_APIKEY=$HOST_REACT_APP_APIKEY \
  .

docker build will pass the host environment variables into your Dockerfile.

@furlanrapha
Author

@jihchi my intention is to use the build version with custom env vars. Looking at the build output, everything is minified, so I can't have a single build whose env vars I customize for Staging and Production.

I think what I want is not supported. I will probably have to run the development server to accomplish this.

@gaearon
Contributor

gaearon commented Nov 20, 2016

I don't know anything about Docker.
Is there anything I could help you with?
Do we need to fix something in Create React App, or is it just a usage question?

@furlanrapha
Author

So @gaearon, in Docker you generate an image and run it as a container (in my case I'm using AWS ECR). This image is like a build version, but I can use the same image for all my environments. In my case we have Staging and Production. So I can have the same image version of my application (0.1.0, 0.2.0, etc.) and use it across environments.

The point for create-react-app is: if I generate the Docker image using npm run build, it tries to read my env vars, but I don't actually have them at that time (because the env vars live in the AWS ECS Task Definition). What I need is the ability to generate the build version with the minified files, but still be able to set the env vars that build uses at runtime.

I don't know if my explanation is clear, or whether what I want is even possible to achieve, since the build version only contains static files.

@viankakrisna
Contributor

So you build the image on the local machine, then deploy it? Once it is built, you cannot add any env var to it, because CRA builds static files. I think the best way is to serve the HTML with a PHP or Node backend that reads the env vars and prints a global variable.

@lallmon

lallmon commented Nov 23, 2016

@gaearon No, for the most part Docker works fine with create-react-app; however, if anyone has a similar issue to this, they just need to make sure they add env_file: .env to their docker-compose.yml.
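As a minimal sketch (the service name and file layout here are assumptions), that looks like:

```yaml
# docker-compose.yml (fragment); service name "web" is illustrative
services:
  web:
    build: .
    env_file: .env   # loads REACT_APP_* variables into the container
```

Note that env_file belongs at the service level, not nested under environment.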

@haluvibe

haluvibe commented Jan 17, 2017

@gaearon Docker is amazing: I can run create-react-app without installing node on my local machine, hence avoiding node/npm version managers such as nvm. Plus, it just works! I would go so far as to say you should at the very least add a "how to integrate with Docker" section to the README.

If anyone is interested, I got it to work with the following (for all steps below, replace the string boilerplate with your app's name). If there are any improvements I can make, please let me know.

Step 1) After creating your react app with create-react-app, cd into the newly created app directory and run this command:

docker network create boilerplate

Step 2) Add a Dockerfile to the root of your app directory

Dockerfile

FROM node:6.9.4

# Prepare app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app/

# Install dependencies
COPY package.json /usr/src/app/
RUN npm install --silent

ADD . /usr/src/app/

EXPOSE 3000
CMD [ "npm", "start" ]

Step 3) Create a docker-compose.yml in your root directory

docker-compose.yml

version: "2"

services:
  frontend:
    container_name: "boilerplate"
    build: .
    env_file: .env
    environment:
      NODE_ENV: development
    ports:
      - '3000:3000'
    volumes:
      - .:/usr/src/app

networks:
  default:
    external:
      name: boilerplate

You can now access your app as normal at http://localhost:3000

and you can even interact with your new docker container via the following command

docker-compose run --rm boilerplate /bin/bash

Yes, this doesn't follow some of the recommended Node.js Docker best practices, but I only use the above for the dev environment, so I'm unsure whether those recommendations apply here. I'm open to improving this, though.

Obviously you'll need to configure the Docker build differently for production.

@cschroeter

@haluvibe
How can I run create-react-app without having Node.js installed on my local machine?

@lallmon

lallmon commented Feb 2, 2017

@cschroeter

By using Docker as he describes. Docker makes a "container" that holds system-level dependencies and executes the Node runtime inside it.

If you have a dockerized app, you could completely uninstall Node from your system, and still run the app.

@cschroeter

cschroeter commented Feb 3, 2017

@Lucaska
But I still have to have Node.js installed to create the boilerplate? That's a small disadvantage.
Also, if I'm using TypeScript, it looks up the typings files in the corresponding node_modules folder.
So I'm not 100% convinced how I can get rid of Node.js during development :/

@lallmon

lallmon commented Feb 4, 2017

@cschroeter I can't speak for your use cases, as I don't know what boilerplate you are running and I don't use typescript.

However, a container is just that, a system level container. You can SSH into a Docker container and run commands, or set up scripts for container build or what have you.

I don't use Docker to "remove Node completely"; I still keep the current version of Node on my laptop (with all the global CLI tools).

I use Docker to run my apps in an environment as similar as possible to the one I'm deploying to. Mostly Node 6.9.x LTS with npm 3.10.

@baxford

baxford commented Feb 6, 2017

thanks for all the tips on this page; I've been able to get CRA running in Docker.
I found a few other handy things:

  • if you map port 35729 on your container to 35729 on your host, hot reloading works
  • you can watch/run all tests within Docker as well, with a 'test' config in docker-compose.
  • I've also created a multi-purpose Docker container that builds the production version and serves it with nginx. I'm not sure this is necessarily what I'd use in production (the files could just be served statically via S3/CDN), but it lets me quickly look at a production version of my app, or run it in a CI environment.

Here's my setup:
Dockerfile

FROM node:6.3.1

RUN apt-get update && \
    apt-get install -y nginx

WORKDIR /src

COPY . /src

RUN npm install

CMD /bin/bash ./run.sh

run.sh (executed by the docker container)

#!/usr/bin/env bash
set -e
set -x

export NODE_ENV="${NODE_ENV:-development}"

if [ "$NODE_ENV" == "development" ]; then
  # this runs webpack-dev-server with hot reloading
  npm start
else
  # build the app and serve it via nginx in the foreground
  npm run build
  mkdir -p "$ROOT/logs/nginx"
  nginx -g 'daemon off;' -c "$ROOT/src/nginx.conf"
fi

NOTES: If NODE_ENV=development, this runs the webpack dev server; otherwise it builds the app and serves it via nginx. (I got this from a great blog post which I can't for the life of me find anymore.)

nginx.conf

worker_processes 1;

events {
  worker_connections 1024;
}

http {
  access_log /var/log/nginx/access.log;
  error_log  /var/log/nginx/error.log;

  server {
    gzip on;
    listen 8000;
    server_name localhost;
    root /src/build;

    include /etc/nginx/mime.types;

    location /nginx_status {
      stub_status on;
      access_log off;
    }

    location / {
      try_files $uri $uri/ /index.html;
    }
  }
}

docker-compose.yml

version: '2'
services:
  dev:
    build:
      context: .
      dockerfile: Dockerfile
    image: ui-dev
    container_name: webpack-container
    environment:
      - NODE_ENV=development
    ports:
      - "8080:3000"
      - "35729:35729"
    volumes:
      - .:/src
      - /src/node_modules
  test:
    build:
      context: .
      dockerfile: Dockerfile
    image: ui-test
    container_name: webpack-test-container
    environment:
      - NODE_ENV=test
    volumes:
      - .:/src
      - /src/node_modules
    command: npm test
  prod:
    build:
      context: .
      dockerfile: Dockerfile
    image: guest-ui-prod
    container_name: prod-container
    environment:
      - NODE_ENV=production
    ports:
      - "8000:8000"
    volumes:
      - /src/node_modules

For my development with hot reloading, I run one terminal window and execute the following:
docker-compose up -d dev

To watch and run tests, I then open another terminal window and execute:
docker-compose up test
(I don't use the -d flag as I want to see the output)

And for a production sanity check, I can run
docker-compose up prod
NOTE: this serves the react app as it was when the Docker image was created, so changes made since the docker build won't appear. However, if this is used as part of a CI system that checks out the code and builds it, the image will contain the latest code.

On an unrelated note, I also found it useful to set NODE_PATH in my package.json to allow absolute imports from the src directory:

  "scripts": {
    "start": "NODE_PATH=./src/ react-scripts start",
    "build": "NODE_PATH=./src/ react-scripts build",
   ...
  }

@jayhuang75

@furlanrapha Did you get your problem solved? Thanks for any advice.

@furlanrapha
Author

No @jayhuang75, I'm still running npm install and the development server in the Docker image so that the env vars set in the AWS Task Definition are picked up.

@jordan-enev

@haluvibe your Docker configuration looks good. Only creating the additional network seems unnecessary, because by default Docker Compose sets up a network for you:

By default Compose sets up a single network for your app. Each container for a service joins the default network and is both reachable by other containers on that network, and discoverable by them at a hostname identical to the container name.

Also, in order to enable the livereload feature on a Windows host, you have to enable chokidar polling, because inotify does not work on Docker for Windows. Here I shared my experience and how I made it work on Windows.

@daveamit

@furlanrapha I had a very similar issue, and I was able to successfully fix it. I am using webpack to do the build, and I am able to access all environment variables inside my react app.

Can you please tell me how you do your builds? I might be able to help.

@furlanrapha
Author

furlanrapha commented Apr 14, 2017

@daveamit did you eject your app from react-scripts?

Here is the Dockerfile that I use today (we don't use docker-compose):

FROM node:6.9.4

EXPOSE 3000
CMD [ "npm", "run", "start" ]

WORKDIR /src/app

# Install app dependencies
COPY npm-shrinkwrap.json .
COPY package.json .
RUN npm install

COPY public ./public
COPY src ./src

@daveamit

daveamit commented Apr 14, 2017

@furlanrapha This is what I do.

Step 1:
Added DefinePlugin to the webpack config. This tells webpack to inline these node environment variables (process.env.<ENV_NAME>) and map them to the corresponding names (which can then be accessed in the react app).

new webpack.DefinePlugin({
      'process.env': {
        NODE_ENV: JSON.stringify(process.env.NODE_ENV),
        API_ENDPOINT: JSON.stringify(process.env.API_ENDPOINT),
        DEFAULT_TOKEN: JSON.stringify(process.env.DEFAULT_TOKEN),
        SHOW_VERSION_INFO: JSON.stringify(process.env.SHOW_VERSION_INFO),
      },
    }),

Step 2:
This is my build command (in package.json)
"build": "cross-env NODE_ENV=production webpack --config internals/webpack/webpack.prod.babel.js --color -p --progress",

This will produce minified, optimized, chunked output.

Step 3:
"start:production": "npm run test && npm run build && npm run start:prod",
This command runs the tests, then builds, and then starts the production server.

Step 4:
"start:prod": "cross-env NODE_ENV=production node server",

Runs the production server (serving the static content generated in step 2 via npm run build).

Step 5:
My docker file

FROM node

ADD /package.json /tmp/package.json
WORKDIR /tmp
RUN npm install
ADD . /tmp/
EXPOSE 3000

ENTRYPOINT npm run start:production

The magic happens in the "build" command. The webpack configuration from step 1 causes the step 2 build to actually compile the app with the given environment variables.

The trick is that I have put "npm run start:production" in ENTRYPOINT; this enables access to the environment variables passed when spinning up a container from the image.

For testing, if I run the following commands:

docker build -t test .

and then run it using

docker run -p 3000:3000 \
  -e "SHOW_VERSION_INFO=true" \
  -e "API_ENDPOINT=http://localhost:6010" \
  -e "DEFAULT_TOKEN=sometoken" \
  test

We take the build version from package.json, but decide whether to show or hide it depending on an environment variable. The component looks like this:

import React from 'react';
import styled from 'styled-components';
import { version } from '../../../package.json';

const VersionWrapper = styled.span`
  position: fixed;
  top: 5px;
  right: 20px;
  font-size: 9px;
  color: black;
  line-height: 1;
  display: ${process.env.SHOW_VERSION_INFO ? 'block' : 'none'};
`;

// Find a suitable way to show build numbers only during dev/qa/int.
// UPDATE: Solution incorporated at VersionWrapper (observe display prop)
const BuildInfo = () => <VersionWrapper> {version} </VersionWrapper>;

BuildInfo.propTypes = {

};

export default BuildInfo;

Hope this helps.

@stelioschar

@baxford I liked your solution very much 👍 The problem was that my files weren't being watched.

This guy gave the solution, by setting

CHOKIDAR_USEPOLLING=true

It is also mentioned in the official documentation:

If the project runs inside a virtual machine such as (a Vagrant provisioned) VirtualBox, create an .env file in your project directory if it doesn’t exist, and add CHOKIDAR_USEPOLLING=true to it. This ensures that the next time you run npm start, the watcher uses the polling mode, as necessary inside a VM.

@okbrown

okbrown commented Apr 25, 2017

@daveamit in order to do this, you ejected your CRA, right? Because we normally have no access to webpack.config.js. Or has this changed?

@rmoorman

@furlanrapha in order to make environment variables for your container available to your already-built-for-production react app at run time, do any of the approaches outlined in #578 work for you? Basically, this comes down to generating an env.js file (either during container startup within the entrypoint, or dynamically through an addition to the (Node.js or other) server you might have) and referencing it within the index.html of CRA.

@james-tiqk

@furlanrapha I have the exact same need and am wondering what your final approach was, without ejecting CRA.

@rmoorman I also tried the approaches you suggested, but they're not working properly; maybe I've done something wrong.

@rmoorman

rmoorman commented Jul 3, 2017

@james-tiqk would it be possible for you to share what you have got right now?

@james-tiqk

@rmoorman Hi mate, thanks for the reply. I have the env variables set in Beanstalk, and in the project Dockerfile I use npm run build to create the prod bundle. In the source code I reference process.env.REACT_APP_*, but it seems it's not working properly.

I'm not using docker-compose, and I'd prefer not to eject react-scripts.

@rmoorman

rmoorman commented Jul 3, 2017

@james-tiqk do you run npm run build as part of your Docker ENTRYPOINT/CMD script, or within a RUN directive? The environment variables need to be present when the build command runs, and if you build the react app within RUN, the build actually happens when the image is built, not while the container is running. That could be why the environment variables set for your container in Beanstalk are not being picked up: they only affect the environment that the ENTRYPOINT or CMD script receives (not knowing your precise setup, though).

That's also what the discussion in #578 is mainly about. Usually, you actually don't want to run npm run build within the container entry point but rather within a RUN directive (so it can be cached and does not have to be executed every time a container is launched).
So how do you get the variables in there when npm run build was already executed? That's where the public/env.js file comes in. When you add a <script src="%PUBLIC_URL%/env.js"></script> to your public/index.html and add a public/env.js file (with some defaults for development), you could then: a) within the docker entrypoint script, write the environment variables (in your case from Beanstalk) to the env.js file, or b) in case you have a customizable server in your container (nodejs/python/whatever) serving your files, add a route for /env.js and render the javascript with the correct environment variables on the fly.
(By the way, an env.js file is chosen instead of an env.json within the linked thread because you can reference it within the script tag and it will be loaded along with the app.js right from the start, so you won't have to fetch it manually and don't have to account for the additional loading time in the UX; the variables are then just there on the window global.)

@james-tiqk

@rmoorman thanks mate for the explanation. Now that I know the cause, I got it working by using a shell command executed after the Docker image is built.

@okbrown

okbrown commented Jul 4, 2017

@james-tiqk, @rmoorman is right. My environment is AWS, using Jenkins to build and deploy: npm run build, then copy .env and the Dockerfile to the build dir, then run docker build -t. In the Dockerfile I do the usual updates, then install serve, expose the port, and set a CMD which runs a shell script that replaces (using sed commands) the env vars in the `.env` file from a docker-compose file.

It's long-winded, but it fits an existing deployment structure I'm used to.

Ultimately, it's the CMD ["bash"] that runs the command to insert the environment variables.

@Timer Timer closed this as completed Aug 2, 2017
@przbadu

przbadu commented Aug 24, 2017

Because we are using Docker, we shouldn't have to install node, npm, or create-react-app
on our development machine, not even to generate the create-react-app scaffold.

For this purpose I am using 2-step docker configuration:

  • In the first step, we create a simple Docker image that does only one thing: install create-react-app,
    so we can use it to generate the create-react-app scaffold.
  • The second Docker configuration is project specific. It handles nodemon, npm, development servers, etc.

Build docker image for create-react-app

Because we are not installing create-react-app locally on our development machine, let's build a new Docker image for it.

Dockerfile

FROM node:8.2.1-alpine
RUN npm install -g create-react-app \
                   create-react-native-app \
                   react-native-cli
RUN mkdir /app
WORKDIR /app
ADD . /app

My failed attempt to solve this problem without docker-compose

NOTE: I'd still like to fix this; please help if anyone knows the solution.

  1. Build the Docker image that handles just the react CLI part:
docker build . -t react-cli
  2. Once that is done, generate a new create-react-app scaffold using this image:
docker run react-cli create-react-app myApp

EXPECTED

  • it should generate react scaffold app myApp in my current directory.

RESULT

Got nothing 😄. It looks like it generates the app inside the Docker container.

Working Solution: using docker-compose

Because I was unable to solve it using just the docker command, I am solving it with docker-compose.

docker-compose.yml

version: '3'
services:
  web:
    build: .
    image: react-cli
    container_name: react-cli
    volumes:
      - .:/app

Usage

Now using docker-compose we can generate our react application:

docker-compose run web create-react-app myApp

Everything should work this time, and you should get the generated react application inside the myApp directory.

If anyone knows how to use the react-cli image to generate a create-react-app scaffold without using a docker-compose.yml file, please let me know.

Thanks

@Chaoste

Chaoste commented Jan 15, 2018

@przbadu Did you start the container with a volume, as you did in the docker-compose config?

@droarty

droarty commented Jan 31, 2018

After reading this whole thread: for me, the missing env variables were solved by the fifth comment. I simply needed to add the ARG and ENV directives to my Dockerfile so that my build process had access to certain env variables.

@mikesparr

I noticed a lot of people struggle with this, so I've provided a GitHub repo tutorial.

Background is in a public article on LinkedIn if you care to read.

@simbo1905

simbo1905 commented Apr 23, 2018

A lot of people on this thread didn't seem to read the actual description of the problem. It wasn't how to use Docker, but how to have a single Docker image and have the browser use an API_URL that is set by environment variables on the server.

I wrote up the question on stackoverflow.com at https://stackoverflow.com/q/49975735/329496

A solution is provided over two comments at #578 that I have written up in the answer to the question at https://stackoverflow.com/a/49989975/329496

There is working sample code over at https://github.com/simbo1905/react-redux-realworld-example-app/tree/openshift

Update: If you are using a commercial container orchestrator, you might be charged differently for build memory and runtime memory. This answer ensures that you can keep the runtime image to a minimum, especially if you run "npm prune" to strip your devDependencies at the end of your build. A big react build can be slow. This approach builds once and lets you move that work between environments, which is a big time saver compared with building in each environment.
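Keeping the heavy build toolchain out of the runtime image, as described above, can be sketched with a Docker multi-stage build (image tags and paths below are assumptions, not the linked repo's exact setup; runtime configuration would still be injected via the env.js approach from #578):

```dockerfile
# Build stage: needs the full Node toolchain and the build-time memory.
FROM node:8 AS build
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
RUN npm run build

# Runtime stage: only the static files and a tiny web server.
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
```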

@mikesparr

I created a tutorial on LinkedIn and also a companion repo: https://github.com/mikesparr/tutorial-react-docker

What you need to do is build your app during the CMD (run) phase of the Dockerfile, not before, otherwise the env params will be missing. I add a bash script and call it from the Dockerfile, and it works, but note you still need to name your env vars with the REACT_APP_ prefix.
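A minimal sketch of that pattern (the base image, port, and use of serve are assumptions, not the exact tutorial repo):

```dockerfile
FROM node:8-alpine
WORKDIR /app
COPY package.json .
RUN npm install && npm install -g serve
COPY . .
EXPOSE 5000
# The build runs at container *start*, so REACT_APP_* variables set on the
# container (ECS task definition, docker run -e, ...) are visible when
# react-scripts bakes them into the bundle.
CMD npm run build && serve -s build
```

The trade-off, discussed below, is that every container start pays the full build cost in time and memory.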

@geerlingguy

I have the same issue, and I agree with @simbo1905 that half the commenters in this thread didn't really 'get' the issue being discussed (I also don't know why it's closed, as there's no clean/official way to do this).

Basically:

  • I want to npm build the production artifact for my React app.
  • I want to put that artifact into a container image tagged my/react-app:1.0.0 (e.g. inside an Nginx container image)
  • I want to deploy that container image (my/react-app:1.0.0) to the following environments:
    1. Stage
    2. Prod
  • I have a variable like backendUrl, and it needs to be different per environment:
    1. In stage: http://stage.mybackend.com/
    2. In prod: https://www.mybackend.com/

And if I'm using a container orchestration tool like Kubernetes, ECS, Fargate, Mesos, Rancher, etc., then for all my other apps, I do something like:

  1. Run my/react-app:1.0.0 on stage with environment variable REACT_APP_BACKEND=http://stage.mybackend.com/
  2. Run my/react-app:1.0.0 on production with environment variable REACT_APP_BACKEND=https://www.mybackend.com/

The problem is that when I run npm build, the value for backendUrl is baked into the compiled JS file for my app, and I can't override it using configuration on my server.

Right now I'm considering doing something like @simbo1905's Stack Overflow answer: render an env.js script from an entrypoint script when starting the container, and modify my React app's index.html to read that JS file via <script src="%PUBLIC_URL%/env.js"></script>.

It would just be nice if there were a cleaner way to do this, or an officially-documented pattern.

When running the node.js development server, it's super easy because I can just refer to process.env.REACT_APP_BACKEND and know that it will pick it up. (And it looks like that's what @furlanrapha had resorted to as of this comment.) But the development server itself has a warning "don't use this in production"... so I want to build the app and use that, but I'm finding it a bit annoying to deploy one Docker image in multiple environments with per-environment settings.

@mikesparr

mikesparr commented Dec 4, 2018 via email


@dceddia
Contributor

dceddia commented Dec 4, 2018

The problem with using one React build in multiple environments is that the environment variables are baked in at build time. Docker or not, those built files are static. Done. Fully-baked. Unless your server can change them somehow.

During development, you get the advantage of a development server, which makes it recompile with every change, and can make it feel like a "live" environment. In production, React has no way of modifying itself based on environment variables since it's just a bunch of static HTML and JS files.

One thing you can key off is the hostname. So, what I've done in the past is to have one file called api-config.js which knows about all my different environments, and sets the API endpoint based on the window.location.hostname at runtime. Then, anything that needs to make an API call can import this file and will know which URL to hit. I wrote up an article with an example of how to set this up and configure API endpoints dynamically. This is the relevant part:

//// api-config.js

let backendHost;
const apiVersion = 'v1';

const hostname = window && window.location && window.location.hostname;

if(hostname === 'realsite.com') {
  backendHost = 'https://api.realsite.com';
} else if(hostname === 'staging.realsite.com') {
  backendHost = 'https://staging.api.realsite.com';
} else if(/^qa/.test(hostname)) {    // starts with "qa"
  backendHost = `https://api.${hostname}`;
} else {
  backendHost = process.env.REACT_APP_BACKEND_HOST || 'http://localhost:8080';
}

export const API_ROOT = `${backendHost}/api/${apiVersion}`;

Then anywhere you need that endpoint, you can import it and make your calls:

import { API_ROOT } from './api-config';

function getUsers() {
  fetch(API_ROOT + "/users").then(...);
}

@mikesparr

mikesparr commented Dec 4, 2018 via email

@geerlingguy

geerlingguy commented Dec 4, 2018

@mikesparr That's just the problem: you're saying that I need a Docker image that has Node.js and all the associated baggage.

If you're supposed to run React apps compiled with npm build, serving just the static compiled files with a webserver like Nginx, then it's crazy to also require a Node.js runtime in the Docker image so I can run npm build when I start my Docker container. Not only that, container startup would take many seconds, if not minutes... and this is completely anathema to 12-factor apps and pretty much every other type of application I build and deploy in Kubernetes and ECS clusters.

I just noticed you're using serve (which seems to be a Node.js-based HTTP server; I haven't personally used it), but you still have to wait at container startup for the entire app to be built. That option could work, but it isn't ideal: it limits you to either a Docker image with Node.js plus a decent webserver (meaning at least double or triple the size of an image like nginx:alpine), or a Docker image with Node.js and serve or some other Node-based HTTP server.

@simbo1905

simbo1905 commented Dec 4, 2018

@mikesparr The problem with building the app at run time is that our app needs 1.2 GB of memory for npm run build --production but only 0.25 GB to run. We are charged per gigabyte by our k8s provider. And since it's a large app, the build takes a long, long time. If that happens at run time, our rolling deployments and crash recovery are slowed down by minutes. Our CI builds can happily supply the 1.2 GB and pay the time cost once to build a release of our app. Out of that we want a Docker image that is super fast to start up and only takes the space needed to serve the static app, since our backend APIs are written in other languages. 12factor.net says to separate "build, release, run", and running npm run build on startup cuts that corner. When the build takes minutes for a large app and you're charged for memory you don't use after the compile phase, it's a real problem.

@dceddia
Contributor

dceddia commented Dec 4, 2018

@mikesparr I don't know much about 12factor. Which part is violated by configuring the endpoint based on hostname? For the most part, all it does is effectively prepend api. to the existing hostname, so I'm interested to learn where that breaks down in practice.

@mikesparr

mikesparr commented Dec 5, 2018 via email

@rmoorman

rmoorman commented Dec 5, 2018 via email

@lock lock bot locked and limited conversation to collaborators Jan 9, 2019