# Pass the env-vars to MYCOMMAND
eval $(egrep -v '^#' .env | xargs) MYCOMMAND

# … or ...

# Export the vars in .env into your shell:
export $(egrep -v '^#' .env | xargs)
why wouldn't you put that in quotes? Quotes were made for this...
Personally, I agree. But the docker developers unfortunately don't.
From the docker-compose documentation (https://docs.docker.com/compose/env-file/):
- There is no special handling of quotation marks. This means that they are part of the VAL.
Bummer. If you use the same .env file with docker-compose, quotes must be avoided.
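To make the mismatch concrete, here is a minimal sketch; the docker-compose behaviour is the one quoted from the docs above:

# .env
GREETING="hello"

# Shell route: xargs strips the surrounding quotes, so the value is plain hello
export $(egrep -v '^#' .env | xargs)
echo "$GREETING"   # hello

# docker-compose route (per the doc above): no special handling of quotation
# marks, so the container sees GREETING="hello" with the quotes included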
Doing it wrong just because one idiot from docker thought it would be clever to do shit like that... I'd rather have a script generating the YAML file from my correct .env file than using it like that.
Do you know if podman copied the same behavior or fixed it?
@herbaltealeaf @bmmuller
What about the use case when values build on preceding entries in the dotenv
file?
Source file: gitlab.env
CI_SERVER_HOST=gitlab.selfhosted.domain
CI_SERVER_PROTOCOL=https
CI_SERVER_URL=${CI_SERVER_PROTOCOL}://${CI_SERVER_HOST}
CI_API_V4_URL=${CI_SERVER_URL}/api/v4
CI_REGISTRY=${CI_SERVER_HOST}:4567
CI_REGISTRY_USER=gitlab-ci-token
GITLAB_URL=${CI_SERVER_URL}
I get literals as values.
$ export_envs gitlab.env
$ env | grep -Fe CI_ -e GITLAB_ - | grep -Fv TOKEN
CI_REGISTRY_USER=gitlab-ci-token
CI_API_V4_URL=${CI_SERVER_URL}/api/v4
CI_PAGES_DOMAIN=pages.selfhosted.domain
CI_REGISTRY=${CI_SERVER_HOST}:4567
CI_SERVER_URL=${CI_SERVER_PROTOCOL}://${CI_SERVER_HOST}
CI_SERVER_PROTOCOL=https
CI_SERVER_HOST=gitlab.selfhosted.domain
GITLAB_URL=${CI_SERVER_URL}
Whereas sourcing the file evaluates the values correctly:
$ source gitlab.env
$ env | grep -Fe CI_ -e GITLAB_ - | grep -Fv TOKEN
CI_REGISTRY_USER=gitlab-ci-token
CI_API_V4_URL=https://gitlab.selfhosted.domain/api/v4
CI_PAGES_DOMAIN=pages.selfhosted.domain
CI_REGISTRY=gitlab.selfhosted.domain:4567
CI_SERVER_URL=https://gitlab.selfhosted.domain
CI_SERVER_PROTOCOL=https
CI_SERVER_HOST=gitlab.selfhosted.domain
GITLAB_URL=https://gitlab.selfhosted.domain
god bless you
This version withstands every special character in values:
set -a
source <(cat development.env | sed -e '/^#/d;/^\s*$/d' -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g")
set +a
Explanation:
- -a means that every bash variable would become an environment variable
- /^#/d removes comments (strings that start with #)
- /^\s*$/d removes empty strings, including whitespace
- "s/'/'\\\''/g" replaces every single quote with '\'', which is a trick sequence in bash to produce a quote :)
- "s/=\(.*\)/='\1'/g" converts every a=b into a='b'
As a result, you are able to use special characters :)
To debug this code, replace source with cat and you'll see what this command produces.
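For example, with a hypothetical development.env entry that contains a quote, a space and a hash:

# development.env
PASSWORD=p@ss' word#1

# what the sed pipeline produces (viewed with cat instead of source):
PASSWORD='p@ss'\'' word#1'

# after set -a; source <(...); set +a
echo "$PASSWORD"   # p@ss' word#1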
Awesomest discussion in a gist ever
Epic gist indeed :)
@kolypto my initial version was using single quotes in the .env file, and those ended up double-quoted with your version, so I tweaked it a bit.
#input from .env:
SECRET1='bla'
SECRET2 = "sap"
SECRET3 =buz
SECRET_4 =b'u"z
# wrong output:
SECRET1=''\''bla'\'''
SECRET2 =' "sap"'
SECRET3 ='buz'
SECRET_4 ='b'\''u"z'
Improved sed rules:
- /^#/d (same) removes comments (strings that start with #)
- /^\s*$/d (same) removes empty strings, including whitespaces
- s/\(\w*\)[ \t]*=[ \t]*\(.*\)/\1=\2/ keeps group 1 and 2 before and after the =, ignoring the whitespace
- "s/=['\"]\(.*\)['\"]/=\1/g" value part: removes starting/ending single or double quotes
- "s/'/'\\\''/g" (same) replaces every single quote with '\'', which is a trick sequence in bash to produce a quote :)
- "s/=\(.*\)/='\1'/g" (same) converts every a=b into a='b'
set -o allexport
eval $(cat '.env' | sed -e '/^#/d;/^\s*$/d' -e 's/\(\w*\)[ \t]*=[ \t]*\(.*\)/\1=\2/' -e "s/=['\"]\(.*\)['\"]/=\1/g" -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g")
set +o allexport
# Correct output, ready for export!
SECRET1='bla'
SECRET2='sap'
SECRET3='buz'
SECRET_4='b'\''u"z'
@Jules-Baratoux thanks.
your solution
# avoid override already set variables
source <(grep -v '^#' .env | sed -E 's|^(.+)=(.*)$|: ${\1=\2}; export \1|g')
Works great for me
🥳
To get the value of a specified variable by name:
echo $(grep -v '^#' .env | grep -e "YOUR_VARIABLE_NAME_FROM_DOT_ENV_FILE" | sed -e 's/.*=//')
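A slightly stricter variant of the same idea (my tweak, not part of the comment above): anchor the name so FOO does not also match FOO_BAR, and only strip up to the first = so values containing = survive:

grep -v '^#' .env | grep -e '^YOUR_VARIABLE_NAME=' | sed -e 's/^[^=]*=//'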
This is the trick that worked for me on bash and zsh:
# set variables from .compose.env
# but don't override existing exported vars
eval "$(grep -v '^#' .compose.env | sed -E 's|^(.+)=(.*)$|export \1=${\1:-\2}|g' | xargs -L 1)"
Considering only the first =:
eval "$(grep -v '^#' .env | sed -E '0,/^(.+)=/s/^(.+)=(.*)$/export \1=${\1:-\2}/g' | xargs -L 1)"
read_var(){
echo $(grep -v '^#' .env | grep -e "$1" | sed -e 's/.*=//')
}
CUR_APP_ENV=$(read_var "APP_ENV")
This avoids overriding APP_ENV, because overriding it would break .env hot reload.
Used export $(grep -v '^#' .env | xargs -d '\r\n') to trim \r from lines.
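An equivalent sketch that strips the carriage returns explicitly instead (same assumption as the original one-liner: values contain no spaces):

export $(grep -v '^#' .env | tr -d '\r' | xargs)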
Seeing that this thread has been going on for so many years, I figured we need a dotenv tool for the shell.
And I wrote it.
https://github.com/ko1nksm/shdotenv
There is no formal specification for .env, and each is slightly different, but shdotenv supports them and correctly parses comments, whitespace, quotes, etc. It is a single file shell script that requires only awk and runs lightly.
There is no need to waste time on trial and error anymore.
The following worked well for me in github actions:
eval "cat <<EOF
$(egrep -v '^#' .env)
EOF
" | tee --append $GITHUB_ENV
Tips for using this (things that bit me):
- Don't quote anything unless absolutely necessary; the quotes will be taken literally
- Avoid interpolation which uses a variable from inside the .env file. So if your file contains FOO and you try to use it in another variable like FOOBAR=${FOO}bar, all you will get is FOOBAR=bar

Apart from these, it supports pretty much anything you can do regularly, including using variables that store the output of a command.
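For instance, a hypothetical entry whose value is command output does get expanded by the unquoted heredoc before it lands in $GITHUB_ENV:

# .env
GIT_SHA=$(git rev-parse --short HEAD)

# what ends up appended to $GITHUB_ENV (the hash is just an example):
GIT_SHA=1a2b3c4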
Here is my contribution, which I developed with no idea someone already did something similar (I saw too many comments and I decided to develop it instead of reading all of them):
eval "$(
cat <(
grep -vE -e '^\s*#' -e '^\s*$' < .env |
grep -E "^[A-Z0-9_]+=['\"]"
) <(
grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env |
sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
)
)"
Explanation:
1. Remove all comments and empty lines:
   - Regex for comments: ^\s*#
   - Regex for empty lines: ^\s*$
   - Grep command: grep -vE -e '^\s*#' -e '^\s*$' < .env
2. Get all variables that do not need treatment (i.e. those which are already inside quotes):
   grep -E "^[A-Z0-9_]+=['\"]"
3. Pipe step 1 to step 2:
   grep -vE -e '^\s*#' -e '^\s*$' < .env | grep -E "^[A-Z0-9_]+=['\"]"
4. Use process substitution to pass the result as a parameter to cat (I could have used a temporary file, but meh):
   <( grep -vE -e '^\s*#' -e '^\s*$' < .env | grep -E "^[A-Z0-9_]+=['\"]" )
5. Get all variables that do need treatment:
   grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env
   Notice how the 3 regular expressions are the same as in steps 1 and 2. The only difference here is that I reuse the -v option with the third regex.
6. Put the single quotes to avoid problems with special characters:
   - Regex for the variable with '=': ^([A-Z0-9_]+=)
   - Regex for the rest of the contents that must be inside single quotes: (.*)
   - Put the first group as is and the second between quotes: \\1'\\2'
   - Sed command: sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
7. Pipe step 5 to step 6:
   grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env | sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
8. Do pretty much the same as step 4:
   <( grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env | sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/" )
9. Concat and eval:
eval "$(
cat <(
grep -vE -e '^\s*#' -e '^\s*$' < .env |
grep -E "^[A-Z0-9_]+=['\"]"
) <(
grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env |
sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
)
)"
This is basically what @kolypto does here, but more complex and worse because it doesn't take into consideration the variables that contain single or double quotes inside them, because I should have been able to reuse the result of some grep commands and because maybe there is also a way to not have to feed 2 commands with the contents of the same file (I wrote < .env
twice). I had some fun coding this at least ;)
If you are seeking an answer and have scrolled to the bottom to find it, here it is (thanks to @abij; you can see his answer here):
set -o allexport
eval $(cat '.env' | sed -e '/^#/d;/^\s*$/d' -e 's/\(\w*\)[ \t]*=[ \t]*\(.*\)/\1=\2/' -e "s/=['\"]\(.*\)['\"]/=\1/g" -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g")
eval $(cat '.env.local' | sed -e '/^#/d;/^\s*$/d' -e 's/\(\w*\)[ \t]*=[ \t]*\(.*\)/\1=\2/' -e "s/=['\"]\(.*\)['\"]/=\1/g" -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g")
set +o allexport
Just make sure you execute that monster in the same folder that you have your .env file.
The second eval is to read the .env.local file, whose variables should override the ones in .env
TL;DR:
local result=$(grep ^VAR_NAME=.* path/to/.env | cut -d "=" -f 2);
More protected function to get necessary var and throw error if VAR not found, or invalid path provided.
Usage:
local port=`getEnvVar --var PORT --path ./some-project/.env`;
echo "port = $port";
Code
# Exits with an error if the given exit code is non-zero.
#
# @example: exitIfError $? "Your error text".
# @example: exitIfError $1 "Your error text".
function exitIfError() {
local exit_code=$1
shift
[[ $exit_code ]] &&
((exit_code != 0)) && {
echo "ERROR. $@" >&2;
exit "$exit_code";
}
}
# Gets ENV property from provided .env file
#
# @param var - Variable name
# @param path - Path to env file.
# @param [file] - Optional fileName param. @default .env.
function getEnvVar() {
# This line is necessary to parse function named args
# @see https://gist.github.com/mopcweb/38f5d09525f8defa5aa807d95efa8307
while [[ $# -gt 0 ]]; do if [[ $1 == *"--"* ]]; then if [[ $2 != *"--"* ]]; then local "${1/--/}"="${2:-true}"; else local "${1/--/}"=true; fi; fi; shift; done;
[[ -n $file ]] && local fileName=$file || local fileName=".env";
[[ -z $var || -z $path ]] && exitIfError 1 "getEnvVar: --var & --path are required.";
[[ ! -d $path ]] && exitIfError 1 "getEnvVar: --path should be a valid dir.";
local result=$(grep ^$var=.* $path/$fileName | cut -d "=" -f 2);
[[ -z $result ]] && exitIfError 1 "getEnvVar: there is no such $var var in $path/$fileName file.";
echo $result;
}
@mopcweb Can you update with input and output, what is supported in the .env file?
And how much fun did you have, creating your own solution ;)?
# INPUT                      EXPECTED
'FOO=value'                  FOO='value'
"FOO=#value # comment"       FOO='#value # comment'
"FOO=value "                 FOO='value '
'FOO='                       FOO=''
'export FOO=value'           export FOO='value'
"FOO=foo bar"                FOO='foo bar'
"FOO= foo"                   FOO=' foo'
Test cases from @ko1nksm: https://github.com/ko1nksm/shdotenv/blob/main/spec/docker_spec.sh.
Note: check his awesome script: https://github.com/ko1nksm/shdotenv !
Why not dotenv-cli?
$ dotenv <command with arguments>
# or
$ dotenv -e .env.custom <command with arguments>
Because not every system has nodejs on it. And it's good that way.
One-liner that allows unquoted variables that contain spaces:
OLD_IFS=$IFS; IFS=$'\n'; for x in `grep -v '^#.*' .env`; do export $x; done; IFS=$OLD_IFS
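A quick check of the unquoted-spaces claim, with a hypothetical .env line:

# .env
MSG=hello world

OLD_IFS=$IFS; IFS=$'\n'; for x in `grep -v '^#.*' .env`; do export $x; done; IFS=$OLD_IFS
echo "$MSG"   # hello world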
ead_var() {
VAR=$(grep $1 $2 | xargs)
IFS="=" read -ra VAR <<< "$VAR"
echo ${VAR[1]}
}MY_VAR=$(read_var MY_VAR .env)
Perfect, thanks
What about this
# save the existing environment variables
prevEnv=$(env)
# if the .env file exists, source it
[ -f .env ] && . .env
# re-export all vars from the env so they override whatever was set in .env
for e in $prevEnv
do
export $e
done
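One caveat (my note, not from the comment above): for e in $prevEnv word-splits on spaces, so any previously exported value containing a space gets mangled. Reading the saved snapshot line by line avoids that, assuming bash and no newlines inside values:

# save the existing environment variables
prevEnv=$(env)
# if the .env file exists, source it
[ -f .env ] && . .env
# re-export the saved vars line by line so values with spaces survive
while IFS= read -r e; do
  export "$e"
done <<< "$prevEnv"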
I wrote my own because I was using forbidden symbols in env values.
This basically adds apostrophes so that all variables will be treated as strings. This way you can use your Docker env files and source them with source envs.sh
import sys


def main(input_path, postfix='.sh'):
    with open(input_path, 'r') as file_handle:
        lines = file_handle.readlines()
    envs = {}
    for line in lines:
        try:
            parts = line.split('=')
            name = parts[0]
            # re-join with '=' so values that contain '=' are preserved
            value = '='.join(parts[1:]).rstrip('\n')
        except ValueError:
            pass
        else:
            envs[name] = value
    output_path = f'{input_path}{postfix}'
    with open(output_path, 'w') as file_handle:
        lines = []
        for name, value in envs.items():
            line = f'{name}=\'{value}\'\n'
            lines.append(line)
        file_handle.writelines(lines)


if __name__ == '__main__':
    # Passes first argument as input path.
    main(sys.argv[1])
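Usage would look something like this (the script file name here is just a placeholder):

python3 dotenv_to_sh.py .env   # writes .env.sh with every value single-quoted
source .env.sh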
The solution proposed by @arizonaherbaltea will not work correctly when the file is not terminated with a newline. For example, using this .env example,
# test.env
MY_VAR=a
Will not work properly, whereas the following will,
# test.env
MY_VAR=a
The only change is adding a newline at the end of the file. This is because read
requires a newline to parse the line correctly. Thus, in order to ensure everything works properly we can add a check to see if the file ends with newline and if not append it before parsing it. Doing this will ensure all lines are parsed correctly.
#!/bin/bash
function export_envs() {
local env_file=${1:-.env}
local is_comment='^[[:space:]]*#'
local is_blank='^[[:space:]]*$'
echo "trying env file: ${env_file}"
# ensure it has a newline at the end, if it does not already
tail_line=`tail -n 1 "${env_file}"`
if [[ "${tail_line}" != "" ]]; then
echo "No Newline at end of ${env_file}, appending!"
echo "" >> "${env_file}"
fi
while IFS= read -r line; do
echo "${line}"
[[ $line =~ $is_comment ]] && continue
[[ $line =~ $is_blank ]] && continue
key=$(echo "$line" | cut -d '=' -f 1)
# shellcheck disable=SC2034
value=$(echo "$line" | cut -d '=' -f 2-)
# shellcheck disable=SC2116,SC1083
echo "The key: ${key} and value: ${value}"
eval "export ${key}=\"$(echo \${value})\""
done < <(cat "${env_file}")
}
export_envs ${1}
Then it results,
✗ ./test.sh test.env
trying env file: test.env
The key: MY_VAR and value: a
The key: MY_VAR and value: b
Hope this helps someone out there :)
What's wrong with:
if [ -f .env ]; then
set -o allexport
source .env
fi
Works on macOS with my .env that works with docker compose
and does not have quotes around every string.
Test with:
envsubst < "$secret_file" | cat
later in the same script.
why would it fail though? No problems in my case
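It fails as soon as a value uses shell syntax without quoting, which docker-style .env files allow. A hypothetical line that breaks plain source:

# .env
MOTD=hello world

source .env   # bash: world: command not found, and MOTD is not set at all

Values made only of safe characters (no spaces, quotes, #, $ and so on) source cleanly, which is why this approach often just works.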