# Pass the env-vars to MYCOMMAND
eval $(egrep -v '^#' .env | xargs) MYCOMMAND

# … or ...

# Export the vars in .env into your shell:
export $(egrep -v '^#' .env | xargs)
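As a quick illustration, here is a hypothetical .env with simple, unquoted, space-free values (the xargs trick breaks on values containing spaces):

# .env (hypothetical example)
DB_HOST=localhost
DB_PORT=5432

# export into the current shell, then verify:
export $(egrep -v '^#' .env | xargs)
echo "$DB_HOST:$DB_PORT"   # localhost:5432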
The following worked well for me in GitHub Actions:
eval "cat <<EOF
$(egrep -v '^#' .env)
EOF
" | tee --append $GITHUB_ENV
Tips for using this (things that bit me):

- Don't quote anything unless absolutely necessary; the quotes will be taken literally.
- Avoid interpolation that uses one variable inside another in the .env file. If your file contains FOO and you try to use it in another variable like FOOBAR=${FOO}bar, all you will get is FOOBAR=bar (illustrated in the sketch below).

Apart from these, it supports pretty much anything you can do regularly, including using variables that store the output of a command.
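A minimal sketch of both pitfalls, using hypothetical variable names; with this approach the quotes end up in the value and the interpolation does not give you what you expect:

# .env (hypothetical)
GREETING="hello"        # quotes are kept literally: GREETING becomes "hello", quotes included
FOO=foo
FOOBAR=${FOO}bar        # ends up as FOOBAR=bar, not FOOBAR=foobar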
Here is my contribution, which I developed without realizing someone had already done something similar (there were too many comments, so I decided to write it instead of reading them all):
eval "$(
cat <(
grep -vE -e '^\s*#' -e '^\s*$' < .env |
grep -E "^[A-Z0-9_]+=['\"]"
) <(
grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env |
sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
)
)"
Explanation:
1. Remove all comments and empty lines:
   - Regex for comments: ^\s*#
   - Regex for empty lines: ^\s*$
   - Grep command: grep -vE -e '^\s*#' -e '^\s*$' < .env
2. Get all variables that do not need treatment (i.e. those which are already inside quotes): grep -E "^[A-Z0-9_]+=['\"]"
3. Pipe step 1 to step 2: grep -vE -e '^\s*#' -e '^\s*$' < .env | grep -E "^[A-Z0-9_]+=['\"]"
4. Use process substitution to pass the result as a parameter to cat (I could have used a temporary file, but meh): <( grep -vE -e '^\s*#' -e '^\s*$' < .env | grep -E "^[A-Z0-9_]+=['\"]" )
5. Get all variables that do need treatment: grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env
   Notice how the 3 regular expressions are the same as in steps 1 and 2. The only difference here is that I reuse the -v option with the third regex.
6. Put the values in single quotes to avoid problems with special characters:
   - Regex for the variable name with '=': ^([A-Z0-9_]+=)
   - Regex for the rest of the contents that must go inside single quotes: (.*)
   - Put the first group as is and the second between quotes: \\1'\\2'
   - Sed command: sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
7. Pipe step 5 to step 6: grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env | sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
8. Do pretty much the same as step 4: <( grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env | sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/" )
9. Concat and eval:
eval "$(
cat <(
grep -vE -e '^\s*#' -e '^\s*$' < .env |
grep -E "^[A-Z0-9_]+=['\"]"
) <(
grep -vE -e '^\s*#' -e '^\s*$' -e "^[A-Z0-9_]+=['\"]" < .env |
sed -r "s/^([A-Z0-9_]+=)(.*)/\\1'\\2'/"
)
)"
This is basically what @kolypto does here, only more complex and worse: it doesn't handle variables that contain single or double quotes inside their values, I should have been able to reuse the result of some of the grep commands, and there is probably a way to avoid feeding the same file to two commands (I wrote < .env twice). I had some fun coding this at least ;)
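To make the effect concrete, here is a hypothetical .env and the text that the two process substitutions hand to eval (already-quoted values pass through, the rest get wrapped in single quotes):

# .env (hypothetical)
# database settings
NAME="John Doe"
GREETING=hello world

# text that eval receives:
NAME="John Doe"
GREETING='hello world'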
If you are looking for an answer and have scrolled to the bottom to find it, here it is (thanks to @abij; you can see his answer here):
set -o allexport
eval $(cat '.env' | sed -e '/^#/d;/^\s*$/d' -e 's/\(\w*\)[ \t]*=[ \t]*\(.*\)/\1=\2/' -e "s/=['\"]\(.*\)['\"]/=\1/g" -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g")
eval $(cat '.env.local' | sed -e '/^#/d;/^\s*$/d' -e 's/\(\w*\)[ \t]*=[ \t]*\(.*\)/\1=\2/' -e "s/=['\"]\(.*\)['\"]/=\1/g" -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g")
set +o allexport
Just make sure you execute that monster in the same folder as your .env file.
The second eval reads the .env.local file, whose variables should override the ones in .env.
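For example, with these hypothetical files the value from .env.local wins:

# .env (hypothetical)
PORT=3000
API_URL="https://example.com"

# .env.local (hypothetical)
PORT=4000

# after running the block above:
echo "$PORT"      # 4000
echo "$API_URL"   # https://example.com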
TL;DR:
local result=$(grep ^VAR_NAME=.* path/to/.env | cut -d "=" -f 2);
Here is a more protective function that gets the required variable and throws an error if the variable is not found or an invalid path is provided.
Usage:
local port=`getEnvVar --var PORT --path ./some-project`;
echo "port = $port";
Code
# Exits with an error if the given exit code is non-zero.
#
# @example: exitIfError $? "Your error text".
# @example: exitIfError $1 "Your error text".
function exitIfError() {
local exit_code=$1
shift
[[ $exit_code ]] &&
((exit_code != 0)) && {
echo "ERROR. $@" >&2;
exit "$exit_code";
}
}
# Gets ENV property from provided .env file
#
# @param var - Variable name
# @param path - Path to the directory containing the env file.
# @param [file] - Optional fileName param. @default .env.
function getEnvVar() {
# This line is necessary to parse function named args
# @see https://gist.github.com/mopcweb/38f5d09525f8defa5aa807d95efa8307
while [[ $# -gt 0 ]]; do if [[ $1 == *"--"* ]]; then if [[ $2 != *"--"* ]]; then local "${1/--/}"="${2:-true}"; else local "${1/--/}"=true; fi; fi; shift; done;
[[ -n $file ]] && local fileName=$file || local fileName=".env";
[[ -z $var || -z $path ]] && exitIfError 1 "getEnvVar: --var & --path are required.";
[[ ! -d $path ]] && exitIfError 1 "getEnvVar: --path should be a valid dir.";
local result=$(grep ^$var=.* $path/$fileName | cut -d "=" -f 2);
[[ -z $result ]] && exitIfError 1 "getEnvVar: there is no such $var var in $path/$fileName file.";
echo $result;
}
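A quick worked example, assuming a hypothetical ./some-project/.env (note that --path takes the directory; the file name defaults to .env and can be overridden with --file):

# ./some-project/.env (hypothetical)
PORT=8080

port=$(getEnvVar --var PORT --path ./some-project)
echo "port = $port"    # port = 8080

# a missing variable or a bad path prints the error to stderr and exits:
getEnvVar --var MISSING --path ./some-project
# ERROR. getEnvVar: there is no such MISSING var in ./some-project/.env file.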
@mopcweb Can you update it with input and output examples showing what is supported in the .env file?
And how much fun did you have creating your own solution ;)?
# INPUT                      EXPECTED
'FOO=value'                  FOO='value'
"FOO=#value # comment"       FOO='#value # comment'
"FOO=value "                 FOO='value '
'FOO='                       FOO=''
'export FOO=value'           export FOO='value'
"FOO=foo bar"                FOO='foo bar'
"FOO= foo"                   FOO=' foo'
Test cases from @ko1nksm: https://github.com/ko1nksm/shdotenv/blob/main/spec/docker_spec.sh.
Note: check his awesome script: https://github.com/ko1nksm/shdotenv !
Why not dotenv-cli?
$ dotenv <command with arguments>
# or
$ dotenv -e .env.custom <command with arguments>
Because not every system has nodejs on it. And it's good that way.
One-liner that allows unquoted variables that contain spaces:
OLD_IFS=$IFS; IFS=$'\n'; for x in `grep -v '^#.*' .env`; do export $x; done; IFS=$OLD_IFS
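For instance, with a hypothetical .env like this, the loop exports the whole value, space included:

# .env (hypothetical)
GREETING=hello world

OLD_IFS=$IFS; IFS=$'\n'; for x in `grep -v '^#.*' .env`; do export $x; done; IFS=$OLD_IFS
echo "$GREETING"   # hello world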
read_var() {
  VAR=$(grep $1 $2 | xargs)
  IFS="=" read -ra VAR <<< "$VAR"
  echo ${VAR[1]}
}

MY_VAR=$(read_var MY_VAR .env)
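Minimal usage sketch with a hypothetical .env:

# .env (hypothetical)
MY_VAR=some-value

MY_VAR=$(read_var MY_VAR .env)
echo "$MY_VAR"   # some-value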
Perfect, thanks
What about this?
# save the existing environment variables
prevEnv=$(env)

# if the .env file exists, source it
[ -f .env ] && . .env

# re-export all vars from the previous env so they override whatever was set in .env
for e in $prevEnv
do
  export $e
done
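For example, assuming the environment already has API_URL exported (hypothetical values); note this assumes the pre-existing environment values contain no spaces, since the loop word-splits $prevEnv:

# environment before sourcing (hypothetical)
export API_URL=https://prod.example.com

# .env (hypothetical)
API_URL=https://dev.example.com
DEBUG=1

# after running the block above:
echo "$API_URL"   # https://prod.example.com  (pre-existing value wins)
echo "$DEBUG"     # 1                         (new value from .env)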
I wrote my own because I was using forbidden symbols in my env values.
It basically adds apostrophes so that all values are treated as strings. This way you can take your Docker env files and source them with source envs.sh.
import sys

def main(input_path, postfix='.sh'):
    with open(input_path, 'r') as file_handle:
        lines = file_handle.readlines()
    envs = {}
    for line in lines:
        try:
            # Split on the first '=' only; lines without one raise ValueError and are skipped.
            name, value = line.split('=', 1)
        except ValueError:
            pass
        else:
            envs[name] = value.rstrip('\n')
    output_path = f'{input_path}{postfix}'
    with open(output_path, 'w') as file_handle:
        lines = []
        for name, value in envs.items():
            # Wrap every value in single quotes so special characters survive sourcing.
            line = f"{name}='{value}'\n"
            lines.append(line)
        file_handle.writelines(lines)

if __name__ == '__main__':
    # Passes the first argument as the input path.
    main(sys.argv[1])
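Usage sketch, assuming the script is saved under a hypothetical name such as env_to_sh.py:

# .env (hypothetical)
PASSWORD=p@ss&word!

python3 env_to_sh.py .env   # writes .env.sh containing: PASSWORD='p@ss&word!'
source .env.sh              # PASSWORD is now set; the single quotes protect & and !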
The solution proposed by @arizonaherbaltea will not work correctly when the file is not terminated with a newline. For example, this .env file (without a trailing newline),
# test.env
MY_VAR=a
will not work properly, whereas the following (with a trailing newline) will:
# test.env
MY_VAR=a
The only change is adding a newline at the end of the file. This is because read requires a newline to parse the line correctly. Thus, to ensure everything works properly, we can check whether the file ends with a newline and, if not, append one before parsing. Doing this ensures all lines are parsed correctly.
#!/bin/bash
function export_envs() {
  local env_file=${1:-.env}
  local is_comment='^[[:space:]]*#'
  local is_blank='^[[:space:]]*$'
  echo "trying env file: ${env_file}"

  # ensure the file ends with a newline, appending one if it does not
  # (tail -c 1 prints the last byte; it is empty after stripping if that byte is a newline)
  if [ -n "$(tail -c 1 "${env_file}")" ]; then
    echo "No newline at end of ${env_file}, appending!"
    echo "" >> "${env_file}"
  fi

  while IFS= read -r line; do
    echo "${line}"
    [[ $line =~ $is_comment ]] && continue
    [[ $line =~ $is_blank ]] && continue
    key=$(echo "$line" | cut -d '=' -f 1)
    # shellcheck disable=SC2034
    value=$(echo "$line" | cut -d '=' -f 2-)
    # shellcheck disable=SC2116,SC1083
    echo "The key: ${key} and value: ${value}"
    eval "export ${key}=\"$(echo \${value})\""
  done < <(cat "${env_file}")
}
export_envs ${1}
Then it produces:
✗ ./test.sh test.env
trying env file: test.env
The key: MY_VAR and value: a
The key: MY_VAR and value: b
Hope this helps someone out there :)
What's wrong with:
if [ -f .env ]; then
set -o allexport
source .env
fi
Works on macOS with my .env, which works with docker compose and does not have quotes around every string.
Test it later in the same script with:
envsubst < "$secret_file" | cat
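As a rough sketch of that check, with a hypothetical .env and a hypothetical template referenced by $secret_file (allexport matters here, because envsubst only sees exported variables):

# .env (hypothetical, docker-compose style, no quotes)
POSTGRES_USER=admin

# template file pointed to by $secret_file (hypothetical)
user=${POSTGRES_USER}

# after the allexport/source block above:
envsubst < "$secret_file" | cat   # prints: user=admin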
Seeing that this thread has been going on for years, I figured we need a dotenv tool for the shell.
And I wrote it.
https://github.com/ko1nksm/shdotenv
There is no formal specification for .env, and each implementation is slightly different, but shdotenv supports them and correctly parses comments, whitespace, quotes, etc. It is a single-file shell script that requires only awk and is lightweight.
There is no need to waste time on trial and error anymore.
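A minimal usage sketch based on the project's README (see the repo for the full option list and dialect support):

# export the variables defined in .env into the current shell
eval "$(shdotenv)"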