# The initial version
if [ -f .env ]
then
  export $(cat .env | xargs)
fi

# My favorite from the comments. Thanks @richarddewit & others!
set -a && source .env && set +a
The above worked fine for me, but thought I'd share the solution I went with:
https://stackoverflow.com/a/30969768/179329

set -o allexport; source .env; set +o allexport

As @richarddewit pointed out above, -a / +a can be used in place of -o allexport to be more concise (thanks!).
I now use the following simple line to source .env files into my scripts...
set -a; source .env; set +a
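For reference, the kind of .env file all of these snippets consume looks something like this (values are made up):

# .env (hypothetical example)
DB_HOST=localhost
DB_PORT=5432
DB_PASSWORD="s3cr3t with a space"   # quoting matters for several of the one-liners below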
export $(awk -F= '{output=output" "$1"="$2} END {print output}' aaa.env)
[ ! -f .env ] || export $(grep -v '^#' .env | xargs)
Sweet, works like a charm for me, thanks.
oh-my-zsh users can also activate the dotenv plugin.
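Enabling it is just a matter of adding it to the plugins array in ~/.zshrc, e.g.:

# ~/.zshrc
plugins=(git dotenv)

By default it should ask for confirmation before sourcing a .env it finds when you cd into a directory.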
Thank you, this was better.
I had trouble with a (Docker) setup where environment variables had spaces in their values without quotes, and I needed to get the container's env vars in a script called during the container execution/runtime.
I ended up exporting the variables to a file in the entrypoint and then reading them back when needed.
# In entrypoint
export -pn \
| grep "=" \
| grep -v -e PATH -e PWD -e OLDPWD \
| cut -d ' ' -f 3- \
> /docker-container.env
The export command fixes issues with missing quotes, avoiding errors where the shell interpreter tries to execute parts of the variable value as commands.
# In script
set -o allexport
. /docker-container.env
set +o allexport
(I had to use /bin/sh, so I use . file instead of source file.)
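For illustration, the generated /docker-container.env ends up containing plain, properly quoted assignments, roughly like the following (names and values made up), which is exactly what the . /docker-container.env line above can read back in:

MY_VAR="a value with spaces"
OTHER_VAR="simple"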
POSIX-compliant version built around set, [ ], and .
Many thanks to the prior posters who brought up set -o allexport and set -a / set +a.
This snippet will source a dotenv file, exporting the values into the environment. If allexport is already set, it leaves it set; otherwise it sets it, reads the file, and unsets it.
if [ -z "${-%%*a*}" ]; then
set -a
. ./.env
set +a
else
. ./.env
fi
Double brackets [[, source, and setopt are not available in POSIX, nor is the test [[ -o a ]] to check for set options. And we need to quote our comparison strings to deal with empty vars.
The code to check whether an option is set is a bit of a pain. It could be a case statement or a grep on set -o, like set -o | grep allexport | grep -q yes, but blech. Instead I've used parameter expansion with pattern matching to remove a maximal match from the $- variable, which holds the current set of single-letter options.
${-%%*a*} uses %% parameter expansion to remove the longest suffix matching the pattern *a*. If $- contains a, this expansion produces an empty string, which we can test with -z or -n.
There is a subtle bug if no options are set, so instead the comparison "$-" = "${-%%a*}" checks whether the expansion changed the string: allexport is set if the two strings differ. And even % will work, since we don't need a maximal match and can drop the leading * from the pattern.
if [ "$-" = "${-%a*}" ]; then
# allexport is not set
set -a
. ./.env
set +a
else
. ./.env
fi
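A quick throwaway check (not part of the snippet) that shows the test flipping as allexport is toggled:

if [ "$-" = "${-%a*}" ]; then echo "allexport off"; else echo "allexport on"; fi   # typically "off" in a plain script
set -a
if [ "$-" = "${-%a*}" ]; then echo "allexport off"; else echo "allexport on"; fi   # now "on"
set +a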
When the values have newline chars (\n), spaces, or quotes, it can get messy.
After a lot of trial and error, I ended up with a variation of what @bergkvist proposed in https://gist.github.com/mihow/9c7f559807069a03e302605691f85572?permalink_comment_id=4245050#gistcomment-4245050 (thank you very much!).
# Keep only lines that are neither comments nor blank
ENV_VARS="$(cat .env | awk '!/^\s*#/' | awk '!/^\s*$/')"
eval "$(
  printf '%s\n' "$ENV_VARS" | while IFS='' read -r line; do
    # Key is everything before the first '=', value everything after it;
    # double quotes are escaped so the generated export line stays valid
    key=$(printf '%s\n' "$line" | sed 's/"/\\"/g' | cut -d '=' -f 1)
    value=$(printf '%s\n' "$line" | cut -d '=' -f 2- | sed 's/"/\\\"/g')
    printf '%s\n' "export $key=\"$value\""
  done
)"
env $(cat .env)
This does not work for me, but this one works:
env $(cat .env|xargs) CMD
My .env has some special values such as FOO='VPTO&wH7$^3ZHZX$o$udY4&i'.
@NatoBoram
A simple solution that works for bash, zsh, and fish:
eval export $(cat .env)
Use this to create the file:
export -p > .env
and just
. .env
to read it back in.
From man export:
"The shell shall format the output, including the proper use of quoting, so that it is suitable for reinput to the shell as commands that achieve the same exporting results."
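A minimal round trip under that approach (GREETING is a made-up variable; note that export -p dumps every exported variable, not just the ones you added):

export GREETING='hello world'   # hypothetical variable
export -p > .env                # dump all exported vars, properly quoted
# ...later, possibly in a fresh shell...
. ./.env
echo "$GREETING"                # -> hello world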
Although set -a; source .env; set +a is elegant and short, one thing I missed is that it overwrites existing exported variables.
In my use case I have a script which connects to postgres with a predefined user. This user is stored in the .env file as PG_USER=myuser. So the script does the magical set -a; source .env; set +a and everything works. But sometimes I need to change the user ad hoc, so what I'd do is PG_USER=postgres ./my_script.sh. In order not to overwrite the existing var I did this horrendous piece of code:
IFS=$'\n'
for l in $(cat /etc/my_service/.env); do
  IFS='=' read -ra VARVAL <<< "$l"
  # If a variable with that name already exists, preserve its value
  eval "export ${VARVAL[0]}=\${${VARVAL[0]}:-${VARVAL[1]}}"
done
unset IFS
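With that in place the precedence works out as intended (sketched with the values from above):

PG_USER=postgres ./my_script.sh   # the pre-set value wins; PG_USER stays postgres inside the script
./my_script.sh                    # nothing pre-set, so PG_USER falls back to myuser from the .env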
The cleanest solution I found for this was using allexport and source like this: set -o allexport source .env set +o allexport

This was by far the best solution here for me; it removed all the complexity around certain chars, spaces, comments etc. Just needed a tweak on formatting to prevent others being tripped up. It should be:

set -o allexport
source .env
set +o allexport
Works like a charm. ty
This reads line by line, allowing the use of previously set variables:

while read -r LINE; do
  if [[ $LINE == *'='* ]] && [[ $LINE != '#'* ]]; then
    ENV_VAR="$(echo $LINE | envsubst)"
    eval "declare $ENV_VAR"
  fi
done < .env
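The envsubst call is what expands references to variables that already exist in the environment, so a (made-up) .env like the following resolves as expected, assuming $HOME contains no spaces:

# hypothetical .env
CACHE_DIR=$HOME/.cache/myapp
CONFIG_FILE=$HOME/.config/myapp/settings.toml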
This was working the best for me, but it still has two problems:
- the code breaks if the value has () characters inside it
- it cannot be used inside a function

Here is my solution:
read_env() {
local filename="${1:-.env}"
if [ ! -f "$filename" ]; then
echo "missing ${filename} file"
exit 1
fi
echo "reading .env file..."
while read -r LINE; do
if [[ $LINE != '#'* ]] && [[ $LINE == *'='* ]]; then
export "$LINE"
fi
done < "$filename"
}
UPDATED VERSION BELOW
Hi, here is my solution to read vars from the /etc/environment file, which I use in /etc/profile or /etc/bash.bashrc.
One-liner, for easier validation that a line exists in the file (I removed single quotes ' from the conditions so it can be parsed more easily by grep):
while read -r LINE; do [[ ${LINE} =~ ^# || ${LINE} =~ ^PATH= || ! ${LINE} == *=* || ${LINE} =~ ^[0-9] || ${LINE} =~ ^[^a-zA-Z_] ]] || export "${LINE}"; done < "/etc/environment"
or formatted syntax:
while read -r LINE; do
if [[ ${LINE} =~ ^# || ${LINE} =~ ^PATH= || ! ${LINE} == *=* || ${LINE} =~ ^[0-9] || ${LINE} =~ ^[^a-zA-Z_] ]]; then
continue
else
export "${LINE}"
fi
done < "/etc/environment"
Hi,
Here is my solution to read vars from a .env file, ignoring # comments and cleaning values of ' and ".
https://gist.github.com/shadiabuhilal/220aa09f9bb83caed93a1f87401fcc60
dot-env.sh file:
#!/bin/bash
# Specify the path to your .env file
ENV_FILE=".env"
# Check if the .env file exists
if [ -f "$ENV_FILE" ]; then
echo "[INFO]: Reading $ENV_FILE file."
# Read the .env file line by line
while IFS= read -r line; do
# Skip comments and empty lines
if [[ "$line" =~ ^\s*#.*$ || -z "$line" ]]; then
continue
fi
# Split the line into key and value
key=$(echo "$line" | cut -d '=' -f 1)
value=$(echo "$line" | cut -d '=' -f 2-)
# Remove single quotes, double quotes, and leading/trailing spaces from the value
value=$(echo "$value" | sed -e "s/^'//" -e "s/'$//" -e 's/^"//' -e 's/"$//' -e 's/^[ \t]*//;s/[ \t]*$//')
# Export the key and value as environment variables
export "$key=$value"
done < "$ENV_FILE"
echo "[DONE]: Reading $ENV_FILE file."
else
echo "[ERROR]: $ENV_FILE not found."
fi
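Note that to get the variables into your current shell you would source the script rather than execute it; otherwise the exports only live in a child process. Something like:

. ./dot-env.sh      # or: source ./dot-env.sh in bash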
Enjoy :)
Thanks!
One-liner for running a command or script (such as a pnpm script).
I added parentheses so as not to pollute the global environment vars. Not sure if it's needed, though.

(export $(cat .env | xargs) && pnpm compile)
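The parentheses do matter: they confine the exports to a subshell. A quick check, where FOO stands in for any variable from your .env that was not already set in the calling shell:

(export $(cat .env | xargs) && pnpm compile)
echo "${FOO:-<unset>}"   # prints <unset>; the export only existed inside the subshell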
This is the final version that I'm using; it seems to work for all situations:
read_env() {
local filePath="${1:-.env}"
if [ ! -f "$filePath" ]; then
echo "missing ${filePath}"
exit 1
fi
echo "Reading $filePath"
while read -r LINE; do
# Remove leading and trailing whitespaces, and carriage return
CLEANED_LINE=$(echo "$LINE" | awk '{$1=$1};1' | tr -d '\r')
if [[ $CLEANED_LINE != '#'* ]] && [[ $CLEANED_LINE == *'='* ]]; then
export "$CLEANED_LINE"
fi
done < "$filePath"
}
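Usage is simply the following (the second path is just an example), keeping in mind the function has to run in the shell that should receive the variables:

read_env                        # reads ./.env
read_env /etc/my_service/.env   # or any other path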
Thanks @MansourM !
This is the final version that I'm using; it seems to work for all situations:

read_env() {
  local filePath="${1:-.env}"
  if [ ! -f "$filePath" ]; then
    echo "missing ${filePath}"
    exit 1
  fi
  log "Reading $filePath"
  while read -r LINE; do
    # Remove leading and trailing whitespaces, and carriage return
    CLEANED_LINE=$(echo "$LINE" | awk '{$1=$1};1' | tr -d '\r')
    if [[ $CLEANED_LINE != '#'* ]] && [[ $CLEANED_LINE == *'='* ]]; then
      export "$CLEANED_LINE"
    fi
  done < "$filePath"
}
Looks great, but it doesn't work if the .env file contains only one row (a single line with no trailing line break).
How do you use it? Do you get any errors?
BTW, you need to comment out or remove this line, as you don't have the log function:
log "Reading $filePath"
I needed this, and after reading the above I realized none of the solutions quite worked for my .env file on a Mac... so I modified what @shadiabuhilal wrote into this (which I put into a file called "readenv"):
# quick bash function to read .env file
# use it via:
# source readenv
# readenv
#
# or
#
# readenv <filename>
#
# modified from https://gist.github.com/mihow/9c7f559807069a03e302605691f85572
# fixed for whitespace issues, posix compliance (e.g. \t on mac means t)
#
# NOT a standalone script: when run as a standalone script, it reads the ENV variables into a sub-process, not the
# calling process
readenv() {
local filePath="${1:-.env}"
if [ ! -f "$filePath" ]; then
# silently be done
# put some error / echo if you prefer non-silent errors
return 0
fi
# echo "Reading $filePath"
while read -r line; do
if [[ "$line" =~ ^\s*#.*$ || -z "$line" ]]; then
continue
fi
# Split the line into key and value. Trim whitespace on either side.
key=$(echo "$line" | cut -d '=' -f 1 | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//')
value=$(echo "$line" | cut -d '=' -f 2- | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//')
# Leaving the below here... normally this works, but if you have something like
# FOO=" string with leading and trailing "
# then the leading / trailing spaces are deleted. FOO="a word", FOO='a word', and FOO=a word all generally work
# so leave the quotes
# Remove single quotes, double quotes, and leading/trailing spaces from the value
# value=$(echo "$value" | sed -e "s/^'//" -e "s/'$//" -e 's/^"//' -e 's/"$//' -e 's/^[[:space:]]*//;s/[[:space:]]*$//')
# Export the key and value as environment variables
# echo "$key=$value"
export "$key=$value"
done < "$filePath"
}
Also, I recommend using [[:space:]] rather than \s or [ \t]: on Macs, \s isn't whitespace, and \t isn't TAB but a literal t. Yay, standardization!
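A quick way to see the difference (GNU sed happens to understand \t, which is why these scripts tend to break only once they hit BSD sed on a Mac):

printf 'a\tb\n' | sed 's/\t//g'            # GNU sed strips the tab; BSD sed looks for a literal 't' instead, per the note above
printf 'a\tb\n' | sed 's/[[:space:]]//g'   # the POSIX class strips the tab on both -> ab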
This is what I use:
# shellcheck disable=SC2046
[ -f .env ] && export $(grep -v '^#' .env | xargs)
No need to do a double negation with [ ! -f .env ] || when you can do [ -f .env ] &&.
@bfontaine thanks, worked like a charm.
Worked for me