# The initial version
if [ -f .env ]
then
export $(cat .env | xargs)
fi

# My favorite from the comments. Thanks @richarddewit & others!
set -a && source .env && set +a
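For reference, most snippets in this thread assume a plain KEY=VALUE file along these lines (the names and values here are hypothetical):

# .env
DB_HOST=localhost
DB_PORT=5432
SECRET_TOKEN="s3cr3t value with a space"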
@chengxuncc
Using export $(grep -v '^#' .env | xargs) could not export the following:

A=10
B=$A
C=${A}

Now echo $B produces $A, while it should print 10.
This reads line by line, allowing previously set variables to be used:

while read -r LINE; do
  if [[ $LINE == *'='* ]] && [[ $LINE != '#'* ]]; then
    ENV_VAR="$(echo $LINE | envsubst)"
    eval "declare $ENV_VAR"
  fi
done < .env
This solution is helpful!
Thanks a lot!
You can also do:
eval "$(
cat .env | awk '!/^\s*#/' | awk '!/^\s*$/' | while IFS='' read -r line; do
key=$(echo "$line" | cut -d '=' -f 1)
value=$(echo "$line" | cut -d '=' -f 2-)
echo "export $key=\"$value\""
done
)"
This ignores empty lines and lines starting with # (comments). If you replace eval with echo, you can inspect the generated code.
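For instance, with a hypothetical .env containing DB_HOST=localhost and DB_PORT=5432, the echo variant would print something like:

export DB_HOST="localhost"
export DB_PORT="5432"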
The cleanest solution I found for this was using allexport and source, like this: set -o allexport source .env set +o allexport

This was by far the best solution here for me; it removed all the complexity around certain characters, spaces, comments, etc. It just needed a tweak on formatting to prevent others from being tripped up. It should be:
set -o allexport
source .env
set +o allexport
From man set:

-o option
    This option is supported if the system supports the User Portability Utilities option. It shall set various options, many of which shall be equivalent to the single option letters. The following values of option shall be supported:

    allexport    Equivalent to -a.
So this is the same as
set -a
source .env
set +a
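A minimal guarded variant (assuming the file is named .env in the current directory) that only sources the file when it exists:

if [ -f .env ]; then
  set -a
  . ./.env
  set +a
fi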
[ ! -f .env ] || export $(sed 's/#.*//g' .env | xargs)

Update: a value like TEXT="abc#def" does not work as expected with the sed version, so just drop lines beginning with # instead:

[ ! -f .env ] || export $(grep -v '^#' .env | xargs)
This one works for a Django .env file.

This works for me:
#!/usr/bin/env bash
. .env
For those using sed to rewrite their .env files before evaluation by bash, for example the solution suggested by @kolypto in https://gist.github.com/mihow/9c7f559807069a03e302605691f85572?permalink_comment_id=3625310#gistcomment-3625310
I ran into another case that hadn't been considered: Windows line endings "\r\n". I'm now using:
set -o allexport # enable all variable definitions to be exported
source <(sed -e "s/\r//" -e '/^#/d;/^\s*$/d' -e "s/'/'\\\''/g" -e "s/=\(.*\)/=\"\1\"/g" "${ENV_FILE}")
set +o allexport
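As a quick pre-check (a sketch assuming bash's $'...' quoting and a grep that accepts a literal carriage return in the pattern), you can detect CRLF line endings before sourcing:

if grep -q $'\r' "${ENV_FILE}"; then
  echo "warning: ${ENV_FILE} contains CRLF line endings" >&2
fi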
set -a
source .env
set +a
Worked for me
The above worked fine for me, but I thought I'd share the solution I went with: https://stackoverflow.com/a/30969768/179329

set -o allexport; source .env; set +o allexport

As @richarddewit pointed out above, -a / +a can be used in place of -o allexport to be more concise (thanks!).
I now use the following simple line to source .env files into my scripts...
set -a; source .env; set +a
export $(awk -F= '{output=output" "$1"="$2} END {print output}' aaa.env)
[ ! -f .env ] || export $(grep -v '^#' .env | xargs)
Sweet, works like a charm for me, thanks.
oh-my-zsh users can also activate the dotenv plugin.
Thank you, this was better.
I had trouble with a (Docker) setup where environment variables had spaces in their values without quotes, and I needed to get the container's environment variables in a script called during the container's execution/runtime.
I ended up getting the variables in the entrypoint, exporting them to a file, and then reading them when needed.
# In entrypoint
export -pn \
| grep "=" \
| grep -v -e PATH -e PWD -e OLDPWD \
| cut -d ' ' -f 3- \
> /docker-container.env
The export
command fixes issues with missing quotes, avoiding errors where the shell interpreter tries to execute parts of the variable value as commands.
# In script
set -o allexport
. /docker-container.env
set +o allexport
(I had to use /bin/sh, so I use . file rather than source file.)
POSIX-compliant version built around set, [ ], and .

Many thanks to the prior posters who brought up set -o allexport and set -a / set +a.

This snippet will source a dotenv file, exporting the values into the environment. If allexport is already set, it leaves it set; otherwise it sets it, reads the file, and unsets it.
if [ -z "${-%%*a*}" ]; then
set -a
. ./.env
set +a
else
. ./.env
fi
Double brackets [[, source, and setopt are not available in POSIX. Nor is the test [[ -o a ]] to check for set options. And we need to quote our comparison strings to deal with empty vars.

The code to check whether an option is set is a bit of a pain. It could be a case statement or a grep on set -o, like set -o | grep allexport | grep -q yes, but blech. Instead I've used parameter expansion with pattern matching to remove a maximal match from the $- variable, which contains the set options as a single string.

${-%%*a*} uses %% parameter expansion to remove the longest suffix matching the pattern *a*. If $- contains a, then this expansion produces an empty string, which we can test with -z or -n.

There is a subtle bug if no options are set ($- would be empty, so the expansion is empty even though allexport is off), so the comparison "$-" = "${-%a*}" instead checks whether the expansion changed the string: allexport is set if the two strings differ. And even a single % will work, since we don't need a maximal match and can drop the leading * from the pattern.
if [ "$-" = "${-%a*}" ]; then
# allexport is not set
set -a
. ./.env
set +a
else
. ./.env
fi
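Wrapped into a reusable POSIX function (a sketch; the function name and default path are my own choices):

# dotenv [file] -- source a dotenv file, exporting its variables
dotenv() {
  _envfile="${1:-./.env}"
  [ -f "$_envfile" ] || return 1
  if [ "$-" = "${-%a*}" ]; then
    # allexport is not set: enable it only for the duration of the source
    set -a
    . "$_envfile"
    set +a
  else
    . "$_envfile"
  fi
}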
When the values have newline chars \n, spaces, or quotes, it can get messy.
After a lot of trial and error, I ended up with a variation of what @bergkvist proposed in https://gist.github.com/mihow/9c7f559807069a03e302605691f85572?permalink_comment_id=4245050#gistcomment-4245050 (thank you very much!).
ENV_VARS="$(cat .env | awk '!/^\s*#/' | awk '!/^\s*$/')"
eval "$(
printf '%s\n' "$ENV_VARS" | while IFS='' read -r line; do
key=$(printf '%s\n' "$line"| sed 's/"/\\"/g' | cut -d '=' -f 1)
value=$(printf '%s\n' "$line" | cut -d '=' -f 2- | sed 's/"/\\\"/g')
printf '%s\n' "export $key=\"$value\""
done
)"
env $(cat .env)

does not work for me, but this one does:

env $(cat .env | xargs) CMD

My .env has some special values, such as FOO='VPTO&wH7$^3ZHZX$o$udY4&i'.
@NatoBoram
A simple solution that works for bash, zsh, and fish:
eval export $(cat .env)
Use this to create the file:

export -p > .env

and just

. .env

to read it back in.

From man export:

The shell shall format the output, including the proper use of quoting, so that it is suitable for reinput to the shell as commands that achieve the same exporting results.
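A quick round trip in the same shell (GREETING is a hypothetical variable; note the file will contain every exported variable, not just your own):

export GREETING='hello world'
export -p > .env
# later...
. ./.env
echo "$GREETING"   # hello world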
Although set -a; source .env; set +a is elegant and short, one thing I missed is that it overwrites existing exported variables.

In my use case I have a script which connects to Postgres with a predefined user. This user is stored in the .env file as PG_USER=myuser. So the script does the magical set -a; source .env; set +a and everything works. But sometimes I need to change the user ad hoc, so I would run PG_USER=postgres ./my_script.sh. In order not to overwrite the existing variable, I did this horrendous piece of code:
IFS=$'\n'
for l in $(cat /etc/my_service/.env); do
IFS='=' read -ra VARVAL <<< "$l"
  # If a variable with this name already exists, preserve its value
eval "export ${VARVAL[0]}=\${${VARVAL[0]}:-${VARVAL[1]}}"
done
unset IFS
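An alternative sketch (bash-only, my own variation, assuming simple NAME=value lines) that avoids eval by using indirect expansion to skip variables that are already set:

while IFS='=' read -r name value; do
  case "$name" in ''|\#*) continue ;; esac   # skip blank lines and comments
  if [ -z "${!name+x}" ]; then               # only export if not already set
    export "$name=$value"
  fi
done < /etc/my_service/.env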
set -o allexport
source .env
set +o allexport
Works like a charm, ty.
This reads line by line, allowing previously set variables to be used:

while read -r LINE; do
  if [[ $LINE == *'='* ]] && [[ $LINE != '#'* ]]; then
    ENV_VAR="$(echo $LINE | envsubst)"
    eval "declare $ENV_VAR"
  fi
done < .env
This was working the best for me, but it still has two problems:

- the code breaks if the value has () characters inside it
- it cannot be used inside a function

Here is my solution:
read_env() {
local filename="${1:-.env}"
if [ ! -f "$filename" ]; then
echo "missing ${filename} file"
exit 1
fi
echo "reading .env file..."
while read -r LINE; do
if [[ $LINE != '#'* ]] && [[ $LINE == *'='* ]]; then
export "$LINE"
fi
done < "$filename"
}
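Usage would look like this (the alternate filename is hypothetical):

read_env                  # reads ./.env
read_env .env.production  # or pass a specific file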
UPDATED VERSION BELOW

Hi, here is my solution to read vars from the /etc/environment file, which I use in /etc/profile or /etc/bash.bashrc.

One-liner, for easier validation that a line exists in the file (I removed the single quotes ' from the conditions, so it can be parsed more easily by grep):

while read -r LINE; do [[ ${LINE} =~ ^# || ${LINE} =~ ^PATH= || ! ${LINE} == *=* || ${LINE} =~ ^[0-9] || ${LINE} =~ ^[^a-zA-Z_] ]] || export "${LINE}"; done < "/etc/environment"

or formatted syntax:
while read -r LINE; do
if [[ ${LINE} =~ ^# || ${LINE} =~ ^PATH= || ! ${LINE} == *=* || ${LINE} =~ ^[0-9] || ${LINE} =~ ^[^a-zA-Z_] ]]; then
continue
else
export "${LINE}"
fi
done < "/etc/environment"
Hi,

Here is my solution to read vars from a .env file, ignoring # comments and cleaning values of ' and ".

https://gist.github.com/shadiabuhilal/220aa09f9bb83caed93a1f87401fcc60

dot-env.sh file:
#!/bin/bash
# Specify the path to your .env file
ENV_FILE=".env"
# Check if the .env file exists
if [ -f "$ENV_FILE" ]; then
echo "[INFO]: Reading $ENV_FILE file."
# Read the .env file line by line
while IFS= read -r line; do
# Skip comments and empty lines
if [[ "$line" =~ ^\s*#.*$ || -z "$line" ]]; then
continue
fi
# Split the line into key and value
key=$(echo "$line" | cut -d '=' -f 1)
value=$(echo "$line" | cut -d '=' -f 2-)
# Remove single quotes, double quotes, and leading/trailing spaces from the value
value=$(echo "$value" | sed -e "s/^'//" -e "s/'$//" -e 's/^"//' -e 's/"$//' -e 's/^[ \t]*//;s/[ \t]*$//')
# Export the key and value as environment variables
export "$key=$value"
done < "$ENV_FILE"
echo "[DONE]: Reading $ENV_FILE file."
else
echo "[ERROR]: $ENV_FILE not found."
fi
Enjoy :)
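Note that for the exported variables to end up in your current shell, the script has to be sourced rather than executed (path assumed relative to your session):

. ./dot-env.sh    # or: source ./dot-env.sh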
Thanks!
One-liner for running a command or script (such as a pnpm script). I added parentheses so as not to pollute the global environment vars. Not sure if it's needed, though.
(export $(cat .env | xargs) && pnpm compile)
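The parentheses run the export and the command in a subshell, so nothing leaks into the parent shell; a quick check (FOO is a hypothetical entry in .env):

(export $(cat .env | xargs) && pnpm compile)
echo "$FOO"   # empty in the parent shell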
This is the final version that I'm using; it seems to work for all situations:
read_env() {
local filePath="${1:-.env}"
if [ ! -f "$filePath" ]; then
echo "missing ${filePath}"
exit 1
fi
echo "Reading $filePath"
while read -r LINE; do
# Remove leading and trailing whitespaces, and carriage return
CLEANED_LINE=$(echo "$LINE" | awk '{$1=$1};1' | tr -d '\r')
if [[ $CLEANED_LINE != '#'* ]] && [[ $CLEANED_LINE == *'='* ]]; then
export "$CLEANED_LINE"
fi
done < "$filePath"
}
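A quick check after calling it (SOME_KEY is a hypothetical entry in the file):

read_env .env
echo "${SOME_KEY:?SOME_KEY was not exported}"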
Thanks @MansourM !
This is the final version that I'm using; it seems to work for all situations:

read_env() {
local filePath="${1:-.env}"
if [ ! -f "$filePath" ]; then
echo "missing ${filePath}"
exit 1
fi
log "Reading $filePath"
while read -r LINE; do
# Remove leading and trailing whitespaces, and carriage return
CLEANED_LINE=$(echo "$LINE" | awk '{$1=$1};1' | tr -d '\r')
if [[ $CLEANED_LINE != '#'* ]] && [[ $CLEANED_LINE == *'='* ]]; then
export "$CLEANED_LINE"
fi
done < "$filePath"
}
Looks great, but it doesn't work if the .env file contains only one row (a single line without a trailing line break).
How do you use it? Do you get any errors?

BTW, you need to comment out or remove this line, as you don't have the log function:

log "Reading $filePath"