How can I share global variables between multiple bash scripts?

Asked By CuriousCoder99 On

Hey everyone! I'm working on a project that involves several shell scripts. In my main script (main.sh), I have a bunch of global variables that I want to share with other scripts that run later on. Initially, I thought about using `source main.sh` to bring in those variables, but then I realized that the values of these variables could change, and I would be importing old values before any modifications occurred.

While I know passing variables as arguments is a common approach, I'm not a fan of it for this case: I want users to be able to write and customize their own scripts freely. So I had the idea of a function like

`__print_my_global_variables "vars.sh"`

This function would output all the global variables from the script into vars.sh. However, I want to avoid hardcoding the variable names in the function and make it applicable to any script. Does anyone have suggestions on how to accomplish this?

**Edit:** I also considered converting all global variables to environment variables, but I suspect there might be a more efficient method.

**Edit 2:** Thanks for all the help! I was able to resolve my issue with the code I shared below:

```bash
__print_my_global_variables() {
    if [ "$#" -gt 1 ]; then
        err "Error: too many arguments to __print_my_global_variables()." $__ERROR $__RETURN -1; return $?
    fi

    # `command -v` is the portable way to test for a program (vs. `which`).
    command -v gawk > /dev/null || { err "gawk is required to run the function: __print_my_global_variables()!" $__ERROR $__RETURN -2; return $? ;}

    # `declare -p` lists variables alphabetically; the shell's own `_`
    # variable (underscore, ASCII 95) sorts after the uppercase shell
    # variables, so everything after the "declare -- _=" line is a
    # lowercase-named user variable.
    local __output_file="$(realpath "$1" 2>/dev/null)"
    if [ -z "$__output_file" ]; then
        # No (resolvable) output file given: print to stdout.
        declare -p | gawk 'BEGIN{f=0} $0 ~ /^declare -- _=/{f=1; next} f==1{print $0}'
    elif { [ -w "$(dirname "$__output_file")" ] && [ ! -f "$__output_file" ]; } \
      || { [ -f "$__output_file" ] && [ -w "$__output_file" ]; }; then
        # New file in a writable directory, or an existing writable file.
        declare -p | gawk 'BEGIN{f=0} $0 ~ /^declare -- _=/{f=1; next} f==1{print $0}' > "$__output_file"
    else
        err "Cannot write to $__output_file!" $__ERROR $__RETURN -3; return $?
    fi
    return 0
}
```
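For anyone puzzled by the filter: because `declare -p` output is sorted and `_` sorts between uppercase and lowercase names, everything after the `declare -- _=` line is a lowercase-named user variable. A minimal sketch of just that idea, without the `err`/`$__ERROR` helpers (the pattern also works with plain `awk`, and the variable names here are made up for illustration):

```bash
#!/usr/bin/env bash
# Print lowercase-named user variables by skipping everything up to and
# including the "declare -- _=" line in the sorted `declare -p` output.
dump_globals() {
    declare -p | awk 'f{print} /^declare -- _=/{f=1}'
}

my_host="localhost"
my_port=8080
dump_globals > vars.sh
```

Sourcing `vars.sh` at the top level of another script then restores `my_host` and `my_port` with the values they had at dump time.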

3 Answers

Answered By StackedSolutions On

If you want a more robust solution, consider using a key-value store like Redis. It allows you to set and get variables easily, and it’s great for concurrent access. You can set a variable like this:
```bash
redis-cli SET alphaVar "value"
```
And retrieve it like this:
```bash
value=$(redis-cli GET alphaVar)
```

Answered By ScriptingSavvy On

A good idea would be to use a common prefix for your global variables. That way you can grab them all with `declare -p "${!prefix@}" > vars.bash`, where `prefix` is your chosen prefix: the `${!prefix@}` expansion yields every variable name starting with it. Just a heads up: if you later `source` that file inside a function, `declare` will make the variables local to that function, so do the sourcing at top level rather than wrapping the `source` command in a function.
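To make that concrete, here is a small sketch assuming the prefix `myapp_` (the prefix and variable names are arbitrary examples):

```bash
#!/usr/bin/env bash
# All shared globals carry the (arbitrary) prefix "myapp_".
myapp_host="localhost"
myapp_port=8080

# ${!myapp_@} expands to every variable name starting with "myapp_",
# and `declare -p` prints them as sourceable declarations.
declare -p "${!myapp_@}" > vars.bash
```

A later script can then `source vars.bash` at top level; doing it inside a function would make the variables local, as noted above.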

Answered By TechWhiz123 On

I've experimented with something along these lines. You could define a function that exports your global variables to an external script. Here’s a quick example of how it could look:

```bash
export_globals() {
    local output_file="${1}"
    # `declare -p` prints plain variables as "declare -- NAME=...";
    # keep only those lines (skipping functions and exports), drop the
    # usual shell bookkeeping variables, and strip the prefix so the
    # resulting file can simply be sourced.
    declare -p \
        | grep '^declare -- ' \
        | grep -v -E '(BASH_|COMP_|DIRSTACK|FUNCNAME|GROUPS|PIPESTATUS)' \
        | sed 's/^declare -- //' > "${output_file}"
}
```

Then just call `export_globals "vars.sh"` whenever you need to save the state of your variables.
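As a sanity check of the overall flow (file names hypothetical), the point is that the snapshot is taken after the values change, so the consumer script sees the current values rather than the ones from definition time:

```bash
#!/usr/bin/env bash
# Hypothetical round trip: the main script snapshots a global *after*
# it has changed, and a second script sources the snapshot.

app_mode="debug"
cat > consumer.sh <<'EOF'
#!/usr/bin/env bash
source ./vars.sh            # top level, so the variable stays global
echo "mode is $app_mode"
EOF

app_mode="release"          # the value changes during the run
declare -p app_mode | sed 's/^declare -- //' > vars.sh
bash consumer.sh            # prints: mode is release
```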
