79

I need to do an export like this in Python:

# export MY_DATA="my_export"

I've tried to do :

# -*- python-mode -*-
# -*- coding: utf-8 -*-
import os
os.system('export MY_DATA="my_export"')

But when I list the exports, "MY_DATA" does not appear:

# export

How can I do an export with Python, without saving "my_export" into a file?

13 Answers

116

export is a command that you give directly to the shell (e.g. bash), to tell it to add or modify one of its environment variables. You can't change your shell's environment from a child process (such as Python), it's just not possible.

Here's what's happening when you try os.system('export MY_DATA="my_export"')...

/bin/bash process, command `python yourscript.py` forks python subprocess
 |_
   /usr/bin/python process, command `os.system()` forks /bin/sh subprocess
    |_
      /bin/sh process, command `export ...` changes its local environment

When the bottom-most /bin/sh subprocess finishes running your export ... command, then it's discarded, along with the environment that you have just changed.
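The dead end can even be observed from inside Python. A minimal sketch (assuming MY_DATA is not already set in your environment): not even the Python process, the direct parent of that /bin/sh, sees the variable.

```python
import os

# The export happens inside a /bin/sh child spawned by os.system()...
os.system('export MY_DATA="my_export"')

# ...and is discarded with that child. This Python process (the parent
# of that sh) never saw the variable -- let alone the shell above it.
print(os.environ.get("MY_DATA"))  # None
```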

8
  • Indeed, I did not see it like that! Commented Oct 1, 2009 at 21:05
  • 17
    I just realized, after a lot of testing, that it is you who is right: I can't change my shell's environment from a child process (such as Python), it's just not possible. Commented Oct 2, 2009 at 0:48
  • 12
    @KevinCampion Please change the accepted answer in such case.
    – cubuspl42
    Commented Mar 16, 2016 at 20:09
    hm.. I tried running subprocess.check_output( 'export x=foo && other_people_command_depending_on_x' ) and it didn't work somehow -- any ideas what happens there? Setting os.environ['x'] = 'foo' for Python (and thus all its child processes) works.
    – xealits
    Commented Sep 22, 2016 at 10:45
  • I recently had to do something similar; here is the issue and what worked for me. The problem was to execute a Python script which internally executes an ELF binary, and I wanted a certain path to be set for this binary. The solution that worked for me was to fetch the current path variable from the Python code and then directly update the PATH variable using os.putenv. Note that this will not update the PATH variable of the shell from which the Python script was originally invoked.
    – krishna
    Commented Jan 12, 2017 at 5:43
100

You actually want to do

import os
os.environ["MY_DATA"] = "my_export"
1
  • 14
    This doesn't actually work (although it's a nicer way to do this): $ python Python 2.7.10 (default, Sep 8 2015, 17:20:17) [GCC 5.1.1 20150618 (Red Hat 5.1.1-4)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import os >>> os.environ["MY_DATA"] = "my_export" >>> $ export | grep -c MY_DATA 0 Commented Nov 12, 2015 at 7:01
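For clarity, here is a hedged sketch of what the assignment above does achieve: the variable becomes visible inside the Python process and is inherited by any child it launches afterwards; only the invoking shell stays unaffected.

```python
import os
import subprocess

os.environ["MY_DATA"] = "my_export"

# Visible inside this Python process:
print(os.environ["MY_DATA"])  # my_export

# ...and inherited by children started from Python:
child = subprocess.run(["sh", "-c", "echo $MY_DATA"],
                       capture_output=True, text=True)
print(child.stdout.strip())  # my_export
```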
22

Another way to do this, if you're in a hurry and don't mind the hacky aftertaste, is to have the Python script print the commands that set the environment, and execute that output in your bash environment. Not ideal, but it can get the job done in a pinch. It's not very portable across shells, so YMMV.

$(python -c 'print("export MY_DATA=my_export")')

(you can also enclose the statement in backticks in some shells ``)

3
  • 5
    Can others comment as to why this got downvoted? It seems like a reasonable solution given the desired requirements. It doesn't start a new subshell, and does actually add new environment variables to the current, running shell process. Commented Jan 11, 2016 at 6:00
  • Actually quite cool. Better than writing a script and sourcing it later on.
    – rhoerbe
    Commented Aug 21, 2018 at 10:06
  • Indeed, quite cool. This and the more detailed version of @Akhil should be the best answer. Commented Aug 6, 2019 at 13:46
10

Not that simple:

$ python -c "import os; os.putenv('MY_DATA','123')"
$ echo $MY_DATA # <- empty

But:

$ python -c "import os; os.putenv('MY_DATA','123'); os.system('bash')"
$ echo $MY_DATA #<- 123
5
  • 1
    just reminding that if you run the second line many times, the same amount of recursive bash children will be created. Commented Jul 6, 2012 at 7:06
  • 4
    Basically, you just created a new bash instance on top of python which is on top of another bash
    – Paco
    Commented Jun 26, 2013 at 23:40
  • 3
    This solution is not correct. In a python script with many commands, the script will exit as the new bash instance is created.
    – Shailen
    Commented Oct 13, 2014 at 14:53
  • 2
    Don't do that; creating an entire new bash process just for an environment variable is really bad practice.
    – Nico
    Commented Jul 12, 2016 at 15:30
  • That will take you to another shell inside the terminal, so if you put that inside a script that has several other commands after it, they will be stuck until you return/exit from that new shell.
    – M Y
    Commented Dec 11, 2019 at 18:33
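A caveat that explains why the first snippet above prints nothing: in CPython, os.putenv writes only to the C-level environment that children inherit; it does not update the os.environ mapping. Assigning to os.environ updates both. A small sketch (assuming MY_DATA starts out unset):

```python
import os

os.putenv("MY_DATA", "123")       # C-level environment only
print(os.environ.get("MY_DATA"))  # None -- os.environ doesn't see putenv()

os.environ["MY_DATA"] = "123"     # updates os.environ *and* the C level
os.system('echo "MY_DATA is $MY_DATA"')  # the child shell sees 123
```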
2

I have an excellent answer.

#!/bin/bash

output=$(git diff origin/master..origin/develop | \
python -c '
# DO YOUR HACKING
variable1_to_be_exported = "Yo Yo"
variable2_to_be_exported = "Honey Singh"
# ... and so on
magic = ""
magic += "export onShell_var1=\"" + str(variable1_to_be_exported) + "\"\n"
magic += "export onShell_var2=\"" + str(variable2_to_be_exported) + "\""
print(magic)
'
)

eval "$output"
echo "$onShell_var1"  # Output will be Yo Yo
echo "$onShell_var2"  # Output will be Honey Singh

Alex Tingle is correct about those process and sub-process matters.

How it can still be achieved is as shown above. The key concepts are:

  1. Whatever is printed from python is captured into a bash variable [output]
  2. We can execute any command given as a string using eval
  3. So, prepare your python output as meaningful bash commands
  4. Use eval to execute it in bash

And you can see your results.

NOTE: Always execute the eval with the variable in double quotes, or else bash will mangle your \ns and the output will be strange

PS: I don't like bash, but you have to use it

1
  • It is indeed an excellent answer !! see also the answer by @mikepk, same idea Commented Aug 6, 2019 at 13:44
2

I've had to do something similar on a CI system recently. My options were to do it entirely in bash (yikes) or use a language like python which would have made programming the logic much simpler.

My workaround was to do the programming in python and write the results to a file. Then use bash to export the results.

For example:

# do calculations in python
with open("./my_export", "w") as f:
    f.write(your_results)
# then in bash
export MY_DATA="$(cat ./my_export)"
rm ./my_export  # if no longer needed
0

You could try os.environ["MY_DATA"] instead.

1
  • 4
    This doesn't answer the question at all, because this doesn't actually export to the current shell.
    – kevr
    Commented Oct 6, 2017 at 16:37
0

Kind of a hack because it's not really python doing anything special here, but if you run the export command in the same sub-shell, you will probably get the result you want.

import os

cmd = "export MY_DATA='1234'; echo $MY_DATA" # or whatever command
os.system(cmd)
0

In the hope of providing clarity over common confusion...

I have written many python <--> bash <--> elfbin toolchains and the proper way to see it is such as this:

Each process (originator) has a state of the environment inherited from whatever invoked it. Any change remains local to that process. Transferring an environment state is a function by itself and runs in two directions, each with its own caveats. The most common thing is to modify the environment before running a sub-process. To go down to the metal, look at the exec() call in C. There is a variant that takes a pointer to environment data. This is the only actually supported transfer of environment in typical OSes.

A shell script creates the state it passes to children when you do an export. Otherwise, children just get what the shell itself received in the first place.

In all other cases it will be some generic mechanism used to pass a set of data, allowing the calling process itself to update its environment based on the child process's output.

Ex:

ENVUPDATE=$(CMD_THAT_OUTPUTS_KEYVAL_LISTS)
echo "$ENVUPDATE" > "$TMPFILE"
source "$TMPFILE"

The same can of course be done using json, xml or other things as long as you have the tools to interpret and apply.

The need for this may be (50% chance) a sign of misconstruing the basic primitives, and that you need a better config or parameter interchange in your solution.

Oh, in python I would do something like this (it may need improvement depending on your situation):

import os
import re

RE_KV = re.compile(r'([a-z]\w*)\s*=\s*(.*)')

OUTPUT = RunSomething(...)  # assuming OUTPUT looks like 'k1=v1 k2=v2'

for kv in OUTPUT.split(' '):
    try:
        k, v = RE_KV.match(kv).groups()
        os.environ[k] = str(v)
    except AttributeError:
        pass  # not a key=value pair
0

One line solution:

eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
echo $python_include_path  # prints /home/<usr>/anaconda3/include/python3.6m in my case

Breakdown:

Python call

python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'

It's launching a python script that

  1. imports sysconfig
  2. gets the python include path corresponding to this python binary (use "which python" to see which one is being used)
  3. prints the script "python_include_path={0}" with {0} being the path from 2

Eval call

eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`

It executes the output from the python script in the current bash instance. In my case, it's executing:

python_include_path=/home/<usr>/anaconda3/include/python3.6m

In other words, it's setting the environment variable "python_include_path" with that path for this shell instance.

Inspired by: http://blog.tintoy.io/2017/06/exporting-environment-variables-from-python-to-bash/

0
import os
import shlex
from subprocess import Popen, PIPE


# update this process's environment; the child below inherits it via env=
os.environ["MY_DATA"] = "my_export"

# note: pass a split argument list *without* shell=True (combining the two
# is a common bug -- with shell=True only the first list item reaches the shell)
res = Popen(shlex.split("cmd xxx -xxx"), stdin=PIPE, stdout=PIPE, stderr=PIPE,
            env=os.environ).communicate('y\ny\ny\n'.encode('utf8'))
stdout = res[0]
stderr = res[1]

1
  • 1
    Welcome to SO, Thank you for your contribution, please add some explanation along with the code, which will help SO members to understand your answer better.
    – dkb
    Commented Jun 24, 2019 at 10:48
0

If the calling script is python then using subprocess.run is more appropriate. You can pass a modified environment dictionary to the env parameter of subprocess.run.

Here's a step-by-step guide:

1] Import the Subprocess Module: Make sure you have the subprocess module imported in your Python script.

import subprocess
import os

2] Prepare the Environment Variables: Create or modify the environment variables as needed. You can start with a copy of the current environment and then update it with your specific variables.

# Copy the current environment
env = os.environ.copy()

# Set your custom environment variables
env["MY_VARIABLE"] = "value"
env["ANOTHER_VARIABLE"] = "another value"

3] Call subprocess.run with the Custom Environment: Use the env parameter to pass your custom environment to the subprocess.

# Call the subprocess with the custom environment
result = subprocess.run(["your_script.sh"], env=env)

Replace "your_script.sh" with the path to your script or command.

4] Optional: Handle the Result: You can handle the result of the subprocess call as needed, for example, checking if the script ran successfully.

if result.returncode == 0:
    print("Script executed successfully")
else:
    print("Script failed with return code", result.returncode)
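The four steps above can be combined into one runnable sketch; here the POSIX env utility stands in for the hypothetical your_script.sh, since it simply prints the environment it was given:

```python
import os
import subprocess

# 1] + 2] copy the current environment and add custom variables
env = os.environ.copy()
env["MY_VARIABLE"] = "value"

# 3] run the child with the custom environment
# ('env' prints every KEY=value pair it inherited)
result = subprocess.run(["env"], env=env, capture_output=True, text=True)

# 4] handle the result
if result.returncode == 0:
    print("MY_VARIABLE=value" in result.stdout)  # True
else:
    print("Script failed with return code", result.returncode)
```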

-5

os.system('/home/user1/exportPath.ksh')

exportPath.ksh:

export MY_DATA="my_export"
