So, I have a Python script that resides inside a venv, and I’d like to run it from the CLI (on Linux) from time to time. What’s the recommended/intended way to do this?
Should I write a wrapper shell script that activates the virtual environment, runs the Python script, and deactivates the venv again, and put it in a $PATH-accessible directory? This seems a bit convoluted, but I can’t think of a better way.
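
For concreteness, here’s a minimal sketch of that wrapper approach; the paths are placeholders, not a prescribed layout:

    #!/usr/bin/env bash
    # Hypothetical wrapper, e.g. saved as ~/.local/bin/myscript.
    # Adjust the paths to wherever your venv and script actually live.
    source "$HOME/venvs/myproject/bin/activate"
    python "$HOME/projects/myproject/script.py" "$@"
    deactivate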

      • Andy@programming.dev · 5 months ago

        I use my own Zsh project (zpy) to manage venvs, stored like ~/.local/share/venvs/HASH-OF-PROJECT-PATH/venv. With that layout, I use zpy’s vpy function to launch a script with its associated Python executable ad hoc, or add a full-path shebang to the script with zpy’s vpyshebang function.

        vpy and vpyshebang in the docs
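
        For illustration only, a full-path shebang of the kind vpyshebang adds would look roughly like this (the path reuses the placeholder layout above, not a real hash):

            #!/home/user/.local/share/venvs/HASH-OF-PROJECT-PATH/venv/bin/python
            # With a shebang like this, the script runs under its associated
            # venv's interpreter whenever it's executed directly, e.g. ./myscript.py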

        If anyone else is a Zsh fan and has any questions, I’m more than happy to answer or demo.

        • Faulkmore@mastodon.social · 5 months ago

          @Andy The convention is to place the venv in a .venv/ subfolder. Follow the convention!

          This is shell-agnostic.

          Learn pyenv and minimize shell scripts (they should live only within a Makefile).

          Shell scripts within Python packages are deprecated.

          • Andy@programming.dev · 5 months ago

            > The convention

            That’s one convention. I don’t like it, I prefer to keep my venvs elsewhere. One reason is that it makes it simpler to maintain multiple venvs for a single project, using a different Python version for each, if I ever want to. It shouldn’t matter to anyone else, as it’s my environment, not some aspect of the shared repo. If I ever needed it there for some reason, I could always ln -s $VIRTUAL_ENV .venv.
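
            As a hypothetical illustration of that kind of setup (names, paths, and versions are arbitrary):

                # One venv per Python version, kept outside the repo:
                python3.11 -m venv ~/.local/share/venvs/myproj-py311
                python3.13 -m venv ~/.local/share/venvs/myproj-py313
                # Expose one under the conventional name only if something needs it:
                ln -s ~/.local/share/venvs/myproj-py311 .venv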

            > Learn pyenv

            I have used pyenv. It’s fine. These days I use mise instead, which I prefer. But neither of them dictates how I create and store venvs.

            > Shell scripts within Python packages are deprecated

            I don’t understand how what you’re referencing relates to my comment.

            • logging_strict@programming.dev · 5 months ago

              Maintaining multiple venvs for different Python versions sounds exactly like what tox does.
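
              For instance, a minimal sketch of driving tox from the shell, using its standard pyXY environment names:

                  # Run the test suite under several interpreters at once:
                  tox -e py311,py312,py313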

              Then set up a GitHub Action that does nightly builds, which will catch issues caused by changes that were only tested against one Python version or on one platform.

              py313 is a good version to test against, because many modules were removed or deprecated, and APIs changed.

              Good luck. I hope some of my advice is helpful.

              • Andy@programming.dev · 5 months ago

                Thanks, yes. I use nox and GitHub Actions for automated environments and testing in my own projects, and tox instead of nox when it’s someone else’s project. But for ad hoc, local, and interactive use of multiple environments, I don’t.

    • Violet_McQuasional@feddit.uk · 5 months ago

      This. I’ve experimented with pex and one or two other kinds of executable Python wrappers before, and they suck. Just do as lakeeffect says.

  • gitamar@feddit.de · 5 months ago

    I use pipenv together with pyenv. This works pretty well, including in cron jobs: just add pipenv run python script.py to the crontab.
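
    For instance, a crontab entry along these lines (schedule and path are placeholders):

        # Nightly at 03:00; cd first so pipenv can find the project's Pipfile:
        0 3 * * * cd /path/to/project && pipenv run python script.py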

  • Andy@programming.dev · 18 days ago

    As someone’s new comments just brought me back to this post, I’ll point out that these days there’s another good option: uv run.
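
    A quick sketch of what that looks like (run from a project directory; uv creates and syncs the venv itself as needed):

        # Runs script.py inside the project's managed environment:
        uv run script.py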

  • namingthingsiseasy@programming.dev · 5 months ago

    In case the comment above didn’t make it explicitly clear: you can invoke the python binary inside your venv directly, and it will automatically locate all the libraries installed in that virtual environment.

    To show how this works, you can look at the sys.path variable to see which paths python will search for modules when you run import statements. Try running python3 -c 'import sys; print(sys.path)' using your system python, and you will only see system python library paths. Then, try running it again after replacing python3 with the full path to the python3 binary in your venv, and you will see an additional entry in the output with the lib directory in your venv, which shows that python will also look there for modules when an import statement is executed.
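
    Concretely, the comparison looks something like this (the venv path is a placeholder):

        # System interpreter: only system library paths are printed.
        python3 -c 'import sys; print(sys.path)'
        # Venv interpreter: the venv's site-packages directory appears too.
        ~/venvs/myproject/bin/python3 -c 'import sys; print(sys.path)'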

  • santa@sh.itjust.works · 5 months ago

    Does it need access to anything local? If not, you could run it as an AWS Lambda on a schedule.