Dependencies in Python
In Windmill's standard mode, Python dependencies are declared directly within scripts, without the need to manage separate dependency files. From the import lines, Windmill automatically handles the resolution and caching of the script's dependencies to ensure fast and consistent execution.
There are, however, methods to take more control over your dependencies:
- Leveraging standard mode in the web IDE or locally.
- Overriding dependencies by providing a requirements.txt.
In addition, there are other techniques, compatible with the methods mentioned above:
- Sharing common logic with Relative Imports.
- Pinning dependencies and requirements.
- Private PyPI Repository.
- Python runtime settings.
To learn more about how dependencies from other languages are handled, see Dependency management & imports.
Lockfile per script inferred from imports (Standard)
In Windmill, you can run scripts without having to manage a requirements.txt directly. This is achieved by automatically parsing the imports and resolving the dependencies.
In Python, the imports are automatically parsed when the script is saved, and a list of imports is generated. A dependency job is then spawned to associate that list of PyPI packages with a lockfile, which pins their versions. This ensures that the same version of a Python script is always executed with the same versions of its dependencies. It also avoids the hassle of maintaining a separate requirements file.
We use a simple heuristic to infer the package name: the import root name must be the package name. We also maintain a list of exceptions. You can make a PR to add your dependency to the list of exceptions here.
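For instance, a script like the following needs no dependency file; Windmill infers the PyPI package from the import root (a minimal sketch, the URL is illustrative):
import requests

def main(url: str = "https://example.com"):
    # The import root "requests" is mapped to the PyPI package of the
    # same name, and the resolved version is pinned in the lockfile.
    return requests.get(url).status_code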
Web IDE
When a script is deployed through the Web IDE, Windmill generates a lockfile to ensure that the same version of a script is always executed with the same versions of its dependencies. To generate the lockfile, it analyzes the imports: an import can pin a version, and when no version is given, the latest version is used. Windmill's workers cache dependencies to ensure fast performance without the need to pre-package dependencies - most jobs take under 100ms end-to-end.
At runtime, a deployed script always uses the same version of its dependencies.
At each deployment, the lockfile is automatically recomputed from the imports in the script and the imports used by its relative imports. The computation of that lockfile is done by a dependency job that you can find in the Runs page.
CLI
On local development, each script gets:
- a content file (script_path.py, script_path.ts, script_path.go, etc.) that contains the code of the script,
- a metadata file (script_path.yaml) that contains the metadata of the script,
- a lockfile (script_path.lock) that contains the dependencies of the script.
You can get those 3 files for each script by pulling your workspace with the command wmill sync pull.
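For example, after a pull, a script sits next to its two companion files (the path and pinned versions below are purely illustrative):
└── f/examples/
    ├── fetch_data.py # the script's code
    ├── fetch_data.yaml # its metadata
    └── fetch_data.lock # its locked dependencies, e.g. requests==2.31.0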
Editing a script is as simple as editing its content. The code can be edited freely in your IDE, and you can even run it locally if you have the correct development environment set up for the script's language.
Using the wmill CLI command wmill script generate-metadata, lockfiles can be generated and updated as files. The CLI asks the Windmill servers to run a dependency job, using as input either the requirements.txt (if present) or the script's code for automatic resolution, and creates the lockfiles from the output of those jobs.
When a lockfile is present alongside a script at the time of deployment by the CLI, no dependency job is run and the existing lockfile is used instead.
Lockfile per script inferred from a requirements.txt
Although Windmill can automatically resolve imports, it is possible to override the dependencies by providing a requirements.txt file in the same directory as the script, building and maintaining it to declare dependencies as you would in a standard Python project.
When running wmill script generate-metadata, if a requirements.txt is discovered, the closest one is used as the source of truth to generate the lockfile from the server, instead of inferring the dependencies from the script's imports directly.
You can write those requirements.txt files by hand or produce them with standard Python tooling (for instance pip freeze after a pip install package_name).
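For instance, such a requirements.txt contains standard pip requirement specifiers (names and pins here are illustrative):
pandas>=2.0
requests==2.31.0
dependency3@git+https://github.com/myrepo/dependency3.git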
Several requirements.txt files can therefore coexist, each having authority over the scripts closest to it:
└── windmill_folder/
    ├── requirements.txt
    ├── f/foo/
    │   ├── requirements.txt
    │   ├── script1.py
    │   ├── # script1.py will use the dependencies from windmill_folder/f/foo/requirements.txt
    │   └── bar/
    │       ├── requirements.txt
    │       ├── script2.py
    │       └── # script2.py will use the dependencies from windmill_folder/f/foo/bar/requirements.txt
    └── f/baz/
        ├── script3.py
        └── # script3.py will use the dependencies from windmill_folder/requirements.txt
The Windmill VS Code extension has a toggle "Infer lockfile" / "Use current lockfile".
With this toggle, you can choose to use the metadata lockfile (derived from requirements.txt after wmill script generate-metadata) instead of inferring the dependencies directly from the script.
Other
Other tricks can be used: sharing common logic with relative imports, pinning dependencies and requirements, and using a private PyPI repository. All are compatible with the methods described above.
Sharing common logic with relative imports
Common logic can be shared across scripts easily using relative imports, in both Python and TypeScript.
It is possible to import directly from other Python scripts by simply following the path layout. For instance, from f.<foldername>.script_name import foo. A more complete example below:
# u/user/common_logic
def foo():
    print('Common logic!')
And in another Script:
# u/user/custom_script
from u.user.common_logic import foo

def main():
    return foo()
It works with scripts contained in folders and with scripts contained in user-spaces, e.g. f.<foldername>.script_path or u.<username>.script_path.
You can also do imports relative to the current script. For instance:
# if common_logic is a script in the same folder or user-space
from .common_logic import foo
# otherwise if you need to access the folder 'folder'
from ..folder.common_logic import foo
Beware that you can only import scripts that you have view rights on at time of execution.
The folder layout is identical to the one used by the CLI for syncing scripts locally and on Windmill. See Developing scripts locally.
Pinning dependencies and requirements
If the imports are not properly analyzed, there is an escape hatch to override the inferred imports: head the script with a #requirements: comment followed by the dependencies. The standard pip requirement specifiers are supported. Some examples:
#requirements:
#dependency1[optional_module]
#dependency2>=0.40
#dependency3@git+https://github.com/myrepo/dependency3.git
import dependency1
import dependency2
import dependency3
def main():
    ...
To combine Windmill's inference with the ability to add extra dependencies or pin the version of some of them, use extra_requirements:
#extra_requirements:
#dependency==0.4
import pandas
import dependency
def main():
    ...
Private PyPI repository
Environment variables can be set to customize pip's index-url, extra-index-url and certificate, which is useful for private repositories.
In a docker-compose file, you would add the following lines:
windmill_worker:
  ...
  environment:
    ...
    - PIP_TRUSTED_HOST=pypi.org
    - PIP_INDEX_CERT=/custom-certs/root-ca.crt
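    # The index URLs themselves can be set through pip's standard
    # environment variables (the values below are placeholders):
    - PIP_INDEX_URL=https://pypi.company.com/simple
    - PIP_EXTRA_INDEX_URL=https://pypi.org/simple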
"Pip Index Url" and "Pip Extra Index Url" are filled through Windmill UI, in Instance settings under Enterprise Edition.
Python runtime settings
For a given worker group, you can add Python runtime-specific settings such as additional Python paths and pip local dependencies.
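As a quick sanity check (a minimal sketch with no external dependencies), a script can return the interpreter's import search paths; any additional Python paths configured for the worker group should appear in the result:
import sys

def main():
    # Return the import search paths visible to this worker. Paths added
    # through the worker group's Python runtime settings should show up here.
    return sys.path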