
vt-c-python-security

pip/Python dependency security scanning and supply chain guidance. Activates when working with requirements.txt, pyproject.toml, poetry.lock, uv.lock, or pip commands — and when evaluating new PyPI packages.

Plugin: core-standards
Category: Other
Command: /vt-c-python-security


Python Security

Supply-chain and venv-hygiene guidance for Python projects. Pairs with the venv-guard.sh PreToolUse hook (SPEC-115 / FR-02) that blocks bare pip invocations outside a virtual environment.

When to Use

  • Adding a new PyPI dependency to requirements.txt or pyproject.toml
  • Reviewing dependency diffs in a pull request
  • Auditing a Python project for supply-chain or version issues
  • Evaluating a candidate package before it enters the project
  • Responding to a pip-audit finding
  • A bare pip install was blocked and you need the correct venv-aware replacement

Prerequisites

  • Project has a virtual environment: python -m venv .venv
  • pip-audit installed inside the venv: .venv/bin/pip install pip-audit
  • Optional but recommended: uv installed for lockfile pinning (pipx install uv keeps it isolated per SPEC-115 hook allowlist)

Execution

Scanning for Vulnerabilities

Run pip-audit against the installed environment, then against the pinned requirements file for defence-in-depth.

# Scan the live venv
.venv/bin/pip-audit

# Scan a requirements file directly (catches pins that installer skipped)
.venv/bin/pip-audit -r requirements.txt

# CI gate — pip-audit already exits non-zero when it finds vulnerabilities;
# --strict additionally fails the run on dependency-collection errors
.venv/bin/pip-audit --strict

CI/CD integration:

# GitHub Actions
- name: Python security audit
  run: |
    python -m venv .venv
    .venv/bin/pip install --require-hashes -r requirements.txt
    .venv/bin/pip install pip-audit
    .venv/bin/pip-audit --strict

Before Adding Dependencies

Evaluate a candidate package against this checklist. No single item is disqualifying on its own, but two or more red flags should stop the add.

  1. Provenance: does the package publish PEP 740 build attestations? PyPI surfaces verified provenance on a file's details page when attestations are present. Attestations prove the wheel came from the advertised source repo on the advertised builder.
  2. Recency: latest release within the last 12 months? Long gaps may signal abandonment. pip index versions <pkg> lists available versions; for upload dates use the PyPI JSON API: curl -s https://pypi.org/pypi/<pkg>/json | jq '.releases | to_entries[] | {v:.key, date:(.value[0].upload_time // "n/a")}'
  3. CVEs: check osv.dev (curl "https://api.osv.dev/v1/query" -d '{"package":{"name":"<pkg>","ecosystem":"PyPI"}}') and GitHub Advisory DB. A clean history beats a patched history.
  4. Download stats: pypistats recent <pkg> — sudden spikes or drops can indicate typosquat surges or takedowns.
  5. Maintainer governance: open https://pypi.org/project/<pkg>/ and check the "Maintainers" sidebar, or query the JSON API: curl -s https://pypi.org/pypi/<pkg>/json | jq '.info | {author, maintainer, author_email}'. A package with a single maintainer is a higher compromise risk (see item 8 for 2FA).
  6. Source/repo match: PyPI page's "Homepage" / "Repository" link matches an actual GitHub/GitLab repo, and that repo publishes the package (not a random fork).
  7. Wheel availability: prefer wheels over sdists. Sdists execute setup.py on install, which is a code-execution surface. Use pip install --only-binary=:all: <pkg> where possible.
  8. 2FA enforcement: verify the maintainer account requires 2FA (PyPI displays a padlock icon on the project page when enforced).

Red Flags

Stop and investigate before running install commands when you see:

  • Typosquat candidates: requets for requests, pandas-ng for pandas, urllib (not urllib3), python-dateutils (plural). Check Phylum or Check Point Research bulletins; 2025 reporting documented roughly 500 malicious PyPI packages per quarter, many specifically targeting ML/AI developers.
  • Fresh package with high downloads: a 2-week-old package with 10K downloads/week is almost always a typosquat or an artificially inflated decoy.
  • Name-squat of deleted package: if a popular package was deleted and a new one with the same name appeared, treat it as a new package (the namespace is not owned by the original author anymore).
  • Install-time scripts: setup.py doing network calls, os.system, or reading environment variables during pip install.
  • Dependency confusion: a private/internal package name that also exists on public PyPI. Pin the private mirror with --index-url (never --extra-index-url — public PyPI wins ties).
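The install-time-scripts flag above can be triaged with a quick grep over the sdist's setup.py before anything is installed. A minimal sketch (the pattern list is illustrative, not exhaustive; a hit means "investigate", not "malicious"):

```shell
# scan_setup: flag common install-time code-execution patterns in a
# setup.py extracted from an sdist. Prints matching lines, or a
# "no obvious" note when nothing in the pattern list appears.
scan_setup() {
  grep -nE 'os\.system|subprocess|eval\(|exec\(|urllib|socket|base64' "$1" \
    || echo "no obvious install-time execution patterns in $1"
}

# Usage: download and extract the sdist first, then scan it:
#   pip download <pkg> --no-deps --no-binary :all: -d /tmp/sdist
#   tar xzf /tmp/sdist/<pkg>-*.tar.gz -C /tmp/sdist
#   scan_setup /tmp/sdist/<pkg>-*/setup.py
```

Legitimate packages also import subprocess at build time, so read the surrounding code before drawing conclusions.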

Lockfile Hygiene

Pinned requirements are the single most effective supply-chain defence.

# Generate hash-locked requirements with uv (recommended)
uv pip compile pyproject.toml -o requirements.txt --generate-hashes

# Generate with pip-tools (older but stable)
pip-compile --generate-hashes pyproject.toml

# Install with hash verification (fails on tampered artifacts)
.venv/bin/pip install --require-hashes -r requirements.txt

--require-hashes is all-or-nothing: every requirement must have a hash or the install aborts. This is a feature, not a bug — partial pinning is worse than no pinning because it hides the unpinned rows.
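For reference, a hash-locked entry emitted by the compilers above has this shape (the digests here are placeholders, not real hashes):

```text
requests==2.32.0 \
    --hash=sha256:<digest-1> \
    --hash=sha256:<digest-2>
```

Multiple --hash lines per requirement are normal: one per published artifact (wheel and sdist, per platform).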

For poetry projects: commit poetry.lock. For uv projects: commit uv.lock. Run poetry install --sync / uv sync in CI to detect lockfile drift.
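The drift check can be sketched as a CI step in the same style as the audit snippet above (assumes uv is available on the runner; the step name is arbitrary):

```yaml
- name: Lockfile drift check
  run: |
    uv pip compile pyproject.toml -o requirements.txt --generate-hashes
    git diff --exit-code requirements.txt   # any diff means the committed pins are stale
```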

Supply Chain Posture

  • Use Trusted Publishing for your own PyPI uploads (OIDC-based, no long-lived tokens). Configure in the package's PyPI project settings.
  • Avoid index-url overrides (--index-url, --extra-index-url, -i). These bypass the default PyPI trust path and are blocked by SPEC-115 deny rules. If you need a private registry, configure pip.conf at the repo root, document the rationale, and lift the project-level deny override only after that config is committed and peer-reviewed.
  • Prefer wheels with --only-binary=:all: unless you audit the sdist's setup.py.
  • Never use pip install --break-system-packages. PEP 668's EXTERNALLY-MANAGED marker exists for a reason — if you're hitting it, create a venv.
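For the Trusted Publishing bullet, the publish job reduces to roughly this GitHub Actions shape (workflow triggers omitted; assumes a trusted publisher is already configured for this repo in the PyPI project settings):

```yaml
publish:
  runs-on: ubuntu-latest
  permissions:
    id-token: write        # OIDC token for Trusted Publishing; no long-lived API token
  steps:
    - uses: actions/checkout@v4
    - run: python -m pip install build && python -m build
    - uses: pypa/gh-action-pypi-publish@release/v1
```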

Vulnerability Response

When pip-audit reports a finding:

  1. Read the advisory: open the GHSA / CVE link. Determine severity and whether your code path actually triggers the vulnerability.
  2. Patch version available? Upgrade to the fixed release (.venv/bin/pip install -U <pkg>==X.Y.Z) and rerun the test suite.
  3. No patch, but upgrade path? Bump to the next major and budget the migration work.
  4. No upgrade path? Evaluate replacement packages (see "Before Adding Dependencies" checklist).
  5. Cannot replace? Document the accepted risk in SECURITY.md with the CVE ID, the reason, and a review date. Revisit monthly.
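A minimal shape for the step-5 entry in SECURITY.md (field names and values are illustrative; adapt to the project's conventions):

```markdown
## Accepted Risks

| CVE ID | Package | Reason accepted | Review by |
|--------|---------|-----------------|-----------|
| CVE-2024-XXXXX | examplepkg 1.2.3 | Vulnerable code path not reachable from our usage | 2025-07-01 |
```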

Edge Cases

  • Dev dependencies: run pip-audit -r dev-requirements.txt in addition to the main audit — dev tooling often has broader attack surface (build tools, linters, etc.) and its pins are usually looser.
  • Private registries: trust the registry, then verify with hashes. A private registry reduces the typosquat risk but does not eliminate it — an insider still needs hash-locked requirements.
  • Conda / mamba projects: pip-audit does not understand conda-only dependencies. Run conda-audit or mamba-lock separately, or mirror conda deps into requirements.txt for pip-audit coverage. SPEC-115 is scoped to pip/venv (see spec.md "Out of Scope").
  • pipenv projects: pipenv install is not classified by venv-guard.sh (pipenv is out of scope). Users on pipenv must add a project-level deny override if they want the guard to fire.
  • PIP_INDEX_URL env-var bypass: env-var-based index redirection is known but out of scope for the deny rules (they target CLI flags). A maliciously-set env var can redirect installs — review shell rc files with grep PIP_ ~/.bashrc ~/.zshrc ~/.profile.
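The grep in the last bullet covers shell rc files; the live environment can be checked the same way. A small sketch covering the two index variables named in this document:

```shell
# check_pip_env: report pip index environment variables that silently
# redirect installs to another registry (a dependency-confusion vector).
# Prints nothing when neither variable is set.
check_pip_env() {
  for pair in "PIP_INDEX_URL=${PIP_INDEX_URL:-}" \
              "PIP_EXTRA_INDEX_URL=${PIP_EXTRA_INDEX_URL:-}"; do
    case $pair in
      *=) ;;  # unset or empty: nothing to report
      *)  echo "WARNING: $pair overrides the default index" ;;
    esac
  done
}

check_pip_env  # silent in a clean shell
```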

Examples

Scan before merging a PR

python -m venv .venv
.venv/bin/pip install --require-hashes -r requirements.txt
.venv/bin/pip install pip-audit
.venv/bin/pip-audit --strict
# Exit 0 → clean; non-zero → block merge

Evaluate a new package (httpx as example)

# 1. Project home
open https://pypi.org/project/httpx/

# 2. Recent releases
.venv/bin/pip index versions httpx

# 3. CVE history
curl -s -X POST https://api.osv.dev/v1/query \
  -H 'Content-Type: application/json' \
  -d '{"package":{"name":"httpx","ecosystem":"PyPI"}}' | jq '.vulns | length'

# 4. Download stats (requires pipx-installed pypistats)
pipx run pypistats recent httpx

Pin + install cleanly

python -m venv .venv
source .venv/bin/activate
uv pip compile pyproject.toml -o requirements.txt --generate-hashes
pip install --require-hashes -r requirements.txt

Respond to a pip-audit finding

.venv/bin/pip-audit
# ... reports CVE-2024-XXXXX in requests 2.31.0 (fixed in 2.32.0)

# Update the pin
uv pip compile pyproject.toml -o requirements.txt --generate-hashes

# Reinstall with new hashes
.venv/bin/pip install --require-hashes -r requirements.txt

# Re-audit
.venv/bin/pip-audit

Anti-patterns

  • curl <url> | sh installers for Python tools — no hash verification, no audit trail, no reproducibility. Use pipx or the venv pattern.
  • pip install --break-system-packages — bypasses PEP 668 and pollutes the system interpreter. Blocked by SPEC-115 deny rules.
  • Unpinned requirements.txt — requests without a version is a ticking supply-chain bomb. Use uv pip compile to pin everything.
  • sudo pip install — writes to the system Python, breaks package manager integrity, and is never the right answer. Use a venv.
  • Blanket upgrades (pip install -U across everything) — unbounded updates in CI bypass every pin. Bumps must be deliberate.
  • Adding --index-url https://some-mirror/simple — silently redirects trust. If you need a mirror, configure it project-wide with review. --extra-index-url is worse: public PyPI still wins on package-name collisions (dependency confusion).
  • Ignoring pip-audit findings "for now" — findings rot. Document accepted risk in writing with a review date, or fix it.

Success Criteria

  • pip-audit --strict exits 0 in CI
  • All production deps installed with --require-hashes
  • requirements.txt and pyproject.toml agree on pins (diff is empty)
  • No hard-coded --index-url or --extra-index-url in project scripts
  • Every added dependency has a documented rationale in the PR that introduced it