Compare commits


26 Commits
1.35.2...main

Author SHA1 Message Date
Mathijs van Veluw
9c7df6412c Fix apikey login (#6922)
The API key login needs some extra JSON return keys, the same as the password login.
Fixes #6912

Signed-off-by: BlackDex <black.dex@gmail.com>
2026-03-09 21:13:27 +01:00
Daniel
065c1f2cd5 Fix checkout action version (#6921)
- wasn't getting picked up when updating the action, because the pin comment was formatted as `#v6.0.0` instead of `# v6.0.0`
2026-03-09 19:35:14 +01:00
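Automated updaters that bump pinned actions (e.g. Dependabot or Renovate) parse the version comment that follows the commit SHA, and they generally expect a space after the `#`. A minimal sketch of the convention, using a hypothetical workflow step:

```yaml
# Pin the action to a full commit SHA for supply-chain safety, and keep the
# human-readable version in a trailing comment. Note the space after "#":
# updaters match "# vX.Y.Z", so a comment written as "#v6.0.0" is skipped.
- name: "Checkout"
  uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
```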
Daniel García
1a1d7f578a Support new desktop origin on CORS (#6920) 2026-03-09 19:14:28 +01:00
Mathijs van Veluw
2b16a05e54 Misc updates and fixes (#6910)
* Fix collection details response

Signed-off-by: BlackDex <black.dex@gmail.com>

* Misc updates and fixes

- Some clippy fixes
- Crate updates
- Updated Rust to v1.94.0
- Updated all GitHub Actions
- Updated web-vault v2026.2.0

Signed-off-by: BlackDex <black.dex@gmail.com>

* Remove commented out code

---------

Signed-off-by: BlackDex <black.dex@gmail.com>
Co-authored-by: Daniel García <dani-garcia@users.noreply.github.com>
2026-03-09 18:38:22 +01:00
phoeagon
c6e9948984 Add cxp-import-mobile and cxp-export-mobile: feature flags on mobile (#6853)
Co-authored-by: Daniel García <dani-garcia@users.noreply.github.com>
2026-03-09 18:21:23 +01:00
Timshel
ecdb18fcde Add 30s cache to SSO exchange_refresh_token (#6866)
Co-authored-by: Timshel <timshel@users.noreply.github.com>
2026-03-09 18:10:06 +01:00
pasarenicu
df25d316d6 Add Webauthn related origins flag to known flags. (#6900)
support pm-30529-webauthn-related-origins flag
2026-03-09 18:06:41 +01:00
Chris Kruger
747286dccd fix: add ForcePasswordReset to api key login (#6904) 2026-03-09 18:04:17 +01:00
DerPlayer2001
e60105411b Feat(config): add feature flag for Safari account switching (#6891)
This enables use of the feature from https://github.com/bitwarden/clients/pull/18339
2026-03-09 17:57:10 +01:00
Ken Watanabe
937857a0bc Merge commit from fork
* Fix WebAuthn backup flag update before signature verification

* fix format rust file

* Add test for migration update

* Remove webauthn test
2026-03-09 17:50:21 +01:00
Stefan Melmuk
ba55191676 apply policies only to confirmed members (#6892) 2026-03-04 06:58:39 +01:00
Mathijs van Veluw
c555f7d198 Misc organization fixes (#6867) 2026-02-23 21:52:44 +01:00
proofofcopilot
74819b95bd fix(send_invite): add orgSsoIdentifier if sso_only is enabled (#6824) 2026-02-23 20:28:12 +01:00
Stefan Melmuk
da2af3d362 hide remember 2fa token (#6852) 2026-02-23 20:27:40 +01:00
Mathijs van Veluw
1583fe4af3 Update Rust and Crates and GHA (#6843)
- Update Rust to v1.93.1
- Updated all the crates
  Adjusted the code as needed for the newer `rand` crate
- Updated GitHub Actions

Signed-off-by: BlackDex <black.dex@gmail.com>
2026-02-18 00:17:20 +01:00
Mathijs van Veluw
36f0620fd1 Fix org-details issue (#6811)
Fix an issue where users who were not eligible to access all org ciphers could still download and extract the encrypted contents.
Only Managers with full access, Admins, and Owners should be able to access this endpoint.

This change will block and prevent access for other users.

Signed-off-by: BlackDex <black.dex@gmail.com>
2026-02-10 20:34:30 +01:00
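The access rule described above (only Owners, Admins, and Managers with full access may read all org ciphers) can be sketched as a simple guard. The type and function names below are hypothetical, not vaultwarden's actual code:

```rust
/// Hypothetical membership roles, loosely modeled on the commit description.
#[derive(Debug)]
enum MemberRole {
    Owner,
    Admin,
    Manager { full_access: bool },
    User,
}

/// Only Owners, Admins, and Managers with full access may use org-details;
/// everyone else is blocked, preventing bulk export of encrypted ciphers.
fn can_access_org_details(role: &MemberRole) -> bool {
    match role {
        MemberRole::Owner | MemberRole::Admin => true,
        MemberRole::Manager { full_access } => *full_access,
        MemberRole::User => false,
    }
}

fn main() {
    assert!(can_access_org_details(&MemberRole::Owner));
    assert!(can_access_org_details(&MemberRole::Manager { full_access: true }));
    assert!(!can_access_org_details(&MemberRole::Manager { full_access: false }));
    assert!(!can_access_org_details(&MemberRole::User));
    println!("org-details guard: ok");
}
```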
Mathijs van Veluw
3cd2d4afe7 Update crates and web-vault (#6810)
Signed-off-by: BlackDex <black.dex@gmail.com>
2026-02-10 20:24:35 +01:00
Mathijs van Veluw
d09c45bb63 Misc updates, crates, rust, js, gha, vault (#6799) 2026-02-08 19:24:20 +01:00
Stefan Melmuk
feecfb20da fix error message for purging auth requests (#6776) 2026-02-01 22:35:55 +01:00
Timshel
347279a12c Empty AccountKeys when no private key (#6761)
Co-authored-by: Timshel <timshel@users.noreply.github.com>
2026-02-01 22:35:22 +01:00
Helmut K. C. Tessarek
7f65a254b3 refactor: improve tooltips in diagnostics page (#6765)
The term "seems to" is used too loosely in many of the tooltips, and in
these two instances it is simply the wrong wording. An update is either
available or not. If there is no update, one could argue that "seems to" is
valid, since the Internet connection could be down while checking for a new
version. But in this situation the update is available; it is impossible for
an update to merely seem available.
2026-02-01 22:35:03 +01:00
Mathijs van Veluw
cc80f689ed Update crates, web-vault, js, workflows (#6749)
- Updated all crates
- Updated web-vault to v2025.12.2
- Updated all JavaScript files
- Updated all GitHub Action Workflows
  Also added the `concurrency` option to all workflows.

Signed-off-by: BlackDex <black.dex@gmail.com>
2026-01-22 23:40:39 +01:00
Stefan Melmuk
4737192853 fix email as 2fa with auth requests (#6736)
* fix email as 2fa with auth requests

* increase expiry time of auth_requests to 15 minutes
2026-01-22 23:25:11 +01:00
Stefan Melmuk
0c6817cb4e hide password hints via CSS (#6726) 2026-01-18 15:25:20 +01:00
Stefan Melmuk
25a71d913f use email instead of empty name for webauthn (#6733)
* if empty, use email instead of name for webauthn

* use email as display name if name is empty
2026-01-18 15:23:21 +01:00
Mathijs van Veluw
b2cd556f3e Fix User API Key login (#6712)
When using the latest Bitwarden CLI and logging in with the API key, the client expects some extra fields, the same as for a normal login.
This PR adds those fields, so logging in via the API key is possible again.

Fixes #6709

Signed-off-by: BlackDex <black.dex@gmail.com>
2026-01-14 13:11:43 +01:00
44 changed files with 4709 additions and 6654 deletions

View File

@@ -372,6 +372,7 @@
 ## Note that clients cache the /api/config endpoint for about 1 hour and it could take some time before they are enabled or disabled!
 ##
 ## The following flags are available:
+## - "pm-5594-safari-account-switching": Enable account switching in Safari. (Needs Safari >=2026.2.0)
 ## - "inline-menu-positioning-improvements": Enable the use of inline menu password generator and identity suggestions in the browser extension.
 ## - "inline-menu-totp": Enable the use of inline menu TOTP codes in the browser extension.
 ## - "ssh-agent": Enable SSH agent support on Desktop. (Needs desktop >=2024.12.0)
@@ -381,6 +382,8 @@
 ## - "anon-addy-self-host-alias": Enable configuring self-hosted Anon Addy alias generator. (Needs Android >=2025.3.0, iOS >=2025.4.0)
 ## - "simple-login-self-host-alias": Enable configuring self-hosted Simple Login alias generator. (Needs Android >=2025.3.0, iOS >=2025.4.0)
 ## - "mutual-tls": Enable the use of mutual TLS on Android (Client >= 2025.2.0)
+## - "cxp-import-mobile": Enable the import via CXP on iOS (Clients >=2025.9.2)
+## - "cxp-export-mobile": Enable the export via CXP on iOS (Clients >=2025.9.2)
 # EXPERIMENTAL_CLIENT_FEATURE_FLAGS=fido2-vault-credentials
 ## Require new device emails. When a user logs in an email is required to be sent.

View File

@@ -1,6 +1,10 @@
 name: Build
 permissions: {}
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: true
 on:
   push:
     paths:
@@ -30,6 +34,10 @@ on:
       - "docker/DockerSettings.yaml"
       - "macros/**"
+defaults:
+  run:
+    shell: bash
 jobs:
   build:
     name: Build and Test ${{ matrix.channel }}
@@ -54,7 +62,7 @@ jobs:
       # Checkout the repo
       - name: "Checkout"
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
         with:
           persist-credentials: false
           fetch-depth: 0
@@ -63,7 +71,6 @@ jobs:
       # Determine rust-toolchain version
       - name: Init Variables
         id: toolchain
-        shell: bash
         env:
           CHANNEL: ${{ matrix.channel }}
         run: |
@@ -80,7 +87,7 @@ jobs:
       # Only install the clippy and rustfmt components on the default rust-toolchain
       - name: "Install rust-toolchain version"
-        uses: dtolnay/rust-toolchain@f7ccc83f9ed1e5b9c81d8a67d7ad1a747e22a561 # master @ Dec 16, 2025, 6:11 PM GMT+1
+        uses: dtolnay/rust-toolchain@efa25f7f19611383d5b0ccf2d1c8914531636bf9 # master @ Feb 13, 2026, 3:46 AM GMT+1
         if: ${{ matrix.channel == 'rust-toolchain' }}
         with:
           toolchain: "${{steps.toolchain.outputs.RUST_TOOLCHAIN}}"
@@ -90,7 +97,7 @@ jobs:
       # Install the any other channel to be used for which we do not execute clippy and rustfmt
       - name: "Install MSRV version"
-        uses: dtolnay/rust-toolchain@f7ccc83f9ed1e5b9c81d8a67d7ad1a747e22a561 # master @ Dec 16, 2025, 6:11 PM GMT+1
+        uses: dtolnay/rust-toolchain@efa25f7f19611383d5b0ccf2d1c8914531636bf9 # master @ Feb 13, 2026, 3:46 AM GMT+1
         if: ${{ matrix.channel != 'rust-toolchain' }}
         with:
           toolchain: "${{steps.toolchain.outputs.RUST_TOOLCHAIN}}"
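The `concurrency` block added across these workflows groups runs by workflow name plus pull-request number (falling back to the ref for plain pushes), so a newer push cancels a superseded in-progress run. A minimal standalone sketch of the pattern:

```yaml
# Group runs so that only one instance per workflow-and-PR (or ref) is active;
# when a newer run starts in the same group, the older one is cancelled.
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true
```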

View File

@@ -1,8 +1,16 @@
 name: Check templates
 permissions: {}
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: true
 on: [ push, pull_request ]
+defaults:
+  run:
+    shell: bash
 jobs:
   docker-templates:
     name: Validate docker templates
@@ -12,7 +20,7 @@ jobs:
     steps:
       # Checkout the repo
       - name: "Checkout"
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
         with:
           persist-credentials: false
       # End Checkout the repo

View File

@@ -1,8 +1,15 @@
 name: Hadolint
-on: [ push, pull_request ]
 permissions: {}
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: true
+on: [ push, pull_request ]
+defaults:
+  run:
+    shell: bash
 jobs:
   hadolint:
@@ -13,7 +20,7 @@ jobs:
     steps:
       # Start Docker Buildx
      - name: Setup Docker Buildx
-        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
+        uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
        # https://github.com/moby/buildkit/issues/3969
        # Also set max parallelism to 2, the default of 4 breaks GitHub Actions and causes OOMKills
        with:
@@ -25,7 +32,6 @@ jobs:
      # Download hadolint - https://github.com/hadolint/hadolint/releases
      - name: Download hadolint
-        shell: bash
        run: |
          sudo curl -L https://github.com/hadolint/hadolint/releases/download/v${HADOLINT_VERSION}/hadolint-$(uname -s)-$(uname -m) -o /usr/local/bin/hadolint && \
          sudo chmod +x /usr/local/bin/hadolint
@@ -34,20 +40,18 @@ jobs:
      # End Download hadolint
      # Checkout the repo
      - name: Checkout
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      # End Checkout the repo
      # Test Dockerfiles with hadolint
      - name: Run hadolint
-        shell: bash
        run: hadolint docker/Dockerfile.{debian,alpine}
      # End Test Dockerfiles with hadolint
      # Test Dockerfiles with docker build checks
      - name: Run docker build check
-        shell: bash
        run: |
          echo "Checking docker/Dockerfile.debian"
          docker build --check . -f docker/Dockerfile.debian

View File

@@ -1,6 +1,12 @@
 name: Release
 permissions: {}
+concurrency:
+  # Apply concurrency control only on the upstream repo
+  group: ${{ github.repository == 'dani-garcia/vaultwarden' && format('{0}-{1}', github.workflow, github.ref) || github.run_id }}
+  # Don't cancel other runs when creating a tag
+  cancel-in-progress: ${{ github.ref_type == 'branch' }}
 on:
   push:
     branches:
@@ -10,12 +16,6 @@ on:
       # https://docs.github.com/en/actions/writing-workflows/workflow-syntax-for-github-actions#filter-pattern-cheat-sheet
       - '[1-2].[0-9]+.[0-9]+'
-concurrency:
-  # Apply concurrency control only on the upstream repo
-  group: ${{ github.repository == 'dani-garcia/vaultwarden' && format('{0}-{1}', github.workflow, github.ref) || github.run_id }}
-  # Don't cancel other runs when creating a tag
-  cancel-in-progress: ${{ github.ref_type == 'branch' }}
 defaults:
   run:
     shell: bash
@@ -54,13 +54,13 @@ jobs:
     steps:
       - name: Initialize QEMU binfmt support
-        uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
+        uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
         with:
           platforms: "arm64,arm"
       # Start Docker Buildx
       - name: Setup Docker Buildx
-        uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
+        uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
         # https://github.com/moby/buildkit/issues/3969
         # Also set max parallelism to 2, the default of 4 breaks GitHub Actions and causes OOMKills
         with:
@@ -73,7 +73,7 @@ jobs:
       # Checkout the repo
       - name: Checkout
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
         # We need fetch-depth of 0 so we also get all the tag metadata
         with:
           persist-credentials: false
@@ -102,7 +102,7 @@ jobs:
       # Login to Docker Hub
       - name: Login to Docker Hub
-        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
+        uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
@@ -117,7 +117,7 @@ jobs:
       # Login to GitHub Container Registry
       - name: Login to GitHub Container Registry
-        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
+        uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
         with:
           registry: ghcr.io
           username: ${{ github.repository_owner }}
@@ -133,7 +133,7 @@ jobs:
       # Login to Quay.io
       - name: Login to Quay.io
-        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
+        uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
         with:
           registry: quay.io
           username: ${{ secrets.QUAY_USERNAME }}
@@ -181,7 +181,7 @@ jobs:
       - name: Bake ${{ matrix.base_image }} containers
         id: bake_vw
-        uses: docker/bake-action@5be5f02ff8819ecd3092ea6b2e6261c31774f2b4 # v6.10.0
+        uses: docker/bake-action@82490499d2e5613fcead7e128237ef0b0ea210f7 # v7.0.0
         env:
           BASE_TAGS: "${{ steps.determine-version.outputs.BASE_TAGS }}"
           SOURCE_COMMIT: "${{ env.SOURCE_COMMIT }}"
@@ -218,7 +218,7 @@ jobs:
           touch "${RUNNER_TEMP}/digests/${digest#sha256:}"
       - name: Upload digest
-        uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
+        uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
         with:
           name: digests-${{ env.NORMALIZED_ARCH }}-${{ matrix.base_image }}
           path: ${{ runner.temp }}/digests/*
@@ -233,12 +233,12 @@ jobs:
       # Upload artifacts to Github Actions and Attest the binaries
       - name: Attest binaries
-        uses: actions/attest-build-provenance@00014ed6ed5efc5b1ab7f7f34a39eb55d41aa4f8 # v3.1.0
+        uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v4.1.0
         with:
           subject-path: vaultwarden-${{ env.NORMALIZED_ARCH }}
       - name: Upload binaries as artifacts
-        uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
+        uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
         with:
           name: vaultwarden-${{ env.SOURCE_VERSION }}-linux-${{ env.NORMALIZED_ARCH }}-${{ matrix.base_image }}
           path: vaultwarden-${{ env.NORMALIZED_ARCH }}
@@ -257,7 +257,7 @@ jobs:
     steps:
       - name: Download digests
-        uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
+        uses: actions/download-artifact@70fc10c6e5e1ce46ad2ea6f2b72d43f7d47b13c3 # v8.0.0
         with:
           path: ${{ runner.temp }}/digests
           pattern: digests-*-${{ matrix.base_image }}
@@ -265,7 +265,7 @@ jobs:
       # Login to Docker Hub
       - name: Login to Docker Hub
-        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
+        uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
@@ -280,7 +280,7 @@ jobs:
       # Login to GitHub Container Registry
       - name: Login to GitHub Container Registry
-        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
+        uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
         with:
           registry: ghcr.io
           username: ${{ github.repository_owner }}
@@ -296,7 +296,7 @@ jobs:
       # Login to Quay.io
       - name: Login to Quay.io
-        uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
+        uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
         with:
           registry: quay.io
           username: ${{ secrets.QUAY_USERNAME }}
@@ -358,7 +358,7 @@ jobs:
       # Attest container images
       - name: Attest - docker.io - ${{ matrix.base_image }}
         if: ${{ env.HAVE_DOCKERHUB_LOGIN == 'true' && env.DIGEST_SHA != ''}}
-        uses: actions/attest-build-provenance@00014ed6ed5efc5b1ab7f7f34a39eb55d41aa4f8 # v3.1.0
+        uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v4.1.0
         with:
           subject-name: ${{ vars.DOCKERHUB_REPO }}
           subject-digest: ${{ env.DIGEST_SHA }}
@@ -366,7 +366,7 @@ jobs:
       - name: Attest - ghcr.io - ${{ matrix.base_image }}
         if: ${{ env.HAVE_GHCR_LOGIN == 'true' && env.DIGEST_SHA != ''}}
-        uses: actions/attest-build-provenance@00014ed6ed5efc5b1ab7f7f34a39eb55d41aa4f8 # v3.1.0
+        uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v4.1.0
         with:
           subject-name: ${{ vars.GHCR_REPO }}
           subject-digest: ${{ env.DIGEST_SHA }}
@@ -374,7 +374,7 @@ jobs:
       - name: Attest - quay.io - ${{ matrix.base_image }}
         if: ${{ env.HAVE_QUAY_LOGIN == 'true' && env.DIGEST_SHA != ''}}
-        uses: actions/attest-build-provenance@00014ed6ed5efc5b1ab7f7f34a39eb55d41aa4f8 # v3.1.0
+        uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v4.1.0
         with:
           subject-name: ${{ vars.QUAY_REPO }}
           subject-digest: ${{ env.DIGEST_SHA }}

View File

@@ -1,6 +1,10 @@
 name: Cleanup
 permissions: {}
+concurrency:
+  group: ${{ github.workflow }}
+  cancel-in-progress: false
 on:
   workflow_dispatch:
     inputs:

View File

@@ -1,6 +1,10 @@
 name: Trivy
 permissions: {}
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: true
 on:
   push:
     branches:
@@ -29,12 +33,12 @@ jobs:
     steps:
       - name: Checkout code
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
         with:
           persist-credentials: false
       - name: Run Trivy vulnerability scanner
-        uses: aquasecurity/trivy-action@b6643a29fecd7f34b3597bc6acb0a98b03d33ff8 # 0.33.1
+        uses: aquasecurity/trivy-action@97e0b3872f55f89b95b2f65b3dbab56962816478 # 0.34.2
         env:
           TRIVY_DB_REPOSITORY: docker.io/aquasec/trivy-db:2,public.ecr.aws/aquasecurity/trivy-db:2,ghcr.io/aquasecurity/trivy-db:2
           TRIVY_JAVA_DB_REPOSITORY: docker.io/aquasec/trivy-java-db:1,public.ecr.aws/aquasecurity/trivy-java-db:1,ghcr.io/aquasecurity/trivy-java-db:1
@@ -46,6 +50,6 @@ jobs:
           severity: CRITICAL,HIGH
       - name: Upload Trivy scan results to GitHub Security tab
-        uses: github/codeql-action/upload-sarif@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
+        uses: github/codeql-action/upload-sarif@0d579ffd059c29b07949a3cce3983f0780820c98 # v4.32.6
         with:
           sarif_file: 'trivy-results.sarif'

View File

@@ -1,7 +1,11 @@
 name: Code Spell Checking
+permissions: {}
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: true
 on: [ push, pull_request ]
-permissions: {}
 jobs:
   typos:
@@ -12,11 +16,11 @@ jobs:
     steps:
       # Checkout the repo
       - name: Checkout
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
         with:
           persist-credentials: false
       # End Checkout the repo
       # When this version is updated, do not forget to update this in `.pre-commit-config.yaml` too
       - name: Spell Check Repo
-        uses: crate-ci/typos@1a319b54cc9e3b333fed6a5c88ba1a90324da514 # v1.40.1
+        uses: crate-ci/typos@631208b7aac2daa8b707f55e7331f9112b0e062d # v1.44.0

View File

@@ -1,4 +1,9 @@
 name: Security Analysis with zizmor
+permissions: {}
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: true
 on:
   push:
@@ -6,8 +11,6 @@ on:
   pull_request:
     branches: ["**"]
-permissions: {}
 jobs:
   zizmor:
     name: Run zizmor
@@ -16,12 +19,12 @@ jobs:
       security-events: write # To write the security report
     steps:
       - name: Checkout repository
-        uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 #v6.0.0
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
         with:
           persist-credentials: false
       - name: Run zizmor
-        uses: zizmorcore/zizmor-action@e639db99335bc9038abc0e066dfcd72e23d26fb4 # v0.3.0
+        uses: zizmorcore/zizmor-action@0dce2577a4760a2749d8cfb7a84b7d5585ebcb7d # v0.5.0
         with:
           # intentionally not scanning the entire repository,
           # since it contains integration tests.

View File

@@ -53,6 +53,6 @@ repos:
         - "cd docker && make"
   # When this version is updated, do not forget to update this in `.github/workflows/typos.yaml` too
   - repo: https://github.com/crate-ci/typos
-    rev: 1a319b54cc9e3b333fed6a5c88ba1a90324da514 # v1.40.1
+    rev: 631208b7aac2daa8b707f55e7331f9112b0e062d # v1.44.0
    hooks:
      - id: typos

Cargo.lock (generated): 1089 lines changed; file diff suppressed because it is too large.

View File

@@ -1,6 +1,6 @@
[workspace.package] [workspace.package]
edition = "2021" edition = "2021"
rust-version = "1.90.0" rust-version = "1.92.0"
license = "AGPL-3.0-only" license = "AGPL-3.0-only"
repository = "https://github.com/dani-garcia/vaultwarden" repository = "https://github.com/dani-garcia/vaultwarden"
publish = false publish = false
@@ -78,17 +78,17 @@ rmpv = "1.3.1" # MessagePack library
dashmap = "6.1.0" dashmap = "6.1.0"
# Async futures # Async futures
futures = "0.3.31" futures = "0.3.32"
tokio = { version = "1.48.0", features = ["rt-multi-thread", "fs", "io-util", "parking_lot", "time", "signal", "net"] } tokio = { version = "1.50.0", features = ["rt-multi-thread", "fs", "io-util", "parking_lot", "time", "signal", "net"] }
tokio-util = { version = "0.7.17", features = ["compat"]} tokio-util = { version = "0.7.18", features = ["compat"]}
# A generic serialization/deserialization framework # A generic serialization/deserialization framework
serde = { version = "1.0.228", features = ["derive"] } serde = { version = "1.0.228", features = ["derive"] }
serde_json = "1.0.148" serde_json = "1.0.149"
# A safe, extensible ORM and Query builder # A safe, extensible ORM and Query builder
# Currently pinned diesel to v2.3.3 as newer version break MySQL/MariaDB compatibility # Currently pinned diesel to v2.3.3 as newer version break MySQL/MariaDB compatibility
diesel = { version = "2.3.5", features = ["chrono", "r2d2", "numeric"] } diesel = { version = "2.3.6", features = ["chrono", "r2d2", "numeric"] }
diesel_migrations = "2.3.1" diesel_migrations = "2.3.1"
derive_more = { version = "2.1.1", features = ["from", "into", "as_ref", "deref", "display"] } derive_more = { version = "2.1.1", features = ["from", "into", "as_ref", "deref", "display"] }
@@ -98,26 +98,26 @@ diesel-derive-newtype = "2.1.2"
libsqlite3-sys = { version = "0.35.0", features = ["bundled"], optional = true } libsqlite3-sys = { version = "0.35.0", features = ["bundled"], optional = true }
# Crypto-related libraries # Crypto-related libraries
rand = "0.9.2" rand = "0.10.0"
ring = "0.17.14" ring = "0.17.14"
subtle = "2.6.1" subtle = "2.6.1"
# UUID generation # UUID generation
uuid = { version = "1.19.0", features = ["v4"] } uuid = { version = "1.22.0", features = ["v4"] }
# Date and time libraries # Date and time libraries
chrono = { version = "0.4.42", features = ["clock", "serde"], default-features = false } chrono = { version = "0.4.44", features = ["clock", "serde"], default-features = false }
chrono-tz = "0.10.4" chrono-tz = "0.10.4"
time = "0.3.44" time = "0.3.47"
# Job scheduler # Job scheduler
job_scheduler_ng = "2.4.0" job_scheduler_ng = "2.4.0"
# Data encoding library Hex/Base32/Base64 # Data encoding library Hex/Base32/Base64
data-encoding = "2.9.0" data-encoding = "2.10.0"
# JWT library # JWT library
jsonwebtoken = { version = "10.2.0", features = ["use_pem", "rust_crypto"], default-features = false } jsonwebtoken = { version = "10.3.0", features = ["use_pem", "rust_crypto"], default-features = false }
# TOTP library # TOTP library
totp-lite = "2.0.1" totp-lite = "2.0.1"
@@ -133,7 +133,7 @@ webauthn-rs-proto = "0.5.4"
webauthn-rs-core = "0.5.4" webauthn-rs-core = "0.5.4"
# Handling of URL's for WebAuthn and favicons # Handling of URL's for WebAuthn and favicons
url = "2.5.7" url = "2.5.8"
# Email libraries # Email libraries
lettre = { version = "0.11.19", features = ["smtp-transport", "sendmail-transport", "builder", "serde", "hostname", "tracing", "tokio1-rustls", "ring", "rustls-native-certs"], default-features = false } lettre = { version = "0.11.19", features = ["smtp-transport", "sendmail-transport", "builder", "serde", "hostname", "tracing", "tokio1-rustls", "ring", "rustls-native-certs"], default-features = false }
@@ -141,7 +141,7 @@ percent-encoding = "2.3.2" # URL encoding library used for URL's in the emails
 email_address = "0.2.9"
 # HTML Template library
-handlebars = { version = "6.3.2", features = ["dir_source"] }
+handlebars = { version = "6.4.0", features = ["dir_source"] }
 # HTTP client (Used for favicons, version check, DUO and HIBP API)
 reqwest = { version = "0.12.28", features = ["rustls-tls", "rustls-tls-native-roots", "stream", "json", "deflate", "gzip", "brotli", "zstd", "socks", "cookies", "charset", "http2", "system-proxy"], default-features = false}
@@ -149,17 +149,17 @@ hickory-resolver = "0.25.2"
 # Favicon extraction libraries
 html5gum = "0.8.3"
-regex = { version = "1.12.2", features = ["std", "perf", "unicode-perl"], default-features = false }
+regex = { version = "1.12.3", features = ["std", "perf", "unicode-perl"], default-features = false }
 data-url = "0.3.2"
-bytes = "1.11.0"
-svg-hush = "0.9.5"
+bytes = "1.11.1"
+svg-hush = "0.9.6"
 # Cache function results (Used for version check and favicon fetching)
 cached = { version = "0.56.0", features = ["async"] }
 # Used for custom short lived cookie jar during favicon extraction
 cookie = "0.18.1"
-cookie_store = "0.22.0"
+cookie_store = "0.22.1"
 # Used by U2F, JWT and PostgreSQL
 openssl = "0.10.75"
@@ -172,8 +172,8 @@ pastey = "0.2.1"
 governor = "0.10.4"
 # OIDC for SSO
-openidconnect = { version = "4.0.1", features = ["reqwest", "native-tls"] }
-mini-moka = "0.10.3"
+openidconnect = { version = "4.0.1", features = ["reqwest", "rustls-tls"] }
+moka = { version = "0.12.13", features = ["future"] }
 # Check client versions for specific features.
 semver = "1.0.27"
@@ -182,7 +182,7 @@ semver = "1.0.27"
 # Mainly used for the musl builds, since the default musl malloc is very slow
 mimalloc = { version = "0.1.48", features = ["secure"], default-features = false, optional = true }
-which = "8.0.0"
+which = "8.0.1"
 # Argon2 library with support for the PHC format
 argon2 = "0.5.3"
@@ -197,10 +197,10 @@ grass_compiler = { version = "0.13.4", default-features = false }
 opendal = { version = "0.55.0", features = ["services-fs"], default-features = false }
 # For retrieving AWS credentials, including temporary SSO credentials
-anyhow = { version = "1.0.100", optional = true }
-aws-config = { version = "1.8.12", features = ["behavior-version-latest", "rt-tokio", "credentials-process", "sso"], default-features = false, optional = true }
-aws-credential-types = { version = "1.2.11", optional = true }
-aws-smithy-runtime-api = { version = "1.9.3", optional = true }
+anyhow = { version = "1.0.102", optional = true }
+aws-config = { version = "1.8.15", features = ["behavior-version-latest", "rt-tokio", "credentials-process", "sso"], default-features = false, optional = true }
+aws-credential-types = { version = "1.2.14", optional = true }
+aws-smithy-runtime-api = { version = "1.11.6", optional = true }
 http = { version = "1.4.0", optional = true }
 reqsign = { version = "0.16.5", optional = true }
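The `mini-moka` → `moka` swap above backs the new 30-second cache on the SSO `exchange_refresh_token` path (see the commit list). As a rough illustration of the idea only — the actual code uses `moka`'s async `Cache` with a configured time-to-live, not this hypothetical type — a TTL cache can be sketched with the standard library alone:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Minimal TTL cache sketch: an entry is served only while it is
/// younger than `ttl`; after that `get` behaves as a miss.
struct TtlCache<K, V> {
    ttl: Duration,
    entries: HashMap<K, (Instant, V)>,
}

impl<K: std::hash::Hash + Eq, V: Clone> TtlCache<K, V> {
    fn new(ttl: Duration) -> Self {
        Self { ttl, entries: HashMap::new() }
    }

    fn insert(&mut self, key: K, value: V) {
        // Record the insertion time next to the value.
        self.entries.insert(key, (Instant::now(), value));
    }

    fn get(&self, key: &K) -> Option<V> {
        // Serve the value only if it has not expired yet.
        self.entries.get(key).and_then(|(at, v)| (at.elapsed() < self.ttl).then(|| v.clone()))
    }
}

fn main() {
    let mut cache: TtlCache<String, String> = TtlCache::new(Duration::from_secs(30));
    cache.insert("refresh_token_abc".into(), "access_token_xyz".into());
    assert_eq!(cache.get(&"refresh_token_abc".to_string()).as_deref(), Some("access_token_xyz"));
    assert_eq!(cache.get(&"missing".to_string()), None);
    println!("fresh entry served from cache");
}
```

Unlike this sketch, `moka` also evicts expired entries in the background, which matters for a long-running server process.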


@@ -1,11 +1,11 @@
 ---
-vault_version: "v2025.12.1+build.3"
-vault_image_digest: "sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42"
+vault_version: "v2026.2.0"
+vault_image_digest: "sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447"
 # Cross Compile Docker Helper Scripts v1.9.0
 # We use the linux/amd64 platform shell scripts since there is no difference between the different platform scripts
 # https://github.com/tonistiigi/xx | https://hub.docker.com/r/tonistiigi/xx/tags
 xx_image_digest: "sha256:c64defb9ed5a91eacb37f96ccc3d4cd72521c4bd18d5442905b95e2226b0e707"
-rust_version: 1.92.0 # Rust version to be used
+rust_version: 1.94.0 # Rust version to be used
 debian_version: trixie # Debian release name to be used
 alpine_version: "3.23" # Alpine version to be used
 # For which platforms/architectures will we try to build images


@@ -19,23 +19,23 @@
 # - From https://hub.docker.com/r/vaultwarden/web-vault/tags,
 #   click the tag name to view the digest of the image it currently points to.
 # - From the command line:
-#     $ docker pull docker.io/vaultwarden/web-vault:v2025.12.1_build.3
-#     $ docker image inspect --format "{{.RepoDigests}}" docker.io/vaultwarden/web-vault:v2025.12.1_build.3
-#     [docker.io/vaultwarden/web-vault@sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42]
+#     $ docker pull docker.io/vaultwarden/web-vault:v2026.2.0
+#     $ docker image inspect --format "{{.RepoDigests}}" docker.io/vaultwarden/web-vault:v2026.2.0
+#     [docker.io/vaultwarden/web-vault@sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447]
 #
 # - Conversely, to get the tag name from the digest:
-#     $ docker image inspect --format "{{.RepoTags}}" docker.io/vaultwarden/web-vault@sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42
-#     [docker.io/vaultwarden/web-vault:v2025.12.1_build.3]
+#     $ docker image inspect --format "{{.RepoTags}}" docker.io/vaultwarden/web-vault@sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447
+#     [docker.io/vaultwarden/web-vault:v2026.2.0]
 #
-FROM --platform=linux/amd64 docker.io/vaultwarden/web-vault@sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42 AS vault
+FROM --platform=linux/amd64 docker.io/vaultwarden/web-vault@sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447 AS vault

 ########################## ALPINE BUILD IMAGES ##########################
 ## NOTE: The Alpine Base Images do not support other platforms then linux/amd64 and linux/arm64
 ## And for Alpine we define all build images here, they will only be loaded when actually used
-FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:x86_64-musl-stable-1.92.0 AS build_amd64
-FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:aarch64-musl-stable-1.92.0 AS build_arm64
-FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:armv7-musleabihf-stable-1.92.0 AS build_armv7
-FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:arm-musleabi-stable-1.92.0 AS build_armv6
+FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:x86_64-musl-stable-1.94.0 AS build_amd64
+FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:aarch64-musl-stable-1.94.0 AS build_arm64
+FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:armv7-musleabihf-stable-1.94.0 AS build_armv7
+FROM --platform=$BUILDPLATFORM ghcr.io/blackdex/rust-musl:arm-musleabi-stable-1.94.0 AS build_armv6

 ########################## BUILD IMAGE ##########################
 # hadolint ignore=DL3006


@@ -19,15 +19,15 @@
 # - From https://hub.docker.com/r/vaultwarden/web-vault/tags,
 #   click the tag name to view the digest of the image it currently points to.
 # - From the command line:
-#     $ docker pull docker.io/vaultwarden/web-vault:v2025.12.1_build.3
-#     $ docker image inspect --format "{{.RepoDigests}}" docker.io/vaultwarden/web-vault:v2025.12.1_build.3
-#     [docker.io/vaultwarden/web-vault@sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42]
+#     $ docker pull docker.io/vaultwarden/web-vault:v2026.2.0
+#     $ docker image inspect --format "{{.RepoDigests}}" docker.io/vaultwarden/web-vault:v2026.2.0
+#     [docker.io/vaultwarden/web-vault@sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447]
 #
 # - Conversely, to get the tag name from the digest:
-#     $ docker image inspect --format "{{.RepoTags}}" docker.io/vaultwarden/web-vault@sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42
-#     [docker.io/vaultwarden/web-vault:v2025.12.1_build.3]
+#     $ docker image inspect --format "{{.RepoTags}}" docker.io/vaultwarden/web-vault@sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447
+#     [docker.io/vaultwarden/web-vault:v2026.2.0]
 #
-FROM --platform=linux/amd64 docker.io/vaultwarden/web-vault@sha256:bf5aa55dc7bcb99f85d2a88ff44d32cdc832e934a0603fe28e5c3f92904bad42 AS vault
+FROM --platform=linux/amd64 docker.io/vaultwarden/web-vault@sha256:37c8661fa59dcdfbd3baa8366b6e950ef292b15adfeff1f57812b075c1fd3447 AS vault

 ########################## Cross Compile Docker Helper Scripts ##########################
 ## We use the linux/amd64 no matter which Build Platform, since these are all bash scripts
@@ -36,7 +36,7 @@ FROM --platform=linux/amd64 docker.io/tonistiigi/xx@sha256:c64defb9ed5a91eacb37f
 ########################## BUILD IMAGE ##########################
 # hadolint ignore=DL3006
-FROM --platform=$BUILDPLATFORM docker.io/library/rust:1.92.0-slim-trixie AS build
+FROM --platform=$BUILDPLATFORM docker.io/library/rust:1.94.0-slim-trixie AS build
 COPY --from=xx / /
 ARG TARGETARCH
 ARG TARGETVARIANT


@@ -13,8 +13,8 @@ path = "src/lib.rs"
 proc-macro = true

 [dependencies]
-quote = "1.0.42"
-syn = "2.0.111"
+quote = "1.0.45"
+syn = "2.0.117"

 [lints]
 workspace = true


@@ -1,4 +1,4 @@
 [toolchain]
-channel = "1.92.0"
+channel = "1.94.0"
 components = [ "rustfmt", "clippy" ]
 profile = "minimal"


@@ -33,7 +33,6 @@ use rocket::{
 pub fn routes() -> Vec<rocket::Route> {
     routes![
-        register,
         profile,
         put_profile,
         post_profile,
@@ -168,11 +167,6 @@ async fn is_email_2fa_required(member_id: Option<MembershipId>, conn: &DbConn) -
     false
 }

-#[post("/accounts/register", data = "<data>")]
-async fn register(data: Json<RegisterData>, conn: DbConn) -> JsonResult {
-    _register(data, false, conn).await
-}
-
 pub async fn _register(data: Json<RegisterData>, email_verification: bool, conn: DbConn) -> JsonResult {
     let mut data: RegisterData = data.into_inner();
     let email = data.email.to_lowercase();
@@ -1199,10 +1193,9 @@ async fn password_hint(data: Json<PasswordHintData>, conn: DbConn) -> EmptyResul
         // There is still a timing side channel here in that the code
         // paths that send mail take noticeably longer than ones that
         // don't. Add a randomized sleep to mitigate this somewhat.
-        use rand::{rngs::SmallRng, Rng, SeedableRng};
-        let mut rng = SmallRng::from_os_rng();
-        let delta: i32 = 100;
-        let sleep_ms = (1_000 + rng.random_range(-delta..=delta)) as u64;
+        use rand::{rngs::SmallRng, RngExt};
+        let mut rng: SmallRng = rand::make_rng();
+        let sleep_ms = rng.random_range(900..=1100) as u64;
         tokio::time::sleep(tokio::time::Duration::from_millis(sleep_ms)).await;
         Ok(())
     } else {
@@ -1704,6 +1697,6 @@ pub async fn purge_auth_requests(pool: DbPool) {
     if let Ok(conn) = pool.get().await {
         AuthRequest::purge_expired_auth_requests(&conn).await;
     } else {
-        error!("Failed to get DB connection while purging trashed ciphers")
+        error!("Failed to get DB connection while purging auth requests")
     }
 }
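The password-hint hunk above replaces the `1_000 ± delta` arithmetic with a direct `900..=1100` range; both draw a uniformly distributed sleep from the same window to blur the timing difference between the mail-sending and non-mail code paths. A std-only sketch of that window mapping (`jitter_ms` is a hypothetical helper, not the rand-based code in the diff; the plain modulo carries a bias that is negligible for a window this small):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Map a raw random value into the inclusive window [base - delta, base + delta],
/// e.g. 900..=1100 for base = 1000, delta = 100.
fn jitter_ms(raw: u64, base: u64, delta: u64) -> u64 {
    let span = 2 * delta + 1; // number of distinct values in the window
    (base - delta) + raw % span
}

fn main() {
    // Crude entropy source for the sketch; the real code seeds a proper SmallRng.
    let raw = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().subsec_nanos() as u64;
    let sleep_ms = jitter_ms(raw, 1000, 100);
    assert!((900..=1100).contains(&sleep_ms));
    println!("would sleep {sleep_ms} ms");
}
```

The jitter only narrows the side channel; the constant 1-second base is what keeps both code paths in the same response-time ballpark.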


@@ -715,9 +715,13 @@ async fn put_cipher_partial(
     let data: PartialCipherData = data.into_inner();

     let Some(cipher) = Cipher::find_by_uuid(&cipher_id, &conn).await else {
-        err!("Cipher doesn't exist")
+        err!("Cipher does not exist")
     };

+    if !cipher.is_accessible_to_user(&headers.user.uuid, &conn).await {
+        err!("Cipher does not exist", "Cipher is not accessible for the current user")
+    }
+
     if let Some(ref folder_id) = data.folder_id {
         if Folder::find_by_uuid_and_user(folder_id, &headers.user.uuid, &conn).await.is_none() {
             err!("Invalid folder", "Folder does not exist or belongs to another user");


@@ -36,12 +36,9 @@ pub fn routes() -> Vec<Route> {
         get_org_collections_details,
         get_org_collection_detail,
         get_collection_users,
-        put_collection_users,
         put_organization,
         post_organization,
         post_organization_collections,
-        delete_organization_collection_member,
-        post_organization_collection_delete_member,
         post_bulk_access_collections,
         post_organization_collection_update,
         put_organization_collection_update,
@@ -64,28 +61,20 @@ pub fn routes() -> Vec<Route> {
         put_member,
         delete_member,
         bulk_delete_member,
-        post_delete_member,
         post_org_import,
         list_policies,
         list_policies_token,
         get_master_password_policy,
         get_policy,
         put_policy,
-        get_organization_tax,
+        put_policy_vnext,
         get_plans,
-        get_plans_all,
-        get_plans_tax_rates,
-        import,
         post_org_keys,
         get_organization_keys,
         get_organization_public_key,
         bulk_public_keys,
-        deactivate_member,
-        bulk_deactivate_members,
         revoke_member,
         bulk_revoke_members,
-        activate_member,
-        bulk_activate_members,
         restore_member,
         bulk_restore_members,
         get_groups,
@@ -100,10 +89,6 @@ pub fn routes() -> Vec<Route> {
         bulk_delete_groups,
         get_group_members,
         put_group_members,
-        get_user_groups,
-        post_user_groups,
-        put_user_groups,
-        delete_group_member,
         post_delete_group_member,
         put_reset_password_enrollment,
         get_reset_password_details,
@@ -380,6 +365,11 @@ async fn get_org_collections(org_id: OrganizationId, headers: ManagerHeadersLoos
     if org_id != headers.membership.org_uuid {
         err!("Organization not found", "Organization id's do not match");
     }

+    if !headers.membership.has_full_access() {
+        err_code!("Resource not found.", "User does not have full access", rocket::http::Status::NotFound.code);
+    }
+
     Ok(Json(json!({
         "data": _get_org_collections(&org_id, &conn).await,
         "object": "list",
@@ -392,7 +382,6 @@ async fn get_org_collections_details(org_id: OrganizationId, headers: ManagerHea
     if org_id != headers.membership.org_uuid {
         err!("Organization not found", "Organization id's do not match");
     }
-    let mut data = Vec::new();

     let Some(member) = Membership::find_by_user_and_org(&headers.user.uuid, &org_id, &conn).await else {
         err!("User is not part of organization")
@@ -406,7 +395,7 @@ async fn get_org_collections_details(org_id: OrganizationId, headers: ManagerHea
         Membership::find_confirmed_by_org(&org_id, &conn).await.into_iter().map(|m| (m.uuid, m.atype)).collect();

     // check if current user has full access to the organization (either directly or via any group)
-    let has_full_access_to_org = member.access_all
+    let has_full_access_to_org = member.has_full_access()
         || (CONFIG.org_groups_enabled() && GroupUser::has_full_access_by_member(&org_id, &member.uuid, &conn).await);

     // Get all admins, owners and managers who can manage/access all
@@ -424,6 +413,7 @@ async fn get_org_collections_details(org_id: OrganizationId, headers: ManagerHea
         })
         .collect();

+    let mut data = Vec::new();
     for col in Collection::find_by_organization(&org_id, &conn).await {
         // check whether the current user has access to the given collection
         let assigned = has_full_access_to_org
@@ -431,6 +421,11 @@ async fn get_org_collections_details(org_id: OrganizationId, headers: ManagerHea
             || (CONFIG.org_groups_enabled()
                 && GroupUser::has_access_to_collection_by_member(&col.uuid, &member.uuid, &conn).await);

+        // If the user is a manager, and is not assigned to this collection, skip this and continue with the next collection
+        if !assigned {
+            continue;
+        }
+
         // get the users assigned directly to the given collection
         let mut users: Vec<Value> = col_users
             .iter()
@@ -566,6 +561,10 @@ async fn post_bulk_access_collections(
             err!("Collection not found")
         };

+        if !collection.is_manageable_by_user(&headers.membership.user_uuid, &conn).await {
+            err!("Collection not found", "The current user isn't a manager for this collection")
+        }
+
         // update collection modification date
         collection.save(&conn).await?;
@@ -682,43 +681,6 @@ async fn post_organization_collection_update(
     Ok(Json(collection.to_json_details(&headers.user.uuid, None, &conn).await))
 }

-#[delete("/organizations/<org_id>/collections/<col_id>/user/<member_id>")]
-async fn delete_organization_collection_member(
-    org_id: OrganizationId,
-    col_id: CollectionId,
-    member_id: MembershipId,
-    headers: AdminHeaders,
-    conn: DbConn,
-) -> EmptyResult {
-    if org_id != headers.org_id {
-        err!("Organization not found", "Organization id's do not match");
-    }
-
-    let Some(collection) = Collection::find_by_uuid_and_org(&col_id, &org_id, &conn).await else {
-        err!("Collection not found", "Collection does not exist or does not belong to this organization")
-    };
-
-    match Membership::find_by_uuid_and_org(&member_id, &org_id, &conn).await {
-        None => err!("User not found in organization"),
-        Some(member) => {
-            match CollectionUser::find_by_collection_and_user(&collection.uuid, &member.user_uuid, &conn).await {
-                None => err!("User not assigned to collection"),
-                Some(col_user) => col_user.delete(&conn).await,
-            }
-        }
-    }
-}
-
-#[post("/organizations/<org_id>/collections/<col_id>/delete-user/<member_id>")]
-async fn post_organization_collection_delete_member(
-    org_id: OrganizationId,
-    col_id: CollectionId,
-    member_id: MembershipId,
-    headers: AdminHeaders,
-    conn: DbConn,
-) -> EmptyResult {
-    delete_organization_collection_member(org_id, col_id, member_id, headers, conn).await
-}
-
 async fn _delete_organization_collection(
     org_id: &OrganizationId,
     col_id: &CollectionId,
@@ -887,41 +849,6 @@ async fn get_collection_users(
     Ok(Json(json!(member_list)))
 }

-#[put("/organizations/<org_id>/collections/<col_id>/users", data = "<data>")]
-async fn put_collection_users(
-    org_id: OrganizationId,
-    col_id: CollectionId,
-    data: Json<Vec<CollectionMembershipData>>,
-    headers: ManagerHeaders,
-    conn: DbConn,
-) -> EmptyResult {
-    if org_id != headers.org_id {
-        err!("Organization not found", "Organization id's do not match");
-    }
-
-    // Get org and collection, check that collection is from org
-    if Collection::find_by_uuid_and_org(&col_id, &org_id, &conn).await.is_none() {
-        err!("Collection not found in Organization")
-    }
-
-    // Delete all the user-collections
-    CollectionUser::delete_all_by_collection(&col_id, &conn).await?;
-
-    // And then add all the received ones (except if the user has access_all)
-    for d in data.iter() {
-        let Some(user) = Membership::find_by_uuid_and_org(&d.id, &org_id, &conn).await else {
-            err!("User is not part of organization")
-        };
-
-        if user.access_all {
-            continue;
-        }
-
-        CollectionUser::save(&user.user_uuid, &col_id, d.read_only, d.hide_passwords, d.manage, &conn).await?;
-    }
-
-    Ok(())
-}
-
 #[derive(FromForm)]
 struct OrgIdData {
     #[field(name = "organizationId")]
@@ -929,11 +856,15 @@ struct OrgIdData {
 }

 #[get("/ciphers/organization-details?<data..>")]
-async fn get_org_details(data: OrgIdData, headers: OrgMemberHeaders, conn: DbConn) -> JsonResult {
+async fn get_org_details(data: OrgIdData, headers: ManagerHeadersLoose, conn: DbConn) -> JsonResult {
     if data.organization_id != headers.membership.org_uuid {
         err_code!("Resource not found.", "Organization id's do not match", rocket::http::Status::NotFound.code);
     }

+    if !headers.membership.has_full_access() {
+        err_code!("Resource not found.", "User does not have full access", rocket::http::Status::NotFound.code);
+    }
+
     Ok(Json(json!({
         "data": _get_org_details(&data.organization_id, &headers.host, &headers.user.uuid, &conn).await?,
         "object": "list",
@@ -1715,17 +1646,6 @@ async fn delete_member(
     _delete_member(&org_id, &member_id, &headers, &conn, &nt).await
 }

-#[post("/organizations/<org_id>/users/<member_id>/delete")]
-async fn post_delete_member(
-    org_id: OrganizationId,
-    member_id: MembershipId,
-    headers: AdminHeaders,
-    conn: DbConn,
-    nt: Notify<'_>,
-) -> EmptyResult {
-    _delete_member(&org_id, &member_id, &headers, &conn, &nt).await
-}
-
 async fn _delete_member(
     org_id: &OrganizationId,
     member_id: &MembershipId,
@@ -2178,14 +2098,26 @@ async fn put_policy(
     Ok(Json(policy.to_json()))
 }

-#[allow(unused_variables)]
-#[get("/organizations/<org_id>/tax")]
-fn get_organization_tax(org_id: OrganizationId, _headers: Headers) -> Json<Value> {
-    // Prevent a 404 error, which also causes Javascript errors.
-    // Upstream sends "Only allowed when not self hosted." As an error message.
-    // If we do the same it will also output this to the log, which is overkill.
-    // An empty list/data also works fine.
-    Json(_empty_data_json())
+#[derive(Deserialize)]
+struct PolicyDataVnext {
+    policy: PolicyData,
+    // Ignore metadata for now as we do not yet support this
+    // "metadata": {
+    //     "defaultUserCollectionName": "2.xx|xx==|xx="
+    // }
+}
+
+#[put("/organizations/<org_id>/policies/<pol_type>/vnext", data = "<data>")]
+async fn put_policy_vnext(
+    org_id: OrganizationId,
+    pol_type: i32,
+    data: Json<PolicyDataVnext>,
+    headers: AdminHeaders,
+    conn: DbConn,
+) -> JsonResult {
+    let data: PolicyDataVnext = data.into_inner();
+    let policy: PolicyData = data.policy;
+    put_policy(org_id, pol_type, Json(policy), headers, conn).await
 }

 #[get("/plans")]
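The new `/policies/<pol_type>/vnext` route above is a thin envelope handler: it deserializes the wrapper, discards the not-yet-supported metadata, and hands the inner `PolicyData` to the existing `put_policy`. Stripped of the real Rocket/serde machinery and using stand-in types, the delegation pattern looks like:

```rust
/// Stand-in for the real `PolicyData` payload.
#[derive(Debug, PartialEq)]
struct PolicyData {
    enabled: bool,
}

/// The vnext request wraps the legacy payload; its metadata is ignored for now.
struct PolicyDataVnext {
    policy: PolicyData,
}

/// Stand-in for the existing `put_policy` handler.
fn put_policy(pol_type: i32, data: PolicyData) -> (i32, PolicyData) {
    (pol_type, data)
}

/// New endpoint: unwrap the envelope and delegate to the legacy handler.
fn put_policy_vnext(pol_type: i32, data: PolicyDataVnext) -> (i32, PolicyData) {
    put_policy(pol_type, data.policy)
}

fn main() {
    let req = PolicyDataVnext { policy: PolicyData { enabled: true } };
    let (pol_type, policy) = put_policy_vnext(7, req);
    assert_eq!(pol_type, 7);
    assert_eq!(policy, PolicyData { enabled: true });
    println!("vnext request delegated to the legacy handler");
}
```

Keeping the new route as a wrapper means both API generations share one validation and persistence path, so future fixes to `put_policy` apply to both.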
@@ -2216,17 +2148,6 @@ fn get_plans() -> Json<Value> {
     }))
 }

-#[get("/plans/all")]
-fn get_plans_all() -> Json<Value> {
-    get_plans()
-}
-
-#[get("/plans/sales-tax-rates")]
-fn get_plans_tax_rates(_headers: Headers) -> Json<Value> {
-    // Prevent a 404 error, which also causes Javascript errors.
-    Json(_empty_data_json())
-}
-
 #[get("/organizations/<_org_id>/billing/metadata")]
 fn get_billing_metadata(_org_id: OrganizationId, _headers: Headers) -> Json<Value> {
     // Prevent a 404 error, which also causes Javascript errors.
@@ -2251,174 +2172,12 @@ fn _empty_data_json() -> Value {
     })
 }

-#[derive(Deserialize, Debug)]
-#[serde(rename_all = "camelCase")]
-struct OrgImportGroupData {
-    #[allow(dead_code)]
-    name: String, // "GroupName"
-    #[allow(dead_code)]
-    external_id: String, // "cn=GroupName,ou=Groups,dc=example,dc=com"
-    #[allow(dead_code)]
-    users: Vec<String>, // ["uid=user,ou=People,dc=example,dc=com"]
-}
-
-#[derive(Deserialize, Debug)]
-#[serde(rename_all = "camelCase")]
-struct OrgImportUserData {
-    email: String, // "user@maildomain.net"
-    #[allow(dead_code)]
-    external_id: String, // "uid=user,ou=People,dc=example,dc=com"
-    deleted: bool,
-}
-
-#[derive(Deserialize, Debug)]
-#[serde(rename_all = "camelCase")]
-struct OrgImportData {
-    #[allow(dead_code)]
-    groups: Vec<OrgImportGroupData>,
-    overwrite_existing: bool,
-    users: Vec<OrgImportUserData>,
-}
-
-/// This function seems to be deprecated
-/// It is only used with older directory connectors
-/// TODO: Cleanup Tech debt
-#[post("/organizations/<org_id>/import", data = "<data>")]
-async fn import(org_id: OrganizationId, data: Json<OrgImportData>, headers: Headers, conn: DbConn) -> EmptyResult {
-    let data = data.into_inner();
-
-    // TODO: Currently we aren't storing the externalId's anywhere, so we also don't have a way
-    // to differentiate between auto-imported users and manually added ones.
-    // This means that this endpoint can end up removing users that were added manually by an admin,
-    // as opposed to upstream which only removes auto-imported users.
-
-    // User needs to be admin or owner to use the Directory Connector
-    match Membership::find_by_user_and_org(&headers.user.uuid, &org_id, &conn).await {
-        Some(member) if member.atype >= MembershipType::Admin => { /* Okay, nothing to do */ }
-        Some(_) => err!("User has insufficient permissions to use Directory Connector"),
-        None => err!("User not part of organization"),
-    };
-
-    for user_data in &data.users {
-        if user_data.deleted {
-            // If user is marked for deletion and it exists, delete it
-            if let Some(member) = Membership::find_by_email_and_org(&user_data.email, &org_id, &conn).await {
-                log_event(
-                    EventType::OrganizationUserRemoved as i32,
-                    &member.uuid,
-                    &org_id,
-                    &headers.user.uuid,
-                    headers.device.atype,
-                    &headers.ip.ip,
-                    &conn,
-                )
-                .await;
-
-                member.delete(&conn).await?;
-            }
-        // If user is not part of the organization, but it exists
-        } else if Membership::find_by_email_and_org(&user_data.email, &org_id, &conn).await.is_none() {
-            if let Some(user) = User::find_by_mail(&user_data.email, &conn).await {
-                let member_status = if CONFIG.mail_enabled() {
-                    MembershipStatus::Invited as i32
-                } else {
-                    MembershipStatus::Accepted as i32 // Automatically mark user as accepted if no email invites
-                };
-
-                let mut new_member =
-                    Membership::new(user.uuid.clone(), org_id.clone(), Some(headers.user.email.clone()));
-                new_member.access_all = false;
-                new_member.atype = MembershipType::User as i32;
-                new_member.status = member_status;
-
-                if CONFIG.mail_enabled() {
-                    let org_name = match Organization::find_by_uuid(&org_id, &conn).await {
-                        Some(org) => org.name,
-                        None => err!("Error looking up organization"),
-                    };
-
-                    mail::send_invite(
-                        &user,
-                        org_id.clone(),
-                        new_member.uuid.clone(),
-                        &org_name,
-                        Some(headers.user.email.clone()),
-                    )
-                    .await?;
-                }
-
-                // Save the member after sending an email
-                // If sending fails the member will not be saved to the database, and will not result in the admin needing to reinvite the users manually
-                new_member.save(&conn).await?;
-
-                log_event(
-                    EventType::OrganizationUserInvited as i32,
-                    &new_member.uuid,
-                    &org_id,
-                    &headers.user.uuid,
-                    headers.device.atype,
-                    &headers.ip.ip,
-                    &conn,
-                )
-                .await;
-            }
-        }
-    }
-
-    // If this flag is enabled, any user that isn't provided in the Users list will be removed (by default they will be kept unless they have Deleted == true)
-    if data.overwrite_existing {
-        for member in Membership::find_by_org_and_type(&org_id, MembershipType::User, &conn).await {
-            if let Some(user_email) = User::find_by_uuid(&member.user_uuid, &conn).await.map(|u| u.email) {
-                if !data.users.iter().any(|u| u.email == user_email) {
-                    log_event(
-                        EventType::OrganizationUserRemoved as i32,
-                        &member.uuid,
-                        &org_id,
-                        &headers.user.uuid,
-                        headers.device.atype,
-                        &headers.ip.ip,
-                        &conn,
-                    )
-                    .await;
-
-                    member.delete(&conn).await?;
-                }
-            }
-        }
-    }
-
-    Ok(())
-}
-
-// Pre web-vault v2022.9.x endpoint
-#[put("/organizations/<org_id>/users/<member_id>/deactivate")]
-async fn deactivate_member(
-    org_id: OrganizationId,
-    member_id: MembershipId,
-    headers: AdminHeaders,
-    conn: DbConn,
-) -> EmptyResult {
-    _revoke_member(&org_id, &member_id, &headers, &conn).await
-}
 #[derive(Deserialize, Debug)]
 #[serde(rename_all = "camelCase")]
 struct BulkRevokeMembershipIds {
     ids: Option<Vec<MembershipId>>,
 }
// Pre web-vault v2022.9.x endpoint
#[put("/organizations/<org_id>/users/deactivate", data = "<data>")]
async fn bulk_deactivate_members(
org_id: OrganizationId,
data: Json<BulkRevokeMembershipIds>,
headers: AdminHeaders,
conn: DbConn,
) -> JsonResult {
bulk_revoke_members(org_id, data, headers, conn).await
}
#[put("/organizations/<org_id>/users/<member_id>/revoke")]
async fn revoke_member(
    org_id: OrganizationId,
@@ -2512,28 +2271,6 @@ async fn _revoke_member(
    Ok(())
}
// Pre web-vault v2022.9.x endpoint
#[put("/organizations/<org_id>/users/<member_id>/activate")]
async fn activate_member(
org_id: OrganizationId,
member_id: MembershipId,
headers: AdminHeaders,
conn: DbConn,
) -> EmptyResult {
_restore_member(&org_id, &member_id, &headers, &conn).await
}
// Pre web-vault v2022.9.x endpoint
#[put("/organizations/<org_id>/users/activate", data = "<data>")]
async fn bulk_activate_members(
org_id: OrganizationId,
data: Json<BulkMembershipIds>,
headers: AdminHeaders,
conn: DbConn,
) -> JsonResult {
bulk_restore_members(org_id, data, headers, conn).await
}
#[put("/organizations/<org_id>/users/<member_id>/restore")]
async fn restore_member(
    org_id: OrganizationId,
@@ -3002,88 +2739,6 @@ async fn put_group_members(
    Ok(())
}
#[get("/organizations/<org_id>/users/<member_id>/groups")]
async fn get_user_groups(
org_id: OrganizationId,
member_id: MembershipId,
headers: AdminHeaders,
conn: DbConn,
) -> JsonResult {
if org_id != headers.org_id {
err!("Organization not found", "Organization id's do not match");
}
if !CONFIG.org_groups_enabled() {
err!("Group support is disabled");
}
if Membership::find_by_uuid_and_org(&member_id, &org_id, &conn).await.is_none() {
err!("User could not be found!")
};
let user_groups: Vec<GroupId> =
GroupUser::find_by_member(&member_id, &conn).await.iter().map(|entry| entry.groups_uuid.clone()).collect();
Ok(Json(json!(user_groups)))
}
#[derive(Deserialize)]
#[serde(rename_all = "camelCase")]
struct OrganizationUserUpdateGroupsRequest {
group_ids: Vec<GroupId>,
}
#[post("/organizations/<org_id>/users/<member_id>/groups", data = "<data>")]
async fn post_user_groups(
org_id: OrganizationId,
member_id: MembershipId,
data: Json<OrganizationUserUpdateGroupsRequest>,
headers: AdminHeaders,
conn: DbConn,
) -> EmptyResult {
put_user_groups(org_id, member_id, data, headers, conn).await
}
#[put("/organizations/<org_id>/users/<member_id>/groups", data = "<data>")]
async fn put_user_groups(
org_id: OrganizationId,
member_id: MembershipId,
data: Json<OrganizationUserUpdateGroupsRequest>,
headers: AdminHeaders,
conn: DbConn,
) -> EmptyResult {
if org_id != headers.org_id {
err!("Organization not found", "Organization id's do not match");
}
if !CONFIG.org_groups_enabled() {
err!("Group support is disabled");
}
if Membership::find_by_uuid_and_org(&member_id, &org_id, &conn).await.is_none() {
err!("User could not be found or does not belong to the organization.");
}
GroupUser::delete_all_by_member(&member_id, &conn).await?;
let assigned_group_ids = data.into_inner();
for assigned_group_id in assigned_group_ids.group_ids {
let mut group_user = GroupUser::new(assigned_group_id.clone(), member_id.clone());
group_user.save(&conn).await?;
}
log_event(
EventType::OrganizationUserUpdatedGroups as i32,
&member_id,
&org_id,
&headers.user.uuid,
headers.device.atype,
&headers.ip.ip,
&conn,
)
.await;
Ok(())
}
#[post("/organizations/<org_id>/groups/<group_id>/delete-user/<member_id>")]
async fn post_delete_group_member(
    org_id: OrganizationId,
@@ -3091,17 +2746,6 @@ async fn post_delete_group_member(
    member_id: MembershipId,
    headers: AdminHeaders,
    conn: DbConn,
) -> EmptyResult {
delete_group_member(org_id, group_id, member_id, headers, conn).await
}
#[delete("/organizations/<org_id>/groups/<group_id>/users/<member_id>")]
async fn delete_group_member(
org_id: OrganizationId,
group_id: GroupId,
member_id: MembershipId,
headers: AdminHeaders,
conn: DbConn,
) -> EmptyResult {
    if org_id != headers.org_id {
        err!("Organization not found", "Organization id's do not match");
@@ -3207,7 +2851,7 @@ async fn put_reset_password(
    // Sending email before resetting password to ensure working email configuration and the resulting
    // user notification. Also this might add some protection against security flaws and misuse
-    if let Err(e) = mail::send_admin_reset_password(&user.email, &user.name, &org.name).await {
+    if let Err(e) = mail::send_admin_reset_password(&user.email, user.display_name(), &org.name).await {
        err!(format!("Error sending user reset password email: {e:#?}"));
    }

View File

@@ -7,10 +7,10 @@ use crate::{
        core::{log_user_event, two_factor::_generate_recover_code},
        EmptyResult, JsonResult, PasswordOrOtpData,
    },
-    auth::Headers,
+    auth::{ClientHeaders, Headers},
    crypto,
    db::{
-        models::{DeviceId, EventType, TwoFactor, TwoFactorType, User, UserId},
+        models::{AuthRequest, AuthRequestId, DeviceId, EventType, TwoFactor, TwoFactorType, User, UserId},
        DbConn,
    },
    error::{Error, MapResult},
@@ -30,35 +30,63 @@ struct SendEmailLoginData {
    email: Option<String>,
    #[serde(alias = "MasterPasswordHash")]
    master_password_hash: Option<String>,
+    auth_request_id: Option<AuthRequestId>,
+    auth_request_access_code: Option<String>,
}
/// User is trying to login and wants to use email 2FA.
/// Does not require Bearer token
#[post("/two-factor/send-email-login", data = "<data>")] // JsonResult
-async fn send_email_login(data: Json<SendEmailLoginData>, conn: DbConn) -> EmptyResult {
+async fn send_email_login(data: Json<SendEmailLoginData>, client_headers: ClientHeaders, conn: DbConn) -> EmptyResult {
    let data: SendEmailLoginData = data.into_inner();
    if !CONFIG._enable_email_2fa() {
        err!("Email 2FA is disabled")
    }
+    // Ratelimit the login
+    crate::ratelimit::check_limit_login(&client_headers.ip.ip)?;
    // Get the user
    let email = match &data.email {
        Some(email) if !email.is_empty() => Some(email),
        _ => None,
    };
-    let user = if let Some(email) = email {
-        let Some(master_password_hash) = &data.master_password_hash else {
-            err!("No password hash has been submitted.")
-        };
+    let master_password_hash = match &data.master_password_hash {
+        Some(password_hash) if !password_hash.is_empty() => Some(password_hash),
+        _ => None,
+    };
+    let auth_request_id = match &data.auth_request_id {
+        Some(auth_request_id) if !auth_request_id.is_empty() => Some(auth_request_id),
+        _ => None,
+    };
+    let user = if let Some(email) = email {
        let Some(user) = User::find_by_mail(email, &conn).await else {
            err!("Username or password is incorrect. Try again.")
        };
-        // Check password
-        if !user.check_valid_password(master_password_hash) {
-            err!("Username or password is incorrect. Try again.")
+        if let Some(master_password_hash) = master_password_hash {
+            // Check password
+            if !user.check_valid_password(master_password_hash) {
+                err!("Username or password is incorrect. Try again.")
+            }
+        } else if let Some(auth_request_id) = auth_request_id {
+            let Some(auth_request) = AuthRequest::find_by_uuid(auth_request_id, &conn).await else {
+                err!("AuthRequest doesn't exist", "User not found")
+            };
+            let Some(code) = &data.auth_request_access_code else {
+                err!("no auth request access code")
+            };
+            if auth_request.device_type != client_headers.device_type
+                || auth_request.request_ip != client_headers.ip.ip.to_string()
+                || !auth_request.check_access_code(code)
+            {
+                err!("AuthRequest doesn't exist", "Invalid device, IP or code")
+            }
+        } else {
+            err!("No password hash has been submitted.")
        }
        user

View File
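The diff above lets email 2FA accept either proof of the master password or an auth-request id plus access code. A simplified stand-alone sketch of that selection logic (plain string comparison for brevity; the real code uses dedicated checks like `check_access_code` and additionally verifies device type and request IP):

```rust
// Simplified stand-ins for the request data; not Vaultwarden's real types.
struct LoginProof<'a> {
    master_password_hash: Option<&'a str>,
    auth_request_access_code: Option<&'a str>,
}

// Accept either proof, preferring the password hash, and reject when neither
// is present — mirroring the if/else-if/else chain in the diff above.
fn verify(proof: &LoginProof, expected_hash: &str, expected_code: &str) -> Result<(), &'static str> {
    if let Some(hash) = proof.master_password_hash {
        if hash == expected_hash {
            Ok(())
        } else {
            Err("Username or password is incorrect. Try again.")
        }
    } else if let Some(code) = proof.auth_request_access_code {
        if code == expected_code {
            Ok(())
        } else {
            Err("AuthRequest doesn't exist")
        }
    } else {
        Err("No password hash has been submitted.")
    }
}

fn main() {
    let proof = LoginProof { master_password_hash: None, auth_request_access_code: Some("1234") };
    println!("{:?}", verify(&proof, "hash", "1234"));
}
```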

@@ -9,7 +9,7 @@ use crate::{
        core::{log_event, log_user_event},
        EmptyResult, JsonResult, PasswordOrOtpData,
    },
-    auth::{ClientHeaders, Headers},
+    auth::Headers,
    crypto,
    db::{
        models::{
@@ -35,7 +35,6 @@ pub fn routes() -> Vec<Route> {
    let mut routes = routes![
        get_twofactor,
        get_recover,
-        recover,
        disable_twofactor,
        disable_twofactor_put,
        get_device_verification_settings,
@@ -76,54 +75,6 @@ async fn get_recover(data: Json<PasswordOrOtpData>, headers: Headers, conn: DbCo
    })))
}
#[derive(Deserialize)]
#[serde(rename_all = "camelCase")]
struct RecoverTwoFactor {
master_password_hash: String,
email: String,
recovery_code: String,
}
#[post("/two-factor/recover", data = "<data>")]
async fn recover(data: Json<RecoverTwoFactor>, client_headers: ClientHeaders, conn: DbConn) -> JsonResult {
let data: RecoverTwoFactor = data.into_inner();
use crate::db::models::User;
// Get the user
let Some(mut user) = User::find_by_mail(&data.email, &conn).await else {
err!("Username or password is incorrect. Try again.")
};
// Check password
if !user.check_valid_password(&data.master_password_hash) {
err!("Username or password is incorrect. Try again.")
}
// Check if recovery code is correct
if !user.check_valid_recovery_code(&data.recovery_code) {
err!("Recovery code is incorrect. Try again.")
}
// Remove all twofactors from the user
TwoFactor::delete_all_by_user(&user.uuid, &conn).await?;
enforce_2fa_policy(&user, &user.uuid, client_headers.device_type, &client_headers.ip.ip, &conn).await?;
log_user_event(
EventType::UserRecovered2fa as i32,
&user.uuid,
client_headers.device_type,
&client_headers.ip.ip,
&conn,
)
.await;
// Remove the recovery code, not needed without twofactors
user.totp_recover = None;
user.save(&conn).await?;
Ok(Json(Value::Object(serde_json::Map::new())))
}
async fn _generate_recover_code(user: &mut User, conn: &DbConn) {
    if user.totp_recover.is_none() {
        let totp_recover = crypto::encode_random_bytes::<20>(&BASE32);

View File

@@ -144,7 +144,7 @@ async fn generate_webauthn_challenge(data: Json<PasswordOrOtpData>, headers: Hea
    let (mut challenge, state) = WEBAUTHN.start_passkey_registration(
        Uuid::from_str(&user.uuid).expect("Failed to parse UUID"), // Should never fail
        &user.email,
-        &user.name,
+        user.display_name(),
        Some(registrations),
    )?;
@@ -438,7 +438,7 @@ pub async fn validate_webauthn_login(user_id: &UserId, response: &str, conn: &Db
    // We need to check for and update the backup_eligible flag when needed.
    // Vaultwarden did not have knowledge of this flag prior to migrating to webauthn-rs v0.5.x
    // Because of this we check the flag at runtime and update the registrations and state when needed
-    check_and_update_backup_eligible(user_id, &rsp, &mut registrations, &mut state, conn).await?;
+    let backup_flags_updated = check_and_update_backup_eligible(&rsp, &mut registrations, &mut state)?;
    let authentication_result = WEBAUTHN.finish_passkey_authentication(&rsp, &state)?;
@@ -446,7 +446,8 @@ pub async fn validate_webauthn_login(user_id: &UserId, response: &str, conn: &Db
        if ct_eq(reg.credential.cred_id(), authentication_result.cred_id()) {
            // If the cred id matches and the credential is updated, Some(true) is returned
            // In those cases, update the record, else leave it alone
-            if reg.credential.update_credential(&authentication_result) == Some(true) {
+            let credential_updated = reg.credential.update_credential(&authentication_result) == Some(true);
+            if credential_updated || backup_flags_updated {
                TwoFactor::new(user_id.clone(), TwoFactorType::Webauthn, serde_json::to_string(&registrations)?)
                    .save(conn)
                    .await?;
@@ -463,13 +464,11 @@ pub async fn validate_webauthn_login(user_id: &UserId, response: &str, conn: &Db
    )
}
-async fn check_and_update_backup_eligible(
-    user_id: &UserId,
+fn check_and_update_backup_eligible(
    rsp: &PublicKeyCredential,
    registrations: &mut Vec<WebauthnRegistration>,
    state: &mut PasskeyAuthentication,
-    conn: &DbConn,
-) -> EmptyResult {
+) -> Result<bool, Error> {
    // The feature flags from the response
    // For details see: https://www.w3.org/TR/webauthn-3/#sctn-authenticator-data
    const FLAG_BACKUP_ELIGIBLE: u8 = 0b0000_1000;
@@ -486,16 +485,7 @@ async fn check_and_update_backup_eligible(
    let rsp_id = rsp.raw_id.as_slice();
    for reg in &mut *registrations {
        if ct_eq(reg.credential.cred_id().as_slice(), rsp_id) {
-            // Try to update the key, and if needed also update the database, before the actual state check is done
            if reg.set_backup_eligible(backup_eligible, backup_state) {
-                TwoFactor::new(
-                    user_id.clone(),
-                    TwoFactorType::Webauthn,
-                    serde_json::to_string(&registrations)?,
-                )
-                .save(conn)
-                .await?;
                // We also need to adjust the current state which holds the challenge used to start the authentication verification
                // Because Vaultwarden supports multiple keys, we need to loop through the deserialized state and check which key to update
                let mut raw_state = serde_json::to_value(&state)?;
@@ -517,11 +507,12 @@ async fn check_and_update_backup_eligible(
                    }
                    *state = serde_json::from_value(raw_state)?;
+                    return Ok(true);
                }
                break;
            }
        }
    }
-    Ok(())
+    Ok(false)
}
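The flag constants above address single bits of the WebAuthn authenticator-data flags byte: backup eligibility (BE) is bit 3 and backup state (BS) is bit 4 per the W3C spec linked in the code. A stand-alone sketch of extracting both (`FLAG_BACKUP_STATE` is assumed here by analogy with the constant shown in the diff):

```rust
// Bit positions from the WebAuthn Level 3 authenticator-data flags byte.
const FLAG_BACKUP_ELIGIBLE: u8 = 0b0000_1000; // BE, bit 3
const FLAG_BACKUP_STATE: u8 = 0b0001_0000; // BS, bit 4 (assumed name)

/// Return (backup_eligible, backup_state) for a raw flags byte.
fn backup_flags(flags: u8) -> (bool, bool) {
    (flags & FLAG_BACKUP_ELIGIBLE != 0, flags & FLAG_BACKUP_STATE != 0)
}

fn main() {
    // A credential that is backup eligible and currently backed up.
    println!("{:?}", backup_flags(0b0001_1000)); // (true, true)
}
```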

View File

@@ -513,13 +513,11 @@ fn parse_sizes(sizes: &str) -> (u16, u16) {
    if !sizes.is_empty() {
        match ICON_SIZE_REGEX.captures(sizes.trim()) {
-            None => {}
-            Some(dimensions) => {
-                if dimensions.len() >= 3 {
-                    width = dimensions[1].parse::<u16>().unwrap_or_default();
-                    height = dimensions[2].parse::<u16>().unwrap_or_default();
-                }
-            }
+            Some(dimensions) if dimensions.len() >= 3 => {
+                width = dimensions[1].parse::<u16>().unwrap_or_default();
+                height = dimensions[2].parse::<u16>().unwrap_or_default();
+            }
+            _ => {}
        }
    }
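The tightened match above parses an HTML icon `sizes` attribute such as `"32x32"`. The same parsing can be sketched without a regex, with unparsable input falling back to 0 just as `unwrap_or_default()` does (this simplified version ignores uppercase `X` and multi-size lists, which the real regex may handle differently):

```rust
/// Parse a "WxH" sizes string into (width, height), defaulting to (0, 0).
fn parse_sizes(sizes: &str) -> (u16, u16) {
    match sizes.trim().split_once('x') {
        Some((w, h)) => (w.parse().unwrap_or_default(), h.parse().unwrap_or_default()),
        None => (0, 0),
    }
}

fn main() {
    println!("{:?}", parse_sizes(" 32x32 ")); // (32, 32)
}
```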

View File

@@ -266,7 +266,7 @@ async fn _sso_login(
        Some((user, _)) if !user.enabled => {
            err!(
                "This user has been disabled",
-                format!("IP: {}. Username: {}.", ip.ip, user.name),
+                format!("IP: {}. Username: {}.", ip.ip, user.display_name()),
                ErrorEvent {
                    event: EventType::UserFailedLogIn
                }
@@ -482,14 +482,18 @@ async fn authenticated_response(
        Value::Null
    };
-    let account_keys = json!({
-        "publicKeyEncryptionKeyPair": {
-            "wrappedPrivateKey": user.private_key,
-            "publicKey": user.public_key,
-            "Object": "publicKeyEncryptionKeyPair"
-        },
-        "Object": "privateKeys"
-    });
+    let account_keys = if user.private_key.is_some() {
+        json!({
+            "publicKeyEncryptionKeyPair": {
+                "wrappedPrivateKey": user.private_key,
+                "publicKey": user.public_key,
+                "Object": "publicKeyEncryptionKeyPair"
+            },
+            "Object": "privateKeys"
+        })
+    } else {
+        Value::Null
+    };
    let mut result = json!({
        "access_token": auth_tokens.access_token(),
@@ -521,7 +525,7 @@ async fn authenticated_response(
        result["TwoFactorToken"] = Value::String(token);
    }
-    info!("User {} logged in successfully. IP: {}", &user.name, ip.ip);
+    info!("User {} logged in successfully. IP: {}", user.display_name(), ip.ip);
    Ok(Json(result))
}
@@ -610,6 +614,38 @@ async fn _user_api_key_login(
    info!("User {} logged in successfully via API key. IP: {}", user.email, ip.ip);
let has_master_password = !user.password_hash.is_empty();
let master_password_unlock = if has_master_password {
json!({
"Kdf": {
"KdfType": user.client_kdf_type,
"Iterations": user.client_kdf_iter,
"Memory": user.client_kdf_memory,
"Parallelism": user.client_kdf_parallelism
},
// This field is named inconsistently and will be removed and replaced by the "wrapped" variant in the apps.
// https://github.com/bitwarden/android/blob/release/2025.12-rc41/network/src/main/kotlin/com/bitwarden/network/model/MasterPasswordUnlockDataJson.kt#L22-L26
"MasterKeyEncryptedUserKey": user.akey,
"MasterKeyWrappedUserKey": user.akey,
"Salt": user.email
})
} else {
Value::Null
};
let account_keys = if user.private_key.is_some() {
json!({
"publicKeyEncryptionKeyPair": {
"wrappedPrivateKey": user.private_key,
"publicKey": user.public_key,
"Object": "publicKeyEncryptionKeyPair"
},
"Object": "privateKeys"
})
} else {
Value::Null
};
    // Note: No refresh_token is returned. The CLI just repeats the
    // client_credentials login flow when the existing token expires.
    let result = json!({
@@ -624,7 +660,14 @@ async fn _user_api_key_login(
        "KdfMemory": user.client_kdf_memory,
        "KdfParallelism": user.client_kdf_parallelism,
        "ResetMasterPassword": false, // TODO: according to official server seems something like: user.password_hash.is_empty(), but would need testing
+        "ForcePasswordReset": false,
        "scope": AuthMethod::UserApiKey.scope(),
+        "AccountKeys": account_keys,
+        "UserDecryptionOptions": {
+            "HasMasterPassword": has_master_password,
+            "MasterPasswordUnlock": master_password_unlock,
+            "Object": "userDecryptionOptions"
+        },
    });
    Ok(Json(result))
@@ -947,12 +990,11 @@ async fn register_verification_email(
    let user = User::find_by_mail(&data.email, &conn).await;
    if user.filter(|u| u.private_key.is_some()).is_some() {
        // There is still a timing side channel here in that the code
-        // paths that send mail take noticeably longer than ones that
-        // don't. Add a randomized sleep to mitigate this somewhat.
-        use rand::{rngs::SmallRng, Rng, SeedableRng};
-        let mut rng = SmallRng::from_os_rng();
-        let delta: i32 = 100;
-        let sleep_ms = (1_000 + rng.random_range(-delta..=delta)) as u64;
+        // paths that send mail take noticeably longer than ones that don't.
+        // Add a randomized sleep to mitigate this somewhat.
+        use rand::{rngs::SmallRng, RngExt};
+        let mut rng: SmallRng = rand::make_rng();
+        let sleep_ms = rng.random_range(900..=1100) as u64;
        tokio::time::sleep(tokio::time::Duration::from_millis(sleep_ms)).await;
    } else {
        mail::send_register_verify_email(&data.email, &token).await?;

View File
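The jitter code above sleeps for a uniformly random 900–1100 ms so the mail-sending and non-sending paths take similar time. A dependency-free sketch of the range computation, using clock nanoseconds as a stand-in jitter source (the real code draws from the `rand` crate, which gives better uniformity):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Return a sleep duration of roughly 1s with ±100ms of jitter, i.e. a value
/// in 900..=1100 ms. Clock-based jitter is illustrative only.
fn jittered_sleep_ms() -> u64 {
    let nanos = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().subsec_nanos() as u64;
    900 + nanos % 201 // 0..=200 offset on top of the 900ms floor
}

fn main() {
    println!("sleeping for {} ms", jittered_sleep_ms());
}
```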

@@ -60,11 +60,13 @@ fn vaultwarden_css() -> Cached<Css<String>> {
        "mail_2fa_enabled": CONFIG._enable_email_2fa(),
        "mail_enabled": CONFIG.mail_enabled(),
        "sends_allowed": CONFIG.sends_allowed(),
+        "remember_2fa_disabled": CONFIG.disable_2fa_remember(),
+        "password_hints_allowed": CONFIG.password_hints_allowed(),
        "signup_disabled": CONFIG.is_signup_disabled(),
        "sso_enabled": CONFIG.sso_enabled(),
        "sso_only": CONFIG.sso_enabled() && CONFIG.sso_only(),
-        "yubico_enabled": CONFIG._enable_yubico() && CONFIG.yubico_client_id().is_some() && CONFIG.yubico_secret_key().is_some(),
        "webauthn_2fa_supported": CONFIG.is_webauthn_2fa_supported(),
+        "yubico_enabled": CONFIG._enable_yubico() && CONFIG.yubico_client_id().is_some() && CONFIG.yubico_secret_key().is_some(),
    });
let scss = match CONFIG.render_template("scss/vaultwarden.scss", &css_options) { let scss = match CONFIG.render_template("scss/vaultwarden.scss", &css_options) {
@@ -238,8 +240,8 @@ pub fn static_files(filename: &str) -> Result<(ContentType, &'static [u8]), Erro
        "jdenticon-3.3.0.js" => Ok((ContentType::JavaScript, include_bytes!("../static/scripts/jdenticon-3.3.0.js"))),
        "datatables.js" => Ok((ContentType::JavaScript, include_bytes!("../static/scripts/datatables.js"))),
        "datatables.css" => Ok((ContentType::CSS, include_bytes!("../static/scripts/datatables.css"))),
-        "jquery-3.7.1.slim.js" => {
-            Ok((ContentType::JavaScript, include_bytes!("../static/scripts/jquery-3.7.1.slim.js")))
+        "jquery-4.0.0.slim.js" => {
+            Ok((ContentType::JavaScript, include_bytes!("../static/scripts/jquery-4.0.0.slim.js")))
        }
        _ => err!(format!("Static file not found: (unknown)")),
    }

View File

@@ -826,7 +826,7 @@ impl<'r> FromRequest<'r> for ManagerHeaders {
        _ => err_handler!("Error getting DB"),
    };
-    if !Collection::can_access_collection(&headers.membership, &col_id, &conn).await {
+    if !Collection::is_coll_manageable_by_user(&col_id, &headers.membership.user_uuid, &conn).await {
        err_handler!("The current user isn't a manager for this collection")
    }
}
@@ -908,8 +908,8 @@ impl ManagerHeaders {
        if uuid::Uuid::parse_str(col_id.as_ref()).is_err() {
            err!("Collection Id is malformed!");
        }
-        if !Collection::can_access_collection(&h.membership, col_id, conn).await {
-            err!("You don't have access to all collections!");
+        if !Collection::is_coll_manageable_by_user(col_id, &h.membership.user_uuid, conn).await {
+            err!("Collection not found", "The current user isn't a manager for this collection")
        }
    }

View File

@@ -1027,12 +1027,14 @@ fn validate_config(cfg: &ConfigItems) -> Result<(), Error> {
}
// Server (v2025.6.2): https://github.com/bitwarden/server/blob/d094be3267f2030bd0dc62106bc6871cf82682f5/src/Core/Constants.cs#L103
-// Client (web-v2025.6.1): https://github.com/bitwarden/clients/blob/747c2fd6a1c348a57a76e4a7de8128466ffd3c01/libs/common/src/enums/feature-flag.enum.ts#L12
+// Client (web-v2026.2.0): https://github.com/bitwarden/clients/blob/a2fefe804d8c9b4a56c42f9904512c5c5821e2f6/libs/common/src/enums/feature-flag.enum.ts#L12
// Android (v2025.6.0): https://github.com/bitwarden/android/blob/b5b022caaad33390c31b3021b2c1205925b0e1a2/app/src/main/kotlin/com/x8bit/bitwarden/data/platform/manager/model/FlagKey.kt#L22
// iOS (v2025.6.0): https://github.com/bitwarden/ios/blob/ff06d9c6cc8da89f78f37f376495800201d7261a/BitwardenShared/Core/Platform/Models/Enum/FeatureFlag.swift#L7
//
// NOTE: Move deprecated flags to the utils::parse_experimental_client_feature_flags() DEPRECATED_FLAGS const!
const KNOWN_FLAGS: &[&str] = &[
+    // Auth Team
+    "pm-5594-safari-account-switching",
    // Autofill Team
    "inline-menu-positioning-improvements",
    "inline-menu-totp",
@@ -1046,6 +1048,10 @@ fn validate_config(cfg: &ConfigItems) -> Result<(), Error> {
    "anon-addy-self-host-alias",
    "simple-login-self-host-alias",
    "mutual-tls",
+    "cxp-import-mobile",
+    "cxp-export-mobile",
+    // Webauthn Related Origins
+    "pm-30529-webauthn-related-origins",
];
let configured_flags = parse_experimental_client_feature_flags(&cfg.experimental_client_feature_flags);
let invalid_flags: Vec<_> = configured_flags.keys().filter(|flag| !KNOWN_FLAGS.contains(&flag.as_str())).collect();

View File

@@ -55,13 +55,13 @@ pub fn encode_random_bytes<const N: usize>(e: &Encoding) -> String {
/// Generates a random string over a specified alphabet.
pub fn get_random_string(alphabet: &[u8], num_chars: usize) -> String {
    // Ref: https://rust-lang-nursery.github.io/rust-cookbook/algorithms/randomness.html
-    use rand::Rng;
+    use rand::RngExt;
    let mut rng = rand::rng();
    (0..num_chars)
        .map(|_| {
            let i = rng.random_range(0..alphabet.len());
-            alphabet[i] as char
+            char::from(alphabet[i])
        })
        .collect()
}

View File

@@ -177,7 +177,9 @@ impl AuthRequest {
    }
    pub async fn purge_expired_auth_requests(conn: &DbConn) {
-        let expiry_time = Utc::now().naive_utc() - chrono::TimeDelta::try_minutes(5).unwrap(); //after 5 minutes, clients reject the request
+        // delete auth requests older than 15 minutes which is functionally equivalent to upstream:
+        // https://github.com/bitwarden/server/blob/f8ee2270409f7a13125cd414c450740af605a175/src/Sql/dbo/Auth/Stored%20Procedures/AuthRequest_DeleteIfExpired.sql
+        let expiry_time = Utc::now().naive_utc() - chrono::TimeDelta::try_minutes(15).unwrap();
        for auth_request in Self::find_created_before(&expiry_time, conn).await {
            auth_request.delete(conn).await.ok();
        }

View File
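With the new 15-minute retention above, the purge deletes any auth request created before `now - 15 min`. The comparison in miniature, using std's `SystemTime` instead of chrono's `NaiveDateTime`:

```rust
use std::time::{Duration, SystemTime, UNIX_EPOCH};

/// True if `created` falls before the 15-minute expiry cutoff.
fn is_expired(created: SystemTime, now: SystemTime) -> bool {
    created < now - Duration::from_secs(15 * 60)
}

fn main() {
    let now = UNIX_EPOCH + Duration::from_secs(3600);
    let old_request = UNIX_EPOCH + Duration::from_secs(3600 - 20 * 60);
    println!("{}", is_expired(old_request, now)); // true: 20 min > 15 min
}
```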

@@ -513,7 +513,8 @@ impl Collection {
    }}
}
-pub async fn is_manageable_by_user(&self, user_uuid: &UserId, conn: &DbConn) -> bool {
+pub async fn is_coll_manageable_by_user(uuid: &CollectionId, user_uuid: &UserId, conn: &DbConn) -> bool {
+    let uuid = uuid.to_string();
    let user_uuid = user_uuid.to_string();
    db_run! { conn: {
        collections::table
@@ -538,9 +539,9 @@ impl Collection {
                collections_groups::collections_uuid.eq(collections::uuid)
            )
        ))
-        .filter(collections::uuid.eq(&self.uuid))
+        .filter(collections::uuid.eq(&uuid))
        .filter(
-            users_collections::collection_uuid.eq(&self.uuid).and(users_collections::manage.eq(true)).or(// Directly accessed collection
+            users_collections::collection_uuid.eq(&uuid).and(users_collections::manage.eq(true)).or(// Directly accessed collection
            users_organizations::access_all.eq(true).or( // access_all in Organization
                users_organizations::atype.le(MembershipType::Admin as i32) // Org admin or owner
            )).or(
@@ -558,6 +559,10 @@ impl Collection {
        .unwrap_or(0) != 0
    }}
}
pub async fn is_manageable_by_user(&self, user_uuid: &UserId, conn: &DbConn) -> bool {
Self::is_coll_manageable_by_user(&self.uuid, user_uuid, conn).await
}
} }
/// Database methods /// Database methods

View File

@@ -269,7 +269,7 @@ impl OrgPolicy {
            continue;
        }
-        if let Some(user) = Membership::find_by_user_and_org(user_uuid, &policy.org_uuid, conn).await {
+        if let Some(user) = Membership::find_confirmed_by_user_and_org(user_uuid, &policy.org_uuid, conn).await {
            if user.atype < MembershipType::Admin {
                return true;
            }

View File

@@ -231,6 +231,15 @@ impl User {
    pub fn reset_stamp_exception(&mut self) {
        self.stamp_exception = None;
    }
pub fn display_name(&self) -> &str {
// default to email if name is empty
if !&self.name.is_empty() {
&self.name
} else {
&self.email
}
}
}
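The added `display_name` helper simply prefers the profile name and falls back to the email when the name is empty. In miniature, with a simplified `User` struct for illustration:

```rust
struct User {
    name: String,
    email: String,
}

impl User {
    /// Prefer the name; fall back to the email when the name is empty.
    fn display_name(&self) -> &str {
        if !self.name.is_empty() {
            &self.name
        } else {
            &self.email
        }
    }
}

fn main() {
    let u = User { name: String::new(), email: "user@example.com".into() };
    println!("{}", u.display_name()); // falls back to the email
}
```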
/// Database methods /// Database methods

View File

@@ -302,10 +302,10 @@ pub async fn send_invite(
             .append_pair("organizationUserId", &member_id)
             .append_pair("token", &invite_token);

-        if CONFIG.sso_enabled() {
-            query_params.append_pair("orgUserHasExistingUser", "false");
+        if CONFIG.sso_enabled() && CONFIG.sso_only() {
             query_params.append_pair("orgSsoIdentifier", &org_id);
-        } else if user.private_key.is_some() {
+        }
+        if user.private_key.is_some() {
             query_params.append_pair("orgUserHasExistingUser", "true");
         }
     }
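The invite-link change above splits one if/else into two independent checks: `orgSsoIdentifier` is now appended only when SSO is enforced (`sso_enabled && sso_only`), while `orgUserHasExistingUser=true` is appended whenever the invited user already has a key pair, regardless of SSO. A sketch of that decision table, using a plain `Vec<(String, String)>` as an illustrative stand-in for the `url` crate's query serializer:

```rust
// Illustrative model of the reworked send_invite query parameters.
// `invite_params` is a hypothetical helper, not a Vaultwarden function.
fn invite_params(
    sso_enabled: bool,
    sso_only: bool,
    has_private_key: bool,
    org_id: &str,
) -> Vec<(String, String)> {
    let mut query_params: Vec<(String, String)> = Vec::new();
    // Only advertise the SSO identifier when SSO is the sole login method.
    if sso_enabled && sso_only {
        query_params.push(("orgSsoIdentifier".into(), org_id.into()));
    }
    // Independently flag users that already completed enrollment.
    if has_private_key {
        query_params.push(("orgUserHasExistingUser".into(), "true".into()));
    }
    query_params
}

fn main() {
    // SSO optional: no SSO identifier, but an existing user is still flagged.
    let p = invite_params(true, false, true, "org-123");
    assert_eq!(p, vec![("orgUserHasExistingUser".to_string(), "true".to_string())]);
    // Fresh user, no SSO: no parameters at all.
    assert!(invite_params(false, false, false, "org-123").is_empty());
    println!("ok");
}
```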


@@ -1,6 +1,5 @@
 use std::{borrow::Cow, sync::LazyLock, time::Duration};

-use mini_moka::sync::Cache;
 use openidconnect::{core::*, reqwest, *};
 use regex::Regex;
 use url::Url;
@@ -13,9 +12,14 @@ use crate::{
 };

 static CLIENT_CACHE_KEY: LazyLock<String> = LazyLock::new(|| "sso-client".to_string());
-static CLIENT_CACHE: LazyLock<Cache<String, Client>> = LazyLock::new(|| {
-    Cache::builder().max_capacity(1).time_to_live(Duration::from_secs(CONFIG.sso_client_cache_expiration())).build()
+static CLIENT_CACHE: LazyLock<moka::sync::Cache<String, Client>> = LazyLock::new(|| {
+    moka::sync::Cache::builder()
+        .max_capacity(1)
+        .time_to_live(Duration::from_secs(CONFIG.sso_client_cache_expiration()))
+        .build()
 });
+
+static REFRESH_CACHE: LazyLock<moka::future::Cache<String, Result<RefreshTokenResponse, String>>> =
+    LazyLock::new(|| moka::future::Cache::builder().max_capacity(1000).time_to_live(Duration::from_secs(30)).build());

 /// OpenID Connect Core client.
 pub type CustomClient = openidconnect::Client<
@@ -38,6 +42,8 @@ pub type CustomClient = openidconnect::Client<
 EndpointSet,
 >;
+
+pub type RefreshTokenResponse = (Option<String>, String, Option<Duration>);

 #[derive(Clone)]
 pub struct Client {
     pub http_client: reqwest::Client,
@@ -231,23 +237,29 @@ impl Client {
         verifier
     }

-    pub async fn exchange_refresh_token(
-        refresh_token: String,
-    ) -> ApiResult<(Option<String>, String, Option<Duration>)> {
+    pub async fn exchange_refresh_token(refresh_token: String) -> ApiResult<RefreshTokenResponse> {
+        let client = Client::cached().await?;
+        REFRESH_CACHE
+            .get_with(refresh_token.clone(), async move { client._exchange_refresh_token(refresh_token).await })
+            .await
+            .map_err(Into::into)
+    }
+
+    async fn _exchange_refresh_token(&self, refresh_token: String) -> Result<RefreshTokenResponse, String> {
         let rt = RefreshToken::new(refresh_token);
-        let client = Client::cached().await?;
-        let token_response =
-            match client.core_client.exchange_refresh_token(&rt).request_async(&client.http_client).await {
-                Err(err) => err!(format!("Request to exchange_refresh_token endpoint failed: {:?}", err)),
-                Ok(token_response) => token_response,
-            };
-
-        Ok((
-            token_response.refresh_token().map(|token| token.secret().clone()),
-            token_response.access_token().secret().clone(),
-            token_response.expires_in(),
-        ))
+        match self.core_client.exchange_refresh_token(&rt).request_async(&self.http_client).await {
+            Err(err) => {
+                error!("Request to exchange_refresh_token endpoint failed: {err}");
+                Err(format!("Request to exchange_refresh_token endpoint failed: {err}"))
+            }
+            Ok(token_response) => Ok((
+                token_response.refresh_token().map(|token| token.secret().clone()),
+                token_response.access_token().secret().clone(),
+                token_response.expires_in(),
+            )),
+        }
     }
 }
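The `REFRESH_CACHE` above memoizes the result of a refresh-token exchange (success or error) for 30 seconds, so several clients presenting the same token in quick succession trigger only one round trip to the IdP. A minimal single-threaded stand-in for that behavior, using a `HashMap` plus `Instant` TTL instead of the moka crate (names and implementation are illustrative, not Vaultwarden's code):

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Toy TTL cache sketching the shape of moka's `Cache::get_with`:
// the first caller computes the value, later callers within the TTL reuse it.
struct TtlCache<V> {
    ttl: Duration,
    entries: HashMap<String, (Instant, V)>,
}

impl<V: Clone> TtlCache<V> {
    fn new(ttl: Duration) -> Self {
        Self { ttl, entries: HashMap::new() }
    }

    // Return the cached value if still fresh; otherwise run `init` once
    // and store its result (the real cache also stores Err results).
    fn get_with(&mut self, key: &str, init: impl FnOnce() -> V) -> V {
        if let Some((at, v)) = self.entries.get(key) {
            if at.elapsed() < self.ttl {
                return v.clone();
            }
        }
        let v = init();
        self.entries.insert(key.to_string(), (Instant::now(), v.clone()));
        v
    }
}

fn main() {
    let mut cache = TtlCache::new(Duration::from_secs(30));
    let mut idp_calls = 0;
    for _ in 0..3 {
        // Three "clients" refresh with the same token; only one upstream call.
        cache.get_with("refresh-token-abc", || {
            idp_calls += 1;
            "access-token".to_string()
        });
    }
    assert_eq!(idp_calls, 1);
    println!("ok");
}
```

Note that caching errors too (as the real `Result<_, String>` value type does) prevents a storm of failing retries against the IdP within the window.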


@@ -4,10 +4,10 @@
  *
  * To rebuild or modify this file with the latest versions of the included
  * software please visit:
- *     https://datatables.net/download/#bs5/dt-2.3.5
+ *     https://datatables.net/download/#bs5/dt-2.3.7
  *
  * Included libraries:
- *   DataTables 2.3.5
+ *   DataTables 2.3.7
  */
 :root {
@@ -88,42 +88,42 @@ table.dataTable thead > tr > th:active,
 table.dataTable thead > tr > td:active {
   outline: none;
 }
-table.dataTable thead > tr > th.dt-orderable-asc span.dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-asc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-orderable-asc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-ordering-asc span.dt-column-order:before {
+table.dataTable thead > tr > th.dt-orderable-asc .dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-asc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-orderable-asc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-ordering-asc .dt-column-order:before {
   position: absolute;
   display: block;
   bottom: 50%;
   content: "\25B2";
   content: "\25B2"/"";
 }
-table.dataTable thead > tr > th.dt-orderable-desc span.dt-column-order:after, table.dataTable thead > tr > th.dt-ordering-desc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-orderable-desc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-ordering-desc span.dt-column-order:after {
+table.dataTable thead > tr > th.dt-orderable-desc .dt-column-order:after, table.dataTable thead > tr > th.dt-ordering-desc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-orderable-desc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-ordering-desc .dt-column-order:after {
   position: absolute;
   display: block;
   top: 50%;
   content: "\25BC";
   content: "\25BC"/"";
 }
-table.dataTable thead > tr > th.dt-orderable-asc span.dt-column-order, table.dataTable thead > tr > th.dt-orderable-desc span.dt-column-order, table.dataTable thead > tr > th.dt-ordering-asc span.dt-column-order, table.dataTable thead > tr > th.dt-ordering-desc span.dt-column-order,
-table.dataTable thead > tr > td.dt-orderable-asc span.dt-column-order,
-table.dataTable thead > tr > td.dt-orderable-desc span.dt-column-order,
-table.dataTable thead > tr > td.dt-ordering-asc span.dt-column-order,
-table.dataTable thead > tr > td.dt-ordering-desc span.dt-column-order {
+table.dataTable thead > tr > th.dt-orderable-asc .dt-column-order, table.dataTable thead > tr > th.dt-orderable-desc .dt-column-order, table.dataTable thead > tr > th.dt-ordering-asc .dt-column-order, table.dataTable thead > tr > th.dt-ordering-desc .dt-column-order,
+table.dataTable thead > tr > td.dt-orderable-asc .dt-column-order,
+table.dataTable thead > tr > td.dt-orderable-desc .dt-column-order,
+table.dataTable thead > tr > td.dt-ordering-asc .dt-column-order,
+table.dataTable thead > tr > td.dt-ordering-desc .dt-column-order {
   position: relative;
   width: 12px;
-  height: 24px;
+  height: 20px;
 }
-table.dataTable thead > tr > th.dt-orderable-asc span.dt-column-order:before, table.dataTable thead > tr > th.dt-orderable-asc span.dt-column-order:after, table.dataTable thead > tr > th.dt-orderable-desc span.dt-column-order:before, table.dataTable thead > tr > th.dt-orderable-desc span.dt-column-order:after, table.dataTable thead > tr > th.dt-ordering-asc span.dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-asc span.dt-column-order:after, table.dataTable thead > tr > th.dt-ordering-desc span.dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-desc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-orderable-asc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-orderable-asc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-orderable-desc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-orderable-desc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-ordering-asc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-ordering-asc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-ordering-desc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-ordering-desc span.dt-column-order:after {
+table.dataTable thead > tr > th.dt-orderable-asc .dt-column-order:before, table.dataTable thead > tr > th.dt-orderable-asc .dt-column-order:after, table.dataTable thead > tr > th.dt-orderable-desc .dt-column-order:before, table.dataTable thead > tr > th.dt-orderable-desc .dt-column-order:after, table.dataTable thead > tr > th.dt-ordering-asc .dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-asc .dt-column-order:after, table.dataTable thead > tr > th.dt-ordering-desc .dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-desc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-orderable-asc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-orderable-asc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-orderable-desc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-orderable-desc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-ordering-asc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-ordering-asc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-ordering-desc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-ordering-desc .dt-column-order:after {
   left: 0;
   opacity: 0.125;
   line-height: 9px;
@@ -140,15 +140,15 @@ table.dataTable thead > tr > td.dt-orderable-desc:hover {
   outline: 2px solid rgba(0, 0, 0, 0.05);
   outline-offset: -2px;
 }
-table.dataTable thead > tr > th.dt-ordering-asc span.dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-desc span.dt-column-order:after,
-table.dataTable thead > tr > td.dt-ordering-asc span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-ordering-desc span.dt-column-order:after {
+table.dataTable thead > tr > th.dt-ordering-asc .dt-column-order:before, table.dataTable thead > tr > th.dt-ordering-desc .dt-column-order:after,
+table.dataTable thead > tr > td.dt-ordering-asc .dt-column-order:before,
+table.dataTable thead > tr > td.dt-ordering-desc .dt-column-order:after {
   opacity: 0.6;
 }
-table.dataTable thead > tr > th.dt-orderable-none:not(.dt-ordering-asc, .dt-ordering-desc) span.dt-column-order:empty, table.dataTable thead > tr > th.sorting_desc_disabled span.dt-column-order:after, table.dataTable thead > tr > th.sorting_asc_disabled span.dt-column-order:before,
-table.dataTable thead > tr > td.dt-orderable-none:not(.dt-ordering-asc, .dt-ordering-desc) span.dt-column-order:empty,
-table.dataTable thead > tr > td.sorting_desc_disabled span.dt-column-order:after,
-table.dataTable thead > tr > td.sorting_asc_disabled span.dt-column-order:before {
+table.dataTable thead > tr > th.dt-orderable-none:not(.dt-ordering-asc, .dt-ordering-desc) .dt-column-order:empty, table.dataTable thead > tr > th.sorting_desc_disabled .dt-column-order:after, table.dataTable thead > tr > th.sorting_asc_disabled .dt-column-order:before,
+table.dataTable thead > tr > td.dt-orderable-none:not(.dt-ordering-asc, .dt-ordering-desc) .dt-column-order:empty,
+table.dataTable thead > tr > td.sorting_desc_disabled .dt-column-order:after,
+table.dataTable thead > tr > td.sorting_asc_disabled .dt-column-order:before {
   display: none;
 }
 table.dataTable thead > tr > th:active,
@@ -169,24 +169,24 @@ table.dataTable tfoot > tr > td div.dt-column-footer {
   align-items: var(--dt-header-align-items);
   gap: 4px;
 }
-table.dataTable thead > tr > th div.dt-column-header span.dt-column-title,
-table.dataTable thead > tr > th div.dt-column-footer span.dt-column-title,
-table.dataTable thead > tr > td div.dt-column-header span.dt-column-title,
-table.dataTable thead > tr > td div.dt-column-footer span.dt-column-title,
-table.dataTable tfoot > tr > th div.dt-column-header span.dt-column-title,
-table.dataTable tfoot > tr > th div.dt-column-footer span.dt-column-title,
-table.dataTable tfoot > tr > td div.dt-column-header span.dt-column-title,
-table.dataTable tfoot > tr > td div.dt-column-footer span.dt-column-title {
+table.dataTable thead > tr > th div.dt-column-header .dt-column-title,
+table.dataTable thead > tr > th div.dt-column-footer .dt-column-title,
+table.dataTable thead > tr > td div.dt-column-header .dt-column-title,
+table.dataTable thead > tr > td div.dt-column-footer .dt-column-title,
+table.dataTable tfoot > tr > th div.dt-column-header .dt-column-title,
+table.dataTable tfoot > tr > th div.dt-column-footer .dt-column-title,
+table.dataTable tfoot > tr > td div.dt-column-header .dt-column-title,
+table.dataTable tfoot > tr > td div.dt-column-footer .dt-column-title {
   flex-grow: 1;
 }
-table.dataTable thead > tr > th div.dt-column-header span.dt-column-title:empty,
-table.dataTable thead > tr > th div.dt-column-footer span.dt-column-title:empty,
-table.dataTable thead > tr > td div.dt-column-header span.dt-column-title:empty,
-table.dataTable thead > tr > td div.dt-column-footer span.dt-column-title:empty,
-table.dataTable tfoot > tr > th div.dt-column-header span.dt-column-title:empty,
-table.dataTable tfoot > tr > th div.dt-column-footer span.dt-column-title:empty,
-table.dataTable tfoot > tr > td div.dt-column-header span.dt-column-title:empty,
-table.dataTable tfoot > tr > td div.dt-column-footer span.dt-column-title:empty {
+table.dataTable thead > tr > th div.dt-column-header .dt-column-title:empty,
+table.dataTable thead > tr > th div.dt-column-footer .dt-column-title:empty,
+table.dataTable thead > tr > td div.dt-column-header .dt-column-title:empty,
+table.dataTable thead > tr > td div.dt-column-footer .dt-column-title:empty,
+table.dataTable tfoot > tr > th div.dt-column-header .dt-column-title:empty,
+table.dataTable tfoot > tr > th div.dt-column-footer .dt-column-title:empty,
+table.dataTable tfoot > tr > td div.dt-column-header .dt-column-title:empty,
+table.dataTable tfoot > tr > td div.dt-column-footer .dt-column-title:empty {
   display: none;
 }
@@ -588,16 +588,16 @@ table.dataTable.table-sm > thead > tr td.dt-ordering-asc,
 table.dataTable.table-sm > thead > tr td.dt-ordering-desc {
   padding-right: 0.25rem;
 }
-table.dataTable.table-sm > thead > tr th.dt-orderable-asc span.dt-column-order, table.dataTable.table-sm > thead > tr th.dt-orderable-desc span.dt-column-order, table.dataTable.table-sm > thead > tr th.dt-ordering-asc span.dt-column-order, table.dataTable.table-sm > thead > tr th.dt-ordering-desc span.dt-column-order,
-table.dataTable.table-sm > thead > tr td.dt-orderable-asc span.dt-column-order,
-table.dataTable.table-sm > thead > tr td.dt-orderable-desc span.dt-column-order,
-table.dataTable.table-sm > thead > tr td.dt-ordering-asc span.dt-column-order,
-table.dataTable.table-sm > thead > tr td.dt-ordering-desc span.dt-column-order {
+table.dataTable.table-sm > thead > tr th.dt-orderable-asc .dt-column-order, table.dataTable.table-sm > thead > tr th.dt-orderable-desc .dt-column-order, table.dataTable.table-sm > thead > tr th.dt-ordering-asc .dt-column-order, table.dataTable.table-sm > thead > tr th.dt-ordering-desc .dt-column-order,
+table.dataTable.table-sm > thead > tr td.dt-orderable-asc .dt-column-order,
+table.dataTable.table-sm > thead > tr td.dt-orderable-desc .dt-column-order,
+table.dataTable.table-sm > thead > tr td.dt-ordering-asc .dt-column-order,
+table.dataTable.table-sm > thead > tr td.dt-ordering-desc .dt-column-order {
   right: 0.25rem;
 }
-table.dataTable.table-sm > thead > tr th.dt-type-date span.dt-column-order, table.dataTable.table-sm > thead > tr th.dt-type-numeric span.dt-column-order,
-table.dataTable.table-sm > thead > tr td.dt-type-date span.dt-column-order,
-table.dataTable.table-sm > thead > tr td.dt-type-numeric span.dt-column-order {
+table.dataTable.table-sm > thead > tr th.dt-type-date .dt-column-order, table.dataTable.table-sm > thead > tr th.dt-type-numeric .dt-column-order,
+table.dataTable.table-sm > thead > tr td.dt-type-date .dt-column-order,
+table.dataTable.table-sm > thead > tr td.dt-type-numeric .dt-column-order {
   left: 0.25rem;
 }
@@ -606,7 +606,8 @@ div.dt-scroll-head table.table-bordered {
 }
 div.table-responsive > div.dt-container > div.row {
-  margin: 0;
+  margin-left: 0;
+  margin-right: 0;
 }
 div.table-responsive > div.dt-container > div.row > div[class^=col-]:first-child {
   padding-left: 0;


@@ -4,13 +4,13 @@
  *
  * To rebuild or modify this file with the latest versions of the included
  * software please visit:
- *     https://datatables.net/download/#bs5/dt-2.3.5
+ *     https://datatables.net/download/#bs5/dt-2.3.7
  *
  * Included libraries:
- *   DataTables 2.3.5
+ *   DataTables 2.3.7
  */

-/*! DataTables 2.3.5
+/*! DataTables 2.3.7
  * © SpryMedia Ltd - datatables.net/license
  */
@@ -186,7 +186,7 @@
         "sDestroyWidth": $this[0].style.width,
         "sInstance": sId,
         "sTableId": sId,
-        colgroup: $('<colgroup>').prependTo(this),
+        colgroup: $('<colgroup>'),
         fastData: function (row, column, type) {
             return _fnGetCellData(oSettings, row, column, type);
         }
@@ -259,6 +259,7 @@
         "orderHandler",
         "titleRow",
         "typeDetect",
+        "columnTitleTag",
         [ "iCookieDuration", "iStateDuration" ], // backwards compat
         [ "oSearch", "oPreviousSearch" ],
         [ "aoSearchCols", "aoPreSearchCols" ],
@@ -423,7 +424,7 @@
     if ( oSettings.caption ) {
         if ( caption.length === 0 ) {
-            caption = $('<caption/>').appendTo( $this );
+            caption = $('<caption/>').prependTo( $this );
         }
         caption.html( oSettings.caption );
@@ -436,6 +437,14 @@
     oSettings.captionNode = caption[0];
 }
+
+// Place the colgroup element in the correct location for the HTML structure
+if (caption.length) {
+    oSettings.colgroup.insertAfter(caption);
+}
+else {
+    oSettings.colgroup.prependTo(oSettings.nTable);
+}

 if ( thead.length === 0 ) {
     thead = $('<thead/>').appendTo($this);
 }
@@ -516,7 +525,7 @@
      *
      *  @type string
      */
-    builder: "bs5/dt-2.3.5",
+    builder: "bs5/dt-2.3.7",

     /**
      * Buttons. For use with the Buttons extension for DataTables. This is
@@ -1292,7 +1301,7 @@
     };

     // Replaceable function in api.util
-    var _stripHtml = function (input) {
+    var _stripHtml = function (input, replacement) {
         if (! input || typeof input !== 'string') {
             return input;
         }
@@ -1304,7 +1313,7 @@
         var previous;

-        input = input.replace(_re_html, ''); // Complete tags
+        input = input.replace(_re_html, replacement || ''); // Complete tags

         // Safety for incomplete script tag - use do / while to ensure that
         // we get all instances
@@ -1769,7 +1778,7 @@
         }
     },

-    stripHtml: function (mixed) {
+    stripHtml: function (mixed, replacement) {
         var type = typeof mixed;

         if (type === 'function') {
@@ -1777,7 +1786,7 @@
             return;
         }
         else if (type === 'string') {
-            return _stripHtml(mixed);
+            return _stripHtml(mixed, replacement);
         }
         return mixed;
     },
@@ -3379,7 +3388,7 @@
             colspan++;
         }

-        var titleSpan = $('span.dt-column-title', cell);
+        var titleSpan = $('.dt-column-title', cell);

         structure[row][column] = {
             cell: cell,
@@ -4093,8 +4102,8 @@
     }

     // Wrap the column title so we can write to it in future
-    if ( $('span.dt-column-title', cell).length === 0) {
-        $('<span>')
+    if ( $('.dt-column-title', cell).length === 0) {
+        $(document.createElement(settings.columnTitleTag))
             .addClass('dt-column-title')
             .append(cell.childNodes)
             .appendTo(cell);
@@ -4105,9 +4114,9 @@
         isHeader &&
         jqCell.filter(':not([data-dt-order=disable])').length !== 0 &&
         jqCell.parent(':not([data-dt-order=disable])').length !== 0 &&
-        $('span.dt-column-order', cell).length === 0
+        $('.dt-column-order', cell).length === 0
     ) {
-        $('<span>')
+        $(document.createElement(settings.columnTitleTag))
             .addClass('dt-column-order')
             .appendTo(cell);
     }
@@ -4116,7 +4125,7 @@
     // layout for those elements
     var headerFooter = isHeader ? 'header' : 'footer';

-    if ( $('span.dt-column-' + headerFooter, cell).length === 0) {
+    if ( $('div.dt-column-' + headerFooter, cell).length === 0) {
         $('<div>')
             .addClass('dt-column-' + headerFooter)
             .append(cell.childNodes)
@@ -4273,6 +4282,10 @@
     // Custom Ajax option to submit the parameters as a JSON string
     if (baseAjax.submitAs === 'json' && typeof data === 'object') {
         baseAjax.data = JSON.stringify(data);
+
+        if (!baseAjax.contentType) {
+            baseAjax.contentType = 'application/json; charset=utf-8';
+        }
     }

     if (typeof ajax === 'function') {
@@ -5531,7 +5544,7 @@
     var autoClass = _ext.type.className[column.sType];
     var padding = column.sContentPadding || (scrollX ? '-' : '');
     var text = longest + padding;
-    var insert = longest.indexOf('<') === -1
+    var insert = longest.indexOf('<') === -1 && longest.indexOf('&') === -1
         ? document.createTextNode(text)
         : text
@@ -5719,15 +5732,20 @@
             .replace(/id=".*?"/g, '')
             .replace(/name=".*?"/g, '');

-        var s = _stripHtml(cellString)
-            .replace( /&nbsp;/g, ' ' );
-
-        collection.push({
-            str: s,
-            len: s.length
-        });
-
-        allStrings.push(s);
+        // Don't want Javascript at all in these calculation cells.
+        cellString = cellString.replace(/<script.*?<\/script>/gi, ' ');
+
+        var noHtml = _stripHtml(cellString, ' ')
+            .replace( /&nbsp;/g, ' ' );
+
+        // The length is calculated on the text only, but we keep the HTML
+        // in the string so it can be used in the calculation table
+        collection.push({
+            str: cellString,
+            len: noHtml.length
+        });
+
+        allStrings.push(noHtml);
     }

     // Order and then cut down to the size we need
@@ -8782,7 +8800,7 @@
     // Automatic - find the _last_ unique cell from the top that is not empty (last for
     // backwards compatibility)
     for (var i=0 ; i<header.length ; i++) {
-        if (header[i][column].unique && $('span.dt-column-title', header[i][column].cell).text()) {
+        if (header[i][column].unique && $('.dt-column-title', header[i][column].cell).text()) {
             target = i;
         }
     }
@@ -8878,6 +8896,10 @@
         return null;
     }

+    if (col.responsiveVisible === false) {
+        return null;
+    }
+
     // Selector
     if (match[1]) {
         return $(nodes[idx]).filter(match[1]).length > 0 ? idx : null;
@@ -9089,7 +9111,7 @@
         title = undefined;
     }

-    var span = $('span.dt-column-title', this.column(column).header(row));
+    var span = $('.dt-column-title', this.column(column).header(row));

     if (title !== undefined) {
         span.html(title);
@@ -10263,8 +10285,8 @@
     // Needed for header and footer, so pulled into its own function
     function cleanHeader(node, className) {
-        $(node).find('span.dt-column-order').remove();
-        $(node).find('span.dt-column-title').each(function () {
+        $(node).find('.dt-column-order').remove();
+        $(node).find('.dt-column-title').each(function () {
             var title = $(this).html();
             $(this).parent().parent().append(title);
             $(this).remove();
@@ -10282,7 +10304,7 @@
      *  @type string
      *  @default Version number
      */
-    DataTable.version = "2.3.5";
+    DataTable.version = "2.3.7";

     /**
      * Private data store, containing all of the settings objects that are
@@ -11450,7 +11472,10 @@
     iDeferLoading: null,

     /** Event listeners */
-    on: null
+    on: null,
+
+    /** Title wrapper element type */
+    columnTitleTag: 'span'
 };

 _fnHungarianMap( DataTable.defaults );
@@ -12414,7 +12439,10 @@
     orderHandler: true,

     /** Title row indicator */
-    titleRow: null
+    titleRow: null,
+
+    /** Title wrapper element type */
+    columnTitleTag: 'span'
 };

 /**


@@ -8,7 +8,7 @@
 <dl class="row">
     <dt class="col-sm-5">Server Installed
         <span class="badge bg-success d-none abbr-badge" id="server-success" title="Latest version is installed.">Ok</span>
-        <span class="badge bg-warning text-dark d-none abbr-badge" id="server-warning" title="There seems to be an update available.">Update</span>
+        <span class="badge bg-warning text-dark d-none abbr-badge" id="server-warning" title="An update is available.">Update</span>
         <span class="badge bg-info text-dark d-none abbr-badge" id="server-branch" title="This is a branched version.">Branched</span>
     </dt>
     <dd class="col-sm-7">
@@ -23,8 +23,8 @@
 {{#if page_data.web_vault_enabled}}
     <dt class="col-sm-5">Web Installed
         <span class="badge bg-success d-none abbr-badge" id="web-success" title="Latest version is installed.">Ok</span>
-        <span class="badge bg-warning text-dark d-none abbr-badge" id="web-warning" title="There seems to be an update available.">Update</span>
-        <span class="badge bg-info text-dark d-none abbr-badge" id="web-prerelease" title="You seem to be using a pre-release version.">Pre-Release</span>
+        <span class="badge bg-warning text-dark d-none abbr-badge" id="web-warning" title="An update is available.">Update</span>
+        <span class="badge bg-info text-dark d-none abbr-badge" id="web-prerelease" title="You are using a pre-release version.">Pre-Release</span>
     </dt>
     <dd class="col-sm-7">
         <span id="web-installed">{{page_data.active_web_release}}</span>


@@ -59,7 +59,7 @@
 </main>

 <link rel="stylesheet" href="{{urlpath}}/vw_static/datatables.css" />
-<script src="{{urlpath}}/vw_static/jquery-3.7.1.slim.js"></script>
+<script src="{{urlpath}}/vw_static/jquery-4.0.0.slim.js"></script>
 <script src="{{urlpath}}/vw_static/datatables.js"></script>
 <script src="{{urlpath}}/vw_static/admin_organizations.js"></script>
 <script src="{{urlpath}}/vw_static/jdenticon-3.3.0.js"></script>


@@ -153,7 +153,7 @@
 </main>

 <link rel="stylesheet" href="{{urlpath}}/vw_static/datatables.css" />
-<script src="{{urlpath}}/vw_static/jquery-3.7.1.slim.js"></script>
+<script src="{{urlpath}}/vw_static/jquery-4.0.0.slim.js"></script>
 <script src="{{urlpath}}/vw_static/datatables.js"></script>
 <script src="{{urlpath}}/vw_static/admin_users.js"></script>
 <script src="{{urlpath}}/vw_static/jdenticon-3.3.0.js"></script>


@@ -158,6 +158,13 @@ app-root a[routerlink="/signup"] {
 {{/if}}
 {{/if}}

+{{#if remember_2fa_disabled}}
+/* Hide checkbox to remember 2FA token for 30 days */
+app-two-factor-auth > form > bit-form-control {
+    @extend %vw-hide;
+}
+{{/if}}

 {{#unless mail_2fa_enabled}}
 /* Hide `Email` 2FA if mail is not enabled */
 .providers-2fa-1 {
@@ -192,6 +199,19 @@ bit-nav-item[route="sends"] {
     @extend %vw-hide;
 }
 {{/unless}}

+{{#unless password_hints_allowed}}
+/* Hide password hints if not allowed */
+a[routerlink="/hint"],
+{{#if (webver "<2025.12.2")}}
+app-change-password > form > .form-group:nth-child(5),
+auth-input-password > form > bit-form-field:nth-child(4) {
+{{else}}
+.vw-password-hint {
+{{/if}}
+    @extend %vw-hide;
+}
+{{/unless}}

 /**** End Dynamic Vaultwarden Changes ****/

 /**** Include a special user stylesheet for custom changes ****/
 {{#if load_user_scss}}


@@ -153,9 +153,11 @@ impl Cors {
 fn get_allowed_origin(headers: &HeaderMap<'_>) -> Option<String> {
     let origin = Cors::get_header(headers, "Origin");
     let safari_extension_origin = "file://";
+    let desktop_custom_file_origin = "bw-desktop-file://bundle";

     if origin == CONFIG.domain_origin()
         || origin == safari_extension_origin
         || origin == desktop_custom_file_origin
         || (CONFIG.sso_enabled() && origin == CONFIG.sso_authority())
     {
         Some(origin)
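The CORS change above adds the desktop client's custom-scheme origin (`bw-desktop-file://bundle`) to the allowlist alongside the configured domain, the Safari extension's `file://` origin, and (when SSO is on) the SSO authority. A self-contained sketch of that check, with the config lookups replaced by plain parameters for illustration:

```rust
// Illustrative version of get_allowed_origin: config values become arguments.
fn get_allowed_origin(
    origin: &str,
    domain_origin: &str,
    sso_enabled: bool,
    sso_authority: &str,
) -> Option<String> {
    let safari_extension_origin = "file://";
    let desktop_custom_file_origin = "bw-desktop-file://bundle";

    if origin == domain_origin
        || origin == safari_extension_origin
        || origin == desktop_custom_file_origin
        || (sso_enabled && origin == sso_authority)
    {
        // Echo the matched origin back as Access-Control-Allow-Origin.
        Some(origin.to_string())
    } else {
        None
    }
}

fn main() {
    let domain = "https://vault.example.com";
    assert!(get_allowed_origin("bw-desktop-file://bundle", domain, false, "").is_some());
    assert!(get_allowed_origin(domain, domain, false, "").is_some());
    assert!(get_allowed_origin("https://evil.example", domain, false, "").is_none());
    println!("ok");
}
```

Echoing back only an exactly matched origin (rather than `*`) keeps credentialed requests restricted to the known clients.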