Compare commits: v4.2.0...test-apach (6 commits)

| SHA1 |
|---|
| f9749f64d8 |
| c0633ec3ac |
| c9edb49267 |
| 9cc15da90f |
| 69bb92d698 |
| bafa9e2eb0 |
@@ -1,8 +1,8 @@
# Configuring Azure Pipelines with Certbot

Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:

* `.azure-pipelines/main.yml` is the main one, executed on PRs for main and on pushes to main.
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and the nightly run for main.

Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common job configuration that can be reused in several pipelines.
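As a sketch of how that template reuse looks, a pipeline simply lists templates under `jobs:` (the file name `example.yml` and the trigger below are assumptions for illustration; the template path is one of the real files under `.azure-pipelines/templates`):

```yaml
# Hypothetical .azure-pipelines/example.yml
trigger:
- main

jobs:
# Pull in the common test jobs defined once under templates/.
- template: templates/tests-suite.yml
```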
@@ -64,17 +64,17 @@ Azure Pipeline needs RW on code, RO on metadata, RW on checks, commit statuses,
RW access here is required to allow updating the pipelines' YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines are triggered.
Note however that no admin access is defined here: this means that Azure Pipelines cannot do anything with
protected branches, like main, and cannot modify the security context around this on GitHub.
Access can be defined for all or only selected repositories, which is nice.
```

- Redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same as the targeted GitHub repo).
- The Visibility is public, to benefit from 10 parallel jobs.

```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in terms of being able to check it is valid), and the Resources shared between the GitHub account and Azure Pipelines.
```

_Done. We can move to pipelines configuration._
@@ -91,11 +91,11 @@ grant permissions from Azure Pipelines to GitHub in order to setup a GitHub OAut
then are way too large (admin level on almost everything), while the classic approach does not add any more
permissions, and works perfectly well.__

- Select GitHub in the "Select your repository" section, choose certbot/certbot in Repository, and main as the default branch.
- Click on the YAML option for "Select a template".
- Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the "YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`).
- Click "Save & queue", choose the main branch to build the first pipeline, and click the "Save and run" button.

_Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._
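For reference, a minimal pipeline definition matching the conventions described above might look like this (a sketch only; the file name and job content are assumptions, not part of the repository):

```yaml
# Hypothetical .azure-pipelines/test-pipeline.yml
# Runs on pushes to test-* branches, like the advanced pipeline.
trigger:
- test-*
pr: none

jobs:
- job: smoke
  pool:
    vmImage: ubuntu-22.04
  steps:
  - script: echo "pipeline wired up"
    displayName: Smoke test
```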
@@ -1,15 +0,0 @@
# Advanced pipeline for running our full test suite on demand.
trigger:
# When changing these triggers, please ensure the documentation under
# "Running tests in CI" is still correct.
- test-*
pr: none

variables:
  # We don't publish our Docker images in this pipeline, but when building them
  # for testing, let's use the nightly tag.
  dockerTag: nightly
  snapBuildTimeout: 5400

stages:
- template: templates/stages/test-and-package-stage.yml
`.azure-pipelines/advanced.yml` (new file, 20 lines)
@@ -0,0 +1,20 @@
# Advanced pipeline for isolated checks and release purpose
trigger:
- test-*
- '*.x'
pr:
- test-*
# This pipeline is also nightly run on master
schedules:
- cron: "0 4 * * *"
  displayName: Nightly build
  branches:
    include:
    - master
  always: true

jobs:
# Any addition here should be reflected in the release pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml
@@ -1,18 +1,12 @@
# We run the test suite on commits to main so codecov gets coverage data
# about the main branch and can use it to track coverage changes.
trigger:
- main
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
pr:
- main
- apache-parser-v2
- '*.x'

variables:
  # We set this here to avoid coverage data being uploaded from things like our
  # nightly pipeline. This is done because codecov (helpfully) keeps track of
  # the number of coverage uploads for a commit and displays a warning when
  # comparing two commits with an unequal number of uploads. Only uploading
  # coverage here should keep the number of uploads it sees consistent.
  uploadCoverage: true

jobs:
- template: templates/jobs/standard-tests-jobs.yml
@@ -1,19 +0,0 @@
# Nightly pipeline running each day for main.
trigger: none
pr: none
schedules:
- cron: "30 4 * * *"
  displayName: Nightly build
  branches:
    include:
    - main
  always: true

variables:
  dockerTag: nightly
  snapBuildTimeout: 19800

stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/nightly-deploy-stage.yml
@@ -1,16 +1,13 @@
# Release pipeline to run our full test suite, build artifacts, and deploy them
# for GitHub release tags.
trigger:
  tags:
    include:
    - v*
pr: none

variables:
  dockerTag: ${{variables['Build.SourceBranchName']}}
  snapBuildTimeout: 19800

stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/release-deploy-stage.yml
`.azure-pipelines/templates/changelog.yml` (new file, 14 lines)
@@ -0,0 +1,14 @@
jobs:
- job: changelog
  pool:
    vmImage: vs2017-win2016
  steps:
  - bash: |
      CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
      "${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
    displayName: Prepare changelog
  - task: PublishPipelineArtifact@1
    inputs:
      path: $(Build.ArtifactStagingDirectory)
      artifact: changelog
    displayName: Publish changelog
`.azure-pipelines/templates/installer-tests.yml` (new file, 54 lines)
@@ -0,0 +1,54 @@
jobs:
- job: installer_build
  pool:
    vmImage: vs2017-win2016
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: 3.7
      architecture: x86
      addToPath: true
  - script: python windows-installer/construct.py
    displayName: Build Certbot installer
  - task: CopyFiles@2
    inputs:
      sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
      contents: '*.exe'
      targetFolder: $(Build.ArtifactStagingDirectory)
  - task: PublishPipelineArtifact@1
    inputs:
      path: $(Build.ArtifactStagingDirectory)
      artifact: windows-installer
    displayName: Publish Windows installer
- job: installer_run
  dependsOn: installer_build
  strategy:
    matrix:
      win2019:
        imageName: windows-2019
      win2016:
        imageName: vs2017-win2016
      win2012r2:
        imageName: vs2015-win2012r2
  pool:
    vmImage: $(imageName)
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: windows-installer
      path: $(Build.SourcesDirectory)/bin
    displayName: Retrieve Windows installer
  - script: $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe /S
    displayName: Install Certbot
  - powershell: Invoke-WebRequest https://www.python.org/ftp/python/3.8.0/python-3.8.0-amd64-webinstall.exe -OutFile C:\py3-setup.exe
    displayName: Get Python
  - script: C:\py3-setup.exe /quiet PrependPath=1 InstallAllUsers=1 Include_launcher=1 InstallLauncherAllUsers=1 Include_test=0 Include_doc=0 Include_dev=1 Include_debug=0 Include_tcltk=0 TargetDir=C:\py3
    displayName: Install Python
  - script: |
      py -3 -m venv venv
      venv\Scripts\python tools\pip_install.py -e certbot-ci
    displayName: Prepare Certbot-CI
  - script: |
      set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
      venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
    displayName: Run integration tests
@@ -1,128 +0,0 @@
# As (somewhat) described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#context,
# each template only has access to the parameters passed into it. To help make
# use of this design, we define snapReleaseChannel without a default value
# which requires the user of this template to define it as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/parameters-name?view=azure-pipelines#remarks.
# This makes the user of this template be explicit while allowing them to
# define their own parameters with defaults that make sense for that context.
parameters:
- name: snapReleaseChannel
  type: string
  values:
  - edge
  - beta

jobs:
# This job relies on credentials used to publish the Certbot snaps. This
# credential file was created by running:
#
# snapcraft logout
# snapcraft export-login --channels=beta,edge snapcraft.cfg
# (provide the shared snapcraft credentials when prompted)
#
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file for use in the "nightly" and "release"
# pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#q-how-do-i-authorize-a-secure-file-for-use-in-a-specific-pipeline.
#
# This file has a maximum lifetime of one year and the current file will
# expire on 2024-02-10. The file will need to be updated before then to
# prevent automated deploys from breaking.
#
# Revoking these credentials can be done by changing the password of the
# account used to generate the credentials. See
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
# more info.
- job: publish_snap
  pool:
    vmImage: ubuntu-22.04
  variables:
  - group: certbot-common
  strategy:
    matrix:
      amd64:
        SNAP_ARCH: amd64
      arm32v6:
        SNAP_ARCH: armhf
      arm64v8:
        SNAP_ARCH: arm64
  steps:
  - bash: |
      set -e
      sudo apt-get update
      sudo apt-get install -y --no-install-recommends snapd
      sudo snap install --classic snapcraft
    displayName: Install dependencies
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: snaps_$(SNAP_ARCH)
      path: $(Build.SourcesDirectory)/snap
    displayName: Retrieve Certbot snaps
  - task: DownloadSecureFile@1
    name: snapcraftCfg
    inputs:
      secureFile: snapcraft.cfg
  - bash: |
      set -e
      export SNAPCRAFT_STORE_CREDENTIALS=$(cat "$(snapcraftCfg.secureFilePath)")
      for SNAP_FILE in snap/*.snap; do
        tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
      done
    displayName: Publish to Snap store
# The credentials used in the following jobs are for the shared
# certbotbot account on Docker Hub. The credentials are stored
# in a service account which was created by following the
# instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. The authentication used when
# creating this service account was a personal access token
# rather than a password to bypass 2FA. When Brad set this up,
# Azure Pipelines failed to verify the credentials with an error
# like "access is forbidden with a JWT issued from a personal
# access token", but after saving them without verification, the
# access token worked when the pipeline actually ran. "Grant
# access to all pipelines" should also be checked on the service
# account. The access token can be deleted on Docker Hub if
# these credentials need to be revoked.
- job: publish_docker_by_arch
  pool:
    vmImage: ubuntu-22.04
  strategy:
    matrix:
      arm32v6:
        DOCKER_ARCH: arm32v6
      arm64v8:
        DOCKER_ARCH: arm64v8
      amd64:
        DOCKER_ARCH: amd64
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: docker_$(DOCKER_ARCH)
      path: $(Build.SourcesDirectory)
    displayName: Retrieve Docker images
  - bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
    displayName: Load Docker images
  - task: Docker@2
    inputs:
      command: login
      containerRegistry: docker-hub
    displayName: Login to Docker Hub
  - bash: set -e && tools/docker/deploy_images.sh $(dockerTag) $DOCKER_ARCH
    displayName: Deploy the Docker images by architecture
- job: publish_docker_multiarch
  dependsOn: publish_docker_by_arch
  pool:
    vmImage: ubuntu-22.04
  steps:
  - task: Docker@2
    inputs:
      command: login
      containerRegistry: docker-hub
    displayName: Login to Docker Hub
  - bash: set -e && tools/docker/deploy_manifests.sh $(dockerTag) all
    displayName: Deploy the Docker multiarch manifests
@@ -1,60 +0,0 @@
jobs:
- job: extended_test
  variables:
  - name: IMAGE_NAME
    value: ubuntu-22.04
  - name: PYTHON_VERSION
    value: 3.13
  - group: certbot-common
  strategy:
    matrix:
      linux-py39:
        PYTHON_VERSION: 3.9
        TOXENV: py39
      linux-py310:
        PYTHON_VERSION: 3.10
        TOXENV: py310
      linux-py311:
        PYTHON_VERSION: 3.11
        TOXENV: py311
      linux-py312:
        PYTHON_VERSION: 3.12
        TOXENV: py312
      linux-isolated:
        TOXENV: 'isolated-acme,isolated-certbot,isolated-apache,isolated-cloudflare,isolated-digitalocean,isolated-dnsimple,isolated-dnsmadeeasy,isolated-gehirn,isolated-google,isolated-linode,isolated-luadns,isolated-nsone,isolated-ovh,isolated-rfc2136,isolated-route53,isolated-sakuracloud,isolated-nginx'
      linux-integration-certbot-oldest:
        PYTHON_VERSION: 3.9
        TOXENV: integration-certbot-oldest
      linux-integration-nginx-oldest:
        PYTHON_VERSION: 3.9
        TOXENV: integration-nginx-oldest
      linux-py39-integration:
        PYTHON_VERSION: 3.9
        TOXENV: integration
      linux-py310-integration:
        PYTHON_VERSION: 3.10
        TOXENV: integration
      linux-py311-integration:
        PYTHON_VERSION: 3.11
        TOXENV: integration
      linux-py312-integration:
        PYTHON_VERSION: 3.12
        TOXENV: integration
      # python 3.13 integration tests are not run here because they're run as
      # part of the standard test suite
      nginx-compat:
        TOXENV: nginx_compat
      linux-integration-rfc2136:
        IMAGE_NAME: ubuntu-22.04
        PYTHON_VERSION: 3.12
        TOXENV: integration-dns-rfc2136
      le-modification:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: modification
      farmtest-apache2:
        PYTHON_VERSION: 3.12
        TOXENV: test-farm-apache2
  pool:
    vmImage: $(IMAGE_NAME)
  steps:
  - template: ../steps/tox-steps.yml
@@ -1,159 +0,0 @@
jobs:
- job: docker_build
  pool:
    vmImage: ubuntu-24.04
  strategy:
    matrix:
      arm32v6:
        DOCKER_ARCH: arm32v6
      arm64v8:
        DOCKER_ARCH: arm64v8
      amd64:
        DOCKER_ARCH: amd64
  # The default timeout of 60 minutes is a little low for compiling
  # cryptography on ARM architectures.
  timeoutInMinutes: 180
  steps:
  - bash: set -e && tools/docker/build.sh $(dockerTag) $DOCKER_ARCH
    displayName: Build the Docker images
  # We don't filter for the Docker Hub organization to continue to allow
  # easy testing of these scripts on forks.
  - bash: |
      set -e
      DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}')
      docker save --output images.tar $DOCKER_IMAGES
    displayName: Save the Docker images
  # If the name of the tar file or artifact changes, the deploy stage will
  # also need to be updated.
  - bash: set -e && mv images.tar $(Build.ArtifactStagingDirectory)
    displayName: Prepare Docker artifact
  - task: PublishPipelineArtifact@1
    inputs:
      path: $(Build.ArtifactStagingDirectory)
      artifact: docker_$(DOCKER_ARCH)
    displayName: Store Docker artifact
- job: docker_test
  dependsOn: docker_build
  pool:
    vmImage: ubuntu-22.04
  strategy:
    matrix:
      arm32v6:
        DOCKER_ARCH: arm32v6
      arm64v8:
        DOCKER_ARCH: arm64v8
      amd64:
        DOCKER_ARCH: amd64
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: docker_$(DOCKER_ARCH)
      path: $(Build.SourcesDirectory)
    displayName: Retrieve Docker images
  - bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
    displayName: Load Docker images
  - bash: |
      set -e && tools/docker/test.sh $(dockerTag) $DOCKER_ARCH
    displayName: Run integration tests for Docker images
- job: snaps_build
  pool:
    vmImage: ubuntu-22.04
  strategy:
    matrix:
      amd64:
        SNAP_ARCH: amd64
      armhf:
        SNAP_ARCH: armhf
      arm64:
        SNAP_ARCH: arm64
  timeoutInMinutes: 0
  steps:
  - script: |
      set -e
      sudo apt-get update
      sudo apt-get install -y --no-install-recommends snapd
      sudo snap install --classic snapcraft
    displayName: Install dependencies
  - task: UsePythonVersion@0
    inputs:
      versionSpec: 3.12
      addToPath: true
  - task: DownloadSecureFile@1
    name: credentials
    inputs:
      secureFile: launchpad-credentials
  - script: |
      set -e
      git config --global user.email "$(Build.RequestedForEmail)"
      git config --global user.name "$(Build.RequestedFor)"
      mkdir -p ~/.local/share/snapcraft/provider/launchpad
      cp $(credentials.secureFilePath) ~/.local/share/snapcraft/provider/launchpad/credentials
      python3 tools/snap/build_remote.py ALL --archs ${SNAP_ARCH} --timeout $(snapBuildTimeout)
    displayName: Build snaps
  - script: |
      set -e
      mv *.snap $(Build.ArtifactStagingDirectory)
      mv certbot-dns-*/*.snap $(Build.ArtifactStagingDirectory)
    displayName: Prepare artifacts
  - task: PublishPipelineArtifact@1
    inputs:
      path: $(Build.ArtifactStagingDirectory)
      artifact: snaps_$(SNAP_ARCH)
    displayName: Store snaps artifacts
- job: snap_run
  dependsOn: snaps_build
  pool:
    vmImage: ubuntu-22.04
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: 3.12
      addToPath: true
  - script: |
      set -e
      sudo apt-get update
      sudo apt-get install -y --no-install-recommends nginx-light snapd
      python3 -m venv venv
      venv/bin/python tools/pip_install.py -U tox
    displayName: Install dependencies
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: snaps_amd64
      path: $(Build.SourcesDirectory)/snap
    displayName: Retrieve Certbot snaps
  - script: |
      set -e
      sudo snap install --dangerous --classic snap/certbot_*.snap
    displayName: Install Certbot snap
  - script: |
      set -e
      venv/bin/python -m tox run -e integration-external,apacheconftest-external-with-pebble
    displayName: Run tox
- job: snap_dns_run
  dependsOn: snaps_build
  pool:
    vmImage: ubuntu-22.04
  steps:
  - script: |
      set -e
      sudo apt-get update
      sudo apt-get install -y --no-install-recommends snapd
    displayName: Install dependencies
  - task: UsePythonVersion@0
    inputs:
      versionSpec: 3.12
      addToPath: true
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: snaps_amd64
      path: $(Build.SourcesDirectory)/snap
    displayName: Retrieve Certbot snaps
  - script: |
      set -e
      python3 -m venv venv
      venv/bin/python tools/pip_install.py -e certbot-ci
    displayName: Prepare Certbot-CI
  - script: |
      set -e
      sudo -E venv/bin/pytest certbot-ci/src/snap_integration_tests/dns_tests --allow-persistent-changes --snap-folder $(Build.SourcesDirectory)/snap --snap-arch amd64
    displayName: Test DNS plugins snaps
@@ -1,55 +0,0 @@
jobs:
- job: test
  variables:
    PYTHON_VERSION: 3.13
  strategy:
    matrix:
      macos-cover:
        IMAGE_NAME: macOS-15
        TOXENV: cover
        # As of pip 23.1.0, builds started failing on macOS unless this flag was set.
        # See https://github.com/certbot/certbot/pull/9717#issuecomment-1610861794.
        PIP_USE_PEP517: "true"
      linux-oldest:
        IMAGE_NAME: ubuntu-22.04
        PYTHON_VERSION: 3.9
        TOXENV: oldest
      linux-py39:
        # linux unit tests with the oldest python we support
        IMAGE_NAME: ubuntu-22.04
        PYTHON_VERSION: 3.9
        TOXENV: py39
      linux-cover:
        # linux unit+cover tests with the newest python we support
        IMAGE_NAME: ubuntu-22.04
        TOXENV: cover
      linux-lint:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: lint-posix
      linux-mypy:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: mypy
      linux-integration:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: integration
      apache-compat:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: apache_compat
      apacheconftest:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: apacheconftest-with-pebble
      nginxroundtrip:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: nginxroundtrip
      validate-changelog:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: validate-changelog
  pool:
    vmImage: $(IMAGE_NAME)
  steps:
  - template: ../steps/tox-steps.yml
- job: test_sphinx_builds
  pool:
    vmImage: ubuntu-22.04
  steps:
  - template: ../steps/sphinx-steps.yml
@@ -1,19 +0,0 @@
stages:
- stage: Changelog
  jobs:
  - job: prepare
    pool:
      vmImage: ubuntu-latest
    steps:
    # If we change the output filename from `release_notes.md`, it should also be changed in tools/create_github_release.py
    - bash: |
        set -e
        CERTBOT_VERSION="$(cd certbot/src && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
        "${BUILD_REPOSITORY_LOCALPATH}/tools/extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
      displayName: Prepare changelog
    - task: PublishPipelineArtifact@1
      inputs:
        path: $(Build.ArtifactStagingDirectory)
        # If we change the artifact's name, it should also be changed in tools/create_github_release.py
        artifact: changelog
      displayName: Publish changelog
@@ -1,6 +0,0 @@
stages:
- stage: Deploy
  jobs:
  - template: ../jobs/common-deploy-jobs.yml
    parameters:
      snapReleaseChannel: edge
@@ -1,27 +0,0 @@
stages:
- stage: Deploy
  jobs:
  - template: ../jobs/common-deploy-jobs.yml
    parameters:
      snapReleaseChannel: beta
  - job: create_github_release
    pool:
      vmImage: ubuntu-22.04
    steps:
    - task: DownloadPipelineArtifact@2
      inputs:
        artifact: changelog
        path: '$(Pipeline.Workspace)'
    - task: GitHubRelease@1
      inputs:
        # this "github-releases" credential is what azure pipelines calls a
        # "service connection". it needs to be recreated annually. instructions
        # to do so and further information about the token are available at
        # https://github.com/EFForg/certbot-misc/wiki/Azure-Pipelines-setup#regenerating-github-release-credentials-for-use-on-azure
        #
        # as of writing this, the current token will expire on Wed, Feb 25 2026.
        gitHubConnection: github-releases
        title: ${{ format('Certbot {0}', replace(variables['Build.SourceBranchName'], 'v', '')) }}
        releaseNotesFilePath: '$(Pipeline.Workspace)/release_notes.md'
        assets: '$(Build.SourcesDirectory)/packages/{*.tar.gz,SHA256SUMS*}'
        addChangeLog: false
@@ -1,6 +0,0 @@
stages:
- stage: TestAndPackage
  jobs:
  - template: ../jobs/standard-tests-jobs.yml
  - template: ../jobs/extended-tests-jobs.yml
  - template: ../jobs/packaging-jobs.yml
@@ -1,24 +0,0 @@
steps:
- bash: |
    set -e
    sudo apt-get update
    sudo apt-get install -y --no-install-recommends libaugeas-dev
    FINAL_STATUS=0
    declare -a FAILED_BUILDS
    tools/venv.py
    source venv/bin/activate
    for doc_path in */docs
    do
      echo ""
      echo "##[group]Building $doc_path"
      if ! sphinx-build -W --keep-going -b html $doc_path $doc_path/_build/html; then
        FINAL_STATUS=1
        FAILED_BUILDS[${#FAILED_BUILDS[@]}]="${doc_path%/docs}"
      fi
      echo "##[endgroup]"
    done
    if [[ $FINAL_STATUS -ne 0 ]]; then
      echo "##[error]The following builds failed: ${FAILED_BUILDS[*]}"
      exit 1
    fi
  displayName: Build Sphinx Documentation
@@ -1,77 +0,0 @@
# This does not include the dependencies needed to build cryptography. See
# https://cryptography.io/en/latest/installation/
steps:
# We run brew update because we've seen attempts to install an older version
# of a package fail. See
# https://github.com/actions/virtual-environments/issues/3165.
#
# We untap homebrew/core and homebrew/cask and unset HOMEBREW_NO_INSTALL_FROM_API (which
# is set by the CI macOS env) because GitHub has been having issues, making these jobs
# fail on git clones: https://github.com/orgs/Homebrew/discussions/4612.
- bash: |
    set -e
    unset HOMEBREW_NO_INSTALL_FROM_API
    brew untap homebrew/core homebrew/cask
    brew update
    brew install augeas
  condition: startswith(variables['IMAGE_NAME'], 'macOS')
  displayName: Install MacOS dependencies
- bash: |
    set -e
    sudo apt-get update
    sudo apt-get install -y --no-install-recommends \
      libaugeas-dev \
      nginx-light
    sudo systemctl stop nginx
    sudo sysctl net.ipv4.ip_unprivileged_port_start=0
  condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
  displayName: Install Linux dependencies
- task: UsePythonVersion@0
  inputs:
    versionSpec: $(PYTHON_VERSION)
    addToPath: true
- bash: |
    set -e
    python3 tools/pip_install.py tox
  displayName: Install runtime dependencies
- task: DownloadSecureFile@1
  name: testFarmPem
  inputs:
    secureFile: azure-test-farm.pem
  condition: contains(variables['TOXENV'], 'test-farm')
- bash: |
    set -e
    export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
    [ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
    env
    python3 -m tox run
  env:
    AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
    AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
  displayName: Run tox
# For now, let's omit `set -e` and avoid the script exiting with a nonzero
# status code to prevent problems here from causing build failures. If
# this turns out to work well, we can change this.
- bash: |
    python3 tools/pip_install.py -I coverage
    case "$AGENT_OS" in
      Darwin)
        CODECOV_URL="https://uploader.codecov.io/latest/macos/codecov"
        ;;
      Linux)
        CODECOV_URL="https://uploader.codecov.io/latest/linux/codecov"
        ;;
      Windows_NT)
        CODECOV_URL="https://uploader.codecov.io/latest/windows/codecov.exe"
        ;;
      *)
        echo "Unexpected OS"
        exit 0
    esac
    curl --retry 3 -o codecov "$CODECOV_URL"
    chmod +x codecov
    coverage xml
    ./codecov || echo "Uploading coverage data failed"
  condition: and(eq(variables['uploadCoverage'], true), or(startsWith(variables['TOXENV'], 'cover'), startsWith(variables['TOXENV'], 'integration')))
  displayName: Upload coverage data
`.azure-pipelines/templates/tests-suite.yml` (new file, 38 lines)
@@ -0,0 +1,38 @@
jobs:
- job: test
  pool:
    vmImage: vs2017-win2016
  strategy:
    matrix:
      py35:
        PYTHON_VERSION: 3.5
        TOXENV: py35
      py37-cover:
        PYTHON_VERSION: 3.7
        TOXENV: py37-cover
      integration-certbot:
        PYTHON_VERSION: 3.7
        TOXENV: integration-certbot
        PYTEST_ADDOPTS: --numprocesses 4
  variables:
  - group: certbot-common
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: $(PYTHON_VERSION)
      addToPath: true
  - script: python tools/pip_install.py -U tox coverage
    displayName: Install dependencies
  - script: python -m tox
    displayName: Run tox
  # We do not require the codecov report upload to succeed. So to avoid breaking the pipeline if
  # something goes wrong, each command is suffixed with a command that hides any non-zero exit
  # codes and echoes an informative message instead.
  - bash: |
      curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
      chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
      ./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
    condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
    env:
      CODECOV_TOKEN: $(codecov_token)
    displayName: Publish coverage
18 .codecov.yml Normal file
@@ -0,0 +1,18 @@
coverage:
  status:
    project:
      default: off
      linux:
        flags: linux
        # Fixed target instead of auto, set by #7173; can
        # be removed when flags in Codecov are added back.
        target: 97.4
        threshold: 0.1
        base: auto
      windows:
        flags: windows
        # Fixed target instead of auto, set by #7173; can
        # be removed when flags in Codecov are added back.
        target: 97.4
        threshold: 0.1
        base: auto
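The `target`/`threshold` pair above sets a fixed pass bar instead of comparing against the base commit: the status passes while coverage stays within `threshold` points of `target`. A small sketch of that rule (the function name is ours, not Codecov's):

```python
def codecov_project_status(coverage_pct: float, target: float = 97.4, threshold: float = 0.1) -> bool:
    """Approximate Codecov's fixed-target check: pass if coverage has not
    dropped more than `threshold` points below `target`."""
    return coverage_pct >= target - threshold

print(codecov_project_status(97.35))
print(codecov_project_status(97.2))
```

With `base: auto` the comparison base would matter for `threshold: auto` targets, but a fixed target makes the check independent of the base build.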
19 .coveragerc
@@ -1,24 +1,5 @@
[run]
omit = */setup.py
source =
    acme
    certbot
    certbot-apache
    certbot-dns-cloudflare
    certbot-dns-digitalocean
    certbot-dns-dnsimple
    certbot-dns-dnsmadeeasy
    certbot-dns-gehirn
    certbot-dns-google
    certbot-dns-linode
    certbot-dns-luadns
    certbot-dns-nsone
    certbot-dns-ovh
    certbot-dns-rfc2136
    certbot-dns-route53
    certbot-dns-sakuracloud
    certbot-nginx

[report]
omit = */setup.py
show_missing = True
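The `.coveragerc` above uses standard INI syntax, so its multi-line `source` list parses the way any `configparser`-style value does. A minimal sketch, using an excerpt of the package list rather than the full set:

```python
import configparser

# A minimal sketch of how coverage.py reads the [run] section above;
# the package list here is an excerpt, not the full set.
COVERAGERC = """
[run]
omit = */setup.py
source =
    acme
    certbot
    certbot-nginx
"""

parser = configparser.ConfigParser()
parser.read_string(COVERAGERC)
measured = parser.get("run", "source").split()
print(measured)
```

Each entry under `source` names a directory whose code coverage should be measured.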
@@ -8,4 +8,5 @@
.git
.tox
venv
venv3
docs

@@ -1,18 +0,0 @@
# https://editorconfig.org/

root = true

[*]
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf

[*.py]
indent_style = space
indent_size = 4
charset = utf-8
max_line_length = 100

[*.yaml]
indent_style = space
indent_size = 2
1 .github/FUNDING.yml vendored
@@ -1 +0,0 @@
custom: https://supporters.eff.org/donate/support-work-on-certbot
69 .github/ISSUE_TEMPLATE/bug.yaml vendored
@@ -1,69 +0,0 @@
name: Bug Report
description: File a bug report.
title: "[Bug]: "
type: Bug
body:
  - type: markdown
    attributes:
      value: |
        Thanks for taking the time to fill out this bug report! If you're
        having trouble using Certbot and aren't sure you've found a bug,
        please first try asking for help at https://community.letsencrypt.org/.
        There is a much larger community there of people familiar with the
        project who will be able to more quickly answer your questions.
  - type: input
    attributes:
      label: OS
      description: |
        Describe your Operating System. Examples: Ubuntu 18.04, CentOS 8 Stream
      placeholder: Ubuntu 24.04
    validations:
      required: true
  - type: input
    attributes:
      label: Installation method
      description: |
        How did you install Certbot? Examples: snap, pip, apt, yum
      placeholder: snap
    validations:
      required: true
  - type: input
    attributes:
      label: Certbot Version
      description: |
        If you're not sure, you can find this by running `certbot --version`.
      placeholder: 1.0.0
    validations:
      required: true
  - type: textarea
    id: what-happened
    attributes:
      label: What happened?
      description: |
        I ran this command and it produced this output. Example:
        ```
        $ sudo certbot certonly -d adfsfasdf.asdfasdf --staging
        Saving debug log to /var/log/letsencrypt/letsencrypt.log
        Plugins selected: Authenticator nginx, Installer nginx
        Requesting a certificate for example.com
        An unexpected error occurred:
        Invalid identifiers requested :: Cannot issue for "adfsfasdf.asdfasdf": Domain name does not end with a valid public suffix (TLD)
        Ask for help or search for solutions at https://community.letsencrypt.org. See the logfile /var/log/letsencrypt/letsencrypt.log or re-run Certbot with -v for more details.
        ```
      placeholder: Tell us what you see!
    validations:
      required: true
  - type: textarea
    id: expected
    attributes:
      label: Expected behavior
      description: Describe how Certbot's behavior differed from what you expected.
      placeholder: "What was expected?"
    validations:
      required: true
  - type: textarea
    id: logs
    attributes:
      label: Relevant log output
      description: Here is a Certbot log showing the issue (if available). Logs are stored in `/var/log/letsencrypt` by default. Feel free to redact domains, e-mail and IP addresses as you see fit.
      render: shell
8 .github/ISSUE_TEMPLATE/config.yml vendored
@@ -1,8 +0,0 @@
blank_issues_enabled: false
contact_links:
  - name: Let's Encrypt Community Support
    url: https://community.letsencrypt.org/
    about: If you're having trouble using Certbot and aren't sure you've found a bug or have a request for a new feature, please first try asking for help here. There is a much larger community there of people familiar with the project who will be able to more quickly answer your questions.
  - name: Certbot Security Policy
    url: https://github.com/certbot/certbot/security/advisories/new
    about: Please report security vulnerabilities here.
27 .github/ISSUE_TEMPLATE/feature.yaml vendored
@@ -1,27 +0,0 @@
name: Feature Request
description: Suggest a new feature or improvement to Certbot
title: "[Feature Request]: "
type: Feature
body:
  - type: textarea
    id: problem
    attributes:
      label: What problem does this feature solve or what does it enhance?
      description: Explain what this feature addresses, or the benefit it provides.
      placeholder: For example, "Currently, users have to manually do X, which is time-consuming."
    validations:
      required: true
  - type: textarea
    id: solution
    attributes:
      label: Proposed Solution
      description: Describe the solution you'd like to see implemented.
      placeholder: For example, "Implement a new button that automatically does X."
    validations:
      required: true
  - type: textarea
    id: alternatives
    attributes:
      label: Alternatives Considered
      description: Have you considered any alternative solutions?
      placeholder: For example, "We considered Y, but Z is a better approach because..."
15 .github/ISSUE_TEMPLATE/task.yaml vendored
@@ -1,15 +0,0 @@
name: Task
description: A codebase upkeep task such as managing deprecations and refactoring
title: "[Task]: "
type: Task
body:
  - type: textarea
    id: problem
    attributes:
      label: Task description
      description: Describe the work that needs to happen, and why.
      placeholder: >
        For example, "In issue [link here], we noted that we cannot update [dependency] until
        [something happens]. That thing has happened, so now we should update [dependency]."
    validations:
      required: true
7 .github/codecov.yml vendored
@@ -1,7 +0,0 @@
# This disables all reporting from codecov. Let's just set it up to collect
# data for now and then we can play with the settings here.
comment: false
coverage:
  status:
    project: off
    patch: off
7 .github/pull_request_template.md vendored
@@ -1,7 +0,0 @@
## Pull Request Checklist

- [ ] The Certbot team has recently expressed interest in reviewing a PR for this. If not, this PR may be closed due to our limited resources and the need to prioritize how we spend them.
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), add a description of your change to the `newsfragments` directory. This should be a file called `<title>.<type>`, where `<title>` is either a GitHub issue number or some other unique name starting with `+`, and `<type>` is either `changed`, `fixed`, or `added`.
  * For example, if you fixed a bug for issue number 42, create a file called `42.fixed` and put a description of your change in that file.
- [ ] Add or update any documentation as needed to support the changes in this PR.
- [ ] Include your name in `AUTHORS.md` if you like.
35 .github/stale.yml vendored Normal file
@@ -0,0 +1,35 @@
# Configuration for https://github.com/marketplace/stale

# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365

# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30

# Ignore issues with an assignee (defaults to false)
exemptAssignees: true

# Label to use when marking as stale
staleLabel: needs-update

# Comment to post when marking as stale. Set to `false` to disable
markComment: >
  We've made a lot of changes to Certbot since this issue was opened. If you
  still have this issue with an up-to-date version of Certbot, can you please
  add a comment letting us know? This helps us to better see what issues are
  still affecting our users. If there is no activity in the next 30 days, this
  issue will be automatically closed.

# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
  This issue has been closed due to lack of activity, but if you think it
  should be reopened, please open a new issue with a link to this one and we'll
  take a look.

# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1

# Don't mark pull requests as stale.
only: issues
14 .github/workflows/assigned.yaml vendored
@@ -1,14 +0,0 @@
name: Issue Assigned

on:
  issues:
    types: [assigned]
jobs:
  send-mattermost-message:
    runs-on: ubuntu-latest
    steps:
      - uses: mattermost/action-mattermost-notify@master
        with:
          MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_ASSIGN_WEBHOOK }}
          TEXT: >
            ${{ github.event.assignee.login }} assigned to "${{ github.event.issue.title }}": ${{ github.event.issue.html_url }}
20 .github/workflows/merged.yaml vendored
@@ -1,20 +0,0 @@
name: Merge Event

on:
  pull_request_target:
    types:
      - closed

jobs:
  if_merged:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - uses: mattermost/action-mattermost-notify@master
        with:
          MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_MERGE_WEBHOOK }}
          TEXT: >
            [${{ github.repository }}]
            [${{ github.event.pull_request.title }}
            #${{ github.event.number }}](https://github.com/${{ github.repository }}/pull/${{ github.event.number }})
            was merged into main by ${{ github.actor }}
25 .github/workflows/notify_weekly.yaml vendored
@@ -1,25 +0,0 @@
name: Weekly Github Update

on:
  schedule:
    # Every week on Thursday @ 10:00
    - cron: "0 10 * * 4"
  workflow_dispatch:
jobs:
  send-mattermost-message:
    runs-on: ubuntu-latest

    steps:
      - name: Create Mattermost Message
        run: |
          DATE=$(date --date="7 days ago" +"%Y-%m-%d")
          echo "ASSIGNED_PRS=https://github.com/pulls?q=is%3Apr+is%3Aopen+updated%3A%3E%3D${DATE}+assignee%3A*+user%3Acertbot" >> $GITHUB_ENV
          echo "UPDATED_URL=https://github.com/issues?q=is%3Aissue+is%3Aopen+sort%3Acomments-desc+updated%3A%3E%3D${DATE}+user%3Acertbot" >> $GITHUB_ENV
      - uses: mattermost/action-mattermost-notify@master
        with:
          MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_WEBHOOK_URL }}
          MATTERMOST_CHANNEL: private-certbot
          TEXT: |
            ## Updates In the Past Week
            - Most commented in the last week: [link](${{ env.UPDATED_URL }})
            - Updated (assigned) PRs in the last week: [link](${{ env.ASSIGNED_PRS }})
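The workflow above builds its GitHub search URLs around a cutoff of seven days ago, computed by `date --date="7 days ago" +"%Y-%m-%d"`. The same computation can be sketched in Python (the fixed date is for illustration only):

```python
from datetime import date, timedelta

# Mirrors the workflow's `date --date="7 days ago" +"%Y-%m-%d"` step,
# pinned to a fixed "today" so the result is deterministic.
today = date(2024, 5, 8)
cutoff = today - timedelta(days=7)
query_date = cutoff.strftime("%Y-%m-%d")
print(query_date)
```

The resulting string is URL-encoded into the `updated:>=` qualifier of the GitHub search queries.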
16 .github/workflows/review_requested.yaml vendored
@@ -1,16 +0,0 @@
name: Review Requested

on:
  pull_request_target:
    types: [review_requested]
jobs:
  send-mattermost-message:
    # Don't notify for the interim step of certbot/eff-devs being assigned
    if: ${{ github.event.requested_reviewer.login != '' }}
    runs-on: ubuntu-latest
    steps:
      - uses: mattermost/action-mattermost-notify@master
        with:
          MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_ASSIGN_WEBHOOK }}
          TEXT: >
            Review requested from ${{ github.event.requested_reviewer.login }} for "${{ github.event.pull_request.title }}": ${{ github.event.pull_request.html_url }}
51 .github/workflows/stale.yml vendored
@@ -1,51 +0,0 @@
name: Update Stale Issues
on:
  schedule:
    # Run 1:24AM every night
    - cron: '24 1 * * *'
  workflow_dispatch:
permissions:
  issues: write
jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v6
        with:
          # Idle number of days before marking issues stale
          days-before-issue-stale: 365

          # Never mark PRs as stale
          days-before-pr-stale: -1

          # Idle number of days before closing stale issues
          days-before-issue-close: 30

          # Never close PRs
          days-before-pr-close: -1

          # Ignore issues with an assignee
          exempt-all-issue-assignees: true

          # Label to use when marking as stale
          stale-issue-label: stale-needs-update

          # Label to use when issue is automatically closed
          close-issue-label: auto-closed

          stale-issue-message: >
            We've made a lot of changes to Certbot since this issue was opened. If you
            still have this issue with an up-to-date version of Certbot, can you please
            add a comment letting us know? This helps us to better see what issues are
            still affecting our users. If there is no activity in the next 30 days, this
            issue will be automatically closed.

          close-issue-message: >
            This issue has been closed due to lack of activity, but if you think it
            should be reopened, please open a new issue with a link to this one and we'll
            take a look.

          # Limit the number of actions per run. As of writing this, GitHub's
          # rate limit is 1000 requests per hour so we're still a ways off. See
          # https://docs.github.com/en/rest/overview/resources-in-the-rest-api?apiVersion=2022-11-28#rate-limits-for-requests-from-github-actions.
          operations-per-run: 180
27 .gitignore vendored
@@ -4,16 +4,16 @@
build/
dist*/
/venv*/
/kgs/
/.tox/
/releases*/
/log*
letsencrypt.log
certbot.log
poetry.lock
letsencrypt-auto-source/letsencrypt-auto.sig.lzma.base64

# coverage
.coverage
.coverage.*
/htmlcov/

/.vagrant
@@ -26,13 +26,15 @@ tags
\#*#
.idea
.ropeproject
.vscode
*.sublime-project
*.sublime-workspace

# auth --cert-path --chain-path
/*.pem

# letstest
tests/letstest/letest-*/
tests/letstest/*.pem
tests/letstest/venv/

.venv

# pytest cache
@@ -47,18 +49,3 @@ tags
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*

# snap files
.snapcraft
parts
prime
stage
*.snap
snap-constraints.txt
qemu-*
certbot-dns*/certbot-dns*_amd64*.txt
certbot-dns*/certbot-dns*_arm*.txt
/certbot_amd64*.txt
/certbot_arm*.txt
certbot-dns*/snap
snapcraft.cfg
@@ -1,7 +1,7 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400
src_paths=acme/src,acme/tests,certbot*/tests,certbot/src,certbot*/src/certbot*
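The `[settings]` above shape how isort rewrites imports: `force_single_line=True` puts one imported name per line, and `force_sort_within_sections=True` sorts `import x` and `from x import y` statements together within a section. A small hand-written example of the resulting style (this is the output style, not a call into isort's API):

```python
# One imported name per line, sorted within the stdlib section,
# as force_single_line=True and force_sort_within_sections=True produce.
from os import path
from os import sep

print(path.join("a", "b"))
```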
796
.pylintrc
796
.pylintrc
@@ -1,373 +1,51 @@
|
||||
[MAIN]
|
||||
[MASTER]
|
||||
|
||||
# Analyse import fallback blocks. This can be used to support both Python 2 and
|
||||
# 3 compatible code, which means that the block might have code that exists
|
||||
# only in one or another interpreter, leading to false positives when analysed.
|
||||
analyse-fallback-blocks=no
|
||||
# use as many jobs as there are cores
|
||||
jobs=0
|
||||
|
||||
# Load and enable all available extensions. Use --list-extensions to see a list
|
||||
# all available extensions.
|
||||
#enable-all-extensions=
|
||||
|
||||
# In error mode, messages with a category besides ERROR or FATAL are
|
||||
# suppressed, and no reports are done by default. Error mode is compatible with
|
||||
# disabling specific errors.
|
||||
#errors-only=
|
||||
|
||||
# Always return a 0 (non-error) status code, even if lint errors are found.
|
||||
# This is primarily useful in continuous integration scripts.
|
||||
#exit-zero=
|
||||
|
||||
# A comma-separated list of package or module names from where C extensions may
|
||||
# be loaded. Extensions are loading into the active Python interpreter and may
|
||||
# run arbitrary code.
|
||||
extension-pkg-allow-list=
|
||||
|
||||
# A comma-separated list of package or module names from where C extensions may
|
||||
# be loaded. Extensions are loading into the active Python interpreter and may
|
||||
# run arbitrary code. (This is an alternative name to extension-pkg-allow-list
|
||||
# for backward compatibility.)
|
||||
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
|
||||
|
||||
# Return non-zero exit code if any of these messages/categories are detected,
|
||||
# even if score is above --fail-under value. Syntax same as enable. Messages
|
||||
# specified are enabled, while categories only check already-enabled messages.
|
||||
fail-on=
|
||||
|
||||
# Specify a score threshold under which the program will exit with error.
|
||||
fail-under=10
|
||||
|
||||
# Interpret the stdin as a python script, whose filename needs to be passed as
|
||||
# the module_or_package argument.
|
||||
#from-stdin=
|
||||
|
||||
# Files or directories to be skipped. They should be base names, not paths.
|
||||
ignore=CVS
|
||||
|
||||
# Add files or directories matching the regular expressions patterns to the
|
||||
# ignore-list. The regex matches against paths and can be in Posix or Windows
|
||||
# format. Because '\' represents the directory delimiter on Windows systems, it
|
||||
# can't be used as an escape character.
|
||||
# CERTBOT COMMENT
|
||||
# Changing this line back to the default of `ignore-paths=` is being tracked by
|
||||
# https://github.com/certbot/certbot/issues/7908.
|
||||
ignore-paths=.*/_internal/tests/
|
||||
|
||||
# Files or directories matching the regular expression patterns are skipped.
|
||||
# The regex matches against base names, not paths. The default value ignores
|
||||
# Emacs file locks
|
||||
ignore-patterns=^\.#
|
||||
|
||||
# List of module names for which member attributes should not be checked
|
||||
# (useful for modules/projects where namespaces are manipulated during runtime
|
||||
# and thus existing member attributes cannot be deduced by static analysis). It
|
||||
# supports qualified module names, as well as Unix pattern matching.
|
||||
ignored-modules=
|
||||
# Specify a configuration file.
|
||||
#rcfile=
|
||||
|
||||
# Python code to execute, usually for sys.path manipulation such as
|
||||
# pygtk.require().
|
||||
# CERTBOT COMMENT
|
||||
# This is needed for pylint to import linter_plugin.py since
|
||||
# https://github.com/PyCQA/pylint/pull/3396.
|
||||
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(next(pylint.config.find_default_config_files())))"
|
||||
#init-hook=
|
||||
|
||||
# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
|
||||
# number of processors available to use, and will cap the count on Windows to
|
||||
# avoid hangs.
|
||||
jobs=0
|
||||
# Profiled execution.
|
||||
profile=no
|
||||
|
||||
# Control the amount of potential inferred values when inferring a single
|
||||
# object. This can help the performance when dealing with large functions or
|
||||
# complex, nested conditions.
|
||||
limit-inference-results=100
|
||||
|
||||
# List of plugins (as comma separated values of python module names) to load,
|
||||
# usually to register additional checkers.
|
||||
load-plugins=linter_plugin
|
||||
# Add files or directories to the blacklist. They should be base names, not
|
||||
# paths.
|
||||
ignore=CVS
|
||||
|
||||
# Pickle collected data for later comparisons.
|
||||
persistent=yes
|
||||
|
||||
# Minimum Python version to use for version dependent checks. Will default to
|
||||
# the version used to run pylint.
|
||||
py-version=3.10
|
||||
|
||||
# Discover python modules and packages in the file system subtree.
|
||||
recursive=no
|
||||
|
||||
# When enabled, pylint would attempt to guess common misconfiguration and emit
|
||||
# user-friendly hints instead of false-positive error messages.
|
||||
suggestion-mode=yes
|
||||
|
||||
# Allow loading of arbitrary C extensions. Extensions are imported into the
|
||||
# active Python interpreter and may run arbitrary code.
|
||||
unsafe-load-any-extension=no
|
||||
|
||||
# In verbose mode, extra non-checker-related info will be displayed.
|
||||
#verbose=
|
||||
|
||||
|
||||
[BASIC]
|
||||
|
||||
# Naming style matching correct argument names.
|
||||
argument-naming-style=snake_case
|
||||
|
||||
# Regular expression matching correct argument names. Overrides argument-
|
||||
# naming-style. If left empty, argument names will be checked with the set
|
||||
# naming style.
|
||||
#argument-rgx=
|
||||
|
||||
# Naming style matching correct attribute names.
|
||||
attr-naming-style=snake_case
|
||||
|
||||
# Regular expression matching correct attribute names. Overrides attr-naming-
|
||||
# style. If left empty, attribute names will be checked with the set naming
|
||||
# style.
|
||||
#attr-rgx=
|
||||
|
||||
# Bad variable names which should always be refused, separated by a comma.
|
||||
bad-names=foo,
|
||||
bar,
|
||||
baz,
|
||||
toto,
|
||||
tutu,
|
||||
tata
|
||||
|
||||
# Bad variable names regexes, separated by a comma. If names match any regex,
|
||||
# they will always be refused
|
||||
bad-names-rgxs=
|
||||
|
||||
# Naming style matching correct class attribute names.
|
||||
class-attribute-naming-style=any
|
||||
|
||||
# Regular expression matching correct class attribute names. Overrides class-
|
||||
# attribute-naming-style. If left empty, class attribute names will be checked
|
||||
# with the set naming style.
|
||||
#class-attribute-rgx=
|
||||
|
||||
# Naming style matching correct class constant names.
|
||||
class-const-naming-style=UPPER_CASE
|
||||
|
||||
# Regular expression matching correct class constant names. Overrides class-
|
||||
# const-naming-style. If left empty, class constant names will be checked with
|
||||
# the set naming style.
|
||||
#class-const-rgx=
|
||||
|
||||
# Naming style matching correct class names.
|
||||
class-naming-style=PascalCase
|
||||
|
||||
# Regular expression matching correct class names. Overrides class-naming-
|
||||
# style. If left empty, class names will be checked with the set naming style.
|
||||
#class-rgx=
|
||||
|
||||
# Naming style matching correct constant names.
|
||||
const-naming-style=UPPER_CASE
|
||||
|
||||
# Regular expression matching correct constant names. Overrides const-naming-
|
||||
# style. If left empty, constant names will be checked with the set naming
|
||||
# style.
|
||||
#const-rgx=
|
||||
|
||||
# Minimum line length for functions/classes that require docstrings, shorter
|
||||
# ones are exempt.
|
||||
docstring-min-length=-1
|
||||
|
||||
# Naming style matching correct function names.
|
||||
function-naming-style=snake_case
|
||||
|
||||
# Regular expression matching correct function names. Overrides function-
|
||||
# naming-style. If left empty, function names will be checked with the set
|
||||
# naming style.
|
||||
function-rgx=[a-z_][a-z0-9_]{2,40}$
|
||||
|
||||
# Good variable names which should always be accepted, separated by a comma.
|
||||
good-names=i,
|
||||
j,
|
||||
k,
|
||||
ex,
|
||||
Run,
|
||||
_,
|
||||
fd,
|
||||
logger
|
||||
|
||||
# Good variable names regexes, separated by a comma. If names match any regex,
|
||||
# they will always be accepted
|
||||
good-names-rgxs=
|
||||
|
||||
# Include a hint for the correct naming format with invalid-name.
|
||||
include-naming-hint=no
|
||||
|
||||
# Naming style matching correct inline iteration names.
|
||||
inlinevar-naming-style=any
|
||||
|
||||
# Regular expression matching correct inline iteration names. Overrides
|
||||
# inlinevar-naming-style. If left empty, inline iteration names will be checked
|
||||
# with the set naming style.
|
||||
#inlinevar-rgx=
|
||||
|
||||
# Naming style matching correct method names.
|
||||
method-naming-style=snake_case
|
||||
|
||||
# Regular expression matching correct method names. Overrides method-naming-
|
||||
# style. If left empty, method names will be checked with the set naming style.
|
||||
method-rgx=[a-z_][a-z0-9_]{2,50}$
|
||||
|
||||
# Naming style matching correct module names.
|
||||
module-naming-style=snake_case
|
||||
|
||||
# Regular expression matching correct module names. Overrides module-naming-
|
||||
# style. If left empty, module names will be checked with the set naming style.
|
||||
#module-rgx=
|
||||
|
||||
# Colon-delimited sets of names that determine each other's naming style when
|
||||
# the name regexes allow several styles.
|
||||
name-group=
|
||||
|
||||
# Regular expression which should only match function or class names that do
|
||||
# not require a docstring.
|
||||
no-docstring-rgx=(__.*__)|(test_[A-Za-z0-9_]*)|(_.*)|(.*Test$)
|
||||
|
||||
# List of decorators that produce properties, such as abc.abstractproperty. Add
|
||||
# to this list to register other decorators that produce valid properties.
|
||||
# These decorators are taken in consideration only for invalid-name.
|
||||
property-classes=abc.abstractproperty
|
||||
|
||||
# Regular expression matching correct type variable names. If left empty, type
|
||||
# variable names will be checked with the set naming style.
|
||||
#typevar-rgx=
|
||||
|
||||
# Naming style matching correct variable names.
|
||||
variable-naming-style=snake_case
|
||||
|
||||
# Regular expression matching correct variable names. Overrides variable-
|
||||
# naming-style. If left empty, variable names will be checked with the set
|
||||
# naming style.
|
||||
variable-rgx=[a-z_][a-z0-9_]{1,30}$
|
||||
|
||||
|
||||
[CLASSES]
|
||||
|
||||
# Warn about protected attribute access inside special methods
|
||||
check-protected-access-in-special-methods=no
|
||||
|
||||
# List of method names used to declare (i.e. assign) instance attributes.
|
||||
defining-attr-methods=__init__,
|
||||
__new__,
|
||||
setUp,
|
||||
__post_init__
|
||||
|
||||
# List of valid names for the first argument in a class method.
|
||||
valid-classmethod-first-arg=cls
|
||||
|
||||
# List of valid names for the first argument in a metaclass class method.
|
||||
valid-metaclass-classmethod-first-arg=cls
|
||||
|
||||
|
||||
[EXCEPTIONS]
|
||||
|
||||
# Exceptions that will emit a warning when caught.
|
||||
overgeneral-exceptions=builtins.BaseException,
|
||||
builtins.Exception
|
||||
|
||||
|
||||
[FORMAT]
|
||||
|
||||
# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
|
||||
expected-line-ending-format=
|
||||
|
||||
# Regexp for a line that is allowed to be longer than the limit.
|
||||
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
|
||||
|
||||
# Number of spaces of indent required inside a hanging or continued line.
|
||||
# git history told me that "This does something silly/broken..."
|
||||
#indent-after-paren=4
|
||||
|
||||
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
|
||||
# tab).
|
||||
indent-string=' '
|
||||
|
||||
# Maximum number of characters on a single line.
|
||||
max-line-length=100
|
||||
|
||||
# Maximum number of lines in a module.
|
||||
max-module-lines=1250
|
||||
|
||||
# Allow the body of a class to be on the same line as the declaration if body
|
||||
# contains single statement.
|
||||
single-line-class-stmt=no
|
||||
|
||||
# Allow the body of an if to be on the same line as the test if there is no
|
||||
# else.
|
||||
single-line-if-stmt=no


[IMPORTS]

# List of modules that can be imported at any level, not just the top level
# one.
allow-any-import-level=

# Allow wildcard imports from modules that define __all__.
allow-wildcard-with-all=no

# Deprecated modules which should not be used, separated by a comma.
deprecated-modules=

# Output a graph (.gv or any supported image format) of external dependencies
# to the given file (report RP0402 must not be disabled).
ext-import-graph=

# Output a graph (.gv or any supported image format) of all (i.e. internal and
# external) dependencies to the given file (report RP0402 must not be
# disabled).
import-graph=

# Output a graph (.gv or any supported image format) of internal dependencies
# to the given file (report RP0402 must not be disabled).
int-import-graph=

# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=

# Force import order to recognize a module as part of a third party library.
known-third-party=enchant

# Couples of modules and preferred modules, separated by a comma.
preferred-modules=

[LOGGING]

# The type of string formatting that logging methods do. `old` means using %
# formatting, `new` is for `{}` formatting.
logging-format-style=old

# Logging modules to check that the string format arguments are in logging
# function parameter format.
logging-modules=logging,logger

# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=linter_plugin

# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loaded into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security

[MESSAGES CONTROL]

# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, CONTROL_FLOW, INFERENCE, INFERENCE_FAILURE,
# UNDEFINED.
confidence=HIGH,
           CONTROL_FLOW,
           INFERENCE,
           INFERENCE_FAILURE,
           UNDEFINED

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifiers separated by comma (,) or put this option
# multiple times. See also the "--disable" option for examples.
#enable=

# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then re-enable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W".
# CERTBOT COMMENT
# 1) Once the certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
@@ -375,203 +53,261 @@ confidence=HIGH,
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
# 4) raise-missing-from makes it an error to raise an exception from an except
# block without using explicit exception chaining. While explicit exception
# chaining results in a slightly more informative traceback, I don't think
# it's beneficial enough for us to change all of our current instances and
# give Certbot developers errors about this when they're working on new code
# in the future. You can read more about exception chaining and this pylint
# check at
# https://blog.ram.rachum.com/post/621791438475296768/improving-python-exception-chaining-with.
# 5) wrong-import-order generates false positives and a pylint developer
# suggests that people using isort should disable this check at
# https://github.com/PyCQA/pylint/issues/3817#issuecomment-687892090.
# 6) unspecified-encoding generates errors when encoding is not specified in
# a call to the built-in open function. This relates more to a design decision
# (unspecified encoding makes the open function use the default encoding of the system)
# than a clear flaw on which a check should be enforced. Anyway, the project does
# not need to enforce encoding on files, so we disable this check.
# 7) consider-using-f-string "suggests" moving to f-strings by raising an error. This
# clearly relates to code design and not to potential defects in the code, so we ignore it.
disable=fixme,locally-disabled,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order,unspecified-encoding,consider-using-f-string,raw-checker-failed,bad-inline-option,file-ignored,suppressed-message,useless-suppression,deprecated-pragma,use-symbolic-message-instead
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifiers separated by comma (,) or put this option
# multiple times (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=c-extension-no-member

[REPORTS]

# Set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html. You can also give a reporter class, e.g.
# mypackage.mymodule.MyReporterClass.
output-format=text

# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file named "pylint_global.[txt|html]".
files-output=no

# Tells whether to display a full report or only the messages.
reports=yes

# Python expression which should return a note less than 10 (10 is the highest
# note). You have access to the variables errors, warning, statement which
# respectively contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)

# Add a comment according to your evaluation note. This is used by the global
# evaluation report (RP0004).
comment=no

# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details.
#msg-template=

[METHOD_ARGS]
[BASIC]

# List of qualified names (i.e., library.method) which require a timeout
# parameter, e.g. 'requests.api.get,requests.api.post'.
timeout-methods=requests.api.delete,requests.api.get,requests.api.head,requests.api.options,requests.api.patch,requests.api.post,requests.api.put,requests.api.request

# Required attributes for module, separated by a comma
required-attributes=

# List of builtins function names that should not be used, separated by a comma
bad-functions=map,filter,apply,input,file

# Good variable names which should always be accepted, separated by a comma
good-names=f,i,j,k,ex,Run,_,fd,logger

# Bad variable names which should always be refused, separated by a comma
bad-names=foo,bar,baz,toto,tutu,tata

# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=

# Include a hint for the correct naming format with invalid-name
include-naming-hint=no

# Regular expression matching correct function names
function-rgx=[a-z_][a-z0-9_]{2,40}$

# Naming hint for function names
function-name-hint=[a-z_][a-z0-9_]{2,40}$

# Regular expression matching correct variable names
variable-rgx=[a-z_][a-z0-9_]{1,30}$

# Naming hint for variable names
variable-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct constant names
const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$

# Naming hint for constant names
const-name-hint=(([A-Z_][A-Z0-9_]*)|(__.*__))$

# Regular expression matching correct attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$

# Naming hint for attribute names
attr-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$

# Naming hint for argument names
argument-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct class attribute names
class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$

# Naming hint for class attribute names
class-attribute-name-hint=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$

# Regular expression matching correct inline iteration names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$

# Naming hint for inline iteration names
inlinevar-name-hint=[A-Za-z_][A-Za-z0-9_]*$

# Regular expression matching correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$

# Naming hint for class names
class-name-hint=[A-Z_][a-zA-Z0-9]+$

# Regular expression matching correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$

# Naming hint for module names
module-name-hint=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$

# Regular expression matching correct method names
method-rgx=[a-z_][a-z0-9_]{2,50}$

# Naming hint for method names
method-name-hint=[a-z_][a-z0-9_]{2,50}$

# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=(__.*__)|(test_[A-Za-z0-9_]*)|(_.*)|(.*Test$)

# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1
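The naming options above are plain Python regular expressions matched against each identifier. A quick sketch of how `function-rgx` behaves (the example names are hypothetical):

```python
import re

# function-rgx from the [BASIC] section: snake_case, 3 to 41 characters total.
FUNCTION_RGX = re.compile(r"[a-z_][a-z0-9_]{2,40}$")

print(bool(FUNCTION_RGX.match("perform_http01_challenge")))  # True
print(bool(FUNCTION_RGX.match("PerformChallenge")))          # False: not snake_case
print(bool(FUNCTION_RGX.match("go")))                        # False: too short
```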


[MISCELLANEOUS]

# List of note tags to take in consideration, separated by a comma.
notes=FIXME,
      XXX,
      TODO

# Regular expression of note tags to take in consideration.
notes-rgx=

[REFACTORING]
[LOGGING]

# Maximum number of nested blocks for function / method body
max-nested-blocks=5

# Complete name of functions that never return. When checking for
# inconsistent-return-statements, if a never-returning function is called then
# it will be considered as an explicit return statement and no message will be
# printed.
never-returning-functions=sys.exit,argparse.parse_error

[REPORTS]

# Python expression which should return a score less than or equal to 10. You
# have access to the variables 'fatal', 'error', 'warning', 'refactor',
# 'convention', and 'info' which contain the number of messages in each
# category, as well as 'statement' which is the total number of statements
# analyzed. This score is used by the global evaluation report (RP0004).
evaluation=max(0, 0 if fatal else 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10))

# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details.
msg-template=

# Set the output format. Available formats are text, parseable, colorized, json
# and msvs (visual studio). You can also give a reporter class, e.g.
# mypackage.mymodule.MyReporterClass.
#output-format=

# Tells whether to display a full report or only the messages.
reports=no

# Activate the evaluation score.
score=yes
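The `evaluation` expression above is ordinary Python that pylint evaluates with the per-category message counts bound as variables; errors are weighted five times as heavily as other messages, and `max(0, ...)` keeps the score non-negative. A small sketch with hypothetical counts:

```python
# Hypothetical message counts for one pylint run.
fatal, error, warning, refactor, convention, statement = 0, 2, 5, 1, 3, 200

# The evaluation expression from the [REPORTS] section above.
score = max(0, 0 if fatal else
            10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10))
print(round(score, 2))  # 10.0 minus a (19 / 200) * 10 penalty -> 9.05
```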


[SIMILARITIES]

# Comments are removed from the similarity computation
ignore-comments=yes

# Docstrings are removed from the similarity computation
ignore-docstrings=yes

# Imports are removed from the similarity computation
ignore-imports=yes

# Signatures are removed from the similarity computation
ignore-signatures=yes

# Minimum lines number of a similarity.
min-similarity-lines=6


[STRING]

# This flag controls whether inconsistent-quotes generates a warning when the
# character used as a quote delimiter is used inconsistently within a module.
check-quote-consistency=no

# This flag controls whether the implicit-str-concat should generate a warning
# on implicit string concatenation in sequences defined over several lines.
check-str-concat-over-line-jumps=no

[TYPECHECK]

# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager

# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=

# Tells whether to warn about missing members when the owner of the attribute
# is inferred to be None.
ignore-none=yes

# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference
# can return multiple potential results while evaluating a Python object, but
# some branches might not be evaluated, which results in partial inference. In
# that case, it might be useful to still emit no-member and other checks for
# the rest of the inferred objects.
ignore-on-opaque-inference=yes

# List of symbolic message names to ignore for Mixin members.
ignored-checks-for-mixins=no-member,
                          not-async-context-manager,
                          not-context-manager,
                          attribute-defined-outside-init

# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace,Field,Header,JWS,closing

# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis).
ignored-modules=confargparse,argparse

# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
missing-member-hint=yes

# The minimum edit distance a name should have in order to be considered a
# similar match for a missing member name.
missing-member-hint-distance=1

# The total number of similar names that should be taken in consideration when
# showing a hint for a missing member.
missing-member-max-choices=1

# Regex pattern to define which classes are considered mixins.
mixin-class-rgx=.*[Mm]ixin

# List of decorators that change the signature of a decorated function.
signature-mutators=

# Logging modules to check that the string format arguments are in logging
# function parameter format
logging-modules=logging,logger

[VARIABLES]

# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=

# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables=yes

# List of names allowed to shadow builtins
allowed-redefined-builtins=

# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,
          _cb

# A regular expression matching the name of dummy variables (i.e. expected to
# not be used).
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_

# Argument names that match this expression will be ignored.
ignored-argument-names=_.*|^ignored_|^unused_

# Tells whether we should check for unused import in __init__ files.
init-import=no

# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io

# A regular expression matching the name of dummy variables (i.e. expectedly
# not used).
dummy-variables-rgx=(unused)?_.*|dummy
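Both `dummy-variables-rgx` values above are regular expressions that tell pylint which assigned-but-unused names to tolerate. A sketch against the newer pattern (the variable names are hypothetical):

```python
import re

# The newer dummy-variables-rgx value from the [VARIABLES] section above.
DUMMY_RGX = re.compile(r"_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_")

for name in ("_", "_response", "unused_result", "result"):
    # Only "result" fails to match and would trigger unused-variable.
    print(name, bool(DUMMY_RGX.match(name)))
```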


[SIMILARITIES]

# Minimum lines number of a similarity.
min-similarity-lines=6

# Ignore comments when computing similarities.
ignore-comments=yes

# Ignore docstrings when computing similarities.
ignore-docstrings=yes

# Ignore imports when computing similarities.
ignore-imports=yes


[FORMAT]

# Maximum number of characters on a single line.
max-line-length=100

# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$

# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no

# List of optional constructs for which whitespace checking is disabled
no-space-check=trailing-comma

# Maximum number of lines in a module
max-module-lines=1250

# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
# tab).
indent-string='    '

# Number of spaces of indent required inside a hanging or continued line.
# This does something silly/broken...
#indent-after-paren=4

[TYPECHECK]

# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes

# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis).
ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib
# import errors ignored only in 1.4.4
# https://bitbucket.org/logilab/pylint/commits/cd000904c9e2

# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamically set).
ignored-classes=Field,Header,JWS,closing

# When zope mode is activated, add a predefined set of Zope acquired attributes
# to generated-members.
zope=yes

# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E0201 when accessed. Python regular
# expressions are accepted.
generated-members=REQUEST,acl_users,aq_parent


[IMPORTS]

# Deprecated modules which should not be used, separated by a comma
deprecated-modules=regsub,TERMIOS,Bastion,rexec

# Create a graph of every (i.e. internal and external) dependency in the
# given file (report RP0402 must not be disabled)
import-graph=

# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=

# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=


[CLASSES]

# List of interface methods to ignore, separated by a comma. This is used for
# instance to not check methods defined in Zope's Interface base class.
ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by,implementedBy,providedBy

# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,__new__,setUp

# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls

# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs


[EXCEPTIONS]

# Exceptions that will emit a warning when being caught. Defaults to
# "Exception".
overgeneral-exceptions=Exception

309  .travis.yml  Normal file
@@ -0,0 +1,309 @@
language: python
dist: xenial

cache:
  directories:
    - $HOME/.cache/pip

before_script:
  - 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
  # On Travis, the fastest parallelization for integration tests has proved to be 4.
  - 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
  # Use the Travis retry feature for farm tests since they are flaky.
  - 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
  - export TOX_TESTENV_PASSENV=TRAVIS

# Only build pushes to the master branch, PRs, and branches beginning with
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
# simultaneous Travis runs, which speeds turnaround time on review since there
# is a cap on the number of simultaneous runs.
branches:
  only:
    # apache-parser-v2 is a temporary branch for doing work related to
    # rewriting the parser in the Apache plugin.
    - apache-parser-v2
    - master
    - /^\d+\.\d+\.x$/
    - /^test-.*$/

# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
not-on-master: &not-on-master
  if: NOT (type = push AND branch = master)

# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
  if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
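The two aliased mappings above rely on YAML anchors (`&name`) and merge keys (`<<: *name`): each job in the build matrix pulls in the shared `if:` condition by reference instead of repeating it. A minimal sketch of the pattern (the job shown is hypothetical):

```yaml
not-on-master: &not-on-master
  if: NOT (type = push AND branch = master)

matrix:
  include:
    - python: "3.7"
      env: TOXENV=lint
      <<: *not-on-master   # merges the shared if: condition into this job
```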

matrix:
  include:
    # Main test suite
    - python: "2.7"
      env: ACME_SERVER=pebble TOXENV=integration
      <<: *not-on-master

    # This job is always executed, including on master
    - python: "2.7"
      env: TOXENV=py27-cover FYI="py27 tests + code coverage"

    - python: "3.7"
      env: TOXENV=lint
      <<: *not-on-master
    - python: "3.5"
      env: TOXENV=mypy
      <<: *not-on-master
    - python: "2.7"
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      env: TOXENV='py27-{acme,apache,apache-v2,certbot,dns,nginx}-oldest'
      <<: *not-on-master
    - python: "3.4"
      env: TOXENV=py34
      <<: *not-on-master
    - python: "3.7"
      env: TOXENV=py37
      <<: *not-on-master
    - python: "3.8"
      env: TOXENV=py38
      <<: *not-on-master
    - sudo: required
      env: TOXENV=apache_compat
      services: docker
      before_install:
      addons:
      <<: *not-on-master
    - sudo: required
      env: TOXENV=le_auto_xenial
      services: docker
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=apacheconftest-with-pebble
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=nginxroundtrip
      <<: *not-on-master

    # Extended test suite on cron jobs and pushes to tested branches other than master
    - sudo: required
      env: TOXENV=nginx_compat
      services: docker
      before_install:
      addons:
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-apache2
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-leauto-upgrades
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      git:
        depth: false  # This is needed to have the history to checkout old versions of certbot-auto.
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-certonly-standalone
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-sdists
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "3.7"
      env: TOXENV=py37 CERTBOT_NO_PIN=1
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.4"
      env: TOXENV=py34
      <<: *extended-test-suite
    - python: "3.5"
      env: TOXENV=py35
      <<: *extended-test-suite
    - python: "3.6"
      env: TOXENV=py36
      <<: *extended-test-suite
    - python: "3.7"
      env: TOXENV=py37
      <<: *extended-test-suite
    - python: "3.8"
      env: TOXENV=py38
      <<: *extended-test-suite
    - python: "3.4"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.4"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.5"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.5"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.6"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.6"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.8"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      <<: *extended-test-suite
    - python: "3.8"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_jessie
      services: docker
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_centos6
      services: docker
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=docker_dev
      services: docker
      addons:
        apt:
          packages:  # don't install nginx and apache
            - libaugeas0
      <<: *extended-test-suite
    - language: generic
      env: TOXENV=py27
      os: osx
      # Using this osx_image is a workaround for
      # https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
      osx_image: xcode10.2
      addons:
        homebrew:
          packages:
            - augeas
            - python2
      <<: *extended-test-suite
    - language: generic
      env: TOXENV=py3
      os: osx
      # Using this osx_image is a workaround for
      # https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
      osx_image: xcode10.2
      addons:
        homebrew:
          packages:
            - augeas
            - python3
      <<: *extended-test-suite

# container-based infrastructure
sudo: false

addons:
  apt:
    packages:  # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
      - python-dev
      - gcc
      - libaugeas0
      - libssl-dev
      - libffi-dev
      - ca-certificates
      # For certbot-nginx integration testing
      - nginx-light
      - openssl

# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: 'tools/pip_install.py -U codecov tox virtualenv'
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
# script command. It is set only to `travis_retry` during farm tests, in
# order to trigger the Travis retry feature, and compensate the inherent
# flakiness of these specific tests.
script: '$TRAVIS_RETRY tox'

after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'

notifications:
  email: false
  irc:
    channels:
      # This is set to a secure variable to prevent forks from sending
      # notifications. This value was created by installing
      # https://github.com/travis-ci/travis.rb and running
      # `travis encrypt "chat.freenode.net#certbot-devel"`.
      - secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
    on_cancel: never
    on_success: never
    on_failure: always
41
AUTHORS.md
@@ -1,7 +1,6 @@
Authors
=======

* [Aaron Gable](https://github.com/aarongable)
* [Aaron Zirbes](https://github.com/aaronzirbes)
* Aaron Zuehlke
* Ada Lovelace
@@ -17,16 +16,11 @@ Authors
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Alexis Hancock](https://github.com/zoracon)
* [Amir Omidi](https://github.com/aaomidi)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [amplifi](https://github.com/amplifi)
* [Andrew Murray](https://github.com/radarhere)
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anna Glasgall](https://github.com/aglasgall)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [April King](https://github.com/april)
* [asaph](https://github.com/asaph)
* [Axel Beckert](https://github.com/xtaran)
* [Bas](https://github.com/Mechazawa)
@@ -41,9 +35,7 @@ Authors
* [Blake Griffith](https://github.com/cowlicks)
* [Brad Warren](https://github.com/bmw)
* [Brandon Kraft](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/BKreisel)
* [Brian Heim](https://github.com/brianlheim)
* [Cameron Steel](https://github.com/Tugzrida)
* [Brandon Kreisel](https://github.com/kraftbj)
* [Ceesjan Luiten](https://github.com/quinox)
* [Chad Whitacre](https://github.com/whit537)
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
@@ -65,11 +57,8 @@ Authors
* [DanCld](https://github.com/DanCld)
* [Daniel Albers](https://github.com/AID)
* [Daniel Aleksandersen](https://github.com/da2x)
* [Daniel Almasi](https://github.com/almasen)
* [Daniel Convissor](https://github.com/convissor)
* [Daniel "Drex" Drexler](https://github.com/aeturnum)
* [Daniel Huang](https://github.com/dhuang)
* [Daniel McMahon](https://github.com/igloodan)
* [Dave Guarino](https://github.com/daguar)
* [David cz](https://github.com/dave-cz)
* [David Dworken](https://github.com/ddworken)
@@ -93,8 +82,6 @@ Authors
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Florian Klink](https://github.com/flokli)
* [Francesco Colista](https://github.com/fcolista)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
@@ -113,18 +100,14 @@ Authors
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
* [Henri Salo](https://github.com/fgeek)
* [Henry Chen](https://github.com/henrychen95)
* [Hugo van Kemenade](https://github.com/hugovk)
* [Ingolf Becker](https://github.com/watercrossing)
* [Ivan Nejgebauer](https://github.com/inejge)
* [Jaap Eldering](https://github.com/eldering)
* [Jacob Hoffman-Andrews](https://github.com/jsha)
* [Jacob Sachs](https://github.com/jsachs)
* [Jairo Llopis](https://github.com/Yajo)
* [Jakub Warmuz](https://github.com/kuba)
* [James Balazs](https://github.com/jamesbalazs)
* [James Kasten](https://github.com/jdkasten)
* [Jason Grinblat](https://github.com/ptychomancer)
* [Jawshua](https://github.com/jawshua)
* [Jay Faulkner](https://github.com/jayofdoom)
* [J.C. Jones](https://github.com/jcjones)
* [Jeff Hodges](https://github.com/jmhodges)
@@ -139,15 +122,12 @@ Authors
* [John Reed](https://github.com/leerspace)
* [Jonas Berlin](https://github.com/xkr47)
* [Jonathan Herlin](https://github.com/Jonher937)
* [Jonathan Vanasco](https://github.com/jvanasco)
* [Jon Walsh](https://github.com/code-tree)
* [Joona Hoikkala](https://github.com/joohoi)
* [Josh McCullough](https://github.com/JoshMcCullough)
* [Josh Soref](https://github.com/jsoref)
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
* [Kane York](https://github.com/riking)
* [Katsuyoshi Ozaki](https://github.com/moratori)
* [Kenichi Maehashi](https://github.com/kmaehashi)
* [Kenneth Skovhede](https://github.com/kenkendk)
* [Kevin Burke](https://github.com/kevinburke)
@@ -156,20 +136,16 @@ Authors
* [LeCoyote](https://github.com/LeCoyote)
* [Lee Watson](https://github.com/TheReverend403)
* [Leo Famulari](https://github.com/lfam)
* [Leon G](https://github.com/LeonGr)
* [lf](https://github.com/lf-)
* [Liam Marshall](https://github.com/liamim)
* [Lior Sabag](https://github.com/liorsbg)
* [Lipis](https://github.com/lipis)
* [lord63](https://github.com/lord63)
* [Lorenzo Fundaró](https://github.com/lfundaro)
* [Luca Beltrame](https://github.com/lbeltrame)
* [Luca Ebach](https://github.com/lucebac)
* [Luca Olivetti](https://github.com/olivluca)
* [Luke Rogers](https://github.com/lukeroge)
* [Lukhnos Liu](https://github.com/lukhnos)
* [Maarten](https://github.com/mrtndwrd)
* [Mads Jensen](https://github.com/atombrella)
* [Maikel Martens](https://github.com/krukas)
* [Malte Janduda](https://github.com/MalteJ)
* [Mantas Mikulėnas](https://github.com/grawity)
@@ -185,7 +161,6 @@ Authors
* [Mathieu Leduc-Hamel](https://github.com/mlhamel)
* [Matt Bostock](https://github.com/mattbostock)
* [Matthew Ames](https://github.com/SuperMatt)
* [Matthew W. Thomas](https://github.com/mwt)
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
@@ -210,33 +185,25 @@ Authors
* [osirisinferi](https://github.com/osirisinferi)
* Patrick Figel
* [Patrick Heppler](https://github.com/PatrickHeppler)
* [Paul Buonopane](https://github.com/Zenexer)
* [Paul Feitzinger](https://github.com/pfeyz)
* [Paulo Dias](https://github.com/paulojmdias)
* [Pavan Gupta](https://github.com/pavgup)
* [Pavel Pavlov](https://github.com/ghost355)
* [Peter Conrad](https://github.com/pconrad-fb)
* [Peter Eckersley](https://github.com/pde)
* [Peter Mosmans](https://github.com/PeterMosmans)
* [Phil Martin](https://github.com/frillip)
* [Philippe Langlois](https://github.com/langloisjp)
* [Philipp Spitzer](https://github.com/spitza)
* [Piero Steinger](https://github.com/Jadaw1n)
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Preston Locke](https://github.com/Preston12321)
* [Q Misell](https://magicalcodewit.ch)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
* [Rémy Léone](https://github.com/sieben)
* [Richard Barnes](https://github.com/r-barnes)
* [Richard Harman](https://github.com/warewolf)
* [Richard Panek](https://github.com/kernelpanek)
* [Robert Buchholz](https://github.com/rbu)
* [Robert Dailey](https://github.com/pahrohfit)
* [Robert Habermann](https://github.com/frennkie)
* [Robert Xiao](https://github.com/nneonneo)
* [Roland Shoemaker](https://github.com/rolandshoemaker)
@@ -246,7 +213,6 @@ Authors
* [Sagi Kedmi](https://github.com/sagi)
* [Sam Lanning](https://github.com/samlanning)
* [sapics](https://github.com/sapics)
* [SATOH Fumiyasu](https://github.com/fumiyas)
* [Scott Barr](https://github.com/scottjbarr)
* [Scott Merrill](https://github.com/skpy)
* [Sebastian Bögl](https://github.com/TheBoegl)
@@ -263,11 +229,9 @@ Authors
* [Spencer Bliven](https://github.com/sbliven)
* [Stacey Sheldon](https://github.com/solidgoldbomb)
* [Stavros Korokithakis](https://github.com/skorokithakis)
* [Ștefan Talpalaru](https://github.com/stefantalpalaru)
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [taixx046](https://github.com/taixx046)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)
@@ -292,7 +256,6 @@ Authors
* [Wilfried Teiken](https://github.com/wteiken)
* [Willem Fibbe](https://github.com/fibbers)
* [William Budington](https://github.com/Hainish)
* [Will Greenberg](https://github.com/wgreenberg)
* [Will Newby](https://github.com/willnewby)
* [Will Oller](https://github.com/willoller)
* [Yan](https://github.com/diracdeltas)
@@ -300,7 +263,5 @@ Authors
* [Yomna](https://github.com/ynasser)
* [Yoni Jah](https://github.com/yonjah)
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
* [Yuseong Cho](https://github.com/g6123)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)
* [Shahar Naveh](https://github.com/ShaharNaveh)

@@ -1,8 +1,5 @@
<!---

zoracon: (This is an old comment below, not sure how accurate this is anymore.
Since Github seems to lean more towards Markdown these days, it's still probably accurate)

This file serves as an entry point for GitHub's Contributing
Guidelines [1] only.

@@ -14,7 +11,7 @@ to the Sphinx generated docs is provided below.


[1] https://github.com/blog/1184-contributing-guidelines
[2] https://docutils.sourceforge.io/docs/user/rst/quickref.html#hyperlink-targets
[2] http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets

-->

@@ -22,15 +19,19 @@ to the Sphinx generated docs is provided below.

Hi! Welcome to the Certbot project. We look forward to collaborating with you.

If you're reporting a bug in Certbot, please open an issue: https://github.com/certbot/certbot/issues/new/choose.

If you're having trouble using Certbot and aren't sure you've found a bug, please first try asking for help at https://community.letsencrypt.org/. There is a much larger community there of people familiar with the project who will be able to more quickly answer your questions.
If you're reporting a bug in Certbot, please make sure to include:
- The version of Certbot you're running.
- The operating system you're running it on.
- The commands you ran.
- What you expected to happen, and
- What actually happened.

If you're a developer, we have some helpful information in our
[Developer's Guide](https://certbot.eff.org/docs/contributing.html) to get you
started. In particular, we recommend you read these sections:
started. In particular, we recommend you read these sections

- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)
- [Finding issues to work on](https://certbot.eff.org/docs/contributing.html#find-issues-to-work-on)
- [Coding style](https://certbot.eff.org/docs/contributing.html#coding-style)
- [Submitting a pull request](https://certbot.eff.org/docs/contributing.html#submitting-a-pull-request)
- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)

20
Dockerfile-dev
Normal file
@@ -0,0 +1,20 @@
# This Dockerfile builds an image for development.
FROM debian:buster

# Note: this only exposes the port to other docker containers.
EXPOSE 80 443

WORKDIR /opt/certbot/src

COPY . .
RUN apt-get update && \
    apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
        libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* \
        /tmp/* \
        /var/tmp/*

RUN VENV_NAME="../venv3" python3 tools/venv3.py

ENV PATH /opt/certbot/venv3/bin:$PATH
22
ISSUE_TEMPLATE.md
Normal file
@@ -0,0 +1,22 @@
If you're having trouble using Certbot and aren't sure you've found a bug or
request for a new feature, please first try asking for help at
https://community.letsencrypt.org/. There is a much larger community there of
people familiar with the project who will be able to more quickly answer your
questions.

## My operating system is (include version):


## I installed Certbot with (certbot-auto, OS package manager, pip, etc):


## I ran this command and it produced this output:


## Certbot's behavior differed from what I expected because:


## Here is a Certbot log showing the issue (if available):
###### Logs are stored in `/var/log/letsencrypt` by default. Feel free to redact domains, e-mail and IP addresses as you see fit.

## Here is the relevant nginx server block or Apache virtualhost for the domain I am configuring:
16
SECURITY.md
@@ -1,16 +0,0 @@
# Security Policy

## Supported Versions

Explanation on supported versions [here](https://github.com/certbot/certbot/wiki/Architectural-Decision-Records#-update-to-certbots-version-policy-and-end-of-life-support-on-previous-major-versions)

| Major Version | Support Level |
| ------- | ------------------ |
| >= 4.0 | Full Support |
| 3.x | Discretionary Backports |
| <=2.x | None |


## Reporting a Vulnerability

Security vulnerabilities can be reported using GitHub's [private vulnerability reporting tool](https://github.com/certbot/certbot/security/advisories/new).
@@ -1,33 +0,0 @@
|
||||
# Read the Docs configuration file for Sphinx projects
|
||||
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
|
||||
|
||||
# Required
|
||||
version: 2
|
||||
|
||||
# Set the OS, Python version and other tools you might need
|
||||
build:
|
||||
os: ubuntu-22.04
|
||||
tools:
|
||||
python: "3.11"
|
||||
# You can also specify other tool versions:
|
||||
|
||||
|
||||
# Build documentation in the "docs/" directory with Sphinx
|
||||
sphinx:
|
||||
configuration: acme/docs/conf.py
|
||||
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
|
||||
# builder: "dirhtml"
|
||||
# Fail on all warnings to avoid broken references
|
||||
fail_on_warning: true
|
||||
|
||||
# Optionally build your docs in additional formats such as PDF and ePub
|
||||
formats:
|
||||
- pdf
|
||||
- epub
|
||||
|
||||
# Optional but recommended, declare the Python requirements required
|
||||
# to build your documentation
|
||||
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
|
||||
python:
|
||||
install:
|
||||
- requirements: acme/readthedocs.org.requirements.txt
|
||||
@@ -1,8 +1,8 @@
include LICENSE.txt
include README.rst
include pytest.ini
recursive-include docs *
recursive-include examples *
recursive-include src/acme/_internal/tests/testdata *
include src/acme/py.typed
recursive-include tests *
global-exclude __pycache__
global-exclude *.py[cod]

@@ -2,7 +2,7 @@

This module is an implementation of the `ACME protocol`_.

.. _`ACME protocol`: https://datatracker.ietf.org/doc/html/rfc8555
.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme

"""
import sys
@@ -20,10 +20,3 @@ for mod in list(sys.modules):
    # preserved (acme.jose.* is josepy.*)
    if mod == 'josepy' or mod.startswith('josepy.'):
        sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]

if sys.version_info[:2] == (3, 9):
    warnings.warn(
        "Python 3.9 support will be dropped in the next planned release of "
        "acme. Please upgrade your Python version.",
        DeprecationWarning,
    )  # pragma: no cover
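The `acme/__init__.py` hunk above keeps the legacy `acme.jose.*` import paths working by pointing extra `sys.modules` entries at the already-imported `josepy` modules. A minimal self-contained sketch of that `sys.modules` aliasing technique, using made-up module names (`realpkg`, `legacy.alias`) instead of josepy:

```python
import importlib
import sys
import types

# Stand-in for a real package such as josepy (hypothetical name).
real = types.ModuleType("realpkg")
real.answer = 42
sys.modules["realpkg"] = real

# Register a parent package, then alias the real module under a legacy
# dotted name, mirroring the sys.modules assignment in the hunk above.
sys.modules["legacy"] = types.ModuleType("legacy")
sys.modules["legacy.alias"] = sys.modules["realpkg"]

# Importing the legacy name now yields the very same module object,
# because the import machinery consults sys.modules first.
aliased = importlib.import_module("legacy.alias")
assert aliased is real
```

Because both names resolve to one module object, attributes stay in sync; no re-execution of the module happens on the aliased import.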
470
acme/acme/challenges.py
Normal file
@@ -0,0 +1,470 @@
"""ACME Identifier Validation Challenges."""
import abc
import functools
import hashlib
import logging

from cryptography.hazmat.primitives import hashes  # type: ignore
import josepy as jose
import requests
import six

from acme import fields

logger = logging.getLogger(__name__)


class Challenge(jose.TypedJSONObjectWithFields):
    # _fields_to_partial_json
    """ACME challenge."""
    TYPES = {}  # type: dict

    @classmethod
    def from_json(cls, jobj):
        try:
            return super(Challenge, cls).from_json(jobj)
        except jose.UnrecognizedTypeError as error:
            logger.debug(error)
            return UnrecognizedChallenge.from_json(jobj)


class ChallengeResponse(jose.TypedJSONObjectWithFields):
    # _fields_to_partial_json
    """ACME challenge response."""
    TYPES = {}  # type: dict
    resource_type = 'challenge'
    resource = fields.Resource(resource_type)


class UnrecognizedChallenge(Challenge):
    """Unrecognized challenge.

    ACME specification defines a generic framework for challenges and
    defines some standard challenges that are implemented in this
    module. However, other implementations (including peers) might
    define additional challenge types, which should be ignored if
    unrecognized.

    :ivar jobj: Original JSON decoded object.

    """

    def __init__(self, jobj):
        super(UnrecognizedChallenge, self).__init__()
        object.__setattr__(self, "jobj", jobj)

    def to_partial_json(self):
        return self.jobj  # pylint: disable=no-member

    @classmethod
    def from_json(cls, jobj):
        return cls(jobj)


class _TokenChallenge(Challenge):
    """Challenge with token.

    :ivar bytes token:

    """
    TOKEN_SIZE = 128 / 8  # Based on the entropy value from the spec
    """Minimum size of the :attr:`token` in bytes."""

    # TODO: acme-spec doesn't specify token as base64-encoded value
    token = jose.Field(
        "token", encoder=jose.encode_b64jose, decoder=functools.partial(
            jose.decode_b64jose, size=TOKEN_SIZE, minimum=True))

    # XXX: rename to ~token_good_for_url
    @property
    def good_token(self):  # XXX: @token.decoder
        """Is `token` good?

        .. todo:: acme-spec wants "It MUST NOT contain any non-ASCII
           characters", but it should also warrant that it doesn't
           contain ".." or "/"...

        """
        # TODO: check that path combined with uri does not go above
        # URI_ROOT_PATH!
        # pylint: disable=unsupported-membership-test
        return b'..' not in self.token and b'/' not in self.token


class KeyAuthorizationChallengeResponse(ChallengeResponse):
    """Response to Challenges based on Key Authorization.

    :param unicode key_authorization:

    """
    key_authorization = jose.Field("keyAuthorization")
    thumbprint_hash_function = hashes.SHA256

    def verify(self, chall, account_public_key):
        """Verify the key authorization.

        :param KeyAuthorization chall: Challenge that corresponds to
            this response.
        :param JWK account_public_key:

        :return: ``True`` iff verification of the key authorization was
            successful.
        :rtype: bool

        """
        parts = self.key_authorization.split('.')
        if len(parts) != 2:
            logger.debug("Key authorization (%r) is not well formed",
                         self.key_authorization)
            return False

        if parts[0] != chall.encode("token"):
            logger.debug("Mismatching token in key authorization: "
                         "%r instead of %r", parts[0], chall.encode("token"))
            return False

        thumbprint = jose.b64encode(account_public_key.thumbprint(
            hash_function=self.thumbprint_hash_function)).decode()
        if parts[1] != thumbprint:
            logger.debug("Mismatching thumbprint in key authorization: "
                         "%r instead of %r", parts[0], thumbprint)
            return False

        return True

    def to_partial_json(self):
        jobj = super(KeyAuthorizationChallengeResponse, self).to_partial_json()
        jobj.pop('keyAuthorization', None)
        return jobj


@six.add_metaclass(abc.ABCMeta)
class KeyAuthorizationChallenge(_TokenChallenge):
    """Challenge based on Key Authorization.

    :param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
        that will be used to generate `response`.
    :param str typ: type of the challenge
    """
    typ = NotImplemented
    response_cls = NotImplemented
    thumbprint_hash_function = (
        KeyAuthorizationChallengeResponse.thumbprint_hash_function)

    def key_authorization(self, account_key):
        """Generate Key Authorization.

        :param JWK account_key:
        :rtype unicode:

        """
        return self.encode("token") + "." + jose.b64encode(
            account_key.thumbprint(
                hash_function=self.thumbprint_hash_function)).decode()

    def response(self, account_key):
        """Generate response to the challenge.

        :param JWK account_key:

        :returns: Response (initialized `response_cls`) to the challenge.
        :rtype: KeyAuthorizationChallengeResponse

        """
        return self.response_cls(  # pylint: disable=not-callable
            key_authorization=self.key_authorization(account_key))

    @abc.abstractmethod
    def validation(self, account_key, **kwargs):
        """Generate validation for the challenge.

        Subclasses must implement this method, but they are likely to
        return completely different data structures, depending on what's
        necessary to complete the challenge. Interpretation of that
        return value must be known to the caller.

        :param JWK account_key:
        :returns: Challenge-specific validation.

        """
        raise NotImplementedError()  # pragma: no cover

    def response_and_validation(self, account_key, *args, **kwargs):
        """Generate response and validation.

        Convenience function that return results of `response` and
        `validation`.

        :param JWK account_key:
        :rtype: tuple

        """
        return (self.response(account_key),
                self.validation(account_key, *args, **kwargs))


@ChallengeResponse.register
class DNS01Response(KeyAuthorizationChallengeResponse):
    """ACME dns-01 challenge response."""
    typ = "dns-01"

    def simple_verify(self, chall, domain, account_public_key):  # pylint: disable=unused-argument
        """Simple verify.

        This method no longer checks DNS records and is a simple wrapper
        around `KeyAuthorizationChallengeResponse.verify`.

        :param challenges.DNS01 chall: Corresponding challenge.
        :param unicode domain: Domain name being verified.
        :param JWK account_public_key: Public key for the key pair
            being authorized.

        :return: ``True`` iff verification of the key authorization was
            successful.
        :rtype: bool

        """
        verified = self.verify(chall, account_public_key)
        if not verified:
            logger.debug("Verification of key authorization in response failed")
        return verified


@Challenge.register
class DNS01(KeyAuthorizationChallenge):
    """ACME dns-01 challenge."""
    response_cls = DNS01Response
    typ = response_cls.typ

    LABEL = "_acme-challenge"
    """Label clients prepend to the domain name being validated."""

    def validation(self, account_key, **unused_kwargs):
        """Generate validation.

        :param JWK account_key:
        :rtype: unicode

        """
        return jose.b64encode(hashlib.sha256(self.key_authorization(
            account_key).encode("utf-8")).digest()).decode()

    def validation_domain_name(self, name):
        """Domain name for TXT validation record.

        :param unicode name: Domain name being validated.

        """
        return "{0}.{1}".format(self.LABEL, name)


@ChallengeResponse.register
class HTTP01Response(KeyAuthorizationChallengeResponse):
    """ACME http-01 challenge response."""
    typ = "http-01"

    PORT = 80
    """Verification port as defined by the protocol.

    You can override it (e.g. for testing) by passing ``port`` to
    `simple_verify`.

    """

    WHITESPACE_CUTSET = "\n\r\t "
    """Whitespace characters which should be ignored at the end of the body."""

    def simple_verify(self, chall, domain, account_public_key, port=None):
        """Simple verify.

        :param challenges.SimpleHTTP chall: Corresponding challenge.
        :param unicode domain: Domain name being verified.
        :param JWK account_public_key: Public key for the key pair
            being authorized.
        :param int port: Port used in the validation.

        :returns: ``True`` iff validation with the files currently served by the
            HTTP server is successful.
        :rtype: bool

        """
        if not self.verify(chall, account_public_key):
            logger.debug("Verification of key authorization in response failed")
            return False

        # TODO: ACME specification defines URI template that doesn't
        # allow to use a custom port... Make sure port is not in the
        # request URI, if it's standard.
        if port is not None and port != self.PORT:
            logger.warning(
                "Using non-standard port for http-01 verification: %s", port)
            domain += ":{0}".format(port)

        uri = chall.uri(domain)
        logger.debug("Verifying %s at %s...", chall.typ, uri)
        try:
            http_response = requests.get(uri)
        except requests.exceptions.RequestException as error:
            logger.error("Unable to reach %s: %s", uri, error)
            return False
        logger.debug("Received %s: %s. Headers: %s", http_response,
                     http_response.text, http_response.headers)

        challenge_response = http_response.text.rstrip(self.WHITESPACE_CUTSET)
        if self.key_authorization != challenge_response:
            logger.debug("Key authorization from response (%r) doesn't match "
                         "HTTP response (%r)", self.key_authorization,
                         challenge_response)
            return False

        return True


@Challenge.register
class HTTP01(KeyAuthorizationChallenge):
    """ACME http-01 challenge."""
    response_cls = HTTP01Response
    typ = response_cls.typ

    URI_ROOT_PATH = ".well-known/acme-challenge"
    """URI root path for the server provisioned resource."""

    @property
    def path(self):
        """Path (starting with '/') for provisioned resource.

        :rtype: string

        """
        return '/' + self.URI_ROOT_PATH + '/' + self.encode('token')

    def uri(self, domain):
        """Create an URI to the provisioned resource.

        Forms an URI to the HTTPS server provisioned resource
        (containing :attr:`~SimpleHTTP.token`).

        :param unicode domain: Domain name being verified.
        :rtype: string

        """
        return "http://" + domain + self.path

    def validation(self, account_key, **unused_kwargs):
        """Generate validation.

        :param JWK account_key:
        :rtype: unicode

        """
        return self.key_authorization(account_key)


@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
    """ACME TLS-ALPN-01 challenge response.

    This class only allows initiating a TLS-ALPN-01 challenge returned from the
    CA. Full support for responding to TLS-ALPN-01 challenges by generating and
    serving the expected response certificate is not currently provided.
    """
    typ = "tls-alpn-01"


@Challenge.register
class TLSALPN01(KeyAuthorizationChallenge):
    """ACME tls-alpn-01 challenge.

    This class simply allows parsing the TLS-ALPN-01 challenge returned from
    the CA. Full TLS-ALPN-01 support is not currently provided.

    """
    typ = "tls-alpn-01"
    response_cls = TLSALPN01Response

    def validation(self, account_key, **kwargs):
        """Generate validation for the challenge."""
        raise NotImplementedError()


@Challenge.register
class DNS(_TokenChallenge):
    """ACME "dns" challenge."""
    typ = "dns"

    LABEL = "_acme-challenge"
    """Label clients prepend to the domain name being validated."""

    def gen_validation(self, account_key, alg=jose.RS256, **kwargs):
        """Generate validation.

        :param .JWK account_key: Private account key.
        :param .JWA alg:

        :returns: This challenge wrapped in `.JWS`
        :rtype: .JWS

        """
        return jose.JWS.sign(
            payload=self.json_dumps(sort_keys=True).encode('utf-8'),
            key=account_key, alg=alg, **kwargs)

    def check_validation(self, validation, account_public_key):
        """Check validation.

        :param JWS validation:
        :param JWK account_public_key:
        :rtype: bool

        """
        if not validation.verify(key=account_public_key):
            return False
        try:
            return self == self.json_loads(
                validation.payload.decode('utf-8'))
        except jose.DeserializationError as error:
            logger.debug("Checking validation for DNS failed: %s", error)
            return False

    def gen_response(self, account_key, **kwargs):
        """Generate response.

        :param .JWK account_key: Private account key.
        :param .JWA alg:

        :rtype: DNSResponse

        """
        return DNSResponse(validation=self.gen_validation(
            account_key, **kwargs))

    def validation_domain_name(self, name):
        """Domain name for TXT validation record.

        :param unicode name: Domain name being validated.

        """
        return "{0}.{1}".format(self.LABEL, name)


@ChallengeResponse.register
class DNSResponse(ChallengeResponse):
    """ACME "dns" challenge response.

    :param JWS validation:

    """
    typ = "dns"

    validation = jose.Field("validation", decoder=jose.JWS.from_json)

    def check_validation(self, chall, account_public_key):
        """Check validation.

        :param challenges.DNS chall:
||||
:param JWK account_public_key:
|
||||
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
return chall.check_validation(self.validation, account_public_key)
|
||||
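Both the HTTP path and the DNS record name above reduce to simple string construction. A minimal standalone sketch, assuming the conventional `/.well-known/acme-challenge` root path and an illustrative made-up token (neither value comes from this diff):

```python
import base64

# Hypothetical token bytes, for illustration only.
token = b"\x01\x02\x03evaGxfADs"

# http-01 style: base64url-encode the token without padding and build the
# well-known path, mirroring the path property above.
URI_ROOT_PATH = ".well-known/acme-challenge"  # assumed value
encoded = base64.urlsafe_b64encode(token).rstrip(b"=").decode("ascii")
path = "/" + URI_ROOT_PATH + "/" + encoded

# dns-01 style: prepend the challenge label, mirroring validation_domain_name.
LABEL = "_acme-challenge"
txt_name = "{0}.{1}".format(LABEL, "example.com")

print(path)
print(txt_name)  # _acme-challenge.example.com
```

The `rstrip(b"=")` matters: ACME uses unpadded base64url, so a padded token in the URL would never match the server's expectation.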
acme/acme/client.py (new file, 1186 lines): diff suppressed because it is too large

acme/acme/crypto_util.py (new file, 307 lines)
@@ -0,0 +1,307 @@
"""Crypto utilities."""
import binascii
import contextlib
import logging
import os
import re
import socket

import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL  # type: ignore # https://github.com/python/typeshed/issues/2052

from acme import errors
from acme.magic_typing import Callable  # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Optional  # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Tuple  # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union  # pylint: disable=unused-import, no-name-in-module

logger = logging.getLogger(__name__)

# Default SSL method selected here is the most compatible, while secure
# SSL method: TLSv1_METHOD is only compatible with
# TLSv1_METHOD, while SSLv23_METHOD is compatible with all other
# methods, including TLSv2_METHOD (read more at
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
# in case it's used for things other than probing/serving!
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD  # type: ignore


class SSLSocket(object):
    """SSL wrapper for sockets.

    :ivar socket sock: Original wrapped socket.
    :ivar dict certs: Mapping from domain names (`bytes`) to
        `OpenSSL.crypto.X509`.
    :ivar method: See `OpenSSL.SSL.Context` for allowed values.

    """
    def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
        self.sock = sock
        self.certs = certs
        self.method = method

    def __getattr__(self, name):
        return getattr(self.sock, name)

    def _pick_certificate_cb(self, connection):
        """SNI certificate callback.

        This method will set a new OpenSSL context object for this
        connection when an incoming connection provides an SNI name
        (in order to serve the appropriate certificate, if any).

        :param connection: The TLS connection object on which the SNI
            extension was received.
        :type connection: :class:`OpenSSL.Connection`

        """
        server_name = connection.get_servername()
        try:
            key, cert = self.certs[server_name]
        except KeyError:
            logger.debug("Server name (%s) not recognized, dropping SSL",
                         server_name)
            return
        new_context = SSL.Context(self.method)
        new_context.set_options(SSL.OP_NO_SSLv2)
        new_context.set_options(SSL.OP_NO_SSLv3)
        new_context.use_privatekey(key)
        new_context.use_certificate(cert)
        connection.set_context(new_context)

    class FakeConnection(object):
        """Fake OpenSSL.SSL.Connection."""

        # pylint: disable=missing-docstring

        def __init__(self, connection):
            self._wrapped = connection

        def __getattr__(self, name):
            return getattr(self._wrapped, name)

        def shutdown(self, *unused_args):
            # OpenSSL.SSL.Connection.shutdown doesn't accept any args
            return self._wrapped.shutdown()

    def accept(self):  # pylint: disable=missing-docstring
        sock, addr = self.sock.accept()

        context = SSL.Context(self.method)
        context.set_options(SSL.OP_NO_SSLv2)
        context.set_options(SSL.OP_NO_SSLv3)
        context.set_tlsext_servername_callback(self._pick_certificate_cb)

        ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
        ssl_sock.set_accept_state()

        logger.debug("Performing handshake with %s", addr)
        try:
            ssl_sock.do_handshake()
        except SSL.Error as error:
            # _pick_certificate_cb might have returned without
            # creating SSL context (wrong server name)
            raise socket.error(error)

        return ssl_sock, addr
def probe_sni(name, host, port=443, timeout=300,
              method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
    """Probe SNI server for SSL certificate.

    :param bytes name: Byte string to send as the server name in the
        client hello message.
    :param bytes host: Host to connect to.
    :param int port: Port to connect to.
    :param int timeout: Timeout in seconds.
    :param method: See `OpenSSL.SSL.Context` for allowed values.
    :param tuple source_address: Enables multi-path probing (selection
        of source interface). See `socket.create_connection` for more
        info. Available only in Python 2.7+.

    :raises acme.errors.Error: In case of any problems.

    :returns: SSL certificate presented by the server.
    :rtype: OpenSSL.crypto.X509

    """
    context = SSL.Context(method)
    context.set_timeout(timeout)

    socket_kwargs = {'source_address': source_address}

    try:
        logger.debug(
            "Attempting to connect to %s:%d%s.", host, port,
            " from {0}:{1}".format(
                source_address[0],
                source_address[1]
            ) if socket_kwargs else ""
        )
        socket_tuple = (host, port)  # type: Tuple[str, int]
        sock = socket.create_connection(socket_tuple, **socket_kwargs)  # type: ignore
    except socket.error as error:
        raise errors.Error(error)

    with contextlib.closing(sock) as client:
        client_ssl = SSL.Connection(context, client)
        client_ssl.set_connect_state()
        client_ssl.set_tlsext_host_name(name)  # pyOpenSSL>=0.13
        try:
            client_ssl.do_handshake()
            client_ssl.shutdown()
        except SSL.Error as error:
            raise errors.Error(error)
    return client_ssl.get_peer_certificate()
def make_csr(private_key_pem, domains, must_staple=False):
    """Generate a CSR containing a list of domains as subjectAltNames.

    :param buffer private_key_pem: Private key, in PEM PKCS#8 format.
    :param list domains: List of DNS names to include in subjectAltNames of CSR.
    :param bool must_staple: Whether to include the TLS Feature extension (aka
        OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
    :returns: buffer PEM-encoded Certificate Signing Request.
    """
    private_key = crypto.load_privatekey(
        crypto.FILETYPE_PEM, private_key_pem)
    csr = crypto.X509Req()
    extensions = [
        crypto.X509Extension(
            b'subjectAltName',
            critical=False,
            value=', '.join('DNS:' + d for d in domains).encode('ascii')
        ),
    ]
    if must_staple:
        extensions.append(crypto.X509Extension(
            b"1.3.6.1.5.5.7.1.24",
            critical=False,
            value=b"DER:30:03:02:01:05"))
    csr.add_extensions(extensions)
    csr.set_pubkey(private_key)
    csr.set_version(2)
    csr.sign(private_key, 'sha256')
    return crypto.dump_certificate_request(
        crypto.FILETYPE_PEM, csr)
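`make_csr` builds the CSR with pyOpenSSL; the same shape (empty subject, DNS names carried in the `subjectAltName` extension) can be sketched with the `cryptography` package instead. This is an illustrative equivalent, not the code from this diff, and the domain names are placeholders:

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import ExtensionOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
domains = ["example.com", "www.example.com"]

csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([]))  # empty subject; identity lives in the SANs
    .add_extension(
        x509.SubjectAlternativeName([x509.DNSName(d) for d in domains]),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)

pem = csr.public_bytes(serialization.Encoding.PEM)
print(pem.splitlines()[0])
```

The must-staple branch in `make_csr` would correspond to additionally adding a `TLSFeature` extension here; it is omitted to keep the sketch short.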
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
    common_name = loaded_cert_or_req.get_subject().CN
    sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)

    if common_name is None:
        return sans
    return [common_name] + [d for d in sans if d != common_name]


def _pyopenssl_cert_or_req_san(cert_or_req):
    """Get Subject Alternative Names from certificate or CSR using pyOpenSSL.

    .. todo:: Implement directly in PyOpenSSL!

    .. note:: Although this is `acme` internal API, it is used by
        `letsencrypt`.

    :param cert_or_req: Certificate or CSR.
    :type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.

    :returns: A list of Subject Alternative Names.
    :rtype: `list` of `unicode`

    """
    # This function finds SANs by dumping the certificate/CSR to text and
    # searching for "X509v3 Subject Alternative Name" in the text. This method
    # is used to support PyOpenSSL version 0.13 where the
    # `_subjectAltNameString` and `get_extensions` methods are not available
    # for CSRs.

    # constants based on PyOpenSSL certificate/CSR text dump
    part_separator = ":"
    parts_separator = ", "
    prefix = "DNS" + part_separator

    if isinstance(cert_or_req, crypto.X509):
        # pylint: disable=line-too-long
        func = crypto.dump_certificate  # type: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]]
    else:
        func = crypto.dump_certificate_request
    text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
    # WARNING: this function does not support multiple SANs extensions.
    # Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
    match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
    # WARNING: this function assumes that no SAN can include
    # parts_separator, hence the split!
    sans_parts = [] if match is None else match.group(1).split(parts_separator)

    return [part.split(part_separator)[1]
            for part in sans_parts if part.startswith(prefix)]
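The text-dump parsing in `_pyopenssl_cert_or_req_san` can be exercised on a canned fragment resembling OpenSSL's text output (the dump below is hand-written for illustration, not real command output):

```python
import re

# Hand-written sample resembling an OpenSSL FILETYPE_TEXT dump.
text = """\
        Requested Extensions:
            X509v3 Subject Alternative Name:
                DNS:example.com, DNS:www.example.com
"""

# Same regex as in the function above: \s* skips the newline and indentation,
# so the capture group is the comma-separated SAN line itself.
match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
sans_parts = [] if match is None else match.group(1).split(", ")
names = [part.split(":")[1] for part in sans_parts if part.startswith("DNS:")]
print(names)  # ['example.com', 'www.example.com']
```

This also makes the documented limitation concrete: a SAN containing `", "` would be split in the wrong place.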
def gen_ss_cert(key, domains, not_before=None,
                validity=(7 * 24 * 60 * 60), force_san=True):
    """Generate new self-signed certificate.

    :type domains: `list` of `unicode`
    :param OpenSSL.crypto.PKey key:
    :param bool force_san:

    If more than one domain is provided, all of the domains are put into
    ``subjectAltName`` X.509 extension and first domain is set as the
    subject CN. If only one domain is provided no ``subjectAltName``
    extension is used, unless `force_san` is ``True``.

    """
    assert domains, "Must provide one or more hostnames for the cert."
    cert = crypto.X509()
    cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
    cert.set_version(2)

    extensions = [
        crypto.X509Extension(
            b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
    ]

    cert.get_subject().CN = domains[0]
    # TODO: what to put into cert.get_subject()?
    cert.set_issuer(cert.get_subject())

    if force_san or len(domains) > 1:
        extensions.append(crypto.X509Extension(
            b"subjectAltName",
            critical=False,
            value=b", ".join(b"DNS:" + d.encode() for d in domains)
        ))

    cert.add_extensions(extensions)

    cert.gmtime_adj_notBefore(0 if not_before is None else not_before)
    cert.gmtime_adj_notAfter(validity)

    cert.set_pubkey(key)
    cert.sign(key, "sha256")
    return cert
def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
    """Dump certificate chain into a bundle.

    :param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
        :class:`josepy.util.ComparableX509`).

    :returns: certificate chain bundle
    :rtype: bytes

    """
    # XXX: returns empty string when no chain is available, which
    # shuts up RenewableCert, but might not be the best solution...

    def _dump_cert(cert):
        if isinstance(cert, jose.ComparableX509):
            # pylint: disable=protected-access
            cert = cert.wrapped
        return crypto.dump_certificate(filetype, cert)

    # assumes that OpenSSL.crypto.dump_certificate includes ending
    # newline character
    return b"".join(_dump_cert(cert) for cert in chain)
@@ -1,18 +1,5 @@
 """ACME errors."""
-import datetime
-import typing
-from typing import Any
-from typing import List
-from typing import Mapping
-from typing import Set
-
 from josepy import errors as jose_errors
-import requests
-
-# We import acme.messages only during type check to avoid circular dependencies. Type references
-# to acme.message.* must be quoted to be lazily initialized and avoid compilation errors.
-if typing.TYPE_CHECKING:
-    from acme import messages  # pragma: no cover
 
 
 class Error(Exception):
@@ -41,12 +28,17 @@ class NonceError(ClientError):
 
 class BadNonce(NonceError):
     """Bad nonce error."""
-    def __init__(self, nonce: str, error: Exception, *args: Any) -> None:
-        super().__init__(*args)
+    def __init__(self, nonce, error, *args, **kwargs):
+        # MyPy complains here that there is too many arguments for BaseException constructor.
+        # This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
+        # The fix is included in MyPy>=0.740, but upgrading it would bring dozen of errors due to
+        # new types definitions. So we ignore the error until the code base is fixed to match
+        # with MyPy>=0.740 referential.
+        super(BadNonce, self).__init__(*args, **kwargs)  # type: ignore
         self.nonce = nonce
         self.error = error
 
-    def __str__(self) -> str:
+    def __str__(self):
         return 'Invalid nonce ({0!r}): {1}'.format(self.nonce, self.error)
@@ -57,14 +49,15 @@ class MissingNonce(NonceError):
     Replay-Nonce header field in each successful response to a POST it
     provides to a client (...)".
 
-    :ivar requests.Response ~.response: HTTP Response
+    :ivar requests.Response response: HTTP Response
 
     """
-    def __init__(self, response: requests.Response, *args: Any) -> None:
-        super().__init__(*args)
+    def __init__(self, response, *args, **kwargs):
+        # See comment in BadNonce constructor above for an explanation of type: ignore here.
+        super(MissingNonce, self).__init__(*args, **kwargs)  # type: ignore
         self.response = response
 
-    def __str__(self) -> str:
+    def __str__(self):
         return ('Server {0} response did not include a replay '
                 'nonce, headers: {1} (This may be a service outage)'.format(
                     self.response.request.method, self.response.headers))
@@ -82,20 +75,17 @@ class PollError(ClientError):
         to the most recently updated one
 
     """
-    def __init__(self, exhausted: Set['messages.AuthorizationResource'],
-                 updated: Mapping['messages.AuthorizationResource',
-                                  'messages.AuthorizationResource']
-                 ) -> None:
+    def __init__(self, exhausted, updated):
         self.exhausted = exhausted
         self.updated = updated
-        super().__init__()
+        super(PollError, self).__init__()
 
     @property
-    def timeout(self) -> bool:
+    def timeout(self):
         """Was the error caused by timeout?"""
         return bool(self.exhausted)
 
-    def __repr__(self) -> str:
+    def __repr__(self):
         return '{0}(exhausted={1!r}, updated={2!r})'.format(
             self.__class__.__name__, self.exhausted, self.updated)
@@ -104,19 +94,9 @@ class ValidationError(Error):
     """Error for authorization failures. Contains a list of authorization
     resources, each of which is invalid and should have an error field.
     """
-    def __init__(self, failed_authzrs: List['messages.AuthorizationResource']) -> None:
+    def __init__(self, failed_authzrs):
         self.failed_authzrs = failed_authzrs
-        super().__init__()
-
-    def __str__(self) -> str:
-        msg = []
-        for authzr in self.failed_authzrs:
-            msg.append(f'Authorization for {authzr.body.identifier.value} ' \
-                'failed due to one or more failed challenges:')
-            for challenge in authzr.body.challenges:
-                msg.append(f'  Challenge {challenge.chall.typ} failed ' \
-                    f'with error {str(challenge.error)}')
-        return '\n'.join(msg)
+        super(ValidationError, self).__init__()
 
 
 class TimeoutError(Error):  # pylint: disable=redefined-builtin
@@ -126,13 +106,13 @@ class TimeoutError(Error):  # pylint: disable=redefined-builtin
 
 class IssuanceError(Error):
     """Error sent by the server after requesting issuance of a certificate."""
 
-    def __init__(self, error: 'messages.Error') -> None:
+    def __init__(self, error):
         """Initialize.
 
         :param messages.Error error: The error provided by the server.
         """
         self.error = error
-        super().__init__()
+        super(IssuanceError, self).__init__()
 
 
 class ConflictError(ClientError):
@@ -143,17 +123,10 @@ class ConflictError(ClientError):
 
     Also used in V2 of the ACME client for the same purpose.
     """
-    def __init__(self, location: str) -> None:
+    def __init__(self, location):
         self.location = location
-        super().__init__()
+        super(ConflictError, self).__init__()
 
 
 class WildcardUnsupportedError(Error):
     """Error for when a wildcard is requested but is unsupported by ACME CA."""
-
-
-class ARIError(ClientError):
-    """An error occurred during an ARI request and we want to suggest a Retry-After time."""
-    def __init__(self, message: str, retry_after: datetime.datetime) -> None:
-        super().__init__(message, retry_after)
-        self.retry_after = retry_after
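The recurring change across these hunks swaps Python 3's zero-argument `super()` for the Python 2 compatible `super(Class, self)` form. On Python 3 the two forms resolve identically, which a tiny standalone example (classes invented for illustration) can confirm:

```python
class Base(Exception):
    def __init__(self, *args):
        self.args_seen = args  # record what reached the base constructor
        super().__init__(*args)

class ModernChild(Base):
    def __init__(self, *args):
        super().__init__(*args)  # zero-argument form, Python 3 only

class CompatChild(Base):
    def __init__(self, *args):
        super(CompatChild, self).__init__(*args)  # works on Python 2 and 3

# Both children deliver the same arguments to Base.
print(ModernChild("x").args_seen, CompatChild("x").args_seen)
```

The explicit form is pure boilerplate on Python 3; the branch keeps it only because it still targets Python 2.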
@@ -1,7 +1,5 @@
 """ACME JSON fields."""
-import datetime
 import logging
-from typing import Any
 
 import josepy as jose
 import pyrfc3339
@@ -12,17 +10,17 @@ logger = logging.getLogger(__name__)
 class Fixed(jose.Field):
     """Fixed field."""
 
-    def __init__(self, json_name: str, value: Any) -> None:
+    def __init__(self, json_name, value):
         self.value = value
-        super().__init__(
+        super(Fixed, self).__init__(
             json_name=json_name, default=value, omitempty=False)
 
-    def decode(self, value: Any) -> Any:
+    def decode(self, value):
         if value != self.value:
             raise jose.DeserializationError('Expected {0!r}'.format(self.value))
         return self.value
 
-    def encode(self, value: Any) -> Any:
+    def encode(self, value):
         if value != self.value:
             logger.warning(
                 'Overriding fixed field (%s) with %r', self.json_name, value)
@@ -34,27 +32,33 @@ class RFC3339Field(jose.Field):
 
     Handles decoding/encoding between RFC3339 strings and aware (not
     naive) `datetime.datetime` objects
-    (e.g. ``datetime.datetime.now(datetime.timezone.utc)``).
+    (e.g. ``datetime.datetime.now(pytz.utc)``).
 
     """
 
     @classmethod
-    def default_encoder(cls, value: datetime.datetime) -> str:
+    def default_encoder(cls, value):
         return pyrfc3339.generate(value)
 
     @classmethod
-    def default_decoder(cls, value: str) -> datetime.datetime:
+    def default_decoder(cls, value):
         try:
             return pyrfc3339.parse(value)
         except ValueError as error:
             raise jose.DeserializationError(error)
 
 
-def fixed(json_name: str, value: Any) -> Any:
-    """Generates a type-friendly Fixed field."""
-    return Fixed(json_name, value)
+class Resource(jose.Field):
+    """Resource MITM field."""
+
+    def __init__(self, resource_type, *args, **kwargs):
+        self.resource_type = resource_type
+        super(Resource, self).__init__(
+            'resource', default=resource_type, *args, **kwargs)
 
-def rfc3339(json_name: str, omitempty: bool = False) -> Any:
-    """Generates a type-friendly RFC3339 field."""
-    return RFC3339Field(json_name, omitempty=omitempty)
+    def decode(self, value):
+        if value != self.resource_type:
+            raise jose.DeserializationError(
+                'Wrong resource type: {0} instead of {1}'.format(
+                    value, self.resource_type))
+        return value
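The `Fixed` and `Resource` fields share one idea: decoding is an equality guard that rejects any value other than the expected constant. A dependency-free sketch of that idea (class and field names invented for illustration, not the josepy API):

```python
class DeserializationError(ValueError):
    """Raised when a serialized value does not match expectations."""

class FixedField:
    """Minimal stand-in for the jose.Field-based Fixed field above."""
    def __init__(self, json_name, value):
        self.json_name = json_name
        self.value = value

    def decode(self, value):
        # Reject anything but the single allowed constant.
        if value != self.value:
            raise DeserializationError('Expected {0!r}'.format(self.value))
        return self.value

field = FixedField('resource', 'new-reg')
print(field.decode('new-reg'))  # new-reg
```

`Resource` applies the same guard with the constant baked in as the expected resource type, which is why a mismatched `resource` value in a server payload fails deserialization rather than silently passing through.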
@@ -4,22 +4,18 @@ The JWS implementation in josepy only implements the base JOSE standard. In
 order to support the new header fields defined in ACME, this module defines some
 ACME-specific classes that layer on top of josepy.
 """
-from typing import Optional
-
 import josepy as jose
 
 
 class Header(jose.Header):
     """ACME-specific JOSE Header. Implements nonce, kid, and url.
     """
-    nonce: Optional[bytes] = jose.field('nonce', omitempty=True, encoder=jose.encode_b64jose)
-    kid: Optional[str] = jose.field('kid', omitempty=True)
-    url: Optional[str] = jose.field('url', omitempty=True)
+    nonce = jose.Field('nonce', omitempty=True, encoder=jose.encode_b64jose)
+    kid = jose.Field('kid', omitempty=True)
+    url = jose.Field('url', omitempty=True)
 
-    # Mypy does not understand the josepy magic happening here, and falsely claims
-    # that nonce is redefined. Let's ignore the type check here.
-    @nonce.decoder  # type: ignore[no-redef,union-attr]
-    def nonce(value: str) -> bytes:  # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
+    @nonce.decoder
+    def nonce(value):  # pylint: disable=missing-docstring,no-self-argument
         try:
             return jose.decode_b64jose(value)
         except jose.DeserializationError as error:
@@ -29,12 +25,12 @@ class Header(jose.Header):
 
 class Signature(jose.Signature):
     """ACME-specific Signature. Uses ACME-specific Header for customer fields."""
-    __slots__ = jose.Signature._orig_slots  # pylint: disable=protected-access,no-member
+    __slots__ = jose.Signature._orig_slots  # pylint: disable=no-member
 
     # TODO: decoder/encoder should accept cls? Otherwise, subclassing
     # JSONObjectWithFields is tricky...
     header_cls = Header
-    header: Header = jose.field(
+    header = jose.Field(
         'header', omitempty=True, default=header_cls(),
         decoder=header_cls.from_json)
@@ -44,16 +40,15 @@ class Signature(jose.Signature):
 
 class JWS(jose.JWS):
     """ACME-specific JWS. Includes none, url, and kid in protected header."""
     signature_cls = Signature
-    __slots__ = jose.JWS._orig_slots  # pylint: disable=protected-access
+    __slots__ = jose.JWS._orig_slots
 
     @classmethod
-    # type: ignore[override] # pylint: disable=arguments-differ
-    def sign(cls, payload: bytes, key: jose.JWK, alg: jose.JWASignature, nonce: Optional[bytes],
-             url: Optional[str] = None, kid: Optional[str] = None) -> jose.JWS:
+    # pylint: disable=arguments-differ
+    def sign(cls, payload, key, alg, nonce, url=None, kid=None):
         # Per ACME spec, jwk and kid are mutually exclusive, so only include a
         # jwk field if kid is not provided.
         include_jwk = kid is None
-        return super().sign(payload, key=key, alg=alg,
-                            protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
-                            nonce=nonce, url=url, kid=kid,
-                            include_jwk=include_jwk)
+        return super(JWS, cls).sign(payload, key=key, alg=alg,
+                                    protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
+                                    nonce=nonce, url=url, kid=kid,
+                                    include_jwk=include_jwk)
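The `include_jwk = kid is None` line implements the ACME rule that `jwk` and `kid` are mutually exclusive in the protected header: new-account requests embed the public key, while later requests reference the account URL. A dependency-free sketch of the header assembly (function name and field values are illustrative, not the josepy API):

```python
import base64
import json

def protected_header(alg, nonce, url, kid=None, jwk=None):
    """Build an ACME protected header carrying either kid or jwk, never both."""
    header = {"alg": alg, "nonce": nonce, "url": url}
    if kid is None:
        header["jwk"] = jwk  # new-account style: embed the public key
    else:
        header["kid"] = kid  # subsequent requests: reference the account URL
    raw = json.dumps(header, sort_keys=True).encode("utf-8")
    return base64.urlsafe_b64encode(raw).rstrip(b"=")  # unpadded base64url

h1 = protected_header("RS256", "abc", "https://example.com/acme/new-account",
                      jwk={"kty": "RSA"})
h2 = protected_header("RS256", "abc", "https://example.com/acme/order",
                      kid="https://example.com/acme/acct/1")
```

Decoding either header back shows exactly one of the two key fields present, which is what the `protect=frozenset([...])` call in `JWS.sign` enforces for the real signed object.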
acme/acme/magic_typing.py (new file, 17 lines)
@@ -0,0 +1,17 @@
"""Shim class to not have to depend on typing module in prod."""
import sys


class TypingClass(object):
    """Ignore import errors by getting anything"""
    def __getattr__(self, name):
        return None

try:
    # mypy doesn't respect modifying sys.modules
    from typing import *  # pylint: disable=wildcard-import, unused-wildcard-import
    # pylint: disable=unused-import
    from typing import Collection, IO  # type: ignore
    # pylint: enable=unused-import
except ImportError:
    sys.modules[__name__] = TypingClass()
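The shim works because, when `typing` is missing, the module replaces itself in `sys.modules` with an instance whose `__getattr__` answers any name. After that, `from acme.magic_typing import Tuple` succeeds and yields `None`, which is harmless in comment-style type annotations. The attribute-lookup mechanism in isolation (class name invented for illustration):

```python
class TypingShim(object):
    """Answer any attribute lookup with None, like TypingClass above."""
    def __getattr__(self, name):
        # __getattr__ only fires for names not found normally,
        # so every lookup on an empty instance lands here.
        return None

shim = TypingShim()
print(shim.Tuple)           # None
print(shim.AnythingAtAll)   # None
```

Replacing a module with an instance is the classic pre-PEP 562 way to get module-level `__getattr__` behavior.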
acme/acme/messages.py (new file, 609 lines)
@@ -0,0 +1,609 @@
"""ACME protocol messages."""
import json

import josepy as jose
import six

from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import util

try:
    from collections.abc import Hashable  # pylint: disable=no-name-in-module
except ImportError:  # pragma: no cover
    from collections import Hashable


OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"

ERROR_CODES = {
    'accountDoesNotExist': 'The request specified an account that does not exist',
    'alreadyRevoked': 'The request specified a certificate to be revoked that has'
                      ' already been revoked',
    'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'badPublicKey': 'The JWS was signed by a public key the server does not support',
    'badRevocationReason': 'The revocation reason provided is not allowed by the server',
    'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
    'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing'
           ' a certificate',
    'compound': 'Specific error conditions are indicated in the "subproblems" array',
    'connection': ('The server could not connect to the client to verify the'
                   ' domain'),
    'dns': 'There was a problem with a DNS query during identifier validation',
    'dnssec': 'The server could not validate a DNSSEC signed domain',
    'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
    # deprecate invalidEmail
    'invalidEmail': 'The provided email for a registration was invalid',
    'invalidContact': 'The provided contact URI was invalid',
    'malformed': 'The request message was malformed',
    'rejectedIdentifier': 'The server will not issue certificates for the identifier',
    'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
    'rateLimited': 'There were too many requests of a given type',
    'serverInternal': 'The server experienced an internal error',
    'tls': 'The server experienced a TLS error during domain verification',
    'unauthorized': 'The client lacks sufficient authorization',
    'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
    'unknownHost': 'The server could not resolve a domain name',
    'unsupportedIdentifier': 'An identifier is of an unsupported type',
    'externalAccountRequired': 'The server requires external account binding',
}

ERROR_TYPE_DESCRIPTIONS = dict(
    (ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items())

ERROR_TYPE_DESCRIPTIONS.update(dict(  # add errors with old prefix, deprecate me
    (OLD_ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items()))
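`ERROR_TYPE_DESCRIPTIONS` simply indexes the same descriptions under both URN prefixes, so lookups work for servers emitting either the RFC 8555 prefix or the legacy draft prefix. The construction and lookup, shown on a two-entry subset of `ERROR_CODES`:

```python
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"

# Two-entry subset of ERROR_CODES, for illustration.
codes = {
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'rateLimited': 'There were too many requests of a given type',
}

# Index every description under the new prefix, then again under the old one.
descriptions = {ERROR_PREFIX + name: desc for name, desc in codes.items()}
descriptions.update(
    {OLD_ERROR_PREFIX + name: desc for name, desc in codes.items()})

print(descriptions['urn:ietf:params:acme:error:badNonce'])
print(descriptions['urn:acme:error:badNonce'])
```

Both lookups return the same string, which is exactly what `Error.description` relies on when it calls `ERROR_TYPE_DESCRIPTIONS.get(self.typ)` below.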
def is_acme_error(err):
|
||||
"""Check if argument is an ACME error."""
|
||||
if isinstance(err, Error) and (err.typ is not None):
|
||||
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
|
||||
return False
|
||||
|
||||
|
||||
@six.python_2_unicode_compatible
|
||||
class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
"""ACME error.
|
||||
|
||||
https://tools.ietf.org/html/draft-ietf-appsawg-http-problem-00
|
||||
|
||||
:ivar unicode typ:
|
||||
:ivar unicode title:
|
||||
:ivar unicode detail:
|
||||
|
||||
"""
|
||||
typ = jose.Field('type', omitempty=True, default='about:blank')
|
||||
title = jose.Field('title', omitempty=True)
|
||||
detail = jose.Field('detail', omitempty=True)
|
||||
|
||||
@classmethod
|
||||
def with_code(cls, code, **kwargs):
|
||||
"""Create an Error instance with an ACME Error code.
|
||||
|
||||
:unicode code: An ACME error code, like 'dnssec'.
|
||||
:kwargs: kwargs to pass to Error.
|
||||
|
||||
"""
|
||||
if code not in ERROR_CODES:
|
||||
raise ValueError("The supplied code: %s is not a known ACME error"
|
||||
" code" % code)
|
||||
typ = ERROR_PREFIX + code
|
||||
return cls(typ=typ, **kwargs)
|
||||
|
||||
@property
|
||||
def description(self):
|
||||
"""Hardcoded error description based on its type.
|
||||
|
||||
:returns: Description if standard ACME error or ``None``.
|
||||
:rtype: unicode
|
||||
|
||||
"""
|
||||
return ERROR_TYPE_DESCRIPTIONS.get(self.typ)
|
||||
|
||||
@property
|
||||
def code(self):
|
||||
"""ACME error code.
|
||||
|
||||
Basically self.typ without the ERROR_PREFIX.
|
||||
|
||||
:returns: error code if standard ACME code or ``None``.
|
||||
:rtype: unicode
|
||||
|
||||
"""
|
||||
code = str(self.typ).split(':')[-1]
|
||||
if code in ERROR_CODES:
|
||||
return code
|
||||
return None
|
||||
|
||||
def __str__(self):
|
||||
return b' :: '.join(
|
||||
part.encode('ascii', 'backslashreplace') for part in
|
||||
(self.typ, self.description, self.detail, self.title)
|
||||
if part is not None).decode()
|
||||
|
||||
|
||||
class _Constant(jose.JSONDeSerializable, Hashable):  # type: ignore
    """ACME constant."""
    __slots__ = ('name',)
    POSSIBLE_NAMES = NotImplemented

    def __init__(self, name):
        super(_Constant, self).__init__()
        self.POSSIBLE_NAMES[name] = self  # pylint: disable=unsupported-assignment-operation
        self.name = name

    def to_partial_json(self):
        return self.name

    @classmethod
    def from_json(cls, jobj):
        if jobj not in cls.POSSIBLE_NAMES:  # pylint: disable=unsupported-membership-test
            raise jose.DeserializationError(
                '{0} not recognized'.format(cls.__name__))
        return cls.POSSIBLE_NAMES[jobj]

    def __repr__(self):
        return '{0}({1})'.format(self.__class__.__name__, self.name)

    def __eq__(self, other):
        return isinstance(other, type(self)) and other.name == self.name

    def __hash__(self):
        return hash((self.__class__, self.name))

    def __ne__(self, other):
        return not self == other


class Status(_Constant):
    """ACME "status" field."""
    POSSIBLE_NAMES = {}  # type: dict
STATUS_UNKNOWN = Status('unknown')
STATUS_PENDING = Status('pending')
STATUS_PROCESSING = Status('processing')
STATUS_VALID = Status('valid')
STATUS_INVALID = Status('invalid')
STATUS_REVOKED = Status('revoked')
STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')


class IdentifierType(_Constant):
    """ACME identifier type."""
    POSSIBLE_NAMES = {}  # type: dict
IDENTIFIER_FQDN = IdentifierType('dns')  # IdentifierDNS in Boulder


class Identifier(jose.JSONObjectWithFields):
    """ACME identifier.

    :ivar IdentifierType typ:
    :ivar unicode value:

    """
    typ = jose.Field('type', decoder=IdentifierType.from_json)
    value = jose.Field('value')

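The `_Constant` pattern above is a self-registering enum: each instance writes itself into the subclass's `POSSIBLE_NAMES` dict at construction, so `from_json` later returns the very same singleton. A minimal sketch of the same mechanism, without the josepy base classes:

```python
class _MiniConstant:
    """Sketch of _Constant's self-registry; subclasses supply the dict."""
    POSSIBLE_NAMES = NotImplemented

    def __init__(self, name):
        # Registering in the *class-level* dict makes each instance a
        # singleton addressable by its JSON name.
        self.POSSIBLE_NAMES[name] = self
        self.name = name

    @classmethod
    def from_json(cls, jobj):
        if jobj not in cls.POSSIBLE_NAMES:
            raise ValueError('{0} not recognized'.format(cls.__name__))
        return cls.POSSIBLE_NAMES[jobj]


class MiniStatus(_MiniConstant):
    POSSIBLE_NAMES = {}


MINI_STATUS_VALID = MiniStatus('valid')
```

Deserializing `'valid'` yields the identical object (`is`, not just `==`), which is why the real module can compare statuses by identity.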
class Directory(jose.JSONDeSerializable):
    """Directory."""

    _REGISTERED_TYPES = {}  # type: dict

    class Meta(jose.JSONObjectWithFields):
        """Directory Meta."""
        _terms_of_service = jose.Field('terms-of-service', omitempty=True)
        _terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
        website = jose.Field('website', omitempty=True)
        caa_identities = jose.Field('caaIdentities', omitempty=True)
        external_account_required = jose.Field('externalAccountRequired', omitempty=True)

        def __init__(self, **kwargs):
            kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
            super(Directory.Meta, self).__init__(**kwargs)

        @property
        def terms_of_service(self):
            """URL for the CA TOS"""
            return self._terms_of_service or self._terms_of_service_v2

        def __iter__(self):
            # When iterating over fields, use the external name 'terms_of_service' instead of
            # the internal '_terms_of_service'.
            for name in super(Directory.Meta, self).__iter__():
                yield name[1:] if name == '_terms_of_service' else name

        def _internal_name(self, name):
            return '_' + name if name == 'terms_of_service' else name

    @classmethod
    def _canon_key(cls, key):
        return getattr(key, 'resource_type', key)

    @classmethod
    def register(cls, resource_body_cls):
        """Register resource."""
        resource_type = resource_body_cls.resource_type
        assert resource_type not in cls._REGISTERED_TYPES
        cls._REGISTERED_TYPES[resource_type] = resource_body_cls
        return resource_body_cls

    def __init__(self, jobj):
        canon_jobj = util.map_keys(jobj, self._canon_key)
        # TODO: check that everything is an absolute URL; acme-spec is
        # not clear on that
        self._jobj = canon_jobj

    def __getattr__(self, name):
        try:
            return self[name.replace('_', '-')]
        except KeyError as error:
            raise AttributeError(str(error) + ': ' + name)

    def __getitem__(self, name):
        try:
            return self._jobj[self._canon_key(name)]
        except KeyError:
            raise KeyError('Directory field not found')

    def to_partial_json(self):
        return self._jobj

    @classmethod
    def from_json(cls, jobj):
        jobj['meta'] = cls.Meta.from_json(jobj.pop('meta', {}))
        return cls(jobj)

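`Directory.__getattr__` above lets callers write `directory.new_reg` for the JSON key `new-reg`: underscores in the attribute name are translated to hyphens before the dictionary lookup. A stripped-down sketch of just that access path (the URL is a placeholder):

```python
class MiniDirectory:
    """Sketch of Directory's attribute access: attribute underscores map
    to hyphens in the underlying JSON keys."""

    def __init__(self, jobj):
        self._jobj = jobj

    def __getattr__(self, name):
        try:
            return self._jobj[name.replace('_', '-')]
        except KeyError as error:
            raise AttributeError(str(error) + ': ' + name)


directory = MiniDirectory({'new-reg': 'https://example.invalid/acme/new-reg'})
```

Accessing `directory.new_reg` returns the registered URL; an unknown name raises `AttributeError` rather than `KeyError`, keeping normal Python attribute semantics.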
class Resource(jose.JSONObjectWithFields):
    """ACME Resource.

    :ivar acme.messages.ResourceBody body: Resource body.

    """
    body = jose.Field('body')


class ResourceWithURI(Resource):
    """ACME Resource with URI.

    :ivar unicode uri: Location of the resource.

    """
    uri = jose.Field('uri')  # no ChallengeResource.uri


class ResourceBody(jose.JSONObjectWithFields):
    """ACME Resource Body."""


class ExternalAccountBinding(object):
    """ACME External Account Binding"""

    @classmethod
    def from_data(cls, account_public_key, kid, hmac_key, directory):
        """Create External Account Binding Resource from contact details, kid and hmac."""

        key_json = json.dumps(account_public_key.to_partial_json()).encode()
        decoded_hmac_key = jose.b64.b64decode(hmac_key)
        url = directory["newAccount"]

        eab = jws.JWS.sign(key_json, jose.jwk.JWKOct(key=decoded_hmac_key),
                           jose.jwa.HS256, None,
                           url, kid)

        return eab.to_partial_json()

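`ExternalAccountBinding.from_data` above produces a JWS over the account's public key, HMAC-signed (HS256) with a CA-provided key and key identifier. The real code delegates to josepy; the following is a rough stdlib-only sketch of the same shape, with every value (key material, kid, URL) a made-up placeholder:

```python
import base64
import hashlib
import hmac
import json


def b64url(data):
    """JWS-style unpadded urlsafe base64."""
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode()


# All of these are hypothetical placeholders, not real ACME values.
hmac_key = base64.urlsafe_b64encode(b'secret-from-the-ca')
kid = 'kid-1'
account_key_json = json.dumps({'kty': 'EC'}).encode()

protected = b64url(json.dumps({
    'alg': 'HS256', 'kid': kid,
    'url': 'https://example.invalid/acme/new-account'}).encode())
payload = b64url(account_key_json)
signature = hmac.new(base64.urlsafe_b64decode(hmac_key),
                     (protected + '.' + payload).encode(),
                     hashlib.sha256).digest()

eab = {'protected': protected, 'payload': payload,
       'signature': b64url(signature)}
```

This mirrors the structure only; real clients should use the josepy `JWS.sign` call shown in the class above rather than hand-rolling the signing input.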
class Registration(ResourceBody):
    """Registration Resource Body.

    :ivar josepy.jwk.JWK key: Public key.
    :ivar tuple contact: Contact information following ACME spec,
        `tuple` of `unicode`.
    :ivar unicode agreement:

    """
    # on new-reg key server ignores 'key' and populates it based on
    # JWS.signature.combined.jwk
    key = jose.Field('key', omitempty=True, decoder=jose.JWK.from_json)
    contact = jose.Field('contact', omitempty=True, default=())
    agreement = jose.Field('agreement', omitempty=True)
    status = jose.Field('status', omitempty=True)
    terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
    only_return_existing = jose.Field('onlyReturnExisting', omitempty=True)
    external_account_binding = jose.Field('externalAccountBinding', omitempty=True)

    phone_prefix = 'tel:'
    email_prefix = 'mailto:'

    @classmethod
    def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
        """Create registration resource from contact details."""
        details = list(kwargs.pop('contact', ()))
        if phone is not None:
            details.append(cls.phone_prefix + phone)
        if email is not None:
            details.extend([cls.email_prefix + mail for mail in email.split(',')])
        kwargs['contact'] = tuple(details)

        if external_account_binding:
            kwargs['external_account_binding'] = external_account_binding

        return cls(**kwargs)

    def _filter_contact(self, prefix):
        return tuple(
            detail[len(prefix):] for detail in self.contact  # pylint: disable=not-an-iterable
            if detail.startswith(prefix))

    @property
    def phones(self):
        """All phones found in the ``contact`` field."""
        return self._filter_contact(self.phone_prefix)

    @property
    def emails(self):
        """All emails found in the ``contact`` field."""
        return self._filter_contact(self.email_prefix)


@Directory.register
class NewRegistration(Registration):
    """New registration."""
    resource_type = 'new-reg'
    resource = fields.Resource(resource_type)


class UpdateRegistration(Registration):
    """Update registration."""
    resource_type = 'reg'
    resource = fields.Resource(resource_type)


class RegistrationResource(ResourceWithURI):
    """Registration Resource.

    :ivar acme.messages.Registration body:
    :ivar unicode new_authzr_uri: Deprecated. Do not use.
    :ivar unicode terms_of_service: URL for the CA TOS.

    """
    body = jose.Field('body', decoder=Registration.from_json)
    new_authzr_uri = jose.Field('new_authzr_uri', omitempty=True)
    terms_of_service = jose.Field('terms_of_service', omitempty=True)

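`Registration.from_data` above turns bare phone/email arguments into the prefixed `contact` tuple the ACME spec expects, fanning a comma-separated email string out into one `mailto:` entry per address. The assembly step in isolation:

```python
PHONE_PREFIX = 'tel:'
EMAIL_PREFIX = 'mailto:'


def build_contacts(phone=None, email=None):
    """Sketch of Registration.from_data's contact assembly."""
    details = []
    if phone is not None:
        details.append(PHONE_PREFIX + phone)
    if email is not None:
        # A comma-separated string becomes one mailto: entry per address.
        details.extend(EMAIL_PREFIX + mail for mail in email.split(','))
    return tuple(details)
```

The inverse operation is `_filter_contact`, which strips the same prefixes back off when reading `phones` and `emails`.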
class ChallengeBody(ResourceBody):
    """Challenge Resource Body.

    .. todo::
       Confusingly, this has a similar name to `.challenges.Challenge`,
       as well as `.achallenges.AnnotatedChallenge`. Please use names
       such as ``challb`` to distinguish instances of this class from
       ``achall``.

    :ivar acme.challenges.Challenge: Wrapped challenge.
        Conveniently, all challenge fields are proxied, i.e. you can
        call ``challb.x`` to get ``challb.chall.x`` contents.
    :ivar acme.messages.Status status:
    :ivar datetime.datetime validated:
    :ivar messages.Error error:

    """
    __slots__ = ('chall',)
    # ACMEv1 has a "uri" field in challenges. ACMEv2 has a "url" field. This
    # challenge object supports either one, but should be accessed through the
    # name "uri". In Client.answer_challenge, whichever one is set will be
    # used.
    _uri = jose.Field('uri', omitempty=True, default=None)
    _url = jose.Field('url', omitempty=True, default=None)
    status = jose.Field('status', decoder=Status.from_json,
                        omitempty=True, default=STATUS_PENDING)
    validated = fields.RFC3339Field('validated', omitempty=True)
    error = jose.Field('error', decoder=Error.from_json,
                       omitempty=True, default=None)

    def __init__(self, **kwargs):
        kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
        super(ChallengeBody, self).__init__(**kwargs)

    def encode(self, name):
        return super(ChallengeBody, self).encode(self._internal_name(name))

    def to_partial_json(self):
        jobj = super(ChallengeBody, self).to_partial_json()
        jobj.update(self.chall.to_partial_json())
        return jobj

    @classmethod
    def fields_from_json(cls, jobj):
        jobj_fields = super(ChallengeBody, cls).fields_from_json(jobj)
        jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
        return jobj_fields

    @property
    def uri(self):
        """The URL of this challenge."""
        return self._url or self._uri

    def __getattr__(self, name):
        return getattr(self.chall, name)

    def __iter__(self):
        # When iterating over fields, use the external name 'uri' instead of
        # the internal '_uri'.
        for name in super(ChallengeBody, self).__iter__():
            yield name[1:] if name == '_uri' else name

    def _internal_name(self, name):
        return '_' + name if name == 'uri' else name


class ChallengeResource(Resource):
    """Challenge Resource.

    :ivar acme.messages.ChallengeBody body:
    :ivar unicode authzr_uri: URI found in the 'up' ``Link`` header.

    """
    body = jose.Field('body', decoder=ChallengeBody.from_json)
    authzr_uri = jose.Field('authzr_uri')

    @property
    def uri(self):
        """The URL of the challenge body."""
        # pylint: disable=function-redefined,no-member
        return self.body.uri

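The `_uri`/`_url` pair above papers over a protocol difference: ACMEv1 challenges carry `uri`, ACMEv2 challenges carry `url`, and the single `uri` property prefers the v2 field. The core of that duality, minus the josepy field machinery:

```python
class MiniChallengeBody:
    """Sketch of ChallengeBody's uri/url duality: one read-only property
    that prefers the ACMEv2 'url' field over the ACMEv1 'uri' field."""

    def __init__(self, uri=None, url=None):
        self._uri = uri
        self._url = url

    @property
    def uri(self):
        return self._url or self._uri
```

Whichever field the server populated, callers only ever read `.uri`, which is what lets `Client.answer_challenge` stay version-agnostic.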
class Authorization(ResourceBody):
    """Authorization Resource Body.

    :ivar acme.messages.Identifier identifier:
    :ivar list challenges: `list` of `.ChallengeBody`
    :ivar tuple combinations: Challenge combinations (`tuple` of `tuple`
        of `int`, as opposed to `list` of `list` from the spec).
    :ivar acme.messages.Status status:
    :ivar datetime.datetime expires:

    """
    identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
    challenges = jose.Field('challenges', omitempty=True)
    combinations = jose.Field('combinations', omitempty=True)

    status = jose.Field('status', omitempty=True, decoder=Status.from_json)
    # TODO: 'expires' is allowed for Authorization Resources in
    # general, but for Key Authorization '[t]he "expires" field MUST
    # be absent'... then acme-spec gives example with 'expires'
    # present... That's confusing!
    expires = fields.RFC3339Field('expires', omitempty=True)
    wildcard = jose.Field('wildcard', omitempty=True)

    @challenges.decoder
    def challenges(value):  # pylint: disable=missing-docstring,no-self-argument
        return tuple(ChallengeBody.from_json(chall) for chall in value)

    @property
    def resolved_combinations(self):
        """Combinations with challenges instead of indices."""
        return tuple(tuple(self.challenges[idx] for idx in combo)
                     for combo in self.combinations)  # pylint: disable=not-an-iterable


@Directory.register
class NewAuthorization(Authorization):
    """New authorization."""
    resource_type = 'new-authz'
    resource = fields.Resource(resource_type)


class UpdateAuthorization(Authorization):
    """Update authorization."""
    resource_type = 'authz'
    resource = fields.Resource(resource_type)


class AuthorizationResource(ResourceWithURI):
    """Authorization Resource.

    :ivar acme.messages.Authorization body:
    :ivar unicode new_cert_uri: Deprecated. Do not use.

    """
    body = jose.Field('body', decoder=Authorization.from_json)
    new_cert_uri = jose.Field('new_cert_uri', omitempty=True)

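`resolved_combinations` above maps the spec's index-based combinations back onto the actual challenge objects. The same two-level lookup with plain tuples (challenge names here are just stand-ins for `ChallengeBody` instances):

```python
# Stand-ins for an authorization's decoded challenge bodies.
challs = ('http-01', 'dns-01', 'tls-alpn-01')
# Combinations come over the wire as index tuples into the list above.
combos = ((0,), (1, 2))

# Resolve each index to its challenge, preserving the nesting.
resolved = tuple(tuple(challs[idx] for idx in combo) for combo in combos)
```

Each inner tuple is one acceptable set of challenges the client could satisfy.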
@Directory.register
class CertificateRequest(jose.JSONObjectWithFields):
    """ACME new-cert request.

    :ivar josepy.util.ComparableX509 csr:
        `OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`

    """
    resource_type = 'new-cert'
    resource = fields.Resource(resource_type)
    csr = jose.Field('csr', decoder=jose.decode_csr, encoder=jose.encode_csr)


class CertificateResource(ResourceWithURI):
    """Certificate Resource.

    :ivar josepy.util.ComparableX509 body:
        `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
    :ivar unicode cert_chain_uri: URI found in the 'up' ``Link`` header
    :ivar tuple authzrs: `tuple` of `AuthorizationResource`.

    """
    cert_chain_uri = jose.Field('cert_chain_uri')
    authzrs = jose.Field('authzrs')


@Directory.register
class Revocation(jose.JSONObjectWithFields):
    """Revocation message.

    :ivar .ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
        `.ComparableX509`

    """
    resource_type = 'revoke-cert'
    resource = fields.Resource(resource_type)
    certificate = jose.Field(
        'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
    reason = jose.Field('reason')

class Order(ResourceBody):
    """Order Resource Body.

    :ivar list of .Identifier: List of identifiers for the certificate.
    :ivar acme.messages.Status status:
    :ivar list of str authorizations: URLs of authorizations.
    :ivar str certificate: URL to download certificate as a fullchain PEM.
    :ivar str finalize: URL to POST to in order to request issuance once all
        authorizations have "valid" status.
    :ivar datetime.datetime expires: When the order expires.
    :ivar .Error error: Any error that occurred during finalization, if applicable.
    """
    identifiers = jose.Field('identifiers', omitempty=True)
    status = jose.Field('status', decoder=Status.from_json,
                        omitempty=True)
    authorizations = jose.Field('authorizations', omitempty=True)
    certificate = jose.Field('certificate', omitempty=True)
    finalize = jose.Field('finalize', omitempty=True)
    expires = fields.RFC3339Field('expires', omitempty=True)
    error = jose.Field('error', omitempty=True, decoder=Error.from_json)

    @identifiers.decoder
    def identifiers(value):  # pylint: disable=missing-docstring,no-self-argument
        return tuple(Identifier.from_json(identifier) for identifier in value)


class OrderResource(ResourceWithURI):
    """Order Resource.

    :ivar acme.messages.Order body:
    :ivar str csr_pem: The CSR this Order will be finalized with.
    :ivar list of acme.messages.AuthorizationResource authorizations:
        Fully-fetched AuthorizationResource objects.
    :ivar str fullchain_pem: The fetched contents of the certificate URL
        produced once the order was finalized, if it's present.
    """
    body = jose.Field('body', decoder=Order.from_json)
    csr_pem = jose.Field('csr_pem', omitempty=True)
    authorizations = jose.Field('authorizations')
    fullchain_pem = jose.Field('fullchain_pem', omitempty=True)


@Directory.register
class NewOrder(Order):
    """New order."""
    resource_type = 'new-order'
acme/acme/standalone.py (new file, 228 lines)
@@ -0,0 +1,228 @@
"""Support for standalone client challenge solvers. """
|
||||
import collections
|
||||
import functools
|
||||
import logging
|
||||
import socket
|
||||
import threading
|
||||
|
||||
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
|
||||
# pylint: disable=no-init
|
||||
|
||||
|
||||
class TLSServer(socketserver.TCPServer):
|
||||
"""Generic TLS Server."""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.ipv6 = kwargs.pop("ipv6", False)
|
||||
if self.ipv6:
|
||||
self.address_family = socket.AF_INET6
|
||||
else:
|
||||
self.address_family = socket.AF_INET
|
||||
self.certs = kwargs.pop("certs", {})
|
||||
self.method = kwargs.pop(
|
||||
# pylint: disable=protected-access
|
||||
"method", crypto_util._DEFAULT_SSL_METHOD)
|
||||
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
|
||||
socketserver.TCPServer.__init__(self, *args, **kwargs)
|
||||
|
||||
def _wrap_sock(self):
|
||||
self.socket = crypto_util.SSLSocket(
|
||||
self.socket, certs=self.certs, method=self.method)
|
||||
|
||||
def server_bind(self): # pylint: disable=missing-docstring
|
||||
self._wrap_sock()
|
||||
return socketserver.TCPServer.server_bind(self)
|
||||
|
||||
|
||||
class ACMEServerMixin:
|
||||
"""ACME server common settings mixin."""
|
||||
# TODO: c.f. #858
|
||||
server_version = "ACME client standalone challenge solver"
|
||||
allow_reuse_address = True
|
||||
|
||||
|
||||
class BaseDualNetworkedServers(object):
|
||||
"""Base class for a pair of IPv6 and IPv4 servers that tries to do everything
|
||||
it's asked for both servers, but where failures in one server don't
|
||||
affect the other.
|
||||
|
||||
If two servers are instantiated, they will serve on the same port.
|
||||
"""
|
||||
|
||||
def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
|
||||
port = server_address[1]
|
||||
self.threads = [] # type: List[threading.Thread]
|
||||
self.servers = [] # type: List[ACMEServerMixin]
|
||||
|
||||
# Must try True first.
|
||||
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
|
||||
# to IPv6. But that's ok, since it will accept IPv4 connections on the IPv6
|
||||
# socket. On the other hand, FreeBSD will successfully bind to IPv4 on the
|
||||
# same port, which means that server will accept the IPv4 connections.
|
||||
# If Python is compiled without IPv6, we'll error out but (probably) successfully
|
||||
# create the IPv4 server.
|
||||
for ip_version in [True, False]:
|
||||
try:
|
||||
kwargs["ipv6"] = ip_version
|
||||
new_address = (server_address[0],) + (port,) + server_address[2:]
|
||||
new_args = (new_address,) + remaining_args
|
||||
server = ServerClass(*new_args, **kwargs)
|
||||
logger.debug(
|
||||
"Successfully bound to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
except socket.error:
|
||||
if self.servers:
|
||||
# Already bound using IPv6.
|
||||
logger.debug(
|
||||
"Certbot wasn't able to bind to %s:%s using %s, this "
|
||||
"is often expected due to the dual stack nature of "
|
||||
"IPv6 socket implementations.",
|
||||
new_address[0], new_address[1],
|
||||
"IPv6" if ip_version else "IPv4")
|
||||
else:
|
||||
logger.debug(
|
||||
"Failed to bind to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
else:
|
||||
self.servers.append(server)
|
||||
# If two servers are set up and port 0 was passed in, ensure we always
|
||||
# bind to the same port for both servers.
|
||||
port = server.socket.getsockname()[1]
|
||||
if not self.servers:
|
||||
raise socket.error("Could not bind to IPv4 or IPv6.")
|
||||
|
||||
def serve_forever(self):
|
||||
"""Wraps socketserver.TCPServer.serve_forever"""
|
||||
for server in self.servers:
|
||||
thread = threading.Thread(
|
||||
target=server.serve_forever)
|
||||
thread.start()
|
||||
self.threads.append(thread)
|
||||
|
||||
def getsocknames(self):
|
||||
"""Wraps socketserver.TCPServer.socket.getsockname"""
|
||||
return [server.socket.getsockname() for server in self.servers]
|
||||
|
||||
def shutdown_and_server_close(self):
|
||||
"""Wraps socketserver.TCPServer.shutdown, socketserver.TCPServer.server_close, and
|
||||
threading.Thread.join"""
|
||||
for server in self.servers:
|
||||
server.shutdown()
|
||||
server.server_close()
|
||||
for thread in self.threads:
|
||||
thread.join()
|
||||
self.threads = []
|
||||
|
||||
|
||||
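The bind loop above tries IPv6 first, pins the (possibly ephemeral) port it got, then attempts IPv4 on that same port, tolerating failure of either leg. A stdlib-only sketch of that strategy, independent of the server classes:

```python
import socket


def bind_dual(port=0):
    """Sketch of the IPv6-first dual-stack bind used above: bind whichever
    families succeed, forcing both onto the same port."""
    servers = []
    for family in (socket.AF_INET6, socket.AF_INET):
        try:
            sock = socket.socket(family, socket.SOCK_STREAM)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            sock.bind(('', port))
            sock.listen(1)
        except OSError:
            # One family failing (e.g. IPv4 on a dual-stack IPv6 socket's
            # port, or a host without IPv6) must not sink the other.
            continue
        servers.append(sock)
        # Pin the second bind to whatever port the first one received.
        port = sock.getsockname()[1]
    if not servers:
        raise OSError("Could not bind to IPv4 or IPv6.")
    return servers
```

Depending on the OS (Linux dual-stack vs. FreeBSD-style separate stacks) this yields one or two sockets, but always on a single shared port, which is the property `BaseDualNetworkedServers` relies on.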
class HTTPServer(BaseHTTPServer.HTTPServer):
    """Generic HTTP Server."""

    def __init__(self, *args, **kwargs):
        self.ipv6 = kwargs.pop("ipv6", False)
        if self.ipv6:
            self.address_family = socket.AF_INET6
        else:
            self.address_family = socket.AF_INET
        BaseHTTPServer.HTTPServer.__init__(self, *args, **kwargs)


class HTTP01Server(HTTPServer, ACMEServerMixin):
    """HTTP01 Server."""

    def __init__(self, server_address, resources, ipv6=False):
        HTTPServer.__init__(
            self, server_address, HTTP01RequestHandler.partial_init(
                simple_http_resources=resources), ipv6=ipv6)


class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
    """HTTP01Server Wrapper. Tries everything for both. Failures for one don't
    affect the other."""

    def __init__(self, *args, **kwargs):
        BaseDualNetworkedServers.__init__(self, HTTP01Server, *args, **kwargs)


class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    """HTTP01 challenge handler.

    Adheres to the stdlib's `socketserver.BaseRequestHandler` interface.

    :ivar set simple_http_resources: A set of `HTTP01Resource`
        objects. TODO: better name?

    """
    HTTP01Resource = collections.namedtuple(
        "HTTP01Resource", "chall response validation")

    def __init__(self, *args, **kwargs):
        self.simple_http_resources = kwargs.pop("simple_http_resources", set())
        BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)

    def log_message(self, format, *args):  # pylint: disable=redefined-builtin
        """Log arbitrary message."""
        logger.debug("%s - - %s", self.client_address[0], format % args)

    def handle(self):
        """Handle request."""
        self.log_message("Incoming request")
        BaseHTTPServer.BaseHTTPRequestHandler.handle(self)

    def do_GET(self):  # pylint: disable=invalid-name,missing-docstring
        if self.path == "/":
            self.handle_index()
        elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
            self.handle_simple_http_resource()
        else:
            self.handle_404()

    def handle_index(self):
        """Handle index page."""
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(self.server.server_version.encode())

    def handle_404(self):
        """Handle 404 Not Found errors."""
        self.send_response(http_client.NOT_FOUND, message="Not Found")
        self.send_header("Content-type", "text/html")
        self.end_headers()
        self.wfile.write(b"404")

    def handle_simple_http_resource(self):
        """Handle HTTP01 provisioned resources."""
        for resource in self.simple_http_resources:
            if resource.chall.path == self.path:
                self.log_message("Serving HTTP01 with token %r",
                                 resource.chall.encode("token"))
                self.send_response(http_client.OK)
                self.end_headers()
                self.wfile.write(resource.validation.encode())
                return
        else:  # pylint: disable=useless-else-on-loop
            self.log_message("No resources to serve")
        self.log_message("%s does not correspond to any resource. ignoring",
                         self.path)

    @classmethod
    def partial_init(cls, simple_http_resources):
        """Partially initialize this handler.

        This is useful because `socketserver.BaseServer` takes
        uninitialized handler and initializes it with the current
        request.

        """
        return functools.partial(
            cls, simple_http_resources=simple_http_resources)
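`do_GET` above dispatches on the request path: the root serves an index banner, paths under the ACME challenge root serve a provisioned validation, and anything else is a 404. The routing rule by itself (the path constant mirrors `challenges.HTTP01.URI_ROOT_PATH`, hard-coded here so the sketch stands alone):

```python
# Stand-in for challenges.HTTP01.URI_ROOT_PATH.
URI_ROOT_PATH = '.well-known/acme-challenge'


def route(path):
    """Sketch of HTTP01RequestHandler.do_GET's dispatch."""
    if path == '/':
        return 'index'
    if path.startswith('/' + URI_ROOT_PATH):
        return 'challenge'
    return 'not-found'
```

Only the prefix is checked here; matching a specific token against the provisioned resources happens afterwards, in `handle_simple_http_resource`.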
acme/acme/util.py (new file, 7 lines)
@@ -0,0 +1,7 @@
"""ACME utilities."""
|
||||
import six
|
||||
|
||||
|
||||
def map_keys(dikt, func):
|
||||
"""Map dictionary keys."""
|
||||
return dict((func(key), value) for key, value in six.iteritems(dikt))
|
||||
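`map_keys` is what `Directory.__init__` uses to canonicalize its JSON keys. A Python-3-only equivalent (the original goes through `six` for 2/3 compatibility) and a typical use:

```python
def map_keys(dikt, func):
    """Py3 rewrite of acme.util.map_keys: rebuild a dict with func applied
    to every key, values untouched."""
    return {func(key): value for key, value in dikt.items()}
```

For example, mapping hyphenated wire-format keys to attribute-friendly names: `map_keys({'new-reg': 1}, lambda k: k.replace('-', '_'))` gives `{'new_reg': 1}`.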
@@ -9,7 +9,7 @@ BUILDDIR = _build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from https://www.sphinx-doc.org/)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif

# Internal variables.
@@ -1,5 +0,0 @@
Crypto_util
-----------

.. automodule:: acme.crypto_util
   :members:
@@ -1,5 +0,0 @@
JWS
---

.. automodule:: acme.jws
   :members:
@@ -1,5 +0,0 @@
Util
----

.. automodule:: acme.util
   :members:
@@ -13,6 +13,7 @@
# serve to show the default.

import os
import shlex
import sys

here = os.path.abspath(os.path.dirname(__file__))
@@ -37,11 +38,10 @@ extensions = [
    'sphinx.ext.todo',
    'sphinx.ext.coverage',
    'sphinx.ext.viewcode',
    'sphinx_rtd_theme',
]

autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
autodoc_default_flags = ['show-inheritance', 'private-members']

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -59,7 +59,7 @@ master_doc = 'index'

# General information about the project.
project = u'acme-python'
copyright = u'2015, Let\'s Encrypt Project'
copyright = u'2015-2015, Let\'s Encrypt Project'
author = u'Let\'s Encrypt Project'

# The version info for the project you're documenting, acts as replacement for
@@ -86,9 +86,7 @@ language = 'en'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = [
    '_build',
]
exclude_patterns = ['_build']

# The reST default role (used for this markup: `text`) to use for all
# documents.
@@ -115,7 +113,7 @@ pygments_style = 'sphinx'
#keep_warnings = False

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False
todo_include_todos = True


# -- Options for HTML output ----------------------------------------------
@@ -123,7 +121,14 @@ todo_include_todos = False
# The theme to use for HTML and HTML Help pages.  See the documentation for
# a list of builtin themes.

html_theme = 'sphinx_rtd_theme'
# http://docs.readthedocs.org/en/latest/theme.html#how-do-i-use-this-locally-and-on-read-the-docs
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd:  # only import and set the theme if we're building docs locally
    import sphinx_rtd_theme
    html_theme = 'sphinx_rtd_theme'
    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it

# Theme options are theme-specific and customize the look and feel of a theme
# further.  For a list of options available for each theme, see the
@@ -65,7 +65,7 @@ if errorlevel 9009 (
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.https://www.sphinx-doc.org/
	echo.http://sphinx-doc.org/
	exit /b 1
)
@@ -1,3 +1 @@
:orphan:

.. literalinclude:: ../jws-help.txt
@@ -27,11 +27,10 @@ Workflow:
"""
from contextlib import contextmanager

from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL

from acme import challenges
from acme import client
@@ -69,11 +68,10 @@ def new_csr_comp(domain_name, pkey_pem=None):
    """Create certificate signing request."""
    if pkey_pem is None:
        # Create private key.
        pkey = rsa.generate_private_key(public_exponent=65537, key_size=CERT_PKEY_BITS)
        pkey_pem = pkey.private_bytes(encoding=serialization.Encoding.PEM,
                                      format=serialization.PrivateFormat.PKCS8,
                                      encryption_algorithm=serialization.NoEncryption())

        pkey = OpenSSL.crypto.PKey()
        pkey.generate_key(OpenSSL.crypto.TYPE_RSA, CERT_PKEY_BITS)
        pkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM,
                                                  pkey)
    csr_pem = crypto_util.make_csr(pkey_pem, [domain_name])
    return pkey_pem, csr_pem

@@ -165,13 +163,16 @@ def example_http():
    # Register account and accept TOS

    net = client.ClientNetwork(acc_key, user_agent=USER_AGENT)
    directory = client.ClientV2.get_directory(DIRECTORY_URL, net)
    directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
    client_acme = client.ClientV2(directory, net=net)

    # Terms of Service URL is in client_acme.directory.meta.terms_of_service
    # Registration Resource: regr
    # Creates account with contact information.
    email = ('fake@example.com')
    regr = client_acme.new_account(
        messages.NewRegistration.from_data(terms_of_service_agreed=True))
        messages.NewRegistration.from_data(
            email=email, terms_of_service_agreed=True))

    # Create domain private key and CSR
    pkey_pem, csr_pem = new_csr_comp(DOMAIN)
@@ -199,7 +200,9 @@ def example_http():

    # Revoke certificate

    fullchain_com = x509.load_pem_x509_certificate(fullchain_pem.encode())
    fullchain_com = jose.ComparableX509(
        OpenSSL.crypto.load_certificate(
            OpenSSL.crypto.FILETYPE_PEM, fullchain_pem))

    try:
        client_acme.revoke(fullchain_com, 0)  # revocation reason = 0
@@ -212,7 +215,8 @@ def example_http():
    try:
        regr = client_acme.query_registration(regr)
    except errors.Error as err:
        if err.typ == messages.ERROR_PREFIX + 'unauthorized':
        if err.typ == messages.OLD_ERROR_PREFIX + 'unauthorized' \
                or err.typ == messages.ERROR_PREFIX + 'unauthorized':
            # Status is deactivated.
            pass
        raise
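The hunk above swaps the pyOpenSSL key generation in `new_csr_comp` for the `cryptography` library. As a rough standalone sketch of that modern path (assuming only the `cryptography` package is installed; `CERT_PKEY_BITS` is taken as 2048 here, and the CSR is built directly with `cryptography` instead of the example's `crypto_util.make_csr` helper):

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

CERT_PKEY_BITS = 2048  # assumed value; matches the constant name used above


def new_csr_comp(domain_name, pkey_pem=None):
    """Create a private key (if needed) and a CSR for domain_name."""
    if pkey_pem is None:
        # Generate an RSA key and serialize it as an unencrypted PKCS#8 PEM.
        pkey = rsa.generate_private_key(public_exponent=65537,
                                        key_size=CERT_PKEY_BITS)
        pkey_pem = pkey.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.PKCS8,
            encryption_algorithm=serialization.NoEncryption(),
        )
    # Build a CSR with the domain as a SubjectAlternativeName entry.
    key = serialization.load_pem_private_key(pkey_pem, password=None)
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([]))
        .add_extension(
            x509.SubjectAlternativeName([x509.DNSName(domain_name)]),
            critical=False,
        )
        .sign(key, hashes.SHA256())
    )
    return pkey_pem, csr.public_bytes(serialization.Encoding.PEM)
```

This is only an illustration of the `cryptography`-based flow; the real example delegates CSR construction to `acme.crypto_util.make_csr`.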
acme/examples/standalone/README (Normal file, 2 lines)
@@ -0,0 +1,2 @@
python -m acme.standalone -p 1234
curl -k https://localhost:1234
acme/examples/standalone/localhost/cert.pem (Symbolic link, 1 line)
@@ -0,0 +1 @@
../../../acme/testdata/rsa2048_cert.pem
acme/examples/standalone/localhost/key.pem (Symbolic link, 1 line)
@@ -0,0 +1 @@
../../../acme/testdata/rsa2048_key.pem
acme/pytest.ini (Normal file, 2 lines)
@@ -0,0 +1,2 @@
[pytest]
norecursedirs = .* build dist CVS _darcs {arch} *.egg
@@ -7,7 +7,4 @@
# in --editable mode (-e), just "pip install acme[docs]" does not work as
# expected and "pip install -e acme[docs]" must be used instead

# We also pin our dependencies for increased stability.

-c ../tools/requirements.txt
-e acme[docs]
acme/setup.cfg (Normal file, 2 lines)
@@ -0,0 +1,2 @@
[bdist_wheel]
universal = 1
@@ -1,16 +1,35 @@
import sys

from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand

version = '4.2.0'
version = '1.1.0.dev0'

# Please update tox.ini when modifying dependency version requirements
install_requires = [
    'cryptography>=43.0.0',
    'josepy>=2.0.0',
    # PyOpenSSL>=25.0.0 is just needed to satisfy mypy right now so this dependency can probably be
    # relaxed to >=24.0.0 if needed.
    'PyOpenSSL>=25.0.0',
    # load_pem_private/public_key (>=0.6)
    # rsa_recover_prime_factors (>=0.8)
    'cryptography>=1.2.3',
    # formerly known as acme.jose:
    # 1.1.0+ is required to avoid the warnings described at
    # https://github.com/certbot/josepy/issues/13.
    'josepy>=1.1.0',
    'mock',
    # Connection.set_tlsext_host_name (>=0.13)
    'PyOpenSSL>=0.13.1',
    'pyrfc3339',
    'requests>=2.20.0',
    'pytz',
    'requests[security]>=2.6.0',  # security extras added in 2.4.1
    'requests-toolbelt>=0.3.0',
    'setuptools',
    'six>=1.9.0',  # needed for python_2_unicode_compatible
]

dev_extras = [
    'pytest',
    'pytest-xdist',
    'tox',
]

docs_extras = [
@@ -18,42 +37,56 @@ docs_extras = [
    'sphinx_rtd_theme',
]

test_extras = [
    'pytest',
    'pytest-xdist',
    'typing-extensions',
]

class PyTest(TestCommand):
    user_options = []

    def initialize_options(self):
        TestCommand.initialize_options(self)
        self.pytest_args = ''

    def run_tests(self):
        import shlex
        # import here, cause outside the eggs aren't loaded
        import pytest
        errno = pytest.main(shlex.split(self.pytest_args))
        sys.exit(errno)


setup(
    name='acme',
    version=version,
    description='ACME protocol implementation in Python',
    url='https://github.com/certbot/certbot',
    url='https://github.com/letsencrypt/letsencrypt',
    author="Certbot Project",
    author_email='certbot-dev@eff.org',
    author_email='client-dev@letsencrypt.org',
    license='Apache License 2.0',
    python_requires='>=3.9.2',
    python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
        'Programming Language :: Python',
        'Programming Language :: Python :: 2',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Programming Language :: Python :: 3.13',
        'Programming Language :: Python :: 3.4',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Topic :: Internet :: WWW/HTTP',
        'Topic :: Security',
    ],

    packages=find_packages(where='src'),
    package_dir={'': 'src'},
    packages=find_packages(),
    include_package_data=True,
    install_requires=install_requires,
    extras_require={
        'dev': dev_extras,
        'docs': docs_extras,
        'test': test_extras,
    },
    test_suite='acme',
    tests_require=["pytest"],
    cmdclass={"test": PyTest},
)
@@ -1 +0,0 @@
"""acme's internal implementation"""
@@ -1 +0,0 @@
"""acme tests"""
File diff suppressed because it is too large
@@ -1,342 +0,0 @@
"""Tests for acme.crypto_util."""
import ipaddress
import itertools
import socket
import socketserver
import sys
import threading
import time
from typing import List
import unittest
from unittest import mock
import warnings

import pytest
from cryptography import x509
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa, x25519

from acme import errors
from acme._internal.tests import test_util


class FormatTest(unittest.TestCase):
    def test_to_cryptography_encoding(self):
        from acme.crypto_util import Format
        assert Format.DER.to_cryptography_encoding() == serialization.Encoding.DER
        assert Format.PEM.to_cryptography_encoding() == serialization.Encoding.PEM


class SSLSocketAndProbeSNITest(unittest.TestCase):
    """Tests for acme.crypto_util.SSLSocket/probe_sni."""

    def setUp(self):
        self.cert = test_util.load_cert('rsa2048_cert.pem')
        key = test_util.load_pyopenssl_private_key('rsa2048_key.pem')
        # pylint: disable=protected-access
        certs = {b'foo': (key, self.cert)}

        from acme.crypto_util import SSLSocket

        class _TestServer(socketserver.TCPServer):
            def __init__(self, *args, **kwargs):
                super().__init__(*args, **kwargs)
                self.socket = SSLSocket(self.socket, certs)

        self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
        self.port = self.server.socket.getsockname()[1]
        self.server_thread = threading.Thread(
            target=self.server.handle_request)

    def tearDown(self):
        if self.server_thread.is_alive():
            # The thread may have already terminated.
            self.server_thread.join()  # pragma: no cover
        self.server.server_close()

    def _probe(self, name):
        from acme.crypto_util import probe_sni
        return probe_sni(name, host='127.0.0.1', port=self.port)

    def _start_server(self):
        self.server_thread.start()
        time.sleep(1)  # TODO: avoid race conditions in other way

    def test_probe_ok(self):
        self._start_server()
        assert self.cert == self._probe(b'foo')

    def test_probe_not_recognized_name(self):
        self._start_server()
        with pytest.raises(errors.Error):
            self._probe(b'bar')

    def test_probe_connection_error(self):
        self.server.server_close()
        original_timeout = socket.getdefaulttimeout()
        try:
            socket.setdefaulttimeout(1)
            with pytest.raises(errors.Error):
                self._probe(b'bar')
        finally:
            socket.setdefaulttimeout(original_timeout)


class SSLSocketTest(unittest.TestCase):
    """Tests for acme.crypto_util.SSLSocket."""

    def test_ssl_socket_invalid_arguments(self):
        from acme.crypto_util import SSLSocket
        with pytest.raises(ValueError):
            _ = SSLSocket(None, {'sni': ('key', 'cert')},
                          cert_selection=lambda _: None)
        with pytest.raises(ValueError):
            _ = SSLSocket(None)


class MiscTests(unittest.TestCase):

    def test_dump_cryptography_chain(self):
        from acme.crypto_util import dump_cryptography_chain

        cert1 = test_util.load_cert('rsa2048_cert.pem')
        cert2 = test_util.load_cert('rsa4096_cert.pem')

        chain = [cert1, cert2]
        dumped = dump_cryptography_chain(chain)

        # default is PEM encoding Encoding.PEM
        assert isinstance(dumped, bytes)


class CryptographyCertOrReqSANTest(unittest.TestCase):
    """Test for acme.crypto_util._cryptography_cert_or_req_san."""

    @classmethod
    def _call(cls, loader, name):
        # pylint: disable=protected-access
        from acme.crypto_util import _cryptography_cert_or_req_san
        return _cryptography_cert_or_req_san(loader(name))

    @classmethod
    def _get_idn_names(cls):
        """Returns expected names from '{cert,csr}-idnsans.pem'."""
        chars = [chr(i) for i in itertools.chain(range(0x3c3, 0x400),
                                                 range(0x641, 0x6fc),
                                                 range(0x1820, 0x1877))]
        return [''.join(chars[i: i + 45]) + '.invalid'
                for i in range(0, len(chars), 45)]

    def _call_cert(self, name):
        return self._call(test_util.load_cert, name)

    def _call_csr(self, name):
        return self._call(test_util.load_csr, name)

    def test_cert_no_sans(self):
        assert self._call_cert('cert.pem') == []

    def test_cert_two_sans(self):
        assert self._call_cert('cert-san.pem') == \
            ['example.com', 'www.example.com']

    def test_cert_hundred_sans(self):
        assert self._call_cert('cert-100sans.pem') == \
            ['example{0}.com'.format(i) for i in range(1, 101)]

    def test_cert_idn_sans(self):
        assert self._call_cert('cert-idnsans.pem') == \
            self._get_idn_names()

    def test_csr_no_sans(self):
        assert self._call_csr('csr-nosans.pem') == []

    def test_csr_one_san(self):
        assert self._call_csr('csr.pem') == ['example.com']

    def test_csr_two_sans(self):
        assert self._call_csr('csr-san.pem') == \
            ['example.com', 'www.example.com']

    def test_csr_six_sans(self):
        assert self._call_csr('csr-6sans.pem') == \
            ['example.com', 'example.org', 'example.net',
             'example.info', 'subdomain.example.com',
             'other.subdomain.example.com']

    def test_csr_hundred_sans(self):
        assert self._call_csr('csr-100sans.pem') == \
            ['example{0}.com'.format(i) for i in range(1, 101)]

    def test_csr_idn_sans(self):
        assert self._call_csr('csr-idnsans.pem') == \
            self._get_idn_names()

    def test_critical_san(self):
        assert self._call_cert('critical-san.pem') == \
            ['chicago-cubs.venafi.example', 'cubs.venafi.example']


class GenMakeSelfSignedCertTest(unittest.TestCase):
    """Test for make_self_signed_cert."""

    def setUp(self):
        self.cert_count = 5
        self.serial_num: List[int] = []
        self.privkey = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    def test_sn_collisions(self):
        from acme.crypto_util import make_self_signed_cert
        for _ in range(self.cert_count):
            cert = make_self_signed_cert(self.privkey, ['dummy'], force_san=True,
                                         ips=[ipaddress.ip_address("10.10.10.10")])
            self.serial_num.append(cert.serial_number)
        assert len(set(self.serial_num)) >= self.cert_count

    def test_no_ips(self):
        from acme.crypto_util import make_self_signed_cert
        cert = make_self_signed_cert(self.privkey, ['dummy'])

    @mock.patch("acme.crypto_util._now")
    def test_expiry_times(self, mock_now):
        from acme.crypto_util import make_self_signed_cert
        from datetime import datetime, timedelta, timezone
        not_before = 1736200830
        validity = 100

        not_before_dt = datetime.fromtimestamp(not_before)
        validity_td = timedelta(validity)
        not_after_dt = not_before_dt + validity_td
        cert = make_self_signed_cert(
            self.privkey,
            ['dummy'],
            not_before=not_before_dt,
            validity=validity_td,
        )
        # TODO: This should be `not_valid_before_utc` once we raise the minimum
        # cryptography version.
        # https://github.com/certbot/certbot/issues/10105
        with warnings.catch_warnings():
            warnings.filterwarnings(
                'ignore',
                message='Properties that return.*datetime object'
            )
            self.assertEqual(cert.not_valid_before, not_before_dt)
            self.assertEqual(cert.not_valid_after, not_after_dt)

        now = not_before + 1
        now_dt = datetime.fromtimestamp(now)
        mock_now.return_value = now_dt.replace(tzinfo=timezone.utc)
        valid_after_now_dt = now_dt + validity_td
        cert = make_self_signed_cert(
            self.privkey,
            ['dummy'],
            validity=validity_td,
        )
        with warnings.catch_warnings():
            warnings.filterwarnings(
                'ignore',
                message='Properties that return.*datetime object'
            )
            self.assertEqual(cert.not_valid_before, now_dt)
            self.assertEqual(cert.not_valid_after, valid_after_now_dt)

    def test_no_name(self):
        from acme.crypto_util import make_self_signed_cert
        with pytest.raises(AssertionError):
            make_self_signed_cert(self.privkey, ips=[ipaddress.ip_address("1.1.1.1")])
            make_self_signed_cert(self.privkey)

    def test_extensions(self):
        from acme.crypto_util import make_self_signed_cert
        extension_type = x509.TLSFeature([x509.TLSFeatureType.status_request])
        extension = x509.Extension(
            x509.TLSFeature.oid,
            False,
            extension_type
        )
        cert = make_self_signed_cert(
            self.privkey,
            ips=[ipaddress.ip_address("1.1.1.1")],
            extensions=[extension]
        )
        self.assertIn(extension, cert.extensions)


class MakeCSRTest(unittest.TestCase):
    """Test for standalone functions."""

    @classmethod
    def _call_with_key(cls, *args, **kwargs):
        privkey = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        privkey_pem = privkey.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.PKCS8,
            serialization.NoEncryption(),
        )
        from acme.crypto_util import make_csr

        return make_csr(privkey_pem, *args, **kwargs)

    def test_make_csr(self):
        csr_pem = self._call_with_key(["a.example", "b.example"])
        assert b"--BEGIN CERTIFICATE REQUEST--" in csr_pem
        assert b"--END CERTIFICATE REQUEST--" in csr_pem
        csr = x509.load_pem_x509_csr(csr_pem)

        assert len(csr.extensions) == 1
        assert list(
            csr.extensions.get_extension_for_class(x509.SubjectAlternativeName).value
        ) == [
            x509.DNSName("a.example"),
            x509.DNSName("b.example"),
        ]

    def test_make_csr_ip(self):
        csr_pem = self._call_with_key(
            ["a.example"],
            False,
            [ipaddress.ip_address("127.0.0.1"), ipaddress.ip_address("::1")],
        )
        assert b"--BEGIN CERTIFICATE REQUEST--" in csr_pem
        assert b"--END CERTIFICATE REQUEST--" in csr_pem

        csr = x509.load_pem_x509_csr(csr_pem)

        assert len(csr.extensions) == 1
        assert list(
            csr.extensions.get_extension_for_class(x509.SubjectAlternativeName).value
        ) == [
            x509.DNSName("a.example"),
            x509.IPAddress(ipaddress.ip_address("127.0.0.1")),
            x509.IPAddress(ipaddress.ip_address("::1")),
        ]

    def test_make_csr_must_staple(self):
        csr_pem = self._call_with_key(["a.example"], must_staple=True)
        csr = x509.load_pem_x509_csr(csr_pem)

        assert len(csr.extensions) == 2
        assert list(csr.extensions.get_extension_for_class(x509.TLSFeature).value) == [
            x509.TLSFeatureType.status_request
        ]

    def test_make_csr_without_hostname(self):
        with pytest.raises(ValueError):
            self._call_with_key()

    def test_make_csr_invalid_key_type(self):
        privkey = x25519.X25519PrivateKey.generate()
        privkey_pem = privkey.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.PKCS8,
            serialization.NoEncryption(),
        )
        from acme.crypto_util import make_csr

        with pytest.raises(ValueError):
            make_csr(privkey_pem, ["a.example"])


if __name__ == '__main__':
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -1,86 +0,0 @@
"""Tests for acme.errors."""
import sys
import unittest
from unittest import mock

import pytest


class BadNonceTest(unittest.TestCase):
    """Tests for acme.errors.BadNonce."""

    def setUp(self):
        from acme.errors import BadNonce
        self.error = BadNonce(nonce="xxx", error="error")

    def test_str(self):
        assert "Invalid nonce ('xxx'): error" == str(self.error)


class MissingNonceTest(unittest.TestCase):
    """Tests for acme.errors.MissingNonce."""

    def setUp(self):
        from acme.errors import MissingNonce
        self.response = mock.MagicMock(headers={})
        self.response.request.method = 'FOO'
        self.error = MissingNonce(self.response)

    def test_str(self):
        assert "FOO" in str(self.error)
        assert "{}" in str(self.error)


class PollErrorTest(unittest.TestCase):
    """Tests for acme.errors.PollError."""

    def setUp(self):
        from acme.errors import PollError
        self.timeout = PollError(
            exhausted={mock.sentinel.AR},
            updated={})
        self.invalid = PollError(exhausted=set(), updated={
            mock.sentinel.AR: mock.sentinel.AR2})

    def test_timeout(self):
        assert self.timeout.timeout
        assert not self.invalid.timeout

    def test_repr(self):
        assert 'PollError(exhausted=%s, updated={sentinel.AR: ' \
               'sentinel.AR2})' % repr(set()) == repr(self.invalid)


class ValidationErrorTest(unittest.TestCase):
    """Tests for acme.errors.ValidationError"""

    def setUp(self):
        from acme.errors import ValidationError
        from acme.challenges import DNS01
        from acme.messages import Error
        from acme.messages import Authorization
        from acme.messages import AuthorizationResource
        from acme.messages import IDENTIFIER_FQDN
        from acme.messages import ChallengeBody
        from acme.messages import Identifier
        self.challenge_error = Error(typ='custom', detail='bar')
        failed_authzr = AuthorizationResource(
            body=Authorization(
                identifier=Identifier(typ=IDENTIFIER_FQDN, value="example.com"),
                challenges=[ChallengeBody(
                    chall=DNS01(),
                    error=self.challenge_error,
                )]
            )
        )
        self.error = ValidationError([failed_authzr])

    def test_repr(self):
        err_message = str(self.error)
        assert 'Authorization for example.com failed' in err_message
        assert 'Challenge dns-01 failed' in err_message
        assert str(self.challenge_error) in err_message


if __name__ == "__main__":
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -1,59 +0,0 @@
"""Tests for acme.fields."""
import datetime
import sys
import unittest
import warnings

import josepy as jose
import pytest


class FixedTest(unittest.TestCase):
    """Tests for acme.fields.Fixed."""

    def setUp(self):
        from acme.fields import fixed
        self.field = fixed('name', 'x')

    def test_decode(self):
        assert 'x' == self.field.decode('x')

    def test_decode_bad(self):
        with pytest.raises(jose.DeserializationError):
            self.field.decode('y')

    def test_encode(self):
        assert 'x' == self.field.encode('x')

    def test_encode_override(self):
        assert 'y' == self.field.encode('y')


class RFC3339FieldTest(unittest.TestCase):
    """Tests for acme.fields.RFC3339Field."""

    def setUp(self):
        self.decoded = datetime.datetime(2015, 3, 27, tzinfo=datetime.timezone.utc)
        self.encoded = '2015-03-27T00:00:00Z'

    def test_default_encoder(self):
        from acme.fields import RFC3339Field
        assert self.encoded == RFC3339Field.default_encoder(self.decoded)

    def test_default_encoder_naive_fails(self):
        from acme.fields import RFC3339Field
        with pytest.raises(ValueError):
            RFC3339Field.default_encoder(datetime.datetime.now())

    def test_default_decoder(self):
        from acme.fields import RFC3339Field
        assert self.decoded == RFC3339Field.default_decoder(self.encoded)

    def test_default_decoder_raises_deserialization_error(self):
        from acme.fields import RFC3339Field
        with pytest.raises(jose.DeserializationError):
            RFC3339Field.default_decoder('')


if __name__ == '__main__':
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -1,53 +0,0 @@
"""Tests for acme.jose shim."""
import importlib
import sys

import pytest


def _test_it(submodule, attribute):
    if submodule:
        acme_jose_path = 'acme.jose.' + submodule
        josepy_path = 'josepy.' + submodule
    else:
        acme_jose_path = 'acme.jose'
        josepy_path = 'josepy'
    acme_jose_mod = importlib.import_module(acme_jose_path)
    josepy_mod = importlib.import_module(josepy_path)

    assert acme_jose_mod is josepy_mod
    assert getattr(acme_jose_mod, attribute) is getattr(josepy_mod, attribute)

    # We use the imports below with eval, but pylint doesn't
    # understand that.
    import josepy  # pylint: disable=unused-import

    import acme  # pylint: disable=unused-import
    acme_jose_mod = eval(acme_jose_path)  # pylint: disable=eval-used
    josepy_mod = eval(josepy_path)  # pylint: disable=eval-used
    assert acme_jose_mod is josepy_mod
    assert getattr(acme_jose_mod, attribute) is getattr(josepy_mod, attribute)


def test_top_level():
    _test_it('', 'RS512')


def test_submodules():
    # This test ensures that the modules in josepy that were
    # available at the time it was moved into its own package are
    # available under acme.jose. Backwards compatibility with new
    # modules or testing code is not maintained.
    mods_and_attrs = [('b64', 'b64decode',),
                      ('errors', 'Error',),
                      ('interfaces', 'JSONDeSerializable',),
                      ('json_util', 'Field',),
                      ('jwa', 'HS256',),
                      ('jwk', 'JWK',),
                      ('jws', 'JWS',),
                      ('util', 'ImmutableMap',),]

    for mod, attr in mods_and_attrs:
        _test_it(mod, attr)


if __name__ == '__main__':
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -1,77 +0,0 @@
"""Test utilities.

.. warning:: This module is not part of the public API.

"""
import importlib.resources
import os
from typing import Callable

from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from josepy.util import ComparableECKey
from OpenSSL import crypto


def load_vector(*names):
    """Load contents of a test vector."""
    # luckily, resource_string opens file in binary mode
    vector_ref = importlib.resources.files(__package__).joinpath('testdata', *names)
    return vector_ref.read_bytes()


def _guess_loader(filename: str, loader_pem: Callable, loader_der: Callable) -> Callable:
    _, ext = os.path.splitext(filename)
    if ext.lower() == ".pem":
        return loader_pem
    elif ext.lower() == ".der":
        return loader_der
    else:  # pragma: no cover
        raise ValueError("Loader could not be recognized based on extension")


def _guess_pyopenssl_loader(filename: str, loader_pem: int, loader_der: int) -> int:
    _, ext = os.path.splitext(filename)
    if ext.lower() == ".pem":
        return loader_pem
    else:  # pragma: no cover
        raise ValueError("Loader could not be recognized based on extension")


def load_cert(*names: str) -> x509.Certificate:
    """Load certificate."""
    loader = _guess_loader(
        names[-1], x509.load_pem_x509_certificate, x509.load_der_x509_certificate
    )
    return loader(load_vector(*names))


def load_csr(*names: str) -> x509.CertificateSigningRequest:
    """Load certificate request."""
    loader = _guess_loader(names[-1], x509.load_pem_x509_csr, x509.load_der_x509_csr)
    return loader(load_vector(*names))


def load_rsa_private_key(*names):
    """Load RSA private key."""
    loader = _guess_loader(names[-1], serialization.load_pem_private_key,
                           serialization.load_der_private_key)
    return jose.ComparableRSAKey(loader(
        load_vector(*names), password=None, backend=default_backend()))


def load_ecdsa_private_key(*names):
    """Load ECDSA private key."""
    loader = _guess_loader(names[-1], serialization.load_pem_private_key,
                           serialization.load_der_private_key)
    return ComparableECKey(loader(
        load_vector(*names), password=None, backend=default_backend()))


def load_pyopenssl_private_key(*names):
    """Load pyOpenSSL private key."""
    loader = _guess_pyopenssl_loader(
        names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
    return crypto.load_privatekey(loader, load_vector(*names))
acme/src/acme/_internal/tests/testdata/README (vendored, 21 lines)
@@ -1,21 +0,0 @@
In order for acme.test_util._guess_loader to work properly, make sure
to use appropriate extension for vector filenames: .pem for PEM and
.der for DER.

The following command has been used to generate test keys:

  for k in 256 512 1024 2048 4096; do openssl genrsa -out rsa${k}_key.pem $k; done

and for the CSR:

  openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der

and for the certificates:

  openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
  openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 > rsa2048_cert.pem
  openssl req -key rsa1024_key.pem -new -subj '/CN=example.com' -x509 > rsa1024_cert.pem

and for the elliptic key curves:

  openssl genpkey -algorithm EC -out ec_secp384r1.pem -pkeyopt ec_paramgen_curve:P-384 -pkeyopt ec_param_enc:named_curve
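The removed README above generates its RSA test vectors with the `openssl` CLI. An equivalent of the key-plus-self-signed-certificate steps can be sketched with the `cryptography` package (a rough stand-in, not the project's actual tooling; CN and validity period are chosen to mirror the commands above):

```python
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

# Roughly: openssl genrsa -out rsa2048_key.pem 2048
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
key_pem = key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.TraditionalOpenSSL,
    serialization.NoEncryption(),
)

# Roughly: openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")])
now = datetime.datetime.now(datetime.timezone.utc)
cert = (
    x509.CertificateBuilder()
    .subject_name(name)           # self-signed: subject == issuer
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=30))  # assumed validity
    .sign(key, hashes.SHA256())
)
cert_pem = cert.public_bytes(serialization.Encoding.PEM)
```

Writing `key_pem` and `cert_pem` to `rsa2048_key.pem` / `rsa2048_cert.pem` would reproduce the PEM vectors without an `openssl` binary.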
@@ -1,21 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIDizCCAnOgAwIBAgIIPNBLQXwhoUkwDQYJKoZIhvcNAQELBQAwKDEmMCQGA1UE
AxMdUGViYmxlIEludGVybWVkaWF0ZSBDQSAxNzNiMjYwHhcNMjAwNTI5MTkxODA5
WhcNMjUwNTI5MTkxODA5WjAWMRQwEgYDVQQDEwsxOTIuMC4yLjE0NTCCASIwDQYJ
KoZIhvcNAQEBBQADggEPADCCAQoCggEBALyChb+NDA26GF1AfC0nzEdfOTchKw0h
q41xEjonvg5UXgZf/aH/ntvugIkYP0MaFifNAjebOVVsemEVEtyWcUKTfBHKZGbZ
ukTDwFIjfTccCfo6U/B2H7ZLzJIywl8DcUw9DypadeQBm8PS0VVR2ncy73dvaqym
crhAwlASyXU0mhLqRDMMxfg5Bn/FWpcsIcDpLmPn8Q/FvdRc2t5ryBNw/aWOlwqT
Oy16nbfLj2T0zG1A3aPuD+eT/JFUe/o3K7R+FAx7wt+RziQO46wLVVF1SueZUrIU
zqN04Gl8Kt1WM2SniZ0gq/rORUNcPtT0NAEsEslTQfA+Trq6j2peqyMCAwEAAaOB
yjCBxzAOBgNVHQ8BAf8EBAMCBaAwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUF
BwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHj1mwZzP//nMIH2i58NRUl/arHn
MB8GA1UdIwQYMBaAFF5DVAKabvIUvKFHGouscA2Qdpe6MDEGCCsGAQUFBwEBBCUw
IzAhBggrBgEFBQcwAYYVaHR0cDovLzEyNy4wLjAuMTo0MDAyMBUGA1UdEQQOMAyH
BMAAApGHBMsAcQEwDQYJKoZIhvcNAQELBQADggEBAHjSgDg76/UCIMSYddyhj18r
LdNKjA7p8ovnErSkebFT4lIZ9f3Sma9moNr0w64M33NamuFyHe/KTdk90mvoW8Uu
26aDekiRIeeMakzbAtDKn67tt2tbedKIYRATcSYVwsV46uZKbM621dZKIjjxOWpo
IY6rZYrku8LYhoXJXOqRduV3cTRVuTm5bBa9FfVNtt6N1T5JOtKKDEhuSaF4RSug
PDy3hQIiHrVvhPfVrXU3j6owz/8UCS5549inES9ONTFrvM9o0H1R/MsmGNXR5hF5
iJqHKC7n8LZujhVnoFIpHu2Dsiefbfr+yRYJS4I+ezy6Nq/Ok8rc8zp0eoX+uyY=
-----END CERTIFICATE-----
@@ -1,22 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIDmzCCAoOgAwIBAgIIFdxeZP+v2rgwDQYJKoZIhvcNAQELBQAwKDEmMCQGA1UE
AxMdUGViYmxlIEludGVybWVkaWF0ZSBDQSA0M2M5NTcwHhcNMjAwNTMwMDQwNzMw
WhcNMjUwNTMwMDQwNzMwWjAOMQwwCgYDVQQDEwM6OjEwggEiMA0GCSqGSIb3DQEB
AQUAA4IBDwAwggEKAoIBAQC7VidVduJvqKtrSH0fw6PjE0cqL4Kfzo7klexWUkHG
KVAa0fRVZFZ462jxKOt417V2U4WJQ6WHHO9PJ+3gW62d/MhCw8FRtUQS4nYFjqB6
32+RFU21VRN7cWoQEqSwnEPbh/v/zv/KS5JhQ+swWUo79AOLm1kjnZWCKtcqh1Lc
Ug5Tkpot6luoxTKp52MkchvXDpj0q2B/XpLJ8/pw5cqjv7mH12EDOK2HXllA+WwX
ZpstcEhaA4FqtaHOW/OHnwTX5MUbINXE5YYHVEDR6moVM31/W/3pe9NDUMTDE7Si
lVQnZbXM9NYbzZqlh+WhemDWwnIfGI6rtsfNEiirVEOlAgMBAAGjgeIwgd8wDgYD
VR0PAQH/BAQDAgWgMB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNV
HRMBAf8EAjAAMB0GA1UdDgQWBBS8DL+MZfDIy6AKky69Tgry2Vxq5DAfBgNVHSME
GDAWgBRAsFqVenRRKgB1YPzWKzb9bzZ/ozAxBggrBgEFBQcBAQQlMCMwIQYIKwYB
BQUHMAGGFWh0dHA6Ly8xMjcuMC4wLjE6NDAwMjAtBgNVHREEJjAkhxAAAAAAAAAA
AAAAAAAAAAABhxCjvjLzIG7HXQlWDO6YWF7FMA0GCSqGSIb3DQEBCwUAA4IBAQBY
M9UTZ3uaKMQ+He9kWR3p9jh6hTSD0FNi79ZdfkG0lgSzhhduhN7OhzQH2ihUUfa6
rtKTw74fGbszhizCd9UB8YPKlm3si1Xbg6ZUQlA1RtoQo7RUGEa6ZbR68PKGm9Go
hTTFIl/JS8jzxBR8jywZdyqtprUx+nnNUDiNk0hJtFLhw7OJH0AHlAUNqHsfD08m
HXRdaV6q14HXU5g31slBat9H4D6tCU/2uqBURwW0wVdnqh4QeRfAeqiatJS9EmSF
ctbc7n894Idy2Xce7NFoIy5cht3m6Rd42o/LmBsJopBmQcDPZT70/XzRtc2qE0cS
CzBIGQHUJ6BfmBjrCQnp
-----END CERTIFICATE-----
@@ -1,16 +0,0 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICbTCCAVUCAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKT/
CE7Y5EYBvI4p7Frt763upIKHDHO/R5/TWMjG8Jm9qTMui8sbMgyh2Yh+lR/j/5Xd
tQrhgC6wx10MrW2+3JtYS88HP1p6si8zU1dbK34n3NyyklR2RivW0R7dXgnYNy7t
5YcDYLCrbRMIPINV/uHrmzIHWYUDNcZVdAfIM2AHfKYuV6Mepcn///5GR+l4GcAh
Nkf9CW8OdAIuKdbyLCxVr0mUW/vJz1b12uxPsgUdax9sjXgZdT4pfMXADsFd1NeF
atpsXU073inqtHru+2F9ijHTQ75TC+u/rr6eYl3BnBntac0gp/ADtDBii7/Q1JOO
Bhq7xJNqqxIEdiyM7zcCAwEAAaAoMCYGCSqGSIb3DQEJDjEZMBcwFQYDVR0RBA4w
DIcEwAACkYcEywBxATANBgkqhkiG9w0BAQsFAAOCAQEADG5g3zdbSCaXpZhWHkzE
Mek3f442TUE1pB+ITRpthmM4N3zZWETYmbLCIAO624uMrRnbCCMvAoLs/L/9ETg/
XMMFtonQC8u9i9tV8B1ceBh8lpIfa+8b9TMWH3bqnrbWQ+YIl+Yd0gXiCZWJ9vK4
eM1Gddu/2bR6s/k4h/XAWRgEexqk57EHr1z0N+T9OoX939n3mVcNI+u9kfd5VJ0z
VyA3R8WR6T6KlEl5P5pcWe5Kuyhi7xMmLVImXqBtvKq4O1AMfM+gQr/yn9aE8IRq
khP7JrMBLUIub1c/qu2TfvnynNPSM/ZcOX+6PHdHmRkR3nI0Ndpv7Ntv31FTplAm
Dw==
-----END CERTIFICATE REQUEST-----
@@ -1,16 +0,0 @@
-----BEGIN CERTIFICATE REQUEST-----
MIIChTCCAW0CAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOIc
UAppcqJfTkSqqOFqGt1v7lIJZPOcF4bcKI3d5cHAGbOuVxbC7uMaDuObwYLzoiED
qnvs1NaEq2phO6KsgGESB7IE2LUjJivO7OnSZjNRpL5si/9egvBiNCn/50lULaWG
gLEuyMfk3awZy2mVAymy7Grhbx069A4TH8TqsHuq2RpKyuDL27e/jUt6yYecb3pu
hWMiWy3segif4tI46pkOW0/I6DpxyYD2OqOvzxm/voS9RMqE2+7YJA327H7bEi3N
lJZEZ1zy7clZ9ga5fBQaetzbg2RyxTrZ7F919NQXSFoXgxb10Eg64wIpz0L3ooCm
GEHehsZZexa3J5ccIvMCAwEAAaBAMD4GCSqGSIb3DQEJDjExMC8wLQYDVR0RBCYw
JIcQAAAAAAAAAAAAAAAAAAAAAYcQo74y8yBux10JVgzumFhexTANBgkqhkiG9w0B
AQsFAAOCAQEALvwVn0A/JPTCiNzcozHFnp5M23C9PXCplWc5u4k34d4XXzpSeFDz
fL4gy7NpYIueme2K2ppw2j3PNQUdR6vQ5a75sriegWYrosL+7Q6Joh51ZyEUZQoD
mNl4M4S4oX85EaChR6NFGBywTfjFarYi32XBTbFE7rK8N8KM+DQkNdwL1MXqaHWz
F1obQKpNXlLedbCBOteV5Eg4zG3565zu/Gw/NhwzzV3mQmgxUcd1sMJxAfHQz4Vl
ImLL+xMcR03nDsH2bgtDbK2tJm7WszSxA9tC+Xp2lRewxrnQloRWPYDz177WGQ5Q
SoGDzTTtA6uWZxG8h7CkNLOGvA8LtU2rNA==
-----END CERTIFICATE REQUEST-----
@@ -1,16 +0,0 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICdjCCAV4CAQAwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBANoV
T1pdvRUUBOqvm7M2ebLEHV7higUH7qAGUZEkfP6W4YriYVY+IHrH1svNPSa+oPTK
7weDNmT11ehWnGyECIM9z2r2Hi9yVV0ycxh4hWQ4Nt8BAKZwCwaXpyWm7Gj6m2Ez
pSN5Dd67g5YAQBrUUh1+RRbFi9c0Ls/6ZOExMvfg8kqt4c2sXCgH1IFnxvvOjBYo
p7xh0x3L1Akyax0tw8qgQp/z5mkupmVDNJYPFmbzFPMNyDR61ed6QUTDg7P4UAuF
kejLLzFvz5YaO7vC+huaTuPhInAhpzqpr4yU97KIjos2/83Itu/Cv8U1RAeEeRTk
h0WjUfltoem/5f8bIdsCAwEAAaAxMC8GCSqGSIb3DQEJDjEiMCAwHgYDVR0RBBcw
FYINYS5leGVtcGxlLmNvbYcEwAACbzANBgkqhkiG9w0BAQsFAAOCAQEAQ7n/hYen
5INHlcslHPYCQ/BAbX6Ou+Y8hUu8puWNVpE2OM95L2C87jbWwTmCRnkFBwtyoNqo
j3DXVW2RYv8y/exq7V6Y5LtpHTgwfugINJ3XlcVzA4Vnf1xqOxv3kwejkq74RuXn
xd5N28srgiFqb0e4tOAWVI8Tw27bgBqjoXl0QDFPZpctqUia5bcDJ9WzNSM7VaO1
CBNGHBRz+zL8sqoqJA4HV58tjcgzl+1RtGM+iUHxXpnH+aCNKWIUINrAzIm4Sm00
93RJjhb1kdNR0BC7ikWVbAWaVviHdvATK/RfpmhWDqfEaNgBpvT91GnkhpzctSFD
ro0yCUUXXrIr0w==
-----END CERTIFICATE REQUEST-----
@@ -1,6 +0,0 @@
-----BEGIN PRIVATE KEY-----
MIG2AgEAMBAGByqGSM49AgEGBSuBBAAiBIGeMIGbAgEBBDArTn0pbFk3xHfKeXte
xJgS4JVdJQ8mqvezhaNpULZPnwb+mcKLlrj6f5SRM52yREGhZANiAAQcrMoPMVqV
rHnDGGz5HUKLNmXfChlNgsrwsruawXF+M283CA6eckAjTXNyiC/ounWmvtoKsZG0
2UQOfQUNSCANId/r986yRGc03W6RJSkcRp86qBYjNsLgbZpber/3+M4=
-----END PRIVATE KEY-----
@@ -1,13 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIB/TCCAWagAwIBAgIJAOyRIBs3QT8QMA0GCSqGSIb3DQEBCwUAMBYxFDASBgNV
BAMMC2V4YW1wbGUuY29tMB4XDTE4MDQyMzEwMzE0NFoXDTE4MDUyMzEwMzE0NFow
FjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJ
AoGBAJqJ87R8aVwByONxgQA9hwgvQd/QqI1r1UInXhEF2VnEtZGtUWLi100IpIqr
Mq4qusDwNZ3g8cUPtSkvJGs89djoajMDIJP7lQUEKUYnYrI0q755Tr/DgLWSk7iW
l5ezym0VzWUD0/xXUz8yRbNMTjTac80rS5SZk2ja2wWkYlRJAgMBAAGjUzBRMB0G
A1UdDgQWBBSsaX0IVZ4XXwdeffVAbG7gnxSYjTAfBgNVHSMEGDAWgBSsaX0IVZ4X
XwdeffVAbG7gnxSYjTAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4GB
ADe7SVmvGH2nkwVfONk8TauRUDkePN1CJZKFb2zW1uO9ANJ2v5Arm/OQp0BG/xnI
Djw/aLTNVESF89oe15dkrUErtcaF413MC1Ld5lTCaJLHLGqDKY69e02YwRuxW7jY
qarpt7k7aR5FbcfO5r4V/FK/Gvp4Dmoky8uap7SJIW6x
-----END CERTIFICATE-----
@@ -1,30 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIFDTCCAvWgAwIBAgIUImqDrP53V69vFROsjP/gL0YtoA4wDQYJKoZIhvcNAQEL
BQAwFjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wHhcNMjAwNTI3MjMyNDE0WhcNMjAw
NjI2MjMyNDE0WjAWMRQwEgYDVQQDDAtleGFtcGxlLmNvbTCCAiIwDQYJKoZIhvcN
AQEBBQADggIPADCCAgoCggIBANY9LKLk9Dxn0MUMQFHwBoTN4ehDSWBws2KcytpF
mc8m9Mfk1wmb4fQSKYtK3wIFMfIyo9HQu0nKqMkkUw52o3ZXyOv+oWwF5qNy2BKu
lh5OMSkaZ0o13zoPpW42e+IUnyxvg70+0urD+sUue4cyTHh/nBIUjrM/05ZJ/ac8
HR0RK3H41YoqBjq69JjMZczZZhbNFit3s6p0R1TbVAgc3ckqbtX5BDyQMQQCP4Ed
m4DgbAFVqdcPUCC5W3F3fmuQiPKHiADzONZnXpy6lUvLDWqcd6loKp+nKHM6OkXX
8hmD7pE1PYMQo4hqOfhBR2IgMjAShwd5qUFjl1m2oo0Qm3PFXOk6i2ZQdS6AA/yd
B5/mX0RnM2oIdFZPb6UZFSmtEgs9sTzn+hMUyNSZQRE54px1ur1xws2R+vbsCyM5
+KoFVxDjVjU9TlZx3GvDvnqz/tbHjji6l8VHZYOBMBUXbKHu2U6pJFZ5Zp7k68/z
a3Fb9Pjtn3iRkXEyC0N5kLgqO4QTlExnxebV8aMvQpWd/qefnMn9qPYIZPEXSQAR
mEBIahkcACb60s+acG0WFFluwBPtBqEr8Q67XlSF0Ibf4iBiRzpPobhlWta1nrFg
4IWHMSoZ0PE75bhIGBEkhrpcXQCAxXmAfxfjKDH7jdJ1fRdnZ/9+OzwYGVX5GH/l
0QDtAgMBAAGjUzBRMB0GA1UdDgQWBBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAfBgNV
HSMEGDAWgBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAPBgNVHRMBAf8EBTADAQH/MA0G
CSqGSIb3DQEBCwUAA4ICAQAELoXz31oR9pdAwidlv9ZBOKiC7KBWy8VMqXNVkfTn
bVRxAUex7zleLFIOkWnqadsMesU9sIwrbLzBcZ8Q/vBY+z2xOPdXcgcAoAmdKWoq
YBQNiqng9r54sqlzB/77QZCf5fdktESe7NTxhCifgx5SAWq7IUQs/lm3tnMUSAfE
5ctuN6M+w8K54y3WDprcfMHpnc3ZHeSPhVQApHM0h/bDvXq0bRS7kmq27Hb153Qm
nH3TwYB5pPSWW38NbUc+s/a7mItO7S8ly8yGbA0j9c/IbN5lM+OCdk06asz3+c8E
uo8nuCBoYO5+6AqC2N7WJ3Tdr/pFA8jTbd6VNVlgCWTIR8ZosL5Fgkfv+4fUBrHt
zdVUqMUzvga5rvZnwnJ5Qfu/drHeAAo9MTNFQNe2QgDlYfWBh5GweolgmFSwrpkY
v/5wLtIyv/ASHKswybbqMIlpttcLTXjx5yuh8swttT6Wh+FQqqQ32KSRB3StiwyK
oH0ZhrwYHiFYNlPxecGX6XUta6rFtTlEdkBGSnXzgiTzL2l+Nc0as0V5B9RninZG
qJ+VOChSQ0OFvg1riSXv7tMvbLdGQnxwTRL3t6BMS8I4LA2m3ZfWUcuXT783ODTH
16f1Q1AgXd2csstTWO9cv+N/0fpX31nqrm6+CrGduSr2u4HjYYnlLIUhmdTvK3fX
Fg==
-----END CERTIFICATE-----
@@ -1,51 +0,0 @@
-----BEGIN RSA PRIVATE KEY-----
MIIJKgIBAAKCAgEA1j0souT0PGfQxQxAUfAGhM3h6ENJYHCzYpzK2kWZzyb0x+TX
CZvh9BIpi0rfAgUx8jKj0dC7ScqoySRTDnajdlfI6/6hbAXmo3LYEq6WHk4xKRpn
SjXfOg+lbjZ74hSfLG+DvT7S6sP6xS57hzJMeH+cEhSOsz/Tlkn9pzwdHRErcfjV
iioGOrr0mMxlzNlmFs0WK3ezqnRHVNtUCBzdySpu1fkEPJAxBAI/gR2bgOBsAVWp
1w9QILlbcXd+a5CI8oeIAPM41mdenLqVS8sNapx3qWgqn6coczo6RdfyGYPukTU9
gxCjiGo5+EFHYiAyMBKHB3mpQWOXWbaijRCbc8Vc6TqLZlB1LoAD/J0Hn+ZfRGcz
agh0Vk9vpRkVKa0SCz2xPOf6ExTI1JlBETninHW6vXHCzZH69uwLIzn4qgVXEONW
NT1OVnHca8O+erP+1seOOLqXxUdlg4EwFRdsoe7ZTqkkVnlmnuTrz/NrcVv0+O2f
eJGRcTILQ3mQuCo7hBOUTGfF5tXxoy9ClZ3+p5+cyf2o9ghk8RdJABGYQEhqGRwA
JvrSz5pwbRYUWW7AE+0GoSvxDrteVIXQht/iIGJHOk+huGVa1rWesWDghYcxKhnQ
8TvluEgYESSGulxdAIDFeYB/F+MoMfuN0nV9F2dn/347PBgZVfkYf+XRAO0CAwEA
AQKCAgEA0hZdTkQtCYtYm9LexDsXeWYX8VcCfrMmBj7xYcg9A3oVMmzDPuYBVwH0
gWbjd6y2hOaJ5TfGYZ99kvmvBRDsTSHaoyopC7BhssjtAKz6Ay/0X3VH8usPQ3WS
aZi+NT65tK6KRqtz08ppgLGLa1G00bl5x/Um1rpxeACI4FU/y4BJ1VMJvJpnT3KE
Z86Qyagqx5NH+UpCApZSWPFX3zjHePzGgcfXErjniCHYOnpZQrFQ2KIzkfSvQ9fg
x01ByKOM2CB2C1B33TCzBAioXRH6zyAu7A59NeCK9ywTduhDvie1a+oEryFC7IQW
4s7I/H3MGX4hsf/pLXlHMy+5CZJOjRaC2h+pypfbbcuiXu6Sn64kHNpiI7SxI5DI
MIRjyG7MdUcrzq0Rt8ogwwpbCoRqrl/w3bhxtqmeZaEZtyxbjlm7reK2YkIFDgyz
JMqiJK5ZAi+9L/8c0xhjjAQQ0sIzrjmjA8U+6YnWL9jU5qXTVnBB8XQucyeeZGgk
yRHyMur71qOXN8z3UEva7MHkDTUBlj8DgTz6sEjqCipaWl0CXfDNa4IhHIXD5qiF
wplhq7OeS0v6EGG/UFa3Q/lFntxtrayxJX7uvvSccGzjPKXTjpWUELLi/FdnIsum
eXT3RgIEYozj4BibDXaBLfHTCVzxOr7AAEvKM9XWSUgLA0paSWECggEBAO9ZBeE1
GWzd1ejTTkcxBC9AK2rNsYG8PdNqiof/iTbuJWNeRqpG+KB/0CNIpjZ2X5xZd0tM
FDpHTFehlP26Roxuq50iRAFc+SN5KoiO0A3JuJAidreIgRTia1saUUrypHqWrYEA
VZVj2AI8Pyg3s1OkR2frFskY7hXBVb/pJNDP/m9xTXXIYiIXYkHYe+4RIJCnAxRv
q5YHKaX+0Ull9YCZJCxmwvcHat8sgu8qkiwUMEM6QSNEkrEbdnWYBABvC1AR6sws
7MP1h9+j22n4Zc/3D6kpFZEL9Erx8nNyhbOZ6q2Tdnf6YKVVjZdyVa8VyNnR0ROl
3BjkFaHb/bg4e4kCggEBAOUk8ZJS3qBeGCOjug384zbHGcnhUBYtYJiOz+RXBtP+
PRksbFtTkgk1sHuSGO8YRddU4Qv7Av1xL8o+DEsLBSD0YQ7pmLrR/LK+iDQ5N63O
Fve9uJH0ybxAOkiua7G24+lTsVUP//KWToL4Wh5zbHBBjL5D2Z9zoeVbcE87xhva
lImMVr4Ex252DqNP9wkZxBjudFyJ/C/TnXrjPcgwhxWTC7sLQMhE5p+490G7c4hX
PywkIKrANbu37KDiAvVS+dC66ZgpL/NUDkeloAmGNO08LGzbV6YKchlvDyWU/AvW
0hYjbL0FUq7K/wp1G9fumolB+fbI25K9c13X93STzUUCggEBAJDsNFUyk5yJjbYW
C/WrRj9d+WwH9Az77+uNPSgvn+O0usq6EMuVgYGdImfa21lqv2Wp/kOHY1AOT7lX
yyD+oyzw7dSNJOQ2aVwDR6+72Vof5DLRy1RBwPbmSd61xrc8yD658YCEtU1pUSe5
VvyBDYH9nIbdn8RP5gkiMUusXXBaIFNWJXLFzDWcNxBrhk6V7EPp/EFphFmpKJyr
+AkbRVWCZJbF+hMdWKadCwLJogwyhS6PnVU/dhrq6AU38GRa2Fy5HJRYN1xH1Oej
DX3Su8L6c28Xw0k6FcczTHx+wVoIPkKvYTIwVkiFzt/+iMckx6KsGo5tBSHFKRwC
WlQrTxECggEBALjUruLnY1oZ7AC7bTUhOimSOfQEgTQSUCtebsRxijlvhtsKYTDd
XRt+qidStjgN7S/+8DRYuZWzOeg5WnMhpXZqiOudcyume922IGl3ibjxVsdoyjs5
J4xohlrgDlBgBMDNWGoTqNGFejjcmNydH+gAh8VlN2INxJYbxqCyx17qVgwJHmLR
uggYxD/pHYvCs9GkbknCp5/wYsOgDtKuihfV741lS1D/esN1UEQ+LrfYIEW7snno
5q7Pcdhn1hkKYCWEzy2Ec4Aj2gzixQ9JqOF/OxpnZvCw1k47rg0TeqcWFYnz8x8Y
7xO8/DH0OoxXk2GJzVXJuItJs4gLzzfCjL0CggEAJFHfC9jisdy7CoWiOpNCSF1B
S0/CWDz77cZdlWkpTdaXGGp1MA/UKUFPIH8sOHfvpKS660+X4G/1ZBHmFb4P5kFF
Qy8UyUMKtSOEdZS6KFlRlfSCAMd5aSTmCvq4OSjYEpMRwUhU/iEJNkn9Z1Soehe0
U3dxJ8KiT1071geO6rRquSHoSJs6Y0WQKriYYQJOhh4Axs3PQihER2eyh+WGk8YJ
02m0mMsjntqnXtdc6IcdKaHp9ko+OpM9QZLsvt19fxBcrXj/i21uUXrzuNtKfO6M
JqGhsOrO2dh8lMhvodENvgKA0DmYDC9N7ogo7bxTNSedcjBF46FhJoqii8m70Q==
-----END RSA PRIVATE KEY-----
@@ -1,15 +0,0 @@
"""Tests for acme.util."""
import sys

import pytest


def test_it():
    from acme.util import map_keys
    assert {'a': 'b', 'c': 'd'} == \
        map_keys({'a': 'b', 'c': 'd'}, lambda key: key)
    assert {2: 2, 4: 4} == map_keys({1: 2, 3: 4}, lambda x: x + 1)


if __name__ == '__main__':
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
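The deleted test above exercises a `map_keys` helper that applies a function to every key of a dict while leaving values alone. A minimal sketch consistent with those assertions (this is an illustrative reimplementation, not the removed `acme.util` source):

```python
def map_keys(dikt, func):
    """Return a new dict with ``func`` applied to every key of ``dikt``."""
    return {func(key): value for key, value in dikt.items()}


# The identity function leaves the dict unchanged; shifting integer keys
# by one reproduces the second assertion in the deleted test.
print(map_keys({'a': 'b', 'c': 'd'}, lambda key: key))
print(map_keys({1: 2, 3: 4}, lambda x: x + 1))
```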
@@ -1,686 +0,0 @@
"""ACME Identifier Validation Challenges."""
import abc
import functools
import hashlib
import logging
import socket
from typing import Any
from typing import cast
from typing import Dict
from typing import Mapping
from typing import Optional
from typing import Tuple
from typing import Type
from typing import TypeVar
from typing import Union
import warnings

from cryptography import x509
from cryptography.hazmat.primitives import hashes
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL
import requests

from acme import crypto_util
from acme import errors

logger = logging.getLogger(__name__)

GenericChallenge = TypeVar('GenericChallenge', bound='Challenge')


class Challenge(jose.TypedJSONObjectWithFields):
    # _fields_to_partial_json
    """ACME challenge."""
    TYPES: Dict[str, Type['Challenge']] = {}

    @classmethod
    def from_json(cls: Type[GenericChallenge],
                  jobj: Mapping[str, Any]) -> Union[GenericChallenge, 'UnrecognizedChallenge']:
        try:
            return cast(GenericChallenge, super().from_json(jobj))
        except jose.UnrecognizedTypeError as error:
            logger.debug(error)
            return UnrecognizedChallenge.from_json(jobj)


class ChallengeResponse(jose.TypedJSONObjectWithFields):
    # _fields_to_partial_json
    """ACME challenge response."""
    TYPES: Dict[str, Type['ChallengeResponse']] = {}

    def to_partial_json(self) -> Dict[str, Any]:
        # Removes the `type` field which is inserted by TypedJSONObjectWithFields.to_partial_json.
        # This field breaks RFC8555 compliance.
        jobj = super().to_partial_json()
        jobj.pop(self.type_field_name, None)
        return jobj


class UnrecognizedChallenge(Challenge):
    """Unrecognized challenge.

    ACME specification defines a generic framework for challenges and
    defines some standard challenges that are implemented in this
    module. However, other implementations (including peers) might
    define additional challenge types, which should be ignored if
    unrecognized.

    :ivar jobj: Original JSON decoded object.

    """
    jobj: Dict[str, Any]

    def __init__(self, jobj: Mapping[str, Any]) -> None:
        super().__init__()
        object.__setattr__(self, "jobj", jobj)

    def to_partial_json(self) -> Dict[str, Any]:
        return self.jobj  # pylint: disable=no-member

    @classmethod
    def from_json(cls, jobj: Mapping[str, Any]) -> 'UnrecognizedChallenge':
        return cls(jobj)


class _TokenChallenge(Challenge):
    """Challenge with token.

    :ivar bytes token:

    """
    TOKEN_SIZE = 128 // 8  # Based on the entropy value from the spec
    """Minimum size of the :attr:`token` in bytes."""

    # TODO: acme-spec doesn't specify token as base64-encoded value
    token: bytes = jose.field(
        "token", encoder=jose.encode_b64jose, decoder=functools.partial(
            jose.decode_b64jose, size=TOKEN_SIZE, minimum=True))

    # XXX: rename to ~token_good_for_url
    @property
    def good_token(self) -> bool:  # XXX: @token.decoder
        """Is `token` good?

        .. todo:: acme-spec wants "It MUST NOT contain any non-ASCII
           characters", but it should also warrant that it doesn't
           contain ".." or "/"...

        """
        # TODO: check that path combined with uri does not go above
        # URI_ROOT_PATH!
        # pylint: disable=unsupported-membership-test
        return b'..' not in self.token and b'/' not in self.token


class KeyAuthorizationChallengeResponse(ChallengeResponse):
    """Response to Challenges based on Key Authorization.

    :param str key_authorization:

    """
    key_authorization: str = jose.field("keyAuthorization")
    thumbprint_hash_function = hashes.SHA256

    def verify(self, chall: 'KeyAuthorizationChallenge', account_public_key: jose.JWK) -> bool:
        """Verify the key authorization.

        :param KeyAuthorization chall: Challenge that corresponds to
            this response.
        :param JWK account_public_key:

        :return: ``True`` iff verification of the key authorization was
            successful.
        :rtype: bool

        """
        parts = self.key_authorization.split('.')  # pylint: disable=no-member
        if len(parts) != 2:
            logger.debug("Key authorization (%r) is not well formed",
                         self.key_authorization)
            return False

        if parts[0] != chall.encode("token"):
            logger.debug("Mismatching token in key authorization: "
                         "%r instead of %r", parts[0], chall.encode("token"))
            return False

        thumbprint = jose.b64encode(account_public_key.thumbprint(
            hash_function=self.thumbprint_hash_function)).decode()
        if parts[1] != thumbprint:
            logger.debug("Mismatching thumbprint in key authorization: "
                         "%r instead of %r", parts[0], thumbprint)
            return False

        return True

    def to_partial_json(self) -> Dict[str, Any]:
        jobj = super().to_partial_json()
        jobj.pop('keyAuthorization', None)
        return jobj


# TODO: Make this method a generic of K (bound=KeyAuthorizationChallenge), response_cls of type
# Type[K] and use it in response/response_and_validation return types once Python 3.6 support is
# dropped (do not support generic ABC classes, see https://github.com/python/typing/issues/449).
class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
    """Challenge based on Key Authorization.

    :param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
        that will be used to generate ``response``.
    :param str typ: type of the challenge
    """
    typ: str = NotImplemented
    response_cls: Type[KeyAuthorizationChallengeResponse] = NotImplemented
    thumbprint_hash_function = (
        KeyAuthorizationChallengeResponse.thumbprint_hash_function)

    def key_authorization(self, account_key: jose.JWK) -> str:
        """Generate Key Authorization.

        :param JWK account_key:
        :rtype str:

        """
        return self.encode("token") + "." + jose.b64encode(
            account_key.thumbprint(
                hash_function=self.thumbprint_hash_function)).decode()

    def response(self, account_key: jose.JWK) -> KeyAuthorizationChallengeResponse:
        """Generate response to the challenge.

        :param JWK account_key:

        :returns: Response (initialized `response_cls`) to the challenge.
        :rtype: KeyAuthorizationChallengeResponse

        """
        return self.response_cls(  # pylint: disable=not-callable
            key_authorization=self.key_authorization(account_key))

    @abc.abstractmethod
    def validation(self, account_key: jose.JWK, **kwargs: Any) -> Any:
        """Generate validation for the challenge.

        Subclasses must implement this method, but they are likely to
        return completely different data structures, depending on what's
        necessary to complete the challenge. Interpretation of that
        return value must be known to the caller.

        :param JWK account_key:
        :returns: Challenge-specific validation.

        """
        raise NotImplementedError()  # pragma: no cover

    def response_and_validation(self, account_key: jose.JWK, *args: Any, **kwargs: Any
                                ) -> Tuple[KeyAuthorizationChallengeResponse, Any]:
        """Generate response and validation.

        Convenience function that return results of `response` and
        `validation`.

        :param JWK account_key:
        :rtype: tuple

        """
        return (self.response(account_key),
                self.validation(account_key, *args, **kwargs))
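The key authorization built by `key_authorization` above is simply the base64url-encoded token joined to the account key's JWK thumbprint with a dot (RFC 8555 section 8.1). A stdlib-only sketch of the same construction, where the token and thumbprint digest are made-up placeholders and `b64url` stands in for `jose.b64encode`:

```python
import base64
import hashlib


def b64url(data: bytes) -> str:
    """Unpadded base64url, matching the output shape of jose.b64encode."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def key_authorization(token: bytes, thumbprint_digest: bytes) -> str:
    """<b64url(token)> "." <b64url(SHA-256 JWK thumbprint)>."""
    return b64url(token) + "." + b64url(thumbprint_digest)


# Placeholder inputs: a real token comes from the ACME server and the
# thumbprint from the account key's canonical JWK representation.
token = b"\x01" * 16
thumbprint = hashlib.sha256(b"placeholder JWK").digest()
print(key_authorization(token, thumbprint))
```

`verify` in `KeyAuthorizationChallengeResponse` is the inverse check: split on the dot and compare each half against the expected token and thumbprint.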
@ChallengeResponse.register
class DNS01Response(KeyAuthorizationChallengeResponse):
    """ACME dns-01 challenge response."""
    typ = "dns-01"

    def simple_verify(self, chall: 'DNS01', domain: str, account_public_key: jose.JWK) -> bool:  # pylint: disable=unused-argument
        """Simple verify.

        This method no longer checks DNS records and is a simple wrapper
        around `KeyAuthorizationChallengeResponse.verify`.

        :param challenges.DNS01 chall: Corresponding challenge.
        :param str domain: Domain name being verified.
        :param JWK account_public_key: Public key for the key pair
            being authorized.

        :return: ``True`` iff verification of the key authorization was
            successful.
        :rtype: bool

        """
        verified = self.verify(chall, account_public_key)
        if not verified:
            logger.debug("Verification of key authorization in response failed")
        return verified


@Challenge.register
class DNS01(KeyAuthorizationChallenge):
    """ACME dns-01 challenge."""
    response_cls = DNS01Response
    typ = response_cls.typ

    LABEL = "_acme-challenge"
    """Label clients prepend to the domain name being validated."""

    def validation(self, account_key: jose.JWK, **unused_kwargs: Any) -> str:
        """Generate validation.

        :param JWK account_key:
        :rtype: str

        """
        return jose.b64encode(hashlib.sha256(self.key_authorization(
            account_key).encode("utf-8")).digest()).decode()

    def validation_domain_name(self, name: str) -> str:
        """Domain name for TXT validation record.

        :param str name: Domain name being validated.
        :rtype: str

        """
        return f"{self.LABEL}.{name}"
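Putting `validation` and `validation_domain_name` together: the dns-01 TXT record value is the unpadded base64url SHA-256 digest of the key authorization, published under the `_acme-challenge` label. A stdlib sketch with a made-up key authorization string:

```python
import base64
import hashlib


def dns01_txt_value(key_authorization: str) -> str:
    """base64url(SHA-256(key authorization)), without '=' padding."""
    digest = hashlib.sha256(key_authorization.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()


# Hypothetical inputs: a real key authorization is derived from the
# account key as shown in KeyAuthorizationChallenge.key_authorization.
record_name = "_acme-challenge.example.com"
record_value = dns01_txt_value("placeholder-token.placeholder-thumbprint")
print(record_name, record_value)
```

Since SHA-256 digests are 32 bytes, the resulting TXT value is always 43 base64url characters.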
@ChallengeResponse.register
class HTTP01Response(KeyAuthorizationChallengeResponse):
    """ACME http-01 challenge response."""
    typ = "http-01"

    PORT = 80
    """Verification port as defined by the protocol.

    You can override it (e.g. for testing) by passing ``port`` to
    `simple_verify`.

    """

    WHITESPACE_CUTSET = "\n\r\t "
    """Whitespace characters which should be ignored at the end of the body."""

    def simple_verify(self, chall: 'HTTP01', domain: str, account_public_key: jose.JWK,
                      port: Optional[int] = None, timeout: int = 30) -> bool:
        """Simple verify.

        :param challenges.SimpleHTTP chall: Corresponding challenge.
        :param str domain: Domain name being verified.
        :param JWK account_public_key: Public key for the key pair
            being authorized.
        :param int port: Port used in the validation.
        :param int timeout: Timeout in seconds.

        :returns: ``True`` iff validation with the files currently served by the
            HTTP server is successful.
        :rtype: bool

        """
        if not self.verify(chall, account_public_key):
            logger.debug("Verification of key authorization in response failed")
            return False

        # TODO: ACME specification defines URI template that doesn't
        # allow to use a custom port... Make sure port is not in the
        # request URI, if it's standard.
        if port is not None and port != self.PORT:
            logger.warning(
                "Using non-standard port for http-01 verification: %s", port)
            domain += ":{0}".format(port)

        uri = chall.uri(domain)
        logger.debug("Verifying %s at %s...", chall.typ, uri)
        try:
            http_response = requests.get(uri, verify=False, timeout=timeout)
        except requests.exceptions.RequestException as error:
            logger.error("Unable to reach %s: %s", uri, error)
            return False
        # By default, http_response.text will try to guess the encoding to use
        # when decoding the response to Python unicode strings. This guesswork
        # is error prone. RFC 8555 specifies that HTTP-01 responses should be
        # key authorizations with possible trailing whitespace. Since key
        # authorizations must be composed entirely of the base64url alphabet
        # plus ".", we tell requests that the response should be ASCII. See
        # https://datatracker.ietf.org/doc/html/rfc8555#section-8.3 for more
        # info.
        http_response.encoding = "ascii"
        logger.debug("Received %s: %s. Headers: %s", http_response,
                     http_response.text, http_response.headers)

        challenge_response = http_response.text.rstrip(self.WHITESPACE_CUTSET)
        if self.key_authorization != challenge_response:
            logger.debug("Key authorization from response (%r) doesn't match "
                         "HTTP response (%r)", self.key_authorization,
                         challenge_response)
            return False

        return True


@Challenge.register
class HTTP01(KeyAuthorizationChallenge):
    """ACME http-01 challenge."""
    response_cls = HTTP01Response
    typ = response_cls.typ

    URI_ROOT_PATH = ".well-known/acme-challenge"
    """URI root path for the server provisioned resource."""

    @property
    def path(self) -> str:
        """Path (starting with '/') for provisioned resource.

        :rtype: str

        """
        return '/' + self.URI_ROOT_PATH + '/' + self.encode('token')

    def uri(self, domain: str) -> str:
        """Create an URI to the provisioned resource.

        Forms an URI to the HTTPS server provisioned resource
        (containing :attr:`~SimpleHTTP.token`).

        :param str domain: Domain name being verified.
        :rtype: str

        """
        return "http://" + domain + self.path

    def validation(self, account_key: jose.JWK, **unused_kwargs: Any) -> str:
        """Generate validation.

        :param JWK account_key:
        :rtype: str

        """
        return self.key_authorization(account_key)
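The `path` and `uri` helpers above amount to serving the key authorization at a well-known URL derived from the base64url-encoded token. A stdlib sketch of the URL construction (the token here is a placeholder; in the real class `self.encode('token')` produces the base64url form via josepy's field encoder):

```python
import base64

URI_ROOT_PATH = ".well-known/acme-challenge"


def http01_uri(domain: str, token: bytes) -> str:
    """http://<domain>/.well-known/acme-challenge/<b64url(token)>"""
    encoded = base64.urlsafe_b64encode(token).rstrip(b"=").decode()
    return "http://" + domain + "/" + URI_ROOT_PATH + "/" + encoded


print(http01_uri("example.com", b"\x01" * 16))
```

The validation server then fetches this URI and compares the body, stripped of trailing whitespace, against the expected key authorization, exactly as `HTTP01Response.simple_verify` does.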
|
||||
|
||||
@ChallengeResponse.register
|
||||
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME tls-alpn-01 challenge response.
|
||||
|
||||
.. deprecated:: 4.1.0
|
||||
|
||||
"""
|
||||
typ = "tls-alpn-01"
|
||||
|
||||
PORT = 443
|
||||
"""Verification port as defined by the protocol.
|
||||
|
||||
You can override it (e.g. for testing) by passing ``port`` to
|
||||
`simple_verify`.
|
||||
|
||||
"""
|
||||
|
||||
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
|
||||
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
|
||||
|
||||
def __init__(self, *args: Any, **kwargs: Any) -> None:
|
||||
warnings.warn("TLSALPN01Response is deprecated and will be removed in an "
|
||||
"upcoming certbot major version update", DeprecationWarning)
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
@property
|
||||
def h(self) -> bytes:
|
||||
"""Hash value stored in challenge certificate"""
|
||||
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
|
||||
|
||||
def gen_cert(self, domain: str, key: Optional[crypto.PKey] = None, bits: int = 2048
|
||||
) -> Tuple[x509.Certificate, crypto.PKey]:
|
||||
"""Generate tls-alpn-01 certificate.
|
||||
|
||||
:param str domain: Domain verified by the challenge.
|
||||
:param OpenSSL.crypto.PKey key: Optional private key used in
|
||||
certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
:param int bits: Number of bits for newly generated key.
|
||||
|
||||
:rtype: `tuple` of `x509.Certificate` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
if key is None:
|
||||
key = crypto.PKey()
|
||||
key.generate_key(crypto.TYPE_RSA, bits)
|
||||
|
||||
oid = x509.ObjectIdentifier(self.ID_PE_ACME_IDENTIFIER_V1.decode())
|
||||
acme_extension = x509.Extension(
|
||||
oid,
|
||||
critical=True,
|
||||
value=x509.UnrecognizedExtension(oid, self.h)
|
||||
)
|
||||
|
||||
cryptography_key = key.to_cryptography_key()
|
||||
assert isinstance(cryptography_key, crypto_util.CertificateIssuerPrivateKeyTypesTpl)
|
||||
cert = crypto_util.make_self_signed_cert(
|
||||
cryptography_key,
|
||||
[domain],
|
||||
force_san=True,
|
||||
extensions=[acme_extension]
|
||||
)
|
||||
return cert, key
|
||||
|
||||
def probe_cert(self, domain: str, host: Optional[str] = None,
|
||||
port: Optional[int] = None) -> x509.Certificate:
|
||||
"""Probe tls-alpn-01 challenge certificate.
|
||||
|
||||
:param str domain: domain being validated, required.
|
||||
:param str host: IP address used to probe the certificate.
|
||||
:param int port: Port used to probe the certificate.
|
||||
|
||||
"""
|
||||
if host is None:
|
||||
host = socket.gethostbyname(domain)
|
||||
logger.debug('%s resolved to %s', domain, host)
|
||||
if port is None:
|
||||
port = self.PORT
|
||||
|
||||
with warnings.catch_warnings():
|
||||
warnings.filterwarnings(
|
||||
'ignore',
|
||||
message='alpn_protocols parameter is deprecated'
|
||||
)
|
||||
return crypto_util.probe_sni(host=host.encode(), port=port, name=domain.encode(),
|
||||
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
|
||||
|
||||
def verify_cert(self, domain: str, cert: x509.Certificate, ) -> bool:
|
||||
"""Verify tls-alpn-01 challenge certificate.
|
||||
|
||||
:param str domain: Domain name being validated.
|
||||
:param cert: Challenge certificate.
|
||||
:type cert: `cryptography.x509.Certificate`
|
||||
|
||||
:returns: Whether the certificate was successfully verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
names = crypto_util.get_names_from_subject_and_extensions(
|
||||
cert.subject, cert.extensions
|
||||
)
|
||||
logger.debug(
|
||||
"Certificate %s. SANs: %s", cert.fingerprint(hashes.SHA256()), names
|
||||
)
|
||||
if len(names) != 1 or names[0].lower() != domain.lower():
|
||||
return False
|
||||
|
||||
try:
|
||||
ext = cert.extensions.get_extension_for_oid(
|
||||
x509.ObjectIdentifier(self.ID_PE_ACME_IDENTIFIER_V1.decode())
|
||||
)
|
||||
except x509.ExtensionNotFound:
|
||||
return False
|
||||
|
||||
# This is for the type checker.
|
||||
assert isinstance(ext.value, x509.UnrecognizedExtension)
|
||||
return ext.value.value == self.h
|
||||
|
||||
# pylint: disable=too-many-arguments
|
||||
def simple_verify(self, chall: 'TLSALPN01', domain: str, account_public_key: jose.JWK,
|
||||
cert: Optional[x509.Certificate] = None, host: Optional[str] = None,
|
||||
port: Optional[int] = None) -> bool:
|
||||
"""Simple verify.
|
||||
|
||||
Verify ``validation`` using ``account_public_key``, optionally
|
||||
probe tls-alpn-01 certificate and check using `verify_cert`.
|
||||
|
||||
:param .challenges.TLSALPN01 chall: Corresponding challenge.
|
||||
:param str domain: Domain name being validated.
|
||||
:param JWK account_public_key:
|
||||
:param x509.Certificate cert: Optional certificate. If not
|
||||
provided (``None``) certificate will be retrieved using
|
||||
`probe_cert`.
|
||||
:param string host: IP address used to probe the certificate.
|
||||
:param int port: Port used to probe the certificate.
|
||||
|
||||
|
||||
:returns: ``True`` if and only if client's control of the domain has been verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not self.verify(chall, account_public_key):
|
||||
logger.debug("Verification of key authorization in response failed")
|
||||
return False
|
||||
|
||||
if cert is None:
|
||||
try:
|
||||
cert = self.probe_cert(domain=domain, host=host, port=port)
|
||||
except errors.Error as error:
|
||||
logger.debug(str(error), exc_info=True)
|
||||
return False
|
||||
|
||||
return self.verify_cert(domain, cert)
|
||||
|
||||
|
||||

@Challenge.register  # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
    """ACME tls-alpn-01 challenge.

    .. deprecated:: 4.1.0

    """
    response_cls = TLSALPN01Response
    typ = response_cls.typ

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        warnings.warn("TLSALPN01 is deprecated and will be removed in an "
                      "upcoming certbot major version update", DeprecationWarning)
        super().__init__(*args, **kwargs)

    def validation(self, account_key: jose.JWK,
                   **kwargs: Any) -> Tuple[x509.Certificate, crypto.PKey]:
        """Generate validation.

        :param JWK account_key:
        :param str domain: Domain verified by the challenge.
        :param OpenSSL.crypto.PKey cert_key: Optional private key used
            in certificate generation. If not provided (``None``), then
            fresh key will be generated.

        :rtype: `tuple` of `x509.Certificate` and `OpenSSL.crypto.PKey`

        """
        # TODO: Remove cast when response() is generic.
        return cast(TLSALPN01Response, self.response(account_key)).gen_cert(
            key=kwargs.get('cert_key'),
            domain=cast(str, kwargs.get('domain')))

    @staticmethod
    def is_supported() -> bool:
        """
        Check if TLS-ALPN-01 challenge is supported on this machine.
        This implies that a recent version of OpenSSL is installed (>= 1.0.2),
        or a recent cryptography version shipped with the OpenSSL library is installed.

        :returns: ``True`` if TLS-ALPN-01 is supported on this machine, ``False`` otherwise.
        :rtype: bool

        """
        warnings.warn("TLSALPN01 is deprecated and will be removed in an "
                      "upcoming certbot major version update", DeprecationWarning)
        return (hasattr(SSL.Connection, "set_alpn_protos")
                and hasattr(SSL.Context, "set_alpn_select_callback"))


@Challenge.register
class DNS(_TokenChallenge):
    """ACME "dns" challenge."""
    typ = "dns"

    LABEL = "_acme-challenge"
    """Label clients prepend to the domain name being validated."""

    def gen_validation(self, account_key: jose.JWK, alg: jose.JWASignature = jose.RS256,
                       **kwargs: Any) -> jose.JWS:
        """Generate validation.

        :param .JWK account_key: Private account key.
        :param .JWA alg:

        :returns: This challenge wrapped in `.JWS`
        :rtype: .JWS

        """
        return jose.JWS.sign(
            payload=self.json_dumps(sort_keys=True).encode('utf-8'),
            key=account_key, alg=alg, **kwargs)

    def check_validation(self, validation: jose.JWS, account_public_key: jose.JWK) -> bool:
        """Check validation.

        :param JWS validation:
        :param JWK account_public_key:
        :rtype: bool

        """
        if not validation.verify(key=account_public_key):
            return False
        try:
            return self == self.json_loads(
                validation.payload.decode('utf-8'))
        except jose.DeserializationError as error:
            logger.debug("Checking validation for DNS failed: %s", error)
            return False

    def gen_response(self, account_key: jose.JWK, **kwargs: Any) -> 'DNSResponse':
        """Generate response.

        :param .JWK account_key: Private account key.
        :param .JWA alg:

        :rtype: DNSResponse

        """
        return DNSResponse(validation=self.gen_validation(account_key, **kwargs))

    def validation_domain_name(self, name: str) -> str:
        """Domain name for TXT validation record.

        :param str name: Domain name being validated.

        """
        return "{0}.{1}".format(self.LABEL, name)


@ChallengeResponse.register
class DNSResponse(ChallengeResponse):
    """ACME "dns" challenge response.

    :param JWS validation:

    """
    typ = "dns"

    validation: jose.JWS = jose.field("validation", decoder=jose.JWS.from_json)

    def check_validation(self, chall: 'DNS', account_public_key: jose.JWK) -> bool:
        """Check validation.

        :param challenges.DNS chall:
        :param JWK account_public_key:

        :rtype: bool

        """
        return chall.check_validation(self.validation, account_public_key)
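The `LABEL` convention used by the DNS challenge above can be illustrated in isolation. This is a standalone sketch with no `acme` imports; `ACME_LABEL` and the free function `validation_domain_name` are illustrative names mirroring `DNS.LABEL` and the method of the same name:

```python
# Standalone sketch of the DNS challenge's validation_domain_name()
# convention: the "_acme-challenge" label is prepended to the name
# being validated to form the owner name of the TXT record.

ACME_LABEL = "_acme-challenge"  # mirrors DNS.LABEL above


def validation_domain_name(name: str) -> str:
    """Return the owner name of the TXT validation record for ``name``."""
    return "{0}.{1}".format(ACME_LABEL, name)


print(validation_domain_name("example.com"))  # _acme-challenge.example.com
```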
@@ -1,857 +0,0 @@
"""ACME client API."""
import base64
import datetime
from email.utils import parsedate_tz
import http.client as http_client
import logging
import math
import random
import time
from typing import Any
from typing import cast
from typing import List
from typing import Mapping
from typing import Optional
from typing import Set
from typing import Tuple
from typing import Union

from cryptography import x509

import josepy as jose
import requests
from requests.adapters import HTTPAdapter
from requests.utils import parse_header_links

from acme import challenges
from acme import crypto_util
from acme import errors
from acme import jws
from acme import messages

logger = logging.getLogger(__name__)

DEFAULT_NETWORK_TIMEOUT = 45

class ClientV2:
    """ACME client for a v2 API.

    :ivar messages.Directory directory:
    :ivar .ClientNetwork net: Client network.
    """

    def __init__(self, directory: messages.Directory, net: 'ClientNetwork') -> None:
        """Initialize.

        :param .messages.Directory directory: Directory Resource
        :param .ClientNetwork net: Client network.
        """
        self.directory = directory
        self.net = net

    def new_account(self, new_account: messages.NewRegistration) -> messages.RegistrationResource:
        """Register.

        :param .NewRegistration new_account:

        :raises .ConflictError: in case the account already exists

        :returns: Registration Resource.
        :rtype: `.RegistrationResource`
        """
        response = self._post(self.directory['newAccount'], new_account)
        # if account already exists
        if response.status_code == 200 and 'Location' in response.headers:
            raise errors.ConflictError(response.headers['Location'])
        # "Instance of 'Field' has no key/contact member" bug:
        regr = self._regr_from_response(response)
        self.net.account = regr
        return regr

    def query_registration(self, regr: messages.RegistrationResource
                           ) -> messages.RegistrationResource:
        """Query server about registration.

        :param messages.RegistrationResource regr: Existing Registration
            Resource.

        """
        self.net.account = self._get_v2_account(regr, True)

        return self.net.account

    def update_registration(self, regr: messages.RegistrationResource,
                            update: Optional[messages.Registration] = None
                            ) -> messages.RegistrationResource:
        """Update registration.

        :param messages.RegistrationResource regr: Registration Resource.
        :param messages.Registration update: Updated body of the
            resource. If not provided, body will be taken from `regr`.

        :returns: Updated Registration Resource.
        :rtype: `.RegistrationResource`

        """
        # https://github.com/certbot/certbot/issues/6155
        regr = self._get_v2_account(regr)

        update = regr.body if update is None else update
        body = messages.UpdateRegistration(**dict(update))
        updated_regr = self._send_recv_regr(regr, body=body)
        self.net.account = updated_regr
        return updated_regr

    def _get_v2_account(self, regr: messages.RegistrationResource, update_body: bool = False
                        ) -> messages.RegistrationResource:
        self.net.account = None
        only_existing_reg = regr.body.update(only_return_existing=True)
        response = self._post(self.directory['newAccount'], only_existing_reg)
        updated_uri = response.headers['Location']
        new_regr = regr.update(body=messages.Registration.from_json(response.json())
                               if update_body else regr.body,
                               uri=updated_uri)
        self.net.account = new_regr
        return new_regr

    def new_order(self, csr_pem: bytes, profile: Optional[str] = None) -> messages.OrderResource:
        """Request a new Order object from the server.

        :param bytes csr_pem: A CSR in PEM format.

        :returns: The newly created order.
        :rtype: OrderResource
        """
        csr = x509.load_pem_x509_csr(csr_pem)
        dnsNames = crypto_util.get_names_from_subject_and_extensions(csr.subject, csr.extensions)
        try:
            san_ext = csr.extensions.get_extension_for_class(x509.SubjectAlternativeName)
        except x509.ExtensionNotFound:
            ipNames = []
        else:
            ipNames = san_ext.value.get_values_for_type(x509.IPAddress)
        identifiers = []
        for name in dnsNames:
            identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_FQDN,
                                                   value=name))
        for ip in ipNames:
            identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_IP,
                                                   value=str(ip)))
        if profile is None:
            profile = ""
        order = messages.NewOrder(identifiers=identifiers, profile=profile)
        response = self._post(self.directory['newOrder'], order)
        body = messages.Order.from_json(response.json())
        authorizations = []
        # pylint has trouble understanding our josepy based objects which use
        # things like custom metaclass logic. body.authorizations should be a
        # list of strings containing URLs so let's disable this check here.
        for url in body.authorizations:  # pylint: disable=not-an-iterable
            authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
        return messages.OrderResource(
            body=body,
            uri=response.headers.get('Location'),
            authorizations=authorizations,
            csr_pem=csr_pem)

    def poll(self, authzr: messages.AuthorizationResource
             ) -> Tuple[messages.AuthorizationResource, requests.Response]:
        """Poll Authorization Resource for status.

        :param authzr: Authorization Resource
        :type authzr: `.AuthorizationResource`

        :returns: Updated Authorization Resource and HTTP response.

        :rtype: (`.AuthorizationResource`, `requests.Response`)

        """
        response = self._post_as_get(authzr.uri)
        updated_authzr = self._authzr_from_response(
            response, authzr.body.identifier, authzr.uri)
        return updated_authzr, response

    def poll_and_finalize(self, orderr: messages.OrderResource,
                          deadline: Optional[datetime.datetime] = None) -> messages.OrderResource:
        """Poll authorizations and finalize the order.

        If no deadline is provided, this method will timeout after 90
        seconds.

        :param messages.OrderResource orderr: order to finalize
        :param datetime.datetime deadline: when to stop polling and timeout

        :returns: finalized order
        :rtype: messages.OrderResource

        """
        if deadline is None:
            deadline = datetime.datetime.now() + datetime.timedelta(seconds=90)
        orderr = self.poll_authorizations(orderr, deadline)
        return self.finalize_order(orderr, deadline)

    def poll_authorizations(self, orderr: messages.OrderResource, deadline: datetime.datetime
                            ) -> messages.OrderResource:
        """Poll Order Resource for status."""
        responses = []
        for url in orderr.body.authorizations:
            while datetime.datetime.now() < deadline:
                authzr = self._authzr_from_response(self._post_as_get(url), uri=url)
                if authzr.body.status != messages.STATUS_PENDING:  # pylint: disable=no-member
                    responses.append(authzr)
                    break
                time.sleep(1)
        # If we didn't get a response for every authorization, we fell through
        # the bottom of the loop due to hitting the deadline.
        if len(responses) < len(orderr.body.authorizations):
            raise errors.TimeoutError()
        failed = []
        for authzr in responses:
            if authzr.body.status != messages.STATUS_VALID:
                for chall in authzr.body.challenges:
                    if chall.error is not None:
                        failed.append(authzr)
        if failed:
            raise errors.ValidationError(failed)
        return orderr.update(authorizations=responses)

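The deadline-bounded polling in `poll_authorizations` reduces to a small standalone loop. In this sketch, `fetch_status` is a hypothetical stand-in for the POST-as-GET request to the authorization URL, and `poll_until_not_pending` is an illustrative name, not part of the module:

```python
import datetime
import time


def poll_until_not_pending(fetch_status, deadline, interval=1.0):
    # Poll until the status leaves "pending" or the deadline passes,
    # mirroring the inner while-loop of poll_authorizations() above.
    while datetime.datetime.now() < deadline:
        status = fetch_status()
        if status != "pending":
            return status
        time.sleep(interval)
    raise TimeoutError("authorization still pending at deadline")


# Usage: a status source that becomes valid on the third poll.
statuses = iter(["pending", "pending", "valid"])
deadline = datetime.datetime.now() + datetime.timedelta(seconds=5)
print(poll_until_not_pending(lambda: next(statuses), deadline, interval=0.01))
```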
    def begin_finalization(self, orderr: messages.OrderResource
                           ) -> messages.OrderResource:
        """Start the process of finalizing an order.

        :param messages.OrderResource orderr: order to finalize

        :returns: updated order
        :rtype: messages.OrderResource

        :raises .messages.Error: If the server indicates the order is not yet
            in the ready state, it returns a 403 (Forbidden) error with a
            problem document/error code of type "orderNotReady".

        """
        csr = x509.load_pem_x509_csr(orderr.csr_pem)
        wrapped_csr = messages.CertificateRequest(csr=csr)
        res = self._post(orderr.body.finalize, wrapped_csr)
        orderr = orderr.update(body=messages.Order.from_json(res.json()))
        return orderr

    def poll_finalization(self, orderr: messages.OrderResource,
                          deadline: datetime.datetime,
                          fetch_alternative_chains: bool = False
                          ) -> messages.OrderResource:
        """
        Poll an order that has been finalized for its status.
        If it becomes valid, obtain the certificate.

        If a finalization request previously returned `orderNotReady`,
        poll until ready, send a new finalization request, and continue
        polling until valid as above.

        :returns: finalized order (with certificate)
        :rtype: messages.OrderResource
        """
        sleep_seconds: float = 1
        while datetime.datetime.now() < deadline:
            if sleep_seconds > 0:
                time.sleep(sleep_seconds)
            response = self._post_as_get(orderr.uri)
            body = messages.Order.from_json(response.json())
            if body.status == messages.STATUS_INVALID:
                # "invalid": The certificate will not be issued. Consider this
                # order process abandoned.
                if body.error is not None:
                    raise errors.IssuanceError(body.error)
                raise errors.Error(
                    "The certificate order failed. No further information was provided "
                    "by the server.")
            elif body.status == messages.STATUS_READY:
                # "ready": The server agrees that the requirements have been
                # fulfilled, and is awaiting finalization. Submit a finalization
                # request.
                self.begin_finalization(orderr)
                sleep_seconds = 1
            elif body.status == messages.STATUS_VALID and body.certificate is not None:
                # "valid": The server has issued the certificate and provisioned its
                # URL to the "certificate" field of the order. Download the
                # certificate.
                certificate_response = self._post_as_get(body.certificate)
                orderr = orderr.update(body=body, fullchain_pem=certificate_response.text)
                if fetch_alternative_chains:
                    alt_chains_urls = self._get_links(certificate_response, 'alternate')
                    alt_chains = [self._post_as_get(url).text for url in alt_chains_urls]
                    orderr = orderr.update(alternative_fullchains_pem=alt_chains)
                return orderr
            elif body.status == messages.STATUS_PROCESSING:
                # "processing": The certificate is being issued. Send a POST-as-GET request after
                # the time given in the Retry-After header field of the response, if any.
                retry_after = self.retry_after(response, 1)
                # Whatever Retry-After the ACME server requests, the polling must not take
                # longer than the overall deadline.
                retry_after = min(retry_after, deadline)
                sleep_seconds = (retry_after - datetime.datetime.now()).total_seconds()
        raise errors.TimeoutError()

    def finalize_order(self, orderr: messages.OrderResource, deadline: datetime.datetime,
                       fetch_alternative_chains: bool = False) -> messages.OrderResource:
        """Finalize an order and obtain a certificate.

        :param messages.OrderResource orderr: order to finalize
        :param datetime.datetime deadline: when to stop polling and timeout
        :param bool fetch_alternative_chains: whether to also fetch alternative
            certificate chains

        :returns: finalized order
        :rtype: messages.OrderResource

        """
        try:
            self.begin_finalization(orderr)
        except messages.Error as e:
            if e.code != 'orderNotReady':
                raise e
        return self.poll_finalization(orderr, deadline, fetch_alternative_chains)

    def renewal_time(self, cert_pem: bytes
                     ) -> Tuple[Optional[datetime.datetime], datetime.datetime]:
        """Return an appropriate time to attempt renewal of the certificate,
        and the next time to ask the ACME server for renewal info.

        If the certificate has already expired, renewal info isn't checked.
        Instead, the certificate's notAfter time is returned and the certificate
        should be immediately renewed.

        If the ACME directory has a "renewalInfo" field, the response will be
        based on a fetch of the renewal info resource for the certificate
        (https://www.ietf.org/archive/id/draft-ietf-acme-ari-08.html).

        If there is no "renewalInfo" field, this function will return a tuple of
        None, and the next time to ask the ACME server for renewal info.

        This function may make other network calls in the future (e.g., OCSP
        or CRL).

        :param bytes cert_pem: cert as pem file

        :returns: Tuple of time to attempt renewal, next time to ask for renewal info

        :raises errors.ARIError: If an error occurs fetching ARI from the
            server. Explicit exception chaining is used so the original error
            can be accessed through the __cause__ attribute on the ARIError if
            desired.

        """
        now = datetime.datetime.now()
        # https://www.ietf.org/archive/id/draft-ietf-acme-ari-08.html#section-4.3.3
        default_retry_after = datetime.timedelta(seconds=6 * 60 * 60)

        cert = x509.load_pem_x509_certificate(cert_pem)

        # from https://www.ietf.org/archive/id/draft-ietf-acme-ari-08.html#section-4.3, "Clients
        # MUST NOT check a certificate's RenewalInfo after the certificate has expired."
        #
        # we call datetime.datetime.now here with the UTC argument to create a timezone aware
        # datetime object that can be compared with the UTC notAfter from cryptography
        if cert.not_valid_after_utc < datetime.datetime.now(datetime.timezone.utc):
            return cert.not_valid_after_utc, now + default_retry_after

        try:
            renewal_info_base_url = self.directory['renewalInfo']
        except KeyError:
            return None, now + default_retry_after

        ari_url = renewal_info_base_url + '/' + _renewal_info_path_component(cert)
        try:
            resp = self.net.get(ari_url, content_type='application/json')
        except Exception as e:  # pylint: disable=broad-except
            error_msg = f'failed to fetch renewal_info URL {ari_url}'
            raise errors.ARIError(error_msg, now + default_retry_after) from e
        renewal_info: messages.RenewalInfo = messages.RenewalInfo.from_json(resp.json())

        start = renewal_info.suggested_window.start  # pylint: disable=no-member
        end = renewal_info.suggested_window.end  # pylint: disable=no-member

        delta_seconds = (end - start).total_seconds()
        random_seconds = random.uniform(0, delta_seconds)
        random_time = start + datetime.timedelta(seconds=random_seconds)

        retry_after = self.retry_after(resp, default_retry_after.seconds)
        return random_time, retry_after

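The window sampling in `renewal_time` (a uniformly random instant inside the ARI suggested window) can be sketched on its own with the standard library. `sample_renewal_time` is an illustrative name for this sketch, not part of the module:

```python
import datetime
import random


def sample_renewal_time(start: datetime.datetime,
                        end: datetime.datetime) -> datetime.datetime:
    # Pick a uniformly random instant inside [start, end], as
    # renewal_time() does with the ARI suggested window.
    delta_seconds = (end - start).total_seconds()
    return start + datetime.timedelta(seconds=random.uniform(0, delta_seconds))


start = datetime.datetime(2025, 1, 1)
end = start + datetime.timedelta(hours=12)
chosen = sample_renewal_time(start, end)
assert start <= chosen <= end
```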
    def revoke(self, cert: x509.Certificate, rsn: int) -> None:
        """Revoke certificate.

        :param x509.Certificate cert: `x509.Certificate`

        :param int rsn: Reason code for certificate revocation.

        :raises .ClientError: If revocation is unsuccessful.

        """
        self._revoke(cert, rsn, self.directory['revokeCert'])

    def external_account_required(self) -> bool:
        """Checks if ACME server requires External Account Binding authentication."""
        return hasattr(self.directory, 'meta') and \
            hasattr(self.directory.meta, 'external_account_required') and \
            self.directory.meta.external_account_required

    def _post_as_get(self, *args: Any, **kwargs: Any) -> requests.Response:
        """
        Send GET request using the POST-as-GET protocol.
        :param args:
        :param kwargs:
        :return:
        """
        new_args = args[:1] + (None,) + args[1:]
        return self._post(*new_args, **kwargs)

    def _get_links(self, response: requests.Response, relation_type: str) -> List[str]:
        """
        Retrieves all Link URIs of relation_type from the response.
        :param requests.Response response: The requests HTTP response.
        :param str relation_type: The relation type to filter by.
        """
        # Can't use response.links directly because it drops multiple links
        # of the same relation type, which is possible in RFC8555 responses.
        if 'Link' not in response.headers:
            return []
        links = parse_header_links(response.headers['Link'])
        return [l['url'] for l in links
                if 'rel' in l and 'url' in l and l['rel'] == relation_type]

    @classmethod
    def get_directory(cls, url: str, net: 'ClientNetwork') -> messages.Directory:
        """
        Retrieves the ACME directory (RFC 8555 section 7.1.1) from the ACME server.
        :param str url: the URL where the ACME directory is available
        :param ClientNetwork net: the ClientNetwork to use to make the request

        :returns: the ACME directory object
        :rtype: messages.Directory
        """
        return messages.Directory.from_json(net.get(url).json())

    @classmethod
    def _regr_from_response(cls, response: requests.Response, uri: Optional[str] = None,
                            terms_of_service: Optional[str] = None
                            ) -> messages.RegistrationResource:
        if 'terms-of-service' in response.links:
            terms_of_service = response.links['terms-of-service']['url']

        return messages.RegistrationResource(
            body=messages.Registration.from_json(response.json()),
            uri=response.headers.get('Location', uri),
            terms_of_service=terms_of_service)

    def _send_recv_regr(self, regr: messages.RegistrationResource,
                        body: messages.Registration) -> messages.RegistrationResource:
        response = self._post(regr.uri, body)

        # TODO: Boulder returns httplib.ACCEPTED
        #assert response.status_code == httplib.OK

        # TODO: Boulder does not set Location or Link on update
        # (c.f. acme-spec #94)

        return self._regr_from_response(
            response, uri=regr.uri,
            terms_of_service=regr.terms_of_service)

    def _post(self, *args: Any, **kwargs: Any) -> requests.Response:
        """Wrapper around self.net.post that adds the newNonce URL.

        This is used to retry the request in case of a badNonce error.

        """
        kwargs.setdefault('new_nonce_url', getattr(self.directory, 'newNonce'))
        return self.net.post(*args, **kwargs)

    def deactivate_registration(self, regr: messages.RegistrationResource
                                ) -> messages.RegistrationResource:
        """Deactivate registration.

        :param messages.RegistrationResource regr: The Registration Resource
            to be deactivated.

        :returns: The Registration resource that was deactivated.
        :rtype: `.RegistrationResource`

        """
        return self.update_registration(regr, messages.Registration.from_json(
            {"status": "deactivated", "contact": None}))

    def deactivate_authorization(self,
                                 authzr: messages.AuthorizationResource
                                 ) -> messages.AuthorizationResource:
        """Deactivate authorization.

        :param messages.AuthorizationResource authzr: The Authorization resource
            to be deactivated.

        :returns: The Authorization resource that was deactivated.
        :rtype: `.AuthorizationResource`

        """
        body = messages.UpdateAuthorization(status='deactivated')
        response = self._post(authzr.uri, body)
        return self._authzr_from_response(response,
                                          authzr.body.identifier, authzr.uri)

    def _authzr_from_response(self, response: requests.Response,
                              identifier: Optional[messages.Identifier] = None,
                              uri: Optional[str] = None) -> messages.AuthorizationResource:
        authzr = messages.AuthorizationResource(
            body=messages.Authorization.from_json(response.json()),
            uri=response.headers.get('Location', uri))
        if identifier is not None and authzr.body.identifier != identifier:  # pylint: disable=no-member
            raise errors.UnexpectedUpdate(authzr)
        return authzr

    def answer_challenge(self, challb: messages.ChallengeBody,
                         response: challenges.ChallengeResponse) -> messages.ChallengeResource:
        """Answer challenge.

        :param challb: Challenge Resource body.
        :type challb: `.ChallengeBody`

        :param response: Corresponding Challenge response
        :type response: `.challenges.ChallengeResponse`

        :returns: Challenge Resource with updated body.
        :rtype: `.ChallengeResource`

        :raises .UnexpectedUpdate:

        """
        resp = self._post(challb.uri, response)
        try:
            authzr_uri = resp.links['up']['url']
        except KeyError:
            raise errors.ClientError('"up" Link header missing')
        challr = messages.ChallengeResource(
            authzr_uri=authzr_uri,
            body=messages.ChallengeBody.from_json(resp.json()))
        # TODO: check that challr.uri == resp.headers['Location']?
        if challr.uri != challb.uri:
            raise errors.UnexpectedUpdate(challr.uri)
        return challr

    @classmethod
    def retry_after(cls, response: requests.Response, default: int) -> datetime.datetime:
        """Compute next `poll` time based on response ``Retry-After`` header.

        Handles integers and various datestring formats per
        https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.37

        :param requests.Response response: Response from `poll`.
        :param int default: Default value (in seconds), used when
            ``Retry-After`` header is not present or invalid.

        :returns: Time point when next `poll` should be performed.
        :rtype: `datetime.datetime`

        """
        retry_after = response.headers.get('Retry-After', str(default))
        try:
            seconds = int(retry_after)
        except ValueError:
            # The RFC 2822 parser handles all of RFC 2616's cases in modern
            # environments (primarily HTTP 1.1+ but also py27+)
            when = parsedate_tz(retry_after)
            if when is not None:
                try:
                    # The last tuple element is the timezone offset in
                    # seconds, so it must be passed as seconds= here.
                    tz_secs = datetime.timedelta(
                        seconds=when[-1] if when[-1] is not None else 0)
                    return datetime.datetime(*when[:7]) - tz_secs
                except (ValueError, OverflowError):
                    pass
            seconds = default

        return datetime.datetime.now() + datetime.timedelta(seconds=seconds)

    def _revoke(self, cert: x509.Certificate, rsn: int, url: str) -> None:
        """Revoke certificate.

        :param .x509.Certificate cert: `x509.Certificate`

        :param int rsn: Reason code for certificate revocation.

        :param str url: ACME URL to post to

        :raises .ClientError: If revocation is unsuccessful.

        """
        response = self._post(url,
                              messages.Revocation(
                                  certificate=cert,
                                  reason=rsn))
        if response.status_code != http_client.OK:
            raise errors.ClientError(
                'Successful revocation must return HTTP OK status')


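`retry_after` accepts both forms of the ``Retry-After`` header (delta-seconds or an HTTP-date). The core parsing can be exercised without a `requests.Response`; this sketch applies the same logic to a bare header string, and `parse_retry_after` is an illustrative name for the sketch:

```python
import datetime
from email.utils import parsedate_tz


def parse_retry_after(value: str, default: int) -> datetime.datetime:
    # Same logic as retry_after() above, applied to a bare header value:
    # try integer seconds first, then an RFC 2822/HTTP date, then fall
    # back to the default number of seconds from now.
    try:
        seconds = int(value)
    except ValueError:
        when = parsedate_tz(value)
        if when is not None:
            try:
                # The last tuple element is the tz offset in seconds.
                tz_secs = datetime.timedelta(
                    seconds=when[-1] if when[-1] is not None else 0)
                return datetime.datetime(*when[:7]) - tz_secs
            except (ValueError, OverflowError):
                pass
        seconds = default
    return datetime.datetime.now() + datetime.timedelta(seconds=seconds)


print(parse_retry_after("Fri, 31 Dec 1999 23:59:59 GMT", 5))  # 1999-12-31 23:59:59
```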
class ClientNetwork:
|
||||
"""Wrapper around requests that signs POSTs for authentication.
|
||||
|
||||
Also adds user agent, and handles Content-Type.
|
||||
"""
|
||||
JSON_CONTENT_TYPE = 'application/json'
|
||||
JOSE_CONTENT_TYPE = 'application/jose+json'
|
||||
JSON_ERROR_CONTENT_TYPE = 'application/problem+json'
|
||||
REPLAY_NONCE_HEADER = 'Replay-Nonce'
|
||||
|
||||
"""Initialize.
|
||||
|
||||
:param josepy.JWK key: Account private key. Required to use .post().
|
||||
:param messages.RegistrationResource account: Account object. Required if you are
|
||||
planning to use .post() for anything other than creating a new account;
|
||||
may be set later after registering.
|
||||
:param josepy.JWASignature alg: Algorithm to use in signing JWS.
|
||||
:param bool verify_ssl: Whether to verify certificates on SSL connections.
|
||||
:param str user_agent: String to send as User-Agent header.
|
||||
:param int timeout: Timeout for requests.
|
||||
"""
|
||||
def __init__(self, key: Optional[jose.JWK] = None,
|
||||
account: Optional[messages.RegistrationResource] = None,
|
||||
alg: jose.JWASignature = jose.RS256, verify_ssl: bool = True,
|
||||
user_agent: str = 'acme-python', timeout: int = DEFAULT_NETWORK_TIMEOUT) -> None:
|
||||
self.key = key
|
||||
self.account = account
|
||||
self.alg = alg
|
||||
self.verify_ssl = verify_ssl
|
||||
self._nonces: Set[str] = set()
|
||||
self.user_agent = user_agent
|
||||
self.session = requests.Session()
|
||||
self._default_timeout = timeout
|
||||
adapter = HTTPAdapter()
|
||||
|
||||
self.session.mount("http://", adapter)
|
||||
self.session.mount("https://", adapter)
|
||||
|
||||
def __del__(self) -> None:
|
||||
# Try to close the session, but don't show exceptions to the
|
||||
# user if the call to close() fails. See #4840.
|
||||
try:
|
||||
self.session.close()
|
||||
except Exception: # pylint: disable=broad-except
|
||||
pass
|
||||
|
||||
def _wrap_in_jws(self, obj: jose.JSONDeSerializable, nonce: str, url: str) -> str:
|
||||
"""Wrap `JSONDeSerializable` object in JWS.
|
||||
|
||||
.. todo:: Implement ``acmePath``.
|
||||
|
||||
:param josepy.JSONDeSerializable obj:
|
||||
:param str url: The URL to which this object will be POSTed
|
||||
:param str nonce:
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
jobj = obj.json_dumps(indent=2).encode() if obj else b''
|
||||
        logger.debug('JWS payload:\n%s', jobj)
        assert self.key
        kwargs = {
            "alg": self.alg,
            "nonce": nonce,
            "url": url,
            "key": self.key
        }
        # newAccount and revokeCert work without the kid
        # newAccount must not have kid
        if self.account is not None:
            kwargs["kid"] = self.account["uri"]
        return jws.JWS.sign(jobj, **cast(Mapping[str, Any], kwargs)).json_dumps(indent=2)

    @classmethod
    def _check_response(cls, response: requests.Response,
                        content_type: Optional[str] = None) -> requests.Response:
        """Check response content and its type.

        .. note::
           Checking is not strict: wrong server response ``Content-Type``
           HTTP header is ignored if response is an expected JSON object
           (c.f. Boulder #56).

        :param str content_type: Expected Content-Type response header.
            If JSON is expected and not present in server response, this
            function will raise an error. Otherwise, wrong Content-Type
            is ignored, but logged.

        :raises .messages.Error: If server response body
            carries HTTP Problem (https://datatracker.ietf.org/doc/html/rfc7807).
        :raises .ClientError: In case of other networking errors.

        """
        response_ct = response.headers.get('Content-Type')
        # Strip parameters from the media-type (rfc2616#section-3.7)
        if response_ct:
            response_ct = response_ct.split(';')[0].strip()
        try:
            # TODO: response.json() is called twice, once here, and
            # once in _get and _post clients
            jobj = response.json()
        except ValueError:
            jobj = None

        if response.status_code == 409:
            raise errors.ConflictError(response.headers.get('Location', 'UNKNOWN-LOCATION'))

        if not response.ok:
            if jobj is not None:
                if response_ct != cls.JSON_ERROR_CONTENT_TYPE:
                    logger.debug(
                        'Ignoring wrong Content-Type (%r) for JSON Error',
                        response_ct)
                try:
                    raise messages.Error.from_json(jobj)
                except jose.DeserializationError as error:
                    # Couldn't deserialize JSON object
                    raise errors.ClientError((response, error))
            else:
                # response is not JSON object
                raise errors.ClientError(response)
        else:
            if jobj is not None and response_ct != cls.JSON_CONTENT_TYPE:
                logger.debug(
                    'Ignoring wrong Content-Type (%r) for JSON decodable '
                    'response', response_ct)

            if content_type == cls.JSON_CONTENT_TYPE and jobj is None:
                raise errors.ClientError(f'Unexpected response Content-Type: {response_ct}')

        return response
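The media-type normalization used above can be exercised on its own. A minimal standalone sketch (the helper name is illustrative, not part of the `acme` API):

```python
def media_type(content_type_header: str) -> str:
    """Strip media-type parameters (e.g. "; charset=utf-8"),
    mirroring the rfc2616#section-3.7 handling in _check_response."""
    return content_type_header.split(';')[0].strip()

print(media_type('application/problem+json; charset=utf-8'))  # application/problem+json
```

This is why a `Content-Type: application/json; charset=utf-8` header still compares equal to the expected `application/json` constant.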

    def _send_request(self, method: str, url: str, *args: Any, **kwargs: Any) -> requests.Response:
        """Send HTTP request.

        Makes sure that `verify_ssl` is respected. Logs request and
        response (with headers). For allowed parameters please see
        `requests.request`.

        :param str method: method for the new `requests.Request` object
        :param str url: URL for the new `requests.Request` object

        :raises requests.exceptions.RequestException: in case of any problems

        :returns: HTTP Response
        :rtype: `requests.Response`


        """
        if method == "POST":
            logger.debug('Sending POST request to %s:\n%s',
                         url, kwargs['data'])
        else:
            logger.debug('Sending %s request to %s.', method, url)
        kwargs['verify'] = self.verify_ssl
        kwargs.setdefault('headers', {})
        kwargs['headers'].setdefault('User-Agent', self.user_agent)
        kwargs.setdefault('timeout', self._default_timeout)
        response = self.session.request(method, url, *args, **kwargs)

        # If an Accept header was sent in the request, the response may not be
        # UTF-8 encoded. In this case, we don't set response.encoding and log
        # the base64 response instead of raw bytes to keep binary data out of the logs.
        debug_content: Union[bytes, str]
        if "Accept" in kwargs["headers"]:
            debug_content = base64.b64encode(response.content)
        else:
            # We set response.encoding so response.text knows the response is
            # UTF-8 encoded instead of trying to guess the encoding that was
            # used which is error prone. This setting affects all future
            # accesses of .text made on the returned response object as well.
            response.encoding = "utf-8"
            debug_content = response.text
        logger.debug('Received response:\nHTTP %d\n%s\n\n%s',
                     response.status_code,
                     "\n".join("{0}: {1}".format(k, v)
                               for k, v in response.headers.items()),
                     debug_content)
        return response
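The logging branch above keeps binary response bodies out of the debug log by base64-encoding them. The same idea in isolation (a sketch, not part of the `acme` API; the variable names are illustrative):

```python
import base64

binary_body = bytes([0x00, 0xFF, 0x9F])  # not valid UTF-8, unsafe to log as text
debug_content = base64.b64encode(binary_body).decode('ascii')
# debug_content is printable ASCII and round-trips back to the original bytes
print(debug_content)
```

The base64 form is safe to interpolate into a log message, and the original bytes can always be recovered with `base64.b64decode`.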

    def head(self, *args: Any, **kwargs: Any) -> requests.Response:
        """Send HEAD request without checking the response.

        Note that `_check_response` is not called, as it is expected
        that a status code other than a successful 2xx will be returned, or
        messages.Error will be raised by the server.

        """
        return self._send_request('HEAD', *args, **kwargs)

    def get(self, url: str, content_type: str = JSON_CONTENT_TYPE,
            **kwargs: Any) -> requests.Response:
        """Send GET request and check response."""
        return self._check_response(
            self._send_request('GET', url, **kwargs), content_type=content_type)

    def _add_nonce(self, response: requests.Response) -> None:
        if self.REPLAY_NONCE_HEADER in response.headers:
            nonce = response.headers[self.REPLAY_NONCE_HEADER]
            try:
                decoded_nonce = jws.Header._fields['nonce'].decode(nonce)
            except jose.DeserializationError as error:
                raise errors.BadNonce(nonce, error)
            logger.debug('Storing nonce: %s', nonce)
            self._nonces.add(decoded_nonce)
        else:
            raise errors.MissingNonce(response)

    def _get_nonce(self, url: str, new_nonce_url: Optional[str]) -> str:
        if not self._nonces:
            logger.debug('Requesting fresh nonce')
            if new_nonce_url is None:
                response = self.head(url)
            else:
                # request a new nonce from the acme newNonce endpoint
                response = self._check_response(self.head(new_nonce_url), content_type=None)
            self._add_nonce(response)
        return self._nonces.pop()

    def post(self, *args: Any, **kwargs: Any) -> requests.Response:
        """POST object wrapped in `.JWS` and check response.

        If the server responded with a badNonce error, the request will
        be retried once.

        """
        try:
            return self._post_once(*args, **kwargs)
        except messages.Error as error:
            if error.code == 'badNonce':
                logger.debug('Retrying request after error:\n%s', error)
                return self._post_once(*args, **kwargs)
            raise

    def _post_once(self, url: str, obj: jose.JSONDeSerializable,
                   content_type: str = JOSE_CONTENT_TYPE, **kwargs: Any) -> requests.Response:
        new_nonce_url = kwargs.pop('new_nonce_url', None)
        if not self.key:
            raise errors.Error("acme.ClientNetwork with no private key can't POST.")
        data = self._wrap_in_jws(obj, self._get_nonce(url, new_nonce_url), url)
        kwargs.setdefault('headers', {'Content-Type': content_type})
        response = self._send_request('POST', url, data=data, **kwargs)
        response = self._check_response(response, content_type=content_type)
        self._add_nonce(response)
        return response


def _renewal_info_path_component(cert: x509.Certificate) -> str:
    akid_ext = cert.extensions.get_extension_for_oid(x509.ExtensionOID.AUTHORITY_KEY_IDENTIFIER)
    key_identifier = akid_ext.value.key_identifier  # type: ignore[attr-defined]

    akid_encoded = base64.urlsafe_b64encode(key_identifier).decode('ascii').replace("=", "")

    # We add one to the reported bit_length so there is room for the sign bit.
    # https://docs.python.org/3/library/stdtypes.html#int.bit_length
    # "Return the number of bits necessary to represent an integer in binary, excluding
    # the sign and leading zeros"
    serial = cert.serial_number
    encoded_serial_len = math.ceil((serial.bit_length() + 1) / 8)
    # Serials are encoded as ASN.1 INTEGERs, which means big endian and signed (two's complement).
    # https://letsencrypt.org/docs/a-warm-welcome-to-asn1-and-der/#integer-encoding
    serial_bytes = serial.to_bytes(encoded_serial_len, byteorder='big', signed=True)
    serial_encoded = base64.urlsafe_b64encode(serial_bytes).decode('ascii').replace("=", "")

    return f"{akid_encoded}.{serial_encoded}"
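The serial-number half of the path component can be tried standalone. A sketch of the same encoding without the certificate plumbing (the function name is illustrative only):

```python
import base64
import math

def encode_ari_serial(serial: int) -> str:
    # +1 bit leaves room for the ASN.1 sign bit: serials are encoded as
    # signed (two's complement), big-endian INTEGERs.
    length = math.ceil((serial.bit_length() + 1) / 8)
    raw = serial.to_bytes(length, byteorder='big', signed=True)
    # base64url without padding
    return base64.urlsafe_b64encode(raw).decode('ascii').replace('=', '')

print(encode_ari_serial(1))     # AQ
print(encode_ari_serial(0x80))  # AIA  (high bit set -> leading zero byte added)
```

Note how `0x80` is encoded as the two bytes `00 80`: without the extra sign bit, `to_bytes(..., signed=True)` would reject it, and the DER form would differ.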

@@ -1,502 +0,0 @@
"""Crypto utilities."""
import contextlib
import enum
from datetime import datetime, timedelta, timezone
import ipaddress
import logging
import socket
import typing
from typing import Any
from typing import Callable
from typing import List
from typing import Literal
from typing import Mapping
from typing import Optional
from typing import Sequence
from typing import Set
from typing import Tuple
from typing import Union
import warnings

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import dsa, rsa, ec, ed25519, ed448, types
from cryptography.hazmat.primitives.serialization import Encoding
from OpenSSL import crypto
from OpenSSL import SSL

from acme import errors

logger = logging.getLogger(__name__)

# Default SSL method selected here is the most compatible, while secure
# SSL method: TLSv1_METHOD is only compatible with
# TLSv1_METHOD, while TLS_method is compatible with all other
# methods, including TLSv2_METHOD (read more at
# https://docs.openssl.org/master/man3/SSL_CTX_new/#notes). _serve_sni
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
# in case it's used for things other than probing/serving!
_DEFAULT_SSL_METHOD = SSL.TLS_METHOD


class Format(enum.IntEnum):
    """File format to be used when parsing or serializing X.509 structures.

    Backwards compatible with the `FILETYPE_ASN1` and `FILETYPE_PEM` constants
    from pyOpenSSL.
    """
    DER = crypto.FILETYPE_ASN1
    PEM = crypto.FILETYPE_PEM

    def to_cryptography_encoding(self) -> Encoding:
        """Converts the Format to the corresponding cryptography `Encoding`."""
        if self == Format.DER:
            return Encoding.DER
        else:
            return Encoding.PEM


_KeyAndCert = Union[
    Tuple[crypto.PKey, crypto.X509],
    Tuple[types.CertificateIssuerPrivateKeyTypes, x509.Certificate],
]


class _DefaultCertSelection:
    def __init__(self, certs: Mapping[bytes, _KeyAndCert]):
        self.certs = certs

    def __call__(self, connection: SSL.Connection) -> Optional[_KeyAndCert]:
        server_name = connection.get_servername()
        if server_name:
            return self.certs.get(server_name, None)
        return None  # pragma: no cover


class SSLSocket:  # pylint: disable=too-few-public-methods
    """SSL wrapper for sockets.

    :ivar socket sock: Original wrapped socket.
    :ivar dict certs: Mapping from domain names (`bytes`) to
        `OpenSSL.crypto.X509`.
    :ivar method: See `OpenSSL.SSL.Context` for allowed values.
    :ivar alpn_selection: Hook to select negotiated ALPN protocol for
        connection.
    :ivar cert_selection: Hook to select certificate for connection. If given,
        `certs` parameter would be ignored, and therefore must be empty.

    """
    def __init__(
        self,
        sock: socket.socket,
        certs: Optional[Mapping[bytes, _KeyAndCert]] = None,
        method: int = _DEFAULT_SSL_METHOD,
        alpn_selection: Optional[Callable[[SSL.Connection, List[bytes]], bytes]] = None,
        cert_selection: Optional[
            Callable[
                [SSL.Connection],
                Optional[_KeyAndCert],
            ]
        ] = None,
    ) -> None:
        warnings.warn("SSLSocket is deprecated and will be removed in an upcoming release",
                      DeprecationWarning)
        self.sock = sock
        self.alpn_selection = alpn_selection
        self.method = method
        if not cert_selection and not certs:
            raise ValueError("Neither cert_selection nor certs specified.")
        if cert_selection and certs:
            raise ValueError("Both cert_selection and certs specified.")
        if cert_selection is None:
            cert_selection = _DefaultCertSelection(certs if certs else {})
        self.cert_selection = cert_selection

    def __getattr__(self, name: str) -> Any:
        return getattr(self.sock, name)

    def _pick_certificate_cb(self, connection: SSL.Connection) -> None:
        """SNI certificate callback.

        This method will set a new OpenSSL context object for this
        connection when an incoming connection provides an SNI name
        (in order to serve the appropriate certificate, if any).

        :param connection: The TLS connection object on which the SNI
            extension was received.
        :type connection: :class:`OpenSSL.Connection`

        """
        pair = self.cert_selection(connection)
        if pair is None:
            logger.debug("Certificate selection for server name %s failed, dropping SSL",
                         connection.get_servername())
            return
        key, cert = pair
        new_context = SSL.Context(self.method)
        new_context.set_min_proto_version(SSL.TLS1_2_VERSION)
        new_context.use_privatekey(key)
        if isinstance(cert, x509.Certificate):
            cert = crypto.X509.from_cryptography(cert)
        new_context.use_certificate(cert)
        if self.alpn_selection is not None:
            new_context.set_alpn_select_callback(self.alpn_selection)
        connection.set_context(new_context)

    class FakeConnection:
        """Fake OpenSSL.SSL.Connection."""

        # pylint: disable=missing-function-docstring

        def __init__(self, connection: SSL.Connection) -> None:
            self._wrapped = connection

        def __getattr__(self, name: str) -> Any:
            return getattr(self._wrapped, name)

        def shutdown(self, *unused_args: Any) -> bool:
            # OpenSSL.SSL.Connection.shutdown doesn't accept any args
            try:
                return self._wrapped.shutdown()
            except SSL.Error as error:  # pragma: no cover
                # We wrap the error so we raise the same error type as sockets
                # in the standard library. This is useful when this object is
                # used by code which expects a standard socket such as
                # socketserver in the standard library.
                #
                # We don't track code coverage in this "except" branch to avoid spurious CI failures
                # caused by missing test coverage. These aren't worth fixing because this entire
                # class has been deprecated. See https://github.com/certbot/certbot/issues/10284.
                raise OSError(error)

    def accept(self) -> Tuple[FakeConnection, Any]:  # pylint: disable=missing-function-docstring
        sock, addr = self.sock.accept()

        try:
            context = SSL.Context(self.method)
            context.set_options(SSL.OP_NO_SSLv2)
            context.set_options(SSL.OP_NO_SSLv3)
            context.set_tlsext_servername_callback(self._pick_certificate_cb)
            if self.alpn_selection is not None:
                context.set_alpn_select_callback(self.alpn_selection)

            ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
            ssl_sock.set_accept_state()

            # This log line is especially desirable because without it requests to
            # our standalone TLSALPN server would not be logged.
            logger.debug("Performing handshake with %s", addr)
            try:
                ssl_sock.do_handshake()
            except SSL.Error as error:
                # _pick_certificate_cb might have returned without
                # creating SSL context (wrong server name)
                raise OSError(error)

            return ssl_sock, addr
        except:
            # If we encounter any error, close the new socket before reraising
            # the exception.
            sock.close()
            raise


def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300,  # pylint: disable=too-many-arguments
              method: int = _DEFAULT_SSL_METHOD, source_address: Tuple[str, int] = ('', 0),
              alpn_protocols: Optional[Sequence[bytes]] = None) -> x509.Certificate:
    """Probe SNI server for SSL certificate.

    :param bytes name: Byte string to send as the server name in the
        client hello message.
    :param bytes host: Host to connect to.
    :param int port: Port to connect to.
    :param int timeout: Timeout in seconds.
    :param method: See `OpenSSL.SSL.Context` for allowed values.
    :param tuple source_address: Enables multi-path probing (selection
        of source interface). See `socket.create_connection` for more
        info.
    :param alpn_protocols: Protocols to request using ALPN.
    :type alpn_protocols: `Sequence` of `bytes`

    :raises acme.errors.Error: In case of any problems.

    :returns: SSL certificate presented by the server.
    :rtype: cryptography.x509.Certificate

    """
    warnings.warn("probe_sni is deprecated and will be removed in an upcoming release",
                  DeprecationWarning)
    context = SSL.Context(method)
    context.set_timeout(timeout)

    socket_kwargs = {'source_address': source_address}

    try:
        logger.debug(
            "Attempting to connect to %s:%d%s.", host, port,
            " from {0}:{1}".format(
                source_address[0],
                source_address[1]
            ) if any(source_address) else ""
        )
        socket_tuple: Tuple[bytes, int] = (host, port)
        sock = socket.create_connection(socket_tuple, **socket_kwargs)  # type: ignore[arg-type]
    except OSError as error:
        raise errors.Error(error)

    with contextlib.closing(sock) as client:
        client_ssl = SSL.Connection(context, client)
        client_ssl.set_connect_state()
        client_ssl.set_tlsext_host_name(name)  # pyOpenSSL>=0.13
        if alpn_protocols is not None:
            client_ssl.set_alpn_protos(list(alpn_protocols))
            warnings.warn("alpn_protocols parameter is deprecated and will be removed in an "
                          "upcoming certbot major version update", DeprecationWarning)
        try:
            client_ssl.do_handshake()
            client_ssl.shutdown()
        except SSL.Error as error:
            raise errors.Error(error)
    cert = client_ssl.get_peer_certificate()
    assert cert  # Appease mypy. We would have crashed out by now if there was no certificate.
    return cert.to_cryptography()


# Even *more* annoyingly, due to a mypy bug, we can't use Union[] types in
# isinstance expressions without causing false mypy errors. So we have to
# recreate the type collection as a tuple here. And no, typing.get_args doesn't
# work due to another mypy bug.
#
# mypy issues:
# * https://github.com/python/mypy/issues/17680
# * https://github.com/python/mypy/issues/15106
CertificateIssuerPrivateKeyTypesTpl = (
    dsa.DSAPrivateKey,
    rsa.RSAPrivateKey,
    ec.EllipticCurvePrivateKey,
    ed25519.Ed25519PrivateKey,
    ed448.Ed448PrivateKey,
)


def make_csr(
    private_key_pem: bytes,
    domains: Optional[Union[Set[str], List[str]]] = None,
    must_staple: bool = False,
    ipaddrs: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None,
) -> bytes:
    """Generate a CSR containing domains or IPs as subjectAltNames.

    Parameters are ordered this way for backwards compatibility when called using positional
    arguments.

    :param buffer private_key_pem: Private key, in PEM PKCS#8 format.
    :param list domains: List of DNS names to include in subjectAltNames of CSR.
    :param bool must_staple: Whether to include the TLS Feature extension (aka
        OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
    :param list ipaddrs: List of IP addresses (type ipaddress.IPv4Address or
        ipaddress.IPv6Address) to include in subjectAltNames of CSR.

    :returns: buffer PEM-encoded Certificate Signing Request.

    """
    private_key = serialization.load_pem_private_key(private_key_pem, password=None)
    if not isinstance(private_key, CertificateIssuerPrivateKeyTypesTpl):
        raise ValueError(f"Invalid private key type: {type(private_key)}")
    if domains is None:
        domains = []
    if ipaddrs is None:
        ipaddrs = []
    if len(domains) + len(ipaddrs) == 0:
        raise ValueError(
            "At least one of the domains or ipaddrs parameters needs to be non-empty"
        )

    builder = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([]))
        .add_extension(
            x509.SubjectAlternativeName(
                [x509.DNSName(d) for d in domains]
                + [x509.IPAddress(i) for i in ipaddrs]
            ),
            critical=False,
        )
    )
    if must_staple:
        builder = builder.add_extension(
            # "status_request" is the feature commonly known as OCSP
            # Must-Staple
            x509.TLSFeature([x509.TLSFeatureType.status_request]),
            critical=False,
        )

    csr = builder.sign(private_key, hashes.SHA256())
    return csr.public_bytes(Encoding.PEM)
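A usage sketch for the pattern `make_csr` implements, built directly on the `cryptography` package (assumed installed; the domain and variable names are illustrative):

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Generate a throwaway key and build a CSR whose only names live in the
# SAN extension, with an empty subject -- the same shape make_csr produces.
key = ec.generate_private_key(ec.SECP256R1())
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([]))
    .add_extension(
        x509.SubjectAlternativeName([x509.DNSName("example.com")]),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)
pem = csr.public_bytes(serialization.Encoding.PEM)
print(pem.splitlines()[0])
```

ACME servers take the identifiers from the SAN extension, which is why an empty subject is acceptable here.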


def get_names_from_subject_and_extensions(
    subject: x509.Name, exts: x509.Extensions
) -> List[str]:
    """Gets all DNS SAN names as well as the first Common Name from subject.

    :param subject: Name of the x509 object, which may include Common Name
    :type subject: `cryptography.x509.Name`
    :param exts: Extensions of the x509 object, which may include SANs
    :type exts: `cryptography.x509.Extensions`

    :returns: List of DNS Subject Alternative Names and first Common Name
    :rtype: `list` of `str`
    """
    # We know these are always `str` because `bytes` is only possible for
    # other OIDs.
    cns = [
        typing.cast(str, c.value)
        for c in subject.get_attributes_for_oid(x509.NameOID.COMMON_NAME)
    ]
    try:
        san_ext = exts.get_extension_for_class(x509.SubjectAlternativeName)
    except x509.ExtensionNotFound:
        dns_names = []
    else:
        dns_names = san_ext.value.get_values_for_type(x509.DNSName)

    if not cns:
        return dns_names
    else:
        # We only include the first CN, if there are multiple. This matches
        # the behavior of the previous implementation using pyOpenSSL.
        return [cns[0]] + [d for d in dns_names if d != cns[0]]


def _cryptography_cert_or_req_san(
    cert_or_req: Union[x509.Certificate, x509.CertificateSigningRequest],
) -> List[str]:
    """Get DNS Subject Alternative Names from certificate or CSR.

    .. note:: Although this is `acme` internal API, it is used by
        `letsencrypt`.

    :param cert_or_req: Certificate or CSR.
    :type cert_or_req: `x509.Certificate` or `x509.CertificateSigningRequest`.

    :returns: A list of DNS Subject Alternative Names.
    :rtype: `list` of `str`

    .. deprecated:: 3.2.1
    """
    exts = cert_or_req.extensions
    try:
        san_ext = exts.get_extension_for_class(x509.SubjectAlternativeName)
    except x509.ExtensionNotFound:
        return []

    return san_ext.value.get_values_for_type(x509.DNSName)


# Helper function that can be mocked in unit tests
def _now() -> datetime:
    return datetime.now(tz=timezone.utc)


def make_self_signed_cert(private_key: types.CertificateIssuerPrivateKeyTypes,
                          domains: Optional[List[str]] = None,
                          not_before: Optional[datetime] = None,
                          validity: Optional[timedelta] = None, force_san: bool = True,
                          extensions: Optional[List[x509.Extension]] = None,
                          ips: Optional[List[Union[ipaddress.IPv4Address,
                                                   ipaddress.IPv6Address]]] = None
                          ) -> x509.Certificate:
    """Generate new self-signed certificate.

    :param private_key: One of
        `cryptography.hazmat.primitives.asymmetric.types.CertificateIssuerPrivateKeyTypes`
    :type domains: `list` of `str`
    :param not_before: A datetime after which the cert is valid. If no
        timezone is specified, UTC is assumed
    :type not_before: `datetime.datetime`
    :param validity: Duration for which the cert will be valid. Defaults to 1
        week
    :type validity: `datetime.timedelta`
    :param bool force_san:
    :param extensions: List of additional extensions to include in the cert.
    :type extensions: `list` of `x509.Extension[x509.ExtensionType]`
    :type ips: `list` of (`ipaddress.IPv4Address` or `ipaddress.IPv6Address`)

    If more than one domain is provided, all of the domains are put into
    ``subjectAltName`` X.509 extension and first domain is set as the
    subject CN. If only one domain is provided no ``subjectAltName``
    extension is used, unless `force_san` is ``True``.
    """
    assert domains or ips, "Must provide one or more hostnames or IPs for the cert."

    builder = x509.CertificateBuilder()
    builder = builder.serial_number(x509.random_serial_number())

    if extensions is not None:
        for ext in extensions:
            builder = builder.add_extension(ext.value, ext.critical)
    if domains is None:
        domains = []
    if ips is None:
        ips = []
    builder = builder.add_extension(x509.BasicConstraints(ca=True, path_length=0), critical=True)

    name_attrs = []
    if len(domains) > 0:
        name_attrs.append(x509.NameAttribute(
            x509.OID_COMMON_NAME,
            domains[0]
        ))

    builder = builder.subject_name(x509.Name(name_attrs))
    builder = builder.issuer_name(x509.Name(name_attrs))

    sanlist: List[x509.GeneralName] = []
    for address in domains:
        sanlist.append(x509.DNSName(address))
    for ip in ips:
        sanlist.append(x509.IPAddress(ip))
    if force_san or len(domains) > 1 or len(ips) > 0:
        builder = builder.add_extension(
            x509.SubjectAlternativeName(sanlist),
            critical=False
        )

    if not_before is None:
        not_before = _now()
    if validity is None:
        validity = timedelta(seconds=7 * 24 * 60 * 60)
    builder = builder.not_valid_before(not_before)
    builder = builder.not_valid_after(not_before + validity)

    public_key = private_key.public_key()
    builder = builder.public_key(public_key)
    return builder.sign(private_key, hashes.SHA256())


def dump_cryptography_chain(
    chain: List[x509.Certificate],
    encoding: Literal[Encoding.PEM, Encoding.DER] = Encoding.PEM,
) -> bytes:
    """Dump certificate chain into a bundle.

    :param list chain: List of `cryptography.x509.Certificate`.

    :returns: certificate chain bundle
    :rtype: bytes

    .. deprecated:: 3.2.1
    """
    # XXX: returns empty string when no chain is available, which
    # shuts up RenewableCert, but might not be the best solution...

    def _dump_cert(cert: x509.Certificate) -> bytes:
        return cert.public_bytes(encoding)

    # assumes that x509.Certificate.public_bytes includes ending
    # newline character
    return b"".join(_dump_cert(cert) for cert in chain)

@@ -1,713 +0,0 @@
"""ACME protocol messages."""
from collections.abc import Hashable
import datetime
import json
from typing import Any
from typing import Dict
from typing import Iterator
from typing import List
from typing import Mapping
from typing import MutableMapping
from typing import Optional
from typing import Tuple
from typing import Type
from typing import TypeVar

from cryptography import x509

import josepy as jose

from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import util

ERROR_PREFIX = "urn:ietf:params:acme:error:"

ERROR_CODES = {
    'accountDoesNotExist': 'The request specified an account that does not exist',
    'alreadyRevoked': 'The request specified a certificate to be revoked that has'
                      ' already been revoked',
    'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'badPublicKey': 'The JWS was signed by a public key the server does not support',
    'badRevocationReason': 'The revocation reason provided is not allowed by the server',
    'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
    'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing'
           ' a certificate',
    'compound': 'Specific error conditions are indicated in the "subproblems" array',
    'connection': ('The server could not connect to the client to verify the'
                   ' domain'),
    'dns': 'There was a problem with a DNS query during identifier validation',
    'dnssec': 'The server could not validate a DNSSEC signed domain',
    'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
    # deprecate invalidEmail
    'invalidEmail': 'The provided email for a registration was invalid',
    'invalidContact': 'The provided contact URI was invalid',
    'malformed': 'The request message was malformed',
    'rejectedIdentifier': 'The server will not issue certificates for the identifier',
    'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
    'rateLimited': 'There were too many requests of a given type',
    'serverInternal': 'The server experienced an internal error',
    'tls': 'The server experienced a TLS error during domain verification',
    'unauthorized': 'The client lacks sufficient authorization',
    'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
    'unknownHost': 'The server could not resolve a domain name',
    'unsupportedIdentifier': 'An identifier is of an unsupported type',
    'externalAccountRequired': 'The server requires external account binding',
}

ERROR_TYPE_DESCRIPTIONS = {**{
    ERROR_PREFIX + name: desc for name, desc in ERROR_CODES.items()
}}
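The mapping above simply joins the ACME error URN prefix onto each short code. A minimal standalone sketch of the same construction, using two codes from the table:

```python
ERROR_PREFIX = "urn:ietf:params:acme:error:"
ERROR_CODES = {
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'rateLimited': 'There were too many requests of a given type',
}
ERROR_TYPE_DESCRIPTIONS = {ERROR_PREFIX + name: desc for name, desc in ERROR_CODES.items()}
print(ERROR_TYPE_DESCRIPTIONS['urn:ietf:params:acme:error:badNonce'])
```

This is what lets `Error.description` look up a human-readable message directly from the full `type` URN a server returns.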
|
||||
|
||||
|
||||
def is_acme_error(err: BaseException) -> bool:
|
||||
"""Check if argument is an ACME error."""
|
||||
if isinstance(err, Error) and (err.typ is not None):
|
||||
return ERROR_PREFIX in err.typ
|
||||
return False
|
||||
|
||||
|
||||
class _Constant(jose.JSONDeSerializable, Hashable):
|
||||
"""ACME constant."""
|
||||
__slots__ = ('name',)
|
||||
POSSIBLE_NAMES: Dict[str, '_Constant'] = NotImplemented
|
||||
|
||||
def __init__(self, name: str) -> None:
|
||||
super().__init__()
|
||||
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
|
||||
self.name = name
|
||||
|
||||
def to_partial_json(self) -> str:
|
||||
return self.name
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj: str) -> '_Constant':
|
||||
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
|
||||
raise jose.DeserializationError(f'{cls.__name__} not recognized')
|
||||
return cls.POSSIBLE_NAMES[jobj]
|
||||
|
||||
def __repr__(self) -> str:
|
||||
return f'{self.__class__.__name__}({self.name})'
|
||||
|
||||
def __eq__(self, other: Any) -> bool:
|
||||
return isinstance(other, type(self)) and other.name == self.name
|
||||
|
||||
def __hash__(self) -> int:
|
||||
return hash((self.__class__, self.name))
|
||||
|
||||
|
||||
class IdentifierType(_Constant):
    """ACME identifier type."""
    POSSIBLE_NAMES: Dict[str, _Constant] = {}


IDENTIFIER_FQDN = IdentifierType('dns')  # IdentifierDNS in Boulder
IDENTIFIER_IP = IdentifierType('ip')  # IdentifierIP in pebble - not in Boulder yet


class Identifier(jose.JSONObjectWithFields):
    """ACME identifier.

    :ivar IdentifierType typ:
    :ivar str value:

    """
    typ: IdentifierType = jose.field('type', decoder=IdentifierType.from_json)
    value: str = jose.field('value')


class Error(jose.JSONObjectWithFields, errors.Error):
    """ACME error.

    https://datatracker.ietf.org/doc/html/rfc7807

    Note: Although Error inherits from JSONObjectWithFields, which is immutable,
    we add mutability for Error to comply with the Python exception API.

    :ivar str typ:
    :ivar str title:
    :ivar str detail:
    :ivar Identifier identifier:
    :ivar tuple subproblems: An array of ACME Errors which may be present when the CA
        returns multiple errors related to the same request, `tuple` of `Error`.

    """
    typ: str = jose.field('type', omitempty=True, default='about:blank')
    title: str = jose.field('title', omitempty=True)
    detail: str = jose.field('detail', omitempty=True)
    identifier: Optional['Identifier'] = jose.field(
        'identifier', decoder=Identifier.from_json, omitempty=True)
    subproblems: Optional[Tuple['Error', ...]] = jose.field('subproblems', omitempty=True)

    # Mypy does not understand the josepy magic happening here, and falsely claims
    # that subproblems is redefined. Let's ignore the type check here.
    @subproblems.decoder  # type: ignore
    def subproblems(value: List[Dict[str, Any]]) -> Tuple['Error', ...]:  # pylint: disable=no-self-argument,missing-function-docstring
        return tuple(Error.from_json(subproblem) for subproblem in value)

    @classmethod
    def with_code(cls, code: str, **kwargs: Any) -> 'Error':
        """Create an Error instance with an ACME Error code.

        :str code: An ACME error code, like 'dnssec'.
        :kwargs: kwargs to pass to Error.

        """
        if code not in ERROR_CODES:
            raise ValueError("The supplied code: %s is not a known ACME error"
                             " code" % code)
        typ = ERROR_PREFIX + code
        # Mypy will not understand that the Error constructor accepts a named argument
        # "typ" because of josepy magic. Let's ignore the type check here.
        return cls(typ=typ, **kwargs)

    @property
    def description(self) -> Optional[str]:
        """Hardcoded error description based on its type.

        :returns: Description if standard ACME error or ``None``.
        :rtype: str

        """
        return ERROR_TYPE_DESCRIPTIONS.get(self.typ)

    @property
    def code(self) -> Optional[str]:
        """ACME error code.

        Basically self.typ without the ERROR_PREFIX.

        :returns: error code if standard ACME code or ``None``.
        :rtype: str

        """
        code = str(self.typ).rsplit(':', maxsplit=1)[-1]
        if code in ERROR_CODES:
            return code
        return None

    # Hack to allow mutability on Errors (see GH #9539)
    def __setattr__(self, name: str, value: Any) -> None:
        return object.__setattr__(self, name, value)

    def __str__(self) -> str:
        result = b' :: '.join(
            part.encode('ascii', 'backslashreplace') for part in
            (self.typ, self.description, self.detail, self.title)
            if part is not None).decode()
        if self.identifier:
            result = f'Problem for {self.identifier.value}: ' + result  # pylint: disable=no-member
        if self.subproblems and len(self.subproblems) > 0:
            for subproblem in self.subproblems:
                result += f'\n{subproblem}'
        return result

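The `code` property above recovers the short error code by splitting the type URN on its last colon and validating the result against the known code set. A stdlib-only sketch, assuming the standard ACME error prefix and a small sample standing in for the module-level `ERROR_CODES` set:

```python
# Illustrative sketch (not the acme API itself) of Error.code's extraction logic.
from typing import Optional

ERROR_PREFIX = 'urn:ietf:params:acme:error:'
ERROR_CODES = {'dnssec', 'badCSR', 'rateLimited'}  # sample subset for illustration


def code_of(typ: str) -> Optional[str]:
    # Everything after the last colon is the candidate short code.
    code = typ.rsplit(':', maxsplit=1)[-1]
    return code if code in ERROR_CODES else None


assert code_of(ERROR_PREFIX + 'dnssec') == 'dnssec'
assert code_of('about:blank') is None  # non-ACME types yield None
```

This is why `code` returns `None` for the default `about:blank` type: `blank` is not a registered ACME error code.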
class Status(_Constant):
    """ACME "status" field."""
    POSSIBLE_NAMES: Dict[str, _Constant] = {}


STATUS_UNKNOWN = Status('unknown')
STATUS_PENDING = Status('pending')
STATUS_PROCESSING = Status('processing')
STATUS_VALID = Status('valid')
STATUS_INVALID = Status('invalid')
STATUS_REVOKED = Status('revoked')
STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')


class Directory(jose.JSONDeSerializable):
    """Directory.

    Directory resources must be accessed by the exact field name in RFC8555 (section 9.7.5).
    """

    class Meta(jose.JSONObjectWithFields):
        """Directory Meta."""
        _terms_of_service: str = jose.field('termsOfService', omitempty=True)
        website: str = jose.field('website', omitempty=True)
        caa_identities: List[str] = jose.field('caaIdentities', omitempty=True)
        external_account_required: bool = jose.field('externalAccountRequired', omitempty=True)
        profiles: Dict[str, str] = jose.field('profiles', omitempty=True)

        def __init__(self, **kwargs: Any) -> None:
            kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
            super().__init__(**kwargs)

        @property
        def terms_of_service(self) -> str:
            """URL for the CA TOS"""
            return self._terms_of_service

        def __iter__(self) -> Iterator[str]:
            # When iterating over fields, use the external name 'terms_of_service' instead of
            # the internal '_terms_of_service'.
            for name in super().__iter__():
                yield name[1:] if name == '_terms_of_service' else name

        def _internal_name(self, name: str) -> str:
            return '_' + name if name == 'terms_of_service' else name

    def __init__(self, jobj: Mapping[str, Any]) -> None:
        self._jobj = jobj

    def __getattr__(self, name: str) -> Any:
        try:
            return self[name]
        except KeyError as error:
            raise AttributeError(str(error))

    def __getitem__(self, name: str) -> Any:
        try:
            return self._jobj[name]
        except KeyError:
            raise KeyError(f'Directory field "{name}" not found')

    def to_partial_json(self) -> Dict[str, Any]:
        return util.map_keys(self._jobj, lambda k: k)

    @classmethod
    def from_json(cls, jobj: MutableMapping[str, Any]) -> 'Directory':
        jobj['meta'] = cls.Meta.from_json(jobj.pop('meta', {}))
        return cls(jobj)


class Resource(jose.JSONObjectWithFields):
    """ACME Resource.

    :ivar acme.messages.ResourceBody body: Resource body.

    """
    body: "ResourceBody" = jose.field('body')


class ResourceWithURI(Resource):
    """ACME Resource with URI.

    :ivar str uri: Location of the resource.

    """
    uri: str = jose.field('uri')  # no ChallengeResource.uri


class ResourceBody(jose.JSONObjectWithFields):
    """ACME Resource Body."""


class ExternalAccountBinding:
    """ACME External Account Binding"""

    @classmethod
    def from_data(cls, account_public_key: jose.JWK, kid: str, hmac_key: str,
                  directory: Directory, hmac_alg: str = "HS256") -> Dict[str, Any]:
        """Create External Account Binding Resource from contact details, kid and hmac."""

        key_json = json.dumps(account_public_key.to_partial_json()).encode()
        decoded_hmac_key = jose.b64.b64decode(hmac_key)
        url = directory["newAccount"]

        hmac_alg_map = {
            "HS256": jose.jwa.HS256,
            "HS384": jose.jwa.HS384,
            "HS512": jose.jwa.HS512,
        }
        alg = hmac_alg_map.get(hmac_alg)
        if alg is None:
            supported = ", ".join(hmac_alg_map.keys())
            raise ValueError(f"Invalid value for hmac_alg: {hmac_alg}. "
                             f"Expected one of: {supported}.")

        eab = jws.JWS.sign(key_json, jose.jwk.JWKOct(key=decoded_hmac_key),
                           alg, None,
                           url, kid)

        return eab.to_partial_json()


GenericRegistration = TypeVar('GenericRegistration', bound='Registration')


class Registration(ResourceBody):
    """Registration Resource Body.

    :ivar jose.JWK key: Public key.
    :ivar tuple contact: Contact information following ACME spec,
        `tuple` of `str`.
    :ivar str agreement:

    """
    # on new-reg key server ignores 'key' and populates it based on
    # JWS.signature.combined.jwk
    key: jose.JWK = jose.field('key', omitempty=True, decoder=jose.JWK.from_json)
    # Contact field implements special behavior to allow messages that clear existing
    # contacts while not expecting the `contact` field when loading from json.
    # This is implemented in the constructor and *_json methods.
    contact: Tuple[str, ...] = jose.field('contact', omitempty=True, default=())
    agreement: str = jose.field('agreement', omitempty=True)
    status: Status = jose.field('status', omitempty=True)
    terms_of_service_agreed: bool = jose.field('termsOfServiceAgreed', omitempty=True)
    only_return_existing: bool = jose.field('onlyReturnExisting', omitempty=True)
    external_account_binding: Dict[str, Any] = jose.field('externalAccountBinding',
                                                          omitempty=True)

    phone_prefix = 'tel:'
    email_prefix = 'mailto:'

    @classmethod
    def from_data(cls: Type[GenericRegistration], phone: Optional[str] = None,
                  email: Optional[str] = None,
                  external_account_binding: Optional[Dict[str, Any]] = None,
                  **kwargs: Any) -> GenericRegistration:
        """
        Create registration resource from contact details.

        The `contact` keyword being passed to a Registration object is meaningful, so
        this function represents empty iterables in its kwargs by passing on an empty
        `tuple`.
        """

        # Note if `contact` was in kwargs.
        contact_provided = 'contact' in kwargs

        # Pop `contact` from kwargs and add formatted email or phone numbers
        details = list(kwargs.pop('contact', ()))
        if phone is not None:
            details.append(cls.phone_prefix + phone)
        if email is not None:
            details.extend([cls.email_prefix + mail for mail in email.split(',')])

        # Insert formatted contact information back into kwargs
        # or insert an empty tuple if `contact` provided.
        if details or contact_provided:
            kwargs['contact'] = tuple(details)

        if external_account_binding:
            kwargs['external_account_binding'] = external_account_binding

        return cls(**kwargs)

    def __init__(self, **kwargs: Any) -> None:
        """Note if the user provides a value for the `contact` member."""
        if 'contact' in kwargs and kwargs['contact'] is not None:
            # Avoid the __setattr__ used by jose.TypedJSONObjectWithFields
            object.__setattr__(self, '_add_contact', True)
        super().__init__(**kwargs)

    def _filter_contact(self, prefix: str) -> Tuple[str, ...]:
        return tuple(
            detail[len(prefix):] for detail in self.contact  # pylint: disable=not-an-iterable
            if detail.startswith(prefix))

    def _add_contact_if_appropriate(self, jobj: Dict[str, Any]) -> Dict[str, Any]:
        """
        The `contact` member of Registration objects should not be required when
        de-serializing (as it would be if the Fields' `omitempty` flag were `False`), but
        it should be included in serializations if it was provided.

        :param jobj: Dictionary containing this Registrations' data
        :type jobj: dict

        :returns: Dictionary containing Registrations data to transmit to the server
        :rtype: dict
        """
        if getattr(self, '_add_contact', False):
            jobj['contact'] = self.encode('contact')

        return jobj

    def to_partial_json(self) -> Dict[str, Any]:
        """Modify josepy.JSONDeserializable.to_partial_json()"""
        jobj = super().to_partial_json()
        return self._add_contact_if_appropriate(jobj)

    def fields_to_partial_json(self) -> Dict[str, Any]:
        """Modify josepy.JSONObjectWithFields.fields_to_partial_json()"""
        jobj = super().fields_to_partial_json()
        return self._add_contact_if_appropriate(jobj)

    @property
    def phones(self) -> Tuple[str, ...]:
        """All phones found in the ``contact`` field."""
        return self._filter_contact(self.phone_prefix)

    @property
    def emails(self) -> Tuple[str, ...]:
        """All emails found in the ``contact`` field."""
        return self._filter_contact(self.email_prefix)

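`Registration.from_data` formats contact details by prefixing a phone number with `tel:` and splitting a comma-separated email string into one `mailto:` entry per address. A hypothetical stdlib-only sketch of just that formatting step (function and constant names here are illustrative):

```python
# Sketch of the contact formatting performed inside Registration.from_data.
from typing import Optional, Tuple

PHONE_PREFIX = 'tel:'
EMAIL_PREFIX = 'mailto:'


def format_contact(phone: Optional[str] = None,
                   email: Optional[str] = None) -> Tuple[str, ...]:
    details = []
    if phone is not None:
        details.append(PHONE_PREFIX + phone)
    if email is not None:
        # A comma-separated string yields one mailto: entry per address.
        details.extend(EMAIL_PREFIX + mail for mail in email.split(','))
    return tuple(details)


assert format_contact(email='a@example.com,b@example.com') == (
    'mailto:a@example.com', 'mailto:b@example.com')
assert format_contact(phone='123') == ('tel:123',)
```

The real method additionally distinguishes "no contact given" from "contact explicitly set to empty", since an empty `contact` tuple is a meaningful request to clear existing contacts.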
class NewRegistration(Registration):
    """New registration."""


class UpdateRegistration(Registration):
    """Update registration."""


class RegistrationResource(ResourceWithURI):
    """Registration Resource.

    :ivar acme.messages.Registration body:
    :ivar str new_authzr_uri: Deprecated. Do not use.
    :ivar str terms_of_service: URL for the CA TOS.

    """
    body: Registration = jose.field('body', decoder=Registration.from_json)
    new_authzr_uri: str = jose.field('new_authzr_uri', omitempty=True)
    terms_of_service: str = jose.field('terms_of_service', omitempty=True)


class ChallengeBody(ResourceBody):
    """Challenge Resource Body.

    .. todo::
       Confusingly, this has a similar name to `.challenges.Challenge`,
       as well as `.achallenges.AnnotatedChallenge`. Please use names
       such as ``challb`` to distinguish instances of this class from
       ``achall``.

    :ivar acme.challenges.Challenge: Wrapped challenge.
        Conveniently, all challenge fields are proxied, i.e. you can
        call ``challb.x`` to get ``challb.chall.x`` contents.
    :ivar acme.messages.Status status:
    :ivar datetime.datetime validated:
    :ivar messages.Error error:

    """
    __slots__ = ('chall',)
    # ACMEv1 has a "uri" field in challenges. ACMEv2 has a "url" field. This
    # challenge object supports either one, but should be accessed through the
    # name "uri". In Client.answer_challenge, whichever one is set will be
    # used.
    _url: str = jose.field('url', omitempty=True, default=None)
    status: Status = jose.field('status', decoder=Status.from_json,
                                omitempty=True, default=STATUS_PENDING)
    validated: datetime.datetime = fields.rfc3339('validated', omitempty=True)
    error: Error = jose.field('error', decoder=Error.from_json,
                              omitempty=True, default=None)

    def __init__(self, **kwargs: Any) -> None:
        kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
        super().__init__(**kwargs)

    def encode(self, name: str) -> Any:
        return super().encode(self._internal_name(name))

    def to_partial_json(self) -> Dict[str, Any]:
        jobj = super().to_partial_json()
        jobj.update(self.chall.to_partial_json())
        return jobj

    @classmethod
    def fields_from_json(cls, jobj: Mapping[str, Any]) -> Dict[str, Any]:
        jobj_fields = super().fields_from_json(jobj)
        jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
        return jobj_fields

    @property
    def uri(self) -> str:
        """The URL of this challenge."""
        return self._url

    def __getattr__(self, name: str) -> Any:
        return getattr(self.chall, name)

    def __iter__(self) -> Iterator[str]:
        # When iterating over fields, use the external name 'uri' instead of
        # the internal '_uri'.
        for name in super().__iter__():
            yield 'uri' if name == '_url' else name

    def _internal_name(self, name: str) -> str:
        return '_url' if name == 'uri' else name


class ChallengeResource(Resource):
    """Challenge Resource.

    :ivar acme.messages.ChallengeBody body:
    :ivar str authzr_uri: URI found in the 'up' ``Link`` header.

    """
    body: ChallengeBody = jose.field('body', decoder=ChallengeBody.from_json)
    authzr_uri: str = jose.field('authzr_uri')

    @property
    def uri(self) -> str:
        """The URL of the challenge body."""
        return self.body.uri  # pylint: disable=no-member


class Authorization(ResourceBody):
    """Authorization Resource Body.

    :ivar acme.messages.Identifier identifier:
    :ivar list challenges: `list` of `.ChallengeBody`
    :ivar acme.messages.Status status:
    :ivar datetime.datetime expires:

    """
    identifier: Identifier = jose.field('identifier', decoder=Identifier.from_json, omitempty=True)
    challenges: List[ChallengeBody] = jose.field('challenges', omitempty=True)

    status: Status = jose.field('status', omitempty=True, decoder=Status.from_json)
    # TODO: 'expires' is allowed for Authorization Resources in
    # general, but for Key Authorization '[t]he "expires" field MUST
    # be absent'... then acme-spec gives example with 'expires'
    # present... That's confusing!
    expires: datetime.datetime = fields.rfc3339('expires', omitempty=True)
    wildcard: bool = jose.field('wildcard', omitempty=True)

    # Mypy does not understand the josepy magic happening here, and falsely claims
    # that challenge is redefined. Let's ignore the type check here.
    @challenges.decoder  # type: ignore
    def challenges(value: List[Dict[str, Any]]) -> Tuple[ChallengeBody, ...]:  # pylint: disable=no-self-argument,missing-function-docstring
        return tuple(ChallengeBody.from_json(chall) for chall in value)


class NewAuthorization(Authorization):
    """New authorization."""


class UpdateAuthorization(Authorization):
    """Update authorization."""


class AuthorizationResource(ResourceWithURI):
    """Authorization Resource.

    :ivar acme.messages.Authorization body:
    :ivar str new_cert_uri: Deprecated. Do not use.

    """
    body: Authorization = jose.field('body', decoder=Authorization.from_json)
    new_cert_uri: str = jose.field('new_cert_uri', omitempty=True)


class CertificateRequest(jose.JSONObjectWithFields):
    """ACME newOrder request.

    :ivar x509.CertificateSigningRequest csr: `x509.CertificateSigningRequest`

    """
    csr: x509.CertificateSigningRequest = jose.field(
        'csr', decoder=jose.decode_csr, encoder=jose.encode_csr)


class CertificateResource(ResourceWithURI):
    """Certificate Resource.

    :ivar x509.Certificate body: `x509.Certificate`
    :ivar str cert_chain_uri: URI found in the 'up' ``Link`` header
    :ivar tuple authzrs: `tuple` of `AuthorizationResource`.

    """
    cert_chain_uri: str = jose.field('cert_chain_uri')
    authzrs: Tuple[AuthorizationResource, ...] = jose.field('authzrs')


class Revocation(jose.JSONObjectWithFields):
    """Revocation message.

    :ivar x509.Certificate certificate: `x509.Certificate`

    """
    certificate: x509.Certificate = jose.field(
        'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
    reason: int = jose.field('reason')


class Order(ResourceBody):
    """Order Resource Body.

    :ivar profile: The profile to request.
    :vartype profile: str
    :ivar identifiers: List of identifiers for the certificate.
    :vartype identifiers: `list` of `.Identifier`
    :ivar acme.messages.Status status:
    :ivar authorizations: URLs of authorizations.
    :vartype authorizations: `list` of `str`
    :ivar str certificate: URL to download certificate as a fullchain PEM.
    :ivar str finalize: URL to POST to to request issuance once all
        authorizations have "valid" status.
    :ivar datetime.datetime expires: When the order expires.
    :ivar ~.Error error: Any error that occurred during finalization, if applicable.
    """
    # https://datatracker.ietf.org/doc/draft-aaron-acme-profiles/
    profile: str = jose.field('profile', omitempty=True)
    identifiers: List[Identifier] = jose.field('identifiers', omitempty=True)
    status: Status = jose.field('status', decoder=Status.from_json, omitempty=True)
    authorizations: List[str] = jose.field('authorizations', omitempty=True)
    certificate: str = jose.field('certificate', omitempty=True)
    finalize: str = jose.field('finalize', omitempty=True)
    expires: datetime.datetime = fields.rfc3339('expires', omitempty=True)
    error: Error = jose.field('error', omitempty=True, decoder=Error.from_json)

    # Mypy does not understand the josepy magic happening here, and falsely claims
    # that identifiers is redefined. Let's ignore the type check here.
    @identifiers.decoder  # type: ignore
    def identifiers(value: List[Dict[str, Any]]) -> Tuple[Identifier, ...]:  # pylint: disable=no-self-argument,missing-function-docstring
        return tuple(Identifier.from_json(identifier) for identifier in value)


class OrderResource(ResourceWithURI):
    """Order Resource.

    :ivar acme.messages.Order body:
    :ivar bytes csr_pem: The CSR this Order will be finalized with.
    :ivar authorizations: Fully-fetched AuthorizationResource objects.
    :vartype authorizations: `list` of `acme.messages.AuthorizationResource`
    :ivar str fullchain_pem: The fetched contents of the certificate URL
        produced once the order was finalized, if it's present.
    :ivar alternative_fullchains_pem: The fetched contents of alternative certificate
        chain URLs produced once the order was finalized, if present and requested during
        finalization.
    :vartype alternative_fullchains_pem: `list` of `str`
    """
    body: Order = jose.field('body', decoder=Order.from_json)
    csr_pem: bytes = jose.field('csr_pem', omitempty=True,
                                # This looks backwards, but it's not -
                                # we want the deserialized value to be
                                # `bytes`, but anything we put into
                                # JSON needs to be `str`, so we encode
                                # to decode and decode to
                                # encode. Otherwise we end up with an
                                # array of ints on serialization
                                decoder=lambda s: s.encode("utf-8"),
                                encoder=lambda b: b.decode("utf-8"))

    authorizations: List[AuthorizationResource] = jose.field('authorizations')
    fullchain_pem: str = jose.field('fullchain_pem', omitempty=True)
    alternative_fullchains_pem: List[str] = jose.field('alternative_fullchains_pem',
                                                       omitempty=True)

    # Mypy does not understand the josepy magic happening here, and falsely claims
    # that authorizations is redefined. Let's ignore the type check here.
    @authorizations.decoder  # type: ignore
    def authorizations(value: List[Dict[str, Any]]) -> Tuple[AuthorizationResource, ...]:  # pylint: disable=no-self-argument,missing-function-docstring
        return tuple(AuthorizationResource.from_json(authz) for authz in value)

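The `csr_pem` encoder/decoder lambdas above exist because JSON carries `str` while the in-memory field is `bytes`; without the encoder, serialization would produce an array of ints (or fail outright). A small round-trip sketch of that conversion, outside josepy:

```python
# Sketch of the csr_pem str<->bytes conversion used by the field above.
import json

encoder = lambda b: b.decode("utf-8")  # bytes -> JSON-safe str
decoder = lambda s: s.encode("utf-8")  # str from JSON -> bytes

pem = b"-----BEGIN CERTIFICATE REQUEST-----"
wire = json.dumps({"csr_pem": encoder(pem)})
assert decoder(json.loads(wire)["csr_pem"]) == pem  # lossless round trip
```

PEM is ASCII, so the UTF-8 round trip is lossless here; the lambdas just move between the wire representation and the `bytes` the rest of the client expects.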
class NewOrder(Order):
    """New order."""


class RenewalInfo(ResourceBody):
    """Renewal Info Resource Body.
    :ivar acme.messages.SuggestedWindow window: The suggested renewal window.
    """
    class SuggestedWindow(jose.JSONObjectWithFields):
        """Suggested Renewal Window, sub-resource of Renewal Info Resource.
        :ivar datetime.datetime start: Beginning of suggested renewal window
        :ivar datetime.datetime end: End of suggested renewal window (inclusive)
        """
        start: datetime.datetime = fields.rfc3339('start', omitempty=True)
        end: datetime.datetime = fields.rfc3339('end', omitempty=True)

    suggested_window: SuggestedWindow = jose.field('suggestedWindow',
                                                   decoder=SuggestedWindow.from_json)

@@ -1,331 +0,0 @@
"""Support for standalone client challenge solvers. """
|
||||
from __future__ import annotations
|
||||
|
||||
import collections
|
||||
import functools
|
||||
import http.client as http_client
|
||||
import http.server as BaseHTTPServer
|
||||
import logging
|
||||
import socket
|
||||
import socketserver
|
||||
import threading
|
||||
from typing import Any
|
||||
from typing import cast
|
||||
from typing import List
|
||||
from typing import Mapping
|
||||
from typing import Optional
|
||||
from typing import Set
|
||||
from typing import Tuple
|
||||
from typing import Type
|
||||
import warnings
|
||||
|
||||
from OpenSSL import SSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class TLSServer(socketserver.TCPServer):
|
||||
"""Generic TLS Server
|
||||
|
||||
.. deprecated:: 4.1.0
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, *args: Any, **kwargs: Any) -> None:
|
||||
warnings.warn("TLSServer is deprecated and will be removed in an upcoming release",
|
||||
DeprecationWarning)
|
||||
self.ipv6 = kwargs.pop("ipv6", False)
|
||||
if self.ipv6:
|
||||
self.address_family = socket.AF_INET6
|
||||
else:
|
||||
self.address_family = socket.AF_INET
|
||||
self.certs = kwargs.pop("certs", {})
|
||||
self.method = kwargs.pop("method", crypto_util._DEFAULT_SSL_METHOD)
|
||||
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
def _wrap_sock(self) -> None:
|
||||
with warnings.catch_warnings():
|
||||
warnings.filterwarnings('ignore', 'SSLSocket is deprecated')
|
||||
self.socket = cast(socket.socket, crypto_util.SSLSocket(
|
||||
self.socket, cert_selection=self._cert_selection,
|
||||
alpn_selection=getattr(self, '_alpn_selection', None),
|
||||
method=self.method))
|
||||
|
||||
def _cert_selection(self, connection: SSL.Connection
|
||||
) -> Optional[crypto_util._KeyAndCert]: # pragma: no cover
|
||||
"""Callback selecting certificate for connection."""
|
||||
server_name = connection.get_servername()
|
||||
if server_name:
|
||||
return self.certs.get(server_name, None)
|
||||
return None
|
||||
|
||||
def server_bind(self) -> None:
|
||||
self._wrap_sock()
|
||||
return socketserver.TCPServer.server_bind(self)
|
||||
|
||||
|
||||
class ACMEServerMixin:
|
||||
"""ACME server common settings mixin."""
|
||||
# TODO: c.f. #858
|
||||
server_version = "ACME client standalone challenge solver"
|
||||
allow_reuse_address = True
|
||||
|
||||
|
||||
class BaseDualNetworkedServers:
|
||||
"""Base class for a pair of IPv6 and IPv4 servers that tries to do everything
|
||||
it's asked for both servers, but where failures in one server don't
|
||||
affect the other.
|
||||
|
||||
If two servers are instantiated, they will serve on the same port.
|
||||
"""
|
||||
|
||||
def __init__(self, ServerClass: Type[socketserver.TCPServer], server_address: Tuple[str, int],
|
||||
*remaining_args: Any, **kwargs: Any) -> None:
|
||||
port = server_address[1]
|
||||
self.threads: List[threading.Thread] = []
|
||||
self.servers: List[socketserver.BaseServer] = []
|
||||
|
||||
# Preserve socket error for re-raising, if no servers can be started
|
||||
last_socket_err: Optional[socket.error] = None
|
||||
|
||||
# Must try True first.
|
||||
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
|
||||
# to IPv6. But that's ok, since it will accept IPv4 connections on the IPv6
|
||||
# socket. On the other hand, FreeBSD will successfully bind to IPv4 on the
|
||||
# same port, which means that server will accept the IPv4 connections.
|
||||
# If Python is compiled without IPv6, we'll error out but (probably) successfully
|
||||
# create the IPv4 server.
|
||||
for ip_version in [True, False]:
|
||||
try:
|
||||
kwargs["ipv6"] = ip_version
|
||||
new_address = (server_address[0],) + (port,) + server_address[2:]
|
||||
new_args = (new_address,) + remaining_args
|
||||
server = ServerClass(*new_args, **kwargs)
|
||||
logger.debug(
|
||||
"Successfully bound to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
except OSError as e:
|
||||
last_socket_err = e
|
||||
if self.servers:
|
||||
# Already bound using IPv6.
|
||||
logger.debug(
|
||||
"Certbot wasn't able to bind to %s:%s using %s, this "
|
||||
"is often expected due to the dual stack nature of "
|
||||
"IPv6 socket implementations.",
|
||||
new_address[0], new_address[1],
|
||||
"IPv6" if ip_version else "IPv4")
|
||||
else:
|
||||
logger.debug(
|
||||
"Failed to bind to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
else:
|
||||
self.servers.append(server)
|
||||
# If two servers are set up and port 0 was passed in, ensure we always
|
||||
# bind to the same port for both servers.
|
||||
port = server.socket.getsockname()[1]
|
||||
if not self.servers:
|
||||
if last_socket_err:
|
||||
raise last_socket_err
|
||||
else: # pragma: no cover
|
||||
raise OSError("Could not bind to IPv4 or IPv6.")
|
||||
|
||||
def serve_forever(self) -> None:
|
||||
"""Wraps socketserver.TCPServer.serve_forever"""
|
||||
for server in self.servers:
|
||||
thread = threading.Thread(
|
||||
target=server.serve_forever)
|
||||
thread.start()
|
||||
self.threads.append(thread)
|
||||
|
||||
def getsocknames(self) -> List[Tuple[str, int]]:
|
||||
"""Wraps socketserver.TCPServer.socket.getsockname"""
|
||||
return [server.socket.getsockname() for server in self.servers]
|
||||
|
||||
def shutdown_and_server_close(self) -> None:
|
||||
"""Wraps socketserver.TCPServer.shutdown, socketserver.TCPServer.server_close, and
|
||||
threading.Thread.join"""
|
||||
for server in self.servers:
|
||||
server.shutdown()
|
||||
server.server_close()
|
||||
for thread in self.threads:
|
||||
thread.join()
|
||||
self.threads = []
|
||||
|
||||
|
||||
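The bind loop in `BaseDualNetworkedServers` can be condensed into a stdlib-only sketch: try IPv6 first, then IPv4, propagating the port that a successful bind chose so both servers share it, and tolerating failure of either family as long as one server comes up. The hostnames and helper names here are illustrative, not the acme API:

```python
# Condensed sketch of the dual IPv6/IPv4 bind strategy used above.
import socket
import socketserver
from typing import List, Optional


class _NoopHandler(socketserver.BaseRequestHandler):
    def handle(self) -> None:  # the sketch never serves requests
        pass


def bind_dual(port: int = 0) -> List[socketserver.TCPServer]:
    """Bind IPv6 then IPv4 loopback servers, reusing the first chosen port."""
    servers: List[socketserver.TCPServer] = []
    last_err: Optional[OSError] = None
    for family, host in ((socket.AF_INET6, '::1'), (socket.AF_INET, '127.0.0.1')):
        srv_cls = type('Srv', (socketserver.TCPServer,),
                       {'address_family': family, 'allow_reuse_address': True})
        try:
            server = srv_cls((host, port), _NoopHandler)
        except OSError as err:
            last_err = err  # one family failing is fine if the other binds
        else:
            servers.append(server)
            port = server.socket.getsockname()[1]  # keep both on the same port
    if not servers:
        raise last_err or OSError("Could not bind to IPv4 or IPv6.")
    return servers


servers = bind_dual()
assert servers  # at least one address family bound
for s in servers:
    s.server_close()
```

The port propagation is the detail that matters: when port 0 is requested, the OS picks a free port for the first server, and that concrete port is reused for the second bind so both families answer on the same number.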
class TLSALPN01Server(TLSServer, ACMEServerMixin):
|
||||
"""TLSALPN01 Server.
|
||||
|
||||
.. deprecated:: 4.1.0
|
||||
|
||||
"""
|
||||
|
||||
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
|
||||
|
||||
def __init__(self, server_address: Tuple[str, int],
|
||||
certs: List[crypto_util._KeyAndCert],
|
||||
challenge_certs: Mapping[bytes, crypto_util._KeyAndCert],
|
||||
ipv6: bool = False) -> None:
|
||||
warnings.warn("TLSALPN01Server is deprecated and will be removed in an "
|
||||
"upcoming certbot major version update", DeprecationWarning)
|
||||
# We don't need to implement a request handler here because the work
|
||||
# (including logging) is being done by wrapped socket set up in the
|
||||
# parent TLSServer class.
|
||||
with warnings.catch_warnings():
|
||||
warnings.filterwarnings("ignore", "TLSServer is deprecated")
|
||||
TLSServer.__init__(
|
||||
self, server_address, socketserver.BaseRequestHandler, certs=certs,
|
||||
ipv6=ipv6)
|
||||
self.challenge_certs = challenge_certs
|
||||
|
||||
def _cert_selection(self, connection: SSL.Connection) -> Optional[crypto_util._KeyAndCert]:
|
||||
# TODO: We would like to serve challenge cert only if asked for it via
|
||||
# ALPN. To do this, we need to retrieve the list of protos from client
|
||||
# hello, but this is currently impossible with openssl [0], and ALPN
|
||||
# negotiation is done after cert selection.
|
||||
# Therefore, currently we always return challenge cert, and terminate
|
||||
# handshake in alpn_selection() if ALPN protos are not what we expect.
|
||||
# [0] https://github.com/openssl/openssl/issues/4952
|
||||
server_name = connection.get_servername()
|
||||
if server_name:
|
||||
logger.debug("Serving challenge cert for server name %s", server_name)
|
||||
return self.challenge_certs[server_name]
|
||||
return None # pragma: no cover
|
||||
|
||||
def _alpn_selection(self, _connection: SSL.Connection, alpn_protos: List[bytes]) -> bytes:
|
||||
"""Callback to select alpn protocol."""
|
||||
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
|
||||
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
|
||||
return self.ACME_TLS_1_PROTOCOL
|
||||
logger.debug("Cannot agree on ALPN proto. Got: %s", str(alpn_protos))
|
||||
# Explicitly close the connection now, by returning an empty string.
|
||||
# See https://www.pyopenssl.org/en/stable/api/ssl.html#OpenSSL.SSL.Context.set_alpn_select_callback # pylint: disable=line-too-long
|
||||
return b""


class HTTPServer(BaseHTTPServer.HTTPServer):
    """Generic HTTP Server."""

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        self.ipv6 = kwargs.pop("ipv6", False)
        if self.ipv6:
            self.address_family = socket.AF_INET6
        else:
            self.address_family = socket.AF_INET
        super().__init__(*args, **kwargs)
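Setting the address family before calling the parent constructor works because `socketserver.TCPServer` reads `self.address_family` when it creates its listening socket. A runnable sketch of the same pattern against the stdlib (the `DualFamilyHTTPServer` name is illustrative, not part of acme):

```python
import socket
from http.server import BaseHTTPRequestHandler, HTTPServer


class DualFamilyHTTPServer(HTTPServer):
    """HTTPServer that picks IPv4 or IPv6 at construction time."""

    def __init__(self, *args, ipv6=False, **kwargs):
        # Must be set before super().__init__(), which binds the socket.
        self.address_family = socket.AF_INET6 if ipv6 else socket.AF_INET
        super().__init__(*args, **kwargs)


srv = DualFamilyHTTPServer(("127.0.0.1", 0), BaseHTTPRequestHandler)
print(srv.address_family == socket.AF_INET)  # True
print(srv.server_address[1] > 0)             # True: ephemeral port assigned
srv.server_close()
```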


class HTTP01Server(HTTPServer, ACMEServerMixin):
    """HTTP01 Server."""

    def __init__(self, server_address: Tuple[str, int],
                 resources: Set[HTTP01RequestHandler.HTTP01Resource],
                 ipv6: bool = False, timeout: int = 30) -> None:
        super().__init__(
            server_address, HTTP01RequestHandler.partial_init(
                simple_http_resources=resources, timeout=timeout), ipv6=ipv6)


class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
    """HTTP01Server Wrapper. Tries everything for both. Failures for one don't
    affect the other."""

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        super().__init__(HTTP01Server, *args, **kwargs)


class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    """HTTP01 challenge handler.

    Adheres to the stdlib's `socketserver.BaseRequestHandler` interface.

    :ivar set simple_http_resources: A set of `HTTP01Resource`
        objects. TODO: better name?

    """
    HTTP01Resource = collections.namedtuple(
        "HTTP01Resource", "chall response validation")

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        self.simple_http_resources = kwargs.pop("simple_http_resources", set())
        self._timeout = kwargs.pop('timeout', 30)
        super().__init__(*args, **kwargs)
        self.server: HTTP01Server

    # In parent class BaseHTTPRequestHandler, 'timeout' is a class-level property but we
    # need to define its value during the initialization phase in HTTP01RequestHandler.
    # However MyPy does not appreciate that we dynamically shadow a class-level property
    # with an instance-level property (eg. self.timeout = ... in __init__()). So to make
    # everyone happy, we statically redefine 'timeout' as a method property, and set the
    # timeout value in a new internal instance-level property _timeout.
    @property
    def timeout(self) -> int:  # type: ignore[override]
        """
        The default timeout this server should apply to requests.
        :return: timeout to apply
        :rtype: int
        """
        return self._timeout
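The comment above describes a general trick for type-checker-friendly shadowing of an inherited class attribute; a self-contained sketch (the `Base`/`Handler` names are illustrative):

```python
class Base:
    timeout = 60  # class-level default, as on BaseHTTPRequestHandler


class Handler(Base):
    def __init__(self, timeout: int) -> None:
        # Store the per-instance value under a private name...
        self._timeout = timeout

    @property
    def timeout(self) -> int:
        # ...and expose it through a property that statically overrides
        # the inherited class attribute, which keeps MyPy satisfied.
        return self._timeout


print(Base().timeout, Handler(30).timeout)  # 60 30
```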

    def log_message(self, format: str, *args: Any) -> None:  # pylint: disable=redefined-builtin
        """Log arbitrary message."""
        logger.debug("%s - - %s", self.client_address[0], format % args)

    def handle(self) -> None:
        """Handle request."""
        self.log_message("Incoming request")
        BaseHTTPServer.BaseHTTPRequestHandler.handle(self)

    def do_GET(self) -> None:  # pylint: disable=invalid-name,missing-function-docstring
        if self.path == "/":
            self.handle_index()
        elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
            self.handle_simple_http_resource()
        else:
            self.handle_404()

    def handle_index(self) -> None:
        """Handle index page."""
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(self.server.server_version.encode())

    def handle_404(self) -> None:
        """Handle 404 Not Found errors."""
        self.send_response(http_client.NOT_FOUND, message="Not Found")
        self.send_header("Content-type", "text/html")
        self.end_headers()
        self.wfile.write(b"404")

    def handle_simple_http_resource(self) -> None:
        """Handle HTTP01 provisioned resources."""
        for resource in self.simple_http_resources:
            if resource.chall.path == self.path:
                self.log_message("Serving HTTP01 with token %r",
                                 resource.chall.encode("token"))
                self.send_response(http_client.OK)
                self.end_headers()
                self.wfile.write(resource.validation.encode())
                return
        else:  # pylint: disable=useless-else-on-loop
            self.log_message("No resources to serve")
        self.log_message("%s does not correspond to any resource. ignoring",
                         self.path)

    @classmethod
    def partial_init(cls, simple_http_resources: Set[HTTP01RequestHandler.HTTP01Resource],
                     timeout: int) -> functools.partial[HTTP01RequestHandler]:
        """Partially initialize this handler.

        This is useful because `socketserver.BaseServer` takes
        uninitialized handler and initializes it with the current
        request.

        """
        return functools.partial(
            cls, simple_http_resources=simple_http_resources,
            timeout=timeout)
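`partial_init` relies on the fact that `socketserver` servers instantiate the handler class themselves, once per request, so any extra state has to be pre-bound into the constructor. A runnable sketch of that pattern with a hypothetical echo handler:

```python
import functools
import socket
import socketserver
import threading


class EchoHandler(socketserver.BaseRequestHandler):
    """Handler whose greeting is injected via functools.partial."""

    def __init__(self, *args, greeting=b"hi", **kwargs):
        self.greeting = greeting
        # BaseRequestHandler.__init__ runs setup()/handle()/finish() itself.
        super().__init__(*args, **kwargs)

    def handle(self):
        self.request.sendall(self.greeting)


# The server receives a callable, not an instance; partial() pre-binds
# the extra keyword argument, just as partial_init() does above.
handler = functools.partial(EchoHandler, greeting=b"hello")
with socketserver.TCPServer(("127.0.0.1", 0), handler) as srv:
    t = threading.Thread(target=srv.handle_request)
    t.start()
    with socket.create_connection(srv.server_address) as conn:
        data = conn.recv(1024)
    t.join()

print(data)  # b'hello'
```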
@@ -1,10 +0,0 @@
"""ACME utilities."""
from typing import Any
from typing import Callable
from typing import Dict
from typing import Mapping


def map_keys(dikt: Mapping[Any, Any], func: Callable[[Any], Any]) -> Dict[Any, Any]:
    """Map dictionary keys."""
    return {func(key): value for key, value in dikt.items()}
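The hunk above deletes `acme/util.py` entirely; `map_keys` was a one-line wrapper over a dict comprehension, so callers can simply inline it. For reference:

```python
def map_keys(dikt, func):
    """Map dictionary keys."""
    return {func(key): value for key, value in dikt.items()}


print(map_keys({"a": 1, "b": 2}, str.upper))  # {'A': 1, 'B': 2}
```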
@@ -1,19 +1,14 @@
"""Tests for acme.challenges."""
import sys
import unittest
from unittest import mock
import urllib.parse as urllib_parse

import josepy as jose
from josepy.jwk import JWKEC
import OpenSSL
import pytest
import mock
import requests
from six.moves.urllib import parse as urllib_parse

from acme import errors
from acme._internal.tests import test_util
import test_util

CERT = test_util.load_cert('cert.pem')
CERT = test_util.load_comparable_cert('cert.pem')
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
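One of the swaps in this hunk replaces the stdlib's `urllib.parse` with `six.moves.urllib.parse`; on Python 3 the two spellings resolve to the same module, so tests such as `test_simple_verify_port` behave identically either way. A quick check:

```python
import urllib.parse as urllib_parse

# The tests only use urlparse(...).netloc to inspect the host:port pair.
print(urllib_parse.urlparse(
    "http://local:8080/.well-known/acme-challenge/tok").netloc)  # local:8080
```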


@@ -23,7 +18,7 @@ class ChallengeTest(unittest.TestCase):
        from acme.challenges import Challenge
        from acme.challenges import UnrecognizedChallenge
        chall = UnrecognizedChallenge({"type": "foo"})
        assert chall == Challenge.from_json(chall.jobj)
        self.assertEqual(chall, Challenge.from_json(chall.jobj))


class UnrecognizedChallengeTest(unittest.TestCase):
@@ -34,11 +29,12 @@ class UnrecognizedChallengeTest(unittest.TestCase):
        self.chall = UnrecognizedChallenge(self.jobj)

    def test_to_partial_json(self):
        assert self.jobj == self.chall.to_partial_json()
        self.assertEqual(self.jobj, self.chall.to_partial_json())

    def test_from_json(self):
        from acme.challenges import UnrecognizedChallenge
        assert self.chall == UnrecognizedChallenge.from_json(self.jobj)
        self.assertEqual(
            self.chall, UnrecognizedChallenge.from_json(self.jobj))


class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
@@ -54,26 +50,26 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
        from acme.challenges import KeyAuthorizationChallengeResponse
        response = KeyAuthorizationChallengeResponse(
            key_authorization='foo.oKGqedy-b-acd5eoybm2f-NVFxvyOoET5CNy3xnv8WY')
        assert response.verify(self.chall, KEY.public_key())
        self.assertTrue(response.verify(self.chall, KEY.public_key()))

    def test_verify_wrong_token(self):
        from acme.challenges import KeyAuthorizationChallengeResponse
        response = KeyAuthorizationChallengeResponse(
            key_authorization='bar.oKGqedy-b-acd5eoybm2f-NVFxvyOoET5CNy3xnv8WY')
        assert not response.verify(self.chall, KEY.public_key())
        self.assertFalse(response.verify(self.chall, KEY.public_key()))

    def test_verify_wrong_thumbprint(self):
        from acme.challenges import KeyAuthorizationChallengeResponse
        response = KeyAuthorizationChallengeResponse(
            key_authorization='foo.oKGqedy-b-acd5eoybm2f-NVFxv')
        assert not response.verify(self.chall, KEY.public_key())
        self.assertFalse(response.verify(self.chall, KEY.public_key()))

    def test_verify_wrong_form(self):
        from acme.challenges import KeyAuthorizationChallengeResponse
        response = KeyAuthorizationChallengeResponse(
            key_authorization='.foo.oKGqedy-b-acd5eoybm2f-'
                              'NVFxvyOoET5CNy3xnv8WY')
        assert not response.verify(self.chall, KEY.public_key())
        self.assertFalse(response.verify(self.chall, KEY.public_key()))


class DNS01ResponseTest(unittest.TestCase):
@@ -92,11 +88,12 @@ class DNS01ResponseTest(unittest.TestCase):
        self.response = self.chall.response(KEY)

    def test_to_partial_json(self):
        assert {} == self.msg.to_partial_json()
        self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
                         self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import DNS01Response
        assert self.msg == DNS01Response.from_json(self.jmsg)
        self.assertEqual(self.msg, DNS01Response.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import DNS01Response
@@ -106,12 +103,12 @@ class DNS01ResponseTest(unittest.TestCase):
        key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
        public_key = key2.public_key()
        verified = self.response.simple_verify(self.chall, "local", public_key)
        assert not verified
        self.assertFalse(verified)

    def test_simple_verify_success(self):
        public_key = KEY.public_key()
        verified = self.response.simple_verify(self.chall, "local", public_key)
        assert verified
        self.assertTrue(verified)


class DNS01Test(unittest.TestCase):
@@ -126,19 +123,20 @@ class DNS01Test(unittest.TestCase):
        }

    def test_validation_domain_name(self):
        assert '_acme-challenge.www.example.com' == \
            self.msg.validation_domain_name('www.example.com')
        self.assertEqual('_acme-challenge.www.example.com',
                         self.msg.validation_domain_name('www.example.com'))

    def test_validation(self):
        assert "rAa7iIg4K2y63fvUhCfy8dP1Xl7wEhmQq0oChTcE3Zk" == \
            self.msg.validation(KEY)
        self.assertEqual(
            "rAa7iIg4K2y63fvUhCfy8dP1Xl7wEhmQq0oChTcE3Zk",
            self.msg.validation(KEY))

    def test_to_partial_json(self):
        assert self.jmsg == self.msg.to_partial_json()
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import DNS01
        assert self.msg == DNS01.from_json(self.jmsg)
        self.assertEqual(self.msg, DNS01.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import DNS01
@@ -161,11 +159,13 @@ class HTTP01ResponseTest(unittest.TestCase):
        self.response = self.chall.response(KEY)

    def test_to_partial_json(self):
        assert {} == self.msg.to_partial_json()
        self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
                         self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import HTTP01Response
        assert self.msg == HTTP01Response.from_json(self.jmsg)
        self.assertEqual(
            self.msg, HTTP01Response.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import HTTP01Response
@@ -179,16 +179,15 @@ class HTTP01ResponseTest(unittest.TestCase):
    def test_simple_verify_good_validation(self, mock_get):
        validation = self.chall.validation(KEY)
        mock_get.return_value = mock.MagicMock(text=validation)
        assert self.response.simple_verify(
            self.chall, "local", KEY.public_key())
        mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
                                         timeout=mock.ANY)
        self.assertTrue(self.response.simple_verify(
            self.chall, "local", KEY.public_key()))
        mock_get.assert_called_once_with(self.chall.uri("local"))

    @mock.patch("acme.challenges.requests.get")
    def test_simple_verify_bad_validation(self, mock_get):
        mock_get.return_value = mock.MagicMock(text="!")
        assert not self.response.simple_verify(
            self.chall, "local", KEY.public_key())
        self.assertFalse(self.response.simple_verify(
            self.chall, "local", KEY.public_key()))

    @mock.patch("acme.challenges.requests.get")
    def test_simple_verify_whitespace_validation(self, mock_get):
@@ -196,34 +195,23 @@ class HTTP01ResponseTest(unittest.TestCase):
        mock_get.return_value = mock.MagicMock(
            text=(self.chall.validation(KEY) +
                  HTTP01Response.WHITESPACE_CUTSET))
        assert self.response.simple_verify(
            self.chall, "local", KEY.public_key())
        mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
                                         timeout=mock.ANY)
        self.assertTrue(self.response.simple_verify(
            self.chall, "local", KEY.public_key()))
        mock_get.assert_called_once_with(self.chall.uri("local"))

    @mock.patch("acme.challenges.requests.get")
    def test_simple_verify_connection_error(self, mock_get):
        mock_get.side_effect = requests.exceptions.RequestException
        assert not self.response.simple_verify(
            self.chall, "local", KEY.public_key())
        self.assertFalse(self.response.simple_verify(
            self.chall, "local", KEY.public_key()))

    @mock.patch("acme.challenges.requests.get")
    def test_simple_verify_port(self, mock_get):
        self.response.simple_verify(
            self.chall, domain="local",
            account_public_key=KEY.public_key(), port=8080)
        assert "local:8080" == urllib_parse.urlparse(
            mock_get.mock_calls[0][1][0]).netloc

    @mock.patch("acme.challenges.requests.get")
    def test_simple_verify_timeout(self, mock_get):
        self.response.simple_verify(self.chall, "local", KEY.public_key())
        mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
                                         timeout=30)
        mock_get.reset_mock()
        self.response.simple_verify(self.chall, "local", KEY.public_key(), timeout=1234)
        mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
                                         timeout=1234)
        self.assertEqual("local:8080", urllib_parse.urlparse(
            mock_get.mock_calls[0][1][0]).netloc)


class HTTP01Test(unittest.TestCase):
@@ -239,112 +227,59 @@ class HTTP01Test(unittest.TestCase):
        }

    def test_path(self):
        assert self.msg.path == '/.well-known/acme-challenge/' \
            'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA'
        self.assertEqual(self.msg.path, '/.well-known/acme-challenge/'
                         'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA')

    def test_uri(self):
        assert 'http://example.com/.well-known/acme-challenge/' \
            'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA' == \
            self.msg.uri('example.com')
        self.assertEqual(
            'http://example.com/.well-known/acme-challenge/'
            'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA',
            self.msg.uri('example.com'))

    def test_to_partial_json(self):
        assert self.jmsg == self.msg.to_partial_json()
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import HTTP01
        assert self.msg == HTTP01.from_json(self.jmsg)
        self.assertEqual(self.msg, HTTP01.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import HTTP01
        hash(HTTP01.from_json(self.jmsg))

    def test_good_token(self):
        assert self.msg.good_token
        assert not self.msg.update(token=b'..').good_token
        self.assertTrue(self.msg.good_token)
        self.assertFalse(
            self.msg.update(token=b'..').good_token)


class TLSALPN01ResponseTest(unittest.TestCase):

    def setUp(self):
        from acme.challenges import TLSALPN01
        self.chall = TLSALPN01(
            token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
        self.domain = u'example.com'
        self.domain2 = u'example2.com'

        self.response = self.chall.response(KEY)
        from acme.challenges import TLSALPN01Response
        self.msg = TLSALPN01Response(key_authorization=u'foo')
        self.jmsg = {
            'resource': 'challenge',
            'type': 'tls-alpn-01',
            'keyAuthorization': self.response.key_authorization,
            'keyAuthorization': u'foo',
        }

        from acme.challenges import TLSALPN01
        self.chall = TLSALPN01(token=(b'x' * 16))
        self.response = self.chall.response(KEY)

    def test_to_partial_json(self):
        assert {} == self.response.to_partial_json()
        self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
                         self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import TLSALPN01Response
        assert self.response == TLSALPN01Response.from_json(self.jmsg)
        self.assertEqual(self.msg, TLSALPN01Response.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import TLSALPN01Response
        hash(TLSALPN01Response.from_json(self.jmsg))

    def test_gen_verify_cert(self):
        key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
        cert, key2 = self.response.gen_cert(self.domain, key1)
        assert key1 == key2
        assert self.response.verify_cert(self.domain, cert)

    def test_gen_verify_cert_gen_key(self):
        cert, key = self.response.gen_cert(self.domain)
        assert isinstance(key, OpenSSL.crypto.PKey)
        assert self.response.verify_cert(self.domain, cert)

    def test_verify_bad_cert(self):
        assert not self.response.verify_cert(self.domain,
                                             test_util.load_cert('cert.pem'))

    def test_verify_bad_domain(self):
        key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
        cert, key2 = self.response.gen_cert(self.domain, key1)
        assert key1 == key2
        assert not self.response.verify_cert(self.domain2, cert)

    def test_simple_verify_bad_key_authorization(self):
        key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
        self.response.simple_verify(self.chall, "local", key2.public_key())

    @mock.patch('acme.challenges.TLSALPN01Response.verify_cert', autospec=True)
    def test_simple_verify(self, mock_verify_cert):
        mock_verify_cert.return_value = mock.sentinel.verification
        assert mock.sentinel.verification == self.response.simple_verify(
            self.chall, self.domain, KEY.public_key(),
            cert=mock.sentinel.cert)
        mock_verify_cert.assert_called_once_with(
            self.response, self.domain, mock.sentinel.cert)

    @mock.patch('acme.challenges.socket.gethostbyname')
    @mock.patch('acme.challenges.crypto_util.probe_sni')
    def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
        mock_gethostbyname.return_value = '127.0.0.1'
        self.response.probe_cert('foo.com')
        mock_gethostbyname.assert_called_once_with('foo.com')
        mock_probe_sni.assert_called_once_with(
            host=b'127.0.0.1', port=self.response.PORT, name=b'foo.com',
            alpn_protocols=[b'acme-tls/1'])

        self.response.probe_cert('foo.com', host='8.8.8.8')
        mock_probe_sni.assert_called_with(
            host=b'8.8.8.8', port=mock.ANY, name=b'foo.com',
            alpn_protocols=[b'acme-tls/1'])

    @mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
    def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
        mock_probe_cert.side_effect = errors.Error
        assert not self.response.simple_verify(
            self.chall, self.domain, KEY.public_key())


class TLSALPN01Test(unittest.TestCase):

@@ -358,11 +293,11 @@ class TLSALPN01Test(unittest.TestCase):
        }

    def test_to_partial_json(self):
        assert self.jmsg == self.msg.to_partial_json()
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import TLSALPN01
        assert self.msg == TLSALPN01.from_json(self.jmsg)
        self.assertEqual(self.msg, TLSALPN01.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import TLSALPN01
@@ -371,16 +306,11 @@ class TLSALPN01Test(unittest.TestCase):
    def test_from_json_invalid_token_length(self):
        from acme.challenges import TLSALPN01
        self.jmsg['token'] = jose.encode_b64jose(b'abcd')
        with pytest.raises(jose.DeserializationError):
            TLSALPN01.from_json(self.jmsg)
        self.assertRaises(
            jose.DeserializationError, TLSALPN01.from_json, self.jmsg)

    @mock.patch('acme.challenges.TLSALPN01Response.gen_cert')
    def test_validation(self, mock_gen_cert):
        mock_gen_cert.return_value = ('cert', 'key')
        assert ('cert', 'key') == self.msg.validation(
            KEY, cert_key=mock.sentinel.cert_key, domain=mock.sentinel.domain)
        mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key,
                                              domain=mock.sentinel.domain)
    def test_validation(self):
        self.assertRaises(NotImplementedError, self.msg.validation, KEY)


class DNSTest(unittest.TestCase):
@@ -395,27 +325,24 @@ class DNSTest(unittest.TestCase):
        }

    def test_to_partial_json(self):
        assert self.jmsg == self.msg.to_partial_json()
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import DNS
        assert self.msg == DNS.from_json(self.jmsg)
        self.assertEqual(self.msg, DNS.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import DNS
        hash(DNS.from_json(self.jmsg))

    def test_gen_check_validation(self):
        ec_key_secp384r1 = JWKEC(key=test_util.load_ecdsa_private_key('ec_secp384r1_key.pem'))
        for key, alg in [(KEY, jose.RS256), (ec_key_secp384r1, jose.ES384)]:
            with self.subTest(key=key, alg=alg):
                assert self.msg.check_validation(
                    self.msg.gen_validation(key, alg=alg), key.public_key())
        self.assertTrue(self.msg.check_validation(
            self.msg.gen_validation(KEY), KEY.public_key()))

    def test_gen_check_validation_wrong_key(self):
        key2 = jose.JWKRSA.load(test_util.load_vector('rsa1024_key.pem'))
        assert not self.msg.check_validation(
            self.msg.gen_validation(KEY), key2.public_key())
        self.assertFalse(self.msg.check_validation(
            self.msg.gen_validation(KEY), key2.public_key()))

    def test_check_validation_wrong_payload(self):
        validations = tuple(
@@ -423,32 +350,28 @@ class DNSTest(unittest.TestCase):
            for payload in (b'', b'{}')
        )
        for validation in validations:
            assert not self.msg.check_validation(
                validation, KEY.public_key())
            self.assertFalse(self.msg.check_validation(
                validation, KEY.public_key()))

    def test_check_validation_wrong_fields(self):
        bad_validation = jose.JWS.sign(
            payload=self.msg.update(
                token=b'x' * 20).json_dumps().encode('utf-8'),
            alg=jose.RS256, key=KEY)
        assert not self.msg.check_validation(bad_validation, KEY.public_key())
        self.assertFalse(self.msg.check_validation(
            bad_validation, KEY.public_key()))

    def test_gen_response(self):
        with mock.patch('acme.challenges.DNS.gen_validation') as mock_gen:
            mock_gen.return_value = mock.sentinel.validation
            response = self.msg.gen_response(KEY)
        from acme.challenges import DNSResponse
        assert isinstance(response, DNSResponse)
        assert response.validation == mock.sentinel.validation
        self.assertTrue(isinstance(response, DNSResponse))
        self.assertEqual(response.validation, mock.sentinel.validation)

    def test_validation_domain_name(self):
        assert '_acme-challenge.le.wtf' == self.msg.validation_domain_name('le.wtf')

    def test_validation_domain_name_ecdsa(self):
        ec_key_secp384r1 = JWKEC(key=test_util.load_ecdsa_private_key('ec_secp384r1_key.pem'))
        assert self.msg.check_validation(
            self.msg.gen_validation(ec_key_secp384r1, alg=jose.ES384),
            ec_key_secp384r1.public_key()) is True
        self.assertEqual(
            '_acme-challenge.le.wtf', self.msg.validation_domain_name('le.wtf'))


class DNSResponseTest(unittest.TestCase):
@@ -464,6 +387,8 @@ class DNSResponseTest(unittest.TestCase):
        from acme.challenges import DNSResponse
        self.msg = DNSResponse(validation=self.validation)
        self.jmsg_to = {
            'resource': 'challenge',
            'type': 'dns',
            'validation': self.validation,
        }
        self.jmsg_from = {
@@ -473,31 +398,20 @@ class DNSResponseTest(unittest.TestCase):
        }

    def test_to_partial_json(self):
        assert self.jmsg_to == self.msg.to_partial_json()
        self.assertEqual(self.jmsg_to, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import DNSResponse
        assert self.msg == DNSResponse.from_json(self.jmsg_from)
        self.assertEqual(self.msg, DNSResponse.from_json(self.jmsg_from))

    def test_from_json_hashable(self):
        from acme.challenges import DNSResponse
        hash(DNSResponse.from_json(self.jmsg_from))

    def test_check_validation(self):
        assert self.msg.check_validation(self.chall, KEY.public_key())


class JWSPayloadRFC8555Compliant(unittest.TestCase):
    """Test for RFC8555 compliance of JWS generated from resources/challenges"""
    def test_challenge_payload(self):
        from acme.challenges import HTTP01Response

        challenge_body = HTTP01Response()

        jobj = challenge_body.json_dumps(indent=2).encode()
        # RFC8555 states that challenge responses must have an empty payload.
        assert jobj == b'{}'
        self.assertTrue(
            self.msg.check_validation(self.chall, KEY.public_key()))


if __name__ == '__main__':
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
    unittest.main()  # pragma: no cover
acme/tests/client_test.py (new file, 1293 lines): diff suppressed because it is too large.
Some files were not shown because too many files have changed in this diff.