Compare commits


1 commit

Author: Brad Warren · SHA1: 99935e7343 · Message: quiet and fast · Date: 2019-08-09 11:16:28 -07:00
1144 changed files with 34659 additions and 39351 deletions


@@ -1,18 +0,0 @@
# Pipeline for testing, building, and deploying Certbot 2.0 pre-releases.
trigger: none
pr: none
variables:
# We don't publish our Docker images in this pipeline, but when building them
# for testing, let's use the nightly tag.
dockerTag: nightly
snapBuildTimeout: 5400
stages:
- template: templates/stages/test-and-package-stage.yml
- stage: DeploySnaps
jobs:
- template: templates/jobs/snap-deploy-job.yml
parameters:
snapReleaseChannel: beta
- template: templates/stages/notify-failure-stage.yml


@@ -1,119 +0,0 @@
# Configuring Azure Pipelines with Certbot
All pipelines are defined in `.azure-pipelines`. Currently there are two:
* `.azure-pipelines/main.yml` is the main one, executed on PRs targeting master and on pushes to master,
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly runs for master.
Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common job configuration that can be reused in several pipelines, as in the sketch below.
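For illustration, here is a minimal sketch of a pipeline that consumes one of these templates (the trigger and template path are only examples; the actual pipeline files in `.azure-pipelines` are authoritative):
```
# Example pipeline reusing a shared jobs template.
trigger: none
pr:
  - master
jobs:
  # Template paths are resolved relative to this pipeline file's directory.
  - template: templates/jobs/standard-tests-jobs.yml
```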
Unlike Travis, where Codecov works without any action required, Codecov supports Azure Pipelines
(using the coverage bash utility, not python-coverage for now) only if you provide the Codecov repo token
via the `CODECOV_TOKEN` environment variable. `CODECOV_TOKEN` therefore needs to be set as a secured
environment variable so the main pipeline can publish coverage reports to Codecov.
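As a sketch, a coverage upload step consuming that secured variable might look like the following (the `codecov -F linux` command and the `codecov_token` variable name mirror the secret-variable example later in this document):
```
steps:
  - script: codecov -F linux
    # Secret pipeline variables are not exported to scripts automatically;
    # map the secured variable into the environment the uploader expects.
    env:
      CODECOV_TOKEN: $(codecov_token)
    displayName: Publish coverage to Codecov
```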
This INSTALL.md file explains how to configure Azure Pipelines with Certbot in order to execute the CI/CD logic defined in the `.azure-pipelines` folder.
During these installation steps, warnings describing user access and legal commitments will be displayed like this:
```
!!! ACCESS REQUIRED !!!
```
This document assumes that the Azure DevOps organization is named _certbot_, and that the Azure DevOps project is also named _certbot_.
## Useful links
* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops
## Prerequisites
### Having a GitHub account
Use your GitHub user for a normal GitHub account, or a user that has administrative rights to the GitHub organization if relevant.
### Having an Azure DevOps account
- Go to https://dev.azure.com/, click "Start free with GitHub"
- Login to GitHub
```
!!! ACCESS REQUIRED !!!
Personal user data (email + profile info, in read-only)
```
- Microsoft will create a Live account using the email address referenced for the GitHub account. This account is also linked to the GitHub account (meaning you can log in to it using GitHub authentication)
- Proceed with account registration (birth date, country), and add details about name and contact email
```
!!! ACCESS REQUIRED !!!
Microsoft offers to send commercial communications to this email address
Azure DevOps terms of service need to be accepted
```
_Once logged in to Azure DevOps, the account is ready._
### Installing Azure Pipelines to GitHub
- On GitHub, go to Marketplace
- Select Azure Pipelines, and click "Set up a plan"
- Select Free, then "Install it for free"
- Click "Complete order and begin installation"
```
!!! ACCESS !!!
Azure Pipelines needs RW access on code, RO on metadata, and RW on checks, commit statuses, deployments, issues, and pull requests.
RW access is required to allow updating the pipeline YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines is triggered.
Note however that no admin access is granted: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the security settings around them on GitHub.
Access can be granted for all repositories or only selected ones, which is nice.
```
- Once redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same as the targeted GitHub repo)
- Set the visibility to public to benefit from 10 parallel jobs
```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in terms of being able to check it is valid), and to the resources shared between the GitHub account and Azure Pipelines.
```
_Done. We can move on to pipeline configuration._
## Import an existing pipeline from the `.azure-pipelines` folder
- On Azure DevOps, go to your organization (eg. _certbot_) then your project (eg. _certbot_)
- Click "Pipelines" tab
- Click "New pipeline"
- Where is your code?: select "__Use the classic editor__"
__Warning: do not choose the GitHub option in the "Where is your code?" section. That option triggers an OAuth
permission grant from Azure Pipelines to GitHub in order to set up a GitHub OAuth application. The permissions
requested there are far too broad (admin level on almost everything), while the classic approach does not add any
extra permissions and works perfectly well.__
- Select GitHub in the "Select your repository" section, choose certbot/certbot as the repository and master as the default branch.
- Click on YAML option for "Select a template"
- Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the
"YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click "Save and run" button.
_Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._
## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)
__NB: The following steps assume that you have already set up the YAML pipeline file to
consume, as an environment variable, the secret variable that these steps will create.
For an environment variable named `CODECOV_TOKEN` consuming the pipeline variable `codecov_token`,
this setup would take the following form in the YAML file:__
```
steps:
- script: ./do_something_that_consumes_CODECOV_TOKEN # Eg. `codecov -F windows`
env:
CODECOV_TOKEN: $(codecov_token)
```
To set up a variable that is shared between pipelines, follow the instructions
at
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
When adding variables to a group, don't forget to tick "Keep this value secret"
if it shouldn't be shared publicly.
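For reference, a job consumes such a variable group by listing it under `variables`, as in this minimal sketch (the group name `certbot-common` matches the group used by the pipelines in this repository; `SOME_VARIABLE` is only a placeholder):
```
jobs:
  - job: uses_shared_variables
    variables:
      # Every variable defined in the group becomes available to this job;
      # secret ones must still be mapped into `env` as shown above.
      - group: certbot-common
    pool:
      vmImage: ubuntu-22.04
    steps:
      - bash: echo "Group variables expand with macro syntax: $(SOME_VARIABLE)"
        displayName: Consume a variable from the group
```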


@@ -1,15 +0,0 @@
# Advanced pipeline for running our full test suite on demand.
trigger:
# When changing these triggers, please ensure the documentation under
# "Running tests in CI" is still correct.
- test-*
pr: none
variables:
# We don't publish our Docker images in this pipeline, but when building them
# for testing, let's use the nightly tag.
dockerTag: nightly
snapBuildTimeout: 5400
stages:
- template: templates/stages/test-and-package-stage.yml


@@ -1,8 +0,0 @@
trigger: none
pr:
- master
- '*.x'
jobs:
- template: templates/jobs/standard-tests-jobs.yml


@@ -1,19 +0,0 @@
# Nightly pipeline running each day for master.
trigger: none
pr: none
schedules:
- cron: "30 4 * * *"
displayName: Nightly build
branches:
include:
- master
always: true
variables:
dockerTag: nightly
snapBuildTimeout: 19800
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml


@@ -1,19 +0,0 @@
# Release pipeline to run our full test suite, build artifacts, and deploy them
# for GitHub release tags.
trigger:
tags:
include:
- v*
pr: none
variables:
dockerTag: ${{variables['Build.SourceBranchName']}}
snapBuildTimeout: 5400
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/deploy-stage.yml
parameters:
snapReleaseChannel: candidate
- template: templates/stages/notify-failure-stage.yml


@@ -1,42 +0,0 @@
jobs:
- job: extended_test
variables:
- name: IMAGE_NAME
value: ubuntu-22.04
- name: PYTHON_VERSION
value: 3.10
- group: certbot-common
strategy:
matrix:
linux-boulder-v2-integration-certbot-oldest:
PYTHON_VERSION: 3.7
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v2
linux-boulder-v2-integration-nginx-oldest:
PYTHON_VERSION: 3.7
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v2
linux-boulder-v2-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py39-integration:
PYTHON_VERSION: 3.9
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py310-integration:
PYTHON_VERSION: 3.10
TOXENV: integration
ACME_SERVER: boulder-v2
linux-integration-rfc2136:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.8
TOXENV: integration-dns-rfc2136
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml


@@ -1,218 +0,0 @@
jobs:
- job: docker_build
pool:
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
# The default timeout of 60 minutes is a little low for compiling
# cryptography on ARM architectures.
timeoutInMinutes: 180
steps:
- bash: set -e && tools/docker/build.sh $(dockerTag) $DOCKER_ARCH
displayName: Build the Docker images
# We don't filter for the Docker Hub organization to continue to allow
# easy testing of these scripts on forks.
- bash: |
set -e
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}')
docker save --output images.tar $DOCKER_IMAGES
displayName: Save the Docker images
# If the name of the tar file or artifact changes, the deploy stage will
# also need to be updated.
- bash: set -e && mv images.tar $(Build.ArtifactStagingDirectory)
displayName: Prepare Docker artifact
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: docker_$(DOCKER_ARCH)
displayName: Store Docker artifact
- job: docker_run
dependsOn: docker_build
pool:
vmImage: ubuntu-22.04
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_amd64
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- bash: |
set -ex
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}:{{.Tag}}')
for DOCKER_IMAGE in ${DOCKER_IMAGES}
do docker run --rm "${DOCKER_IMAGE}" plugins --prepare
done
displayName: Run integration tests for Docker images
- job: installer_build
pool:
vmImage: windows-2019
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.9
architecture: x64
addToPath: true
- script: |
python -m venv venv
venv\Scripts\python tools\pipstrap.py
venv\Scripts\python tools\pip_install.py -e windows-installer
displayName: Prepare Windows installer build environment
- script: |
venv\Scripts\construct-windows-installer
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: windows-installer
displayName: Publish Windows installer
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
pool:
vmImage: $(imageName)
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.9
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
- script: |
python -m venv venv
venv\Scripts\python tools\pipstrap.py
venv\Scripts\python tools\pip_install.py -e certbot-ci
env:
PIP_NO_BUILD_ISOLATION: no
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win_amd64.exe
displayName: Run windows installer integration tests
- script: |
set PATH=%ProgramFiles%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run certbot integration tests
- job: snaps_build
pool:
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
SNAP_ARCH: amd64
armhf:
SNAP_ARCH: armhf
arm64:
SNAP_ARCH: arm64
timeoutInMinutes: 0
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadSecureFile@1
name: credentials
inputs:
secureFile: launchpad-credentials
- script: |
set -e
git config --global user.email "$(Build.RequestedForEmail)"
git config --global user.name "$(Build.RequestedFor)"
mkdir -p ~/.local/share/snapcraft/provider/launchpad
cp $(credentials.secureFilePath) ~/.local/share/snapcraft/provider/launchpad/credentials
python3 tools/snap/build_remote.py ALL --archs ${SNAP_ARCH} --timeout $(snapBuildTimeout)
displayName: Build snaps
- script: |
set -e
mv *.snap $(Build.ArtifactStagingDirectory)
mv certbot-dns-*/*.snap $(Build.ArtifactStagingDirectory)
displayName: Prepare artifacts
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: snaps_$(SNAP_ARCH)
displayName: Store snaps artifacts
- job: snap_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-22.04
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends nginx-light snapd
python3 -m venv venv
venv/bin/python tools/pipstrap.py
venv/bin/python tools/pip_install.py -U tox
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps_amd64
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- script: |
set -e
sudo snap install --dangerous --classic snap/certbot_*.snap
displayName: Install Certbot snap
- script: |
set -e
venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
displayName: Run tox
- job: snap_dns_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-22.04
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps_amd64
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- script: |
set -e
python3 -m venv venv
venv/bin/python tools/pipstrap.py
venv/bin/python tools/pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |
set -e
sudo -E venv/bin/pytest certbot-ci/snap_integration_tests/dns_tests --allow-persistent-changes --snap-folder $(Build.SourcesDirectory)/snap --snap-arch amd64
displayName: Test DNS plugins snaps


@@ -1,75 +0,0 @@
# As (somewhat) described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#context,
# each template only has access to the parameters passed into it. To help make
# use of this design, we define snapReleaseChannel without a default value
# which requires the user of this template to define it as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/parameters-name?view=azure-pipelines#remarks.
# This makes the user of this template be explicit while allowing them to
# define their own parameters with defaults that make sense for that context.
parameters:
- name: snapReleaseChannel
type: string
values:
- edge
- beta
- candidate
jobs:
# This job relies on credentials used to publish the Certbot snaps. This
# credential file was created by running:
#
# snapcraft logout
# snapcraft export-login --channels=candidate,beta,edge snapcraft.cfg
# (provide the shared snapcraft credentials when prompted)
#
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file for use in the "nightly" and "release"
# pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#q-how-do-i-authorize-a-secure-file-for-use-in-a-specific-pipeline.
#
# This file has a maximum lifetime of one year and the current file will
# expire on 2023-09-06. The file will need to be updated before then to
# prevent automated deploys from breaking.
#
# Revoking these credentials can be done by changing the password of the
# account used to generate the credentials. See
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
# more info.
- job: publish_snap
pool:
vmImage: ubuntu-22.04
variables:
- group: certbot-common
strategy:
matrix:
amd64:
SNAP_ARCH: amd64
arm32v6:
SNAP_ARCH: armhf
arm64v8:
SNAP_ARCH: arm64
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps_$(SNAP_ARCH)
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- task: DownloadSecureFile@1
name: snapcraftCfg
inputs:
secureFile: snapcraft.cfg
- bash: |
set -e
export SNAPCRAFT_STORE_CREDENTIALS=$(cat "$(snapcraftCfg.secureFilePath)")
for SNAP_FILE in snap/*.snap; do
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
done
displayName: Publish to Snap store


@@ -1,73 +0,0 @@
jobs:
- job: test
variables:
PYTHON_VERSION: 3.10
strategy:
matrix:
macos-py37-cover:
IMAGE_NAME: macOS-12
PYTHON_VERSION: 3.7
TOXENV: py37-cover
macos-py310-cover:
IMAGE_NAME: macOS-12
PYTHON_VERSION: 3.10
TOXENV: py310-cover
windows-py37:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.7
TOXENV: py37-win
windows-py39-cover:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.9
TOXENV: py39-cover-win
windows-integration-certbot:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.9
TOXENV: integration-certbot
linux-oldest-tests-1:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: '{acme,apache,apache-v2,certbot}-oldest'
linux-oldest-tests-2:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: '{dns,nginx}-oldest'
linux-py37:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: py37
linux-py310-cover:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.10
TOXENV: py310-cover
linux-py310-lint:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.10
TOXENV: lint-posix
linux-py310-mypy:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.10
TOXENV: mypy-posix
linux-integration:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: pebble
apache-compat:
IMAGE_NAME: ubuntu-22.04
TOXENV: apache_compat
apacheconftest:
IMAGE_NAME: ubuntu-22.04
TOXENV: apacheconftest-with-pebble
nginxroundtrip:
IMAGE_NAME: ubuntu-22.04
TOXENV: nginxroundtrip
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml
- job: test_sphinx_builds
pool:
vmImage: ubuntu-20.04
steps:
- template: ../steps/sphinx-steps.yml


@@ -1,19 +0,0 @@
stages:
- stage: Changelog
jobs:
- job: prepare
pool:
vmImage: windows-2019
steps:
# If we change the output filename from `release_notes.md`, it should also be changed in tools/create_github_release.py
- bash: |
set -e
CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
displayName: Prepare changelog
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: changelog
displayName: Publish changelog


@@ -1,54 +0,0 @@
parameters:
# We do not define acceptable values for this parameter here as it is passed
# through to ../jobs/snap-deploy-job.yml which does its own sanity checking.
- name: snapReleaseChannel
type: string
default: edge
stages:
- stage: Deploy
jobs:
- template: ../jobs/snap-deploy-job.yml
parameters:
snapReleaseChannel: ${{ parameters.snapReleaseChannel }}
- job: publish_docker
pool:
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_$(DOCKER_ARCH)
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- task: Docker@2
inputs:
command: login
# The credentials used here are for the shared certbotbot account
# on Docker Hub. The credentials are stored in a service account
# which was created by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. The authentication used when
# creating this service account was a personal access token
# rather than a password to bypass 2FA. When Brad set this up,
# Azure Pipelines failed to verify the credentials with an error
# like "access is forbidden with a JWT issued from a personal
# access token", but after saving them without verification, the
# access token worked when the pipeline actually ran. "Grant
# access to all pipelines" should also be checked on the service
# account. The access token can be deleted on Docker Hub if
# these credentials need to be revoked.
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy.sh $(dockerTag) $DOCKER_ARCH
displayName: Deploy the Docker images


@@ -1,19 +0,0 @@
stages:
- stage: On_Failure
jobs:
- job: notify_mattermost
variables:
- group: certbot-common
pool:
vmImage: ubuntu-20.04
steps:
- bash: |
set -e
MESSAGE="\
---\n\
##### Azure Pipeline
*Repo* $(Build.Repository.ID) - *Pipeline* $(Build.DefinitionName) #$(Build.BuildNumber) - *Branch/PR* $(Build.SourceBranchName)\n\
:warning: __Pipeline has failed__: [Link to the build](https://dev.azure.com/$(Build.Repository.ID)/_build/results?buildId=$(Build.BuildId)&view=results)\n\n\
---"
curl -i -X POST --data-urlencode "payload={\"text\":\"${MESSAGE}\"}" "$(MATTERMOST_URL)"
condition: failed()


@@ -1,4 +0,0 @@
stages:
- stage: TestAndPackage
jobs:
- template: ../jobs/extended-tests-jobs.yml


@@ -1,24 +0,0 @@
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends libaugeas0
FINAL_STATUS=0
declare -a FAILED_BUILDS
tools/venv.py
source venv/bin/activate
for doc_path in */docs
do
echo ""
echo "##[group]Building $doc_path"
if ! sphinx-build -W --keep-going -b html $doc_path $doc_path/_build/html; then
FINAL_STATUS=1
FAILED_BUILDS[${#FAILED_BUILDS[@]}]="${doc_path%/docs}"
fi
echo "##[endgroup]"
done
if [[ $FINAL_STATUS -ne 0 ]]; then
echo "##[error]The following builds failed: ${FAILED_BUILDS[*]}"
exit 1
fi
displayName: Build Sphinx Documentation


@@ -1,57 +0,0 @@
steps:
# We run brew update because we've seen attempts to install an older version
# of a package fail. See
# https://github.com/actions/virtual-environments/issues/3165.
- bash: |
set -e
brew update
brew install augeas
condition: startswith(variables['IMAGE_NAME'], 'macOS')
displayName: Install MacOS dependencies
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
python3-dev \
gcc \
libaugeas0 \
libssl-dev \
libffi-dev \
ca-certificates \
nginx-light \
openssl
sudo systemctl stop nginx
condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
displayName: Install Linux dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION)
addToPath: true
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv. The option "-I" is set so when CERTBOT_NO_PIN is also
# set, pip updates dependencies it thinks are already satisfied to avoid some
# problems with its lack of real dependency resolution.
- bash: |
set -e
python3 tools/pipstrap.py
python3 tools/pip_install.py -I tox virtualenv
displayName: Install runtime dependencies
- task: DownloadSecureFile@1
name: testFarmPem
inputs:
secureFile: azure-test-farm.pem
condition: contains(variables['TOXENV'], 'test-farm')
- bash: |
set -e
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
env
python3 -m tox
env:
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
displayName: Run tox

.codecov.yml Normal file

@@ -0,0 +1,18 @@
coverage:
status:
project:
default: off
linux:
flags: linux
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.5
threshold: 0.1
base: auto
windows:
flags: windows
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.6
threshold: 0.1
base: auto


@@ -1,5 +1,2 @@
[run]
omit = */setup.py
[report]
omit = */setup.py


@@ -8,4 +8,5 @@
.git
.tox
venv
venv3
docs


@@ -1,18 +0,0 @@
# https://editorconfig.org/
root = true
[*]
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf
[*.py]
indent_style = space
indent_size = 4
charset = utf-8
max_line_length = 100
[*.yaml]
indent_style = space
indent_size = 2

.envrc

@@ -1,12 +0,0 @@
# This file is just a nicety for developers who use direnv. When you cd under
# the Certbot repo, Certbot's virtual environment will be automatically
# activated and then deactivated when you cd elsewhere. Developers have to have
# direnv set up and run `direnv allow` to allow this file to execute on their
# system. You can find more information at https://direnv.net/.
. venv/bin/activate
# direnv doesn't support modifying PS1 so we unset it to squelch the error
# it'll otherwise print about this being done in the activate script. See
# https://github.com/direnv/direnv/wiki/PS1. If you would like your shell
# prompt to change like it normally does, see
# https://github.com/direnv/direnv/wiki/Python#restoring-the-ps1.
unset PS1

.github/FUNDING.yml vendored

@@ -1 +0,0 @@
custom: https://supporters.eff.org/donate/support-work-on-certbot


@@ -1,6 +0,0 @@
## Pull Request Checklist
- [ ] The Certbot team has recently expressed interest in reviewing a PR for this. If not, this PR may be closed due to our limited resources and the need to prioritize how we spend them.
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `master` section of `certbot/CHANGELOG.md` to include a description of the change being made.
- [ ] Add or update any documentation as needed to support the changes in this PR.
- [ ] Include your name in `AUTHORS.md` if you like.

.gitignore vendored

@@ -4,16 +4,16 @@
build/
dist*/
/venv*/
/kgs/
/.tox/
/releases*/
/log*
letsencrypt.log
certbot.log
poetry.lock
letsencrypt-auto-source/letsencrypt-auto.sig.lzma.base64
# coverage
.coverage
.coverage.*
/htmlcov/
/.vagrant
@@ -26,11 +26,15 @@ tags
\#*#
.idea
.ropeproject
.vscode
# auth --cert-path --chain-path
/*.pem
# letstest
tests/letstest/letest-*/
tests/letstest/*.pem
tests/letstest/venv/
.venv
# pytest cache
@@ -45,18 +49,3 @@ tags
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*
# snap files
.snapcraft
parts
prime
stage
*.snap
snap-constraints.txt
qemu-*
certbot-dns*/certbot-dns*_amd64*.txt
certbot-dns*/certbot-dns*_arm*.txt
/certbot_amd64*.txt
/certbot_arm*.txt
certbot-dns*/snap
snapcraft.cfg


@@ -1,6 +0,0 @@
[settings]
skip_glob=venv*
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400


@@ -8,10 +8,7 @@ jobs=0
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
# CERTBOT COMMENT
# This is needed for pylint to import linter_plugin.py since
# https://github.com/PyCQA/pylint/pull/3396.
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(pylint.config.PYLINTRC))"
#init-hook=
# Profiled execution.
profile=no
@@ -27,11 +24,6 @@ persistent=yes
# usually to register additional checkers.
load-plugins=linter_plugin
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
[MESSAGES CONTROL]
@@ -49,32 +41,10 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
# 4) raise-missing-from makes it an error to raise an exception from except
# block without using explicit exception chaining. While explicit exception
# chaining results in a slightly more informative traceback, I don't think
# it's beneficial enough for us to change all of our current instances and
# give Certbot developers errors about this when they're working on new code
# in the future. You can read more about exception chaining and this pylint
# check at
# https://blog.ram.rachum.com/post/621791438475296768/improving-python-exception-chaining-with.
# 5) wrong-import-order generates false positives and a pylint developer
# suggests that people using isort should disable this check at
# https://github.com/PyCQA/pylint/issues/3817#issuecomment-687892090.
# 6) unspecified-encoding generates errors when encoding is not specified in
# in a call to the built-in open function. This relates more to a design decision
# (unspecified encoding makes the open function use the default encoding of the system)
# than a clear flaw on which a check should be enforced. Anyway the project does
# not need to enforce encoding on files so we disable this check.
# 7) consider-using-f-string is "suggesting" to move to f-string when possible with an error. This
# clearly relates to code design and not to potential defects in the code, let's just ignore that.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order,unspecified-encoding,consider-using-f-string
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used
[REPORTS]
@@ -275,7 +245,7 @@ ignore-mixin-members=yes
# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis
ignored-modules=pkg_resources,confargparse,argparse
ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib
# import errors ignored only in 1.4.4
# https://bitbucket.org/logilab/pylint/commits/cd000904c9e2
@@ -327,6 +297,40 @@ valid-classmethod-first-arg=cls
valid-metaclass-classmethod-first-arg=mcs
[DESIGN]
# Maximum number of arguments for function / method
max-args=6
# Argument names that match this expression will be ignored. Default to name
# with leading underscore
ignored-argument-names=(unused)?_.*|dummy
# Maximum number of locals for function / method body
max-locals=15
# Maximum number of return / yield for function / method body
max-returns=6
# Maximum number of branch for function / method body
max-branches=12
# Maximum number of statements in function / method body
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=12
# Maximum number of attributes for a class (see R0902).
max-attributes=7
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
# Maximum number of public methods for a class (see R0904).
max-public-methods=20
[EXCEPTIONS]
# Exceptions that will emit a warning when being caught. Defaults to

.travis.yml Normal file

@@ -0,0 +1,70 @@
language: python
cache:
directories:
- $HOME/.cache/pip
before_script:
- 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
# On Travis, the fastest parallelization for integration tests has proved to be 4.
- 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
- export TOX_TESTENV_PASSENV=TRAVIS
# Only build pushes to the master branch, PRs, and branches beginning with
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
# simultaneous Travis runs, which speeds turnaround time on review since there
# is a cap on the number of simultaneous runs.
branches:
only:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
- /^\d+\.\d+\.x$/
- /^test-.*$/
# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
not-on-master: &not-on-master
if: NOT (type = push AND branch = master)
# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
matrix:
include:
# Main test suite
- python: "2.7"
env: ACME_SERVER=pebble TOXENV=integration
sudo: required
services: docker
<<: *not-on-master
# container-based infrastructure
sudo: false
addons:
apt:
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
- python-dev
- gcc
- libaugeas0
- libssl-dev
- libffi-dev
- ca-certificates
# For certbot-nginx integration testing
- nginx-light
- openssl
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: "tools/pip_install.py -U codecov tox virtualenv"
script: tox
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
notifications:
email: false


@@ -1,7 +1,6 @@
Authors
=======
* [Aaron Gable](https://github.com/aarongable)
* [Aaron Zirbes](https://github.com/aaronzirbes)
* Aaron Zuehlke
* Ada Lovelace
@@ -17,15 +16,10 @@ Authors
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Alexis Hancock](https://github.com/zoracon)
* [Amir Omidi](https://github.com/aaomidi)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [amplifi](https://github.com/amplifi)
* [Andrew Murray](https://github.com/radarhere)
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [April King](https://github.com/april)
* [asaph](https://github.com/asaph)
* [Axel Beckert](https://github.com/xtaran)
* [Bas](https://github.com/Mechazawa)
@@ -40,9 +34,7 @@ Authors
* [Blake Griffith](https://github.com/cowlicks)
* [Brad Warren](https://github.com/bmw)
* [Brandon Kraft](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/BKreisel)
* [Brian Heim](https://github.com/brianlheim)
* [Cameron Steel](https://github.com/Tugzrida)
* [Brandon Kreisel](https://github.com/kraftbj)
* [Ceesjan Luiten](https://github.com/quinox)
* [Chad Whitacre](https://github.com/whit537)
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
@@ -64,9 +56,7 @@ Authors
* [DanCld](https://github.com/DanCld)
* [Daniel Albers](https://github.com/AID)
* [Daniel Aleksandersen](https://github.com/da2x)
* [Daniel Almasi](https://github.com/almasen)
* [Daniel Convissor](https://github.com/convissor)
* [Daniel "Drex" Drexler](https://github.com/aeturnum)
* [Daniel Huang](https://github.com/dhuang)
* [Dave Guarino](https://github.com/daguar)
* [David cz](https://github.com/dave-cz)
@@ -91,7 +81,6 @@ Authors
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Florian Klink](https://github.com/flokli)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
@@ -110,15 +99,12 @@ Authors
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
* [Henri Salo](https://github.com/fgeek)
* [Henry Chen](https://github.com/henrychen95)
* [Hugo van Kemenade](https://github.com/hugovk)
* [Ingolf Becker](https://github.com/watercrossing)
* [Ivan Nejgebauer](https://github.com/inejge)
* [Jaap Eldering](https://github.com/eldering)
* [Jacob Hoffman-Andrews](https://github.com/jsha)
* [Jacob Sachs](https://github.com/jsachs)
* [Jairo Llopis](https://github.com/Yajo)
* [Jakub Warmuz](https://github.com/kuba)
* [James Balazs](https://github.com/jamesbalazs)
* [James Kasten](https://github.com/jdkasten)
* [Jason Grinblat](https://github.com/ptychomancer)
* [Jay Faulkner](https://github.com/jayofdoom)
@@ -137,13 +123,10 @@ Authors
* [Jonathan Herlin](https://github.com/Jonher937)
* [Jon Walsh](https://github.com/code-tree)
* [Joona Hoikkala](https://github.com/joohoi)
* [Josh McCullough](https://github.com/JoshMcCullough)
* [Josh Soref](https://github.com/jsoref)
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
* [Kane York](https://github.com/riking)
* [Katsuyoshi Ozaki](https://github.com/moratori)
* [Kenichi Maehashi](https://github.com/kmaehashi)
* [Kenneth Skovhede](https://github.com/kenkendk)
* [Kevin Burke](https://github.com/kevinburke)
* [Kevin London](https://github.com/kevinlondon)
@@ -156,13 +139,11 @@ Authors
* [Lior Sabag](https://github.com/liorsbg)
* [Lipis](https://github.com/lipis)
* [lord63](https://github.com/lord63)
* [Lorenzo Fundaró](https://github.com/lfundaro)
* [Luca Beltrame](https://github.com/lbeltrame)
* [Luca Ebach](https://github.com/lucebac)
* [Luca Olivetti](https://github.com/olivluca)
* [Luke Rogers](https://github.com/lukeroge)
* [Maarten](https://github.com/mrtndwrd)
* [Mads Jensen](https://github.com/atombrella)
* [Maikel Martens](https://github.com/krukas)
* [Malte Janduda](https://github.com/MalteJ)
* [Mantas Mikulėnas](https://github.com/grawity)
@@ -178,14 +159,11 @@ Authors
* [Mathieu Leduc-Hamel](https://github.com/mlhamel)
* [Matt Bostock](https://github.com/mattbostock)
* [Matthew Ames](https://github.com/SuperMatt)
* [Matthew W. Thomas](https://github.com/mwt)
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
* [Michael Watters](https://github.com/blackknight36)
* [Michal Moravec](https://github.com/Majkl578)
* [Michal Papis](https://github.com/mpapis)
* [Mickaël Schoentgen](https://github.com/BoboTiG)
* [Minn Soe](https://github.com/MinnSoe)
* [Min RK](https://github.com/minrk)
* [Miquel Ruiz](https://github.com/miquelruiz)
@@ -203,7 +181,6 @@ Authors
* [osirisinferi](https://github.com/osirisinferi)
* Patrick Figel
* [Patrick Heppler](https://github.com/PatrickHeppler)
* [Paul Buonopane](https://github.com/Zenexer)
* [Paul Feitzinger](https://github.com/pfeyz)
* [Pavan Gupta](https://github.com/pavgup)
* [Pavel Pavlov](https://github.com/ghost355)
@@ -216,8 +193,6 @@ Authors
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Preston Locke](https://github.com/Preston12321)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
@@ -225,7 +200,6 @@ Authors
* [Richard Barnes](https://github.com/r-barnes)
* [Richard Panek](https://github.com/kernelpanek)
* [Robert Buchholz](https://github.com/rbu)
* [Robert Dailey](https://github.com/pahrohfit)
* [Robert Habermann](https://github.com/frennkie)
* [Robert Xiao](https://github.com/nneonneo)
* [Roland Shoemaker](https://github.com/rolandshoemaker)
@@ -251,11 +225,8 @@ Authors
* [Spencer Bliven](https://github.com/sbliven)
* [Stacey Sheldon](https://github.com/solidgoldbomb)
* [Stavros Korokithakis](https://github.com/skorokithakis)
* [Ștefan Talpalaru](https://github.com/stefantalpalaru)
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [taixx046](https://github.com/taixx046)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)
@@ -280,7 +251,6 @@ Authors
* [Wilfried Teiken](https://github.com/wteiken)
* [Willem Fibbe](https://github.com/fibbers)
* [William Budington](https://github.com/Hainish)
* [Will Greenberg](https://github.com/wgreenberg)
* [Will Newby](https://github.com/willnewby)
* [Will Oller](https://github.com/willoller)
* [Yan](https://github.com/diracdeltas)
@@ -288,7 +258,5 @@ Authors
* [Yomna](https://github.com/ynasser)
* [Yoni Jah](https://github.com/yonjah)
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
* [Yuseong Cho](https://github.com/g6123)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)
* [Shahar Naveh](https://github.com/ShaharNaveh)


@@ -1 +0,0 @@
certbot/CHANGELOG.md

CHANGELOG.md Normal file

File diff suppressed because it is too large


@@ -11,7 +11,7 @@ to the Sphinx generated docs is provided below.
[1] https://github.com/blog/1184-contributing-guidelines
[2] https://docutils.sourceforge.io/docs/user/rst/quickref.html#hyperlink-targets
[2] http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets
-->


@@ -1,21 +1,21 @@
# This Dockerfile builds an image for development.
FROM ubuntu:focal
FROM debian:buster
# Note: this only exposes the port to other docker containers.
EXPOSE 80 443
WORKDIR /opt/certbot/src
# TODO: Install Apache/Nginx for plugin development.
COPY . .
RUN apt-get update && \
DEBIAN_FRONTEND=noninteractive apt-get install apache2 git python3-dev \
python3-venv gcc libaugeas0 libssl-dev libffi-dev ca-certificates \
openssl nginx-light -y --no-install-recommends && \
apt-get install apache2 git nginx-light -y && \
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
RUN VENV_NAME="../venv" python3 tools/venv.py
RUN VENV_NAME="../venv" python tools/venv.py
ENV PATH /opt/certbot/venv/bin:$PATH


@@ -7,7 +7,7 @@ questions.
## My operating system is (include version):
## I installed Certbot with (snap, OS package manager, pip, certbot-auto, etc):
## I installed Certbot with (certbot-auto, OS package manager, pip, etc):
## I ran this command and it produced this output:


@@ -1,11 +1,9 @@
include README.rst
include CHANGELOG.md
include CONTRIBUTING.md
include LICENSE.txt
include linter_plugin.py
recursive-include docs *
recursive-include examples *
recursive-include certbot/tests/testdata *
recursive-include tests *.py
include certbot/ssl-dhparams.pem
include certbot/py.typed
global-exclude __pycache__
global-exclude *.py[cod]


@@ -1 +0,0 @@
certbot/README.rst

README.rst Normal file

@@ -0,0 +1,131 @@
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
Certbot is part of EFF's effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identity of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Let's Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Let's Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so there's no need to arrange payment.
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you'll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
Certbot is meant to be run directly on your web server, not on your personal computer. If you're using a hosted service and don't have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let's Encrypt.
Certbot is a fully-featured, extensible client for the Let's
Encrypt CA (or any other CA that speaks the `ACME
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
protocol) that can automate the tasks of obtaining certificates and
configuring webservers to use them. This client runs on Unix-based operating
systems.
To see the changes made to Certbot between versions please refer to our
`changelog <https://github.com/certbot/certbot/blob/master/CHANGELOG.md>`_.
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
depending on install method. Instructions on the Internet, and some pieces of the
software, may still refer to this older name.
Contributing
------------
If you'd like to contribute to this project please read `Developer Guide
<https://certbot.eff.org/docs/contributing.html>`_.
This project is governed by `EFF's Public Projects Code of Conduct <https://www.eff.org/pages/eppcode>`_.
.. _installation:
How to run the client
---------------------
The easiest way to install and run Certbot is by visiting `certbot.eff.org`_,
where you can find the correct instructions for many web server and OS
combinations. For more information, see `Get Certbot
<https://certbot.eff.org/docs/install.html>`_.
.. _certbot.eff.org: https://certbot.eff.org/
Understanding the client in more depth
--------------------------------------
To understand what the client is doing in detail, it's important to
understand the way it uses plugins. Please see the `explanation of
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
the User Guide.
Links
=====
.. Do not modify this comment unless you know what you're doing. tag:links-begin
Documentation: https://certbot.eff.org/docs
Software project: https://github.com/certbot/certbot
Notes for developers: https://certbot.eff.org/docs/contributing.html
Main Website: https://certbot.eff.org
Let's Encrypt Website: https://letsencrypt.org
Community: https://community.letsencrypt.org
ACME spec: http://ietf-wg-acme.github.io/acme/
ACME working area in github: https://github.com/ietf-wg-acme/acme
|build-status| |coverage| |docs| |container|
.. |build-status| image:: https://travis-ci.com/certbot/certbot.svg?branch=master
:target: https://travis-ci.com/certbot/certbot
:alt: Travis CI status
.. |coverage| image:: https://codecov.io/gh/certbot/certbot/branch/master/graph/badge.svg
:target: https://codecov.io/gh/certbot/certbot
:alt: Coverage status
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
:target: https://readthedocs.org/projects/letsencrypt/
:alt: Documentation status
.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
:target: https://quay.io/repository/letsencrypt/letsencrypt
:alt: Docker Repository on Quay.io
.. Do not modify this comment unless you know what you're doing. tag:links-end
System Requirements
===================
See https://certbot.eff.org/docs/install.html#system-requirements.
.. Do not modify this comment unless you know what you're doing. tag:intro-end
.. Do not modify this comment unless you know what you're doing. tag:features-begin
Current Features
=====================
* Supports multiple web servers:
- apache/2.x
- nginx/0.8.48+
- webroot (adds files to webroot directories in order to prove control of
domains and obtain certs)
- standalone (runs its own simple webserver to prove you control a domain)
- other server software via `third party plugins <https://certbot.eff.org/docs/using.html#third-party-plugins>`_
* The private key is generated locally on your system.
* Can talk to the Let's Encrypt CA or optionally to other ACME
compliant services.
* Can get domain-validated (DV) certificates.
* Can revoke certificates.
* Adjustable RSA key bit-length (2048 (default), 4096, ...).
* Can optionally install a http -> https redirect, so your site effectively
runs https only (Apache only)
* Fully automated.
* Configuration changes are logged and can be reverted.
* Supports an interactive text UI, or can be driven entirely from the
command line.
* Free and Open Source Software, made with Python.
.. Do not modify this comment unless you know what you're doing. tag:features-end
For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.


@@ -3,7 +3,4 @@ include README.rst
include pytest.ini
recursive-include docs *
recursive-include examples *
recursive-include tests *
include acme/py.typed
global-exclude __pycache__
global-exclude *.py[cod]
recursive-include acme/testdata *


@@ -2,16 +2,18 @@
This module is an implementation of the `ACME protocol`_.
.. _`ACME protocol`: https://datatracker.ietf.org/doc/html/rfc8555
.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme
"""
import sys
import warnings
# This code exists to keep backwards compatibility with people using acme.jose
# before it became the standalone josepy package.
#
# It is based on
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
import josepy as jose
for mod in list(sys.modules):
@@ -19,3 +21,30 @@ for mod in list(sys.modules):
# preserved (acme.jose.* is josepy.*)
if mod == 'josepy' or mod.startswith('josepy.'):
sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]
# This class takes a similar approach to the cryptography project to deprecate attributes
# in public modules. See the _ModuleWithDeprecation class here:
# https://github.com/pyca/cryptography/blob/91105952739442a74582d3e62b3d2111365b0dc7/src/cryptography/utils.py#L129
class _TLSSNI01DeprecationModule(object):
"""
Internal class delegating to a module, and displaying warnings when
attributes related to TLS-SNI-01 are accessed.
"""
def __init__(self, module):
self.__dict__['_module'] = module
def __getattr__(self, attr):
if 'TLSSNI01' in attr:
warnings.warn('{0} attribute is deprecated, and will be removed soon.'.format(attr),
DeprecationWarning, stacklevel=2)
return getattr(self._module, attr)
def __setattr__(self, attr, value): # pragma: no cover
setattr(self._module, attr, value)
def __delattr__(self, attr): # pragma: no cover
delattr(self._module, attr)
def __dir__(self): # pragma: no cover
return ['_module'] + dir(self._module)


@@ -1,64 +1,45 @@
"""ACME Identifier Validation Challenges."""
import abc
import codecs
import functools
import hashlib
import logging
import socket
from typing import cast
from typing import Any
from typing import Dict
from typing import Mapping
from typing import Optional
from typing import Tuple
from typing import Type
from typing import TypeVar
from typing import Union
import warnings
import sys
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives import hashes # type: ignore
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL
import OpenSSL
import requests
import six
from acme import crypto_util
from acme import errors
from acme import crypto_util
from acme import fields
with warnings.catch_warnings():
warnings.filterwarnings("ignore", category=DeprecationWarning)
from acme.mixins import ResourceMixin
from acme.mixins import TypeMixin
from acme import _TLSSNI01DeprecationModule
logger = logging.getLogger(__name__)
GenericChallenge = TypeVar('GenericChallenge', bound='Challenge')
class Challenge(jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge."""
TYPES: Dict[str, Type['Challenge']] = {}
TYPES = {} # type: dict
@classmethod
def from_json(cls: Type[GenericChallenge],
jobj: Mapping[str, Any]) -> Union[GenericChallenge, 'UnrecognizedChallenge']:
def from_json(cls, jobj):
try:
return cast(GenericChallenge, super().from_json(jobj))
return super(Challenge, cls).from_json(jobj)
except jose.UnrecognizedTypeError as error:
logger.debug(error)
return UnrecognizedChallenge.from_json(jobj)
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
class ChallengeResponse(jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge response."""
TYPES: Dict[str, Type['ChallengeResponse']] = {}
TYPES = {} # type: dict
resource_type = 'challenge'
with warnings.catch_warnings():
warnings.filterwarnings('ignore', 'resource attribute in acme.fields', DeprecationWarning)
resource: str = fields.resource(resource_type)
resource = fields.Resource(resource_type)
class UnrecognizedChallenge(Challenge):
@@ -73,17 +54,17 @@ class UnrecognizedChallenge(Challenge):
:ivar jobj: Original JSON decoded object.
"""
jobj: Dict[str, Any]
def __init__(self, jobj: Mapping[str, Any]) -> None:
super().__init__()
def __init__(self, jobj):
super(UnrecognizedChallenge, self).__init__()
object.__setattr__(self, "jobj", jobj)
def to_partial_json(self) -> Dict[str, Any]:
return self.jobj # pylint: disable=no-member
def to_partial_json(self):
# pylint: disable=no-member
return self.jobj
@classmethod
def from_json(cls, jobj: Mapping[str, Any]) -> 'UnrecognizedChallenge':
def from_json(cls, jobj):
return cls(jobj)
@@ -97,13 +78,13 @@ class _TokenChallenge(Challenge):
"""Minimum size of the :attr:`token` in bytes."""
# TODO: acme-spec doesn't specify token as base64-encoded value
token: bytes = jose.field(
token = jose.Field(
"token", encoder=jose.encode_b64jose, decoder=functools.partial(
jose.decode_b64jose, size=TOKEN_SIZE, minimum=True))
# XXX: rename to ~token_good_for_url
@property
def good_token(self) -> bool: # XXX: @token.decoder
def good_token(self): # XXX: @token.decoder
"""Is `token` good?
.. todo:: acme-spec wants "It MUST NOT contain any non-ASCII
@@ -120,13 +101,13 @@ class _TokenChallenge(Challenge):
class KeyAuthorizationChallengeResponse(ChallengeResponse):
"""Response to Challenges based on Key Authorization.
:param str key_authorization:
:param unicode key_authorization:
"""
key_authorization: str = jose.field("keyAuthorization")
key_authorization = jose.Field("keyAuthorization")
thumbprint_hash_function = hashes.SHA256
def verify(self, chall: 'KeyAuthorizationChallenge', account_public_key: jose.JWK) -> bool:
def verify(self, chall, account_public_key):
"""Verify the key authorization.
:param KeyAuthorization chall: Challenge that corresponds to
@@ -158,39 +139,37 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
return True
def to_partial_json(self) -> Dict[str, Any]:
jobj = super().to_partial_json()
def to_partial_json(self):
jobj = super(KeyAuthorizationChallengeResponse, self).to_partial_json()
jobj.pop('keyAuthorization', None)
return jobj
# TODO: Make this method a generic of K (bound=KeyAuthorizationChallenge), response_cls of type
# Type[K] and use it in response/response_and_validation return types once Python 3.6 support is
# dropped (do not support generic ABC classes, see https://github.com/python/typing/issues/449).
class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
@six.add_metaclass(abc.ABCMeta)
class KeyAuthorizationChallenge(_TokenChallenge):
"""Challenge based on Key Authorization.
:param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
that will be used to generate ``response``.
that will be used to generate `response`.
:param str typ: type of the challenge
"""
typ: str = NotImplemented
response_cls: Type[KeyAuthorizationChallengeResponse] = NotImplemented
typ = NotImplemented
response_cls = NotImplemented
thumbprint_hash_function = (
KeyAuthorizationChallengeResponse.thumbprint_hash_function)
def key_authorization(self, account_key: jose.JWK) -> str:
def key_authorization(self, account_key):
"""Generate Key Authorization.
:param JWK account_key:
:rtype str:
:rtype unicode:
"""
return self.encode("token") + "." + jose.b64encode(
account_key.thumbprint(
hash_function=self.thumbprint_hash_function)).decode()
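The `key_authorization` helper above concatenates the base64url token with the account key's SHA-256 JWK thumbprint. A small sketch of that construction; the key file path is a hypothetical placeholder for any RSA account key in PEM form:

```python
import josepy as jose
from acme import challenges

# Hypothetical path to an RSA account key.
with open("account_key.pem", "rb") as f:
    account_key = jose.JWKRSA.load(f.read())

chall = challenges.HTTP01(token=b"x" * 16)
key_authz = chall.key_authorization(account_key)

# base64url(token) "." base64url(SHA-256 JWK thumbprint of the account key)
token_part, thumbprint_part = key_authz.split(".")
assert token_part == chall.encode("token")
assert thumbprint_part == jose.b64encode(
    account_key.thumbprint(hash_function=chall.thumbprint_hash_function)
).decode()
```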
def response(self, account_key: jose.JWK) -> KeyAuthorizationChallengeResponse:
def response(self, account_key):
"""Generate response to the challenge.
:param JWK account_key:
@@ -203,7 +182,7 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
key_authorization=self.key_authorization(account_key))
@abc.abstractmethod
def validation(self, account_key: jose.JWK, **kwargs: Any) -> Any:
def validation(self, account_key, **kwargs):
"""Generate validation for the challenge.
Subclasses must implement this method, but they are likely to
@@ -217,8 +196,7 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
"""
raise NotImplementedError() # pragma: no cover
def response_and_validation(self, account_key: jose.JWK, *args: Any, **kwargs: Any
) -> Tuple[KeyAuthorizationChallengeResponse, Any]:
def response_and_validation(self, account_key, *args, **kwargs):
"""Generate response and validation.
Convenience function that return results of `response` and
@@ -237,14 +215,14 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
"""ACME dns-01 challenge response."""
typ = "dns-01"
def simple_verify(self, chall: 'DNS01', domain: str, account_public_key: jose.JWK) -> bool: # pylint: disable=unused-argument
def simple_verify(self, chall, domain, account_public_key): # pylint: disable=unused-argument
"""Simple verify.
This method no longer checks DNS records and is a simple wrapper
around `KeyAuthorizationChallengeResponse.verify`.
:param challenges.DNS01 chall: Corresponding challenge.
:param str domain: Domain name being verified.
:param unicode domain: Domain name being verified.
:param JWK account_public_key: Public key for the key pair
being authorized.
@@ -259,7 +237,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
return verified
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class DNS01(KeyAuthorizationChallenge):
"""ACME dns-01 challenge."""
response_cls = DNS01Response
@@ -268,24 +246,23 @@ class DNS01(KeyAuthorizationChallenge):
LABEL = "_acme-challenge"
"""Label clients prepend to the domain name being validated."""
def validation(self, account_key: jose.JWK, **unused_kwargs: Any) -> str:
def validation(self, account_key, **unused_kwargs):
"""Generate validation.
:param JWK account_key:
:rtype: str
:rtype: unicode
"""
return jose.b64encode(hashlib.sha256(self.key_authorization(
account_key).encode("utf-8")).digest()).decode()
def validation_domain_name(self, name: str) -> str:
def validation_domain_name(self, name):
"""Domain name for TXT validation record.
:param str name: Domain name being validated.
:rtype: str
:param unicode name: Domain name being validated.
"""
return f"{self.LABEL}.{name}"
return "{0}.{1}".format(self.LABEL, name)
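Putting the two methods above together: the TXT record provisioned for dns-01 lives at `_acme-challenge.<domain>` and holds the base64url-encoded SHA-256 digest of the key authorization. A minimal sketch, again with a placeholder account key path:

```python
import hashlib
import josepy as jose
from acme import challenges

with open("account_key.pem", "rb") as f:  # hypothetical path
    account_key = jose.JWKRSA.load(f.read())

chall = challenges.DNS01(token=b"x" * 16)

# TXT record name: fixed label prepended to the domain being validated.
assert chall.validation_domain_name("example.com") == "_acme-challenge.example.com"

# TXT record value: base64url-encoded SHA-256 digest of the key authorization.
expected = jose.b64encode(
    hashlib.sha256(chall.key_authorization(account_key).encode("utf-8")).digest()
).decode()
assert chall.validation(account_key) == expected
```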
@ChallengeResponse.register
@@ -304,12 +281,11 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
WHITESPACE_CUTSET = "\n\r\t "
"""Whitespace characters which should be ignored at the end of the body."""
def simple_verify(self, chall: 'HTTP01', domain: str, account_public_key: jose.JWK,
port: Optional[int] = None) -> bool:
def simple_verify(self, chall, domain, account_public_key, port=None):
"""Simple verify.
:param challenges.SimpleHTTP chall: Corresponding challenge.
:param str domain: Domain name being verified.
:param unicode domain: Domain name being verified.
:param JWK account_public_key: Public key for the key pair
being authorized.
:param int port: Port used in the validation.
@@ -334,19 +310,10 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
uri = chall.uri(domain)
logger.debug("Verifying %s at %s...", chall.typ, uri)
try:
http_response = requests.get(uri, verify=False)
http_response = requests.get(uri)
except requests.exceptions.RequestException as error:
logger.error("Unable to reach %s: %s", uri, error)
return False
# By default, http_response.text will try to guess the encoding to use
# when decoding the response to Python unicode strings. This guesswork
# is error prone. RFC 8555 specifies that HTTP-01 responses should be
# key authorizations with possible trailing whitespace. Since key
# authorizations must be composed entirely of the base64url alphabet
# plus ".", we tell requests that the response should be ASCII. See
# https://datatracker.ietf.org/doc/html/rfc8555#section-8.3 for more
# info.
http_response.encoding = "ascii"
logger.debug("Received %s: %s. Headers: %s", http_response,
http_response.text, http_response.headers)
@@ -360,7 +327,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
return True
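As the encoding comment above notes, RFC 8555 section 8.3 expects the served body to be exactly the key authorization, with only trailing whitespace tolerated; `WHITESPACE_CUTSET` is what gets stripped before the comparison. A tiny illustration with a made-up key authorization string:

```python
from acme import challenges

resp = challenges.HTTP01Response(key_authorization="sometoken.somethumbprint")

# The body the validation server fetches may carry a trailing newline, which
# is stripped with WHITESPACE_CUTSET before comparing to key_authorization.
served_body = "sometoken.somethumbprint\r\n"
assert (served_body.rstrip(challenges.HTTP01Response.WHITESPACE_CUTSET)
        == resp.key_authorization)
```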
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class HTTP01(KeyAuthorizationChallenge):
"""ACME http-01 challenge."""
response_cls = HTTP01Response
@@ -370,40 +337,43 @@ class HTTP01(KeyAuthorizationChallenge):
"""URI root path for the server provisioned resource."""
@property
def path(self) -> str:
def path(self):
"""Path (starting with '/') for provisioned resource.
:rtype: str
:rtype: string
"""
return '/' + self.URI_ROOT_PATH + '/' + self.encode('token')
def uri(self, domain: str) -> str:
def uri(self, domain):
"""Create an URI to the provisioned resource.
Forms an URI to the HTTPS server provisioned resource
(containing :attr:`~SimpleHTTP.token`).
:param str domain: Domain name being verified.
:rtype: str
:param unicode domain: Domain name being verified.
:rtype: string
"""
return "http://" + domain + self.path
def validation(self, account_key: jose.JWK, **unused_kwargs: Any) -> str:
def validation(self, account_key, **unused_kwargs):
"""Generate validation.
:param JWK account_key:
:rtype: str
:rtype: unicode
"""
return self.key_authorization(account_key)
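For http-01, the three properties above combine into: serve the key authorization at `http://<domain>/<URI_ROOT_PATH>/<token>`. A short sketch, with the same placeholder account key path as before:

```python
import josepy as jose
from acme import challenges

with open("account_key.pem", "rb") as f:  # hypothetical path
    account_key = jose.JWKRSA.load(f.read())

chall = challenges.HTTP01(token=b"x" * 16)

# Provisioned resource path and full URI on the validated domain.
assert chall.path == "/" + challenges.HTTP01.URI_ROOT_PATH + "/" + chall.encode("token")
assert chall.uri("example.com") == "http://example.com" + chall.path

# The body served at that URI is simply the key authorization string.
assert chall.validation(account_key) == chall.key_authorization(account_key)
```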
@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""ACME tls-alpn-01 challenge response."""
typ = "tls-alpn-01"
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
"""ACME tls-sni-01 challenge response."""
typ = "tls-sni-01"
DOMAIN_SUFFIX = b".acme.invalid"
"""Domain name suffix."""
PORT = 443
"""Verification port as defined by the protocol.
@@ -413,19 +383,28 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
@property
def z(self): # pylint: disable=invalid-name
"""``z`` value used for verification.
:rtype bytes:
"""
return hashlib.sha256(
self.key_authorization.encode("utf-8")).hexdigest().lower().encode()
@property
def h(self) -> bytes:
"""Hash value stored in challenge certificate"""
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
def z_domain(self):
"""Domain name used for verification, generated from `z`.
def gen_cert(self, domain: str, key: Optional[crypto.PKey] = None, bits: int = 2048
) -> Tuple[crypto.X509, crypto.PKey]:
"""Generate tls-alpn-01 certificate.
:rtype bytes:
"""
return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX
def gen_cert(self, key=None, bits=2048):
"""Generate tls-sni-01 certificate.
:param str domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey key: Optional private key used in
certificate generation. If not provided (``None``), then
fresh key will be generated.
@@ -435,38 +414,32 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
if key is None:
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, bits)
key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
return crypto_util.gen_ss_cert(key, [
# z_domain is too big to fit into CN, hence first dummy domain
'dummy', self.z_domain.decode()], force_san=True), key
der_value = b"DER:" + codecs.encode(self.h, 'hex')
acme_extension = crypto.X509Extension(self.ID_PE_ACME_IDENTIFIER_V1,
critical=True, value=der_value)
def probe_cert(self, domain, **kwargs):
"""Probe tls-sni-01 challenge certificate.
return crypto_util.gen_ss_cert(key, [domain], force_san=True,
extensions=[acme_extension]), key
def probe_cert(self, domain: str, host: Optional[str] = None,
port: Optional[int] = None) -> crypto.X509:
"""Probe tls-alpn-01 challenge certificate.
:param str domain: domain being validated, required.
:param str host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
:param unicode domain:
"""
if host is None:
# TODO: domain is not necessary if host is provided
if "host" not in kwargs:
host = socket.gethostbyname(domain)
logger.debug('%s resolved to %s', domain, host)
if port is None:
port = self.PORT
kwargs["host"] = host
return crypto_util.probe_sni(host=host.encode(), port=port, name=domain.encode(),
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
kwargs.setdefault("port", self.PORT)
kwargs["name"] = self.z_domain
# TODO: try different methods?
return crypto_util.probe_sni(**kwargs)
def verify_cert(self, domain: str, cert: crypto.X509) -> bool:
"""Verify tls-alpn-01 challenge certificate.
def verify_cert(self, cert):
"""Verify tls-sni-01 challenge certificate.
:param str domain: Domain name being validated.
:param OpensSSL.crypto.X509 cert: Challenge certificate.
:returns: Whether the certificate was successfully verified.
@@ -474,44 +447,28 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
# pylint: disable=protected-access
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
# Type ignore needed due to
# https://github.com/pyca/pyopenssl/issues/730.
logger.debug('Certificate %s. SANs: %s',
cert.digest('sha256'), names)
if len(names) != 1 or names[0].lower() != domain.lower():
return False
sans = crypto_util._pyopenssl_cert_or_req_san(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), sans)
return self.z_domain.decode() in sans
for i in range(cert.get_extension_count()):
ext = cert.get_extension(i)
# FIXME: assume this is the ACME extension. Currently there is no
# way to get full OID of an unknown extension from pyopenssl.
if ext.get_short_name() == b'UNDEF':
data = ext.get_data()
return data == self.h
return False
# pylint: disable=too-many-arguments
def simple_verify(self, chall: 'TLSALPN01', domain: str, account_public_key: jose.JWK,
cert: Optional[crypto.X509] = None, host: Optional[str] = None,
port: Optional[int] = None) -> bool:
def simple_verify(self, chall, domain, account_public_key,
cert=None, **kwargs):
"""Simple verify.
Verify ``validation`` using ``account_public_key``, optionally
probe tls-alpn-01 certificate and check using `verify_cert`.
probe tls-sni-01 certificate and check using `verify_cert`.
:param .challenges.TLSALPN01 chall: Corresponding challenge.
:param .challenges.TLSSNI01 chall: Corresponding challenge.
:param str domain: Domain name being validated.
:param JWK account_public_key:
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
provided (``None``) certificate will be retrieved using
`probe_cert`.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
:returns: ``True`` if and only if client's control of the domain has been verified.
:returns: ``True`` iff client's control of the domain has been
verified.
:rtype: bool
"""
@@ -521,25 +478,27 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
if cert is None:
try:
cert = self.probe_cert(domain=domain, host=host, port=port)
cert = self.probe_cert(domain=domain, **kwargs)
except errors.Error as error:
logger.debug(str(error), exc_info=True)
return False
return self.verify_cert(domain, cert)
return self.verify_cert(cert)
@Challenge.register # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
"""ACME tls-alpn-01 challenge."""
response_cls = TLSALPN01Response
class TLSSNI01(KeyAuthorizationChallenge):
"""ACME tls-sni-01 challenge."""
response_cls = TLSSNI01Response
typ = response_cls.typ
def validation(self, account_key: jose.JWK, **kwargs: Any) -> Tuple[crypto.X509, crypto.PKey]:
# boulder#962, ietf-wg-acme#22
#n = jose.Field("n", encoder=int, decoder=int)
def validation(self, account_key, **kwargs):
"""Generate validation.
:param JWK account_key:
:param str domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey cert_key: Optional private key used
in certificate generation. If not provided (``None``), then
fresh key will be generated.
@@ -547,24 +506,34 @@ class TLSALPN01(KeyAuthorizationChallenge):
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
# TODO: Remove cast when response() is generic.
return cast(TLSALPN01Response, self.response(account_key)).gen_cert(
key=kwargs.get('cert_key'),
domain=cast(str, kwargs.get('domain')))
return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))
@staticmethod
def is_supported() -> bool:
"""
Check if TLS-ALPN-01 challenge is supported on this machine.
This implies that a recent version of OpenSSL is installed (>= 1.0.2),
or a recent cryptography version shipped with the OpenSSL library is installed.
:returns: ``True`` if TLS-ALPN-01 is supported on this machine, ``False`` otherwise.
:rtype: bool
@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""ACME TLS-ALPN-01 challenge response.
"""
return (hasattr(SSL.Connection, "set_alpn_protos")
and hasattr(SSL.Context, "set_alpn_select_callback"))
This class only allows initiating a TLS-ALPN-01 challenge returned from the
CA. Full support for responding to TLS-ALPN-01 challenges by generating and
serving the expected response certificate is not currently provided.
"""
typ = "tls-alpn-01"
@Challenge.register # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
"""ACME tls-alpn-01 challenge.
This class simply allows parsing the TLS-ALPN-01 challenge returned from
the CA. Full TLS-ALPN-01 support is not currently provided.
"""
typ = "tls-alpn-01"
response_cls = TLSALPN01Response
def validation(self, account_key, **kwargs):
"""Generate validation for the challenge."""
raise NotImplementedError()
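The `is_supported` helper in the newer side of this hunk only checks that the installed pyOpenSSL exposes the ALPN hooks the challenge needs (per the docstring above, OpenSSL >= 1.0.2). A sketch of how a caller might gate on it:

```python
from acme import challenges

# TLS-ALPN-01 needs pyOpenSSL built against an OpenSSL new enough to provide
# the ALPN callbacks (set_alpn_protos / set_alpn_select_callback).
if challenges.TLSALPN01.is_supported():
    print("TLS-ALPN-01 is usable on this machine")
else:
    print("TLS-ALPN-01 is not supported; fall back to http-01 or dns-01")
```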
@Challenge.register
@@ -575,8 +544,7 @@ class DNS(_TokenChallenge):
LABEL = "_acme-challenge"
"""Label clients prepend to the domain name being validated."""
def gen_validation(self, account_key: jose.JWK, alg: jose.JWASignature = jose.RS256,
**kwargs: Any) -> jose.JWS:
def gen_validation(self, account_key, alg=jose.RS256, **kwargs):
"""Generate validation.
:param .JWK account_key: Private account key.
@@ -590,7 +558,7 @@ class DNS(_TokenChallenge):
payload=self.json_dumps(sort_keys=True).encode('utf-8'),
key=account_key, alg=alg, **kwargs)
def check_validation(self, validation: jose.JWS, account_public_key: jose.JWK) -> bool:
def check_validation(self, validation, account_public_key):
"""Check validation.
:param JWS validation:
@@ -607,7 +575,7 @@ class DNS(_TokenChallenge):
logger.debug("Checking validation for DNS failed: %s", error)
return False
def gen_response(self, account_key: jose.JWK, **kwargs: Any) -> 'DNSResponse':
def gen_response(self, account_key, **kwargs):
"""Generate response.
:param .JWK account_key: Private account key.
@@ -616,12 +584,13 @@ class DNS(_TokenChallenge):
:rtype: DNSResponse
"""
return DNSResponse(validation=self.gen_validation(account_key, **kwargs))
return DNSResponse(validation=self.gen_validation(
account_key, **kwargs))
def validation_domain_name(self, name: str) -> str:
def validation_domain_name(self, name):
"""Domain name for TXT validation record.
:param str name: Domain name being validated.
:param unicode name: Domain name being validated.
"""
return "{0}.{1}".format(self.LABEL, name)
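For the legacy `dns` challenge, the validation is a JWS over the challenge's own JSON, signed with the account key; `check_validation` verifies both the signature and the payload. A minimal round-trip sketch with a placeholder key path:

```python
import josepy as jose
from acme import challenges

with open("account_key.pem", "rb") as f:  # hypothetical path
    account_key = jose.JWKRSA.load(f.read())

chall = challenges.DNS(token=b"x" * 16)

# Sign the challenge JSON with the account key, then verify it.
jws = chall.gen_validation(account_key)
assert chall.check_validation(jws, account_key.public_key())

# gen_response wraps the same JWS in a DNSResponse.
response = chall.gen_response(account_key)
assert isinstance(response, challenges.DNSResponse)
assert response.check_validation(chall, account_key.public_key())
```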
@@ -636,9 +605,9 @@ class DNSResponse(ChallengeResponse):
"""
typ = "dns"
validation: jose.JWS = jose.field("validation", decoder=jose.JWS.from_json)
validation = jose.Field("validation", decoder=jose.JWS.from_json)
def check_validation(self, chall: 'DNS', account_public_key: jose.JWK) -> bool:
def check_validation(self, chall, account_public_key):
"""Check validation.
:param challenges.DNS chall:
@@ -648,3 +617,7 @@ class DNSResponse(ChallengeResponse):
"""
return chall.check_validation(self.validation, account_public_key)
# Patching ourselves to warn about TLS-SNI challenge deprecation and removal.
sys.modules[__name__] = _TLSSNI01DeprecationModule(sys.modules[__name__])


@@ -1,16 +1,15 @@
"""Tests for acme.challenges."""
import urllib.parse as urllib_parse
import unittest
from unittest import mock
import josepy as jose
import mock
import OpenSSL
import requests
from josepy.jwk import JWKEC
from six.moves.urllib import parse as urllib_parse # pylint: disable=relative-import
from acme import errors
import test_util
from acme import test_util
CERT = test_util.load_comparable_cert('cert.pem')
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
@@ -22,6 +21,7 @@ class ChallengeTest(unittest.TestCase):
from acme.challenges import Challenge
from acme.challenges import UnrecognizedChallenge
chall = UnrecognizedChallenge({"type": "foo"})
# pylint: disable=no-member
self.assertEqual(chall, Challenge.from_json(chall.jobj))
@@ -77,6 +77,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
class DNS01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import DNS01Response
@@ -148,6 +149,7 @@ class DNS01Test(unittest.TestCase):
class HTTP01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import HTTP01Response
@@ -185,7 +187,7 @@ class HTTP01ResponseTest(unittest.TestCase):
mock_get.return_value = mock.MagicMock(text=validation)
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
mock_get.assert_called_once_with(self.chall.uri("local"))
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_bad_validation(self, mock_get):
@@ -201,7 +203,7 @@ class HTTP01ResponseTest(unittest.TestCase):
HTTP01Response.WHITESPACE_CUTSET))
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
mock_get.assert_called_once_with(self.chall.uri("local"))
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_connection_error(self, mock_get):
@@ -257,68 +259,43 @@ class HTTP01Test(unittest.TestCase):
self.msg.update(token=b'..').good_token)
class TLSALPN01ResponseTest(unittest.TestCase):
class TLSSNI01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(
from acme.challenges import TLSSNI01
self.chall = TLSSNI01(
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
self.domain = u'example.com'
self.domain2 = u'example2.com'
self.response = self.chall.response(KEY)
self.jmsg = {
'resource': 'challenge',
'type': 'tls-alpn-01',
'type': 'tls-sni-01',
'keyAuthorization': self.response.key_authorization,
}
# pylint: disable=invalid-name
label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
label2 = b'b7793728f084394f2a1afd459556bb5c'
self.z = label1 + label2
self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
self.domain = 'foo.com'
def test_z_and_domain(self):
self.assertEqual(self.z, self.response.z)
self.assertEqual(self.z_domain, self.response.z_domain)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.response.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01Response
self.assertEqual(self.response, TLSALPN01Response.from_json(self.jmsg))
from acme.challenges import TLSSNI01Response
self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSALPN01Response
hash(TLSALPN01Response.from_json(self.jmsg))
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert(self.domain)
self.assertIsInstance(key, OpenSSL.crypto.PKey)
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(self.domain,
test_util.load_cert('cert.pem')))
def test_verify_bad_domain(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertFalse(self.response.verify_cert(self.domain2, cert))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSALPN01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, self.domain, mock.sentinel.cert)
from acme.challenges import TLSSNI01Response
hash(TLSSNI01Response.from_json(self.jmsg))
@mock.patch('acme.challenges.socket.gethostbyname')
@mock.patch('acme.challenges.crypto_util.probe_sni')
@@ -327,21 +304,134 @@ class TLSALPN01ResponseTest(unittest.TestCase):
self.response.probe_cert('foo.com')
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host=b'127.0.0.1', port=self.response.PORT, name=b'foo.com',
alpn_protocols=[b'acme-tls/1'])
host='127.0.0.1', port=self.response.PORT,
name=self.z_domain)
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host=b'8.8.8.8', port=mock.ANY, name=b'foo.com',
alpn_protocols=[b'acme-tls/1'])
host='8.8.8.8', port=mock.ANY, name=mock.ANY)
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
self.response.probe_cert('foo.com', port=1234)
mock_probe_sni.assert_called_with(
host=mock.ANY, port=1234, name=mock.ANY)
self.response.probe_cert('foo.com', bar='baz')
mock_probe_sni.assert_called_with(
host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')
self.response.probe_cert('foo.com', name=b'xxx')
mock_probe_sni.assert_called_with(
host=mock.ANY, port=mock.ANY,
name=self.z_domain)
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert()
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
self.assertTrue(self.response.verify_cert(cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(
test_util.load_cert('cert.pem')))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, mock.sentinel.cert)
@mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
mock_probe_cert.side_effect = errors.Error
self.assertFalse(self.response.simple_verify(
self.chall, self.domain, KEY.public_key()))
class TLSSNI01Test(unittest.TestCase):
def setUp(self):
self.jmsg = {
'type': 'tls-sni-01',
'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
}
from acme.challenges import TLSSNI01
self.msg = TLSSNI01(
token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
def test_to_partial_json(self):
self.assertEqual(self.jmsg, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSSNI01
self.assertEqual(self.msg, TLSSNI01.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSSNI01
hash(TLSSNI01.from_json(self.jmsg))
def test_from_json_invalid_token_length(self):
from acme.challenges import TLSSNI01
self.jmsg['token'] = jose.encode_b64jose(b'abcd')
self.assertRaises(
jose.DeserializationError, TLSSNI01.from_json, self.jmsg)
@mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
def test_validation(self, mock_gen_cert):
mock_gen_cert.return_value = ('cert', 'key')
self.assertEqual(('cert', 'key'), self.msg.validation(
KEY, cert_key=mock.sentinel.cert_key))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)
def test_deprecation_message(self):
with mock.patch('acme.warnings.warn') as mock_warn:
from acme.challenges import TLSSNI01
assert TLSSNI01
self.assertEqual(mock_warn.call_count, 1)
self.assertTrue('deprecated' in mock_warn.call_args[0][0])
class TLSALPN01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import TLSALPN01Response
self.msg = TLSALPN01Response(key_authorization=u'foo')
self.jmsg = {
'resource': 'challenge',
'type': 'tls-alpn-01',
'keyAuthorization': u'foo',
}
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(token=(b'x' * 16))
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01Response
self.assertEqual(self.msg, TLSALPN01Response.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSALPN01Response
hash(TLSALPN01Response.from_json(self.jmsg))
class TLSALPN01Test(unittest.TestCase):
def setUp(self):
@@ -370,13 +460,8 @@ class TLSALPN01Test(unittest.TestCase):
self.assertRaises(
jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
@mock.patch('acme.challenges.TLSALPN01Response.gen_cert')
def test_validation(self, mock_gen_cert):
mock_gen_cert.return_value = ('cert', 'key')
self.assertEqual(('cert', 'key'), self.msg.validation(
KEY, cert_key=mock.sentinel.cert_key, domain=mock.sentinel.domain))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key,
domain=mock.sentinel.domain)
def test_validation(self):
self.assertRaises(NotImplementedError, self.msg.validation, KEY)
class DNSTest(unittest.TestCase):
@@ -402,11 +487,8 @@ class DNSTest(unittest.TestCase):
hash(DNS.from_json(self.jmsg))
def test_gen_check_validation(self):
ec_key_secp384r1 = JWKEC(key=test_util.load_ecdsa_private_key('ec_secp384r1_key.pem'))
for key, alg in [(KEY, jose.RS256), (ec_key_secp384r1, jose.ES384)]:
with self.subTest(key=key, alg=alg):
self.assertTrue(self.msg.check_validation(
self.msg.gen_validation(key, alg=alg), key.public_key()))
self.assertTrue(self.msg.check_validation(
self.msg.gen_validation(KEY), KEY.public_key()))
def test_gen_check_validation_wrong_key(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa1024_key.pem'))
@@ -427,25 +509,20 @@ class DNSTest(unittest.TestCase):
payload=self.msg.update(
token=b'x' * 20).json_dumps().encode('utf-8'),
alg=jose.RS256, key=KEY)
self.assertFalse(self.msg.check_validation(bad_validation, KEY.public_key()))
self.assertFalse(self.msg.check_validation(
bad_validation, KEY.public_key()))
def test_gen_response(self):
with mock.patch('acme.challenges.DNS.gen_validation') as mock_gen:
mock_gen.return_value = mock.sentinel.validation
response = self.msg.gen_response(KEY)
from acme.challenges import DNSResponse
self.assertIsInstance(response, DNSResponse)
self.assertTrue(isinstance(response, DNSResponse))
self.assertEqual(response.validation, mock.sentinel.validation)
def test_validation_domain_name(self):
self.assertEqual('_acme-challenge.le.wtf', self.msg.validation_domain_name('le.wtf'))
def test_validation_domain_name_ecdsa(self):
ec_key_secp384r1 = JWKEC(key=test_util.load_ecdsa_private_key('ec_secp384r1_key.pem'))
self.assertIs(self.msg.check_validation(
self.msg.gen_validation(ec_key_secp384r1, alg=jose.ES384),
ec_key_secp384r1.public_key()), True
)
self.assertEqual(
'_acme-challenge.le.wtf', self.msg.validation_domain_name('le.wtf'))
class DNSResponseTest(unittest.TestCase):
@@ -483,20 +560,8 @@ class DNSResponseTest(unittest.TestCase):
hash(DNSResponse.from_json(self.jmsg_from))
def test_check_validation(self):
self.assertTrue(self.msg.check_validation(self.chall, KEY.public_key()))
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_challenge_payload(self):
from acme.challenges import HTTP01Response
challenge_body = HTTP01Response()
challenge_body.le_acme_version = 2
jobj = challenge_body.json_dumps(indent=2).encode()
# RFC8555 states that challenge responses must have an empty payload.
self.assertEqual(jobj, b'{}')
self.assertTrue(
self.msg.check_validation(self.chall, KEY.public_key()))
if __name__ == '__main__':

File diff suppressed because it is too large


@@ -2,14 +2,13 @@
# pylint: disable=too-many-lines
import copy
import datetime
import http.client as http_client
import json
import unittest
from typing import Dict
from unittest import mock
import warnings
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import mock
import OpenSSL
import requests
@@ -17,22 +16,14 @@ from acme import challenges
from acme import errors
from acme import jws as acme_jws
from acme import messages
from acme.client import ClientNetwork
from acme.client import ClientV2
from acme.mixins import VersionedLEACMEMixin
import messages_test
import test_util
# Remove the following in Certbot 2.0:
with warnings.catch_warnings():
warnings.filterwarnings('ignore', '.* in acme.client', DeprecationWarning)
from acme.client import BackwardsCompatibleClientV2
from acme.client import Client
from acme import messages_test
from acme import test_util
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
CERT_DER = test_util.load_vector('cert.der')
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
CSR_SAN_PEM = test_util.load_vector('csr-san.pem')
CSR_MIXED_PEM = test_util.load_vector('csr-mixed.pem')
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
KEY2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
@@ -72,7 +63,7 @@ class ClientTestBase(unittest.TestCase):
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
reg = messages.Registration(
contact=self.contact, key=KEY.public_key())
the_arg: Dict = dict(reg)
the_arg = dict(reg) # type: Dict
self.new_reg = messages.NewRegistration(**the_arg)
self.regr = messages.RegistrationResource(
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
@@ -95,17 +86,12 @@ class ClientTestBase(unittest.TestCase):
# Reason code for revocation
self.rsn = 1
class BackwardsCompatibleClientV2Test(ClientTestBase):
"""Tests for acme.client.BackwardsCompatibleClientV2."""
def setUp(self):
super().setUp()
# For some reason, required to suppress warnings on mock.patch('acme.client.Client')
self.warning_cap = warnings.catch_warnings()
self.warning_cap.__enter__()
warnings.filterwarnings('ignore', '.*acme.client', DeprecationWarning)
super(BackwardsCompatibleClientV2Test, self).setUp()
# contains a loaded cert
self.certr = messages.CertificateResource(
body=messages_test.CERT)
@@ -127,17 +113,15 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
self.orderr = messages.OrderResource(
csr_pem=CSR_SAN_PEM)
def tearDown(self) -> None:
self.warning_cap.__exit__()
return super().tearDown()
def _init(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import BackwardsCompatibleClientV2
return BackwardsCompatibleClientV2(net=self.net,
key=KEY, server=uri)
def test_init_downloads_directory(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import BackwardsCompatibleClientV2
BackwardsCompatibleClientV2(net=self.net,
key=KEY, server=uri)
self.net.get.assert_called_once_with(uri)
@@ -155,7 +139,6 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
self.response.json.return_value = DIRECTORY_V2.to_json()
client = self._init()
self.response.json.return_value = self.regr.body.to_json()
self.response.headers = {'Location': 'https://www.letsencrypt-demo.org/acme/reg/1'}
self.assertEqual(self.regr, client.query_registration(self.regr))
def test_forwarding(self):
@@ -279,7 +262,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.finalize_order(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline, False)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline)
def test_revoke(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
@@ -335,9 +318,10 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
class ClientTest(ClientTestBase):
"""Tests for acme.client.Client."""
# pylint: disable=too-many-instance-attributes,too-many-public-methods
def setUp(self):
super().setUp()
super(ClientTest, self).setUp()
self.directory = DIRECTORY_V1
@@ -351,11 +335,13 @@ class ClientTest(ClientTestBase):
uri='https://www.letsencrypt-demo.org/acme/cert/1',
cert_chain_uri='https://www.letsencrypt-demo.org/ca')
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=jose.RS256, net=self.net)
def test_init_downloads_directory(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import Client
self.client = Client(
directory=uri, key=KEY, alg=jose.RS256, net=self.net)
self.net.get.assert_called_once_with(uri)
@@ -364,6 +350,7 @@ class ClientTest(ClientTestBase):
def test_init_without_net(self, mock_net):
mock_net.return_value = mock.sentinel.net
alg = jose.RS256
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=alg)
mock_net.called_once_with(KEY, alg=alg, verify_ssl=True)
@@ -619,8 +606,8 @@ class ClientTest(ClientTestBase):
# make sure that max_attempts is per-authorization, rather
# than global
max_attempts=max(len(authzrs[0].retries), len(authzrs[1].retries)))
self.assertIs(cert[0], csr)
self.assertIs(cert[1], updated_authzrs)
self.assertTrue(cert[0] is csr)
self.assertTrue(cert[1] is updated_authzrs)
self.assertEqual(updated_authzrs[0].uri, 'a...')
self.assertEqual(updated_authzrs[1].uri, 'b.')
self.assertEqual(updated_authzrs[0].times, [
@@ -656,7 +643,7 @@ class ClientTest(ClientTestBase):
authzr = self.client.deactivate_authorization(self.authzr)
self.assertEqual(authzb, authzr.body)
self.assertEqual(self.client.net.post.call_count, 1)
self.assertIn(self.authzr.uri, self.net.post.call_args_list[0][0])
self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])
def test_check_cert(self):
self.response.headers['Location'] = self.certr.uri
@@ -715,7 +702,7 @@ class ClientTest(ClientTestBase):
def test_revocation_payload(self):
obj = messages.Revocation(certificate=self.certr.body, reason=self.rsn)
self.assertIn('reason', obj.to_partial_json().keys())
self.assertTrue('reason' in obj.to_partial_json().keys())
self.assertEqual(self.rsn, obj.to_partial_json()['reason'])
def test_revoke_bad_status_raises_error(self):
@@ -731,10 +718,11 @@ class ClientV2Test(ClientTestBase):
"""Tests for acme.client.ClientV2."""
def setUp(self):
super().setUp()
super(ClientV2Test, self).setUp()
self.directory = DIRECTORY_V2
from acme.client import ClientV2
self.client = ClientV2(self.directory, self.net)
self.new_reg = self.new_reg.update(terms_of_service_agreed=True)
@@ -754,7 +742,7 @@ class ClientV2Test(ClientTestBase):
self.orderr = messages.OrderResource(
body=self.order,
uri='https://www.letsencrypt-demo.org/acme/acct/1/order/1',
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_MIXED_PEM)
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_SAN_PEM)
def test_new_account(self):
self.response.status_code = http_client.CREATED
@@ -784,7 +772,7 @@ class ClientV2Test(ClientTestBase):
with mock.patch('acme.client.ClientV2._post_as_get') as mock_post_as_get:
mock_post_as_get.side_effect = (authz_response, authz_response2)
self.assertEqual(self.client.new_order(CSR_MIXED_PEM), self.orderr)
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
@mock.patch('acme.client.datetime')
def test_poll_and_finalize(self, mock_datetime):
@@ -834,8 +822,7 @@ class ClientV2Test(ClientTestBase):
def test_finalize_order_success(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/',
status=messages.STATUS_VALID)
certificate='https://www.letsencrypt-demo.org/acme/cert/')
updated_orderr = self.orderr.update(body=updated_order, fullchain_pem=CERT_SAN_PEM)
self.response.json.return_value = updated_order.to_json()
@@ -845,53 +832,16 @@ class ClientV2Test(ClientTestBase):
self.assertEqual(self.client.finalize_order(self.orderr, deadline), updated_orderr)
def test_finalize_order_error(self):
updated_order = self.order.update(
error=messages.Error.with_code('unauthorized'),
status=messages.STATUS_INVALID)
updated_order = self.order.update(error=messages.Error.with_code('unauthorized'))
self.response.json.return_value = updated_order.to_json()
deadline = datetime.datetime(9999, 9, 9)
self.assertRaises(errors.IssuanceError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_invalid_status(self):
# https://github.com/certbot/certbot/issues/9296
order = self.order.update(error=None, status=messages.STATUS_INVALID)
self.response.json.return_value = order.to_json()
with self.assertRaises(errors.Error) as error:
self.client.finalize_order(self.orderr, datetime.datetime(9999, 9, 9))
self.assertIn("The certificate order failed", str(error.exception))
def test_finalize_order_timeout(self):
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
self.assertRaises(errors.TimeoutError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_alt_chains(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/',
status=messages.STATUS_VALID
)
updated_orderr = self.orderr.update(body=updated_order,
fullchain_pem=CERT_SAN_PEM,
alternative_fullchains_pem=[CERT_SAN_PEM,
CERT_SAN_PEM])
self.response.json.return_value = updated_order.to_json()
self.response.text = CERT_SAN_PEM
self.response.headers['Link'] ='<https://example.com/acme/cert/1>;rel="alternate", ' + \
'<https://example.com/dir>;rel="index", ' + \
'<https://example.com/acme/cert/2>;title="foo";rel="alternate"'
deadline = datetime.datetime(9999, 9, 9)
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.net.post.assert_any_call('https://example.com/acme/cert/1',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.net.post.assert_any_call('https://example.com/acme/cert/2',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.assertEqual(resp, updated_orderr)
del self.response.headers['Link']
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.assertEqual(resp, updated_orderr.update(alternative_fullchains_pem=[]))
def test_revoke(self):
self.client.revoke(messages_test.CERT, self.rsn)
self.net.post.assert_called_once_with(
@@ -903,9 +853,9 @@ class ClientV2Test(ClientTestBase):
self.response.headers['Location'] = self.regr.uri
self.response.json.return_value = self.regr.body.to_json()
self.assertEqual(self.regr, self.client.update_registration(self.regr))
self.assertIsNotNone(self.client.net.account)
self.assertNotEqual(self.client.net.account, None)
self.assertEqual(self.client.net.post.call_count, 2)
self.assertIn(DIRECTORY_V2.newAccount, self.net.post.call_args_list[0][0])
self.assertTrue(DIRECTORY_V2.newAccount in self.net.post.call_args_list[0][0])
self.response.json.return_value = self.regr.body.update(
contact=()).to_json()
@@ -938,8 +888,21 @@ class ClientV2Test(ClientTestBase):
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
self.client.net.get.assert_not_called()
class FakeError(messages.Error): # pylint: disable=too-many-ancestors
"""Fake error to reproduce a malformed request ACME error"""
def __init__(self): # pylint: disable=super-init-not-called
pass
@property
def code(self):
return 'malformed'
self.client.net.post.side_effect = FakeError()
class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
self.client.poll(self.authzr2) # pylint: disable=protected-access
self.client.net.get.assert_called_once_with(self.authzr2.uri)
class MockJSONDeSerializable(jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
@@ -954,11 +917,13 @@ class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
class ClientNetworkTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork."""
# pylint: disable=too-many-public-methods
def setUp(self):
self.verify_ssl = mock.MagicMock()
self.wrap_in_jws = mock.MagicMock(return_value=mock.sentinel.wrapped)
from acme.client import ClientNetwork
self.net = ClientNetwork(
key=KEY, alg=jose.RS256, verify_ssl=self.verify_ssl,
user_agent='acme-python-test')
@@ -968,7 +933,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.links = {}
def test_init(self):
self.assertIs(self.net.verify_ssl, self.verify_ssl)
self.assertTrue(self.net.verify_ssl is self.verify_ssl)
def test_wrap_in_jws(self):
# pylint: disable=protected-access
@@ -1002,8 +967,8 @@ class ClientNetworkTest(unittest.TestCase):
def test_check_response_not_ok_jobj_error(self):
self.response.ok = False
self.response.json.return_value = messages.Error.with_code(
'serverInternal', detail='foo', title='some title').to_json()
self.response.json.return_value = messages.Error(
detail='foo', typ='serverInternal', title='some title').to_json()
# pylint: disable=protected-access
self.assertRaises(
messages.Error, self.net._check_response, self.response)
@@ -1028,39 +993,10 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.side_effect = ValueError
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access
# pylint: disable=protected-access,no-value-for-parameter
self.assertEqual(
self.response, self.net._check_response(self.response))
@mock.patch('acme.client.logger')
def test_check_response_ok_ct_with_charset(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'application/json; charset=utf-8'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
try:
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'application/json; charset=utf-8'
)
except AssertionError:
return
raise AssertionError('Expected Content-Type warning ' #pragma: no cover
'to not have been logged')
@mock.patch('acme.client.logger')
def test_check_response_ok_bad_ct(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'text/plain'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'text/plain'
)
def test_check_response_conflict(self):
self.response.ok = False
self.response.status_code = 409
@@ -1071,7 +1007,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.return_value = {}
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access
# pylint: disable=protected-access,no-value-for-parameter
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -1187,16 +1123,18 @@ class ClientNetworkTest(unittest.TestCase):
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork which mock out response."""
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.client import ClientNetwork
self.net = ClientNetwork(key=None, alg=None)
self.response = mock.MagicMock(ok=True, status_code=http_client.OK)
self.response.headers = {}
self.response.links = {}
self.response.checked = False
self.acmev1_nonce_response = mock.MagicMock(
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response = mock.MagicMock(ok=False,
status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response.headers = {}
self.obj = mock.MagicMock()
self.wrapped_obj = mock.MagicMock()
@@ -1209,7 +1147,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
def send_request(*args, **kwargs):
# pylint: disable=unused-argument,missing-docstring
self.assertNotIn("new_nonce_url", kwargs)
self.assertFalse("new_nonce_url" in kwargs)
method = args[0]
uri = args[1]
if method == 'HEAD' and uri != "new_nonce_uri":
@@ -1351,21 +1289,20 @@ class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
self.source_address = "8.8.8.8"
def test_source_address_set(self):
with mock.patch('warnings.warn') as mock_warn:
net = ClientNetwork(key=None, alg=None, source_address=self.source_address)
mock_warn.assert_called_once()
self.assertIn('source_address', mock_warn.call_args[0][0])
from acme.client import ClientNetwork
net = ClientNetwork(key=None, alg=None, source_address=self.source_address)
for adapter in net.session.adapters.values():
self.assertIn(self.source_address, adapter.source_address)
self.assertTrue(self.source_address in adapter.source_address)
def test_behavior_assumption(self):
"""This is a test that guardrails the HTTPAdapter behavior so that if the default for
a Session() changes, the assumptions here aren't violated silently."""
from acme.client import ClientNetwork
# Source address not specified, so the default adapter type should be bound -- this
# test should fail if the default adapter type is changed by requests
net = ClientNetwork(key=None, alg=None)
session = requests.Session()
for scheme in session.adapters:
for scheme in session.adapters.keys():
client_network_adapter = net.session.adapters.get(scheme)
default_adapter = session.adapters.get(scheme)
self.assertEqual(client_network_adapter.__class__, default_adapter.__class__)


@@ -1,26 +1,20 @@
"""Crypto utilities."""
import binascii
import contextlib
import ipaddress
import logging
import os
import re
import socket
from typing import Any
from typing import Callable
from typing import List
from typing import Mapping
from typing import Optional
from typing import Sequence
from typing import Set
from typing import Tuple
from typing import Union
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
import josepy as jose
from acme import errors
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Callable, Union, Tuple, Optional
# pylint: enable=unused-import, no-name-in-module
logger = logging.getLogger(__name__)
@@ -31,60 +25,27 @@ logger = logging.getLogger(__name__)
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
# in case it's used for things other than probing/serving!
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
class _DefaultCertSelection:
def __init__(self, certs: Mapping[bytes, Tuple[crypto.PKey, crypto.X509]]):
self.certs = certs
def __call__(self, connection: SSL.Connection) -> Optional[Tuple[crypto.PKey, crypto.X509]]:
server_name = connection.get_servername()
if server_name:
return self.certs.get(server_name, None)
return None # pragma: no cover
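A quick sketch of how the private `_DefaultCertSelection` helper introduced above behaves: it maps the connection's SNI server name to a `(key, cert)` pair and returns `None` for unknown names. The `key`/`cert` stand-ins below are placeholders for real `OpenSSL.crypto` objects, and the connection is mocked:

```python
from unittest import mock

from acme.crypto_util import _DefaultCertSelection

# Placeholder objects; real callers pass OpenSSL.crypto.PKey / X509 instances.
key, cert = object(), object()
selector = _DefaultCertSelection({b"example.com": (key, cert)})

conn = mock.MagicMock()
conn.get_servername.return_value = b"example.com"
assert selector(conn) == (key, cert)

conn.get_servername.return_value = b"unknown.invalid"
assert selector(conn) is None
```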
class SSLSocket: # pylint: disable=too-few-public-methods
class SSLSocket(object): # pylint: disable=too-few-public-methods
"""SSL wrapper for sockets.
:ivar socket sock: Original wrapped socket.
:ivar dict certs: Mapping from domain names (`bytes`) to
`OpenSSL.crypto.X509`.
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
:ivar alpn_selection: Hook to select negotiated ALPN protocol for
connection.
:ivar cert_selection: Hook to select certificate for connection. If given,
`certs` parameter would be ignored, and therefore must be empty.
"""
def __init__(self, sock: socket.socket,
certs: Optional[Mapping[bytes, Tuple[crypto.PKey, crypto.X509]]] = None,
method: int = _DEFAULT_SSL_METHOD,
alpn_selection: Optional[Callable[[SSL.Connection, List[bytes]], bytes]] = None,
cert_selection: Optional[Callable[[SSL.Connection],
Optional[Tuple[crypto.PKey,
crypto.X509]]]] = None
) -> None:
def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
self.sock = sock
self.alpn_selection = alpn_selection
self.certs = certs
self.method = method
if not cert_selection and not certs:
raise ValueError("Neither cert_selection or certs specified.")
if cert_selection and certs:
raise ValueError("Both cert_selection and certs specified.")
actual_cert_selection: Union[_DefaultCertSelection,
Optional[Callable[[SSL.Connection],
Optional[Tuple[crypto.PKey,
crypto.X509]]]]] = cert_selection
if actual_cert_selection is None:
actual_cert_selection = _DefaultCertSelection(certs if certs else {})
self.cert_selection = actual_cert_selection
def __getattr__(self, name: str) -> Any:
def __getattr__(self, name):
return getattr(self.sock, name)
def _pick_certificate_cb(self, connection: SSL.Connection) -> None:
def _pick_certificate_cb(self, connection):
"""SNI certificate callback.
This method will set a new OpenSSL context object for this
@@ -96,58 +57,46 @@ class SSLSocket: # pylint: disable=too-few-public-methods
:type connection: :class:`OpenSSL.Connection`
"""
pair = self.cert_selection(connection)
if pair is None:
logger.debug("Certificate selection for server name %s failed, dropping SSL",
connection.get_servername())
server_name = connection.get_servername()
try:
key, cert = self.certs[server_name]
except KeyError:
logger.debug("Server name (%s) not recognized, dropping SSL",
server_name)
return
key, cert = pair
new_context = SSL.Context(self.method)
new_context.set_options(SSL.OP_NO_SSLv2)
new_context.set_options(SSL.OP_NO_SSLv3)
new_context.use_privatekey(key)
new_context.use_certificate(cert)
if self.alpn_selection is not None:
new_context.set_alpn_select_callback(self.alpn_selection)
connection.set_context(new_context)
class FakeConnection:
class FakeConnection(object):
"""Fake OpenSSL.SSL.Connection."""
# pylint: disable=missing-function-docstring
# pylint: disable=too-few-public-methods,missing-docstring
def __init__(self, connection: SSL.Connection) -> None:
def __init__(self, connection):
self._wrapped = connection
def __getattr__(self, name: str) -> Any:
def __getattr__(self, name):
return getattr(self._wrapped, name)
def shutdown(self, *unused_args: Any) -> bool:
def shutdown(self, *unused_args):
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
try:
return self._wrapped.shutdown()
except SSL.Error as error:
# We wrap the error so we raise the same error type as sockets
# in the standard library. This is useful when this object is
# used by code which expects a standard socket such as
# socketserver in the standard library.
raise socket.error(error)
return self._wrapped.shutdown()
def accept(self) -> Tuple[FakeConnection, Any]: # pylint: disable=missing-function-docstring
def accept(self): # pylint: disable=missing-docstring
sock, addr = self.sock.accept()
context = SSL.Context(self.method)
context.set_options(SSL.OP_NO_SSLv2)
context.set_options(SSL.OP_NO_SSLv3)
context.set_tlsext_servername_callback(self._pick_certificate_cb)
if self.alpn_selection is not None:
context.set_alpn_select_callback(self.alpn_selection)
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock.set_accept_state()
# This log line is especially desirable because without it requests to
# our standalone TLSALPN server would not be logged.
logger.debug("Performing handshake with %s", addr)
try:
ssl_sock.do_handshake()
@@ -159,9 +108,8 @@ class SSLSocket: # pylint: disable=too-few-public-methods
return ssl_sock, addr
def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, # pylint: disable=too-many-arguments
method: int = _DEFAULT_SSL_METHOD, source_address: Tuple[str, int] = ('', 0),
alpn_protocols: Optional[Sequence[bytes]] = None) -> crypto.X509:
def probe_sni(name, host, port=443, timeout=300,
method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
"""Probe SNI server for SSL certificate.
:param bytes name: Byte string to send as the server name in the
@@ -173,8 +121,6 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
:param tuple source_address: Enables multi-path probing (selection
of the source interface). See `socket.create_connection` for more
info. Available only in Python 2.7+.
:param alpn_protocols: Protocols to request using ALPN.
:type alpn_protocols: `Sequence` of `bytes`
:raises acme.errors.Error: In case of any problems.
@@ -193,10 +139,10 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
" from {0}:{1}".format(
source_address[0],
source_address[1]
) if any(source_address) else ""
) if socket_kwargs else ""
)
socket_tuple: Tuple[bytes, int] = (host, port)
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore[arg-type]
socket_tuple = (host, port) # type: Tuple[str, int]
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore
except socket.error as error:
raise errors.Error(error)
@@ -204,57 +150,30 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
client_ssl = SSL.Connection(context, client)
client_ssl.set_connect_state()
client_ssl.set_tlsext_host_name(name) # pyOpenSSL>=0.13
if alpn_protocols is not None:
client_ssl.set_alpn_protos(alpn_protocols)
try:
client_ssl.do_handshake()
client_ssl.shutdown()
except SSL.Error as error:
raise errors.Error(error)
cert = client_ssl.get_peer_certificate()
assert cert # Appease mypy. We would have crashed out by now if there was no certificate.
return cert
return client_ssl.get_peer_certificate()
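A usage sketch for `probe_sni` as defined above: resolve the host, then fetch whatever certificate the server presents for the given SNI name. The domain below is illustrative:

```python
import socket

from acme import crypto_util
from acme import errors

domain = "example.com"  # illustrative
host = socket.gethostbyname(domain)

try:
    cert = crypto_util.probe_sni(
        name=domain.encode(), host=host.encode(), port=443, timeout=30)
    print("Presented certificate subject:", cert.get_subject())
except errors.Error as error:
    print("Probe failed:", error)
```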
def make_csr(private_key_pem: bytes, domains: Optional[Union[Set[str], List[str]]] = None,
must_staple: bool = False,
ipaddrs: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None
) -> bytes:
"""Generate a CSR containing domains or IPs as subjectAltNames.
def make_csr(private_key_pem, domains, must_staple=False):
"""Generate a CSR containing a list of domains as subjectAltNames.
:param buffer private_key_pem: Private key, in PEM PKCS#8 format.
:param list domains: List of DNS names to include in subjectAltNames of CSR.
:param bool must_staple: Whether to include the TLS Feature extension (aka
OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
:param list ipaddrs: List of IP addresses (of type ipaddress.IPv4Address or
ipaddress.IPv6Address) to include in the subjectAltNames of the CSR.
Parameters are ordered this way for backward compatibility when called with positional arguments.
:returns: buffer PEM-encoded Certificate Signing Request.
"""
private_key = crypto.load_privatekey(
crypto.FILETYPE_PEM, private_key_pem)
csr = crypto.X509Req()
sanlist = []
# If the domain or IP list was not supplied, use an empty list so it's easier to iterate
if domains is None:
domains = []
if ipaddrs is None:
ipaddrs = []
if len(domains)+len(ipaddrs) == 0:
raise ValueError("At least one of domains or ipaddrs parameter need to be not empty")
for address in domains:
sanlist.append('DNS:' + address)
for ips in ipaddrs:
sanlist.append('IP:' + ips.exploded)
# make sure it's ASCII-encoded
san_string = ', '.join(sanlist).encode('ascii')
# For IP SANs the value actually needs to be an octet string,
# but something downstream thankfully handles that for us
extensions = [
crypto.X509Extension(
b'subjectAltName',
critical=False,
value=san_string
value=', '.join('DNS:' + d for d in domains).encode('ascii')
),
]
if must_staple:
@@ -264,16 +183,12 @@ def make_csr(private_key_pem: bytes, domains: Optional[Union[Set[str], List[str]
value=b"DER:30:03:02:01:05"))
csr.add_extensions(extensions)
csr.set_pubkey(private_key)
# RFC 2986 Section 4.1 only defines version 0
csr.set_version(0)
csr.set_version(2)
csr.sign(private_key, 'sha256')
return crypto.dump_certificate_request(
crypto.FILETYPE_PEM, csr)
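A minimal sketch of calling `make_csr` with a freshly generated key; note that the `ipaddrs` keyword only exists in the newer of the two signatures shown in this diff, and the domain names and IP below are placeholders:
```python
import ipaddress

from OpenSSL import crypto

from acme.crypto_util import make_csr

key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
key_pem = crypto.dump_privatekey(crypto.FILETYPE_PEM, key)

# Domains-only works with both signatures above; ipaddrs is newer-only.
csr_pem = make_csr(key_pem, ['a.example', 'b.example'])
csr_with_ip_pem = make_csr(key_pem, ['a.example'],
                           ipaddrs=[ipaddress.ip_address('192.0.2.1')])
```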
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req: Union[crypto.X509, crypto.X509Req]
) -> List[str]:
# Despite its name, this only outputs DNS names; other identifier types are ignored
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
common_name = loaded_cert_or_req.get_subject().CN
sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
@@ -281,8 +196,7 @@ def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req: Union[crypto.X509, cryp
return sans
return [common_name] + [d for d in sans if d != common_name]
def _pyopenssl_cert_or_req_san(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
def _pyopenssl_cert_or_req_san(cert_or_req):
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
.. todo:: Implement directly in PyOpenSSL!
@@ -293,87 +207,45 @@ def _pyopenssl_cert_or_req_san(cert_or_req: Union[crypto.X509, crypto.X509Req])
:param cert_or_req: Certificate or CSR.
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
:returns: A list of Subject Alternative Names that are DNS names.
:rtype: `list` of `str`
:returns: A list of Subject Alternative Names.
:rtype: `list` of `unicode`
"""
# This function finds SANs that are DNS names
# This function finds SANs by dumping the certificate/CSR to text and
# searching for "X509v3 Subject Alternative Name" in the text. This method
# is used to support PyOpenSSL version 0.13 where the
# `_subjectAltNameString` and `get_extensions` methods are not available
# for CSRs.
# constants based on PyOpenSSL certificate/CSR text dump
part_separator = ":"
parts_separator = ", "
prefix = "DNS" + part_separator
sans_parts = _pyopenssl_extract_san_list_raw(cert_or_req)
if isinstance(cert_or_req, crypto.X509):
# pylint: disable=line-too-long
func = crypto.dump_certificate # type: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]]
else:
func = crypto.dump_certificate_request
text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
# WARNING: this function does not support multiple SANs extensions.
# Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
# WARNING: this function assumes that no SAN can include
# parts_separator, hence the split!
sans_parts = [] if match is None else match.group(1).split(parts_separator)
return [part.split(part_separator)[1]
for part in sans_parts if part.startswith(prefix)]
def _pyopenssl_cert_or_req_san_ip(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
"""Get Subject Alternative Names IPs from certificate or CSR using pyOpenSSL.
:param cert_or_req: Certificate or CSR.
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
:returns: A list of Subject Alternative Names that are IP Addresses.
:rtype: `list` of `str`. Note that the addresses are returned as strings, not ipaddress objects
"""
# constants based on PyOpenSSL certificate/CSR text dump
part_separator = ":"
prefix = "IP Address" + part_separator
sans_parts = _pyopenssl_extract_san_list_raw(cert_or_req)
return [part[len(prefix):] for part in sans_parts if part.startswith(prefix)]
def _pyopenssl_extract_san_list_raw(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
"""Get raw SAN string from cert or csr, parse it as UTF-8 and return.
:param cert_or_req: Certificate or CSR.
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
:returns: raw SAN strings, with bytes decoded as UTF-8
:rtype: `list` of `str`
"""
# This function finds SANs by dumping the certificate/CSR to text and
# searching for "X509v3 Subject Alternative Name" in the text. This method
# is used because in PyOpenSSL versions <0.17 the `_subjectAltNameString` method is
# not able to parse IP addresses in the subjectAltName string.
if isinstance(cert_or_req, crypto.X509):
# pylint: disable=line-too-long
text = crypto.dump_certificate(crypto.FILETYPE_TEXT, cert_or_req).decode('utf-8')
else:
text = crypto.dump_certificate_request(crypto.FILETYPE_TEXT, cert_or_req).decode('utf-8')
# WARNING: this function does not support multiple SANs extensions.
# Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
raw_san = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
parts_separator = ", "
# WARNING: this function assumes that no SAN can include
# parts_separator, hence the split!
sans_parts = [] if raw_san is None else raw_san.group(1).split(parts_separator)
return sans_parts
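To make the text-dump parsing used by these helpers concrete, here is a standalone sketch on a hand-written snippet (the sample dump text and names are illustrative, not taken from a real certificate):
```python
import re

text = """
        X509v3 Subject Alternative Name:
            DNS:a.example, DNS:b.example, IP Address:192.0.2.1
"""
match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
parts = [] if match is None else match.group(1).split(", ")
dns_names = [p.split(":")[1] for p in parts if p.startswith("DNS:")]
ip_addresses = [p[len("IP Address:"):] for p in parts if p.startswith("IP Address:")]
# dns_names == ['a.example', 'b.example']; ip_addresses == ['192.0.2.1']
```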
def gen_ss_cert(key: crypto.PKey, domains: Optional[List[str]] = None,
not_before: Optional[int] = None,
validity: int = (7 * 24 * 60 * 60), force_san: bool = True,
extensions: Optional[List[crypto.X509Extension]] = None,
ips: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None
) -> crypto.X509:
def gen_ss_cert(key, domains, not_before=None,
validity=(7 * 24 * 60 * 60), force_san=True):
"""Generate new self-signed certificate.
:type domains: `list` of `str`
:type domains: `list` of `unicode`
:param OpenSSL.crypto.PKey key:
:param bool force_san:
:param extensions: List of additional extensions to include in the cert.
:type extensions: `list` of `OpenSSL.crypto.X509Extension`
:type ips: `list` of (`ipaddress.IPv4Address` or `ipaddress.IPv6Address`)
If more than one domain is provided, all of the domains are put into
``subjectAltName`` X.509 extension and first domain is set as the
@@ -381,39 +253,25 @@ def gen_ss_cert(key: crypto.PKey, domains: Optional[List[str]] = None,
extension is used, unless `force_san` is ``True``.
"""
assert domains or ips, "Must provide one or more hostnames or IPs for the cert."
assert domains, "Must provide one or more hostnames for the cert."
cert = crypto.X509()
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
cert.set_version(2)
if extensions is None:
extensions = []
if domains is None:
domains = []
if ips is None:
ips = []
extensions.append(
extensions = [
crypto.X509Extension(
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
)
]
if len(domains) > 0:
cert.get_subject().CN = domains[0]
cert.get_subject().CN = domains[0]
# TODO: what to put into cert.get_subject()?
cert.set_issuer(cert.get_subject())
sanlist = []
for address in domains:
sanlist.append('DNS:' + address)
for ip in ips:
sanlist.append('IP:' + ip.exploded)
san_string = ', '.join(sanlist).encode('ascii')
if force_san or len(domains) > 1 or len(ips) > 0:
if force_san or len(domains) > 1:
extensions.append(crypto.X509Extension(
b"subjectAltName",
critical=False,
value=san_string
value=b", ".join(b"DNS:" + d.encode() for d in domains)
))
cert.add_extensions(extensions)
@@ -425,9 +283,7 @@ def gen_ss_cert(key: crypto.PKey, domains: Optional[List[str]] = None,
cert.sign(key, "sha256")
return cert
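A small usage sketch for `gen_ss_cert`, using only arguments accepted by both signatures shown in this diff (the domain name is a placeholder):
```python
from OpenSSL import crypto

from acme.crypto_util import gen_ss_cert

key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)

cert = gen_ss_cert(key, ['dummy.example'])   # one week of validity by default
print(cert.get_subject().CN)                 # 'dummy.example'
print(cert.has_expired())                    # False for a freshly generated cert
```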
def dump_pyopenssl_chain(chain: Union[List[jose.ComparableX509], List[crypto.X509]],
filetype: int = crypto.FILETYPE_PEM) -> bytes:
def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
"""Dump certificate chain into a bundle.
:param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
@@ -440,10 +296,9 @@ def dump_pyopenssl_chain(chain: Union[List[jose.ComparableX509], List[crypto.X50
# XXX: returns empty string when no chain is available, which
# shuts up RenewableCert, but might not be the best solution...
def _dump_cert(cert: Union[jose.ComparableX509, crypto.X509]) -> bytes:
def _dump_cert(cert):
if isinstance(cert, jose.ComparableX509):
if isinstance(cert.wrapped, crypto.X509Req):
raise errors.Error("Unexpected CSR provided.") # pragma: no cover
# pylint: disable=protected-access
cert = cert.wrapped
return crypto.dump_certificate(filetype, cert)
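And a short sketch of bundling certificates with `dump_pyopenssl_chain`; the two self-signed certificates below are unrelated to each other and purely illustrative:
```python
from OpenSSL import crypto

from acme.crypto_util import dump_pyopenssl_chain, gen_ss_cert

key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
chain = [gen_ss_cert(key, ['leaf.example']), gen_ss_cert(key, ['issuer.example'])]

pem_bundle = dump_pyopenssl_chain(chain)   # bytes: the PEM blocks concatenated
```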

View File

@@ -1,23 +1,25 @@
"""Tests for acme.crypto_util."""
import itertools
import ipaddress
import socket
import socketserver
import threading
import time
import unittest
from typing import List
import six
from six.moves import socketserver #type: ignore # pylint: disable=import-error
import josepy as jose
import OpenSSL
from acme import errors
import test_util
from acme import test_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
class SSLSocketAndProbeSNITest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
def setUp(self):
self.cert = test_util.load_comparable_cert('rsa2048_cert.pem')
key = test_util.load_pyopenssl_private_key('rsa2048_key.pem')
@@ -28,14 +30,17 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
class _TestServer(socketserver.TCPServer):
# pylint: disable=too-few-public-methods
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
def server_bind(self): # pylint: disable=missing-docstring
self.socket = SSLSocket(socket.socket(),
certs)
self.socket = SSLSocket(socket.socket(), certs=certs)
socketserver.TCPServer.server_bind(self)
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
self.port = self.server.socket.getsockname()[1]
self.server_thread = threading.Thread(
# pylint: disable=no-member
target=self.server.handle_request)
def tearDown(self):
@@ -61,7 +66,8 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
self.assertRaises(errors.Error, self._probe, b'bar')
def test_probe_connection_error(self):
self.server.server_close()
# pylint has a hard time with six
self.server.server_close() # pylint: disable=no-member
original_timeout = socket.getdefaulttimeout()
try:
socket.setdefaulttimeout(1)
@@ -70,18 +76,6 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
socket.setdefaulttimeout(original_timeout)
class SSLSocketTest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket."""
def test_ssl_socket_invalid_arguments(self):
from acme.crypto_util import SSLSocket
with self.assertRaises(ValueError):
_ = SSLSocket(None, {'sni': ('key', 'cert')},
cert_selection=lambda _: None)
with self.assertRaises(ValueError):
_ = SSLSocket(None)
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_all_names."""
@@ -109,6 +103,7 @@ class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_san."""
@classmethod
def _call(cls, loader, name):
# pylint: disable=protected-access
@@ -118,9 +113,9 @@ class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
@classmethod
def _get_idn_names(cls):
"""Returns expected names from '{cert,csr}-idnsans.pem'."""
chars = [chr(i) for i in itertools.chain(range(0x3c3, 0x400),
range(0x641, 0x6fc),
range(0x1820, 0x1877))]
chars = [six.unichr(i) for i in itertools.chain(range(0x3c3, 0x400),
range(0x641, 0x6fc),
range(0x1820, 0x1877))]
return [''.join(chars[i: i + 45]) + '.invalid'
for i in range(0, len(chars), 45)]
@@ -174,73 +169,24 @@ class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
['chicago-cubs.venafi.example', 'cubs.venafi.example'])
class PyOpenSSLCertOrReqSANIPTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_san_ip."""
@classmethod
def _call(cls, loader, name):
# pylint: disable=protected-access
from acme.crypto_util import _pyopenssl_cert_or_req_san_ip
return _pyopenssl_cert_or_req_san_ip(loader(name))
def _call_cert(self, name):
return self._call(test_util.load_cert, name)
def _call_csr(self, name):
return self._call(test_util.load_csr, name)
def test_cert_no_sans(self):
self.assertEqual(self._call_cert('cert.pem'), [])
def test_csr_no_sans(self):
self.assertEqual(self._call_csr('csr-nosans.pem'), [])
def test_cert_domain_sans(self):
self.assertEqual(self._call_cert('cert-san.pem'), [])
def test_csr_domain_sans(self):
self.assertEqual(self._call_csr('csr-san.pem'), [])
def test_cert_ip_two_sans(self):
self.assertEqual(self._call_cert('cert-ipsans.pem'), ['192.0.2.145', '203.0.113.1'])
def test_csr_ip_two_sans(self):
self.assertEqual(self._call_csr('csr-ipsans.pem'), ['192.0.2.145', '203.0.113.1'])
def test_csr_ipv6_sans(self):
self.assertEqual(self._call_csr('csr-ipv6sans.pem'),
['0:0:0:0:0:0:0:1', 'A3BE:32F3:206E:C75D:956:CEE:9858:5EC5'])
def test_cert_ipv6_sans(self):
self.assertEqual(self._call_cert('cert-ipv6sans.pem'),
['0:0:0:0:0:0:0:1', 'A3BE:32F3:206E:C75D:956:CEE:9858:5EC5'])
class GenSsCertTest(unittest.TestCase):
"""Test for gen_ss_cert (generation of self-signed cert)."""
class RandomSnTest(unittest.TestCase):
"""Test for random certificate serial numbers."""
def setUp(self):
self.cert_count = 5
self.serial_num: List[int] = []
self.serial_num = [] # type: List[int]
self.key = OpenSSL.crypto.PKey()
self.key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
def test_sn_collisions(self):
from acme.crypto_util import gen_ss_cert
for _ in range(self.cert_count):
cert = gen_ss_cert(self.key, ['dummy'], force_san=True,
ips=[ipaddress.ip_address("10.10.10.10")])
cert = gen_ss_cert(self.key, ['dummy'], force_san=True)
self.serial_num.append(cert.get_serial_number())
self.assertGreaterEqual(len(set(self.serial_num)), self.cert_count)
def test_no_name(self):
from acme.crypto_util import gen_ss_cert
with self.assertRaises(AssertionError):
gen_ss_cert(self.key, ips=[ipaddress.ip_address("1.1.1.1")])
gen_ss_cert(self.key)
self.assertTrue(len(set(self.serial_num)) > 1)
class MakeCSRTest(unittest.TestCase):
"""Test for standalone functions."""
@@ -255,8 +201,8 @@ class MakeCSRTest(unittest.TestCase):
def test_make_csr(self):
csr_pem = self._call_with_key(["a.example", "b.example"])
self.assertIn(b'--BEGIN CERTIFICATE REQUEST--', csr_pem)
self.assertIn(b'--END CERTIFICATE REQUEST--', csr_pem)
self.assertTrue(b'--BEGIN CERTIFICATE REQUEST--' in csr_pem)
self.assertTrue(b'--END CERTIFICATE REQUEST--' in csr_pem)
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
@@ -272,27 +218,6 @@ class MakeCSRTest(unittest.TestCase):
).get_data(),
)
def test_make_csr_ip(self):
csr_pem = self._call_with_key(["a.example"], False, [ipaddress.ip_address('127.0.0.1'), ipaddress.ip_address('::1')])
self.assertIn(b'--BEGIN CERTIFICATE REQUEST--' , csr_pem)
self.assertIn(b'--END CERTIFICATE REQUEST--' , csr_pem)
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
# have a get_extensions() method, so we skip this test if the method
# isn't available.
if hasattr(csr, 'get_extensions'):
self.assertEqual(len(csr.get_extensions()), 1)
self.assertEqual(csr.get_extensions()[0].get_data(),
OpenSSL.crypto.X509Extension(
b'subjectAltName',
critical=False,
value=b'DNS:a.example, IP:127.0.0.1, IP:::1',
).get_data(),
)
# For IP SANs the value actually needs to be an octet string,
# but something downstream thankfully handles that for us
def test_make_csr_must_staple(self):
csr_pem = self._call_with_key(["a.example"], must_staple=True)
csr = OpenSSL.crypto.load_certificate_request(
@@ -311,17 +236,6 @@ class MakeCSRTest(unittest.TestCase):
self.assertEqual(len(must_staple_exts), 1,
"Expected exactly one Must Staple extension")
def test_make_csr_without_hostname(self):
self.assertRaises(ValueError, self._call_with_key)
def test_make_csr_correct_version(self):
csr_pem = self._call_with_key(["a.example"])
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
self.assertEqual(csr.get_version(), 0,
"Expected CSR version to be v1 (encoded as 0), per RFC 2986, section 4")
class DumpPyopensslChainTest(unittest.TestCase):
"""Test for dump_pyopenssl_chain."""

View File

@@ -1,17 +1,5 @@
"""ACME errors."""
import typing
from typing import Any
from typing import List
from typing import Mapping
from typing import Set
from josepy import errors as jose_errors
import requests
# We import acme.messages only during type check to avoid circular dependencies. Type references
# to acme.message.* must be quoted to be lazily initialized and avoid compilation errors.
if typing.TYPE_CHECKING:
from acme import messages # pragma: no cover
class Error(Exception):
@@ -40,12 +28,12 @@ class NonceError(ClientError):
class BadNonce(NonceError):
"""Bad nonce error."""
def __init__(self, nonce: str, error: Exception, *args: Any) -> None:
super().__init__(*args)
def __init__(self, nonce, error, *args, **kwargs):
super(BadNonce, self).__init__(*args, **kwargs)
self.nonce = nonce
self.error = error
def __str__(self) -> str:
def __str__(self):
return 'Invalid nonce ({0!r}): {1}'.format(self.nonce, self.error)
@@ -56,14 +44,14 @@ class MissingNonce(NonceError):
Replay-Nonce header field in each successful response to a POST it
provides to a client (...)".
:ivar requests.Response ~.response: HTTP Response
:ivar requests.Response response: HTTP Response
"""
def __init__(self, response: requests.Response, *args: Any) -> None:
super().__init__(*args)
def __init__(self, response, *args, **kwargs):
super(MissingNonce, self).__init__(*args, **kwargs)
self.response = response
def __str__(self) -> str:
def __str__(self):
return ('Server {0} response did not include a replay '
'nonce, headers: {1} (This may be a service outage)'.format(
self.response.request.method, self.response.headers))
@@ -81,48 +69,41 @@ class PollError(ClientError):
to the most recently updated one
"""
def __init__(self, exhausted: Set['messages.AuthorizationResource'],
updated: Mapping['messages.AuthorizationResource',
'messages.AuthorizationResource']
) -> None:
def __init__(self, exhausted, updated):
self.exhausted = exhausted
self.updated = updated
super().__init__()
super(PollError, self).__init__()
@property
def timeout(self) -> bool:
def timeout(self):
"""Was the error caused by timeout?"""
return bool(self.exhausted)
def __repr__(self) -> str:
def __repr__(self):
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
"""
def __init__(self, failed_authzrs: List['messages.AuthorizationResource']) -> None:
def __init__(self, failed_authzrs):
self.failed_authzrs = failed_authzrs
super().__init__()
super(ValidationError, self).__init__()
class TimeoutError(Error): # pylint: disable=redefined-builtin
class TimeoutError(Error):
"""Error for when polling an authorization or an order times out."""
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
def __init__(self, error: 'messages.Error') -> None:
def __init__(self, error):
"""Initialize.
:param messages.Error error: The error provided by the server.
"""
self.error = error
super().__init__()
super(IssuanceError, self).__init__()
class ConflictError(ClientError):
"""Error for when the server returns a 409 (Conflict) HTTP status.
@@ -132,9 +113,9 @@ class ConflictError(ClientError):
Also used in V2 of the ACME client for the same purpose.
"""
def __init__(self, location: str) -> None:
def __init__(self, location):
self.location = location
super().__init__()
super(ConflictError, self).__init__()
class WildcardUnsupportedError(Error):

View File

@@ -1,6 +1,7 @@
"""Tests for acme.errors."""
import unittest
from unittest import mock
import mock
class BadNonceTest(unittest.TestCase):
@@ -24,8 +25,8 @@ class MissingNonceTest(unittest.TestCase):
self.error = MissingNonce(self.response)
def test_str(self):
self.assertIn("FOO", str(self.error))
self.assertIn("{}", str(self.error))
self.assertTrue("FOO" in str(self.error))
self.assertTrue("{}" in str(self.error))
class PollErrorTest(unittest.TestCase):
@@ -34,7 +35,7 @@ class PollErrorTest(unittest.TestCase):
def setUp(self):
from acme.errors import PollError
self.timeout = PollError(
exhausted={mock.sentinel.AR},
exhausted=set([mock.sentinel.AR]),
updated={})
self.invalid = PollError(exhausted=set(), updated={
mock.sentinel.AR: mock.sentinel.AR2})

View File

@@ -1,33 +1,27 @@
"""ACME JSON fields."""
import datetime
import logging
import sys
from types import ModuleType
from typing import Any
from typing import cast
from typing import List
import warnings
import josepy as jose
import pyrfc3339
logger = logging.getLogger(__name__)
class Fixed(jose.Field):
"""Fixed field."""
def __init__(self, json_name: str, value: Any) -> None:
def __init__(self, json_name, value):
self.value = value
super().__init__(
super(Fixed, self).__init__(
json_name=json_name, default=value, omitempty=False)
def decode(self, value: Any) -> Any:
def decode(self, value):
if value != self.value:
raise jose.DeserializationError('Expected {0!r}'.format(self.value))
return self.value
def encode(self, value: Any) -> Any:
def encode(self, value):
if value != self.value:
logger.warning(
'Overriding fixed field (%s) with %r', self.json_name, value)
@@ -44,11 +38,11 @@ class RFC3339Field(jose.Field):
"""
@classmethod
def default_encoder(cls, value: datetime.datetime) -> str:
def default_encoder(cls, value):
return pyrfc3339.generate(value)
@classmethod
def default_decoder(cls, value: str) -> datetime.datetime:
def default_decoder(cls, value):
try:
return pyrfc3339.parse(value)
except ValueError as error:
@@ -56,70 +50,16 @@ class RFC3339Field(jose.Field):
class Resource(jose.Field):
"""Resource MITM field.
"""Resource MITM field."""
.. deprecated: 1.30.0
"""
def __init__(self, resource_type: str, *args: Any, **kwargs: Any) -> None:
def __init__(self, resource_type, *args, **kwargs):
self.resource_type = resource_type
kwargs['default'] = resource_type
super().__init__('resource', *args, **kwargs)
super(Resource, self).__init__(
'resource', default=resource_type, *args, **kwargs)
def decode(self, value: Any) -> Any:
def decode(self, value):
if value != self.resource_type:
raise jose.DeserializationError(
'Wrong resource type: {0} instead of {1}'.format(
value, self.resource_type))
return value
def fixed(json_name: str, value: Any) -> Any:
"""Generates a type-friendly Fixed field."""
return Fixed(json_name, value)
def rfc3339(json_name: str, omitempty: bool = False) -> Any:
"""Generates a type-friendly RFC3339 field."""
return RFC3339Field(json_name, omitempty=omitempty)
def resource(resource_type: str) -> Any:
"""Generates a type-friendly Resource field.
.. deprecated: 1.30.0
"""
return Resource(resource_type)
# This class takes a similar approach to the cryptography project to deprecate attributes
# in public modules. See the _ModuleWithDeprecation class here:
# https://github.com/pyca/cryptography/blob/91105952739442a74582d3e62b3d2111365b0dc7/src/cryptography/utils.py#L129
class _FieldsDeprecationModule: # pragma: no cover
"""
Internal class delegating to a module, and displaying warnings when
module attributes deprecated in acme.fields are accessed.
"""
def __init__(self, module: ModuleType) -> None:
self.__dict__['_module'] = module
def __getattr__(self, attr: str) -> None:
if attr in ('Resource', 'resource'):
warnings.warn('{0} attribute in acme.fields module is deprecated '
'and will be removed soon.'.format(attr),
DeprecationWarning, stacklevel=2)
return getattr(self._module, attr)
def __setattr__(self, attr: str, value: Any) -> None:
setattr(self._module, attr, value)
def __delattr__(self, attr: str) -> None:
delattr(self._module, attr)
def __dir__(self) -> List[str]:
return ['_module'] + dir(self._module)
sys.modules[__name__] = cast(ModuleType, _FieldsDeprecationModule(sys.modules[__name__]))
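A minimal sketch of how these field helpers are used to declare a JSON-(de)serializable object; the `Example` class, its field names, and the timestamp are illustrative, and whether you use `fields.Fixed`/`fields.RFC3339Field` directly or the `fixed`/`rfc3339` wrappers depends on which side of this diff you are on:
```python
import josepy as jose
import pyrfc3339

from acme import fields


class Example(jose.JSONObjectWithFields):
    """Toy resource used only to illustrate the field types above."""
    type = fields.Fixed('type', 'example')
    created = fields.RFC3339Field('created', omitempty=True)


obj = Example(created=pyrfc3339.parse('2019-08-09T11:16:28Z'))
serialized = obj.to_json()   # roughly {'type': 'example', 'created': '2019-08-09T11:16:28Z'}
roundtrip = Example.from_json(serialized)
```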

View File

@@ -1,7 +1,6 @@
"""Tests for acme.fields."""
import datetime
import unittest
import warnings
import josepy as jose
import pytz
@@ -11,8 +10,8 @@ class FixedTest(unittest.TestCase):
"""Tests for acme.fields.Fixed."""
def setUp(self):
from acme.fields import fixed
self.field = fixed('name', 'x')
from acme.fields import Fixed
self.field = Fixed('name', 'x')
def test_decode(self):
self.assertEqual('x', self.field.decode('x'))
@@ -59,10 +58,8 @@ class ResourceTest(unittest.TestCase):
"""Tests for acme.fields.Resource."""
def setUp(self):
with warnings.catch_warnings():
warnings.filterwarnings('ignore', '.*Resource', DeprecationWarning)
from acme.fields import Resource
self.field = Resource('x')
from acme.fields import Resource
self.field = Resource('x')
def test_decode_good(self):
self.assertEqual('x', self.field.decode('x'))

View File

@@ -2,7 +2,6 @@
import importlib
import unittest
class JoseTest(unittest.TestCase):
"""Tests for acme.jose shim."""
@@ -21,10 +20,11 @@ class JoseTest(unittest.TestCase):
# We use the imports below with eval, but pylint doesn't
# understand that.
import acme # pylint: disable=unused-import
import josepy # pylint: disable=unused-import
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
# pylint: disable=eval-used,unused-variable
import acme
import josepy
acme_jose_mod = eval(acme_jose_path)
josepy_mod = eval(josepy_path)
self.assertIs(acme_jose_mod, josepy_mod)
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))

View File

@@ -4,22 +4,18 @@ The JWS implementation in josepy only implements the base JOSE standard. In
order to support the new header fields defined in ACME, this module defines some
ACME-specific classes that layer on top of josepy.
"""
from typing import Optional
import josepy as jose
class Header(jose.Header):
"""ACME-specific JOSE Header. Implements nonce, kid, and url.
"""
nonce: Optional[bytes] = jose.field('nonce', omitempty=True, encoder=jose.encode_b64jose)
kid: Optional[str] = jose.field('kid', omitempty=True)
url: Optional[str] = jose.field('url', omitempty=True)
nonce = jose.Field('nonce', omitempty=True, encoder=jose.encode_b64jose)
kid = jose.Field('kid', omitempty=True)
url = jose.Field('url', omitempty=True)
# Mypy does not understand the josepy magic happening here, and falsely claims
# that nonce is redefined. Let's ignore the type check here.
@nonce.decoder # type: ignore[no-redef,union-attr]
def nonce(value: str) -> bytes: # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
@nonce.decoder
def nonce(value): # pylint: disable=missing-docstring,no-self-argument
try:
return jose.decode_b64jose(value)
except jose.DeserializationError as error:
@@ -29,12 +25,12 @@ class Header(jose.Header):
class Signature(jose.Signature):
"""ACME-specific Signature. Uses ACME-specific Header for customer fields."""
__slots__ = jose.Signature._orig_slots # type: ignore[attr-defined] # pylint: disable=protected-access,no-member
__slots__ = jose.Signature._orig_slots # pylint: disable=no-member
# TODO: decoder/encoder should accept cls? Otherwise, subclassing
# JSONObjectWithFields is tricky...
header_cls = Header
header: Header = jose.field(
header = jose.Field(
'header', omitempty=True, default=header_cls(),
decoder=header_cls.from_json)
@@ -44,16 +40,15 @@ class Signature(jose.Signature):
class JWS(jose.JWS):
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
signature_cls = Signature
__slots__ = jose.JWS._orig_slots # type: ignore[attr-defined] # pylint: disable=protected-access
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
@classmethod
# type: ignore[override] # pylint: disable=arguments-differ
def sign(cls, payload: bytes, key: jose.JWK, alg: jose.JWASignature, nonce: Optional[bytes],
url: Optional[str] = None, kid: Optional[str] = None) -> jose.JWS:
# pylint: disable=arguments-differ,too-many-arguments
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
# jwk field if kid is not provided.
include_jwk = kid is None
return super().sign(payload, key=key, alg=alg,
protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
nonce=nonce, url=url, kid=kid,
include_jwk=include_jwk)
return super(JWS, cls).sign(payload, key=key, alg=alg,
protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
nonce=nonce, url=url, kid=kid,
include_jwk=include_jwk)
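A hedged sketch of signing an ACME request body with this JWS class; the key path, nonce, URL, and kid below are placeholders, and the account key is assumed to be an RSA key in PEM form:
```python
import josepy as jose

from acme import jws as acme_jws

with open('account_key.pem', 'rb') as f:      # placeholder path
    key = jose.JWKRSA.load(f.read())

signed = acme_jws.JWS.sign(
    payload=b'{}', key=key, alg=jose.RS256,
    nonce=b'example-nonce',
    url='https://acme.example/new-order',
    kid='https://acme.example/acct/1')

# Because kid was supplied, the protected header carries kid and omits jwk.
print(signed.json_dumps(indent=2))
```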

View File

@@ -3,7 +3,8 @@ import unittest
import josepy as jose
import test_util
from acme import test_util
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
@@ -48,7 +49,7 @@ class JWSTest(unittest.TestCase):
self.assertEqual(jws.signature.combined.nonce, self.nonce)
self.assertEqual(jws.signature.combined.url, self.url)
self.assertEqual(jws.signature.combined.kid, self.kid)
self.assertIsNone(jws.signature.combined.jwk)
self.assertEqual(jws.signature.combined.jwk, None)
# TODO: check that nonce is in protected header
self.assertEqual(jws, JWS.from_json(jws.to_json()))
@@ -58,7 +59,7 @@ class JWSTest(unittest.TestCase):
jws = JWS.sign(payload=b'foo', key=self.privkey,
alg=jose.RS256, nonce=self.nonce,
url=self.url)
self.assertIsNone(jws.signature.combined.kid)
self.assertEqual(jws.signature.combined.kid, None)
self.assertEqual(jws.signature.combined.jwk, self.pubkey)

View File

@@ -1,18 +1,16 @@
"""Simple shim around the typing module.
"""Shim class to not have to depend on typing module in prod."""
import sys
This was useful when this code supported Python 2 and typing wasn't always
available. This code is being kept for now for backwards compatibility.
"""
import warnings
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
from typing import Any
warnings.warn("acme.magic_typing is deprecated and will be removed in a future release.",
DeprecationWarning)
class TypingClass:
class TypingClass(object):
"""Ignore import errors by getting anything"""
def __getattr__(self, name: str) -> Any:
return None # pragma: no cover
def __getattr__(self, name):
return None
try:
# mypy doesn't respect modifying sys.modules
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
# pylint: disable=unused-import
from typing import Collection, IO # type: ignore
# pylint: enable=unused-import
except ImportError:
sys.modules[__name__] = TypingClass()

View File

@@ -0,0 +1,41 @@
"""Tests for acme.magic_typing."""
import sys
import unittest
import mock
class MagicTypingTest(unittest.TestCase):
"""Tests for acme.magic_typing."""
def test_import_success(self):
try:
import typing as temp_typing
except ImportError: # pragma: no cover
temp_typing = None # pragma: no cover
typing_class_mock = mock.MagicMock()
text_mock = mock.MagicMock()
typing_class_mock.Text = text_mock
sys.modules['typing'] = typing_class_mock
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text # pylint: disable=no-name-in-module
self.assertEqual(Text, text_mock)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
def test_import_failure(self):
try:
import typing as temp_typing
except ImportError: # pragma: no cover
temp_typing = None # pragma: no cover
sys.modules['typing'] = None
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text # pylint: disable=no-name-in-module
self.assertTrue(Text is None)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,39 +1,18 @@
"""ACME protocol messages."""
import datetime
from collections.abc import Hashable
import json
from types import ModuleType
from typing import Any
from typing import cast
from typing import Dict
from typing import Iterator
from typing import List
from typing import Mapping
from typing import MutableMapping
from typing import Optional
from typing import Tuple
from typing import Type
from typing import TYPE_CHECKING
from typing import TypeVar
from typing import Union
import sys
import warnings
import six
try:
from collections.abc import Hashable # pylint: disable=no-name-in-module
except ImportError: # pragma: no cover
from collections import Hashable
import josepy as jose
from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import util
with warnings.catch_warnings():
warnings.filterwarnings("ignore", ".*acme.mixins", category=DeprecationWarning)
from acme.mixins import ResourceMixin
if TYPE_CHECKING:
from typing_extensions import Protocol # pragma: no cover
else:
Protocol = object
from acme import jws
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
@@ -54,7 +33,7 @@ ERROR_CODES = {
' domain'),
'dns': 'There was a problem with a DNS query during identifier validation',
'dnssec': 'The server could not validate a DNSSEC signed domain',
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
'incorrectResponse': 'Response recieved didn\'t match the challenge\'s requirements',
# deprecate invalidEmail
'invalidEmail': 'The provided email for a registration was invalid',
'invalidContact': 'The provided contact URI was invalid',
@@ -71,100 +50,40 @@ ERROR_CODES = {
'externalAccountRequired': 'The server requires external account binding',
}
ERROR_TYPE_DESCRIPTIONS = {**{
ERROR_PREFIX + name: desc for name, desc in ERROR_CODES.items()
}, **{ # add errors with old prefix, deprecate me
OLD_ERROR_PREFIX + name: desc for name, desc in ERROR_CODES.items()
}}
ERROR_TYPE_DESCRIPTIONS = dict(
(ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items())
ERROR_TYPE_DESCRIPTIONS.update(dict( # add errors with old prefix, deprecate me
(OLD_ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items()))
def is_acme_error(err: BaseException) -> bool:
def is_acme_error(err):
"""Check if argument is an ACME error."""
if isinstance(err, Error) and (err.typ is not None):
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
return False
class _Constant(jose.JSONDeSerializable, Hashable):
"""ACME constant."""
__slots__ = ('name',)
POSSIBLE_NAMES: Dict[str, '_Constant'] = NotImplemented
def __init__(self, name: str) -> None:
super().__init__()
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
self.name = name
def to_partial_json(self) -> str:
return self.name
@classmethod
def from_json(cls, jobj: str) -> '_Constant':
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(f'{cls.__name__} not recognized')
return cls.POSSIBLE_NAMES[jobj]
def __repr__(self) -> str:
return f'{self.__class__.__name__}({self.name})'
def __eq__(self, other: Any) -> bool:
return isinstance(other, type(self)) and other.name == self.name
def __hash__(self) -> int:
return hash((self.__class__, self.name))
class IdentifierType(_Constant):
"""ACME identifier type."""
POSSIBLE_NAMES: Dict[str, _Constant] = {}
IDENTIFIER_FQDN = IdentifierType('dns') # IdentifierDNS in Boulder
IDENTIFIER_IP = IdentifierType('ip') # IdentifierIP in pebble - not in Boulder yet
class Identifier(jose.JSONObjectWithFields):
"""ACME identifier.
:ivar IdentifierType typ:
:ivar str value:
"""
typ: IdentifierType = jose.field('type', decoder=IdentifierType.from_json)
value: str = jose.field('value')
@six.python_2_unicode_compatible
class Error(jose.JSONObjectWithFields, errors.Error):
"""ACME error.
https://datatracker.ietf.org/doc/html/rfc7807
https://tools.ietf.org/html/draft-ietf-appsawg-http-problem-00
:ivar str typ:
:ivar str title:
:ivar str detail:
:ivar Identifier identifier:
:ivar tuple subproblems: An array of ACME Errors which may be present when the CA
returns multiple errors related to the same request, `tuple` of `Error`.
:ivar unicode typ:
:ivar unicode title:
:ivar unicode detail:
"""
typ: str = jose.field('type', omitempty=True, default='about:blank')
title: str = jose.field('title', omitempty=True)
detail: str = jose.field('detail', omitempty=True)
identifier: Optional['Identifier'] = jose.field(
'identifier', decoder=Identifier.from_json, omitempty=True)
subproblems: Optional[Tuple['Error', ...]] = jose.field('subproblems', omitempty=True)
# Mypy does not understand the josepy magic happening here, and falsely claims
# that subproblems is redefined. Let's ignore the type check here.
@subproblems.decoder # type: ignore
def subproblems(value: List[Dict[str, Any]]) -> Tuple['Error', ...]: # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Error.from_json(subproblem) for subproblem in value)
typ = jose.Field('type', omitempty=True, default='about:blank')
title = jose.Field('title', omitempty=True)
detail = jose.Field('detail', omitempty=True)
@classmethod
def with_code(cls, code: str, **kwargs: Any) -> 'Error':
def with_code(cls, code, **kwargs):
"""Create an Error instance with an ACME Error code.
:str code: An ACME error code, like 'dnssec'.
:unicode code: An ACME error code, like 'dnssec'.
:kwargs: kwargs to pass to Error.
"""
@@ -172,53 +91,76 @@ class Error(jose.JSONObjectWithFields, errors.Error):
raise ValueError("The supplied code: %s is not a known ACME error"
" code" % code)
typ = ERROR_PREFIX + code
# Mypy will not understand that the Error constructor accepts a named argument
# "typ" because of josepy magic. Let's ignore the type check here.
return cls(typ=typ, **kwargs)
@property
def description(self) -> Optional[str]:
def description(self):
"""Hardcoded error description based on its type.
:returns: Description if standard ACME error or ``None``.
:rtype: str
:rtype: unicode
"""
return ERROR_TYPE_DESCRIPTIONS.get(self.typ)
@property
def code(self) -> Optional[str]:
def code(self):
"""ACME error code.
Basically self.typ without the ERROR_PREFIX.
:returns: error code if standard ACME code or ``None``.
:rtype: str
:rtype: unicode
"""
code = str(self.typ).rsplit(':', maxsplit=1)[-1]
code = str(self.typ).split(':')[-1]
if code in ERROR_CODES:
return code
return None
def __str__(self) -> str:
result = b' :: '.join(
def __str__(self):
return b' :: '.join(
part.encode('ascii', 'backslashreplace') for part in
(self.typ, self.description, self.detail, self.title)
if part is not None).decode()
if self.identifier:
result = f'Problem for {self.identifier.value}: ' + result # pylint: disable=no-member
if self.subproblems and len(self.subproblems) > 0:
for subproblem in self.subproblems:
result += f'\n{subproblem}'
return result
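A short sketch of constructing one of these errors by code; 'badNonce' is one of the standard codes in ERROR_CODES, and the detail text is illustrative:
```python
from acme import messages

err = messages.Error.with_code('badNonce', detail='Nonce was rejected by the server')

messages.is_acme_error(err)   # True
err.typ                       # 'urn:ietf:params:acme:error:badNonce'
err.code                      # 'badNonce'
err.description               # the hardcoded description from ERROR_CODES
```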
class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
"""ACME constant."""
__slots__ = ('name',)
POSSIBLE_NAMES = NotImplemented
def __init__(self, name):
super(_Constant, self).__init__()
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
self.name = name
def to_partial_json(self):
return self.name
@classmethod
def from_json(cls, jobj):
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(
'{0} not recognized'.format(cls.__name__))
return cls.POSSIBLE_NAMES[jobj] # pylint: disable=unsubscriptable-object
def __repr__(self):
return '{0}({1})'.format(self.__class__.__name__, self.name)
def __eq__(self, other):
return isinstance(other, type(self)) and other.name == self.name
def __hash__(self):
return hash((self.__class__, self.name))
def __ne__(self, other):
return not self == other
class Status(_Constant):
"""ACME "status" field."""
POSSIBLE_NAMES: Dict[str, _Constant] = {}
POSSIBLE_NAMES = {} # type: dict
STATUS_UNKNOWN = Status('unknown')
STATUS_PENDING = Status('pending')
STATUS_PROCESSING = Status('processing')
@@ -229,95 +171,90 @@ STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')
class HasResourceType(Protocol):
"""
Represents a class with a resource_type class parameter of type string.
"""
resource_type: str = NotImplemented
class IdentifierType(_Constant):
"""ACME identifier type."""
POSSIBLE_NAMES = {} # type: dict
IDENTIFIER_FQDN = IdentifierType('dns') # IdentifierDNS in Boulder
GenericHasResourceType = TypeVar("GenericHasResourceType", bound=HasResourceType)
class Identifier(jose.JSONObjectWithFields):
"""ACME identifier.
:ivar IdentifierType typ:
:ivar unicode value:
"""
typ = jose.Field('type', decoder=IdentifierType.from_json)
value = jose.Field('value')
class Directory(jose.JSONDeSerializable):
"""Directory."""
_REGISTERED_TYPES: Dict[str, Type[HasResourceType]] = {}
_REGISTERED_TYPES = {} # type: dict
class Meta(jose.JSONObjectWithFields):
"""Directory Meta."""
_terms_of_service: str = jose.field('terms-of-service', omitempty=True)
_terms_of_service_v2: str = jose.field('termsOfService', omitempty=True)
website: str = jose.field('website', omitempty=True)
caa_identities: List[str] = jose.field('caaIdentities', omitempty=True)
external_account_required: bool = jose.field('externalAccountRequired', omitempty=True)
_terms_of_service = jose.Field('terms-of-service', omitempty=True)
_terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
website = jose.Field('website', omitempty=True)
caa_identities = jose.Field('caaIdentities', omitempty=True)
external_account_required = jose.Field('externalAccountRequired', omitempty=True)
def __init__(self, **kwargs: Any) -> None:
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super().__init__(**kwargs)
def __init__(self, **kwargs):
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
super(Directory.Meta, self).__init__(**kwargs)
@property
def terms_of_service(self) -> str:
def terms_of_service(self):
"""URL for the CA TOS"""
return self._terms_of_service or self._terms_of_service_v2
def __iter__(self) -> Iterator[str]:
def __iter__(self):
# When iterating over fields, use the external name 'terms_of_service' instead of
# the internal '_terms_of_service'.
for name in super().__iter__():
for name in super(Directory.Meta, self).__iter__():
yield name[1:] if name == '_terms_of_service' else name
def _internal_name(self, name: str) -> str:
def _internal_name(self, name):
return '_' + name if name == 'terms_of_service' else name
@classmethod
def _canon_key(cls, key: Union[str, HasResourceType, Type[HasResourceType]]) -> str:
if isinstance(key, str):
return key
return key.resource_type
@classmethod
def register(cls,
resource_body_cls: Type[GenericHasResourceType]) -> Type[GenericHasResourceType]:
def _canon_key(cls, key):
return getattr(key, 'resource_type', key)
@classmethod
def register(cls, resource_body_cls):
"""Register resource."""
warnings.warn(
"acme.messages.Directory.register is deprecated and will be removed in the next "
"major release of Certbot", DeprecationWarning, stacklevel=2
)
resource_type = resource_body_cls.resource_type
assert resource_type not in cls._REGISTERED_TYPES
cls._REGISTERED_TYPES[resource_type] = resource_body_cls
return resource_body_cls
def __init__(self, jobj: Mapping[str, Any]) -> None:
def __init__(self, jobj):
canon_jobj = util.map_keys(jobj, self._canon_key)
# TODO: check that everything is an absolute URL; acme-spec is
# not clear on that
self._jobj = canon_jobj
def __getattr__(self, name: str) -> Any:
def __getattr__(self, name):
try:
return self[name.replace('_', '-')]
except KeyError as error:
raise AttributeError(str(error))
raise AttributeError(str(error) + ': ' + name)
def __getitem__(self, name: Union[str, HasResourceType, Type[HasResourceType]]) -> Any:
if not isinstance(name, str):
warnings.warn(
"Looking up acme.messages.Directory resources by non-string keys is deprecated "
"and will be removed in the next major release of Certbot",
DeprecationWarning, stacklevel=2
)
def __getitem__(self, name):
try:
return self._jobj[self._canon_key(name)]
except KeyError:
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
raise KeyError('Directory field not found')
def to_partial_json(self) -> Dict[str, Any]:
def to_partial_json(self):
return self._jobj
@classmethod
def from_json(cls, jobj: MutableMapping[str, Any]) -> 'Directory':
def from_json(cls, jobj):
jobj['meta'] = cls.Meta.from_json(jobj.pop('meta', {}))
return cls(jobj)
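A minimal sketch of building a Directory from a JSON mapping and reading it back; the URLs are placeholders, not a real ACME directory:
```python
from acme import messages

directory = messages.Directory.from_json({
    'newAccount': 'https://acme.example/new-acct',
    'newOrder': 'https://acme.example/new-order',
    'meta': {'termsOfService': 'https://acme.example/terms'},
})

directory.newAccount              # attribute access is routed through __getattr__
directory['newOrder']             # item access goes through _canon_key
directory.meta.terms_of_service   # served from the v2 'termsOfService' field
```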
@@ -328,28 +265,27 @@ class Resource(jose.JSONObjectWithFields):
:ivar acme.messages.ResourceBody body: Resource body.
"""
body: "ResourceBody" = jose.field('body')
body = jose.Field('body')
class ResourceWithURI(Resource):
"""ACME Resource with URI.
:ivar str uri: Location of the resource.
:ivar unicode uri: Location of the resource.
"""
uri: str = jose.field('uri') # no ChallengeResource.uri
uri = jose.Field('uri') # no ChallengeResource.uri
class ResourceBody(jose.JSONObjectWithFields):
"""ACME Resource Body."""
class ExternalAccountBinding:
class ExternalAccountBinding(object):
"""ACME External Account Binding"""
@classmethod
def from_data(cls, account_public_key: jose.JWK, kid: str, hmac_key: str,
directory: Directory) -> Dict[str, Any]:
def from_data(cls, account_public_key, kid, hmac_key, directory):
"""Create External Account Binding Resource from contact details, kid and hmac."""
key_json = json.dumps(account_public_key.to_partial_json()).encode()
@@ -363,129 +299,83 @@ class ExternalAccountBinding:
return eab.to_partial_json()
GenericRegistration = TypeVar('GenericRegistration', bound='Registration')
class Registration(ResourceBody):
"""Registration Resource Body.
:ivar jose.JWK key: Public key.
:ivar josepy.jwk.JWK key: Public key.
:ivar tuple contact: Contact information following ACME spec,
`tuple` of `str`.
:ivar str agreement:
`tuple` of `unicode`.
:ivar unicode agreement:
"""
# on new-reg key server ignores 'key' and populates it based on
# JWS.signature.combined.jwk
key: jose.JWK = jose.field('key', omitempty=True, decoder=jose.JWK.from_json)
# Contact field implements special behavior to allow messages that clear existing
# contacts while not expecting the `contact` field when loading from json.
# This is implemented in the constructor and *_json methods.
contact: Tuple[str, ...] = jose.field('contact', omitempty=True, default=())
agreement: str = jose.field('agreement', omitempty=True)
status: Status = jose.field('status', omitempty=True)
terms_of_service_agreed: bool = jose.field('termsOfServiceAgreed', omitempty=True)
only_return_existing: bool = jose.field('onlyReturnExisting', omitempty=True)
external_account_binding: Dict[str, Any] = jose.field('externalAccountBinding',
omitempty=True)
key = jose.Field('key', omitempty=True, decoder=jose.JWK.from_json)
contact = jose.Field('contact', omitempty=True, default=())
agreement = jose.Field('agreement', omitempty=True)
status = jose.Field('status', omitempty=True)
terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
only_return_existing = jose.Field('onlyReturnExisting', omitempty=True)
external_account_binding = jose.Field('externalAccountBinding', omitempty=True)
phone_prefix = 'tel:'
email_prefix = 'mailto:'
@classmethod
def from_data(cls: Type[GenericRegistration], phone: Optional[str] = None,
email: Optional[str] = None,
external_account_binding: Optional[Dict[str, Any]] = None,
**kwargs: Any) -> GenericRegistration:
"""
Create registration resource from contact details.
The `contact` keyword being passed to a Registration object is meaningful, so
this function represents empty iterables in its kwargs by passing on an empty
`tuple`.
"""
# Note if `contact` was in kwargs.
contact_provided = 'contact' in kwargs
# Pop `contact` from kwargs and add formatted email or phone numbers
def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
"""Create registration resource from contact details."""
details = list(kwargs.pop('contact', ()))
if phone is not None:
details.append(cls.phone_prefix + phone)
if email is not None:
details.extend([cls.email_prefix + mail for mail in email.split(',')])
# Insert formatted contact information back into kwargs
# or insert an empty tuple if `contact` provided.
if details or contact_provided:
kwargs['contact'] = tuple(details)
kwargs['contact'] = tuple(details)
if external_account_binding:
kwargs['external_account_binding'] = external_account_binding
return cls(**kwargs)
def __init__(self, **kwargs: Any) -> None:
"""Note if the user provides a value for the `contact` member."""
if 'contact' in kwargs and kwargs['contact'] is not None:
# Avoid the __setattr__ used by jose.TypedJSONObjectWithFields
object.__setattr__(self, '_add_contact', True)
super().__init__(**kwargs)
def _filter_contact(self, prefix: str) -> Tuple[str, ...]:
def _filter_contact(self, prefix):
return tuple(
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
if detail.startswith(prefix))
def _add_contact_if_appropriate(self, jobj: Dict[str, Any]) -> Dict[str, Any]:
"""
The `contact` member of Registration objects should not be required when
de-serializing (as it would be if the Fields' `omitempty` flag were `False`), but
it should be included in serializations if it was provided.
:param jobj: Dictionary containing this Registrations' data
:type jobj: dict
:returns: Dictionary containing Registrations data to transmit to the server
:rtype: dict
"""
if getattr(self, '_add_contact', False):
jobj['contact'] = self.encode('contact')
return jobj
def to_partial_json(self) -> Dict[str, Any]:
"""Modify josepy.JSONDeserializable.to_partial_json()"""
jobj = super().to_partial_json()
return self._add_contact_if_appropriate(jobj)
def fields_to_partial_json(self) -> Dict[str, Any]:
"""Modify josepy.JSONObjectWithFields.fields_to_partial_json()"""
jobj = super().fields_to_partial_json()
return self._add_contact_if_appropriate(jobj)
@property
def phones(self) -> Tuple[str, ...]:
def phones(self):
"""All phones found in the ``contact`` field."""
return self._filter_contact(self.phone_prefix)
@property
def emails(self) -> Tuple[str, ...]:
def emails(self):
"""All emails found in the ``contact`` field."""
return self._filter_contact(self.email_prefix)
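A small sketch of Registration.from_data with contact details; the email addresses are placeholders:
```python
from acme import messages

reg = messages.Registration.from_data(
    email='admin@example.com,ops@example.com',
    terms_of_service_agreed=True)

reg.contact   # ('mailto:admin@example.com', 'mailto:ops@example.com')
reg.emails    # ('admin@example.com', 'ops@example.com')
reg.phones    # ()
```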
@Directory.register
class NewRegistration(Registration):
"""New registration."""
resource_type = 'new-reg'
resource = fields.Resource(resource_type)
class UpdateRegistration(Registration):
"""Update registration."""
resource_type = 'reg'
resource = fields.Resource(resource_type)
class RegistrationResource(ResourceWithURI):
"""Registration Resource.
:ivar acme.messages.Registration body:
:ivar str new_authzr_uri: Deprecated. Do not use.
:ivar str terms_of_service: URL for the CA TOS.
:ivar unicode new_authzr_uri: Deprecated. Do not use.
:ivar unicode terms_of_service: URL for the CA TOS.
"""
body: Registration = jose.field('body', decoder=Registration.from_json)
new_authzr_uri: str = jose.field('new_authzr_uri', omitempty=True)
terms_of_service: str = jose.field('terms_of_service', omitempty=True)
body = jose.Field('body', decoder=Registration.from_json)
new_authzr_uri = jose.Field('new_authzr_uri', omitempty=True)
terms_of_service = jose.Field('terms_of_service', omitempty=True)
class ChallengeBody(ResourceBody):
@@ -510,47 +400,47 @@ class ChallengeBody(ResourceBody):
# challenge object supports either one, but should be accessed through the
# name "uri". In Client.answer_challenge, whichever one is set will be
# used.
_uri: str = jose.field('uri', omitempty=True, default=None)
_url: str = jose.field('url', omitempty=True, default=None)
status: Status = jose.field('status', decoder=Status.from_json,
_uri = jose.Field('uri', omitempty=True, default=None)
_url = jose.Field('url', omitempty=True, default=None)
status = jose.Field('status', decoder=Status.from_json,
omitempty=True, default=STATUS_PENDING)
validated: datetime.datetime = fields.rfc3339('validated', omitempty=True)
error: Error = jose.field('error', decoder=Error.from_json,
validated = fields.RFC3339Field('validated', omitempty=True)
error = jose.Field('error', decoder=Error.from_json,
omitempty=True, default=None)
def __init__(self, **kwargs: Any) -> None:
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super().__init__(**kwargs)
def __init__(self, **kwargs):
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
super(ChallengeBody, self).__init__(**kwargs)
def encode(self, name: str) -> Any:
return super().encode(self._internal_name(name))
def encode(self, name):
return super(ChallengeBody, self).encode(self._internal_name(name))
def to_partial_json(self) -> Dict[str, Any]:
jobj = super().to_partial_json()
def to_partial_json(self):
jobj = super(ChallengeBody, self).to_partial_json()
jobj.update(self.chall.to_partial_json())
return jobj
@classmethod
def fields_from_json(cls, jobj: Mapping[str, Any]) -> Dict[str, Any]:
jobj_fields = super().fields_from_json(jobj)
def fields_from_json(cls, jobj):
jobj_fields = super(ChallengeBody, cls).fields_from_json(jobj)
jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
return jobj_fields
@property
def uri(self) -> str:
def uri(self):
"""The URL of this challenge."""
return self._url or self._uri
def __getattr__(self, name: str) -> Any:
def __getattr__(self, name):
return getattr(self.chall, name)
def __iter__(self) -> Iterator[str]:
def __iter__(self):
# When iterating over fields, use the external name 'uri' instead of
# the internal '_uri'.
for name in super().__iter__():
for name in super(ChallengeBody, self).__iter__():
yield name[1:] if name == '_uri' else name
def _internal_name(self, name: str) -> str:
def _internal_name(self, name):
return '_' + name if name == 'uri' else name
@@ -558,16 +448,17 @@ class ChallengeResource(Resource):
"""Challenge Resource.
:ivar acme.messages.ChallengeBody body:
:ivar str authzr_uri: URI found in the 'up' ``Link`` header.
:ivar unicode authzr_uri: URI found in the 'up' ``Link`` header.
"""
body: ChallengeBody = jose.field('body', decoder=ChallengeBody.from_json)
authzr_uri: str = jose.field('authzr_uri')
body = jose.Field('body', decoder=ChallengeBody.from_json)
authzr_uri = jose.Field('authzr_uri')
@property
def uri(self) -> str:
def uri(self):
"""The URL of the challenge body."""
return self.body.uri # pylint: disable=no-member
# pylint: disable=function-redefined,no-member
return self.body.uri
class Authorization(ResourceBody):
@@ -576,81 +467,69 @@ class Authorization(ResourceBody):
:ivar acme.messages.Identifier identifier:
:ivar list challenges: `list` of `.ChallengeBody`
:ivar tuple combinations: Challenge combinations (`tuple` of `tuple`
of `int`, as opposed to `list` of `list` from the spec). (deprecated since 1.30.0)
of `int`, as opposed to `list` of `list` from the spec).
:ivar acme.messages.Status status:
:ivar datetime.datetime expires:
"""
identifier: Identifier = jose.field('identifier', decoder=Identifier.from_json, omitempty=True)
challenges: List[ChallengeBody] = jose.field('challenges', omitempty=True)
_combinations: Tuple[Tuple[int, ...], ...] = jose.field('combinations', omitempty=True)
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
challenges = jose.Field('challenges', omitempty=True)
combinations = jose.Field('combinations', omitempty=True)
status: Status = jose.field('status', omitempty=True, decoder=Status.from_json)
status = jose.Field('status', omitempty=True, decoder=Status.from_json)
# TODO: 'expires' is allowed for Authorization Resources in
# general, but for Key Authorization '[t]he "expires" field MUST
# be absent'... then acme-spec gives example with 'expires'
# present... That's confusing!
expires: datetime.datetime = fields.rfc3339('expires', omitempty=True)
wildcard: bool = jose.field('wildcard', omitempty=True)
expires = fields.RFC3339Field('expires', omitempty=True)
wildcard = jose.Field('wildcard', omitempty=True)
# combinations is temporarily renamed to _combinations during its deprecation
# period. See https://github.com/certbot/certbot/pull/9369#issuecomment-1199849262.
def __init__(self, **kwargs: Any) -> None:
if 'combinations' in kwargs:
kwargs['_combinations'] = kwargs.pop('combinations')
super().__init__(**kwargs)
# Mypy does not understand the josepy magic happening here, and falsely claims
# that challenge is redefined. Let's ignore the type check here.
@challenges.decoder # type: ignore
def challenges(value: List[Dict[str, Any]]) -> Tuple[ChallengeBody, ...]: # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
@challenges.decoder
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
return tuple(ChallengeBody.from_json(chall) for chall in value)
@property
def combinations(self) -> Tuple[Tuple[int, ...], ...]:
"""Challenge combinations.
(`tuple` of `tuple` of `int`, as opposed to `list` of `list` from the spec).
def resolved_combinations(self):
"""Combinations with challenges instead of indices."""
return tuple(tuple(self.challenges[idx] for idx in combo)
for combo in self.combinations) # pylint: disable=not-an-iterable
.. deprecated:: 1.30.0
"""
warnings.warn(
"acme.messages.Authorization.combinations is deprecated and will be "
"removed in a future release.", DeprecationWarning, stacklevel=2)
return self._combinations
@Directory.register
class NewAuthorization(Authorization):
"""New authorization."""
resource_type = 'new-authz'
resource = fields.Resource(resource_type)
@combinations.setter
def combinations(self, combos: Tuple[Tuple[int, ...], ...]) -> None: # pragma: no cover
warnings.warn(
"acme.messages.Authorization.combinations is deprecated and will be "
"removed in a future release.", DeprecationWarning, stacklevel=2)
self._combinations = combos
@property
def resolved_combinations(self) -> Tuple[Tuple[ChallengeBody, ...], ...]:
"""Combinations with challenges instead of indices.
.. deprecated:: 1.30.0
"""
warnings.warn(
"acme.messages.Authorization.resolved_combinations is deprecated and will be "
"removed in a future release.", DeprecationWarning, stacklevel=2)
with warnings.catch_warnings():
warnings.filterwarnings('ignore', '.*combinations', DeprecationWarning)
return tuple(tuple(self.challenges[idx] for idx in combo)
for combo in self.combinations) # pylint: disable=not-an-iterable
class UpdateAuthorization(Authorization):
"""Update authorization."""
resource_type = 'authz'
resource = fields.Resource(resource_type)
class AuthorizationResource(ResourceWithURI):
"""Authorization Resource.
:ivar acme.messages.Authorization body:
:ivar str new_cert_uri: Deprecated. Do not use.
:ivar unicode new_cert_uri: Deprecated. Do not use.
"""
body: Authorization = jose.field('body', decoder=Authorization.from_json)
new_cert_uri: str = jose.field('new_cert_uri', omitempty=True)
body = jose.Field('body', decoder=Authorization.from_json)
new_cert_uri = jose.Field('new_cert_uri', omitempty=True)
@Directory.register
class CertificateRequest(jose.JSONObjectWithFields):
"""ACME new-cert request.
:ivar josepy.util.ComparableX509 csr:
`OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
"""
resource_type = 'new-cert'
resource = fields.Resource(resource_type)
csr = jose.Field('csr', decoder=jose.decode_csr, encoder=jose.encode_csr)
class CertificateResource(ResourceWithURI):
@@ -658,157 +537,70 @@ class CertificateResource(ResourceWithURI):
:ivar josepy.util.ComparableX509 body:
`OpenSSL.crypto.X509` wrapped in `.ComparableX509`
:ivar str cert_chain_uri: URI found in the 'up' ``Link`` header
:ivar unicode cert_chain_uri: URI found in the 'up' ``Link`` header
:ivar tuple authzrs: `tuple` of `AuthorizationResource`.
"""
cert_chain_uri: str = jose.field('cert_chain_uri')
authzrs: Tuple[AuthorizationResource, ...] = jose.field('authzrs')
cert_chain_uri = jose.Field('cert_chain_uri')
authzrs = jose.Field('authzrs')
@Directory.register
class Revocation(jose.JSONObjectWithFields):
"""Revocation message.
:ivar .ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
`.ComparableX509`
"""
resource_type = 'revoke-cert'
resource = fields.Resource(resource_type)
certificate = jose.Field(
'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
reason = jose.Field('reason')
class Order(ResourceBody):
"""Order Resource Body.
:ivar identifiers: List of identifiers for the certificate.
:vartype identifiers: `list` of `.Identifier`
:ivar list of .Identifier: List of identifiers for the certificate.
:ivar acme.messages.Status status:
:ivar authorizations: URLs of authorizations.
:vartype authorizations: `list` of `str`
:ivar list of str authorizations: URLs of authorizations.
:ivar str certificate: URL to download certificate as a fullchain PEM.
:ivar str finalize: URL to POST to to request issuance once all
authorizations have "valid" status.
:ivar datetime.datetime expires: When the order expires.
:ivar ~.Error error: Any error that occurred during finalization, if applicable.
:ivar .Error error: Any error that occurred during finalization, if applicable.
"""
identifiers: List[Identifier] = jose.field('identifiers', omitempty=True)
status: Status = jose.field('status', decoder=Status.from_json, omitempty=True)
authorizations: List[str] = jose.field('authorizations', omitempty=True)
certificate: str = jose.field('certificate', omitempty=True)
finalize: str = jose.field('finalize', omitempty=True)
expires: datetime.datetime = fields.rfc3339('expires', omitempty=True)
error: Error = jose.field('error', omitempty=True, decoder=Error.from_json)
identifiers = jose.Field('identifiers', omitempty=True)
status = jose.Field('status', decoder=Status.from_json,
omitempty=True)
authorizations = jose.Field('authorizations', omitempty=True)
certificate = jose.Field('certificate', omitempty=True)
finalize = jose.Field('finalize', omitempty=True)
expires = fields.RFC3339Field('expires', omitempty=True)
error = jose.Field('error', omitempty=True, decoder=Error.from_json)
# Mypy does not understand the josepy magic happening here, and falsely claims
# that identifiers is redefined. Let's ignore the type check here.
@identifiers.decoder # type: ignore
def identifiers(value: List[Dict[str, Any]]) -> Tuple[Identifier, ...]: # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
@identifiers.decoder
def identifiers(value): # pylint: disable=missing-docstring,no-self-argument
return tuple(Identifier.from_json(identifier) for identifier in value)
class OrderResource(ResourceWithURI):
"""Order Resource.
:ivar acme.messages.Order body:
:ivar bytes csr_pem: The CSR this Order will be finalized with.
:ivar authorizations: Fully-fetched AuthorizationResource objects.
:vartype authorizations: `list` of `acme.messages.AuthorizationResource`
:ivar str csr_pem: The CSR this Order will be finalized with.
:ivar list of acme.messages.AuthorizationResource authorizations:
Fully-fetched AuthorizationResource objects.
:ivar str fullchain_pem: The fetched contents of the certificate URL
produced once the order was finalized, if it's present.
:ivar alternative_fullchains_pem: The fetched contents of alternative certificate
chain URLs produced once the order was finalized, if present and requested during
finalization.
:vartype alternative_fullchains_pem: `list` of `str`
"""
body: Order = jose.field('body', decoder=Order.from_json)
csr_pem: bytes = jose.field('csr_pem', omitempty=True)
authorizations: List[AuthorizationResource] = jose.field('authorizations')
fullchain_pem: str = jose.field('fullchain_pem', omitempty=True)
alternative_fullchains_pem: List[str] = jose.field('alternative_fullchains_pem',
omitempty=True)
body = jose.Field('body', decoder=Order.from_json)
csr_pem = jose.Field('csr_pem', omitempty=True)
authorizations = jose.Field('authorizations')
fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
with warnings.catch_warnings():
warnings.filterwarnings("ignore", "acme.messages.Directory.register", DeprecationWarning)
warnings.filterwarnings("ignore", "resource attribute in acme.fields", DeprecationWarning)
@Directory.register
class NewOrder(Order):
"""New order."""
resource_type = 'new-order'
@Directory.register
class Revocation(ResourceMixin, jose.JSONObjectWithFields):
"""Revocation message.
:ivar jose.ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
`jose.ComparableX509`
"""
resource_type = 'revoke-cert'
resource: str = fields.resource(resource_type)
certificate: jose.ComparableX509 = jose.field(
'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
reason: int = jose.field('reason')
@Directory.register
class CertificateRequest(ResourceMixin, jose.JSONObjectWithFields):
"""ACME new-cert request.
:ivar jose.ComparableX509 csr:
`OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
"""
resource_type = 'new-cert'
resource: str = fields.resource(resource_type)
csr: jose.ComparableX509 = jose.field('csr', decoder=jose.decode_csr,
encoder=jose.encode_csr)
@Directory.register
class NewAuthorization(ResourceMixin, Authorization):
"""New authorization."""
resource_type = 'new-authz'
resource: str = fields.resource(resource_type)
class UpdateAuthorization(ResourceMixin, Authorization):
"""Update authorization."""
resource_type = 'authz'
resource: str = fields.resource(resource_type)
@Directory.register
class NewRegistration(ResourceMixin, Registration):
"""New registration."""
resource_type = 'new-reg'
resource: str = fields.resource(resource_type)
class UpdateRegistration(ResourceMixin, Registration):
"""Update registration."""
resource_type = 'reg'
resource: str = fields.resource(resource_type)
# This class takes a similar approach to the cryptography project to deprecate attributes
# in public modules. See the _ModuleWithDeprecation class here:
# https://github.com/pyca/cryptography/blob/91105952739442a74582d3e62b3d2111365b0dc7/src/cryptography/utils.py#L129
class _MessagesDeprecationModule: # pragma: no cover
"""
Internal class delegating to a module, and displaying warnings when
module attributes deprecated in acme.messages are accessed.
"""
def __init__(self, module: ModuleType) -> None:
self.__dict__['_module'] = module
def __getattr__(self, attr: str) -> None:
if attr == 'OLD_ERROR_PREFIX':
warnings.warn('{0} attribute in acme.messages module is deprecated '
'and will be removed soon.'.format(attr),
DeprecationWarning, stacklevel=2)
return getattr(self._module, attr)
def __setattr__(self, attr: str, value: Any) -> None:
setattr(self._module, attr, value)
def __delattr__(self, attr: str) -> None:
delattr(self._module, attr)
def __dir__(self) -> List[str]:
return ['_module'] + dir(self._module)
# Patching ourselves to warn about acme.messages.OLD_ERROR_PREFIX deprecation and planned removal.
sys.modules[__name__] = cast(ModuleType, _MessagesDeprecationModule(sys.modules[__name__]))
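A hedged sketch of what this wrapper does in practice, assuming `OLD_ERROR_PREFIX` is still defined on the module (the exact warning text may differ):
```
import warnings

from acme import messages

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    _ = messages.OLD_ERROR_PREFIX  # attribute access is delegated through the wrapper
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```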
@Directory.register
class NewOrder(Order):
"""New order."""
resource_type = 'new-order'


@@ -1,13 +1,13 @@
"""Tests for acme.messages."""
from typing import Dict
import unittest
from unittest import mock
import warnings
import josepy as jose
import mock
from acme import challenges
import test_util
from acme import test_util
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
CERT = test_util.load_comparable_cert('cert.der')
CSR = test_util.load_comparable_csr('csr.der')
@@ -18,18 +18,17 @@ class ErrorTest(unittest.TestCase):
"""Tests for acme.messages.Error."""
def setUp(self):
from acme.messages import Error, ERROR_PREFIX, Identifier, IDENTIFIER_FQDN
self.error = Error.with_code('malformed', detail='foo', title='title')
from acme.messages import Error, ERROR_PREFIX
self.error = Error(
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
self.jobj = {
'detail': 'foo',
'title': 'some title',
'type': ERROR_PREFIX + 'malformed',
}
self.error_custom = Error(typ='custom', detail='bar')
self.identifier = Identifier(typ=IDENTIFIER_FQDN, value='example.com')
self.subproblem = Error.with_code('caa', detail='bar', title='title', identifier=self.identifier)
self.error_with_subproblems = Error.with_code('malformed', detail='foo', title='title', subproblems=[self.subproblem])
self.empty_error = Error()
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
def test_default_typ(self):
from acme.messages import Error
@@ -43,36 +42,29 @@ class ErrorTest(unittest.TestCase):
from acme.messages import Error
hash(Error.from_json(self.error.to_json()))
def test_from_json_with_subproblems(self):
from acme.messages import Error
parsed_error = Error.from_json(self.error_with_subproblems.to_json())
self.assertEqual(1, len(parsed_error.subproblems))
self.assertEqual(self.subproblem, parsed_error.subproblems[0])
def test_description(self):
self.assertEqual('The request message was malformed', self.error.description)
self.assertIsNone(self.error_custom.description)
self.assertEqual(
'The request message was malformed', self.error.description)
self.assertTrue(self.error_custom.description is None)
def test_code(self):
from acme.messages import Error
self.assertEqual('malformed', self.error.code)
self.assertIsNone(self.error_custom.code)
self.assertIsNone(Error().code)
self.assertEqual(None, self.error_custom.code)
self.assertEqual(None, Error().code)
def test_is_acme_error(self):
from acme.messages import is_acme_error, Error
from acme.messages import is_acme_error
self.assertTrue(is_acme_error(self.error))
self.assertFalse(is_acme_error(self.error_custom))
self.assertFalse(is_acme_error(Error()))
self.assertFalse(is_acme_error(self.empty_error))
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
def test_unicode_error(self):
from acme.messages import Error, is_acme_error
arabic_error = Error.with_code(
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
from acme.messages import Error, ERROR_PREFIX, is_acme_error
arabic_error = Error(
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
title='title')
self.assertTrue(is_acme_error(arabic_error))
def test_with_code(self):
@@ -85,11 +77,7 @@ class ErrorTest(unittest.TestCase):
str(self.error),
u"{0.typ} :: {0.description} :: {0.detail} :: {0.title}"
.format(self.error))
self.assertEqual(
str(self.error_with_subproblems),
(u"{0.typ} :: {0.description} :: {0.detail} :: {0.title}\n"+
u"Problem for {1.identifier.value}: {1.typ} :: {1.description} :: {1.detail} :: {1.title}").format(
self.error_with_subproblems, self.subproblem))
class ConstantTest(unittest.TestCase):
"""Tests for acme.messages._Constant."""
@@ -98,7 +86,7 @@ class ConstantTest(unittest.TestCase):
from acme.messages import _Constant
class MockConstant(_Constant): # pylint: disable=missing-docstring
POSSIBLE_NAMES: Dict = {}
POSSIBLE_NAMES = {} # type: Dict
self.MockConstant = MockConstant # pylint: disable=invalid-name
self.const_a = MockConstant('a')
@@ -122,11 +110,11 @@ class ConstantTest(unittest.TestCase):
def test_equality(self):
const_a_prime = self.MockConstant('a')
self.assertNotEqual(self.const_a, self.const_b)
self.assertEqual(self.const_a, const_a_prime)
self.assertFalse(self.const_a == self.const_b)
self.assertTrue(self.const_a == const_a_prime)
self.assertNotEqual(self.const_a, self.const_b)
self.assertEqual(self.const_a, const_a_prime)
self.assertTrue(self.const_a != self.const_b)
self.assertFalse(self.const_a != const_a_prime)
class DirectoryTest(unittest.TestCase):
@@ -151,10 +139,8 @@ class DirectoryTest(unittest.TestCase):
def test_getitem(self):
self.assertEqual('reg', self.dir['new-reg'])
from acme.messages import NewRegistration
with warnings.catch_warnings():
warnings.filterwarnings('ignore', '.* non-string keys', DeprecationWarning)
self.assertEqual('reg', self.dir[NewRegistration])
self.assertEqual('reg', self.dir[NewRegistration()])
self.assertEqual('reg', self.dir[NewRegistration])
self.assertEqual('reg', self.dir[NewRegistration()])
def test_getitem_fails_with_key_error(self):
self.assertRaises(KeyError, self.dir.__getitem__, 'foo')
@@ -270,19 +256,6 @@ class RegistrationTest(unittest.TestCase):
from acme.messages import Registration
hash(Registration.from_json(self.jobj_from))
def test_default_not_transmitted(self):
from acme.messages import NewRegistration
empty_new_reg = NewRegistration()
new_reg_with_contact = NewRegistration(contact=())
self.assertEqual(empty_new_reg.contact, ())
self.assertEqual(new_reg_with_contact.contact, ())
self.assertNotIn('contact', empty_new_reg.to_partial_json())
self.assertNotIn('contact', empty_new_reg.fields_to_partial_json())
self.assertIn('contact', new_reg_with_contact.to_partial_json())
self.assertIn('contact', new_reg_with_contact.fields_to_partial_json())
class UpdateRegistrationTest(unittest.TestCase):
"""Tests for acme.messages.UpdateRegistration."""
@@ -332,7 +305,8 @@ class ChallengeBodyTest(unittest.TestCase):
from acme.messages import Error
from acme.messages import STATUS_INVALID
self.status = STATUS_INVALID
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
detail='Unable to communicate with DNS server')
self.challb = ChallengeBody(
uri='http://challb', chall=self.chall, status=self.status,
error=error)
@@ -410,12 +384,10 @@ class AuthorizationTest(unittest.TestCase):
hash(Authorization.from_json(self.jobj_from))
def test_resolved_combinations(self):
with warnings.catch_warnings():
warnings.filterwarnings('ignore', '.*resolved_combinations', DeprecationWarning)
self.assertEqual(self.authz.resolved_combinations, (
(self.challbs[0],),
(self.challbs[1],),
))
self.assertEqual(self.authz.resolved_combinations, (
(self.challbs[0],),
(self.challbs[1],),
))
class AuthorizationResourceTest(unittest.TestCase):
@@ -426,7 +398,7 @@ class AuthorizationResourceTest(unittest.TestCase):
authzr = AuthorizationResource(
uri=mock.sentinel.uri,
body=mock.sentinel.body)
self.assertIsInstance(authzr, jose.JSONDeSerializable)
self.assertTrue(isinstance(authzr, jose.JSONDeSerializable))
class CertificateRequestTest(unittest.TestCase):
@@ -437,7 +409,7 @@ class CertificateRequestTest(unittest.TestCase):
self.req = CertificateRequest(csr=CSR)
def test_json_de_serializable(self):
self.assertIsInstance(self.req, jose.JSONDeSerializable)
self.assertTrue(isinstance(self.req, jose.JSONDeSerializable))
from acme.messages import CertificateRequest
self.assertEqual(
self.req, CertificateRequest.from_json(self.req.to_json()))
@@ -453,7 +425,7 @@ class CertificateResourceTest(unittest.TestCase):
cert_chain_uri=mock.sentinel.cert_chain_uri)
def test_json_de_serializable(self):
self.assertIsInstance(self.certr, jose.JSONDeSerializable)
self.assertTrue(isinstance(self.certr, jose.JSONDeSerializable))
from acme.messages import CertificateResource
self.assertEqual(
self.certr, CertificateResource.from_json(self.certr.to_json()))
@@ -486,7 +458,6 @@ class OrderResourceTest(unittest.TestCase):
'authorizations': None,
})
class NewOrderTest(unittest.TestCase):
"""Tests for acme.messages.NewOrder."""
@@ -501,18 +472,5 @@ class NewOrderTest(unittest.TestCase):
})
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_message_payload(self):
from acme.messages import NewAuthorization
new_order = NewAuthorization()
new_order.le_acme_version = 2
jobj = new_order.json_dumps(indent=2).encode()
# RFC8555 states that JWS bodies must not have a resource field.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover


@@ -1,72 +0,0 @@
"""Useful mixins for Challenge and Resource objects"""
from typing import Any
from typing import Dict
import warnings
warnings.warn(f'The module {__name__} is deprecated and will be removed in a future release',
DeprecationWarning, stacklevel=2)
class VersionedLEACMEMixin:
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
@property
def le_acme_version(self) -> int:
"""Define the version of ACME protocol to use"""
return getattr(self, '_le_acme_version', 1)
@le_acme_version.setter
def le_acme_version(self, version: int) -> None:
# We need to use object.__setattr__ to not depend on the specific implementation of
# __setattr__ in current class (eg. jose.TypedJSONObjectWithFields raises AttributeError
# for any attempt to set an attribute to make objects immutable).
object.__setattr__(self, '_le_acme_version', version)
def __setattr__(self, key: str, value: Any) -> None:
if key == 'le_acme_version':
# Required for @property to operate properly. See comment above.
object.__setattr__(self, key, value)
else:
super().__setattr__(key, value) # pragma: no cover
class ResourceMixin(VersionedLEACMEMixin):
"""
This mixin generates a RFC8555 compliant JWS payload
by removing the `resource` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(),
'to_partial_json', 'resource')
def fields_to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(),
'fields_to_partial_json', 'resource')
class TypeMixin(VersionedLEACMEMixin):
"""
This mixin allows generation of a RFC8555 compliant JWS payload
by removing the `type` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(),
'to_partial_json', 'type')
def fields_to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(),
'fields_to_partial_json', 'type')
def _safe_jobj_compliance(instance: Any, jobj_method: str,
uncompliant_field: str) -> Dict[str, Any]:
if hasattr(instance, jobj_method):
jobj: Dict[str, Any] = getattr(instance, jobj_method)()
if instance.le_acme_version == 2:
jobj.pop(uncompliant_field, None)
return jobj
raise AttributeError(f'Method {jobj_method}() is not implemented.') # pragma: no cover
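Mirroring the `JWSPayloadRFC8555Compliant` test earlier in this diff, a minimal sketch of what `ResourceMixin` buys you, assuming the `ResourceMixin`-based `NewAuthorization` defined in this changeset; the assertions are illustrative, not part of the change:
```
from acme import messages

new_authz = messages.NewAuthorization()
new_authz.le_acme_version = 2          # VersionedLEACMEMixin setter
# For ACME v2 the legacy 'resource' field is stripped from the JWS payload.
assert 'resource' not in new_authz.to_partial_json()
assert new_authz.json_dumps() == '{}'
```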


@@ -1,71 +1,64 @@
"""Support for standalone client challenge solvers. """
import argparse
import collections
import functools
import http.client as http_client
import http.server as BaseHTTPServer
import logging
import os
import socket
import socketserver
import sys
import threading
from typing import Any
from typing import cast
from typing import List
from typing import Mapping
from typing import Optional
from typing import Set
from typing import Tuple
from typing import Type
from OpenSSL import crypto
from OpenSSL import SSL
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import OpenSSL
from acme import challenges
from acme import crypto_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme import _TLSSNI01DeprecationModule
logger = logging.getLogger(__name__)
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
# pylint: disable=too-few-public-methods,no-init
class TLSServer(socketserver.TCPServer):
"""Generic TLS Server."""
def __init__(self, *args: Any, **kwargs: Any) -> None:
def __init__(self, *args, **kwargs):
self.ipv6 = kwargs.pop("ipv6", False)
if self.ipv6:
self.address_family = socket.AF_INET6
else:
self.address_family = socket.AF_INET
self.certs = kwargs.pop("certs", {})
self.method = kwargs.pop("method", crypto_util._DEFAULT_SSL_METHOD)
self.method = kwargs.pop(
# pylint: disable=protected-access
"method", crypto_util._DEFAULT_SSL_METHOD)
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
super().__init__(*args, **kwargs)
socketserver.TCPServer.__init__(self, *args, **kwargs)
def _wrap_sock(self) -> None:
self.socket = cast(socket.socket, crypto_util.SSLSocket(
self.socket, cert_selection=self._cert_selection,
alpn_selection=getattr(self, '_alpn_selection', None),
method=self.method))
def _wrap_sock(self):
self.socket = crypto_util.SSLSocket(
self.socket, certs=self.certs, method=self.method)
def _cert_selection(self, connection: SSL.Connection
) -> Optional[Tuple[crypto.PKey, crypto.X509]]: # pragma: no cover
"""Callback selecting certificate for connection."""
server_name = connection.get_servername()
if server_name:
return self.certs.get(server_name, None)
return None
def server_bind(self) -> None:
def server_bind(self): # pylint: disable=missing-docstring
self._wrap_sock()
return socketserver.TCPServer.server_bind(self)
class ACMEServerMixin:
class ACMEServerMixin: # pylint: disable=old-style-class
"""ACME server common settings mixin."""
# TODO: c.f. #858
server_version = "ACME client standalone challenge solver"
allow_reuse_address = True
class BaseDualNetworkedServers:
class BaseDualNetworkedServers(object):
"""Base class for a pair of IPv6 and IPv4 servers that tries to do everything
it's asked for both servers, but where failures in one server don't
affect the other.
@@ -73,14 +66,10 @@ class BaseDualNetworkedServers:
If two servers are instantiated, they will serve on the same port.
"""
def __init__(self, ServerClass: Type[socketserver.TCPServer], server_address: Tuple[str, int],
*remaining_args: Any, **kwargs: Any) -> None:
def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
port = server_address[1]
self.threads: List[threading.Thread] = []
self.servers: List[socketserver.BaseServer] = []
# Preserve socket error for re-raising, if no servers can be started
last_socket_err: Optional[socket.error] = None
self.threads = [] # type: List[threading.Thread]
self.servers = [] # type: List[ACMEServerMixin]
# Must try True first.
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
@@ -98,8 +87,7 @@ class BaseDualNetworkedServers:
logger.debug(
"Successfully bound to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
except socket.error as e:
last_socket_err = e
except socket.error:
if self.servers:
# Already bound using IPv6.
logger.debug(
@@ -118,24 +106,22 @@ class BaseDualNetworkedServers:
# bind to the same port for both servers.
port = server.socket.getsockname()[1]
if not self.servers:
if last_socket_err:
raise last_socket_err
else: # pragma: no cover
raise socket.error("Could not bind to IPv4 or IPv6.")
raise socket.error("Could not bind to IPv4 or IPv6.")
def serve_forever(self) -> None:
def serve_forever(self):
"""Wraps socketserver.TCPServer.serve_forever"""
for server in self.servers:
thread = threading.Thread(
# pylint: disable=no-member
target=server.serve_forever)
thread.start()
self.threads.append(thread)
def getsocknames(self) -> List[Tuple[str, int]]:
def getsocknames(self):
"""Wraps socketserver.TCPServer.socket.getsockname"""
return [server.socket.getsockname() for server in self.servers]
def shutdown_and_server_close(self) -> None:
def shutdown_and_server_close(self):
"""Wraps socketserver.TCPServer.shutdown, socketserver.TCPServer.server_close, and
threading.Thread.join"""
for server in self.servers:
@@ -146,77 +132,62 @@ class BaseDualNetworkedServers:
self.threads = []
class TLSALPN01Server(TLSServer, ACMEServerMixin):
"""TLSALPN01 Server."""
class TLSSNI01Server(TLSServer, ACMEServerMixin):
"""TLSSNI01 Server."""
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
def __init__(self, server_address: Tuple[str, int],
certs: List[Tuple[crypto.PKey, crypto.X509]],
challenge_certs: Mapping[bytes, Tuple[crypto.PKey, crypto.X509]],
ipv6: bool = False) -> None:
# We don't need to implement a request handler here because the work
# (including logging) is being done by wrapped socket set up in the
# parent TLSServer class.
def __init__(self, server_address, certs, ipv6=False):
TLSServer.__init__(
self, server_address, socketserver.BaseRequestHandler, certs=certs,
ipv6=ipv6)
self.challenge_certs = challenge_certs
self, server_address, BaseRequestHandlerWithLogging, certs=certs, ipv6=ipv6)
def _cert_selection(self, connection: SSL.Connection) -> Optional[Tuple[crypto.PKey,
crypto.X509]]:
# TODO: We would like to serve challenge cert only if asked for it via
# ALPN. To do this, we need to retrieve the list of protos from client
# hello, but this is currently impossible with openssl [0], and ALPN
# negotiation is done after cert selection.
# Therefore, currently we always return challenge cert, and terminate
# handshake in alpn_selection() if ALPN protos are not what we expect.
# [0] https://github.com/openssl/openssl/issues/4952
server_name = connection.get_servername()
if server_name:
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs[server_name]
return None # pragma: no cover
def _alpn_selection(self, _connection: SSL.Connection, alpn_protos: List[bytes]) -> bytes:
"""Callback to select alpn protocol."""
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
return self.ACME_TLS_1_PROTOCOL
logger.debug("Cannot agree on ALPN proto. Got: %s", str(alpn_protos))
# Explicitly close the connection now, by returning an empty string.
# See https://www.pyopenssl.org/en/stable/api/ssl.html#OpenSSL.SSL.Context.set_alpn_select_callback # pylint: disable=line-too-long
return b""
class TLSSNI01DualNetworkedServers(BaseDualNetworkedServers):
"""TLSSNI01Server Wrapper. Tries everything for both. Failures for one don't
affect the other."""
def __init__(self, *args, **kwargs):
BaseDualNetworkedServers.__init__(self, TLSSNI01Server, *args, **kwargs)
class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
class HTTPServer(BaseHTTPServer.HTTPServer):
"""Generic HTTP Server."""
def __init__(self, *args: Any, **kwargs: Any) -> None:
def __init__(self, *args, **kwargs):
self.ipv6 = kwargs.pop("ipv6", False)
if self.ipv6:
self.address_family = socket.AF_INET6
else:
self.address_family = socket.AF_INET
super().__init__(*args, **kwargs)
BaseHTTPServer.HTTPServer.__init__(self, *args, **kwargs)
class HTTP01Server(HTTPServer, ACMEServerMixin):
"""HTTP01 Server."""
def __init__(self, server_address: Tuple[str, int], resources: Set[challenges.HTTP01],
ipv6: bool = False, timeout: int = 30) -> None:
super().__init__(
server_address, HTTP01RequestHandler.partial_init(
simple_http_resources=resources, timeout=timeout), ipv6=ipv6)
def __init__(self, server_address, resources, ipv6=False):
HTTPServer.__init__(
self, server_address, HTTP01RequestHandler.partial_init(
simple_http_resources=resources), ipv6=ipv6)
class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
"""HTTP01Server Wrapper. Tries everything for both. Failures for one don't
affect the other."""
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(HTTP01Server, *args, **kwargs)
def __init__(self, *args, **kwargs):
BaseDualNetworkedServers.__init__(self, HTTP01Server, *args, **kwargs)
class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
@@ -231,37 +202,20 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
HTTP01Resource = collections.namedtuple(
"HTTP01Resource", "chall response validation")
def __init__(self, *args: Any, **kwargs: Any) -> None:
def __init__(self, *args, **kwargs):
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
self._timeout = kwargs.pop('timeout', 30)
super().__init__(*args, **kwargs)
self.server: HTTP01Server
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
# In parent class BaseHTTPRequestHandler, 'timeout' is a class-level property but we
# need to define its value during the initialization phase in HTTP01RequestHandler.
# However MyPy does not appreciate that we dynamically shadow a class-level property
# with an instance-level property (eg. self.timeout = ... in __init__()). So to make
# everyone happy, we statically redefine 'timeout' as a method property, and set the
# timeout value in a new internal instance-level property _timeout.
@property
def timeout(self) -> int: # type: ignore[override]
"""
The default timeout this server should apply to requests.
:return: timeout to apply
:rtype: int
"""
return self._timeout
def log_message(self, format: str, *args: Any) -> None: # pylint: disable=redefined-builtin
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self) -> None:
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
BaseHTTPServer.BaseHTTPRequestHandler.handle(self)
def do_GET(self) -> None: # pylint: disable=invalid-name,missing-function-docstring
def do_GET(self): # pylint: disable=invalid-name,missing-docstring
if self.path == "/":
self.handle_index()
elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
@@ -269,21 +223,21 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
else:
self.handle_404()
def handle_index(self) -> None:
def handle_index(self):
"""Handle index page."""
self.send_response(200)
self.send_header("Content-Type", "text/html")
self.end_headers()
self.wfile.write(self.server.server_version.encode())
def handle_404(self) -> None:
def handle_404(self):
"""Handler 404 Not Found errors."""
self.send_response(http_client.NOT_FOUND, message="Not Found")
self.send_header("Content-type", "text/html")
self.end_headers()
self.wfile.write(b"404")
def handle_simple_http_resource(self) -> None:
def handle_simple_http_resource(self):
"""Handle HTTP01 provisioned resources."""
for resource in self.simple_http_resources:
if resource.chall.path == self.path:
@@ -299,8 +253,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.path)
@classmethod
def partial_init(cls, simple_http_resources: Set[challenges.HTTP01],
timeout: int) -> 'functools.partial[HTTP01RequestHandler]':
def partial_init(cls, simple_http_resources):
"""Partially initialize this handler.
This is useful because `socketserver.BaseServer` takes
@@ -309,5 +262,44 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
"""
return functools.partial(
cls, simple_http_resources=simple_http_resources,
timeout=timeout)
cls, simple_http_resources=simple_http_resources)
def simple_tls_sni_01_server(cli_args, forever=True):
"""Run simple standalone TLSSNI01 server."""
logging.basicConfig(level=logging.DEBUG)
parser = argparse.ArgumentParser()
parser.add_argument(
"-p", "--port", default=0, help="Port to serve at. By default "
"picks random free port.")
args = parser.parse_args(cli_args[1:])
certs = {}
_, hosts, _ = next(os.walk('.')) # type: ignore # https://github.com/python/mypy/issues/465
for host in hosts:
with open(os.path.join(host, "cert.pem")) as cert_file:
cert_contents = cert_file.read()
with open(os.path.join(host, "key.pem")) as key_file:
key_contents = key_file.read()
certs[host.encode()] = (
OpenSSL.crypto.load_privatekey(
OpenSSL.crypto.FILETYPE_PEM, key_contents),
OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_PEM, cert_contents))
server = TLSSNI01Server(('', int(args.port)), certs=certs)
logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
if forever: # pragma: no cover
server.serve_forever()
else:
server.handle_request()
# Patching ourselves to warn about TLS-SNI challenge deprecation and removal.
sys.modules[__name__] = _TLSSNI01DeprecationModule(sys.modules[__name__])
if __name__ == "__main__":
sys.exit(simple_tls_sni_01_server(sys.argv)) # pragma: no cover
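For orientation, a minimal sketch of driving the HTTP-01 solver defined above, modeled on the test setup later in this diff; port 0 and the empty resources set are illustrative:
```
import threading

from acme.standalone import HTTP01Server

server = HTTP01Server(('', 0), resources=set())        # port 0 picks a free port
thread = threading.Thread(target=server.serve_forever)
thread.start()
print('listening on %s:%s' % server.socket.getsockname()[:2])
server.shutdown()        # stops serve_forever()
server.server_close()
thread.join()
```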


@@ -1,20 +1,26 @@
"""Tests for acme.standalone."""
import http.client as http_client
import multiprocessing
import os
import shutil
import socket
import socketserver
import threading
import tempfile
import unittest
from typing import Set
from unittest import mock
import time
from contextlib import closing
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import josepy as jose
import mock
import requests
from acme import challenges
from acme import crypto_util
from acme import errors
import test_util
from acme import test_util
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
class TLSServerTest(unittest.TestCase):
@@ -35,6 +41,32 @@ class TLSServerTest(unittest.TestCase):
server.server_close()
class TLSSNI01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
from acme.standalone import TLSSNI01Server
self.server = TLSSNI01Server(('localhost', 0), certs=self.certs)
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown()
self.thread.join()
def test_it(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1)
self.assertEqual(jose.ComparableX509(cert),
jose.ComparableX509(self.certs[b'localhost'][1]))
class HTTP01ServerTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01Server."""
@@ -42,7 +74,7 @@ class HTTP01ServerTest(unittest.TestCase):
def setUp(self):
self.account_key = jose.JWK.load(
test_util.load_vector('rsa1024_key.pem'))
self.resources: Set = set()
self.resources = set() # type: Set
from acme.standalone import HTTP01Server
self.server = HTTP01Server(('', 0), resources=self.resources)
@@ -86,85 +118,11 @@ class HTTP01ServerTest(unittest.TestCase):
def test_http01_not_found(self):
self.assertFalse(self._test_http01(add=False))
def test_timely_shutdown(self):
from acme.standalone import HTTP01Server
server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
server_thread = threading.Thread(target=server.serve_forever)
server_thread.start()
client = socket.socket()
client.connect(('localhost', server.socket.getsockname()[1]))
stop_thread = threading.Thread(target=server.shutdown)
stop_thread.start()
server_thread.join(5.)
is_hung = server_thread.is_alive()
try:
client.shutdown(socket.SHUT_RDWR)
except: # pragma: no cover, pylint: disable=bare-except
# may raise error because socket could already be closed
pass
self.assertFalse(is_hung, msg='Server shutdown should not be hung')
@unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
class TLSALPN01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSALPN01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
# Use different certificate for challenge.
self.challenge_certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa4096_key.pem'),
test_util.load_cert('rsa4096_cert.pem'),
)}
from acme.standalone import TLSALPN01Server
self.server = TLSALPN01Server(("localhost", 0), certs=self.certs,
challenge_certs=self.challenge_certs)
# pylint: disable=no-member
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
# TODO: This is not implemented yet, see comments in standalone.py
# def test_certs(self):
# host, port = self.server.socket.getsockname()[:2]
# cert = crypto_util.probe_sni(
# b'localhost', host=host, port=port, timeout=1)
# # Expect normal cert when connecting without ALPN.
# self.assertEqual(jose.ComparableX509(cert),
# jose.ComparableX509(self.certs[b'localhost'][1]))
def test_challenge_certs(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"acme-tls/1"])
# Expect challenge cert when connecting with ALPN.
self.assertEqual(
jose.ComparableX509(cert),
jose.ComparableX509(self.challenge_certs[b'localhost'][1])
)
def test_bad_alpn(self):
host, port = self.server.socket.getsockname()[:2]
with self.assertRaises(errors.Error):
crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"bad-alpn"])
class BaseDualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.BaseDualNetworkedServers."""
class SingleProtocolServer(socketserver.TCPServer):
"""Server that only serves on a single protocol. FreeBSD has this behavior for AF_INET6."""
def __init__(self, *args, **kwargs):
@@ -174,7 +132,7 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
kwargs["bind_and_activate"] = False
else:
self.address_family = socket.AF_INET
super().__init__(*args, **kwargs)
socketserver.TCPServer.__init__(self, *args, **kwargs)
if ipv6:
# NB: On Windows, socket.IPPROTO_IPV6 constant may be missing.
# We use the corresponding value (41) instead.
@@ -189,17 +147,12 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
@mock.patch("socket.socket.bind")
def test_fail_to_bind(self, mock_bind):
from errno import EADDRINUSE
mock_bind.side_effect = socket.error
from acme.standalone import BaseDualNetworkedServers
mock_bind.side_effect = socket.error(EADDRINUSE, "Fake addr in use error")
with self.assertRaises(socket.error) as em:
BaseDualNetworkedServers(
BaseDualNetworkedServersTest.SingleProtocolServer,
('', 0), socketserver.BaseRequestHandler)
self.assertEqual(em.exception.errno, EADDRINUSE)
self.assertRaises(socket.error, BaseDualNetworkedServers,
BaseDualNetworkedServersTest.SingleProtocolServer,
('', 0),
socketserver.BaseRequestHandler)
def test_ports_equal(self):
from acme.standalone import BaseDualNetworkedServers
@@ -217,13 +170,41 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
prev_port = port
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
from acme.standalone import TLSSNI01DualNetworkedServers
self.servers = TLSSNI01DualNetworkedServers(('localhost', 0), certs=self.certs)
self.servers.serve_forever()
def tearDown(self):
self.servers.shutdown_and_server_close()
def test_connect(self):
socknames = self.servers.getsocknames()
# connect to all addresses
for sockname in socknames:
host, port = sockname[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1)
self.assertEqual(jose.ComparableX509(cert),
jose.ComparableX509(self.certs[b'localhost'][1]))
class HTTP01DualNetworkedServersTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
def setUp(self):
self.account_key = jose.JWK.load(
test_util.load_vector('rsa1024_key.pem'))
self.resources: Set = set()
self.resources = set() # type: Set
from acme.standalone import HTTP01DualNetworkedServers
self.servers = HTTP01DualNetworkedServers(('', 0), resources=self.resources)
@@ -266,5 +247,60 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
self.assertFalse(self._test_http01(add=False))
class TestSimpleTLSSNI01Server(unittest.TestCase):
"""Tests for acme.standalone.simple_tls_sni_01_server."""
def setUp(self):
# mirror ../examples/standalone
self.test_cwd = tempfile.mkdtemp()
localhost_dir = os.path.join(self.test_cwd, 'localhost')
os.makedirs(localhost_dir)
shutil.copy(test_util.vector_path('rsa2048_cert.pem'),
os.path.join(localhost_dir, 'cert.pem'))
shutil.copy(test_util.vector_path('rsa2048_key.pem'),
os.path.join(localhost_dir, 'key.pem'))
with closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as sock:
sock.bind(('', 0))
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
self.port = sock.getsockname()[1]
from acme.standalone import simple_tls_sni_01_server
self.process = multiprocessing.Process(target=simple_tls_sni_01_server,
args=(['path', '-p', str(self.port)],))
self.old_cwd = os.getcwd()
os.chdir(self.test_cwd)
def tearDown(self):
os.chdir(self.old_cwd)
if self.process.is_alive():
self.process.terminate()
self.process.join(timeout=5)
# Check that we didn't timeout waiting for the process to
# terminate.
self.assertNotEqual(self.process.exitcode, None)
shutil.rmtree(self.test_cwd)
@mock.patch('acme.standalone.TLSSNI01Server.handle_request')
def test_mock(self, handle):
from acme.standalone import simple_tls_sni_01_server
simple_tls_sni_01_server(cli_args=['path', '-p', str(self.port)], forever=False)
self.assertEqual(handle.call_count, 1)
def test_live(self):
self.process.start()
cert = None
for _ in range(50):
time.sleep(0.1)
try:
cert = crypto_util.probe_sni(b'localhost', b'127.0.0.1', self.port)
break
except errors.Error: # pragma: no cover
pass
self.assertEqual(jose.ComparableX509(cert),
test_util.load_comparable_cert('rsa2048_cert.pem'))
if __name__ == "__main__":
unittest.main() # pragma: no cover


@@ -4,13 +4,19 @@
"""
import os
import unittest
import pkg_resources
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from OpenSSL import crypto
import pkg_resources
from josepy.util import ComparableECKey
def vector_path(*names):
"""Path to a test vector."""
return pkg_resources.resource_filename(
__name__, os.path.join('testdata', *names))
def load_vector(*names):
@@ -26,7 +32,8 @@ def _guess_loader(filename, loader_pem, loader_der):
return loader_pem
elif ext.lower() == '.der':
return loader_der
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
else: # pragma: no cover
raise ValueError("Loader could not be recognized based on extension")
def load_cert(*names):
@@ -61,16 +68,28 @@ def load_rsa_private_key(*names):
load_vector(*names), password=None, backend=default_backend()))
def load_ecdsa_private_key(*names):
"""Load ECDSA private key."""
loader = _guess_loader(names[-1], serialization.load_pem_private_key,
serialization.load_der_private_key)
return ComparableECKey(loader(
load_vector(*names), password=None, backend=default_backend()))
def load_pyopenssl_private_key(*names):
"""Load pyOpenSSL private key."""
loader = _guess_loader(
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
return crypto.load_privatekey(loader, load_vector(*names))
def skip_unless(condition, reason): # pragma: no cover
"""Skip tests unless a condition holds.
This implements the basic functionality of unittest.skipUnless
which is only available on Python 2.7+.
:param bool condition: If ``False``, the test will be skipped
:param str reason: the reason for skipping the test
:rtype: callable
:returns: decorator that hides tests unless condition is ``True``
"""
if hasattr(unittest, "skipUnless"):
return unittest.skipUnless(condition, reason)
elif condition:
return lambda cls: cls
return lambda cls: None

acme/acme/testdata/README vendored Normal file

@@ -0,0 +1,15 @@
In order for acme.test_util._guess_loader to work properly, make sure
to use appropriate extension for vector filenames: .pem for PEM and
.der for DER.
The following command has been used to generate test keys:
for k in 256 512 1024 2048; do openssl genrsa -out rsa${k}_key.pem $k; done
and for the CSR:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
and for the certificate:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
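Since this README leans on `_guess_loader` picking the parser from the file extension, here is a short hedged sketch of loading these vectors; the helper names come from `test_util` shown above, and the import path depends on whether `test_util` ships inside the `acme` package in your checkout:
```
from acme import test_util

cert = test_util.load_comparable_cert('cert.der')          # '.der' selects the DER loader
key = test_util.load_rsa_private_key('rsa2048_key.pem')    # '.pem' selects the PEM loader
```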


@@ -1,10 +1,7 @@
"""ACME utilities."""
from typing import Any
from typing import Callable
from typing import Dict
from typing import Mapping
import six
def map_keys(dikt: Mapping[Any, Any], func: Callable[[Any], Any]) -> Dict[Any, Any]:
def map_keys(dikt, func):
"""Map dictionary keys."""
return {func(key): value for key, value in dikt.items()}
return dict((func(key), value) for key, value in six.iteritems(dikt))


@@ -9,7 +9,7 @@ BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from https://www.sphinx-doc.org/)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.


@@ -12,8 +12,10 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import os
import sys
import os
import shlex
here = os.path.abspath(os.path.dirname(__file__))
@@ -40,7 +42,7 @@ extensions = [
]
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
autodoc_default_flags = ['show-inheritance', 'private-members']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -58,7 +60,7 @@ master_doc = 'index'
# General information about the project.
project = u'acme-python'
copyright = u'2015, Let\'s Encrypt Project'
copyright = u'2015-2015, Let\'s Encrypt Project'
author = u'Let\'s Encrypt Project'
# The version info for the project you're documenting, acts as replacement for
@@ -85,9 +87,7 @@ language = 'en'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = [
'_build',
]
exclude_patterns = ['_build']
# The reST default role (used for this markup: `text`) to use for all
# documents.
@@ -114,7 +114,7 @@ pygments_style = 'sphinx'
#keep_warnings = False
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False
todo_include_todos = True
# -- Options for HTML output ----------------------------------------------
@@ -122,7 +122,7 @@ todo_include_todos = False
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# http://docs.readthedocs.org/en/latest/theme.html#how-do-i-use-this-locally-and-on-read-the-docs
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally


@@ -3,6 +3,6 @@ usage: jws [-h] [--compact] {sign,verify} ...
positional arguments:
{sign,verify}
options:
optional arguments:
-h, --help show this help message and exit
--compact


@@ -65,7 +65,7 @@ if errorlevel 9009 (
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
echo.http://sphinx-doc.org/
exit /b 1
)


@@ -1,3 +1 @@
:orphan:
.. literalinclude:: ../jws-help.txt


@@ -26,10 +26,8 @@ Workflow:
- Deactivate Account
"""
from contextlib import contextmanager
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import challenges
@@ -38,6 +36,7 @@ from acme import crypto_util
from acme import errors
from acme import messages
from acme import standalone
import josepy as jose
# Constants:


@@ -0,0 +1,2 @@
python -m acme.standalone -p 1234
curl -k https://localhost:1234


@@ -0,0 +1 @@
../../../acme/testdata/rsa2048_cert.pem


@@ -0,0 +1 @@
../../../acme/testdata/rsa2048_key.pem


@@ -1,13 +1,10 @@
# readthedocs.org gives no way to change the install command to "pip
# install -e acme[docs]" (that would in turn install documentation
# install -e .[docs]" (that would in turn install documentation
# dependencies), but it allows to specify a requirements.txt file at
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
# Although ReadTheDocs certainly doesn't need to install the project
# in --editable mode (-e), just "pip install acme[docs]" does not work as
# expected and "pip install -e acme[docs]" must be used instead
# in --editable mode (-e), just "pip install .[docs]" does not work as
# expected and "pip install -e .[docs]" must be used instead
# We also pin our dependencies for increased stability.
-c ../tools/requirements.txt
-e acme[docs]

acme/setup.cfg Normal file

@@ -0,0 +1,2 @@
[bdist_wheel]
universal = 1


@@ -1,19 +1,34 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
from setuptools import find_packages
from setuptools import setup
version = '1.32.0.dev0'
version = '0.37.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [
'cryptography>=2.5.0',
'josepy>=1.13.0',
'PyOpenSSL>=17.5.0',
# load_pem_private/public_key (>=0.6)
# rsa_recover_prime_factors (>=0.8)
'cryptography>=1.2.3',
# formerly known as acme.jose:
# 1.1.0+ is required to avoid the warnings described at
# https://github.com/certbot/josepy/issues/13.
'josepy>=1.1.0',
# Connection.set_tlsext_host_name (>=0.13)
'mock',
'PyOpenSSL>=0.13.1',
'pyrfc3339',
'pytz>=2019.3',
'requests>=2.20.0',
'pytz',
'requests[security]>=2.6.0', # security extras added in 2.4.1
'requests-toolbelt>=0.3.0',
'setuptools>=41.6.0',
'setuptools',
'six>=1.9.0', # needed for python_2_unicode_compatible
]
dev_extras = [
'pytest',
'pytest-xdist',
'tox',
]
docs_extras = [
@@ -21,11 +36,21 @@ docs_extras = [
'sphinx_rtd_theme',
]
test_extras = [
'pytest',
'pytest-xdist',
'typing-extensions',
]
class PyTest(TestCommand):
user_options = []
def initialize_options(self):
TestCommand.initialize_options(self)
self.pytest_args = ''
def run_tests(self):
import shlex
# import here, cause outside the eggs aren't loaded
import pytest
errno = pytest.main(shlex.split(self.pytest_args))
sys.exit(errno)
setup(
name='acme',
@@ -33,19 +58,21 @@ setup(
description='ACME protocol implementation in Python',
url='https://github.com/letsencrypt/letsencrypt',
author="Certbot Project",
author_email='certbot-dev@eff.org',
author_email='client-dev@letsencrypt.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],
@@ -54,7 +81,10 @@ setup(
include_package_data=True,
install_requires=install_requires,
extras_require={
'dev': dev_extras,
'docs': docs_extras,
'test': test_extras,
},
test_suite='acme',
tests_require=["pytest"],
cmdclass={"test": PyTest},
)


@@ -1,30 +0,0 @@
"""Tests for acme.magic_typing."""
import sys
import unittest
import warnings
from unittest import mock
class MagicTypingTest(unittest.TestCase):
"""Tests for acme.magic_typing."""
def test_import_success(self):
try:
import typing as temp_typing
except ImportError: # pragma: no cover
temp_typing = None # pragma: no cover
typing_class_mock = mock.MagicMock()
text_mock = mock.MagicMock()
typing_class_mock.Text = text_mock
sys.modules['typing'] = typing_class_mock
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
with warnings.catch_warnings():
warnings.filterwarnings("ignore", category=DeprecationWarning)
from acme.magic_typing import Text
self.assertEqual(Text, text_mock)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
if __name__ == '__main__':
unittest.main() # pragma: no cover


@@ -1,21 +0,0 @@
In order for acme.test_util._guess_loader to work properly, make sure
to use appropriate extension for vector filenames: .pem for PEM and
.der for DER.
The following command has been used to generate test keys:
for k in 256 512 1024 2048 4096; do openssl genrsa -out rsa${k}_key.pem $k; done
and for the CSR:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
and for the certificates:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 > rsa2048_cert.pem
openssl req -key rsa1024_key.pem -new -subj '/CN=example.com' -x509 > rsa1024_cert.pem
and for the elliptic curve keys:
openssl genpkey -algorithm EC -out ec_secp384r1.pem -pkeyopt ec_paramgen_curve:P-384 -pkeyopt ec_param_enc:named_curve
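The openssl invocations above are the canonical way these vectors were produced. Purely as an illustration, an equivalent 2048-bit key and DER CSR could also be generated with the `cryptography` library; the file names below simply mirror the vector names listed above and are not part of the test suite.

```python
# Illustrative only: produce rsa2048_key.pem and csr.der with `cryptography`
# instead of the openssl CLI shown above.
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

# openssl genrsa -out rsa2048_key.pem 2048
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
with open("rsa2048_key.pem", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))

# openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")]))
    .sign(key, hashes.SHA256())
)
with open("csr.der", "wb") as f:
    f.write(csr.public_bytes(serialization.Encoding.DER))
```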

@@ -1,21 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIDizCCAnOgAwIBAgIIPNBLQXwhoUkwDQYJKoZIhvcNAQELBQAwKDEmMCQGA1UE
AxMdUGViYmxlIEludGVybWVkaWF0ZSBDQSAxNzNiMjYwHhcNMjAwNTI5MTkxODA5
WhcNMjUwNTI5MTkxODA5WjAWMRQwEgYDVQQDEwsxOTIuMC4yLjE0NTCCASIwDQYJ
KoZIhvcNAQEBBQADggEPADCCAQoCggEBALyChb+NDA26GF1AfC0nzEdfOTchKw0h
q41xEjonvg5UXgZf/aH/ntvugIkYP0MaFifNAjebOVVsemEVEtyWcUKTfBHKZGbZ
ukTDwFIjfTccCfo6U/B2H7ZLzJIywl8DcUw9DypadeQBm8PS0VVR2ncy73dvaqym
crhAwlASyXU0mhLqRDMMxfg5Bn/FWpcsIcDpLmPn8Q/FvdRc2t5ryBNw/aWOlwqT
Oy16nbfLj2T0zG1A3aPuD+eT/JFUe/o3K7R+FAx7wt+RziQO46wLVVF1SueZUrIU
zqN04Gl8Kt1WM2SniZ0gq/rORUNcPtT0NAEsEslTQfA+Trq6j2peqyMCAwEAAaOB
yjCBxzAOBgNVHQ8BAf8EBAMCBaAwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUF
BwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHj1mwZzP//nMIH2i58NRUl/arHn
MB8GA1UdIwQYMBaAFF5DVAKabvIUvKFHGouscA2Qdpe6MDEGCCsGAQUFBwEBBCUw
IzAhBggrBgEFBQcwAYYVaHR0cDovLzEyNy4wLjAuMTo0MDAyMBUGA1UdEQQOMAyH
BMAAApGHBMsAcQEwDQYJKoZIhvcNAQELBQADggEBAHjSgDg76/UCIMSYddyhj18r
LdNKjA7p8ovnErSkebFT4lIZ9f3Sma9moNr0w64M33NamuFyHe/KTdk90mvoW8Uu
26aDekiRIeeMakzbAtDKn67tt2tbedKIYRATcSYVwsV46uZKbM621dZKIjjxOWpo
IY6rZYrku8LYhoXJXOqRduV3cTRVuTm5bBa9FfVNtt6N1T5JOtKKDEhuSaF4RSug
PDy3hQIiHrVvhPfVrXU3j6owz/8UCS5549inES9ONTFrvM9o0H1R/MsmGNXR5hF5
iJqHKC7n8LZujhVnoFIpHu2Dsiefbfr+yRYJS4I+ezy6Nq/Ok8rc8zp0eoX+uyY=
-----END CERTIFICATE-----

@@ -1,22 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIDmzCCAoOgAwIBAgIIFdxeZP+v2rgwDQYJKoZIhvcNAQELBQAwKDEmMCQGA1UE
AxMdUGViYmxlIEludGVybWVkaWF0ZSBDQSA0M2M5NTcwHhcNMjAwNTMwMDQwNzMw
WhcNMjUwNTMwMDQwNzMwWjAOMQwwCgYDVQQDEwM6OjEwggEiMA0GCSqGSIb3DQEB
AQUAA4IBDwAwggEKAoIBAQC7VidVduJvqKtrSH0fw6PjE0cqL4Kfzo7klexWUkHG
KVAa0fRVZFZ462jxKOt417V2U4WJQ6WHHO9PJ+3gW62d/MhCw8FRtUQS4nYFjqB6
32+RFU21VRN7cWoQEqSwnEPbh/v/zv/KS5JhQ+swWUo79AOLm1kjnZWCKtcqh1Lc
Ug5Tkpot6luoxTKp52MkchvXDpj0q2B/XpLJ8/pw5cqjv7mH12EDOK2HXllA+WwX
ZpstcEhaA4FqtaHOW/OHnwTX5MUbINXE5YYHVEDR6moVM31/W/3pe9NDUMTDE7Si
lVQnZbXM9NYbzZqlh+WhemDWwnIfGI6rtsfNEiirVEOlAgMBAAGjgeIwgd8wDgYD
VR0PAQH/BAQDAgWgMB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNV
HRMBAf8EAjAAMB0GA1UdDgQWBBS8DL+MZfDIy6AKky69Tgry2Vxq5DAfBgNVHSME
GDAWgBRAsFqVenRRKgB1YPzWKzb9bzZ/ozAxBggrBgEFBQcBAQQlMCMwIQYIKwYB
BQUHMAGGFWh0dHA6Ly8xMjcuMC4wLjE6NDAwMjAtBgNVHREEJjAkhxAAAAAAAAAA
AAAAAAAAAAABhxCjvjLzIG7HXQlWDO6YWF7FMA0GCSqGSIb3DQEBCwUAA4IBAQBY
M9UTZ3uaKMQ+He9kWR3p9jh6hTSD0FNi79ZdfkG0lgSzhhduhN7OhzQH2ihUUfa6
rtKTw74fGbszhizCd9UB8YPKlm3si1Xbg6ZUQlA1RtoQo7RUGEa6ZbR68PKGm9Go
hTTFIl/JS8jzxBR8jywZdyqtprUx+nnNUDiNk0hJtFLhw7OJH0AHlAUNqHsfD08m
HXRdaV6q14HXU5g31slBat9H4D6tCU/2uqBURwW0wVdnqh4QeRfAeqiatJS9EmSF
ctbc7n894Idy2Xce7NFoIy5cht3m6Rd42o/LmBsJopBmQcDPZT70/XzRtc2qE0cS
CzBIGQHUJ6BfmBjrCQnp
-----END CERTIFICATE-----

@@ -1,16 +0,0 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICbTCCAVUCAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKT/
CE7Y5EYBvI4p7Frt763upIKHDHO/R5/TWMjG8Jm9qTMui8sbMgyh2Yh+lR/j/5Xd
tQrhgC6wx10MrW2+3JtYS88HP1p6si8zU1dbK34n3NyyklR2RivW0R7dXgnYNy7t
5YcDYLCrbRMIPINV/uHrmzIHWYUDNcZVdAfIM2AHfKYuV6Mepcn///5GR+l4GcAh
Nkf9CW8OdAIuKdbyLCxVr0mUW/vJz1b12uxPsgUdax9sjXgZdT4pfMXADsFd1NeF
atpsXU073inqtHru+2F9ijHTQ75TC+u/rr6eYl3BnBntac0gp/ADtDBii7/Q1JOO
Bhq7xJNqqxIEdiyM7zcCAwEAAaAoMCYGCSqGSIb3DQEJDjEZMBcwFQYDVR0RBA4w
DIcEwAACkYcEywBxATANBgkqhkiG9w0BAQsFAAOCAQEADG5g3zdbSCaXpZhWHkzE
Mek3f442TUE1pB+ITRpthmM4N3zZWETYmbLCIAO624uMrRnbCCMvAoLs/L/9ETg/
XMMFtonQC8u9i9tV8B1ceBh8lpIfa+8b9TMWH3bqnrbWQ+YIl+Yd0gXiCZWJ9vK4
eM1Gddu/2bR6s/k4h/XAWRgEexqk57EHr1z0N+T9OoX939n3mVcNI+u9kfd5VJ0z
VyA3R8WR6T6KlEl5P5pcWe5Kuyhi7xMmLVImXqBtvKq4O1AMfM+gQr/yn9aE8IRq
khP7JrMBLUIub1c/qu2TfvnynNPSM/ZcOX+6PHdHmRkR3nI0Ndpv7Ntv31FTplAm
Dw==
-----END CERTIFICATE REQUEST-----

@@ -1,16 +0,0 @@
-----BEGIN CERTIFICATE REQUEST-----
MIIChTCCAW0CAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOIc
UAppcqJfTkSqqOFqGt1v7lIJZPOcF4bcKI3d5cHAGbOuVxbC7uMaDuObwYLzoiED
qnvs1NaEq2phO6KsgGESB7IE2LUjJivO7OnSZjNRpL5si/9egvBiNCn/50lULaWG
gLEuyMfk3awZy2mVAymy7Grhbx069A4TH8TqsHuq2RpKyuDL27e/jUt6yYecb3pu
hWMiWy3segif4tI46pkOW0/I6DpxyYD2OqOvzxm/voS9RMqE2+7YJA327H7bEi3N
lJZEZ1zy7clZ9ga5fBQaetzbg2RyxTrZ7F919NQXSFoXgxb10Eg64wIpz0L3ooCm
GEHehsZZexa3J5ccIvMCAwEAAaBAMD4GCSqGSIb3DQEJDjExMC8wLQYDVR0RBCYw
JIcQAAAAAAAAAAAAAAAAAAAAAYcQo74y8yBux10JVgzumFhexTANBgkqhkiG9w0B
AQsFAAOCAQEALvwVn0A/JPTCiNzcozHFnp5M23C9PXCplWc5u4k34d4XXzpSeFDz
fL4gy7NpYIueme2K2ppw2j3PNQUdR6vQ5a75sriegWYrosL+7Q6Joh51ZyEUZQoD
mNl4M4S4oX85EaChR6NFGBywTfjFarYi32XBTbFE7rK8N8KM+DQkNdwL1MXqaHWz
F1obQKpNXlLedbCBOteV5Eg4zG3565zu/Gw/NhwzzV3mQmgxUcd1sMJxAfHQz4Vl
ImLL+xMcR03nDsH2bgtDbK2tJm7WszSxA9tC+Xp2lRewxrnQloRWPYDz177WGQ5Q
SoGDzTTtA6uWZxG8h7CkNLOGvA8LtU2rNA==
-----END CERTIFICATE REQUEST-----

Some files were not shown because too many files have changed in this diff.