Compare commits: update-dev...python37-t (6 commits)

Commits: cc85d309a9, c1b4f21325, 29a75eb8a7, 309a70c3fe, e5d35099a7, 9fade9c85c
@@ -1,119 +0,0 @@
# Configuring Azure Pipelines with Certbot

Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:

* `.azure-pipelines/main.yml` is the main one, executed on PRs targeting master and on pushes to master,
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and the nightly run for master.

Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common job configuration that can be reused in several pipelines, as the excerpt below shows.
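
For instance, the nightly pipeline further down in this diff builds its stages entirely from these shared templates; a condensed excerpt:

```
# Excerpt from the nightly pipeline (see its full definition later in this diff):
# stages are pulled in from shared templates instead of being redefined per pipeline.
stages:
  - template: templates/stages/test-and-package-stage.yml
  - template: templates/stages/deploy-stage.yml
  - template: templates/stages/notify-failure-stage.yml
```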

Unlike Travis, where CodeCov works without any action required, CodeCov supports Azure Pipelines
through the coverage-bash utility (not python-coverage for now) only if you provide the Codecov repo token
via the `CODECOV_TOKEN` environment variable. So `CODECOV_TOKEN` needs to be set as a secured
environment variable to allow the main pipeline to publish coverage reports to CodeCov.

This INSTALL.md file explains how to configure Azure Pipelines with Certbot in order to execute the CI/CD logic defined in the `.azure-pipelines` folder.
During these installation steps, warnings describing user access and legal commitments will be displayed like this:

```
!!! ACCESS REQUIRED !!!
```

This document assumes that the Azure DevOps organization is named _certbot_, and that the Azure DevOps project is also named _certbot_.

## Useful links

* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops

## Prerequisites

### Having a GitHub account

Use your GitHub user for a normal GitHub account, or a user that has administrative rights on the GitHub organization if relevant.

### Having an Azure DevOps account

- Go to https://dev.azure.com/ and click "Start free with GitHub"
- Log in to GitHub

```
!!! ACCESS REQUIRED !!!
Personal user data (email + profile info, read-only)
```

- Microsoft will create a Live account using the email referenced in the GitHub account. This account is also linked to the GitHub account (meaning you can log in to it using GitHub authentication)
- Proceed with account registration (birth date, country), then add details about your name and contact email

```
!!! ACCESS REQUIRED !!!
Microsoft offers to send commercial emails to this address
Azure DevOps terms of service need to be accepted
```

_Once logged in to Azure DevOps, the account is ready._

### Installing Azure Pipelines to GitHub

- On GitHub, go to the Marketplace
- Select Azure Pipelines, and "Set up a plan"
- Select Free, then "Install it for free"
- Click "Complete order and begin installation"

```
!!! ACCESS !!!
Azure Pipelines needs RW on code, RO on metadata, and RW on checks, commit statuses, deployments, issues, and pull requests.
RW access here is required to allow updating the pipeline YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines are triggered.
Note however that no admin access is granted here: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the security context around this on GitHub.
Access can be granted for all or only selected repositories, which is nice.
```

- Once redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same as the targeted GitHub repo)
- Set the visibility to Public, to benefit from 10 parallel jobs

```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in terms of being able to check that it is valid), and to the resources shared between the GitHub account and Azure Pipelines.
```

_Done. We can move on to pipeline configuration._

## Import an existing pipeline from the `.azure-pipelines` folder

- On Azure DevOps, go to your organization (e.g. _certbot_) then your project (e.g. _certbot_)
- Click the "Pipelines" tab
- Click "New pipeline"
- Where is your code?: select "__Use the classic editor__"

__Warning: do not choose the GitHub option in the "Where is your code?" section. That option triggers an OAuth
permission grant from Azure Pipelines to GitHub in order to set up a GitHub OAuth application. The permissions requested
that way are far too broad (admin level on almost everything), while the classic approach does not add any extra
permissions and works perfectly well.__

- Select GitHub in the "Select your repository" section, choose certbot/certbot as the repository and master as the default branch.
- Click the YAML option for "Select a template"
- Choose a name for the pipeline (e.g. test-pipeline), and browse to the actual pipeline YAML definition in the
  "YAML file path" input (e.g. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click the "Save and run" button.

_Done. The pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._

## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)

__NB: The following steps assume that you have already set up the YAML pipeline file to
consume, as an environment variable, the secret variable that these steps will create.
For an environment variable named `CODECOV_TOKEN` consuming the secret variable `codecov_token`,
this setup would take the following form in the YAML file:__

```
steps:
  - script: ./do_something_that_consumes_CODECOV_TOKEN # E.g. `codecov -F windows`
    env:
      CODECOV_TOKEN: $(codecov_token)
```

To set up a variable that is shared between pipelines, follow the instructions at
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
When adding variables to a group, don't forget to tick "Keep this value secret"
if it shouldn't be shared publicly.
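
As an alternative to per-pipeline secrets, a job can attach such a variable group directly in its YAML, which is what the jobs further down in this diff do with the `certbot-common` group. A minimal sketch, assuming a group named `certbot-common` that contains a secret `codecov_token` (the script name is purely illustrative):

```
jobs:
  - job: coverage_upload
    variables:
      - group: certbot-common            # variable group defined under Pipelines > Library
    steps:
      - script: ./upload_coverage.sh     # hypothetical script that consumes the token
        env:
          CODECOV_TOKEN: $(codecov_token)  # secret values must be mapped into env explicitly
```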

@@ -1,14 +0,0 @@
# Advanced pipeline for running our full test suite on demand.
trigger:
  # When changing these triggers, please ensure the documentation under
  # "Running tests in CI" is still correct.
  - test-*
pr: none

variables:
  # We don't publish our Docker images in this pipeline, but when building them
  # for testing, let's use the nightly tag.
  dockerTag: nightly

stages:
  - template: templates/stages/test-and-package-stage.yml

@@ -1,8 +0,0 @@
trigger: none
pr:
  - master
  - '*.x'

jobs:
  - template: templates/jobs/standard-tests-jobs.yml

@@ -1,18 +0,0 @@
# Nightly pipeline running each day for master.
trigger: none
pr: none
schedules:
  - cron: "30 4 * * *"
    displayName: Nightly build
    branches:
      include:
        - master
    always: true

variables:
  dockerTag: nightly

stages:
  - template: templates/stages/test-and-package-stage.yml
  - template: templates/stages/deploy-stage.yml
  - template: templates/stages/notify-failure-stage.yml

@@ -1,18 +0,0 @@
# Release pipeline to run our full test suite, build artifacts, and deploy them
# for GitHub release tags.
trigger:
  tags:
    include:
      - v*
pr: none

variables:
  dockerTag: ${{variables['Build.SourceBranchName']}}

stages:
  - template: templates/stages/test-and-package-stage.yml
  - template: templates/stages/changelog-stage.yml
  - template: templates/stages/deploy-stage.yml
    parameters:
      snapReleaseChannel: beta
  - template: templates/stages/notify-failure-stage.yml
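
For reference, the `snapReleaseChannel` parameter passed above is declared and consumed inside the deploy stage template shown later in this diff; a condensed sketch of that round trip:

```
# deploy stage template (condensed from the full listing below)
parameters:
  - name: snapReleaseChannel
    type: string
    default: edge
    values:
      - edge
      - beta
# ...and inside the publish_snap job the value is consumed as:
#   snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
```
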
@@ -1,94 +0,0 @@
|
||||
jobs:
|
||||
- job: extended_test
|
||||
variables:
|
||||
- name: IMAGE_NAME
|
||||
value: ubuntu-18.04
|
||||
- name: PYTHON_VERSION
|
||||
value: 3.9
|
||||
- group: certbot-common
|
||||
strategy:
|
||||
matrix:
|
||||
linux-py36:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: py36
|
||||
linux-py37:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: py37
|
||||
linux-py38:
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: py38
|
||||
linux-py37-nopin:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: py37
|
||||
CERTBOT_NO_PIN: 1
|
||||
linux-external-mock:
|
||||
TOXENV: external-mock
|
||||
linux-boulder-v1-integration-certbot-oldest:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: integration-certbot-oldest
|
||||
ACME_SERVER: boulder-v1
|
||||
linux-boulder-v2-integration-certbot-oldest:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: integration-certbot-oldest
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v1-integration-nginx-oldest:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: integration-nginx-oldest
|
||||
ACME_SERVER: boulder-v1
|
||||
linux-boulder-v2-integration-nginx-oldest:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: integration-nginx-oldest
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v1-py36-integration:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v1
|
||||
linux-boulder-v2-py36-integration:
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v1-py37-integration:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v1
|
||||
linux-boulder-v2-py37-integration:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v1-py38-integration:
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v1
|
||||
linux-boulder-v2-py38-integration:
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v1-py39-integration:
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v1
|
||||
linux-boulder-v2-py39-integration:
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
nginx-compat:
|
||||
TOXENV: nginx_compat
|
||||
linux-integration-rfc2136:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration-dns-rfc2136
|
||||
docker-dev:
|
||||
TOXENV: docker_dev
|
||||
macos-farmtest-apache2:
|
||||
# We run one of these test farm tests on macOS to help ensure the
|
||||
# tests continue to work on the platform.
|
||||
IMAGE_NAME: macOS-10.15
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: test-farm-apache2
|
||||
farmtest-sdists:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: test-farm-sdists
|
||||
pool:
|
||||
vmImage: $(IMAGE_NAME)
|
||||
steps:
|
||||
- template: ../steps/tox-steps.yml
|
||||
@@ -1,230 +0,0 @@
|
||||
jobs:
|
||||
- job: docker_build
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
strategy:
|
||||
matrix:
|
||||
amd64:
|
||||
DOCKER_ARCH: amd64
|
||||
# Do not run the heavy non-amd64 builds for test branches
|
||||
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
|
||||
arm32v6:
|
||||
DOCKER_ARCH: arm32v6
|
||||
arm64v8:
|
||||
DOCKER_ARCH: arm64v8
|
||||
# The default timeout of 60 minutes is a little low for compiling
|
||||
# cryptography on ARM architectures.
|
||||
timeoutInMinutes: 180
|
||||
steps:
|
||||
- bash: set -e && tools/docker/build.sh $(dockerTag) $DOCKER_ARCH
|
||||
displayName: Build the Docker images
|
||||
# We don't filter for the Docker Hub organization to continue to allow
|
||||
# easy testing of these scripts on forks.
|
||||
- bash: |
|
||||
set -e
|
||||
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}')
|
||||
docker save --output images.tar $DOCKER_IMAGES
|
||||
displayName: Save the Docker images
|
||||
# If the name of the tar file or artifact changes, the deploy stage will
|
||||
# also need to be updated.
|
||||
- bash: set -e && mv images.tar $(Build.ArtifactStagingDirectory)
|
||||
displayName: Prepare Docker artifact
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
artifact: docker_$(DOCKER_ARCH)
|
||||
displayName: Store Docker artifact
|
||||
- job: docker_run
|
||||
dependsOn: docker_build
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
steps:
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: docker_amd64
|
||||
path: $(Build.SourcesDirectory)
|
||||
displayName: Retrieve Docker images
|
||||
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
|
||||
displayName: Load Docker images
|
||||
- bash: |
|
||||
set -ex
|
||||
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}:{{.Tag}}')
|
||||
for DOCKER_IMAGE in ${DOCKER_IMAGES}
|
||||
do docker run --rm "${DOCKER_IMAGE}" plugins --prepare
|
||||
done
|
||||
displayName: Run integration tests for Docker images
|
||||
- job: installer_build
|
||||
pool:
|
||||
vmImage: vs2017-win2016
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
architecture: x86
|
||||
addToPath: true
|
||||
- script: |
|
||||
python -m venv venv
|
||||
venv\Scripts\python tools\pipstrap.py
|
||||
venv\Scripts\python tools\pip_install.py -e windows-installer
|
||||
displayName: Prepare Windows installer build environment
|
||||
- script: |
|
||||
venv\Scripts\construct-windows-installer
|
||||
displayName: Build Certbot installer
|
||||
- task: CopyFiles@2
|
||||
inputs:
|
||||
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
|
||||
contents: '*.exe'
|
||||
targetFolder: $(Build.ArtifactStagingDirectory)
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
|
||||
artifact: windows-installer
|
||||
displayName: Publish Windows installer
|
||||
- job: installer_run
|
||||
dependsOn: installer_build
|
||||
strategy:
|
||||
matrix:
|
||||
win2019:
|
||||
imageName: windows-2019
|
||||
win2016:
|
||||
imageName: vs2017-win2016
|
||||
pool:
|
||||
vmImage: $(imageName)
|
||||
steps:
|
||||
- powershell: |
|
||||
if ($PSVersionTable.PSVersion.Major -ne 5) {
|
||||
throw "Powershell version is not 5.x"
|
||||
}
|
||||
condition: eq(variables['imageName'], 'vs2017-win2016')
|
||||
displayName: Check Powershell 5.x is used in vs2017-win2016
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
addToPath: true
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: windows-installer
|
||||
path: $(Build.SourcesDirectory)/bin
|
||||
displayName: Retrieve Windows installer
|
||||
- script: |
|
||||
python -m venv venv
|
||||
venv\Scripts\python tools\pipstrap.py
|
||||
venv\Scripts\python tools\pip_install.py -e certbot-ci
|
||||
env:
|
||||
PIP_NO_BUILD_ISOLATION: no
|
||||
displayName: Prepare Certbot-CI
|
||||
- script: |
|
||||
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
|
||||
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe
|
||||
displayName: Run windows installer integration tests
|
||||
- script: |
|
||||
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
|
||||
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
|
||||
displayName: Run certbot integration tests
|
||||
- job: snaps_build
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
strategy:
|
||||
matrix:
|
||||
amd64:
|
||||
SNAP_ARCH: amd64
|
||||
# Do not run the heavy non-amd64 builds for test branches
|
||||
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
|
||||
armhf:
|
||||
SNAP_ARCH: armhf
|
||||
arm64:
|
||||
SNAP_ARCH: arm64
|
||||
timeoutInMinutes: 0
|
||||
steps:
|
||||
- script: |
|
||||
set -e
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y --no-install-recommends snapd
|
||||
sudo snap install --classic snapcraft
|
||||
displayName: Install dependencies
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
addToPath: true
|
||||
- task: DownloadSecureFile@1
|
||||
name: credentials
|
||||
inputs:
|
||||
secureFile: launchpad-credentials
|
||||
- script: |
|
||||
set -e
|
||||
git config --global user.email "$(Build.RequestedForEmail)"
|
||||
git config --global user.name "$(Build.RequestedFor)"
|
||||
mkdir -p ~/.local/share/snapcraft/provider/launchpad
|
||||
cp $(credentials.secureFilePath) ~/.local/share/snapcraft/provider/launchpad/credentials
|
||||
python3 tools/snap/build_remote.py ALL --archs ${SNAP_ARCH} --timeout 19800
|
||||
displayName: Build snaps
|
||||
- script: |
|
||||
set -e
|
||||
mv *.snap $(Build.ArtifactStagingDirectory)
|
||||
mv certbot-dns-*/*.snap $(Build.ArtifactStagingDirectory)
|
||||
displayName: Prepare artifacts
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
artifact: snaps_$(SNAP_ARCH)
|
||||
displayName: Store snaps artifacts
|
||||
- job: snap_run
|
||||
dependsOn: snaps_build
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
addToPath: true
|
||||
- script: |
|
||||
set -e
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y --no-install-recommends nginx-light snapd
|
||||
python3 -m venv venv
|
||||
venv/bin/python tools/pipstrap.py
|
||||
venv/bin/python tools/pip_install.py -U tox
|
||||
displayName: Install dependencies
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: snaps_amd64
|
||||
path: $(Build.SourcesDirectory)/snap
|
||||
displayName: Retrieve Certbot snaps
|
||||
- script: |
|
||||
set -e
|
||||
sudo snap install --dangerous --classic snap/certbot_*.snap
|
||||
displayName: Install Certbot snap
|
||||
- script: |
|
||||
set -e
|
||||
venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
|
||||
displayName: Run tox
|
||||
- job: snap_dns_run
|
||||
dependsOn: snaps_build
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
steps:
|
||||
- script: |
|
||||
set -e
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y --no-install-recommends snapd
|
||||
displayName: Install dependencies
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
addToPath: true
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: snaps_amd64
|
||||
path: $(Build.SourcesDirectory)/snap
|
||||
displayName: Retrieve Certbot snaps
|
||||
- script: |
|
||||
set -e
|
||||
python3 -m venv venv
|
||||
venv/bin/python tools/pipstrap.py
|
||||
venv/bin/python tools/pip_install.py -e certbot-ci
|
||||
displayName: Prepare Certbot-CI
|
||||
- script: |
|
||||
set -e
|
||||
sudo -E venv/bin/pytest certbot-ci/snap_integration_tests/dns_tests --allow-persistent-changes --snap-folder $(Build.SourcesDirectory)/snap --snap-arch amd64
|
||||
displayName: Test DNS plugins snaps
|
||||
@@ -1,80 +0,0 @@
|
||||
jobs:
|
||||
- job: test
|
||||
variables:
|
||||
PYTHON_VERSION: 3.9
|
||||
strategy:
|
||||
matrix:
|
||||
macos-py36:
|
||||
IMAGE_NAME: macOS-10.15
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: py36
|
||||
macos-py39:
|
||||
IMAGE_NAME: macOS-10.15
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: py39
|
||||
windows-py36:
|
||||
IMAGE_NAME: vs2017-win2016
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: py36
|
||||
windows-py38-cover:
|
||||
IMAGE_NAME: vs2017-win2016
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: py38-cover
|
||||
windows-integration-certbot:
|
||||
IMAGE_NAME: vs2017-win2016
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration-certbot
|
||||
linux-oldest-tests-1:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: '{acme,apache,apache-v2,certbot}-oldest'
|
||||
linux-oldest-tests-2:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: '{dns,nginx}-oldest'
|
||||
linux-py36:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: py36
|
||||
linux-py39-cover:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: py39-cover
|
||||
linux-py39-lint:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: lint
|
||||
linux-py39-mypy:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: mypy
|
||||
linux-integration:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration
|
||||
ACME_SERVER: pebble
|
||||
apache-compat:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
TOXENV: apache_compat
|
||||
# le-modification can be moved to the extended test suite once
|
||||
# https://github.com/certbot/certbot/issues/8742 is resolved.
|
||||
le-modification:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
TOXENV: modification
|
||||
apacheconftest:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: apacheconftest-with-pebble
|
||||
nginxroundtrip:
|
||||
IMAGE_NAME: ubuntu-18.04
|
||||
PYTHON_VERSION: 3.6
|
||||
TOXENV: nginxroundtrip
|
||||
pool:
|
||||
vmImage: $(IMAGE_NAME)
|
||||
steps:
|
||||
- template: ../steps/tox-steps.yml
|
||||
- job: test_sphinx_builds
|
||||
pool:
|
||||
vmImage: ubuntu-20.04
|
||||
steps:
|
||||
- template: ../steps/sphinx-steps.yml
|
||||
@@ -1,19 +0,0 @@
|
||||
stages:
|
||||
- stage: Changelog
|
||||
jobs:
|
||||
- job: prepare
|
||||
pool:
|
||||
vmImage: vs2017-win2016
|
||||
steps:
|
||||
# If we change the output filename from `release_notes.md`, it should also be changed in tools/create_github_release.py
|
||||
- bash: |
|
||||
set -e
|
||||
CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
|
||||
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
|
||||
displayName: Prepare changelog
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
|
||||
artifact: changelog
|
||||
displayName: Publish changelog
|
||||
@@ -1,106 +0,0 @@
|
||||
parameters:
|
||||
- name: snapReleaseChannel
|
||||
type: string
|
||||
default: edge
|
||||
values:
|
||||
- edge
|
||||
- beta
|
||||
|
||||
stages:
|
||||
- stage: Deploy
|
||||
jobs:
|
||||
# This job relies on credentials used to publish the Certbot snaps. This
|
||||
# credential file was created by running:
|
||||
#
|
||||
# snapcraft logout
|
||||
# snapcraft login (provide the shared snapcraft credentials when prompted)
|
||||
# snapcraft export-login --channels=beta,edge snapcraft.cfg
|
||||
#
|
||||
# Then the file was added as a secure file in Azure pipelines
|
||||
# with the name snapcraft.cfg by following the instructions at
|
||||
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
|
||||
# including authorizing the file in all pipelines as described at
|
||||
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#how-do-i-authorize-a-secure-file-for-use-in-all-pipelines.
|
||||
#
|
||||
# This file has a maximum lifetime of one year and the current
|
||||
# file will expire on 2021-07-28 which is also tracked by
|
||||
# https://github.com/certbot/certbot/issues/7931. The file will
|
||||
# need to be updated before then to prevent automated deploys
|
||||
# from breaking.
|
||||
#
|
||||
# Revoking these credentials can be done by changing the password of the
|
||||
# account used to generate the credentials. See
|
||||
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
|
||||
# more info.
|
||||
- job: publish_snap
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
variables:
|
||||
- group: certbot-common
|
||||
strategy:
|
||||
matrix:
|
||||
amd64:
|
||||
SNAP_ARCH: amd64
|
||||
arm32v6:
|
||||
SNAP_ARCH: armhf
|
||||
arm64v8:
|
||||
SNAP_ARCH: arm64
|
||||
steps:
|
||||
- bash: |
|
||||
set -e
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y --no-install-recommends snapd
|
||||
sudo snap install --classic snapcraft
|
||||
displayName: Install dependencies
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: snaps_$(SNAP_ARCH)
|
||||
path: $(Build.SourcesDirectory)/snap
|
||||
displayName: Retrieve Certbot snaps
|
||||
- task: DownloadSecureFile@1
|
||||
name: snapcraftCfg
|
||||
inputs:
|
||||
secureFile: snapcraft.cfg
|
||||
- bash: |
|
||||
set -e
|
||||
snapcraft login --with $(snapcraftCfg.secureFilePath)
|
||||
for SNAP_FILE in snap/*.snap; do
|
||||
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
|
||||
done
|
||||
displayName: Publish to Snap store
|
||||
- job: publish_docker
|
||||
pool:
|
||||
vmImage: ubuntu-18.04
|
||||
strategy:
|
||||
matrix:
|
||||
amd64:
|
||||
DOCKER_ARCH: amd64
|
||||
arm32v6:
|
||||
DOCKER_ARCH: arm32v6
|
||||
arm64v8:
|
||||
DOCKER_ARCH: arm64v8
|
||||
steps:
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: docker_$(DOCKER_ARCH)
|
||||
path: $(Build.SourcesDirectory)
|
||||
displayName: Retrieve Docker images
|
||||
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
|
||||
displayName: Load Docker images
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
# The credentials used here are for the shared certbotbot account
|
||||
# on Docker Hub. The credentials are stored in a service account
|
||||
# which was created by following the instructions at
|
||||
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
|
||||
# The name given to this service account must match the value
|
||||
# given to containerRegistry below. "Grant access to all
|
||||
# pipelines" should also be checked. To revoke these
|
||||
# credentials, we can change the password on the certbotbot
|
||||
# Docker Hub account or remove the account from the
|
||||
# Certbot organization on Docker Hub.
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy.sh $(dockerTag) $DOCKER_ARCH
|
||||
displayName: Deploy the Docker images
|
||||
@@ -1,19 +0,0 @@
|
||||
stages:
|
||||
- stage: On_Failure
|
||||
jobs:
|
||||
- job: notify_mattermost
|
||||
variables:
|
||||
- group: certbot-common
|
||||
pool:
|
||||
vmImage: ubuntu-20.04
|
||||
steps:
|
||||
- bash: |
|
||||
set -e
|
||||
MESSAGE="\
|
||||
---\n\
|
||||
##### Azure Pipeline
|
||||
*Repo* $(Build.Repository.ID) - *Pipeline* $(Build.DefinitionName) #$(Build.BuildNumber) - *Branch/PR* $(Build.SourceBranchName)\n\
|
||||
:warning: __Pipeline has failed__: [Link to the build](https://dev.azure.com/$(Build.Repository.ID)/_build/results?buildId=$(Build.BuildId)&view=results)\n\n\
|
||||
---"
|
||||
curl -i -X POST --data-urlencode "payload={\"text\":\"${MESSAGE}\"}" "$(MATTERMOST_URL)"
|
||||
condition: failed()
|
||||
@@ -1,6 +0,0 @@
stages:
  - stage: TestAndPackage
    jobs:
      - template: ../jobs/standard-tests-jobs.yml
      - template: ../jobs/extended-tests-jobs.yml
      - template: ../jobs/packaging-jobs.yml
@@ -1,21 +0,0 @@
|
||||
steps:
|
||||
- bash: |
|
||||
FINAL_STATUS=0
|
||||
declare -a FAILED_BUILDS
|
||||
tools/venv.py
|
||||
source venv/bin/activate
|
||||
for doc_path in */docs
|
||||
do
|
||||
echo ""
|
||||
echo "##[group]Building $doc_path"
|
||||
if ! sphinx-build -W --keep-going -b html $doc_path $doc_path/_build/html; then
|
||||
FINAL_STATUS=1
|
||||
FAILED_BUILDS[${#FAILED_BUILDS[@]}]="${doc_path%/docs}"
|
||||
fi
|
||||
echo "##[endgroup]"
|
||||
done
|
||||
if [[ $FINAL_STATUS -ne 0 ]]; then
|
||||
echo "##[error]The following builds failed: ${FAILED_BUILDS[*]}"
|
||||
exit 1
|
||||
fi
|
||||
displayName: Build Sphinx Documentation
|
||||
@@ -1,57 +0,0 @@
|
||||
steps:
|
||||
# We run brew update because we've seen attempts to install an older version
|
||||
# of a package fail. See
|
||||
# https://github.com/actions/virtual-environments/issues/3165.
|
||||
- bash: |
|
||||
set -e
|
||||
brew update
|
||||
brew install augeas
|
||||
condition: startswith(variables['IMAGE_NAME'], 'macOS')
|
||||
displayName: Install MacOS dependencies
|
||||
- bash: |
|
||||
set -e
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y --no-install-recommends \
|
||||
python-dev \
|
||||
gcc \
|
||||
libaugeas0 \
|
||||
libssl-dev \
|
||||
libffi-dev \
|
||||
ca-certificates \
|
||||
nginx-light \
|
||||
openssl
|
||||
sudo systemctl stop nginx
|
||||
condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
|
||||
displayName: Install Linux dependencies
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: $(PYTHON_VERSION)
|
||||
addToPath: true
|
||||
# tools/pip_install.py is used to pin packages to a known working version
|
||||
# except in tests where the environment variable CERTBOT_NO_PIN is set.
|
||||
# virtualenv is listed here explicitly to make sure it is upgraded when
|
||||
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
|
||||
# version of virtualenv. The option "-I" is set so when CERTBOT_NO_PIN is also
|
||||
# set, pip updates dependencies it thinks are already satisfied to avoid some
|
||||
# problems with its lack of real dependency resolution.
|
||||
- bash: |
|
||||
set -e
|
||||
python tools/pipstrap.py
|
||||
python tools/pip_install.py -I tox virtualenv
|
||||
displayName: Install runtime dependencies
|
||||
- task: DownloadSecureFile@1
|
||||
name: testFarmPem
|
||||
inputs:
|
||||
secureFile: azure-test-farm.pem
|
||||
condition: contains(variables['TOXENV'], 'test-farm')
|
||||
- bash: |
|
||||
set -e
|
||||
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
|
||||
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
|
||||
env
|
||||
python -m tox
|
||||
env:
|
||||
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
|
||||
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
|
||||
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
|
||||
displayName: Run tox
|
||||
@@ -1,5 +1,2 @@
[run]
omit = */setup.py

[report]
omit = */setup.py

@@ -8,4 +8,5 @@
.git
.tox
venv
venv3
docs

@@ -1,18 +0,0 @@
# https://editorconfig.org/

root = true

[*]
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf

[*.py]
indent_style = space
indent_size = 4
charset = utf-8
max_line_length = 100

[*.yaml]
indent_style = space
indent_size = 2

.envrc (12 changed lines)
@@ -1,12 +0,0 @@
# This file is just a nicety for developers who use direnv. When you cd under
# the Certbot repo, Certbot's virtual environment will be automatically
# activated and then deactivated when you cd elsewhere. Developers have to have
# direnv set up and run `direnv allow` to allow this file to execute on their
# system. You can find more information at https://direnv.net/.
. venv/bin/activate
# direnv doesn't support modifying PS1 so we unset it to squelch the error
# it'll otherwise print about this being done in the activate script. See
# https://github.com/direnv/direnv/wiki/PS1. If you would like your shell
# prompt to change like it normally does, see
# https://github.com/direnv/direnv/wiki/Python#restoring-the-ps1.
unset PS1

.github/FUNDING.yml (vendored, 1 changed line)
@@ -1 +0,0 @@
custom: https://supporters.eff.org/donate/support-work-on-certbot

.github/pull_request_template.md (vendored, 5 changed lines)
@@ -1,5 +0,0 @@
## Pull Request Checklist

- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `master` section of `certbot/CHANGELOG.md` to include a description of the change being made.
- [ ] Add or update any documentation as needed to support the changes in this PR.
- [ ] Include your name in `AUTHORS.md` if you like.

.github/stale.yml (vendored, 35 changed lines)
@@ -1,35 +0,0 @@
|
||||
# Configuration for https://github.com/marketplace/stale
|
||||
|
||||
# Number of days of inactivity before an Issue or Pull Request becomes stale
|
||||
daysUntilStale: 365
|
||||
|
||||
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
|
||||
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
|
||||
# When changing this value, be sure to also update markComment below.
|
||||
daysUntilClose: 30
|
||||
|
||||
# Ignore issues with an assignee (defaults to false)
|
||||
exemptAssignees: true
|
||||
|
||||
# Label to use when marking as stale
|
||||
staleLabel: needs-update
|
||||
|
||||
# Comment to post when marking as stale. Set to `false` to disable
|
||||
markComment: >
|
||||
We've made a lot of changes to Certbot since this issue was opened. If you
|
||||
still have this issue with an up-to-date version of Certbot, can you please
|
||||
add a comment letting us know? This helps us to better see what issues are
|
||||
still affecting our users. If there is no activity in the next 30 days, this
|
||||
issue will be automatically closed.
|
||||
|
||||
# Comment to post when closing a stale Issue or Pull Request.
|
||||
closeComment: >
|
||||
This issue has been closed due to lack of activity, but if you think it
|
||||
should be reopened, please open a new issue with a link to this one and we'll
|
||||
take a look.
|
||||
|
||||
# Limit the number of actions per hour, from 1-30. Default is 30
|
||||
limitPerRun: 1
|
||||
|
||||
# Don't mark pull requests as stale.
|
||||
only: issues
|
||||
.gitignore (vendored, 32 changed lines)
@@ -4,12 +4,12 @@
|
||||
build/
|
||||
dist*/
|
||||
/venv*/
|
||||
/kgs/
|
||||
/.tox/
|
||||
/releases*/
|
||||
/log*
|
||||
/releases/
|
||||
letsencrypt.log
|
||||
certbot.log
|
||||
poetry.lock
|
||||
letsencrypt-auto-source/letsencrypt-auto.sig.lzma.base64
|
||||
|
||||
# coverage
|
||||
.coverage
|
||||
@@ -25,36 +25,20 @@ tags
|
||||
\#*#
|
||||
.idea
|
||||
.ropeproject
|
||||
.vscode
|
||||
|
||||
# auth --cert-path --chain-path
|
||||
/*.pem
|
||||
|
||||
# letstest
|
||||
tests/letstest/letest-*/
|
||||
tests/letstest/*.pem
|
||||
tests/letstest/venv/
|
||||
|
||||
.venv
|
||||
|
||||
# pytest cache
|
||||
.cache
|
||||
.mypy_cache/
|
||||
.pytest_cache/
|
||||
|
||||
# docker files
|
||||
.docker
|
||||
|
||||
# certbot tests
|
||||
.certbot_test_workspace
|
||||
**/assets/pebble*
|
||||
**/assets/challtestsrv*
|
||||
|
||||
# snap files
|
||||
.snapcraft
|
||||
parts
|
||||
prime
|
||||
stage
|
||||
*.snap
|
||||
snap-constraints.txt
|
||||
qemu-*
|
||||
certbot-dns*/certbot-dns*_amd64*.txt
|
||||
certbot-dns*/certbot-dns*_arm*.txt
|
||||
/certbot_amd64*.txt
|
||||
/certbot_arm*.txt
|
||||
certbot-dns*/snap
|
||||
|
||||
@@ -1,6 +0,0 @@
|
||||
[settings]
|
||||
skip_glob=venv*
|
||||
force_sort_within_sections=True
|
||||
force_single_line=True
|
||||
order_by_type=False
|
||||
line_length=400
|
||||
.pylintrc (71 changed lines)
@@ -8,10 +8,7 @@ jobs=0
|
||||
|
||||
# Python code to execute, usually for sys.path manipulation such as
|
||||
# pygtk.require().
|
||||
# CERTBOT COMMENT
|
||||
# This is needed for pylint to import linter_plugin.py since
|
||||
# https://github.com/PyCQA/pylint/pull/3396.
|
||||
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(pylint.config.PYLINTRC))"
|
||||
#init-hook=
|
||||
|
||||
# Profiled execution.
|
||||
profile=no
|
||||
@@ -27,11 +24,6 @@ persistent=yes
|
||||
# usually to register additional checkers.
|
||||
load-plugins=linter_plugin
|
||||
|
||||
# A comma-separated list of package or module names from where C extensions may
|
||||
# be loaded. Extensions are loading into the active Python interpreter and may
|
||||
# run arbitrary code.
|
||||
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
|
||||
|
||||
|
||||
[MESSAGES CONTROL]
|
||||
|
||||
@@ -49,25 +41,10 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
|
||||
# --enable=similarities". If you want to run only the classes checker, but have
|
||||
# no Warning level messages displayed, use"--disable=all --enable=classes
|
||||
# --disable=W"
|
||||
# CERTBOT COMMENT
|
||||
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
|
||||
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
|
||||
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
|
||||
# See https://github.com/PyCQA/pylint/issues/1498.
|
||||
# 3) Same as point 2 for no-value-for-parameter.
|
||||
# See https://github.com/PyCQA/pylint/issues/2820.
|
||||
# 4) raise-missing-from makes it an error to raise an exception from except
|
||||
# block without using explicit exception chaining. While explicit exception
|
||||
# chaining results in a slightly more informative traceback, I don't think
|
||||
# it's beneficial enough for us to change all of our current instances and
|
||||
# give Certbot developers errors about this when they're working on new code
|
||||
# in the future. You can read more about exception chaining and this pylint
|
||||
# check at
|
||||
# https://blog.ram.rachum.com/post/621791438475296768/improving-python-exception-chaining-with.
|
||||
# 5) wrong-import-order generates false positives and a pylint developer
|
||||
# suggests that people using isort should disable this check at
|
||||
# https://github.com/PyCQA/pylint/issues/3817#issuecomment-687892090.
|
||||
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order
|
||||
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
|
||||
# abstract-class-not-used cannot be disabled locally (at least in
|
||||
# pylint 1.4.1), same for abstract-class-little-used
|
||||
|
||||
|
||||
[REPORTS]
|
||||
|
||||
@@ -268,13 +245,13 @@ ignore-mixin-members=yes
|
||||
# List of module names for which member attributes should not be checked
|
||||
# (useful for modules/projects where namespaces are manipulated during runtime
|
||||
# and thus existing member attributes cannot be deduced by static analysis
|
||||
ignored-modules=pkg_resources,confargparse,argparse
|
||||
ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib
|
||||
# import errors ignored only in 1.4.4
|
||||
# https://bitbucket.org/logilab/pylint/commits/cd000904c9e2
|
||||
|
||||
# List of classes names for which member attributes should not be checked
|
||||
# (useful for classes with attributes dynamically set).
|
||||
ignored-classes=Field,Header,JWS,closing
|
||||
ignored-classes=SQLObject
|
||||
|
||||
# When zope mode is activated, add a predefined set of Zope acquired attributes
|
||||
# to generated-members.
|
||||
@@ -320,6 +297,40 @@ valid-classmethod-first-arg=cls
|
||||
valid-metaclass-classmethod-first-arg=mcs
|
||||
|
||||
|
||||
[DESIGN]
|
||||
|
||||
# Maximum number of arguments for function / method
|
||||
max-args=6
|
||||
|
||||
# Argument names that match this expression will be ignored. Default to name
|
||||
# with leading underscore
|
||||
ignored-argument-names=_.*
|
||||
|
||||
# Maximum number of locals for function / method body
|
||||
max-locals=15
|
||||
|
||||
# Maximum number of return / yield for function / method body
|
||||
max-returns=6
|
||||
|
||||
# Maximum number of branch for function / method body
|
||||
max-branches=12
|
||||
|
||||
# Maximum number of statements in function / method body
|
||||
max-statements=50
|
||||
|
||||
# Maximum number of parents for a class (see R0901).
|
||||
max-parents=12
|
||||
|
||||
# Maximum number of attributes for a class (see R0902).
|
||||
max-attributes=7
|
||||
|
||||
# Minimum number of public methods for a class (see R0903).
|
||||
min-public-methods=2
|
||||
|
||||
# Maximum number of public methods for a class (see R0904).
|
||||
max-public-methods=20
|
||||
|
||||
|
||||
[EXCEPTIONS]
|
||||
|
||||
# Exceptions that will emit a warning when being caught. Defaults to
|
||||
|
||||
.travis.yml (Normal file, 108 lines)
@@ -0,0 +1,108 @@
|
||||
language: python
|
||||
|
||||
cache:
|
||||
directories:
|
||||
- $HOME/.cache/pip
|
||||
|
||||
before_install:
|
||||
- '([ $TRAVIS_OS_NAME == linux ] && dpkg -s libaugeas0) || (brew update && brew install augeas python3 && brew upgrade python && brew link python)'
|
||||
|
||||
before_script:
|
||||
- 'if [ $TRAVIS_OS_NAME = osx ] ; then ulimit -n 1024 ; fi'
|
||||
|
||||
matrix:
|
||||
include:
|
||||
- python: "2.7"
|
||||
env: TOXENV=py27_install BOULDER_INTEGRATION=v1
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "2.7"
|
||||
env: TOXENV=py27_install BOULDER_INTEGRATION=v2
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "2.7"
|
||||
env: TOXENV=cover FYI="this also tests py27"
|
||||
- sudo: required
|
||||
env: TOXENV=nginx_compat
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
- python: "2.7"
|
||||
env: TOXENV=lint
|
||||
- python: "3.4"
|
||||
env: TOXENV=mypy
|
||||
- python: "3.5"
|
||||
env: TOXENV=mypy
|
||||
- python: "2.7"
|
||||
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "3.4"
|
||||
env: TOXENV=py34
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "3.7"
|
||||
dist: xenial
|
||||
env: TOXENV=py37
|
||||
sudo: required
|
||||
services: docker
|
||||
- sudo: required
|
||||
env: TOXENV=apache_compat
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
- sudo: required
|
||||
env: TOXENV=le_auto_trusty
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
- python: "2.7"
|
||||
env: TOXENV=apacheconftest
|
||||
sudo: required
|
||||
- python: "2.7"
|
||||
env: TOXENV=nginxroundtrip
|
||||
|
||||
|
||||
# Only build pushes to the master branch, PRs, and branches beginning with
|
||||
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
|
||||
# simultaneous Travis runs, which speeds turnaround time on review since there
|
||||
# is a cap of on the number of simultaneous runs.
|
||||
branches:
|
||||
only:
|
||||
- master
|
||||
- /^\d+\.\d+\.x$/
|
||||
- /^test-.*$/
|
||||
|
||||
# container-based infrastructure
|
||||
sudo: false
|
||||
|
||||
addons:
|
||||
apt:
|
||||
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
|
||||
- python-dev
|
||||
- python-virtualenv
|
||||
- gcc
|
||||
- libaugeas0
|
||||
- libssl-dev
|
||||
- libffi-dev
|
||||
- ca-certificates
|
||||
# For certbot-nginx integration testing
|
||||
- nginx-light
|
||||
- openssl
|
||||
|
||||
install: "travis_retry $(command -v pip || command -v pip3) install tox coveralls"
|
||||
script:
|
||||
- travis_retry tox
|
||||
- '[ -z "${BOULDER_INTEGRATION+x}" ] || (travis_retry tests/boulder-fetch.sh && tests/tox-boulder-integration.sh)'
|
||||
|
||||
after_success: '[ "$TOXENV" == "cover" ] && coveralls'
|
||||
|
||||
notifications:
|
||||
email: false
|
||||
irc:
|
||||
channels:
|
||||
- secure: "SGWZl3ownKx9xKVV2VnGt7DqkTmutJ89oJV9tjKhSs84kLijU6EYdPnllqISpfHMTxXflNZuxtGo0wTDYHXBuZL47w1O32W6nzuXdra5zC+i4sYQwYULUsyfOv9gJX8zWAULiK0Z3r0oho45U+FR5ZN6TPCidi8/eGU+EEPwaAw="
|
||||
on_cancel: never
|
||||
on_success: never
|
||||
on_failure: always
|
||||
use_notice: true
|
||||
AUTHORS.md (27 changed lines)
@@ -1,12 +1,10 @@
|
||||
Authors
|
||||
=======
|
||||
|
||||
* [Aaron Gable](https://github.com/aarongable)
|
||||
* [Aaron Zirbes](https://github.com/aaronzirbes)
|
||||
* Aaron Zuehlke
|
||||
* Ada Lovelace
|
||||
* [Adam Woodbeck](https://github.com/awoodbeck)
|
||||
* [Adrien Ferrand](https://github.com/adferrand)
|
||||
* [Aidin Gharibnavaz](https://github.com/aidin36)
|
||||
* [AJ ONeal](https://github.com/coolaj86)
|
||||
* [Alcaro](https://github.com/Alcaro)
|
||||
@@ -16,13 +14,10 @@ Authors
|
||||
* [Alex Gaynor](https://github.com/alex)
|
||||
* [Alex Halderman](https://github.com/jhalderm)
|
||||
* [Alex Jordan](https://github.com/strugee)
|
||||
* [Alex Zorin](https://github.com/alexzorin)
|
||||
* [Amjad Mashaal](https://github.com/TheNavigat)
|
||||
* [Andrew Murray](https://github.com/radarhere)
|
||||
* [Andrzej Górski](https://github.com/andrzej3393)
|
||||
* [Anselm Levskaya](https://github.com/levskaya)
|
||||
* [Antoine Jacoutot](https://github.com/ajacoutot)
|
||||
* [April King](https://github.com/april)
|
||||
* [asaph](https://github.com/asaph)
|
||||
* [Axel Beckert](https://github.com/xtaran)
|
||||
* [Bas](https://github.com/Mechazawa)
|
||||
@@ -37,9 +32,7 @@ Authors
|
||||
* [Blake Griffith](https://github.com/cowlicks)
|
||||
* [Brad Warren](https://github.com/bmw)
|
||||
* [Brandon Kraft](https://github.com/kraftbj)
|
||||
* [Brandon Kreisel](https://github.com/BKreisel)
|
||||
* [Brian Heim](https://github.com/brianlheim)
|
||||
* [Cameron Steel](https://github.com/Tugzrida)
|
||||
* [Brandon Kreisel](https://github.com/kraftbj)
|
||||
* [Ceesjan Luiten](https://github.com/quinox)
|
||||
* [Chad Whitacre](https://github.com/whit537)
|
||||
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
|
||||
@@ -61,9 +54,7 @@ Authors
|
||||
* [DanCld](https://github.com/DanCld)
|
||||
* [Daniel Albers](https://github.com/AID)
|
||||
* [Daniel Aleksandersen](https://github.com/da2x)
|
||||
* [Daniel Almasi](https://github.com/almasen)
|
||||
* [Daniel Convissor](https://github.com/convissor)
|
||||
* [Daniel "Drex" Drexler](https://github.com/aeturnum)
|
||||
* [Daniel Huang](https://github.com/dhuang)
|
||||
* [Dave Guarino](https://github.com/daguar)
|
||||
* [David cz](https://github.com/dave-cz)
|
||||
@@ -84,11 +75,9 @@ Authors
|
||||
* [Fabian](https://github.com/faerbit)
|
||||
* [Faidon Liambotis](https://github.com/paravoid)
|
||||
* [Fan Jiang](https://github.com/tcz001)
|
||||
* [Felix Lechner](https://github.com/lechner)
|
||||
* [Felix Schwarz](https://github.com/FelixSchwarz)
|
||||
* [Felix Yan](https://github.com/felixonmars)
|
||||
* [Filip Ochnik](https://github.com/filipochnik)
|
||||
* [Florian Klink](https://github.com/flokli)
|
||||
* [Francois Marier](https://github.com/fmarier)
|
||||
* [Frank](https://github.com/Frankkkkk)
|
||||
* [Frederic BLANC](https://github.com/fblanc)
|
||||
@@ -107,9 +96,7 @@ Authors
|
||||
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
|
||||
* [Henri Salo](https://github.com/fgeek)
|
||||
* [Henry Chen](https://github.com/henrychen95)
|
||||
* [Hugo van Kemenade](https://github.com/hugovk)
|
||||
* [Ingolf Becker](https://github.com/watercrossing)
|
||||
* [Ivan Nejgebauer](https://github.com/inejge)
|
||||
* [Jaap Eldering](https://github.com/eldering)
|
||||
* [Jacob Hoffman-Andrews](https://github.com/jsha)
|
||||
* [Jacob Sachs](https://github.com/jsachs)
|
||||
@@ -133,12 +120,10 @@ Authors
|
||||
* [Jonathan Herlin](https://github.com/Jonher937)
|
||||
* [Jon Walsh](https://github.com/code-tree)
|
||||
* [Joona Hoikkala](https://github.com/joohoi)
|
||||
* [Josh McCullough](https://github.com/JoshMcCullough)
|
||||
* [Josh Soref](https://github.com/jsoref)
|
||||
* [Joubin Jabbari](https://github.com/joubin)
|
||||
* [Juho Juopperi](https://github.com/jkjuopperi)
|
||||
* [Kane York](https://github.com/riking)
|
||||
* [Kenichi Maehashi](https://github.com/kmaehashi)
|
||||
* [Kenneth Skovhede](https://github.com/kenkendk)
|
||||
* [Kevin Burke](https://github.com/kevinburke)
|
||||
* [Kevin London](https://github.com/kevinlondon)
|
||||
@@ -151,13 +136,11 @@ Authors
|
||||
* [Lior Sabag](https://github.com/liorsbg)
|
||||
* [Lipis](https://github.com/lipis)
|
||||
* [lord63](https://github.com/lord63)
|
||||
* [Lorenzo Fundaró](https://github.com/lfundaro)
|
||||
* [Luca Beltrame](https://github.com/lbeltrame)
|
||||
* [Luca Ebach](https://github.com/lucebac)
|
||||
* [Luca Olivetti](https://github.com/olivluca)
|
||||
* [Luke Rogers](https://github.com/lukeroge)
|
||||
* [Maarten](https://github.com/mrtndwrd)
|
||||
* [Mads Jensen](https://github.com/atombrella)
|
||||
* [Maikel Martens](https://github.com/krukas)
|
||||
* [Malte Janduda](https://github.com/MalteJ)
|
||||
* [Mantas Mikulėnas](https://github.com/grawity)
|
||||
@@ -176,10 +159,8 @@ Authors
|
||||
* [Michael Schumacher](https://github.com/schumaml)
|
||||
* [Michael Strache](https://github.com/Jarodiv)
|
||||
* [Michael Sverdlin](https://github.com/sveder)
|
||||
* [Michael Watters](https://github.com/blackknight36)
|
||||
* [Michal Moravec](https://github.com/https://github.com/Majkl578)
|
||||
* [Michal Papis](https://github.com/mpapis)
|
||||
* [Mickaël Schoentgen](https://github.com/BoboTiG)
|
||||
* [Minn Soe](https://github.com/MinnSoe)
|
||||
* [Min RK](https://github.com/minrk)
|
||||
* [Miquel Ruiz](https://github.com/miquelruiz)
|
||||
@@ -209,7 +190,6 @@ Authors
|
||||
* [Pierre Jaury](https://github.com/kaiyou)
|
||||
* [Piotr Kasprzyk](https://github.com/kwadrat)
|
||||
* [Prayag Verma](https://github.com/pra85)
|
||||
* [Rasesh Patel](https://github.com/raspat1)
|
||||
* [Reinaldo de Souza Jr](https://github.com/juniorz)
|
||||
* [Remi Rampin](https://github.com/remram44)
|
||||
* [Rémy HUBSCHER](https://github.com/Natim)
|
||||
@@ -217,7 +197,6 @@ Authors
|
||||
* [Richard Barnes](https://github.com/r-barnes)
|
||||
* [Richard Panek](https://github.com/kernelpanek)
|
||||
* [Robert Buchholz](https://github.com/rbu)
|
||||
* [Robert Dailey](https://github.com/pahrohfit)
|
||||
* [Robert Habermann](https://github.com/frennkie)
|
||||
* [Robert Xiao](https://github.com/nneonneo)
|
||||
* [Roland Shoemaker](https://github.com/rolandshoemaker)
|
||||
@@ -243,11 +222,8 @@ Authors
|
||||
* [Spencer Bliven](https://github.com/sbliven)
|
||||
* [Stacey Sheldon](https://github.com/solidgoldbomb)
|
||||
* [Stavros Korokithakis](https://github.com/skorokithakis)
|
||||
* [Ștefan Talpalaru](https://github.com/stefantalpalaru)
|
||||
* [Stefan Weil](https://github.com/stweil)
|
||||
* [Steve Desmond](https://github.com/stevedesmond-ca)
|
||||
* [sydneyli](https://github.com/sydneyli)
|
||||
* [taixx046](https://github.com/taixx046)
|
||||
* [Tan Jay Jun](https://github.com/jayjun)
|
||||
* [Tapple Gao](https://github.com/tapple)
|
||||
* [Telepenin Nikolay](https://github.com/telepenin)
|
||||
@@ -279,6 +255,5 @@ Authors
|
||||
* [Yomna](https://github.com/ynasser)
|
||||
* [Yoni Jah](https://github.com/yonjah)
|
||||
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
|
||||
* [Yuseong Cho](https://github.com/g6123)
|
||||
* [Zach Shepherd](https://github.com/zjs)
|
||||
* [陈三](https://github.com/chenxsan)
|
||||
|
||||
@@ -1 +0,0 @@
|
||||
certbot/CHANGELOG.md
|
||||
CHANGELOG.md (Normal file, 1111 lines): file diff suppressed because it is too large

CHANGES.rst (Normal file, 8 lines)
@@ -0,0 +1,8 @@
ChangeLog
=========

To see the changes in a given release, view the issues closed in a given
release's GitHub milestone:

- `Past releases <https://github.com/certbot/certbot/milestones?state=closed>`_
- `Upcoming releases <https://github.com/certbot/certbot/milestones>`_
@@ -1 +0,0 @@
|
||||
This project is governed by [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode).
|
||||
@@ -11,7 +11,7 @@ to the Sphinx generated docs is provided below.
|
||||
|
||||
|
||||
[1] https://github.com/blog/1184-contributing-guidelines
|
||||
[2] https://docutils.sourceforge.io/docs/user/rst/quickref.html#hyperlink-targets
|
||||
[2] http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets
|
||||
|
||||
-->
|
||||
|
||||
@@ -33,5 +33,3 @@ started. In particular, we recommend you read these sections
|
||||
- [Finding issues to work on](https://certbot.eff.org/docs/contributing.html#find-issues-to-work-on)
|
||||
- [Coding style](https://certbot.eff.org/docs/contributing.html#coding-style)
|
||||
- [Submitting a pull request](https://certbot.eff.org/docs/contributing.html#submitting-a-pull-request)
|
||||
- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)
|
||||
|
||||
|
||||
Dockerfile (Normal file, 27 lines)
@@ -0,0 +1,27 @@
|
||||
FROM python:2-alpine3.7
|
||||
|
||||
ENTRYPOINT [ "certbot" ]
|
||||
EXPOSE 80 443
|
||||
VOLUME /etc/letsencrypt /var/lib/letsencrypt
|
||||
WORKDIR /opt/certbot
|
||||
|
||||
COPY CHANGES.rst README.rst setup.py src/
|
||||
COPY acme src/acme
|
||||
COPY certbot src/certbot
|
||||
|
||||
RUN apk add --no-cache --virtual .certbot-deps \
|
||||
libffi \
|
||||
libssl1.0 \
|
||||
openssl \
|
||||
ca-certificates \
|
||||
binutils
|
||||
RUN apk add --no-cache --virtual .build-deps \
|
||||
gcc \
|
||||
linux-headers \
|
||||
openssl-dev \
|
||||
musl-dev \
|
||||
libffi-dev \
|
||||
&& pip install --no-cache-dir \
|
||||
--editable /opt/certbot/src/acme \
|
||||
--editable /opt/certbot/src \
|
||||
&& apk del .build-deps
|
||||
@@ -1,21 +1,21 @@
|
||||
# This Dockerfile builds an image for development.
|
||||
FROM ubuntu:focal
|
||||
FROM ubuntu:xenial
|
||||
|
||||
# Note: this only exposes the port to other docker containers.
|
||||
EXPOSE 80 443
|
||||
|
||||
WORKDIR /opt/certbot/src
|
||||
|
||||
# TODO: Install Apache/Nginx for plugin development.
|
||||
COPY . .
|
||||
RUN apt-get update && \
|
||||
DEBIAN_FRONTEND=noninteractive apt-get install apache2 git python3-dev \
|
||||
python3-venv gcc libaugeas0 libssl-dev libffi-dev ca-certificates \
|
||||
openssl nginx-light -y --no-install-recommends && \
|
||||
apt-get install apache2 git nginx-light -y && \
|
||||
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
|
||||
apt-get clean && \
|
||||
rm -rf /var/lib/apt/lists/* \
|
||||
/tmp/* \
|
||||
/var/tmp/*
|
||||
|
||||
RUN VENV_NAME="../venv" python3 tools/venv.py
|
||||
RUN VENV_NAME="../venv" tools/venv.sh
|
||||
|
||||
ENV PATH /opt/certbot/venv/bin:$PATH
|
||||
|
||||
Dockerfile-old (Normal file, 75 lines)
@@ -0,0 +1,75 @@
|
||||
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
|
||||
# it is more likely developers will already have ubuntu:trusty rather
|
||||
# than e.g. debian:jessie and image size differences are negligible
|
||||
FROM ubuntu:trusty
|
||||
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
|
||||
MAINTAINER William Budington <bill@eff.org>
|
||||
|
||||
# Note: this only exposes the port to other docker containers. You
|
||||
# still have to bind to 443@host at runtime, as per the ACME spec.
|
||||
EXPOSE 443
|
||||
|
||||
# TODO: make sure --config-dir and --work-dir cannot be changed
|
||||
# through the CLI (certbot-docker wrapper that uses standalone
|
||||
# authenticator and text mode only?)
|
||||
VOLUME /etc/letsencrypt /var/lib/letsencrypt
|
||||
|
||||
WORKDIR /opt/certbot
|
||||
|
||||
# no need to mkdir anything:
|
||||
# https://docs.docker.com/reference/builder/#copy
|
||||
# If <dest> doesn't exist, it is created along with all missing
|
||||
# directories in its path.
|
||||
|
||||
ENV DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
|
||||
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
|
||||
apt-get clean && \
|
||||
rm -rf /var/lib/apt/lists/* \
|
||||
/tmp/* \
|
||||
/var/tmp/*
|
||||
|
||||
# the above is not likely to change, so by putting it further up the
|
||||
# Dockerfile we make sure we cache as much as possible
|
||||
|
||||
|
||||
COPY setup.py README.rst CHANGES.rst MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/
|
||||
|
||||
# all above files are necessary for setup.py and venv setup, however,
|
||||
# package source code directory has to be copied separately to a
|
||||
# subdirectory...
|
||||
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
|
||||
# directory, the entire contents of the directory are copied,
|
||||
# including filesystem metadata. Note: The directory itself is not
|
||||
# copied, just its contents." Order again matters, three files are far
|
||||
# more likely to be cached than the whole project directory
|
||||
|
||||
COPY certbot /opt/certbot/src/certbot/
|
||||
COPY acme /opt/certbot/src/acme/
|
||||
COPY certbot-apache /opt/certbot/src/certbot-apache/
|
||||
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
|
||||
|
||||
|
||||
RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv
|
||||
|
||||
# PATH is set now so pipstrap upgrades the correct (v)env
|
||||
ENV PATH /opt/certbot/venv/bin:$PATH
|
||||
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
|
||||
/opt/certbot/venv/bin/pip install \
|
||||
-e /opt/certbot/src/acme \
|
||||
-e /opt/certbot/src \
|
||||
-e /opt/certbot/src/certbot-apache \
|
||||
-e /opt/certbot/src/certbot-nginx
|
||||
|
||||
# install in editable mode (-e) to save space: it's not possible to
|
||||
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
|
||||
# this might also help in debugging: you can "docker run --entrypoint
|
||||
# bash" and investigate, apply patches, etc.
|
||||
|
||||
# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
|
||||
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
|
||||
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
|
||||
ENV PATH /opt/certbot/bin:$PATH
|
||||
|
||||
ENTRYPOINT [ "certbot" ]
|
||||
@@ -7,7 +7,7 @@ questions.
## My operating system is (include version):

## I installed Certbot with (snap, OS package manager, pip, certbot-auto, etc):
## I installed Certbot with (certbot-auto, OS package manager, pip, etc):

## I ran this command and it produced this output:
@@ -1,10 +1,9 @@
|
||||
include README.rst
|
||||
include CHANGELOG.md
|
||||
include CHANGES.rst
|
||||
include CONTRIBUTING.md
|
||||
include LICENSE.txt
|
||||
include linter_plugin.py
|
||||
recursive-include docs *
|
||||
recursive-include examples *
|
||||
recursive-include certbot/tests/testdata *
|
||||
recursive-include tests *.py
|
||||
include certbot/ssl-dhparams.pem
|
||||
global-exclude __pycache__
|
||||
global-exclude *.py[cod]
|
||||
@@ -1 +0,0 @@
|
||||
certbot/README.rst
|
||||
161 README.rst Normal file
@@ -0,0 +1,161 @@
|
||||
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
|
||||
|
||||
Certbot is part of EFF’s effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identity of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Let’s Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
|
||||
|
||||
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Let’s Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so there’s no need to arrange payment.
|
||||
|
||||
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you’ll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
|
||||
|
||||
If you’re using a hosted service and don’t have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let’s Encrypt.
|
||||
|
||||
Certbot is a fully-featured, extensible client for the Let's
|
||||
Encrypt CA (or any other CA that speaks the `ACME
|
||||
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
|
||||
protocol) that can automate the tasks of obtaining certificates and
|
||||
configuring webservers to use them. This client runs on Unix-based operating
|
||||
systems.
|
||||
|
||||
To see the changes made to Certbot between versions please refer to our
|
||||
`changelog <https://github.com/certbot/certbot/blob/master/CHANGELOG.md>`_.
|
||||
|
||||
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
|
||||
depending on install method. Instructions on the Internet, and some pieces of the
|
||||
software, may still refer to this older name.
|
||||
|
||||
Contributing
|
||||
------------
|
||||
|
||||
If you'd like to contribute to this project please read `Developer Guide
|
||||
<https://certbot.eff.org/docs/contributing.html>`_.
|
||||
|
||||
.. _installation:
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
The easiest way to install Certbot is by visiting `certbot.eff.org`_, where you can
|
||||
find the correct installation instructions for many web server and OS combinations.
|
||||
For more information, see `Get Certbot <https://certbot.eff.org/docs/install.html>`_.
|
||||
|
||||
.. _certbot.eff.org: https://certbot.eff.org/
|
||||
|
||||
How to run the client
|
||||
---------------------
|
||||
|
||||
In many cases, you can just run ``certbot-auto`` or ``certbot``, and the
|
||||
client will guide you through the process of obtaining and installing certs
|
||||
interactively.
|
||||
|
||||
For full command line help, you can type::
|
||||
|
||||
./certbot-auto --help all
|
||||
|
||||
|
||||
You can also tell it exactly what you want it to do from the command line.
|
||||
For instance, if you want to obtain a cert for ``example.com``,
|
||||
``www.example.com``, and ``other.example.net``, using the Apache plugin to both
|
||||
obtain and install the certs, you could do this::
|
||||
|
||||
./certbot-auto --apache -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
(The first time you run the command, it will make an account, and ask for an
|
||||
email and agreement to the Let's Encrypt Subscriber Agreement; you can
|
||||
automate those with ``--email`` and ``--agree-tos``)
|
||||
|
||||
If you want to use a webserver that doesn't have full plugin support yet, you
|
||||
can still use "standalone" or "webroot" plugins to obtain a certificate::
|
||||
|
||||
./certbot-auto certonly --standalone --email admin@example.com -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
|
||||
Understanding the client in more depth
|
||||
--------------------------------------
|
||||
|
||||
To understand what the client is doing in detail, it's important to
|
||||
understand the way it uses plugins. Please see the `explanation of
|
||||
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
|
||||
the User Guide.
|
||||
|
||||
Links
|
||||
=====
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:links-begin
|
||||
|
||||
Documentation: https://certbot.eff.org/docs
|
||||
|
||||
Software project: https://github.com/certbot/certbot
|
||||
|
||||
Notes for developers: https://certbot.eff.org/docs/contributing.html
|
||||
|
||||
Main Website: https://certbot.eff.org
|
||||
|
||||
Let's Encrypt Website: https://letsencrypt.org
|
||||
|
||||
IRC Channel: #letsencrypt on `Freenode`_
|
||||
|
||||
Community: https://community.letsencrypt.org
|
||||
|
||||
ACME spec: http://ietf-wg-acme.github.io/acme/
|
||||
|
||||
ACME working area in github: https://github.com/ietf-wg-acme/acme
|
||||
|
||||
|build-status| |coverage| |docs| |container|
|
||||
|
||||
.. _Freenode: https://webchat.freenode.net?channels=%23letsencrypt
|
||||
|
||||
.. |build-status| image:: https://travis-ci.org/certbot/certbot.svg?branch=master
|
||||
:target: https://travis-ci.org/certbot/certbot
|
||||
:alt: Travis CI status
|
||||
|
||||
.. |coverage| image:: https://coveralls.io/repos/certbot/certbot/badge.svg?branch=master
|
||||
:target: https://coveralls.io/r/certbot/certbot
|
||||
:alt: Coverage status
|
||||
|
||||
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
|
||||
:target: https://readthedocs.org/projects/letsencrypt/
|
||||
:alt: Documentation status
|
||||
|
||||
.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
|
||||
:target: https://quay.io/repository/letsencrypt/letsencrypt
|
||||
:alt: Docker Repository on Quay.io
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:links-end
|
||||
|
||||
System Requirements
|
||||
===================
|
||||
|
||||
See https://certbot.eff.org/docs/install.html#system-requirements.
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:intro-end
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:features-begin
|
||||
|
||||
Current Features
|
||||
=====================
|
||||
|
||||
* Supports multiple web servers:
|
||||
|
||||
- apache/2.x
|
||||
- nginx/0.8.48+
|
||||
- webroot (adds files to webroot directories in order to prove control of
|
||||
domains and obtain certs)
|
||||
- standalone (runs its own simple webserver to prove you control a domain)
|
||||
- other server software via `third party plugins <https://certbot.eff.org/docs/using.html#third-party-plugins>`_
|
||||
|
||||
* The private key is generated locally on your system.
|
||||
* Can talk to the Let's Encrypt CA or optionally to other ACME
|
||||
compliant services.
|
||||
* Can get domain-validated (DV) certificates.
|
||||
* Can revoke certificates.
|
||||
* Adjustable RSA key bit-length (2048 (default), 4096, ...).
|
||||
* Can optionally install a http -> https redirect, so your site effectively
|
||||
runs https only (Apache only)
|
||||
* Fully automated.
|
||||
* Configuration changes are logged and can be reverted.
|
||||
* Supports an interactive text UI, or can be driven entirely from the
|
||||
command line.
|
||||
* Free and Open Source Software, made with Python.
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:features-end
|
||||
|
||||
For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.
|
||||
@@ -3,6 +3,4 @@ include README.rst
|
||||
include pytest.ini
|
||||
recursive-include docs *
|
||||
recursive-include examples *
|
||||
recursive-include tests *
|
||||
global-exclude __pycache__
|
||||
global-exclude *.py[cod]
|
||||
recursive-include acme/testdata *
|
||||
|
||||
@@ -1,21 +1,12 @@
|
||||
"""ACME protocol implementation.
|
||||
|
||||
This module is an implementation of the `ACME protocol`_.
|
||||
This module is an implementation of the `ACME protocol`_. Latest
|
||||
supported version: `draft-ietf-acme-01`_.
|
||||
|
||||
|
||||
.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme
|
||||
|
||||
.. _`draft-ietf-acme-01`:
|
||||
https://github.com/ietf-wg-acme/acme/tree/draft-ietf-acme-acme-01
|
||||
|
||||
"""
|
||||
import sys
|
||||
|
||||
# This code exists to keep backwards compatibility with people using acme.jose
|
||||
# before it became the standalone josepy package.
|
||||
#
|
||||
# It is based on
|
||||
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
|
||||
import josepy as jose
|
||||
|
||||
for mod in list(sys.modules):
|
||||
# This traversal is apparently necessary such that the identities are
|
||||
# preserved (acme.jose.* is josepy.*)
|
||||
if mod == 'josepy' or mod.startswith('josepy.'):
|
||||
sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]
|
||||
|
||||
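A minimal sketch of what the aliasing loop above achieves, assuming the `acme` and `josepy` packages are importable (the snippet itself is not part of the diff): legacy `acme.jose` imports keep resolving to the standalone `josepy` package.

```
import sys

import josepy
import acme  # importing acme runs the sys.modules aliasing loop shown above

# Every josepy module already imported is now also registered under the
# legacy acme.jose.* name, so old-style imports keep working.
assert sys.modules["acme.jose"] is josepy
from acme.jose import b64  # actually josepy.b64
print(b64.b64encode(b"hello"))
```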
@@ -1,45 +1,44 @@
|
||||
"""ACME Identifier Validation Challenges."""
|
||||
import abc
|
||||
import codecs
|
||||
import functools
|
||||
import hashlib
|
||||
import logging
|
||||
import socket
|
||||
from typing import Type
|
||||
|
||||
from cryptography.hazmat.primitives import hashes # type: ignore
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
|
||||
import OpenSSL
|
||||
import requests
|
||||
import six
|
||||
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import crypto_util
|
||||
from acme import fields
|
||||
from acme.mixins import ResourceMixin
|
||||
from acme.mixins import TypeMixin
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
# pylint: disable=too-few-public-methods
|
||||
|
||||
|
||||
class Challenge(jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json
|
||||
# _fields_to_partial_json | pylint: disable=abstract-method
|
||||
"""ACME challenge."""
|
||||
TYPES: dict = {}
|
||||
TYPES = {} # type: dict
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
try:
|
||||
return super().from_json(jobj)
|
||||
return super(Challenge, cls).from_json(jobj)
|
||||
except jose.UnrecognizedTypeError as error:
|
||||
logger.debug(error)
|
||||
return UnrecognizedChallenge.from_json(jobj)
|
||||
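A short illustration of the `from_json` fallback above, mirroring the existing unit test for it (assumes the `acme` package is importable): a challenge type the library does not implement is preserved as `UnrecognizedChallenge` instead of failing deserialization.

```
from acme.challenges import Challenge, UnrecognizedChallenge

jobj = {"type": "foo"}  # a challenge type acme does not implement
chall = Challenge.from_json(jobj)
assert isinstance(chall, UnrecognizedChallenge)
assert chall.jobj == jobj  # the original JSON is kept for round-tripping
```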
|
||||
|
||||
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json
|
||||
class ChallengeResponse(jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json | pylint: disable=abstract-method
|
||||
"""ACME challenge response."""
|
||||
TYPES: dict = {}
|
||||
TYPES = {} # type: dict
|
||||
resource_type = 'challenge'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
@@ -58,11 +57,12 @@ class UnrecognizedChallenge(Challenge):
|
||||
"""
|
||||
|
||||
def __init__(self, jobj):
|
||||
super().__init__()
|
||||
super(UnrecognizedChallenge, self).__init__()
|
||||
object.__setattr__(self, "jobj", jobj)
|
||||
|
||||
def to_partial_json(self):
|
||||
return self.jobj # pylint: disable=no-member
|
||||
# pylint: disable=no-member
|
||||
return self.jobj
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
@@ -95,7 +95,6 @@ class _TokenChallenge(Challenge):
|
||||
"""
|
||||
# TODO: check that path combined with uri does not go above
|
||||
# URI_ROOT_PATH!
|
||||
# pylint: disable=unsupported-membership-test
|
||||
return b'..' not in self.token and b'/' not in self.token
|
||||
|
||||
|
||||
@@ -120,7 +119,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
parts = self.key_authorization.split('.')
|
||||
parts = self.key_authorization.split('.') # pylint: disable=no-member
|
||||
if len(parts) != 2:
|
||||
logger.debug("Key authorization (%r) is not well formed",
|
||||
self.key_authorization)
|
||||
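As a hedged, standalone sketch of what `key_authorization` holds (the token and RSA key below are throwaways, not values from this diff): the response carries `<b64(token)>.<b64(account key thumbprint)>`, which `verify()` above splits on the dot and checks half by half.

```
import josepy as jose
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa

from acme.challenges import HTTP01

account_key = jose.JWKRSA(key=rsa.generate_private_key(
    public_exponent=65537, key_size=2048, backend=default_backend()))
chall = HTTP01(token=b"a-sufficiently-long-random-token-value-00")
response, validation = chall.response_and_validation(account_key)

# validation == "<b64(token)>.<b64(thumbprint)>", the same string verify() parses
assert validation == response.key_authorization
assert response.verify(chall, account_key.public_key())
```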
@@ -140,21 +139,18 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
|
||||
|
||||
return True
|
||||
|
||||
def to_partial_json(self):
|
||||
jobj = super().to_partial_json()
|
||||
jobj.pop('keyAuthorization', None)
|
||||
return jobj
|
||||
|
||||
|
||||
class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
|
||||
@six.add_metaclass(abc.ABCMeta)
|
||||
class KeyAuthorizationChallenge(_TokenChallenge):
|
||||
# pylint: disable=abstract-class-little-used,too-many-ancestors
|
||||
"""Challenge based on Key Authorization.
|
||||
|
||||
:param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
|
||||
that will be used to generate ``response``.
|
||||
that will be used to generate `response`.
|
||||
:param str typ: type of the challenge
|
||||
"""
|
||||
typ: str = NotImplemented
|
||||
response_cls: Type[KeyAuthorizationChallengeResponse] = NotImplemented
|
||||
typ = NotImplemented
|
||||
response_cls = NotImplemented
|
||||
thumbprint_hash_function = (
|
||||
KeyAuthorizationChallengeResponse.thumbprint_hash_function)
|
||||
|
||||
@@ -178,7 +174,7 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
|
||||
:rtype: KeyAuthorizationChallengeResponse
|
||||
|
||||
"""
|
||||
return self.response_cls( # pylint: disable=not-callable
|
||||
return self.response_cls(
|
||||
key_authorization=self.key_authorization(account_key))
|
||||
|
||||
@abc.abstractmethod
|
||||
@@ -215,7 +211,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME dns-01 challenge response."""
|
||||
typ = "dns-01"
|
||||
|
||||
def simple_verify(self, chall, domain, account_public_key): # pylint: disable=unused-argument
|
||||
def simple_verify(self, chall, domain, account_public_key):
|
||||
"""Simple verify.
|
||||
|
||||
This method no longer checks DNS records and is a simple wrapper
|
||||
@@ -231,13 +227,14 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# pylint: disable=unused-argument
|
||||
verified = self.verify(chall, account_public_key)
|
||||
if not verified:
|
||||
logger.debug("Verification of key authorization in response failed")
|
||||
return verified
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class DNS01(KeyAuthorizationChallenge):
|
||||
"""ACME dns-01 challenge."""
|
||||
response_cls = DNS01Response
|
||||
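For the dns-01 classes above, a hedged sketch (throwaway key, placeholder domain) of how the TXT record name and value are obtained through the public challenge API of the released `acme` package:

```
import josepy as jose
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa

from acme.challenges import DNS01

account_key = jose.JWKRSA(key=rsa.generate_private_key(
    public_exponent=65537, key_size=2048, backend=default_backend()))
chall = DNS01(token=b"a-sufficiently-long-random-token-value-00")
response, validation = chall.response_and_validation(account_key)

print(chall.validation_domain_name("example.com"))  # _acme-challenge.example.com
print(validation)  # the value to publish in that TXT record
```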
@@ -310,7 +307,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
|
||||
uri = chall.uri(domain)
|
||||
logger.debug("Verifying %s at %s...", chall.typ, uri)
|
||||
try:
|
||||
http_response = requests.get(uri, verify=False)
|
||||
http_response = requests.get(uri)
|
||||
except requests.exceptions.RequestException as error:
|
||||
logger.error("Unable to reach %s: %s", uri, error)
|
||||
return False
|
||||
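For context on the `requests.get` probe in the hunk above, a small sketch (placeholder domain and token) of where the validation content is expected to be served: `chall.uri()` simply joins the domain with the well-known challenge path.

```
from acme.challenges import HTTP01

chall = HTTP01(token=b"a-sufficiently-long-random-token-value-00")
print(chall.path)                # /.well-known/acme-challenge/<b64(token)>
print(chall.uri("example.com"))  # http://example.com/.well-known/acme-challenge/<b64(token)>
```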
@@ -327,7 +324,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
|
||||
return True
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class HTTP01(KeyAuthorizationChallenge):
|
||||
"""ACME http-01 challenge."""
|
||||
response_cls = HTTP01Response
|
||||
@@ -368,9 +365,12 @@ class HTTP01(KeyAuthorizationChallenge):
|
||||
|
||||
|
||||
@ChallengeResponse.register
|
||||
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME tls-alpn-01 challenge response."""
|
||||
typ = "tls-alpn-01"
|
||||
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME tls-sni-01 challenge response."""
|
||||
typ = "tls-sni-01"
|
||||
|
||||
DOMAIN_SUFFIX = b".acme.invalid"
|
||||
"""Domain name suffix."""
|
||||
|
||||
PORT = 443
|
||||
"""Verification port as defined by the protocol.
|
||||
@@ -380,18 +380,28 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
|
||||
"""
|
||||
|
||||
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
|
||||
ACME_TLS_1_PROTOCOL = "acme-tls/1"
|
||||
@property
|
||||
def z(self): # pylint: disable=invalid-name
|
||||
"""``z`` value used for verification.
|
||||
|
||||
:rtype bytes:
|
||||
|
||||
"""
|
||||
return hashlib.sha256(
|
||||
self.key_authorization.encode("utf-8")).hexdigest().lower().encode()
|
||||
|
||||
@property
|
||||
def h(self):
|
||||
"""Hash value stored in challenge certificate"""
|
||||
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
|
||||
def z_domain(self):
|
||||
"""Domain name used for verification, generated from `z`.
|
||||
|
||||
def gen_cert(self, domain, key=None, bits=2048):
|
||||
"""Generate tls-alpn-01 certificate.
|
||||
:rtype bytes:
|
||||
|
||||
"""
|
||||
return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX
|
||||
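A standalone sketch of the tls-sni-01 `z`/`z_domain` computation shown in this hunk (the key authorization string is a placeholder):

```
import hashlib

key_authorization = "token.thumbprint"  # placeholder value
z = hashlib.sha256(key_authorization.encode("utf-8")).hexdigest().lower().encode()
z_domain = z[:32] + b"." + z[32:] + b".acme.invalid"
print(z_domain)  # b"<32 hex chars>.<32 hex chars>.acme.invalid"
```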
|
||||
def gen_cert(self, key=None, bits=2048):
|
||||
"""Generate tls-sni-01 certificate.
|
||||
|
||||
:param unicode domain: Domain verified by the challenge.
|
||||
:param OpenSSL.crypto.PKey key: Optional private key used in
|
||||
certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
@@ -401,38 +411,33 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
|
||||
"""
|
||||
if key is None:
|
||||
key = crypto.PKey()
|
||||
key.generate_key(crypto.TYPE_RSA, bits)
|
||||
key = OpenSSL.crypto.PKey()
|
||||
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
|
||||
return crypto_util.gen_ss_cert(key, [
|
||||
# z_domain is too big to fit into CN, hence first dummy domain
|
||||
'dummy', self.z_domain.decode()], force_san=True), key
|
||||
|
||||
def probe_cert(self, domain, **kwargs):
|
||||
"""Probe tls-sni-01 challenge certificate.
|
||||
|
||||
der_value = b"DER:" + codecs.encode(self.h, 'hex')
|
||||
acme_extension = crypto.X509Extension(self.ID_PE_ACME_IDENTIFIER_V1,
|
||||
critical=True, value=der_value)
|
||||
|
||||
return crypto_util.gen_ss_cert(key, [domain], force_san=True,
|
||||
extensions=[acme_extension]), key
|
||||
|
||||
def probe_cert(self, domain, host=None, port=None):
|
||||
"""Probe tls-alpn-01 challenge certificate.
|
||||
|
||||
:param unicode domain: domain being validated, required.
|
||||
:param string host: IP address used to probe the certificate.
|
||||
:param int port: Port used to probe the certificate.
|
||||
:param unicode domain:
|
||||
|
||||
"""
|
||||
if host is None:
|
||||
# TODO: domain is not necessary if host is provided
|
||||
if "host" not in kwargs:
|
||||
host = socket.gethostbyname(domain)
|
||||
logger.debug('%s resolved to %s', domain, host)
|
||||
if port is None:
|
||||
port = self.PORT
|
||||
kwargs["host"] = host
|
||||
|
||||
return crypto_util.probe_sni(host=host, port=port, name=domain,
|
||||
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
|
||||
kwargs.setdefault("port", self.PORT)
|
||||
kwargs["name"] = self.z_domain
|
||||
# TODO: try different methods?
|
||||
# pylint: disable=protected-access
|
||||
return crypto_util.probe_sni(**kwargs)
|
||||
|
||||
def verify_cert(self, domain, cert):
|
||||
"""Verify tls-alpn-01 challenge certificate.
|
||||
def verify_cert(self, cert):
|
||||
"""Verify tls-sni-01 challenge certificate.
|
||||
|
||||
:param unicode domain: Domain name being validated.
|
||||
:param OpensSSL.crypto.X509 cert: Challenge certificate.
|
||||
|
||||
:returns: Whether the certificate was successfully verified.
|
||||
@@ -440,40 +445,28 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
|
||||
"""
|
||||
# pylint: disable=protected-access
|
||||
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
|
||||
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), names)
|
||||
if len(names) != 1 or names[0].lower() != domain.lower():
|
||||
return False
|
||||
sans = crypto_util._pyopenssl_cert_or_req_san(cert)
|
||||
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), sans)
|
||||
return self.z_domain.decode() in sans
|
||||
|
||||
for i in range(cert.get_extension_count()):
|
||||
ext = cert.get_extension(i)
|
||||
# FIXME: assume this is the ACME extension. Currently there is no
|
||||
# way to get full OID of an unknown extension from pyopenssl.
|
||||
if ext.get_short_name() == b'UNDEF':
|
||||
data = ext.get_data()
|
||||
return data == self.h
|
||||
|
||||
return False
|
||||
|
||||
# pylint: disable=too-many-arguments
|
||||
def simple_verify(self, chall, domain, account_public_key,
|
||||
cert=None, host=None, port=None):
|
||||
cert=None, **kwargs):
|
||||
"""Simple verify.
|
||||
|
||||
Verify ``validation`` using ``account_public_key``, optionally
|
||||
probe tls-alpn-01 certificate and check using `verify_cert`.
|
||||
probe tls-sni-01 certificate and check using `verify_cert`.
|
||||
|
||||
:param .challenges.TLSALPN01 chall: Corresponding challenge.
|
||||
:param .challenges.TLSSNI01 chall: Corresponding challenge.
|
||||
:param str domain: Domain name being validated.
|
||||
:param JWK account_public_key:
|
||||
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
|
||||
provided (``None``) certificate will be retrieved using
|
||||
`probe_cert`.
|
||||
:param string host: IP address used to probe the certificate.
|
||||
:param int port: Port used to probe the certificate.
|
||||
|
||||
|
||||
:returns: ``True`` if and only if client's control of the domain has been verified.
|
||||
:returns: ``True`` iff client's control of the domain has been
|
||||
verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
@@ -483,25 +476,27 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
|
||||
if cert is None:
|
||||
try:
|
||||
cert = self.probe_cert(domain=domain, host=host, port=port)
|
||||
cert = self.probe_cert(domain=domain, **kwargs)
|
||||
except errors.Error as error:
|
||||
logger.debug(str(error), exc_info=True)
|
||||
return False
|
||||
|
||||
return self.verify_cert(domain, cert)
|
||||
return self.verify_cert(cert)
|
||||
|
||||
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class TLSALPN01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-alpn-01 challenge."""
|
||||
response_cls = TLSALPN01Response
|
||||
class TLSSNI01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-sni-01 challenge."""
|
||||
response_cls = TLSSNI01Response
|
||||
typ = response_cls.typ
|
||||
|
||||
# boulder#962, ietf-wg-acme#22
|
||||
#n = jose.Field("n", encoder=int, decoder=int)
|
||||
|
||||
def validation(self, account_key, **kwargs):
|
||||
"""Generate validation.
|
||||
|
||||
:param JWK account_key:
|
||||
:param unicode domain: Domain verified by the challenge.
|
||||
:param OpenSSL.crypto.PKey cert_key: Optional private key used
|
||||
in certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
@@ -509,26 +504,25 @@ class TLSALPN01(KeyAuthorizationChallenge):
|
||||
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
return self.response(account_key).gen_cert(
|
||||
key=kwargs.get('cert_key'),
|
||||
domain=kwargs.get('domain'))
|
||||
|
||||
@staticmethod
|
||||
def is_supported():
|
||||
"""
|
||||
Check if TLS-ALPN-01 challenge is supported on this machine.
|
||||
This implies that a recent version of OpenSSL is installed (>= 1.0.2),
|
||||
or a recent cryptography version shipped with the OpenSSL library is installed.
|
||||
|
||||
:returns: ``True`` if TLS-ALPN-01 is supported on this machine, ``False`` otherwise.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
return (hasattr(SSL.Connection, "set_alpn_protos")
|
||||
and hasattr(SSL.Context, "set_alpn_select_callback"))
|
||||
return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class TLSALPN01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-alpn-01 challenge.
|
||||
|
||||
This class simply allows parsing the TLS-ALPN-01 challenge returned from
|
||||
the CA. Full TLS-ALPN-01 support is not currently provided.
|
||||
|
||||
"""
|
||||
typ = "tls-alpn-01"
|
||||
|
||||
def validation(self, account_key, **kwargs):
|
||||
"""Generate validation for the challenge."""
|
||||
raise NotImplementedError()
|
||||
|
||||
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class DNS(_TokenChallenge):
|
||||
"""ACME "dns" challenge."""
|
||||
typ = "dns"
|
||||
|
||||
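The `is_supported()` check added in the hunk above reduces to two `hasattr` probes against pyOpenSSL; a runnable sketch of the same test:

```
from OpenSSL import SSL

tls_alpn_supported = (hasattr(SSL.Connection, "set_alpn_protos")
                      and hasattr(SSL.Context, "set_alpn_select_callback"))
print("TLS-ALPN-01 supported:", tls_alpn_supported)
```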
@@ -1,15 +1,15 @@
|
||||
"""Tests for acme.challenges."""
|
||||
import urllib.parse as urllib_parse
|
||||
import unittest
|
||||
from unittest import mock
|
||||
|
||||
import josepy as jose
|
||||
import mock
|
||||
import OpenSSL
|
||||
import requests
|
||||
|
||||
from acme import errors
|
||||
from six.moves.urllib import parse as urllib_parse # pylint: disable=import-error
|
||||
|
||||
import test_util
|
||||
from acme import errors
|
||||
from acme import test_util
|
||||
|
||||
CERT = test_util.load_comparable_cert('cert.pem')
|
||||
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
|
||||
@@ -21,6 +21,7 @@ class ChallengeTest(unittest.TestCase):
|
||||
from acme.challenges import Challenge
|
||||
from acme.challenges import UnrecognizedChallenge
|
||||
chall = UnrecognizedChallenge({"type": "foo"})
|
||||
# pylint: disable=no-member
|
||||
self.assertEqual(chall, Challenge.from_json(chall.jobj))
|
||||
|
||||
|
||||
@@ -76,6 +77,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
|
||||
|
||||
|
||||
class DNS01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import DNS01Response
|
||||
@@ -91,8 +93,7 @@ class DNS01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import DNS01Response
|
||||
@@ -147,6 +148,7 @@ class DNS01Test(unittest.TestCase):
|
||||
|
||||
|
||||
class HTTP01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
@@ -162,8 +164,7 @@ class HTTP01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
@@ -184,7 +185,7 @@ class HTTP01ResponseTest(unittest.TestCase):
|
||||
mock_get.return_value = mock.MagicMock(text=validation)
|
||||
self.assertTrue(self.response.simple_verify(
|
||||
self.chall, "local", KEY.public_key()))
|
||||
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
|
||||
mock_get.assert_called_once_with(self.chall.uri("local"))
|
||||
|
||||
@mock.patch("acme.challenges.requests.get")
|
||||
def test_simple_verify_bad_validation(self, mock_get):
|
||||
@@ -200,7 +201,7 @@ class HTTP01ResponseTest(unittest.TestCase):
|
||||
HTTP01Response.WHITESPACE_CUTSET))
|
||||
self.assertTrue(self.response.simple_verify(
|
||||
self.chall, "local", KEY.public_key()))
|
||||
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
|
||||
mock_get.assert_called_once_with(self.chall.uri("local"))
|
||||
|
||||
@mock.patch("acme.challenges.requests.get")
|
||||
def test_simple_verify_connection_error(self, mock_get):
|
||||
@@ -256,68 +257,42 @@ class HTTP01Test(unittest.TestCase):
|
||||
self.msg.update(token=b'..').good_token)
|
||||
|
||||
|
||||
class TLSALPN01ResponseTest(unittest.TestCase):
|
||||
class TLSSNI01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import TLSALPN01
|
||||
self.chall = TLSALPN01(
|
||||
from acme.challenges import TLSSNI01
|
||||
self.chall = TLSSNI01(
|
||||
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
|
||||
self.domain = u'example.com'
|
||||
self.domain2 = u'example2.com'
|
||||
|
||||
self.response = self.chall.response(KEY)
|
||||
self.jmsg = {
|
||||
'resource': 'challenge',
|
||||
'type': 'tls-alpn-01',
|
||||
'type': 'tls-sni-01',
|
||||
'keyAuthorization': self.response.key_authorization,
|
||||
}
|
||||
|
||||
# pylint: disable=invalid-name
|
||||
label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
|
||||
label2 = b'b7793728f084394f2a1afd459556bb5c'
|
||||
self.z = label1 + label2
|
||||
self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
|
||||
self.domain = 'foo.com'
|
||||
|
||||
def test_z_and_domain(self):
|
||||
self.assertEqual(self.z, self.response.z)
|
||||
self.assertEqual(self.z_domain, self.response.z_domain)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.response.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.response.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
self.assertEqual(self.response, TLSALPN01Response.from_json(self.jmsg))
|
||||
from acme.challenges import TLSSNI01Response
|
||||
self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))
|
||||
|
||||
def test_from_json_hashable(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
hash(TLSALPN01Response.from_json(self.jmsg))
|
||||
|
||||
def test_gen_verify_cert(self):
|
||||
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
|
||||
cert, key2 = self.response.gen_cert(self.domain, key1)
|
||||
self.assertEqual(key1, key2)
|
||||
self.assertTrue(self.response.verify_cert(self.domain, cert))
|
||||
|
||||
def test_gen_verify_cert_gen_key(self):
|
||||
cert, key = self.response.gen_cert(self.domain)
|
||||
self.assertIsInstance(key, OpenSSL.crypto.PKey)
|
||||
self.assertTrue(self.response.verify_cert(self.domain, cert))
|
||||
|
||||
def test_verify_bad_cert(self):
|
||||
self.assertFalse(self.response.verify_cert(self.domain,
|
||||
test_util.load_cert('cert.pem')))
|
||||
|
||||
def test_verify_bad_domain(self):
|
||||
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
|
||||
cert, key2 = self.response.gen_cert(self.domain, key1)
|
||||
self.assertEqual(key1, key2)
|
||||
self.assertFalse(self.response.verify_cert(self.domain2, cert))
|
||||
|
||||
def test_simple_verify_bad_key_authorization(self):
|
||||
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
|
||||
self.response.simple_verify(self.chall, "local", key2.public_key())
|
||||
|
||||
@mock.patch('acme.challenges.TLSALPN01Response.verify_cert', autospec=True)
|
||||
def test_simple_verify(self, mock_verify_cert):
|
||||
mock_verify_cert.return_value = mock.sentinel.verification
|
||||
self.assertEqual(
|
||||
mock.sentinel.verification, self.response.simple_verify(
|
||||
self.chall, self.domain, KEY.public_key(),
|
||||
cert=mock.sentinel.cert))
|
||||
mock_verify_cert.assert_called_once_with(
|
||||
self.response, self.domain, mock.sentinel.cert)
|
||||
from acme.challenges import TLSSNI01Response
|
||||
hash(TLSSNI01Response.from_json(self.jmsg))
|
||||
|
||||
@mock.patch('acme.challenges.socket.gethostbyname')
|
||||
@mock.patch('acme.challenges.crypto_util.probe_sni')
|
||||
@@ -326,21 +301,98 @@ class TLSALPN01ResponseTest(unittest.TestCase):
|
||||
self.response.probe_cert('foo.com')
|
||||
mock_gethostbyname.assert_called_once_with('foo.com')
|
||||
mock_probe_sni.assert_called_once_with(
|
||||
host='127.0.0.1', port=self.response.PORT, name='foo.com',
|
||||
alpn_protocols=['acme-tls/1'])
|
||||
host='127.0.0.1', port=self.response.PORT,
|
||||
name=self.z_domain)
|
||||
|
||||
self.response.probe_cert('foo.com', host='8.8.8.8')
|
||||
mock_probe_sni.assert_called_with(
|
||||
host='8.8.8.8', port=mock.ANY, name='foo.com',
|
||||
alpn_protocols=['acme-tls/1'])
|
||||
host='8.8.8.8', port=mock.ANY, name=mock.ANY)
|
||||
|
||||
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
|
||||
self.response.probe_cert('foo.com', port=1234)
|
||||
mock_probe_sni.assert_called_with(
|
||||
host=mock.ANY, port=1234, name=mock.ANY)
|
||||
|
||||
self.response.probe_cert('foo.com', bar='baz')
|
||||
mock_probe_sni.assert_called_with(
|
||||
host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')
|
||||
|
||||
self.response.probe_cert('foo.com', name=b'xxx')
|
||||
mock_probe_sni.assert_called_with(
|
||||
host=mock.ANY, port=mock.ANY,
|
||||
name=self.z_domain)
|
||||
|
||||
def test_gen_verify_cert(self):
|
||||
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
|
||||
cert, key2 = self.response.gen_cert(key1)
|
||||
self.assertEqual(key1, key2)
|
||||
self.assertTrue(self.response.verify_cert(cert))
|
||||
|
||||
def test_gen_verify_cert_gen_key(self):
|
||||
cert, key = self.response.gen_cert()
|
||||
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
|
||||
self.assertTrue(self.response.verify_cert(cert))
|
||||
|
||||
def test_verify_bad_cert(self):
|
||||
self.assertFalse(self.response.verify_cert(
|
||||
test_util.load_cert('cert.pem')))
|
||||
|
||||
def test_simple_verify_bad_key_authorization(self):
|
||||
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
|
||||
self.response.simple_verify(self.chall, "local", key2.public_key())
|
||||
|
||||
@mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
|
||||
def test_simple_verify(self, mock_verify_cert):
|
||||
mock_verify_cert.return_value = mock.sentinel.verification
|
||||
self.assertEqual(
|
||||
mock.sentinel.verification, self.response.simple_verify(
|
||||
self.chall, self.domain, KEY.public_key(),
|
||||
cert=mock.sentinel.cert))
|
||||
mock_verify_cert.assert_called_once_with(
|
||||
self.response, mock.sentinel.cert)
|
||||
|
||||
@mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
|
||||
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
|
||||
mock_probe_cert.side_effect = errors.Error
|
||||
self.assertFalse(self.response.simple_verify(
|
||||
self.chall, self.domain, KEY.public_key()))
|
||||
|
||||
|
||||
class TLSSNI01Test(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import TLSSNI01
|
||||
self.msg = TLSSNI01(
|
||||
token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
|
||||
self.jmsg = {
|
||||
'type': 'tls-sni-01',
|
||||
'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
|
||||
}
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import TLSSNI01
|
||||
self.assertEqual(self.msg, TLSSNI01.from_json(self.jmsg))
|
||||
|
||||
def test_from_json_hashable(self):
|
||||
from acme.challenges import TLSSNI01
|
||||
hash(TLSSNI01.from_json(self.jmsg))
|
||||
|
||||
def test_from_json_invalid_token_length(self):
|
||||
from acme.challenges import TLSSNI01
|
||||
self.jmsg['token'] = jose.encode_b64jose(b'abcd')
|
||||
self.assertRaises(
|
||||
jose.DeserializationError, TLSSNI01.from_json, self.jmsg)
|
||||
|
||||
@mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
|
||||
def test_validation(self, mock_gen_cert):
|
||||
mock_gen_cert.return_value = ('cert', 'key')
|
||||
self.assertEqual(('cert', 'key'), self.msg.validation(
|
||||
KEY, cert_key=mock.sentinel.cert_key))
|
||||
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)
|
||||
|
||||
|
||||
class TLSALPN01Test(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
@@ -369,13 +421,8 @@ class TLSALPN01Test(unittest.TestCase):
|
||||
self.assertRaises(
|
||||
jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
|
||||
|
||||
@mock.patch('acme.challenges.TLSALPN01Response.gen_cert')
|
||||
def test_validation(self, mock_gen_cert):
|
||||
mock_gen_cert.return_value = ('cert', 'key')
|
||||
self.assertEqual(('cert', 'key'), self.msg.validation(
|
||||
KEY, cert_key=mock.sentinel.cert_key, domain=mock.sentinel.domain))
|
||||
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key,
|
||||
domain=mock.sentinel.domain)
|
||||
def test_validation(self):
|
||||
self.assertRaises(NotImplementedError, self.msg.validation, KEY)
|
||||
|
||||
|
||||
class DNSTest(unittest.TestCase):
|
||||
@@ -431,7 +478,7 @@ class DNSTest(unittest.TestCase):
|
||||
mock_gen.return_value = mock.sentinel.validation
|
||||
response = self.msg.gen_response(KEY)
|
||||
from acme.challenges import DNSResponse
|
||||
self.assertIsInstance(response, DNSResponse)
|
||||
self.assertTrue(isinstance(response, DNSResponse))
|
||||
self.assertEqual(response.validation, mock.sentinel.validation)
|
||||
|
||||
def test_validation_domain_name(self):
|
||||
@@ -478,18 +525,5 @@ class DNSResponseTest(unittest.TestCase):
|
||||
self.msg.check_validation(self.chall, KEY.public_key()))
|
||||
|
||||
|
||||
class JWSPayloadRFC8555Compliant(unittest.TestCase):
|
||||
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
|
||||
def test_challenge_payload(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
|
||||
challenge_body = HTTP01Response()
|
||||
challenge_body.le_acme_version = 2
|
||||
|
||||
jobj = challenge_body.json_dumps(indent=2).encode()
|
||||
# RFC8555 states that challenge responses must have an empty payload.
|
||||
self.assertEqual(jobj, b'{}')
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -4,44 +4,53 @@ import collections
|
||||
import datetime
|
||||
from email.utils import parsedate_tz
|
||||
import heapq
|
||||
import http.client as http_client
|
||||
import logging
|
||||
import re
|
||||
import time
|
||||
from typing import cast
|
||||
from typing import Dict
|
||||
from typing import List
|
||||
from typing import Set
|
||||
from typing import Text
|
||||
from typing import Union
|
||||
|
||||
import six
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import re
|
||||
from requests_toolbelt.adapters.source import SourceAddressAdapter
|
||||
import requests
|
||||
from requests.adapters import HTTPAdapter
|
||||
from requests.utils import parse_header_links
|
||||
from requests_toolbelt.adapters.source import SourceAddressAdapter
|
||||
import sys
|
||||
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import jws
|
||||
from acme import messages
|
||||
from acme.mixins import VersionedLEACMEMixin
|
||||
# pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Dict, List, Set, Text
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Prior to Python 2.7.9 the stdlib SSL module did not allow a user to configure
|
||||
# many important security related options. On these platforms we use PyOpenSSL
|
||||
# for SSL, which does allow these options to be configured.
|
||||
# https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning
|
||||
if sys.version_info < (2, 7, 9): # pragma: no cover
|
||||
try:
|
||||
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
|
||||
except AttributeError:
|
||||
import urllib3.contrib.pyopenssl # pylint: disable=import-error
|
||||
urllib3.contrib.pyopenssl.inject_into_urllib3()
|
||||
|
||||
DEFAULT_NETWORK_TIMEOUT = 45
|
||||
|
||||
DER_CONTENT_TYPE = 'application/pkix-cert'
|
||||
|
||||
|
||||
class ClientBase:
|
||||
class ClientBase(object): # pylint: disable=too-many-instance-attributes
|
||||
"""ACME client base object.
|
||||
|
||||
:ivar messages.Directory directory:
|
||||
:ivar .ClientNetwork net: Client network.
|
||||
:ivar int acme_version: ACME protocol version. 1 or 2.
|
||||
"""
|
||||
|
||||
def __init__(self, directory, net, acme_version):
|
||||
"""Initialize.
|
||||
|
||||
@@ -81,8 +90,6 @@ class ClientBase:
|
||||
|
||||
"""
|
||||
kwargs.setdefault('acme_version', self.acme_version)
|
||||
if hasattr(self.directory, 'newNonce'):
|
||||
kwargs.setdefault('new_nonce_url', getattr(self.directory, 'newNonce'))
|
||||
return self.net.post(*args, **kwargs)
|
||||
|
||||
def update_registration(self, regr, update=None):
|
||||
@@ -114,22 +121,14 @@ class ClientBase:
|
||||
"""
|
||||
return self.update_registration(regr, update={'status': 'deactivated'})
|
||||
|
||||
def deactivate_authorization(self,
|
||||
authzr: messages.AuthorizationResource
|
||||
) -> messages.AuthorizationResource:
|
||||
"""Deactivate authorization.
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.AuthorizationResource authzr: The Authorization resource
|
||||
to be deactivated.
|
||||
|
||||
:returns: The Authorization resource that was deactivated.
|
||||
:rtype: `.AuthorizationResource`
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
body = messages.UpdateAuthorization(status='deactivated')
|
||||
response = self._post(authzr.uri, body)
|
||||
return self._authzr_from_response(response,
|
||||
authzr.body.identifier, authzr.uri)
|
||||
return self._send_recv_regr(regr, messages.UpdateRegistration())
|
||||
|
||||
def _authzr_from_response(self, response, identifier=None, uri=None):
|
||||
authzr = messages.AuthorizationResource(
|
||||
@@ -191,7 +190,7 @@ class ClientBase:
|
||||
when = parsedate_tz(retry_after)
|
||||
if when is not None:
|
||||
try:
|
||||
tz_secs = datetime.timedelta(when[-1] if when[-1] is not None else 0)
|
||||
tz_secs = datetime.timedelta(when[-1] if when[-1] else 0)
|
||||
return datetime.datetime(*when[:7]) - tz_secs
|
||||
except (ValueError, OverflowError):
|
||||
pass
|
||||
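A simplified, standalone sketch of the Retry-After handling in the hunk above (header values are made up; here the timezone offset is applied as seconds):

```
import datetime
from email.utils import parsedate_tz

def retry_after_to_datetime(value, default=10):
    """Turn a Retry-After header (integer seconds or HTTP date) into a datetime."""
    try:
        seconds = int(value)
    except ValueError:
        when = parsedate_tz(value)
        if when is not None:
            tz_secs = datetime.timedelta(seconds=when[-1] if when[-1] else 0)
            return datetime.datetime(*when[:7]) - tz_secs
        seconds = default
    return datetime.datetime.now() + datetime.timedelta(seconds=seconds)

print(retry_after_to_datetime("120"))
print(retry_after_to_datetime("Fri, 31 Dec 1999 23:59:59 GMT"))
```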
@@ -199,6 +198,22 @@ class ClientBase:
|
||||
|
||||
return datetime.datetime.now() + datetime.timedelta(seconds=seconds)
|
||||
|
||||
def poll(self, authzr):
|
||||
"""Poll Authorization Resource for status.
|
||||
|
||||
:param authzr: Authorization Resource
|
||||
:type authzr: `.AuthorizationResource`
|
||||
|
||||
:returns: Updated Authorization Resource and HTTP response.
|
||||
|
||||
:rtype: (`.AuthorizationResource`, `requests.Response`)
|
||||
|
||||
"""
|
||||
response = self.net.get(authzr.uri)
|
||||
updated_authzr = self._authzr_from_response(
|
||||
response, authzr.body.identifier, authzr.uri)
|
||||
return updated_authzr, response
|
||||
|
||||
def _revoke(self, cert, rsn, url):
|
||||
"""Revoke certificate.
|
||||
|
||||
@@ -220,7 +235,6 @@ class ClientBase:
|
||||
raise errors.ClientError(
|
||||
'Successful revocation must return HTTP OK status')
|
||||
|
||||
|
||||
class Client(ClientBase):
|
||||
"""ACME client for a v1 API.
|
||||
|
||||
@@ -246,14 +260,15 @@ class Client(ClientBase):
|
||||
URI from which the resource will be downloaded.
|
||||
|
||||
"""
|
||||
# pylint: disable=too-many-arguments
|
||||
self.key = key
|
||||
if net is None:
|
||||
net = ClientNetwork(key, alg=alg, verify_ssl=verify_ssl)
|
||||
|
||||
if isinstance(directory, str):
|
||||
if isinstance(directory, six.string_types):
|
||||
directory = messages.Directory.from_json(
|
||||
net.get(directory).json())
|
||||
super().__init__(directory=directory,
|
||||
super(Client, self).__init__(directory=directory,
|
||||
net=net, acme_version=1)
|
||||
|
||||
def register(self, new_reg=None):
|
||||
@@ -271,17 +286,9 @@ class Client(ClientBase):
|
||||
assert response.status_code == http_client.CREATED
|
||||
|
||||
# "Instance of 'Field' has no key/contact member" bug:
|
||||
# pylint: disable=no-member
|
||||
return self._regr_from_response(response)
|
||||
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
return self._send_recv_regr(regr, messages.UpdateRegistration())
|
||||
|
||||
def agree_to_tos(self, regr):
|
||||
"""Agree to the terms-of-service.
|
||||
|
||||
@@ -380,22 +387,6 @@ class Client(ClientBase):
|
||||
body=jose.ComparableX509(OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_ASN1, response.content)))
|
||||
|
||||
def poll(self, authzr):
|
||||
"""Poll Authorization Resource for status.
|
||||
|
||||
:param authzr: Authorization Resource
|
||||
:type authzr: `.AuthorizationResource`
|
||||
|
||||
:returns: Updated Authorization Resource and HTTP response.
|
||||
|
||||
:rtype: (`.AuthorizationResource`, `requests.Response`)
|
||||
|
||||
"""
|
||||
response = self.net.get(authzr.uri)
|
||||
updated_authzr = self._authzr_from_response(
|
||||
response, authzr.body.identifier, authzr.uri)
|
||||
return updated_authzr, response
|
||||
|
||||
def poll_and_request_issuance(
|
||||
self, csr, authzrs, mintime=5, max_attempts=10):
|
||||
"""Poll and request issuance.
|
||||
@@ -425,8 +416,9 @@ class Client(ClientBase):
|
||||
was marked by the CA as invalid
|
||||
|
||||
"""
|
||||
# pylint: disable=too-many-locals
|
||||
assert max_attempts > 0
|
||||
attempts: Dict[messages.AuthorizationResource, int] = collections.defaultdict(int)
|
||||
attempts = collections.defaultdict(int) # type: Dict[messages.AuthorizationResource, int]
|
||||
exhausted = set()
|
||||
|
||||
# priority queue with datetime.datetime (based on Retry-After) as key,
|
||||
@@ -438,7 +430,7 @@ class Client(ClientBase):
|
||||
heapq.heapify(waiting)
|
||||
# mapping between original Authorization Resource and the most
|
||||
# recently updated one
|
||||
updated = {authzr: authzr for authzr in authzrs}
|
||||
updated = dict((authzr, authzr) for authzr in authzrs)
|
||||
|
||||
while waiting:
|
||||
# find the smallest Retry-After, and sleep if necessary
|
||||
@@ -455,6 +447,7 @@ class Client(ClientBase):
|
||||
updated[authzr] = updated_authzr
|
||||
|
||||
attempts[authzr] += 1
|
||||
# pylint: disable=no-member
|
||||
if updated_authzr.body.status not in (
|
||||
messages.STATUS_VALID, messages.STATUS_INVALID):
|
||||
if attempts[authzr] < max_attempts:
|
||||
@@ -465,7 +458,7 @@ class Client(ClientBase):
|
||||
exhausted.add(authzr)
|
||||
|
||||
if exhausted or any(authzr.body.status == messages.STATUS_INVALID
|
||||
for authzr in updated.values()):
|
||||
for authzr in six.itervalues(updated)):
|
||||
raise errors.PollError(exhausted, updated)
|
||||
|
||||
updated_authzrs = tuple(updated[authzr] for authzr in authzrs)
|
||||
@@ -539,7 +532,7 @@ class Client(ClientBase):
|
||||
:rtype: `list` of `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
|
||||
|
||||
"""
|
||||
chain: List[jose.ComparableX509] = []
|
||||
chain = [] # type: List[jose.ComparableX509]
|
||||
uri = certr.cert_chain_uri
|
||||
while uri is not None and len(chain) < max_length:
|
||||
response, cert = self._get_cert(uri)
|
||||
@@ -577,7 +570,7 @@ class ClientV2(ClientBase):
|
||||
:param .messages.Directory directory: Directory Resource
|
||||
:param .ClientNetwork net: Client network.
|
||||
"""
|
||||
super().__init__(directory=directory,
|
||||
super(ClientV2, self).__init__(directory=directory,
|
||||
net=net, acme_version=2)
|
||||
|
||||
def new_account(self, new_account):
|
||||
@@ -585,59 +578,16 @@ class ClientV2(ClientBase):
|
||||
|
||||
:param .NewRegistration new_account:
|
||||
|
||||
:raises .ConflictError: in case the account already exists
|
||||
|
||||
:returns: Registration Resource.
|
||||
:rtype: `.RegistrationResource`
|
||||
"""
|
||||
response = self._post(self.directory['newAccount'], new_account)
|
||||
# if account already exists
|
||||
if response.status_code == 200 and 'Location' in response.headers:
|
||||
raise errors.ConflictError(response.headers.get('Location'))
|
||||
# "Instance of 'Field' has no key/contact member" bug:
|
||||
# pylint: disable=no-member
|
||||
regr = self._regr_from_response(response)
|
||||
self.net.account = regr
|
||||
return regr
|
||||
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
self.net.account = regr # See certbot/certbot#6258
|
||||
# ACME v2 requires to use a POST-as-GET request (POST an empty JWS) here.
|
||||
# This is done by passing None instead of an empty UpdateRegistration to _post().
|
||||
response = self._post(regr.uri, None)
|
||||
self.net.account = self._regr_from_response(response, uri=regr.uri,
|
||||
terms_of_service=regr.terms_of_service)
|
||||
return self.net.account
|
||||
|
||||
def update_registration(self, regr, update=None):
|
||||
"""Update registration.
|
||||
|
||||
:param messages.RegistrationResource regr: Registration Resource.
|
||||
:param messages.Registration update: Updated body of the
|
||||
resource. If not provided, body will be taken from `regr`.
|
||||
|
||||
:returns: Updated Registration Resource.
|
||||
:rtype: `.RegistrationResource`
|
||||
|
||||
"""
|
||||
# https://github.com/certbot/certbot/issues/6155
|
||||
new_regr = self._get_v2_account(regr)
|
||||
return super().update_registration(new_regr, update)
|
||||
|
||||
def _get_v2_account(self, regr):
|
||||
self.net.account = None
|
||||
only_existing_reg = regr.body.update(only_return_existing=True)
|
||||
response = self._post(self.directory['newAccount'], only_existing_reg)
|
||||
updated_uri = response.headers['Location']
|
||||
new_regr = regr.update(uri=updated_uri)
|
||||
self.net.account = new_regr
|
||||
return new_regr
|
||||
|
||||
def new_order(self, csr_pem):
|
||||
"""Request a new Order object from the server.
|
||||
|
||||
@@ -659,29 +609,13 @@ class ClientV2(ClientBase):
|
||||
body = messages.Order.from_json(response.json())
|
||||
authorizations = []
|
||||
for url in body.authorizations:
|
||||
authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
|
||||
authorizations.append(self._authzr_from_response(self.net.get(url), uri=url))
|
||||
return messages.OrderResource(
|
||||
body=body,
|
||||
uri=response.headers.get('Location'),
|
||||
authorizations=authorizations,
|
||||
csr_pem=csr_pem)
|
||||
|
||||
def poll(self, authzr):
|
||||
"""Poll Authorization Resource for status.
|
||||
|
||||
:param authzr: Authorization Resource
|
||||
:type authzr: `.AuthorizationResource`
|
||||
|
||||
:returns: Updated Authorization Resource and HTTP response.
|
||||
|
||||
:rtype: (`.AuthorizationResource`, `requests.Response`)
|
||||
|
||||
"""
|
||||
response = self._post_as_get(authzr.uri)
|
||||
updated_authzr = self._authzr_from_response(
|
||||
response, authzr.body.identifier, authzr.uri)
|
||||
return updated_authzr, response
|
||||
|
||||
def poll_and_finalize(self, orderr, deadline=None):
|
||||
"""Poll authorizations and finalize the order.
|
||||
|
||||
@@ -705,7 +639,7 @@ class ClientV2(ClientBase):
|
||||
responses = []
|
||||
for url in orderr.body.authorizations:
|
||||
while datetime.datetime.now() < deadline:
|
||||
authzr = self._authzr_from_response(self._post_as_get(url), uri=url)
|
||||
authzr = self._authzr_from_response(self.net.get(url), uri=url)
|
||||
if authzr.body.status != messages.STATUS_PENDING:
|
||||
responses.append(authzr)
|
||||
break
|
||||
@@ -718,19 +652,17 @@ class ClientV2(ClientBase):
|
||||
for authzr in responses:
|
||||
if authzr.body.status != messages.STATUS_VALID:
|
||||
for chall in authzr.body.challenges:
|
||||
if chall.error is not None:
|
||||
if chall.error != None:
|
||||
failed.append(authzr)
|
||||
if failed:
|
||||
if len(failed) > 0:
|
||||
raise errors.ValidationError(failed)
|
||||
return orderr.update(authorizations=responses)
|
||||
|
||||
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
|
||||
def finalize_order(self, orderr, deadline):
|
||||
"""Finalize an order and obtain a certificate.
|
||||
|
||||
:param messages.OrderResource orderr: order to finalize
|
||||
:param datetime.datetime deadline: when to stop polling and timeout
|
||||
:param bool fetch_alternative_chains: whether to also fetch alternative
|
||||
certificate chains
|
||||
|
||||
:returns: finalized order
|
||||
:rtype: messages.OrderResource
|
||||
@@ -742,18 +674,14 @@ class ClientV2(ClientBase):
|
||||
self._post(orderr.body.finalize, wrapped_csr)
|
||||
while datetime.datetime.now() < deadline:
|
||||
time.sleep(1)
|
||||
response = self._post_as_get(orderr.uri)
|
||||
response = self.net.get(orderr.uri)
|
||||
body = messages.Order.from_json(response.json())
|
||||
if body.error is not None:
|
||||
raise errors.IssuanceError(body.error)
|
||||
if body.certificate is not None:
|
||||
certificate_response = self._post_as_get(body.certificate)
|
||||
orderr = orderr.update(body=body, fullchain_pem=certificate_response.text)
|
||||
if fetch_alternative_chains:
|
||||
alt_chains_urls = self._get_links(certificate_response, 'alternate')
|
||||
alt_chains = [self._post_as_get(url).text for url in alt_chains_urls]
|
||||
orderr = orderr.update(alternative_fullchains_pem=alt_chains)
|
||||
return orderr
|
||||
certificate_response = self.net.get(body.certificate,
|
||||
content_type=DER_CONTENT_TYPE).text
|
||||
return orderr.update(body=body, fullchain_pem=certificate_response)
|
||||
raise errors.TimeoutError()
|
||||
|
||||
def revoke(self, cert, rsn):
|
||||
@@ -769,36 +697,8 @@ class ClientV2(ClientBase):
|
||||
"""
|
||||
return self._revoke(cert, rsn, self.directory['revokeCert'])
|
||||
|
||||
def external_account_required(self):
|
||||
"""Checks if ACME server requires External Account Binding authentication."""
|
||||
return hasattr(self.directory, 'meta') and self.directory.meta.external_account_required
|
||||
|
||||
def _post_as_get(self, *args, **kwargs):
"""
Send GET request using the POST-as-GET protocol.
:param args:
:param kwargs:
:return:
"""
new_args = args[:1] + (None,) + args[1:]
return self._post(*new_args, **kwargs)

def _get_links(self, response, relation_type):
"""
Retrieves all Link URIs of relation_type from the response.
:param requests.Response response: The requests HTTP response.
:param str relation_type: The relation type to filter by.
"""
# Can't use response.links directly because it drops multiple links
# of the same relation type, which is possible in RFC8555 responses.
if 'Link' not in response.headers:
return []
links = parse_header_links(response.headers['Link'])
return [l['url'] for l in links
if 'rel' in l and 'url' in l and l['rel'] == relation_type]

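The comment in `_get_links` is the key detail: `requests` collapses `response.links` to a single URL per relation type, so multiple `rel="alternate"` links (which RFC 8555 responses may carry) have to be parsed by hand. A minimal standalone sketch of the same filtering, using `requests.utils.parse_header_links` on an illustrative header value:

```python
from requests.utils import parse_header_links

# Illustrative RFC 8555-style Link header carrying two "alternate" relations.
link_header = ('<https://example.com/acme/cert/1>;rel="alternate", '
               '<https://example.com/dir>;rel="index", '
               '<https://example.com/acme/cert/2>;rel="alternate"')

links = parse_header_links(link_header)
alternates = [l['url'] for l in links
              if 'rel' in l and 'url' in l and l['rel'] == 'alternate']
# -> ['https://example.com/acme/cert/1', 'https://example.com/acme/cert/2']
print(alternates)
```
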
class BackwardsCompatibleClientV2:
|
||||
class BackwardsCompatibleClientV2(object):
|
||||
"""ACME client wrapper that tends towards V2-style calls, but
|
||||
supports V1 servers.
|
||||
|
||||
@@ -820,14 +720,18 @@ class BackwardsCompatibleClientV2:
|
||||
def __init__(self, net, key, server):
|
||||
directory = messages.Directory.from_json(net.get(server).json())
|
||||
self.acme_version = self._acme_version_from_directory(directory)
|
||||
self.client: Union[Client, ClientV2]
|
||||
if self.acme_version == 1:
|
||||
self.client = Client(directory, key=key, net=net)
|
||||
else:
|
||||
self.client = ClientV2(directory, net=net)
|
||||
|
||||
def __getattr__(self, name):
|
||||
return getattr(self.client, name)
|
||||
if name in vars(self.client):
|
||||
return getattr(self.client, name)
|
||||
elif name in dir(ClientBase):
|
||||
return getattr(self.client, name)
|
||||
else:
|
||||
raise AttributeError()
|
||||
|
||||
def new_account_and_tos(self, regr, check_tos_cb=None):
|
||||
"""Combined register and agree_tos for V1, new_account for V2
|
||||
@@ -840,18 +744,16 @@ class BackwardsCompatibleClientV2:
|
||||
if check_tos_cb is not None:
|
||||
check_tos_cb(tos)
|
||||
if self.acme_version == 1:
|
||||
client_v1 = cast(Client, self.client)
|
||||
regr = client_v1.register(regr)
|
||||
regr = self.client.register(regr)
|
||||
if regr.terms_of_service is not None:
|
||||
_assess_tos(regr.terms_of_service)
|
||||
return client_v1.agree_to_tos(regr)
|
||||
return self.client.agree_to_tos(regr)
|
||||
return regr
|
||||
else:
|
||||
client_v2 = cast(ClientV2, self.client)
|
||||
if "terms_of_service" in client_v2.directory.meta:
|
||||
_assess_tos(client_v2.directory.meta.terms_of_service)
|
||||
if "terms_of_service" in self.client.directory.meta:
|
||||
_assess_tos(self.client.directory.meta.terms_of_service)
|
||||
regr = regr.update(terms_of_service_agreed=True)
|
||||
return client_v2.new_account(regr)
|
||||
return self.client.new_account(regr)
|
||||
|
||||
def new_order(self, csr_pem):
|
||||
"""Request a new Order object from the server.
|
||||
@@ -869,32 +771,29 @@ class BackwardsCompatibleClientV2:
|
||||
|
||||
"""
|
||||
if self.acme_version == 1:
|
||||
client_v1 = cast(Client, self.client)
|
||||
csr = OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
# pylint: disable=protected-access
|
||||
dnsNames = crypto_util._pyopenssl_cert_or_req_all_names(csr)
|
||||
authorizations = []
|
||||
for domain in dnsNames:
|
||||
authorizations.append(client_v1.request_domain_challenges(domain))
|
||||
authorizations.append(self.client.request_domain_challenges(domain))
|
||||
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
|
||||
return cast(ClientV2, self.client).new_order(csr_pem)
|
||||
else:
|
||||
return self.client.new_order(csr_pem)
|
||||
|
||||
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
|
||||
def finalize_order(self, orderr, deadline):
|
||||
"""Finalize an order and obtain a certificate.
|
||||
|
||||
:param messages.OrderResource orderr: order to finalize
|
||||
:param datetime.datetime deadline: when to stop polling and timeout
|
||||
:param bool fetch_alternative_chains: whether to also fetch alternative
|
||||
certificate chains
|
||||
|
||||
:returns: finalized order
|
||||
:rtype: messages.OrderResource
|
||||
|
||||
"""
|
||||
if self.acme_version == 1:
|
||||
client_v1 = cast(Client, self.client)
|
||||
csr_pem = orderr.csr_pem
|
||||
certr = client_v1.request_issuance(
|
||||
certr = self.client.request_issuance(
|
||||
jose.ComparableX509(
|
||||
OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)),
|
||||
orderr.authorizations)
|
||||
@@ -902,7 +801,7 @@ class BackwardsCompatibleClientV2:
|
||||
chain = None
|
||||
while datetime.datetime.now() < deadline:
|
||||
try:
|
||||
chain = client_v1.fetch_chain(certr)
|
||||
chain = self.client.fetch_chain(certr)
|
||||
break
|
||||
except errors.Error:
|
||||
time.sleep(1)
|
||||
@@ -917,8 +816,8 @@ class BackwardsCompatibleClientV2:
|
||||
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
|
||||
|
||||
return orderr.update(fullchain_pem=(cert + chain))
|
||||
return cast(ClientV2, self.client).finalize_order(
|
||||
orderr, deadline, fetch_alternative_chains)
|
||||
else:
|
||||
return self.client.finalize_order(orderr, deadline)
|
||||
|
||||
def revoke(self, cert, rsn):
|
||||
"""Revoke certificate.
|
||||
@@ -936,18 +835,11 @@ class BackwardsCompatibleClientV2:
|
||||
def _acme_version_from_directory(self, directory):
|
||||
if hasattr(directory, 'newNonce'):
|
||||
return 2
|
||||
return 1
|
||||
|
||||
def external_account_required(self):
|
||||
"""Checks if the server requires an external account for ACMEv2 servers.
|
||||
|
||||
Always return False for ACMEv1 servers, as it doesn't use External Account Binding."""
|
||||
if self.acme_version == 1:
|
||||
return False
|
||||
return cast(ClientV2, self.client).external_account_required()
|
||||
else:
|
||||
return 1
|
||||
|
||||
|
||||
class ClientNetwork:
|
||||
class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
|
||||
"""Wrapper around requests that signs POSTs for authentication.
|
||||
|
||||
Also adds user agent, and handles Content-Type.
|
||||
@@ -963,7 +855,7 @@ class ClientNetwork:
|
||||
:param messages.RegistrationResource account: Account object. Required if you are
|
||||
planning to use .post() with acme_version=2 for anything other than
|
||||
creating a new account; may be set later after registering.
|
||||
:param josepy.JWASignature alg: Algorithm to use in signing JWS.
|
||||
:param josepy.JWASignature alg: Algoritm to use in signing JWS.
|
||||
:param bool verify_ssl: Whether to verify certificates on SSL connections.
|
||||
:param str user_agent: String to send as User-Agent header.
|
||||
:param float timeout: Timeout for requests.
|
||||
@@ -973,11 +865,12 @@ class ClientNetwork:
|
||||
def __init__(self, key, account=None, alg=jose.RS256, verify_ssl=True,
|
||||
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT,
|
||||
source_address=None):
|
||||
# pylint: disable=too-many-arguments
|
||||
self.key = key
|
||||
self.account = account
|
||||
self.alg = alg
|
||||
self.verify_ssl = verify_ssl
|
||||
self._nonces: Set[Text] = set()
|
||||
self._nonces = set() # type: Set[Text]
|
||||
self.user_agent = user_agent
|
||||
self.session = requests.Session()
|
||||
self._default_timeout = timeout
|
||||
@@ -1008,9 +901,7 @@ class ClientNetwork:
|
||||
:rtype: `josepy.JWS`
|
||||
|
||||
"""
|
||||
if isinstance(obj, VersionedLEACMEMixin):
|
||||
obj.le_acme_version = acme_version
|
||||
jobj = obj.json_dumps(indent=2).encode() if obj else b''
|
||||
jobj = obj.json_dumps(indent=2).encode()
|
||||
logger.debug('JWS payload:\n%s', jobj)
|
||||
kwargs = {
|
||||
"alg": self.alg,
|
||||
@@ -1019,10 +910,10 @@ class ClientNetwork:
|
||||
if acme_version == 2:
|
||||
kwargs["url"] = url
|
||||
# newAccount and revokeCert work without the kid
|
||||
# newAccount must not have kid
|
||||
if self.account is not None:
|
||||
kwargs["kid"] = self.account["uri"]
|
||||
kwargs["key"] = self.key
|
||||
# pylint: disable=star-args
|
||||
return jws.JWS.sign(jobj, **kwargs).json_dumps(indent=2)
|
||||
|
||||
@classmethod
|
||||
@@ -1045,9 +936,6 @@ class ClientNetwork:
|
||||
|
||||
"""
|
||||
response_ct = response.headers.get('Content-Type')
|
||||
# Strip parameters from the media-type (rfc2616#section-3.7)
|
||||
if response_ct:
|
||||
response_ct = response_ct.split(';')[0].strip()
|
||||
try:
|
||||
# TODO: response.json() is called twice, once here, and
|
||||
# once in _get and _post clients
|
||||
@@ -1085,6 +973,7 @@ class ClientNetwork:
|
||||
return response
|
||||
|
||||
def _send_request(self, method, url, *args, **kwargs):
|
||||
# pylint: disable=too-many-locals
|
||||
"""Send HTTP request.
|
||||
|
||||
Makes sure that `verify_ssl` is respected. Logs request and
|
||||
@@ -1131,21 +1020,21 @@ class ClientNetwork:
|
||||
err_regex = r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*).*(\[Errno \d+\])([A-Za-z ]*)"
|
||||
m = re.match(err_regex, str(e))
|
||||
if m is None:
|
||||
raise # pragma: no cover
|
||||
host, path, _err_no, err_msg = m.groups()
|
||||
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
|
||||
raise # pragma: no cover
|
||||
else:
|
||||
host, path, _err_no, err_msg = m.groups()
|
||||
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
|
||||
|
||||
# If content is DER, log the base64 of it instead of raw bytes, to keep
|
||||
# binary data out of the logs.
|
||||
debug_content: Union[bytes, str]
|
||||
if response.headers.get("Content-Type") == DER_CONTENT_TYPE:
|
||||
debug_content = base64.b64encode(response.content)
|
||||
else:
|
||||
debug_content = response.content.decode("utf-8")
|
||||
logger.debug('Received response:\nHTTP %d\n%s\n\n%s',
|
||||
response.status_code,
|
||||
"\n".join("{0}: {1}".format(k, v)
|
||||
for k, v in response.headers.items()),
|
||||
"\n".join(["{0}: {1}".format(k, v)
|
||||
for k, v in response.headers.items()]),
|
||||
debug_content)
|
||||
return response
|
||||
|
||||
@@ -1176,15 +1065,10 @@ class ClientNetwork:
|
||||
else:
|
||||
raise errors.MissingNonce(response)
|
||||
|
||||
def _get_nonce(self, url, new_nonce_url):
def _get_nonce(self, url):
if not self._nonces:
logger.debug('Requesting fresh nonce')
if new_nonce_url is None:
response = self.head(url)
else:
# request a new nonce from the acme newNonce endpoint
response = self._check_response(self.head(new_nonce_url), content_type=None)
self._add_nonce(response)
self._add_nonce(self.head(url))
return self._nonces.pop()

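The hunk above contrasts the two ways of refilling the nonce pool: HEAD-ing the request URL itself versus HEAD-ing the directory's `newNonce` resource as RFC 8555 describes. A minimal sketch of the `newNonce` request with plain `requests`; the endpoint URL is illustrative:

```python
import requests

# HEAD on the newNonce resource returns a Replay-Nonce header and no body.
new_nonce_url = 'https://acme-staging-v02.api.letsencrypt.org/acme/new-nonce'
response = requests.head(new_nonce_url, timeout=30)
nonce = response.headers['Replay-Nonce']
print(nonce)
```
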
def post(self, *args, **kwargs):
|
||||
@@ -1200,14 +1084,13 @@ class ClientNetwork:
|
||||
if error.code == 'badNonce':
|
||||
logger.debug('Retrying request after error:\n%s', error)
|
||||
return self._post_once(*args, **kwargs)
|
||||
raise
|
||||
else:
|
||||
raise
|
||||
|
||||
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
|
||||
acme_version=1, **kwargs):
|
||||
new_nonce_url = kwargs.pop('new_nonce_url', None)
|
||||
data = self._wrap_in_jws(obj, self._get_nonce(url, new_nonce_url), url, acme_version)
|
||||
data = self._wrap_in_jws(obj, self._get_nonce(url), url, acme_version)
|
||||
kwargs.setdefault('headers', {'Content-Type': content_type})
|
||||
response = self._send_request('POST', url, data=data, **kwargs)
|
||||
response = self._check_response(response, content_type=content_type)
|
||||
self._add_nonce(response)
|
||||
return response
|
||||
return self._check_response(response, content_type=content_type)
|
||||
|
||||
@@ -1,14 +1,13 @@
"""Tests for acme.client."""
# pylint: disable=too-many-lines
import copy
import datetime
import http.client as http_client
import json
import unittest
from typing import Dict
from unittest import mock

from six.moves import http_client # pylint: disable=import-error

import josepy as jose
import mock
import OpenSSL
import requests

@@ -16,9 +15,10 @@ from acme import challenges
|
||||
from acme import errors
|
||||
from acme import jws as acme_jws
|
||||
from acme import messages
|
||||
from acme.mixins import VersionedLEACMEMixin
|
||||
import messages_test
|
||||
import test_util
|
||||
from acme import messages_test
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
CERT_DER = test_util.load_vector('cert.der')
|
||||
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
|
||||
@@ -62,8 +62,8 @@ class ClientTestBase(unittest.TestCase):
|
||||
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
|
||||
reg = messages.Registration(
|
||||
contact=self.contact, key=KEY.public_key())
|
||||
the_arg: Dict = dict(reg)
|
||||
self.new_reg = messages.NewRegistration(**the_arg)
|
||||
the_arg = dict(reg) # type: Dict
|
||||
self.new_reg = messages.NewRegistration(**the_arg) # pylint: disable=star-args
|
||||
self.regr = messages.RegistrationResource(
|
||||
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
|
||||
|
||||
@@ -90,7 +90,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
"""Tests for acme.client.BackwardsCompatibleClientV2."""
|
||||
|
||||
def setUp(self):
|
||||
super().setUp()
|
||||
super(BackwardsCompatibleClientV2Test, self).setUp()
|
||||
# contains a loaded cert
|
||||
self.certr = messages.CertificateResource(
|
||||
body=messages_test.CERT)
|
||||
@@ -134,18 +134,12 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
client = self._init()
|
||||
self.assertEqual(client.acme_version, 2)
|
||||
|
||||
def test_query_registration_client_v2(self):
|
||||
self.response.json.return_value = DIRECTORY_V2.to_json()
|
||||
client = self._init()
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, client.query_registration(self.regr))
|
||||
|
||||
def test_forwarding(self):
|
||||
self.response.json.return_value = DIRECTORY_V1.to_json()
|
||||
client = self._init()
|
||||
self.assertEqual(client.directory, client.client.directory)
|
||||
self.assertEqual(client.key, KEY)
|
||||
self.assertEqual(client.deactivate_registration, client.client.deactivate_registration)
|
||||
self.assertEqual(client.update_registration, client.client.update_registration)
|
||||
self.assertRaises(AttributeError, client.__getattr__, 'nonexistent')
|
||||
self.assertRaises(AttributeError, client.__getattr__, 'new_account_and_tos')
|
||||
self.assertRaises(AttributeError, client.__getattr__, 'new_account')
|
||||
@@ -261,7 +255,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
with mock.patch('acme.client.ClientV2') as mock_client:
|
||||
client = self._init()
|
||||
client.finalize_order(mock_orderr, mock_deadline)
|
||||
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline, False)
|
||||
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline)
|
||||
|
||||
def test_revoke(self):
|
||||
self.response.json.return_value = DIRECTORY_V1.to_json()
|
||||
@@ -276,50 +270,13 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
client.revoke(messages_test.CERT, self.rsn)
|
||||
mock_client().revoke.assert_called_once_with(messages_test.CERT, self.rsn)
|
||||
|
||||
def test_update_registration(self):
|
||||
self.response.json.return_value = DIRECTORY_V1.to_json()
|
||||
with mock.patch('acme.client.Client') as mock_client:
|
||||
client = self._init()
|
||||
client.update_registration(mock.sentinel.regr, None)
|
||||
mock_client().update_registration.assert_called_once_with(mock.sentinel.regr, None)
|
||||
|
||||
# newNonce present means it will pick acme_version 2
|
||||
def test_external_account_required_true(self):
|
||||
self.response.json.return_value = messages.Directory({
|
||||
'newNonce': 'http://letsencrypt-test.com/acme/new-nonce',
|
||||
'meta': messages.Directory.Meta(external_account_required=True),
|
||||
}).to_json()
|
||||
|
||||
client = self._init()
|
||||
|
||||
self.assertTrue(client.external_account_required())
|
||||
|
||||
# newNonce present means it will pick acme_version 2
|
||||
def test_external_account_required_false(self):
|
||||
self.response.json.return_value = messages.Directory({
|
||||
'newNonce': 'http://letsencrypt-test.com/acme/new-nonce',
|
||||
'meta': messages.Directory.Meta(external_account_required=False),
|
||||
}).to_json()
|
||||
|
||||
client = self._init()
|
||||
|
||||
self.assertFalse(client.external_account_required())
|
||||
|
||||
def test_external_account_required_false_v1(self):
|
||||
self.response.json.return_value = messages.Directory({
|
||||
'meta': messages.Directory.Meta(external_account_required=False),
|
||||
}).to_json()
|
||||
|
||||
client = self._init()
|
||||
|
||||
self.assertFalse(client.external_account_required())
|
||||
|
||||
|
||||
class ClientTest(ClientTestBase):
|
||||
"""Tests for acme.client.Client."""
|
||||
# pylint: disable=too-many-instance-attributes,too-many-public-methods
|
||||
|
||||
def setUp(self):
|
||||
super().setUp()
|
||||
super(ClientTest, self).setUp()
|
||||
|
||||
self.directory = DIRECTORY_V1
|
||||
|
||||
@@ -356,6 +313,7 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_register(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.status_code = http_client.CREATED
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
@@ -368,6 +326,7 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_update_registration(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, self.client.update_registration(self.regr))
|
||||
@@ -604,8 +563,8 @@ class ClientTest(ClientTestBase):
|
||||
# make sure that max_attempts is per-authorization, rather
|
||||
# than global
|
||||
max_attempts=max(len(authzrs[0].retries), len(authzrs[1].retries)))
|
||||
self.assertIs(cert[0], csr)
|
||||
self.assertIs(cert[1], updated_authzrs)
|
||||
self.assertTrue(cert[0] is csr)
|
||||
self.assertTrue(cert[1] is updated_authzrs)
|
||||
self.assertEqual(updated_authzrs[0].uri, 'a...')
|
||||
self.assertEqual(updated_authzrs[1].uri, 'b.')
|
||||
self.assertEqual(updated_authzrs[0].times, [
|
||||
@@ -635,14 +594,6 @@ class ClientTest(ClientTestBase):
|
||||
errors.PollError, self.client.poll_and_request_issuance,
|
||||
csr, authzrs, mintime=mintime, max_attempts=2)
|
||||
|
||||
def test_deactivate_authorization(self):
|
||||
authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
|
||||
self.response.json.return_value = authzb.to_json()
|
||||
authzr = self.client.deactivate_authorization(self.authzr)
|
||||
self.assertEqual(authzb, authzr.body)
|
||||
self.assertEqual(self.client.net.post.call_count, 1)
|
||||
self.assertIn(self.authzr.uri, self.net.post.call_args_list[0][0])
|
||||
|
||||
def test_check_cert(self):
|
||||
self.response.headers['Location'] = self.certr.uri
|
||||
self.response.content = CERT_DER
|
||||
@@ -700,8 +651,8 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_revocation_payload(self):
|
||||
obj = messages.Revocation(certificate=self.certr.body, reason=self.rsn)
|
||||
self.assertIn('reason', obj.to_partial_json().keys())
|
||||
self.assertEqual(self.rsn, obj.to_partial_json()['reason'])
|
||||
self.assertTrue('reason' in obj.to_partial_json().keys())
|
||||
self.assertEquals(self.rsn, obj.to_partial_json()['reason'])
|
||||
|
||||
def test_revoke_bad_status_raises_error(self):
|
||||
self.response.status_code = http_client.METHOD_NOT_ALLOWED
|
||||
@@ -711,12 +662,11 @@ class ClientTest(ClientTestBase):
|
||||
self.certr,
|
||||
self.rsn)
|
||||
|
||||
|
||||
class ClientV2Test(ClientTestBase):
|
||||
"""Tests for acme.client.ClientV2."""
|
||||
|
||||
def setUp(self):
|
||||
super().setUp()
|
||||
super(ClientV2Test, self).setUp()
|
||||
|
||||
self.directory = DIRECTORY_V2
|
||||
|
||||
@@ -749,11 +699,6 @@ class ClientV2Test(ClientTestBase):
|
||||
|
||||
self.assertEqual(self.regr, self.client.new_account(self.new_reg))
|
||||
|
||||
def test_new_account_conflict(self):
|
||||
self.response.status_code = http_client.OK
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.assertRaises(errors.ConflictError, self.client.new_account, self.new_reg)
|
||||
|
||||
def test_new_order(self):
|
||||
order_response = copy.deepcopy(self.response)
|
||||
order_response.status_code = http_client.CREATED
|
||||
@@ -767,10 +712,9 @@ class ClientV2Test(ClientTestBase):
|
||||
authz_response2 = self.response
|
||||
authz_response2.json.return_value = self.authz2.to_json()
|
||||
authz_response2.headers['Location'] = self.authzr2.uri
|
||||
self.net.get.side_effect = (authz_response, authz_response2)
|
||||
|
||||
with mock.patch('acme.client.ClientV2._post_as_get') as mock_post_as_get:
|
||||
mock_post_as_get.side_effect = (authz_response, authz_response2)
|
||||
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
|
||||
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
|
||||
|
||||
@mock.patch('acme.client.datetime')
|
||||
def test_poll_and_finalize(self, mock_datetime):
|
||||
@@ -840,80 +784,13 @@ class ClientV2Test(ClientTestBase):
|
||||
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
|
||||
self.assertRaises(errors.TimeoutError, self.client.finalize_order, self.orderr, deadline)
|
||||
|
||||
def test_finalize_order_alt_chains(self):
|
||||
updated_order = self.order.update(
|
||||
certificate='https://www.letsencrypt-demo.org/acme/cert/',
|
||||
)
|
||||
updated_orderr = self.orderr.update(body=updated_order,
|
||||
fullchain_pem=CERT_SAN_PEM,
|
||||
alternative_fullchains_pem=[CERT_SAN_PEM,
|
||||
CERT_SAN_PEM])
|
||||
self.response.json.return_value = updated_order.to_json()
|
||||
self.response.text = CERT_SAN_PEM
|
||||
self.response.headers['Link'] ='<https://example.com/acme/cert/1>;rel="alternate", ' + \
|
||||
'<https://example.com/dir>;rel="index", ' + \
|
||||
'<https://example.com/acme/cert/2>;title="foo";rel="alternate"'
|
||||
|
||||
deadline = datetime.datetime(9999, 9, 9)
|
||||
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
|
||||
self.net.post.assert_any_call('https://example.com/acme/cert/1',
|
||||
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
|
||||
self.net.post.assert_any_call('https://example.com/acme/cert/2',
|
||||
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
|
||||
self.assertEqual(resp, updated_orderr)
|
||||
|
||||
del self.response.headers['Link']
|
||||
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
|
||||
self.assertEqual(resp, updated_orderr.update(alternative_fullchains_pem=[]))
|
||||
|
||||
def test_revoke(self):
|
||||
self.client.revoke(messages_test.CERT, self.rsn)
|
||||
self.net.post.assert_called_once_with(
|
||||
self.directory["revokeCert"], mock.ANY, acme_version=2,
|
||||
new_nonce_url=DIRECTORY_V2['newNonce'])
|
||||
|
||||
def test_update_registration(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, self.client.update_registration(self.regr))
|
||||
self.assertIsNotNone(self.client.net.account)
|
||||
self.assertEqual(self.client.net.post.call_count, 2)
|
||||
self.assertIn(DIRECTORY_V2.newAccount, self.net.post.call_args_list[0][0])
|
||||
|
||||
self.response.json.return_value = self.regr.body.update(
|
||||
contact=()).to_json()
|
||||
|
||||
def test_external_account_required_true(self):
|
||||
self.client.directory = messages.Directory({
|
||||
'meta': messages.Directory.Meta(external_account_required=True)
|
||||
})
|
||||
|
||||
self.assertTrue(self.client.external_account_required())
|
||||
|
||||
def test_external_account_required_false(self):
|
||||
self.client.directory = messages.Directory({
|
||||
'meta': messages.Directory.Meta(external_account_required=False)
|
||||
})
|
||||
|
||||
self.assertFalse(self.client.external_account_required())
|
||||
|
||||
def test_external_account_required_default(self):
|
||||
self.assertFalse(self.client.external_account_required())
|
||||
|
||||
def test_post_as_get(self):
|
||||
with mock.patch('acme.client.ClientV2._authzr_from_response') as mock_client:
|
||||
mock_client.return_value = self.authzr2
|
||||
|
||||
self.client.poll(self.authzr2) # pylint: disable=protected-access
|
||||
|
||||
self.client.net.post.assert_called_once_with(
|
||||
self.authzr2.uri, None, acme_version=2,
|
||||
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
|
||||
self.client.net.get.assert_not_called()
|
||||
self.directory["revokeCert"], mock.ANY, acme_version=2)
|
||||
|
||||
|
||||
class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
|
||||
class MockJSONDeSerializable(jose.JSONDeSerializable):
|
||||
# pylint: disable=missing-docstring
|
||||
def __init__(self, value):
|
||||
self.value = value
|
||||
@@ -922,12 +799,13 @@ class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
|
||||
return {'foo': self.value}
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
def from_json(cls, value):
|
||||
pass # pragma: no cover
|
||||
|
||||
|
||||
class ClientNetworkTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork."""
|
||||
# pylint: disable=too-many-public-methods
|
||||
|
||||
def setUp(self):
|
||||
self.verify_ssl = mock.MagicMock()
|
||||
@@ -943,7 +821,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.response.links = {}
|
||||
|
||||
def test_init(self):
|
||||
self.assertIs(self.net.verify_ssl, self.verify_ssl)
|
||||
self.assertTrue(self.net.verify_ssl is self.verify_ssl)
|
||||
|
||||
def test_wrap_in_jws(self):
|
||||
# pylint: disable=protected-access
|
||||
@@ -966,6 +844,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.assertEqual(jws.signature.combined.kid, u'acct-uri')
|
||||
self.assertEqual(jws.signature.combined.url, u'url')
|
||||
|
||||
|
||||
def test_check_response_not_ok_jobj_no_error(self):
|
||||
self.response.ok = False
|
||||
self.response.json.return_value = {}
|
||||
@@ -977,8 +856,8 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
|
||||
def test_check_response_not_ok_jobj_error(self):
|
||||
self.response.ok = False
|
||||
self.response.json.return_value = messages.Error.with_code(
|
||||
'serverInternal', detail='foo', title='some title').to_json()
|
||||
self.response.json.return_value = messages.Error(
|
||||
detail='foo', typ='serverInternal', title='some title').to_json()
|
||||
# pylint: disable=protected-access
|
||||
self.assertRaises(
|
||||
messages.Error, self.net._check_response, self.response)
|
||||
@@ -1003,39 +882,10 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.response.json.side_effect = ValueError
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access
|
||||
# pylint: disable=protected-access,no-value-for-parameter
|
||||
self.assertEqual(
|
||||
self.response, self.net._check_response(self.response))
|
||||
|
||||
@mock.patch('acme.client.logger')
|
||||
def test_check_response_ok_ct_with_charset(self, mock_logger):
|
||||
self.response.json.return_value = {}
|
||||
self.response.headers['Content-Type'] = 'application/json; charset=utf-8'
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net._check_response(
|
||||
self.response, content_type='application/json'))
|
||||
try:
|
||||
mock_logger.debug.assert_called_with(
|
||||
'Ignoring wrong Content-Type (%r) for JSON decodable response',
|
||||
'application/json; charset=utf-8'
|
||||
)
|
||||
except AssertionError:
|
||||
return
|
||||
raise AssertionError('Expected Content-Type warning ' #pragma: no cover
|
||||
'to not have been logged')
|
||||
|
||||
@mock.patch('acme.client.logger')
|
||||
def test_check_response_ok_bad_ct(self, mock_logger):
|
||||
self.response.json.return_value = {}
|
||||
self.response.headers['Content-Type'] = 'text/plain'
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net._check_response(
|
||||
self.response, content_type='application/json'))
|
||||
mock_logger.debug.assert_called_with(
|
||||
'Ignoring wrong Content-Type (%r) for JSON decodable response',
|
||||
'text/plain'
|
||||
)
|
||||
|
||||
def test_check_response_conflict(self):
|
||||
self.response.ok = False
|
||||
self.response.status_code = 409
|
||||
@@ -1046,7 +896,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.response.json.return_value = {}
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access
|
||||
# pylint: disable=protected-access,no-value-for-parameter
|
||||
self.assertEqual(
|
||||
self.response, self.net._check_response(self.response))
|
||||
|
||||
@@ -1157,11 +1007,12 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
|
||||
# Requests Library Exceptions
|
||||
except requests.exceptions.ConnectionError as z: #pragma: no cover
|
||||
self.assertTrue("'Connection aborted.'" in str(z) or "[WinError 10061]" in str(z))
|
||||
|
||||
self.assertEqual("('Connection aborted.', "
|
||||
"error(111, 'Connection refused'))", str(z))
|
||||
|
||||
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork which mock out response."""
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.client import ClientNetwork
|
||||
@@ -1170,10 +1021,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.response = mock.MagicMock(ok=True, status_code=http_client.OK)
|
||||
self.response.headers = {}
|
||||
self.response.links = {}
|
||||
self.response.checked = False
|
||||
self.acmev1_nonce_response = mock.MagicMock(
|
||||
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
|
||||
self.acmev1_nonce_response.headers = {}
|
||||
self.checked_response = mock.MagicMock()
|
||||
self.obj = mock.MagicMock()
|
||||
self.wrapped_obj = mock.MagicMock()
|
||||
self.content_type = mock.sentinel.content_type
|
||||
@@ -1185,21 +1033,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
|
||||
def send_request(*args, **kwargs):
|
||||
# pylint: disable=unused-argument,missing-docstring
|
||||
self.assertNotIn("new_nonce_url", kwargs)
|
||||
method = args[0]
|
||||
uri = args[1]
|
||||
if method == 'HEAD' and uri != "new_nonce_uri":
|
||||
response = self.acmev1_nonce_response
|
||||
else:
|
||||
response = self.response
|
||||
|
||||
if self.available_nonces:
|
||||
response.headers = {
|
||||
self.response.headers = {
|
||||
self.net.REPLAY_NONCE_HEADER:
|
||||
self.available_nonces.pop().decode()}
|
||||
else:
|
||||
response.headers = {}
|
||||
return response
|
||||
self.response.headers = {}
|
||||
return self.response
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.net._send_request = self.send_request = mock.MagicMock(
|
||||
@@ -1211,39 +1051,28 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
# pylint: disable=missing-docstring
|
||||
self.assertEqual(self.response, response)
|
||||
self.assertEqual(self.content_type, content_type)
|
||||
self.assertTrue(self.response.ok)
|
||||
self.response.checked = True
|
||||
return self.response
|
||||
return self.checked_response
|
||||
|
||||
def test_head(self):
|
||||
self.assertEqual(self.acmev1_nonce_response, self.net.head(
|
||||
self.assertEqual(self.response, self.net.head(
|
||||
'http://example.com/', 'foo', bar='baz'))
|
||||
self.send_request.assert_called_once_with(
|
||||
'HEAD', 'http://example.com/', 'foo', bar='baz')
|
||||
|
||||
def test_head_v2(self):
|
||||
self.assertEqual(self.response, self.net.head(
|
||||
'new_nonce_uri', 'foo', bar='baz'))
|
||||
self.send_request.assert_called_once_with(
|
||||
'HEAD', 'new_nonce_uri', 'foo', bar='baz')
|
||||
|
||||
def test_get(self):
|
||||
self.assertEqual(self.response, self.net.get(
|
||||
self.assertEqual(self.checked_response, self.net.get(
|
||||
'http://example.com/', content_type=self.content_type, bar='baz'))
|
||||
self.assertTrue(self.response.checked)
|
||||
self.send_request.assert_called_once_with(
|
||||
'GET', 'http://example.com/', bar='baz')
|
||||
|
||||
def test_post_no_content_type(self):
|
||||
self.content_type = self.net.JOSE_CONTENT_TYPE
|
||||
self.assertEqual(self.response, self.net.post('uri', self.obj))
|
||||
self.assertTrue(self.response.checked)
|
||||
self.assertEqual(self.checked_response, self.net.post('uri', self.obj))
|
||||
|
||||
def test_post(self):
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net.post(
|
||||
self.assertEqual(self.checked_response, self.net.post(
|
||||
'uri', self.obj, content_type=self.content_type))
|
||||
self.assertTrue(self.response.checked)
|
||||
self.net._wrap_in_jws.assert_called_once_with(
|
||||
self.obj, jose.b64decode(self.all_nonces.pop()), "uri", 1)
|
||||
|
||||
@@ -1275,7 +1104,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
def test_post_not_retried(self):
|
||||
check_response = mock.MagicMock()
|
||||
check_response.side_effect = [messages.Error.with_code('malformed'),
|
||||
self.response]
|
||||
self.checked_response]
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.net._check_response = check_response
|
||||
@@ -1283,12 +1112,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.obj, content_type=self.content_type)
|
||||
|
||||
def test_post_successful_retry(self):
|
||||
post_once = mock.MagicMock()
|
||||
post_once.side_effect = [messages.Error.with_code('badNonce'),
|
||||
self.response]
|
||||
check_response = mock.MagicMock()
|
||||
check_response.side_effect = [messages.Error.with_code('badNonce'),
|
||||
self.checked_response]
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net.post(
|
||||
self.net._check_response = check_response
|
||||
self.assertEqual(self.checked_response, self.net.post(
|
||||
'uri', self.obj, content_type=self.content_type))
|
||||
|
||||
def test_head_get_post_error_passthrough(self):
|
||||
@@ -1299,26 +1129,6 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.assertRaises(requests.exceptions.RequestException,
|
||||
self.net.post, 'uri', obj=self.obj)
|
||||
|
||||
def test_post_bad_nonce_head(self):
|
||||
# pylint: disable=protected-access
|
||||
# regression test for https://github.com/certbot/certbot/issues/6092
|
||||
bad_response = mock.MagicMock(ok=False, status_code=http_client.SERVICE_UNAVAILABLE)
|
||||
self.net._send_request = mock.MagicMock()
|
||||
self.net._send_request.return_value = bad_response
|
||||
self.content_type = None
|
||||
check_response = mock.MagicMock()
|
||||
self.net._check_response = check_response
|
||||
self.assertRaises(errors.ClientError, self.net.post, 'uri',
|
||||
self.obj, content_type=self.content_type, acme_version=2,
|
||||
new_nonce_url='new_nonce_uri')
|
||||
self.assertEqual(check_response.call_count, 1)
|
||||
|
||||
def test_new_nonce_uri_removed(self):
|
||||
self.content_type = None
|
||||
self.net.post('uri', self.obj, content_type=None,
|
||||
acme_version=2, new_nonce_url='new_nonce_uri')
|
||||
|
||||
|
||||
class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
|
||||
"""Tests that if ClientNetwork has a source IP set manually, the underlying library has
|
||||
used the provided source address."""
|
||||
@@ -1330,7 +1140,7 @@ class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
|
||||
from acme.client import ClientNetwork
|
||||
net = ClientNetwork(key=None, alg=None, source_address=self.source_address)
|
||||
for adapter in net.session.adapters.values():
|
||||
self.assertIn(self.source_address, adapter.source_address)
|
||||
self.assertTrue(self.source_address in adapter.source_address)
|
||||
|
||||
def test_behavior_assumption(self):
|
||||
"""This is a test that guardrails the HTTPAdapter behavior so that if the default for
|
||||
@@ -1340,7 +1150,7 @@ class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
|
||||
# test should fail if the default adapter type is changed by requests
|
||||
net = ClientNetwork(key=None, alg=None)
|
||||
session = requests.Session()
|
||||
for scheme in session.adapters:
|
||||
for scheme in session.adapters.keys():
|
||||
client_network_adapter = net.session.adapters.get(scheme)
|
||||
default_adapter = session.adapters.get(scheme)
|
||||
self.assertEqual(client_network_adapter.__class__, default_adapter.__class__)
|
||||
@@ -5,63 +5,45 @@ import logging
|
||||
import os
|
||||
import re
|
||||
import socket
|
||||
from typing import Callable
|
||||
from typing import Tuple
|
||||
from typing import Union
|
||||
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
|
||||
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
|
||||
import josepy as jose
|
||||
|
||||
from acme import errors
|
||||
# pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Callable, Union, Tuple, Optional
|
||||
# pylint: enable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Default SSL method selected here is the most compatible, while secure
|
||||
# SSL method: TLSv1_METHOD is only compatible with
|
||||
# TLSSNI01 certificate serving and probing is not affected by SSL
|
||||
# vulnerabilities: prober needs to check certificate for expected
|
||||
# contents anyway. Working SNI is the only thing that's necessary for
|
||||
# the challenge and thus scoping down SSL/TLS method (version) would
|
||||
# cause interoperability issues: TLSv1_METHOD is only compatible with
|
||||
# TLSv1_METHOD, while SSLv23_METHOD is compatible with all other
|
||||
# methods, including TLSv2_METHOD (read more at
|
||||
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
|
||||
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
|
||||
# in case it's used for things other than probing/serving!
|
||||
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
|
||||
_DEFAULT_TLSSNI01_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
|
||||
|
||||
|
||||
class _DefaultCertSelection:
|
||||
def __init__(self, certs):
|
||||
self.certs = certs
|
||||
|
||||
def __call__(self, connection):
|
||||
server_name = connection.get_servername()
|
||||
return self.certs.get(server_name, None)
|
||||
|
||||
|
||||
class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
class SSLSocket(object): # pylint: disable=too-few-public-methods
|
||||
"""SSL wrapper for sockets.
|
||||
|
||||
:ivar socket sock: Original wrapped socket.
|
||||
:ivar dict certs: Mapping from domain names (`bytes`) to
|
||||
`OpenSSL.crypto.X509`.
|
||||
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
|
||||
:ivar alpn_selection: Hook to select negotiated ALPN protocol for
|
||||
connection.
|
||||
:ivar cert_selection: Hook to select certificate for connection. If given,
|
||||
`certs` parameter would be ignored, and therefore must be empty.
|
||||
|
||||
"""
|
||||
def __init__(self, sock, certs=None,
|
||||
method=_DEFAULT_SSL_METHOD, alpn_selection=None,
|
||||
cert_selection=None):
|
||||
def __init__(self, sock, certs, method=_DEFAULT_TLSSNI01_SSL_METHOD):
|
||||
self.sock = sock
|
||||
self.alpn_selection = alpn_selection
|
||||
self.certs = certs
|
||||
self.method = method
|
||||
if not cert_selection and not certs:
|
||||
raise ValueError("Neither cert_selection or certs specified.")
|
||||
if cert_selection and certs:
|
||||
raise ValueError("Both cert_selection and certs specified.")
|
||||
if cert_selection is None:
|
||||
cert_selection = _DefaultCertSelection(certs)
|
||||
self.cert_selection = cert_selection
|
||||
|
||||
def __getattr__(self, name):
|
||||
return getattr(self.sock, name)
|
||||
@@ -78,25 +60,24 @@ class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
:type connection: :class:`OpenSSL.Connection`
|
||||
|
||||
"""
|
||||
pair = self.cert_selection(connection)
|
||||
if pair is None:
|
||||
logger.debug("Certificate selection for server name %s failed, dropping SSL",
|
||||
connection.get_servername())
|
||||
server_name = connection.get_servername()
|
||||
try:
|
||||
key, cert = self.certs[server_name]
|
||||
except KeyError:
|
||||
logger.debug("Server name (%s) not recognized, dropping SSL",
|
||||
server_name)
|
||||
return
|
||||
key, cert = pair
|
||||
new_context = SSL.Context(self.method)
|
||||
new_context.set_options(SSL.OP_NO_SSLv2)
|
||||
new_context.set_options(SSL.OP_NO_SSLv3)
|
||||
new_context.use_privatekey(key)
|
||||
new_context.use_certificate(cert)
|
||||
if self.alpn_selection is not None:
|
||||
new_context.set_alpn_select_callback(self.alpn_selection)
|
||||
connection.set_context(new_context)
|
||||
|
||||
class FakeConnection:
|
||||
class FakeConnection(object):
|
||||
"""Fake OpenSSL.SSL.Connection."""
|
||||
|
||||
# pylint: disable=missing-function-docstring
|
||||
# pylint: disable=too-few-public-methods,missing-docstring
|
||||
|
||||
def __init__(self, connection):
|
||||
self._wrapped = connection
|
||||
@@ -108,15 +89,13 @@ class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
|
||||
return self._wrapped.shutdown()
|
||||
|
||||
def accept(self): # pylint: disable=missing-function-docstring
|
||||
def accept(self): # pylint: disable=missing-docstring
|
||||
sock, addr = self.sock.accept()
|
||||
|
||||
context = SSL.Context(self.method)
|
||||
context.set_options(SSL.OP_NO_SSLv2)
|
||||
context.set_options(SSL.OP_NO_SSLv3)
|
||||
context.set_tlsext_servername_callback(self._pick_certificate_cb)
|
||||
if self.alpn_selection is not None:
|
||||
context.set_alpn_select_callback(self.alpn_selection)
|
||||
|
||||
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
|
||||
ssl_sock.set_accept_state()
|
||||
@@ -132,9 +111,8 @@ class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
return ssl_sock, addr
|
||||
|
||||
|
||||
def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-arguments
|
||||
method=_DEFAULT_SSL_METHOD, source_address=('', 0),
|
||||
alpn_protocols=None):
|
||||
def probe_sni(name, host, port=443, timeout=300,
|
||||
method=_DEFAULT_TLSSNI01_SSL_METHOD, source_address=('', 0)):
|
||||
"""Probe SNI server for SSL certificate.
|
||||
|
||||
:param bytes name: Byte string to send as the server name in the
|
||||
@@ -146,8 +124,6 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
|
||||
:param tuple source_address: Enables multi-path probing (selection
|
||||
of source interface). See `socket.creation_connection` for more
|
||||
info. Available only in Python 2.7+.
|
||||
:param alpn_protocols: Protocols to request using ALPN.
|
||||
:type alpn_protocols: `list` of `bytes`
|
||||
|
||||
:raises acme.errors.Error: In case of any problems.
|
||||
|
||||
@@ -160,15 +136,22 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
|
||||
|
||||
socket_kwargs = {'source_address': source_address}
|
||||
|
||||
host_protocol_agnostic = host
|
||||
if host == '::' or host == '0':
|
||||
# https://github.com/python/typeshed/pull/2136
|
||||
# while PR is not merged, we need to ignore
|
||||
host_protocol_agnostic = None
|
||||
|
||||
try:
|
||||
# pylint: disable=star-args
|
||||
logger.debug(
|
||||
"Attempting to connect to %s:%d%s.", host, port,
|
||||
"Attempting to connect to %s:%d%s.", host_protocol_agnostic, port,
|
||||
" from {0}:{1}".format(
|
||||
source_address[0],
|
||||
source_address[1]
|
||||
) if any(source_address) else ""
|
||||
) if socket_kwargs else ""
|
||||
)
|
||||
socket_tuple: Tuple[str, int] = (host, port)
|
||||
socket_tuple = (host_protocol_agnostic, port) # type: Tuple[Optional[str], int]
|
||||
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore
|
||||
except socket.error as error:
|
||||
raise errors.Error(error)
|
||||
@@ -177,8 +160,6 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
|
||||
client_ssl = SSL.Connection(context, client)
|
||||
client_ssl.set_connect_state()
|
||||
client_ssl.set_tlsext_host_name(name) # pyOpenSSL>=0.13
|
||||
if alpn_protocols is not None:
|
||||
client_ssl.set_alpn_protos(alpn_protocols)
|
||||
try:
|
||||
client_ssl.do_handshake()
|
||||
client_ssl.shutdown()
|
||||
@@ -186,7 +167,6 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
raise errors.Error(error)
return client_ssl.get_peer_certificate()

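For orientation, a hedged usage sketch of `probe_sni`; the address and SNI name below are placeholders:

```python
from acme import crypto_util
from acme import errors

try:
    # Send b'example.invalid' as the SNI value and fetch whatever
    # certificate the server at 203.0.113.10:443 presents for it.
    cert = crypto_util.probe_sni(b'example.invalid', '203.0.113.10', port=443, timeout=10)
    print(cert.get_subject().CN)
except errors.Error as error:
    print('probe failed:', error)
```
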
def make_csr(private_key_pem, domains, must_staple=False):
"""Generate a CSR containing a list of domains as subjectAltNames.

@@ -218,15 +198,14 @@ def make_csr(private_key_pem, domains, must_staple=False):
return crypto.dump_certificate_request(
crypto.FILETYPE_PEM, csr)

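A hedged usage sketch for `make_csr`; the key is generated in memory with pyOpenSSL and the domain names are placeholders:

```python
import OpenSSL

from acme import crypto_util

# Throwaway 2048-bit RSA key, dumped to PEM as make_csr expects.
key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
key_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, key)

# CSR with both names in subjectAltName; must_staple adds the TLS Feature extension.
csr_pem = crypto_util.make_csr(key_pem, ['a.example', 'b.example'], must_staple=True)
assert b'--BEGIN CERTIFICATE REQUEST--' in csr_pem
```
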
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
|
||||
common_name = loaded_cert_or_req.get_subject().CN
|
||||
sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
|
||||
|
||||
if common_name is None:
|
||||
return sans
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
|
||||
else:
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
|
||||
def _pyopenssl_cert_or_req_san(cert_or_req):
|
||||
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
|
||||
@@ -256,7 +235,7 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
|
||||
|
||||
if isinstance(cert_or_req, crypto.X509):
|
||||
# pylint: disable=line-too-long
|
||||
func: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]] = crypto.dump_certificate
|
||||
func = crypto.dump_certificate # type: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]]
|
||||
else:
|
||||
func = crypto.dump_certificate_request
|
||||
text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
|
||||
@@ -272,14 +251,12 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
|
||||
|
||||
|
||||
def gen_ss_cert(key, domains, not_before=None,
|
||||
validity=(7 * 24 * 60 * 60), force_san=True, extensions=None):
|
||||
validity=(7 * 24 * 60 * 60), force_san=True):
|
||||
"""Generate new self-signed certificate.
|
||||
|
||||
:type domains: `list` of `unicode`
|
||||
:param OpenSSL.crypto.PKey key:
|
||||
:param bool force_san:
|
||||
:param extensions: List of additional extensions to include in the cert.
|
||||
:type extensions: `list` of `OpenSSL.crypto.X509Extension`
|
||||
|
||||
If more than one domain is provided, all of the domains are put into
|
||||
``subjectAltName`` X.509 extension and first domain is set as the
|
||||
@@ -292,13 +269,10 @@ def gen_ss_cert(key, domains, not_before=None,
|
||||
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
|
||||
cert.set_version(2)
|
||||
|
||||
if extensions is None:
|
||||
extensions = []
|
||||
|
||||
extensions.append(
|
||||
extensions = [
|
||||
crypto.X509Extension(
|
||||
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
|
||||
)
|
||||
]
|
||||
|
||||
cert.get_subject().CN = domains[0]
|
||||
# TODO: what to put into cert.get_subject()?
|
||||
@@ -320,7 +294,6 @@ def gen_ss_cert(key, domains, not_before=None,
|
||||
cert.sign(key, "sha256")
|
||||
return cert
|
||||
|
||||
|
||||
def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
|
||||
"""Dump certificate chain into a bundle.
|
||||
|
||||
@@ -336,6 +309,7 @@ def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
|
||||
|
||||
def _dump_cert(cert):
|
||||
if isinstance(cert, jose.ComparableX509):
|
||||
# pylint: disable=protected-access
|
||||
cert = cert.wrapped
|
||||
return crypto.dump_certificate(filetype, cert)
|
||||
|
||||
|
||||
@@ -1,22 +1,25 @@
|
||||
"""Tests for acme.crypto_util."""
|
||||
import itertools
|
||||
import socket
|
||||
import socketserver
|
||||
import threading
|
||||
import time
|
||||
import unittest
|
||||
from typing import List
|
||||
|
||||
import six
|
||||
from six.moves import socketserver #type: ignore # pylint: disable=import-error
|
||||
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
|
||||
from acme import errors
|
||||
import test_util
|
||||
from acme import test_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.cert = test_util.load_comparable_cert('rsa2048_cert.pem')
|
||||
key = test_util.load_pyopenssl_private_key('rsa2048_key.pem')
|
||||
@@ -27,14 +30,17 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
|
||||
class _TestServer(socketserver.TCPServer):
|
||||
|
||||
# pylint: disable=too-few-public-methods
|
||||
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
|
||||
|
||||
def server_bind(self): # pylint: disable=missing-docstring
|
||||
self.socket = SSLSocket(socket.socket(),
|
||||
certs)
|
||||
self.socket = SSLSocket(socket.socket(), certs=certs)
|
||||
socketserver.TCPServer.server_bind(self)
|
||||
|
||||
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
|
||||
self.port = self.server.socket.getsockname()[1]
|
||||
self.server_thread = threading.Thread(
|
||||
# pylint: disable=no-member
|
||||
target=self.server.handle_request)
|
||||
|
||||
def tearDown(self):
|
||||
@@ -60,7 +66,8 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
self.assertRaises(errors.Error, self._probe, b'bar')
|
||||
|
||||
def test_probe_connection_error(self):
|
||||
self.server.server_close()
|
||||
# pylint has a hard time with six
|
||||
self.server.server_close() # pylint: disable=no-member
|
||||
original_timeout = socket.getdefaulttimeout()
|
||||
try:
|
||||
socket.setdefaulttimeout(1)
|
||||
@@ -69,18 +76,6 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
socket.setdefaulttimeout(original_timeout)
|
||||
|
||||
|
||||
class SSLSocketTest(unittest.TestCase):
|
||||
"""Tests for acme.crypto_util.SSLSocket."""
|
||||
|
||||
def test_ssl_socket_invalid_arguments(self):
|
||||
from acme.crypto_util import SSLSocket
|
||||
with self.assertRaises(ValueError):
|
||||
_ = SSLSocket(None, {'sni': ('key', 'cert')},
|
||||
cert_selection=lambda _: None)
|
||||
with self.assertRaises(ValueError):
|
||||
_ = SSLSocket(None)
|
||||
|
||||
|
||||
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
|
||||
"""Test for acme.crypto_util._pyopenssl_cert_or_req_all_names."""
|
||||
|
||||
@@ -118,9 +113,9 @@ class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
|
||||
@classmethod
|
||||
def _get_idn_names(cls):
|
||||
"""Returns expected names from '{cert,csr}-idnsans.pem'."""
|
||||
chars = [chr(i) for i in itertools.chain(range(0x3c3, 0x400),
|
||||
range(0x641, 0x6fc),
|
||||
range(0x1820, 0x1877))]
|
||||
chars = [six.unichr(i) for i in itertools.chain(range(0x3c3, 0x400),
|
||||
range(0x641, 0x6fc),
|
||||
range(0x1820, 0x1877))]
|
||||
return [''.join(chars[i: i + 45]) + '.invalid'
|
||||
for i in range(0, len(chars), 45)]
|
||||
|
||||
@@ -181,7 +176,7 @@ class RandomSnTest(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
self.cert_count = 5
|
||||
self.serial_num: List[int] = []
|
||||
self.serial_num = [] # type: List[int]
|
||||
self.key = OpenSSL.crypto.PKey()
|
||||
self.key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
|
||||
|
||||
@@ -191,7 +186,7 @@ class RandomSnTest(unittest.TestCase):
|
||||
for _ in range(self.cert_count):
|
||||
cert = gen_ss_cert(self.key, ['dummy'], force_san=True)
|
||||
self.serial_num.append(cert.get_serial_number())
|
||||
self.assertGreater(len(set(self.serial_num)), 1)
|
||||
self.assertTrue(len(set(self.serial_num)) > 1)
|
||||
|
||||
class MakeCSRTest(unittest.TestCase):
|
||||
"""Test for standalone functions."""
|
||||
@@ -206,16 +201,16 @@ class MakeCSRTest(unittest.TestCase):
|
||||
|
||||
def test_make_csr(self):
|
||||
csr_pem = self._call_with_key(["a.example", "b.example"])
|
||||
self.assertIn(b'--BEGIN CERTIFICATE REQUEST--', csr_pem)
|
||||
self.assertIn(b'--END CERTIFICATE REQUEST--', csr_pem)
|
||||
self.assertTrue(b'--BEGIN CERTIFICATE REQUEST--' in csr_pem)
|
||||
self.assertTrue(b'--END CERTIFICATE REQUEST--' in csr_pem)
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
self.assertEqual(len(csr.get_extensions()), 1)
|
||||
self.assertEqual(csr.get_extensions()[0].get_data(),
|
||||
self.assertEquals(len(csr.get_extensions()), 1)
|
||||
self.assertEquals(csr.get_extensions()[0].get_data(),
|
||||
OpenSSL.crypto.X509Extension(
|
||||
b'subjectAltName',
|
||||
critical=False,
|
||||
@@ -232,7 +227,7 @@ class MakeCSRTest(unittest.TestCase):
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
self.assertEqual(len(csr.get_extensions()), 2)
|
||||
self.assertEquals(len(csr.get_extensions()), 2)
|
||||
# NOTE: Ideally we would filter by the TLS Feature OID, but
|
||||
# OpenSSL.crypto.X509Extension doesn't give us the extension's raw OID,
|
||||
# and the shortname field is just "UNDEF"
|
||||
@@ -28,8 +28,8 @@ class NonceError(ClientError):
|
||||
|
||||
class BadNonce(NonceError):
|
||||
"""Bad nonce error."""
|
||||
def __init__(self, nonce, error, *args):
|
||||
super().__init__(*args)
|
||||
def __init__(self, nonce, error, *args, **kwargs):
|
||||
super(BadNonce, self).__init__(*args, **kwargs)
|
||||
self.nonce = nonce
|
||||
self.error = error
|
||||
|
||||
@@ -44,11 +44,11 @@ class MissingNonce(NonceError):
|
||||
Replay-Nonce header field in each successful response to a POST it
|
||||
provides to a client (...)".
|
||||
|
||||
:ivar requests.Response ~.response: HTTP Response
|
||||
:ivar requests.Response response: HTTP Response
|
||||
|
||||
"""
|
||||
def __init__(self, response, *args):
|
||||
super().__init__(*args)
|
||||
def __init__(self, response, *args, **kwargs):
|
||||
super(MissingNonce, self).__init__(*args, **kwargs)
|
||||
self.response = response
|
||||
|
||||
def __str__(self):
|
||||
@@ -72,7 +72,7 @@ class PollError(ClientError):
|
||||
def __init__(self, exhausted, updated):
|
||||
self.exhausted = exhausted
|
||||
self.updated = updated
|
||||
super().__init__()
|
||||
super(PollError, self).__init__()
|
||||
|
||||
@property
|
||||
def timeout(self):
|
||||
@@ -83,20 +83,17 @@ class PollError(ClientError):
|
||||
return '{0}(exhausted={1!r}, updated={2!r})'.format(
|
||||
self.__class__.__name__, self.exhausted, self.updated)
|
||||
|
||||
|
||||
class ValidationError(Error):
|
||||
"""Error for authorization failures. Contains a list of authorization
|
||||
resources, each of which is invalid and should have an error field.
|
||||
"""
|
||||
def __init__(self, failed_authzrs):
|
||||
self.failed_authzrs = failed_authzrs
|
||||
super().__init__()
|
||||
super(ValidationError, self).__init__()
|
||||
|
||||
|
||||
class TimeoutError(Error): # pylint: disable=redefined-builtin
|
||||
class TimeoutError(Error):
|
||||
"""Error for when polling an authorization or an order times out."""
|
||||
|
||||
|
||||
class IssuanceError(Error):
|
||||
"""Error sent by the server after requesting issuance of a certificate."""
|
||||
|
||||
@@ -106,20 +103,17 @@ class IssuanceError(Error):
|
||||
:param messages.Error error: The error provided by the server.
|
||||
"""
|
||||
self.error = error
|
||||
super().__init__()
|
||||
|
||||
super(IssuanceError, self).__init__()
|
||||
|
||||
class ConflictError(ClientError):
|
||||
"""Error for when the server returns a 409 (Conflict) HTTP status.
|
||||
|
||||
In the version of ACME implemented by Boulder, this is used to find an
|
||||
account if you only have the private key, but don't know the account URL.
|
||||
|
||||
Also used in V2 of the ACME client for the same purpose.
|
||||
"""
|
||||
def __init__(self, location):
|
||||
self.location = location
|
||||
super().__init__()
|
||||
super(ConflictError, self).__init__()
|
||||
|
||||
|
||||
class WildcardUnsupportedError(Error):
|
||||
|
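The paired constructor lines throughout this errors.py diff show the same call written two ways: the bare `super()` form and the explicit Python 2 compatible `super(Cls, self)` form. A minimal sketch of the two equivalent spellings (Python 3 accepts both; Python 2 only the explicit one):

```python
class Error(Exception):
    """Generic ACME-style error (illustrative stand-in for acme.errors.Error)."""


class ConflictError(Error):
    def __init__(self, location):
        self.location = location
        super().__init__()                       # Python 3 style
        # super(ConflictError, self).__init__()  # Python 2 compatible spelling


err = ConflictError('https://example.invalid/acme/acct/1')  # illustrative URL
print(err.location)
```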
||||
@@ -1,6 +1,7 @@
"""Tests for acme.errors."""
import unittest
from unittest import mock

import mock


class BadNonceTest(unittest.TestCase):
|
||||
@@ -24,8 +25,8 @@ class MissingNonceTest(unittest.TestCase):
|
||||
self.error = MissingNonce(self.response)
|
||||
|
||||
def test_str(self):
|
||||
self.assertIn("FOO", str(self.error))
|
||||
self.assertIn("{}", str(self.error))
|
||||
self.assertTrue("FOO" in str(self.error))
|
||||
self.assertTrue("{}" in str(self.error))
|
||||
|
||||
|
||||
class PollErrorTest(unittest.TestCase):
|
||||
@@ -34,7 +35,7 @@ class PollErrorTest(unittest.TestCase):
|
||||
def setUp(self):
|
||||
from acme.errors import PollError
|
||||
self.timeout = PollError(
|
||||
exhausted={mock.sentinel.AR},
|
||||
exhausted=set([mock.sentinel.AR]),
|
||||
updated={})
|
||||
self.invalid = PollError(exhausted=set(), updated={
|
||||
mock.sentinel.AR: mock.sentinel.AR2})
|
||||
@@ -4,6 +4,7 @@ import logging
|
||||
import josepy as jose
|
||||
import pyrfc3339
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@@ -12,7 +13,7 @@ class Fixed(jose.Field):
|
||||
|
||||
def __init__(self, json_name, value):
|
||||
self.value = value
|
||||
super().__init__(
|
||||
super(Fixed, self).__init__(
|
||||
json_name=json_name, default=value, omitempty=False)
|
||||
|
||||
def decode(self, value):
|
||||
@@ -53,7 +54,7 @@ class Resource(jose.Field):
|
||||
|
||||
def __init__(self, resource_type, *args, **kwargs):
|
||||
self.resource_type = resource_type
|
||||
super().__init__(
|
||||
super(Resource, self).__init__(
|
||||
'resource', default=resource_type, *args, **kwargs)
|
||||
|
||||
def decode(self, value):
|
||||
|
||||
@@ -14,10 +14,8 @@ class Header(jose.Header):
|
||||
kid = jose.Field('kid', omitempty=True)
|
||||
url = jose.Field('url', omitempty=True)
|
||||
|
||||
# Mypy does not understand the josepy magic happening here, and falsely claims
|
||||
# that nonce is redefined. Let's ignore the type check here.
|
||||
@nonce.decoder # type: ignore
|
||||
def nonce(value): # pylint: disable=no-self-argument,missing-function-docstring
|
||||
@nonce.decoder
|
||||
def nonce(value): # pylint: disable=missing-docstring,no-self-argument
|
||||
try:
|
||||
return jose.decode_b64jose(value)
|
||||
except jose.DeserializationError as error:
|
||||
@@ -42,15 +40,15 @@ class Signature(jose.Signature):
|
||||
class JWS(jose.JWS):
|
||||
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
|
||||
signature_cls = Signature
|
||||
__slots__ = jose.JWS._orig_slots
|
||||
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
|
||||
|
||||
@classmethod
|
||||
# pylint: disable=arguments-differ
|
||||
# pylint: disable=arguments-differ,too-many-arguments
|
||||
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
|
||||
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
|
||||
# jwk field if kid is not provided.
|
||||
include_jwk = kid is None
|
||||
return super().sign(payload, key=key, alg=alg,
|
||||
return super(JWS, cls).sign(payload, key=key, alg=alg,
|
||||
protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
|
||||
nonce=nonce, url=url, kid=kid,
|
||||
include_jwk=include_jwk)
|
||||
|
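The jws.py hunk above defines `JWS.sign(payload, key, alg, nonce, url=None, kid=None)` and only embeds the account JWK when no `kid` is supplied. A hedged usage sketch of that behaviour; the URLs and nonce are illustrative, and the exact call signature follows the hunk shown here, which may differ in other acme versions:

```python
import josepy as jose
from cryptography.hazmat.primitives.asymmetric import rsa

from acme import jws

# Illustrative account key; a real client loads or persists one.
account_key = jose.JWKRSA(key=rsa.generate_private_key(public_exponent=65537, key_size=2048))

signed = jws.JWS.sign(
    b'{"identifiers": [{"type": "dns", "value": "example.com"}]}',
    key=account_key, alg=jose.RS256,
    nonce=b'example-nonce',
    url='https://example.invalid/acme/new-order',
    kid='https://example.invalid/acme/acct/1')

# kid was supplied, so the protected header omits the full JWK (as the tests below assert).
assert signed.signature.combined.jwk is None
assert signed.signature.combined.kid == 'https://example.invalid/acme/acct/1'
```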
||||
@@ -3,7 +3,8 @@ import unittest
|
||||
|
||||
import josepy as jose
|
||||
|
||||
import test_util
|
||||
from acme import test_util
|
||||
|
||||
|
||||
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
|
||||
|
||||
@@ -48,7 +49,7 @@ class JWSTest(unittest.TestCase):
|
||||
self.assertEqual(jws.signature.combined.nonce, self.nonce)
|
||||
self.assertEqual(jws.signature.combined.url, self.url)
|
||||
self.assertEqual(jws.signature.combined.kid, self.kid)
|
||||
self.assertIsNone(jws.signature.combined.jwk)
|
||||
self.assertEqual(jws.signature.combined.jwk, None)
|
||||
# TODO: check that nonce is in protected header
|
||||
|
||||
self.assertEqual(jws, JWS.from_json(jws.to_json()))
|
||||
@@ -58,7 +59,7 @@ class JWSTest(unittest.TestCase):
|
||||
jws = JWS.sign(payload=b'foo', key=self.privkey,
|
||||
alg=jose.RS256, nonce=self.nonce,
|
||||
url=self.url)
|
||||
self.assertIsNone(jws.signature.combined.kid)
|
||||
self.assertEqual(jws.signature.combined.kid, None)
|
||||
self.assertEqual(jws.signature.combined.jwk, self.pubkey)
|
||||
|
||||
|
||||
@@ -1,17 +1,16 @@
"""Simple shim around the typing module.
"""Shim class to not have to depend on typing module in prod."""
import sys

This was useful when this code supported Python 2 and typing wasn't always
available. This code is being kept for now for backwards compatibility.

"""
import warnings
from typing import *  # pylint: disable=wildcard-import, unused-wildcard-import
from typing import Collection, IO  # type: ignore

warnings.warn("acme.magic_typing is deprecated and will be removed in a future release.",
              DeprecationWarning)

class TypingClass:
class TypingClass(object):
    """Ignore import errors by getting anything"""
    def __getattr__(self, name):
        return None  # pragma: no cover
        return None

try:
    # mypy doesn't respect modifying sys.modules
    from typing import *  # pylint: disable=wildcard-import, unused-wildcard-import
    # pylint: disable=unused-import
    from typing import Collection, IO  # type: ignore
    # pylint: enable=unused-import
except ImportError:
    sys.modules[__name__] = TypingClass()

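The magic_typing shim works by replacing its own entry in `sys.modules` with an object that answers every attribute lookup, so `from acme.magic_typing import Anything` still succeeds when `typing` is unavailable. A hedged, generic sketch of that fallback pattern (the dependency name is hypothetical):

```python
import sys


class _NullModule:
    """Answers every attribute lookup with None."""
    def __getattr__(self, name):
        return None


try:
    import not_a_real_dependency  # hypothetical optional dependency
except ImportError:
    # Publish a stand-in under this module's name; later "from" imports get None.
    sys.modules[__name__] = _NullModule()
```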
acme/acme/magic_typing_test.py (new file, 41 lines)
@@ -0,0 +1,41 @@
|
||||
"""Tests for acme.magic_typing."""
|
||||
import sys
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
|
||||
class MagicTypingTest(unittest.TestCase):
|
||||
"""Tests for acme.magic_typing."""
|
||||
def test_import_success(self):
|
||||
try:
|
||||
import typing as temp_typing
|
||||
except ImportError: # pragma: no cover
|
||||
temp_typing = None # pragma: no cover
|
||||
typing_class_mock = mock.MagicMock()
|
||||
text_mock = mock.MagicMock()
|
||||
typing_class_mock.Text = text_mock
|
||||
sys.modules['typing'] = typing_class_mock
|
||||
if 'acme.magic_typing' in sys.modules:
|
||||
del sys.modules['acme.magic_typing'] # pragma: no cover
|
||||
from acme.magic_typing import Text # pylint: disable=no-name-in-module
|
||||
self.assertEqual(Text, text_mock)
|
||||
del sys.modules['acme.magic_typing']
|
||||
sys.modules['typing'] = temp_typing
|
||||
|
||||
def test_import_failure(self):
|
||||
try:
|
||||
import typing as temp_typing
|
||||
except ImportError: # pragma: no cover
|
||||
temp_typing = None # pragma: no cover
|
||||
sys.modules['typing'] = None
|
||||
if 'acme.magic_typing' in sys.modules:
|
||||
del sys.modules['acme.magic_typing'] # pragma: no cover
|
||||
from acme.magic_typing import Text # pylint: disable=no-name-in-module
|
||||
self.assertTrue(Text is None)
|
||||
del sys.modules['acme.magic_typing']
|
||||
sys.modules['typing'] = temp_typing
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,53 +1,32 @@
|
||||
"""ACME protocol messages."""
|
||||
from collections.abc import Hashable
|
||||
import json
|
||||
from typing import Any
|
||||
from typing import Dict
|
||||
from typing import Type
|
||||
import collections
|
||||
import six
|
||||
|
||||
import josepy as jose
|
||||
|
||||
from acme import challenges
|
||||
from acme import errors
|
||||
from acme import fields
|
||||
from acme import jws
|
||||
from acme import util
|
||||
from acme.mixins import ResourceMixin
|
||||
|
||||
OLD_ERROR_PREFIX = "urn:acme:error:"
|
||||
ERROR_PREFIX = "urn:ietf:params:acme:error:"
|
||||
|
||||
ERROR_CODES = {
|
||||
'accountDoesNotExist': 'The request specified an account that does not exist',
|
||||
'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
|
||||
' already been revoked',
|
||||
'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
|
||||
'badNonce': 'The client sent an unacceptable anti-replay nonce',
|
||||
'badPublicKey': 'The JWS was signed by a public key the server does not support',
|
||||
'badRevocationReason': 'The revocation reason provided is not allowed by the server',
|
||||
'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
|
||||
'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
|
||||
' a certificate',
|
||||
'compound': 'Specific error conditions are indicated in the "subproblems" array',
|
||||
'connection': ('The server could not connect to the client to verify the'
|
||||
' domain'),
|
||||
'dns': 'There was a problem with a DNS query during identifier validation',
|
||||
'dnssec': 'The server could not validate a DNSSEC signed domain',
|
||||
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
|
||||
# deprecate invalidEmail
|
||||
'invalidEmail': 'The provided email for a registration was invalid',
|
||||
'invalidContact': 'The provided contact URI was invalid',
|
||||
'malformed': 'The request message was malformed',
|
||||
'rejectedIdentifier': 'The server will not issue certificates for the identifier',
|
||||
'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
|
||||
'rateLimited': 'There were too many requests of a given type',
|
||||
'serverInternal': 'The server experienced an internal error',
|
||||
'tls': 'The server experienced a TLS error during domain verification',
|
||||
'unauthorized': 'The client lacks sufficient authorization',
|
||||
'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
|
||||
'unknownHost': 'The server could not resolve a domain name',
|
||||
'unsupportedIdentifier': 'An identifier is of an unsupported type',
|
||||
'externalAccountRequired': 'The server requires external account binding',
|
||||
}
|
||||
|
||||
ERROR_TYPE_DESCRIPTIONS = dict(
|
||||
@@ -61,9 +40,11 @@ def is_acme_error(err):
|
||||
"""Check if argument is an ACME error."""
|
||||
if isinstance(err, Error) and (err.typ is not None):
|
||||
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
|
||||
return False
|
||||
else:
|
||||
return False
|
||||
|
||||
|
||||
@six.python_2_unicode_compatible
|
||||
class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
"""ACME error.
|
||||
|
||||
@@ -90,9 +71,7 @@ class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
raise ValueError("The supplied code: %s is not a known ACME error"
|
||||
" code" % code)
|
||||
typ = ERROR_PREFIX + code
|
||||
# Mypy will not understand that the Error constructor accepts a named argument
|
||||
# "typ" because of josepy magic. Let's ignore the type check here.
|
||||
return cls(typ=typ, **kwargs) # type: ignore
|
||||
return cls(typ=typ, **kwargs)
|
||||
|
||||
@property
|
||||
def description(self):
|
||||
@@ -117,7 +96,6 @@ class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
code = str(self.typ).split(':')[-1]
|
||||
if code in ERROR_CODES:
|
||||
return code
|
||||
return None
|
||||
|
||||
def __str__(self):
|
||||
return b' :: '.join(
|
||||
@@ -126,25 +104,24 @@ class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
if part is not None).decode()
|
||||
|
||||
|
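A short usage sketch of the `Error` helpers touched in the hunks above: `with_code()` prefixes the RFC 8555 error namespace, and `.code` / `.description` recover the short code and its human-readable text. The detail strings are illustrative; the expected outputs assume the ERROR_CODES table shown earlier:

```python
from acme import messages

err = messages.Error.with_code('malformed', detail='bad CSR field', title='title')
print(err.typ)          # urn:ietf:params:acme:error:malformed
print(err.code)         # malformed
print(err.description)  # The request message was malformed

custom = messages.Error(typ='custom', detail='bar')
print(custom.code)      # None -- not a known ACME error code
```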
||||
class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
|
||||
class _Constant(jose.JSONDeSerializable, collections.Hashable): # type: ignore
|
||||
"""ACME constant."""
|
||||
__slots__ = ('name',)
|
||||
POSSIBLE_NAMES: Dict[str, '_Constant'] = NotImplemented
|
||||
POSSIBLE_NAMES = NotImplemented
|
||||
|
||||
def __init__(self, name):
|
||||
super().__init__()
|
||||
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
|
||||
self.POSSIBLE_NAMES[name] = self
|
||||
self.name = name
|
||||
|
||||
def to_partial_json(self):
|
||||
return self.name
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
|
||||
def from_json(cls, value):
|
||||
if value not in cls.POSSIBLE_NAMES:
|
||||
raise jose.DeserializationError(
|
||||
'{0} not recognized'.format(cls.__name__))
|
||||
return cls.POSSIBLE_NAMES[jobj]
|
||||
return cls.POSSIBLE_NAMES[value]
|
||||
|
||||
def __repr__(self):
|
||||
return '{0}({1})'.format(self.__class__.__name__, self.name)
|
||||
@@ -155,10 +132,13 @@ class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
|
||||
def __hash__(self):
|
||||
return hash((self.__class__, self.name))
|
||||
|
||||
def __ne__(self, other):
|
||||
return not self == other
|
||||
|
||||
|
||||
class Status(_Constant):
|
||||
"""ACME "status" field."""
|
||||
POSSIBLE_NAMES: dict = {}
|
||||
POSSIBLE_NAMES = {} # type: dict
|
||||
STATUS_UNKNOWN = Status('unknown')
|
||||
STATUS_PENDING = Status('pending')
|
||||
STATUS_PROCESSING = Status('processing')
|
||||
@@ -166,12 +146,11 @@ STATUS_VALID = Status('valid')
|
||||
STATUS_INVALID = Status('invalid')
|
||||
STATUS_REVOKED = Status('revoked')
|
||||
STATUS_READY = Status('ready')
|
||||
STATUS_DEACTIVATED = Status('deactivated')
|
||||
|
||||
|
||||
class IdentifierType(_Constant):
|
||||
"""ACME identifier type."""
|
||||
POSSIBLE_NAMES: Dict[str, 'IdentifierType'] = {}
|
||||
POSSIBLE_NAMES = {} # type: dict
|
||||
IDENTIFIER_FQDN = IdentifierType('dns') # IdentifierDNS in Boulder
|
||||
|
||||
|
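The `_Constant` diff above relies on a class-level registry: each instance records itself in `POSSIBLE_NAMES`, which is what lets `from_json()` map a bare string back to the singleton. A minimal, self-contained sketch of that registry pattern (not the acme class itself):

```python
class Constant:
    POSSIBLE_NAMES = NotImplemented  # each subclass supplies its own dict

    def __init__(self, name):
        self.POSSIBLE_NAMES[name] = self  # register the singleton under its name
        self.name = name

    def to_partial_json(self):
        return self.name

    @classmethod
    def from_json(cls, jobj):
        if jobj not in cls.POSSIBLE_NAMES:
            raise ValueError('{0} not recognized'.format(cls.__name__))
        return cls.POSSIBLE_NAMES[jobj]


class Status(Constant):
    POSSIBLE_NAMES = {}


STATUS_VALID = Status('valid')
assert Status.from_json('valid') is STATUS_VALID  # round-trips to the same singleton
```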
||||
@@ -189,7 +168,7 @@ class Identifier(jose.JSONObjectWithFields):
|
||||
class Directory(jose.JSONDeSerializable):
|
||||
"""Directory."""
|
||||
|
||||
_REGISTERED_TYPES: Dict[str, Type[Any]] = {}
|
||||
_REGISTERED_TYPES = {} # type: dict
|
||||
|
||||
class Meta(jose.JSONObjectWithFields):
|
||||
"""Directory Meta."""
|
||||
@@ -197,11 +176,11 @@ class Directory(jose.JSONDeSerializable):
|
||||
_terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
|
||||
website = jose.Field('website', omitempty=True)
|
||||
caa_identities = jose.Field('caaIdentities', omitempty=True)
|
||||
external_account_required = jose.Field('externalAccountRequired', omitempty=True)
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
|
||||
super().__init__(**kwargs)
|
||||
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
|
||||
# pylint: disable=star-args
|
||||
super(Directory.Meta, self).__init__(**kwargs)
|
||||
|
||||
@property
|
||||
def terms_of_service(self):
|
||||
@@ -211,7 +190,7 @@ class Directory(jose.JSONDeSerializable):
|
||||
def __iter__(self):
|
||||
# When iterating over fields, use the external name 'terms_of_service' instead of
|
||||
# the internal '_terms_of_service'.
|
||||
for name in super().__iter__():
|
||||
for name in super(Directory.Meta, self).__iter__():
|
||||
yield name[1:] if name == '_terms_of_service' else name
|
||||
|
||||
def _internal_name(self, name):
|
||||
@@ -223,7 +202,7 @@ class Directory(jose.JSONDeSerializable):
|
||||
return getattr(key, 'resource_type', key)
|
||||
|
||||
@classmethod
|
||||
def register(cls, resource_body_cls: Type[Any]) -> Type[Any]:
|
||||
def register(cls, resource_body_cls):
|
||||
"""Register resource."""
|
||||
resource_type = resource_body_cls.resource_type
|
||||
assert resource_type not in cls._REGISTERED_TYPES
|
||||
@@ -240,13 +219,13 @@ class Directory(jose.JSONDeSerializable):
|
||||
try:
|
||||
return self[name.replace('_', '-')]
|
||||
except KeyError as error:
|
||||
raise AttributeError(str(error))
|
||||
raise AttributeError(str(error) + ': ' + name)
|
||||
|
||||
def __getitem__(self, name):
|
||||
try:
|
||||
return self._jobj[self._canon_key(name)]
|
||||
except KeyError:
|
||||
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
|
||||
raise KeyError('Directory field not found')
|
||||
|
||||
def to_partial_json(self):
|
||||
return self._jobj
|
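A hedged usage sketch of the `Directory` lookup behaviour in the hunks above: entries are reachable by item access, attribute access maps underscores to dashes, and in the newer spelling of `__getitem__` a missing entry raises a `KeyError` that names the field. The directory contents are illustrative:

```python
from acme import messages

directory = messages.Directory({
    'newAccount': 'https://example.invalid/acme/new-account',
    'newOrder': 'https://example.invalid/acme/new-order',
})
print(directory['newOrder'])        # item access by raw key

try:
    directory['revokeCert']         # not present in this illustrative directory
except KeyError as err:
    print(err)                      # e.g. Directory field "revokeCert" not found
```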
||||
@@ -269,7 +248,7 @@ class Resource(jose.JSONObjectWithFields):
|
||||
class ResourceWithURI(Resource):
|
||||
"""ACME Resource with URI.
|
||||
|
||||
:ivar unicode ~.uri: Location of the resource.
|
||||
:ivar unicode uri: Location of the resource.
|
||||
|
||||
"""
|
||||
uri = jose.Field('uri') # no ChallengeResource.uri
|
||||
@@ -279,24 +258,6 @@ class ResourceBody(jose.JSONObjectWithFields):
|
||||
"""ACME Resource Body."""
|
||||
|
||||
|
||||
class ExternalAccountBinding:
|
||||
"""ACME External Account Binding"""
|
||||
|
||||
@classmethod
|
||||
def from_data(cls, account_public_key, kid, hmac_key, directory):
|
||||
"""Create External Account Binding Resource from contact details, kid and hmac."""
|
||||
|
||||
key_json = json.dumps(account_public_key.to_partial_json()).encode()
|
||||
decoded_hmac_key = jose.b64.b64decode(hmac_key)
|
||||
url = directory["newAccount"]
|
||||
|
||||
eab = jws.JWS.sign(key_json, jose.jwk.JWKOct(key=decoded_hmac_key),
|
||||
jose.jwa.HS256, None,
|
||||
url, kid)
|
||||
|
||||
return eab.to_partial_json()
|
||||
|
||||
|
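A hedged usage sketch of `ExternalAccountBinding.from_data()` as defined above: the result is a ready-to-embed JWS dict with protected/payload/signature members, signed with HS256 over the account public key. The key, kid, and HMAC value mirror the illustrative test data further down:

```python
import josepy as jose
from cryptography.hazmat.primitives.asymmetric import rsa

from acme import messages

account_key = jose.JWKRSA(
    key=rsa.generate_private_key(public_exponent=65537, key_size=2048)).public_key()
directory = messages.Directory({'newAccount': 'https://example.invalid/acme/new-account'})

eab = messages.ExternalAccountBinding.from_data(
    account_public_key=account_key,
    kid='kid-for-testing',
    hmac_key='hmac-key-for-testing',   # base64url-decodable secret, as in the tests below
    directory=directory)

print(sorted(eab))   # ['payload', 'protected', 'signature']
```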
||||
class Registration(ResourceBody):
|
||||
"""Registration Resource Body.
|
||||
|
||||
@@ -309,88 +270,30 @@ class Registration(ResourceBody):
|
||||
# on new-reg key server ignores 'key' and populates it based on
|
||||
# JWS.signature.combined.jwk
|
||||
key = jose.Field('key', omitempty=True, decoder=jose.JWK.from_json)
|
||||
# Contact field implements special behavior to allow messages that clear existing
|
||||
# contacts while not expecting the `contact` field when loading from json.
|
||||
# This is implemented in the constructor and *_json methods.
|
||||
contact = jose.Field('contact', omitempty=True, default=())
|
||||
agreement = jose.Field('agreement', omitempty=True)
|
||||
status = jose.Field('status', omitempty=True)
|
||||
terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
|
||||
only_return_existing = jose.Field('onlyReturnExisting', omitempty=True)
|
||||
external_account_binding = jose.Field('externalAccountBinding', omitempty=True)
|
||||
|
||||
phone_prefix = 'tel:'
|
||||
email_prefix = 'mailto:'
|
||||
|
||||
@classmethod
|
||||
def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
|
||||
"""
|
||||
Create registration resource from contact details.
|
||||
|
||||
The `contact` keyword being passed to a Registration object is meaningful, so
|
||||
this function represents empty iterables in its kwargs by passing on an empty
|
||||
`tuple`.
|
||||
"""
|
||||
|
||||
# Note if `contact` was in kwargs.
|
||||
contact_provided = 'contact' in kwargs
|
||||
|
||||
# Pop `contact` from kwargs and add formatted email or phone numbers
|
||||
def from_data(cls, phone=None, email=None, **kwargs):
|
||||
"""Create registration resource from contact details."""
|
||||
details = list(kwargs.pop('contact', ()))
|
||||
if phone is not None:
|
||||
details.append(cls.phone_prefix + phone)
|
||||
if email is not None:
|
||||
details.extend([cls.email_prefix + mail for mail in email.split(',')])
|
||||
|
||||
# Insert formatted contact information back into kwargs
|
||||
# or insert an empty tuple if `contact` provided.
|
||||
if details or contact_provided:
|
||||
kwargs['contact'] = tuple(details)
|
||||
|
||||
if external_account_binding:
|
||||
kwargs['external_account_binding'] = external_account_binding
|
||||
|
||||
kwargs['contact'] = tuple(details)
|
||||
return cls(**kwargs)
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
"""Note if the user provides a value for the `contact` member."""
|
||||
if 'contact' in kwargs:
|
||||
# Avoid the __setattr__ used by jose.TypedJSONObjectWithFields
|
||||
object.__setattr__(self, '_add_contact', True)
|
||||
super().__init__(**kwargs)
|
||||
|
||||
def _filter_contact(self, prefix):
|
||||
return tuple(
|
||||
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
|
||||
detail[len(prefix):] for detail in self.contact
|
||||
if detail.startswith(prefix))
|
||||
|
||||
def _add_contact_if_appropriate(self, jobj):
|
||||
"""
|
||||
The `contact` member of Registration objects should not be required when
|
||||
de-serializing (as it would be if the Fields' `omitempty` flag were `False`), but
|
||||
it should be included in serializations if it was provided.
|
||||
|
||||
:param jobj: Dictionary containing this Registrations' data
|
||||
:type jobj: dict
|
||||
|
||||
:returns: Dictionary containing Registrations data to transmit to the server
|
||||
:rtype: dict
|
||||
"""
|
||||
if getattr(self, '_add_contact', False):
|
||||
jobj['contact'] = self.encode('contact')
|
||||
|
||||
return jobj
|
||||
|
||||
def to_partial_json(self):
|
||||
"""Modify josepy.JSONDeserializable.to_partial_json()"""
|
||||
jobj = super().to_partial_json()
|
||||
return self._add_contact_if_appropriate(jobj)
|
||||
|
||||
def fields_to_partial_json(self):
|
||||
"""Modify josepy.JSONObjectWithFields.fields_to_partial_json()"""
|
||||
jobj = super().fields_to_partial_json()
|
||||
return self._add_contact_if_appropriate(jobj)
|
||||
|
||||
@property
|
||||
def phones(self):
|
||||
"""All phones found in the ``contact`` field."""
|
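A hedged usage sketch of the `from_data()` / `contact` behaviour shown in this hunk: email addresses get the `mailto:` prefix, and in the newer of the two spellings an explicitly empty contact tuple is still serialized so a client can ask the server to clear existing contacts:

```python
from acme import messages

reg = messages.NewRegistration.from_data(email='admin@foo.com,other@foo.com')
print(reg.contact)   # ('mailto:admin@foo.com', 'mailto:other@foo.com')

# Explicitly empty contact is still transmitted (newer spelling only).
cleared = messages.NewRegistration(contact=())
print('contact' in cleared.to_partial_json())                      # True
print('contact' in messages.NewRegistration().to_partial_json())   # False
```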
||||
@@ -403,13 +306,13 @@ class Registration(ResourceBody):
|
||||
|
||||
|
||||
@Directory.register
|
||||
class NewRegistration(ResourceMixin, Registration):
|
||||
class NewRegistration(Registration):
|
||||
"""New registration."""
|
||||
resource_type = 'new-reg'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class UpdateRegistration(ResourceMixin, Registration):
|
||||
class UpdateRegistration(Registration):
|
||||
"""Update registration."""
|
||||
resource_type = 'reg'
|
||||
resource = fields.Resource(resource_type)
|
||||
@@ -459,20 +362,21 @@ class ChallengeBody(ResourceBody):
|
||||
omitempty=True, default=None)
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
|
||||
super().__init__(**kwargs)
|
||||
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
|
||||
# pylint: disable=star-args
|
||||
super(ChallengeBody, self).__init__(**kwargs)
|
||||
|
||||
def encode(self, name):
|
||||
return super().encode(self._internal_name(name))
|
||||
return super(ChallengeBody, self).encode(self._internal_name(name))
|
||||
|
||||
def to_partial_json(self):
|
||||
jobj = super().to_partial_json()
|
||||
jobj = super(ChallengeBody, self).to_partial_json()
|
||||
jobj.update(self.chall.to_partial_json())
|
||||
return jobj
|
||||
|
||||
@classmethod
|
||||
def fields_from_json(cls, jobj):
|
||||
jobj_fields = super().fields_from_json(jobj)
|
||||
jobj_fields = super(ChallengeBody, cls).fields_from_json(jobj)
|
||||
jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
|
||||
return jobj_fields
|
||||
|
||||
@@ -487,7 +391,7 @@ class ChallengeBody(ResourceBody):
|
||||
def __iter__(self):
|
||||
# When iterating over fields, use the external name 'uri' instead of
|
||||
# the internal '_uri'.
|
||||
for name in super().__iter__():
|
||||
for name in super(ChallengeBody, self).__iter__():
|
||||
yield name[1:] if name == '_uri' else name
|
||||
|
||||
def _internal_name(self, name):
|
||||
@@ -507,6 +411,7 @@ class ChallengeResource(Resource):
|
||||
@property
|
||||
def uri(self):
|
||||
"""The URL of the challenge body."""
|
||||
# pylint: disable=function-redefined,no-member
|
||||
return self.body.uri
|
||||
|
||||
|
||||
@@ -521,7 +426,7 @@ class Authorization(ResourceBody):
|
||||
:ivar datetime.datetime expires:
|
||||
|
||||
"""
|
||||
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
|
||||
identifier = jose.Field('identifier', decoder=Identifier.from_json)
|
||||
challenges = jose.Field('challenges', omitempty=True)
|
||||
combinations = jose.Field('combinations', omitempty=True)
|
||||
|
||||
@@ -533,32 +438,24 @@ class Authorization(ResourceBody):
|
||||
expires = fields.RFC3339Field('expires', omitempty=True)
|
||||
wildcard = jose.Field('wildcard', omitempty=True)
|
||||
|
||||
# Mypy does not understand the josepy magic happening here, and falsely claims
|
||||
# that challenge is redefined. Let's ignore the type check here.
|
||||
@challenges.decoder # type: ignore
|
||||
def challenges(value): # pylint: disable=no-self-argument,missing-function-docstring
|
||||
@challenges.decoder
|
||||
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
|
||||
return tuple(ChallengeBody.from_json(chall) for chall in value)
|
||||
|
||||
@property
|
||||
def resolved_combinations(self):
|
||||
"""Combinations with challenges instead of indices."""
|
||||
return tuple(tuple(self.challenges[idx] for idx in combo)
|
||||
for combo in self.combinations) # pylint: disable=not-an-iterable
|
||||
for combo in self.combinations)
|
||||
|
||||
|
||||
@Directory.register
|
||||
class NewAuthorization(ResourceMixin, Authorization):
|
||||
class NewAuthorization(Authorization):
|
||||
"""New authorization."""
|
||||
resource_type = 'new-authz'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class UpdateAuthorization(ResourceMixin, Authorization):
|
||||
"""Update authorization."""
|
||||
resource_type = 'authz'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class AuthorizationResource(ResourceWithURI):
|
||||
"""Authorization Resource.
|
||||
|
||||
@@ -571,7 +468,7 @@ class AuthorizationResource(ResourceWithURI):
|
||||
|
||||
|
||||
@Directory.register
|
||||
class CertificateRequest(ResourceMixin, jose.JSONObjectWithFields):
|
||||
class CertificateRequest(jose.JSONObjectWithFields):
|
||||
"""ACME new-cert request.
|
||||
|
||||
:ivar josepy.util.ComparableX509 csr:
|
||||
@@ -597,7 +494,7 @@ class CertificateResource(ResourceWithURI):
|
||||
|
||||
|
||||
@Directory.register
|
||||
class Revocation(ResourceMixin, jose.JSONObjectWithFields):
|
||||
class Revocation(jose.JSONObjectWithFields):
|
||||
"""Revocation message.
|
||||
|
||||
:ivar .ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
|
||||
@@ -614,30 +511,26 @@ class Revocation(ResourceMixin, jose.JSONObjectWithFields):
|
||||
class Order(ResourceBody):
|
||||
"""Order Resource Body.
|
||||
|
||||
:ivar identifiers: List of identifiers for the certificate.
|
||||
:vartype identifiers: `list` of `.Identifier`
|
||||
:ivar list of .Identifier: List of identifiers for the certificate.
|
||||
:ivar acme.messages.Status status:
|
||||
:ivar authorizations: URLs of authorizations.
|
||||
:vartype authorizations: `list` of `str`
|
||||
:ivar list of str authorizations: URLs of authorizations.
|
||||
:ivar str certificate: URL to download certificate as a fullchain PEM.
|
||||
:ivar str finalize: URL to POST to to request issuance once all
|
||||
authorizations have "valid" status.
|
||||
:ivar datetime.datetime expires: When the order expires.
|
||||
:ivar ~.Error error: Any error that occurred during finalization, if applicable.
|
||||
:ivar .Error error: Any error that occurred during finalization, if applicable.
|
||||
"""
|
||||
identifiers = jose.Field('identifiers', omitempty=True)
|
||||
status = jose.Field('status', decoder=Status.from_json,
|
||||
omitempty=True)
|
||||
omitempty=True, default=STATUS_PENDING)
|
||||
authorizations = jose.Field('authorizations', omitempty=True)
|
||||
certificate = jose.Field('certificate', omitempty=True)
|
||||
finalize = jose.Field('finalize', omitempty=True)
|
||||
expires = fields.RFC3339Field('expires', omitempty=True)
|
||||
error = jose.Field('error', omitempty=True, decoder=Error.from_json)
|
||||
|
||||
# Mypy does not understand the josepy magic happening here, and falsely claims
|
||||
# that identifiers is redefined. Let's ignore the type check here.
|
||||
@identifiers.decoder # type: ignore
|
||||
def identifiers(value): # pylint: disable=no-self-argument,missing-function-docstring
|
||||
@identifiers.decoder
|
||||
def identifiers(value): # pylint: disable=missing-docstring,no-self-argument
|
||||
return tuple(Identifier.from_json(identifier) for identifier in value)
|
||||
|
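A hedged usage sketch of the `Order` fields above: identifiers round-trip through the decoder into `Identifier` objects, and the two spellings of the hunk differ only in whether `status` defaults to `STATUS_PENDING`. The URLs and names are illustrative:

```python
from acme import messages

order = messages.Order.from_json({
    'identifiers': [{'type': 'dns', 'value': 'example.com'}],
    'status': 'pending',
    'finalize': 'https://example.invalid/acme/finalize/1/1',
})
print(order.identifiers[0].value)   # example.com
print(order.status)                 # Status(pending)
```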
||||
class OrderResource(ResourceWithURI):
|
||||
@@ -645,22 +538,18 @@ class OrderResource(ResourceWithURI):
|
||||
|
||||
:ivar acme.messages.Order body:
|
||||
:ivar str csr_pem: The CSR this Order will be finalized with.
|
||||
:ivar authorizations: Fully-fetched AuthorizationResource objects.
|
||||
:vartype authorizations: `list` of `acme.messages.AuthorizationResource`
|
||||
:ivar list of acme.messages.AuthorizationResource authorizations:
|
||||
Fully-fetched AuthorizationResource objects.
|
||||
:ivar str fullchain_pem: The fetched contents of the certificate URL
|
||||
produced once the order was finalized, if it's present.
|
||||
:ivar alternative_fullchains_pem: The fetched contents of alternative certificate
|
||||
chain URLs produced once the order was finalized, if present and requested during
|
||||
finalization.
|
||||
:vartype alternative_fullchains_pem: `list` of `str`
|
||||
"""
|
||||
body = jose.Field('body', decoder=Order.from_json)
|
||||
csr_pem = jose.Field('csr_pem', omitempty=True)
|
||||
authorizations = jose.Field('authorizations')
|
||||
fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
|
||||
alternative_fullchains_pem = jose.Field('alternative_fullchains_pem', omitempty=True)
|
||||
|
||||
@Directory.register
|
||||
class NewOrder(Order):
|
||||
"""New order."""
|
||||
resource_type = 'new-order'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
@@ -1,12 +1,13 @@
"""Tests for acme.messages."""
from typing import Dict
import unittest
from unittest import mock

import josepy as jose
import mock

from acme import challenges
import test_util
from acme import test_util
from acme.magic_typing import Dict  # pylint: disable=unused-import, no-name-in-module


CERT = test_util.load_comparable_cert('cert.der')
CSR = test_util.load_comparable_csr('csr.der')
||||
@@ -18,7 +19,8 @@ class ErrorTest(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
from acme.messages import Error, ERROR_PREFIX
|
||||
self.error = Error.with_code('malformed', detail='foo', title='title')
|
||||
self.error = Error(
|
||||
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
|
||||
self.jobj = {
|
||||
'detail': 'foo',
|
||||
'title': 'some title',
|
||||
@@ -26,6 +28,7 @@ class ErrorTest(unittest.TestCase):
|
||||
}
|
||||
self.error_custom = Error(typ='custom', detail='bar')
|
||||
self.empty_error = Error()
|
||||
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
|
||||
|
||||
def test_default_typ(self):
|
||||
from acme.messages import Error
|
||||
@@ -40,27 +43,28 @@ class ErrorTest(unittest.TestCase):
|
||||
hash(Error.from_json(self.error.to_json()))
|
||||
|
||||
def test_description(self):
|
||||
self.assertEqual('The request message was malformed', self.error.description)
|
||||
self.assertIsNone(self.error_custom.description)
|
||||
self.assertEqual(
|
||||
'The request message was malformed', self.error.description)
|
||||
self.assertTrue(self.error_custom.description is None)
|
||||
|
||||
def test_code(self):
|
||||
from acme.messages import Error
|
||||
self.assertEqual('malformed', self.error.code)
|
||||
self.assertIsNone(self.error_custom.code)
|
||||
self.assertIsNone(Error().code)
|
||||
self.assertEqual(None, self.error_custom.code)
|
||||
self.assertEqual(None, Error().code)
|
||||
|
||||
def test_is_acme_error(self):
|
||||
from acme.messages import is_acme_error, Error
|
||||
from acme.messages import is_acme_error
|
||||
self.assertTrue(is_acme_error(self.error))
|
||||
self.assertFalse(is_acme_error(self.error_custom))
|
||||
self.assertFalse(is_acme_error(Error()))
|
||||
self.assertFalse(is_acme_error(self.empty_error))
|
||||
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
|
||||
|
||||
def test_unicode_error(self):
|
||||
from acme.messages import Error, is_acme_error
|
||||
arabic_error = Error.with_code(
|
||||
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
|
||||
from acme.messages import Error, ERROR_PREFIX, is_acme_error
|
||||
arabic_error = Error(
|
||||
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
|
||||
title='title')
|
||||
self.assertTrue(is_acme_error(arabic_error))
|
||||
|
||||
def test_with_code(self):
|
||||
@@ -82,7 +86,7 @@ class ConstantTest(unittest.TestCase):
|
||||
from acme.messages import _Constant
|
||||
|
||||
class MockConstant(_Constant): # pylint: disable=missing-docstring
|
||||
POSSIBLE_NAMES: Dict = {}
|
||||
POSSIBLE_NAMES = {} # type: Dict
|
||||
|
||||
self.MockConstant = MockConstant # pylint: disable=invalid-name
|
||||
self.const_a = MockConstant('a')
|
||||
@@ -106,11 +110,11 @@ class ConstantTest(unittest.TestCase):
|
||||
|
||||
def test_equality(self):
|
||||
const_a_prime = self.MockConstant('a')
|
||||
self.assertNotEqual(self.const_a, self.const_b)
|
||||
self.assertEqual(self.const_a, const_a_prime)
|
||||
self.assertFalse(self.const_a == self.const_b)
|
||||
self.assertTrue(self.const_a == const_a_prime)
|
||||
|
||||
self.assertNotEqual(self.const_a, self.const_b)
|
||||
self.assertEqual(self.const_a, const_a_prime)
|
||||
self.assertTrue(self.const_a != self.const_b)
|
||||
self.assertFalse(self.const_a != const_a_prime)
|
||||
|
||||
|
||||
class DirectoryTest(unittest.TestCase):
|
||||
@@ -170,24 +174,6 @@ class DirectoryTest(unittest.TestCase):
|
||||
self.assertTrue(result)
|
||||
|
||||
|
||||
class ExternalAccountBindingTest(unittest.TestCase):
|
||||
def setUp(self):
|
||||
from acme.messages import Directory
|
||||
self.key = jose.jwk.JWKRSA(key=KEY.public_key())
|
||||
self.kid = "kid-for-testing"
|
||||
self.hmac_key = "hmac-key-for-testing"
|
||||
self.dir = Directory({
|
||||
'newAccount': 'http://url/acme/new-account',
|
||||
})
|
||||
|
||||
def test_from_data(self):
|
||||
from acme.messages import ExternalAccountBinding
|
||||
eab = ExternalAccountBinding.from_data(self.key, self.kid, self.hmac_key, self.dir)
|
||||
|
||||
self.assertEqual(len(eab), 3)
|
||||
self.assertEqual(sorted(eab.keys()), sorted(['protected', 'payload', 'signature']))
|
||||
|
||||
|
||||
class RegistrationTest(unittest.TestCase):
|
||||
"""Tests for acme.messages.Registration."""
|
||||
|
||||
@@ -219,22 +205,6 @@ class RegistrationTest(unittest.TestCase):
|
||||
'mailto:admin@foo.com',
|
||||
))
|
||||
|
||||
def test_new_registration_from_data_with_eab(self):
|
||||
from acme.messages import NewRegistration, ExternalAccountBinding, Directory
|
||||
key = jose.jwk.JWKRSA(key=KEY.public_key())
|
||||
kid = "kid-for-testing"
|
||||
hmac_key = "hmac-key-for-testing"
|
||||
directory = Directory({
|
||||
'newAccount': 'http://url/acme/new-account',
|
||||
})
|
||||
eab = ExternalAccountBinding.from_data(key, kid, hmac_key, directory)
|
||||
reg = NewRegistration.from_data(email='admin@foo.com', external_account_binding=eab)
|
||||
self.assertEqual(reg.contact, (
|
||||
'mailto:admin@foo.com',
|
||||
))
|
||||
self.assertEqual(sorted(reg.external_account_binding.keys()),
|
||||
sorted(['protected', 'payload', 'signature']))
|
||||
|
||||
def test_phones(self):
|
||||
self.assertEqual(('1234',), self.reg.phones)
|
||||
|
||||
@@ -252,19 +222,6 @@ class RegistrationTest(unittest.TestCase):
|
||||
from acme.messages import Registration
|
||||
hash(Registration.from_json(self.jobj_from))
|
||||
|
||||
def test_default_not_transmitted(self):
|
||||
from acme.messages import NewRegistration
|
||||
empty_new_reg = NewRegistration()
|
||||
new_reg_with_contact = NewRegistration(contact=())
|
||||
|
||||
self.assertEqual(empty_new_reg.contact, ())
|
||||
self.assertEqual(new_reg_with_contact.contact, ())
|
||||
|
||||
self.assertNotIn('contact', empty_new_reg.to_partial_json())
|
||||
self.assertNotIn('contact', empty_new_reg.fields_to_partial_json())
|
||||
self.assertIn('contact', new_reg_with_contact.to_partial_json())
|
||||
self.assertIn('contact', new_reg_with_contact.fields_to_partial_json())
|
||||
|
||||
|
||||
class UpdateRegistrationTest(unittest.TestCase):
|
||||
"""Tests for acme.messages.UpdateRegistration."""
|
||||
@@ -314,7 +271,8 @@ class ChallengeBodyTest(unittest.TestCase):
|
||||
from acme.messages import Error
|
||||
from acme.messages import STATUS_INVALID
|
||||
self.status = STATUS_INVALID
|
||||
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
|
||||
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
|
||||
detail='Unable to communicate with DNS server')
|
||||
self.challb = ChallengeBody(
|
||||
uri='http://challb', chall=self.chall, status=self.status,
|
||||
error=error)
|
||||
@@ -406,7 +364,7 @@ class AuthorizationResourceTest(unittest.TestCase):
|
||||
authzr = AuthorizationResource(
|
||||
uri=mock.sentinel.uri,
|
||||
body=mock.sentinel.body)
|
||||
self.assertIsInstance(authzr, jose.JSONDeSerializable)
|
||||
self.assertTrue(isinstance(authzr, jose.JSONDeSerializable))
|
||||
|
||||
|
||||
class CertificateRequestTest(unittest.TestCase):
|
||||
@@ -417,7 +375,7 @@ class CertificateRequestTest(unittest.TestCase):
|
||||
self.req = CertificateRequest(csr=CSR)
|
||||
|
||||
def test_json_de_serializable(self):
|
||||
self.assertIsInstance(self.req, jose.JSONDeSerializable)
|
||||
self.assertTrue(isinstance(self.req, jose.JSONDeSerializable))
|
||||
from acme.messages import CertificateRequest
|
||||
self.assertEqual(
|
||||
self.req, CertificateRequest.from_json(self.req.to_json()))
|
||||
@@ -433,7 +391,7 @@ class CertificateResourceTest(unittest.TestCase):
|
||||
cert_chain_uri=mock.sentinel.cert_chain_uri)
|
||||
|
||||
def test_json_de_serializable(self):
|
||||
self.assertIsInstance(self.certr, jose.JSONDeSerializable)
|
||||
self.assertTrue(isinstance(self.certr, jose.JSONDeSerializable))
|
||||
from acme.messages import CertificateResource
|
||||
self.assertEqual(
|
||||
self.certr, CertificateResource.from_json(self.certr.to_json()))
|
||||
@@ -467,32 +425,5 @@ class OrderResourceTest(unittest.TestCase):
|
||||
})
|
||||
|
||||
|
||||
class NewOrderTest(unittest.TestCase):
|
||||
"""Tests for acme.messages.NewOrder."""
|
||||
|
||||
def setUp(self):
|
||||
from acme.messages import NewOrder
|
||||
self.reg = NewOrder(
|
||||
identifiers=mock.sentinel.identifiers)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual(self.reg.to_json(), {
|
||||
'identifiers': mock.sentinel.identifiers,
|
||||
})
|
||||
|
||||
|
||||
class JWSPayloadRFC8555Compliant(unittest.TestCase):
|
||||
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
|
||||
def test_message_payload(self):
|
||||
from acme.messages import NewAuthorization
|
||||
|
||||
new_order = NewAuthorization()
|
||||
new_order.le_acme_version = 2
|
||||
|
||||
jobj = new_order.json_dumps(indent=2).encode()
|
||||
# RFC8555 states that JWS bodies must not have a resource field.
|
||||
self.assertEqual(jobj, b'{}')
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,65 +0,0 @@
|
||||
"""Useful mixins for Challenge and Resource objects"""
|
||||
|
||||
|
||||
class VersionedLEACMEMixin:
|
||||
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
|
||||
@property
|
||||
def le_acme_version(self):
|
||||
"""Define the version of ACME protocol to use"""
|
||||
return getattr(self, '_le_acme_version', 1)
|
||||
|
||||
@le_acme_version.setter
|
||||
def le_acme_version(self, version):
|
||||
# We need to use object.__setattr__ to not depend on the specific implementation of
|
||||
# __setattr__ in current class (eg. jose.TypedJSONObjectWithFields raises AttributeError
|
||||
# for any attempt to set an attribute to make objects immutable).
|
||||
object.__setattr__(self, '_le_acme_version', version)
|
||||
|
||||
def __setattr__(self, key, value):
|
||||
if key == 'le_acme_version':
|
||||
# Required for @property to operate properly. See comment above.
|
||||
object.__setattr__(self, key, value)
|
||||
else:
|
||||
super().__setattr__(key, value) # pragma: no cover
|
||||
|
||||
|
||||
class ResourceMixin(VersionedLEACMEMixin):
|
||||
"""
|
||||
This mixin generates a RFC8555 compliant JWS payload
|
||||
by removing the `resource` field if needed (eg. ACME v2 protocol).
|
||||
"""
|
||||
def to_partial_json(self):
|
||||
"""See josepy.JSONDeserializable.to_partial_json()"""
|
||||
return _safe_jobj_compliance(super(),
|
||||
'to_partial_json', 'resource')
|
||||
|
||||
def fields_to_partial_json(self):
|
||||
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
|
||||
return _safe_jobj_compliance(super(),
|
||||
'fields_to_partial_json', 'resource')
|
||||
|
||||
|
||||
class TypeMixin(VersionedLEACMEMixin):
|
||||
"""
|
||||
This mixin allows generation of a RFC8555 compliant JWS payload
|
||||
by removing the `type` field if needed (eg. ACME v2 protocol).
|
||||
"""
|
||||
def to_partial_json(self):
|
||||
"""See josepy.JSONDeserializable.to_partial_json()"""
|
||||
return _safe_jobj_compliance(super(),
|
||||
'to_partial_json', 'type')
|
||||
|
||||
def fields_to_partial_json(self):
|
||||
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
|
||||
return _safe_jobj_compliance(super(),
|
||||
'fields_to_partial_json', 'type')
|
||||
|
||||
|
||||
def _safe_jobj_compliance(instance, jobj_method, uncompliant_field):
|
||||
if hasattr(instance, jobj_method):
|
||||
jobj = getattr(instance, jobj_method)()
|
||||
if instance.le_acme_version == 2:
|
||||
jobj.pop(uncompliant_field, None)
|
||||
return jobj
|
||||
|
||||
raise AttributeError('Method {0}() is not implemented.'.format(jobj_method)) # pragma: no cover
|
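The mixins.py listing above strips the legacy `resource` (or `type`) field from serialized payloads when talking to an ACME v2 endpoint, since RFC 8555 forbids it. A hedged, generic sketch of that pattern with plain classes (not the acme mixin itself; the payload contents are illustrative):

```python
class PayloadBase:
    def to_partial_json(self):
        return {'resource': 'new-order', 'identifiers': ['example.com']}


class ResourceFieldStripper:
    le_acme_version = 2  # illustrative default; the real mixin stores this per instance

    def to_partial_json(self):
        jobj = super().to_partial_json()
        if self.le_acme_version == 2:
            jobj.pop('resource', None)  # RFC 8555 payloads must not carry "resource"
        return jobj


class NewOrderPayload(ResourceFieldStripper, PayloadBase):
    pass


print(NewOrderPayload().to_partial_json())  # {'identifiers': ['example.com']}
```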
||||
@@ -1,19 +1,29 @@
|
||||
"""Support for standalone client challenge solvers. """
|
||||
import argparse
|
||||
import collections
|
||||
import functools
|
||||
import http.client as http_client
|
||||
import http.server as BaseHTTPServer
|
||||
import logging
|
||||
import os
|
||||
import socket
|
||||
import socketserver
|
||||
import sys
|
||||
import threading
|
||||
from typing import List
|
||||
|
||||
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
import OpenSSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
|
||||
# pylint: disable=too-few-public-methods,no-init
|
||||
|
||||
|
||||
class TLSServer(socketserver.TCPServer):
|
||||
"""Generic TLS Server."""
|
||||
@@ -26,34 +36,28 @@ class TLSServer(socketserver.TCPServer):
|
||||
self.address_family = socket.AF_INET
|
||||
self.certs = kwargs.pop("certs", {})
|
||||
self.method = kwargs.pop(
|
||||
"method", crypto_util._DEFAULT_SSL_METHOD)
|
||||
# pylint: disable=protected-access
|
||||
"method", crypto_util._DEFAULT_TLSSNI01_SSL_METHOD)
|
||||
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
|
||||
socketserver.TCPServer.__init__(self, *args, **kwargs)
|
||||
|
||||
def _wrap_sock(self):
|
||||
self.socket = crypto_util.SSLSocket(
|
||||
self.socket, cert_selection=self._cert_selection,
|
||||
alpn_selection=getattr(self, '_alpn_selection', None),
|
||||
method=self.method)
|
||||
self.socket, certs=self.certs, method=self.method)
|
||||
|
||||
def _cert_selection(self, connection): # pragma: no cover
|
||||
"""Callback selecting certificate for connection."""
|
||||
server_name = connection.get_servername()
|
||||
return self.certs.get(server_name, None)
|
||||
|
||||
def server_bind(self):
|
||||
def server_bind(self): # pylint: disable=missing-docstring
|
||||
self._wrap_sock()
|
||||
return socketserver.TCPServer.server_bind(self)
|
||||
|
||||
|
||||
class ACMEServerMixin:
|
||||
class ACMEServerMixin: # pylint: disable=old-style-class
|
||||
"""ACME server common settings mixin."""
|
||||
# TODO: c.f. #858
|
||||
server_version = "ACME client standalone challenge solver"
|
||||
allow_reuse_address = True
|
||||
|
||||
|
||||
class BaseDualNetworkedServers:
|
||||
class BaseDualNetworkedServers(object):
|
||||
"""Base class for a pair of IPv6 and IPv4 servers that tries to do everything
|
||||
it's asked for both servers, but where failures in one server don't
|
||||
affect the other.
|
||||
@@ -63,8 +67,8 @@ class BaseDualNetworkedServers:
|
||||
|
||||
def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
|
||||
port = server_address[1]
|
||||
self.threads: List[threading.Thread] = []
|
||||
self.servers: List[socketserver.BaseServer] = []
|
||||
self.threads = [] # type: List[threading.Thread]
|
||||
self.servers = [] # type: List[ACMEServerMixin]
|
||||
|
||||
# Must try True first.
|
||||
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
|
||||
@@ -78,7 +82,7 @@ class BaseDualNetworkedServers:
|
||||
kwargs["ipv6"] = ip_version
|
||||
new_address = (server_address[0],) + (port,) + server_address[2:]
|
||||
new_args = (new_address,) + remaining_args
|
||||
server = ServerClass(*new_args, **kwargs)
|
||||
server = ServerClass(*new_args, **kwargs) # pylint: disable=star-args
|
||||
logger.debug(
|
||||
"Successfully bound to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
@@ -86,8 +90,8 @@ class BaseDualNetworkedServers:
|
||||
if self.servers:
|
||||
# Already bound using IPv6.
|
||||
logger.debug(
|
||||
"Certbot wasn't able to bind to %s:%s using %s, this "
|
||||
"is often expected due to the dual stack nature of "
|
||||
"Certbot wasn't able to bind to %s:%s using %s, this " +
|
||||
"is often expected due to the dual stack nature of " +
|
||||
"IPv6 socket implementations.",
|
||||
new_address[0], new_address[1],
|
||||
"IPv6" if ip_version else "IPv4")
|
||||
@@ -100,13 +104,14 @@ class BaseDualNetworkedServers:
|
||||
# If two servers are set up and port 0 was passed in, ensure we always
|
||||
# bind to the same port for both servers.
|
||||
port = server.socket.getsockname()[1]
|
||||
if not self.servers:
|
||||
if len(self.servers) == 0:
|
||||
raise socket.error("Could not bind to IPv4 or IPv6.")
|
||||
|
||||
def serve_forever(self):
|
||||
"""Wraps socketserver.TCPServer.serve_forever"""
|
||||
for server in self.servers:
|
||||
thread = threading.Thread(
|
||||
# pylint: disable=no-member
|
||||
target=server.serve_forever)
|
||||
thread.start()
|
||||
self.threads.append(thread)
|
||||
@@ -126,38 +131,33 @@ class BaseDualNetworkedServers:
|
||||
self.threads = []
|
||||
|
||||
|
||||
class TLSALPN01Server(TLSServer, ACMEServerMixin):
|
||||
"""TLSALPN01 Server."""
|
||||
class TLSSNI01Server(TLSServer, ACMEServerMixin):
|
||||
"""TLSSNI01 Server."""
|
||||
|
||||
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
|
||||
|
||||
def __init__(self, server_address, certs, challenge_certs, ipv6=False):
|
||||
def __init__(self, server_address, certs, ipv6=False):
|
||||
TLSServer.__init__(
|
||||
self, server_address, _BaseRequestHandlerWithLogging, certs=certs,
|
||||
ipv6=ipv6)
|
||||
self.challenge_certs = challenge_certs
|
||||
self, server_address, BaseRequestHandlerWithLogging, certs=certs, ipv6=ipv6)
|
||||
|
||||
def _cert_selection(self, connection):
|
||||
# TODO: We would like to serve challenge cert only if asked for it via
|
||||
# ALPN. To do this, we need to retrieve the list of protos from client
|
||||
# hello, but this is currently impossible with openssl [0], and ALPN
|
||||
# negotiation is done after cert selection.
|
||||
# Therefore, currently we always return challenge cert, and terminate
|
||||
# handshake in alpn_selection() if ALPN protos are not what we expect.
|
||||
# [0] https://github.com/openssl/openssl/issues/4952
|
||||
server_name = connection.get_servername()
|
||||
logger.debug("Serving challenge cert for server name %s", server_name)
|
||||
return self.challenge_certs.get(server_name, None)
|
||||
|
||||
def _alpn_selection(self, _connection, alpn_protos):
|
||||
"""Callback to select alpn protocol."""
|
||||
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
|
||||
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
|
||||
return self.ACME_TLS_1_PROTOCOL
|
||||
logger.debug("Cannot agree on ALPN proto. Got: %s", str(alpn_protos))
|
||||
# Explicitly close the connection now, by returning an empty string.
|
||||
# See https://www.pyopenssl.org/en/stable/api/ssl.html#OpenSSL.SSL.Context.set_alpn_select_callback # pylint: disable=line-too-long
|
||||
return b""
|
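The `_alpn_selection` callback above accepts only `acme-tls/1` and terminates the handshake otherwise by returning an empty byte string. A minimal pyOpenSSL sketch of that negotiation hook, assuming the classic `SSLv23_METHOD` context used in this era of the code:

```python
from OpenSSL import SSL

ACME_TLS_1 = b"acme-tls/1"


def alpn_select(connection, offered_protocols):
    """Accept only the acme-tls/1 protocol; b"" refuses the handshake."""
    if ACME_TLS_1 in offered_protocols:
        return ACME_TLS_1
    return b""


context = SSL.Context(SSL.SSLv23_METHOD)
context.set_alpn_select_callback(alpn_select)
```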
||||
class TLSSNI01DualNetworkedServers(BaseDualNetworkedServers):
|
||||
"""TLSSNI01Server Wrapper. Tries everything for both. Failures for one don't
|
||||
affect the other."""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
BaseDualNetworkedServers.__init__(self, TLSSNI01Server, *args, **kwargs)
|
||||
|
||||
|
||||
class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
|
||||
"""BaseRequestHandler with logging."""
|
||||
|
||||
def log_message(self, format, *args): # pylint: disable=redefined-builtin
|
||||
"""Log arbitrary message."""
|
||||
logger.debug("%s - - %s", self.client_address[0], format % args)
|
||||
|
||||
def handle(self):
|
||||
"""Handle request."""
|
||||
self.log_message("Incoming request")
|
||||
socketserver.BaseRequestHandler.handle(self)
|
||||
|
||||
|
||||
class HTTPServer(BaseHTTPServer.HTTPServer):
|
||||
@@ -175,10 +175,10 @@ class HTTPServer(BaseHTTPServer.HTTPServer):
|
||||
class HTTP01Server(HTTPServer, ACMEServerMixin):
|
||||
"""HTTP01 Server."""
|
||||
|
||||
def __init__(self, server_address, resources, ipv6=False, timeout=30):
|
||||
def __init__(self, server_address, resources, ipv6=False):
|
||||
HTTPServer.__init__(
|
||||
self, server_address, HTTP01RequestHandler.partial_init(
|
||||
simple_http_resources=resources, timeout=timeout), ipv6=ipv6)
|
||||
simple_http_resources=resources), ipv6=ipv6)
|
||||
|
||||
|
||||
class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
|
||||
@@ -203,24 +203,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
|
||||
self._timeout = kwargs.pop('timeout', 30)
|
||||
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
|
||||
self.server: HTTP01Server
|
||||
|
||||
# In parent class BaseHTTPRequestHandler, 'timeout' is a class-level property but we
|
||||
# need to define its value during the initialization phase in HTTP01RequestHandler.
|
||||
# However MyPy does not appreciate that we dynamically shadow a class-level property
|
||||
# with an instance-level property (eg. self.timeout = ... in __init__()). So to make
|
||||
# everyone happy, we statically redefine 'timeout' as a method property, and set the
|
||||
# timeout value in a new internal instance-level property _timeout.
|
||||
@property
|
||||
def timeout(self):
|
||||
"""
|
||||
The default timeout this server should apply to requests.
|
||||
:return: timeout to apply
|
||||
:rtype: int
|
||||
"""
|
||||
return self._timeout
|
||||
|
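The timeout plumbing above exists because socketserver only accepts a handler *class*, so per-server options are bound with `functools.partial` and the class-level `timeout` attribute is shadowed by a read-only property. A hedged sketch of the same pattern on a plain `BaseHTTPRequestHandler` (names and values are illustrative):

```python
import functools
import http.server


class Handler(http.server.BaseHTTPRequestHandler):
    def __init__(self, *args, **kwargs):
        self._timeout = kwargs.pop('timeout', 30)  # stash before the parent handles the socket
        super().__init__(*args, **kwargs)

    @property
    def timeout(self):
        return self._timeout  # shadows the class-level attribute from the parent

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'ok')


handler_factory = functools.partial(Handler, timeout=0.05)
server = http.server.HTTPServer(('', 0), handler_factory)
```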
||||
def log_message(self, format, *args): # pylint: disable=redefined-builtin
|
||||
"""Log arbitrary message."""
|
||||
@@ -231,7 +214,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
self.log_message("Incoming request")
|
||||
BaseHTTPServer.BaseHTTPRequestHandler.handle(self)
|
||||
|
||||
def do_GET(self): # pylint: disable=invalid-name,missing-function-docstring
|
||||
def do_GET(self): # pylint: disable=invalid-name,missing-docstring
|
||||
if self.path == "/":
|
||||
self.handle_index()
|
||||
elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
|
||||
@@ -269,7 +252,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
self.path)
|
||||
|
||||
@classmethod
|
||||
def partial_init(cls, simple_http_resources, timeout):
|
||||
def partial_init(cls, simple_http_resources):
|
||||
"""Partially initialize this handler.
|
||||
|
||||
This is useful because `socketserver.BaseServer` takes
|
||||
@@ -278,18 +261,40 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
|
||||
"""
|
||||
return functools.partial(
|
||||
cls, simple_http_resources=simple_http_resources,
|
||||
timeout=timeout)
|
||||
cls, simple_http_resources=simple_http_resources)
|
||||
|
||||
|
||||
class _BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
|
||||
"""BaseRequestHandler with logging."""
|
||||
def simple_tls_sni_01_server(cli_args, forever=True):
|
||||
"""Run simple standalone TLSSNI01 server."""
|
||||
logging.basicConfig(level=logging.DEBUG)
|
||||
|
||||
def log_message(self, format, *args): # pylint: disable=redefined-builtin
|
||||
"""Log arbitrary message."""
|
||||
logger.debug("%s - - %s", self.client_address[0], format % args)
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument(
|
||||
"-p", "--port", default=0, help="Port to serve at. By default "
|
||||
"picks random free port.")
|
||||
args = parser.parse_args(cli_args[1:])
|
||||
|
||||
def handle(self):
|
||||
"""Handle request."""
|
||||
self.log_message("Incoming request")
|
||||
socketserver.BaseRequestHandler.handle(self)
|
||||
certs = {}
|
||||
|
||||
_, hosts, _ = next(os.walk('.')) # type: ignore # https://github.com/python/mypy/issues/465
|
||||
for host in hosts:
|
||||
with open(os.path.join(host, "cert.pem")) as cert_file:
|
||||
cert_contents = cert_file.read()
|
||||
with open(os.path.join(host, "key.pem")) as key_file:
|
||||
key_contents = key_file.read()
|
||||
certs[host.encode()] = (
|
||||
OpenSSL.crypto.load_privatekey(
|
||||
OpenSSL.crypto.FILETYPE_PEM, key_contents),
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, cert_contents))
|
||||
|
||||
server = TLSSNI01Server(('', int(args.port)), certs=certs)
|
||||
logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
|
||||
if forever: # pragma: no cover
|
||||
server.serve_forever()
|
||||
else:
|
||||
server.handle_request()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(simple_tls_sni_01_server(sys.argv)) # pragma: no cover
|
||||
|
||||
@@ -1,20 +1,23 @@
|
||||
"""Tests for acme.standalone."""
|
||||
import http.client as http_client
|
||||
import os
|
||||
import shutil
|
||||
import socket
|
||||
import socketserver
|
||||
import threading
|
||||
import tempfile
|
||||
import unittest
|
||||
from typing import Set
|
||||
from unittest import mock
|
||||
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import queue # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
import josepy as jose
|
||||
import mock
|
||||
import requests
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
|
||||
import test_util
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
class TLSServerTest(unittest.TestCase):
|
||||
@@ -25,14 +28,41 @@ class TLSServerTest(unittest.TestCase):
|
||||
from acme.standalone import TLSServer
|
||||
server = TLSServer(
|
||||
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True)
|
||||
server.server_close()
|
||||
server.server_close() # pylint: disable=no-member
|
||||
|
||||
def test_ipv6(self):
|
||||
if socket.has_ipv6:
|
||||
from acme.standalone import TLSServer
|
||||
server = TLSServer(
|
||||
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True, ipv6=True)
|
||||
server.server_close()
|
||||
server.server_close() # pylint: disable=no-member
|
||||
|
||||
|
||||
class TLSSNI01ServerTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSSNI01Server."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSSNI01Server
|
||||
self.server = TLSSNI01Server(("", 0), certs=self.certs)
|
||||
# pylint: disable=no-member
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
def test_it(self):
|
||||
host, port = self.server.socket.getsockname()[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
|
||||
class HTTP01ServerTest(unittest.TestCase):
|
||||
@@ -42,17 +72,18 @@ class HTTP01ServerTest(unittest.TestCase):
|
||||
def setUp(self):
|
||||
self.account_key = jose.JWK.load(
|
||||
test_util.load_vector('rsa1024_key.pem'))
|
||||
self.resources: Set = set()
|
||||
self.resources = set() # type: Set
|
||||
|
||||
from acme.standalone import HTTP01Server
|
||||
self.server = HTTP01Server(('', 0), resources=self.resources)
|
||||
|
||||
# pylint: disable=no-member
|
||||
self.port = self.server.socket.getsockname()[1]
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown()
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
def test_index(self):
|
||||
@@ -86,81 +117,6 @@ class HTTP01ServerTest(unittest.TestCase):
|
||||
def test_http01_not_found(self):
|
||||
self.assertFalse(self._test_http01(add=False))
|
||||
|
||||
def test_timely_shutdown(self):
|
||||
from acme.standalone import HTTP01Server
|
||||
server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
|
||||
server_thread = threading.Thread(target=server.serve_forever)
|
||||
server_thread.start()
|
||||
|
||||
client = socket.socket()
|
||||
client.connect(('localhost', server.socket.getsockname()[1]))
|
||||
|
||||
stop_thread = threading.Thread(target=server.shutdown)
|
||||
stop_thread.start()
|
||||
server_thread.join(5.)
|
||||
|
||||
is_hung = server_thread.is_alive()
|
||||
try:
|
||||
client.shutdown(socket.SHUT_RDWR)
|
||||
except: # pragma: no cover, pylint: disable=bare-except
|
||||
# may raise error because socket could already be closed
|
||||
pass
|
||||
|
||||
self.assertFalse(is_hung, msg='Server shutdown should not be hung')
|
||||
|
||||
|
||||
@unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
|
||||
class TLSALPN01ServerTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSALPN01Server."""
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
# Use different certificate for challenge.
|
||||
self.challenge_certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa4096_key.pem'),
|
||||
test_util.load_cert('rsa4096_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSALPN01Server
|
||||
self.server = TLSALPN01Server(("localhost", 0), certs=self.certs,
|
||||
challenge_certs=self.challenge_certs)
|
||||
# pylint: disable=no-member
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
# TODO: This is not implemented yet, see comments in standalone.py
|
||||
# def test_certs(self):
|
||||
# host, port = self.server.socket.getsockname()[:2]
|
||||
# cert = crypto_util.probe_sni(
|
||||
# b'localhost', host=host, port=port, timeout=1)
|
||||
# # Expect normal cert when connecting without ALPN.
|
||||
# self.assertEqual(jose.ComparableX509(cert),
|
||||
# jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
def test_challenge_certs(self):
|
||||
host, port = self.server.socket.getsockname()[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1,
|
||||
alpn_protocols=[b"acme-tls/1"])
|
||||
# Expect challenge cert when connecting with ALPN.
|
||||
self.assertEqual(
|
||||
jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.challenge_certs[b'localhost'][1])
|
||||
)
|
||||
|
||||
def test_bad_alpn(self):
|
||||
host, port = self.server.socket.getsockname()[:2]
|
||||
with self.assertRaises(errors.Error):
|
||||
crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1,
|
||||
alpn_protocols=[b"bad-alpn"])
|
||||
|
||||
|
||||
class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.BaseDualNetworkedServers."""
|
||||
@@ -177,10 +133,8 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
self.address_family = socket.AF_INET
|
||||
socketserver.TCPServer.__init__(self, *args, **kwargs)
|
||||
if ipv6:
|
||||
# NB: On Windows, socket.IPPROTO_IPV6 constant may be missing.
|
||||
# We use the corresponding value (41) instead.
|
||||
level = getattr(socket, "IPPROTO_IPV6", 41)
|
||||
self.socket.setsockopt(level, socket.IPV6_V6ONLY, 1)
|
||||
# pylint: disable=no-member
|
||||
self.socket.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
|
||||
try:
|
||||
self.server_bind()
|
||||
self.server_activate()
|
||||
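The `getattr(socket, "IPPROTO_IPV6", 41)` line above works around `socket.IPPROTO_IPV6` being absent on some Windows builds of Python; 41 is the protocol's numeric value. Outside the test helper, the same idiom looks like this (a sketch, guarded so it is skipped on hosts without IPv6 support):

```python
import socket

if socket.has_ipv6:
    sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    # On Windows the IPPROTO_IPV6 constant may be missing; 41 is its value.
    level = getattr(socket, "IPPROTO_IPV6", 41)
    sock.setsockopt(level, socket.IPV6_V6ONLY, 1)  # IPv6 only, no IPv4 mapping
    sock.close()
```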
@@ -193,15 +147,15 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
mock_bind.side_effect = socket.error
|
||||
from acme.standalone import BaseDualNetworkedServers
|
||||
self.assertRaises(socket.error, BaseDualNetworkedServers,
|
||||
BaseDualNetworkedServersTest.SingleProtocolServer,
|
||||
('', 0),
|
||||
socketserver.BaseRequestHandler)
|
||||
BaseDualNetworkedServersTest.SingleProtocolServer,
|
||||
("", 0),
|
||||
socketserver.BaseRequestHandler)
|
||||
|
||||
def test_ports_equal(self):
|
||||
from acme.standalone import BaseDualNetworkedServers
|
||||
servers = BaseDualNetworkedServers(
|
||||
BaseDualNetworkedServersTest.SingleProtocolServer,
|
||||
('', 0),
|
||||
("", 0),
|
||||
socketserver.BaseRequestHandler)
|
||||
socknames = servers.getsocknames()
|
||||
prev_port = None
|
||||
@@ -213,17 +167,46 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
prev_port = port
|
||||
|
||||
|
||||
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSSNI01DualNetworkedServers
|
||||
self.servers = TLSSNI01DualNetworkedServers(("", 0), certs=self.certs)
|
||||
self.servers.serve_forever()
|
||||
|
||||
def tearDown(self):
|
||||
self.servers.shutdown_and_server_close()
|
||||
|
||||
def test_connect(self):
|
||||
socknames = self.servers.getsocknames()
|
||||
# connect to all addresses
|
||||
for sockname in socknames:
|
||||
host, port = sockname[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
|
||||
class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.account_key = jose.JWK.load(
|
||||
test_util.load_vector('rsa1024_key.pem'))
|
||||
self.resources: Set = set()
|
||||
self.resources = set() # type: Set
|
||||
|
||||
from acme.standalone import HTTP01DualNetworkedServers
|
||||
self.servers = HTTP01DualNetworkedServers(('', 0), resources=self.resources)
|
||||
|
||||
# pylint: disable=no-member
|
||||
self.port = self.servers.getsocknames()[0][1]
|
||||
self.servers.serve_forever()
|
||||
|
||||
@@ -262,5 +245,50 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
self.assertFalse(self._test_http01(add=False))
|
||||
|
||||
|
||||
class TestSimpleTLSSNI01Server(unittest.TestCase):
|
||||
"""Tests for acme.standalone.simple_tls_sni_01_server."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
# mirror ../examples/standalone
|
||||
self.test_cwd = tempfile.mkdtemp()
|
||||
localhost_dir = os.path.join(self.test_cwd, 'localhost')
|
||||
os.makedirs(localhost_dir)
|
||||
shutil.copy(test_util.vector_path('rsa2048_cert.pem'),
|
||||
os.path.join(localhost_dir, 'cert.pem'))
|
||||
shutil.copy(test_util.vector_path('rsa2048_key.pem'),
|
||||
os.path.join(localhost_dir, 'key.pem'))
|
||||
|
||||
from acme.standalone import simple_tls_sni_01_server
|
||||
self.thread = threading.Thread(
|
||||
target=simple_tls_sni_01_server, kwargs={
|
||||
'cli_args': ('filename',),
|
||||
'forever': False,
|
||||
},
|
||||
)
|
||||
self.old_cwd = os.getcwd()
|
||||
os.chdir(self.test_cwd)
|
||||
|
||||
def tearDown(self):
|
||||
os.chdir(self.old_cwd)
|
||||
self.thread.join()
|
||||
shutil.rmtree(self.test_cwd)
|
||||
|
||||
@mock.patch('acme.standalone.logger')
|
||||
def test_it(self, mock_logger):
|
||||
# Use a Queue because mock objects aren't thread safe.
|
||||
q = queue.Queue() # type: queue.Queue[int]
|
||||
# Add port number to the queue.
|
||||
mock_logger.info.side_effect = lambda *args: q.put(args[-1])
|
||||
self.thread.start()
|
||||
|
||||
# After the timeout, an exception is raised if the queue is empty.
|
||||
port = q.get(timeout=5)
|
||||
cert = crypto_util.probe_sni(b'localhost', b'0.0.0.0', port)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
test_util.load_comparable_cert(
|
||||
'rsa2048_cert.pem'))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -4,12 +4,19 @@
|
||||
|
||||
"""
|
||||
import os
|
||||
import pkg_resources
|
||||
import unittest
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
import pkg_resources
|
||||
|
||||
|
||||
def vector_path(*names):
|
||||
"""Path to a test vector."""
|
||||
return pkg_resources.resource_filename(
|
||||
__name__, os.path.join('testdata', *names))
|
||||
|
||||
|
||||
def load_vector(*names):
|
||||
@@ -25,7 +32,8 @@ def _guess_loader(filename, loader_pem, loader_der):
|
||||
return loader_pem
|
||||
elif ext.lower() == '.der':
|
||||
return loader_der
|
||||
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
|
||||
else: # pragma: no cover
|
||||
raise ValueError("Loader could not be recognized based on extension")
|
||||
|
||||
|
||||
def load_cert(*names):
|
||||
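The `_guess_loader` hunk above only shows the tail of the function; the whole dispatch is small enough to restate as a standalone sketch (the body below mirrors the shown logic but is not a verbatim copy of the acme source):

```python
import os

from OpenSSL import crypto


def guess_loader(filename, loader_pem, loader_der):
    """Pick the loader constant matching the test vector's file extension."""
    _, ext = os.path.splitext(filename)
    if ext.lower() == ".pem":
        return loader_pem
    if ext.lower() == ".der":
        return loader_der
    raise ValueError("Loader could not be recognized based on extension")


# PEM vectors such as rsa2048_cert.pem map to FILETYPE_PEM, csr.der to FILETYPE_ASN1.
assert guess_loader("rsa2048_cert.pem", crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1) == crypto.FILETYPE_PEM
assert guess_loader("csr.der", crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1) == crypto.FILETYPE_ASN1
```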
@@ -65,3 +73,24 @@ def load_pyopenssl_private_key(*names):
|
||||
loader = _guess_loader(
|
||||
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
|
||||
return crypto.load_privatekey(loader, load_vector(*names))
|
||||
|
||||
|
||||
def skip_unless(condition, reason): # pragma: no cover
|
||||
"""Skip tests unless a condition holds.
|
||||
|
||||
This implements the basic functionality of unittest.skipUnless
|
||||
which is only available on Python 2.7+.
|
||||
|
||||
:param bool condition: If ``False``, the test will be skipped
|
||||
:param str reason: the reason for skipping the test
|
||||
|
||||
:rtype: callable
|
||||
:returns: decorator that hides tests unless condition is ``True``
|
||||
|
||||
"""
|
||||
if hasattr(unittest, "skipUnless"):
|
||||
return unittest.skipUnless(condition, reason)
|
||||
elif condition:
|
||||
return lambda cls: cls
|
||||
else:
|
||||
return lambda cls: None
|
||||
@@ -4,14 +4,12 @@ to use appropriate extension for vector filenames: .pem for PEM and
|
||||
|
||||
The following command has been used to generate test keys:
|
||||
|
||||
for k in 256 512 1024 2048 4096; do openssl genrsa -out rsa${k}_key.pem $k; done
|
||||
for x in 256 512 1024 2048; do openssl genrsa -out rsa${k}_key.pem $k; done
|
||||
|
||||
and for the CSR:
|
||||
|
||||
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
|
||||
|
||||
and for the certificates:
|
||||
and for the certificate:
|
||||
|
||||
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
|
||||
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 > rsa2048_cert.pem
|
||||
openssl req -key rsa1024_key.pem -new -subj '/CN=example.com' -x509 > rsa1024_cert.pem
|
||||
openssl req -key rsa2047_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
|
||||
@@ -1,6 +1,7 @@
|
||||
"""ACME utilities."""
|
||||
import six
|
||||
|
||||
|
||||
def map_keys(dikt, func):
|
||||
"""Map dictionary keys."""
|
||||
return {func(key): value for key, value in dikt.items()}
|
||||
return dict((func(key), value) for key, value in six.iteritems(dikt))
|
||||
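`map_keys` is tiny, but a one-line usage example makes the intent obvious (the values below are made up):

```python
def map_keys(dikt, func):
    """Map dictionary keys."""
    return {func(key): value for key, value in dikt.items()}


assert map_keys({"Content-Type": "text/plain"}, str.lower) == {"content-type": "text/plain"}
```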
|
||||
@@ -9,7 +9,7 @@ BUILDDIR = _build
|
||||
|
||||
# User-friendly check for sphinx-build
|
||||
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
|
||||
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from https://www.sphinx-doc.org/)
|
||||
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
|
||||
endif
|
||||
|
||||
# Internal variables.
|
||||
|
||||
@@ -12,8 +12,10 @@
|
||||
# All configuration values have a default; values that are commented out
|
||||
# serve to show the default.
|
||||
|
||||
import os
|
||||
import sys
|
||||
import os
|
||||
import shlex
|
||||
|
||||
|
||||
here = os.path.abspath(os.path.dirname(__file__))
|
||||
|
||||
@@ -40,7 +42,7 @@ extensions = [
|
||||
]
|
||||
|
||||
autodoc_member_order = 'bysource'
|
||||
autodoc_default_flags = ['show-inheritance']
|
||||
autodoc_default_flags = ['show-inheritance', 'private-members']
|
||||
|
||||
# Add any paths that contain templates here, relative to this directory.
|
||||
templates_path = ['_templates']
|
||||
@@ -85,9 +87,7 @@ language = 'en'
|
||||
|
||||
# List of patterns, relative to source directory, that match files and
|
||||
# directories to ignore when looking for source files.
|
||||
exclude_patterns = [
|
||||
'_build',
|
||||
]
|
||||
exclude_patterns = ['_build']
|
||||
|
||||
# The reST default role (used for this markup: `text`) to use for all
|
||||
# documents.
|
||||
@@ -114,7 +114,7 @@ pygments_style = 'sphinx'
|
||||
#keep_warnings = False
|
||||
|
||||
# If true, `todo` and `todoList` produce output, else they produce nothing.
|
||||
todo_include_todos = False
|
||||
todo_include_todos = True
|
||||
|
||||
|
||||
# -- Options for HTML output ----------------------------------------------
|
||||
@@ -122,7 +122,7 @@ todo_include_todos = False
|
||||
# The theme to use for HTML and HTML Help pages. See the documentation for
|
||||
# a list of builtin themes.
|
||||
|
||||
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
|
||||
# http://docs.readthedocs.org/en/latest/theme.html#how-do-i-use-this-locally-and-on-read-the-docs
|
||||
# on_rtd is whether we are on readthedocs.org
|
||||
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
|
||||
if not on_rtd: # only import and set the theme if we're building docs locally
|
||||
|
||||
@@ -16,6 +16,13 @@ Contents:
|
||||
.. automodule:: acme
|
||||
:members:
|
||||
|
||||
|
||||
Example client:
|
||||
|
||||
.. include:: ../examples/example_client.py
|
||||
:code: python
|
||||
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
|
||||
@@ -65,7 +65,7 @@ if errorlevel 9009 (
|
||||
echo.may add the Sphinx directory to PATH.
|
||||
echo.
|
||||
echo.If you don't have Sphinx installed, grab it from
|
||||
echo.https://www.sphinx-doc.org/
|
||||
echo.http://sphinx-doc.org/
|
||||
exit /b 1
|
||||
)
|
||||
|
||||
|
||||
@@ -1,3 +1 @@
|
||||
:orphan:
|
||||
|
||||
.. literalinclude:: ../jws-help.txt
|
||||
|
||||
acme/examples/example_client.py (new file, 47 lines)

@@ -0,0 +1,47 @@
|
||||
"""Example script showing how to use acme client API."""
|
||||
import logging
|
||||
import os
|
||||
import pkg_resources
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
|
||||
from acme import client
|
||||
from acme import messages
|
||||
|
||||
|
||||
logging.basicConfig(level=logging.DEBUG)
|
||||
|
||||
|
||||
DIRECTORY_URL = 'https://acme-staging.api.letsencrypt.org/directory'
|
||||
BITS = 2048 # minimum for Boulder
|
||||
DOMAIN = 'example1.com' # example.com is ignored by Boulder
|
||||
|
||||
# generate_private_key requires cryptography>=0.5
|
||||
key = jose.JWKRSA(key=rsa.generate_private_key(
|
||||
public_exponent=65537,
|
||||
key_size=BITS,
|
||||
backend=default_backend()))
|
||||
acme = client.Client(DIRECTORY_URL, key)
|
||||
|
||||
regr = acme.register()
|
||||
logging.info('Auto-accepting TOS: %s', regr.terms_of_service)
|
||||
acme.agree_to_tos(regr)
|
||||
logging.debug(regr)
|
||||
|
||||
authzr = acme.request_challenges(
|
||||
identifier=messages.Identifier(typ=messages.IDENTIFIER_FQDN, value=DOMAIN))
|
||||
logging.debug(authzr)
|
||||
|
||||
authzr, authzr_response = acme.poll(authzr)
|
||||
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_ASN1, pkg_resources.resource_string(
|
||||
'acme', os.path.join('testdata', 'csr.der')))
|
||||
try:
|
||||
acme.request_issuance(jose.util.ComparableX509(csr), (authzr,))
|
||||
except messages.Error as error:
|
||||
print ("This script is doomed to fail as no authorization "
|
||||
"challenges are ever solved. Error from server: {0}".format(error))
|
||||
@@ -1,241 +0,0 @@
|
||||
"""Example ACME-V2 API for HTTP-01 challenge.
|
||||
|
||||
Brief:
|
||||
|
||||
This a complete usage example of the python-acme API.
|
||||
|
||||
Limitations of this example:
|
||||
- Works for only one Domain name
|
||||
- Performs only HTTP-01 challenge
|
||||
- Uses ACME-v2
|
||||
|
||||
Workflow:
|
||||
(Account creation)
|
||||
- Create account key
|
||||
- Register account and accept TOS
|
||||
(Certificate actions)
|
||||
- Select HTTP-01 within offered challenges by the CA server
|
||||
- Set up http challenge resource
|
||||
- Set up standalone web server
|
||||
- Create domain private key and CSR
|
||||
- Issue certificate
|
||||
- Renew certificate
|
||||
- Revoke certificate
|
||||
(Account update actions)
|
||||
- Change contact information
|
||||
- Deactivate Account
|
||||
"""
|
||||
from contextlib import contextmanager
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import client
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import messages
|
||||
from acme import standalone
|
||||
|
||||
# Constants:
|
||||
|
||||
# This is the staging point for ACME-V2 within Let's Encrypt.
|
||||
DIRECTORY_URL = 'https://acme-staging-v02.api.letsencrypt.org/directory'
|
||||
|
||||
USER_AGENT = 'python-acme-example'
|
||||
|
||||
# Account key size
|
||||
ACC_KEY_BITS = 2048
|
||||
|
||||
# Certificate private key size
|
||||
CERT_PKEY_BITS = 2048
|
||||
|
||||
# Domain name for the certificate.
|
||||
DOMAIN = 'client.example.com'
|
||||
|
||||
# If you are running Boulder locally, it is possible to configure any port
|
||||
# number to execute the challenge, but real CA servers will always use port
|
||||
# 80, as described in the ACME specification.
|
||||
PORT = 80
|
||||
|
||||
|
||||
# Useful methods and classes:
|
||||
|
||||
|
||||
def new_csr_comp(domain_name, pkey_pem=None):
|
||||
"""Create certificate signing request."""
|
||||
if pkey_pem is None:
|
||||
# Create private key.
|
||||
pkey = OpenSSL.crypto.PKey()
|
||||
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, CERT_PKEY_BITS)
|
||||
pkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM,
|
||||
pkey)
|
||||
csr_pem = crypto_util.make_csr(pkey_pem, [domain_name])
|
||||
return pkey_pem, csr_pem
|
||||
|
||||
|
||||
def select_http01_chall(orderr):
|
||||
"""Extract authorization resource from within order resource."""
|
||||
# Authorization Resource: authz.
|
||||
# This object holds the offered challenges by the server and their status.
|
||||
authz_list = orderr.authorizations
|
||||
|
||||
for authz in authz_list:
|
||||
# Choosing challenge.
|
||||
# authz.body.challenges is a set of ChallengeBody objects.
|
||||
for i in authz.body.challenges:
|
||||
# Find the supported challenge.
|
||||
if isinstance(i.chall, challenges.HTTP01):
|
||||
return i
|
||||
|
||||
raise Exception('HTTP-01 challenge was not offered by the CA server.')
|
||||
|
||||
|
||||
@contextmanager
|
||||
def challenge_server(http_01_resources):
|
||||
"""Manage standalone server set up and shutdown."""
|
||||
|
||||
# Setting up a fake server that binds at PORT and any address.
|
||||
address = ('', PORT)
|
||||
try:
|
||||
servers = standalone.HTTP01DualNetworkedServers(address,
|
||||
http_01_resources)
|
||||
# Start client standalone web server.
|
||||
servers.serve_forever()
|
||||
yield servers
|
||||
finally:
|
||||
# Shutdown client web server and unbind from PORT
|
||||
servers.shutdown_and_server_close()
|
||||
|
||||
|
||||
def perform_http01(client_acme, challb, orderr):
|
||||
"""Set up standalone webserver and perform HTTP-01 challenge."""
|
||||
|
||||
response, validation = challb.response_and_validation(client_acme.net.key)
|
||||
|
||||
resource = standalone.HTTP01RequestHandler.HTTP01Resource(
|
||||
chall=challb.chall, response=response, validation=validation)
|
||||
|
||||
with challenge_server({resource}):
|
||||
# Let the CA server know that we are ready for the challenge.
|
||||
client_acme.answer_challenge(challb, response)
|
||||
|
||||
# Wait for challenge status and then issue a certificate.
|
||||
# It is possible to set a deadline time.
|
||||
finalized_orderr = client_acme.poll_and_finalize(orderr)
|
||||
|
||||
return finalized_orderr.fullchain_pem
|
||||
|
||||
|
||||
# Main examples:
|
||||
|
||||
|
||||
def example_http():
|
||||
"""This example executes the whole process of fulfilling a HTTP-01
|
||||
challenge for one specific domain.
|
||||
|
||||
The workflow consists of:
|
||||
(Account creation)
|
||||
- Create account key
|
||||
- Register account and accept TOS
|
||||
(Certificate actions)
|
||||
- Select HTTP-01 within offered challenges by the CA server
|
||||
- Set up http challenge resource
|
||||
- Set up standalone web server
|
||||
- Create domain private key and CSR
|
||||
- Issue certificate
|
||||
- Renew certificate
|
||||
- Revoke certificate
|
||||
(Account update actions)
|
||||
- Change contact information
|
||||
- Deactivate Account
|
||||
|
||||
"""
|
||||
# Create account key
|
||||
|
||||
acc_key = jose.JWKRSA(
|
||||
key=rsa.generate_private_key(public_exponent=65537,
|
||||
key_size=ACC_KEY_BITS,
|
||||
backend=default_backend()))
|
||||
|
||||
# Register account and accept TOS
|
||||
|
||||
net = client.ClientNetwork(acc_key, user_agent=USER_AGENT)
|
||||
directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
|
||||
client_acme = client.ClientV2(directory, net=net)
|
||||
|
||||
# Terms of Service URL is in client_acme.directory.meta.terms_of_service
|
||||
# Registration Resource: regr
|
||||
# Creates account with contact information.
|
||||
email = ('fake@example.com')
|
||||
regr = client_acme.new_account(
|
||||
messages.NewRegistration.from_data(
|
||||
email=email, terms_of_service_agreed=True))
|
||||
|
||||
# Create domain private key and CSR
|
||||
pkey_pem, csr_pem = new_csr_comp(DOMAIN)
|
||||
|
||||
# Issue certificate
|
||||
|
||||
orderr = client_acme.new_order(csr_pem)
|
||||
|
||||
# Select HTTP-01 within offered challenges by the CA server
|
||||
challb = select_http01_chall(orderr)
|
||||
|
||||
# The certificate is ready to be used in the variable "fullchain_pem".
|
||||
fullchain_pem = perform_http01(client_acme, challb, orderr)
|
||||
|
||||
# Renew certificate
|
||||
|
||||
_, csr_pem = new_csr_comp(DOMAIN, pkey_pem)
|
||||
|
||||
orderr = client_acme.new_order(csr_pem)
|
||||
|
||||
challb = select_http01_chall(orderr)
|
||||
|
||||
# Performing challenge
|
||||
fullchain_pem = perform_http01(client_acme, challb, orderr)
|
||||
|
||||
# Revoke certificate
|
||||
|
||||
fullchain_com = jose.ComparableX509(
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, fullchain_pem))
|
||||
|
||||
try:
|
||||
client_acme.revoke(fullchain_com, 0) # revocation reason = 0
|
||||
except errors.ConflictError:
|
||||
# Certificate already revoked.
|
||||
pass
|
||||
|
||||
# Query registration status.
|
||||
client_acme.net.account = regr
|
||||
try:
|
||||
regr = client_acme.query_registration(regr)
|
||||
except errors.Error as err:
|
||||
if err.typ == messages.OLD_ERROR_PREFIX + 'unauthorized' \
|
||||
or err.typ == messages.ERROR_PREFIX + 'unauthorized':
|
||||
# Status is deactivated.
|
||||
pass
|
||||
raise
|
||||
|
||||
# Change contact information
|
||||
|
||||
email = 'newfake@example.com'
|
||||
regr = client_acme.update_registration(
|
||||
regr.update(
|
||||
body=regr.body.update(
|
||||
contact=('mailto:' + email,)
|
||||
)
|
||||
)
|
||||
)
|
||||
|
||||
# Deactivate account/registration
|
||||
|
||||
regr = client_acme.deactivate_registration(regr)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
example_http()
|
||||
@@ -1,10 +1,10 @@
|
||||
# readthedocs.org gives no way to change the install command to "pip
|
||||
# install -e acme[docs]" (that would in turn install documentation
|
||||
# install -e .[docs]" (that would in turn install documentation
|
||||
# dependencies), but it allows to specify a requirements.txt file at
|
||||
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
|
||||
|
||||
# Although ReadTheDocs certainly doesn't need to install the project
|
||||
# in --editable mode (-e), just "pip install acme[docs]" does not work as
|
||||
# expected and "pip install -e acme[docs]" must be used instead
|
||||
# in --editable mode (-e), just "pip install .[docs]" does not work as
|
||||
# expected and "pip install -e .[docs]" must be used instead
|
||||
|
||||
-e acme[docs]
|
||||
|
||||
@@ -1,22 +1,26 @@
|
||||
from setuptools import setup
|
||||
from setuptools import find_packages
|
||||
from setuptools.command.test import test as TestCommand
|
||||
import sys
|
||||
|
||||
from setuptools import find_packages
|
||||
from setuptools import setup
|
||||
|
||||
version = '1.17.0.dev0'
|
||||
version = '0.26.0.dev0'
|
||||
|
||||
# Please update tox.ini when modifying dependency version requirements
|
||||
install_requires = [
|
||||
'cryptography>=2.1.4',
|
||||
# load_pem_private/public_key (>=0.6)
|
||||
# rsa_recover_prime_factors (>=0.8)
|
||||
'cryptography>=0.8',
|
||||
# formerly known as acme.jose:
|
||||
# 1.1.0+ is required to avoid the warnings described at
|
||||
# https://github.com/certbot/josepy/issues/13.
|
||||
'josepy>=1.1.0',
|
||||
'PyOpenSSL>=17.3.0',
|
||||
'josepy>=1.0.0',
|
||||
# Connection.set_tlsext_host_name (>=0.13)
|
||||
'mock',
|
||||
'PyOpenSSL>=0.13',
|
||||
'pyrfc3339',
|
||||
'pytz',
|
||||
'requests>=2.6.0',
|
||||
'requests[security]>=2.4.1', # security extras added in 2.4.1
|
||||
'requests-toolbelt>=0.3.0',
|
||||
'setuptools>=39.0.1',
|
||||
'setuptools',
|
||||
'six>=1.9.0', # needed for python_2_unicode_compatible
|
||||
]
|
||||
|
||||
dev_extras = [
|
||||
@@ -30,25 +34,40 @@ docs_extras = [
|
||||
'sphinx_rtd_theme',
|
||||
]
|
||||
|
||||
class PyTest(TestCommand):
|
||||
user_options = []
|
||||
|
||||
def initialize_options(self):
|
||||
TestCommand.initialize_options(self)
|
||||
self.pytest_args = ''
|
||||
|
||||
def run_tests(self):
|
||||
import shlex
|
||||
# import here, cause outside the eggs aren't loaded
|
||||
import pytest
|
||||
errno = pytest.main(shlex.split(self.pytest_args))
|
||||
sys.exit(errno)
|
||||
|
||||
setup(
|
||||
name='acme',
|
||||
version=version,
|
||||
description='ACME protocol implementation in Python',
|
||||
url='https://github.com/letsencrypt/letsencrypt',
|
||||
author="Certbot Project",
|
||||
author_email='certbot-dev@eff.org',
|
||||
author_email='client-dev@letsencrypt.org',
|
||||
license='Apache License 2.0',
|
||||
python_requires='>=3.6',
|
||||
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
|
||||
classifiers=[
|
||||
'Development Status :: 5 - Production/Stable',
|
||||
'Development Status :: 3 - Alpha',
|
||||
'Intended Audience :: Developers',
|
||||
'License :: OSI Approved :: Apache Software License',
|
||||
'Programming Language :: Python',
|
||||
'Programming Language :: Python :: 2',
|
||||
'Programming Language :: Python :: 2.7',
|
||||
'Programming Language :: Python :: 3',
|
||||
'Programming Language :: Python :: 3.4',
|
||||
'Programming Language :: Python :: 3.5',
|
||||
'Programming Language :: Python :: 3.6',
|
||||
'Programming Language :: Python :: 3.7',
|
||||
'Programming Language :: Python :: 3.8',
|
||||
'Programming Language :: Python :: 3.9',
|
||||
'Topic :: Internet :: WWW/HTTP',
|
||||
'Topic :: Security',
|
||||
],
|
||||
@@ -60,4 +79,7 @@ setup(
|
||||
'dev': dev_extras,
|
||||
'docs': docs_extras,
|
||||
},
|
||||
tests_require=["pytest"],
|
||||
test_suite='acme',
|
||||
cmdclass={"test": PyTest},
|
||||
)
|
||||
|
||||
@@ -1,53 +0,0 @@
|
||||
"""Tests for acme.jose shim."""
|
||||
import importlib
|
||||
import unittest
|
||||
|
||||
|
||||
class JoseTest(unittest.TestCase):
|
||||
"""Tests for acme.jose shim."""
|
||||
|
||||
def _test_it(self, submodule, attribute):
|
||||
if submodule:
|
||||
acme_jose_path = 'acme.jose.' + submodule
|
||||
josepy_path = 'josepy.' + submodule
|
||||
else:
|
||||
acme_jose_path = 'acme.jose'
|
||||
josepy_path = 'josepy'
|
||||
acme_jose_mod = importlib.import_module(acme_jose_path)
|
||||
josepy_mod = importlib.import_module(josepy_path)
|
||||
|
||||
self.assertIs(acme_jose_mod, josepy_mod)
|
||||
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
|
||||
|
||||
# We use the imports below with eval, but pylint doesn't
|
||||
# understand that.
|
||||
import acme # pylint: disable=unused-import
|
||||
import josepy # pylint: disable=unused-import
|
||||
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
|
||||
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
|
||||
self.assertIs(acme_jose_mod, josepy_mod)
|
||||
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
|
||||
|
||||
def test_top_level(self):
|
||||
self._test_it('', 'RS512')
|
||||
|
||||
def test_submodules(self):
|
||||
# This test ensures that the modules in josepy that were
|
||||
# available at the time it was moved into its own package are
|
||||
# available under acme.jose. Backwards compatibility with new
|
||||
# modules or testing code is not maintained.
|
||||
mods_and_attrs = [('b64', 'b64decode',),
|
||||
('errors', 'Error',),
|
||||
('interfaces', 'JSONDeSerializable',),
|
||||
('json_util', 'Field',),
|
||||
('jwa', 'HS256',),
|
||||
('jwk', 'JWK',),
|
||||
('jws', 'JWS',),
|
||||
('util', 'ImmutableMap',),]
|
||||
|
||||
for mod, attr in mods_and_attrs:
|
||||
self._test_it(mod, attr)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,30 +0,0 @@
|
||||
"""Tests for acme.magic_typing."""
|
||||
import sys
|
||||
import unittest
|
||||
import warnings
|
||||
from unittest import mock
|
||||
|
||||
|
||||
class MagicTypingTest(unittest.TestCase):
|
||||
"""Tests for acme.magic_typing."""
|
||||
def test_import_success(self):
|
||||
try:
|
||||
import typing as temp_typing
|
||||
except ImportError: # pragma: no cover
|
||||
temp_typing = None # pragma: no cover
|
||||
typing_class_mock = mock.MagicMock()
|
||||
text_mock = mock.MagicMock()
|
||||
typing_class_mock.Text = text_mock
|
||||
sys.modules['typing'] = typing_class_mock
|
||||
if 'acme.magic_typing' in sys.modules:
|
||||
del sys.modules['acme.magic_typing'] # pragma: no cover
|
||||
with warnings.catch_warnings():
|
||||
warnings.filterwarnings("ignore", category=DeprecationWarning)
|
||||
from acme.magic_typing import Text
|
||||
self.assertEqual(Text, text_mock)
|
||||
del sys.modules['acme.magic_typing']
|
||||
sys.modules['typing'] = temp_typing
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
acme/tests/testdata/rsa1024_cert.pem (vendored, 13 lines)
@@ -1,13 +0,0 @@
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIB/TCCAWagAwIBAgIJAOyRIBs3QT8QMA0GCSqGSIb3DQEBCwUAMBYxFDASBgNV
|
||||
BAMMC2V4YW1wbGUuY29tMB4XDTE4MDQyMzEwMzE0NFoXDTE4MDUyMzEwMzE0NFow
|
||||
FjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJ
|
||||
AoGBAJqJ87R8aVwByONxgQA9hwgvQd/QqI1r1UInXhEF2VnEtZGtUWLi100IpIqr
|
||||
Mq4qusDwNZ3g8cUPtSkvJGs89djoajMDIJP7lQUEKUYnYrI0q755Tr/DgLWSk7iW
|
||||
l5ezym0VzWUD0/xXUz8yRbNMTjTac80rS5SZk2ja2wWkYlRJAgMBAAGjUzBRMB0G
|
||||
A1UdDgQWBBSsaX0IVZ4XXwdeffVAbG7gnxSYjTAfBgNVHSMEGDAWgBSsaX0IVZ4X
|
||||
XwdeffVAbG7gnxSYjTAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4GB
|
||||
ADe7SVmvGH2nkwVfONk8TauRUDkePN1CJZKFb2zW1uO9ANJ2v5Arm/OQp0BG/xnI
|
||||
Djw/aLTNVESF89oe15dkrUErtcaF413MC1Ld5lTCaJLHLGqDKY69e02YwRuxW7jY
|
||||
qarpt7k7aR5FbcfO5r4V/FK/Gvp4Dmoky8uap7SJIW6x
|
||||
-----END CERTIFICATE-----
|
||||
acme/tests/testdata/rsa4096_cert.pem (vendored, 30 lines)
@@ -1,30 +0,0 @@
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIFDTCCAvWgAwIBAgIUImqDrP53V69vFROsjP/gL0YtoA4wDQYJKoZIhvcNAQEL
|
||||
BQAwFjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wHhcNMjAwNTI3MjMyNDE0WhcNMjAw
|
||||
NjI2MjMyNDE0WjAWMRQwEgYDVQQDDAtleGFtcGxlLmNvbTCCAiIwDQYJKoZIhvcN
|
||||
AQEBBQADggIPADCCAgoCggIBANY9LKLk9Dxn0MUMQFHwBoTN4ehDSWBws2KcytpF
|
||||
mc8m9Mfk1wmb4fQSKYtK3wIFMfIyo9HQu0nKqMkkUw52o3ZXyOv+oWwF5qNy2BKu
|
||||
lh5OMSkaZ0o13zoPpW42e+IUnyxvg70+0urD+sUue4cyTHh/nBIUjrM/05ZJ/ac8
|
||||
HR0RK3H41YoqBjq69JjMZczZZhbNFit3s6p0R1TbVAgc3ckqbtX5BDyQMQQCP4Ed
|
||||
m4DgbAFVqdcPUCC5W3F3fmuQiPKHiADzONZnXpy6lUvLDWqcd6loKp+nKHM6OkXX
|
||||
8hmD7pE1PYMQo4hqOfhBR2IgMjAShwd5qUFjl1m2oo0Qm3PFXOk6i2ZQdS6AA/yd
|
||||
B5/mX0RnM2oIdFZPb6UZFSmtEgs9sTzn+hMUyNSZQRE54px1ur1xws2R+vbsCyM5
|
||||
+KoFVxDjVjU9TlZx3GvDvnqz/tbHjji6l8VHZYOBMBUXbKHu2U6pJFZ5Zp7k68/z
|
||||
a3Fb9Pjtn3iRkXEyC0N5kLgqO4QTlExnxebV8aMvQpWd/qefnMn9qPYIZPEXSQAR
|
||||
mEBIahkcACb60s+acG0WFFluwBPtBqEr8Q67XlSF0Ibf4iBiRzpPobhlWta1nrFg
|
||||
4IWHMSoZ0PE75bhIGBEkhrpcXQCAxXmAfxfjKDH7jdJ1fRdnZ/9+OzwYGVX5GH/l
|
||||
0QDtAgMBAAGjUzBRMB0GA1UdDgQWBBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAfBgNV
|
||||
HSMEGDAWgBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAPBgNVHRMBAf8EBTADAQH/MA0G
|
||||
CSqGSIb3DQEBCwUAA4ICAQAELoXz31oR9pdAwidlv9ZBOKiC7KBWy8VMqXNVkfTn
|
||||
bVRxAUex7zleLFIOkWnqadsMesU9sIwrbLzBcZ8Q/vBY+z2xOPdXcgcAoAmdKWoq
|
||||
YBQNiqng9r54sqlzB/77QZCf5fdktESe7NTxhCifgx5SAWq7IUQs/lm3tnMUSAfE
|
||||
5ctuN6M+w8K54y3WDprcfMHpnc3ZHeSPhVQApHM0h/bDvXq0bRS7kmq27Hb153Qm
|
||||
nH3TwYB5pPSWW38NbUc+s/a7mItO7S8ly8yGbA0j9c/IbN5lM+OCdk06asz3+c8E
|
||||
uo8nuCBoYO5+6AqC2N7WJ3Tdr/pFA8jTbd6VNVlgCWTIR8ZosL5Fgkfv+4fUBrHt
|
||||
zdVUqMUzvga5rvZnwnJ5Qfu/drHeAAo9MTNFQNe2QgDlYfWBh5GweolgmFSwrpkY
|
||||
v/5wLtIyv/ASHKswybbqMIlpttcLTXjx5yuh8swttT6Wh+FQqqQ32KSRB3StiwyK
|
||||
oH0ZhrwYHiFYNlPxecGX6XUta6rFtTlEdkBGSnXzgiTzL2l+Nc0as0V5B9RninZG
|
||||
qJ+VOChSQ0OFvg1riSXv7tMvbLdGQnxwTRL3t6BMS8I4LA2m3ZfWUcuXT783ODTH
|
||||
16f1Q1AgXd2csstTWO9cv+N/0fpX31nqrm6+CrGduSr2u4HjYYnlLIUhmdTvK3fX
|
||||
Fg==
|
||||
-----END CERTIFICATE-----
|
||||
acme/tests/testdata/rsa4096_key.pem (vendored, 51 lines)
@@ -1,51 +0,0 @@
|
||||
-----BEGIN RSA PRIVATE KEY-----
|
||||
MIIJKgIBAAKCAgEA1j0souT0PGfQxQxAUfAGhM3h6ENJYHCzYpzK2kWZzyb0x+TX
|
||||
CZvh9BIpi0rfAgUx8jKj0dC7ScqoySRTDnajdlfI6/6hbAXmo3LYEq6WHk4xKRpn
|
||||
SjXfOg+lbjZ74hSfLG+DvT7S6sP6xS57hzJMeH+cEhSOsz/Tlkn9pzwdHRErcfjV
|
||||
iioGOrr0mMxlzNlmFs0WK3ezqnRHVNtUCBzdySpu1fkEPJAxBAI/gR2bgOBsAVWp
|
||||
1w9QILlbcXd+a5CI8oeIAPM41mdenLqVS8sNapx3qWgqn6coczo6RdfyGYPukTU9
|
||||
gxCjiGo5+EFHYiAyMBKHB3mpQWOXWbaijRCbc8Vc6TqLZlB1LoAD/J0Hn+ZfRGcz
|
||||
agh0Vk9vpRkVKa0SCz2xPOf6ExTI1JlBETninHW6vXHCzZH69uwLIzn4qgVXEONW
|
||||
NT1OVnHca8O+erP+1seOOLqXxUdlg4EwFRdsoe7ZTqkkVnlmnuTrz/NrcVv0+O2f
|
||||
eJGRcTILQ3mQuCo7hBOUTGfF5tXxoy9ClZ3+p5+cyf2o9ghk8RdJABGYQEhqGRwA
|
||||
JvrSz5pwbRYUWW7AE+0GoSvxDrteVIXQht/iIGJHOk+huGVa1rWesWDghYcxKhnQ
|
||||
8TvluEgYESSGulxdAIDFeYB/F+MoMfuN0nV9F2dn/347PBgZVfkYf+XRAO0CAwEA
|
||||
AQKCAgEA0hZdTkQtCYtYm9LexDsXeWYX8VcCfrMmBj7xYcg9A3oVMmzDPuYBVwH0
|
||||
gWbjd6y2hOaJ5TfGYZ99kvmvBRDsTSHaoyopC7BhssjtAKz6Ay/0X3VH8usPQ3WS
|
||||
aZi+NT65tK6KRqtz08ppgLGLa1G00bl5x/Um1rpxeACI4FU/y4BJ1VMJvJpnT3KE
|
||||
Z86Qyagqx5NH+UpCApZSWPFX3zjHePzGgcfXErjniCHYOnpZQrFQ2KIzkfSvQ9fg
|
||||
x01ByKOM2CB2C1B33TCzBAioXRH6zyAu7A59NeCK9ywTduhDvie1a+oEryFC7IQW
|
||||
4s7I/H3MGX4hsf/pLXlHMy+5CZJOjRaC2h+pypfbbcuiXu6Sn64kHNpiI7SxI5DI
|
||||
MIRjyG7MdUcrzq0Rt8ogwwpbCoRqrl/w3bhxtqmeZaEZtyxbjlm7reK2YkIFDgyz
|
||||
JMqiJK5ZAi+9L/8c0xhjjAQQ0sIzrjmjA8U+6YnWL9jU5qXTVnBB8XQucyeeZGgk
|
||||
yRHyMur71qOXN8z3UEva7MHkDTUBlj8DgTz6sEjqCipaWl0CXfDNa4IhHIXD5qiF
|
||||
wplhq7OeS0v6EGG/UFa3Q/lFntxtrayxJX7uvvSccGzjPKXTjpWUELLi/FdnIsum
|
||||
eXT3RgIEYozj4BibDXaBLfHTCVzxOr7AAEvKM9XWSUgLA0paSWECggEBAO9ZBeE1
|
||||
GWzd1ejTTkcxBC9AK2rNsYG8PdNqiof/iTbuJWNeRqpG+KB/0CNIpjZ2X5xZd0tM
|
||||
FDpHTFehlP26Roxuq50iRAFc+SN5KoiO0A3JuJAidreIgRTia1saUUrypHqWrYEA
|
||||
VZVj2AI8Pyg3s1OkR2frFskY7hXBVb/pJNDP/m9xTXXIYiIXYkHYe+4RIJCnAxRv
|
||||
q5YHKaX+0Ull9YCZJCxmwvcHat8sgu8qkiwUMEM6QSNEkrEbdnWYBABvC1AR6sws
|
||||
7MP1h9+j22n4Zc/3D6kpFZEL9Erx8nNyhbOZ6q2Tdnf6YKVVjZdyVa8VyNnR0ROl
|
||||
3BjkFaHb/bg4e4kCggEBAOUk8ZJS3qBeGCOjug384zbHGcnhUBYtYJiOz+RXBtP+
|
||||
PRksbFtTkgk1sHuSGO8YRddU4Qv7Av1xL8o+DEsLBSD0YQ7pmLrR/LK+iDQ5N63O
|
||||
Fve9uJH0ybxAOkiua7G24+lTsVUP//KWToL4Wh5zbHBBjL5D2Z9zoeVbcE87xhva
|
||||
lImMVr4Ex252DqNP9wkZxBjudFyJ/C/TnXrjPcgwhxWTC7sLQMhE5p+490G7c4hX
|
||||
PywkIKrANbu37KDiAvVS+dC66ZgpL/NUDkeloAmGNO08LGzbV6YKchlvDyWU/AvW
|
||||
0hYjbL0FUq7K/wp1G9fumolB+fbI25K9c13X93STzUUCggEBAJDsNFUyk5yJjbYW
|
||||
C/WrRj9d+WwH9Az77+uNPSgvn+O0usq6EMuVgYGdImfa21lqv2Wp/kOHY1AOT7lX
|
||||
yyD+oyzw7dSNJOQ2aVwDR6+72Vof5DLRy1RBwPbmSd61xrc8yD658YCEtU1pUSe5
|
||||
VvyBDYH9nIbdn8RP5gkiMUusXXBaIFNWJXLFzDWcNxBrhk6V7EPp/EFphFmpKJyr
|
||||
+AkbRVWCZJbF+hMdWKadCwLJogwyhS6PnVU/dhrq6AU38GRa2Fy5HJRYN1xH1Oej
|
||||
DX3Su8L6c28Xw0k6FcczTHx+wVoIPkKvYTIwVkiFzt/+iMckx6KsGo5tBSHFKRwC
|
||||
WlQrTxECggEBALjUruLnY1oZ7AC7bTUhOimSOfQEgTQSUCtebsRxijlvhtsKYTDd
|
||||
XRt+qidStjgN7S/+8DRYuZWzOeg5WnMhpXZqiOudcyume922IGl3ibjxVsdoyjs5
|
||||
J4xohlrgDlBgBMDNWGoTqNGFejjcmNydH+gAh8VlN2INxJYbxqCyx17qVgwJHmLR
|
||||
uggYxD/pHYvCs9GkbknCp5/wYsOgDtKuihfV741lS1D/esN1UEQ+LrfYIEW7snno
|
||||
5q7Pcdhn1hkKYCWEzy2Ec4Aj2gzixQ9JqOF/OxpnZvCw1k47rg0TeqcWFYnz8x8Y
|
||||
7xO8/DH0OoxXk2GJzVXJuItJs4gLzzfCjL0CggEAJFHfC9jisdy7CoWiOpNCSF1B
|
||||
S0/CWDz77cZdlWkpTdaXGGp1MA/UKUFPIH8sOHfvpKS660+X4G/1ZBHmFb4P5kFF
|
||||
Qy8UyUMKtSOEdZS6KFlRlfSCAMd5aSTmCvq4OSjYEpMRwUhU/iEJNkn9Z1Soehe0
|
||||
U3dxJ8KiT1071geO6rRquSHoSJs6Y0WQKriYYQJOhh4Axs3PQihER2eyh+WGk8YJ
|
||||
02m0mMsjntqnXtdc6IcdKaHp9ko+OpM9QZLsvt19fxBcrXj/i21uUXrzuNtKfO6M
|
||||
JqGhsOrO2dh8lMhvodENvgKA0DmYDC9N7ogo7bxTNSedcjBF46FhJoqii8m70Q==
|
||||
-----END RSA PRIVATE KEY-----
|
||||
@@ -1,7 +1,7 @@
|
||||
include LICENSE.txt
|
||||
include README.rst
|
||||
recursive-include tests *
|
||||
recursive-include certbot_apache/_internal/augeas_lens *.aug
|
||||
recursive-include certbot_apache/_internal/tls_configs *.conf
|
||||
global-exclude __pycache__
|
||||
global-exclude *.py[cod]
|
||||
recursive-include docs *
|
||||
recursive-include certbot_apache/tests/testdata *
|
||||
include certbot_apache/centos-options-ssl-apache.conf
|
||||
include certbot_apache/options-ssl-apache.conf
|
||||
recursive-include certbot_apache/augeas_lens *.aug
|
||||
|
||||
@@ -1 +0,0 @@
|
||||
"""Certbot Apache plugin."""
|
||||
@@ -1,257 +0,0 @@
|
||||
""" Utility functions for certbot-apache plugin """
|
||||
import binascii
|
||||
import fnmatch
|
||||
import logging
|
||||
import re
|
||||
import subprocess
|
||||
|
||||
import pkg_resources
|
||||
|
||||
from certbot import errors
|
||||
from certbot import util
|
||||
from certbot.compat import os
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def get_mod_deps(mod_name):
|
||||
"""Get known module dependencies.
|
||||
|
||||
.. note:: This does not need to be accurate in order for the client to
|
||||
run. This simply keeps things clean if the user decides to revert
|
||||
changes.
|
||||
.. warning:: If all deps are not included, it may cause incorrect parsing
|
||||
behavior, due to enable_mod's shortcut for updating the parser's
|
||||
currently defined modules (`.ApacheParser.add_mod`)
|
||||
This would only present a major problem in extremely atypical
|
||||
configs that use ifmod for the missing deps.
|
||||
|
||||
"""
|
||||
deps = {
|
||||
"ssl": ["setenvif", "mime"]
|
||||
}
|
||||
return deps.get(mod_name, [])
|
||||
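As a quick illustration of the dependency table above (the second call is hypothetical; any module without an entry gets an empty list):

```python
def get_mod_deps(mod_name):
    deps = {"ssl": ["setenvif", "mime"]}
    return deps.get(mod_name, [])


assert get_mod_deps("ssl") == ["setenvif", "mime"]
assert get_mod_deps("rewrite") == []  # no recorded dependencies
```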
|
||||
|
||||
def get_file_path(vhost_path):
|
||||
"""Get file path from augeas_vhost_path.
|
||||
|
||||
Takes in Augeas path and returns the file name
|
||||
|
||||
:param str vhost_path: Augeas virtual host path
|
||||
|
||||
:returns: filename of vhost
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
if not vhost_path or not vhost_path.startswith("/files/"):
|
||||
return None
|
||||
|
||||
return _split_aug_path(vhost_path)[0]
|
||||
|
||||
|
||||
def get_internal_aug_path(vhost_path):
|
||||
"""Get the Augeas path for a vhost with the file path removed.
|
||||
|
||||
:param str vhost_path: Augeas virtual host path
|
||||
|
||||
:returns: Augeas path to vhost relative to the containing file
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
return _split_aug_path(vhost_path)[1]
|
||||
|
||||
|
||||
def _split_aug_path(vhost_path):
|
||||
"""Splits an Augeas path into a file path and an internal path.
|
||||
|
||||
After removing "/files", this function splits vhost_path into the
|
||||
file path and the remaining Augeas path.
|
||||
|
||||
:param str vhost_path: Augeas virtual host path
|
||||
|
||||
:returns: file path and internal Augeas path
|
||||
:rtype: `tuple` of `str`
|
||||
|
||||
"""
|
||||
# Strip off /files
|
||||
file_path = vhost_path[6:]
|
||||
internal_path = []
|
||||
|
||||
# Remove components from the end of file_path until it becomes valid
|
||||
while not os.path.exists(file_path):
|
||||
file_path, _, internal_path_part = file_path.rpartition("/")
|
||||
internal_path.append(internal_path_part)
|
||||
|
||||
return file_path, "/".join(reversed(internal_path))
|
||||
|
||||
|
||||
def parse_define_file(filepath, varname):
|
||||
""" Parses Defines from a variable in configuration file
|
||||
|
||||
:param str filepath: Path of file to parse
|
||||
:param str varname: Name of the variable
|
||||
|
||||
:returns: Dict of Define:Value pairs
|
||||
:rtype: `dict`
|
||||
|
||||
"""
|
||||
return_vars = {}
|
||||
# Get list of words in the variable
|
||||
a_opts = util.get_var_from_file(varname, filepath).split()
|
||||
for i, v in enumerate(a_opts):
|
||||
# Handle Define statements and make sure it has an argument
|
||||
if v == "-D" and len(a_opts) >= i+2:
|
||||
var_parts = a_opts[i+1].partition("=")
|
||||
return_vars[var_parts[0]] = var_parts[2]
|
||||
elif len(v) > 2 and v.startswith("-D"):
|
||||
# Found var with no whitespace separator
|
||||
var_parts = v[2:].partition("=")
|
||||
return_vars[var_parts[0]] = var_parts[2]
|
||||
return return_vars
|
||||
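`parse_define_file` accepts both the separated `-D NAME=value` form and the fused `-DNAME=value` form. A standalone sketch of the same parsing, applied to a raw options string instead of a file (the sample string is made up; in the real function it comes from `util.get_var_from_file`):

```python
def parse_defines_from_options(options):
    """Apply the same -D parsing as parse_define_file to a raw string."""
    return_vars = {}
    a_opts = options.split()
    for i, v in enumerate(a_opts):
        if v == "-D" and len(a_opts) >= i + 2:       # "-D NAME=value"
            name, _, value = a_opts[i + 1].partition("=")
            return_vars[name] = value
        elif len(v) > 2 and v.startswith("-D"):      # "-DNAME=value"
            name, _, value = v[2:].partition("=")
            return_vars[name] = value
    return return_vars


assert parse_defines_from_options("-D SSL=1 -DDUMP_RUN_CFG") == {"SSL": "1", "DUMP_RUN_CFG": ""}
```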
|
||||
|
||||
def unique_id():
|
||||
""" Returns an unique id to be used as a VirtualHost identifier"""
|
||||
return binascii.hexlify(os.urandom(16)).decode("utf-8")
|
||||
|
||||
|
||||
def included_in_paths(filepath, paths):
|
||||
"""
|
||||
Returns true if the filepath is included in the list of paths
|
||||
that may contain full paths or wildcard paths that need to be
|
||||
expanded.
|
||||
|
||||
:param str filepath: Filepath to check
|
||||
:params list paths: List of paths to check against
|
||||
|
||||
:returns: True if included
|
||||
:rtype: bool
|
||||
"""
|
||||
|
||||
return any(fnmatch.fnmatch(filepath, path) for path in paths)
|
||||
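Because `fnmatch` patterns treat `*` as matching any characters (including path separators), wildcard include paths work as expected; for example (the paths are hypothetical):

```python
import fnmatch


def included_in_paths(filepath, paths):
    return any(fnmatch.fnmatch(filepath, path) for path in paths)


assert included_in_paths("/etc/apache2/sites-enabled/foo.conf",
                         ["/etc/apache2/sites-enabled/*.conf"])
```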
|
||||
|
||||
def parse_defines(apachectl):
|
||||
"""
|
||||
Gets Defines from httpd process and returns a dictionary of
|
||||
the defined variables.
|
||||
|
||||
:param str apachectl: Path to apachectl executable
|
||||
|
||||
:returns: dictionary of defined variables
|
||||
:rtype: dict
|
||||
"""
|
||||
|
||||
variables = {}
|
||||
define_cmd = [apachectl, "-t", "-D",
|
||||
"DUMP_RUN_CFG"]
|
||||
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
|
||||
try:
|
||||
matches.remove("DUMP_RUN_CFG")
|
||||
except ValueError:
|
||||
return {}
|
||||
|
||||
for match in matches:
|
||||
if match.count("=") > 1:
|
||||
logger.error("Unexpected number of equal signs in "
|
||||
"runtime config dump.")
|
||||
raise errors.PluginError(
|
||||
"Error parsing Apache runtime variables")
|
||||
parts = match.partition("=")
|
||||
variables[parts[0]] = parts[2]
|
||||
|
||||
return variables
|
||||
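To make the regexp used above concrete, here is what `Define: ([^ \n]*)` extracts from typical `apachectl -t -D DUMP_RUN_CFG` output (the sample output is made up but follows Apache's usual format):

```python
import re

sample = 'ServerRoot: "/etc/apache2"\nDefine: DUMP_RUN_CFG\nDefine: MODSEC=on\n'

matches = re.compile(r"Define: ([^ \n]*)").findall(sample)
assert matches == ["DUMP_RUN_CFG", "MODSEC=on"]
# parse_defines() then drops DUMP_RUN_CFG and splits "MODSEC=on" into {"MODSEC": "on"}.
```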
|
||||
|
||||
def parse_includes(apachectl):
|
||||
"""
|
||||
Gets Include directives from httpd process and returns a list of
|
||||
their values.
|
||||
|
||||
:param str apachectl: Path to apachectl executable
|
||||
|
||||
:returns: list of found Include directive values
|
||||
:rtype: list of str
|
||||
"""
|
||||
|
||||
inc_cmd = [apachectl, "-t", "-D",
|
||||
"DUMP_INCLUDES"]
|
||||
return parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
|
||||
|
||||
|
||||
def parse_modules(apachectl):
|
||||
"""
|
||||
Get loaded modules from httpd process, and return the list
|
||||
of loaded module names.
|
||||
|
||||
:param str apachectl: Path to apachectl executable
|
||||
|
||||
:returns: list of found LoadModule module names
|
||||
:rtype: list of str
|
||||
"""
|
||||
|
||||
mod_cmd = [apachectl, "-t", "-D",
|
||||
"DUMP_MODULES"]
|
||||
return parse_from_subprocess(mod_cmd, r"(.*)_module")
|
||||
|
||||
|
||||
def parse_from_subprocess(command, regexp):
|
||||
"""Get values from stdout of subprocess command
|
||||
|
||||
:param list command: Command to run
|
||||
:param str regexp: Regexp for parsing
|
||||
|
||||
:returns: list parsed from command output
|
||||
:rtype: list
|
||||
|
||||
"""
|
||||
stdout = _get_runtime_cfg(command)
|
||||
return re.compile(regexp).findall(stdout)
|
||||
|
||||
|
||||
def _get_runtime_cfg(command):
|
||||
"""
|
||||
Get runtime configuration info.
|
||||
|
||||
:param command: Command to run
|
||||
|
||||
:returns: stdout from command
|
||||
|
||||
"""
|
||||
try:
|
||||
proc = subprocess.run(
|
||||
command,
|
||||
stdout=subprocess.PIPE,
|
||||
stderr=subprocess.PIPE,
|
||||
universal_newlines=True,
|
||||
check=False,
|
||||
env=util.env_no_snap_for_external_calls())
|
||||
stdout, stderr = proc.stdout, proc.stderr
|
||||
|
||||
except (OSError, ValueError):
|
||||
logger.error(
|
||||
"Error running command %s for runtime parameters!%s",
|
||||
command, os.linesep)
|
||||
raise errors.MisconfigurationError(
|
||||
"Error accessing loaded Apache parameters: {0}".format(
|
||||
command))
|
||||
# Small errors that do not impede
|
||||
if proc.returncode != 0:
|
||||
logger.warning("Error in checking parameter list: %s", stderr)
|
||||
raise errors.MisconfigurationError(
|
||||
"Apache is unable to check whether or not the module is "
|
||||
"loaded because Apache is misconfigured.")
|
||||
|
||||
return stdout
|
||||
|
||||
def find_ssl_apache_conf(prefix):
|
||||
"""
|
||||
Find a TLS Apache config file in the dedicated storage.
|
||||
:param str prefix: prefix of the TLS Apache config file to find
|
||||
:return: the path the TLS Apache config file
|
||||
:rtype: str
|
||||
"""
|
||||
return pkg_resources.resource_filename(
|
||||
"certbot_apache",
|
||||
os.path.join("_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))
|
||||
Some files were not shown because too many files have changed in this diff.