Compare commits

..

1 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Erica Portnoy | dcf6c64464 | Address erikrose's comments on #5329 | 2018-01-09 16:17:49 -08:00 |
1147 changed files with 24252 additions and 51488 deletions


@@ -1,119 +0,0 @@
# Configuring Azure Pipelines with Certbot
All pipelines are defined in `.azure-pipelines`. Currently there are two:
* `.azure-pipelines/main.yml` is the main pipeline, executed on PRs targeting master and on pushes to master,
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline, and is executed for `test-*` branches, for release branches, and nightly for master.
Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common job configuration that can be reused in several pipelines, as sketched below.
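For illustration, a pipeline consumes such a template roughly as follows. This is a minimal sketch only: the file name `example-stage.yml` is hypothetical, while the real templates live under `.azure-pipelines/templates`.
```
# Hypothetical minimal pipeline reusing a shared stage template.
# A stage template file defines a top-level `stages:` list of its own.
trigger:
  - master
stages:
  - template: templates/stages/example-stage.yml
```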
Unlike Travis, where CodeCov works without any action required, CodeCov supports Azure Pipelines
(through the coverage-bash utility, not python-coverage for now) only if you provide the CodeCov repository token
via the `CODECOV_TOKEN` environment variable. So `CODECOV_TOKEN` needs to be set as a secret
variable to allow the main pipeline to publish coverage reports to CodeCov (see _Add a secret variable to a pipeline_ below).
This INSTALL.md file explains how to configure Azure Pipelines with Certbot in order to execute the CI/CD logic defined in the `.azure-pipelines` folder.
During these installation steps, warnings describing user access and legal commitments will be displayed like this:
```
!!! ACCESS REQUIRED !!!
```
This document assumes that the Azure DevOps organization is named _certbot_, and that the Azure DevOps project is also named _certbot_.
## Useful links
* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops
## Prerequisites
### Having a GitHub account
Use your regular GitHub user account or, if relevant, an account that has administrative rights on the GitHub organization.
### Having an Azure DevOps account
- Go to https://dev.azure.com/, click "Start free with GitHub"
- Log in to GitHub
```
!!! ACCESS REQUIRED !!!
Personal user data (email + profile info, read-only)
```
- Microsoft will create a Live account using the email address associated with the GitHub account. This account is also linked to the GitHub account (meaning you can log in to it using GitHub authentication)
- Proceed with the account registration (birth date, country), and add your name and contact email
```
!!! ACCESS REQUIRED !!!
Microsoft offers to send commercial content to this email address
Azure DevOps terms of service need to be accepted
```
_Once logged in to Azure DevOps, the account is ready._
### Installing Azure Pipelines on GitHub
- On GitHub, go to the Marketplace
- Select Azure Pipelines, and click "Set up a plan"
- Select Free, then "Install it for free"
- Click "Complete order and begin installation"
```
!!! ACCESS REQUIRED !!!
Azure Pipelines needs RW access on code, RO on metadata, and RW on checks, commit statuses, deployments, issues, and pull requests.
RW access is required to allow updating the pipeline YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines are triggered.
Note however that no admin access is granted here: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the related security settings on GitHub.
Access can be granted for all repositories or only selected ones, which is nice.
```
- Once redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (name it the same as the targeted GitHub repo)
- Set the visibility to public, to benefit from 10 parallel jobs
```
!!! ACCESS REQUIRED !!!
Azure Pipelines needs access to the GitHub account (to be able to check that it is valid), and to the resources shared between the GitHub account and Azure Pipelines.
```
_Done. We can move on to pipeline configuration._
## Import an existing pipeline from the `.azure-pipelines` folder
- On Azure DevOps, go to your organization (e.g. _certbot_), then your project (e.g. _certbot_)
- Click "Pipelines" tab
- Click "New pipeline"
- In "Where is your code?", select "__Use the classic editor__"
__Warning: Do not choose the GitHub option in the "Where is your code?" section. This option triggers an OAuth
permission grant from Azure Pipelines to GitHub in order to set up a GitHub OAuth application. The permissions requested
that way are far too broad (admin level on almost everything), while the classic approach does not add any more
permissions, and works perfectly well.__
- Select GitHub in the "Select your repository" section, choose certbot/certbot as the Repository and master as the default branch.
- Click the YAML option for "Select a template"
- Choose a name for the pipeline (e.g. test-pipeline), and browse to the actual pipeline YAML definition in the
"YAML file path" input (e.g. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click the "Save and run" button.
_Done. The pipeline is operational. Repeat these steps to add more pipelines from the existing YAML files in `.azure-pipelines`._
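For reference, a minimal YAML definition for the hypothetical `.azure-pipelines/test-pipeline.yml` used in the example above could look like the following sketch (it is not one of the real Certbot pipelines):
```
# Hypothetical .azure-pipelines/test-pipeline.yml, used only for this example.
trigger:
  - master
pr: none
jobs:
  - job: smoke_test
    pool:
      vmImage: ubuntu-latest
    steps:
      - script: python --version
        displayName: Print the Python version
```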
## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)
__NB: The following steps assume that you have already set up the YAML pipeline file to
consume, as an environment variable, the secret variable that these steps will create.
For an environment variable named `CODECOV_TOKEN` consuming the secret variable `codecov_token`,
this setup would take the following form in the YAML file:__
```
steps:
- script: ./do_something_that_consumes_CODECOV_TOKEN # Eg. `codecov -F windows`
env:
CODECOV_TOKEN: $(codecov_token)
```
To set up a variable that is shared between pipelines, follow the instructions
at
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
When adding variables to a group, don't forget to tick "Keep this value secret"
if the value shouldn't be shared publicly.
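Assuming the variable group is named `certbot-common` and contains a secret variable `codecov_token` (names that appear elsewhere in this repository; the step below is only a sketch), a pipeline references the group and maps the secret into a step's environment like this:
```
variables:
  - group: certbot-common  # variable group shared between pipelines
jobs:
  - job: coverage
    pool:
      vmImage: ubuntu-latest
    steps:
      # Secret variables are never exposed to scripts automatically;
      # they must be mapped explicitly through `env`.
      - script: ./upload_coverage.sh  # hypothetical script that reads CODECOV_TOKEN
        env:
          CODECOV_TOKEN: $(codecov_token)
```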


@@ -1,14 +0,0 @@
# Advanced pipeline for running our full test suite on demand.
trigger:
# When changing these triggers, please ensure the documentation under
# "Running tests in CI" is still correct.
- test-*
pr: none
variables:
# We don't publish our Docker images in this pipeline, but when building them
# for testing, let's use the nightly tag.
dockerTag: nightly
stages:
- template: templates/stages/test-and-package-stage.yml


@@ -1,8 +0,0 @@
trigger: none
pr:
- master
- '*.x'
jobs:
- template: templates/jobs/standard-tests-jobs.yml


@@ -1,18 +0,0 @@
# Nightly pipeline running each day for master.
trigger: none
pr: none
schedules:
- cron: "30 4 * * *"
displayName: Nightly build
branches:
include:
- master
always: true
variables:
dockerTag: nightly
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml


@@ -1,18 +0,0 @@
# Release pipeline to run our full test suite, build artifacts, and deploy them
# for GitHub release tags.
trigger:
tags:
include:
- v*
pr: none
variables:
dockerTag: ${{variables['Build.SourceBranchName']}}
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/deploy-stage.yml
parameters:
snapReleaseChannel: beta
- template: templates/stages/notify-failure-stage.yml


@@ -1,98 +0,0 @@
jobs:
- job: extended_test
variables:
- name: IMAGE_NAME
value: ubuntu-18.04
- name: PYTHON_VERSION
value: 3.9
- group: certbot-common
strategy:
matrix:
linux-py36:
PYTHON_VERSION: 3.6
TOXENV: py36
linux-py37:
PYTHON_VERSION: 3.7
TOXENV: py37
linux-py38:
PYTHON_VERSION: 3.8
TOXENV: py38
linux-py37-nopin:
PYTHON_VERSION: 3.7
TOXENV: py37
CERTBOT_NO_PIN: 1
linux-boulder-v1-integration-certbot-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-certbot-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-integration-nginx-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-nginx-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py39-integration:
PYTHON_VERSION: 3.9
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py39-integration:
PYTHON_VERSION: 3.9
TOXENV: integration
ACME_SERVER: boulder-v2
nginx-compat:
TOXENV: nginx_compat
linux-integration-rfc2136:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.8
TOXENV: integration-dns-rfc2136
docker-dev:
TOXENV: docker_dev
macos-farmtest-apache2:
# We run one of these test farm tests on macOS to help ensure the
# tests continue to work on the platform.
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.8
TOXENV: test-farm-apache2
farmtest-leauto-upgrades:
PYTHON_VERSION: 3.7
TOXENV: test-farm-leauto-upgrades
farmtest-certonly-standalone:
PYTHON_VERSION: 3.7
TOXENV: test-farm-certonly-standalone
farmtest-sdists:
PYTHON_VERSION: 3.7
TOXENV: test-farm-sdists
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml


@@ -1,217 +0,0 @@
jobs:
- job: docker_build
pool:
vmImage: ubuntu-18.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
# Do not run the heavy non-amd64 builds for test branches
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
steps:
- bash: set -e && tools/docker/build.sh $(dockerTag) $DOCKER_ARCH
displayName: Build the Docker images
# We don't filter for the Docker Hub organization to continue to allow
# easy testing of these scripts on forks.
- bash: |
set -e
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}')
docker save --output images.tar $DOCKER_IMAGES
displayName: Save the Docker images
# If the name of the tar file or artifact changes, the deploy stage will
# also need to be updated.
- bash: set -e && mv images.tar $(Build.ArtifactStagingDirectory)
displayName: Prepare Docker artifact
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: docker_$(DOCKER_ARCH)
displayName: Store Docker artifact
- job: docker_run
dependsOn: docker_build
pool:
vmImage: ubuntu-18.04
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_amd64
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- bash: |
set -ex
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}:{{.Tag}}')
for DOCKER_IMAGE in ${DOCKER_IMAGES}
do docker run --rm "${DOCKER_IMAGE}" plugins --prepare
done
displayName: Run integration tests for Docker images
- job: installer_build
pool:
vmImage: vs2017-win2016
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
architecture: x86
addToPath: true
- script: python windows-installer/construct.py
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: windows-installer
displayName: Publish Windows installer
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
win2016:
imageName: vs2017-win2016
pool:
vmImage: $(imageName)
steps:
- powershell: |
if ($PSVersionTable.PSVersion.Major -ne 5) {
throw "Powershell version is not 5.x"
}
condition: eq(variables['imageName'], 'vs2017-win2016')
displayName: Check Powershell 5.x is used in vs2017-win2016
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
- script: |
python -m venv venv
venv\Scripts\python tools\pipstrap.py
venv\Scripts\python tools\pip_install.py -e certbot-ci
env:
PIP_NO_BUILD_ISOLATION: no
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe
displayName: Run windows installer integration tests
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run certbot integration tests
- job: snaps_build
pool:
vmImage: ubuntu-18.04
timeoutInMinutes: 0
variables:
# Do not run the heavy non-amd64 builds for test branches
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
ARCHS: amd64 arm64 armhf
${{ if startsWith(variables['Build.SourceBranchName'], 'test-') }}:
ARCHS: amd64
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadSecureFile@1
name: credentials
inputs:
secureFile: launchpad-credentials
- script: |
set -e
git config --global user.email "$(Build.RequestedForEmail)"
git config --global user.name "$(Build.RequestedFor)"
mkdir -p ~/.local/share/snapcraft/provider/launchpad
cp $(credentials.secureFilePath) ~/.local/share/snapcraft/provider/launchpad/credentials
python3 tools/snap/build_remote.py ALL --archs ${ARCHS} --timeout 19800
displayName: Build snaps
- script: |
set -e
mv *.snap $(Build.ArtifactStagingDirectory)
mv certbot-dns-*/*.snap $(Build.ArtifactStagingDirectory)
displayName: Prepare artifacts
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: snaps
displayName: Store snaps artifacts
- job: snap_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-18.04
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends nginx-light snapd
python3 -m venv venv
venv/bin/python tools/pipstrap.py
venv/bin/python tools/pip_install.py -U tox
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- script: |
set -e
sudo snap install --dangerous --classic snap/certbot_*_amd64.snap
displayName: Install Certbot snap
- script: |
set -e
venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
displayName: Run tox
- job: snap_dns_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-18.04
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- script: |
set -e
python3 -m venv venv
venv/bin/python tools/pipstrap.py
venv/bin/python tools/pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |
set -e
sudo -E venv/bin/pytest certbot-ci/snap_integration_tests/dns_tests --allow-persistent-changes --snap-folder $(Build.SourcesDirectory)/snap --snap-arch amd64
displayName: Test DNS plugins snaps


@@ -1,78 +0,0 @@
jobs:
- job: test
variables:
PYTHON_VERSION: 3.9
strategy:
matrix:
macos-py36:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.6
TOXENV: py36
macos-py39:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.9
TOXENV: py39
windows-py36:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.6
TOXENV: py36
windows-py38-cover:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.8
TOXENV: py38-cover
windows-integration-certbot:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.8
TOXENV: integration-certbot
linux-oldest-tests-1:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: '{acme,apache,apache-v2,certbot}-oldest'
linux-oldest-tests-2:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: '{dns,nginx}-oldest'
linux-py36:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: py36
linux-py39-cover:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.9
TOXENV: py39-cover
linux-py37-lint:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.7
TOXENV: lint
linux-py36-mypy:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: mypy
linux-integration:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: pebble
apache-compat:
IMAGE_NAME: ubuntu-18.04
TOXENV: apache_compat
le-modification:
IMAGE_NAME: ubuntu-18.04
TOXENV: modification
apacheconftest:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: apacheconftest-with-pebble
nginxroundtrip:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: nginxroundtrip
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml
- job: test_sphinx_builds
pool:
vmImage: ubuntu-latest
steps:
- template: ../steps/sphinx-steps.yml


@@ -1,19 +0,0 @@
stages:
- stage: Changelog
jobs:
- job: prepare
pool:
vmImage: vs2017-win2016
steps:
# If we change the output filename from `release_notes.md`, it should also be changed in tools/create_github_release.py
- bash: |
set -e
CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
displayName: Prepare changelog
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: changelog
displayName: Publish changelog


@@ -1,99 +0,0 @@
parameters:
- name: snapReleaseChannel
type: string
default: edge
values:
- edge
- beta
stages:
- stage: Deploy
jobs:
# This job relies on credentials used to publish the Certbot snaps. This
# credential file was created by running:
#
# snapcraft logout
# snapcraft login (provide the shared snapcraft credentials when prompted)
# snapcraft export-login --channels=beta,edge snapcraft.cfg
#
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file in all pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#how-do-i-authorize-a-secure-file-for-use-in-all-pipelines.
#
# This file has a maximum lifetime of one year and the current
# file will expire on 2021-07-28 which is also tracked by
# https://github.com/certbot/certbot/issues/7931. The file will
# need to be updated before then to prevent automated deploys
# from breaking.
#
# Revoking these credentials can be done by changing the password of the
# account used to generate the credentials. See
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
# more info.
- job: publish_snap
pool:
vmImage: ubuntu-18.04
variables:
- group: certbot-common
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- task: DownloadSecureFile@1
name: snapcraftCfg
inputs:
secureFile: snapcraft.cfg
- bash: |
set -e
mkdir -p .snapcraft
ln -s $(snapcraftCfg.secureFilePath) .snapcraft/snapcraft.cfg
for SNAP_FILE in snap/*.snap; do
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
done
displayName: Publish to Snap store
- job: publish_docker
pool:
vmImage: ubuntu-18.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_$(DOCKER_ARCH)
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- task: Docker@2
inputs:
command: login
# The credentials used here are for the shared certbotbot account
# on Docker Hub. The credentials are stored in a service account
# which was created by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. "Grant access to all
# pipelines" should also be checked. To revoke these
# credentials, we can change the password on the certbotbot
# Docker Hub account or remove the account from the
# Certbot organization on Docker Hub.
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy.sh $(dockerTag) $DOCKER_ARCH
displayName: Deploy the Docker images


@@ -1,19 +0,0 @@
stages:
- stage: On_Failure
jobs:
- job: notify_mattermost
variables:
- group: certbot-common
pool:
vmImage: ubuntu-latest
steps:
- bash: |
set -e
MESSAGE="\
---\n\
##### Azure Pipeline
*Repo* $(Build.Repository.ID) - *Pipeline* $(Build.DefinitionName) #$(Build.BuildNumber) - *Branch/PR* $(Build.SourceBranchName)\n\
:warning: __Pipeline has failed__: [Link to the build](https://dev.azure.com/$(Build.Repository.ID)/_build/results?buildId=$(Build.BuildId)&view=results)\n\n\
---"
curl -i -X POST --data-urlencode "payload={\"text\":\"${MESSAGE}\"}" "$(MATTERMOST_URL)"
condition: failed()


@@ -1,6 +0,0 @@
stages:
- stage: TestAndPackage
jobs:
- template: ../jobs/standard-tests-jobs.yml
- template: ../jobs/extended-tests-jobs.yml
- template: ../jobs/packaging-jobs.yml


@@ -1,23 +0,0 @@
steps:
- bash: |
FINAL_STATUS=0
declare -a FAILED_BUILDS
python3 -m venv .venv
source .venv/bin/activate
python tools/pipstrap.py
for doc_path in */docs
do
echo ""
echo "##[group]Building $doc_path"
pip install -q -e $doc_path/..[docs]
if ! sphinx-build -W --keep-going -b html $doc_path $doc_path/_build/html; then
FINAL_STATUS=1
FAILED_BUILDS[${#FAILED_BUILDS[@]}]="${doc_path%/docs}"
fi
echo "##[endgroup]"
done
if [[ $FINAL_STATUS -ne 0 ]]; then
echo "##[error]The following builds failed: ${FAILED_BUILDS[*]}"
exit 1
fi
displayName: Build Sphinx Documentation


@@ -1,53 +0,0 @@
steps:
- bash: |
set -e
brew install augeas
condition: startswith(variables['IMAGE_NAME'], 'macOS')
displayName: Install MacOS dependencies
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
python-dev \
gcc \
libaugeas0 \
libssl-dev \
libffi-dev \
ca-certificates \
nginx-light \
openssl
sudo systemctl stop nginx
condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
displayName: Install Linux dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION)
addToPath: true
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv. The option "-I" is set so when CERTBOT_NO_PIN is also
# set, pip updates dependencies it thinks are already satisfied to avoid some
# problems with its lack of real dependency resolution.
- bash: |
set -e
python tools/pipstrap.py
python tools/pip_install.py -I tox virtualenv
displayName: Install runtime dependencies
- task: DownloadSecureFile@1
name: testFarmPem
inputs:
secureFile: azure-test-farm.pem
condition: contains(variables['TOXENV'], 'test-farm')
- bash: |
set -e
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
env
python -m tox
env:
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
displayName: Run tox


@@ -1,5 +1,2 @@
[run]
omit = */setup.py
[report]
omit = */setup.py


@@ -1,18 +0,0 @@
# https://editorconfig.org/
root = true
[*]
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf
[*.py]
indent_style = space
indent_size = 4
charset = utf-8
max_line_length = 100
[*.yaml]
indent_style = space
indent_size = 2

.envrc

@@ -1,12 +0,0 @@
# This file is just a nicety for developers who use direnv. When you cd under
# the Certbot repo, Certbot's virtual environment will be automatically
# activated and then deactivated when you cd elsewhere. Developers have to have
# direnv set up and run `direnv allow` to allow this file to execute on their
# system. You can find more information at https://direnv.net/.
. venv3/bin/activate
# direnv doesn't support modifying PS1 so we unset it to squelch the error
# it'll otherwise print about this being done in the activate script. See
# https://github.com/direnv/direnv/wiki/PS1. If you would like your shell
# prompt to change like it normally does, see
# https://github.com/direnv/direnv/wiki/Python#restoring-the-ps1.
unset PS1

.github/stale.yml

@@ -1,35 +0,0 @@
# Configuration for https://github.com/marketplace/stale
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30
# Ignore issues with an assignee (defaults to false)
exemptAssignees: true
# Label to use when marking as stale
staleLabel: needs-update
# Comment to post when marking as stale. Set to `false` to disable
markComment: >
We've made a lot of changes to Certbot since this issue was opened. If you
still have this issue with an up-to-date version of Certbot, can you please
add a comment letting us know? This helps us to better see what issues are
still affecting our users. If there is no activity in the next 30 days, this
issue will be automatically closed.
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This issue has been closed due to lack of activity, but if you think it
should be reopened, please open a new issue with a link to this one and we'll
take a look.
# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1
# Don't mark pull requests as stale.
only: issues

.gitignore

@@ -6,8 +6,7 @@ dist*/
/venv*/
/kgs/
/.tox/
/releases*/
/log*
/releases/
letsencrypt.log
certbot.log
letsencrypt-auto-source/letsencrypt-auto.sig.lzma.base64
@@ -26,7 +25,6 @@ tags
\#*#
.idea
.ropeproject
.vscode
# auth --cert-path --chain-path
/*.pem
@@ -35,33 +33,8 @@ tags
tests/letstest/letest-*/
tests/letstest/*.pem
tests/letstest/venv/
tests/letstest/venv3/
.venv
# pytest cache
.cache
.mypy_cache/
.pytest_cache/
# docker files
.docker
# certbot tests
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*
# snap files
.snapcraft
parts
prime
stage
*.snap
snap-constraints.txt
qemu-*
certbot-dns*/certbot-dns*_amd64*.txt
certbot-dns*/certbot-dns*_arm*.txt
/certbot_amd64*.txt
/certbot_arm*.txt
certbot-dns*/snap


@@ -1,7 +0,0 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400


@@ -24,11 +24,6 @@ persistent=yes
# usually to register additional checkers.
load-plugins=linter_plugin
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
[MESSAGES CONTROL]
@@ -46,14 +41,10 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue
disable=fixme,locally-disabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used
[REPORTS]
@@ -260,7 +251,7 @@ ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib
# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamically set).
ignored-classes=Field,Header,JWS,closing
ignored-classes=SQLObject
# When zope mode is activated, add a predefined set of Zope acquired attributes
# to generated-members.
@@ -306,6 +297,40 @@ valid-classmethod-first-arg=cls
valid-metaclass-classmethod-first-arg=mcs
[DESIGN]
# Maximum number of arguments for function / method
max-args=6
# Argument names that match this expression will be ignored. Default to name
# with leading underscore
ignored-argument-names=_.*
# Maximum number of locals for function / method body
max-locals=15
# Maximum number of return / yield for function / method body
max-returns=6
# Maximum number of branch for function / method body
max-branches=12
# Maximum number of statements in function / method body
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=12
# Maximum number of attributes for a class (see R0902).
max-attributes=7
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
# Maximum number of public methods for a class (see R0904).
max-public-methods=20
[EXCEPTIONS]
# Exceptions that will emit a warning when being caught. Defaults to

.travis.yml

@@ -0,0 +1,108 @@
language: python
cache:
directories:
- $HOME/.cache/pip
before_install:
- '([ $TRAVIS_OS_NAME == linux ] && dpkg -s libaugeas0) || (brew update && brew install augeas python3 && brew upgrade python && brew link python)'
before_script:
- 'if [ $TRAVIS_OS_NAME = osx ] ; then ulimit -n 1024 ; fi'
matrix:
include:
- python: "2.7"
env: TOXENV=py27_install BOULDER_INTEGRATION=1
sudo: required
services: docker
- python: "2.7"
env: TOXENV=cover FYI="this also tests py27"
- sudo: required
env: TOXENV=nginx_compat
services: docker
before_install:
addons:
- python: "2.7"
env: TOXENV=lint
- python: "2.6"
env: TOXENV=py26
sudo: required
services: docker
- python: "2.7"
env: TOXENV=py27-oldest
sudo: required
services: docker
- python: "3.3"
env: TOXENV=py33
sudo: required
services: docker
- python: "3.6"
env: TOXENV=py36
sudo: required
services: docker
- sudo: required
env: TOXENV=apache_compat
services: docker
before_install:
addons:
- sudo: required
env: TOXENV=le_auto_trusty
services: docker
before_install:
addons:
- python: "2.7"
env: TOXENV=apacheconftest
sudo: required
- python: "2.7"
env: TOXENV=nginxroundtrip
# Only build pushes to the master branch, PRs, and branches beginning with
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
# simultaneous Travis runs, which speeds turnaround time on review since there
# is a cap on the number of simultaneous runs.
branches:
only:
- master
- /^\d+\.\d+\.x$/
- /^test-.*$/
# container-based infrastructure
sudo: false
addons:
apt:
sources:
- augeas
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
- python-dev
- python-virtualenv
- gcc
- libaugeas0
- libssl-dev
- libffi-dev
- ca-certificates
# For certbot-nginx integration testing
- nginx-light
- openssl
# for apacheconftest
- apache2
- libapache2-mod-wsgi
- libapache2-mod-macro
install: "travis_retry $(command -v pip || command -v pip3) install tox coveralls"
script:
- travis_retry tox
- '[ -z "${BOULDER_INTEGRATION+x}" ] || (travis_retry tests/boulder-fetch.sh && tests/tox-boulder-integration.sh)'
after_success: '[ "$TOXENV" == "cover" ] && coveralls'
notifications:
email: false
irc:
channels:
- secure: "SGWZl3ownKx9xKVV2VnGt7DqkTmutJ89oJV9tjKhSs84kLijU6EYdPnllqISpfHMTxXflNZuxtGo0wTDYHXBuZL47w1O32W6nzuXdra5zC+i4sYQwYULUsyfOv9gJX8zWAULiK0Z3r0oho45U+FR5ZN6TPCidi8/eGU+EEPwaAw="
on_success: never
on_failure: always
use_notice: true


@@ -1,12 +1,10 @@
Authors
=======
* [Aaron Gable](https://github.com/aarongable)
* [Aaron Zirbes](https://github.com/aaronzirbes)
* Aaron Zuehlke
* Ada Lovelace
* [Adam Woodbeck](https://github.com/awoodbeck)
* [Adrien Ferrand](https://github.com/adferrand)
* [Aidin Gharibnavaz](https://github.com/aidin36)
* [AJ ONeal](https://github.com/coolaj86)
* [Alcaro](https://github.com/Alcaro)
@@ -16,13 +14,10 @@ Authors
* [Alex Gaynor](https://github.com/alex)
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [Andrew Murray](https://github.com/radarhere)
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [April King](https://github.com/april)
* [asaph](https://github.com/asaph)
* [Axel Beckert](https://github.com/xtaran)
* [Bas](https://github.com/Mechazawa)
@@ -37,9 +32,7 @@ Authors
* [Blake Griffith](https://github.com/cowlicks)
* [Brad Warren](https://github.com/bmw)
* [Brandon Kraft](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/BKreisel)
* [Brian Heim](https://github.com/brianlheim)
* [Cameron Steel](https://github.com/Tugzrida)
* [Brandon Kreisel](https://github.com/kraftbj)
* [Ceesjan Luiten](https://github.com/quinox)
* [Chad Whitacre](https://github.com/whit537)
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
@@ -61,9 +54,7 @@ Authors
* [DanCld](https://github.com/DanCld)
* [Daniel Albers](https://github.com/AID)
* [Daniel Aleksandersen](https://github.com/da2x)
* [Daniel Almasi](https://github.com/almasen)
* [Daniel Convissor](https://github.com/convissor)
* [Daniel "Drex" Drexler](https://github.com/aeturnum)
* [Daniel Huang](https://github.com/dhuang)
* [Dave Guarino](https://github.com/daguar)
* [David cz](https://github.com/dave-cz)
@@ -84,11 +75,9 @@ Authors
* [Fabian](https://github.com/faerbit)
* [Faidon Liambotis](https://github.com/paravoid)
* [Fan Jiang](https://github.com/tcz001)
* [Felix Lechner](https://github.com/lechner)
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Florian Klink](https://github.com/flokli)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
@@ -107,9 +96,7 @@ Authors
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
* [Henri Salo](https://github.com/fgeek)
* [Henry Chen](https://github.com/henrychen95)
* [Hugo van Kemenade](https://github.com/hugovk)
* [Ingolf Becker](https://github.com/watercrossing)
* [Ivan Nejgebauer](https://github.com/inejge)
* [Jaap Eldering](https://github.com/eldering)
* [Jacob Hoffman-Andrews](https://github.com/jsha)
* [Jacob Sachs](https://github.com/jsachs)
@@ -133,12 +120,10 @@ Authors
* [Jonathan Herlin](https://github.com/Jonher937)
* [Jon Walsh](https://github.com/code-tree)
* [Joona Hoikkala](https://github.com/joohoi)
* [Josh McCullough](https://github.com/JoshMcCullough)
* [Josh Soref](https://github.com/jsoref)
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
* [Kane York](https://github.com/riking)
* [Kenichi Maehashi](https://github.com/kmaehashi)
* [Kenneth Skovhede](https://github.com/kenkendk)
* [Kevin Burke](https://github.com/kevinburke)
* [Kevin London](https://github.com/kevinlondon)
@@ -151,13 +136,11 @@ Authors
* [Lior Sabag](https://github.com/liorsbg)
* [Lipis](https://github.com/lipis)
* [lord63](https://github.com/lord63)
* [Lorenzo Fundaró](https://github.com/lfundaro)
* [Luca Beltrame](https://github.com/lbeltrame)
* [Luca Ebach](https://github.com/lucebac)
* [Luca Olivetti](https://github.com/olivluca)
* [Luke Rogers](https://github.com/lukeroge)
* [Maarten](https://github.com/mrtndwrd)
* [Mads Jensen](https://github.com/atombrella)
* [Maikel Martens](https://github.com/krukas)
* [Malte Janduda](https://github.com/MalteJ)
* [Mantas Mikulėnas](https://github.com/grawity)
@@ -176,10 +159,8 @@ Authors
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
* [Michael Watters](https://github.com/blackknight36)
* [Michal Moravec](https://github.com/Majkl578)
* [Michal Papis](https://github.com/mpapis)
* [Mickaël Schoentgen](https://github.com/BoboTiG)
* [Minn Soe](https://github.com/MinnSoe)
* [Min RK](https://github.com/minrk)
* [Miquel Ruiz](https://github.com/miquelruiz)
@@ -209,7 +190,6 @@ Authors
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
@@ -217,7 +197,6 @@ Authors
* [Richard Barnes](https://github.com/r-barnes)
* [Richard Panek](https://github.com/kernelpanek)
* [Robert Buchholz](https://github.com/rbu)
* [Robert Dailey](https://github.com/pahrohfit)
* [Robert Habermann](https://github.com/frennkie)
* [Robert Xiao](https://github.com/nneonneo)
* [Roland Shoemaker](https://github.com/rolandshoemaker)
@@ -243,11 +222,8 @@ Authors
* [Spencer Bliven](https://github.com/sbliven)
* [Stacey Sheldon](https://github.com/solidgoldbomb)
* [Stavros Korokithakis](https://github.com/skorokithakis)
* [Ștefan Talpalaru](https://github.com/stefantalpalaru)
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [taixx046](https://github.com/taixx046)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)
@@ -279,6 +255,5 @@ Authors
* [Yomna](https://github.com/ynasser)
* [Yoni Jah](https://github.com/yonjah)
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
* [Yuseong Cho](https://github.com/g6123)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)


@@ -1 +0,0 @@
certbot/CHANGELOG.md

CHANGELOG.md

@@ -0,0 +1,772 @@
# Certbot change log
Certbot adheres to [Semantic Versioning](http://semver.org/).
## 0.20.0 - 2017-12-06
### Added
* Certbot's ACME library now recognizes URL fields in challenge objects in
preparation for Let's Encrypt's new ACME endpoint. The value is still
accessible in our ACME library through the name "uri".
### Changed
* The Apache plugin now parses some distro specific Apache configuration files
on non-Debian systems allowing it to get a clearer picture on the running
configuration. Internally, these changes were structured so that external
contributors can easily write patches to make the plugin work in new Apache
configurations.
* Certbot better reports network failures by removing information about
connection retries from the error output.
* An unnecessary question when using Certbot's webroot plugin interactively has
been removed.
### Fixed
* Certbot's NGINX plugin no longer sometimes incorrectly reports that it was
unable to deploy a HTTP->HTTPS redirect when requesting Certbot to enable a
redirect for multiple domains.
* Problems where the Apache plugin was failing to find directives and
duplicating existing directives on openSUSE have been resolved.
* An issue running the tests shipped with Certbot and some of our DNS plugins with
older versions of mock has been resolved.
* On some systems, users reported strangely interleaved output depending on
when stdout and stderr were flushed. This problem was resolved by having
Certbot regularly flush these streams.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/44?closed=1
## 0.19.0 - 2017-10-04
### Added
* Certbot now has renewal hook directories where executable files can be placed
for Certbot to run with the renew subcommand. Pre-hooks, deploy-hooks, and
post-hooks can be specified in the renewal-hooks/pre, renewal-hooks/deploy,
and renewal-hooks/post directories respectively in Certbot's configuration
directory (which is /etc/letsencrypt by default). Certbot will automatically
create these directories when it is run if they do not already exist.
* After revoking a certificate with the revoke subcommand, Certbot will offer
to delete the lineage associated with the certificate. When Certbot is run
with --non-interactive, it will automatically try to delete the associated
lineage.
* When using Certbot's Google Cloud DNS plugin on Google Compute Engine, you no
longer have to provide a credential file to Certbot if you have configured
sufficient permissions for the instance which Certbot can automatically
obtain using Google's metadata service.
### Changed
* When deleting certificates interactively using the delete subcommand, Certbot
will now allow you to select multiple lineages to be deleted at once.
* Certbot's Apache plugin no longer always parses Apache's sites-available on
Debian based systems and instead only parses virtual hosts included in your
Apache configuration. You can provide an additional directory for Certbot to
parse using the command line flag --apache-vhost-root.
### Fixed
* The plugins subcommand can now be run without root access.
* certbot-auto now includes a timeout when updating itself so it no longer
hangs indefinitely when it is unable to connect to the external server.
* An issue where Certbot's Apache plugin would sometimes fail to deploy a
certificate on Debian based systems if mod_ssl wasn't already enabled has
been resolved.
* A bug in our Docker image where the certificates subcommand could not report
if certificates maintained by Certbot had been revoked has been fixed.
* Certbot's RFC 2136 DNS plugin (for use with software like BIND) now properly
performs DNS challenges when the domain being verified contains a CNAME
record.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/43?closed=1
## 0.18.2 - 2017-09-20
### Fixed
* An issue where Certbot's ACME module would raise an AttributeError trying to
create self-signed certificates when used with pyOpenSSL 17.3.0 has been
resolved. For Certbot users with this version of pyOpenSSL, this caused
Certbot to crash when performing a TLS SNI challenge or when the Nginx plugin
tried to create an SSL server block.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/46?closed=1
## 0.18.1 - 2017-09-08
### Fixed
* If certbot-auto was running as an unprivileged user and it upgraded from
0.17.0 to 0.18.0, it would crash with a permissions error and would need to
be run again to successfully complete the upgrade. This has been fixed and
certbot-auto should upgrade cleanly to 0.18.1.
* Certbot usually uses "certbot-auto" or "letsencrypt-auto" in error messages
and the User-Agent string instead of "certbot" when you are using one of
these wrapper scripts. Proper detection of this was broken with Certbot's new
installation path in /opt in 0.18.0 but this problem has been resolved.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/45?closed=1
## 0.18.0 - 2017-09-06
### Added
* The Nginx plugin now configures Nginx to use 2048-bit Diffie-Hellman
parameters. Java 6 clients do not support Diffie-Hellman parameters larger
than 1024 bits, so if you need to support these clients you will need to
manually modify your Nginx configuration after using the Nginx installer.
### Changed
* certbot-auto now installs Certbot in directories under `/opt/eff.org`. If you
had an existing installation from certbot-auto, a symlink is created to the
new directory. You can configure certbot-auto to use a different path by
setting the environment variable VENV_PATH.
* The Nginx plugin can now be selected in Certbot's interactive output.
* Output verbosity of renewal failures when running with `--quiet` has been
reduced.
* The default revocation reason shown in Certbot help output now is a human
readable string instead of a numerical code.
* Plugin selection is now included in normal terminal output.
### Fixed
* A newer version of ConfigArgParse is now installed when using certbot-auto
causing values set to false in a Certbot INI configuration file to be handled
intuitively. Setting a boolean command line flag to false is equivalent to
not including it in the configuration file at all.
* New naming conventions preventing certbot-auto from installing OS
dependencies on Fedora 26 have been resolved.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/42?closed=1
## 0.17.0 - 2017-08-02
### Added
* Support in our nginx plugin for modifying SSL server blocks that do
not contain certificate or key directives.
* A `--max-log-backups` flag to allow users to configure or even completely
disable Certbot's built in log rotation.
* A `--user-agent-comment` flag to allow people who build tools around Certbot
to differentiate their user agent string by adding a comment to its default
value.
### Changed
* Due to some awesome work by
[cryptography project](https://github.com/pyca/cryptography), compilation can
now be avoided on most systems when using certbot-auto. This eliminates many
problems people have had in the past such as running out of memory, having
invalid headers/libraries, and changes to the OS packages on their system
after compilation breaking Certbot.
* The `--renew-hook` flag has been hidden in favor of `--deploy-hook`. This new
flag works exactly the same way except it is always run when a certificate is
issued rather than just when it is renewed.
* We have started printing deprecation warnings in certbot-auto for
experimentally supported systems with OS packages available.
* A certificate lineage's name is included in error messages during renewal.
### Fixed
* Encoding errors that could occur when parsing error messages from the ACME
server containing Unicode have been resolved.
* certbot-auto no longer prints misleading messages about there being a newer
pip version available when installation fails.
* Certbot's ACME library now properly extracts domains from critical SAN
extensions.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.17.0+is%3Aclosed
## 0.16.0 - 2017-07-05
### Added
* A plugin for performing DNS challenges using dynamic DNS updates as defined
in RFC 2136. This plugin is packaged separately from Certbot and is available
at https://pypi.python.org/pypi/certbot-dns-rfc2136. It supports Python 2.6,
2.7, and 3.3+. At this time, there isn't a good way to install this plugin
when using certbot-auto, but this should change in the near future.
* Plugins for performing DNS challenges for the providers
[DNS Made Easy](https://pypi.python.org/pypi/certbot-dns-dnsmadeeasy) and
[LuaDNS](https://pypi.python.org/pypi/certbot-dns-luadns). These plugins are
packaged separately from Certbot and support Python 2.7 and 3.3+. Currently,
there isn't a good way to install these plugins when using certbot-auto,
but that should change soon.
* Support for performing TLS-SNI-01 challenges when using the manual plugin.
* Automatic detection of Arch Linux in the Apache plugin providing better
default settings for the plugin.
### Changed
* The text of the interactive question about whether a redirect from HTTP to
HTTPS should be added by Certbot has been rewritten to better explain the
choices to the user.
* Simplified HTTP challenge instructions in the manual plugin.
### Fixed
* Problems performing a dry run when using the Nginx plugin have been fixed.
* Resolved an issue where certbot-dns-digitalocean's test suite would sometimes
fail when ran using Python 3.
* On some systems, previous versions of certbot-auto would error out with a
message about a missing hash for setuptools. This has been fixed.
* A bug where Certbot would sometimes not print a space at the end of an
interactive prompt has been resolved.
* Nonfatal tracebacks are no longer shown in rare cases where Certbot
encounters an exception trying to close its TCP connection with the ACME
server.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.16.0+is%3Aclosed
## 0.15.0 - 2017-06-08
### Added
* Plugins for performing DNS challenges for popular providers. Like the Apache
and Nginx plugins, these plugins are packaged separately and not included in
Certbot by default. So far, we have plugins for
[Amazon Route 53](https://pypi.python.org/pypi/certbot-dns-route53),
[Cloudflare](https://pypi.python.org/pypi/certbot-dns-cloudflare),
[DigitalOcean](https://pypi.python.org/pypi/certbot-dns-digitalocean), and
[Google Cloud](https://pypi.python.org/pypi/certbot-dns-google) which all
work on Python 2.6, 2.7, and 3.3+. Additionally, we have plugins for
[CloudXNS](https://pypi.python.org/pypi/certbot-dns-cloudxns),
[DNSimple](https://pypi.python.org/pypi/certbot-dns-dnsimple),
and [NS1](https://pypi.python.org/pypi/certbot-dns-nsone) which work on Python
2.7 and 3.3+ (and not 2.6). Currently, there isn't a good way to install
these plugins when using `certbot-auto`, but that should change soon.
* IPv6 support in the standalone plugin. When performing a challenge, the
standalone plugin automatically handles listening for IPv4/IPv6 traffic based
on the configuration of your system.
* A mechanism for keeping your Apache and Nginx SSL/TLS configuration up to
date. When the Apache or Nginx plugins are used, they place SSL/TLS
configuration options in the root of Certbot's config directory
(`/etc/letsencrypt` by default). Now when a new version of these plugins run
on your system, they will automatically update the file to the newest
version if it is unmodified. If you manually modified the file, Certbot will
display a warning giving you a path to the updated file which you can use as
a reference to manually update your modified copy.
* `--http-01-address` and `--tls-sni-01-address` flags for controlling the
address Certbot listens on when using the standalone plugin.
* The command `certbot certificates` that lists certificates managed by Certbot
now performs additional validity checks to notify you if your files have
become corrupted.
### Changed
* Messages custom hooks print to `stdout` are now displayed by Certbot when not
running in `--quiet` mode.
* `jwk` and `alg` fields in JWS objects have been moved into the protected
header causing Certbot to more closely follow the latest version of the ACME
spec.
### Fixed
* Permissions on renewal configuration files are now properly preserved when
they are updated.
* A bug causing Certbot to display strange defaults in its help output when
using Python <= 2.7.4 has been fixed.
* Certbot now properly handles mixed case domain names found in custom CSRs.
* A number of poorly worded prompts and error messages.
### Removed
* Support for OpenSSL 1.0.0 in `certbot-auto` has been removed as we now pin a
newer version of `cryptography` which dropped support for this version.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.15.0+is%3Aclosed
## 0.14.2 - 2017-05-25
### Fixed
* Certbot 0.14.0 included a bug where Certbot would create a temporary log file
(usually in /tmp) if the program exited during argument parsing. If a user
provided -h/--help/help, --version, or an invalid command line argument,
Certbot would create this temporary log file. This was especially bothersome to
certbot-auto users as certbot-auto runs `certbot --version` internally to see
if the script needs to upgrade causing it to create at least one of these files
on every run. This problem has been resolved.
More details about this change can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.14.2+is%3Aclosed
## 0.14.1 - 2017-05-16
### Fixed
* Certbot now works with configargparse 0.12.0.
* Issues with the Apache plugin and Augeas 1.7+ have been resolved.
* A problem where the Nginx plugin would fail to install certificates on
systems that had the plugin's SSL/TLS options file from 7+ months ago has been
fixed.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.14.1+is%3Aclosed
## 0.14.0 - 2017-05-04
### Added
* Python 3.3+ support for all Certbot packages. `certbot-auto` still currently
only supports Python 2, but the `acme`, `certbot`, `certbot-apache`, and
`certbot-nginx` packages on PyPI now fully support Python 2.6, 2.7, and 3.3+.
* Certbot's Apache plugin now handles multiple virtual hosts per file.
* Lockfiles to prevent multiple versions of Certbot running simultaneously.
### Changed
* When converting an HTTP virtual host to HTTPS in Apache, Certbot only copies
the virtual host rather than the entire contents of the file it's contained
in.
* The Nginx plugin now includes SSL/TLS directives in a separate file located
in Certbot's configuration directory rather than copying the contents of the
file into every modified `server` block.
### Fixed
* Ensure logging is configured before parts of Certbot attempt to log any
messages.
* Support for the `--quiet` flag in `certbot-auto`.
* Reverted a change made in a previous release to make the `acme` and `certbot`
packages always depend on `argparse`. This dependency is conditional again on
the user's Python version.
* Small bugs in the Nginx plugin such as properly handling empty `server`
blocks and setting `server_names_hash_bucket_size` during challenges.
As always, a more complete list of changes can be found on GitHub:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.14.0+is%3Aclosed
## 0.13.0 - 2017-04-06
### Added
* `--debug-challenges` now pauses Certbot after setting up challenges for debugging.
* The Nginx parser can now handle all valid directives in configuration files.
* Nginx ciphersuites have changed to Mozilla Intermediate.
* `certbot-auto --no-bootstrap` provides the option to not install OS dependencies.
### Fixed
* `--register-unsafely-without-email` now respects `--quiet`.
* Hyphenated renewal parameters are now saved in renewal config files.
* `--dry-run` no longer persists keys and csrs.
* Certbot no longer hangs when trying to start Nginx in Arch Linux.
* Apache rewrite rules no longer double-encode characters.
A full list of changes is available on GitHub:
https://github.com/certbot/certbot/issues?q=is%3Aissue%20milestone%3A0.13.0%20is%3Aclosed%20
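For illustration, a rough sketch of the two new flags above; the plugin choice and domain are placeholders:
```
# Pause after challenges are set up so they can be inspected before the CA
# attempts validation (example.com is a placeholder domain).
certbot certonly --standalone --debug-challenges -d example.com

# Run the wrapper script without installing OS-level dependencies.
./certbot-auto --no-bootstrap --help
```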
## 0.12.0 - 2017-03-02
### Added
* Certbot now allows non-camelcase Apache VirtualHost names.
* Certbot now allows more log messages to be silenced.
### Fixed
* Fixed a regression around using `--cert-name` when getting new certificates.
More information about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue%20milestone%3A0.12.0
## 0.11.1 - 2017-02-01
### Fixed
* Resolved a problem where Certbot would crash while parsing command line
arguments in some cases.
* Fixed a typo.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/pulls?q=is%3Apr%20milestone%3A0.11.1%20is%3Aclosed
## 0.11.0 - 2017-02-01
### Added
* When using the standalone plugin while running Certbot interactively
and a required port is bound by another process, Certbot will give you
the option to retry to grab the port rather than immediately exiting.
* You are now able to deactivate your account with the Let's Encrypt
server using the `unregister` subcommand.
* When revoking a certificate using the `revoke` subcommand, you now
have the option to provide the reason the certificate is being revoked
to Let's Encrypt with `--reason`.
### Changed
* Providing `--quiet` to `certbot-auto` now silences package manager output.
### Removed
* Removed the optional `dnspython` dependency from our `acme` package.
The library no longer supports client-side verification of the DNS
challenge.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.11.0+is%3Aclosed
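For illustration, a minimal sketch of the new account and revocation subcommands; the certificate path and revocation reason are example choices:
```
# Deactivate the account registered with the Let's Encrypt server.
certbot unregister

# Revoke a certificate and report the reason to the CA
# (the certificate path and reason below are example choices).
certbot revoke --cert-path /etc/letsencrypt/live/example.com/cert.pem \
    --reason keycompromise
```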
## 0.10.2 - 2017-01-25
### Added
* If Certbot receives a request with a `badNonce` error, it now
automatically retries the request. Since nonces from Let's Encrypt expire,
this helps people performing the DNS challenge with the `manual` plugin
who may have to wait an extended period of time for their DNS changes to
propagate.
### Fixed
* Certbot now saves the `--preferred-challenges` values for renewal. Previously
these values were discarded, causing a different challenge type to be used when
renewing certs in some cases.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.10.2+is%3Aclosed
## 0.10.1 - 2017-01-13
### Fixed
* Resolved a problem where, when asking Certbot to update a certificate at
an existing path to include different domain names, the old names would
continue to be used.
* Fixed issues with successfully running our unit test suite on some systems.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.10.1+is%3Aclosed
## 0.10.0 - 2017-01-11
### Added
* Added the ability to customize and automatically complete DNS and HTTP
domain validation challenges with the manual plugin. The flags
`--manual-auth-hook` and `--manual-cleanup-hook` can now be provided
when using the manual plugin to execute commands provided by the user to
perform and clean up challenges provided by the CA. This is best used in
complicated setups where the DNS challenge must be used or Certbot's
existing plugins cannot be used to perform HTTP challenges. For more
information on how this works, see `certbot --help manual`.
* Added a `--cert-name` flag for specifying the name to use for the
certificate in Certbot's configuration directory. Using this flag in
combination with `-d/--domains`, a user can easily request a new
certificate with different domains and save it with the name provided by
`--cert-name`. Additionally, `--cert-name` can be used to select a
certificate with the `certonly` and `run` subcommands so a full list of
domains in the certificate does not have to be provided.
* Added subcommand `certificates` for listing the certificates managed by
Certbot and their properties.
* Added the `delete` subcommand for removing certificates managed by Certbot
from the configuration directory.
* Certbot now supports requesting internationalized domain names (IDNs).
* Hooks provided to Certbot are now saved to be reused during renewal.
If you run Certbot with `--pre-hook`, `--renew-hook`, or `--post-hook`
flags when obtaining a certificate, the provided commands will
automatically be saved and executed again when renewing the certificate.
A pre-hook and/or post-hook can also be given to the `certbot renew`
command either on the command line or in a [configuration
file](https://certbot.eff.org/docs/using.html#configuration-file) to run
an additional command before/after any certificate is renewed. Hooks
will only be run if a certificate is renewed.
* Added support for Busybox in `certbot-auto`.
### Changed
* Recategorized `-h/--help` output to improve documentation and
discoverability.
### Removed
* Removed the ncurses interface. This change solves problems people
were having on many systems, reduces the number of Certbot
dependencies, and simplifies our code. Certbot's only interface now is
the text interface which was available by providing `-t/--text` to
earlier versions of Certbot.
### Fixed
* Many small bug fixes.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.10.0+is%3Aclosed
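For illustration, a sketch of how the 0.10.0 additions fit together; the hook script paths, lineage name, domains, and service commands are hypothetical placeholders:
```
# Automate the DNS challenge with user-provided auth/cleanup scripts
# (the script paths and domain are hypothetical).
certbot certonly --manual --preferred-challenges dns \
    --manual-auth-hook /path/to/dns-auth.sh \
    --manual-cleanup-hook /path/to/dns-cleanup.sh \
    -d example.com

# Request a certificate under an explicit name, list managed certificates,
# then delete the lineage by that name.
certbot certonly --standalone --cert-name example-site \
    -d example.com -d www.example.com
certbot certificates
certbot delete --cert-name example-site

# Hooks provided at issuance time are saved and re-run on renewal
# (the service commands are placeholders).
certbot certonly --standalone -d example.com \
    --pre-hook "service nginx stop" --post-hook "service nginx start"
```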
## 0.9.3 - 2016-10-13
### Added
* The Apache plugin uses information about your OS to help determine the
layout of your Apache configuration directory. We added a patch to
ensure this code behaves the same way when testing on different systems,
as the tests were failing in some cases.
### Changed
* Certbot adopted more conservative behavior about reporting a needed port as
unavailable when using the standalone plugin.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/27?closed=1
## 0.9.2 - 2016-10-12
### Added
* Certbot no longer requires that every potentially needed port be available
when using the standalone plugin. It now only verifies that ports are
available when they are actually necessary.
### Fixed
* Certbot now verifies that the versions of its optional dependencies match
what it requires.
* Certbot now properly copies the `ssl on;` directives as necessary when
performing domain validation in the Nginx plugin.
* Fixed problem where symlinks were becoming files when they were
packaged, causing errors during testing and OS packaging.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/26?closed=1
## 0.9.1 - 2016-10-06
### Fixed
* Fixed a bug that was introduced in version 0.9.0 where the command
line flag -q/--quiet wasn't respected in some cases.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/25?closed=1
## 0.9.0 - 2016-10-05
### Added
* Added an alpha version of the Nginx plugin. This plugin fully automates the
process of obtaining and installing certificates with Nginx.
Additionally, it is able to automatically configure security
enhancements such as an HTTP to HTTPS redirect and OCSP stapling. To use
this plugin, you must have the `certbot-nginx` package installed (which
is installed automatically when using `certbot-auto`) and provide
`--nginx` on the command line. This plugin is still in its early stages
so we recommend you use it with some caution and make sure you have a
backup of your Nginx configuration.
* Added support for the `DNS` challenge in the `acme` library and `DNS` in
Certbot's `manual` plugin. This allows you to create DNS records to
prove to Let's Encrypt you control the requested domain name. To use
this feature, include `--manual --preferred-challenges dns` on the
command line.
* Certbot now helps with enabling Extra Packages for Enterprise Linux (EPEL) on
CentOS 6 when using `certbot-auto`. To use `certbot-auto` on CentOS 6,
the EPEL repository has to be enabled. `certbot-auto` will now prompt
users asking them if they would like the script to enable this for them
automatically. This is done without prompting users when using
`letsencrypt-auto` or if `-n/--non-interactive/--noninteractive` is
included on the command line.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.9.0+is%3Aclosed
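For illustration, minimal invocations of the two features above; the domain is a placeholder:
```
# Fully automated issuance and installation with the (alpha) Nginx plugin
# (example.com is a placeholder).
certbot --nginx -d example.com

# Prove control of a domain by publishing a DNS record via the manual plugin.
certbot certonly --manual --preferred-challenges dns -d example.com
```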
## 0.8.1 - 2016-06-14
### Added
* Certbot now preserves a certificate's common name when using `renew`.
* Certbot now saves webroot values for renewal when they are entered interactively.
* Certbot now gracefully reports that the Apache plugin isn't usable when Augeas is not installed.
* Added experimental support for Mageia to `certbot-auto`.
### Fixed
* Fixed problems with an invalid user-agent string on OS X.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.8.1+
## 0.8.0 - 2016-06-02
### Added
* Added the `register` subcommand which can be used to register an account
with the Let's Encrypt CA.
* You can now run `certbot register --update-registration` to
change the e-mail address associated with your registration.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue+milestone%3A0.8.0+
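For illustration, a minimal sketch of the new `register` subcommand; the email addresses are placeholders:
```
# Explicitly create an account with the Let's Encrypt CA
# (email addresses are placeholders).
certbot register --email admin@example.com --agree-tos

# Later, change the e-mail address associated with the registration.
certbot register --update-registration --email new-admin@example.com
```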
## 0.7.0 - 2016-05-27
### Added
* Added `--must-staple` to request certificates from Let's Encrypt
with the OCSP Must-Staple extension.
* Certbot now automatically configures OCSP stapling for Apache.
* Certbot now allows requesting certificates for domains found in the common name
of a custom CSR.
### Fixed
* Fixed a number of miscellaneous bugs.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=milestone%3A0.7.0+is%3Aissue
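For illustration, a sketch of the new issuance options; the domain and CSR path are placeholders:
```
# Request a certificate carrying the OCSP Must-Staple extension.
certbot certonly --standalone --must-staple -d example.com

# Obtain a certificate for the domains named in an existing CSR
# (the CSR path is a placeholder).
certbot certonly --standalone --csr /path/to/custom.csr
```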
## 0.6.0 - 2016-05-12
### Added
* Versioned the datetime dependency in setup.py.
### Changed
* Renamed the client from `letsencrypt` to `certbot`.
### Fixed
* Fixed a small JSON deserialization error.
* Certbot now preserves domain order in generated CSRs.
* Fixed some minor bugs.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/issues?q=is%3Aissue%20milestone%3A0.6.0%20is%3Aclosed%20
## 0.5.0 - 2016-04-05
### Added
* Added the ability to use the webroot plugin interactively.
* Added the `--pre-hook`, `--post-hook`, and `--renew-hook` flags, which can be
used with the renew subcommand to register shell commands to run in response
to renewal events. Pre-hook commands are run before any certs are renewed,
post-hook commands are run after any certs are renewed, and renew-hook
commands are run after each cert is renewed. If no certs are due for renewal,
no command is run (see the example after this release's notes).
* Added a `-q`/`--quiet` flag which silences all output except errors.
* Added an `--allow-subset-of-domains` flag which can be used with the renew
command to prevent renewal failures for a subset of the requested domains
from causing the client to exit.
### Changed
* Certbot now uses renewal configuration files. Stored in
`/etc/letsencrypt/renewal` by default, these files control the parameters
used when renewing a specific certificate.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=milestone%3A0.5.0+is%3Aissue
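For illustration, a sketch of a renewal run combining the new flags; the client was still named `letsencrypt` at this point, and the hook commands are placeholders:
```
# Renew everything that is due, running shell hooks around the batch,
# tolerating failures for a subset of domains, and printing only errors
# (the hook commands are placeholders).
letsencrypt renew --pre-hook "service apache2 stop" \
    --post-hook "service apache2 start" \
    --allow-subset-of-domains -q
```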
## 0.4.2 - 2016-03-03
### Fixed
* Resolved problems encountered when compiling letsencrypt
against the new OpenSSL release.
* Fixed problems encountered when using `letsencrypt renew` with configuration files
from the private beta.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=is%3Aissue+milestone%3A0.4.2
## 0.4.1 - 2016-02-29
### Fixed
* Fixed Apache parsing errors encountered with some configurations.
* Fixed Werkzeug dependency problems encountered on some Red Hat systems.
* Fixed bootstrapping failures when using `letsencrypt-auto` with `--no-self-upgrade`.
* Fixed problems with parsing renewal config files from private beta.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=is:issue+milestone:0.4.1
## 0.4.0 - 2016-02-10
### Added
* Added the verb/subcommand `renew` which can be used to renew your existing
certificates as they approach expiration. Running `letsencrypt renew`
will examine all existing certificate lineages and determine if any are
less than 30 days from expiration. If so, the client will use the
settings provided when you previously obtained the certificate to renew
it. The subcommand finishes by printing a summary of which renewals were
successful, failed, or not yet due.
* Added a `--dry-run` flag to help with testing configuration
without affecting production rate limits. Currently supported by the
`renew` and `certonly` subcommands, providing `--dry-run` on the command
line will obtain certificates from the staging server without saving the
resulting certificates to disk.
* Added major improvements to letsencrypt-auto. This script
has been rewritten to include full support for Python 2.6, the ability
for letsencrypt-auto to update itself, and improvements to the
stability, security, and performance of the script.
* Added support for Apache 2.2 to the Apache plugin.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=is%3Aissue+milestone%3A0.4.0
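For illustration, a minimal sketch of the new `renew` subcommand and `--dry-run` flag; the domain is a placeholder:
```
# Examine all certificate lineages and renew those within 30 days of expiry,
# reusing the settings saved when each certificate was obtained.
letsencrypt renew

# Rehearse issuance against the staging server without saving certificates
# to disk or consuming production rate limits (placeholder domain).
letsencrypt certonly --standalone -d example.com --dry-run
```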
## 0.3.0 - 2016-01-27
### Added
* Added a non-interactive mode which can be enabled by including `-n` or
`--non-interactive` on the command line. This can be used to guarantee
the client will not prompt when run automatically using cron/systemd.
* Added preparation for the new letsencrypt-auto script. Over the past
couple months, we've been working on increasing the reliability and
security of letsencrypt-auto. A number of changes landed in this
release to prepare for the new version of this script.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=is%3Aissue+milestone%3A0.3.0
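For illustration, a cron-friendly invocation using the new non-interactive mode; the email address and domain are placeholders, and `--agree-tos`/`--email` are assumed to be supplied so that no prompt is needed:
```
# Guaranteed not to prompt, suitable for cron or a systemd timer
# (email and domain are placeholders).
letsencrypt certonly --standalone --non-interactive \
    --agree-tos --email admin@example.com -d example.com
```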
## 0.2.0 - 2016-01-14
### Added
* Added Apache plugin support for non-Debian based systems. Support has been
added for modern Red Hat based systems such as Fedora 23, Red Hat 7,
and CentOS 7 running Apache 2.4. In theory, this plugin should be
configurable to run on any Unix-like OS running Apache 2.4.
* Relaxed PyOpenSSL version requirements. This adds support for systems
with PyOpenSSL versions 0.13 or 0.14.
* Improved error messages from the client.
### Fixed
* Resolved issues with the Apache plugin enabling an HTTP to HTTPS
redirect on some systems.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=is%3Aissue+milestone%3A0.2.0
## 0.1.1 - 2015-12-15
### Added
* Added a check that prevents attempting to issue certificates for unqualified
domain names like "localhost".
### Fixed
* Fixed a confusing UI path that caused some users to repeatedly renew
their certs while experimenting with the client, in some cases hitting
issuance rate limits.
* Fixed numerous Apache configuration parser problems.
* Fixed `--webroot` permission handling for non-root users.
More details about these changes can be found on our GitHub repo:
https://github.com/letsencrypt/letsencrypt/issues?q=milestone%3A0.1.1

8
CHANGES.rst Normal file
View File

@@ -0,0 +1,8 @@
ChangeLog
=========
To see the changes in a given release, view the issues closed in a given
release's GitHub milestone:
- `Past releases <https://github.com/certbot/certbot/milestones?state=closed>`_
- `Upcoming releases <https://github.com/certbot/certbot/milestones>`_

View File

@@ -1 +0,0 @@
This project is governed by [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode).

View File

@@ -11,7 +11,7 @@ to the Sphinx generated docs is provided below.
[1] https://github.com/blog/1184-contributing-guidelines
[2] https://docutils.sourceforge.io/docs/user/rst/quickref.html#hyperlink-targets
[2] http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets
-->
@@ -33,5 +33,3 @@ started. In particular, we recommend you read these sections
- [Finding issues to work on](https://certbot.eff.org/docs/contributing.html#find-issues-to-work-on)
- [Coding style](https://certbot.eff.org/docs/contributing.html#coding-style)
- [Submitting a pull request](https://certbot.eff.org/docs/contributing.html#submitting-a-pull-request)
- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)

27
Dockerfile Normal file
View File

@@ -0,0 +1,27 @@
FROM python:2-alpine
ENTRYPOINT [ "certbot" ]
EXPOSE 80 443
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot
COPY CHANGES.rst README.rst setup.py src/
COPY acme src/acme
COPY certbot src/certbot
RUN apk add --no-cache --virtual .certbot-deps \
libffi \
libssl1.0 \
openssl \
ca-certificates \
binutils
RUN apk add --no-cache --virtual .build-deps \
gcc \
linux-headers \
openssl-dev \
musl-dev \
libffi-dev \
&& pip install --no-cache-dir \
--editable /opt/certbot/src/acme \
--editable /opt/certbot/src \
&& apk del .build-deps

View File

@@ -1,20 +1,70 @@
# This Dockerfile builds an image for development.
FROM debian:buster
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>
MAINTAINER Yan <yan@eff.org>
# Note: this only exposes the port to other docker containers.
EXPOSE 80 443
# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443
# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot/src
COPY . .
RUN apt-get update && \
apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.
# TODO: Install Apache/Nginx for plugin development.
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get install python3-dev git -y && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
RUN VENV_NAME="../venv3" python3 tools/venv3.py
# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible
ENV PATH /opt/certbot/venv3/bin:$PATH
COPY setup.py README.rst CHANGES.rst MANIFEST.in linter_plugin.py tox.cover.sh tox.ini .pylintrc /opt/certbot/src/
# all above files are necessary for setup.py, however, package source
# code directory has to be copied separately to a subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters, three files are far
# more likely to be cached than the whole project directory
COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
COPY letshelp-certbot /opt/certbot/src/letshelp-certbot/
COPY certbot-compatibility-test /opt/certbot/src/certbot-compatibility-test/
COPY tests /opt/certbot/src/tests/
RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv && \
/opt/certbot/venv/bin/pip install -U pip && \
/opt/certbot/venv/bin/pip install -U setuptools && \
/opt/certbot/venv/bin/pip install \
-e /opt/certbot/src/acme \
-e /opt/certbot/src \
-e /opt/certbot/src/certbot-apache \
-e /opt/certbot/src/certbot-nginx \
-e /opt/certbot/src/letshelp-certbot \
-e /opt/certbot/src/certbot-compatibility-test \
-e /opt/certbot/src[dev,docs]
# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.
ENV PATH /opt/certbot/venv/bin:$PATH

75
Dockerfile-old Normal file
View File

@@ -0,0 +1,75 @@
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
# it is more likely developers will already have ubuntu:trusty rather
# than e.g. debian:jessie and image size differences are negligible
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>
# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443
# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot
# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.
ENV DEBIAN_FRONTEND=noninteractive
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible
COPY setup.py README.rst CHANGES.rst MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/
# all above files are necessary for setup.py and venv setup, however,
# package source code directory has to be copied separately to a
# subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters, three files are far
# more likely to be cached than the whole project directory
COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv
# PATH is set now so pipstrap upgrades the correct (v)env
ENV PATH /opt/certbot/venv/bin:$PATH
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
/opt/certbot/venv/bin/pip install \
-e /opt/certbot/src/acme \
-e /opt/certbot/src \
-e /opt/certbot/src/certbot-apache \
-e /opt/certbot/src/certbot-nginx
# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.
# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
ENV PATH /opt/certbot/bin:$PATH
ENTRYPOINT [ "certbot" ]

View File

@@ -1,9 +1,3 @@
If you're having trouble using Certbot and aren't sure you've found a bug or
request for a new feature, please first try asking for help at
https://community.letsencrypt.org/. There is a much larger community there of
people familiar with the project who will be able to more quickly answer your
questions.
## My operating system is (include version):

View File

@@ -1,10 +1,9 @@
include README.rst
include CHANGELOG.md
include CHANGES.rst
include CONTRIBUTING.md
include LICENSE.txt
include linter_plugin.py
recursive-include docs *
recursive-include examples *
recursive-include certbot/tests/testdata *
recursive-include tests *.py
include certbot/ssl-dhparams.pem
global-exclude __pycache__
global-exclude *.py[cod]

View File

@@ -1 +0,0 @@
certbot/README.rst

161
README.rst Normal file
View File

@@ -0,0 +1,161 @@
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
Certbot is part of EFFs effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identity of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Lets Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Lets Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so theres no need to arrange payment.
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, youll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
If youre using a hosted service and dont have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Lets Encrypt.
Certbot is a fully-featured, extensible client for the Let's
Encrypt CA (or any other CA that speaks the `ACME
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
protocol) that can automate the tasks of obtaining certificates and
configuring webservers to use them. This client runs on Unix-based operating
systems.
To see the changes made to Certbot between versions please refer to our
`changelog <https://github.com/certbot/certbot/blob/master/CHANGELOG.md>`_.
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
depending on install method. Instructions on the Internet, and some pieces of the
software, may still refer to this older name.
Contributing
------------
If you'd like to contribute to this project please read `Developer Guide
<https://certbot.eff.org/docs/contributing.html>`_.
.. _installation:
Installation
------------
The easiest way to install Certbot is by visiting `certbot.eff.org`_, where you can
find the correct installation instructions for many web server and OS combinations.
For more information, see `Get Certbot <https://certbot.eff.org/docs/install.html>`_.
.. _certbot.eff.org: https://certbot.eff.org/
How to run the client
---------------------
In many cases, you can just run ``certbot-auto`` or ``certbot``, and the
client will guide you through the process of obtaining and installing certs
interactively.
For full command line help, you can type::
./certbot-auto --help all
You can also tell it exactly what you want it to do from the command line.
For instance, if you want to obtain a cert for ``example.com``,
``www.example.com``, and ``other.example.net``, using the Apache plugin to both
obtain and install the certs, you could do this::
./certbot-auto --apache -d example.com -d www.example.com -d other.example.net
(The first time you run the command, it will make an account, and ask for an
email and agreement to the Let's Encrypt Subscriber Agreement; you can
automate those with ``--email`` and ``--agree-tos``)
If you want to use a webserver that doesn't have full plugin support yet, you
can still use "standalone" or "webroot" plugins to obtain a certificate::
./certbot-auto certonly --standalone --email admin@example.com -d example.com -d www.example.com -d other.example.net
Understanding the client in more depth
--------------------------------------
To understand what the client is doing in detail, it's important to
understand the way it uses plugins. Please see the `explanation of
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
the User Guide.
Links
=====
.. Do not modify this comment unless you know what you're doing. tag:links-begin
Documentation: https://certbot.eff.org/docs
Software project: https://github.com/certbot/certbot
Notes for developers: https://certbot.eff.org/docs/contributing.html
Main Website: https://certbot.eff.org
Let's Encrypt Website: https://letsencrypt.org
IRC Channel: #letsencrypt on `Freenode`_
Community: https://community.letsencrypt.org
ACME spec: http://ietf-wg-acme.github.io/acme/
ACME working area in github: https://github.com/ietf-wg-acme/acme
|build-status| |coverage| |docs| |container|
.. _Freenode: https://webchat.freenode.net?channels=%23letsencrypt
.. |build-status| image:: https://travis-ci.org/certbot/certbot.svg?branch=master
:target: https://travis-ci.org/certbot/certbot
:alt: Travis CI status
.. |coverage| image:: https://coveralls.io/repos/certbot/certbot/badge.svg?branch=master
:target: https://coveralls.io/r/certbot/certbot
:alt: Coverage status
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
:target: https://readthedocs.org/projects/letsencrypt/
:alt: Documentation status
.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
:target: https://quay.io/repository/letsencrypt/letsencrypt
:alt: Docker Repository on Quay.io
.. Do not modify this comment unless you know what you're doing. tag:links-end
System Requirements
===================
See https://certbot.eff.org/docs/install.html#system-requirements.
.. Do not modify this comment unless you know what you're doing. tag:intro-end
.. Do not modify this comment unless you know what you're doing. tag:features-begin
Current Features
=====================
* Supports multiple web servers:
- apache/2.x
- nginx/0.8.48+
- webroot (adds files to webroot directories in order to prove control of
domains and obtain certs)
- standalone (runs its own simple webserver to prove you control a domain)
- other server software via `third party plugins <https://certbot.eff.org/docs/using.html#third-party-plugins>`_
* The private key is generated locally on your system.
* Can talk to the Let's Encrypt CA or optionally to other ACME
compliant services.
* Can get domain-validated (DV) certificates.
* Can revoke certificates.
* Adjustable RSA key bit-length (2048 (default), 4096, ...).
* Can optionally install a http -> https redirect, so your site effectively
runs https only (Apache only)
* Fully automated.
* Configuration changes are logged and can be reverted.
* Supports an interactive text UI, or can be driven entirely from the
command line.
* Free and Open Source Software, made with Python.
.. Do not modify this comment unless you know what you're doing. tag:features-end
For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.

View File

@@ -1,8 +1,5 @@
include LICENSE.txt
include README.rst
include pytest.ini
recursive-include docs *
recursive-include examples *
recursive-include tests *
global-exclude __pycache__
global-exclude *.py[cod]
recursive-include acme/testdata *

View File

@@ -1,21 +1,21 @@
"""ACME protocol implementation.
This module is an implementation of the `ACME protocol`_.
This module is an implementation of the `ACME protocol`_. Latest
supported version: `draft-ietf-acme-01`_.
.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme
.. _`draft-ietf-acme-01`:
https://github.com/ietf-wg-acme/acme/tree/draft-ietf-acme-acme-01
"""
import sys
import warnings
# This code exists to keep backwards compatibility with people using acme.jose
# before it became the standalone josepy package.
#
# It is based on
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
import josepy as jose
for mod in list(sys.modules):
# This traversal is apparently necessary such that the identities are
# preserved (acme.jose.* is josepy.*)
if mod == 'josepy' or mod.startswith('josepy.'):
sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]
if sys.version_info[:2] == (3, 3):
warnings.warn(
"Python 3.3 support will be dropped in the next release of "
"acme. Please upgrade your Python version.",
PendingDeprecationWarning,
) #pragma: no cover

View File

@@ -1,6 +1,5 @@
"""ACME Identifier Validation Challenges."""
import abc
import codecs
import functools
import hashlib
import logging
@@ -8,21 +7,21 @@ import socket
from cryptography.hazmat.primitives import hashes # type: ignore
import josepy as jose
import OpenSSL
import requests
import six
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from OpenSSL import crypto
from acme import crypto_util
from acme import errors
from acme import crypto_util
from acme import fields
from acme.mixins import ResourceMixin, TypeMixin
logger = logging.getLogger(__name__)
# pylint: disable=too-few-public-methods
class Challenge(jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
# _fields_to_partial_json | pylint: disable=abstract-method
"""ACME challenge."""
TYPES = {} # type: dict
@@ -35,8 +34,8 @@ class Challenge(jose.TypedJSONObjectWithFields):
return UnrecognizedChallenge.from_json(jobj)
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
class ChallengeResponse(jose.TypedJSONObjectWithFields):
# _fields_to_partial_json | pylint: disable=abstract-method
"""ACME challenge response."""
TYPES = {} # type: dict
resource_type = 'challenge'
@@ -61,7 +60,8 @@ class UnrecognizedChallenge(Challenge):
object.__setattr__(self, "jobj", jobj)
def to_partial_json(self):
return self.jobj # pylint: disable=no-member
# pylint: disable=no-member
return self.jobj
@classmethod
def from_json(cls, jobj):
@@ -94,7 +94,6 @@ class _TokenChallenge(Challenge):
"""
# TODO: check that path combined with uri does not go above
# URI_ROOT_PATH!
# pylint: disable=unsupported-membership-test
return b'..' not in self.token and b'/' not in self.token
@@ -119,7 +118,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
:rtype: bool
"""
parts = self.key_authorization.split('.')
parts = self.key_authorization.split('.') # pylint: disable=no-member
if len(parts) != 2:
logger.debug("Key authorization (%r) is not well formed",
self.key_authorization)
@@ -139,21 +138,17 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
return True
def to_partial_json(self):
jobj = super(KeyAuthorizationChallengeResponse, self).to_partial_json()
jobj.pop('keyAuthorization', None)
return jobj
@six.add_metaclass(abc.ABCMeta)
class KeyAuthorizationChallenge(_TokenChallenge):
# pylint: disable=abstract-class-little-used,too-many-ancestors
"""Challenge based on Key Authorization.
:param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
that will be used to generate ``response``.
:param str typ: type of the challenge
that will be used to generate `response`.
"""
typ = NotImplemented
__metaclass__ = abc.ABCMeta
response_cls = NotImplemented
thumbprint_hash_function = (
KeyAuthorizationChallengeResponse.thumbprint_hash_function)
@@ -178,7 +173,7 @@ class KeyAuthorizationChallenge(_TokenChallenge):
:rtype: KeyAuthorizationChallengeResponse
"""
return self.response_cls( # pylint: disable=not-callable
return self.response_cls(
key_authorization=self.key_authorization(account_key))
@abc.abstractmethod
@@ -215,7 +210,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
"""ACME dns-01 challenge response."""
typ = "dns-01"
def simple_verify(self, chall, domain, account_public_key): # pylint: disable=unused-argument
def simple_verify(self, chall, domain, account_public_key):
"""Simple verify.
This method no longer checks DNS records and is a simple wrapper
@@ -231,13 +226,14 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
:rtype: bool
"""
# pylint: disable=unused-argument
verified = self.verify(chall, account_public_key)
if not verified:
logger.debug("Verification of key authorization in response failed")
return verified
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class DNS01(KeyAuthorizationChallenge):
"""ACME dns-01 challenge."""
response_cls = DNS01Response
@@ -310,7 +306,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
uri = chall.uri(domain)
logger.debug("Verifying %s at %s...", chall.typ, uri)
try:
http_response = requests.get(uri, verify=False)
http_response = requests.get(uri)
except requests.exceptions.RequestException as error:
logger.error("Unable to reach %s: %s", uri, error)
return False
@@ -327,7 +323,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
return True
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class HTTP01(KeyAuthorizationChallenge):
"""ACME http-01 challenge."""
response_cls = HTTP01Response
@@ -368,9 +364,12 @@ class HTTP01(KeyAuthorizationChallenge):
@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""ACME tls-alpn-01 challenge response."""
typ = "tls-alpn-01"
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
"""ACME tls-sni-01 challenge response."""
typ = "tls-sni-01"
DOMAIN_SUFFIX = b".acme.invalid"
"""Domain name suffix."""
PORT = 443
"""Verification port as defined by the protocol.
@@ -380,18 +379,28 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
ACME_TLS_1_PROTOCOL = "acme-tls/1"
@property
def z(self): # pylint: disable=invalid-name
"""``z`` value used for verification.
:rtype bytes:
"""
return hashlib.sha256(
self.key_authorization.encode("utf-8")).hexdigest().lower().encode()
@property
def h(self):
"""Hash value stored in challenge certificate"""
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
def z_domain(self):
"""Domain name used for verification, generated from `z`.
def gen_cert(self, domain, key=None, bits=2048):
"""Generate tls-alpn-01 certificate.
:rtype bytes:
"""
return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX
def gen_cert(self, key=None, bits=2048):
"""Generate tls-sni-01 certificate.
:param unicode domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey key: Optional private key used in
certificate generation. If not provided (``None``), then
fresh key will be generated.
@@ -401,38 +410,33 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
if key is None:
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, bits)
key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
return crypto_util.gen_ss_cert(key, [
# z_domain is too big to fit into CN, hence first dummy domain
'dummy', self.z_domain.decode()], force_san=True), key
def probe_cert(self, domain, **kwargs):
"""Probe tls-sni-01 challenge certificate.
der_value = b"DER:" + codecs.encode(self.h, 'hex')
acme_extension = crypto.X509Extension(self.ID_PE_ACME_IDENTIFIER_V1,
critical=True, value=der_value)
return crypto_util.gen_ss_cert(key, [domain], force_san=True,
extensions=[acme_extension]), key
def probe_cert(self, domain, host=None, port=None):
"""Probe tls-alpn-01 challenge certificate.
:param unicode domain: domain being validated, required.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
:param unicode domain:
"""
if host is None:
# TODO: domain is not necessary if host is provided
if "host" not in kwargs:
host = socket.gethostbyname(domain)
logger.debug('%s resolved to %s', domain, host)
if port is None:
port = self.PORT
kwargs["host"] = host
return crypto_util.probe_sni(host=host, port=port, name=domain,
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
kwargs.setdefault("port", self.PORT)
kwargs["name"] = self.z_domain
# TODO: try different methods?
# pylint: disable=protected-access
return crypto_util.probe_sni(**kwargs)
def verify_cert(self, domain, cert):
"""Verify tls-alpn-01 challenge certificate.
def verify_cert(self, cert):
"""Verify tls-sni-01 challenge certificate.
:param unicode domain: Domain name being validated.
:param OpensSSL.crypto.X509 cert: Challenge certificate.
:returns: Whether the certificate was successfully verified.
@@ -440,40 +444,28 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
# pylint: disable=protected-access
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), names)
if len(names) != 1 or names[0].lower() != domain.lower():
return False
sans = crypto_util._pyopenssl_cert_or_req_san(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), sans)
return self.z_domain.decode() in sans
for i in range(cert.get_extension_count()):
ext = cert.get_extension(i)
# FIXME: assume this is the ACME extension. Currently there is no
# way to get full OID of an unknown extension from pyopenssl.
if ext.get_short_name() == b'UNDEF':
data = ext.get_data()
return data == self.h
return False
# pylint: disable=too-many-arguments
def simple_verify(self, chall, domain, account_public_key,
cert=None, host=None, port=None):
cert=None, **kwargs):
"""Simple verify.
Verify ``validation`` using ``account_public_key``, optionally
probe tls-alpn-01 certificate and check using `verify_cert`.
probe tls-sni-01 certificate and check using `verify_cert`.
:param .challenges.TLSALPN01 chall: Corresponding challenge.
:param .challenges.TLSSNI01 chall: Corresponding challenge.
:param str domain: Domain name being validated.
:param JWK account_public_key:
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
provided (``None``) certificate will be retrieved using
`probe_cert`.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
:returns: ``True`` if and only if client's control of the domain has been verified.
:returns: ``True`` iff client's control of the domain has been
verified.
:rtype: bool
"""
@@ -483,25 +475,27 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
if cert is None:
try:
cert = self.probe_cert(domain=domain, host=host, port=port)
cert = self.probe_cert(domain=domain, **kwargs)
except errors.Error as error:
logger.debug(str(error), exc_info=True)
logger.debug(error, exc_info=True)
return False
return self.verify_cert(domain, cert)
return self.verify_cert(cert)
@Challenge.register # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
"""ACME tls-alpn-01 challenge."""
response_cls = TLSALPN01Response
class TLSSNI01(KeyAuthorizationChallenge):
"""ACME tls-sni-01 challenge."""
response_cls = TLSSNI01Response
typ = response_cls.typ
# boulder#962, ietf-wg-acme#22
#n = jose.Field("n", encoder=int, decoder=int)
def validation(self, account_key, **kwargs):
"""Generate validation.
:param JWK account_key:
:param unicode domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey cert_key: Optional private key used
in certificate generation. If not provided (``None``), then
fresh key will be generated.
@@ -509,26 +503,10 @@ class TLSALPN01(KeyAuthorizationChallenge):
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
return self.response(account_key).gen_cert(
key=kwargs.get('cert_key'),
domain=kwargs.get('domain'))
@staticmethod
def is_supported():
"""
Check if TLS-ALPN-01 challenge is supported on this machine.
This implies that a recent version of OpenSSL is installed (>= 1.0.2),
or a recent cryptography version shipped with the OpenSSL library is installed.
:returns: ``True`` if TLS-ALPN-01 is supported on this machine, ``False`` otherwise.
:rtype: bool
"""
return (hasattr(SSL.Connection, "set_alpn_protos")
and hasattr(SSL.Context, "set_alpn_select_callback"))
return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class DNS(_TokenChallenge):
"""ACME "dns" challenge."""
typ = "dns"

View File

@@ -2,17 +2,14 @@
import unittest
import josepy as jose
import mock
import OpenSSL
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import requests
from six.moves.urllib import parse as urllib_parse
from six.moves.urllib import parse as urllib_parse # pylint: disable=import-error
from acme import errors
import test_util
from acme import test_util
CERT = test_util.load_comparable_cert('cert.pem')
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
@@ -24,6 +21,7 @@ class ChallengeTest(unittest.TestCase):
from acme.challenges import Challenge
from acme.challenges import UnrecognizedChallenge
chall = UnrecognizedChallenge({"type": "foo"})
# pylint: disable=no-member
self.assertEqual(chall, Challenge.from_json(chall.jobj))
@@ -79,6 +77,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
class DNS01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import DNS01Response
@@ -94,8 +93,7 @@ class DNS01ResponseTest(unittest.TestCase):
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
self.assertEqual(self.jmsg, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import DNS01Response
@@ -150,6 +148,7 @@ class DNS01Test(unittest.TestCase):
class HTTP01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import HTTP01Response
@@ -165,8 +164,7 @@ class HTTP01ResponseTest(unittest.TestCase):
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
self.assertEqual(self.jmsg, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import HTTP01Response
@@ -187,7 +185,7 @@ class HTTP01ResponseTest(unittest.TestCase):
mock_get.return_value = mock.MagicMock(text=validation)
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
mock_get.assert_called_once_with(self.chall.uri("local"))
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_bad_validation(self, mock_get):
@@ -203,7 +201,7 @@ class HTTP01ResponseTest(unittest.TestCase):
HTTP01Response.WHITESPACE_CUTSET))
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
mock_get.assert_called_once_with(self.chall.uri("local"))
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_connection_error(self, mock_get):
@@ -259,68 +257,42 @@ class HTTP01Test(unittest.TestCase):
self.msg.update(token=b'..').good_token)
class TLSALPN01ResponseTest(unittest.TestCase):
class TLSSNI01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(
from acme.challenges import TLSSNI01
self.chall = TLSSNI01(
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
self.domain = u'example.com'
self.domain2 = u'example2.com'
self.response = self.chall.response(KEY)
self.jmsg = {
'resource': 'challenge',
'type': 'tls-alpn-01',
'type': 'tls-sni-01',
'keyAuthorization': self.response.key_authorization,
}
# pylint: disable=invalid-name
label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
label2 = b'b7793728f084394f2a1afd459556bb5c'
self.z = label1 + label2
self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
self.domain = 'foo.com'
def test_z_and_domain(self):
self.assertEqual(self.z, self.response.z)
self.assertEqual(self.z_domain, self.response.z_domain)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.response.to_partial_json())
self.assertEqual(self.jmsg, self.response.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01Response
self.assertEqual(self.response, TLSALPN01Response.from_json(self.jmsg))
from acme.challenges import TLSSNI01Response
self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSALPN01Response
hash(TLSALPN01Response.from_json(self.jmsg))
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert(self.domain)
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(self.domain,
test_util.load_cert('cert.pem')))
def test_verify_bad_domain(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertFalse(self.response.verify_cert(self.domain2, cert))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSALPN01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, self.domain, mock.sentinel.cert)
from acme.challenges import TLSSNI01Response
hash(TLSSNI01Response.from_json(self.jmsg))
@mock.patch('acme.challenges.socket.gethostbyname')
@mock.patch('acme.challenges.crypto_util.probe_sni')
@@ -329,29 +301,70 @@ class TLSALPN01ResponseTest(unittest.TestCase):
self.response.probe_cert('foo.com')
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host='127.0.0.1', port=self.response.PORT, name='foo.com',
alpn_protocols=['acme-tls/1'])
host='127.0.0.1', port=self.response.PORT,
name=self.z_domain)
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host='8.8.8.8', port=mock.ANY, name='foo.com',
alpn_protocols=['acme-tls/1'])
host='8.8.8.8', port=mock.ANY, name=mock.ANY)
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
self.response.probe_cert('foo.com', port=1234)
mock_probe_sni.assert_called_with(
host=mock.ANY, port=1234, name=mock.ANY)
self.response.probe_cert('foo.com', bar='baz')
mock_probe_sni.assert_called_with(
host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')
self.response.probe_cert('foo.com', name=b'xxx')
mock_probe_sni.assert_called_with(
host=mock.ANY, port=mock.ANY,
name=self.z_domain)
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert()
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
self.assertTrue(self.response.verify_cert(cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(
test_util.load_cert('cert.pem')))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, mock.sentinel.cert)
@mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
mock_probe_cert.side_effect = errors.Error
self.assertFalse(self.response.simple_verify(
self.chall, self.domain, KEY.public_key()))
class TLSALPN01Test(unittest.TestCase):
class TLSSNI01Test(unittest.TestCase):
def setUp(self):
from acme.challenges import TLSALPN01
self.msg = TLSALPN01(
from acme.challenges import TLSSNI01
self.msg = TLSSNI01(
token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
self.jmsg = {
'type': 'tls-alpn-01',
'type': 'tls-sni-01',
'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
}
@@ -359,26 +372,25 @@ class TLSALPN01Test(unittest.TestCase):
self.assertEqual(self.jmsg, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01
self.assertEqual(self.msg, TLSALPN01.from_json(self.jmsg))
from acme.challenges import TLSSNI01
self.assertEqual(self.msg, TLSSNI01.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSALPN01
hash(TLSALPN01.from_json(self.jmsg))
from acme.challenges import TLSSNI01
hash(TLSSNI01.from_json(self.jmsg))
def test_from_json_invalid_token_length(self):
from acme.challenges import TLSALPN01
from acme.challenges import TLSSNI01
self.jmsg['token'] = jose.encode_b64jose(b'abcd')
self.assertRaises(
jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
jose.DeserializationError, TLSSNI01.from_json, self.jmsg)
@mock.patch('acme.challenges.TLSALPN01Response.gen_cert')
@mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
def test_validation(self, mock_gen_cert):
mock_gen_cert.return_value = ('cert', 'key')
self.assertEqual(('cert', 'key'), self.msg.validation(
KEY, cert_key=mock.sentinel.cert_key, domain=mock.sentinel.domain))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key,
domain=mock.sentinel.domain)
KEY, cert_key=mock.sentinel.cert_key))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)
class DNSTest(unittest.TestCase):
@@ -481,18 +493,5 @@ class DNSResponseTest(unittest.TestCase):
self.msg.check_validation(self.chall, KEY.public_key()))
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_challenge_payload(self):
from acme.challenges import HTTP01Response
challenge_body = HTTP01Response()
challenge_body.le_acme_version = 2
jobj = challenge_body.json_dumps(indent=2).encode()
# RFC8555 states that challenge responses must have an empty payload.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover

File diff suppressed because it is too large.

View File

@@ -1,54 +1,30 @@
"""Tests for acme.client."""
# pylint: disable=too-many-lines
import copy
import datetime
import json
import unittest
import josepy as jose
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import OpenSSL
import requests
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import mock
import requests
from acme import challenges
from acme import errors
from acme import jws as acme_jws
from acme import messages
from acme.mixins import VersionedLEACMEMixin
import messages_test
import test_util
from acme import messages_test
from acme import test_util
CERT_DER = test_util.load_vector('cert.der')
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
CSR_SAN_PEM = test_util.load_vector('csr-san.pem')
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
KEY2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
DIRECTORY_V1 = messages.Directory({
messages.NewRegistration:
'https://www.letsencrypt-demo.org/acme/new-reg',
messages.Revocation:
'https://www.letsencrypt-demo.org/acme/revoke-cert',
messages.NewAuthorization:
'https://www.letsencrypt-demo.org/acme/new-authz',
messages.CertificateRequest:
'https://www.letsencrypt-demo.org/acme/new-cert',
})
DIRECTORY_V2 = messages.Directory({
'newAccount': 'https://www.letsencrypt-demo.org/acme/new-account',
'newNonce': 'https://www.letsencrypt-demo.org/acme/new-nonce',
'newOrder': 'https://www.letsencrypt-demo.org/acme/new-order',
'revokeCert': 'https://www.letsencrypt-demo.org/acme/revoke-cert',
})
class ClientTestBase(unittest.TestCase):
"""Base for tests in acme.client."""
class ClientTest(unittest.TestCase):
"""Tests for acme.client.Client."""
# pylint: disable=too-many-instance-attributes,too-many-public-methods
def setUp(self):
self.response = mock.MagicMock(
@@ -57,6 +33,21 @@ class ClientTestBase(unittest.TestCase):
self.net.post.return_value = self.response
self.net.get.return_value = self.response
self.directory = messages.Directory({
messages.NewRegistration:
'https://www.letsencrypt-demo.org/acme/new-reg',
messages.Revocation:
'https://www.letsencrypt-demo.org/acme/revoke-cert',
messages.NewAuthorization:
'https://www.letsencrypt-demo.org/acme/new-authz',
messages.CertificateRequest:
'https://www.letsencrypt-demo.org/acme/new-cert',
})
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=jose.RS256, net=self.net)
self.identifier = messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value='example.com')
@@ -64,10 +55,10 @@ class ClientTestBase(unittest.TestCase):
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
reg = messages.Registration(
contact=self.contact, key=KEY.public_key())
the_arg = dict(reg) # type: Dict
self.new_reg = messages.NewRegistration(**the_arg)
self.new_reg = messages.NewRegistration(**dict(reg))
self.regr = messages.RegistrationResource(
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1',
terms_of_service='https://www.letsencrypt-demo.org/tos')
# Authorization
authzr_uri = 'https://www.letsencrypt-demo.org/acme/authz/1'
@@ -84,260 +75,14 @@ class ClientTestBase(unittest.TestCase):
self.authzr = messages.AuthorizationResource(
body=self.authz, uri=authzr_uri)
# Reason code for revocation
self.rsn = 1
class BackwardsCompatibleClientV2Test(ClientTestBase):
"""Tests for acme.client.BackwardsCompatibleClientV2."""
def setUp(self):
super(BackwardsCompatibleClientV2Test, self).setUp()
# contains a loaded cert
self.certr = messages.CertificateResource(
body=messages_test.CERT)
loaded = OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_PEM, CERT_SAN_PEM)
wrapped = jose.ComparableX509(loaded)
self.chain = [wrapped, wrapped]
self.cert_pem = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_PEM, messages_test.CERT.wrapped).decode()
single_chain = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_PEM, loaded).decode()
self.chain_pem = single_chain + single_chain
self.fullchain_pem = self.cert_pem + self.chain_pem
self.orderr = messages.OrderResource(
csr_pem=CSR_SAN_PEM)
def _init(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import BackwardsCompatibleClientV2
return BackwardsCompatibleClientV2(net=self.net,
key=KEY, server=uri)
def test_init_downloads_directory(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import BackwardsCompatibleClientV2
BackwardsCompatibleClientV2(net=self.net,
key=KEY, server=uri)
self.net.get.assert_called_once_with(uri)
def test_init_acme_version(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
client = self._init()
self.assertEqual(client.acme_version, 1)
self.response.json.return_value = DIRECTORY_V2.to_json()
client = self._init()
self.assertEqual(client.acme_version, 2)
def test_query_registration_client_v2(self):
self.response.json.return_value = DIRECTORY_V2.to_json()
client = self._init()
self.response.json.return_value = self.regr.body.to_json()
self.assertEqual(self.regr, client.query_registration(self.regr))
def test_forwarding(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
client = self._init()
self.assertEqual(client.directory, client.client.directory)
self.assertEqual(client.key, KEY)
self.assertEqual(client.deactivate_registration, client.client.deactivate_registration)
self.assertRaises(AttributeError, client.__getattr__, 'nonexistent')
self.assertRaises(AttributeError, client.__getattr__, 'new_account_and_tos')
self.assertRaises(AttributeError, client.__getattr__, 'new_account')
def test_new_account_and_tos(self):
# v2 no tos
self.response.json.return_value = DIRECTORY_V2.to_json()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.new_account_and_tos(self.new_reg)
mock_client().new_account.assert_called_with(self.new_reg)
# v2 tos good
with mock.patch('acme.client.ClientV2') as mock_client:
mock_client().directory.meta.__contains__.return_value = True
client = self._init()
client.new_account_and_tos(self.new_reg, lambda x: True)
mock_client().new_account.assert_called_with(
self.new_reg.update(terms_of_service_agreed=True))
# v2 tos bad
with mock.patch('acme.client.ClientV2') as mock_client:
mock_client().directory.meta.__contains__.return_value = True
client = self._init()
def _tos_cb(tos):
raise errors.Error
self.assertRaises(errors.Error, client.new_account_and_tos,
self.new_reg, _tos_cb)
mock_client().new_account.assert_not_called()
# v1 yes tos
self.response.json.return_value = DIRECTORY_V1.to_json()
with mock.patch('acme.client.Client') as mock_client:
regr = mock.MagicMock(terms_of_service="TOS")
mock_client().register.return_value = regr
client = self._init()
client.new_account_and_tos(self.new_reg)
mock_client().register.assert_called_once_with(self.new_reg)
mock_client().agree_to_tos.assert_called_once_with(regr)
# v1 no tos
with mock.patch('acme.client.Client') as mock_client:
regr = mock.MagicMock(terms_of_service=None)
mock_client().register.return_value = regr
client = self._init()
client.new_account_and_tos(self.new_reg)
mock_client().register.assert_called_once_with(self.new_reg)
mock_client().agree_to_tos.assert_not_called()
@mock.patch('OpenSSL.crypto.load_certificate_request')
@mock.patch('acme.crypto_util._pyopenssl_cert_or_req_all_names')
def test_new_order_v1(self, mock__pyopenssl_cert_or_req_all_names,
unused_mock_load_certificate_request):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock__pyopenssl_cert_or_req_all_names.return_value = ['example.com', 'www.example.com']
mock_csr_pem = mock.MagicMock()
with mock.patch('acme.client.Client') as mock_client:
mock_client().request_domain_challenges.return_value = mock.sentinel.auth
client = self._init()
orderr = client.new_order(mock_csr_pem)
self.assertEqual(orderr.authorizations, [mock.sentinel.auth, mock.sentinel.auth])
def test_new_order_v2(self):
self.response.json.return_value = DIRECTORY_V2.to_json()
mock_csr_pem = mock.MagicMock()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.new_order(mock_csr_pem)
mock_client().new_order.assert_called_once_with(mock_csr_pem)
@mock.patch('acme.client.Client')
def test_finalize_order_v1_success(self, mock_client):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock_client().request_issuance.return_value = self.certr
mock_client().fetch_chain.return_value = self.chain
deadline = datetime.datetime(9999, 9, 9)
client = self._init()
result = client.finalize_order(self.orderr, deadline)
self.assertEqual(result.fullchain_pem, self.fullchain_pem)
mock_client().fetch_chain.assert_called_once_with(self.certr)
@mock.patch('acme.client.Client')
def test_finalize_order_v1_fetch_chain_error(self, mock_client):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock_client().request_issuance.return_value = self.certr
mock_client().fetch_chain.return_value = self.chain
mock_client().fetch_chain.side_effect = [errors.Error, self.chain]
deadline = datetime.datetime(9999, 9, 9)
client = self._init()
result = client.finalize_order(self.orderr, deadline)
self.assertEqual(result.fullchain_pem, self.fullchain_pem)
self.assertEqual(mock_client().fetch_chain.call_count, 2)
@mock.patch('acme.client.Client')
def test_finalize_order_v1_timeout(self, mock_client):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock_client().request_issuance.return_value = self.certr
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
client = self._init()
self.assertRaises(errors.TimeoutError, client.finalize_order,
self.orderr, deadline)
def test_finalize_order_v2(self):
self.response.json.return_value = DIRECTORY_V2.to_json()
mock_orderr = mock.MagicMock()
mock_deadline = mock.MagicMock()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.finalize_order(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline, False)
def test_revoke(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
with mock.patch('acme.client.Client') as mock_client:
client = self._init()
client.revoke(messages_test.CERT, self.rsn)
mock_client().revoke.assert_called_once_with(messages_test.CERT, self.rsn)
self.response.json.return_value = DIRECTORY_V2.to_json()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.revoke(messages_test.CERT, self.rsn)
mock_client().revoke.assert_called_once_with(messages_test.CERT, self.rsn)
def test_update_registration(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
with mock.patch('acme.client.Client') as mock_client:
client = self._init()
client.update_registration(mock.sentinel.regr, None)
mock_client().update_registration.assert_called_once_with(mock.sentinel.regr, None)
# newNonce present means it will pick acme_version 2
def test_external_account_required_true(self):
self.response.json.return_value = messages.Directory({
'newNonce': 'http://letsencrypt-test.com/acme/new-nonce',
'meta': messages.Directory.Meta(external_account_required=True),
}).to_json()
client = self._init()
self.assertTrue(client.external_account_required())
# newNonce present means it will pick acme_version 2
def test_external_account_required_false(self):
self.response.json.return_value = messages.Directory({
'newNonce': 'http://letsencrypt-test.com/acme/new-nonce',
'meta': messages.Directory.Meta(external_account_required=False),
}).to_json()
client = self._init()
self.assertFalse(client.external_account_required())
def test_external_account_required_false_v1(self):
self.response.json.return_value = messages.Directory({
'meta': messages.Directory.Meta(external_account_required=False),
}).to_json()
client = self._init()
self.assertFalse(client.external_account_required())
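The two tests above encode the detection rule noted in the comment: a directory that advertises `newNonce` is treated as ACME v2, anything else as v1. A minimal sketch of that rule, using a hypothetical `guess_acme_version` helper rather than the client's real internals:
```
def guess_acme_version(directory_json):
    """Return 2 when the directory advertises a newNonce endpoint, else 1."""
    return 2 if 'newNonce' in directory_json else 1


# Directories shaped like DIRECTORY_V1 / DIRECTORY_V2 above.
assert guess_acme_version({'new-reg': 'https://example.org/acme/new-reg'}) == 1
assert guess_acme_version({'newNonce': 'https://example.org/acme/new-nonce'}) == 2
```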
class ClientTest(ClientTestBase):
"""Tests for acme.client.Client."""
def setUp(self):
super(ClientTest, self).setUp()
self.directory = DIRECTORY_V1
# Registration
self.regr = self.regr.update(
terms_of_service='https://www.letsencrypt-demo.org/tos')
# Request issuance
self.certr = messages.CertificateResource(
body=messages_test.CERT, authzrs=(self.authzr,),
uri='https://www.letsencrypt-demo.org/acme/cert/1',
cert_chain_uri='https://www.letsencrypt-demo.org/ca')
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=jose.RS256, net=self.net)
# Reason code for revocation
self.rsn = 1
def test_init_downloads_directory(self):
uri = 'http://www.letsencrypt-demo.org/directory'
@@ -346,18 +91,9 @@ class ClientTest(ClientTestBase):
directory=uri, key=KEY, alg=jose.RS256, net=self.net)
self.net.get.assert_called_once_with(uri)
@mock.patch('acme.client.ClientNetwork')
def test_init_without_net(self, mock_net):
mock_net.return_value = mock.sentinel.net
alg = jose.RS256
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=alg)
mock_net.called_once_with(KEY, alg=alg, verify_ssl=True)
self.assertEqual(self.client.net, mock.sentinel.net)
def test_register(self):
# "Instance of 'Field' has no to_json/update member" bug:
# pylint: disable=no-member
self.response.status_code = http_client.CREATED
self.response.json.return_value = self.regr.body.to_json()
self.response.headers['Location'] = self.regr.uri
@@ -370,6 +106,7 @@ class ClientTest(ClientTestBase):
def test_update_registration(self):
# "Instance of 'Field' has no to_json/update member" bug:
# pylint: disable=no-member
self.response.headers['Location'] = self.regr.uri
self.response.json.return_value = self.regr.body.to_json()
self.assertEqual(self.regr, self.client.update_registration(self.regr))
@@ -405,23 +142,20 @@ class ClientTest(ClientTestBase):
self.client.request_challenges(self.identifier)
self.net.post.assert_called_once_with(
self.directory.new_authz,
messages.NewAuthorization(identifier=self.identifier),
acme_version=1)
messages.NewAuthorization(identifier=self.identifier))
def test_request_challenges_deprecated_arg(self):
self._prepare_response_for_request_challenges()
self.client.request_challenges(self.identifier, new_authzr_uri="hi")
self.net.post.assert_called_once_with(
self.directory.new_authz,
messages.NewAuthorization(identifier=self.identifier),
acme_version=1)
messages.NewAuthorization(identifier=self.identifier))
def test_request_challenges_custom_uri(self):
self._prepare_response_for_request_challenges()
self.client.request_challenges(self.identifier)
self.net.post.assert_called_once_with(
'https://www.letsencrypt-demo.org/acme/new-authz', mock.ANY,
acme_version=1)
'https://www.letsencrypt-demo.org/acme/new-authz', mock.ANY)
def test_request_challenges_unexpected_update(self):
self._prepare_response_for_request_challenges()
@@ -431,13 +165,6 @@ class ClientTest(ClientTestBase):
errors.UnexpectedUpdate, self.client.request_challenges,
self.identifier)
def test_request_challenges_wildcard(self):
wildcard_identifier = messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value='*.example.org')
self.assertRaises(
errors.WildcardUnsupportedError, self.client.request_challenges,
wildcard_identifier)
def test_request_domain_challenges(self):
self.client.request_challenges = mock.MagicMock()
self.assertEqual(
@@ -637,14 +364,6 @@ class ClientTest(ClientTestBase):
errors.PollError, self.client.poll_and_request_issuance,
csr, authzrs, mintime=mintime, max_attempts=2)
def test_deactivate_authorization(self):
authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
self.response.json.return_value = authzb.to_json()
authzr = self.client.deactivate_authorization(self.authzr)
self.assertEqual(authzb, authzr.body)
self.assertEqual(self.client.net.post.call_count, 1)
self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])
def test_check_cert(self):
self.response.headers['Location'] = self.certr.uri
self.response.content = CERT_DER
@@ -698,12 +417,12 @@ class ClientTest(ClientTestBase):
def test_revoke(self):
self.client.revoke(self.certr.body, self.rsn)
self.net.post.assert_called_once_with(
self.directory[messages.Revocation], mock.ANY, acme_version=1)
self.directory[messages.Revocation], mock.ANY, content_type=None)
def test_revocation_payload(self):
obj = messages.Revocation(certificate=self.certr.body, reason=self.rsn)
self.assertTrue('reason' in obj.to_partial_json().keys())
self.assertEqual(self.rsn, obj.to_partial_json()['reason'])
self.assertEquals(self.rsn, obj.to_partial_json()['reason'])
def test_revoke_bad_status_raises_error(self):
self.response.status_code = http_client.METHOD_NOT_ALLOWED
@@ -714,220 +433,6 @@ class ClientTest(ClientTestBase):
self.rsn)
class ClientV2Test(ClientTestBase):
"""Tests for acme.client.ClientV2."""
def setUp(self):
super(ClientV2Test, self).setUp()
self.directory = DIRECTORY_V2
from acme.client import ClientV2
self.client = ClientV2(self.directory, self.net)
self.new_reg = self.new_reg.update(terms_of_service_agreed=True)
self.authzr_uri2 = 'https://www.letsencrypt-demo.org/acme/authz/2'
self.authz2 = self.authz.update(identifier=messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value='www.example.com'),
status=messages.STATUS_PENDING)
self.authzr2 = messages.AuthorizationResource(
body=self.authz2, uri=self.authzr_uri2)
self.order = messages.Order(
identifiers=(self.authz.identifier, self.authz2.identifier),
status=messages.STATUS_PENDING,
authorizations=(self.authzr.uri, self.authzr_uri2),
finalize='https://www.letsencrypt-demo.org/acme/acct/1/order/1/finalize')
self.orderr = messages.OrderResource(
body=self.order,
uri='https://www.letsencrypt-demo.org/acme/acct/1/order/1',
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_SAN_PEM)
def test_new_account(self):
self.response.status_code = http_client.CREATED
self.response.json.return_value = self.regr.body.to_json()
self.response.headers['Location'] = self.regr.uri
self.assertEqual(self.regr, self.client.new_account(self.new_reg))
def test_new_account_conflict(self):
self.response.status_code = http_client.OK
self.response.headers['Location'] = self.regr.uri
self.assertRaises(errors.ConflictError, self.client.new_account, self.new_reg)
def test_new_order(self):
order_response = copy.deepcopy(self.response)
order_response.status_code = http_client.CREATED
order_response.json.return_value = self.order.to_json()
order_response.headers['Location'] = self.orderr.uri
self.net.post.return_value = order_response
authz_response = copy.deepcopy(self.response)
authz_response.json.return_value = self.authz.to_json()
authz_response.headers['Location'] = self.authzr.uri
authz_response2 = self.response
authz_response2.json.return_value = self.authz2.to_json()
authz_response2.headers['Location'] = self.authzr2.uri
with mock.patch('acme.client.ClientV2._post_as_get') as mock_post_as_get:
mock_post_as_get.side_effect = (authz_response, authz_response2)
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
@mock.patch('acme.client.datetime')
def test_poll_and_finalize(self, mock_datetime):
mock_datetime.datetime.now.return_value = datetime.datetime(2018, 2, 15)
mock_datetime.timedelta = datetime.timedelta
expected_deadline = mock_datetime.datetime.now() + datetime.timedelta(seconds=90)
self.client.poll_authorizations = mock.Mock(return_value=self.orderr)
self.client.finalize_order = mock.Mock(return_value=self.orderr)
self.assertEqual(self.client.poll_and_finalize(self.orderr), self.orderr)
self.client.poll_authorizations.assert_called_once_with(self.orderr, expected_deadline)
self.client.finalize_order.assert_called_once_with(self.orderr, expected_deadline)
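test_poll_and_finalize pins down the ordering: poll the order's authorizations against a deadline, then finalize. A hedged usage sketch of that flow, assuming `net` is a ClientNetwork already holding a registered account and `directory` is a v2 `messages.Directory`:
```
from acme.client import ClientV2

def issue(net, directory, csr_pem):
    """Order a certificate for a CSR whose identifiers are already authorized (sketch)."""
    acme = ClientV2(directory, net)
    orderr = acme.new_order(csr_pem)            # creates the order and retrieves its authorizations
    finalized = acme.poll_and_finalize(orderr)  # poll_authorizations(...) then finalize_order(...)
    return finalized.fullchain_pem              # PEM chain fetched during finalization
```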
@mock.patch('acme.client.datetime')
def test_poll_authorizations_timeout(self, mock_datetime):
now_side_effect = [datetime.datetime(2018, 2, 15),
datetime.datetime(2018, 2, 16),
datetime.datetime(2018, 2, 17)]
mock_datetime.datetime.now.side_effect = now_side_effect
self.response.json.side_effect = [
self.authz.to_json(), self.authz2.to_json(), self.authz2.to_json()]
self.assertRaises(
errors.TimeoutError, self.client.poll_authorizations, self.orderr, now_side_effect[1])
def test_poll_authorizations_failure(self):
deadline = datetime.datetime(9999, 9, 9)
challb = self.challr.body.update(status=messages.STATUS_INVALID,
error=messages.Error.with_code('unauthorized'))
authz = self.authz.update(status=messages.STATUS_INVALID, challenges=(challb,))
self.response.json.return_value = authz.to_json()
self.assertRaises(
errors.ValidationError, self.client.poll_authorizations, self.orderr, deadline)
def test_poll_authorizations_success(self):
deadline = datetime.datetime(9999, 9, 9)
updated_authz2 = self.authz2.update(status=messages.STATUS_VALID)
updated_authzr2 = messages.AuthorizationResource(
body=updated_authz2, uri=self.authzr_uri2)
updated_orderr = self.orderr.update(authorizations=[self.authzr, updated_authzr2])
self.response.json.side_effect = (
self.authz.to_json(), self.authz2.to_json(), updated_authz2.to_json())
self.assertEqual(self.client.poll_authorizations(self.orderr, deadline), updated_orderr)
def test_finalize_order_success(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/')
updated_orderr = self.orderr.update(body=updated_order, fullchain_pem=CERT_SAN_PEM)
self.response.json.return_value = updated_order.to_json()
self.response.text = CERT_SAN_PEM
deadline = datetime.datetime(9999, 9, 9)
self.assertEqual(self.client.finalize_order(self.orderr, deadline), updated_orderr)
def test_finalize_order_error(self):
updated_order = self.order.update(error=messages.Error.with_code('unauthorized'))
self.response.json.return_value = updated_order.to_json()
deadline = datetime.datetime(9999, 9, 9)
self.assertRaises(errors.IssuanceError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_timeout(self):
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
self.assertRaises(errors.TimeoutError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_alt_chains(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/',
)
updated_orderr = self.orderr.update(body=updated_order,
fullchain_pem=CERT_SAN_PEM,
alternative_fullchains_pem=[CERT_SAN_PEM,
CERT_SAN_PEM])
self.response.json.return_value = updated_order.to_json()
self.response.text = CERT_SAN_PEM
self.response.headers['Link'] ='<https://example.com/acme/cert/1>;rel="alternate", ' + \
'<https://example.com/dir>;rel="index", ' + \
'<https://example.com/acme/cert/2>;title="foo";rel="alternate"'
deadline = datetime.datetime(9999, 9, 9)
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.net.post.assert_any_call('https://example.com/acme/cert/1',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.net.post.assert_any_call('https://example.com/acme/cert/2',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.assertEqual(resp, updated_orderr)
del self.response.headers['Link']
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.assertEqual(resp, updated_orderr.update(alternative_fullchains_pem=[]))
def test_revoke(self):
self.client.revoke(messages_test.CERT, self.rsn)
self.net.post.assert_called_once_with(
self.directory["revokeCert"], mock.ANY, acme_version=2,
new_nonce_url=DIRECTORY_V2['newNonce'])
def test_update_registration(self):
# "Instance of 'Field' has no to_json/update member" bug:
self.response.headers['Location'] = self.regr.uri
self.response.json.return_value = self.regr.body.to_json()
self.assertEqual(self.regr, self.client.update_registration(self.regr))
self.assertNotEqual(self.client.net.account, None)
self.assertEqual(self.client.net.post.call_count, 2)
self.assertTrue(DIRECTORY_V2.newAccount in self.net.post.call_args_list[0][0])
self.response.json.return_value = self.regr.body.update(
contact=()).to_json()
def test_external_account_required_true(self):
self.client.directory = messages.Directory({
'meta': messages.Directory.Meta(external_account_required=True)
})
self.assertTrue(self.client.external_account_required())
def test_external_account_required_false(self):
self.client.directory = messages.Directory({
'meta': messages.Directory.Meta(external_account_required=False)
})
self.assertFalse(self.client.external_account_required())
def test_external_account_required_default(self):
self.assertFalse(self.client.external_account_required())
def test_post_as_get(self):
with mock.patch('acme.client.ClientV2._authzr_from_response') as mock_client:
mock_client.return_value = self.authzr2
self.client.poll(self.authzr2) # pylint: disable=protected-access
self.client.net.post.assert_called_once_with(
self.authzr2.uri, None, acme_version=2,
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
self.client.net.get.assert_not_called()
class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
def to_partial_json(self):
return {'foo': self.value}
@classmethod
def from_json(cls, jobj):
pass # pragma: no cover
class ClientNetworkTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork."""
@@ -948,25 +453,24 @@ class ClientNetworkTest(unittest.TestCase):
self.assertTrue(self.net.verify_ssl is self.verify_ssl)
def test_wrap_in_jws(self):
# pylint: disable=protected-access
jws_dump = self.net._wrap_in_jws(
MockJSONDeSerializable('foo'), nonce=b'Tg', url="url",
acme_version=1)
jws = acme_jws.JWS.json_loads(jws_dump)
self.assertEqual(json.loads(jws.payload.decode()), {'foo': 'foo'})
self.assertEqual(jws.signature.combined.nonce, b'Tg')
class MockJSONDeSerializable(jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
def to_partial_json(self):
return {'foo': self.value}
@classmethod
def from_json(cls, value):
pass # pragma: no cover
def test_wrap_in_jws_v2(self):
self.net.account = {'uri': 'acct-uri'}
# pylint: disable=protected-access
jws_dump = self.net._wrap_in_jws(
MockJSONDeSerializable('foo'), nonce=b'Tg', url="url",
acme_version=2)
MockJSONDeSerializable('foo'), nonce=b'Tg')
jws = acme_jws.JWS.json_loads(jws_dump)
self.assertEqual(json.loads(jws.payload.decode()), {'foo': 'foo'})
self.assertEqual(jws.signature.combined.nonce, b'Tg')
self.assertEqual(jws.signature.combined.kid, u'acct-uri')
self.assertEqual(jws.signature.combined.url, u'url')
def test_check_response_not_ok_jobj_no_error(self):
self.response.ok = False
@@ -979,8 +483,8 @@ class ClientNetworkTest(unittest.TestCase):
def test_check_response_not_ok_jobj_error(self):
self.response.ok = False
self.response.json.return_value = messages.Error.with_code(
'serverInternal', detail='foo', title='some title').to_json()
self.response.json.return_value = messages.Error(
detail='foo', typ='serverInternal', title='some title').to_json()
# pylint: disable=protected-access
self.assertRaises(
messages.Error, self.net._check_response, self.response)
@@ -1005,39 +509,10 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.side_effect = ValueError
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access
# pylint: disable=protected-access,no-value-for-parameter
self.assertEqual(
self.response, self.net._check_response(self.response))
@mock.patch('acme.client.logger')
def test_check_response_ok_ct_with_charset(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'application/json; charset=utf-8'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
try:
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'application/json; charset=utf-8'
)
except AssertionError:
return
raise AssertionError('Expected Content-Type warning ' #pragma: no cover
'to not have been logged')
@mock.patch('acme.client.logger')
def test_check_response_ok_bad_ct(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'text/plain'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'text/plain'
)
def test_check_response_conflict(self):
self.response.ok = False
self.response.status_code = 409
@@ -1048,7 +523,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.return_value = {}
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access
# pylint: disable=protected-access,no-value-for-parameter
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -1159,11 +634,12 @@ class ClientNetworkTest(unittest.TestCase):
# Requests Library Exceptions
except requests.exceptions.ConnectionError as z: #pragma: no cover
self.assertTrue("'Connection aborted.'" in str(z) or "[WinError 10061]" in str(z))
self.assertEqual("('Connection aborted.', "
"error(111, 'Connection refused'))", str(z))
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork which mock out response."""
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.client import ClientNetwork
@@ -1172,10 +648,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.response = mock.MagicMock(ok=True, status_code=http_client.OK)
self.response.headers = {}
self.response.links = {}
self.response.checked = False
self.acmev1_nonce_response = mock.MagicMock(
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response.headers = {}
self.checked_response = mock.MagicMock()
self.obj = mock.MagicMock()
self.wrapped_obj = mock.MagicMock()
self.content_type = mock.sentinel.content_type
@@ -1187,21 +660,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
def send_request(*args, **kwargs):
# pylint: disable=unused-argument,missing-docstring
self.assertFalse("new_nonce_url" in kwargs)
method = args[0]
uri = args[1]
if method == 'HEAD' and uri != "new_nonce_uri":
response = self.acmev1_nonce_response
else:
response = self.response
if self.available_nonces:
response.headers = {
self.response.headers = {
self.net.REPLAY_NONCE_HEADER:
self.available_nonces.pop().decode()}
else:
response.headers = {}
return response
self.response.headers = {}
return self.response
# pylint: disable=protected-access
self.net._send_request = self.send_request = mock.MagicMock(
@@ -1213,47 +678,36 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
# pylint: disable=missing-docstring
self.assertEqual(self.response, response)
self.assertEqual(self.content_type, content_type)
self.assertTrue(self.response.ok)
self.response.checked = True
return self.response
return self.checked_response
def test_head(self):
self.assertEqual(self.acmev1_nonce_response, self.net.head(
self.assertEqual(self.response, self.net.head(
'http://example.com/', 'foo', bar='baz'))
self.send_request.assert_called_once_with(
'HEAD', 'http://example.com/', 'foo', bar='baz')
def test_head_v2(self):
self.assertEqual(self.response, self.net.head(
'new_nonce_uri', 'foo', bar='baz'))
self.send_request.assert_called_once_with(
'HEAD', 'new_nonce_uri', 'foo', bar='baz')
def test_get(self):
self.assertEqual(self.response, self.net.get(
self.assertEqual(self.checked_response, self.net.get(
'http://example.com/', content_type=self.content_type, bar='baz'))
self.assertTrue(self.response.checked)
self.send_request.assert_called_once_with(
'GET', 'http://example.com/', bar='baz')
def test_post_no_content_type(self):
self.content_type = self.net.JOSE_CONTENT_TYPE
self.assertEqual(self.response, self.net.post('uri', self.obj))
self.assertTrue(self.response.checked)
self.assertEqual(self.checked_response, self.net.post('uri', self.obj))
def test_post(self):
# pylint: disable=protected-access
self.assertEqual(self.response, self.net.post(
self.assertEqual(self.checked_response, self.net.post(
'uri', self.obj, content_type=self.content_type))
self.assertTrue(self.response.checked)
self.net._wrap_in_jws.assert_called_once_with(
self.obj, jose.b64decode(self.all_nonces.pop()), "uri", 1)
self.obj, jose.b64decode(self.all_nonces.pop()))
self.available_nonces = []
self.assertRaises(errors.MissingNonce, self.net.post,
'uri', self.obj, content_type=self.content_type)
self.net._wrap_in_jws.assert_called_with(
self.obj, jose.b64decode(self.all_nonces.pop()), "uri", 1)
self.obj, jose.b64decode(self.all_nonces.pop()))
def test_post_wrong_initial_nonce(self): # HEAD
self.available_nonces = [b'f', jose.b64encode(b'good')]
@@ -1277,7 +731,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
def test_post_not_retried(self):
check_response = mock.MagicMock()
check_response.side_effect = [messages.Error.with_code('malformed'),
self.response]
self.checked_response]
# pylint: disable=protected-access
self.net._check_response = check_response
@@ -1285,12 +739,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.obj, content_type=self.content_type)
def test_post_successful_retry(self):
post_once = mock.MagicMock()
post_once.side_effect = [messages.Error.with_code('badNonce'),
self.response]
check_response = mock.MagicMock()
check_response.side_effect = [messages.Error.with_code('badNonce'),
self.checked_response]
# pylint: disable=protected-access
self.assertEqual(self.response, self.net.post(
self.net._check_response = check_response
self.assertEqual(self.checked_response, self.net.post(
'uri', self.obj, content_type=self.content_type))
def test_head_get_post_error_passthrough(self):
@@ -1301,51 +756,6 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.assertRaises(requests.exceptions.RequestException,
self.net.post, 'uri', obj=self.obj)
def test_post_bad_nonce_head(self):
# pylint: disable=protected-access
# regression test for https://github.com/certbot/certbot/issues/6092
bad_response = mock.MagicMock(ok=False, status_code=http_client.SERVICE_UNAVAILABLE)
self.net._send_request = mock.MagicMock()
self.net._send_request.return_value = bad_response
self.content_type = None
check_response = mock.MagicMock()
self.net._check_response = check_response
self.assertRaises(errors.ClientError, self.net.post, 'uri',
self.obj, content_type=self.content_type, acme_version=2,
new_nonce_url='new_nonce_uri')
self.assertEqual(check_response.call_count, 1)
def test_new_nonce_uri_removed(self):
self.content_type = None
self.net.post('uri', self.obj, content_type=None,
acme_version=2, new_nonce_url='new_nonce_uri')
class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
"""Tests that if ClientNetwork has a source IP set manually, the underlying library has
used the provided source address."""
def setUp(self):
self.source_address = "8.8.8.8"
def test_source_address_set(self):
from acme.client import ClientNetwork
net = ClientNetwork(key=None, alg=None, source_address=self.source_address)
for adapter in net.session.adapters.values():
self.assertTrue(self.source_address in adapter.source_address)
def test_behavior_assumption(self):
"""This is a test that guardrails the HTTPAdapter behavior so that if the default for
a Session() changes, the assumptions here aren't violated silently."""
from acme.client import ClientNetwork
# Source address not specified, so the default adapter type should be bound -- this
# test should fail if the default adapter type is changed by requests
net = ClientNetwork(key=None, alg=None)
session = requests.Session()
for scheme in session.adapters:
client_network_adapter = net.session.adapters.get(scheme)
default_adapter = session.adapters.get(scheme)
self.assertEqual(client_network_adapter.__class__, default_adapter.__class__)
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -5,35 +5,26 @@ import logging
import os
import re
import socket
import sys
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
import OpenSSL
from acme import errors
from acme.magic_typing import Callable
from acme.magic_typing import Tuple
from acme.magic_typing import Union
logger = logging.getLogger(__name__)
# Default SSL method selected here is the most compatible, while secure
# SSL method: TLSv1_METHOD is only compatible with
# TLSSNI01 certificate serving and probing is not affected by SSL
# vulnerabilities: prober needs to check certificate for expected
# contents anyway. Working SNI is the only thing that's necessary for
# the challenge and thus scoping down SSL/TLS method (version) would
# cause interoperability issues: TLSv1_METHOD is only compatible with
# TLSv1_METHOD, while SSLv23_METHOD is compatible with all other
# methods, including TLSv2_METHOD (read more at
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
# in case it's used for things other than probing/serving!
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
class _DefaultCertSelection(object):
def __init__(self, certs):
self.certs = certs
def __call__(self, connection):
server_name = connection.get_servername()
return self.certs.get(server_name, None)
_DEFAULT_TLSSNI01_SSL_METHOD = OpenSSL.SSL.SSLv23_METHOD # type: ignore
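The comment above recommends keeping the permissive SSLv23_METHOD while explicitly refusing SSLv2/SSLv3 via set_options, which is what the serving code further down does. The same pattern in isolation, as a plain pyOpenSSL sketch:
```
from OpenSSL import SSL

def hardened_context(method=SSL.SSLv23_METHOD):
    """SSLv23_METHOD negotiates any TLS version; explicitly refuse SSLv2 and SSLv3."""
    context = SSL.Context(method)
    context.set_options(SSL.OP_NO_SSLv2)
    context.set_options(SSL.OP_NO_SSLv3)
    return context
```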
class SSLSocket(object): # pylint: disable=too-few-public-methods
@@ -43,25 +34,12 @@ class SSLSocket(object): # pylint: disable=too-few-public-methods
:ivar dict certs: Mapping from domain names (`bytes`) to
`OpenSSL.crypto.X509`.
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
:ivar alpn_selection: Hook to select negotiated ALPN protocol for
connection.
:ivar cert_selection: Hook to select certificate for connection. If given,
`certs` parameter would be ignored, and therefore must be empty.
"""
def __init__(self, sock, certs=None,
method=_DEFAULT_SSL_METHOD, alpn_selection=None,
cert_selection=None):
def __init__(self, sock, certs, method=_DEFAULT_TLSSNI01_SSL_METHOD):
self.sock = sock
self.alpn_selection = alpn_selection
self.certs = certs
self.method = method
if not cert_selection and not certs:
raise ValueError("Neither cert_selection or certs specified.")
if cert_selection and certs:
raise ValueError("Both cert_selection and certs specified.")
if cert_selection is None:
cert_selection = _DefaultCertSelection(certs)
self.cert_selection = cert_selection
def __getattr__(self, name):
return getattr(self.sock, name)
@@ -78,25 +56,24 @@ class SSLSocket(object): # pylint: disable=too-few-public-methods
:type connection: :class:`OpenSSL.Connection`
"""
pair = self.cert_selection(connection)
if pair is None:
logger.debug("Certificate selection for server name %s failed, dropping SSL",
connection.get_servername())
server_name = connection.get_servername()
try:
key, cert = self.certs[server_name]
except KeyError:
logger.debug("Server name (%s) not recognized, dropping SSL",
server_name)
return
key, cert = pair
new_context = SSL.Context(self.method)
new_context.set_options(SSL.OP_NO_SSLv2)
new_context.set_options(SSL.OP_NO_SSLv3)
new_context = OpenSSL.SSL.Context(self.method)
new_context.set_options(OpenSSL.SSL.OP_NO_SSLv2)
new_context.set_options(OpenSSL.SSL.OP_NO_SSLv3)
new_context.use_privatekey(key)
new_context.use_certificate(cert)
if self.alpn_selection is not None:
new_context.set_alpn_select_callback(self.alpn_selection)
connection.set_context(new_context)
class FakeConnection(object):
"""Fake OpenSSL.SSL.Connection."""
# pylint: disable=missing-function-docstring
# pylint: disable=too-few-public-methods,missing-docstring
def __init__(self, connection):
self._wrapped = connection
@@ -108,23 +85,21 @@ class SSLSocket(object): # pylint: disable=too-few-public-methods
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
return self._wrapped.shutdown()
def accept(self): # pylint: disable=missing-function-docstring
def accept(self): # pylint: disable=missing-docstring
sock, addr = self.sock.accept()
context = SSL.Context(self.method)
context.set_options(SSL.OP_NO_SSLv2)
context.set_options(SSL.OP_NO_SSLv3)
context = OpenSSL.SSL.Context(self.method)
context.set_options(OpenSSL.SSL.OP_NO_SSLv2)
context.set_options(OpenSSL.SSL.OP_NO_SSLv3)
context.set_tlsext_servername_callback(self._pick_certificate_cb)
if self.alpn_selection is not None:
context.set_alpn_select_callback(self.alpn_selection)
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock = self.FakeConnection(OpenSSL.SSL.Connection(context, sock))
ssl_sock.set_accept_state()
logger.debug("Performing handshake with %s", addr)
try:
ssl_sock.do_handshake()
except SSL.Error as error:
except OpenSSL.SSL.Error as error:
# _pick_certificate_cb might have returned without
# creating SSL context (wrong server name)
raise socket.error(error)
@@ -132,9 +107,8 @@ class SSLSocket(object): # pylint: disable=too-few-public-methods
return ssl_sock, addr
def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-arguments
method=_DEFAULT_SSL_METHOD, source_address=('', 0),
alpn_protocols=None):
def probe_sni(name, host, port=443, timeout=300,
method=_DEFAULT_TLSSNI01_SSL_METHOD, source_address=('', 0)):
"""Probe SNI server for SSL certificate.
:param bytes name: Byte string to send as the server name in the
@@ -146,8 +120,6 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
:param tuple source_address: Enables multi-path probing (selection
of source interface). See `socket.create_connection` for more
info. Available only in Python 2.7+.
:param alpn_protocols: Protocols to request using ALPN.
:type alpn_protocols: `list` of `bytes`
:raises acme.errors.Error: In case of any problems.
@@ -155,38 +127,34 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
:rtype: OpenSSL.crypto.X509
"""
context = SSL.Context(method)
context = OpenSSL.SSL.Context(method)
context.set_timeout(timeout)
socket_kwargs = {'source_address': source_address}
socket_kwargs = {} if sys.version_info < (2, 7) else {
'source_address': source_address}
host_protocol_agnostic = None if host == '::' or host == '0' else host
try:
logger.debug(
"Attempting to connect to %s:%d%s.", host, port,
" from {0}:{1}".format(
source_address[0],
source_address[1]
) if any(source_address) else ""
)
socket_tuple = (host, port) # type: Tuple[str, int]
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore
# pylint: disable=star-args
logger.debug("Attempting to connect to %s:%d%s.", host_protocol_agnostic, port,
" from {0}:{1}".format(source_address[0], source_address[1]) if \
socket_kwargs else "")
sock = socket.create_connection((host_protocol_agnostic, port), **socket_kwargs)
except socket.error as error:
raise errors.Error(error)
with contextlib.closing(sock) as client:
client_ssl = SSL.Connection(context, client)
client_ssl = OpenSSL.SSL.Connection(context, client)
client_ssl.set_connect_state()
client_ssl.set_tlsext_host_name(name) # pyOpenSSL>=0.13
if alpn_protocols is not None:
client_ssl.set_alpn_protos(alpn_protocols)
try:
client_ssl.do_handshake()
client_ssl.shutdown()
except SSL.Error as error:
except OpenSSL.SSL.Error as error:
raise errors.Error(error)
return client_ssl.get_peer_certificate()
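As SSLSocketAndProbeSNITest further down exercises it, probe_sni is pointed at a host and port and returns the `OpenSSL.crypto.X509` the server presented for the given SNI name. A usage sketch with placeholder host and port:
```
import josepy as jose
from acme.crypto_util import probe_sni

# Fetch the certificate a server presents for the SNI name b'example.com'.
cert = probe_sni(b'example.com', host='127.0.0.1', port=443, timeout=30)
comparable = jose.ComparableX509(cert)  # wrap for equality checks, as the tests do
```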
def make_csr(private_key_pem, domains, must_staple=False):
"""Generate a CSR containing a list of domains as subjectAltNames.
@@ -196,18 +164,18 @@ def make_csr(private_key_pem, domains, must_staple=False):
OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
:returns: buffer PEM-encoded Certificate Signing Request.
"""
private_key = crypto.load_privatekey(
crypto.FILETYPE_PEM, private_key_pem)
csr = crypto.X509Req()
private_key = OpenSSL.crypto.load_privatekey(
OpenSSL.crypto.FILETYPE_PEM, private_key_pem)
csr = OpenSSL.crypto.X509Req()
extensions = [
crypto.X509Extension(
OpenSSL.crypto.X509Extension(
b'subjectAltName',
critical=False,
value=', '.join('DNS:' + d for d in domains).encode('ascii')
),
]
if must_staple:
extensions.append(crypto.X509Extension(
extensions.append(OpenSSL.crypto.X509Extension(
b"1.3.6.1.5.5.7.1.24",
critical=False,
value=b"DER:30:03:02:01:05"))
@@ -215,18 +183,8 @@ def make_csr(private_key_pem, domains, must_staple=False):
csr.set_pubkey(private_key)
csr.set_version(2)
csr.sign(private_key, 'sha256')
return crypto.dump_certificate_request(
crypto.FILETYPE_PEM, csr)
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
common_name = loaded_cert_or_req.get_subject().CN
sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
if common_name is None:
return sans
return [common_name] + [d for d in sans if d != common_name]
return OpenSSL.crypto.dump_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr)
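A short usage sketch for make_csr, assuming a fresh RSA key generated with pyOpenSSL (key size and domain names are placeholders):
```
import OpenSSL
from acme.crypto_util import make_csr

key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
key_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, key)

# PEM-encoded CSR with both names in subjectAltName; must_staple adds the TLS Feature extension.
csr_pem = make_csr(key_pem, ['example.com', 'www.example.com'], must_staple=False)
```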
def _pyopenssl_cert_or_req_san(cert_or_req):
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
@@ -254,12 +212,11 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
parts_separator = ", "
prefix = "DNS" + part_separator
if isinstance(cert_or_req, crypto.X509):
# pylint: disable=line-too-long
func = crypto.dump_certificate # type: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]]
if isinstance(cert_or_req, OpenSSL.crypto.X509):
func = OpenSSL.crypto.dump_certificate
else:
func = crypto.dump_certificate_request
text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
func = OpenSSL.crypto.dump_certificate_request
text = func(OpenSSL.crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
# WARNING: this function does not support multiple SANs extensions.
# Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
@@ -272,14 +229,12 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
def gen_ss_cert(key, domains, not_before=None,
validity=(7 * 24 * 60 * 60), force_san=True, extensions=None):
validity=(7 * 24 * 60 * 60), force_san=True):
"""Generate new self-signed certificate.
:type domains: `list` of `unicode`
:param OpenSSL.crypto.PKey key:
:param bool force_san:
:param extensions: List of additional extensions to include in the cert.
:type extensions: `list` of `OpenSSL.crypto.X509Extension`
If more than one domain is provided, all of the domains are put into
``subjectAltName`` X.509 extension and first domain is set as the
@@ -288,24 +243,21 @@ def gen_ss_cert(key, domains, not_before=None,
"""
assert domains, "Must provide one or more hostnames for the cert."
cert = crypto.X509()
cert = OpenSSL.crypto.X509()
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
cert.set_version(2)
if extensions is None:
extensions = []
extensions.append(
crypto.X509Extension(
extensions = [
OpenSSL.crypto.X509Extension(
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
)
]
cert.get_subject().CN = domains[0]
# TODO: what to put into cert.get_subject()?
cert.set_issuer(cert.get_subject())
if force_san or len(domains) > 1:
extensions.append(crypto.X509Extension(
extensions.append(OpenSSL.crypto.X509Extension(
b"subjectAltName",
critical=False,
value=b", ".join(b"DNS:" + d.encode() for d in domains)
@@ -319,26 +271,3 @@ def gen_ss_cert(key, domains, not_before=None,
cert.set_pubkey(key)
cert.sign(key, "sha256")
return cert
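A matching sketch for gen_ss_cert: per the docstring above, the first domain becomes the CN and, with force_san, all domains end up in subjectAltName:
```
import OpenSSL
from acme.crypto_util import gen_ss_cert

key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)

# Self-signed certificate with the default one-week validity shown above.
cert = gen_ss_cert(key, ['example.com', 'www.example.com'])
assert cert.get_subject().CN == 'example.com'
```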
def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
"""Dump certificate chain into a bundle.
:param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
:class:`josepy.util.ComparableX509`).
:returns: certificate chain bundle
:rtype: bytes
"""
# XXX: returns empty string when no chain is available, which
# shuts up RenewableCert, but might not be the best solution...
def _dump_cert(cert):
if isinstance(cert, jose.ComparableX509):
cert = cert.wrapped
return crypto.dump_certificate(filetype, cert)
# assumes that OpenSSL.crypto.dump_certificate includes ending
# newline character
return b"".join(_dump_cert(cert) for cert in chain)

View File

@@ -5,18 +5,20 @@ import threading
import time
import unittest
import six
from six.moves import socketserver #type: ignore # pylint: disable=import-error
import josepy as jose
import OpenSSL
import six
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import errors
import test_util
from acme import test_util
class SSLSocketAndProbeSNITest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
def setUp(self):
self.cert = test_util.load_comparable_cert('rsa2048_cert.pem')
key = test_util.load_pyopenssl_private_key('rsa2048_key.pem')
@@ -27,85 +29,40 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
class _TestServer(socketserver.TCPServer):
# pylint: disable=too-few-public-methods
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
def server_bind(self): # pylint: disable=missing-docstring
self.socket = SSLSocket(socket.socket(),
certs)
self.socket = SSLSocket(socket.socket(), certs=certs)
socketserver.TCPServer.server_bind(self)
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
self.port = self.server.socket.getsockname()[1]
self.server_thread = threading.Thread(
# pylint: disable=no-member
target=self.server.handle_request)
self.server_thread.start()
time.sleep(1) # TODO: avoid race conditions in other way
def tearDown(self):
if self.server_thread.is_alive():
# The thread may have already terminated.
self.server_thread.join() # pragma: no cover
self.server_thread.join()
def _probe(self, name):
from acme.crypto_util import probe_sni
return jose.ComparableX509(probe_sni(
name, host='127.0.0.1', port=self.port))
def _start_server(self):
self.server_thread.start()
time.sleep(1) # TODO: avoid race conditions in other way
def test_probe_ok(self):
self._start_server()
self.assertEqual(self.cert, self._probe(b'foo'))
def test_probe_not_recognized_name(self):
self._start_server()
self.assertRaises(errors.Error, self._probe, b'bar')
def test_probe_connection_error(self):
# pylint has a hard time with six
self.server.server_close()
original_timeout = socket.getdefaulttimeout()
try:
socket.setdefaulttimeout(1)
self.assertRaises(errors.Error, self._probe, b'bar')
finally:
socket.setdefaulttimeout(original_timeout)
class SSLSocketTest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket."""
def test_ssl_socket_invalid_arguments(self):
from acme.crypto_util import SSLSocket
with self.assertRaises(ValueError):
_ = SSLSocket(None, {'sni': ('key', 'cert')},
cert_selection=lambda _: None)
with self.assertRaises(ValueError):
_ = SSLSocket(None)
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_all_names."""
@classmethod
def _call(cls, loader, name):
# pylint: disable=protected-access
from acme.crypto_util import _pyopenssl_cert_or_req_all_names
return _pyopenssl_cert_or_req_all_names(loader(name))
def _call_cert(self, name):
return self._call(test_util.load_cert, name)
def test_cert_one_san_no_common(self):
self.assertEqual(self._call_cert('cert-nocn.der'),
['no-common-name.badssl.com'])
def test_cert_no_sans_yes_common(self):
self.assertEqual(self._call_cert('cert.pem'), ['example.com'])
def test_cert_two_sans_yes_common(self):
self.assertEqual(self._call_cert('cert-san.pem'),
['example.com', 'www.example.com'])
# TODO: py33/py34 tox hangs forever on do_handshake in second probe
#def probe_connection_error(self):
# self._probe(b'foo')
# #time.sleep(1) # TODO: avoid race conditions in other way
# self.assertRaises(errors.Error, self._probe, b'bar')
class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
@@ -184,7 +141,7 @@ class RandomSnTest(unittest.TestCase):
def setUp(self):
self.cert_count = 5
self.serial_num = [] # type: List[int]
self.serial_num = []
self.key = OpenSSL.crypto.PKey()
self.key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
@@ -213,12 +170,12 @@ class MakeCSRTest(unittest.TestCase):
self.assertTrue(b'--END CERTIFICATE REQUEST--' in csr_pem)
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
# have a get_extensions() method, so we skip this test if the method
# isn't available.
# In pyopenssl 0.13 (used with TOXENV=py26-oldest and py27-oldest), csr
# objects don't have a get_extensions() method, so we skip this test if
# the method isn't available.
if hasattr(csr, 'get_extensions'):
self.assertEqual(len(csr.get_extensions()), 1)
self.assertEqual(csr.get_extensions()[0].get_data(),
self.assertEquals(len(csr.get_extensions()), 1)
self.assertEquals(csr.get_extensions()[0].get_data(),
OpenSSL.crypto.X509Extension(
b'subjectAltName',
critical=False,
@@ -231,11 +188,11 @@ class MakeCSRTest(unittest.TestCase):
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
# have a get_extensions() method, so we skip this test if the method
# isn't available.
# In pyopenssl 0.13 (used with TOXENV=py26-oldest and py27-oldest), csr
# objects don't have a get_extensions() method, so we skip this test if
# the method isn't available.
if hasattr(csr, 'get_extensions'):
self.assertEqual(len(csr.get_extensions()), 2)
self.assertEquals(len(csr.get_extensions()), 2)
# NOTE: Ideally we would filter by the TLS Feature OID, but
# OpenSSL.crypto.X509Extension doesn't give us the extension's raw OID,
# and the shortname field is just "UNDEF"
@@ -244,33 +201,5 @@ class MakeCSRTest(unittest.TestCase):
self.assertEqual(len(must_staple_exts), 1,
"Expected exactly one Must Staple extension")
class DumpPyopensslChainTest(unittest.TestCase):
"""Test for dump_pyopenssl_chain."""
@classmethod
def _call(cls, loaded):
# pylint: disable=protected-access
from acme.crypto_util import dump_pyopenssl_chain
return dump_pyopenssl_chain(loaded)
def test_dump_pyopenssl_chain(self):
names = ['cert.pem', 'cert-san.pem', 'cert-idnsans.pem']
loaded = [test_util.load_cert(name) for name in names]
length = sum(
len(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
for cert in loaded)
self.assertEqual(len(self._call(loaded)), length)
def test_dump_pyopenssl_chain_wrapped(self):
names = ['cert.pem', 'cert-san.pem', 'cert-idnsans.pem']
loaded = [test_util.load_cert(name) for name in names]
wrap_func = jose.ComparableX509
wrapped = [wrap_func(cert) for cert in loaded]
dump_func = OpenSSL.crypto.dump_certificate
length = sum(len(dump_func(OpenSSL.crypto.FILETYPE_PEM, cert)) for cert in loaded)
self.assertEqual(len(self._call(wrapped)), length)
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -29,12 +29,7 @@ class NonceError(ClientError):
class BadNonce(NonceError):
"""Bad nonce error."""
def __init__(self, nonce, error, *args, **kwargs):
# MyPy complains here that there is too many arguments for BaseException constructor.
# This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
# The fix is included in MyPy>=0.740, but upgrading it would bring dozen of errors due to
# new types definitions. So we ignore the error until the code base is fixed to match
# with MyPy>=0.740 referential.
super(BadNonce, self).__init__(*args, **kwargs) # type: ignore
super(BadNonce, self).__init__(*args, **kwargs)
self.nonce = nonce
self.error = error
@@ -49,12 +44,11 @@ class MissingNonce(NonceError):
Replay-Nonce header field in each successful response to a POST it
provides to a client (...)".
:ivar requests.Response ~.response: HTTP Response
:ivar requests.Response response: HTTP Response
"""
def __init__(self, response, *args, **kwargs):
# See comment in BadNonce constructor above for an explanation of type: ignore here.
super(MissingNonce, self).__init__(*args, **kwargs) # type: ignore
super(MissingNonce, self).__init__(*args, **kwargs)
self.response = response
def __str__(self):
@@ -89,44 +83,13 @@ class PollError(ClientError):
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
"""
def __init__(self, failed_authzrs):
self.failed_authzrs = failed_authzrs
super(ValidationError, self).__init__()
class TimeoutError(Error): # pylint: disable=redefined-builtin
"""Error for when polling an authorization or an order times out."""
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
def __init__(self, error):
"""Initialize.
:param messages.Error error: The error provided by the server.
"""
self.error = error
super(IssuanceError, self).__init__()
class ConflictError(ClientError):
"""Error for when the server returns a 409 (Conflict) HTTP status.
In the version of ACME implemented by Boulder, this is used to find an
account if you only have the private key, but don't know the account URL.
Also used in V2 of the ACME client for the same purpose.
"""
def __init__(self, location):
self.location = location
super(ConflictError, self).__init__()
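The docstring above describes the recovery flow: posting a registration for a key whose account already exists yields a 409 whose Location header is the existing account URL. A hedged sketch of catching that with a v2 client (the client object and contact address are placeholders):
```
from acme import errors, messages

def find_existing_account(client, email):
    """Return the URL of the account bound to this key, creating one if none exists (sketch)."""
    try:
        regr = client.new_account(messages.NewRegistration(
            contact=('mailto:' + email,), terms_of_service_agreed=True))
        return regr.uri              # a fresh account was created instead
    except errors.ConflictError as err:
        return err.location          # server answered 409: account already exists
```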
class WildcardUnsupportedError(Error):
"""Error for when a wildcard is requested but is unsupported by ACME CA."""

View File

@@ -1,10 +1,7 @@
"""Tests for acme.errors."""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import mock
class BadNonceTest(unittest.TestCase):
@@ -38,7 +35,7 @@ class PollErrorTest(unittest.TestCase):
def setUp(self):
from acme.errors import PollError
self.timeout = PollError(
exhausted={mock.sentinel.AR},
exhausted=set([mock.sentinel.AR]),
updated={})
self.invalid = PollError(exhausted=set(), updated={
mock.sentinel.AR: mock.sentinel.AR2})

View File

@@ -4,6 +4,7 @@ import logging
import josepy as jose
import pyrfc3339
logger = logging.getLogger(__name__)

View File

@@ -15,7 +15,7 @@ class Header(jose.Header):
url = jose.Field('url', omitempty=True)
@nonce.decoder
def nonce(value): # pylint: disable=no-self-argument,missing-function-docstring
def nonce(value): # pylint: disable=missing-docstring,no-self-argument
try:
return jose.decode_b64jose(value)
except jose.DeserializationError as error:
@@ -40,10 +40,10 @@ class Signature(jose.Signature):
class JWS(jose.JWS):
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
signature_cls = Signature
__slots__ = jose.JWS._orig_slots
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
@classmethod
# pylint: disable=arguments-differ
# pylint: disable=arguments-differ,too-many-arguments
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
# jwk field if kid is not provided.
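For orientation, here is a minimal sketch of calling the `sign` classmethod shown above; the key file, nonce, and URL are illustrative assumptions, not values from this changeset.
```
# Sketch only: sign an empty JSON payload with the ACME-specific JWS.
# The nonce, url and kid are placed in the protected header defined above.
import josepy as jose

from acme import jws

with open('account_key.pem', 'rb') as f:  # hypothetical PEM account key
    account_key = jose.JWKRSA.load(f.read())

signed = jws.JWS.sign(
    payload=b'{}',
    key=account_key,
    alg=jose.RS256,
    nonce=b'example-nonce',  # normally taken from a Replay-Nonce header
    url='https://example.com/acme/new-account',  # assumed endpoint
    kid=None)  # with kid=None the jwk is embedded instead
print(signed.json_dumps(indent=2))
```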

View File

@@ -3,7 +3,8 @@ import unittest
import josepy as jose
import test_util
from acme import test_util
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))

View File

@@ -1,16 +0,0 @@
"""Shim class to not have to depend on typing module in prod."""
import sys
class TypingClass(object):
"""Ignore import errors by getting anything"""
def __getattr__(self, name):
return None
try:
# mypy doesn't respect modifying sys.modules
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
from typing import Collection, IO # type: ignore
except ImportError:
# mypy complains because TypingClass is not a module
sys.modules[__name__] = TypingClass() # type: ignore
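The deleted shim above relies on replacing the module object in `sys.modules`; the standalone sketch below illustrates the same pattern in isolation (the class name here is illustrative).
```
# Minimal sketch of the sys.modules shim pattern used by acme.magic_typing:
# if the optional import fails, the module swaps itself for an object whose
# attribute lookups all return None, so `from <module> import X` never breaks.
import sys


class _AnythingGoes(object):
    """Return None for any requested attribute."""
    def __getattr__(self, name):
        return None


try:
    from typing import *  # noqa: F401,F403
except ImportError:  # pragma: no cover
    sys.modules[__name__] = _AnythingGoes()
```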

View File

@@ -1,57 +1,32 @@
"""ACME protocol messages."""
import json
import collections
import six
import josepy as jose
import six
from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import util
from acme.mixins import ResourceMixin
try:
from collections.abc import Hashable
except ImportError: # pragma: no cover
from collections import Hashable
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
ERROR_CODES = {
'accountDoesNotExist': 'The request specified an account that does not exist',
'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
' already been revoked',
'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
'badNonce': 'The client sent an unacceptable anti-replay nonce',
'badPublicKey': 'The JWS was signed by a public key the server does not support',
'badRevocationReason': 'The revocation reason provided is not allowed by the server',
'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
' a certificate',
'compound': 'Specific error conditions are indicated in the "subproblems" array',
'connection': ('The server could not connect to the client to verify the'
' domain'),
'dns': 'There was a problem with a DNS query during identifier validation',
'dnssec': 'The server could not validate a DNSSEC signed domain',
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
# deprecate invalidEmail
'invalidEmail': 'The provided email for a registration was invalid',
'invalidContact': 'The provided contact URI was invalid',
'malformed': 'The request message was malformed',
'rejectedIdentifier': 'The server will not issue certificates for the identifier',
'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
'rateLimited': 'There were too many requests of a given type',
'serverInternal': 'The server experienced an internal error',
'tls': 'The server experienced a TLS error during domain verification',
'unauthorized': 'The client lacks sufficient authorization',
'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
'unknownHost': 'The server could not resolve a domain name',
'unsupportedIdentifier': 'An identifier is of an unsupported type',
'externalAccountRequired': 'The server requires external account binding',
}
ERROR_TYPE_DESCRIPTIONS = dict(
@@ -65,6 +40,7 @@ def is_acme_error(err):
"""Check if argument is an ACME error."""
if isinstance(err, Error) and (err.typ is not None):
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
else:
return False
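As a quick illustration of the error-code machinery in this hunk (a sketch, not part of the diff): `Error.with_code` builds a type with the `urn:ietf:params:acme:error:` prefix, which `is_acme_error` recognizes, while a custom type does not qualify.
```
# Sketch: exercising ERROR_PREFIX / is_acme_error from acme.messages.
from acme import messages

err = messages.Error.with_code('badNonce', detail='stale nonce')
print(err.typ)                          # urn:ietf:params:acme:error:badNonce
print(messages.is_acme_error(err))      # True

custom = messages.Error(typ='custom', detail='not an ACME error type')
print(messages.is_acme_error(custom))   # False
```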
@@ -120,7 +96,6 @@ class Error(jose.JSONObjectWithFields, errors.Error):
code = str(self.typ).split(':')[-1]
if code in ERROR_CODES:
return code
return None
def __str__(self):
return b' :: '.join(
@@ -129,25 +104,24 @@ class Error(jose.JSONObjectWithFields, errors.Error):
if part is not None).decode()
class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
class _Constant(jose.JSONDeSerializable, collections.Hashable): # type: ignore
"""ACME constant."""
__slots__ = ('name',)
POSSIBLE_NAMES = NotImplemented
def __init__(self, name):
super(_Constant, self).__init__()
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
self.POSSIBLE_NAMES[name] = self
self.name = name
def to_partial_json(self):
return self.name
@classmethod
def from_json(cls, jobj):
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
def from_json(cls, value):
if value not in cls.POSSIBLE_NAMES:
raise jose.DeserializationError(
'{0} not recognized'.format(cls.__name__))
return cls.POSSIBLE_NAMES[jobj]
return cls.POSSIBLE_NAMES[value]
def __repr__(self):
return '{0}({1})'.format(self.__class__.__name__, self.name)
@@ -171,8 +145,6 @@ STATUS_PROCESSING = Status('processing')
STATUS_VALID = Status('valid')
STATUS_INVALID = Status('invalid')
STATUS_REVOKED = Status('revoked')
STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')
class IdentifierType(_Constant):
@@ -199,30 +171,9 @@ class Directory(jose.JSONDeSerializable):
class Meta(jose.JSONObjectWithFields):
"""Directory Meta."""
_terms_of_service = jose.Field('terms-of-service', omitempty=True)
_terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
terms_of_service = jose.Field('terms-of-service', omitempty=True)
website = jose.Field('website', omitempty=True)
caa_identities = jose.Field('caaIdentities', omitempty=True)
external_account_required = jose.Field('externalAccountRequired', omitempty=True)
def __init__(self, **kwargs):
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super(Directory.Meta, self).__init__(**kwargs)
@property
def terms_of_service(self):
"""URL for the CA TOS"""
return self._terms_of_service or self._terms_of_service_v2
def __iter__(self):
# When iterating over fields, use the external name 'terms_of_service' instead of
# the internal '_terms_of_service'.
for name in super(Directory.Meta, self).__iter__():
yield name[1:] if name == '_terms_of_service' else name
def _internal_name(self, name):
return '_' + name if name == 'terms_of_service' else name
caa_identities = jose.Field('caa-identities', omitempty=True)
@classmethod
def _canon_key(cls, key):
@@ -246,13 +197,13 @@ class Directory(jose.JSONDeSerializable):
try:
return self[name.replace('_', '-')]
except KeyError as error:
raise AttributeError(str(error))
raise AttributeError(str(error) + ': ' + name)
def __getitem__(self, name):
try:
return self._jobj[self._canon_key(name)]
except KeyError:
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
raise KeyError('Directory field not found')
def to_partial_json(self):
return self._jobj
@@ -275,7 +226,7 @@ class Resource(jose.JSONObjectWithFields):
class ResourceWithURI(Resource):
"""ACME Resource with URI.
:ivar unicode ~.uri: Location of the resource.
:ivar unicode uri: Location of the resource.
"""
uri = jose.Field('uri') # no ChallengeResource.uri
@@ -285,24 +236,6 @@ class ResourceBody(jose.JSONObjectWithFields):
"""ACME Resource Body."""
class ExternalAccountBinding(object):
"""ACME External Account Binding"""
@classmethod
def from_data(cls, account_public_key, kid, hmac_key, directory):
"""Create External Account Binding Resource from contact details, kid and hmac."""
key_json = json.dumps(account_public_key.to_partial_json()).encode()
decoded_hmac_key = jose.b64.b64decode(hmac_key)
url = directory["newAccount"]
eab = jws.JWS.sign(key_json, jose.jwk.JWKOct(key=decoded_hmac_key),
jose.jwa.HS256, None,
url, kid)
return eab.to_partial_json()
class Registration(ResourceBody):
"""Registration Resource Body.
@@ -315,88 +248,29 @@ class Registration(ResourceBody):
# on new-reg key server ignores 'key' and populates it based on
# JWS.signature.combined.jwk
key = jose.Field('key', omitempty=True, decoder=jose.JWK.from_json)
# Contact field implements special behavior to allow messages that clear existing
# contacts while not expecting the `contact` field when loading from json.
# This is implemented in the constructor and *_json methods.
contact = jose.Field('contact', omitempty=True, default=())
agreement = jose.Field('agreement', omitempty=True)
status = jose.Field('status', omitempty=True)
terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
only_return_existing = jose.Field('onlyReturnExisting', omitempty=True)
external_account_binding = jose.Field('externalAccountBinding', omitempty=True)
phone_prefix = 'tel:'
email_prefix = 'mailto:'
@classmethod
def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
"""
Create registration resource from contact details.
Passing the `contact` keyword to a Registration object is meaningful, so
this function preserves empty iterables in its kwargs by passing on an empty
`tuple`.
"""
# Note if `contact` was in kwargs.
contact_provided = 'contact' in kwargs
# Pop `contact` from kwargs and add formatted email or phone numbers
def from_data(cls, phone=None, email=None, **kwargs):
"""Create registration resource from contact details."""
details = list(kwargs.pop('contact', ()))
if phone is not None:
details.append(cls.phone_prefix + phone)
if email is not None:
details.extend([cls.email_prefix + mail for mail in email.split(',')])
# Insert formatted contact information back into kwargs
# or insert an empty tuple if `contact` provided.
if details or contact_provided:
details.append(cls.email_prefix + email)
kwargs['contact'] = tuple(details)
if external_account_binding:
kwargs['external_account_binding'] = external_account_binding
return cls(**kwargs)
def __init__(self, **kwargs):
"""Note if the user provides a value for the `contact` member."""
if 'contact' in kwargs:
# Avoid the __setattr__ used by jose.TypedJSONObjectWithFields
object.__setattr__(self, '_add_contact', True)
super(Registration, self).__init__(**kwargs)
def _filter_contact(self, prefix):
return tuple(
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
detail[len(prefix):] for detail in self.contact
if detail.startswith(prefix))
def _add_contact_if_appropriate(self, jobj):
"""
The `contact` member of Registration objects should not be required when
de-serializing (as it would be if the Fields' `omitempty` flag were `False`), but
it should be included in serializations if it was provided.
:param jobj: Dictionary containing this Registration's data
:type jobj: dict
:returns: Dictionary containing Registration data to transmit to the server
:rtype: dict
"""
if getattr(self, '_add_contact', False):
jobj['contact'] = self.encode('contact')
return jobj
def to_partial_json(self):
"""Modify josepy.JSONDeserializable.to_partial_json()"""
jobj = super(Registration, self).to_partial_json()
return self._add_contact_if_appropriate(jobj)
def fields_to_partial_json(self):
"""Modify josepy.JSONObjectWithFields.fields_to_partial_json()"""
jobj = super(Registration, self).fields_to_partial_json()
return self._add_contact_if_appropriate(jobj)
@property
def phones(self):
"""All phones found in the ``contact`` field."""
@@ -409,13 +283,13 @@ class Registration(ResourceBody):
@Directory.register
class NewRegistration(ResourceMixin, Registration):
class NewRegistration(Registration):
"""New registration."""
resource_type = 'new-reg'
resource = fields.Resource(resource_type)
class UpdateRegistration(ResourceMixin, Registration):
class UpdateRegistration(Registration):
"""Update registration."""
resource_type = 'reg'
resource = fields.Resource(resource_type)
@@ -465,7 +339,8 @@ class ChallengeBody(ResourceBody):
omitempty=True, default=None)
def __init__(self, **kwargs):
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
# pylint: disable=star-args
super(ChallengeBody, self).__init__(**kwargs)
def encode(self, name):
@@ -513,6 +388,7 @@ class ChallengeResource(Resource):
@property
def uri(self):
"""The URL of the challenge body."""
# pylint: disable=function-redefined,no-member
return self.body.uri
@@ -527,7 +403,7 @@ class Authorization(ResourceBody):
:ivar datetime.datetime expires:
"""
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
identifier = jose.Field('identifier', decoder=Identifier.from_json)
challenges = jose.Field('challenges', omitempty=True)
combinations = jose.Field('combinations', omitempty=True)
@@ -537,32 +413,25 @@ class Authorization(ResourceBody):
# be absent'... then acme-spec gives example with 'expires'
# present... That's confusing!
expires = fields.RFC3339Field('expires', omitempty=True)
wildcard = jose.Field('wildcard', omitempty=True)
@challenges.decoder
def challenges(value): # pylint: disable=no-self-argument,missing-function-docstring
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
return tuple(ChallengeBody.from_json(chall) for chall in value)
@property
def resolved_combinations(self):
"""Combinations with challenges instead of indices."""
return tuple(tuple(self.challenges[idx] for idx in combo)
for combo in self.combinations) # pylint: disable=not-an-iterable
for combo in self.combinations)
@Directory.register
class NewAuthorization(ResourceMixin, Authorization):
class NewAuthorization(Authorization):
"""New authorization."""
resource_type = 'new-authz'
resource = fields.Resource(resource_type)
class UpdateAuthorization(ResourceMixin, Authorization):
"""Update authorization."""
resource_type = 'authz'
resource = fields.Resource(resource_type)
class AuthorizationResource(ResourceWithURI):
"""Authorization Resource.
@@ -575,7 +444,7 @@ class AuthorizationResource(ResourceWithURI):
@Directory.register
class CertificateRequest(ResourceMixin, jose.JSONObjectWithFields):
class CertificateRequest(jose.JSONObjectWithFields):
"""ACME new-cert request.
:ivar josepy.util.ComparableX509 csr:
@@ -601,7 +470,7 @@ class CertificateResource(ResourceWithURI):
@Directory.register
class Revocation(ResourceMixin, jose.JSONObjectWithFields):
class Revocation(jose.JSONObjectWithFields):
"""Revocation message.
:ivar .ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
@@ -613,56 +482,3 @@ class Revocation(ResourceMixin, jose.JSONObjectWithFields):
certificate = jose.Field(
'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
reason = jose.Field('reason')
class Order(ResourceBody):
"""Order Resource Body.
:ivar identifiers: List of identifiers for the certificate.
:vartype identifiers: `list` of `.Identifier`
:ivar acme.messages.Status status:
:ivar authorizations: URLs of authorizations.
:vartype authorizations: `list` of `str`
:ivar str certificate: URL to download certificate as a fullchain PEM.
:ivar str finalize: URL to POST to in order to request issuance once all
authorizations have "valid" status.
:ivar datetime.datetime expires: When the order expires.
:ivar ~.Error error: Any error that occurred during finalization, if applicable.
"""
identifiers = jose.Field('identifiers', omitempty=True)
status = jose.Field('status', decoder=Status.from_json,
omitempty=True)
authorizations = jose.Field('authorizations', omitempty=True)
certificate = jose.Field('certificate', omitempty=True)
finalize = jose.Field('finalize', omitempty=True)
expires = fields.RFC3339Field('expires', omitempty=True)
error = jose.Field('error', omitempty=True, decoder=Error.from_json)
@identifiers.decoder
def identifiers(value): # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Identifier.from_json(identifier) for identifier in value)
class OrderResource(ResourceWithURI):
"""Order Resource.
:ivar acme.messages.Order body:
:ivar str csr_pem: The CSR this Order will be finalized with.
:ivar authorizations: Fully-fetched AuthorizationResource objects.
:vartype authorizations: `list` of `acme.messages.AuthorizationResource`
:ivar str fullchain_pem: The fetched contents of the certificate URL
produced once the order was finalized, if it's present.
:ivar alternative_fullchains_pem: The fetched contents of alternative certificate
chain URLs produced once the order was finalized, if present and requested during
finalization.
:vartype alternative_fullchains_pem: `list` of `str`
"""
body = jose.Field('body', decoder=Order.from_json)
csr_pem = jose.Field('csr_pem', omitempty=True)
authorizations = jose.Field('authorizations')
fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
alternative_fullchains_pem = jose.Field('alternative_fullchains_pem', omitempty=True)
@Directory.register
class NewOrder(Order):
"""New order."""
resource_type = 'new-order'
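The comments above on the `contact` field describe a subtle serialization rule; the sketch below (with invented values) shows the practical difference between omitting `contact` and explicitly clearing it, plus the `from_data` helper.
```
# Sketch: `contact` is only transmitted when the caller provided it, even if
# empty -- matching the _add_contact_if_appropriate logic above.
from acme import messages

implicit = messages.NewRegistration()            # contact never mentioned
explicit = messages.NewRegistration(contact=())  # caller clears contacts

print('contact' in implicit.to_partial_json())   # False
print('contact' in explicit.to_partial_json())   # True

with_email = messages.NewRegistration.from_data(email='admin@example.com')
print(with_email.contact)                        # ('mailto:admin@example.com',)
```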

View File

@@ -2,13 +2,11 @@
import unittest
import josepy as jose
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import mock
from acme import challenges
import test_util
from acme import test_util
CERT = test_util.load_comparable_cert('cert.der')
CSR = test_util.load_comparable_csr('csr.der')
@@ -20,7 +18,8 @@ class ErrorTest(unittest.TestCase):
def setUp(self):
from acme.messages import Error, ERROR_PREFIX
self.error = Error.with_code('malformed', detail='foo', title='title')
self.error = Error(
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
self.jobj = {
'detail': 'foo',
'title': 'some title',
@@ -28,6 +27,7 @@ class ErrorTest(unittest.TestCase):
}
self.error_custom = Error(typ='custom', detail='bar')
self.empty_error = Error()
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
def test_default_typ(self):
from acme.messages import Error
@@ -42,7 +42,8 @@ class ErrorTest(unittest.TestCase):
hash(Error.from_json(self.error.to_json()))
def test_description(self):
self.assertEqual('The request message was malformed', self.error.description)
self.assertEqual(
'The request message was malformed', self.error.description)
self.assertTrue(self.error_custom.description is None)
def test_code(self):
@@ -52,17 +53,17 @@ class ErrorTest(unittest.TestCase):
self.assertEqual(None, Error().code)
def test_is_acme_error(self):
from acme.messages import is_acme_error, Error
from acme.messages import is_acme_error
self.assertTrue(is_acme_error(self.error))
self.assertFalse(is_acme_error(self.error_custom))
self.assertFalse(is_acme_error(Error()))
self.assertFalse(is_acme_error(self.empty_error))
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
def test_unicode_error(self):
from acme.messages import Error, is_acme_error
arabic_error = Error.with_code(
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
from acme.messages import Error, ERROR_PREFIX, is_acme_error
arabic_error = Error(
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
title='title')
self.assertTrue(is_acme_error(arabic_error))
def test_with_code(self):
@@ -84,7 +85,7 @@ class ConstantTest(unittest.TestCase):
from acme.messages import _Constant
class MockConstant(_Constant): # pylint: disable=missing-docstring
POSSIBLE_NAMES = {} # type: Dict
POSSIBLE_NAMES = {}
self.MockConstant = MockConstant # pylint: disable=invalid-name
self.const_a = MockConstant('a')
@@ -108,11 +109,11 @@ class ConstantTest(unittest.TestCase):
def test_equality(self):
const_a_prime = self.MockConstant('a')
self.assertNotEqual(self.const_a, self.const_b)
self.assertEqual(self.const_a, const_a_prime)
self.assertFalse(self.const_a == self.const_b)
self.assertTrue(self.const_a == const_a_prime)
self.assertNotEqual(self.const_a, self.const_b)
self.assertEqual(self.const_a, const_a_prime)
self.assertTrue(self.const_a != self.const_b)
self.assertFalse(self.const_a != const_a_prime)
class DirectoryTest(unittest.TestCase):
@@ -156,7 +157,7 @@ class DirectoryTest(unittest.TestCase):
'meta': {
'terms-of-service': 'https://example.com/acme/terms',
'website': 'https://www.example.com/',
'caaIdentities': ['example.com'],
'caa-identities': ['example.com'],
},
})
@@ -164,31 +165,6 @@ class DirectoryTest(unittest.TestCase):
from acme.messages import Directory
Directory.from_json({'foo': 'bar'})
def test_iter_meta(self):
result = False
for k in self.dir.meta:
if k == 'terms_of_service':
result = self.dir.meta[k] == 'https://example.com/acme/terms'
self.assertTrue(result)
class ExternalAccountBindingTest(unittest.TestCase):
def setUp(self):
from acme.messages import Directory
self.key = jose.jwk.JWKRSA(key=KEY.public_key())
self.kid = "kid-for-testing"
self.hmac_key = "hmac-key-for-testing"
self.dir = Directory({
'newAccount': 'http://url/acme/new-account',
})
def test_from_data(self):
from acme.messages import ExternalAccountBinding
eab = ExternalAccountBinding.from_data(self.key, self.kid, self.hmac_key, self.dir)
self.assertEqual(len(eab), 3)
self.assertEqual(sorted(eab.keys()), sorted(['protected', 'payload', 'signature']))
class RegistrationTest(unittest.TestCase):
"""Tests for acme.messages.Registration."""
@@ -221,22 +197,6 @@ class RegistrationTest(unittest.TestCase):
'mailto:admin@foo.com',
))
def test_new_registration_from_data_with_eab(self):
from acme.messages import NewRegistration, ExternalAccountBinding, Directory
key = jose.jwk.JWKRSA(key=KEY.public_key())
kid = "kid-for-testing"
hmac_key = "hmac-key-for-testing"
directory = Directory({
'newAccount': 'http://url/acme/new-account',
})
eab = ExternalAccountBinding.from_data(key, kid, hmac_key, directory)
reg = NewRegistration.from_data(email='admin@foo.com', external_account_binding=eab)
self.assertEqual(reg.contact, (
'mailto:admin@foo.com',
))
self.assertEqual(sorted(reg.external_account_binding.keys()),
sorted(['protected', 'payload', 'signature']))
def test_phones(self):
self.assertEqual(('1234',), self.reg.phones)
@@ -254,19 +214,6 @@ class RegistrationTest(unittest.TestCase):
from acme.messages import Registration
hash(Registration.from_json(self.jobj_from))
def test_default_not_transmitted(self):
from acme.messages import NewRegistration
empty_new_reg = NewRegistration()
new_reg_with_contact = NewRegistration(contact=())
self.assertEqual(empty_new_reg.contact, ())
self.assertEqual(new_reg_with_contact.contact, ())
self.assertTrue('contact' not in empty_new_reg.to_partial_json())
self.assertTrue('contact' not in empty_new_reg.fields_to_partial_json())
self.assertTrue('contact' in new_reg_with_contact.to_partial_json())
self.assertTrue('contact' in new_reg_with_contact.fields_to_partial_json())
class UpdateRegistrationTest(unittest.TestCase):
"""Tests for acme.messages.UpdateRegistration."""
@@ -316,7 +263,8 @@ class ChallengeBodyTest(unittest.TestCase):
from acme.messages import Error
from acme.messages import STATUS_INVALID
self.status = STATUS_INVALID
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
detail='Unable to communicate with DNS server')
self.challb = ChallengeBody(
uri='http://challb', chall=self.chall, status=self.status,
error=error)
@@ -453,48 +401,5 @@ class RevocationTest(unittest.TestCase):
hash(Revocation.from_json(self.rev.to_json()))
class OrderResourceTest(unittest.TestCase):
"""Tests for acme.messages.OrderResource."""
def setUp(self):
from acme.messages import OrderResource
self.regr = OrderResource(
body=mock.sentinel.body, uri=mock.sentinel.uri)
def test_to_partial_json(self):
self.assertEqual(self.regr.to_json(), {
'body': mock.sentinel.body,
'uri': mock.sentinel.uri,
'authorizations': None,
})
class NewOrderTest(unittest.TestCase):
"""Tests for acme.messages.NewOrder."""
def setUp(self):
from acme.messages import NewOrder
self.reg = NewOrder(
identifiers=mock.sentinel.identifiers)
def test_to_partial_json(self):
self.assertEqual(self.reg.to_json(), {
'identifiers': mock.sentinel.identifiers,
})
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_message_payload(self):
from acme.messages import NewAuthorization
new_order = NewAuthorization()
new_order.le_acme_version = 2
jobj = new_order.json_dumps(indent=2).encode()
# RFC8555 states that JWS bodies must not have a resource field.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,65 +0,0 @@
"""Useful mixins for Challenge and Resource objects"""
class VersionedLEACMEMixin(object):
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
@property
def le_acme_version(self):
"""Define the version of ACME protocol to use"""
return getattr(self, '_le_acme_version', 1)
@le_acme_version.setter
def le_acme_version(self, version):
# We need to use object.__setattr__ to avoid depending on the specific implementation of
# __setattr__ in the current class (e.g. jose.TypedJSONObjectWithFields raises AttributeError
# for any attempt to set an attribute, in order to keep objects immutable).
object.__setattr__(self, '_le_acme_version', version)
def __setattr__(self, key, value):
if key == 'le_acme_version':
# Required for @property to operate properly. See comment above.
object.__setattr__(self, key, value)
else:
super(VersionedLEACMEMixin, self).__setattr__(key, value) # pragma: no cover
class ResourceMixin(VersionedLEACMEMixin):
"""
This mixin generates an RFC 8555-compliant JWS payload
by removing the `resource` field when needed (e.g. for the ACME v2 protocol).
"""
def to_partial_json(self):
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(ResourceMixin, self),
'to_partial_json', 'resource')
def fields_to_partial_json(self):
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(ResourceMixin, self),
'fields_to_partial_json', 'resource')
class TypeMixin(VersionedLEACMEMixin):
"""
This mixin allows generation of an RFC 8555-compliant JWS payload
by removing the `type` field when needed (e.g. for the ACME v2 protocol).
"""
def to_partial_json(self):
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(TypeMixin, self),
'to_partial_json', 'type')
def fields_to_partial_json(self):
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(TypeMixin, self),
'fields_to_partial_json', 'type')
def _safe_jobj_compliance(instance, jobj_method, uncompliant_field):
if hasattr(instance, jobj_method):
jobj = getattr(instance, jobj_method)()
if instance.le_acme_version == 2:
jobj.pop(uncompliant_field, None)
return jobj
raise AttributeError('Method {0}() is not implemented.'.format(jobj_method)) # pragma: no cover
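To make the mixin's effect concrete, here is a short sketch (not from the changeset) showing how the same resource serializes with and without the legacy `resource` field depending on `le_acme_version`.
```
# Sketch: ResourceMixin drops the legacy `resource` field for ACME v2 bodies.
from acme import messages

reg_v1 = messages.NewRegistration()
print(reg_v1.to_partial_json())     # includes 'resource': 'new-reg' (v1 default)

reg_v2 = messages.NewRegistration()
reg_v2.le_acme_version = 2
print(reg_v2.to_partial_json())     # 'resource' removed, per RFC 8555
```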

View File

@@ -1,20 +1,28 @@
"""Support for standalone client challenge solvers. """
import argparse
import collections
import functools
import logging
import os
import socket
import sys
import threading
from six.moves import BaseHTTPServer # type: ignore
from six.moves import http_client
from six.moves import socketserver # type: ignore
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import OpenSSL
from acme import challenges
from acme import crypto_util
from acme.magic_typing import List
logger = logging.getLogger(__name__)
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
# pylint: disable=too-few-public-methods,no-init
class TLSServer(socketserver.TCPServer):
"""Generic TLS Server."""
@@ -27,27 +35,21 @@ class TLSServer(socketserver.TCPServer):
self.address_family = socket.AF_INET
self.certs = kwargs.pop("certs", {})
self.method = kwargs.pop(
"method", crypto_util._DEFAULT_SSL_METHOD)
# pylint: disable=protected-access
"method", crypto_util._DEFAULT_TLSSNI01_SSL_METHOD)
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
socketserver.TCPServer.__init__(self, *args, **kwargs)
def _wrap_sock(self):
self.socket = crypto_util.SSLSocket(
self.socket, cert_selection=self._cert_selection,
alpn_selection=getattr(self, '_alpn_selection', None),
method=self.method)
self.socket, certs=self.certs, method=self.method)
def _cert_selection(self, connection): # pragma: no cover
"""Callback selecting certificate for connection."""
server_name = connection.get_servername()
return self.certs.get(server_name, None)
def server_bind(self):
def server_bind(self): # pylint: disable=missing-docstring
self._wrap_sock()
return socketserver.TCPServer.server_bind(self)
class ACMEServerMixin:
class ACMEServerMixin: # pylint: disable=old-style-class
"""ACME server common settings mixin."""
# TODO: c.f. #858
server_version = "ACME client standalone challenge solver"
@@ -64,8 +66,8 @@ class BaseDualNetworkedServers(object):
def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
port = server_address[1]
self.threads = [] # type: List[threading.Thread]
self.servers = [] # type: List[ACMEServerMixin]
self.threads = []
self.servers = []
# Must try True first.
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
@@ -79,35 +81,23 @@ class BaseDualNetworkedServers(object):
kwargs["ipv6"] = ip_version
new_address = (server_address[0],) + (port,) + server_address[2:]
new_args = (new_address,) + remaining_args
server = ServerClass(*new_args, **kwargs)
logger.debug(
"Successfully bound to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
server = ServerClass(*new_args, **kwargs) # pylint: disable=star-args
except socket.error:
if self.servers:
# Already bound using IPv6.
logger.debug(
"Certbot wasn't able to bind to %s:%s using %s, this "
"is often expected due to the dual stack nature of "
"IPv6 socket implementations.",
new_address[0], new_address[1],
"IPv6" if ip_version else "IPv4")
else:
logger.debug(
"Failed to bind to %s:%s using %s", new_address[0],
logger.debug("Failed to bind to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
else:
self.servers.append(server)
# If two servers are set up and port 0 was passed in, ensure we always
# bind to the same port for both servers.
port = server.socket.getsockname()[1]
if not self.servers:
if len(self.servers) == 0:
raise socket.error("Could not bind to IPv4 or IPv6.")
def serve_forever(self):
"""Wraps socketserver.TCPServer.serve_forever"""
for server in self.servers:
thread = threading.Thread(
# pylint: disable=no-member
target=server.serve_forever)
thread.start()
self.threads.append(thread)
@@ -127,38 +117,33 @@ class BaseDualNetworkedServers(object):
self.threads = []
class TLSALPN01Server(TLSServer, ACMEServerMixin):
"""TLSALPN01 Server."""
class TLSSNI01Server(TLSServer, ACMEServerMixin):
"""TLSSNI01 Server."""
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
def __init__(self, server_address, certs, challenge_certs, ipv6=False):
def __init__(self, server_address, certs, ipv6=False):
TLSServer.__init__(
self, server_address, _BaseRequestHandlerWithLogging, certs=certs,
ipv6=ipv6)
self.challenge_certs = challenge_certs
self, server_address, BaseRequestHandlerWithLogging, certs=certs, ipv6=ipv6)
def _cert_selection(self, connection):
# TODO: We would like to serve challenge cert only if asked for it via
# ALPN. To do this, we need to retrieve the list of protos from client
# hello, but this is currently impossible with openssl [0], and ALPN
# negotiation is done after cert selection.
# Therefore, currently we always return challenge cert, and terminate
# handshake in alpn_selection() if ALPN protos are not what we expect.
# [0] https://github.com/openssl/openssl/issues/4952
server_name = connection.get_servername()
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs.get(server_name, None)
def _alpn_selection(self, _connection, alpn_protos):
"""Callback to select alpn protocol."""
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
return self.ACME_TLS_1_PROTOCOL
logger.debug("Cannot agree on ALPN proto. Got: %s", str(alpn_protos))
# Explicitly close the connection now, by returning an empty string.
# See https://www.pyopenssl.org/en/stable/api/ssl.html#OpenSSL.SSL.Context.set_alpn_select_callback # pylint: disable=line-too-long
return b""
class TLSSNI01DualNetworkedServers(BaseDualNetworkedServers):
"""TLSSNI01Server Wrapper. Tries everything for both. Failures for one don't
affect the other."""
def __init__(self, *args, **kwargs):
BaseDualNetworkedServers.__init__(self, TLSSNI01Server, *args, **kwargs)
class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
class HTTPServer(BaseHTTPServer.HTTPServer):
@@ -176,10 +161,10 @@ class HTTPServer(BaseHTTPServer.HTTPServer):
class HTTP01Server(HTTPServer, ACMEServerMixin):
"""HTTP01 Server."""
def __init__(self, server_address, resources, ipv6=False, timeout=30):
def __init__(self, server_address, resources, ipv6=False):
HTTPServer.__init__(
self, server_address, HTTP01RequestHandler.partial_init(
simple_http_resources=resources, timeout=timeout), ipv6=ipv6)
simple_http_resources=resources), ipv6=ipv6)
class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
@@ -204,8 +189,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
def __init__(self, *args, **kwargs):
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
self.timeout = kwargs.pop('timeout', 30)
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
socketserver.BaseRequestHandler.__init__(self, *args, **kwargs)
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
@@ -216,7 +200,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.log_message("Incoming request")
BaseHTTPServer.BaseHTTPRequestHandler.handle(self)
def do_GET(self): # pylint: disable=invalid-name,missing-function-docstring
def do_GET(self): # pylint: disable=invalid-name,missing-docstring
if self.path == "/":
self.handle_index()
elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
@@ -254,7 +238,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.path)
@classmethod
def partial_init(cls, simple_http_resources, timeout):
def partial_init(cls, simple_http_resources):
"""Partially initialize this handler.
This is useful because `socketserver.BaseServer` takes
@@ -263,18 +247,40 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
"""
return functools.partial(
cls, simple_http_resources=simple_http_resources,
timeout=timeout)
cls, simple_http_resources=simple_http_resources)
class _BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def simple_tls_sni_01_server(cli_args, forever=True):
"""Run simple standalone TLSSNI01 server."""
logging.basicConfig(level=logging.DEBUG)
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
parser = argparse.ArgumentParser()
parser.add_argument(
"-p", "--port", default=0, help="Port to serve at. By default "
"picks random free port.")
args = parser.parse_args(cli_args[1:])
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
certs = {}
_, hosts, _ = next(os.walk('.'))
for host in hosts:
with open(os.path.join(host, "cert.pem")) as cert_file:
cert_contents = cert_file.read()
with open(os.path.join(host, "key.pem")) as key_file:
key_contents = key_file.read()
certs[host.encode()] = (
OpenSSL.crypto.load_privatekey(
OpenSSL.crypto.FILETYPE_PEM, key_contents),
OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_PEM, cert_contents))
server = TLSSNI01Server(('', int(args.port)), certs=certs)
logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
if forever: # pragma: no cover
server.serve_forever()
else:
server.handle_request()
if __name__ == "__main__":
sys.exit(simple_tls_sni_01_server(sys.argv)) # pragma: no cover
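The dual-stack binding logic above is easiest to see in use; this is a small sketch (ports and lifetime handling are simplified assumptions) of starting and stopping an `HTTP01DualNetworkedServers` pair.
```
# Sketch: bind the HTTP-01 challenge server on IPv6 and IPv4 (where available)
# on the same ephemeral port, then shut everything down again.
from acme.standalone import HTTP01DualNetworkedServers

resources = set()  # HTTP01RequestHandler.HTTP01Resource objects would go here
servers = HTTP01DualNetworkedServers(('', 0), resources=resources)

print("listening on", servers.getsocknames())   # one entry per bound family
servers.serve_forever()                          # starts one thread per server
servers.shutdown_and_server_close()
```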

View File

@@ -1,22 +1,23 @@
"""Tests for acme.standalone."""
import os
import shutil
import socket
import threading
import tempfile
import time
import unittest
import josepy as jose
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import requests
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import josepy as jose
import mock
import requests
from acme import challenges
from acme import crypto_util
from acme import errors
import test_util
from acme import test_util
class TLSServerTest(unittest.TestCase):
@@ -27,14 +28,41 @@ class TLSServerTest(unittest.TestCase):
from acme.standalone import TLSServer
server = TLSServer(
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True)
server.server_close()
server.server_close() # pylint: disable=no-member
def test_ipv6(self):
if socket.has_ipv6:
from acme.standalone import TLSServer
server = TLSServer(
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True, ipv6=True)
server.server_close()
server.server_close() # pylint: disable=no-member
class TLSSNI01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
from acme.standalone import TLSSNI01Server
self.server = TLSSNI01Server(("", 0), certs=self.certs)
# pylint: disable=no-member
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
def test_it(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1)
self.assertEqual(jose.ComparableX509(cert),
jose.ComparableX509(self.certs[b'localhost'][1]))
class HTTP01ServerTest(unittest.TestCase):
@@ -44,17 +72,18 @@ class HTTP01ServerTest(unittest.TestCase):
def setUp(self):
self.account_key = jose.JWK.load(
test_util.load_vector('rsa1024_key.pem'))
self.resources = set() # type: Set
self.resources = set()
from acme.standalone import HTTP01Server
self.server = HTTP01Server(('', 0), resources=self.resources)
# pylint: disable=no-member
self.port = self.server.socket.getsockname()[1]
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown()
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
def test_index(self):
@@ -88,81 +117,6 @@ class HTTP01ServerTest(unittest.TestCase):
def test_http01_not_found(self):
self.assertFalse(self._test_http01(add=False))
def test_timely_shutdown(self):
from acme.standalone import HTTP01Server
server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
server_thread = threading.Thread(target=server.serve_forever)
server_thread.start()
client = socket.socket()
client.connect(('localhost', server.socket.getsockname()[1]))
stop_thread = threading.Thread(target=server.shutdown)
stop_thread.start()
server_thread.join(5.)
is_hung = server_thread.is_alive()
try:
client.shutdown(socket.SHUT_RDWR)
except: # pragma: no cover, pylint: disable=bare-except
# may raise error because socket could already be closed
pass
self.assertFalse(is_hung, msg='Server shutdown should not be hung')
@unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
class TLSALPN01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSALPN01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
# Use different certificate for challenge.
self.challenge_certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa4096_key.pem'),
test_util.load_cert('rsa4096_cert.pem'),
)}
from acme.standalone import TLSALPN01Server
self.server = TLSALPN01Server(("localhost", 0), certs=self.certs,
challenge_certs=self.challenge_certs)
# pylint: disable=no-member
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
# TODO: This is not implemented yet, see comments in standalone.py
# def test_certs(self):
# host, port = self.server.socket.getsockname()[:2]
# cert = crypto_util.probe_sni(
# b'localhost', host=host, port=port, timeout=1)
# # Expect normal cert when connecting without ALPN.
# self.assertEqual(jose.ComparableX509(cert),
# jose.ComparableX509(self.certs[b'localhost'][1]))
def test_challenge_certs(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"acme-tls/1"])
# Expect challenge cert when connecting with ALPN.
self.assertEqual(
jose.ComparableX509(cert),
jose.ComparableX509(self.challenge_certs[b'localhost'][1])
)
def test_bad_alpn(self):
host, port = self.server.socket.getsockname()[:2]
with self.assertRaises(errors.Error):
crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"bad-alpn"])
class BaseDualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.BaseDualNetworkedServers."""
@@ -179,10 +133,8 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
self.address_family = socket.AF_INET
socketserver.TCPServer.__init__(self, *args, **kwargs)
if ipv6:
# NB: On Windows, socket.IPPROTO_IPV6 constant may be missing.
# We use the corresponding value (41) instead.
level = getattr(socket, "IPPROTO_IPV6", 41)
self.socket.setsockopt(level, socket.IPV6_V6ONLY, 1)
# pylint: disable=no-member
self.socket.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
try:
self.server_bind()
self.server_activate()
@@ -196,14 +148,14 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
from acme.standalone import BaseDualNetworkedServers
self.assertRaises(socket.error, BaseDualNetworkedServers,
BaseDualNetworkedServersTest.SingleProtocolServer,
('', 0),
("", 0),
socketserver.BaseRequestHandler)
def test_ports_equal(self):
from acme.standalone import BaseDualNetworkedServers
servers = BaseDualNetworkedServers(
BaseDualNetworkedServersTest.SingleProtocolServer,
('', 0),
("", 0),
socketserver.BaseRequestHandler)
socknames = servers.getsocknames()
prev_port = None
@@ -215,17 +167,46 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
prev_port = port
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
from acme.standalone import TLSSNI01DualNetworkedServers
self.servers = TLSSNI01DualNetworkedServers(("", 0), certs=self.certs)
self.servers.serve_forever()
def tearDown(self):
self.servers.shutdown_and_server_close()
def test_connect(self):
socknames = self.servers.getsocknames()
# connect to all addresses
for sockname in socknames:
host, port = sockname[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1)
self.assertEqual(jose.ComparableX509(cert),
jose.ComparableX509(self.certs[b'localhost'][1]))
class HTTP01DualNetworkedServersTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
def setUp(self):
self.account_key = jose.JWK.load(
test_util.load_vector('rsa1024_key.pem'))
self.resources = set() # type: Set
self.resources = set()
from acme.standalone import HTTP01DualNetworkedServers
self.servers = HTTP01DualNetworkedServers(('', 0), resources=self.resources)
# pylint: disable=no-member
self.port = self.servers.getsocknames()[0][1]
self.servers.serve_forever()
@@ -264,5 +245,56 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
self.assertFalse(self._test_http01(add=False))
class TestSimpleTLSSNI01Server(unittest.TestCase):
"""Tests for acme.standalone.simple_tls_sni_01_server."""
def setUp(self):
# mirror ../examples/standalone
self.test_cwd = tempfile.mkdtemp()
localhost_dir = os.path.join(self.test_cwd, 'localhost')
os.makedirs(localhost_dir)
shutil.copy(test_util.vector_path('rsa2048_cert.pem'),
os.path.join(localhost_dir, 'cert.pem'))
shutil.copy(test_util.vector_path('rsa2048_key.pem'),
os.path.join(localhost_dir, 'key.pem'))
from acme.standalone import simple_tls_sni_01_server
self.port = 1234
self.thread = threading.Thread(
target=simple_tls_sni_01_server, kwargs={
'cli_args': ('xxx', '--port', str(self.port)),
'forever': False,
},
)
self.old_cwd = os.getcwd()
os.chdir(self.test_cwd)
def tearDown(self):
os.chdir(self.old_cwd)
self.thread.join()
shutil.rmtree(self.test_cwd)
def test_it(self):
max_attempts = 5
for attempt in range(max_attempts):
try:
cert = crypto_util.probe_sni(
b'localhost', b'0.0.0.0', self.port)
except errors.Error:
self.assertTrue(attempt + 1 < max_attempts, "Timeout!")
time.sleep(1) # wait until thread starts
else:
self.assertEqual(jose.ComparableX509(cert),
test_util.load_comparable_cert(
'rsa2048_cert.pem'))
break
if attempt == 0:
# the first attempt is always meant to fail, so we can test
# the socket failure code-path for probe_sni, as well
self.thread.start()
if __name__ == "__main__":
unittest.main() # pragma: no cover

View File

@@ -4,12 +4,19 @@
"""
import os
import pkg_resources
import unittest
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from OpenSSL import crypto
import pkg_resources
import OpenSSL
def vector_path(*names):
"""Path to a test vector."""
return pkg_resources.resource_filename(
__name__, os.path.join('testdata', *names))
def load_vector(*names):
@@ -25,14 +32,15 @@ def _guess_loader(filename, loader_pem, loader_der):
return loader_pem
elif ext.lower() == '.der':
return loader_der
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
else: # pragma: no cover
raise ValueError("Loader could not be recognized based on extension")
def load_cert(*names):
"""Load certificate."""
loader = _guess_loader(
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
return crypto.load_certificate(loader, load_vector(*names))
names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
return OpenSSL.crypto.load_certificate(loader, load_vector(*names))
def load_comparable_cert(*names):
@@ -43,8 +51,8 @@ def load_comparable_cert(*names):
def load_csr(*names):
"""Load certificate request."""
loader = _guess_loader(
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
return crypto.load_certificate_request(loader, load_vector(*names))
names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
return OpenSSL.crypto.load_certificate_request(loader, load_vector(*names))
def load_comparable_csr(*names):
@@ -63,5 +71,26 @@ def load_rsa_private_key(*names):
def load_pyopenssl_private_key(*names):
"""Load pyOpenSSL private key."""
loader = _guess_loader(
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
return crypto.load_privatekey(loader, load_vector(*names))
names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
return OpenSSL.crypto.load_privatekey(loader, load_vector(*names))
def skip_unless(condition, reason): # pragma: no cover
"""Skip tests unless a condition holds.
This implements the basic functionality of unittest.skipUnless
which is only available on Python 2.7+.
:param bool condition: If ``False``, the test will be skipped
:param str reason: the reason for skipping the test
:rtype: callable
:returns: decorator that hides tests unless condition is ``True``
"""
if hasattr(unittest, "skipUnless"):
return unittest.skipUnless(condition, reason)
elif condition:
return lambda cls: cls
else:
return lambda cls: None

View File

@@ -4,14 +4,12 @@ to use appropriate extension for vector filenames: .pem for PEM and
The following command has been used to generate test keys:
for k in 256 512 1024 2048 4096; do openssl genrsa -out rsa${k}_key.pem $k; done
for x in 256 512 1024 2048; do openssl genrsa -out rsa${k}_key.pem $k; done
and for the CSR:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
and for the certificates:
and for the certificate:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 > rsa2048_cert.pem
openssl req -key rsa1024_key.pem -new -subj '/CN=example.com' -x509 > rsa1024_cert.pem
openssl req -key rsa2047_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der

View File

@@ -4,4 +4,4 @@ import six
def map_keys(dikt, func):
"""Map dictionary keys."""
return {func(key): value for key, value in six.iteritems(dikt)}
return dict((func(key), value) for key, value in six.iteritems(dikt))

View File

@@ -9,7 +9,7 @@ BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from https://www.sphinx-doc.org/)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.

acme/docs/api/other.rst (new file, 5 lines)
View File

@@ -0,0 +1,5 @@
Other ACME objects
------------------
.. automodule:: acme.other
:members:

View File

@@ -12,8 +12,10 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import os
import sys
import os
import shlex
here = os.path.abspath(os.path.dirname(__file__))
@@ -40,7 +42,7 @@ extensions = [
]
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
autodoc_default_flags = ['show-inheritance', 'private-members']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -85,10 +87,7 @@ language = 'en'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = [
'_build',
'man/*'
]
exclude_patterns = ['_build']
# The reST default role (used for this markup: `text`) to use for all
# documents.
@@ -115,7 +114,7 @@ pygments_style = 'sphinx'
#keep_warnings = False
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False
todo_include_todos = True
# -- Options for HTML output ----------------------------------------------
@@ -123,7 +122,7 @@ todo_include_todos = False
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# http://docs.readthedocs.org/en/latest/theme.html#how-do-i-use-this-locally-and-on-read-the-docs
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally

View File

@@ -16,6 +16,13 @@ Contents:
.. automodule:: acme
:members:
Example client:
.. include:: ../examples/example_client.py
:code: python
Indices and tables
==================

View File

@@ -65,7 +65,7 @@ if errorlevel 9009 (
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
echo.http://sphinx-doc.org/
exit /b 1
)

View File

@@ -0,0 +1,47 @@
"""Example script showing how to use acme client API."""
import logging
import os
import pkg_resources
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import client
from acme import messages
logging.basicConfig(level=logging.DEBUG)
DIRECTORY_URL = 'https://acme-staging.api.letsencrypt.org/directory'
BITS = 2048 # minimum for Boulder
DOMAIN = 'example1.com' # example.com is ignored by Boulder
# generate_private_key requires cryptography>=0.5
key = jose.JWKRSA(key=rsa.generate_private_key(
public_exponent=65537,
key_size=BITS,
backend=default_backend()))
acme = client.Client(DIRECTORY_URL, key)
regr = acme.register()
logging.info('Auto-accepting TOS: %s', regr.terms_of_service)
acme.agree_to_tos(regr)
logging.debug(regr)
authzr = acme.request_challenges(
identifier=messages.Identifier(typ=messages.IDENTIFIER_FQDN, value=DOMAIN))
logging.debug(authzr)
authzr, authzr_response = acme.poll(authzr)
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_ASN1, pkg_resources.resource_string(
'acme', os.path.join('testdata', 'csr.der')))
try:
acme.request_issuance(jose.util.ComparableX509(csr), (authzr,))
except messages.Error as error:
print ("This script is doomed to fail as no authorization "
"challenges are ever solved. Error from server: {0}".format(error))

View File

@@ -1,241 +0,0 @@
"""Example ACME-V2 API for HTTP-01 challenge.
Brief:
This is a complete usage example of the python-acme API.
Limitations of this example:
- Works for only one Domain name
- Performs only HTTP-01 challenge
- Uses ACME-v2
Workflow:
(Account creation)
- Create account key
- Register account and accept TOS
(Certificate actions)
- Select HTTP-01 within offered challenges by the CA server
- Set up http challenge resource
- Set up standalone web server
- Create domain private key and CSR
- Issue certificate
- Renew certificate
- Revoke certificate
(Account update actions)
- Change contact information
- Deactivate Account
"""
from contextlib import contextmanager
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import challenges
from acme import client
from acme import crypto_util
from acme import errors
from acme import messages
from acme import standalone
# Constants:
# This is the staging point for ACME-V2 within Let's Encrypt.
DIRECTORY_URL = 'https://acme-staging-v02.api.letsencrypt.org/directory'
USER_AGENT = 'python-acme-example'
# Account key size
ACC_KEY_BITS = 2048
# Certificate private key size
CERT_PKEY_BITS = 2048
# Domain name for the certificate.
DOMAIN = 'client.example.com'
# If you are running Boulder locally, it is possible to configure any port
# number to execute the challenge, but real CA servers will always use port
# 80, as described in the ACME specification.
PORT = 80
# Useful methods and classes:
def new_csr_comp(domain_name, pkey_pem=None):
"""Create certificate signing request."""
if pkey_pem is None:
# Create private key.
pkey = OpenSSL.crypto.PKey()
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, CERT_PKEY_BITS)
pkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM,
pkey)
csr_pem = crypto_util.make_csr(pkey_pem, [domain_name])
return pkey_pem, csr_pem
def select_http01_chall(orderr):
"""Extract authorization resource from within order resource."""
# Authorization Resource: authz.
# This object holds the offered challenges by the server and their status.
authz_list = orderr.authorizations
for authz in authz_list:
# Choosing challenge.
# authz.body.challenges is a set of ChallengeBody objects.
for i in authz.body.challenges:
# Find the supported challenge.
if isinstance(i.chall, challenges.HTTP01):
return i
raise Exception('HTTP-01 challenge was not offered by the CA server.')
@contextmanager
def challenge_server(http_01_resources):
"""Manage standalone server set up and shutdown."""
# Setting up a fake server that binds at PORT and any address.
address = ('', PORT)
try:
servers = standalone.HTTP01DualNetworkedServers(address,
http_01_resources)
# Start client standalone web server.
servers.serve_forever()
yield servers
finally:
# Shutdown client web server and unbind from PORT
servers.shutdown_and_server_close()
def perform_http01(client_acme, challb, orderr):
"""Set up standalone webserver and perform HTTP-01 challenge."""
response, validation = challb.response_and_validation(client_acme.net.key)
resource = standalone.HTTP01RequestHandler.HTTP01Resource(
chall=challb.chall, response=response, validation=validation)
with challenge_server({resource}):
# Let the CA server know that we are ready for the challenge.
client_acme.answer_challenge(challb, response)
# Wait for challenge status and then issue a certificate.
# It is possible to set a deadline time.
finalized_orderr = client_acme.poll_and_finalize(orderr)
return finalized_orderr.fullchain_pem
# Main examples:
def example_http():
"""This example executes the whole process of fulfilling a HTTP-01
challenge for one specific domain.
The workflow consists of:
(Account creation)
- Create account key
- Register account and accept TOS
(Certificate actions)
- Select HTTP-01 within offered challenges by the CA server
- Set up http challenge resource
- Set up standalone web server
- Create domain private key and CSR
- Issue certificate
- Renew certificate
- Revoke certificate
(Account update actions)
- Change contact information
- Deactivate Account
"""
# Create account key
acc_key = jose.JWKRSA(
key=rsa.generate_private_key(public_exponent=65537,
key_size=ACC_KEY_BITS,
backend=default_backend()))
# Register account and accept TOS
net = client.ClientNetwork(acc_key, user_agent=USER_AGENT)
directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
client_acme = client.ClientV2(directory, net=net)
# Terms of Service URL is in client_acme.directory.meta.terms_of_service
# Registration Resource: regr
# Creates account with contact information.
email = ('fake@example.com')
regr = client_acme.new_account(
messages.NewRegistration.from_data(
email=email, terms_of_service_agreed=True))
# Create domain private key and CSR
pkey_pem, csr_pem = new_csr_comp(DOMAIN)
# Issue certificate
orderr = client_acme.new_order(csr_pem)
# Select HTTP-01 within offered challenges by the CA server
challb = select_http01_chall(orderr)
# The certificate is ready to be used in the variable "fullchain_pem".
fullchain_pem = perform_http01(client_acme, challb, orderr)
# Renew certificate
_, csr_pem = new_csr_comp(DOMAIN, pkey_pem)
orderr = client_acme.new_order(csr_pem)
challb = select_http01_chall(orderr)
# Performing challenge
fullchain_pem = perform_http01(client_acme, challb, orderr)
# Revoke certificate
fullchain_com = jose.ComparableX509(
OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_PEM, fullchain_pem))
try:
client_acme.revoke(fullchain_com, 0) # revocation reason = 0
except errors.ConflictError:
# Certificate already revoked.
pass
# Query registration status.
client_acme.net.account = regr
try:
regr = client_acme.query_registration(regr)
except errors.Error as err:
if err.typ == messages.OLD_ERROR_PREFIX + 'unauthorized' \
or err.typ == messages.ERROR_PREFIX + 'unauthorized':
# Status is deactivated.
pass
raise
# Change contact information
email = 'newfake@example.com'
regr = client_acme.update_registration(
regr.update(
body=regr.body.update(
contact=('mailto:' + email,)
)
)
)
# Deactivate account/registration
regr = client_acme.deactivate_registration(regr)
if __name__ == "__main__":
example_http()
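
The example above stops with the issued chain and the domain key held only in local variables. A minimal sketch of persisting those artifacts, assuming the `pkey_pem` (PEM bytes) and `fullchain_pem` (PEM string) values produced inside `example_http`; the file names are illustrative:

```python
# Editorial sketch, not part of the example module: persist the artifacts that
# example_http() leaves in local variables. File names are illustrative.
def save_certificate(pkey_pem: bytes, fullchain_pem: str) -> None:
    # new_csr_comp() returns the private key as PEM bytes; the finalized order
    # exposes the certificate chain as a PEM string.
    with open("domain_key.pem", "wb") as key_file:
        key_file.write(pkey_pem)
    with open("fullchain.pem", "w") as chain_file:
        chain_file.write(fullchain_pem)
```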


@@ -1,2 +0,0 @@
[pytest]
norecursedirs = .* build dist CVS _darcs {arch} *.egg


@@ -1,10 +1,10 @@
# readthedocs.org gives no way to change the install command to "pip
# install -e acme[docs]" (that would in turn install documentation
# install -e .[docs]" (that would in turn install documentation
# dependencies), but it allows to specify a requirements.txt file at
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
# Although ReadTheDocs certainly doesn't need to install the project
# in --editable mode (-e), just "pip install acme[docs]" does not work as
# expected and "pip install -e acme[docs]" must be used instead
# in --editable mode (-e), just "pip install .[docs]" does not work as
# expected and "pip install -e .[docs]" must be used instead
-e acme[docs]


@@ -1,36 +1,36 @@
from distutils.version import LooseVersion
import sys
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
from setuptools import find_packages
version = '1.12.0.dev0'
version = '0.21.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [
'cryptography>=2.1.4',
# load_pem_private/public_key (>=0.6)
# rsa_recover_prime_factors (>=0.8)
'cryptography>=0.8',
# formerly known as acme.jose:
# 1.1.0+ is required to avoid the warnings described at
# https://github.com/certbot/josepy/issues/13.
'josepy>=1.1.0',
'PyOpenSSL>=17.3.0',
'josepy>=1.0.0',
# Connection.set_tlsext_host_name (>=0.13)
'mock',
'PyOpenSSL>=0.13',
'pyrfc3339',
'pytz',
'requests[security]>=2.6.0', # security extras added in 2.4.1
'requests-toolbelt>=0.3.0',
'setuptools>=39.0.1',
'six>=1.11.0',
'requests[security]>=2.4.1', # security extras added in 2.4.1
# For pkg_resources. >=1.0 so pip resolves it to a version cryptography
# will tolerate; see #2599:
'setuptools>=1.0',
'six>=1.9.0', # needed for python_2_unicode_compatible
]
setuptools_known_environment_markers = (LooseVersion(setuptools_version) >= LooseVersion('36.2'))
if setuptools_known_environment_markers:
install_requires.append('mock ; python_version < "3.3"')
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Error, you are trying to build certbot wheels using an old version '
'of setuptools. Version 36.2+ of setuptools is required.')
elif sys.version_info < (3,3):
install_requires.append('mock')
# env markers cause problems with older pip and setuptools
if sys.version_info < (2, 7):
install_requires.extend([
'argparse',
'ordereddict',
])
dev_extras = [
'pytest',
@@ -43,6 +43,7 @@ docs_extras = [
'sphinx_rtd_theme',
]
setup(
name='acme',
version=version,
@@ -51,17 +52,19 @@ setup(
author="Certbot Project",
author_email='client-dev@letsencrypt.org',
license='Apache License 2.0',
python_requires='>=3.6',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],
@@ -73,4 +76,5 @@ setup(
'dev': dev_extras,
'docs': docs_extras,
},
test_suite='acme',
)


@@ -1,53 +0,0 @@
"""Tests for acme.jose shim."""
import importlib
import unittest
class JoseTest(unittest.TestCase):
"""Tests for acme.jose shim."""
def _test_it(self, submodule, attribute):
if submodule:
acme_jose_path = 'acme.jose.' + submodule
josepy_path = 'josepy.' + submodule
else:
acme_jose_path = 'acme.jose'
josepy_path = 'josepy'
acme_jose_mod = importlib.import_module(acme_jose_path)
josepy_mod = importlib.import_module(josepy_path)
self.assertIs(acme_jose_mod, josepy_mod)
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
# We use the imports below with eval, but pylint doesn't
# understand that.
import acme # pylint: disable=unused-import
import josepy # pylint: disable=unused-import
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
self.assertIs(acme_jose_mod, josepy_mod)
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
def test_top_level(self):
self._test_it('', 'RS512')
def test_submodules(self):
# This test ensures that the modules in josepy that were
# available at the time it was moved into its own package are
# available under acme.jose. Backwards compatibility with new
# modules or testing code is not maintained.
mods_and_attrs = [('b64', 'b64decode',),
('errors', 'Error',),
('interfaces', 'JSONDeSerializable',),
('json_util', 'Field',),
('jwa', 'HS256',),
('jwk', 'JWK',),
('jws', 'JWS',),
('util', 'ImmutableMap',),]
for mod, attr in mods_and_attrs:
self._test_it(mod, attr)
if __name__ == '__main__':
unittest.main() # pragma: no cover
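
For context on what these tests exercise: the `acme.jose` shim makes the legacy import paths resolve to `josepy` itself. A minimal sketch of how such an alias can be wired up through `sys.modules` (an illustration of the pattern, not necessarily the exact mechanism the acme package uses), assuming josepy is installed:

```python
import sys

import josepy

# Hypothetical alias wiring: expose josepy and its already-imported submodules
# under the legacy "acme.jose" names so old imports keep resolving.
sys.modules["acme.jose"] = josepy
for modname, module in list(sys.modules.items()):
    if modname.startswith("josepy."):
        sys.modules["acme.jose" + modname[len("josepy"):]] = module
```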


@@ -1,44 +0,0 @@
"""Tests for acme.magic_typing."""
import sys
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
class MagicTypingTest(unittest.TestCase):
"""Tests for acme.magic_typing."""
def test_import_success(self):
try:
import typing as temp_typing
except ImportError: # pragma: no cover
temp_typing = None # pragma: no cover
typing_class_mock = mock.MagicMock()
text_mock = mock.MagicMock()
typing_class_mock.Text = text_mock
sys.modules['typing'] = typing_class_mock
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text
self.assertEqual(Text, text_mock)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
def test_import_failure(self):
try:
import typing as temp_typing
except ImportError: # pragma: no cover
temp_typing = None # pragma: no cover
sys.modules['typing'] = None
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text
self.assertTrue(Text is None)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
if __name__ == '__main__':
unittest.main() # pragma: no cover
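
The behaviour asserted above (`Text` comes from `typing` when it is importable, and is `None` otherwise) can be provided by a small fallback module along these lines; this is a sketch of the pattern, not the actual `acme.magic_typing` source:

```python
# Sketch of a typing fallback module matching the behaviour the tests expect.
try:
    from typing import Text
except ImportError:  # pragma: no cover - typing is unavailable
    Text = None  # type: ignore
```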

Binary file not shown.


@@ -1,13 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIB/TCCAWagAwIBAgIJAOyRIBs3QT8QMA0GCSqGSIb3DQEBCwUAMBYxFDASBgNV
BAMMC2V4YW1wbGUuY29tMB4XDTE4MDQyMzEwMzE0NFoXDTE4MDUyMzEwMzE0NFow
FjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJ
AoGBAJqJ87R8aVwByONxgQA9hwgvQd/QqI1r1UInXhEF2VnEtZGtUWLi100IpIqr
Mq4qusDwNZ3g8cUPtSkvJGs89djoajMDIJP7lQUEKUYnYrI0q755Tr/DgLWSk7iW
l5ezym0VzWUD0/xXUz8yRbNMTjTac80rS5SZk2ja2wWkYlRJAgMBAAGjUzBRMB0G
A1UdDgQWBBSsaX0IVZ4XXwdeffVAbG7gnxSYjTAfBgNVHSMEGDAWgBSsaX0IVZ4X
XwdeffVAbG7gnxSYjTAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4GB
ADe7SVmvGH2nkwVfONk8TauRUDkePN1CJZKFb2zW1uO9ANJ2v5Arm/OQp0BG/xnI
Djw/aLTNVESF89oe15dkrUErtcaF413MC1Ld5lTCaJLHLGqDKY69e02YwRuxW7jY
qarpt7k7aR5FbcfO5r4V/FK/Gvp4Dmoky8uap7SJIW6x
-----END CERTIFICATE-----


@@ -1,30 +0,0 @@
-----BEGIN CERTIFICATE-----
MIIFDTCCAvWgAwIBAgIUImqDrP53V69vFROsjP/gL0YtoA4wDQYJKoZIhvcNAQEL
BQAwFjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wHhcNMjAwNTI3MjMyNDE0WhcNMjAw
NjI2MjMyNDE0WjAWMRQwEgYDVQQDDAtleGFtcGxlLmNvbTCCAiIwDQYJKoZIhvcN
AQEBBQADggIPADCCAgoCggIBANY9LKLk9Dxn0MUMQFHwBoTN4ehDSWBws2KcytpF
mc8m9Mfk1wmb4fQSKYtK3wIFMfIyo9HQu0nKqMkkUw52o3ZXyOv+oWwF5qNy2BKu
lh5OMSkaZ0o13zoPpW42e+IUnyxvg70+0urD+sUue4cyTHh/nBIUjrM/05ZJ/ac8
HR0RK3H41YoqBjq69JjMZczZZhbNFit3s6p0R1TbVAgc3ckqbtX5BDyQMQQCP4Ed
m4DgbAFVqdcPUCC5W3F3fmuQiPKHiADzONZnXpy6lUvLDWqcd6loKp+nKHM6OkXX
8hmD7pE1PYMQo4hqOfhBR2IgMjAShwd5qUFjl1m2oo0Qm3PFXOk6i2ZQdS6AA/yd
B5/mX0RnM2oIdFZPb6UZFSmtEgs9sTzn+hMUyNSZQRE54px1ur1xws2R+vbsCyM5
+KoFVxDjVjU9TlZx3GvDvnqz/tbHjji6l8VHZYOBMBUXbKHu2U6pJFZ5Zp7k68/z
a3Fb9Pjtn3iRkXEyC0N5kLgqO4QTlExnxebV8aMvQpWd/qefnMn9qPYIZPEXSQAR
mEBIahkcACb60s+acG0WFFluwBPtBqEr8Q67XlSF0Ibf4iBiRzpPobhlWta1nrFg
4IWHMSoZ0PE75bhIGBEkhrpcXQCAxXmAfxfjKDH7jdJ1fRdnZ/9+OzwYGVX5GH/l
0QDtAgMBAAGjUzBRMB0GA1UdDgQWBBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAfBgNV
HSMEGDAWgBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAPBgNVHRMBAf8EBTADAQH/MA0G
CSqGSIb3DQEBCwUAA4ICAQAELoXz31oR9pdAwidlv9ZBOKiC7KBWy8VMqXNVkfTn
bVRxAUex7zleLFIOkWnqadsMesU9sIwrbLzBcZ8Q/vBY+z2xOPdXcgcAoAmdKWoq
YBQNiqng9r54sqlzB/77QZCf5fdktESe7NTxhCifgx5SAWq7IUQs/lm3tnMUSAfE
5ctuN6M+w8K54y3WDprcfMHpnc3ZHeSPhVQApHM0h/bDvXq0bRS7kmq27Hb153Qm
nH3TwYB5pPSWW38NbUc+s/a7mItO7S8ly8yGbA0j9c/IbN5lM+OCdk06asz3+c8E
uo8nuCBoYO5+6AqC2N7WJ3Tdr/pFA8jTbd6VNVlgCWTIR8ZosL5Fgkfv+4fUBrHt
zdVUqMUzvga5rvZnwnJ5Qfu/drHeAAo9MTNFQNe2QgDlYfWBh5GweolgmFSwrpkY
v/5wLtIyv/ASHKswybbqMIlpttcLTXjx5yuh8swttT6Wh+FQqqQ32KSRB3StiwyK
oH0ZhrwYHiFYNlPxecGX6XUta6rFtTlEdkBGSnXzgiTzL2l+Nc0as0V5B9RninZG
qJ+VOChSQ0OFvg1riSXv7tMvbLdGQnxwTRL3t6BMS8I4LA2m3ZfWUcuXT783ODTH
16f1Q1AgXd2csstTWO9cv+N/0fpX31nqrm6+CrGduSr2u4HjYYnlLIUhmdTvK3fX
Fg==
-----END CERTIFICATE-----


@@ -1,51 +0,0 @@
-----BEGIN RSA PRIVATE KEY-----
MIIJKgIBAAKCAgEA1j0souT0PGfQxQxAUfAGhM3h6ENJYHCzYpzK2kWZzyb0x+TX
CZvh9BIpi0rfAgUx8jKj0dC7ScqoySRTDnajdlfI6/6hbAXmo3LYEq6WHk4xKRpn
SjXfOg+lbjZ74hSfLG+DvT7S6sP6xS57hzJMeH+cEhSOsz/Tlkn9pzwdHRErcfjV
iioGOrr0mMxlzNlmFs0WK3ezqnRHVNtUCBzdySpu1fkEPJAxBAI/gR2bgOBsAVWp
1w9QILlbcXd+a5CI8oeIAPM41mdenLqVS8sNapx3qWgqn6coczo6RdfyGYPukTU9
gxCjiGo5+EFHYiAyMBKHB3mpQWOXWbaijRCbc8Vc6TqLZlB1LoAD/J0Hn+ZfRGcz
agh0Vk9vpRkVKa0SCz2xPOf6ExTI1JlBETninHW6vXHCzZH69uwLIzn4qgVXEONW
NT1OVnHca8O+erP+1seOOLqXxUdlg4EwFRdsoe7ZTqkkVnlmnuTrz/NrcVv0+O2f
eJGRcTILQ3mQuCo7hBOUTGfF5tXxoy9ClZ3+p5+cyf2o9ghk8RdJABGYQEhqGRwA
JvrSz5pwbRYUWW7AE+0GoSvxDrteVIXQht/iIGJHOk+huGVa1rWesWDghYcxKhnQ
8TvluEgYESSGulxdAIDFeYB/F+MoMfuN0nV9F2dn/347PBgZVfkYf+XRAO0CAwEA
AQKCAgEA0hZdTkQtCYtYm9LexDsXeWYX8VcCfrMmBj7xYcg9A3oVMmzDPuYBVwH0
gWbjd6y2hOaJ5TfGYZ99kvmvBRDsTSHaoyopC7BhssjtAKz6Ay/0X3VH8usPQ3WS
aZi+NT65tK6KRqtz08ppgLGLa1G00bl5x/Um1rpxeACI4FU/y4BJ1VMJvJpnT3KE
Z86Qyagqx5NH+UpCApZSWPFX3zjHePzGgcfXErjniCHYOnpZQrFQ2KIzkfSvQ9fg
x01ByKOM2CB2C1B33TCzBAioXRH6zyAu7A59NeCK9ywTduhDvie1a+oEryFC7IQW
4s7I/H3MGX4hsf/pLXlHMy+5CZJOjRaC2h+pypfbbcuiXu6Sn64kHNpiI7SxI5DI
MIRjyG7MdUcrzq0Rt8ogwwpbCoRqrl/w3bhxtqmeZaEZtyxbjlm7reK2YkIFDgyz
JMqiJK5ZAi+9L/8c0xhjjAQQ0sIzrjmjA8U+6YnWL9jU5qXTVnBB8XQucyeeZGgk
yRHyMur71qOXN8z3UEva7MHkDTUBlj8DgTz6sEjqCipaWl0CXfDNa4IhHIXD5qiF
wplhq7OeS0v6EGG/UFa3Q/lFntxtrayxJX7uvvSccGzjPKXTjpWUELLi/FdnIsum
eXT3RgIEYozj4BibDXaBLfHTCVzxOr7AAEvKM9XWSUgLA0paSWECggEBAO9ZBeE1
GWzd1ejTTkcxBC9AK2rNsYG8PdNqiof/iTbuJWNeRqpG+KB/0CNIpjZ2X5xZd0tM
FDpHTFehlP26Roxuq50iRAFc+SN5KoiO0A3JuJAidreIgRTia1saUUrypHqWrYEA
VZVj2AI8Pyg3s1OkR2frFskY7hXBVb/pJNDP/m9xTXXIYiIXYkHYe+4RIJCnAxRv
q5YHKaX+0Ull9YCZJCxmwvcHat8sgu8qkiwUMEM6QSNEkrEbdnWYBABvC1AR6sws
7MP1h9+j22n4Zc/3D6kpFZEL9Erx8nNyhbOZ6q2Tdnf6YKVVjZdyVa8VyNnR0ROl
3BjkFaHb/bg4e4kCggEBAOUk8ZJS3qBeGCOjug384zbHGcnhUBYtYJiOz+RXBtP+
PRksbFtTkgk1sHuSGO8YRddU4Qv7Av1xL8o+DEsLBSD0YQ7pmLrR/LK+iDQ5N63O
Fve9uJH0ybxAOkiua7G24+lTsVUP//KWToL4Wh5zbHBBjL5D2Z9zoeVbcE87xhva
lImMVr4Ex252DqNP9wkZxBjudFyJ/C/TnXrjPcgwhxWTC7sLQMhE5p+490G7c4hX
PywkIKrANbu37KDiAvVS+dC66ZgpL/NUDkeloAmGNO08LGzbV6YKchlvDyWU/AvW
0hYjbL0FUq7K/wp1G9fumolB+fbI25K9c13X93STzUUCggEBAJDsNFUyk5yJjbYW
C/WrRj9d+WwH9Az77+uNPSgvn+O0usq6EMuVgYGdImfa21lqv2Wp/kOHY1AOT7lX
yyD+oyzw7dSNJOQ2aVwDR6+72Vof5DLRy1RBwPbmSd61xrc8yD658YCEtU1pUSe5
VvyBDYH9nIbdn8RP5gkiMUusXXBaIFNWJXLFzDWcNxBrhk6V7EPp/EFphFmpKJyr
+AkbRVWCZJbF+hMdWKadCwLJogwyhS6PnVU/dhrq6AU38GRa2Fy5HJRYN1xH1Oej
DX3Su8L6c28Xw0k6FcczTHx+wVoIPkKvYTIwVkiFzt/+iMckx6KsGo5tBSHFKRwC
WlQrTxECggEBALjUruLnY1oZ7AC7bTUhOimSOfQEgTQSUCtebsRxijlvhtsKYTDd
XRt+qidStjgN7S/+8DRYuZWzOeg5WnMhpXZqiOudcyume922IGl3ibjxVsdoyjs5
J4xohlrgDlBgBMDNWGoTqNGFejjcmNydH+gAh8VlN2INxJYbxqCyx17qVgwJHmLR
uggYxD/pHYvCs9GkbknCp5/wYsOgDtKuihfV741lS1D/esN1UEQ+LrfYIEW7snno
5q7Pcdhn1hkKYCWEzy2Ec4Aj2gzixQ9JqOF/OxpnZvCw1k47rg0TeqcWFYnz8x8Y
7xO8/DH0OoxXk2GJzVXJuItJs4gLzzfCjL0CggEAJFHfC9jisdy7CoWiOpNCSF1B
S0/CWDz77cZdlWkpTdaXGGp1MA/UKUFPIH8sOHfvpKS660+X4G/1ZBHmFb4P5kFF
Qy8UyUMKtSOEdZS6KFlRlfSCAMd5aSTmCvq4OSjYEpMRwUhU/iEJNkn9Z1Soehe0
U3dxJ8KiT1071geO6rRquSHoSJs6Y0WQKriYYQJOhh4Axs3PQihER2eyh+WGk8YJ
02m0mMsjntqnXtdc6IcdKaHp9ko+OpM9QZLsvt19fxBcrXj/i21uUXrzuNtKfO6M
JqGhsOrO2dh8lMhvodENvgKA0DmYDC9N7ogo7bxTNSedcjBF46FhJoqii8m70Q==
-----END RSA PRIVATE KEY-----


@@ -1,7 +1,7 @@
include LICENSE.txt
include README.rst
recursive-include tests *
recursive-include certbot_apache/_internal/augeas_lens *.aug
recursive-include certbot_apache/_internal/tls_configs *.conf
global-exclude __pycache__
global-exclude *.py[cod]
recursive-include docs *
recursive-include certbot_apache/tests/testdata *
include certbot_apache/centos-options-ssl-apache.conf
include certbot_apache/options-ssl-apache.conf
recursive-include certbot_apache/augeas_lens *.aug


@@ -1 +0,0 @@
"""Certbot Apache plugin."""


@@ -1,257 +0,0 @@
""" Utility functions for certbot-apache plugin """
import binascii
import fnmatch
import logging
import re
import subprocess
import pkg_resources
from certbot import errors
from certbot import util
from certbot.compat import os
logger = logging.getLogger(__name__)
def get_mod_deps(mod_name):
"""Get known module dependencies.
.. note:: This does not need to be accurate in order for the client to
run. This simply keeps things clean if the user decides to revert
changes.
.. warning:: If all deps are not included, it may cause incorrect parsing
behavior, due to enable_mod's shortcut for updating the parser's
currently defined modules (`.ApacheParser.add_mod`).
This would only present a major problem in extremely atypical
configs that use ifmod for the missing deps.
"""
deps = {
"ssl": ["setenvif", "mime"]
}
return deps.get(mod_name, [])
def get_file_path(vhost_path):
"""Get file path from augeas_vhost_path.
Takes in Augeas path and returns the file name
:param str vhost_path: Augeas virtual host path
:returns: filename of vhost
:rtype: str
"""
if not vhost_path or not vhost_path.startswith("/files/"):
return None
return _split_aug_path(vhost_path)[0]
def get_internal_aug_path(vhost_path):
"""Get the Augeas path for a vhost with the file path removed.
:param str vhost_path: Augeas virtual host path
:returns: Augeas path to vhost relative to the containing file
:rtype: str
"""
return _split_aug_path(vhost_path)[1]
def _split_aug_path(vhost_path):
"""Splits an Augeas path into a file path and an internal path.
After removing "/files", this function splits vhost_path into the
file path and the remaining Augeas path.
:param str vhost_path: Augeas virtual host path
:returns: file path and internal Augeas path
:rtype: `tuple` of `str`
"""
# Strip off /files
file_path = vhost_path[6:]
internal_path = []
# Remove components from the end of file_path until it becomes valid
while not os.path.exists(file_path):
file_path, _, internal_path_part = file_path.rpartition("/")
internal_path.append(internal_path_part)
return file_path, "/".join(reversed(internal_path))
def parse_define_file(filepath, varname):
""" Parses Defines from a variable in configuration file
:param str filepath: Path of file to parse
:param str varname: Name of the variable
:returns: Dict of Define:Value pairs
:rtype: `dict`
"""
return_vars = {}
# Get list of words in the variable
a_opts = util.get_var_from_file(varname, filepath).split()
for i, v in enumerate(a_opts):
# Handle Define statements and make sure it has an argument
if v == "-D" and len(a_opts) >= i+2:
var_parts = a_opts[i+1].partition("=")
return_vars[var_parts[0]] = var_parts[2]
elif len(v) > 2 and v.startswith("-D"):
# Found var with no whitespace separator
var_parts = v[2:].partition("=")
return_vars[var_parts[0]] = var_parts[2]
return return_vars
def unique_id():
""" Returns an unique id to be used as a VirtualHost identifier"""
return binascii.hexlify(os.urandom(16)).decode("utf-8")
def included_in_paths(filepath, paths):
"""
Returns true if the filepath is included in the list of paths
that may contain full paths or wildcard paths that need to be
expanded.
:param str filepath: Filepath to check
:param list paths: List of paths to check against
:returns: True if included
:rtype: bool
"""
return any(fnmatch.fnmatch(filepath, path) for path in paths)
def parse_defines(apachectl):
"""
Gets Defines from httpd process and returns a dictionary of
the defined variables.
:param str apachectl: Path to apachectl executable
:returns: dictionary of defined variables
:rtype: dict
"""
variables = {}
define_cmd = [apachectl, "-t", "-D",
"DUMP_RUN_CFG"]
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
try:
matches.remove("DUMP_RUN_CFG")
except ValueError:
return {}
for match in matches:
if match.count("=") > 1:
logger.error("Unexpected number of equal signs in "
"runtime config dump.")
raise errors.PluginError(
"Error parsing Apache runtime variables")
parts = match.partition("=")
variables[parts[0]] = parts[2]
return variables
def parse_includes(apachectl):
"""
Gets Include directives from httpd process and returns a list of
their values.
:param str apachectl: Path to apachectl executable
:returns: list of found Include directive values
:rtype: list of str
"""
inc_cmd = [apachectl, "-t", "-D",
"DUMP_INCLUDES"]
return parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
def parse_modules(apachectl):
"""
Get loaded modules from httpd process, and return the list
of loaded module names.
:param str apachectl: Path to apachectl executable
:returns: list of found LoadModule module names
:rtype: list of str
"""
mod_cmd = [apachectl, "-t", "-D",
"DUMP_MODULES"]
return parse_from_subprocess(mod_cmd, r"(.*)_module")
def parse_from_subprocess(command, regexp):
"""Get values from stdout of subprocess command
:param list command: Command to run
:param str regexp: Regexp for parsing
:returns: list parsed from command output
:rtype: list
"""
stdout = _get_runtime_cfg(command)
return re.compile(regexp).findall(stdout)
def _get_runtime_cfg(command):
"""
Get runtime configuration info.
:param command: Command to run
:returns: stdout from command
"""
try:
proc = subprocess.Popen(
command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True,
env=util.env_no_snap_for_external_calls())
stdout, stderr = proc.communicate()
except (OSError, ValueError):
logger.error(
"Error running command %s for runtime parameters!%s",
command, os.linesep)
raise errors.MisconfigurationError(
"Error accessing loaded Apache parameters: {0}".format(
command))
# Small errors that do not impede
if proc.returncode != 0:
logger.warning("Error in checking parameter list: %s", stderr)
raise errors.MisconfigurationError(
"Apache is unable to check whether or not the module is "
"loaded because Apache is misconfigured.")
return stdout
def find_ssl_apache_conf(prefix):
"""
Find a TLS Apache config file in the dedicated storage.
:param str prefix: prefix of the TLS Apache config file to find
:return: the path of the TLS Apache config file
:rtype: str
"""
return pkg_resources.resource_filename(
"certbot_apache",
os.path.join("_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))
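
To make the runtime-parsing helpers above more concrete, here is a small self-contained sketch of the Define extraction that `parse_defines()` performs on `apachectl -t -D DUMP_RUN_CFG` output; the sample output below is invented for illustration:

```python
import re

# Invented sample in the style of `apachectl -t -D DUMP_RUN_CFG` output.
sample_stdout = """ServerRoot: "/etc/apache2"
Define: DUMP_RUN_CFG
Define: MODSEC_2.5
Define: OCSP_DIR=/var/run/ocsp
"""

# The same regexp that parse_defines() hands to parse_from_subprocess().
matches = re.compile(r"Define: ([^ \n]*)").findall(sample_stdout)
matches.remove("DUMP_RUN_CFG")  # marks a successful dump, not a real Define

variables = {}
for match in matches:
    name, _, value = match.partition("=")
    variables[name] = value

print(variables)  # {'MODSEC_2.5': '', 'OCSP_DIR': '/var/run/ocsp'}
```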


@@ -1,172 +0,0 @@
""" apacheconfig implementation of the ParserNode interfaces """
from certbot_apache._internal import assertions
from certbot_apache._internal import interfaces
from certbot_apache._internal import parsernode_util as util
class ApacheParserNode(interfaces.ParserNode):
""" apacheconfig implementation of ParserNode interface.
Expects metadata `ac_ast` to be passed in, where `ac_ast` is the AST provided
by parsing the equivalent configuration text using the apacheconfig library.
"""
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
super(ApacheParserNode, self).__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self._raw = self.metadata["ac_ast"]
def save(self, msg): # pragma: no cover
pass
def find_ancestors(self, name): # pylint: disable=unused-variable
"""Find ancestor BlockNodes with a given name"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
class ApacheCommentNode(ApacheParserNode):
""" apacheconfig implementation of CommentNode interface """
def __init__(self, **kwargs):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super(ApacheCommentNode, self).__init__(**kwargs)
self.comment = comment
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata and
self.filepath == other.filepath)
return False
class ApacheDirectiveNode(ApacheParserNode):
""" apacheconfig implementation of DirectiveNode interface """
def __init__(self, **kwargs):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super(ApacheDirectiveNode, self).__init__(**kwargs)
self.name = name
self.parameters = parameters
self.enabled = enabled
self.include = None
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def set_parameters(self, _parameters): # pragma: no cover
"""Sets the parameters for DirectiveNode"""
return
class ApacheBlockNode(ApacheDirectiveNode):
""" apacheconfig implementation of BlockNode interface """
def __init__(self, **kwargs):
super(ApacheBlockNode, self).__init__(**kwargs)
self.children = ()
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.children == other.children and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
new_block = ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)
self.children += (new_block,)
return new_block
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
new_dir = ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)
self.children += (new_dir,)
return new_dir
# pylint: disable=unused-argument
def add_child_comment(self, comment="", position=None): # pragma: no cover
"""Adds a new CommentNode to the sequence of children"""
new_comment = ApacheCommentNode(comment=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)
self.children += (new_comment,)
return new_comment
def find_blocks(self, name, exclude=True): # pylint: disable=unused-argument
"""Recursive search of BlockNodes from the sequence of children"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def find_directives(self, name, exclude=True): # pylint: disable=unused-argument
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
# pylint: disable=unused-argument
def find_comments(self, comment, exact=False): # pragma: no cover
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheCommentNode(comment=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def delete_child(self, child): # pragma: no cover
"""Deletes a ParserNode from the sequence of children"""
return
def unsaved_files(self): # pragma: no cover
"""Returns a list of unsaved filepaths"""
return [assertions.PASS]
def parsed_paths(self): # pragma: no cover
"""Returns a list of parsed configuration file paths"""
return [assertions.PASS]
interfaces.CommentNode.register(ApacheCommentNode)
interfaces.DirectiveNode.register(ApacheDirectiveNode)
interfaces.BlockNode.register(ApacheBlockNode)


@@ -1,142 +0,0 @@
"""Dual parser node assertions"""
import fnmatch
from certbot_apache._internal import interfaces
PASS = "CERTBOT_PASS_ASSERT"
def assertEqual(first, second):
""" Equality assertion """
if isinstance(first, interfaces.CommentNode):
assertEqualComment(first, second)
elif isinstance(first, interfaces.DirectiveNode):
assertEqualDirective(first, second)
# Do an extra interface implementation assertion, as the contents were
# already checked for BlockNode in the assertEqualDirective
if isinstance(first, interfaces.BlockNode):
assert isinstance(second, interfaces.BlockNode)
# Skip the tests if filepath includes the pass value. This is done
# because filepath is a variable of the base ParserNode interface, and
# unless the implementation is actually done, we cannot expect the
# boolean assertion for dirty to give correct results.
if not isPass(first.filepath) and not isPass(second.filepath):
assert first.dirty == second.dirty
# We might want to disable this later if testing with two separate
# (but identical) directory structures.
assert first.filepath == second.filepath
def assertEqualComment(first, second): # pragma: no cover
""" Equality assertion for CommentNode """
assert isinstance(first, interfaces.CommentNode)
assert isinstance(second, interfaces.CommentNode)
if not isPass(first.comment) and not isPass(second.comment): # type: ignore
assert first.comment == second.comment # type: ignore
def _assertEqualDirectiveComponents(first, second): # pragma: no cover
""" Handles assertion for instance variables for DirectiveNode and BlockNode"""
# Enabled value cannot be asserted, because Augeas implementation
# is unable to figure that out.
# assert first.enabled == second.enabled
if not isPass(first.name) and not isPass(second.name):
assert first.name == second.name
if not isPass(first.parameters) and not isPass(second.parameters):
assert first.parameters == second.parameters
def assertEqualDirective(first, second):
""" Equality assertion for DirectiveNode """
assert isinstance(first, interfaces.DirectiveNode)
assert isinstance(second, interfaces.DirectiveNode)
_assertEqualDirectiveComponents(first, second)
def isPass(value): # pragma: no cover
"""Checks if the value is set to PASS"""
if isinstance(value, bool):
return True
return PASS in value
def isPassDirective(block):
""" Checks if BlockNode or DirectiveNode should pass the assertion """
if isPass(block.name):
return True
if isPass(block.parameters): # pragma: no cover
return True
if isPass(block.filepath): # pragma: no cover
return True
return False
def isPassComment(comment):
""" Checks if CommentNode should pass the assertion """
if isPass(comment.comment):
return True
if isPass(comment.filepath): # pragma: no cover
return True
return False
def isPassNodeList(nodelist): # pragma: no cover
""" Checks if a ParserNode in the nodelist should pass the assertion,
this function is used for results of find_* methods. Unimplemented find_*
methods should return a sequence containing a single ParserNode instance
with assertion pass string."""
try:
node = nodelist[0]
except IndexError:
node = None
if not node: # pragma: no cover
return False
if isinstance(node, interfaces.DirectiveNode):
return isPassDirective(node)
return isPassComment(node)
def assertEqualSimple(first, second):
""" Simple assertion """
if not isPass(first) and not isPass(second):
assert first == second
def isEqualVirtualHost(first, second):
"""
Checks that two VirtualHost objects are similar. There are some built-in
differences between the implementations: a VirtualHost created by the
ParserNode implementation doesn't have "path" defined, as that was used for
the Augeas path and obviously cannot be carried over. Similarly, the legacy
version lacks the "node" variable, which holds a reference to the BlockNode
for the VirtualHost.
"""
return (
first.name == second.name and
first.aliases == second.aliases and
first.filep == second.filep and
first.addrs == second.addrs and
first.ssl == second.ssl and
first.enabled == second.enabled and
first.modmacro == second.modmacro and
first.ancestor == second.ancestor
)
def assertEqualPathsList(first, second): # pragma: no cover
"""
Checks that the two lists of file paths match. This assertion allows for wildcard
paths.
"""
if any(isPass(path) for path in first):
return
if any(isPass(path) for path in second):
return
for fpath in first:
assert any([fnmatch.fnmatch(fpath, spath) for spath in second])
for spath in second:
assert any([fnmatch.fnmatch(fpath, spath) for fpath in first])
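
A short usage sketch of the sentinel logic above, assuming the module is importable under the `certbot_apache._internal` package shown in the imports (the sample values are illustrative):

```python
from certbot_apache._internal import assertions

# PASS marks "this implementation did not produce a real value, skip the check".
assert assertions.isPass(assertions.PASS)
assert assertions.isPass(True)  # booleans cannot be asserted reliably either
assert not assertions.isPass("/etc/apache2/apache2.conf")

# Only compares when neither side carries the sentinel.
assertions.assertEqualSimple("Listen", "Listen")           # real comparison
assertions.assertEqualSimple(assertions.PASS, "anything")  # skipped, no error
```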


@@ -1,538 +0,0 @@
"""
Augeas implementation of the ParserNode interfaces.
Augeas works internally by using XPATH notation. The following is a short example
of how this all works internally, to better understand what's going on under the
hood.
A configuration file /etc/apache2/apache2.conf with the following content:
# First comment line
# Second comment line
WhateverDirective whatevervalue
<ABlock>
DirectiveInABlock dirvalue
</ABlock>
SomeDirective somedirectivevalue
<ABlock>
AnotherDirectiveInABlock dirvalue
</ABlock>
# Yet another comment
Translates over to Augeas path notation (of immediate children), when calling
for example: aug.match("/files/etc/apache2/apache2.conf/*")
[
"/files/etc/apache2/apache2.conf/#comment[1]",
"/files/etc/apache2/apache2.conf/#comment[2]",
"/files/etc/apache2/apache2.conf/directive[1]",
"/files/etc/apache2/apache2.conf/ABlock[1]",
"/files/etc/apache2/apache2.conf/directive[2]",
"/files/etc/apache2/apache2.conf/ABlock[2]",
"/files/etc/apache2/apache2.conf/#comment[3]"
]
Regardless of directives name, its key in the Augeas tree is always "directive",
with index where needed of course. Comments work similarly, while blocks
have their own key in the Augeas XPATH notation.
It's important to note that all of the unique keys have their own indices.
Augeas paths are case sensitive, while Apache configuration is case insensitive.
It looks like this:
<block>
directive value
</block>
<Block>
Directive Value
</Block>
<block>
directive value
</block>
<bLoCk>
DiReCtiVe VaLuE
</bLoCk>
Translates over to:
[
"/files/etc/apache2/apache2.conf/block[1]",
"/files/etc/apache2/apache2.conf/Block[1]",
"/files/etc/apache2/apache2.conf/block[2]",
"/files/etc/apache2/apache2.conf/bLoCk[1]",
]
"""
from acme.magic_typing import Set
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import assertions
from certbot_apache._internal import interfaces
from certbot_apache._internal import parser
from certbot_apache._internal import parsernode_util as util
class AugeasParserNode(interfaces.ParserNode):
""" Augeas implementation of ParserNode interface """
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
super(AugeasParserNode, self).__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self.parser = self.metadata.get("augeasparser")
try:
if self.metadata["augeaspath"].endswith("/"):
raise errors.PluginError(
"Augeas path: {} has a trailing slash".format(
self.metadata["augeaspath"]
)
)
except KeyError:
raise errors.PluginError("Augeas path is required")
def save(self, msg):
self.parser.save(msg)
def find_ancestors(self, name):
"""
Searches for ancestor BlockNodes with a given name.
:param str name: Name of the BlockNode parent to search for
:returns: List of matching ancestor nodes.
:rtype: list of AugeasBlockNode
"""
ancestors = []
parent = self.metadata["augeaspath"]
while True:
# Get the path of ancestor node
parent = parent.rpartition("/")[0]
# Root of the tree
if not parent or parent == "/files":
break
anc = self._create_blocknode(parent)
if anc.name.lower() == name.lower():
ancestors.append(anc)
return ancestors
def _create_blocknode(self, path):
"""
Helper function to create a BlockNode from an Augeas path. This is used by
AugeasParserNode.find_ancestors and AugeasBlockNode.find_blocks.
"""
name = self._aug_get_name(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasBlockNode(name=name,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_get_name(self, path):
"""
Helper function to get name of a configuration block or variable from path.
"""
# Remove the ending slash if any
if path[-1] == "/": # pragma: no cover
path = path[:-1]
# Get the block name
name = path.split("/")[-1]
# remove [...], it's not allowed in Apache configuration and is used
# for indexing within Augeas
name = name.split("[")[0]
return name
class AugeasCommentNode(AugeasParserNode):
""" Augeas implementation of CommentNode interface """
def __init__(self, **kwargs):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super(AugeasCommentNode, self).__init__(**kwargs)
self.comment = comment
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.filepath == other.filepath and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
class AugeasDirectiveNode(AugeasParserNode):
""" Augeas implementation of DirectiveNode interface """
def __init__(self, **kwargs):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super(AugeasDirectiveNode, self).__init__(**kwargs)
self.name = name
self.enabled = enabled
if parameters:
self.set_parameters(parameters)
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def set_parameters(self, parameters):
"""
Sets parameters of a DirectiveNode or BlockNode object.
:param list parameters: List of all parameters for the node to set.
"""
orig_params = self._aug_get_params(self.metadata["augeaspath"])
# Clear out old parameters
for _ in orig_params:
# When the first parameter is removed, the indices get updated
param_path = "{}/arg[1]".format(self.metadata["augeaspath"])
self.parser.aug.remove(param_path)
# Insert new ones
for pi, param in enumerate(parameters):
param_path = "{}/arg[{}]".format(self.metadata["augeaspath"], pi+1)
self.parser.aug.set(param_path, param)
@property
def parameters(self):
"""
Fetches the parameters from the Augeas tree, ensuring that the sequence
always represents the current state.
:returns: Tuple of parameters for this DirectiveNode
:rtype: tuple
"""
return tuple(self._aug_get_params(self.metadata["augeaspath"]))
def _aug_get_params(self, path):
"""Helper function to get parameters for DirectiveNodes and BlockNodes"""
arg_paths = self.parser.aug.match(path + "/arg")
return [self.parser.get_arg(apath) for apath in arg_paths]
class AugeasBlockNode(AugeasDirectiveNode):
""" Augeas implementation of BlockNode interface """
def __init__(self, **kwargs):
super(AugeasBlockNode, self).__init__(**kwargs)
self.children = ()
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.children == other.children and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
name,
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new block
self.parser.aug.insert(insertpath, name, before)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
# Parameters will be set at the initialization of the new object
new_block = AugeasBlockNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_block
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
if not parameters:
raise errors.PluginError("Directive requires parameters and none were set.")
insertpath, realpath, before = self._aug_resolve_child_position(
"directive",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new directive
self.parser.aug.insert(insertpath, "directive", before)
# Set the directive key
self.parser.aug.set(realpath, name)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
new_dir = AugeasDirectiveNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_dir
def add_child_comment(self, comment="", position=None):
"""Adds a new CommentNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
"#comment",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new comment
self.parser.aug.insert(insertpath, "#comment", before)
# Set the comment content
self.parser.aug.set(realpath, comment)
new_comment = AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_comment
def find_blocks(self, name, exclude=True):
"""Recursive search of BlockNodes from the sequence of children"""
nodes = []
paths = self._aug_find_blocks(name)
if exclude:
paths = self.parser.exclude_dirs(paths)
for path in paths:
nodes.append(self._create_blocknode(path))
return nodes
def find_directives(self, name, exclude=True):
"""Recursive search of DirectiveNodes from the sequence of children"""
nodes = []
ownpath = self.metadata.get("augeaspath")
directives = self.parser.find_dir(name, start=ownpath, exclude=exclude)
already_parsed = set() # type: Set[str]
for directive in directives:
# Remove the /arg part from the Augeas path
directive = directive.partition("/arg")[0]
# find_dir returns an object for each _parameter_ of a directive
# so we need to filter out duplicates.
if directive not in already_parsed:
nodes.append(self._create_directivenode(directive))
already_parsed.add(directive)
return nodes
def find_comments(self, comment):
"""
Recursive search of CommentNodes from the sequence of children.
:param str comment: Comment content to search for.
"""
nodes = []
ownpath = self.metadata.get("augeaspath")
comments = self.parser.find_comments(comment, start=ownpath)
for com in comments:
nodes.append(self._create_commentnode(com))
return nodes
def delete_child(self, child):
"""
Deletes a ParserNode from the sequence of children, and raises an
exception if it's unable to do so.
:param AugeasParserNode child: A node to delete.
"""
if not self.parser.aug.remove(child.metadata["augeaspath"]):
raise errors.PluginError(
("Could not delete child node, the Augeas path: {} doesn't " +
"seem to exist.").format(child.metadata["augeaspath"])
)
def unsaved_files(self):
"""Returns a list of unsaved filepaths"""
return self.parser.unsaved_files()
def parsed_paths(self):
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
example: ['/etc/apache2/conf.d/*.load']
This is typically called on the root node of the ParserNode tree.
:returns: list of file paths of files that have been parsed
"""
res_paths = []
paths = self.parser.existing_paths
for directory in paths:
for filename in paths[directory]:
res_paths.append(os.path.join(directory, filename))
return res_paths
def _create_commentnode(self, path):
"""Helper function to create a CommentNode from Augeas path"""
comment = self.parser.aug.get(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Because of the dynamic nature of AugeasParser and the fact that we're
# not populating the complete node tree, the ancestor has a dummy value
return AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _create_directivenode(self, path):
"""Helper function to create a DirectiveNode from Augeas path"""
name = self.parser.get_arg(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasDirectiveNode(name=name,
ancestor=assertions.PASS,
enabled=enabled,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_find_blocks(self, name):
"""Helper function to perform a search to Augeas DOM tree to search
configuration blocks with a given name"""
# The code here is modified from configurator.get_virtual_hosts()
blk_paths = set()
for vhost_path in list(self.parser.parser_paths):
paths = self.parser.aug.match(
("/files%s//*[label()=~regexp('%s')]" %
(vhost_path, parser.case_i(name))))
blk_paths.update([path for path in paths if
name.lower() in os.path.basename(path).lower()])
return blk_paths
def _aug_resolve_child_position(self, name, position):
"""
Helper function that iterates through the immediate children and figures
out the insertion path for a new AugeasParserNode.
Augeas also generalizes indices for directives and comments, simply by
using "directive" or "comment" respectively as their names.
This function iterates over the existing children of the AugeasBlockNode,
returning the insertion path, the resulting Augeas path, and whether the new
node should be inserted before or after the returned insertion path.
Note: while Apache is case insensitive, Augeas is not, and blocks like
Nameofablock and NameOfABlock have different indices.
:param str name: Name of the AugeasBlockNode to insert, "directive" for
AugeasDirectiveNode or "comment" for AugeasCommentNode
:param int position: The position to insert the child AugeasParserNode to
:returns: Tuple of insert path, resulting path and a boolean if the new
node should be inserted before it.
:rtype: tuple of str, str, bool
"""
# Default to appending
before = False
all_children = self.parser.aug.match("{}/*".format(
self.metadata["augeaspath"])
)
# Calculate resulting_path
# Augeas indices start at 1. We use counter to calculate the index to
# be used in resulting_path.
counter = 1
for i, child in enumerate(all_children):
if position is not None and i >= position:
# We're not going to insert the new node to an index after this
break
childname = self._aug_get_name(child)
if name == childname:
counter += 1
resulting_path = "{}/{}[{}]".format(
self.metadata["augeaspath"],
name,
counter
)
# Form the correct insert_path
# Inserting the only child and appending as the last child work
# similarly in Augeas.
append = not all_children or position is None or position >= len(all_children)
if append:
insert_path = "{}/*[last()]".format(
self.metadata["augeaspath"]
)
elif position == 0:
# Insert as the first child, before the current first one.
insert_path = all_children[0]
before = True
else:
insert_path = "{}/*[{}]".format(
self.metadata["augeaspath"],
position
)
return (insert_path, resulting_path, before)
interfaces.CommentNode.register(AugeasCommentNode)
interfaces.DirectiveNode.register(AugeasDirectiveNode)
interfaces.BlockNode.register(AugeasBlockNode)
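
To complement the path-notation walkthrough in the module docstring, here is a minimal sketch of inspecting the same kind of paths directly with the python-augeas bindings, assuming python-augeas is installed, the Httpd lens is available, and /etc/apache2/apache2.conf exists:

```python
import augeas

# Load only apache2.conf through the Httpd lens to keep startup fast.
aug = augeas.Augeas(flags=augeas.Augeas.NO_MODL_AUTOLOAD)
aug.set("/augeas/load/Httpd/lens", "Httpd.lns")
aug.set("/augeas/load/Httpd/incl", "/etc/apache2/apache2.conf")
aug.load()

# Immediate children, in the same shape as the docstring example:
# .../#comment[1], .../directive[1], .../ABlock[1], ...
for path in aug.match("/files/etc/apache2/apache2.conf/*"):
    print(path, "->", aug.get(path))
```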

Some files were not shown because too many files have changed in this diff.