Compare commits
4 Commits
remove-ngi
...
apache_res
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
0df0cd8f6c | ||
|
|
91298e4e35 | ||
|
|
a0c617f826 | ||
|
|
a3c1132c1e |
@@ -1,119 +0,0 @@
|
||||
# Configuring Azure Pipelines with Certbot
|
||||
|
||||
Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:
|
||||
* `.azure-pipelines/main.yml` is the main one, executed on PRs for master, and pushes to master,
|
||||
* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for master.
|
||||
|
||||
Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common jobs configuration that can be reused in several pipelines.
|
||||
|
||||
Unlike Travis, where CodeCov is working without any action required, CodeCov supports Azure Pipelines
|
||||
using the coverage-bash utility (not python-coverage for now) only if you provide the Codecov repo token
|
||||
using the `CODECOV_TOKEN` environment variable. So `CODECOV_TOKEN` needs to be set as a secured
|
||||
environment variable to allow the main pipeline to publish coverage reports to CodeCov.
|
||||
|
||||
This INSTALL.md file explains how to configure Azure Pipelines with Certbot in order to execute the CI/CD logic defined in `.azure-pipelines` folder with it.
|
||||
During this installation step, warnings describing user access and legal comitments will be displayed like this:
|
||||
```
|
||||
!!! ACCESS REQUIRED !!!
|
||||
```
|
||||
|
||||
This document suppose that the Azure DevOps organization is named _certbot_, and the Azure DevOps project is also _certbot_.
|
||||
|
||||
## Useful links
|
||||
|
||||
* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
|
||||
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
|
||||
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops
|
||||
|
||||
## Prerequisites
|
||||
|
||||
### Having a GitHub account
|
||||
|
||||
Use your GitHub user for a normal GitHub account, or a user that has administrative rights to the GitHub organization if relevant.
|
||||
|
||||
### Having an Azure DevOps account
|
||||
- Go to https://dev.azure.com/, click "Start free with GitHub"
|
||||
- Login to GitHub
|
||||
|
||||
```
|
||||
!!! ACCESS REQUIRED !!!
|
||||
Personal user data (email + profile info, in read-only)
|
||||
```
|
||||
|
||||
- Microsoft will create a Live account using the email referenced for the GitHub account. This account is also linked to GitHub account (meaning you can log it using GitHub authentication)
|
||||
- Proceed with account registration (birth date, country), add details about name and email contact
|
||||
|
||||
```
|
||||
!!! ACCESS REQUIRED !!!
|
||||
Microsoft proposes to send commercial links to this mail
|
||||
Azure DevOps terms of service need to be accepted
|
||||
```
|
||||
|
||||
_Logged to Azure DevOps, account is ready._
|
||||
|
||||
### Installing Azure Pipelines to GitHub
|
||||
|
||||
- On GitHub, go to Marketplace
|
||||
- Select Azure Pipeline, and "Set up a plan"
|
||||
- Select Free, then "Install it for free"
|
||||
- Click "Complete order and begin installation"
|
||||
|
||||
```
|
||||
!!! ACCESS !!!
|
||||
Azure Pipeline needs RW on code, RO on metadata, RW on checks, commit statuses, deployments, issues, pull requests.
|
||||
RW access here is required to allow update of the pipelines YAML files from Azure DevOps interface, and to
|
||||
update the status of builds and PRs on GitHub side when Azure Pipelines are triggered.
|
||||
Note however that no admin access is defined here: this means that Azure Pipelines cannot do anything with
|
||||
protected branches, like master, and cannot modify the security context around this on GitHub.
|
||||
Access can be defined for all or only selected repositories, which is nice.
|
||||
```
|
||||
|
||||
- Redirected to Azure DevOps, select the account created in _Having an Azure DevOps account_ section.
|
||||
- Select the organization, and click "Create a new project" (let's name it the same than the targetted github repo)
|
||||
- The Visibility is public, to profit from 10 parallel jobs
|
||||
|
||||
```
|
||||
!!! ACCESS !!!
|
||||
Azure Pipelines needs access to the GitHub account (in term of beeing able to check it is valid), and the Resources shared between the GitHub account and Azure Pipelines.
|
||||
```
|
||||
|
||||
_Done. We can move to pipelines configuration._
|
||||
|
||||
## Import an existing pipelines from `.azure-pipelines` folder
|
||||
|
||||
- On Azure DevOps, go to your organization (eg. _certbot_) then your project (eg. _certbot_)
|
||||
- Click "Pipelines" tab
|
||||
- Click "New pipeline"
|
||||
- Where is your code?: select "__Use the classic editor__"
|
||||
|
||||
__Warning: Do not choose the GitHub option in Where is your code? section. Indeed, this option will trigger an OAuth
|
||||
grant permissions from Azure Pipelines to GitHub in order to setup a GitHub OAuth Application. The permissions asked
|
||||
then are way too large (admin level on almost everything), while the classic approach does not add any more
|
||||
permissions, and works perfectly well.__
|
||||
|
||||
- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, master in default branch.
|
||||
- Click on YAML option for "Select a template"
|
||||
- Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the
|
||||
"YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`)
|
||||
- Click "Save & queue", choose the master branch to build the first pipeline, and click "Save and run" button.
|
||||
|
||||
_Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._
|
||||
|
||||
## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)
|
||||
|
||||
__NB: Following steps suppose that you already setup the YAML pipeline file to
|
||||
consume the secret variable that these steps will create as an environment variable.
|
||||
For a variable named `CODECOV_TOKEN` consuming the variable `codecov_token`,
|
||||
in the YAML file this setup would take the form of the following:
|
||||
```
|
||||
steps:
|
||||
- script: ./do_something_that_consumes_CODECOV_TOKEN # Eg. `codecov -F windows`
|
||||
env:
|
||||
CODECOV_TOKEN: $(codecov_token)
|
||||
```
|
||||
|
||||
To set up a variable that is shared between pipelines, follow the instructions
|
||||
at
|
||||
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
|
||||
When adding variables to a group, don't forget to tick "Keep this value secret"
|
||||
if it shouldn't be shared publcily.
|
||||
@@ -1,20 +0,0 @@
|
||||
# Advanced pipeline for isolated checks and release purpose
|
||||
trigger:
|
||||
- test-*
|
||||
- '*.x'
|
||||
pr:
|
||||
- test-*
|
||||
# This pipeline is also nightly run on master
|
||||
schedules:
|
||||
- cron: "0 4 * * *"
|
||||
displayName: Nightly build
|
||||
branches:
|
||||
include:
|
||||
- master
|
||||
always: true
|
||||
|
||||
jobs:
|
||||
# Any addition here should be reflected in the release pipeline.
|
||||
# It is advised to declare all jobs here as templates to improve maintainability.
|
||||
- template: templates/tests-suite.yml
|
||||
- template: templates/installer-tests.yml
|
||||
@@ -1,12 +0,0 @@
|
||||
trigger:
|
||||
# apache-parser-v2 is a temporary branch for doing work related to
|
||||
# rewriting the parser in the Apache plugin.
|
||||
- apache-parser-v2
|
||||
- master
|
||||
pr:
|
||||
- apache-parser-v2
|
||||
- master
|
||||
- '*.x'
|
||||
|
||||
jobs:
|
||||
- template: templates/tests-suite.yml
|
||||
@@ -1,13 +0,0 @@
|
||||
# Release pipeline to build and deploy Certbot for Windows for GitHub release tags
|
||||
trigger:
|
||||
tags:
|
||||
include:
|
||||
- v*
|
||||
pr: none
|
||||
|
||||
jobs:
|
||||
# Any addition here should be reflected in the advanced pipeline.
|
||||
# It is advised to declare all jobs here as templates to improve maintainability.
|
||||
- template: templates/tests-suite.yml
|
||||
- template: templates/installer-tests.yml
|
||||
- template: templates/changelog.yml
|
||||
@@ -1,14 +0,0 @@
|
||||
jobs:
|
||||
- job: changelog
|
||||
pool:
|
||||
vmImage: vs2017-win2016
|
||||
steps:
|
||||
- bash: |
|
||||
CERTBOT_VERSION="$(python -c "import certbot; print(certbot.__version__)")"
|
||||
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
|
||||
displayName: Prepare changelog
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
artifact: changelog
|
||||
displayName: Publish changelog
|
||||
@@ -1,32 +0,0 @@
|
||||
jobs:
|
||||
- job: installer
|
||||
pool:
|
||||
vmImage: vs2017-win2016
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.7
|
||||
architecture: x86
|
||||
addToPath: true
|
||||
- script: python windows-installer/construct.py
|
||||
displayName: Build Certbot installer
|
||||
- task: CopyFiles@2
|
||||
inputs:
|
||||
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
|
||||
contents: '*.exe'
|
||||
targetFolder: $(Build.ArtifactStagingDirectory)
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
artifact: windows-installer
|
||||
displayName: Publish Windows installer
|
||||
- script: $(Build.ArtifactStagingDirectory)\certbot-beta-installer-win32.exe /S
|
||||
displayName: Install Certbot
|
||||
- script: |
|
||||
python -m venv venv
|
||||
venv\Scripts\python tools\pip_install.py -e certbot-ci
|
||||
displayName: Prepare Certbot-CI
|
||||
- script: |
|
||||
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
|
||||
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
|
||||
displayName: Run integration tests
|
||||
@@ -1,38 +0,0 @@
|
||||
jobs:
|
||||
- job: test
|
||||
pool:
|
||||
vmImage: vs2017-win2016
|
||||
strategy:
|
||||
matrix:
|
||||
py35:
|
||||
PYTHON_VERSION: 3.5
|
||||
TOXENV: py35
|
||||
py37-cover:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: py37-cover
|
||||
integration-certbot:
|
||||
PYTHON_VERSION: 3.7
|
||||
TOXENV: integration-certbot
|
||||
PYTEST_ADDOPTS: --numprocesses 4
|
||||
variables:
|
||||
- group: certbot-common
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: $(PYTHON_VERSION)
|
||||
addToPath: true
|
||||
- script: python tools/pip_install.py -U tox coverage
|
||||
displayName: Install dependencies
|
||||
- script: python -m tox
|
||||
displayName: Run tox
|
||||
# We do not require codecov report upload to succeed. So to avoid to break the pipeline if
|
||||
# something goes wrong, each command is suffixed with a command that hides any non zero exit
|
||||
# codes and echoes an informative message instead.
|
||||
- bash: |
|
||||
curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
|
||||
chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
|
||||
./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
|
||||
condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
|
||||
env:
|
||||
CODECOV_TOKEN: $(codecov_token)
|
||||
displayName: Publish coverage
|
||||
18
.codecov.yml
18
.codecov.yml
@@ -1,18 +0,0 @@
|
||||
coverage:
|
||||
status:
|
||||
project:
|
||||
default: off
|
||||
linux:
|
||||
flags: linux
|
||||
# Fixed target instead of auto set by #7173, can
|
||||
# be removed when flags in Codecov are added back.
|
||||
target: 97.4
|
||||
threshold: 0.1
|
||||
base: auto
|
||||
windows:
|
||||
flags: windows
|
||||
# Fixed target instead of auto set by #7173, can
|
||||
# be removed when flags in Codecov are added back.
|
||||
target: 97.4
|
||||
threshold: 0.1
|
||||
base: auto
|
||||
@@ -1,5 +1,2 @@
|
||||
[run]
|
||||
omit = */setup.py
|
||||
|
||||
[report]
|
||||
omit = */setup.py
|
||||
|
||||
35
.github/stale.yml
vendored
35
.github/stale.yml
vendored
@@ -1,35 +0,0 @@
|
||||
# Configuration for https://github.com/marketplace/stale
|
||||
|
||||
# Number of days of inactivity before an Issue or Pull Request becomes stale
|
||||
daysUntilStale: 365
|
||||
|
||||
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
|
||||
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
|
||||
# When changing this value, be sure to also update markComment below.
|
||||
daysUntilClose: 30
|
||||
|
||||
# Ignore issues with an assignee (defaults to false)
|
||||
exemptAssignees: true
|
||||
|
||||
# Label to use when marking as stale
|
||||
staleLabel: needs-update
|
||||
|
||||
# Comment to post when marking as stale. Set to `false` to disable
|
||||
markComment: >
|
||||
We've made a lot of changes to Certbot since this issue was opened. If you
|
||||
still have this issue with an up-to-date version of Certbot, can you please
|
||||
add a comment letting us know? This helps us to better see what issues are
|
||||
still affecting our users. If there is no activity in the next 30 days, this
|
||||
issue will be automatically closed.
|
||||
|
||||
# Comment to post when closing a stale Issue or Pull Request.
|
||||
closeComment: >
|
||||
This issue has been closed due to lack of activity, but if you think it
|
||||
should be reopened, please open a new issue with a link to this one and we'll
|
||||
take a look.
|
||||
|
||||
# Limit the number of actions per hour, from 1-30. Default is 30
|
||||
limitPerRun: 1
|
||||
|
||||
# Don't mark pull requests as stale.
|
||||
only: issues
|
||||
10
.gitignore
vendored
10
.gitignore
vendored
@@ -6,8 +6,7 @@ dist*/
|
||||
/venv*/
|
||||
/kgs/
|
||||
/.tox/
|
||||
/releases*/
|
||||
/log*
|
||||
/releases/
|
||||
letsencrypt.log
|
||||
certbot.log
|
||||
letsencrypt-auto-source/letsencrypt-auto.sig.lzma.base64
|
||||
@@ -39,13 +38,6 @@ tests/letstest/venv/
|
||||
|
||||
# pytest cache
|
||||
.cache
|
||||
.mypy_cache/
|
||||
.pytest_cache/
|
||||
|
||||
# docker files
|
||||
.docker
|
||||
|
||||
# certbot tests
|
||||
.certbot_test_workspace
|
||||
**/assets/pebble*
|
||||
**/assets/challtestsrv*
|
||||
|
||||
38
.pylintrc
38
.pylintrc
@@ -41,7 +41,7 @@ load-plugins=linter_plugin
|
||||
# --enable=similarities". If you want to run only the classes checker, but have
|
||||
# no Warning level messages displayed, use"--disable=all --enable=classes
|
||||
# --disable=W"
|
||||
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design
|
||||
disable=fixme,locally-disabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
|
||||
# abstract-class-not-used cannot be disabled locally (at least in
|
||||
# pylint 1.4.1), same for abstract-class-little-used
|
||||
|
||||
@@ -251,7 +251,7 @@ ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib
|
||||
|
||||
# List of classes names for which member attributes should not be checked
|
||||
# (useful for classes with attributes dynamically set).
|
||||
ignored-classes=Field,Header,JWS,closing
|
||||
ignored-classes=SQLObject
|
||||
|
||||
# When zope mode is activated, add a predefined set of Zope acquired attributes
|
||||
# to generated-members.
|
||||
@@ -297,6 +297,40 @@ valid-classmethod-first-arg=cls
|
||||
valid-metaclass-classmethod-first-arg=mcs
|
||||
|
||||
|
||||
[DESIGN]
|
||||
|
||||
# Maximum number of arguments for function / method
|
||||
max-args=6
|
||||
|
||||
# Argument names that match this expression will be ignored. Default to name
|
||||
# with leading underscore
|
||||
ignored-argument-names=_.*
|
||||
|
||||
# Maximum number of locals for function / method body
|
||||
max-locals=15
|
||||
|
||||
# Maximum number of return / yield for function / method body
|
||||
max-returns=6
|
||||
|
||||
# Maximum number of branch for function / method body
|
||||
max-branches=12
|
||||
|
||||
# Maximum number of statements in function / method body
|
||||
max-statements=50
|
||||
|
||||
# Maximum number of parents for a class (see R0901).
|
||||
max-parents=12
|
||||
|
||||
# Maximum number of attributes for a class (see R0902).
|
||||
max-attributes=7
|
||||
|
||||
# Minimum number of public methods for a class (see R0903).
|
||||
min-public-methods=2
|
||||
|
||||
# Maximum number of public methods for a class (see R0904).
|
||||
max-public-methods=20
|
||||
|
||||
|
||||
[EXCEPTIONS]
|
||||
|
||||
# Exceptions that will emit a warning when being caught. Defaults to
|
||||
|
||||
336
.travis.yml
336
.travis.yml
@@ -1,17 +1,62 @@
|
||||
language: python
|
||||
dist: xenial
|
||||
|
||||
cache:
|
||||
directories:
|
||||
- $HOME/.cache/pip
|
||||
|
||||
before_install:
|
||||
- '([ $TRAVIS_OS_NAME == linux ] && dpkg -s libaugeas0) || (brew update && brew install augeas python3 && brew upgrade python && brew link python)'
|
||||
|
||||
before_script:
|
||||
- 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
|
||||
# On Travis, the fastest parallelization for integration tests has proved to be 4.
|
||||
- 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
|
||||
# Use Travis retry feature for farm tests since they are flaky
|
||||
- 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
|
||||
- export TOX_TESTENV_PASSENV=TRAVIS
|
||||
- 'if [ $TRAVIS_OS_NAME = osx ] ; then ulimit -n 1024 ; fi'
|
||||
|
||||
matrix:
|
||||
include:
|
||||
- python: "2.7"
|
||||
env: TOXENV=py27_install BOULDER_INTEGRATION=v1
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "2.7"
|
||||
env: TOXENV=py27_install BOULDER_INTEGRATION=v2
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "2.7"
|
||||
env: TOXENV=cover FYI="this also tests py27"
|
||||
- sudo: required
|
||||
env: TOXENV=nginx_compat
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
- python: "2.7"
|
||||
env: TOXENV=lint
|
||||
- python: "2.7"
|
||||
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "3.4"
|
||||
env: TOXENV=py34
|
||||
sudo: required
|
||||
services: docker
|
||||
- python: "3.6"
|
||||
env: TOXENV=py36
|
||||
sudo: required
|
||||
services: docker
|
||||
- sudo: required
|
||||
env: TOXENV=apache_compat
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
- sudo: required
|
||||
env: TOXENV=le_auto_trusty
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
- python: "2.7"
|
||||
env: TOXENV=apacheconftest
|
||||
sudo: required
|
||||
- python: "2.7"
|
||||
env: TOXENV=nginxroundtrip
|
||||
|
||||
|
||||
# Only build pushes to the master branch, PRs, and branches beginning with
|
||||
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
|
||||
@@ -19,262 +64,20 @@ before_script:
|
||||
# is a cap of on the number of simultaneous runs.
|
||||
branches:
|
||||
only:
|
||||
# apache-parser-v2 is a temporary branch for doing work related to
|
||||
# rewriting the parser in the Apache plugin.
|
||||
- apache-parser-v2
|
||||
- master
|
||||
- /^\d+\.\d+\.x$/
|
||||
- /^test-.*$/
|
||||
|
||||
# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
|
||||
not-on-master: ¬-on-master
|
||||
if: NOT (type = push AND branch = master)
|
||||
|
||||
# Jobs for the extended test suite are executed for cron jobs and pushes to
|
||||
# non-development branches. See the explanation for apache-parser-v2 above.
|
||||
extended-test-suite: &extended-test-suite
|
||||
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
|
||||
|
||||
matrix:
|
||||
include:
|
||||
# Main test suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=pebble TOXENV=integration
|
||||
<<: *not-on-master
|
||||
|
||||
# This job is always executed, including on master
|
||||
- python: "2.7"
|
||||
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
|
||||
|
||||
- python: "2.7"
|
||||
env: TOXENV=lint
|
||||
<<: *not-on-master
|
||||
- python: "3.4"
|
||||
env: TOXENV=mypy
|
||||
<<: *not-on-master
|
||||
- python: "3.5"
|
||||
env: TOXENV=mypy
|
||||
<<: *not-on-master
|
||||
- python: "2.7"
|
||||
# Ubuntu Trusty or older must be used because the oldest version of
|
||||
# cryptography we support cannot be compiled against the version of
|
||||
# OpenSSL in Xenial or newer.
|
||||
dist: trusty
|
||||
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
|
||||
<<: *not-on-master
|
||||
- python: "3.4"
|
||||
env: TOXENV=py34
|
||||
<<: *not-on-master
|
||||
- python: "3.7"
|
||||
env: TOXENV=py37
|
||||
<<: *not-on-master
|
||||
- python: "3.8"
|
||||
env: TOXENV=py38
|
||||
<<: *not-on-master
|
||||
- sudo: required
|
||||
env: TOXENV=apache_compat
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
<<: *not-on-master
|
||||
- sudo: required
|
||||
env: TOXENV=le_auto_xenial
|
||||
services: docker
|
||||
<<: *not-on-master
|
||||
- python: "2.7"
|
||||
env: TOXENV=apacheconftest-with-pebble
|
||||
<<: *not-on-master
|
||||
- python: "2.7"
|
||||
env: TOXENV=nginxroundtrip
|
||||
<<: *not-on-master
|
||||
|
||||
# Extended test suite on cron jobs and pushes to tested branches other than master
|
||||
- sudo: required
|
||||
env: TOXENV=nginx_compat
|
||||
services: docker
|
||||
before_install:
|
||||
addons:
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env:
|
||||
- TOXENV=travis-test-farm-apache2
|
||||
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env:
|
||||
- TOXENV=travis-test-farm-leauto-upgrades
|
||||
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
|
||||
git:
|
||||
depth: false # This is needed to have the history to checkout old versions of certbot-auto.
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env:
|
||||
- TOXENV=travis-test-farm-certonly-standalone
|
||||
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env:
|
||||
- TOXENV=travis-test-farm-sdists
|
||||
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
|
||||
<<: *extended-test-suite
|
||||
- python: "3.7"
|
||||
env: TOXENV=py37 CERTBOT_NO_PIN=1
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
|
||||
# Ubuntu Trusty or older must be used because the oldest version of
|
||||
# cryptography we support cannot be compiled against the version of
|
||||
# OpenSSL in Xenial or newer.
|
||||
dist: trusty
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
|
||||
# Ubuntu Trusty or older must be used because the oldest version of
|
||||
# cryptography we support cannot be compiled against the version of
|
||||
# OpenSSL in Xenial or newer.
|
||||
dist: trusty
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
|
||||
# Ubuntu Trusty or older must be used because the oldest version of
|
||||
# cryptography we support cannot be compiled against the version of
|
||||
# OpenSSL in Xenial or newer.
|
||||
dist: trusty
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "2.7"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
|
||||
# Ubuntu Trusty or older must be used because the oldest version of
|
||||
# cryptography we support cannot be compiled against the version of
|
||||
# OpenSSL in Xenial or newer.
|
||||
dist: trusty
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.4"
|
||||
env: TOXENV=py34
|
||||
<<: *extended-test-suite
|
||||
- python: "3.5"
|
||||
env: TOXENV=py35
|
||||
<<: *extended-test-suite
|
||||
- python: "3.6"
|
||||
env: TOXENV=py36
|
||||
<<: *extended-test-suite
|
||||
- python: "3.7"
|
||||
env: TOXENV=py37
|
||||
<<: *extended-test-suite
|
||||
- python: "3.8-dev"
|
||||
env: TOXENV=py38
|
||||
<<: *extended-test-suite
|
||||
- python: "3.4"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.4"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.5"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.5"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.6"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.6"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.7"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.7"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration
|
||||
sudo: required
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- python: "3.8-dev"
|
||||
env: ACME_SERVER=boulder-v1 TOXENV=integration
|
||||
<<: *extended-test-suite
|
||||
- python: "3.8-dev"
|
||||
env: ACME_SERVER=boulder-v2 TOXENV=integration
|
||||
<<: *extended-test-suite
|
||||
- sudo: required
|
||||
env: TOXENV=le_auto_jessie
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- sudo: required
|
||||
env: TOXENV=le_auto_centos6
|
||||
services: docker
|
||||
<<: *extended-test-suite
|
||||
- sudo: required
|
||||
env: TOXENV=docker_dev
|
||||
services: docker
|
||||
addons:
|
||||
apt:
|
||||
packages: # don't install nginx and apache
|
||||
- libaugeas0
|
||||
<<: *extended-test-suite
|
||||
- language: generic
|
||||
env: TOXENV=py27
|
||||
os: osx
|
||||
# Using this osx_image is a workaround for
|
||||
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
|
||||
osx_image: xcode10.2
|
||||
addons:
|
||||
homebrew:
|
||||
packages:
|
||||
- augeas
|
||||
- python2
|
||||
<<: *extended-test-suite
|
||||
- language: generic
|
||||
env: TOXENV=py3
|
||||
os: osx
|
||||
# Using this osx_image is a workaround for
|
||||
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
|
||||
osx_image: xcode10.2
|
||||
addons:
|
||||
homebrew:
|
||||
packages:
|
||||
- augeas
|
||||
- python3
|
||||
<<: *extended-test-suite
|
||||
|
||||
# container-based infrastructure
|
||||
sudo: false
|
||||
|
||||
addons:
|
||||
apt:
|
||||
sources:
|
||||
- augeas
|
||||
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
|
||||
- python-dev
|
||||
- python-virtualenv
|
||||
- gcc
|
||||
- libaugeas0
|
||||
- libssl-dev
|
||||
@@ -283,30 +86,23 @@ addons:
|
||||
# For certbot-nginx integration testing
|
||||
- nginx-light
|
||||
- openssl
|
||||
# for apacheconftest
|
||||
- apache2
|
||||
- libapache2-mod-wsgi
|
||||
- libapache2-mod-macro
|
||||
|
||||
# tools/pip_install.py is used to pin packages to a known working version
|
||||
# except in tests where the environment variable CERTBOT_NO_PIN is set.
|
||||
# virtualenv is listed here explicitly to make sure it is upgraded when
|
||||
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
|
||||
# version of virtualenv.
|
||||
install: 'tools/pip_install.py -U codecov tox virtualenv'
|
||||
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
|
||||
# script command. It is set only to `travis_retry` during farm tests, in
|
||||
# order to trigger the Travis retry feature, and compensate the inherent
|
||||
# flakiness of these specific tests.
|
||||
script: '$TRAVIS_RETRY tox'
|
||||
install: "travis_retry $(command -v pip || command -v pip3) install tox coveralls"
|
||||
script:
|
||||
- travis_retry tox
|
||||
- '[ -z "${BOULDER_INTEGRATION+x}" ] || (travis_retry tests/boulder-fetch.sh && tests/tox-boulder-integration.sh)'
|
||||
|
||||
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
|
||||
after_success: '[ "$TOXENV" == "cover" ] && coveralls'
|
||||
|
||||
notifications:
|
||||
email: false
|
||||
irc:
|
||||
channels:
|
||||
# This is set to a secure variable to prevent forks from sending
|
||||
# notifications. This value was created by installing
|
||||
# https://github.com/travis-ci/travis.rb and running
|
||||
# `travis encrypt "chat.freenode.net#certbot-devel"`.
|
||||
- secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
|
||||
on_cancel: never
|
||||
- secure: "SGWZl3ownKx9xKVV2VnGt7DqkTmutJ89oJV9tjKhSs84kLijU6EYdPnllqISpfHMTxXflNZuxtGo0wTDYHXBuZL47w1O32W6nzuXdra5zC+i4sYQwYULUsyfOv9gJX8zWAULiK0Z3r0oho45U+FR5ZN6TPCidi8/eGU+EEPwaAw="
|
||||
on_success: never
|
||||
on_failure: always
|
||||
use_notice: true
|
||||
|
||||
@@ -5,7 +5,6 @@ Authors
|
||||
* Aaron Zuehlke
|
||||
* Ada Lovelace
|
||||
* [Adam Woodbeck](https://github.com/awoodbeck)
|
||||
* [Adrien Ferrand](https://github.com/adferrand)
|
||||
* [Aidin Gharibnavaz](https://github.com/aidin36)
|
||||
* [AJ ONeal](https://github.com/coolaj86)
|
||||
* [Alcaro](https://github.com/Alcaro)
|
||||
@@ -15,10 +14,8 @@ Authors
|
||||
* [Alex Gaynor](https://github.com/alex)
|
||||
* [Alex Halderman](https://github.com/jhalderm)
|
||||
* [Alex Jordan](https://github.com/strugee)
|
||||
* [Alex Zorin](https://github.com/alexzorin)
|
||||
* [Amjad Mashaal](https://github.com/TheNavigat)
|
||||
* [Andrew Murray](https://github.com/radarhere)
|
||||
* [Andrzej Górski](https://github.com/andrzej3393)
|
||||
* [Anselm Levskaya](https://github.com/levskaya)
|
||||
* [Antoine Jacoutot](https://github.com/ajacoutot)
|
||||
* [asaph](https://github.com/asaph)
|
||||
@@ -78,7 +75,6 @@ Authors
|
||||
* [Fabian](https://github.com/faerbit)
|
||||
* [Faidon Liambotis](https://github.com/paravoid)
|
||||
* [Fan Jiang](https://github.com/tcz001)
|
||||
* [Felix Lechner](https://github.com/lechner)
|
||||
* [Felix Schwarz](https://github.com/FelixSchwarz)
|
||||
* [Felix Yan](https://github.com/felixonmars)
|
||||
* [Filip Ochnik](https://github.com/filipochnik)
|
||||
@@ -128,7 +124,6 @@ Authors
|
||||
* [Joubin Jabbari](https://github.com/joubin)
|
||||
* [Juho Juopperi](https://github.com/jkjuopperi)
|
||||
* [Kane York](https://github.com/riking)
|
||||
* [Kenichi Maehashi](https://github.com/kmaehashi)
|
||||
* [Kenneth Skovhede](https://github.com/kenkendk)
|
||||
* [Kevin Burke](https://github.com/kevinburke)
|
||||
* [Kevin London](https://github.com/kevinlondon)
|
||||
@@ -164,10 +159,8 @@ Authors
|
||||
* [Michael Schumacher](https://github.com/schumaml)
|
||||
* [Michael Strache](https://github.com/Jarodiv)
|
||||
* [Michael Sverdlin](https://github.com/sveder)
|
||||
* [Michael Watters](https://github.com/blackknight36)
|
||||
* [Michal Moravec](https://github.com/https://github.com/Majkl578)
|
||||
* [Michal Papis](https://github.com/mpapis)
|
||||
* [Mickaël Schoentgen](https://github.com/BoboTiG)
|
||||
* [Minn Soe](https://github.com/MinnSoe)
|
||||
* [Min RK](https://github.com/minrk)
|
||||
* [Miquel Ruiz](https://github.com/miquelruiz)
|
||||
@@ -231,7 +224,6 @@ Authors
|
||||
* [Stavros Korokithakis](https://github.com/skorokithakis)
|
||||
* [Stefan Weil](https://github.com/stweil)
|
||||
* [Steve Desmond](https://github.com/stevedesmond-ca)
|
||||
* [sydneyli](https://github.com/sydneyli)
|
||||
* [Tan Jay Jun](https://github.com/jayjun)
|
||||
* [Tapple Gao](https://github.com/tapple)
|
||||
* [Telepenin Nikolay](https://github.com/telepenin)
|
||||
|
||||
917
CHANGELOG.md
917
CHANGELOG.md
@@ -1,919 +1,6 @@
|
||||
# Certbot change log
|
||||
|
||||
Certbot adheres to [Semantic Versioning](https://semver.org/).
|
||||
|
||||
## 1.0.0 - master
|
||||
|
||||
### Added
|
||||
|
||||
*
|
||||
|
||||
### Removed
|
||||
|
||||
* The `docs` extras for the `certbot-apache` and `certbot-nginx` packages
|
||||
have been removed.
|
||||
|
||||
### Changed
|
||||
|
||||
* certbot-auto has deprecated support for systems using OpenSSL 1.0.1 that are
|
||||
not running on x86-64. This primarily affects RHEL 6 based systems.
|
||||
* Certbot's `config_changes` subcommand has been removed
|
||||
* `certbot.plugins.common.TLSSNI01` has been removed.
|
||||
* Deprecated attributes related to the TLS-SNI-01 challenge in
|
||||
`acme.challenges` and `acme.standalone`
|
||||
have been removed.
|
||||
* The functions `certbot.client.view_config_changes`,
|
||||
`certbot.main.config_changes`,
|
||||
`certbot.plugins.common.Installer.view_config_changes`,
|
||||
`certbot.reverter.Reverter.view_config_changes`, and
|
||||
`certbot.util.get_systemd_os_info` have been removed
|
||||
* Certbot's `register --update-registration` subcommand has been removed
|
||||
|
||||
### Fixed
|
||||
|
||||
*
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.40.1 - 2019-11-05
|
||||
|
||||
### Changed
|
||||
|
||||
* Added back support for Python 3.4 to Certbot components and certbot-auto due
|
||||
to a bug when requiring Python 2.7 or 3.5+ on RHEL 6 based systems.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.40.0 - 2019-11-05
|
||||
|
||||
### Added
|
||||
|
||||
*
|
||||
|
||||
### Changed
|
||||
|
||||
* We deprecated support for Python 3.4 in Certbot and its ACME library. Support
|
||||
for Python 3.4 will be removed in the next major release of Certbot.
|
||||
certbot-auto users on RHEL 6 based systems will be asked to enable Software
|
||||
Collections (SCL) repository so Python 3.6 can be installed. certbot-auto can
|
||||
enable the SCL repo for you on CentOS 6 while users on other RHEL 6 based
|
||||
systems will be asked to do this manually.
|
||||
* `--server` may now be combined with `--dry-run`. Certbot will, as before, use the
|
||||
staging server instead of the live server when `--dry-run` is used.
|
||||
* `--dry-run` now requests fresh authorizations every time, fixing the issue
|
||||
where it was prone to falsely reporting success.
|
||||
* Updated certbot-dns-google to depend on newer versions of
|
||||
google-api-python-client and oauth2client.
|
||||
* The OS detection logic again uses distro library for Linux OSes
|
||||
* certbot.plugins.common.TLSSNI01 has been deprecated and will be removed in a
|
||||
future release.
|
||||
* CLI flags --tls-sni-01-port and --tls-sni-01-address have been removed.
|
||||
* The values tls-sni and tls-sni-01 for the --preferred-challenges flag are no
|
||||
longer accepted.
|
||||
* Removed the flags: `--agree-dev-preview`, `--dialog`, and `--apache-init-script`
|
||||
* acme.standalone.BaseRequestHandlerWithLogging and
|
||||
acme.standalone.simple_tls_sni_01_server have been deprecated and will be
|
||||
removed in a future release of the library.
|
||||
* certbot-dns-rfc2136 now use TCP to query SOA records.
|
||||
|
||||
### Fixed
|
||||
|
||||
*
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.39.0 - 2019-10-01
|
||||
|
||||
### Added
|
||||
|
||||
* Support for Python 3.8 was added to Certbot and all of its components.
|
||||
* Support for CentOS 8 was added to certbot-auto.
|
||||
|
||||
### Changed
|
||||
|
||||
* Don't send OCSP requests for expired certificates
|
||||
* Return to using platform.linux_distribution instead of distro.linux_distribution in OS fingerprinting for Python < 3.8
|
||||
* Updated the Nginx plugin's TLS configuration to keep support for some versions of IE11.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Fixed OS detection in the Apache plugin on RHEL 6.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.38.0 - 2019-09-03
|
||||
|
||||
### Added
|
||||
|
||||
* Disable session tickets for Nginx users when appropriate.
|
||||
|
||||
### Changed
|
||||
|
||||
* If Certbot fails to rollback your server configuration, the error message
|
||||
links to the Let's Encrypt forum. Change the link to the Help category now
|
||||
that the Server category has been closed.
|
||||
* Replace platform.linux_distribution with distro.linux_distribution as a step
|
||||
towards Python 3.8 support in Certbot.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Fixed OS detection in the Apache plugin on Scientific Linux.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.37.2 - 2019-08-21
|
||||
|
||||
* Stop disabling TLS session tickets in Nginx as it caused TLS failures on
|
||||
some systems.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.37.1 - 2019-08-08
|
||||
|
||||
### Fixed
|
||||
|
||||
* Stop disabling TLS session tickets in Apache as it caused TLS failures on
|
||||
some systems.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.37.0 - 2019-08-07
|
||||
|
||||
### Added
|
||||
|
||||
* Turn off session tickets for apache plugin by default
|
||||
* acme: Authz deactivation added to `acme` module.
|
||||
|
||||
### Changed
|
||||
|
||||
* Follow updated Mozilla recommendations for Nginx ssl_protocols, ssl_ciphers,
|
||||
and ssl_prefer_server_ciphers
|
||||
|
||||
### Fixed
|
||||
|
||||
* Fix certbot-auto failures on RHEL 8.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.36.0 - 2019-07-11
|
||||
|
||||
### Added
|
||||
|
||||
* Turn off session tickets for nginx plugin by default
|
||||
* Added missing error types from RFC8555 to acme
|
||||
|
||||
### Changed
|
||||
|
||||
* Support for Ubuntu 14.04 Trusty has been removed.
|
||||
* Update the 'manage your account' help to be more generic.
|
||||
* The error message when Certbot's Apache plugin is unable to modify your
|
||||
Apache configuration has been improved.
|
||||
* Certbot's config_changes subcommand has been deprecated and will be
|
||||
removed in a future release.
|
||||
* `certbot config_changes` no longer accepts a --num parameter.
|
||||
* The functions `certbot.plugins.common.Installer.view_config_changes` and
|
||||
`certbot.reverter.Reverter.view_config_changes` have been deprecated and will
|
||||
be removed in a future release.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Replace some unnecessary platform-specific line separation.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.35.1 - 2019-06-10
|
||||
|
||||
### Fixed
|
||||
|
||||
* Support for specifying an authoritative base domain in our dns-rfc2136 plugin
|
||||
has been removed. This feature was added in our last release but had a bug
|
||||
which caused the plugin to fail so the feature has been removed until it can
|
||||
be added properly.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* certbot-dns-rfc2136
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.35.0 - 2019-06-05
|
||||
|
||||
### Added
|
||||
|
||||
* dns_rfc2136 plugin now supports explicitly specifing an authorative
|
||||
base domain for cases when the automatic method does not work (e.g.
|
||||
Split horizon DNS)
|
||||
|
||||
### Changed
|
||||
|
||||
*
|
||||
|
||||
### Fixed
|
||||
|
||||
* Renewal parameter `webroot_path` is always saved, avoiding some regressions
|
||||
when `webroot` authenticator plugin is invoked with no challenge to perform.
|
||||
* Certbot now accepts OCSP responses when an explicit authorized
|
||||
responder, different from the issuer, is used to sign OCSP
|
||||
responses.
|
||||
* Scripts in Certbot hook directories are no longer executed when their
|
||||
filenames end in a tilde.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* certbot
|
||||
* certbot-dns-rfc2136
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.34.2 - 2019-05-07
|
||||
|
||||
### Fixed
|
||||
|
||||
* certbot-auto no longer writes a check_permissions.py script at the root
|
||||
of the filesystem.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
changes in this release were to certbot-auto.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.34.1 - 2019-05-06
|
||||
|
||||
### Fixed
|
||||
|
||||
* certbot-auto no longer prints a blank line when there are no permissions
|
||||
problems.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
changes in this release were to certbot-auto.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.34.0 - 2019-05-01
|
||||
|
||||
### Changed
|
||||
|
||||
* Apache plugin now tries to restart httpd on Fedora using systemctl if a
|
||||
configuration test error is detected. This has to be done due to the way
|
||||
Fedora now generates the self signed certificate files upon first
|
||||
restart.
|
||||
* Updated Certbot and its plugins to improve the handling of file system permissions
|
||||
on Windows as a step towards adding proper Windows support to Certbot.
|
||||
* Updated urllib3 to 1.24.2 in certbot-auto.
|
||||
* Removed the fallback introduced with 0.32.0 in `acme` to retry a challenge response
|
||||
with a `keyAuthorization` if sending the response without this field caused a
|
||||
`malformed` error to be received from the ACME server.
|
||||
* Linode DNS plugin now supports api keys created from their new panel
|
||||
at [cloud.linode.com](https://cloud.linode.com)
|
||||
|
||||
### Fixed
|
||||
|
||||
* Fixed Google DNS Challenge issues when private zones exist
|
||||
* Adding a warning noting that future versions of Certbot will automatically configure the
|
||||
webserver so that all requests redirect to secure HTTPS access. You can control this
|
||||
behavior and disable this warning with the --redirect and --no-redirect flags.
|
||||
* certbot-auto now prints warnings when run as root with insecure file system
|
||||
permissions. If you see these messages, you should fix the problem by
|
||||
following the instructions at
|
||||
https://community.letsencrypt.org/t/certbot-auto-deployment-best-practices/91979/,
|
||||
however, these warnings can be disabled as necessary with the flag
|
||||
--no-permissions-check.
|
||||
* `acme` module uses now a POST-as-GET request to retrieve the registration
|
||||
from an ACME v2 server
|
||||
* Convert the tsig algorithm specified in the certbot_dns_rfc2136 configuration file to
|
||||
all uppercase letters before validating. This makes the value in the config case
|
||||
insensitive.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* acme
|
||||
* certbot
|
||||
* certbot-apache
|
||||
* certbot-dns-cloudflare
|
||||
* certbot-dns-cloudxns
|
||||
* certbot-dns-digitalocean
|
||||
* certbot-dns-dnsimple
|
||||
* certbot-dns-dnsmadeeasy
|
||||
* certbot-dns-gehirn
|
||||
* certbot-dns-google
|
||||
* certbot-dns-linode
|
||||
* certbot-dns-luadns
|
||||
* certbot-dns-nsone
|
||||
* certbot-dns-ovh
|
||||
* certbot-dns-rfc2136
|
||||
* certbot-dns-route53
|
||||
* certbot-dns-sakuracloud
|
||||
* certbot-nginx
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.33.1 - 2019-04-04
|
||||
|
||||
### Fixed
|
||||
|
||||
* A bug causing certbot-auto to print warnings or crash on some RHEL based
|
||||
systems has been resolved.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
changes in this release were to certbot-auto.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.33.0 - 2019-04-03
|
||||
|
||||
### Added
|
||||
|
||||
* Fedora 29+ is now supported by certbot-auto. Since Python 2.x is on a deprecation
|
||||
path in Fedora, certbot-auto will install and use Python 3.x on Fedora 29+.
|
||||
* CLI flag `--https-port` has been added for Nginx plugin exclusively, and replaces
|
||||
`--tls-sni-01-port`. It defines the HTTPS port the Nginx plugin will use while
|
||||
setting up a new SSL vhost. By default the HTTPS port is 443.
|
||||
|
||||
### Changed
|
||||
|
||||
* Support for TLS-SNI-01 has been removed from all official Certbot plugins.
|
||||
* Attributes related to the TLS-SNI-01 challenge in `acme.challenges` and `acme.standalone`
|
||||
modules are deprecated and will be removed soon.
|
||||
* CLI flags `--tls-sni-01-port` and `--tls-sni-01-address` are now no-op, will
|
||||
generate a deprecation warning if used, and will be removed soon.
|
||||
* Options `tls-sni` and `tls-sni-01` in `--preferred-challenges` flag are now no-op,
|
||||
will generate a deprecation warning if used, and will be removed soon.
|
||||
* CLI flag `--standalone-supported-challenges` has been removed.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Certbot uses the Python library cryptography for OCSP when cryptography>=2.5
|
||||
is installed. We fixed a bug in Certbot causing it to interpret timestamps in
|
||||
the OCSP response as being in the local timezone rather than UTC.
|
||||
* Issue causing the default CentOS 6 TLS configuration to ignore some of the
|
||||
HTTPS VirtualHosts created by Certbot. mod_ssl loading is now moved to main
|
||||
http.conf for this environment where possible.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* acme
|
||||
* certbot
|
||||
* certbot-apache
|
||||
* certbot-nginx
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.32.0 - 2019-03-06
|
||||
|
||||
### Added
|
||||
|
||||
* If possible, Certbot uses built-in support for OCSP from recent cryptography
|
||||
versions instead of the OpenSSL binary: as a consequence Certbot does not need
|
||||
the OpenSSL binary to be installed anymore if cryptography>=2.5 is installed.
|
||||
|
||||
### Changed
|
||||
|
||||
* Certbot and its acme module now depend on josepy>=1.1.0 to avoid printing the
|
||||
warnings described at https://github.com/certbot/josepy/issues/13.
|
||||
* Apache plugin now respects CERTBOT_DOCS environment variable when adding
|
||||
command line defaults.
|
||||
* The running of manual plugin hooks is now always included in Certbot's log
|
||||
output.
|
||||
* Tests execution for certbot, certbot-apache and certbot-nginx packages now relies on pytest.
|
||||
* An ACME CA server may return a "Retry-After" HTTP header on authorization polling, as
|
||||
specified in the ACME protocol, to indicate when the next polling should occur. Certbot now
|
||||
reads this header if set and respect its value.
|
||||
* The `acme` module avoids sending the `keyAuthorization` field in the JWS
|
||||
payload when responding to a challenge as the field is not included in the
|
||||
current ACME protocol. To ease the migration path for ACME CA servers,
|
||||
Certbot and its `acme` module will first try the request without the
|
||||
`keyAuthorization` field but will temporarily retry the request with the
|
||||
field included if a `malformed` error is received. This fallback will be
|
||||
removed in version 0.34.0.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* acme
|
||||
* certbot
|
||||
* certbot-apache
|
||||
* certbot-nginx
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.31.0 - 2019-02-07
|
||||
|
||||
### Added
|
||||
|
||||
* Avoid reprocessing challenges that are already validated
|
||||
when a certificate is issued.
|
||||
* Support for initiating (but not solving end-to-end) TLS-ALPN-01 challenges
|
||||
with the `acme` module.
|
||||
|
||||
### Changed
|
||||
|
||||
* Certbot's official Docker images are now based on Alpine Linux 3.9 rather
|
||||
than 3.7. The new version comes with OpenSSL 1.1.1.
|
||||
* Lexicon-based DNS plugins are now fully compatible with Lexicon 3.x (support
|
||||
on 2.x branch is maintained).
|
||||
* Apache plugin now attempts to configure all VirtualHosts matching requested
|
||||
domain name instead of only a single one when answering the HTTP-01 challenge.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Fixed accessing josepy contents through acme.jose when the full acme.jose
|
||||
path is used.
|
||||
* Clarify behavior for deleting certs as part of revocation.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* acme
|
||||
* certbot
|
||||
* certbot-apache
|
||||
* certbot-dns-cloudxns
|
||||
* certbot-dns-dnsimple
|
||||
* certbot-dns-dnsmadeeasy
|
||||
* certbot-dns-gehirn
|
||||
* certbot-dns-linode
|
||||
* certbot-dns-luadns
|
||||
* certbot-dns-nsone
|
||||
* certbot-dns-ovh
|
||||
* certbot-dns-sakuracloud
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.30.2 - 2019-01-25
|
||||
|
||||
### Fixed
|
||||
|
||||
* Update the version of setuptools pinned in certbot-auto to 40.6.3 to
|
||||
solve installation problems on newer OSes.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, this
|
||||
release only affects certbot-auto.
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.30.1 - 2019-01-24
|
||||
|
||||
### Fixed
|
||||
|
||||
* Always download the pinned version of pip in pipstrap to address breakages
|
||||
* Rename old,default.conf to old-and-default.conf to address commas in filenames
|
||||
breaking recent versions of pip.
|
||||
* Add VIRTUALENV_NO_DOWNLOAD=1 to all calls to virtualenv to address breakages
|
||||
from venv downloading the latest pip
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* certbot-apache
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.30.0 - 2019-01-02
|
||||
|
||||
### Added
|
||||
|
||||
* Added the `update_account` subcommand for account management commands.
|
||||
|
||||
### Changed
|
||||
|
||||
* Copied account management functionality from the `register` subcommand
|
||||
to the `update_account` subcommand.
|
||||
* Marked usage `register --update-registration` for deprecation and
|
||||
removal in a future release.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Older modules in the josepy library can now be accessed through acme.jose
|
||||
like it could in previous versions of acme. This is only done to preserve
|
||||
backwards compatibility and support for doing this with new modules in josepy
|
||||
will not be added. Users of the acme library should switch to using josepy
|
||||
directly if they haven't done so already.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* acme
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.29.1 - 2018-12-05
|
||||
|
||||
### Added
|
||||
|
||||
*
|
||||
|
||||
### Changed
|
||||
|
||||
*
|
||||
|
||||
### Fixed
|
||||
|
||||
* The default work and log directories have been changed back to
|
||||
/var/lib/letsencrypt and /var/log/letsencrypt respectively.
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* certbot
|
||||
|
||||
More details about these changes can be found on our GitHub repo.
|
||||
|
||||
## 0.29.0 - 2018-12-05
|
||||
|
||||
### Added
|
||||
|
||||
* Noninteractive renewals with `certbot renew` (those not started from a
|
||||
terminal) now randomly sleep 1-480 seconds before beginning work in
|
||||
order to spread out load spikes on the server side.
|
||||
* Added External Account Binding support in cli and acme library.
|
||||
Command line arguments --eab-kid and --eab-hmac-key added.
|
||||
|
||||
### Changed
|
||||
|
||||
* Private key permissioning changes: Renewal preserves existing group mode
|
||||
& gid of previous private key material. Private keys for new
|
||||
lineages (i.e. new certs, not renewed) default to 0o600.
|
||||
|
||||
### Fixed
|
||||
|
||||
* Update code and dependencies to clean up Resource and Deprecation Warnings.
|
||||
* Only depend on imgconverter extension for Sphinx >= 1.6
|
||||
|
||||
Despite us having broken lockstep, we are continuing to release new versions of
|
||||
all Certbot components during releases for the time being, however, the only
|
||||
package with changes other than its version number was:
|
||||
|
||||
* acme
|
||||
* certbot
|
||||
* certbot-apache
|
||||
* certbot-dns-cloudflare
|
||||
* certbot-dns-digitalocean
|
||||
* certbot-dns-google
|
||||
* certbot-nginx
|
||||
|
||||
More details about these changes can be found on our GitHub repo:
|
||||
https://github.com/certbot/certbot/milestone/62?closed=1

## 0.28.0 - 2018-11-7

### Added

* `revoke` accepts `--cert-name`, and doesn't accept both `--cert-name` and `--cert-path`.
* Use the ACMEv2 newNonce endpoint when a new nonce is needed, and newNonce is available in the directory.

### Changed

* Removed documentation mentions of `#letsencrypt` IRC on Freenode.
* Write README to the base of the (config-dir)/live directory.
* `--manual` will explicitly warn users that earlier challenges should remain in place when setting up subsequent challenges.
* Warn when using the deprecated acme.challenges.TLSSNI01.
* Log a warning about TLS-SNI deprecation in Certbot.
* Stop preferring TLS-SNI in the Apache, Nginx, and standalone plugins.
* The OVH DNS plugin now relies on Lexicon>=2.7.14 to support HTTP proxies.
* The default time the Linode plugin waits for DNS changes to propagate is now 1200 seconds.

### Fixed

* Match the Nginx parser update in allowing variable names to start with `${`.
* Fix ranking of vhosts in Nginx so that all port-matching vhosts come first.
* Correct OVH integration tests on machines without internet access.
* Stop caching the results of ipv6_info in http01.py.
* Test fix for the Route53 plugin to prevent boto3 from making outgoing connections.
* The grammar used by the Augeas parser in the Apache plugin was updated to fix various parsing errors.
* The CloudXNS, DNSimple, DNS Made Easy, Gehirn, Linode, LuaDNS, NS1, OVH, and
  Sakura Cloud DNS plugins are now compatible with Lexicon 3.0+.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with changes other than their version numbers were:

* acme
* certbot
* certbot-apache
* certbot-dns-cloudxns
* certbot-dns-dnsimple
* certbot-dns-dnsmadeeasy
* certbot-dns-gehirn
* certbot-dns-linode
* certbot-dns-luadns
* certbot-dns-nsone
* certbot-dns-ovh
* certbot-dns-route53
* certbot-dns-sakuracloud
* certbot-nginx

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/59?closed=1

## 0.27.1 - 2018-09-06

### Fixed

* Fixed a parameter name in the OpenSUSE overrides for default parameters in the
  Apache plugin. Certbot on OpenSUSE works again.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
package with changes other than its version number was:

* certbot-apache

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/60?closed=1

## 0.27.0 - 2018-09-05

### Added

* The Apache plugin now accepts the parameter --apache-ctl which can be
  used to configure the path to the Apache control script.

### Changed

* When using `acme.client.ClientV2` (or
  `acme.client.BackwardsCompatibleClientV2` with an ACME server that supports a
  newer version of the ACME protocol), an `acme.errors.ConflictError` will be
  raised if you try to create an ACME account with a key that has already been
  used. Previously, a JSON parsing error was raised in this scenario when using
  the library with Let's Encrypt's ACMEv2 endpoint. (See the sketch after this
  list.)
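
A hedged sketch, not taken from Certbot's own code, of how a library user might
handle the new exception. The directory URL, email address, and key size are
illustrative only:

```python
import josepy as jose
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa

from acme import client, errors, messages

DIRECTORY_URL = 'https://acme-staging-v02.api.letsencrypt.org/directory'  # example endpoint

# Generate an account key and build a ClientV2 against the directory.
account_key = jose.JWKRSA(key=rsa.generate_private_key(
    public_exponent=65537, key_size=2048, backend=default_backend()))
net = client.ClientNetwork(account_key)
directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
acme_client = client.ClientV2(directory, net=net)

try:
    regr = acme_client.new_account(messages.NewRegistration.from_data(
        email='admin@example.com', terms_of_service_agreed=True))
except errors.ConflictError as err:
    # The error carries the URI (Location header) of the already-registered account.
    print('Account key already in use: %s' % err)
```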

### Fixed

* When Apache is not installed, Certbot's Apache plugin no longer prints
  messages about being unable to find apachectl to the terminal when the plugin
  is not selected.
* If you're using the Apache plugin with the --apache-vhost-root flag set to a
  directory containing a disabled virtual host for the domain you're requesting
  a certificate for, the virtual host will now be temporarily enabled if
  necessary to pass the HTTP challenge.
* The documentation for the Certbot package can now be built using Sphinx 1.6+.
* You can now call `query_registration` without having to first call
  `new_account` on `acme.client.ClientV2` objects (see the sketch after this list).
* The requirement of `setuptools>=1.0` has been removed from `certbot-dns-ovh`.
* Names in certbot-dns-sakuracloud's tests have been updated to refer to Sakura
  Cloud rather than NS1, whose plugin certbot-dns-sakuracloud was based on.
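
A small follow-on sketch of that fix, reusing `acme_client` and `messages` from
the previous example; the account URI shown is illustrative:

```python
# Querying an existing registration no longer requires calling new_account() first.
existing = messages.RegistrationResource(
    body=messages.Registration(),
    uri='https://acme-staging-v02.api.letsencrypt.org/acme/acct/12345')  # illustrative URI
regr = acme_client.query_registration(existing)
print(regr.body.contact)
```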

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with changes other than their version numbers were:

* acme
* certbot
* certbot-apache
* certbot-dns-ovh
* certbot-dns-sakuracloud

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/57?closed=1

## 0.26.1 - 2018-07-17

### Fixed

* Fix a bug that was triggered when users who had previously manually set `--server` to get ACMEv2 certs tried to renew ACMEv1 certs.

Despite us having broken lockstep, we are continuing to release new versions of all Certbot components during releases for the time being; however, the only package with changes other than its version number was:

* certbot

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/58?closed=1

## 0.26.0 - 2018-07-11

### Added

* A new security enhancement, which we're calling AutoHSTS, has been added to
  Certbot's Apache plugin. This enhancement configures your webserver to send a
  HTTP Strict Transport Security header with a low max-age value that is slowly
  increased over time. The max-age value is not increased to a large value
  until you've successfully managed to renew your certificate. This enhancement
  can be requested with the --auto-hsts flag.
* New official DNS plugins have been created for Gehirn Infrastructure Service,
  Linode, OVH, and Sakura Cloud. These plugins can be found on our Docker Hub
  page at https://hub.docker.com/u/certbot and on PyPI.
* The ability to reuse ACME accounts from Let's Encrypt's ACMEv1 endpoint on
  Let's Encrypt's ACMEv2 endpoint has been added.
* Certbot and its components now support Python 3.7.
* Certbot's install subcommand now allows you to interactively choose which
  certificate to install from the list of certificates managed by Certbot.
* Certbot now accepts the flag `--no-autorenew` which causes any obtained
  certificates to not be automatically renewed when they approach expiration.
* Support for parsing the TLS-ALPN-01 challenge has been added back to the acme
  library (see the sketch after this list).
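
A minimal sketch of the parsing-only support mentioned above, for acme releases
that include it; the token value is illustrative:

```python
import josepy as jose

from acme import challenges

# The library can deserialize a tls-alpn-01 challenge object returned by the
# CA, but it does not generate or serve the response certificate for you.
chall = challenges.TLSALPN01.from_json({
    'type': 'tls-alpn-01',
    'token': jose.encode_b64jose(b'a' * 32),  # illustrative 32-byte token
})
print(chall.token)
```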

### Changed

* Certbot's default ACME server has been changed to Let's Encrypt's ACMEv2
  endpoint. By default, this server will now be used for both new certificate
  lineages and renewals.
* The Nginx plugin is no longer labeled as an "Alpha" version.
* The `prepare` method of Certbot's plugins is no longer called before running
  "Updater" enhancements that are run on every invocation of `certbot renew`.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with functional changes were:

* acme
* certbot
* certbot-apache
* certbot-dns-gehirn
* certbot-dns-linode
* certbot-dns-ovh
* certbot-dns-sakuracloud
* certbot-nginx

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/55?closed=1

## 0.25.1 - 2018-06-13

### Fixed

* TLS-ALPN-01 support has been removed from our acme library. Using our current
  dependencies, we are unable to provide a correct implementation of this
  challenge, so we decided to remove it from the library until we can provide
  proper support.
* Issues causing test failures when running the tests in the acme package with
  pytest<3.0 have been resolved.
* certbot-nginx now correctly depends on acme>=0.25.0.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with changes other than their version numbers were:

* acme
* certbot-nginx

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/56?closed=1

## 0.25.0 - 2018-06-06

### Added

* Support for the ready status type was added to acme. Without this change,
  Certbot and acme users will begin encountering errors when using Let's
  Encrypt's ACMEv2 API starting on June 19th for the staging environment and
  July 5th for production. See
  https://community.letsencrypt.org/t/acmev2-order-ready-status/62866 for more
  information.
* Certbot now accepts the flag --reuse-key which will cause the same key to be
  used in the certificate when the lineage is renewed rather than generating a
  new key.
* You can now add multiple email addresses to your ACME account with Certbot by
  providing a comma-separated list of emails to the --email flag.
* Support for Let's Encrypt's upcoming TLS-ALPN-01 challenge was added to acme.
  For more information, see
  https://community.letsencrypt.org/t/tls-alpn-validation-method/63814/1.
* acme now supports specifying the source address to bind to when sending
  outgoing connections. You still cannot specify this address using Certbot.
* If you run Certbot against Let's Encrypt's ACMEv2 staging server but don't
  already have an account registered at that server URL, Certbot will
  automatically reuse your staging account from Let's Encrypt's ACMEv1 endpoint
  if it exists.
* Interfaces were added to Certbot allowing plugins to be called at additional
  points. The `GenericUpdater` interface allows plugins to perform actions
  every time `certbot renew` is run, regardless of whether any certificates are
  due for renewal, and the `RenewDeployer` interface allows plugins to perform
  actions when a certificate is renewed. See `certbot.interfaces` for more
  information.

### Changed

* When running Certbot with --dry-run and you don't already have a staging
  account, the created account does not contain an email address even if one
  was provided, to avoid expiration emails from Let's Encrypt's staging server.
* certbot-nginx does a better job of automatically detecting the location of
  Nginx's configuration files when run on BSD-based systems.
* acme now requires and uses pytest when running tests with setuptools with
  `python setup.py test`.
* `certbot config_changes` no longer waits for user input before exiting.

### Fixed

* Misleading log output that caused users to think that Certbot's standalone
  plugin failed to bind to a port when performing a challenge has been
  corrected.
* An issue where certbot-nginx would fail to enable HSTS if the server block
  already had an `add_header` directive has been resolved.
* certbot-nginx now does a better job detecting the server block to base the
  configuration for TLS-SNI challenges on.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with functional changes were:

* acme
* certbot
* certbot-apache
* certbot-nginx

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/54?closed=1

## 0.24.0 - 2018-05-02

### Added

* certbot now has an enhance subcommand which allows you to configure security
  enhancements like HTTP to HTTPS redirects, OCSP stapling, and HSTS without
  reinstalling a certificate.
* certbot-dns-rfc2136 now allows the user to specify the port to use to reach
  the DNS server in its credentials file.
* acme now parses the wildcard field included in authorizations so it can be
  used by users of the library (see the sketch after this list).
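
A small sketch of reading the newly parsed field; it assumes `orderr` is an
`OrderResource` returned by `ClientV2.new_order()` elsewhere:

```python
# Each AuthorizationResource in the order now exposes the parsed `wildcard`
# flag, telling you whether the authorization was created for a wildcard name.
for authzr in orderr.authorizations:
    if authzr.body.wildcard:
        print('Wildcard authorization for *.%s' % authzr.body.identifier.value)
    else:
        print('Authorization for %s' % authzr.body.identifier.value)
```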

### Changed

* certbot-dns-route53 used to wait for each DNS update to propagate before
  sending the next one, but now it sends all updates before waiting, which
  speeds up issuance for multiple domains dramatically.
* Certbot's official Docker images are now based on Alpine Linux 3.7 rather
  than 3.4 because 3.4 has reached its end-of-life.
* We've doubled the time Certbot will spend polling authorizations before
  timing out.
* The level of the message logged when Certbot is being used with
  non-standard paths, warning that crontabs for renewal included in Certbot
  packages from OS package managers may not work, has been reduced. This stops
  the message from being written to stderr every time `certbot renew` runs.

### Fixed

* certbot-auto now works with Python 3.6.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with changes other than their version numbers were:

* acme
* certbot
* certbot-apache
* certbot-dns-digitalocean (only style improvements to tests)
* certbot-dns-rfc2136

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/52?closed=1

## 0.23.0 - 2018-04-04

### Added

* Support for OpenResty was added to the Nginx plugin.

### Changed

* The timestamps in Certbot's logfiles now use the system's local time zone
  rather than UTC.
* Certbot's DNS plugins that use Lexicon now rely on Lexicon>=2.2.1 to be able
  to create and delete multiple TXT records on a single domain.
* certbot-dns-google's test suite now works without an internet connection.

### Fixed

* Removed a small window during which, if an error occurred, Certbot
  wouldn't clean up performed challenges.
* The parameters `default` and `ipv6only` are now removed from `listen`
  directives when creating a new server block in the Nginx plugin.
* `server_name` directives enclosed in quotation marks in Nginx are now properly
  supported.
* Resolved an issue preventing the Apache plugin from starting Apache when it's
  not currently running on RHEL and Gentoo based systems.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
packages with changes other than their version numbers were:

* certbot
* certbot-apache
* certbot-dns-cloudxns
* certbot-dns-dnsimple
* certbot-dns-dnsmadeeasy
* certbot-dns-google
* certbot-dns-luadns
* certbot-dns-nsone
* certbot-dns-rfc2136
* certbot-nginx

More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/50?closed=1

Certbot adheres to [Semantic Versioning](http://semver.org/).

## 0.22.2 - 2018-03-19
@@ -1476,7 +563,7 @@ https://github.com/certbot/certbot/pulls?q=is%3Apr%20milestone%3A0.11.1%20is%3Ac

### Added

* When using the standalone plugin while running Certbot interactively
  and a required port is bound by another process, Certbot will give you
  the option to retry to grab the port rather than immediately exiting.
* You are now able to deactivate your account with the Let's Encrypt

CHANGES.rst (new file, 8 lines)
@@ -0,0 +1,8 @@

ChangeLog
=========

To see the changes in a given release, view the issues closed in a given
release's GitHub milestone:

- `Past releases <https://github.com/certbot/certbot/milestones?state=closed>`_
- `Upcoming releases <https://github.com/certbot/certbot/milestones>`_

@@ -1 +0,0 @@

This project is governed by [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode).

@@ -33,5 +33,3 @@ started. In particular, we recommend you read these sections

- [Finding issues to work on](https://certbot.eff.org/docs/contributing.html#find-issues-to-work-on)
- [Coding style](https://certbot.eff.org/docs/contributing.html#coding-style)
- [Submitting a pull request](https://certbot.eff.org/docs/contributing.html#submitting-a-pull-request)
- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)

Dockerfile (new file, 27 lines)
@@ -0,0 +1,27 @@

FROM python:2-alpine

ENTRYPOINT [ "certbot" ]
EXPOSE 80 443
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot

COPY CHANGES.rst README.rst setup.py src/
COPY acme src/acme
COPY certbot src/certbot

RUN apk add --no-cache --virtual .certbot-deps \
    libffi \
    libssl1.0 \
    openssl \
    ca-certificates \
    binutils
RUN apk add --no-cache --virtual .build-deps \
    gcc \
    linux-headers \
    openssl-dev \
    musl-dev \
    libffi-dev \
    && pip install --no-cache-dir \
        --editable /opt/certbot/src/acme \
        --editable /opt/certbot/src \
    && apk del .build-deps

@@ -1,5 +1,5 @@

# This Dockerfile builds an image for development.
FROM debian:buster
FROM ubuntu:xenial

# Note: this only exposes the port to other docker containers.
EXPOSE 80 443

@@ -16,6 +16,6 @@ RUN apt-get update && \

    /tmp/* \
    /var/tmp/*

RUN VENV_NAME="../venv" python tools/venv.py
RUN VENV_NAME="../venv" tools/venv.sh

ENV PATH /opt/certbot/venv/bin:$PATH

Dockerfile-old (new file, 75 lines)
@@ -0,0 +1,75 @@
|
||||
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
|
||||
# it is more likely developers will already have ubuntu:trusty rather
|
||||
# than e.g. debian:jessie and image size differences are negligible
|
||||
FROM ubuntu:trusty
|
||||
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
|
||||
MAINTAINER William Budington <bill@eff.org>
|
||||
|
||||
# Note: this only exposes the port to other docker containers. You
|
||||
# still have to bind to 443@host at runtime, as per the ACME spec.
|
||||
EXPOSE 443
|
||||
|
||||
# TODO: make sure --config-dir and --work-dir cannot be changed
|
||||
# through the CLI (certbot-docker wrapper that uses standalone
|
||||
# authenticator and text mode only?)
|
||||
VOLUME /etc/letsencrypt /var/lib/letsencrypt
|
||||
|
||||
WORKDIR /opt/certbot
|
||||
|
||||
# no need to mkdir anything:
|
||||
# https://docs.docker.com/reference/builder/#copy
|
||||
# If <dest> doesn't exist, it is created along with all missing
|
||||
# directories in its path.
|
||||
|
||||
ENV DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
|
||||
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
|
||||
apt-get clean && \
|
||||
rm -rf /var/lib/apt/lists/* \
|
||||
/tmp/* \
|
||||
/var/tmp/*
|
||||
|
||||
# the above is not likely to change, so by putting it further up the
|
||||
# Dockerfile we make sure we cache as much as possible
|
||||
|
||||
|
||||
COPY setup.py README.rst CHANGES.rst MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/
|
||||
|
||||
# all above files are necessary for setup.py and venv setup, however,
|
||||
# package source code directory has to be copied separately to a
|
||||
# subdirectory...
|
||||
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
|
||||
# directory, the entire contents of the directory are copied,
|
||||
# including filesystem metadata. Note: The directory itself is not
|
||||
# copied, just its contents." Order again matters, three files are far
|
||||
# more likely to be cached than the whole project directory
|
||||
|
||||
COPY certbot /opt/certbot/src/certbot/
|
||||
COPY acme /opt/certbot/src/acme/
|
||||
COPY certbot-apache /opt/certbot/src/certbot-apache/
|
||||
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
|
||||
|
||||
|
||||
RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv
|
||||
|
||||
# PATH is set now so pipstrap upgrades the correct (v)env
|
||||
ENV PATH /opt/certbot/venv/bin:$PATH
|
||||
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
|
||||
/opt/certbot/venv/bin/pip install \
|
||||
-e /opt/certbot/src/acme \
|
||||
-e /opt/certbot/src \
|
||||
-e /opt/certbot/src/certbot-apache \
|
||||
-e /opt/certbot/src/certbot-nginx
|
||||
|
||||
# install in editable mode (-e) to save space: it's not possible to
|
||||
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
|
||||
# this might also help in debugging: you can "docker run --entrypoint
|
||||
# bash" and investigate, apply patches, etc.
|
||||
|
||||
# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
|
||||
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
|
||||
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
|
||||
ENV PATH /opt/certbot/bin:$PATH
|
||||
|
||||
ENTRYPOINT [ "certbot" ]
|
||||
@@ -1,5 +1,5 @@
|
||||
include README.rst
|
||||
include CHANGELOG.md
|
||||
include CHANGES.rst
|
||||
include CONTRIBUTING.md
|
||||
include LICENSE.txt
|
||||
include linter_plugin.py
|
||||
|
||||
README.rst
@@ -6,7 +6,7 @@ Anyone who has gone through the trouble of setting up a secure website knows wha
|
||||
|
||||
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you’ll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
|
||||
|
||||
Certbot is meant to be run directly on your web server, not on your personal computer. If you’re using a hosted service and don’t have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let’s Encrypt.
|
||||
If you’re using a hosted service and don’t have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let’s Encrypt.
|
||||
|
||||
Certbot is a fully-featured, extensible client for the Let's
|
||||
Encrypt CA (or any other CA that speaks the `ACME
|
||||
@@ -28,19 +28,45 @@ Contributing
|
||||
If you'd like to contribute to this project please read `Developer Guide
|
||||
<https://certbot.eff.org/docs/contributing.html>`_.
|
||||
|
||||
This project is governed by `EFF's Public Projects Code of Conduct <https://www.eff.org/pages/eppcode>`_.
|
||||
|
||||
.. _installation:
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
The easiest way to install Certbot is by visiting `certbot.eff.org`_, where you can
|
||||
find the correct installation instructions for many web server and OS combinations.
|
||||
For more information, see `Get Certbot <https://certbot.eff.org/docs/install.html>`_.
|
||||
|
||||
.. _certbot.eff.org: https://certbot.eff.org/
|
||||
|
||||
How to run the client
|
||||
---------------------
|
||||
|
||||
The easiest way to install and run Certbot is by visiting `certbot.eff.org`_,
|
||||
where you can find the correct instructions for many web server and OS
|
||||
combinations. For more information, see `Get Certbot
|
||||
<https://certbot.eff.org/docs/install.html>`_.
|
||||
In many cases, you can just run ``certbot-auto`` or ``certbot``, and the
|
||||
client will guide you through the process of obtaining and installing certs
|
||||
interactively.
|
||||
|
||||
For full command line help, you can type::
|
||||
|
||||
./certbot-auto --help all
|
||||
|
||||
|
||||
You can also tell it exactly what you want it to do from the command line.
|
||||
For instance, if you want to obtain a cert for ``example.com``,
|
||||
``www.example.com``, and ``other.example.net``, using the Apache plugin to both
|
||||
obtain and install the certs, you could do this::
|
||||
|
||||
./certbot-auto --apache -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
(The first time you run the command, it will make an account, and ask for an
|
||||
email and agreement to the Let's Encrypt Subscriber Agreement; you can
|
||||
automate those with ``--email`` and ``--agree-tos``)
|
||||
|
||||
If you want to use a webserver that doesn't have full plugin support yet, you
|
||||
can still use "standalone" or "webroot" plugins to obtain a certificate::
|
||||
|
||||
./certbot-auto certonly --standalone --email admin@example.com -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
.. _certbot.eff.org: https://certbot.eff.org/
|
||||
|
||||
Understanding the client in more depth
|
||||
--------------------------------------
|
||||
@@ -65,6 +91,8 @@ Main Website: https://certbot.eff.org
|
||||
|
||||
Let's Encrypt Website: https://letsencrypt.org
|
||||
|
||||
IRC Channel: #letsencrypt on `Freenode`_
|
||||
|
||||
Community: https://community.letsencrypt.org
|
||||
|
||||
ACME spec: http://ietf-wg-acme.github.io/acme/
|
||||
@@ -73,12 +101,14 @@ ACME working area in github: https://github.com/ietf-wg-acme/acme
|
||||
|
||||
|build-status| |coverage| |docs| |container|
|
||||
|
||||
.. |build-status| image:: https://travis-ci.com/certbot/certbot.svg?branch=master
|
||||
:target: https://travis-ci.com/certbot/certbot
|
||||
.. _Freenode: https://webchat.freenode.net?channels=%23letsencrypt
|
||||
|
||||
.. |build-status| image:: https://travis-ci.org/certbot/certbot.svg?branch=master
|
||||
:target: https://travis-ci.org/certbot/certbot
|
||||
:alt: Travis CI status
|
||||
|
||||
.. |coverage| image:: https://codecov.io/gh/certbot/certbot/branch/master/graph/badge.svg
|
||||
:target: https://codecov.io/gh/certbot/certbot
|
||||
.. |coverage| image:: https://coveralls.io/repos/certbot/certbot/badge.svg?branch=master
|
||||
:target: https://coveralls.io/r/certbot/certbot
|
||||
:alt: Coverage status
|
||||
|
||||
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
|
||||
|
||||
@@ -1,6 +1,5 @@
|
||||
include LICENSE.txt
|
||||
include README.rst
|
||||
include pytest.ini
|
||||
recursive-include docs *
|
||||
recursive-include examples *
|
||||
recursive-include acme/testdata *
|
||||
|
||||
@@ -1,23 +1,12 @@
|
||||
"""ACME protocol implementation.
|
||||
|
||||
This module is an implementation of the `ACME protocol`_.
|
||||
This module is an implementation of the `ACME protocol`_. Latest
|
||||
supported version: `draft-ietf-acme-01`_.
|
||||
|
||||
|
||||
.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme
|
||||
|
||||
.. _`draft-ietf-acme-01`:
|
||||
https://github.com/ietf-wg-acme/acme/tree/draft-ietf-acme-acme-01
|
||||
|
||||
"""
|
||||
import sys
|
||||
import warnings
|
||||
|
||||
# This code exists to keep backwards compatibility with people using acme.jose
|
||||
# before it became the standalone josepy package.
|
||||
#
|
||||
# It is based on
|
||||
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
|
||||
|
||||
import josepy as jose
|
||||
|
||||
for mod in list(sys.modules):
|
||||
# This traversal is apparently necessary such that the identities are
|
||||
# preserved (acme.jose.* is josepy.*)
|
||||
if mod == 'josepy' or mod.startswith('josepy.'):
|
||||
sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]
|
||||
|
||||
@@ -3,19 +3,25 @@ import abc
|
||||
import functools
|
||||
import hashlib
|
||||
import logging
|
||||
import socket
|
||||
|
||||
from cryptography.hazmat.primitives import hashes # type: ignore
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import requests
|
||||
import six
|
||||
|
||||
from acme import errors
|
||||
from acme import crypto_util
|
||||
from acme import fields
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
# pylint: disable=too-few-public-methods
|
||||
|
||||
|
||||
class Challenge(jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json
|
||||
# _fields_to_partial_json | pylint: disable=abstract-method
|
||||
"""ACME challenge."""
|
||||
TYPES = {} # type: dict
|
||||
|
||||
@@ -29,7 +35,7 @@ class Challenge(jose.TypedJSONObjectWithFields):
|
||||
|
||||
|
||||
class ChallengeResponse(jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json
|
||||
# _fields_to_partial_json | pylint: disable=abstract-method
|
||||
"""ACME challenge response."""
|
||||
TYPES = {} # type: dict
|
||||
resource_type = 'challenge'
|
||||
@@ -88,7 +94,6 @@ class _TokenChallenge(Challenge):
|
||||
"""
|
||||
# TODO: check that path combined with uri does not go above
|
||||
# URI_ROOT_PATH!
|
||||
# pylint: disable=unsupported-membership-test
|
||||
return b'..' not in self.token and b'/' not in self.token
|
||||
|
||||
|
||||
@@ -133,21 +138,17 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
|
||||
|
||||
return True
|
||||
|
||||
def to_partial_json(self):
|
||||
jobj = super(KeyAuthorizationChallengeResponse, self).to_partial_json()
|
||||
jobj.pop('keyAuthorization', None)
|
||||
return jobj
|
||||
|
||||
|
||||
@six.add_metaclass(abc.ABCMeta)
|
||||
class KeyAuthorizationChallenge(_TokenChallenge):
|
||||
# pylint: disable=abstract-class-little-used,too-many-ancestors
|
||||
"""Challenge based on Key Authorization.
|
||||
|
||||
:param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
|
||||
that will be used to generate `response`.
|
||||
:param str typ: type of the challenge
|
||||
|
||||
"""
|
||||
typ = NotImplemented
|
||||
__metaclass__ = abc.ABCMeta
|
||||
|
||||
response_cls = NotImplemented
|
||||
thumbprint_hash_function = (
|
||||
KeyAuthorizationChallengeResponse.thumbprint_hash_function)
|
||||
@@ -172,7 +173,7 @@ class KeyAuthorizationChallenge(_TokenChallenge):
|
||||
:rtype: KeyAuthorizationChallengeResponse
|
||||
|
||||
"""
|
||||
return self.response_cls( # pylint: disable=not-callable
|
||||
return self.response_cls(
|
||||
key_authorization=self.key_authorization(account_key))
|
||||
|
||||
@abc.abstractmethod
|
||||
@@ -209,7 +210,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME dns-01 challenge response."""
|
||||
typ = "dns-01"
|
||||
|
||||
def simple_verify(self, chall, domain, account_public_key): # pylint: disable=unused-argument
|
||||
def simple_verify(self, chall, domain, account_public_key):
|
||||
"""Simple verify.
|
||||
|
||||
This method no longer checks DNS records and is a simple wrapper
|
||||
@@ -225,13 +226,14 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# pylint: disable=unused-argument
|
||||
verified = self.verify(chall, account_public_key)
|
||||
if not verified:
|
||||
logger.debug("Verification of key authorization in response failed")
|
||||
return verified
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class DNS01(KeyAuthorizationChallenge):
|
||||
"""ACME dns-01 challenge."""
|
||||
response_cls = DNS01Response
|
||||
@@ -321,7 +323,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
|
||||
return True
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class HTTP01(KeyAuthorizationChallenge):
|
||||
"""ACME http-01 challenge."""
|
||||
response_cls = HTTP01Response
|
||||
@@ -362,33 +364,149 @@ class HTTP01(KeyAuthorizationChallenge):
|
||||
|
||||
|
||||
@ChallengeResponse.register
|
||||
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME TLS-ALPN-01 challenge response.
|
||||
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME tls-sni-01 challenge response."""
|
||||
typ = "tls-sni-01"
|
||||
|
||||
This class only allows initiating a TLS-ALPN-01 challenge returned from the
|
||||
CA. Full support for responding to TLS-ALPN-01 challenges by generating and
|
||||
serving the expected response certificate is not currently provided.
|
||||
"""
|
||||
typ = "tls-alpn-01"
|
||||
DOMAIN_SUFFIX = b".acme.invalid"
|
||||
"""Domain name suffix."""
|
||||
|
||||
PORT = 443
|
||||
"""Verification port as defined by the protocol.
|
||||
|
||||
@Challenge.register
|
||||
class TLSALPN01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-alpn-01 challenge.
|
||||
|
||||
This class simply allows parsing the TLS-ALPN-01 challenge returned from
|
||||
the CA. Full TLS-ALPN-01 support is not currently provided.
|
||||
You can override it (e.g. for testing) by passing ``port`` to
|
||||
`simple_verify`.
|
||||
|
||||
"""
|
||||
typ = "tls-alpn-01"
|
||||
response_cls = TLSALPN01Response
|
||||
|
||||
@property
|
||||
def z(self): # pylint: disable=invalid-name
|
||||
"""``z`` value used for verification.
|
||||
|
||||
:rtype bytes:
|
||||
|
||||
"""
|
||||
return hashlib.sha256(
|
||||
self.key_authorization.encode("utf-8")).hexdigest().lower().encode()
|
||||
|
||||
@property
|
||||
def z_domain(self):
|
||||
"""Domain name used for verification, generated from `z`.
|
||||
|
||||
:rtype bytes:
|
||||
|
||||
"""
|
||||
return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX
|
||||
|
||||
def gen_cert(self, key=None, bits=2048):
|
||||
"""Generate tls-sni-01 certificate.
|
||||
|
||||
:param OpenSSL.crypto.PKey key: Optional private key used in
|
||||
certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
:param int bits: Number of bits for newly generated key.
|
||||
|
||||
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
if key is None:
|
||||
key = OpenSSL.crypto.PKey()
|
||||
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
|
||||
return crypto_util.gen_ss_cert(key, [
|
||||
# z_domain is too big to fit into CN, hence first dummy domain
|
||||
'dummy', self.z_domain.decode()], force_san=True), key
|
||||
|
||||
def probe_cert(self, domain, **kwargs):
|
||||
"""Probe tls-sni-01 challenge certificate.
|
||||
|
||||
:param unicode domain:
|
||||
|
||||
"""
|
||||
# TODO: domain is not necessary if host is provided
|
||||
if "host" not in kwargs:
|
||||
host = socket.gethostbyname(domain)
|
||||
logger.debug('%s resolved to %s', domain, host)
|
||||
kwargs["host"] = host
|
||||
|
||||
kwargs.setdefault("port", self.PORT)
|
||||
kwargs["name"] = self.z_domain
|
||||
# TODO: try different methods?
|
||||
# pylint: disable=protected-access
|
||||
return crypto_util.probe_sni(**kwargs)
|
||||
|
||||
def verify_cert(self, cert):
|
||||
"""Verify tls-sni-01 challenge certificate.
|
||||
|
||||
:param OpenSSL.crypto.X509 cert: Challenge certificate.
|
||||
|
||||
:returns: Whether the certificate was successfully verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# pylint: disable=protected-access
|
||||
sans = crypto_util._pyopenssl_cert_or_req_san(cert)
|
||||
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), sans)
|
||||
return self.z_domain.decode() in sans
|
||||
|
||||
def simple_verify(self, chall, domain, account_public_key,
|
||||
cert=None, **kwargs):
|
||||
"""Simple verify.
|
||||
|
||||
Verify ``validation`` using ``account_public_key``, optionally
|
||||
probe tls-sni-01 certificate and check using `verify_cert`.
|
||||
|
||||
:param .challenges.TLSSNI01 chall: Corresponding challenge.
|
||||
:param str domain: Domain name being validated.
|
||||
:param JWK account_public_key:
|
||||
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
|
||||
provided (``None``) certificate will be retrieved using
|
||||
`probe_cert`.
|
||||
:param int port: Port used to probe the certificate.
|
||||
|
||||
|
||||
:returns: ``True`` iff client's control of the domain has been
|
||||
verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not self.verify(chall, account_public_key):
|
||||
logger.debug("Verification of key authorization in response failed")
|
||||
return False
|
||||
|
||||
if cert is None:
|
||||
try:
|
||||
cert = self.probe_cert(domain=domain, **kwargs)
|
||||
except errors.Error as error:
|
||||
logger.debug(error, exc_info=True)
|
||||
return False
|
||||
|
||||
return self.verify_cert(cert)
|
||||
|
||||
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class TLSSNI01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-sni-01 challenge."""
|
||||
response_cls = TLSSNI01Response
|
||||
typ = response_cls.typ
|
||||
|
||||
# boulder#962, ietf-wg-acme#22
|
||||
#n = jose.Field("n", encoder=int, decoder=int)
|
||||
|
||||
def validation(self, account_key, **kwargs):
|
||||
"""Generate validation for the challenge."""
|
||||
raise NotImplementedError()
|
||||
"""Generate validation.
|
||||
|
||||
:param JWK account_key:
|
||||
:param OpenSSL.crypto.PKey cert_key: Optional private key used
|
||||
in certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
|
||||
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class DNS(_TokenChallenge):
|
||||
"""ACME "dns" challenge."""
|
||||
typ = "dns"
|
||||
|
||||
@@ -3,10 +3,12 @@ import unittest
|
||||
|
||||
import josepy as jose
|
||||
import mock
|
||||
import OpenSSL
|
||||
import requests
|
||||
|
||||
from six.moves.urllib import parse as urllib_parse # pylint: disable=relative-import
|
||||
from six.moves.urllib import parse as urllib_parse # pylint: disable=import-error
|
||||
|
||||
from acme import errors
|
||||
from acme import test_util
|
||||
|
||||
CERT = test_util.load_comparable_cert('cert.pem')
|
||||
@@ -75,6 +77,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
|
||||
|
||||
|
||||
class DNS01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import DNS01Response
|
||||
@@ -90,8 +93,7 @@ class DNS01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import DNS01Response
|
||||
@@ -146,6 +148,7 @@ class DNS01Test(unittest.TestCase):
|
||||
|
||||
|
||||
class HTTP01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
@@ -161,8 +164,7 @@ class HTTP01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
@@ -255,42 +257,114 @@ class HTTP01Test(unittest.TestCase):
|
||||
self.msg.update(token=b'..').good_token)
|
||||
|
||||
|
||||
class TLSALPN01ResponseTest(unittest.TestCase):
|
||||
class TLSSNI01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
self.msg = TLSALPN01Response(key_authorization=u'foo')
|
||||
from acme.challenges import TLSSNI01
|
||||
self.chall = TLSSNI01(
|
||||
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
|
||||
|
||||
self.response = self.chall.response(KEY)
|
||||
self.jmsg = {
|
||||
'resource': 'challenge',
|
||||
'type': 'tls-alpn-01',
|
||||
'keyAuthorization': u'foo',
|
||||
'type': 'tls-sni-01',
|
||||
'keyAuthorization': self.response.key_authorization,
|
||||
}
|
||||
|
||||
from acme.challenges import TLSALPN01
|
||||
self.chall = TLSALPN01(token=(b'x' * 16))
|
||||
self.response = self.chall.response(KEY)
|
||||
# pylint: disable=invalid-name
|
||||
label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
|
||||
label2 = b'b7793728f084394f2a1afd459556bb5c'
|
||||
self.z = label1 + label2
|
||||
self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
|
||||
self.domain = 'foo.com'
|
||||
|
||||
def test_z_and_domain(self):
|
||||
self.assertEqual(self.z, self.response.z)
|
||||
self.assertEqual(self.z_domain, self.response.z_domain)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.response.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
self.assertEqual(self.msg, TLSALPN01Response.from_json(self.jmsg))
|
||||
from acme.challenges import TLSSNI01Response
|
||||
self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))
|
||||
|
||||
def test_from_json_hashable(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
hash(TLSALPN01Response.from_json(self.jmsg))
|
||||
from acme.challenges import TLSSNI01Response
|
||||
hash(TLSSNI01Response.from_json(self.jmsg))
|
||||
|
||||
@mock.patch('acme.challenges.socket.gethostbyname')
|
||||
@mock.patch('acme.challenges.crypto_util.probe_sni')
|
||||
def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
|
||||
mock_gethostbyname.return_value = '127.0.0.1'
|
||||
self.response.probe_cert('foo.com')
|
||||
mock_gethostbyname.assert_called_once_with('foo.com')
|
||||
mock_probe_sni.assert_called_once_with(
|
||||
host='127.0.0.1', port=self.response.PORT,
|
||||
name=self.z_domain)
|
||||
|
||||
self.response.probe_cert('foo.com', host='8.8.8.8')
|
||||
mock_probe_sni.assert_called_with(
|
||||
host='8.8.8.8', port=mock.ANY, name=mock.ANY)
|
||||
|
||||
self.response.probe_cert('foo.com', port=1234)
|
||||
mock_probe_sni.assert_called_with(
|
||||
host=mock.ANY, port=1234, name=mock.ANY)
|
||||
|
||||
self.response.probe_cert('foo.com', bar='baz')
|
||||
mock_probe_sni.assert_called_with(
|
||||
host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')
|
||||
|
||||
self.response.probe_cert('foo.com', name=b'xxx')
|
||||
mock_probe_sni.assert_called_with(
|
||||
host=mock.ANY, port=mock.ANY,
|
||||
name=self.z_domain)
|
||||
|
||||
def test_gen_verify_cert(self):
|
||||
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
|
||||
cert, key2 = self.response.gen_cert(key1)
|
||||
self.assertEqual(key1, key2)
|
||||
self.assertTrue(self.response.verify_cert(cert))
|
||||
|
||||
def test_gen_verify_cert_gen_key(self):
|
||||
cert, key = self.response.gen_cert()
|
||||
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
|
||||
self.assertTrue(self.response.verify_cert(cert))
|
||||
|
||||
def test_verify_bad_cert(self):
|
||||
self.assertFalse(self.response.verify_cert(
|
||||
test_util.load_cert('cert.pem')))
|
||||
|
||||
def test_simple_verify_bad_key_authorization(self):
|
||||
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
|
||||
self.response.simple_verify(self.chall, "local", key2.public_key())
|
||||
|
||||
@mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
|
||||
def test_simple_verify(self, mock_verify_cert):
|
||||
mock_verify_cert.return_value = mock.sentinel.verification
|
||||
self.assertEqual(
|
||||
mock.sentinel.verification, self.response.simple_verify(
|
||||
self.chall, self.domain, KEY.public_key(),
|
||||
cert=mock.sentinel.cert))
|
||||
mock_verify_cert.assert_called_once_with(
|
||||
self.response, mock.sentinel.cert)
|
||||
|
||||
@mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
|
||||
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
|
||||
mock_probe_cert.side_effect = errors.Error
|
||||
self.assertFalse(self.response.simple_verify(
|
||||
self.chall, self.domain, KEY.public_key()))
|
||||
|
||||
|
||||
class TLSALPN01Test(unittest.TestCase):
|
||||
class TLSSNI01Test(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import TLSALPN01
|
||||
self.msg = TLSALPN01(
|
||||
from acme.challenges import TLSSNI01
|
||||
self.msg = TLSSNI01(
|
||||
token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
|
||||
self.jmsg = {
|
||||
'type': 'tls-alpn-01',
|
||||
'type': 'tls-sni-01',
|
||||
'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
|
||||
}
|
||||
|
||||
@@ -298,21 +372,25 @@ class TLSALPN01Test(unittest.TestCase):
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import TLSALPN01
|
||||
self.assertEqual(self.msg, TLSALPN01.from_json(self.jmsg))
|
||||
from acme.challenges import TLSSNI01
|
||||
self.assertEqual(self.msg, TLSSNI01.from_json(self.jmsg))
|
||||
|
||||
def test_from_json_hashable(self):
|
||||
from acme.challenges import TLSALPN01
|
||||
hash(TLSALPN01.from_json(self.jmsg))
|
||||
from acme.challenges import TLSSNI01
|
||||
hash(TLSSNI01.from_json(self.jmsg))
|
||||
|
||||
def test_from_json_invalid_token_length(self):
|
||||
from acme.challenges import TLSALPN01
|
||||
from acme.challenges import TLSSNI01
|
||||
self.jmsg['token'] = jose.encode_b64jose(b'abcd')
|
||||
self.assertRaises(
|
||||
jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
|
||||
jose.DeserializationError, TLSSNI01.from_json, self.jmsg)
|
||||
|
||||
def test_validation(self):
|
||||
self.assertRaises(NotImplementedError, self.msg.validation, KEY)
|
||||
@mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
|
||||
def test_validation(self, mock_gen_cert):
|
||||
mock_gen_cert.return_value = ('cert', 'key')
|
||||
self.assertEqual(('cert', 'key'), self.msg.validation(
|
||||
KEY, cert_key=mock.sentinel.cert_key))
|
||||
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)
|
||||
|
||||
|
||||
class DNSTest(unittest.TestCase):
|
||||
|
||||
@@ -6,23 +6,20 @@ from email.utils import parsedate_tz
|
||||
import heapq
|
||||
import logging
|
||||
import time
|
||||
import re
|
||||
import sys
|
||||
|
||||
import six
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import re
|
||||
import requests
|
||||
from requests.adapters import HTTPAdapter
|
||||
from requests_toolbelt.adapters.source import SourceAddressAdapter
|
||||
import sys
|
||||
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import jws
|
||||
from acme import messages
|
||||
# pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Dict, List, Set, Text
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@@ -33,7 +30,6 @@ logger = logging.getLogger(__name__)
|
||||
# https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning
|
||||
if sys.version_info < (2, 7, 9): # pragma: no cover
|
||||
try:
|
||||
# pylint: disable=no-member
|
||||
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
|
||||
except AttributeError:
|
||||
import urllib3.contrib.pyopenssl # pylint: disable=import-error
|
||||
@@ -44,13 +40,14 @@ DEFAULT_NETWORK_TIMEOUT = 45
|
||||
DER_CONTENT_TYPE = 'application/pkix-cert'
|
||||
|
||||
|
||||
class ClientBase(object):
|
||||
class ClientBase(object): # pylint: disable=too-many-instance-attributes
|
||||
"""ACME client base object.
|
||||
|
||||
:ivar messages.Directory directory:
|
||||
:ivar .ClientNetwork net: Client network.
|
||||
:ivar int acme_version: ACME protocol version. 1 or 2.
|
||||
"""
|
||||
|
||||
def __init__(self, directory, net, acme_version):
|
||||
"""Initialize.
|
||||
|
||||
@@ -90,8 +87,6 @@ class ClientBase(object):
|
||||
|
||||
"""
|
||||
kwargs.setdefault('acme_version', self.acme_version)
|
||||
if hasattr(self.directory, 'newNonce'):
|
||||
kwargs.setdefault('new_nonce_url', getattr(self.directory, 'newNonce'))
|
||||
return self.net.post(*args, **kwargs)
|
||||
|
||||
def update_registration(self, regr, update=None):
|
||||
@@ -123,21 +118,14 @@ class ClientBase(object):
|
||||
"""
|
||||
return self.update_registration(regr, update={'status': 'deactivated'})
|
||||
|
||||
def deactivate_authorization(self, authzr):
|
||||
# type: (messages.AuthorizationResource) -> messages.AuthorizationResource
|
||||
"""Deactivate authorization.
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.AuthorizationResource authzr: The Authorization resource
|
||||
to be deactivated.
|
||||
|
||||
:returns: The Authorization resource that was deactivated.
|
||||
:rtype: `.AuthorizationResource`
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
body = messages.UpdateAuthorization(status='deactivated')
|
||||
response = self._post(authzr.uri, body)
|
||||
return self._authzr_from_response(response,
|
||||
authzr.body.identifier, authzr.uri)
|
||||
return self._send_recv_regr(regr, messages.UpdateRegistration())
|
||||
|
||||
def _authzr_from_response(self, response, identifier=None, uri=None):
|
||||
authzr = messages.AuthorizationResource(
|
||||
@@ -207,6 +195,22 @@ class ClientBase(object):
|
||||
|
||||
return datetime.datetime.now() + datetime.timedelta(seconds=seconds)
|
||||
|
||||
def poll(self, authzr):
|
||||
"""Poll Authorization Resource for status.
|
||||
|
||||
:param authzr: Authorization Resource
|
||||
:type authzr: `.AuthorizationResource`
|
||||
|
||||
:returns: Updated Authorization Resource and HTTP response.
|
||||
|
||||
:rtype: (`.AuthorizationResource`, `requests.Response`)
|
||||
|
||||
"""
|
||||
response = self.net.get(authzr.uri)
|
||||
updated_authzr = self._authzr_from_response(
|
||||
response, authzr.body.identifier, authzr.uri)
|
||||
return updated_authzr, response
|
||||
|
||||
def _revoke(self, cert, rsn, url):
|
||||
"""Revoke certificate.
|
||||
|
||||
@@ -228,7 +232,6 @@ class ClientBase(object):
|
||||
raise errors.ClientError(
|
||||
'Successful revocation must return HTTP OK status')
|
||||
|
||||
|
||||
class Client(ClientBase):
|
||||
"""ACME client for a v1 API.
|
||||
|
||||
@@ -254,6 +257,7 @@ class Client(ClientBase):
|
||||
URI from which the resource will be downloaded.
|
||||
|
||||
"""
|
||||
# pylint: disable=too-many-arguments
|
||||
self.key = key
|
||||
if net is None:
|
||||
net = ClientNetwork(key, alg=alg, verify_ssl=verify_ssl)
|
||||
@@ -282,15 +286,6 @@ class Client(ClientBase):
|
||||
# pylint: disable=no-member
|
||||
return self._regr_from_response(response)
|
||||
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
return self._send_recv_regr(regr, messages.UpdateRegistration())
|
||||
|
||||
def agree_to_tos(self, regr):
|
||||
"""Agree to the terms-of-service.
|
||||
|
||||
@@ -389,22 +384,6 @@ class Client(ClientBase):
|
||||
body=jose.ComparableX509(OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_ASN1, response.content)))
|
||||
|
||||
def poll(self, authzr):
|
||||
"""Poll Authorization Resource for status.
|
||||
|
||||
:param authzr: Authorization Resource
|
||||
:type authzr: `.AuthorizationResource`
|
||||
|
||||
:returns: Updated Authorization Resource and HTTP response.
|
||||
|
||||
:rtype: (`.AuthorizationResource`, `requests.Response`)
|
||||
|
||||
"""
|
||||
response = self.net.get(authzr.uri)
|
||||
updated_authzr = self._authzr_from_response(
|
||||
response, authzr.body.identifier, authzr.uri)
|
||||
return updated_authzr, response
|
||||
|
||||
def poll_and_request_issuance(
|
||||
self, csr, authzrs, mintime=5, max_attempts=10):
|
||||
"""Poll and request issuance.
|
||||
@@ -434,8 +413,9 @@ class Client(ClientBase):
|
||||
was marked by the CA as invalid
|
||||
|
||||
"""
|
||||
# pylint: disable=too-many-locals
|
||||
assert max_attempts > 0
|
||||
attempts = collections.defaultdict(int) # type: Dict[messages.AuthorizationResource, int]
|
||||
attempts = collections.defaultdict(int)
|
||||
exhausted = set()
|
||||
|
||||
# priority queue with datetime.datetime (based on Retry-After) as key,
|
||||
@@ -549,7 +529,7 @@ class Client(ClientBase):
|
||||
:rtype: `list` of `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
|
||||
|
||||
"""
|
||||
chain = [] # type: List[jose.ComparableX509]
|
||||
chain = []
|
||||
uri = certr.cert_chain_uri
|
||||
while uri is not None and len(chain) < max_length:
|
||||
response, cert = self._get_cert(uri)
|
||||
@@ -595,60 +575,16 @@ class ClientV2(ClientBase):
|
||||
|
||||
:param .NewRegistration new_account:
|
||||
|
||||
:raises .ConflictError: in case the account already exists
|
||||
|
||||
:returns: Registration Resource.
|
||||
:rtype: `.RegistrationResource`
|
||||
"""
|
||||
response = self._post(self.directory['newAccount'], new_account)
|
||||
# if account already exists
|
||||
if response.status_code == 200 and 'Location' in response.headers:
|
||||
raise errors.ConflictError(response.headers.get('Location'))
|
||||
# "Instance of 'Field' has no key/contact member" bug:
|
||||
# pylint: disable=no-member
|
||||
regr = self._regr_from_response(response)
|
||||
self.net.account = regr
|
||||
return regr
|
||||
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
self.net.account = regr # See certbot/certbot#6258
|
||||
# ACME v2 requires to use a POST-as-GET request (POST an empty JWS) here.
|
||||
# This is done by passing None instead of an empty UpdateRegistration to _post().
|
||||
response = self._post(regr.uri, None)
|
||||
self.net.account = self._regr_from_response(response, uri=regr.uri,
|
||||
terms_of_service=regr.terms_of_service)
|
||||
return self.net.account
|
||||
|
||||
def update_registration(self, regr, update=None):
|
||||
"""Update registration.
|
||||
|
||||
:param messages.RegistrationResource regr: Registration Resource.
|
||||
:param messages.Registration update: Updated body of the
|
||||
resource. If not provided, body will be taken from `regr`.
|
||||
|
||||
:returns: Updated Registration Resource.
|
||||
:rtype: `.RegistrationResource`
|
||||
|
||||
"""
|
||||
# https://github.com/certbot/certbot/issues/6155
|
||||
new_regr = self._get_v2_account(regr)
|
||||
return super(ClientV2, self).update_registration(new_regr, update)
|
||||
|
||||
def _get_v2_account(self, regr):
|
||||
self.net.account = None
|
||||
only_existing_reg = regr.body.update(only_return_existing=True)
|
||||
response = self._post(self.directory['newAccount'], only_existing_reg)
|
||||
updated_uri = response.headers['Location']
|
||||
new_regr = regr.update(uri=updated_uri)
|
||||
self.net.account = new_regr
|
||||
return new_regr
|
||||
|
||||
def new_order(self, csr_pem):
|
||||
"""Request a new Order object from the server.
|
||||
|
||||
@@ -669,30 +605,14 @@ class ClientV2(ClientBase):
|
||||
response = self._post(self.directory['newOrder'], order)
|
||||
body = messages.Order.from_json(response.json())
|
||||
authorizations = []
|
||||
for url in body.authorizations: # pylint: disable=not-an-iterable
|
||||
authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
|
||||
for url in body.authorizations:
|
||||
authorizations.append(self._authzr_from_response(self.net.get(url), uri=url))
|
||||
return messages.OrderResource(
|
||||
body=body,
|
||||
uri=response.headers.get('Location'),
|
||||
authorizations=authorizations,
|
||||
csr_pem=csr_pem)
|
||||
|
||||
def poll(self, authzr):
|
||||
"""Poll Authorization Resource for status.
|
||||
|
||||
:param authzr: Authorization Resource
|
||||
:type authzr: `.AuthorizationResource`
|
||||
|
||||
:returns: Updated Authorization Resource and HTTP response.
|
||||
|
||||
:rtype: (`.AuthorizationResource`, `requests.Response`)
|
||||
|
||||
"""
|
||||
response = self._post_as_get(authzr.uri)
|
||||
updated_authzr = self._authzr_from_response(
|
||||
response, authzr.body.identifier, authzr.uri)
|
||||
return updated_authzr, response
|
||||
|
||||
def poll_and_finalize(self, orderr, deadline=None):
|
||||
"""Poll authorizations and finalize the order.
|
||||
|
||||
@@ -716,7 +636,7 @@ class ClientV2(ClientBase):
|
||||
responses = []
|
||||
for url in orderr.body.authorizations:
|
||||
while datetime.datetime.now() < deadline:
|
||||
authzr = self._authzr_from_response(self._post_as_get(url), uri=url)
|
||||
authzr = self._authzr_from_response(self.net.get(url), uri=url)
|
||||
if authzr.body.status != messages.STATUS_PENDING:
|
||||
responses.append(authzr)
|
||||
break
|
||||
@@ -731,7 +651,7 @@ class ClientV2(ClientBase):
|
||||
for chall in authzr.body.challenges:
|
||||
if chall.error != None:
|
||||
failed.append(authzr)
|
||||
if failed:
|
||||
if len(failed) > 0:
|
||||
raise errors.ValidationError(failed)
|
||||
return orderr.update(authorizations=responses)
|
||||
|
||||
@@ -751,12 +671,13 @@ class ClientV2(ClientBase):
|
||||
self._post(orderr.body.finalize, wrapped_csr)
|
||||
while datetime.datetime.now() < deadline:
|
||||
time.sleep(1)
|
||||
response = self._post_as_get(orderr.uri)
|
||||
response = self.net.get(orderr.uri)
|
||||
body = messages.Order.from_json(response.json())
|
||||
if body.error is not None:
|
||||
raise errors.IssuanceError(body.error)
|
||||
if body.certificate is not None:
|
||||
certificate_response = self._post_as_get(body.certificate).text
|
||||
certificate_response = self.net.get(body.certificate,
|
||||
content_type=DER_CONTENT_TYPE).text
|
||||
return orderr.update(body=body, fullchain_pem=certificate_response)
|
||||
raise errors.TimeoutError()
|
||||
|
||||
@@ -773,36 +694,6 @@ class ClientV2(ClientBase):
|
||||
"""
|
||||
return self._revoke(cert, rsn, self.directory['revokeCert'])
|
||||
|
||||
def external_account_required(self):
|
||||
"""Checks if ACME server requires External Account Binding authentication."""
|
||||
return hasattr(self.directory, 'meta') and self.directory.meta.external_account_required
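The check above only inspects the CA directory's meta object. A minimal sketch of that lookup, assuming a version of the acme package whose `Directory.Meta` still accepts `external_account_required` (the same construction used by the tests later in this diff):

```
# Hedged sketch: inspect a directory's meta for the EAB requirement.
from acme import messages

directory = messages.Directory({
    'newAccount': 'https://example.com/acme/new-account',
    'meta': messages.Directory.Meta(external_account_required=True),
})

# Mirrors the client check: a directory without meta counts as "not required".
print(hasattr(directory, 'meta') and directory.meta.external_account_required)  # True
```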
|
||||
|
||||
def _post_as_get(self, *args, **kwargs):
|
||||
"""
|
||||
Send GET request using the POST-as-GET protocol if needed.
|
||||
The request is first issued using POST-as-GET for ACME v2. If the ACME CA server does
not support this yet and returns an error, the request is retried using GET.
For ACME v1, only a GET request is tried, as POST-as-GET is not supported.
|
||||
:param args:
|
||||
:param kwargs:
|
||||
:return:
|
||||
"""
|
||||
if self.acme_version >= 2:
|
||||
# We add an empty payload for POST-as-GET requests
|
||||
new_args = args[:1] + (None,) + args[1:]
|
||||
try:
|
||||
return self._post(*new_args, **kwargs)
|
||||
except messages.Error as error:
|
||||
if error.code == 'malformed':
|
||||
logger.debug('Error during a POST-as-GET request, '
|
||||
'your ACME CA server may not support it:\n%s', error)
|
||||
logger.debug('Retrying request with GET.')
|
||||
else: # pragma: no cover
|
||||
raise
|
||||
|
||||
# If POST-as-GET is not supported yet, we use a GET instead.
|
||||
return self.net.get(*args, **kwargs)
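The fallback described in the docstring above reduces to a small, self-contained sketch; `post_empty_payload`, `plain_get`, and `MalformedError` below are hypothetical stand-ins for the client's signed POST helper, its plain GET helper, and the ACME `malformed` problem, not names from the acme package:

```
class MalformedError(Exception):
    """Stands in for an ACME 'malformed' problem document."""

def fetch_resource(url, post_empty_payload, plain_get):
    """Try POST-as-GET first, then fall back to a plain GET."""
    try:
        # ACME v2: a JWS-signed POST whose payload is empty.
        return post_empty_payload(url)
    except MalformedError:
        # Older servers reject the empty-payload POST; retry with GET.
        return plain_get(url)

# Simulate a CA that does not support POST-as-GET yet.
def post_empty_payload(url):
    raise MalformedError()

def plain_get(url):
    return 'GET ' + url

print(fetch_resource('https://example.com/acme/authz/1',
                     post_empty_payload, plain_get))
```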
class BackwardsCompatibleClientV2(object):
|
||||
"""ACME client wrapper that tends towards V2-style calls, but
|
||||
@@ -832,7 +723,12 @@ class BackwardsCompatibleClientV2(object):
|
||||
self.client = ClientV2(directory, net=net)
|
||||
|
||||
def __getattr__(self, name):
|
||||
return getattr(self.client, name)
|
||||
if name in vars(self.client):
|
||||
return getattr(self.client, name)
|
||||
elif name in dir(ClientBase):
|
||||
return getattr(self.client, name)
|
||||
else:
|
||||
raise AttributeError()
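The restricted forwarding above only delegates names the wrapped client really exposes. A standalone sketch of the same pattern, with hypothetical `Wrapper`/`Inner` classes:

```
class Inner(object):
    """Hypothetical wrapped client."""
    def ping(self):
        return 'pong'

class Wrapper(object):
    """Hypothetical wrapper that forwards only real attributes."""
    def __init__(self):
        self.inner = Inner()

    def __getattr__(self, name):
        # Forward names the wrapped object (or its class) actually defines,
        # so typos surface as AttributeError instead of resolving silently.
        if name in vars(self.inner) or name in dir(Inner):
            return getattr(self.inner, name)
        raise AttributeError(name)

w = Wrapper()
print(w.ping())             # forwarded -> 'pong'
try:
    w.does_not_exist
except AttributeError:
    print('unknown attribute rejected')
```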
|
||||
|
||||
def new_account_and_tos(self, regr, check_tos_cb=None):
|
||||
"""Combined register and agree_tos for V1, new_account for V2
|
||||
@@ -879,7 +775,8 @@ class BackwardsCompatibleClientV2(object):
|
||||
for domain in dnsNames:
|
||||
authorizations.append(self.client.request_domain_challenges(domain))
|
||||
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
|
||||
return self.client.new_order(csr_pem)
|
||||
else:
|
||||
return self.client.new_order(csr_pem)
|
||||
|
||||
def finalize_order(self, orderr, deadline):
|
||||
"""Finalize an order and obtain a certificate.
|
||||
@@ -916,7 +813,8 @@ class BackwardsCompatibleClientV2(object):
|
||||
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
|
||||
|
||||
return orderr.update(fullchain_pem=(cert + chain))
|
||||
return self.client.finalize_order(orderr, deadline)
|
||||
else:
|
||||
return self.client.finalize_order(orderr, deadline)
|
||||
|
||||
def revoke(self, cert, rsn):
|
||||
"""Revoke certificate.
|
||||
@@ -934,18 +832,11 @@ class BackwardsCompatibleClientV2(object):
|
||||
def _acme_version_from_directory(self, directory):
|
||||
if hasattr(directory, 'newNonce'):
|
||||
return 2
|
||||
return 1
|
||||
|
||||
def external_account_required(self):
|
||||
"""Checks if the server requires an external account for ACMEv2 servers.
|
||||
|
||||
Always returns False for ACMEv1 servers, as ACMEv1 does not use External Account Binding."""
|
||||
if self.acme_version == 1:
|
||||
return False
|
||||
return self.client.external_account_required()
|
||||
else:
|
||||
return 1
|
||||
|
||||
|
||||
class ClientNetwork(object):
|
||||
class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
|
||||
"""Wrapper around requests that signs POSTs for authentication.
|
||||
|
||||
Also adds user agent, and handles Content-Type.
|
||||
@@ -965,27 +856,18 @@ class ClientNetwork(object):
|
||||
:param bool verify_ssl: Whether to verify certificates on SSL connections.
|
||||
:param str user_agent: String to send as User-Agent header.
|
||||
:param float timeout: Timeout for requests.
|
||||
:param source_address: Optional source address to bind to when making requests.
|
||||
:type source_address: str or tuple(str, int)
|
||||
"""
|
||||
def __init__(self, key, account=None, alg=jose.RS256, verify_ssl=True,
|
||||
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT,
|
||||
source_address=None):
|
||||
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT):
|
||||
# pylint: disable=too-many-arguments
|
||||
self.key = key
|
||||
self.account = account
|
||||
self.alg = alg
|
||||
self.verify_ssl = verify_ssl
|
||||
self._nonces = set() # type: Set[Text]
|
||||
self._nonces = set()
|
||||
self.user_agent = user_agent
|
||||
self.session = requests.Session()
|
||||
self._default_timeout = timeout
|
||||
adapter = HTTPAdapter()
|
||||
|
||||
if source_address is not None:
|
||||
adapter = SourceAddressAdapter(source_address)
|
||||
|
||||
self.session.mount("http://", adapter)
|
||||
self.session.mount("https://", adapter)
|
||||
|
||||
def __del__(self):
|
||||
# Try to close the session, but don't show exceptions to the
|
||||
@@ -1006,7 +888,7 @@ class ClientNetwork(object):
|
||||
:rtype: `josepy.JWS`
|
||||
|
||||
"""
|
||||
jobj = obj.json_dumps(indent=2).encode() if obj else b''
|
||||
jobj = obj.json_dumps(indent=2).encode()
|
||||
logger.debug('JWS payload:\n%s', jobj)
|
||||
kwargs = {
|
||||
"alg": self.alg,
|
||||
@@ -1015,10 +897,10 @@ class ClientNetwork(object):
|
||||
if acme_version == 2:
|
||||
kwargs["url"] = url
|
||||
# newAccount and revokeCert work without the kid
|
||||
# newAccount must not have kid
|
||||
if self.account is not None:
|
||||
kwargs["kid"] = self.account["uri"]
|
||||
kwargs["key"] = self.key
|
||||
# pylint: disable=star-args
|
||||
return jws.JWS.sign(jobj, **kwargs).json_dumps(indent=2)
|
||||
|
||||
@classmethod
|
||||
@@ -1078,6 +960,7 @@ class ClientNetwork(object):
|
||||
return response
|
||||
|
||||
def _send_request(self, method, url, *args, **kwargs):
|
||||
# pylint: disable=too-many-locals
|
||||
"""Send HTTP request.
|
||||
|
||||
Makes sure that `verify_ssl` is respected. Logs request and
|
||||
@@ -1134,7 +1017,7 @@ class ClientNetwork(object):
|
||||
if response.headers.get("Content-Type") == DER_CONTENT_TYPE:
|
||||
debug_content = base64.b64encode(response.content)
|
||||
else:
|
||||
debug_content = response.content.decode("utf-8")
|
||||
debug_content = response.content
|
||||
logger.debug('Received response:\nHTTP %d\n%s\n\n%s',
|
||||
response.status_code,
|
||||
"\n".join(["{0}: {1}".format(k, v)
|
||||
@@ -1169,15 +1052,10 @@ class ClientNetwork(object):
|
||||
else:
|
||||
raise errors.MissingNonce(response)
|
||||
|
||||
def _get_nonce(self, url, new_nonce_url):
|
||||
def _get_nonce(self, url):
|
||||
if not self._nonces:
|
||||
logger.debug('Requesting fresh nonce')
|
||||
if new_nonce_url is None:
|
||||
response = self.head(url)
|
||||
else:
|
||||
# request a new nonce from the acme newNonce endpoint
|
||||
response = self._check_response(self.head(new_nonce_url), content_type=None)
|
||||
self._add_nonce(response)
|
||||
self._add_nonce(self.head(url))
|
||||
return self._nonces.pop()
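The nonce handling above keeps a pool of Replay-Nonce values harvested from responses and only issues a HEAD request when the pool runs dry. A self-contained sketch, with `NoncePool` and `head_request` as hypothetical stand-ins:

```
class NoncePool(object):
    """Hypothetical stand-in for the client's replay-nonce bookkeeping."""

    def __init__(self, head_request):
        self._nonces = set()
        self._head_request = head_request  # callable returning response headers

    def add_from_response(self, headers):
        nonce = headers.get('Replay-Nonce')
        if nonce:
            self._nonces.add(nonce)

    def get(self, url):
        if not self._nonces:
            # Pool is empty: ask the server for a fresh nonce via HEAD.
            self.add_from_response(self._head_request(url))
        return self._nonces.pop()

counter = {'n': 0}
def head_request(url):  # fake HEAD helper for the sketch
    counter['n'] += 1
    return {'Replay-Nonce': 'nonce-%d' % counter['n']}

pool = NoncePool(head_request)
print(pool.get('https://example.com/acme/reg/1'))      # triggers a HEAD
pool.add_from_response({'Replay-Nonce': 'from-a-post'})
print(pool.get('https://example.com/acme/reg/1'))      # served from the pool
```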
|
||||
|
||||
def post(self, *args, **kwargs):
|
||||
@@ -1198,10 +1076,8 @@ class ClientNetwork(object):
|
||||
|
||||
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
|
||||
acme_version=1, **kwargs):
|
||||
new_nonce_url = kwargs.pop('new_nonce_url', None)
|
||||
data = self._wrap_in_jws(obj, self._get_nonce(url, new_nonce_url), url, acme_version)
|
||||
data = self._wrap_in_jws(obj, self._get_nonce(url), url, acme_version)
|
||||
kwargs.setdefault('headers', {'Content-Type': content_type})
|
||||
response = self._send_request('POST', url, data=data, **kwargs)
|
||||
response = self._check_response(response, content_type=content_type)
|
||||
self._add_nonce(response)
|
||||
return response
|
||||
return self._check_response(response, content_type=content_type)
|
||||
|
||||
@@ -1,5 +1,4 @@
|
||||
"""Tests for acme.client."""
|
||||
# pylint: disable=too-many-lines
|
||||
import copy
|
||||
import datetime
|
||||
import json
|
||||
@@ -18,7 +17,6 @@ from acme import jws as acme_jws
|
||||
from acme import messages
|
||||
from acme import messages_test
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
CERT_DER = test_util.load_vector('cert.der')
|
||||
@@ -63,8 +61,7 @@ class ClientTestBase(unittest.TestCase):
|
||||
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
|
||||
reg = messages.Registration(
|
||||
contact=self.contact, key=KEY.public_key())
|
||||
the_arg = dict(reg) # type: Dict
|
||||
self.new_reg = messages.NewRegistration(**the_arg)
|
||||
self.new_reg = messages.NewRegistration(**dict(reg))
|
||||
self.regr = messages.RegistrationResource(
|
||||
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
|
||||
|
||||
@@ -135,18 +132,12 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
client = self._init()
|
||||
self.assertEqual(client.acme_version, 2)
|
||||
|
||||
def test_query_registration_client_v2(self):
|
||||
self.response.json.return_value = DIRECTORY_V2.to_json()
|
||||
client = self._init()
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, client.query_registration(self.regr))
|
||||
|
||||
def test_forwarding(self):
|
||||
self.response.json.return_value = DIRECTORY_V1.to_json()
|
||||
client = self._init()
|
||||
self.assertEqual(client.directory, client.client.directory)
|
||||
self.assertEqual(client.key, KEY)
|
||||
self.assertEqual(client.deactivate_registration, client.client.deactivate_registration)
|
||||
self.assertEqual(client.update_registration, client.client.update_registration)
|
||||
self.assertRaises(AttributeError, client.__getattr__, 'nonexistent')
|
||||
self.assertRaises(AttributeError, client.__getattr__, 'new_account_and_tos')
|
||||
self.assertRaises(AttributeError, client.__getattr__, 'new_account')
|
||||
@@ -277,47 +268,10 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
client.revoke(messages_test.CERT, self.rsn)
|
||||
mock_client().revoke.assert_called_once_with(messages_test.CERT, self.rsn)
|
||||
|
||||
def test_update_registration(self):
|
||||
self.response.json.return_value = DIRECTORY_V1.to_json()
|
||||
with mock.patch('acme.client.Client') as mock_client:
|
||||
client = self._init()
|
||||
client.update_registration(mock.sentinel.regr, None)
|
||||
mock_client().update_registration.assert_called_once_with(mock.sentinel.regr, None)
|
||||
|
||||
# newNonce present means it will pick acme_version 2
|
||||
def test_external_account_required_true(self):
|
||||
self.response.json.return_value = messages.Directory({
|
||||
'newNonce': 'http://letsencrypt-test.com/acme/new-nonce',
|
||||
'meta': messages.Directory.Meta(external_account_required=True),
|
||||
}).to_json()
|
||||
|
||||
client = self._init()
|
||||
|
||||
self.assertTrue(client.external_account_required())
|
||||
|
||||
# newNonce present means it will pick acme_version 2
|
||||
def test_external_account_required_false(self):
|
||||
self.response.json.return_value = messages.Directory({
|
||||
'newNonce': 'http://letsencrypt-test.com/acme/new-nonce',
|
||||
'meta': messages.Directory.Meta(external_account_required=False),
|
||||
}).to_json()
|
||||
|
||||
client = self._init()
|
||||
|
||||
self.assertFalse(client.external_account_required())
|
||||
|
||||
def test_external_account_required_false_v1(self):
|
||||
self.response.json.return_value = messages.Directory({
|
||||
'meta': messages.Directory.Meta(external_account_required=False),
|
||||
}).to_json()
|
||||
|
||||
client = self._init()
|
||||
|
||||
self.assertFalse(client.external_account_required())
|
||||
|
||||
|
||||
class ClientTest(ClientTestBase):
|
||||
"""Tests for acme.client.Client."""
|
||||
# pylint: disable=too-many-instance-attributes,too-many-public-methods
|
||||
|
||||
def setUp(self):
|
||||
super(ClientTest, self).setUp()
|
||||
@@ -357,6 +311,7 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_register(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.status_code = http_client.CREATED
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
@@ -369,6 +324,7 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_update_registration(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, self.client.update_registration(self.regr))
|
||||
@@ -636,14 +592,6 @@ class ClientTest(ClientTestBase):
|
||||
errors.PollError, self.client.poll_and_request_issuance,
|
||||
csr, authzrs, mintime=mintime, max_attempts=2)
|
||||
|
||||
def test_deactivate_authorization(self):
|
||||
authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
|
||||
self.response.json.return_value = authzb.to_json()
|
||||
authzr = self.client.deactivate_authorization(self.authzr)
|
||||
self.assertEqual(authzb, authzr.body)
|
||||
self.assertEqual(self.client.net.post.call_count, 1)
|
||||
self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])
|
||||
|
||||
def test_check_cert(self):
|
||||
self.response.headers['Location'] = self.certr.uri
|
||||
self.response.content = CERT_DER
|
||||
@@ -702,7 +650,7 @@ class ClientTest(ClientTestBase):
|
||||
def test_revocation_payload(self):
|
||||
obj = messages.Revocation(certificate=self.certr.body, reason=self.rsn)
|
||||
self.assertTrue('reason' in obj.to_partial_json().keys())
|
||||
self.assertEqual(self.rsn, obj.to_partial_json()['reason'])
|
||||
self.assertEquals(self.rsn, obj.to_partial_json()['reason'])
|
||||
|
||||
def test_revoke_bad_status_raises_error(self):
|
||||
self.response.status_code = http_client.METHOD_NOT_ALLOWED
|
||||
@@ -712,7 +660,6 @@ class ClientTest(ClientTestBase):
|
||||
self.certr,
|
||||
self.rsn)
|
||||
|
||||
|
||||
class ClientV2Test(ClientTestBase):
|
||||
"""Tests for acme.client.ClientV2."""
|
||||
|
||||
@@ -750,11 +697,6 @@ class ClientV2Test(ClientTestBase):
|
||||
|
||||
self.assertEqual(self.regr, self.client.new_account(self.new_reg))
|
||||
|
||||
def test_new_account_conflict(self):
|
||||
self.response.status_code = http_client.OK
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.assertRaises(errors.ConflictError, self.client.new_account, self.new_reg)
|
||||
|
||||
def test_new_order(self):
|
||||
order_response = copy.deepcopy(self.response)
|
||||
order_response.status_code = http_client.CREATED
|
||||
@@ -768,10 +710,9 @@ class ClientV2Test(ClientTestBase):
|
||||
authz_response2 = self.response
|
||||
authz_response2.json.return_value = self.authz2.to_json()
|
||||
authz_response2.headers['Location'] = self.authzr2.uri
|
||||
self.net.get.side_effect = (authz_response, authz_response2)
|
||||
|
||||
with mock.patch('acme.client.ClientV2._post_as_get') as mock_post_as_get:
|
||||
mock_post_as_get.side_effect = (authz_response, authz_response2)
|
||||
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
|
||||
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
|
||||
|
||||
@mock.patch('acme.client.datetime')
|
||||
def test_poll_and_finalize(self, mock_datetime):
|
||||
@@ -844,61 +785,7 @@ class ClientV2Test(ClientTestBase):
|
||||
def test_revoke(self):
|
||||
self.client.revoke(messages_test.CERT, self.rsn)
|
||||
self.net.post.assert_called_once_with(
|
||||
self.directory["revokeCert"], mock.ANY, acme_version=2,
|
||||
new_nonce_url=DIRECTORY_V2['newNonce'])
|
||||
|
||||
def test_update_registration(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, self.client.update_registration(self.regr))
|
||||
self.assertNotEqual(self.client.net.account, None)
|
||||
self.assertEqual(self.client.net.post.call_count, 2)
|
||||
self.assertTrue(DIRECTORY_V2.newAccount in self.net.post.call_args_list[0][0])
|
||||
|
||||
self.response.json.return_value = self.regr.body.update(
|
||||
contact=()).to_json()
|
||||
|
||||
def test_external_account_required_true(self):
|
||||
self.client.directory = messages.Directory({
|
||||
'meta': messages.Directory.Meta(external_account_required=True)
|
||||
})
|
||||
|
||||
self.assertTrue(self.client.external_account_required())
|
||||
|
||||
def test_external_account_required_false(self):
|
||||
self.client.directory = messages.Directory({
|
||||
'meta': messages.Directory.Meta(external_account_required=False)
|
||||
})
|
||||
|
||||
self.assertFalse(self.client.external_account_required())
|
||||
|
||||
def test_external_account_required_default(self):
|
||||
self.assertFalse(self.client.external_account_required())
|
||||
|
||||
def test_post_as_get(self):
|
||||
with mock.patch('acme.client.ClientV2._authzr_from_response') as mock_client:
|
||||
mock_client.return_value = self.authzr2
|
||||
|
||||
self.client.poll(self.authzr2) # pylint: disable=protected-access
|
||||
|
||||
self.client.net.post.assert_called_once_with(
|
||||
self.authzr2.uri, None, acme_version=2,
|
||||
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
|
||||
self.client.net.get.assert_not_called()
|
||||
|
||||
class FakeError(messages.Error):
|
||||
"""Fake error to reproduce a malformed request ACME error"""
|
||||
def __init__(self): # pylint: disable=super-init-not-called
|
||||
pass
|
||||
@property
|
||||
def code(self):
|
||||
return 'malformed'
|
||||
self.client.net.post.side_effect = FakeError()
|
||||
|
||||
self.client.poll(self.authzr2) # pylint: disable=protected-access
|
||||
|
||||
self.client.net.get.assert_called_once_with(self.authzr2.uri)
|
||||
self.directory["revokeCert"], mock.ANY, acme_version=2)
|
||||
|
||||
|
||||
class MockJSONDeSerializable(jose.JSONDeSerializable):
|
||||
@@ -910,12 +797,13 @@ class MockJSONDeSerializable(jose.JSONDeSerializable):
|
||||
return {'foo': self.value}
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
def from_json(cls, value):
|
||||
pass # pragma: no cover
|
||||
|
||||
|
||||
class ClientNetworkTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork."""
|
||||
# pylint: disable=too-many-public-methods
|
||||
|
||||
def setUp(self):
|
||||
self.verify_ssl = mock.MagicMock()
|
||||
@@ -954,6 +842,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.assertEqual(jws.signature.combined.kid, u'acct-uri')
|
||||
self.assertEqual(jws.signature.combined.url, u'url')
|
||||
|
||||
|
||||
def test_check_response_not_ok_jobj_no_error(self):
|
||||
self.response.ok = False
|
||||
self.response.json.return_value = {}
|
||||
@@ -1116,11 +1005,12 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
|
||||
# Requests Library Exceptions
|
||||
except requests.exceptions.ConnectionError as z: #pragma: no cover
|
||||
self.assertTrue("'Connection aborted.'" in str(z) or "[WinError 10061]" in str(z))
|
||||
|
||||
self.assertEqual("('Connection aborted.', "
|
||||
"error(111, 'Connection refused'))", str(z))
|
||||
|
||||
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork which mock out response."""
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.client import ClientNetwork
|
||||
@@ -1129,10 +1019,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.response = mock.MagicMock(ok=True, status_code=http_client.OK)
|
||||
self.response.headers = {}
|
||||
self.response.links = {}
|
||||
self.response.checked = False
|
||||
self.acmev1_nonce_response = mock.MagicMock(ok=False,
|
||||
status_code=http_client.METHOD_NOT_ALLOWED)
|
||||
self.acmev1_nonce_response.headers = {}
|
||||
self.checked_response = mock.MagicMock()
|
||||
self.obj = mock.MagicMock()
|
||||
self.wrapped_obj = mock.MagicMock()
|
||||
self.content_type = mock.sentinel.content_type
|
||||
@@ -1144,21 +1031,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
|
||||
def send_request(*args, **kwargs):
|
||||
# pylint: disable=unused-argument,missing-docstring
|
||||
self.assertFalse("new_nonce_url" in kwargs)
|
||||
method = args[0]
|
||||
uri = args[1]
|
||||
if method == 'HEAD' and uri != "new_nonce_uri":
|
||||
response = self.acmev1_nonce_response
|
||||
else:
|
||||
response = self.response
|
||||
|
||||
if self.available_nonces:
|
||||
response.headers = {
|
||||
self.response.headers = {
|
||||
self.net.REPLAY_NONCE_HEADER:
|
||||
self.available_nonces.pop().decode()}
|
||||
else:
|
||||
response.headers = {}
|
||||
return response
|
||||
self.response.headers = {}
|
||||
return self.response
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.net._send_request = self.send_request = mock.MagicMock(
|
||||
@@ -1170,39 +1049,28 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
# pylint: disable=missing-docstring
|
||||
self.assertEqual(self.response, response)
|
||||
self.assertEqual(self.content_type, content_type)
|
||||
self.assertTrue(self.response.ok)
|
||||
self.response.checked = True
|
||||
return self.response
|
||||
return self.checked_response
|
||||
|
||||
def test_head(self):
|
||||
self.assertEqual(self.acmev1_nonce_response, self.net.head(
|
||||
self.assertEqual(self.response, self.net.head(
|
||||
'http://example.com/', 'foo', bar='baz'))
|
||||
self.send_request.assert_called_once_with(
|
||||
'HEAD', 'http://example.com/', 'foo', bar='baz')
|
||||
|
||||
def test_head_v2(self):
|
||||
self.assertEqual(self.response, self.net.head(
|
||||
'new_nonce_uri', 'foo', bar='baz'))
|
||||
self.send_request.assert_called_once_with(
|
||||
'HEAD', 'new_nonce_uri', 'foo', bar='baz')
|
||||
|
||||
def test_get(self):
|
||||
self.assertEqual(self.response, self.net.get(
|
||||
self.assertEqual(self.checked_response, self.net.get(
|
||||
'http://example.com/', content_type=self.content_type, bar='baz'))
|
||||
self.assertTrue(self.response.checked)
|
||||
self.send_request.assert_called_once_with(
|
||||
'GET', 'http://example.com/', bar='baz')
|
||||
|
||||
def test_post_no_content_type(self):
|
||||
self.content_type = self.net.JOSE_CONTENT_TYPE
|
||||
self.assertEqual(self.response, self.net.post('uri', self.obj))
|
||||
self.assertTrue(self.response.checked)
|
||||
self.assertEqual(self.checked_response, self.net.post('uri', self.obj))
|
||||
|
||||
def test_post(self):
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net.post(
|
||||
self.assertEqual(self.checked_response, self.net.post(
|
||||
'uri', self.obj, content_type=self.content_type))
|
||||
self.assertTrue(self.response.checked)
|
||||
self.net._wrap_in_jws.assert_called_once_with(
|
||||
self.obj, jose.b64decode(self.all_nonces.pop()), "uri", 1)
|
||||
|
||||
@@ -1234,7 +1102,7 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
def test_post_not_retried(self):
|
||||
check_response = mock.MagicMock()
|
||||
check_response.side_effect = [messages.Error.with_code('malformed'),
|
||||
self.response]
|
||||
self.checked_response]
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.net._check_response = check_response
|
||||
@@ -1242,12 +1110,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.obj, content_type=self.content_type)
|
||||
|
||||
def test_post_successful_retry(self):
|
||||
post_once = mock.MagicMock()
|
||||
post_once.side_effect = [messages.Error.with_code('badNonce'),
|
||||
self.response]
|
||||
check_response = mock.MagicMock()
|
||||
check_response.side_effect = [messages.Error.with_code('badNonce'),
|
||||
self.checked_response]
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net.post(
|
||||
self.net._check_response = check_response
|
||||
self.assertEqual(self.checked_response, self.net.post(
|
||||
'uri', self.obj, content_type=self.content_type))
|
||||
|
||||
def test_head_get_post_error_passthrough(self):
|
||||
@@ -1258,51 +1127,6 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.assertRaises(requests.exceptions.RequestException,
|
||||
self.net.post, 'uri', obj=self.obj)
|
||||
|
||||
def test_post_bad_nonce_head(self):
|
||||
# pylint: disable=protected-access
|
||||
# regression test for https://github.com/certbot/certbot/issues/6092
|
||||
bad_response = mock.MagicMock(ok=False, status_code=http_client.SERVICE_UNAVAILABLE)
|
||||
self.net._send_request = mock.MagicMock()
|
||||
self.net._send_request.return_value = bad_response
|
||||
self.content_type = None
|
||||
check_response = mock.MagicMock()
|
||||
self.net._check_response = check_response
|
||||
self.assertRaises(errors.ClientError, self.net.post, 'uri',
|
||||
self.obj, content_type=self.content_type, acme_version=2,
|
||||
new_nonce_url='new_nonce_uri')
|
||||
self.assertEqual(check_response.call_count, 1)
|
||||
|
||||
def test_new_nonce_uri_removed(self):
|
||||
self.content_type = None
|
||||
self.net.post('uri', self.obj, content_type=None,
|
||||
acme_version=2, new_nonce_url='new_nonce_uri')
|
||||
|
||||
|
||||
class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
|
||||
"""Tests that if ClientNetwork has a source IP set manually, the underlying library has
|
||||
used the provided source address."""
|
||||
|
||||
def setUp(self):
|
||||
self.source_address = "8.8.8.8"
|
||||
|
||||
def test_source_address_set(self):
|
||||
from acme.client import ClientNetwork
|
||||
net = ClientNetwork(key=None, alg=None, source_address=self.source_address)
|
||||
for adapter in net.session.adapters.values():
|
||||
self.assertTrue(self.source_address in adapter.source_address)
|
||||
|
||||
def test_behavior_assumption(self):
|
||||
"""This is a test that guardrails the HTTPAdapter behavior so that if the default for
|
||||
a Session() changes, the assumptions here aren't violated silently."""
|
||||
from acme.client import ClientNetwork
|
||||
# Source address not specified, so the default adapter type should be bound -- this
|
||||
# test should fail if the default adapter type is changed by requests
|
||||
net = ClientNetwork(key=None, alg=None)
|
||||
session = requests.Session()
|
||||
for scheme in session.adapters.keys():
|
||||
client_network_adapter = net.session.adapters.get(scheme)
|
||||
default_adapter = session.adapters.get(scheme)
|
||||
self.assertEqual(client_network_adapter.__class__, default_adapter.__class__)
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
|
||||
@@ -6,29 +6,29 @@ import os
|
||||
import re
|
||||
import socket
|
||||
|
||||
from OpenSSL import crypto
|
||||
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
|
||||
import OpenSSL
|
||||
import josepy as jose
|
||||
|
||||
|
||||
from acme import errors
|
||||
# pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Callable, Union, Tuple, Optional
|
||||
# pylint: enable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Default SSL method selected here is the most compatible, while secure
|
||||
# SSL method: TLSv1_METHOD is only compatible with
|
||||
# TLSSNI01 certificate serving and probing is not affected by SSL
|
||||
# vulnerabilities: prober needs to check certificate for expected
|
||||
# contents anyway. Working SNI is the only thing that's necessary for
|
||||
# the challenge and thus scoping down SSL/TLS method (version) would
|
||||
# cause interoperability issues: TLSv1_METHOD is only compatible with
|
||||
# TLSv1_METHOD, while SSLv23_METHOD is compatible with all other
|
||||
# methods, including TLSv2_METHOD (read more at
|
||||
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
|
||||
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
|
||||
# in case it's used for things other than probing/serving!
|
||||
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
|
||||
_DEFAULT_TLSSNI01_SSL_METHOD = OpenSSL.SSL.SSLv23_METHOD # type: ignore
|
||||
|
||||
|
||||
class SSLSocket(object):
|
||||
class SSLSocket(object): # pylint: disable=too-few-public-methods
|
||||
"""SSL wrapper for sockets.
|
||||
|
||||
:ivar socket sock: Original wrapped socket.
|
||||
@@ -37,7 +37,7 @@ class SSLSocket(object):
|
||||
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
|
||||
|
||||
"""
|
||||
def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
|
||||
def __init__(self, sock, certs, method=_DEFAULT_TLSSNI01_SSL_METHOD):
|
||||
self.sock = sock
|
||||
self.certs = certs
|
||||
self.method = method
|
||||
@@ -64,9 +64,9 @@ class SSLSocket(object):
|
||||
logger.debug("Server name (%s) not recognized, dropping SSL",
|
||||
server_name)
|
||||
return
|
||||
new_context = SSL.Context(self.method)
|
||||
new_context.set_options(SSL.OP_NO_SSLv2)
|
||||
new_context.set_options(SSL.OP_NO_SSLv3)
|
||||
new_context = OpenSSL.SSL.Context(self.method)
|
||||
new_context.set_options(OpenSSL.SSL.OP_NO_SSLv2)
|
||||
new_context.set_options(OpenSSL.SSL.OP_NO_SSLv3)
|
||||
new_context.use_privatekey(key)
|
||||
new_context.use_certificate(cert)
|
||||
connection.set_context(new_context)
|
||||
@@ -74,7 +74,7 @@ class SSLSocket(object):
|
||||
class FakeConnection(object):
|
||||
"""Fake OpenSSL.SSL.Connection."""
|
||||
|
||||
# pylint: disable=missing-docstring
|
||||
# pylint: disable=too-few-public-methods,missing-docstring
|
||||
|
||||
def __init__(self, connection):
|
||||
self._wrapped = connection
|
||||
@@ -89,18 +89,18 @@ class SSLSocket(object):
|
||||
def accept(self): # pylint: disable=missing-docstring
|
||||
sock, addr = self.sock.accept()
|
||||
|
||||
context = SSL.Context(self.method)
|
||||
context.set_options(SSL.OP_NO_SSLv2)
|
||||
context.set_options(SSL.OP_NO_SSLv3)
|
||||
context = OpenSSL.SSL.Context(self.method)
|
||||
context.set_options(OpenSSL.SSL.OP_NO_SSLv2)
|
||||
context.set_options(OpenSSL.SSL.OP_NO_SSLv3)
|
||||
context.set_tlsext_servername_callback(self._pick_certificate_cb)
|
||||
|
||||
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
|
||||
ssl_sock = self.FakeConnection(OpenSSL.SSL.Connection(context, sock))
|
||||
ssl_sock.set_accept_state()
|
||||
|
||||
logger.debug("Performing handshake with %s", addr)
|
||||
try:
|
||||
ssl_sock.do_handshake()
|
||||
except SSL.Error as error:
|
||||
except OpenSSL.SSL.Error as error:
|
||||
# _pick_certificate_cb might have returned without
|
||||
# creating SSL context (wrong server name)
|
||||
raise socket.error(error)
|
||||
@@ -109,7 +109,7 @@ class SSLSocket(object):
|
||||
|
||||
|
||||
def probe_sni(name, host, port=443, timeout=300,
|
||||
method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
|
||||
method=_DEFAULT_TLSSNI01_SSL_METHOD, source_address=('', 0)):
|
||||
"""Probe SNI server for SSL certificate.
|
||||
|
||||
:param bytes name: Byte string to send as the server name in the
|
||||
@@ -128,32 +128,30 @@ def probe_sni(name, host, port=443, timeout=300,
|
||||
:rtype: OpenSSL.crypto.X509
|
||||
|
||||
"""
|
||||
context = SSL.Context(method)
|
||||
context = OpenSSL.SSL.Context(method)
|
||||
context.set_timeout(timeout)
|
||||
|
||||
socket_kwargs = {'source_address': source_address}
|
||||
|
||||
host_protocol_agnostic = None if host == '::' or host == '0' else host
|
||||
|
||||
try:
|
||||
logger.debug(
|
||||
"Attempting to connect to %s:%d%s.", host, port,
|
||||
" from {0}:{1}".format(
|
||||
source_address[0],
|
||||
source_address[1]
|
||||
) if socket_kwargs else ""
|
||||
)
|
||||
socket_tuple = (host, port) # type: Tuple[str, int]
|
||||
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore
|
||||
# pylint: disable=star-args
|
||||
logger.debug("Attempting to connect to %s:%d%s.", host_protocol_agnostic, port,
|
||||
" from {0}:{1}".format(source_address[0], source_address[1]) if \
|
||||
socket_kwargs else "")
|
||||
sock = socket.create_connection((host_protocol_agnostic, port), **socket_kwargs)
|
||||
except socket.error as error:
|
||||
raise errors.Error(error)
|
||||
|
||||
with contextlib.closing(sock) as client:
|
||||
client_ssl = SSL.Connection(context, client)
|
||||
client_ssl = OpenSSL.SSL.Connection(context, client)
|
||||
client_ssl.set_connect_state()
|
||||
client_ssl.set_tlsext_host_name(name) # pyOpenSSL>=0.13
|
||||
try:
|
||||
client_ssl.do_handshake()
|
||||
client_ssl.shutdown()
|
||||
except SSL.Error as error:
|
||||
except OpenSSL.SSL.Error as error:
|
||||
raise errors.Error(error)
|
||||
return client_ssl.get_peer_certificate()
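A hedged usage sketch for `probe_sni`: it needs network access to the target host plus an installed acme and pyOpenSSL, and returns the `OpenSSL.crypto.X509` the server presented for the given SNI name.

```
import OpenSSL
from acme.crypto_util import probe_sni

# Fetch the certificate served for the SNI name b'example.com' (network required).
cert = probe_sni(b'example.com', host='example.com', port=443, timeout=10)
print(cert.get_subject().CN)
print(OpenSSL.crypto.dump_certificate(
    OpenSSL.crypto.FILETYPE_PEM, cert).decode()[:64])
```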
|
||||
|
||||
@@ -166,18 +164,18 @@ def make_csr(private_key_pem, domains, must_staple=False):
|
||||
OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
|
||||
:returns: buffer PEM-encoded Certificate Signing Request.
|
||||
"""
|
||||
private_key = crypto.load_privatekey(
|
||||
crypto.FILETYPE_PEM, private_key_pem)
|
||||
csr = crypto.X509Req()
|
||||
private_key = OpenSSL.crypto.load_privatekey(
|
||||
OpenSSL.crypto.FILETYPE_PEM, private_key_pem)
|
||||
csr = OpenSSL.crypto.X509Req()
|
||||
extensions = [
|
||||
crypto.X509Extension(
|
||||
OpenSSL.crypto.X509Extension(
|
||||
b'subjectAltName',
|
||||
critical=False,
|
||||
value=', '.join('DNS:' + d for d in domains).encode('ascii')
|
||||
),
|
||||
]
|
||||
if must_staple:
|
||||
extensions.append(crypto.X509Extension(
|
||||
extensions.append(OpenSSL.crypto.X509Extension(
|
||||
b"1.3.6.1.5.5.7.1.24",
|
||||
critical=False,
|
||||
value=b"DER:30:03:02:01:05"))
|
||||
@@ -185,8 +183,8 @@ def make_csr(private_key_pem, domains, must_staple=False):
|
||||
csr.set_pubkey(private_key)
|
||||
csr.set_version(2)
|
||||
csr.sign(private_key, 'sha256')
|
||||
return crypto.dump_certificate_request(
|
||||
crypto.FILETYPE_PEM, csr)
|
||||
return OpenSSL.crypto.dump_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_PEM, csr)
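A short usage sketch for `make_csr`, generating a throwaway RSA key with pyOpenSSL and requesting a SAN CSR for two hypothetical names:

```
import OpenSSL
from acme.crypto_util import make_csr

key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
key_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, key)

# PEM-encoded CSR carrying a subjectAltName extension for both names.
csr_pem = make_csr(key_pem, ['example.com', 'www.example.com'])
print(csr_pem.decode().splitlines()[0])  # -----BEGIN CERTIFICATE REQUEST-----
```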
|
||||
|
||||
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
|
||||
common_name = loaded_cert_or_req.get_subject().CN
|
||||
@@ -194,7 +192,8 @@ def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
|
||||
|
||||
if common_name is None:
|
||||
return sans
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
else:
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
|
||||
def _pyopenssl_cert_or_req_san(cert_or_req):
|
||||
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
|
||||
@@ -222,12 +221,11 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
|
||||
parts_separator = ", "
|
||||
prefix = "DNS" + part_separator
|
||||
|
||||
if isinstance(cert_or_req, crypto.X509):
|
||||
# pylint: disable=line-too-long
|
||||
func = crypto.dump_certificate # type: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]]
|
||||
if isinstance(cert_or_req, OpenSSL.crypto.X509):
|
||||
func = OpenSSL.crypto.dump_certificate
|
||||
else:
|
||||
func = crypto.dump_certificate_request
|
||||
text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
|
||||
func = OpenSSL.crypto.dump_certificate_request
|
||||
text = func(OpenSSL.crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
|
||||
# WARNING: this function does not support multiple SANs extensions.
|
||||
# Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
|
||||
match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
|
||||
@@ -254,12 +252,12 @@ def gen_ss_cert(key, domains, not_before=None,
|
||||
|
||||
"""
|
||||
assert domains, "Must provide one or more hostnames for the cert."
|
||||
cert = crypto.X509()
|
||||
cert = OpenSSL.crypto.X509()
|
||||
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
|
||||
cert.set_version(2)
|
||||
|
||||
extensions = [
|
||||
crypto.X509Extension(
|
||||
OpenSSL.crypto.X509Extension(
|
||||
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
|
||||
]
|
||||
|
||||
@@ -268,7 +266,7 @@ def gen_ss_cert(key, domains, not_before=None,
|
||||
cert.set_issuer(cert.get_subject())
|
||||
|
||||
if force_san or len(domains) > 1:
|
||||
extensions.append(crypto.X509Extension(
|
||||
extensions.append(OpenSSL.crypto.X509Extension(
|
||||
b"subjectAltName",
|
||||
critical=False,
|
||||
value=b", ".join(b"DNS:" + d.encode() for d in domains)
|
||||
@@ -283,7 +281,7 @@ def gen_ss_cert(key, domains, not_before=None,
|
||||
cert.sign(key, "sha256")
|
||||
return cert
|
||||
|
||||
def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
|
||||
def dump_pyopenssl_chain(chain, filetype=OpenSSL.crypto.FILETYPE_PEM):
|
||||
"""Dump certificate chain into a bundle.
|
||||
|
||||
:param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
|
||||
@@ -300,7 +298,7 @@ def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
|
||||
if isinstance(cert, jose.ComparableX509):
|
||||
# pylint: disable=protected-access
|
||||
cert = cert.wrapped
|
||||
return crypto.dump_certificate(filetype, cert)
|
||||
return OpenSSL.crypto.dump_certificate(filetype, cert)
|
||||
|
||||
# assumes that OpenSSL.crypto.dump_certificate includes ending
|
||||
# newline character
|
||||
|
||||
@@ -13,7 +13,6 @@ import OpenSSL
|
||||
|
||||
from acme import errors
|
||||
from acme import test_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
@@ -30,6 +29,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
|
||||
class _TestServer(socketserver.TCPServer):
|
||||
|
||||
# pylint: disable=too-few-public-methods
|
||||
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
|
||||
|
||||
def server_bind(self): # pylint: disable=missing-docstring
|
||||
@@ -41,38 +41,28 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
self.server_thread = threading.Thread(
|
||||
# pylint: disable=no-member
|
||||
target=self.server.handle_request)
|
||||
self.server_thread.start()
|
||||
time.sleep(1) # TODO: avoid race conditions in other way
|
||||
|
||||
def tearDown(self):
|
||||
if self.server_thread.is_alive():
|
||||
# The thread may have already terminated.
|
||||
self.server_thread.join() # pragma: no cover
|
||||
self.server_thread.join()
|
||||
|
||||
def _probe(self, name):
|
||||
from acme.crypto_util import probe_sni
|
||||
return jose.ComparableX509(probe_sni(
|
||||
name, host='127.0.0.1', port=self.port))
|
||||
|
||||
def _start_server(self):
|
||||
self.server_thread.start()
|
||||
time.sleep(1) # TODO: avoid race conditions in other way
|
||||
|
||||
def test_probe_ok(self):
|
||||
self._start_server()
|
||||
self.assertEqual(self.cert, self._probe(b'foo'))
|
||||
|
||||
def test_probe_not_recognized_name(self):
|
||||
self._start_server()
|
||||
self.assertRaises(errors.Error, self._probe, b'bar')
|
||||
|
||||
def test_probe_connection_error(self):
|
||||
# pylint has a hard time with six
|
||||
self.server.server_close() # pylint: disable=no-member
|
||||
original_timeout = socket.getdefaulttimeout()
|
||||
try:
|
||||
socket.setdefaulttimeout(1)
|
||||
self.assertRaises(errors.Error, self._probe, b'bar')
|
||||
finally:
|
||||
socket.setdefaulttimeout(original_timeout)
|
||||
# TODO: py33/py34 tox hangs forever on do_handshake in second probe
|
||||
#def probe_connection_error(self):
|
||||
# self._probe(b'foo')
|
||||
# #time.sleep(1) # TODO: avoid race conditions in other way
|
||||
# self.assertRaises(errors.Error, self._probe, b'bar')
|
||||
|
||||
|
||||
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
|
||||
@@ -175,7 +165,7 @@ class RandomSnTest(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
self.cert_count = 5
|
||||
self.serial_num = [] # type: List[int]
|
||||
self.serial_num = []
|
||||
self.key = OpenSSL.crypto.PKey()
|
||||
self.key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
|
||||
|
||||
@@ -208,8 +198,8 @@ class MakeCSRTest(unittest.TestCase):
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
self.assertEqual(len(csr.get_extensions()), 1)
|
||||
self.assertEqual(csr.get_extensions()[0].get_data(),
|
||||
self.assertEquals(len(csr.get_extensions()), 1)
|
||||
self.assertEquals(csr.get_extensions()[0].get_data(),
|
||||
OpenSSL.crypto.X509Extension(
|
||||
b'subjectAltName',
|
||||
critical=False,
|
||||
@@ -226,7 +216,7 @@ class MakeCSRTest(unittest.TestCase):
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
self.assertEqual(len(csr.get_extensions()), 2)
|
||||
self.assertEquals(len(csr.get_extensions()), 2)
|
||||
# NOTE: Ideally we would filter by the TLS Feature OID, but
|
||||
# OpenSSL.crypto.X509Extension doesn't give us the extension's raw OID,
|
||||
# and the shortname field is just "UNDEF"
|
||||
|
||||
@@ -110,8 +110,6 @@ class ConflictError(ClientError):
|
||||
|
||||
In the version of ACME implemented by Boulder, this is used to find an
|
||||
account if you only have the private key, but don't know the account URL.
|
||||
|
||||
Also used in V2 of the ACME client for the same purpose.
|
||||
"""
|
||||
def __init__(self, location):
|
||||
self.location = location
|
||||
|
||||
@@ -1,53 +0,0 @@
|
||||
"""Tests for acme.jose shim."""
|
||||
import importlib
|
||||
import unittest
|
||||
|
||||
class JoseTest(unittest.TestCase):
|
||||
"""Tests for acme.jose shim."""
|
||||
|
||||
def _test_it(self, submodule, attribute):
|
||||
if submodule:
|
||||
acme_jose_path = 'acme.jose.' + submodule
|
||||
josepy_path = 'josepy.' + submodule
|
||||
else:
|
||||
acme_jose_path = 'acme.jose'
|
||||
josepy_path = 'josepy'
|
||||
acme_jose_mod = importlib.import_module(acme_jose_path)
|
||||
josepy_mod = importlib.import_module(josepy_path)
|
||||
|
||||
self.assertIs(acme_jose_mod, josepy_mod)
|
||||
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
|
||||
|
||||
# We use the imports below with eval, but pylint doesn't
|
||||
# understand that.
|
||||
# pylint: disable=eval-used,unused-variable
|
||||
import acme
|
||||
import josepy
|
||||
acme_jose_mod = eval(acme_jose_path)
|
||||
josepy_mod = eval(josepy_path)
|
||||
self.assertIs(acme_jose_mod, josepy_mod)
|
||||
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
|
||||
|
||||
def test_top_level(self):
|
||||
self._test_it('', 'RS512')
|
||||
|
||||
def test_submodules(self):
|
||||
# This test ensures that the modules in josepy that were
|
||||
# available at the time it was moved into its own package are
|
||||
# available under acme.jose. Backwards compatibility with new
|
||||
# modules or testing code is not maintained.
|
||||
mods_and_attrs = [('b64', 'b64decode',),
|
||||
('errors', 'Error',),
|
||||
('interfaces', 'JSONDeSerializable',),
|
||||
('json_util', 'Field',),
|
||||
('jwa', 'HS256',),
|
||||
('jwk', 'JWK',),
|
||||
('jws', 'JWS',),
|
||||
('util', 'ImmutableMap',),]
|
||||
|
||||
for mod, attr in mods_and_attrs:
|
||||
self._test_it(mod, attr)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -43,7 +43,7 @@ class JWS(jose.JWS):
|
||||
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
|
||||
|
||||
@classmethod
|
||||
# pylint: disable=arguments-differ
|
||||
# pylint: disable=arguments-differ,too-many-arguments
|
||||
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
|
||||
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
|
||||
# jwk field if kid is not provided.
|
||||
|
||||
@@ -1,16 +0,0 @@
|
||||
"""Shim class to not have to depend on typing module in prod."""
|
||||
import sys
|
||||
|
||||
class TypingClass(object):
|
||||
"""Ignore import errors by getting anything"""
|
||||
def __getattr__(self, name):
|
||||
return None
|
||||
|
||||
try:
|
||||
# mypy doesn't respect modifying sys.modules
|
||||
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
|
||||
# pylint: disable=unused-import
|
||||
from typing import Collection, IO # type: ignore
|
||||
# pylint: enable=unused-import
|
||||
except ImportError:
|
||||
sys.modules[__name__] = TypingClass()
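The shim above is consumed only for comment-style annotations, so importing code keeps working even when `typing` is unavailable and the names come back as `None`. A minimal sketch, assuming a tree where `acme.magic_typing` is still present (one side of this diff removes it):

```
from acme.magic_typing import Dict, List  # pylint: disable=unused-import, no-name-in-module

def count_names(names):
    # type: (List[str]) -> Dict[str, int]
    totals = {}  # type: Dict[str, int]
    for name in names:
        totals[name] = totals.get(name, 0) + 1
    return totals

print(count_names(['a', 'b', 'a']))  # {'a': 2, 'b': 1}
```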
|
||||
@@ -1,41 +0,0 @@
|
||||
"""Tests for acme.magic_typing."""
|
||||
import sys
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
|
||||
class MagicTypingTest(unittest.TestCase):
|
||||
"""Tests for acme.magic_typing."""
|
||||
def test_import_success(self):
|
||||
try:
|
||||
import typing as temp_typing
|
||||
except ImportError: # pragma: no cover
|
||||
temp_typing = None # pragma: no cover
|
||||
typing_class_mock = mock.MagicMock()
|
||||
text_mock = mock.MagicMock()
|
||||
typing_class_mock.Text = text_mock
|
||||
sys.modules['typing'] = typing_class_mock
|
||||
if 'acme.magic_typing' in sys.modules:
|
||||
del sys.modules['acme.magic_typing'] # pragma: no cover
|
||||
from acme.magic_typing import Text # pylint: disable=no-name-in-module
|
||||
self.assertEqual(Text, text_mock)
|
||||
del sys.modules['acme.magic_typing']
|
||||
sys.modules['typing'] = temp_typing
|
||||
|
||||
def test_import_failure(self):
|
||||
try:
|
||||
import typing as temp_typing
|
||||
except ImportError: # pragma: no cover
|
||||
temp_typing = None # pragma: no cover
|
||||
sys.modules['typing'] = None
|
||||
if 'acme.magic_typing' in sys.modules:
|
||||
del sys.modules['acme.magic_typing'] # pragma: no cover
|
||||
from acme.magic_typing import Text # pylint: disable=no-name-in-module
|
||||
self.assertTrue(Text is None)
|
||||
del sys.modules['acme.magic_typing']
|
||||
sys.modules['typing'] = temp_typing
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,10 +1,6 @@
|
||||
"""ACME protocol messages."""
|
||||
import json
|
||||
import collections
|
||||
import six
|
||||
try:
|
||||
from collections.abc import Hashable # pylint: disable=no-name-in-module
|
||||
except ImportError: # pragma: no cover
|
||||
from collections import Hashable
|
||||
|
||||
import josepy as jose
|
||||
|
||||
@@ -12,42 +8,25 @@ from acme import challenges
|
||||
from acme import errors
|
||||
from acme import fields
|
||||
from acme import util
|
||||
from acme import jws
|
||||
|
||||
OLD_ERROR_PREFIX = "urn:acme:error:"
|
||||
ERROR_PREFIX = "urn:ietf:params:acme:error:"
|
||||
|
||||
ERROR_CODES = {
|
||||
'accountDoesNotExist': 'The request specified an account that does not exist',
|
||||
'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
|
||||
' already been revoked',
|
||||
'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
|
||||
'badNonce': 'The client sent an unacceptable anti-replay nonce',
|
||||
'badPublicKey': 'The JWS was signed by a public key the server does not support',
|
||||
'badRevocationReason': 'The revocation reason provided is not allowed by the server',
|
||||
'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
|
||||
'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
|
||||
' a certificate',
|
||||
'compound': 'Specific error conditions are indicated in the "subproblems" array',
|
||||
'connection': ('The server could not connect to the client to verify the'
|
||||
' domain'),
|
||||
'dns': 'There was a problem with a DNS query during identifier validation',
|
||||
'dnssec': 'The server could not validate a DNSSEC signed domain',
|
||||
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
|
||||
# deprecate invalidEmail
|
||||
'invalidEmail': 'The provided email for a registration was invalid',
|
||||
'invalidContact': 'The provided contact URI was invalid',
|
||||
'malformed': 'The request message was malformed',
|
||||
'rejectedIdentifier': 'The server will not issue certificates for the identifier',
|
||||
'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
|
||||
'rateLimited': 'There were too many requests of a given type',
|
||||
'serverInternal': 'The server experienced an internal error',
|
||||
'tls': 'The server experienced a TLS error during domain verification',
|
||||
'unauthorized': 'The client lacks sufficient authorization',
|
||||
'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
|
||||
'unknownHost': 'The server could not resolve a domain name',
|
||||
'unsupportedIdentifier': 'An identifier is of an unsupported type',
|
||||
'externalAccountRequired': 'The server requires external account binding',
|
||||
}
|
||||
|
||||
ERROR_TYPE_DESCRIPTIONS = dict(
|
||||
@@ -61,7 +40,8 @@ def is_acme_error(err):
|
||||
"""Check if argument is an ACME error."""
|
||||
if isinstance(err, Error) and (err.typ is not None):
|
||||
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
|
||||
return False
|
||||
else:
|
||||
return False
|
||||
|
||||
|
||||
@six.python_2_unicode_compatible
|
||||
@@ -116,7 +96,6 @@ class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
code = str(self.typ).split(':')[-1]
|
||||
if code in ERROR_CODES:
|
||||
return code
|
||||
return None
|
||||
|
||||
def __str__(self):
|
||||
return b' :: '.join(
|
||||
@@ -125,25 +104,24 @@ class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
if part is not None).decode()
|
||||
|
||||
|
||||
class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
|
||||
class _Constant(jose.JSONDeSerializable, collections.Hashable): # type: ignore
|
||||
"""ACME constant."""
|
||||
__slots__ = ('name',)
|
||||
POSSIBLE_NAMES = NotImplemented
|
||||
|
||||
def __init__(self, name):
|
||||
super(_Constant, self).__init__()
|
||||
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
|
||||
self.POSSIBLE_NAMES[name] = self
|
||||
self.name = name
|
||||
|
||||
def to_partial_json(self):
|
||||
return self.name
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
|
||||
def from_json(cls, value):
|
||||
if value not in cls.POSSIBLE_NAMES:
|
||||
raise jose.DeserializationError(
|
||||
'{0} not recognized'.format(cls.__name__))
|
||||
return cls.POSSIBLE_NAMES[jobj] # pylint: disable=unsubscriptable-object
|
||||
return cls.POSSIBLE_NAMES[value]
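The pattern above registers every constant instance in `POSSIBLE_NAMES` so `from_json` can round-trip a bare name back to the singleton. A standalone sketch of the same registry idea (not the acme class itself):

```
class Constant(object):
    """Standalone sketch of the constant-registry pattern."""
    POSSIBLE_NAMES = {}  # shared registry, keyed by the serialized name

    def __init__(self, name):
        self.POSSIBLE_NAMES[name] = self
        self.name = name

    def to_partial_json(self):
        return self.name

    @classmethod
    def from_json(cls, value):
        if value not in cls.POSSIBLE_NAMES:
            raise ValueError('{0} not recognized'.format(value))
        return cls.POSSIBLE_NAMES[value]

STATUS_PENDING = Constant('pending')
STATUS_VALID = Constant('valid')

print(Constant.from_json('valid') is STATUS_VALID)  # True
print(STATUS_PENDING.to_partial_json())             # 'pending'
```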
|
||||
|
||||
def __repr__(self):
|
||||
return '{0}({1})'.format(self.__class__.__name__, self.name)
|
||||
@@ -167,8 +145,6 @@ STATUS_PROCESSING = Status('processing')
|
||||
STATUS_VALID = Status('valid')
|
||||
STATUS_INVALID = Status('invalid')
|
||||
STATUS_REVOKED = Status('revoked')
|
||||
STATUS_READY = Status('ready')
|
||||
STATUS_DEACTIVATED = Status('deactivated')
|
||||
|
||||
|
||||
class IdentifierType(_Constant):
|
||||
@@ -199,10 +175,10 @@ class Directory(jose.JSONDeSerializable):
|
||||
_terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
|
||||
website = jose.Field('website', omitempty=True)
|
||||
caa_identities = jose.Field('caaIdentities', omitempty=True)
|
||||
external_account_required = jose.Field('externalAccountRequired', omitempty=True)
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
|
||||
# pylint: disable=star-args
|
||||
super(Directory.Meta, self).__init__(**kwargs)
|
||||
|
||||
@property
|
||||
@@ -281,24 +257,6 @@ class ResourceBody(jose.JSONObjectWithFields):
|
||||
"""ACME Resource Body."""
|
||||
|
||||
|
||||
class ExternalAccountBinding(object):
|
||||
"""ACME External Account Binding"""
|
||||
|
||||
@classmethod
|
||||
def from_data(cls, account_public_key, kid, hmac_key, directory):
|
||||
"""Create External Account Binding Resource from contact details, kid and hmac."""
|
||||
|
||||
key_json = json.dumps(account_public_key.to_partial_json()).encode()
|
||||
decoded_hmac_key = jose.b64.b64decode(hmac_key)
|
||||
url = directory["newAccount"]
|
||||
|
||||
eab = jws.JWS.sign(key_json, jose.jwk.JWKOct(key=decoded_hmac_key),
|
||||
jose.jwa.HS256, None,
|
||||
url, kid)
|
||||
|
||||
return eab.to_partial_json()
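A hedged usage sketch for `ExternalAccountBinding.from_data`, assuming a tree that still ships the class (one side of this diff removes it). The kid and HMAC key below are placeholders that a real CA would hand out, and the directory only needs a newAccount URL for the binding's protected header:

```
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose

from acme import messages

# Throwaway account key; a real client would reuse its registered key.
acct_key = jose.JWKRSA(key=rsa.generate_private_key(
    public_exponent=65537, key_size=2048, backend=default_backend()))

directory = messages.Directory(
    {'newAccount': 'https://example.com/acme/new-account'})

eab = messages.ExternalAccountBinding.from_data(
    account_public_key=acct_key.public_key(),
    kid='example-kid',                  # placeholder issued by the CA
    hmac_key='dGVzdC1obWFjLWtleQ',      # placeholder base64url-encoded secret
    directory=directory)
print(sorted(eab.keys()))  # e.g. ['payload', 'protected', 'signature']
```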
|
||||
|
||||
|
||||
class Registration(ResourceBody):
"""Registration Resource Body.

@@ -315,30 +273,24 @@ class Registration(ResourceBody):
agreement = jose.Field('agreement', omitempty=True)
status = jose.Field('status', omitempty=True)
terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
only_return_existing = jose.Field('onlyReturnExisting', omitempty=True)
external_account_binding = jose.Field('externalAccountBinding', omitempty=True)

phone_prefix = 'tel:'
email_prefix = 'mailto:'

@classmethod
def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
def from_data(cls, phone=None, email=None, **kwargs):
"""Create registration resource from contact details."""
details = list(kwargs.pop('contact', ()))
if phone is not None:
details.append(cls.phone_prefix + phone)
if email is not None:
details.extend([cls.email_prefix + mail for mail in email.split(',')])
details.append(cls.email_prefix + email)
kwargs['contact'] = tuple(details)

if external_account_binding:
kwargs['external_account_binding'] = external_account_binding

return cls(**kwargs)

def _filter_contact(self, prefix):
return tuple(
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
detail[len(prefix):] for detail in self.contact
if detail.startswith(prefix))

@property
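Putting the two pieces together, here is a minimal sketch of registering with an external account binding. It mirrors `ExternalAccountBindingTest` and `RegistrationTest.test_new_registration_from_data_with_eab` further down; the key identifier, HMAC key and URL are placeholders a CA would normally hand out:

```
import josepy as jose
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa

from acme import messages

private_key = rsa.generate_private_key(
    public_exponent=65537, key_size=2048, backend=default_backend())
account_public_key = jose.JWKRSA(key=private_key.public_key())

directory = messages.Directory({
    'newAccount': 'https://ca.example.com/acme/new-account',  # placeholder URL
})

# Sign the account public key with the CA-provided HMAC key and key identifier.
eab = messages.ExternalAccountBinding.from_data(
    account_public_key=account_public_key,
    kid='kid-from-the-ca',            # placeholder
    hmac_key='hmac-key-from-the-ca',  # placeholder, base64url-encoded
    directory=directory)

# Attach the binding to the new-account registration payload.
regr = messages.NewRegistration.from_data(
    email='admin@example.com', external_account_binding=eab)
assert sorted(eab.keys()) == ['payload', 'protected', 'signature']
```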
@@ -410,6 +362,7 @@ class ChallengeBody(ResourceBody):
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
|
||||
# pylint: disable=star-args
|
||||
super(ChallengeBody, self).__init__(**kwargs)
|
||||
|
||||
def encode(self, name):
|
||||
@@ -472,7 +425,7 @@ class Authorization(ResourceBody):
|
||||
:ivar datetime.datetime expires:
|
||||
|
||||
"""
|
||||
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
|
||||
identifier = jose.Field('identifier', decoder=Identifier.from_json)
|
||||
challenges = jose.Field('challenges', omitempty=True)
|
||||
combinations = jose.Field('combinations', omitempty=True)
|
||||
|
||||
@@ -482,7 +435,6 @@ class Authorization(ResourceBody):
|
||||
# be absent'... then acme-spec gives example with 'expires'
|
||||
# present... That's confusing!
|
||||
expires = fields.RFC3339Field('expires', omitempty=True)
|
||||
wildcard = jose.Field('wildcard', omitempty=True)
|
||||
|
||||
@challenges.decoder
|
||||
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
|
||||
@@ -492,7 +444,7 @@ class Authorization(ResourceBody):
|
||||
def resolved_combinations(self):
|
||||
"""Combinations with challenges instead of indices."""
|
||||
return tuple(tuple(self.challenges[idx] for idx in combo)
|
||||
for combo in self.combinations) # pylint: disable=not-an-iterable
|
||||
for combo in self.combinations)
|
||||
|
||||
|
||||
@Directory.register
|
||||
@@ -502,12 +454,6 @@ class NewAuthorization(Authorization):
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class UpdateAuthorization(Authorization):
|
||||
"""Update authorization."""
|
||||
resource_type = 'authz'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class AuthorizationResource(ResourceWithURI):
|
||||
"""Authorization Resource.
|
||||
|
||||
@@ -574,7 +520,7 @@ class Order(ResourceBody):
"""
identifiers = jose.Field('identifiers', omitempty=True)
status = jose.Field('status', decoder=Status.from_json,
omitempty=True)
omitempty=True, default=STATUS_PENDING)
authorizations = jose.Field('authorizations', omitempty=True)
certificate = jose.Field('certificate', omitempty=True)
finalize = jose.Field('finalize', omitempty=True)
@@ -604,3 +550,4 @@ class OrderResource(ResourceWithURI):
class NewOrder(Order):
"""New order."""
resource_type = 'new-order'
resource = fields.Resource(resource_type)
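As `NewOrderTest` further down exercises, a new-order payload only carries the requested identifiers (one side of the hunk also adds a `STATUS_PENDING` default for the status field). A rough sketch, with a placeholder domain:

```
from acme import messages

order = messages.NewOrder(identifiers=(
    messages.Identifier(typ=messages.IDENTIFIER_FQDN, value='example.com'),
))
# to_json() serializes the identifiers recursively, roughly:
# {'identifiers': [{'type': 'dns', 'value': 'example.com'}]}
print(order.to_json())
```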
@@ -6,7 +6,6 @@ import mock
|
||||
|
||||
from acme import challenges
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
CERT = test_util.load_comparable_cert('cert.der')
|
||||
@@ -86,7 +85,7 @@ class ConstantTest(unittest.TestCase):
|
||||
from acme.messages import _Constant
|
||||
|
||||
class MockConstant(_Constant): # pylint: disable=missing-docstring
|
||||
POSSIBLE_NAMES = {} # type: Dict
|
||||
POSSIBLE_NAMES = {}
|
||||
|
||||
self.MockConstant = MockConstant # pylint: disable=invalid-name
|
||||
self.const_a = MockConstant('a')
|
||||
@@ -174,24 +173,6 @@ class DirectoryTest(unittest.TestCase):
|
||||
self.assertTrue(result)
|
||||
|
||||
|
||||
class ExternalAccountBindingTest(unittest.TestCase):
|
||||
def setUp(self):
|
||||
from acme.messages import Directory
|
||||
self.key = jose.jwk.JWKRSA(key=KEY.public_key())
|
||||
self.kid = "kid-for-testing"
|
||||
self.hmac_key = "hmac-key-for-testing"
|
||||
self.dir = Directory({
|
||||
'newAccount': 'http://url/acme/new-account',
|
||||
})
|
||||
|
||||
def test_from_data(self):
|
||||
from acme.messages import ExternalAccountBinding
|
||||
eab = ExternalAccountBinding.from_data(self.key, self.kid, self.hmac_key, self.dir)
|
||||
|
||||
self.assertEqual(len(eab), 3)
|
||||
self.assertEqual(sorted(eab.keys()), sorted(['protected', 'payload', 'signature']))
|
||||
|
||||
|
||||
class RegistrationTest(unittest.TestCase):
|
||||
"""Tests for acme.messages.Registration."""
|
||||
|
||||
@@ -223,22 +204,6 @@ class RegistrationTest(unittest.TestCase):
|
||||
'mailto:admin@foo.com',
|
||||
))
|
||||
|
||||
def test_new_registration_from_data_with_eab(self):
|
||||
from acme.messages import NewRegistration, ExternalAccountBinding, Directory
|
||||
key = jose.jwk.JWKRSA(key=KEY.public_key())
|
||||
kid = "kid-for-testing"
|
||||
hmac_key = "hmac-key-for-testing"
|
||||
directory = Directory({
|
||||
'newAccount': 'http://url/acme/new-account',
|
||||
})
|
||||
eab = ExternalAccountBinding.from_data(key, kid, hmac_key, directory)
|
||||
reg = NewRegistration.from_data(email='admin@foo.com', external_account_binding=eab)
|
||||
self.assertEqual(reg.contact, (
|
||||
'mailto:admin@foo.com',
|
||||
))
|
||||
self.assertEqual(sorted(reg.external_account_binding.keys()),
|
||||
sorted(['protected', 'payload', 'signature']))
|
||||
|
||||
def test_phones(self):
|
||||
self.assertEqual(('1234',), self.reg.phones)
|
||||
|
||||
@@ -458,19 +423,6 @@ class OrderResourceTest(unittest.TestCase):
|
||||
'authorizations': None,
|
||||
})
|
||||
|
||||
class NewOrderTest(unittest.TestCase):
|
||||
"""Tests for acme.messages.NewOrder."""
|
||||
|
||||
def setUp(self):
|
||||
from acme.messages import NewOrder
|
||||
self.reg = NewOrder(
|
||||
identifiers=mock.sentinel.identifiers)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual(self.reg.to_json(), {
|
||||
'identifiers': mock.sentinel.identifiers,
|
||||
})
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
|
||||
@@ -1,23 +1,27 @@
|
||||
"""Support for standalone client challenge solvers. """
|
||||
import argparse
|
||||
import collections
|
||||
import functools
|
||||
import logging
|
||||
import os
|
||||
import socket
|
||||
import sys
|
||||
import threading
|
||||
|
||||
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
import OpenSSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
|
||||
# pylint: disable=no-init
|
||||
# pylint: disable=too-few-public-methods,no-init
|
||||
|
||||
|
||||
class TLSServer(socketserver.TCPServer):
|
||||
@@ -32,7 +36,7 @@ class TLSServer(socketserver.TCPServer):
|
||||
self.certs = kwargs.pop("certs", {})
|
||||
self.method = kwargs.pop(
|
||||
# pylint: disable=protected-access
|
||||
"method", crypto_util._DEFAULT_SSL_METHOD)
|
||||
"method", crypto_util._DEFAULT_TLSSNI01_SSL_METHOD)
|
||||
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
|
||||
socketserver.TCPServer.__init__(self, *args, **kwargs)
|
||||
|
||||
@@ -62,8 +66,8 @@ class BaseDualNetworkedServers(object):

def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
port = server_address[1]
self.threads = [] # type: List[threading.Thread]
self.servers = [] # type: List[ACMEServerMixin]
self.threads = []
self.servers = []

# Must try True first.
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
@@ -77,29 +81,16 @@ class BaseDualNetworkedServers(object):
kwargs["ipv6"] = ip_version
new_address = (server_address[0],) + (port,) + server_address[2:]
new_args = (new_address,) + remaining_args
server = ServerClass(*new_args, **kwargs)
logger.debug(
"Successfully bound to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
server = ServerClass(*new_args, **kwargs) # pylint: disable=star-args
except socket.error:
if self.servers:
# Already bound using IPv6.
logger.debug(
"Certbot wasn't able to bind to %s:%s using %s, this "
"is often expected due to the dual stack nature of "
"IPv6 socket implementations.",
new_address[0], new_address[1],
"IPv6" if ip_version else "IPv4")
else:
logger.debug(
"Failed to bind to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
logger.debug("Failed to bind to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
else:
self.servers.append(server)
# If two servers are set up and port 0 was passed in, ensure we always
# bind to the same port for both servers.
port = server.socket.getsockname()[1]
if not self.servers:
if len(self.servers) == 0:
raise socket.error("Could not bind to IPv4 or IPv6.")

def serve_forever(self):
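A short usage sketch of the dual-stack wrapper (following `HTTP01DualNetworkedServersTest` further down): port 0 lets the OS pick a free port on the first (IPv6) bind, and the loop above then reuses that exact port for the IPv4 socket.

```
from acme.standalone import HTTP01DualNetworkedServers

servers = HTTP01DualNetworkedServers(('', 0), resources=set())

# Every bound socket reports the same port, per the port-reuse logic above.
assert len(set(sockname[1] for sockname in servers.getsocknames())) == 1

servers.serve_forever()               # one serving thread per bound socket
servers.shutdown_and_server_close()   # stop the threads and close the sockets
```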
@@ -126,6 +117,35 @@ class BaseDualNetworkedServers(object):
|
||||
self.threads = []
|
||||
|
||||
|
||||
class TLSSNI01Server(TLSServer, ACMEServerMixin):
|
||||
"""TLSSNI01 Server."""
|
||||
|
||||
def __init__(self, server_address, certs, ipv6=False):
|
||||
TLSServer.__init__(
|
||||
self, server_address, BaseRequestHandlerWithLogging, certs=certs, ipv6=ipv6)
|
||||
|
||||
|
||||
class TLSSNI01DualNetworkedServers(BaseDualNetworkedServers):
|
||||
"""TLSSNI01Server Wrapper. Tries everything for both. Failures for one don't
|
||||
affect the other."""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
BaseDualNetworkedServers.__init__(self, TLSSNI01Server, *args, **kwargs)
|
||||
|
||||
|
||||
class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
|
||||
"""BaseRequestHandler with logging."""
|
||||
|
||||
def log_message(self, format, *args): # pylint: disable=redefined-builtin
|
||||
"""Log arbitrary message."""
|
||||
logger.debug("%s - - %s", self.client_address[0], format % args)
|
||||
|
||||
def handle(self):
|
||||
"""Handle request."""
|
||||
self.log_message("Incoming request")
|
||||
socketserver.BaseRequestHandler.handle(self)
|
||||
|
||||
|
||||
class HTTPServer(BaseHTTPServer.HTTPServer):
|
||||
"""Generic HTTP Server."""
|
||||
|
||||
@@ -169,7 +189,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
|
||||
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
|
||||
socketserver.BaseRequestHandler.__init__(self, *args, **kwargs)
|
||||
|
||||
def log_message(self, format, *args): # pylint: disable=redefined-builtin
|
||||
"""Log arbitrary message."""
|
||||
@@ -228,3 +248,39 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
"""
|
||||
return functools.partial(
|
||||
cls, simple_http_resources=simple_http_resources)
|
||||
|
||||
|
||||
def simple_tls_sni_01_server(cli_args, forever=True):
|
||||
"""Run simple standalone TLSSNI01 server."""
|
||||
logging.basicConfig(level=logging.DEBUG)
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument(
|
||||
"-p", "--port", default=0, help="Port to serve at. By default "
|
||||
"picks random free port.")
|
||||
args = parser.parse_args(cli_args[1:])
|
||||
|
||||
certs = {}
|
||||
|
||||
_, hosts, _ = next(os.walk('.'))
|
||||
for host in hosts:
|
||||
with open(os.path.join(host, "cert.pem")) as cert_file:
|
||||
cert_contents = cert_file.read()
|
||||
with open(os.path.join(host, "key.pem")) as key_file:
|
||||
key_contents = key_file.read()
|
||||
certs[host.encode()] = (
|
||||
OpenSSL.crypto.load_privatekey(
|
||||
OpenSSL.crypto.FILETYPE_PEM, key_contents),
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, cert_contents))
|
||||
|
||||
server = TLSSNI01Server(('', int(args.port)), certs=certs)
|
||||
logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
|
||||
if forever: # pragma: no cover
|
||||
server.serve_forever()
|
||||
else:
|
||||
server.handle_request()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(simple_tls_sni_01_server(sys.argv)) # pragma: no cover
|
||||
|
||||
@@ -1,6 +1,10 @@
|
||||
"""Tests for acme.standalone."""
|
||||
import os
|
||||
import shutil
|
||||
import socket
|
||||
import threading
|
||||
import tempfile
|
||||
import time
|
||||
import unittest
|
||||
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
@@ -11,8 +15,9 @@ import mock
|
||||
import requests
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
class TLSServerTest(unittest.TestCase):
|
||||
@@ -23,14 +28,41 @@ class TLSServerTest(unittest.TestCase):
|
||||
from acme.standalone import TLSServer
|
||||
server = TLSServer(
|
||||
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True)
|
||||
server.server_close()
|
||||
server.server_close() # pylint: disable=no-member
|
||||
|
||||
def test_ipv6(self):
|
||||
if socket.has_ipv6:
|
||||
from acme.standalone import TLSServer
|
||||
server = TLSServer(
|
||||
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True, ipv6=True)
|
||||
server.server_close()
|
||||
server.server_close() # pylint: disable=no-member
|
||||
|
||||
|
||||
class TLSSNI01ServerTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSSNI01Server."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSSNI01Server
|
||||
self.server = TLSSNI01Server(("", 0), certs=self.certs)
|
||||
# pylint: disable=no-member
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
def test_it(self):
|
||||
host, port = self.server.socket.getsockname()[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
|
||||
class HTTP01ServerTest(unittest.TestCase):
|
||||
@@ -40,17 +72,18 @@ class HTTP01ServerTest(unittest.TestCase):
|
||||
def setUp(self):
|
||||
self.account_key = jose.JWK.load(
|
||||
test_util.load_vector('rsa1024_key.pem'))
|
||||
self.resources = set() # type: Set
|
||||
self.resources = set()
|
||||
|
||||
from acme.standalone import HTTP01Server
|
||||
self.server = HTTP01Server(('', 0), resources=self.resources)
|
||||
|
||||
# pylint: disable=no-member
|
||||
self.port = self.server.socket.getsockname()[1]
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown()
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
def test_index(self):
|
||||
@@ -100,10 +133,8 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
self.address_family = socket.AF_INET
|
||||
socketserver.TCPServer.__init__(self, *args, **kwargs)
|
||||
if ipv6:
|
||||
# NB: On Windows, socket.IPPROTO_IPV6 constant may be missing.
|
||||
# We use the corresponding value (41) instead.
|
||||
level = getattr(socket, "IPPROTO_IPV6", 41)
|
||||
self.socket.setsockopt(level, socket.IPV6_V6ONLY, 1)
|
||||
# pylint: disable=no-member
|
||||
self.socket.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
|
||||
try:
|
||||
self.server_bind()
|
||||
self.server_activate()
|
||||
@@ -116,15 +147,15 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
mock_bind.side_effect = socket.error
|
||||
from acme.standalone import BaseDualNetworkedServers
|
||||
self.assertRaises(socket.error, BaseDualNetworkedServers,
|
||||
BaseDualNetworkedServersTest.SingleProtocolServer,
|
||||
('', 0),
|
||||
socketserver.BaseRequestHandler)
|
||||
BaseDualNetworkedServersTest.SingleProtocolServer,
|
||||
("", 0),
|
||||
socketserver.BaseRequestHandler)
|
||||
|
||||
def test_ports_equal(self):
|
||||
from acme.standalone import BaseDualNetworkedServers
|
||||
servers = BaseDualNetworkedServers(
|
||||
BaseDualNetworkedServersTest.SingleProtocolServer,
|
||||
('', 0),
|
||||
("", 0),
|
||||
socketserver.BaseRequestHandler)
|
||||
socknames = servers.getsocknames()
|
||||
prev_port = None
|
||||
@@ -136,6 +167,33 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
prev_port = port
|
||||
|
||||
|
||||
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSSNI01DualNetworkedServers
|
||||
self.servers = TLSSNI01DualNetworkedServers(("", 0), certs=self.certs)
|
||||
self.servers.serve_forever()
|
||||
|
||||
def tearDown(self):
|
||||
self.servers.shutdown_and_server_close()
|
||||
|
||||
def test_connect(self):
|
||||
socknames = self.servers.getsocknames()
|
||||
# connect to all addresses
|
||||
for sockname in socknames:
|
||||
host, port = sockname[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
|
||||
class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
|
||||
|
||||
@@ -143,11 +201,12 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
def setUp(self):
|
||||
self.account_key = jose.JWK.load(
|
||||
test_util.load_vector('rsa1024_key.pem'))
|
||||
self.resources = set() # type: Set
|
||||
self.resources = set()
|
||||
|
||||
from acme.standalone import HTTP01DualNetworkedServers
|
||||
self.servers = HTTP01DualNetworkedServers(('', 0), resources=self.resources)
|
||||
|
||||
# pylint: disable=no-member
|
||||
self.port = self.servers.getsocknames()[0][1]
|
||||
self.servers.serve_forever()
|
||||
|
||||
@@ -186,5 +245,56 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
self.assertFalse(self._test_http01(add=False))
|
||||
|
||||
|
||||
class TestSimpleTLSSNI01Server(unittest.TestCase):
|
||||
"""Tests for acme.standalone.simple_tls_sni_01_server."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
# mirror ../examples/standalone
|
||||
self.test_cwd = tempfile.mkdtemp()
|
||||
localhost_dir = os.path.join(self.test_cwd, 'localhost')
|
||||
os.makedirs(localhost_dir)
|
||||
shutil.copy(test_util.vector_path('rsa2048_cert.pem'),
|
||||
os.path.join(localhost_dir, 'cert.pem'))
|
||||
shutil.copy(test_util.vector_path('rsa2048_key.pem'),
|
||||
os.path.join(localhost_dir, 'key.pem'))
|
||||
|
||||
from acme.standalone import simple_tls_sni_01_server
|
||||
self.port = 1234
|
||||
self.thread = threading.Thread(
|
||||
target=simple_tls_sni_01_server, kwargs={
|
||||
'cli_args': ('xxx', '--port', str(self.port)),
|
||||
'forever': False,
|
||||
},
|
||||
)
|
||||
self.old_cwd = os.getcwd()
|
||||
os.chdir(self.test_cwd)
|
||||
|
||||
def tearDown(self):
|
||||
os.chdir(self.old_cwd)
|
||||
self.thread.join()
|
||||
shutil.rmtree(self.test_cwd)
|
||||
|
||||
def test_it(self):
|
||||
max_attempts = 5
|
||||
for attempt in range(max_attempts):
|
||||
try:
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', b'0.0.0.0', self.port)
|
||||
except errors.Error:
|
||||
self.assertTrue(attempt + 1 < max_attempts, "Timeout!")
|
||||
time.sleep(1) # wait until thread starts
|
||||
else:
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
test_util.load_comparable_cert(
|
||||
'rsa2048_cert.pem'))
|
||||
break
|
||||
|
||||
if attempt == 0:
|
||||
# the first attempt is always meant to fail, so we can test
|
||||
# the socket failure code-path for probe_sni, as well
|
||||
self.thread.start()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
|
||||
@@ -5,11 +5,18 @@
|
||||
"""
|
||||
import os
|
||||
import pkg_resources
|
||||
import unittest
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
import OpenSSL
|
||||
|
||||
|
||||
def vector_path(*names):
|
||||
"""Path to a test vector."""
|
||||
return pkg_resources.resource_filename(
|
||||
__name__, os.path.join('testdata', *names))
|
||||
|
||||
|
||||
def load_vector(*names):
|
||||
@@ -32,8 +39,8 @@ def _guess_loader(filename, loader_pem, loader_der):
|
||||
def load_cert(*names):
|
||||
"""Load certificate."""
|
||||
loader = _guess_loader(
|
||||
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
|
||||
return crypto.load_certificate(loader, load_vector(*names))
|
||||
names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
|
||||
return OpenSSL.crypto.load_certificate(loader, load_vector(*names))
|
||||
|
||||
|
||||
def load_comparable_cert(*names):
|
||||
@@ -44,8 +51,8 @@ def load_comparable_cert(*names):
|
||||
def load_csr(*names):
|
||||
"""Load certificate request."""
|
||||
loader = _guess_loader(
|
||||
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
|
||||
return crypto.load_certificate_request(loader, load_vector(*names))
|
||||
names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
|
||||
return OpenSSL.crypto.load_certificate_request(loader, load_vector(*names))
|
||||
|
||||
|
||||
def load_comparable_csr(*names):
|
||||
@@ -64,5 +71,26 @@ def load_rsa_private_key(*names):
def load_pyopenssl_private_key(*names):
"""Load pyOpenSSL private key."""
loader = _guess_loader(
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
return crypto.load_privatekey(loader, load_vector(*names))
names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
return OpenSSL.crypto.load_privatekey(loader, load_vector(*names))


def skip_unless(condition, reason): # pragma: no cover
"""Skip tests unless a condition holds.

This implements the basic functionality of unittest.skipUnless
which is only available on Python 2.7+.

:param bool condition: If ``False``, the test will be skipped
:param str reason: the reason for skipping the test

:rtype: callable
:returns: decorator that hides tests unless condition is ``True``

"""
if hasattr(unittest, "skipUnless"):
return unittest.skipUnless(condition, reason)
elif condition:
return lambda cls: cls
else:
return lambda cls: None
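A quick usage sketch of the helper above as a class decorator (the test case itself is hypothetical, not from the diff):

```
import socket
import unittest

from acme import test_util


@test_util.skip_unless(socket.has_ipv6, "IPv6 support required")
class HypotheticalIPv6Test(unittest.TestCase):
    def test_has_ipv6(self):
        self.assertTrue(socket.has_ipv6)
```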
@@ -16,6 +16,13 @@ Contents:
|
||||
.. automodule:: acme
|
||||
:members:
|
||||
|
||||
|
||||
Example client:
|
||||
|
||||
.. include:: ../examples/example_client.py
|
||||
:code: python
|
||||
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
|
||||
acme/examples/example_client.py (new file, 47 lines)
@@ -0,0 +1,47 @@
|
||||
"""Example script showing how to use acme client API."""
|
||||
import logging
|
||||
import os
|
||||
import pkg_resources
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
|
||||
from acme import client
|
||||
from acme import messages
|
||||
|
||||
|
||||
logging.basicConfig(level=logging.DEBUG)
|
||||
|
||||
|
||||
DIRECTORY_URL = 'https://acme-staging.api.letsencrypt.org/directory'
|
||||
BITS = 2048 # minimum for Boulder
|
||||
DOMAIN = 'example1.com' # example.com is ignored by Boulder
|
||||
|
||||
# generate_private_key requires cryptography>=0.5
|
||||
key = jose.JWKRSA(key=rsa.generate_private_key(
|
||||
public_exponent=65537,
|
||||
key_size=BITS,
|
||||
backend=default_backend()))
|
||||
acme = client.Client(DIRECTORY_URL, key)
|
||||
|
||||
regr = acme.register()
|
||||
logging.info('Auto-accepting TOS: %s', regr.terms_of_service)
|
||||
acme.agree_to_tos(regr)
|
||||
logging.debug(regr)
|
||||
|
||||
authzr = acme.request_challenges(
|
||||
identifier=messages.Identifier(typ=messages.IDENTIFIER_FQDN, value=DOMAIN))
|
||||
logging.debug(authzr)
|
||||
|
||||
authzr, authzr_response = acme.poll(authzr)
|
||||
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_ASN1, pkg_resources.resource_string(
|
||||
'acme', os.path.join('testdata', 'csr.der')))
|
||||
try:
|
||||
acme.request_issuance(jose.util.ComparableX509(csr), (authzr,))
|
||||
except messages.Error as error:
|
||||
print ("This script is doomed to fail as no authorization "
|
||||
"challenges are ever solved. Error from server: {0}".format(error))
|
||||
@@ -1,240 +0,0 @@
|
||||
"""Example ACME-V2 API for HTTP-01 challenge.
|
||||
|
||||
Brief:
|
||||
|
||||
This a complete usage example of the python-acme API.
|
||||
|
||||
Limitations of this example:
|
||||
- Works for only one Domain name
|
||||
- Performs only HTTP-01 challenge
|
||||
- Uses ACME-v2
|
||||
|
||||
Workflow:
|
||||
(Account creation)
|
||||
- Create account key
|
||||
- Register account and accept TOS
|
||||
(Certificate actions)
|
||||
- Select HTTP-01 within offered challenges by the CA server
|
||||
- Set up http challenge resource
|
||||
- Set up standalone web server
|
||||
- Create domain private key and CSR
|
||||
- Issue certificate
|
||||
- Renew certificate
|
||||
- Revoke certificate
|
||||
(Account update actions)
|
||||
- Change contact information
|
||||
- Deactivate Account
|
||||
"""
|
||||
from contextlib import contextmanager
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
import OpenSSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import client
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import messages
|
||||
from acme import standalone
|
||||
import josepy as jose
|
||||
|
||||
# Constants:
|
||||
|
||||
# This is the staging point for ACME-V2 within Let's Encrypt.
|
||||
DIRECTORY_URL = 'https://acme-staging-v02.api.letsencrypt.org/directory'
|
||||
|
||||
USER_AGENT = 'python-acme-example'
|
||||
|
||||
# Account key size
|
||||
ACC_KEY_BITS = 2048
|
||||
|
||||
# Certificate private key size
|
||||
CERT_PKEY_BITS = 2048
|
||||
|
||||
# Domain name for the certificate.
|
||||
DOMAIN = 'client.example.com'
|
||||
|
||||
# If you are running Boulder locally, it is possible to configure any port
|
||||
# number to execute the challenge, but real CA servers will always use port
|
||||
# 80, as described in the ACME specification.
|
||||
PORT = 80
|
||||
|
||||
|
||||
# Useful methods and classes:
|
||||
|
||||
|
||||
def new_csr_comp(domain_name, pkey_pem=None):
|
||||
"""Create certificate signing request."""
|
||||
if pkey_pem is None:
|
||||
# Create private key.
|
||||
pkey = OpenSSL.crypto.PKey()
|
||||
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, CERT_PKEY_BITS)
|
||||
pkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM,
|
||||
pkey)
|
||||
csr_pem = crypto_util.make_csr(pkey_pem, [domain_name])
|
||||
return pkey_pem, csr_pem
|
||||
|
||||
|
||||
def select_http01_chall(orderr):
|
||||
"""Extract authorization resource from within order resource."""
|
||||
# Authorization Resource: authz.
|
||||
# This object holds the offered challenges by the server and their status.
|
||||
authz_list = orderr.authorizations
|
||||
|
||||
for authz in authz_list:
|
||||
# Choosing challenge.
|
||||
# authz.body.challenges is a set of ChallengeBody objects.
|
||||
for i in authz.body.challenges:
|
||||
# Find the supported challenge.
|
||||
if isinstance(i.chall, challenges.HTTP01):
|
||||
return i
|
||||
|
||||
raise Exception('HTTP-01 challenge was not offered by the CA server.')
|
||||
|
||||
|
||||
@contextmanager
|
||||
def challenge_server(http_01_resources):
|
||||
"""Manage standalone server set up and shutdown."""
|
||||
|
||||
# Setting up a fake server that binds at PORT and any address.
|
||||
address = ('', PORT)
|
||||
try:
|
||||
servers = standalone.HTTP01DualNetworkedServers(address,
|
||||
http_01_resources)
|
||||
# Start client standalone web server.
|
||||
servers.serve_forever()
|
||||
yield servers
|
||||
finally:
|
||||
# Shutdown client web server and unbind from PORT
|
||||
servers.shutdown_and_server_close()
|
||||
|
||||
|
||||
def perform_http01(client_acme, challb, orderr):
|
||||
"""Set up standalone webserver and perform HTTP-01 challenge."""
|
||||
|
||||
response, validation = challb.response_and_validation(client_acme.net.key)
|
||||
|
||||
resource = standalone.HTTP01RequestHandler.HTTP01Resource(
|
||||
chall=challb.chall, response=response, validation=validation)
|
||||
|
||||
with challenge_server({resource}):
|
||||
# Let the CA server know that we are ready for the challenge.
|
||||
client_acme.answer_challenge(challb, response)
|
||||
|
||||
# Wait for challenge status and then issue a certificate.
|
||||
# It is possible to set a deadline time.
|
||||
finalized_orderr = client_acme.poll_and_finalize(orderr)
|
||||
|
||||
return finalized_orderr.fullchain_pem
|
||||
|
||||
|
||||
# Main examples:
|
||||
|
||||
|
||||
def example_http():
|
||||
"""This example executes the whole process of fulfilling a HTTP-01
|
||||
challenge for one specific domain.
|
||||
|
||||
The workflow consists of:
|
||||
(Account creation)
|
||||
- Create account key
|
||||
- Register account and accept TOS
|
||||
(Certificate actions)
|
||||
- Select HTTP-01 within offered challenges by the CA server
|
||||
- Set up http challenge resource
|
||||
- Set up standalone web server
|
||||
- Create domain private key and CSR
|
||||
- Issue certificate
|
||||
- Renew certificate
|
||||
- Revoke certificate
|
||||
(Account update actions)
|
||||
- Change contact information
|
||||
- Deactivate Account
|
||||
|
||||
"""
|
||||
# Create account key
|
||||
|
||||
acc_key = jose.JWKRSA(
|
||||
key=rsa.generate_private_key(public_exponent=65537,
|
||||
key_size=ACC_KEY_BITS,
|
||||
backend=default_backend()))
|
||||
|
||||
# Register account and accept TOS
|
||||
|
||||
net = client.ClientNetwork(acc_key, user_agent=USER_AGENT)
|
||||
directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
|
||||
client_acme = client.ClientV2(directory, net=net)
|
||||
|
||||
# Terms of Service URL is in client_acme.directory.meta.terms_of_service
|
||||
# Registration Resource: regr
|
||||
# Creates account with contact information.
|
||||
email = ('fake@example.com')
|
||||
regr = client_acme.new_account(
|
||||
messages.NewRegistration.from_data(
|
||||
email=email, terms_of_service_agreed=True))
|
||||
|
||||
# Create domain private key and CSR
|
||||
pkey_pem, csr_pem = new_csr_comp(DOMAIN)
|
||||
|
||||
# Issue certificate
|
||||
|
||||
orderr = client_acme.new_order(csr_pem)
|
||||
|
||||
# Select HTTP-01 within offered challenges by the CA server
|
||||
challb = select_http01_chall(orderr)
|
||||
|
||||
# The certificate is ready to be used in the variable "fullchain_pem".
|
||||
fullchain_pem = perform_http01(client_acme, challb, orderr)
|
||||
|
||||
# Renew certificate
|
||||
|
||||
_, csr_pem = new_csr_comp(DOMAIN, pkey_pem)
|
||||
|
||||
orderr = client_acme.new_order(csr_pem)
|
||||
|
||||
challb = select_http01_chall(orderr)
|
||||
|
||||
# Performing challenge
|
||||
fullchain_pem = perform_http01(client_acme, challb, orderr)
|
||||
|
||||
# Revoke certificate
|
||||
|
||||
fullchain_com = jose.ComparableX509(
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, fullchain_pem))
|
||||
|
||||
try:
|
||||
client_acme.revoke(fullchain_com, 0) # revocation reason = 0
|
||||
except errors.ConflictError:
|
||||
# Certificate already revoked.
|
||||
pass
|
||||
|
||||
# Query registration status.
|
||||
client_acme.net.account = regr
|
||||
try:
|
||||
regr = client_acme.query_registration(regr)
|
||||
except errors.Error as err:
|
||||
if err.typ == messages.OLD_ERROR_PREFIX + 'unauthorized' \
|
||||
or err.typ == messages.ERROR_PREFIX + 'unauthorized':
|
||||
# Status is deactivated.
|
||||
pass
|
||||
raise
|
||||
|
||||
# Change contact information
|
||||
|
||||
email = 'newfake@example.com'
|
||||
regr = client_acme.update_registration(
|
||||
regr.update(
|
||||
body=regr.body.update(
|
||||
contact=('mailto:' + email,)
|
||||
)
|
||||
)
|
||||
)
|
||||
|
||||
# Deactivate account/registration
|
||||
|
||||
regr = client_acme.deactivate_registration(regr)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
example_http()
|
||||
@@ -1,2 +0,0 @@
|
||||
[pytest]
|
||||
norecursedirs = .* build dist CVS _darcs {arch} *.egg
|
||||
@@ -1,26 +1,24 @@
|
||||
from setuptools import setup
|
||||
from setuptools import find_packages
|
||||
from setuptools.command.test import test as TestCommand
|
||||
import sys
|
||||
|
||||
version = '1.0.0.dev0'
|
||||
from setuptools import setup
|
||||
from setuptools import find_packages
|
||||
|
||||
|
||||
version = '0.23.0.dev0'
|
||||
|
||||
# Please update tox.ini when modifying dependency version requirements
|
||||
install_requires = [
|
||||
# load_pem_private/public_key (>=0.6)
|
||||
# rsa_recover_prime_factors (>=0.8)
|
||||
'cryptography>=1.2.3',
|
||||
'cryptography>=0.8',
|
||||
# formerly known as acme.jose:
|
||||
# 1.1.0+ is required to avoid the warnings described at
|
||||
# https://github.com/certbot/josepy/issues/13.
|
||||
'josepy>=1.1.0',
|
||||
'josepy>=1.0.0',
|
||||
# Connection.set_tlsext_host_name (>=0.13)
|
||||
'mock',
|
||||
'PyOpenSSL>=0.13.1',
|
||||
'PyOpenSSL>=0.13',
|
||||
'pyrfc3339',
|
||||
'pytz',
|
||||
'requests[security]>=2.6.0', # security extras added in 2.4.1
|
||||
'requests-toolbelt>=0.3.0',
|
||||
'requests[security]>=2.4.1', # security extras added in 2.4.1
|
||||
'setuptools',
|
||||
'six>=1.9.0', # needed for python_2_unicode_compatible
|
||||
]
|
||||
@@ -37,21 +35,6 @@ docs_extras = [
|
||||
]
|
||||
|
||||
|
||||
class PyTest(TestCommand):
|
||||
user_options = []
|
||||
|
||||
def initialize_options(self):
|
||||
TestCommand.initialize_options(self)
|
||||
self.pytest_args = ''
|
||||
|
||||
def run_tests(self):
|
||||
import shlex
|
||||
# import here, cause outside the eggs aren't loaded
|
||||
import pytest
|
||||
errno = pytest.main(shlex.split(self.pytest_args))
|
||||
sys.exit(errno)
|
||||
|
||||
|
||||
setup(
|
||||
name='acme',
|
||||
version=version,
|
||||
@@ -62,7 +45,7 @@ setup(
|
||||
license='Apache License 2.0',
|
||||
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
|
||||
classifiers=[
|
||||
'Development Status :: 5 - Production/Stable',
|
||||
'Development Status :: 3 - Alpha',
|
||||
'Intended Audience :: Developers',
|
||||
'License :: OSI Approved :: Apache Software License',
|
||||
'Programming Language :: Python',
|
||||
@@ -72,8 +55,6 @@ setup(
|
||||
'Programming Language :: Python :: 3.4',
|
||||
'Programming Language :: Python :: 3.5',
|
||||
'Programming Language :: Python :: 3.6',
|
||||
'Programming Language :: Python :: 3.7',
|
||||
'Programming Language :: Python :: 3.8',
|
||||
'Topic :: Internet :: WWW/HTTP',
|
||||
'Topic :: Security',
|
||||
],
|
||||
@@ -86,6 +67,4 @@ setup(
|
||||
'docs': docs_extras,
|
||||
},
|
||||
test_suite='acme',
|
||||
tests_require=["pytest"],
|
||||
cmdclass={"test": PyTest},
|
||||
)
|
||||
|
||||
@@ -1,9 +1,7 @@
|
||||
""" Utility functions for certbot-apache plugin """
|
||||
import binascii
|
||||
import os
|
||||
|
||||
from certbot import util
|
||||
from certbot.compat import os
|
||||
|
||||
|
||||
def get_mod_deps(mod_name):
|
||||
"""Get known module dependencies.
|
||||
@@ -100,8 +98,3 @@ def parse_define_file(filepath, varname):
|
||||
var_parts = v[2:].partition("=")
|
||||
return_vars[var_parts[0]] = var_parts[2]
|
||||
return return_vars
|
||||
|
||||
|
||||
def unique_id():
|
||||
""" Returns an unique id to be used as a VirtualHost identifier"""
|
||||
return binascii.hexlify(os.urandom(16)).decode("utf-8")
|
||||
|
||||
certbot-apache/certbot_apache/augeas_configurator.py (new file, 207 lines)
@@ -0,0 +1,207 @@
|
||||
"""Class of Augeas Configurators."""
|
||||
import logging
|
||||
|
||||
|
||||
from certbot import errors
|
||||
from certbot.plugins import common
|
||||
|
||||
from certbot_apache import constants
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class AugeasConfigurator(common.Installer):
|
||||
"""Base Augeas Configurator class.
|
||||
|
||||
:ivar config: Configuration.
|
||||
:type config: :class:`~certbot.interfaces.IConfig`
|
||||
|
||||
:ivar aug: Augeas object
|
||||
:type aug: :class:`augeas.Augeas`
|
||||
|
||||
:ivar str save_notes: Human-readable configuration change notes
|
||||
:ivar reverter: saves and reverts checkpoints
|
||||
:type reverter: :class:`certbot.reverter.Reverter`
|
||||
|
||||
"""
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(AugeasConfigurator, self).__init__(*args, **kwargs)
|
||||
|
||||
# Placeholder for augeas
|
||||
self.aug = None
|
||||
|
||||
self.save_notes = ""
|
||||
|
||||
|
||||
def init_augeas(self):
|
||||
""" Initialize the actual Augeas instance """
|
||||
import augeas
|
||||
self.aug = augeas.Augeas(
|
||||
# specify a directory to load our preferred lens from
|
||||
loadpath=constants.AUGEAS_LENS_DIR,
|
||||
# Do not save backup (we do it ourselves), do not load
|
||||
# anything by default
|
||||
flags=(augeas.Augeas.NONE |
|
||||
augeas.Augeas.NO_MODL_AUTOLOAD |
|
||||
augeas.Augeas.ENABLE_SPAN))
|
||||
# See if any temporary changes need to be recovered
|
||||
# This needs to occur before VirtualHost objects are setup...
|
||||
# because this will change the underlying configuration and potential
|
||||
# vhosts
|
||||
self.recovery_routine()
|
||||
|
||||
def check_parsing_errors(self, lens):
|
||||
"""Verify Augeas can parse all of the lens files.
|
||||
|
||||
:param str lens: lens to check for errors
|
||||
|
||||
:raises .errors.PluginError: If there has been an error in parsing with
|
||||
the specified lens.
|
||||
|
||||
"""
|
||||
error_files = self.aug.match("/augeas//error")
|
||||
|
||||
for path in error_files:
|
||||
# Check to see if it was an error resulting from the use of
|
||||
# the httpd lens
|
||||
lens_path = self.aug.get(path + "/lens")
|
||||
# As aug.get may return null
|
||||
if lens_path and lens in lens_path:
|
||||
msg = (
|
||||
"There has been an error in parsing the file {0} on line {1}: "
|
||||
"{2}".format(
|
||||
# Strip off /augeas/files and /error
|
||||
path[13:len(path) - 6],
|
||||
self.aug.get(path + "/line"),
|
||||
self.aug.get(path + "/message")))
|
||||
raise errors.PluginError(msg)
|
||||
|
||||
def ensure_augeas_state(self):
|
||||
"""Makes sure that all Augeas dom changes are written to files to avoid
|
||||
loss of configuration directives when doing additional augeas parsing,
|
||||
causing a possible augeas.load() resulting dom reset
|
||||
"""
|
||||
|
||||
if self.unsaved_files():
|
||||
self.save_notes += "(autosave)"
|
||||
self.save()
|
||||
|
||||
def unsaved_files(self):
|
||||
"""Lists files that have modified Augeas DOM but the changes have not
|
||||
been written to the filesystem yet, used by `self.save()` and
|
||||
ApacheConfigurator to check the file state.
|
||||
|
||||
:raises .errors.PluginError: If there was an error in Augeas, in
|
||||
an attempt to save the configuration, or an error creating a
|
||||
checkpoint
|
||||
|
||||
:returns: `set` of unsaved files
|
||||
"""
|
||||
save_state = self.aug.get("/augeas/save")
|
||||
self.aug.set("/augeas/save", "noop")
|
||||
# Existing Errors
|
||||
ex_errs = self.aug.match("/augeas//error")
|
||||
try:
|
||||
# This is a noop save
|
||||
self.aug.save()
|
||||
except (RuntimeError, IOError):
|
||||
self._log_save_errors(ex_errs)
|
||||
# Erase Save Notes
|
||||
self.save_notes = ""
|
||||
raise errors.PluginError(
|
||||
"Error saving files, check logs for more info.")
|
||||
|
||||
# Return the original save method
|
||||
self.aug.set("/augeas/save", save_state)
|
||||
|
||||
# Retrieve list of modified files
|
||||
# Note: Noop saves can cause the file to be listed twice, I used a
|
||||
# set to remove this possibility. This is a known augeas 0.10 error.
|
||||
save_paths = self.aug.match("/augeas/events/saved")
|
||||
|
||||
save_files = set()
|
||||
if save_paths:
|
||||
for path in save_paths:
|
||||
save_files.add(self.aug.get(path)[6:])
|
||||
return save_files
|
||||
|
||||
def save(self, title=None, temporary=False):
|
||||
"""Saves all changes to the configuration files.
|
||||
|
||||
This function first checks for save errors, if none are found,
|
||||
all configuration changes made will be saved. According to the
|
||||
function parameters. If an exception is raised, a new checkpoint
|
||||
was not created.
|
||||
|
||||
:param str title: The title of the save. If a title is given, the
|
||||
configuration will be saved as a new checkpoint and put in a
|
||||
timestamped directory.
|
||||
|
||||
:param bool temporary: Indicates whether the changes made will
|
||||
be quickly reversed in the future (ie. challenges)
|
||||
|
||||
"""
|
||||
save_files = self.unsaved_files()
|
||||
if save_files:
|
||||
self.add_to_checkpoint(save_files,
|
||||
self.save_notes, temporary=temporary)
|
||||
|
||||
self.save_notes = ""
|
||||
self.aug.save()
|
||||
|
||||
# Force reload if files were modified
|
||||
# This is needed to recalculate augeas directive span
|
||||
if save_files:
|
||||
for sf in save_files:
|
||||
self.aug.remove("/files/"+sf)
|
||||
self.aug.load()
|
||||
if title and not temporary:
|
||||
self.finalize_checkpoint(title)
|
||||
|
||||
def _log_save_errors(self, ex_errs):
|
||||
"""Log errors due to bad Augeas save.
|
||||
|
||||
:param list ex_errs: Existing errors before save
|
||||
|
||||
"""
|
||||
# Check for the root of save problems
|
||||
new_errs = self.aug.match("/augeas//error")
|
||||
# logger.error("During Save - %s", mod_conf)
|
||||
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
|
||||
", ".join(err[13:len(err) - 6] for err in new_errs
|
||||
# Only new errors caused by recent save
|
||||
if err not in ex_errs), self.save_notes)
|
||||
|
||||
# Wrapper functions for Reverter class
|
||||
def recovery_routine(self):
|
||||
"""Revert all previously modified files.
|
||||
|
||||
Reverts all modified files that have not been saved as a checkpoint
|
||||
|
||||
:raises .errors.PluginError: If unable to recover the configuration
|
||||
|
||||
"""
|
||||
super(AugeasConfigurator, self).recovery_routine()
|
||||
# Need to reload configuration after these changes take effect
|
||||
self.aug.load()
|
||||
|
||||
def revert_challenge_config(self):
|
||||
"""Used to cleanup challenge configurations.
|
||||
|
||||
:raises .errors.PluginError: If unable to revert the challenge config.
|
||||
|
||||
"""
|
||||
self.revert_temporary_config()
|
||||
self.aug.load()
|
||||
|
||||
def rollback_checkpoints(self, rollback=1):
|
||||
"""Rollback saved checkpoints.
|
||||
|
||||
:param int rollback: Number of checkpoints to revert
|
||||
|
||||
:raises .errors.PluginError: If there is a problem with the input or
|
||||
the function is unable to correctly revert the configuration
|
||||
|
||||
"""
|
||||
super(AugeasConfigurator, self).rollback_checkpoints(rollback)
|
||||
self.aug.load()
|
||||
@@ -44,134 +44,67 @@ autoload xfm
|
||||
*****************************************************************)
|
||||
let dels (s:string) = del s s
|
||||
|
||||
(* The continuation sequence that indicates that we should consider the
|
||||
* next line part of the current line *)
|
||||
let cont = /\\\\\r?\n/
|
||||
|
||||
(* Whitespace within a line: space, tab, and the continuation sequence *)
|
||||
let ws = /[ \t]/ | cont
|
||||
|
||||
(* Any possible character - '.' does not match \n *)
|
||||
let any = /(.|\n)/
|
||||
|
||||
(* Any character preceded by a backslash *)
|
||||
let esc_any = /\\\\(.|\n)/
|
||||
|
||||
(* Newline sequence - both for Unix and DOS newlines *)
|
||||
let nl = /\r?\n/
|
||||
|
||||
(* Whitespace at the end of a line *)
|
||||
let eol = del (ws* . nl) "\n"
|
||||
|
||||
(* deal with continuation lines *)
|
||||
let sep_spc = del ws+ " "
|
||||
let sep_osp = del ws* ""
|
||||
let sep_eq = del (ws* . "=" . ws*) "="
|
||||
let sep_spc = del /([ \t]+|[ \t]*\\\\\r?\n[ \t]*)+/ " "
|
||||
let sep_osp = del /([ \t]*|[ \t]*\\\\\r?\n[ \t]*)*/ ""
|
||||
let sep_eq = del /[ \t]*=[ \t]*/ "="
|
||||
|
||||
let nmtoken = /[a-zA-Z:_][a-zA-Z0-9:_.-]*/
|
||||
let word = /[a-z][a-z0-9._-]*/i
|
||||
|
||||
(* A complete line that is either just whitespace or a comment that only
|
||||
* contains whitespace *)
|
||||
let empty = [ del (ws* . /#?/ . ws* . nl) "\n" ]
|
||||
|
||||
let eol = Util.doseol
|
||||
let empty = Util.empty_dos
|
||||
let indent = Util.indent
|
||||
|
||||
(* A comment that is not just whitespace. We define it in terms of the
|
||||
* things that are not allowed as part of such a comment:
|
||||
* 1) Starts with whitespace
|
||||
* 2) Ends with whitespace, a backslash or \r
|
||||
* 3) Unescaped newlines
|
||||
*)
|
||||
let comment =
|
||||
let comment_start = del (ws* . "#" . ws* ) "# " in
|
||||
let unesc_eol = /[^\]?/ . nl in
|
||||
let w = /[^\t\n\r \\]/ in
|
||||
let r = /[\r\\]/ in
|
||||
let s = /[\t\r ]/ in
|
||||
(*
|
||||
* we'd like to write
|
||||
* let b = /\\\\/ in
|
||||
* let t = /[\t\n\r ]/ in
|
||||
* let x = b . (t? . (s|w)* ) in
|
||||
* but the definition of b depends on commit 244c0edd in 1.9.0 and
|
||||
* would make the lens unusable with versions before 1.9.0. So we write
|
||||
* x out which works in older versions, too
|
||||
*)
|
||||
let x = /\\\\[\t\n\r ]?[^\n\\]*/ in
|
||||
let line = ((r . s* . w|w|r) . (s|w)* . x*|(r.s* )?).w.(s*.w)* in
|
||||
[ label "#comment" . comment_start . store line . eol ]
|
||||
let comment_val_re = /([^ \t\r\n](.|\\\\\r?\n)*[^ \\\t\r\n]|[^ \t\r\n])/
|
||||
let comment = [ label "#comment" . del /[ \t]*#[ \t]*/ "# "
|
||||
. store comment_val_re . eol ]
|
||||
|
||||
(* borrowed from shellvars.aug *)
|
||||
let char_arg_dir = /([^\\ '"{\t\r\n]|[^ '"{\t\r\n]+[^\\ \t\r\n])|\\\\"|\\\\'|\\\\ /
|
||||
let char_arg_sec = /([^\\ '"\t\r\n>]|[^ '"\t\r\n>]+[^\\ \t\r\n>])|\\\\"|\\\\'|\\\\ /
|
||||
let char_arg_wl = /([^\\ '"},\t\r\n]|[^ '"},\t\r\n]+[^\\ '"},\t\r\n])/
|
||||
|
||||
let cdot = /\\\\./
|
||||
let cl = /\\\\\n/
|
||||
let dquot =
|
||||
let no_dquot = /[^"\\\r\n]/
|
||||
in /"/ . (no_dquot|esc_any)* . /"/
|
||||
in /"/ . (no_dquot|cdot|cl)* . /"/
|
||||
let dquot_msg =
|
||||
let no_dquot = /([^ \t"\\\r\n]|[^"\\\r\n]+[^ \t"\\\r\n])/
|
||||
in /"/ . (no_dquot|esc_any)* . no_dquot
|
||||
|
||||
in /"/ . (no_dquot|cdot|cl)*
|
||||
let squot =
|
||||
let no_squot = /[^'\\\r\n]/
|
||||
in /'/ . (no_squot|esc_any)* . /'/
|
||||
in /'/ . (no_squot|cdot|cl)* . /'/
|
||||
let comp = /[<>=]?=/
|
||||
|
||||
(******************************************************************
|
||||
* Attributes
|
||||
*****************************************************************)
|
||||
|
||||
(* The arguments for a directive come in two flavors: quoted with single or
|
||||
* double quotes, or bare. Bare arguments may not start with a single or
|
||||
* double quote; since we also treat "word lists" special, i.e. lists
|
||||
* enclosed in curly braces, bare arguments may not start with those,
|
||||
* either.
|
||||
*
|
||||
* Bare arguments may not contain unescaped spaces, but we allow escaping
|
||||
* with '\\'. Quoted arguments can contain anything, though the quote must
|
||||
* be escaped with '\\'.
|
||||
*)
|
||||
let bare = /([^{"' \t\n\r]|\\\\.)([^ \t\n\r]|\\\\.)*[^ \t\n\r\\]|[^{"' \t\n\r\\]/
|
||||
|
||||
let arg_quoted = [ label "arg" . store (dquot|squot) ]
|
||||
let arg_bare = [ label "arg" . store bare ]
|
||||
|
||||
let arg_dir = [ label "arg" . store (char_arg_dir+|dquot|squot) ]
|
||||
(* message argument starts with " but ends at EOL *)
|
||||
let arg_dir_msg = [ label "arg" . store dquot_msg ]
|
||||
let arg_sec = [ label "arg" . store (char_arg_sec+|comp|dquot|squot) ]
|
||||
let arg_wl = [ label "arg" . store (char_arg_wl+|dquot|squot) ]
|
||||
|
||||
(* comma-separated wordlist as permitted in the SSLRequire directive *)
|
||||
let arg_wordlist =
|
||||
let wl_start = dels "{" in
|
||||
let wl_end = dels "}" in
|
||||
let wl_start = Util.del_str "{" in
|
||||
let wl_end = Util.del_str "}" in
|
||||
let wl_sep = del /[ \t]*,[ \t]*/ ", "
|
||||
in [ label "wordlist" . wl_start . arg_wl . (wl_sep . arg_wl)* . wl_end ]
|
||||
|
||||
let argv (l:lens) = l . (sep_spc . l)*
|
||||
|
||||
(* the arguments of a directive. We use this once we have parsed the name
|
||||
* of the directive, and the space right after it. When dir_args is used,
|
||||
* we also know that we have at least one argument. We need to be careful
|
||||
* with the spacing between arguments: quoted arguments and word lists do
|
||||
* not need to have space between them, but bare arguments do.
|
||||
*
|
||||
* Apache apparently is also happy if the last argument starts with a double
|
||||
* quote, but has no corresponding closing duoble quote, which is what
|
||||
* arg_dir_msg handles
|
||||
*)
|
||||
let dir_args =
|
||||
let arg_nospc = arg_quoted|arg_wordlist in
|
||||
(arg_bare . sep_spc | arg_nospc . sep_osp)* . (arg_bare|arg_nospc|arg_dir_msg)
|
||||
|
||||
let directive =
|
||||
[ indent . label "directive" . store word . (sep_spc . dir_args)? . eol ]
|
||||
|
||||
let arg_sec = [ label "arg" . store (char_arg_sec+|comp|dquot|squot) ]
|
||||
(* arg_dir_msg may be the last or only argument *)
|
||||
let dir_args = (argv (arg_dir|arg_wordlist) . (sep_spc . arg_dir_msg)?) | arg_dir_msg
|
||||
in [ indent . label "directive" . store word . (sep_spc . dir_args)? . eol ]
|
||||
|
||||
let section (body:lens) =
|
||||
(* opt_eol includes empty lines *)
|
||||
let opt_eol = del /([ \t]*#?[ \t]*\r?\n)*/ "\n" in
|
||||
let opt_eol = del /([ \t]*#?\r?\n)*/ "\n" in
|
||||
let inner = (sep_spc . argv arg_sec)? . sep_osp .
|
||||
dels ">" . opt_eol . ((body|comment) . (body|empty|comment)*)? .
|
||||
indent . dels "</" in
|
||||
@@ -200,7 +133,6 @@ let filter = (incl "/etc/apache2/apache2.conf") .
|
||||
(incl "/etc/httpd/conf.d/*.conf") .
|
||||
(incl "/etc/httpd/httpd.conf") .
|
||||
(incl "/etc/httpd/conf/httpd.conf") .
|
||||
(incl "/etc/httpd/conf.modules.d/*.conf") .
|
||||
Util.stdexcl
|
||||
|
||||
let xfm = transform lns filter
|
||||
|
||||
File diff suppressed because it is too large.
@@ -9,7 +9,6 @@ MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
|
||||
UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-apache-conf-digest.txt"
|
||||
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""
|
||||
|
||||
# NEVER REMOVE A SINGLE HASH FROM THIS LIST UNLESS YOU KNOW EXACTLY WHAT YOU ARE DOING!
|
||||
ALL_SSL_OPTIONS_HASHES = [
|
||||
'2086bca02db48daf93468332543c60ac6acdb6f0b58c7bfdf578a5d47092f82a',
|
||||
'4844d36c9a0f587172d9fa10f4f1c9518e3bcfa1947379f155e16a70a728c21a',
|
||||
@@ -19,10 +18,6 @@ ALL_SSL_OPTIONS_HASHES = [
|
||||
'cfdd7c18d2025836ea3307399f509cfb1ebf2612c87dd600a65da2a8e2f2797b',
|
||||
'80720bd171ccdc2e6b917ded340defae66919e4624962396b992b7218a561791',
|
||||
'c0c022ea6b8a51ecc8f1003d0a04af6c3f2bc1c3ce506b3c2dfc1f11ef931082',
|
||||
'717b0a89f5e4c39b09a42813ac6e747cfbdeb93439499e73f4f70a1fe1473f20',
|
||||
'0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
|
||||
'86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
|
||||
'06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
|
||||
]
|
||||
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
|
||||
|
||||
@@ -53,16 +48,3 @@ UIR_ARGS = ["always", "set", "Content-Security-Policy",
|
||||
|
||||
HEADER_ARGS = {"Strict-Transport-Security": HSTS_ARGS,
|
||||
"Upgrade-Insecure-Requests": UIR_ARGS}
|
||||
|
||||
AUTOHSTS_STEPS = [60, 300, 900, 3600, 21600, 43200, 86400]
|
||||
"""AutoHSTS increase steps: 1min, 5min, 15min, 1h, 6h, 12h, 24h"""
|
||||
|
||||
AUTOHSTS_PERMANENT = 31536000
|
||||
"""Value for the last max-age of HSTS"""
|
||||
|
||||
AUTOHSTS_FREQ = 172800
|
||||
"""Minimum time since last increase to perform a new one: 48h"""
|
||||
|
||||
MANAGED_COMMENT = "DO NOT REMOVE - Managed by Certbot"
|
||||
MANAGED_COMMENT_ID = MANAGED_COMMENT+", VirtualHost id: {0}"
|
||||
"""Managed by Certbot comments and the VirtualHost identification template"""
|
||||
|
||||
@@ -1,12 +1,14 @@
|
||||
"""Contains UI methods for Apache operations."""
|
||||
import logging
|
||||
import os
|
||||
|
||||
import zope.component
|
||||
|
||||
import certbot.display.util as display_util
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot.compat import os
|
||||
|
||||
import certbot.display.util as display_util
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@@ -24,7 +26,7 @@ def select_vhost_multiple(vhosts):
|
||||
return list()
|
||||
tags_list = [vhost.display_repr()+"\n" for vhost in vhosts]
|
||||
# Remove the extra newline from the last entry
|
||||
if tags_list:
|
||||
if len(tags_list):
|
||||
tags_list[-1] = tags_list[-1][:-1]
|
||||
code, names = zope.component.getUtility(interfaces.IDisplay).checklist(
|
||||
"Which VirtualHosts would you like to install the wildcard certificate for?",
|
||||
@@ -60,7 +62,8 @@ def select_vhost(domain, vhosts):
|
||||
code, tag = _vhost_menu(domain, vhosts)
|
||||
if code == display_util.OK:
|
||||
return vhosts[tag]
|
||||
return None
|
||||
else:
|
||||
return None
|
||||
|
||||
def _vhost_menu(domain, vhosts):
|
||||
"""Select an appropriate Apache Vhost.
|
||||
@@ -90,7 +93,7 @@ def _vhost_menu(domain, vhosts):
|
||||
for vhost in vhosts:
|
||||
if len(vhost.get_names()) == 1:
|
||||
disp_name = next(iter(vhost.get_names()))
|
||||
elif not vhost.get_names():
|
||||
elif len(vhost.get_names()) == 0:
|
||||
disp_name = ""
|
||||
else:
|
||||
disp_name = "Multiple Names"
|
||||
@@ -110,7 +113,8 @@ def _vhost_menu(domain, vhosts):
|
||||
code, tag = zope.component.getUtility(interfaces.IDisplay).menu(
|
||||
"We were unable to find a vhost with a ServerName "
|
||||
"or Address of {0}.{1}Which virtual host would you "
|
||||
"like to choose?".format(domain, os.linesep),
|
||||
"like to choose?\n(note: conf files with multiple "
|
||||
"vhosts are not yet supported)".format(domain, os.linesep),
|
||||
choices, force_interactive=True)
|
||||
except errors.MissingCommandlineFlag:
|
||||
msg = (
|
||||
|
||||
@@ -1,13 +1,8 @@
|
||||
""" Entry point for Apache Plugin """
|
||||
# Pylint does not like disutils.version when running inside a venv.
|
||||
# See: https://github.com/PyCQA/pylint/issues/73
|
||||
from distutils.version import LooseVersion # pylint: disable=no-name-in-module,import-error
|
||||
|
||||
from certbot import util
|
||||
|
||||
from certbot_apache import configurator
|
||||
from certbot_apache import override_arch
|
||||
from certbot_apache import override_fedora
|
||||
from certbot_apache import override_darwin
|
||||
from certbot_apache import override_debian
|
||||
from certbot_apache import override_centos
|
||||
@@ -16,18 +11,13 @@ from certbot_apache import override_suse
|
||||
|
||||
OVERRIDE_CLASSES = {
|
||||
"arch": override_arch.ArchConfigurator,
|
||||
"cloudlinux": override_centos.CentOSConfigurator,
|
||||
"darwin": override_darwin.DarwinConfigurator,
|
||||
"debian": override_debian.DebianConfigurator,
|
||||
"ubuntu": override_debian.DebianConfigurator,
|
||||
"centos": override_centos.CentOSConfigurator,
|
||||
"centos linux": override_centos.CentOSConfigurator,
|
||||
"fedora_old": override_centos.CentOSConfigurator,
|
||||
"fedora": override_fedora.FedoraConfigurator,
|
||||
"linuxmint": override_debian.DebianConfigurator,
|
||||
"fedora": override_centos.CentOSConfigurator,
|
||||
"ol": override_centos.CentOSConfigurator,
|
||||
"oracle": override_centos.CentOSConfigurator,
|
||||
"redhatenterpriseserver": override_centos.CentOSConfigurator,
|
||||
"red hat enterprise linux server": override_centos.CentOSConfigurator,
|
||||
"rhel": override_centos.CentOSConfigurator,
|
||||
"amazon": override_centos.CentOSConfigurator,
|
||||
@@ -35,24 +25,14 @@ OVERRIDE_CLASSES = {
|
||||
"gentoo base system": override_gentoo.GentooConfigurator,
|
||||
"opensuse": override_suse.OpenSUSEConfigurator,
|
||||
"suse": override_suse.OpenSUSEConfigurator,
|
||||
"sles": override_suse.OpenSUSEConfigurator,
|
||||
"scientific": override_centos.CentOSConfigurator,
|
||||
"scientific linux": override_centos.CentOSConfigurator,
|
||||
}
|
||||
|
||||
|
||||
def get_configurator():
|
||||
""" Get correct configurator class based on the OS fingerprint """
|
||||
os_name, os_version = util.get_os_info()
|
||||
os_name = os_name.lower()
|
||||
os_info = util.get_os_info()
|
||||
override_class = None
|
||||
|
||||
# Special case for older Fedora versions
|
||||
if os_name == 'fedora' and LooseVersion(os_version) < LooseVersion('29'):
|
||||
os_name = 'fedora_old'
|
||||
|
||||
try:
|
||||
override_class = OVERRIDE_CLASSES[os_name]
|
||||
override_class = OVERRIDE_CLASSES[os_info[0].lower()]
|
||||
except KeyError:
|
||||
# OS not found in the list
|
||||
os_like = util.get_systemd_os_like()
|
||||
@@ -65,5 +45,4 @@ def get_configurator():
|
||||
override_class = configurator.ApacheConfigurator
|
||||
return override_class
|
||||
|
||||
|
||||
ENTRYPOINT = get_configurator()
|
||||
|
||||
@@ -1,20 +1,14 @@
|
||||
"""A class that performs HTTP-01 challenges for Apache"""
|
||||
import logging
|
||||
|
||||
from acme.magic_typing import List, Set # pylint: disable=unused-import, no-name-in-module
|
||||
import os
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
from certbot.compat import filesystem
|
||||
from certbot.plugins import common
|
||||
|
||||
from certbot_apache.obj import VirtualHost # pylint: disable=unused-import
|
||||
from certbot_apache.parser import get_aug_path
|
||||
from certbot.plugins import common
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ApacheHttp01(common.ChallengePerformer):
|
||||
class ApacheHttp01(common.TLSSNI01):
|
||||
"""Class that performs HTTP-01 challenges within the Apache configurator."""
|
||||
|
||||
CONFIG_TEMPLATE22_PRE = """\
|
||||
@@ -57,7 +51,7 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
self.challenge_dir = os.path.join(
|
||||
self.configurator.config.work_dir,
|
||||
"http_challenges")
|
||||
self.moded_vhosts = set() # type: Set[VirtualHost]
|
||||
self.moded_vhosts = set()
|
||||
|
||||
def perform(self):
|
||||
"""Perform all HTTP-01 challenges."""
|
||||
@@ -93,27 +87,15 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
self.configurator.enable_mod(mod, temp=True)
|
||||
|
||||
def _mod_config(self):
|
||||
selected_vhosts = [] # type: List[VirtualHost]
|
||||
http_port = str(self.configurator.config.http01_port)
|
||||
for chall in self.achalls:
|
||||
# Search for matching VirtualHosts
|
||||
for vh in self._matching_vhosts(chall.domain):
|
||||
selected_vhosts.append(vh)
|
||||
|
||||
# Ensure that we have one or more VirtualHosts that we can continue
|
||||
# with. (one that listens to port configured with --http-01-port)
|
||||
found = False
|
||||
for vhost in selected_vhosts:
|
||||
if any(a.is_wildcard() or a.get_port() == http_port for a in vhost.addrs):
|
||||
found = True
|
||||
|
||||
if not found:
|
||||
for vh in self._relevant_vhosts():
|
||||
selected_vhosts.append(vh)
|
||||
|
||||
# Add the challenge configuration
|
||||
for vh in selected_vhosts:
|
||||
self._set_up_include_directives(vh)
|
||||
vh = self.configurator.find_best_http_vhost(
|
||||
chall.domain, filter_defaults=False,
|
||||
port=str(self.configurator.config.http01_port))
|
||||
if vh:
|
||||
self._set_up_include_directives(vh)
|
||||
else:
|
||||
for vh in self._relevant_vhosts():
|
||||
self._set_up_include_directives(vh)
|
||||
|
||||
self.configurator.reverter.register_file_creation(
|
||||
True, self.challenge_conf_pre)
|
||||
@@ -137,20 +119,6 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
with open(self.challenge_conf_post, "w") as new_conf:
|
||||
new_conf.write(config_text_post)
|
||||
|
||||
def _matching_vhosts(self, domain):
|
||||
"""Return all VirtualHost objects that have the requested domain name or
|
||||
a wildcard name that would match the domain in ServerName or ServerAlias
|
||||
directive.
|
||||
"""
|
||||
matching_vhosts = []
|
||||
for vhost in self.configurator.vhosts:
|
||||
if self.configurator.domain_in_names(vhost.get_names(), domain):
|
||||
# domain_in_names also matches the exact names, so no need
|
||||
# to check "domain in vhost.get_names()" explicitly here
|
||||
matching_vhosts.append(vhost)
|
||||
|
||||
return matching_vhosts
|
||||
|
||||
def _relevant_vhosts(self):
|
||||
http01_port = str(self.configurator.config.http01_port)
|
||||
relevant_vhosts = []
|
||||
@@ -169,7 +137,8 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
|
||||
def _set_up_challenges(self):
|
||||
if not os.path.isdir(self.challenge_dir):
|
||||
filesystem.makedirs(self.challenge_dir, 0o755)
|
||||
os.makedirs(self.challenge_dir)
|
||||
os.chmod(self.challenge_dir, 0o755)
|
||||
|
||||
responses = []
|
||||
for achall in self.achalls:
|
||||
@@ -185,7 +154,7 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
self.configurator.reverter.register_file_creation(True, name)
|
||||
with open(name, 'wb') as f:
|
||||
f.write(validation.encode())
|
||||
filesystem.chmod(name, 0o644)
|
||||
os.chmod(name, 0o644)
|
||||
|
||||
return response
|
||||
|
||||
@@ -202,9 +171,4 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
self.configurator.parser.add_dir(
|
||||
vhost.path, "Include", self.challenge_conf_post)
|
||||
|
||||
if not vhost.enabled:
|
||||
self.configurator.parser.add_dir(
|
||||
get_aug_path(self.configurator.parser.loc["default"]),
|
||||
"Include", vhost.filep)
|
||||
|
||||
self.moded_vhosts.add(vhost)
|
||||
|
||||
@@ -1,7 +1,6 @@
|
||||
"""Module contains classes used by the Apache Configurator."""
|
||||
import re
|
||||
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from certbot.plugins import common
|
||||
|
||||
|
||||
@@ -26,7 +25,7 @@ class Addr(common.Addr):
|
||||
def __repr__(self):
|
||||
return "certbot_apache.obj.Addr(" + repr(self.tup) + ")"
|
||||
|
||||
def __hash__(self): # pylint: disable=useless-super-delegation
|
||||
def __hash__(self):
|
||||
# Python 3 requires explicit overridden for __hash__ if __eq__ or
|
||||
# __cmp__ is overridden. See https://bugs.python.org/issue2235
|
||||
return super(Addr, self).__hash__()
|
||||
@@ -47,7 +46,8 @@ class Addr(common.Addr):
|
||||
return 0
|
||||
elif self.get_addr() == "*":
|
||||
return 1
|
||||
return 2
|
||||
else:
|
||||
return 2
|
||||
|
||||
def conflicts(self, addr):
|
||||
r"""Returns if address could conflict with correct function of self.
|
||||
@@ -98,7 +98,7 @@ class Addr(common.Addr):
|
||||
return self.get_addr_obj(port)
|
||||
|
||||
|
||||
class VirtualHost(object):
|
||||
class VirtualHost(object): # pylint: disable=too-few-public-methods
|
||||
"""Represents an Apache Virtualhost.
|
||||
|
||||
:ivar str filep: file path of VH
|
||||
@@ -126,6 +126,7 @@ class VirtualHost(object):
|
||||
def __init__(self, filep, path, addrs, ssl, enabled, name=None,
|
||||
aliases=None, modmacro=False, ancestor=None):
|
||||
|
||||
# pylint: disable=too-many-arguments
|
||||
"""Initialize a VH."""
|
||||
self.filep = filep
|
||||
self.path = path
|
||||
@@ -139,7 +140,7 @@ class VirtualHost(object):
|
||||
|
||||
def get_names(self):
|
||||
"""Return a set of all names."""
|
||||
all_names = set() # type: Set[str]
|
||||
all_names = set()
|
||||
all_names.update(self.aliases)
|
||||
# Strip out any scheme:// and <port> field from servername
|
||||
if self.name is not None:
|
||||
@@ -250,7 +251,7 @@ class VirtualHost(object):
|
||||
|
||||
# already_found acts to keep everything very conservative.
|
||||
# Don't allow multiple ip:ports in same set.
|
||||
already_found = set() # type: Set[str]
|
||||
already_found = set()
|
||||
|
||||
for addr in vhost.addrs:
|
||||
for local_addr in self.addrs:
|
||||
|
||||
@@ -16,14 +16,14 @@ class ArchConfigurator(configurator.ApacheConfigurator):
|
||||
vhost_root="/etc/httpd/conf",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/httpd",
|
||||
ctl="apachectl",
|
||||
version_cmd=['apachectl', '-v'],
|
||||
apache_cmd="apachectl",
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_mods=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/httpd/conf",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
|
||||
@@ -1,24 +1,14 @@
|
||||
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
|
||||
import logging
|
||||
|
||||
import pkg_resources
|
||||
|
||||
import zope.interface
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
from certbot.errors import MisconfigurationError
|
||||
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import configurator
|
||||
from certbot_apache import parser
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class CentOSConfigurator(configurator.ApacheConfigurator):
|
||||
"""CentOS specific ApacheConfigurator override class"""
|
||||
@@ -28,144 +18,27 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
|
||||
vhost_root="/etc/httpd/conf.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/httpd",
|
||||
ctl="apachectl",
|
||||
version_cmd=['apachectl', '-v'],
|
||||
apache_cmd="apachectl",
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
restart_cmd_alt=['apachectl', 'restart'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_mods=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/httpd/conf.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", "centos-options-ssl-apache.conf")
|
||||
)
|
||||
|
||||
def config_test(self):
|
||||
"""
|
||||
Override config_test to mitigate configtest error in vanilla installation
|
||||
of mod_ssl in Fedora. The error is caused by non-existent self-signed
|
||||
certificates referenced by the configuration, that would be autogenerated
|
||||
during the first (re)start of httpd.
|
||||
"""
|
||||
|
||||
os_info = util.get_os_info()
|
||||
fedora = os_info[0].lower() == "fedora"
|
||||
|
||||
try:
|
||||
super(CentOSConfigurator, self).config_test()
|
||||
except errors.MisconfigurationError:
|
||||
if fedora:
|
||||
self._try_restart_fedora()
|
||||
else:
|
||||
raise
|
||||
|
||||
def _try_restart_fedora(self):
|
||||
"""
|
||||
Tries to restart httpd using systemctl to generate the self signed keypair.
|
||||
"""
|
||||
|
||||
try:
|
||||
util.run_script(['systemctl', 'restart', 'httpd'])
|
||||
except errors.SubprocessError as err:
|
||||
raise errors.MisconfigurationError(str(err))
|
||||
|
||||
# Finish with actual config check to see if systemctl restart helped
|
||||
super(CentOSConfigurator, self).config_test()
|
||||
|
||||
def _prepare_options(self):
|
||||
"""
|
||||
Override the options dictionary initialization in order to support
|
||||
alternative restart cmd used in CentOS.
|
||||
"""
|
||||
super(CentOSConfigurator, self)._prepare_options()
|
||||
self.options["restart_cmd_alt"][0] = self.option("ctl")
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return CentOSParser(
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.aug, self.conf("server-root"), self.conf("vhost-root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _deploy_cert(self, *args, **kwargs): # pylint: disable=arguments-differ
|
||||
"""
|
||||
Override _deploy_cert in order to ensure that the Apache configuration
|
||||
has "LoadModule ssl_module..." before parsing the VirtualHost configuration
|
||||
that was created by Certbot
|
||||
"""
|
||||
super(CentOSConfigurator, self)._deploy_cert(*args, **kwargs)
|
||||
if self.version < (2, 4, 0):
|
||||
self._deploy_loadmodule_ssl_if_needed()
|
||||
|
||||
def _deploy_loadmodule_ssl_if_needed(self):
|
||||
"""
|
||||
Add "LoadModule ssl_module <pre-existing path>" to main httpd.conf if
|
||||
it doesn't exist there already.
|
||||
"""
|
||||
|
||||
loadmods = self.parser.find_dir("LoadModule", "ssl_module", exclude=False)
|
||||
|
||||
correct_ifmods = [] # type: List[str]
|
||||
loadmod_args = [] # type: List[str]
|
||||
loadmod_paths = [] # type: List[str]
|
||||
for m in loadmods:
|
||||
noarg_path = m.rpartition("/")[0]
|
||||
path_args = self.parser.get_all_args(noarg_path)
|
||||
if loadmod_args:
|
||||
if loadmod_args != path_args:
|
||||
msg = ("Certbot encountered multiple LoadModule directives "
|
||||
"for LoadModule ssl_module with differing library paths. "
|
||||
"Please remove or comment out the one(s) that are not in "
|
||||
"use, and run Certbot again.")
|
||||
raise MisconfigurationError(msg)
|
||||
else:
|
||||
loadmod_args = path_args
|
||||
|
||||
if self.parser.not_modssl_ifmodule(noarg_path): # pylint: disable=no-member
|
||||
if self.parser.loc["default"] in noarg_path:
|
||||
# LoadModule already in the main configuration file
|
||||
if ("ifmodule/" in noarg_path.lower() or
|
||||
"ifmodule[1]" in noarg_path.lower()):
|
||||
# It's the first or only IfModule in the file
|
||||
return
|
||||
# Populate the list of known !mod_ssl.c IfModules
|
||||
nodir_path = noarg_path.rpartition("/directive")[0]
|
||||
correct_ifmods.append(nodir_path)
|
||||
else:
|
||||
loadmod_paths.append(noarg_path)
|
||||
|
||||
if not loadmod_args:
|
||||
# Do not try to enable mod_ssl
|
||||
return
|
||||
|
||||
# Force creation as the directive wasn't found from the beginning of
|
||||
# httpd.conf
|
||||
rootconf_ifmod = self.parser.create_ifmod(
|
||||
parser.get_aug_path(self.parser.loc["default"]),
|
||||
"!mod_ssl.c", beginning=True)
|
||||
# parser.get_ifmod returns a path postfixed with "/", remove that
|
||||
self.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", loadmod_args)
|
||||
correct_ifmods.append(rootconf_ifmod[:-1])
|
||||
self.save_notes += "Added LoadModule ssl_module to main configuration.\n"
|
||||
|
||||
# Wrap LoadModule mod_ssl inside of <IfModule !mod_ssl.c> if it's not
|
||||
# configured like this already.
|
||||
for loadmod_path in loadmod_paths:
|
||||
nodir_path = loadmod_path.split("/directive")[0]
|
||||
# Remove the old LoadModule directive
|
||||
self.parser.aug.remove(loadmod_path)
|
||||
|
||||
# Create a new IfModule !mod_ssl.c if not already found on path
|
||||
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c",
|
||||
beginning=True)[:-1]
|
||||
if ssl_ifmod not in correct_ifmods:
|
||||
self.parser.add_dir(ssl_ifmod, "LoadModule", loadmod_args)
|
||||
correct_ifmods.append(ssl_ifmod)
|
||||
self.save_notes += ("Wrapped pre-existing LoadModule ssl_module "
|
||||
"inside of <IfModule !mod_ssl> block.\n")
|
||||
|
||||
|
||||
class CentOSParser(parser.ApacheParser):
|
||||
"""CentOS specific ApacheParser override class"""
|
||||
@@ -174,44 +47,14 @@ class CentOSParser(parser.ApacheParser):
|
||||
self.sysconfig_filep = "/etc/sysconfig/httpd"
|
||||
super(CentOSParser, self).__init__(*args, **kwargs)
|
||||
|
||||
def update_runtime_variables(self):
|
||||
def update_runtime_variables(self, *args, **kwargs):
|
||||
""" Override for update_runtime_variables for custom parsing """
|
||||
# Opportunistic, works if SELinux not enforced
|
||||
super(CentOSParser, self).update_runtime_variables()
|
||||
super(CentOSParser, self).update_runtime_variables(*args, **kwargs)
|
||||
self.parse_sysconfig_var()
|
||||
|
||||
def parse_sysconfig_var(self):
|
||||
""" Parses Apache CLI options from CentOS configuration file """
|
||||
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
|
||||
for k in defines:
|
||||
for k in defines.keys():
|
||||
self.variables[k] = defines[k]
|
||||
|
||||
def not_modssl_ifmodule(self, path):
|
||||
"""Checks if the provided Augeas path has argument !mod_ssl"""
|
||||
|
||||
if "ifmodule" not in path.lower():
|
||||
return False
|
||||
|
||||
# Trim the path to the last ifmodule
|
||||
workpath = path.lower()
|
||||
while workpath:
|
||||
# Get path to the last IfModule (ignore the tail)
|
||||
parts = workpath.rpartition("ifmodule")
|
||||
|
||||
if not parts[0]:
|
||||
# IfModule not found
|
||||
break
|
||||
ifmod_path = parts[0] + parts[1]
|
||||
# Check if ifmodule had an index
|
||||
if parts[2].startswith("["):
|
||||
# Append the index from tail
|
||||
ifmod_path += parts[2].partition("/")[0]
|
||||
# Get the original path trimmed to correct length
|
||||
# This is required to preserve cases
|
||||
ifmod_real_path = path[0:len(ifmod_path)]
|
||||
if "!mod_ssl.c" in self.get_all_args(ifmod_real_path):
|
||||
return True
|
||||
# Set the workpath to the heading part
|
||||
workpath = parts[0]
|
||||
|
||||
return False
|
||||
|
||||
@@ -16,14 +16,14 @@ class DarwinConfigurator(configurator.ApacheConfigurator):
|
||||
vhost_root="/etc/apache2/other",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/apache2",
|
||||
ctl="apachectl",
|
||||
version_cmd=['apachectl', '-v'],
|
||||
version_cmd=['/usr/sbin/httpd', '-v'],
|
||||
apache_cmd="/usr/sbin/httpd",
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_mods=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/apache2/other",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
|
||||
@@ -1,21 +1,19 @@
|
||||
""" Distribution specific override class for Debian family (Ubuntu/Debian) """
|
||||
import logging
|
||||
|
||||
import os
|
||||
import pkg_resources
|
||||
|
||||
import zope.interface
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import configurator
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
"""Debian specific ApacheConfigurator override class"""
|
||||
@@ -25,14 +23,14 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
vhost_root="/etc/apache2/sites-available",
|
||||
vhost_files="*",
|
||||
logs_root="/var/log/apache2",
|
||||
ctl="apache2ctl",
|
||||
version_cmd=['apache2ctl', '-v'],
|
||||
apache_cmd="apache2ctl",
|
||||
restart_cmd=['apache2ctl', 'graceful'],
|
||||
conftest_cmd=['apache2ctl', 'configtest'],
|
||||
enmod="a2enmod",
|
||||
dismod="a2dismod",
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=True,
|
||||
handle_mods=True,
|
||||
handle_sites=True,
|
||||
challenge_location="/etc/apache2",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
@@ -53,7 +51,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
|
||||
"""
|
||||
if vhost.enabled:
|
||||
return None
|
||||
return
|
||||
|
||||
enabled_path = ("%s/sites-enabled/%s" %
|
||||
(self.parser.root,
|
||||
@@ -66,11 +64,11 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
try:
|
||||
os.symlink(vhost.filep, enabled_path)
|
||||
except OSError as err:
|
||||
if os.path.islink(enabled_path) and filesystem.realpath(
|
||||
if os.path.islink(enabled_path) and os.path.realpath(
|
||||
enabled_path) == vhost.filep:
|
||||
# Already in shape
|
||||
vhost.enabled = True
|
||||
return None
|
||||
return
|
||||
else:
|
||||
logger.warning(
|
||||
"Could not symlink %s to %s, got error: %s", enabled_path,
|
||||
@@ -83,9 +81,9 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
vhost.enabled = True
|
||||
logger.info("Enabling available site: %s", vhost.filep)
|
||||
self.save_notes += "Enabled site %s\n" % vhost.filep
|
||||
return None
|
||||
|
||||
def enable_mod(self, mod_name, temp=False):
|
||||
# pylint: disable=unused-argument
|
||||
"""Enables module in Apache.
|
||||
|
||||
Both enables and reloads Apache so module is active.
|
||||
@@ -136,11 +134,11 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
# Generate reversal command.
|
||||
# Try to be safe here... check that we can probably reverse before
|
||||
# applying enmod command
|
||||
if not util.exe_exists(self.option("dismod")):
|
||||
if not util.exe_exists(self.conf("dismod")):
|
||||
raise errors.MisconfigurationError(
|
||||
"Unable to find a2dismod, please make sure a2enmod and "
|
||||
"a2dismod are configured correctly for certbot.")
|
||||
|
||||
self.reverter.register_undo_command(
|
||||
temp, [self.option("dismod"), "-f", mod_name])
|
||||
util.run_script([self.option("enmod"), mod_name])
|
||||
temp, [self.conf("dismod"), "-f", mod_name])
|
||||
util.run_script([self.conf("enmod"), mod_name])
|
||||
|
||||
@@ -1,98 +0,0 @@
|
||||
""" Distribution specific override class for Fedora 29+ """
|
||||
import pkg_resources
|
||||
import zope.interface
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import configurator
|
||||
from certbot_apache import parser
|
||||
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class FedoraConfigurator(configurator.ApacheConfigurator):
|
||||
"""Fedora 29+ specific ApacheConfigurator override class"""
|
||||
|
||||
OS_DEFAULTS = dict(
|
||||
server_root="/etc/httpd",
|
||||
vhost_root="/etc/httpd/conf.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/httpd",
|
||||
ctl="httpd",
|
||||
version_cmd=['httpd', '-v'],
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
restart_cmd_alt=['apachectl', 'restart'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/httpd/conf.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
# TODO: eventually newest version of Fedora will need their own config
|
||||
"certbot_apache", "centos-options-ssl-apache.conf")
|
||||
)
|
||||
|
||||
def config_test(self):
|
||||
"""
|
||||
Override config_test to mitigate configtest error in vanilla installation
|
||||
of mod_ssl in Fedora. The error is caused by non-existent self-signed
|
||||
certificates referenced by the configuration, that would be autogenerated
|
||||
during the first (re)start of httpd.
|
||||
"""
|
||||
try:
|
||||
super(FedoraConfigurator, self).config_test()
|
||||
except errors.MisconfigurationError:
|
||||
self._try_restart_fedora()
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return FedoraParser(
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _try_restart_fedora(self):
|
||||
"""
|
||||
Tries to restart httpd using systemctl to generate the self signed keypair.
|
||||
"""
|
||||
try:
|
||||
util.run_script(['systemctl', 'restart', 'httpd'])
|
||||
except errors.SubprocessError as err:
|
||||
raise errors.MisconfigurationError(str(err))
|
||||
|
||||
# Finish with actual config check to see if systemctl restart helped
|
||||
super(FedoraConfigurator, self).config_test()
|
||||
|
||||
def _prepare_options(self):
|
||||
"""
|
||||
Override the options dictionary initialization to keep using apachectl
|
||||
instead of httpd and so take advantages of this new bash script in newer versions
|
||||
of Fedora to restart httpd.
|
||||
"""
|
||||
super(FedoraConfigurator, self)._prepare_options()
|
||||
self.options["restart_cmd"][0] = 'apachectl'
|
||||
self.options["restart_cmd_alt"][0] = 'apachectl'
|
||||
self.options["conftest_cmd"][0] = 'apachectl'
|
||||
|
||||
|
||||
class FedoraParser(parser.ApacheParser):
|
||||
"""Fedora 29+ specific ApacheParser override class"""
|
||||
def __init__(self, *args, **kwargs):
|
||||
# Fedora 29+ specific configuration file for Apache
|
||||
self.sysconfig_filep = "/etc/sysconfig/httpd"
|
||||
super(FedoraParser, self).__init__(*args, **kwargs)
|
||||
|
||||
def update_runtime_variables(self):
|
||||
""" Override for update_runtime_variables for custom parsing """
|
||||
# Opportunistic, works if SELinux not enforced
|
||||
super(FedoraParser, self).update_runtime_variables()
|
||||
self._parse_sysconfig_var()
|
||||
|
||||
def _parse_sysconfig_var(self):
|
||||
""" Parses Apache CLI options from Fedora configuration file """
|
||||
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
|
||||
for k in defines:
|
||||
self.variables[k] = defines[k]
|
||||
@@ -18,33 +18,25 @@ class GentooConfigurator(configurator.ApacheConfigurator):
|
||||
vhost_root="/etc/apache2/vhosts.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/apache2",
|
||||
ctl="apache2ctl",
|
||||
version_cmd=['apache2ctl', '-v'],
|
||||
version_cmd=['/usr/sbin/apache2', '-v'],
|
||||
apache_cmd="apache2ctl",
|
||||
restart_cmd=['apache2ctl', 'graceful'],
|
||||
restart_cmd_alt=['apache2ctl', 'restart'],
|
||||
conftest_cmd=['apache2ctl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_mods=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/apache2/vhosts.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", "options-ssl-apache.conf")
|
||||
)
|
||||
|
||||
def _prepare_options(self):
|
||||
"""
|
||||
Override the options dictionary initialization in order to support
|
||||
alternative restart cmd used in Gentoo.
|
||||
"""
|
||||
super(GentooConfigurator, self)._prepare_options()
|
||||
self.options["restart_cmd_alt"][0] = self.option("ctl")
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return GentooParser(
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.aug, self.conf("server-root"), self.conf("vhost-root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
|
||||
@@ -64,12 +56,12 @@ class GentooParser(parser.ApacheParser):
|
||||
""" Parses Apache CLI options from Gentoo configuration file """
|
||||
defines = apache_util.parse_define_file(self.apacheconfig_filep,
|
||||
"APACHE2_OPTS")
|
||||
for k in defines:
|
||||
for k in defines.keys():
|
||||
self.variables[k] = defines[k]
|
||||
|
||||
def update_modules(self):
|
||||
"""Get loaded modules from httpd process, and add them to DOM"""
|
||||
mod_cmd = [self.configurator.option("ctl"), "modules"]
|
||||
mod_cmd = [self.configurator.constant("apache_cmd"), "modules"]
|
||||
matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
|
||||
for mod in matches:
|
||||
self.add_mod(mod.strip())
|
||||
|
||||
@@ -16,14 +16,14 @@ class OpenSUSEConfigurator(configurator.ApacheConfigurator):
|
||||
vhost_root="/etc/apache2/vhosts.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/apache2",
|
||||
ctl="apache2ctl",
|
||||
version_cmd=['apache2ctl', '-v'],
|
||||
apache_cmd="apache2ctl",
|
||||
restart_cmd=['apache2ctl', 'graceful'],
|
||||
conftest_cmd=['apache2ctl', 'configtest'],
|
||||
enmod="a2enmod",
|
||||
dismod="a2dismod",
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_mods=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/apache2/vhosts.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
|
||||
@@ -2,18 +2,14 @@
|
||||
import copy
|
||||
import fnmatch
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
import six
|
||||
|
||||
from acme.magic_typing import Dict, List, Set # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import constants
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@@ -33,7 +29,7 @@ class ApacheParser(object):
|
||||
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
|
||||
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
|
||||
|
||||
def __init__(self, root, vhostroot=None, version=(2, 4),
|
||||
def __init__(self, aug, root, vhostroot=None, version=(2, 4),
|
||||
configurator=None):
|
||||
# Note: Order is important here.
|
||||
|
||||
@@ -42,20 +38,11 @@ class ApacheParser(object):
|
||||
# issues with aug.load() after adding new files / defines to parse tree
|
||||
self.configurator = configurator
|
||||
|
||||
# Initialize augeas
|
||||
self.aug = None
|
||||
self.init_augeas()
|
||||
|
||||
if not self.check_aug_version():
|
||||
raise errors.NotSupportedError(
|
||||
"Apache plugin support requires libaugeas0 and augeas-lenses "
|
||||
"version 1.2.0 or higher, please make sure you have you have "
|
||||
"those installed.")
|
||||
|
||||
self.modules = set() # type: Set[str]
|
||||
self.parser_paths = {} # type: Dict[str, List[str]]
|
||||
self.variables = {} # type: Dict[str, str]
|
||||
self.modules = set()
|
||||
self.parser_paths = {}
|
||||
self.variables = {}
|
||||
|
||||
self.aug = aug
|
||||
# Find configuration root and make sure augeas can parse it.
|
||||
self.root = os.path.abspath(root)
|
||||
self.loc = {"root": self._find_config_root()}
|
||||
@@ -80,153 +67,13 @@ class ApacheParser(object):
|
||||
# Must also attempt to parse additional virtual host root
|
||||
if vhostroot:
|
||||
self.parse_file(os.path.abspath(vhostroot) + "/" +
|
||||
self.configurator.option("vhost_files"))
|
||||
self.configurator.constant("vhost_files"))
|
||||
|
||||
# check to see if there were unparsed define statements
|
||||
if version < (2, 4):
|
||||
if self.find_dir("Define", exclude=False):
|
||||
raise errors.PluginError("Error parsing runtime variables")
|
||||
|
||||
def init_augeas(self):
|
||||
""" Initialize the actual Augeas instance """
|
||||
|
||||
try:
|
||||
import augeas
|
||||
except ImportError: # pragma: no cover
|
||||
raise errors.NoInstallationError("Problem in Augeas installation")
|
||||
|
||||
self.aug = augeas.Augeas(
|
||||
# specify a directory to load our preferred lens from
|
||||
loadpath=constants.AUGEAS_LENS_DIR,
|
||||
# Do not save backup (we do it ourselves), do not load
|
||||
# anything by default
|
||||
flags=(augeas.Augeas.NONE |
|
||||
augeas.Augeas.NO_MODL_AUTOLOAD |
|
||||
augeas.Augeas.ENABLE_SPAN))
|
||||
|
||||
def check_parsing_errors(self, lens):
|
||||
"""Verify Augeas can parse all of the lens files.
|
||||
|
||||
:param str lens: lens to check for errors
|
||||
|
||||
:raises .errors.PluginError: If there has been an error in parsing with
|
||||
the specified lens.
|
||||
|
||||
"""
|
||||
error_files = self.aug.match("/augeas//error")
|
||||
|
||||
for path in error_files:
|
||||
# Check to see if it was an error resulting from the use of
|
||||
# the httpd lens
|
||||
lens_path = self.aug.get(path + "/lens")
|
||||
# As aug.get may return null
|
||||
if lens_path and lens in lens_path:
|
||||
msg = (
|
||||
"There has been an error in parsing the file {0} on line {1}: "
|
||||
"{2}".format(
|
||||
# Strip off /augeas/files and /error
|
||||
path[13:len(path) - 6],
|
||||
self.aug.get(path + "/line"),
|
||||
self.aug.get(path + "/message")))
|
||||
raise errors.PluginError(msg)
|
||||
|
||||
def check_aug_version(self):
|
||||
""" Checks that we have recent enough version of libaugeas.
|
||||
If augeas version is recent enough, it will support case insensitive
|
||||
regexp matching"""
|
||||
|
||||
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
|
||||
try:
|
||||
matches = self.aug.match(
|
||||
"/test//*[self::arg=~regexp('argument', 'i')]")
|
||||
except RuntimeError:
|
||||
self.aug.remove("/test/path")
|
||||
return False
|
||||
self.aug.remove("/test/path")
|
||||
return matches
|
||||
|
||||
def unsaved_files(self):
|
||||
"""Lists files that have modified Augeas DOM but the changes have not
|
||||
been written to the filesystem yet, used by `self.save()` and
|
||||
ApacheConfigurator to check the file state.
|
||||
|
||||
:raises .errors.PluginError: If there was an error in Augeas, in
|
||||
an attempt to save the configuration, or an error creating a
|
||||
checkpoint
|
||||
|
||||
:returns: `set` of unsaved files
|
||||
"""
|
||||
save_state = self.aug.get("/augeas/save")
|
||||
self.aug.set("/augeas/save", "noop")
|
||||
# Existing Errors
|
||||
ex_errs = self.aug.match("/augeas//error")
|
||||
try:
|
||||
# This is a noop save
|
||||
self.aug.save()
|
||||
except (RuntimeError, IOError):
|
||||
self._log_save_errors(ex_errs)
|
||||
# Erase Save Notes
|
||||
self.configurator.save_notes = ""
|
||||
raise errors.PluginError(
|
||||
"Error saving files, check logs for more info.")
|
||||
|
||||
# Return the original save method
|
||||
self.aug.set("/augeas/save", save_state)
|
||||
|
||||
# Retrieve list of modified files
|
||||
# Note: Noop saves can cause the file to be listed twice, I used a
|
||||
# set to remove this possibility. This is a known augeas 0.10 error.
|
||||
save_paths = self.aug.match("/augeas/events/saved")
|
||||
|
||||
save_files = set()
|
||||
if save_paths:
|
||||
for path in save_paths:
|
||||
save_files.add(self.aug.get(path)[6:])
|
||||
return save_files
|
||||
|
||||
def ensure_augeas_state(self):
|
||||
"""Makes sure that all Augeas dom changes are written to files to avoid
|
||||
loss of configuration directives when doing additional augeas parsing,
|
||||
causing a possible augeas.load() resulting dom reset
|
||||
"""
|
||||
|
||||
if self.unsaved_files():
|
||||
self.configurator.save_notes += "(autosave)"
|
||||
self.configurator.save()
|
||||
|
||||
def save(self, save_files):
|
||||
"""Saves all changes to the configuration files.
|
||||
|
||||
save() is called from ApacheConfigurator to handle the parser specific
|
||||
tasks of saving.
|
||||
|
||||
:param list save_files: list of strings of file paths that we need to save.
|
||||
|
||||
"""
|
||||
self.configurator.save_notes = ""
|
||||
self.aug.save()
|
||||
|
||||
# Force reload if files were modified
|
||||
# This is needed to recalculate augeas directive span
|
||||
if save_files:
|
||||
for sf in save_files:
|
||||
self.aug.remove("/files/"+sf)
|
||||
self.aug.load()
|
||||
|
||||
def _log_save_errors(self, ex_errs):
|
||||
"""Log errors due to bad Augeas save.
|
||||
|
||||
:param list ex_errs: Existing errors before save
|
||||
|
||||
"""
|
||||
# Check for the root of save problems
|
||||
new_errs = self.aug.match("/augeas//error")
|
||||
# logger.error("During Save - %s", mod_conf)
|
||||
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
|
||||
", ".join(err[13:len(err) - 6] for err in new_errs
|
||||
# Only new errors caused by recent save
|
||||
if err not in ex_errs), self.configurator.save_notes)
|
||||
|
||||
def add_include(self, main_config, inc_path):
|
||||
"""Add Include for a new configuration file if one does not exist
|
||||
|
||||
@@ -234,7 +81,7 @@ class ApacheParser(object):
|
||||
:param str inc_path: path of file to include
|
||||
|
||||
"""
|
||||
if not self.find_dir(case_i("Include"), inc_path):
|
||||
if len(self.find_dir(case_i("Include"), inc_path)) == 0:
|
||||
logger.debug("Adding Include %s to %s",
|
||||
inc_path, get_aug_path(main_config))
|
||||
self.add_dir(
|
||||
@@ -244,7 +91,12 @@ class ApacheParser(object):
|
||||
# Add new path to parser paths
|
||||
new_dir = os.path.dirname(inc_path)
|
||||
new_file = os.path.basename(inc_path)
|
||||
self.existing_paths.setdefault(new_dir, []).append(new_file)
|
||||
if new_dir in self.existing_paths.keys():
|
||||
# Add to existing path
|
||||
self.existing_paths[new_dir].append(new_file)
|
||||
else:
|
||||
# Create a new path
|
||||
self.existing_paths[new_dir] = [new_file]
|
||||
|
||||
def add_mod(self, mod_name):
|
||||
"""Shortcut for updating parser modules."""
|
||||
@@ -267,7 +119,7 @@ class ApacheParser(object):
|
||||
the iteration issue. Else... parse and enable mods at same time.
|
||||
|
||||
"""
|
||||
mods = set() # type: Set[str]
|
||||
mods = set()
|
||||
matches = self.find_dir("LoadModule")
|
||||
iterator = iter(matches)
|
||||
# Make sure prev_size != cur_size for do: while: iteration
|
||||
@@ -285,7 +137,7 @@ class ApacheParser(object):
|
||||
mods.add(os.path.basename(mod_filename)[:-2] + "c")
|
||||
else:
|
||||
logger.debug("Could not read LoadModule directive from " +
|
||||
"Augeas path: %s", match_name[6:])
|
||||
"Augeas path: {0}".format(match_name[6:]))
|
||||
self.modules.update(mods)
|
||||
|
||||
def update_runtime_variables(self):
|
||||
@@ -298,7 +150,7 @@ class ApacheParser(object):
|
||||
"""Get Defines from httpd process"""
|
||||
|
||||
variables = dict()
|
||||
define_cmd = [self.configurator.option("ctl"), "-t", "-D",
|
||||
define_cmd = [self.configurator.constant("apache_cmd"), "-t", "-D",
|
||||
"DUMP_RUN_CFG"]
|
||||
matches = self.parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
|
||||
try:
|
||||
@@ -325,7 +177,7 @@ class ApacheParser(object):
|
||||
# configuration files
|
||||
_ = self.find_dir("Include")
|
||||
|
||||
inc_cmd = [self.configurator.option("ctl"), "-t", "-D",
|
||||
inc_cmd = [self.configurator.constant("apache_cmd"), "-t", "-D",
|
||||
"DUMP_INCLUDES"]
|
||||
matches = self.parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
|
||||
if matches:
|
||||
@@ -336,7 +188,7 @@ class ApacheParser(object):
|
||||
def update_modules(self):
|
||||
"""Get loaded modules from httpd process, and add them to DOM"""
|
||||
|
||||
mod_cmd = [self.configurator.option("ctl"), "-t", "-D",
|
||||
mod_cmd = [self.configurator.constant("apache_cmd"), "-t", "-D",
|
||||
"DUMP_MODULES"]
|
||||
matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
|
||||
for mod in matches:
|
||||
@@ -375,8 +227,8 @@ class ApacheParser(object):
|
||||
"Error running command %s for runtime parameters!%s",
|
||||
command, os.linesep)
|
||||
raise errors.MisconfigurationError(
|
||||
"Error accessing loaded Apache parameters: {0}".format(
|
||||
command))
|
||||
"Error accessing loaded Apache parameters: %s",
|
||||
command)
|
||||
# Small errors that do not impede
|
||||
if proc.returncode != 0:
|
||||
logger.warning("Error in checking parameter list: %s", stderr)
|
||||
@@ -402,12 +254,12 @@ class ApacheParser(object):
|
||||
"""
|
||||
filtered = []
|
||||
if args == 1:
|
||||
for i, match in enumerate(matches):
|
||||
if match.endswith("/arg"):
|
||||
for i in range(len(matches)):
|
||||
if matches[i].endswith("/arg"):
|
||||
filtered.append(matches[i][:-4])
|
||||
else:
|
||||
for i, match in enumerate(matches):
|
||||
if match.endswith("/arg[%d]" % args):
|
||||
for i in range(len(matches)):
|
||||
if matches[i].endswith("/arg[%d]" % args):
|
||||
# Make sure we don't cause an IndexError (end of list)
|
||||
# Check to make sure arg + 1 doesn't exist
|
||||
if (i == (len(matches) - 1) or
|
||||
@@ -432,7 +284,7 @@ class ApacheParser(object):
|
||||
"""
|
||||
# TODO: Add error checking code... does the path given even exist?
|
||||
# Does it throw exceptions?
|
||||
if_mod_path = self.get_ifmod(aug_conf_path, "mod_ssl.c")
|
||||
if_mod_path = self._get_ifmod(aug_conf_path, "mod_ssl.c")
|
||||
# IfModule can have only one valid argument, so append after
|
||||
self.aug.insert(if_mod_path + "arg", "directive", False)
|
||||
nvh_path = if_mod_path + "directive[1]"
|
||||
@@ -443,54 +295,22 @@ class ApacheParser(object):
|
||||
for i, arg in enumerate(args):
|
||||
self.aug.set("%s/arg[%d]" % (nvh_path, i + 1), arg)
|
||||
|
||||
def get_ifmod(self, aug_conf_path, mod, beginning=False):
|
||||
def _get_ifmod(self, aug_conf_path, mod):
|
||||
"""Returns the path to <IfMod mod> and creates one if it doesn't exist.
|
||||
|
||||
:param str aug_conf_path: Augeas configuration path
|
||||
:param str mod: module ie. mod_ssl.c
|
||||
:param bool beginning: If the IfModule should be created to the beginning
|
||||
of augeas path DOM tree.
|
||||
|
||||
:returns: Augeas path of the requested IfModule directive that pre-existed
|
||||
or was created during the process. The path may be dynamic,
|
||||
i.e. .../IfModule[last()]
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
if_mods = self.aug.match(("%s/IfModule/*[self::arg='%s']" %
|
||||
(aug_conf_path, mod)))
|
||||
if not if_mods:
|
||||
return self.create_ifmod(aug_conf_path, mod, beginning)
|
||||
|
||||
if len(if_mods) == 0:
|
||||
self.aug.set("%s/IfModule[last() + 1]" % aug_conf_path, "")
|
||||
self.aug.set("%s/IfModule[last()]/arg" % aug_conf_path, mod)
|
||||
if_mods = self.aug.match(("%s/IfModule/*[self::arg='%s']" %
|
||||
(aug_conf_path, mod)))
|
||||
# Strip off "arg" at end of first ifmod path
|
||||
return if_mods[0].rpartition("arg")[0]
|
||||
|
||||
def create_ifmod(self, aug_conf_path, mod, beginning=False):
|
||||
"""Creates a new <IfMod mod> and returns its path.
|
||||
|
||||
:param str aug_conf_path: Augeas configuration path
|
||||
:param str mod: module ie. mod_ssl.c
|
||||
:param bool beginning: If the IfModule should be created to the beginning
|
||||
of augeas path DOM tree.
|
||||
|
||||
:returns: Augeas path of the newly created IfModule directive.
|
||||
The path may be dynamic, i.e. .../IfModule[last()]
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
if beginning:
|
||||
c_path_arg = "{}/IfModule[1]/arg".format(aug_conf_path)
|
||||
# Insert IfModule before the first directive
|
||||
self.aug.insert("{}/directive[1]".format(aug_conf_path),
|
||||
"IfModule", True)
|
||||
retpath = "{}/IfModule[1]/".format(aug_conf_path)
|
||||
else:
|
||||
c_path = "{}/IfModule[last() + 1]".format(aug_conf_path)
|
||||
c_path_arg = "{}/IfModule[last()]/arg".format(aug_conf_path)
|
||||
self.aug.set(c_path, "")
|
||||
retpath = "{}/IfModule[last()]/".format(aug_conf_path)
|
||||
self.aug.set(c_path_arg, mod)
|
||||
return retpath
|
||||
return if_mods[0][:len(if_mods[0]) - 3]
|
||||
|
||||
def add_dir(self, aug_conf_path, directive, args):
|
||||
"""Appends directive to the end fo the file given by aug_conf_path.
|
||||
@@ -529,37 +349,6 @@ class ApacheParser(object):
|
||||
else:
|
||||
self.aug.set(first_dir + "/arg", args)
|
||||
|
||||
def add_comment(self, aug_conf_path, comment):
|
||||
"""Adds the comment to the augeas path
|
||||
|
||||
:param str aug_conf_path: Augeas configuration path to add directive
|
||||
:param str comment: Comment content
|
||||
|
||||
"""
|
||||
self.aug.set(aug_conf_path + "/#comment[last() + 1]", comment)
|
||||
|
||||
def find_comments(self, arg, start=None):
|
||||
"""Finds a comment with specified content from the provided DOM path
|
||||
|
||||
:param str arg: Comment content to search
|
||||
:param str start: Beginning Augeas path to begin looking
|
||||
|
||||
:returns: List of augeas paths containing the comment content
|
||||
:rtype: list
|
||||
|
||||
"""
|
||||
if not start:
|
||||
start = get_aug_path(self.root)
|
||||
|
||||
comments = self.aug.match("%s//*[label() = '#comment']" % start)
|
||||
|
||||
results = []
|
||||
for comment in comments:
|
||||
c_content = self.aug.get(comment)
|
||||
if c_content and arg in c_content:
|
||||
results.append(comment)
|
||||
return results
|
||||
|
||||
def find_dir(self, directive, arg=None, start=None, exclude=True):
|
||||
"""Finds directive in the configuration.
|
||||
|
||||
@@ -619,7 +408,7 @@ class ApacheParser(object):
|
||||
else:
|
||||
arg_suffix = "/*[self::arg=~regexp('%s')]" % case_i(arg)
|
||||
|
||||
ordered_matches = [] # type: List[str]
|
||||
ordered_matches = []
|
||||
|
||||
# TODO: Wildcards should be included in alphabetical order
|
||||
# https://httpd.apache.org/docs/2.4/mod/core.html#include
|
||||
@@ -636,20 +425,6 @@ class ApacheParser(object):
|
||||
|
||||
return ordered_matches
|
||||
|
||||
def get_all_args(self, match):
|
||||
"""
|
||||
Tries to fetch all arguments for a directive. See get_arg.
|
||||
|
||||
Note that if match is an ancestor node, it returns all names of
|
||||
child directives as well as the list of arguments.
|
||||
|
||||
"""
|
||||
|
||||
if match[-1] != "/":
|
||||
match = match+"/"
|
||||
allargs = self.aug.match(match + '*')
|
||||
return [self.get_arg(arg) for arg in allargs]
|
||||
|
||||
def get_arg(self, match):
|
||||
"""Uses augeas.get to get argument value and interprets result.
|
||||
|
||||
@@ -793,8 +568,9 @@ class ApacheParser(object):
|
||||
if sys.version_info < (3, 6):
|
||||
# This strips off final /Z(?ms)
|
||||
return fnmatch.translate(clean_fn_match)[:-7]
|
||||
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
|
||||
return fnmatch.translate(clean_fn_match)[4:-3] # pragma: no cover
|
||||
else: # pragma: no cover
|
||||
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
|
||||
return fnmatch.translate(clean_fn_match)[4:-3]
|
||||
|
||||
def parse_file(self, filepath):
|
||||
"""Parse file with Augeas
|
||||
@@ -808,7 +584,8 @@ class ApacheParser(object):
|
||||
use_new, remove_old = self._check_path_actions(filepath)
|
||||
# Ensure that we have the latest Augeas DOM state on disk before
|
||||
# calling aug.load() which reloads the state from disk
|
||||
self.ensure_augeas_state()
|
||||
if self.configurator:
|
||||
self.configurator.ensure_augeas_state()
|
||||
# Test if augeas included file for Httpd.lens
|
||||
# Note: This works for augeas globs, ie. *.conf
|
||||
if use_new:
|
||||
@@ -875,7 +652,10 @@ class ApacheParser(object):
|
||||
use_new = False
|
||||
else:
|
||||
use_new = True
|
||||
remove_old = new_file_match == "*"
|
||||
if new_file_match == "*":
|
||||
remove_old = True
|
||||
else:
|
||||
remove_old = False
|
||||
except KeyError:
|
||||
use_new = True
|
||||
remove_old = False
|
||||
|
||||
@@ -3,11 +3,6 @@
|
||||
# A hackish script to see if the client is behaving as expected
|
||||
# with each of the "passing" conf files.
|
||||
|
||||
if [ -z "$SERVER" ]; then
|
||||
echo "Please set SERVER to the ACME server's directory URL."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
export EA=/etc/apache2/
|
||||
TESTDIR="`dirname $0`"
|
||||
cd $TESTDIR/passing
|
||||
@@ -51,7 +46,6 @@ function Cleanup() {
|
||||
|
||||
# if our environment asks us to enable modules, do our best!
|
||||
if [ "$1" = --debian-modules ] ; then
|
||||
sudo apt-get install -y apache2
|
||||
sudo apt-get install -y libapache2-mod-wsgi
|
||||
sudo apt-get install -y libapache2-mod-macro
|
||||
|
||||
@@ -61,16 +55,13 @@ if [ "$1" = --debian-modules ] ; then
|
||||
done
|
||||
fi
|
||||
|
||||
CERTBOT_CMD="sudo $(command -v certbot) --server $SERVER -vvvv"
|
||||
CERTBOT_CMD="$CERTBOT_CMD --debug --apache --register-unsafely-without-email"
|
||||
CERTBOT_CMD="$CERTBOT_CMD --agree-tos certonly -t --no-verify-ssl"
|
||||
|
||||
FAILS=0
|
||||
trap CleanupExit INT
|
||||
for f in *.conf ; do
|
||||
echo -n testing "$f"...
|
||||
Setup
|
||||
RESULT=`echo c | $CERTBOT_CMD 2>&1`
|
||||
RESULT=`echo c | sudo $(command -v certbot) -vvvv --debug --staging --apache --register-unsafely-without-email --agree-tos certonly -t 2>&1`
|
||||
if echo $RESULT | grep -Eq \("Which names would you like"\|"mod_macro is not yet"\) ; then
|
||||
echo passed
|
||||
else
|
||||
|
||||
@@ -1,27 +0,0 @@
|
||||
#!/usr/bin/env python
|
||||
"""
|
||||
This executable script wraps the apache-conf-test bash script, in order to setup a pebble instance
|
||||
before its execution. Directory URL is passed through the SERVER environment variable.
|
||||
"""
|
||||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
from certbot_integration_tests.utils import acme_server
|
||||
|
||||
SCRIPT_DIRNAME = os.path.dirname(__file__)
|
||||
|
||||
|
||||
def main(args=None):
|
||||
if not args:
|
||||
args = sys.argv[1:]
|
||||
with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
|
||||
environ = os.environ.copy()
|
||||
environ['SERVER'] = acme_xdist['directory_url']
|
||||
command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]
|
||||
command.extend(args)
|
||||
return subprocess.call(command, env=environ)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
sys.exit(main())
|
||||
@@ -1,7 +1,7 @@
|
||||
#LoadModule ssl_module modules/mod_ssl.so
|
||||
|
||||
Listen 4443
|
||||
<VirtualHost *:4443>
|
||||
Listen 443
|
||||
<VirtualHost *:443>
|
||||
# The ServerName directive sets the request scheme, hostname and port that
|
||||
# the server uses to identify itself. This is used when creating
|
||||
# redirection URLs. In the context of virtual hosts, the ServerName
|
||||
|
||||
@@ -1,4 +1,5 @@
|
||||
"""Test for certbot_apache.configurator implementations of reverter"""
|
||||
"""Test for certbot_apache.augeas_configurator."""
|
||||
import os
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
@@ -9,12 +10,12 @@ from certbot import errors
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
class ConfiguratorReverterTest(util.ApacheTest):
|
||||
"""Test for ApacheConfigurator reverter methods"""
|
||||
class AugeasConfiguratorTest(util.ApacheTest):
|
||||
"""Test for Augeas Configurator base class."""
|
||||
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(ConfiguratorReverterTest, self).setUp()
|
||||
super(AugeasConfiguratorTest, self).setUp()
|
||||
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir)
|
||||
@@ -27,6 +28,20 @@ class ConfiguratorReverterTest(util.ApacheTest):
|
||||
shutil.rmtree(self.work_dir)
|
||||
shutil.rmtree(self.temp_dir)
|
||||
|
||||
def test_bad_parse(self):
|
||||
# pylint: disable=protected-access
|
||||
self.config.parser.parse_file(os.path.join(
|
||||
self.config.parser.root, "conf-available", "bad_conf_file.conf"))
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.config.check_parsing_errors, "httpd.aug")
|
||||
|
||||
def test_bad_save(self):
|
||||
mock_save = mock.Mock()
|
||||
mock_save.side_effect = IOError
|
||||
self.config.aug.save = mock_save
|
||||
|
||||
self.assertRaises(errors.PluginError, self.config.save)
|
||||
|
||||
def test_bad_save_checkpoint(self):
|
||||
self.config.reverter.add_to_checkpoint = mock.Mock(
|
||||
side_effect=errors.ReverterError)
|
||||
@@ -48,9 +63,23 @@ class ConfiguratorReverterTest(util.ApacheTest):
|
||||
|
||||
self.assertTrue(mock_finalize.is_called)
|
||||
|
||||
def test_recovery_routine(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.aug.load = mock_load
|
||||
|
||||
self.config.recovery_routine()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
|
||||
def test_recovery_routine_error(self):
|
||||
self.config.reverter.recovery_routine = mock.Mock(
|
||||
side_effect=errors.ReverterError)
|
||||
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.config.recovery_routine)
|
||||
|
||||
def test_revert_challenge_config(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.parser.aug.load = mock_load
|
||||
self.config.aug.load = mock_load
|
||||
|
||||
self.config.revert_challenge_config()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
@@ -64,7 +93,7 @@ class ConfiguratorReverterTest(util.ApacheTest):
|
||||
|
||||
def test_rollback_checkpoints(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.parser.aug.load = mock_load
|
||||
self.config.aug.load = mock_load
|
||||
|
||||
self.config.rollback_checkpoints()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
@@ -74,11 +103,13 @@ class ConfiguratorReverterTest(util.ApacheTest):
|
||||
side_effect=errors.ReverterError)
|
||||
self.assertRaises(errors.PluginError, self.config.rollback_checkpoints)
|
||||
|
||||
def test_recovery_routine_reload(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.parser.aug.load = mock_load
|
||||
self.config.recovery_routine()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
def test_view_config_changes(self):
|
||||
self.config.view_config_changes()
|
||||
|
||||
def test_view_config_changes_error(self):
|
||||
self.config.reverter.view_config_changes = mock.Mock(
|
||||
side_effect=errors.ReverterError)
|
||||
self.assertRaises(errors.PluginError, self.config.view_config_changes)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
@@ -1,188 +0,0 @@
|
||||
# pylint: disable=too-many-lines
|
||||
"""Test for certbot_apache.configurator AutoHSTS functionality"""
|
||||
import re
|
||||
import unittest
|
||||
import mock
|
||||
# six is used in mock.patch()
|
||||
import six # pylint: disable=unused-import
|
||||
|
||||
from certbot import errors
|
||||
from certbot_apache import constants
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
class AutoHSTSTest(util.ApacheTest):
|
||||
"""Tests for AutoHSTS feature"""
|
||||
# pylint: disable=protected-access
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(AutoHSTSTest, self).setUp()
|
||||
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir)
|
||||
self.config.parser.modules.add("headers_module")
|
||||
self.config.parser.modules.add("mod_headers.c")
|
||||
self.config.parser.modules.add("ssl_module")
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
|
||||
self.vh_truth = util.get_vh_truth(
|
||||
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
|
||||
|
||||
def get_autohsts_value(self, vh_path):
|
||||
""" Get value from Strict-Transport-Security header """
|
||||
header_path = self.config.parser.find_dir("Header", None, vh_path)
|
||||
if header_path:
|
||||
pat = '(?:[ "]|^)(strict-transport-security)(?:[ "]|$)'
|
||||
for head in header_path:
|
||||
if re.search(pat, self.config.parser.aug.get(head).lower()):
|
||||
return self.config.parser.aug.get(
|
||||
head.replace("arg[3]", "arg[4]"))
|
||||
return None # pragma: no cover
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.enable_mod")
|
||||
def test_autohsts_enable_headers_mod(self, mock_enable, _restart):
|
||||
self.config.parser.modules.discard("headers_module")
|
||||
self.config.parser.modules.discard("mod_header.c")
|
||||
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
|
||||
self.assertTrue(mock_enable.called)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
def test_autohsts_deploy_already_exists(self, _restart):
|
||||
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
|
||||
self.assertRaises(errors.PluginEnhancementAlreadyPresent,
|
||||
self.config.enable_autohsts,
|
||||
mock.MagicMock(), ["ocspvhost.com"])
|
||||
|
||||
@mock.patch("certbot_apache.constants.AUTOHSTS_FREQ", 0)
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.prepare")
|
||||
def test_autohsts_increase(self, mock_prepare, _mock_restart):
|
||||
self.config._prepared = False
|
||||
maxage = "\"max-age={0}\""
|
||||
initial_val = maxage.format(constants.AUTOHSTS_STEPS[0])
|
||||
inc_val = maxage.format(constants.AUTOHSTS_STEPS[1])
|
||||
|
||||
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
|
||||
# Verify initial value
|
||||
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
initial_val)
|
||||
# Increase
|
||||
self.config.update_autohsts(mock.MagicMock())
|
||||
# Verify increased value
|
||||
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
inc_val)
|
||||
self.assertTrue(mock_prepare.called)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator._autohsts_increase")
|
||||
def test_autohsts_increase_noop(self, mock_increase, _restart):
|
||||
maxage = "\"max-age={0}\""
|
||||
initial_val = maxage.format(constants.AUTOHSTS_STEPS[0])
|
||||
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
|
||||
# Verify initial value
|
||||
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
initial_val)
|
||||
|
||||
self.config.update_autohsts(mock.MagicMock())
|
||||
# Freq not patched, so value shouldn't increase
|
||||
self.assertFalse(mock_increase.called)
|
||||
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
@mock.patch("certbot_apache.constants.AUTOHSTS_FREQ", 0)
|
||||
def test_autohsts_increase_no_header(self, _restart):
|
||||
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
|
||||
# Remove the header
|
||||
dir_locs = self.config.parser.find_dir("Header", None,
|
||||
self.vh_truth[7].path)
|
||||
dir_loc = "/".join(dir_locs[0].split("/")[:-1])
|
||||
self.config.parser.aug.remove(dir_loc)
|
||||
self.assertRaises(errors.PluginError,
|
||||
self.config.update_autohsts,
|
||||
mock.MagicMock())
|
||||
|
||||
@mock.patch("certbot_apache.constants.AUTOHSTS_FREQ", 0)
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
def test_autohsts_increase_and_make_permanent(self, _mock_restart):
|
||||
maxage = "\"max-age={0}\""
|
||||
max_val = maxage.format(constants.AUTOHSTS_PERMANENT)
|
||||
mock_lineage = mock.MagicMock()
|
||||
mock_lineage.key_path = "/etc/apache2/ssl/key-certbot_15.pem"
|
||||
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
|
||||
for i in range(len(constants.AUTOHSTS_STEPS)-1):
|
||||
# Ensure that value is not made permanent prematurely
|
||||
self.config.deploy_autohsts(mock_lineage)
|
||||
self.assertNotEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
max_val)
|
||||
self.config.update_autohsts(mock.MagicMock())
|
||||
# Value should match pre-permanent increment step
|
||||
cur_val = maxage.format(constants.AUTOHSTS_STEPS[i+1])
|
||||
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
cur_val)
|
||||
# Ensure that the value is raised to max
|
||||
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
maxage.format(constants.AUTOHSTS_STEPS[-1]))
|
||||
# Make permanent
|
||||
self.config.deploy_autohsts(mock_lineage)
|
||||
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
|
||||
max_val)
|
||||
|
||||
def test_autohsts_update_noop(self):
|
||||
with mock.patch("time.time") as mock_time:
|
||||
# Time mock is used to make sure that the execution does not
|
||||
# continue when no autohsts entries exist in pluginstorage
|
||||
self.config.update_autohsts(mock.MagicMock())
|
||||
self.assertFalse(mock_time.called)
|
||||
|
||||
def test_autohsts_make_permanent_noop(self):
|
||||
self.config.storage.put = mock.MagicMock()
|
||||
self.config.deploy_autohsts(mock.MagicMock())
|
||||
# Make sure that the execution does not continue when no entries in store
|
||||
self.assertFalse(self.config.storage.put.called)
|
||||
|
||||
@mock.patch("certbot_apache.display_ops.select_vhost")
|
||||
def test_autohsts_no_ssl_vhost(self, mock_select):
|
||||
mock_select.return_value = self.vh_truth[0]
|
||||
with mock.patch("certbot_apache.configurator.logger.warning") as mock_log:
|
||||
self.assertRaises(errors.PluginError,
|
||||
self.config.enable_autohsts,
|
||||
mock.MagicMock(), "invalid.example.com")
|
||||
self.assertTrue(
|
||||
"Certbot was not able to find SSL" in mock_log.call_args[0][0])
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.add_vhost_id")
|
||||
def test_autohsts_dont_enhance_twice(self, mock_id, _restart):
|
||||
mock_id.return_value = "1234567"
|
||||
self.config.enable_autohsts(mock.MagicMock(),
|
||||
["ocspvhost.com", "ocspvhost.com"])
|
||||
self.assertEqual(mock_id.call_count, 1)
|
||||
|
||||
def test_autohsts_remove_orphaned(self):
|
||||
# pylint: disable=protected-access
|
||||
self.config._autohsts_fetch_state()
|
||||
self.config._autohsts["orphan_id"] = {"laststep": 0, "timestamp": 0}
|
||||
|
||||
self.config._autohsts_save_state()
|
||||
self.config.update_autohsts(mock.MagicMock())
|
||||
self.assertFalse("orphan_id" in self.config._autohsts)
|
||||
# Make sure it's removed from the pluginstorage file as well
|
||||
self.config._autohsts = None
|
||||
self.config._autohsts_fetch_state()
|
||||
self.assertFalse(self.config._autohsts)
|
||||
|
||||
def test_autohsts_make_permanent_vhost_not_found(self):
|
||||
# pylint: disable=protected-access
|
||||
self.config._autohsts_fetch_state()
|
||||
self.config._autohsts["orphan_id"] = {"laststep": 999, "timestamp": 0}
|
||||
self.config._autohsts_save_state()
|
||||
with mock.patch("certbot_apache.configurator.logger.warning") as mock_log:
|
||||
self.config.deploy_autohsts(mock.MagicMock())
|
||||
self.assertTrue(mock_log.called)
|
||||
self.assertTrue(
|
||||
"VirtualHost with id orphan_id was not" in mock_log.call_args[0][0])
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,222 +0,0 @@
|
||||
"""Test for certbot_apache.configurator for CentOS 6 overrides"""
|
||||
import unittest
|
||||
|
||||
from certbot.compat import os
|
||||
from certbot.errors import MisconfigurationError
|
||||
|
||||
from certbot_apache import obj
|
||||
from certbot_apache import override_centos
|
||||
from certbot_apache import parser
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
def get_vh_truth(temp_dir, config_name):
|
||||
"""Return the ground truth for the specified directory."""
|
||||
prefix = os.path.join(
|
||||
temp_dir, config_name, "httpd/conf.d")
|
||||
|
||||
aug_pre = "/files" + prefix
|
||||
vh_truth = [
|
||||
obj.VirtualHost(
|
||||
os.path.join(prefix, "test.example.com.conf"),
|
||||
os.path.join(aug_pre, "test.example.com.conf/VirtualHost"),
|
||||
set([obj.Addr.fromstring("*:80")]),
|
||||
False, True, "test.example.com"),
|
||||
obj.VirtualHost(
|
||||
os.path.join(prefix, "ssl.conf"),
|
||||
os.path.join(aug_pre, "ssl.conf/VirtualHost"),
|
||||
set([obj.Addr.fromstring("_default_:443")]),
|
||||
True, True, None)
|
||||
]
|
||||
return vh_truth
|
||||
|
||||
class CentOS6Tests(util.ApacheTest):
|
||||
"""Tests for CentOS 6"""
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
test_dir = "centos6_apache/apache"
|
||||
config_root = "centos6_apache/apache/httpd"
|
||||
vhost_root = "centos6_apache/apache/httpd/conf.d"
|
||||
super(CentOS6Tests, self).setUp(test_dir=test_dir,
|
||||
config_root=config_root,
|
||||
vhost_root=vhost_root)
|
||||
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
|
||||
version=(2, 2, 15), os_info="centos")
|
||||
self.vh_truth = get_vh_truth(
|
||||
self.temp_dir, "centos6_apache/apache")
|
||||
|
||||
def test_get_parser(self):
|
||||
self.assertTrue(isinstance(self.config.parser,
|
||||
override_centos.CentOSParser))
|
||||
|
||||
def test_get_virtual_hosts(self):
|
||||
"""Make sure all vhosts are being properly found."""
|
||||
vhs = self.config.get_virtual_hosts()
|
||||
self.assertEqual(len(vhs), 2)
|
||||
found = 0
|
||||
|
||||
for vhost in vhs:
|
||||
for centos_truth in self.vh_truth:
|
||||
if vhost == centos_truth:
|
||||
found += 1
|
||||
break
|
||||
else:
|
||||
raise Exception("Missed: %s" % vhost) # pragma: no cover
|
||||
self.assertEqual(found, 2)
|
||||
|
||||
def test_loadmod_default(self):
|
||||
ssl_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module", exclude=False)
|
||||
self.assertEqual(len(ssl_loadmods), 1)
|
||||
# Make sure the LoadModule ssl_module is in ssl.conf (default)
|
||||
self.assertTrue("ssl.conf" in ssl_loadmods[0])
|
||||
# ...and that it's not inside of <IfModule>
|
||||
self.assertFalse("IfModule" in ssl_loadmods[0])
|
||||
|
||||
# Get the example vhost
|
||||
self.config.assoc["test.example.com"] = self.vh_truth[0]
|
||||
self.config.deploy_cert(
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
"example/cert_chain.pem", "example/fullchain.pem")
|
||||
self.config.save()
|
||||
|
||||
post_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module", exclude=False)
|
||||
|
||||
# We should now have LoadModule ssl_module in root conf and ssl.conf
|
||||
self.assertEqual(len(post_loadmods), 2)
|
||||
for lm in post_loadmods:
|
||||
# lm[:-7] removes "/arg[#]" from the path
|
||||
arguments = self.config.parser.get_all_args(lm[:-7])
|
||||
self.assertEqual(arguments, ["ssl_module", "modules/mod_ssl.so"])
|
||||
# ...and both of them should be wrapped in <IfModule !mod_ssl.c>
|
||||
# lm[:-17] strips off /directive/arg[1] from the path.
|
||||
ifmod_args = self.config.parser.get_all_args(lm[:-17])
|
||||
self.assertTrue("!mod_ssl.c" in ifmod_args)
|
||||
|
||||
def test_loadmod_multiple(self):
|
||||
sslmod_args = ["ssl_module", "modules/mod_ssl.so"]
|
||||
# Adds another LoadModule to main httpd.conf in addtition to ssl.conf
|
||||
self.config.parser.add_dir(self.config.parser.loc["default"], "LoadModule",
|
||||
sslmod_args)
|
||||
self.config.save()
|
||||
pre_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module", exclude=False)
|
||||
# LoadModules are not within IfModule blocks
|
||||
self.assertFalse(any(["ifmodule" in m.lower() for m in pre_loadmods]))
|
||||
self.config.assoc["test.example.com"] = self.vh_truth[0]
|
||||
self.config.deploy_cert(
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
"example/cert_chain.pem", "example/fullchain.pem")
|
||||
post_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module", exclude=False)
|
||||
|
||||
for mod in post_loadmods:
|
||||
self.assertTrue(self.config.parser.not_modssl_ifmodule(mod)) #pylint: disable=no-member
|
||||
|
||||
def test_loadmod_rootconf_exists(self):
|
||||
sslmod_args = ["ssl_module", "modules/mod_ssl.so"]
|
||||
rootconf_ifmod = self.config.parser.get_ifmod(
|
||||
parser.get_aug_path(self.config.parser.loc["default"]),
|
||||
"!mod_ssl.c", beginning=True)
|
||||
self.config.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", sslmod_args)
|
||||
self.config.save()
|
||||
# Get the example vhost
|
||||
self.config.assoc["test.example.com"] = self.vh_truth[0]
|
||||
self.config.deploy_cert(
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
"example/cert_chain.pem", "example/fullchain.pem")
|
||||
self.config.save()
|
||||
|
||||
root_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module",
|
||||
start=parser.get_aug_path(self.config.parser.loc["default"]),
|
||||
exclude=False)
|
||||
|
||||
mods = [lm for lm in root_loadmods if self.config.parser.loc["default"] in lm]
|
||||
|
||||
self.assertEqual(len(mods), 1)
|
||||
# [:-7] removes "/arg[#]" from the path
|
||||
self.assertEqual(
|
||||
self.config.parser.get_all_args(mods[0][:-7]),
|
||||
sslmod_args)
|
||||
|
||||
def test_neg_loadmod_already_on_path(self):
|
||||
loadmod_args = ["ssl_module", "modules/mod_ssl.so"]
|
||||
ifmod = self.config.parser.get_ifmod(
|
||||
self.vh_truth[1].path, "!mod_ssl.c", beginning=True)
|
||||
self.config.parser.add_dir(ifmod[:-1], "LoadModule", loadmod_args)
|
||||
self.config.parser.add_dir(self.vh_truth[1].path, "LoadModule", loadmod_args)
|
||||
self.config.save()
|
||||
pre_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module", start=self.vh_truth[1].path, exclude=False)
|
||||
self.assertEqual(len(pre_loadmods), 2)
|
||||
# The ssl.conf now has two LoadModule directives, one inside of
|
||||
# !mod_ssl.c IfModule
|
||||
self.config.assoc["test.example.com"] = self.vh_truth[0]
|
||||
self.config.deploy_cert(
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
"example/cert_chain.pem", "example/fullchain.pem")
|
||||
self.config.save()
|
||||
# Ensure that the additional LoadModule wasn't written into the IfModule
|
||||
post_loadmods = self.config.parser.find_dir(
|
||||
"LoadModule", "ssl_module", start=self.vh_truth[1].path, exclude=False)
|
||||
self.assertEqual(len(post_loadmods), 1)
|
||||
|
||||
def test_loadmod_non_duplicate(self):
|
||||
# the modules/mod_ssl.so exists in ssl.conf
|
||||
sslmod_args = ["ssl_module", "modules/mod_somethingelse.so"]
|
||||
rootconf_ifmod = self.config.parser.get_ifmod(
|
||||
parser.get_aug_path(self.config.parser.loc["default"]),
|
||||
"!mod_ssl.c", beginning=True)
|
||||
self.config.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", sslmod_args)
|
||||
self.config.save()
|
||||
self.config.assoc["test.example.com"] = self.vh_truth[0]
|
||||
pre_matches = self.config.parser.find_dir("LoadModule",
|
||||
"ssl_module", exclude=False)
|
||||
|
||||
self.assertRaises(MisconfigurationError, self.config.deploy_cert,
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
"example/cert_chain.pem", "example/fullchain.pem")
|
||||
|
||||
post_matches = self.config.parser.find_dir("LoadModule",
|
||||
"ssl_module", exclude=False)
|
||||
# Make sure that none was changed
|
||||
self.assertEqual(pre_matches, post_matches)
|
||||
|
||||
def test_loadmod_not_found(self):
|
||||
# Remove all existing LoadModule ssl_module... directives
|
||||
orig_loadmods = self.config.parser.find_dir("LoadModule",
|
||||
"ssl_module",
|
||||
exclude=False)
|
||||
for mod in orig_loadmods:
|
||||
noarg_path = mod.rpartition("/")[0]
|
||||
self.config.parser.aug.remove(noarg_path)
|
||||
self.config.save()
|
||||
self.config.deploy_cert(
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
"example/cert_chain.pem", "example/fullchain.pem")
|
||||
|
||||
post_loadmods = self.config.parser.find_dir("LoadModule",
|
||||
"ssl_module",
|
||||
exclude=False)
|
||||
self.assertFalse(post_loadmods)
|
||||
|
||||
def test_no_ifmod_search_false(self):
|
||||
#pylint: disable=no-member
|
||||
|
||||
self.assertFalse(self.config.parser.not_modssl_ifmodule(
|
||||
"/path/does/not/include/ifmod"
|
||||
))
|
||||
self.assertFalse(self.config.parser.not_modssl_ifmodule(
|
||||
""
|
||||
))
|
||||
self.assertFalse(self.config.parser.not_modssl_ifmodule(
|
||||
"/path/includes/IfModule/but/no/arguments"
|
||||
))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,17 +1,15 @@
|
||||
"""Test for certbot_apache.configurator for Centos overrides"""
|
||||
import os
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import obj
|
||||
from certbot_apache import override_centos
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
def get_vh_truth(temp_dir, config_name):
|
||||
"""Return the ground truth for the specified directory."""
|
||||
prefix = os.path.join(
|
||||
@@ -32,59 +30,6 @@ def get_vh_truth(temp_dir, config_name):
|
||||
]
|
||||
return vh_truth
|
||||
|
||||
class FedoraRestartTest(util.ApacheTest):
|
||||
"""Tests for Fedora specific self-signed certificate override"""
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
test_dir = "centos7_apache/apache"
|
||||
config_root = "centos7_apache/apache/httpd"
|
||||
vhost_root = "centos7_apache/apache/httpd/conf.d"
|
||||
super(FedoraRestartTest, self).setUp(test_dir=test_dir,
|
||||
config_root=config_root,
|
||||
vhost_root=vhost_root)
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
|
||||
os_info="fedora_old")
|
||||
self.vh_truth = get_vh_truth(
|
||||
self.temp_dir, "centos7_apache/apache")
|
||||
|
||||
def _run_fedora_test(self):
|
||||
self.assertIsInstance(self.config, override_centos.CentOSConfigurator)
|
||||
with mock.patch("certbot.util.get_os_info") as mock_info:
|
||||
mock_info.return_value = ["fedora", "28"]
|
||||
self.config.config_test()
|
||||
|
||||
def test_non_fedora_error(self):
|
||||
c_test = "certbot_apache.configurator.ApacheConfigurator.config_test"
|
||||
with mock.patch(c_test) as mock_test:
|
||||
mock_test.side_effect = errors.MisconfigurationError
|
||||
with mock.patch("certbot.util.get_os_info") as mock_info:
|
||||
mock_info.return_value = ["not_fedora"]
|
||||
self.assertRaises(errors.MisconfigurationError,
|
||||
self.config.config_test)
|
||||
|
||||
def test_fedora_restart_error(self):
|
||||
c_test = "certbot_apache.configurator.ApacheConfigurator.config_test"
|
||||
with mock.patch(c_test) as mock_test:
|
||||
# First call raises error, second doesn't
|
||||
mock_test.side_effect = [errors.MisconfigurationError, '']
|
||||
with mock.patch("certbot.util.run_script") as mock_run:
|
||||
mock_run.side_effect = errors.SubprocessError
|
||||
self.assertRaises(errors.MisconfigurationError,
|
||||
self._run_fedora_test)
|
||||
|
||||
def test_fedora_restart(self):
|
||||
c_test = "certbot_apache.configurator.ApacheConfigurator.config_test"
|
||||
with mock.patch(c_test) as mock_test:
|
||||
with mock.patch("certbot.util.run_script") as mock_run:
|
||||
# First call raises error, second doesn't
|
||||
mock_test.side_effect = [errors.MisconfigurationError, '']
|
||||
self._run_fedora_test()
|
||||
self.assertEqual(mock_test.call_count, 2)
|
||||
self.assertEqual(mock_run.call_args[0][0],
|
||||
['systemctl', 'restart', 'httpd'])
|
||||
|
||||
|
||||
class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
"""Multiple vhost tests for CentOS / RHEL family of distros"""
|
||||
|
||||
@@ -105,7 +50,8 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
self.temp_dir, "centos7_apache/apache")
|
||||
|
||||
def test_get_parser(self):
|
||||
self.assertIsInstance(self.config.parser, override_centos.CentOSParser)
|
||||
self.assertTrue(isinstance(self.config.parser,
|
||||
override_centos.CentOSParser))
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
|
||||
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
|
||||
@@ -135,9 +81,9 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
mock_osi.return_value = ("centos", "7")
|
||||
self.config.parser.update_runtime_variables()
|
||||
|
||||
self.assertEqual(mock_get.call_count, 3)
|
||||
self.assertEqual(len(self.config.parser.modules), 4)
|
||||
self.assertEqual(len(self.config.parser.variables), 2)
|
||||
self.assertEquals(mock_get.call_count, 3)
|
||||
self.assertEquals(len(self.config.parser.modules), 4)
|
||||
self.assertEquals(len(self.config.parser.variables), 2)
|
||||
self.assertTrue("TEST2" in self.config.parser.variables.keys())
|
||||
self.assertTrue("mod_another.c" in self.config.parser.modules)
|
||||
|
||||
@@ -161,7 +107,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
"""Make sure we read the sysconfig OPTIONS variable correctly"""
|
||||
# Return nothing for the process calls
|
||||
mock_cfg.return_value = ""
|
||||
self.config.parser.sysconfig_filep = filesystem.realpath(
|
||||
self.config.parser.sysconfig_filep = os.path.realpath(
|
||||
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
|
||||
self.config.parser.variables = {}
|
||||
|
||||
@@ -181,7 +127,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
def test_alt_restart_works(self, mock_run_script):
|
||||
mock_run_script.side_effect = [None, errors.SubprocessError, None]
|
||||
self.config.restart()
|
||||
self.assertEqual(mock_run_script.call_count, 3)
|
||||
self.assertEquals(mock_run_script.call_count, 3)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.util.run_script")
|
||||
def test_alt_restart_errors(self, mock_run_script):
|
||||
@@ -189,7 +135,5 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
errors.SubprocessError,
|
||||
errors.SubprocessError]
|
||||
self.assertRaises(errors.MisconfigurationError, self.config.restart)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
|
||||
@@ -1,9 +1,9 @@
|
||||
"""Tests for certbot_apache.parser."""
|
||||
import os
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
# pylint: disable=too-many-lines
|
||||
# pylint: disable=too-many-public-methods,too-many-lines
|
||||
"""Test for certbot_apache.configurator."""
|
||||
import copy
|
||||
import os
|
||||
import shutil
|
||||
import socket
|
||||
import tempfile
|
||||
@@ -15,21 +15,22 @@ from acme import challenges
|
||||
from certbot import achallenges
|
||||
from certbot import crypto_util
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
from certbot.compat import filesystem
|
||||
|
||||
from certbot.tests import acme_util
|
||||
from certbot.tests import util as certbot_util
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import constants
|
||||
from certbot_apache import obj
|
||||
from certbot_apache import parser
|
||||
from certbot_apache import obj
|
||||
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
class MultipleVhostsTest(util.ApacheTest):
|
||||
"""Test two standard well-configured HTTP vhosts."""
|
||||
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(MultipleVhostsTest, self).setUp()
|
||||
|
||||
@@ -51,14 +52,25 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.deploy_cert = mocked_deploy_cert
|
||||
return self.config
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.init_augeas")
|
||||
@mock.patch("certbot_apache.configurator.path_surgery")
|
||||
def test_prepare_no_install(self, mock_surgery):
|
||||
def test_prepare_no_install(self, mock_surgery, _init_augeas):
|
||||
silly_path = {"PATH": "/tmp/nothingness2342"}
|
||||
mock_surgery.return_value = False
|
||||
with mock.patch.dict('os.environ', silly_path):
|
||||
self.assertRaises(errors.NoInstallationError, self.config.prepare)
|
||||
self.assertEqual(mock_surgery.call_count, 1)
|
||||
|
||||
@mock.patch("certbot_apache.augeas_configurator.AugeasConfigurator.init_augeas")
|
||||
def test_prepare_no_augeas(self, mock_init_augeas):
|
||||
""" Test augeas initialization ImportError """
|
||||
def side_effect_error():
|
||||
""" Side effect error for the test """
|
||||
raise ImportError
|
||||
mock_init_augeas.side_effect = side_effect_error
|
||||
self.assertRaises(
|
||||
errors.NoInstallationError, self.config.prepare)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser")
|
||||
@mock.patch("certbot_apache.configurator.util.exe_exists")
|
||||
def test_prepare_version(self, mock_exe_exists, _):
|
||||
@@ -70,6 +82,16 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError, self.config.prepare)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser")
|
||||
@mock.patch("certbot_apache.configurator.util.exe_exists")
|
||||
def test_prepare_old_aug(self, mock_exe_exists, _):
|
||||
mock_exe_exists.return_value = True
|
||||
self.config.config_test = mock.Mock()
|
||||
# pylint: disable=protected-access
|
||||
self.config._check_aug_version = mock.Mock(return_value=False)
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError, self.config.prepare)
|
||||
|
||||
def test_prepare_locked(self):
|
||||
server_root = self.config.conf("server-root")
|
||||
self.config.config_test = mock.Mock()
|
||||
@@ -93,49 +115,9 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# Weak test..
|
||||
ApacheConfigurator.add_parser_arguments(mock.MagicMock())
|
||||
|
||||
def test_docs_parser_arguments(self):
|
||||
os.environ["CERTBOT_DOCS"] = "1"
|
||||
from certbot_apache.configurator import ApacheConfigurator
|
||||
mock_add = mock.MagicMock()
|
||||
ApacheConfigurator.add_parser_arguments(mock_add)
|
||||
parserargs = ["server_root", "enmod", "dismod", "le_vhost_ext",
|
||||
"vhost_root", "logs_root", "challenge_location",
|
||||
"handle_modules", "handle_sites", "ctl"]
|
||||
exp = dict()
|
||||
|
||||
for k in ApacheConfigurator.OS_DEFAULTS:
|
||||
if k in parserargs:
|
||||
exp[k.replace("_", "-")] = ApacheConfigurator.OS_DEFAULTS[k]
|
||||
# Special cases
|
||||
exp["vhost-root"] = None
|
||||
|
||||
found = set()
|
||||
for call in mock_add.call_args_list:
|
||||
found.add(call[0][0])
|
||||
|
||||
# Make sure that all (and only) the expected values exist
|
||||
self.assertEqual(len(mock_add.call_args_list), len(found))
|
||||
for e in exp:
|
||||
self.assertTrue(e in found)
|
||||
|
||||
del os.environ["CERTBOT_DOCS"]
|
||||
|
||||
def test_add_parser_arguments_all_configurators(self): # pylint: disable=no-self-use
|
||||
from certbot_apache.entrypoint import OVERRIDE_CLASSES
|
||||
for cls in OVERRIDE_CLASSES.values():
|
||||
cls.add_parser_arguments(mock.MagicMock())
|
||||
|
||||
def test_all_configurators_defaults_defined(self):
|
||||
from certbot_apache.entrypoint import OVERRIDE_CLASSES
|
||||
from certbot_apache.configurator import ApacheConfigurator
|
||||
parameters = set(ApacheConfigurator.OS_DEFAULTS.keys())
|
||||
for cls in OVERRIDE_CLASSES.values():
|
||||
self.assertTrue(parameters.issubset(set(cls.OS_DEFAULTS.keys())))
|
||||
|
||||
def test_constant(self):
|
||||
self.assertTrue("debian_apache_2_4/multiple_vhosts/apache" in
|
||||
self.config.option("server_root"))
|
||||
self.assertEqual(self.config.option("nonexistent"), None)
|
||||
self.assertEqual(self.config.constant("server_root"), "/etc/apache2")
|
||||
self.assertEqual(self.config.constant("nonexistent"), None)
|
||||
|
||||
@certbot_util.patch_get_utility()
|
||||
def test_get_all_names(self, mock_getutility):
|
||||
@@ -144,8 +126,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
names = self.config.get_all_names()
|
||||
self.assertEqual(names, set(
|
||||
["certbot.demo", "ocspvhost.com", "encryption-example.demo",
|
||||
"nonsym.link", "vhost.in.rootconf", "www.certbot.demo",
|
||||
"duplicate.example.com"]
|
||||
"nonsym.link", "vhost.in.rootconf", "www.certbot.demo"]
|
||||
))
|
||||
|
||||
@certbot_util.patch_get_utility()
|
||||
@@ -164,7 +145,8 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.vhosts.append(vhost)
|
||||
|
||||
names = self.config.get_all_names()
|
||||
self.assertEqual(len(names), 9)
|
||||
# Names get filtered, only 5 are returned
|
||||
self.assertEqual(len(names), 8)
|
||||
self.assertTrue("zombo.com" in names)
|
||||
self.assertTrue("google.com" in names)
|
||||
self.assertTrue("certbot.demo" in names)
|
||||
@@ -205,7 +187,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
def test_get_virtual_hosts(self):
|
||||
"""Make sure all vhosts are being properly found."""
|
||||
vhs = self.config.get_virtual_hosts()
|
||||
self.assertEqual(len(vhs), 12)
|
||||
self.assertEqual(len(vhs), 10)
|
||||
found = 0
|
||||
|
||||
for vhost in vhs:
|
||||
@@ -216,7 +198,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
else:
|
||||
raise Exception("Missed: %s" % vhost) # pragma: no cover
|
||||
|
||||
self.assertEqual(found, 12)
|
||||
self.assertEqual(found, 10)
|
||||
|
||||
# Handle case of non-debian layout get_virtual_hosts
|
||||
with mock.patch(
|
||||
@@ -224,7 +206,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
) as mock_conf:
|
||||
mock_conf.return_value = False
|
||||
vhs = self.config.get_virtual_hosts()
|
||||
self.assertEqual(len(vhs), 12)
|
||||
self.assertEqual(len(vhs), 10)
|
||||
|
||||
@mock.patch("certbot_apache.display_ops.select_vhost")
|
||||
def test_choose_vhost_none_avail(self, mock_select):
|
||||
@@ -264,7 +246,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
@mock.patch("certbot_apache.display_ops.select_vhost")
|
||||
def test_choose_vhost_select_vhost_with_temp(self, mock_select):
|
||||
mock_select.return_value = self.vh_truth[0]
|
||||
chosen_vhost = self.config.choose_vhost("none.com", create_if_no_ssl=False)
|
||||
chosen_vhost = self.config.choose_vhost("none.com", temp=True)
|
||||
self.assertEqual(self.vh_truth[0], chosen_vhost)
|
||||
|
||||
@mock.patch("certbot_apache.display_ops.select_vhost")
|
||||
@@ -327,7 +309,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.vhosts = [
|
||||
vh for vh in self.config.vhosts
|
||||
if vh.name not in ["certbot.demo", "nonsym.link",
|
||||
"encryption-example.demo", "duplicate.example.com",
|
||||
"encryption-example.demo",
|
||||
"ocspvhost.com", "vhost.in.rootconf"]
|
||||
and "*.blue.purple.com" not in vh.aliases
|
||||
]
|
||||
@@ -338,7 +320,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
def test_non_default_vhosts(self):
|
||||
# pylint: disable=protected-access
|
||||
vhosts = self.config._non_default_vhosts(self.config.vhosts)
|
||||
self.assertEqual(len(vhosts), 10)
|
||||
self.assertEqual(len(vhosts), 8)
|
||||
|
||||
def test_deploy_cert_enable_new_vhost(self):
|
||||
# Create
|
||||
@@ -358,7 +340,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
"""Mock method for parser.find_dir"""
|
||||
if directive == "Include" and argument.endswith("options-ssl-apache.conf"):
|
||||
return ["/path/to/whatever"]
|
||||
return None # pragma: no cover
|
||||
|
||||
mock_add = mock.MagicMock()
|
||||
self.config.parser.add_dir = mock_add
|
||||
@@ -372,11 +353,14 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
self.config.parser.find_dir = mock_find_dir
|
||||
mock_add.reset_mock()
|
||||
|
||||
self.config._add_dummy_ssl_directives(self.vh_truth[0]) # pylint: disable=protected-access
|
||||
tried_to_add = []
|
||||
for a in mock_add.call_args_list:
|
||||
if a[0][1] == "Include" and a[0][2] == self.config.mod_ssl_conf:
|
||||
self.fail("Include shouldn't be added, as patched find_dir 'finds' existing one") \
|
||||
# pragma: no cover
|
||||
tried_to_add.append(a[0][1] == "Include" and
|
||||
a[0][2] == self.config.mod_ssl_conf)
|
||||
# Include shouldn't be added, as patched find_dir "finds" existing one
|
||||
self.assertFalse(any(tried_to_add))
|
||||
|
||||
def test_deploy_cert(self):
|
||||
self.config.parser.modules.add("ssl_module")
|
||||
@@ -470,7 +454,8 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
but an SSLCertificateKeyFile directive is missing."""
|
||||
if "SSLCertificateFile" in args:
|
||||
return ["example/cert.pem"]
|
||||
return []
|
||||
else:
|
||||
return []
|
||||
|
||||
mock_find_dir = mock.MagicMock(return_value=[])
|
||||
mock_find_dir.side_effect = side_effect
|
||||
@@ -650,7 +635,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# span excludes the closing </VirtualHost> tag in older versions
|
||||
# of Augeas
|
||||
return_value = [self.vh_truth[0].filep, 1, 12, 0, 0, 0, 1142]
|
||||
with mock.patch.object(self.config.parser.aug, 'span') as mock_span:
|
||||
with mock.patch.object(self.config.aug, 'span') as mock_span:
|
||||
mock_span.return_value = return_value
|
||||
self.test_make_vhost_ssl()
|
||||
|
||||
@@ -658,7 +643,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# span includes the closing </VirtualHost> tag in newer versions
|
||||
# of Augeas
|
||||
return_value = [self.vh_truth[0].filep, 1, 12, 0, 0, 0, 1157]
|
||||
with mock.patch.object(self.config.parser.aug, 'span') as mock_span:
|
||||
with mock.patch.object(self.config.aug, 'span') as mock_span:
|
||||
mock_span.return_value = return_value
|
||||
self.test_make_vhost_ssl()
|
||||
|
||||
@@ -669,9 +654,22 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.assertEqual(ssl_vhost_slink.name, "nonsym.link")
|
||||
|
||||
def test_make_vhost_ssl_nonexistent_vhost_path(self):
|
||||
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[1])
|
||||
self.assertEqual(os.path.dirname(ssl_vhost.filep),
|
||||
os.path.dirname(filesystem.realpath(self.vh_truth[1].filep)))
|
||||
def conf_side_effect(arg):
|
||||
""" Mock function for ApacheConfigurator.conf """
|
||||
confvars = {
|
||||
"vhost-root": "/tmp/nonexistent",
|
||||
"le_vhost_ext": "-le-ssl.conf",
|
||||
"handle-sites": True}
|
||||
return confvars[arg]
|
||||
|
||||
with mock.patch(
|
||||
"certbot_apache.configurator.ApacheConfigurator.conf"
|
||||
) as mock_conf:
|
||||
mock_conf.side_effect = conf_side_effect
|
||||
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[1])
|
||||
self.assertEqual(os.path.dirname(ssl_vhost.filep),
|
||||
os.path.dirname(os.path.realpath(
|
||||
self.vh_truth[1].filep)))
|
||||
|
||||
def test_make_vhost_ssl(self):
|
||||
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
|
||||
@@ -692,7 +690,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.assertEqual(self.config.is_name_vhost(self.vh_truth[0]),
|
||||
self.config.is_name_vhost(ssl_vhost))
|
||||
|
||||
self.assertEqual(len(self.config.vhosts), 13)
|
||||
self.assertEqual(len(self.config.vhosts), 11)
|
||||
|
||||
def test_clean_vhost_ssl(self):
|
||||
# pylint: disable=protected-access
|
||||
@@ -785,19 +783,32 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.assertEqual(self.config.add_name_vhost.call_count, 2)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.http_01.ApacheHttp01.perform")
|
||||
@mock.patch("certbot_apache.configurator.tls_sni_01.ApacheTlsSni01.perform")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.restart")
|
||||
def test_perform(self, mock_restart, mock_http_perform):
|
||||
def test_perform(self, mock_restart, mock_tls_perform, mock_http_perform):
|
||||
# Only tests functionality specific to configurator.perform
|
||||
# Note: As more challenges are offered this will have to be expanded
|
||||
account_key, achalls = self.get_key_and_achalls()
|
||||
|
||||
expected = [achall.response(account_key) for achall in achalls]
|
||||
mock_http_perform.return_value = expected
|
||||
all_expected = []
|
||||
http_expected = []
|
||||
tls_expected = []
|
||||
for achall in achalls:
|
||||
response = achall.response(account_key)
|
||||
if isinstance(achall.chall, challenges.HTTP01):
|
||||
http_expected.append(response)
|
||||
else:
|
||||
tls_expected.append(response)
|
||||
all_expected.append(response)
|
||||
|
||||
mock_http_perform.return_value = http_expected
|
||||
mock_tls_perform.return_value = tls_expected
|
||||
|
||||
responses = self.config.perform(achalls)
|
||||
|
||||
self.assertEqual(mock_http_perform.call_count, 1)
|
||||
self.assertEqual(responses, expected)
|
||||
self.assertEqual(mock_tls_perform.call_count, 1)
|
||||
self.assertEqual(responses, all_expected)
|
||||
|
||||
self.assertEqual(mock_restart.call_count, 1)
|
||||
|
||||
@@ -925,22 +936,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
errors.PluginError,
|
||||
self.config.enhance, "certbot.demo", "unknown_enhancement")
|
||||
|
||||
def test_enhance_no_ssl_vhost(self):
|
||||
with mock.patch("certbot_apache.configurator.logger.warning") as mock_log:
|
||||
self.assertRaises(errors.PluginError, self.config.enhance,
|
||||
"certbot.demo", "redirect")
|
||||
# Check that correct logger.warning was printed
|
||||
self.assertTrue("not able to find" in mock_log.call_args[0][0])
|
||||
self.assertTrue("\"redirect\"" in mock_log.call_args[0][0])
|
||||
|
||||
mock_log.reset_mock()
|
||||
|
||||
self.assertRaises(errors.PluginError, self.config.enhance,
|
||||
"certbot.demo", "ensure-http-header", "Test")
|
||||
# Check that correct logger.warning was printed
|
||||
self.assertTrue("not able to find" in mock_log.call_args[0][0])
|
||||
self.assertTrue("Test" in mock_log.call_args[0][0])
|
||||
|
||||
@mock.patch("certbot.util.exe_exists")
|
||||
def test_ocsp_stapling(self, mock_exe):
|
||||
self.config.parser.update_runtime_variables = mock.Mock()
|
||||
@@ -950,7 +945,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
mock_exe.return_value = True
|
||||
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "staple-ocsp")
|
||||
|
||||
# Get the ssl vhost for certbot.demo
|
||||
@@ -977,7 +971,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
mock_exe.return_value = True
|
||||
|
||||
# Checking the case with already enabled ocsp stapling configuration
|
||||
self.config.choose_vhost("ocspvhost.com")
|
||||
self.config.enhance("ocspvhost.com", "staple-ocsp")
|
||||
|
||||
# Get the ssl vhost for letsencrypt.demo
|
||||
@@ -1002,7 +995,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
self.config.parser.modules.add("socache_shmcb_module")
|
||||
self.config.get_version = mock.Mock(return_value=(2, 2, 0))
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
|
||||
self.assertRaises(errors.PluginError,
|
||||
self.config.enhance, "certbot.demo", "staple-ocsp")
|
||||
@@ -1017,7 +1009,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
# pylint: disable=protected-access
|
||||
http_vh = self.config._get_http_vhost(ssl_vh)
|
||||
self.assertFalse(http_vh.ssl)
|
||||
self.assertTrue(http_vh.ssl == False)
|
||||
|
||||
@mock.patch("certbot.util.run_script")
|
||||
@mock.patch("certbot.util.exe_exists")
|
||||
@@ -1028,7 +1020,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
mock_exe.return_value = True
|
||||
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "ensure-http-header",
|
||||
"Strict-Transport-Security")
|
||||
|
||||
@@ -1048,8 +1039,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# skip the enable mod
|
||||
self.config.parser.modules.add("headers_module")
|
||||
|
||||
# This will create an ssl vhost for encryption-example.demo
|
||||
self.config.choose_vhost("encryption-example.demo")
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.enhance("encryption-example.demo", "ensure-http-header",
|
||||
"Strict-Transport-Security")
|
||||
|
||||
@@ -1068,7 +1058,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
mock_exe.return_value = True
|
||||
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "ensure-http-header",
|
||||
"Upgrade-Insecure-Requests")
|
||||
|
||||
@@ -1090,8 +1079,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# skip the enable mod
|
||||
self.config.parser.modules.add("headers_module")
|
||||
|
||||
# This will create an ssl vhost for encryption-example.demo
|
||||
self.config.choose_vhost("encryption-example.demo")
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.enhance("encryption-example.demo", "ensure-http-header",
|
||||
"Upgrade-Insecure-Requests")
|
||||
|
||||
@@ -1109,7 +1097,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.get_version = mock.Mock(return_value=(2, 2))
|
||||
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "redirect")
|
||||
|
||||
# These are not immediately available in find_dir even with save() and
|
||||
@@ -1160,7 +1147,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.save()
|
||||
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "redirect")
|
||||
|
||||
# These are not immediately available in find_dir even with save() and
|
||||
@@ -1206,7 +1192,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
except errors.PluginEnhancementAlreadyPresent:
|
||||
args_paths = self.config.parser.find_dir(
|
||||
"RewriteRule", None, http_vhost.path, False)
|
||||
arg_vals = [self.config.parser.aug.get(x) for x in args_paths]
|
||||
arg_vals = [self.config.aug.get(x) for x in args_paths]
|
||||
self.assertEqual(arg_vals, constants.REWRITE_HTTPS_ARGS)
|
||||
|
||||
|
||||
@@ -1227,9 +1213,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.parser.modules.add("rewrite_module")
|
||||
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
|
||||
|
||||
# Creates ssl vhost for the domain
|
||||
self.config.choose_vhost("red.blue.purple.com")
|
||||
|
||||
self.config.enhance("red.blue.purple.com", "redirect")
|
||||
verify_no_redirect = ("certbot_apache.configurator."
|
||||
"ApacheConfigurator._verify_no_certbot_redirect")
|
||||
@@ -1241,7 +1224,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# Skip the enable mod
|
||||
self.config.parser.modules.add("rewrite_module")
|
||||
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
|
||||
self.config.choose_vhost("red.blue.purple.com")
|
||||
|
||||
self.config.enhance("red.blue.purple.com", "redirect")
|
||||
# Clear state about enabling redirect on this run
|
||||
# pylint: disable=protected-access
|
||||
@@ -1260,7 +1243,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.config._enable_redirect(self.vh_truth[1], "")
|
||||
self.assertEqual(len(self.config.vhosts), 13)
|
||||
self.assertEqual(len(self.config.vhosts), 11)
|
||||
|
||||
def test_create_own_redirect_for_old_apache_version(self):
|
||||
self.config.parser.modules.add("rewrite_module")
|
||||
@@ -1271,7 +1254,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.config._enable_redirect(self.vh_truth[1], "")
|
||||
self.assertEqual(len(self.config.vhosts), 13)
|
||||
self.assertEqual(len(self.config.vhosts), 11)
|
||||
|
||||
def test_sift_rewrite_rule(self):
|
||||
# pylint: disable=protected-access
|
||||
@@ -1292,13 +1275,13 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
account_key = self.rsa512jwk
|
||||
achall1 = achallenges.KeyAuthorizationAnnotatedChallenge(
|
||||
challb=acme_util.chall_to_challb(
|
||||
challenges.HTTP01(
|
||||
challenges.TLSSNI01(
|
||||
token=b"jIq_Xy1mXGN37tb4L6Xj_es58fW571ZNyXekdZzhh7Q"),
|
||||
"pending"),
|
||||
domain="encryption-example.demo", account_key=account_key)
|
||||
achall2 = achallenges.KeyAuthorizationAnnotatedChallenge(
|
||||
challb=acme_util.chall_to_challb(
|
||||
challenges.HTTP01(
|
||||
challenges.TLSSNI01(
|
||||
token=b"uqnaPzxtrndteOqtrXb0Asl5gOJfWAnnx6QJyvcmlDU"),
|
||||
"pending"),
|
||||
domain="certbot.demo", account_key=account_key)
|
||||
@@ -1309,6 +1292,24 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
return account_key, (achall1, achall2, achall3)
|
||||
|
||||
def test_make_addrs_sni_ready(self):
|
||||
self.config.version = (2, 2)
|
||||
self.config.make_addrs_sni_ready(
|
||||
set([obj.Addr.fromstring("*:443"), obj.Addr.fromstring("*:80")]))
|
||||
self.assertTrue(self.config.parser.find_dir(
|
||||
"NameVirtualHost", "*:80", exclude=False))
|
||||
self.assertTrue(self.config.parser.find_dir(
|
||||
"NameVirtualHost", "*:443", exclude=False))
|
||||
|
||||
def test_aug_version(self):
|
||||
mock_match = mock.Mock(return_value=["something"])
|
||||
self.config.aug.match = mock_match
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.config._check_aug_version(),
|
||||
["something"])
|
||||
self.config.aug.match.side_effect = RuntimeError
|
||||
self.assertFalse(self.config._check_aug_version())
|
||||
|
||||
def test_enable_site_nondebian(self):
|
||||
inc_path = "/path/to/wherever"
|
||||
vhost = self.vh_truth[0]
|
||||
@@ -1331,8 +1332,8 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.parser.modules.add("ssl_module")
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
self.config.parser.modules.add("socache_shmcb_module")
|
||||
tmp_path = filesystem.realpath(tempfile.mkdtemp("vhostroot"))
|
||||
filesystem.chmod(tmp_path, 0o755)
|
||||
tmp_path = os.path.realpath(tempfile.mkdtemp("vhostroot"))
|
||||
os.chmod(tmp_path, 0o755)
|
||||
mock_p = "certbot_apache.configurator.ApacheConfigurator._get_ssl_vhost_path"
|
||||
mock_a = "certbot_apache.parser.ApacheParser.add_include"
|
||||
|
||||
@@ -1364,7 +1365,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# pylint: disable=protected-access
|
||||
cases = {u"*.example.org": True, b"*.x.example.org": True,
|
||||
u"a.example.org": False, b"a.x.example.org": False}
|
||||
for key in cases:
|
||||
for key in cases.keys():
|
||||
self.assertEqual(self.config._wildcard_domain(key), cases[key])
|
||||
|
||||
def test_choose_vhosts_wildcard(self):
|
||||
@@ -1375,11 +1376,11 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
vhs = self.config._choose_vhosts_wildcard("*.certbot.demo",
|
||||
create_ssl=True)
|
||||
# Check that the dialog was called with one vh: certbot.demo
|
||||
self.assertEqual(mock_select_vhs.call_args[0][0][0], self.vh_truth[3])
|
||||
self.assertEqual(len(mock_select_vhs.call_args_list), 1)
|
||||
self.assertEquals(mock_select_vhs.call_args[0][0][0], self.vh_truth[3])
|
||||
self.assertEquals(len(mock_select_vhs.call_args_list), 1)
|
||||
|
||||
# And the actual returned values
|
||||
self.assertEqual(len(vhs), 1)
|
||||
self.assertEquals(len(vhs), 1)
|
||||
self.assertTrue(vhs[0].name == "certbot.demo")
|
||||
self.assertTrue(vhs[0].ssl)
|
||||
|
||||
@@ -1394,7 +1395,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
vhs = self.config._choose_vhosts_wildcard("*.certbot.demo",
|
||||
create_ssl=False)
|
||||
self.assertFalse(mock_makessl.called)
|
||||
self.assertEqual(vhs[0], self.vh_truth[1])
|
||||
self.assertEquals(vhs[0], self.vh_truth[1])
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator._vhosts_for_wildcard")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.make_vhost_ssl")
|
||||
@@ -1407,15 +1408,15 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
mock_select_vhs.return_value = [self.vh_truth[7]]
|
||||
vhs = self.config._choose_vhosts_wildcard("whatever",
|
||||
create_ssl=True)
|
||||
self.assertEqual(mock_select_vhs.call_args[0][0][0], self.vh_truth[7])
|
||||
self.assertEqual(len(mock_select_vhs.call_args_list), 1)
|
||||
self.assertEquals(mock_select_vhs.call_args[0][0][0], self.vh_truth[7])
|
||||
self.assertEquals(len(mock_select_vhs.call_args_list), 1)
|
||||
# Ensure that make_vhost_ssl was not called, vhost.ssl == true
|
||||
self.assertFalse(mock_makessl.called)
|
||||
|
||||
# And the actual returned values
|
||||
self.assertEqual(len(vhs), 1)
|
||||
self.assertEquals(len(vhs), 1)
|
||||
self.assertTrue(vhs[0].ssl)
|
||||
self.assertEqual(vhs[0], self.vh_truth[7])
|
||||
self.assertEquals(vhs[0], self.vh_truth[7])
|
||||
|
||||
|
||||
def test_deploy_cert_wildcard(self):
|
||||
@@ -1428,7 +1429,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.deploy_cert("*.wildcard.example.org", "/tmp/path",
|
||||
"/tmp/path", "/tmp/path", "/tmp/path")
|
||||
self.assertTrue(mock_dep.called)
|
||||
self.assertEqual(len(mock_dep.call_args_list), 1)
|
||||
self.assertEquals(len(mock_dep.call_args_list), 1)
|
||||
self.assertEqual(self.vh_truth[7], mock_dep.call_args_list[0][0][0])
|
||||
|
||||
@mock.patch("certbot_apache.display_ops.select_vhost_multiple")
|
||||
@@ -1445,7 +1446,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# pylint: disable=protected-access
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
self.config.parser.modules.add("headers_module")
|
||||
self.vh_truth[3].ssl = True
|
||||
self.config._wildcard_vhosts["*.certbot.demo"] = [self.vh_truth[3]]
|
||||
self.config.enhance("*.certbot.demo", "ensure-http-header",
|
||||
"Upgrade-Insecure-Requests")
|
||||
@@ -1453,7 +1453,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator._choose_vhosts_wildcard")
|
||||
def test_enhance_wildcard_no_install(self, mock_choose):
|
||||
self.vh_truth[3].ssl = True
|
||||
mock_choose.return_value = [self.vh_truth[3]]
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
self.config.parser.modules.add("headers_module")
|
||||
@@ -1461,44 +1460,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
"Upgrade-Insecure-Requests")
|
||||
self.assertTrue(mock_choose.called)
|
||||
|
||||
def test_add_vhost_id(self):
|
||||
for vh in [self.vh_truth[0], self.vh_truth[1], self.vh_truth[2]]:
|
||||
vh_id = self.config.add_vhost_id(vh)
|
||||
self.assertEqual(vh, self.config.find_vhost_by_id(vh_id))
|
||||
|
||||
def test_find_vhost_by_id_404(self):
|
||||
self.assertRaises(errors.PluginError,
|
||||
self.config.find_vhost_by_id,
|
||||
"nonexistent")
|
||||
|
||||
def test_add_vhost_id_already_exists(self):
|
||||
first_id = self.config.add_vhost_id(self.vh_truth[0])
|
||||
second_id = self.config.add_vhost_id(self.vh_truth[0])
|
||||
self.assertEqual(first_id, second_id)
|
||||
|
||||
def test_realpath_replaces_symlink(self):
|
||||
orig_match = self.config.parser.aug.match
|
||||
mock_vhost = copy.deepcopy(self.vh_truth[0])
|
||||
mock_vhost.filep = mock_vhost.filep.replace('sites-enabled', u'sites-available')
|
||||
mock_vhost.path = mock_vhost.path.replace('sites-enabled', 'sites-available')
|
||||
mock_vhost.enabled = False
|
||||
self.config.parser.parse_file(mock_vhost.filep)
|
||||
|
||||
def mock_match(aug_expr):
|
||||
"""Return a mocked match list of VirtualHosts"""
|
||||
if "/mocked/path" in aug_expr:
|
||||
return [self.vh_truth[1].path, self.vh_truth[0].path, mock_vhost.path]
|
||||
return orig_match(aug_expr)
|
||||
|
||||
self.config.parser.parser_paths = ["/mocked/path"]
|
||||
self.config.parser.aug.match = mock_match
|
||||
vhs = self.config.get_virtual_hosts()
|
||||
self.assertEqual(len(vhs), 2)
|
||||
self.assertTrue(vhs[0] == self.vh_truth[1])
|
||||
# mock_vhost should have replaced the vh_truth[0], because its filepath
|
||||
# isn't a symlink
|
||||
self.assertTrue(vhs[1] == mock_vhost)
|
||||
|
||||
|
||||
class AugeasVhostsTest(util.ApacheTest):
|
||||
"""Test vhosts with illegal names dependent on augeas version."""
|
||||
@@ -1517,16 +1478,16 @@ class AugeasVhostsTest(util.ApacheTest):
|
||||
self.work_dir)
|
||||
|
||||
def test_choosevhost_with_illegal_name(self):
|
||||
self.config.parser.aug = mock.MagicMock()
|
||||
self.config.parser.aug.match.side_effect = RuntimeError
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old-and-default.conf"
|
||||
self.config.aug = mock.MagicMock()
|
||||
self.config.aug.match.side_effect = RuntimeError
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old,default.conf"
|
||||
chosen_vhost = self.config._create_vhost(path)
|
||||
self.assertEqual(None, chosen_vhost)
|
||||
|
||||
def test_choosevhost_works(self):
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old-and-default.conf"
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old,default.conf"
|
||||
chosen_vhost = self.config._create_vhost(path)
|
||||
self.assertTrue(chosen_vhost is None or chosen_vhost.path == path)
|
||||
self.assertTrue(chosen_vhost == None or chosen_vhost.path == path)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator._create_vhost")
|
||||
def test_get_vhost_continue(self, mock_vhost):
|
||||
@@ -1580,7 +1541,7 @@ class AugeasVhostsTest(util.ApacheTest):
|
||||
broken_vhost)
|
||||
|
||||
class MultiVhostsTest(util.ApacheTest):
|
||||
"""Test configuration with multiple virtualhosts in a single file."""
|
||||
"""Test vhosts with illegal names dependent on augeas version."""
|
||||
# pylint: disable=protected-access
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
@@ -1647,8 +1608,7 @@ class MultiVhostsTest(util.ApacheTest):
|
||||
self.assertTrue(self.config.parser.find_dir(
|
||||
"RewriteEngine", "on", ssl_vhost.path, False))
|
||||
|
||||
with open(ssl_vhost.filep) as the_file:
|
||||
conf_text = the_file.read()
|
||||
conf_text = open(ssl_vhost.filep).read()
|
||||
commented_rewrite_rule = ("# RewriteRule \"^/secrets/(.+)\" "
|
||||
"\"https://new.example.com/docs/$1\" [R,L]")
|
||||
uncommented_rewrite_rule = ("RewriteRule \"^/docs/(.+)\" "
|
||||
@@ -1664,8 +1624,7 @@ class MultiVhostsTest(util.ApacheTest):
|
||||
|
||||
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[3])
|
||||
|
||||
with open(ssl_vhost.filep) as the_file:
|
||||
conf_lines = the_file.readlines()
|
||||
conf_lines = open(ssl_vhost.filep).readlines()
|
||||
conf_line_set = [l.strip() for l in conf_lines]
|
||||
not_commented_cond1 = ("RewriteCond "
|
||||
"%{DOCUMENT_ROOT}/%{REQUEST_FILENAME} !-f")
|
||||
@@ -1702,7 +1661,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
self.config.updated_mod_ssl_conf_digest)
|
||||
|
||||
def _current_ssl_options_hash(self):
|
||||
return crypto_util.sha256sum(self.config.option("MOD_SSL_CONF_SRC"))
|
||||
return crypto_util.sha256sum(self.config.constant("MOD_SSL_CONF_SRC"))
|
||||
|
||||
def _assert_current_file(self):
|
||||
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
|
||||
@@ -1738,7 +1697,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
self.assertFalse(mock_logger.warning.called)
|
||||
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
|
||||
self.assertEqual(crypto_util.sha256sum(
|
||||
self.config.option("MOD_SSL_CONF_SRC")),
|
||||
self.config.constant("MOD_SSL_CONF_SRC")),
|
||||
self._current_ssl_options_hash())
|
||||
self.assertNotEqual(crypto_util.sha256sum(self.config.mod_ssl_conf),
|
||||
self._current_ssl_options_hash())
|
||||
@@ -1754,7 +1713,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
"%s has been manually modified; updated file "
|
||||
"saved to %s. We recommend updating %s for security purposes.")
|
||||
self.assertEqual(crypto_util.sha256sum(
|
||||
self.config.option("MOD_SSL_CONF_SRC")),
|
||||
self.config.constant("MOD_SSL_CONF_SRC")),
|
||||
self._current_ssl_options_hash())
|
||||
# only print warning once
|
||||
with mock.patch("certbot.plugins.common.logger") as mock_logger:
|
||||
|
||||
@@ -1,11 +1,11 @@
|
||||
"""Test for certbot_apache.configurator for Debian overrides"""
|
||||
import os
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import obj
|
||||
@@ -20,7 +20,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(MultipleVhostsTestDebian, self).setUp()
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
|
||||
self.config_path, None, self.config_dir, self.work_dir,
|
||||
os_info="debian")
|
||||
self.config = self.mock_deploy_cert(self.config)
|
||||
self.vh_truth = util.get_vh_truth(self.temp_dir,
|
||||
@@ -79,9 +79,9 @@ class MultipleVhostsTestDebian(util.ApacheTest):
|
||||
|
||||
def test_enable_site_failure(self):
|
||||
self.config.parser.root = "/tmp/nonexistent"
|
||||
with mock.patch("certbot.compat.os.path.isdir") as mock_dir:
|
||||
with mock.patch("os.path.isdir") as mock_dir:
|
||||
mock_dir.return_value = True
|
||||
with mock.patch("certbot.compat.os.path.islink") as mock_link:
|
||||
with mock.patch("os.path.islink") as mock_link:
|
||||
mock_link.return_value = False
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError,
|
||||
@@ -161,8 +161,6 @@ class MultipleVhostsTestDebian(util.ApacheTest):
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
self.config.get_version = mock.Mock(return_value=(2, 4, 7))
|
||||
mock_exe.return_value = True
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "staple-ocsp")
|
||||
self.assertTrue("socache_shmcb_module" in self.config.parser.modules)
|
||||
|
||||
@@ -174,7 +172,6 @@ class MultipleVhostsTestDebian(util.ApacheTest):
|
||||
mock_exe.return_value = True
|
||||
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "ensure-http-header",
|
||||
"Strict-Transport-Security")
|
||||
self.assertTrue("headers_module" in self.config.parser.modules)
|
||||
@@ -186,7 +183,6 @@ class MultipleVhostsTestDebian(util.ApacheTest):
|
||||
mock_exe.return_value = True
|
||||
self.config.get_version = mock.Mock(return_value=(2, 2))
|
||||
# This will create an ssl vhost for certbot.demo
|
||||
self.config.choose_vhost("certbot.demo")
|
||||
self.config.enhance("certbot.demo", "redirect")
|
||||
self.assertTrue("rewrite_module" in self.config.parser.modules)
|
||||
|
||||
|
||||
@@ -6,7 +6,6 @@ import mock
|
||||
from certbot_apache import configurator
|
||||
from certbot_apache import entrypoint
|
||||
|
||||
|
||||
class EntryPointTest(unittest.TestCase):
|
||||
"""Entrypoint tests"""
|
||||
|
||||
@@ -15,13 +14,8 @@ class EntryPointTest(unittest.TestCase):
|
||||
def test_get_configurator(self):
|
||||
|
||||
with mock.patch("certbot.util.get_os_info") as mock_info:
|
||||
for distro in entrypoint.OVERRIDE_CLASSES:
|
||||
return_value = (distro, "whatever")
|
||||
if distro == 'fedora_old':
|
||||
return_value = ('fedora', '28')
|
||||
elif distro == 'fedora':
|
||||
return_value = ('fedora', '29')
|
||||
mock_info.return_value = return_value
|
||||
for distro in entrypoint.OVERRIDE_CLASSES.keys():
|
||||
mock_info.return_value = (distro, "whatever")
|
||||
self.assertEqual(entrypoint.get_configurator(),
|
||||
entrypoint.OVERRIDE_CLASSES[distro])
|
||||
|
||||
@@ -29,7 +23,7 @@ class EntryPointTest(unittest.TestCase):
|
||||
with mock.patch("certbot.util.get_os_info") as mock_info:
|
||||
mock_info.return_value = ("nonexistent", "irrelevant")
|
||||
with mock.patch("certbot.util.get_systemd_os_like") as mock_like:
|
||||
for like in entrypoint.OVERRIDE_CLASSES:
|
||||
for like in entrypoint.OVERRIDE_CLASSES.keys():
|
||||
mock_like.return_value = [like]
|
||||
self.assertEqual(entrypoint.get_configurator(),
|
||||
entrypoint.OVERRIDE_CLASSES[like])
|
||||
|
||||
@@ -1,195 +0,0 @@
|
||||
"""Test for certbot_apache.configurator for Fedora 29+ overrides"""
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import obj
|
||||
from certbot_apache import override_fedora
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
def get_vh_truth(temp_dir, config_name):
|
||||
"""Return the ground truth for the specified directory."""
|
||||
prefix = os.path.join(
|
||||
temp_dir, config_name, "httpd/conf.d")
|
||||
|
||||
aug_pre = "/files" + prefix
|
||||
# TODO: eventually, these tests should have a dedicated configuration instead
|
||||
# of reusing the ones from centos_test
|
||||
vh_truth = [
|
||||
obj.VirtualHost(
|
||||
os.path.join(prefix, "centos.example.com.conf"),
|
||||
os.path.join(aug_pre, "centos.example.com.conf/VirtualHost"),
|
||||
{obj.Addr.fromstring("*:80")},
|
||||
False, True, "centos.example.com"),
|
||||
obj.VirtualHost(
|
||||
os.path.join(prefix, "ssl.conf"),
|
||||
os.path.join(aug_pre, "ssl.conf/VirtualHost"),
|
||||
{obj.Addr.fromstring("_default_:443")},
|
||||
True, True, None)
|
||||
]
|
||||
return vh_truth
|
||||
|
||||
|
||||
class FedoraRestartTest(util.ApacheTest):
|
||||
"""Tests for Fedora specific self-signed certificate override"""
|
||||
|
||||
# TODO: eventually, these tests should have a dedicated configuration instead
|
||||
# of reusing the ones from centos_test
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
test_dir = "centos7_apache/apache"
|
||||
config_root = "centos7_apache/apache/httpd"
|
||||
vhost_root = "centos7_apache/apache/httpd/conf.d"
|
||||
super(FedoraRestartTest, self).setUp(test_dir=test_dir,
|
||||
config_root=config_root,
|
||||
vhost_root=vhost_root)
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
|
||||
os_info="fedora")
|
||||
self.vh_truth = get_vh_truth(
|
||||
self.temp_dir, "centos7_apache/apache")
|
||||
|
||||
def _run_fedora_test(self):
|
||||
self.assertIsInstance(self.config, override_fedora.FedoraConfigurator)
|
||||
self.config.config_test()
|
||||
|
||||
def test_fedora_restart_error(self):
|
||||
c_test = "certbot_apache.configurator.ApacheConfigurator.config_test"
|
||||
with mock.patch(c_test) as mock_test:
|
||||
# First call raises error, second doesn't
|
||||
mock_test.side_effect = [errors.MisconfigurationError, '']
|
||||
with mock.patch("certbot.util.run_script") as mock_run:
|
||||
mock_run.side_effect = errors.SubprocessError
|
||||
self.assertRaises(errors.MisconfigurationError,
|
||||
self._run_fedora_test)
|
||||
|
||||
def test_fedora_restart(self):
|
||||
c_test = "certbot_apache.configurator.ApacheConfigurator.config_test"
|
||||
with mock.patch(c_test) as mock_test:
|
||||
with mock.patch("certbot.util.run_script") as mock_run:
|
||||
# First call raises error, second doesn't
|
||||
mock_test.side_effect = [errors.MisconfigurationError, '']
|
||||
self._run_fedora_test()
|
||||
self.assertEqual(mock_test.call_count, 2)
|
||||
self.assertEqual(mock_run.call_args[0][0],
|
||||
['systemctl', 'restart', 'httpd'])
|
||||
|
||||
|
||||
class MultipleVhostsTestFedora(util.ApacheTest):
|
||||
"""Multiple vhost tests for CentOS / RHEL family of distros"""
|
||||
|
||||
_multiprocess_can_split_ = True
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
test_dir = "centos7_apache/apache"
|
||||
config_root = "centos7_apache/apache/httpd"
|
||||
vhost_root = "centos7_apache/apache/httpd/conf.d"
|
||||
super(MultipleVhostsTestFedora, self).setUp(test_dir=test_dir,
|
||||
config_root=config_root,
|
||||
vhost_root=vhost_root)
|
||||
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
|
||||
os_info="fedora")
|
||||
self.vh_truth = get_vh_truth(
|
||||
self.temp_dir, "centos7_apache/apache")
|
||||
|
||||
def test_get_parser(self):
|
||||
self.assertIsInstance(self.config.parser, override_fedora.FedoraParser)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
|
||||
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
|
||||
define_val = (
|
||||
'Define: TEST1\n'
|
||||
'Define: TEST2\n'
|
||||
'Define: DUMP_RUN_CFG\n'
|
||||
)
|
||||
mod_val = (
|
||||
'Loaded Modules:\n'
|
||||
' mock_module (static)\n'
|
||||
' another_module (static)\n'
|
||||
)
|
||||
def mock_get_cfg(command):
|
||||
"""Mock httpd process stdout"""
|
||||
if command == ['httpd', '-t', '-D', 'DUMP_RUN_CFG']:
|
||||
return define_val
|
||||
elif command == ['httpd', '-t', '-D', 'DUMP_MODULES']:
|
||||
return mod_val
|
||||
return ""
|
||||
mock_get.side_effect = mock_get_cfg
|
||||
self.config.parser.modules = set()
|
||||
self.config.parser.variables = {}
|
||||
|
||||
with mock.patch("certbot.util.get_os_info") as mock_osi:
|
||||
# Make sure we have the CentOS httpd constants
|
||||
mock_osi.return_value = ("fedora", "29")
|
||||
self.config.parser.update_runtime_variables()
|
||||
|
||||
self.assertEqual(mock_get.call_count, 3)
|
||||
self.assertEqual(len(self.config.parser.modules), 4)
|
||||
self.assertEqual(len(self.config.parser.variables), 2)
|
||||
self.assertTrue("TEST2" in self.config.parser.variables.keys())
|
||||
self.assertTrue("mod_another.c" in self.config.parser.modules)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.util.run_script")
|
||||
def test_get_version(self, mock_run_script):
|
||||
mock_run_script.return_value = ('', None)
|
||||
self.assertRaises(errors.PluginError, self.config.get_version)
|
||||
self.assertEqual(mock_run_script.call_args[0][0][0], 'httpd')
|
||||
|
||||
def test_get_virtual_hosts(self):
|
||||
"""Make sure all vhosts are being properly found."""
|
||||
vhs = self.config.get_virtual_hosts()
|
||||
self.assertEqual(len(vhs), 2)
|
||||
found = 0
|
||||
|
||||
for vhost in vhs:
|
||||
for centos_truth in self.vh_truth:
|
||||
if vhost == centos_truth:
|
||||
found += 1
|
||||
break
|
||||
else:
|
||||
raise Exception("Missed: %s" % vhost) # pragma: no cover
|
||||
self.assertEqual(found, 2)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
|
||||
def test_get_sysconfig_vars(self, mock_cfg):
|
||||
"""Make sure we read the sysconfig OPTIONS variable correctly"""
|
||||
# Return nothing for the process calls
|
||||
mock_cfg.return_value = ""
|
||||
self.config.parser.sysconfig_filep = filesystem.realpath(
|
||||
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
|
||||
self.config.parser.variables = {}
|
||||
|
||||
with mock.patch("certbot.util.get_os_info") as mock_osi:
|
||||
# Make sure we have the CentOS httpd constants
|
||||
mock_osi.return_value = ("fedora", "29")
|
||||
self.config.parser.update_runtime_variables()
|
||||
|
||||
self.assertTrue("mock_define" in self.config.parser.variables.keys())
|
||||
self.assertTrue("mock_define_too" in self.config.parser.variables.keys())
|
||||
self.assertTrue("mock_value" in self.config.parser.variables.keys())
|
||||
self.assertEqual("TRUE", self.config.parser.variables["mock_value"])
|
||||
self.assertTrue("MOCK_NOSEP" in self.config.parser.variables.keys())
|
||||
self.assertEqual("NOSEP_VAL", self.config.parser.variables["NOSEP_TWO"])
|
||||
|
||||
@mock.patch("certbot_apache.configurator.util.run_script")
|
||||
def test_alt_restart_works(self, mock_run_script):
|
||||
mock_run_script.side_effect = [None, errors.SubprocessError, None]
|
||||
self.config.restart()
|
||||
self.assertEqual(mock_run_script.call_count, 3)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.util.run_script")
|
||||
def test_alt_restart_errors(self, mock_run_script):
|
||||
mock_run_script.side_effect = [None,
|
||||
errors.SubprocessError,
|
||||
errors.SubprocessError]
|
||||
self.assertRaises(errors.MisconfigurationError, self.config.restart)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
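The FedoraRestartTest cases deleted above exercised a restart fallback: when a plain config_test() fails, the override restarts httpd via systemctl and re-runs the check. A rough sketch of that behavior, inferred from the assertions in the removed tests rather than copied from the override_fedora source, looks like this:

```
# Sketch of the fallback the removed FedoraRestartTest cases asserted on.
# Inferred from the tests above; the real override_fedora code may differ.
from certbot import errors, util

def config_test_with_restart_fallback(configurator):
    try:
        configurator.config_test()       # generic ApacheConfigurator check
    except errors.MisconfigurationError:
        try:
            # Fedora needs a full restart before the test config loads cleanly
            util.run_script(['systemctl', 'restart', 'httpd'])
        except errors.SubprocessError as err:
            raise errors.MisconfigurationError(str(err))
        configurator.config_test()       # second attempt must now succeed
```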
@@ -1,17 +1,15 @@
|
||||
"""Test for certbot_apache.configurator for Gentoo overrides"""
|
||||
import os
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import obj
|
||||
from certbot_apache import override_gentoo
|
||||
from certbot_apache import obj
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
def get_vh_truth(temp_dir, config_name):
|
||||
"""Return the ground truth for the specified directory."""
|
||||
prefix = os.path.join(
|
||||
@@ -82,7 +80,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
|
||||
"""Make sure we read the Gentoo APACHE2_OPTS variable correctly"""
|
||||
defines = ['DEFAULT_VHOST', 'INFO',
|
||||
'SSL', 'SSL_DEFAULT_VHOST', 'LANGUAGE']
|
||||
self.config.parser.apacheconfig_filep = filesystem.realpath(
|
||||
self.config.parser.apacheconfig_filep = os.path.realpath(
|
||||
os.path.join(self.config.parser.root, "../conf.d/apache2"))
|
||||
self.config.parser.variables = {}
|
||||
with mock.patch("certbot_apache.override_gentoo.GentooParser.update_modules"):
|
||||
@@ -115,24 +113,23 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
|
||||
"""Mock httpd process stdout"""
|
||||
if command == ['apache2ctl', 'modules']:
|
||||
return mod_val
|
||||
return None # pragma: no cover
|
||||
mock_get.side_effect = mock_get_cfg
|
||||
self.config.parser.modules = set()
|
||||
|
||||
with mock.patch("certbot.util.get_os_info") as mock_osi:
|
||||
# Make sure we have the Gentoo httpd constants
|
||||
# Make sure we have the CentOS httpd constants
|
||||
mock_osi.return_value = ("gentoo", "123")
|
||||
self.config.parser.update_runtime_variables()
|
||||
|
||||
self.assertEqual(mock_get.call_count, 1)
|
||||
self.assertEqual(len(self.config.parser.modules), 4)
|
||||
self.assertEquals(mock_get.call_count, 1)
|
||||
self.assertEquals(len(self.config.parser.modules), 4)
|
||||
self.assertTrue("mod_another.c" in self.config.parser.modules)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.util.run_script")
|
||||
def test_alt_restart_works(self, mock_run_script):
|
||||
mock_run_script.side_effect = [None, errors.SubprocessError, None]
|
||||
self.config.restart()
|
||||
self.assertEqual(mock_run_script.call_count, 3)
|
||||
self.assertEquals(mock_run_script.call_count, 3)
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
|
||||
@@ -1,35 +1,54 @@
|
||||
"""Test for certbot_apache.http_01."""
|
||||
import unittest
|
||||
import mock
|
||||
import os
|
||||
import unittest
|
||||
|
||||
from acme import challenges
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
from certbot import achallenges
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot.tests import acme_util
|
||||
|
||||
from certbot_apache.parser import get_aug_path
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
NUM_ACHALLS = 3
|
||||
|
||||
|
||||
class ApacheHttp01TestMeta(type):
|
||||
"""Generates parmeterized tests for testing perform."""
|
||||
def __new__(mcs, name, bases, class_dict):
|
||||
|
||||
def _gen_test(num_achalls, minor_version):
|
||||
def _test(self):
|
||||
achalls = self.achalls[:num_achalls]
|
||||
vhosts = self.vhosts[:num_achalls]
|
||||
self.config.version = (2, minor_version)
|
||||
self.common_perform_test(achalls, vhosts)
|
||||
return _test
|
||||
|
||||
for i in range(1, NUM_ACHALLS + 1):
|
||||
for j in (2, 4):
|
||||
test_name = "test_perform_{0}_{1}".format(i, j)
|
||||
class_dict[test_name] = _gen_test(i, j)
|
||||
return type.__new__(mcs, name, bases, class_dict)
|
||||
|
||||
|
||||
class ApacheHttp01Test(util.ApacheTest):
|
||||
"""Test for certbot_apache.http_01.ApacheHttp01."""
|
||||
|
||||
def setUp(self, *args, **kwargs): # pylint: disable=arguments-differ
|
||||
__metaclass__ = ApacheHttp01TestMeta
|
||||
|
||||
def setUp(self, *args, **kwargs):
|
||||
super(ApacheHttp01Test, self).setUp(*args, **kwargs)
|
||||
|
||||
self.account_key = self.rsa512jwk
|
||||
self.achalls = [] # type: List[achallenges.KeyAuthorizationAnnotatedChallenge]
|
||||
self.achalls = []
|
||||
vh_truth = util.get_vh_truth(
|
||||
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
|
||||
# Takes the vhosts for encryption-example.demo, certbot.demo
|
||||
# and vhost.in.rootconf
|
||||
# Takes the vhosts for encryption-example.demo, certbot.demo, and
|
||||
# vhost.in.rootconf
|
||||
self.vhosts = [vh_truth[0], vh_truth[3], vh_truth[10]]
|
||||
|
||||
for i in range(NUM_ACHALLS):
|
||||
@@ -40,7 +59,7 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
"pending"),
|
||||
domain=self.vhosts[i].name, account_key=self.account_key))
|
||||
|
||||
modules = ["ssl", "rewrite", "authz_core", "authz_host"]
|
||||
modules = ["rewrite", "authz_core", "authz_host"]
|
||||
for mod in modules:
|
||||
self.config.parser.modules.add("mod_{0}.c".format(mod))
|
||||
self.config.parser.modules.add(mod + "_module")
|
||||
@@ -52,7 +71,7 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
self.assertFalse(self.http.perform())
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.enable_mod")
|
||||
def test_enable_modules_apache_2_2(self, mock_enmod):
|
||||
def test_enable_modules_22(self, mock_enmod):
|
||||
self.config.version = (2, 2)
|
||||
self.config.parser.modules.remove("authz_host_module")
|
||||
self.config.parser.modules.remove("mod_authz_host.c")
|
||||
@@ -61,7 +80,7 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
self.assertEqual(enmod_calls[0][0][0], "authz_host")
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.enable_mod")
|
||||
def test_enable_modules_apache_2_4(self, mock_enmod):
|
||||
def test_enable_modules_24(self, mock_enmod):
|
||||
self.config.parser.modules.remove("authz_core_module")
|
||||
self.config.parser.modules.remove("mod_authz_core.c")
|
||||
|
||||
@@ -79,7 +98,7 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
calls = mock_enmod.call_args_list
|
||||
other_calls = []
|
||||
for call in calls:
|
||||
if call[0][0] != "rewrite":
|
||||
if "rewrite" != call[0][0]:
|
||||
other_calls.append(call)
|
||||
|
||||
# If these lists are equal, we never enabled mod_rewrite
|
||||
@@ -112,63 +131,12 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
domain="something.nonexistent", account_key=self.account_key)]
|
||||
self.common_perform_test(achalls, vhosts)
|
||||
|
||||
def test_configure_multiple_vhosts(self):
|
||||
vhosts = [v for v in self.config.vhosts if "duplicate.example.com" in v.get_names()]
|
||||
self.assertEqual(len(vhosts), 2)
|
||||
achalls = [
|
||||
achallenges.KeyAuthorizationAnnotatedChallenge(
|
||||
challb=acme_util.chall_to_challb(
|
||||
challenges.HTTP01(token=((b'a' * 16))),
|
||||
"pending"),
|
||||
domain="duplicate.example.com", account_key=self.account_key)]
|
||||
self.common_perform_test(achalls, vhosts)
|
||||
|
||||
def test_no_vhost(self):
|
||||
for achall in self.achalls:
|
||||
self.http.add_chall(achall)
|
||||
self.config.config.http01_port = 12345
|
||||
self.assertRaises(errors.PluginError, self.http.perform)
|
||||
|
||||
def test_perform_1_achall_apache_2_2(self):
|
||||
self.combinations_perform_test(num_achalls=1, minor_version=2)
|
||||
|
||||
def test_perform_1_achall_apache_2_4(self):
|
||||
self.combinations_perform_test(num_achalls=1, minor_version=4)
|
||||
|
||||
def test_perform_2_achall_apache_2_2(self):
|
||||
self.combinations_perform_test(num_achalls=2, minor_version=2)
|
||||
|
||||
def test_perform_2_achall_apache_2_4(self):
|
||||
self.combinations_perform_test(num_achalls=2, minor_version=4)
|
||||
|
||||
def test_perform_3_achall_apache_2_2(self):
|
||||
self.combinations_perform_test(num_achalls=3, minor_version=2)
|
||||
|
||||
def test_perform_3_achall_apache_2_4(self):
|
||||
self.combinations_perform_test(num_achalls=3, minor_version=4)
|
||||
|
||||
def test_activate_disabled_vhost(self):
|
||||
vhosts = [v for v in self.config.vhosts if v.name == "certbot.demo"]
|
||||
achalls = [
|
||||
achallenges.KeyAuthorizationAnnotatedChallenge(
|
||||
challb=acme_util.chall_to_challb(
|
||||
challenges.HTTP01(token=((b'a' * 16))),
|
||||
"pending"),
|
||||
domain="certbot.demo", account_key=self.account_key)]
|
||||
vhosts[0].enabled = False
|
||||
self.common_perform_test(achalls, vhosts)
|
||||
matches = self.config.parser.find_dir(
|
||||
"Include", vhosts[0].filep,
|
||||
get_aug_path(self.config.parser.loc["default"]))
|
||||
self.assertEqual(len(matches), 1)
|
||||
|
||||
def combinations_perform_test(self, num_achalls, minor_version):
|
||||
"""Test perform with the given achall count and Apache version."""
|
||||
achalls = self.achalls[:num_achalls]
|
||||
vhosts = self.vhosts[:num_achalls]
|
||||
self.config.version = (2, minor_version)
|
||||
self.common_perform_test(achalls, vhosts)
|
||||
|
||||
def common_perform_test(self, achalls, vhosts):
|
||||
"""Tests perform with the given achalls."""
|
||||
challenge_dir = self.http.challenge_dir
|
||||
@@ -181,21 +149,22 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
self.assertEqual(self.http.perform(), expected_response)
|
||||
|
||||
self.assertTrue(os.path.isdir(self.http.challenge_dir))
|
||||
self.assertTrue(filesystem.has_min_permissions(self.http.challenge_dir, 0o755))
|
||||
self._has_min_permissions(self.http.challenge_dir, 0o755)
|
||||
self._test_challenge_conf()
|
||||
|
||||
for achall in achalls:
|
||||
self._test_challenge_file(achall)
|
||||
|
||||
for vhost in vhosts:
|
||||
matches = self.config.parser.find_dir("Include",
|
||||
self.http.challenge_conf_pre,
|
||||
vhost.path)
|
||||
self.assertEqual(len(matches), 1)
|
||||
matches = self.config.parser.find_dir("Include",
|
||||
self.http.challenge_conf_post,
|
||||
vhost.path)
|
||||
self.assertEqual(len(matches), 1)
|
||||
if not vhost.ssl:
|
||||
matches = self.config.parser.find_dir("Include",
|
||||
self.http.challenge_conf_pre,
|
||||
vhost.path)
|
||||
self.assertEqual(len(matches), 1)
|
||||
matches = self.config.parser.find_dir("Include",
|
||||
self.http.challenge_conf_post,
|
||||
vhost.path)
|
||||
self.assertEqual(len(matches), 1)
|
||||
|
||||
self.assertTrue(os.path.exists(challenge_dir))
|
||||
|
||||
@@ -219,10 +188,15 @@ class ApacheHttp01Test(util.ApacheTest):
|
||||
name = os.path.join(self.http.challenge_dir, achall.chall.encode("token"))
|
||||
validation = achall.validation(self.account_key)
|
||||
|
||||
self.assertTrue(filesystem.has_min_permissions(name, 0o644))
|
||||
self._has_min_permissions(name, 0o644)
|
||||
with open(name, 'rb') as f:
|
||||
self.assertEqual(f.read(), validation.encode())
|
||||
|
||||
def _has_min_permissions(self, path, min_mode):
|
||||
"""Tests the given file has at least the permissions in mode."""
|
||||
st_mode = os.stat(path).st_mode
|
||||
self.assertEqual(st_mode, st_mode | min_mode)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
|
||||
@@ -1,11 +1,12 @@
|
||||
"""Tests for certbot_apache.parser."""
|
||||
import os
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
import augeas
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache.tests import util
|
||||
|
||||
@@ -21,27 +22,6 @@ class BasicParserTest(util.ParserTest):
|
||||
shutil.rmtree(self.config_dir)
|
||||
shutil.rmtree(self.work_dir)
|
||||
|
||||
def test_bad_parse(self):
|
||||
self.parser.parse_file(os.path.join(self.parser.root,
|
||||
"conf-available", "bad_conf_file.conf"))
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.parser.check_parsing_errors, "httpd.aug")
|
||||
|
||||
def test_bad_save(self):
|
||||
mock_save = mock.Mock()
|
||||
mock_save.side_effect = IOError
|
||||
self.parser.aug.save = mock_save
|
||||
self.assertRaises(errors.PluginError, self.parser.unsaved_files)
|
||||
|
||||
def test_aug_version(self):
|
||||
mock_match = mock.Mock(return_value=["something"])
|
||||
self.parser.aug.match = mock_match
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.parser.check_aug_version(),
|
||||
["something"])
|
||||
self.parser.aug.match.side_effect = RuntimeError
|
||||
self.assertFalse(self.parser.check_aug_version())
|
||||
|
||||
def test_find_config_root_no_root(self):
|
||||
# pylint: disable=protected-access
|
||||
os.remove(self.parser.loc["root"])
|
||||
@@ -72,7 +52,7 @@ class BasicParserTest(util.ParserTest):
|
||||
test2 = self.parser.find_dir("documentroot")
|
||||
|
||||
self.assertEqual(len(test), 1)
|
||||
self.assertEqual(len(test2), 8)
|
||||
self.assertEqual(len(test2), 7)
|
||||
|
||||
def test_add_dir(self):
|
||||
aug_default = "/files" + self.parser.loc["default"]
|
||||
@@ -104,7 +84,7 @@ class BasicParserTest(util.ParserTest):
|
||||
self.assertEqual(self.parser.aug.get(match), str(i + 1))
|
||||
|
||||
def test_empty_arg(self):
|
||||
self.assertEqual(None,
|
||||
self.assertEquals(None,
|
||||
self.parser.get_arg("/files/whatever/nonexistent"))
|
||||
|
||||
def test_add_dir_to_ifmodssl(self):
|
||||
@@ -254,7 +234,6 @@ class BasicParserTest(util.ParserTest):
|
||||
return inc_val
|
||||
elif cmd[-1] == "DUMP_MODULES":
|
||||
return mod_val
|
||||
return None # pragma: no cover
|
||||
|
||||
mock_cfg.side_effect = mock_get_vars
|
||||
|
||||
@@ -303,11 +282,11 @@ class BasicParserTest(util.ParserTest):
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.parser.update_runtime_variables)
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.option")
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.constant")
|
||||
@mock.patch("certbot_apache.parser.subprocess.Popen")
|
||||
def test_update_runtime_vars_bad_ctl(self, mock_popen, mock_opt):
|
||||
def test_update_runtime_vars_bad_ctl(self, mock_popen, mock_const):
|
||||
mock_popen.side_effect = OSError
|
||||
mock_opt.return_value = "nonexistent"
|
||||
mock_const.return_value = "nonexistent"
|
||||
self.assertRaises(
|
||||
errors.MisconfigurationError,
|
||||
self.parser.update_runtime_variables)
|
||||
@@ -320,49 +299,25 @@ class BasicParserTest(util.ParserTest):
|
||||
errors.MisconfigurationError,
|
||||
self.parser.update_runtime_variables)
|
||||
|
||||
def test_add_comment(self):
|
||||
from certbot_apache.parser import get_aug_path
|
||||
self.parser.add_comment(get_aug_path(self.parser.loc["name"]), "123456")
|
||||
comm = self.parser.find_comments("123456")
|
||||
self.assertEqual(len(comm), 1)
|
||||
self.assertTrue(self.parser.loc["name"] in comm[0])
|
||||
|
||||
|
||||
class ParserInitTest(util.ApacheTest):
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(ParserInitTest, self).setUp()
|
||||
self.aug = augeas.Augeas(
|
||||
flags=augeas.Augeas.NONE | augeas.Augeas.NO_MODL_AUTOLOAD)
|
||||
|
||||
def tearDown(self):
|
||||
shutil.rmtree(self.temp_dir)
|
||||
shutil.rmtree(self.config_dir)
|
||||
shutil.rmtree(self.work_dir)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser.init_augeas")
|
||||
def test_prepare_no_augeas(self, mock_init_augeas):
|
||||
from certbot_apache.parser import ApacheParser
|
||||
mock_init_augeas.side_effect = errors.NoInstallationError
|
||||
self.config.config_test = mock.Mock()
|
||||
self.assertRaises(
|
||||
errors.NoInstallationError, ApacheParser,
|
||||
os.path.relpath(self.config_path), "/dummy/vhostpath",
|
||||
version=(2, 4, 22), configurator=self.config)
|
||||
|
||||
def test_init_old_aug(self):
|
||||
from certbot_apache.parser import ApacheParser
|
||||
with mock.patch("certbot_apache.parser.ApacheParser.check_aug_version") as mock_c:
|
||||
mock_c.return_value = False
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError,
|
||||
ApacheParser, os.path.relpath(self.config_path),
|
||||
"/dummy/vhostpath", version=(2, 4, 22), configurator=self.config)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
|
||||
def test_unparseable(self, mock_cfg):
|
||||
from certbot_apache.parser import ApacheParser
|
||||
mock_cfg.return_value = ('Define: TEST')
|
||||
self.assertRaises(
|
||||
errors.PluginError,
|
||||
ApacheParser, os.path.relpath(self.config_path),
|
||||
ApacheParser, self.aug, os.path.relpath(self.config_path),
|
||||
"/dummy/vhostpath", version=(2, 2, 22), configurator=self.config)
|
||||
|
||||
def test_root_normalized(self):
|
||||
@@ -374,7 +329,8 @@ class ParserInitTest(util.ApacheTest):
|
||||
self.temp_dir,
|
||||
"debian_apache_2_4/////multiple_vhosts/../multiple_vhosts/apache2")
|
||||
|
||||
parser = ApacheParser(path, "/dummy/vhostpath", configurator=self.config)
|
||||
parser = ApacheParser(self.aug, path,
|
||||
"/dummy/vhostpath", configurator=self.config)
|
||||
|
||||
self.assertEqual(parser.root, self.config_path)
|
||||
|
||||
@@ -383,7 +339,7 @@ class ParserInitTest(util.ApacheTest):
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
parser = ApacheParser(
|
||||
os.path.relpath(self.config_path),
|
||||
self.aug, os.path.relpath(self.config_path),
|
||||
"/dummy/vhostpath", configurator=self.config)
|
||||
|
||||
self.assertEqual(parser.root, self.config_path)
|
||||
@@ -393,7 +349,7 @@ class ParserInitTest(util.ApacheTest):
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
parser = ApacheParser(
|
||||
self.config_path + os.path.sep,
|
||||
self.aug, self.config_path + os.path.sep,
|
||||
"/dummy/vhostpath", configurator=self.config)
|
||||
self.assertEqual(parser.root, self.config_path)
|
||||
|
||||
|
||||
@@ -1,9 +0,0 @@
|
||||
|
||||
This directory holds Apache 2.0 module-specific configuration files;
|
||||
any files in this directory which have the ".conf" extension will be
|
||||
processed as Apache configuration files.
|
||||
|
||||
Files are processed in alphabetical order, so if using configuration
|
||||
directives which depend on, say, mod_perl being loaded, ensure that
|
||||
these are placed in a filename later in the sort order than "perl.conf".
|
||||
|
||||
@@ -1,222 +0,0 @@
|
||||
#
|
||||
# This is the Apache server configuration file providing SSL support.
|
||||
# It contains the configuration directives to instruct the server how to
|
||||
# serve pages over an https connection. For detailed information about these
|
||||
# directives see <URL:http://httpd.apache.org/docs/2.2/mod/mod_ssl.html>
|
||||
#
|
||||
# Do NOT simply read the instructions in here without understanding
|
||||
# what they do. They're here only as hints or reminders. If you are unsure
|
||||
# consult the online docs. You have been warned.
|
||||
#
|
||||
|
||||
LoadModule ssl_module modules/mod_ssl.so
|
||||
|
||||
#
|
||||
# When we also provide SSL we have to listen to
|
||||
# the HTTPS port in addition.
|
||||
#
|
||||
Listen 443
|
||||
|
||||
##
|
||||
## SSL Global Context
|
||||
##
|
||||
## All SSL configuration in this context applies both to
|
||||
## the main server and all SSL-enabled virtual hosts.
|
||||
##
|
||||
|
||||
# Pass Phrase Dialog:
|
||||
# Configure the pass phrase gathering process.
|
||||
# The filtering dialog program (`builtin' is an internal
|
||||
# terminal dialog) has to provide the pass phrase on stdout.
|
||||
SSLPassPhraseDialog builtin
|
||||
|
||||
# Inter-Process Session Cache:
|
||||
# Configure the SSL Session Cache: First the mechanism
|
||||
# to use and second the expiring timeout (in seconds).
|
||||
SSLSessionCache shmcb:/var/cache/mod_ssl/scache(512000)
|
||||
SSLSessionCacheTimeout 300
|
||||
|
||||
# Semaphore:
|
||||
# Configure the path to the mutual exclusion semaphore the
|
||||
# SSL engine uses internally for inter-process synchronization.
|
||||
SSLMutex default
|
||||
|
||||
# Pseudo Random Number Generator (PRNG):
|
||||
# Configure one or more sources to seed the PRNG of the
|
||||
# SSL library. The seed data should be of good random quality.
|
||||
# WARNING! On some platforms /dev/random blocks if not enough entropy
|
||||
# is available. This means you then cannot use the /dev/random device
|
||||
# because it would lead to very long connection times (as long as
|
||||
# it requires to make more entropy available). But usually those
|
||||
# platforms additionally provide a /dev/urandom device which doesn't
|
||||
# block. So, if available, use this one instead. Read the mod_ssl User
|
||||
# Manual for more details.
|
||||
SSLRandomSeed startup file:/dev/urandom 256
|
||||
SSLRandomSeed connect builtin
|
||||
#SSLRandomSeed startup file:/dev/random 512
|
||||
#SSLRandomSeed connect file:/dev/random 512
|
||||
#SSLRandomSeed connect file:/dev/urandom 512
|
||||
|
||||
#
|
||||
# Use "SSLCryptoDevice" to enable any supported hardware
|
||||
# accelerators. Use "openssl engine -v" to list supported
|
||||
# engine names. NOTE: If you enable an accelerator and the
|
||||
# server does not start, consult the error logs and ensure
|
||||
# your accelerator is functioning properly.
|
||||
#
|
||||
SSLCryptoDevice builtin
|
||||
#SSLCryptoDevice ubsec
|
||||
|
||||
##
|
||||
## SSL Virtual Host Context
|
||||
##
|
||||
|
||||
<VirtualHost _default_:443>
|
||||
|
||||
# General setup for the virtual host, inherited from global configuration
|
||||
#DocumentRoot "/var/www/html"
|
||||
#ServerName www.example.com:443
|
||||
|
||||
# Use separate log files for the SSL virtual host; note that LogLevel
|
||||
# is not inherited from httpd.conf.
|
||||
ErrorLog logs/ssl_error_log
|
||||
TransferLog logs/ssl_access_log
|
||||
LogLevel warn
|
||||
|
||||
# SSL Engine Switch:
|
||||
# Enable/Disable SSL for this virtual host.
|
||||
SSLEngine on
|
||||
|
||||
# SSL Protocol support:
|
||||
# List the enabled protocol levels with which clients will be able to
|
||||
# connect. Disable SSLv2 access by default:
|
||||
SSLProtocol all -SSLv2
|
||||
|
||||
# SSL Cipher Suite:
|
||||
# List the ciphers that the client is permitted to negotiate.
|
||||
# See the mod_ssl documentation for a complete list.
|
||||
SSLCipherSuite DEFAULT:!EXP:!SSLv2:!DES:!IDEA:!SEED:+3DES
|
||||
|
||||
# Server Certificate:
|
||||
# Point SSLCertificateFile at a PEM encoded certificate. If
|
||||
# the certificate is encrypted, then you will be prompted for a
|
||||
# pass phrase. Note that a kill -HUP will prompt again. A new
|
||||
# certificate can be generated using the genkey(1) command.
|
||||
SSLCertificateFile /etc/pki/tls/certs/localhost.crt
|
||||
|
||||
# Server Private Key:
|
||||
# If the key is not combined with the certificate, use this
|
||||
# directive to point at the key file. Keep in mind that if
|
||||
# you've both a RSA and a DSA private key you can configure
|
||||
# both in parallel (to also allow the use of DSA ciphers, etc.)
|
||||
SSLCertificateKeyFile /etc/pki/tls/private/localhost.key
|
||||
|
||||
# Server Certificate Chain:
|
||||
# Point SSLCertificateChainFile at a file containing the
|
||||
# concatenation of PEM encoded CA certificates which form the
|
||||
# certificate chain for the server certificate. Alternatively
|
||||
# the referenced file can be the same as SSLCertificateFile
|
||||
# when the CA certificates are directly appended to the server
|
||||
# certificate for convenience.
|
||||
#SSLCertificateChainFile /etc/pki/tls/certs/server-chain.crt
|
||||
|
||||
# Certificate Authority (CA):
|
||||
# Set the CA certificate verification path where to find CA
|
||||
# certificates for client authentication or alternatively one
|
||||
# huge file containing all of them (file must be PEM encoded)
|
||||
#SSLCACertificateFile /etc/pki/tls/certs/ca-bundle.crt
|
||||
|
||||
# Client Authentication (Type):
|
||||
# Client certificate verification type and depth. Types are
|
||||
# none, optional, require and optional_no_ca. Depth is a
|
||||
# number which specifies how deeply to verify the certificate
|
||||
# issuer chain before deciding the certificate is not valid.
|
||||
#SSLVerifyClient require
|
||||
#SSLVerifyDepth 10
|
||||
|
||||
# Access Control:
|
||||
# With SSLRequire you can do per-directory access control based
|
||||
# on arbitrary complex boolean expressions containing server
|
||||
# variable checks and other lookup directives. The syntax is a
|
||||
# mixture between C and Perl. See the mod_ssl documentation
|
||||
# for more details.
|
||||
#<Location />
|
||||
#SSLRequire ( %{SSL_CIPHER} !~ m/^(EXP|NULL)/ \
|
||||
# and %{SSL_CLIENT_S_DN_O} eq "Snake Oil, Ltd." \
|
||||
# and %{SSL_CLIENT_S_DN_OU} in {"Staff", "CA", "Dev"} \
|
||||
# and %{TIME_WDAY} >= 1 and %{TIME_WDAY} <= 5 \
|
||||
# and %{TIME_HOUR} >= 8 and %{TIME_HOUR} <= 20 ) \
|
||||
# or %{REMOTE_ADDR} =~ m/^192\.76\.162\.[0-9]+$/
|
||||
#</Location>
|
||||
|
||||
# SSL Engine Options:
|
||||
# Set various options for the SSL engine.
|
||||
# o FakeBasicAuth:
|
||||
# Translate the client X.509 into a Basic Authorisation. This means that
|
||||
# the standard Auth/DBMAuth methods can be used for access control. The
|
||||
# user name is the `one line' version of the client's X.509 certificate.
|
||||
# Note that no password is obtained from the user. Every entry in the user
|
||||
# file needs this password: `xxj31ZMTZzkVA'.
|
||||
# o ExportCertData:
|
||||
# This exports two additional environment variables: SSL_CLIENT_CERT and
|
||||
# SSL_SERVER_CERT. These contain the PEM-encoded certificates of the
|
||||
# server (always existing) and the client (only existing when client
|
||||
# authentication is used). This can be used to import the certificates
|
||||
# into CGI scripts.
|
||||
# o StdEnvVars:
|
||||
# This exports the standard SSL/TLS related `SSL_*' environment variables.
|
||||
# Per default this exportation is switched off for performance reasons,
|
||||
# because the extraction step is an expensive operation and is usually
|
||||
# useless for serving static content. So one usually enables the
|
||||
# exportation for CGI and SSI requests only.
|
||||
# o StrictRequire:
|
||||
# This denies access when "SSLRequireSSL" or "SSLRequire" applied even
|
||||
# under a "Satisfy any" situation, i.e. when it applies access is denied
|
||||
# and no other module can change it.
|
||||
# o OptRenegotiate:
|
||||
# This enables optimized SSL connection renegotiation handling when SSL
|
||||
# directives are used in per-directory context.
|
||||
#SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire
|
||||
<Files ~ "\.(cgi|shtml|phtml|php3?)$">
|
||||
SSLOptions +StdEnvVars
|
||||
</Files>
|
||||
<Directory "/var/www/cgi-bin">
|
||||
SSLOptions +StdEnvVars
|
||||
</Directory>
|
||||
|
||||
# SSL Protocol Adjustments:
|
||||
# The safe and default but still SSL/TLS standard compliant shutdown
|
||||
# approach is that mod_ssl sends the close notify alert but doesn't wait for
|
||||
# the close notify alert from client. When you need a different shutdown
|
||||
# approach you can use one of the following variables:
|
||||
# o ssl-unclean-shutdown:
|
||||
# This forces an unclean shutdown when the connection is closed, i.e. no
|
||||
# SSL close notify alert is sent or allowed to be received. This violates
|
||||
# the SSL/TLS standard but is needed for some brain-dead browsers. Use
|
||||
# this when you receive I/O errors because of the standard approach where
|
||||
# mod_ssl sends the close notify alert.
|
||||
# o ssl-accurate-shutdown:
|
||||
# This forces an accurate shutdown when the connection is closed, i.e. a
|
||||
# SSL close notify alert is sent and mod_ssl waits for the close notify
|
||||
# alert of the client. This is 100% SSL/TLS standard compliant, but in
|
||||
# practice often causes hanging connections with brain-dead browsers. Use
|
||||
# this only for browsers where you know that their SSL implementation
|
||||
# works correctly.
|
||||
# Notice: Most problems of broken clients are also related to the HTTP
|
||||
# keep-alive facility, so you usually additionally want to disable
|
||||
# keep-alive for those clients, too. Use variable "nokeepalive" for this.
|
||||
# Similarly, one has to force some clients to use HTTP/1.0 to workaround
|
||||
# their broken HTTP/1.1 implementation. Use variables "downgrade-1.0" and
|
||||
# "force-response-1.0" for this.
|
||||
SetEnvIf User-Agent ".*MSIE.*" \
|
||||
nokeepalive ssl-unclean-shutdown \
|
||||
downgrade-1.0 force-response-1.0
|
||||
|
||||
# Per-Server Logging:
|
||||
# The home of a custom SSL log file. Use this when you want a
|
||||
# compact non-error SSL logfile on a virtual host basis.
|
||||
CustomLog logs/ssl_request_log \
|
||||
"%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \"%r\" %b"
|
||||
|
||||
</VirtualHost>
|
||||
|
||||
@@ -1,7 +0,0 @@
|
||||
<VirtualHost *:80>
|
||||
ServerName test.example.com
|
||||
ServerAdmin webmaster@dummy-host.example.com
|
||||
DocumentRoot /var/www/htdocs
|
||||
ErrorLog logs/dummy-host.example.com-error_log
|
||||
CustomLog logs/dummy-host.example.com-access_log common
|
||||
</VirtualHost>
|
||||
@@ -1,11 +0,0 @@
|
||||
#
|
||||
# This configuration file enables the default "Welcome"
|
||||
# page if there is no default index page present for
|
||||
# the root URL. To disable the Welcome page, comment
|
||||
# out all the lines below.
|
||||
#
|
||||
<LocationMatch "^/+$">
|
||||
Options -Indexes
|
||||
ErrorDocument 403 /error/noindex.html
|
||||
</LocationMatch>
|
||||
|
||||
File diff suppressed because it is too large
@@ -0,0 +1 @@
|
||||
../sites-available/old,default.conf
|
||||
@@ -1 +0,0 @@
|
||||
../sites-available/old-and-default.conf
|
||||
@@ -1,9 +0,0 @@
|
||||
<VirtualHost 10.2.3.4:80>
|
||||
ServerName duplicate.example.com
|
||||
|
||||
ServerAdmin webmaster@certbot.demo
|
||||
DocumentRoot /var/www/html
|
||||
|
||||
ErrorLog ${APACHE_LOG_DIR}/error.log
|
||||
CustomLog ${APACHE_LOG_DIR}/access.log combined
|
||||
</VirtualHost>
|
||||
@@ -1,14 +0,0 @@
|
||||
<IfModule mod_ssl.c>
|
||||
<VirtualHost 10.2.3.4:443>
|
||||
ServerName duplicate.example.com
|
||||
|
||||
ServerAdmin webmaster@certbot.demo
|
||||
DocumentRoot /var/www/html
|
||||
|
||||
ErrorLog ${APACHE_LOG_DIR}/error.log
|
||||
CustomLog ${APACHE_LOG_DIR}/access.log combined
|
||||
|
||||
SSLCertificateFile /etc/apache2/certs/certbot-cert_5.pem
|
||||
SSLCertificateKeyFile /etc/apache2/ssl/key-certbot_15.pem
|
||||
</VirtualHost>
|
||||
</IfModule>
|
||||
@@ -1 +0,0 @@
|
||||
../sites-available/duplicatehttp.conf
|
||||
@@ -1 +0,0 @@
|
||||
../sites-available/duplicatehttps.conf
|
||||
certbot-apache/certbot_apache/tests/tls_sni_01_test.py (new file, 151 lines)
@@ -0,0 +1,151 @@
|
||||
"""Test for certbot_apache.tls_sni_01."""
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.plugins import common_test
|
||||
|
||||
from certbot_apache import obj
|
||||
from certbot_apache.tests import util
|
||||
|
||||
from six.moves import xrange # pylint: disable=redefined-builtin, import-error
|
||||
|
||||
|
||||
class TlsSniPerformTest(util.ApacheTest):
|
||||
"""Test the ApacheTlsSni01 challenge."""
|
||||
|
||||
auth_key = common_test.AUTH_KEY
|
||||
achalls = common_test.ACHALLS
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(TlsSniPerformTest, self).setUp()
|
||||
|
||||
config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir,
|
||||
self.work_dir)
|
||||
config.config.tls_sni_01_port = 443
|
||||
|
||||
from certbot_apache import tls_sni_01
|
||||
self.sni = tls_sni_01.ApacheTlsSni01(config)
|
||||
|
||||
def tearDown(self):
|
||||
shutil.rmtree(self.temp_dir)
|
||||
shutil.rmtree(self.config_dir)
|
||||
shutil.rmtree(self.work_dir)
|
||||
|
||||
def test_perform0(self):
|
||||
resp = self.sni.perform()
|
||||
self.assertEqual(len(resp), 0)
|
||||
|
||||
@mock.patch("certbot.util.exe_exists")
|
||||
@mock.patch("certbot.util.run_script")
|
||||
def test_perform1(self, _, mock_exists):
|
||||
self.sni.configurator.parser.modules.add("socache_shmcb_module")
|
||||
self.sni.configurator.parser.modules.add("ssl_module")
|
||||
|
||||
mock_exists.return_value = True
|
||||
self.sni.configurator.parser.update_runtime_variables = mock.Mock()
|
||||
|
||||
achall = self.achalls[0]
|
||||
self.sni.add_chall(achall)
|
||||
response = self.achalls[0].response(self.auth_key)
|
||||
mock_setup_cert = mock.MagicMock(return_value=response)
|
||||
# pylint: disable=protected-access
|
||||
self.sni._setup_challenge_cert = mock_setup_cert
|
||||
|
||||
responses = self.sni.perform()
|
||||
mock_setup_cert.assert_called_once_with(achall)
|
||||
|
||||
# Check to make sure challenge config path is included in apache config
|
||||
self.assertEqual(
|
||||
len(self.sni.configurator.parser.find_dir(
|
||||
"Include", self.sni.challenge_conf)), 1)
|
||||
self.assertEqual(len(responses), 1)
|
||||
self.assertEqual(responses[0], response)
|
||||
|
||||
def test_perform2(self):
|
||||
# Avoid load module
|
||||
self.sni.configurator.parser.modules.add("ssl_module")
|
||||
self.sni.configurator.parser.modules.add("socache_shmcb_module")
|
||||
acme_responses = []
|
||||
for achall in self.achalls:
|
||||
self.sni.add_chall(achall)
|
||||
acme_responses.append(achall.response(self.auth_key))
|
||||
|
||||
mock_setup_cert = mock.MagicMock(side_effect=acme_responses)
|
||||
# pylint: disable=protected-access
|
||||
self.sni._setup_challenge_cert = mock_setup_cert
|
||||
|
||||
with mock.patch(
|
||||
"certbot_apache.override_debian.DebianConfigurator.enable_mod"):
|
||||
sni_responses = self.sni.perform()
|
||||
|
||||
self.assertEqual(mock_setup_cert.call_count, 2)
|
||||
|
||||
# Make sure calls made to mocked function were correct
|
||||
self.assertEqual(
|
||||
mock_setup_cert.call_args_list[0], mock.call(self.achalls[0]))
|
||||
self.assertEqual(
|
||||
mock_setup_cert.call_args_list[1], mock.call(self.achalls[1]))
|
||||
|
||||
self.assertEqual(
|
||||
len(self.sni.configurator.parser.find_dir(
|
||||
"Include", self.sni.challenge_conf)),
|
||||
1)
|
||||
self.assertEqual(len(sni_responses), 2)
|
||||
for i in xrange(2):
|
||||
self.assertEqual(sni_responses[i], acme_responses[i])
|
||||
|
||||
def test_mod_config(self):
|
||||
z_domains = []
|
||||
for achall in self.achalls:
|
||||
self.sni.add_chall(achall)
|
||||
z_domain = achall.response(self.auth_key).z_domain
|
||||
z_domains.append(set([z_domain.decode('ascii')]))
|
||||
|
||||
self.sni._mod_config() # pylint: disable=protected-access
|
||||
self.sni.configurator.save()
|
||||
|
||||
self.sni.configurator.parser.find_dir(
|
||||
"Include", self.sni.challenge_conf)
|
||||
vh_match = self.sni.configurator.aug.match(
|
||||
"/files" + self.sni.challenge_conf + "//VirtualHost")
|
||||
|
||||
vhs = []
|
||||
for match in vh_match:
|
||||
# pylint: disable=protected-access
|
||||
vhs.append(self.sni.configurator._create_vhost(match))
|
||||
self.assertEqual(len(vhs), 2)
|
||||
for vhost in vhs:
|
||||
self.assertEqual(vhost.addrs, set([obj.Addr.fromstring("*:443")]))
|
||||
names = vhost.get_names()
|
||||
self.assertTrue(names in z_domains)
|
||||
|
||||
def test_get_addrs_default(self):
|
||||
self.sni.configurator.choose_vhost = mock.Mock(
|
||||
return_value=obj.VirtualHost(
|
||||
"path", "aug_path",
|
||||
set([obj.Addr.fromstring("_default_:443")]),
|
||||
False, False)
|
||||
)
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(
|
||||
set([obj.Addr.fromstring("*:443")]),
|
||||
self.sni._get_addrs(self.achalls[0]))
|
||||
|
||||
def test_get_addrs_no_vhost_found(self):
|
||||
self.sni.configurator.choose_vhost = mock.Mock(
|
||||
side_effect=errors.MissingCommandlineFlag(
|
||||
"Failed to run Apache plugin non-interactively"))
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(
|
||||
set([obj.Addr.fromstring("*:443")]),
|
||||
self.sni._get_addrs(self.achalls[0]))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -1,4 +1,5 @@
|
||||
"""Common utilities for certbot_apache."""
|
||||
import os
|
||||
import shutil
|
||||
import sys
|
||||
import unittest
|
||||
@@ -8,9 +9,10 @@ import josepy as jose
|
||||
import mock
|
||||
import zope.component
|
||||
|
||||
from certbot.compat import os
|
||||
from certbot.display import util as display_util
|
||||
|
||||
from certbot.plugins import common
|
||||
|
||||
from certbot.tests import util as test_util
|
||||
|
||||
from certbot_apache import configurator
|
||||
@@ -18,7 +20,7 @@ from certbot_apache import entrypoint
|
||||
from certbot_apache import obj
|
||||
|
||||
|
||||
class ApacheTest(unittest.TestCase):
|
||||
class ApacheTest(unittest.TestCase): # pylint: disable=too-few-public-methods
|
||||
|
||||
def setUp(self, test_dir="debian_apache_2_4/multiple_vhosts",
|
||||
config_root="debian_apache_2_4/multiple_vhosts/apache2",
|
||||
@@ -78,12 +80,14 @@ class ParserTest(ApacheTest):
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
self.parser = ApacheParser(
|
||||
self.config_path, self.vhost_path, configurator=self.config)
|
||||
self.aug, self.config_path, self.vhost_path,
|
||||
configurator=self.config)
|
||||
|
||||
|
||||
def get_apache_configurator(
|
||||
def get_apache_configurator( # pylint: disable=too-many-arguments, too-many-locals
|
||||
config_path, vhost_path,
|
||||
config_dir, work_dir, version=(2, 4, 7),
|
||||
conf=None,
|
||||
os_info="generic",
|
||||
conf_vhost_path=None):
|
||||
"""Create an Apache Configurator with the specified options.
|
||||
@@ -94,10 +98,9 @@ def get_apache_configurator(
|
||||
backups = os.path.join(work_dir, "backups")
|
||||
mock_le_config = mock.MagicMock(
|
||||
apache_server_root=config_path,
|
||||
apache_vhost_root=None,
|
||||
apache_vhost_root=conf_vhost_path,
|
||||
apache_le_vhost_ext="-le-ssl.conf",
|
||||
apache_challenge_location=config_path,
|
||||
apache_enmod=None,
|
||||
backup_dir=backups,
|
||||
config_dir=config_dir,
|
||||
http01_port=80,
|
||||
@@ -105,25 +108,37 @@ def get_apache_configurator(
|
||||
in_progress_dir=os.path.join(backups, "IN_PROGRESS"),
|
||||
work_dir=work_dir)
|
||||
|
||||
with mock.patch("certbot_apache.configurator.util.run_script"):
|
||||
with mock.patch("certbot_apache.configurator.util."
|
||||
"exe_exists") as mock_exe_exists:
|
||||
mock_exe_exists.return_value = True
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
try:
|
||||
config_class = entrypoint.OVERRIDE_CLASSES[os_info]
|
||||
except KeyError:
|
||||
config_class = configurator.ApacheConfigurator
|
||||
config = config_class(config=mock_le_config, name="apache",
|
||||
version=version)
|
||||
if not conf_vhost_path:
|
||||
config_class.OS_DEFAULTS["vhost_root"] = vhost_path
|
||||
else:
|
||||
# Custom virtualhost path was requested
|
||||
config.config.apache_vhost_root = conf_vhost_path
|
||||
config.config.apache_ctl = config_class.OS_DEFAULTS["ctl"]
|
||||
config.prepare()
|
||||
orig_os_constant = configurator.ApacheConfigurator(mock_le_config,
|
||||
name="apache",
|
||||
version=version).constant
|
||||
|
||||
def mock_os_constant(key, vhost_path=vhost_path):
|
||||
"""Mock default vhost path"""
|
||||
if key == "vhost_root":
|
||||
return vhost_path
|
||||
else:
|
||||
return orig_os_constant(key)
|
||||
|
||||
with mock.patch("certbot_apache.configurator.ApacheConfigurator.constant") as mock_cons:
|
||||
mock_cons.side_effect = mock_os_constant
|
||||
with mock.patch("certbot_apache.configurator.util.run_script"):
|
||||
with mock.patch("certbot_apache.configurator.util."
|
||||
"exe_exists") as mock_exe_exists:
|
||||
mock_exe_exists.return_value = True
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
try:
|
||||
config_class = entrypoint.OVERRIDE_CLASSES[os_info]
|
||||
except KeyError:
|
||||
config_class = configurator.ApacheConfigurator
|
||||
config = config_class(config=mock_le_config, name="apache",
|
||||
version=version)
|
||||
# This allows testing scripts to set it a bit more
|
||||
# quickly
|
||||
if conf is not None:
|
||||
config.conf = conf # pragma: no cover
|
||||
|
||||
config.prepare()
|
||||
return config
|
||||
|
||||
|
||||
@@ -193,17 +208,7 @@ def get_vh_truth(temp_dir, config_name):
|
||||
"/files" + os.path.join(temp_dir, config_name,
|
||||
"apache2/apache2.conf/VirtualHost"),
|
||||
set([obj.Addr.fromstring("*:80")]), False, True,
|
||||
"vhost.in.rootconf"),
|
||||
obj.VirtualHost(
|
||||
os.path.join(prefix, "duplicatehttp.conf"),
|
||||
os.path.join(aug_pre, "duplicatehttp.conf/VirtualHost"),
|
||||
set([obj.Addr.fromstring("10.2.3.4:80")]), False, True,
|
||||
"duplicate.example.com"),
|
||||
obj.VirtualHost(
|
||||
os.path.join(prefix, "duplicatehttps.conf"),
|
||||
os.path.join(aug_pre, "duplicatehttps.conf/IfModule/VirtualHost"),
|
||||
set([obj.Addr.fromstring("10.2.3.4:443")]), True, True,
|
||||
"duplicate.example.com")]
|
||||
"vhost.in.rootconf")]
|
||||
return vh_truth
|
||||
if config_name == "debian_apache_2_4/multi_vhosts":
|
||||
prefix = os.path.join(
|
||||
|
||||
certbot-apache/certbot_apache/tls_sni_01.py (new file, 172 lines)
@@ -0,0 +1,172 @@
|
||||
"""A class that performs TLS-SNI-01 challenges for Apache"""
|
||||
|
||||
import os
|
||||
import logging
|
||||
|
||||
from certbot.plugins import common
|
||||
from certbot.errors import PluginError, MissingCommandlineFlag
|
||||
|
||||
from certbot_apache import obj
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ApacheTlsSni01(common.TLSSNI01):
|
||||
"""Class that performs TLS-SNI-01 challenges within the Apache configurator
|
||||
|
||||
:ivar configurator: ApacheConfigurator object
|
||||
:type configurator: :class:`~apache.configurator.ApacheConfigurator`
|
||||
|
||||
:ivar list achalls: Annotated TLS-SNI-01
|
||||
(`.KeyAuthorizationAnnotatedChallenge`) challenges.
|
||||
|
||||
:param list indices: Meant to hold indices of challenges in a
|
||||
larger array. ApacheTlsSni01 is capable of solving many challenges
|
||||
at once which causes an indexing issue within ApacheConfigurator
|
||||
who must return all responses in order. Imagine ApacheConfigurator
|
||||
maintaining state about where all of the http-01 Challenges,
|
||||
TLS-SNI-01 Challenges belong in the response array. This is an
|
||||
optional utility.
|
||||
|
||||
:param str challenge_conf: location of the challenge config file
|
||||
|
||||
"""
|
||||
|
||||
VHOST_TEMPLATE = """\
|
||||
<VirtualHost {vhost}>
|
||||
ServerName {server_name}
|
||||
UseCanonicalName on
|
||||
SSLStrictSNIVHostCheck on
|
||||
|
||||
LimitRequestBody 1048576
|
||||
|
||||
Include {ssl_options_conf_path}
|
||||
SSLCertificateFile {cert_path}
|
||||
SSLCertificateKeyFile {key_path}
|
||||
|
||||
DocumentRoot {document_root}
|
||||
</VirtualHost>
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(ApacheTlsSni01, self).__init__(*args, **kwargs)
|
||||
|
||||
self.challenge_conf = os.path.join(
|
||||
self.configurator.conf("challenge-location"),
|
||||
"le_tls_sni_01_cert_challenge.conf")
|
||||
|
||||
def perform(self):
|
||||
"""Perform a TLS-SNI-01 challenge."""
|
||||
if not self.achalls:
|
||||
return []
|
||||
# Save any changes to the configuration as a precaution
|
||||
# About to make temporary changes to the config
|
||||
self.configurator.save("Changes before challenge setup", True)
|
||||
|
||||
# Prepare the server for HTTPS
|
||||
self.configurator.prepare_server_https(
|
||||
str(self.configurator.config.tls_sni_01_port), True)
|
||||
|
||||
responses = []
|
||||
|
||||
# Create all of the challenge certs
|
||||
for achall in self.achalls:
|
||||
responses.append(self._setup_challenge_cert(achall))
|
||||
|
||||
# Setup the configuration
|
||||
addrs = self._mod_config()
|
||||
self.configurator.save("Don't lose mod_config changes", True)
|
||||
self.configurator.make_addrs_sni_ready(addrs)
|
||||
|
||||
# Save reversible changes
|
||||
self.configurator.save("SNI Challenge", True)
|
||||
|
||||
return responses
|
||||
|
||||
def _mod_config(self):
|
||||
"""Modifies Apache config files to include challenge vhosts.
|
||||
|
||||
Result: Apache config includes virtual servers for issued challs
|
||||
|
||||
:returns: All TLS-SNI-01 addresses used
|
||||
:rtype: set
|
||||
|
||||
"""
|
||||
addrs = set()
|
||||
config_text = "<IfModule mod_ssl.c>\n"
|
||||
|
||||
for achall in self.achalls:
|
||||
achall_addrs = self._get_addrs(achall)
|
||||
addrs.update(achall_addrs)
|
||||
|
||||
config_text += self._get_config_text(achall, achall_addrs)
|
||||
|
||||
config_text += "</IfModule>\n"
|
||||
|
||||
self.configurator.parser.add_include(
|
||||
self.configurator.parser.loc["default"], self.challenge_conf)
|
||||
self.configurator.reverter.register_file_creation(
|
||||
True, self.challenge_conf)
|
||||
|
||||
logger.debug("writing a config file with text:\n %s", config_text)
|
||||
with open(self.challenge_conf, "w") as new_conf:
|
||||
new_conf.write(config_text)
|
||||
|
||||
return addrs
|
||||
|
||||
def _get_addrs(self, achall):
|
||||
"""Return the Apache addresses needed for TLS-SNI-01."""
|
||||
# TODO: Checkout _default_ rules.
|
||||
addrs = set()
|
||||
default_addr = obj.Addr(("*", str(
|
||||
self.configurator.config.tls_sni_01_port)))
|
||||
|
||||
try:
|
||||
vhost = self.configurator.choose_vhost(achall.domain, temp=True)
|
||||
except (PluginError, MissingCommandlineFlag):
|
||||
# We couldn't find the virtualhost for this domain, possibly
|
||||
# because it's a new vhost that's not configured yet
|
||||
# (GH #677). See also GH #2600.
|
||||
logger.warning("Falling back to default vhost %s...", default_addr)
|
||||
addrs.add(default_addr)
|
||||
return addrs
|
||||
|
||||
for addr in vhost.addrs:
|
||||
if "_default_" == addr.get_addr():
|
||||
addrs.add(default_addr)
|
||||
else:
|
||||
addrs.add(
|
||||
addr.get_sni_addr(
|
||||
self.configurator.config.tls_sni_01_port))
|
||||
|
||||
return addrs
|
||||
|
||||
def _get_config_text(self, achall, ip_addrs):
|
||||
"""Chocolate virtual server configuration text
|
||||
|
||||
:param .KeyAuthorizationAnnotatedChallenge achall: Annotated
|
||||
TLS-SNI-01 challenge.
|
||||
|
||||
:param list ip_addrs: addresses of challenged domain
|
||||
:class:`list` of type `~.obj.Addr`
|
||||
|
||||
:returns: virtual host configuration text
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
ips = " ".join(str(i) for i in ip_addrs)
|
||||
document_root = os.path.join(
|
||||
self.configurator.config.work_dir, "tls_sni_01_page/")
|
||||
# TODO: Python docs are not clear on how multiline string literal
|
||||
# newlines are parsed on different platforms. At least on
|
||||
# Linux (Debian sid), when source file uses CRLF, Python still
|
||||
# parses it as "\n"... c.f.:
|
||||
# https://docs.python.org/2.7/reference/lexical_analysis.html
|
||||
return self.VHOST_TEMPLATE.format(
|
||||
vhost=ips,
|
||||
server_name=achall.response(achall.account_key).z_domain.decode('ascii'),
|
||||
ssl_options_conf_path=self.configurator.mod_ssl_conf,
|
||||
cert_path=self.get_cert_path(achall),
|
||||
key_path=self.get_key_path(achall),
|
||||
document_root=document_root).replace("\n", os.linesep)
|
||||
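The new tls_sni_01.py module above is easier to review with a usage sketch in mind. The following is a rough illustration of how the Apache configurator might drive ApacheTlsSni01 at challenge time, based only on the methods visible in this diff (add_chall and perform come from certbot.plugins.common.TLSSNI01); the wrapper function and the restart call are assumptions, not part of this changeset:

```
# Illustrative only: driving the ApacheTlsSni01 class added above.
# `configurator` and `achalls` are assumed to be supplied by the caller.
from certbot_apache import tls_sni_01

def perform_tls_sni_01(configurator, achalls):
    sni = tls_sni_01.ApacheTlsSni01(configurator)
    for achall in achalls:
        sni.add_chall(achall)      # queue each annotated TLS-SNI-01 challenge
    responses = sni.perform()      # writes le_tls_sni_01_cert_challenge.conf
    configurator.restart()         # assumed: reload Apache so the vhosts serve
    return responses
```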
certbot-apache/docs/api/tls_sni_01.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
:mod:`certbot_apache.tls_sni_01`
------------------------------------

.. automodule:: certbot_apache.tls_sni_01
    :members:
@@ -13,7 +13,7 @@
# serve to show the default.

import sys
from certbot.compat import os
import os
import shlex

import mock
@@ -1,3 +1,2 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.29.0
certbot[dev]==0.39.0
acme[dev]==0.21.1
certbot[dev]==0.21.1
Some files were not shown because too many files have changed in this diff.