Compare commits
3 Commits
test-power ... test-old-l

| Author | SHA1 | Date |
|---|---|---|
| | ac7267ba0c | |
| | 9c5a13bb52 | |
| | 610289ade4 | |
@@ -1,119 +0,0 @@
# Configuring Azure Pipelines with Certbot

Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:

* `.azure-pipelines/main.yml` is the main one, executed on PRs targeting master and on pushes to master,
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and the nightly runs on master (see the trigger excerpt below).
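
For reference, the branch filters are what distinguish these two pipelines. The following excerpt is taken from the pipeline YAML files that appear further down in this diff (file names per the descriptions above):

```
# .azure-pipelines/main.yml
trigger:
- apache-parser-v2
- master
pr:
- apache-parser-v2
- master
- '*.x'

# .azure-pipelines/advanced.yml
trigger:
- test-*
- '*.x'
pr:
- test-*
schedules:
- cron: "0 4 * * *"  # nightly run on master
```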

Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common job configuration that can be reused in several pipelines.
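
As an illustration of how these templates are consumed, the release pipeline removed later in this diff simply lists them under `jobs:`:

```
jobs:
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml
- template: templates/changelog.yml
```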

Unlike Travis, where Codecov works without any action required, Codecov supports Azure Pipelines
only through the codecov-bash utility (not python-coverage for now), and only if you provide the
Codecov repository token via the `CODECOV_TOKEN` environment variable. So `CODECOV_TOKEN` needs to be set as a secured
environment variable to allow the main pipeline to publish coverage reports to Codecov.
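
Concretely, the coverage upload step of the test suite template (shown later in this diff) maps the secret `codecov_token` variable to `CODECOV_TOKEN` and calls the Codecov Bash uploader:

```
- bash: |
    curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
    chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
    ./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
  condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
  env:
    CODECOV_TOKEN: $(codecov_token)
  displayName: Publish coverage
```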

This INSTALL.md file explains how to configure Azure Pipelines with Certbot in order to execute the CI/CD logic defined in the `.azure-pipelines` folder.
During these installation steps, warnings describing user access and legal commitments will be displayed like this:
```
!!! ACCESS REQUIRED !!!
```

This document assumes that the Azure DevOps organization is named _certbot_, and that the Azure DevOps project is also named _certbot_.

## Useful links

* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops

## Prerequisites

### Having a GitHub account

Use your regular GitHub user, or a user that has administrative rights on the GitHub organization if relevant.

### Having an Azure DevOps account

- Go to https://dev.azure.com/ and click "Start free with GitHub"
- Log in to GitHub

```
!!! ACCESS REQUIRED !!!
Personal user data (email + profile info, read-only)
```

- Microsoft will create a Live account using the email referenced for the GitHub account. This account is also linked to the GitHub account (meaning you can log in to it using GitHub authentication)
- Proceed with account registration (birth date, country), and add your name and contact email

```
!!! ACCESS REQUIRED !!!
Microsoft offers to send commercial emails to this address
Azure DevOps terms of service need to be accepted
```

_Once logged in to Azure DevOps, the account is ready._

### Installing Azure Pipelines to GitHub

- On GitHub, go to the Marketplace
- Select Azure Pipelines, and click "Set up a plan"
- Select Free, then "Install it for free"
- Click "Complete order and begin installation"

```
!!! ACCESS !!!
Azure Pipelines needs RW access on code, RO on metadata, and RW on checks, commit statuses, deployments, issues, and pull requests.
RW access here is required to allow updating the pipeline YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines are triggered.
Note however that no admin access is granted: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the related security settings on GitHub.
Access can be granted for all repositories or only selected ones, which is nice.
```

- Once redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same as the targeted GitHub repo)
- Set the visibility to public, to benefit from 10 parallel jobs

```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in terms of being able to check that it is valid), and to the resources shared between the GitHub account and Azure Pipelines.
```

_Done. We can move on to pipeline configuration._

## Import an existing pipeline from the `.azure-pipelines` folder

- On Azure DevOps, go to your organization (e.g. _certbot_) then your project (e.g. _certbot_)
- Click the "Pipelines" tab
- Click "New pipeline"
- In "Where is your code?", select "__Use the classic editor__"

__Warning: Do not choose the GitHub option in the "Where is your code?" section. That option triggers an OAuth
permission grant from Azure Pipelines to GitHub in order to set up a GitHub OAuth application. The permissions requested
that way are far too broad (admin level on almost everything), while the classic approach does not add any extra
permissions and works perfectly well.__

- Select GitHub in the "Select your repository" section, choose certbot/certbot as the repository and master as the default branch.
- Click the YAML option for "Select a template"
- Choose a name for the pipeline (e.g. test-pipeline), and browse to the actual pipeline YAML definition in the
  "YAML file path" input (e.g. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click the "Save and run" button.

_Done. The pipeline is operational. Repeat these steps to add more pipelines from the existing YAML files in `.azure-pipelines`._

## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)

__NB: The following steps assume that you have already set up the YAML pipeline file to consume, as an
environment variable, the secret variable that these steps will create. For an environment variable named
`CODECOV_TOKEN` consuming the pipeline variable `codecov_token`, this setup takes the following form in the YAML file:__

```
steps:
- script: ./do_something_that_consumes_CODECOV_TOKEN # e.g. `codecov -F windows`
  env:
    CODECOV_TOKEN: $(codecov_token)
```

To set up a variable that is shared between pipelines, follow the instructions at
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
When adding variables to a group, don't forget to tick "Keep this value secret"
if it shouldn't be shared publicly.
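
A minimal sketch of how a pipeline consumes such a variable group, based on the `certbot-common` group referenced in the test suite template later in this diff (the script name is only illustrative, taken from the example above); note that secret variables are not exposed to scripts automatically and must be mapped explicitly through `env:`:

```
variables:
- group: certbot-common  # variable group holding the secret codecov_token

steps:
- script: ./do_something_that_consumes_CODECOV_TOKEN
  env:
    CODECOV_TOKEN: $(codecov_token)  # secrets must be mapped explicitly
```
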
@@ -1,19 +0,0 @@
# Advanced pipeline for isolated checks and release purpose
trigger:
- test-*
- '*.x'
pr:
- test-*
# This pipeline is also nightly run on master
schedules:
- cron: "0 4 * * *"
  displayName: Nightly build
  branches:
    include:
    - master
  always: true

jobs:
# Any addition here should be reflected in the release pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/installer-tests.yml
@@ -1,12 +0,0 @@
trigger:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
pr:
- apache-parser-v2
- master
- '*.x'

jobs:
- template: templates/tests-suite.yml
@@ -1,13 +0,0 @@
# Release pipeline to build and deploy Certbot for Windows for GitHub release tags
trigger:
  tags:
    include:
    - v*
pr: none

jobs:
# Any addition here should be reflected in the advanced pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml
- template: templates/changelog.yml
@@ -1,14 +0,0 @@
jobs:
- job: changelog
  pool:
    vmImage: vs2017-win2016
  steps:
  - bash: |
      CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
      "${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
    displayName: Prepare changelog
  - task: PublishPipelineArtifact@1
    inputs:
      path: $(Build.ArtifactStagingDirectory)
      artifact: changelog
    displayName: Publish changelog
@@ -1,15 +0,0 @@
jobs:
- job: installer_run
  strategy:
    matrix:
      win2019:
        imageName: windows-2019
      win2016:
        imageName: vs2017-win2016
      win2012r2:
        imageName: vs2015-win2012r2
  pool:
    vmImage: $(imageName)
  steps:
  - script: wusa /uninstall /kb:3134758 /quiet /norestart & exit 0
  - script: powershell -Command "$PSVersionTable.PSVersion"
@@ -1,38 +0,0 @@
jobs:
- job: test
  pool:
    vmImage: vs2017-win2016
  strategy:
    matrix:
      py35:
        PYTHON_VERSION: 3.5
        TOXENV: py35
      py37-cover:
        PYTHON_VERSION: 3.7
        TOXENV: py37-cover
      integration-certbot:
        PYTHON_VERSION: 3.7
        TOXENV: integration-certbot
        PYTEST_ADDOPTS: --numprocesses 4
  variables:
  - group: certbot-common
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: $(PYTHON_VERSION)
      addToPath: true
  - script: python tools/pip_install.py -U tox coverage
    displayName: Install dependencies
  - script: python -m tox
    displayName: Run tox
  # We do not require codecov report upload to succeed. So to avoid to break the pipeline if
  # something goes wrong, each command is suffixed with a command that hides any non zero exit
  # codes and echoes an informative message instead.
  - bash: |
      curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
      chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
      ./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
    condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
    env:
      CODECOV_TOKEN: $(codecov_token)
    displayName: Publish coverage
18  .codecov.yml
@@ -1,18 +0,0 @@
coverage:
  status:
    project:
      default: off
      linux:
        flags: linux
        # Fixed target instead of auto set by #7173, can
        # be removed when flags in Codecov are added back.
        target: 97.4
        threshold: 0.1
        base: auto
      windows:
        flags: windows
        # Fixed target instead of auto set by #7173, can
        # be removed when flags in Codecov are added back.
        target: 97.4
        threshold: 0.1
        base: auto
@@ -1,5 +1,2 @@
[run]
omit = */setup.py

[report]
omit = */setup.py
35  .github/stale.yml  vendored
@@ -1,35 +0,0 @@
# Configuration for https://github.com/marketplace/stale

# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365

# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30

# Ignore issues with an assignee (defaults to false)
exemptAssignees: true

# Label to use when marking as stale
staleLabel: needs-update

# Comment to post when marking as stale. Set to `false` to disable
markComment: >
  We've made a lot of changes to Certbot since this issue was opened. If you
  still have this issue with an up-to-date version of Certbot, can you please
  add a comment letting us know? This helps us to better see what issues are
  still affecting our users. If there is no activity in the next 30 days, this
  issue will be automatically closed.

# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
  This issue has been closed due to lack of activity, but if you think it
  should be reopened, please open a new issue with a link to this one and we'll
  take a look.

# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1

# Don't mark pull requests as stale.
only: issues
5  .gitignore  vendored
@@ -44,8 +44,3 @@ tests/letstest/venv/

# docker files
.docker

# certbot tests
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*

@@ -1,7 +0,0 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400
53  .pylintrc
@@ -24,11 +24,6 @@ persistent=yes
|
||||
# usually to register additional checkers.
|
||||
load-plugins=linter_plugin
|
||||
|
||||
# A comma-separated list of package or module names from where C extensions may
|
||||
# be loaded. Extensions are loading into the active Python interpreter and may
|
||||
# run arbitrary code.
|
||||
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
|
||||
|
||||
|
||||
[MESSAGES CONTROL]
|
||||
|
||||
@@ -46,14 +41,10 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
|
||||
# --enable=similarities". If you want to run only the classes checker, but have
|
||||
# no Warning level messages displayed, use"--disable=all --enable=classes
|
||||
# --disable=W"
|
||||
# CERTBOT COMMENT
|
||||
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
|
||||
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
|
||||
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
|
||||
# See https://github.com/PyCQA/pylint/issues/1498.
|
||||
# 3) Same as point 2 for no-value-for-parameter.
|
||||
# See https://github.com/PyCQA/pylint/issues/2820.
|
||||
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue
|
||||
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
|
||||
# abstract-class-not-used cannot be disabled locally (at least in
|
||||
# pylint 1.4.1), same for abstract-class-little-used
|
||||
|
||||
|
||||
[REPORTS]
|
||||
|
||||
@@ -260,7 +251,7 @@ ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib
|
||||
|
||||
# List of classes names for which member attributes should not be checked
|
||||
# (useful for classes with attributes dynamically set).
|
||||
ignored-classes=Field,Header,JWS,closing
|
||||
ignored-classes=SQLObject
|
||||
|
||||
# When zope mode is activated, add a predefined set of Zope acquired attributes
|
||||
# to generated-members.
|
||||
@@ -306,6 +297,40 @@ valid-classmethod-first-arg=cls
|
||||
valid-metaclass-classmethod-first-arg=mcs
|
||||
|
||||
|
||||
[DESIGN]
|
||||
|
||||
# Maximum number of arguments for function / method
|
||||
max-args=6
|
||||
|
||||
# Argument names that match this expression will be ignored. Default to name
|
||||
# with leading underscore
|
||||
ignored-argument-names=_.*
|
||||
|
||||
# Maximum number of locals for function / method body
|
||||
max-locals=15
|
||||
|
||||
# Maximum number of return / yield for function / method body
|
||||
max-returns=6
|
||||
|
||||
# Maximum number of branch for function / method body
|
||||
max-branches=12
|
||||
|
||||
# Maximum number of statements in function / method body
|
||||
max-statements=50
|
||||
|
||||
# Maximum number of parents for a class (see R0901).
|
||||
max-parents=12
|
||||
|
||||
# Maximum number of attributes for a class (see R0902).
|
||||
max-attributes=7
|
||||
|
||||
# Minimum number of public methods for a class (see R0903).
|
||||
min-public-methods=2
|
||||
|
||||
# Maximum number of public methods for a class (see R0904).
|
||||
max-public-methods=20
|
||||
|
||||
|
||||
[EXCEPTIONS]
|
||||
|
||||
# Exceptions that will emit a warning when being caught. Defaults to
|
||||
|
||||
40  .travis.yml
@@ -1,16 +1,11 @@
|
||||
language: python
|
||||
dist: xenial
|
||||
|
||||
cache:
|
||||
directories:
|
||||
- $HOME/.cache/pip
|
||||
|
||||
before_script:
|
||||
- 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
|
||||
# On Travis, the fastest parallelization for integration tests has proved to be 4.
|
||||
- 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
|
||||
# Use Travis retry feature for farm tests since they are flaky
|
||||
- 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
|
||||
- 'if [ $TRAVIS_OS_NAME = osx ] ; then ulimit -n 1024 ; fi'
|
||||
- export TOX_TESTENV_PASSENV=TRAVIS
|
||||
|
||||
# Only build pushes to the master branch, PRs, and branches beginning with
|
||||
@@ -19,27 +14,14 @@ before_script:
|
||||
# is a cap of on the number of simultaneous runs.
|
||||
branches:
|
||||
only:
|
||||
# apache-parser-v2 is a temporary branch for doing work related to
|
||||
# rewriting the parser in the Apache plugin.
|
||||
- apache-parser-v2
|
||||
- master
|
||||
- /^\d+\.\d+\.x$/
|
||||
- /^test-.*$/
|
||||
|
||||
# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
|
||||
not-on-master: &not-on-master
|
||||
if: NOT (type = push AND branch = master)
|
||||
|
||||
# Jobs for the extended test suite are executed for cron jobs and pushes to
|
||||
# non-development branches. See the explanation for apache-parser-v2 above.
|
||||
extended-test-suite: &extended-test-suite
|
||||
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
|
||||
|
||||
matrix:
|
||||
include:
|
||||
# This job is always executed, including on master
|
||||
- python: "2.7"
|
||||
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
|
||||
env: TOXENV=lint
|
||||
|
||||
# container-based infrastructure
|
||||
sudo: false
|
||||
@@ -48,6 +30,7 @@ addons:
|
||||
apt:
|
||||
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
|
||||
- python-dev
|
||||
- python-virtualenv
|
||||
- gcc
|
||||
- libaugeas0
|
||||
- libssl-dev
|
||||
@@ -57,19 +40,12 @@ addons:
|
||||
- nginx-light
|
||||
- openssl
|
||||
|
||||
# tools/pip_install.py is used to pin packages to a known working version
|
||||
# except in tests where the environment variable CERTBOT_NO_PIN is set.
|
||||
# virtualenv is listed here explicitly to make sure it is upgraded when
|
||||
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
|
||||
# version of virtualenv.
|
||||
install: 'tools/pip_install.py -U codecov tox virtualenv'
|
||||
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
|
||||
# script command. It is set only to `travis_retry` during farm tests, in
|
||||
# order to trigger the Travis retry feature, and compensate the inherent
|
||||
# flakiness of these specific tests.
|
||||
script: '$TRAVIS_RETRY tox'
|
||||
install: "travis_retry $(command -v pip || command -v pip3) install codecov tox"
|
||||
script:
|
||||
- travis_retry tox
|
||||
- '[ -z "${BOULDER_INTEGRATION+x}" ] || (travis_retry tests/boulder-fetch.sh && tests/tox-boulder-integration.sh)'
|
||||
|
||||
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
|
||||
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov'
|
||||
|
||||
notifications:
|
||||
email: false
|
||||
|
||||
@@ -5,7 +5,6 @@ Authors
|
||||
* Aaron Zuehlke
|
||||
* Ada Lovelace
|
||||
* [Adam Woodbeck](https://github.com/awoodbeck)
|
||||
* [Adrien Ferrand](https://github.com/adferrand)
|
||||
* [Aidin Gharibnavaz](https://github.com/aidin36)
|
||||
* [AJ ONeal](https://github.com/coolaj86)
|
||||
* [Alcaro](https://github.com/Alcaro)
|
||||
@@ -15,10 +14,8 @@ Authors
|
||||
* [Alex Gaynor](https://github.com/alex)
|
||||
* [Alex Halderman](https://github.com/jhalderm)
|
||||
* [Alex Jordan](https://github.com/strugee)
|
||||
* [Alex Zorin](https://github.com/alexzorin)
|
||||
* [Amjad Mashaal](https://github.com/TheNavigat)
|
||||
* [Andrew Murray](https://github.com/radarhere)
|
||||
* [Andrzej Górski](https://github.com/andrzej3393)
|
||||
* [Anselm Levskaya](https://github.com/levskaya)
|
||||
* [Antoine Jacoutot](https://github.com/ajacoutot)
|
||||
* [asaph](https://github.com/asaph)
|
||||
@@ -78,7 +75,6 @@ Authors
|
||||
* [Fabian](https://github.com/faerbit)
|
||||
* [Faidon Liambotis](https://github.com/paravoid)
|
||||
* [Fan Jiang](https://github.com/tcz001)
|
||||
* [Felix Lechner](https://github.com/lechner)
|
||||
* [Felix Schwarz](https://github.com/FelixSchwarz)
|
||||
* [Felix Yan](https://github.com/felixonmars)
|
||||
* [Filip Ochnik](https://github.com/filipochnik)
|
||||
@@ -128,7 +124,6 @@ Authors
|
||||
* [Joubin Jabbari](https://github.com/joubin)
|
||||
* [Juho Juopperi](https://github.com/jkjuopperi)
|
||||
* [Kane York](https://github.com/riking)
|
||||
* [Kenichi Maehashi](https://github.com/kmaehashi)
|
||||
* [Kenneth Skovhede](https://github.com/kenkendk)
|
||||
* [Kevin Burke](https://github.com/kevinburke)
|
||||
* [Kevin London](https://github.com/kevinlondon)
|
||||
@@ -164,10 +159,8 @@ Authors
|
||||
* [Michael Schumacher](https://github.com/schumaml)
|
||||
* [Michael Strache](https://github.com/Jarodiv)
|
||||
* [Michael Sverdlin](https://github.com/sveder)
|
||||
* [Michael Watters](https://github.com/blackknight36)
|
||||
* [Michal Moravec](https://github.com/Majkl578)
|
||||
* [Michal Papis](https://github.com/mpapis)
|
||||
* [Mickaël Schoentgen](https://github.com/BoboTiG)
|
||||
* [Minn Soe](https://github.com/MinnSoe)
|
||||
* [Min RK](https://github.com/minrk)
|
||||
* [Miquel Ruiz](https://github.com/miquelruiz)
|
||||
@@ -231,7 +224,6 @@ Authors
|
||||
* [Stavros Korokithakis](https://github.com/skorokithakis)
|
||||
* [Stefan Weil](https://github.com/stweil)
|
||||
* [Steve Desmond](https://github.com/stevedesmond-ca)
|
||||
* [sydneyli](https://github.com/sydneyli)
|
||||
* [Tan Jay Jun](https://github.com/jayjun)
|
||||
* [Tapple Gao](https://github.com/tapple)
|
||||
* [Telepenin Nikolay](https://github.com/telepenin)
|
||||
|
||||
@@ -1 +0,0 @@
|
||||
certbot/CHANGELOG.md
|
||||
1473  CHANGELOG.md  Normal file
File diff suppressed because it is too large
@@ -1 +0,0 @@
|
||||
This project is governed by [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode).
|
||||
@@ -33,5 +33,3 @@ started. In particular, we recommend you read these sections
|
||||
- [Finding issues to work on](https://certbot.eff.org/docs/contributing.html#find-issues-to-work-on)
|
||||
- [Coding style](https://certbot.eff.org/docs/contributing.html#coding-style)
|
||||
- [Submitting a pull request](https://certbot.eff.org/docs/contributing.html#submitting-a-pull-request)
|
||||
- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)
|
||||
|
||||
|
||||
29  Dockerfile  Normal file
@@ -0,0 +1,29 @@
|
||||
FROM python:2-alpine3.9
|
||||
|
||||
ENTRYPOINT [ "certbot" ]
|
||||
EXPOSE 80 443
|
||||
VOLUME /etc/letsencrypt /var/lib/letsencrypt
|
||||
WORKDIR /opt/certbot
|
||||
|
||||
COPY CHANGELOG.md README.rst setup.py src/
|
||||
COPY letsencrypt-auto-source/pieces/dependency-requirements.txt .
|
||||
COPY acme src/acme
|
||||
COPY certbot src/certbot
|
||||
|
||||
RUN apk add --no-cache --virtual .certbot-deps \
|
||||
libffi \
|
||||
libssl1.1 \
|
||||
openssl \
|
||||
ca-certificates \
|
||||
binutils
|
||||
RUN apk add --no-cache --virtual .build-deps \
|
||||
gcc \
|
||||
linux-headers \
|
||||
openssl-dev \
|
||||
musl-dev \
|
||||
libffi-dev \
|
||||
&& pip install -r /opt/certbot/dependency-requirements.txt \
|
||||
&& pip install --no-cache-dir \
|
||||
--editable /opt/certbot/src/acme \
|
||||
--editable /opt/certbot/src \
|
||||
&& apk del .build-deps
|
||||
@@ -1,20 +1,21 @@
|
||||
# This Dockerfile builds an image for development.
|
||||
FROM debian:buster
|
||||
FROM ubuntu:xenial
|
||||
|
||||
# Note: this only exposes the port to other docker containers.
|
||||
EXPOSE 80 443
|
||||
|
||||
WORKDIR /opt/certbot/src
|
||||
|
||||
# TODO: Install Apache/Nginx for plugin development.
|
||||
COPY . .
|
||||
RUN apt-get update && \
|
||||
apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
|
||||
libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
|
||||
apt-get install apache2 git nginx-light -y && \
|
||||
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
|
||||
apt-get clean && \
|
||||
rm -rf /var/lib/apt/lists/* \
|
||||
/tmp/* \
|
||||
/var/tmp/*
|
||||
|
||||
RUN VENV_NAME="../venv3" python3 tools/venv3.py
|
||||
RUN VENV_NAME="../venv" python tools/venv.py
|
||||
|
||||
ENV PATH /opt/certbot/venv3/bin:$PATH
|
||||
ENV PATH /opt/certbot/venv/bin:$PATH
|
||||
|
||||
75  Dockerfile-old  Normal file
@@ -0,0 +1,75 @@
|
||||
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
|
||||
# it is more likely developers will already have ubuntu:trusty rather
|
||||
# than e.g. debian:jessie and image size differences are negligible
|
||||
FROM ubuntu:trusty
|
||||
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
|
||||
MAINTAINER William Budington <bill@eff.org>
|
||||
|
||||
# Note: this only exposes the port to other docker containers. You
|
||||
# still have to bind to 443@host at runtime, as per the ACME spec.
|
||||
EXPOSE 443
|
||||
|
||||
# TODO: make sure --config-dir and --work-dir cannot be changed
|
||||
# through the CLI (certbot-docker wrapper that uses standalone
|
||||
# authenticator and text mode only?)
|
||||
VOLUME /etc/letsencrypt /var/lib/letsencrypt
|
||||
|
||||
WORKDIR /opt/certbot
|
||||
|
||||
# no need to mkdir anything:
|
||||
# https://docs.docker.com/reference/builder/#copy
|
||||
# If <dest> doesn't exist, it is created along with all missing
|
||||
# directories in its path.
|
||||
|
||||
ENV DEBIAN_FRONTEND=noninteractive
|
||||
|
||||
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
|
||||
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
|
||||
apt-get clean && \
|
||||
rm -rf /var/lib/apt/lists/* \
|
||||
/tmp/* \
|
||||
/var/tmp/*
|
||||
|
||||
# the above is not likely to change, so by putting it further up the
|
||||
# Dockerfile we make sure we cache as much as possible
|
||||
|
||||
|
||||
COPY setup.py README.rst CHANGELOG.md MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/
|
||||
|
||||
# all above files are necessary for setup.py and venv setup, however,
|
||||
# package source code directory has to be copied separately to a
|
||||
# subdirectory...
|
||||
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
|
||||
# directory, the entire contents of the directory are copied,
|
||||
# including filesystem metadata. Note: The directory itself is not
|
||||
# copied, just its contents." Order again matters, three files are far
|
||||
# more likely to be cached than the whole project directory
|
||||
|
||||
COPY certbot /opt/certbot/src/certbot/
|
||||
COPY acme /opt/certbot/src/acme/
|
||||
COPY certbot-apache /opt/certbot/src/certbot-apache/
|
||||
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
|
||||
|
||||
|
||||
RUN VIRTUALENV_NO_DOWNLOAD=1 virtualenv --no-site-packages -p python2 /opt/certbot/venv
|
||||
|
||||
# PATH is set now so pipstrap upgrades the correct (v)env
|
||||
ENV PATH /opt/certbot/venv/bin:$PATH
|
||||
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
|
||||
/opt/certbot/venv/bin/pip install \
|
||||
-e /opt/certbot/src/acme \
|
||||
-e /opt/certbot/src \
|
||||
-e /opt/certbot/src/certbot-apache \
|
||||
-e /opt/certbot/src/certbot-nginx
|
||||
|
||||
# install in editable mode (-e) to save space: it's not possible to
|
||||
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
|
||||
# this might also help in debugging: you can "docker run --entrypoint
|
||||
# bash" and investigate, apply patches, etc.
|
||||
|
||||
# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
|
||||
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
|
||||
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
|
||||
ENV PATH /opt/certbot/bin:$PATH
|
||||
|
||||
ENTRYPOINT [ "certbot" ]
|
||||
@@ -1,10 +1,9 @@
|
||||
include README.rst
|
||||
include CHANGELOG.md
|
||||
include CONTRIBUTING.md
|
||||
include LICENSE.txt
|
||||
include linter_plugin.py
|
||||
recursive-include docs *
|
||||
recursive-include examples *
|
||||
recursive-include certbot/tests/testdata *
|
||||
recursive-include tests *.py
|
||||
include certbot/ssl-dhparams.pem
|
||||
global-exclude __pycache__
|
||||
global-exclude *.py[cod]
|
||||
@@ -1 +0,0 @@
|
||||
certbot/README.rst
|
||||
157  README.rst  Normal file
@@ -0,0 +1,157 @@
|
||||
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
|
||||
|
||||
Certbot is part of EFF’s effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identity of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Let’s Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
|
||||
|
||||
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Let’s Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so there’s no need to arrange payment.
|
||||
|
||||
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you’ll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
|
||||
|
||||
Certbot is meant to be run directly on your web server, not on your personal computer. If you’re using a hosted service and don’t have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let’s Encrypt.
|
||||
|
||||
Certbot is a fully-featured, extensible client for the Let's
|
||||
Encrypt CA (or any other CA that speaks the `ACME
|
||||
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
|
||||
protocol) that can automate the tasks of obtaining certificates and
|
||||
configuring webservers to use them. This client runs on Unix-based operating
|
||||
systems.
|
||||
|
||||
To see the changes made to Certbot between versions please refer to our
|
||||
`changelog <https://github.com/certbot/certbot/blob/master/CHANGELOG.md>`_.
|
||||
|
||||
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
|
||||
depending on install method. Instructions on the Internet, and some pieces of the
|
||||
software, may still refer to this older name.
|
||||
|
||||
Contributing
|
||||
------------
|
||||
|
||||
If you'd like to contribute to this project please read `Developer Guide
|
||||
<https://certbot.eff.org/docs/contributing.html>`_.
|
||||
|
||||
.. _installation:
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
The easiest way to install Certbot is by visiting `certbot.eff.org`_, where you can
|
||||
find the correct installation instructions for many web server and OS combinations.
|
||||
For more information, see `Get Certbot <https://certbot.eff.org/docs/install.html>`_.
|
||||
|
||||
.. _certbot.eff.org: https://certbot.eff.org/
|
||||
|
||||
How to run the client
|
||||
---------------------
|
||||
|
||||
In many cases, you can just run ``certbot-auto`` or ``certbot``, and the
|
||||
client will guide you through the process of obtaining and installing certs
|
||||
interactively.
|
||||
|
||||
For full command line help, you can type::
|
||||
|
||||
./certbot-auto --help all
|
||||
|
||||
|
||||
You can also tell it exactly what you want it to do from the command line.
|
||||
For instance, if you want to obtain a cert for ``example.com``,
|
||||
``www.example.com``, and ``other.example.net``, using the Apache plugin to both
|
||||
obtain and install the certs, you could do this::
|
||||
|
||||
./certbot-auto --apache -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
(The first time you run the command, it will make an account, and ask for an
|
||||
email and agreement to the Let's Encrypt Subscriber Agreement; you can
|
||||
automate those with ``--email`` and ``--agree-tos``)
|
||||
|
||||
If you want to use a webserver that doesn't have full plugin support yet, you
|
||||
can still use "standalone" or "webroot" plugins to obtain a certificate::
|
||||
|
||||
./certbot-auto certonly --standalone --email admin@example.com -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
|
||||
Understanding the client in more depth
|
||||
--------------------------------------
|
||||
|
||||
To understand what the client is doing in detail, it's important to
|
||||
understand the way it uses plugins. Please see the `explanation of
|
||||
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
|
||||
the User Guide.
|
||||
|
||||
Links
|
||||
=====
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:links-begin
|
||||
|
||||
Documentation: https://certbot.eff.org/docs
|
||||
|
||||
Software project: https://github.com/certbot/certbot
|
||||
|
||||
Notes for developers: https://certbot.eff.org/docs/contributing.html
|
||||
|
||||
Main Website: https://certbot.eff.org
|
||||
|
||||
Let's Encrypt Website: https://letsencrypt.org
|
||||
|
||||
Community: https://community.letsencrypt.org
|
||||
|
||||
ACME spec: http://ietf-wg-acme.github.io/acme/
|
||||
|
||||
ACME working area in github: https://github.com/ietf-wg-acme/acme
|
||||
|
||||
|build-status| |coverage| |docs| |container|
|
||||
|
||||
.. |build-status| image:: https://travis-ci.com/certbot/certbot.svg?branch=master
|
||||
:target: https://travis-ci.com/certbot/certbot
|
||||
:alt: Travis CI status
|
||||
|
||||
.. |coverage| image:: https://codecov.io/gh/certbot/certbot/branch/master/graph/badge.svg
|
||||
:target: https://codecov.io/gh/certbot/certbot
|
||||
:alt: Coverage status
|
||||
|
||||
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
|
||||
:target: https://readthedocs.org/projects/letsencrypt/
|
||||
:alt: Documentation status
|
||||
|
||||
.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
|
||||
:target: https://quay.io/repository/letsencrypt/letsencrypt
|
||||
:alt: Docker Repository on Quay.io
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:links-end
|
||||
|
||||
System Requirements
|
||||
===================
|
||||
|
||||
See https://certbot.eff.org/docs/install.html#system-requirements.
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:intro-end
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:features-begin
|
||||
|
||||
Current Features
|
||||
=====================
|
||||
|
||||
* Supports multiple web servers:
|
||||
|
||||
- apache/2.x
|
||||
- nginx/0.8.48+
|
||||
- webroot (adds files to webroot directories in order to prove control of
|
||||
domains and obtain certs)
|
||||
- standalone (runs its own simple webserver to prove you control a domain)
|
||||
- other server software via `third party plugins <https://certbot.eff.org/docs/using.html#third-party-plugins>`_
|
||||
|
||||
* The private key is generated locally on your system.
|
||||
* Can talk to the Let's Encrypt CA or optionally to other ACME
|
||||
compliant services.
|
||||
* Can get domain-validated (DV) certificates.
|
||||
* Can revoke certificates.
|
||||
* Adjustable RSA key bit-length (2048 (default), 4096, ...).
|
||||
* Can optionally install a http -> https redirect, so your site effectively
|
||||
runs https only (Apache only)
|
||||
* Fully automated.
|
||||
* Configuration changes are logged and can be reverted.
|
||||
* Supports an interactive text UI, or can be driven entirely from the
|
||||
command line.
|
||||
* Free and Open Source Software, made with Python.
|
||||
|
||||
.. Do not modify this comment unless you know what you're doing. tag:features-end
|
||||
|
||||
For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.
|
||||
@@ -3,6 +3,4 @@ include README.rst
|
||||
include pytest.ini
|
||||
recursive-include docs *
|
||||
recursive-include examples *
|
||||
recursive-include tests *
|
||||
global-exclude __pycache__
|
||||
global-exclude *.py[cod]
|
||||
recursive-include acme/testdata *
|
||||
|
||||
@@ -6,13 +6,13 @@ This module is an implementation of the `ACME protocol`_.
|
||||
|
||||
"""
|
||||
import sys
|
||||
import warnings
|
||||
|
||||
# This code exists to keep backwards compatibility with people using acme.jose
|
||||
# before it became the standalone josepy package.
|
||||
#
|
||||
# It is based on
|
||||
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
|
||||
|
||||
import josepy as jose
|
||||
|
||||
for mod in list(sys.modules):
|
||||
|
||||
@@ -3,19 +3,27 @@ import abc
|
||||
import functools
|
||||
import hashlib
|
||||
import logging
|
||||
import socket
|
||||
import warnings
|
||||
|
||||
from cryptography.hazmat.primitives import hashes # type: ignore
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import requests
|
||||
import six
|
||||
|
||||
from acme import errors
|
||||
from acme import crypto_util
|
||||
from acme import fields
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
# pylint: disable=too-few-public-methods
|
||||
|
||||
|
||||
class Challenge(jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json
|
||||
# _fields_to_partial_json | pylint: disable=abstract-method
|
||||
"""ACME challenge."""
|
||||
TYPES = {} # type: dict
|
||||
|
||||
@@ -29,7 +37,7 @@ class Challenge(jose.TypedJSONObjectWithFields):
|
||||
|
||||
|
||||
class ChallengeResponse(jose.TypedJSONObjectWithFields):
|
||||
# _fields_to_partial_json
|
||||
# _fields_to_partial_json | pylint: disable=abstract-method
|
||||
"""ACME challenge response."""
|
||||
TYPES = {} # type: dict
|
||||
resource_type = 'challenge'
|
||||
@@ -54,7 +62,8 @@ class UnrecognizedChallenge(Challenge):
|
||||
object.__setattr__(self, "jobj", jobj)
|
||||
|
||||
def to_partial_json(self):
|
||||
return self.jobj # pylint: disable=no-member
|
||||
# pylint: disable=no-member
|
||||
return self.jobj
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
@@ -87,7 +96,6 @@ class _TokenChallenge(Challenge):
|
||||
"""
|
||||
# TODO: check that path combined with uri does not go above
|
||||
# URI_ROOT_PATH!
|
||||
# pylint: disable=unsupported-membership-test
|
||||
return b'..' not in self.token and b'/' not in self.token
|
||||
|
||||
|
||||
@@ -112,7 +120,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
parts = self.key_authorization.split('.')
|
||||
parts = self.key_authorization.split('.') # pylint: disable=no-member
|
||||
if len(parts) != 2:
|
||||
logger.debug("Key authorization (%r) is not well formed",
|
||||
self.key_authorization)
|
||||
@@ -132,14 +140,10 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
|
||||
|
||||
return True
|
||||
|
||||
def to_partial_json(self):
|
||||
jobj = super(KeyAuthorizationChallengeResponse, self).to_partial_json()
|
||||
jobj.pop('keyAuthorization', None)
|
||||
return jobj
|
||||
|
||||
|
||||
@six.add_metaclass(abc.ABCMeta)
|
||||
class KeyAuthorizationChallenge(_TokenChallenge):
|
||||
# pylint: disable=abstract-class-little-used,too-many-ancestors
|
||||
"""Challenge based on Key Authorization.
|
||||
|
||||
:param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
|
||||
@@ -171,7 +175,7 @@ class KeyAuthorizationChallenge(_TokenChallenge):
|
||||
:rtype: KeyAuthorizationChallengeResponse
|
||||
|
||||
"""
|
||||
return self.response_cls( # pylint: disable=not-callable
|
||||
return self.response_cls(
|
||||
key_authorization=self.key_authorization(account_key))
|
||||
|
||||
@abc.abstractmethod
|
||||
@@ -208,7 +212,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME dns-01 challenge response."""
|
||||
typ = "dns-01"
|
||||
|
||||
def simple_verify(self, chall, domain, account_public_key): # pylint: disable=unused-argument
|
||||
def simple_verify(self, chall, domain, account_public_key):
|
||||
"""Simple verify.
|
||||
|
||||
This method no longer checks DNS records and is a simple wrapper
|
||||
@@ -224,13 +228,14 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# pylint: disable=unused-argument
|
||||
verified = self.verify(chall, account_public_key)
|
||||
if not verified:
|
||||
logger.debug("Verification of key authorization in response failed")
|
||||
return verified
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class DNS01(KeyAuthorizationChallenge):
|
||||
"""ACME dns-01 challenge."""
|
||||
response_cls = DNS01Response
|
||||
@@ -320,7 +325,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
|
||||
return True
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class HTTP01(KeyAuthorizationChallenge):
|
||||
"""ACME http-01 challenge."""
|
||||
response_cls = HTTP01Response
|
||||
@@ -360,6 +365,154 @@ class HTTP01(KeyAuthorizationChallenge):
|
||||
return self.key_authorization(account_key)
|
||||
|
||||
|
||||
@ChallengeResponse.register
|
||||
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME tls-sni-01 challenge response."""
|
||||
typ = "tls-sni-01"
|
||||
|
||||
DOMAIN_SUFFIX = b".acme.invalid"
|
||||
"""Domain name suffix."""
|
||||
|
||||
PORT = 443
|
||||
"""Verification port as defined by the protocol.
|
||||
|
||||
You can override it (e.g. for testing) by passing ``port`` to
|
||||
`simple_verify`.
|
||||
|
||||
"""
|
||||
|
||||
@property
|
||||
def z(self): # pylint: disable=invalid-name
|
||||
"""``z`` value used for verification.
|
||||
|
||||
:rtype bytes:
|
||||
|
||||
"""
|
||||
return hashlib.sha256(
|
||||
self.key_authorization.encode("utf-8")).hexdigest().lower().encode()
|
||||
|
||||
@property
|
||||
def z_domain(self):
|
||||
"""Domain name used for verification, generated from `z`.
|
||||
|
||||
:rtype bytes:
|
||||
|
||||
"""
|
||||
return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX
|
||||
|
||||
def gen_cert(self, key=None, bits=2048):
|
||||
"""Generate tls-sni-01 certificate.
|
||||
|
||||
:param OpenSSL.crypto.PKey key: Optional private key used in
|
||||
certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
:param int bits: Number of bits for newly generated key.
|
||||
|
||||
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
if key is None:
|
||||
key = OpenSSL.crypto.PKey()
|
||||
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
|
||||
return crypto_util.gen_ss_cert(key, [
|
||||
# z_domain is too big to fit into CN, hence first dummy domain
|
||||
'dummy', self.z_domain.decode()], force_san=True), key
|
||||
|
||||
def probe_cert(self, domain, **kwargs):
|
||||
"""Probe tls-sni-01 challenge certificate.
|
||||
|
||||
:param unicode domain:
|
||||
|
||||
"""
|
||||
# TODO: domain is not necessary if host is provided
|
||||
if "host" not in kwargs:
|
||||
host = socket.gethostbyname(domain)
|
||||
logger.debug('%s resolved to %s', domain, host)
|
||||
kwargs["host"] = host
|
||||
|
||||
kwargs.setdefault("port", self.PORT)
|
||||
kwargs["name"] = self.z_domain
|
||||
# TODO: try different methods?
|
||||
# pylint: disable=protected-access
|
||||
return crypto_util.probe_sni(**kwargs)
|
||||
|
||||
def verify_cert(self, cert):
|
||||
"""Verify tls-sni-01 challenge certificate.
|
||||
|
||||
:param OpenSSL.crypto.X509 cert: Challenge certificate.
|
||||
|
||||
:returns: Whether the certificate was successfully verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# pylint: disable=protected-access
|
||||
sans = crypto_util._pyopenssl_cert_or_req_san(cert)
|
||||
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), sans)
|
||||
return self.z_domain.decode() in sans
|
||||
|
||||
def simple_verify(self, chall, domain, account_public_key,
|
||||
cert=None, **kwargs):
|
||||
"""Simple verify.
|
||||
|
||||
Verify ``validation`` using ``account_public_key``, optionally
|
||||
probe tls-sni-01 certificate and check using `verify_cert`.
|
||||
|
||||
:param .challenges.TLSSNI01 chall: Corresponding challenge.
|
||||
:param str domain: Domain name being validated.
|
||||
:param JWK account_public_key:
|
||||
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
|
||||
provided (``None``) certificate will be retrieved using
|
||||
`probe_cert`.
|
||||
:param int port: Port used to probe the certificate.
|
||||
|
||||
|
||||
:returns: ``True`` iff client's control of the domain has been
|
||||
verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
if not self.verify(chall, account_public_key):
|
||||
logger.debug("Verification of key authorization in response failed")
|
||||
return False
|
||||
|
||||
if cert is None:
|
||||
try:
|
||||
cert = self.probe_cert(domain=domain, **kwargs)
|
||||
except errors.Error as error:
|
||||
logger.debug(str(error), exc_info=True)
|
||||
return False
|
||||
|
||||
return self.verify_cert(cert)
|
||||
|
||||
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class TLSSNI01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-sni-01 challenge."""
|
||||
response_cls = TLSSNI01Response
|
||||
typ = response_cls.typ
|
||||
|
||||
# boulder#962, ietf-wg-acme#22
|
||||
#n = jose.Field("n", encoder=int, decoder=int)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
warnings.warn("TLS-SNI-01 is deprecated, and will stop working soon.",
|
||||
DeprecationWarning, stacklevel=2)
|
||||
super(TLSSNI01, self).__init__(*args, **kwargs)
|
||||
|
||||
def validation(self, account_key, **kwargs):
|
||||
"""Generate validation.
|
||||
|
||||
:param JWK account_key:
|
||||
:param OpenSSL.crypto.PKey cert_key: Optional private key used
|
||||
in certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
|
||||
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))
|
||||
|
||||
|
||||
@ChallengeResponse.register
|
||||
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
"""ACME TLS-ALPN-01 challenge response.
|
||||
@@ -371,7 +524,7 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
typ = "tls-alpn-01"
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class TLSALPN01(KeyAuthorizationChallenge):
|
||||
"""ACME tls-alpn-01 challenge.
|
||||
|
||||
@@ -387,7 +540,7 @@ class TLSALPN01(KeyAuthorizationChallenge):
|
||||
raise NotImplementedError()
|
||||
|
||||
|
||||
@Challenge.register
|
||||
@Challenge.register # pylint: disable=too-many-ancestors
|
||||
class DNS(_TokenChallenge):
|
||||
"""ACME "dns" challenge."""
|
||||
typ = "dns"
|
||||
|
||||
@@ -1,12 +1,16 @@
|
||||
"""Tests for acme.challenges."""
|
||||
import unittest
|
||||
import warnings
|
||||
|
||||
import josepy as jose
|
||||
import mock
|
||||
import OpenSSL
|
||||
import requests
|
||||
from six.moves.urllib import parse as urllib_parse
|
||||
|
||||
import test_util
|
||||
from six.moves.urllib import parse as urllib_parse # pylint: disable=import-error
|
||||
|
||||
from acme import errors
|
||||
from acme import test_util
|
||||
|
||||
CERT = test_util.load_comparable_cert('cert.pem')
|
||||
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
|
||||
@@ -18,6 +22,7 @@ class ChallengeTest(unittest.TestCase):
|
||||
from acme.challenges import Challenge
|
||||
from acme.challenges import UnrecognizedChallenge
|
||||
chall = UnrecognizedChallenge({"type": "foo"})
|
||||
# pylint: disable=no-member
|
||||
self.assertEqual(chall, Challenge.from_json(chall.jobj))
|
||||
|
||||
|
||||
@@ -73,6 +78,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
|
||||
|
||||
|
||||
class DNS01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import DNS01Response
|
||||
@@ -88,8 +94,7 @@ class DNS01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import DNS01Response
|
||||
@@ -144,6 +149,7 @@ class DNS01Test(unittest.TestCase):
|
||||
|
||||
|
||||
class HTTP01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
@@ -159,8 +165,7 @@ class HTTP01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import HTTP01Response
|
||||
@@ -253,7 +258,152 @@ class HTTP01Test(unittest.TestCase):
|
||||
self.msg.update(token=b'..').good_token)
|
||||
|
||||
|
||||
class TLSSNI01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import TLSSNI01
|
||||
self.chall = TLSSNI01(
|
||||
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
|
||||
|
||||
self.response = self.chall.response(KEY)
|
||||
self.jmsg = {
|
||||
'resource': 'challenge',
|
||||
'type': 'tls-sni-01',
|
||||
'keyAuthorization': self.response.key_authorization,
|
||||
}
|
||||
|
||||
# pylint: disable=invalid-name
|
||||
label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
|
||||
label2 = b'b7793728f084394f2a1afd459556bb5c'
|
||||
self.z = label1 + label2
|
||||
self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
|
||||
self.domain = 'foo.com'
|
||||
|
||||
def test_z_and_domain(self):
|
||||
self.assertEqual(self.z, self.response.z)
|
||||
self.assertEqual(self.z_domain, self.response.z_domain)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual(self.jmsg, self.response.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
        from acme.challenges import TLSSNI01Response
        self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import TLSSNI01Response
        hash(TLSSNI01Response.from_json(self.jmsg))

    @mock.patch('acme.challenges.socket.gethostbyname')
    @mock.patch('acme.challenges.crypto_util.probe_sni')
    def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
        mock_gethostbyname.return_value = '127.0.0.1'
        self.response.probe_cert('foo.com')
        mock_gethostbyname.assert_called_once_with('foo.com')
        mock_probe_sni.assert_called_once_with(
            host='127.0.0.1', port=self.response.PORT,
            name=self.z_domain)

        self.response.probe_cert('foo.com', host='8.8.8.8')
        mock_probe_sni.assert_called_with(
            host='8.8.8.8', port=mock.ANY, name=mock.ANY)

        self.response.probe_cert('foo.com', port=1234)
        mock_probe_sni.assert_called_with(
            host=mock.ANY, port=1234, name=mock.ANY)

        self.response.probe_cert('foo.com', bar='baz')
        mock_probe_sni.assert_called_with(
            host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')

        self.response.probe_cert('foo.com', name=b'xxx')
        mock_probe_sni.assert_called_with(
            host=mock.ANY, port=mock.ANY,
            name=self.z_domain)

    def test_gen_verify_cert(self):
        key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
        cert, key2 = self.response.gen_cert(key1)
        self.assertEqual(key1, key2)
        self.assertTrue(self.response.verify_cert(cert))

    def test_gen_verify_cert_gen_key(self):
        cert, key = self.response.gen_cert()
        self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
        self.assertTrue(self.response.verify_cert(cert))

    def test_verify_bad_cert(self):
        self.assertFalse(self.response.verify_cert(
            test_util.load_cert('cert.pem')))

    def test_simple_verify_bad_key_authorization(self):
        key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
        self.response.simple_verify(self.chall, "local", key2.public_key())

    @mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
    def test_simple_verify(self, mock_verify_cert):
        mock_verify_cert.return_value = mock.sentinel.verification
        self.assertEqual(
            mock.sentinel.verification, self.response.simple_verify(
                self.chall, self.domain, KEY.public_key(),
                cert=mock.sentinel.cert))
        mock_verify_cert.assert_called_once_with(
            self.response, mock.sentinel.cert)

    @mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
    def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
        mock_probe_cert.side_effect = errors.Error
        self.assertFalse(self.response.simple_verify(
            self.chall, self.domain, KEY.public_key()))


class TLSSNI01Test(unittest.TestCase):

    def setUp(self):
        self.jmsg = {
            'type': 'tls-sni-01',
            'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
        }

    def _msg(self):
        from acme.challenges import TLSSNI01
        with warnings.catch_warnings(record=True) as warn:
            warnings.simplefilter("always")
            msg = TLSSNI01(
                token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
        assert warn is not None  # using a raw assert for mypy
        self.assertTrue(len(warn) == 1)
        self.assertTrue(issubclass(warn[-1].category, DeprecationWarning))
        self.assertTrue('deprecated' in str(warn[-1].message))
        return msg

    def test_to_partial_json(self):
        self.assertEqual(self.jmsg, self._msg().to_partial_json())

    def test_from_json(self):
        from acme.challenges import TLSSNI01
        self.assertEqual(self._msg(), TLSSNI01.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import TLSSNI01
        hash(TLSSNI01.from_json(self.jmsg))

    def test_from_json_invalid_token_length(self):
        from acme.challenges import TLSSNI01
        self.jmsg['token'] = jose.encode_b64jose(b'abcd')
        self.assertRaises(
            jose.DeserializationError, TLSSNI01.from_json, self.jmsg)

    @mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
    def test_validation(self, mock_gen_cert):
        mock_gen_cert.return_value = ('cert', 'key')
        self.assertEqual(('cert', 'key'), self._msg().validation(
            KEY, cert_key=mock.sentinel.cert_key))
        mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)

class TLSALPN01ResponseTest(unittest.TestCase):
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
@@ -269,8 +419,7 @@ class TLSALPN01ResponseTest(unittest.TestCase):
|
||||
self.response = self.chall.response(KEY)
|
||||
|
||||
def test_to_partial_json(self):
|
||||
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
|
||||
self.msg.to_partial_json())
|
||||
self.assertEqual(self.jmsg, self.msg.to_partial_json())
|
||||
|
||||
def test_from_json(self):
|
||||
from acme.challenges import TLSALPN01Response
|
||||
@@ -5,26 +5,25 @@ import datetime
|
||||
from email.utils import parsedate_tz
|
||||
import heapq
|
||||
import logging
|
||||
import re
|
||||
import sys
|
||||
import time
|
||||
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import requests
|
||||
from requests.adapters import HTTPAdapter
|
||||
from requests_toolbelt.adapters.source import SourceAddressAdapter
|
||||
import six
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import re
|
||||
from requests_toolbelt.adapters.source import SourceAddressAdapter
|
||||
import requests
|
||||
from requests.adapters import HTTPAdapter
|
||||
import sys
|
||||
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import jws
|
||||
from acme import messages
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Text # pylint: disable=unused-import, no-name-in-module
|
||||
# pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Dict, List, Set, Text
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@@ -34,6 +33,7 @@ logger = logging.getLogger(__name__)
|
||||
# https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning
|
||||
if sys.version_info < (2, 7, 9): # pragma: no cover
|
||||
try:
|
||||
# pylint: disable=no-member
|
||||
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
|
||||
except AttributeError:
|
||||
import urllib3.contrib.pyopenssl # pylint: disable=import-error
|
||||
@@ -44,7 +44,7 @@ DEFAULT_NETWORK_TIMEOUT = 45
|
||||
DER_CONTENT_TYPE = 'application/pkix-cert'
|
||||
|
||||
|
||||
class ClientBase(object):
|
||||
class ClientBase(object): # pylint: disable=too-many-instance-attributes
|
||||
"""ACME client base object.
|
||||
|
||||
:ivar messages.Directory directory:
|
||||
@@ -123,21 +123,14 @@ class ClientBase(object):
|
||||
"""
|
||||
return self.update_registration(regr, update={'status': 'deactivated'})
|
||||
|
||||
def deactivate_authorization(self, authzr):
|
||||
# type: (messages.AuthorizationResource) -> messages.AuthorizationResource
|
||||
"""Deactivate authorization.
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.AuthorizationResource authzr: The Authorization resource
|
||||
to be deactivated.
|
||||
|
||||
:returns: The Authorization resource that was deactivated.
|
||||
:rtype: `.AuthorizationResource`
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
body = messages.UpdateAuthorization(status='deactivated')
|
||||
response = self._post(authzr.uri, body)
|
||||
return self._authzr_from_response(response,
|
||||
authzr.body.identifier, authzr.uri)
|
||||
return self._send_recv_regr(regr, messages.UpdateRegistration())
|
||||
|
||||
def _authzr_from_response(self, response, identifier=None, uri=None):
|
||||
authzr = messages.AuthorizationResource(
|
||||
@@ -254,6 +247,7 @@ class Client(ClientBase):
|
||||
URI from which the resource will be downloaded.
|
||||
|
||||
"""
|
||||
# pylint: disable=too-many-arguments
|
||||
self.key = key
|
||||
if net is None:
|
||||
net = ClientNetwork(key, alg=alg, verify_ssl=verify_ssl)
|
||||
@@ -279,17 +273,9 @@ class Client(ClientBase):
|
||||
assert response.status_code == http_client.CREATED
|
||||
|
||||
# "Instance of 'Field' has no key/contact member" bug:
|
||||
# pylint: disable=no-member
|
||||
return self._regr_from_response(response)
|
||||
|
||||
def query_registration(self, regr):
|
||||
"""Query server about registration.
|
||||
|
||||
:param messages.RegistrationResource: Existing Registration
|
||||
Resource.
|
||||
|
||||
"""
|
||||
return self._send_recv_regr(regr, messages.UpdateRegistration())
|
||||
|
||||
def agree_to_tos(self, regr):
|
||||
"""Agree to the terms-of-service.
|
||||
|
||||
@@ -433,6 +419,7 @@ class Client(ClientBase):
|
||||
was marked by the CA as invalid
|
||||
|
||||
"""
|
||||
# pylint: disable=too-many-locals
|
||||
assert max_attempts > 0
|
||||
attempts = collections.defaultdict(int) # type: Dict[messages.AuthorizationResource, int]
|
||||
exhausted = set()
|
||||
@@ -463,6 +450,7 @@ class Client(ClientBase):
|
||||
updated[authzr] = updated_authzr
|
||||
|
||||
attempts[authzr] += 1
|
||||
# pylint: disable=no-member
|
||||
if updated_authzr.body.status not in (
|
||||
messages.STATUS_VALID, messages.STATUS_INVALID):
|
||||
if attempts[authzr] < max_attempts:
|
||||
@@ -603,6 +591,7 @@ class ClientV2(ClientBase):
|
||||
if response.status_code == 200 and 'Location' in response.headers:
|
||||
raise errors.ConflictError(response.headers.get('Location'))
|
||||
# "Instance of 'Field' has no key/contact member" bug:
|
||||
# pylint: disable=no-member
|
||||
regr = self._regr_from_response(response)
|
||||
self.net.account = regr
|
||||
return regr
|
||||
@@ -614,13 +603,10 @@ class ClientV2(ClientBase):
|
||||
Resource.
|
||||
|
||||
"""
|
||||
self.net.account = regr # See certbot/certbot#6258
|
||||
# ACME v2 requires to use a POST-as-GET request (POST an empty JWS) here.
|
||||
# This is done by passing None instead of an empty UpdateRegistration to _post().
|
||||
response = self._post(regr.uri, None)
|
||||
self.net.account = self._regr_from_response(response, uri=regr.uri,
|
||||
terms_of_service=regr.terms_of_service)
|
||||
return self.net.account
|
||||
self.net.account = regr
|
||||
updated_regr = super(ClientV2, self).query_registration(regr)
|
||||
self.net.account = updated_regr
|
||||
return updated_regr
|
||||
|
||||
def update_registration(self, regr, update=None):
|
||||
"""Update registration.
|
||||
@@ -666,7 +652,7 @@ class ClientV2(ClientBase):
|
||||
response = self._post(self.directory['newOrder'], order)
|
||||
body = messages.Order.from_json(response.json())
|
||||
authorizations = []
|
||||
for url in body.authorizations: # pylint: disable=not-an-iterable
|
||||
for url in body.authorizations:
|
||||
authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
|
||||
return messages.OrderResource(
|
||||
body=body,
|
||||
@@ -726,9 +712,9 @@ class ClientV2(ClientBase):
|
||||
for authzr in responses:
|
||||
if authzr.body.status != messages.STATUS_VALID:
|
||||
for chall in authzr.body.challenges:
|
||||
if chall.error is not None:
|
||||
if chall.error != None:
|
||||
failed.append(authzr)
|
||||
if failed:
|
||||
if len(failed) > 0:
|
||||
raise errors.ValidationError(failed)
|
||||
return orderr.update(authorizations=responses)
|
||||
|
||||
@@ -772,17 +758,36 @@ class ClientV2(ClientBase):
|
||||
|
||||
def external_account_required(self):
|
||||
"""Checks if ACME server requires External Account Binding authentication."""
|
||||
return hasattr(self.directory, 'meta') and self.directory.meta.external_account_required
|
||||
if hasattr(self.directory, 'meta') and self.directory.meta.external_account_required:
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
|
||||
    def _post_as_get(self, *args, **kwargs):
        """
        Send GET request using the POST-as-GET protocol.
        Send GET request using the POST-as-GET protocol if needed.
        The request will be first issued using POST-as-GET for ACME v2. If the ACME CA servers do
        not support this yet and return an error, request will be retried using GET.
        For ACME v1, only GET request will be tried, as POST-as-GET is not supported.
        :param args:
        :param kwargs:
        :return:
        """
        new_args = args[:1] + (None,) + args[1:]
        return self._post(*new_args, **kwargs)
        if self.acme_version >= 2:
            # We add an empty payload for POST-as-GET requests
            new_args = args[:1] + (None,) + args[1:]
            try:
                return self._post(*new_args, **kwargs)  # pylint: disable=star-args
            except messages.Error as error:
                if error.code == 'malformed':
                    logger.debug('Error during a POST-as-GET request, '
                                 'your ACME CA may not support it:\n%s', error)
                    logger.debug('Retrying request with GET.')
                else:  # pragma: no cover
                    raise

        # If POST-as-GET is not supported yet, we use a GET instead.
        return self.net.get(*args, **kwargs)

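The rewritten `_post_as_get` above follows a try-POST-as-GET, then-fall-back-to-GET pattern. A minimal standalone sketch of the same idea, with made-up `post` and `get` callables standing in for the client's HTTP helpers (illustrative only, not the acme API):

```python
import logging

logger = logging.getLogger(__name__)


def post_as_get(url, post, get):
    """Sketch: try POST-as-GET (empty payload) first, fall back to a plain GET.

    `post` and `get` are hypothetical callables; the real method above uses
    self._post / self.net.get and retries only when messages.Error has the
    'malformed' code.
    """
    try:
        return post(url, None)  # None payload == empty JWS == POST-as-GET
    except RuntimeError as error:  # stand-in for the ACME 'malformed' error
        logger.debug('POST-as-GET rejected, retrying with GET:\n%s', error)
    return get(url)
```
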
class BackwardsCompatibleClientV2(object):
|
||||
@@ -860,7 +865,8 @@ class BackwardsCompatibleClientV2(object):
|
||||
for domain in dnsNames:
|
||||
authorizations.append(self.client.request_domain_challenges(domain))
|
||||
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
|
||||
return self.client.new_order(csr_pem)
|
||||
else:
|
||||
return self.client.new_order(csr_pem)
|
||||
|
||||
def finalize_order(self, orderr, deadline):
|
||||
"""Finalize an order and obtain a certificate.
|
||||
@@ -897,7 +903,8 @@ class BackwardsCompatibleClientV2(object):
|
||||
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
|
||||
|
||||
return orderr.update(fullchain_pem=(cert + chain))
|
||||
return self.client.finalize_order(orderr, deadline)
|
||||
else:
|
||||
return self.client.finalize_order(orderr, deadline)
|
||||
|
||||
def revoke(self, cert, rsn):
|
||||
"""Revoke certificate.
|
||||
@@ -915,7 +922,8 @@ class BackwardsCompatibleClientV2(object):
|
||||
def _acme_version_from_directory(self, directory):
|
||||
if hasattr(directory, 'newNonce'):
|
||||
return 2
|
||||
return 1
|
||||
else:
|
||||
return 1
|
||||
|
||||
def external_account_required(self):
|
||||
"""Checks if the server requires an external account for ACMEv2 servers.
|
||||
@@ -923,10 +931,11 @@ class BackwardsCompatibleClientV2(object):
|
||||
Always return False for ACMEv1 servers, as it doesn't use External Account Binding."""
|
||||
if self.acme_version == 1:
|
||||
return False
|
||||
return self.client.external_account_required()
|
||||
else:
|
||||
return self.client.external_account_required()
|
||||
|
||||
|
||||
class ClientNetwork(object):
|
||||
class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
|
||||
"""Wrapper around requests that signs POSTs for authentication.
|
||||
|
||||
Also adds user agent, and handles Content-Type.
|
||||
@@ -952,6 +961,7 @@ class ClientNetwork(object):
|
||||
def __init__(self, key, account=None, alg=jose.RS256, verify_ssl=True,
|
||||
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT,
|
||||
source_address=None):
|
||||
# pylint: disable=too-many-arguments
|
||||
self.key = key
|
||||
self.account = account
|
||||
self.alg = alg
|
||||
@@ -1000,6 +1010,7 @@ class ClientNetwork(object):
|
||||
if self.account is not None:
|
||||
kwargs["kid"] = self.account["uri"]
|
||||
kwargs["key"] = self.key
|
||||
# pylint: disable=star-args
|
||||
return jws.JWS.sign(jobj, **kwargs).json_dumps(indent=2)
|
||||
|
||||
@classmethod
|
||||
@@ -1059,6 +1070,7 @@ class ClientNetwork(object):
|
||||
return response
|
||||
|
||||
def _send_request(self, method, url, *args, **kwargs):
|
||||
# pylint: disable=too-many-locals
|
||||
"""Send HTTP request.
|
||||
|
||||
Makes sure that `verify_ssl` is respected. Logs request and
|
||||
@@ -1105,9 +1117,10 @@ class ClientNetwork(object):
|
||||
err_regex = r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*).*(\[Errno \d+\])([A-Za-z ]*)"
|
||||
m = re.match(err_regex, str(e))
|
||||
if m is None:
|
||||
raise # pragma: no cover
|
||||
host, path, _err_no, err_msg = m.groups()
|
||||
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
|
||||
raise # pragma: no cover
|
||||
else:
|
||||
host, path, _err_no, err_msg = m.groups()
|
||||
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
|
||||
|
||||
# If content is DER, log the base64 of it instead of raw bytes, to keep
|
||||
# binary data out of the logs.
|
||||
@@ -1173,11 +1186,15 @@ class ClientNetwork(object):
|
||||
if error.code == 'badNonce':
|
||||
logger.debug('Retrying request after error:\n%s', error)
|
||||
return self._post_once(*args, **kwargs)
|
||||
raise
|
||||
else:
|
||||
raise
|
||||
|
||||
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
|
||||
acme_version=1, **kwargs):
|
||||
new_nonce_url = kwargs.pop('new_nonce_url', None)
|
||||
try:
|
||||
new_nonce_url = kwargs.pop('new_nonce_url')
|
||||
except KeyError:
|
||||
new_nonce_url = None
|
||||
data = self._wrap_in_jws(obj, self._get_nonce(url, new_nonce_url), url, acme_version)
|
||||
kwargs.setdefault('headers', {'Content-Type': content_type})
|
||||
response = self._send_request('POST', url, data=data, **kwargs)
|
||||
|
||||
@@ -5,19 +5,21 @@ import datetime
|
||||
import json
|
||||
import unittest
|
||||
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
|
||||
import josepy as jose
|
||||
import mock
|
||||
import OpenSSL
|
||||
import requests
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
|
||||
from acme import challenges
|
||||
from acme import errors
|
||||
from acme import jws as acme_jws
|
||||
from acme import messages
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
import messages_test
|
||||
import test_util
|
||||
from acme import messages_test
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
CERT_DER = test_util.load_vector('cert.der')
|
||||
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
|
||||
@@ -61,8 +63,8 @@ class ClientTestBase(unittest.TestCase):
|
||||
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
|
||||
reg = messages.Registration(
|
||||
contact=self.contact, key=KEY.public_key())
|
||||
the_arg = dict(reg) # type: Dict
|
||||
self.new_reg = messages.NewRegistration(**the_arg)
|
||||
the_arg = dict(reg) # type: Dict
|
||||
self.new_reg = messages.NewRegistration(**the_arg) # pylint: disable=star-args
|
||||
self.regr = messages.RegistrationResource(
|
||||
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
|
||||
|
||||
@@ -316,6 +318,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
|
||||
|
||||
class ClientTest(ClientTestBase):
|
||||
"""Tests for acme.client.Client."""
|
||||
# pylint: disable=too-many-instance-attributes,too-many-public-methods
|
||||
|
||||
def setUp(self):
|
||||
super(ClientTest, self).setUp()
|
||||
@@ -355,6 +358,7 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_register(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.status_code = http_client.CREATED
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
@@ -367,6 +371,7 @@ class ClientTest(ClientTestBase):
|
||||
|
||||
def test_update_registration(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, self.client.update_registration(self.regr))
|
||||
@@ -634,14 +639,6 @@ class ClientTest(ClientTestBase):
|
||||
errors.PollError, self.client.poll_and_request_issuance,
|
||||
csr, authzrs, mintime=mintime, max_attempts=2)
|
||||
|
||||
def test_deactivate_authorization(self):
|
||||
authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
|
||||
self.response.json.return_value = authzb.to_json()
|
||||
authzr = self.client.deactivate_authorization(self.authzr)
|
||||
self.assertEqual(authzb, authzr.body)
|
||||
self.assertEqual(self.client.net.post.call_count, 1)
|
||||
self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])
|
||||
|
||||
def test_check_cert(self):
|
||||
self.response.headers['Location'] = self.certr.uri
|
||||
self.response.content = CERT_DER
|
||||
@@ -847,6 +844,7 @@ class ClientV2Test(ClientTestBase):
|
||||
|
||||
def test_update_registration(self):
|
||||
# "Instance of 'Field' has no to_json/update member" bug:
|
||||
# pylint: disable=no-member
|
||||
self.response.headers['Location'] = self.regr.uri
|
||||
self.response.json.return_value = self.regr.body.to_json()
|
||||
self.assertEqual(self.regr, self.client.update_registration(self.regr))
|
||||
@@ -885,6 +883,19 @@ class ClientV2Test(ClientTestBase):
|
||||
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
|
||||
self.client.net.get.assert_not_called()
|
||||
|
||||
class FakeError(messages.Error): # pylint: disable=too-many-ancestors
|
||||
"""Fake error to reproduce a malformed request ACME error"""
|
||||
def __init__(self): # pylint: disable=super-init-not-called
|
||||
pass
|
||||
@property
|
||||
def code(self):
|
||||
return 'malformed'
|
||||
self.client.net.post.side_effect = FakeError()
|
||||
|
||||
self.client.poll(self.authzr2) # pylint: disable=protected-access
|
||||
|
||||
self.client.net.get.assert_called_once_with(self.authzr2.uri)
|
||||
|
||||
|
||||
class MockJSONDeSerializable(jose.JSONDeSerializable):
|
||||
# pylint: disable=missing-docstring
|
||||
@@ -895,12 +906,13 @@ class MockJSONDeSerializable(jose.JSONDeSerializable):
|
||||
return {'foo': self.value}
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
def from_json(cls, value):
|
||||
pass # pragma: no cover
|
||||
|
||||
|
||||
class ClientNetworkTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork."""
|
||||
# pylint: disable=too-many-public-methods
|
||||
|
||||
def setUp(self):
|
||||
self.verify_ssl = mock.MagicMock()
|
||||
@@ -950,8 +962,8 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
|
||||
def test_check_response_not_ok_jobj_error(self):
|
||||
self.response.ok = False
|
||||
self.response.json.return_value = messages.Error.with_code(
|
||||
'serverInternal', detail='foo', title='some title').to_json()
|
||||
self.response.json.return_value = messages.Error(
|
||||
detail='foo', typ='serverInternal', title='some title').to_json()
|
||||
# pylint: disable=protected-access
|
||||
self.assertRaises(
|
||||
messages.Error, self.net._check_response, self.response)
|
||||
@@ -976,7 +988,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.response.json.side_effect = ValueError
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access
|
||||
# pylint: disable=protected-access,no-value-for-parameter
|
||||
self.assertEqual(
|
||||
self.response, self.net._check_response(self.response))
|
||||
|
||||
@@ -990,7 +1002,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
self.response.json.return_value = {}
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access
|
||||
# pylint: disable=protected-access,no-value-for-parameter
|
||||
self.assertEqual(
|
||||
self.response, self.net._check_response(self.response))
|
||||
|
||||
@@ -1106,6 +1118,7 @@ class ClientNetworkTest(unittest.TestCase):
|
||||
|
||||
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork which mock out response."""
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.client import ClientNetwork
|
||||
@@ -1115,8 +1128,8 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
self.response.headers = {}
|
||||
self.response.links = {}
|
||||
self.response.checked = False
|
||||
self.acmev1_nonce_response = mock.MagicMock(
|
||||
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
|
||||
self.acmev1_nonce_response = mock.MagicMock(ok=False,
|
||||
status_code=http_client.METHOD_NOT_ALLOWED)
|
||||
self.acmev1_nonce_response.headers = {}
|
||||
self.obj = mock.MagicMock()
|
||||
self.wrapped_obj = mock.MagicMock()
|
||||
@@ -6,29 +6,32 @@ import os
|
||||
import re
|
||||
import socket
|
||||
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
|
||||
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
|
||||
import josepy as jose
|
||||
|
||||
from acme import errors
|
||||
from acme.magic_typing import Callable # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Optional # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Tuple # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
|
||||
# pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Callable, Union, Tuple, Optional
|
||||
# pylint: enable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Default SSL method selected here is the most compatible, while secure
|
||||
# SSL method: TLSv1_METHOD is only compatible with
|
||||
# TLSSNI01 certificate serving and probing is not affected by SSL
|
||||
# vulnerabilities: prober needs to check certificate for expected
|
||||
# contents anyway. Working SNI is the only thing that's necessary for
|
||||
# the challenge and thus scoping down SSL/TLS method (version) would
|
||||
# cause interoperability issues: TLSv1_METHOD is only compatible with
|
||||
# TLSv1_METHOD, while SSLv23_METHOD is compatible with all other
|
||||
# methods, including TLSv2_METHOD (read more at
|
||||
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
|
||||
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
|
||||
# in case it's used for things other than probing/serving!
|
||||
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
|
||||
_DEFAULT_TLSSNI01_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
|
||||
|
||||
|
||||
class SSLSocket(object):
|
||||
class SSLSocket(object): # pylint: disable=too-few-public-methods
|
||||
"""SSL wrapper for sockets.
|
||||
|
||||
:ivar socket sock: Original wrapped socket.
|
||||
@@ -37,7 +40,7 @@ class SSLSocket(object):
|
||||
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
|
||||
|
||||
"""
|
||||
def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
|
||||
def __init__(self, sock, certs, method=_DEFAULT_TLSSNI01_SSL_METHOD):
|
||||
self.sock = sock
|
||||
self.certs = certs
|
||||
self.method = method
|
||||
@@ -74,7 +77,7 @@ class SSLSocket(object):
|
||||
class FakeConnection(object):
|
||||
"""Fake OpenSSL.SSL.Connection."""
|
||||
|
||||
# pylint: disable=missing-docstring
|
||||
# pylint: disable=too-few-public-methods,missing-docstring
|
||||
|
||||
def __init__(self, connection):
|
||||
self._wrapped = connection
|
||||
@@ -109,7 +112,7 @@ class SSLSocket(object):
|
||||
|
||||
|
||||
def probe_sni(name, host, port=443, timeout=300,
|
||||
method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
|
||||
method=_DEFAULT_TLSSNI01_SSL_METHOD, source_address=('', 0)):
|
||||
"""Probe SNI server for SSL certificate.
|
||||
|
||||
:param bytes name: Byte string to send as the server name in the
|
||||
@@ -134,6 +137,7 @@ def probe_sni(name, host, port=443, timeout=300,
|
||||
socket_kwargs = {'source_address': source_address}
|
||||
|
||||
try:
|
||||
# pylint: disable=star-args
|
||||
logger.debug(
|
||||
"Attempting to connect to %s:%d%s.", host, port,
|
||||
" from {0}:{1}".format(
|
||||
@@ -194,7 +198,8 @@ def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
|
||||
|
||||
if common_name is None:
|
||||
return sans
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
else:
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
|
||||
def _pyopenssl_cert_or_req_san(cert_or_req):
|
||||
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
|
||||
|
||||
@@ -5,14 +5,15 @@ import threading
|
||||
import time
|
||||
import unittest
|
||||
|
||||
import six
|
||||
from six.moves import socketserver #type: ignore # pylint: disable=import-error
|
||||
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import six
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
from acme import errors
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
import test_util
|
||||
from acme import test_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
@@ -29,6 +30,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
|
||||
class _TestServer(socketserver.TCPServer):
|
||||
|
||||
# pylint: disable=too-few-public-methods
|
||||
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
|
||||
|
||||
def server_bind(self): # pylint: disable=missing-docstring
|
||||
@@ -38,6 +40,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
|
||||
self.port = self.server.socket.getsockname()[1]
|
||||
self.server_thread = threading.Thread(
|
||||
# pylint: disable=no-member
|
||||
target=self.server.handle_request)
|
||||
|
||||
def tearDown(self):
|
||||
@@ -64,7 +67,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
|
||||
def test_probe_connection_error(self):
|
||||
# pylint has a hard time with six
|
||||
self.server.server_close()
|
||||
self.server.server_close() # pylint: disable=no-member
|
||||
original_timeout = socket.getdefaulttimeout()
|
||||
try:
|
||||
socket.setdefaulttimeout(1)
|
||||
@@ -29,12 +29,7 @@ class NonceError(ClientError):
|
||||
class BadNonce(NonceError):
|
||||
"""Bad nonce error."""
|
||||
def __init__(self, nonce, error, *args, **kwargs):
|
||||
# MyPy complains here that there is too many arguments for BaseException constructor.
|
||||
# This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
|
||||
# The fix is included in MyPy>=0.740, but upgrading it would bring dozen of errors due to
|
||||
# new types definitions. So we ignore the error until the code base is fixed to match
|
||||
# with MyPy>=0.740 referential.
|
||||
super(BadNonce, self).__init__(*args, **kwargs) # type: ignore
|
||||
super(BadNonce, self).__init__(*args, **kwargs)
|
||||
self.nonce = nonce
|
||||
self.error = error
|
||||
|
||||
@@ -53,8 +48,7 @@ class MissingNonce(NonceError):
|
||||
|
||||
"""
|
||||
def __init__(self, response, *args, **kwargs):
|
||||
# See comment in BadNonce constructor above for an explanation of type: ignore here.
|
||||
super(MissingNonce, self).__init__(*args, **kwargs) # type: ignore
|
||||
super(MissingNonce, self).__init__(*args, **kwargs)
|
||||
self.response = response
|
||||
|
||||
def __str__(self):
|
||||
@@ -89,7 +83,6 @@ class PollError(ClientError):
|
||||
return '{0}(exhausted={1!r}, updated={2!r})'.format(
|
||||
self.__class__.__name__, self.exhausted, self.updated)
|
||||
|
||||
|
||||
class ValidationError(Error):
|
||||
"""Error for authorization failures. Contains a list of authorization
|
||||
resources, each of which is invalid and should have an error field.
|
||||
@@ -98,11 +91,9 @@ class ValidationError(Error):
|
||||
self.failed_authzrs = failed_authzrs
|
||||
super(ValidationError, self).__init__()
|
||||
|
||||
|
||||
class TimeoutError(Error): # pylint: disable=redefined-builtin
|
||||
class TimeoutError(Error):
|
||||
"""Error for when polling an authorization or an order times out."""
|
||||
|
||||
|
||||
class IssuanceError(Error):
|
||||
"""Error sent by the server after requesting issuance of a certificate."""
|
||||
|
||||
@@ -114,7 +105,6 @@ class IssuanceError(Error):
|
||||
self.error = error
|
||||
super(IssuanceError, self).__init__()
|
||||
|
||||
|
||||
class ConflictError(ClientError):
|
||||
"""Error for when the server returns a 409 (Conflict) HTTP status.
|
||||
|
||||
|
||||
@@ -4,6 +4,7 @@ import logging
|
||||
import josepy as jose
|
||||
import pyrfc3339
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
|
||||
@@ -2,7 +2,6 @@
|
||||
import importlib
|
||||
import unittest
|
||||
|
||||
|
||||
class JoseTest(unittest.TestCase):
|
||||
"""Tests for acme.jose shim."""
|
||||
|
||||
@@ -21,10 +20,11 @@ class JoseTest(unittest.TestCase):
|
||||
|
||||
# We use the imports below with eval, but pylint doesn't
|
||||
# understand that.
|
||||
import acme # pylint: disable=unused-import
|
||||
import josepy # pylint: disable=unused-import
|
||||
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
|
||||
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
|
||||
# pylint: disable=eval-used,unused-variable
|
||||
import acme
|
||||
import josepy
|
||||
acme_jose_mod = eval(acme_jose_path)
|
||||
josepy_mod = eval(josepy_path)
|
||||
self.assertIs(acme_jose_mod, josepy_mod)
|
||||
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))
|
||||
|
||||
@@ -40,10 +40,10 @@ class Signature(jose.Signature):
|
||||
class JWS(jose.JWS):
|
||||
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
|
||||
signature_cls = Signature
|
||||
__slots__ = jose.JWS._orig_slots
|
||||
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
|
||||
|
||||
@classmethod
|
||||
# pylint: disable=arguments-differ
|
||||
# pylint: disable=arguments-differ,too-many-arguments
|
||||
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
|
||||
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
|
||||
# jwk field if kid is not provided.
|
||||
|
||||
@@ -3,7 +3,8 @@ import unittest
|
||||
|
||||
import josepy as jose
|
||||
|
||||
import test_util
|
||||
from acme import test_util
|
||||
|
||||
|
||||
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
|
||||
|
||||
@@ -1,7 +1,6 @@
|
||||
"""Shim class to not have to depend on typing module in prod."""
|
||||
import sys
|
||||
|
||||
|
||||
class TypingClass(object):
|
||||
"""Ignore import errors by getting anything"""
|
||||
def __getattr__(self, name):
|
||||
|
||||
@@ -1,55 +1,37 @@
|
||||
"""ACME protocol messages."""
|
||||
import six
|
||||
import json
|
||||
try:
|
||||
from collections.abc import Hashable # pylint: disable=no-name-in-module
|
||||
except ImportError:
|
||||
from collections import Hashable
|
||||
|
||||
import josepy as jose
|
||||
import six
|
||||
|
||||
from acme import challenges
|
||||
from acme import errors
|
||||
from acme import fields
|
||||
from acme import jws
|
||||
from acme import util
|
||||
|
||||
try:
|
||||
from collections.abc import Hashable # pylint: disable=no-name-in-module
|
||||
except ImportError: # pragma: no cover
|
||||
from collections import Hashable
|
||||
|
||||
|
||||
from acme import jws
|
||||
|
||||
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"

ERROR_CODES = {
    'accountDoesNotExist': 'The request specified an account that does not exist',
    'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
        ' already been revoked',
    'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'badPublicKey': 'The JWS was signed by a public key the server does not support',
    'badRevocationReason': 'The revocation reason provided is not allowed by the server',
    'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
    'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
        ' a certificate',
    'compound': 'Specific error conditions are indicated in the "subproblems" array',
    'connection': ('The server could not connect to the client to verify the'
                   ' domain'),
    'dns': 'There was a problem with a DNS query during identifier validation',
    'dnssec': 'The server could not validate a DNSSEC signed domain',
    'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
    # deprecate invalidEmail
    'invalidEmail': 'The provided email for a registration was invalid',
    'invalidContact': 'The provided contact URI was invalid',
    'malformed': 'The request message was malformed',
    'rejectedIdentifier': 'The server will not issue certificates for the identifier',
    'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
    'rateLimited': 'There were too many requests of a given type',
    'serverInternal': 'The server experienced an internal error',
    'tls': 'The server experienced a TLS error during domain verification',
    'unauthorized': 'The client lacks sufficient authorization',
    'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
    'unknownHost': 'The server could not resolve a domain name',
    'unsupportedIdentifier': 'An identifier is of an unsupported type',
    'externalAccountRequired': 'The server requires external account binding',
}

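As a quick illustration of how the prefixes and the `ERROR_CODES` table above fit together, a minimal sketch (not part of the diff; `err_type` is a made-up input) that maps an error `type` URN back to its description:

```python
from acme import messages

err_type = 'urn:ietf:params:acme:error:badNonce'  # hypothetical input
for prefix in (messages.ERROR_PREFIX, messages.OLD_ERROR_PREFIX):
    if err_type.startswith(prefix):
        code = err_type[len(prefix):]
        print(code, '->', messages.ERROR_CODES.get(code, 'unknown error code'))
```
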
@@ -64,7 +46,8 @@ def is_acme_error(err):
|
||||
"""Check if argument is an ACME error."""
|
||||
if isinstance(err, Error) and (err.typ is not None):
|
||||
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
|
||||
return False
|
||||
else:
|
||||
return False
|
||||
|
||||
|
||||
@six.python_2_unicode_compatible
|
||||
@@ -119,7 +102,6 @@ class Error(jose.JSONObjectWithFields, errors.Error):
|
||||
code = str(self.typ).split(':')[-1]
|
||||
if code in ERROR_CODES:
|
||||
return code
|
||||
return None
|
||||
|
||||
def __str__(self):
|
||||
return b' :: '.join(
|
||||
@@ -134,19 +116,18 @@ class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
|
||||
POSSIBLE_NAMES = NotImplemented
|
||||
|
||||
def __init__(self, name):
|
||||
super(_Constant, self).__init__()
|
||||
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
|
||||
self.POSSIBLE_NAMES[name] = self
|
||||
self.name = name
|
||||
|
||||
def to_partial_json(self):
|
||||
return self.name
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
|
||||
def from_json(cls, value):
|
||||
if value not in cls.POSSIBLE_NAMES:
|
||||
raise jose.DeserializationError(
|
||||
'{0} not recognized'.format(cls.__name__))
|
||||
return cls.POSSIBLE_NAMES[jobj]
|
||||
return cls.POSSIBLE_NAMES[value]
|
||||
|
||||
def __repr__(self):
|
||||
return '{0}({1})'.format(self.__class__.__name__, self.name)
|
||||
@@ -171,7 +152,6 @@ STATUS_VALID = Status('valid')
|
||||
STATUS_INVALID = Status('invalid')
|
||||
STATUS_REVOKED = Status('revoked')
|
||||
STATUS_READY = Status('ready')
|
||||
STATUS_DEACTIVATED = Status('deactivated')
|
||||
|
||||
|
||||
class IdentifierType(_Constant):
|
||||
@@ -206,6 +186,7 @@ class Directory(jose.JSONDeSerializable):
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
|
||||
# pylint: disable=star-args
|
||||
super(Directory.Meta, self).__init__(**kwargs)
|
||||
|
||||
@property
|
||||
@@ -341,7 +322,7 @@ class Registration(ResourceBody):
|
||||
|
||||
def _filter_contact(self, prefix):
|
||||
return tuple(
|
||||
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
|
||||
detail[len(prefix):] for detail in self.contact
|
||||
if detail.startswith(prefix))
|
||||
|
||||
@property
|
||||
@@ -413,6 +394,7 @@ class ChallengeBody(ResourceBody):
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
|
||||
# pylint: disable=star-args
|
||||
super(ChallengeBody, self).__init__(**kwargs)
|
||||
|
||||
def encode(self, name):
|
||||
@@ -475,7 +457,7 @@ class Authorization(ResourceBody):
|
||||
:ivar datetime.datetime expires:
|
||||
|
||||
"""
|
||||
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
|
||||
identifier = jose.Field('identifier', decoder=Identifier.from_json)
|
||||
challenges = jose.Field('challenges', omitempty=True)
|
||||
combinations = jose.Field('combinations', omitempty=True)
|
||||
|
||||
@@ -495,7 +477,7 @@ class Authorization(ResourceBody):
|
||||
def resolved_combinations(self):
|
||||
"""Combinations with challenges instead of indices."""
|
||||
return tuple(tuple(self.challenges[idx] for idx in combo)
|
||||
for combo in self.combinations) # pylint: disable=not-an-iterable
|
||||
for combo in self.combinations)
|
||||
|
||||
|
||||
@Directory.register
|
||||
@@ -505,12 +487,6 @@ class NewAuthorization(Authorization):
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class UpdateAuthorization(Authorization):
|
||||
"""Update authorization."""
|
||||
resource_type = 'authz'
|
||||
resource = fields.Resource(resource_type)
|
||||
|
||||
|
||||
class AuthorizationResource(ResourceWithURI):
|
||||
"""Authorization Resource.
|
||||
|
||||
|
||||
@@ -5,8 +5,9 @@ import josepy as jose
|
||||
import mock
|
||||
|
||||
from acme import challenges
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
import test_util
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
CERT = test_util.load_comparable_cert('cert.der')
|
||||
CSR = test_util.load_comparable_csr('csr.der')
|
||||
@@ -18,7 +19,8 @@ class ErrorTest(unittest.TestCase):
|
||||
|
||||
def setUp(self):
|
||||
from acme.messages import Error, ERROR_PREFIX
|
||||
self.error = Error.with_code('malformed', detail='foo', title='title')
|
||||
self.error = Error(
|
||||
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
|
||||
self.jobj = {
|
||||
'detail': 'foo',
|
||||
'title': 'some title',
|
||||
@@ -26,6 +28,7 @@ class ErrorTest(unittest.TestCase):
|
||||
}
|
||||
self.error_custom = Error(typ='custom', detail='bar')
|
||||
self.empty_error = Error()
|
||||
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
|
||||
|
||||
def test_default_typ(self):
|
||||
from acme.messages import Error
|
||||
@@ -40,7 +43,8 @@ class ErrorTest(unittest.TestCase):
|
||||
hash(Error.from_json(self.error.to_json()))
|
||||
|
||||
def test_description(self):
|
||||
self.assertEqual('The request message was malformed', self.error.description)
|
||||
self.assertEqual(
|
||||
'The request message was malformed', self.error.description)
|
||||
self.assertTrue(self.error_custom.description is None)
|
||||
|
||||
def test_code(self):
|
||||
@@ -50,17 +54,17 @@ class ErrorTest(unittest.TestCase):
|
||||
self.assertEqual(None, Error().code)
|
||||
|
||||
def test_is_acme_error(self):
|
||||
from acme.messages import is_acme_error, Error
|
||||
from acme.messages import is_acme_error
|
||||
self.assertTrue(is_acme_error(self.error))
|
||||
self.assertFalse(is_acme_error(self.error_custom))
|
||||
self.assertFalse(is_acme_error(Error()))
|
||||
self.assertFalse(is_acme_error(self.empty_error))
|
||||
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
|
||||
|
||||
def test_unicode_error(self):
|
||||
from acme.messages import Error, is_acme_error
|
||||
arabic_error = Error.with_code(
|
||||
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
|
||||
from acme.messages import Error, ERROR_PREFIX, is_acme_error
|
||||
arabic_error = Error(
|
||||
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
|
||||
title='title')
|
||||
self.assertTrue(is_acme_error(arabic_error))
|
||||
|
||||
def test_with_code(self):
|
||||
@@ -301,7 +305,8 @@ class ChallengeBodyTest(unittest.TestCase):
|
||||
from acme.messages import Error
|
||||
from acme.messages import STATUS_INVALID
|
||||
self.status = STATUS_INVALID
|
||||
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
|
||||
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
|
||||
detail='Unable to communicate with DNS server')
|
||||
self.challb = ChallengeBody(
|
||||
uri='http://challb', chall=self.chall, status=self.status,
|
||||
error=error)
|
||||
@@ -1,22 +1,28 @@
|
||||
"""Support for standalone client challenge solvers. """
|
||||
import argparse
|
||||
import collections
|
||||
import functools
|
||||
import logging
|
||||
import os
|
||||
import socket
|
||||
import sys
|
||||
import threading
|
||||
|
||||
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
import OpenSSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import crypto_util
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
|
||||
# pylint: disable=no-init
|
||||
# pylint: disable=too-few-public-methods,no-init
|
||||
|
||||
|
||||
class TLSServer(socketserver.TCPServer):
|
||||
@@ -31,7 +37,7 @@ class TLSServer(socketserver.TCPServer):
|
||||
self.certs = kwargs.pop("certs", {})
|
||||
self.method = kwargs.pop(
|
||||
# pylint: disable=protected-access
|
||||
"method", crypto_util._DEFAULT_SSL_METHOD)
|
||||
"method", crypto_util._DEFAULT_TLSSNI01_SSL_METHOD)
|
||||
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
|
||||
socketserver.TCPServer.__init__(self, *args, **kwargs)
|
||||
|
||||
@@ -44,7 +50,7 @@ class TLSServer(socketserver.TCPServer):
|
||||
return socketserver.TCPServer.server_bind(self)
|
||||
|
||||
|
||||
class ACMEServerMixin:
|
||||
class ACMEServerMixin: # pylint: disable=old-style-class
|
||||
"""ACME server common settings mixin."""
|
||||
# TODO: c.f. #858
|
||||
server_version = "ACME client standalone challenge solver"
|
||||
@@ -76,7 +82,7 @@ class BaseDualNetworkedServers(object):
|
||||
kwargs["ipv6"] = ip_version
|
||||
new_address = (server_address[0],) + (port,) + server_address[2:]
|
||||
new_args = (new_address,) + remaining_args
|
||||
server = ServerClass(*new_args, **kwargs)
|
||||
server = ServerClass(*new_args, **kwargs) # pylint: disable=star-args
|
||||
logger.debug(
|
||||
"Successfully bound to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
@@ -84,8 +90,8 @@ class BaseDualNetworkedServers(object):
|
||||
if self.servers:
|
||||
# Already bound using IPv6.
|
||||
logger.debug(
|
||||
"Certbot wasn't able to bind to %s:%s using %s, this "
|
||||
"is often expected due to the dual stack nature of "
|
||||
"Certbot wasn't able to bind to %s:%s using %s, this " +
|
||||
"is often expected due to the dual stack nature of " +
|
||||
"IPv6 socket implementations.",
|
||||
new_address[0], new_address[1],
|
||||
"IPv6" if ip_version else "IPv4")
|
||||
@@ -98,13 +104,14 @@ class BaseDualNetworkedServers(object):
|
||||
# If two servers are set up and port 0 was passed in, ensure we always
|
||||
# bind to the same port for both servers.
|
||||
port = server.socket.getsockname()[1]
|
||||
if not self.servers:
|
||||
if len(self.servers) == 0:
|
||||
raise socket.error("Could not bind to IPv4 or IPv6.")
|
||||
|
||||
def serve_forever(self):
|
||||
"""Wraps socketserver.TCPServer.serve_forever"""
|
||||
for server in self.servers:
|
||||
thread = threading.Thread(
|
||||
# pylint: disable=no-member
|
||||
target=server.serve_forever)
|
||||
thread.start()
|
||||
self.threads.append(thread)
|
||||
@@ -124,6 +131,35 @@ class BaseDualNetworkedServers(object):
|
||||
self.threads = []
|
||||
|
||||
|
||||
class TLSSNI01Server(TLSServer, ACMEServerMixin):
    """TLSSNI01 Server."""

    def __init__(self, server_address, certs, ipv6=False):
        TLSServer.__init__(
            self, server_address, BaseRequestHandlerWithLogging, certs=certs, ipv6=ipv6)


class TLSSNI01DualNetworkedServers(BaseDualNetworkedServers):
    """TLSSNI01Server Wrapper. Tries everything for both. Failures for one don't
    affect the other."""

    def __init__(self, *args, **kwargs):
        BaseDualNetworkedServers.__init__(self, TLSSNI01Server, *args, **kwargs)


class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
    """BaseRequestHandler with logging."""

    def log_message(self, format, *args):  # pylint: disable=redefined-builtin
        """Log arbitrary message."""
        logger.debug("%s - - %s", self.client_address[0], format % args)

    def handle(self):
        """Handle request."""
        self.log_message("Incoming request")
        socketserver.BaseRequestHandler.handle(self)


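For orientation, a hypothetical way to wire the classes above together and probe the served certificate (the PEM file names are assumptions; `TLSSNI01ServerTest` further down does the same thing with test vectors):

```python
import threading

import OpenSSL

from acme import crypto_util
from acme.standalone import TLSSNI01Server

# Assumed PEM files for the 'localhost' name.
with open('cert.pem') as f:
    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, f.read())
with open('key.pem') as f:
    key = OpenSSL.crypto.load_privatekey(OpenSSL.crypto.FILETYPE_PEM, f.read())

server = TLSSNI01Server(('localhost', 0), certs={b'localhost': (key, cert)})
thread = threading.Thread(target=server.serve_forever)
thread.start()
try:
    host, port = server.socket.getsockname()[:2]
    probed = crypto_util.probe_sni(b'localhost', host=host, port=port, timeout=1)
finally:
    server.shutdown()
    thread.join()
```
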
class HTTPServer(BaseHTTPServer.HTTPServer):
|
||||
"""Generic HTTP Server."""
|
||||
|
||||
@@ -226,3 +262,39 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
"""
|
||||
return functools.partial(
|
||||
cls, simple_http_resources=simple_http_resources)
|
||||
|
||||
|
||||
def simple_tls_sni_01_server(cli_args, forever=True):
    """Run simple standalone TLSSNI01 server."""
    logging.basicConfig(level=logging.DEBUG)

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "-p", "--port", default=0, help="Port to serve at. By default "
        "picks random free port.")
    args = parser.parse_args(cli_args[1:])

    certs = {}

    _, hosts, _ = next(os.walk('.'))  # type: ignore # https://github.com/python/mypy/issues/465
    for host in hosts:
        with open(os.path.join(host, "cert.pem")) as cert_file:
            cert_contents = cert_file.read()
        with open(os.path.join(host, "key.pem")) as key_file:
            key_contents = key_file.read()
        certs[host.encode()] = (
            OpenSSL.crypto.load_privatekey(
                OpenSSL.crypto.FILETYPE_PEM, key_contents),
            OpenSSL.crypto.load_certificate(
                OpenSSL.crypto.FILETYPE_PEM, cert_contents))

    server = TLSSNI01Server(('', int(args.port)), certs=certs)
    logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
    if forever:  # pragma: no cover
        server.serve_forever()
    else:
        server.handle_request()


if __name__ == "__main__":
    sys.exit(simple_tls_sni_01_server(sys.argv))  # pragma: no cover

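A hypothetical invocation of the helper above (directory layout and file names are assumptions; it expects one sub-directory per hostname containing `cert.pem` and `key.pem` in the current working directory):

```python
import os
import shutil

from acme.standalone import simple_tls_sni_01_server

# Assumed layout: ./localhost/cert.pem and ./localhost/key.pem
os.makedirs('localhost')
shutil.copy('rsa2048_cert.pem', os.path.join('localhost', 'cert.pem'))
shutil.copy('rsa2048_key.pem', os.path.join('localhost', 'key.pem'))

# First element mimics argv[0]; --port 0 picks a random free port.
simple_tls_sni_01_server(['example', '--port', '0'], forever=False)
```
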
@@ -1,17 +1,23 @@
|
||||
"""Tests for acme.standalone."""
|
||||
import os
|
||||
import shutil
|
||||
import socket
|
||||
import threading
|
||||
import tempfile
|
||||
import unittest
|
||||
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import queue # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
import josepy as jose
|
||||
import mock
|
||||
import requests
|
||||
from six.moves import http_client # pylint: disable=import-error
|
||||
from six.moves import socketserver # type: ignore # pylint: disable=import-error
|
||||
|
||||
from acme import challenges
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
import test_util
|
||||
from acme import crypto_util
|
||||
from acme import test_util
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
|
||||
class TLSServerTest(unittest.TestCase):
|
||||
@@ -22,14 +28,41 @@ class TLSServerTest(unittest.TestCase):
|
||||
from acme.standalone import TLSServer
|
||||
server = TLSServer(
|
||||
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True)
|
||||
server.server_close()
|
||||
server.server_close() # pylint: disable=no-member
|
||||
|
||||
def test_ipv6(self):
|
||||
if socket.has_ipv6:
|
||||
from acme.standalone import TLSServer
|
||||
server = TLSServer(
|
||||
('', 0), socketserver.BaseRequestHandler, bind_and_activate=True, ipv6=True)
|
||||
server.server_close()
|
||||
server.server_close() # pylint: disable=no-member
|
||||
|
||||
|
||||
class TLSSNI01ServerTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSSNI01Server."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSSNI01Server
|
||||
self.server = TLSSNI01Server(('localhost', 0), certs=self.certs)
|
||||
# pylint: disable=no-member
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
def test_it(self):
|
||||
host, port = self.server.socket.getsockname()[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
|
||||
class HTTP01ServerTest(unittest.TestCase):
|
||||
@@ -44,12 +77,13 @@ class HTTP01ServerTest(unittest.TestCase):
|
||||
from acme.standalone import HTTP01Server
|
||||
self.server = HTTP01Server(('', 0), resources=self.resources)
|
||||
|
||||
# pylint: disable=no-member
|
||||
self.port = self.server.socket.getsockname()[1]
|
||||
self.thread = threading.Thread(target=self.server.serve_forever)
|
||||
self.thread.start()
|
||||
|
||||
def tearDown(self):
|
||||
self.server.shutdown()
|
||||
self.server.shutdown() # pylint: disable=no-member
|
||||
self.thread.join()
|
||||
|
||||
def test_index(self):
|
||||
@@ -102,6 +136,7 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
# NB: On Windows, socket.IPPROTO_IPV6 constant may be missing.
|
||||
# We use the corresponding value (41) instead.
|
||||
level = getattr(socket, "IPPROTO_IPV6", 41)
|
||||
# pylint: disable=no-member
|
||||
self.socket.setsockopt(level, socket.IPV6_V6ONLY, 1)
|
||||
try:
|
||||
self.server_bind()
|
||||
@@ -135,6 +170,33 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
prev_port = port
|
||||
|
||||
|
||||
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
|
||||
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
self.certs = {b'localhost': (
|
||||
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
|
||||
test_util.load_cert('rsa2048_cert.pem'),
|
||||
)}
|
||||
from acme.standalone import TLSSNI01DualNetworkedServers
|
||||
self.servers = TLSSNI01DualNetworkedServers(('localhost', 0), certs=self.certs)
|
||||
self.servers.serve_forever()
|
||||
|
||||
def tearDown(self):
|
||||
self.servers.shutdown_and_server_close()
|
||||
|
||||
def test_connect(self):
|
||||
socknames = self.servers.getsocknames()
|
||||
# connect to all addresses
|
||||
for sockname in socknames:
|
||||
host, port = sockname[:2]
|
||||
cert = crypto_util.probe_sni(
|
||||
b'localhost', host=host, port=port, timeout=1)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
jose.ComparableX509(self.certs[b'localhost'][1]))
|
||||
|
||||
|
||||
class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
|
||||
|
||||
@@ -147,6 +209,7 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
from acme.standalone import HTTP01DualNetworkedServers
|
||||
self.servers = HTTP01DualNetworkedServers(('', 0), resources=self.resources)
|
||||
|
||||
# pylint: disable=no-member
|
||||
self.port = self.servers.getsocknames()[0][1]
|
||||
self.servers.serve_forever()
|
||||
|
||||
@@ -185,5 +248,51 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
|
||||
self.assertFalse(self._test_http01(add=False))
|
||||
|
||||
|
||||
@test_util.broken_on_windows
|
||||
class TestSimpleTLSSNI01Server(unittest.TestCase):
|
||||
"""Tests for acme.standalone.simple_tls_sni_01_server."""
|
||||
|
||||
|
||||
def setUp(self):
|
||||
# mirror ../examples/standalone
|
||||
self.test_cwd = tempfile.mkdtemp()
|
||||
localhost_dir = os.path.join(self.test_cwd, 'localhost')
|
||||
os.makedirs(localhost_dir)
|
||||
shutil.copy(test_util.vector_path('rsa2048_cert.pem'),
|
||||
os.path.join(localhost_dir, 'cert.pem'))
|
||||
shutil.copy(test_util.vector_path('rsa2048_key.pem'),
|
||||
os.path.join(localhost_dir, 'key.pem'))
|
||||
|
||||
from acme.standalone import simple_tls_sni_01_server
|
||||
self.thread = threading.Thread(
|
||||
target=simple_tls_sni_01_server, kwargs={
|
||||
'cli_args': ('filename',),
|
||||
'forever': False,
|
||||
},
|
||||
)
|
||||
self.old_cwd = os.getcwd()
|
||||
os.chdir(self.test_cwd)
|
||||
|
||||
def tearDown(self):
|
||||
os.chdir(self.old_cwd)
|
||||
self.thread.join()
|
||||
shutil.rmtree(self.test_cwd)
|
||||
|
||||
@mock.patch('acme.standalone.logger')
|
||||
def test_it(self, mock_logger):
|
||||
# Use a Queue because mock objects aren't thread safe.
|
||||
q = queue.Queue() # type: queue.Queue[int]
|
||||
# Add port number to the queue.
|
||||
mock_logger.info.side_effect = lambda *args: q.put(args[-1])
|
||||
self.thread.start()
|
||||
|
||||
# After the timeout, an exception is raised if the queue is empty.
|
||||
port = q.get(timeout=5)
|
||||
cert = crypto_util.probe_sni(b'localhost', b'0.0.0.0', port)
|
||||
self.assertEqual(jose.ComparableX509(cert),
|
||||
test_util.load_comparable_cert(
|
||||
'rsa2048_cert.pem'))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main() # pragma: no cover
|
||||
@@ -4,12 +4,20 @@
|
||||
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
import pkg_resources
|
||||
import unittest
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
import pkg_resources
|
||||
|
||||
|
||||
def vector_path(*names):
|
||||
"""Path to a test vector."""
|
||||
return pkg_resources.resource_filename(
|
||||
__name__, os.path.join('testdata', *names))
|
||||
|
||||
|
||||
def load_vector(*names):
|
||||
@@ -25,7 +33,8 @@ def _guess_loader(filename, loader_pem, loader_der):
|
||||
return loader_pem
|
||||
elif ext.lower() == '.der':
|
||||
return loader_der
|
||||
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
|
||||
else: # pragma: no cover
|
||||
raise ValueError("Loader could not be recognized based on extension")
|
||||
|
||||
|
||||
def load_cert(*names):
|
||||
@@ -65,3 +74,32 @@ def load_pyopenssl_private_key(*names):
|
||||
loader = _guess_loader(
|
||||
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
|
||||
return crypto.load_privatekey(loader, load_vector(*names))
|
||||
|
||||
|
||||
def skip_unless(condition, reason): # pragma: no cover
|
||||
"""Skip tests unless a condition holds.
|
||||
|
||||
This implements the basic functionality of unittest.skipUnless
|
||||
which is only available on Python 2.7+.
|
||||
|
||||
:param bool condition: If ``False``, the test will be skipped
|
||||
:param str reason: the reason for skipping the test
|
||||
|
||||
:rtype: callable
|
||||
:returns: decorator that hides tests unless condition is ``True``
|
||||
|
||||
"""
|
||||
if hasattr(unittest, "skipUnless"):
|
||||
return unittest.skipUnless(condition, reason)
|
||||
elif condition:
|
||||
return lambda cls: cls
|
||||
else:
|
||||
return lambda cls: None
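```python
# Illustrative usage of the skip_unless helper above (not from this changeset);
# the condition and test class are made up for the example.
import sys
import unittest


@skip_unless(sys.version_info >= (3, 5), "requires Python 3.5 or newer")
class SkipUnlessExampleTest(unittest.TestCase):
    def test_placeholder(self):
        self.assertTrue(True)
```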
|
||||
|
||||
def broken_on_windows(function):
|
||||
"""Decorator to skip temporarily a broken test on Windows."""
|
||||
reason = 'Test is broken and ignored on windows but should be fixed.'
|
||||
return unittest.skipIf(
|
||||
sys.platform == 'win32'
|
||||
and os.environ.get('SKIP_BROKEN_TESTS_ON_WINDOWS', 'true') == 'true',
|
||||
reason)(function)
|
||||
@@ -12,9 +12,10 @@
|
||||
# All configuration values have a default; values that are commented out
|
||||
# serve to show the default.
|
||||
|
||||
import sys
|
||||
import os
|
||||
import shlex
|
||||
import sys
|
||||
|
||||
|
||||
here = os.path.abspath(os.path.dirname(__file__))
|
||||
|
||||
|
||||
@@ -1,241 +0,0 @@
|
||||
"""Example ACME-V2 API for HTTP-01 challenge.
|
||||
|
||||
Brief:
|
||||
|
||||
This is a complete usage example of the python-acme API.
|
||||
|
||||
Limitations of this example:
|
||||
- Works for only one Domain name
|
||||
- Performs only HTTP-01 challenge
|
||||
- Uses ACME-v2
|
||||
|
||||
Workflow:
|
||||
(Account creation)
|
||||
- Create account key
|
||||
- Register account and accept TOS
|
||||
(Certificate actions)
|
||||
- Select HTTP-01 within offered challenges by the CA server
|
||||
- Set up http challenge resource
|
||||
- Set up standalone web server
|
||||
- Create domain private key and CSR
|
||||
- Issue certificate
|
||||
- Renew certificate
|
||||
- Revoke certificate
|
||||
(Account update actions)
|
||||
- Change contact information
|
||||
- Deactivate Account
|
||||
"""
|
||||
from contextlib import contextmanager
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
|
||||
from acme import challenges
|
||||
from acme import client
|
||||
from acme import crypto_util
|
||||
from acme import errors
|
||||
from acme import messages
|
||||
from acme import standalone
|
||||
|
||||
# Constants:
|
||||
|
||||
# This is the staging point for ACME-V2 within Let's Encrypt.
|
||||
DIRECTORY_URL = 'https://acme-staging-v02.api.letsencrypt.org/directory'
|
||||
|
||||
USER_AGENT = 'python-acme-example'
|
||||
|
||||
# Account key size
|
||||
ACC_KEY_BITS = 2048
|
||||
|
||||
# Certificate private key size
|
||||
CERT_PKEY_BITS = 2048
|
||||
|
||||
# Domain name for the certificate.
|
||||
DOMAIN = 'client.example.com'
|
||||
|
||||
# If you are running Boulder locally, it is possible to configure any port
|
||||
# number to execute the challenge, but real CA servers will always use port
|
||||
# 80, as described in the ACME specification.
|
||||
PORT = 80
|
||||
|
||||
|
||||
# Useful methods and classes:
|
||||
|
||||
|
||||
def new_csr_comp(domain_name, pkey_pem=None):
|
||||
"""Create certificate signing request."""
|
||||
if pkey_pem is None:
|
||||
# Create private key.
|
||||
pkey = OpenSSL.crypto.PKey()
|
||||
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, CERT_PKEY_BITS)
|
||||
pkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM,
|
||||
pkey)
|
||||
csr_pem = crypto_util.make_csr(pkey_pem, [domain_name])
|
||||
return pkey_pem, csr_pem
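```python
# A short sketch (not from this changeset) of how new_csr_comp is used later in
# this example: the first call generates a fresh key and CSR, and passing the
# PEM key back in reuses it for renewal. The domain is the example's DOMAIN value.
pkey_pem, csr_pem = new_csr_comp('client.example.com')

# Renewal: reuse the existing private key; only the CSR is regenerated.
_, renewal_csr_pem = new_csr_comp('client.example.com', pkey_pem)
```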
|
||||
|
||||
|
||||
def select_http01_chall(orderr):
|
||||
"""Extract authorization resource from within order resource."""
|
||||
# Authorization Resource: authz.
|
||||
# This object holds the offered challenges by the server and their status.
|
||||
authz_list = orderr.authorizations
|
||||
|
||||
for authz in authz_list:
|
||||
# Choosing challenge.
|
||||
# authz.body.challenges is a set of ChallengeBody objects.
|
||||
for i in authz.body.challenges:
|
||||
# Find the supported challenge.
|
||||
if isinstance(i.chall, challenges.HTTP01):
|
||||
return i
|
||||
|
||||
raise Exception('HTTP-01 challenge was not offered by the CA server.')
|
||||
|
||||
|
||||
@contextmanager
|
||||
def challenge_server(http_01_resources):
|
||||
"""Manage standalone server set up and shutdown."""
|
||||
|
||||
# Setting up a fake server that binds at PORT and any address.
|
||||
address = ('', PORT)
|
||||
try:
|
||||
servers = standalone.HTTP01DualNetworkedServers(address,
|
||||
http_01_resources)
|
||||
# Start client standalone web server.
|
||||
servers.serve_forever()
|
||||
yield servers
|
||||
finally:
|
||||
# Shutdown client web server and unbind from PORT
|
||||
servers.shutdown_and_server_close()
|
||||
|
||||
|
||||
def perform_http01(client_acme, challb, orderr):
|
||||
"""Set up standalone webserver and perform HTTP-01 challenge."""
|
||||
|
||||
response, validation = challb.response_and_validation(client_acme.net.key)
|
||||
|
||||
resource = standalone.HTTP01RequestHandler.HTTP01Resource(
|
||||
chall=challb.chall, response=response, validation=validation)
|
||||
|
||||
with challenge_server({resource}):
|
||||
# Let the CA server know that we are ready for the challenge.
|
||||
client_acme.answer_challenge(challb, response)
|
||||
|
||||
# Wait for challenge status and then issue a certificate.
|
||||
# It is possible to set a deadline time.
|
||||
finalized_orderr = client_acme.poll_and_finalize(orderr)
|
||||
|
||||
return finalized_orderr.fullchain_pem
|
||||
|
||||
|
||||
# Main examples:
|
||||
|
||||
|
||||
def example_http():
|
||||
"""This example executes the whole process of fulfilling a HTTP-01
|
||||
challenge for one specific domain.
|
||||
|
||||
The workflow consists of:
|
||||
(Account creation)
|
||||
- Create account key
|
||||
- Register account and accept TOS
|
||||
(Certificate actions)
|
||||
- Select HTTP-01 within offered challenges by the CA server
|
||||
- Set up http challenge resource
|
||||
- Set up standalone web server
|
||||
- Create domain private key and CSR
|
||||
- Issue certificate
|
||||
- Renew certificate
|
||||
- Revoke certificate
|
||||
(Account update actions)
|
||||
- Change contact information
|
||||
- Deactivate Account
|
||||
|
||||
"""
|
||||
# Create account key
|
||||
|
||||
acc_key = jose.JWKRSA(
|
||||
key=rsa.generate_private_key(public_exponent=65537,
|
||||
key_size=ACC_KEY_BITS,
|
||||
backend=default_backend()))
|
||||
|
||||
# Register account and accept TOS
|
||||
|
||||
net = client.ClientNetwork(acc_key, user_agent=USER_AGENT)
|
||||
directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
|
||||
client_acme = client.ClientV2(directory, net=net)
|
||||
|
||||
# Terms of Service URL is in client_acme.directory.meta.terms_of_service
|
||||
# Registration Resource: regr
|
||||
# Creates account with contact information.
|
||||
email = ('fake@example.com')
|
||||
regr = client_acme.new_account(
|
||||
messages.NewRegistration.from_data(
|
||||
email=email, terms_of_service_agreed=True))
|
||||
|
||||
# Create domain private key and CSR
|
||||
pkey_pem, csr_pem = new_csr_comp(DOMAIN)
|
||||
|
||||
# Issue certificate
|
||||
|
||||
orderr = client_acme.new_order(csr_pem)
|
||||
|
||||
# Select HTTP-01 within offered challenges by the CA server
|
||||
challb = select_http01_chall(orderr)
|
||||
|
||||
# The certificate is ready to be used in the variable "fullchain_pem".
|
||||
fullchain_pem = perform_http01(client_acme, challb, orderr)
|
||||
|
||||
# Renew certificate
|
||||
|
||||
_, csr_pem = new_csr_comp(DOMAIN, pkey_pem)
|
||||
|
||||
orderr = client_acme.new_order(csr_pem)
|
||||
|
||||
challb = select_http01_chall(orderr)
|
||||
|
||||
# Performing challenge
|
||||
fullchain_pem = perform_http01(client_acme, challb, orderr)
|
||||
|
||||
# Revoke certificate
|
||||
|
||||
fullchain_com = jose.ComparableX509(
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, fullchain_pem))
|
||||
|
||||
try:
|
||||
client_acme.revoke(fullchain_com, 0) # revocation reason = 0
|
||||
except errors.ConflictError:
|
||||
# Certificate already revoked.
|
||||
pass
|
||||
|
||||
# Query registration status.
|
||||
client_acme.net.account = regr
|
||||
try:
|
||||
regr = client_acme.query_registration(regr)
|
||||
except errors.Error as err:
|
||||
if err.typ == messages.OLD_ERROR_PREFIX + 'unauthorized' \
|
||||
or err.typ == messages.ERROR_PREFIX + 'unauthorized':
|
||||
# Status is deactivated.
|
||||
pass
|
||||
raise
|
||||
|
||||
# Change contact information
|
||||
|
||||
email = 'newfake@example.com'
|
||||
regr = client_acme.update_registration(
|
||||
regr.update(
|
||||
body=regr.body.update(
|
||||
contact=('mailto:' + email,)
|
||||
)
|
||||
)
|
||||
)
|
||||
|
||||
# Deactivate account/registration
|
||||
|
||||
regr = client_acme.deactivate_registration(regr)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
example_http()
|
||||
@@ -1,10 +1,10 @@
|
||||
# readthedocs.org gives no way to change the install command to "pip
|
||||
# install -e acme[docs]" (that would in turn install documentation
|
||||
# install -e .[docs]" (that would in turn install documentation
|
||||
# dependencies), but it allows to specify a requirements.txt file at
|
||||
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
|
||||
|
||||
# Although ReadTheDocs certainly doesn't need to install the project
|
||||
# in --editable mode (-e), just "pip install acme[docs]" does not work as
|
||||
# expected and "pip install -e acme[docs]" must be used instead
|
||||
# in --editable mode (-e), just "pip install .[docs]" does not work as
|
||||
# expected and "pip install -e .[docs]" must be used instead
|
||||
|
||||
-e acme[docs]
|
||||
|
||||
@@ -1,10 +1,9 @@
|
||||
from setuptools import setup
|
||||
from setuptools import find_packages
|
||||
from setuptools.command.test import test as TestCommand
|
||||
import sys
|
||||
|
||||
from setuptools import find_packages
|
||||
from setuptools import setup
|
||||
from setuptools.command.test import test as TestCommand
|
||||
|
||||
version = '1.1.0.dev0'
|
||||
version = '0.32.0.dev0'
|
||||
|
||||
# Please update tox.ini when modifying dependency version requirements
|
||||
install_requires = [
|
||||
@@ -15,8 +14,8 @@ install_requires = [
|
||||
# 1.1.0+ is required to avoid the warnings described at
|
||||
# https://github.com/certbot/josepy/issues/13.
|
||||
'josepy>=1.1.0',
|
||||
'mock',
|
||||
# Connection.set_tlsext_host_name (>=0.13)
|
||||
'mock',
|
||||
'PyOpenSSL>=0.13.1',
|
||||
'pyrfc3339',
|
||||
'pytz',
|
||||
@@ -37,7 +36,6 @@ docs_extras = [
|
||||
'sphinx_rtd_theme',
|
||||
]
|
||||
|
||||
|
||||
class PyTest(TestCommand):
|
||||
user_options = []
|
||||
|
||||
@@ -52,7 +50,6 @@ class PyTest(TestCommand):
|
||||
errno = pytest.main(shlex.split(self.pytest_args))
|
||||
sys.exit(errno)
|
||||
|
||||
|
||||
setup(
|
||||
name='acme',
|
||||
version=version,
|
||||
@@ -74,7 +71,6 @@ setup(
|
||||
'Programming Language :: Python :: 3.5',
|
||||
'Programming Language :: Python :: 3.6',
|
||||
'Programming Language :: Python :: 3.7',
|
||||
'Programming Language :: Python :: 3.8',
|
||||
'Topic :: Internet :: WWW/HTTP',
|
||||
'Topic :: Security',
|
||||
],
|
||||
@@ -86,7 +82,7 @@ setup(
|
||||
'dev': dev_extras,
|
||||
'docs': docs_extras,
|
||||
},
|
||||
test_suite='acme',
|
||||
tests_require=["pytest"],
|
||||
test_suite='acme',
|
||||
cmdclass={"test": PyTest},
|
||||
)
|
||||
|
||||
32
appveyor.yml
Normal file
@@ -0,0 +1,32 @@
|
||||
image: Visual Studio 2015
|
||||
|
||||
environment:
|
||||
matrix:
|
||||
- TOXENV: py35
|
||||
- TOXENV: py37-cover
|
||||
|
||||
branches:
|
||||
only:
|
||||
- master
|
||||
- /^\d+\.\d+\.x$/ # Version branches like X.X.X
|
||||
- /^test-.*$/
|
||||
|
||||
install:
|
||||
# Use Python 3.7 by default
|
||||
- "SET PATH=C:\\Python37;C:\\Python37\\Scripts;%PATH%"
|
||||
# Check env
|
||||
- "python --version"
|
||||
# Upgrade pip to avoid warnings
|
||||
- "python -m pip install --upgrade pip"
|
||||
# Ready to install tox and coverage
|
||||
- "pip install tox codecov"
|
||||
|
||||
build: off
|
||||
|
||||
test_script:
|
||||
- set TOX_TESTENV_PASSENV=APPVEYOR
|
||||
# Test env is set by TOXENV env variable
|
||||
- tox
|
||||
|
||||
on_success:
|
||||
- if exist .coverage codecov
|
||||
@@ -1,8 +1,7 @@
|
||||
include LICENSE.txt
|
||||
include README.rst
|
||||
recursive-include tests *
|
||||
include certbot_apache/_internal/centos-options-ssl-apache.conf
|
||||
include certbot_apache/_internal/options-ssl-apache.conf
|
||||
recursive-include certbot_apache/_internal/augeas_lens *.aug
|
||||
global-exclude __pycache__
|
||||
global-exclude *.py[cod]
|
||||
recursive-include docs *
|
||||
recursive-include certbot_apache/tests/testdata *
|
||||
include certbot_apache/centos-options-ssl-apache.conf
|
||||
include certbot_apache/options-ssl-apache.conf
|
||||
recursive-include certbot_apache/augeas_lens *.aug
|
||||
|
||||
@@ -1 +0,0 @@
|
||||
"""Certbot Apache plugin."""
|
||||
@@ -1,215 +0,0 @@
|
||||
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
|
||||
import logging
|
||||
|
||||
import pkg_resources
|
||||
import zope.interface
|
||||
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
from certbot.compat import os
|
||||
from certbot.errors import MisconfigurationError
|
||||
from certbot_apache._internal import apache_util
|
||||
from certbot_apache._internal import configurator
|
||||
from certbot_apache._internal import parser
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class CentOSConfigurator(configurator.ApacheConfigurator):
|
||||
"""CentOS specific ApacheConfigurator override class"""
|
||||
|
||||
OS_DEFAULTS = dict(
|
||||
server_root="/etc/httpd",
|
||||
vhost_root="/etc/httpd/conf.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/httpd",
|
||||
ctl="apachectl",
|
||||
version_cmd=['apachectl', '-v'],
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
restart_cmd_alt=['apachectl', 'restart'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/httpd/conf.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", os.path.join("_internal", "centos-options-ssl-apache.conf"))
|
||||
)
|
||||
|
||||
def config_test(self):
|
||||
"""
|
||||
Override config_test to mitigate configtest error in vanilla installation
|
||||
of mod_ssl in Fedora. The error is caused by non-existent self-signed
|
||||
certificates referenced by the configuration, that would be autogenerated
|
||||
during the first (re)start of httpd.
|
||||
"""
|
||||
|
||||
os_info = util.get_os_info()
|
||||
fedora = os_info[0].lower() == "fedora"
|
||||
|
||||
try:
|
||||
super(CentOSConfigurator, self).config_test()
|
||||
except errors.MisconfigurationError:
|
||||
if fedora:
|
||||
self._try_restart_fedora()
|
||||
else:
|
||||
raise
|
||||
|
||||
def _try_restart_fedora(self):
|
||||
"""
|
||||
Tries to restart httpd using systemctl to generate the self signed keypair.
|
||||
"""
|
||||
|
||||
try:
|
||||
util.run_script(['systemctl', 'restart', 'httpd'])
|
||||
except errors.SubprocessError as err:
|
||||
raise errors.MisconfigurationError(str(err))
|
||||
|
||||
# Finish with actual config check to see if systemctl restart helped
|
||||
super(CentOSConfigurator, self).config_test()
|
||||
|
||||
def _prepare_options(self):
|
||||
"""
|
||||
Override the options dictionary initialization in order to support
|
||||
alternative restart cmd used in CentOS.
|
||||
"""
|
||||
super(CentOSConfigurator, self)._prepare_options()
|
||||
self.options["restart_cmd_alt"][0] = self.option("ctl")
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return CentOSParser(
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _deploy_cert(self, *args, **kwargs): # pylint: disable=arguments-differ
|
||||
"""
|
||||
Override _deploy_cert in order to ensure that the Apache configuration
|
||||
has "LoadModule ssl_module..." before parsing the VirtualHost configuration
|
||||
that was created by Certbot
|
||||
"""
|
||||
super(CentOSConfigurator, self)._deploy_cert(*args, **kwargs)
|
||||
if self.version < (2, 4, 0):
|
||||
self._deploy_loadmodule_ssl_if_needed()
|
||||
|
||||
def _deploy_loadmodule_ssl_if_needed(self):
|
||||
"""
|
||||
Add "LoadModule ssl_module <pre-existing path>" to main httpd.conf if
|
||||
it doesn't exist there already.
|
||||
"""
|
||||
|
||||
loadmods = self.parser.find_dir("LoadModule", "ssl_module", exclude=False)
|
||||
|
||||
correct_ifmods = [] # type: List[str]
|
||||
loadmod_args = [] # type: List[str]
|
||||
loadmod_paths = [] # type: List[str]
|
||||
for m in loadmods:
|
||||
noarg_path = m.rpartition("/")[0]
|
||||
path_args = self.parser.get_all_args(noarg_path)
|
||||
if loadmod_args:
|
||||
if loadmod_args != path_args:
|
||||
msg = ("Certbot encountered multiple LoadModule directives "
|
||||
"for LoadModule ssl_module with differing library paths. "
|
||||
"Please remove or comment out the one(s) that are not in "
|
||||
"use, and run Certbot again.")
|
||||
raise MisconfigurationError(msg)
|
||||
else:
|
||||
loadmod_args = path_args
|
||||
|
||||
if self.parser.not_modssl_ifmodule(noarg_path): # pylint: disable=no-member
|
||||
if self.parser.loc["default"] in noarg_path:
|
||||
# LoadModule already in the main configuration file
|
||||
if ("ifmodule/" in noarg_path.lower() or
|
||||
"ifmodule[1]" in noarg_path.lower()):
|
||||
# It's the first or only IfModule in the file
|
||||
return
|
||||
# Populate the list of known !mod_ssl.c IfModules
|
||||
nodir_path = noarg_path.rpartition("/directive")[0]
|
||||
correct_ifmods.append(nodir_path)
|
||||
else:
|
||||
loadmod_paths.append(noarg_path)
|
||||
|
||||
if not loadmod_args:
|
||||
# Do not try to enable mod_ssl
|
||||
return
|
||||
|
||||
# Force creation as the directive wasn't found from the beginning of
|
||||
# httpd.conf
|
||||
rootconf_ifmod = self.parser.create_ifmod(
|
||||
parser.get_aug_path(self.parser.loc["default"]),
|
||||
"!mod_ssl.c", beginning=True)
|
||||
# parser.get_ifmod returns a path postfixed with "/", remove that
|
||||
self.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", loadmod_args)
|
||||
correct_ifmods.append(rootconf_ifmod[:-1])
|
||||
self.save_notes += "Added LoadModule ssl_module to main configuration.\n"
|
||||
|
||||
# Wrap LoadModule mod_ssl inside of <IfModule !mod_ssl.c> if it's not
|
||||
# configured like this already.
|
||||
for loadmod_path in loadmod_paths:
|
||||
nodir_path = loadmod_path.split("/directive")[0]
|
||||
# Remove the old LoadModule directive
|
||||
self.parser.aug.remove(loadmod_path)
|
||||
|
||||
# Create a new IfModule !mod_ssl.c if not already found on path
|
||||
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c",
|
||||
beginning=True)[:-1]
|
||||
if ssl_ifmod not in correct_ifmods:
|
||||
self.parser.add_dir(ssl_ifmod, "LoadModule", loadmod_args)
|
||||
correct_ifmods.append(ssl_ifmod)
|
||||
self.save_notes += ("Wrapped pre-existing LoadModule ssl_module "
|
||||
"inside of <IfModule !mod_ssl> block.\n")
|
||||
|
||||
|
||||
class CentOSParser(parser.ApacheParser):
|
||||
"""CentOS specific ApacheParser override class"""
|
||||
def __init__(self, *args, **kwargs):
|
||||
# CentOS specific configuration file for Apache
|
||||
self.sysconfig_filep = "/etc/sysconfig/httpd"
|
||||
super(CentOSParser, self).__init__(*args, **kwargs)
|
||||
|
||||
def update_runtime_variables(self):
|
||||
""" Override for update_runtime_variables for custom parsing """
|
||||
# Opportunistic, works if SELinux not enforced
|
||||
super(CentOSParser, self).update_runtime_variables()
|
||||
self.parse_sysconfig_var()
|
||||
|
||||
def parse_sysconfig_var(self):
|
||||
""" Parses Apache CLI options from CentOS configuration file """
|
||||
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
|
||||
for k in defines:
|
||||
self.variables[k] = defines[k]
|
||||
|
||||
def not_modssl_ifmodule(self, path):
|
||||
"""Checks if the provided Augeas path has argument !mod_ssl"""
|
||||
|
||||
if "ifmodule" not in path.lower():
|
||||
return False
|
||||
|
||||
# Trim the path to the last ifmodule
|
||||
workpath = path.lower()
|
||||
while workpath:
|
||||
# Get path to the last IfModule (ignore the tail)
|
||||
parts = workpath.rpartition("ifmodule")
|
||||
|
||||
if not parts[0]:
|
||||
# IfModule not found
|
||||
break
|
||||
ifmod_path = parts[0] + parts[1]
|
||||
# Check if ifmodule had an index
|
||||
if parts[2].startswith("["):
|
||||
# Append the index from tail
|
||||
ifmod_path += parts[2].partition("/")[0]
|
||||
# Get the original path trimmed to correct length
|
||||
# This is required to preserve cases
|
||||
ifmod_real_path = path[0:len(ifmod_path)]
|
||||
if "!mod_ssl.c" in self.get_all_args(ifmod_real_path):
|
||||
return True
|
||||
# Set the workpath to the heading part
|
||||
workpath = parts[0]
|
||||
|
||||
return False
|
||||
@@ -1,98 +0,0 @@
|
||||
""" Distribution specific override class for Fedora 29+ """
|
||||
import pkg_resources
|
||||
import zope.interface
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
from certbot.compat import os
|
||||
from certbot_apache._internal import apache_util
|
||||
from certbot_apache._internal import configurator
|
||||
from certbot_apache._internal import parser
|
||||
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class FedoraConfigurator(configurator.ApacheConfigurator):
|
||||
"""Fedora 29+ specific ApacheConfigurator override class"""
|
||||
|
||||
OS_DEFAULTS = dict(
|
||||
server_root="/etc/httpd",
|
||||
vhost_root="/etc/httpd/conf.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/httpd",
|
||||
ctl="httpd",
|
||||
version_cmd=['httpd', '-v'],
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
restart_cmd_alt=['apachectl', 'restart'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
enmod=None,
|
||||
dismod=None,
|
||||
le_vhost_ext="-le-ssl.conf",
|
||||
handle_modules=False,
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/httpd/conf.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
# TODO: eventually newest version of Fedora will need their own config
|
||||
"certbot_apache", os.path.join("_internal", "centos-options-ssl-apache.conf"))
|
||||
)
|
||||
|
||||
def config_test(self):
|
||||
"""
|
||||
Override config_test to mitigate configtest error in vanilla installation
|
||||
of mod_ssl in Fedora. The error is caused by non-existent self-signed
|
||||
certificates referenced by the configuration, that would be autogenerated
|
||||
during the first (re)start of httpd.
|
||||
"""
|
||||
try:
|
||||
super(FedoraConfigurator, self).config_test()
|
||||
except errors.MisconfigurationError:
|
||||
self._try_restart_fedora()
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return FedoraParser(
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _try_restart_fedora(self):
|
||||
"""
|
||||
Tries to restart httpd using systemctl to generate the self signed keypair.
|
||||
"""
|
||||
try:
|
||||
util.run_script(['systemctl', 'restart', 'httpd'])
|
||||
except errors.SubprocessError as err:
|
||||
raise errors.MisconfigurationError(str(err))
|
||||
|
||||
# Finish with actual config check to see if systemctl restart helped
|
||||
super(FedoraConfigurator, self).config_test()
|
||||
|
||||
def _prepare_options(self):
|
||||
"""
|
||||
Override the options dictionary initialization to keep using apachectl
|
||||
instead of httpd, and so take advantage of this new bash script in newer versions
|
||||
of Fedora to restart httpd.
|
||||
"""
|
||||
super(FedoraConfigurator, self)._prepare_options()
|
||||
self.options["restart_cmd"][0] = 'apachectl'
|
||||
self.options["restart_cmd_alt"][0] = 'apachectl'
|
||||
self.options["conftest_cmd"][0] = 'apachectl'
|
||||
|
||||
|
||||
class FedoraParser(parser.ApacheParser):
|
||||
"""Fedora 29+ specific ApacheParser override class"""
|
||||
def __init__(self, *args, **kwargs):
|
||||
# Fedora 29+ specific configuration file for Apache
|
||||
self.sysconfig_filep = "/etc/sysconfig/httpd"
|
||||
super(FedoraParser, self).__init__(*args, **kwargs)
|
||||
|
||||
def update_runtime_variables(self):
|
||||
""" Override for update_runtime_variables for custom parsing """
|
||||
# Opportunistic, works if SELinux not enforced
|
||||
super(FedoraParser, self).update_runtime_variables()
|
||||
self._parse_sysconfig_var()
|
||||
|
||||
def _parse_sysconfig_var(self):
|
||||
""" Parses Apache CLI options from Fedora configuration file """
|
||||
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
|
||||
for k in defines:
|
||||
self.variables[k] = defines[k]
|
||||
@@ -1,9 +1,8 @@
|
||||
""" Utility functions for certbot-apache plugin """
|
||||
import binascii
|
||||
import os
|
||||
|
||||
from certbot import util
|
||||
from certbot.compat import os
|
||||
|
||||
|
||||
def get_mod_deps(mod_name):
|
||||
"""Get known module dependencies.
|
||||
207
certbot-apache/certbot_apache/augeas_configurator.py
Normal file
@@ -0,0 +1,207 @@
|
||||
"""Class of Augeas Configurators."""
|
||||
import logging
|
||||
|
||||
|
||||
from certbot import errors
|
||||
from certbot.plugins import common
|
||||
|
||||
from certbot_apache import constants
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class AugeasConfigurator(common.Installer):
|
||||
"""Base Augeas Configurator class.
|
||||
|
||||
:ivar config: Configuration.
|
||||
:type config: :class:`~certbot.interfaces.IConfig`
|
||||
|
||||
:ivar aug: Augeas object
|
||||
:type aug: :class:`augeas.Augeas`
|
||||
|
||||
:ivar str save_notes: Human-readable configuration change notes
|
||||
:ivar reverter: saves and reverts checkpoints
|
||||
:type reverter: :class:`certbot.reverter.Reverter`
|
||||
|
||||
"""
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(AugeasConfigurator, self).__init__(*args, **kwargs)
|
||||
|
||||
# Placeholder for augeas
|
||||
self.aug = None
|
||||
|
||||
self.save_notes = ""
|
||||
|
||||
|
||||
def init_augeas(self):
|
||||
""" Initialize the actual Augeas instance """
|
||||
import augeas
|
||||
self.aug = augeas.Augeas(
|
||||
# specify a directory to load our preferred lens from
|
||||
loadpath=constants.AUGEAS_LENS_DIR,
|
||||
# Do not save backup (we do it ourselves), do not load
|
||||
# anything by default
|
||||
flags=(augeas.Augeas.NONE |
|
||||
augeas.Augeas.NO_MODL_AUTOLOAD |
|
||||
augeas.Augeas.ENABLE_SPAN))
|
||||
# See if any temporary changes need to be recovered
|
||||
# This needs to occur before VirtualHost objects are setup...
|
||||
# because this will change the underlying configuration and potential
|
||||
# vhosts
|
||||
self.recovery_routine()
|
||||
|
||||
def check_parsing_errors(self, lens):
|
||||
"""Verify Augeas can parse all of the lens files.
|
||||
|
||||
:param str lens: lens to check for errors
|
||||
|
||||
:raises .errors.PluginError: If there has been an error in parsing with
|
||||
the specified lens.
|
||||
|
||||
"""
|
||||
error_files = self.aug.match("/augeas//error")
|
||||
|
||||
for path in error_files:
|
||||
# Check to see if it was an error resulting from the use of
|
||||
# the httpd lens
|
||||
lens_path = self.aug.get(path + "/lens")
|
||||
# As aug.get may return null
|
||||
if lens_path and lens in lens_path:
|
||||
msg = (
|
||||
"There has been an error in parsing the file {0} on line {1}: "
|
||||
"{2}".format(
|
||||
# Strip off /augeas/files and /error
|
||||
path[13:len(path) - 6],
|
||||
self.aug.get(path + "/line"),
|
||||
self.aug.get(path + "/message")))
|
||||
raise errors.PluginError(msg)
|
||||
|
||||
def ensure_augeas_state(self):
|
||||
"""Makes sure that all Augeas dom changes are written to files to avoid
|
||||
loss of configuration directives when doing additional augeas parsing,
|
||||
causing a possible augeas.load() resulting dom reset
|
||||
"""
|
||||
|
||||
if self.unsaved_files():
|
||||
self.save_notes += "(autosave)"
|
||||
self.save()
|
||||
|
||||
def unsaved_files(self):
|
||||
"""Lists files that have modified Augeas DOM but the changes have not
|
||||
been written to the filesystem yet, used by `self.save()` and
|
||||
ApacheConfigurator to check the file state.
|
||||
|
||||
:raises .errors.PluginError: If there was an error in Augeas, in
|
||||
an attempt to save the configuration, or an error creating a
|
||||
checkpoint
|
||||
|
||||
:returns: `set` of unsaved files
|
||||
"""
|
||||
save_state = self.aug.get("/augeas/save")
|
||||
self.aug.set("/augeas/save", "noop")
|
||||
# Existing Errors
|
||||
ex_errs = self.aug.match("/augeas//error")
|
||||
try:
|
||||
# This is a noop save
|
||||
self.aug.save()
|
||||
except (RuntimeError, IOError):
|
||||
self._log_save_errors(ex_errs)
|
||||
# Erase Save Notes
|
||||
self.save_notes = ""
|
||||
raise errors.PluginError(
|
||||
"Error saving files, check logs for more info.")
|
||||
|
||||
# Return the original save method
|
||||
self.aug.set("/augeas/save", save_state)
|
||||
|
||||
# Retrieve list of modified files
|
||||
# Note: Noop saves can cause the file to be listed twice, I used a
|
||||
# set to remove this possibility. This is a known augeas 0.10 error.
|
||||
save_paths = self.aug.match("/augeas/events/saved")
|
||||
|
||||
save_files = set()
|
||||
if save_paths:
|
||||
for path in save_paths:
|
||||
save_files.add(self.aug.get(path)[6:])
|
||||
return save_files
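```python
# A standalone sketch (not from this changeset) of the Augeas "noop save" trick
# used above. It assumes python-augeas and libaugeas are installed; restoring
# "overwrite" afterwards relies on that being Augeas' default save mode rather
# than on anything stated in this diff.
import augeas

aug = augeas.Augeas()

aug.set("/augeas/save", "noop")        # dry run: record what would be written
aug.save()                             # populates /augeas/events/saved
modified = {aug.get(p)[len("/files"):]  # strip the "/files" prefix, as above
            for p in aug.match("/augeas/events/saved")}
aug.set("/augeas/save", "overwrite")   # restore the usual save mode

print(modified)                        # files with unsaved in-memory changes
```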
|
||||
|
||||
def save(self, title=None, temporary=False):
|
||||
"""Saves all changes to the configuration files.
|
||||
|
||||
This function first checks for save errors; if none are found, all
configuration changes are saved according to the function parameters.
If an exception is raised, a new checkpoint is not created.
|
||||
|
||||
:param str title: The title of the save. If a title is given, the
|
||||
configuration will be saved as a new checkpoint and put in a
|
||||
timestamped directory.
|
||||
|
||||
:param bool temporary: Indicates whether the changes made will
|
||||
be quickly reversed in the future (ie. challenges)
|
||||
|
||||
"""
|
||||
save_files = self.unsaved_files()
|
||||
if save_files:
|
||||
self.add_to_checkpoint(save_files,
|
||||
self.save_notes, temporary=temporary)
|
||||
|
||||
self.save_notes = ""
|
||||
self.aug.save()
|
||||
|
||||
# Force reload if files were modified
|
||||
# This is needed to recalculate augeas directive span
|
||||
if save_files:
|
||||
for sf in save_files:
|
||||
self.aug.remove("/files/"+sf)
|
||||
self.aug.load()
|
||||
if title and not temporary:
|
||||
self.finalize_checkpoint(title)
|
||||
|
||||
def _log_save_errors(self, ex_errs):
|
||||
"""Log errors due to bad Augeas save.
|
||||
|
||||
:param list ex_errs: Existing errors before save
|
||||
|
||||
"""
|
||||
# Check for the root of save problems
|
||||
new_errs = self.aug.match("/augeas//error")
|
||||
# logger.error("During Save - %s", mod_conf)
|
||||
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
|
||||
", ".join(err[13:len(err) - 6] for err in new_errs
|
||||
# Only new errors caused by recent save
|
||||
if err not in ex_errs), self.save_notes)
|
||||
|
||||
# Wrapper functions for Reverter class
|
||||
def recovery_routine(self):
|
||||
"""Revert all previously modified files.
|
||||
|
||||
Reverts all modified files that have not been saved as a checkpoint
|
||||
|
||||
:raises .errors.PluginError: If unable to recover the configuration
|
||||
|
||||
"""
|
||||
super(AugeasConfigurator, self).recovery_routine()
|
||||
# Need to reload configuration after these changes take effect
|
||||
self.aug.load()
|
||||
|
||||
def revert_challenge_config(self):
|
||||
"""Used to cleanup challenge configurations.
|
||||
|
||||
:raises .errors.PluginError: If unable to revert the challenge config.
|
||||
|
||||
"""
|
||||
self.revert_temporary_config()
|
||||
self.aug.load()
|
||||
|
||||
def rollback_checkpoints(self, rollback=1):
|
||||
"""Rollback saved checkpoints.
|
||||
|
||||
:param int rollback: Number of checkpoints to revert
|
||||
|
||||
:raises .errors.PluginError: If there is a problem with the input or
|
||||
the function is unable to correctly revert the configuration
|
||||
|
||||
"""
|
||||
super(AugeasConfigurator, self).rollback_checkpoints(rollback)
|
||||
self.aug.load()
|
||||
@@ -1,39 +1,40 @@
|
||||
"""Apache Configurator."""
|
||||
"""Apache Configuration based off of Augeas Configurator."""
|
||||
# pylint: disable=too-many-lines
|
||||
from collections import defaultdict
|
||||
import copy
|
||||
import fnmatch
|
||||
import logging
|
||||
import os
|
||||
import pkg_resources
|
||||
import re
|
||||
import six
|
||||
import socket
|
||||
import time
|
||||
|
||||
import pkg_resources
|
||||
import six
|
||||
import zope.component
|
||||
import zope.interface
|
||||
|
||||
from acme import challenges
|
||||
from acme.magic_typing import DefaultDict # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Any, DefaultDict, Dict, List, Set, Union # pylint: disable=unused-import, no-name-in-module
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
|
||||
from certbot.achallenges import KeyAuthorizationAnnotatedChallenge # pylint: disable=unused-import
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
from certbot.plugins import common
|
||||
from certbot.plugins.enhancements import AutoHSTSEnhancement
|
||||
from certbot.plugins.util import path_surgery
|
||||
from certbot_apache._internal import apache_util
|
||||
from certbot_apache._internal import constants
|
||||
from certbot_apache._internal import display_ops
|
||||
from certbot_apache._internal import http_01
|
||||
from certbot_apache._internal import obj
|
||||
from certbot_apache._internal import parser
|
||||
from certbot.plugins.enhancements import AutoHSTSEnhancement
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import augeas_configurator
|
||||
from certbot_apache import constants
|
||||
from certbot_apache import display_ops
|
||||
from certbot_apache import http_01
|
||||
from certbot_apache import obj
|
||||
from certbot_apache import parser
|
||||
from certbot_apache import tls_sni_01
|
||||
|
||||
from collections import defaultdict
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@@ -69,18 +70,22 @@ logger = logging.getLogger(__name__)
|
||||
|
||||
@zope.interface.implementer(interfaces.IAuthenticator, interfaces.IInstaller)
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class ApacheConfigurator(common.Installer):
|
||||
class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# pylint: disable=too-many-instance-attributes,too-many-public-methods
|
||||
"""Apache configurator.
|
||||
|
||||
State of Configurator: This code has been tested and built for Ubuntu
|
||||
14.04 Apache 2.4 and it works for Ubuntu 12.04 Apache 2.2
|
||||
|
||||
:ivar config: Configuration.
|
||||
:type config: :class:`~certbot.interfaces.IConfig`
|
||||
|
||||
:ivar parser: Handles low level parsing
|
||||
:type parser: :class:`~certbot_apache._internal.parser`
|
||||
:type parser: :class:`~certbot_apache.parser`
|
||||
|
||||
:ivar tup version: version of Apache
|
||||
:ivar list vhosts: All vhosts found in the configuration
|
||||
(:class:`list` of :class:`~certbot_apache._internal.obj.VirtualHost`)
|
||||
(:class:`list` of :class:`~certbot_apache.obj.VirtualHost`)
|
||||
|
||||
:ivar dict assoc: Mapping between domains and vhosts
|
||||
|
||||
@@ -109,7 +114,7 @@ class ApacheConfigurator(common.Installer):
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/apache2",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
|
||||
"certbot_apache", "options-ssl-apache.conf")
|
||||
)
|
||||
|
||||
def option(self, key):
|
||||
@@ -172,6 +177,8 @@ class ApacheConfigurator(common.Installer):
|
||||
"(Only Ubuntu/Debian currently)")
|
||||
add("ctl", default=DEFAULTS["ctl"],
|
||||
help="Full path to Apache control script")
|
||||
util.add_deprecated_argument(
|
||||
add, argument_name="init-script", nargs=1)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
"""Initialize an Apache Configurator.
|
||||
@@ -194,8 +201,6 @@ class ApacheConfigurator(common.Installer):
|
||||
self._enhanced_vhosts = defaultdict(set) # type: DefaultDict[str, Set[obj.VirtualHost]]
|
||||
# Temporary state for AutoHSTS enhancement
|
||||
self._autohsts = {} # type: Dict[str, Dict[str, Union[int, float]]]
|
||||
# Reverter save notes
|
||||
self.save_notes = ""
|
||||
|
||||
# These will be set in the prepare function
|
||||
self._prepared = False
|
||||
@@ -210,13 +215,15 @@ class ApacheConfigurator(common.Installer):
|
||||
@property
|
||||
def mod_ssl_conf(self):
|
||||
"""Full absolute path to SSL configuration file."""
|
||||
return os.path.join(self.config.config_dir, constants.MOD_SSL_CONF_DEST)
|
||||
return os.path.join(self.config.config_dir,
|
||||
constants.MOD_SSL_CONF_DEST)
|
||||
|
||||
@property
|
||||
def updated_mod_ssl_conf_digest(self):
|
||||
"""Full absolute path to digest of updated SSL configuration file."""
|
||||
return os.path.join(self.config.config_dir, constants.UPDATED_MOD_SSL_CONF_DIGEST)
|
||||
|
||||
|
||||
def prepare(self):
|
||||
"""Prepare the authenticator/installer.
|
||||
|
||||
@@ -226,6 +233,12 @@ class ApacheConfigurator(common.Installer):
|
||||
:raises .errors.PluginError: If there is any other error
|
||||
|
||||
"""
|
||||
# Perform the actual Augeas initialization to be able to react
|
||||
try:
|
||||
self.init_augeas()
|
||||
except ImportError:
|
||||
raise errors.NoInstallationError("Problem in Augeas installation")
|
||||
|
||||
self._prepare_options()
|
||||
|
||||
# Verify Apache is installed
|
||||
@@ -241,16 +254,18 @@ class ApacheConfigurator(common.Installer):
|
||||
'.'.join(str(i) for i in self.version))
|
||||
if self.version < (2, 2):
|
||||
raise errors.NotSupportedError(
|
||||
"Apache Version {0} not supported.".format(str(self.version)))
|
||||
"Apache Version %s not supported.", str(self.version))
|
||||
|
||||
if not self._check_aug_version():
|
||||
raise errors.NotSupportedError(
|
||||
"Apache plugin support requires libaugeas0 and augeas-lenses "
|
||||
"version 1.2.0 or higher, please make sure you have you have "
|
||||
"those installed.")
|
||||
|
||||
# Recover from previous crash before Augeas initialization to have the
|
||||
# correct parse tree from the get go.
|
||||
self.recovery_routine()
|
||||
# Perform the actual Augeas initialization to be able to react
|
||||
self.parser = self.get_parser()
|
||||
|
||||
# Check for errors in parsing files with Augeas
|
||||
self.parser.check_parsing_errors("httpd.aug")
|
||||
self.check_parsing_errors("httpd.aug")
|
||||
|
||||
# Get all of the available vhosts
|
||||
self.vhosts = self.get_virtual_hosts()
|
||||
@@ -264,72 +279,9 @@ class ApacheConfigurator(common.Installer):
|
||||
except (OSError, errors.LockError):
|
||||
logger.debug("Encountered error:", exc_info=True)
|
||||
raise errors.PluginError(
|
||||
"Unable to create a lock file in {0}. Are you running"
|
||||
" Certbot with sufficient privileges to modify your"
|
||||
" Apache configuration?".format(self.option("server_root")))
|
||||
"Unable to lock %s", self.option("server_root"))
|
||||
self._prepared = True
|
||||
|
||||
def save(self, title=None, temporary=False):
|
||||
"""Saves all changes to the configuration files.
|
||||
|
||||
This function first checks for save errors; if none are found, all
configuration changes are saved according to the function parameters.
If an exception is raised, a new checkpoint is not created.
|
||||
|
||||
:param str title: The title of the save. If a title is given, the
|
||||
configuration will be saved as a new checkpoint and put in a
|
||||
timestamped directory.
|
||||
|
||||
:param bool temporary: Indicates whether the changes made will
|
||||
be quickly reversed in the future (ie. challenges)
|
||||
|
||||
"""
|
||||
save_files = self.parser.unsaved_files()
|
||||
if save_files:
|
||||
self.add_to_checkpoint(save_files,
|
||||
self.save_notes, temporary=temporary)
|
||||
# Handle the parser specific tasks
|
||||
self.parser.save(save_files)
|
||||
if title and not temporary:
|
||||
self.finalize_checkpoint(title)
|
||||
|
||||
def recovery_routine(self):
|
||||
"""Revert all previously modified files.
|
||||
|
||||
Reverts all modified files that have not been saved as a checkpoint
|
||||
|
||||
:raises .errors.PluginError: If unable to recover the configuration
|
||||
|
||||
"""
|
||||
super(ApacheConfigurator, self).recovery_routine()
|
||||
# Reload configuration after these changes take effect if needed
|
||||
# ie. ApacheParser has been initialized.
|
||||
if self.parser:
|
||||
# TODO: wrap into non-implementation specific parser interface
|
||||
self.parser.aug.load()
|
||||
|
||||
def revert_challenge_config(self):
|
||||
"""Used to cleanup challenge configurations.
|
||||
|
||||
:raises .errors.PluginError: If unable to revert the challenge config.
|
||||
|
||||
"""
|
||||
self.revert_temporary_config()
|
||||
self.parser.aug.load()
|
||||
|
||||
def rollback_checkpoints(self, rollback=1):
|
||||
"""Rollback saved checkpoints.
|
||||
|
||||
:param int rollback: Number of checkpoints to revert
|
||||
|
||||
:raises .errors.PluginError: If there is a problem with the input or
|
||||
the function is unable to correctly revert the configuration
|
||||
|
||||
"""
|
||||
super(ApacheConfigurator, self).rollback_checkpoints(rollback)
|
||||
self.parser.aug.load()
|
||||
|
||||
def _verify_exe_availability(self, exe):
|
||||
"""Checks availability of Apache executable"""
|
||||
if not util.exe_exists(exe):
|
||||
@@ -337,11 +289,26 @@ class ApacheConfigurator(common.Installer):
|
||||
raise errors.NoInstallationError(
|
||||
'Cannot find Apache executable {0}'.format(exe))
|
||||
|
||||
def _check_aug_version(self):
|
||||
""" Checks that we have recent enough version of libaugeas.
|
||||
If augeas version is recent enough, it will support case insensitive
|
||||
regexp matching"""
|
||||
|
||||
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
|
||||
try:
|
||||
matches = self.aug.match(
|
||||
"/test//*[self::arg=~regexp('argument', 'i')]")
|
||||
except RuntimeError:
|
||||
self.aug.remove("/test/path")
|
||||
return False
|
||||
self.aug.remove("/test/path")
|
||||
return matches
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
# If user provided vhost_root value in command line, use it
|
||||
return parser.ApacheParser(
|
||||
self.option("server_root"), self.conf("vhost-root"),
|
||||
self.aug, self.option("server_root"), self.conf("vhost-root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _wildcard_domain(self, domain):
|
||||
@@ -390,7 +357,7 @@ class ApacheConfigurator(common.Installer):
|
||||
counterpart, should one get created
|
||||
|
||||
:returns: List of VirtualHosts or None
|
||||
:rtype: `list` of :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:rtype: `list` of :class:`~certbot_apache.obj.VirtualHost`
|
||||
"""
|
||||
|
||||
if self._wildcard_domain(domain):
|
||||
@@ -428,7 +395,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""
|
||||
if len(name.split(".")) == len(domain.split(".")):
|
||||
return fnmatch.fnmatch(name, domain)
|
||||
return None
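```python
# A standalone sketch (not from this changeset) of the wildcard comparison
# above; the hostnames are made up. A wildcard only matches when the candidate
# name has the same number of dot-separated labels as the pattern.
import fnmatch


def matches_wildcard(name, domain):
    if len(name.split(".")) == len(domain.split(".")):
        return fnmatch.fnmatch(name, domain)
    return None


print(matches_wildcard("www.example.com", "*.example.com"))   # True
print(matches_wildcard("a.b.example.com", "*.example.com"))   # None: label count differs
```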
|
||||
|
||||
|
||||
def _choose_vhosts_wildcard(self, domain, create_ssl=True):
|
||||
"""Prompts user to choose vhosts to install a wildcard certificate for"""
|
||||
@@ -449,7 +416,7 @@ class ApacheConfigurator(common.Installer):
|
||||
filtered_vhosts[name] = vhost
|
||||
|
||||
# Only unique VHost objects
|
||||
dialog_input = set(filtered_vhosts.values())
|
||||
dialog_input = set([vhost for vhost in filtered_vhosts.values()])
|
||||
|
||||
# Ask the user which of names to enable, expect list of names back
|
||||
dialog_output = display_ops.select_vhost_multiple(list(dialog_input))
|
||||
@@ -474,6 +441,7 @@ class ApacheConfigurator(common.Installer):
|
||||
self._wildcard_vhosts[domain] = return_vhosts
|
||||
return return_vhosts
|
||||
|
||||
|
||||
def _deploy_cert(self, vhost, cert_path, key_path, chain_path, fullchain_path):
|
||||
"""
|
||||
Helper function for deploy_cert() that handles the actual deployment
|
||||
@@ -481,6 +449,8 @@ class ApacheConfigurator(common.Installer):
|
||||
domain originally passed for deploy_cert(). This is especially true
|
||||
with wildcard certificates
|
||||
"""
|
||||
|
||||
|
||||
# This is done first so that ssl module is enabled and cert_path,
|
||||
# cert_key... can all be parsed appropriately
|
||||
self.prepare_server_https("443")
|
||||
@@ -520,8 +490,8 @@ class ApacheConfigurator(common.Installer):
|
||||
# install SSLCertificateFile, SSLCertificateKeyFile,
|
||||
# and SSLCertificateChainFile directives
|
||||
set_cert_path = cert_path
|
||||
self.parser.aug.set(path["cert_path"][-1], cert_path)
|
||||
self.parser.aug.set(path["cert_key"][-1], key_path)
|
||||
self.aug.set(path["cert_path"][-1], cert_path)
|
||||
self.aug.set(path["cert_key"][-1], key_path)
|
||||
if chain_path is not None:
|
||||
self.parser.add_dir(vhost.path,
|
||||
"SSLCertificateChainFile", chain_path)
|
||||
@@ -533,8 +503,8 @@ class ApacheConfigurator(common.Installer):
|
||||
raise errors.PluginError("Please provide the --fullchain-path "
|
||||
"option pointing to your full chain file")
|
||||
set_cert_path = fullchain_path
|
||||
self.parser.aug.set(path["cert_path"][-1], fullchain_path)
|
||||
self.parser.aug.set(path["cert_key"][-1], key_path)
|
||||
self.aug.set(path["cert_path"][-1], fullchain_path)
|
||||
self.aug.set(path["cert_key"][-1], key_path)
|
||||
|
||||
# Enable the new vhost if needed
|
||||
if not vhost.enabled:
|
||||
@@ -565,7 +535,7 @@ class ApacheConfigurator(common.Installer):
|
||||
counterpart, should one get created
|
||||
|
||||
:returns: vhost associated with name
|
||||
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:rtype: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:raises .errors.PluginError: If no vhost is available or chosen
|
||||
|
||||
@@ -600,9 +570,9 @@ class ApacheConfigurator(common.Installer):
|
||||
"in the Apache config.",
|
||||
target_name)
|
||||
raise errors.PluginError("No vhost selected")
|
||||
if temp:
|
||||
elif temp:
|
||||
return vhost
|
||||
if not vhost.ssl:
|
||||
elif not vhost.ssl:
|
||||
addrs = self._get_proposed_addrs(vhost, "443")
|
||||
# TODO: Conflicts is too conservative
|
||||
if not any(vhost.enabled and vhost.conflicts(addrs) for
|
||||
@@ -668,7 +638,7 @@ class ApacheConfigurator(common.Installer):
|
||||
|
||||
:param str target_name: domain handled by the desired vhost
|
||||
:param vhosts: vhosts to consider
|
||||
:type vhosts: `collections.Iterable` of :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhosts: `collections.Iterable` of :class:`~certbot_apache.obj.VirtualHost`
|
||||
:param bool filter_defaults: whether a vhost with a _default_
|
||||
addr is acceptable
|
||||
|
||||
@@ -752,7 +722,7 @@ class ApacheConfigurator(common.Installer):
|
||||
if name:
|
||||
all_names.add(name)
|
||||
|
||||
if vhost_macro:
|
||||
if len(vhost_macro) > 0:
|
||||
zope.component.getUtility(interfaces.IDisplay).notification(
|
||||
"Apache mod_macro seems to be in use in file(s):\n{0}"
|
||||
"\n\nUnfortunately mod_macro is not yet supported".format(
|
||||
@@ -810,7 +780,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Helper function for get_virtual_hosts().
|
||||
|
||||
:param host: In progress vhost whose names will be added
|
||||
:type host: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type host: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
"""
|
||||
|
||||
@@ -829,12 +799,12 @@ class ApacheConfigurator(common.Installer):
|
||||
:param str path: Augeas path to virtual host
|
||||
|
||||
:returns: newly created vhost
|
||||
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:rtype: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
"""
|
||||
addrs = set()
|
||||
try:
|
||||
args = self.parser.aug.match(path + "/arg")
|
||||
args = self.aug.match(path + "/arg")
|
||||
except RuntimeError:
|
||||
logger.warning("Encountered a problem while parsing file: %s, skipping", path)
|
||||
return None
|
||||
@@ -852,7 +822,7 @@ class ApacheConfigurator(common.Installer):
|
||||
is_ssl = True
|
||||
|
||||
filename = apache_util.get_file_path(
|
||||
self.parser.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
|
||||
self.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
|
||||
if filename is None:
|
||||
return None
|
||||
|
||||
@@ -870,7 +840,7 @@ class ApacheConfigurator(common.Installer):
|
||||
def get_virtual_hosts(self):
|
||||
"""Returns list of virtual hosts found in the Apache configuration.
|
||||
|
||||
:returns: List of :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:returns: List of :class:`~certbot_apache.obj.VirtualHost`
|
||||
objects found in configuration
|
||||
:rtype: list
|
||||
|
||||
@@ -882,7 +852,7 @@ class ApacheConfigurator(common.Installer):
|
||||
# Make a list of parser paths because the parser_paths
|
||||
# dictionary may be modified during the loop.
|
||||
for vhost_path in list(self.parser.parser_paths):
|
||||
paths = self.parser.aug.match(
|
||||
paths = self.aug.match(
|
||||
("/files%s//*[label()=~regexp('%s')]" %
|
||||
(vhost_path, parser.case_i("VirtualHost"))))
|
||||
paths = [path for path in paths if
|
||||
@@ -892,7 +862,7 @@ class ApacheConfigurator(common.Installer):
|
||||
if not new_vhost:
|
||||
continue
|
||||
internal_path = apache_util.get_internal_aug_path(new_vhost.path)
|
||||
realpath = filesystem.realpath(new_vhost.filep)
|
||||
realpath = os.path.realpath(new_vhost.filep)
|
||||
if realpath not in file_paths:
|
||||
file_paths[realpath] = new_vhost.filep
|
||||
internal_paths[realpath].add(internal_path)
|
||||
@@ -927,7 +897,7 @@ class ApacheConfigurator(common.Installer):
|
||||
now NameVirtualHosts. If version is earlier than 2.4, check if addr
|
||||
has a NameVirtualHost directive in the Apache config
|
||||
|
||||
:param certbot_apache._internal.obj.Addr target_addr: vhost address
|
||||
:param certbot_apache.obj.Addr target_addr: vhost address
|
||||
|
||||
:returns: Success
|
||||
:rtype: bool
|
||||
@@ -945,18 +915,19 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Adds NameVirtualHost directive for given address.
|
||||
|
||||
:param addr: Address that will be added as NameVirtualHost directive
|
||||
:type addr: :class:`~certbot_apache._internal.obj.Addr`
|
||||
:type addr: :class:`~certbot_apache.obj.Addr`
|
||||
|
||||
"""
|
||||
|
||||
loc = parser.get_aug_path(self.parser.loc["name"])
|
||||
if addr.get_port() == "443":
|
||||
self.parser.add_dir_to_ifmodssl(
|
||||
path = self.parser.add_dir_to_ifmodssl(
|
||||
loc, "NameVirtualHost", [str(addr)])
|
||||
else:
|
||||
self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
|
||||
path = self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
|
||||
|
||||
msg = "Setting {0} to be NameBasedVirtualHost\n".format(addr)
|
||||
msg = ("Setting %s to be NameBasedVirtualHost\n"
|
||||
"\tDirective added to %s\n" % (addr, path))
|
||||
logger.debug(msg)
|
||||
self.save_notes += msg
|
||||
|
||||
@@ -1097,7 +1068,6 @@ class ApacheConfigurator(common.Installer):
|
||||
# Ugly but takes care of protocol def, eg: 1.1.1.1:443 https
|
||||
if listen.split(":")[-1].split(" ")[0] == port:
|
||||
return True
|
||||
return None
|
||||
|
||||
def prepare_https_modules(self, temp):
|
||||
"""Helper method for prepare_server_https, taking care of enabling
|
||||
@@ -1113,7 +1083,24 @@ class ApacheConfigurator(common.Installer):
|
||||
if "ssl_module" not in self.parser.modules:
|
||||
self.enable_mod("ssl", temp=temp)
|
||||
|
||||
def make_vhost_ssl(self, nonssl_vhost):
|
||||
def make_addrs_sni_ready(self, addrs):
|
||||
"""Checks to see if the server is ready for SNI challenges.
|
||||
|
||||
:param addrs: Addresses to check SNI compatibility
|
||||
:type addrs: :class:`~certbot_apache.obj.Addr`
|
||||
|
||||
"""
|
||||
# Version 2.4 and later are automatically SNI ready.
|
||||
if self.version >= (2, 4):
|
||||
return
|
||||
|
||||
for addr in addrs:
|
||||
if not self.is_name_vhost(addr):
|
||||
logger.debug("Setting VirtualHost at %s to be a name "
|
||||
"based virtual host", addr)
|
||||
self.add_name_vhost(addr)
|
||||
|
||||
def make_vhost_ssl(self, nonssl_vhost): # pylint: disable=too-many-locals
|
||||
"""Makes an ssl_vhost version of a nonssl_vhost.
|
||||
|
||||
Duplicates vhost and adds default ssl options
|
||||
@@ -1123,10 +1110,10 @@ class ApacheConfigurator(common.Installer):
|
||||
.. note:: This function saves the configuration
|
||||
|
||||
:param nonssl_vhost: Valid VH that doesn't have SSLEngine on
|
||||
:type nonssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type nonssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:returns: SSL vhost
|
||||
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:rtype: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:raises .errors.PluginError: If more than one virtual host is in
|
||||
the file or if plugin is unable to write/read vhost files.
|
||||
@@ -1135,16 +1122,16 @@ class ApacheConfigurator(common.Installer):
|
||||
avail_fp = nonssl_vhost.filep
|
||||
ssl_fp = self._get_ssl_vhost_path(avail_fp)
|
||||
|
||||
orig_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
orig_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
(self._escape(ssl_fp),
|
||||
parser.case_i("VirtualHost")))
|
||||
|
||||
self._copy_create_ssl_vhost_skeleton(nonssl_vhost, ssl_fp)
|
||||
|
||||
# Reload augeas to take into account the new vhost
|
||||
self.parser.aug.load()
|
||||
self.aug.load()
|
||||
# Get Vhost augeas path for new vhost
|
||||
new_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
new_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
(self._escape(ssl_fp),
|
||||
parser.case_i("VirtualHost")))
|
||||
|
||||
@@ -1155,7 +1142,7 @@ class ApacheConfigurator(common.Installer):
|
||||
# Make Augeas aware of the new vhost
|
||||
self.parser.parse_file(ssl_fp)
|
||||
# Try to search again
|
||||
new_matches = self.parser.aug.match(
|
||||
new_matches = self.aug.match(
|
||||
"/files%s//* [label()=~regexp('%s')]" %
|
||||
(self._escape(ssl_fp),
|
||||
parser.case_i("VirtualHost")))
|
||||
@@ -1217,15 +1204,16 @@ class ApacheConfigurator(common.Installer):
|
||||
"""
|
||||
|
||||
if self.conf("vhost-root") and os.path.exists(self.conf("vhost-root")):
|
||||
fp = os.path.join(filesystem.realpath(self.option("vhost_root")),
|
||||
fp = os.path.join(os.path.realpath(self.option("vhost_root")),
|
||||
os.path.basename(non_ssl_vh_fp))
|
||||
else:
|
||||
# Use non-ssl filepath
|
||||
fp = filesystem.realpath(non_ssl_vh_fp)
|
||||
fp = os.path.realpath(non_ssl_vh_fp)
|
||||
|
||||
if fp.endswith(".conf"):
|
||||
return fp[:-(len(".conf"))] + self.option("le_vhost_ext")
|
||||
return fp + self.option("le_vhost_ext")
|
||||
else:
|
||||
return fp + self.option("le_vhost_ext")
|
||||
|
||||
def _sift_rewrite_rule(self, line):
|
||||
"""Decides whether a line should be copied to a SSL vhost.
|
||||
@@ -1305,8 +1293,8 @@ class ApacheConfigurator(common.Installer):
|
||||
"vhost for your HTTPS site located at {1} because they have "
|
||||
"the potential to create redirection loops.".format(
|
||||
vhost.filep, ssl_fp), reporter.MEDIUM_PRIORITY)
|
||||
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
|
||||
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
|
||||
self.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
|
||||
self.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
|
||||
|
||||
def _sift_rewrite_rules(self, contents):
|
||||
""" Helper function for _copy_create_ssl_vhost_skeleton to prepare the
|
||||
@@ -1364,9 +1352,12 @@ class ApacheConfigurator(common.Installer):
|
||||
result.append(comment)
|
||||
sift = True
|
||||
|
||||
result.append('\n'.join(['# ' + l for l in chunk]))
|
||||
result.append('\n'.join(
|
||||
['# ' + l for l in chunk]))
|
||||
continue
|
||||
else:
|
||||
result.append('\n'.join(chunk))
|
||||
continue
|
||||
return result, sift
|
||||
|
||||
def _get_vhost_block(self, vhost):
|
||||
@@ -1378,7 +1369,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""
|
||||
|
||||
try:
|
||||
span_val = self.parser.aug.span(vhost.path)
|
||||
span_val = self.aug.span(vhost.path)
|
||||
except ValueError:
|
||||
logger.critical("Error while reading the VirtualHost %s from "
|
||||
"file %s", vhost.name, vhost.filep, exc_info=True)
|
||||
@@ -1413,13 +1404,13 @@ class ApacheConfigurator(common.Installer):
|
||||
|
||||
def _update_ssl_vhosts_addrs(self, vh_path):
|
||||
ssl_addrs = set()
|
||||
ssl_addr_p = self.parser.aug.match(vh_path + "/arg")
|
||||
ssl_addr_p = self.aug.match(vh_path + "/arg")
|
||||
|
||||
for addr in ssl_addr_p:
|
||||
old_addr = obj.Addr.fromstring(
|
||||
str(self.parser.get_arg(addr)))
|
||||
ssl_addr = old_addr.get_addr_obj("443")
|
||||
self.parser.aug.set(addr, str(ssl_addr))
|
||||
self.aug.set(addr, str(ssl_addr))
|
||||
ssl_addrs.add(ssl_addr)
|
||||
|
||||
return ssl_addrs
|
||||
@@ -1438,14 +1429,15 @@ class ApacheConfigurator(common.Installer):
|
||||
vh_path, False)) > 1:
|
||||
directive_path = self.parser.find_dir(directive, None,
|
||||
vh_path, False)
|
||||
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
|
||||
def _remove_directives(self, vh_path, directives):
|
||||
for directive in directives:
|
||||
while self.parser.find_dir(directive, None, vh_path, False):
|
||||
while len(self.parser.find_dir(directive, None,
|
||||
vh_path, False)) > 0:
|
||||
directive_path = self.parser.find_dir(directive, None,
|
||||
vh_path, False)
|
||||
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
|
||||
def _add_dummy_ssl_directives(self, vh_path):
|
||||
self.parser.add_dir(vh_path, "SSLCertificateFile",
|
||||
@@ -1484,7 +1476,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""
|
||||
matches = self.parser.find_dir(
|
||||
"ServerAlias", start=vh_path, exclude=False)
|
||||
aliases = (self.parser.aug.get(match) for match in matches)
|
||||
aliases = (self.aug.get(match) for match in matches)
|
||||
return self.domain_in_names(aliases, target_name)
|
||||
|
||||
def _add_name_vhost_if_necessary(self, vhost):
|
||||
@@ -1494,7 +1486,7 @@ class ApacheConfigurator(common.Installer):
|
||||
https://httpd.apache.org/docs/2.2/mod/core.html#namevirtualhost
|
||||
|
||||
:param vhost: New virtual host that was recently created.
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
"""
|
||||
need_to_save = False
|
||||
@@ -1529,7 +1521,7 @@ class ApacheConfigurator(common.Installer):
|
||||
:param str id_str: Id string for matching
|
||||
|
||||
:returns: The matched VirtualHost or None
|
||||
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost` or None
|
||||
:rtype: :class:`~certbot_apache.obj.VirtualHost` or None
|
||||
|
||||
:raises .errors.PluginError: If no VirtualHost is found
|
||||
"""
|
||||
@@ -1546,7 +1538,7 @@ class ApacheConfigurator(common.Installer):
|
||||
used for keeping track of VirtualHost directive over time.
|
||||
|
||||
:param vhost: Virtual host to add the id
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:returns: The unique ID or None
|
||||
:rtype: str or None
|
||||
@@ -1568,7 +1560,7 @@ class ApacheConfigurator(common.Installer):
|
||||
If ID already exists, returns that instead.
|
||||
|
||||
:param vhost: Virtual host to add or find the id
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:returns: The unique ID for vhost
|
||||
:rtype: str or None
|
||||
@@ -1606,9 +1598,9 @@ class ApacheConfigurator(common.Installer):
|
||||
|
||||
:param str domain: domain to enhance
|
||||
:param str enhancement: enhancement type defined in
|
||||
:const:`~certbot.plugins.enhancements.ENHANCEMENTS`
|
||||
:const:`~certbot.constants.ENHANCEMENTS`
|
||||
:param options: options for the enhancement
|
||||
See :const:`~certbot.plugins.enhancements.ENHANCEMENTS`
|
||||
See :const:`~certbot.constants.ENHANCEMENTS`
|
||||
documentation for appropriate parameter.
|
||||
|
||||
:raises .errors.PluginError: If Enhancement is not supported, or if
|
||||
@@ -1646,7 +1638,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Increase the AutoHSTS max-age value
|
||||
|
||||
:param vhost: Virtual host object to modify
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:param str id_str: The unique ID string of VirtualHost
|
||||
|
||||
@@ -1667,7 +1659,7 @@ class ApacheConfigurator(common.Installer):
|
||||
if header_path:
|
||||
pat = '(?:[ "]|^)(strict-transport-security)(?:[ "]|$)'
|
||||
for match in header_path:
|
||||
if re.search(pat, self.parser.aug.get(match).lower()):
|
||||
if re.search(pat, self.aug.get(match).lower()):
|
||||
hsts_dirpath = match
|
||||
if not hsts_dirpath:
|
||||
err_msg = ("Certbot was unable to find the existing HSTS header "
|
||||
@@ -1681,7 +1673,7 @@ class ApacheConfigurator(common.Installer):
|
||||
# Our match statement was for string strict-transport-security, but
|
||||
# we need to update the value instead. The next index is for the value
|
||||
hsts_dirpath = hsts_dirpath.replace("arg[3]", "arg[4]")
|
||||
self.parser.aug.set(hsts_dirpath, hsts_maxage)
|
||||
self.aug.set(hsts_dirpath, hsts_maxage)
|
||||
note_msg = ("Increasing HSTS max-age value to {0} for VirtualHost "
|
||||
"in {1}\n".format(nextstep_value, vhost.filep))
|
||||
logger.debug(note_msg)
|
||||
@@ -1730,13 +1722,13 @@ class ApacheConfigurator(common.Installer):
|
||||
.. note:: This function saves the configuration
|
||||
|
||||
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
|
||||
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:param unused_options: Not currently used
|
||||
:type unused_options: Not Available
|
||||
|
||||
:returns: Success, general_vhost (HTTP vhost)
|
||||
:rtype: (bool, :class:`~certbot_apache._internal.obj.VirtualHost`)
|
||||
:rtype: (bool, :class:`~certbot_apache.obj.VirtualHost`)
|
||||
|
||||
"""
|
||||
min_apache_ver = (2, 3, 3)
|
||||
@@ -1763,7 +1755,7 @@ class ApacheConfigurator(common.Installer):
|
||||
# We'll simply delete the directive, so that we'll have a
|
||||
# consistent OCSP cache path.
|
||||
if stapling_cache_aug_path:
|
||||
self.parser.aug.remove(
|
||||
self.aug.remove(
|
||||
re.sub(r"/\w*$", "", stapling_cache_aug_path[0]))
|
||||
|
||||
self.parser.add_dir_to_ifmodssl(ssl_vhost_aug_path,
|
||||
@@ -1786,14 +1778,14 @@ class ApacheConfigurator(common.Installer):
|
||||
.. note:: This function saves the configuration
|
||||
|
||||
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
|
||||
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:param header_substring: string that uniquely identifies a header.
|
||||
e.g: Strict-Transport-Security, Upgrade-Insecure-Requests.
|
||||
:type str
|
||||
|
||||
:returns: Success, general_vhost (HTTP vhost)
|
||||
:rtype: (bool, :class:`~certbot_apache._internal.obj.VirtualHost`)
|
||||
:rtype: (bool, :class:`~certbot_apache.obj.VirtualHost`)
|
||||
|
||||
:raises .errors.PluginError: If no viable HTTP host can be created or
|
||||
set with header header_substring.
|
||||
@@ -1821,7 +1813,7 @@ class ApacheConfigurator(common.Installer):
|
||||
contains the string header_substring.
|
||||
|
||||
:param ssl_vhost: vhost to check
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:param header_substring: string that uniquely identifies a header.
|
||||
e.g: Strict-Transport-Security, Upgrade-Insecure-Requests.
|
||||
@@ -1840,7 +1832,7 @@ class ApacheConfigurator(common.Installer):
|
||||
# "Existing Header directive for virtualhost"
|
||||
pat = '(?:[ "]|^)(%s)(?:[ "]|$)' % (header_substring.lower())
|
||||
for match in header_path:
|
||||
if re.search(pat, self.parser.aug.get(match).lower()):
|
||||
if re.search(pat, self.aug.get(match).lower()):
|
||||
raise errors.PluginEnhancementAlreadyPresent(
|
||||
"Existing %s header" % (header_substring))
|
||||
|
||||
@@ -1858,7 +1850,7 @@ class ApacheConfigurator(common.Installer):
|
||||
.. note:: This function saves the configuration
|
||||
|
||||
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
|
||||
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:param unused_options: Not currently used
|
||||
:type unused_options: Not Available
|
||||
@@ -1933,6 +1925,7 @@ class ApacheConfigurator(common.Installer):
|
||||
self.parser.add_dir(vhost.path, "RewriteRule",
|
||||
constants.REWRITE_HTTPS_ARGS)
|
||||
|
||||
|
||||
def _verify_no_certbot_redirect(self, vhost):
|
||||
"""Checks to see if a redirect was already installed by certbot.
|
||||
|
||||
@@ -1943,7 +1936,7 @@ class ApacheConfigurator(common.Installer):
|
||||
delete certbot's old rewrite rules and set the new one instead.
|
||||
|
||||
:param vhost: vhost to check
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:raises errors.PluginEnhancementAlreadyPresent: When the exact
|
||||
certbot redirection WriteRule exists in virtual host.
|
||||
@@ -1967,11 +1960,11 @@ class ApacheConfigurator(common.Installer):
|
||||
constants.REWRITE_HTTPS_ARGS_WITH_END]
|
||||
|
||||
for dir_path, args_paths in rewrite_args_dict.items():
|
||||
arg_vals = [self.parser.aug.get(x) for x in args_paths]
|
||||
arg_vals = [self.aug.get(x) for x in args_paths]
|
||||
|
||||
# Search for past redirection rule, delete it, set the new one
|
||||
if arg_vals in constants.OLD_REWRITE_HTTPS_ARGS:
|
||||
self.parser.aug.remove(dir_path)
|
||||
self.aug.remove(dir_path)
|
||||
self._set_https_redirection_rewrite_rule(vhost)
|
||||
self.save()
|
||||
raise errors.PluginEnhancementAlreadyPresent(
|
||||
@@ -1985,7 +1978,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Checks if there exists a RewriteRule directive in vhost
|
||||
|
||||
:param vhost: vhost to check
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:returns: True if a RewriteRule directive exists.
|
||||
:rtype: bool
|
||||
@@ -1999,7 +1992,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Checks if a RewriteEngine directive is on
|
||||
|
||||
:param vhost: vhost to check
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
"""
|
||||
rewrite_engine_path_list = self.parser.find_dir("RewriteEngine", "on",
|
||||
@@ -2016,10 +2009,10 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Creates an http_vhost specifically to redirect for the ssl_vhost.
|
||||
|
||||
:param ssl_vhost: ssl vhost
|
||||
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:returns: tuple of the form
|
||||
(`success`, :class:`~certbot_apache._internal.obj.VirtualHost`)
|
||||
(`success`, :class:`~certbot_apache.obj.VirtualHost`)
|
||||
:rtype: tuple
|
||||
|
||||
"""
|
||||
@@ -2027,7 +2020,7 @@ class ApacheConfigurator(common.Installer):
|
||||
|
||||
redirect_filepath = self._write_out_redirect(ssl_vhost, text)
|
||||
|
||||
self.parser.aug.load()
|
||||
self.aug.load()
|
||||
# Make a new vhost data structure and add it to the lists
|
||||
new_vhost = self._create_vhost(parser.get_aug_path(self._escape(redirect_filepath)))
|
||||
self.vhosts.append(new_vhost)
|
||||
@@ -2145,7 +2138,7 @@ class ApacheConfigurator(common.Installer):
|
||||
of this method where available.
|
||||
|
||||
:param vhost: vhost to enable
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:raises .errors.NotSupportedError: If filesystem layout is not
|
||||
supported.
|
||||
@@ -2163,7 +2156,7 @@ class ApacheConfigurator(common.Installer):
|
||||
vhost.enabled = True
|
||||
return
|
||||
|
||||
def enable_mod(self, mod_name, temp=False): # pylint: disable=unused-argument
|
||||
def enable_mod(self, mod_name, temp=False): # pylint: disable=unused-argument
|
||||
"""Enables module in Apache.
|
||||
|
||||
Both enables and reloads Apache so module is active.
|
||||
@@ -2200,6 +2193,7 @@ class ApacheConfigurator(common.Installer):
|
||||
:raises .errors.MisconfigurationError: If reload fails
|
||||
|
||||
"""
|
||||
error = ""
|
||||
try:
|
||||
util.run_script(self.option("restart_cmd"))
|
||||
except errors.SubprocessError as err:
|
||||
@@ -2273,7 +2267,7 @@ class ApacheConfigurator(common.Installer):
|
||||
###########################################################################
|
||||
def get_chall_pref(self, unused_domain): # pylint: disable=no-self-use
|
||||
"""Return list of challenge preferences."""
|
||||
return [challenges.HTTP01]
|
||||
return [challenges.HTTP01, challenges.TLSSNI01]
|
||||
|
||||
def perform(self, achalls):
|
||||
"""Perform the configuration related challenge.
|
||||
@@ -2286,15 +2280,20 @@ class ApacheConfigurator(common.Installer):
|
||||
self._chall_out.update(achalls)
|
||||
responses = [None] * len(achalls)
|
||||
http_doer = http_01.ApacheHttp01(self)
|
||||
sni_doer = tls_sni_01.ApacheTlsSni01(self)
|
||||
|
||||
for i, achall in enumerate(achalls):
|
||||
# Currently also have chall_doer hold associated index of the
|
||||
# challenge. This helps to put all of the responses back together
|
||||
# when they are all complete.
|
||||
http_doer.add_chall(achall, i)
|
||||
if isinstance(achall.chall, challenges.HTTP01):
|
||||
http_doer.add_chall(achall, i)
|
||||
else: # tls-sni-01
|
||||
sni_doer.add_chall(achall, i)
|
||||
|
||||
http_response = http_doer.perform()
|
||||
if http_response:
|
||||
sni_response = sni_doer.perform()
|
||||
if http_response or sni_response:
|
||||
# Must reload in order to activate the challenges.
|
||||
# Handled here because we may be able to load up other challenge
|
||||
# types
|
||||
@@ -2305,6 +2304,7 @@ class ApacheConfigurator(common.Installer):
|
||||
time.sleep(3)
|
||||
|
||||
self._update_responses(responses, http_response, http_doer)
|
||||
self._update_responses(responses, sni_response, sni_doer)
|
||||
|
||||
return responses
|
||||
|
||||
@@ -2339,7 +2339,7 @@ class ApacheConfigurator(common.Installer):
|
||||
Enable the AutoHSTS enhancement for defined domains
|
||||
|
||||
:param _unused_lineage: Certificate lineage object, unused
|
||||
:type _unused_lineage: certbot._internal.storage.RenewableCert
|
||||
:type _unused_lineage: certbot.storage.RenewableCert
|
||||
|
||||
:param domains: List of domains in certificate to enhance
|
||||
:type domains: str
|
||||
@@ -2382,7 +2382,7 @@ class ApacheConfigurator(common.Installer):
|
||||
"""Do the initial AutoHSTS deployment to a vhost
|
||||
|
||||
:param ssl_vhost: The VirtualHost object to deploy the AutoHSTS
|
||||
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost` or None
|
||||
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost` or None
|
||||
|
||||
:raises errors.PluginEnhancementAlreadyPresent: When already enhanced
|
||||
|
||||
@@ -2464,7 +2464,7 @@ class ApacheConfigurator(common.Installer):
|
||||
and changes the HSTS max-age to a high value.
|
||||
|
||||
:param lineage: Certificate lineage object
|
||||
:type lineage: certbot._internal.storage.RenewableCert
|
||||
:type lineage: certbot.storage.RenewableCert
|
||||
"""
|
||||
self._autohsts_fetch_state()
|
||||
if not self._autohsts:
|
||||
@@ -2509,4 +2509,4 @@ class ApacheConfigurator(common.Installer):
|
||||
self._autohsts_save_state()
|
||||
|
||||
|
||||
AutoHSTSEnhancement.register(ApacheConfigurator)
|
||||
AutoHSTSEnhancement.register(ApacheConfigurator) # pylint: disable=no-member
|
||||
@@ -1,7 +1,6 @@
"""Apache plugin constants."""
import pkg_resources

from certbot.compat import os

MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""
@@ -10,7 +9,6 @@ MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-apache-conf-digest.txt"
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""

# NEVER REMOVE A SINGLE HASH FROM THIS LIST UNLESS YOU KNOW EXACTLY WHAT YOU ARE DOING!
ALL_SSL_OPTIONS_HASHES = [
    '2086bca02db48daf93468332543c60ac6acdb6f0b58c7bfdf578a5d47092f82a',
    '4844d36c9a0f587172d9fa10f4f1c9518e3bcfa1947379f155e16a70a728c21a',
@@ -20,15 +18,11 @@ ALL_SSL_OPTIONS_HASHES = [
    'cfdd7c18d2025836ea3307399f509cfb1ebf2612c87dd600a65da2a8e2f2797b',
    '80720bd171ccdc2e6b917ded340defae66919e4624962396b992b7218a561791',
    'c0c022ea6b8a51ecc8f1003d0a04af6c3f2bc1c3ce506b3c2dfc1f11ef931082',
    '717b0a89f5e4c39b09a42813ac6e747cfbdeb93439499e73f4f70a1fe1473f20',
    '0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
    '86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
    '06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
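Note on the hunk above: the digest list exists so the plugin can recognise every `options-ssl-apache.conf` it has ever shipped and avoid clobbering a file the administrator edited by hand. A minimal sketch of that kind of check, using only the standard library (the helper names and the example path are illustrative, not the plugin's actual API):

```
import hashlib

def file_sha256(path):
    """Return the hex SHA256 digest of a file's contents."""
    with open(path, "rb") as fh:
        return hashlib.sha256(fh.read()).hexdigest()

def is_unmodified_certbot_copy(path, known_hashes):
    """True if the deployed options file matches a digest Certbot once shipped."""
    return file_sha256(path) in known_hashes

# Hypothetical usage with the list from the constants module above:
# is_unmodified_certbot_copy("/etc/letsencrypt/options-ssl-apache.conf",
#                            ALL_SSL_OPTIONS_HASHES)
```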

AUGEAS_LENS_DIR = pkg_resources.resource_filename(
    "certbot_apache", os.path.join("_internal", "augeas_lens"))
    "certbot_apache", "augeas_lens")
"""Path to the Augeas lens directory"""

REWRITE_HTTPS_ARGS = [
@@ -1,13 +1,15 @@
|
||||
"""Contains UI methods for Apache operations."""
|
||||
import logging
|
||||
import os
|
||||
|
||||
import zope.component
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot.compat import os
|
||||
|
||||
import certbot.display.util as display_util
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@@ -24,7 +26,7 @@ def select_vhost_multiple(vhosts):
|
||||
return list()
|
||||
tags_list = [vhost.display_repr()+"\n" for vhost in vhosts]
|
||||
# Remove the extra newline from the last entry
|
||||
if tags_list:
|
||||
if len(tags_list):
|
||||
tags_list[-1] = tags_list[-1][:-1]
|
||||
code, names = zope.component.getUtility(interfaces.IDisplay).checklist(
|
||||
"Which VirtualHosts would you like to install the wildcard certificate for?",
|
||||
@@ -60,7 +62,8 @@ def select_vhost(domain, vhosts):
|
||||
code, tag = _vhost_menu(domain, vhosts)
|
||||
if code == display_util.OK:
|
||||
return vhosts[tag]
|
||||
return None
|
||||
else:
|
||||
return None
|
||||
|
||||
def _vhost_menu(domain, vhosts):
|
||||
"""Select an appropriate Apache Vhost.
|
||||
@@ -77,7 +80,7 @@ def _vhost_menu(domain, vhosts):
|
||||
|
||||
if free_chars < 2:
|
||||
logger.debug("Display size is too small for "
|
||||
"certbot_apache._internal.display_ops._vhost_menu()")
|
||||
"certbot_apache.display_ops._vhost_menu()")
|
||||
# This runs the edge off the screen, but it doesn't cause an "error"
|
||||
filename_size = 1
|
||||
disp_name_size = 1
|
||||
@@ -90,7 +93,7 @@ def _vhost_menu(domain, vhosts):
|
||||
for vhost in vhosts:
|
||||
if len(vhost.get_names()) == 1:
|
||||
disp_name = next(iter(vhost.get_names()))
|
||||
elif not vhost.get_names():
|
||||
elif len(vhost.get_names()) == 0:
|
||||
disp_name = ""
|
||||
else:
|
||||
disp_name = "Multiple Names"
|
||||
@@ -1,32 +1,23 @@
""" Entry point for Apache Plugin """
# Pylint does not like disutils.version when running inside a venv.
# See: https://github.com/PyCQA/pylint/issues/73
from distutils.version import LooseVersion # pylint: disable=no-name-in-module,import-error

from certbot import util
from certbot_apache._internal import configurator
from certbot_apache._internal import override_arch
from certbot_apache._internal import override_centos
from certbot_apache._internal import override_darwin
from certbot_apache._internal import override_debian
from certbot_apache._internal import override_fedora
from certbot_apache._internal import override_gentoo
from certbot_apache._internal import override_suse

from certbot_apache import configurator
from certbot_apache import override_arch
from certbot_apache import override_darwin
from certbot_apache import override_debian
from certbot_apache import override_centos
from certbot_apache import override_gentoo
from certbot_apache import override_suse

OVERRIDE_CLASSES = {
    "arch": override_arch.ArchConfigurator,
    "cloudlinux": override_centos.CentOSConfigurator,
    "darwin": override_darwin.DarwinConfigurator,
    "debian": override_debian.DebianConfigurator,
    "ubuntu": override_debian.DebianConfigurator,
    "centos": override_centos.CentOSConfigurator,
    "centos linux": override_centos.CentOSConfigurator,
    "fedora_old": override_centos.CentOSConfigurator,
    "fedora": override_fedora.FedoraConfigurator,
    "linuxmint": override_debian.DebianConfigurator,
    "fedora": override_centos.CentOSConfigurator,
    "ol": override_centos.CentOSConfigurator,
    "oracle": override_centos.CentOSConfigurator,
    "redhatenterpriseserver": override_centos.CentOSConfigurator,
    "red hat enterprise linux server": override_centos.CentOSConfigurator,
    "rhel": override_centos.CentOSConfigurator,
    "amazon": override_centos.CentOSConfigurator,
@@ -34,24 +25,14 @@ OVERRIDE_CLASSES = {
    "gentoo base system": override_gentoo.GentooConfigurator,
    "opensuse": override_suse.OpenSUSEConfigurator,
    "suse": override_suse.OpenSUSEConfigurator,
    "sles": override_suse.OpenSUSEConfigurator,
    "scientific": override_centos.CentOSConfigurator,
    "scientific linux": override_centos.CentOSConfigurator,
}


def get_configurator():
    """ Get correct configurator class based on the OS fingerprint """
    os_name, os_version = util.get_os_info()
    os_name = os_name.lower()
    os_info = util.get_os_info()
    override_class = None

    # Special case for older Fedora versions
    if os_name == 'fedora' and LooseVersion(os_version) < LooseVersion('29'):
        os_name = 'fedora_old'

    try:
        override_class = OVERRIDE_CLASSES[os_name]
        override_class = OVERRIDE_CLASSES[os_info[0].lower()]
    except KeyError:
        # OS not found in the list
        os_like = util.get_systemd_os_like()
@@ -64,5 +45,4 @@ def get_configurator():
        override_class = configurator.ApacheConfigurator
    return override_class


ENTRYPOINT = get_configurator()
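The entry point above selects a distro-specific configurator: it lowercases the OS name from `util.get_os_info()`, looks it up in `OVERRIDE_CLASSES`, falls back to the systemd `ID_LIKE` values from `util.get_systemd_os_like()`, and finally to the generic `ApacheConfigurator`. A self-contained sketch of that selection pattern, with plain strings standing in for the real certbot helpers and classes:

```
# Stand-ins for OVERRIDE_CLASSES / util.get_os_info(); values are illustrative.
OVERRIDES = {"debian": "DebianConfigurator", "centos": "CentOSConfigurator"}

def pick_configurator(os_name, os_like=()):
    """Return the override for os_name, then for any ID_LIKE entry, else the default."""
    override = OVERRIDES.get(os_name.lower())
    if override is None:
        for like in os_like:  # e.g. ID_LIKE entries from /etc/os-release
            override = OVERRIDES.get(like.lower())
            if override:
                break
    return override or "ApacheConfigurator"

print(pick_configurator("Ubuntu", os_like=["debian"]))  # -> DebianConfigurator
```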
@@ -1,19 +1,16 @@
|
||||
"""A class that performs HTTP-01 challenges for Apache"""
|
||||
import logging
|
||||
import os
|
||||
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import List, Set # pylint: disable=unused-import, no-name-in-module
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
from certbot.plugins import common
|
||||
from certbot_apache._internal.obj import VirtualHost # pylint: disable=unused-import
|
||||
from certbot_apache._internal.parser import get_aug_path
|
||||
from certbot_apache.obj import VirtualHost # pylint: disable=unused-import
|
||||
from certbot_apache.parser import get_aug_path
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ApacheHttp01(common.ChallengePerformer):
|
||||
class ApacheHttp01(common.TLSSNI01):
|
||||
"""Class that performs HTTP-01 challenges within the Apache configurator."""
|
||||
|
||||
CONFIG_TEMPLATE22_PRE = """\
|
||||
@@ -168,7 +165,8 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
|
||||
def _set_up_challenges(self):
|
||||
if not os.path.isdir(self.challenge_dir):
|
||||
filesystem.makedirs(self.challenge_dir, 0o755)
|
||||
os.makedirs(self.challenge_dir)
|
||||
os.chmod(self.challenge_dir, 0o755)
|
||||
|
||||
responses = []
|
||||
for achall in self.achalls:
|
||||
@@ -184,7 +182,7 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
self.configurator.reverter.register_file_creation(True, name)
|
||||
with open(name, 'wb') as f:
|
||||
f.write(validation.encode())
|
||||
filesystem.chmod(name, 0o644)
|
||||
os.chmod(name, 0o644)
|
||||
|
||||
return response
|
||||
|
||||
@@ -194,8 +192,8 @@ class ApacheHttp01(common.ChallengePerformer):
|
||||
|
||||
if vhost not in self.moded_vhosts:
|
||||
logger.debug(
|
||||
"Adding a temporary challenge validation Include for name: %s in: %s",
|
||||
vhost.name, vhost.filep)
|
||||
"Adding a temporary challenge validation Include for name: %s " +
|
||||
"in: %s", vhost.name, vhost.filep)
|
||||
self.configurator.parser.add_dir_beginning(
|
||||
vhost.path, "Include", self.challenge_conf_pre)
|
||||
self.configurator.parser.add_dir(
|
||||
@@ -1,7 +1,7 @@
|
||||
"""Module contains classes used by the Apache Configurator."""
|
||||
import re
|
||||
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from certbot.plugins import common
|
||||
|
||||
|
||||
@@ -24,9 +24,9 @@ class Addr(common.Addr):
|
||||
return not self.__eq__(other)
|
||||
|
||||
def __repr__(self):
|
||||
return "certbot_apache._internal.obj.Addr(" + repr(self.tup) + ")"
|
||||
return "certbot_apache.obj.Addr(" + repr(self.tup) + ")"
|
||||
|
||||
def __hash__(self): # pylint: disable=useless-super-delegation
|
||||
def __hash__(self):
|
||||
# Python 3 requires explicit overridden for __hash__ if __eq__ or
|
||||
# __cmp__ is overridden. See https://bugs.python.org/issue2235
|
||||
return super(Addr, self).__hash__()
|
||||
@@ -47,7 +47,8 @@ class Addr(common.Addr):
|
||||
return 0
|
||||
elif self.get_addr() == "*":
|
||||
return 1
|
||||
return 2
|
||||
else:
|
||||
return 2
|
||||
|
||||
def conflicts(self, addr):
|
||||
r"""Returns if address could conflict with correct function of self.
|
||||
@@ -98,7 +99,7 @@ class Addr(common.Addr):
|
||||
return self.get_addr_obj(port)
|
||||
|
||||
|
||||
class VirtualHost(object):
|
||||
class VirtualHost(object): # pylint: disable=too-few-public-methods
|
||||
"""Represents an Apache Virtualhost.
|
||||
|
||||
:ivar str filep: file path of VH
|
||||
@@ -126,6 +127,7 @@ class VirtualHost(object):
|
||||
def __init__(self, filep, path, addrs, ssl, enabled, name=None,
|
||||
aliases=None, modmacro=False, ancestor=None):
|
||||
|
||||
# pylint: disable=too-many-arguments
|
||||
"""Initialize a VH."""
|
||||
self.filep = filep
|
||||
self.path = path
|
@@ -1,11 +1,11 @@
""" Distribution specific override class for Arch Linux """
import pkg_resources

import zope.interface

from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator

from certbot_apache import configurator

@zope.interface.provider(interfaces.IPluginFactory)
class ArchConfigurator(configurator.ApacheConfigurator):
@@ -27,5 +27,5 @@ class ArchConfigurator(configurator.ApacheConfigurator):
        handle_sites=False,
        challenge_location="/etc/httpd/conf",
        MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
            "certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
            "certbot_apache", "options-ssl-apache.conf")
    )
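Most of these override hunks change a single thing: where `MOD_SSL_CONF_SRC` is resolved from. A small sketch of the two resolutions shown in the diff (this assumes the certbot-apache package is installed; `pkg_resources.resource_filename` simply returns an absolute path inside it):

```
import os
import pkg_resources

# Newer layout: the options file ships under certbot_apache/_internal/
new_style = pkg_resources.resource_filename(
    "certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))

# Older layout this branch reverts to: the file sits at the package root
old_style = pkg_resources.resource_filename(
    "certbot_apache", "options-ssl-apache.conf")

print(new_style)
print(old_style)
```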
certbot-apache/certbot_apache/override_centos.py (new file, 68 lines)
@@ -0,0 +1,68 @@
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
import pkg_resources

import zope.interface

from certbot import interfaces

from certbot_apache import apache_util
from certbot_apache import configurator
from certbot_apache import parser

@zope.interface.provider(interfaces.IPluginFactory)
class CentOSConfigurator(configurator.ApacheConfigurator):
    """CentOS specific ApacheConfigurator override class"""

    OS_DEFAULTS = dict(
        server_root="/etc/httpd",
        vhost_root="/etc/httpd/conf.d",
        vhost_files="*.conf",
        logs_root="/var/log/httpd",
        ctl="apachectl",
        version_cmd=['apachectl', '-v'],
        restart_cmd=['apachectl', 'graceful'],
        restart_cmd_alt=['apachectl', 'restart'],
        conftest_cmd=['apachectl', 'configtest'],
        enmod=None,
        dismod=None,
        le_vhost_ext="-le-ssl.conf",
        handle_modules=False,
        handle_sites=False,
        challenge_location="/etc/httpd/conf.d",
        MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
            "certbot_apache", "centos-options-ssl-apache.conf")
    )

    def _prepare_options(self):
        """
        Override the options dictionary initialization in order to support
        alternative restart cmd used in CentOS.
        """
        super(CentOSConfigurator, self)._prepare_options()
        self.options["restart_cmd_alt"][0] = self.option("ctl")

    def get_parser(self):
        """Initializes the ApacheParser"""
        return CentOSParser(
            self.aug, self.option("server_root"), self.option("vhost_root"),
            self.version, configurator=self)


class CentOSParser(parser.ApacheParser):
    """CentOS specific ApacheParser override class"""
    def __init__(self, *args, **kwargs):
        # CentOS specific configuration file for Apache
        self.sysconfig_filep = "/etc/sysconfig/httpd"
        super(CentOSParser, self).__init__(*args, **kwargs)

    def update_runtime_variables(self):
        """ Override for update_runtime_variables for custom parsing """
        # Opportunistic, works if SELinux not enforced
        super(CentOSParser, self).update_runtime_variables()
        self.parse_sysconfig_var()

    def parse_sysconfig_var(self):
        """ Parses Apache CLI options from CentOS configuration file """
        defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
        for k in defines.keys():
            self.variables[k] = defines[k]
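`CentOSParser.parse_sysconfig_var` above pulls extra `-D` defines out of `/etc/sysconfig/httpd` via `apache_util.parse_define_file`. That helper is not shown in this diff; the sketch below is only an illustration of the idea (an `OPTIONS="-DSSL -DFOO=bar"` style line turned into a dict) and may differ from the real implementation:

```
import re

def parse_defines_from_sysconfig(path, variable="OPTIONS"):
    """Collect -D defines from a sysconfig-style file, e.g. OPTIONS="-DSSL -DFOO=bar"."""
    defines = {}
    with open(path) as fh:
        for line in fh:
            match = re.match(r'\s*%s=(.*)' % re.escape(variable), line)
            if not match:
                continue
            raw = match.group(1).strip().strip('"\'')
            for token in raw.split():
                if token.startswith("-D"):
                    name, _, value = token[2:].partition("=")
                    defines[name] = value
    return defines

# Hypothetical usage:
# parse_defines_from_sysconfig("/etc/sysconfig/httpd")  # -> {"SSL": "", "FOO": "bar"}
```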
@@ -1,11 +1,11 @@
""" Distribution specific override class for macOS """
import pkg_resources

import zope.interface

from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator

from certbot_apache import configurator

@zope.interface.provider(interfaces.IPluginFactory)
class DarwinConfigurator(configurator.ApacheConfigurator):
@@ -27,5 +27,5 @@ class DarwinConfigurator(configurator.ApacheConfigurator):
        handle_sites=False,
        challenge_location="/etc/apache2/other",
        MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
            "certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
            "certbot_apache", "options-ssl-apache.conf")
    )
@@ -1,20 +1,19 @@
|
||||
""" Distribution specific override class for Debian family (Ubuntu/Debian) """
|
||||
import logging
|
||||
|
||||
import os
|
||||
import pkg_resources
|
||||
|
||||
import zope.interface
|
||||
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
from certbot_apache._internal import apache_util
|
||||
from certbot_apache._internal import configurator
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import configurator
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
"""Debian specific ApacheConfigurator override class"""
|
||||
@@ -35,7 +34,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
handle_sites=True,
|
||||
challenge_location="/etc/apache2",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
|
||||
"certbot_apache", "options-ssl-apache.conf")
|
||||
)
|
||||
|
||||
def enable_site(self, vhost):
|
||||
@@ -45,14 +44,14 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
modules are enabled appropriately.
|
||||
|
||||
:param vhost: vhost to enable
|
||||
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
|
||||
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
|
||||
|
||||
:raises .errors.NotSupportedError: If filesystem layout is not
|
||||
supported.
|
||||
|
||||
"""
|
||||
if vhost.enabled:
|
||||
return None
|
||||
return
|
||||
|
||||
enabled_path = ("%s/sites-enabled/%s" %
|
||||
(self.parser.root,
|
||||
@@ -65,25 +64,26 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
try:
|
||||
os.symlink(vhost.filep, enabled_path)
|
||||
except OSError as err:
|
||||
if os.path.islink(enabled_path) and filesystem.realpath(
|
||||
if os.path.islink(enabled_path) and os.path.realpath(
|
||||
enabled_path) == vhost.filep:
|
||||
# Already in shape
|
||||
vhost.enabled = True
|
||||
return None
|
||||
logger.warning(
|
||||
"Could not symlink %s to %s, got error: %s", enabled_path,
|
||||
vhost.filep, err.strerror)
|
||||
errstring = ("Encountered error while trying to enable a " +
|
||||
"newly created VirtualHost located at {0} by " +
|
||||
"linking to it from {1}")
|
||||
raise errors.NotSupportedError(errstring.format(vhost.filep,
|
||||
enabled_path))
|
||||
return
|
||||
else:
|
||||
logger.warning(
|
||||
"Could not symlink %s to %s, got error: %s", enabled_path,
|
||||
vhost.filep, err.strerror)
|
||||
errstring = ("Encountered error while trying to enable a " +
|
||||
"newly created VirtualHost located at {0} by " +
|
||||
"linking to it from {1}")
|
||||
raise errors.NotSupportedError(errstring.format(vhost.filep,
|
||||
enabled_path))
|
||||
vhost.enabled = True
|
||||
logger.info("Enabling available site: %s", vhost.filep)
|
||||
self.save_notes += "Enabled site %s\n" % vhost.filep
|
||||
return None
|
||||
|
||||
def enable_mod(self, mod_name, temp=False):
|
||||
# pylint: disable=unused-argument
|
||||
"""Enables module in Apache.
|
||||
|
||||
Both enables and reloads Apache so module is active.
|
||||
@@ -1,13 +1,13 @@
|
||||
""" Distribution specific override class for Gentoo Linux """
|
||||
import pkg_resources
|
||||
|
||||
import zope.interface
|
||||
|
||||
from certbot import interfaces
|
||||
from certbot.compat import os
|
||||
from certbot_apache._internal import apache_util
|
||||
from certbot_apache._internal import configurator
|
||||
from certbot_apache._internal import parser
|
||||
|
||||
from certbot_apache import apache_util
|
||||
from certbot_apache import configurator
|
||||
from certbot_apache import parser
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class GentooConfigurator(configurator.ApacheConfigurator):
|
||||
@@ -30,7 +30,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/apache2/vhosts.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
|
||||
"certbot_apache", "options-ssl-apache.conf")
|
||||
)
|
||||
|
||||
def _prepare_options(self):
|
||||
@@ -44,7 +44,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return GentooParser(
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.aug, self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
|
||||
@@ -64,7 +64,7 @@ class GentooParser(parser.ApacheParser):
|
||||
""" Parses Apache CLI options from Gentoo configuration file """
|
||||
defines = apache_util.parse_define_file(self.apacheconfig_filep,
|
||||
"APACHE2_OPTS")
|
||||
for k in defines:
|
||||
for k in defines.keys():
|
||||
self.variables[k] = defines[k]
|
||||
|
||||
def update_modules(self):
|
||||
@@ -1,11 +1,11 @@
|
||||
""" Distribution specific override class for OpenSUSE """
|
||||
import pkg_resources
|
||||
|
||||
import zope.interface
|
||||
|
||||
from certbot import interfaces
|
||||
from certbot.compat import os
|
||||
from certbot_apache._internal import configurator
|
||||
|
||||
from certbot_apache import configurator
|
||||
|
||||
@zope.interface.provider(interfaces.IPluginFactory)
|
||||
class OpenSUSEConfigurator(configurator.ApacheConfigurator):
|
||||
@@ -27,5 +27,5 @@ class OpenSUSEConfigurator(configurator.ApacheConfigurator):
|
||||
handle_sites=False,
|
||||
challenge_location="/etc/apache2/vhosts.d",
|
||||
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
|
||||
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
|
||||
"certbot_apache", "options-ssl-apache.conf")
|
||||
)
|
||||
@@ -2,23 +2,21 @@
|
||||
import copy
|
||||
import fnmatch
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
import six
|
||||
|
||||
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
|
||||
from acme.magic_typing import Dict, List, Set # pylint: disable=unused-import, no-name-in-module
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
from certbot_apache._internal import constants
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ApacheParser(object):
|
||||
# pylint: disable=too-many-public-methods
|
||||
"""Class handles the fine details of parsing the Apache Configuration.
|
||||
|
||||
.. todo:: Make parsing general... remove sites-available etc...
|
||||
@@ -33,7 +31,7 @@ class ApacheParser(object):
|
||||
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
|
||||
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
|
||||
|
||||
def __init__(self, root, vhostroot=None, version=(2, 4),
|
||||
def __init__(self, aug, root, vhostroot=None, version=(2, 4),
|
||||
configurator=None):
|
||||
# Note: Order is important here.
|
||||
|
||||
@@ -42,20 +40,11 @@ class ApacheParser(object):
|
||||
# issues with aug.load() after adding new files / defines to parse tree
|
||||
self.configurator = configurator
|
||||
|
||||
# Initialize augeas
|
||||
self.aug = None
|
||||
self.init_augeas()
|
||||
|
||||
if not self.check_aug_version():
|
||||
raise errors.NotSupportedError(
|
||||
"Apache plugin support requires libaugeas0 and augeas-lenses "
|
||||
"version 1.2.0 or higher, please make sure you have you have "
|
||||
"those installed.")
|
||||
|
||||
self.modules = set() # type: Set[str]
|
||||
self.parser_paths = {} # type: Dict[str, List[str]]
|
||||
self.variables = {} # type: Dict[str, str]
|
||||
|
||||
self.aug = aug
|
||||
# Find configuration root and make sure augeas can parse it.
|
||||
self.root = os.path.abspath(root)
|
||||
self.loc = {"root": self._find_config_root()}
|
||||
@@ -87,146 +76,6 @@ class ApacheParser(object):
|
||||
if self.find_dir("Define", exclude=False):
|
||||
raise errors.PluginError("Error parsing runtime variables")
|
||||
|
||||
def init_augeas(self):
|
||||
""" Initialize the actual Augeas instance """
|
||||
|
||||
try:
|
||||
import augeas
|
||||
except ImportError: # pragma: no cover
|
||||
raise errors.NoInstallationError("Problem in Augeas installation")
|
||||
|
||||
self.aug = augeas.Augeas(
|
||||
# specify a directory to load our preferred lens from
|
||||
loadpath=constants.AUGEAS_LENS_DIR,
|
||||
# Do not save backup (we do it ourselves), do not load
|
||||
# anything by default
|
||||
flags=(augeas.Augeas.NONE |
|
||||
augeas.Augeas.NO_MODL_AUTOLOAD |
|
||||
augeas.Augeas.ENABLE_SPAN))
|
||||
|
||||
def check_parsing_errors(self, lens):
|
||||
"""Verify Augeas can parse all of the lens files.
|
||||
|
||||
:param str lens: lens to check for errors
|
||||
|
||||
:raises .errors.PluginError: If there has been an error in parsing with
|
||||
the specified lens.
|
||||
|
||||
"""
|
||||
error_files = self.aug.match("/augeas//error")
|
||||
|
||||
for path in error_files:
|
||||
# Check to see if it was an error resulting from the use of
|
||||
# the httpd lens
|
||||
lens_path = self.aug.get(path + "/lens")
|
||||
# As aug.get may return null
|
||||
if lens_path and lens in lens_path:
|
||||
msg = (
|
||||
"There has been an error in parsing the file {0} on line {1}: "
|
||||
"{2}".format(
|
||||
# Strip off /augeas/files and /error
|
||||
path[13:len(path) - 6],
|
||||
self.aug.get(path + "/line"),
|
||||
self.aug.get(path + "/message")))
|
||||
raise errors.PluginError(msg)
|
||||
|
||||
def check_aug_version(self):
|
||||
""" Checks that we have recent enough version of libaugeas.
|
||||
If augeas version is recent enough, it will support case insensitive
|
||||
regexp matching"""
|
||||
|
||||
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
|
||||
try:
|
||||
matches = self.aug.match(
|
||||
"/test//*[self::arg=~regexp('argument', 'i')]")
|
||||
except RuntimeError:
|
||||
self.aug.remove("/test/path")
|
||||
return False
|
||||
self.aug.remove("/test/path")
|
||||
return matches
|
||||
|
||||
def unsaved_files(self):
|
||||
"""Lists files that have modified Augeas DOM but the changes have not
|
||||
been written to the filesystem yet, used by `self.save()` and
|
||||
ApacheConfigurator to check the file state.
|
||||
|
||||
:raises .errors.PluginError: If there was an error in Augeas, in
|
||||
an attempt to save the configuration, or an error creating a
|
||||
checkpoint
|
||||
|
||||
:returns: `set` of unsaved files
|
||||
"""
|
||||
save_state = self.aug.get("/augeas/save")
|
||||
self.aug.set("/augeas/save", "noop")
|
||||
# Existing Errors
|
||||
ex_errs = self.aug.match("/augeas//error")
|
||||
try:
|
||||
# This is a noop save
|
||||
self.aug.save()
|
||||
except (RuntimeError, IOError):
|
||||
self._log_save_errors(ex_errs)
|
||||
# Erase Save Notes
|
||||
self.configurator.save_notes = ""
|
||||
raise errors.PluginError(
|
||||
"Error saving files, check logs for more info.")
|
||||
|
||||
# Return the original save method
|
||||
self.aug.set("/augeas/save", save_state)
|
||||
|
||||
# Retrieve list of modified files
|
||||
# Note: Noop saves can cause the file to be listed twice, I used a
|
||||
# set to remove this possibility. This is a known augeas 0.10 error.
|
||||
save_paths = self.aug.match("/augeas/events/saved")
|
||||
|
||||
save_files = set()
|
||||
if save_paths:
|
||||
for path in save_paths:
|
||||
save_files.add(self.aug.get(path)[6:])
|
||||
return save_files
|
||||
|
||||
def ensure_augeas_state(self):
|
||||
"""Makes sure that all Augeas dom changes are written to files to avoid
|
||||
loss of configuration directives when doing additional augeas parsing,
|
||||
causing a possible augeas.load() resulting dom reset
|
||||
"""
|
||||
|
||||
if self.unsaved_files():
|
||||
self.configurator.save_notes += "(autosave)"
|
||||
self.configurator.save()
|
||||
|
||||
def save(self, save_files):
|
||||
"""Saves all changes to the configuration files.
|
||||
|
||||
save() is called from ApacheConfigurator to handle the parser specific
|
||||
tasks of saving.
|
||||
|
||||
:param list save_files: list of strings of file paths that we need to save.
|
||||
|
||||
"""
|
||||
self.configurator.save_notes = ""
|
||||
self.aug.save()
|
||||
|
||||
# Force reload if files were modified
|
||||
# This is needed to recalculate augeas directive span
|
||||
if save_files:
|
||||
for sf in save_files:
|
||||
self.aug.remove("/files/"+sf)
|
||||
self.aug.load()
|
||||
|
||||
def _log_save_errors(self, ex_errs):
|
||||
"""Log errors due to bad Augeas save.
|
||||
|
||||
:param list ex_errs: Existing errors before save
|
||||
|
||||
"""
|
||||
# Check for the root of save problems
|
||||
new_errs = self.aug.match("/augeas//error")
|
||||
# logger.error("During Save - %s", mod_conf)
|
||||
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
|
||||
", ".join(err[13:len(err) - 6] for err in new_errs
|
||||
# Only new errors caused by recent save
|
||||
if err not in ex_errs), self.configurator.save_notes)
|
||||
|
||||
def add_include(self, main_config, inc_path):
|
||||
"""Add Include for a new configuration file if one does not exist
|
||||
|
||||
@@ -234,7 +83,7 @@ class ApacheParser(object):
|
||||
:param str inc_path: path of file to include
|
||||
|
||||
"""
|
||||
if not self.find_dir(case_i("Include"), inc_path):
|
||||
if len(self.find_dir(case_i("Include"), inc_path)) == 0:
|
||||
logger.debug("Adding Include %s to %s",
|
||||
inc_path, get_aug_path(main_config))
|
||||
self.add_dir(
|
||||
@@ -244,7 +93,12 @@ class ApacheParser(object):
|
||||
# Add new path to parser paths
|
||||
new_dir = os.path.dirname(inc_path)
|
||||
new_file = os.path.basename(inc_path)
|
||||
self.existing_paths.setdefault(new_dir, []).append(new_file)
|
||||
if new_dir in self.existing_paths.keys():
|
||||
# Add to existing path
|
||||
self.existing_paths[new_dir].append(new_file)
|
||||
else:
|
||||
# Create a new path
|
||||
self.existing_paths[new_dir] = [new_file]
|
||||
|
||||
def add_mod(self, mod_name):
|
||||
"""Shortcut for updating parser modules."""
|
||||
@@ -284,8 +138,8 @@ class ApacheParser(object):
|
||||
mods.add(mod_name)
|
||||
mods.add(os.path.basename(mod_filename)[:-2] + "c")
|
||||
else:
|
||||
logger.debug("Could not read LoadModule directive from Augeas path: %s",
|
||||
match_name[6:])
|
||||
logger.debug("Could not read LoadModule directive from " +
|
||||
"Augeas path: {0}".format(match_name[6:]))
|
||||
self.modules.update(mods)
|
||||
|
||||
def update_runtime_variables(self):
|
||||
@@ -375,8 +229,8 @@ class ApacheParser(object):
|
||||
"Error running command %s for runtime parameters!%s",
|
||||
command, os.linesep)
|
||||
raise errors.MisconfigurationError(
|
||||
"Error accessing loaded Apache parameters: {0}".format(
|
||||
command))
|
||||
"Error accessing loaded Apache parameters: %s",
|
||||
command)
|
||||
# Small errors that do not impede
|
||||
if proc.returncode != 0:
|
||||
logger.warning("Error in checking parameter list: %s", stderr)
|
||||
@@ -402,12 +256,12 @@ class ApacheParser(object):
|
||||
"""
|
||||
filtered = []
|
||||
if args == 1:
|
||||
for i, match in enumerate(matches):
|
||||
if match.endswith("/arg"):
|
||||
for i in range(len(matches)):
|
||||
if matches[i].endswith("/arg"):
|
||||
filtered.append(matches[i][:-4])
|
||||
else:
|
||||
for i, match in enumerate(matches):
|
||||
if match.endswith("/arg[%d]" % args):
|
||||
for i in range(len(matches)):
|
||||
if matches[i].endswith("/arg[%d]" % args):
|
||||
# Make sure we don't cause an IndexError (end of list)
|
||||
# Check to make sure arg + 1 doesn't exist
|
||||
if (i == (len(matches) - 1) or
|
||||
@@ -432,7 +286,7 @@ class ApacheParser(object):
|
||||
"""
|
||||
# TODO: Add error checking code... does the path given even exist?
|
||||
# Does it throw exceptions?
|
||||
if_mod_path = self.get_ifmod(aug_conf_path, "mod_ssl.c")
|
||||
if_mod_path = self._get_ifmod(aug_conf_path, "mod_ssl.c")
|
||||
# IfModule can have only one valid argument, so append after
|
||||
self.aug.insert(if_mod_path + "arg", "directive", False)
|
||||
nvh_path = if_mod_path + "directive[1]"
|
||||
@@ -443,54 +297,22 @@ class ApacheParser(object):
|
||||
for i, arg in enumerate(args):
|
||||
self.aug.set("%s/arg[%d]" % (nvh_path, i + 1), arg)
|
||||
|
||||
def get_ifmod(self, aug_conf_path, mod, beginning=False):
|
||||
def _get_ifmod(self, aug_conf_path, mod):
|
||||
"""Returns the path to <IfMod mod> and creates one if it doesn't exist.
|
||||
|
||||
:param str aug_conf_path: Augeas configuration path
|
||||
:param str mod: module ie. mod_ssl.c
|
||||
:param bool beginning: If the IfModule should be created to the beginning
|
||||
of augeas path DOM tree.
|
||||
|
||||
:returns: Augeas path of the requested IfModule directive that pre-existed
|
||||
or was created during the process. The path may be dynamic,
|
||||
i.e. .../IfModule[last()]
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
if_mods = self.aug.match(("%s/IfModule/*[self::arg='%s']" %
|
||||
(aug_conf_path, mod)))
|
||||
if not if_mods:
|
||||
return self.create_ifmod(aug_conf_path, mod, beginning)
|
||||
|
||||
if len(if_mods) == 0:
|
||||
self.aug.set("%s/IfModule[last() + 1]" % aug_conf_path, "")
|
||||
self.aug.set("%s/IfModule[last()]/arg" % aug_conf_path, mod)
|
||||
if_mods = self.aug.match(("%s/IfModule/*[self::arg='%s']" %
|
||||
(aug_conf_path, mod)))
|
||||
# Strip off "arg" at end of first ifmod path
|
||||
return if_mods[0].rpartition("arg")[0]
|
||||
|
||||
def create_ifmod(self, aug_conf_path, mod, beginning=False):
|
||||
"""Creates a new <IfMod mod> and returns its path.
|
||||
|
||||
:param str aug_conf_path: Augeas configuration path
|
||||
:param str mod: module ie. mod_ssl.c
|
||||
:param bool beginning: If the IfModule should be created to the beginning
|
||||
of augeas path DOM tree.
|
||||
|
||||
:returns: Augeas path of the newly created IfModule directive.
|
||||
The path may be dynamic, i.e. .../IfModule[last()]
|
||||
:rtype: str
|
||||
|
||||
"""
|
||||
if beginning:
|
||||
c_path_arg = "{}/IfModule[1]/arg".format(aug_conf_path)
|
||||
# Insert IfModule before the first directive
|
||||
self.aug.insert("{}/directive[1]".format(aug_conf_path),
|
||||
"IfModule", True)
|
||||
retpath = "{}/IfModule[1]/".format(aug_conf_path)
|
||||
else:
|
||||
c_path = "{}/IfModule[last() + 1]".format(aug_conf_path)
|
||||
c_path_arg = "{}/IfModule[last()]/arg".format(aug_conf_path)
|
||||
self.aug.set(c_path, "")
|
||||
retpath = "{}/IfModule[last()]/".format(aug_conf_path)
|
||||
self.aug.set(c_path_arg, mod)
|
||||
return retpath
|
||||
return if_mods[0][:len(if_mods[0]) - 3]
|
||||
|
||||
def add_dir(self, aug_conf_path, directive, args):
|
||||
"""Appends directive to the end fo the file given by aug_conf_path.
|
||||
@@ -625,7 +447,7 @@ class ApacheParser(object):
|
||||
# https://httpd.apache.org/docs/2.4/mod/core.html#include
|
||||
for match in matches:
|
||||
dir_ = self.aug.get(match).lower()
|
||||
if dir_ in ("include", "includeoptional"):
|
||||
if dir_ == "include" or dir_ == "includeoptional":
|
||||
ordered_matches.extend(self.find_dir(
|
||||
directive, arg,
|
||||
self._get_include_path(self.get_arg(match + "/arg")),
|
||||
@@ -636,20 +458,6 @@ class ApacheParser(object):
|
||||
|
||||
return ordered_matches
|
||||
|
||||
def get_all_args(self, match):
|
||||
"""
|
||||
Tries to fetch all arguments for a directive. See get_arg.
|
||||
|
||||
Note that if match is an ancestor node, it returns all names of
|
||||
child directives as well as the list of arguments.
|
||||
|
||||
"""
|
||||
|
||||
if match[-1] != "/":
|
||||
match = match+"/"
|
||||
allargs = self.aug.match(match + '*')
|
||||
return [self.get_arg(arg) for arg in allargs]
|
||||
|
||||
def get_arg(self, match):
|
||||
"""Uses augeas.get to get argument value and interprets result.
|
||||
|
||||
@@ -665,7 +473,8 @@ class ApacheParser(object):
|
||||
# e.g. strip now, not later
|
||||
if not value:
|
||||
return None
|
||||
value = value.strip("'\"")
|
||||
else:
|
||||
value = value.strip("'\"")
|
||||
|
||||
variables = ApacheParser.arg_var_interpreter.findall(value)
|
||||
|
||||
@@ -792,8 +601,9 @@ class ApacheParser(object):
|
||||
if sys.version_info < (3, 6):
|
||||
# This strips off final /Z(?ms)
|
||||
return fnmatch.translate(clean_fn_match)[:-7]
|
||||
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
|
||||
return fnmatch.translate(clean_fn_match)[4:-3] # pragma: no cover
|
||||
else: # pragma: no cover
|
||||
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
|
||||
return fnmatch.translate(clean_fn_match)[4:-3]
|
||||
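The `fn_match` hunk above exists because `fnmatch.translate` changed its output format: before Python 3.6 it appended `\Z(?ms)`, from 3.6 on it wraps the pattern as `(?s:...)\Z`, hence the different slicing used to recover a bare regex for Augeas matching. A quick demonstration (run on Python 3.6 or later):

```
import fnmatch

pattern = fnmatch.translate("*.conf")
print(pattern)        # (?s:.*\.conf)\Z  on Python 3.6+
print(pattern[4:-3])  # .*\.conf         -- the bare regex used in the Augeas match

# Pre-3.6 interpreters returned '.*\.conf\Z(?ms)' instead,
# which is what the pattern[:-7] branch in the hunk handles.
```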
|
||||
def parse_file(self, filepath):
|
||||
"""Parse file with Augeas
|
||||
@@ -807,7 +617,8 @@ class ApacheParser(object):
|
||||
use_new, remove_old = self._check_path_actions(filepath)
|
||||
# Ensure that we have the latest Augeas DOM state on disk before
|
||||
# calling aug.load() which reloads the state from disk
|
||||
self.ensure_augeas_state()
|
||||
if self.configurator:
|
||||
self.configurator.ensure_augeas_state()
|
||||
# Test if augeas included file for Httpd.lens
|
||||
# Note: This works for augeas globs, ie. *.conf
|
||||
if use_new:
|
||||
@@ -874,7 +685,10 @@ class ApacheParser(object):
|
||||
use_new = False
|
||||
else:
|
||||
use_new = True
|
||||
remove_old = new_file_match == "*"
|
||||
if new_file_match == "*":
|
||||
remove_old = True
|
||||
else:
|
||||
remove_old = False
|
||||
except KeyError:
|
||||
use_new = True
|
||||
remove_old = False
|
||||
certbot-apache/certbot_apache/tests/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
"""Certbot Apache Tests"""
Some files were not shown because too many files have changed in this diff.