Compare commits


16 Commits

Author SHA1 Message Date
Brad Warren
2cbacd1219 test quiet and fast 2019-06-14 13:00:08 -07:00
Adrien Ferrand
b91dc709bf Clear implementation 2019-06-14 21:00:40 +02:00
Adrien Ferrand
8e47b1cca6 Update acme_server.py 2019-06-14 19:59:07 +02:00
Adrien Ferrand
675428e43a Update proxy.py 2019-06-14 19:56:49 +02:00
Adrien Ferrand
325c611d72 Update proxy.py 2019-06-14 19:54:49 +02:00
Adrien Ferrand
6ce34d0447 Update acme_server.py 2019-06-14 19:25:02 +02:00
Adrien Ferrand
a9921351fb Update proxy.py 2019-06-14 19:22:49 +02:00
Adrien Ferrand
d7d787f225 Update constants.py 2019-06-14 19:13:22 +02:00
Adrien Ferrand
d11408b46b Give proxy process to the ACMEServer context manager 2019-06-14 17:17:02 +02:00
Adrien Ferrand
563ed73abc Python 2/3 cross-compatibility 2019-06-14 17:16:44 +02:00
Adrien Ferrand
c3cd7412b9 Merge branch 'master' into certbot-ci-python-proxy
# Conflicts:
#	certbot-ci/certbot_integration_tests/utils/acme_server.py
2019-06-14 16:59:23 +02:00
Adrien Ferrand
f6c7fe8b2d Resolve from the path 2019-06-14 16:53:39 +02:00
Adrien Ferrand
d97c395133 Working logic 2019-06-14 16:52:03 +02:00
Adrien Ferrand
d60b140986 Refactor proxy config 2019-06-14 15:58:36 +02:00
Adrien Ferrand
a8751deeaa Work in progress 2019-06-13 23:56:42 +02:00
Adrien Ferrand
bd011375ae Create a python proxy 2019-06-13 15:39:07 +02:00
862 changed files with 10966 additions and 10558 deletions

View File

@@ -1,119 +0,0 @@
# Configuring Azure Pipelines with Certbot
All pipelines are defined in the `.azure-pipelines` directory. Currently there are two:
* `.azure-pipelines/main.yml` is the main pipeline; it is executed for PRs targeting master and for pushes to master,
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline; it is executed for `test-*` branches, release branches, and in a nightly run on master.
Several templates are defined in `.azure-pipelines/templates`. These YAML files gather common job configuration that can be reused across several pipelines.
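For reference, a pipeline file consumes one of these templates with a single `template` entry under `jobs`; the minimal sketch below mirrors the structure of `main.yml` shown later in this comparison (the branch filters are illustrative):
```
trigger:
- master
pr:
- master
jobs:
# Reuse the shared job definitions from the templates folder.
- template: templates/tests-suite.yml
```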
Unlike Travis, where Codecov works without any action required, Codecov supports Azure Pipelines
(through its bash uploader, not the Python package for now) only if you provide the Codecov repository token
via the `CODECOV_TOKEN` environment variable. `CODECOV_TOKEN` therefore needs to be set as a secret
environment variable to allow the main pipeline to publish coverage reports to Codecov.
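As a concrete illustration, the coverage upload step defined in `templates/tests-suite.yml` (shown later in this comparison) consumes the secret roughly as follows:
```
- bash: ./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
  env:
    # codecov_token is the secret variable configured in Azure DevOps
    CODECOV_TOKEN: $(codecov_token)
  displayName: Publish coverage
```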
This INSTALL.md file explains how to configure Azure Pipelines for Certbot in order to execute the CI/CD logic defined in the `.azure-pipelines` folder.
During these installation steps, notes describing the user access grants and legal commitments involved will be displayed like this:
```
!!! ACCESS REQUIRED !!!
```
This document assumes that the Azure DevOps organization is named _certbot_, and that the Azure DevOps project is also named _certbot_.
## Useful links
* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops
## Prerequisites
### Having a GitHub account
Use your personal GitHub user, or a user that has administrative rights on the GitHub organization if relevant.
### Having an Azure DevOps account
- Go to https://dev.azure.com/, click "Start free with GitHub"
- Login to GitHub
```
!!! ACCESS REQUIRED !!!
Personal user data (email + profile info, in read-only)
```
- Microsoft will create a Live account using the email address referenced by the GitHub account. This account is also linked to the GitHub account (meaning you can log in to it using GitHub authentication)
- Proceed with the account registration (birth date, country), and add details about your name and contact email
```
!!! ACCESS REQUIRED !!!
Microsoft offers to send commercial communications to this email address
The Azure DevOps terms of service need to be accepted
```
_Once logged in to Azure DevOps, the account is ready._
### Installing Azure Pipelines on GitHub
- On GitHub, go to the Marketplace
- Select Azure Pipelines, and "Set up a plan"
- Select Free, then "Install it for free"
- Click "Complete order and begin installation"
```
!!! ACCESS REQUIRED !!!
Azure Pipelines needs read/write access on code, read-only on metadata, and read/write on checks, commit statuses, deployments, issues, and pull requests.
Read/write access is required here to allow updating the pipeline YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines builds are triggered.
Note however that no admin access is granted: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the related security settings on GitHub.
Access can be granted for all repositories or only selected ones, which is nice.
```
- Redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (name it the same as the targeted GitHub repo)
- Set the visibility to public, in order to benefit from 10 parallel jobs
```
!!! ACCESS REQUIRED !!!
Azure Pipelines needs access to the GitHub account (in terms of being able to check that it is valid), and to the resources shared between the GitHub account and Azure Pipelines.
```
_Done. We can move on to the pipeline configuration._
## Import an existing pipeline from the `.azure-pipelines` folder
- On Azure DevOps, go to your organization (e.g. _certbot_), then to your project (e.g. _certbot_)
- Click "Pipelines" tab
- Click "New pipeline"
- Where is your code?: select "__Use the classic editor__"
__Warning: do not choose the GitHub option in the "Where is your code?" section. That option triggers an OAuth
permission grant from Azure Pipelines to GitHub in order to set up a GitHub OAuth application. The permissions requested
there are far too broad (admin level on almost everything), while the classic approach does not add any extra
permissions and works perfectly well.__
- Select GitHub in the "Select your repository" section, choose certbot/certbot as the Repository and master as the default branch.
- Choose the YAML option for "Select a template"
- Choose a name for the pipeline (e.g. test-pipeline), and browse to the actual pipeline YAML definition in the
"YAML file path" input (e.g. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click the "Save and run" button.
_Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._
## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)
__NB: The following steps assume that you have already set up the YAML pipeline file to
consume, as an environment variable, the secret variable that these steps will create.
For an environment variable named `CODECOV_TOKEN` consuming the secret variable `codecov_token`,
this setup would take the following form in the YAML file:__
```
steps:
- script: ./do_something_that_consumes_CODECOV_TOKEN # Eg. `codecov -F windows`
  env:
    CODECOV_TOKEN: $(codecov_token)
```
To set up a variable that is shared between pipelines, follow the instructions
at
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
When adding variables to a group, don't forget to tick "Keep this value secret"
if the value shouldn't be shared publicly.
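A pipeline then pulls such a group in through its `variables` section, as `templates/tests-suite.yml` does with the `certbot-common` group:
```
variables:
# Every variable defined in the certbot-common group (e.g. codecov_token)
# becomes available to the jobs of this pipeline.
- group: certbot-common
```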

View File

@@ -1,19 +0,0 @@
# Advanced pipeline for isolated checks and release purpose
trigger:
- test-*
- '*.x'
pr:
- test-*
# This pipeline is also nightly run on master
schedules:
- cron: "0 4 * * *"
  displayName: Nightly build
  branches:
    include:
    - master
  always: true
jobs:
# Any addition here should be reflected in the release pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/installer-tests.yml

View File

@@ -1,12 +0,0 @@
trigger:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
pr:
- apache-parser-v2
- master
- '*.x'
jobs:
- template: templates/tests-suite.yml

View File

@@ -1,13 +0,0 @@
# Release pipeline to build and deploy Certbot for Windows for GitHub release tags
trigger:
  tags:
    include:
    - v*
pr: none
jobs:
# Any addition here should be reflected in the advanced pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml
- template: templates/changelog.yml

View File

@@ -1,14 +0,0 @@
jobs:
- job: changelog
  pool:
    vmImage: vs2017-win2016
  steps:
  - bash: |
      CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
      "${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
    displayName: Prepare changelog
  - task: PublishPipelineArtifact@1
    inputs:
      path: $(Build.ArtifactStagingDirectory)
      artifact: changelog
    displayName: Publish changelog

View File

@@ -1,15 +0,0 @@
jobs:
- job: installer_run
  strategy:
    matrix:
      win2019:
        imageName: windows-2019
      win2016:
        imageName: vs2017-win2016
      win2012r2:
        imageName: vs2015-win2012r2
  pool:
    vmImage: $(imageName)
  steps:
  - script: wusa /uninstall /kb:3134758 /quiet /norestart & exit 0
  - script: powershell -Command "$PSVersionTable.PSVersion"

View File

@@ -1,38 +0,0 @@
jobs:
- job: test
  pool:
    vmImage: vs2017-win2016
  strategy:
    matrix:
      py35:
        PYTHON_VERSION: 3.5
        TOXENV: py35
      py37-cover:
        PYTHON_VERSION: 3.7
        TOXENV: py37-cover
      integration-certbot:
        PYTHON_VERSION: 3.7
        TOXENV: integration-certbot
        PYTEST_ADDOPTS: --numprocesses 4
  variables:
  - group: certbot-common
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: $(PYTHON_VERSION)
      addToPath: true
  - script: python tools/pip_install.py -U tox coverage
    displayName: Install dependencies
  - script: python -m tox
    displayName: Run tox
  # We do not require the codecov report upload to succeed. So, to avoid breaking the pipeline if
  # something goes wrong, each command is suffixed with a fallback that hides any non-zero exit
  # code and echoes an informative message instead.
  - bash: |
      curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
      chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
      ./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
    condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
    env:
      CODECOV_TOKEN: $(codecov_token)
    displayName: Publish coverage

View File

@@ -4,15 +4,11 @@ coverage:
default: off
linux:
flags: linux
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.4
target: auto
threshold: 0.1
base: auto
windows:
flags: windows
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.4
target: auto
threshold: 0.1
base: auto

View File

@@ -1,5 +1,2 @@
[run]
omit = */setup.py
[report]
omit = */setup.py

9
.github/stale.yml vendored
View File

@@ -1,12 +1,11 @@
# Configuration for https://github.com/marketplace/stale
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365
daysUntilStale: 180
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30
daysUntilClose: 7
# Ignore issues with an assignee (defaults to false)
exemptAssignees: true
@@ -19,8 +18,8 @@ markComment: >
We've made a lot of changes to Certbot since this issue was opened. If you
still have this issue with an up-to-date version of Certbot, can you please
add a comment letting us know? This helps us to better see what issues are
still affecting our users. If there is no activity in the next 30 days, this
issue will be automatically closed.
still affecting our users. If there is no further activity, this issue will
be automatically closed.
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >

2
.gitignore vendored
View File

@@ -47,5 +47,3 @@ tests/letstest/venv/
# certbot tests
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*

View File

@@ -1,7 +0,0 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400

View File

@@ -24,11 +24,6 @@ persistent=yes
# usually to register additional checkers.
load-plugins=linter_plugin
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
[MESSAGES CONTROL]
@@ -46,14 +41,10 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used
[REPORTS]
@@ -306,6 +297,40 @@ valid-classmethod-first-arg=cls
valid-metaclass-classmethod-first-arg=mcs
[DESIGN]
# Maximum number of arguments for function / method
max-args=6
# Argument names that match this expression will be ignored. Default to name
# with leading underscore
ignored-argument-names=(unused)?_.*|dummy
# Maximum number of locals for function / method body
max-locals=15
# Maximum number of return / yield for function / method body
max-returns=6
# Maximum number of branch for function / method body
max-branches=12
# Maximum number of statements in function / method body
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=12
# Maximum number of attributes for a class (see R0902).
max-attributes=7
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
# Maximum number of public methods for a class (see R0904).
max-public-methods=20
[EXCEPTIONS]
# Exceptions that will emit a warning when being caught. Defaults to

View File

@@ -1,5 +1,4 @@
language: python
dist: xenial
cache:
directories:
@@ -9,8 +8,6 @@ before_script:
- 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
# On Travis, the fastest parallelization for integration tests has proved to be 4.
- 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
# Use Travis retry feature for farm tests since they are flaky
- 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
- export TOX_TESTENV_PASSENV=TRAVIS
# Only build pushes to the master branch, PRs, and branches beginning with
@@ -19,9 +16,6 @@ before_script:
# is a cap of on the number of simultaneous runs.
branches:
only:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
- /^\d+\.\d+\.x$/
- /^test-.*$/
@@ -30,16 +24,84 @@ branches:
not-on-master: &not-on-master
if: NOT (type = push AND branch = master)
# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
# Jobs for the extended test suite are executed for cron jobs and pushes on non-master branches.
extended-test-suite: &extended-test-suite
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
if: type = cron OR (type = push AND branch != master)
matrix:
include:
# This job is always executed, including on master
- python: "2.7"
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.4"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.4"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.5"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.5"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.6"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.6"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.7"
dist: xenial
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.7"
dist: xenial
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
# container-based infrastructure
sudo: false
@@ -48,6 +110,7 @@ addons:
apt:
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
- python-dev
- python-virtualenv
- gcc
- libaugeas0
- libssl-dev
@@ -57,17 +120,8 @@ addons:
- nginx-light
- openssl
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: 'tools/pip_install.py -U codecov tox virtualenv'
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
# script command. It is set only to `travis_retry` during farm tests, in
# order to trigger the Travis retry feature, and compensate the inherent
# flakiness of these specific tests.
script: '$TRAVIS_RETRY tox'
install: "$(command -v pip || command -v pip3) install codecov tox"
script: tox
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'

View File

@@ -15,10 +15,8 @@ Authors
* [Alex Gaynor](https://github.com/alex)
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [Andrew Murray](https://github.com/radarhere)
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [asaph](https://github.com/asaph)
@@ -128,7 +126,6 @@ Authors
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
* [Kane York](https://github.com/riking)
* [Kenichi Maehashi](https://github.com/kmaehashi)
* [Kenneth Skovhede](https://github.com/kenkendk)
* [Kevin Burke](https://github.com/kevinburke)
* [Kevin London](https://github.com/kevinlondon)
@@ -164,10 +161,8 @@ Authors
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
* [Michael Watters](https://github.com/blackknight36)
* [Michal Moravec](https://github.com/Majkl578)
* [Michal Papis](https://github.com/mpapis)
* [Mickaël Schoentgen](https://github.com/BoboTiG)
* [Minn Soe](https://github.com/MinnSoe)
* [Min RK](https://github.com/minrk)
* [Miquel Ruiz](https://github.com/miquelruiz)
@@ -231,7 +226,6 @@ Authors
* [Stavros Korokithakis](https://github.com/skorokithakis)
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)

View File

@@ -1 +0,0 @@
certbot/CHANGELOG.md

1687
CHANGELOG.md Normal file

File diff suppressed because it is too large

35
Dockerfile Normal file
View File

@@ -0,0 +1,35 @@
FROM python:2-alpine3.9
ENTRYPOINT [ "certbot" ]
EXPOSE 80 443
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot
COPY CHANGELOG.md README.rst setup.py src/
# Generate constraints file to pin dependency versions
COPY letsencrypt-auto-source/pieces/dependency-requirements.txt .
COPY tools /opt/certbot/tools
RUN sh -c 'cat dependency-requirements.txt | /opt/certbot/tools/strip_hashes.py > unhashed_requirements.txt'
RUN sh -c 'cat tools/dev_constraints.txt unhashed_requirements.txt | /opt/certbot/tools/merge_requirements.py > docker_constraints.txt'
COPY acme src/acme
COPY certbot src/certbot
RUN apk add --no-cache --virtual .certbot-deps \
libffi \
libssl1.1 \
openssl \
ca-certificates \
binutils
RUN apk add --no-cache --virtual .build-deps \
gcc \
linux-headers \
openssl-dev \
musl-dev \
libffi-dev \
&& pip install -r /opt/certbot/dependency-requirements.txt \
&& pip install --no-cache-dir --no-deps \
--editable /opt/certbot/src/acme \
--editable /opt/certbot/src \
&& apk del .build-deps

View File

@@ -1,20 +1,21 @@
# This Dockerfile builds an image for development.
FROM debian:buster
FROM ubuntu:xenial
# Note: this only exposes the port to other docker containers.
EXPOSE 80 443
WORKDIR /opt/certbot/src
# TODO: Install Apache/Nginx for plugin development.
COPY . .
RUN apt-get update && \
apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
apt-get install apache2 git nginx-light -y && \
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
RUN VENV_NAME="../venv3" python3 tools/venv3.py
RUN VENV_NAME="../venv" python tools/venv.py
ENV PATH /opt/certbot/venv3/bin:$PATH
ENV PATH /opt/certbot/venv/bin:$PATH

75
Dockerfile-old Normal file
View File

@@ -0,0 +1,75 @@
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
# it is more likely developers will already have ubuntu:trusty rather
# than e.g. debian:jessie and image size differences are negligible
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>
# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443
# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot
# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.
ENV DEBIAN_FRONTEND=noninteractive
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible
COPY setup.py README.rst CHANGELOG.md MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/
# all above files are necessary for setup.py and venv setup, however,
# package source code directory has to be copied separately to a
# subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters, three files are far
# more likely to be cached than the whole project directory
COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
RUN VIRTUALENV_NO_DOWNLOAD=1 virtualenv --no-site-packages -p python2 /opt/certbot/venv
# PATH is set now so pipstrap upgrades the correct (v)env
ENV PATH /opt/certbot/venv/bin:$PATH
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
/opt/certbot/venv/bin/pip install \
-e /opt/certbot/src/acme \
-e /opt/certbot/src \
-e /opt/certbot/src/certbot-apache \
-e /opt/certbot/src/certbot-nginx
# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.
# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
ENV PATH /opt/certbot/bin:$PATH
ENTRYPOINT [ "certbot" ]

View File

@@ -1,10 +1,9 @@
include README.rst
include CHANGELOG.md
include CONTRIBUTING.md
include LICENSE.txt
include linter_plugin.py
recursive-include docs *
recursive-include examples *
recursive-include certbot/tests/testdata *
recursive-include tests *.py
include certbot/ssl-dhparams.pem
global-exclude __pycache__
global-exclude *.py[cod]

View File

@@ -1 +0,0 @@
certbot/README.rst

131
README.rst Normal file
View File

@@ -0,0 +1,131 @@
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
Certbot is part of EFF's effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identity of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Let's Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Let's Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so there's no need to arrange payment.
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you'll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
Certbot is meant to be run directly on your web server, not on your personal computer. If you're using a hosted service and don't have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let's Encrypt.
Certbot is a fully-featured, extensible client for the Let's
Encrypt CA (or any other CA that speaks the `ACME
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
protocol) that can automate the tasks of obtaining certificates and
configuring webservers to use them. This client runs on Unix-based operating
systems.
To see the changes made to Certbot between versions please refer to our
`changelog <https://github.com/certbot/certbot/blob/master/CHANGELOG.md>`_.
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
depending on install method. Instructions on the Internet, and some pieces of the
software, may still refer to this older name.
Contributing
------------
If you'd like to contribute to this project please read `Developer Guide
<https://certbot.eff.org/docs/contributing.html>`_.
This project is governed by `EFF's Public Projects Code of Conduct <https://www.eff.org/pages/eppcode>`_.
.. _installation:
How to run the client
---------------------
The easiest way to install and run Certbot is by visiting `certbot.eff.org`_,
where you can find the correct instructions for many web server and OS
combinations. For more information, see `Get Certbot
<https://certbot.eff.org/docs/install.html>`_.
.. _certbot.eff.org: https://certbot.eff.org/
Understanding the client in more depth
--------------------------------------
To understand what the client is doing in detail, it's important to
understand the way it uses plugins. Please see the `explanation of
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
the User Guide.
Links
=====
.. Do not modify this comment unless you know what you're doing. tag:links-begin
Documentation: https://certbot.eff.org/docs
Software project: https://github.com/certbot/certbot
Notes for developers: https://certbot.eff.org/docs/contributing.html
Main Website: https://certbot.eff.org
Let's Encrypt Website: https://letsencrypt.org
Community: https://community.letsencrypt.org
ACME spec: http://ietf-wg-acme.github.io/acme/
ACME working area in github: https://github.com/ietf-wg-acme/acme
|build-status| |coverage| |docs| |container|
.. |build-status| image:: https://travis-ci.com/certbot/certbot.svg?branch=master
:target: https://travis-ci.com/certbot/certbot
:alt: Travis CI status
.. |coverage| image:: https://codecov.io/gh/certbot/certbot/branch/master/graph/badge.svg
:target: https://codecov.io/gh/certbot/certbot
:alt: Coverage status
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
:target: https://readthedocs.org/projects/letsencrypt/
:alt: Documentation status
.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
:target: https://quay.io/repository/letsencrypt/letsencrypt
:alt: Docker Repository on Quay.io
.. Do not modify this comment unless you know what you're doing. tag:links-end
System Requirements
===================
See https://certbot.eff.org/docs/install.html#system-requirements.
.. Do not modify this comment unless you know what you're doing. tag:intro-end
.. Do not modify this comment unless you know what you're doing. tag:features-begin
Current Features
=====================
* Supports multiple web servers:
- apache/2.x
- nginx/0.8.48+
- webroot (adds files to webroot directories in order to prove control of
domains and obtain certs)
- standalone (runs its own simple webserver to prove you control a domain)
- other server software via `third party plugins <https://certbot.eff.org/docs/using.html#third-party-plugins>`_
* The private key is generated locally on your system.
* Can talk to the Let's Encrypt CA or optionally to other ACME
compliant services.
* Can get domain-validated (DV) certificates.
* Can revoke certificates.
* Adjustable RSA key bit-length (2048 (default), 4096, ...).
* Can optionally install a http -> https redirect, so your site effectively
runs https only (Apache only)
* Fully automated.
* Configuration changes are logged and can be reverted.
* Supports an interactive text UI, or can be driven entirely from the
command line.
* Free and Open Source Software, made with Python.
.. Do not modify this comment unless you know what you're doing. tag:features-end
For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.

View File

@@ -3,6 +3,4 @@ include README.rst
include pytest.ini
recursive-include docs *
recursive-include examples *
recursive-include tests *
global-exclude __pycache__
global-exclude *.py[cod]
recursive-include acme/testdata *

View File

@@ -13,6 +13,7 @@ import warnings
#
# It is based on
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
import josepy as jose
for mod in list(sys.modules):
@@ -20,3 +21,30 @@ for mod in list(sys.modules):
# preserved (acme.jose.* is josepy.*)
if mod == 'josepy' or mod.startswith('josepy.'):
sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]
# This class takes a similar approach to the cryptography project to deprecate attributes
# in public modules. See the _ModuleWithDeprecation class here:
# https://github.com/pyca/cryptography/blob/91105952739442a74582d3e62b3d2111365b0dc7/src/cryptography/utils.py#L129
class _TLSSNI01DeprecationModule(object):
    """
    Internal class delegating to a module, and displaying warnings when
    attributes related to TLS-SNI-01 are accessed.
    """
    def __init__(self, module):
        self.__dict__['_module'] = module

    def __getattr__(self, attr):
        if 'TLSSNI01' in attr:
            warnings.warn('{0} attribute is deprecated, and will be removed soon.'.format(attr),
                          DeprecationWarning, stacklevel=2)
        return getattr(self._module, attr)

    def __setattr__(self, attr, value):  # pragma: no cover
        setattr(self._module, attr, value)

    def __delattr__(self, attr):  # pragma: no cover
        delattr(self._module, attr)

    def __dir__(self):  # pragma: no cover
        return ['_module'] + dir(self._module)

View File

@@ -3,13 +3,19 @@ import abc
import functools
import hashlib
import logging
import socket
import sys
from cryptography.hazmat.primitives import hashes # type: ignore
import josepy as jose
import OpenSSL
import requests
import six
from acme import errors
from acme import crypto_util
from acme import fields
from acme import _TLSSNI01DeprecationModule
logger = logging.getLogger(__name__)
@@ -54,7 +60,8 @@ class UnrecognizedChallenge(Challenge):
object.__setattr__(self, "jobj", jobj)
def to_partial_json(self):
return self.jobj # pylint: disable=no-member
# pylint: disable=no-member
return self.jobj
@classmethod
def from_json(cls, jobj):
@@ -112,7 +119,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
:rtype: bool
"""
parts = self.key_authorization.split('.')
parts = self.key_authorization.split('.') # pylint: disable=no-member
if len(parts) != 2:
logger.debug("Key authorization (%r) is not well formed",
self.key_authorization)
@@ -230,7 +237,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
return verified
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class DNS01(KeyAuthorizationChallenge):
"""ACME dns-01 challenge."""
response_cls = DNS01Response
@@ -320,7 +327,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
return True
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class HTTP01(KeyAuthorizationChallenge):
"""ACME http-01 challenge."""
response_cls = HTTP01Response
@@ -360,6 +367,148 @@ class HTTP01(KeyAuthorizationChallenge):
return self.key_authorization(account_key)
@ChallengeResponse.register
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
"""ACME tls-sni-01 challenge response."""
typ = "tls-sni-01"
DOMAIN_SUFFIX = b".acme.invalid"
"""Domain name suffix."""
PORT = 443
"""Verification port as defined by the protocol.
You can override it (e.g. for testing) by passing ``port`` to
`simple_verify`.
"""
@property
def z(self): # pylint: disable=invalid-name
"""``z`` value used for verification.
:rtype bytes:
"""
return hashlib.sha256(
self.key_authorization.encode("utf-8")).hexdigest().lower().encode()
@property
def z_domain(self):
"""Domain name used for verification, generated from `z`.
:rtype bytes:
"""
return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX
def gen_cert(self, key=None, bits=2048):
"""Generate tls-sni-01 certificate.
:param OpenSSL.crypto.PKey key: Optional private key used in
certificate generation. If not provided (``None``), then
fresh key will be generated.
:param int bits: Number of bits for newly generated key.
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
if key is None:
key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
return crypto_util.gen_ss_cert(key, [
# z_domain is too big to fit into CN, hence first dummy domain
'dummy', self.z_domain.decode()], force_san=True), key
def probe_cert(self, domain, **kwargs):
"""Probe tls-sni-01 challenge certificate.
:param unicode domain:
"""
# TODO: domain is not necessary if host is provided
if "host" not in kwargs:
host = socket.gethostbyname(domain)
logger.debug('%s resolved to %s', domain, host)
kwargs["host"] = host
kwargs.setdefault("port", self.PORT)
kwargs["name"] = self.z_domain
# TODO: try different methods?
return crypto_util.probe_sni(**kwargs)
def verify_cert(self, cert):
"""Verify tls-sni-01 challenge certificate.
:param OpenSSL.crypto.X509 cert: Challenge certificate.
:returns: Whether the certificate was successfully verified.
:rtype: bool
"""
# pylint: disable=protected-access
sans = crypto_util._pyopenssl_cert_or_req_san(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), sans)
return self.z_domain.decode() in sans
def simple_verify(self, chall, domain, account_public_key,
cert=None, **kwargs):
"""Simple verify.
Verify ``validation`` using ``account_public_key``, optionally
probe tls-sni-01 certificate and check using `verify_cert`.
:param .challenges.TLSSNI01 chall: Corresponding challenge.
:param str domain: Domain name being validated.
:param JWK account_public_key:
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
provided (``None``) certificate will be retrieved using
`probe_cert`.
:param int port: Port used to probe the certificate.
:returns: ``True`` iff client's control of the domain has been
verified.
:rtype: bool
"""
if not self.verify(chall, account_public_key):
logger.debug("Verification of key authorization in response failed")
return False
if cert is None:
try:
cert = self.probe_cert(domain=domain, **kwargs)
except errors.Error as error:
logger.debug(str(error), exc_info=True)
return False
return self.verify_cert(cert)
@Challenge.register # pylint: disable=too-many-ancestors
class TLSSNI01(KeyAuthorizationChallenge):
"""ACME tls-sni-01 challenge."""
response_cls = TLSSNI01Response
typ = response_cls.typ
# boulder#962, ietf-wg-acme#22
#n = jose.Field("n", encoder=int, decoder=int)
def validation(self, account_key, **kwargs):
"""Generate validation.
:param JWK account_key:
:param OpenSSL.crypto.PKey cert_key: Optional private key used
in certificate generation. If not provided (``None``), then
fresh key will be generated.
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))
@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""ACME TLS-ALPN-01 challenge response.
@@ -371,7 +520,7 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
typ = "tls-alpn-01"
@Challenge.register
@Challenge.register # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
"""ACME tls-alpn-01 challenge.
@@ -468,3 +617,7 @@ class DNSResponse(ChallengeResponse):
"""
return chall.check_validation(self.validation, account_public_key)
# Patching ourselves to warn about TLS-SNI challenge deprecation and removal.
sys.modules[__name__] = _TLSSNI01DeprecationModule(sys.modules[__name__])

View File

@@ -3,10 +3,13 @@ import unittest
import josepy as jose
import mock
import OpenSSL
import requests
from six.moves.urllib import parse as urllib_parse
import test_util
from six.moves.urllib import parse as urllib_parse # pylint: disable=relative-import
from acme import errors
from acme import test_util
CERT = test_util.load_comparable_cert('cert.pem')
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
@@ -18,6 +21,7 @@ class ChallengeTest(unittest.TestCase):
from acme.challenges import Challenge
from acme.challenges import UnrecognizedChallenge
chall = UnrecognizedChallenge({"type": "foo"})
# pylint: disable=no-member
self.assertEqual(chall, Challenge.from_json(chall.jobj))
@@ -73,6 +77,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):
class DNS01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import DNS01Response
@@ -144,6 +149,7 @@ class DNS01Test(unittest.TestCase):
class HTTP01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import HTTP01Response
@@ -253,7 +259,152 @@ class HTTP01Test(unittest.TestCase):
self.msg.update(token=b'..').good_token)
class TLSSNI01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import TLSSNI01
self.chall = TLSSNI01(
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
self.response = self.chall.response(KEY)
self.jmsg = {
'resource': 'challenge',
'type': 'tls-sni-01',
'keyAuthorization': self.response.key_authorization,
}
# pylint: disable=invalid-name
label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
label2 = b'b7793728f084394f2a1afd459556bb5c'
self.z = label1 + label2
self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
self.domain = 'foo.com'
def test_z_and_domain(self):
self.assertEqual(self.z, self.response.z)
self.assertEqual(self.z_domain, self.response.z_domain)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.response.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSSNI01Response
self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSSNI01Response
hash(TLSSNI01Response.from_json(self.jmsg))
@mock.patch('acme.challenges.socket.gethostbyname')
@mock.patch('acme.challenges.crypto_util.probe_sni')
def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
mock_gethostbyname.return_value = '127.0.0.1'
self.response.probe_cert('foo.com')
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host='127.0.0.1', port=self.response.PORT,
name=self.z_domain)
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host='8.8.8.8', port=mock.ANY, name=mock.ANY)
self.response.probe_cert('foo.com', port=1234)
mock_probe_sni.assert_called_with(
host=mock.ANY, port=1234, name=mock.ANY)
self.response.probe_cert('foo.com', bar='baz')
mock_probe_sni.assert_called_with(
host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')
self.response.probe_cert('foo.com', name=b'xxx')
mock_probe_sni.assert_called_with(
host=mock.ANY, port=mock.ANY,
name=self.z_domain)
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert()
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
self.assertTrue(self.response.verify_cert(cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(
test_util.load_cert('cert.pem')))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, mock.sentinel.cert)
@mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
mock_probe_cert.side_effect = errors.Error
self.assertFalse(self.response.simple_verify(
self.chall, self.domain, KEY.public_key()))
class TLSSNI01Test(unittest.TestCase):
def setUp(self):
self.jmsg = {
'type': 'tls-sni-01',
'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
}
from acme.challenges import TLSSNI01
self.msg = TLSSNI01(
token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
def test_to_partial_json(self):
self.assertEqual(self.jmsg, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSSNI01
self.assertEqual(self.msg, TLSSNI01.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSSNI01
hash(TLSSNI01.from_json(self.jmsg))
def test_from_json_invalid_token_length(self):
from acme.challenges import TLSSNI01
self.jmsg['token'] = jose.encode_b64jose(b'abcd')
self.assertRaises(
jose.DeserializationError, TLSSNI01.from_json, self.jmsg)
@mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
def test_validation(self, mock_gen_cert):
mock_gen_cert.return_value = ('cert', 'key')
self.assertEqual(('cert', 'key'), self.msg.validation(
KEY, cert_key=mock.sentinel.cert_key))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)
def test_deprecation_message(self):
with mock.patch('acme.warnings.warn') as mock_warn:
from acme.challenges import TLSSNI01
assert TLSSNI01
self.assertEqual(mock_warn.call_count, 1)
self.assertTrue('deprecated' in mock_warn.call_args[0][0])
class TLSALPN01ResponseTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.challenges import TLSALPN01Response

View File

@@ -5,26 +5,25 @@ import datetime
from email.utils import parsedate_tz
import heapq
import logging
import time
import re
import sys
import time
import six
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import OpenSSL
import requests
from requests.adapters import HTTPAdapter
from requests_toolbelt.adapters.source import SourceAddressAdapter
import six
from six.moves import http_client # pylint: disable=import-error
from acme import crypto_util
from acme import errors
from acme import jws
from acme import messages
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Text # pylint: disable=unused-import, no-name-in-module
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict, List, Set, Text
logger = logging.getLogger(__name__)
@@ -34,6 +33,7 @@ logger = logging.getLogger(__name__)
# https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning
if sys.version_info < (2, 7, 9): # pragma: no cover
try:
# pylint: disable=no-member
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
except AttributeError:
import urllib3.contrib.pyopenssl # pylint: disable=import-error
@@ -44,7 +44,7 @@ DEFAULT_NETWORK_TIMEOUT = 45
DER_CONTENT_TYPE = 'application/pkix-cert'
class ClientBase(object):
class ClientBase(object): # pylint: disable=too-many-instance-attributes
"""ACME client base object.
:ivar messages.Directory directory:
@@ -123,22 +123,6 @@ class ClientBase(object):
"""
return self.update_registration(regr, update={'status': 'deactivated'})
def deactivate_authorization(self, authzr):
# type: (messages.AuthorizationResource) -> messages.AuthorizationResource
"""Deactivate authorization.
:param messages.AuthorizationResource authzr: The Authorization resource
to be deactivated.
:returns: The Authorization resource that was deactivated.
:rtype: `.AuthorizationResource`
"""
body = messages.UpdateAuthorization(status='deactivated')
response = self._post(authzr.uri, body)
return self._authzr_from_response(response,
authzr.body.identifier, authzr.uri)
def _authzr_from_response(self, response, identifier=None, uri=None):
authzr = messages.AuthorizationResource(
body=messages.Authorization.from_json(response.json()),
@@ -254,6 +238,7 @@ class Client(ClientBase):
URI from which the resource will be downloaded.
"""
# pylint: disable=too-many-arguments
self.key = key
if net is None:
net = ClientNetwork(key, alg=alg, verify_ssl=verify_ssl)
@@ -279,6 +264,7 @@ class Client(ClientBase):
assert response.status_code == http_client.CREATED
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
return self._regr_from_response(response)
def query_registration(self, regr):
@@ -433,6 +419,7 @@ class Client(ClientBase):
was marked by the CA as invalid
"""
# pylint: disable=too-many-locals
assert max_attempts > 0
attempts = collections.defaultdict(int) # type: Dict[messages.AuthorizationResource, int]
exhausted = set()
@@ -463,6 +450,7 @@ class Client(ClientBase):
updated[authzr] = updated_authzr
attempts[authzr] += 1
# pylint: disable=no-member
if updated_authzr.body.status not in (
messages.STATUS_VALID, messages.STATUS_INVALID):
if attempts[authzr] < max_attempts:
@@ -603,6 +591,7 @@ class ClientV2(ClientBase):
if response.status_code == 200 and 'Location' in response.headers:
raise errors.ConflictError(response.headers.get('Location'))
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
regr = self._regr_from_response(response)
self.net.account = regr
return regr
@@ -726,7 +715,7 @@ class ClientV2(ClientBase):
for authzr in responses:
if authzr.body.status != messages.STATUS_VALID:
for chall in authzr.body.challenges:
if chall.error is not None:
if chall.error != None:
failed.append(authzr)
if failed:
raise errors.ValidationError(failed)
@@ -776,13 +765,29 @@ class ClientV2(ClientBase):
def _post_as_get(self, *args, **kwargs):
"""
Send GET request using the POST-as-GET protocol.
Send GET request using the POST-as-GET protocol if needed.
The request will be first issued using POST-as-GET for ACME v2. If the ACME CA servers do
not support this yet and return an error, request will be retried using GET.
For ACME v1, only GET request will be tried, as POST-as-GET is not supported.
:param args:
:param kwargs:
:return:
"""
new_args = args[:1] + (None,) + args[1:]
return self._post(*new_args, **kwargs)
if self.acme_version >= 2:
# We add an empty payload for POST-as-GET requests
new_args = args[:1] + (None,) + args[1:]
try:
return self._post(*new_args, **kwargs)
except messages.Error as error:
if error.code == 'malformed':
logger.debug('Error during a POST-as-GET request, '
'your ACME CA server may not support it:\n%s', error)
logger.debug('Retrying request with GET.')
else: # pragma: no cover
raise
# If POST-as-GET is not supported yet, we use a GET instead.
return self.net.get(*args, **kwargs)
class BackwardsCompatibleClientV2(object):
@@ -926,7 +931,7 @@ class BackwardsCompatibleClientV2(object):
return self.client.external_account_required()
class ClientNetwork(object):
class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
"""Wrapper around requests that signs POSTs for authentication.
Also adds user agent, and handles Content-Type.
@@ -952,6 +957,7 @@ class ClientNetwork(object):
def __init__(self, key, account=None, alg=jose.RS256, verify_ssl=True,
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT,
source_address=None):
# pylint: disable=too-many-arguments
self.key = key
self.account = account
self.alg = alg
@@ -1059,6 +1065,7 @@ class ClientNetwork(object):
return response
def _send_request(self, method, url, *args, **kwargs):
# pylint: disable=too-many-locals
"""Send HTTP request.
Makes sure that `verify_ssl` is respected. Logs request and
@@ -1105,9 +1112,10 @@ class ClientNetwork(object):
err_regex = r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*).*(\[Errno \d+\])([A-Za-z ]*)"
m = re.match(err_regex, str(e))
if m is None:
raise # pragma: no cover
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
raise # pragma: no cover
else:
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
# If content is DER, log the base64 of it instead of raw bytes, to keep
# binary data out of the logs.
@@ -1173,7 +1181,8 @@ class ClientNetwork(object):
if error.code == 'badNonce':
logger.debug('Retrying request after error:\n%s', error)
return self._post_once(*args, **kwargs)
raise
else:
raise
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
acme_version=1, **kwargs):

View File

@@ -5,19 +5,21 @@ import datetime
import json
import unittest
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import mock
import OpenSSL
import requests
from six.moves import http_client # pylint: disable=import-error
from acme import challenges
from acme import errors
from acme import jws as acme_jws
from acme import messages
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
import messages_test
import test_util
from acme import messages_test
from acme import test_util
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
CERT_DER = test_util.load_vector('cert.der')
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
@@ -61,7 +63,7 @@ class ClientTestBase(unittest.TestCase):
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
reg = messages.Registration(
contact=self.contact, key=KEY.public_key())
the_arg = dict(reg) # type: Dict
the_arg = dict(reg) # type: Dict
self.new_reg = messages.NewRegistration(**the_arg)
self.regr = messages.RegistrationResource(
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
@@ -316,6 +318,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
class ClientTest(ClientTestBase):
"""Tests for acme.client.Client."""
# pylint: disable=too-many-instance-attributes,too-many-public-methods
def setUp(self):
super(ClientTest, self).setUp()
@@ -634,14 +637,6 @@ class ClientTest(ClientTestBase):
errors.PollError, self.client.poll_and_request_issuance,
csr, authzrs, mintime=mintime, max_attempts=2)
def test_deactivate_authorization(self):
authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
self.response.json.return_value = authzb.to_json()
authzr = self.client.deactivate_authorization(self.authzr)
self.assertEqual(authzb, authzr.body)
self.assertEqual(self.client.net.post.call_count, 1)
self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])
def test_check_cert(self):
self.response.headers['Location'] = self.certr.uri
self.response.content = CERT_DER
@@ -885,6 +880,19 @@ class ClientV2Test(ClientTestBase):
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
self.client.net.get.assert_not_called()
class FakeError(messages.Error): # pylint: disable=too-many-ancestors
"""Fake error to reproduce a malformed request ACME error"""
def __init__(self): # pylint: disable=super-init-not-called
pass
@property
def code(self):
return 'malformed'
self.client.net.post.side_effect = FakeError()
self.client.poll(self.authzr2) # pylint: disable=protected-access
self.client.net.get.assert_called_once_with(self.authzr2.uri)
class MockJSONDeSerializable(jose.JSONDeSerializable):
# pylint: disable=missing-docstring
@@ -901,6 +909,7 @@ class MockJSONDeSerializable(jose.JSONDeSerializable):
class ClientNetworkTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork."""
# pylint: disable=too-many-public-methods
def setUp(self):
self.verify_ssl = mock.MagicMock()
@@ -950,8 +959,8 @@ class ClientNetworkTest(unittest.TestCase):
def test_check_response_not_ok_jobj_error(self):
self.response.ok = False
self.response.json.return_value = messages.Error.with_code(
'serverInternal', detail='foo', title='some title').to_json()
self.response.json.return_value = messages.Error(
detail='foo', typ='serverInternal', title='some title').to_json()
# pylint: disable=protected-access
self.assertRaises(
messages.Error, self.net._check_response, self.response)
@@ -976,7 +985,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.side_effect = ValueError
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access
# pylint: disable=protected-access,no-value-for-parameter
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -990,7 +999,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.return_value = {}
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access
# pylint: disable=protected-access,no-value-for-parameter
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -1106,6 +1115,7 @@ class ClientNetworkTest(unittest.TestCase):
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork which mock out response."""
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.client import ClientNetwork
@@ -1115,8 +1125,8 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.response.headers = {}
self.response.links = {}
self.response.checked = False
self.acmev1_nonce_response = mock.MagicMock(
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response = mock.MagicMock(ok=False,
status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response.headers = {}
self.obj = mock.MagicMock()
self.wrapped_obj = mock.MagicMock()

View File

@@ -6,15 +6,15 @@ import os
import re
import socket
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
import josepy as jose
from acme import errors
from acme.magic_typing import Callable # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Optional # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Tuple # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Callable, Union, Tuple, Optional
# pylint: enable=unused-import, no-name-in-module
logger = logging.getLogger(__name__)
@@ -28,7 +28,7 @@ logger = logging.getLogger(__name__)
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
class SSLSocket(object):
class SSLSocket(object): # pylint: disable=too-few-public-methods
"""SSL wrapper for sockets.
:ivar socket sock: Original wrapped socket.
@@ -74,7 +74,7 @@ class SSLSocket(object):
class FakeConnection(object):
"""Fake OpenSSL.SSL.Connection."""
# pylint: disable=missing-docstring
# pylint: disable=too-few-public-methods,missing-docstring
def __init__(self, connection):
self._wrapped = connection

View File

@@ -5,14 +5,15 @@ import threading
import time
import unittest
import six
from six.moves import socketserver #type: ignore # pylint: disable=import-error
import josepy as jose
import OpenSSL
import six
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import errors
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
import test_util
from acme import test_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
class SSLSocketAndProbeSNITest(unittest.TestCase):
@@ -29,6 +30,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
class _TestServer(socketserver.TCPServer):
# pylint: disable=too-few-public-methods
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
def server_bind(self): # pylint: disable=missing-docstring
@@ -38,6 +40,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
self.port = self.server.socket.getsockname()[1]
self.server_thread = threading.Thread(
# pylint: disable=no-member
target=self.server.handle_request)
def tearDown(self):
@@ -64,7 +67,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
def test_probe_connection_error(self):
# pylint has a hard time with six
self.server.server_close()
self.server.server_close() # pylint: disable=no-member
original_timeout = socket.getdefaulttimeout()
try:
socket.setdefaulttimeout(1)

View File

@@ -29,12 +29,7 @@ class NonceError(ClientError):
class BadNonce(NonceError):
"""Bad nonce error."""
def __init__(self, nonce, error, *args, **kwargs):
# MyPy complains here that there is too many arguments for BaseException constructor.
# This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
# The fix is included in MyPy>=0.740, but upgrading it would bring dozen of errors due to
# new types definitions. So we ignore the error until the code base is fixed to match
# with MyPy>=0.740 referential.
super(BadNonce, self).__init__(*args, **kwargs) # type: ignore
super(BadNonce, self).__init__(*args, **kwargs)
self.nonce = nonce
self.error = error
@@ -53,8 +48,7 @@ class MissingNonce(NonceError):
"""
def __init__(self, response, *args, **kwargs):
# See comment in BadNonce constructor above for an explanation of type: ignore here.
super(MissingNonce, self).__init__(*args, **kwargs) # type: ignore
super(MissingNonce, self).__init__(*args, **kwargs)
self.response = response
def __str__(self):
@@ -89,7 +83,6 @@ class PollError(ClientError):
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
@@ -98,11 +91,9 @@ class ValidationError(Error):
self.failed_authzrs = failed_authzrs
super(ValidationError, self).__init__()
class TimeoutError(Error): # pylint: disable=redefined-builtin
class TimeoutError(Error):
"""Error for when polling an authorization or an order times out."""
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
@@ -114,7 +105,6 @@ class IssuanceError(Error):
self.error = error
super(IssuanceError, self).__init__()
class ConflictError(ClientError):
"""Error for when the server returns a 409 (Conflict) HTTP status.

View File

@@ -4,6 +4,7 @@ import logging
import josepy as jose
import pyrfc3339
logger = logging.getLogger(__name__)

View File

@@ -2,7 +2,6 @@
import importlib
import unittest
class JoseTest(unittest.TestCase):
"""Tests for acme.jose shim."""
@@ -21,10 +20,11 @@ class JoseTest(unittest.TestCase):
# We use the imports below with eval, but pylint doesn't
# understand that.
import acme # pylint: disable=unused-import
import josepy # pylint: disable=unused-import
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
# pylint: disable=eval-used,unused-variable
import acme
import josepy
acme_jose_mod = eval(acme_jose_path)
josepy_mod = eval(josepy_path)
self.assertIs(acme_jose_mod, josepy_mod)
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))

View File

@@ -40,10 +40,10 @@ class Signature(jose.Signature):
class JWS(jose.JWS):
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
signature_cls = Signature
__slots__ = jose.JWS._orig_slots
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
@classmethod
# pylint: disable=arguments-differ
# pylint: disable=arguments-differ,too-many-arguments
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
# jwk field if kid is not provided.

View File

@@ -3,7 +3,8 @@ import unittest
import josepy as jose
import test_util
from acme import test_util
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))

View File

@@ -1,7 +1,6 @@
"""Shim class to not have to depend on typing module in prod."""
import sys
class TypingClass(object):
"""Ignore import errors by getting anything"""
def __getattr__(self, name):

View File

@@ -1,55 +1,37 @@
"""ACME protocol messages."""
import json
import josepy as jose
import six
from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import util
try:
from collections.abc import Hashable # pylint: disable=no-name-in-module
except ImportError: # pragma: no cover
from collections import Hashable
import josepy as jose
from acme import challenges
from acme import errors
from acme import fields
from acme import util
from acme import jws
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
ERROR_CODES = {
'accountDoesNotExist': 'The request specified an account that does not exist',
'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
' already been revoked',
'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
'badNonce': 'The client sent an unacceptable anti-replay nonce',
'badPublicKey': 'The JWS was signed by a public key the server does not support',
'badRevocationReason': 'The revocation reason provided is not allowed by the server',
'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
' a certificate',
'compound': 'Specific error conditions are indicated in the "subproblems" array',
'connection': ('The server could not connect to the client to verify the'
' domain'),
'dns': 'There was a problem with a DNS query during identifier validation',
'dnssec': 'The server could not validate a DNSSEC signed domain',
    'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
# deprecate invalidEmail
'invalidEmail': 'The provided email for a registration was invalid',
'invalidContact': 'The provided contact URI was invalid',
'malformed': 'The request message was malformed',
'rejectedIdentifier': 'The server will not issue certificates for the identifier',
'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
'rateLimited': 'There were too many requests of a given type',
'serverInternal': 'The server experienced an internal error',
'tls': 'The server experienced a TLS error during domain verification',
'unauthorized': 'The client lacks sufficient authorization',
'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
'unknownHost': 'The server could not resolve a domain name',
'unsupportedIdentifier': 'An identifier is of an unsupported type',
'externalAccountRequired': 'The server requires external account binding',
}
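
The `ERROR_CODES` table above pairs each short ACME error code with a human-readable description, while `ERROR_PREFIX` carries the URN namespace that turns a code into a full error type (the tests further down exercise this through `Error.with_code`). A small, self-contained illustration of that mapping; this is a sketch, not the `acme.messages` implementation:

```python
# Resolve a full error type URN back to its short code and description.
ERROR_PREFIX = "urn:ietf:params:acme:error:"
ERROR_CODES = {
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'malformed': 'The request message was malformed',
}


def describe(typ):
    """Return (code, description) for a known error type, else (None, None)."""
    if typ.startswith(ERROR_PREFIX):
        code = typ[len(ERROR_PREFIX):]
        return code, ERROR_CODES.get(code)
    return None, None


assert describe('urn:ietf:params:acme:error:malformed') == (
    'malformed', 'The request message was malformed')
assert describe('custom') == (None, None)
```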
@@ -146,7 +128,7 @@ class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(
'{0} not recognized'.format(cls.__name__))
return cls.POSSIBLE_NAMES[jobj]
return cls.POSSIBLE_NAMES[jobj] # pylint: disable=unsubscriptable-object
def __repr__(self):
return '{0}({1})'.format(self.__class__.__name__, self.name)
@@ -171,7 +153,6 @@ STATUS_VALID = Status('valid')
STATUS_INVALID = Status('invalid')
STATUS_REVOKED = Status('revoked')
STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')
class IdentifierType(_Constant):
@@ -475,7 +456,7 @@ class Authorization(ResourceBody):
:ivar datetime.datetime expires:
"""
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
identifier = jose.Field('identifier', decoder=Identifier.from_json)
challenges = jose.Field('challenges', omitempty=True)
combinations = jose.Field('combinations', omitempty=True)
@@ -505,12 +486,6 @@ class NewAuthorization(Authorization):
resource = fields.Resource(resource_type)
class UpdateAuthorization(Authorization):
"""Update authorization."""
resource_type = 'authz'
resource = fields.Resource(resource_type)
class AuthorizationResource(ResourceWithURI):
"""Authorization Resource.

View File

@@ -5,8 +5,9 @@ import josepy as jose
import mock
from acme import challenges
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
import test_util
from acme import test_util
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
CERT = test_util.load_comparable_cert('cert.der')
CSR = test_util.load_comparable_csr('csr.der')
@@ -18,7 +19,8 @@ class ErrorTest(unittest.TestCase):
def setUp(self):
from acme.messages import Error, ERROR_PREFIX
self.error = Error.with_code('malformed', detail='foo', title='title')
self.error = Error(
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
self.jobj = {
'detail': 'foo',
'title': 'some title',
@@ -26,6 +28,7 @@ class ErrorTest(unittest.TestCase):
}
self.error_custom = Error(typ='custom', detail='bar')
self.empty_error = Error()
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
def test_default_typ(self):
from acme.messages import Error
@@ -40,7 +43,8 @@ class ErrorTest(unittest.TestCase):
hash(Error.from_json(self.error.to_json()))
def test_description(self):
self.assertEqual('The request message was malformed', self.error.description)
self.assertEqual(
'The request message was malformed', self.error.description)
self.assertTrue(self.error_custom.description is None)
def test_code(self):
@@ -50,17 +54,17 @@ class ErrorTest(unittest.TestCase):
self.assertEqual(None, Error().code)
def test_is_acme_error(self):
from acme.messages import is_acme_error, Error
from acme.messages import is_acme_error
self.assertTrue(is_acme_error(self.error))
self.assertFalse(is_acme_error(self.error_custom))
self.assertFalse(is_acme_error(Error()))
self.assertFalse(is_acme_error(self.empty_error))
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
def test_unicode_error(self):
from acme.messages import Error, is_acme_error
arabic_error = Error.with_code(
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
from acme.messages import Error, ERROR_PREFIX, is_acme_error
arabic_error = Error(
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
title='title')
self.assertTrue(is_acme_error(arabic_error))
def test_with_code(self):
@@ -301,7 +305,8 @@ class ChallengeBodyTest(unittest.TestCase):
from acme.messages import Error
from acme.messages import STATUS_INVALID
self.status = STATUS_INVALID
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
detail='Unable to communicate with DNS server')
self.challb = ChallengeBody(
uri='http://challb', chall=self.chall, status=self.status,
error=error)

View File

@@ -1,22 +1,29 @@
"""Support for standalone client challenge solvers. """
import argparse
import collections
import functools
import logging
import os
import socket
import sys
import threading
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import OpenSSL
from acme import challenges
from acme import crypto_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme import _TLSSNI01DeprecationModule
logger = logging.getLogger(__name__)
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
# pylint: disable=no-init
# pylint: disable=too-few-public-methods,no-init
class TLSServer(socketserver.TCPServer):
@@ -44,7 +51,7 @@ class TLSServer(socketserver.TCPServer):
return socketserver.TCPServer.server_bind(self)
class ACMEServerMixin:
class ACMEServerMixin: # pylint: disable=old-style-class
"""ACME server common settings mixin."""
# TODO: c.f. #858
server_version = "ACME client standalone challenge solver"
@@ -105,6 +112,7 @@ class BaseDualNetworkedServers(object):
"""Wraps socketserver.TCPServer.serve_forever"""
for server in self.servers:
thread = threading.Thread(
# pylint: disable=no-member
target=server.serve_forever)
thread.start()
self.threads.append(thread)
@@ -124,6 +132,35 @@ class BaseDualNetworkedServers(object):
self.threads = []
class TLSSNI01Server(TLSServer, ACMEServerMixin):
"""TLSSNI01 Server."""
def __init__(self, server_address, certs, ipv6=False):
TLSServer.__init__(
self, server_address, BaseRequestHandlerWithLogging, certs=certs, ipv6=ipv6)
class TLSSNI01DualNetworkedServers(BaseDualNetworkedServers):
"""TLSSNI01Server Wrapper. Tries everything for both. Failures for one don't
affect the other."""
def __init__(self, *args, **kwargs):
BaseDualNetworkedServers.__init__(self, TLSSNI01Server, *args, **kwargs)
class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
class HTTPServer(BaseHTTPServer.HTTPServer):
"""Generic HTTP Server."""
@@ -226,3 +263,43 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
"""
return functools.partial(
cls, simple_http_resources=simple_http_resources)
def simple_tls_sni_01_server(cli_args, forever=True):
"""Run simple standalone TLSSNI01 server."""
logging.basicConfig(level=logging.DEBUG)
parser = argparse.ArgumentParser()
parser.add_argument(
"-p", "--port", default=0, help="Port to serve at. By default "
"picks random free port.")
args = parser.parse_args(cli_args[1:])
certs = {}
_, hosts, _ = next(os.walk('.')) # type: ignore # https://github.com/python/mypy/issues/465
for host in hosts:
with open(os.path.join(host, "cert.pem")) as cert_file:
cert_contents = cert_file.read()
with open(os.path.join(host, "key.pem")) as key_file:
key_contents = key_file.read()
certs[host.encode()] = (
OpenSSL.crypto.load_privatekey(
OpenSSL.crypto.FILETYPE_PEM, key_contents),
OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_PEM, cert_contents))
server = TLSSNI01Server(('', int(args.port)), certs=certs)
logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
if forever: # pragma: no cover
server.serve_forever()
else:
server.handle_request()
# Patching ourselves to warn about TLS-SNI challenge deprecation and removal.
sys.modules[__name__] = _TLSSNI01DeprecationModule(sys.modules[__name__])
if __name__ == "__main__":
sys.exit(simple_tls_sni_01_server(sys.argv)) # pragma: no cover
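
`simple_tls_sni_01_server` above scans the current working directory for one sub-directory per hostname, each containing `cert.pem` and `key.pem` (the `TestSimpleTLSSNI01Server` test further down builds exactly this layout). A rough helper for preparing such a layout; the paths in the commented example are placeholders:

```python
import os
import shutil


def stage_host(base_dir, hostname, cert_path, key_path):
    """Copy an existing certificate/key pair into the <cwd>/<host>/ layout
    that the standalone TLS-SNI-01 server expects."""
    host_dir = os.path.join(base_dir, hostname)
    if not os.path.isdir(host_dir):
        os.makedirs(host_dir)
    shutil.copy(cert_path, os.path.join(host_dir, "cert.pem"))
    shutil.copy(key_path, os.path.join(host_dir, "key.pem"))


# Example usage (placeholder paths):
# stage_host(os.getcwd(), "localhost", "/tmp/cert.pem", "/tmp/key.pem")
```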

View File

@@ -1,17 +1,26 @@
"""Tests for acme.standalone."""
import multiprocessing
import os
import shutil
import socket
import threading
import tempfile
import unittest
import time
from contextlib import closing
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import josepy as jose
import mock
import requests
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import challenges
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
import test_util
from acme import crypto_util
from acme import errors
from acme import test_util
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
class TLSServerTest(unittest.TestCase):
@@ -32,6 +41,32 @@ class TLSServerTest(unittest.TestCase):
server.server_close()
class TLSSNI01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
from acme.standalone import TLSSNI01Server
self.server = TLSSNI01Server(('localhost', 0), certs=self.certs)
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown()
self.thread.join()
def test_it(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1)
self.assertEqual(jose.ComparableX509(cert),
jose.ComparableX509(self.certs[b'localhost'][1]))
class HTTP01ServerTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01Server."""
@@ -135,6 +170,33 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
prev_port = port
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
from acme.standalone import TLSSNI01DualNetworkedServers
self.servers = TLSSNI01DualNetworkedServers(('localhost', 0), certs=self.certs)
self.servers.serve_forever()
def tearDown(self):
self.servers.shutdown_and_server_close()
def test_connect(self):
socknames = self.servers.getsocknames()
# connect to all addresses
for sockname in socknames:
host, port = sockname[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1)
self.assertEqual(jose.ComparableX509(cert),
jose.ComparableX509(self.certs[b'localhost'][1]))
class HTTP01DualNetworkedServersTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
@@ -185,5 +247,60 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
self.assertFalse(self._test_http01(add=False))
class TestSimpleTLSSNI01Server(unittest.TestCase):
"""Tests for acme.standalone.simple_tls_sni_01_server."""
def setUp(self):
# mirror ../examples/standalone
self.test_cwd = tempfile.mkdtemp()
localhost_dir = os.path.join(self.test_cwd, 'localhost')
os.makedirs(localhost_dir)
shutil.copy(test_util.vector_path('rsa2048_cert.pem'),
os.path.join(localhost_dir, 'cert.pem'))
shutil.copy(test_util.vector_path('rsa2048_key.pem'),
os.path.join(localhost_dir, 'key.pem'))
with closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as sock:
sock.bind(('', 0))
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
self.port = sock.getsockname()[1]
from acme.standalone import simple_tls_sni_01_server
self.process = multiprocessing.Process(target=simple_tls_sni_01_server,
args=(['path', '-p', str(self.port)],))
self.old_cwd = os.getcwd()
os.chdir(self.test_cwd)
def tearDown(self):
os.chdir(self.old_cwd)
if self.process.is_alive():
self.process.terminate()
self.process.join(timeout=5)
# Check that we didn't timeout waiting for the process to
# terminate.
self.assertNotEqual(self.process.exitcode, None)
shutil.rmtree(self.test_cwd)
@mock.patch('acme.standalone.TLSSNI01Server.handle_request')
def test_mock(self, handle):
from acme.standalone import simple_tls_sni_01_server
simple_tls_sni_01_server(cli_args=['path', '-p', str(self.port)], forever=False)
self.assertEqual(handle.call_count, 1)
def test_live(self):
self.process.start()
cert = None
for _ in range(50):
time.sleep(0.1)
try:
cert = crypto_util.probe_sni(b'localhost', b'127.0.0.1', self.port)
break
except errors.Error: # pragma: no cover
pass
self.assertEqual(jose.ComparableX509(cert),
test_util.load_comparable_cert('rsa2048_cert.pem'))
if __name__ == "__main__":
unittest.main() # pragma: no cover

View File

@@ -4,12 +4,19 @@
"""
import os
import unittest
import pkg_resources
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from OpenSSL import crypto
import pkg_resources
def vector_path(*names):
"""Path to a test vector."""
return pkg_resources.resource_filename(
__name__, os.path.join('testdata', *names))
def load_vector(*names):
@@ -25,7 +32,8 @@ def _guess_loader(filename, loader_pem, loader_der):
return loader_pem
elif ext.lower() == '.der':
return loader_der
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
else: # pragma: no cover
raise ValueError("Loader could not be recognized based on extension")
def load_cert(*names):
@@ -65,3 +73,23 @@ def load_pyopenssl_private_key(*names):
loader = _guess_loader(
names[-1], crypto.FILETYPE_PEM, crypto.FILETYPE_ASN1)
return crypto.load_privatekey(loader, load_vector(*names))
def skip_unless(condition, reason): # pragma: no cover
"""Skip tests unless a condition holds.
This implements the basic functionality of unittest.skipUnless
which is only available on Python 2.7+.
:param bool condition: If ``False``, the test will be skipped
:param str reason: the reason for skipping the test
:rtype: callable
:returns: decorator that hides tests unless condition is ``True``
"""
if hasattr(unittest, "skipUnless"):
return unittest.skipUnless(condition, reason)
elif condition:
return lambda cls: cls
return lambda cls: None
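
`skip_unless` above is a small compatibility shim around `unittest.skipUnless`. A hedged usage sketch, with the helper re-declared locally so the snippet stands alone:

```python
import unittest


def skip_unless(condition, reason):
    # Same logic as the helper above: defer to unittest.skipUnless when it
    # exists, otherwise keep or hide the decorated test class ourselves.
    if hasattr(unittest, "skipUnless"):
        return unittest.skipUnless(condition, reason)
    elif condition:
        return lambda cls: cls
    return lambda cls: None


@skip_unless(hasattr(unittest, "TestCase"), "unittest.TestCase is required")
class ExampleTest(unittest.TestCase):
    def test_trivial(self):
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()
```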

View File

@@ -12,9 +12,10 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
import shlex
import sys
here = os.path.abspath(os.path.dirname(__file__))

View File

@@ -26,10 +26,8 @@ Workflow:
- Deactivate Account
"""
from contextlib import contextmanager
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import challenges
@@ -38,6 +36,7 @@ from acme import crypto_util
from acme import errors
from acme import messages
from acme import standalone
import josepy as jose
# Constants:

View File

@@ -1,10 +1,10 @@
# readthedocs.org gives no way to change the install command to "pip
# install -e acme[docs]" (that would in turn install documentation
# install -e .[docs]" (that would in turn install documentation
# dependencies), but it allows to specify a requirements.txt file at
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
# Although ReadTheDocs certainly doesn't need to install the project
# in --editable mode (-e), just "pip install acme[docs]" does not work as
# expected and "pip install -e acme[docs]" must be used instead
# in --editable mode (-e), just "pip install .[docs]" does not work as
# expected and "pip install -e .[docs]" must be used instead
-e acme[docs]

View File

@@ -1,10 +1,9 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.1.0.dev0'
version = '0.36.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [
@@ -15,8 +14,8 @@ install_requires = [
# 1.1.0+ is required to avoid the warnings described at
# https://github.com/certbot/josepy/issues/13.
'josepy>=1.1.0',
'mock',
# Connection.set_tlsext_host_name (>=0.13)
'mock',
'PyOpenSSL>=0.13.1',
'pyrfc3339',
'pytz',
@@ -74,7 +73,6 @@ setup(
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],

appveyor.yml (new file, 40 lines)
View File

@@ -0,0 +1,40 @@
image: Visual Studio 2015
environment:
matrix:
- TOXENV: py35
- TOXENV: py37-cover
branches:
only:
- master
    - /^\d+\.\d+\.x$/ # Version branches like X.Y.x
- /^test-.*$/
init:
  # Since master can only receive commits from PRs that have already been tested, the following
  # condition skips all jobs except the coverage one for commits pushed to master.
- ps: |
if (-Not $Env:APPVEYOR_PULL_REQUEST_NUMBER -And $Env:APPVEYOR_REPO_BRANCH -Eq 'master' `
-And -Not ($Env:TOXENV -Like '*-cover'))
{ $Env:APPVEYOR_SKIP_FINALIZE_ON_EXIT = 'true'; Exit-AppVeyorBuild }
install:
# Use Python 3.7 by default
- "SET PATH=C:\\Python37;C:\\Python37\\Scripts;%PATH%"
# Check env
- "python --version"
# Upgrade pip to avoid warnings
- "python -m pip install --upgrade pip"
# Ready to install tox and coverage
- "pip install tox codecov"
build: off
test_script:
- set TOX_TESTENV_PASSENV=APPVEYOR
# Test env is set by TOXENV env variable
- tox
on_success:
- if exist .coverage codecov -F windows
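
The `init` condition in the AppVeyor configuration above skips everything except the coverage job for direct pushes to master. The same decision logic, restated as a small Python sketch (environment variable names are taken from the config, behaviour is inferred from its comment):

```python
def should_skip_build(pull_request_number, repo_branch, toxenv):
    """True when the build is a direct push to master and not a coverage job."""
    is_pr = bool(pull_request_number)
    is_cover_job = toxenv.endswith('-cover')
    return (not is_pr) and repo_branch == 'master' and not is_cover_job


assert should_skip_build(None, 'master', 'py35') is True
assert should_skip_build(None, 'master', 'py37-cover') is False
assert should_skip_build('1234', 'master', 'py35') is False
assert should_skip_build(None, 'test-foo', 'py35') is False
```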

View File

@@ -1,8 +1,7 @@
include LICENSE.txt
include README.rst
recursive-include tests *
include certbot_apache/_internal/centos-options-ssl-apache.conf
include certbot_apache/_internal/options-ssl-apache.conf
recursive-include certbot_apache/_internal/augeas_lens *.aug
global-exclude __pycache__
global-exclude *.py[cod]
recursive-include docs *
recursive-include certbot_apache/tests/testdata *
include certbot_apache/centos-options-ssl-apache.conf
include certbot_apache/options-ssl-apache.conf
recursive-include certbot_apache/augeas_lens *.aug

View File

@@ -1 +0,0 @@
"""Certbot Apache plugin."""

View File

@@ -0,0 +1,207 @@
"""Class of Augeas Configurators."""
import logging
from certbot import errors
from certbot.plugins import common
from certbot_apache import constants
logger = logging.getLogger(__name__)
class AugeasConfigurator(common.Installer):
"""Base Augeas Configurator class.
:ivar config: Configuration.
:type config: :class:`~certbot.interfaces.IConfig`
:ivar aug: Augeas object
:type aug: :class:`augeas.Augeas`
:ivar str save_notes: Human-readable configuration change notes
:ivar reverter: saves and reverts checkpoints
:type reverter: :class:`certbot.reverter.Reverter`
"""
def __init__(self, *args, **kwargs):
super(AugeasConfigurator, self).__init__(*args, **kwargs)
# Placeholder for augeas
self.aug = None
self.save_notes = ""
def init_augeas(self):
""" Initialize the actual Augeas instance """
import augeas
self.aug = augeas.Augeas(
# specify a directory to load our preferred lens from
loadpath=constants.AUGEAS_LENS_DIR,
# Do not save backup (we do it ourselves), do not load
# anything by default
flags=(augeas.Augeas.NONE |
augeas.Augeas.NO_MODL_AUTOLOAD |
augeas.Augeas.ENABLE_SPAN))
# See if any temporary changes need to be recovered
# This needs to occur before VirtualHost objects are setup...
# because this will change the underlying configuration and potential
# vhosts
self.recovery_routine()
def check_parsing_errors(self, lens):
"""Verify Augeas can parse all of the lens files.
:param str lens: lens to check for errors
:raises .errors.PluginError: If there has been an error in parsing with
the specified lens.
"""
error_files = self.aug.match("/augeas//error")
for path in error_files:
# Check to see if it was an error resulting from the use of
# the httpd lens
lens_path = self.aug.get(path + "/lens")
# As aug.get may return null
if lens_path and lens in lens_path:
msg = (
"There has been an error in parsing the file {0} on line {1}: "
"{2}".format(
# Strip off /augeas/files and /error
path[13:len(path) - 6],
self.aug.get(path + "/line"),
self.aug.get(path + "/message")))
raise errors.PluginError(msg)
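
The error message above trims the Augeas bookkeeping segments off the node path: `"/augeas/files"` is 13 characters and `"/error"` is 6, so the slice recovers the real configuration file path. A quick sanity check of that arithmetic with a hypothetical path:

```python
# Hypothetical Augeas error node path; only the slicing is being illustrated.
path = "/augeas/files/etc/apache2/apache2.conf/error"
assert len("/augeas/files") == 13 and len("/error") == 6
assert path[13:len(path) - 6] == "/etc/apache2/apache2.conf"
```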
def ensure_augeas_state(self):
"""Makes sure that all Augeas dom changes are written to files to avoid
loss of configuration directives when doing additional augeas parsing,
causing a possible augeas.load() resulting dom reset
"""
if self.unsaved_files():
self.save_notes += "(autosave)"
self.save()
def unsaved_files(self):
"""Lists files that have modified Augeas DOM but the changes have not
been written to the filesystem yet, used by `self.save()` and
ApacheConfigurator to check the file state.
:raises .errors.PluginError: If there was an error in Augeas, in
an attempt to save the configuration, or an error creating a
checkpoint
:returns: `set` of unsaved files
"""
save_state = self.aug.get("/augeas/save")
self.aug.set("/augeas/save", "noop")
# Existing Errors
ex_errs = self.aug.match("/augeas//error")
try:
# This is a noop save
self.aug.save()
except (RuntimeError, IOError):
self._log_save_errors(ex_errs)
# Erase Save Notes
self.save_notes = ""
raise errors.PluginError(
"Error saving files, check logs for more info.")
# Return the original save method
self.aug.set("/augeas/save", save_state)
# Retrieve list of modified files
# Note: Noop saves can cause the file to be listed twice, I used a
# set to remove this possibility. This is a known augeas 0.10 error.
save_paths = self.aug.match("/augeas/events/saved")
save_files = set()
if save_paths:
for path in save_paths:
save_files.add(self.aug.get(path)[6:])
return save_files
def save(self, title=None, temporary=False):
"""Saves all changes to the configuration files.
This function first checks for save errors, if none are found,
all configuration changes made will be saved. According to the
function parameters. If an exception is raised, a new checkpoint
was not created.
:param str title: The title of the save. If a title is given, the
configuration will be saved as a new checkpoint and put in a
timestamped directory.
:param bool temporary: Indicates whether the changes made will
be quickly reversed in the future (ie. challenges)
"""
save_files = self.unsaved_files()
if save_files:
self.add_to_checkpoint(save_files,
self.save_notes, temporary=temporary)
self.save_notes = ""
self.aug.save()
# Force reload if files were modified
# This is needed to recalculate augeas directive span
if save_files:
for sf in save_files:
self.aug.remove("/files/"+sf)
self.aug.load()
if title and not temporary:
self.finalize_checkpoint(title)
def _log_save_errors(self, ex_errs):
"""Log errors due to bad Augeas save.
:param list ex_errs: Existing errors before save
"""
# Check for the root of save problems
new_errs = self.aug.match("/augeas//error")
# logger.error("During Save - %s", mod_conf)
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
", ".join(err[13:len(err) - 6] for err in new_errs
# Only new errors caused by recent save
if err not in ex_errs), self.save_notes)
# Wrapper functions for Reverter class
def recovery_routine(self):
"""Revert all previously modified files.
Reverts all modified files that have not been saved as a checkpoint
:raises .errors.PluginError: If unable to recover the configuration
"""
super(AugeasConfigurator, self).recovery_routine()
# Need to reload configuration after these changes take effect
self.aug.load()
def revert_challenge_config(self):
"""Used to cleanup challenge configurations.
:raises .errors.PluginError: If unable to revert the challenge config.
"""
self.revert_temporary_config()
self.aug.load()
def rollback_checkpoints(self, rollback=1):
"""Rollback saved checkpoints.
:param int rollback: Number of checkpoints to revert
:raises .errors.PluginError: If there is a problem with the input or
the function is unable to correctly revert the configuration
"""
super(AugeasConfigurator, self).rollback_checkpoints(rollback)
self.aug.load()

View File

@@ -1,6 +1,5 @@
"""Apache Configurator."""
"""Apache Configuration based off of Augeas Configurator."""
# pylint: disable=too-many-lines
from collections import defaultdict
import copy
import fnmatch
import logging
@@ -8,32 +7,34 @@ import re
import socket
import time
from collections import defaultdict
import pkg_resources
import six
import zope.component
import zope.interface
from acme import challenges
from acme.magic_typing import DefaultDict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import DefaultDict, Dict, List, Set, Union # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.achallenges import KeyAuthorizationAnnotatedChallenge # pylint: disable=unused-import
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot.plugins.enhancements import AutoHSTSEnhancement
from certbot.plugins.util import path_surgery
from certbot_apache._internal import apache_util
from certbot_apache._internal import constants
from certbot_apache._internal import display_ops
from certbot_apache._internal import http_01
from certbot_apache._internal import obj
from certbot_apache._internal import parser
from certbot.plugins.enhancements import AutoHSTSEnhancement
from certbot_apache import apache_util
from certbot_apache import augeas_configurator
from certbot_apache import constants
from certbot_apache import display_ops
from certbot_apache import http_01
from certbot_apache import obj
from certbot_apache import parser
logger = logging.getLogger(__name__)
@@ -69,18 +70,22 @@ logger = logging.getLogger(__name__)
@zope.interface.implementer(interfaces.IAuthenticator, interfaces.IInstaller)
@zope.interface.provider(interfaces.IPluginFactory)
class ApacheConfigurator(common.Installer):
class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# pylint: disable=too-many-instance-attributes,too-many-public-methods
"""Apache configurator.
    State of Configurator: This code has been tested and built for Ubuntu
14.04 Apache 2.4 and it works for Ubuntu 12.04 Apache 2.2
:ivar config: Configuration.
:type config: :class:`~certbot.interfaces.IConfig`
:ivar parser: Handles low level parsing
:type parser: :class:`~certbot_apache._internal.parser`
:type parser: :class:`~certbot_apache.parser`
:ivar tup version: version of Apache
:ivar list vhosts: All vhosts found in the configuration
(:class:`list` of :class:`~certbot_apache._internal.obj.VirtualHost`)
(:class:`list` of :class:`~certbot_apache.obj.VirtualHost`)
:ivar dict assoc: Mapping between domains and vhosts
@@ -109,7 +114,7 @@ class ApacheConfigurator(common.Installer):
handle_sites=False,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
"certbot_apache", "options-ssl-apache.conf")
)
def option(self, key):
@@ -172,6 +177,8 @@ class ApacheConfigurator(common.Installer):
"(Only Ubuntu/Debian currently)")
add("ctl", default=DEFAULTS["ctl"],
help="Full path to Apache control script")
util.add_deprecated_argument(
add, argument_name="init-script", nargs=1)
def __init__(self, *args, **kwargs):
"""Initialize an Apache Configurator.
@@ -194,8 +201,6 @@ class ApacheConfigurator(common.Installer):
self._enhanced_vhosts = defaultdict(set) # type: DefaultDict[str, Set[obj.VirtualHost]]
# Temporary state for AutoHSTS enhancement
self._autohsts = {} # type: Dict[str, Dict[str, Union[int, float]]]
# Reverter save notes
self.save_notes = ""
# These will be set in the prepare function
self._prepared = False
@@ -226,6 +231,12 @@ class ApacheConfigurator(common.Installer):
:raises .errors.PluginError: If there is any other error
"""
# Perform the actual Augeas initialization to be able to react
try:
self.init_augeas()
except ImportError:
raise errors.NoInstallationError("Problem in Augeas installation")
self._prepare_options()
# Verify Apache is installed
@@ -243,14 +254,16 @@ class ApacheConfigurator(common.Installer):
raise errors.NotSupportedError(
"Apache Version {0} not supported.".format(str(self.version)))
# Recover from previous crash before Augeas initialization to have the
# correct parse tree from the get go.
self.recovery_routine()
# Perform the actual Augeas initialization to be able to react
if not self._check_aug_version():
raise errors.NotSupportedError(
"Apache plugin support requires libaugeas0 and augeas-lenses "
"version 1.2.0 or higher, please make sure you have you have "
"those installed.")
self.parser = self.get_parser()
# Check for errors in parsing files with Augeas
self.parser.check_parsing_errors("httpd.aug")
self.check_parsing_errors("httpd.aug")
# Get all of the available vhosts
self.vhosts = self.get_virtual_hosts()
@@ -269,67 +282,6 @@ class ApacheConfigurator(common.Installer):
" Apache configuration?".format(self.option("server_root")))
self._prepared = True
def save(self, title=None, temporary=False):
"""Saves all changes to the configuration files.
This function first checks for save errors, if none are found,
all configuration changes made will be saved. According to the
function parameters. If an exception is raised, a new checkpoint
was not created.
:param str title: The title of the save. If a title is given, the
configuration will be saved as a new checkpoint and put in a
timestamped directory.
:param bool temporary: Indicates whether the changes made will
be quickly reversed in the future (ie. challenges)
"""
save_files = self.parser.unsaved_files()
if save_files:
self.add_to_checkpoint(save_files,
self.save_notes, temporary=temporary)
# Handle the parser specific tasks
self.parser.save(save_files)
if title and not temporary:
self.finalize_checkpoint(title)
def recovery_routine(self):
"""Revert all previously modified files.
Reverts all modified files that have not been saved as a checkpoint
:raises .errors.PluginError: If unable to recover the configuration
"""
super(ApacheConfigurator, self).recovery_routine()
# Reload configuration after these changes take effect if needed
# ie. ApacheParser has been initialized.
if self.parser:
# TODO: wrap into non-implementation specific parser interface
self.parser.aug.load()
def revert_challenge_config(self):
"""Used to cleanup challenge configurations.
:raises .errors.PluginError: If unable to revert the challenge config.
"""
self.revert_temporary_config()
self.parser.aug.load()
def rollback_checkpoints(self, rollback=1):
"""Rollback saved checkpoints.
:param int rollback: Number of checkpoints to revert
:raises .errors.PluginError: If there is a problem with the input or
the function is unable to correctly revert the configuration
"""
super(ApacheConfigurator, self).rollback_checkpoints(rollback)
self.parser.aug.load()
def _verify_exe_availability(self, exe):
"""Checks availability of Apache executable"""
if not util.exe_exists(exe):
@@ -337,11 +289,26 @@ class ApacheConfigurator(common.Installer):
raise errors.NoInstallationError(
'Cannot find Apache executable {0}'.format(exe))
def _check_aug_version(self):
""" Checks that we have recent enough version of libaugeas.
If augeas version is recent enough, it will support case insensitive
regexp matching"""
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
try:
matches = self.aug.match(
"/test//*[self::arg=~regexp('argument', 'i')]")
except RuntimeError:
self.aug.remove("/test/path")
return False
self.aug.remove("/test/path")
return matches
def get_parser(self):
"""Initializes the ApacheParser"""
# If user provided vhost_root value in command line, use it
return parser.ApacheParser(
self.option("server_root"), self.conf("vhost-root"),
self.aug, self.option("server_root"), self.conf("vhost-root"),
self.version, configurator=self)
def _wildcard_domain(self, domain):
@@ -390,7 +357,7 @@ class ApacheConfigurator(common.Installer):
counterpart, should one get created
:returns: List of VirtualHosts or None
:rtype: `list` of :class:`~certbot_apache._internal.obj.VirtualHost`
:rtype: `list` of :class:`~certbot_apache.obj.VirtualHost`
"""
if self._wildcard_domain(domain):
@@ -449,7 +416,7 @@ class ApacheConfigurator(common.Installer):
filtered_vhosts[name] = vhost
# Only unique VHost objects
dialog_input = set(filtered_vhosts.values())
dialog_input = set([vhost for vhost in filtered_vhosts.values()])
# Ask the user which of names to enable, expect list of names back
dialog_output = display_ops.select_vhost_multiple(list(dialog_input))
@@ -520,8 +487,8 @@ class ApacheConfigurator(common.Installer):
# install SSLCertificateFile, SSLCertificateKeyFile,
# and SSLCertificateChainFile directives
set_cert_path = cert_path
self.parser.aug.set(path["cert_path"][-1], cert_path)
self.parser.aug.set(path["cert_key"][-1], key_path)
self.aug.set(path["cert_path"][-1], cert_path)
self.aug.set(path["cert_key"][-1], key_path)
if chain_path is not None:
self.parser.add_dir(vhost.path,
"SSLCertificateChainFile", chain_path)
@@ -533,8 +500,8 @@ class ApacheConfigurator(common.Installer):
raise errors.PluginError("Please provide the --fullchain-path "
"option pointing to your full chain file")
set_cert_path = fullchain_path
self.parser.aug.set(path["cert_path"][-1], fullchain_path)
self.parser.aug.set(path["cert_key"][-1], key_path)
self.aug.set(path["cert_path"][-1], fullchain_path)
self.aug.set(path["cert_key"][-1], key_path)
# Enable the new vhost if needed
if not vhost.enabled:
@@ -565,7 +532,7 @@ class ApacheConfigurator(common.Installer):
counterpart, should one get created
:returns: vhost associated with name
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
:rtype: :class:`~certbot_apache.obj.VirtualHost`
:raises .errors.PluginError: If no vhost is available or chosen
@@ -600,9 +567,9 @@ class ApacheConfigurator(common.Installer):
"in the Apache config.",
target_name)
raise errors.PluginError("No vhost selected")
if temp:
elif temp:
return vhost
if not vhost.ssl:
elif not vhost.ssl:
addrs = self._get_proposed_addrs(vhost, "443")
# TODO: Conflicts is too conservative
if not any(vhost.enabled and vhost.conflicts(addrs) for
@@ -668,7 +635,7 @@ class ApacheConfigurator(common.Installer):
:param str target_name: domain handled by the desired vhost
:param vhosts: vhosts to consider
:type vhosts: `collections.Iterable` of :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhosts: `collections.Iterable` of :class:`~certbot_apache.obj.VirtualHost`
:param bool filter_defaults: whether a vhost with a _default_
addr is acceptable
@@ -810,7 +777,7 @@ class ApacheConfigurator(common.Installer):
"""Helper function for get_virtual_hosts().
:param host: In progress vhost whose names will be added
:type host: :class:`~certbot_apache._internal.obj.VirtualHost`
:type host: :class:`~certbot_apache.obj.VirtualHost`
"""
@@ -829,12 +796,12 @@ class ApacheConfigurator(common.Installer):
:param str path: Augeas path to virtual host
:returns: newly created vhost
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
:rtype: :class:`~certbot_apache.obj.VirtualHost`
"""
addrs = set()
try:
args = self.parser.aug.match(path + "/arg")
args = self.aug.match(path + "/arg")
except RuntimeError:
logger.warning("Encountered a problem while parsing file: %s, skipping", path)
return None
@@ -852,7 +819,7 @@ class ApacheConfigurator(common.Installer):
is_ssl = True
filename = apache_util.get_file_path(
self.parser.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
self.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
if filename is None:
return None
@@ -870,7 +837,7 @@ class ApacheConfigurator(common.Installer):
def get_virtual_hosts(self):
"""Returns list of virtual hosts found in the Apache configuration.
:returns: List of :class:`~certbot_apache._internal.obj.VirtualHost`
:returns: List of :class:`~certbot_apache.obj.VirtualHost`
objects found in configuration
:rtype: list
@@ -882,7 +849,7 @@ class ApacheConfigurator(common.Installer):
# Make a list of parser paths because the parser_paths
# dictionary may be modified during the loop.
for vhost_path in list(self.parser.parser_paths):
paths = self.parser.aug.match(
paths = self.aug.match(
("/files%s//*[label()=~regexp('%s')]" %
(vhost_path, parser.case_i("VirtualHost"))))
paths = [path for path in paths if
@@ -892,7 +859,7 @@ class ApacheConfigurator(common.Installer):
if not new_vhost:
continue
internal_path = apache_util.get_internal_aug_path(new_vhost.path)
realpath = filesystem.realpath(new_vhost.filep)
realpath = os.path.realpath(new_vhost.filep)
if realpath not in file_paths:
file_paths[realpath] = new_vhost.filep
internal_paths[realpath].add(internal_path)
@@ -927,7 +894,7 @@ class ApacheConfigurator(common.Installer):
now NameVirtualHosts. If version is earlier than 2.4, check if addr
has a NameVirtualHost directive in the Apache config
:param certbot_apache._internal.obj.Addr target_addr: vhost address
:param certbot_apache.obj.Addr target_addr: vhost address
:returns: Success
:rtype: bool
@@ -945,18 +912,19 @@ class ApacheConfigurator(common.Installer):
"""Adds NameVirtualHost directive for given address.
:param addr: Address that will be added as NameVirtualHost directive
:type addr: :class:`~certbot_apache._internal.obj.Addr`
:type addr: :class:`~certbot_apache.obj.Addr`
"""
loc = parser.get_aug_path(self.parser.loc["name"])
if addr.get_port() == "443":
self.parser.add_dir_to_ifmodssl(
path = self.parser.add_dir_to_ifmodssl(
loc, "NameVirtualHost", [str(addr)])
else:
self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
path = self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
msg = "Setting {0} to be NameBasedVirtualHost\n".format(addr)
msg = ("Setting %s to be NameBasedVirtualHost\n"
"\tDirective added to %s\n" % (addr, path))
logger.debug(msg)
self.save_notes += msg
@@ -1113,7 +1081,7 @@ class ApacheConfigurator(common.Installer):
if "ssl_module" not in self.parser.modules:
self.enable_mod("ssl", temp=temp)
def make_vhost_ssl(self, nonssl_vhost):
def make_vhost_ssl(self, nonssl_vhost): # pylint: disable=too-many-locals
"""Makes an ssl_vhost version of a nonssl_vhost.
Duplicates vhost and adds default ssl options
@@ -1123,10 +1091,10 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param nonssl_vhost: Valid VH that doesn't have SSLEngine on
:type nonssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type nonssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:returns: SSL vhost
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
:rtype: :class:`~certbot_apache.obj.VirtualHost`
:raises .errors.PluginError: If more than one virtual host is in
the file or if plugin is unable to write/read vhost files.
@@ -1135,16 +1103,16 @@ class ApacheConfigurator(common.Installer):
avail_fp = nonssl_vhost.filep
ssl_fp = self._get_ssl_vhost_path(avail_fp)
orig_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
orig_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
(self._escape(ssl_fp),
parser.case_i("VirtualHost")))
self._copy_create_ssl_vhost_skeleton(nonssl_vhost, ssl_fp)
# Reload augeas to take into account the new vhost
self.parser.aug.load()
self.aug.load()
# Get Vhost augeas path for new vhost
new_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
new_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
(self._escape(ssl_fp),
parser.case_i("VirtualHost")))
@@ -1155,7 +1123,7 @@ class ApacheConfigurator(common.Installer):
# Make Augeas aware of the new vhost
self.parser.parse_file(ssl_fp)
# Try to search again
new_matches = self.parser.aug.match(
new_matches = self.aug.match(
"/files%s//* [label()=~regexp('%s')]" %
(self._escape(ssl_fp),
parser.case_i("VirtualHost")))
@@ -1217,11 +1185,11 @@ class ApacheConfigurator(common.Installer):
"""
if self.conf("vhost-root") and os.path.exists(self.conf("vhost-root")):
fp = os.path.join(filesystem.realpath(self.option("vhost_root")),
fp = os.path.join(os.path.realpath(self.option("vhost_root")),
os.path.basename(non_ssl_vh_fp))
else:
# Use non-ssl filepath
fp = filesystem.realpath(non_ssl_vh_fp)
fp = os.path.realpath(non_ssl_vh_fp)
if fp.endswith(".conf"):
return fp[:-(len(".conf"))] + self.option("le_vhost_ext")
@@ -1305,8 +1273,8 @@ class ApacheConfigurator(common.Installer):
"vhost for your HTTPS site located at {1} because they have "
"the potential to create redirection loops.".format(
vhost.filep, ssl_fp), reporter.MEDIUM_PRIORITY)
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
self.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
self.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
def _sift_rewrite_rules(self, contents):
""" Helper function for _copy_create_ssl_vhost_skeleton to prepare the
@@ -1364,9 +1332,12 @@ class ApacheConfigurator(common.Installer):
result.append(comment)
sift = True
result.append('\n'.join(['# ' + l for l in chunk]))
result.append('\n'.join(
['# ' + l for l in chunk]))
continue
else:
result.append('\n'.join(chunk))
continue
return result, sift
def _get_vhost_block(self, vhost):
@@ -1378,7 +1349,7 @@ class ApacheConfigurator(common.Installer):
"""
try:
span_val = self.parser.aug.span(vhost.path)
span_val = self.aug.span(vhost.path)
except ValueError:
logger.critical("Error while reading the VirtualHost %s from "
"file %s", vhost.name, vhost.filep, exc_info=True)
@@ -1413,13 +1384,13 @@ class ApacheConfigurator(common.Installer):
def _update_ssl_vhosts_addrs(self, vh_path):
ssl_addrs = set()
ssl_addr_p = self.parser.aug.match(vh_path + "/arg")
ssl_addr_p = self.aug.match(vh_path + "/arg")
for addr in ssl_addr_p:
old_addr = obj.Addr.fromstring(
str(self.parser.get_arg(addr)))
ssl_addr = old_addr.get_addr_obj("443")
self.parser.aug.set(addr, str(ssl_addr))
self.aug.set(addr, str(ssl_addr))
ssl_addrs.add(ssl_addr)
return ssl_addrs
@@ -1438,14 +1409,14 @@ class ApacheConfigurator(common.Installer):
vh_path, False)) > 1:
directive_path = self.parser.find_dir(directive, None,
vh_path, False)
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
def _remove_directives(self, vh_path, directives):
for directive in directives:
while self.parser.find_dir(directive, None, vh_path, False):
directive_path = self.parser.find_dir(directive, None,
vh_path, False)
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
def _add_dummy_ssl_directives(self, vh_path):
self.parser.add_dir(vh_path, "SSLCertificateFile",
@@ -1484,7 +1455,7 @@ class ApacheConfigurator(common.Installer):
"""
matches = self.parser.find_dir(
"ServerAlias", start=vh_path, exclude=False)
aliases = (self.parser.aug.get(match) for match in matches)
aliases = (self.aug.get(match) for match in matches)
return self.domain_in_names(aliases, target_name)
def _add_name_vhost_if_necessary(self, vhost):
@@ -1494,7 +1465,7 @@ class ApacheConfigurator(common.Installer):
https://httpd.apache.org/docs/2.2/mod/core.html#namevirtualhost
:param vhost: New virtual host that was recently created.
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
"""
need_to_save = False
@@ -1529,7 +1500,7 @@ class ApacheConfigurator(common.Installer):
:param str id_str: Id string for matching
:returns: The matched VirtualHost or None
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost` or None
:rtype: :class:`~certbot_apache.obj.VirtualHost` or None
:raises .errors.PluginError: If no VirtualHost is found
"""
@@ -1546,7 +1517,7 @@ class ApacheConfigurator(common.Installer):
used for keeping track of VirtualHost directive over time.
:param vhost: Virtual host to add the id
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:returns: The unique ID or None
:rtype: str or None
@@ -1568,7 +1539,7 @@ class ApacheConfigurator(common.Installer):
If ID already exists, returns that instead.
:param vhost: Virtual host to add or find the id
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:returns: The unique ID for vhost
:rtype: str or None
@@ -1606,9 +1577,9 @@ class ApacheConfigurator(common.Installer):
:param str domain: domain to enhance
:param str enhancement: enhancement type defined in
:const:`~certbot.plugins.enhancements.ENHANCEMENTS`
:const:`~certbot.constants.ENHANCEMENTS`
:param options: options for the enhancement
See :const:`~certbot.plugins.enhancements.ENHANCEMENTS`
See :const:`~certbot.constants.ENHANCEMENTS`
documentation for appropriate parameter.
:raises .errors.PluginError: If Enhancement is not supported, or if
@@ -1646,7 +1617,7 @@ class ApacheConfigurator(common.Installer):
"""Increase the AutoHSTS max-age value
:param vhost: Virtual host object to modify
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:param str id_str: The unique ID string of VirtualHost
@@ -1667,7 +1638,7 @@ class ApacheConfigurator(common.Installer):
if header_path:
pat = '(?:[ "]|^)(strict-transport-security)(?:[ "]|$)'
for match in header_path:
if re.search(pat, self.parser.aug.get(match).lower()):
if re.search(pat, self.aug.get(match).lower()):
hsts_dirpath = match
if not hsts_dirpath:
err_msg = ("Certbot was unable to find the existing HSTS header "
@@ -1681,7 +1652,7 @@ class ApacheConfigurator(common.Installer):
# Our match statement was for string strict-transport-security, but
# we need to update the value instead. The next index is for the value
hsts_dirpath = hsts_dirpath.replace("arg[3]", "arg[4]")
self.parser.aug.set(hsts_dirpath, hsts_maxage)
self.aug.set(hsts_dirpath, hsts_maxage)
note_msg = ("Increasing HSTS max-age value to {0} for VirtualHost "
"in {1}\n".format(nextstep_value, vhost.filep))
logger.debug(note_msg)
@@ -1730,13 +1701,13 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:param unused_options: Not currently used
:type unused_options: Not Available
:returns: Success, general_vhost (HTTP vhost)
:rtype: (bool, :class:`~certbot_apache._internal.obj.VirtualHost`)
:rtype: (bool, :class:`~certbot_apache.obj.VirtualHost`)
"""
min_apache_ver = (2, 3, 3)
@@ -1763,7 +1734,7 @@ class ApacheConfigurator(common.Installer):
# We'll simply delete the directive, so that we'll have a
# consistent OCSP cache path.
if stapling_cache_aug_path:
self.parser.aug.remove(
self.aug.remove(
re.sub(r"/\w*$", "", stapling_cache_aug_path[0]))
self.parser.add_dir_to_ifmodssl(ssl_vhost_aug_path,
@@ -1786,14 +1757,14 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:param header_substring: string that uniquely identifies a header.
e.g: Strict-Transport-Security, Upgrade-Insecure-Requests.
:type str
:returns: Success, general_vhost (HTTP vhost)
:rtype: (bool, :class:`~certbot_apache._internal.obj.VirtualHost`)
:rtype: (bool, :class:`~certbot_apache.obj.VirtualHost`)
:raises .errors.PluginError: If no viable HTTP host can be created or
set with header header_substring.
@@ -1821,7 +1792,7 @@ class ApacheConfigurator(common.Installer):
contains the string header_substring.
:param ssl_vhost: vhost to check
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:param header_substring: string that uniquely identifies a header.
e.g: Strict-Transport-Security, Upgrade-Insecure-Requests.
@@ -1840,7 +1811,7 @@ class ApacheConfigurator(common.Installer):
# "Existing Header directive for virtualhost"
pat = '(?:[ "]|^)(%s)(?:[ "]|$)' % (header_substring.lower())
for match in header_path:
if re.search(pat, self.parser.aug.get(match).lower()):
if re.search(pat, self.aug.get(match).lower()):
raise errors.PluginEnhancementAlreadyPresent(
"Existing %s header" % (header_substring))
@@ -1858,7 +1829,7 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:param unused_options: Not currently used
:type unused_options: Not Available
@@ -1943,7 +1914,7 @@ class ApacheConfigurator(common.Installer):
delete certbot's old rewrite rules and set the new one instead.
:param vhost: vhost to check
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:raises errors.PluginEnhancementAlreadyPresent: When the exact
certbot redirection WriteRule exists in virtual host.
@@ -1967,11 +1938,11 @@ class ApacheConfigurator(common.Installer):
constants.REWRITE_HTTPS_ARGS_WITH_END]
for dir_path, args_paths in rewrite_args_dict.items():
arg_vals = [self.parser.aug.get(x) for x in args_paths]
arg_vals = [self.aug.get(x) for x in args_paths]
# Search for past redirection rule, delete it, set the new one
if arg_vals in constants.OLD_REWRITE_HTTPS_ARGS:
self.parser.aug.remove(dir_path)
self.aug.remove(dir_path)
self._set_https_redirection_rewrite_rule(vhost)
self.save()
raise errors.PluginEnhancementAlreadyPresent(
@@ -1985,7 +1956,7 @@ class ApacheConfigurator(common.Installer):
"""Checks if there exists a RewriteRule directive in vhost
:param vhost: vhost to check
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:returns: True if a RewriteRule directive exists.
:rtype: bool
@@ -1999,7 +1970,7 @@ class ApacheConfigurator(common.Installer):
"""Checks if a RewriteEngine directive is on
:param vhost: vhost to check
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
"""
rewrite_engine_path_list = self.parser.find_dir("RewriteEngine", "on",
@@ -2016,10 +1987,10 @@ class ApacheConfigurator(common.Installer):
"""Creates an http_vhost specifically to redirect for the ssl_vhost.
:param ssl_vhost: ssl vhost
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:returns: tuple of the form
(`success`, :class:`~certbot_apache._internal.obj.VirtualHost`)
(`success`, :class:`~certbot_apache.obj.VirtualHost`)
:rtype: tuple
"""
@@ -2027,7 +1998,7 @@ class ApacheConfigurator(common.Installer):
redirect_filepath = self._write_out_redirect(ssl_vhost, text)
self.parser.aug.load()
self.aug.load()
# Make a new vhost data structure and add it to the lists
new_vhost = self._create_vhost(parser.get_aug_path(self._escape(redirect_filepath)))
self.vhosts.append(new_vhost)
@@ -2145,7 +2116,7 @@ class ApacheConfigurator(common.Installer):
of this method where available.
:param vhost: vhost to enable
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:raises .errors.NotSupportedError: If filesystem layout is not
supported.
@@ -2339,7 +2310,7 @@ class ApacheConfigurator(common.Installer):
Enable the AutoHSTS enhancement for defined domains
:param _unused_lineage: Certificate lineage object, unused
:type _unused_lineage: certbot._internal.storage.RenewableCert
:type _unused_lineage: certbot.storage.RenewableCert
:param domains: List of domains in certificate to enhance
:type domains: str
@@ -2382,7 +2353,7 @@ class ApacheConfigurator(common.Installer):
"""Do the initial AutoHSTS deployment to a vhost
:param ssl_vhost: The VirtualHost object to deploy the AutoHSTS
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost` or None
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost` or None
:raises errors.PluginEnhancementAlreadyPresent: When already enhanced
@@ -2464,7 +2435,7 @@ class ApacheConfigurator(common.Installer):
and changes the HSTS max-age to a high value.
:param lineage: Certificate lineage object
:type lineage: certbot._internal.storage.RenewableCert
:type lineage: certbot.storage.RenewableCert
"""
self._autohsts_fetch_state()
if not self._autohsts:
@@ -2509,4 +2480,4 @@ class ApacheConfigurator(common.Installer):
self._autohsts_save_state()
AutoHSTSEnhancement.register(ApacheConfigurator)
AutoHSTSEnhancement.register(ApacheConfigurator) # pylint: disable=no-member
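The configurator hunks above repeatedly trim the Augeas path returned by find_dir before calling remove, so the whole directive node is deleted rather than a single argument. A minimal sketch of that trimming step, using a made-up Augeas path purely for illustration:

```python
# Illustrative only, not Certbot code: re.sub(r"/\w*$", "", path) strips the
# trailing "/arg" segment, leaving the path of the directive node itself so
# that aug.remove() deletes the entire directive.
import re

arg_path = "/files/etc/apache2/sites-enabled/example.conf/VirtualHost/directive[1]/arg"
directive_node = re.sub(r"/\w*$", "", arg_path)
print(directive_node)
# /files/etc/apache2/sites-enabled/example.conf/VirtualHost/directive[1]
```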

View File

@@ -1,7 +1,6 @@
"""Apache plugin constants."""
import pkg_resources
from certbot.compat import os
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""
@@ -10,7 +9,6 @@ MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-apache-conf-digest.txt"
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""
# NEVER REMOVE A SINGLE HASH FROM THIS LIST UNLESS YOU KNOW EXACTLY WHAT YOU ARE DOING!
ALL_SSL_OPTIONS_HASHES = [
'2086bca02db48daf93468332543c60ac6acdb6f0b58c7bfdf578a5d47092f82a',
'4844d36c9a0f587172d9fa10f4f1c9518e3bcfa1947379f155e16a70a728c21a',
@@ -20,15 +18,11 @@ ALL_SSL_OPTIONS_HASHES = [
'cfdd7c18d2025836ea3307399f509cfb1ebf2612c87dd600a65da2a8e2f2797b',
'80720bd171ccdc2e6b917ded340defae66919e4624962396b992b7218a561791',
'c0c022ea6b8a51ecc8f1003d0a04af6c3f2bc1c3ce506b3c2dfc1f11ef931082',
'717b0a89f5e4c39b09a42813ac6e747cfbdeb93439499e73f4f70a1fe1473f20',
'0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
'86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
'06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
AUGEAS_LENS_DIR = pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "augeas_lens"))
"certbot_apache", "augeas_lens")
"""Path to the Augeas lens directory"""
REWRITE_HTTPS_ARGS = [

View File

@@ -3,10 +3,10 @@ import logging
import zope.component
import certbot.display.util as display_util
from certbot import errors
from certbot import interfaces
from certbot.compat import os
import certbot.display.util as display_util
logger = logging.getLogger(__name__)
@@ -77,7 +77,7 @@ def _vhost_menu(domain, vhosts):
if free_chars < 2:
logger.debug("Display size is too small for "
"certbot_apache._internal.display_ops._vhost_menu()")
"certbot_apache.display_ops._vhost_menu()")
# This runs the edge off the screen, but it doesn't cause an "error"
filename_size = 1
disp_name_size = 1

View File

@@ -4,18 +4,18 @@
from distutils.version import LooseVersion # pylint: disable=no-name-in-module,import-error
from certbot import util
from certbot_apache._internal import configurator
from certbot_apache._internal import override_arch
from certbot_apache._internal import override_centos
from certbot_apache._internal import override_darwin
from certbot_apache._internal import override_debian
from certbot_apache._internal import override_fedora
from certbot_apache._internal import override_gentoo
from certbot_apache._internal import override_suse
from certbot_apache import configurator
from certbot_apache import override_arch
from certbot_apache import override_fedora
from certbot_apache import override_darwin
from certbot_apache import override_debian
from certbot_apache import override_centos
from certbot_apache import override_gentoo
from certbot_apache import override_suse
OVERRIDE_CLASSES = {
"arch": override_arch.ArchConfigurator,
"cloudlinux": override_centos.CentOSConfigurator,
"darwin": override_darwin.DarwinConfigurator,
"debian": override_debian.DebianConfigurator,
"ubuntu": override_debian.DebianConfigurator,
@@ -23,10 +23,7 @@ OVERRIDE_CLASSES = {
"centos linux": override_centos.CentOSConfigurator,
"fedora_old": override_centos.CentOSConfigurator,
"fedora": override_fedora.FedoraConfigurator,
"linuxmint": override_debian.DebianConfigurator,
"ol": override_centos.CentOSConfigurator,
"oracle": override_centos.CentOSConfigurator,
"redhatenterpriseserver": override_centos.CentOSConfigurator,
"red hat enterprise linux server": override_centos.CentOSConfigurator,
"rhel": override_centos.CentOSConfigurator,
"amazon": override_centos.CentOSConfigurator,
@@ -34,9 +31,6 @@ OVERRIDE_CLASSES = {
"gentoo base system": override_gentoo.GentooConfigurator,
"opensuse": override_suse.OpenSUSEConfigurator,
"suse": override_suse.OpenSUSEConfigurator,
"sles": override_suse.OpenSUSEConfigurator,
"scientific": override_centos.CentOSConfigurator,
"scientific linux": override_centos.CentOSConfigurator,
}
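OVERRIDE_CLASSES above is a plain dispatch table from lowercased distribution identifiers to configurator classes. A minimal sketch of that lookup pattern with hypothetical stand-in classes; the class names and the pick_configurator helper are illustrative only, not Certbot's API:

```python
# Illustrative dispatch-by-distro pattern: unknown identifiers fall back to a
# generic default instead of raising.
class GenericConfigurator:
    """Stand-in for the default Apache configurator."""

class DebianConfigurator(GenericConfigurator):
    """Stand-in for a distribution-specific override."""

OVERRIDES = {
    "debian": DebianConfigurator,
    "ubuntu": DebianConfigurator,
}

def pick_configurator(os_name):
    # Identifiers from /etc/os-release vary in case, hence .lower().
    return OVERRIDES.get(os_name.lower(), GenericConfigurator)

print(pick_configurator("Ubuntu").__name__)   # DebianConfigurator
print(pick_configurator("FreeBSD").__name__)  # GenericConfigurator
```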

View File

@@ -1,19 +1,19 @@
"""A class that performs HTTP-01 challenges for Apache"""
import logging
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List, Set # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot_apache._internal.obj import VirtualHost # pylint: disable=unused-import
from certbot_apache._internal.parser import get_aug_path
from certbot_apache.obj import VirtualHost # pylint: disable=unused-import
from certbot_apache.parser import get_aug_path
logger = logging.getLogger(__name__)
class ApacheHttp01(common.ChallengePerformer):
class ApacheHttp01(common.TLSSNI01):
"""Class that performs HTTP-01 challenges within the Apache configurator."""
CONFIG_TEMPLATE22_PRE = """\
@@ -168,7 +168,8 @@ class ApacheHttp01(common.ChallengePerformer):
def _set_up_challenges(self):
if not os.path.isdir(self.challenge_dir):
filesystem.makedirs(self.challenge_dir, 0o755)
os.makedirs(self.challenge_dir)
os.chmod(self.challenge_dir, 0o755)
responses = []
for achall in self.achalls:
@@ -184,7 +185,7 @@ class ApacheHttp01(common.ChallengePerformer):
self.configurator.reverter.register_file_creation(True, name)
with open(name, 'wb') as f:
f.write(validation.encode())
filesystem.chmod(name, 0o644)
os.chmod(name, 0o644)
return response
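The two hunks above differ in how the challenge directory and validation files get their permissions: one side passes a mode while creating, the other calls chmod afterwards. The practical difference is that a mode given at creation time is filtered through the process umask, while an explicit chmod sets the bits exactly. A small sketch under a throwaway temporary directory:

```python
# Illustrative only. With a common umask of 0o022 both approaches end up at
# 0o755, but the explicit chmod guarantees the result regardless of umask.
import os
import stat
import tempfile

base = tempfile.mkdtemp()
challenge_dir = os.path.join(base, ".well-known", "acme-challenge")

os.makedirs(challenge_dir)      # a mode passed here would be masked by umask
os.chmod(challenge_dir, 0o755)  # sets the final permission bits exactly

print(oct(stat.S_IMODE(os.stat(challenge_dir).st_mode)))  # 0o755
```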
@@ -194,8 +195,8 @@ class ApacheHttp01(common.ChallengePerformer):
if vhost not in self.moded_vhosts:
logger.debug(
"Adding a temporary challenge validation Include for name: %s in: %s",
vhost.name, vhost.filep)
"Adding a temporary challenge validation Include for name: %s " +
"in: %s", vhost.name, vhost.filep)
self.configurator.parser.add_dir_beginning(
vhost.path, "Include", self.challenge_conf_pre)
self.configurator.parser.add_dir(

View File

@@ -1,7 +1,7 @@
"""Module contains classes used by the Apache Configurator."""
import re
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from certbot.plugins import common
@@ -24,7 +24,7 @@ class Addr(common.Addr):
return not self.__eq__(other)
def __repr__(self):
return "certbot_apache._internal.obj.Addr(" + repr(self.tup) + ")"
return "certbot_apache.obj.Addr(" + repr(self.tup) + ")"
def __hash__(self): # pylint: disable=useless-super-delegation
# Python 3 requires explicit overridden for __hash__ if __eq__ or
@@ -98,7 +98,7 @@ class Addr(common.Addr):
return self.get_addr_obj(port)
class VirtualHost(object):
class VirtualHost(object): # pylint: disable=too-few-public-methods
"""Represents an Apache Virtualhost.
:ivar str filep: file path of VH
@@ -126,6 +126,7 @@ class VirtualHost(object):
def __init__(self, filep, path, addrs, ssl, enabled, name=None,
aliases=None, modmacro=False, ancestor=None):
# pylint: disable=too-many-arguments
"""Initialize a VH."""
self.filep = filep
self.path = path
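The Addr hunk earlier in this file's diff keeps an explicit __hash__ override alongside __eq__. That is required because Python 3 sets __hash__ to None on any class that defines __eq__ without also defining __hash__. A minimal sketch of that behaviour with throwaway classes:

```python
# Illustrative only: defining __eq__ silently disables hashing unless
# __hash__ is restored explicitly, which is why Addr re-declares it.
class WithEq:
    def __init__(self, value):
        self.value = value

    def __eq__(self, other):
        return isinstance(other, WithEq) and self.value == other.value

print(WithEq.__hash__ is None)  # True: instances are now unhashable

class Hashable(WithEq):
    def __hash__(self):
        return hash(self.value)

print(hash(Hashable("a")) == hash(Hashable("a")))  # True
```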

View File

@@ -1,11 +1,11 @@
""" Distribution specific override class for Arch Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
from certbot_apache import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class ArchConfigurator(configurator.ApacheConfigurator):
@@ -27,5 +27,5 @@ class ArchConfigurator(configurator.ApacheConfigurator):
handle_sites=False,
challenge_location="/etc/httpd/conf",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
"certbot_apache", "options-ssl-apache.conf")
)
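This override, like the constants module above, resolves its bundled options-ssl-apache.conf through pkg_resources, and the hunk only changes where inside the package that file lives. A minimal sketch of how resource_filename resolves packaged data, assuming the certbot_apache distribution is importable in the environment:

```python
# Sketch, not Certbot code: resource_filename() maps a package-relative
# resource name to an absolute path on disk, extracting it from a zip or egg
# if necessary. The resource name must match the installed package layout.
import pkg_resources

conf_path = pkg_resources.resource_filename(
    "certbot_apache", "options-ssl-apache.conf")
print(conf_path)  # e.g. .../site-packages/certbot_apache/options-ssl-apache.conf
```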

View File

@@ -4,15 +4,17 @@ import logging
import pkg_resources
import zope.interface
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import os
from certbot.errors import MisconfigurationError
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot_apache import apache_util
from certbot_apache import configurator
from certbot_apache import parser
logger = logging.getLogger(__name__)
@@ -38,7 +40,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "centos-options-ssl-apache.conf"))
"certbot_apache", "centos-options-ssl-apache.conf")
)
def config_test(self):
@@ -84,7 +86,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
def get_parser(self):
"""Initializes the ApacheParser"""
return CentOSParser(
self.option("server_root"), self.option("vhost_root"),
self.aug, self.option("server_root"), self.option("vhost_root"),
self.version, configurator=self)
def _deploy_cert(self, *args, **kwargs): # pylint: disable=arguments-differ
@@ -153,7 +155,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
for loadmod_path in loadmod_paths:
nodir_path = loadmod_path.split("/directive")[0]
# Remove the old LoadModule directive
self.parser.aug.remove(loadmod_path)
self.aug.remove(loadmod_path)
# Create a new IfModule !mod_ssl.c if not already found on path
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c",

View File

@@ -1,11 +1,11 @@
""" Distribution specific override class for macOS """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
from certbot_apache import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class DarwinConfigurator(configurator.ApacheConfigurator):
@@ -27,5 +27,5 @@ class DarwinConfigurator(configurator.ApacheConfigurator):
handle_sites=False,
challenge_location="/etc/apache2/other",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
"certbot_apache", "options-ssl-apache.conf")
)

View File

@@ -7,10 +7,10 @@ import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache import apache_util
from certbot_apache import configurator
logger = logging.getLogger(__name__)
@@ -35,7 +35,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
handle_sites=True,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
"certbot_apache", "options-ssl-apache.conf")
)
def enable_site(self, vhost):
@@ -45,7 +45,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
modules are enabled appropriately.
:param vhost: vhost to enable
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:raises .errors.NotSupportedError: If filesystem layout is not
supported.
@@ -65,19 +65,20 @@ class DebianConfigurator(configurator.ApacheConfigurator):
try:
os.symlink(vhost.filep, enabled_path)
except OSError as err:
if os.path.islink(enabled_path) and filesystem.realpath(
if os.path.islink(enabled_path) and os.path.realpath(
enabled_path) == vhost.filep:
# Already in shape
vhost.enabled = True
return None
logger.warning(
"Could not symlink %s to %s, got error: %s", enabled_path,
vhost.filep, err.strerror)
errstring = ("Encountered error while trying to enable a " +
"newly created VirtualHost located at {0} by " +
"linking to it from {1}")
raise errors.NotSupportedError(errstring.format(vhost.filep,
enabled_path))
else:
logger.warning(
"Could not symlink %s to %s, got error: %s", enabled_path,
vhost.filep, err.strerror)
errstring = ("Encountered error while trying to enable a " +
"newly created VirtualHost located at {0} by " +
"linking to it from {1}")
raise errors.NotSupportedError(errstring.format(vhost.filep,
enabled_path))
vhost.enabled = True
logger.info("Enabling available site: %s", vhost.filep)
self.save_notes += "Enabled site %s\n" % vhost.filep
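enable_site above creates the sites-enabled symlink and treats an already-correct link as success rather than an error. A minimal sketch of that pattern against throwaway temp paths (the file names are made up, not an Apache layout):

```python
# Illustrative only: retrying the symlink is harmless because an existing,
# correct link is detected and accepted instead of raising.
import os
import tempfile

conf_dir = tempfile.mkdtemp()
available = os.path.join(conf_dir, "example.conf")
enabled = os.path.join(conf_dir, "example-enabled.conf")
open(available, "w").close()

for _ in range(2):  # the second pass hits the "already in shape" branch
    try:
        os.symlink(available, enabled)
    except OSError:
        if not (os.path.islink(enabled)
                and os.path.realpath(enabled) == os.path.realpath(available)):
            raise

print(os.path.islink(enabled))  # True
```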

View File

@@ -5,10 +5,10 @@ import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
from certbot_apache import apache_util
from certbot_apache import configurator
from certbot_apache import parser
@zope.interface.provider(interfaces.IPluginFactory)
@@ -33,7 +33,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
# TODO: eventually newest version of Fedora will need their own config
"certbot_apache", os.path.join("_internal", "centos-options-ssl-apache.conf"))
"certbot_apache", "centos-options-ssl-apache.conf")
)
def config_test(self):
@@ -51,7 +51,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
def get_parser(self):
"""Initializes the ApacheParser"""
return FedoraParser(
self.option("server_root"), self.option("vhost_root"),
self.aug, self.option("server_root"), self.option("vhost_root"),
self.version, configurator=self)
def _try_restart_fedora(self):

View File

@@ -1,13 +1,13 @@
""" Distribution specific override class for Gentoo Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
from certbot_apache import apache_util
from certbot_apache import configurator
from certbot_apache import parser
@zope.interface.provider(interfaces.IPluginFactory)
class GentooConfigurator(configurator.ApacheConfigurator):
@@ -30,7 +30,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
"certbot_apache", "options-ssl-apache.conf")
)
def _prepare_options(self):
@@ -44,7 +44,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
def get_parser(self):
"""Initializes the ApacheParser"""
return GentooParser(
self.option("server_root"), self.option("vhost_root"),
self.aug, self.option("server_root"), self.option("vhost_root"),
self.version, configurator=self)

View File

@@ -1,11 +1,11 @@
""" Distribution specific override class for OpenSUSE """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
from certbot_apache import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class OpenSUSEConfigurator(configurator.ApacheConfigurator):
@@ -27,5 +27,5 @@ class OpenSUSEConfigurator(configurator.ApacheConfigurator):
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
"certbot_apache", "options-ssl-apache.conf")
)

View File

@@ -8,17 +8,16 @@ import sys
import six
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict, List, Set # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import constants
logger = logging.getLogger(__name__)
class ApacheParser(object):
# pylint: disable=too-many-public-methods
"""Class handles the fine details of parsing the Apache Configuration.
.. todo:: Make parsing general... remove sites-available etc...
@@ -33,7 +32,7 @@ class ApacheParser(object):
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
def __init__(self, root, vhostroot=None, version=(2, 4),
def __init__(self, aug, root, vhostroot=None, version=(2, 4),
configurator=None):
# Note: Order is important here.
@@ -42,20 +41,11 @@ class ApacheParser(object):
# issues with aug.load() after adding new files / defines to parse tree
self.configurator = configurator
# Initialize augeas
self.aug = None
self.init_augeas()
if not self.check_aug_version():
raise errors.NotSupportedError(
"Apache plugin support requires libaugeas0 and augeas-lenses "
"version 1.2.0 or higher, please make sure you have you have "
"those installed.")
self.modules = set() # type: Set[str]
self.parser_paths = {} # type: Dict[str, List[str]]
self.variables = {} # type: Dict[str, str]
self.aug = aug
# Find configuration root and make sure augeas can parse it.
self.root = os.path.abspath(root)
self.loc = {"root": self._find_config_root()}
@@ -87,146 +77,6 @@ class ApacheParser(object):
if self.find_dir("Define", exclude=False):
raise errors.PluginError("Error parsing runtime variables")
def init_augeas(self):
""" Initialize the actual Augeas instance """
try:
import augeas
except ImportError: # pragma: no cover
raise errors.NoInstallationError("Problem in Augeas installation")
self.aug = augeas.Augeas(
# specify a directory to load our preferred lens from
loadpath=constants.AUGEAS_LENS_DIR,
# Do not save backup (we do it ourselves), do not load
# anything by default
flags=(augeas.Augeas.NONE |
augeas.Augeas.NO_MODL_AUTOLOAD |
augeas.Augeas.ENABLE_SPAN))
def check_parsing_errors(self, lens):
"""Verify Augeas can parse all of the lens files.
:param str lens: lens to check for errors
:raises .errors.PluginError: If there has been an error in parsing with
the specified lens.
"""
error_files = self.aug.match("/augeas//error")
for path in error_files:
# Check to see if it was an error resulting from the use of
# the httpd lens
lens_path = self.aug.get(path + "/lens")
# As aug.get may return null
if lens_path and lens in lens_path:
msg = (
"There has been an error in parsing the file {0} on line {1}: "
"{2}".format(
# Strip off /augeas/files and /error
path[13:len(path) - 6],
self.aug.get(path + "/line"),
self.aug.get(path + "/message")))
raise errors.PluginError(msg)
def check_aug_version(self):
""" Checks that we have recent enough version of libaugeas.
If augeas version is recent enough, it will support case insensitive
regexp matching"""
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
try:
matches = self.aug.match(
"/test//*[self::arg=~regexp('argument', 'i')]")
except RuntimeError:
self.aug.remove("/test/path")
return False
self.aug.remove("/test/path")
return matches
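check_aug_version above probes for case-insensitive regexp support by writing a throwaway node and matching against it. A standalone version of the same probe, assuming the python-augeas bindings and libaugeas are installed:

```python
# Sketch of the probe: libaugeas builds older than 1.2.0 raise RuntimeError
# on the 'i' regexp flag, which is treated as "not supported".
import augeas

aug = augeas.Augeas(flags=augeas.Augeas.NO_MODL_AUTOLOAD)
aug.set("/test/path/testing/arg", "aRgUMeNT")
try:
    matches = aug.match("/test//*[self::arg=~regexp('argument', 'i')]")
except RuntimeError:
    matches = []
finally:
    aug.remove("/test/path")

print("case-insensitive regexp supported:", bool(matches))
```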
def unsaved_files(self):
"""Lists files that have modified Augeas DOM but the changes have not
been written to the filesystem yet, used by `self.save()` and
ApacheConfigurator to check the file state.
:raises .errors.PluginError: If there was an error in Augeas, in
an attempt to save the configuration, or an error creating a
checkpoint
:returns: `set` of unsaved files
"""
save_state = self.aug.get("/augeas/save")
self.aug.set("/augeas/save", "noop")
# Existing Errors
ex_errs = self.aug.match("/augeas//error")
try:
# This is a noop save
self.aug.save()
except (RuntimeError, IOError):
self._log_save_errors(ex_errs)
# Erase Save Notes
self.configurator.save_notes = ""
raise errors.PluginError(
"Error saving files, check logs for more info.")
# Return the original save method
self.aug.set("/augeas/save", save_state)
# Retrieve list of modified files
# Note: Noop saves can cause the file to be listed twice, I used a
# set to remove this possibility. This is a known augeas 0.10 error.
save_paths = self.aug.match("/augeas/events/saved")
save_files = set()
if save_paths:
for path in save_paths:
save_files.add(self.aug.get(path)[6:])
return save_files
def ensure_augeas_state(self):
"""Makes sure that all Augeas dom changes are written to files to avoid
loss of configuration directives when doing additional augeas parsing,
causing a possible augeas.load() resulting dom reset
"""
if self.unsaved_files():
self.configurator.save_notes += "(autosave)"
self.configurator.save()
def save(self, save_files):
"""Saves all changes to the configuration files.
save() is called from ApacheConfigurator to handle the parser specific
tasks of saving.
:param list save_files: list of strings of file paths that we need to save.
"""
self.configurator.save_notes = ""
self.aug.save()
# Force reload if files were modified
# This is needed to recalculate augeas directive span
if save_files:
for sf in save_files:
self.aug.remove("/files/"+sf)
self.aug.load()
def _log_save_errors(self, ex_errs):
"""Log errors due to bad Augeas save.
:param list ex_errs: Existing errors before save
"""
# Check for the root of save problems
new_errs = self.aug.match("/augeas//error")
# logger.error("During Save - %s", mod_conf)
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
", ".join(err[13:len(err) - 6] for err in new_errs
# Only new errors caused by recent save
if err not in ex_errs), self.configurator.save_notes)
def add_include(self, main_config, inc_path):
"""Add Include for a new configuration file if one does not exist
@@ -284,8 +134,8 @@ class ApacheParser(object):
mods.add(mod_name)
mods.add(os.path.basename(mod_filename)[:-2] + "c")
else:
logger.debug("Could not read LoadModule directive from Augeas path: %s",
match_name[6:])
logger.debug("Could not read LoadModule directive from " +
"Augeas path: %s", match_name[6:])
self.modules.update(mods)
def update_runtime_variables(self):
@@ -625,7 +475,7 @@ class ApacheParser(object):
# https://httpd.apache.org/docs/2.4/mod/core.html#include
for match in matches:
dir_ = self.aug.get(match).lower()
if dir_ in ("include", "includeoptional"):
if dir_ == "include" or dir_ == "includeoptional":
ordered_matches.extend(self.find_dir(
directive, arg,
self._get_include_path(self.get_arg(match + "/arg")),
@@ -665,7 +515,8 @@ class ApacheParser(object):
# e.g. strip now, not later
if not value:
return None
value = value.strip("'\"")
else:
value = value.strip("'\"")
variables = ApacheParser.arg_var_interpreter.findall(value)
@@ -807,7 +658,8 @@ class ApacheParser(object):
use_new, remove_old = self._check_path_actions(filepath)
# Ensure that we have the latest Augeas DOM state on disk before
# calling aug.load() which reloads the state from disk
self.ensure_augeas_state()
if self.configurator:
self.configurator.ensure_augeas_state()
# Test if augeas included file for Httpd.lens
# Note: This works for augeas globs, ie. *.conf
if use_new:

View File

@@ -0,0 +1 @@
"""Certbot Apache Tests"""

View File

@@ -15,7 +15,7 @@ SCRIPT_DIRNAME = os.path.dirname(__file__)
def main(args=None):
if not args:
args = sys.argv[1:]
with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
with acme_server.setup_acme_server('pebble', [], False) as acme_xdist:
environ = os.environ.copy()
environ['SERVER'] = acme_xdist['directory_url']
command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]
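This entry point is where the apache-conf-test harness picks up the ACMEServer context manager that the commits in this compare introduce: entering it starts the test ACME infrastructure and yields the details (such as directory_url) that the surrounding script exports. A minimal sketch of that pattern with a stand-in class; the class body and URL are illustrative, not the real implementation:

```python
# Illustrative only: a context manager that starts a service on enter, hands
# back connection details, and tears everything down on exit.
class FakeACMEServer:
    def __enter__(self):
        # Real code would start pebble/boulder and any challenge proxy here.
        self.acme_xdist = {"directory_url": "https://localhost:14000/dir"}
        return self.acme_xdist

    def __exit__(self, exc_type, exc, tb):
        # Real code would stop the server processes here.
        return False  # do not swallow exceptions

with FakeACMEServer() as acme_xdist:
    print(acme_xdist["directory_url"])
```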

Some files were not shown because too many files have changed in this diff.