Compare commits
176 Commits
test-real-... ... test-apach...
| Author | SHA1 | Date |
|---|---|---|
| | 9e0f58271e | |
| | b1f3c6f859 | |
| | d0c6be244f | |
| | 3c572b84ab | |
| | d4f47e920b | |
| | e1bae626f1 | |
| | df40057bcc | |
| | 95f9ed247a | |
| | 3a411c092f | |
| | 3163801123 | |
| | a68de55696 | |
| | 4bc473a2db | |
| | 23ec9afdb4 | |
| | f401750b43 | |
| | 3ef88a68ed | |
| | e6b2689dcb | |
| | 26aae00c36 | |
| | 0eeae8196d | |
| | 11d9107c95 | |
| | 3becf7be16 | |
| | 0fe28a6459 | |
| | aaeb4582e2 | |
| | fdb0a14812 | |
| | 0324d1740e | |
| | ce325db4e4 | |
| | 74e6736c79 | |
| | 2ed7608ed3 | |
| | eb02acfc4b | |
| | 4f19d516d6 | |
| | 3dd918b024 | |
| | 8320018978 | |
| | c17f2ff6b0 | |
| | 46a2ef8ba1 | |
| | 17c1d016c1 | |
| | 70ed791709 | |
| | d39f63feca | |
| | 6882f006ac | |
| | 9a047a6996 | |
| | a8bd839223 | |
| | a1aef4c15c | |
| | cb7598b007 | |
| | 55cf49cebe | |
| | 933f60a3c1 | |
| | 44eb048098 | |
| | 794ce57356 | |
| | 48d9715bd5 | |
| | c5e1be4fd7 | |
| | e21401004b | |
| | 120137eb8d | |
| | 2911eda3bd | |
| | f1ea37dd71 | |
| | 3d3cbc0d16 | |
| | d978440cb5 | |
| | 0c04ce3c32 | |
| | 987ce2c6b2 | |
| | dded9290b7 | |
| | 745ef6e869 | |
| | e2844bd0ad | |
| | b67fda8832 | |
| | d6e6d64848 | |
| | f4d17d9a6b | |
| | 8bcb04af4a | |
| | 14e10f40e5 | |
| | 1c7105a940 | |
| | 36b4c312c6 | |
| | 56f609d4f5 | |
| | 2d3f3a042a | |
| | bfd4955bad | |
| | 9174c631d9 | |
| | 81e0b92b43 | |
| | d3da19919f | |
| | e6bf3fe7f8 | |
| | 40da709792 | |
| | bf9c681c4f | |
| | 391f301dd8 | |
| | 06a0dae67f | |
| | a35470292e | |
| | 47f64c7280 | |
| | f7c736da6f | |
| | 71ff47daad | |
| | 41a17f913e | |
| | 750d6a9686 | |
| | c4684f187a | |
| | 82ad736120 | |
| | ca893bd836 | |
| | d1934e36fe | |
| | 15b1d8e5a7 | |
| | cbd0a37c7a | |
| | 13c44a0595 | |
| | 89f52ca9f9 | |
| | d0a9695b09 | |
| | add24d4861 | |
| | 74292a10f5 | |
| | 74bf9ef46a | |
| | 2ac99fefe0 | |
| | 43f58ca803 | |
| | 17f2cabbbf | |
| | 7d61e9ea56 | |
| | 20b595bc9e | |
| | 88876b9901 | |
| | 448d159223 | |
| | 3e872627d8 | |
| | 76b7eb0628 | |
| | 4fc30f2ecb | |
| | 1c75b6dacd | |
| | c08a4dec2d | |
| | 4fc0ef0fbe | |
| | 26a1eddd89 | |
| | 1c6210ee00 | |
| | a27f3ebd4f | |
| | a778b50403 | |
| | f2ab6a338c | |
| | 0d5bad6c8c | |
| | dc0cfa21c9 | |
| | a37a4486cf | |
| | 776e939a4c | |
| | 69cf64079c | |
| | 9962cf0b8e | |
| | 4c95b687ae | |
| | a3bbdd52e7 | |
| | 2e3c1d7c77 | |
| | 249af5c4cd | |
| | 9a60f6df78 | |
| | e9bcaaa576 | |
| | 5078b58de9 | |
| | 03cf5d15a6 | |
| | 8efe3fb19a | |
| | 9863c2d18e | |
| | 6172821d90 | |
| | dde16df778 | |
| | 1df778859b | |
| | 20ca47dec6 | |
| | 6c53f5d8ed | |
| | add90cef32 | |
| | 1b54c74621 | |
| | e60651057e | |
| | e394889864 | |
| | d75908c645 | |
| | 72e5d89e95 | |
| | 0c5f526f8b | |
| | 5385375571 | |
| | 7b4201fbdb | |
| | 8106f74dc0 | |
| | 3bceae4a89 | |
| | f18143b117 | |
| | 0cc56677e2 | |
| | 6334d065cf | |
| | c3edc25fb7 | |
| | 23b52ca1c8 | |
| | 02cf051e45 | |
| | 4d034122c6 | |
| | 391f742df7 | |
| | 5c663d4d97 | |
| | 89d907b182 | |
| | d0f1a3e205 | |
| | f3b73c4d2a | |
| | f25a9b2004 | |
| | 3568070c73 | |
| | 8e92577cb0 | |
| | 459ba89aef | |
| | bfd1ce97ef | |
| | 419ad7df1e | |
| | 889aeb31df | |
| | 09b7d2f461 | |
| | 18797dca79 | |
| | 31e81e7ae0 | |
| | 4b06eeae64 | |
| | 641aba68b1 | |
| | 926c8c198c | |
| | 4c299be965 | |
| | 561534b754 | |
| | 7d35f95293 | |
| | d2a2b88090 | |
| | bf818036eb | |
| | 352218510a | |
| | 463d089407 | |
@@ -4,11 +4,15 @@ coverage:
      default: off
      linux:
        flags: linux
        target: auto
        # Fixed target instead of auto set by #7173, can
        # be removed when flags in Codecov are added back.
        target: 97.5
        threshold: 0.1
        base: auto
      windows:
        flags: windows
        target: auto
        # Fixed target instead of auto set by #7173, can
        # be removed when flags in Codecov are added back.
        target: 97.6
        threshold: 0.1
        base: auto
35  .github/stale.yml  vendored  Normal file
@@ -0,0 +1,35 @@
# Configuration for https://github.com/marketplace/stale

# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365

# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30

# Ignore issues with an assignee (defaults to false)
exemptAssignees: true

# Label to use when marking as stale
staleLabel: needs-update

# Comment to post when marking as stale. Set to `false` to disable
markComment: >
  We've made a lot of changes to Certbot since this issue was opened. If you
  still have this issue with an up-to-date version of Certbot, can you please
  add a comment letting us know? This helps us to better see what issues are
  still affecting our users. If there is no activity in the next 30 days, this
  issue will be automatically closed.

# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
  This issue has been closed due to lack of activity, but if you think it
  should be reopened, please open a new issue with a link to this one and we'll
  take a look.

# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1

# Don't mark pull requests as stale.
only: issues
5  .gitignore  vendored
@@ -44,3 +44,8 @@ tests/letstest/venv/

# docker files
.docker

# certbot tests
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*
176  .travis.yml
@@ -5,9 +5,21 @@ cache:
    - $HOME/.cache/pip

before_script:
  # Install required apt packages
  - |
    if [[ "$TRAVIS_OS_NAME" != "osx" ]]; then
      ./certbot-auto --non-interactive --os-packages-only
      sudo -E apt-get -yq --no-install-suggests --no-install-recommends install nginx-light
      sudo -E /etc/init.d/nginx stop
      sudo -E apt-get -yq --no-install-suggests --no-install-recommends install apache2
      sudo -E /etc/init.d/apache2 stop
      sudo -E chmod 777 -R /var/lib/apache2/module
    fi
  - 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
  # On Travis, the fastest parallelization for integration tests has proved to be 4.
  - 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
  # Use Travis retry feature for farm tests since they are flaky
  - 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
  - export TOX_TESTENV_PASSENV=TRAVIS

# Only build pushes to the master branch, PRs, and branches beginning with
@@ -16,6 +28,9 @@ before_script:
# is a cap of on the number of simultaneous runs.
branches:
  only:
    # apache-parser-v2 is a temporary branch for doing work related to
    # rewriting the parser in the Apache plugin.
    - apache-parser-v2
    - master
    - /^\d+\.\d+\.x$/
    - /^test-.*$/
@@ -24,29 +39,22 @@ branches:
not-on-master: &not-on-master
  if: NOT (type = push AND branch = master)

# Jobs for the extended test suite are executed for cron jobs and pushes on non-master branches.
# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
  if: type = cron OR (type = push AND branch != master)
  if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))

matrix:
  include:
    # Main test suite
    - python: "2.7"
      env: ACME_SERVER=pebble TOXENV=integration
      sudo: required
      services: docker
      <<: *not-on-master

    # This job is always executed, including on master
    - python: "2.7"
      env: TOXENV=py27-cover FYI="py27 tests + code coverage"

    - sudo: required
      env: TOXENV=nginx_compat
      services: docker
      before_install:
      addons:
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=lint
      <<: *not-on-master
@@ -57,43 +65,32 @@ matrix:
      env: TOXENV=mypy
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV='py27-{acme,apache,certbot,dns,nginx,postfix}-oldest'
      sudo: required
      services: docker
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
      <<: *not-on-master
    - python: "3.4"
      env: TOXENV=py34
      sudo: required
      services: docker
      <<: *not-on-master
    - python: "3.7"
      dist: xenial
      env: TOXENV=py37
      sudo: required
      services: docker
      <<: *not-on-master
    - sudo: required
      env: TOXENV=apache_compat
      services: docker
      before_install:
      addons:
    - env: TOXENV=apache_compat
      <<: *not-on-master
    - sudo: required
      env: TOXENV=le_auto_trusty
      services: docker
      before_install:
      addons:
    - env: TOXENV=le_auto_xenial
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=apacheconftest-with-pebble
      sudo: required
      services: docker
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=nginxroundtrip
      <<: *not-on-master

    # Extended test suite on cron jobs and pushes to tested branches other than master
    - env: TOXENV=nginx_compat
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-apache2
@@ -117,44 +114,29 @@ matrix:
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "3.7"
      dist: xenial
      env: TOXENV=py37 CERTBOT_NO_PIN=1
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: TOXENV=py27-certbot-oldest
      <<: *extended-test-suite
    - python: "2.7"
      env: TOXENV=py27-nginx-oldest
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
      sudo: required
      services: docker
      dist: trusty # See py27-{acme,apache,certbot,dns,nginx}-oldest tests
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
      sudo: required
      services: docker
      dist: trusty # See py27-{acme,apache,certbot,dns,nginx}-oldest tests
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
      sudo: required
      services: docker
      dist: trusty # See py27-{acme,apache,certbot,dns,nginx}-oldest tests
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
      sudo: required
      services: docker
      dist: trusty # See py27-{acme,apache,certbot,dns,nginx}-oldest tests
      <<: *extended-test-suite
    - python: "3.4"
      env: TOXENV=py34
@@ -166,74 +148,44 @@ matrix:
      env: TOXENV=py36
      <<: *extended-test-suite
    - python: "3.7"
      dist: xenial
      env: TOXENV=py37
      <<: *extended-test-suite
    - python: "3.4"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.4"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.5"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.5"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.6"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.6"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.7"
      dist: xenial
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.7"
      dist: xenial
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_xenial
      services: docker
    - env: TOXENV=le_auto_jessie
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_jessie
      services: docker
    - env: TOXENV=le_auto_centos6
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_centos6
      services: docker
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=docker_dev
      services: docker
      addons:
        apt:
          packages: # don't install nginx and apache
            - libaugeas0
    - env: TOXENV=docker_dev
      <<: *extended-test-suite
    - language: generic
      env: TOXENV=py27
      os: osx
      # Using this osx_image is a workaround for
      # https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
      osx_image: xcode10.2
      addons:
        homebrew:
          packages:
@@ -243,6 +195,9 @@ matrix:
    - language: generic
      env: TOXENV=py3
      os: osx
      # Using this osx_image is a workaround for
      # https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
      osx_image: xcode10.2
      addons:
        homebrew:
          packages:
@@ -250,34 +205,29 @@ matrix:
            - python3
      <<: *extended-test-suite

# container-based infrastructure
sudo: false

addons:
  apt:
    packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
      - python-dev
      - python-virtualenv
      - gcc
      - libaugeas0
      - libssl-dev
      - libffi-dev
      - ca-certificates
      # For certbot-nginx integration testing
      - nginx-light
      - openssl

install: "$(command -v pip || command -v pip3) install codecov tox"
script: tox
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: 'tools/pip_install.py -U codecov tox virtualenv'
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
# script command. It is set only to `travis_retry` during farm tests, in
# order to trigger the Travis retry feature, and compensate the inherent
# flakiness of these specific tests.
script: '$TRAVIS_RETRY tox'

after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'

notifications:
  email: false
  irc:
    channels:
      - secure: "SGWZl3ownKx9xKVV2VnGt7DqkTmutJ89oJV9tjKhSs84kLijU6EYdPnllqISpfHMTxXflNZuxtGo0wTDYHXBuZL47w1O32W6nzuXdra5zC+i4sYQwYULUsyfOv9gJX8zWAULiK0Z3r0oho45U+FR5ZN6TPCidi8/eGU+EEPwaAw="
    on_cancel: never
    on_success: never
    on_failure: always
    use_notice: true
#notifications:
#  email: false
#  irc:
#    channels:
#      # This is set to a secure variable to prevent forks from sending
#      # notifications. This value was created by installing
#      # https://github.com/travis-ci/travis.rb and running
#      # `travis encrypt "chat.freenode.net#certbot-devel"`.
#      - secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
#      on_cancel: never
#      on_success: never
#      on_failure: always
@@ -5,6 +5,7 @@ Authors
* Aaron Zuehlke
* Ada Lovelace
* [Adam Woodbeck](https://github.com/awoodbeck)
* [Adrien Ferrand](https://github.com/adferrand)
* [Aidin Gharibnavaz](https://github.com/aidin36)
* [AJ ONeal](https://github.com/coolaj86)
* [Alcaro](https://github.com/Alcaro)
@@ -14,6 +15,7 @@ Authors
* [Alex Gaynor](https://github.com/alex)
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [Andrew Murray](https://github.com/radarhere)
* [Anselm Levskaya](https://github.com/levskaya)
@@ -75,6 +77,7 @@ Authors
* [Fabian](https://github.com/faerbit)
* [Faidon Liambotis](https://github.com/paravoid)
* [Fan Jiang](https://github.com/tcz001)
* [Felix Lechner](https://github.com/lechner)
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
@@ -159,6 +162,7 @@ Authors
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
* [Michael Watters](https://github.com/blackknight36)
* [Michal Moravec](https://github.com/https://github.com/Majkl578)
* [Michal Papis](https://github.com/mpapis)
* [Minn Soe](https://github.com/MinnSoe)
116  CHANGELOG.md
@@ -2,12 +2,109 @@

Certbot adheres to [Semantic Versioning](https://semver.org/).

## 0.35.0 - master
## 0.38.0 - master

### Added

* dns_rfc2136 plugin now supports explicitly specifing an authorative
  base domain for cases when the automatic method does not work (e.g.
*

### Changed

* If Certbot fails to rollback your server configuration, the error message
  links to the Let's Encrypt forum. Change the link to the Help category now
  that the Server category has been closed.
* Replace platform.linux_distribution with distro.linux_distribution as a step
  towards Python 3.8 support in Certbot.

### Fixed

* Fixed OS detection in the Apache plugin on Scientific Linux.

More details about these changes can be found on our GitHub repo.
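For context on the `platform.linux_distribution` to `distro.linux_distribution` change noted under Changed for 0.38.0 above, a minimal sketch of the migration; this is illustrative only and not Certbot's actual code (the fallback ordering and variable name are assumptions):

```python
# platform.linux_distribution() was deprecated and removed in Python 3.8;
# the third-party "distro" package offers an equivalent call.
try:
    import distro
    linux_dist = distro.linux_distribution(full_distribution_name=False)
except ImportError:  # assumed fallback for environments without distro installed
    import platform
    linux_dist = platform.linux_distribution()  # unavailable on Python 3.8+

print(linux_dist)  # e.g. ('Ubuntu', '18.04', 'bionic')
```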
## 0.37.2 - 2019-08-21

* Stop disabling TLS session tickets in Nginx as it caused TLS failures on
  some systems.

More details about these changes can be found on our GitHub repo.

## 0.37.1 - 2019-08-08

### Fixed

* Stop disabling TLS session tickets in Apache as it caused TLS failures on
  some systems.

More details about these changes can be found on our GitHub repo.

## 0.37.0 - 2019-08-07

### Added

* Turn off session tickets for apache plugin by default
* acme: Authz deactivation added to `acme` module.

### Changed

* Follow updated Mozilla recommendations for Nginx ssl_protocols, ssl_ciphers,
  and ssl_prefer_server_ciphers

### Fixed

* Fix certbot-auto failures on RHEL 8.

More details about these changes can be found on our GitHub repo.

## 0.36.0 - 2019-07-11

### Added

* Turn off session tickets for nginx plugin by default
* Added missing error types from RFC8555 to acme

### Changed

* Support for Ubuntu 14.04 Trusty has been removed.
* Update the 'manage your account' help to be more generic.
* The error message when Certbot's Apache plugin is unable to modify your
  Apache configuration has been improved.
* Certbot's config_changes subcommand has been deprecated and will be
  removed in a future release.
* `certbot config_changes` no longer accepts a --num parameter.
* The functions `certbot.plugins.common.Installer.view_config_changes` and
  `certbot.reverter.Reverter.view_config_changes` have been deprecated and will
  be removed in a future release.

### Fixed

* Replace some unnecessary platform-specific line separation.

More details about these changes can be found on our GitHub repo.

## 0.35.1 - 2019-06-10

### Fixed

* Support for specifying an authoritative base domain in our dns-rfc2136 plugin
  has been removed. This feature was added in our last release but had a bug
  which caused the plugin to fail so the feature has been removed until it can
  be added properly.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being, however, the only
package with changes other than its version number was:

* certbot-dns-rfc2136

More details about these changes can be found on our GitHub repo.

## 0.35.0 - 2019-06-05

### Added

* dns_rfc2136 plugin now supports explicitly specifing an authorative
  base domain for cases when the automatic method does not work (e.g.
  Split horizon DNS)

### Changed
@@ -16,12 +113,19 @@ Certbot adheres to [Semantic Versioning](https://semver.org/).

### Fixed

*
* Renewal parameter `webroot_path` is always saved, avoiding some regressions
  when `webroot` authenticator plugin is invoked with no challenge to perform.
* Certbot now accepts OCSP responses when an explicit authorized
  responder, different from the issuer, is used to sign OCSP
  responses.
* Scripts in Certbot hook directories are no longer executed when their
  filenames end in a tilde.

Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being, however, the only
package with changes other than its version number was:

* certbot
* certbot-dns-rfc2136

More details about these changes can be found on our GitHub repo.
@@ -68,6 +172,10 @@ More details about these changes can be found on our GitHub repo.
  `malformed` error to be received from the ACME server.
* Linode DNS plugin now supports api keys created from their new panel
  at [cloud.linode.com](https://cloud.linode.com)

### Fixed

* Fixed Google DNS Challenge issues when private zones exist
* Adding a warning noting that future versions of Certbot will automatically configure the
  webserver so that all requests redirect to secure HTTPS access. You can control this
  behavior and disable this warning with the --redirect and --no-redirect flags.
35  Dockerfile
@@ -1,35 +0,0 @@
FROM python:2-alpine3.9

ENTRYPOINT [ "certbot" ]
EXPOSE 80 443
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot

COPY CHANGELOG.md README.rst setup.py src/

# Generate constraints file to pin dependency versions
COPY letsencrypt-auto-source/pieces/dependency-requirements.txt .
COPY tools /opt/certbot/tools
RUN sh -c 'cat dependency-requirements.txt | /opt/certbot/tools/strip_hashes.py > unhashed_requirements.txt'
RUN sh -c 'cat tools/dev_constraints.txt unhashed_requirements.txt | /opt/certbot/tools/merge_requirements.py > docker_constraints.txt'

COPY acme src/acme
COPY certbot src/certbot

RUN apk add --no-cache --virtual .certbot-deps \
        libffi \
        libssl1.1 \
        openssl \
        ca-certificates \
        binutils
RUN apk add --no-cache --virtual .build-deps \
        gcc \
        linux-headers \
        openssl-dev \
        musl-dev \
        libffi-dev \
    && pip install -r /opt/certbot/dependency-requirements.txt \
    && pip install --no-cache-dir --no-deps \
        --editable /opt/certbot/src/acme \
        --editable /opt/certbot/src \
    && apk del .build-deps
@@ -1,5 +1,5 @@
# This Dockerfile builds an image for development.
FROM ubuntu:xenial
FROM debian:buster

# Note: this only exposes the port to other docker containers.
EXPOSE 80 443

@@ -1,75 +0,0 @@
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
# it is more likely developers will already have ubuntu:trusty rather
# than e.g. debian:jessie and image size differences are negligible
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>

# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443

# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt

WORKDIR /opt/certbot

# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.

ENV DEBIAN_FRONTEND=noninteractive

COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* \
           /tmp/* \
           /var/tmp/*

# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible


COPY setup.py README.rst CHANGELOG.md MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/

# all above files are necessary for setup.py and venv setup, however,
# package source code directory has to be copied separately to a
# subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters, three files are far
# more likely to be cached than the whole project directory

COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/


RUN VIRTUALENV_NO_DOWNLOAD=1 virtualenv --no-site-packages -p python2 /opt/certbot/venv

# PATH is set now so pipstrap upgrades the correct (v)env
ENV PATH /opt/certbot/venv/bin:$PATH
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
    /opt/certbot/venv/bin/pip install \
    -e /opt/certbot/src/acme \
    -e /opt/certbot/src \
    -e /opt/certbot/src/certbot-apache \
    -e /opt/certbot/src/certbot-nginx

# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.

# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
ENV PATH /opt/certbot/bin:$PATH

ENTRYPOINT [ "certbot" ]
@@ -123,6 +123,21 @@ class ClientBase(object):  # pylint: disable=too-many-instance-attributes
        """
        return self.update_registration(regr, update={'status': 'deactivated'})

    def deactivate_authorization(self, authzr):
        # type: (messages.AuthorizationResource) -> messages.AuthorizationResource
        """Deactivate authorization.

        :param messages.AuthorizationResource authzr: The Authorization resource
            to be deactivated.

        :returns: The Authorization resource that was deactivated.
        :rtype: `.AuthorizationResource`

        """
        body = messages.UpdateAuthorization(status='deactivated')
        response = self._post(authzr.uri, body)
        return self._authzr_from_response(response)

    def _authzr_from_response(self, response, identifier=None, uri=None):
        authzr = messages.AuthorizationResource(
            body=messages.Authorization.from_json(response.json()),
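A minimal usage sketch of the `deactivate_authorization` method added above, assuming `client` is an already-constructed acme client object (a `ClientBase` subclass) and `authzr` is a previously obtained `messages.AuthorizationResource`; this is illustrative, not code from the diff:

```python
from acme import messages

# Hypothetical caller: give up on an authorization we no longer intend to fulfill.
updated_authzr = client.deactivate_authorization(authzr)
assert updated_authzr.body.status == messages.STATUS_DEACTIVATED
```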
@@ -637,6 +637,14 @@ class ClientTest(ClientTestBase):
            errors.PollError, self.client.poll_and_request_issuance,
            csr, authzrs, mintime=mintime, max_attempts=2)

    def test_deactivate_authorization(self):
        authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
        self.response.json.return_value = authzb.to_json()
        authzr = self.client.deactivate_authorization(self.authzr)
        self.assertEqual(authzb, authzr.body)
        self.assertEqual(self.client.net.post.call_count, 1)
        self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])

    def test_check_cert(self):
        self.response.headers['Location'] = self.certr.uri
        self.response.content = CERT_DER
@@ -18,20 +18,35 @@ OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"

ERROR_CODES = {
    'accountDoesNotExist': 'The request specified an account that does not exist',
    'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
                      ' already been revoked',
    'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'badPublicKey': 'The JWS was signed by a public key the server does not support',
    'badRevocationReason': 'The revocation reason provided is not allowed by the server',
    'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
    'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
           ' a certificate',
    'compound': 'Specific error conditions are indicated in the "subproblems" array',
    'connection': ('The server could not connect to the client to verify the'
                   ' domain'),
    'dns': 'There was a problem with a DNS query during identifier validation',
    'dnssec': 'The server could not validate a DNSSEC signed domain',
    'incorrectResponse': 'Response recieved didn\'t match the challenge\'s requirements',
    # deprecate invalidEmail
    'invalidEmail': 'The provided email for a registration was invalid',
    'invalidContact': 'The provided contact URI was invalid',
    'malformed': 'The request message was malformed',
    'rejectedIdentifier': 'The server will not issue certificates for the identifier',
    'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
    'rateLimited': 'There were too many requests of a given type',
    'serverInternal': 'The server experienced an internal error',
    'tls': 'The server experienced a TLS error during domain verification',
    'unauthorized': 'The client lacks sufficient authorization',
    'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
    'unknownHost': 'The server could not resolve a domain name',
    'unsupportedIdentifier': 'An identifier is of an unsupported type',
    'externalAccountRequired': 'The server requires external account binding',
}
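A minimal sketch (not part of the diff) of how the `ERROR_CODES` table and the two prefixes above could be used to turn an ACME problem `type` URN into a readable description; the helper name `describe_error` is hypothetical:

```python
from acme import messages

def describe_error(error_type):
    """Map an ACME problem 'type' URN to the text in ERROR_CODES, if known."""
    for prefix in (messages.ERROR_PREFIX, messages.OLD_ERROR_PREFIX):
        if error_type.startswith(prefix):
            code = error_type[len(prefix):]
            return messages.ERROR_CODES.get(code, "unrecognized error code: " + code)
    return "not an ACME error type: " + error_type

print(describe_error("urn:ietf:params:acme:error:badNonce"))
# -> The client sent an unacceptable anti-replay nonce
```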
@@ -153,6 +168,7 @@ STATUS_VALID = Status('valid')
STATUS_INVALID = Status('invalid')
STATUS_REVOKED = Status('revoked')
STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')


class IdentifierType(_Constant):
@@ -456,7 +472,7 @@ class Authorization(ResourceBody):
    :ivar datetime.datetime expires:

    """
    identifier = jose.Field('identifier', decoder=Identifier.from_json)
    identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
    challenges = jose.Field('challenges', omitempty=True)
    combinations = jose.Field('combinations', omitempty=True)

@@ -486,6 +502,12 @@ class NewAuthorization(Authorization):
    resource = fields.Resource(resource_type)


class UpdateAuthorization(Authorization):
    """Update authorization."""
    resource_type = 'authz'
    resource = fields.Resource(resource_type)


class AuthorizationResource(ResourceWithURI):
    """Authorization Resource.


@@ -3,7 +3,7 @@ from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys

version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Please update tox.ini when modifying dependency version requirements
install_requires = [
15  appveyor.yml
@@ -4,9 +4,13 @@ environment:
  matrix:
    - TOXENV: py35
    - TOXENV: py37-cover
    - TOXENV: integration-certbot

branches:
  only:
    # apache-parser-v2 is a temporary branch for doing work related to
    # rewriting the parser in the Apache plugin.
    - apache-parser-v2
    - master
    - /^\d+\.\d+\.x$/ # Version branches like X.X.X
    - /^test-.*$/
@@ -21,13 +25,16 @@ init:

install:
  # Use Python 3.7 by default
  - "SET PATH=C:\\Python37;C:\\Python37\\Scripts;%PATH%"
  - SET PATH=C:\\Python37;C:\\Python37\\Scripts;%PATH%
  # Using 4 processes is proven to be the most efficient integration tests config for AppVeyor
  - IF %TOXENV%==integration-certbot SET PYTEST_ADDOPTS=--numprocesses=4
  # Check env
  - "python --version"
  - python --version
  # Upgrade pip to avoid warnings
  - "python -m pip install --upgrade pip"
  - python -m pip install --upgrade pip
  # Ready to install tox and coverage
  - "pip install tox codecov"
  # tools/pip_install.py is used to pin packages to a known working version.
  - python tools\\pip_install.py tox codecov

build: off
@@ -1,207 +0,0 @@
"""Class of Augeas Configurators."""
import logging


from certbot import errors
from certbot.plugins import common

from certbot_apache import constants

logger = logging.getLogger(__name__)


class AugeasConfigurator(common.Installer):
    """Base Augeas Configurator class.

    :ivar config: Configuration.
    :type config: :class:`~certbot.interfaces.IConfig`

    :ivar aug: Augeas object
    :type aug: :class:`augeas.Augeas`

    :ivar str save_notes: Human-readable configuration change notes
    :ivar reverter: saves and reverts checkpoints
    :type reverter: :class:`certbot.reverter.Reverter`

    """
    def __init__(self, *args, **kwargs):
        super(AugeasConfigurator, self).__init__(*args, **kwargs)

        # Placeholder for augeas
        self.aug = None

        self.save_notes = ""


    def init_augeas(self):
        """ Initialize the actual Augeas instance """
        import augeas
        self.aug = augeas.Augeas(
            # specify a directory to load our preferred lens from
            loadpath=constants.AUGEAS_LENS_DIR,
            # Do not save backup (we do it ourselves), do not load
            # anything by default
            flags=(augeas.Augeas.NONE |
                   augeas.Augeas.NO_MODL_AUTOLOAD |
                   augeas.Augeas.ENABLE_SPAN))
        # See if any temporary changes need to be recovered
        # This needs to occur before VirtualHost objects are setup...
        # because this will change the underlying configuration and potential
        # vhosts
        self.recovery_routine()

    def check_parsing_errors(self, lens):
        """Verify Augeas can parse all of the lens files.

        :param str lens: lens to check for errors

        :raises .errors.PluginError: If there has been an error in parsing with
            the specified lens.

        """
        error_files = self.aug.match("/augeas//error")

        for path in error_files:
            # Check to see if it was an error resulting from the use of
            # the httpd lens
            lens_path = self.aug.get(path + "/lens")
            # As aug.get may return null
            if lens_path and lens in lens_path:
                msg = (
                    "There has been an error in parsing the file {0} on line {1}: "
                    "{2}".format(
                        # Strip off /augeas/files and /error
                        path[13:len(path) - 6],
                        self.aug.get(path + "/line"),
                        self.aug.get(path + "/message")))
                raise errors.PluginError(msg)

    def ensure_augeas_state(self):
        """Makes sure that all Augeas dom changes are written to files to avoid
        loss of configuration directives when doing additional augeas parsing,
        causing a possible augeas.load() resulting dom reset
        """

        if self.unsaved_files():
            self.save_notes += "(autosave)"
            self.save()

    def unsaved_files(self):
        """Lists files that have modified Augeas DOM but the changes have not
        been written to the filesystem yet, used by `self.save()` and
        ApacheConfigurator to check the file state.

        :raises .errors.PluginError: If there was an error in Augeas, in
            an attempt to save the configuration, or an error creating a
            checkpoint

        :returns: `set` of unsaved files
        """
        save_state = self.aug.get("/augeas/save")
        self.aug.set("/augeas/save", "noop")
        # Existing Errors
        ex_errs = self.aug.match("/augeas//error")
        try:
            # This is a noop save
            self.aug.save()
        except (RuntimeError, IOError):
            self._log_save_errors(ex_errs)
            # Erase Save Notes
            self.save_notes = ""
            raise errors.PluginError(
                "Error saving files, check logs for more info.")

        # Return the original save method
        self.aug.set("/augeas/save", save_state)

        # Retrieve list of modified files
        # Note: Noop saves can cause the file to be listed twice, I used a
        # set to remove this possibility. This is a known augeas 0.10 error.
        save_paths = self.aug.match("/augeas/events/saved")

        save_files = set()
        if save_paths:
            for path in save_paths:
                save_files.add(self.aug.get(path)[6:])
        return save_files

    def save(self, title=None, temporary=False):
        """Saves all changes to the configuration files.

        This function first checks for save errors, if none are found,
        all configuration changes made will be saved. According to the
        function parameters. If an exception is raised, a new checkpoint
        was not created.

        :param str title: The title of the save. If a title is given, the
            configuration will be saved as a new checkpoint and put in a
            timestamped directory.

        :param bool temporary: Indicates whether the changes made will
            be quickly reversed in the future (ie. challenges)

        """
        save_files = self.unsaved_files()
        if save_files:
            self.add_to_checkpoint(save_files,
                                   self.save_notes, temporary=temporary)

        self.save_notes = ""
        self.aug.save()

        # Force reload if files were modified
        # This is needed to recalculate augeas directive span
        if save_files:
            for sf in save_files:
                self.aug.remove("/files/"+sf)
            self.aug.load()
        if title and not temporary:
            self.finalize_checkpoint(title)

    def _log_save_errors(self, ex_errs):
        """Log errors due to bad Augeas save.

        :param list ex_errs: Existing errors before save

        """
        # Check for the root of save problems
        new_errs = self.aug.match("/augeas//error")
        # logger.error("During Save - %s", mod_conf)
        logger.error("Unable to save files: %s. Attempted Save Notes: %s",
                     ", ".join(err[13:len(err) - 6] for err in new_errs
                               # Only new errors caused by recent save
                               if err not in ex_errs), self.save_notes)

    # Wrapper functions for Reverter class
    def recovery_routine(self):
        """Revert all previously modified files.

        Reverts all modified files that have not been saved as a checkpoint

        :raises .errors.PluginError: If unable to recover the configuration

        """
        super(AugeasConfigurator, self).recovery_routine()
        # Need to reload configuration after these changes take effect
        self.aug.load()

    def revert_challenge_config(self):
        """Used to cleanup challenge configurations.

        :raises .errors.PluginError: If unable to revert the challenge config.

        """
        self.revert_temporary_config()
        self.aug.load()

    def rollback_checkpoints(self, rollback=1):
        """Rollback saved checkpoints.

        :param int rollback: Number of checkpoints to revert

        :raises .errors.PluginError: If there is a problem with the input or
            the function is unable to correctly revert the configuration

        """
        super(AugeasConfigurator, self).rollback_checkpoints(rollback)
        self.aug.load()
@@ -1,4 +1,4 @@
"""Apache Configuration based off of Augeas Configurator."""
"""Apache Configurator."""
# pylint: disable=too-many-lines
import copy
import fnmatch
@@ -23,13 +23,13 @@ from certbot import interfaces
from certbot import util

from certbot.achallenges import KeyAuthorizationAnnotatedChallenge  # pylint: disable=unused-import
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot.plugins.util import path_surgery
from certbot.plugins.enhancements import AutoHSTSEnhancement

from certbot_apache import apache_util
from certbot_apache import augeas_configurator
from certbot_apache import constants
from certbot_apache import display_ops
from certbot_apache import http_01
@@ -70,13 +70,10 @@ logger = logging.getLogger(__name__)

@zope.interface.implementer(interfaces.IAuthenticator, interfaces.IInstaller)
@zope.interface.provider(interfaces.IPluginFactory)
class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
class ApacheConfigurator(common.Installer):
    # pylint: disable=too-many-instance-attributes,too-many-public-methods
    """Apache configurator.

    State of Configurator: This code has been been tested and built for Ubuntu
    14.04 Apache 2.4 and it works for Ubuntu 12.04 Apache 2.2

    :ivar config: Configuration.
    :type config: :class:`~certbot.interfaces.IConfig`

@@ -201,6 +198,8 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
        self._enhanced_vhosts = defaultdict(set)  # type: DefaultDict[str, Set[obj.VirtualHost]]
        # Temporary state for AutoHSTS enhancement
        self._autohsts = {}  # type: Dict[str, Dict[str, Union[int, float]]]
        # Reverter save notes
        self.save_notes = ""

        # These will be set in the prepare function
        self._prepared = False
@@ -231,12 +230,6 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
        :raises .errors.PluginError: If there is any other error

        """
        # Perform the actual Augeas initialization to be able to react
        try:
            self.init_augeas()
        except ImportError:
            raise errors.NoInstallationError("Problem in Augeas installation")

        self._prepare_options()

        # Verify Apache is installed
@@ -254,16 +247,14 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
            raise errors.NotSupportedError(
                "Apache Version {0} not supported.".format(str(self.version)))

        if not self._check_aug_version():
            raise errors.NotSupportedError(
                "Apache plugin support requires libaugeas0 and augeas-lenses "
                "version 1.2.0 or higher, please make sure you have you have "
                "those installed.")

        # Recover from previous crash before Augeas initialization to have the
        # correct parse tree from the get go.
        self.recovery_routine()
        # Perform the actual Augeas initialization to be able to react
        self.parser = self.get_parser()

        # Check for errors in parsing files with Augeas
        self.check_parsing_errors("httpd.aug")
        self.parser.check_parsing_errors("httpd.aug")

        # Get all of the available vhosts
        self.vhosts = self.get_virtual_hosts()
@@ -276,9 +267,73 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
            util.lock_dir_until_exit(self.option("server_root"))
        except (OSError, errors.LockError):
            logger.debug("Encountered error:", exc_info=True)
            raise errors.PluginError("Unable to lock {0}".format(self.option("server_root")))
            raise errors.PluginError(
                "Unable to create a lock file in {0}. Are you running"
                " Certbot with sufficient privileges to modify your"
                " Apache configuration?".format(self.option("server_root")))
        self._prepared = True

    def save(self, title=None, temporary=False):
        """Saves all changes to the configuration files.

        This function first checks for save errors, if none are found,
        all configuration changes made will be saved. According to the
        function parameters. If an exception is raised, a new checkpoint
        was not created.

        :param str title: The title of the save. If a title is given, the
            configuration will be saved as a new checkpoint and put in a
            timestamped directory.

        :param bool temporary: Indicates whether the changes made will
            be quickly reversed in the future (ie. challenges)

        """
        save_files = self.parser.unsaved_files()
        if save_files:
            self.add_to_checkpoint(save_files,
                                   self.save_notes, temporary=temporary)
        # Handle the parser specific tasks
        self.parser.save(save_files)
        if title and not temporary:
            self.finalize_checkpoint(title)

    def recovery_routine(self):
        """Revert all previously modified files.

        Reverts all modified files that have not been saved as a checkpoint

        :raises .errors.PluginError: If unable to recover the configuration

        """
        super(ApacheConfigurator, self).recovery_routine()
        # Reload configuration after these changes take effect if needed
        # ie. ApacheParser has been initialized.
        if self.parser:
            # TODO: wrap into non-implementation specific parser interface
            self.parser.aug.load()

    def revert_challenge_config(self):
        """Used to cleanup challenge configurations.

        :raises .errors.PluginError: If unable to revert the challenge config.

        """
        self.revert_temporary_config()
        self.parser.aug.load()

    def rollback_checkpoints(self, rollback=1):
        """Rollback saved checkpoints.

        :param int rollback: Number of checkpoints to revert

        :raises .errors.PluginError: If there is a problem with the input or
            the function is unable to correctly revert the configuration

        """
        super(ApacheConfigurator, self).rollback_checkpoints(rollback)
        self.parser.aug.load()

    def _verify_exe_availability(self, exe):
        """Checks availability of Apache executable"""
        if not util.exe_exists(exe):
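A minimal sketch of the checkpoint lifecycle now handled directly by `ApacheConfigurator` above (assumptions: `configurator` is a prepared instance, directive changes have already been queued through its parser, and the note text is made up):

```python
# Describe the pending change, then persist it as a permanent, titled checkpoint.
configurator.save_notes = "Deploying certificate for an example vhost"
configurator.save(title="Deploy certificate", temporary=False)

# Challenge-time edits are saved as temporary and undone after validation.
configurator.save(temporary=True)
configurator.revert_challenge_config()

# Roll back the most recent permanent checkpoint if something went wrong.
configurator.rollback_checkpoints(rollback=1)
```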
@@ -286,26 +341,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
raise errors.NoInstallationError(
|
||||
'Cannot find Apache executable {0}'.format(exe))
|
||||
|
||||
def _check_aug_version(self):
|
||||
""" Checks that we have recent enough version of libaugeas.
|
||||
If augeas version is recent enough, it will support case insensitive
|
||||
regexp matching"""
|
||||
|
||||
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
|
||||
try:
|
||||
matches = self.aug.match(
|
||||
"/test//*[self::arg=~regexp('argument', 'i')]")
|
||||
except RuntimeError:
|
||||
self.aug.remove("/test/path")
|
||||
return False
|
||||
self.aug.remove("/test/path")
|
||||
return matches
|
||||
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
# If user provided vhost_root value in command line, use it
|
||||
return parser.ApacheParser(
|
||||
self.aug, self.option("server_root"), self.conf("vhost-root"),
|
||||
self.option("server_root"), self.conf("vhost-root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _wildcard_domain(self, domain):
|
||||
@@ -484,8 +524,8 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# install SSLCertificateFile, SSLCertificateKeyFile,
|
||||
# and SSLCertificateChainFile directives
|
||||
set_cert_path = cert_path
|
||||
self.aug.set(path["cert_path"][-1], cert_path)
|
||||
self.aug.set(path["cert_key"][-1], key_path)
|
||||
self.parser.aug.set(path["cert_path"][-1], cert_path)
|
||||
self.parser.aug.set(path["cert_key"][-1], key_path)
|
||||
if chain_path is not None:
|
||||
self.parser.add_dir(vhost.path,
|
||||
"SSLCertificateChainFile", chain_path)
|
||||
@@ -497,8 +537,8 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
raise errors.PluginError("Please provide the --fullchain-path "
|
||||
"option pointing to your full chain file")
|
||||
set_cert_path = fullchain_path
|
||||
self.aug.set(path["cert_path"][-1], fullchain_path)
|
||||
self.aug.set(path["cert_key"][-1], key_path)
|
||||
self.parser.aug.set(path["cert_path"][-1], fullchain_path)
|
||||
self.parser.aug.set(path["cert_key"][-1], key_path)
|
||||
|
||||
# Enable the new vhost if needed
|
||||
if not vhost.enabled:
|
||||
@@ -798,7 +838,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
"""
|
||||
addrs = set()
|
||||
try:
|
||||
args = self.aug.match(path + "/arg")
|
||||
args = self.parser.aug.match(path + "/arg")
|
||||
except RuntimeError:
|
||||
logger.warning("Encountered a problem while parsing file: %s, skipping", path)
|
||||
return None
|
||||
@@ -816,7 +856,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
is_ssl = True
|
||||
|
||||
filename = apache_util.get_file_path(
|
||||
self.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
|
||||
self.parser.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
|
||||
if filename is None:
|
||||
return None
|
||||
|
||||
@@ -846,7 +886,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# Make a list of parser paths because the parser_paths
|
||||
# dictionary may be modified during the loop.
|
||||
for vhost_path in list(self.parser.parser_paths):
|
||||
paths = self.aug.match(
|
||||
paths = self.parser.aug.match(
|
||||
("/files%s//*[label()=~regexp('%s')]" %
|
||||
(vhost_path, parser.case_i("VirtualHost"))))
|
||||
paths = [path for path in paths if
|
||||
@@ -856,7 +896,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
if not new_vhost:
|
||||
continue
|
||||
internal_path = apache_util.get_internal_aug_path(new_vhost.path)
|
||||
realpath = os.path.realpath(new_vhost.filep)
|
||||
realpath = filesystem.realpath(new_vhost.filep)
|
||||
if realpath not in file_paths:
|
||||
file_paths[realpath] = new_vhost.filep
|
||||
internal_paths[realpath].add(internal_path)
|
||||
@@ -1100,16 +1140,16 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
avail_fp = nonssl_vhost.filep
|
||||
ssl_fp = self._get_ssl_vhost_path(avail_fp)
|
||||
|
||||
orig_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
orig_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
(self._escape(ssl_fp),
|
||||
parser.case_i("VirtualHost")))
|
||||
|
||||
self._copy_create_ssl_vhost_skeleton(nonssl_vhost, ssl_fp)
|
||||
|
||||
# Reload augeas to take into account the new vhost
|
||||
self.aug.load()
|
||||
self.parser.aug.load()
|
||||
# Get Vhost augeas path for new vhost
|
||||
new_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
new_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
|
||||
(self._escape(ssl_fp),
|
||||
parser.case_i("VirtualHost")))
|
||||
|
||||
@@ -1120,7 +1160,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# Make Augeas aware of the new vhost
|
||||
self.parser.parse_file(ssl_fp)
|
||||
# Try to search again
|
||||
new_matches = self.aug.match(
|
||||
new_matches = self.parser.aug.match(
|
||||
"/files%s//* [label()=~regexp('%s')]" %
|
||||
(self._escape(ssl_fp),
|
||||
parser.case_i("VirtualHost")))
|
||||
@@ -1182,11 +1222,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
"""
|
||||
|
||||
if self.conf("vhost-root") and os.path.exists(self.conf("vhost-root")):
|
||||
fp = os.path.join(os.path.realpath(self.option("vhost_root")),
|
||||
fp = os.path.join(filesystem.realpath(self.option("vhost_root")),
|
||||
os.path.basename(non_ssl_vh_fp))
|
||||
else:
|
||||
# Use non-ssl filepath
|
||||
fp = os.path.realpath(non_ssl_vh_fp)
|
||||
fp = filesystem.realpath(non_ssl_vh_fp)
|
||||
|
||||
if fp.endswith(".conf"):
|
||||
return fp[:-(len(".conf"))] + self.option("le_vhost_ext")
|
||||
@@ -1270,8 +1310,8 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
"vhost for your HTTPS site located at {1} because they have "
|
||||
"the potential to create redirection loops.".format(
|
||||
vhost.filep, ssl_fp), reporter.MEDIUM_PRIORITY)
|
||||
self.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
|
||||
self.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
|
||||
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
|
||||
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
|
||||
|
||||
def _sift_rewrite_rules(self, contents):
|
||||
""" Helper function for _copy_create_ssl_vhost_skeleton to prepare the
|
||||
@@ -1346,7 +1386,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
"""
|
||||
|
||||
try:
|
||||
span_val = self.aug.span(vhost.path)
|
||||
span_val = self.parser.aug.span(vhost.path)
|
||||
except ValueError:
|
||||
logger.critical("Error while reading the VirtualHost %s from "
|
||||
"file %s", vhost.name, vhost.filep, exc_info=True)
|
||||
@@ -1381,13 +1421,13 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
|
||||
def _update_ssl_vhosts_addrs(self, vh_path):
|
||||
ssl_addrs = set()
|
||||
ssl_addr_p = self.aug.match(vh_path + "/arg")
|
||||
ssl_addr_p = self.parser.aug.match(vh_path + "/arg")
|
||||
|
||||
for addr in ssl_addr_p:
|
||||
old_addr = obj.Addr.fromstring(
|
||||
str(self.parser.get_arg(addr)))
|
||||
ssl_addr = old_addr.get_addr_obj("443")
|
||||
self.aug.set(addr, str(ssl_addr))
|
||||
self.parser.aug.set(addr, str(ssl_addr))
|
||||
ssl_addrs.add(ssl_addr)
|
||||
|
||||
return ssl_addrs
|
||||
@@ -1406,14 +1446,14 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
vh_path, False)) > 1:
|
||||
directive_path = self.parser.find_dir(directive, None,
|
||||
vh_path, False)
|
||||
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
|
||||
def _remove_directives(self, vh_path, directives):
|
||||
for directive in directives:
|
||||
while self.parser.find_dir(directive, None, vh_path, False):
|
||||
directive_path = self.parser.find_dir(directive, None,
|
||||
vh_path, False)
|
||||
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
|
||||
|
||||
def _add_dummy_ssl_directives(self, vh_path):
|
||||
self.parser.add_dir(vh_path, "SSLCertificateFile",
|
||||
@@ -1452,7 +1492,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
"""
|
||||
matches = self.parser.find_dir(
|
||||
"ServerAlias", start=vh_path, exclude=False)
|
||||
aliases = (self.aug.get(match) for match in matches)
|
||||
aliases = (self.parser.aug.get(match) for match in matches)
|
||||
return self.domain_in_names(aliases, target_name)
|
||||
|
||||
def _add_name_vhost_if_necessary(self, vhost):
|
||||
@@ -1635,7 +1675,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
if header_path:
|
||||
pat = '(?:[ "]|^)(strict-transport-security)(?:[ "]|$)'
|
||||
for match in header_path:
|
||||
if re.search(pat, self.aug.get(match).lower()):
|
||||
if re.search(pat, self.parser.aug.get(match).lower()):
|
||||
hsts_dirpath = match
|
||||
if not hsts_dirpath:
|
||||
err_msg = ("Certbot was unable to find the existing HSTS header "
|
||||
@@ -1649,7 +1689,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# Our match statement was for string strict-transport-security, but
|
||||
# we need to update the value instead. The next index is for the value
|
||||
hsts_dirpath = hsts_dirpath.replace("arg[3]", "arg[4]")
|
||||
self.aug.set(hsts_dirpath, hsts_maxage)
|
||||
self.parser.aug.set(hsts_dirpath, hsts_maxage)
|
||||
note_msg = ("Increasing HSTS max-age value to {0} for VirtualHost "
|
||||
"in {1}\n".format(nextstep_value, vhost.filep))
|
||||
logger.debug(note_msg)
|
||||
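
The `arg[3]` to `arg[4]` swap works because Augeas exposes a directive's arguments as numbered `arg` children: for `Header always set Strict-Transport-Security "max-age=..."` the header name is the third argument and its value the fourth. A small sketch with a made-up Augeas path:

```python
# Illustrative only: the regex above matched the header *name* node (arg[3]);
# rewriting the path to arg[4] points at the *value* node that gets updated.
hsts_dirpath = ("/files/etc/apache2/sites-available/example-le-ssl.conf"
                "/VirtualHost/directive[1]/arg[3]")
value_path = hsts_dirpath.replace("arg[3]", "arg[4]")
print(value_path)
# .../VirtualHost/directive[1]/arg[4]
```
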
@@ -1731,7 +1771,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# We'll simply delete the directive, so that we'll have a
|
||||
# consistent OCSP cache path.
|
||||
if stapling_cache_aug_path:
|
||||
self.aug.remove(
|
||||
self.parser.aug.remove(
|
||||
re.sub(r"/\w*$", "", stapling_cache_aug_path[0]))
|
||||
|
||||
self.parser.add_dir_to_ifmodssl(ssl_vhost_aug_path,
|
||||
@@ -1808,7 +1848,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
# "Existing Header directive for virtualhost"
|
||||
pat = '(?:[ "]|^)(%s)(?:[ "]|$)' % (header_substring.lower())
|
||||
for match in header_path:
|
||||
if re.search(pat, self.aug.get(match).lower()):
|
||||
if re.search(pat, self.parser.aug.get(match).lower()):
|
||||
raise errors.PluginEnhancementAlreadyPresent(
|
||||
"Existing %s header" % (header_substring))
|
||||
|
||||
@@ -1935,11 +1975,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
constants.REWRITE_HTTPS_ARGS_WITH_END]
|
||||
|
||||
for dir_path, args_paths in rewrite_args_dict.items():
|
||||
arg_vals = [self.aug.get(x) for x in args_paths]
|
||||
arg_vals = [self.parser.aug.get(x) for x in args_paths]
|
||||
|
||||
# Search for past redirection rule, delete it, set the new one
|
||||
if arg_vals in constants.OLD_REWRITE_HTTPS_ARGS:
|
||||
self.aug.remove(dir_path)
|
||||
self.parser.aug.remove(dir_path)
|
||||
self._set_https_redirection_rewrite_rule(vhost)
|
||||
self.save()
|
||||
raise errors.PluginEnhancementAlreadyPresent(
|
||||
@@ -1995,7 +2035,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
|
||||
|
||||
redirect_filepath = self._write_out_redirect(ssl_vhost, text)
|
||||
|
||||
self.aug.load()
|
||||
self.parser.aug.load()
|
||||
# Make a new vhost data structure and add it to the lists
|
||||
new_vhost = self._create_vhost(parser.get_aug_path(self._escape(redirect_filepath)))
|
||||
self.vhosts.append(new_vhost)
|
||||
|
||||
@@ -9,6 +9,7 @@ MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
|
||||
UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-apache-conf-digest.txt"
|
||||
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""
|
||||
|
||||
# NEVER REMOVE A SINGLE HASH FROM THIS LIST UNLESS YOU KNOW EXACTLY WHAT YOU ARE DOING!
|
||||
ALL_SSL_OPTIONS_HASHES = [
|
||||
'2086bca02db48daf93468332543c60ac6acdb6f0b58c7bfdf578a5d47092f82a',
|
||||
'4844d36c9a0f587172d9fa10f4f1c9518e3bcfa1947379f155e16a70a728c21a',
|
||||
@@ -18,6 +19,10 @@ ALL_SSL_OPTIONS_HASHES = [
|
||||
'cfdd7c18d2025836ea3307399f509cfb1ebf2612c87dd600a65da2a8e2f2797b',
|
||||
'80720bd171ccdc2e6b917ded340defae66919e4624962396b992b7218a561791',
|
||||
'c0c022ea6b8a51ecc8f1003d0a04af6c3f2bc1c3ce506b3c2dfc1f11ef931082',
|
||||
'717b0a89f5e4c39b09a42813ac6e747cfbdeb93439499e73f4f70a1fe1473f20',
|
||||
'0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
|
||||
'86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
|
||||
'06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
|
||||
]
|
||||
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
|
||||
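
A hash list like this is what makes it possible to tell a pristine shipped copy of the SSL options file from one the user has edited. A rough sketch of that check (the helper names below are illustrative, not the plugin's actual functions):

```python
# Illustrative sketch: a local options-ssl-apache.conf whose digest appears in
# a list like ALL_SSL_OPTIONS_HASHES is an unmodified shipped version and is
# therefore safe to replace with a newer one.
import hashlib

def file_sha256(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_pristine(conf_path, known_hashes):
    """Return True if the file content matches a known shipped version."""
    return file_sha256(conf_path) in known_hashes
```
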
|
||||
|
||||
@@ -31,6 +31,8 @@ OVERRIDE_CLASSES = {
|
||||
"gentoo base system": override_gentoo.GentooConfigurator,
|
||||
"opensuse": override_suse.OpenSUSEConfigurator,
|
||||
"suse": override_suse.OpenSUSEConfigurator,
|
||||
"scientific": override_centos.CentOSConfigurator,
|
||||
"scientific linux": override_centos.CentOSConfigurator,
|
||||
}
|
||||
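
For context, a mapping like this is typically consulted with a lowercased distribution name; a hedged sketch of that lookup (the plugin's real selection logic may normalize the name differently, and the class names below are placeholders rather than imports):

```python
# Illustrative lookup against a mapping shaped like OVERRIDE_CLASSES.
def pick_configurator_class(os_name, overrides, default_cls):
    """Return the distro-specific configurator class, or the default."""
    return overrides.get(os_name.lower(), default_cls)

overrides = {"scientific": "CentOSConfigurator",
             "scientific linux": "CentOSConfigurator"}
print(pick_configurator_class("Scientific Linux", overrides, "ApacheConfigurator"))
# CentOSConfigurator
```
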
|
||||
|
||||
|
||||
@@ -5,6 +5,7 @@ from acme.magic_typing import List, Set # pylint: disable=unused-import, no-nam
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
from certbot.compat import filesystem
|
||||
from certbot.plugins import common
|
||||
|
||||
from certbot_apache.obj import VirtualHost # pylint: disable=unused-import
|
||||
@@ -168,8 +169,7 @@ class ApacheHttp01(common.TLSSNI01):
|
||||
|
||||
def _set_up_challenges(self):
|
||||
if not os.path.isdir(self.challenge_dir):
|
||||
os.makedirs(self.challenge_dir)
|
||||
os.chmod(self.challenge_dir, 0o755)
|
||||
filesystem.makedirs(self.challenge_dir, 0o755)
|
||||
|
||||
responses = []
|
||||
for achall in self.achalls:
|
||||
@@ -185,7 +185,7 @@ class ApacheHttp01(common.TLSSNI01):
|
||||
self.configurator.reverter.register_file_creation(True, name)
|
||||
with open(name, 'wb') as f:
|
||||
f.write(validation.encode())
|
||||
os.chmod(name, 0o644)
|
||||
filesystem.chmod(name, 0o644)
|
||||
|
||||
return response
|
||||
|
||||
|
||||
@@ -86,7 +86,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return CentOSParser(
|
||||
self.aug, self.option("server_root"), self.option("vhost_root"),
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _deploy_cert(self, *args, **kwargs): # pylint: disable=arguments-differ
|
||||
@@ -155,7 +155,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
|
||||
for loadmod_path in loadmod_paths:
|
||||
nodir_path = loadmod_path.split("/directive")[0]
|
||||
# Remove the old LoadModule directive
|
||||
self.aug.remove(loadmod_path)
|
||||
self.parser.aug.remove(loadmod_path)
|
||||
|
||||
# Create a new IfModule !mod_ssl.c if not already found on path
|
||||
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c",
|
||||
|
||||
@@ -7,6 +7,7 @@ import zope.interface
|
||||
from certbot import errors
|
||||
from certbot import interfaces
|
||||
from certbot import util
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import apache_util
|
||||
@@ -65,7 +66,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
|
||||
try:
|
||||
os.symlink(vhost.filep, enabled_path)
|
||||
except OSError as err:
|
||||
if os.path.islink(enabled_path) and os.path.realpath(
|
||||
if os.path.islink(enabled_path) and filesystem.realpath(
|
||||
enabled_path) == vhost.filep:
|
||||
# Already in shape
|
||||
vhost.enabled = True
|
||||
|
||||
@@ -51,7 +51,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return FedoraParser(
|
||||
self.aug, self.option("server_root"), self.option("vhost_root"),
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
def _try_restart_fedora(self):
|
||||
|
||||
@@ -44,7 +44,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
|
||||
def get_parser(self):
|
||||
"""Initializes the ApacheParser"""
|
||||
return GentooParser(
|
||||
self.aug, self.option("server_root"), self.option("vhost_root"),
|
||||
self.option("server_root"), self.option("vhost_root"),
|
||||
self.version, configurator=self)
|
||||
|
||||
|
||||
|
||||
@@ -13,6 +13,8 @@ from acme.magic_typing import Dict, List, Set # pylint: disable=unused-import,
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import constants
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@@ -32,7 +34,7 @@ class ApacheParser(object):
|
||||
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
|
||||
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
|
||||
|
||||
def __init__(self, aug, root, vhostroot=None, version=(2, 4),
|
||||
def __init__(self, root, vhostroot=None, version=(2, 4),
|
||||
configurator=None):
|
||||
# Note: Order is important here.
|
||||
|
||||
@@ -41,11 +43,20 @@ class ApacheParser(object):
|
||||
# issues with aug.load() after adding new files / defines to parse tree
|
||||
self.configurator = configurator
|
||||
|
||||
# Initialize augeas
|
||||
self.aug = None
|
||||
self.init_augeas()
|
||||
|
||||
if not self.check_aug_version():
|
||||
raise errors.NotSupportedError(
|
||||
"Apache plugin support requires libaugeas0 and augeas-lenses "
|
||||
"version 1.2.0 or higher, please make sure you have you have "
|
||||
"those installed.")
|
||||
|
||||
self.modules = set() # type: Set[str]
|
||||
self.parser_paths = {} # type: Dict[str, List[str]]
|
||||
self.variables = {} # type: Dict[str, str]
|
||||
|
||||
self.aug = aug
|
||||
# Find configuration root and make sure augeas can parse it.
|
||||
self.root = os.path.abspath(root)
|
||||
self.loc = {"root": self._find_config_root()}
|
||||
@@ -77,6 +88,146 @@ class ApacheParser(object):
|
||||
if self.find_dir("Define", exclude=False):
|
||||
raise errors.PluginError("Error parsing runtime variables")
|
||||
|
||||
def init_augeas(self):
|
||||
""" Initialize the actual Augeas instance """
|
||||
|
||||
try:
|
||||
import augeas
|
||||
except ImportError: # pragma: no cover
|
||||
raise errors.NoInstallationError("Problem in Augeas installation")
|
||||
|
||||
self.aug = augeas.Augeas(
|
||||
# specify a directory to load our preferred lens from
|
||||
loadpath=constants.AUGEAS_LENS_DIR,
|
||||
# Do not save backup (we do it ourselves), do not load
|
||||
# anything by default
|
||||
flags=(augeas.Augeas.NONE |
|
||||
augeas.Augeas.NO_MODL_AUTOLOAD |
|
||||
augeas.Augeas.ENABLE_SPAN))
|
||||
|
||||
def check_parsing_errors(self, lens):
|
||||
"""Verify Augeas can parse all of the lens files.
|
||||
|
||||
:param str lens: lens to check for errors
|
||||
|
||||
:raises .errors.PluginError: If there has been an error in parsing with
|
||||
the specified lens.
|
||||
|
||||
"""
|
||||
error_files = self.aug.match("/augeas//error")
|
||||
|
||||
for path in error_files:
|
||||
# Check to see if it was an error resulting from the use of
|
||||
# the httpd lens
|
||||
lens_path = self.aug.get(path + "/lens")
|
||||
# As aug.get may return null
|
||||
if lens_path and lens in lens_path:
|
||||
msg = (
|
||||
"There has been an error in parsing the file {0} on line {1}: "
|
||||
"{2}".format(
|
||||
# Strip off /augeas/files and /error
|
||||
path[13:len(path) - 6],
|
||||
self.aug.get(path + "/line"),
|
||||
self.aug.get(path + "/message")))
|
||||
raise errors.PluginError(msg)
|
||||
|
||||
def check_aug_version(self):
|
||||
""" Checks that we have recent enough version of libaugeas.
|
||||
If augeas version is recent enough, it will support case insensitive
|
||||
regexp matching"""
|
||||
|
||||
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
|
||||
try:
|
||||
matches = self.aug.match(
|
||||
"/test//*[self::arg=~regexp('argument', 'i')]")
|
||||
except RuntimeError:
|
||||
self.aug.remove("/test/path")
|
||||
return False
|
||||
self.aug.remove("/test/path")
|
||||
return matches
|
||||
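
The capability probe above can be exercised against a bare python-augeas handle as well; a standalone sketch (assumes the `augeas` Python bindings are installed):

```python
# Minimal standalone version of the same check: recent libaugeas accepts the
# 'i' flag in regexp(); older versions raise RuntimeError for it.
import augeas

aug = augeas.Augeas(flags=augeas.Augeas.NO_MODL_AUTOLOAD)
aug.set("/test/path/testing/arg", "aRgUMeNT")
try:
    matches = aug.match("/test//*[self::arg=~regexp('argument', 'i')]")
except RuntimeError:
    matches = []  # old libaugeas: case-insensitive matching unsupported
finally:
    aug.remove("/test/path")
print("case-insensitive matching supported:", bool(matches))
```
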
|
||||
def unsaved_files(self):
|
||||
"""Lists files that have modified Augeas DOM but the changes have not
|
||||
been written to the filesystem yet, used by `self.save()` and
|
||||
ApacheConfigurator to check the file state.
|
||||
|
||||
:raises .errors.PluginError: If there was an error in Augeas, in
|
||||
an attempt to save the configuration, or an error creating a
|
||||
checkpoint
|
||||
|
||||
:returns: `set` of unsaved files
|
||||
"""
|
||||
save_state = self.aug.get("/augeas/save")
|
||||
self.aug.set("/augeas/save", "noop")
|
||||
# Existing Errors
|
||||
ex_errs = self.aug.match("/augeas//error")
|
||||
try:
|
||||
# This is a noop save
|
||||
self.aug.save()
|
||||
except (RuntimeError, IOError):
|
||||
self._log_save_errors(ex_errs)
|
||||
# Erase Save Notes
|
||||
self.configurator.save_notes = ""
|
||||
raise errors.PluginError(
|
||||
"Error saving files, check logs for more info.")
|
||||
|
||||
# Return the original save method
|
||||
self.aug.set("/augeas/save", save_state)
|
||||
|
||||
# Retrieve list of modified files
|
||||
# Note: Noop saves can cause the file to be listed twice, I used a
|
||||
# set to remove this possibility. This is a known augeas 0.10 error.
|
||||
save_paths = self.aug.match("/augeas/events/saved")
|
||||
|
||||
save_files = set()
|
||||
if save_paths:
|
||||
for path in save_paths:
|
||||
save_files.add(self.aug.get(path)[6:])
|
||||
return save_files
|
||||
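
The `"noop"` save trick used above is plain Augeas behaviour and can be reproduced outside the plugin; a small sketch (assumes a Linux system where the default lenses can parse `/etc/hosts`):

```python
# Standalone illustration of the noop-save mechanism: after a noop save(),
# /augeas/events/saved lists the files that *would* have been written,
# without actually touching the filesystem.
import augeas

aug = augeas.Augeas()  # default flags: autoload lenses and system files
aug.set("/files/etc/hosts/1/canonical", "renamed.localdomain")
aug.set("/augeas/save", "noop")
aug.save()
pending = [aug.get(p) for p in aug.match("/augeas/events/saved")]
print(pending)  # e.g. ['/files/etc/hosts']
```
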
|
||||
def ensure_augeas_state(self):
|
||||
"""Makes sure that all Augeas dom changes are written to files to avoid
|
||||
loss of configuration directives when doing additional augeas parsing,
|
||||
causing a possible augeas.load() resulting dom reset
|
||||
"""
|
||||
|
||||
if self.unsaved_files():
|
||||
self.configurator.save_notes += "(autosave)"
|
||||
self.configurator.save()
|
||||
|
||||
def save(self, save_files):
|
||||
"""Saves all changes to the configuration files.
|
||||
|
||||
save() is called from ApacheConfigurator to handle the parser specific
|
||||
tasks of saving.
|
||||
|
||||
:param list save_files: list of strings of file paths that we need to save.
|
||||
|
||||
"""
|
||||
self.configurator.save_notes = ""
|
||||
self.aug.save()
|
||||
|
||||
# Force reload if files were modified
|
||||
# This is needed to recalculate augeas directive span
|
||||
if save_files:
|
||||
for sf in save_files:
|
||||
self.aug.remove("/files/"+sf)
|
||||
self.aug.load()
|
||||
|
||||
def _log_save_errors(self, ex_errs):
|
||||
"""Log errors due to bad Augeas save.
|
||||
|
||||
:param list ex_errs: Existing errors before save
|
||||
|
||||
"""
|
||||
# Check for the root of save problems
|
||||
new_errs = self.aug.match("/augeas//error")
|
||||
# logger.error("During Save - %s", mod_conf)
|
||||
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
|
||||
", ".join(err[13:len(err) - 6] for err in new_errs
|
||||
# Only new errors caused by recent save
|
||||
if err not in ex_errs), self.configurator.save_notes)
|
||||
|
||||
def add_include(self, main_config, inc_path):
|
||||
"""Add Include for a new configuration file if one does not exist
|
||||
|
||||
@@ -658,8 +809,7 @@ class ApacheParser(object):
|
||||
use_new, remove_old = self._check_path_actions(filepath)
|
||||
# Ensure that we have the latest Augeas DOM state on disk before
|
||||
# calling aug.load() which reloads the state from disk
|
||||
if self.configurator:
|
||||
self.configurator.ensure_augeas_state()
|
||||
self.ensure_augeas_state()
|
||||
# Test if augeas included file for Httpd.lens
|
||||
# Note: This works for augeas globs, ie. *.conf
|
||||
if use_new:
|
||||
|
||||
@@ -0,0 +1,27 @@
|
||||
#!/usr/bin/env python
|
||||
"""
|
||||
This executable script wraps the apache-conf-test bash script in order to set up a pebble instance
|
||||
before its execution. Directory URL is passed through the SERVER environment variable.
|
||||
"""
|
||||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
from certbot_integration_tests.utils import acme_server
|
||||
|
||||
SCRIPT_DIRNAME = os.path.dirname(__file__)
|
||||
|
||||
|
||||
def main(args=None):
|
||||
if not args:
|
||||
args = sys.argv[1:]
|
||||
with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
|
||||
environ = os.environ.copy()
|
||||
environ['SERVER'] = acme_xdist['directory_url']
|
||||
command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]
|
||||
command.extend(args)
|
||||
return subprocess.call(command, env=environ)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
sys.exit(main())
|
||||
@@ -165,10 +165,6 @@ class CentOS6Tests(util.ApacheTest):
|
||||
"LoadModule", "ssl_module", start=self.vh_truth[1].path, exclude=False)
|
||||
self.assertEqual(len(post_loadmods), 1)
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
def test_loadmod_non_duplicate(self):
|
||||
# the modules/mod_ssl.so exists in ssl.conf
|
||||
sslmod_args = ["ssl_module", "modules/mod_somethingelse.so"]
|
||||
@@ -197,7 +193,7 @@ class CentOS6Tests(util.ApacheTest):
|
||||
exclude=False)
|
||||
for mod in orig_loadmods:
|
||||
noarg_path = mod.rpartition("/")[0]
|
||||
self.config.aug.remove(noarg_path)
|
||||
self.config.parser.aug.remove(noarg_path)
|
||||
self.config.save()
|
||||
self.config.deploy_cert(
|
||||
"random.demo", "example/cert.pem", "example/key.pem",
|
||||
|
||||
@@ -4,6 +4,7 @@ import unittest
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import obj
|
||||
@@ -160,7 +161,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
|
||||
"""Make sure we read the sysconfig OPTIONS variable correctly"""
|
||||
# Return nothing for the process calls
|
||||
mock_cfg.return_value = ""
|
||||
self.config.parser.sysconfig_filep = os.path.realpath(
|
||||
self.config.parser.sysconfig_filep = filesystem.realpath(
|
||||
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
|
||||
self.config.parser.variables = {}
|
||||
|
||||
|
||||
@@ -1,21 +1,20 @@
|
||||
"""Test for certbot_apache.augeas_configurator."""
|
||||
"""Test for certbot_apache.configurator implementations of reverter"""
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache.tests import util
|
||||
|
||||
|
||||
class AugeasConfiguratorTest(util.ApacheTest):
|
||||
"""Test for Augeas Configurator base class."""
|
||||
class ConfiguratorReverterTest(util.ApacheTest):
|
||||
"""Test for ApacheConfigurator reverter methods"""
|
||||
|
||||
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(AugeasConfiguratorTest, self).setUp()
|
||||
super(ConfiguratorReverterTest, self).setUp()
|
||||
|
||||
self.config = util.get_apache_configurator(
|
||||
self.config_path, self.vhost_path, self.config_dir, self.work_dir)
|
||||
@@ -28,20 +27,6 @@ class AugeasConfiguratorTest(util.ApacheTest):
|
||||
shutil.rmtree(self.work_dir)
|
||||
shutil.rmtree(self.temp_dir)
|
||||
|
||||
def test_bad_parse(self):
|
||||
# pylint: disable=protected-access
|
||||
self.config.parser.parse_file(os.path.join(
|
||||
self.config.parser.root, "conf-available", "bad_conf_file.conf"))
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.config.check_parsing_errors, "httpd.aug")
|
||||
|
||||
def test_bad_save(self):
|
||||
mock_save = mock.Mock()
|
||||
mock_save.side_effect = IOError
|
||||
self.config.aug.save = mock_save
|
||||
|
||||
self.assertRaises(errors.PluginError, self.config.save)
|
||||
|
||||
def test_bad_save_checkpoint(self):
|
||||
self.config.reverter.add_to_checkpoint = mock.Mock(
|
||||
side_effect=errors.ReverterError)
|
||||
@@ -63,23 +48,9 @@ class AugeasConfiguratorTest(util.ApacheTest):
|
||||
|
||||
self.assertTrue(mock_finalize.is_called)
|
||||
|
||||
def test_recovery_routine(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.aug.load = mock_load
|
||||
|
||||
self.config.recovery_routine()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
|
||||
def test_recovery_routine_error(self):
|
||||
self.config.reverter.recovery_routine = mock.Mock(
|
||||
side_effect=errors.ReverterError)
|
||||
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.config.recovery_routine)
|
||||
|
||||
def test_revert_challenge_config(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.aug.load = mock_load
|
||||
self.config.parser.aug.load = mock_load
|
||||
|
||||
self.config.revert_challenge_config()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
@@ -93,7 +64,7 @@ class AugeasConfiguratorTest(util.ApacheTest):
|
||||
|
||||
def test_rollback_checkpoints(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.aug.load = mock_load
|
||||
self.config.parser.aug.load = mock_load
|
||||
|
||||
self.config.rollback_checkpoints()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
@@ -103,13 +74,11 @@ class AugeasConfiguratorTest(util.ApacheTest):
|
||||
side_effect=errors.ReverterError)
|
||||
self.assertRaises(errors.PluginError, self.config.rollback_checkpoints)
|
||||
|
||||
def test_view_config_changes(self):
|
||||
self.config.view_config_changes()
|
||||
|
||||
def test_view_config_changes_error(self):
|
||||
self.config.reverter.view_config_changes = mock.Mock(
|
||||
side_effect=errors.ReverterError)
|
||||
self.assertRaises(errors.PluginError, self.config.view_config_changes)
|
||||
def test_recovery_routine_reload(self):
|
||||
mock_load = mock.Mock()
|
||||
self.config.parser.aug.load = mock_load
|
||||
self.config.recovery_routine()
|
||||
self.assertEqual(mock_load.call_count, 1)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
@@ -16,6 +16,7 @@ from certbot import achallenges
|
||||
from certbot import crypto_util
|
||||
from certbot import errors
|
||||
from certbot.compat import os
|
||||
from certbot.compat import filesystem
|
||||
from certbot.tests import acme_util
|
||||
from certbot.tests import util as certbot_util
|
||||
|
||||
@@ -50,25 +51,14 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.deploy_cert = mocked_deploy_cert
|
||||
return self.config
|
||||
|
||||
@mock.patch("certbot_apache.configurator.ApacheConfigurator.init_augeas")
|
||||
@mock.patch("certbot_apache.configurator.path_surgery")
|
||||
def test_prepare_no_install(self, mock_surgery, _init_augeas):
|
||||
def test_prepare_no_install(self, mock_surgery):
|
||||
silly_path = {"PATH": "/tmp/nothingness2342"}
|
||||
mock_surgery.return_value = False
|
||||
with mock.patch.dict('os.environ', silly_path):
|
||||
self.assertRaises(errors.NoInstallationError, self.config.prepare)
|
||||
self.assertEqual(mock_surgery.call_count, 1)
|
||||
|
||||
@mock.patch("certbot_apache.augeas_configurator.AugeasConfigurator.init_augeas")
|
||||
def test_prepare_no_augeas(self, mock_init_augeas):
|
||||
""" Test augeas initialization ImportError """
|
||||
def side_effect_error():
|
||||
""" Side effect error for the test """
|
||||
raise ImportError
|
||||
mock_init_augeas.side_effect = side_effect_error
|
||||
self.assertRaises(
|
||||
errors.NoInstallationError, self.config.prepare)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser")
|
||||
@mock.patch("certbot_apache.configurator.util.exe_exists")
|
||||
def test_prepare_version(self, mock_exe_exists, _):
|
||||
@@ -80,16 +70,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError, self.config.prepare)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser")
|
||||
@mock.patch("certbot_apache.configurator.util.exe_exists")
|
||||
def test_prepare_old_aug(self, mock_exe_exists, _):
|
||||
mock_exe_exists.return_value = True
|
||||
self.config.config_test = mock.Mock()
|
||||
# pylint: disable=protected-access
|
||||
self.config._check_aug_version = mock.Mock(return_value=False)
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError, self.config.prepare)
|
||||
|
||||
def test_prepare_locked(self):
|
||||
server_root = self.config.conf("server-root")
|
||||
self.config.config_test = mock.Mock()
|
||||
@@ -674,7 +654,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# span excludes the closing </VirtualHost> tag in older versions
|
||||
# of Augeas
|
||||
return_value = [self.vh_truth[0].filep, 1, 12, 0, 0, 0, 1142]
|
||||
with mock.patch.object(self.config.aug, 'span') as mock_span:
|
||||
with mock.patch.object(self.config.parser.aug, 'span') as mock_span:
|
||||
mock_span.return_value = return_value
|
||||
self.test_make_vhost_ssl()
|
||||
|
||||
@@ -682,7 +662,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# span includes the closing </VirtualHost> tag in newer versions
|
||||
# of Augeas
|
||||
return_value = [self.vh_truth[0].filep, 1, 12, 0, 0, 0, 1157]
|
||||
with mock.patch.object(self.config.aug, 'span') as mock_span:
|
||||
with mock.patch.object(self.config.parser.aug, 'span') as mock_span:
|
||||
mock_span.return_value = return_value
|
||||
self.test_make_vhost_ssl()
|
||||
|
||||
@@ -695,8 +675,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
def test_make_vhost_ssl_nonexistent_vhost_path(self):
|
||||
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[1])
|
||||
self.assertEqual(os.path.dirname(ssl_vhost.filep),
|
||||
os.path.dirname(os.path.realpath(
|
||||
self.vh_truth[1].filep)))
|
||||
os.path.dirname(filesystem.realpath(self.vh_truth[1].filep)))
|
||||
|
||||
def test_make_vhost_ssl(self):
|
||||
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
|
||||
@@ -1231,7 +1210,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
except errors.PluginEnhancementAlreadyPresent:
|
||||
args_paths = self.config.parser.find_dir(
|
||||
"RewriteRule", None, http_vhost.path, False)
|
||||
arg_vals = [self.config.aug.get(x) for x in args_paths]
|
||||
arg_vals = [self.config.parser.aug.get(x) for x in args_paths]
|
||||
self.assertEqual(arg_vals, constants.REWRITE_HTTPS_ARGS)
|
||||
|
||||
|
||||
@@ -1334,15 +1313,6 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
|
||||
return account_key, (achall1, achall2, achall3)
|
||||
|
||||
def test_aug_version(self):
|
||||
mock_match = mock.Mock(return_value=["something"])
|
||||
self.config.aug.match = mock_match
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.config._check_aug_version(),
|
||||
["something"])
|
||||
self.config.aug.match.side_effect = RuntimeError
|
||||
self.assertFalse(self.config._check_aug_version())
|
||||
|
||||
def test_enable_site_nondebian(self):
|
||||
inc_path = "/path/to/wherever"
|
||||
vhost = self.vh_truth[0]
|
||||
@@ -1365,8 +1335,8 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.config.parser.modules.add("ssl_module")
|
||||
self.config.parser.modules.add("mod_ssl.c")
|
||||
self.config.parser.modules.add("socache_shmcb_module")
|
||||
tmp_path = os.path.realpath(tempfile.mkdtemp("vhostroot"))
|
||||
os.chmod(tmp_path, 0o755)
|
||||
tmp_path = filesystem.realpath(tempfile.mkdtemp("vhostroot"))
|
||||
filesystem.chmod(tmp_path, 0o755)
|
||||
mock_p = "certbot_apache.configurator.ApacheConfigurator._get_ssl_vhost_path"
|
||||
mock_a = "certbot_apache.parser.ApacheParser.add_include"
|
||||
|
||||
@@ -1511,7 +1481,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
self.assertEqual(first_id, second_id)
|
||||
|
||||
def test_realpath_replaces_symlink(self):
|
||||
orig_match = self.config.aug.match
|
||||
orig_match = self.config.parser.aug.match
|
||||
mock_vhost = copy.deepcopy(self.vh_truth[0])
|
||||
mock_vhost.filep = mock_vhost.filep.replace('sites-enabled', u'sites-available')
|
||||
mock_vhost.path = mock_vhost.path.replace('sites-enabled', 'sites-available')
|
||||
@@ -1525,7 +1495,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
return orig_match(aug_expr)
|
||||
|
||||
self.config.parser.parser_paths = ["/mocked/path"]
|
||||
self.config.aug.match = mock_match
|
||||
self.config.parser.aug.match = mock_match
|
||||
vhs = self.config.get_virtual_hosts()
|
||||
self.assertEqual(len(vhs), 2)
|
||||
self.assertTrue(vhs[0] == self.vh_truth[1])
|
||||
@@ -1551,8 +1521,8 @@ class AugeasVhostsTest(util.ApacheTest):
|
||||
self.work_dir)
|
||||
|
||||
def test_choosevhost_with_illegal_name(self):
|
||||
self.config.aug = mock.MagicMock()
|
||||
self.config.aug.match.side_effect = RuntimeError
|
||||
self.config.parser.aug = mock.MagicMock()
|
||||
self.config.parser.aug.match.side_effect = RuntimeError
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old-and-default.conf"
|
||||
chosen_vhost = self.config._create_vhost(path)
|
||||
self.assertEqual(None, chosen_vhost)
|
||||
|
||||
@@ -79,9 +79,9 @@ class MultipleVhostsTestDebian(util.ApacheTest):
|
||||
|
||||
def test_enable_site_failure(self):
|
||||
self.config.parser.root = "/tmp/nonexistent"
|
||||
with mock.patch("os.path.isdir") as mock_dir:
|
||||
with mock.patch("certbot.compat.os.path.isdir") as mock_dir:
|
||||
mock_dir.return_value = True
|
||||
with mock.patch("os.path.islink") as mock_link:
|
||||
with mock.patch("certbot.compat.os.path.islink") as mock_link:
|
||||
mock_link.return_value = False
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError,
|
||||
|
||||
@@ -4,6 +4,7 @@ import unittest
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import obj
|
||||
@@ -160,7 +161,7 @@ class MultipleVhostsTestFedora(util.ApacheTest):
|
||||
"""Make sure we read the sysconfig OPTIONS variable correctly"""
|
||||
# Return nothing for the process calls
|
||||
mock_cfg.return_value = ""
|
||||
self.config.parser.sysconfig_filep = os.path.realpath(
|
||||
self.config.parser.sysconfig_filep = filesystem.realpath(
|
||||
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
|
||||
self.config.parser.variables = {}
|
||||
|
||||
|
||||
@@ -4,6 +4,7 @@ import unittest
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
from certbot.compat import filesystem
|
||||
from certbot.compat import os
|
||||
|
||||
from certbot_apache import obj
|
||||
@@ -81,7 +82,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
|
||||
"""Make sure we read the Gentoo APACHE2_OPTS variable correctly"""
|
||||
defines = ['DEFAULT_VHOST', 'INFO',
|
||||
'SSL', 'SSL_DEFAULT_VHOST', 'LANGUAGE']
|
||||
self.config.parser.apacheconfig_filep = os.path.realpath(
|
||||
self.config.parser.apacheconfig_filep = filesystem.realpath(
|
||||
os.path.join(self.config.parser.root, "../conf.d/apache2"))
|
||||
self.config.parser.variables = {}
|
||||
with mock.patch("certbot_apache.override_gentoo.GentooParser.update_modules"):
|
||||
|
||||
@@ -1,8 +1,8 @@
|
||||
# pylint: disable=too-many-public-methods
|
||||
"""Tests for certbot_apache.parser."""
|
||||
import shutil
|
||||
import unittest
|
||||
|
||||
import augeas
|
||||
import mock
|
||||
|
||||
from certbot import errors
|
||||
@@ -22,6 +22,27 @@ class BasicParserTest(util.ParserTest):
|
||||
shutil.rmtree(self.config_dir)
|
||||
shutil.rmtree(self.work_dir)
|
||||
|
||||
def test_bad_parse(self):
|
||||
self.parser.parse_file(os.path.join(self.parser.root,
|
||||
"conf-available", "bad_conf_file.conf"))
|
||||
self.assertRaises(
|
||||
errors.PluginError, self.parser.check_parsing_errors, "httpd.aug")
|
||||
|
||||
def test_bad_save(self):
|
||||
mock_save = mock.Mock()
|
||||
mock_save.side_effect = IOError
|
||||
self.parser.aug.save = mock_save
|
||||
self.assertRaises(errors.PluginError, self.parser.unsaved_files)
|
||||
|
||||
def test_aug_version(self):
|
||||
mock_match = mock.Mock(return_value=["something"])
|
||||
self.parser.aug.match = mock_match
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.parser.check_aug_version(),
|
||||
["something"])
|
||||
self.parser.aug.match.side_effect = RuntimeError
|
||||
self.assertFalse(self.parser.check_aug_version())
|
||||
|
||||
def test_find_config_root_no_root(self):
|
||||
# pylint: disable=protected-access
|
||||
os.remove(self.parser.loc["root"])
|
||||
@@ -311,21 +332,38 @@ class BasicParserTest(util.ParserTest):
|
||||
class ParserInitTest(util.ApacheTest):
|
||||
def setUp(self): # pylint: disable=arguments-differ
|
||||
super(ParserInitTest, self).setUp()
|
||||
self.aug = augeas.Augeas(
|
||||
flags=augeas.Augeas.NONE | augeas.Augeas.NO_MODL_AUTOLOAD)
|
||||
|
||||
def tearDown(self):
|
||||
shutil.rmtree(self.temp_dir)
|
||||
shutil.rmtree(self.config_dir)
|
||||
shutil.rmtree(self.work_dir)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser.init_augeas")
|
||||
def test_prepare_no_augeas(self, mock_init_augeas):
|
||||
from certbot_apache.parser import ApacheParser
|
||||
mock_init_augeas.side_effect = errors.NoInstallationError
|
||||
self.config.config_test = mock.Mock()
|
||||
self.assertRaises(
|
||||
errors.NoInstallationError, ApacheParser,
|
||||
os.path.relpath(self.config_path), "/dummy/vhostpath",
|
||||
version=(2, 4, 22), configurator=self.config)
|
||||
|
||||
def test_init_old_aug(self):
|
||||
from certbot_apache.parser import ApacheParser
|
||||
with mock.patch("certbot_apache.parser.ApacheParser.check_aug_version") as mock_c:
|
||||
mock_c.return_value = False
|
||||
self.assertRaises(
|
||||
errors.NotSupportedError,
|
||||
ApacheParser, os.path.relpath(self.config_path),
|
||||
"/dummy/vhostpath", version=(2, 4, 22), configurator=self.config)
|
||||
|
||||
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
|
||||
def test_unparseable(self, mock_cfg):
|
||||
from certbot_apache.parser import ApacheParser
|
||||
mock_cfg.return_value = ('Define: TEST')
|
||||
self.assertRaises(
|
||||
errors.PluginError,
|
||||
ApacheParser, self.aug, os.path.relpath(self.config_path),
|
||||
ApacheParser, os.path.relpath(self.config_path),
|
||||
"/dummy/vhostpath", version=(2, 2, 22), configurator=self.config)
|
||||
|
||||
def test_root_normalized(self):
|
||||
@@ -337,8 +375,7 @@ class ParserInitTest(util.ApacheTest):
|
||||
self.temp_dir,
|
||||
"debian_apache_2_4/////multiple_vhosts/../multiple_vhosts/apache2")
|
||||
|
||||
parser = ApacheParser(self.aug, path,
|
||||
"/dummy/vhostpath", configurator=self.config)
|
||||
parser = ApacheParser(path, "/dummy/vhostpath", configurator=self.config)
|
||||
|
||||
self.assertEqual(parser.root, self.config_path)
|
||||
|
||||
@@ -347,7 +384,7 @@ class ParserInitTest(util.ApacheTest):
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
parser = ApacheParser(
|
||||
self.aug, os.path.relpath(self.config_path),
|
||||
os.path.relpath(self.config_path),
|
||||
"/dummy/vhostpath", configurator=self.config)
|
||||
|
||||
self.assertEqual(parser.root, self.config_path)
|
||||
@@ -357,7 +394,7 @@ class ParserInitTest(util.ApacheTest):
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
parser = ApacheParser(
|
||||
self.aug, self.config_path + os.path.sep,
|
||||
self.config_path + os.path.sep,
|
||||
"/dummy/vhostpath", configurator=self.config)
|
||||
self.assertEqual(parser.root, self.config_path)
|
||||
|
||||
|
||||
@@ -78,8 +78,7 @@ class ParserTest(ApacheTest):
|
||||
with mock.patch("certbot_apache.parser.ApacheParser."
|
||||
"update_runtime_variables"):
|
||||
self.parser = ApacheParser(
|
||||
self.aug, self.config_path, self.vhost_path,
|
||||
configurator=self.config)
|
||||
self.config_path, self.vhost_path, configurator=self.config)
|
||||
|
||||
|
||||
def get_apache_configurator( # pylint: disable=too-many-arguments, too-many-locals
|
||||
|
||||
@@ -1,3 +1,3 @@
|
||||
# Remember to update setup.py to match the package versions below.
|
||||
acme[dev]==0.29.0
|
||||
certbot[dev]==0.34.0
|
||||
certbot[dev]==0.37.0
|
||||
|
||||
@@ -4,13 +4,13 @@ from setuptools.command.test import test as TestCommand
|
||||
import sys
|
||||
|
||||
|
||||
version = '0.35.0.dev0'
|
||||
version = '0.38.0.dev0'
|
||||
|
||||
# Remember to update local-oldest-requirements.txt when changing the minimum
|
||||
# acme/certbot version.
|
||||
install_requires = [
|
||||
'acme>=0.29.0',
|
||||
'certbot>=0.34.0',
|
||||
'certbot>=0.37.0',
|
||||
'mock',
|
||||
'python-augeas',
|
||||
'setuptools',
|
||||
|
||||
53
certbot-auto
@@ -31,7 +31,7 @@ if [ -z "$VENV_PATH" ]; then
|
||||
fi
|
||||
VENV_BIN="$VENV_PATH/bin"
|
||||
BOOTSTRAP_VERSION_PATH="$VENV_PATH/certbot-auto-bootstrap-version.txt"
|
||||
LE_AUTO_VERSION="0.34.2"
|
||||
LE_AUTO_VERSION="0.37.2"
|
||||
BASENAME=$(basename $0)
|
||||
USAGE="Usage: $BASENAME [OPTIONS]
|
||||
A self-updating wrapper script for the Certbot ACME client. When run, updates
|
||||
@@ -755,13 +755,31 @@ elif [ -f /etc/redhat-release ]; then
|
||||
prev_le_python="$LE_PYTHON"
|
||||
unset LE_PYTHON
|
||||
DeterminePythonVersion "NOCRASH"
|
||||
# Starting with Fedora 29, python2 is on a deprecation path. Let's move to python3 then.
|
||||
|
||||
RPM_DIST_NAME=`(. /etc/os-release 2> /dev/null && echo $ID) || echo "unknown"`
|
||||
RPM_DIST_VERSION=0
|
||||
if [ "$RPM_DIST_NAME" = "fedora" ]; then
|
||||
RPM_DIST_VERSION=`(. /etc/os-release 2> /dev/null && echo $VERSION_ID) || echo "0"`
|
||||
|
||||
# Set RPM_DIST_VERSION to VERSION_ID from /etc/os-release after splitting on
|
||||
# '.' characters (e.g. "8.0" becomes "8"). If the command exits with an
|
||||
# error, RPM_DIST_VERSION is set to "unknown".
|
||||
RPM_DIST_VERSION=$( (. /etc/os-release 2> /dev/null && echo "$VERSION_ID") | cut -d '.' -f1 || echo "unknown")
|
||||
|
||||
# If RPM_DIST_VERSION is an empty string or it contains any nonnumeric
|
||||
# characters, the value is unexpected so we set RPM_DIST_VERSION to 0.
|
||||
if [ -z "$RPM_DIST_VERSION" ] || [ -n "$(echo "$RPM_DIST_VERSION" | tr -d '[0-9]')" ]; then
|
||||
RPM_DIST_VERSION=0
|
||||
fi
|
||||
|
||||
# Starting with Fedora 29, python2 is on a deprecation path. Let's move to python3 then.
|
||||
# RHEL 8 also uses python3 by default.
|
||||
if [ "$RPM_DIST_NAME" = "fedora" -a "$RPM_DIST_VERSION" -ge 29 -o "$PYVER" -eq 26 ]; then
|
||||
RPM_USE_PYTHON_3=1
|
||||
elif [ "$RPM_DIST_NAME" = "rhel" -a "$RPM_DIST_VERSION" -ge 8 ]; then
|
||||
RPM_USE_PYTHON_3=1
|
||||
else
|
||||
RPM_USE_PYTHON_3=0
|
||||
fi
|
||||
|
||||
if [ "$RPM_USE_PYTHON_3" = 1 ]; then
|
||||
Bootstrap() {
|
||||
BootstrapMessage "RedHat-based OSes that will use Python3"
|
||||
BootstrapRpmPython3
|
||||
@@ -775,6 +793,7 @@ elif [ -f /etc/redhat-release ]; then
|
||||
}
|
||||
BOOTSTRAP_VERSION="BootstrapRpmCommon $BOOTSTRAP_RPM_COMMON_VERSION"
|
||||
fi
|
||||
|
||||
LE_PYTHON="$prev_le_python"
|
||||
elif [ -f /etc/os-release ] && `grep -q openSUSE /etc/os-release` ; then
|
||||
Bootstrap() {
|
||||
@@ -1314,18 +1333,18 @@ letsencrypt==0.7.0 \
|
||||
--hash=sha256:105a5fb107e45bcd0722eb89696986dcf5f08a86a321d6aef25a0c7c63375ade \
|
||||
--hash=sha256:c36e532c486a7e92155ee09da54b436a3c420813ec1c590b98f635d924720de9
|
||||
|
||||
certbot==0.34.2 \
|
||||
--hash=sha256:238bb1c100d0d17f0bda147387435c307e128b2f1a8339eb85cef7fb99909cb9 \
|
||||
--hash=sha256:30732ddcb10ccd8b8410c515a76ae0429ad907130b8bf8caa58b73826d0ec9bb
|
||||
acme==0.34.2 \
|
||||
--hash=sha256:f2b3cec09270499211fa54e588571bac67a015d375a4806c6c23431c91fdf7e3 \
|
||||
--hash=sha256:bd5b0dfcbca82a2be6fe12e7c7939721d6b3dacb7d8529ba519b56274060dc2a
|
||||
certbot-apache==0.34.2 \
|
||||
--hash=sha256:c9cbbc2499084361a741f865a6f9af717296d5b0fec5fdd45819df2a56014a63 \
|
||||
--hash=sha256:74c302b2099c9906dd4783cd57f546393235902dcc179302a2da280d83e72b96
|
||||
certbot-nginx==0.34.2 \
|
||||
--hash=sha256:4883f638e703b8fbab0ec15df6d9f0ebbb3cd81e221521b65ca27cdc9e9d070d \
|
||||
--hash=sha256:13d58e40097f6b36e323752c146dc90d06120dc69a313e141476e0bc1a74ee17
|
||||
certbot==0.37.2 \
|
||||
--hash=sha256:8f6f0097fb2aac64f13e5d6974781ac85a051d84a6cb3f4d79c6b75c5ea451b8 \
|
||||
--hash=sha256:e454368aa8d62559c673091b511319c130c8e0ea1c4dfa314ed7bdc91dd96ef5
|
||||
acme==0.37.2 \
|
||||
--hash=sha256:5666ba927a9e7bf3f9ed5a268bd5acf627b5838fb409e8401f05d2aaaee188ba \
|
||||
--hash=sha256:88798fae3bc692397db79c66930bd02fcaba8a6b1fba9a62f111dda42cc47f5c
|
||||
certbot-apache==0.37.2 \
|
||||
--hash=sha256:e3ae7057f727506ab3796095ed66ca083f4e295d06f209ab96d2a3f37dea51b9 \
|
||||
--hash=sha256:4cb44d1a7c56176a84446a11412c561479ed0fed19848632e61f104dbf6a3031
|
||||
certbot-nginx==0.37.2 \
|
||||
--hash=sha256:a92dffdf3daca97db5d7ae2287e505110c3fa01c035b9356abb2ef9fa32e8695 \
|
||||
--hash=sha256:404f7b5b7611f0dce8773739170f306e94a59b69528cb74337e7f354936ac061
|
||||
|
||||
UNLIKELY_EOF
|
||||
# -------------------------------------------------------------------------
|
||||
|
||||
@@ -0,0 +1,171 @@
|
||||
import shutil
|
||||
import subprocess
|
||||
import os
|
||||
|
||||
import pkg_resources
|
||||
import getpass
|
||||
|
||||
|
||||
def construct_apache_config_dir(apache_root, http_port, https_port, key_path=None,
|
||||
cert_path=None, wtf_prefix='le'):
|
||||
config_path = os.path.join(apache_root, 'config')
|
||||
shutil.copytree('/etc/apache2', config_path, symlinks=True)
|
||||
|
||||
webroot_path = os.path.join(apache_root, 'www')
|
||||
os.mkdir(webroot_path)
|
||||
with open(os.path.join(webroot_path, 'index.html'), 'w') as file_h:
|
||||
file_h.write('Hello World!')
|
||||
|
||||
main_config_path = os.path.join(config_path, 'apache2.conf')
|
||||
with open(main_config_path, 'w') as file_h:
|
||||
file_h.write('''\
|
||||
ServerRoot "{config}"
|
||||
DefaultRuntimeDir ${{APACHE_RUN_DIR}}
|
||||
PidFile ${{APACHE_PID_FILE}}
|
||||
Timeout 300
|
||||
KeepAlive On
|
||||
MaxKeepAliveRequests 100
|
||||
KeepAliveTimeout 5
|
||||
User ${{APACHE_RUN_USER}}
|
||||
Group ${{APACHE_RUN_GROUP}}
|
||||
HostnameLookups Off
|
||||
ErrorLog ${{APACHE_LOG_DIR}}/error.log
|
||||
LogLevel warn
|
||||
|
||||
IncludeOptional mods-enabled/*.load
|
||||
IncludeOptional mods-enabled/*.conf
|
||||
|
||||
Include ports.conf
|
||||
|
||||
<Directory />
|
||||
Options FollowSymLinks
|
||||
AllowOverride None
|
||||
Require all denied
|
||||
</Directory>
|
||||
|
||||
<Directory /usr/share>
|
||||
AllowOverride None
|
||||
Require all granted
|
||||
</Directory>
|
||||
|
||||
<Directory {webroot}/>
|
||||
Options Indexes FollowSymLinks
|
||||
AllowOverride None
|
||||
Require all granted
|
||||
</Directory>
|
||||
|
||||
AccessFileName .htaccess
|
||||
|
||||
<FilesMatch "^\.ht">
|
||||
Require all denied
|
||||
</FilesMatch>
|
||||
|
||||
LogFormat "%v:%p %h %l %u %t \\"%r\\" %>s %O \\"%{{Referer}}i\\" \\"%{{User-Agent}}i\\"" vhost_combined
|
||||
LogFormat "%h %l %u %t \\"%r\\" %>s %O \\"%{{Referer}}i\\" \\"%{{User-Agent}}i\\"" combined
|
||||
LogFormat "%h %l %u %t \\"%r\\" %>s %O" common
|
||||
LogFormat "%{{Referer}}i -> %U" referer
|
||||
LogFormat "%{{User-agent}}i" agent
|
||||
|
||||
IncludeOptional conf-enabled/*.conf
|
||||
IncludeOptional sites-enabled/*.conf
|
||||
'''.format(config=config_path, webroot=webroot_path))
|
||||
|
||||
with open(os.path.join(config_path, 'ports.conf'), 'w') as file_h:
|
||||
file_h.write('''\
|
||||
Listen {http}
|
||||
<IfModule ssl_module>
|
||||
Listen {https}
|
||||
</IfModule>
|
||||
<IfModule mod_gnutls.c>
|
||||
Listen {https}
|
||||
</IfModule>
|
||||
'''.format(http=http_port, https=https_port))
|
||||
|
||||
new_environ = os.environ.copy()
|
||||
new_environ['APACHE_CONFDIR'] = config_path
|
||||
|
||||
run_path = os.path.join(apache_root, 'run')
|
||||
lock_path = os.path.join(apache_root, 'lock')
|
||||
logs_path = os.path.join(apache_root, 'logs')
|
||||
os.mkdir(run_path)
|
||||
os.mkdir(lock_path)
|
||||
os.mkdir(logs_path)
|
||||
|
||||
user = getpass.getuser()
|
||||
user = user if user != 'root' else 'www-data'
|
||||
group = user
|
||||
|
||||
pid_file = os.path.join(run_path, 'apache.pid')
|
||||
|
||||
with open(os.path.join(config_path, 'envvars'), 'w') as file_h:
|
||||
file_h.write('''\
|
||||
unset HOME
|
||||
export APACHE_RUN_USER={user}
|
||||
export APACHE_RUN_GROUP={group}
|
||||
export APACHE_PID_FILE={pid_file}
|
||||
export APACHE_RUN_DIR={run_path}
|
||||
export APACHE_LOCK_DIR={lock_path}
|
||||
export APACHE_LOG_DIR={logs_path}
|
||||
export LANG=C
|
||||
'''.format(user=user, group=group, pid_file=pid_file,
|
||||
run_path=run_path, lock_path=lock_path, logs_path=logs_path))
|
||||
|
||||
new_environ['APACHE_RUN_USER'] = user
|
||||
new_environ['APACHE_RUN_GROUP'] = group
|
||||
new_environ['APACHE_PID_FILE'] = pid_file
|
||||
new_environ['APACHE_RUN_DIR'] = run_path
|
||||
new_environ['APACHE_LOCK_DIR'] = lock_path
|
||||
new_environ['APACHE_LOG_DIR'] = logs_path
|
||||
|
||||
le_host = 'apache.{0}.wtf'.format(wtf_prefix)
|
||||
|
||||
with open(os.path.join(config_path, 'sites-available', '000-default.conf'), 'w') as file_h:
|
||||
file_h.write('''\
|
||||
<VirtualHost *:{http}>
|
||||
ServerAdmin webmaster@localhost
|
||||
ServerName {le_host}
|
||||
DocumentRoot {webroot}
|
||||
|
||||
ErrorLog ${{APACHE_LOG_DIR}}/error.log
|
||||
CustomLog ${{APACHE_LOG_DIR}}/access.log combined
|
||||
</VirtualHost>
|
||||
'''.format(http=http_port, le_host=le_host, webroot=webroot_path))
|
||||
|
||||
key_path = key_path if key_path \
|
||||
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/key.pem')
|
||||
cert_path = cert_path if cert_path \
|
||||
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/cert.pem')
|
||||
|
||||
with open(os.path.join(config_path, 'sites-available', 'default-ssl.conf'), 'w') as file_h:
|
||||
file_h.write('''\
|
||||
<IfModule mod_ssl.c>
|
||||
<VirtualHost _default_:{https}>
|
||||
ServerAdmin webmaster@localhost
|
||||
ServerName {le_host}
|
||||
DocumentRoot {webroot}
|
||||
|
||||
ErrorLog ${{APACHE_LOG_DIR}}/error.log
|
||||
CustomLog ${{APACHE_LOG_DIR}}/access.log combined
|
||||
|
||||
SSLEngine on
|
||||
SSLCertificateFile {cert_path}
|
||||
SSLCertificateKeyFile {key_path}
|
||||
|
||||
<FilesMatch "\.(cgi|shtml|phtml|php)$">
|
||||
SSLOptions +StdEnvVars
|
||||
</FilesMatch>
|
||||
|
||||
<Directory /usr/lib/cgi-bin>
|
||||
SSLOptions +StdEnvVars
|
||||
</Directory>
|
||||
</VirtualHost>
|
||||
</IfModule>
|
||||
'''.format(https=https_port, le_host=le_host, webroot=webroot_path,
|
||||
cert_path=cert_path, key_path=key_path))
|
||||
|
||||
return new_environ, config_path, pid_file
|
||||
|
||||
|
||||
def test():
|
||||
env, _, _ = construct_apache_config_dir('/tmp/test1', 5001, 5002)
|
||||
subprocess.call(['apache2ctl', '-DFOREGROUND'], env=env)
|
||||
51
certbot-ci/certbot_integration_tests/apache_tests/context.py
Normal file
@@ -0,0 +1,51 @@
|
||||
import errno
|
||||
import os
|
||||
import signal
|
||||
import subprocess
|
||||
|
||||
from certbot_integration_tests.certbot_tests import context as certbot_context
|
||||
from certbot_integration_tests.apache_tests import apache_config
|
||||
from certbot_integration_tests.utils import certbot_call
|
||||
|
||||
|
||||
class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
|
||||
def __init__(self, request):
|
||||
super(IntegrationTestsContext, self).__init__(request)
|
||||
|
||||
subprocess.check_output(['chmod', '+x', self.workspace])
|
||||
|
||||
self.apache_root = os.path.join(self.workspace, 'apache')
|
||||
os.mkdir(self.apache_root)
|
||||
|
||||
self.env, self.config_dir, self.pid_file = apache_config.construct_apache_config_dir(
|
||||
self.apache_root, self.http_01_port, self.tls_alpn_01_port,
|
||||
wtf_prefix=self.worker_id)
|
||||
|
||||
def cleanup(self):
|
||||
self._stop_apache()
|
||||
super(IntegrationTestsContext, self).cleanup()
|
||||
|
||||
def certbot_test_apache(self, args):
|
||||
command = ['--authenticator', 'apache', '--installer', 'apache',
|
||||
'--apache-server-root', self.config_dir,
|
||||
'--apache-challenge-location', self.apache_root]
|
||||
command.extend(args)
|
||||
|
||||
return certbot_call.certbot_test(
|
||||
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
|
||||
self.config_dir, self.workspace, env=self.env, force_renew=True)
|
||||
|
||||
def _stop_apache(self):
|
||||
try:
|
||||
with open(self.pid_file) as file_h:
|
||||
pid = int(file_h.read().strip())
|
||||
except BaseException:
|
||||
pid = None
|
||||
|
||||
if pid:
|
||||
try:
|
||||
os.kill(pid, signal.SIGTERM)
|
||||
except OSError as err:
|
||||
# Ignore "No such process" error, Apache may already be stopped.
|
||||
if err.errno != errno.ESRCH:
|
||||
raise
|
||||
@@ -0,0 +1,18 @@
|
||||
import pytest
|
||||
|
||||
from certbot_integration_tests.apache_tests import context as apache_context
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def context(request):
|
||||
# Fixture request is a built-in pytest fixture describing current test request.
|
||||
integration_test_context = apache_context.IntegrationTestsContext(request)
|
||||
try:
|
||||
yield integration_test_context
|
||||
finally:
|
||||
integration_test_context.cleanup()
|
||||
|
||||
|
||||
def test_it(context):
|
||||
command = ['-d', 'apache.{0}.wtf'.format(context.worker_id)]
|
||||
context.certbot_test_apache(command)
|
||||
11
certbot-ci/certbot_integration_tests/assets/hook.py
Executable file
@@ -0,0 +1,11 @@
|
||||
#!/usr/bin/env python
|
||||
import sys
|
||||
import os
|
||||
|
||||
hook_script_type = os.path.basename(os.path.dirname(sys.argv[1]))
|
||||
if hook_script_type == 'deploy' and ('RENEWED_DOMAINS' not in os.environ or 'RENEWED_LINEAGE' not in os.environ):
|
||||
sys.stderr.write('Environment variables not properly set!\n')
|
||||
sys.exit(1)
|
||||
|
||||
with open(sys.argv[2], 'a') as file_h:
|
||||
file_h.write(hook_script_type + '\n')
|
||||
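
The script above only checks its inputs; RENEWED_DOMAINS and RENEWED_LINEAGE are the variables certbot exports to deploy hooks. A hypothetical driver mimicking that environment (all paths below are made up for illustration):

```python
# Hypothetical invocation of the hook script above: argv[1] is used to derive
# the hook type from its parent directory name, argv[2] is the probe file.
import os
import subprocess

env = os.environ.copy()
env['RENEWED_DOMAINS'] = 'apache.le.wtf'
env['RENEWED_LINEAGE'] = '/tmp/conf/live/apache.le.wtf'
subprocess.check_call(
    ['python', 'hook.py',
     '/tmp/conf/renewal-hooks/deploy/hook.py',   # argv[1]: hook type comes from "deploy"
     '/tmp/probe'],                              # argv[2]: probe file appended to
    env=env)
```
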
@@ -1,6 +1,17 @@
|
||||
"""This module contains advanced assertions for the certbot integration tests."""
|
||||
import os
|
||||
import grp
|
||||
try:
|
||||
import grp
|
||||
POSIX_MODE = True
|
||||
except ImportError:
|
||||
import win32api
|
||||
import win32security
|
||||
import ntsecuritycon
|
||||
POSIX_MODE = False
|
||||
|
||||
EVERYBODY_SID = 'S-1-1-0'
|
||||
SYSTEM_SID = 'S-1-5-18'
|
||||
ADMINS_SID = 'S-1-5-32-544'
|
||||
|
||||
|
||||
def assert_hook_execution(probe_path, probe_content):
|
||||
@@ -10,9 +21,10 @@ def assert_hook_execution(probe_path, probe_content):
|
||||
:param probe_content: content expected when the hook is executed
|
||||
"""
|
||||
with open(probe_path, 'r') as file:
|
||||
lines = file.readlines()
|
||||
data = file.read()
|
||||
|
||||
assert '{0}{1}'.format(probe_content, os.linesep) in lines
|
||||
lines = [line.strip() for line in data.splitlines()]
|
||||
assert probe_content in lines
|
||||
|
||||
|
||||
def assert_saved_renew_hook(config_dir, lineage):
|
||||
@@ -38,16 +50,51 @@ def assert_cert_count_for_lineage(config_dir, lineage, count):
|
||||
assert len(certs) == count
|
||||
|
||||
|
||||
def assert_equals_permissions(file1, file2, mask):
|
||||
def assert_equals_group_permissions(file1, file2):
|
||||
"""
|
||||
Assert that permissions on two files are identical in respect to a given umask.
|
||||
Assert that two files have the same permissions for group owner.
|
||||
:param file1: first file path to compare
|
||||
:param file2: second file path to compare
|
||||
:param mask: 3-octal representation of a POSIX umask under which the two files mode
|
||||
should match (eg. 0o074 will test RWX on group and R on world)
|
||||
"""
|
||||
mode_file1 = os.stat(file1).st_mode & mask
|
||||
mode_file2 = os.stat(file2).st_mode & mask
|
||||
# On Windows there is no group, so this assertion does nothing on this platform
|
||||
if POSIX_MODE:
|
||||
mode_file1 = os.stat(file1).st_mode & 0o070
|
||||
mode_file2 = os.stat(file2).st_mode & 0o070
|
||||
|
||||
assert mode_file1 == mode_file2
|
||||
|
||||
|
||||
def assert_equals_world_read_permissions(file1, file2):
|
||||
"""
|
||||
Assert that two files have the same read permissions for everyone.
|
||||
:param file1: first file path to compare
|
||||
:param file2: second file path to compare
|
||||
"""
|
||||
if POSIX_MODE:
|
||||
mode_file1 = os.stat(file1).st_mode & 0o004
|
||||
mode_file2 = os.stat(file2).st_mode & 0o004
|
||||
else:
|
||||
everybody = win32security.ConvertStringSidToSid(EVERYBODY_SID)
|
||||
|
||||
security1 = win32security.GetFileSecurity(file1, win32security.DACL_SECURITY_INFORMATION)
|
||||
dacl1 = security1.GetSecurityDescriptorDacl()
|
||||
|
||||
mode_file1 = dacl1.GetEffectiveRightsFromAcl({
|
||||
'TrusteeForm': win32security.TRUSTEE_IS_SID,
|
||||
'TrusteeType': win32security.TRUSTEE_IS_USER,
|
||||
'Identifier': everybody,
|
||||
})
|
||||
mode_file1 = mode_file1 & ntsecuritycon.FILE_GENERIC_READ
|
||||
|
||||
security2 = win32security.GetFileSecurity(file2, win32security.DACL_SECURITY_INFORMATION)
|
||||
dacl2 = security2.GetSecurityDescriptorDacl()
|
||||
|
||||
mode_file2 = dacl2.GetEffectiveRightsFromAcl({
|
||||
'TrusteeForm': win32security.TRUSTEE_IS_SID,
|
||||
'TrusteeType': win32security.TRUSTEE_IS_USER,
|
||||
'Identifier': everybody,
|
||||
})
|
||||
mode_file2 = mode_file2 & ntsecuritycon.FILE_GENERIC_READ
|
||||
|
||||
assert mode_file1 == mode_file2
|
||||
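
A hedged usage sketch of the helpers above, assuming they live in the integration tests' assertions module (the module path and file paths below are illustrative):

```python
# Illustrative only: comparing an old and a new private key from the same
# lineage, the way the integration tests do after a renewal.
from certbot_integration_tests.assertions import (
    assert_equals_group_permissions,
    assert_equals_world_read_permissions,
)

old_key = '/tmp/conf/archive/apache.le.wtf/privkey1.pem'
new_key = '/tmp/conf/archive/apache.le.wtf/privkey2.pem'
assert_equals_group_permissions(old_key, new_key)
assert_equals_world_read_permissions(old_key, new_key)
```
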
|
||||
@@ -57,20 +104,57 @@ def assert_equals_group_owner(file1, file2):
|
||||
Assert that two files have the same group owner.
|
||||
:param file1: first file path to compare
|
||||
:param file2: second file path to compare
|
||||
:return:
|
||||
"""
|
||||
group_owner_file1 = grp.getgrgid(os.stat(file1).st_gid)[0]
|
||||
group_owner_file2 = grp.getgrgid(os.stat(file2).st_gid)[0]
|
||||
# On Windows there is no group, so this assertion does nothing on this platform
|
||||
if POSIX_MODE:
|
||||
group_owner_file1 = grp.getgrgid(os.stat(file1).st_gid)[0]
|
||||
group_owner_file2 = grp.getgrgid(os.stat(file2).st_gid)[0]
|
||||
|
||||
assert group_owner_file1 == group_owner_file2
|
||||
assert group_owner_file1 == group_owner_file2
|
||||
|
||||
|
||||
def assert_world_permissions(file, mode):
|
||||
def assert_world_no_permissions(file):
|
||||
"""
|
||||
Assert that a file has the expected world permission.
|
||||
:param file: file path to check
|
||||
:param mode: world permissions mode expected
|
||||
Assert that the given file is not world-readable.
|
||||
:param file: path of the file to check
|
||||
"""
|
||||
mode_file_all = os.stat(file).st_mode & 0o007
|
||||
if POSIX_MODE:
|
||||
mode_file_all = os.stat(file).st_mode & 0o007
|
||||
assert mode_file_all == 0
|
||||
else:
|
||||
security = win32security.GetFileSecurity(file, win32security.DACL_SECURITY_INFORMATION)
|
||||
dacl = security.GetSecurityDescriptorDacl()
|
||||
mode = dacl.GetEffectiveRightsFromAcl({
|
||||
'TrusteeForm': win32security.TRUSTEE_IS_SID,
|
||||
'TrusteeType': win32security.TRUSTEE_IS_USER,
|
||||
'Identifier': win32security.ConvertStringSidToSid(EVERYBODY_SID),
|
||||
})
|
||||
|
||||
assert mode_file_all == mode
|
||||
assert not mode
|
||||
|
||||
|
||||
def assert_world_read_permissions(file):
|
||||
"""
|
||||
Assert that the given file is world-readable, but not world-writable or world-executable.
|
||||
:param file: path of the file to check
|
||||
"""
|
||||
if POSIX_MODE:
|
||||
mode_file_all = os.stat(file).st_mode & 0o007
|
||||
assert mode_file_all == 4
|
||||
else:
|
||||
security = win32security.GetFileSecurity(file, win32security.DACL_SECURITY_INFORMATION)
|
||||
dacl = security.GetSecurityDescriptorDacl()
|
||||
mode = dacl.GetEffectiveRightsFromAcl({
|
||||
'TrusteeForm': win32security.TRUSTEE_IS_SID,
|
||||
'TrusteeType': win32security.TRUSTEE_IS_USER,
|
||||
'Identifier': win32security.ConvertStringSidToSid(EVERYBODY_SID),
|
||||
})
|
||||
|
||||
assert not mode & ntsecuritycon.FILE_GENERIC_WRITE
|
||||
assert not mode & ntsecuritycon.FILE_GENERIC_EXECUTE
|
||||
assert mode & ntsecuritycon.FILE_GENERIC_READ == ntsecuritycon.FILE_GENERIC_READ
|
||||
|
||||
|
||||
def _get_current_user():
|
||||
account_name = win32api.GetUserNameEx(win32api.NameSamCompatible)
|
||||
return win32security.LookupAccountName(None, account_name)[0]
|
||||
|
||||
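For reference, on POSIX systems the assertions above boil down to masking st_mode with an octal mask; a minimal illustrative sketch (file paths are hypothetical):

import os

# Mirror assert_equals_permissions(file1, file2, 0o074) on POSIX: compare only the
# group bits (rwx) and the world read bit of the two files.
mode_a = os.stat('/tmp/privkey1.pem').st_mode & 0o074
mode_b = os.stat('/tmp/privkey2.pem').st_mode & 0o074
assert mode_a == mode_b

# Mirror assert_world_read_permissions(file) on POSIX: world bits must be exactly read-only.
assert os.stat('/tmp/privkey1.pem').st_mode & 0o007 == 4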
@@ -1,12 +1,11 @@
|
||||
"""Module to handle the context of integration tests."""
|
||||
import logging
|
||||
import os
|
||||
import shutil
|
||||
import subprocess
|
||||
import sys
|
||||
import tempfile
|
||||
from distutils.version import LooseVersion
|
||||
|
||||
from certbot_integration_tests.utils import misc
|
||||
from certbot_integration_tests.utils import certbot_call
|
||||
|
||||
|
||||
class IntegrationTestsContext(object):
|
||||
@@ -21,7 +20,7 @@ class IntegrationTestsContext(object):
|
||||
self.worker_id = 'primary'
|
||||
acme_xdist = request.config.acme_xdist
|
||||
|
||||
self.acme_server =acme_xdist['acme_server']
|
||||
self.acme_server = acme_xdist['acme_server']
|
||||
self.directory_url = acme_xdist['directory_url']
|
||||
self.tls_alpn_01_port = acme_xdist['https_port'][self.worker_id]
|
||||
self.http_01_port = acme_xdist['http_port'][self.worker_id]
|
||||
@@ -30,14 +29,12 @@ class IntegrationTestsContext(object):
|
||||
# is listening on challtestsrv_port.
|
||||
self.challtestsrv_port = acme_xdist['challtestsrv_port']
|
||||
|
||||
# Certbot version does not depend on the test context. But getting its value requires
|
||||
# calling certbot from a subprocess. Since it will be called a lot of times through
|
||||
# _common_test_no_force_renew, we cache its value as a member of the fixture context.
|
||||
self.certbot_version = misc.get_certbot_version()
|
||||
|
||||
self.workspace = tempfile.mkdtemp()
|
||||
self.config_dir = os.path.join(self.workspace, 'conf')
|
||||
self.hook_probe = tempfile.mkstemp(dir=self.workspace)[1]
|
||||
|
||||
probe = tempfile.mkstemp(dir=self.workspace)
|
||||
os.close(probe[0])
|
||||
self.hook_probe = probe[1]
|
||||
|
||||
self.manual_dns_auth_hook = (
|
||||
'{0} -c "import os; import requests; import json; '
|
||||
@@ -60,71 +57,18 @@ class IntegrationTestsContext(object):
|
||||
"""Cleanup the integration test context."""
|
||||
shutil.rmtree(self.workspace)
|
||||
|
||||
def _common_test_no_force_renew(self, args):
|
||||
"""
|
||||
Base command to execute certbot in a distributed integration test context,
|
||||
not renewing certificates by default.
|
||||
"""
|
||||
new_environ = os.environ.copy()
|
||||
new_environ['TMPDIR'] = self.workspace
|
||||
|
||||
additional_args = []
|
||||
if self.certbot_version >= LooseVersion('0.30.0'):
|
||||
additional_args.append('--no-random-sleep-on-renew')
|
||||
|
||||
command = [
|
||||
'certbot',
|
||||
'--server', self.directory_url,
|
||||
'--no-verify-ssl',
|
||||
'--http-01-port', str(self.http_01_port),
|
||||
'--https-port', str(self.tls_alpn_01_port),
|
||||
'--manual-public-ip-logging-ok',
|
||||
'--config-dir', self.config_dir,
|
||||
'--work-dir', os.path.join(self.workspace, 'work'),
|
||||
'--logs-dir', os.path.join(self.workspace, 'logs'),
|
||||
'--non-interactive',
|
||||
'--no-redirect',
|
||||
'--agree-tos',
|
||||
'--register-unsafely-without-email',
|
||||
'--debug',
|
||||
'-vv'
|
||||
]
|
||||
|
||||
command.extend(args)
|
||||
command.extend(additional_args)
|
||||
|
||||
print('Invoke command:\n{0}'.format(subprocess.list2cmdline(command)))
|
||||
return subprocess.check_output(command, universal_newlines=True,
|
||||
cwd=self.workspace, env=new_environ)
|
||||
|
||||
def _common_test(self, args):
|
||||
"""
|
||||
Base command to execute certbot in a distributed integration test context,
|
||||
renewing certificates by default.
|
||||
"""
|
||||
command = ['--renew-by-default']
|
||||
command.extend(args)
|
||||
return self._common_test_no_force_renew(command)
|
||||
|
||||
def certbot_no_force_renew(self, args):
|
||||
def certbot(self, args, force_renew=True):
|
||||
"""
|
||||
Execute certbot with given args, not renewing certificates by default.
|
||||
:param args: args to pass to certbot
|
||||
:param force_renew: set to False to not renew by default
|
||||
:return: output of certbot execution
|
||||
"""
|
||||
command = ['--authenticator', 'standalone', '--installer', 'null']
|
||||
command.extend(args)
|
||||
return self._common_test_no_force_renew(command)
|
||||
|
||||
def certbot(self, args):
|
||||
"""
|
||||
Execute certbot with given args, renewing certificates by default.
|
||||
:param args: args to pass to certbot
|
||||
:return: output of certbot execution
|
||||
"""
|
||||
command = ['--renew-by-default']
|
||||
command.extend(args)
|
||||
return self.certbot_no_force_renew(command)
|
||||
return certbot_call.certbot_test(
|
||||
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
|
||||
self.config_dir, self.workspace, force_renew=force_renew)
|
||||
|
||||
def get_domain(self, subdomain='le'):
|
||||
"""
|
||||
|
||||
@@ -11,8 +11,11 @@ from os.path import join, exists
|
||||
import pytest
|
||||
from certbot_integration_tests.certbot_tests import context as certbot_context
|
||||
from certbot_integration_tests.certbot_tests.assertions import (
|
||||
assert_hook_execution, assert_saved_renew_hook, assert_cert_count_for_lineage,
|
||||
assert_world_permissions, assert_equals_group_owner, assert_equals_permissions,
|
||||
assert_hook_execution, assert_saved_renew_hook,
|
||||
assert_cert_count_for_lineage,
|
||||
assert_world_no_permissions, assert_world_read_permissions,
|
||||
assert_equals_group_owner, assert_equals_group_permissions, assert_equals_world_read_permissions,
|
||||
EVERYBODY_SID
|
||||
)
|
||||
from certbot_integration_tests.utils import misc
|
||||
|
||||
@@ -84,9 +87,9 @@ def test_http_01(context):
|
||||
context.certbot([
|
||||
'--domains', certname, '--preferred-challenges', 'http-01', 'run',
|
||||
'--cert-name', certname,
|
||||
'--pre-hook', 'echo wtf.pre >> "{0}"'.format(context.hook_probe),
|
||||
'--post-hook', 'echo wtf.post >> "{0}"'.format(context.hook_probe),
|
||||
'--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)
|
||||
'--pre-hook', misc.echo('wtf_pre', context.hook_probe),
|
||||
'--post-hook', misc.echo('wtf_post', context.hook_probe),
|
||||
'--deploy-hook', misc.echo('deploy', context.hook_probe),
|
||||
])
|
||||
|
||||
assert_hook_execution(context.hook_probe, 'deploy')
|
||||
@@ -104,9 +107,9 @@ def test_manual_http_auth(context):
|
||||
'--cert-name', certname,
|
||||
'--manual-auth-hook', scripts[0],
|
||||
'--manual-cleanup-hook', scripts[1],
|
||||
'--pre-hook', 'echo wtf.pre >> "{0}"'.format(context.hook_probe),
|
||||
'--post-hook', 'echo wtf.post >> "{0}"'.format(context.hook_probe),
|
||||
'--renew-hook', 'echo renew >> "{0}"'.format(context.hook_probe)
|
||||
'--pre-hook', misc.echo('wtf_pre', context.hook_probe),
|
||||
'--post-hook', misc.echo('wtf_post', context.hook_probe),
|
||||
'--renew-hook', misc.echo('renew', context.hook_probe),
|
||||
])
|
||||
|
||||
with pytest.raises(AssertionError):
|
||||
@@ -122,9 +125,9 @@ def test_manual_dns_auth(context):
|
||||
'run', '--cert-name', certname,
|
||||
'--manual-auth-hook', context.manual_dns_auth_hook,
|
||||
'--manual-cleanup-hook', context.manual_dns_cleanup_hook,
|
||||
'--pre-hook', 'echo wtf.pre >> "{0}"'.format(context.hook_probe),
|
||||
'--post-hook', 'echo wtf.post >> "{0}"'.format(context.hook_probe),
|
||||
'--renew-hook', 'echo renew >> "{0}"'.format(context.hook_probe)
|
||||
'--pre-hook', misc.echo('wtf_pre', context.hook_probe),
|
||||
'--post-hook', misc.echo('wtf_post', context.hook_probe),
|
||||
'--renew-hook', misc.echo('renew', context.hook_probe),
|
||||
])
|
||||
|
||||
with pytest.raises(AssertionError):
|
||||
@@ -173,21 +176,19 @@ def test_renew_files_permissions(context):
|
||||
certname = context.get_domain('renew')
|
||||
context.certbot(['-d', certname])
|
||||
|
||||
privkey1 = join(context.config_dir, 'archive', certname, 'privkey1.pem')
|
||||
privkey2 = join(context.config_dir, 'archive', certname, 'privkey2.pem')
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 1)
|
||||
assert_world_permissions(
|
||||
join(context.config_dir, 'archive', certname, 'privkey1.pem'), 0)
|
||||
assert_world_no_permissions(privkey1)
|
||||
|
||||
context.certbot(['renew'])
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 2)
|
||||
assert_world_permissions(
|
||||
join(context.config_dir, 'archive', certname, 'privkey2.pem'), 0)
|
||||
assert_equals_group_owner(
|
||||
join(context.config_dir, 'archive', certname, 'privkey1.pem'),
|
||||
join(context.config_dir, 'archive', certname, 'privkey2.pem'))
|
||||
assert_equals_permissions(
|
||||
join(context.config_dir, 'archive', certname, 'privkey1.pem'),
|
||||
join(context.config_dir, 'archive', certname, 'privkey2.pem'), 0o074)
|
||||
assert_world_no_permissions(privkey2)
|
||||
assert_equals_group_owner(privkey1, privkey2)
|
||||
assert_equals_world_read_permissions(privkey1, privkey2)
|
||||
assert_equals_group_permissions(privkey1, privkey2)
|
||||
|
||||
|
||||
def test_renew_with_hook_scripts(context):
|
||||
@@ -211,15 +212,35 @@ def test_renew_files_propagate_permissions(context):
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 1)
|
||||
|
||||
os.chmod(join(context.config_dir, 'archive', certname, 'privkey1.pem'), 0o444)
|
||||
privkey1 = join(context.config_dir, 'archive', certname, 'privkey1.pem')
|
||||
privkey2 = join(context.config_dir, 'archive', certname, 'privkey2.pem')
|
||||
|
||||
if os.name != 'nt':
|
||||
os.chmod(privkey1, 0o444)
|
||||
else:
|
||||
import win32security
|
||||
import ntsecuritycon
|
||||
# Get the current DACL of the private key
|
||||
security = win32security.GetFileSecurity(privkey1, win32security.DACL_SECURITY_INFORMATION)
|
||||
dacl = security.GetSecurityDescriptorDacl()
|
||||
# Create a read permission for Everybody group
|
||||
everybody = win32security.ConvertStringSidToSid(EVERYBODY_SID)
|
||||
dacl.AddAccessAllowedAce(win32security.ACL_REVISION, ntsecuritycon.FILE_GENERIC_READ, everybody)
|
||||
# Apply the updated DACL to the private key
|
||||
security.SetSecurityDescriptorDacl(1, dacl, 0)
|
||||
win32security.SetFileSecurity(privkey1, win32security.DACL_SECURITY_INFORMATION, security)
|
||||
|
||||
context.certbot(['renew'])
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 2)
|
||||
assert_world_permissions(
|
||||
join(context.config_dir, 'archive', certname, 'privkey2.pem'), 4)
|
||||
assert_equals_permissions(
|
||||
join(context.config_dir, 'archive', certname, 'privkey1.pem'),
|
||||
join(context.config_dir, 'archive', certname, 'privkey2.pem'), 0o074)
|
||||
if os.name != 'nt':
|
||||
# On Linux, the world read permission + all group permissions will be copied from the previous private key
|
||||
assert_world_read_permissions(privkey2)
|
||||
assert_equals_world_read_permissions(privkey1, privkey2)
|
||||
assert_equals_group_permissions(privkey1, privkey2)
|
||||
else:
|
||||
# On Windows, world will never have any permissions, and group permissions are irrelevant on this platform
|
||||
assert_world_no_permissions(privkey2)
|
||||
|
||||
|
||||
def test_graceful_renew_it_is_not_time(context):
|
||||
@@ -229,8 +250,8 @@ def test_graceful_renew_it_is_not_time(context):
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 1)
|
||||
|
||||
context.certbot_no_force_renew([
|
||||
'renew', '--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)])
|
||||
context.certbot(['renew', '--deploy-hook', misc.echo('deploy', context.hook_probe)],
|
||||
force_renew=False)
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 1)
|
||||
with pytest.raises(AssertionError):
|
||||
@@ -250,8 +271,8 @@ def test_graceful_renew_it_is_time(context):
|
||||
with open(join(context.config_dir, 'renewal', '{0}.conf'.format(certname)), 'w') as file:
|
||||
file.writelines(lines)
|
||||
|
||||
context.certbot_no_force_renew([
|
||||
'renew', '--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)])
|
||||
context.certbot(['renew', '--deploy-hook', misc.echo('deploy', context.hook_probe)],
|
||||
force_renew=False)
|
||||
|
||||
assert_cert_count_for_lineage(context.config_dir, certname, 2)
|
||||
assert_hook_execution(context.hook_probe, 'deploy')
|
||||
@@ -317,9 +338,9 @@ def test_renew_hook_override(context):
|
||||
context.certbot([
|
||||
'certonly', '-d', certname,
|
||||
'--preferred-challenges', 'http-01',
|
||||
'--pre-hook', 'echo pre >> "{0}"'.format(context.hook_probe),
|
||||
'--post-hook', 'echo post >> "{0}"'.format(context.hook_probe),
|
||||
'--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)
|
||||
'--pre-hook', misc.echo('pre', context.hook_probe),
|
||||
'--post-hook', misc.echo('post', context.hook_probe),
|
||||
'--deploy-hook', misc.echo('deploy', context.hook_probe),
|
||||
])
|
||||
|
||||
assert_hook_execution(context.hook_probe, 'pre')
|
||||
@@ -330,14 +351,14 @@ def test_renew_hook_override(context):
|
||||
open(context.hook_probe, 'w').close()
|
||||
context.certbot([
|
||||
'renew', '--cert-name', certname,
|
||||
'--pre-hook', 'echo pre-override >> "{0}"'.format(context.hook_probe),
|
||||
'--post-hook', 'echo post-override >> "{0}"'.format(context.hook_probe),
|
||||
'--deploy-hook', 'echo deploy-override >> "{0}"'.format(context.hook_probe)
|
||||
'--pre-hook', misc.echo('pre_override', context.hook_probe),
|
||||
'--post-hook', misc.echo('post_override', context.hook_probe),
|
||||
'--deploy-hook', misc.echo('deploy_override', context.hook_probe),
|
||||
])
|
||||
|
||||
assert_hook_execution(context.hook_probe, 'pre-override')
|
||||
assert_hook_execution(context.hook_probe, 'post-override')
|
||||
assert_hook_execution(context.hook_probe, 'deploy-override')
|
||||
assert_hook_execution(context.hook_probe, 'pre_override')
|
||||
assert_hook_execution(context.hook_probe, 'post_override')
|
||||
assert_hook_execution(context.hook_probe, 'deploy_override')
|
||||
with pytest.raises(AssertionError):
|
||||
assert_hook_execution(context.hook_probe, 'pre')
|
||||
with pytest.raises(AssertionError):
|
||||
@@ -349,11 +370,11 @@ def test_renew_hook_override(context):
|
||||
open(context.hook_probe, 'w').close()
|
||||
context.certbot(['renew', '--cert-name', certname])
|
||||
|
||||
assert_hook_execution(context.hook_probe, 'pre-override')
|
||||
assert_hook_execution(context.hook_probe, 'post-override')
|
||||
assert_hook_execution(context.hook_probe, 'deploy-override')
|
||||
assert_hook_execution(context.hook_probe, 'pre_override')
|
||||
assert_hook_execution(context.hook_probe, 'post_override')
|
||||
assert_hook_execution(context.hook_probe, 'deploy_override')
|
||||
|
||||
|
||||
|
||||
def test_invalid_domain_with_dns_challenge(context):
|
||||
"""Test certificate issuance failure with DNS-01 challenge."""
|
||||
# Manual dns auth hooks from misc are designed to fail if the domain contains 'fail-*'.
|
||||
@@ -512,7 +533,7 @@ def test_revoke_multiple_lineages(context):
|
||||
data = file.read()
|
||||
|
||||
data = re.sub('archive_dir = .*\n',
|
||||
'archive_dir = {0}\n'.format(join(context.config_dir, 'archive', cert1)),
|
||||
'archive_dir = {0}\n'.format(join(context.config_dir, 'archive', cert1).replace('\\', '\\\\')),
|
||||
data)
|
||||
|
||||
with open(join(context.config_dir, 'renewal', '{0}.conf'.format(cert2)), 'w') as file:
|
||||
@@ -555,11 +576,9 @@ def test_ocsp_status_stale(context):
|
||||
|
||||
def test_ocsp_status_live(context):
|
||||
"""Test retrieval of OCSP statuses for live config"""
|
||||
if context.acme_server == 'pebble':
|
||||
pytest.skip('Pebble does not support OCSP status requests.')
|
||||
cert = context.get_domain('ocsp-check')
|
||||
|
||||
# OSCP 1: Check live certificate OCSP status (VALID)
|
||||
cert = context.get_domain('ocsp-check')
|
||||
context.certbot(['--domains', cert])
|
||||
output = context.certbot(['certificates'])
|
||||
|
||||
|
||||
@@ -68,17 +68,18 @@ def _setup_primary_node(config):
|
||||
:param config: Configuration of the pytest primary node
|
||||
"""
|
||||
# Check for runtime compatibility: some tools are required to be available in PATH
|
||||
try:
|
||||
subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
|
||||
except (subprocess.CalledProcessError, OSError):
|
||||
raise ValueError('Error: docker is required in PATH to launch the integration tests, '
|
||||
'but is not installed or not available for current user.')
|
||||
if 'boulder' in config.option.acme_server:
|
||||
try:
|
||||
subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
|
||||
except (subprocess.CalledProcessError, OSError):
|
||||
raise ValueError('Error: docker is required in PATH to launch the integration tests on '
'boulder, but is not installed or not available for current user.')
|
||||
|
||||
try:
|
||||
subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
|
||||
except (subprocess.CalledProcessError, OSError):
|
||||
raise ValueError('Error: docker-compose is required in PATH to launch the integration tests, '
|
||||
'but is not installed or not available for current user.')
|
||||
try:
|
||||
subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
|
||||
except (subprocess.CalledProcessError, OSError):
|
||||
raise ValueError('Error: docker-compose is required in PATH to launch the integration tests, '
|
||||
'but is not installed or not available for current user.')
|
||||
|
||||
# Parameter numprocesses is added to option by pytest-xdist
|
||||
workers = ['primary'] if not config.option.numprocesses\
|
||||
@@ -86,7 +87,9 @@ def _setup_primary_node(config):
|
||||
|
||||
# By calling setup_acme_server we ensure that all necessary acme server instances will be
|
||||
# fully started. This runtime is reflected by the acme_xdist returned.
|
||||
acme_xdist = acme_lib.setup_acme_server(config.option.acme_server, workers)
|
||||
print('ACME xdist config:\n{0}'.format(acme_xdist))
|
||||
acme_server = acme_lib.ACMEServer(config.option.acme_server, workers)
|
||||
config.add_cleanup(acme_server.stop)
|
||||
print('ACME xdist config:\n{0}'.format(acme_server.acme_xdist))
|
||||
acme_server.start()
|
||||
|
||||
return acme_xdist
|
||||
return acme_server.acme_xdist
|
||||
|
||||
@@ -2,7 +2,7 @@ import os
|
||||
import subprocess
|
||||
|
||||
from certbot_integration_tests.certbot_tests import context as certbot_context
|
||||
from certbot_integration_tests.utils import misc
|
||||
from certbot_integration_tests.utils import misc, certbot_call
|
||||
from certbot_integration_tests.nginx_tests import nginx_config as config
|
||||
|
||||
|
||||
@@ -33,11 +33,14 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
|
||||
"""
|
||||
Main command to execute certbot using the nginx plugin.
|
||||
:param list args: list of arguments to pass to nginx
|
||||
:param bool force_renew: set to False to not renew by default
|
||||
"""
|
||||
command = ['--authenticator', 'nginx', '--installer', 'nginx',
|
||||
'--nginx-server-root', self.nginx_root]
|
||||
command.extend(args)
|
||||
return self._common_test(command)
|
||||
return certbot_call.certbot_test(
|
||||
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
|
||||
self.config_dir, self.workspace, force_renew=True)
|
||||
|
||||
def _start_nginx(self, default_server):
|
||||
self.nginx_config = config.construct_nginx_config(
|
||||
|
||||
@@ -21,9 +21,9 @@ def construct_nginx_config(nginx_root, nginx_webroot, http_port, https_port, oth
|
||||
:rtype: str
|
||||
"""
|
||||
key_path = key_path if key_path \
|
||||
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/nginx_key.pem')
|
||||
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/key.pem')
|
||||
cert_path = cert_path if cert_path \
|
||||
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/nginx_cert.pem')
|
||||
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/cert.pem')
|
||||
return '''\
|
||||
# This error log will be written regardless of server scope error_log
|
||||
# definitions, so we have to set this here in the main scope.
|
||||
|
||||
@@ -30,6 +30,7 @@ def context(request):
|
||||
('nginx6.{0}.wtf,nginx7.{0}.wtf', ['--preferred-challenges', 'http'], {'default_server': False}),
|
||||
], indirect=['context'])
|
||||
def test_certificate_deployment(certname_pattern, params, context):
|
||||
# type: (str, list, nginx_context.IntegrationTestsContext) -> None
|
||||
"""
|
||||
Test various scenarios to deploy a certificate to nginx using certbot.
|
||||
"""
|
||||
@@ -45,10 +46,7 @@ def test_certificate_deployment(certname_pattern, params, context):
|
||||
|
||||
assert server_cert == certbot_cert
|
||||
|
||||
command = ['--authenticator', 'nginx', '--installer', 'nginx',
|
||||
'--nginx-server-root', context.nginx_root,
|
||||
'rollback', '--checkpoints', '1']
|
||||
context._common_test_no_force_renew(command)
|
||||
context.certbot_test_nginx(['rollback', '--checkpoints', '1'])
|
||||
|
||||
with open(context.nginx_config_path, 'r') as file_h:
|
||||
current_nginx_config = file_h.read()
|
||||
|
||||
certbot-ci/certbot_integration_tests/utils/acme_server.py (330 lines changed, normal file → executable file)
@@ -1,197 +1,219 @@
|
||||
#!/usr/bin/env python
|
||||
"""Module to setup an ACME CA server environment able to run multiple tests in parallel"""
|
||||
from __future__ import print_function
|
||||
import errno
|
||||
import json
|
||||
import tempfile
|
||||
import atexit
|
||||
import time
|
||||
import os
|
||||
import subprocess
|
||||
import shutil
|
||||
import stat
|
||||
import sys
|
||||
from os.path import join
|
||||
|
||||
import requests
|
||||
import json
|
||||
import yaml
|
||||
|
||||
from certbot_integration_tests.utils import misc
|
||||
|
||||
# These ports are set implicitly in the docker-compose.yml files of Boulder/Pebble.
|
||||
CHALLTESTSRV_PORT = 8055
|
||||
HTTP_01_PORT = 5002
|
||||
from certbot_integration_tests.utils import misc, proxy, pebble_artifacts
|
||||
from certbot_integration_tests.utils.constants import *
|
||||
|
||||
|
||||
def setup_acme_server(acme_server, nodes):
|
||||
class ACMEServer(object):
|
||||
"""
|
||||
This method will setup an ACME CA server and an HTTP reverse proxy instance, to allow parallel
|
||||
execution of integration tests against the unique http-01 port expected by the ACME CA server.
|
||||
Instances are properly closed and cleaned when the Python process exits using atexit.
|
||||
ACMEServer configures and handles the lifecycle of an ACME CA server and an HTTP reverse proxy
|
||||
instance, to allow parallel execution of integration tests against the unique http-01 port
|
||||
expected by the ACME CA server.
|
||||
Typically all pytest integration tests will be executed in this context.
|
||||
This method returns an object describing ports and directory url to use for each pytest node
|
||||
with the relevant pytest xdist node.
|
||||
:param str acme_server: the type of acme server used (boulder-v1, boulder-v2 or pebble)
|
||||
:param str[] nodes: list of node names that will be setup by pytest xdist
|
||||
:return: a dict describing the challenge ports that have been setup for the nodes
|
||||
:rtype: dict
|
||||
ACMEServer gives access to the acme_xdist parameter, listing the ports and directory url to use
for each pytest node. It also exposes start and stop methods in order to start the stack, and
stop it with proper resource cleanup.
ACMEServer is also a context manager, and so can be used to ensure the ACME server is started/stopped
upon context enter/exit.
|
||||
"""
|
||||
acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
|
||||
acme_xdist = _construct_acme_xdist(acme_server, nodes)
|
||||
workspace = _construct_workspace(acme_type)
|
||||
def __init__(self, acme_server, nodes, http_proxy=True, stdout=False):
|
||||
"""
|
||||
Create an ACMEServer instance.
|
||||
:param str acme_server: the type of acme server used (boulder-v1, boulder-v2 or pebble)
|
||||
:param list nodes: list of node names that will be setup by pytest xdist
|
||||
:param bool http_proxy: if False do not start the HTTP proxy
|
||||
:param bool stdout: if True stream subprocesses stdout to standard stdout
|
||||
"""
|
||||
self._construct_acme_xdist(acme_server, nodes)
|
||||
|
||||
_prepare_traefik_proxy(workspace, acme_xdist)
|
||||
_prepare_acme_server(workspace, acme_type, acme_xdist)
|
||||
self._acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
|
||||
self._proxy = http_proxy
|
||||
self._workspace = tempfile.mkdtemp()
|
||||
self._processes = []
|
||||
self._stdout = sys.stdout if stdout else open(os.devnull, 'w')
|
||||
|
||||
return acme_xdist
|
||||
def start(self):
|
||||
"""Start the test stack"""
|
||||
try:
|
||||
if self._proxy:
|
||||
self._prepare_http_proxy()
|
||||
if self._acme_type == 'pebble':
|
||||
self._prepare_pebble_server()
|
||||
if self._acme_type == 'boulder':
|
||||
self._prepare_boulder_server()
|
||||
except BaseException as e:
|
||||
self.stop()
|
||||
raise e
|
||||
|
||||
def stop(self):
|
||||
"""Stop the test stack, and clean its resources"""
|
||||
print('=> Tear down the test infrastructure...')
|
||||
try:
|
||||
for process in self._processes:
|
||||
try:
|
||||
process.terminate()
|
||||
except OSError as e:
|
||||
# Process may be not started yet, so no PID and terminate fails.
|
||||
# Then the process never started, and the situation is acceptable.
|
||||
if e.errno != errno.ESRCH:
|
||||
raise
|
||||
for process in self._processes:
|
||||
process.wait()
|
||||
|
||||
def _construct_acme_xdist(acme_server, nodes):
|
||||
"""Generate and return the acme_xdist dict"""
|
||||
acme_xdist = {'acme_server': acme_server, 'challtestsrv_port': CHALLTESTSRV_PORT}
|
||||
if os.path.exists(os.path.join(self._workspace, 'boulder')):
|
||||
# Boulder docker generates build artifacts owned by root with 0o744 permissions.
|
||||
# If we started the acme server from a normal user that has access to the Docker
|
||||
# daemon, this user will not be able to delete these artifacts from the host.
|
||||
# We need to do it through a Docker container.
|
||||
process = self._launch_process(['docker', 'run', '--rm', '-v',
|
||||
'{0}:/workspace'.format(self._workspace),
|
||||
'alpine', 'rm', '-rf', '/workspace/boulder'])
|
||||
process.wait()
|
||||
finally:
|
||||
shutil.rmtree(self._workspace)
|
||||
if self._stdout != sys.stdout:
|
||||
self._stdout.close()
|
||||
print('=> Test infrastructure stopped and cleaned up.')
|
||||
|
||||
# Directory and ACME port are set implicitly in the docker-compose.yml files of Boulder/Pebble.
|
||||
if acme_server == 'pebble':
|
||||
acme_xdist['directory_url'] = 'https://localhost:14000/dir'
|
||||
else: # boulder
|
||||
port = 4001 if acme_server == 'boulder-v2' else 4000
|
||||
acme_xdist['directory_url'] = 'http://localhost:{0}/directory'.format(port)
|
||||
def __enter__(self):
|
||||
self.start()
|
||||
return self.acme_xdist
|
||||
|
||||
acme_xdist['http_port'] = {node: port for (node, port)
|
||||
in zip(nodes, range(5200, 5200 + len(nodes)))}
|
||||
acme_xdist['https_port'] = {node: port for (node, port)
|
||||
in zip(nodes, range(5100, 5100 + len(nodes)))}
|
||||
acme_xdist['other_port'] = {node: port for (node, port)
|
||||
in zip(nodes, range(5300, 5300 + len(nodes)))}
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
self.stop()
|
||||
|
||||
return acme_xdist
|
||||
def _construct_acme_xdist(self, acme_server, nodes):
|
||||
"""Generate and return the acme_xdist dict"""
|
||||
acme_xdist = {'acme_server': acme_server, 'challtestsrv_port': CHALLTESTSRV_PORT}
|
||||
|
||||
# Directory and ACME port are set implicitly in the docker-compose.yml files of Boulder/Pebble.
|
||||
if acme_server == 'pebble':
|
||||
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
|
||||
else: # boulder
|
||||
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL \
|
||||
if acme_server == 'boulder-v2' else BOULDER_V1_DIRECTORY_URL
|
||||
|
||||
def _construct_workspace(acme_type):
|
||||
"""Create a temporary workspace for integration tests stack"""
|
||||
workspace = tempfile.mkdtemp()
|
||||
acme_xdist['http_port'] = {node: port for (node, port)
|
||||
in zip(nodes, range(5200, 5200 + len(nodes)))}
|
||||
acme_xdist['https_port'] = {node: port for (node, port)
|
||||
in zip(nodes, range(5100, 5100 + len(nodes)))}
|
||||
acme_xdist['other_port'] = {node: port for (node, port)
|
||||
in zip(nodes, range(5300, 5300 + len(nodes)))}
|
||||
|
||||
def cleanup():
|
||||
"""Cleanup function to call that will teardown relevant dockers and their configuration."""
|
||||
for instance in [acme_type, 'traefik']:
|
||||
print('=> Tear down the {0} instance...'.format(instance))
|
||||
instance_path = join(workspace, instance)
|
||||
try:
|
||||
if os.path.isfile(join(instance_path, 'docker-compose.yml')):
|
||||
_launch_command(['docker-compose', 'down'], cwd=instance_path)
|
||||
except subprocess.CalledProcessError:
|
||||
pass
|
||||
print('=> Finished tear down of {0} instance.'.format(acme_type))
|
||||
self.acme_xdist = acme_xdist
|
||||
|
||||
shutil.rmtree(workspace)
|
||||
def _prepare_pebble_server(self):
|
||||
"""Configure and launch the Pebble server"""
|
||||
print('=> Starting pebble instance deployment...')
|
||||
pebble_path, challtestsrv_path, pebble_config_path = pebble_artifacts.fetch(self._workspace)
|
||||
|
||||
# Here with atexit we ensure that clean function is called no matter what.
|
||||
atexit.register(cleanup)
|
||||
# Configure Pebble at full speed (PEBBLE_VA_NOSLEEP=1) and to not randomly reject valid
# nonces (PEBBLE_WFE_NONCEREJECT=0), to have a stable test environment.
|
||||
environ = os.environ.copy()
|
||||
environ['PEBBLE_VA_NOSLEEP'] = '1'
|
||||
environ['PEBBLE_WFE_NONCEREJECT'] = '0'
|
||||
|
||||
return workspace
|
||||
self._launch_process(
|
||||
[pebble_path, '-config', pebble_config_path, '-dnsserver', '127.0.0.1:8053'],
|
||||
env=environ)
|
||||
|
||||
self._launch_process(
|
||||
[challtestsrv_path, '-management', ':{0}'.format(CHALLTESTSRV_PORT), '-defaultIPv6', '""',
|
||||
'-defaultIPv4', '127.0.0.1', '-http01', '""', '-tlsalpn01', '""', '-https01', '""'])
|
||||
|
||||
def _prepare_acme_server(workspace, acme_type, acme_xdist):
|
||||
"""Configure and launch the ACME server, Boulder or Pebble"""
|
||||
print('=> Starting {0} instance deployment...'.format(acme_type))
|
||||
instance_path = join(workspace, acme_type)
|
||||
try:
|
||||
# Load Boulder/Pebble from git, that includes a docker-compose.yml ready for production.
|
||||
_launch_command(['git', 'clone', 'https://github.com/letsencrypt/{0}'.format(acme_type),
|
||||
'--single-branch', '--depth=1', instance_path])
|
||||
if acme_type == 'boulder':
|
||||
# Allow Boulder to ignore usual limit rate policies, useful for tests.
|
||||
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
|
||||
join(instance_path, 'test/rate-limit-policies.yml'))
|
||||
if acme_type == 'pebble':
|
||||
# Configure Pebble at full speed (PEBBLE_VA_NOSLEEP=1) and not randomly refusing valid
|
||||
# nonce (PEBBLE_WFE_NONCEREJECT=0) to have a stable test environment.
|
||||
with open(os.path.join(instance_path, 'docker-compose.yml'), 'r') as file_handler:
|
||||
config = yaml.load(file_handler.read())
|
||||
|
||||
config['services']['pebble'].setdefault('environment', [])\
|
||||
.extend(['PEBBLE_VA_NOSLEEP=1', 'PEBBLE_WFE_NONCEREJECT=0'])
|
||||
with open(os.path.join(instance_path, 'docker-compose.yml'), 'w') as file_handler:
|
||||
file_handler.write(yaml.dump(config))
|
||||
|
||||
# Launch the ACME CA server.
|
||||
_launch_command(['docker-compose', 'up', '--force-recreate', '-d'], cwd=instance_path)
|
||||
# pebble_ocsp_server is imported here and not at the top of the module in order to avoid a useless
# ImportError, in the case where the cryptography dependency is too old to support OCSP, but
|
||||
# Boulder is used instead of Pebble, so pebble_ocsp_server is not used. This is the typical
|
||||
# situation of integration-certbot-oldest tox testenv.
|
||||
from certbot_integration_tests.utils import pebble_ocsp_server
|
||||
self._launch_process([sys.executable, pebble_ocsp_server.__file__])
|
||||
|
||||
# Wait for the ACME CA server to be up.
|
||||
print('=> Waiting for {0} instance to respond...'.format(acme_type))
|
||||
misc.check_until_timeout(acme_xdist['directory_url'])
|
||||
print('=> Waiting for pebble instance to respond...')
|
||||
misc.check_until_timeout(self.acme_xdist['directory_url'])
|
||||
|
||||
print('=> Finished pebble instance deployment.')
|
||||
|
||||
def _prepare_boulder_server(self):
|
||||
"""Configure and launch the Boulder server"""
|
||||
print('=> Starting boulder instance deployment...')
|
||||
instance_path = join(self._workspace, 'boulder')
|
||||
|
||||
# Load Boulder from git, that includes a docker-compose.yml ready for production.
|
||||
process = self._launch_process(['git', 'clone', 'https://github.com/letsencrypt/boulder',
|
||||
'--single-branch', '--depth=1', instance_path])
|
||||
process.wait()
|
||||
|
||||
# Allow Boulder to ignore usual limit rate policies, useful for tests.
|
||||
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
|
||||
join(instance_path, 'test/rate-limit-policies.yml'))
|
||||
|
||||
# Launch the Boulder server
|
||||
self._launch_process(['docker-compose', 'up', '--force-recreate'], cwd=instance_path)
|
||||
|
||||
# Wait for the ACME CA server to be up.
|
||||
print('=> Waiting for boulder instance to respond...')
|
||||
misc.check_until_timeout(self.acme_xdist['directory_url'], attempts=240)
|
||||
|
||||
# Configure challtestsrv to answer any A record request with ip of the docker host.
|
||||
acme_subnet = '10.77.77' if acme_type == 'boulder' else '10.30.50'
|
||||
response = requests.post('http://localhost:{0}/set-default-ipv4'
|
||||
.format(acme_xdist['challtestsrv_port']),
|
||||
json={'ip': '{0}.1'.format(acme_subnet)})
|
||||
response = requests.post('http://localhost:{0}/set-default-ipv4'.format(CHALLTESTSRV_PORT),
|
||||
json={'ip': '10.77.77.1'})
|
||||
response.raise_for_status()
|
||||
|
||||
print('=> Finished {0} instance deployment.'.format(acme_type))
|
||||
except BaseException:
|
||||
print('Error while setting up {0} instance.'.format(acme_type))
|
||||
raise
|
||||
print('=> Finished boulder instance deployment.')
|
||||
|
||||
def _prepare_http_proxy(self):
|
||||
"""Configure and launch an HTTP proxy"""
|
||||
print('=> Configuring the HTTP proxy...')
|
||||
mapping = {r'.+\.{0}\.wtf'.format(node): 'http://127.0.0.1:{0}'.format(port)
|
||||
for node, port in self.acme_xdist['http_port'].items()}
|
||||
command = [sys.executable, proxy.__file__, str(HTTP_01_PORT), json.dumps(mapping)]
|
||||
self._launch_process(command)
|
||||
print('=> Finished configuring the HTTP proxy.')
|
||||
|
||||
def _launch_process(self, command, cwd=os.getcwd(), env=None):
|
||||
"""Launch silently an subprocess OS command"""
|
||||
if not env:
|
||||
env = os.environ
|
||||
process = subprocess.Popen(command, stdout=self._stdout, stderr=subprocess.STDOUT, cwd=cwd, env=env)
|
||||
self._processes.append(process)
|
||||
return process
|
||||
|
||||
|
||||
def _prepare_traefik_proxy(workspace, acme_xdist):
|
||||
"""Configure and launch Traefik, the HTTP reverse proxy"""
|
||||
print('=> Starting traefik instance deployment...')
|
||||
instance_path = join(workspace, 'traefik')
|
||||
traefik_subnet = '10.33.33'
|
||||
traefik_api_port = 8056
|
||||
def main():
|
||||
args = sys.argv[1:]
|
||||
server_type = args[0] if args else 'pebble'
|
||||
possible_values = ('pebble', 'boulder-v1', 'boulder-v2')
|
||||
if server_type not in possible_values:
|
||||
raise ValueError('Invalid server value {0}, should be one of {1}'
|
||||
.format(server_type, possible_values))
|
||||
|
||||
acme_server = ACMEServer(server_type, [], http_proxy=False, stdout=True)
|
||||
|
||||
try:
|
||||
os.mkdir(instance_path)
|
||||
with acme_server as acme_xdist:
|
||||
print('--> Instance of {0} is running, directory URL is {1}'
.format(server_type, acme_xdist['directory_url']))
|
||||
print('--> Press CTRL+C to stop the ACME server.')
|
||||
|
||||
with open(join(instance_path, 'docker-compose.yml'), 'w') as file_h:
|
||||
file_h.write('''\
|
||||
version: '3'
|
||||
services:
|
||||
traefik:
|
||||
image: traefik
|
||||
command: --api --rest
|
||||
ports:
|
||||
- {http_01_port}:80
|
||||
- {traefik_api_port}:8080
|
||||
networks:
|
||||
traefiknet:
|
||||
ipv4_address: {traefik_subnet}.2
|
||||
networks:
|
||||
traefiknet:
|
||||
ipam:
|
||||
config:
|
||||
- subnet: {traefik_subnet}.0/24
|
||||
'''.format(traefik_subnet=traefik_subnet,
|
||||
traefik_api_port=traefik_api_port,
|
||||
http_01_port=HTTP_01_PORT))
|
||||
|
||||
_launch_command(['docker-compose', 'up', '--force-recreate', '-d'], cwd=instance_path)
|
||||
|
||||
misc.check_until_timeout('http://localhost:{0}/api'.format(traefik_api_port))
|
||||
config = {
|
||||
'backends': {
|
||||
node: {
|
||||
'servers': {node: {'url': 'http://{0}.1:{1}'.format(traefik_subnet, port)}}
|
||||
} for node, port in acme_xdist['http_port'].items()
|
||||
},
|
||||
'frontends': {
|
||||
node: {
|
||||
'backend': node, 'passHostHeader': True,
|
||||
'routes': {node: {'rule': 'HostRegexp: {{subdomain:.+}}.{0}.wtf'.format(node)}}
|
||||
} for node in acme_xdist['http_port'].keys()
|
||||
}
|
||||
}
|
||||
response = requests.put('http://localhost:{0}/api/providers/rest'.format(traefik_api_port),
|
||||
data=json.dumps(config))
|
||||
response.raise_for_status()
|
||||
|
||||
print('=> Finished traefik instance deployment.')
|
||||
except BaseException:
|
||||
print('Error while setting up traefik instance.')
|
||||
raise
|
||||
while True:
|
||||
time.sleep(3600)
|
||||
except KeyboardInterrupt:
|
||||
pass
|
||||
|
||||
|
||||
def _launch_command(command, cwd=os.getcwd()):
|
||||
"""Launch silently an OS command, output will be displayed in case of failure"""
|
||||
try:
|
||||
subprocess.check_output(command, stderr=subprocess.STDOUT, cwd=cwd, universal_newlines=True)
|
||||
except subprocess.CalledProcessError as e:
|
||||
sys.stderr.write(e.output)
|
||||
raise
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
|
||||
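As the class docstring notes, ACMEServer can also be driven directly as a context manager; a minimal sketch of standalone use against Pebble (worker name is illustrative):

from certbot_integration_tests.utils.acme_server import ACMEServer

# Start a Pebble stack for a single local worker, without the HTTP proxy,
# and tear everything down when the block exits.
with ACMEServer('pebble', ['primary'], http_proxy=False) as acme_xdist:
    print(acme_xdist['directory_url'])  # https://localhost:14000/dir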
certbot-ci/certbot_integration_tests/utils/certbot_call.py (new executable file, 140 lines)
@@ -0,0 +1,140 @@
|
||||
#!/usr/bin/env python
|
||||
"""Module to call certbot in test mode"""
|
||||
from __future__ import absolute_import
|
||||
from distutils.version import LooseVersion
|
||||
import subprocess
|
||||
import sys
|
||||
import os
|
||||
|
||||
import certbot_integration_tests
|
||||
from certbot_integration_tests.utils.constants import *
|
||||
|
||||
|
||||
def certbot_test(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
|
||||
config_dir, workspace, env=None, force_renew=True):
|
||||
"""
|
||||
Invoke the certbot executable available in PATH in a test context for the given args.
|
||||
The test context consists of running certbot in debug mode, with various flags suitable
for tests (e.g. no SSL verification, customizable ACME challenge ports and config directory).
|
||||
This command captures stdout and returns it to the caller.
|
||||
:param list certbot_args: the arguments to pass to the certbot executable
|
||||
:param str directory_url: URL of the ACME directory server to use
|
||||
:param int http_01_port: port for the HTTP-01 challenges
|
||||
:param int tls_alpn_01_port: port for the TLS-ALPN-01 challenges
|
||||
:param str config_dir: certbot configuration directory to use
|
||||
:param str workspace: certbot current directory to use
|
||||
:param obj env: the environment variables to use (default: None, then current env will be used)
|
||||
:param bool force_renew: set False to not force renew existing certificates (default: True)
|
||||
:return: stdout as string
|
||||
:rtype: str
|
||||
"""
|
||||
new_environ = env if env else os.environ.copy()
|
||||
command, env = _prepare_args_env(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
|
||||
config_dir, workspace, force_renew, new_environ)
|
||||
|
||||
return subprocess.check_output(command, universal_newlines=True, cwd=workspace, env=env)
|
||||
|
||||
|
||||
def _prepare_environ(workspace, new_environ):
|
||||
new_environ = new_environ.copy()
|
||||
new_environ['TMPDIR'] = workspace
|
||||
|
||||
# So, pytest is nice, and a little too nice for our usage.
|
||||
# To help the user seamlessly call any piece of Python code without requiring it to be
# installed as a full-fledged setuptools distribution, for instance, pytest may inject the path
# to the test files into the PYTHONPATH. This allows the python interpreter to import
# as modules any python file available at this path.
# See https://docs.pytest.org/en/3.2.5/pythonpath.html for the explanation and description.
# However this behavior is not good in integration tests, in particular the oldest nginx ones.
# Indeed, during these kinds of tests certbot is installed as a transitive dependency of
# certbot-nginx. Here is the trick: this certbot version is not necessarily the same as
|
||||
# the certbot codebase lying in current working directory. For instance in oldest tests
|
||||
# certbot==0.36.0 may be installed while the codebase corresponds to certbot==0.37.0.dev0.
|
||||
# Then during a pytest run, PYTHONPATH contains the path to the Certbot codebase, so invoking
|
||||
# certbot will import the modules from the codebase (0.37.0.dev0), not from the
|
||||
# required/installed version (0.36.0).
|
||||
# This will lead to funny and totally incomprehensible errors. To avoid that, we ensure that
|
||||
# if PYTHONPATH is set, it does not contain the path to the root of the codebase.
|
||||
if new_environ.get('PYTHONPATH'):
|
||||
# certbot_integration_tests.__file__ is:
|
||||
# '/path/to/certbot/certbot-ci/certbot_integration_tests/__init__.pyc'
|
||||
# ... and we want '/path/to/certbot'
|
||||
certbot_root = os.path.dirname(os.path.dirname(os.path.dirname(certbot_integration_tests.__file__)))
|
||||
python_paths = [path for path in new_environ['PYTHONPATH'].split(':') if path != certbot_root]
|
||||
new_environ['PYTHONPATH'] = ':'.join(python_paths)
|
||||
|
||||
return new_environ
|
||||
|
||||
|
||||
def _compute_additional_args(workspace, environ, force_renew):
|
||||
additional_args = []
|
||||
output = subprocess.check_output(['certbot', '--version'],
|
||||
universal_newlines=True, stderr=subprocess.STDOUT,
|
||||
cwd=workspace, env=environ)
|
||||
version_str = output.split(' ')[1].strip() # Typical response is: output = 'certbot 0.31.0.dev0'
|
||||
if LooseVersion(version_str) >= LooseVersion('0.30.0'):
|
||||
additional_args.append('--no-random-sleep-on-renew')
|
||||
|
||||
if force_renew:
|
||||
additional_args.append('--renew-by-default')
|
||||
|
||||
return additional_args
|
||||
|
||||
|
||||
def _prepare_args_env(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
|
||||
config_dir, workspace, force_renew, new_environ):
|
||||
|
||||
new_environ = _prepare_environ(workspace, new_environ)
|
||||
additional_args = _compute_additional_args(workspace, new_environ, force_renew)
|
||||
|
||||
command = [
|
||||
'certbot',
|
||||
'--server', directory_url,
|
||||
'--no-verify-ssl',
|
||||
'--http-01-port', str(http_01_port),
|
||||
'--https-port', str(tls_alpn_01_port),
|
||||
'--manual-public-ip-logging-ok',
|
||||
'--config-dir', config_dir,
|
||||
'--work-dir', os.path.join(workspace, 'work'),
|
||||
'--logs-dir', os.path.join(workspace, 'logs'),
|
||||
'--non-interactive',
|
||||
'--no-redirect',
|
||||
'--agree-tos',
|
||||
'--register-unsafely-without-email',
|
||||
'--debug',
|
||||
'-vv'
|
||||
]
|
||||
|
||||
command.extend(certbot_args)
|
||||
command.extend(additional_args)
|
||||
|
||||
print('--> Invoke command:\n=====\n{0}\n====='.format(subprocess.list2cmdline(command)))
|
||||
|
||||
return command, new_environ
|
||||
|
||||
|
||||
def main():
|
||||
args = sys.argv[1:]
|
||||
|
||||
# Default config is pebble
|
||||
directory_url = os.environ.get('SERVER', PEBBLE_DIRECTORY_URL)
|
||||
http_01_port = int(os.environ.get('HTTP_01_PORT', HTTP_01_PORT))
|
||||
tls_alpn_01_port = int(os.environ.get('TLS_ALPN_01_PORT', TLS_ALPN_01_PORT))
|
||||
|
||||
# Execution of certbot in a self-contained workspace
|
||||
workspace = os.environ.get('WORKSPACE', os.path.join(os.getcwd(), '.certbot_test_workspace'))
|
||||
if not os.path.exists(workspace):
|
||||
print('--> Creating a workspace for certbot_test: {0}'.format(workspace))
|
||||
os.mkdir(workspace)
|
||||
else:
|
||||
print('--> Using an existing workspace for certbot_test: {0}'.format(workspace))
|
||||
config_dir = os.path.join(workspace, 'conf')
|
||||
|
||||
# Invoke certbot in test mode, without capturing output so users see directly the outcome.
|
||||
command, env = _prepare_args_env(args, directory_url, http_01_port, tls_alpn_01_port,
|
||||
config_dir, workspace, True)
|
||||
subprocess.check_call(command, universal_newlines=True, cwd=workspace, env=env)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
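Outside of pytest, the helper above can be called directly, assuming certbot is available in PATH and an ACME server is already running; the paths and domain below are illustrative:

from certbot_integration_tests.utils import certbot_call
from certbot_integration_tests.utils.constants import (
    HTTP_01_PORT, PEBBLE_DIRECTORY_URL, TLS_ALPN_01_PORT)

# Issue a certificate through the standalone authenticator against a local Pebble.
output = certbot_call.certbot_test(
    ['--authenticator', 'standalone', 'certonly', '-d', 'test.example.wtf'],
    PEBBLE_DIRECTORY_URL, HTTP_01_PORT, TLS_ALPN_01_PORT,
    '/tmp/certbot-workspace/conf', '/tmp/certbot-workspace')
print(output)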
certbot-ci/certbot_integration_tests/utils/constants.py (new file, 9 lines)
@@ -0,0 +1,9 @@
|
||||
"""Some useful constants to use throughout certbot-ci integration tests"""
|
||||
HTTP_01_PORT = 5002
|
||||
TLS_ALPN_01_PORT = 5001
|
||||
CHALLTESTSRV_PORT = 8055
|
||||
BOULDER_V1_DIRECTORY_URL = 'http://localhost:4000/directory'
|
||||
BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
|
||||
PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'
|
||||
PEBBLE_MANAGEMENT_URL = 'https://localhost:15000'
|
||||
MOCK_OCSP_SERVER_PORT = 4002
|
||||
@@ -3,9 +3,11 @@ Misc module contains stateless functions that could be used during pytest execut
|
||||
or outside during setup/teardown of the integration tests environment.
|
||||
"""
|
||||
import contextlib
|
||||
import logging
|
||||
import errno
|
||||
import multiprocessing
|
||||
import os
|
||||
import re
|
||||
import shutil
|
||||
import stat
|
||||
import subprocess
|
||||
@@ -23,19 +25,17 @@ from cryptography.hazmat.primitives.asymmetric import ec
|
||||
from cryptography.hazmat.primitives.serialization import Encoding, PrivateFormat, NoEncryption
|
||||
from six.moves import socketserver, SimpleHTTPServer
|
||||
|
||||
from acme import crypto_util
|
||||
|
||||
|
||||
RSA_KEY_TYPE = 'rsa'
|
||||
ECDSA_KEY_TYPE = 'ecdsa'
|
||||
|
||||
|
||||
def check_until_timeout(url):
|
||||
def check_until_timeout(url, attempts=30):
|
||||
"""
|
||||
Wait and block until given url responds with status 200, or raise an exception
|
||||
after 150 attempts.
|
||||
after the specified number of attempts.
|
||||
:param str url: the URL to test
|
||||
:raise ValueError: exception raised after 150 unsuccessful attempts to reach the URL
|
||||
:param int attempts: the number of times to try to connect to the URL
|
||||
:raise ValueError: exception raised if unable to reach the URL
|
||||
"""
|
||||
try:
|
||||
import urllib3
|
||||
@@ -45,7 +45,7 @@ def check_until_timeout(url):
|
||||
from requests.packages.urllib3.exceptions import InsecureRequestWarning
|
||||
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
|
||||
|
||||
for _ in range(0, 150):
|
||||
for _ in range(attempts):
|
||||
time.sleep(1)
|
||||
try:
|
||||
if requests.get(url, verify=False).status_code == 200:
|
||||
@@ -53,7 +53,7 @@ def check_until_timeout(url):
|
||||
except requests.exceptions.ConnectionError:
|
||||
pass
|
||||
|
||||
raise ValueError('Error, url did not respond after 150 attempts: {0}'.format(url))
|
||||
raise ValueError('Error, url did not respond after {0} attempts: {1}'.format(attempts, url))
|
||||
|
||||
|
||||
class GracefulTCPServer(socketserver.TCPServer):
|
||||
@@ -64,6 +64,10 @@ class GracefulTCPServer(socketserver.TCPServer):
|
||||
allow_reuse_address = True
|
||||
|
||||
|
||||
def _run_server(port):
|
||||
GracefulTCPServer(('', port), SimpleHTTPServer.SimpleHTTPRequestHandler).serve_forever()
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def create_http_server(port):
|
||||
"""
|
||||
@@ -76,10 +80,7 @@ def create_http_server(port):
|
||||
current_cwd = os.getcwd()
|
||||
webroot = tempfile.mkdtemp()
|
||||
|
||||
def run():
|
||||
GracefulTCPServer(('', port), SimpleHTTPServer.SimpleHTTPRequestHandler).serve_forever()
|
||||
|
||||
process = multiprocessing.Process(target=run)
|
||||
process = multiprocessing.Process(target=_run_server, args=(port,))
|
||||
|
||||
try:
|
||||
# SimpleHTTPServer is designed to serve files from the current working directory at the
|
||||
@@ -121,15 +122,9 @@ def generate_test_file_hooks(config_dir, hook_probe):
|
||||
:param str config_dir: current certbot config directory
|
||||
:param hook_probe: path to the hook probe to test hook scripts execution
|
||||
"""
|
||||
if sys.platform == 'win32':
|
||||
extension = 'bat'
|
||||
else:
|
||||
extension = 'sh'
|
||||
hook_path = pkg_resources.resource_filename('certbot_integration_tests', 'assets/hook.py')
|
||||
|
||||
renewal_hooks_dirs = list_renewal_hooks_dirs(config_dir)
|
||||
renewal_deploy_hook_path = os.path.join(renewal_hooks_dirs[1], 'hook.sh')
|
||||
|
||||
for hook_dir in renewal_hooks_dirs:
|
||||
for hook_dir in list_renewal_hooks_dirs(config_dir):
|
||||
# We want an equivalent of bash `mkdir -p $HOOK_DIR`, that does not fail if one folder of
# the hierarchy already exists. This is not the case for os.makedirs. Python 3 has an
# optional parameter `exist_ok` to not fail on an existing dir, but Python 2.7 does not.
|
||||
@@ -139,26 +134,25 @@ def generate_test_file_hooks(config_dir, hook_probe):
|
||||
except OSError as error:
|
||||
if error.errno != errno.EEXIST:
|
||||
raise
|
||||
hook_path = os.path.join(hook_dir, 'hook.{0}'.format(extension))
|
||||
if extension == 'sh':
|
||||
data = '''\
|
||||
#!/bin/bash -xe
|
||||
if [ "$0" = "{0}" ]; then
|
||||
if [ -z "$RENEWED_DOMAINS" -o -z "$RENEWED_LINEAGE" ]; then
|
||||
echo "Environment variables not properly set!" >&2
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
echo $(basename $(dirname "$0")) >> "{1}"\
|
||||
'''.format(renewal_deploy_hook_path, hook_probe)
|
||||
else:
|
||||
# TODO: Write the equivalent bat file for Windows
|
||||
data = '''\
|
||||
|
||||
'''
|
||||
with open(hook_path, 'w') as file:
|
||||
file.write(data)
|
||||
os.chmod(hook_path, os.stat(hook_path).st_mode | stat.S_IEXEC)
|
||||
if os.name != 'nt':
|
||||
entrypoint_script_path = os.path.join(hook_dir, 'entrypoint.sh')
|
||||
entrypoint_script = '''\
|
||||
#!/usr/bin/env bash
|
||||
set -e
|
||||
"{0}" "{1}" "{2}" "{3}"
|
||||
'''.format(sys.executable, hook_path, entrypoint_script_path, hook_probe)
|
||||
else:
|
||||
entrypoint_script_path = os.path.join(hook_dir, 'entrypoint.bat')
|
||||
entrypoint_script = '''\
|
||||
@echo off
|
||||
"{0}" "{1}" "{2}" "{3}"
|
||||
'''.format(sys.executable, hook_path, entrypoint_script_path, hook_probe)
|
||||
|
||||
with open(entrypoint_script_path, 'w') as file_h:
|
||||
file_h.write(entrypoint_script)
|
||||
|
||||
os.chmod(entrypoint_script_path, os.stat(entrypoint_script_path).st_mode | stat.S_IEXEC)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
@@ -195,7 +189,7 @@ for _ in range(0, 10):
|
||||
except requests.exceptions.ConnectionError:
|
||||
pass
|
||||
raise ValueError('Error, url did not respond after 10 attempts: {{0}}'.format(url))
|
||||
'''.format(http_server_root, http_port))
|
||||
'''.format(http_server_root.replace('\\', '\\\\'), http_port))
|
||||
os.chmod(auth_script_path, 0o755)
|
||||
|
||||
cleanup_script_path = os.path.join(tempdir, 'cleanup.py')
|
||||
@@ -206,7 +200,7 @@ import os
|
||||
import shutil
|
||||
well_known = os.path.join('{0}', '.well-known')
|
||||
shutil.rmtree(well_known)
|
||||
'''.format(http_server_root))
|
||||
'''.format(http_server_root.replace('\\', '\\\\')))
|
||||
os.chmod(cleanup_script_path, 0o755)
|
||||
|
||||
yield ('{0} {1}'.format(sys.executable, auth_script_path),
|
||||
@@ -215,18 +209,6 @@ shutil.rmtree(well_known)
|
||||
shutil.rmtree(tempdir)
|
||||
|
||||
|
||||
def get_certbot_version():
|
||||
"""
|
||||
Find the version of the certbot available in PATH.
|
||||
:return str: the certbot version
|
||||
"""
|
||||
output = subprocess.check_output(['certbot', '--version'],
|
||||
universal_newlines=True, stderr=subprocess.STDOUT)
|
||||
# Typical response is: output = 'certbot 0.31.0.dev0'
|
||||
version_str = output.split(' ')[1].strip()
|
||||
return LooseVersion(version_str)
|
||||
|
||||
|
||||
def generate_csr(domains, key_path, csr_path, key_type=RSA_KEY_TYPE):
|
||||
"""
|
||||
Generate a private key, and a CSR for the given domains using this key.
|
||||
@@ -250,13 +232,20 @@ def generate_csr(domains, key_path, csr_path, key_type=RSA_KEY_TYPE):
|
||||
else:
|
||||
raise ValueError('Invalid key type: {0}'.format(key_type))
|
||||
|
||||
key_bytes = crypto.dump_privatekey(crypto.FILETYPE_PEM, key)
|
||||
with open(key_path, 'wb') as file:
|
||||
file.write(key_bytes)
|
||||
with open(key_path, 'wb') as file_h:
|
||||
file_h.write(crypto.dump_privatekey(crypto.FILETYPE_PEM, key))
|
||||
|
||||
csr_bytes = crypto_util.make_csr(key_bytes, domains)
|
||||
with open(csr_path, 'wb') as file:
|
||||
file.write(csr_bytes)
|
||||
req = crypto.X509Req()
|
||||
san = ', '.join(['DNS:{0}'.format(item) for item in domains])
|
||||
san_constraint = crypto.X509Extension(b'subjectAltName', False, san.encode('utf-8'))
|
||||
req.add_extensions([san_constraint])
|
||||
|
||||
req.set_pubkey(key)
|
||||
req.set_version(2)
|
||||
req.sign(key, 'sha256')
|
||||
|
||||
with open(csr_path, 'wb') as file_h:
|
||||
file_h.write(crypto.dump_certificate_request(crypto.FILETYPE_ASN1, req))
|
||||
|
||||
|
||||
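A typical call of the rewritten helper, with illustrative paths (the CSR is written in DER form, since dump_certificate_request is called with FILETYPE_ASN1):

from certbot_integration_tests.utils import misc

# Generate an RSA key (the default key_type) and a CSR covering two test domains.
misc.generate_csr(
    ['a.example.wtf', 'b.example.wtf'],
    '/tmp/test_key.pem', '/tmp/test_csr.der')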
def read_certificate(cert_path):
|
||||
@@ -282,4 +271,32 @@ def load_sample_data_path(workspace):
|
||||
original = pkg_resources.resource_filename('certbot_integration_tests', 'assets/sample-config')
|
||||
copied = os.path.join(workspace, 'sample-config')
|
||||
shutil.copytree(original, copied, symlinks=True)
|
||||
|
||||
if os.name == 'nt':
|
||||
# Fix the symlinks on Windows since GIT is not creating them upon checkout
|
||||
for lineage in ['a.encryption-example.com', 'b.encryption-example.com']:
|
||||
current_live = os.path.join(copied, 'live', lineage)
|
||||
for name in os.listdir(current_live):
|
||||
if name != 'README':
|
||||
current_file = os.path.join(current_live, name)
|
||||
with open(current_file) as file_h:
|
||||
src = file_h.read()
|
||||
os.unlink(current_file)
|
||||
os.symlink(os.path.join(current_live, src), current_file)
|
||||
|
||||
return copied
|
||||
|
||||
|
||||
def echo(keyword, path=None):
|
||||
"""
|
||||
Generate a platform-independent executable command
that echoes the given keyword into the given file.
:param keyword: the keyword to echo (must be a single keyword)
:param path: path to the file where the keyword is echoed
|
||||
    :return: the executable command
    """
    if not re.match(r'^\w+$', keyword):
        raise ValueError('Error, keyword `{0}` is not a single keyword.'
                         .format(keyword))
    return '{0} -c "from __future__ import print_function; print(\'{1}\')"{2}'.format(
        os.path.basename(sys.executable), keyword, ' >> "{0}"'.format(path) if path else '')
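
To illustrate the return value (derived directly from the format string above; the keyword and path are hypothetical):

# Evaluates to roughly:
#   python -c "from __future__ import print_function; print('deployed')" >> "/tmp/hook.log"
# i.e. a shell command that appends the keyword to the file, usable as a hook command.
command = echo('deployed', '/tmp/hook.log')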

@@ -0,0 +1,53 @@
import json
import os
import stat

import pkg_resources
import requests

from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT

PEBBLE_VERSION = 'v2.2.1'
ASSETS_PATH = pkg_resources.resource_filename('certbot_integration_tests', 'assets')


def fetch(workspace):
    suffix = 'linux-amd64' if os.name != 'nt' else 'windows-amd64.exe'

    pebble_path = _fetch_asset('pebble', suffix)
    challtestsrv_path = _fetch_asset('pebble-challtestsrv', suffix)
    pebble_config_path = _build_pebble_config(workspace)

    return pebble_path, challtestsrv_path, pebble_config_path


def _fetch_asset(asset, suffix):
    asset_path = os.path.join(ASSETS_PATH, '{0}_{1}_{2}'.format(asset, PEBBLE_VERSION, suffix))
    if not os.path.exists(asset_path):
        asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
                     .format(PEBBLE_VERSION, asset, suffix))
        response = requests.get(asset_url)
        response.raise_for_status()
        with open(asset_path, 'wb') as file_h:
            file_h.write(response.content)
    os.chmod(asset_path, os.stat(asset_path).st_mode | stat.S_IEXEC)

    return asset_path


def _build_pebble_config(workspace):
    config_path = os.path.join(workspace, 'pebble-config.json')
    with open(config_path, 'w') as file_h:
        file_h.write(json.dumps({
            'pebble': {
                'listenAddress': '0.0.0.0:14000',
                'managementListenAddress': '0.0.0.0:15000',
                'certificate': os.path.join(ASSETS_PATH, 'cert.pem'),
                'privateKey': os.path.join(ASSETS_PATH, 'key.pem'),
                'httpPort': 5002,
                'tlsPort': 5001,
                'ocspResponderURL': 'http://127.0.0.1:{0}'.format(MOCK_OCSP_SERVER_PORT),
            },
        }))

    return config_path
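
For context, a rough sketch of how a test harness might launch Pebble with these artifacts; the workspace path is hypothetical, and while -config is Pebble's standard configuration flag, the real launcher in this test suite may pass additional options:

import subprocess

pebble, challtestsrv, config = fetch('/tmp/pebble-workspace')
# The ACME directory becomes available at https://localhost:14000/dir per the config above.
process = subprocess.Popen([pebble, '-config', config])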
71 certbot-ci/certbot_integration_tests/utils/pebble_ocsp_server.py (Executable file)
@@ -0,0 +1,71 @@
#!/usr/bin/env python
"""
This runnable module interfaces itself with the Pebble management interface in order
to serve a mock OCSP responder during integration tests against Pebble.
"""
import datetime
import re

import requests
from dateutil import parser

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization, hashes
from cryptography import x509
from cryptography.x509 import ocsp
from six.moves import BaseHTTPServer

from certbot_integration_tests.utils.misc import GracefulTCPServer
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT, PEBBLE_MANAGEMENT_URL


class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediate-keys/0', verify=False)
        issuer_key = serialization.load_pem_private_key(request.content, None, default_backend())

        request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediates/0', verify=False)
        issuer_cert = x509.load_pem_x509_certificate(request.content, default_backend())

        try:
            content_len = int(self.headers.getheader('content-length', 0))
        except AttributeError:
            content_len = int(self.headers.get('Content-Length'))

        ocsp_request = ocsp.load_der_ocsp_request(self.rfile.read(content_len))
        response = requests.get('{0}/cert-status-by-serial/{1}'.format(
            PEBBLE_MANAGEMENT_URL, str(hex(ocsp_request.serial_number)).replace('0x', '')), verify=False)

        if not response.ok:
            ocsp_response = ocsp.OCSPResponseBuilder.build_unsuccessful(ocsp.OCSPResponseStatus.UNAUTHORIZED)
        else:
            data = response.json()

            now = datetime.datetime.utcnow()
            cert = x509.load_pem_x509_certificate(data['Certificate'].encode(), default_backend())
            if data['Status'] != 'Revoked':
                ocsp_status, revocation_time, revocation_reason = ocsp.OCSPCertStatus.GOOD, None, None
            else:
                ocsp_status, revocation_reason = ocsp.OCSPCertStatus.REVOKED, x509.ReasonFlags.unspecified
                revoked_at = re.sub(r'( \+\d{4}).*$', r'\1', data['RevokedAt'])  # "... +0000 UTC" => "+0000"
                revocation_time = parser.parse(revoked_at)

            ocsp_response = ocsp.OCSPResponseBuilder().add_response(
                cert=cert, issuer=issuer_cert, algorithm=hashes.SHA1(),
                cert_status=ocsp_status,
                this_update=now, next_update=now + datetime.timedelta(hours=1),
                revocation_time=revocation_time, revocation_reason=revocation_reason
            ).responder_id(
                ocsp.OCSPResponderEncoding.NAME, issuer_cert
            ).sign(issuer_key, hashes.SHA256())

        self.send_response(200)
        self.end_headers()
        self.wfile.write(ocsp_response.public_bytes(serialization.Encoding.DER))


if __name__ == '__main__':
    try:
        GracefulTCPServer(('', MOCK_OCSP_SERVER_PORT), _ProxyHandler).serve_forever()
    except KeyboardInterrupt:
        pass
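
As a usage illustration (not part of the module), a client could exercise this responder with cryptography's OCSP request builder; the certificate file names below are hypothetical:

import requests
from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.x509 import ocsp

from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT

# Hypothetical paths: a certificate issued by Pebble and its issuing intermediate.
cert = x509.load_pem_x509_certificate(open('cert.pem', 'rb').read(), default_backend())
issuer = x509.load_pem_x509_certificate(open('chain.pem', 'rb').read(), default_backend())

der = ocsp.OCSPRequestBuilder().add_certificate(
    cert, issuer, hashes.SHA1()).build().public_bytes(serialization.Encoding.DER)
reply = requests.post('http://127.0.0.1:{0}'.format(MOCK_OCSP_SERVER_PORT), data=der,
                      headers={'Content-Type': 'application/ocsp-request'})
print(ocsp.load_der_ocsp_response(reply.content).certificate_status)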
36 certbot-ci/certbot_integration_tests/utils/proxy.py (Normal file)
@@ -0,0 +1,36 @@
#!/usr/bin/env python
import json
import sys
import re

import requests
from six.moves import BaseHTTPServer

from certbot_integration_tests.utils.misc import GracefulTCPServer


def _create_proxy(mapping):
    class ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
        def do_GET(self):
            headers = {key.lower(): value for key, value in self.headers.items()}
            backend = [backend for pattern, backend in mapping.items()
                       if re.match(pattern, headers['host'])][0]
            response = requests.get(backend + self.path, headers=headers)

            self.send_response(response.status_code)
            for key, value in response.headers.items():
                self.send_header(key, value)
            self.end_headers()
            self.wfile.write(response.content)

    return ProxyHandler


if __name__ == '__main__':
    http_port = int(sys.argv[1])
    port_mapping = json.loads(sys.argv[2])
    httpd = GracefulTCPServer(('', http_port), _create_proxy(port_mapping))
    try:
        httpd.serve_forever()
    except KeyboardInterrupt:
        pass
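
A brief usage sketch derived from the __main__ block above (the port, pattern and backend URL are hypothetical): the script expects a listening port and a JSON object mapping Host-header regexes to backend URLs.

import json
import subprocess
import sys

# Forward requests whose Host header matches *.example.com to a local backend.
mapping = {r'.+\.example\.com': 'http://127.0.0.1:5002'}
subprocess.Popen([sys.executable, '-m', 'certbot_integration_tests.utils.proxy',
                  '8080', json.dumps(mapping)])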
@@ -1,22 +1,35 @@
from setuptools import setup
from setuptools import find_packages
import sys

from distutils.version import StrictVersion
from setuptools import setup, find_packages, __version__ as setuptools_version


version = '0.32.0.dev0'

install_requires = [
    'acme',
    'coverage',
    'cryptography',
    'pyopenssl',
    'pytest',
    'pytest-cov',
    'pytest-xdist',
    'python-dateutil',
    'pyyaml',
    'requests',
    'six',
]

# Add pywin32 on Windows platforms to handle low-level system calls.
# This dependency needs to be added using environment markers to avoid its installation on Linux.
# However environment markers are supported only with setuptools >= 36.2.
# So this dependency is not added for old Linux distributions with old setuptools,
# in order to allow these systems to build certbot from sources.
if StrictVersion(setuptools_version) >= StrictVersion('36.2'):
    install_requires.append("pywin32>=224 ; sys_platform == 'win32'")
elif 'bdist_wheel' in sys.argv[1:]:
    raise RuntimeError('Error, you are trying to build certbot wheels using an old version '
                       'of setuptools. Version 36.2+ of setuptools is required.')

setup(
    name='certbot-ci',
    version=version,
@@ -45,4 +58,11 @@ setup(
    packages=find_packages(),
    include_package_data=True,
    install_requires=install_requires,

    entry_points={
        'console_scripts': [
            'certbot_test=certbot_integration_tests.utils.certbot_call:main',
            'run_acme_server=certbot_integration_tests.utils.acme_server:main',
        ],
    }
)

@@ -1,4 +1,4 @@
FROM debian:jessie
FROM debian:stretch
MAINTAINER Brad Warren <bmw@eff.org>

# no need to mkdir anything:

@@ -4,7 +4,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

install_requires = [
    'certbot',

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-cloudflare

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-cloudflare
@@ -22,7 +22,9 @@ Credentials

Use of this plugin requires a configuration file containing Cloudflare API
credentials, obtained from your Cloudflare
`account page <https://www.cloudflare.com/a/account/my-account>`_.
`account page <https://www.cloudflare.com/a/account/my-account>`_. This plugin
does not currently support Cloudflare's "API Tokens", so please ensure you use
the "Global API Key" for authentication.

.. code-block:: ini
   :name: credentials.ini

@@ -10,7 +10,7 @@ from certbot.plugins import dns_common

logger = logging.getLogger(__name__)

ACCOUNT_URL = 'https://www.cloudflare.com/a/account/my-account'
ACCOUNT_URL = 'https://dash.cloudflare.com/profile/api-tokens'


@zope.interface.implementer(interfaces.IAuthenticator)

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-cloudxns

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-cloudxns
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-digitalocean

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-digitalocean
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-dnsimple

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-dnsimple
@@ -3,7 +3,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-dnsmadeeasy

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-dnsmadeeasy
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-gehirn

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-gehirn
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Please update tox.ini when modifying dependency version requirements
install_requires = [

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-google

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-google
@@ -274,10 +274,11 @@ class _GoogleClient(object):
            raise errors.PluginError('Encountered error finding managed zone: {0}'
                                     .format(e))

        if zones:
            zone_id = zones[0]['id']
            logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
            return zone_id
        for zone in zones:
            zone_id = zone['id']
            if 'privateVisibilityConfig' not in zone:
                logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
                return zone_id

        raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'
                                 .format(domain, zone_dns_name_guesses))

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-linode

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-linode
@@ -1,7 +1,7 @@
from setuptools import setup
from setuptools import find_packages

version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Please update tox.ini when modifying dependency version requirements
install_requires = [

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-luadns

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-luadns
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-nsone

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-nsone
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-ovh

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-ovh
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-rfc2136

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-rfc2136
@@ -21,8 +21,8 @@ Credentials
-----------

Use of this plugin requires a configuration file containing the target DNS
server, optional authorative domain and optional port that supports RFC 2136 Dynamic Updates,
the name of the TSIG key, the TSIG key secret itself and the algorithm used if it's
server and optional port that supports RFC 2136 Dynamic Updates, the name
of the TSIG key, the TSIG key secret itself and the algorithm used if it's
different to HMAC-MD5.

.. code-block:: ini
@@ -33,8 +33,6 @@ different to HMAC-MD5.
   dns_rfc2136_server = 192.0.2.1
   # Target DNS port
   dns_rfc2136_port = 53
   # Authorative domain (optional, will try to auto-detect if missing)
   dns_rfc2136_base_domain = example.com
   # TSIG key name
   dns_rfc2136_name = keyname.
   # TSIG key secret

@@ -79,33 +79,25 @@ class Authenticator(dns_common.DNSAuthenticator):
        self._get_rfc2136_client().del_txt_record(validation_name, validation)

    def _get_rfc2136_client(self):
        key = _RFC2136Key(self.credentials.conf('name'),
                          self.credentials.conf('secret'),
                          self.ALGORITHMS.get(self.credentials.conf('algorithm'),
                                              dns.tsig.HMAC_MD5))
        return _RFC2136Client(self.credentials.conf('server'),
                              int(self.credentials.conf('port') or self.PORT),
                              key,
                              self.credentials.conf('base-domain'))
                              self.credentials.conf('name'),
                              self.credentials.conf('secret'),
                              self.ALGORITHMS.get(self.credentials.conf('algorithm'),
                                                  dns.tsig.HMAC_MD5))


class _RFC2136Key(object):
    def __init__(self, name, secret, algorithm):
        self.name = name
        self.secret = secret
        self.algorithm = algorithm


class _RFC2136Client(object):
    """
    Encapsulates all communication with the target DNS server.
    """
    def __init__(self, server, port, base_domain, key):
    def __init__(self, server, port, key_name, key_secret, key_algorithm):
        self.server = server
        self.port = port
        self.keyring = dns.tsigkeyring.from_text({
            key.name: key.secret
            key_name: key_secret
        })
        self.algorithm = key.algorithm
        self.base_domain = base_domain
        self.algorithm = key_algorithm

    def add_txt_record(self, record_name, record_content, record_ttl):
        """
@@ -179,33 +171,23 @@ class _RFC2136Client(object):

    def _find_domain(self, record_name):
        """
        If 'base_domain' option is specified check if the requested domain matches this base domain
        and return it. If not explicitly specified find the closest domain with an SOA record for
        the given domain name.
        Find the closest domain with an SOA record for a given domain name.

        :param str record_name: The record name for which to find the base domain.
        :param str record_name: The record name for which to find the closest SOA record.
        :returns: The domain, if found.
        :rtype: str
        :raises certbot.errors.PluginError: if no SOA record can be found.
        """

        if self.base_domain:
            if not record_name.endswith(self.base_domain):
                raise errors.PluginError('Requested domain {0} does not match specified base '
                                         'domain {1}.'
                                         .format(record_name, self.base_domain))
            else:
                return self.base_domain
        else:
            domain_name_guesses = dns_common.base_domain_name_guesses(record_name)
        domain_name_guesses = dns_common.base_domain_name_guesses(record_name)

            # Loop through until we find an authoritative SOA record
            for guess in domain_name_guesses:
                if self._query_soa(guess):
                    return guess
        # Loop through until we find an authoritative SOA record
        for guess in domain_name_guesses:
            if self._query_soa(guess):
                return guess

            raise errors.PluginError('Unable to determine base domain for {0} using names: {1}.'
                                     .format(record_name, domain_name_guesses))
        raise errors.PluginError('Unable to determine base domain for {0} using names: {1}.'
                                 .format(record_name, domain_name_guesses))
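
To make the lookup loop concrete, this is roughly what dns_common.base_domain_name_guesses produces for a typical challenge record; _query_soa is then tried against each candidate in order until an authoritative SOA answer is found:

from certbot.plugins import dns_common

guesses = dns_common.base_domain_name_guesses('_acme-challenge.www.example.com')
# ['_acme-challenge.www.example.com', 'www.example.com', 'example.com', 'com']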

    def _query_soa(self, domain_name):
        """

@@ -73,12 +73,9 @@ class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthentic
class RFC2136ClientTest(unittest.TestCase):

    def setUp(self):
        from certbot_dns_rfc2136.dns_rfc2136 import _RFC2136Client, _RFC2136Key
        from certbot_dns_rfc2136.dns_rfc2136 import _RFC2136Client

        self.rfc2136_client = _RFC2136Client(SERVER,
                                             PORT,
                                             None,
                                             _RFC2136Key(NAME, SECRET, dns.tsig.HMAC_MD5))
        self.rfc2136_client = _RFC2136Client(SERVER, PORT, NAME, SECRET, dns.tsig.HMAC_MD5)

    @mock.patch("dns.query.tcp")
    def test_add_txt_record(self, query_mock):
@@ -165,28 +162,6 @@ class RFC2136ClientTest(unittest.TestCase):
            self.rfc2136_client._find_domain,
            'foo.bar.'+DOMAIN)

    def test_find_domain_with_base(self):
        # _query_soa | pylint: disable=protected-access
        self.rfc2136_client._query_soa = mock.MagicMock(side_effect=[False, False, True])
        self.rfc2136_client.base_domain = 'bar.' + DOMAIN

        # _find_domain | pylint: disable=protected-access
        domain = self.rfc2136_client._find_domain('foo.bar.' + DOMAIN)

        self.assertTrue(domain == 'bar.' + DOMAIN)

    def test_find_domain_with_wrong_base(self):

        # _query_soa | pylint: disable=protected-access
        self.rfc2136_client._query_soa = mock.MagicMock(side_effect=[False, False, True])
        self.rfc2136_client.base_domain = 'wrong.' + DOMAIN

        self.assertRaises(
            errors.PluginError,
            # _find_domain | pylint: disable=protected-access
            self.rfc2136_client._find_domain,
            'foo.bar.' + DOMAIN)

    @mock.patch("dns.query.udp")
    def test_query_soa_found(self, query_mock):
        query_mock.return_value = mock.MagicMock(answer=[mock.MagicMock()], flags=dns.flags.AA)

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-route53

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-route53
@@ -1,3 +1,3 @@
include LICENSE
include LICENSE.txt
include README
recursive-include docs *

@@ -1,16 +1,11 @@
from setuptools import setup
from setuptools import find_packages

version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
    # boto3 requires urllib<1.25 while requests 2.22+ requires urllib<1.26
    # Since pip lacks a real dependency graph resolver, it will pick the constraint only from
    # requests, and install urllib==1.25.2. Setting an explicit dependency here solves the issue.
    # Check https://github.com/boto/botocore/issues/1733 for resolution in botocore.
    'urllib3<1.25',
    'acme>=0.29.0',
    'certbot>=0.34.0',
    'boto3',

@@ -1,5 +0,0 @@
FROM certbot/certbot

COPY . src/certbot-dns-sakuracloud

RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-sakuracloud
@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages


version = '0.35.0.dev0'
version = '0.38.0.dev0'

# Please update tox.ini when modifying dependency version requirements
install_requires = [

@@ -2,4 +2,4 @@ include LICENSE.txt
include README.rst
recursive-include docs *
recursive-include certbot_nginx/tests/testdata *
include certbot_nginx/options-ssl-nginx.conf
recursive-include certbot_nginx/tls_configs *.conf

@@ -6,6 +6,8 @@ import subprocess
import tempfile
import time

import pkg_resources

import OpenSSL
import zope.interface

@@ -18,7 +20,6 @@ from certbot import crypto_util
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import misc
from certbot.compat import os
from certbot.plugins import common

@@ -120,6 +121,17 @@ class NginxConfigurator(common.Installer):

        self.reverter.recovery_routine()

    @property
    def mod_ssl_conf_src(self):
        """Full absolute path to SSL configuration file source."""
        config_filename = "options-ssl-nginx.conf"
        if self.version < (1, 5, 9):
            config_filename = "options-ssl-nginx-old.conf"
        elif self.version < (1, 13, 0):
            config_filename = "options-ssl-nginx-tls12-only.conf"
        return pkg_resources.resource_filename(
            "certbot_nginx", os.path.join("tls_configs", config_filename))

    @property
    def mod_ssl_conf(self):
        """Full absolute path to SSL configuration file."""
@@ -130,6 +142,11 @@ class NginxConfigurator(common.Installer):
        """Full absolute path to digest of updated SSL configuration file."""
        return os.path.join(self.config.config_dir, constants.UPDATED_MOD_SSL_CONF_DIGEST)

    def install_ssl_options_conf(self, options_ssl, options_ssl_digest):
        """Copy Certbot's SSL options file into the system's config dir if required."""
        return common.install_version_controlled_file(options_ssl, options_ssl_digest,
            self.mod_ssl_conf_src, constants.ALL_SSL_OPTIONS_HASHES)

    # This is called in determine_authenticator and determine_installer
    def prepare(self):
        """Prepare the authenticator/installer.
@@ -148,14 +165,14 @@ class NginxConfigurator(common.Installer):

        self.parser = parser.NginxParser(self.conf('server-root'))

        install_ssl_options_conf(self.mod_ssl_conf, self.updated_mod_ssl_conf_digest)

        self.install_ssl_dhparams()

        # Set Version
        if self.version is None:
            self.version = self.get_version()

        self.install_ssl_options_conf(self.mod_ssl_conf, self.updated_mod_ssl_conf_digest)

        self.install_ssl_dhparams()

        # Prevent two Nginx plugins from modifying a config at once
        try:
            util.lock_dir_until_exit(self.conf('server-root'))
@@ -888,13 +905,9 @@ class NginxConfigurator(common.Installer):
        have permissions of root.

        """
        uid = misc.os_geteuid()
        util.make_or_verify_dir(
            self.config.work_dir, core_constants.CONFIG_DIRS_MODE, uid)
        util.make_or_verify_dir(
            self.config.backup_dir, core_constants.CONFIG_DIRS_MODE, uid)
        util.make_or_verify_dir(
            self.config.config_dir, core_constants.CONFIG_DIRS_MODE, uid)
        util.make_or_verify_dir(self.config.work_dir, core_constants.CONFIG_DIRS_MODE)
        util.make_or_verify_dir(self.config.backup_dir, core_constants.CONFIG_DIRS_MODE)
        util.make_or_verify_dir(self.config.config_dir, core_constants.CONFIG_DIRS_MODE)

    def get_version(self):
        """Return version of Nginx Server.
@@ -1131,12 +1144,6 @@ def nginx_restart(nginx_ctl, nginx_conf):
            time.sleep(1)


def install_ssl_options_conf(options_ssl, options_ssl_digest):
    """Copy Certbot's SSL options file into the system's config dir if required."""
    return common.install_version_controlled_file(options_ssl, options_ssl_digest,
        constants.MOD_SSL_CONF_SRC, constants.ALL_SSL_OPTIONS_HASHES)


def _determine_default_server_root():
    if os.environ.get("CERTBOT_DOCS") == "1":
        default_server_root = "%s or %s" % (constants.LINUX_SERVER_ROOT,

@@ -1,8 +1,6 @@
"""nginx plugin constants."""
import platform

import pkg_resources

FREEBSD_DARWIN_SERVER_ROOT = "/usr/local/etc/nginx"
LINUX_SERVER_ROOT = "/etc/nginx"

@@ -21,14 +19,23 @@ CLI_DEFAULTS = dict(
MOD_SSL_CONF_DEST = "options-ssl-nginx.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""

MOD_SSL_CONF_SRC = pkg_resources.resource_filename(
    "certbot_nginx", "options-ssl-nginx.conf")
"""Path to the nginx mod_ssl config file found in the Certbot
distribution."""

UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-nginx-conf-digest.txt"
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""

SSL_OPTIONS_HASHES_NEW = [
    '108c4555058a087496a3893aea5d9e1cee0f20a3085d44a52dc1a66522299ac3',
    'd5e021706ecdccc7090111b0ae9a29ef61523e927f020e410caf0a1fd7063981',
]
"""SHA256 hashes of the contents of versions of MOD_SSL_CONF_SRC for nginx >= 1.13.0"""

SSL_OPTIONS_HASHES_MEDIUM = [
    '63e2bddebb174a05c9d8a7cf2adf72f7af04349ba59a1a925fe447f73b2f1abf',
    '2901debc7ecbc10917edd9084c05464c9c5930b463677571eaf8c94bffd11ae2',
    '30baca73ed9a5b0e9a69ea40e30482241d8b1a7343aa79b49dc5d7db0bf53b6c',
    '02329eb19930af73c54b3632b3165d84571383b8c8c73361df940cb3894dd426',
]
"""SHA256 hashes of the contents of versions of MOD_SSL_CONF_SRC for nginx >= 1.5.9
and nginx < 1.13.0"""

ALL_SSL_OPTIONS_HASHES = [
    '0f81093a1465e3d4eaa8b0c14e77b2a2e93568b0fc1351c2b87893a95f0de87c',
@@ -37,7 +44,9 @@ ALL_SSL_OPTIONS_HASHES = [
    '7f95624dd95cf5afc708b9f967ee83a24b8025dc7c8d9df2b556bbc64256b3ff',
    '394732f2bbe3e5e637c3fb5c6e980a1f1b90b01e2e8d6b7cff41dde16e2a756d',
    '4b16fec2bcbcd8a2f3296d886f17f9953ffdcc0af54582452ca1e52f5f776f16',
]
    'c052ffff0ad683f43bffe105f7c606b339536163490930e2632a335c8d191cc4',
    '02329eb19930af73c54b3632b3165d84571383b8c8c73361df940cb3894dd426',
] + SSL_OPTIONS_HASHES_MEDIUM + SSL_OPTIONS_HASHES_NEW
"""SHA256 hashes of the contents of all versions of MOD_SSL_CONF_SRC"""

def os_constant(key):
Some files were not shown because too many files have changed in this diff.